1. Adaptive Response Behavior in the Pursuit of Unpredictably Moving Sounds. eNeuro 2021;8:ENEURO.0556-20.2021. PMID: 33875456; PMCID: PMC8116108; DOI: 10.1523/eneuro.0556-20.2021.
Abstract
Although moving sound-sources abound in natural auditory scenes, it is not clear how the human brain processes auditory motion. Previous studies have indicated that, although ocular localization responses to stationary sounds are quite accurate, ocular smooth pursuit of moving sounds is very poor. We here demonstrate that human subjects faithfully track a sound’s unpredictable movements in the horizontal plane with smooth-pursuit responses of the head. Our analysis revealed that the stimulus–response relation was well described by an under-damped passive, second-order low-pass filter in series with an idiosyncratic, fixed, pure delay. The model contained only two free parameters: the system’s damping coefficient, and its central (resonance) frequency. We found that the latter remained constant at ∼0.6 Hz throughout the experiment for all subjects. Interestingly, the damping coefficient systematically increased with trial number, suggesting the presence of an adaptive mechanism in the auditory pursuit system (APS). This mechanism functions even for unpredictable sound-motion trajectories endowed with fixed, but covert, frequency characteristics in open-loop tracking conditions. We conjecture that the APS optimizes a trade-off between response speed and effort. Taken together, our data support the existence of a pursuit system for auditory head-tracking, which would suggest the presence of a neural representation of a spatial auditory fovea (AF).
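A compact way to read the model described in this abstract is as the transfer function H(s) = ω0² e^(−sτ) / (s² + 2ζω0 s + ω0²): a unity-gain second-order low-pass filter followed by a pure delay. The Python sketch below simulates head tracking of a noisy target under that model; the target trajectory, sampling rate, and the parameter values `zeta`, `f0`, and `tau` are illustrative assumptions, not the authors' fitted values.

```python
import numpy as np
from scipy import signal

fs = 100.0                          # sample rate of the simulated trajectory (Hz)
t = np.arange(0, 20, 1 / fs)        # one 20-s trial
rng = np.random.default_rng(0)

# Unpredictable target trajectory: low-pass filtered Gaussian noise (an assumption).
sos = signal.butter(2, 1.0, fs=fs, output="sos")
target = 20.0 * signal.sosfiltfilt(sos, rng.standard_normal(t.size))   # degrees

zeta, f0, tau = 0.5, 0.6, 0.28      # damping, resonance frequency (Hz), pure delay (s)
w0 = 2 * np.pi * f0
lpf = signal.lti([w0 ** 2], [1, 2 * zeta * w0, w0 ** 2])   # unity-gain 2nd-order low-pass

_, head, _ = signal.lsim(lpf, U=target, T=t)               # filtered (pursuit) response
delay_samples = int(round(tau * fs))
head = np.concatenate([np.zeros(delay_samples), head[:-delay_samples]])  # apply the delay
```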
2. Representation of Auditory Motion Directions and Sound Source Locations in the Human Planum Temporale. J Neurosci 2019;39:2208-2220. PMID: 30651333; DOI: 10.1523/jneurosci.2289-18.2018.
Abstract
The ability to compute the location and direction of sounds is a crucial perceptual skill to efficiently interact with dynamic environments. How the human brain implements spatial hearing is, however, poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to sounds moving left, right, up, and down as well as static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in bilateral human planum temporale (hPT). Using independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis of motion organization reminiscent of the functional organization of the middle-temporal cortex (hMT+/V5) for vision. Importantly, whereas motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds were, however, significantly distinct. Altogether, our results demonstrate that the hPT codes for auditory motion and location but that the underlying neural computation linked to motion processing is more reliable and partially distinct from the one supporting sound source location. SIGNIFICANCE STATEMENT: Compared with what we know about visual motion, little is known about how the brain implements spatial hearing. Our study reveals that motion directions and sound source locations can be reliably decoded in the human planum temporale (hPT) and that they rely on partially shared pattern geometries. Our study, therefore, sheds important new light on how computing the location or direction of sounds is implemented in the human auditory cortex by showing that those two computations rely on partially shared neural codes. Furthermore, our results show that the neural representation of moving sounds in hPT follows a "preferred axis of motion" organization, reminiscent of the coding mechanisms typically observed in the occipital middle-temporal cortex (hMT+/V5) region for computing visual motion.
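As a rough illustration of the cross-condition decoding logic mentioned above (train a classifier on patterns from one condition, test it on another), here is a minimal scikit-learn sketch on synthetic data; the voxel patterns, the left/right labels, and the noise levels are placeholders, not the study's data or pipeline.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_trials, n_voxels = 80, 200
labels = rng.integers(0, 2, n_trials)              # 0 = left(ward), 1 = right(ward)
shared_pattern = rng.standard_normal(n_voxels)     # pattern geometry shared across conditions

def simulate_condition(noise_sd):
    signs = np.where(labels == 1, 1.0, -1.0)[:, None]
    return signs * shared_pattern + noise_sd * rng.standard_normal((n_trials, n_voxels))

X_moving = simulate_condition(2.0)                 # patterns evoked by moving sounds
X_static = simulate_condition(2.0)                 # patterns evoked by static sounds

clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(X_moving, labels)                          # train on one condition...
print("cross-condition accuracy:", clf.score(X_static, labels))   # ...test on the other
```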
3. Carlile S, Leung J. The Perception of Auditory Motion. Trends Hear 2016;20:2331216516644254. PMID: 27094029; PMCID: PMC4871213; DOI: 10.1177/2331216516644254.
Abstract
The growing availability of efficient and relatively inexpensive virtual auditory display technology has provided new research platforms to explore the perception of auditory motion. At the same time, deployment of these technologies in command and control as well as in entertainment roles is generating an increasing need to better understand the complex processes underlying auditory motion perception. This is a particularly challenging processing feat because it involves the rapid deconvolution of the relative change in the locations of sound sources produced by rotations and translations of the head in space (self-motion) to enable the perception of actual source motion. The fact that we perceive our auditory world to be stable despite almost continual movement of the head demonstrates the efficiency and effectiveness of this process. This review examines the acoustical basis of auditory motion perception and a wide range of psychophysical, electrophysiological, and cortical imaging studies that have probed the limits and possible mechanisms underlying this perception.
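The "deconvolution" referred to above amounts to discounting one's own head movement from the head-relative source direction so that only world-referenced source motion remains. A toy sketch of that bookkeeping, with made-up trajectories, is given below.

```python
import numpy as np

t = np.linspace(0.0, 2.0, 201)                    # 2 s at 100 Hz
head_yaw = 30.0 * np.sin(2 * np.pi * 0.5 * t)     # self-motion: the head oscillates +/-30 deg
source_world = 10.0 * t                           # the source actually drifts at 10 deg/s

relative_az = source_world - head_yaw             # what the binaural cues alone report
recovered = relative_az + head_yaw                # add back the sensed head orientation
assert np.allclose(recovered, source_world)       # world-referenced motion is recovered
```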
Affiliation(s)
- Simon Carlile
- School of Medical Sciences, University of Sydney, NSW, Australia; Starkey Hearing Research Center, Berkeley, CA, USA
- Johahn Leung
- School of Medical Sciences, University of Sydney, NSW, Australia
4. Kaplan BA, Lansner A. A spiking neural network model of self-organized pattern recognition in the early mammalian olfactory system. Front Neural Circuits 2014;8:5. PMID: 24570657; PMCID: PMC3916767; DOI: 10.3389/fncir.2014.00005.
Abstract
Olfactory sensory information passes through several processing stages before an odor percept emerges. The question how the olfactory system learns to create odor representations linking those different levels and how it learns to connect and discriminate between them is largely unresolved. We present a large-scale network model with single and multi-compartmental Hodgkin-Huxley type model neurons representing olfactory receptor neurons (ORNs) in the epithelium, periglomerular cells, mitral/tufted cells and granule cells in the olfactory bulb (OB), and three types of cortical cells in the piriform cortex (PC). Odor patterns are calculated based on affinities between ORNs and odor stimuli derived from physico-chemical descriptors of behaviorally relevant real-world odorants. The properties of ORNs were tuned to show saturated response curves with increasing concentration as seen in experiments. On the level of the OB we explored the possibility of using a fuzzy concentration interval code, which was implemented through dendro-dendritic inhibition leading to winner-take-all like dynamics between mitral/tufted cells belonging to the same glomerulus. The connectivity from mitral/tufted cells to PC neurons was self-organized from a mutual information measure and by using a competitive Hebbian-Bayesian learning algorithm based on the response patterns of mitral/tufted cells to different odors yielding a distributed feed-forward projection to the PC. The PC was implemented as a modular attractor network with a recurrent connectivity that was likewise organized through Hebbian-Bayesian learning. We demonstrate the functionality of the model in a one-sniff-learning and recognition task on a set of 50 odorants. Furthermore, we study its robustness against noise on the receptor level and its ability to perform concentration invariant odor recognition. Moreover, we investigate the pattern completion capabilities of the system and rivalry dynamics for odor mixtures.
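For the Hebbian-Bayesian step mentioned above, one common formulation (used here purely as an illustration, not as the authors' exact learning rule) sets each weight to the log ratio of the estimated joint activation probability over the product of the marginals. The sketch below estimates such weights from synthetic binary activity patterns; the pattern statistics and the smoothing constant `eps` are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
pre = rng.random((500, 40)) < 0.2          # mitral/tufted-like binary activity (500 samples)
post = rng.random((500, 10)) < 0.1         # cortical-unit binary activity

eps = 1e-3                                 # keeps estimated probabilities away from zero
p_i = pre.mean(axis=0) + eps               # P(pre-unit active)
p_j = post.mean(axis=0) + eps              # P(post-unit active)
p_ij = pre.astype(float).T @ post.astype(float) / len(pre) + eps   # joint probabilities

weights = np.log(p_ij / np.outer(p_i, p_j))   # > 0 where units co-activate above chance
bias = np.log(p_j)                            # per-unit prior term
```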
Affiliation(s)
- Bernhard A Kaplan
- Department of Computational Biology, School of Computer Science and Communication, Royal Institute of Technology, Stockholm, Sweden; Stockholm Brain Institute, Karolinska Institute, Stockholm, Sweden
- Anders Lansner
- Department of Computational Biology, School of Computer Science and Communication, Royal Institute of Technology, Stockholm, Sweden; Stockholm Brain Institute, Karolinska Institute, Stockholm, Sweden; Department of Numerical Analysis and Computer Science, Stockholm University, Stockholm, Sweden
5. Zimmer U, Macaluso E. Interaural temporal and coherence cues jointly contribute to successful sound movement perception and activation of parietal cortex. Neuroimage 2009;46:1200-8. PMID: 19303934; DOI: 10.1016/j.neuroimage.2009.03.022.
Abstract
The perception of movement in the auditory modality requires dynamic changes in the input that reaches the two ears (e.g. sequential changes of interaural time differences; dynamic ITDs). However, it is still unclear as to what extent these temporal cues interact with other interaural cues to determine successful movement perception, and which brain regions are involved in sound movement processing. Here, we presented trains of white-noise bursts containing either static or dynamic ITDs, and we varied parametrically the level of binaural coherence (BC) of both types of stimuli. Behaviorally, we found that movement discrimination sensitivity decreased with decreasing levels of BC. fMRI analyses highlighted a network of temporal, frontal and parietal regions where activity decreased with decreasing BC. Critically, in the intra-parietal sulcus and the supra-marginal gyrus brain activity decreased with decreasing BC, but only for dynamic-ITD sounds (BC by ITD interaction). Thus, these regions activated selectively when the sounds contained both dynamic ITDs and high levels of BC; i.e. when subjects perceived sound movement. We conclude that sound movement perception requires both dynamic changes of the auditory input and effective sound-source localization, and that parietal cortex utilizes interaural temporal and coherence cues for the successful perception of sound movement.
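A simple way to build stimuli of the kind described above is to control binaural coherence by mixing a shared noise with independent noise in each ear, and to make the ITD either constant (static) or stepped across successive bursts (dynamic). The sketch below follows that recipe; the burst count, burst duration, coherence level, and ITD range are illustrative assumptions, not the study's parameters.

```python
import numpy as np

fs = 44100
burst = int(0.050 * fs)                      # 50 ms white-noise bursts
bc = 0.6                                     # binaural coherence, 0 (independent) .. 1 (identical)
itds_us = np.linspace(-500, 500, 10)         # dynamic ITD sweep; use a constant for static ITD

rng = np.random.default_rng(3)
left, right = [], []
for itd_us in itds_us:
    common = rng.standard_normal(burst)
    l = np.sqrt(bc) * common + np.sqrt(1 - bc) * rng.standard_normal(burst)
    r = np.sqrt(bc) * common + np.sqrt(1 - bc) * rng.standard_normal(burst)
    shift = int(round(itd_us * 1e-6 * fs))   # positive ITD delays the right ear here
    left.append(l)
    right.append(np.roll(r, shift))

stimulus = np.stack([np.concatenate(left), np.concatenate(right)])  # 2 x N stereo signal
```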
Affiliation(s)
- U Zimmer
- NeuroImaging Laboratory, Santa Lucia Foundation, Italy.
6. Bentvelzen A, Leung J, Alais D. Discriminating Audiovisual Speed: Optimal Integration of Speed Defaults to Probability Summation When Component Reliabilities Diverge. Perception 2009;38:966-87. DOI: 10.1068/p6261.
Abstract
We investigated audiovisual speed perception to test the maximum-likelihood-estimation (MLE) model of multisensory integration. According to MLE, audiovisual speed perception will be based on a weighted average of visual and auditory speed estimates, with each component weighted by its inverse variance, a statistically optimal combination that produces a fused estimate with minimised variance and thereby affords maximal discrimination. We use virtual auditory space to create ecologically valid auditory motion, together with visual apparent motion around an array of 63 LEDs. To degrade the usual dominance of vision over audition, we added positional jitter to the motion sequences, and also measured peripheral trajectories. Both factors degraded visual speed discrimination, while auditory speed perception was unaffected by trajectory location. In the bimodal conditions, a speed conflict was introduced (48° versus 60° s−1) and two measures were taken: perceived audiovisual speed, and the precision (variability) of audiovisual speed discrimination. These measures showed only a weak tendency to follow MLE predictions. However, splitting the data into two groups based on whether the unimodal component weights were similar or disparate revealed interesting findings: similarly weighted components were integrated in a manner closely matching MLE predictions, while dissimilarly weighted components (greater than 3:1 difference) were integrated according to probability-summation predictions. These results suggest that different multisensory integration strategies may be implemented depending on relative component reliabilities, with MLE integration vetoed when component weights are highly disparate.
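The MLE rule being tested above has a simple closed form: each unimodal estimate is weighted by its inverse variance, and the fused variance is the product of the unimodal variances over their sum. A minimal sketch with arbitrary example numbers (not the paper's data):

```python
import numpy as np

s_vis, sigma_vis = 60.0, 8.0       # visual speed estimate (deg/s) and its standard deviation
s_aud, sigma_aud = 48.0, 12.0      # auditory speed estimate and its standard deviation

w_vis = (1 / sigma_vis ** 2) / (1 / sigma_vis ** 2 + 1 / sigma_aud ** 2)
w_aud = 1.0 - w_vis
s_av = w_vis * s_vis + w_aud * s_aud                      # fused (audiovisual) speed estimate

sigma_av = np.sqrt(sigma_vis ** 2 * sigma_aud ** 2 /
                   (sigma_vis ** 2 + sigma_aud ** 2))     # never larger than either unimodal SD
print(round(s_av, 1), round(sigma_av, 1))
```

Per the abstract, this weighted-average behaviour held only when the unimodal weights were comparable; with weight ratios above roughly 3:1, performance instead followed probability summation.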
Affiliation(s)
- Adam Bentvelzen
- School of Psychology, University of Sydney, Sydney 2006, Australia
- Johahn Leung
- School of Psychology, University of Sydney, Sydney 2006, Australia
- David Alais
- School of Psychology, University of Sydney, Sydney 2006, Australia
7. Functional gradients of auditory sensitivity along the anterior ectosylvian sulcus of the cat. J Neurosci 2008;28:3657-67. PMID: 18385324; DOI: 10.1523/jneurosci.4539-07.2008.
Abstract
Determining the spatial direction of sound sources is one of the major computations performed by the auditory system. The anterior ectosylvian sulcus (AES) of cat cortex is known to be important for sound localization. However, there are contradicting reports as to the spatial response properties of neurons in AES: whereas some studies found narrowly tuned neurons, others reported mostly spatially widely tuned neurons. We hypothesized that this is the result of a nonhomogenous distribution of the auditory neurons in this area. To test this possibility, we recorded neuronal activity along the AES, together with a sample of neurons from primary auditory cortex (A1) of cats in response to pure tones and to virtual acoustic space stimuli. In all areas, most neurons responded to both types of stimuli. Neurons located in posterior AES (pAES) showed special response properties that distinguished them from neurons in A1 and from neurons in anterior AES (aAES). The proportion of space-selective neurons among auditory neurons was significantly higher in pAES (82%) than in A1 (72%) and in aAES (60%). Furthermore, whereas the large majority of A1 neurons responded preferentially to contralateral sounds, neurons in pAES (and to a lesser extent in aAES) had their spatial selectivity distributed more homogenously. In particular, 28% of the space-selective neurons in pAES had highly modulated frontal receptive fields, against 8% in A1 and 17% in aAES. We conclude that in cats, pAES contains a secondary auditory cortical field which is specialized for spatial processing, in particular for the representation of frontal space.
8. Robert PY, Sawan M. An independent-component-analysis-based time-space processor for the identification of neural stimulation sources. Annu Int Conf IEEE Eng Med Biol Soc 2007;2007:3876-3879. PMID: 18002845; DOI: 10.1109/iembs.2007.4353179.
Abstract
A new scheme for enhancing both compression and interpretability of multichannel biopotentials acquired from implanted neural sensors is presented. It uses spatial and temporal correlation between samples to find patterns associated with external stimulation producing an effect on the sensed tissues, which can be interpreted as sensations or intentions of the subject. Temporal analysis includes spike detection and sorting. Spatial processing is based on an independent component analysis (ICA) of the observed neural activity. Modeling and simulation demonstrate the capability of the system to associate the observed potentials with external stimulations.
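The spatial stage described above, ICA applied to multichannel recordings, can be illustrated with scikit-learn's FastICA on synthetic mixtures; the number of channels, sources, mixing matrix, and noise level below are placeholders rather than the paper's setup.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(4)
n_samples, n_channels, n_sources = 5000, 8, 3
sources = rng.laplace(size=(n_samples, n_sources))          # super-Gaussian "neural" sources
mixing = rng.standard_normal((n_sources, n_channels))
recordings = sources @ mixing + 0.05 * rng.standard_normal((n_samples, n_channels))

ica = FastICA(n_components=n_sources, random_state=0)
components = ica.fit_transform(recordings)                  # estimated independent components
unmixing = ica.components_                                  # maps channels back to components
```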
Affiliation(s)
- Pierre-Yves Robert
- Polystim Neurotechnology Laboratory, Ecole Polytechnique de Montréal, QC, Canada.
9. Xiang J, Daniel SJ, Ishii R, Holowka S, Harrison RV, Chuang S. Auditory Detection of Motion Velocity in Humans: a Magnetoencephalographic Study. Brain Topogr 2005;17:139-49. PMID: 15974473; DOI: 10.1007/s10548-005-4447-4.
Abstract
To investigate the cerebral mechanisms of auditory detection of motion velocity in the human brain, neuromagnetic fields elicited by six moving sounds and one stationary sound were investigated with a whole-cortex magnetoencephalography (MEG) system. The stationary sound evoked only one clear response at a latency of 109 ± 6 ms (first response, or M100), but the six moving sounds evoked two clear responses: an earlier response at a latency of 116 ± 7 ms (M100) and a later response at a latency ranging from 180 to 760 ms (magnetic motion response, or MM). The latency and amplitude of the MM were inversely related to the velocity of the moving sounds (p<0.02). The magnetic source of MM was related to the velocity of the moving sounds (p<0.05). A dynamic neuromagnetic response, MM, was elicited by the moving sounds, which likely encoded the neural processing of auditory detection of motion velocity. A specific neural network that processes the motion velocity in the human brain probably includes the bilateral superior temporal cortices and the brainstem. The left posterior and lateral part of the auditory cortex may play a pivotal role in the auditory detection of motion velocity.
Affiliation(s)
- Jing Xiang
- Department of Diagnostic Imaging, The Hospital for Sick Children, Toronto, Ontario, Canada.
10. Nikitin NI, Varfolomeev AL, Kotelenko LM. Responses of cat primary auditory cortex neurons to moving stimuli with dynamically changing interaural delays. Neurosci Behav Physiol 2005;34:949-59. PMID: 15686141; DOI: 10.1023/b:neab.0000042654.09989.85.
Abstract
The spike responses of individual neurons in the primary auditory cortex were studied in anesthetized cats during exposure to stationary and moving stimuli with static or dynamically changing interaural delays (ΔT). Static stimuli were tones and clicks. Dynamic stimuli were created using series of synphase and antiphase clicks with interaural delays which changed over time. Sensitivity to changes in ΔT was predominantly present in neurons with low characteristic frequencies (less than 2.8 kHz). Changes in ΔT in moving stimuli induced responses in neurons sensitive to changes in ΔT in the stationary stimulus. The effect of movement could be a relationship between the level of spike activity and the direction and rate of change of ΔT or it could be a displacement of the tuning curve for the response to ΔT (the ΔT function) in the direction opposite to that of the direction of the change in ΔT. The magnitude of the effects of movement depended on the position of the period for changes in ΔT relative to the ΔT function. The greatest effects were seen with changes in ΔT on the sloping part of the ΔT function.
Affiliation(s)
- N I Nikitin
- Auditory Physiology Group, I. P. Pavlov Institute of Physiology, Russian Academy of Sciences, 6 Makarov Bank, 199034 St. Petersburg, Russia
11. Context-dependent adaptive coding of interaural phase disparity in the auditory cortex of awake macaques. J Neurosci 2002. PMID: 12040069; DOI: 10.1523/jneurosci.22-11-04625.2002.
Abstract
In the ascending auditory pathway, the context in which a particular stimulus occurs can influence the character of the responses that encode it. Here we demonstrate that the cortical representation of a binaural cue to sound source location is profoundly context-dependent: spike rates elicited by a 0 degrees interaural phase disparity (IPD) were very different when preceded by 90 degrees versus -90 degrees IPD. The changes in firing rate associated with equivalent stimuli occurring in different contexts are comparable to changes in discharge rate that establish cortical tuning to the cue itself. Single-unit responses to trapezoidally modulated IPD stimuli were recorded in the auditory cortices of awake rhesus monkeys. Each trapezoidal stimulus consisted of linear modulations of IPD between two steady-state IPDs differing by 90 degrees. The stimulus set was constructed so that identical IPDs and sweeps through identical IPD ranges recurred as elements of disparate sequences. We routinely observed orderly context-induced shifts in IPD tuning. These shifts reflected an underlying enhancement of the contrast in the discharge rate representation of different IPDs. This process is subserved by sensitivity to stimulus events in the recent past, involving multiple adaptive mechanisms operating on timescales ranging from tens of milliseconds to seconds. These findings suggest that the cortical processing of dynamic acoustic signals is dominated by an adaptive coding strategy that prioritizes the representation of stimulus changes over actual stimulus values. We show how cortical selectivity for motion direction in real space could emerge as a consequence of this general coding principle.
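To make the stimulus construction above concrete, the sketch below generates a tone pair whose interaural phase disparity (IPD) follows a trapezoid: steady state, linear ramp to a value 90° away, steady state, and a ramp back. The carrier frequency, segment timing, and plateau values are illustrative assumptions.

```python
import numpy as np

fs, fc = 44100, 500.0                       # sample rate and carrier frequency (Hz)
t = np.arange(int(2.0 * fs)) / fs           # 2-s stimulus

# Trapezoidal IPD profile (degrees): hold 0, ramp to 90, hold 90, ramp back to 0.
ipd_deg = np.interp(t, [0.0, 0.5, 0.7, 1.3, 1.5, 2.0], [0.0, 0.0, 90.0, 90.0, 0.0, 0.0])
ipd = np.deg2rad(ipd_deg)

left = np.sin(2 * np.pi * fc * t)
right = np.sin(2 * np.pi * fc * t + ipd)    # the IPD is carried entirely by the right channel
```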
12.
Abstract
Efforts to locate a cortical map of auditory space generally have proven unsuccessful. At moderate sound levels, cortical neurons generally show large or unbounded spatial receptive fields. Within those large receptive fields, however, changes in sound location result in systematic changes in the temporal firing patterns such that single-neuron firing patterns can signal the locations of sound sources throughout as much as 360 degrees of auditory space. Neurons in the cat's auditory cortex show accurate localization of broad-band sounds, which human listeners localize accurately. Conversely, in response to filtered sounds that produce spatial illusions in human listeners, neurons signal systematically incorrect locations that can be predicted by a model that also predicts the listeners' illusory reports. These results from the cat's auditory cortex, as well as more limited results from nonhuman primates, suggest a model in which the location of any particular sound source is represented in a distributed fashion within individual auditory cortical areas and among multiple cortical areas.
Affiliation(s)
- John C Middlebrooks
- Kresge Hearing Research Institute, University of Michigan, Ann Arbor 48109-0506, USA.
13. Location Signaling by Cortical Neurons. In: Integrative Functions in the Mammalian Auditory Pathway. 2002. DOI: 10.1007/978-1-4757-3654-0_8.
14. Xiang J, Chuang S, Wilson D, Otsubo H, Pang E, Holowka S, Sharma R, Ochi A, Chitoku S. Sound motion evoked magnetic fields. Clin Neurophysiol 2002;113:1-9. PMID: 11801418; DOI: 10.1016/s1388-2457(01)00709-x.
Abstract
OBJECTIVE: The aim of the present study was to determine which brain regions are involved in the conscious perception of sound motion in humans. METHODS: Six kinds of sound stimuli were studied. Two static sound stimuli with durations of 100 or 1000 ms remained at a fixed position during the stimulation period. Four moving sound stimuli with durations of 100 or 1000 ms moved from left to right, or right to left, during the stimulation period. Evoked magnetic fields were recorded using a 151-channel whole cortex magnetoencephalographic system. RESULTS: The response identified in all sound stimuli was M100. Responses identified only in moving sound stimuli were M180, M280 and M680. Contour maps and dipoles overlaid on magnetic resonance imaging indicated that both the M100 and M680 responses were generated in the superior temporal cortex (left and right), while M180 and M280 were generated in the parietal cortex (right). CONCLUSIONS: The results of this MEG study indicated that the right parietal cortex was involved in sound motion processing. We hypothesize that the right parietal cortex, in association with the left and right superior temporal cortex, forms a network to process sound motion information.
Affiliation(s)
- Jing Xiang
- Department of Diagnostic Imaging, The Hospital for Sick Children, 555 University Avenue, Toronto, Ont., Canada M5G 1X8.
15. Firzlaff U, Schuller G. Motion processing in the auditory cortex of the rufous horseshoe bat: role of GABAergic inhibition. Eur J Neurosci 2001;14:1687-701. PMID: 11860463; DOI: 10.1046/j.0953-816x.2001.01797.x.
Abstract
This study examined the influence of inhibition on motion-direction-sensitive responses of neurons in the dorsal fields of auditory cortex of the rufous horseshoe bat. Responses to auditory apparent motion stimuli were recorded extracellularly from neurons while microiontophoretically applying gamma-aminobutyric acid (GABA) and the GABAA receptor antagonist bicuculline methiodide (BMI). Neurons could respond with a directional preference exhibiting stronger responses to one direction of motion or a shift of receptive field (RF) borders depending on direction of motion. BMI influenced the motion direction sensitivity of 53% of neurons. In 21% of neurons the motion-direction sensitivity was decreased by BMI by decreasing either directional preference or RF shift. In neurons with a directional preference, BMI increased the spike number for the preferred direction by a similar amount as for the nonpreferred direction. Thus, inhibition was not direction specific. BMI increased motion-direction sensitivity by either increasing directional preference or magnitude of RF shifts in 22% of neurons. Ten percent of neurons changed their response from a RF shift to a directional preference under BMI. In these neurons, the observed effects could often be better explained by adaptation of excitation rather than inhibition. The results suggest, that adaptation of excitation, as well as cortex specific GABAergic inhibition, contribute to motion-direction sensitivity in the auditory cortex of the rufous horseshoe bat.
Affiliation(s)
- U Firzlaff
- Department Biologie II, Ludwig-Maximilians-Universität München, Luisenstr. 14, D-80333 München, Germany.
16. Malone BJ, Semple MN. Effects of auditory stimulus context on the representation of frequency in the gerbil inferior colliculus. J Neurophysiol 2001;86:1113-30. PMID: 11535662; DOI: 10.1152/jn.2001.86.3.1113.
Abstract
Prior studies of dynamic conditioning have focused on modulation of binaural localization cues, revealing that the responses of inferior colliculus (IC) neurons to particular values of interaural phase and level disparities depend critically on the context in which they occur. Here we show that monaural frequency transitions, which do not simulate azimuthal motion, also condition the responses of IC neurons. We characterized single-unit responses to two frequency transition stimuli: a glide stimulus comprising two tones linked by a linear frequency sweep (origin-sweep-target) and a step stimulus consisting of one tone followed immediately by another (origin-target). Using sets of glide and step stimuli converging on a common target, we constructed conditioned response functions (RFs) depicting the variability in the response to an identical stimulus as a function of the preceding origin frequency. For nearly all cells, the response to the target depended on the origin frequency, even for origins outside the excitatory frequency response area of the cell. Results from conditioned RFs based on long (2-4 s) and short (200 ms) duration step stimuli indicate that conditioning effects can be induced in the absence of the dynamic sweep, and by stimuli of relatively short duration. Because IC neurons are tuned to frequency, changes in the origin frequency often change the "effective" stimulus duty cycle. In many cases, the enhancement of the target response appeared related to the decrease in the "effective" stimulus duty cycle rather than to the prior presentation of a particular origin frequency. Although this implies that nonselective adaptive mechanisms are responsible for conditioned responses, slightly more than half of IC neurons in each paradigm responded significantly differently to targets following origins that elicited statistically indistinguishable responses. The prevailing influence of stimulus context when discharge history is controlled demonstrates that not all the mechanisms governing conditioning depend on the discharge history of the recorded neuron. Selective adaptation among the neuron's variously tuned afferents may help engender stimulus-specific conditioning. The demonstration that conditioning effects reflect sensitivity to spectral as well as spatial stimulus contrast has broad implications for the processing of a wide range of dynamic acoustic signals and sound sequences.
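The two stimulus families described above are easy to sketch: a "glide" concatenates an origin tone, a linear frequency sweep, and a target tone, while a "step" abuts the origin and target tones directly. The frequencies, durations, and the absence of onset ramps below are illustrative simplifications, not the study's values.

```python
import numpy as np
from scipy.signal import chirp

fs = 44100

def tone(freq_hz, dur_s):
    t = np.arange(int(dur_s * fs)) / fs
    return np.sin(2 * np.pi * freq_hz * t)

f_origin, f_target = 2000.0, 3000.0
t_sweep = np.arange(int(0.1 * fs)) / fs     # 100-ms linear sweep connecting the two tones

glide = np.concatenate([tone(f_origin, 0.2),
                        chirp(t_sweep, f0=f_origin, t1=t_sweep[-1], f1=f_target),
                        tone(f_target, 0.2)])
step = np.concatenate([tone(f_origin, 0.2), tone(f_target, 0.2)])   # origin-target, no sweep
```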
Affiliation(s)
- B J Malone
- Center for Neural Science, New York University, 4 Washington Place, New York, NY 10003, USA.
17. Firzlaff U, Schuller G. Cortical representation of acoustic motion in the rufous horseshoe bat, Rhinolophus rouxi. Eur J Neurosci 2001;13:1209-20. PMID: 11285018; DOI: 10.1046/j.0953-816x.2001.01978.x.
Abstract
Responses of neurons to apparent auditory motion in the azimuth were recorded in three different fields of auditory cortex of the rufous horseshoe bat. Motion was simulated using successive stimuli with dynamically changing interaural intensity differences presented via earphones. Seventy-one percent of sampled neurons were motion-direction-sensitive. Two types of responses could be distinguished. Thirty-four percent of neurons showed a directional preference exhibiting stronger responses to one direction of motion. Fifty-seven percent of neurons responded with a shift of spatial receptive field position depending on direction of motion. Both effects could occur in the same neuron depending on the parameters of apparent motion. Most neurons with contralateral receptive fields exhibited directional preference only with motion entering the receptive field from the opposite direction. Receptive field shifts were opposite to the direction of motion. Specific combinations of spatiotemporal parameters determined the motion-direction-sensitive responses. Velocity was not encoded as a specific parameter. Temporal parameters of motion and azimuth position of the moving sound source were differentially encoded by neurons in different fields of auditory cortex. Neurons with a directional preference in the dorsal fields can encode motion with short interpulse intervals, whereas direction-preferring neurons in the primary field can best encode motion with medium interpulse intervals. Furthermore, neurons with a directional preference in the dorsal fields are specialized for encoding motion in the midfield of azimuth, whereas direction-preferring neurons in the primary field can encode motion in lateral positions. The results suggest that motion information is differentially processed in different fields of the auditory cortex of the rufous horseshoe bat.
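Apparent motion of the kind used above can be approximated by a train of brief pulses whose interaural intensity difference (IID) steps monotonically from favouring one ear to favouring the other. The pulse duration, interpulse interval, IID range, and carrier frequency in this sketch are illustrative assumptions, not the study's stimulus parameters.

```python
import numpy as np

fs, fc = 192000, 60000.0                     # high sample rate for an ultrasonic carrier (assumed)
pulse = np.sin(2 * np.pi * fc * np.arange(int(0.004 * fs)) / fs)   # 4-ms tone pulse
gap = np.zeros(int(0.016 * fs))                                    # 16-ms interpulse interval

iids_db = np.linspace(-20.0, 20.0, 10)       # IID stepping across pulses simulates motion
left, right = [], []
for iid in iids_db:
    left.extend([pulse * 10 ** (-iid / 40.0), gap])   # split each IID symmetrically
    right.extend([pulse * 10 ** (iid / 40.0), gap])   # across the two ears

stimulus = np.stack([np.concatenate(left), np.concatenate(right)])
```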
Affiliation(s)
- U Firzlaff
- Zoologisches Institut der Ludwig-Maximilians-Universität München, Luisenstr. 14, D-80333 München, Germany.
18. Reinhardt-Rutland AH. The spectrally dependent monotic component in the decreasing-loudness aftereffect: implications for dynamic auditory localization. J Gen Psychol 2001;128:43-56. PMID: 11277446; DOI: 10.1080/00221300109598897.
Abstract
Listeners exposed to a tone increasing in intensity report an aftereffect of decreasing loudness in a steady tone heard afterward. In the present study, the spectral dependence of the monotic decreasing-loudness aftereffect (adapting and testing 1 ear) was compared with (a) the spectral dependence of the interotic decreasing-loudness aftereffect (adapting 1 ear and testing the other ear) and (b) a non-adaptation control condition. The purpose was to test the hypothesis that the decreasing-loudness aftereffect may concern the sensory processing associated with dynamic localization. The hypothesis is based on two premises: (a) dynamic localization requires monaural sensory processing, and (b) sensory processing is reflected in spectral selectivity. Hence, the hypothesis would be supported if the monotic aftereffect were more spectrally dependent and stronger than the interotic aftereffect; A. H. Reinhardt-Rutland (1998) showed that the hypothesis is supported with regard to the related increasing-loudness aftereffect. Two listeners were exposed to a 1-kHz adapting stimulus. From responses of "growing softer" or "growing louder" to test stimuli changing in intensity, nulls were calculated; test carrier frequencies ranged from 0.5 kHz to 2 kHz. Confirming the hypothesis, the monotic aftereffect peaked at around the 1-kHz test carrier frequency. In contrast, the interotic aftereffect showed little evidence of spectrally dependent peaking. Except when test and adaptation carrier frequencies differed markedly, the interotic aftereffect was smaller than the monotic aftereffect.
19. Witton C, Green GG, Rees A, Henning GB. Monaural and binaural detection of sinusoidal phase modulation of a 500-Hz tone. J Acoust Soc Am 2000;108:1826-33. PMID: 11051509; DOI: 10.1121/1.1310195.
Abstract
The detectability of phase modulation was measured for three subjects in two-alternative temporal forced-choice experiments. In experiment 1, the detectability of sinusoidal phase modulation in a 1500-ms burst of an 80-dB (SPL), 500-Hz sinusoidal carrier presented to the left ear (monaural condition) was measured. The experiment was repeated with an 80-dB, 500-Hz static (unmodulated) tone at the right ear (dichotic condition). At a modulation rate of 1 Hz, subjects were an order of magnitude more sensitive to phase modulation in the dichotic condition than in the monaural condition. The dichotic advantage decreased monotonically with increasing modulation rate. Subjects ceased to detect movement in the dichotic stimulus above 10 Hz, but a dichotic advantage remained up to a modulation rate of 40 Hz. Thus, although sound movement detection is sluggish, detection of internal phase modulation is not. In experiment 2, thresholds for detecting 2-Hz phase modulation were measured in the dichotic condition as a function of the level of the pure tone in the right ear. The dichotic advantage persisted even when the level of the pure tone was reduced by 50 dB or more. The findings demonstrate a large dichotic advantage which persists to high modulation rates and which depends very little on interaural level differences.
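The dichotic condition above can be written down directly: a phase-modulated 500-Hz tone in one ear and an unmodulated 500-Hz tone in the other. The modulation depth chosen below is an arbitrary illustration; only the carrier, burst duration, and modulation rate follow the description.

```python
import numpy as np

fs = 44100
fc, fm, dur = 500.0, 2.0, 1.5               # carrier (Hz), modulation rate (Hz), burst length (s)
t = np.arange(int(dur * fs)) / fs
beta = np.deg2rad(10.0)                     # peak phase excursion in radians (assumed depth)

left = np.sin(2 * np.pi * fc * t + beta * np.sin(2 * np.pi * fm * t))   # phase-modulated ear
right = np.sin(2 * np.pi * fc * t)                                      # static reference tone
```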
Affiliation(s)
- C Witton
- Department of Physiological Sciences, University of Newcastle upon Tyne, The Medical School, United Kingdom.
20. Jiang H, Lepore F, Poirier P, Guillemot JP. Responses of cells to stationary and moving sound stimuli in the anterior ectosylvian cortex of cats. Hear Res 2000;139:69-85. PMID: 10601714; DOI: 10.1016/s0378-5955(99)00176-8.
Abstract
The azimuthal, directional and angular speed sound selectivities of single units were examined in the posterior part of the anterior ectosylvian cortex. Broadband noise bursts and simulated moving sounds were delivered from 16 loudspeakers fixed on the horizontal plane in a quasi-anechoic sound-isolation chamber. The activity of 78 neurons was recorded and quantitatively analyzed. Most cells responded to at least the static sound. The relative strengths of their responses suggested that the cells could be classed as omnidirectional (37.2%), contralateral hemifield (29.5%), ipsilateral hemifield (2.5%) and azimuth (7.7%) selective. The remaining 23.1% could not be classified. All cells responded to a simulated moving sound displaced at five different speeds. A majority (88%) of them showed some directional preference in that they discharged at least twice as strongly for one direction as for the other for at least one speed. 14.7% displayed angular speed selectivity. Different patterns of neuronal discharges were evoked. For static sounds, most of the cells gave ON-type responses. A large proportion (60%) of the cells responded in a sustained manner to maintained stimulation. Among these, 68% also gave sustained discharges to moving sounds. The spatial tuning and the directional and angular speed selectivity of neurons in the posterior part of the AEC suggest that this area is involved in the processing of static and moving sounds.
Affiliation(s)
- H Jiang
- Groupe de Recherche en Neuropsychologie Expérimentale, Département de Psychologie, Université de Montréal, CP 6128, Succ. Centre-ville, Montréal, Qué., Canada
21. Doan DE, Saunders JC. Sensitivity to simulated directional sound motion in the rat primary auditory cortex. J Neurophysiol 1999;81:2075-87. PMID: 10322049; DOI: 10.1152/jn.1999.81.5.2075.
Abstract
This paper examines neuron responses in rat primary auditory cortex (AI) during sound stimulation of the two ears designed to simulate sound motion in the horizontal plane. The simulated sound motion was synthesized from mathematical equations that generated dynamic changes in interaural phase, intensity, and Doppler shifts at the two ears. The simulated sounds were based on moving sources in the right frontal horizontal quadrant. Stimuli consisted of three circumferential segments between 0 and 30 degrees, 30 and 60 degrees, and 60 and 90 degrees and four radial segments at 0, 30, 60, and 90 degrees. The constant velocity portion of each segment was 0.84 m long. The circumferential segments and center of the radial segments were calculated to simulate a distance of 2 m from the head. Each segment had two trajectories that simulated motion in both directions, and each trajectory was presented at two velocities. Young adult rats were anesthetized, the left primary auditory cortex was exposed, and microelectrode recordings were obtained from sound responsive cells in AI. All testing took place at a tonal frequency that most closely approximated the best frequency of the unit at a level 20 dB above the tuning curve threshold. The results were presented on polar plots that emphasized the two directions of simulated motion for each segment rather than the location of sound in space. The trajectory exhibiting a "maximum motion response" could be identified from these plots. "Neuron discharge profiles" within these trajectories were used to demonstrate neuron activity for the two motion directions. Cells were identified that clearly responded to simulated uni- or multidirectional sound motion (39%), that were sensitive to sound location only (19%), or that were sound driven but insensitive to our location or sound motion stimuli (42%). The results demonstrated the capacity of neurons in rat auditory cortex to selectively process dynamic stimulus conditions representing simulated motion on the horizontal plane. Our data further show that some cells were responsive to location along the horizontal plane but not sensitive to motion. Cells sensitive to motion, however, also responded best to the moving sound at a particular location within the trajectory. It would seem that the mechanisms underlying sensitivity to sound location as well as direction of motion converge on the same cell.
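As a rough sketch of the kind of synthesis described above, the code below takes a source moving at constant velocity past the head and derives the time-varying interaural time difference (spherical-head approximation), a crude interaural level difference, and the Doppler factor that would scale the carrier. The straight-line trajectory and the simple head model are assumptions for illustration, not the paper's equations or its circumferential/radial segments.

```python
import numpy as np

c, a = 343.0, 0.0875                          # speed of sound (m/s), head radius (m)
fs = 1000.0                                    # trajectory sample rate (Hz)
t = np.arange(0.0, 4.0, 1 / fs)

x = -2.0 + 1.0 * t                             # source passes left to right at 1 m/s...
y = np.full_like(t, 2.0)                       # ...along a line 2 m in front of the head

az = np.arctan2(x, y)                          # azimuth re: straight ahead (rad, signed)
itd = (a / c) * (np.sin(az) + az)              # Woodworth spherical-head ITD (s)
ild_db = 6.0 * np.sin(az)                      # crude level-difference proxy (dB)

dist = np.hypot(x, y)
radial_velocity = np.gradient(dist, t)         # > 0 when the source recedes
doppler_factor = c / (c + radial_velocity)     # multiply the carrier frequency by this
```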
Affiliation(s)
- D E Doan
- Department of Bioengineering, University of Pennsylvania, Philadelphia, Pennsylvania 19104, USA