1
Huang Y, Brosch M. Absence of eye position effects in the early auditory cortex of monkeys. Neuroreport 2024; 35:209-215. [PMID: 38251450] [DOI: 10.1097/wnr.0000000000001985]
Abstract
This study investigated whether the position of the eyes affects neuronal activity in the auditory cortex under conditions in which the task required hand movements relative to stimuli rather than active control of eye position. Two monkeys were trained to perform audio-visual tasks in which they had to respond to both the visual and the auditory stimuli with their hand to earn a reward. We recorded the spiking activity and the local field potentials from the core fields of auditory cortex, along with the eye position of the monkeys, while they performed the tasks. Neither the spiking activity nor the local field potentials varied significantly with eye position, either during the presentation of sounds or during other periods of the tasks. Our results indicate that eye position did not affect neuronal activity in auditory cortex during the audio-visual tasks. Together with the previous finding that eye position affects neuronal activity in auditory cortex during eye-fixation tasks, these results suggest that the presence of eye position effects in auditory cortex depends on the specific behavior a subject must exhibit to obtain a reward.
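As a quick illustration of the kind of null test described in this abstract, the sketch below bins simulated spike counts by horizontal eye position and asks whether firing differs across bins with a one-way ANOVA. The data, the binning, and the choice of test are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch: does firing rate depend on horizontal eye position?
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Simulated trials: eye position and spike counts with no true eye-position effect.
eye_pos_deg = rng.uniform(-15, 15, size=500)     # horizontal eye position per trial
spike_counts = rng.poisson(lam=10.0, size=500)   # spike counts per trial

# Bin trials by eye position and compare firing across bins.
bins = np.digitize(eye_pos_deg, bins=[-10, -5, 0, 5, 10])
groups = [spike_counts[bins == b] for b in np.unique(bins)]
f_stat, p_value = f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")    # large p: no eye-position effect
```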
Affiliation(s)
- Ying Huang
- Research Group Comparative Neuroscience, Leibniz Institute for Neurobiology
- Michael Brosch
- Research Group Comparative Neuroscience, Leibniz Institute for Neurobiology
- Center for Behavioral Brain Sciences, Otto-von-Guericke-University, Magdeburg, Germany
2
King CD, Lovich SN, Murphy DL, Landrum R, Kaylie D, Shera CA, Groh JM. Individual similarities and differences in eye-movement-related eardrum oscillations (EMREOs). Hear Res 2023; 440:108899. [PMID: 37979436] [PMCID: PMC11081086] [DOI: 10.1016/j.heares.2023.108899]
Abstract
We recently discovered a unique type of otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscillations (EMREOs) are generated is unknown, with a role in visual-auditory integration being the likeliest candidate. Clues to both the drivers of EMREOs and their purpose can be gleaned by examining responses in normal hearing human subjects. Do EMREOs occur in all individuals with normal hearing? If so, what components of the response occur most consistently? Understanding which attributes of EMREOs are similar across participants and which show more variability will provide the groundwork for future comparisons with individuals with hearing abnormalities affecting the ear's various motor components. Here we report that in subjects with normal hearing thresholds and normal middle ear function, all ears show (a) measurable EMREOs (mean: 58.7 dB SPL; range 45-67 dB SPL for large contralateral saccades), (b) a phase reversal for contra- versus ipsilaterally-directed saccades, (c) a large peak in the signal occurring soon after saccade onset, (d) an additional large peak time-locked to saccade offset and (e) evidence that saccade duration is encoded in the signal. We interpret the attributes of EMREOs that are most consistent across subjects as the ones that are most likely to play an essential role in their function. The individual differences likely reflect normal variation in individuals' auditory system anatomy and physiology, much like traditional measures of auditory function such as auditory-evoked OAEs, tympanometry and auditory-evoked potentials. Future work will compare subjects with different types of auditory dysfunction to population data from normal hearing subjects. Overall, these findings provide important context for the widespread observations of visual- and eye-movement related signals found in cortical and subcortical auditory areas of the brain.
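A minimal sketch of the basic measurement logic this abstract describes: average ear-canal microphone traces aligned to saccade onset, separately for contralaterally and ipsilaterally directed saccades, and check for the phase reversal. All signals are simulated, and the sampling rate, analysis window, and waveform shape are assumptions.

```python
# Illustrative sketch (not the authors' code): saccade-onset-aligned averaging.
import numpy as np

fs = 48_000                        # assumed microphone sampling rate (Hz)
t = np.arange(-0.01, 0.1, 1 / fs)  # window: 10 ms before to 100 ms after onset

def fake_emreo(direction, n_trials=50, rng=None):
    """Simulate EMREO-like oscillations whose sign flips with saccade direction."""
    rng = rng or np.random.default_rng(1)
    osc = np.sin(2 * np.pi * 30 * t) * np.exp(-t / 0.04) * (t >= 0)
    return direction * osc + 0.2 * rng.standard_normal((n_trials, t.size))

contra = fake_emreo(+1).mean(axis=0)  # trial-averaged contralateral trace
ipsi = fake_emreo(-1).mean(axis=0)    # trial-averaged ipsilateral trace

# Phase reversal: the two averages should be strongly anti-correlated.
r = np.corrcoef(contra, ipsi)[0, 1]
print(f"contra-vs-ipsi correlation: {r:.2f}")
```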
Affiliation(s)
- Cynthia D King
- Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA.
- Stephanie N Lovich
- Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA
- David LK Murphy
- Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA
- Rachel Landrum
- Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA
- David Kaylie
- Department of Otolaryngology, Duke University Medical Center, Durham, NC, USA
- Christopher A Shera
- Department of Otolaryngology, University of Southern California, Los Angeles, CA, USA
- Jennifer M Groh
- Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA; Department of Computer Science, Duke University, Durham, NC, USA; Department of Biomedical Engineering, Duke University, Durham, NC, USA
3
Lovich SN, King CD, Murphy DLK, Landrum RE, Shera CA, Groh JM. Parametric information about eye movements is sent to the ears. Proc Natl Acad Sci U S A 2023; 120:e2303562120. [PMID: 37988462] [PMCID: PMC10691342] [DOI: 10.1073/pnas.2303562120]
Abstract
Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect neural pathways from the ear through auditory cortex and beyond, but how these signals contribute to computing the locations of sounds with respect to the visual scene is poorly understood. Here, we evaluated the information contained in eye movement-related eardrum oscillations (EMREOs), pressure changes recorded in the ear canal that occur in conjunction with simultaneous eye movements. We show that EMREOs contain parametric information about horizontal and vertical eye displacement as well as initial/final eye position with respect to the head. The parametric information in the horizontal and vertical directions can be modeled as combining linearly, allowing accurate prediction of the EMREOs associated with oblique (diagonal) eye movements. Target location can also be inferred from the EMREO signals recorded during eye movements to those targets. We hypothesize that the (currently unknown) mechanism underlying EMREOs could impose a two-dimensional eye-movement-related transfer function on any incoming sound, permitting subsequent processing stages to compute the positions of sounds in relation to the visual scene.
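The linear-combination claim lends itself to a compact sketch: regress the recorded waveform at each time point on horizontal and vertical eye displacement, then predict the waveform for an oblique movement from the fitted components. The data below are simulated, and the exact regression design is an assumption based on the abstract.

```python
# Sketch of the linear-combination idea: EMREO(t) ~ h*H(t) + v*V(t) + C(t).
import numpy as np

rng = np.random.default_rng(2)
n_trials, n_time = 200, 300
h = rng.uniform(-18, 18, n_trials)            # horizontal displacement (deg)
v = rng.uniform(-12, 12, n_trials)            # vertical displacement (deg)

# Ground-truth time courses for horizontal/vertical components plus a constant.
t = np.linspace(0, 0.1, n_time)
H_true = np.sin(2 * np.pi * 30 * t)
V_true = 0.4 * np.sin(2 * np.pi * 30 * t + 1.0)
C_true = 0.1 * np.cos(2 * np.pi * 10 * t)

Y = np.outer(h, H_true) + np.outer(v, V_true) + C_true
Y += 0.5 * rng.standard_normal(Y.shape)       # measurement noise

# Least-squares fit of [H(t), V(t), C(t)] at every time point simultaneously.
X = np.column_stack([h, v, np.ones(n_trials)])
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)  # shape: (3, n_time)

# Predict the EMREO for an oblique saccade (e.g., 10 deg right, 8 deg up).
pred = np.array([10.0, 8.0, 1.0]) @ coef
print(pred.shape)  # (300,) -- predicted waveform for the oblique movement
```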
Affiliation(s)
- Stephanie N. Lovich
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Department of Neurobiology, Duke University, Durham, NC 27710
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- Cynthia D. King
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Department of Neurobiology, Duke University, Durham, NC 27710
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- David L. K. Murphy
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- Rachel E. Landrum
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Department of Neurobiology, Duke University, Durham, NC 27710
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- Christopher A. Shera
- Department of Otolaryngology, University of Southern California, Los Angeles, CA 90007
- Jennifer M. Groh
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Department of Neurobiology, Duke University, Durham, NC 27710
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- Department of Computer Science, Duke University, Durham, NC 27708
- Department of Biomedical Engineering, Duke University, Durham, NC 27708
4
Lovich SN, King CD, Murphy DLK, Abbasi H, Bruns P, Shera CA, Groh JM. Conserved features of eye movement related eardrum oscillations (EMREOs) across humans and monkeys. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220340. [PMID: 37545299] [PMCID: PMC10404921] [DOI: 10.1098/rstb.2022.0340]
Abstract
Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory spatial signals. The recent discovery of eye movement-related eardrum oscillations (EMREOs) suggests that this process could begin as early as the auditory periphery. How this reconciliation might happen remains poorly understood. Because humans and monkeys both have mobile eyes and therefore both must perform this shift of reference frames, comparison of the EMREO across species can provide insights into shared and therefore important parameters of the signal. Here we show that rhesus monkeys, like humans, have a consistent, significant EMREO signal that carries parametric information about eye displacement as well as onset times of eye movements. The dependence of the EMREO on the horizontal displacement of the eye is its most consistent feature, and is shared across behavioural tasks, subjects and species. Differences chiefly involve the waveform frequency (higher in monkeys than in humans), patterns of individual variation (more prominent in monkeys than in humans), and the waveform of the EMREO when factors due to horizontal and vertical eye displacements were controlled for. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Stephanie N. Lovich
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Department of Neurobiology, Duke University, Durham, NC 27708-0187, USA
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708-0187, USA
- Cynthia D. King
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Department of Neurobiology, Duke University, Durham, NC 27708-0187, USA
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708-0187, USA
- David L. K. Murphy
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708-0187, USA
- Department of Psychiatry and Behavioral Sciences, Duke University, Durham, NC 27708-0187, USA
- Hossein Abbasi
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg 20146, Germany
- Patrick Bruns
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg 20146, Germany
- Christopher A. Shera
- Department of Otolaryngology, University of Southern California, Los Angeles, CA 90033, USA
- Jennifer M. Groh
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Department of Neurobiology, Duke University, Durham, NC 27708-0187, USA
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708-0187, USA
- Department of Computer Science, Duke University, Durham, NC 27708-0187, USA
- Department of Biomedical Engineering, Duke University, Durham, NC 27708-0187, USA
5
King CD, Lovich SN, Murphy DLK, Landrum R, Kaylie D, Shera CA, Groh JM. Individual similarities and differences in eye-movement-related eardrum oscillations (EMREOs). bioRxiv [Preprint] 2023:2023.03.09.531896. [PMID: 36945521] [PMCID: PMC10028987] [DOI: 10.1101/2023.03.09.531896]
Abstract
We recently discovered a unique type of low-frequency otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscillations (EMREOs) are generated is unknown, with a role in visual-auditory integration being the likeliest candidate. Clues to both the drivers of EMREOs and their purpose can be gleaned by examining responses in normal hearing human subjects. Do EMREOs occur in all individuals with normal hearing? If so, what components of the response occur most consistently? Understanding which attributes of EMREOs are similar across participants and which show more variability will provide the groundwork for future comparisons with individuals with hearing abnormalities affecting the ear's various motor components. Here we report that in subjects with normal hearing thresholds and normal middle ear function, all ears show (a) measurable EMREOs (mean: 58.7 dB SPL; range 45-67 dB SPL for large contralateral saccades), (b) a phase reversal for contra- versus ipsilaterally-directed saccades, (c) a large peak in the signal occurring soon after saccade onset, (d) an additional large peak time-locked to saccade offset and (e) evidence that saccade duration is encoded in the signal. We interpret the attributes of EMREOs that are most consistent across subjects as the ones that are most likely to play an essential role in their function. The individual differences likely reflect normal variation in individuals' auditory system anatomy and physiology, much like traditional measures of auditory function such as auditory-evoked OAEs, tympanometry and auditory-evoked potentials. Future work will compare subjects with different types of auditory dysfunction to population data from normal hearing subjects. Overall, these findings provide important context for the widespread observations of visual- and eye-movement related signals found in cortical and subcortical auditory areas of the brain.
6
Lovich SN, King CD, Murphy DLK, Abbasi H, Bruns P, Shera CA, Groh J. Conserved features of eye movement related eardrum oscillations (EMREOs) across humans and monkeys. bioRxiv [Preprint] 2023:2023.03.08.531768. [PMID: 36945629] [PMCID: PMC10028923] [DOI: 10.1101/2023.03.08.531768]
Abstract
Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory spatial signals. The recent discovery of eye movement-related eardrum oscillations (EMREOs) suggests that this process could begin as early as the auditory periphery. How this reconciliation might happen remains poorly understood. Because humans and monkeys both have mobile eyes and therefore both must perform this shift of reference frames, comparison of the EMREO across species can provide insights into shared and therefore important parameters of the signal. Here we show that rhesus monkeys, like humans, have a consistent, significant EMREO signal that carries parametric information about eye displacement as well as onset times of eye movements. The dependence of the EMREO on the horizontal displacement of the eye is its most consistent feature, and is shared across behavioral tasks, subjects, and species. Differences chiefly involve the waveform frequency (higher in monkeys than in humans), patterns of individual variation (more prominent in monkeys than in humans), and the waveform of the EMREO when factors due to horizontal and vertical eye displacements were controlled for.
7
Jun NY, Ruff DA, Kramer LE, Bowes B, Tokdar ST, Cohen MR, Groh JM. Coordinated multiplexing of information about separate objects in visual cortex. eLife 2022; 11:e76452. [PMID: 36444983] [PMCID: PMC9708082] [DOI: 10.7554/elife.76452]
Abstract
Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present at a given moment? We recently showed that when more than one stimulus is present, single neurons can fluctuate between coding one vs. the other(s) across some time period, suggesting a form of neural multiplexing of different stimuli (Caruso et al., 2018). Here, we investigate (a) whether such coding fluctuations occur in early visual cortical areas; (b) how coding fluctuations are coordinated across the neural population; and (c) how coordinated coding fluctuations depend on the parsing of stimuli into separate vs. fused objects. We found coding fluctuations do occur in macaque V1 but only when the two stimuli form separate objects. Such separate objects evoked a novel pattern of V1 spike count ('noise') correlations involving distinct distributions of positive and negative values. This bimodal correlation pattern was most pronounced among pairs of neurons showing the strongest evidence for coding fluctuations or multiplexing. Whether a given pair of neurons exhibited positive or negative correlations depended on whether the two neurons both responded better to the same object or had different object preferences. Distinct distributions of spike count correlations based on stimulus preferences were also seen in V4 for separate objects but not when two stimuli fused to form one object. These findings suggest multiple objects evoke different response dynamics than those evoked by single stimuli, lending support to the multiplexing hypothesis and suggesting a means by which information about multiple objects can be preserved despite the apparent coarseness of sensory coding.
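A small sketch of the spike-count ("noise") correlation logic described in this abstract, under the multiplexing idea: if two neurons switch together between coding one object or the other, their trial-to-trial counts correlate positively when they prefer the same object and negatively when they prefer different objects. Rates and trial structure below are simulated assumptions.

```python
# Sketch: multiplexing predicts signed noise correlations by object preference.
import numpy as np

rng = np.random.default_rng(3)
n_trials = 400

# On each dual-stimulus trial, one of the two objects "dominates" the response.
state = rng.integers(0, 2, n_trials)                 # which object dominates
counts_a = rng.poisson(np.where(state == 0, 20, 5))  # neuron A prefers object 0
counts_b = rng.poisson(np.where(state == 0, 18, 6))  # neuron B prefers object 0
counts_c = rng.poisson(np.where(state == 0, 5, 20))  # neuron C prefers object 1

# Same preference -> positive correlation; opposite preference -> negative.
r_same = np.corrcoef(counts_a, counts_b)[0, 1]
r_opp = np.corrcoef(counts_a, counts_c)[0, 1]
print(f"same-preference pair: r = {r_same:.2f}")
print(f"opposite-preference pair: r = {r_opp:.2f}")
```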
Affiliation(s)
- Na Young Jun
- Department of Neurobiology, Duke University, Durham, United States
- Center for Cognitive Neuroscience, Duke University, Durham, United States
- Duke Institute for Brain Sciences, Durham, United States
- Douglas A Ruff
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, United States
- Center for the Neural Basis of Cognition, University of Pittsburgh, Pittsburgh, United States
- Lily E Kramer
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, United States
- Center for the Neural Basis of Cognition, University of Pittsburgh, Pittsburgh, United States
- Brittany Bowes
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, United States
- Center for the Neural Basis of Cognition, University of Pittsburgh, Pittsburgh, United States
- Surya T Tokdar
- Department of Statistical Science, Duke University, Durham, United States
- Marlene R Cohen
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, United States
- Center for the Neural Basis of Cognition, University of Pittsburgh, Pittsburgh, United States
- Jennifer M Groh
- Department of Neurobiology, Duke University, Durham, United States
- Center for Cognitive Neuroscience, Duke University, Durham, United States
- Duke Institute for Brain Sciences, Durham, United States
- Department of Psychology and Neuroscience, Duke University, Durham, United States
- Department of Biomedical Engineering, Duke University, Durham, United States
- Department of Computer Science, Duke University, Durham, United States
8
Willett SM, Groh JM. Multiple sounds degrade the frequency representation in monkey inferior colliculus. Eur J Neurosci 2021; 55:528-548. [PMID: 34844286] [PMCID: PMC9267755] [DOI: 10.1111/ejn.15545]
Abstract
How we distinguish multiple simultaneous stimuli is uncertain, particularly given that such stimuli sometimes recruit largely overlapping populations of neurons. One commonly proposed hypothesis is that the sharpness of tuning curves might change to limit the number of stimuli driving any given neuron when multiple stimuli are present. To test this hypothesis, we recorded the activity of neurons in the inferior colliculus while monkeys made saccades to either one or two simultaneous sounds differing in frequency and spatial location. Although monkeys easily distinguished simultaneous sounds (~90% correct performance), the frequency selectivity of inferior colliculus neurons on dual-sound trials did not improve in any obvious way. Frequency selectivity was degraded on dual-sound trials compared to single-sound trials: neural response functions broadened and frequency accounted for less of the variance in firing rate. These changes in neural firing led a maximum-likelihood decoder to perform worse on dual-sound trials than on single-sound trials. These results fail to support the hypothesis that changes in frequency response functions serve to reduce the overlap in the representation of simultaneous sounds. Instead, these results suggest that alternative possibilities, such as recent evidence of alternations in firing rate between the rates corresponding to each of the two stimuli, offer a more promising approach.
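A hedged sketch of a maximum-likelihood frequency decoder of the kind this abstract mentions: given Poisson tuning curves, choose the candidate frequency that maximizes the likelihood of an observed population response. The tuning-curve shapes and all parameters are invented for illustration, not taken from the study.

```python
# Sketch: maximum-likelihood decoding of sound frequency from Poisson responses.
import numpy as np

rng = np.random.default_rng(4)
freqs = np.logspace(np.log10(400), np.log10(12_800), 30)  # candidate frequencies (Hz)
n_neurons = 40

# Log-Gaussian tuning curves with random best frequencies and bandwidths.
best_f = rng.choice(freqs, n_neurons)
bw = rng.uniform(0.3, 0.8, n_neurons)
tuning = 2 + 30 * np.exp(
    -0.5 * (np.log2(freqs[None, :] / best_f[:, None]) / bw[:, None]) ** 2
)  # shape: (n_neurons, n_freqs), expected spike counts

true_idx = 12
response = rng.poisson(tuning[:, true_idx])  # one trial's population response

# Poisson log-likelihood up to a constant (the log k! term is frequency-independent).
loglik = (response[:, None] * np.log(tuning) - tuning).sum(axis=0)
print("decoded:", freqs[np.argmax(loglik)], "Hz; true:", freqs[true_idx], "Hz")
```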
Affiliation(s)
- Shawn M Willett
- Department of Ophthalmology, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
- Department of Neurobiology, Center for Cognitive Neuroscience, Duke University, Durham, North Carolina, USA
- Jennifer M Groh
- Department of Neurobiology, Center for Cognitive Neuroscience, Duke University, Durham, North Carolina, USA
9
Schmehl MN, Groh JM. Visual signals in the mammalian auditory system. Annu Rev Vis Sci 2021; 7:201-223.
Abstract
Coordination between different sensory systems is a necessary element of sensory processing. Where and how signals from different sense organs converge onto common neural circuitry have become topics of increasing interest in recent years. In this article, we focus specifically on visual-auditory interactions in areas of the mammalian brain that are commonly considered to be auditory in function. The auditory cortex and inferior colliculus are two key points of entry where visual signals reach the auditory pathway, and both contain visual- and/or eye movement-related signals in humans and other animals. The visual signals observed in these auditory structures reflect a mixture of visual modulation of auditory-evoked activity and visually driven responses that are selective for stimulus location or features. These key response attributes also appear in the classic visual pathway but may play a different role in the auditory pathway: to modify auditory rather than visual perception. Finally, while this review focuses on two particular areas of the auditory pathway where this question has been studied, robust descending as well as ascending connections within this pathway suggest that undiscovered visual signals may be present at other stages as well.
Affiliation(s)
- Meredith N Schmehl
- Department of Neurobiology, Duke University, Durham, North Carolina 27708, USA
- Center for Cognitive Neuroscience, Duke University, Durham, North Carolina 27708, USA
- Duke Institute for Brain Sciences, Duke University, Durham, North Carolina 27708, USA
- Jennifer M Groh
- Department of Neurobiology, Duke University, Durham, North Carolina 27708, USA
- Department of Psychology & Neuroscience, Duke University, Durham, North Carolina 27708, USA
- Department of Computer Science, Duke University, Durham, North Carolina 27708, USA
- Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708, USA
- Center for Cognitive Neuroscience, Duke University, Durham, North Carolina 27708, USA
- Duke Institute for Brain Sciences, Duke University, Durham, North Carolina 27708, USA
10
Opoku-Baah C, Schoenhaut AM, Vassall SG, Tovar DA, Ramachandran R, Wallace MT. Visual Influences on Auditory Behavioral, Neural, and Perceptual Processes: A Review. J Assoc Res Otolaryngol 2021; 22:365-386. [PMID: 34014416] [PMCID: PMC8329114] [DOI: 10.1007/s10162-021-00789-0]
Abstract
In a naturalistic environment, auditory cues are often accompanied by information from other senses, which can be redundant with or complementary to the auditory information. Although the multisensory interactions derived from this combination of information and that shape auditory function are seen across all sensory modalities, our greatest body of knowledge to date centers on how vision influences audition. In this review, we attempt to capture the state of our understanding at this point in time regarding this topic. Following a general introduction, the review is divided into 5 sections. In the first section, we review the psychophysical evidence in humans regarding vision's influence in audition, making the distinction between vision's ability to enhance versus alter auditory performance and perception. Three examples are then described that serve to highlight vision's ability to modulate auditory processes: spatial ventriloquism, cross-modal dynamic capture, and the McGurk effect. The final part of this section discusses models that have been built based on available psychophysical data and that seek to provide greater mechanistic insights into how vision can impact audition. The second section reviews the extant neuroimaging and far-field imaging work on this topic, with a strong emphasis on the roles of feedforward and feedback processes, on imaging insights into the causal nature of audiovisual interactions, and on the limitations of current imaging-based approaches. These limitations point to a greater need for machine-learning-based decoding approaches toward understanding how auditory representations are shaped by vision. The third section reviews the wealth of neuroanatomical and neurophysiological data from animal models that highlights audiovisual interactions at the neuronal and circuit level in both subcortical and cortical structures. It also speaks to the functional significance of audiovisual interactions for two critically important facets of auditory perception-scene analysis and communication. The fourth section presents current evidence for alterations in audiovisual processes in three clinical conditions: autism, schizophrenia, and sensorineural hearing loss. These changes in audiovisual interactions are postulated to have cascading effects on higher-order domains of dysfunction in these conditions. The final section highlights ongoing work seeking to leverage our knowledge of audiovisual interactions to develop better remediation approaches to these sensory-based disorders, founded in concepts of perceptual plasticity in which vision has been shown to have the capacity to facilitate auditory learning.
Affiliation(s)
- Collins Opoku-Baah
- Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Adriana M Schoenhaut
- Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Sarah G Vassall
- Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- David A Tovar
- Neuroscience Graduate Program, Vanderbilt University, Nashville, TN, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Ramnarayan Ramachandran
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Department of Psychology, Vanderbilt University, Nashville, TN, USA
- Department of Hearing and Speech, Vanderbilt University Medical Center, Nashville, TN, USA
- Vanderbilt Vision Research Center, Nashville, TN, USA
- Mark T Wallace
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Department of Psychology, Vanderbilt University, Nashville, TN, USA
- Department of Hearing and Speech, Vanderbilt University Medical Center, Nashville, TN, USA
- Vanderbilt Vision Research Center, Nashville, TN, USA
- Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, TN, USA
- Department of Pharmacology, Vanderbilt University, Nashville, TN, USA
11
Olthof BMJ, Rees A, Gartside SE. Multiple Nonauditory Cortical Regions Innervate the Auditory Midbrain. J Neurosci 2019; 39:8916-8928. [PMID: 31541020] [PMCID: PMC6832679] [DOI: 10.1523/jneurosci.1436-19.2019]
Abstract
Our perceptual experience of sound depends on the integration of multiple sensory and cognitive domains; however, the networks subserving this integration are unclear. Connections linking different cortical domains have been described, but we do not know the extent to which connections also exist between multiple cortical domains and subcortical structures. Retrograde tracing in adult male rats (Rattus norvegicus) revealed that the inferior colliculus, the auditory midbrain, receives dense descending projections not only, as previously established, from the auditory cortex, but also from the visual, somatosensory, motor, and prefrontal cortices. While all these descending connections are bilateral, those from sensory areas show a more pronounced ipsilateral dominance than those from motor and prefrontal cortices. Injections of anterograde tracers into the cortical areas identified by retrograde tracing confirmed those findings and revealed cortical fibers terminating in all three subdivisions of the inferior colliculus. Immunolabeling showed that cortical terminals target both GABAergic inhibitory and putative glutamatergic excitatory neurons. These findings demonstrate that auditory perception and behavior are served by a network that includes extensive descending connections to the midbrain from sensory, behavioral, and executive cortices.
SIGNIFICANCE STATEMENT: Making sense of what we hear depends not only on the analysis of sound, but also on information from other senses together with the brain's predictions about the properties and significance of the sound. Previous work suggested that this interplay between the senses and the predictions from higher cognitive centers occurs within the cerebral cortex. By tracing neural connections in rat, we show that the inferior colliculus, the subcortical, midbrain center for hearing, receives extensive connections from areas of the cerebral cortex concerned with vision, touch, movement, and cognitive function, in addition to areas representing hearing. These findings demonstrate that wide-ranging cortical feedback operates at an earlier stage of the hearing pathway than previously recognized.
Affiliation(s)
- Bas M J Olthof
- Institute of Neuroscience, Newcastle University, Newcastle upon Tyne NE2 4HH, United Kingdom
- Adrian Rees
- Institute of Neuroscience, Newcastle University, Newcastle upon Tyne NE2 4HH, United Kingdom
- Sarah E Gartside
- Institute of Neuroscience, Newcastle University, Newcastle upon Tyne NE2 4HH, United Kingdom
12
Dong CM, Leong ATL, Manno FA, Lau C, Ho LC, Chan RW, Feng Y, Gao PP, Wu EX. Functional MRI Investigation of Audiovisual Interactions in Auditory Midbrain. Annu Int Conf IEEE Eng Med Biol Soc 2018:5527-5530. [PMID: 30441589] [DOI: 10.1109/embc.2018.8513629]
Abstract
The brain integrates information from different sensory modalities to form a representation of the environment and facilitate behavioral responses. The auditory midbrain or inferior colliculus (IC) is a pivotal station in the auditory system, integrating ascending and descending information from various auditory sources and cortical systems. The present study investigated the modulation of auditory responses in the IC by visual stimuli of different frequencies and intensities in rats using functional MRI (fMRI). A low-frequency (1 Hz), high-intensity visual stimulus suppressed IC auditory responses, whereas high-frequency (10 Hz) or low-intensity visual stimuli did not alter them. This finding demonstrates that cross-modal processing occurs in the IC in a stimulus-dependent manner. Furthermore, only the low-frequency, high-intensity visual stimulus elicited responses in non-visual cortical regions, suggesting that the above cross-modal modulation effect may arise from top-down cortical feedback. These fMRI results provide insight to guide future studies of cross-modal processing in sensory pathways.
13
Sadras N, Pesaran B, Shanechi MM. A point-process matched filter for event detection and decoding from population spike trains. J Neural Eng 2019; 16:066016. [PMID: 31437831] [DOI: 10.1088/1741-2552/ab3dbc]
Abstract
Objective: Information encoding in neurons can be described through their response fields. The spatial response field of a neuron is the region of space in which a sensory stimulus or a behavioral event causes that neuron to fire. Neurons can also exhibit temporal response fields (TRFs), which characterize a transient response to stimulus or behavioral event onsets. These neurons can thus be described by a spatio-temporal response field (STRF). The activity of neurons with STRFs can be well-described with point process models that characterize binary spike trains with an instantaneous firing rate that is a function of both time and space. However, developing decoders for point process models of neurons that exhibit TRFs is challenging because it requires prior knowledge of event onset times, which are unknown. Indeed, point process filters (PPF) to date have largely focused on decoding neuronal activity without considering TRFs. Also, neural classifiers have required data to be behavior- or stimulus-aligned, i.e. event times to be known, which is often not possible in real-world applications. Our objective in this work is to develop a viable decoder for neurons with STRFs when event times are unknown.
Approach: To enable decoding of neurons with STRFs, we develop a novel point-process matched filter (PPMF) that can detect events and estimate their onset times from population spike trains. We also devise a PPF for neurons with transient responses as characterized by STRFs. When neurons exhibit STRFs and event times are unknown, the PPMF can be combined with the PPF or with discrete classifiers for continuous and discrete brain state decoding, respectively.
Main results: We validate our algorithm on two datasets: simulated spikes from neurons that encode visual saliency in response to stimuli, and prefrontal spikes recorded in a monkey performing a delayed-saccade task. We show that the PPMF can estimate the stimulus times and saccade times accurately. Further, the PPMF combined with the PPF can decode visual saliency maps without knowing the stimulus times. Similarly, the PPMF combined with a point process classifier can decode the saccade direction without knowing the saccade times.
Significance: These event detection and decoding algorithms can help develop neurotechnologies to decode cognitive states from neural responses that exhibit STRFs.
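A conceptual sketch of the point-process matched filter idea: slide a temporal-response-field (TRF) template over binned spikes and compute, at each lag, the Poisson log-likelihood ratio of "event at this time" versus baseline; peaks in that score mark candidate event onsets. The rates, template shape, and detection threshold below are illustrative assumptions, not the paper's implementation.

```python
# Sketch: Poisson log-likelihood-ratio matched filter for event detection.
import numpy as np

rng = np.random.default_rng(5)
dt, T = 0.001, 5.0                   # 1 ms bins, 5 s of data
n_bins = int(T / dt)
baseline = 5.0                       # baseline firing rate (spikes/s)

# TRF template: a 100 ms transient rate increase after an event onset.
trf = baseline + 40.0 * np.exp(-np.arange(100) / 30.0)

# Simulate spikes with events at 1.0 s and 3.2 s.
rate = np.full(n_bins, baseline)
for onset in (1000, 3200):
    rate[onset:onset + 100] = trf
spikes = rng.poisson(rate * dt)

# Matched filter: log-likelihood ratio of "TRF starting here" vs. baseline.
llr_kernel = np.log(trf / baseline)            # per-spike evidence weights
bias = -(trf - baseline).sum() * dt            # rate-normalization term
llr = np.convolve(spikes, llr_kernel[::-1], mode="valid") + bias
print("detected onsets (s):", np.flatnonzero(llr > llr.max() * 0.6)[:5] * dt)
```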
Affiliation(s)
- Nitin Sadras
- Ming Hsieh Department of Electrical and Computer Engineering, Viterbi School of Engineering, University of Southern California, Los Angeles, CA, United States of America
14
Visual input shapes the auditory frequency responses in the inferior colliculus of mouse. Hear Res 2019; 381:107777. [PMID: 31430633] [DOI: 10.1016/j.heares.2019.107777]
Abstract
The integration of visual and auditory information is important for humans and animals to build an accurate and coherent perception of the external world. Although some principles of audiovisual integration have been demonstrated, little insight has been gained into its functional purpose. In this study, we investigated the functional influence of dynamic visual input on auditory frequency processing by recording single-unit activity in the central nucleus of the inferior colliculus (ICc). Results showed that the auditory responses of ICc neurons to sound frequencies could be enhanced or suppressed by visual stimuli, even though the same visual stimuli induced no neural responses when presented alone. For each ICc neuron, the most effective visual stimuli were located at the same azimuth as the auditory stimuli and preceded them by ∼20 ms. Additionally, visual stimuli could steepen or flatten the frequency tuning curves (FTCs) of ICc neurons through varied visual effects at each responsive frequency. The degree of modulation of auditory FTCs depended on the minimal thresholds (MTs) of ICc neurons: as MTs increased, the degree of modulation decreased. Because of the non-homogeneous distribution of MTs, which was lowest at 10 kHz, visual modulation of auditory FTCs was frequency-specific: the closer a neuron's characteristic frequency (CF) was to 10 kHz, the greater the modulation. Thus, visual modulation of auditory frequency responses in the ICc depends not only on the visual stimulus but also on the auditory characteristics of ICc neurons. These results suggest a moment-to-moment visual modulation of auditory frequency responses that increases auditory frequency sensitivity to audiovisual stimuli in real time. In the longer term, such modulation could serve to instruct auditory adaptive plasticity to maintain accurate auditory detection and perceptual behavior.
15
Leong ATL, Dong CM, Gao PP, Chan RW, To A, Sanes DH, Wu EX. Optogenetic auditory fMRI reveals the effects of visual cortical inputs on auditory midbrain response. Sci Rep 2018; 8:8736. [PMID: 29880842] [PMCID: PMC5992211] [DOI: 10.1038/s41598-018-26568-1]
Abstract
Sensory cortices contain extensive descending (corticofugal) pathways, yet their impact on brainstem processing - particularly across sensory systems - remains poorly understood. In the auditory system, the inferior colliculus (IC) in the midbrain receives cross-modal inputs from the visual cortex (VC). However, the influences from VC on auditory midbrain processing are unclear. To investigate whether and how visual cortical inputs affect IC auditory responses, the present study combines auditory blood-oxygenation-level-dependent (BOLD) functional MRI (fMRI) with cell-type specific optogenetic manipulation of visual cortex. The results show that predominant optogenetic excitation of the excitatory pyramidal neurons in the infragranular layers of the primary VC enhances the noise-evoked BOLD fMRI responses within the IC. This finding reveals that inputs from VC influence and facilitate basic sound processing in the auditory midbrain. Such a combined optogenetic and auditory fMRI approach can shed light on the large-scale modulatory effects of corticofugal pathways and guide detailed electrophysiological studies in the future.
Affiliation(s)
- Alex T L Leong
- Laboratory of Biomedical Imaging and Signal Processing, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Department of Electrical and Electronic Engineering, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Celia M Dong
- Laboratory of Biomedical Imaging and Signal Processing, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Department of Electrical and Electronic Engineering, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Patrick P Gao
- Laboratory of Biomedical Imaging and Signal Processing, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Department of Electrical and Electronic Engineering, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Russell W Chan
- Laboratory of Biomedical Imaging and Signal Processing, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Department of Electrical and Electronic Engineering, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Anthea To
- Laboratory of Biomedical Imaging and Signal Processing, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Department of Electrical and Electronic Engineering, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Dan H Sanes
- Center for Neural Science, New York University, New York, NY, 10003, United States
- Ed X Wu
- Laboratory of Biomedical Imaging and Signal Processing, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Department of Electrical and Electronic Engineering, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- School of Biomedical Sciences, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
- Department of Medicine, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
16
Gruters KG, Murphy DLK, Jenson CD, Smith DW, Shera CA, Groh JM. The eardrums move when the eyes move: A multisensory effect on the mechanics of hearing. Proc Natl Acad Sci U S A 2018; 115:E1309-E1318. [PMID: 29363603] [PMCID: PMC5819440] [DOI: 10.1073/pnas.1717948115]
Abstract
The peripheral hearing system contains several motor mechanisms that allow the brain to modify the auditory transduction process. Movements or tensioning of either the middle ear muscles or the outer hair cells modifies eardrum motion, producing sounds that can be detected by a microphone placed in the ear canal (e.g., as otoacoustic emissions). Here, we report a form of eardrum motion produced by the brain via these systems: oscillations synchronized with and covarying with the direction and amplitude of saccades. These observations suggest that a vision-related process modulates the first stage of hearing. In particular, these eye movement-related eardrum oscillations may help the brain connect sights and sounds despite changes in the spatial relationship between the eyes and the ears.
Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in three subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement. The eardrum motion was oscillatory and began as early as 10 ms before saccade onset in humans or with saccade onset in monkeys. These eardrum movements, which we dub eye movement-related eardrum oscillations (EMREOs), occurred in the absence of a sound stimulus. The amplitude and phase of the EMREOs depended on the direction and horizontal amplitude of the saccade. They lasted throughout the saccade and well into subsequent periods of steady fixation. We discuss the possibility that the mechanisms underlying EMREOs create eye movement-related binaural cues that may aid the brain in evaluating the relationship between visual and auditory stimulus locations as the eyes move.
17
Mellott JG, Beebe NL, Schofield BR. GABAergic and non-GABAergic projections to the superior colliculus from the auditory brainstem. Brain Struct Funct 2018; 223:1923-1936. [PMID: 29302743] [DOI: 10.1007/s00429-017-1599-4]
Abstract
The superior colliculus (SC) contains an auditory space map that is shaped by projections from several subcortical auditory nuclei. Both GABAergic (inhibitory) and excitatory cells contribute to these inputs, but there are contradictory reports regarding the sources of these inputs. We used retrograde tracing techniques in guinea pigs to identify cells in the auditory brainstem that project to the SC. We combined retrograde tracing with immunohistochemistry for glutamic acid decarboxylase (GAD) to identify putative GABAergic cells that participate in this pathway. Following a tracer injection in the SC, the nucleus of the brachium of the inferior colliculus (NBIC) contained the most labeled cells, followed by the inferior colliculus (IC). Smaller populations were observed in the sagulum, paralemniscal area, periolivary nuclei and ventrolateral tegmental nucleus. Overall, only 10% of the retrogradely labeled cells were GAD immunopositive. The presumptive inhibitory cells were observed in the NBIC, IC, superior paraolivary nucleus, sagulum and paralemniscal area. We conclude that the guinea pig SC receives input from a diverse set of auditory brainstem nuclei, some of which provide GABAergic input. These diverse origins of input to the SC likely represent a variety of functions. Inputs from the NBIC and IC likely provide spatial information for guiding orienting behaviors. Inputs from subcollicular nuclei are less likely to provide spatial information; rather, they may provide a shorter route for auditory information to reach the SC, and could generate avoidance or escape responses to an external threat.
Affiliation(s)
- Jeffrey G Mellott
- Department of Anatomy and Neurobiology, Northeast Ohio Medical University, 4209 State Route 44, PO Box 95, Rootstown, OH, USA
- Nichole L Beebe
- Department of Anatomy and Neurobiology, Northeast Ohio Medical University, 4209 State Route 44, PO Box 95, Rootstown, OH, USA
- Brett R Schofield
- Department of Anatomy and Neurobiology, Northeast Ohio Medical University, 4209 State Route 44, PO Box 95, Rootstown, OH, USA.
18
Stitt I, Galindo-Leon E, Pieper F, Hollensteiner KJ, Engler G, Engel AK. Auditory and visual interactions between the superior and inferior colliculi in the ferret. Eur J Neurosci 2015; 41:1311-1320. [PMID: 25645363] [DOI: 10.1111/ejn.12847]
Abstract
The integration of visual and auditory spatial information is important for building an accurate perception of the external world, but the fundamental mechanisms governing such audiovisual interaction have only partially been resolved. The earliest interface between auditory and visual processing pathways is in the midbrain, where the superior (SC) and inferior colliculi (IC) are reciprocally connected in an audiovisual loop. Here, we investigate the mechanisms of audiovisual interaction in the midbrain by recording neural signals from the SC and IC simultaneously in anesthetized ferrets. Visual stimuli reliably produced band-limited phase locking of IC local field potentials (LFPs) in two distinct frequency bands: 6-10 and 15-30 Hz. These visual LFP responses co-localized with robust auditory responses that were characteristic of the IC. Imaginary coherence analysis confirmed that visual responses in the IC were not volume-conducted signals from the neighboring SC. Visual responses in the IC occurred later than responses in the retinally driven superficial SC layers and earlier than those in deep SC layers that receive indirect visual inputs, suggesting that retinal inputs do not drive visually evoked responses in the IC. In addition, SC and IC recording sites with overlapping visual spatial receptive fields displayed stronger functional connectivity than sites with separate receptive fields, indicating that visual spatial maps are aligned across both midbrain structures. Reciprocal coupling between the IC and SC therefore probably serves the dynamic integration of visual and auditory representations of space.
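Imaginary coherence, the measure this study uses to rule out volume conduction, can be sketched compactly: volume-conducted signals mix with zero phase lag, so only the imaginary part of the coherency reflects genuinely time-lagged coupling. The sketch below computes it with scipy.signal for two simulated stand-ins for SC and IC field potentials; the signals and parameters are illustrative assumptions.

```python
# Sketch: imaginary coherence between two simulated field-potential signals.
import numpy as np
from scipy.signal import csd, welch

fs = 1000
rng = np.random.default_rng(6)
t = np.arange(0, 30, 1 / fs)

common = np.sin(2 * np.pi * 20 * t)                          # shared 20 Hz rhythm
sc_lfp = common + rng.standard_normal(t.size)                # "SC" signal
lag = int(0.01 * fs)                                         # 10 ms lag -> phase shift
ic_lfp = np.roll(common, lag) + rng.standard_normal(t.size)  # "IC" signal

f, Pxy = csd(sc_lfp, ic_lfp, fs=fs, nperseg=1024)            # cross-spectrum
_, Pxx = welch(sc_lfp, fs=fs, nperseg=1024)                  # auto-spectra
_, Pyy = welch(ic_lfp, fs=fs, nperseg=1024)

coherency = Pxy / np.sqrt(Pxx * Pyy)
icoh = np.abs(coherency.imag)                                # imaginary coherence
print(f"imaginary coherence at ~20 Hz: {icoh[np.argmin(np.abs(f - 20))]:.2f}")
```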
Affiliation(s)
- Iain Stitt
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246, Hamburg, Germany
- Edgar Galindo-Leon
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246, Hamburg, Germany
- Florian Pieper
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246, Hamburg, Germany
- Karl J Hollensteiner
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246, Hamburg, Germany
- Gerhard Engler
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246, Hamburg, Germany
- Andreas K Engel
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246, Hamburg, Germany
19
Maddox RK, Pospisil DA, Stecker GC, Lee AKC. Directing eye gaze enhances auditory spatial cue discrimination. Curr Biol 2014; 24:748-752. [PMID: 24631242] [DOI: 10.1016/j.cub.2014.02.021]
Abstract
The present study demonstrates, for the first time, a specific enhancement of auditory spatial cue discrimination due to eye gaze. Whereas the region of sharpest visual acuity, called the fovea, can be directed at will by moving one's eyes, auditory spatial information is derived primarily from head-related acoustic cues. Past auditory studies have found better discrimination in front of the head [1-3] but have not manipulated subjects' gaze, thus overlooking potential oculomotor influences. Electrophysiological studies have shown that the inferior colliculus, a critical auditory midbrain nucleus, shows visual and oculomotor responses [4-6] and modulations of auditory activity [7-9], and that auditory neurons in the superior colliculus show shifting receptive fields [10-13]. How the auditory system leverages this crossmodal information at the behavioral level remains unknown. Here we directed subjects' gaze (with an eccentric dot) or auditory attention (with lateralized noise) while they performed an auditory spatial cue discrimination task. We found that directing gaze toward a sound significantly enhances discrimination of both interaural level and time differences, whereas directing auditory spatial attention does not. These results show that oculomotor information variably enhances auditory spatial resolution even when the head remains stationary, revealing a distinct behavioral benefit possibly arising from auditory-oculomotor interactions at an earlier level of processing than previously demonstrated.
Affiliation(s)
- Ross K Maddox
- Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA
- Dean A Pospisil
- Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA
- G Christopher Stecker
- Department of Speech and Hearing Sciences, University of Washington, 1417 NE 42nd Street, Eagleson Hall, Box 354875, Seattle, WA 98105, USA; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, 1215 21st Avenue South, Room 8310, Nashville, TN 37232, USA
- Adrian K C Lee
- Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA; Department of Speech and Hearing Sciences, University of Washington, 1417 NE 42nd Street, Eagleson Hall, Box 354875, Seattle, WA 98105, USA
20
Gruters KG, Groh JM. Sounds and beyond: multisensory and other non-auditory signals in the inferior colliculus. Front Neural Circuits 2012; 6:96. [PMID: 23248584] [PMCID: PMC3518932] [DOI: 10.3389/fncir.2012.00096]
Abstract
The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory sources. Neurophysiological studies corroborate that non-auditory stimuli can modulate auditory processing in the IC and even elicit responses independent of coincident auditory stimulation. In this article, we review anatomical and physiological evidence for multisensory and other non-auditory processing in the IC. Specifically, the contributions of signals related to vision, eye movements and position, somatosensation, and behavioral context to neural activity in the IC will be described. These signals are potentially important for localizing sound sources, attending to salient stimuli, distinguishing environmental from self-generated sounds, and perceiving and generating communication sounds. They suggest that the IC should be thought of as a node in a highly interconnected sensory, motor, and cognitive network dedicated to synthesizing a higher-order auditory percept rather than simply reporting patterns of air pressure detected by the cochlea. We highlight some of the potential pitfalls that can arise from experimental manipulations that may disrupt the normal function of this network, such as the use of anesthesia or the severing of connections from cortical structures that project to the IC. Finally, we note that the presence of these signals in the IC has implications for our understanding not just of the IC but also of the multitude of other regions within and beyond the auditory system that are dependent on signals that pass through the IC. Whatever the IC “hears” would seem to be passed both “upward” to thalamus and thence to auditory cortex and beyond, as well as “downward” via centrifugal connections to earlier areas of the auditory pathway such as the cochlear nucleus.
Affiliation(s)
- Kurtis G Gruters
- Department of Psychology and Neuroscience, Duke University Durham, NC, USA