1
Gehmacher Q, Schubert J, Schmidt F, Hartmann T, Reisinger P, Rösch S, Schwarz K, Popov T, Chait M, Weisz N. Eye movements track prioritized auditory features in selective attention to natural speech. Nat Commun 2024; 15:3692. PMID: 38693186; PMCID: PMC11063150; DOI: 10.1038/s41467-024-48126-2
Abstract
Over recent decades, cognitive neuroscience has identified a distributed set of brain regions that are critical for attention. Strong anatomical overlap with brain regions critical for oculomotor processes suggests a joint network for attention and eye movements. However, the role of this shared network in complex, naturalistic environments remains understudied. Here, we investigated eye movements in relation to (un)attended sentences of natural speech. Combining simultaneously recorded eye tracking and magnetoencephalographic data with temporal response functions, we show that gaze tracks attended speech, a phenomenon we termed ocular speech tracking. Ocular speech tracking even differentiates a target from a distractor in a multi-speaker context and is further related to intelligibility. Moreover, we provide evidence for its contribution to neural differences in speech processing, emphasizing the necessity to consider oculomotor activity in future research and in the interpretation of neural differences in auditory cognition.
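The temporal response function (TRF) approach mentioned in this abstract is, at its core, a time-lagged regularized linear regression from a speech feature (e.g., the acoustic envelope) to a recorded signal such as gaze position. A minimal sketch using a closed-form ridge estimator (the function name, parameters, and synthetic setup here are illustrative assumptions, not the authors' actual pipeline):

```python
import numpy as np

def estimate_trf(stimulus, response, lags, alpha=1.0):
    """Estimate a temporal response function by ridge regression.

    stimulus : 1-D array, e.g. a speech envelope sampled over time
    response : 1-D array, e.g. horizontal gaze position, same length
    lags     : list of integer sample lags to include in the model
    alpha    : ridge regularization strength
    """
    lags = list(lags)
    # Design matrix: one column per time-lagged copy of the stimulus.
    X = np.column_stack([np.roll(stimulus, lag) for lag in lags])
    # Closed-form ridge solution: w = (X'X + alpha*I)^{-1} X'y
    w = np.linalg.solve(X.T @ X + alpha * np.eye(len(lags)), X.T @ response)
    return w
```

"Tracking" is then typically quantified by correlating the model's prediction (`X @ w`) with held-out response data; a stronger fit for the attended than the unattended talker is the kind of effect the abstract describes.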
Affiliation(s)
- Quirin Gehmacher
- Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria
- Juliane Schubert
- Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria
- Fabian Schmidt
- Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria
- Thomas Hartmann
- Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria
- Patrick Reisinger
- Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria
- Sebastian Rösch
- Department of Otorhinolaryngology, Head and Neck Surgery, Paracelsus Medical University Salzburg, 5020, Salzburg, Austria
- Tzvetan Popov
- Methods of Plasticity Research, Department of Psychology, University of Zurich, CH-8050, Zurich, Switzerland
- Department of Psychology, University of Konstanz, DE-78464, Konstanz, Germany
- Maria Chait
- Ear Institute, University College London, London, UK
- Nathan Weisz
- Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria
- Neuroscience Institute, Christian Doppler University Hospital, Paracelsus Medical University, Salzburg, Austria
2
King CD, Lovich SN, Murphy DL, Landrum R, Kaylie D, Shera CA, Groh JM. Individual similarities and differences in eye-movement-related eardrum oscillations (EMREOs). Hear Res 2023; 440:108899. PMID: 37979436; PMCID: PMC11081086; DOI: 10.1016/j.heares.2023.108899
Abstract
We recently discovered a unique type of otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscillations (EMREOs) are generated is unknown, with a role in visual-auditory integration being the likeliest candidate. Clues to both the drivers of EMREOs and their purpose can be gleaned by examining responses in normal hearing human subjects. Do EMREOs occur in all individuals with normal hearing? If so, what components of the response occur most consistently? Understanding which attributes of EMREOs are similar across participants and which show more variability will provide the groundwork for future comparisons with individuals with hearing abnormalities affecting the ear's various motor components. Here we report that in subjects with normal hearing thresholds and normal middle ear function, all ears show (a) measurable EMREOs (mean: 58.7 dB SPL; range 45-67 dB SPL for large contralateral saccades), (b) a phase reversal for contra- versus ipsilaterally-directed saccades, (c) a large peak in the signal occurring soon after saccade onset, (d) an additional large peak time-locked to saccade offset and (e) evidence that saccade duration is encoded in the signal. We interpret the attributes of EMREOs that are most consistent across subjects as the ones that are most likely to play an essential role in their function. The individual differences likely reflect normal variation in individuals' auditory system anatomy and physiology, much like traditional measures of auditory function such as auditory-evoked OAEs, tympanometry and auditory-evoked potentials. Future work will compare subjects with different types of auditory dysfunction to population data from normal hearing subjects. 
Overall, these findings provide important context for the widespread observations of visual- and eye-movement related signals found in cortical and subcortical auditory areas of the brain.
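As an illustration only: the phase reversal in feature (b) above means that contra- and ipsilaterally directed saccades of equal size produce sign-inverted waveforms. A toy model (not the authors' analysis; the ~30 Hz frequency and linear amplitude scaling are illustrative assumptions) captures that property:

```python
import numpy as np

def toy_emreo(t, saccade_deg, freq_hz=30.0):
    """Toy EMREO: an oscillation whose amplitude scales with horizontal
    saccade size and whose sign (phase) flips with saccade direction.

    t           : 1-D array of times (s) relative to saccade onset
    saccade_deg : signed horizontal saccade amplitude (+ contra, - ipsi)
    freq_hz     : assumed oscillation frequency (illustrative)
    """
    return saccade_deg * np.sin(2 * np.pi * freq_hz * t)
```

In this sketch, a -10° saccade yields exactly the inverted waveform of a +10° saccade, i.e., a 180° phase reversal, while larger saccades yield larger oscillations.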
Affiliation(s)
- Cynthia D King
- Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA
- Stephanie N Lovich
- Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA
- David L K Murphy
- Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA
- Rachel Landrum
- Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA
- David Kaylie
- Department of Otolaryngology, Duke University Medical Center, Durham, NC, USA
- Christopher A Shera
- Department of Otolaryngology, University of Southern California, Los Angeles, CA, USA
- Jennifer M Groh
- Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA; Department of Computer Science, Duke University, Durham, NC, USA; Department of Biomedical Engineering, Duke University, Durham, NC, USA
3
Lovich SN, King CD, Murphy DLK, Landrum RE, Shera CA, Groh JM. Parametric information about eye movements is sent to the ears. Proc Natl Acad Sci U S A 2023; 120:e2303562120. PMID: 37988462; PMCID: PMC10691342; DOI: 10.1073/pnas.2303562120
Abstract
Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect neural pathways from the ear through auditory cortex and beyond, but how these signals contribute to computing the locations of sounds with respect to the visual scene is poorly understood. Here, we evaluated the information contained in eye movement-related eardrum oscillations (EMREOs), pressure changes recorded in the ear canal that occur in conjunction with simultaneous eye movements. We show that EMREOs contain parametric information about horizontal and vertical eye displacement as well as initial/final eye position with respect to the head. The parametric information in the horizontal and vertical directions can be modeled as combining linearly, allowing accurate prediction of the EMREOs associated with oblique (diagonal) eye movements. Target location can also be inferred from the EMREO signals recorded during eye movements to those targets. We hypothesize that the (currently unknown) mechanism underlying EMREOs could impose a two-dimensional eye-movement-related transfer function on any incoming sound, permitting subsequent processing stages to compute the positions of sounds in relation to the visual scene.
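The linear-combination claim can be read as: the EMREO for an oblique movement is approximately the per-degree horizontal basis waveform scaled by the horizontal displacement, plus the per-degree vertical basis scaled by the vertical displacement. A sketch under that assumption (variable names are illustrative, not the authors' code):

```python
import numpy as np

def predict_oblique_emreo(h_basis, v_basis, dx_deg, dy_deg):
    """Predict the EMREO for an oblique saccade as a linear combination
    of per-degree horizontal and vertical basis waveforms.

    h_basis, v_basis : 1-D arrays, EMREO waveform per degree of horizontal
                       (resp. vertical) eye displacement
    dx_deg, dy_deg   : horizontal and vertical saccade components (deg)
    """
    return dx_deg * np.asarray(h_basis) + dy_deg * np.asarray(v_basis)
```

In the paper's framework, the same linearity is what allows the mapping to be inverted, so that target location can be inferred from the recorded waveform.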
Affiliation(s)
- Stephanie N. Lovich
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Department of Neurobiology, Duke University, Durham, NC 27710
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- Cynthia D. King
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Department of Neurobiology, Duke University, Durham, NC 27710
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- David L. K. Murphy
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- Rachel E. Landrum
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Department of Neurobiology, Duke University, Durham, NC 27710
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- Christopher A. Shera
- Department of Otolaryngology, University of Southern California, Los Angeles, CA 90007
- Jennifer M. Groh
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Department of Neurobiology, Duke University, Durham, NC 27710
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- Department of Computer Science, Duke University, Durham, NC 27708
- Department of Biomedical Engineering, Duke University, Durham, NC 27708
4
Lovich SN, King CD, Murphy DLK, Abbasi H, Bruns P, Shera CA, Groh JM. Conserved features of eye movement related eardrum oscillations (EMREOs) across humans and monkeys. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220340. PMID: 37545299; PMCID: PMC10404921; DOI: 10.1098/rstb.2022.0340
Abstract
Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory spatial signals. The recent discovery of eye movement-related eardrum oscillations (EMREOs) suggests that this process could begin as early as the auditory periphery. How this reconciliation might happen remains poorly understood. Because humans and monkeys both have mobile eyes and therefore both must perform this shift of reference frames, comparison of the EMREO across species can provide insights into shared, and therefore important, parameters of the signal. Here we show that rhesus monkeys, like humans, have a consistent, significant EMREO signal that carries parametric information about eye displacement as well as onset times of eye movements. The dependence of the EMREO on the horizontal displacement of the eye is its most consistent feature and is shared across behavioural tasks, subjects and species. Differences chiefly involve the waveform frequency (higher in monkeys than in humans), the patterns of individual variation (more prominent in monkeys than in humans), and the waveform of the EMREO that remains when factors due to horizontal and vertical eye displacements are controlled for. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Stephanie N. Lovich
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Department of Neurobiology, Duke University, Durham, NC 27708-0187, USA
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708-0187, USA
- Cynthia D. King
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Department of Neurobiology, Duke University, Durham, NC 27708-0187, USA
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708-0187, USA
- David L. K. Murphy
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708-0187, USA
- Department of Psychiatry and Behavioral Sciences, Duke University, Durham, NC 27708-0187, USA
- Hossein Abbasi
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg 20146, Germany
- Patrick Bruns
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg 20146, Germany
- Christopher A. Shera
- Department of Otolaryngology, University of Southern California, Los Angeles, CA 90033, USA
- Jennifer M. Groh
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Department of Neurobiology, Duke University, Durham, NC 27708-0187, USA
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708-0187, USA
- Department of Computer Science, Duke University, Durham, NC 27708-0187, USA
- Department of Biomedical Engineering, Duke University, Durham, NC 27708-0187, USA
5
King CD, Lovich SN, Murphy DLK, Landrum R, Kaylie D, Shera CA, Groh JM. Individual similarities and differences in eye-movement-related eardrum oscillations (EMREOs). bioRxiv 2023:2023.03.09.531896. Preprint. PMID: 36945521; PMCID: PMC10028987; DOI: 10.1101/2023.03.09.531896
Abstract
We recently discovered a unique type of low-frequency otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscillations (EMREOs) are generated is unknown, with a role in visual-auditory integration being the likeliest candidate. Clues to both the drivers of EMREOs and their purpose can be gleaned by examining responses in normal hearing human subjects. Do EMREOs occur in all individuals with normal hearing? If so, what components of the response occur most consistently? Understanding which attributes of EMREOs are similar across participants and which show more variability will provide the groundwork for future comparisons with individuals with hearing abnormalities affecting the ear's various motor components. Here we report that in subjects with normal hearing thresholds and normal middle ear function, all ears show (a) measurable EMREOs (mean: 58.7 dB SPL; range 45-67 dB SPL for large contralateral saccades), (b) a phase reversal for contra- versus ipsilaterally-directed saccades, (c) a large peak in the signal occurring soon after saccade onset, (d) an additional large peak time-locked to saccade offset and (e) evidence that saccade duration is encoded in the signal. We interpret the attributes of EMREOs that are most consistent across subjects as the ones that are most likely to play an essential role in their function. The individual differences likely reflect normal variation in individuals' auditory system anatomy and physiology, much like traditional measures of auditory function such as auditory-evoked OAEs, tympanometry and auditory-evoked potentials. Future work will compare subjects with different types of auditory dysfunction to population data from normal hearing subjects. 
Overall, these findings provide important context for the widespread observations of visual- and eye-movement related signals found in cortical and subcortical auditory areas of the brain.
6
Lovich SN, King CD, Murphy DLK, Abbasi H, Bruns P, Shera CA, Groh J. Conserved features of eye movement related eardrum oscillations (EMREOs) across humans and monkeys. bioRxiv 2023:2023.03.08.531768. Preprint. PMID: 36945629; PMCID: PMC10028923; DOI: 10.1101/2023.03.08.531768
Abstract
Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory spatial signals. The recent discovery of eye movement-related eardrum oscillations (EMREOs) suggests that this process could begin as early as the auditory periphery. How this reconciliation might happen remains poorly understood. Because humans and monkeys both have mobile eyes and therefore both must perform this shift of reference frames, comparison of the EMREO across species can provide insights into shared, and therefore important, parameters of the signal. Here we show that rhesus monkeys, like humans, have a consistent, significant EMREO signal that carries parametric information about eye displacement as well as onset times of eye movements. The dependence of the EMREO on the horizontal displacement of the eye is its most consistent feature and is shared across behavioral tasks, subjects, and species. Differences chiefly involve the waveform frequency (higher in monkeys than in humans), the patterns of individual variation (more prominent in monkeys than in humans), and the waveform of the EMREO that remains when factors due to horizontal and vertical eye displacements are controlled for.
7
Jun NY, Ruff DA, Kramer LE, Bowes B, Tokdar ST, Cohen MR, Groh JM. Coordinated multiplexing of information about separate objects in visual cortex. eLife 2022; 11:e76452. PMID: 36444983; PMCID: PMC9708082; DOI: 10.7554/elife.76452
Abstract
Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present at a given moment? We recently showed that when more than one stimulus is present, single neurons can fluctuate between coding one vs. the other(s) across some time period, suggesting a form of neural multiplexing of different stimuli (Caruso et al., 2018). Here, we investigate (a) whether such coding fluctuations occur in early visual cortical areas; (b) how coding fluctuations are coordinated across the neural population; and (c) how coordinated coding fluctuations depend on the parsing of stimuli into separate vs. fused objects. We found coding fluctuations do occur in macaque V1 but only when the two stimuli form separate objects. Such separate objects evoked a novel pattern of V1 spike count ('noise') correlations involving distinct distributions of positive and negative values. This bimodal correlation pattern was most pronounced among pairs of neurons showing the strongest evidence for coding fluctuations or multiplexing. Whether a given pair of neurons exhibited positive or negative correlations depended on whether the two neurons both responded better to the same object or had different object preferences. Distinct distributions of spike count correlations based on stimulus preferences were also seen in V4 for separate objects but not when two stimuli fused to form one object. These findings suggest multiple objects evoke different response dynamics than those evoked by single stimuli, lending support to the multiplexing hypothesis and suggesting a means by which information about multiple objects can be preserved despite the apparent coarseness of sensory coding.
Affiliation(s)
- Na Young Jun
- Department of Neurobiology, Duke University, Durham, United States
- Center for Cognitive Neuroscience, Duke University, Durham, United States
- Duke Institute for Brain Sciences, Durham, United States
- Douglas A Ruff
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, United States
- Center for the Neural Basis of Cognition, University of Pittsburgh, Pittsburgh, United States
- Lily E Kramer
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, United States
- Center for the Neural Basis of Cognition, University of Pittsburgh, Pittsburgh, United States
- Brittany Bowes
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, United States
- Center for the Neural Basis of Cognition, University of Pittsburgh, Pittsburgh, United States
- Surya T Tokdar
- Department of Statistical Science, Duke University, Durham, United States
- Marlene R Cohen
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, United States
- Center for the Neural Basis of Cognition, University of Pittsburgh, Pittsburgh, United States
- Jennifer M Groh
- Department of Neurobiology, Duke University, Durham, United States
- Center for Cognitive Neuroscience, Duke University, Durham, United States
- Duke Institute for Brain Sciences, Durham, United States
- Department of Psychology and Neuroscience, Duke University, Durham, United States
- Department of Biomedical Engineering, Duke University, Durham, United States
- Department of Computer Science, Duke University, Durham, United States
8
Willett SM, Groh JM. Multiple sounds degrade the frequency representation in monkey inferior colliculus. Eur J Neurosci 2021; 55:528-548. PMID: 34844286; PMCID: PMC9267755; DOI: 10.1111/ejn.15545
Abstract
How we distinguish multiple simultaneous stimuli is uncertain, particularly given that such stimuli sometimes recruit largely overlapping populations of neurons. One commonly proposed hypothesis is that the sharpness of tuning curves might change to limit the number of stimuli driving any given neuron when multiple stimuli are present. To test this hypothesis, we recorded the activity of neurons in the inferior colliculus while monkeys made saccades to either one or two simultaneous sounds differing in frequency and spatial location. Although monkeys easily distinguished simultaneous sounds (~90% correct performance), the frequency selectivity of inferior colliculus neurons on dual-sound trials did not improve in any obvious way. Frequency selectivity was degraded on dual-sound trials compared to single-sound trials: neural response functions broadened and frequency accounted for less of the variance in firing rate. These changes in neural firing led a maximum-likelihood decoder to perform worse on dual-sound trials than on single-sound trials. These results fail to support the hypothesis that changes in frequency response functions serve to reduce the overlap in the representation of simultaneous sounds. Instead, these results suggest that alternative possibilities, such as recent evidence of alternations in firing rate between the rates corresponding to each of the two stimuli, offer a more promising approach.
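A maximum-likelihood decoder of the kind this abstract mentions can be sketched as follows: given per-condition mean spike counts (estimated from training trials) and an assumption of independent Poisson firing, assign a test trial to the condition that maximizes the log-likelihood of its observed counts. This is a generic sketch, not the authors' exact implementation:

```python
import numpy as np

def ml_decode(counts, mean_counts):
    """Maximum-likelihood decoding of the stimulus condition from spike counts.

    counts      : (n_neurons,) spike counts observed on one trial
    mean_counts : (n_conditions, n_neurons) mean counts per condition,
                  estimated from training trials
    Assumes independent Poisson firing across neurons.
    """
    rates = np.clip(np.asarray(mean_counts, dtype=float), 1e-9, None)
    # Poisson log-likelihood per condition, dropping the log(k!) term,
    # which is constant across conditions for a given trial.
    log_lik = np.asarray(counts) @ np.log(rates).T - rates.sum(axis=1)
    return int(np.argmax(log_lik))
```

Degraded frequency selectivity on dual-sound trials shrinks the separation between the conditions' mean-rate vectors, which is why a decoder of this kind performs worse on those trials.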
Affiliation(s)
- Shawn M Willett
- Department of Ophthalmology, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
- Department of Neurobiology, Center for Cognitive Neuroscience, Duke University, Durham, North Carolina, USA
- Jennifer M Groh
- Department of Neurobiology, Center for Cognitive Neuroscience, Duke University, Durham, North Carolina, USA
9
Abstract
Coordination between different sensory systems is a necessary element of sensory processing. Where and how signals from different sense organs converge onto common neural circuitry have become topics of increasing interest in recent years. In this article, we focus specifically on visual-auditory interactions in areas of the mammalian brain that are commonly considered to be auditory in function. The auditory cortex and inferior colliculus are two key points of entry where visual signals reach the auditory pathway, and both contain visual- and/or eye movement-related signals in humans and other animals. The visual signals observed in these auditory structures reflect a mixture of visual modulation of auditory-evoked activity and visually driven responses that are selective for stimulus location or features. These key response attributes also appear in the classic visual pathway but may play a different role in the auditory pathway: to modify auditory rather than visual perception. Finally, while this review focuses on two particular areas of the auditory pathway where this question has been studied, robust descending as well as ascending connections within this pathway suggest that undiscovered visual signals may be present at other stages as well. Expected final online publication date for the Annual Review of Vision Science, Volume 7 is September 2021.
Affiliation(s)
- Meredith N Schmehl
- Department of Neurobiology, Duke University, Durham, North Carolina 27708, USA
- Center for Cognitive Neuroscience, Duke University, Durham, North Carolina 27708, USA
- Duke Institute for Brain Sciences, Duke University, Durham, North Carolina 27708, USA
- Jennifer M Groh
- Department of Neurobiology, Duke University, Durham, North Carolina 27708, USA
- Department of Psychology & Neuroscience, Duke University, Durham, North Carolina 27708, USA
- Department of Computer Science, Duke University, Durham, North Carolina 27708, USA
- Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708, USA
- Center for Cognitive Neuroscience, Duke University, Durham, North Carolina 27708, USA
- Duke Institute for Brain Sciences, Duke University, Durham, North Carolina 27708, USA
10
Abstract
To achieve visual space constancy, our brain remaps eye-centered projections of visual objects across saccades. Here, we measured saccade trajectory curvature following the presentation of visual, auditory, and audiovisual distractors in a double-step saccade task to investigate if this stability mechanism also accounts for localized sounds. We found that saccade trajectories systematically curved away from the position at which either a light or a sound was presented, suggesting that both modalities are represented in eye-centered oculomotor centers. Importantly, the same effect was observed when the distractor preceded the execution of the first saccade. These results suggest that oculomotor centers keep track of visual, auditory and audiovisual objects by remapping their eye-centered representations across saccades. Furthermore, they argue for the existence of a supra-modal map which keeps track of multi-sensory object locations across our movements to create an impression of space constancy.
12
Visual input shapes the auditory frequency responses in the inferior colliculus of mouse. Hear Res 2019; 381:107777. PMID: 31430633; DOI: 10.1016/j.heares.2019.107777
Abstract
The integration of visual and auditory information is important for humans and animals to build an accurate and coherent perception of the external world. Although some evidence has revealed principles of audiovisual integration, little insight has been gained into its functional purpose. In this study, we investigated the functional influence of dynamic visual input on auditory frequency processing by recording single-unit activity in the central nucleus of the inferior colliculus (ICc). Results showed that the auditory responses of ICc neurons to sound frequencies could be enhanced or suppressed by visual stimuli, even though the same visual stimuli induced no neural responses when presented alone. For each ICc neuron, the most effective visual stimuli were located in the same azimuth as the auditory stimuli and preceded them by ∼20 ms. Additionally, visual stimuli could steepen or flatten the frequency tuning curves (FTCs) of ICc neurons through varied visual effects at each responsive frequency. The degree of modulation of auditory FTCs depended on the minimal thresholds (MTs) of ICc neurons: as MTs increased, the degree of modulation decreased. Because of the non-homogeneous distribution of MTs, which was lowest at 10 kHz, visual modulation of auditory FTCs was frequency-specific: the closer a neuron's characteristic frequency (CF) was to 10 kHz, the greater the modulation. Thus, visual modulation of auditory frequency responses in ICc depends not only on the visual stimulus but also on the auditory characteristics of ICc neurons. These results suggest a moment-to-moment visual modulation of auditory frequency responses that in real time increases auditory frequency sensitivity to audiovisual stimuli. In the longer term, such modulation could serve to instruct auditory adaptive plasticity to maintain accurate auditory detection and perceptual behavior.
13
The eardrums move when the eyes move: A multisensory effect on the mechanics of hearing. Proc Natl Acad Sci U S A 2018; 115:E1309-E1318. PMID: 29363603; PMCID: PMC5819440; DOI: 10.1073/pnas.1717948115
Abstract
The peripheral hearing system contains several motor mechanisms that allow the brain to modify the auditory transduction process. Movements or tensioning of either the middle ear muscles or the outer hair cells modifies eardrum motion, producing sounds that can be detected by a microphone placed in the ear canal (e.g., as otoacoustic emissions). Here, we report a form of eardrum motion produced by the brain via these systems: oscillations synchronized with and covarying with the direction and amplitude of saccades. These observations suggest that a vision-related process modulates the first stage of hearing. In particular, these eye movement-related eardrum oscillations may help the brain connect sights and sounds despite changes in the spatial relationship between the eyes and the ears. Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in three subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement. The eardrum motion was oscillatory and began as early as 10 ms before saccade onset in humans or with saccade onset in monkeys. These eardrum movements, which we dub eye movement-related eardrum oscillations (EMREOs), occurred in the absence of a sound stimulus. The amplitude and phase of the EMREOs depended on the direction and horizontal amplitude of the saccade. They lasted throughout the saccade and well into subsequent periods of steady fixation. We discuss the possibility that the mechanisms underlying EMREOs create eye movement-related binaural cues that may aid the brain in evaluating the relationship between visual and auditory stimulus locations as the eyes move.
14
Pages DS, Ross DA, Puñal VM, Agashe S, Dweck I, Mueller J, Grill WM, Wilson BS, Groh JM. Effects of Electrical Stimulation in the Inferior Colliculus on Frequency Discrimination by Rhesus Monkeys and Implications for the Auditory Midbrain Implant. J Neurosci 2016; 36:5071-83. [PMID: 27147659 PMCID: PMC4854969 DOI: 10.1523/jneurosci.3540-15.2016]
Abstract
Understanding the relationship between the auditory selectivity of neurons and their contribution to perception is critical to the design of effective auditory brain prosthetics. These prosthetics seek to mimic natural activity patterns to achieve desired perceptual outcomes. We measured the contribution of inferior colliculus (IC) sites to perception using combined recording and electrical stimulation. Monkeys performed a frequency-based discrimination task, reporting whether a probe sound was higher or lower in frequency than a reference sound. Stimulation pulses were paired with the probe sound on 50% of trials (0.5-80 μA, 100-300 Hz, n = 172 IC locations in 3 rhesus monkeys). Electrical stimulation tended to bias the animals' judgments in a fashion that was coarsely but significantly correlated with the best frequency of the stimulation site compared with the reference frequency used in the task. Although there was considerable variability in the effects of stimulation (including impairments in performance and shifts in performance away from the direction predicted based on the site's response properties), the results indicate that stimulation of the IC can evoke percepts correlated with the frequency-tuning properties of the IC. Consistent with the implications of recent human studies, the main avenue for improvement for the auditory midbrain implant suggested by our findings is to increase the number and spatial extent of electrodes, to increase the size of the region that can be electrically activated, and to provide a greater range of evoked percepts.

SIGNIFICANCE STATEMENT Patients with hearing loss stemming from causes that interrupt the auditory pathway after the cochlea need a brain prosthetic to restore hearing. Recently, prosthetic stimulation in the human inferior colliculus (IC) was evaluated in a clinical trial. Thus far, speech understanding was limited for the subjects and this limitation is thought to be partly due to challenges in harnessing the sound frequency representation in the IC. Here, we tested the effects of IC stimulation in monkeys trained to report the sound frequencies they heard. Our results indicate that the IC can be used to introduce a range of frequency percepts and suggest that placement of a greater number of electrode contacts may improve the effectiveness of such implants.
Affiliation(s)
- Daniel S Pages
- Department of Psychology and Neuroscience, Center for Cognitive Neuroscience,
- Jerel Mueller
- Department of Biomedical Engineering, and School of Biomedical Engineering and Sciences, Virginia Polytechnic Institute and State University, Blacksburg, Virginia 24061
- Blake S Wilson
- Schools of Medicine and Engineering, Duke University, Durham, North Carolina 27708, and
- Jennifer M Groh
- Department of Psychology and Neuroscience, Center for Cognitive Neuroscience, Department of Neurobiology,
15
Sound localization in a changing world. Curr Opin Neurobiol 2015; 35:35-43. [PMID: 26126152 DOI: 10.1016/j.conb.2015.06.005]
Abstract
In natural environments, neural systems must be continuously updated to reflect changes in sensory inputs and behavioral goals. Recent studies of sound localization have shown that adaptation and learning involve multiple mechanisms that operate at different timescales and stages of processing, with other sensory and motor-related inputs playing a key role. We are only just beginning to understand, however, how these processes interact with one another to produce adaptive changes at the level of neuronal populations and behavior. Because there is no explicit map of auditory space in the cortex, studies of sound localization may also provide much broader insight into the plasticity of complex neural representations that are not topographically organized.
16
Williamson RS, Hancock KE, Shinn-Cunningham BG, Polley DB. Locomotion and Task Demands Differentially Modulate Thalamic Audiovisual Processing during Active Search. Curr Biol 2015; 25:1885-91. [PMID: 26119749 DOI: 10.1016/j.cub.2015.05.045]
Abstract
Active search is a ubiquitous goal-driven behavior wherein organisms purposefully investigate the sensory environment to locate a target object. During active search, brain circuits analyze a stream of sensory information from the external environment, adjusting for internal signals related to self-generated movement or "top-down" weighting of anticipated target and distractor properties. Sensory responses in the cortex can be modulated by internal state, though the extent and form of modulation arising in the cortex de novo versus an inheritance from subcortical stations is not clear. We addressed this question by simultaneously recording from auditory and visual regions of the thalamus (MG and LG, respectively) while mice used dynamic auditory or visual feedback to search for a hidden target within an annular track. Locomotion was associated with strongly suppressed responses and reduced decoding accuracy in MG but a subtle increase in LG spiking. Because stimuli in one modality provided critical information about target location while the other served as a distractor, we could also estimate the importance of task relevance in both thalamic subdivisions. In contrast to the effects of locomotion, we found that LG responses were reduced overall yet decoded stimuli more accurately when vision was behaviorally relevant, whereas task relevance had little effect on MG responses. This double dissociation between the influences of task relevance and movement in MG and LG highlights a role for extrasensory modulation in the thalamus but also suggests key differences in the organization of modulatory circuitry between the auditory and visual pathways.
Affiliation(s)
- Ross S Williamson
- Massachusetts Eye and Ear Infirmary, Eaton-Peabody Laboratories, 243 Charles Street, Boston, MA 02114, USA; Center for Computational Neuroscience and Neural Technology, Boston University, 677 Beacon Street, Boston, MA 02215, USA.
- Kenneth E Hancock
- Massachusetts Eye and Ear Infirmary, Eaton-Peabody Laboratories, 243 Charles Street, Boston, MA 02114, USA; Department of Otology and Laryngology, Harvard Medical School, 25 Shattuck Street, Boston, MA 02115, USA
- Barbara G Shinn-Cunningham
- Center for Computational Neuroscience and Neural Technology, Boston University, 677 Beacon Street, Boston, MA 02215, USA
- Daniel B Polley
- Massachusetts Eye and Ear Infirmary, Eaton-Peabody Laboratories, 243 Charles Street, Boston, MA 02114, USA; Center for Computational Neuroscience and Neural Technology, Boston University, 677 Beacon Street, Boston, MA 02215, USA; Department of Otology and Laryngology, Harvard Medical School, 25 Shattuck Street, Boston, MA 02115, USA
17
Maddox RK, Pospisil DA, Stecker GC, Lee AKC. Directing eye gaze enhances auditory spatial cue discrimination. Curr Biol 2014; 24:748-52. [PMID: 24631242 DOI: 10.1016/j.cub.2014.02.021]
Abstract
The present study demonstrates, for the first time, a specific enhancement of auditory spatial cue discrimination due to eye gaze. Whereas the region of sharpest visual acuity, called the fovea, can be directed at will by moving one's eyes, auditory spatial information is derived primarily from head-related acoustic cues. Past auditory studies have found better discrimination in front of the head [1-3] but have not manipulated subjects' gaze, thus overlooking potential oculomotor influences. Electrophysiological studies have shown that the inferior colliculus, a critical auditory midbrain nucleus, shows visual and oculomotor responses [4-6] and modulations of auditory activity [7-9], and that auditory neurons in the superior colliculus show shifting receptive fields [10-13]. How the auditory system leverages this crossmodal information at the behavioral level remains unknown. Here we directed subjects' gaze (with an eccentric dot) or auditory attention (with lateralized noise) while they performed an auditory spatial cue discrimination task. We found that directing gaze toward a sound significantly enhances discrimination of both interaural level and time differences, whereas directing auditory spatial attention does not. These results show that oculomotor information variably enhances auditory spatial resolution even when the head remains stationary, revealing a distinct behavioral benefit possibly arising from auditory-oculomotor interactions at an earlier level of processing than previously demonstrated.
Affiliation(s)
- Ross K Maddox
- Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA
- Dean A Pospisil
- Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA
- G Christopher Stecker
- Department of Speech and Hearing Sciences, University of Washington, 1417 NE 42nd Street, Eagleson Hall, Box 354875, Seattle, WA 98105, USA; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, 1215 21st Avenue South, Room 8310, Nashville, TN 37232, USA
- Adrian K C Lee
- Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA; Department of Speech and Hearing Sciences, University of Washington, 1417 NE 42nd Street, Eagleson Hall, Box 354875, Seattle, WA 98105, USA.
|
18
|
Gruters KG, Groh JM. Sounds and beyond: multisensory and other non-auditory signals in the inferior colliculus. Front Neural Circuits 2012; 6:96. [PMID: 23248584 PMCID: PMC3518932 DOI: 10.3389/fncir.2012.00096]
Abstract
The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory sources. Neurophysiological studies corroborate that non-auditory stimuli can modulate auditory processing in the IC and even elicit responses independent of coincident auditory stimulation. In this article, we review anatomical and physiological evidence for multisensory and other non-auditory processing in the IC. Specifically, the contributions of signals related to vision, eye movements and position, somatosensation, and behavioral context to neural activity in the IC will be described. These signals are potentially important for localizing sound sources, attending to salient stimuli, distinguishing environmental from self-generated sounds, and perceiving and generating communication sounds. They suggest that the IC should be thought of as a node in a highly interconnected sensory, motor, and cognitive network dedicated to synthesizing a higher-order auditory percept rather than simply reporting patterns of air pressure detected by the cochlea. We highlight some of the potential pitfalls that can arise from experimental manipulations that may disrupt the normal function of this network, such as the use of anesthesia or the severing of connections from cortical structures that project to the IC. Finally, we note that the presence of these signals in the IC has implications for our understanding not just of the IC but also of the multitude of other regions within and beyond the auditory system that are dependent on signals that pass through the IC. Whatever the IC “hears” would seem to be passed both “upward” to thalamus and thence to auditory cortex and beyond, as well as “downward” via centrifugal connections to earlier areas of the auditory pathway such as the cochlear nucleus.
Affiliation(s)
- Kurtis G Gruters
- Department of Psychology and Neuroscience, Duke University, Durham, NC, USA
19
Bulkin DA, Groh JM. Distribution of visual and saccade related information in the monkey inferior colliculus. Front Neural Circuits 2012; 6:61. [PMID: 22973196 PMCID: PMC3433683 DOI: 10.3389/fncir.2012.00061]
Abstract
The inferior colliculus (IC) is an essential stop early in the ascending auditory pathway. Though normally thought of as a predominantly auditory structure, recent work has uncovered a variety of non-auditory influences on firing rate in the IC. Here, we map the location within the IC of neurons that respond to the onset of a fixation-guiding visual stimulus. Visual/visuomotor associated activity was found throughout the IC (overall, 84 of 199 sites tested, or 42%), but with a far reduced prevalence and strength along recording penetrations passing through the tonotopically organized region of the IC, putatively the central nucleus (11 of 42 sites tested, or 26%). These results suggest that visual information has only a weak effect on early auditory processing in core regions, but more strongly targets the modulatory shell regions of the IC.
Collapse
Affiliation(s)
- David A Bulkin
- Department of Psychology, Cornell University Ithaca, NY, USA
| | | |
Collapse
|