1. Paromov D, Moïn-Darbari K, Cedras AM, Maheu M, Bacon BA, Champoux F. Body representation drives auditory spatial perception. iScience 2024; 27:109196. [PMID: 38433911] [PMCID: PMC10906536] [DOI: 10.1016/j.isci.2024.109196]
Abstract
In contrast to the large body of findings confirming the influence of auditory cues on body perception and movement-related activity, the influence of body representation on spatial hearing remains essentially unexplored. Here, we use a disorientation task to assess whether a change in the body's orientation in space could lead to an illusory shift in the localization of a sound source. While most of the participants were initially able to locate the sound source with great precision, they all made substantial errors in judging the position of the same sound source following the body orientation-altering task. These results demonstrate that a change in body orientation can have a significant impact on the auditory processes underlying sound localization. The illusory errors not only confirm the strong connection between the auditory system and the representation of the body in space but also raise questions about the importance of hearing in determining spatial position.
Affiliation(s)
- Daniel Paromov: Université de Montréal, Montréal, QC, Canada; Centre de recherche de l’Institut Universitaire de Gériatrie de Montréal, Montréal, QC, Canada
- Karina Moïn-Darbari: Université de Montréal, Montréal, QC, Canada; Centre de recherche de l’Institut Universitaire de Gériatrie de Montréal, Montréal, QC, Canada
- Benoit-Antoine Bacon: Department of Psychology, The University of British Columbia, Vancouver, BC, Canada
- François Champoux: Université de Montréal, Montréal, QC, Canada; Centre de recherche de l’Institut Universitaire de Gériatrie de Montréal, Montréal, QC, Canada
2. Alberts BBGT, Selen LPJ, Verhagen WIM, Pennings RJE, Medendorp WP. Bayesian quantification of sensory reweighting in a familial bilateral vestibular disorder (DFNA9). J Neurophysiol 2017; 119:1209-1221. [PMID: 29357473] [DOI: 10.1152/jn.00082.2017]
Abstract
DFNA9 is a rare progressive autosomal dominantly inherited vestibulo-cochlear disorder, resulting in a homogeneous group of patients with hearing impairment and bilateral vestibular function loss. These patients suffer from a deteriorated sense of spatial orientation, leading to balance problems in darkness, especially on irregular surfaces. Both behavioral and functional imaging studies suggest that the remaining sensory cues could compensate for the loss of vestibular information. A thorough model-based quantification of this reweighting in individual patients is, however, missing. Here we psychometrically examined the individual patient's sensory reweighting of these cues after complete vestibular loss. We asked a group of DFNA9 patients and healthy control subjects to judge the orientation (clockwise or counterclockwise relative to gravity) of a rod presented within an oriented square frame (rod-in-frame task) in three different head-on-body tilt conditions. Our results show a cyclical frame-induced bias in perceived gravity direction across a 90° range of frame orientations. The magnitude of this bias was significantly increased in the patients compared with the healthy control subjects. Response variability, which increased with head-on-body tilt, was also larger for the patients. Reverse engineering of the underlying signal properties, using Bayesian inference principles, suggests a reweighting of sensory signals, with an increase in visual weight of 20-40% in the patients. Our approach of combining psychophysics and Bayesian reverse engineering is the first to quantify the weights associated with the different sensory modalities at an individual patient level, which could make it possible to develop personal rehabilitation programs based on the patient's sensory weight distribution. 
NEW & NOTEWORTHY It has been suggested that patients with vestibular deficits can compensate for this loss by increasing reliance on other sensory cues, although an actual quantification of this reweighting is lacking. We combine experimental psychophysics with a reverse engineering approach based on Bayesian inference principles to quantify sensory reweighting in individual vestibular patients. We discuss the suitability of this approach for developing personal rehabilitation programs based on the patient's sensory weight distribution.
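The reliability-based reweighting described in this abstract can be illustrated in a few lines. The sketch below is not the paper's fitted model; it assumes Gaussian cue noise, and the noise standard deviations are hypothetical numbers chosen only to show the direction of the effect:

```python
import numpy as np

def cue_weights(sigma_vis, sigma_vest):
    """Optimal (maximum-likelihood) weights are proportional to inverse variance."""
    r_vis, r_vest = 1.0 / sigma_vis**2, 1.0 / sigma_vest**2
    w_vis = r_vis / (r_vis + r_vest)
    return w_vis, 1.0 - w_vis

def fused_estimate(vis_deg, vest_deg, sigma_vis, sigma_vest):
    """Fuse two noisy orientation estimates (degrees) by their reliabilities."""
    w_vis, w_vest = cue_weights(sigma_vis, sigma_vest)
    return w_vis * vis_deg + w_vest * vest_deg

# A vestibular-loss patient (noisier vestibular cue) weights vision more:
print(cue_weights(sigma_vis=2.0, sigma_vest=4.0))   # control: visual weight 0.8
print(cue_weights(sigma_vis=2.0, sigma_vest=12.0))  # patient: visual weight ~0.97
```

The qualitative point matches the abstract: as vestibular noise grows, the visual weight rises, which is the reweighting the authors quantify per patient.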
Affiliation(s)
- Bart B G T Alberts: Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, The Netherlands
- Luc P J Selen: Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, The Netherlands
- Wim I M Verhagen: Neurology, Canisius Wilhelmina Hospital, Nijmegen, The Netherlands
- Ronald J E Pennings: Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, The Netherlands; Department of Otorhinolaryngology, Radboud University Medical Centre, Nijmegen, The Netherlands
- W Pieter Medendorp: Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, The Netherlands
3. Ueberfuhr MA, Braun A, Wiegrebe L, Grothe B, Drexl M. Modulation of auditory percepts by transcutaneous electrical stimulation. Hear Res 2017; 350:235-243. [DOI: 10.1016/j.heares.2017.03.008]
4. Van Opstal AJ, Vliegen J, Van Esch T. Reconstructing spectral cues for sound localization from responses to rippled noise stimuli. PLoS One 2017; 12:e0174185. [PMID: 28333967] [PMCID: PMC5363849] [DOI: 10.1371/journal.pone.0174185]
Abstract
Human sound localization in the mid-sagittal plane (elevation) relies on an analysis of the idiosyncratic spectral-shape cues provided by the head and pinnae. However, because the actual free-field stimulus spectrum is a priori unknown to the auditory system, the problem of extracting the elevation angle from the sensory spectrum is ill-posed. Here we test different spectral localization models by eliciting head movements toward broad-band noise stimuli with randomly shaped, rippled amplitude spectra emanating from a speaker at a fixed location, while varying the ripple bandwidth between 1.5 and 5.0 cycles/octave. Six listeners participated in the experiments. From the distributions of localization responses toward the individual stimuli, we estimated the listeners' spectral-shape cues underlying their elevation percepts by applying maximum-likelihood estimation. The reconstructed spectral cues proved invariant to the considerable variation in ripple bandwidth, and for each listener they bore a remarkable resemblance to the idiosyncratic head-related transfer functions (HRTFs). These results are not in line with models that rely on the detection of a single peak or notch in the amplitude spectrum, nor with a local analysis of first- and second-order spectral derivatives. Instead, our data support a model in which the auditory system performs a cross-correlation between the sensory input at the eardrum and auditory nerve, and stored representations of HRTF spectral shapes, to extract the perceived elevation angle.
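The template cross-correlation model this abstract supports can be sketched as follows. Everything here is illustrative: the "HRTF templates" are synthetic random spectra standing in for measured per-elevation transfer functions, and the elevation grid is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(0)
n_bands = 64                        # frequency bands (e.g., on a log axis)
elevations = np.arange(-30, 61, 5)  # candidate elevations in degrees

# Hypothetical HRTF templates: one stored spectral shape per elevation.
templates = rng.standard_normal((elevations.size, n_bands))

def perceived_elevation(sensory_spectrum, templates, elevations):
    """Pick the elevation whose stored template correlates best with the input."""
    # Normalised cross-correlation of the sensory spectrum with each template.
    x = sensory_spectrum - sensory_spectrum.mean()
    t = templates - templates.mean(axis=1, keepdims=True)
    corr = (t @ x) / (np.linalg.norm(t, axis=1) * np.linalg.norm(x))
    return elevations[np.argmax(corr)]

# A noisy rendition of the 30-degree template should still map to 30 degrees,
# illustrating the robustness to spectral perturbation the abstract reports.
true_idx = list(elevations).index(30)
noisy = templates[true_idx] + 0.3 * rng.standard_normal(n_bands)
print(perceived_elevation(noisy, templates, elevations))
```

The key design point is that the decision uses the whole spectral profile (correlation against stored shapes) rather than a single peak or notch, which is exactly the distinction the authors draw between model classes.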
Affiliation(s)
- A. John Van Opstal: Department of Biophysics, Donders Center for Neuroscience, Radboud University, Heyendaalseweg 135, 6525 AJ Nijmegen, the Netherlands
- Joyce Vliegen: Department of Biophysics, Donders Center for Neuroscience, Radboud University, Heyendaalseweg 135, 6525 AJ Nijmegen, the Netherlands
- Thamar Van Esch: Department of Biophysics, Donders Center for Neuroscience, Radboud University, Heyendaalseweg 135, 6525 AJ Nijmegen, the Netherlands
5. Byl JA, Miersch L, Wieskotten S, Dehnhardt G. Underwater sound localization of pure tones in the median plane by harbor seals (Phoca vitulina). J Acoust Soc Am 2016; 140:4490. [PMID: 28040008] [DOI: 10.1121/1.4972531]
Abstract
In an underwater environment the physical characteristics of sound propagation differ considerably from those in air. For this reason, sound localization under water is difficult, especially in the median plane. The present study investigated whether harbor seals (Phoca vitulina) can determine whether a tonal signal comes from above or below in the underwater environment. Minimum audible angles (MAAs), i.e., the angular range within which the animals could localize a pure-tone stimulus in the vertical plane, were obtained for frequencies from 0.35 up to 16 kHz. Testing was conducted with four male harbor seals in a semicircular area 6 m in diameter at a depth of about 2.5 m, using a two-alternative forced-choice method. The results show that harbor seals can localize a pure tone in the median plane under water, performing well for low-frequency stimuli between 350 Hz and 2 kHz, with MAAs ranging from below 2.5° up to about 25°. For higher frequencies the animals showed strong individual differences.
Affiliation(s)
- Jenny Ann Byl: Sensory and Cognitive Ecology, University of Rostock, Albert-Einstein Strasse 3, 18059 Rostock, Germany
- Lars Miersch: Sensory and Cognitive Ecology, University of Rostock, Albert-Einstein Strasse 3, 18059 Rostock, Germany
- Sven Wieskotten: Sensory and Cognitive Ecology, University of Rostock, Albert-Einstein Strasse 3, 18059 Rostock, Germany
- Guido Dehnhardt: Sensory and Cognitive Ecology, University of Rostock, Albert-Einstein Strasse 3, 18059 Rostock, Germany
6. Van Barneveld DCPBM, Van Wanrooij MM. The influence of static eye and head position on the ventriloquist effect. Eur J Neurosci 2013; 37:1501-10. [PMID: 23463919] [DOI: 10.1111/ejn.12176]
Abstract
Orienting responses to audiovisual events have shorter reaction times and better accuracy and precision when images and sounds in the environment are aligned in space and time. How the brain constructs an integrated audiovisual percept is a computational puzzle because the auditory and visual senses are represented in different reference frames: the retina encodes visual locations with respect to the eyes, whereas the sound localisation cues are referenced to the head. In the well-known ventriloquist effect, the auditory spatial percept of the ventriloquist's voice is attracted toward the synchronous visual image of the dummy, but does this visual bias on sound localisation operate in a common reference frame by correctly taking into account eye and head position? Here we studied this question by independently varying initial eye and head orientations, and the amount of audiovisual spatial mismatch. Human subjects pointed head and/or gaze to auditory targets in elevation, and were instructed to ignore co-occurring visual distracters. Results demonstrate that different initial head and eye orientations are accurately and appropriately incorporated into an audiovisual response. Effectively, sounds and images are perceptually fused according to their physical locations in space, independent of an observer's point of view. Implications for neurophysiological findings and modelling efforts that aim to reconcile sensory and motor signals for goal-directed behaviour are discussed.
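The reference-frame bookkeeping this abstract tests can be illustrated with a toy one-dimensional sketch (elevation only). All angles below are hypothetical values, not the study's data; the point is only that fusing vision and audition in a common world frame requires adding eye-in-head and head-in-world orientations back in:

```python
# Vision is encoded relative to the eye; sound relative to the head.
# To compare them in world coordinates, each must be transformed by the
# appropriate chain of orientations (all angles in degrees).

def visual_in_world(target_re_eye, eye_in_head, head_in_world):
    """Eye-centred visual location -> world coordinates."""
    return target_re_eye + eye_in_head + head_in_world

def auditory_in_world(target_re_head, head_in_world):
    """Head-centred auditory location -> world coordinates."""
    return target_re_head + head_in_world

# Example: a sound 10 deg above the head with the head pitched 5 deg up, and
# a visual distracter 12 deg above the eye with the eye 3 deg up in the head:
print(auditory_in_world(10, 5))   # 15 deg in world coordinates
print(visual_in_world(12, 3, 5))  # 20 deg in world coordinates
```

The study's finding that fusion respects physical locations "independent of an observer's point of view" corresponds to the brain applying transforms like these correctly for any initial eye and head orientation.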
Affiliation(s)
- Denise C P B M Van Barneveld: Department of Biophysics, Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, P.O. Box 9010, 6500 GL, Nijmegen, The Netherlands
7. Interactions between the vestibular nucleus and the dorsal cochlear nucleus: implications for tinnitus. Hear Res 2012; 292:80-2. [PMID: 22960359] [DOI: 10.1016/j.heares.2012.08.006]
Abstract
The peripheral auditory and vestibular systems are recognised to be closely related anatomically and physiologically; however, less well understood is the interaction of these two sensory systems in the brain. A number of previous studies in different species have reported that the dorsal and ventral cochlear nuclei receive direct projections from the primary vestibular nerve, and one previous study reported projections from the vestibular nucleus to the dorsal cochlear nucleus (DCN) in rabbit. Recently, Barker et al. (2012 PLoS One. 7(5): e35955) have reported new evidence that the lateral vestibular nucleus (LVN) projects to the DCN in rat and that these synapses are mediated by glutamate acting on AMPA and NMDA receptors. These recent findings, in addition to the earlier ones, suggest that the auditory and vestibular systems may be intimately connected centrally as well as peripherally, which may have important implications for disorders such as tinnitus.
8. Meyer GF, Noppeney U. Multisensory integration: from fundamental principles to translational research. Exp Brain Res 2011; 213:163-6. [PMID: 21800253] [DOI: 10.1007/s00221-011-2803-z]
9. Van Barneveld DCPBM, Binkhorst F, Van Opstal AJ. Absence of compensation for vestibular-evoked passive head rotations in human sound localization. Eur J Neurosci 2011; 34:1149-60. [PMID: 21895805] [DOI: 10.1111/j.1460-9568.2011.07844.x]
Abstract
A world-fixed sound presented to a moving head produces changing sound-localization cues, from which the audiomotor system could infer sound movement relative to the head. When appropriately combined with self-motion signals, sound localization remains spatially accurate. Indeed, free-field orienting responses fully incorporate intervening eye-head movements under open-loop localization conditions. Here we investigate the default strategy of the audiomotor system when localizing sounds in the absence of efferent and proprioceptive head-movement signals. Head- and body-restrained listeners made saccades in total darkness toward brief (3, 10 or 100 ms) broadband noise bursts while being rotated sinusoidally (f = 1/9 Hz, peak velocity 112 deg/s) around the vertical body axis. As the loudspeakers were attached to the chair, the 100 ms sounds might be perceived as rotating along with the chair, and localized in head-centred coordinates. During the 3 and 10 ms stimuli, however, the amount of chair rotation remained well below the minimum audible movement angle. These brief sounds would therefore be perceived as stationary in space and, as in open-loop gaze orienting, expected to be localized in world-centred coordinates. Analysis of the saccades shows, however, that all stimuli were accurately localized on the basis of the imposed acoustic cues, but remained in head-centred coordinates. These results suggest that, in the absence of motor planning, the audiomotor system keeps sounds in head-centred coordinates when unsure about sound motion relative to the head. To that end, it ignores vestibular canal signals of passively induced head rotation, but incorporates intervening eye displacements from vestibular nystagmus during the saccade reaction time.
Affiliation(s)
- Denise C P B M Van Barneveld: Radboud University Nijmegen, Donders Institute for Brain, Cognition and Behaviour, Department of Biophysics, Geert Grooteplein 21, Nijmegen, The Netherlands