1
Malone AK, Hungerford ME, Smith SB, Chang NYN, Uchanski RM, Oh YH, Lewis RF, Hullar TE. Age-Related Changes in Temporal Binding Involving Auditory and Vestibular Inputs. Semin Hear 2024; 45:110-122. [PMID: 38370520] [PMCID: PMC10872654] [DOI: 10.1055/s-0043-1770137]
Abstract
Maintaining balance involves the combination of sensory signals from the visual, vestibular, proprioceptive, and auditory systems. However, physical and biological constraints ensure that these signals are perceived slightly asynchronously. The brain only recognizes them as simultaneous when they occur within a period of time called the temporal binding window (TBW). Aging can prolong the TBW, leading to temporal uncertainty during multisensory integration. This effect might contribute to imbalance in the elderly but has not been examined with respect to vestibular inputs. Here, we compared the vestibular-related TBW in 13 younger and 12 older subjects undergoing 0.5 Hz sinusoidal rotations about the earth-vertical axis. An alternating dichotic auditory stimulus was presented at the same frequency but with the phase varied to determine the temporal range over which the two stimuli were perceived as simultaneous at least 75% of the time, defined as the TBW. The mean TBW among younger subjects was 286 ms (SEM ± 56 ms) and among older subjects was 560 ms (SEM ± 52 ms). TBW was related to vestibular sensitivity among younger but not older subjects, suggesting that a prolonged TBW could be a mechanism for imbalance in the elderly independent of changes in peripheral vestibular function.
Affiliation(s)
- Michelle E. Hungerford
  - VA RR&D National Center for Rehabilitative Auditory Research, VA Portland Health Care System, Portland, Oregon
  - Department of Otolaryngology—Head and Neck Surgery, Oregon Health and Science University, Portland, Oregon
- Spencer B. Smith
  - Department of Speech, Language, and Hearing Sciences, University of Texas, Austin, Texas
- Nai-Yuan N. Chang
  - Department of Oral and Maxillofacial Surgery, Oregon Health and Science University, Portland, Oregon
- Rosalie M. Uchanski
  - Department of Otolaryngology—Head and Neck Surgery, Washington University in St. Louis, St. Louis, Missouri
- Yong-Hee Oh
  - University of Louisville, Louisville, Kentucky
- Richard F. Lewis
  - Departments of Otolaryngology and Neurology, Harvard Medical School, Boston, Massachusetts
- Timothy E. Hullar
  - VA RR&D National Center for Rehabilitative Auditory Research, VA Portland Health Care System, Portland, Oregon
  - Department of Otolaryngology—Head and Neck Surgery, Oregon Health and Science University, Portland, Oregon
2
Zanchi S, Cuturi LF, Sandini G, Gori M, Ferrè ER. Vestibular contribution to spatial encoding. Eur J Neurosci 2023; 58:4034-4042. [PMID: 37688501] [DOI: 10.1111/ejn.16146]
Abstract
Determining the spatial relation between objects and our location in the surroundings is essential for survival. Vestibular inputs provide key information about the position and movement of our head in the three-dimensional space, contributing to spatial navigation. Yet, their role in encoding spatial localisation of environmental targets remains to be fully understood. We probed the accuracy and precision of healthy participants' representations of environmental space by measuring their ability to encode the spatial location of visual targets (Experiment 1). Participants were asked to detect a visual light and then walk towards it. Vestibular signalling was artificially disrupted using stochastic galvanic vestibular stimulation (sGVS) applied selectively during encoding targets' location. sGVS impaired the accuracy and precision of locating the environmental visual targets. Importantly, this effect was specific to the visual modality. The location of acoustic targets was not influenced by vestibular alterations (Experiment 2). Our findings indicate that the vestibular system plays a role in localising visual targets in the surrounding environment, suggesting a crucial functional interaction between vestibular and visual signals for the encoding of the spatial relationship between our body position and the surrounding objects.
Affiliation(s)
- Silvia Zanchi
  - Unit of Visually Impaired People, Italian Institute of Technology, Genoa, Italy
  - Department of Psychological Sciences, Birkbeck, University of London, London, UK
- Luigi F Cuturi
  - Unit of Visually Impaired People, Italian Institute of Technology, Genoa, Italy
  - Department of Cognitive Sciences, Psychology, Education and Cultural Studies, University of Messina, Messina, Italy
- Giulio Sandini
  - Robotics Brain and Cognitive Sciences, Italian Institute of Technology, Genoa, Italy
- Monica Gori
  - Unit of Visually Impaired People, Italian Institute of Technology, Genoa, Italy
- Elisa R Ferrè
  - Department of Psychological Sciences, Birkbeck, University of London, London, UK
3
Mowery TM, Wackym PA, Nacipucha J, Dangcil E, Stadler RD, Tucker A, Carayannopoulos NL, Beshy MA, Hong SS, Yao JD. Superior semicircular canal dehiscence and subsequent closure induces reversible impaired decision-making. Front Neurol 2023; 14:1259030. [PMID: 37905188] [PMCID: PMC10613502] [DOI: 10.3389/fneur.2023.1259030]
Abstract
Background Vestibular loss and dysfunction have been associated with cognitive deficits, including decreased spatial navigation, spatial memory, visuospatial ability, attention, executive function, and processing speed, among others. Superior semicircular canal dehiscence (SSCD) is a vestibular-cochlear disorder in humans in which a pathological third mobile window of the otic capsule creates changes to the flow of sound pressure energy through the perilymph/endolymph. The primary symptoms include sound-induced dizziness/vertigo, inner ear conductive hearing loss, autophony, headaches, and visual problems; however, individuals also experience measurable deficits in basic decision-making, short-term memory, concentration, spatial cognition, and depression. These findings suggest that central mechanisms of impairment are associated with vestibular disorders; we therefore tested this hypothesis directly using auditory and visual decision-making tasks of varying difficulty in our model of SSCD. Methods Adult Mongolian gerbils (n = 33) were trained on one of four versions of a Go-NoGo stimulus presentation rate discrimination task that included standard ("easy") or more difficult ("hard") auditory and visual stimuli. After 10 days of training, preoperative ABR and c+VEMP testing was followed by surgical fenestration of the left superior semicircular canal. Animals with persistent circling or head tilt were excluded to minimize effects from acute vestibular injury. Testing recommenced at postoperative day 5 and continued through postoperative day 15, at which point final ABR and c+VEMP testing was carried out. Results Behavioral data (d-primes) were compared between preoperative performance (training days 8-10) and postoperative days 6-8 and 13-15. Behavioral performance was measured during the peak of SSCD-induced ABR and c+VEMP impairment and during the return toward baseline as the dehiscence began to resurface by osteoneogenesis. There were significant differences in behavioral performance (d-prime) and its behavioral components (hits, misses, false alarms, and correct rejections). These changes were highly correlated with persistent deficits in c+VEMPs at the end of training (postoperative day 15). The controls demonstrated additional learning post procedure that was absent in the SSCD group. Conclusion These results suggest that aberrant asymmetric vestibular output results in decision-making impairments in these discrimination tasks and could be associated with the other cognitive impairments resulting from vestibular dysfunction.
Affiliation(s)
- Todd M. Mowery
  - Department of Otolaryngology – Head and Neck Surgery, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, United States
  - Rutgers Brain Health Institute, New Brunswick, NJ, United States
- P. Ashley Wackym
  - Department of Otolaryngology – Head and Neck Surgery, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, United States
  - Rutgers Brain Health Institute, New Brunswick, NJ, United States
- Jacqueline Nacipucha
  - Department of Otolaryngology – Head and Neck Surgery, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, United States
- Evelynne Dangcil
  - Department of Otolaryngology – Head and Neck Surgery, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, United States
- Ryan D. Stadler
  - Department of Otolaryngology – Head and Neck Surgery, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, United States
- Aaron Tucker
  - Department of Otolaryngology – Head and Neck Surgery, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, United States
- Nicolas L. Carayannopoulos
  - Department of Otolaryngology – Head and Neck Surgery, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, United States
- Mina A. Beshy
  - Department of Otolaryngology – Head and Neck Surgery, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, United States
- Sean S. Hong
  - Department of Otolaryngology – Head and Neck Surgery, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, United States
- Justin D. Yao
  - Department of Otolaryngology – Head and Neck Surgery, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, United States
  - Rutgers Brain Health Institute, New Brunswick, NJ, United States
4
Peetermans O, Dobbels B, Mertens G, Moyaert J, van de Berg R, Vanderveken O, Van de Heyning P, Pérez Fornos A, Guinand N, Lammers MJW, Van Rompaey V. Sound localization in patients with bilateral vestibulopathy. Eur Arch Otorhinolaryngol 2022; 279:5601-5613. [PMID: 35536383] [DOI: 10.1007/s00405-022-07414-7]
Abstract
PURPOSE The goal of this study was to evaluate whether bilaterally (partially) absent vestibular function would have a negative impact on sound localization skills during static testing. Therefore, this study compared the horizontal static sound localization skills of normal-hearing patients with bilateral vestibulopathy (BV) with those of healthy controls. METHODS Thirteen normal-hearing patients with BV and thirteen age-matched healthy controls were included. Sound localization skills were tested using seven loudspeakers in a frontal semicircle, ranging from -90° to +90°. Sound location accuracy was analyzed using the root-mean-square error (RMSE) and the mean absolute error (MAE). To evaluate the severity of the BV symptoms, the following questionnaires were used: Dizziness Handicap Inventory (DHI), Oscillopsia Severity Questionnaire (OSQ), 12-item Spatial, Speech, and Qualities Questionnaire (SSQ12), and Health Utilities Index Mark 3 (HUI3). RESULTS The RMSE and MAE were significantly larger (worse) in the BV group than in the healthy control group, with respective median RMSEs of 4.6° and 0°, and median MAEs of 0.7° and 0°. Of the subjective measures of speech perception, spatial hearing, and quality of life, only the DHI (positive correlation) and the HUI total score (negative correlation) showed moderate correlations with localization scores. CONCLUSION Static sound localization skills of patients with BV were only mildly worse than those of healthy controls. However, this difference was very small and therefore most likely due to impaired cognitive function. The vestibular system does not seem to have a modulating role in sound localization during static conditions, and its impact is negligible in contrast to the impact of hearing impairment. Furthermore, the subjective reporting of speech perception, spatial hearing, and quality of life was not strongly correlated with localization scores.
Affiliation(s)
- Olivier Peetermans
  - Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
  - Department of Otorhinolaryngology and Head and Neck Surgery, Antwerp University Hospital, Antwerp, Belgium
  - Department of Otorhinolaryngology and Head and Neck Surgery, UZA, Drie Eikenstraat 655, 2650, Edegem, Belgium
- Bieke Dobbels
  - Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
  - Department of Otorhinolaryngology and Head and Neck Surgery, Antwerp University Hospital, Antwerp, Belgium
- Griet Mertens
  - Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
  - Department of Otorhinolaryngology and Head and Neck Surgery, Antwerp University Hospital, Antwerp, Belgium
- Julie Moyaert
  - Department of Otorhinolaryngology and Head and Neck Surgery, Antwerp University Hospital, Antwerp, Belgium
- Raymond van de Berg
  - Department of Otolaryngology, Head and Neck Surgery, Maastricht University Medical Centre, Maastricht, The Netherlands
  - Faculty of Physics, Tomsk State University, Tomsk, Russia
- Olivier Vanderveken
  - Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
  - Department of Otorhinolaryngology and Head and Neck Surgery, Antwerp University Hospital, Antwerp, Belgium
- Paul Van de Heyning
  - Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
  - Department of Otorhinolaryngology and Head and Neck Surgery, Antwerp University Hospital, Antwerp, Belgium
- Angélica Pérez Fornos
  - Service of Otorhinolaryngology Head and Neck Surgery, Department of Clinical Neurosciences, Geneva University Hospitals, Geneva, Switzerland
- Nils Guinand
  - Service of Otorhinolaryngology Head and Neck Surgery, Department of Clinical Neurosciences, Geneva University Hospitals, Geneva, Switzerland
- Marc J W Lammers
  - Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
  - Department of Otorhinolaryngology and Head and Neck Surgery, Antwerp University Hospital, Antwerp, Belgium
- Vincent Van Rompaey
  - Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
  - Department of Otorhinolaryngology and Head and Neck Surgery, Antwerp University Hospital, Antwerp, Belgium
5
Gaveau V, Coudert A, Salemme R, Koun E, Desoche C, Truy E, Farnè A, Pavani F. Benefits of active listening during 3D sound localization. Exp Brain Res 2022; 240:2817-2833. [PMID: 36071210] [PMCID: PMC9587935] [DOI: 10.1007/s00221-022-06456-x]
Abstract
In everyday life, sound localization entails more than just the extraction and processing of auditory cues. When determining sound position in three dimensions, the brain also considers the available visual information (e.g., visual cues to sound position) and resolves perceptual ambiguities through active listening behavior (e.g., spontaneous head movements while listening). Here, we examined to what extent spontaneous head movements improve sound localization in 3D—azimuth, elevation, and depth—by comparing static vs. active listening postures. To this aim, we developed a novel approach to sound localization based on sounds delivered in the environment, brought into alignment with a VR system. Our system proved effective for the delivery of sounds at predetermined and repeatable positions in 3D space, without imposing a physically constrained posture and with minimal training. In addition, it allowed measuring participant behavior (hand, head, and eye position) in real time. We report that active listening improved 3D sound localization, primarily by improving the accuracy and reducing the variability of responses in azimuth and elevation. The more participants made spontaneous head movements, the better was their 3D sound localization performance. Thus, we provide proof of concept of a novel approach to the study of spatial hearing, with potential for clinical and industrial applications.
Affiliation(s)
- V Gaveau
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Av. Doyen Lépine, BRON cedex, 69500, Lyon, France
  - University of Lyon 1, Lyon, France
- A Coudert
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Av. Doyen Lépine, BRON cedex, 69500, Lyon, France
  - University of Lyon 1, Lyon, France
  - ENT Departments, Hôpital Femme-Mère-Enfant and Edouard Herriot University Hospitals, Lyon, France
- R Salemme
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Av. Doyen Lépine, BRON cedex, 69500, Lyon, France
  - University of Lyon 1, Lyon, France
  - Neuro-immersion, Lyon, France
- E Koun
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Av. Doyen Lépine, BRON cedex, 69500, Lyon, France
  - University of Lyon 1, Lyon, France
- C Desoche
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Av. Doyen Lépine, BRON cedex, 69500, Lyon, France
  - University of Lyon 1, Lyon, France
  - Neuro-immersion, Lyon, France
- E Truy
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Av. Doyen Lépine, BRON cedex, 69500, Lyon, France
  - University of Lyon 1, Lyon, France
  - ENT Departments, Hôpital Femme-Mère-Enfant and Edouard Herriot University Hospitals, Lyon, France
- A Farnè
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Av. Doyen Lépine, BRON cedex, 69500, Lyon, France
  - University of Lyon 1, Lyon, France
  - Neuro-immersion, Lyon, France
- F Pavani
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Av. Doyen Lépine, BRON cedex, 69500, Lyon, France
  - University of Lyon 1, Lyon, France
  - Center for Mind/Brain Sciences - CIMeC, University of Trento, Rovereto, Italy
6
Occhigrossi C, Brosch M, Giommetti G, Panichi R, Ricci G, Ferraresi A, Roscini M, Pettorossi VE, Faralli M. Auditory perception is influenced by the orientation of the trunk relative to a sound source. Exp Brain Res 2021; 239:1223-1234. [PMID: 33587165] [DOI: 10.1007/s00221-021-06047-2]
Abstract
The study investigated how hearing depends on whole-body, head, and trunk orientation relative to a sound source. In normal-hearing humans, we examined auditory thresholds and the ability to recognize logatomes (bi-syllabic non-sense words) at different whole-body, head, and trunk rotations relative to a sound source. We found that the auditory threshold was increased and logatome recognition was impaired when the body or the trunk was rotated 40° away from a sound source compared to when it was oriented towards the sound source. Conversely, no effects were seen when only the head was rotated. Further, increased thresholds and impaired logatome recognition were also observed after unilateral vibration of the dorsal neck muscles, which induces, per se, a long-lasting illusory trunk displacement relative to the head. Thus, our findings support the idea that processing of acoustic signals depends on where a sound is located within a reference system defined by the subject's trunk coordinates.
Affiliation(s)
- Chiara Occhigrossi
  - Department of Experimental Medicine, Human Physiology Section, Università degli Studi di Perugia, Perugia, Italy
- Michael Brosch
  - Research Group Comparative Neuroscience, Leibniz Institute for Neurobiology, Brenneckestraße 6, 39118, Magdeburg, Germany
  - Center for Behavioral Brain Sciences, Otto-Von-Guericke-University, Universitätsplatz 2, 39106, Magdeburg, Germany
- Giorgia Giommetti
  - Department of Surgical and Biomedical Sciences, Università degli Studi di Perugia, Perugia, Italy
- Roberto Panichi
  - Department of Experimental Medicine, Human Physiology Section, Università degli Studi di Perugia, Perugia, Italy
- Giampietro Ricci
  - Department of Surgical and Biomedical Sciences, Università degli Studi di Perugia, Perugia, Italy
- Aldo Ferraresi
  - Department of Experimental Medicine, Human Physiology Section, Università degli Studi di Perugia, Perugia, Italy
- Mauro Roscini
  - Department of Experimental Medicine, Human Physiology Section, Università degli Studi di Perugia, Perugia, Italy
- Vito Enrico Pettorossi
  - Department of Experimental Medicine, Human Physiology Section, Università degli Studi di Perugia, Perugia, Italy
- Mario Faralli
  - Department of Surgical and Biomedical Sciences, Università degli Studi di Perugia, Perugia, Italy
7
Kaliuzhna M, Serino A, Berger S, Blanke O. Differential effects of vestibular processing on orienting exogenous and endogenous covert visual attention. Exp Brain Res 2018; 237:401-410. [PMID: 30421244] [DOI: 10.1007/s00221-018-5403-3]
Abstract
Recent research highlights the overwhelming role of vestibular information for higher order cognition. Central to body perception, vestibular cues provide information about self-location in space, self-motion versus object motion, and modulate the perception of space. Surprisingly, however, little research has dealt with how vestibular information combines with other senses to orient one's attention in space. Here we used passive whole body rotations as exogenous (Experiment 1) or endogenous (Experiment 2) attentional cues and studied their effects on orienting visual attention in a classical Posner paradigm. We show that, when employed as an exogenous stimulus, rotation impacts attention orienting only immediately after vestibular stimulation onset. However, when acting as an endogenous stimulus, vestibular stimulation provides a robust benefit to target detection throughout the rotation profile. Our data also demonstrate that vestibular stimulation boosts attentional processing more generally, independent of rotation direction, associated with a general improvement in performance. These data provide evidence for distinct effects of vestibular processing on endogenous and exogenous attention as well as alertness that differ with respect to the temporal dynamics of the motion profile. These data reveal that attentional spatial processing and spatial body perception as manipulated through vestibular stimulation share important brain mechanisms.
Affiliation(s)
- Mariia Kaliuzhna
  - Center for Neuroprosthetics, Brain Mind Institute, Faculty of Life Sciences, School of Life Science, Ecole Polytechnique Fédérale de Lausanne, 1015, Lausanne, Switzerland
  - Laboratory of Cognitive Neuroscience, Brain Mind Institute, School of Life Science, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Andrea Serino
  - Laboratory of Cognitive Neuroscience, Brain Mind Institute, School of Life Science, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
  - Department of Psychology, University of Bologna, Bologna, Italy
- Steve Berger
  - Center for Neuroprosthetics, Brain Mind Institute, Faculty of Life Sciences, School of Life Science, Ecole Polytechnique Fédérale de Lausanne, 1015, Lausanne, Switzerland
  - Laboratory of Cognitive Neuroscience, Brain Mind Institute, School of Life Science, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Olaf Blanke
  - Center for Neuroprosthetics, Brain Mind Institute, Faculty of Life Sciences, School of Life Science, Ecole Polytechnique Fédérale de Lausanne, 1015, Lausanne, Switzerland
  - Laboratory of Cognitive Neuroscience, Brain Mind Institute, School of Life Science, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
  - Department of Neurology, University Hospital, Geneva, Switzerland
8
Wu C, Shore SE. Multisensory activation of ventral cochlear nucleus D-stellate cells modulates dorsal cochlear nucleus principal cell spatial coding. J Physiol 2018; 596:4537-4548. [PMID: 30074618] [DOI: 10.1113/jp276280]
Abstract
KEY POINTS Dorsal cochlear nucleus fusiform cells receive spectrally relevant auditory input for sound localization. Fusiform cells integrate auditory with other multisensory inputs. Here we elucidate how somatosensory and vestibular stimulation modify the fusiform cell spatial code through activation of an inhibitory interneuron: the ventral cochlear nucleus D-stellate cell. These results suggest that multisensory cues interact early in an ascending sensory pathway to serve an essential function. ABSTRACT In the cochlear nucleus (CN), the first central site for coding sound location, numerous multisensory projections and their modulatory effects have been reported. However, multisensory influences on sound location processing in the CN remain unknown. The principal output neurons of the dorsal CN, fusiform cells, encode spatial information through frequency-selective responses to direction-dependent spectral features. Here, single-unit recordings from the guinea pig CN revealed transient alterations by somatosensory and vestibular stimulation in fusiform cell spatial coding. Changes in fusiform cell spectral sensitivity correlated with multisensory modulation of ventral CN D-stellate cell responses, which provide direct, wideband inhibition to fusiform cells. These results suggest that multisensory inputs contribute to spatial coding in DCN fusiform cells via an inhibitory interneuron, the D-stellate cell. This early multisensory integration circuit likely confers important consequences on perceptual organization downstream.
Affiliation(s)
- Calvin Wu
  - Kresge Hearing Research Institute, Department of Otolaryngology, University of Michigan, Ann Arbor, MI, 48109, USA
- Susan E Shore
  - Kresge Hearing Research Institute, Department of Otolaryngology, University of Michigan, Ann Arbor, MI, 48109, USA
9
Shayman CS, Seo JH, Oh Y, Lewis RF, Peterka RJ, Hullar TE. Relationship between vestibular sensitivity and multisensory temporal integration. J Neurophysiol 2018; 120:1572-1577. [PMID: 30020839] [DOI: 10.1152/jn.00379.2018]
Abstract
A single event can generate asynchronous sensory cues due to variable encoding, transmission, and processing delays. To be interpreted as being associated in time, these cues must occur within a limited time window, referred to as a "temporal binding window" (TBW). We investigated the hypothesis that vestibular deficits could disrupt temporal visual-vestibular integration by determining the relationships between vestibular threshold and TBW in participants with normal vestibular function and with vestibular hypofunction. Vestibular perceptual thresholds to yaw rotation were characterized and compared with the TBWs obtained from participants who judged whether a suprathreshold rotation occurred before or after a brief visual stimulus. Vestibular thresholds ranged from 0.7 to 16.5 deg/s and TBWs ranged from 13.8 to 395 ms. Among all participants, TBW and vestibular thresholds were well correlated (R² = 0.674, P < 0.001), with vestibular-deficient patients having higher thresholds and wider TBWs. Participants reported that the rotation onset needed to lead the light flash by an average of 80 ms for the visual and vestibular cues to be perceived as occurring simultaneously. The wide TBWs in vestibular-deficient participants compared with normal functioning participants indicate that peripheral sensory loss can lead to abnormal multisensory integration. A reduced ability to temporally combine sensory cues appropriately may provide a novel explanation for some symptoms reported by patients with vestibular deficits. Even among normal functioning participants, a high correlation between TBW and vestibular thresholds was observed, suggesting that these perceptual measurements are sensitive to small differences in vestibular function. NEW & NOTEWORTHY While spatial visual-vestibular integration has been well characterized, the temporal integration of these cues is not well understood. The relationship between sensitivity to whole body rotation and duration of the temporal window of visual-vestibular integration was examined using psychophysical techniques. These parameters were highly correlated for those with normal vestibular function and for patients with vestibular hypofunction. Reduced temporal integration performance in patients with vestibular hypofunction may explain some symptoms associated with vestibular loss.
Affiliation(s)
- Corey S Shayman
  - Department of Otolaryngology-Head and Neck Surgery, Oregon Health and Science University, Portland, Oregon
- Jae-Hyun Seo
  - Department of Otolaryngology-Head and Neck Surgery, Oregon Health and Science University, Portland, Oregon
  - Department of Otolaryngology-Head and Neck Surgery, The Catholic University of Korea, Seoul, Republic of Korea
- Yonghee Oh
  - Department of Otolaryngology-Head and Neck Surgery, Oregon Health and Science University, Portland, Oregon
- Richard F Lewis
  - Department of Otolaryngology, Harvard Medical School, Boston, Massachusetts
  - Department of Neurology, Harvard Medical School, Boston, Massachusetts
  - Jenks Vestibular Physiology Laboratory, Massachusetts Eye and Ear Infirmary, Boston, Massachusetts
- Robert J Peterka
  - National Center for Rehabilitative Auditory Research, VA Portland Health Care System, Portland, Oregon
  - Department of Neurology, Oregon Health and Science University, Portland, Oregon
- Timothy E Hullar
  - Department of Otolaryngology-Head and Neck Surgery, Oregon Health and Science University, Portland, Oregon
| |
Collapse
10
Karim AM, Rumalla K, King LA, Hullar TE. The effect of spatial auditory landmarks on ambulation. Gait Posture 2018; 60:171-174. [PMID: 29241100 PMCID: PMC5809182 DOI: 10.1016/j.gaitpost.2017.12.003] [Citation(s) in RCA: 25] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/09/2016] [Revised: 12/01/2017] [Accepted: 12/02/2017] [Indexed: 02/02/2023]
Abstract
The maintenance of balance and posture is a result of the collaborative efforts of vestibular, proprioceptive, and visual sensory inputs, but a fourth neural input, audition, may also improve balance. Here, we tested the hypothesis that auditory inputs function as environmental spatial landmarks whose effectiveness depends on sound localization ability during ambulation. Eight blindfolded normal young subjects performed the Fukuda-Unterberger test in three auditory conditions: silence, white noise played through headphones (head-referenced condition), and white noise played through a loudspeaker placed directly in front, 135 cm from the ear at ear height (earth-referenced condition). For the earth-referenced condition, an additional experiment tested the effect of moving the speaker azimuthal position to 45, 90, 135, and 180°. Subjects performed significantly better in the earth-referenced condition than in the head-referenced or silent conditions. Performance progressively decreased over the range from 0° to 135°, but all subjects then improved slightly at 180° compared to 135°. These results suggest that the presence of sound dramatically improves the ability to ambulate when vision is limited, but that sound sources must be located in the external environment in order to improve balance. This supports the hypothesis that they act by providing spatial landmarks against which head and body movement and orientation may be compared and corrected. Balance improvement in the azimuthal plane mirrors sensitivity to sound movement at similar positions, indicating that similar auditory mechanisms may underlie both processes. These results may help optimize the use of auditory cues to improve balance in particular patient populations.
Affiliation(s)
- Laurie A King
- Oregon Health and Science University, Portland, OR, USA
11
Hambrook DA, Ilievski M, Mosadeghzad M, Tata M. A Bayesian computational basis for auditory selective attention using head rotation and the interaural time-difference cue. PLoS One 2017; 12:e0186104. [PMID: 28982139 PMCID: PMC5629026 DOI: 10.1371/journal.pone.0186104] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/25/2017] [Accepted: 09/25/2017] [Indexed: 11/18/2022] Open
Abstract
The process of resolving mixtures of several sounds into their separate individual streams is known as auditory scene analysis and it remains a challenging task for computational systems. It is well known that animals use binaural differences in arrival time and intensity at the two ears to find the arrival angle of sounds in the azimuthal plane, and this localization function has sometimes been considered sufficient to enable the un-mixing of complex scenes. However, the ability of such systems to resolve distinct sound sources in both space and frequency remains limited. The neural computations for detecting interaural time difference (ITD) have been well studied and have served as the inspiration for computational auditory scene analysis systems; however, a crucial limitation of ITD models is that they produce ambiguous or "phantom" images in the scene. This has been thought to limit their usefulness at frequencies above about 1 kHz in humans. We present a simple Bayesian model and an implementation on a robot that uses ITD information recursively. The model makes use of head rotations to show that ITD information is sufficient to unambiguously resolve sound sources in both space and frequency. Contrary to commonly held assumptions about sound localization, we show that the ITD cue used with high-frequency sound can provide accurate and unambiguous localization and resolution of competing sounds. Our findings suggest that an "active hearing" approach could be useful in robotic systems that operate in natural, noisy settings. We also suggest that neurophysiological models of sound localization in animals could benefit from revision to include the influence of top-down memory and sensorimotor integration across head rotations.
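The recursive Bayesian use of ITD across head rotations can be sketched compactly. The code below is a toy reconstruction under stated assumptions (a spherical-head sine ITD model, 20 µs Gaussian observation noise, a 5° grid of candidate world directions), not the authors' implementation:

```python
import numpy as np

HEAD_RADIUS, C = 0.09, 343.0               # assumed head radius (m), speed of sound (m/s)
WORLD = np.deg2rad(np.arange(0, 360, 5))   # candidate world-centred directions

def itd_model(theta):
    """Toy spherical-head ITD (s); symmetric under front-back mirroring,
    which is exactly the 'phantom image' ambiguity of ITD alone."""
    return (2 * HEAD_RADIUS / C) * np.sin(theta)

def bayes_update(prior, observed_itd, head_angle, sigma=20e-6):
    """One recursive step: re-weight each candidate world direction by how
    well the ITD it would produce at the current head angle explains the
    observation, then renormalize."""
    predicted = itd_model(WORLD - head_angle)
    likelihood = np.exp(-0.5 * ((observed_itd - predicted) / sigma) ** 2)
    posterior = prior * likelihood
    return posterior / posterior.sum()

# A source fixed at 30 deg in the world, sampled before and after a 40 deg
# head turn: the second observation eliminates the front-back phantom.
true_dir = np.deg2rad(30)
post = np.full(WORLD.size, 1.0 / WORLD.size)   # flat prior
for head in np.deg2rad([0.0, 40.0]):
    post = bayes_update(post, itd_model(true_dir - head), head)
best_deg = float(np.rad2deg(WORLD[np.argmax(post)]))
```

After the first observation the posterior has equal peaks at 30° and its front-back mirror at 150°; the post-rotation observation is consistent only with 30°, leaving a single peak there.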
Affiliation(s)
- Dillon A. Hambrook
- Department of Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
- Marko Ilievski
- Department of Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
- Mohamad Mosadeghzad
- Department of Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
- Matthew Tata
- Department of Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
12
Ueberfuhr MA, Braun A, Wiegrebe L, Grothe B, Drexl M. Modulation of auditory percepts by transcutaneous electrical stimulation. Hear Res 2017; 350:235-243. [DOI: 10.1016/j.heares.2017.03.008] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/11/2017] [Revised: 02/10/2017] [Accepted: 03/15/2017] [Indexed: 10/19/2022]
13
Carlile S, Leung J. The Perception of Auditory Motion. Trends Hear 2016; 20:2331216516644254. [PMID: 27094029 PMCID: PMC4871213 DOI: 10.1177/2331216516644254] [Citation(s) in RCA: 38] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/12/2015] [Revised: 03/22/2016] [Accepted: 03/22/2016] [Indexed: 11/16/2022] Open
Abstract
The growing availability of efficient and relatively inexpensive virtual auditory display technology has provided new research platforms to explore the perception of auditory motion. At the same time, deployment of these technologies in command and control as well as in entertainment roles is generating an increasing need to better understand the complex processes underlying auditory motion perception. This is a particularly challenging processing feat because it involves the rapid deconvolution of the relative change in the locations of sound sources produced by rotations and translations of the head in space (self-motion) to enable the perception of actual source motion. The fact that we perceive our auditory world to be stable despite almost continual movement of the head demonstrates the efficiency and effectiveness of this process. This review examines the acoustical basis of auditory motion perception and a wide range of psychophysical, electrophysiological, and cortical imaging studies that have probed the limits and possible mechanisms underlying this perception.
Affiliation(s)
- Simon Carlile
- School of Medical Sciences, University of Sydney, NSW, Australia; Starkey Hearing Research Center, Berkeley, CA, USA
- Johahn Leung
- School of Medical Sciences, University of Sydney, NSW, Australia
14
Yost WA, Zhong X, Najam A. Judging sound rotation when listeners and sounds rotate: Sound source localization is a multisystem process. THE JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA 2015; 138:3293-310. [PMID: 26627802 DOI: 10.1121/1.4935091] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/25/2023]
Abstract
In four experiments listeners were rotated or were stationary. Sounds came from a stationary loudspeaker or rotated from loudspeaker to loudspeaker around an azimuth array. When either sounds or listeners rotate, the auditory cues used for sound source localization change, but in the everyday world listeners perceive sound rotation only when sounds rotate, not when listeners rotate. In the everyday world sound source locations are referenced to positions in the environment (a world-centric reference system). The auditory cues for sound source location indicate locations relative to the head (a head-centric reference system), not locations relative to the world. This paper deals with a general hypothesis that the world-centric location of sound sources requires the auditory system to have information about auditory cues used for sound source location and cues about head position. The use of visual and vestibular information in determining rotating head position in sound rotation perception was investigated. The experiments show that sound rotation perception when sources and listeners rotate was based on acoustic, visual, and, perhaps, vestibular information. The findings are consistent with the general hypothesis and suggest that sound source localization is not based just on acoustics. It is a multisystem process.
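The paper's general hypothesis, that world-centric sound location is computed by combining head-centric acoustic cues with head-position information from vision and the vestibular system, reduces to a simple coordinate transform. A minimal sketch (the angle convention and values are illustrative, not from the study):

```python
def world_centric_estimate(head_centric_deg, head_position_deg):
    """World-referenced source direction: the head-referenced acoustic
    estimate plus head position supplied by visual/vestibular cues (deg)."""
    return (head_centric_deg + head_position_deg) % 360.0

# A stationary source at 45 deg in the room, sampled at three head positions:
# the acoustic (head-centric) cue sweeps as the head turns, but the
# world-centric estimate, and hence a percept of a non-rotating source,
# stays constant.
estimates = [world_centric_estimate(45.0 - head, head)
             for head in (0.0, 30.0, 60.0)]
```

Without the head-position term, the sweeping acoustic cue alone would be indistinguishable from a rotating source, which is the failure mode the experiments probe.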
Affiliation(s)
- William A Yost
- Speech and Hearing Science, Arizona State University, P.O. Box 870102, Tempe, Arizona 85287, USA
- Xuan Zhong
- Speech and Hearing Science, Arizona State University, P.O. Box 870102, Tempe, Arizona 85287, USA
- Anbar Najam
- Speech and Hearing Science, Arizona State University, P.O. Box 870102, Tempe, Arizona 85287, USA
15
Brimijoin WO, Akeroyd MA. The moving minimum audible angle is smaller during self motion than during source motion. Front Neurosci 2014; 8:273. [PMID: 25228856 PMCID: PMC4151253 DOI: 10.3389/fnins.2014.00273] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/12/2014] [Accepted: 08/12/2014] [Indexed: 11/17/2022] Open
Abstract
We are rarely perfectly still: our heads rotate in three axes and move in three dimensions, constantly varying the spectral and binaural cues at the ear drums. In spite of this motion, static sound sources in the world are typically perceived as stable objects. This argues that the auditory system—in a manner not unlike the vestibulo-ocular reflex—works to compensate for self motion and stabilize our sensory representation of the world. We tested a prediction arising from this postulate: that self motion should be processed more accurately than source motion. We used an infrared motion tracking system to measure head angle, and real-time interpolation of head related impulse responses to create “head-stabilized” signals that appeared to remain fixed in space as the head turned. After being presented with pairs of simultaneous signals consisting of a man and a woman speaking a snippet of speech, normal and hearing impaired listeners were asked to report whether the female voice was to the left or the right of the male voice. In this way we measured the moving minimum audible angle (MMAA). This measurement was made while listeners were asked to turn their heads back and forth between ± 15° and the signals were stabilized in space. After this “self-motion” condition we measured MMAA in a second “source-motion” condition when listeners remained still and the virtual locations of the signals were moved using the trajectories from the first condition. For both normal and hearing impaired listeners, we found that the MMAA for signals moving relative to the head was ~1–2° smaller when the movement was the result of self motion than when it was the result of source motion, even though the motion with respect to the head was identical. These results as well as the results of past experiments suggest that spatial processing involves an ongoing and highly accurate comparison of spatial acoustic cues with self-motion cues.
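The "head-stabilized" rendering described above amounts to subtracting the tracked head yaw from the source's world angle on every tracker update (in the study this drove real-time interpolation of head-related impulse responses). A minimal sketch of the geometry, with an assumed [-180, 180) degree convention:

```python
def stabilized_source_angle(world_deg, head_yaw_deg):
    """Head-relative angle at which to render a virtual source so that it
    stays fixed in the world as the head turns; wrapped to [-180, 180)."""
    return (world_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

# As the listener sweeps the head from -15 to +15 deg, a source fixed at
# +15 deg in the world must be rendered at +30 down to 0 deg re: the head.
angles = [stabilized_source_angle(15.0, yaw) for yaw in (-15.0, 0.0, 15.0)]
```

The source-motion control condition simply replays these head-relative trajectories to a stationary listener, so the motion at the ears is identical in both conditions.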
Affiliation(s)
- W Owen Brimijoin
- Scottish Section, Institute of Hearing Research, Medical Research Council/Chief Scientist Office, Glasgow, UK
- Michael A Akeroyd
- Scottish Section, Institute of Hearing Research, Medical Research Council/Chief Scientist Office, Glasgow, UK
16
Teramoto W, Cui Z, Sakamoto S, Gyoba J. Distortion of auditory space during visually induced self-motion in depth. Front Psychol 2014; 5:848. [PMID: 25140162 PMCID: PMC4122181 DOI: 10.3389/fpsyg.2014.00848] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/22/2014] [Accepted: 07/16/2014] [Indexed: 11/13/2022] Open
Abstract
Perception of self-motion is based on the integration of multiple sensory inputs, in particular from the vestibular and visual systems. Our previous study demonstrated that vestibular linear acceleration information distorted auditory space perception (Teramoto et al., 2012). However, it is unclear whether this phenomenon is contingent on vestibular signals or whether it can be caused by inputs from other sensory modalities involved in self-motion perception. Here, we investigated whether visual linear self-motion information can also alter auditory space perception. Large-field visual motion was presented to induce self-motion perception with constant accelerations (Experiment 1) and a constant velocity (Experiment 2) either in a forward or backward direction. During participants' experience of self-motion, a short noise burst was delivered from one of the loudspeakers aligned parallel to the motion direction along a wall to the left of the listener. Participants indicated from which direction the sound was presented, forward or backward, relative to their coronal (i.e., frontal) plane. Results showed that the sound position aligned with the subjective coronal plane (SCP) was significantly displaced in the direction of self-motion, especially in the backward self-motion condition as compared with a no motion condition. These results suggest that self-motion information, irrespective of its origin, is crucial for auditory space perception.
Affiliation(s)
- Wataru Teramoto
- Department of Computer Science and Systems Engineering, Muroran Institute of Technology, Muroran, Japan; Research Institute of Electrical Communication, Tohoku University, Sendai, Japan
- Zhenglie Cui
- Research Institute of Electrical Communication, Tohoku University, Sendai, Japan
- Shuichi Sakamoto
- Research Institute of Electrical Communication, Tohoku University, Sendai, Japan
- Jiro Gyoba
- Department of Psychology, Graduate School of Arts and Letters, Tohoku University, Sendai, Japan
17
Carlile S. The plastic ear and perceptual relearning in auditory spatial perception. Front Neurosci 2014; 8:237. [PMID: 25147497 PMCID: PMC4123622 DOI: 10.3389/fnins.2014.00237] [Citation(s) in RCA: 31] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/30/2014] [Accepted: 07/18/2014] [Indexed: 11/28/2022] Open
Abstract
The auditory system of adult listeners has been shown to accommodate to altered spectral cues to sound location, which presumably provides the basis for recalibration to changes in the shape of the ear over a lifetime. Here we review the role of auditory and non-auditory inputs to the perception of sound location and consider a range of recent experiments looking at the role of non-auditory inputs in the process of accommodation to these altered spectral cues. A number of studies have used small ear molds to modify the spectral cues, resulting in significant degradation in localization performance. Following chronic exposure (10–60 days), performance recovers to some extent, and recent work has demonstrated that this occurs for both audio-visual and audio-only regions of space. This begs the question as to the teacher signal for this remarkable functional plasticity in the adult nervous system. Following a brief review of the influence of the motor state in auditory localization, we consider the potential role of auditory-motor learning in the perceptual recalibration of the spectral cues. Several recent studies have considered how multi-modal and sensory-motor feedback might influence accommodation to altered spectral cues produced by ear molds or through virtual auditory space stimulation using non-individualized spectral cues. The work with ear molds demonstrates that a relatively short period of training involving audio-motor feedback (5–10 days) significantly improved both the rate and extent of accommodation to altered spectral cues. This has significant implications not only for the mechanisms by which this complex sensory information is encoded to provide spatial cues but also for adaptive training to altered auditory inputs. The review concludes by considering the implications for rehabilitative training with hearing aids and cochlear prostheses.
Affiliation(s)
- Simon Carlile
- School of Medical Sciences and Bosch Institute, University of Sydney, Sydney, NSW, Australia
18
Carlile S, Balachandar K, Kelly H. Accommodating to new ears: the effects of sensory and sensory-motor feedback. THE JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA 2014; 135:2002-2011. [PMID: 25234999 DOI: 10.1121/1.4868369] [Citation(s) in RCA: 28] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/03/2023]
Abstract
Changing the shape of the outer ear using small in-ear molds degrades sound localization performance, consistent with the distortion of monaural spectral cues to location. It has been shown recently that adult listeners re-calibrate to these new spectral cues for locations both inside and outside the visual field. This raises the question as to the teacher signal for this remarkable functional plasticity. Furthermore, large individual differences in the extent and rate of accommodation suggest that a number of factors may be driving this process. A training paradigm exploiting multi-modal and sensory-motor feedback during accommodation was examined to determine whether it might accelerate this process. So as to standardize the modification of the spectral cues, molds filling 40% of the volume of each outer ear were custom made for each subject. Daily training sessions of about an hour, involving repetitive auditory stimuli and exploratory behavior by the subject, significantly improved the extent of accommodation measured by both front-back confusions and polar angle localization errors, with some improvement in the rate of accommodation demonstrated by front-back confusion errors. This work has implications both for the process by which a coherent representation of auditory space is maintained and for accommodative training for hearing aid wearers.
Affiliation(s)
- Simon Carlile
- School of Medical Sciences and The Bosch Institute, University of Sydney, Sydney, New South Wales 2006, Australia
- Kapilesh Balachandar
- School of Medical Sciences and The Bosch Institute, University of Sydney, Sydney, New South Wales 2006, Australia
- Heather Kelly
- School of Medical Sciences and The Bosch Institute, University of Sydney, Sydney, New South Wales 2006, Australia
19
Interactions between the vestibular nucleus and the dorsal cochlear nucleus: implications for tinnitus. Hear Res 2012; 292:80-2. [PMID: 22960359 DOI: 10.1016/j.heares.2012.08.006] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/09/2012] [Revised: 08/16/2012] [Accepted: 08/17/2012] [Indexed: 11/21/2022]
Abstract
The peripheral auditory and vestibular systems are recognised to be closely related anatomically and physiologically; however, less well understood is the interaction of these two sensory systems in the brain. A number of previous studies in different species have reported that the dorsal and ventral cochlear nuclei receive direct projections from the primary vestibular nerve and one previous study had reported projections from the vestibular nucleus to the dorsal cochlear nucleus (DCN) in rabbit. Recently, Barker et al. (2012 PLoS One. 7(5): e35955) have reported new evidence that the lateral vestibular nucleus (LVN) projects to the DCN in rat and that these synapses are mediated by glutamate acting on AMPA and NMDA receptors. These recent findings, in addition to the earlier ones, suggest that the auditory and vestibular systems may be intimately connected centrally as well as peripherally and this may have important implications for disorders such as tinnitus.
20
Teramoto W, Sakamoto S, Furune F, Gyoba J, Suzuki Y. Compression of auditory space during forward self-motion. PLoS One 2012; 7:e39402. [PMID: 22768076 PMCID: PMC3387142 DOI: 10.1371/journal.pone.0039402] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/07/2011] [Accepted: 05/21/2012] [Indexed: 11/19/2022] Open
Abstract
BACKGROUND Spatial inputs from the auditory periphery can be changed with movements of the head or whole body relative to the sound source. Nevertheless, humans can perceive a stable auditory environment and appropriately react to a sound source. This suggests that the inputs are reinterpreted in the brain, while being integrated with information on the movements. Little is known, however, about how these movements modulate auditory perceptual processing. Here, we investigate the effect of linear acceleration on auditory space representation. METHODOLOGY/PRINCIPAL FINDINGS Participants were passively transported forward/backward at constant accelerations using a robotic wheelchair. An array of loudspeakers was aligned parallel to the motion direction along a wall to the right of the listener. A short noise burst was presented during the self-motion from one of the loudspeakers when the listener's physical coronal plane reached the location of one of the speakers (null point). In Experiments 1 and 2, the participants indicated in which direction the sound was presented, forward or backward, relative to their subjective coronal plane. The results showed that the sound position aligned with the subjective coronal plane was displaced ahead of the null point only during forward self-motion and that the magnitude of the displacement increased with increasing acceleration. Experiment 3 investigated the structure of the auditory space in the traveling direction during forward self-motion. The sounds were presented at various distances from the null point. The participants indicated the perceived sound location by pointing with a rod. All the sounds that were actually located in the traveling direction were perceived as being biased towards the null point. CONCLUSIONS/SIGNIFICANCE These results suggest a distortion of the auditory space in the direction of movement during forward self-motion. The underlying mechanism might involve anticipatory spatial shifts in the auditory receptive field locations driven by afferent signals from the vestibular system.
Affiliation(s)
- Wataru Teramoto
- Research Institute of Electrical Communication, Tohoku University, Sendai, Japan
21
Does caloric vestibular stimulation modulate tinnitus? Neurosci Lett 2011; 492:52-4. [PMID: 21295114 DOI: 10.1016/j.neulet.2011.01.052] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/23/2009] [Revised: 01/20/2011] [Accepted: 01/20/2011] [Indexed: 11/22/2022]
Abstract
Caloric vestibular stimulation (CVS) has been demonstrated to transiently modulate a variety of cognitive functions. These effects are associated with the brain activation induced by CVS, involving the temporal-parietal cortex, anterior cingulate cortex and insular cortex, which are thought to form a multimodal vestibular cortical network. The present study investigated the effect of CVS upon tinnitus. Twenty patients undergoing vestibular function tests for symptoms of imbalance and who reported tinnitus were asked to rate their tinnitus using visual analogue measures of pitch and intensity immediately before and after CVS (water at 44°C) in the ear ipsilateral to the tinnitus. One patient was excluded due to test findings indicative of a central vestibular abnormality. The mean VAS pitch (pre-post) changed from 5.65 to 5.28 (95% confidence interval (-0.87, 0.12), p-value 0.13) and the mean intensity changed from 5.21 to 4.43 (95% confidence interval (-1.60, 0.04), p-value 0.06). The findings indicate that there is no consistent influence of CVS upon tinnitus, and we propose that perceived pitch and intensity of tinnitus are independent of the multimodal vestibular network that is activated by CVS.
22
Perceived timing of vestibular stimulation relative to touch, light and sound. Exp Brain Res 2009; 198:221-31. [PMID: 19352639 DOI: 10.1007/s00221-009-1779-4] [Citation(s) in RCA: 66] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/31/2008] [Accepted: 03/14/2009] [Indexed: 10/20/2022]
Abstract
Different senses have different processing times. Here we measured the perceived timing of galvanic vestibular stimulation (GVS) relative to tactile, visual and auditory stimuli. Simple reaction times for perceived head movement (438 ± 49 ms) were significantly longer than to touches (245 ± 14 ms), lights (220 ± 13 ms), or sounds (197 ± 13 ms). Temporal order and simultaneity judgments both indicated that GVS had to occur about 160 ms before other stimuli to be perceived as simultaneous with them. This lead was significantly less than the relative timing predicted by reaction time differences, compatible with an incomplete tendency to compensate for differences in processing times.
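The temporal-order judgments described above reduce to locating a point on a psychometric curve: the ~160 ms figure is the point of subjective simultaneity (PSS), the GVS lead at which "GVS first" and "other stimulus first" responses are equally likely. A minimal sketch with made-up response proportions (a real analysis would fit a logistic function rather than interpolate):

```python
import numpy as np

def pss_from_toj(gvs_lead_ms, p_gvs_first):
    """Point of subjective simultaneity from a temporal-order-judgment
    curve: the GVS lead (ms) at which 'GVS perceived first' responses
    cross 50%, located by linear interpolation (assumes one crossing)."""
    x = np.asarray(gvs_lead_ms, float)
    p = np.asarray(p_gvs_first, float)
    # index of the segment whose endpoints straddle (or touch) 0.5
    i = int(np.flatnonzero((p[:-1] - 0.5) * (p[1:] - 0.5) <= 0)[0])
    return float(x[i] + (0.5 - p[i]) / (p[i + 1] - p[i]) * (x[i + 1] - x[i]))

# Synthetic TOJ data: 'GVS first' responses rise with increasing GVS lead,
# crossing 50% at a 160 ms lead.
pss = pss_from_toj([0, 80, 160, 240, 320], [0.1, 0.25, 0.5, 0.8, 0.95])
```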
23
Cooper J, Carlile S, Alais D. Distortions of auditory space during rapid head turns. Exp Brain Res 2008; 191:209-19. [PMID: 18696058 DOI: 10.1007/s00221-008-1516-4] [Citation(s) in RCA: 14] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/15/2006] [Accepted: 07/21/2008] [Indexed: 10/21/2022]
Abstract
Auditory localisation was examined using brief broadband sounds presented during rapid head turns to visual targets in the peripheral field. Presenting sounds during a rapid head movement will "smear" the acoustic cues to the sound's location. During the early stages of a head turn, sound localisation accuracy was comparable to a no-turn control condition. However, significant localisation errors occurred when the probe sound was presented during the later part of a head turn. After correcting for head position, the estimate of lateral angle (horizontal position) in the front hemisphere was generally accurate. However, lateral angle estimates for positions in the rear hemisphere exhibited systematic errors that were especially large around the midline. Polar angle (elevation) perception remained robust, being comparable to no-turn controls whether tested early or late in the head turn. The results are interpreted in terms of a 'multiple look' strategy for calculating sound location, and the allocation of attention to the hemisphere containing the head-turn target.
Affiliation(s)
- Joel Cooper
- Auditory Neuroscience Laboratory, School of Medical Science and Bosch Institute, University of Sydney, Sydney, NSW 2006, Australia
24
Abstract
Sound localization is known to be a complex phenomenon, combining multisensory information processing, experience-dependent plasticity, and movement. Here we present a sensorimotor model that addresses the question of how an organism could learn to localize sound sources without any a priori neural representation of its head-related transfer function or prior experience with auditory spatial information. We demonstrate quantitatively that the experience of the sensory consequences of its voluntary motor actions allows an organism to learn the spatial location of any sound source. Using examples from humans and echolocating bats, our model shows that a naive organism can learn the auditory space based solely on acoustic inputs and their relation to motor states.
Affiliation(s)
- Murat Aytekin
- Department of Psychology, University of Maryland, College Park, MD 20742, U.S.A.
25
Combined effects of vestibular stimulation and gaze direction on orientation of sound lateralization. Neurosci Lett 2008; 436:158-62. [DOI: 10.1016/j.neulet.2008.03.011] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/04/2007] [Revised: 03/05/2008] [Accepted: 03/06/2008] [Indexed: 11/20/2022]
26
Abstract
Studies of spatial perception during visual saccades have demonstrated compressions of visual space around the saccade target. Here we psychophysically investigated perception of auditory space during rapid head turns, focusing on the "perisaccadic" interval. Using separate perceptual and behavioral response measures we show that spatial compression also occurs for rapid head movements, with the auditory spatial representation compressing by up to 50%. Similar to observations in the visual system, this occurred only when spatial locations were measured by using a perceptual response; it was absent for the behavioral measure involving a nose-pointing task. These findings parallel those observed in vision during saccades and suggest that a common neural mechanism may subserve these distortions of space in each modality.
27
Otake R, Saito Y, Suzuki M. The effect of eye position on the orientation of sound lateralization. Acta Otolaryngol 2007:34-7. [PMID: 18340559 DOI: 10.1080/03655230701595337] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
Abstract
CONCLUSIONS The data suggest that sound lateralization sensitivity during interaural time difference (ITD) discrimination is altered by eccentric gaze and that sound lateralization shifts toward the direction of the gaze. OBJECTIVE Using dichotic sounds, the effect of eye position on the orientation of sound lateralization was investigated in humans. SUBJECTS AND METHODS The subjects were studied by testing ITD discrimination under different conditions of visual fixation. RESULTS The amplitudes obtained during the ITD discrimination tests during eccentric fixation were significantly greater than those obtained while gazing straight ahead (p<0.05). The median line of amplitude obtained during the ITD discrimination test shifted toward the direction of the gaze.
28
Been G, Ngo TT, Miller SM, Fitzgerald PB. The use of tDCS and CVS as methods of non-invasive brain stimulation. Brain Res Rev 2007; 56:346-61. [PMID: 17900703 DOI: 10.1016/j.brainresrev.2007.08.001] [Citation(s) in RCA: 111] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/27/2007] [Revised: 08/16/2007] [Accepted: 08/19/2007] [Indexed: 11/30/2022]
Abstract
Transcranial direct current stimulation (tDCS) and caloric vestibular stimulation (CVS) are safe methods for selectively modulating cortical excitability and activation, respectively, which have recently received increased interest regarding possible clinical applications. tDCS involves the application of low currents to the scalp via cathodal and anodal electrodes and has been shown to affect a range of motor, somatosensory, visual, affective and cognitive functions. Therapeutic effects have been demonstrated in clinical trials of tDCS for a variety of conditions including tinnitus, post-stroke motor deficits, fibromyalgia, depression, epilepsy and Parkinson's disease. Its effects can be modulated by combination with pharmacological treatment and it may influence the efficacy of other neurostimulatory techniques such as transcranial magnetic stimulation. CVS involves irrigating the auditory canal with cold water which induces a temperature gradient across the semicircular canals of the vestibular apparatus. This has been shown in functional brain-imaging studies to result in activation in several contralateral cortical and subcortical brain regions. CVS has also been shown to have effects on a wide range of visual and cognitive phenomena, as well as on post-stroke conditions, mania and chronic pain states. Both these techniques have been shown to modulate a range of brain functions, and display potential as clinical treatments. Importantly, they are both inexpensive relative to other brain stimulation techniques such as electroconvulsive therapy (ECT) and transcranial magnetic stimulation (TMS).
Affiliation(s)
- Gregory Been
- Alfred Psychiatry Research Centre, The Alfred Hospital and Monash University School of Psychology, Psychiatry and Psychological Medicine, Commercial Rd, Melbourne, VIC 3004, Australia
29
Otake R, Kashio A, Sato T, Suzuki M. The effect of optokinetic stimulation on orientation of sound lateralization. Acta Otolaryngol 2006; 126:718-23. PMID: 16803711; DOI: 10.1080/00016480500469586.
Abstract
OBJECTIVE: Using dichotic sound, the effect of optokinetic (OK) stimulation on the orientation of sound lateralization was investigated in humans.
MATERIALS AND METHODS: Subjects were tested on interaural time difference (ITD) and interaural intensity difference (IID) discrimination during OK stimulation.
RESULTS: At a stripe angular velocity of 90 degrees/s, the amplitudes for the ITD discrimination tests during OK stimulation were significantly greater than those obtained either before OK stimulation or at 30 degrees/s (p<0.05). No significant difference in amplitude for the IID discrimination test was observed between results obtained before and during OK stimulation. During OK stimulation, all subjects felt that their perceptual body axis shifted toward the quick phase of OK nystagmus. In 8 of 12 subjects, the median line of amplitude for the ITD discrimination test shifted to the quick-phase side of the OK nystagmus.
CONCLUSIONS: Sound lateralization sensitivity during ITD discrimination may be altered by OK stimulation, and ITD-based lateralization may be more susceptible to OK stimulation than IID-based lateralization. These data suggest that nystagmus, or the sensation of self-rotation induced by OK stimulation, influences auditory afferent information such as sound lateralization.
Affiliation(s)
- Rika Otake
- Department of Otolaryngology, Tokyo Metropolitan Police Hospital, Tokyo, Japan
30
Seizova-Cajic T, Sachtler WLB, Curthoys IS. Eye movements cannot explain vibration-induced visual motion and motion aftereffect. Exp Brain Res 2006; 173:141-52. PMID: 16555104; DOI: 10.1007/s00221-006-0373-2.
Abstract
Eye movements are thought to account for a number of visual motion illusions involving stationary objects presented against a featureless background or apparent motion of the whole visual field. We tested two different versions of the eye movement account: (a) the retinal slip explanation and (b) the nystagmus-suppression explanation, in particular their ability to account for visual motion experienced during vibration of the neck muscles, and for the visual motion aftereffect following vibration. We vibrated the neck (ventral sternocleidomastoid muscles, bilaterally, or right dorsal muscles) and measured eye movements in conjunction with perceived illusory displacement of an LED presented in complete darkness (N=10). To test the retinal-slip explanation, we compared the direction of slow eye movements to the direction of illusory motion of the visual target. To test the suppression explanation, we estimated the direction of suppressed slow-phase eye movements and compared it to the direction of illusory motion. Two main findings show that neither actual nor suppressed eye movements cause the illusory motion and motion aftereffect. First, eye movements do not reverse direction when the illusory motion reverses after vibration stops. Second, there are large individual differences with regard to the direction of eye movements in observers who all experience a similar visual illusion. We conclude that, rather than eye movements, a more global spatial constancy mechanism that takes into account head movement is responsible for the illusion. The results also argue against the notion of a single central signal that determines both perceptual experience and oculomotor behaviour.
Affiliation(s)
- Tatjana Seizova-Cajic
- Department of Psychology, The University of Sydney, Brennan Building A18, Sydney, NSW 2006, Australia.
31
Figliozzi F, Guariglia P, Silvetti M, Siegler I, Doricchi F. Effects of vestibular rotatory accelerations on covert attentional orienting in vision and touch. J Cogn Neurosci 2005; 17:1638-51. PMID: 16269102; DOI: 10.1162/089892905774597272.
Abstract
Peripheral vestibular organs feed the central nervous system with inputs favoring the correct perception of space during head and body motion. Applying temporal order judgments (TOJs) to pairs of simultaneous or asynchronous stimuli presented in the left and right egocentric space, we evaluated the influence of leftward and rightward vestibular rotatory accelerations given around the vertical head-body axis on covert attentional orienting. In a first experiment, we presented visual stimuli in the left and right hemifield. In a second experiment, tactile stimuli were presented to hands lying on their anatomical side or in a crossed position across the sagittal body midline. In both experiments, stimuli were presented while normal subjects suppressed or did not suppress the vestibulo-ocular response (VOR) evoked by head-body rotation. Independently of VOR suppression, visual and tactile stimuli presented on the side of rotation were judged to precede simultaneous stimuli presented on the side opposite the rotation. When limbs were crossed, attentional facilitatory effects were only observed for stimuli presented to the right hand lying in the left hemispace during leftward rotatory trials with VOR suppression. This result points to spatiotopic rather than somatotopic influences of vestibular inputs, suggesting that cross-modal effects of these inputs on tactile ones operate on a representation of space that is updated following arm crossing. In a third control experiment, we demonstrated that temporal prioritization of stimuli presented on the side of rotation was not determined by response bias linked to spatial compatibility between the directions of rotation and the directional labels used in TOJs (i.e., “left” or “right” first). These findings suggest that during passive rotatory head-body accelerations, covert attention is shifted toward the direction of rotation and the direction of the fast phases of the VOR.
32
Pettorossi VE, Brosch M, Panichi R, Botti F, Grassi S, Troiani D. Contribution of self-motion perception to acoustic target localization. Acta Otolaryngol 2005; 125:524-8. PMID: 16092545; DOI: 10.1080/00016480510028465.
Abstract
OBJECTIVE: The ability to localize sounds in space during whole-body rotation relies on the auditory localization system, which registers the position of a sound in a head-related frame, and on the sensory systems, notably the vestibular system, that perceive head and body movement. The aim of this study was to analyse the contribution of head-motion cues to the spatial representation of acoustic targets in humans.
MATERIAL AND METHODS: Healthy subjects standing on a rotating platform in the dark were asked to pursue, with a laser pointer, an acoustic target that was either rotated horizontally while the body was kept stationary or kept stationary while the whole body was rotated. The contribution of head motion to spatial acoustic representation could be inferred by comparing the gains and phases of pursuit in the two conditions as the frequency was varied.
RESULTS: During acoustic target rotation there was a reduction in gain and an increase in phase lag, whereas during whole-body rotation the gain tended to increase and the phase remained constant. The different contributions of the vestibular and acoustic systems were confirmed by analysing acoustic pursuit during asymmetric body rotation; in this condition, in which self-motion perception gradually diminished, an increasing delay in target pursuit was observed.
CONCLUSION: The findings suggest that acoustic spatial perception during head movement is supported by the vestibular system, which is responsible for the correct dynamics of acoustic target pursuit.
Affiliation(s)
- V E Pettorossi
- Department of Internal Medicine, Section of Human Physiology, University of Perugia, Perugia, Italy.
33
Schmerber S, Sheykholeslami K, Kermany MH, Hotta S, Kaga K. Time-intensity trading in bilateral congenital aural atresia patients. Hear Res 2005; 202:248-57. PMID: 15811716; DOI: 10.1016/j.heares.2004.11.012.
Abstract
In an effort to examine how information from bilaterally applied bone-conducted signals arising from interaural time differences (ITDs) and interaural intensity differences (IIDs) is combined, data were measured for continuous 500 Hz narrow-band noise at 65-70 dB HL in 11 patients with bilateral congenital aural atresia. Time-intensity trading functions were obtained by shifting the sound image towards one side using an ITD, then shifting it back to a centered image by varying the IID in the same ear (auditory midline task). ITD values were varied from -600 to +600 µs in 200 µs steps, with negative values indicating delays to the right ear. The results indicate that time-intensity trading is present in patients with bilateral aural atresia. The gross response properties of time-intensity trading for bone-conducted signals were comparable in patients with bilateral aural atresia and normal-hearing subjects, although the atresia group showed larger inter-subject variability and higher discrimination thresholds across IIDs. These results suggest that the mature auditory brainstem retains the potential to employ binaural cues later in life, although to a restricted degree. Binaural fitting of a bone-conducted hearing aid might optimize binaural hearing and improve sound lateralization, and we now recommend systematic bilateral fitting in aural atresia patients.
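The trading procedure described above yields a "trading ratio" (microseconds of ITD offset by one decibel of IID). A minimal sketch of how such a ratio could be computed from centering data — all values below are invented for illustration, not taken from the study:

```python
# Hypothetical time-intensity trading computation. The ITDs and the
# centering IIDs are invented example numbers, not data from Schmerber et al.
# In the auditory-midline task the image is displaced by an ITD and then
# re-centered with an opposing IID; the trading ratio is derived from the
# slope of the centering IID against the imposed ITD.

itds_us = [-600, -400, -200, 0, 200, 400, 600]            # imposed ITDs (µs)
centering_iids_db = [-4.2, -2.9, -1.4, 0.0, 1.5, 2.8, 4.3]  # IID to re-center (dB)

# Ordinary least-squares slope (dB of IID per µs of ITD)
n = len(itds_us)
mean_t = sum(itds_us) / n
mean_i = sum(centering_iids_db) / n
slope = sum((t - mean_t) * (i - mean_i)
            for t, i in zip(itds_us, centering_iids_db)) \
        / sum((t - mean_t) ** 2 for t in itds_us)

# Invert the slope to express the ratio as µs of ITD traded per dB of IID
trading_ratio_us_per_db = 1.0 / slope
```

With these example numbers the slope is about 0.007 dB/µs, i.e. a trading ratio of roughly 140 µs per dB; real ratios vary widely with stimulus and listener.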
Affiliation(s)
- Sébastien Schmerber
- Department of Otolaryngology, University Hospital, Service O.R.L C.H.U de Grenoble, France.
34
Abstract
The control and perception of body orientation and motion are subserved by multiple sensory and motor mechanisms ranging from relatively simple, peripheral mechanisms to complex ones involving the highest levels of cognitive function and sensory-motor integration. Vestibular contributions to body orientation and to spatial localization of auditory and visual stimuli have long been recognized. These contributions are reviewed here along with new insights relating to sensory-motor calibration of the body gained from space flight, parabolic flight, and artificial gravity environments. Recently recognized contributions of proprioceptive and somatosensory signals to the appreciation of body orientation and configuration are described. New techniques for stabilizing posture by means of haptic touch and for studying and modeling postural mechanisms are reviewed. Path integration, place cells, and head direction cells are described along with implications for using immersive virtual environments for training geographic spatial knowledge of real environments.
Affiliation(s)
- James R Lackner
- Ashton Graybiel Spatial Orientation Laboratory, Brandeis University, Waltham, MA 02454, USA.
35
Lewald J, Wienemann M, Boroojerdi B. Shift in sound localization induced by rTMS of the posterior parietal lobe. Neuropsychologia 2004; 42:1598-607. PMID: 15327928; DOI: 10.1016/j.neuropsychologia.2004.04.012.
Abstract
Neuroimaging studies in human subjects and single-unit recordings in monkeys have suggested that the primate posterior parietal cortex (PPC) is involved in auditory space perception. Here we tested this hypothesis by combining repetitive focal transcranial magnetic stimulation (rTMS) of the right PPC with a task of pointing to free-field sound stimuli. After 15 min of rTMS at 1 Hz, subjects exhibited an overall signed pointing error of 2.5 degrees, directed leftward and downward, relative to a baseline condition with "sham rTMS". No effects of rTMS on the general precision of sound localization (unsigned errors) were found. Thus, low-frequency offline rTMS may have specifically affected neuronal circuits transforming auditory spatial coordinates in both azimuth and elevation. This is consistent with the view that the PPC may represent a neural substrate of perceptual stability in spatial hearing.
Affiliation(s)
- Jörg Lewald
- Department of Cognitive and Environmental Psychology, Faculty for Psychology, Ruhr University Bochum, D-44780 Bochum, Germany.
36
Abstract
The effect of passive whole-body tilt in the frontal plane on the lateralization of dichotic sound was investigated in human subjects. Pure-tone pulses (1 kHz, 100 ms duration) with various interaural time differences were presented via headphones while the subject was in an upright position or tilted 45 degrees or 90 degrees to the left or right. Subjects made two-alternative forced-choice (left/right) judgements on the intracranial sound image. During body tilt, the auditory median plane of the head, computed from the resulting psychometric functions, was always shifted to the upward ear, indicating a shift of the auditory percept to the downward ear, that is, in the direction of gravitational linear acceleration. The mean maximum magnitude of the auditory shift obtained with 90 degrees body tilt was 25 µs. On the one hand, these findings suggest a certain influence of the otolith information about body position relative to the direction of gravity on the representation of auditory space. However, in partial contradiction to previous work, which had assumed the existence of a significant 'audiogravic illusion', the very slight magnitude of the present effect rather reflects the excellent stability of the neural processing of auditory spatial cues in humans. Thus, it might be misleading to use the term 'illusion' for this quite marginal effect.
Affiliation(s)
- Jörg Lewald
- Institut für Arbeitsphysiologie an der Universität Dortmund, Ardeystrasse 67, D-44139 Dortmund, Germany.
37
Lewald J, Ehrenstein WH. Spatial coordinates of human auditory working memory. Brain Res Cogn Brain Res 2001; 12:153-9. PMID: 11489618; DOI: 10.1016/s0926-6410(01)00042-8.
Abstract
The accuracy of localizing remembered sound sources was investigated by employing a delayed-response task, where a small light spot, projected onto a screen by a laser diode attached to the head, had to be spatially aligned with either actual or remembered stimulus positions. Systematic errors indicated overestimation of the eccentricity of remembered targets compared to direct stimulus localization. This overestimation increased with prolonged response delay, suggesting that the coordinates of memorized space are distorted with respect to perceived actual sound location and that this distortion increases as a function of time.
Affiliation(s)
- J Lewald
- Institut für Arbeitsphysiologie, Ardeystr. 67, D-44139 Dortmund, Germany.
38
Abstract
The effect of passive whole-body rotation about the earth-vertical axis on the lateralization of dichotic sound was investigated in human subjects. Pure-tone pulses (1 kHz; 0.1 s duration) with various interaural time differences were presented via headphones during brief, low-amplitude rotation (angular acceleration 400 degrees/s²; maximum velocity 90 degrees/s; maximum displacement 194 degrees). Subjects made two-alternative forced-choice (left/right) judgements on the acoustic stimuli. The auditory median plane of the head was shifted opposite to the direction of rotation, indicating a shift of the intracranial auditory percept in the direction of rotation. The mean magnitude of the shift was 10.7 µs. This result demonstrates a slight, but significant, influence of rotation on sound lateralization, suggesting that vestibular information is taken into account by the brain for accurate localization of stationary sound sources during natural head and body motion.
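Several of the entries above (this one and the tilt study) estimate the "auditory median plane" as the point of subjective equality (PSE) of a psychometric function fitted to left/right judgements across ITDs. A minimal sketch of that estimation, using linear interpolation at the 50% point and invented response proportions (not data from any of the cited studies):

```python
# Hypothetical 2AFC lateralization data: proportion of "right" judgements
# at each ITD (µs; positive = right ear leading). Numbers are invented
# for illustration only.
data = {
    -300: 0.05, -200: 0.12, -100: 0.30,
       0: 0.55,  100: 0.78,  200: 0.92, 300: 0.97,
}

def pse_by_interpolation(points):
    """Interpolate the ITD at which p("right") = 0.5 -- the point of
    subjective equality, i.e. the auditory median plane. A published
    analysis would typically fit a full psychometric function instead."""
    pts = sorted(points.items())
    for (t0, p0), (t1, p1) in zip(pts, pts[1:]):
        if p0 <= 0.5 <= p1:
            # linear interpolation between the two bracketing ITDs
            return t0 + (0.5 - p0) * (t1 - t0) / (p1 - p0)
    raise ValueError("0.5 is not bracketed by the data")

pse = pse_by_interpolation(data)
# A negative PSE means the median plane sits at left-leading ITDs,
# i.e. the intracranial percept is shifted toward the right.
```

A shift of the PSE between rotation directions (or tilt conditions), in µs, is the quantity the abstracts above report.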
Affiliation(s)
- J Lewald
- Institut für Arbeitsphysiologie an der Universität Dortmund, Ardeystrasse 67, D-44139 Dortmund, Germany.