1
Thaler L, Castillo-Serrano JG, Kish D, Norman LJ. Effects of type of emission and masking sound, and their spatial correspondence, on blind and sighted people's ability to echolocate. Neuropsychologia 2024; 196:108822. PMID: 38342179. DOI: 10.1016/j.neuropsychologia.2024.108822.
Abstract
Ambient sound can mask acoustic signals. The current study addressed how echolocation in people is affected by masking sound, and the role played by the type of sound and its spatial (i.e. binaural) similarity. We also investigated the role played by blindness and long-term experience with echolocation, by testing echolocation experts, as well as blind and sighted people new to echolocation. Results were obtained in two echolocation tasks in which participants listened to binaural recordings of echolocation and masking sounds, and either localized echoes in azimuth or discriminated echo audibility. Echolocation and masking sounds could be either clicks or broadband noise. An adaptive staircase method was used to adjust signal-to-noise ratios (SNRs) based on participants' responses. When target and masker had the same binaural cues (i.e. both were monaural sounds), people performed better (i.e. had lower SNRs) when target and masker used different types of sound (e.g. clicks in a noise masker, or noise in a click masker) than when they used the same type of sound (e.g. clicks in a click masker, or noise in a noise masker). A very different pattern of results was observed when masker and target differed in their binaural cues, in which case people always performed better when clicks were the masker, regardless of the type of emission used. Further, direct comparison between conditions with and without a binaural difference revealed binaural release from masking only when clicks were used as both emission and masker, but not otherwise (i.e. when noise was used as masker or emission). This suggests that echolocation with clicks and echolocation with noise may differ in their sensitivity to binaural cues. We observed the same pattern of results for echolocation experts and for blind and sighted people new to echolocation, suggesting a limited role for long-term experience or blindness. In addition to generating novel predictions for future work, the findings also inform instruction in echolocation for people who are blind or sighted.
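The adaptive staircase procedure mentioned in this abstract can be sketched as follows. This is a generic 1-up/2-down transformed staircase (a common psychophysical choice that converges near 70.7% correct); the abstract does not state the authors' actual rule, step size, or stopping criterion, so those details are assumptions here.

```python
def run_staircase(respond, start_snr=10.0, step=2.0, n_trials=40):
    """Simulate a 1-up/2-down adaptive staircase on SNR (dB).

    `respond(snr)` returns True for a correct response. Two consecutive
    correct responses lower the SNR (making the task harder); a single
    error raises it. The 1-up/2-down rule tracks the ~70.7%-correct point.
    The threshold is estimated as the mean SNR at reversal points.
    """
    snr = start_snr
    correct_streak = 0
    reversals = []
    last_direction = 0  # -1 = stepping down, +1 = stepping up
    for _ in range(n_trials):
        if respond(snr):
            correct_streak += 1
            if correct_streak == 2:      # two correct in a row: step down
                correct_streak = 0
                if last_direction == 1:  # direction changed: a reversal
                    reversals.append(snr)
                last_direction = -1
                snr -= step
        else:                            # one error: step up
            correct_streak = 0
            if last_direction == -1:
                reversals.append(snr)
            last_direction = 1
            snr += step
    return sum(reversals) / len(reversals) if reversals else snr
```

For example, a simulated listener who is always correct above 0 dB SNR oscillates between 0 and 2 dB, giving a threshold estimate of about 1 dB with a 2 dB step.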
Affiliation(s)
- L Thaler: Department of Psychology, Durham University, South Road, Durham, DH1 5AY, UK
- D Kish: World Access for the Blind, 1007 Marino Drive, Placentia, CA, 92870, USA
- L J Norman: Department of Psychology, Durham University, South Road, Durham, DH1 5AY, UK
2
Koehler H, Croy I, Oleszkiewicz A. Late Blindness and Deafness are Associated with Decreased Tactile Sensitivity, But Early Blindness is Not. Neuroscience 2023; 526:164-174. PMID: 37385331. DOI: 10.1016/j.neuroscience.2023.06.016.
Abstract
Perceptual experience is shaped by a complex interaction between our sensory systems, in which each sense conveys information about specific properties of our surroundings. This multisensory processing of complementary information improves the accuracy of our perceptual judgments and leads to more precise and faster reactions. Sensory impairment or loss in one modality leads to an information deficit that can affect the other senses in various ways. For early auditory or visual loss, both impairment and compensatory increases in the sensitivity of other senses are well described. Testing individuals with deafness (N = 73), early blindness (N = 51), late blindness (N = 49), and corresponding controls, we compared tactile sensitivity using the standard monofilament test at two locations, the finger and the back of the hand. Results indicate lower tactile sensitivity in people with deafness and late blindness, but not in people with early blindness, compared with the respective controls, irrespective of stimulation location, gender, and age. The results indicate that neither sensory compensation, nor simple use-dependency, nor hindered development of the tactile sensory system is sufficient to explain changes in somatosensation after sensory loss; rather, a complex interaction of effects is present.
Affiliation(s)
- Hanna Koehler: Department of Neurology, Jena University Hospital, Am Klinikum 1, 07747 Jena, Germany; Biomagnetic Center, Jena University Hospital, Am Klinikum 1, 07747 Jena, Germany; Department of Psychology, Clinical Psychology, Friedrich Schiller University Jena, Fürstengraben 1, 07743 Jena, Germany
- Ilona Croy: Department of Psychology, Clinical Psychology, Friedrich Schiller University Jena, Fürstengraben 1, 07743 Jena, Germany; Department of Psychotherapy and Psychosomatic Medicine, Carl Gustav Carus Faculty of Medicine, Technische Universität Dresden, Fetscherstr. 74, 01307 Dresden, Germany
- Anna Oleszkiewicz: Department of Otorhinolaryngology, Smell and Taste Clinic, Technische Universität Dresden, Fetscherstrasse 74, 01307 Dresden, Germany; Institute of Psychology, University of Wrocław, ul. Dawida 1, 50-527 Wroclaw, Poland
3
Bouguiyoud N, Morales-Grahl E, Bronchti G, Frasnelli J, Roullet FI, Al Aïn S. Effects of Congenital Blindness on Ultrasonic Vocalizations and Social Behaviors in the ZRDBA Mouse. Front Behav Neurosci 2022; 16:884688. PMID: 35592638; PMCID: PMC9110969. DOI: 10.3389/fnbeh.2022.884688.
Abstract
Mice produce ultrasonic vocalizations (USVs) at different ages and in different social contexts, including maternal-pup separation, social play in juveniles, and social interactions and mating in adults. Recording of USVs can be used as an index of sensory detection, internal state, and social motivation. While sensory deprivation may alter USV emission and some social behaviors in deaf and anosmic rodents, little is known about the effects of visual deprivation in rodents. This longitudinal study aimed to assess acoustic communication and social behaviors using a mouse model of congenital blindness. Anophthalmic and sighted mice were subjected to a series of behavioral tests at three different ages, namely, the maternal isolation-induced pup USV test and the home odor discrimination and preference test on postnatal day (PND) 7, the juvenile social test on PND 30-35, and the female urine-induced USV and scent-marking test at 2-3 months. Our results showed that (1) at PND 7, the total number of USVs was similar between the two groups, all mice vocalized less during the second isolation period than during the first, and both phenotypes showed similar discrimination and preference, favoring exploration of the home bedding odor; (2) at PND 30-35, anophthalmic mice engaged less in social behaviors in the juvenile play test than sighted mice, but the total number of USVs produced was not affected; and (3) in adulthood, when exposed to a female urine spot, anophthalmic male mice displayed faster responses in terms of USV emission and sniffing behavior, associated with a longer time spent exploring the female urinary odor. Interestingly, acoustic behavior in pups and adults was correlated in sighted mice only. Together, our study reveals that congenital visual deprivation had no effect on the number of USVs emitted by pups and juveniles, but affected USV emission in adult males and impacted social behavior in juvenile and adult mice.
Affiliation(s)
- Nouhaila Bouguiyoud: Department of Anatomy, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada; Cognition, Neurosciences, Affect et Comportement (CogNAC) Research Group, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada
- Gilles Bronchti: Department of Anatomy, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada
- Johannes Frasnelli: Department of Anatomy, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada; Cognition, Neurosciences, Affect et Comportement (CogNAC) Research Group, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada
- Florence I. Roullet: Department of Psychiatry and Behavioural Neurosciences, McMaster University, Hamilton, ON, Canada
- Syrina Al Aïn: Department of Anatomy, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada; Cognition, Neurosciences, Affect et Comportement (CogNAC) Research Group, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada
4
Anderson DL, Baguhn SJ. Confrontation Testing Echoidentification: Principle Perceptual Space for Novice Echoidentifiers. Journal of Visual Impairment & Blindness 2022. DOI: 10.1177/0145482x221089970.
5
Kim YH, Schrode KM, Engel J, Vicencio-Jimenez S, Rodriguez G, Lee HK, Lauer AM. Auditory Behavior in Adult-Blinded Mice. J Assoc Res Otolaryngol 2022; 23:225-239. PMID: 35084628; PMCID: PMC8964904. DOI: 10.1007/s10162-022-00835-5.
Abstract
Cross-modal plasticity occurs when the function of remaining senses is enhanced following deprivation or loss of a sensory modality. Auditory neural responses are enhanced in the auditory cortex, including increased sensitivity and frequency selectivity, following short-term visual deprivation in adult mice (Petrus et al. Neuron 81:664-673, 2014). Whether these visual deprivation-induced neural changes translate into improved auditory perception and performance remains unclear. As an initial investigation of the effects of adult visual deprivation on auditory behaviors, CBA/CaJ mice underwent binocular enucleation at 3-4 weeks old and were tested on a battery of learned behavioral tasks, acoustic startle response (ASR) tests, and prepulse inhibition (PPI) tests beginning at least 2 weeks after the enucleation procedure. Auditory brainstem responses (ABRs) were also measured to screen for potential effects of visual deprivation on non-behavioral hearing function. Control and enucleated mice showed similar tone detection sensitivity and frequency discrimination in a conditioned lick suppression test. Both groups showed normal reactivity to sound as measured by ASR in a quiet background. However, when startle-eliciting stimuli were presented in noise, enucleated mice showed decreased ASR amplitude relative to controls. Control and enucleated mice displayed no significant differences in ASR habituation, PPI, ABR thresholds, or ABR wave morphology. Our findings suggest that while adult-onset visual deprivation induces cross-modal plasticity at the synaptic and circuit levels, it does not substantially influence simple auditory behavioral performance.
Affiliation(s)
- Ye-Hyun Kim: Department of Otolaryngology-Head and Neck Surgery and Center for Hearing and Balance, Johns Hopkins University, Baltimore, MD, 21205, USA
- Katrina M Schrode: Department of Otolaryngology-Head and Neck Surgery and Center for Hearing and Balance, Johns Hopkins University, Baltimore, MD, 21205, USA
- James Engel: Department of Otolaryngology-Head and Neck Surgery and Center for Hearing and Balance, Johns Hopkins University, Baltimore, MD, 21205, USA
- Sergio Vicencio-Jimenez: Department of Otolaryngology-Head and Neck Surgery and Center for Hearing and Balance, Johns Hopkins University, Baltimore, MD, 21205, USA
- Gabriela Rodriguez: Cell, Molecular, Developmental Biology, and Biophysics (CMDB) Graduate Program, Johns Hopkins University, Baltimore, MD, USA
- Hey-Kyoung Lee: Cell, Molecular, Developmental Biology, and Biophysics (CMDB) Graduate Program, Johns Hopkins University, Baltimore, MD, USA; Solomon H. Snyder Department of Neuroscience, Johns Hopkins University, Baltimore, MD, USA; Zanvyl-Krieger Mind/Brain Institute and Kavli Neuroscience Discovery Institute, Johns Hopkins University, Baltimore, MD, USA
- Amanda M Lauer: Department of Otolaryngology-Head and Neck Surgery and Center for Hearing and Balance, Johns Hopkins University, Baltimore, MD, 21205, USA; Solomon H. Snyder Department of Neuroscience, Johns Hopkins University, Baltimore, MD, USA
6
Downey G. Echolocation among the blind: an argument for an ontogenetic turn. Journal of the Royal Anthropological Institute 2021. DOI: 10.1111/1467-9655.13607.
Affiliation(s)
- Greg Downey: Macquarie School of Social Sciences, Macquarie University, Room B514, Level 5, 25B Wally's Walk, NSW 2109, Australia
7
Pan N, Zheng K, Zhao Y, Zhang D, Dong C, Xu J, Li X, Zheng Y. Morphometry Difference of the Hippocampal Formation Between Blind and Sighted Individuals. Front Neurosci 2021; 15:715749. PMID: 34803579; PMCID: PMC8601390. DOI: 10.3389/fnins.2021.715749.
Abstract
The detailed morphometric alterations of the human hippocampal formation (HF) in blind individuals are still understudied. Fifty subjects were recruited from Yantai Affiliated Hospital of Binzhou Medical University, comprising 16 with congenital blindness, 14 with late blindness, and 20 sighted controls. Volume and shape analyses were conducted between the blind (congenital or late) and sighted groups to assess (sub)regional alterations of the HF. No significant difference in hippocampal volume was observed between blind and sighted subjects. Rightward asymmetry of hippocampal volume was found for both congenital and late blind individuals, while no significant hemispheric difference was observed for the sighted controls. Shape analysis showed that the superior and inferior parts of both the hippocampal head and tail expanded, while the medial and lateral parts contracted, in blind individuals compared with sighted controls. The morphometric alterations in congenital blind and late blind individuals were nearly the same. Significant expansion of the superior part of the hippocampal tail was observed in the left hippocampi of both congenital and late blind groups after FDR correction. The current results suggest that cross-modal plasticity may occur in both hemispheres of the HF to improve navigation ability in the absence of visual cues, and that the alteration is more prominent in the left hemisphere.
Affiliation(s)
- Ningning Pan: School of Information Science and Engineering, Shandong Normal University, Jinan, China; Master of Public Administration Education Center, Xinjiang Agricultural University, Xinjiang, China
- Ke Zheng: College of Intelligence and Computing, Tianjin Key Lab of Cognitive Computing and Application, Tianjin University, Tianjin, China
- Yanna Zhao: School of Information Science and Engineering, Shandong Normal University, Jinan, China
- Dan Zhang: Department of Mathematics and Computer Science, Eindhoven University of Technology, Eindhoven, Netherlands
- Changxu Dong: School of Information Science and Engineering, Shandong Normal University, Jinan, China
- Junhai Xu: College of Intelligence and Computing, Tianjin Key Lab of Cognitive Computing and Application, Tianjin University, Tianjin, China
- Xianglin Li: Medical Imaging Research Institute, Binzhou Medical University, Yantai, China
- Yuanjie Zheng: School of Information Science and Engineering, Shandong Normal University, Jinan, China
8
Touj S, Paquette T, Bronchti G, Piché M. Early and late visual deprivation induce hypersensitivity to mechanical and thermal noxious stimuli in the ZRDBA mouse. Eur J Pain 2021; 25:2257-2265. PMID: 34260794. DOI: 10.1002/ejp.1839.
Abstract
BACKGROUND: Visual deprivation leads to behavioural adaptations. Early visual deprivation has greater effects on sensory systems than late visual deprivation. Although this has been well studied, the impact of visual deprivation on pain sensitivity has scarcely been investigated. In humans, one study indicates that pain sensitivity is increased in early-onset, but not late-onset, blindness. In animals, one study indicates that sensitivity to noxious stimulation is increased in anophthalmic mice, but the impact of late visual deprivation on sensitivity remains unknown. The aim of this behavioural study was to examine sensitivity to noxious stimulation in mice with early and late visual deprivation. We hypothesized that visual deprivation would have different effects on sensitivity to noxious stimulation depending on its onset. METHODS: In Experiment 1, mechanical and thermal sensitivity was examined in four ZRDBA mouse groups: sighted mice, anophthalmic mice, dark-reared sighted mice, and adult sighted mice deprived of vision for one week. In Experiment 2, mechanical and thermal sensitivity was examined in adult sighted ZRDBA mice deprived of vision for two months. RESULTS: Anophthalmic and dark-reared mice showed mechanical and thermal hypersensitivity, while the one-week visual deprivation did not alter sensitivity. The two-month deprivation also resulted in mechanical and thermal hypersensitivity. CONCLUSIONS: These results indicate that early visual deprivation, regardless of the integrity of the visual system, induces hypersensitivity. Moreover, the present findings indicate that late visual deprivation may induce mechanical and thermal hypersensitivity, although this depends on the duration of visual deprivation. These results have implications for the biological significance of pain in the blind. SIGNIFICANCE: Sensory deprivation induces behavioural adaptations. For most sensory systems, the extent of these adaptations generally depends on the stage of cerebral development. In contrast, the present results indicate that for the nociceptive system, both early and late visual deprivation have similar effects. Anophthalmic mice, dark-reared mice, and adult mice deprived of vision for two months all showed thermal and mechanical hypersensitivity. This shows a clear interaction between the visual and nociceptive systems and has implications for the biological significance of pain in the blind.
Affiliation(s)
- Sara Touj: Department of Anatomy, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada; CogNAC Research Group, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada
- Thierry Paquette: Department of Anatomy, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada; CogNAC Research Group, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada
- Gilles Bronchti: Department of Anatomy, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada; CogNAC Research Group, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada
- Mathieu Piché: Department of Anatomy, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada; CogNAC Research Group, Université du Québec à Trois-Rivières, Trois-Rivières, QC, Canada
9
Kolarik AJ, Moore BCJ, Cirstea S, Aggius-Vella E, Gori M, Campus C, Pardhan S. Factors Affecting Auditory Estimates of Virtual Room Size: Effects of Stimulus, Level, and Reverberation. Perception 2021; 50:646-663. PMID: 34053354. DOI: 10.1177/03010066211020598.
Abstract
When vision is unavailable, auditory level and reverberation cues provide important spatial information regarding the environment, such as the size of a room. We investigated how room-size estimates were affected by stimulus type, level, and reverberation. In Experiment 1, 15 blindfolded participants estimated room size after performing a distance bisection task in virtual rooms that were either anechoic (with level cues only) or reverberant (with level and reverberation cues) with a relatively short reverberation time of T60 = 400 milliseconds. Speech, noise, or clicks were presented at distances between 1.9 and 7.1 m. The reverberant room was judged to be significantly larger than the anechoic room (p < .05) for all stimuli. In Experiment 2, only the reverberant room was used and the overall level of all sounds was equalized, so only reverberation cues were available. Ten blindfolded participants took part. Room-size estimates were significantly larger for speech than for clicks or noise. The results show that when level and reverberation cues are present, reverberation increases judged room size. Even relatively weak reverberation cues provide room-size information, which could potentially be used by blind or visually impaired individuals encountering novel rooms.
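A standard way to see why reverberation carries room-size information is Sabine's formula, which relates reverberation time T60 to room volume and total absorption. The abstract does not describe how the virtual rooms were generated, so this is textbook acoustics offered as background rather than the authors' method; the example room dimensions and absorption coefficient below are arbitrary illustrative values.

```python
def sabine_t60(volume_m3, surface_m2, avg_absorption):
    """Sabine's formula: T60 = 0.161 * V / (S * alpha).

    T60 is the time (in seconds) for sound to decay by 60 dB, V is the
    room volume (m^3), S is the total surface area (m^2), and alpha is
    the average absorption coefficient (0..1). Larger rooms give longer
    reverberation, which is why T60 is an audible cue to room size.
    """
    return 0.161 * volume_m3 / (surface_m2 * avg_absorption)

# A modest 4 m x 5 m x 2.5 m room: V = 50 m^3,
# S = 2*(4*5) + 2*(4*2.5) + 2*(5*2.5) = 85 m^2, moderately absorbent surfaces
room_t60 = sabine_t60(volume_m3=50.0, surface_m2=85.0, avg_absorption=0.2)
```

With these illustrative values the formula gives roughly 0.47 s, the same order of magnitude as the T60 = 400 milliseconds used in Experiment 1.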
Affiliation(s)
- Andrew J Kolarik: Anglia Ruskin University, Cambridge, UK
- Brian C J Moore: Anglia Ruskin University, Cambridge, UK; University of Cambridge, Cambridge, UK
- Silvia Cirstea: Anglia Ruskin University, Cambridge, UK
- Elena Aggius-Vella: Fondazione Istituto Italiano di Tecnologia, Genoa, Italy; Institute for Mind, Brain and Technology, Herzeliya, Israel; Anglia Ruskin University, Cambridge, UK
- Claudio Campus: Fondazione Istituto Italiano di Tecnologia, Genoa, Italy; Anglia Ruskin University, Cambridge, UK
10
Tirado C, Gerdfeldter B, Kärnekull SC, Nilsson ME. Comparing Echo-Detection and Echo-Localization in Sighted Individuals. Perception 2021; 50:308-327. PMID: 33673742; PMCID: PMC8044610. DOI: 10.1177/03010066211000617.
Abstract
Echolocation is the ability to gather information from sound reflections. Most previous studies have focused on the ability to detect sound reflections, others on the ability to localize sound reflections, but no previous study has compared the two abilities in the same individuals. Our study compared echo-detection (reflecting object present or not?) and echo-localization (reflecting object to the left or right?) in 10 inexperienced sighted participants across 10 distances (1-4.25 m) to the reflecting object, using an automated system for studying human echolocation. There were substantial individual differences, particularly in the performance on the echo-localization task. However, most participants performed better on the detection than the localization task, in particular at the closest distances (1 and 1.7 m), illustrating that it sometimes may be hard to perceive whether an audible reflection came from the left or right.
11
Ptito M, Bleau M, Djerourou I, Paré S, Schneider FC, Chebat DR. Brain-Machine Interfaces to Assist the Blind. Front Hum Neurosci 2021; 15:638887. PMID: 33633557; PMCID: PMC7901898. DOI: 10.3389/fnhum.2021.638887.
Abstract
The loss or absence of vision is probably one of the most incapacitating events that can befall a human being. The importance of vision for humans is also reflected in brain anatomy, as approximately one third of the human brain is devoted to vision. It is therefore unsurprising that throughout history many attempts have been undertaken to develop devices aiming at substituting for a missing visual capacity. In this review, we present two concepts that have been prevalent over the last two decades. The first concept is sensory substitution, which refers to the use of another sensory modality to perform a task that is normally primarily subserved by the lost sense. The second concept is cross-modal plasticity, which occurs when loss of input in one sensory modality leads to reorganization in the brain representation of other sensory modalities. Both phenomena are training-dependent. We also briefly describe the history of blindness from ancient times to modernity, and then proceed to address the means that have been used to help blind individuals, with an emphasis on modern technologies, both invasive (various types of surgical implants) and non-invasive devices. With the advent of brain imaging, it has become possible to peer into the neural substrates of sensory substitution and to highlight the magnitude of the plastic processes that lead to a rewired brain. Finally, we address the important question of the value and practicality of the available technologies and future directions.
Affiliation(s)
- Maurice Ptito: École d’Optométrie, Université de Montréal, Montréal, QC, Canada; Department of Nuclear Medicine, University of Southern Denmark, Odense, Denmark; Department of Neuroscience, University of Copenhagen, Copenhagen, Denmark
- Maxime Bleau: École d’Optométrie, Université de Montréal, Montréal, QC, Canada
- Ismaël Djerourou: École d’Optométrie, Université de Montréal, Montréal, QC, Canada
- Samuel Paré: École d’Optométrie, Université de Montréal, Montréal, QC, Canada
- Fabien C. Schneider: TAPE EA7423, University of Lyon-Saint Etienne, Saint Etienne, France; Neuroradiology Unit, University Hospital of Saint-Etienne, Saint-Etienne, France
- Daniel-Robert Chebat: Visual and Cognitive Neuroscience Laboratory (VCN Lab), Department of Psychology, Faculty of Social Sciences and Humanities, Ariel University, Ariel, Israël; Navigation and Accessibility Research Center of Ariel University (NARCA), Ariel, Israël
12
Hu W, Wang K, Yang K, Cheng R, Ye Y, Sun L, Xu Z. A Comparative Study in Real-Time Scene Sonification for Visually Impaired People. Sensors 2020; 20:3222. PMID: 32517134; PMCID: PMC7309097. DOI: 10.3390/s20113222.
Abstract
In recent years, with the development of depth cameras and scene detection algorithms, a wide variety of electronic travel aids for visually impaired people have been proposed. However, it is still challenging to convey scene information to visually impaired people efficiently. In this paper, we propose three different auditory-based interaction methods, i.e., depth-image sonification, obstacle sonification, and path sonification, which convey raw depth images, obstacle information, and path information, respectively, to visually impaired people. The three sonification methods are compared comprehensively through a field experiment attended by twelve visually impaired participants. The results show that the sonification of high-level scene information, such as the direction of the pathway, is easier to learn and adapt to, and is more suitable for point-to-point navigation. In contrast, through the sonification of low-level scene information, such as raw depth images, visually impaired people can understand the surrounding environment more comprehensively. Furthermore, no single interaction method was best suited for all participants in the experiment, and visually impaired individuals need a period of time to find the most suitable interaction method. Our findings highlight the features and differences of the three scene detection algorithms and the corresponding sonification methods. The results provide insights into the design of electronic travel aids, and the conclusions can also be applied in other fields, such as sound feedback in virtual reality applications.
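As an illustration of the first interaction style, depth-image sonification, here is a minimal sketch that maps one row of a depth image to a left-to-right sequence of stereo tones. The function name, pitch range, panning rule, and tone duration are illustrative assumptions, not the mappings used in the paper.

```python
import math

def sonify_depth_row(depths, sample_rate=16000, dur=0.05,
                     f_near=1600.0, f_far=200.0, max_depth=5.0):
    """Map one row of a depth image to stereo tone samples.

    Each depth value (meters) becomes a short sine tone: nearer obstacles
    get a higher pitch, and the column position sets left/right panning
    (column 0 is fully left, the last column fully right). Returns two
    lists of float samples in [-1, 1] for the left and right channels.
    """
    n = int(sample_rate * dur)          # samples per tone
    left, right = [], []
    cols = len(depths)
    for i, d in enumerate(depths):
        d = min(max(d, 0.0), max_depth)  # clamp depth to the valid range
        # Linear pitch mapping: near -> high frequency, far -> low frequency
        freq = f_near + (f_far - f_near) * (d / max_depth)
        pan = i / (cols - 1) if cols > 1 else 0.5  # 0 = left, 1 = right
        for t in range(n):
            s = math.sin(2 * math.pi * freq * t / sample_rate)
            left.append(s * (1.0 - pan))   # constant-sum stereo panning
            right.append(s * pan)
    return left, right
```

For instance, `sonify_depth_row([0.5, 2.0, 4.5])` produces three 50 ms tones sweeping from a high-pitched tone on the left (a near obstacle) to a low-pitched tone on the right (far free space).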
Affiliation(s)
- Weijian Hu: National Engineering Research Center of Optical Instrumentation, Zhejiang University, Hangzhou 310058, China
- Kaiwei Wang: National Engineering Research Center of Optical Instrumentation, Zhejiang University, Hangzhou 310058, China; Correspondence: Tel.: +86-571-8795-1186
- Kailun Yang: Institute for Anthropomatics and Robotics, Karlsruhe Institute of Technology, 76131 Karlsruhe, Germany
- Ruiqi Cheng: National Engineering Research Center of Optical Instrumentation, Zhejiang University, Hangzhou 310058, China
- Yaozu Ye: National Engineering Research Center of Optical Instrumentation, Zhejiang University, Hangzhou 310058, China
- Lei Sun: National Engineering Research Center of Optical Instrumentation, Zhejiang University, Hangzhou 310058, China
- Zhijie Xu: School of Computing and Engineering, University of Huddersfield, Huddersfield HD1 3DH, UK
13
Kolarik AJ, Raman R, Moore BCJ, Cirstea S, Gopalakrishnan S, Pardhan S. The accuracy of auditory spatial judgments in the visually impaired is dependent on sound source distance. Sci Rep 2020; 10:7169. PMID: 32346036; PMCID: PMC7189236. DOI: 10.1038/s41598-020-64306-8.
Abstract
Blindness leads to substantial enhancements in many auditory abilities, and deficits in others. It is unknown how severe visual losses need to be before changes in auditory abilities occur, or whether the relationship between the severity of visual loss and changes in auditory abilities is proportional and systematic. Here we show that greater severity of visual loss is associated with larger auditory judgments of distance and room size. On average, participants with severe visual losses perceived sounds to be twice as far away, and rooms to be three times larger, than sighted controls. Distance estimates for sighted controls were most accurate for closer sounds and least accurate for farther sounds. As the severity of visual impairment increased, accuracy decreased for closer sounds and increased for farther sounds. However, it is for closer sounds that accurate judgments are needed to guide rapid motor responses to auditory events, e.g. when planning a safe path through a busy street to avoid collisions with other people and falls. Interestingly, greater visual impairment severity was associated with more accurate room-size estimates. The results support a new hypothesis that crossmodal calibration of audition by vision depends on the severity of visual loss.
Affiliation(s)
- Andrew J Kolarik
- Vision and Eye Research Institute, School of Medicine, Anglia Ruskin University, Cambridge, United Kingdom
- Department of Psychology, University of Cambridge, Cambridge, United Kingdom
- Rajiv Raman
- Vision and Eye Research Institute, School of Medicine, Anglia Ruskin University, Cambridge, United Kingdom
- Shri Bhagwan Mahavir Vitreoretinal Services, Sankara Nethralaya Eye Hospital, Chennai, India
- Brian C J Moore
- Vision and Eye Research Institute, School of Medicine, Anglia Ruskin University, Cambridge, United Kingdom
- Department of Psychology, University of Cambridge, Cambridge, United Kingdom
- Silvia Cirstea
- Vision and Eye Research Institute, School of Medicine, Anglia Ruskin University, Cambridge, United Kingdom
- School of Computing and Information Science, Anglia Ruskin University, Cambridge, United Kingdom
- Sarika Gopalakrishnan
- Faculty of Low Vision Care, Elite School of Optometry, Chennai, India
- Low Vision Care Department, Sankara Nethralaya Eye Hospital, Chennai, India
- Shahina Pardhan
- Vision and Eye Research Institute, School of Medicine, Anglia Ruskin University, Cambridge, United Kingdom
14
The Cross-Modal Effects of Sensory Deprivation on Spatial and Temporal Processes in Vision and Audition: A Systematic Review on Behavioral and Neuroimaging Research since 2000. Neural Plast 2019; 2019:9603469. [PMID: 31885540] [PMCID: PMC6914961] [DOI: 10.1155/2019/9603469]
Abstract
One of the most significant effects of neural plasticity manifests in the case of sensory deprivation when cortical areas that were originally specialized for the functions of the deprived sense take over the processing of another modality. Vision and audition represent two important senses needed to navigate through space and time. Therefore, the current systematic review discusses the cross-modal behavioral and neural consequences of deafness and blindness by focusing on spatial and temporal processing abilities, respectively. In addition, movement processing is evaluated as compiling both spatial and temporal information. We examine whether the sense that is not primarily affected changes in its own properties or in the properties of the deprived modality (i.e., temporal processing as the main specialization of audition and spatial processing as the main specialization of vision). References to the metamodal organization, supramodal functioning, and the revised neural recycling theory are made to address global brain organization and plasticity principles. Generally, according to the reviewed studies, behavioral performance is enhanced in those aspects for which both the deprived and the overtaking senses provide adequate processing resources. Furthermore, the behavioral enhancements observed in the overtaking sense (i.e., vision in the case of deafness and audition in the case of blindness) are clearly limited by the processing resources of the overtaking modality. Thus, the brain regions that were previously recruited during the behavioral performance of the deprived sense now support a similar behavioral performance for the overtaking sense. This finding suggests a more input-unspecific and processing principle-based organization of the brain. Finally, we highlight the importance of controlling for and stating factors that might impact neural plasticity and the need for further research into visual temporal processing in deaf subjects.
15
Mieda T, Kokubu M, Saito M. Rapid identification of sound direction in blind footballers. Exp Brain Res 2019; 237:3221-3231. [PMID: 31628519] [DOI: 10.1007/s00221-019-05670-4]
Abstract
Earlier studies have demonstrated that blind footballers are more accurate in identifying sound direction with less front-back confusion than sighted and blind non-football playing individuals. However, it is unknown whether blind footballers are faster than sighted footballers and nonathletes in identifying sound direction using auditory cues. Here, the present study aimed to investigate the auditory reaction times (RTs) and response accuracy of blind footballers during auditory RT tasks, including the identification of sound direction. Participants executed goal-directed stepping towards the loudspeaker as quickly and accurately as possible after identifying the sound direction. Simple, two-choice, and four-choice auditory RT tasks were completed. The results revealed that blind footballers had shorter RTs than sighted footballers in the choice RT tasks, but not in the simple RT task. These findings suggest that blind footballers are faster in identifying sound direction based on auditory cues, which is an essential perceptual-cognitive skill specific to blind football.
Affiliation(s)
- Takumi Mieda
- Graduate School of Comprehensive Human Sciences, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki, 305-8574, Japan
- Masahiro Kokubu
- Faculty of Health and Sport Sciences, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki, 305-8574, Japan
- Mayumi Saito
- Faculty of Health and Sport Sciences, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki, 305-8574, Japan
16
Teng S, Whitney D. The Acuity of Echolocation: Spatial Resolution in Sighted Persons Compared to the Performance of an Expert who is Blind. Journal of Visual Impairment & Blindness 2019. [DOI: 10.1177/0145482x1110500103]
Abstract
Compared with the echolocation performance of an expert who is blind, sighted novices rapidly learned size and position discrimination with surprising precision. We used a novel task to characterize the population distribution of echolocation skills in sighted persons and report the highest-known human echolocation acuity in the expert who is blind.
Affiliation(s)
- Santani Teng
- Whitney Laboratory for Perception and Action, University of California, Berkeley, 3210 Tolman Hall, Berkeley, CA 94720
- David Whitney
- Whitney Laboratory for Perception and Action, University of California, Berkeley
17
Yang K, Wang K, Bergasa LM, Romera E, Hu W, Sun D, Sun J, Cheng R, Chen T, López E. Unifying Terrain Awareness for the Visually Impaired through Real-Time Semantic Segmentation. Sensors 2018; 18(5):1506. [PMID: 29748508] [PMCID: PMC5982125] [DOI: 10.3390/s18051506]
Abstract
Navigational assistance aims to help visually-impaired people to ambulate the environment safely and independently. This topic becomes challenging as it requires detecting a wide variety of scenes to provide higher level assistive awareness. Vision-based technologies with monocular detectors or depth sensors have sprung up within several years of research. These separate approaches have achieved remarkable results with relatively low processing time and have improved the mobility of impaired people to a large extent. However, running all detectors jointly increases the latency and burdens the computational resources. In this paper, we put forward seizing pixel-wise semantic segmentation to cover navigation-related perception needs in a unified way. This is critical not only for the terrain awareness regarding traversable areas, sidewalks, stairs and water hazards, but also for the avoidance of short-range obstacles, fast-approaching pedestrians and vehicles. The core of our unification proposal is a deep architecture, aimed at attaining efficient semantic understanding. We have integrated the approach in a wearable navigation system by incorporating robust depth segmentation. A comprehensive set of experiments prove the qualified accuracy over state-of-the-art methods while maintaining real-time speed. We also present a closed-loop field test involving real visually-impaired users, demonstrating the effectivity and versatility of the assistive framework.
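As an aside, the unification idea in this abstract (one pixel-wise segmentation pass covering all navigation-relevant classes, instead of several separate detectors) can be illustrated with a minimal sketch. The class list and score maps below are hypothetical stand-ins for illustration, not the paper's actual network or label set:

```python
import numpy as np

# Hypothetical label set; the paper's real terrain classes are richer.
CLASSES = ["background", "traversable", "sidewalk", "stairs", "water", "obstacle"]

def unified_label_map(score_maps):
    """Collapse per-class score maps of shape (C, H, W) into one label map (H, W).

    A single pixel-wise argmax yields every navigation-relevant class at
    once, rather than running one detector per hazard type.
    """
    return np.argmax(score_maps, axis=0)

# Toy 2x2 "image": the left column scores highest as traversable ground,
# the right column as a water hazard.
scores = np.zeros((len(CLASSES), 2, 2))
scores[1, :, 0] = 0.9   # traversable
scores[4, :, 1] = 0.8   # water
labels = unified_label_map(scores)   # [[1, 4], [1, 4]]
```

A real system would obtain `score_maps` from a segmentation network and post-process the label map (e.g., with depth cues) before issuing feedback to the user.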
Affiliation(s)
- Kailun Yang
- State Key Laboratory of Modern Optical Instrumentation, Zhejiang University, Hangzhou 310027, China
- Kaiwei Wang
- State Key Laboratory of Modern Optical Instrumentation, Zhejiang University, Hangzhou 310027, China
- Luis M Bergasa
- Department of Electronics, University of Alcalá, Madrid 28805, Spain
- Eduardo Romera
- Department of Electronics, University of Alcalá, Madrid 28805, Spain
- Weijian Hu
- State Key Laboratory of Modern Optical Instrumentation, Zhejiang University, Hangzhou 310027, China
- Dongming Sun
- Department of Computing, Imperial College London, London SW7 2AZ, UK
- Junwei Sun
- KR-VISION Technology Co., Ltd., Hangzhou 310023, China
- Ruiqi Cheng
- State Key Laboratory of Modern Optical Instrumentation, Zhejiang University, Hangzhou 310027, China
- Tianxue Chen
- Department of Electrical and Computer Engineering, University of California, Los Angeles, CA 90095, USA
- Elena López
- Department of Electronics, University of Alcalá, Madrid 28805, Spain
18
Thaler L, Foresteire D. Visual sensory stimulation interferes with people's ability to echolocate object size. Sci Rep 2017; 7:13069. [PMID: 29026115] [PMCID: PMC5638915] [DOI: 10.1038/s41598-017-12967-3]
Abstract
Echolocation is the ability to use sound-echoes to infer spatial information about the environment. People can echolocate for example by making mouth clicks. Previous research suggests that echolocation in blind people activates brain areas that process light in sighted people. Research has also shown that echolocation in blind people may replace vision for calibration of external space. In the current study we investigated if echolocation may also draw on ‘visual’ resources in the sighted brain. To this end, we paired a sensory interference paradigm with an echolocation task. We found that exposure to an uninformative visual stimulus (i.e. white light) while simultaneously echolocating significantly reduced participants’ ability to accurately judge object size. In contrast, a tactile stimulus (i.e. vibration on the skin) did not lead to a significant change in performance (neither in sighted, nor blind echo expert participants). Furthermore, we found that the same visual stimulus did not affect performance in auditory control tasks that required detection of changes in sound intensity, sound frequency or sound location. The results suggest that processing of visual and echo-acoustic information draw on common neural resources.
Affiliation(s)
- L Thaler
- Department of Psychology, Durham University, Durham, United Kingdom
- D Foresteire
- Department of Psychology, Durham University, Durham, United Kingdom
19
Kolarik AJ, Scarfe AC, Moore BCJ, Pardhan S. Blindness enhances auditory obstacle circumvention: Assessing echolocation, sensory substitution, and visual-based navigation. PLoS One 2017; 12:e0175750. [PMID: 28407000] [PMCID: PMC5391114] [DOI: 10.1371/journal.pone.0175750]
Abstract
Performance for an obstacle circumvention task was assessed under conditions of visual, auditory only (using echolocation) and tactile (using a sensory substitution device, SSD) guidance. A Vicon motion capture system was used to measure human movement kinematics objectively. Ten normally sighted participants, 8 blind non-echolocators, and 1 blind expert echolocator navigated around a 0.6 x 2 m obstacle that was varied in position across trials, at the midline of the participant or 25 cm to the right or left. Although visual guidance was the most effective, participants successfully circumvented the obstacle in the majority of trials under auditory or SSD guidance. Using audition, blind non-echolocators navigated more effectively than blindfolded sighted individuals with fewer collisions, lower movement times, fewer velocity corrections and greater obstacle detection ranges. The blind expert echolocator displayed performance similar to or better than that for the other groups using audition, but was comparable to that for the other groups using the SSD. The generally better performance of blind than of sighted participants is consistent with the perceptual enhancement hypothesis that individuals with severe visual deficits develop improved auditory abilities to compensate for visual loss, here shown by faster, more fluid, and more accurate navigation around obstacles using sound.
Affiliation(s)
- Andrew J. Kolarik
- Vision and Eye Research Unit (VERU), Postgraduate Medical Institute, Anglia Ruskin University, Cambridge, United Kingdom
- Department of Psychology, University of Cambridge, Cambridge, United Kingdom
- Centre for the Study of the Senses, Institute of Philosophy, University of London, London, United Kingdom
- Amy C. Scarfe
- Vision and Eye Research Unit (VERU), Postgraduate Medical Institute, Anglia Ruskin University, Cambridge, United Kingdom
- Department of Clinical Engineering, Medical Imaging and Medical Physics Directorate, Sheffield Teaching Hospitals NHS Foundation Trust, Sheffield, United Kingdom
- Brian C. J. Moore
- Department of Psychology, University of Cambridge, Cambridge, United Kingdom
- Shahina Pardhan
- Vision and Eye Research Unit (VERU), Postgraduate Medical Institute, Anglia Ruskin University, Cambridge, United Kingdom
20
Hearing Scenes: A Neuromagnetic Signature of Auditory Source and Reverberant Space Separation. eNeuro 2017; 4:eN-NWR-0007-17. [PMID: 28451630] [PMCID: PMC5394928] [DOI: 10.1523/eneuro.0007-17.2017]
Abstract
Perceiving the geometry of surrounding space is a multisensory process, crucial to contextualizing object perception and guiding navigation behavior. Humans can make judgments about surrounding spaces from reverberation cues, caused by sounds reflecting off multiple interior surfaces. However, it remains unclear how the brain represents reverberant spaces separately from sound sources. Here, we report separable neural signatures of auditory space and source perception during magnetoencephalography (MEG) recording as subjects listened to brief sounds convolved with monaural room impulse responses (RIRs). The decoding signature of sound sources began at 57 ms after stimulus onset and peaked at 130 ms, while space decoding started at 138 ms and peaked at 386 ms. Importantly, these neuromagnetic responses were readily dissociable in form and time: while sound source decoding exhibited an early and transient response, the neural signature of space was sustained and independent of the original source that produced it. The reverberant space response was robust to variations in sound source, and vice versa, indicating a generalized response not tied to specific source-space combinations. These results provide the first neuromagnetic evidence for robust, dissociable auditory source and reverberant space representations in the human brain and reveal the temporal dynamics of how auditory scene analysis extracts percepts from complex naturalistic auditory signals.
21
Abstract
Valuable insights into the role played by visual experience in shaping spatial representations can be gained by studying the effects of visual deprivation on the remaining sensory modalities. For instance, it has long been debated how spatial hearing evolves in the absence of visual input. While several anecdotal accounts tend to associate complete blindness with exceptional hearing abilities, experimental evidence supporting such claims is, however, matched by nearly equal amounts of evidence documenting spatial hearing deficits. The purpose of this review is to summarize the key findings which support either enhancements or deficits in spatial hearing observed following visual loss and to provide a conceptual framework that isolates the specific conditions under which they occur. Available evidence will be examined in terms of spatial dimensions (horizontal, vertical, and depth perception) and in terms of frames of reference (egocentric and allocentric). Evidence suggests that while early blind individuals show superior spatial hearing in the horizontal plane, they also show significant deficits in the vertical plane. Potential explanations underlying these contrasting findings will be discussed. Early blind individuals also show spatial hearing impairments when performing tasks that require the use of an allocentric frame of reference. Results obtained with late-onset blind individuals suggest that early visual experience plays a key role in the development of both spatial hearing enhancements and deficits.
Affiliation(s)
- Patrice Voss
- Cognitive Neuroscience Unit, Department of Neurology and Neurosurgery, Montreal Neurological Institute, McGill University, Montreal, QC, Canada
22
Cornell Kärnekull S, Arshamian A, Nilsson ME, Larsson M. From Perception to Metacognition: Auditory and Olfactory Functions in Early Blind, Late Blind, and Sighted Individuals. Front Psychol 2016; 7:1450. [PMID: 27729884] [PMCID: PMC5037222] [DOI: 10.3389/fpsyg.2016.01450]
Abstract
Although evidence is mixed, studies have shown that blind individuals perform better than sighted ones at specific auditory, tactile, and chemosensory tasks. However, few studies have assessed blind and sighted individuals across different sensory modalities in the same study. We tested early blind (n = 15), late blind (n = 15), and sighted (n = 30) participants with analogous olfactory and auditory tests of absolute threshold, discrimination, identification, episodic recognition, and metacognitive ability. Although the multivariate analysis of variance (MANOVA) showed no overall effect of blindness and no interaction with modality, follow-up between-group contrasts indicated a blind-over-sighted advantage in auditory episodic recognition, which was most pronounced in early blind individuals. In contrast to the auditory modality, there was no empirical support for compensatory effects in any of the olfactory tasks. There was no conclusive evidence for group differences in metacognitive ability to predict episodic recognition performance. Taken together, the results showed no evidence of overall superior performance in blind relative to sighted individuals across olfactory and auditory functions, although early blind individuals excelled in episodic auditory recognition memory. This observation may be related to an experience-induced increase in auditory attentional capacity.
Affiliation(s)
- Artin Arshamian
- Gösta Ekman Laboratory, Department of Psychology, Stockholm University, Stockholm, Sweden; Division of Psychology, Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden; Center for Language Studies and Donders Institute for Brain, Cognition, and Behavior, Radboud University, Nijmegen, Netherlands
- Mats E Nilsson
- Gösta Ekman Laboratory, Department of Psychology, Stockholm University, Stockholm, Sweden
- Maria Larsson
- Gösta Ekman Laboratory, Department of Psychology, Stockholm University, Stockholm, Sweden
23
Thaler L, Goodale MA. Echolocation in humans: an overview. Wiley Interdiscip Rev Cogn Sci 2016; 7:382-393. [PMID: 27538733] [DOI: 10.1002/wcs.1408]
Abstract
Bats and dolphins are known for their ability to use echolocation. They emit bursts of sounds and listen to the echoes that bounce back to detect the objects in their environment. What is not as well-known is that some blind people have learned to do the same thing, making mouth clicks, for example, and using the returning echoes from those clicks to sense obstacles and objects of interest in their surroundings. The current review explores some of the research that has examined human echolocation and the changes that have been observed in the brains of echolocation experts. We also discuss potential applications and assistive technology based on echolocation. Blind echolocation experts can sense small differences in the location of objects, differentiate between objects of various sizes and shapes, and even between objects made of different materials, just by listening to the reflected echoes from mouth clicks. It is clear that echolocation may enable some blind people to do things that are otherwise thought to be impossible without vision, potentially providing them with a high degree of independence in their daily lives and demonstrating that echolocation can serve as an effective mobility strategy in the blind. Neuroimaging has shown that the processing of echoes activates brain regions in blind echolocators that would normally support vision in the sighted brain, and that the patterns of these activations are modulated by the information carried by the echoes. This work is shedding new light on just how plastic the human brain is.
Affiliation(s)
- Lore Thaler
- Department of Psychology, Durham University, Durham, UK
- Melvyn A Goodale
- The Brain and Mind Institute, Department of Psychology, University of Western Ontario, Ontario, Canada
24
Cappagli G, Gori M. Auditory spatial localization: Developmental delay in children with visual impairments. Res Dev Disabil 2016; 53-54:391-398. [PMID: 27002960] [DOI: 10.1016/j.ridd.2016.02.019]
Abstract
For individuals with visual impairments, auditory spatial localization is one of the most important features to navigate in the environment. Many works suggest that blind adults show similar or even enhanced performance for localization of auditory cues compared to sighted adults (Collignon, Voss, Lassonde, & Lepore, 2009). To date, the investigation of auditory spatial localization in children with visual impairments has provided contrasting results. Here we report, for the first time, that contrary to visually impaired adults, children with low vision or total blindness show a significant impairment in the localization of static sounds. These results suggest that simple auditory spatial tasks are compromised in children, and that this capacity recovers over time.
Affiliation(s)
- Giulia Cappagli
- Robotics, Brain and Cognitive Sciences Department, Fondazione Istituto Italiano di Tecnologia, via Morego 30, 16163 Genoa, Italy
- Monica Gori
- Robotics, Brain and Cognitive Sciences Department, Fondazione Istituto Italiano di Tecnologia, via Morego 30, 16163 Genoa, Italy
25
van den Bosch KA, Andringa TC, Başkent D, Vlaskamp C. The Role of Sound in Residential Facilities for People With Profound Intellectual and Multiple Disabilities. J Policy Pract Intellect Disabil 2016. [DOI: 10.1111/jppi.12147]
Affiliation(s)
- Tjeerd C. Andringa
- University College Groningen, Department of Science, University of Groningen, the Netherlands
- Deniz Başkent
- University Medical Centre Groningen, Department of Otorhinolaryngology/Head and Neck Surgery, University of Groningen, the Netherlands
- Carla Vlaskamp
- Special Needs Education and Youth Care, University of Groningen, the Netherlands
26
An assessment of auditory-guided locomotion in an obstacle circumvention task. Exp Brain Res 2016; 234:1725-35. [PMID: 26879767] [PMCID: PMC4851710] [DOI: 10.1007/s00221-016-4567-y]
Abstract
This study investigated how effectively audition can be used to guide navigation around an obstacle. Ten blindfolded normally sighted participants navigated around a 0.6 × 2 m obstacle while producing self-generated mouth click sounds. Objective movement performance was measured using a Vicon motion capture system. Performance with full vision without generating sound was used as a baseline for comparison. The obstacle’s location was varied randomly from trial to trial: it was either straight ahead or 25 cm to the left or right relative to the participant. Although audition provided sufficient information to detect the obstacle and guide participants around it without collision in the majority of trials, buffer space (clearance between the shoulder and obstacle), overall movement times, and number of velocity corrections were significantly (p < 0.05) greater with auditory guidance than visual guidance. Collisions sometime occurred under auditory guidance, suggesting that audition did not always provide an accurate estimate of the space between the participant and obstacle. Unlike visual guidance, participants did not always walk around the side that afforded the most space during auditory guidance. Mean buffer space was 1.8 times higher under auditory than under visual guidance. Results suggest that sound can be used to generate buffer space when vision is unavailable, allowing navigation around an obstacle without collision in the majority of trials.
27
Blind people are more sensitive than sighted people to binaural sound-location cues, particularly inter-aural level differences. Hear Res 2016; 332:223-232. [DOI: 10.1016/j.heares.2015.09.012]
28
The role of head movements in the discrimination of 2-D shape by blind echolocation experts. Atten Percept Psychophys 2015; 76:1828-37. [PMID: 24874262] [DOI: 10.3758/s13414-014-0695-2]
Abstract
Similar to certain bats and dolphins, some blind humans can use sound echoes to perceive their silent surroundings. By producing an auditory signal (e.g., a tongue click) and listening to the returning echoes, these individuals can obtain information about their environment, such as the size, distance, and density of objects. Past research has also hinted at the possibility that blind individuals may be able to use echolocation to gather information about 2-D surface shape, with definite results pending. Thus, here we investigated people's ability to use echolocation to identify the 2-D shape (contour) of objects. We also investigated the role played by head movements--that is, exploratory movements of the head while echolocating--because anecdotal evidence suggests that head movements might be beneficial for shape identification. To this end, we compared the performance of six expert echolocators to that of ten blind nonecholocators and ten blindfolded sighted controls in a shape identification task, with and without head movements. We found that the expert echolocators could use echoes to determine the shapes of the objects with exceptional accuracy when they were allowed to make head movements, but that their performance dropped to chance level when they had to remain still. Neither blind nor blindfolded sighted controls performed above chance, regardless of head movements. Our results show not only that experts can use echolocation to successfully identify 2-D shape, but also that head movements made while echolocating are necessary for the correct identification of 2-D shape.
29
Sohl-Dickstein J, Teng S, Gaub BM, Rodgers CC, Li C, DeWeese MR, Harper NS. A Device for Human Ultrasonic Echolocation. IEEE Trans Biomed Eng 2015; 62:1526-1534. [PMID: 25608301] [PMCID: PMC4536767] [DOI: 10.1109/tbme.2015.2393371]
Abstract
OBJECTIVE: We present a device that combines principles of ultrasonic echolocation and spatial hearing to provide human users with environmental cues that are 1) not otherwise available to the human auditory system, and 2) richer in object and spatial information than the more heavily processed sonar cues of other assistive devices. The device consists of a wearable headset with an ultrasonic emitter and stereo microphones with affixed artificial pinnae. The goal of this study is to describe the device and evaluate the utility of the echoic information it provides.
METHODS: The echoes of ultrasonic pulses were recorded and time stretched to lower their frequencies into the human auditory range, then played back to the user. We tested performance among naive and experienced sighted volunteers using a set of localization experiments, in which the locations of echo-reflective surfaces were judged using these time-stretched echoes.
RESULTS: Naive subjects were able to make laterality and distance judgments, suggesting that the echoes provide innately useful information without prior training. Naive subjects were generally unable to make elevation judgments from recorded echoes. However, trained subjects demonstrated an ability to judge elevation as well.
CONCLUSION: This suggests that the device can be used effectively to examine the environment and that the human auditory system can rapidly adapt to these artificial echolocation cues.
SIGNIFICANCE: Interpreting and interacting with the external world constitutes a major challenge for persons who are blind or visually impaired. This device has the potential to aid blind people in interacting with their environment.
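The time-stretching step described in the methods can be sketched numerically: slowing playback by a factor k divides every frequency component by k, which brings an ultrasonic echo down into the audible range. The sample rate, tone frequency, and stretch factor below are illustrative assumptions, not the device's actual parameters:

```python
import numpy as np

def time_stretch(x, factor):
    """Resample x onto a grid `factor` times denser.

    Playing the result back at the original sample rate lowers every
    frequency by `factor` (simple slowed-playback stretching: pitch and
    duration both scale together).
    """
    n = len(x)
    new_t = np.linspace(0, n - 1, int(round(n * factor)))
    return np.interp(new_t, np.arange(n), x)

fs = 192_000                             # assumed ultrasound-capable capture rate
t = np.arange(int(0.01 * fs)) / fs
echo = np.sin(2 * np.pi * 40_000 * t)    # 40 kHz "echo", above human hearing

audible = time_stretch(echo, factor=8)   # 40 kHz -> ~5 kHz at playback

# Locate the dominant frequency of the stretched signal
spec = np.abs(np.fft.rfft(audible))
peak_hz = np.fft.rfftfreq(len(audible), d=1 / fs)[np.argmax(spec)]
```

The actual device presumably uses more careful resampling and binaural recordings through the artificial pinnae; this sketch only shows why the stretched echoes become audible.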
Affiliation(s)
- Nicol S. Harper
- University of Oxford; formerly with the University of California, Berkeley
30
Rowan D, Papadopoulos T, Edwards D, Allen R. Use of binaural and monaural cues to identify the lateral position of a virtual object using echoes. Hear Res 2015; 323:32-9. [PMID: 25660196] [DOI: 10.1016/j.heares.2015.01.012]
Abstract
Under certain conditions, sighted and blind humans can use echoes to discern characteristics of otherwise silent objects. Previous research concluded that robust horizontal-plane object localisation ability, without using head movement, depends on information above 2 kHz. While a strong interaural level difference (ILD) cue is available, it was not clear whether listeners were using that cue or the monaural level cue that necessarily accompanies ILD. In this experiment, 13 sighted and normal-hearing listeners were asked to identify the right-vs.-left position of an object in virtual auditory space. Sounds were manipulated to remove binaural cues (binaural vs. diotic presentation) and to prevent the use of monaural level cues (using level roving). With both low- (<2 kHz) and high- (>2 kHz) frequency bands of noise, performance with binaural presentation and level rove exceeded both the level expected from use of monaural level cues and performance with diotic presentation. It is argued that a high-frequency binaural cue (most likely ILD), and not a monaural level cue, is crucial for robust object localisation without head movement.
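The level-roving manipulation can be sketched as below: a single random gain, drawn per trial, is applied identically to both channels, so the absolute (monaural) level becomes uninformative while interaural level differences survive. The function name and rove range are illustrative, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_level_rove(stereo, rove_range_db=10.0):
    """Scale both channels by one random gain so that absolute
    level varies from trial to trial, while the interaural level
    difference (ILD) is left untouched."""
    rove_db = rng.uniform(-rove_range_db / 2, rove_range_db / 2)
    return stereo * 10 ** (rove_db / 20), rove_db

left = np.ones(100)
right = 0.5 * np.ones(100)                 # ~6 dB ILD, right ear quieter
roved, rove_db = apply_level_rove(np.stack([left, right]))

# The ILD (in dB) is preserved exactly by the common gain:
ild_db = 20 * np.log10(roved[0].mean() / roved[1].mean())
```

A listener relying on "the louder trial had the object" is defeated by the rove, but one comparing ears within a trial is not, which is exactly the dissociation the experiment exploits.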
Affiliation(s)
- Daniel Rowan
- Institute of Sound and Vibration Research, University of Southampton, UK.
- Timos Papadopoulos
- Institute of Sound and Vibration Research, University of Southampton, UK
- David Edwards
- Institute of Sound and Vibration Research, University of Southampton, UK
- Robert Allen
- Institute of Sound and Vibration Research, University of Southampton, UK
31
Wallmeier L, Wiegrebe L. Ranging in human sonar: effects of additional early reflections and exploratory head movements. PLoS One 2014; 9:e115363. [PMID: 25551226] [PMCID: PMC4281102] [DOI: 10.1371/journal.pone.0115363]
Abstract
Many blind people rely on echoes from self-produced sounds to assess their environment. It has been shown that human subjects can use echolocation for directional localization and orientation in a room, but echo-acoustic distance perception (e.g. to determine one's position in a room) has received little scientific attention, and systematic studies on the influence of additional early reflections and exploratory head movements are lacking. This study investigates echo-acoustic distance discrimination in virtual echo-acoustic space, using the impulse responses of a real corridor. Six blindfolded sighted subjects and a blind echolocation expert had to discriminate between two positions in the virtual corridor, which differed in their distance to the front wall, but not to the lateral walls. To solve this task, participants evaluated echoes that were generated in real time from self-produced vocalizations. Across experimental conditions, we systematically varied the restrictions on head rotations, the subjects' orientation in virtual space and the reference position. Three key results were observed. First, all participants successfully solved the task, with discrimination thresholds below 1 m for all reference distances (0.75-4 m). Performance was best for the smallest reference distance of 0.75 m, with thresholds around 20 cm. Second, distance discrimination performance was relatively robust against additional early reflections, compared to other echolocation tasks like directional localization. Third, free head rotations during echolocation can improve distance discrimination performance in complex environmental settings. However, head movements do not necessarily provide a benefit over static echolocation from an optimal single orientation. These results show that accurate distance discrimination through echolocation is possible over a wide range of reference distances and environmental conditions. This is an important functional benefit of human echolocation, which may also play a major role in the calibration of auditory space representations.
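The geometry underlying such distance judgements is simple: an echo from a reflector at distance d arrives 2d/c after the emission. A minimal sketch (the sampling rate, reflection gain, and single-echo signal model are illustrative; this is not the authors' virtual-space implementation):

```python
import numpy as np

C = 343.0                                  # speed of sound, m/s
fs = 48_000

def echo_delay(distance_m):
    """Round-trip delay for a reflector at distance_m."""
    return 2.0 * distance_m / C

# Simulate a click plus one attenuated wall echo at 0.75 m,
# the best-discriminated reference distance in the study above.
tau = echo_delay(0.75)                     # ~4.4 ms
sig = np.zeros(int(0.05 * fs))
sig[0] = 1.0                               # direct sound (the vocalization)
sig[int(round(tau * fs))] += 0.3           # echo

# Recover the distance from the echo's arrival time
est_lag = np.argmax(sig[1:]) + 1           # first peak after the direct sound
est_distance = C * (est_lag / fs) / 2.0
```

A 20 cm threshold at 0.75 m thus corresponds to resolving a delay difference of only about 1.2 ms between emission and echo.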
Affiliation(s)
- Ludwig Wallmeier
- Graduate School of Systemic Neuroscience, Ludwig-Maximilians-Universität München, Planegg-Martinsried, Germany
- Division of Neurobiology, Department Biologie II, Ludwig-Maximilians-Universität München, Planegg-Martinsried, Germany
- Lutz Wiegrebe
- Graduate School of Systemic Neuroscience, Ludwig-Maximilians-Universität München, Planegg-Martinsried, Germany
- Division of Neurobiology, Department Biologie II, Ludwig-Maximilians-Universität München, Planegg-Martinsried, Germany
32
Wallmeier L, Wiegrebe L. Self-motion facilitates echo-acoustic orientation in humans. R Soc Open Sci 2014; 1:140185. [PMID: 26064556] [PMCID: PMC4448837] [DOI: 10.1098/rsos.140185]
Abstract
The ability of blind humans to navigate complex environments through echolocation has received rapidly increasing scientific interest. However, technical limitations have precluded a formal quantification of the interplay between echolocation and self-motion. Here, we use a novel virtual echo-acoustic space technique to formally quantify the influence of self-motion on echo-acoustic orientation. We show that both the vestibular and proprioceptive components of self-motion contribute significantly to successful echo-acoustic orientation in humans: specifically, our results show that vestibular input induced by whole-body self-motion resolves orientation-dependent biases in echo-acoustic cues. Fast head motions, relative to the body, provide additional proprioceptive cues which allow subjects to effectively assess echo-acoustic space referenced against the body orientation. These psychophysical findings clearly demonstrate that human echolocation is well suited to drive precise locomotor adjustments. Our data shed new light on the sensory-motor interactions, and on possible optimization strategies underlying echolocation in humans.
Affiliation(s)
- Ludwig Wallmeier
- Division of Neurobiology, Department Biologie II, Ludwig-Maximilians-Universität München, Großhadernerstr. 2, 82152 Planegg-Martinsried, Germany
- Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität München, Großhadernerstr. 2, 82152 Planegg-Martinsried, Germany
- Lutz Wiegrebe
- Division of Neurobiology, Department Biologie II, Ludwig-Maximilians-Universität München, Großhadernerstr. 2, 82152 Planegg-Martinsried, Germany
- Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität München, Großhadernerstr. 2, 82152 Planegg-Martinsried, Germany
33
Hüg MX, Arias C, Tommasini FC, Ramos OA. Auditory localization and precedence effect: an exploratory study in infants and toddlers with visual impairment and normal vision. Res Dev Disabil 2014; 35:2015-2025. [PMID: 24864055] [DOI: 10.1016/j.ridd.2014.04.022]
Abstract
The precedence effect is a spatial hearing phenomenon implicated in sound localization in reverberant environments. It occurs when a pair of sounds, with a brief delay between them, is presented from different directions; listeners give greater perceptual weight to localization cues coming from the first-arriving sound, called the lead, and suppress localization cues from the later-arriving reflection, called the lag. Developmental studies with sighted infants show that the first responses to precedence-effect stimuli are observed at 4-5 months of life. In this exploratory study, we use the minimum audible angle (MAA) paradigm in conjunction with the observer-based psychophysical procedure to test the ability of infants and toddlers, with visual impairment and normal vision, to discriminate changes in the azimuthal position of sounds configured under precedence-effect conditions. The results indicated that blind toddlers performed similarly to, and in some conditions better than, sighted children of similar age, and revealed that the observer-based psychophysical procedure is a valuable method for measuring auditory localization acuity in infants and toddlers with visual impairment. The video records showed auditory orienting behaviors specific to the blind children.
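A precedence-effect stimulus of the kind described, a lead click and a lag click from different directions separated by a brief delay, can be sketched as a stereo buffer in which each click carries its own interaural time difference (ITD). This is a schematic headphone-style illustration; the ITD values, click length, and 4 ms lead-lag delay are illustrative, not taken from the study.

```python
import numpy as np

fs = 48_000
CLICK_N = 48                               # 1 ms rectangular click

def put_click(buf, start):
    buf[start:start + CLICK_N] += 1.0

def precedence_pair(lead_itd_s, lag_itd_s, lead_lag_delay_s):
    """Stereo lead/lag pair: the lead click reaches the left ear
    first (source on the left), the lag (simulated reflection)
    reaches the right ear first. Listeners weight the lead's
    localization cues and suppress the lag's."""
    gap = int(round(lead_lag_delay_s * fs))
    n = gap + 6 * CLICK_N
    left, right = np.zeros(n), np.zeros(n)
    put_click(left, 0)                               # lead, left ear first
    put_click(right, int(round(lead_itd_s * fs)))
    put_click(right, gap)                            # lag, right ear first
    put_click(left, gap + int(round(lag_itd_s * fs)))
    return np.stack([left, right])

stereo = precedence_pair(300e-6, 300e-6, 4e-3)       # 300 us ITDs, 4 ms delay
```

In an MAA procedure, the azimuth of the lead (or the lag) would be varied across trials by changing the corresponding ITD, and the listener's discrimination threshold estimated from the responses.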
Affiliation(s)
- Mercedes X Hüg
- Facultad de Psicología, Universidad Nacional de Córdoba, Enfermera Gordillo esq. Enrique Barros, 5016, Córdoba, Argentina; Centro de Investigación y Transferencia en Acústica (CINTRA), Universidad Tecnológica Nacional, Facultad Regional Córdoba, Mtro. López esq. Cruz Roja Argentina, 5016, Córdoba, Argentina.
- Claudia Arias
- Facultad de Psicología, Universidad Nacional de Córdoba, Enfermera Gordillo esq. Enrique Barros, 5016, Córdoba, Argentina; Centro de Investigación y Transferencia en Acústica (CINTRA), Universidad Tecnológica Nacional, Facultad Regional Córdoba, Mtro. López esq. Cruz Roja Argentina, 5016, Córdoba, Argentina; Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Argentina
- Fabián C Tommasini
- Centro de Investigación y Transferencia en Acústica (CINTRA), Universidad Tecnológica Nacional, Facultad Regional Córdoba, Mtro. López esq. Cruz Roja Argentina, 5016, Córdoba, Argentina
- Oscar A Ramos
- Centro de Investigación y Transferencia en Acústica (CINTRA), Universidad Tecnológica Nacional, Facultad Regional Córdoba, Mtro. López esq. Cruz Roja Argentina, 5016, Córdoba, Argentina; Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Argentina
34
Thaler L, Wilson RC, Gee BK. Correlation between vividness of visual imagery and echolocation ability in sighted, echo-naïve people. Exp Brain Res 2014; 232:1915-25. [PMID: 24584899] [DOI: 10.1007/s00221-014-3883-3]
Abstract
The ability of humans to echolocate has been recognized since the 1940s. Little is known, however, about what determines individual differences in echolocation ability. Although hearing ability has been suggested as an important factor in blind people and sighted-trained echolocators, there is evidence to suggest that this may not be the case for sighted novices. Therefore, non-auditory aspects of human cognition might be relevant. Previous brain imaging studies have shown activation of the early 'visual', i.e. calcarine, cortex during echolocation in blind echolocation experts, and also during visual imagery in blind and sighted people. Therefore, here we investigated the relationship between echolocation ability and vividness of visual imagery (VVI). Twenty-four sighted echolocation novices completed Marks' (Br J Psychol 1:17-24, 1973) VVI questionnaire and performed an echolocation size-discrimination task. Furthermore, they participated in a battery of auditory tests that determined their ability to detect fluctuations in sound frequency and intensity, as well as hearing differences between the right and left ears. A correlational analysis revealed a significant relationship between participants' VVI and echolocation ability, i.e. participants with stronger VVI also had higher echolocation ability, even when differences in auditory abilities were taken into account. In terms of underlying mechanisms, we suggest either that the use of visual imagery is a strategy for echolocation, or that visual imagery and echolocation both depend on the ability to recruit calcarine cortex for cognitive tasks that do not rely on retinal input.
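"Taking auditory abilities into account" in a correlational analysis is, in essence, a partial correlation. A minimal sketch using the residual method on synthetic data (the variable names, effect sizes, and sample size are invented for illustration and are not the study's data):

```python
import numpy as np

def partial_corr(x, y, covariates):
    """Pearson r between x and y after regressing the covariates
    (plus an intercept) out of both variables."""
    Z = np.column_stack([np.ones(len(x))] + list(covariates))
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

rng = np.random.default_rng(1)
n = 200
hearing = rng.normal(size=n)                     # hypothetical auditory score
vvi = 0.5 * hearing + rng.normal(size=n)         # imagery partly tracks hearing
echo = 0.8 * vvi + 0.5 * hearing + rng.normal(size=n)

# VVI-echolocation association with hearing held fixed:
r_partial = partial_corr(echo, vvi, [hearing])
```

If the VVI-echolocation link were driven entirely by shared variance with hearing, the partial correlation would drop to near zero; a clearly positive value mirrors the pattern the authors report.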
Affiliation(s)
- Lore Thaler
- Department of Psychology, Durham University, Science Site, South Road, Durham, DH1 3LE, UK.
35
Kolarik AJ, Cirstea S, Pardhan S, Moore BCJ. A summary of research investigating echolocation abilities of blind and sighted humans. Hear Res 2014; 310:60-8. [PMID: 24524865] [DOI: 10.1016/j.heares.2014.01.010]
Abstract
There is currently considerable interest in the consequences of loss in one sensory modality on the remaining senses. Much of this work has focused on the development of enhanced auditory abilities among blind individuals, who are often able to use sound to navigate through space. It has now been established that many blind individuals produce sound emissions and use the returning echoes to provide them with information about objects in their surroundings, in a similar manner to bats navigating in the dark. In this review, we summarize current knowledge regarding human echolocation. Some blind individuals develop remarkable echolocation abilities, and are able to assess the position, size, distance, shape, and material of objects using reflected sound waves. After training, normally sighted people are also able to use echolocation to perceive objects, and can develop abilities comparable to, but typically somewhat poorer than, those of blind people. The underlying cues and mechanisms, operable range, spatial acuity and neurological underpinnings of echolocation are described. Echolocation can result in functional real-life benefits. It is possible that these benefits can be optimized via suitable training, especially among those with recently acquired blindness, but this requires further study. Areas for further research are identified.
Affiliation(s)
- Andrew J Kolarik
- Department of Psychology, University of Cambridge, Downing Street, Cambridge CB2 3EB, United Kingdom.
- Silvia Cirstea
- Vision and Eye Research Unit (VERU), Postgraduate Medical Institute, Anglia Ruskin University, Eastings 204, East Road, Cambridge CB1 1PT, United Kingdom.
- Shahina Pardhan
- Vision and Eye Research Unit (VERU), Postgraduate Medical Institute, Anglia Ruskin University, Eastings 204, East Road, Cambridge CB1 1PT, United Kingdom.
- Brian C J Moore
- Department of Psychology, University of Cambridge, Downing Street, Cambridge CB2 3EB, United Kingdom.
36
Thaler L, Milne JL, Arnott SR, Kish D, Goodale MA. Neural correlates of motion processing through echolocation, source hearing, and vision in blind echolocation experts and sighted echolocation novices. J Neurophysiol 2014; 111:112-27. [DOI: 10.1152/jn.00501.2013]
Abstract
We have shown in previous research (Thaler L, Arnott SR, Goodale MA. PLoS One 6: e20162, 2011) that motion processing through echolocation activates temporal-occipital cortex in blind echolocation experts. Here we investigated how neural substrates of echo-motion are related to neural substrates of auditory source-motion and visual-motion. Three blind echolocation experts and twelve sighted echolocation novices underwent functional MRI scanning while they listened to binaural recordings of moving or stationary echolocation or auditory source sounds located either in left or right space. Sighted participants' brain activity was also measured while they viewed moving or stationary visual stimuli. For each of the three modalities separately (echo, source, vision), we then identified motion-sensitive areas in temporal-occipital cortex and in the planum temporale. We then used a region of interest (ROI) analysis to investigate cross-modal responses, as well as laterality effects. In both sighted novices and blind experts, we found that temporal-occipital source-motion ROIs did not respond to echo-motion, and echo-motion ROIs did not respond to source-motion. This double-dissociation was absent in planum temporale ROIs. Furthermore, temporal-occipital echo-motion ROIs in blind, but not sighted, participants showed evidence for contralateral motion preference. Temporal-occipital source-motion ROIs did not show evidence for contralateral preference in either blind or sighted participants. Our data suggest a functional segregation of processing of auditory source-motion and echo-motion in human temporal-occipital cortex. Furthermore, the data suggest that the echo-motion response in blind experts may represent a reorganization rather than exaggeration of response observed in sighted novices. There is the possibility that this reorganization involves the recruitment of “visual” cortical areas.
Affiliation(s)
- L. Thaler
- Department of Psychology, Durham University, Durham, United Kingdom
- J. L. Milne
- The Brain and Mind Institute, The University of Western Ontario, London, Ontario, Canada
- S. R. Arnott
- The Rotman Research Institute, Baycrest, Toronto, Ontario, Canada; and
- D. Kish
- World Access for the Blind, Encino, California
- M. A. Goodale
- The Brain and Mind Institute, The University of Western Ontario, London, Ontario, Canada
37
Wallmeier L, Geßele N, Wiegrebe L. Echolocation versus echo suppression in humans. Proc Biol Sci 2013; 280:20131428. [PMID: 23986105] [PMCID: PMC3768302] [DOI: 10.1098/rspb.2013.1428]
Abstract
Several studies have shown that blind humans can gather spatial information through echolocation. However, when localizing sound sources, the precedence effect suppresses spatial information of echoes, and thereby conflicts with effective echolocation. This study investigates the interaction of echolocation and echo suppression in terms of discrimination suppression in virtual acoustic space. In the 'Listening' experiment, sighted subjects discriminated between positions of a single sound source, of the leading of two sources, or of the lagging of two sources. In the 'Echolocation' experiment, the sources were replaced by reflectors. Here, the same subjects evaluated echoes generated in real time from self-produced vocalizations and thereby discriminated between positions of a single reflector, of the leading of two reflectors, or of the lagging of two reflectors. Two key results were observed. First, sighted subjects can learn to discriminate positions of reflective surfaces echo-acoustically with accuracy comparable to sound-source discrimination. Second, in the Listening experiment, the presence of the leading source affected discrimination of lagging sources much more than vice versa. In the Echolocation experiment, however, the presence of both the lead and the lag strongly affected discrimination. These data show that the classically described asymmetry in the perception of leading and lagging sounds is strongly diminished in an echolocation task. Additional control experiments showed that the effect is due both to the direct sound of the vocalization, which precedes the echoes, and to the fact that the subjects actively vocalize in the echolocation task.
Affiliation(s)
- Ludwig Wallmeier
- Division of Neurobiology, Department Biologie II, Ludwig-Maximilians-Universität München, Großhadernerstraße 2, 82152 Planegg-Martinsried, Germany
- Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität München, Großhadernerstraße 2, 82152 Planegg-Martinsried, Germany
- Nikodemus Geßele
- Division of Neurobiology, Department Biologie II, Ludwig-Maximilians-Universität München, Großhadernerstraße 2, 82152 Planegg-Martinsried, Germany
- Lutz Wiegrebe
- Division of Neurobiology, Department Biologie II, Ludwig-Maximilians-Universität München, Großhadernerstraße 2, 82152 Planegg-Martinsried, Germany
- Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität München, Großhadernerstraße 2, 82152 Planegg-Martinsried, Germany
38
Kupers R, Ptito M. Compensatory plasticity and cross-modal reorganization following early visual deprivation. Neurosci Biobehav Rev 2013; 41:36-52. [PMID: 23954750] [DOI: 10.1016/j.neubiorev.2013.08.001]
Abstract
For human and non-human primates, vision is one of the most privileged sensory channels used to interact with the environment. The importance of vision is strongly embedded in the organization of the primate brain as about one third of its cortical surface is involved in visual functions. It is therefore not surprising that the absence of vision from birth, or the loss of vision later in life, has huge consequences, both anatomically and functionally. Studies in animals and humans, conducted over the past few decades, have demonstrated that the absence of vision causes massive structural changes that take place not only in the visually deprived cortex but also in other brain areas. These studies have further shown that the visually deprived cortex becomes responsive to a wide variety of non-visual sensory inputs. Recent studies even showed a role of the visually deprived cortex in cognitive processes. At the behavioral level, increases in acuity for auditory and tactile processes have been reported. The study of the congenitally blind brain also offers a unique model to gain better insights into the functioning of the normal sighted brain and to understand to what extent visual experience is necessary for the brain to develop its functional architecture. Finally, the study of the blind brain allows us to investigate how consciousness develops in the absence of vision. How does the brain of someone who has never had any visual perception form an image of the external world? In this paper, we discuss recent findings from animal studies as well as from behavioural and functional brain imaging studies in sighted and blind individuals that address these questions.
Affiliation(s)
- Ron Kupers
- BRAINlab, Department of Neuroscience & Pharmacology, Panum Institute, University of Copenhagen, Copenhagen, Denmark; École d'Optométrie, Université de Montréal, Montréal, QC, Canada.
- Maurice Ptito
- BRAINlab, Department of Neuroscience & Pharmacology, Panum Institute, University of Copenhagen, Copenhagen, Denmark; École d'Optométrie, Université de Montréal, Montréal, QC, Canada
39
Rowan D, Papadopoulos T, Edwards D, Holmes H, Hollingdale A, Evans L, Allen R. Identification of the lateral position of a virtual object based on echoes by humans. Hear Res 2013; 300:56-65. [DOI: 10.1016/j.heares.2013.03.005]
40
Kolarik AJ, Cirstea S, Pardhan S. Evidence for enhanced discrimination of virtual auditory distance among blind listeners using level and direct-to-reverberant cues. Exp Brain Res 2012. [PMID: 23178908] [DOI: 10.1007/s00221-012-3340-0]
Abstract
Totally blind listeners often demonstrate better than normal capabilities when performing spatial hearing tasks. Accurate representation of three-dimensional auditory space requires the processing of available distance information between the listener and the sound source; however, auditory distance cues vary greatly depending upon the acoustic properties of the environment, and it is not known which distance cues are important to totally blind listeners. Our data show that totally blind listeners display better performance compared to sighted age-matched controls for distance discrimination tasks in anechoic and reverberant virtual rooms simulated using a room-image procedure. Totally blind listeners use two major auditory distance cues to stationary sound sources, level and direct-to-reverberant ratio, more effectively than sighted controls for many of the virtual distances tested. These results show that significant compensation among totally blind listeners for virtual auditory spatial distance leads to benefits across a range of simulated acoustic environments. No significant differences in performance were observed between listeners with partial non-correctable visual losses and sighted controls, suggesting that sensory compensation for virtual distance does not occur for listeners with partial vision loss.
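The direct-to-reverberant ratio (DRR) cue mentioned above is typically computed from a room impulse response by comparing the energy in a short window around the direct-path arrival with the energy of the remaining tail. A minimal sketch on a toy impulse response (the window length, decay constant, and tail model are illustrative, not the room-image procedure used in the study):

```python
import numpy as np

def drr_db(ir, fs, direct_window_ms=2.5):
    """Direct-to-reverberant ratio in dB: energy within a short
    window at the direct-path arrival vs everything after it."""
    onset = int(np.argmax(np.abs(ir)))
    w = int(direct_window_ms * 1e-3 * fs)
    direct = ir[onset:onset + w]
    tail = ir[onset + w:]
    return 10.0 * np.log10(np.sum(direct ** 2) / np.sum(tail ** 2))

fs = 48_000
ir = np.zeros(fs // 2)
ir[100] = 1.0                                          # direct path
k = np.arange(len(ir) - 400)
ir[400:] = 0.05 * np.exp(-k / (0.05 * fs)) * np.cos(k)  # toy reverberant tail

drr = drr_db(ir, fs)   # negative here: the reverberant tail dominates
```

Because the direct-path energy falls with distance while the reverberant energy stays roughly constant, DRR decreases as the source moves away, which is what makes it a usable distance cue.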
Affiliation(s)
- Andrew J Kolarik
- Vision and Eye Research Unit (VERU), Postgraduate Medical Institute, Anglia Ruskin University, Eastings 204, East Road, Cambridge, CB1 1PT, UK.
41
Schenkman BN, Nilsson ME. Human echolocation: pitch versus loudness information. Perception 2012; 40:840-52. [PMID: 22128556] [DOI: 10.1068/p6898]
Abstract
Blind persons emit sounds to detect objects by echolocation. Both the perceived pitch and the perceived loudness of the emitted sound change as it fuses with the reflections from nearby objects. Blind persons are generally better than sighted persons at echolocation, but it is unclear whether this superiority is related to detection of pitch, loudness, or both. We measured the ability of twelve blind and twenty-five sighted listeners to determine which of two sounds (500 ms noise bursts) had been recorded with an artificial head in the presence of a reflecting object, in a room with reflecting walls. The sound pairs were original recordings differing in both pitch and loudness, or manipulated recordings with either the pitch or the loudness information removed. Observers responded using a 2AFC method with verbal feedback. For both blind and sighted listeners, performance declined more with the pitch information removed than with the loudness information removed. In addition, the blind performed clearly better than the sighted as long as the pitch information was present, but not when it was removed. Taken together, these results show that the ability to detect pitch is a main factor underlying high performance in human echolocation.
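The pitch cue at work here is repetition pitch: fusing a sound with its reflection adds a delayed copy, which listeners can hear as a pitch at 1/delay. A minimal sketch showing that this cue is physically present in the signal as an autocorrelation peak (the burst length, delay, and echo gain are illustrative values, not the study's stimuli):

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 48_000
noise = rng.normal(size=6000)              # short noise burst

delay = 0.002                              # 2 ms reflection delay
lag = int(delay * fs)                      # 96 samples
echoic = noise.copy()
echoic[lag:] += 0.5 * noise[:-lag]         # add the reflected copy

def dominant_lag(x, max_lag):
    """Positive lag with the strongest autocorrelation; its
    inverse (in seconds) is the repetition pitch."""
    ac = [np.dot(x[:-l], x[l:]) for l in range(1, max_lag + 1)]
    return int(np.argmax(ac)) + 1

est = dominant_lag(echoic, max_lag=400)
pitch_hz = fs / est                        # ~500 Hz for a 2 ms delay
```

Removing this delay structure (e.g. by randomizing the phase spectrum) destroys the pitch cue while leaving the overall level change intact, which is the kind of dissociation the experiment relies on.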
Affiliation(s)
- Bo N Schenkman
- Blekinge Institute of Technology, Box 520, SE-372 25 Ronneby, Sweden and Centre for Speech Technology, Department of Speech, Hearing and Music, Royal Institute of Technology, Stockholm, Sweden.
42
Teng S, Puri A, Whitney D. Ultrafine spatial acuity of blind expert human echolocators. Exp Brain Res 2011; 216:483-8. [PMID: 22101568] [DOI: 10.1007/s00221-011-2951-1]
Abstract
Echolocating organisms represent their external environment using reflected auditory information from emitted vocalizations. This ability, long known in various non-human species, has also been documented in some blind humans as an aid to navigation, as well as object detection and coarse localization. Surprisingly, our understanding of the basic acuity attainable by practitioners, the most fundamental underpinning of echoic spatial perception, remains crude. We found that experts were able to discriminate horizontal offsets of stimuli as small as ~1.2° auditory angle in the frontomedial plane, a resolution approaching the maximum measured precision of human spatial hearing and comparable to that found in bats performing similar tasks. Furthermore, we found a strong correlation between echolocation acuity and age of blindness onset. This first measure of functional spatial resolution in a population of expert echolocators demonstrates precision comparable to that found in the visual periphery of sighted individuals.
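To appreciate how fine ~1.2° is, one can convert azimuth to interaural time difference with Woodworth's spherical-head approximation. A sketch (the head radius and speed of sound are nominal textbook values, not measurements from the study):

```python
import numpy as np

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Woodworth's spherical-head approximation of the interaural
    time difference for a distant source at the given azimuth
    (0 deg = straight ahead)."""
    theta = np.radians(azimuth_deg)
    return (head_radius_m / c) * (theta + np.sin(theta))

# A 1.2 deg offset near the midline changes the ITD by only about
# 10 microseconds, close to the limits of human binaural timing acuity.
itd_change = woodworth_itd(1.2) - woodworth_itd(0.0)
```

That the experts' echo-acoustic thresholds approach this binaural limit is what makes the reported acuity remarkable.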
Affiliation(s)
- Santani Teng
- Department of Psychology, The University of California at Berkeley, 3210 Tolman Hall, Berkeley, CA 94720, USA.
43
Collignon O, Champoux F, Voss P, Lepore F. Sensory rehabilitation in the plastic brain. Prog Brain Res 2011; 191:211-31. [PMID: 21741554] [DOI: 10.1016/b978-0-444-53752-2.00003-5]
Abstract
The purpose of this review is to consider new sensory rehabilitation avenues in the context of the brain's remarkable ability to reorganize itself following sensory deprivation. Here, deafness and blindness are taken as two illustrative models. Mainly, two promising rehabilitative strategies based on opposing theoretical principles will be considered: sensory substitution and neuroprostheses. Sensory substitution makes use of the remaining intact senses to provide blind or deaf individuals with coded information of the lost sensory system. This technique thus benefits from added neural resources in the processing of the remaining senses resulting from crossmodal plasticity, which is thought to be coupled with behavioral enhancements in the intact senses. On the other hand, neuroprostheses represent an invasive approach aimed at stimulating the deprived sensory system directly in order to restore, at least partially, its functioning. This technique therefore relies on the neuronal integrity of the brain areas normally dedicated to the deprived sense and is rather hindered by the compensatory reorganization observed in the deprived cortex. Here, we stress that our understanding of the neuroplastic changes that occur in sensory-deprived individuals may help guide the design and the implementation of such rehabilitative methods.
Affiliation(s)
- Olivier Collignon
- Centre de Recherche en Neuropsychologie et Cognition, CERNEC, Université de Montréal, Montréal, Québec, Canada.
44
Chan CCH, Wong AWK, Ting KH, Whitfield-Gabrieli S, He J, Lee TMC. Cross auditory-spatial learning in early-blind individuals. Hum Brain Mapp 2011; 33:2714-27. [PMID: 21932260] [DOI: 10.1002/hbm.21395]
Abstract
Cross-modal processing enables the utilization of information received via different sensory organs to facilitate more complicated human actions. We used functional MRI on early-blind individuals to study the neural processes associated with cross auditory-spatial learning. The auditory signals, converted from echoes of ultrasonic signals emitted from a navigation device, were novel to the participants. The subjects were trained repeatedly for 4 weeks in associating the auditory signals with different distances. Subjects' blood-oxygenation-level-dependent responses were captured at baseline and after training using a sound-to-distance judgment task. Whole-brain analyses indicated that the task used in the study involved auditory discrimination as well as spatial localization. The learning process was shown to be mediated by the inferior parietal cortex and the hippocampus, suggesting the integration and binding of auditory features to distances. The right cuneus was found to possibly serve a general rather than a specific role, forming an occipital-enhanced network for cross auditory-spatial learning. This functional network is likely to be unique to those with early blindness, since the normal-vision counterparts shared activities only in the parietal cortex.
Affiliation(s)
- Chetwyn C H Chan
- Applied Cognitive Neuroscience Laboratory, Department of Rehabilitation Sciences, The Hong Kong Polytechnic University, Hong Kong, China.
45
Papadopoulos T, Edwards DS, Rowan D, Allen R. Identification of auditory cues utilized in human echolocation—Objective measurement results. Biomed Signal Process Control 2011. [DOI: 10.1016/j.bspc.2011.03.005]
46
Thaler L, Arnott SR, Goodale MA. Neural correlates of natural human echolocation in early and late blind echolocation experts. PLoS One 2011; 6:e20162. [PMID: 21633496] [PMCID: PMC3102086] [DOI: 10.1371/journal.pone.0020162]
Abstract
Background: A small number of blind people are adept at echolocating silent objects simply by producing mouth clicks and listening to the returning echoes. Yet the neural architecture underlying this type of aid-free human echolocation has not been investigated. To tackle this question, we recruited echolocation experts, one early- and one late-blind, and measured functional brain activity in each of them while they listened to their own echolocation sounds.
Results: When we compared brain activity for sounds that contained both clicks and the returning echoes with brain activity for control sounds that did not contain the echoes, but were otherwise acoustically matched, we found activity in calcarine cortex in both individuals. Importantly, for the same comparison, we did not observe a difference in activity in auditory cortex. In the early-blind, but not the late-blind, participant, we also found that the calcarine activity was greater for echoes reflected from surfaces located in contralateral space. Finally, in both individuals, we found activation in middle temporal and nearby cortical regions when they listened to echoes reflected from moving targets.
Conclusions: These findings suggest that processing of click-echoes recruits brain regions typically devoted to vision rather than audition in both early and late blind echolocation experts.
Affiliation(s)
- Lore Thaler
- Department of Psychology, University of Western Ontario, London, Ontario, Canada
- Melvyn A. Goodale
- Department of Psychology, University of Western Ontario, London, Ontario, Canada
47
Voss P, Collignon O, Lassonde M, Lepore F. Adaptation to sensory loss. Wiley Interdiscip Rev Cogn Sci 2010; 1:308-328. [DOI: 10.1002/wcs.13]
Affiliation(s)
- Patrice Voss
- Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Université de Montréal, Montreal, Canada
- Olivier Collignon
- Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Université de Montréal, Montreal, Canada
- Université catholique de Louvain, Institute of Neuroscience, Neural Rehabilitation Engineering Laboratory, Belgium
- Maryse Lassonde
- Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Université de Montréal, Montreal, Canada
- Centre de Recherche CHU Sainte‐Justine, Montreal, Canada
- Franco Lepore
- Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Université de Montréal, Montreal, Canada
- Centre de Recherche CHU Sainte‐Justine, Montreal, Canada
48
Collignon O, Voss P, Lassonde M, Lepore F. Cross-modal plasticity for the spatial processing of sounds in visually deprived subjects. Exp Brain Res 2008; 192:343-58. [DOI: 10.1007/s00221-008-1553-z]
49
Subcortical auditory input to the primary visual cortex in anophthalmic mice. Neurosci Lett 2008; 433:129-34. [PMID: 18276073] [DOI: 10.1016/j.neulet.2008.01.003]
Abstract
Anatomical and imaging studies show ample evidence for auditory activation of the visual cortex following early onset of blindness in both humans and animal models. Anatomical studies in animal models of early blindness clearly show intermodal pathways through which auditory information can reach the primary visual cortex. There is clear evidence for intermodal corticocortical pathways linking auditory and visual cortex, and also for novel connections between the inferior colliculus and the visual thalamus. A recent publication [L.K. Laemle, N.L. Strominger, D.O. Carpenter, Cross-modal innervation of primary visual cortex by auditory fibers in congenitally anophthalmic mice, Neurosci. Lett. 396 (2006) 108-112] suggested the presence of a direct reciprocal connection between the inferior colliculus and the primary visual cortex (V1) in congenitally anophthalmic ZRDCT/An mice. This implies that this mutant mouse would be the only known vertebrate with a direct tectal connection to a primary sensory cortex. The presence of this peculiar pathway was reinvestigated in the ZRDCT/An mouse with highly sensitive neuronal tracers. We found the connections normally described in the ZRDCT/An mouse between (i) the inferior colliculus and the dorsal lateral geniculate nucleus, (ii) V1 and the superior colliculus, (iii) the lateral posterior nucleus and V1, and (iv) the inferior colliculus and the medial geniculate nucleus. We also show unambiguously that subcortical auditory structures do not project to the primary visual cortex in the anophthalmic mouse. In particular, we find no evidence of a direct projection from the auditory mesencephalon to the cortex in this animal model of blindness.