1. Snir A, Cieśla K, Vekslar R, Amedi A. Highly compromised auditory spatial perception in aided congenitally hearing-impaired and rapid improvement with tactile technology. iScience 2024; 27:110808. PMID: 39290844; PMCID: PMC11407022; DOI: 10.1016/j.isci.2024.110808.
Abstract
Spatial understanding is a multisensory construct, yet hearing is the only natural sense that enables simultaneous perception of the entire 3D space. To test whether such spatial understanding depends on auditory experience, we study congenitally hearing-impaired users of assistive devices. We apply an in-house technology which, inspired by the auditory system, performs intensity-weighting to represent external spatial positions and motion on the fingertips. We see highly impaired auditory spatial capabilities for tracking moving sources, which, in line with the "critical periods" theory, emphasizes the role of nature in sensory development. Meanwhile, for tactile and audio-tactile spatial motion perception, the hearing-impaired show performance similar to that of typically hearing individuals. The immediate availability of a 360° representation of external space through touch, despite the lack of such experience during the lifetime, points to the significant role of nurture in spatial perception development, and to its amodal character. The findings show promise toward advancing multisensory solutions for rehabilitation.
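The intensity-weighting idea in this abstract, encoding a source's external position by distributing vibration energy across several fingertip actuators, can be illustrated with a minimal sketch. The ring layout, actuator count, and linear panning rule below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def intensity_weights(azimuth_deg, n_actuators=8):
    """Pan a source azimuth across a ring of fingertip actuators.

    Intensity is split between the two actuators nearest to the source,
    so 360 degrees of external space map onto the skin. The linear
    panning rule and 8-actuator ring are illustrative assumptions.
    """
    centers = np.arange(n_actuators) * 360.0 / n_actuators
    spacing = 360.0 / n_actuators
    # shortest angular distance from the source to each actuator
    dist = np.abs((azimuth_deg - centers + 180.0) % 360.0 - 180.0)
    weights = np.clip(1.0 - dist / spacing, 0.0, 1.0)
    return weights / weights.sum()   # normalise total intensity to 1
```

With this rule, a source straight ahead drives only the front actuator, while a source half-way between two actuators splits the intensity equally between them.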
Affiliation(s)
- Adi Snir: The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8, Herzliya 461010, Israel
- Katarzyna Cieśla: The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8, Herzliya 461010, Israel; World Hearing Centre, Institute of Physiology and Pathology of Hearing, Mokra 17, 05-830 Kajetany, Nadarzyn, Poland
- Rotem Vekslar: The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8, Herzliya 461010, Israel
- Amir Amedi: The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8, Herzliya 461010, Israel
2. Aker SC, Faulkner KF, Innes-Brown H, Vatti M, Marozeau J. Some, but not all, cochlear implant users prefer music stimuli with congruent haptic stimulation. J Acoust Soc Am 2024; 155:3101-3117. PMID: 38722101; DOI: 10.1121/10.0025854.
Abstract
Cochlear implant (CI) users often report being unsatisfied by music listening through their hearing device. Vibrotactile stimulation could help alleviate those challenges. Previous research has shown that normal-hearing listeners give musical stimuli higher preference ratings when concurrent vibrotactile stimulation is congruent with the auditory signal in intensity and timing than when it is incongruent. However, it is not known whether this is also the case for CI users. Therefore, in this experiment, we presented 18 CI users and 24 normal-hearing listeners with five melodies and five different audio-to-tactile maps. Each map varied the congruence between the audio and tactile signals in intensity, fundamental frequency, and timing. Participants were asked to rate the maps from zero to 100 based on preference. Almost all normal-hearing listeners, as well as a subset of the CI users, preferred tactile stimulation that was congruent with the audio in intensity and timing. However, many CI users showed no preference between timing-aligned and timing-unaligned stimuli. The results provide evidence that vibrotactile music-enjoyment enhancement could be a solution for some CI users; however, more research is needed to understand which CI users can benefit from it most.
Affiliation(s)
- Scott C Aker: Music and CI Lab, Department of Health Technology, Technical University of Denmark, Kongens Lyngby, 1165, Denmark; Oticon A/S, Smørum, 2765, Denmark
- Hamish Innes-Brown: Eriksholm Research Centre, Snekkersten, 3070, Denmark; Hearing Systems Section, Department of Health Technology, Technical University of Denmark, Kongens Lyngby, 1165, Denmark
- Jeremy Marozeau: Music and CI Lab, Department of Health Technology, Technical University of Denmark, Kongens Lyngby, 1165, Denmark
3. Schulte A, Marozeau J, Ruhe A, Büchner A, Kral A, Innes-Brown H. Improved speech intelligibility in the presence of congruent vibrotactile speech input. Sci Rep 2023; 13:22657. PMID: 38114599; PMCID: PMC10730903; DOI: 10.1038/s41598-023-48893-w.
Abstract
Vibrotactile stimulation is believed to enhance auditory speech perception, offering potential benefits for cochlear implant (CI) users, who may rely on compensatory sensory strategies. Our study advances previous research by directly comparing tactile speech-intelligibility enhancements in normal-hearing (NH) and CI participants using the same paradigm. Moreover, we controlled for non-specific, excitatory effects of tactile stimulation with an incongruent audio-tactile control condition that did not contain any speech-relevant information. In addition to this incongruent audio-tactile condition, we presented sentences in an auditory-only and a congruent audio-tactile condition, with the congruent tactile stimulus providing low-frequency envelope information via a vibrating probe on the index fingertip. The study involved 23 NH listeners and 14 CI users. In both groups, significant tactile enhancements were observed for congruent tactile stimuli (5.3% for NH and 5.4% for CI participants), but not for incongruent tactile stimulation. These findings replicate previously observed tactile enhancement effects. Juxtaposing our study with previous research, the informational content of the tactile stimulus emerges as a modulator of intelligibility: in general, congruent stimuli enhanced, non-matching tactile stimuli reduced, and neutral stimuli did not change test outcomes. We conclude that the temporal cues provided by congruent vibrotactile stimuli may aid in parsing continuous speech into syllables and words, leading to the observed improvements in intelligibility.
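The congruent tactile stimulus described in this abstract carries the low-frequency speech envelope. One common way to derive such an envelope is sketched below, assuming a Hilbert-magnitude front end and an illustrative 50 Hz cutoff; the study's exact signal chain is not specified in the abstract.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def speech_envelope(signal, fs, cutoff_hz=50.0, order=4):
    """Extract a low-frequency amplitude envelope from a speech signal.

    The Hilbert magnitude gives the instantaneous amplitude, which is
    then low-pass filtered down to rates the skin can follow. The 50 Hz
    cutoff and filter order are illustrative choices, not the study's
    parameters.
    """
    amplitude = np.abs(hilbert(signal))            # instantaneous amplitude
    b, a = butter(order, cutoff_hz / (fs / 2.0), btype="low")
    return filtfilt(b, a, amplitude)               # smoothed envelope
```

The resulting envelope could then amplitude-modulate a fixed-frequency carrier driving a vibrotactile probe.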
Affiliation(s)
- Alina Schulte: Department of Experimental Otology of the Clinics of Otolaryngology, Hannover Medical School, Hannover, Germany; Eriksholm Research Center, Oticon A/S, Snekkersten, Denmark
- Jeremy Marozeau: Music and Cochlear Implants Lab, Department of Health Technology, Technical University of Denmark, Kongens Lyngby, Denmark
- Anna Ruhe: Department of Experimental Otology of the Clinics of Otolaryngology, Hannover Medical School, Hannover, Germany
- Andreas Büchner: Department of Experimental Otology of the Clinics of Otolaryngology, Hannover Medical School, Hannover, Germany
- Andrej Kral: Department of Experimental Otology of the Clinics of Otolaryngology, Hannover Medical School, Hannover, Germany
- Hamish Innes-Brown: Eriksholm Research Center, Oticon A/S, Snekkersten, Denmark; Hearing Systems Section, Department of Health Technology, Technical University of Denmark, Kongens Lyngby, Denmark
4. Baş B, Yücel E. Sensory profiles of children using cochlear implant and auditory brainstem implant. Int J Pediatr Otorhinolaryngol 2023; 170:111584. PMID: 37224736; DOI: 10.1016/j.ijporl.2023.111584.
Affiliation(s)
Affiliation(s)
- Banu Baş: Ankara Yıldırım Beyazıt University, Faculty of Health Sciences, Department of Audiology, Ankara, Turkey
- Esra Yücel: Hacettepe University, Faculty of Health Sciences, Department of Audiology, Ankara, Turkey
5. Murray CA, Shams L. Crossmodal interactions in human learning and memory. Front Hum Neurosci 2023; 17:1181760. PMID: 37266327; PMCID: PMC10229776; DOI: 10.3389/fnhum.2023.1181760.
Abstract
Most studies of memory and perceptual learning in humans have employed unisensory settings to simplify the study paradigm. However, in daily life we are often surrounded by complex and cluttered scenes made up of many objects and sources of sensory stimulation. Our experiences are, therefore, highly multisensory, both when passively observing the world and when acting and navigating. We argue that human learning and memory systems have evolved to operate under these multisensory and dynamic conditions. The nervous system exploits the rich array of sensory inputs in this process: it is sensitive to the relationship between the sensory inputs, continuously updates sensory representations, and encodes memory traces based on the relationship between the senses. We review some recent findings that demonstrate a range of human learning and memory phenomena in which interactions between the visual and auditory modalities play an important role, and suggest possible neural mechanisms that may underlie some surprising recent findings. We also outline open questions and directions for future research on human perceptual learning and memory.
Affiliation(s)
- Carolyn A. Murray: Department of Psychology, University of California, Los Angeles, Los Angeles, CA, United States
- Ladan Shams: Department of Psychology, University of California, Los Angeles, Los Angeles, CA, United States; Department of Bioengineering, Neuroscience Interdepartmental Program, University of California, Los Angeles, Los Angeles, CA, United States
6. Aker SC, Innes-Brown H, Faulkner KF, Vatti M, Marozeau J. Effect of audio-tactile congruence on vibrotactile music enhancement. J Acoust Soc Am 2022; 152:3396. PMID: 36586853; DOI: 10.1121/10.0016444.
Abstract
Music listening experiences can be enhanced with tactile vibrations. However, it is not known which parameters of the tactile vibration must be congruent with the music to enhance it. Devices that aim to enhance music with tactile vibrations often require coding an acoustic signal into a congruent vibrotactile signal, so understanding which of these audio-tactile congruences matter is crucial. Participants were presented with a simple sine-wave melody through supra-aural headphones and a haptic actuator held between the thumb and forefinger. Incongruent versions of the stimuli were made by randomizing physical parameters of the tactile stimulus independently of the auditory stimulus. Participants were instructed to rate the congruent stimuli against the incongruent stimuli based on preference. It was found that making the intensity of the tactile stimulus incongruent with the intensity of the auditory stimulus, as well as misaligning the two modalities in time, had the largest negative effect on ratings for the melody used. Future vibrotactile music enhancement devices can use time alignment and intensity congruence as a baseline coding strategy against which improved strategies can be tested.
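The baseline coding strategy suggested in this abstract, keeping tactile timing and intensity congruent with the audio while moving pitch into the skin's sensitive range, might be sketched as follows. The frequency ranges and the log-compression rule are illustrative assumptions, not the authors' mapping.

```python
import numpy as np

# Illustrative ranges: a melody span of A2-A5 mapped into a band where
# fingertip vibrotactile sensitivity is good (both are assumptions).
AUDIO_LO, AUDIO_HI = 110.0, 880.0
TACTILE_LO, TACTILE_HI = 50.0, 250.0

def map_note(f0_hz, amplitude, onset_s, duration_s, fs=2000):
    """Render one melody note as a congruent tactile pulse.

    Timing (onset) and intensity (peak amplitude) are copied from the
    audio note; the fundamental is log-compressed into the tactile band.
    Returns (onset_s, waveform).
    """
    octaves = np.log2(AUDIO_HI / AUDIO_LO)
    frac = np.clip(np.log2(f0_hz / AUDIO_LO) / octaves, 0.0, 1.0)
    f_tactile = TACTILE_LO * (TACTILE_HI / TACTILE_LO) ** frac
    t = np.arange(int(duration_s * fs)) / fs
    return onset_s, amplitude * np.sin(2 * np.pi * f_tactile * t)
```

Preserving onset and amplitude implements the timing and intensity congruence the study found most important, while the frequency mapping is left free, consistent with the finding that fundamental-frequency congruence mattered less.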
Affiliation(s)
- Scott C Aker: Music and Cochlear Implant Lab, Department of Health Technology, Technical University of Denmark, Kongens Lyngby, 2800, Denmark
- Jeremy Marozeau: Music and Cochlear Implant Lab, Department of Health Technology, Technical University of Denmark, Kongens Lyngby, 2800, Denmark
7. Lessons from infant learning for unsupervised machine learning. Nat Mach Intell 2022. DOI: 10.1038/s42256-022-00488-2.
8. Li H, Song L, Wang P, Weiss PH, Fink GR, Zhou X, Chen Q. Impaired body-centered sensorimotor transformations in congenitally deaf people. Brain Commun 2022; 4:fcac148. PMID: 35774184; PMCID: PMC9240416; DOI: 10.1093/braincomms/fcac148.
Abstract
Congenital deafness modifies an individual’s daily interaction with the environment and alters the fundamental perception of the external world. How congenital deafness shapes the interface between the internal and external worlds remains poorly understood. To interact efficiently with the external world, visuospatial representations of external target objects need to be effectively transformed into sensorimotor representations with reference to the body. Here, we tested the hypothesis that egocentric body-centred sensorimotor transformation is impaired in congenital deafness. Consistent with this hypothesis, we found that congenital deafness induced impairments in egocentric judgements, associating the external objects with the internal body. These impairments were due to deficient body-centred sensorimotor transformation per se, rather than the reduced fidelity of the visuospatial representations of the egocentric positions. At the neural level, we first replicated the previously well-documented critical involvement of the frontoparietal network in egocentric processing, in both congenitally deaf participants and hearing controls. However, both the strength of neural activity and the intra-network connectivity within the frontoparietal network alone could not account for egocentric performance variance. Instead, the inter-network connectivity between the task-positive frontoparietal network and the task-negative default-mode network was significantly correlated with egocentric performance: the more cross-talking between them, the worse the egocentric judgement. Accordingly, the impaired egocentric performance in the deaf group was related to increased inter-network connectivity between the frontoparietal network and the default-mode network and decreased intra-network connectivity within the default-mode network. 
The altered neural network dynamics in congenital deafness were observed for both evoked neural activity during egocentric processing and intrinsic neural activity during rest. Our findings thus not only demonstrate the optimal network configurations between the task-positive and task-negative neural networks underlying coherent body-centred sensorimotor transformations, but also unravel a critical cause (i.e., impaired body-centred sensorimotor transformation) of a variety of hitherto unexplained difficulties that the deaf population experiences with sensory-guided movements in daily life.
Affiliation(s)
- Hui Li: Key Laboratory of Brain, Cognition and Education Sciences, Ministry of Education, China; School of Psychology, Center for Studies of Psychological Application, and Guangdong Key Laboratory of Mental Health and Cognitive Science, South China Normal University, China
- Li Song: Key Laboratory of Brain, Cognition and Education Sciences, Ministry of Education, China; School of Psychology, Center for Studies of Psychological Application, and Guangdong Key Laboratory of Mental Health and Cognitive Science, South China Normal University, China
- Pengfei Wang: Key Laboratory of Brain, Cognition and Education Sciences, Ministry of Education, China; School of Psychology, Center for Studies of Psychological Application, and Guangdong Key Laboratory of Mental Health and Cognitive Science, South China Normal University, China
- Peter H. Weiss: Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Research Centre Jülich, Wilhelm-Johnen-Strasse, 52428 Jülich, Germany; Department of Neurology, University Hospital Cologne, Cologne University, Cologne, Germany
- Gereon R. Fink: Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Research Centre Jülich, Wilhelm-Johnen-Strasse, 52428 Jülich, Germany; Department of Neurology, University Hospital Cologne, Cologne University, Cologne, Germany
- Xiaolin Zhou: Shanghai Key Laboratory of Mental Health and Psychological Crisis Intervention, School of Psychology and Cognitive Science, East China Normal University, 200062 Shanghai, China
- Qi Chen: Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Research Centre Jülich, Wilhelm-Johnen-Strasse, 52428 Jülich, Germany; Key Laboratory of Brain, Cognition and Education Sciences, Ministry of Education, China; School of Psychology, Center for Studies of Psychological Application, and Guangdong Key Laboratory of Mental Health and Cognitive Science, South China Normal University, China
9. Audiovisual Integration for Saccade and Vergence Eye Movements Increases with Presbycusis and Loss of Selective Attention on the Stroop Test. Brain Sci 2022; 12:591. PMID: 35624979; PMCID: PMC9139407; DOI: 10.3390/brainsci12050591.
Abstract
Multisensory integration is a capacity that allows us to merge information from different sensory modalities in order to improve the salience of the signal. Audiovisual integration is one of the most common forms of multisensory integration, as vision and hearing are the two senses humans use most frequently. However, the literature on the effects of age-related hearing loss (presbycusis) on audiovisual integration abilities is almost nonexistent, despite the growing prevalence of presbycusis in the population. In that context, this study aims to assess the relationship between presbycusis and audiovisual integration using tests of saccade and vergence eye movements to visual vs. audiovisual targets, with a pure tone as the auditory signal. Tests were run with the REMOBI and AIDEAL technologies coupled with the Pupil Core eye tracker. Hearing abilities, eye-movement characteristics (latency, peak velocity, average velocity, amplitude) for saccade and vergence eye movements, and the Stroop Victoria test were measured in 69 elderly and 30 young participants. The results indicated (i) a dual pattern of aging effects on audiovisual integration for convergence (a decrease in the aged group relative to the young one, but an increase with age within the elderly group) and (ii) an improvement of audiovisual integration for saccades in people with presbycusis associated with lower selective-attention scores on the Stroop test, regardless of age. These results bring new insight into an under-explored topic, that of audio-visuomotor integration in normal aging and in presbycusis. They highlight the potential interest of using eye-movement targets in 3D space and pure-tone sounds to objectively evaluate audio-visuomotor integration capacities.
10. Fletcher MD. Can Haptic Stimulation Enhance Music Perception in Hearing-Impaired Listeners? Front Neurosci 2021; 15:723877. PMID: 34531717; PMCID: PMC8439542; DOI: 10.3389/fnins.2021.723877.
Abstract
Cochlear implants (CIs) have been remarkably successful at restoring hearing in severely-to-profoundly hearing-impaired individuals. However, users often struggle to deconstruct complex auditory scenes with multiple simultaneous sounds, which can result in reduced music enjoyment and impaired speech understanding in background noise. Hearing aid users often have similar issues, though these are typically less acute. Several recent studies have shown that haptic stimulation can enhance CI listening by giving access to sound features that are poorly transmitted through the electrical CI signal. This “electro-haptic stimulation” improves melody recognition and pitch discrimination, as well as speech-in-noise performance and sound localization. The success of this approach suggests it could also enhance auditory perception in hearing-aid users and other hearing-impaired listeners. This review focuses on the use of haptic stimulation to enhance music perception in hearing-impaired listeners. Music is prevalent throughout everyday life, being critical to media such as film and video games, and often being central to events such as weddings and funerals. It represents the biggest challenge for signal processing, as it is typically an extremely complex acoustic signal, containing multiple simultaneous harmonic and inharmonic sounds. Signal-processing approaches developed for enhancing music perception could therefore have significant utility for other key issues faced by hearing-impaired listeners, such as understanding speech in noisy environments. This review first discusses the limits of music perception in hearing-impaired listeners and the limits of the tactile system. It then discusses the evidence around integration of audio and haptic stimulation in the brain. Next, the features, suitability, and success of current haptic devices for enhancing music perception are reviewed, as well as the signal-processing approaches that could be deployed in future haptic devices. 
Finally, the cutting-edge technologies that could be exploited for enhancing music perception with haptics are discussed. These include the latest micro motor and driver technology, low-power wireless technology, machine learning, big data, and cloud computing. New approaches for enhancing music perception in hearing-impaired listeners could substantially improve quality of life. Furthermore, effective haptic techniques for providing complex sound information could offer a non-invasive, affordable means for enhancing listening more broadly in hearing-impaired individuals.
Affiliation(s)
- Mark D Fletcher: University of Southampton Auditory Implant Service, Faculty of Engineering and Physical Sciences, University of Southampton, Southampton, United Kingdom; Institute of Sound and Vibration Research, Faculty of Engineering and Physical Sciences, University of Southampton, Southampton, United Kingdom
11. Fletcher MD, Verschuur CA. Electro-Haptic Stimulation: A New Approach for Improving Cochlear-Implant Listening. Front Neurosci 2021; 15:581414. PMID: 34177440; PMCID: PMC8219940; DOI: 10.3389/fnins.2021.581414.
Abstract
Cochlear implants (CIs) have been remarkably successful at restoring speech perception for severely to profoundly deaf individuals. Despite their success, several limitations remain, particularly in CI users' ability to understand speech in noisy environments, locate sound sources, and enjoy music. A new multimodal approach has been proposed that uses haptic stimulation to provide sound information that is poorly transmitted by the implant. This augmenting of the electrical CI signal with haptic stimulation (electro-haptic stimulation; EHS) has been shown to improve speech-in-noise performance and sound localization in CI users. There is also evidence that it could enhance music perception. We review the evidence of EHS enhancement of CI listening and discuss key areas where further research is required. These include understanding the neural basis of EHS enhancement, understanding the effectiveness of EHS across different clinical populations, and the optimization of signal-processing strategies. We also discuss the significant potential for a new generation of haptic neuroprosthetic devices to aid those who cannot access hearing-assistive technology, either because of biomedical or healthcare-access issues. While significant further research and development is required, we conclude that EHS represents a promising new approach that could, in the near future, offer a non-invasive, inexpensive means of substantially improving clinical outcomes for hearing-impaired individuals.
Affiliation(s)
- Mark D. Fletcher: Faculty of Engineering and Physical Sciences, University of Southampton Auditory Implant Service, University of Southampton, Southampton, United Kingdom; Faculty of Engineering and Physical Sciences, Institute of Sound and Vibration Research, University of Southampton, Southampton, United Kingdom
- Carl A. Verschuur: Faculty of Engineering and Physical Sciences, University of Southampton Auditory Implant Service, University of Southampton, Southampton, United Kingdom
12. Scurry AN, Chifamba K, Jiang F. Electrophysiological Dynamics of Visual-Tactile Temporal Order Perception in Early Deaf Adults. Front Neurosci 2020; 14:544472. PMID: 33071731; PMCID: PMC7539666; DOI: 10.3389/fnins.2020.544472.
Abstract
Studies of compensatory plasticity in early deaf (ED) individuals have mainly focused on unisensory processing, and on spatial rather than temporal coding. However, precise discrimination of the temporal relationship between stimuli is imperative for successful perception of, and interaction with, the complex, multimodal environment. Although the properties of cross-modal temporal processing have been extensively studied in neurotypical populations, remarkably little is known about how the loss of one sense impacts the integrity of temporal interactions among the remaining senses. To understand how auditory deprivation affects multisensory temporal interactions, ED adults and age-matched normal-hearing (NH) controls performed a visual-tactile temporal-order judgment task in which visual and tactile stimuli were separated by varying stimulus onset asynchronies (SOAs) and subjects had to discern the leading stimulus. Participants performed the task while EEG data were recorded, and group-averaged event-related potential waveforms were compared between groups at occipital and fronto-central electrodes. Despite similar temporal-order sensitivities and performance accuracy, ED adults had larger visual P100 amplitudes at all SOA levels and larger tactile N140 amplitudes at the shortest asynchronous (±30 ms) and synchronous SOA levels. The enhanced signal strength reflected in these components in ED adults is discussed in terms of compensatory recruitment of cortical areas for visual-tactile processing. In addition, ED adults had tactile P200 amplitudes similar to those of NH controls but longer P200 latencies, suggesting reduced efficiency in later processing of tactile information. Overall, these results suggest that greater early responses to visual and tactile signals in ED adults are likely critical for their maintained performance in visual-tactile temporal-order discrimination.
Affiliation(s)
- Alexandra N Scurry: Department of Psychology, University of Nevada, Reno, Reno, NV, United States
- Kudzai Chifamba: Department of Psychology, University of Nevada, Reno, Reno, NV, United States
- Fang Jiang: Department of Psychology, University of Nevada, Reno, Reno, NV, United States
13. Alkhamra RA, Abu-Dahab SMN. Sensory processing disorders in children with hearing impairment: Implications for multidisciplinary approach and early intervention. Int J Pediatr Otorhinolaryngol 2020; 136:110154. PMID: 32521420; DOI: 10.1016/j.ijporl.2020.110154.
Abstract
OBJECTIVE: To explore the differences in sensory processing between children with hearing impairments and children with normal hearing, and the variables that influence sensory processing disorder (SPD).
METHODS: The sensory processing abilities of 90 children were compared in three age-matched groups of 30: with cochlear implants (CIs), with hearing aids (HAs), and with normal hearing (NH). The Arabic Sensory Profile (Arabic_SP) was used.
RESULTS: Findings were analyzed at the Arabic_SP section and factor levels. Sections: the NH group performed better (p < .05) than the CI group in 57% of the sections and better than the HA group in 14%. The CI group exhibited more signs of SPD than the HA group in vestibular processing, multisensory processing, and emotional-social responses. Factors: the NH group differed from the CI group on all the factors that reached significance, and from the HA group on inattention/distractibility and poor registration. There were marked differences between the CI and HA groups on all the factors except poor registration and fine motor/perceptual. The hearing loss variables that most affected Arabic_SP results were the age at receiving a hearing device and the type of hearing loss onset.
CONCLUSION: Along with speech and language problems, children with hearing impairment are especially vulnerable to SPD. Children with CIs and HAs are particularly susceptible to auditory processing disorders, and children with CIs face higher risks of balance, multisensory processing, social-emotional, and fine motor problems. SPD risk increased with a higher age at implantation. The findings indicate the importance of a multidisciplinary approach for early detection of and intervention for SPD in children with hearing impairment, especially those with CIs.
Affiliation(s)
- Rana A Alkhamra: Department of Hearing and Speech Sciences, Faculty of Rehabilitation Sciences, The University of Jordan, Amman, 11942, Jordan
- Sana M N Abu-Dahab: Department of Occupational Therapy, Faculty of Rehabilitation Sciences, The University of Jordan, Amman, 11942, Jordan
14
|
Sharp A, Bacon BA, Champoux F. Enhanced tactile identification of musical emotion in the deaf. Exp Brain Res 2020; 238:1229-1236. [DOI: 10.1007/s00221-020-05789-9] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/21/2020] [Accepted: 03/18/2020] [Indexed: 10/24/2022]
15
Improved tactile frequency discrimination in musicians. Exp Brain Res 2019; 237:1575-1580. [PMID: 30927044] [DOI: 10.1007/s00221-019-05532-z]
Abstract
Music practice is a form of multisensory training that is of great interest to neuroscientists because of its implications for neural plasticity. Music-related modulation of sensory systems has been observed in neuroimaging data, and has been supported by results in behavioral tasks. Some studies have shown that musicians react faster than non-musicians to visual, tactile and auditory stimuli. Behavioral enhancement in more complex tasks has received considerably less attention in musicians. This study aims to investigate unisensory and multisensory discrimination capabilities in musicians. More specifically, the goal of this study is to examine auditory, tactile and auditory-tactile discrimination in musicians. The literature suggesting better auditory and auditory-tactile discrimination in musicians is scarce, and no study to date has examined pure tactile discrimination capabilities in musicians. A two-alternative forced-choice frequency discrimination task was used in this experiment. The task was inspired by musical production, and participants were asked to identify whether a frequency was the same as or different from a standard stimulus of 160 Hz in three conditions: auditory only, auditory-tactile, and tactile only. Three waveforms were used to replicate the variability of pitch that can be found in music. Stimuli were presented through headphones for auditory stimulation and a glove with haptic audio exciters for tactile stimulation. Results suggest that musicians have lower discrimination thresholds than non-musicians for the auditory-only and auditory-tactile conditions for all waveforms. The results also revealed that musicians have lower discrimination thresholds than non-musicians in the tactile condition for sine and square waveforms. Taken together, these results support the hypothesis that musical training can lead to better unisensory tactile discrimination, which is in itself a new and major finding.
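The same/different frequency-discrimination procedure described in this abstract can be sketched in code. The following is a minimal, hypothetical Python sketch: only the 160 Hz standard and the sine/square waveforms come from the abstract, while the stimulus duration, sampling rate, and function names are assumptions for illustration.

```python
import math

def make_tone(freq_hz, duration_s=0.5, rate=44100, waveform="sine"):
    """Synthesize one stimulus; duration and sampling rate are illustrative assumptions."""
    n = int(duration_s * rate)
    samples = []
    for i in range(n):
        phase = 2.0 * math.pi * freq_hz * i / rate
        if waveform == "sine":
            samples.append(math.sin(phase))
        elif waveform == "square":
            # One simple square-wave construction: the sign of the sine at each sample.
            samples.append(1.0 if math.sin(phase) >= 0.0 else -1.0)
        else:
            raise ValueError("unknown waveform: " + waveform)
    return samples

def make_trial(comparison_hz, standard_hz=160.0, waveform="sine"):
    """One same/different trial: (standard, comparison) stimuli plus the correct response."""
    stimuli = (make_tone(standard_hz, waveform=waveform),
               make_tone(comparison_hz, waveform=waveform))
    correct = "same" if comparison_hz == standard_hz else "different"
    return stimuli, correct
```

In the actual experiment the stimuli were delivered over headphones and through a haptic glove; the sketch covers only stimulus generation and trial bookkeeping, not presentation or threshold estimation.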
16
Stevenson RA, Sheffield SW, Butera IM, Gifford RH, Wallace MT. Multisensory Integration in Cochlear Implant Recipients. Ear Hear 2018; 38:521-538. [PMID: 28399064] [DOI: 10.1097/aud.0000000000000435]
Abstract
Speech perception is inherently a multisensory process involving integration of auditory and visual cues. Multisensory integration in cochlear implant (CI) recipients is a unique circumstance in that the integration occurs after auditory deprivation and the provision of hearing via the CI. Despite the clear importance of multisensory cues for perception in general, and for speech intelligibility specifically, the topic of multisensory perceptual benefits in CI users has only recently begun to emerge as an area of inquiry. We review the research that has been conducted on multisensory integration in CI users to date and suggest a number of areas needing further research. The overall pattern of results indicates that many CI recipients show at least some perceptual gain that can be attributed to multisensory integration. The extent of this gain, however, varies based on a number of factors, including age of implantation and the specific task being assessed (e.g., stimulus detection, phoneme perception, word recognition). Although both children and adults with CIs obtain audiovisual benefits for phoneme, word, and sentence stimuli, neither group shows demonstrable gain for suprasegmental feature perception. Additionally, only early-implanted children and the highest-performing adults obtain audiovisual integration benefits similar to individuals with normal hearing. In children, increasing age of implantation is associated with poorer gains from audiovisual integration, suggesting both a sensitive period in development for the brain networks that subserve these integrative functions and an effect of length of auditory experience. This finding highlights the need for early detection of and intervention for hearing loss, not only in terms of auditory perception but also in terms of the behavioral and perceptual benefits of audiovisual processing. Importantly, patterns of auditory, visual, and audiovisual responses suggest that the underlying integrative processes may be fundamentally different between CI users and typical-hearing listeners. Future research, particularly in low-level processing tasks such as signal detection, will help to further assess the mechanisms of multisensory integration for individuals with hearing loss, both with and without CIs.
Affiliation(s)
- Ryan A Stevenson, Department of Psychology, University of Western Ontario, London, Ontario, Canada; Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada; Walter Reed National Military Medical Center, Audiology and Speech Pathology Center; Vanderbilt Brain Institute, Nashville, Tennessee; Vanderbilt Kennedy Center, Nashville, Tennessee; Department of Psychology, Vanderbilt University, Nashville, Tennessee; Department of Psychiatry, Vanderbilt University Medical Center, Nashville, Tennessee; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee
17
Schierholz I, Finke M, Kral A, Büchner A, Rach S, Lenarz T, Dengler R, Sandmann P. Auditory and audio-visual processing in patients with cochlear, auditory brainstem, and auditory midbrain implants: An EEG study. Hum Brain Mapp 2017; 38:2206-2225. [PMID: 28130910] [DOI: 10.1002/hbm.23515]
Abstract
There is substantial variability in speech recognition ability across patients with cochlear implants (CIs), auditory brainstem implants (ABIs), and auditory midbrain implants (AMIs). To better understand how this variability relates to differences in central processing, the current electroencephalography (EEG) study compared hearing abilities and auditory-cortex activation in patients with electrical stimulation at different sites of the auditory pathway. Three groups of patients with auditory implants (Hannover Medical School; ABI: n = 6, CI: n = 6, AMI: n = 2) performed a speeded response task and a speech recognition test with auditory, visual, and audio-visual stimuli. Behavioral performance and cortical processing of auditory and audio-visual stimuli were compared between groups. ABI and AMI patients showed prolonged response times to auditory and audio-visual stimuli compared with normal-hearing (NH) listeners and CI patients. This was confirmed by prolonged N1 latencies and reduced N1 amplitudes in ABI and AMI patients. However, patients with central auditory implants showed a remarkable gain in performance when visual and auditory input was combined, in both speech and non-speech conditions, which was reflected by a strong visual modulation of auditory-cortex activation in these individuals. In sum, the results suggest that the behavioral improvement for audio-visual conditions in central auditory implant patients is based on enhanced audio-visual interactions in the auditory cortex. These findings may have important implications for the optimization of electrical stimulation and rehabilitation strategies in patients with central auditory prostheses.
Affiliation(s)
- Irina Schierholz, Department of Neurology, Hannover Medical School, Hannover, Germany; Cluster of Excellence "Hearing4all", Hannover, Germany; Department of Otolaryngology, Hannover Medical School, Hannover, Germany
- Mareike Finke, Cluster of Excellence "Hearing4all", Hannover, Germany; Department of Otolaryngology, Hannover Medical School, Hannover, Germany
- Andrej Kral, Cluster of Excellence "Hearing4all", Hannover, Germany; Department of Otolaryngology, Hannover Medical School, Hannover, Germany; Institute of AudioNeuroTechnology and Department of Experimental Otology, Hannover Medical School, Hannover, Germany; School of Behavioral and Brain Sciences, The University of Texas at Dallas, Dallas, Texas
- Andreas Büchner, Cluster of Excellence "Hearing4all", Hannover, Germany; Department of Otolaryngology, Hannover Medical School, Hannover, Germany
- Stefan Rach, Department of Epidemiological Methods and Etiological Research, Leibniz Institute for Prevention Research and Epidemiology - BIPS, Bremen, Germany
- Thomas Lenarz, Cluster of Excellence "Hearing4all", Hannover, Germany; Department of Otolaryngology, Hannover Medical School, Hannover, Germany
- Reinhard Dengler, Department of Neurology, Hannover Medical School, Hannover, Germany; Cluster of Excellence "Hearing4all", Hannover, Germany
- Pascale Sandmann, Department of Neurology, Hannover Medical School, Hannover, Germany; Cluster of Excellence "Hearing4all", Hannover, Germany; Department of Otorhinolaryngology, University Hospital Cologne, Cologne, Germany
18
Stropahl M, Chen LC, Debener S. Cortical reorganization in postlingually deaf cochlear implant users: Intra-modal and cross-modal considerations. Hear Res 2017; 343:128-137. [DOI: 10.1016/j.heares.2016.07.005]
19
Landry SP, Sharp A, Pagé S, Champoux F. Temporal and spectral audiotactile interactions in musicians. Exp Brain Res 2016; 235:525-532. [PMID: 27803971] [DOI: 10.1007/s00221-016-4813-3]
Abstract
Previous investigations have revealed that the complex sensory exposure of musical training alters audiovisual interactions. To date, there has been little evidence on the effects of musical training on audiotactile interactions at a behavioural level. Here, we tested audiotactile interaction in musicians using the audiotactile illusory flash and the parchment-skin illusion. Significant differences were found between musicians and non-musicians only for the audiotactile illusory flash. Both groups had similar task-relevant unisensory abilities, but unlike in non-musicians, the number of auditory stimulations did not have a statistically significant influence on the number of perceived tactile stimulations in musicians. Musicians and non-musicians perceived the parchment-skin illusion similarly: spectral alterations of self-generated palmar sounds altered the perception of wetness and dryness to the same degree in both groups. These results suggest that musical training does not appear to alter multisensory interactions in general. The specificity of the sensory enhancement suggests that musical training specifically alters processes underlying the interaction of temporal audiotactile stimuli rather than the global interaction between these modalities. These results are consistent with previous unisensory and multisensory investigations of sensory abilities related to audiotactile processing in musicians.
Affiliation(s)
- Simon P Landry, Faculté de médecine, École d'orthophonie et d'audiologie, Université de Montréal, C.P. 6128, Succursale Centre-Ville, Montreal, QC, H3C 3J7, Canada
- Andréanne Sharp, Faculté de médecine, École d'orthophonie et d'audiologie, Université de Montréal, C.P. 6128, Succursale Centre-Ville, Montreal, QC, H3C 3J7, Canada
- Sara Pagé, Faculté de médecine, École d'orthophonie et d'audiologie, Université de Montréal, C.P. 6128, Succursale Centre-Ville, Montreal, QC, H3C 3J7, Canada
- François Champoux, Faculté de médecine, École d'orthophonie et d'audiologie, Université de Montréal, C.P. 6128, Succursale Centre-Ville, Montreal, QC, H3C 3J7, Canada
20
Body Perception and Action Following Deafness. Neural Plast 2016; 2016:5260671. [PMID: 26881115] [PMCID: PMC4737455] [DOI: 10.1155/2016/5260671]
Abstract
The effect of deafness on sensory abilities has been the topic of extensive investigation over the past decades. These investigations have mostly focused on visual capacities. We are only now starting to investigate how the deaf experience their own bodies and body-related abilities. Indeed, a growing corpus of research suggests that auditory input could play an important role in body-related processing. Deafness could therefore disturb such processes. It has also been suggested that many unexplained daily difficulties experienced by the deaf could be related to deficits in this underexplored field. In the present review, we propose an overview of the current state of knowledge on the effects of deafness on body-related processing.
21
Schierholz I, Finke M, Schulte S, Hauthal N, Kantzke C, Rach S, Büchner A, Dengler R, Sandmann P. Enhanced audio-visual interactions in the auditory cortex of elderly cochlear-implant users. Hear Res 2015; 328:133-147. [DOI: 10.1016/j.heares.2015.08.009]
22
Wu C, Stefanescu RA, Martel DT, Shore SE. Listening to another sense: somatosensory integration in the auditory system. Cell Tissue Res 2015; 361:233-250. [PMID: 25526698] [PMCID: PMC4475675] [DOI: 10.1007/s00441-014-2074-7]
Abstract
Conventionally, sensory systems are viewed as separate entities, each with its own physiological process serving a different purpose. However, many functions require integrative inputs from multiple sensory systems, and sensory intersection and convergence occur throughout the central nervous system. The neural processes for hearing perception undergo significant modulation by the two other major sensory systems, vision and somatosensation. This synthesis occurs at every level of the ascending auditory pathway: the cochlear nucleus, inferior colliculus, medial geniculate body and the auditory cortex. In this review, we explore the process of multisensory integration from (1) anatomical (inputs and connections), (2) physiological (cellular responses), (3) functional and (4) pathological aspects. We focus on the convergence between auditory and somatosensory inputs in each ascending auditory station. This review highlights the intricacy of sensory processing and offers a multisensory perspective regarding the understanding of sensory disorders.
Affiliation(s)
- Calvin Wu, Department of Otolaryngology, Kresge Hearing Research Institute, University of Michigan, Ann Arbor, MI, 48109, USA
23
Hauthal N, Debener S, Rach S, Sandmann P, Thorne JD. Visuo-tactile interactions in the congenitally deaf: a behavioral and event-related potential study. Front Integr Neurosci 2015; 8:98. [PMID: 25653602] [PMCID: PMC4300915] [DOI: 10.3389/fnint.2014.00098]
Abstract
Auditory deprivation is known to be accompanied by alterations in visual processing. Yet not much is known about tactile processing and the interplay of the intact sensory modalities in the deaf. We presented visual, tactile, and visuo-tactile stimuli to congenitally deaf and hearing individuals in a speeded detection task. Analyses of multisensory responses showed a redundant signals effect that was attributable to a coactivation mechanism in both groups, although the redundancy gain was less in the deaf. In line with these behavioral results, on a neural level, there were multisensory interactions in both groups that were again weaker in the deaf. In hearing but not deaf participants, somatosensory event-related potential N200 latencies were modulated by simultaneous visual stimulation. A comparison of unisensory responses between groups revealed larger N200 amplitudes for visual and shorter N200 latencies for tactile stimuli in the deaf. Furthermore, P300 amplitudes were also larger in the deaf. This group difference was significant for tactile and approached significance for visual targets. The differences in visual and tactile processing between deaf and hearing participants, however, were not reflected in behavior. Both the behavioral and electroencephalography (EEG) results suggest more pronounced multisensory interaction in hearing than in deaf individuals. Visuo-tactile enhancements could not be explained by perceptual deficiency, but could be partly attributable to inverse effectiveness.
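The coactivation analysis behind the redundant signals effect reported above is commonly done by testing Miller's race-model inequality: if the cumulative reaction-time distribution for bimodal stimuli exceeds the sum of the two unisensory distributions at some latency, a race between independent channels cannot explain the speedup. A minimal Python sketch with made-up reaction times (the probe latencies and data are illustrative, not the study's):

```python
def ecdf(rts, t):
    """Empirical cumulative probability P(RT <= t) for one condition's reaction times."""
    return sum(rt <= t for rt in rts) / len(rts)

def race_model_violations(rt_a, rt_b, rt_ab, probe_times):
    """Latencies where the bimodal CDF exceeds Miller's bound ecdf_a(t) + ecdf_b(t)."""
    violated = []
    for t in probe_times:
        bound = min(1.0, ecdf(rt_a, t) + ecdf(rt_b, t))
        if ecdf(rt_ab, t) > bound:
            violated.append(t)
    return violated

# Illustrative data: bimodal responses faster than either unimodal condition.
rt_visual = [300.0, 320.0, 340.0]
rt_tactile = [310.0, 330.0, 350.0]
rt_bimodal = [200.0, 210.0, 220.0]
print(race_model_violations(rt_visual, rt_tactile, rt_bimodal, [250.0, 400.0]))
# [250.0] -> the inequality is violated at 250 ms, consistent with coactivation
```

In practice the inequality is evaluated over quantiles of the pooled distributions with a statistical test across participants; the sketch shows only the per-latency comparison.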
Affiliation(s)
- Nadine Hauthal, Neuropsychology Lab, Department of Psychology, Cluster of Excellence "Hearing4all", European Medical School, University of Oldenburg, Oldenburg, Germany
- Stefan Debener, Neuropsychology Lab, Department of Psychology, Cluster of Excellence "Hearing4all", European Medical School, University of Oldenburg, Oldenburg, Germany; Research Center Neurosensory Science, University of Oldenburg, Oldenburg, Germany
- Stefan Rach, Research Center Neurosensory Science, University of Oldenburg, Oldenburg, Germany; Experimental Psychology Lab, Department of Psychology, European Medical School, University of Oldenburg, Oldenburg, Germany; Department of Epidemiological Methods and Etiologic Research, Leibniz Institute for Prevention Research and Epidemiology - BIPS, Bremen, Germany
- Pascale Sandmann, Neuropsychology Lab, Department of Psychology, Cluster of Excellence "Hearing4all", European Medical School, University of Oldenburg, Oldenburg, Germany; Department of Neurology, Cluster of Excellence "Hearing4all", Hannover Medical School, Hannover, Germany
- Jeremy D Thorne, Neuropsychology Lab, Department of Psychology, Cluster of Excellence "Hearing4all", European Medical School, University of Oldenburg, Oldenburg, Germany
24
Age-related hearing loss increases cross-modal distractibility. Hear Res 2014; 316:28-36. [DOI: 10.1016/j.heares.2014.07.005]
25
Nava E, Bottari D, Villwock A, Fengler I, Büchner A, Lenarz T, Röder B. Audio-tactile integration in congenitally and late deaf cochlear implant users. PLoS One 2014; 9:e99606. [PMID: 24918766] [PMCID: PMC4053428] [DOI: 10.1371/journal.pone.0099606]
Abstract
Several studies conducted in mammals and humans have shown that multisensory processing may be impaired following congenital sensory loss, in particular when no sensory experience is acquired within specific early developmental time windows known as sensitive periods. In this study we investigated whether basic multisensory abilities are impaired in hearing-restored individuals with deafness acquired at different stages of development. To this end, we tested congenitally and late deaf cochlear implant (CI) recipients, age-matched with two groups of hearing controls, on an audio-tactile redundancy paradigm in which reaction times to unimodal and crossmodal redundant signals were measured. Our results showed that both congenitally and late deaf CI recipients were able to integrate audio-tactile stimuli, suggesting that neither congenital nor acquired deafness prevents the development and recovery of basic multisensory processing. However, we found that congenitally deaf CI recipients had a lower multisensory gain compared to their matched controls, which may be explained by their faster responses to tactile stimuli. We discuss this finding in the context of reorganisation of the sensory systems following sensory loss and the possibility that these changes cannot be “rewired” through auditory reafferentation.
Affiliation(s)
- Elena Nava, University of Hamburg, Biological Psychology and Neuropsychology, Hamburg, Germany
- Davide Bottari, University of Hamburg, Biological Psychology and Neuropsychology, Hamburg, Germany
- Agnes Villwock, University of Hamburg, Biological Psychology and Neuropsychology, Hamburg, Germany
- Ineke Fengler, University of Hamburg, Biological Psychology and Neuropsychology, Hamburg, Germany
- Andreas Büchner, German Hearing Centre, Medical Clinic Hannover, Hannover, Germany
- Thomas Lenarz, German Hearing Centre, Medical Clinic Hannover, Hannover, Germany
- Brigitte Röder, University of Hamburg, Biological Psychology and Neuropsychology, Hamburg, Germany
26
Landry SP, Guillemot JP, Champoux F. Audiotactile interaction can change over time in cochlear implant users. Front Hum Neurosci 2014; 8:316. [PMID: 24904359] [PMCID: PMC4033126] [DOI: 10.3389/fnhum.2014.00316]
Abstract
Recent results suggest that audiotactile interactions are disturbed in cochlear implant (CI) users. However, further exploration of the factors responsible for such abnormal sensory processing is still required. Considering the temporal nature of a previously used multisensory task, it remains unclear whether the aberrant results were caused by the specificity of the interaction studied or instead reflect an overall abnormal interaction. Moreover, although duration of experience with a CI has often been linked with the recovery of auditory functions, its impact on multisensory performance remains uncertain. In the present study, we used the parchment-skin illusion, a robust illustration of sound-biased perception of touch based on changes in auditory frequencies, to investigate the specificities of audiotactile interactions in CI users. Whereas individuals with relatively little experience with the CI performed similarly to the control group, experienced CI users showed a significantly greater illusory percept. The overall results suggest that despite being able to ignore auditory distractors in a temporal audiotactile task, CI users come to be strongly influenced by auditory input in a spectral audiotactile task. Considered alongside the existing body of research, these results confirm that normal sensory interaction processing can be compromised in CI users.
Affiliation(s)
- Simon P Landry, Centre de Recherche en Neuropsychologie Expérimentale et Cognition, Université de Montréal, Montréal, QC, Canada; Département de Kinanthropologie, Université du Québec à Montréal, Montréal, QC, Canada
- Jean-Paul Guillemot, Centre de Recherche en Neuropsychologie Expérimentale et Cognition, Université de Montréal, Montréal, QC, Canada; Département de Kinanthropologie, Université du Québec à Montréal, Montréal, QC, Canada
- François Champoux, Centre de Recherche en Neuropsychologie Expérimentale et Cognition, Université de Montréal, Montréal, QC, Canada; Institut Raymond-Dewar, Centre de Recherche Interdisciplinaire en Réadaptation du Montréal Métropolitain, Montréal, QC, Canada; École d'Orthophonie et d'Audiologie, Faculté de Médecine, Université de Montréal, Montréal, QC, Canada