1.
Strelnikov K, Karoui C, Payoux P, Salabert AS, James C, Deguine O, Barone P, Marx M. Adaptive Strategies of Single-Sided Deaf Cochlear-Implant Users Revealed Through Resting State Activity: An Auditory PET Brain Imaging Study. Hear Res 2024; 451:109079. [PMID: 39053297 DOI: 10.1016/j.heares.2024.109079] [Received: 03/20/2024] [Revised: 06/25/2024] [Accepted: 07/10/2024]
Abstract
Brain plasticity refers to the brain's ability to reorganize its structure or function in response to experiences, learning, and environmental influences. This phenomenon is particularly significant in individuals with deafness, as the brain adapts to compensate for the lack of auditory stimulation. The aim of this study was to investigate whether cochlear implantation can restore a normal pattern of brain activation following auditory stimulation in cases of asymmetric hearing loss. We used PET imaging to assess brain activity after cochlear implantation, specifically during an auditory voice/non-voice discrimination task. The results indicated a nearly normal pattern of brain activity during the auditory discrimination task, except for increased activation in areas related to attentional processes compared to controls. Additionally, brain activity at rest showed significant changes in implanted participants, including cross-modal visuo-auditory processing. Therefore, cochlear implants can restore the brain's activation pattern through long-term adaptive adjustments in intrinsic brain activity.
Affiliation(s)
- K Strelnikov
- UMR 5549, Faculté de Médecine Purpan, Centre National de la Recherche Scientifique, Toulouse, France; Centre de Recherche Cerveau et Cognition, Université de Toulouse, Université Paul Sabatier, Toulouse, France; Centre for Cognitive and Brain Sciences, University of Macau, Taipa, Macau Special Administrative Regions of China.
- C Karoui
- UMR 5549, Faculté de Médecine Purpan, Centre National de la Recherche Scientifique, Toulouse, France; Centre de Recherche Cerveau et Cognition, Université de Toulouse, Université Paul Sabatier, Toulouse, France
- P Payoux
- Nuclear Medicine Department, Purpan University Hospital, Toulouse, France; ToNIC, Toulouse NeuroImaging Center, Université de Toulouse, Inserm, UPS, France
- A S Salabert
- Nuclear Medicine Department, Purpan University Hospital, Toulouse, France; ToNIC, Toulouse NeuroImaging Center, Université de Toulouse, Inserm, UPS, France
- C James
- Cochlear France SAS, Toulouse, France; Service d'Oto-Rhino-Laryngologie et Oto-Neurologie, CHU Toulouse France, Université Toulouse 3, France
- O Deguine
- UMR 5549, Faculté de Médecine Purpan, Centre National de la Recherche Scientifique, Toulouse, France; Centre de Recherche Cerveau et Cognition, Université de Toulouse, Université Paul Sabatier, Toulouse, France; Service d'Oto-Rhino-Laryngologie et Oto-Neurologie, CHU Toulouse France, Université Toulouse 3, France
- P Barone
- UMR 5549, Faculté de Médecine Purpan, Centre National de la Recherche Scientifique, Toulouse, France; Centre de Recherche Cerveau et Cognition, Université de Toulouse, Université Paul Sabatier, Toulouse, France
- M Marx
- UMR 5549, Faculté de Médecine Purpan, Centre National de la Recherche Scientifique, Toulouse, France; Centre de Recherche Cerveau et Cognition, Université de Toulouse, Université Paul Sabatier, Toulouse, France; Service d'Oto-Rhino-Laryngologie et Oto-Neurologie, CHU Toulouse France, Université Toulouse 3, France
2.
Kartheiser G, Cormier K, Bell-Souder D, Dye M, Sharma A. Neurocognitive outcomes in young adults with cochlear implants: The role of early language access and crossmodal plasticity. Hear Res 2024; 451:109074. [PMID: 39018768 DOI: 10.1016/j.heares.2024.109074] [Received: 01/31/2024] [Revised: 06/03/2024] [Accepted: 07/02/2024]
Abstract
Many children with profound hearing loss have received cochlear implants (CIs) to help restore some sense of hearing. There is, however, limited research on long-term neurocognitive outcomes in young adults who have grown up hearing through a CI. This study compared the cognitive outcomes of early-implanted (n = 20) and late-implanted (n = 21) young adult CI users and typically hearing (TH) controls (n = 56), all of whom were enrolled in college. Cognitive fluidity, nonverbal intelligence, and American Sign Language (ASL) comprehension were assessed, revealing no significant differences in cognition or nonverbal intelligence between the early- and late-implanted groups. However, the late-implanted group showed significantly higher ASL comprehension. Although young adult CI users scored significantly lower than TH age-matched controls on a working memory and processing speed task, there were no significant differences between the young adult CI and TH participants in tasks involving executive function shifting, inhibitory control, and episodic memory. In an exploratory analysis of a subset of CI participants (n = 17) in whom we were able to examine crossmodal plasticity, we saw greater evidence of crossmodal recruitment from the visual system in late-implanted compared with early-implanted CI young adults. However, cortical visual evoked potential latency biomarkers of crossmodal plasticity were not correlated with cognitive measures or ASL comprehension. The results suggest that in the late-implanted CI users, early access to sign language may have served as a scaffold for appropriate cognitive development, while in the early-implanted group early access to oral language benefited cognitive development. Furthermore, our results suggest that the persistence of crossmodal neuroplasticity into adulthood does not necessarily impact cognitive development. In conclusion, early access to language, spoken or signed, may be important for cognitive development, with no observable effect of crossmodal plasticity on cognitive outcomes.
Affiliation(s)
- Geo Kartheiser
- Rochester Institute of Technology, Rochester, NY, United States of America
- Kayla Cormier
- Department of Speech Language and Hearing Sciences, University of Colorado Boulder, Boulder, CO, United States of America
- Don Bell-Souder
- Department of Speech Language and Hearing Sciences, University of Colorado Boulder, Boulder, CO, United States of America
- Matthew Dye
- Rochester Institute of Technology, Rochester, NY, United States of America
- Anu Sharma
- Department of Speech Language and Hearing Sciences, University of Colorado Boulder, Boulder, CO, United States of America.
3.
Schormans AL, Allman BL. Layer-specific enhancement of visual-evoked activity in the audiovisual cortex following a mild degree of hearing loss in adult rats. Hear Res 2024; 450:109071. [PMID: 38941694 DOI: 10.1016/j.heares.2024.109071] [Received: 02/01/2024] [Revised: 06/12/2024] [Accepted: 06/17/2024]
Abstract
Following adult-onset hearing impairment, crossmodal plasticity can occur within various sensory cortices, often characterized by increased neural responses to visual stimulation not only in the auditory cortex, but also in the visual and audiovisual cortices. In the present study, we used an established model of loud noise exposure in rats to examine, for the first time, whether the crossmodal plasticity in the audiovisual cortex that occurs following a relatively mild degree of hearing loss emerges solely from altered intracortical processing or whether thalamocortical changes also contribute to the crossmodal effects. Using a combination of an established pharmacological 'cortical silencing' protocol and current source density analysis of the laminar activity recorded across the layers of the audiovisual cortex (i.e., the lateral extrastriate visual cortex, V2L), we observed layer-specific post-silencing changes in the strength of the residual visual, but not auditory, input in the noise-exposed rats with mild hearing loss compared to rats with normal hearing. Furthermore, based on a comparison of the laminar profiles pre- versus post-silencing in both groups, we conclude that noise exposure caused a re-allocation of the strength of visual inputs across the layers of the V2L cortex, including enhanced visual-evoked activity in the granular layer; these findings are consistent with thalamocortical plasticity. Finally, we confirmed that audiovisual integration within the V2L cortex depends on intact processing within intracortical circuits, and that this form of multisensory processing is vulnerable to disruption by noise-induced hearing loss. Ultimately, the present study furthers our understanding of the contribution of intracortical and thalamocortical processing to crossmodal plasticity and to audiovisual integration under both normal and mildly impaired hearing conditions.
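Current source density analysis of the kind described estimates laminar current sinks and sources, commonly approximated as the negative second spatial derivative of the LFP across equally spaced electrode contacts. A minimal illustrative sketch, not the authors' pipeline; the contact spacing and conductivity values here are assumed:

```python
def csd_profile(lfp, spacing_mm=0.1, sigma=0.3):
    """Estimate CSD as the negative second spatial derivative of the
    laminar LFP (one value per interior contact; the two outer contacts
    are dropped). `sigma` is an assumed tissue conductivity; units are
    illustrative only."""
    h2 = spacing_mm ** 2
    return [-sigma * (lfp[i - 1] - 2.0 * lfp[i] + lfp[i + 1]) / h2
            for i in range(1, len(lfp) - 1)]

# A negative LFP deflection at the middle contact appears as a current
# sink (negative CSD) flanked by positive sources.
profile = csd_profile([0.0, 0.0, -1.0, 0.0, 0.0])
```

In practice, boundary handling (e.g., duplicating the outermost channels) and spatial smoothing are typically added before interpreting sink/source patterns.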
Affiliation(s)
- Ashley L Schormans
- Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, 1151 Richmond St., London, ON N6A 5C1, Canada.
- Brian L Allman
- Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, 1151 Richmond St., London, ON N6A 5C1, Canada
4.
Yusuf PA, Hubka P, Konerding W, Land R, Tillein J, Kral A. Congenital deafness reduces alpha-gamma cross-frequency coupling in the auditory cortex. Hear Res 2024; 449:109032. [PMID: 38797035 DOI: 10.1016/j.heares.2024.109032] [Received: 01/15/2024] [Revised: 04/30/2024] [Accepted: 05/13/2024]
Abstract
Neurons within a neuronal network can be grouped by bottom-up and top-down influences using synchrony in neuronal oscillations, creating representations of perceptual objects from sensory features. Oscillatory activity can be differentiated into stimulus-phase-locked (evoked) and non-phase-locked (induced) components; the former is mainly determined by sensory input, the latter by higher-level (cortical) processing. Effects of auditory deprivation on cortical oscillations have been studied in congenitally deaf cats (CDCs) using cochlear implant (CI) stimulation. CI-induced alpha, beta, and gamma activity were compromised in the auditory cortex of CDCs. Furthermore, top-down information flow between secondary and primary auditory areas in hearing cats, conveyed by induced alpha oscillations, was lost in CDCs. Here we used the matching pursuit algorithm to assess components of such oscillatory activity in local field potentials recorded in primary field A1. In addition to the loss of induced alpha oscillations, we also found a loss of evoked theta activity in CDCs. The loss of theta and alpha activity in CDCs can be directly related to reduced high-frequency (gamma-band) activity through cross-frequency coupling. Here we quantified such cross-frequency coupling in adult 1) hearing-experienced, acoustically stimulated cats (aHCs); 2) hearing-experienced cats following acute pharmacological deafening and subsequent CI stimulation, i.e., electrically stimulated hearing cats (eHCs); and 3) electrically stimulated CDCs. We found significant cross-frequency coupling in all animal groups at > 70% of auditory-responsive sites. The predominant coupling in aHCs and eHCs was between theta/alpha phase and gamma power. In CDCs such coupling was lost and replaced by alpha oscillations coupling to delta/theta phase. Thus, alpha/theta oscillations synchronize high-frequency gamma activity only in hearing-experienced cats. The absence of induced alpha and theta oscillations contributes to the loss of induced gamma power in CDCs, signifying impaired local network activity.
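Cross-frequency coupling of this kind is commonly quantified as phase-amplitude coupling, for example via a mean-vector-length modulation index computed from a low-frequency phase series and a high-frequency amplitude envelope. A self-contained toy sketch on synthetic data; the study's actual analysis uses matching pursuit on cortical LFPs, and the band-pass filtering and envelope-extraction steps are omitted here:

```python
import cmath
import math
import random

def modulation_index(phase, amp):
    """Mean-vector-length coupling measure: |mean(A * exp(i*phi))| / mean(A).
    Near 0 when amplitude is unrelated to phase; approaches 1 when amplitude
    concentrates at a single preferred phase."""
    vec = sum(a * cmath.exp(1j * p) for p, a in zip(phase, amp)) / len(amp)
    return abs(vec) / (sum(amp) / len(amp))

# Synthetic example: a low-frequency ("theta") phase ramp, paired with a
# high-frequency ("gamma") amplitude that either follows the phase or not.
rng = random.Random(0)
n = 4000
theta_phase = [2 * math.pi * (i % 200) / 200 - math.pi for i in range(n)]
coupled_amp = [1.0 + 0.8 * math.cos(p) for p in theta_phase]  # peaks at phase 0
uncoupled_amp = [1.0 + 0.8 * (rng.random() - 0.5) for _ in range(n)]

mi_coupled = modulation_index(theta_phase, coupled_amp)
mi_uncoupled = modulation_index(theta_phase, uncoupled_amp)
```

With real recordings, the phase and amplitude series would come from band-pass filtered signals (e.g., via the analytic signal), and coupling strength is usually tested against surrogate data.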
Affiliation(s)
- Prasandhya A Yusuf
- Hannover Medical School, Institute of AudioNeuroTechnology and Department of Experimental Otology of the ENT Clinics, Hannover, Germany; Faculty of Medicine University of Indonesia, Department of Medical Physiology and Biophysics / Medical Technology IMERI, Jakarta, Indonesia.
- Peter Hubka
- Hannover Medical School, Institute of AudioNeuroTechnology and Department of Experimental Otology of the ENT Clinics, Hannover, Germany
- Wiebke Konerding
- Hannover Medical School, Institute of AudioNeuroTechnology and Department of Experimental Otology of the ENT Clinics, Hannover, Germany
- Rüdiger Land
- Hannover Medical School, Institute of AudioNeuroTechnology and Department of Experimental Otology of the ENT Clinics, Hannover, Germany
- Jochen Tillein
- J.W. Goethe University, Department of Otorhinolaryngology, Frankfurt am Main, Germany
- Andrej Kral
- Hannover Medical School, Institute of AudioNeuroTechnology and Department of Experimental Otology of the ENT Clinics, Hannover, Germany; Australian Hearing Hub, School of Medicine and Health Sciences, Macquarie University, Sydney, Australia
5.
Esmaelpoor J, Peng T, Jelfs B, Mao D, Shader MJ, McKay CM. Resting-State Functional Connectivity Predicts Cochlear-Implant Speech Outcomes. Ear Hear 2024:00003446-990000000-00313. [PMID: 39012793 DOI: 10.1097/aud.0000000000001564]
Abstract
OBJECTIVES Cochlear implants (CIs) have revolutionized hearing restoration for individuals with severe or profound hearing loss. However, a substantial and unexplained variability persists in CI outcomes, even when considering subject-specific factors such as age and the duration of deafness. In this study, we use resting-state functional near-infrared spectroscopy to predict speech-understanding outcomes before and after implantation. Our hypothesis is that resting-state functional connectivity (FC) reflects brain plasticity following hearing loss and implantation; we specifically target the average clustering coefficient in resting FC networks to capture variation among CI users. DESIGN Twenty-three CI candidates participated in this study. Resting-state functional near-infrared spectroscopy data were collected preimplantation and at 1 month, 3 months, and 1 year postimplantation. Speech-understanding performance was assessed using consonant-nucleus-consonant words in quiet and Bamford-Kowal-Bench sentences in noise 1 year postimplantation. Resting-state FC networks were constructed using regularized partial correlation, and the average clustering coefficient was measured in the signed weighted networks as a predictive measure for implantation outcomes. RESULTS Our findings demonstrate a significant correlation between the average clustering coefficient in resting-state functional networks, both pre- and postimplantation, and speech-understanding outcomes. CONCLUSIONS This approach uses an easily deployable resting-state functional brain imaging metric to predict speech-understanding outcomes in implant recipients.
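The predictive measure named here, the average clustering coefficient of a signed weighted network, can be computed in several ways; one common choice is the Onnela formula on normalized edge weights (in practice a graph library such as NetworkX provides `average_clustering`). The sketch below is an illustration under assumed conventions, not the authors' implementation; in particular, it handles signed weights by taking absolute values, which is only one of several options:

```python
def average_clustering(w):
    """Average weighted clustering coefficient (Onnela-style) of a network
    given as a symmetric weight matrix. Signed weights are reduced to
    absolute values (an assumed convention for this sketch)."""
    n = len(w)
    wmax = max(abs(w[i][j]) for i in range(n) for j in range(n) if i != j)
    wh = [[abs(w[i][j]) / wmax for j in range(n)] for i in range(n)]
    coeffs = []
    for i in range(n):
        nbrs = [j for j in range(n) if j != i and wh[i][j] > 0]
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        # Geometric mean of triangle edge weights, over ordered neighbor pairs.
        s = sum((wh[i][j] * wh[j][h] * wh[i][h]) ** (1.0 / 3.0)
                for j in nbrs for h in nbrs if h != j)
        coeffs.append(s / (k * (k - 1)))
    return sum(coeffs) / n

tri = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]                           # one full triangle
ring = [[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]   # 4-cycle, no triangles
```

On the fully connected unit-weight triangle the average clustering is 1, and on the triangle-free ring it is 0, which matches the intuition that the coefficient counts weighted closed triplets.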
Affiliation(s)
- Jamal Esmaelpoor
- Department of Medical Bionics, University of Melbourne, Melbourne, Australia
- The Bionics Institute of Australia, Melbourne, Australia
- Tommy Peng
- Department of Medical Bionics, University of Melbourne, Melbourne, Australia
- The Bionics Institute of Australia, Melbourne, Australia
- Beth Jelfs
- Department of Electronic, Electrical and Systems Engineering, University of Birmingham, Birmingham, United Kingdom
- Darren Mao
- Department of Medical Bionics, University of Melbourne, Melbourne, Australia
- The Bionics Institute of Australia, Melbourne, Australia
- Maureen J Shader
- Department of Speech, Language, and Hearing Sciences, Purdue University, West Lafayette, Indiana, USA
- Colette M McKay
- Department of Medical Bionics, University of Melbourne, Melbourne, Australia
- The Bionics Institute of Australia, Melbourne, Australia
6.
Yu Q, Li H, Li S, Tang P. Prosodic and Visual Cues Facilitate Irony Comprehension by Mandarin-Speaking Children With Cochlear Implants. J Speech Lang Hear Res 2024; 67:2172-2190. [PMID: 38820233 DOI: 10.1044/2024_jslhr-23-00701]
Abstract
PURPOSE This study investigated irony comprehension by Mandarin-speaking children with cochlear implants, focusing on how prosodic and visual cues contribute to their comprehension and on whether second-order Theory of Mind is required for using these cues. METHOD We tested 52 Mandarin-speaking children with cochlear implants (aged 3-7 years) and 52 age- and gender-matched children with normal hearing. All children completed a Theory of Mind test and a story comprehension test. Ironic stories were presented in three conditions, each providing different cues: (a) context only, (b) context and prosody, and (c) context, prosody, and visual cues. Accuracy of story understanding was compared across the three conditions to examine the role of prosodic and visual cues. RESULTS Compared to the context-only condition, the additional prosodic and visual cues both improved the accuracy of irony comprehension for children with cochlear implants, similar to their normal-hearing peers. Furthermore, such improvements were observed for all children, regardless of whether they passed the second-order Theory of Mind test. CONCLUSIONS This study is the first to demonstrate the benefits of prosodic and visual cues in irony comprehension, without reliance on second-order Theory of Mind, for Mandarin-speaking children with cochlear implants. These findings suggest that prosodic and visual cues could be leveraged in intervention strategies to promote irony comprehension.
Affiliation(s)
- Qianxi Yu
- School of Foreign Studies, Nanjing University of Science and Technology, China
- Honglan Li
- School of Foreign Studies, Nanjing University of Science and Technology, China
- Shanpeng Li
- School of Foreign Studies, Nanjing University of Science and Technology, China
- Ping Tang
- School of Foreign Studies, Nanjing University of Science and Technology, China
7.
Nematova S, Zinszer B, Morlet T, Morini G, Petitto LA, Jasińska KK. Impact of ASL Exposure on Spoken Phonemic Discrimination in Adult CI Users: A Functional Near-Infrared Spectroscopy Study. Neurobiol Lang 2024; 5:553-588. [PMID: 38939730 PMCID: PMC11210937 DOI: 10.1162/nol_a_00143] [Received: 06/21/2023] [Accepted: 03/11/2024]
Abstract
We examined the impact of exposure to a signed language (American Sign Language, or ASL) at different ages on the neural systems that support spoken-language phonemic discrimination in deaf individuals with cochlear implants (CIs). Deaf CI users (N = 18, age 18-24 years) who were exposed to a signed language at different ages and hearing individuals (N = 18, age 18-21 years) completed a phonemic discrimination task in a spoken native (English) and non-native (Hindi) language while undergoing functional near-infrared spectroscopy neuroimaging. Behaviorally, deaf CI users who received a CI early versus later in life showed better English phonemic discrimination, although discrimination was poor relative to that of hearing individuals. Importantly, the age of exposure to ASL was not related to phonemic discrimination. Neurally, early-life language exposure, irrespective of modality, was associated with greater activation of left-hemisphere language areas critically involved in phonological processing during the phonemic discrimination task in deaf CI users. In particular, early exposure to ASL was associated with increased activation in the left hemisphere's classic language regions for native versus non-native phonemic contrasts in deaf CI users who received a CI later in life. For deaf CI users who received a CI early in life, the age of exposure to ASL was not related to neural activation during phonemic discrimination. Together, the findings suggest that early signed-language exposure does not negatively impact spoken-language processing in deaf CI users, but may instead offset the negative effects of the language deprivation that deaf children without any signed-language exposure experience prior to implantation. This empirical evidence aligns with and lends support to recent perspectives on the impact of ASL exposure in the context of CI use.
Affiliation(s)
- Shakhlo Nematova
- Department of Linguistics and Cognitive Science, University of Delaware, Newark, DE, USA
- Benjamin Zinszer
- Department of Psychology, Swarthmore College, Swarthmore, PA, USA
- Thierry Morlet
- Nemours Children’s Hospital, Delaware, Wilmington, DE, USA
- Giovanna Morini
- Department of Communication Sciences and Disorders, University of Delaware, Newark, DE, USA
- Laura-Ann Petitto
- Brain and Language Center for Neuroimaging, Gallaudet University, Washington, DC, USA
- Kaja K. Jasińska
- Department of Applied Psychology and Human Development, University of Toronto, Toronto, Ontario, Canada
8.
Dhanik K, Pandey HR, Mishra M, Keshri A, Kumar U. Neural adaptations to congenital deafness: enhanced tactile discrimination through cross-modal neural plasticity - an fMRI study. Neurol Sci 2024:10.1007/s10072-024-07615-4. [PMID: 38797764 DOI: 10.1007/s10072-024-07615-4] [Received: 02/15/2024] [Accepted: 05/22/2024]
Abstract
BACKGROUND This study explores the compensatory neural mechanisms associated with congenital deafness through an examination of tactile discrimination abilities using high-resolution functional magnetic resonance imaging (fMRI). OBJECTIVE To analyze the neural substrates underlying tactile processing in congenitally deaf individuals and compare them with hearing controls. METHODS Our participant pool included thirty-five congenitally deaf individuals and thirty-five hearing controls. All participants performed tactile discrimination tasks involving the identification of common objects by touch. We utilized an analytical suite comprising voxel-based statistics, functional connectivity multivariate/voxel pattern analysis (fc-MVPA), and seed-based connectivity analysis to examine neural activity. RESULTS Our findings revealed pronounced neural activity in congenitally deaf participants within regions typically associated with auditory processing, including the bilateral superior temporal gyrus, right middle temporal gyrus, and right rolandic operculum. Additionally, unique activation and connectivity patterns were observed in the right insula and bilateral supramarginal gyrus, indicating a strategic reorganization of neural pathways for tactile information processing. Behaviorally, both groups demonstrated high accuracy in the tactile tasks, exceeding 90%. However, the deaf participants outperformed their hearing counterparts in reaction time, showing significantly enhanced efficiency in tactile information processing. CONCLUSION These insights into the brain's adaptability to sensory loss through compensatory neural reorganization highlight the mechanisms by which tactile discrimination is enhanced in the absence of auditory input. Understanding these adaptations can help develop strategies to harness the brain's plasticity to improve sensory processing in individuals with sensory impairments, ultimately enhancing their quality of life through improved tactile perception and sensory integration.
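Seed-based connectivity analysis of the kind mentioned correlates a seed region's time course with every other voxel's time course, producing a correlation map that is often Fisher z-transformed before group statistics. A minimal sketch on synthetic time series; this is illustrative only, not the study's pipeline:

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def seed_map(seed, voxels, eps=1e-7):
    """Fisher-z connectivity of each voxel time series with the seed."""
    zs = []
    for v in voxels:
        r = max(min(pearson(seed, v), 1 - eps), -(1 - eps))  # clamp for atanh
        zs.append(math.atanh(r))
    return zs

seed = [math.sin(0.3 * t) for t in range(100)]
voxels = [
    [s for s in seed],                        # perfectly coupled voxel
    [-s for s in seed],                       # anti-correlated voxel
    [math.cos(1.1 * t) for t in range(100)],  # largely unrelated voxel
]
zmap = seed_map(seed, voxels)
```

Real analyses add preprocessing (motion correction, nuisance regression, filtering) before the correlation step and then test the z-maps voxelwise across participants.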
Affiliation(s)
- Kalpana Dhanik
- Centre of Bio-Medical Research, Sanjay Gandhi Postgraduate Institute of Medical Sciences Campus, Lucknow, Uttar Pradesh, 226014, India
- Himanshu R Pandey
- Centre of Bio-Medical Research, Sanjay Gandhi Postgraduate Institute of Medical Sciences Campus, Lucknow, Uttar Pradesh, 226014, India
- Mrutyunjaya Mishra
- Department of Special Education (Hearing Impairments), Dr. Shakuntala Misra National Rehabilitation University, Lucknow, India
- Amit Keshri
- Department of Neuro-otology, Sanjay Gandhi Postgraduate Institute of Medical Sciences Campus, Lucknow, India
- Uttam Kumar
- Centre of Bio-Medical Research, Sanjay Gandhi Postgraduate Institute of Medical Sciences Campus, Lucknow, Uttar Pradesh, 226014, India.
9.
Wischmann S, Kamper NR, Jantzen L, Hammer L, Reipur DB, Serafin S, Percy-Smith L. Explaining neurological factors of hearing loss through digital technologies. Int J Pediatr Otorhinolaryngol 2024; 176:111825. [PMID: 38128354 DOI: 10.1016/j.ijporl.2023.111825] [Received: 10/11/2023] [Revised: 11/29/2023] [Accepted: 12/10/2023]
Abstract
The study investigated how the considerable body of knowledge generated through basic research on multisensory experiences can be brought into clinical paediatric audiology, with a specific focus on enhancing understanding of the neurological implications of childhood hearing loss. OBJECTIVES The overall aim of the project was to investigate how to use emerging technologies to enhance the understanding of the neurological impact of paediatric hearing loss. The specific objectives were to develop an app and to evaluate its ease of use, and the understanding of neurology it conveyed, among all types of stakeholders and end-users. METHODS A collaborative, participatory, human-centred research design was used. This methodological approach brought stakeholders into the design process at an early point in time, and workshops mapped the content and interaction design during the iterative development of the app. Nine clinicians from Copenhagen Hearing and Balance Centre and 4 media technologists from the Multisensory Experience Lab participated in the development of the app prototype. Evaluations were made using questionnaires completed by stakeholders and end-users, and through focus-group interviews. Eight parents of children with hearing loss, 13 internal stakeholders, and 14 external stakeholders participated in the evaluation of the app. RESULTS The app was evaluated positively overall. End-users/parents of children with hearing loss were slightly more positive than stakeholders/professionals in audiology. CONCLUSIONS Apps are a future medium for providing health care information, and it proved both relevant and feasible to use apps to convey complex information such as the neurological implications of childhood hearing loss.
Affiliation(s)
- Signe Wischmann
- Rigshospitalet, Copenhagen Hearing and Balance Centre. Ear, Nose and Throat (ENT) and Audiology Clinic, Inge Lehmanns Vej 8, 2100, København Ø, Denmark.
- Nete Rudbeck Kamper
- Rigshospitalet, Copenhagen Hearing and Balance Centre. Ear, Nose and Throat (ENT) and Audiology Clinic, Inge Lehmanns Vej 8, 2100, København Ø, Denmark.
- Lone Jantzen
- Rigshospitalet, Copenhagen Hearing and Balance Centre. Ear, Nose and Throat (ENT) and Audiology Clinic, Inge Lehmanns Vej 8, 2100, København Ø, Denmark.
- Lærke Hammer
- Rigshospitalet, Copenhagen Hearing and Balance Centre. Ear, Nose and Throat (ENT) and Audiology Clinic, Inge Lehmanns Vej 8, 2100, København Ø, Denmark.
- Daniel Boonma Reipur
- Aalborg University Copenhagen, Multisensory Experience Lab, A.C. Meyers Vænge 15, 2450, København SV, Denmark.
- Stefania Serafin
- Aalborg University Copenhagen, Multisensory Experience Lab, A.C. Meyers Vænge 15, 2450, København SV, Denmark.
- Lone Percy-Smith
- Rigshospitalet, Copenhagen Hearing and Balance Centre. Ear, Nose and Throat (ENT) and Audiology Clinic, Inge Lehmanns Vej 8, 2100, København Ø, Denmark.
10.
Schulte A, Marozeau J, Ruhe A, Büchner A, Kral A, Innes-Brown H. Improved speech intelligibility in the presence of congruent vibrotactile speech input. Sci Rep 2023; 13:22657. [PMID: 38114599 PMCID: PMC10730903 DOI: 10.1038/s41598-023-48893-w] [Received: 09/27/2023] [Accepted: 11/30/2023]
Abstract
Vibrotactile stimulation is believed to enhance auditory speech perception, offering potential benefits for cochlear implant (CI) users, who may rely on compensatory sensory strategies. Our study advances previous research by directly comparing tactile speech-intelligibility enhancements in normal-hearing (NH) and CI participants using the same paradigm. Moreover, we controlled for stimulus-nonspecific, excitatory effects using an incongruent audio-tactile control condition that did not contain any speech-relevant information. In addition to this incongruent audio-tactile condition, we presented sentences in an auditory-only and a congruent audio-tactile condition, with the congruent tactile stimulus providing low-frequency envelope information via a vibrating probe on the index fingertip. The study involved 23 NH listeners and 14 CI users. In both groups, significant tactile enhancements were observed for congruent tactile stimuli (5.3% for NH and 5.4% for CI participants), but not for incongruent tactile stimulation. These findings replicate previously observed tactile enhancement effects. Comparing our study with previous research, the informational content of the tactile stimulus emerges as a modulator of intelligibility: in general, congruent stimuli enhanced, non-matching tactile stimuli reduced, and neutral stimuli did not change test outcomes. We conclude that the temporal cues provided by congruent vibrotactile stimuli may aid in parsing continuous speech into syllables and words, leading to the observed improvements in intelligibility.
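The congruent tactile stimulus carried the low-frequency envelope of the speech signal. A common way to obtain such an envelope is rectification followed by low-pass smoothing (analytic-signal/Hilbert magnitudes are typical in practice). The sketch below is illustrative only, with the sampling rate, modulation rate, and smoothing window all assumed:

```python
import math

def envelope(signal, win=50):
    """Low-frequency amplitude envelope: full-wave rectification followed
    by a moving-average low-pass (window length `win` samples, assumed)."""
    rect = [abs(s) for s in signal]
    out = []
    for i in range(len(rect)):
        lo, hi = max(0, i - win // 2), min(len(rect), i + win // 2 + 1)
        out.append(sum(rect[lo:hi]) / (hi - lo))
    return out

# Amplitude-modulated carrier: the envelope should track the slow modulation.
fs = 1000
sig = [(1.0 + 0.5 * math.sin(2 * math.pi * 4 * t / fs))   # 4 Hz modulation
       * math.sin(2 * math.pi * 100 * t / fs)             # 100 Hz carrier
       for t in range(fs)]
env = envelope(sig)
```

For a sinusoidal carrier the smoothed rectified signal is roughly 2/π times the instantaneous amplitude, so the envelope rises and falls with the 4 Hz modulation; a signal like this could, after downsampling, drive a vibrotactile actuator.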
Collapse
Affiliation(s)
- Alina Schulte
- Department of Experimental Otology of the Clinics of Otolaryngology, Hannover Medical School, Hannover, Germany.
- Eriksholm Research Center, Oticon A/S, Snekkersten, Denmark.
- Jeremy Marozeau
- Music and Cochlear Implants Lab, Department of Health Technology, Technical University of Denmark, Kongens Lyngby, Denmark
- Anna Ruhe
- Department of Experimental Otology of the Clinics of Otolaryngology, Hannover Medical School, Hannover, Germany
- Andreas Büchner
- Department of Experimental Otology of the Clinics of Otolaryngology, Hannover Medical School, Hannover, Germany
- Andrej Kral
- Department of Experimental Otology of the Clinics of Otolaryngology, Hannover Medical School, Hannover, Germany
- Hamish Innes-Brown
- Eriksholm Research Center, Oticon A/S, Snekkersten, Denmark
- Hearing Systems Section, Department of Health Technology, Technical University of Denmark, Kongens Lyngby, Denmark
11
Alemi R, Wolfe J, Neumann S, Manning J, Towler W, Koirala N, Gracco VL, Deroche M. Audiovisual integration in children with cochlear implants revealed through EEG and fNIRS. Brain Res Bull 2023; 205:110817. [PMID: 37989460 DOI: 10.1016/j.brainresbull.2023.110817] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/14/2023] [Revised: 09/22/2023] [Accepted: 11/13/2023] [Indexed: 11/23/2023]
Abstract
Sensory deprivation can shift the balance of auditory versus visual information in multimodal processing. Such an imbalance could persist in children born deaf, even after they receive cochlear implants (CIs), and could potentially explain why one modality is prioritized over the other. Here, we recorded cortical responses to a single speaker uttering two syllables, presented in audio-only (A), visual-only (V), and audio-visual (AV) modes. Electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) were recorded successively in 75 school-aged children: 25 with normal hearing (NH) and 50 CI users, of whom 26 had relatively high language abilities (HL), comparable to those of NH children, and 24 had low language abilities (LL). In the EEG data, visual-evoked potentials were captured over occipital regions in response to V and AV stimuli, and they were larger in the HL group than in the LL group (the NH group being intermediate). Close to the vertex, auditory-evoked potentials were captured in response to A and AV stimuli and reflected a differential treatment of the two syllables, but only in the NH group. None of the EEG metrics revealed an interaction between group and modality. In the fNIRS data, each modality induced corresponding activity in visual or auditory regions, but no group difference was observed for A, V, or AV stimulation. The present study did not reveal any sign of abnormal AV integration in children with CIs. An efficient multimodal integrative network (at least for rudimentary speech materials) is clearly not a sufficient condition for good language and literacy.
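A common way to test AV integration in evoked responses (an assumption here; the abstract does not state this was the metric used) is the additive model: the AV response is compared against the sum of the unimodal A and V responses, and any residual is taken as evidence of integration. A minimal sketch, with hypothetical waveforms represented as plain Python lists:

```python
def av_superadditivity(a, v, av):
    """Pointwise AV - (A + V) difference wave; nonzero values suggest
    the multisensory response departs from simple summation."""
    if not (len(a) == len(v) == len(av)):
        raise ValueError("waveforms must share a time axis")
    return [av_i - (a_i + v_i) for a_i, v_i, av_i in zip(a, v, av)]

def mean_amplitude(wave, fs, t_start, t_end):
    """Mean amplitude of `wave` (sampled at fs Hz) in [t_start, t_end) seconds."""
    i0, i1 = int(t_start * fs), int(t_end * fs)
    window = wave[i0:i1]
    return sum(window) / len(window)
```

In practice the residual's mean amplitude over a component window would be tested statistically across participants, per group and modality.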
Affiliation(s)
- Razieh Alemi
- Department of Psychology, Concordia University, 7141 Sherbrooke St. West, Montreal, Quebec H4B 1R6, Canada.
- Jace Wolfe
- Oberkotter Foundation, Oklahoma City, OK, USA
- Sara Neumann
- Hearts for Hearing Foundation, 11500 Portland Av., Oklahoma City, OK 73120, USA
- Jacy Manning
- Hearts for Hearing Foundation, 11500 Portland Av., Oklahoma City, OK 73120, USA
- Will Towler
- Hearts for Hearing Foundation, 11500 Portland Av., Oklahoma City, OK 73120, USA
- Nabin Koirala
- Haskins Laboratories, 300 George St., New Haven, CT 06511, USA
- Mickael Deroche
- Department of Psychology, Concordia University, 7141 Sherbrooke St. West, Montreal, Quebec H4B 1R6, Canada
12
Marriage JE, Keshavarzi M, Moore BCJ. An association between auditory responsiveness of children and duration of entertainment screen time in the early years of life. Int J Audiol 2023:1-7. [PMID: 37750302 DOI: 10.1080/14992027.2023.2260097] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/16/2023] [Accepted: 09/11/2023] [Indexed: 09/27/2023]
Abstract
OBJECTIVE To examine whether the responsiveness of young children to simple sounds was associated with entertainment screen time (EST), opportunities for social interaction, and social and communication skills. DESIGN Parents completed a questionnaire covering, for years one and two of life, the number of times the child met with other children, the number of words the child spoke, and the daily amount of EST. Social, attention, and communication skills were also assessed. STUDY SAMPLE Participants were 118 children aged 15 to 46 months. They were initially assessed behaviourally using simple sounds; children who responded to these sounds were denoted the Responsive group, and children who did not were assessed using familiar songs and denoted the Unresponsive group. RESULTS The two groups did not differ significantly in mean age or in the number of opportunities to meet other children. The Unresponsive group produced significantly fewer words than the Responsive group at 12 and 24 months and had significantly higher EST than the Responsive group for years one and two. The Unresponsive group also showed lower social, attention, and communication skills than the Responsive group. CONCLUSIONS High EST was associated with poorer auditory and social skills. Hence, it may be wise to limit the EST of young children.
Affiliation(s)
- Mahmoud Keshavarzi
- Centre for Neuroscience in Education, Cambridge Hearing Group, Department of Psychology, University of Cambridge, Cambridge, UK
- Brian C J Moore
- Cambridge Hearing Group, Department of Psychology, University of Cambridge, Cambridge, UK
13
Kallioinen P, Olofsson JK, von Mentzer CN. Semantic processing in children with Cochlear Implants: A review of current N400 studies and recommendations for future research. Biol Psychol 2023; 182:108655. [PMID: 37541539 DOI: 10.1016/j.biopsycho.2023.108655] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/25/2023] [Revised: 07/28/2023] [Accepted: 08/01/2023] [Indexed: 08/06/2023]
Abstract
Deaf and hard-of-hearing children with cochlear implants (CIs) often display impaired spoken language skills. While a large number of studies have investigated brain responses to sounds in this population, relatively few have focused on semantic processing. Here we summarize and discuss findings from four studies of the N400, a cortical response that reflects semantic processing, in children with CIs. A study with auditory target stimuli found N400 effects at delayed latencies at 12 months after implantation, but at 18 and 24 months after implantation the effects had typical latencies. In studies with visual target stimuli, N400 effects in children with CIs were larger than or similar to those of controls, despite the children's lower semantic abilities. We propose that the large N400 effect observed in children with CIs reflects a stronger reliance on top-down predictions relative to bottom-up language processing. Recent behavioral studies of children and adults with CIs suggest that top-down processing is a common compensatory strategy, but one with distinct limitations, such as being effortful. A majority of the studies have small sample sizes (N < 20), and only responses to image targets were studied repeatedly in similar paradigms, which precludes strong conclusions. We give suggestions for future research and for overcoming the scarcity of participants, including extending research to children with conventional hearing aids, an understudied group.
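The N400 effect discussed above is conventionally quantified as the incongruent-minus-congruent difference wave averaged over a window around 300-500 ms, with latency read off the difference wave's negative peak. The snippet below is an illustrative sketch of that convention, not the analysis pipeline of any of the four reviewed studies:

```python
def n400_effect(congruent, incongruent, fs, window=(0.3, 0.5)):
    """Mean incongruent-minus-congruent amplitude in the N400 window (seconds).
    A reliably negative value is the classic N400 effect."""
    i0, i1 = int(window[0] * fs), int(window[1] * fs)
    diff = [inc - con for con, inc in zip(congruent, incongruent)]
    seg = diff[i0:i1]
    return sum(seg) / len(seg)

def peak_latency(diff, fs):
    """Latency (seconds) of the most negative point of a difference wave."""
    idx = min(range(len(diff)), key=lambda i: diff[i])
    return idx / fs
```

A delayed latency, as reported at 12 months post-implantation, would show up here as a later negative peak in the difference wave, while the window-mean amplitude indexes the size of the effect.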
Affiliation(s)
- Petter Kallioinen
- Department of Linguistics, Stockholm University, Stockholm, Sweden; Lund University Cognitive Science, Lund University, Lund, Sweden.
- Jonas K Olofsson
- Department of Psychology, Stockholm University, Stockholm, Sweden