1
Corina DP, Coffey-Corina S, Pierotti E, Mankel K, Miller LM. Electrophysiological study of visual processing in children with cochlear implants. Neuropsychologia 2024;194:108774. PMID: 38145800. DOI: 10.1016/j.neuropsychologia.2023.108774.
Abstract
Electrophysiological studies of congenitally deaf children and adults have reported atypical visual evoked potentials (VEPs), which have been associated both with behavioral enhancements of visual attention and with poorer performance and outcomes on tests of spoken language processing. This pattern has often been interpreted as a maladaptive consequence of early auditory deprivation, whereby a remapping of auditory cortex by the visual system ultimately reduces the resources necessary for optimal rehabilitative outcomes in spoken language acquisition and use. Using a novel electrophysiological paradigm, we compare VEPs in children with severe-to-profound congenital deafness who received a cochlear implant(s) prior to 31 months of age (n = 28) and typically developing, age-matched controls (n = 28). We observe amplitude enhancements, and in some cases latency differences, in occipitally expressed P1 and N1 VEP components in CI-using children, as well as an early frontal negativity, N1a. We relate these findings to developmental factors such as chronological age and spoken language understanding, and further evaluate whether VEPs are additionally modulated by auditory stimulation. Collectively, these data provide a means to examine the extent to which atypical VEPs are consistent with prior accounts of maladaptive cross-modal plasticity. Our results support the view that VEP changes reflect alterations to visual-sensory attention and saliency mechanisms rather than a remapping of auditory cortex. The present data suggest that early auditory deprivation may have temporally prolonged effects on visual system processing even after activation and use of a cochlear implant.
Affiliation(s)
- David P Corina
- Center for Mind and Brain, University of California, Davis, USA; Department of Linguistics, University of California, Davis, USA; Department of Psychology, University of California, Davis, USA.
- S Coffey-Corina
- Center for Mind and Brain, University of California, Davis, USA
- E Pierotti
- Center for Mind and Brain, University of California, Davis, USA; Department of Psychology, University of California, Davis, USA
- Kelsey Mankel
- Center for Mind and Brain, University of California, Davis, USA
- Lee M Miller
- Center for Mind and Brain, University of California, Davis, USA; Department of Neurobiology, Physiology and Behavior, University of California, Davis, USA; Department of Otolaryngology / Head and Neck Surgery, University of California, Davis, USA
2
Kamal F, Segado M, Shaigetz VG, Perron M, Lau B, Alain C, Choudhury N. Effects of virtual reality working memory task difficulty on the passive processing of irrelevant auditory stimuli. Neuroreport 2023;34:811-816. PMID: 37823446. DOI: 10.1097/wnr.0000000000001958.
Abstract
The virtual reality (VR) environment is claimed to be highly immersive; participants may thus be unaware of their real, external world. The present study presented irrelevant auditory stimuli while participants were engaged in an easy or a difficult visual working memory (WM) task within the VR environment. The difficult WM task should be immersive and demand many cognitive resources, leaving few available for the processing of task-irrelevant auditory stimuli. Sixteen young adults wore a 3D head-mounted VR device. In the easy WM task, the stimuli were nameable objects; in the difficult WM task, they were abstract objects that could not be easily named. A novel paradigm using event-related potentials (ERPs) was implemented to examine the feasibility of quantifying the extent to which task-irrelevant stimuli occurring outside the VR environment are processed. Auditory stimuli irrelevant to the WM task were presented concurrently, every 1.5 or 12 s in separate conditions. Performance on the WM task varied with difficulty: accuracy was significantly lower during the difficult task. The auditory ERPs consisted of an N1 and a later P2/P3a deflection, both larger when the auditory stimuli were presented slowly. ERPs were unaffected by task difficulty, but significant correlations were found: N1 and P2/P3a amplitudes were smallest when performance on the easy WM task was highest. It is possible that even the easy WM task was so immersive, and required so many processing resources, that few were available for the co-processing of the task-irrelevant auditory stimuli.
Affiliation(s)
- Farooq Kamal
- National Research Council Canada, Boucherville, Quebec
- Maxime Perron
- Rotman Research Institute, Baycrest Health Sciences
- Department of Psychology, University of Toronto
- Brian Lau
- Rotman Research Institute, Baycrest Health Sciences
- Department of Psychology, University of Toronto
- Claude Alain
- Rotman Research Institute, Baycrest Health Sciences
- Department of Psychology, University of Toronto
- Institute of Medical Science, Temerty Faculty of Medicine, University of Toronto
- Music and Health Research Collaboratory, Faculty of Music, University of Toronto, Toronto, Ontario, Canada
3
Berger JI, Gander PE, Kim S, Schwalje AT, Woo J, Na YM, Holmes A, Hong JM, Dunn CC, Hansen MR, Gantz BJ, McMurray B, Griffiths TD, Choi I. Neural Correlates of Individual Differences in Speech-in-Noise Performance in a Large Cohort of Cochlear Implant Users. Ear Hear 2023;44:1107-1120. PMID: 37144890. PMCID: PMC10426791. DOI: 10.1097/aud.0000000000001357.
Abstract
OBJECTIVES Understanding speech-in-noise (SiN) is a complex task that recruits multiple cortical subsystems. Individuals vary in their ability to understand SiN, and this variance cannot be explained by simple peripheral hearing profiles; recent work by our group (Kim et al. 2021, Neuroimage) highlighted central neural factors underlying the variance in SiN ability in normal-hearing (NH) subjects. The present study examined neural predictors of SiN ability in a large cohort of cochlear implant (CI) users. DESIGN We recorded electroencephalography in 114 postlingually deafened CI users while they completed the California Consonant Test, a word-in-noise task. In many subjects, data were also collected on two other commonly used clinical measures of speech perception: a word-in-quiet task (consonant-nucleus-consonant words) and a sentence-in-noise task (AzBio sentences). Neural activity was assessed at a vertex electrode (Cz), which could help maximize eventual generalizability to clinical situations. The N1-P2 complex of event-related potentials (ERPs) at this location was included in multiple linear regression analyses, along with several other demographic and hearing factors, as predictors of SiN performance. RESULTS In general, there was good agreement among scores on the three speech perception tasks. ERP amplitudes did not predict AzBio performance, which was instead predicted by duration of device use, low-frequency hearing thresholds, and age. However, ERP amplitudes were strong predictors of performance on both word recognition tasks: the California Consonant Test (conducted simultaneously with the electroencephalography recording) and the consonant-nucleus-consonant task (conducted offline). These correlations held even after accounting for known predictors of performance, including residual low-frequency hearing thresholds. In CI users, better performance was predicted by an increased cortical response to the target word, in contrast to previous reports in normal-hearing subjects, in whom speech perception ability was accounted for by the ability to suppress noise. CONCLUSIONS These data indicate a neurophysiological correlate of SiN performance, thereby revealing a richer profile of an individual's hearing performance than psychoacoustic measures alone. The results also highlight important differences between sentence- and word-recognition measures of performance and suggest that individual differences in these measures may be underwritten by different mechanisms. Finally, the contrast with prior reports of NH listeners on the same task suggests that CI users' performance may be explained by a different weighting of neural processes than in NH listeners.
Affiliation(s)
- Joel I. Berger
- Department of Neurosurgery, University of Iowa Hospitals and Clinics, Iowa City, Iowa, USA
- Phillip E. Gander
- Department of Neurosurgery, University of Iowa Hospitals and Clinics, Iowa City, Iowa, USA
- Subong Kim
- Department of Speech, Language, and Hearing Sciences, Purdue University, West Lafayette, Indiana, USA
- Adam T. Schwalje
- Department of Otolaryngology – Head and Neck Surgery, University of Iowa Hospitals and Clinics, Iowa City, Iowa, USA
- Jihwan Woo
- Department of Biomedical Engineering, University of Ulsan, Ulsan, South Korea
- Young-min Na
- Department of Biomedical Engineering, University of Ulsan, Ulsan, South Korea
- Ann Holmes
- Department of Psychological and Brain Sciences, University of Louisville, Louisville, Kentucky, USA
- Jean M. Hong
- Department of Otolaryngology – Head and Neck Surgery, University of Iowa Hospitals and Clinics, Iowa City, Iowa, USA
- Camille C. Dunn
- Department of Otolaryngology – Head and Neck Surgery, University of Iowa Hospitals and Clinics, Iowa City, Iowa, USA
- Marlan R. Hansen
- Department of Otolaryngology – Head and Neck Surgery, University of Iowa Hospitals and Clinics, Iowa City, Iowa, USA
- Bruce J. Gantz
- Department of Otolaryngology – Head and Neck Surgery, University of Iowa Hospitals and Clinics, Iowa City, Iowa, USA
- Bob McMurray
- Department of Otolaryngology – Head and Neck Surgery, University of Iowa Hospitals and Clinics, Iowa City, Iowa, USA
- Department of Psychological and Brain Sciences, University of Iowa, Iowa City, Iowa, USA
- Department of Communication Sciences and Disorders, University of Iowa, Iowa City, Iowa, USA
- Timothy D. Griffiths
- Biosciences Institute, Newcastle University, Newcastle upon Tyne, United Kingdom
- Inyong Choi
- Department of Otolaryngology – Head and Neck Surgery, University of Iowa Hospitals and Clinics, Iowa City, Iowa, USA
- Department of Communication Sciences and Disorders, University of Iowa, Iowa City, Iowa, USA
4
Kamal F, Morrison C, Campbell K, Taler V. Event-Related Potential Measures of the Passive Processing of Rapidly and Slowly Presented Auditory Stimuli in MCI. Front Aging Neurosci 2021;13:659618. PMID: 33867972. PMCID: PMC8046914. DOI: 10.3389/fnagi.2021.659618.
Abstract
Much research effort is currently devoted to developing a simple, low-cost method to detect early signs of Alzheimer's disease (AD) pathology. The present study employed a simple paradigm in which event-related potentials (ERPs) were recorded to a single auditory stimulus presented rapidly or very slowly while the participant was engaged in a visual task. Multi-channel EEG was recorded in 20 healthy older adults and 20 people with mild cognitive impairment (MCI). In two conditions, a single 80 dB sound pressure level (SPL) auditory stimulus was presented every 1.5 s (fast condition) or every 12.0 s (slow condition). Participants were instructed to watch a silent video and ignore the auditory stimuli; auditory processing thus occurred passively. When the auditory stimuli were presented rapidly (every 1.5 s), N1 and P2 amplitudes did not differ between the two groups. When the stimuli were presented very slowly, the amplitudes of N1 and P2 increased in both groups and their latencies were prolonged. The amplitude of N1 did not significantly differ between the two groups, but the subsequent positivity was reduced in people with MCI compared to healthy older adults. This late positivity in the slow condition may reflect a delayed P2 or the summation of a composite P2 + P3a. In people with MCI, the priority of processing may not be switched from the visual task to the potentially much more relevant auditory input. ERPs thus offer promise as a means to identify the pathology underlying the cognitive impairment associated with MCI.
Affiliation(s)
- Farooq Kamal
- School of Psychology, University of Ottawa, Ottawa, ON, Canada; Bruyère Research Institute, Ottawa, ON, Canada
- Cassandra Morrison
- School of Psychology, University of Ottawa, Ottawa, ON, Canada; Bruyère Research Institute, Ottawa, ON, Canada
- Vanessa Taler
- School of Psychology, University of Ottawa, Ottawa, ON, Canada; Bruyère Research Institute, Ottawa, ON, Canada