1. Thierfelder P. The time course of Cantonese and Hong Kong Sign Language phonological activation: An ERP study of deaf bimodal bilingual readers of Chinese. Cognition 2024; 251:105878. [PMID: 39024841] [DOI: 10.1016/j.cognition.2024.105878]
Abstract
This study investigated Cantonese and Hong Kong Sign Language (HKSL) phonological activation patterns in Hong Kong deaf readers using the ERP technique. Two experiments employing the error disruption paradigm were conducted while recording participants' EEGs. Experiment 1 focused on orthographic and speech-based phonological processing, while Experiment 2 examined sign-phonological processing. ERP analyses focused on the P200 (180-220 ms) and N400 (300-500 ms) components. The results of Experiment 1 showed that hearing readers exhibited both orthographic and phonological effects in the P200 and N400 windows, consistent with previous studies on Chinese reading. In deaf readers, significant speech-based phonological effects were observed in the P200 window, and orthographic effects spanned both the P200 and N400 windows. Comparative analysis between the two groups revealed distinct spatial distributions for orthographic and speech-based phonological ERP effects, which may indicate the engagement of different neural networks during early processing stages. Experiment 2 found evidence of sign-phonological activation in both the P200 and N400 windows among deaf readers, which may reflect the involvement of sign-phonological representations in early lexical access and later semantic integration. Furthermore, exploratory analysis revealed that higher reading fluency in deaf readers correlated with stronger orthographic effects in the P200 window and diminished effects in the N400 window, indicating that efficient orthographic processing during early lexical access is a distinguishing feature of proficient deaf readers.
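The P200 (180-220 ms) and N400 (300-500 ms) analyses described in this abstract amount to averaging the EEG signal over a latency window, per trial and channel. A minimal sketch in Python/NumPy, assuming a trials × channels × samples epoch array; the function name, array layout, and synthetic data are illustrative, not taken from the study:

```python
import numpy as np

def window_mean_amplitude(epochs, times, t_start, t_end):
    """Mean amplitude per trial and channel within a latency window.

    epochs: (n_trials, n_channels, n_samples) array of voltages
    times:  (n_samples,) array of seconds relative to stimulus onset
    """
    mask = (times >= t_start) & (times < t_end)
    return epochs[:, :, mask].mean(axis=2)

# Synthetic epochs: 10 trials, 2 channels, 500 Hz sampling, -0.1 to 0.6 s
sfreq = 500.0
times = np.arange(-0.1, 0.6, 1.0 / sfreq)
rng = np.random.default_rng(0)
epochs = rng.normal(0.0, 1.0, size=(10, 2, times.size))

p200 = window_mean_amplitude(epochs, times, 0.180, 0.220)  # P200 window
n400 = window_mean_amplitude(epochs, times, 0.300, 0.500)  # N400 window
print(p200.shape, n400.shape)  # (10, 2) (10, 2)
```

The per-trial window means would then feed group-level statistics contrasting error and control conditions at each electrode site.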
Affiliation(s)
- Philip Thierfelder
- The Centre for Sign Linguistics and Deaf Studies, The Chinese University of Hong Kong, Hong Kong.
2. Thierfelder P, Cai ZG, Huang S, Lin H. The Chinese lexicon of deaf readers: A database of character decisions and a comparison between deaf and hearing readers. Behav Res Methods 2024; 56:5732-5753. [PMID: 38114882] [DOI: 10.3758/s13428-023-02305-z]
Abstract
We present a psycholinguistic study investigating lexical effects on simplified Chinese character recognition by deaf readers. Prior research suggests that deaf readers exhibit efficient orthographic processing and decreased reliance on speech-based phonology in word recognition compared to hearing readers. In this large-scale character decision study (25 participants, each evaluating 2500 real characters and 2500 pseudo-characters), we analyzed various factors influencing character recognition accuracy and speed in deaf readers. Deaf participants demonstrated greater accuracy and faster recognition when characters were more frequent, were acquired earlier, had more strokes, displayed higher orthographic complexity, were more imageable in reference, or were less concrete in reference. Comparison with a previous study of hearing readers revealed that the facilitative effect of frequency on character decision accuracy was stronger for deaf readers than hearing readers. The effect of orthographic-phonological regularity differed significantly for the two groups, indicating that deaf readers rely more on orthographic structure and less on phonological information during character recognition. Notably, increased stroke counts (i.e., higher orthographic complexity) hindered hearing readers but facilitated recognition processes in deaf readers, suggesting that deaf readers excel at recognizing characters based on orthographic structure. The database generated from this large-scale character decision study offers a valuable resource for further research and practical applications in deaf education and literacy.
Affiliation(s)
- Philip Thierfelder
- Department of Linguistics and Modern Languages, The Chinese University of Hong Kong, Sha Tin, N.T., Hong Kong, SAR
- Zhenguang G Cai
- Department of Linguistics and Modern Languages, The Chinese University of Hong Kong, Sha Tin, N.T., Hong Kong, SAR.
- Shuting Huang
- Department of Linguistics and Modern Languages, The Chinese University of Hong Kong, Sha Tin, N.T., Hong Kong, SAR
- Hao Lin
- Shanghai International Studies University, 550 Dalian Road(W), Shanghai, People's Republic of China.
3. Kotowicz J, Banaszkiewicz A, Dzięgiel-Fivet G, Emmorey K, Marchewka A, Jednoróg K. Neural underpinnings of sentence reading in deaf, native sign language users. Brain and Language 2024; 255:105447. [PMID: 39079468] [DOI: 10.1016/j.bandl.2024.105447]
Abstract
The goal of this study was to investigate sentence-level reading circuits in deaf native signers, a unique group of deaf people who are immersed in a fully accessible linguistic environment from birth, and hearing readers. Task-based fMRI, functional connectivity and lateralization analyses were conducted. Both groups exhibited overlapping brain activity in the left-hemispheric perisylvian regions in response to a semantic sentence task. We found increased activity in left occipitotemporal and right frontal and temporal regions in deaf readers. Lateralization analyses did not confirm more rightward asymmetry in deaf individuals. Deaf readers exhibited weaker functional connectivity between inferior frontal and middle temporal gyri and enhanced coupling between temporal and insular cortex. In conclusion, despite the shared functional activity within the semantic reading network across both groups, our results suggest greater reliance on cognitive control processes for deaf readers, possibly resulting in greater effort required to perform the task in this group.
Affiliation(s)
- Anna Banaszkiewicz
- Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland.
- Gabriela Dzięgiel-Fivet
- Laboratory of Language Neurobiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- Karen Emmorey
- Laboratory for Language and Cognitive Neuroscience, San Diego State University, San Diego, USA
- Artur Marchewka
- Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- Katarzyna Jednoróg
- Laboratory of Language Neurobiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland.
4. Sehyr ZS, Midgley KJ, Emmorey K, Holcomb PJ. Asymmetric Event-Related Potential Priming Effects Between English Letters and American Sign Language Fingerspelling Fonts. Neurobiology of Language 2023; 4:361-381. [PMID: 37546690] [PMCID: PMC10403274] [DOI: 10.1162/nol_a_00104]
Abstract
Letter recognition plays an important role in reading and follows different phases of processing, from early visual feature detection to the access of abstract letter representations. Deaf ASL-English bilinguals experience orthography in two forms: English letters and fingerspelling. However, the neurobiological nature of fingerspelling representations, and the relationship between the two orthographies, remains unexplored. We examined the temporal dynamics of single English letter and ASL fingerspelling font processing in an unmasked priming paradigm with centrally presented targets for 200 ms preceded by 100 ms primes. Event-related brain potentials were recorded while participants performed a probe detection task. Experiment 1 examined English letter-to-letter priming in deaf signers and hearing non-signers. We found that English letter recognition is similar for deaf and hearing readers, extending previous findings with hearing readers to unmasked presentations. Experiment 2 examined priming effects between English letters and ASL fingerspelling fonts in deaf signers only. We found that fingerspelling fonts primed both fingerspelling fonts and English letters, but English letters did not prime fingerspelling fonts, indicating a priming asymmetry between letters and fingerspelling fonts. We also found an N400-like priming effect when the primes were fingerspelling fonts which might reflect strategic access to the lexical names of letters. The studies suggest that deaf ASL-English bilinguals process English letters and ASL fingerspelling differently and that the two systems may have distinct neural representations. However, the fact that fingerspelling fonts can prime English letters suggests that the two orthographies may share abstract representations to some extent.
Affiliation(s)
- Zed Sevcikova Sehyr
- San Diego State University Research Foundation, San Diego State University, San Diego, CA, USA
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
- Phillip J. Holcomb
- Department of Psychology, San Diego State University, San Diego, CA, USA
5. Lee B, Martinez PM, Midgley KJ, Holcomb PJ, Emmorey K. Sensitivity to orthographic vs. phonological constraints on word recognition: An ERP study with deaf and hearing readers. Neuropsychologia 2022; 177:108420. [PMID: 36396091] [PMCID: PMC10152474] [DOI: 10.1016/j.neuropsychologia.2022.108420]
Abstract
The role of phonology in word recognition has previously been investigated using a masked lexical decision task and transposed letter (TL) nonwords that were either pronounceable (barve) or unpronounceable (brvae). We used event-related potentials (ERPs) to investigate these effects in skilled deaf readers, who may be more sensitive to orthotactic than phonotactic constraints, which are conflated in English. Twenty deaf and twenty hearing adults completed a masked lexical decision task while ERPs were recorded. The groups were matched in reading skill and IQ, but deaf readers had poorer phonological ability. Deaf readers were faster and more accurate at rejecting TL nonwords than hearing readers. Neither group exhibited an effect of nonword pronounceability in RTs or accuracy. For both groups, the N250 and N400 components were modulated by lexicality (more negative for nonwords). The N250 was not modulated by nonword pronounceability, but pronounceable nonwords elicited a larger amplitude N400 than unpronounceable nonwords. Because pronounceable nonwords are more word-like, they may incite activation that is unresolved when no lexical entry is found, leading to a larger N400 amplitude. Similar N400 pronounceability effects for deaf and hearing readers, despite differences in phonological sensitivity, suggest these TL effects arise from sensitivity to lexical-level orthotactic constraints. Deaf readers may have an advantage in processing TL nonwords because of enhanced early visual attention and/or tight orthographic-to-semantic connections, bypassing the phonologically mediated route to word recognition.
Affiliation(s)
- Brittany Lee
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University & University of California, San Diego, United States.
6. Winsler K, Holcomb PJ, Emmorey K. Electrophysiological patterns of visual word recognition in deaf and hearing readers: An ERP mega-study. Language, Cognition and Neuroscience 2022; 38:636-650. [PMID: 37304206] [PMCID: PMC10249718] [DOI: 10.1080/23273798.2022.2135746]
Abstract
Deaf and hearing readers have different access to spoken phonology which may affect the representation and recognition of written words. We used ERPs to investigate how a matched sample of deaf and hearing adults (total n = 90) responded to lexical characteristics of 480 English words in a go/no-go lexical decision task. Results from mixed effect regression models showed a) visual complexity produced small effects in opposing directions for deaf and hearing readers, b) similar frequency effects, but shifted earlier for deaf readers, c) more pronounced effects of orthographic neighborhood density for hearing readers, and d) more pronounced effects of concreteness for deaf readers. We suggest hearing readers have visual word representations that are more integrated with phonological representations, leading to larger lexically-mediated effects of neighborhood density. Conversely, deaf readers weight other sources of information more heavily, leading to larger semantically-mediated effects and altered responses to low-level visual variables.
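The mixed-effects regression approach named in this abstract can be sketched with statsmodels: a continuous lexical predictor interacting with reader group, plus by-subject random intercepts. The data below are simulated, and the variable names (`log_freq`, `group`, `amplitude`) are hypothetical stand-ins for the study's predictors, not its actual dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated per-trial data: ERP amplitude predicted by word frequency,
# reader group, and their interaction, with subject-level variability.
rng = np.random.default_rng(1)
n_subj, n_items = 20, 30
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_items),
    "log_freq": np.tile(rng.normal(0.0, 1.0, n_items), n_subj),
    "group": np.repeat(rng.choice(["deaf", "hearing"], n_subj), n_items),
})
subj_intercepts = rng.normal(0.0, 0.5, n_subj)  # random intercepts
df["amplitude"] = (-1.0 * df["log_freq"]        # simulated frequency slope
                   + subj_intercepts[df["subject"]]
                   + rng.normal(0.0, 1.0, len(df)))

# Fixed effects for frequency, group, and their interaction;
# random intercepts grouped by subject.
model = smf.mixedlm("amplitude ~ log_freq * group", df, groups=df["subject"])
fit = model.fit()
print(fit.params["log_freq"])  # slope estimate near the simulated -1.0
```

The interaction term is what licenses statements like "frequency effects shifted earlier for deaf readers": a reliable predictor-by-group interaction on amplitudes in a given time window.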
Affiliation(s)
- Kurt Winsler
- Department of Psychology, University of California, Davis, Davis, CA, United States
- Phillip J Holcomb
- Department of Psychology, San Diego State University, San Diego, CA, United States
- Karen Emmorey
- School of Speech, Language and Hearing Science, San Diego State University, San Diego, CA, United States
7. Sehyr ZS, Emmorey K. Contribution of Lexical Quality and Sign Language Variables to Reading Comprehension. Journal of Deaf Studies and Deaf Education 2022; 27:355-372. [PMID: 35775152] [DOI: 10.1093/deafed/enac018]
Abstract
The lexical quality hypothesis proposes that the quality of phonological, orthographic, and semantic representations impacts reading comprehension. In Study 1, we evaluated the contributions of lexical quality to reading comprehension in 97 deaf and 98 hearing adults matched for reading ability. While phonological awareness was a strong predictor for hearing readers, for deaf readers, orthographic precision and semantic knowledge, not phonology, predicted reading comprehension (assessed by two different tests). For deaf readers, the architecture of the reading system adapts by shifting reliance from (coarse-grained) phonological representations to high-quality orthographic and semantic representations. In Study 2, we examined the contribution of American Sign Language (ASL) variables to reading comprehension in 83 deaf adults. Fingerspelling (FS) and ASL comprehension skills predicted reading comprehension. We suggest that FS might reinforce orthographic-to-semantic mappings and that sign language comprehension may serve as a linguistic basis for the development of skilled reading in deaf signers.
Affiliation(s)
- Zed Sevcikova Sehyr
- Laboratory for Language and Cognitive Neuroscience, San Diego State University, CA, USA
- Karen Emmorey
- Laboratory for Language and Cognitive Neuroscience, San Diego State University, CA, USA
8. Hänel-Faulhaber B, Groen MA, Röder B, Friedrich CK. Ongoing Sign Processing Facilitates Written Word Recognition in Deaf Native Signing Children. Front Psychol 2022; 13:917700. [PMID: 35992405] [PMCID: PMC9390089] [DOI: 10.3389/fpsyg.2022.917700]
Abstract
Signed and written languages are intimately related in proficient signing readers. Here, we tested whether deaf native signing beginning readers are able to make rapid use of ongoing sign language to facilitate recognition of written words. Deaf native signing children (mean age 10 years, 7 months) received prime-target pairs with sign word onsets as primes and written words as targets. In a control group of hearing children (matched in reading ability to the deaf children, mean age 8 years, 8 months), spoken word onsets were used as primes instead. Targets (written German words) were completions either of the German signs or of the spoken word onsets. The participants' task was to decide whether the target word was a possible German word. Sign onsets facilitated processing of written targets in deaf children in much the same way that spoken word onsets facilitated processing of written targets in hearing children. In both groups, priming elicited similar effects in the simultaneously recorded event-related potentials (ERPs), starting as early as 200 ms after the onset of the written target. These results suggest that beginning readers can use ongoing lexical processing in their native language - be it signed or spoken - to facilitate written word recognition. We conclude that intimate interactions between sign and written language might in turn facilitate reading acquisition in deaf beginning readers.
Affiliation(s)
- Brigitte Röder
- Biological Psychology and Neuropsychology, Universität Hamburg, Hamburg, Germany
- Claudia K. Friedrich
- Department of Developmental Psychology, University of Tübingen, Tübingen, Germany
9. Grégoire A, Deggouj N, Dricot L, Decat M, Kupers R. Brain Morphological Modifications in Congenital and Acquired Auditory Deprivation: A Systematic Review and Coordinate-Based Meta-Analysis. Front Neurosci 2022; 16:850245. [PMID: 35418829] [PMCID: PMC8995770] [DOI: 10.3389/fnins.2022.850245]
Abstract
Neuroplasticity following deafness has been widely demonstrated in both humans and animals, but the anatomical substrate of these changes in the human brain is not yet clear. The question is of high importance because hearing loss is a growing problem in an aging population. Moreover, knowing about these brain changes could help explain some disappointing outcomes with cochlear implants and thereby improve hearing rehabilitation. We conducted a systematic review and a coordinate-based meta-analysis of the morphological brain changes revealed by MRI in severe to profound hearing loss, whether congenital or acquired before or after language onset. Twenty-five papers were included in our review, covering more than 400 deaf subjects, most of them with prelingual deafness. The most consistent finding is a volumetric decrease in gray matter around the bilateral auditory cortex, confirmed by the coordinate-based meta-analysis, which shows three converging clusters in this region. The visual areas of deaf children are also significantly affected, with a decrease in the volume of both gray and white matter. Finally, deafness is associated with a gray matter increase within the cerebellum, especially on the right side. These results are discussed at length and compared with findings from deaf animal models and blind humans, which demonstrate, for example, a much more consistent gray matter decrease along their respective primary sensory pathways. In human deafness, many factors other than deafness itself can interact with brain plasticity. One of the most important is the use of sign language and its age of acquisition, which induce, among other effects, changes within the hand motor region and the visual cortex. Other confounding factors have received too little consideration in the current literature, such as the etiology of the hearing impairment, speech-reading ability, hearing aid use, and the frequently associated vestibular dysfunction or neurocognitive impairment. Another important weakness highlighted by this review is the scarcity of papers on postlingual deafness, even though it accounts for most of the deaf population. Further studies are needed to better understand these issues and ultimately improve rehabilitation for deafness.
Affiliation(s)
- Anaïs Grégoire
- Department of ENT, Cliniques Universitaires Saint-Luc, Brussels, Belgium
- Institute of NeuroScience (IoNS), UCLouvain, Brussels, Belgium
- Naïma Deggouj
- Department of ENT, Cliniques Universitaires Saint-Luc, Brussels, Belgium
- Institute of NeuroScience (IoNS), UCLouvain, Brussels, Belgium
- Laurence Dricot
- Institute of NeuroScience (IoNS), UCLouvain, Brussels, Belgium
- Monique Decat
- Department of ENT, Cliniques Universitaires Saint-Luc, Brussels, Belgium
- Institute of NeuroScience (IoNS), UCLouvain, Brussels, Belgium
- Ron Kupers
- Institute of NeuroScience (IoNS), UCLouvain, Brussels, Belgium
- Department of Neuroscience, Panum Institute, University of Copenhagen, Copenhagen, Denmark
- Ecole d’Optométrie, Université de Montréal, Montréal, QC, Canada
10. Predictors of Word and Text Reading Fluency of Deaf Children in Bilingual Deaf Education Programmes. Languages 2022. [DOI: 10.3390/languages7010051]
Abstract
Reading continues to be a challenging task for most deaf children. Bimodal bilingual education creates a supportive environment that stimulates deaf children’s learning through the use of sign language. However, it is still unclear how exposure to sign language might contribute to improving reading ability. Here, we investigate the relative contribution of several cognitive and linguistic variables to the development of word and text reading fluency in deaf children in bimodal bilingual education programmes. The participants of this study were 62 school-aged (8 to 10 years old at the start of the 3-year study) deaf children who took part in bilingual education (using Dutch and Sign Language of The Netherlands) and 40 age-matched hearing children. We assessed vocabulary knowledge in speech and sign, phonological awareness in speech and sign, receptive fingerspelling ability, and short-term memory (STM) at time 1 (T1). At times 2 (T2) and 3 (T3), we assessed word and text reading fluency. We found that (1) speech-based vocabulary strongly predicted word and text reading at T2 and T3, (2) fingerspelling ability was a strong predictor of word and text reading fluency at T2 and T3, (3) speech-based phonological awareness predicted word reading accuracy at T2 and T3 but did not predict text reading fluency, and (4) fingerspelling and STM predicted word reading latency at T2 while sign-based phonological awareness predicted this outcome measure at T3. These results suggest that fingerspelling may have an important function in facilitating the construction of orthographical/phonological representations of printed words for deaf children and strengthening word decoding and recognition abilities.
11. Gutierrez-Sigut E, Vergara-Martínez M, Perea M. The impact of visual cues during visual word recognition in deaf readers: An ERP study. Cognition 2021; 218:104938. [PMID: 34678681] [DOI: 10.1016/j.cognition.2021.104938]
Abstract
Although evidence is still scarce, recent research suggests key differences in how deaf and hearing readers use visual information during visual word recognition. Here we compared the time course of lexical access in deaf and hearing readers of similar reading ability. We also investigated whether one visual property of words, the outline-shape, modulates visual word recognition differently in both groups. We recorded the EEG signal of twenty deaf and twenty hearing readers while they performed a lexical decision task. In addition to the effect of lexicality, we assessed the impact of outline-shape by contrasting responses to pseudowords with an outline-shape that was consistent (e.g., mofor) or inconsistent (e.g., mosor) with their baseword (motor). Despite hearing readers having higher phonological abilities, results showed a remarkably similar time course of the lexicality effect in deaf and hearing readers. We also found that only for deaf readers, inconsistent-shape pseudowords (e.g., mosor) elicited larger amplitude ERPs than consistent-shape pseudowords (e.g., mofor) from 150 ms after stimulus onset and extending into the N400 time window. This latter finding supports the view that deaf readers rely more on visual characteristics than typical hearing readers during visual word recognition. Altogether, our results suggest different mechanisms underlying effective word recognition in deaf and hearing readers.
Affiliation(s)
- Eva Gutierrez-Sigut
- University of Essex, UK; DCAL Research Centre, University College London, UK.
- Manuel Perea
- ERI-Lectura, University of Valencia, Spain; Universidad Nebrija, Spain