1
Sehyr ZS, Midgley KJ, Emmorey K, Holcomb PJ. Asymmetric Event-Related Potential Priming Effects Between English Letters and American Sign Language Fingerspelling Fonts. Neurobiology of Language 2023; 4:361-381. [PMID: 37546690] [PMCID: PMC10403274] [DOI: 10.1162/nol_a_00104]
Abstract
Letter recognition plays an important role in reading and follows different phases of processing, from early visual feature detection to the access of abstract letter representations. Deaf ASL-English bilinguals experience orthography in two forms: English letters and fingerspelling. However, the neurobiological nature of fingerspelling representations, and the relationship between the two orthographies, remains unexplored. We examined the temporal dynamics of single English letter and ASL fingerspelling font processing in an unmasked priming paradigm in which centrally presented targets appeared for 200 ms, preceded by 100 ms primes. Event-related brain potentials were recorded while participants performed a probe detection task. Experiment 1 examined English letter-to-letter priming in deaf signers and hearing non-signers. We found that English letter recognition is similar for deaf and hearing readers, extending previous findings with hearing readers to unmasked presentations. Experiment 2 examined priming effects between English letters and ASL fingerspelling fonts in deaf signers only. We found that fingerspelling fonts primed both fingerspelling fonts and English letters, but English letters did not prime fingerspelling fonts, indicating a priming asymmetry between letters and fingerspelling fonts. We also found an N400-like priming effect when the primes were fingerspelling fonts, which might reflect strategic access to the lexical names of letters. The studies suggest that deaf ASL-English bilinguals process English letters and ASL fingerspelling differently and that the two systems may have distinct neural representations. However, the fact that fingerspelling fonts can prime English letters suggests that the two orthographies may share abstract representations to some extent.
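To make the design concrete, here is a minimal, hypothetical sketch (not the authors' code) of how a priming asymmetry could be quantified from trial-level data in a prime-type by target-type design like the one described above; the condition labels, simulated amplitudes, and the idea of a fixed target time window are illustrative assumptions.

```python
# Illustrative sketch (not the study's analysis): quantifying a priming asymmetry
# in a prime-type x target-type design. All labels and values are assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
conditions = [(p, t) for p in ("letter", "fingerspelling_font")
              for t in ("letter", "fingerspelling_font")]

rows = []
for prime_type, target_type in conditions:
    for relatedness in ("related", "unrelated"):
        # Hypothetical mean ERP amplitudes (microvolts) per trial in a target
        # time window; a real analysis would use measured EEG epochs.
        amps = rng.normal(loc=-2.0 if relatedness == "unrelated" else -1.5,
                          scale=1.0, size=40)
        rows += [{"prime": prime_type, "target": target_type,
                  "relatedness": relatedness, "amp": a} for a in amps]

df = pd.DataFrame(rows)

# Priming effect = unrelated minus related mean amplitude, per design cell.
cell_means = df.groupby(["prime", "target", "relatedness"])["amp"].mean().unstack()
priming = cell_means["unrelated"] - cell_means["related"]
print(priming)  # an asymmetry would show up as an effect for font->letter but not letter->font
```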
Affiliation(s)
- Zed Sevcikova Sehyr
- San Diego State University Research Foundation, San Diego State University, San Diego, CA, USA
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
- Phillip J. Holcomb
- Department of Psychology, San Diego State University, San Diego, CA, USA
2
Development of visual sustained selective attention and response inhibition in deaf children. Mem Cognit 2023; 51:509-525. [PMID: 35794408] [DOI: 10.3758/s13421-022-01330-1]
Abstract
Studies of deaf and hard-of-hearing (henceforth, deaf) children tend to make comparisons with typically hearing children for the purpose of either identifying deficits to be remediated or understanding the impact of auditory deprivation on visual or domain-general processing. Here, we eschew these clinical and theoretical aims, seeking instead to understand factors that explain variability in cognitive function within deaf children. A total of 108 bilingual deaf children aged 7-13 years who use both English and American Sign Language (ASL) participated in a longitudinal study of executive function (EF) development. We report longitudinal data from a visual continuous performance task that measured sustained selective attention and response inhibition. Results show that the impact of deafness on these processes is negligible, but that language skills have a positive relationship with both: better English abilities were associated with better sustained selective attention, and better ASL abilities with better response inhibition. The relationship between sustained selective attention and English abilities may reflect the cognitive demands of spoken language acquisition for deaf children, whereas better ASL abilities may promote an "inner voice" associated with improved response inhibition. The current study cannot conclusively demonstrate causality or directionality of effects. However, these data highlight the importance of studies that focus on atypical individuals, for whom the relationships between language and cognition may be different from those observed in typically developing populations.
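As an illustration of the kind of analysis that can relate language skills to executive-function scores in a repeated-measures sample, here is a hedged sketch on simulated data using a mixed-effects regression; the variable names, model form, and effect sizes are assumptions rather than the study's actual pipeline.

```python
# Hypothetical sketch: relating language skills to continuous-performance-task
# outcomes across longitudinal visits. Names, values, and model are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_children, n_visits = 108, 3
df = pd.DataFrame({
    "child": np.repeat(np.arange(n_children), n_visits),
    "age": rng.uniform(7, 13, n_children * n_visits),
    "english": rng.normal(0, 1, n_children * n_visits),  # standardized English score
    "asl": rng.normal(0, 1, n_children * n_visits),      # standardized ASL score
})
# Simulated outcomes: sustained attention (d') and response inhibition.
df["dprime"] = 1.0 + 0.1 * df["age"] + 0.3 * df["english"] + rng.normal(0, 0.5, len(df))
df["inhibition"] = 0.5 + 0.1 * df["age"] + 0.3 * df["asl"] + rng.normal(0, 0.5, len(df))

# Random intercept per child respects the repeated-measures structure.
att_model = smf.mixedlm("dprime ~ age + english + asl", df, groups=df["child"]).fit()
inh_model = smf.mixedlm("inhibition ~ age + english + asl", df, groups=df["child"]).fit()
print(att_model.summary())
print(inh_model.summary())
```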
3
Hirshorn EA, Harris LN. Culture is not destiny, for reading: highlighting variable routes to literacy within writing systems. Ann N Y Acad Sci 2022; 1513:31-47. [PMID: 35313016] [DOI: 10.1111/nyas.14768]
Abstract
Cross-writing system research in psychology and cognitive neuroscience has yielded important findings regarding how a writing system's structure can influence the cognitive challenges of learning to read and the neural underpinnings of literacy. The current paper reviews these differences and extends the findings to demonstrate diversity in how skilled reading is accomplished within a single writing system, English. We argue that broad clusters of behavioral and neural patterns found across writing systems can also be found within subpopulations that display atypical routes to skilled English reading, including Chinese-English bilinguals, deaf native signers, compensated readers, and distortion-sensitive readers. The patterns of interest include a tradeoff between the degree of reliance on phonological and morphological processing for skilled reading, a shift in attentional focus from smaller to larger orthographic units, and enhanced bilaterality of neural processing during word reading. Lastly, we consider how understanding atypical routes to reading may apply to other writing systems.
Affiliation(s)
- Lindsay N Harris
- Department of Leadership, Educational Psychology and Foundations, Northern Illinois University, DeKalb, Illinois
4
Abstract
The first 40 years of research on the neurobiology of sign languages (1960-2000) established that the same key left hemisphere brain regions support both signed and spoken languages, based primarily on evidence from signers with brain injury and, at the end of the 20th century, on evidence from emerging functional neuroimaging technologies (positron emission tomography and fMRI). Building on this earlier work, this review focuses on what we have learned about the neurobiology of sign languages in the last 15-20 years, what controversies remain unresolved, and directions for future research. Production and comprehension processes are addressed separately in order to capture whether and how output and input differences between sign and speech impact the neural substrates supporting language. In addition, the review includes aspects of language that are unique to sign languages, such as pervasive lexical iconicity, fingerspelling, linguistic facial expressions, and depictive classifier constructions. Summary sketches of the neural networks supporting sign language production and comprehension are provided with the hope that these will inspire future research as we begin to develop a more complete neurobiological model of sign language processing.
5
Gutierrez-Sigut E, Vergara-Martínez M, Perea M. The impact of visual cues during visual word recognition in deaf readers: An ERP study. Cognition 2021; 218:104938. [PMID: 34678681] [DOI: 10.1016/j.cognition.2021.104938]
Abstract
Although evidence is still scarce, recent research suggests key differences in how deaf and hearing readers use visual information during visual word recognition. Here we compared the time course of lexical access in deaf and hearing readers of similar reading ability. We also investigated whether one visual property of words, outline shape, modulates visual word recognition differently in the two groups. We recorded the EEG signal of twenty deaf and twenty hearing readers while they performed a lexical decision task. In addition to the effect of lexicality, we assessed the impact of outline shape by contrasting responses to pseudowords with an outline shape that was consistent (e.g., mofor) or inconsistent (e.g., mosor) with their baseword (motor). Despite hearing readers having higher phonological abilities, results showed a remarkably similar time course of the lexicality effect in deaf and hearing readers. We also found that, only for deaf readers, inconsistent-shape pseudowords (e.g., mosor) elicited larger-amplitude ERPs than consistent-shape pseudowords (e.g., mofor), beginning 150 ms after stimulus onset and extending into the N400 time window. This latter finding supports the view that deaf readers rely more on visual characteristics than typical hearing readers during visual word recognition. Altogether, our results suggest different mechanisms underlying effective word recognition in deaf and hearing readers.
Affiliation(s)
- Eva Gutierrez-Sigut
- University of Essex, UK; DCAL Research Centre, University College London, UK.
- Manuel Perea
- ERI-Lectura, University of Valencia, Spain; Universidad Nebrija, Spain
6
Wauters L, van Gelder H, Tijsseling C. Simple View of Reading in Deaf and Hard-of-Hearing Adults. Journal of Deaf Studies and Deaf Education 2021; 26:535-545. [PMID: 34218274] [DOI: 10.1093/deafed/enab020]
Abstract
The present study investigated the relative contribution of the two components in the simple view of reading to the reading comprehension skills of deaf and hard-of-hearing (DHH) adults in the Netherlands. Eighty DHH adults, aged between 30 and 80 years, were tested on word reading, reading fluency, vocabulary, and reading comprehension. Regression analyses showed that both decoding skills and vocabulary contributed to the reading comprehension skills of DHH adults, with vocabulary being the strongest predictor. For skilled decoders, the picture was somewhat different, with only vocabulary predicting reading comprehension. The results of this study show that the simple view of reading is applicable to DHH adults' reading comprehension skills: both decoding skills and vocabulary contribute to reading comprehension. Also, as in previous studies on the simple view of reading, once readers become more skilled at decoding, vocabulary becomes the only predictor of reading comprehension.
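The simple view of reading lends itself to a straightforward regression illustration. The sketch below, on simulated data rather than the study's, models comprehension from decoding and vocabulary and then repeats the model for a "skilled decoder" subgroup; all variable names, coefficients, and the median-split definition of skilled decoding are assumptions.

```python
# Minimal sketch of a simple-view-of-reading regression (simulated data, not the study's).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 80
decoding = rng.normal(0, 1, n)
vocabulary = rng.normal(0, 1, n)
# Simulated comprehension with vocabulary as the stronger predictor, as reported above.
comprehension = 0.25 * decoding + 0.55 * vocabulary + rng.normal(0, 0.6, n)
df = pd.DataFrame({"decoding": decoding, "vocabulary": vocabulary,
                   "comprehension": comprehension})

full = smf.ols("comprehension ~ decoding + vocabulary", data=df).fit()
print(full.params, full.rsquared)

# Restrict to "skilled decoders" (here: top half on decoding, an assumed cutoff)
# to mimic the subgroup analysis in which only vocabulary remains a predictor.
skilled = df[df["decoding"] > df["decoding"].median()]
print(smf.ols("comprehension ~ decoding + vocabulary", data=skilled).fit().params)
```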
Affiliation(s)
- Loes Wauters
- Royal Dutch Kentalis, Sint-Michielsgestel, The Netherlands
- Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands
7
Bosworth RG, Binder EM, Tyler SC, Morford JP. Automaticity of lexical access in deaf and hearing bilinguals: Cross-linguistic evidence from the color Stroop task across five languages. Cognition 2021; 212:104659. [PMID: 33798950] [DOI: 10.1016/j.cognition.2021.104659]
Abstract
The well-known Stroop interference effect has been instrumental in revealing the highly automated nature of lexical processing as well as providing new insights into the underlying lexical organization of first and second languages within proficient bilinguals. The present cross-linguistic study had two goals: 1) to examine Stroop interference for dynamic signs and printed words in deaf ASL-English bilinguals who report no reliance on speech or audiological aids; 2) to compare Stroop interference effects in several groups of bilinguals whose two languages range from very distinct to very similar in their shared orthographic patterns: ASL-English bilinguals (very distinct), Chinese-English bilinguals (low similarity), Korean-English bilinguals (moderate similarity), and Spanish-English bilinguals (high similarity). Reaction time and accuracy were measured for the Stroop color naming and word reading tasks, for congruent and incongruent color font conditions. Results confirmed strong Stroop interference for both dynamic ASL stimuli and English printed words in deaf bilinguals, with stronger Stroop interference effects in ASL for deaf bilinguals who scored higher in a direct assessment of ASL proficiency. Comparison of the four groups of bilinguals revealed that the same-script bilinguals (Spanish-English bilinguals) exhibited significantly greater Stroop interference effects for color naming than the other three bilingual groups. The results support three conclusions. First, Stroop interference effects are found for both signed and spoken languages. Second, contrary to some claims in the literature that deaf signers who do not use speech are poor readers, deaf bilinguals' lexical processing of both signs and written words is highly automated. Third, cross-language similarity is a critical factor shaping bilinguals' experience of Stroop interference in their two languages. This study represents the first comparison of both deaf and hearing bilinguals on the Stroop task, offering a critical test of theories about bilingual lexical access and cognitive control.
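For readers unfamiliar with how Stroop interference is quantified, the following sketch computes the interference effect (incongruent minus congruent reaction time) per group and language from fabricated trial-level data; the group labels, trial counts, and RT values are purely illustrative and not drawn from the study.

```python
# Illustrative Stroop interference computation on simulated trial-level data.
# All condition labels and numbers are assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
rows = []
for group in ("deaf_ASL_English", "Spanish_English", "Chinese_English", "Korean_English"):
    for language in ("L1", "English"):
        for congruency, base in (("congruent", 650), ("incongruent", 720)):
            rts = rng.normal(base, 80, 60)      # simulated reaction times (ms)
            correct = rng.random(60) > 0.05     # simulated accuracy (~95% correct)
            rows += [{"group": group, "language": language,
                      "congruency": congruency, "rt": r, "correct": c}
                     for r, c in zip(rts, correct)]
df = pd.DataFrame(rows)

# Mean correct-trial RT per cell, then the interference effect per group and language.
means = (df[df["correct"]]
         .groupby(["group", "language", "congruency"])["rt"].mean().unstack())
interference = means["incongruent"] - means["congruent"]
print(interference)  # larger values = stronger Stroop interference
```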
Affiliation(s)
- Rain G Bosworth
- National Technical Institute for the Deaf, Rochester Institute of Technology, USA.
- Sarah C Tyler
- Department of Psychology, University of California, San Diego, USA
- Jill P Morford
- Department of Linguistics, University of New Mexico, USA
8
Costello B, Caffarra S, Fariña N, Duñabeitia JA, Carreiras M. Reading without phonology: ERP evidence from skilled deaf readers of Spanish. Sci Rep 2021; 11:5202. [PMID: 33664324] [PMCID: PMC7933439] [DOI: 10.1038/s41598-021-84490-5]
Abstract
Reading typically involves phonological mediation, especially for transparent orthographies with a regular letter-to-sound correspondence. In this study we ask whether phonological coding is a necessary part of the reading process by examining prelingually deaf individuals who are skilled readers of Spanish. We conducted two EEG experiments exploiting the pseudohomophone effect, in which nonwords that sound like words elicit phonological encoding during reading. The first, a semantic categorization task with masked priming, resulted in modulation of the N250 by pseudohomophone primes in hearing but not in deaf readers. The second, a lexical decision task, confirmed the pattern: hearing readers had increased errors and an attenuated N400 response for pseudohomophones compared to control pseudowords, whereas deaf readers did not treat pseudohomophones any differently from pseudowords, either behaviourally or in the ERP response. These results offer converging evidence that skilled deaf readers do not rely on phonological coding during visual word recognition. Furthermore, this finding demonstrates that reading can take place in the absence of phonological activation, and we speculate about the alternative mechanisms that allow these deaf individuals to read competently.
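A minimal sketch of the behavioural side of the pseudohomophone contrast is given below: per-participant error rates for pseudohomophones versus control pseudowords in each group, compared with paired and independent t-tests. The data are simulated and the group means are assumptions chosen only to mirror the pattern described above.

```python
# Sketch of the behavioural pseudohomophone contrast on made-up data (not the study's).
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(4)
n_per_group = 20

def simulate(mean_psh, mean_ctrl):
    # Per-participant error rates; a pseudohomophone "cost" appears when mean_psh > mean_ctrl.
    return pd.DataFrame({"pseudohomophone": rng.normal(mean_psh, 0.04, n_per_group),
                         "control": rng.normal(mean_ctrl, 0.04, n_per_group)})

hearing = simulate(0.18, 0.10)  # hypothetical cost for hearing readers
deaf = simulate(0.11, 0.10)     # little or no cost for deaf readers

for label, grp in (("hearing", hearing), ("deaf", deaf)):
    t, p = stats.ttest_rel(grp["pseudohomophone"], grp["control"])
    print(label, grp.mean().round(3).to_dict(), f"t={t:.2f}, p={p:.3f}")

# Group x condition interaction via the per-participant cost difference.
cost_hearing = hearing["pseudohomophone"] - hearing["control"]
cost_deaf = deaf["pseudohomophone"] - deaf["control"]
print(stats.ttest_ind(cost_hearing, cost_deaf))
```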
Affiliation(s)
- Brendan Costello
- Basque Center on Cognition, Brain and Language, Paseo Mikeletegi 69, 20009 Donostia-San Sebastián, Spain
- Sendy Caffarra
- Basque Center on Cognition, Brain and Language, Paseo Mikeletegi 69, 20009 Donostia-San Sebastián, Spain; Division of Developmental-Behavioral Pediatrics, Stanford University School of Medicine, Stanford University, Stanford, CA, USA; Stanford University Graduate School of Education, Stanford, CA, USA
- Noemi Fariña
- Basque Center on Cognition, Brain and Language, Paseo Mikeletegi 69, 20009 Donostia-San Sebastián, Spain; Departamento de Psicología de la Educación y Psicobiología, Facultad de Educación, Universidad Internacional de La Rioja, Logroño, Spain
- Jon Andoni Duñabeitia
- Centro de Ciencia Cognitiva - C3, Universidad Nebrija, Madrid, Spain; Department of Language and Culture, The Arctic University of Norway, Tromsø, Norway
- Manuel Carreiras
- Basque Center on Cognition, Brain and Language, Paseo Mikeletegi 69, 20009 Donostia-San Sebastián, Spain; Departamento de Lengua Vasca y Comunicación, UPV/EHU, Bilbao, Spain; Basque Foundation for Science, Bilbao, Spain
9
Emmorey K, Holcomb PJ, Midgley KJ. Masked ERP repetition priming in deaf and hearing readers. Brain and Language 2021; 214:104903. [PMID: 33486233] [PMCID: PMC8299519] [DOI: 10.1016/j.bandl.2020.104903]
Abstract
Deaf readers provide unique insights into how the reading circuit is modified by altered linguistic and sensory input. We investigated whether reading-matched deaf and hearing readers (n = 62) exhibit different ERP effects associated with orthographic to phonological mapping (N250) or lexico-semantic processes (N400). In a visual masked priming paradigm, participants performed a go/no-go categorization task; target words were preceded by repeated or unrelated primes. Prime duration and word frequency were manipulated. Hearing readers exhibited typical N250 and N400 priming effects with 50 ms primes (greater negativity for unrelated primes) and smaller effects with 100 ms primes. Deaf readers showed a surprising reversed priming effect with 50 ms primes (greater negativity for related primes) and more typical N250 and N400 effects with 100 ms primes. Correlation results suggested that deaf readers with poorer phonological skills drove this effect. We suggest that weak phonological activation may create orthographic "repetition enhancement" or form/lexical competition in deaf readers.
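The sketch below illustrates, on simulated epoched data, how mean amplitudes in N250- and N400-style time windows can be extracted and turned into a repetition-priming effect (unrelated minus related); the sampling rate, window boundaries, and array shapes are assumptions, not parameters taken from this study.

```python
# Hypothetical sketch: mean-amplitude extraction in ERP time windows and a priming effect.
# Windows, sampling rate, and trial counts are assumptions.
import numpy as np

sfreq = 250                              # samples per second (assumed)
times = np.arange(-0.1, 0.8, 1 / sfreq)  # epoch time axis in seconds
n_trials = 100
rng = np.random.default_rng(5)
# epochs: trials x time, in microvolts (simulated noise here; real data would be EEG).
epochs_related = rng.normal(0, 5, (n_trials, times.size))
epochs_unrelated = rng.normal(0, 5, (n_trials, times.size))

def window_mean(epochs, tmin, tmax):
    mask = (times >= tmin) & (times <= tmax)
    return epochs[:, mask].mean()  # mean amplitude over trials and window samples

for name, tmin, tmax in (("N250", 0.20, 0.30), ("N400", 0.35, 0.55)):  # illustrative windows
    effect = window_mean(epochs_unrelated, tmin, tmax) - window_mean(epochs_related, tmin, tmax)
    print(f"{name} priming effect (unrelated - related): {effect:.2f} uV")
```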
Affiliation(s)
- Karen Emmorey
- School of Speech, Language and Hearing Sciences, San Diego State University, CA, USA.
10
Emmorey K, Lee B. The neurocognitive basis of skilled reading in prelingually and profoundly deaf adults. Language and Linguistics Compass 2021; 15:e12407. [PMID: 34306178] [PMCID: PMC8302003] [DOI: 10.1111/lnc3.12407]
Abstract
Deaf individuals have unique sensory and linguistic experiences that influence how they read and become skilled readers. This review presents our current understanding of the neurocognitive underpinnings of reading skill in deaf adults. Key behavioural and neuroimaging studies are integrated to build a profile of skilled adult deaf readers and to examine how changes in visual attention and reduced access to auditory input and phonology shape how they read both words and sentences. Crucially, the behaviours, processes, and neural circuitry of deaf readers are compared to those of hearing readers with similar reading ability to help identify alternative pathways to reading success. Overall, sensitivity to orthographic and semantic information is comparable for skilled deaf and hearing readers, but deaf readers rely less on phonology and show greater engagement of the right hemisphere in visual word processing. During sentence reading, deaf readers process visual word forms more efficiently and may have a greater reliance on and altered connectivity to semantic information compared to their hearing peers. These findings highlight the plasticity of the reading system and point to alternative pathways to reading success.
Affiliation(s)
- Karen Emmorey
- School of Speech, Language and Hearing Sciences, San Diego State University, San Diego, California, USA
- Joint Doctoral Program in Language and Communicative Disorders, University of California, San Diego, California, USA
- Brittany Lee
- School of Speech, Language and Hearing Sciences, San Diego State University, San Diego, California, USA
- Joint Doctoral Program in Language and Communicative Disorders, University of California, San Diego, California, USA
11
Krasa N, Bell Z. Silent word-reading fluency is strongly associated with orthotactic sensitivity among elementary school children. J Exp Child Psychol 2021; 205:105061. [PMID: 33460862] [DOI: 10.1016/j.jecp.2020.105061]
Abstract
Some written languages (the so-called "deep orthographies," such as English) often have unpredictable links to word sounds, making some written words difficult to associate with their spoken forms (i.e., to decode), thereby impeding comprehension. To read these languages efficiently for comprehension, readers require visual cues such as predictable spelling patterns (orthotactic conventions). Sensitivity to English orthotactic conventions (e.g., which letters are sometimes doubled, where configurations such as wh can typically be found in a word) was assessed in a cross-sectional sample of children (N = 271, ages 5-11 years) in kindergarten through Grade 5 using a word-likeness task. Orthotactic sensitivity was strongly correlated with silent word-reading fluency, an important reading skill used frequently in daily life to obtain information, and was modestly correlated with lexical spelling recognition. Among fluent decoders of predictable letter-sound relations, orthotactic sensitivity began to emerge prior to formal reading instruction and developed rapidly from kindergarten to Grade 2. About two thirds of dysfluent decoders (a proxy for dyslexia) demonstrated above-chance orthotactic sensitivity; however, their performance lagged behind that of fluent decoders through Grade 5. Orthotactic acquisition, possible reasons for impairment, and classroom implications are discussed.
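As a worked illustration of the reported associations, the following sketch correlates a simulated orthotactic-sensitivity score with simulated fluency and spelling measures; the sample size matches the abstract, but the effect sizes and variable names are assumptions.

```python
# Minimal correlation sketch on simulated data (not the study's measurements).
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n = 271
orthotactic = rng.normal(0, 1, n)                      # word-likeness task score (simulated)
fluency = 0.7 * orthotactic + rng.normal(0, 0.7, n)    # strong relation (assumed strength)
spelling = 0.3 * orthotactic + rng.normal(0, 0.95, n)  # modest relation (assumed strength)

for label, y in (("silent word-reading fluency", fluency),
                 ("lexical spelling recognition", spelling)):
    r, p = stats.pearsonr(orthotactic, y)
    print(f"orthotactic sensitivity vs. {label}: r = {r:.2f}, p = {p:.3g}")
```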
Affiliation(s)
- Nancy Krasa
- Department of Psychology, The Ohio State University, Columbus, OH 43210, USA.
- Ziv Bell
- Department of Psychology, The Ohio State University, Columbus, OH 43210, USA
12
Sehyr ZS, Midgley KJ, Holcomb PJ, Emmorey K, Plaut DC, Behrmann M. Unique N170 signatures to words and faces in deaf ASL signers reflect experience-specific adaptations during early visual processing. Neuropsychologia 2020; 141:107414. [PMID: 32142729] [DOI: 10.1016/j.neuropsychologia.2020.107414]
Abstract
Previous studies with deaf adults reported reduced N170 waveform asymmetry to visual words, a finding attributed to reduced phonological mapping in left-hemisphere temporal regions compared to hearing adults. An open question remains whether this pattern indeed results from reduced phonological processing or from general neurobiological adaptations in visual processing of deaf individuals. Deaf ASL signers and hearing nonsigners performed a same-different discrimination task with visually presented words, faces, or cars, while scalp EEG time-locked to the onset of the first item in each pair was recorded. For word recognition, the typical left-lateralized N170 in hearing participants and reduced left-sided asymmetry in deaf participants were replicated. The groups did not differ on word discrimination, but better orthographic skill was associated with a larger N170 in the right hemisphere only for deaf participants. Face recognition was characterized by unique N170 signatures for both groups, and deaf individuals exhibited superior face discrimination performance. Laterality and discrimination performance effects did not generalize to the N170 responses to cars, confirming that deaf signers are not inherently less lateralized in their electrophysiological responses and, critically, lending support to the phonological mapping hypothesis. P1 was attenuated for deaf participants compared to hearing participants, but in both groups P1 selectively discriminated between highly learned familiar objects (words and faces) and less familiar objects (cars). These distinct electrophysiological signatures to words and faces reflect experience-driven adaptations that do not generalize to object recognition.
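One common way to express N170 lateralization is an asymmetry index computed from left- and right-hemisphere mean amplitudes. The sketch below shows such an index on simulated values; the electrode sites, time window, and the specific formula are illustrative assumptions rather than this study's analysis pipeline.

```python
# Sketch of an N170 asymmetry index on simulated per-subject amplitudes.
# Sites, window, and index formula are assumptions.
import numpy as np

rng = np.random.default_rng(7)
n_subjects = 20
# Simulated N170 mean amplitudes (microvolts; more negative = larger N170).
left = rng.normal(-4.0, 1.0, n_subjects)   # e.g., a left occipito-temporal site
right = rng.normal(-3.0, 1.0, n_subjects)  # e.g., a right occipito-temporal site

# Asymmetry index on absolute amplitudes: positive = left-lateralized.
asym = (np.abs(left) - np.abs(right)) / (np.abs(left) + np.abs(right))
print(f"mean asymmetry index: {asym.mean():.3f} (+ = left-lateralized, - = right-lateralized)")
```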
13
Gutierrez-Sigut E, Vergara-Martínez M, Perea M. Deaf readers benefit from lexical feedback during orthographic processing. Sci Rep 2019; 9:12321. [PMID: 31444497] [PMCID: PMC6707270] [DOI: 10.1038/s41598-019-48702-3]
Abstract
It has been proposed that poor reading abilities in deaf readers might be related to weak connections between the orthographic and lexical-semantic levels of processing. Here we used event-related potentials (ERPs), known for their excellent time resolution, to examine whether lexical feedback modulates early orthographic processing. Twenty congenitally deaf readers made lexical decisions to target words and pseudowords. Each target stimulus could be preceded by a briefly presented matched-case or mismatched-case identity prime (e.g., ALTAR-ALTAR vs. altar-ALTAR). Results showed an early effect of case overlap at the N/P150 for all targets. Critically, this effect disappeared for words but not for pseudowords at the N250, an ERP component sensitive to orthographic processing. This dissociation in the effect of case for word and pseudoword targets provides strong evidence of early automatic lexical-semantic feedback modulating orthographic processing in deaf readers. Interestingly, despite the dissociation found in the ERP data, behavioural responses to words still benefited from the physical overlap between prime and target, particularly in less skilled readers and those with less experience with words. Overall, our results support the idea that skilled deaf readers have a stronger connection between the orthographic and the lexical-semantic levels of processing.
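To make the reported dissociation concrete, the sketch below computes a case-overlap effect (mismatched minus matched prime) separately for words and pseudowords in an N250-style window, plus their difference as the interaction; the amplitudes are fabricated to mirror the described pattern and none of them come from the study's data.

```python
# Hedged sketch of the case-overlap by lexicality contrast on fabricated amplitudes.
import numpy as np
import pandas as pd

rng = np.random.default_rng(8)
n = 20
data = {
    # Simulated N250 mean amplitudes (uV): pseudowords keep a case-overlap effect,
    # words do not, mirroring the dissociation described above.
    ("word", "matched"): rng.normal(-2.0, 1.0, n),
    ("word", "mismatched"): rng.normal(-2.1, 1.0, n),
    ("pseudoword", "matched"): rng.normal(-2.0, 1.0, n),
    ("pseudoword", "mismatched"): rng.normal(-3.2, 1.0, n),
}
df = pd.DataFrame(data)
df.columns = pd.MultiIndex.from_tuples(df.columns, names=["lexicality", "case"])

# Case-overlap effect (mismatched - matched) per lexicality; the interaction is their difference.
effect = (df.xs("mismatched", axis=1, level="case").mean()
          - df.xs("matched", axis=1, level="case").mean())
print(effect)
print("interaction (pseudoword - word):", effect["pseudoword"] - effect["word"])
```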
Affiliation(s)
- Eva Gutierrez-Sigut
- Department of Psychology, University of Essex, Essex, UK; ERI-Lectura, University of Valencia, Valencia, Spain; UCL DCAL Centre, University College London, London, UK
- Manuel Perea
- ERI-Lectura, University of Valencia, Valencia, Spain; Nebrija University, Madrid, Spain; Basque Center on Cognition, Brain and Language, Donostia, Spain