1. Cheng Q, Yan X, Yang L, Lin H. Resolving syntactic-semantic conflicts: comprehension and processing patterns by deaf Chinese readers. Journal of Deaf Studies and Deaf Education 2024; 29:396-411. [PMID: 38439566] [DOI: 10.1093/deafed/enae008] [Received: 08/15/2023] [Revised: 01/05/2024] [Accepted: 01/20/2024]
Abstract
The current study combined sentence plausibility judgment and self-paced reading tasks to examine the comprehension strategies and processing patterns of Chinese deaf individuals when comprehending written Chinese sentences with syntactic-semantic cue conflicts. Similar to findings from previous crosslinguistic studies on deaf readers, the Chinese deaf readers showed great variability in their comprehension strategies, with only 38% robustly relying on syntactic cues. Regardless of their overall comprehension preferences, the deaf readers all showed additional processing efforts as reflected by longer reading time at the verb regions when they relied on the syntactic cues. Those with less robust reliance on syntactic cues also showed longer reading time at the verb regions even when they relied on the semantic cues, suggesting sensitivity to the syntactic cues regardless of the comprehension strategy. These findings suggest that deaf readers in general endure more processing burden while resolving conflicting syntactic and semantic cues, likely due to their overall high reliance on semantic information during sentence comprehension. Increased processing burden thus may contribute to an overall tendency of over-reliance on semantic cues when comprehending sentences with cue conflicts.
Affiliation(s)
- Qi Cheng: Department of Linguistics, University of Washington, Seattle, United States
- Xu Yan: Department of Electrical and Computer Engineering, University of California Los Angeles, Los Angeles, United States
- Lujia Yang: Department of Communication Sciences and Disorders, University of Alberta, Edmonton, Canada
- Hao Lin: Institute of Linguistics, Shanghai International Studies University, Shanghai, China; China Braille and Sign Language Research and Application Center, Nanjing Normal University of Special Education, Nanjing, China
2. Nematova S, Zinszer B, Morlet T, Morini G, Petitto LA, Jasińska KK. Impact of ASL Exposure on Spoken Phonemic Discrimination in Adult CI Users: A Functional Near-Infrared Spectroscopy Study. Neurobiology of Language 2024; 5:553-588. [PMID: 38939730] [PMCID: PMC11210937] [DOI: 10.1162/nol_a_00143] [Received: 06/21/2023] [Accepted: 03/11/2024]
Abstract
We examined the impact of exposure to a signed language (American Sign Language, or ASL) at different ages on the neural systems that support spoken language phonemic discrimination in deaf individuals with cochlear implants (CIs). Deaf CI users (N = 18, age = 18-24 yrs) who were exposed to a signed language at different ages and hearing individuals (N = 18, age = 18-21 yrs) completed a phonemic discrimination task in a spoken native (English) and non-native (Hindi) language while undergoing functional near-infrared spectroscopy neuroimaging. Behaviorally, deaf CI users who received a CI early versus later in life showed better English phonemic discrimination, albeit phonemic discrimination was poor relative to hearing individuals. Importantly, the age of exposure to ASL was not related to phonemic discrimination. Neurally, early-life language exposure, irrespective of modality, was associated with greater neural activation of left-hemisphere language areas critically involved in phonological processing during the phonemic discrimination task in deaf CI users. In particular, early exposure to ASL was associated with increased activation in the left hemisphere's classic language regions for native versus non-native language phonemic contrasts for deaf CI users who received a CI later in life. For deaf CI users who received a CI early in life, the age of exposure to ASL was not related to neural activation during phonemic discrimination. Together, the findings suggest that early signed language exposure does not negatively impact spoken language processing in deaf CI users, but may instead potentially offset the negative effects of language deprivation that deaf children without any signed language exposure experience prior to implantation. This empirical evidence aligns with and lends support to recent perspectives regarding the impact of ASL exposure in the context of CI usage.
Affiliation(s)
- Shakhlo Nematova: Department of Linguistics and Cognitive Science, University of Delaware, Newark, DE, USA
- Benjamin Zinszer: Department of Psychology, Swarthmore College, Swarthmore, PA, USA
- Thierry Morlet: Nemours Children’s Hospital, Delaware, Wilmington, DE, USA
- Giovanna Morini: Department of Communication Sciences and Disorders, University of Delaware, Newark, DE, USA
- Laura-Ann Petitto: Brain and Language Center for Neuroimaging, Gallaudet University, Washington, DC, USA
- Kaja K. Jasińska: Department of Applied Psychology and Human Development, University of Toronto, Toronto, Ontario, Canada
3. Xu L, Gong T, Shuai L, Feng J. Significantly different noun-verb distinguishing mechanisms in written Chinese and Chinese sign language: An event-related potential study of bilingual native signers. Front Neurosci 2022; 16:910263. [DOI: 10.3389/fnins.2022.910263] [Received: 04/01/2022] [Accepted: 10/05/2022]
Abstract
Little is known about (a) whether bilingual signers possess dissociated neural mechanisms for noun and verb processing in written language (as native non-signers do) or instead utilize similar neural mechanisms for both (given the general lack of part-of-speech marking in sign languages), and (b) whether learning a language in another modality (L2) influences the corresponding neural mechanisms of the L1. To address these issues, we conducted an electroencephalogram (EEG)-based reading comprehension study of bimodal bilinguals, namely Chinese native deaf signers, whose L1 is Chinese Sign Language and whose L2 is written Chinese. Analyses identified significantly dissociated neural mechanisms for the bilingual signers’ written noun and verb processing (which also became more explicit as their written Chinese proficiency increased), but not for their understanding of verbal and nominal meanings in Chinese Sign Language. These findings reveal a link between modality-based linguistic features and processing mechanisms, suggesting that the processing of modality-based features of a language is unlikely to be affected by learning another language in a different modality, and that cross-modal language transfer is subject to modal constraints rather than explicit linguistic features.
4. Grégoire A, Deggouj N, Dricot L, Decat M, Kupers R. Brain Morphological Modifications in Congenital and Acquired Auditory Deprivation: A Systematic Review and Coordinate-Based Meta-Analysis. Front Neurosci 2022; 16:850245. [PMID: 35418829] [PMCID: PMC8995770] [DOI: 10.3389/fnins.2022.850245] [Received: 01/07/2022] [Accepted: 03/01/2022]
Abstract
Neuroplasticity following deafness has been widely demonstrated in both humans and animals, but the anatomical substrate of these changes is not yet clear in the human brain. Understanding it is of high importance, since hearing loss is a growing problem in an aging population. Moreover, knowledge of these brain changes could help explain some disappointing results with cochlear implants and could therefore improve hearing rehabilitation. We carried out a systematic review and a coordinate-based meta-analysis of the morphological brain changes revealed by MRI in severe to profound hearing loss, whether congenital or acquired before or after language onset. Twenty-five papers were included in our review, covering more than 400 deaf subjects, most of them with prelingual deafness. The most consistent finding is a volumetric decrease in gray matter around the bilateral auditory cortex. This change was confirmed by the coordinate-based meta-analysis, which shows three converging clusters in this region. The visual areas of deaf children are also significantly affected, with a decrease in the volume of both gray and white matter. Finally, deafness is associated with a gray matter increase within the cerebellum, especially on the right side. These results are discussed at length and compared with those from deaf animal models and blind humans, which demonstrate, for example, a much more consistent gray matter decrease along their respective primary sensory pathways. In human deafness, many factors other than deafness itself may interact with brain plasticity. One of the most important is the use of sign language and its age of acquisition, which induces, among other effects, changes within the hand motor region and the visual cortex. Other confounding factors, however, have received too little consideration in the current literature, such as the etiology of the hearing impairment, speech-reading ability, hearing aid use, and the frequently associated vestibular dysfunction or neurocognitive impairment. Another important weakness highlighted by this review concerns the lack of papers on postlingual deafness, even though it accounts for most of the deaf population. Further studies are needed to better understand these issues and, ultimately, to improve rehabilitation for deafness.
Affiliation(s)
- Anaïs Grégoire: Department of ENT, Cliniques Universitaires Saint-Luc, Brussels, Belgium; Institute of NeuroScience (IoNS), UCLouvain, Brussels, Belgium
- Naïma Deggouj: Department of ENT, Cliniques Universitaires Saint-Luc, Brussels, Belgium; Institute of NeuroScience (IoNS), UCLouvain, Brussels, Belgium
- Laurence Dricot: Institute of NeuroScience (IoNS), UCLouvain, Brussels, Belgium
- Monique Decat: Department of ENT, Cliniques Universitaires Saint-Luc, Brussels, Belgium; Institute of NeuroScience (IoNS), UCLouvain, Brussels, Belgium
- Ron Kupers: Institute of NeuroScience (IoNS), UCLouvain, Brussels, Belgium; Department of Neuroscience, Panum Institute, University of Copenhagen, Copenhagen, Denmark; Ecole d’Optométrie, Université de Montréal, Montréal, QC, Canada
5. Caldwell HB. Sign and Spoken Language Processing Differences in the Brain: A Brief Review of Recent Research. Ann Neurosci 2022; 29:62-70. [PMID: 35875424] [PMCID: PMC9305909] [DOI: 10.1177/09727531211070538] [Received: 05/19/2021] [Accepted: 11/29/2021]
Abstract
Background: It is currently accepted that sign languages and spoken languages have significant processing commonalities. The studies supporting this view, however, often examine only frontotemporal pathways, perisylvian language areas, hemispheric lateralization, and event-related potentials in typical settings. Recent evidence has explored beyond this and uncovered numerous modality-dependent processing differences between sign languages and spoken languages, by accounting for confounds that previously invalidated processing comparisons and by delving into the specific conditions in which these differences arise. Even so, such processing differences are often dismissed out of hand as not specific to language. Summary: This review examined recent neuroscientific evidence for processing differences between sign and spoken language modalities, and the arguments against the importance of these differences. Key distinctions exist in the topography of the left anterior negativity (LAN) and in modulations of event-related potential (ERP) components such as the N400. There is also differential activation of typical spoken language processing areas, such as the conditional role of the temporal areas in sign language (SL) processing. Importantly, sign language processing uniquely recruits parietal areas for processing phonology and syntax and requires the mapping of spatial information onto internal representations. Additionally, modality-specific feedback mechanisms distinctively involve proprioceptive post-output monitoring in sign languages, in contrast to the auditory and visual feedback mechanisms of spoken languages. The only study to find ERP differences post-production revealed earlier lexical access in sign than in spoken languages. Themes of temporality, the validity of the view that the two modalities rely on analogous anatomical mechanisms, and the comprehensiveness of current language models are also discussed, with suggested improvements for future research.
Key message: Current neuroscience evidence suggests various ways in which processing differs between sign and spoken language modalities that extend beyond simple differences between languages. Consideration and further exploration of these differences will be integral in developing a more comprehensive view of language in the brain.
Affiliation(s)
- Hayley Bree Caldwell: Cognitive and Systems Neuroscience Research Hub (CSN-RH), School of Justice and Society, University of South Australia Magill Campus, Magill, South Australia, Australia
6. Emmorey K, Lee B. The neurocognitive basis of skilled reading in prelingually and profoundly deaf adults. Language and Linguistics Compass 2021; 15:e12407. [PMID: 34306178] [PMCID: PMC8302003] [DOI: 10.1111/lnc3.12407] [Received: 08/13/2020] [Accepted: 02/03/2021]
Abstract
Deaf individuals have unique sensory and linguistic experiences that influence how they read and become skilled readers. This review presents our current understanding of the neurocognitive underpinnings of reading skill in deaf adults. Key behavioural and neuroimaging studies are integrated to build a profile of skilled adult deaf readers and to examine how changes in visual attention and reduced access to auditory input and phonology shape how they read both words and sentences. Crucially, the behaviours, processes, and neural circuity of deaf readers are compared to those of hearing readers with similar reading ability to help identify alternative pathways to reading success. Overall, sensitivity to orthographic and semantic information is comparable for skilled deaf and hearing readers, but deaf readers rely less on phonology and show greater engagement of the right hemisphere in visual word processing. During sentence reading, deaf readers process visual word forms more efficiently and may have a greater reliance on and altered connectivity to semantic information compared to their hearing peers. These findings highlight the plasticity of the reading system and point to alternative pathways to reading success.
Affiliation(s)
- Karen Emmorey: School of Speech, Language and Hearing Sciences, San Diego State University, San Diego, California, USA; Joint Doctoral Program in Language and Communicative Disorders, University of California, San Diego, California, USA
- Brittany Lee: School of Speech, Language and Hearing Sciences, San Diego State University, San Diego, California, USA; Joint Doctoral Program in Language and Communicative Disorders, University of California, San Diego, California, USA
7. Bottari D, Bednaya E, Dormal G, Villwock A, Dzhelyova M, Grin K, Pietrini P, Ricciardi E, Rossion B, Röder B. EEG frequency-tagging demonstrates increased left hemispheric involvement and crossmodal plasticity for face processing in congenitally deaf signers. Neuroimage 2020; 223:117315. [DOI: 10.1016/j.neuroimage.2020.117315] [Received: 12/16/2019] [Revised: 08/06/2020] [Accepted: 08/25/2020]
8. Richardson H, Koster-Hale J, Caselli N, Magid R, Benedict R, Olson H, Pyers J, Saxe R. Reduced neural selectivity for mental states in deaf children with delayed exposure to sign language. Nat Commun 2020; 11:3246. [PMID: 32591503] [PMCID: PMC7319957] [DOI: 10.1038/s41467-020-17004-y] [Received: 06/26/2019] [Accepted: 05/28/2020]
Abstract
Language provides a rich source of information about other people's thoughts and feelings. Consequently, delayed access to language may influence conceptual development in Theory of Mind (ToM). We use functional magnetic resonance imaging and behavioral tasks to study ToM development in child (n = 33, 4-12 years old) and adult (n = 36) fluent signers of American Sign Language (ASL), and characterize neural ToM responses during ASL and movie-viewing tasks. Participants include deaf children whose first exposure to ASL was delayed up to 7 years (n = 12). Neural responses to ToM stories (specifically, selectivity of the right temporo-parietal junction) in these children resemble responses previously observed in young children, who have similar linguistic experience, rather than those of age-matched native-signing children, who have similar biological maturation. Early linguistic experience may thus facilitate ToM development via the emergence of a brain region selective for ToM.
Affiliation(s)
- Hilary Richardson: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 43 Vassar Street, 46-4021, Cambridge, MA, 02139, USA; Laboratories of Cognitive Neuroscience, Division of Developmental Medicine, Boston Children's Hospital, 1 Autumn Street, Rm. 527, Boston, MA, 02215, USA; Department of Pediatrics, Harvard Medical School, 1 Autumn Street, Rm. 527, Boston, MA, 02215, USA
- Jorie Koster-Hale: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 43 Vassar Street, 46-4021, Cambridge, MA, 02139, USA
- Naomi Caselli: Wheelock College of Education and Human Development, Boston University, 621 Commonwealth Avenue, Rm. 218, Boston, MA, 02215, USA
- Rachel Magid: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 43 Vassar Street, 46-4021, Cambridge, MA, 02139, USA
- Rachel Benedict: Wheelock College of Education and Human Development, Boston University, 621 Commonwealth Avenue, Rm. 218, Boston, MA, 02215, USA
- Halie Olson: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 43 Vassar Street, 46-4021, Cambridge, MA, 02139, USA
- Jennie Pyers: Department of Psychology, Wellesley College, 106 Central Street, Wellesley, MA, 02481, USA
- Rebecca Saxe: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 43 Vassar Street, 46-4021, Cambridge, MA, 02139, USA
9. Quer J, Steinbach M. Handling Sign Language Data: The Impact of Modality. Front Psychol 2019; 10:483. [PMID: 30914998] [PMCID: PMC6423168] [DOI: 10.3389/fpsyg.2019.00483] [Received: 05/07/2018] [Accepted: 02/19/2019]
Abstract
Natural languages come in two different modalities. The impact of modality on grammatical structure and linguistic theory has been discussed at great length over the last 20 years. By contrast, the impact of modality on linguistic data elicitation and collection, corpus studies, and experimental (psycholinguistic) studies is still underinvestigated. In this article, we address specific challenges that arise in judgment data elicitation and experimental studies of sign languages. These challenges are related to the sociolinguistic status of the Deaf community and the larger variability across signers within the same community, to the social status of sign languages, to properties of the visual-gestural modality and its interface with gesture, to methodological aspects of handling sign language data, and to specific linguistic features of sign languages. While some of these challenges also pertain to (some varieties of) spoken languages, others are more modality-specific. The particular combination of challenges discussed in this article, however, appears to be specific to empirical research on sign languages. In addition, we discuss the complementarity of theoretical approaches and experimental studies and show how the interaction of the two contributes to a better understanding of sign languages in particular and of linguistic structure in general.
Affiliation(s)
- Josep Quer: ICREA-Pompeu Fabra University, Barcelona, Spain
10. Hall WC, Smith SR, Sutter EJ, DeWindt LA, Dye TDV. Considering parental hearing status as a social determinant of deaf population health: Insights from experiences of the "dinner table syndrome". PLoS One 2018; 13:e0202169. [PMID: 30183711] [PMCID: PMC6124705] [DOI: 10.1371/journal.pone.0202169] [Received: 08/11/2017] [Accepted: 07/17/2018]
Abstract
The influence of early language and communication experiences on lifelong health outcomes is receiving increased public health attention. Most deaf children have non-signing hearing parents, and are at risk for not experiencing fully accessible language environments, a possible factor underlying known deaf population health disparities. Childhood indirect family communication–such as spontaneous conversations and listening in the routine family environment (e.g. family meals, recreation, car rides)–is an important source of health-related contextual learning opportunities. The goal of this study was to assess the influence of parental hearing status on deaf people’s recalled access to childhood indirect family communication. We analyzed data from the Rochester Deaf Health Survey–2013 (n = 211 deaf adults) for associations between sociodemographic factors including parental hearing status, and recalled access to childhood indirect family communication. Parental hearing status predicted deaf adults’ recalled access to childhood indirect family communication (χ2 = 31.939, p < .001). The likelihood of deaf adults reporting “sometimes to never” for recalled comprehension of childhood family indirect communication increased by 17.6 times for those with hearing parents. No other sociodemographic or deaf-specific factors in this study predicted deaf adults’ access to childhood indirect family communication. This study finds that deaf people who have hearing parents were more likely to report limited access to contextual learning opportunities during childhood. Parental hearing status and early childhood language experiences, therefore, require further investigation as possible social determinants of health to develop interventions that improve lifelong health and social outcomes of the underserved deaf population.
Affiliation(s)
- Wyatte C. Hall: Obstetrics & Gynecology and Clinical & Translational Science Institute, University of Rochester Medical Center, Rochester, New York, United States of America
- Scott R. Smith: Office of the Associate Dean of Research, National Technical Institute for the Deaf, Rochester Institute of Technology, Rochester, New York, United States of America
- Erika J. Sutter: National Center for Deaf Health Research, University of Rochester Medical Center, Rochester, New York, United States of America
- Lori A. DeWindt: National Center for Deaf Health Research, University of Rochester Medical Center, Rochester, New York, United States of America; Deaf Wellness Center, University of Rochester Medical Center, Rochester, New York, United States of America
- Timothy D. V. Dye: Obstetrics & Gynecology and Clinical & Translational Science Institute, University of Rochester Medical Center, Rochester, New York, United States of America; Pediatrics and Public Health Sciences, University of Rochester Medical Center, Rochester, New York, United States of America
11. Hearing improvement with softband and implanted bone-anchored hearing devices and modified implantation surgery in patients with bilateral microtia-atresia. Int J Pediatr Otorhinolaryngol 2018; 104:120-125. [PMID: 29287851] [DOI: 10.1016/j.ijporl.2017.11.010] [Received: 09/27/2017] [Revised: 10/30/2017] [Accepted: 11/01/2017]
Abstract
OBJECTIVE To evaluate auditory development and hearing improvement in patients with bilateral microtia-atresia using softband and implanted bone-anchored hearing devices, and to modify the implantation surgery. METHODS The subjects were divided into two groups: the softband group (40 infants, 3 months to 2 years old, Ponto softband) and the implanted group (6 patients, 6-28 years old, Ponto). The Infant-Toddler Meaningful Auditory Integration Scale was used to evaluate auditory development at baseline and after 3, 6, 12, and 24 months, and visual reinforcement audiometry was used to assess the auditory threshold in the softband group. In the implanted group, bone-anchored hearing devices were implanted in combination with auricular reconstruction surgery, and high-resolution CT was used to assess the deformity preoperatively. Auditory thresholds and speech discrimination scores of the patients with implants were measured under the unaided, softband, and implanted conditions. RESULTS Total Infant-Toddler Meaningful Auditory Integration Scale scores in the softband group improved significantly and approached normal levels. The average visual reinforcement audiometry values under the unaided and softband conditions were 76.75 ± 6.05 dB HL and 32.25 ± 6.20 dB HL (P < 0.01), respectively. In the implanted group, the auditory thresholds under the unaided, softband, and implanted conditions were 59.17 ± 3.76 dB HL, 32.5 ± 2.74 dB HL, and 17.5 ± 5.24 dB HL (P < 0.01), respectively. The respective speech discrimination scores were 23.33 ± 14.72%, 77.17 ± 6.46%, and 96.50 ± 2.66% (P < 0.01). CONCLUSIONS Using softband bone-anchored hearing devices is effective for auditory development and hearing improvement in infants with bilateral microtia-atresia. Wearing softband bone-anchored hearing devices before auricle reconstruction, and combining bone-anchored hearing device implantation with auricular reconstruction surgery, may be the optimal clinical choice for these patients, resulting in greater hearing improvement and minimal surgical and anesthetic injury.
12. Hall WC. What You Don't Know Can Hurt You: The Risk of Language Deprivation by Impairing Sign Language Development in Deaf Children. Matern Child Health J 2017; 21:961-965. [PMID: 28185206] [DOI: 10.1007/s10995-017-2287-y]
Abstract
A long-standing belief is that sign language interferes with spoken language development in deaf children, despite a chronic lack of evidence supporting this belief. This deserves discussion, as poor life outcomes continue to be seen in the deaf population. This commentary synthesizes research outcomes with signing and non-signing children and highlights fully accessible language as a protective factor for healthy development. Brain changes associated with language deprivation may be misrepresented as sign language interfering with the spoken language outcomes of cochlear implants. This may lead professionals and organizations to advocate against sign language exposure before implantation and to spread misinformation. The existence of a single, time-sensitive language acquisition window means there is a strong possibility of permanent brain changes when spoken language is not fully accessible to the deaf child and sign language exposure is delayed, as is often standard practice. There is no empirical evidence that sign language exposure is harmful, there is some evidence of its benefits, and there is growing evidence that lack of language access has negative implications, including cognitive delays, mental health difficulties, lower quality of life, higher trauma, and limited health literacy. Claims that cochlear implant- and spoken language-only approaches are more effective than sign language-inclusive approaches are not empirically supported. Cochlear implants are an unreliable standalone first-language intervention for deaf children. Priorities for deaf child development should focus on healthy growth across all developmental domains through a fully accessible first-language foundation such as sign language, rather than on auditory deprivation and speech skills.
Affiliation(s)
- Wyatte C Hall: Clinical & Translational Science Institute, University of Rochester Medical Center, Rochester, NY, USA
13. Mehravari AS, Emmorey K, Prat CS, Klarman L, Osterhout L. Brain-based individual difference measures of reading skill in deaf and hearing adults. Neuropsychologia 2017; 101:153-168. [PMID: 28479187] [PMCID: PMC5536185] [DOI: 10.1016/j.neuropsychologia.2017.05.004] [Received: 10/25/2016] [Revised: 04/18/2017] [Accepted: 05/03/2017]
Abstract
Most deaf children and adults struggle to read, but some deaf individuals do become highly proficient readers. There is disagreement about the specific causes of reading difficulty in the deaf population, and consequently, disagreement about the effectiveness of different strategies for teaching reading to deaf children. Much of the disagreement surrounds the question of whether deaf children read in similar or different ways as hearing children. In this study, we begin to answer this question by using real-time measures of neural language processing to assess if deaf and hearing adults read proficiently in similar or different ways. Hearing and deaf adults read English sentences with semantic, grammatical, and simultaneous semantic/grammatical errors while event-related potentials (ERPs) were recorded. The magnitude of individuals' ERP responses was compared to their standardized reading comprehension test scores, and potentially confounding variables like years of education, speechreading skill, and language background of deaf participants were controlled for. The best deaf readers had the largest N400 responses to semantic errors in sentences, while the best hearing readers had the largest P600 responses to grammatical errors in sentences. These results indicate that equally proficient hearing and deaf adults process written language in different ways, suggesting there is little reason to assume that literacy education should necessarily be the same for hearing and deaf children. The results also show that the most successful deaf readers focus on semantic information while reading, which suggests aspects of education that may promote improved literacy in the deaf population.
Affiliation(s)
- Alison S Mehravari
- Program in Neuroscience, University of Washington, Seattle, WA 98195, United States.
| | - Karen Emmorey
- School of Speech, Language and Hearing Sciences, San Diego State University, San Diego, CA 92182, United States
| | - Chantel S Prat
- Program in Neuroscience, University of Washington, Seattle, WA 98195, United States; Department of Psychology, University of Washington, Seattle, WA 98195, United States; Institute for Learning and Brain Sciences, University of Washington, Seattle, WA 98195, United States
| | - Lindsay Klarman
- Institute for Learning and Brain Sciences, University of Washington, Seattle, WA 98195, United States
| | - Lee Osterhout
- Program in Neuroscience, University of Washington, Seattle, WA 98195, United States; Department of Psychology, University of Washington, Seattle, WA 98195, United States
|
14
|
Language deprivation syndrome: a possible neurodevelopmental disorder with sociocultural origins. Soc Psychiatry Psychiatr Epidemiol 2017; 52:761-776. [PMID: 28204923] [PMCID: PMC5469702] [DOI: 10.1007/s00127-017-1351-7] [Citation(s) in RCA: 67] [Impact Index Per Article: 9.6] [Received: 09/07/2016] [Accepted: 01/22/2017] [Indexed: 10/20/2022]
Abstract
PURPOSE There is a need to better understand the epidemiological relationship between language development and psychiatric symptomatology. Language development can be particularly impacted by social factors, as seen in the developmental choices made for deaf children, which can create language deprivation. A possible mental health syndrome may be present in deaf patients with severe language deprivation. METHODS Electronic databases were searched to identify publications focusing on language development and mental health in the deaf population. Screening of relevant publications narrowed the search results to 35 publications. RESULTS Although there is very limited empirical evidence, there appear to be suggestions of a mental health syndrome from clinicians working with deaf patients. Possible features include language dysfluency, fund of knowledge deficits, and disruptions in thinking, mood, and/or behavior. CONCLUSION The clinical specialty of deaf mental health appears to be struggling with a clinically observed phenomenon that has yet to be empirically investigated and defined within the DSM. Descriptions of patients within the clinical setting suggest a language deprivation syndrome. Language development experiences have an epidemiological relationship with psychiatric outcomes in deaf people. This requires more empirical attention and has implications for other populations with behavioral health disparities as well.
|
15
|
Hänel-Faulhaber B, Skotara N, Kügow M, Salden U, Bottari D, Röder B. ERP correlates of German Sign Language processing in deaf native signers. BMC Neurosci 2014; 15:62. [PMID: 24884527] [PMCID: PMC4018965] [DOI: 10.1186/1471-2202-15-62] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.5] [Received: 12/18/2013] [Accepted: 04/28/2014] [Indexed: 11/27/2022] Open
Abstract
Background The present study investigated the neural correlates of sign language processing in Deaf people who had learned German Sign Language (Deutsche Gebärdensprache, DGS) from their Deaf parents as their first language. Correct and incorrect signed sentences were presented sign by sign on a computer screen. At the end of each sentence the participants had to judge whether or not the sentence was an appropriate DGS sentence. Two types of violations were introduced: (1) semantically incorrect sentences containing a selectional restriction violation (implausible object); (2) morphosyntactically incorrect sentences containing a verb that was incorrectly inflected (i.e., incorrect direction of movement). Event-related brain potentials (ERPs) were recorded from 74 scalp electrodes. Results Semantic violations (implausible signs) elicited an N400 effect followed by a positivity. Sentences with a morphosyntactic violation (verb agreement violation) elicited a negativity followed by a broad centro-parietal positivity. Conclusions ERP correlates of semantic and morphosyntactic aspects of DGS clearly differed from each other and showed a number of similarities with those observed in other signed and oral languages. These data suggest a similar functional organization of signed and oral languages despite the visual-spatial modality of sign language.
Affiliation(s)
- Barbara Hänel-Faulhaber
- University of Hamburg, Biological Psychology and Neuropsychology, Von-Melle-Park 11, 20146 Hamburg, Germany.
|