1. Yang T, Fan X, Hou B, Wang J, Chen X. Linguistic network in early deaf individuals: A neuroimaging meta-analysis. Neuroimage 2024;299:120720. [PMID: 38971484] [DOI: 10.1016/j.neuroimage.2024.120720]
Abstract
This meta-analysis summarizes evidence from 44 neuroimaging experiments and characterizes the general linguistic network in early deaf individuals. Meta-analytic comparisons with hearing individuals found that a specific set of regions (in particular the left inferior frontal gyrus and posterior middle temporal gyrus) participates in supramodal language processing. Beyond previously described modality-specific differences, the left calcarine gyrus and the right caudate were additionally recruited in deaf compared with hearing individuals. The study further showed that the bilateral posterior superior temporal gyrus is shaped by cross-modal plasticity, whereas the left frontotemporal areas are shaped by early language experience. Although an overall left-lateralized pattern for language processing was observed in the early deaf individuals, regional lateralization was altered in the inferior frontal gyrus and anterior temporal lobe. These findings indicate that the core language network functions in a modality-independent manner, and they provide a foundation for determining the contributions of sensory and linguistic experience in shaping the neural bases of language processing.
Affiliation(s)
- Tengyu Yang, Department of Otolaryngology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
- Xinmiao Fan, Department of Otolaryngology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
- Bo Hou, Department of Radiology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
- Jian Wang, Department of Otolaryngology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
- Xiaowei Chen, Department of Otolaryngology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
2. Hall WC, Hecht JL. Primary health-care practices for deaf children should include early incorporation of a signed language. Lancet 2024:S0140-6736(24)01564-2. [PMID: 39216495] [DOI: 10.1016/s0140-6736(24)01564-2]
Affiliation(s)
- Wyatte C Hall, University of Rochester Medical Center, Rochester, NY 14620, USA
- Julia L Hecht, University of New Mexico School of Medicine, Albuquerque, NM, USA
3. Ocuto OL. Deaf children, home language environments, and reciprocal-contingent family interactions. Journal of Deaf Studies and Deaf Education 2024;29:322-334. [PMID: 38159302] [DOI: 10.1093/deafed/enad063]
Abstract
Engaged communication between mother and child during the child's early developmental stages is one of the predictors of children's development of higher-order thinking skills. For deaf children, this engaged communication hinges on the home language environment (HLE) being fully accessible to the child. This research uses agogical phenomenology to explore the lived experiences of participants in HLEs where sign language is used, with particular focus on opportunities for extended discourse. Data were collected via semistructured interviews with deaf children and their parents, together with observations in the HLEs of five signing families in the southwestern United States, each with at least one deaf child. The aim of this study was to document and provide insights into how language use in deaf children's HLEs can affect their knowledge development; these insights uncovered reciprocal and contingent family interactions as a central aspect of the deaf child's HLE. It is hoped that the qualitative phenomenological findings will frame subsequent quantitative investigations of variability in language access to components of the HLE.
Affiliation(s)
- Oscar L Ocuto, Department of Education, Gallaudet University, 800 Florida Avenue NE, Washington, DC, United States
4. Gioiosa Maurno N, Phillips-Silver J, Daza González MT. Research of visual attention networks in deaf individuals: a systematic review. Front Psychol 2024;15:1369941. [PMID: 38800679] [PMCID: PMC11120974] [DOI: 10.3389/fpsyg.2024.1369941]
Abstract
The impact of deafness on visual attention has been widely discussed in previous research, which has reported both deficits and strengths that can be attributed to temporal or spatial aspects of attention, as well as to variations in development and clinical characteristics. Visual attention is categorized into three networks: orienting (exogenous and endogenous), alerting (phasic and tonic), and executive control. This review aims to bring together new neuroscientific evidence within this framework. It presents a systematic review of the international literature from the past 15 years focused on visual attention in the deaf population; the final review included 24 articles. The function of the orienting network is found to be enhanced in deaf adults and children, primarily in native signers without cochlear implants, whereas differences in endogenous orienting are observed only in the context of gaze cues in children, with no differences found in adults. Results regarding alerting and executive function vary depending on the clinical characteristics of participants and the paradigms used. Implications for future research on visual attention in the deaf population are discussed.
Affiliation(s)
- Nahuel Gioiosa Maurno, Department of Psychology and CIBIS Research Center, University of Almería, Almería, Spain
- María Teresa Daza González, Department of Psychology and CIBIS Research Center, University of Almería, Almería, Spain
5. Pontecorvo E, Higgins M, Mora J, Lieberman AM, Pyers J, Caselli NK. Learning a Sign Language Does Not Hinder Acquisition of a Spoken Language. Journal of Speech, Language, and Hearing Research 2023;66:1291-1308. [PMID: 36972338] [PMCID: PMC10187967] [DOI: 10.1044/2022_jslhr-22-00505]
Abstract
PURPOSE: The purpose of this study is to determine whether and how learning American Sign Language (ASL) is associated with spoken English skills in a sample of ASL-English bilingual deaf and hard of hearing (DHH) children.
METHOD: This cross-sectional study of vocabulary size included 56 DHH children between 8 and 60 months of age who were learning both ASL and spoken English and had hearing parents. English and ASL vocabulary were independently assessed via parent report checklists.
RESULTS: ASL vocabulary size positively correlated with spoken English vocabulary size. Spoken English vocabulary sizes in the ASL-English bilingual DHH children in the present sample were comparable to those in previous reports of monolingual DHH children who were learning only English. ASL-English bilingual DHH children had total vocabularies (combining ASL and English) that were equivalent to same-age hearing monolingual children. Children with large ASL vocabularies were more likely to have spoken English vocabularies in the average range based on norms for hearing monolingual children.
CONCLUSIONS: Contrary to predictions often cited in the literature, acquisition of sign language does not harm spoken vocabulary acquisition. This retrospective, correlational study cannot determine whether there is a causal relationship between sign language and spoken language vocabulary acquisition, but if a causal relationship exists, the evidence here suggests that the effect would be positive. Bilingual DHH children have age-expected vocabularies when considering the entirety of their language skills. We found no evidence to support recommendations that families with DHH children avoid learning sign language. Rather, our findings show that children with early ASL exposure can develop age-appropriate vocabulary skills in both ASL and spoken English.
6. Ma HL, Zeng TA, Jiang L, Zhang M, Li H, Su R, Wang ZX, Chen DM, Xu M, Xie WT, Dang P, Bu XO, Zhang T, Wang TZ. Altered resting-state network connectivity patterns for predicting attentional function in deaf individuals: An EEG study. Hear Res 2023;429:108696. [PMID: 36669260] [DOI: 10.1016/j.heares.2023.108696]
Abstract
Multiple aspects of brain development are influenced by early sensory loss such as deafness. Despite growing evidence of changes in attentional functions in prelingually, profoundly deaf individuals, the brain mechanisms underlying these attentional changes remain unclear. This study investigated the relationship between attentional differences and resting-state brain network differences in deaf individuals from the perspective of brain network connectivity. We recruited 36 deaf individuals and 34 healthy controls (HC). We recorded each participant's resting-state electroencephalogram (EEG) and event-related potential (ERP) data from the Attention Network Test (ANT). The coherence (COH) method and graph theory were used to build brain networks and analyze network connectivity. First, ERPs in the task state were analyzed. Then, the topological properties of resting-state network functional connectivity were correlated with the ERPs. The results revealed a significant correlation between frontal-occipital connectivity in the resting state and the alerting N1 amplitude in the alpha band. Specifically, clustering coefficients and global and local efficiency correlated negatively with alerting N1 amplitude, whereas characteristic path length correlated positively with it. In addition, deaf individuals exhibited weaker frontal-occipital connections than the HC group. In executive control, the deaf group had longer reaction times and larger P3 amplitudes, whereas the orienting function did not differ significantly from the HC group. Finally, the alerting N1 amplitude in the ANT task for deaf individuals was predicted using a multiple linear regression model based on resting-state EEG network properties. Our results suggest that deafness affects alerting and executive control, while orienting functions develop similarly to those of hearing individuals. Furthermore, weakened frontal-occipital connections in the deaf brain are a fundamental cause of altered alerting functions in the deaf. These results reveal important effects of brain networks on attentional function from the perspective of brain connectivity and provide potential physiological biomarkers for predicting attention.
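The network measures named in this abstract (clustering coefficient, characteristic path length, global and local efficiency) are standard graph-theoretic quantities. The sketch below is a rough illustration only, not the authors' pipeline: the toy data, threshold choice, and variable names are assumptions. It shows how such measures could be derived from an alpha-band coherence matrix and related to alerting N1 amplitude with a multiple linear regression.

```python
import numpy as np
import networkx as nx
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Toy stand-in for an alpha-band coherence matrix (channels x channels);
# a real pipeline would estimate coherence from resting-state EEG.
n_channels = 32
coh = rng.uniform(0.0, 1.0, size=(n_channels, n_channels))
coh = (coh + coh.T) / 2            # coherence is symmetric
np.fill_diagonal(coh, 0.0)

# Binarize at an (assumed) proportional threshold to obtain an undirected graph.
threshold = np.percentile(coh, 80)
adjacency = (coh >= threshold).astype(int)
G = nx.from_numpy_array(adjacency)

# Network properties of the kind reported in the abstract.
clustering = nx.average_clustering(G)
global_eff = nx.global_efficiency(G)
local_eff = nx.local_efficiency(G)
# Characteristic path length is defined on the largest connected component.
giant = G.subgraph(max(nx.connected_components(G), key=len))
char_path = nx.average_shortest_path_length(giant)

print(clustering, global_eff, local_eff, char_path)

# Predicting the alerting N1 amplitude from such properties across subjects
# then reduces to an ordinary multiple linear regression (toy data here).
X = rng.normal(size=(36, 4))       # 36 subjects x 4 network properties
y = rng.normal(size=36)            # alerting N1 amplitudes
model = LinearRegression().fit(X, y)
predicted_n1 = model.predict(X)
```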
Affiliation(s)
- Hai-Lin Ma, Faculty of Education, Shaanxi Normal University, No. 199 Chang'an Road, Yanta District, Xi'an, Shaanxi 710062, China; Plateau Brain Science Research Center, Tibet University/South China Normal University, Lhasa 850012/Guangzhou 510631, China
- Tong-Ao Zeng, Plateau Brain Science Research Center, Tibet University/South China Normal University, Lhasa 850012/Guangzhou 510631, China
- Lin Jiang, School of Life Science and Technology, Center for Information in BioMedicine, University of Electronic Science and Technology of China, Chengdu 611731, China
- Mei Zhang, College of Special Education, Leshan Normal University, Leshan 614000, China
- Hao Li, Plateau Brain Science Research Center, Tibet University/South China Normal University, Lhasa 850012/Guangzhou 510631, China
- Rui Su, Plateau Brain Science Research Center, Tibet University/South China Normal University, Lhasa 850012/Guangzhou 510631, China
- Zhi-Xin Wang, Plateau Brain Science Research Center, Tibet University/South China Normal University, Lhasa 850012/Guangzhou 510631, China; Department of Psychology, Shandong Normal University, No. 88 East Wenhua Road, Jinan, Shandong 250014, China
- Dong-Mei Chen, Plateau Brain Science Research Center, Tibet University/South China Normal University, Lhasa 850012/Guangzhou 510631, China
- Meng Xu, Plateau Brain Science Research Center, Tibet University/South China Normal University, Lhasa 850012/Guangzhou 510631, China
- Wen-Ting Xie, Plateau Brain Science Research Center, Tibet University/South China Normal University, Lhasa 850012/Guangzhou 510631, China
- Peng Dang, Plateau Brain Science Research Center, Tibet University/South China Normal University, Lhasa 850012/Guangzhou 510631, China
- Xiao-Ou Bu, Plateau Brain Science Research Center, Tibet University/South China Normal University, Lhasa 850012/Guangzhou 510631, China; Faculty of Education, East China Normal University, Shanghai 200062, China
- Tao Zhang, Mental Health Education Center and School of Science, Xihua University, Chengdu 610039, China
- Ting-Zhao Wang, Faculty of Education, Shaanxi Normal University, No. 199 Chang'an Road, Yanta District, Xi'an, Shaanxi 710062, China
7. Intersecting distributed networks support convergent linguistic functioning across different languages in bilinguals. Commun Biol 2023;6:99. [PMID: 36697483] [PMCID: PMC9876897] [DOI: 10.1038/s42003-023-04446-5]
Abstract
How bilingual brains accomplish the processing of more than one language has been widely investigated by neuroimaging studies. The assimilation-accommodation hypothesis holds that both the brain networks supporting the native language and additional new neural networks are utilized to implement second language processing. However, whether and how this hypothesis applies at the finer-grained levels of both brain anatomical organization and linguistic function remains unknown. To address this issue, we scanned Chinese-English bilinguals during an implicit reading task involving Chinese words, English words, and Chinese pinyin. We observed broad cortical regions wherein interdigitated, distributed neural populations supported the same cognitive components of different languages. Although spatially separate, regions including the opercular and triangular parts of the inferior frontal gyrus, temporal pole, superior and middle temporal gyrus, precentral gyrus, and supplementary motor areas were found to perform the same linguistic functions across languages, indicating regional-level functional assimilation supported by voxel-wise anatomical accommodation. Taken together, the findings not only verify the functional independence of the neural representations of different languages but also show a co-representation organization of both languages in most language regions, revealing linguistic-feature-specific accommodation and assimilation between first and second languages.
8. Humphries T, Mathur G, Napoli DJ, Padden C, Rathmann C. Deaf Children Need Rich Language Input from the Start: Support in Advising Parents. Children (Basel) 2022;9:1609. [PMID: 36360337] [PMCID: PMC9688581] [DOI: 10.3390/children9111609]
Abstract
Bilingual bimodalism is a great benefit to deaf children at home and in schooling. Deaf signing children perform better overall than non-signing deaf children, regardless of whether they use a cochlear implant. Raising a deaf child in a speech-only environment can carry cognitive and psycho-social risks that may have lifelong adverse effects. For children born deaf, or who become deaf in early childhood, we recommend comprehensible multimodal language exposure and engagement in joint activity with parents and friends to assure age-appropriate first-language acquisition. Accessible visual language input should begin as close to birth as possible. Hearing parents will need timely and extensive support; thus, we propose that, upon the birth of a deaf child and through the preschool years, among other things, the family needs an adult deaf presence in the home for several hours every day to be a linguistic model, to guide the family in taking sign language lessons, to show the family how to make spoken language accessible to their deaf child, and to be an encouraging liaison to deaf communities. While such a support program will be complicated and challenging to implement, it is far less costly than the harm of linguistic deprivation.
Affiliation(s)
- Tom Humphries, Department of Communication, University of California at San Diego, La Jolla, CA 92093, USA
- Gaurav Mathur, Department of Linguistics, Gallaudet University, Washington, DC 20002, USA
- Donna Jo Napoli, Department of Linguistics, Swarthmore College, Swarthmore, PA 19081, USA
- Carol Padden, Department of Communication and Division of Social Sciences (Dean), University of California at San Diego, La Jolla, CA 92093, USA
- Christian Rathmann, Department of Deaf Studies and Sign Language Interpreting, Humboldt-Universität zu Berlin, 10019 Berlin, Germany
9. Villwock A, Grin K. Somatosensory processing in deaf and deafblind individuals: How does the brain adapt as a function of sensory and linguistic experience? A critical review. Front Psychol 2022;13:938842. [PMID: 36324786] [PMCID: PMC9618853] [DOI: 10.3389/fpsyg.2022.938842]
Abstract
How do deaf and deafblind individuals process touch? This question offers a unique model to understand the prospects and constraints of neural plasticity. Our brain constantly receives and processes signals from the environment and combines them into the most reliable information content. The nervous system adapts its functional and structural organization according to the input, and perceptual processing develops as a function of individual experience. However, there are still many unresolved questions regarding the deciding factors for these changes in deaf and deafblind individuals, and so far, findings are not consistent. To date, most studies have not taken the sensory and linguistic experiences of the included participants into account. As a result, the impact of sensory deprivation vs. language experience on somatosensory processing remains inconclusive. Even less is known about the impact of deafblindness on brain development. The resulting neural adaptations could be even more substantial, but no clear patterns have yet been identified. How do deafblind individuals process sensory input? Studies on deafblindness have mostly focused on single cases or groups of late-blind individuals. Importantly, the language backgrounds of deafblind communities are highly variable and include the usage of tactile languages. So far, this kind of linguistic experience and its consequences have not been considered in studies on basic perceptual functions. Here, we will provide a critical review of the literature, aiming at identifying determinants for neuroplasticity and gaps in our current knowledge of somatosensory processing in deaf and deafblind individuals.
Affiliation(s)
- Agnes Villwock, Sign Languages, Department of Rehabilitation Sciences, Humboldt-Universität zu Berlin, Berlin, Germany
10. Zhou X, Feng M, Hu Y, Zhang C, Zhang Q, Luo X, Yuan W. The Effects of Cortical Reorganization and Applications of Functional Near-Infrared Spectroscopy in Deaf People and Cochlear Implant Users. Brain Sci 2022;12:1150. [PMID: 36138885] [PMCID: PMC9496692] [DOI: 10.3390/brainsci12091150]
Abstract
A cochlear implant (CI) is currently the only FDA-approved biomedical device that can restore hearing for the majority of patients with severe-to-profound sensorineural hearing loss (SNHL). While prelingually and postlingually deaf individuals benefit substantially from CI, outcomes after implantation vary greatly. Numerous studies have attempted to identify the variables that affect CI outcomes, including the personal characteristics of CI candidates, environmental variables, and device-related variables; however, these variables only roughly predict auditory performance with a CI, and up to 80% of the variability in outcomes remains unexplained. Brain structure/function differences after hearing deprivation, that is, cortical reorganization, have gradually attracted the attention of neuroscientists. The cross-modal reorganization of the auditory cortex following deafness is thought to be a key factor in the success of CI. Because the neural mechanisms by which this reorganization affects CI learning and rehabilitation have not been revealed, its adaptive and maladaptive consequences for CI outcomes have recently been the subject of debate. This review describes the evidence for different roles of cross-modal reorganization in CI performance and attempts to explore the possible reasons. Understanding the core influencing mechanism also requires taking into account the cortical changes from deafness to hearing restoration; however, methodological issues have restricted longitudinal research on cortical function in CI users. Functional near-infrared spectroscopy (fNIRS) has been increasingly used for the study of brain function and language assessment in CI because of its unique advantages and is considered to have great potential. Here, we review studies on cross-modal reorganization of the auditory cortex in deaf patients and CI recipients, and we illustrate the feasibility of fNIRS as a neuroimaging tool for predicting and assessing speech performance in CI recipients.
Affiliation(s)
- Xiaoqing Zhou, Menglong Feng, Yaqin Hu, Chanyuan Zhang, Qingling Zhang, Xiaoqin Luo, and Wei Yuan: Department of Otolaryngology, Chongqing General Hospital, Chongqing 401147, China; Chongqing Medical University, Chongqing 400042, China; Chongqing School, University of Chinese Academy of Sciences, Chongqing 400714, China; Chongqing Institute of Green and Intelligent Technology, University of Chinese Academy of Sciences, Chongqing 400714, China
- Correspondence: Wei Yuan; Tel.: +86-23-63535180
11. Tomaszewski P, Krzysztofiak P, Morford JP, Eźlakowski W. Effects of Age-of-Acquisition on Proficiency in Polish Sign Language: Insights to the Critical Period Hypothesis. Front Psychol 2022;13:896339. [PMID: 35693522] [PMCID: PMC9174753] [DOI: 10.3389/fpsyg.2022.896339]
Abstract
This study focuses on the relationship between the age of acquisition of Polish Sign Language (PJM) by deaf individuals and their receptive language skills at the phonological, morphological and syntactic levels. Sixty Deaf signers of PJM were recruited into three equal groups (n = 20): (1) a group exposed to PJM from birth from their deaf parents; (2) a group of childhood learners of PJM, who reported learning PJM between 4 and 8 years of age; (3) a group of adolescent learners of PJM, who reported learning PJM between 9 and 13 years of age. The PJM Perception and Comprehension Test was used to assess three aspects of language processing: phonological, morphological and syntactic. Participants were asked to decide whether a series of signs and sentences were acceptable in PJM. Results show that the age of PJM acquisition has a significant impact on performance on this task. The earlier deaf people acquired PJM, the more accurately they distinguished signs and sentences considered permissible or impermissible in PJM by native signers. Native signers had significantly greater accuracy on the phonological, morphological, and syntactic items than either the Childhood or the Adolescent signers. Further, the Childhood signers had significantly greater accuracy than the Adolescent signers on all three parts of the test. Comparing performance on specific structures targeted within each part of the test revealed that multi-channel signs and negative suffixes posed the greatest challenge for Adolescent signers relative to the Native signers. These results provide evidence from a less commonly studied signed language that the age of onset of first language acquisition affects ultimate outcomes across all levels of grammatical structure. In addition, this research corroborates prior studies demonstrating that the critical period is independent of language modality. Contrary to a common public health assumption that early exposure to language is less vital to signed than to spoken language development, the results of this study demonstrate that early exposure to a signed language promotes sensitivity to phonological, morphological and syntactic patterns in language.
Affiliation(s)
- Piotr Krzysztofiak, Faculty of Psychology, SWPS University of Social Sciences and Humanities, Warsaw, Poland
- Jill P. Morford, Department of Linguistics, University of New Mexico, Albuquerque, NM, United States
12. Holmer E, Schönström K, Andin J. Associations Between Sign Language Skills and Resting-State Functional Connectivity in Deaf Early Signers. Front Psychol 2022;13:738866. [PMID: 35369269] [PMCID: PMC8975249] [DOI: 10.3389/fpsyg.2022.738866]
Abstract
The processing of a language involves a neural language network including temporal, parietal, and frontal cortical regions. This applies to spoken as well as signed languages. Previous research suggests that spoken language proficiency is associated with resting-state functional connectivity (rsFC) between language regions and other regions of the brain. Given the similarities in neural activation for spoken and signed languages, rsFC-behavior associations should also exist for sign language tasks. In this study, we explored the associations between rsFC and two types of linguistic skills in sign language: phonological processing skill and accuracy in elicited sentence production. Fifteen adult, deaf early signers were enrolled in a resting-state functional magnetic resonance imaging (fMRI) study. In addition to fMRI data, behavioral tests of sign language phonological processing and sentence reproduction were administered. Using seed-to-voxel connectivity analysis, we investigated associations between behavioral proficiency and rsFC from language-relevant nodes: bilateral inferior frontal gyrus (IFG) and posterior superior temporal gyrus (STG). Results showed that worse sentence processing skill was associated with stronger positive rsFC between the left IFG and left sensorimotor regions. Further, sign language phonological processing skill was associated with positive rsFC from right IFG to middle frontal gyrus/frontal pole although this association could possibly be explained by domain-general cognitive functions. Our findings suggest a possible connection between rsFC and developmental language outcomes in deaf individuals.
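Seed-to-voxel connectivity analysis of the kind described in this abstract reduces to correlating a seed region's mean BOLD time course with every voxel's time course and Fisher z-transforming the result for group-level statistics. The sketch below is a generic illustration under assumed toy array shapes and variable names; it is not the authors' actual analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy resting-state data: time points x voxels, plus a boolean seed mask
# (e.g., left IFG). Real data would come from preprocessed fMRI volumes.
n_timepoints, n_voxels = 200, 5000
bold = rng.normal(size=(n_timepoints, n_voxels))
seed_mask = np.zeros(n_voxels, dtype=bool)
seed_mask[:50] = True

# Seed time course = mean signal over seed voxels.
seed_ts = bold[:, seed_mask].mean(axis=1)

# Pearson correlation between the seed and every voxel, then Fisher z-transform.
bold_z = (bold - bold.mean(axis=0)) / bold.std(axis=0)
seed_z = (seed_ts - seed_ts.mean()) / seed_ts.std()
r_map = bold_z.T @ seed_z / n_timepoints           # voxel-wise correlation map
z_map = np.arctanh(np.clip(r_map, -0.999, 0.999))  # Fisher z for group statistics

# Across participants, each subject's z_map (at a voxel or cluster) can then be
# related to a behavioral score such as sentence-reproduction accuracy.
```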
Affiliation(s)
- Emil Holmer, Linnaeus Centre HEAD, Swedish Institute for Disability Research, Department of Behavioural Sciences and Learning, Linköping University, Linköping, Sweden; Center for Medical Image Science and Visualization, Linköping, Sweden (corresponding author)
- Josefine Andin, Linnaeus Centre HEAD, Swedish Institute for Disability Research, Department of Behavioural Sciences and Learning, Linköping University, Linköping, Sweden
13. Shen YX, Zhang C, Zuo L, Zhou X, Deng X, Zhang L. How I Speak Defines What I Do: Effects of the Functional Language Proficiency of Host Country Employees on Their Unethical Pro-organizational Behavior. Front Psychol 2022;13:852450. [PMID: 35369215] [PMCID: PMC8971832] [DOI: 10.3389/fpsyg.2022.852450]
Abstract
Functional language has been used in many multinational corporations (MNCs) as a way to overcome the problems caused by the coexistence of multiple languages in the workplace. The existing literature has explored the importance, adoption, and effectiveness of functional language. Yet, how functional language shapes host country employees' moral cognition and behavior is insufficiently researched. Guided by Social Identity Theory, this study shows that host country employees' functional language proficiency (i.e., English) enhances their unethical pro-organizational behavior through their linguistic group identification and moral disengagement. We tested our predictions using data collected from 309 full-time host country employees through an online survey, and the results generally supported our hypotheses. The findings contribute both to the international management and language literature and to the literature on organizational moral behavior.
Affiliation(s)
- Ya Xi Shen, Business School, Hunan University, Changsha, China
- Chuang Zhang, Business School, Hunan University, Changsha, China
- Lamei Zuo, Business School, Hunan University, Changsha, China
- Xingxing Zhou, Institute of Facility Agriculture, Guangdong Academy of Agricultural Sciences, Guangzhou, China (corresponding author)
- Xuhui Deng, Business School, Hunan University, Changsha, China
- Long Zhang, Business School, Hunan University, Changsha, China
14. Goldberg EB, Hillis AE. Sign language aphasia. Handbook of Clinical Neurology 2022;185:297-315. [PMID: 35078607] [DOI: 10.1016/b978-0-12-823384-9.00019-0]
Abstract
Signed languages are naturally occurring, fully formed linguistic systems that rely on the movement of the hands, arms, torso, and face within a sign space for production, and are perceived predominantly through visual perception. Despite stark differences in modality and linguistic structure, functional neural organization is strikingly similar to that of spoken language. Generally speaking, left frontal areas support sign production, and regions in the auditory cortex underlie sign comprehension, even though signers do not rely on audition to process language. Consequently, should a deaf or hearing signer suffer damage to the left cerebral hemisphere, language is vulnerable to impairment. Multiple cases of sign language aphasia have been documented following left hemisphere injury, and the general pattern of linguistic deficits mirrors that observed in spoken language. The right hemisphere likely plays a role in non-linguistic but critical visuospatial functions of sign language; therefore, individuals who are spared damage to the left hemisphere but suffer injury to the right are at risk for a different set of communication deficits. In this chapter, we review the neurobiology of sign language and the patterns of language deficits that follow brain injury in the deaf signing population.
Affiliation(s)
- Emily B Goldberg, Department of Neurology, Johns Hopkins University School of Medicine, Baltimore, MD, United States
- Argye Elizabeth Hillis, Department of Neurology, Johns Hopkins University School of Medicine; Department of Physical Medicine and Rehabilitation, Johns Hopkins University School of Medicine; Department of Cognitive Science, Johns Hopkins University, Baltimore, MD, United States
15. Matchin W, İlkbaşaran D, Hatrak M, Roth A, Villwock A, Halgren E, Mayberry RI. The Cortical Organization of Syntactic Processing Is Supramodal: Evidence from American Sign Language. J Cogn Neurosci 2022;34:224-235. [PMID: 34964898] [PMCID: PMC8764739] [DOI: 10.1162/jocn_a_01790]
Abstract
Areas within the left-lateralized neural network for language have been found to be sensitive to syntactic complexity in spoken and written language. Previous research has revealed that these areas are active for sign language as well, but whether they are specifically responsive to syntactic complexity in sign language, independent of lexical processing, has yet to be determined. To investigate this question, we used fMRI to examine deaf native signers' comprehension of 180 sign strings in American Sign Language (ASL) with a picture-probe recognition task. The ASL strings were all six signs in length but varied across three levels of syntactic complexity: sign lists, two-word sentences, and complex sentences. Syntactic complexity significantly affected comprehension and memory, both behaviorally and neurally, by facilitating accuracy and response time on the picture-probe recognition task and eliciting a left-lateralized activation response pattern in anterior and posterior superior temporal sulcus (aSTS and pSTS). Minimal or absent syntactic structure reduced picture-probe recognition and elicited activation in bilateral pSTS and occipital-temporal cortex. These results provide evidence from a sign language, ASL, that the combinatorial processing of anterior STS and pSTS is supramodal in nature. The results further suggest that the neurolinguistic processing of ASL is characterized by overlapping and separable neural systems for syntactic and lexical processing.
Affiliation(s)
- William Matchin, University of California San Diego; University of South Carolina, Columbia
- Agnes Villwock, University of California San Diego; Humboldt University of Berlin
16. Benetti S, Collignon O. Cross-modal integration and plasticity in the superior temporal cortex. Handbook of Clinical Neurology 2022;187:127-143. [PMID: 35964967] [DOI: 10.1016/b978-0-12-823493-8.00026-2]
Abstract
In congenitally deaf people, temporal regions typically believed to be primarily auditory enhance their response to nonauditory information. The neural mechanisms and functional principles underlying this phenomenon, as well as its impact on auditory recovery after sensory restoration, remain debated. In this chapter, we demonstrate that the cross-modal recruitment of temporal regions by visual inputs in congenitally deaf people follows organizational principles known to be present in the hearing brain. We propose that the functional and structural mechanisms allowing optimal convergence of multisensory information in the temporal cortex of hearing people also provide the neural scaffolding for feeding visual or tactile information into the deafened temporal areas. Innate in nature, such anatomo-functional links between the auditory and other sensory systems would represent the common substrate of both early multisensory integration and the expression of selective cross-modal plasticity in the superior temporal cortex.
Affiliation(s)
- Stefania Benetti, Center for Mind/Brain Sciences (CIMeC), University of Trento, Trento, Italy
- Olivier Collignon, Center for Mind/Brain Sciences (CIMeC), University of Trento, Trento, Italy; Institute for Research in Psychology and Neuroscience, Faculty of Psychology and Educational Science, UC Louvain, Louvain-la-Neuve, Belgium
17.
Abstract
The first 40 years of research on the neurobiology of sign languages (1960-2000) established that the same key left hemisphere brain regions support both signed and spoken languages, based primarily on evidence from signers with brain injury and, at the end of the 20th century, on evidence from emerging functional neuroimaging technologies (positron emission tomography and fMRI). Building on this earlier work, this review focuses on what we have learned about the neurobiology of sign languages in the last 15-20 years, what controversies remain unresolved, and directions for future research. Production and comprehension processes are addressed separately in order to capture whether and how output and input differences between sign and speech impact the neural substrates supporting language. In addition, the review includes aspects of language that are unique to sign languages, such as pervasive lexical iconicity, fingerspelling, linguistic facial expressions, and depictive classifier constructions. Summary sketches of the neural networks supporting sign language production and comprehension are provided with the hope that these will inspire future research as we begin to develop a more complete neurobiological model of sign language processing.
18. Amadeo MB, Tonelli A, Campus C, Gori M. Reduced flash lag illusion in early deaf individuals. Brain Res 2021;1776:147744. [PMID: 34848173] [DOI: 10.1016/j.brainres.2021.147744]
Abstract
When a brief flash is presented aligned with a moving target, the flash typically appears to lag behind the moving stimulus. This effect is widely known in the literature as the flash-lag illusion (FLI), an example of a motion-induced position shift. Since auditory deprivation leads to both enhanced visual skills and impaired temporal abilities, both crucial for the perception of the flash-lag effect, we hypothesized that lack of audition could influence the FLI. Thirteen early deaf and 18 hearing individuals were tested in a visual FLI paradigm to investigate this hypothesis. As expected, results demonstrated a reduction of the flash-lag effect following early deafness, in both the central and peripheral visual fields. Moreover, only in deaf individuals was there a positive correlation between the flash-lag effect in the peripheral and the central visual field, suggesting that the mechanisms underlying the effect in the center of the visual field expand to the periphery following deafness. Overall, these findings reveal that lack of audition early in life profoundly impacts the early visual processing underlying the flash-lag effect.
Affiliation(s)
- Maria Bianca Amadeo, U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Via E. Melen 83, 16152 Genova, Italy
- Alessia Tonelli, U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Via E. Melen 83, 16152 Genova, Italy
- Claudio Campus, U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Via E. Melen 83, 16152 Genova, Italy
- Monica Gori, U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Via E. Melen 83, 16152 Genova, Italy
19. Deaf Children of Hearing Parents Have Age-Level Vocabulary Growth When Exposed to American Sign Language by 6 Months of Age. J Pediatr 2021;232:229-236. [PMID: 33482219] [PMCID: PMC8085057] [DOI: 10.1016/j.jpeds.2021.01.029]
Abstract
OBJECTIVE: To examine whether children who are deaf or hard of hearing who have hearing parents can develop age-level vocabulary skills when they have early exposure to a sign language.
STUDY DESIGN: This cross-sectional study of vocabulary size included 78 children who are deaf or hard of hearing between 8 and 68 months of age who were learning American Sign Language (ASL) and had hearing parents. Children who were exposed to ASL before 6 months of age or between 6 and 36 months of age were compared with a reference sample of 104 deaf and hard of hearing children who have parents who are deaf and sign.
RESULTS: Deaf and hard of hearing children with hearing parents who were exposed to ASL in the first 6 months of life had age-expected receptive and expressive vocabulary growth. Children who had a short delay in ASL exposure had relatively smaller expressive but not receptive vocabulary sizes, and made rapid gains.
CONCLUSIONS: Although hearing parents generally learn ASL alongside their children who are deaf, their children can develop age-expected vocabulary skills when exposed to ASL during infancy. Children who are deaf with hearing parents can predictably and consistently develop age-level vocabularies at rates similar to native signers; early vocabulary skills are robust predictors of development across domains.
20. Palejwala AH, Dadario NB, Young IM, O'Connor K, Briggs RG, Conner AK, O'Donoghue DL, Sughrue ME. Anatomy and White Matter Connections of the Lingual Gyrus and Cuneus. World Neurosurg 2021;151:e426-e437. [PMID: 33894399] [DOI: 10.1016/j.wneu.2021.04.050]
Abstract
BACKGROUND: The medial occipital lobe, composed of the lingual gyrus and cuneus, is necessary for both basic and higher level visual processing. It is also known to facilitate cross-modal, nonvisual functions, such as linguistic processing and verbal memory, after the loss of the visual senses. A detailed cortical model elucidating the white matter connectivity associated with this area could improve our understanding of the interacting brain networks that underlie complex human processes and postoperative outcomes related to vision and language.
METHODS: Generalized q-sampling imaging tractography, validated by gross anatomic dissection for qualitative visual agreement, was performed on 10 healthy adult controls obtained from the Human Connectome Project.
RESULTS: Major white matter connections were identified by tractography and validated by gross dissection, which connected the medial occipital lobe with itself and the adjacent cortices, especially the temporal lobe. The short- and long-range connections identified consisted mainly of U-shaped association fibers, intracuneal fibers, and inferior fronto-occipital fasciculus, inferior longitudinal fasciculus, middle longitudinal fasciculus, and lingual-fusiform connections.
CONCLUSIONS: The medial occipital lobe is an extremely interconnected system, supporting its ability to perform coordinated basic visual processing, but also serves as a center for many long-range association fibers, supporting its importance in nonvisual functions, such as language and memory. The presented data represent clinically actionable anatomic information that can be used in multimodal navigation of white matter lesions in the medial occipital lobe to prevent neurologic deficits and improve patients' quality of life after cerebral surgery.
Affiliation(s)
- Ali H Palejwala, Department of Neurosurgery, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma, USA
- Nicholas B Dadario, Robert Wood Johnson Medical School, Rutgers University, New Brunswick, New Jersey, USA
- Kyle O'Connor, Department of Neurosurgery, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma, USA
- Robert G Briggs, Department of Neurosurgery, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma, USA
- Andrew K Conner, Department of Neurosurgery, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma, USA
- Daniel L O'Donoghue, Department of Cell Biology, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma, USA
- Michael E Sughrue, Centre for Minimally Invasive Neurosurgery, Prince of Wales Private Hospital, Sydney, New South Wales, Australia
21. Clark MD, Greene-Woods A, Alofi A, Sides M, Buchanan B, Hauschildt S, Alford A, Courson F, Venable T. The Spoken Language Checklist: A User-Friendly Normed Language Acquisition Checklist. Journal of Deaf Studies and Deaf Education 2021;26:251-262. [PMID: 33555011] [DOI: 10.1093/deafed/enaa043]
Abstract
Many variables affect the spoken language acquisition of deaf and hard of hearing (DHH) children; therefore, it is critical for parents and professionals to have appropriate tools to monitor language acquisition. The Spoken Language Checklist (SLC) was developed to monitor and identify developmental milestones in a user-friendly checklist format that includes norms. The availability of the SLC will help parents and professionals monitor the spoken language development of DHH children and provide interventions should any delays be observed. Recognizing these delays early could prevent otherwise insurmountable effects on cognitive development and further language development.
22. Cheng Q, Mayberry RI. When event knowledge overrides word order in sentence comprehension: Learning a first language after childhood. Dev Sci 2020;24:e13073. [PMID: 33296520] [DOI: 10.1111/desc.13073]
Abstract
Limited language experience in childhood is common among deaf individuals, which prior research has shown to lead to low levels of language processing. Although basic structures such as word order have been found to be resilient to conditions of sparse language input in early life, whether they are robust to conditions of extreme language delay is unknown. The sentence comprehension strategies of post-childhood, first-language (L1) learners of American Sign Language (ASL) with at least 9 years of language experience were investigated, in comparison to two control groups of learners with full access to language from birth (deaf native signers and hearing L2 learners who were native English speakers). The results of a sentence-to-picture matching experiment show that event knowledge overrides word order for post-childhood L1 learners, regardless of the animacy of the subject, while both deaf native signers and hearing L2 signers consistently rely on word order to comprehend sentences. Language inaccessibility throughout early childhood impedes the acquisition of even basic word order. Similar to the strategies used by very young children prior to the development of basic sentence structure, post-childhood L1 learners rely more on context and event knowledge to comprehend sentences. Language experience during childhood is critical to the development of basic sentence structure.
Affiliation(s)
- Qi Cheng, Department of Linguistics, University of Washington, Seattle, WA, USA; University of California, San Diego, La Jolla, CA, USA
23. Banaszkiewicz A, Bola Ł, Matuszewski J, Szczepanik M, Kossowski B, Mostowski P, Rutkowski P, Śliwińska M, Jednoróg K, Emmorey K, Marchewka A. The role of the superior parietal lobule in lexical processing of sign language: Insights from fMRI and TMS. Cortex 2020;135:240-254. [PMID: 33401098] [DOI: 10.1016/j.cortex.2020.10.025]
Abstract
There is strong evidence that neuronal bases for language processing are remarkably similar for sign and spoken languages. However, as meanings and linguistic structures of sign languages are coded in movement and space and decoded through vision, differences are also present, predominantly in occipitotemporal and parietal areas, such as superior parietal lobule (SPL). Whether the involvement of SPL reflects domain-general visuospatial attention or processes specific to sign language comprehension remains an open question. Here we conducted two experiments to investigate the role of SPL and the laterality of its engagement in sign language lexical processing. First, using unique longitudinal and between-group designs we mapped brain responses to sign language in hearing late learners and deaf signers. Second, using transcranial magnetic stimulation (TMS) in both groups we tested the behavioural relevance of SPL's engagement and its lateralisation during sign language comprehension. SPL activation in hearing participants was observed in the right hemisphere before and bilaterally after the sign language course. Additionally, after the course hearing learners exhibited greater activation in the occipital cortex and left SPL than deaf signers. TMS applied to the right SPL decreased accuracy in both hearing learners and deaf signers. Stimulation of the left SPL decreased accuracy only in hearing learners. Our results suggest that right SPL might be involved in visuospatial attention while left SPL might support phonological decoding of signs in non-proficient signers.
Affiliation(s)
- A Banaszkiewicz, Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- Ł Bola, Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- J Matuszewski, Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- M Szczepanik, Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- B Kossowski, Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- P Mostowski, Section for Sign Linguistics, Faculty of Polish Studies, University of Warsaw, Warsaw, Poland
- P Rutkowski, Section for Sign Linguistics, Faculty of Polish Studies, University of Warsaw, Warsaw, Poland
- M Śliwińska, Department of Psychology, University of York, Heslington, UK
- K Jednoróg, Laboratory of Language Neurobiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- K Emmorey, Laboratory for Language and Cognitive Neuroscience, San Diego State University, San Diego, USA
- A Marchewka, Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
24
|
Cheng Q, Silvano E, Bedny M. Sensitive periods in cortical specialization for language: insights from studies with Deaf and blind individuals. Curr Opin Behav Sci 2020; 36:169-176. [PMID: 33718533 PMCID: PMC7945734 DOI: 10.1016/j.cobeha.2020.10.011] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
Abstract
Studies with Deaf and blind individuals demonstrate that linguistic and sensory experiences during sensitive periods have potent effects on the neurocognitive basis of language. Native users of sign and spoken languages recruit similar fronto-temporal systems during language processing. By contrast, delays in sign language access impact proficiency and the neural basis of language. Analogously, early-onset but not late-onset blindness modifies the neural basis of language. People born blind recruit 'visual' areas during language processing, show reduced left-lateralization of language, and exhibit enhanced performance on some language tasks. Sensitive period plasticity in and outside fronto-temporal language systems shapes the neural basis of language.
Collapse
Affiliation(s)
- Qi Cheng
- University of California San Diego
- University of Washington
| | - Emily Silvano
- Federal University of Rio de Janeiro
- Johns Hopkins University
| | | |
Collapse
|
25
|
Hribar M, Šuput D, Battelino S, Vovk A. Review article: Structural brain alterations in prelingually deaf. Neuroimage 2020; 220:117042. [PMID: 32534128 DOI: 10.1016/j.neuroimage.2020.117042] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/23/2020] [Revised: 05/14/2020] [Accepted: 06/06/2020] [Indexed: 11/20/2022] Open
Abstract
Functional studies show that our brain has a remarkable ability to reorganize itself in the absence of one or more sensory modalities. In this review, we gathered all the available articles investigating structural alterations in congenitally deaf subjects. Some concentrated only on specific regions of interest (e.g., auditory areas), while others examined the whole brain. The majority of structural alterations were observed in the auditory white matter and were more pronounced in the right hemisphere. Decreased white matter volume or fractional anisotropy in the auditory areas was the most common finding in congenitally deaf subjects. Only a few studies observed alterations in the auditory grey matter. Preservation of the grey matter might be due to cross-modal plasticity as well as to the limited sensitivity of the methods used to detect microstructural alterations of grey matter. Structural alterations were also observed in the frontal, visual, and other cerebral regions as well as in the cerebellum. The observed structural brain alterations in the deaf can probably be attributed mainly to cross-modal plasticity in the absence of sound input and to the use of sign rather than spoken language.
Collapse
Affiliation(s)
- Manja Hribar
- Center for Clinical Physiology, Faculty of Medicine, University of Ljubljana, Slovenia; Clinic for Otorhinolaryngology and Cervicofacial Surgery, University Medical Centre Ljubljana, Slovenia; Department of Otorhinolaryngology, Faculty of Medicine, University of Ljubljana, Slovenia
| | - Dušan Šuput
- Center for Clinical Physiology, Faculty of Medicine, University of Ljubljana, Slovenia; Institute of Pathophysiology, Faculty of Medicine, University of Ljubljana, Slovenia
| | - Saba Battelino
- Clinic for Otorhinolaryngology and Cervicofacial Surgery, University Medical Centre Ljubljana, Slovenia; Department of Otorhinolaryngology, Faculty of Medicine, University of Ljubljana, Slovenia
| | - Andrej Vovk
- Center for Clinical Physiology, Faculty of Medicine, University of Ljubljana, Slovenia; Institute of Pathophysiology, Faculty of Medicine, University of Ljubljana, Slovenia.
| |
Collapse
|
26
|
Kushalnagar P, Ryan C, Paludneviciene R, Spellun A, Gulati S. Adverse Childhood Communication Experiences Associated With an Increased Risk of Chronic Diseases in Adults Who Are Deaf. Am J Prev Med 2020; 59:548-554. [PMID: 32636047 PMCID: PMC7508773 DOI: 10.1016/j.amepre.2020.04.016] [Citation(s) in RCA: 26] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 12/09/2019] [Revised: 02/15/2020] [Accepted: 04/12/2020] [Indexed: 12/29/2022]
Abstract
INTRODUCTION: This study explores adverse childhood communication experiences and their relative risks (RRs) for acquiring specific chronic diseases and mental health disorders in adults who are deaf and hard of hearing. METHODS: A cross-sectional design with snowball sampling was used to recruit adults who were deaf and hard of hearing and were born or became deaf in both ears before age 13 years. Patient-reported outcomes surveys in American Sign Language and English were disseminated to collect data about early life communication experiences with caregivers. Modified Poisson regression with robust SEs was used to calculate RR estimates and 95% CIs for all medical conditions with early life communication experiences as main predictors. RESULTS: Data collection occurred from May 2016 to July 2016, October 2016 to April 2018, and October 2018 to May 2019. The U.S. sample consisted of 1,524 adults who were born or became deaf early. After adjusting for parental hearing status and known correlates of medical conditions, poorer direct child-caregiver communication was significantly associated with an increased risk of being diagnosed with diabetes (RR=1.12, 95% CI=1.01, 1.24), hypertension (RR=1.10, 95% CI=1.03, 1.17), and heart disease (RR=1.61, 95% CI=1.39, 1.87). Poor indirect family communication/inclusion increased risks for lung diseases (RR=1.19, 95% CI=1.07, 1.33) and depression/anxiety disorders (RR=1.34, 95% CI=1.24, 1.44). The absolute risk increase and number needed to harm are also reported. CONCLUSIONS: Outcomes data reported by patients who were deaf and hard of hearing demonstrated that poorer direct child-caregiver communication and ongoing exclusion from incidental family communication were associated with increased risks for multiple chronic health outcomes. Practices should consider developing and utilizing an adverse childhood communication screening measure to prevent or remediate language deprivation and communication neglect in pediatric patients who are deaf and hard of hearing.
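For readers unfamiliar with the "modified Poisson" approach named above, the sketch below illustrates the general technique: a Poisson GLM fit to a binary outcome with robust (sandwich) standard errors, whose exponentiated coefficients are relative risks. The data and variable names (comm_score, parent_hearing, diabetes) are simulated placeholders, not the study's dataset or analysis code.

```python
# Minimal sketch of modified Poisson regression with robust SEs for RR estimation.
# All data are simulated; variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "comm_score": rng.integers(1, 6, n),      # poorer communication = higher score
    "parent_hearing": rng.integers(0, 2, n),
    "age": rng.integers(18, 80, n),
})
# Simulate a binary outcome whose risk rises with poorer communication.
p = 1 / (1 + np.exp(-(-2.5 + 0.15 * df["comm_score"] + 0.02 * (df["age"] - 40))))
df["diabetes"] = rng.binomial(1, p)

# Poisson GLM on a binary outcome ("modified Poisson") with robust sandwich SEs.
model = smf.glm("diabetes ~ comm_score + parent_hearing + age",
                data=df, family=sm.families.Poisson())
result = model.fit(cov_type="HC1")

rr = np.exp(result.params)        # exponentiated coefficients = relative risks
ci = np.exp(result.conf_int())    # 95% CIs on the RR scale
print(pd.concat([rr.rename("RR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```

A number needed to harm, as mentioned in the abstract, would then be computed as the reciprocal of the absolute risk increase between exposure levels.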
Collapse
Affiliation(s)
- Poorna Kushalnagar
- Department of Psychology, Gallaudet University, Washington, District of Columbia.
| | - Claire Ryan
- Department of Educational Psychology, University of Texas at Austin, Austin, Texas
| | | | - Arielle Spellun
- Department of Pediatrics, Yale School of Medicine, New Haven, Connecticut
| | - Sanjay Gulati
- UMass/Boston Children's Hospital, Waltham, Massachusetts
| |
Collapse
|
27
|
Deng Q, Gu F, Tong SX. Lexical processing in sign language: A visual mismatch negativity study. Neuropsychologia 2020; 148:107629. [PMID: 32976852 DOI: 10.1016/j.neuropsychologia.2020.107629] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/22/2020] [Revised: 09/08/2020] [Accepted: 09/14/2020] [Indexed: 10/23/2022]
Abstract
Event-related potential studies of spoken and written language show the automatic access of auditory and visual words, as indexed by mismatch negativity (MMN) or visual MMN (vMMN). The present study examined whether the same automatic lexical processing occurs in a visual-gestural language, i.e., Hong Kong Sign Language (HKSL). Using a classic visual oddball paradigm, deaf signers and hearing non-signers were presented with a sequence of static images representing HKSL lexical signs and non-signs. When compared with hearing non-signers, deaf signers exhibited an enhanced vMMN elicited by the lexical signs at around 230 ms, and a larger P1-N170 complex evoked by both lexical sign and non-sign standards at the parieto-occipital area in the early time window between 65 ms and 170 ms. These findings indicate that deaf signers implicitly process the lexical sign and that neural response differences between deaf signers and hearing non-signers occur at the early stage of sign processing.
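As a rough illustration of how a (visual) mismatch negativity is quantified, the sketch below averages simulated standard and deviant epochs from a single electrode and measures the difference wave in a post-stimulus window around 230 ms. The sampling rate, window, and effect size are assumptions for illustration, not the study's recording parameters.

```python
# Minimal sketch of a deviant-minus-standard difference wave (vMMN); simulated data.
import numpy as np

fs = 500                                  # sampling rate (Hz), assumed
t = np.arange(-0.1, 0.5, 1 / fs)          # epoch from -100 ms to +500 ms
rng = np.random.default_rng(1)

def simulate_epochs(n_trials, deviant=False):
    """Simulated single-electrode epochs (trials x time), in microvolts."""
    epochs = rng.normal(0, 2, (n_trials, t.size))
    if deviant:
        # add a small negativity around 230 ms for deviant (lexical sign) trials
        epochs -= 1.5 * np.exp(-((t - 0.23) ** 2) / (2 * 0.03 ** 2))
    return epochs

standard = simulate_epochs(400).mean(axis=0)                # standard ERP
deviant = simulate_epochs(100, deviant=True).mean(axis=0)   # deviant ERP
difference = deviant - standard                             # vMMN difference wave

window = (t >= 0.20) & (t <= 0.26)
print(f"mean vMMN amplitude in 200-260 ms: {difference[window].mean():.2f} uV")
```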
Collapse
Affiliation(s)
- Qinli Deng
- Human Communication, Development, and Information Sciences, Faculty of Education, The University of Hong Kong, Hong Kong, China.
| | - Feng Gu
- Human Communication, Development, and Information Sciences, Faculty of Education, The University of Hong Kong, Hong Kong, China; The College of Literature and Journalism, Sichuan University, Chengdu, China.
| | - Shelley Xiuli Tong
- Human Communication, Development, and Information Sciences, Faculty of Education, The University of Hong Kong, Hong Kong, China.
| |
Collapse
|
28
|
Clark MD, Cue KR, Delgado NJ, Greene-Woods AN, Wolsey JLA. Early Intervention Protocols: Proposing a Default Bimodal Bilingual Approach for Deaf Children. Matern Child Health J 2020; 24:1339-1344. [PMID: 32897446 PMCID: PMC7477485 DOI: 10.1007/s10995-020-03005-2] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 08/29/2020] [Indexed: 11/25/2022]
Abstract
Despite advances in hearing technology, a growing body of research, and early intervention protocols, deaf children largely fail to meet age-based language milestones. This gap in language acquisition points to the inconsistencies that exist between research and practice. Current research suggests that bimodal bilingual early interventions at deaf identification provide children with language foundations that can lead to more effective outcomes. Recommendations that support implementing bimodal bilingualism at deaf identification include early intervention protocols, language foundations, and the development of appropriate bimodal bilingual environments. All recommendations serve as multifaceted tools in a deaf child's repertoire as language and modality preferences develop and solidify. This versatile approach allows children to determine their own language and communication preferences.
Collapse
Affiliation(s)
- M Diane Clark
- Department of Deaf Studies and Deaf Education, Lamar University, 4400 MLK Blvd., P. O. Box 10113, Beaumont, TX, 77710, USA
| | - Katrina R Cue
- Department of Deaf Studies and Deaf Education, Lamar University, 4400 MLK Blvd., P. O. Box 10113, Beaumont, TX, 77710, USA.
| | - Natalie J Delgado
- Department of Deaf Studies and Deaf Education, Lamar University, 4400 MLK Blvd., P. O. Box 10113, Beaumont, TX, 77710, USA
| | - Ashley N Greene-Woods
- Department of Deaf Studies and Deaf Education, Lamar University, 4400 MLK Blvd., P. O. Box 10113, Beaumont, TX, 77710, USA
| | - Ju-Lee A Wolsey
- Deaf Studies, Interdisciplinary Programs, Towson University, 8000 York Road, Towson, MD, 21252, USA
| |
Collapse
|
29
|
Caselli NK, Hall WC, Henner J. American Sign Language Interpreters in Public Schools: An Illusion of Inclusion that Perpetuates Language Deprivation. Matern Child Health J 2020; 24:1323-1329. [DOI: 10.1007/s10995-020-02975-7] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
|
30
|
Richardson H, Koster-Hale J, Caselli N, Magid R, Benedict R, Olson H, Pyers J, Saxe R. Reduced neural selectivity for mental states in deaf children with delayed exposure to sign language. Nat Commun 2020; 11:3246. [PMID: 32591503 PMCID: PMC7319957 DOI: 10.1038/s41467-020-17004-y] [Citation(s) in RCA: 17] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/26/2019] [Accepted: 05/28/2020] [Indexed: 11/18/2022] Open
Abstract
Language provides a rich source of information about other people's thoughts and feelings. Consequently, delayed access to language may influence conceptual development in Theory of Mind (ToM). We use functional magnetic resonance imaging and behavioral tasks to study ToM development in child (n = 33, 4-12 years old) and adult (n = 36) fluent signers of American Sign Language (ASL), and characterize neural ToM responses during ASL and movie-viewing tasks. Participants include deaf children whose first exposure to ASL was delayed up to 7 years (n = 12). Neural responses to ToM stories (specifically, selectivity of the right temporo-parietal junction) in these children resembles responses previously observed in young children, who have similar linguistic experience, rather than those in age-matched native-signing children, who have similar biological maturation. Early linguistic experience may facilitate ToM development, via the development of a selective brain region for ToM.
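One common way to operationalize "neural selectivity" for a region of interest is to contrast its mean response to mental-state (ToM) stories against a control condition per participant and compare groups. The sketch below uses that approach on simulated data; the particular index, the group means, and the t-test are illustrative assumptions, not the paper's measure or results (only the group sizes, 21 native-signing and 12 delayed-exposure children, are taken from the abstract).

```python
# Minimal sketch of an ROI selectivity comparison between groups; simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def selectivity(tom, control):
    """Normalized contrast between ToM and control responses."""
    return (tom - control) / (np.abs(tom) + np.abs(control))

# Simulated per-subject mean ROI responses (arbitrary units).
native = selectivity(rng.normal(1.2, 0.4, 21), rng.normal(0.5, 0.4, 21))
delayed = selectivity(rng.normal(0.8, 0.4, 12), rng.normal(0.5, 0.4, 12))

t, p = stats.ttest_ind(native, delayed)
print(f"native-signing mean selectivity:   {native.mean():.2f}")
print(f"delayed-exposure mean selectivity: {delayed.mean():.2f}")
print(f"independent-samples t-test: t = {t:.2f}, p = {p:.3f}")
```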
Collapse
Affiliation(s)
- Hilary Richardson
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 43 Vassar Street, 46-4021, Cambridge, MA, 02139, USA.
- Laboratories of Cognitive Neuroscience, Division of Developmental Medicine, Boston Children's Hospital, 1 Autumn Street, Rm. 527, Boston, MA, 02215, USA.
- Department of Pediatrics, Harvard Medical School, 1 Autumn Street, Rm. 527, Boston, MA, 02215, USA.
| | - Jorie Koster-Hale
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 43 Vassar Street, 46-4021, Cambridge, MA, 02139, USA
| | - Naomi Caselli
- Wheelock College of Education and Human Development, Boston University, 621 Commonwealth Avenue, Rm. 218, Boston, MA, 02215, USA
| | - Rachel Magid
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 43 Vassar Street, 46-4021, Cambridge, MA, 02139, USA
| | - Rachel Benedict
- Wheelock College of Education and Human Development, Boston University, 621 Commonwealth Avenue, Rm. 218, Boston, MA, 02215, USA
| | - Halie Olson
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 43 Vassar Street, 46-4021, Cambridge, MA, 02139, USA
| | - Jennie Pyers
- Department of Psychology, Wellesley College, 106 Central Street, Wellesley, MA, 02481, USA
| | - Rebecca Saxe
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 43 Vassar Street, 46-4021, Cambridge, MA, 02139, USA
| |
Collapse
|
31
|
Corina DP, Farnady L, LaMarr T, Pedersen S, Lawyer L, Winsler K, Hickok G, Bellugi U. Effects of age on American Sign Language sentence repetition. Psychol Aging 2020; 35:529-535. [PMID: 32271068 PMCID: PMC8425788 DOI: 10.1037/pag0000461] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/20/2023]
Abstract
The study of deaf users of signed languages, who often experience delays in primary language (L1) acquisition, permits a unique opportunity to examine the effects of aging on the processing of an L1 acquired under delayed or protracted development. A cohort of 107 congenitally deaf adult signers ages 45-85 years who were exposed to American Sign Language (ASL) either in infancy, early childhood, or late childhood were tested using an ASL sentence repetition test. Participants repeated 20 sentences that gradually increased in length and complexity. Logistic mixed-effects regression with the variables of chronological age (CA) and age of acquisition (AoA) was used to assess sentence repetition accuracy. Results showed that CA was a significant predictor, with increased age being associated with decreased likelihood to reproduce a sentence correctly (odds ratio [OR] = 0.56, p = .010). In addition, effects of AoA were observed. Relative to native deaf signers, those who acquired ASL in early childhood were less likely to successfully reproduce a sentence (OR = 0.42, p = .003), as were subjects who learned ASL in late childhood (OR = 0.27, p < .001). These data show that aging affects verbatim recall in deaf users of ASL and that the age of sign language acquisition has a significant and lasting effect on repetition ability, even after decades of sign language use. These data show evidence for life-span continuity of early life effects.
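To make the reported odds ratios concrete, the sketch below converts them into predicted repetition accuracies under an assumed baseline. The paper's actual model is a logistic mixed-effects regression; this sketch keeps only the fixed-effect arithmetic (log-odds to odds ratio to probability), and the baseline log-odds and age scaling are assumptions, not values from the study.

```python
# Minimal sketch: interpreting odds ratios from a logistic model; baseline assumed.
import numpy as np

def prob(log_odds):
    return 1 / (1 + np.exp(-log_odds))

baseline_logit = 1.0     # assumed log-odds of a correct repetition for native signers
or_age = 0.56            # reported OR per unit increase in (scaled) chronological age
or_early = 0.42          # early-childhood vs. native acquisition (reported)
or_late = 0.27           # late-childhood vs. native acquisition (reported)

for label, odds_ratio in [("native", 1.0), ("early AoA", or_early), ("late AoA", or_late)]:
    logit = baseline_logit + np.log(odds_ratio)
    print(f"{label:>10}: predicted accuracy = {prob(logit):.2f}")

# The age effect compounds multiplicatively on the odds scale:
print(f"odds after two age units: factor {or_age ** 2:.2f} of the baseline odds")
```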
Collapse
|
32
|
Bosworth R, Stone A, Hwang SO. Effects of Video Reversal on Gaze Patterns during Signed Narrative Comprehension. JOURNAL OF DEAF STUDIES AND DEAF EDUCATION 2020; 25:283-297. [PMID: 32427289 PMCID: PMC7260695 DOI: 10.1093/deafed/enaa007] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/23/2019] [Revised: 01/29/2020] [Accepted: 02/24/2020] [Indexed: 06/11/2023]
Abstract
Language knowledge, age of acquisition (AoA), and stimulus intelligibility all affect gaze behavior for reading print, but it is unknown how these factors affect "sign-watching" among signers. This study investigated how these factors affect gaze behavior during sign language comprehension in 52 adult signers who acquired American Sign Language (ASL) at different ages. We examined gaze patterns and story comprehension in four subject groups who differ in hearing status and when they learned ASL (i.e. Deaf Early, Deaf Late, Hearing Late, and Hearing Novice). Participants watched signed stories in normal (high intelligibility) and video-reversed (low intelligibility) conditions. This video manipulation was used because it distorts word order and thus disrupts the syntax and semantic content of narratives, while preserving most surface phonological features of individual signs. Video reversal decreased story comprehension accuracy, and this effect was greater for those who learned ASL later in life. Reversal also was associated with more dispersed gaze behavior. Although each subject group had unique gaze patterns, the effect of video reversal on gaze measures was similar across all groups. Among fluent signers, gaze behavior was not correlated with AoA, suggesting that "efficient" sign watching can be quickly learnt even among signers exposed to signed language later in life.
Collapse
Affiliation(s)
- Rain Bosworth
- National Technical Institute for the Deaf, Rochester Institute of Technology, Rochester, New York
| | - Adam Stone
- Department of Psychology, University of California, San Diego
| | - So-One Hwang
- Center for Research in Language, University of California, San Diego
| |
Collapse
|
33
|
Matchin W, Hickok G. The Cortical Organization of Syntax. Cereb Cortex 2020.
Abstract
Syntax, the structure of sentences, enables humans to express an infinite range of meanings through finite means. The neurobiology of syntax has been intensely studied but with little consensus. Two main candidate regions have been identified: the posterior inferior frontal gyrus (pIFG) and the posterior middle temporal gyrus (pMTG). Integrating research in linguistics, psycholinguistics, and neuroscience, we propose a neuroanatomical framework for syntax that attributes distinct syntactic computations to these regions in a unified model. The key theoretical advances are adopting a modern lexicalized view of syntax in which the lexicon and syntactic rules are intertwined, and recognizing a computational asymmetry in the role of syntax during comprehension and production. Our model postulates a hierarchical lexical-syntactic function to the pMTG, which interconnects previously identified speech perception and conceptual-semantic systems in the temporal and inferior parietal lobes, crucial for both sentence production and comprehension. These relational hierarchies are transformed via the pIFG into morpho-syntactic sequences, primarily tied to production. We show how this architecture provides a better account of the full range of data and is consistent with recent proposals regarding the organization of phonological processes in the brain.
Collapse
Affiliation(s)
- William Matchin
- Department of Communication Sciences and Disorders, University of South Carolina, Columbia, SC, 29208, USA
| | - Gregory Hickok
- Department of Cognitive Sciences, University of California, Irvine, Irvine, CA, 92697, USA
- Department of Language Science, University of California, Irvine, Irvine, CA, 92697, USA
| |
Collapse
|
34
|
Pant R, Kanjlia S, Bedny M. A sensitive period in the neural phenotype of language in blind individuals. Dev Cogn Neurosci 2020; 41:100744. [PMID: 31999565 PMCID: PMC6994632 DOI: 10.1016/j.dcn.2019.100744] [Citation(s) in RCA: 14] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/27/2019] [Revised: 11/15/2019] [Accepted: 11/29/2019] [Indexed: 01/18/2023] Open
Abstract
Congenital blindness modifies the neural basis of language: "visual" cortices respond to linguistic information, and fronto-temporal language networks are less left-lateralized. We tested the hypothesis that this plasticity follows a sensitive period by comparing the neural basis of sentence processing between adult-onset blind (AB, n = 16), congenitally blind (CB, n = 22) and blindfolded sighted adults (n = 18). In Experiment 1, participants made semantic judgments for spoken sentences and, in a control condition, solved math equations. In Experiment 2, participants answered "who did what to whom" yes/no questions for grammatically complex (with syntactic movement) and simpler sentences. In a control condition, participants performed a memory task with non-words. In both experiments, visual cortices of CB and AB but not sighted participants responded more to sentences than control conditions, but the effect was much larger in the CB group. Only the "visual" cortex of CB participants responded to grammatical complexity. Unlike the CB group, the AB group showed no reduction in left-lateralization of fronto-temporal language network, relative to the sighted. These results suggest that congenital blindness modifies the neural basis of language differently from adult-onset blindness, consistent with a developmental sensitive period hypothesis.
Collapse
Affiliation(s)
- Rashi Pant
- Department of Psychological and Brain Sciences, Johns Hopkins University, USA; Biological Psychology and Neuropsychology, University of Hamburg, Germany.
| | - Shipra Kanjlia
- Department of Psychological and Brain Sciences, Johns Hopkins University, USA
| | - Marina Bedny
- Department of Psychological and Brain Sciences, Johns Hopkins University, USA
| |
Collapse
|
35
|
Malaia EA, Krebs J, Roehm D, Wilbur RB. Age of acquisition effects differ across linguistic domains in sign language: EEG evidence. BRAIN AND LANGUAGE 2020; 200:104708. [PMID: 31698097 PMCID: PMC6934356 DOI: 10.1016/j.bandl.2019.104708] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/04/2019] [Revised: 10/10/2019] [Accepted: 10/11/2019] [Indexed: 06/10/2023]
Abstract
One of the key questions in the study of human language acquisition is the extent to which the development of neural processing networks for different components of language are modulated by exposure to linguistic stimuli. Sign languages offer a unique perspective on this issue, because prelingually Deaf children who receive access to complex linguistic input later in life provide a window into brain maturation in the absence of language, and subsequent neuroplasticity of neurolinguistic networks during late language learning. While the duration of sensitive periods of acquisition of linguistic subsystems (sound, vocabulary, and syntactic structure) is well established on the basis of L2 acquisition in spoken language, for sign languages, the relative timelines for development of neural processing networks for linguistic sub-domains are unknown. We examined neural responses of a group of Deaf signers who received access to signed input at varying ages to three linguistic phenomena at the levels of classifier signs, syntactic structure, and information structure. The amplitude of the N400 response to the marked word order condition negatively correlated with the age of acquisition for syntax and information structure, indicating increased cognitive load in these conditions. Additionally, the combination of behavioral and neural data suggested that late learners preferentially relied on classifiers over word order for meaning extraction. This suggests that late acquisition of sign language significantly increases cognitive load during analysis of syntax and information structure, but not word-level meaning.
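The kind of amplitude-by-age-of-acquisition relationship described above is typically tested as a simple correlation across participants. The sketch below does this on simulated data, with a built-in negative relationship for illustration; the sample size, units, and effect size are assumptions, not the study's values.

```python
# Minimal sketch: correlating N400 amplitude with age of acquisition; simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_subjects = 20

age_of_acquisition = rng.uniform(0, 14, n_subjects)           # years, assumed range
# More negative N400 amplitude (microvolts) with later acquisition, by construction.
n400_amplitude = -0.3 * age_of_acquisition + rng.normal(0, 1.0, n_subjects)

r, p = stats.pearsonr(age_of_acquisition, n400_amplitude)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")   # expected: negative correlation
```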
Collapse
Affiliation(s)
- Evie A Malaia
- Department of Communicative Disorders, University of Alabama, Speech and Hearing Clinic, 700 Johnny Stallings Drive, Tuscaloosa, AL 35401, USA.
| | - Julia Krebs
- Research Group Neurobiology of Language, Department of Linguistics, University of Salzburg, Erzabt-Klotz-Straße 1, 5020 Salzburg, Austria; Centre for Cognitive Neuroscience (CCNS), University of Salzburg, Erzabt-Klotz-Straße 1, 5020 Salzburg, Austria
| | - Dietmar Roehm
- Research Group Neurobiology of Language, Department of Linguistics, University of Salzburg, Erzabt-Klotz-Straße 1, 5020 Salzburg, Austria; Centre for Cognitive Neuroscience (CCNS), University of Salzburg, Erzabt-Klotz-Straße 1, 5020 Salzburg, Austria
| | - Ronnie B Wilbur
- Department of Linguistics, Purdue University, Lyles-Porter Hall, West Lafayette, IN 47907-2122, USA; Department of Speech, Language, and Hearing Sciences, Purdue University, Lyles-Porter Hall, West Lafayette, IN 47907-2122, USA
| |
Collapse
|
36
|
Twomey T, Price CJ, Waters D, MacSweeney M. The impact of early language exposure on the neural system supporting language in deaf and hearing adults. Neuroimage 2019; 209:116411. [PMID: 31857205 PMCID: PMC7985620 DOI: 10.1016/j.neuroimage.2019.116411] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/21/2019] [Revised: 11/25/2019] [Accepted: 11/27/2019] [Indexed: 11/25/2022] Open
Abstract
Deaf late signers provide a unique perspective on the impact of impoverished early language exposure on the neurobiology of language: insights that cannot be gained from research with hearing people alone. Here we contrast the effect of age of sign language acquisition in hearing and congenitally deaf adults to examine the potential impact of impoverished early language exposure on the neural systems supporting a language learnt later in life. We collected fMRI data from deaf and hearing proficient users (N = 52) of British Sign Language (BSL), who learnt BSL either early (native) or late (after the age of 15 years) whilst they watched BSL sentences or strings of meaningless nonsense signs. There was a main effect of age of sign language acquisition (late > early) across deaf and hearing signers in the occipital segment of the left intraparietal sulcus. This finding suggests that late learners of sign language may rely on visual processing more than early learners, when processing both linguistic and nonsense sign input – regardless of hearing status. Region-of-interest analyses in the posterior superior temporal cortices (STC) showed an effect of age of sign language acquisition that was specific to deaf signers. In the left posterior STC, activation in response to signed sentences was greater in deaf early signers than deaf late signers. Importantly, responses in the left posterior STC in hearing early and late signers did not differ, and were similar to those observed in deaf early signers. These data lend further support to the argument that robust early language experience, whether signed or spoken, is necessary for left posterior STC to show a ‘native-like’ response to a later learnt language.
Collapse
Affiliation(s)
- Tae Twomey
- Institute of Cognitive Neuroscience, University College London, WC1N 3AZ, UK; Deafness, Cognition and Language Research Centre, University College London, WC1H 0PD, UK
| | - Cathy J Price
- Wellcome Centre for Human Neuroimaging, Institute of Neurology, University College London, WC1N 3BG, UK
| | - Dafydd Waters
- Institute of Cognitive Neuroscience, University College London, WC1N 3AZ, UK
| | - Mairéad MacSweeney
- Institute of Cognitive Neuroscience, University College London, WC1N 3AZ, UK; Deafness, Cognition and Language Research Centre, University College London, WC1H 0PD, UK.
| |
Collapse
|
37
|
Krebs J, Wilbur RB, Alday PM, Roehm D. The Impact of Transitional Movements and Non-Manual Markings on the Disambiguation of Locally Ambiguous Argument Structures in Austrian Sign Language (ÖGS). LANGUAGE AND SPEECH 2019; 62:652-680. [PMID: 30354860 DOI: 10.1177/0023830918801399] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
Previous studies of Austrian Sign Language (ÖGS) word-order variations have demonstrated the human processing system's tendency to interpret a sentence-initial (case-) ambiguous argument as the subject of the clause ("subject preference"). The electroencephalogram study motivating the current report revealed earlier reanalysis effects for object-subject compared to subject-object sentences, in particular, before the start of the movement of the agreement marking sign. The effects were bound to time points prior to when both arguments were referenced in space and/or the transitional hand movement prior to producing the disambiguating sign. Due to the temporal proximity of these time points, it was not clear which visual cues led to disambiguation; that is, whether non-manual markings (body/shoulder/head shift towards the subject position) or the transitional hand movement resolved ambiguity. The present gating study further supports that disambiguation in ÖGS is triggered by cues occurring before the movement of the disambiguating sign. Further, the present study also confirms the presence of the subject preference in ÖGS, showing again that signers and speakers draw on similar strategies during language processing independent of language modality. Although the ultimate role of the visual cues leading to disambiguation (i.e., non-manual markings and transitional movements) requires further investigation, the present study shows that they contribute crucial information about argument structure during online processing. This finding provides strong support for granting these cues some degree of linguistic status (at least in ÖGS).
Collapse
|
38
|
Clark MD, Baker S, Simms L. A culture of assessment: A bioecological systems approach for early and continuous assessment of deaf infants and children. PSYCHOLOGY IN THE SCHOOLS 2019. [DOI: 10.1002/pits.22313] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Affiliation(s)
- M. Diane Clark
- Department of Deaf Studies and Deaf Education, Lamar University, Beaumont, Texas
| | - Sharon Baker
- Department of Education, University of Tulsa, Tulsa, Oklahoma
| | - Laurene Simms
- Department of Education, Gallaudet University, Washington, District of Columbia
| |
Collapse
|
39
|
Kite BJ. How the medical professionals impact ASL and English families’ language planning policy. PSYCHOLOGY IN THE SCHOOLS 2019. [DOI: 10.1002/pits.22324] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022]
Affiliation(s)
- Bobbie Jo Kite
- Department of Education, Gallaudet University, Washington, District of Columbia
| |
Collapse
|
40
|
Zhang C, Lee TMC, Fu Y, Ren C, Chan CCH, Tao Q. Properties of cross-modal occipital responses in early blindness: An ALE meta-analysis. NEUROIMAGE-CLINICAL 2019; 24:102041. [PMID: 31677587 PMCID: PMC6838549 DOI: 10.1016/j.nicl.2019.102041] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/12/2019] [Revised: 09/20/2019] [Accepted: 10/17/2019] [Indexed: 11/10/2022]
Abstract
ALE meta-analysis reveals distributed brain networks for object and spatial functions in individuals with early blindness. ALE contrast analysis reveals specific activations in the left cuneus and lingual gyrus for language function, suggesting a reverse hierarchical organization of the visual cortex for early blind individuals. The findings contribute to visual rehabilitation in blind individuals by revealing the function-dependent and sensory-independent networks during nonvisual processing.
Cross-modal occipital responses appear to be essential for nonvisual processing in individuals with early blindness. However, it is not clear whether the recruitment of occipital regions depends on functional domain or sensory modality. The current study utilized a coordinate-based meta-analysis to identify the distinct brain regions involved in the functional domains of object, spatial/motion, and language processing and the common brain regions involved in both auditory and tactile modalities in individuals with early blindness. Following the PRISMA guidelines, a total of 55 studies were included in the meta-analysis. The specific analyses revealed the brain regions that are consistently recruited for each function, such as the dorsal fronto-parietal network for spatial function and ventral occipito-temporal network for object function. This is consistent with the literature, suggesting that the two visual streams are preserved in early blind individuals. The contrast analyses found specific activations in the left cuneus and lingual gyrus for language function. This finding is novel and suggests a reverse hierarchical organization of the visual cortex for early blind individuals. The conjunction analyses found common activations in the right middle temporal gyrus, right precuneus and a left parieto-occipital region. Clinically, this work contributes to visual rehabilitation in early blind individuals by revealing the function-dependent and sensory-independent networks during nonvisual processing.
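For readers unfamiliar with activation likelihood estimation (ALE), the sketch below implements its core computation on a toy grid: each reported focus is modeled as a 3D Gaussian "modeled activation" (MA) map, MA maps are combined within an experiment, and experiments are combined voxelwise as ALE = 1 - prod(1 - MA_i). The grid size, kernel width, and foci are illustrative assumptions, not coordinates from the reviewed studies.

```python
# Minimal sketch of the ALE combination rule on a toy voxel grid; toy foci only.
import numpy as np

shape = (40, 48, 40)          # toy grid (voxels)
sigma_vox = 4.0               # Gaussian kernel width in voxels, assumed

grid = np.stack(np.meshgrid(*[np.arange(s) for s in shape], indexing="ij"), axis=-1)

def ma_map(foci_vox):
    """Modeled activation map: voxelwise max over Gaussian kernels at each focus."""
    ma = np.zeros(shape)
    for focus in foci_vox:
        d2 = np.sum((grid - np.asarray(focus)) ** 2, axis=-1)
        ma = np.maximum(ma, np.exp(-d2 / (2 * sigma_vox ** 2)))
    return ma

# Toy "experiments", each a list of peak coordinates in voxel space.
experiments = [
    [(10, 30, 20), (12, 28, 22)],
    [(11, 31, 19)],
    [(30, 15, 20)],
]

ale = 1.0 - np.prod([1.0 - ma_map(foci) for foci in experiments], axis=0)
peak = np.unravel_index(np.argmax(ale), shape)
print(f"peak ALE value {ale.max():.3f} at voxel {peak}")
```

In practice, toolbox implementations additionally derive the kernel width from sample size and assess significance against a null distribution of randomly relocated foci.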
Collapse
Affiliation(s)
- Caiyun Zhang
- Psychology Department, School of Medicine, Jinan University, Guangzhou 510632, China
| | - Tatia M C Lee
- Laboratory of Neuropsychology, The University of Hong Kong, Hong Kong, China; Laboratory of Cognitive Affective Neuroscience, The University of Hong Kong, Hong Kong, China; The Affiliated Brain Hospital of Guangzhou Medical University, Guangzhou, China
| | - Yunwei Fu
- Guangdong-Hongkong-Macau Institute of CNS Regeneration, Ministry of Education CNS Regeneration Collaborative Joint Laboratory, Jinan University, Guangzhou, 510632, China
| | - Chaoran Ren
- Guangdong-Hongkong-Macau Institute of CNS Regeneration, Ministry of Education CNS Regeneration Collaborative Joint Laboratory, Jinan University, Guangzhou, 510632, China; Guangdong key Laboratory of Brain Function and Diseases, Jinan University, Guangzhou, 510632, China; Co-innovation Center of Neuroregeneration, Nantong University, Nantong, 226001, China; Center for Brain Science and Brain-Inspired Intelligence, Guangdong-Hong Kong-Macao Greater Bay Area, Guangzhou, China
| | - Chetwyn C H Chan
- Applied Cognitive Neuroscience Laboratory, Department of Rehabilitation Sciences, The Hong Kong Polytechnic University, Hong Kong, China.
| | - Qian Tao
- Psychology Department, School of Medicine, Jinan University, Guangzhou 510632, China; Center for Brain Science and Brain-Inspired Intelligence, Guangdong-Hong Kong-Macao Greater Bay Area, Guangzhou, China.
| |
Collapse
|
41
|
Cheng Q, Roth A, Halgren E, Mayberry RI. Effects of Early Language Deprivation on Brain Connectivity: Language Pathways in Deaf Native and Late First-Language Learners of American Sign Language. Front Hum Neurosci 2019; 13:320. [PMID: 31607879 PMCID: PMC6761297 DOI: 10.3389/fnhum.2019.00320] [Citation(s) in RCA: 28] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2019] [Accepted: 09/02/2019] [Indexed: 01/24/2023] Open
Abstract
Previous research has identified ventral and dorsal white matter tracts as being crucial for language processing; their maturation correlates with increased language processing capacity. It is unknown whether the growth or maintenance of these language-relevant pathways is shaped by language experience in early life. To investigate the effects of early language deprivation and the sensory-motor modality of language on white matter tracts, we examined the white matter connectivity of language-relevant pathways in congenitally deaf people with or without early access to language. We acquired diffusion tensor imaging (DTI) data from two groups of individuals who experienced language from birth, twelve deaf native signers of American Sign Language (ASL) and twelve hearing L2 signers of ASL (native English speakers), and from three well-studied individual cases who experienced minimal language during childhood. The results indicate that the sensory-motor modality of early language experience does not affect the white matter microstructure between crucial language regions. Both groups with early language experience, deaf and hearing, show leftward laterality in the two language-related tracts. However, all three cases with early language deprivation showed altered white matter microstructure, especially in the left dorsal arcuate fasciculus (AF) pathway.
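"Leftward laterality" in tract metrics is commonly quantified with a laterality index, LI = (left - right) / (left + right), where positive values indicate leftward lateralization. The sketch below applies it to made-up fractional anisotropy values for the arcuate fasciculus; the numbers, group size, and threshold are illustrative assumptions only.

```python
# Minimal sketch of a laterality index for a white matter tract; simulated FA values.
import numpy as np

def laterality_index(left, right):
    return (left - right) / (left + right)

rng = np.random.default_rng(4)
left_fa = rng.normal(0.48, 0.03, 12)    # simulated mean FA, left arcuate fasciculus
right_fa = rng.normal(0.44, 0.03, 12)   # simulated mean FA, right arcuate fasciculus

li = laterality_index(left_fa, right_fa)
print(f"mean LI = {li.mean():.3f}  (LI > 0 indicates leftward lateralization)")
print(f"participants with LI > 0: {(li > 0).sum()} of {li.size}")
```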
Collapse
Affiliation(s)
- Qi Cheng
- Department of Linguistics, University of California, San Diego, San Diego, CA, United States
| | - Austin Roth
- Department of Linguistics, University of California, San Diego, San Diego, CA, United States
- Department of Radiology, University of California, San Diego, San Diego, CA, United States
| | - Eric Halgren
- Department of Radiology, University of California, San Diego, San Diego, CA, United States
| | - Rachel I. Mayberry
- Department of Linguistics, University of California, San Diego, San Diego, CA, United States
| |
Collapse
|
42
|
Deaf Children as ‘English Learners’: The Psycholinguistic Turn in Deaf Education. EDUCATION SCIENCES 2019. [DOI: 10.3390/educsci9020133] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
The purpose of this literature review is to present the arguments in support of conceptualizing deaf children as ‘English Learners’, to explore the educational implications of such conceptualizations, and to suggest directions for future inquiry. Three ways of interpreting the label ‘English Learner’ in relationship to deaf children are explored: (1) as applied to deaf children whose native language is American Sign Language; (2) as applied to deaf children whose parents speak a language other than English; and (3) as applied to deaf children who have limited access to the spoken English used by their parents. Recent research from the fields of linguistics and neuroscience on the effects of language deprivation is presented and conceptualized within a framework that we refer to as the psycholinguistic turn in deaf education. The implications for developing the literacy skills of signing deaf children are explored, particularly around the theoretical construct of a ‘bridge’ between sign language proficiency and print-based literacy. Finally, promising directions for future inquiry are presented.
Collapse
|
43
|
Support for parents of deaf children: Common questions and informed, evidence-based answers. Int J Pediatr Otorhinolaryngol 2019; 118:134-142. [PMID: 30623850 DOI: 10.1016/j.ijporl.2018.12.036] [Citation(s) in RCA: 22] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 09/17/2018] [Revised: 11/15/2018] [Accepted: 12/27/2018] [Indexed: 11/20/2022]
Abstract
To assist medical and hearing-science professionals in supporting parents of deaf children, we have identified common questions that parents may have and provide evidence-based answers. In doing so, a compassionate and positive narrative about deafness and deaf children is offered, one that relies on recent research evidence regarding the critical nature of early exposure to a fully accessible visual language, which in the United States is American Sign Language (ASL). This evidence includes the role of sign language in language acquisition, cognitive development, and literacy. In order for parents to provide a nurturing and anxiety-free environment for early childhood development, signing at home is important even if their child also has the additional nurturing and care of a signing community. It is not just the early years of a child's life that matter for language acquisition; it's the early months, the early weeks, even the early days. Deaf children cannot wait for accessible language input. The whole family must learn simultaneously as the deaf child learns. Even moderate fluency on the part of the family benefits the child enormously. And learning the sign language together can be one of the strongest bonding experiences that the family and deaf child have.
Collapse
|
44
|
Abstract
Mayberry and Kluender review evidence that second language (L2) proficiency declines with age of acquisition (regardless of modality), but they also review evidence for variable L2 outcomes for individuals, with factors such as motivation, language aptitude, education, and L2 experience playing a role. They argue that if L2 outcomes were fully under the control of a critical period for language (CPL), these learning variables should not predict L2 outcome, and the outcome of L2 learning would not be consistently observed to be so variable. The questions raised in this commentary are whether there is variation in late L1 proficiency and whether such variation could provide insights into the CPL.
Collapse
|
45
|
Mayberry RI, Kluender R. Rethinking the critical period for language: New insights into an old question from American Sign Language. BILINGUALISM (CAMBRIDGE, ENGLAND) 2018; 21:886-905. [PMID: 30643489 PMCID: PMC6329394 DOI: 10.1017/s1366728917000724] [Citation(s) in RCA: 36] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/01/2023]
Abstract
The hypothesis that children surpass adults in long-term second-language proficiency is accepted as evidence for a critical period for language. However, the scope and nature of a critical period for language has been the subject of considerable debate. The controversy centers on whether the age-related decline in ultimate second-language proficiency is evidence for a critical period or something else. Here we argue that age-onset effects for first vs. second language outcome are largely different. We show this by examining psycholinguistic studies of ultimate attainment in L2 vs. L1 learners, longitudinal studies of adolescent L1 acquisition, and neurolinguistic studies of late L2 and L1 learners. This research indicates that L1 acquisition arises from post-natal brain development interacting with environmental linguistic experience. By contrast, L2 learning after early childhood is scaffolded by prior childhood L1 acquisition, both linguistically and neurally, making it a less clear test of the critical period for language.
Collapse
Affiliation(s)
| | - Robert Kluender
- Department of Linguistics, University of California San Diego
| |
Collapse
|
46
|
Johnson L, Fitzhugh MC, Yi Y, Mickelsen S, Baxter LC, Howard P, Rogalsky C. Functional Neuroanatomy of Second Language Sentence Comprehension: An fMRI Study of Late Learners of American Sign Language. Front Psychol 2018; 9:1626. [PMID: 30237778 PMCID: PMC6136263 DOI: 10.3389/fpsyg.2018.01626] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/13/2018] [Accepted: 08/14/2018] [Indexed: 01/16/2023] Open
Abstract
The neurobiology of sentence comprehension is well-studied, but the properties and characteristics of sentence processing networks remain unclear and highly debated. Sign languages (i.e., visual-manual languages), like spoken languages, have complex grammatical structures and thus can provide valuable insights into the specificity and function of brain regions supporting sentence comprehension. The present study aims to characterize how these well-studied spoken language networks can adapt in adults to be responsive to sign language sentences, which contain combinatorial semantic and syntactic visual-spatial linguistic information. Twenty native English-speaking undergraduates who had completed introductory American Sign Language (ASL) courses viewed videos of the following conditions during fMRI acquisition: signed sentences, signed word lists, English sentences and English word lists. Overall, our results indicate that native language (L1) sentence processing resources are responsive to ASL sentence structures in late L2 learners, but that certain L1 sentence processing regions respond differently to L2 ASL sentences, likely due to the nature of their contribution to language comprehension. For example, L1 sentence regions in Broca's area were significantly more responsive to L2 than L1 sentences, supporting the hypothesis that Broca's area contributes to sentence comprehension as a cognitive resource when increased processing is required. Anterior temporal L1 sentence regions were sensitive to L2 ASL sentence structure, but demonstrated no significant differences in activation between L1 and L2 sentences, suggesting that their contribution to sentence processing is modality-independent. Posterior superior temporal L1 sentence regions also responded to ASL sentence structure but were more activated by English than ASL sentences. An exploratory analysis of the neural correlates of L2 ASL proficiency indicates that ASL proficiency is positively correlated with increased activations in response to ASL sentences in L1 sentence processing regions. Overall, these results suggest that well-established fronto-temporal spoken language networks involved in sentence processing exhibit functional plasticity with late L2 ASL exposure, and thus are adaptable to syntactic structures widely different from those in an individual's native language. Our findings also provide valuable insights into the unique contributions of the inferior frontal and superior temporal regions that are frequently implicated in sentence comprehension but whose exact roles remain highly debated.
Collapse
Affiliation(s)
- Lisa Johnson
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States
| | - Megan C Fitzhugh
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States; Interdisciplinary Graduate Neuroscience Program, Arizona State University, Tempe, AZ, United States
| | - Yuji Yi
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States
| | - Soren Mickelsen
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States
| | - Leslie C Baxter
- Barrow Neurological Institute and St. Joseph's Hospital and Medical Center, Phoenix, AZ, United States
| | - Pamela Howard
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States
| | - Corianne Rogalsky
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States
| |
Collapse
|
47
|
Hall WC, Smith SR, Sutter EJ, DeWindt LA, Dye TDV. Considering parental hearing status as a social determinant of deaf population health: Insights from experiences of the "dinner table syndrome". PLoS One 2018; 13:e0202169. [PMID: 30183711 PMCID: PMC6124705 DOI: 10.1371/journal.pone.0202169] [Citation(s) in RCA: 25] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/11/2017] [Accepted: 07/17/2018] [Indexed: 11/18/2022] Open
Abstract
The influence of early language and communication experiences on lifelong health outcomes is receiving increased public health attention. Most deaf children have non-signing hearing parents and are at risk for not experiencing fully accessible language environments, a possible factor underlying known deaf population health disparities. Childhood indirect family communication, such as spontaneous conversations and listening in the routine family environment (e.g., family meals, recreation, car rides), is an important source of health-related contextual learning opportunities. The goal of this study was to assess the influence of parental hearing status on deaf people’s recalled access to childhood indirect family communication. We analyzed data from the Rochester Deaf Health Survey–2013 (n = 211 deaf adults) for associations between sociodemographic factors, including parental hearing status, and recalled access to childhood indirect family communication. Parental hearing status predicted deaf adults’ recalled access to childhood indirect family communication (χ2 = 31.939, p < .001). The likelihood of deaf adults reporting “sometimes to never” for recalled comprehension of childhood indirect family communication increased by 17.6 times for those with hearing parents. No other sociodemographic or deaf-specific factors in this study predicted deaf adults’ access to childhood indirect family communication. This study finds that deaf people who have hearing parents were more likely to report limited access to contextual learning opportunities during childhood. Parental hearing status and early childhood language experiences, therefore, require further investigation as possible social determinants of health to develop interventions that improve lifelong health and social outcomes of the underserved deaf population.
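The sketch below illustrates the kind of contingency-table analysis reported: cross-tabulating parental hearing status against recalled access to indirect family communication, testing the association with a chi-square test, and expressing the effect as an odds ratio. The counts are invented for illustration and do not reproduce the survey data or its reported statistics.

```python
# Minimal sketch of a chi-square test of association on a 2x2 table; invented counts.
import numpy as np
from scipy.stats import chi2_contingency

#                 "most/all of the time"  "sometimes to never"
table = np.array([[20,                     140],    # hearing parents
                  [25,                      26]])   # deaf parents

chi2, p, dof, expected = chi2_contingency(table)
odds_ratio = (table[0, 1] / table[0, 0]) / (table[1, 1] / table[1, 0])
print(f"chi-square({dof}) = {chi2:.2f}, p = {p:.4f}")
print(f"odds of 'sometimes to never' (hearing vs. deaf parents): {odds_ratio:.1f}x")
```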
Collapse
Affiliation(s)
- Wyatte C. Hall
- Obstetrics & Gynecology and Clinical & Translational Science Institute, University of Rochester Medical Center, Rochester, New York, United States of America
| | - Scott R. Smith
- Office of the Associate Dean of Research, National Technical Institute for the Deaf, Rochester Institute of Technology, Rochester, New York, United States of America
| | - Erika J. Sutter
- National Center for Deaf Health Research, University of Rochester Medical Center, Rochester, New York, United States of America
| | - Lori A. DeWindt
- National Center for Deaf Health Research, University of Rochester Medical Center, Rochester, New York, United States of America
- Deaf Wellness Center, University of Rochester Medical Center, Rochester, New York, United States of America
| | - Timothy D. V. Dye
- Obstetrics & Gynecology and Clinical & Translational Science Institute, University of Rochester Medical Center, Rochester, New York, United States of America
- Pediatrics and Public Health Sciences, University of Rochester Medical Center, Rochester, New York, United States of America
| |
Collapse
|
48
|
Pattern of neural divergence in adults with prelingual deafness: Based on structural brain analysis. Brain Res 2018; 1701:58-63. [PMID: 30048625 DOI: 10.1016/j.brainres.2018.07.021] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/15/2018] [Revised: 07/09/2018] [Accepted: 07/21/2018] [Indexed: 01/21/2023]
Abstract
Sensory input for hearing plays a significant role in the development of the human brain. Absence of early auditory input leads to alterations in important neural regions, which in turn drive a complex process known as cross-modal neuroplasticity. Previous studies of structural brain alterations in adult deaf individuals have shown inconsistent results. To address this issue, we investigated brain morphology in 50 prelingually deaf adults and compared it with the same number of individuals with normal hearing, using structural magnetic resonance imaging and three inter-related but distinct analysis methods, namely a univariate approach (voxel-based morphometry), a multivariate approach (source-based morphometry), and projection-based cortical thickness. The findings from all these inter-related analyses suggest alterations in important neural regions such as the bilateral superior temporal gyrus, inferior temporal gyrus, fusiform gyrus, and middle frontal gyrus. These findings also point to a strengthened ventral visual pathway in the deaf group. We suggest that these morphological alterations in important brain regions are due to compensatory cross-modal reorganization.
Collapse
|
49
|
Subject preference emerges as cross-modal strategy for linguistic processing. Brain Res 2018; 1691:105-117. [PMID: 29627484 DOI: 10.1016/j.brainres.2018.03.029] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/24/2017] [Revised: 01/30/2018] [Accepted: 03/24/2018] [Indexed: 11/23/2022]
Abstract
Research on spoken languages has identified a "subject preference" processing strategy for tackling input that is syntactically ambiguous as to whether a sentence-initial NP is a subject or object. The present study documents that the "subject preference" strategy is also seen in the processing of a sign language, supporting the hypothesis that the "subject"-first strategy is universal and not dependent on the language modality (spoken vs. signed). Deaf signers of Austrian Sign Language (ÖGS) were shown videos of locally ambiguous signed sentences in SOV and OSV word orders. Electroencephalogram (EEG) data indicated higher cognitive load in response to OSV stimuli (i.e. a negativity for OSV compared to SOV), indicative of syntactic reanalysis cost. A finding that is specific to the visual modality is that the ERP (event-related potential) effect reflecting linguistic reanalysis occurred earlier than might have been expected, that is, before the time point when the path movement of the disambiguating sign was visible. We suggest that in the visual modality, transitional movement of the articulators prior to the disambiguating verb position or co-occurring non-manual (face/body) markings were used in resolving the local ambiguity in ÖGS. Thus, whereas the processing strategy of "subject preference" is cross-modal at the linguistic level, the cues that enable the processor to apply that strategy differ in signing as compared to speech.
Collapse
|
50
|
Moreno A, Limousin F, Dehaene S, Pallier C. Brain correlates of constituent structure in sign language comprehension. Neuroimage 2018; 167:151-161. [PMID: 29175202 PMCID: PMC6044420 DOI: 10.1016/j.neuroimage.2017.11.040] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/20/2017] [Revised: 10/27/2017] [Accepted: 11/19/2017] [Indexed: 01/16/2023] Open
Abstract
During sentence processing, areas of the left superior temporal sulcus, inferior frontal gyrus and left basal ganglia exhibit a systematic increase in brain activity as a function of constituent size, suggesting their involvement in the computation of syntactic and semantic structures. Here, we asked whether these areas play a universal role in language and therefore contribute to the processing of non-spoken sign language. Congenitally deaf adults who acquired French sign language as a first language and written French as a second language were scanned while watching sequences of signs in which the size of syntactic constituents was manipulated. An effect of constituent size was found in the basal ganglia, including the head of the caudate and the putamen. A smaller effect was also detected in temporal and frontal regions previously shown to be sensitive to constituent size in written language in hearing French subjects (Pallier et al., 2011). When the deaf participants read sentences versus word lists, the same network of language areas was observed. While reading and sign language processing yielded identical effects of linguistic structure in the basal ganglia, the effect of structure was stronger in all cortical language areas for written language relative to sign language. Furthermore, cortical activity was partially modulated by age of acquisition and reading proficiency. Our results stress the important role of the basal ganglia, within the language network, in the representation of the constituent structure of language, regardless of the input modality.
Collapse
Affiliation(s)
- Antonio Moreno
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin Center, 91191 Gif/Yvette, France.
| | - Fanny Limousin
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin Center, 91191 Gif/Yvette, France
| | - Stanislas Dehaene
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin Center, 91191 Gif/Yvette, France; Collège de France, 11 Place Marcelin Berthelot, 75005 Paris, France
| | - Christophe Pallier
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin Center, 91191 Gif/Yvette, France.
| |
Collapse
|