1
Coldham Y, Haluts N, Elbaz E, Ben-David T, Racabi N, Gal S, Bernstein-Eliav M, Friedmann N, Tavor I. Distinct neural representations of different linguistic components following sign language learning. Commun Biol 2025;8:353. PMID: 40033011. DOI: 10.1038/s42003-025-07793-7.
Abstract
Learning a new language is a process everyone undergoes at least once. However, studying the neural mechanisms behind first-time language learning is a challenging task. Here we aim to explore the functional alterations following learning Israeli Sign Language, a visuo-spatial rather than an auditory-based language. Specifically, we investigate how phonological, lexical, and sentence-level components of the language system differ in their neural representations. In this within-participant design, hearing individuals naïve to sign languages (n = 79) performed an fMRI task requiring the processing of different linguistic components, before and after attending an Israeli Sign Language course. A learning-induced increase in activation was detected in various brain regions in task contrasts related to all sign language linguistic components. Activation patterns while processing different linguistic components post-learning were spatially distinct, suggesting a unique neural representation for each component. Moreover, post-learning activation maps successfully predicted learning retention six months later, associating neural and performance measures.
Affiliation(s)
- Yael Coldham
- Faculty of Medical & Health Sciences, Tel Aviv University, Tel Aviv, Israel
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Neta Haluts
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Language and Brain Lab, School of Education, Tel Aviv University, Tel Aviv, Israel
- Eden Elbaz
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Tamar Ben-David
- Faculty of Medical & Health Sciences, Tel Aviv University, Tel Aviv, Israel
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Nell Racabi
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Shachar Gal
- Department of Psychology, Bar-Ilan University, Ramat-Gan, Israel
- Naama Friedmann
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Language and Brain Lab, School of Education, Tel Aviv University, Tel Aviv, Israel
- Ido Tavor
- Faculty of Medical & Health Sciences, Tel Aviv University, Tel Aviv, Israel.
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel.
2
Banaszkiewicz A, Costello B, Marchewka A. Early language experience and modality affect parietal cortex activation in different hemispheres: Insights from hearing bimodal bilinguals. Neuropsychologia 2024;204:108973. PMID: 39151687. DOI: 10.1016/j.neuropsychologia.2024.108973.
Abstract
The goal of this study was to investigate the impact of age of acquisition (AoA) on functional brain representations of sign language in two exceptional groups of hearing bimodal bilinguals: native signers (simultaneous bilinguals since early childhood) and late signers (proficient sequential bilinguals who learnt a sign language after puberty). We asked whether effects of AoA would be present across languages - signed and audiovisual spoken - and thus observed only in late signers, as they acquired each language at a different life stage, and whether effects of AoA would be present during sign language processing across groups. Moreover, we carefully controlled participants' level of sign language proficiency by implementing a battery of language tests developed for the purpose of the project, which confirmed that participants were highly proficient in sign language. Between-group analyses revealed the hypothesized modulatory effect of AoA in the right inferior parietal lobule (IPL) in native signers compared to late signers. With respect to within-group differences across languages, we observed greater involvement of the left IPL in response to sign language than to spoken language in both native and late signers, indicating language modality effects. Overall, our results suggest that the neural underpinnings of language are molded by the linguistic characteristics of the language as well as by when in life the language is learnt.
Affiliation(s)
- A Banaszkiewicz
- Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland; Laboratory of Language Neurobiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland.
- B Costello
- Basque Center on Cognition, Brain and Language, Donostia-San Sebastián, Spain; Ikerbasque, Basque Foundation for Science, Bilbao, Spain
- A Marchewka
- Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
3
Papanicolaou AC. Non-Invasive Mapping of the Neuronal Networks of Language. Brain Sci 2023;13:1457. PMID: 37891824. PMCID: PMC10605023. DOI: 10.3390/brainsci13101457.
Abstract
This review consists of three main sections. In the first, the Introduction, the main theories of the neuronal mediation of linguistic operations, derived mostly from studies of the effects of focal lesions on linguistic performance, are summarized. These models furnish the conceptual framework on which the design of subsequent functional neuroimaging investigations is based. In the second section, the methods of functional neuroimaging, especially those of functional Magnetic Resonance Imaging (fMRI) and of Magnetoencephalography (MEG), are detailed along with the specific activation tasks employed in presurgical functional mapping. The reliability of these non-invasive methods and their validity, judged against the results of the invasive methods, namely the "Wada" procedure and Cortical Stimulation Mapping (CSM), are assessed, and their use in presurgical mapping is justified. In the third and final section, the applications of fMRI and MEG in basic research are surveyed in six sub-sections, each dealing with the neuronal networks for (1) acoustic and phonological, (2) semantic, (3) syntactic, and (4) prosodic operations; (5) sign language; and (6) reading and the mechanisms of dyslexia.
Affiliation(s)
- Andrew C Papanicolaou
- Department of Pediatrics, Division of Pediatric Neurology, College of Medicine, University of Tennessee Health Science Center, Memphis, TN 38013, USA
4
Lammert JM, Levine AT, Koshkebaghi D, Butler BE. Sign language experience has little effect on face and biomotion perception in bimodal bilinguals. Sci Rep 2023;13:15328. PMID: 37714887. PMCID: PMC10504335. DOI: 10.1038/s41598-023-41636-x.
Abstract
Sensory and language experience can affect brain organization and domain-general abilities. For example, D/deaf individuals show superior visual perception compared to hearing controls in several domains, including the perception of faces and peripheral motion. While these enhancements may result from sensory loss and subsequent neural plasticity, they may also reflect experience using a visual-manual language, like American Sign Language (ASL), where signers must process moving hand signs and facial cues simultaneously. In an effort to disentangle these concurrent sensory experiences, we examined how learning sign language influences visual abilities by comparing bimodal bilinguals (i.e., sign language users with typical hearing) and hearing non-signers. Bimodal bilinguals and hearing non-signers completed online psychophysical measures of face matching and biological motion discrimination. No significant group differences were observed across these two tasks, suggesting that sign language experience is insufficient to induce perceptual advantages in typical-hearing adults. However, ASL proficiency (but not years of experience or age of acquisition) was found to predict performance on the motion perception task among bimodal bilinguals. Overall, the results presented here highlight a need for more nuanced study of how linguistic environments, sensory experience, and cognitive functions impact broad perceptual processes and underlying neural correlates.
Affiliation(s)
- Jessica M Lammert
- Department of Psychology, University of Western Ontario, Western Interdisciplinary Research Building Room 6126, London, ON, N6A 5C2, Canada
- Western Institute for Neuroscience, University of Western Ontario, London, Canada
- Alexandra T Levine
- Department of Psychology, University of Western Ontario, Western Interdisciplinary Research Building Room 6126, London, ON, N6A 5C2, Canada
- Western Institute for Neuroscience, University of Western Ontario, London, Canada
- Dursa Koshkebaghi
- Undergraduate Neuroscience Program, University of Western Ontario, London, Canada
- Blake E Butler
- Department of Psychology, University of Western Ontario, Western Interdisciplinary Research Building Room 6126, London, ON, N6A 5C2, Canada.
- Western Institute for Neuroscience, University of Western Ontario, London, Canada.
- National Centre for Audiology, University of Western Ontario, London, Canada.
- Children's Health Research Institute, Lawson Health Research, London, Canada.
5
Holmer E, Schönström K, Andin J. Associations Between Sign Language Skills and Resting-State Functional Connectivity in Deaf Early Signers. Front Psychol 2022;13:738866. PMID: 35369269. PMCID: PMC8975249. DOI: 10.3389/fpsyg.2022.738866.
Abstract
The processing of a language involves a neural language network including temporal, parietal, and frontal cortical regions. This applies to spoken as well as signed languages. Previous research suggests that spoken language proficiency is associated with resting-state functional connectivity (rsFC) between language regions and other regions of the brain. Given the similarities in neural activation for spoken and signed languages, rsFC-behavior associations should also exist for sign language tasks. In this study, we explored the associations between rsFC and two types of linguistic skills in sign language: phonological processing skill and accuracy in elicited sentence production. Fifteen adult, deaf early signers were enrolled in a resting-state functional magnetic resonance imaging (fMRI) study. In addition to fMRI data, behavioral tests of sign language phonological processing and sentence reproduction were administered. Using seed-to-voxel connectivity analysis, we investigated associations between behavioral proficiency and rsFC from language-relevant nodes: bilateral inferior frontal gyrus (IFG) and posterior superior temporal gyrus (STG). Results showed that worse sentence processing skill was associated with stronger positive rsFC between the left IFG and left sensorimotor regions. Further, sign language phonological processing skill was associated with positive rsFC from right IFG to middle frontal gyrus/frontal pole although this association could possibly be explained by domain-general cognitive functions. Our findings suggest a possible connection between rsFC and developmental language outcomes in deaf individuals.
Affiliation(s)
- Emil Holmer
- Linnaeus Centre HEAD, Swedish Institute for Disability Research, Department of Behavioural Sciences and Learning, Linköping University, Linköping, Sweden
- Center for Medical Image Science and Visualization, Linköping, Sweden
- Josefine Andin
- Linnaeus Centre HEAD, Swedish Institute for Disability Research, Department of Behavioural Sciences and Learning, Linköping University, Linköping, Sweden
6
Caldwell HB. Sign and Spoken Language Processing Differences in the Brain: A Brief Review of Recent Research. Ann Neurosci 2022;29:62-70. PMID: 35875424. PMCID: PMC9305909. DOI: 10.1177/09727531211070538.
Abstract
Background: It is currently accepted that sign languages and spoken languages have significant processing commonalities. The evidence supporting this often merely investigates frontotemporal pathways, perisylvian language areas, hemispheric lateralization, and event-related potentials in typical settings. However, recent evidence has explored beyond this and uncovered numerous modality-dependent processing differences between sign languages and spoken languages by accounting for confounds that previously invalidated processing comparisons and by delving into the specific conditions in which they arise. These processing differences are nonetheless often dismissed as unspecific to language. Summary: This review examined recent neuroscientific evidence for processing differences between sign and spoken language modalities and the arguments against the importance of these differences. Key distinctions exist in the topography of the left anterior negativity (LAN) and in modulations of event-related potential (ERP) components like the N400. There is also differential activation of typical spoken language processing areas, such as the conditional role of the temporal areas in sign language (SL) processing. Importantly, sign language processing uniquely recruits parietal areas for processing phonology and syntax and requires the mapping of spatial information to internal representations. Additionally, modality-specific feedback mechanisms distinctively involve proprioceptive post-output monitoring in sign languages, in contrast to the auditory and visual feedback mechanisms of spoken languages. The only study to find ERP differences post-production revealed earlier lexical access in sign than in spoken languages. Themes of temporality, the validity of a viewpoint assuming analogous anatomical mechanisms, and the comprehensiveness of current language models are also discussed, with suggested improvements for future research.
Key message: Current neuroscience evidence suggests various ways in which processing differs between sign and spoken language modalities that extend beyond simple differences between languages. Consideration and further exploration of these differences will be integral in developing a more comprehensive view of language in the brain.
Affiliation(s)
- Hayley Bree Caldwell
- Cognitive and Systems Neuroscience Research Hub (CSN-RH), School of Justice and Society, University of South Australia Magill Campus, Magill, South Australia, Australia
7
Dicataldo R, Roch M. Direct and Indirect Pathways of Variation in Length of Exposure to the Majority Language, Cognitive and Language Skills in Preschoolers' Listening Narrative Comprehension. Children (Basel) 2021;8:636. PMID: 34438527. PMCID: PMC8391907. DOI: 10.3390/children8080636.
Abstract
Listening narrative comprehension, according to the theoretical framework of the multicomponent model of comprehension, involves numerous skills that interact dynamically with one another and have the potential to give rise to individual differences in comprehension. The purpose of the current work was to define a comprehensive multicomponent model of listening narrative comprehension at preschool age. We investigated how variation in length of exposure to the majority language (i.e., how long children have been exposed to the Italian language), lower-order cognitive skills (working memory, inhibitory control, attention shifting), language skills (receptive vocabulary, syntactic knowledge, rapid naming), and higher-order cognitive skills (inferences, theory of mind, knowledge of story structure) relate to listening narrative comprehension in Italian in 111 preschool children (Mage = 61 months; SD = 6.8) growing up in a monolingual or multilingual context. Structural equation modeling showed that the model explained 60% of the variance in listening narrative comprehension in Italian for children aged four to six and predicted the outcome through both direct and mediated paths, consistent with the multicomponent model of comprehension.
Affiliation(s)
- Raffaele Dicataldo
- Department of Development and Socialization Psychology, University of Padova, 35131 Padova, Italy
| | - Maja Roch
- Department of Development and Socialization Psychology, University of Padova, 35131 Padova, Italy
8
Abstract
Early sensory deprivation, such as deafness, shapes brain development in multiple ways. Deprived auditory areas become engaged in the processing of stimuli from the remaining modalities and in high-level cognitive tasks. Yet, structural and functional changes were also observed in non-deprived brain areas, which may suggest the whole-brain network changes in deaf individuals. To explore this possibility, we compared the resting-state functional network organization of the brain in early deaf adults and hearing controls and examined global network segregation and integration. Relative to hearing controls, deaf adults exhibited decreased network segregation and an altered modular structure. In the deaf, regions of the salience network were coupled with the fronto-parietal network, while in the hearing controls, they were coupled with other large-scale networks. Deaf adults showed weaker connections between auditory and somatomotor regions, stronger coupling between the fronto-parietal network and several other large-scale networks (visual, memory, cingulo-opercular and somatomotor), and an enlargement of the default mode network. Our findings suggest that brain plasticity in deaf adults is not limited to changes in the auditory cortex but additionally alters the coupling between other large-scale networks and the development of functional brain modules. These widespread functional connectivity changes may provide a mechanism for the superior behavioral performance of the deaf in visual and attentional tasks.
9
Qu H, Tang H, Pan J, Zhao Y, Wang W. Alteration of Cortical and Subcortical Structures in Children With Profound Sensorineural Hearing Loss. Front Hum Neurosci 2020;14:565445. PMID: 33362488. PMCID: PMC7756106. DOI: 10.3389/fnhum.2020.565445.
Abstract
Profound sensorineural hearing loss (SNHL) is an auditory disability associated with auditory and cognitive dysfunction. Due to its distinct pathogenesis, some associated structural and functional changes within the brain have been investigated in previous studies, but whole-brain structural alterations are incompletely understood. We extended the exploration of neuroanatomic differences in whole-brain structure to children with profound SNHL who are primarily users of Chinese sign language (CSL), employing surface-based morphometry (SBM) and subcortical analyses. T1-weighted magnetic resonance images of 26 children with profound SNHL and 27 age- and sex-matched children with normal hearing were analyzed. Compared with the normal control (NC) group, children with profound SNHL showed diverse structural changes in surface-based and subcortical analyses, including decreased cortical thickness in the left postcentral gyrus, superior parietal lobule, paracentral lobule, precuneus, the right transverse temporal gyri, and the middle temporal gyrus; a noticeable increase in the Local Gyrification Index (LGI) in the left precuneus and superior parietal lobule; and diverse changes in gray-matter volume (GMV) across brain regions. Surface-based vertex analyses revealed regional contractions in the right thalamus, putamen, pallidum, and the brainstem of children with profound SNHL when compared with the NC group. Volumetric analyses showed decreased volumes of the right thalamus and pallidum in children with profound SNHL. Our data suggest that profound SNHL in children is associated with diffuse alterations extending to cortical and subcortical structures, and reveal neuroplastic reorganization in the precuneus, superior parietal lobule, and temporal gyrus. Our study provides robust evidence for changes in brain structure and connectivity associated with hearing loss.
Affiliation(s)
- Hang Qu
- Medical Imaging Center, Affiliated Hospital of Yangzhou University, Yangzhou, China
| | - Hui Tang
- College of Education, Central China Normal University, Wuhan, China
| | - Jiahao Pan
- Center for Orthopedic and Biomechanics Research, Boise State University, Boise, ID, United States
| | - Yi Zhao
- Medical Imaging Center, Affiliated Hospital of Yangzhou University, Yangzhou, China
| | - Wei Wang
- Medical Imaging Center, Affiliated Hospital of Yangzhou University, Yangzhou, China
10
Bogliotti C, Aksen H, Isel F. Language experience in LSF development: Behavioral evidence from a sentence repetition task. PLoS One 2020;15:e0236729. PMID: 33201887. PMCID: PMC7671551. DOI: 10.1371/journal.pone.0236729.
Abstract
In psycholinguistics and clinical linguistics, the Sentence Repetition Task (SRT) is known to be a valuable tool to screen general language abilities in both spoken and signed languages. This task enables users to reliably and quickly assess linguistic abilities at different levels of linguistic analysis such as phonology, morphology, lexicon, and syntax. To evaluate sign language proficiency in deaf children using French Sign Language (LSF), we designed a new SRT comprising 20 LSF sentences. The task was administered to a cohort of 62 children (34 native signers, aged 6;09-12, and 28 non-native signers, aged 6;08-12;08) in order to study their general linguistic development as a function of age of sign language acquisition (AOA) and chronological age (CA). Previously, a group of 10 adult native signers was also evaluated with this task. As expected, our results showed a significant effect of AOA, indicating that the native signers repeated more signs and were more accurate than non-native signers. A similar pattern of results was found for CA. Furthermore, native signers made fewer phonological errors (i.e., handshape, movement, and location) than non-native signers. Finally, as shown in previous sign language studies, handshape and movement proved to be the most difficult parameters to master regardless of AOA and CA. Taken together, our findings support the assumption that AOA is a crucial factor in the development of phonological skills regardless of language modality (spoken vs. signed). This study thus constitutes a first step toward a theoretical description of the developmental trajectory in LSF, a hitherto understudied language.
Affiliation(s)
- Hatice Aksen
- Laboratoire Structures Formelles du Langage CNRS & Saint Denis University, Paris, France
- Frédéric Isel
- Laboratoire MODYCO CNRS & Paris Nanterre University, Nanterre, France
11
Dicataldo R, Roch M. Are the Effects of Variation in Quantity of Daily Bilingual Exposure and Socioeconomic Status on Language and Cognitive Abilities Independent in Preschool Children? Int J Environ Res Public Health 2020;17:E4570. PMID: 32630383. PMCID: PMC7344960. DOI: 10.3390/ijerph17124570.
Abstract
Bilingual exposure (BE) and socioeconomic status (SES) are associated with children's development, but their specific and unique effects are still unclear. This study analyzed the influence of these environmental factors on a set of cognitive and linguistic abilities in preschoolers to disentangle their effects. One hundred and eleven Italian-speaking preschool children (mean age = 61 months; SD = 6.8) growing up in a monolingual or multilingual context completed an assessment of cognitive (theory of mind, inhibition, attention shifting and working memory) and linguistic abilities (vocabulary, grammar, narrative comprehension, lexical access). Hierarchical regressions with variation in BE (both length and daily exposure) and SES as predictors of each ability showed a specific contribution of SES, after controlling for BE, to vocabulary, grammar, and working memory (WM), and a specific contribution of BE, over and above the effect of SES, to vocabulary, narrative comprehension and WM. In addition, we found an interaction between these factors in predicting performance on the theory of mind (ToM) task. To conclude, variations in BE and SES are independently related to individual differences in the linguistic and cognitive skills of preschool children.
Affiliation(s)
- Raffaele Dicataldo
- Department of Development and Socialization Psychology, University of Padova, 35131 Padova, Italy
12
Neuroscience and Sign Language. Pajouhan Scientific Journal 2020. DOI: 10.52547/psj.18.2.90.
13
Malaia EA, Krebs J, Roehm D, Wilbur RB. Age of acquisition effects differ across linguistic domains in sign language: EEG evidence. Brain Lang 2020;200:104708. PMID: 31698097. PMCID: PMC6934356. DOI: 10.1016/j.bandl.2019.104708.
Abstract
One of the key questions in the study of human language acquisition is the extent to which the development of neural processing networks for different components of language is modulated by exposure to linguistic stimuli. Sign languages offer a unique perspective on this issue, because prelingually Deaf children who receive access to complex linguistic input later in life provide a window into brain maturation in the absence of language, and subsequent neuroplasticity of neurolinguistic networks during late language learning. While the duration of sensitive periods of acquisition of linguistic subsystems (sound, vocabulary, and syntactic structure) is well established on the basis of L2 acquisition in spoken language, for sign languages the relative timelines for development of neural processing networks for linguistic sub-domains are unknown. We examined neural responses of a group of Deaf signers who received access to signed input at varying ages to three linguistic phenomena at the levels of classifier signs, syntactic structure, and information structure. The amplitude of the N400 response to the marked word order condition negatively correlated with the age of acquisition for syntax and information structure, indicating increased cognitive load in these conditions. Additionally, the combination of behavioral and neural data suggested that late learners preferentially relied on classifiers over word order for meaning extraction. This suggests that late acquisition of sign language significantly increases cognitive load during analysis of syntax and information structure, but not word-level meaning.
Affiliation(s)
- Evie A Malaia
- Department of Communicative Disorders, University of Alabama, Speech and Hearing Clinic, 700 Johnny Stallings Drive, Tuscaloosa, AL 35401, USA.
- Julia Krebs
- Research Group Neurobiology of Language, Department of Linguistics, University of Salzburg, Erzabt-Klotz-Straße 1, 5020 Salzburg, Austria; Centre for Cognitive Neuroscience (CCNS), University of Salzburg, Erzabt-Klotz-Straße 1, 5020 Salzburg, Austria
- Dietmar Roehm
- Research Group Neurobiology of Language, Department of Linguistics, University of Salzburg, Erzabt-Klotz-Straße 1, 5020 Salzburg, Austria; Centre for Cognitive Neuroscience (CCNS), University of Salzburg, Erzabt-Klotz-Straße 1, 5020 Salzburg, Austria
- Ronnie B Wilbur
- Department of Linguistics, Purdue University, Lyles-Porter Hall, West Lafayette, IN 47907-2122, USA; Department of Speech, Language, and Hearing Sciences, Purdue University, Lyles-Porter Hall, West Lafayette, IN 47907-2122, USA
14
Twomey T, Price CJ, Waters D, MacSweeney M. The impact of early language exposure on the neural system supporting language in deaf and hearing adults. Neuroimage 2019;209:116411. PMID: 31857205. PMCID: PMC7985620. DOI: 10.1016/j.neuroimage.2019.116411.
Abstract
Deaf late signers provide a unique perspective on the impact of impoverished early language exposure on the neurobiology of language: insights that cannot be gained from research with hearing people alone. Here we contrast the effect of age of sign language acquisition in hearing and congenitally deaf adults to examine the potential impact of impoverished early language exposure on the neural systems supporting a language learnt later in life. We collected fMRI data from deaf and hearing proficient users (N = 52) of British Sign Language (BSL), who learnt BSL either early (native) or late (after the age of 15 years) whilst they watched BSL sentences or strings of meaningless nonsense signs. There was a main effect of age of sign language acquisition (late > early) across deaf and hearing signers in the occipital segment of the left intraparietal sulcus. This finding suggests that late learners of sign language may rely on visual processing more than early learners, when processing both linguistic and nonsense sign input – regardless of hearing status. Region-of-interest analyses in the posterior superior temporal cortices (STC) showed an effect of age of sign language acquisition that was specific to deaf signers. In the left posterior STC, activation in response to signed sentences was greater in deaf early signers than deaf late signers. Importantly, responses in the left posterior STC in hearing early and late signers did not differ, and were similar to those observed in deaf early signers. These data lend further support to the argument that robust early language experience, whether signed or spoken, is necessary for left posterior STC to show a ‘native-like’ response to a later learnt language.
Collapse
Affiliation(s)
- Tae Twomey
- Institute of Cognitive Neuroscience, University College London, WC1N 3AZ, UK; Deafness, Cognition and Language Research Centre, University College London, WC1H 0PD, UK
| | - Cathy J Price
- Wellcome Centre for Human Neuroimaging, Institute of Neurology, University College London, WC1N 3BG, UK
| | - Dafydd Waters
- Institute of Cognitive Neuroscience, University College London, WC1N 3AZ, UK
| | - Mairéad MacSweeney
- Institute of Cognitive Neuroscience, University College London, WC1N 3AZ, UK; Deafness, Cognition and Language Research Centre, University College London, WC1H 0PD, UK.
| |
Collapse
|
15
|
Sign and Speech Share Partially Overlapping Conceptual Representations. Curr Biol 2019; 29:3739-3747.e5. [PMID: 31668623 PMCID: PMC6839399 DOI: 10.1016/j.cub.2019.08.075] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/10/2019] [Revised: 08/01/2019] [Accepted: 08/30/2019] [Indexed: 11/24/2022]
Abstract
Conceptual knowledge is fundamental to human cognition. Yet, the extent to which it is influenced by language is unclear. Studies of semantic processing show that similar neural patterns are evoked by the same concepts presented in different modalities (e.g., spoken words and pictures or text) [1, 2, 3]. This suggests that conceptual representations are “modality independent.” However, an alternative possibility is that the similarity reflects retrieval of common spoken language representations. Indeed, in hearing spoken language users, text and spoken language are co-dependent [4, 5], and pictures are encoded via visual and verbal routes [6]. A parallel approach investigating semantic cognition shows that bilinguals activate similar patterns for the same words in their different languages [7, 8]. This suggests that conceptual representations are “language independent.” However, this has only been tested in spoken language bilinguals. If different languages evoke different conceptual representations, this should be most apparent comparing languages that differ greatly in structure. Hearing people with signing deaf parents are bilingual in sign and speech: languages conveyed in different modalities. Here, we test the influence of modality and bilingualism on conceptual representation by comparing semantic representations elicited by spoken British English and British Sign Language in hearing early, sign-speech bilinguals. We show that representations of semantic categories are shared for sign and speech, but not for individual spoken words and signs. This provides evidence for partially shared representations for sign and speech and shows that language acts as a subtle filter through which we understand and interact with the world. 
RSA analyses show that semantic categories are shared for sign and speech. Neural patterns for individual spoken words and signs differ. Spoken word and sign form representations are found in auditory and visual cortices. Language acts as a subtle filter through which we interact with the world.
Collapse
|
16
|
Stroh AL, Rösler F, Dormal G, Salden U, Skotara N, Hänel-Faulhaber B, Röder B. Neural correlates of semantic and syntactic processing in German Sign Language. Neuroimage 2019; 200:231-241. [PMID: 31220577 DOI: 10.1016/j.neuroimage.2019.06.025] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/15/2018] [Revised: 05/16/2019] [Accepted: 06/12/2019] [Indexed: 11/24/2022] Open
Abstract
The study of deaf and hearing native users of signed languages can offer unique insights into how biological constraints and environmental input interact to shape the neural bases of language processing. Here, we use functional magnetic resonance imaging (fMRI) to address two questions: (1) Do semantic and syntactic processing in a signed language rely on anatomically and functionally distinct neural substrates as it has been shown for spoken languages? and (2) Does hearing status affect the neural correlates of these two types of linguistic processing? Deaf and hearing native signers performed a sentence judgement task on German Sign Language (Deutsche Gebärdensprache: DGS) sentences which were correct or contained either syntactic or semantic violations. We hypothesized that processing of semantic and syntactic violations in DGS relies on distinct neural substrates as it has been shown for spoken languages. Moreover, we hypothesized that effects of hearing status are observed within auditory regions, as deaf native signers have been shown to activate auditory areas to a greater extent than hearing native signers when processing a signed language. Semantic processing activated low-level visual areas and the left inferior frontal gyrus (IFG), suggesting both modality-dependent and independent processing mechanisms. Syntactic processing elicited increased activation in the right supramarginal gyrus (SMG). Moreover, psychophysiological interaction (PPI) analyses revealed a cluster in left middle occipital regions showing increased functional coupling with the right SMG during syntactic relative to semantic processing, possibly indicating spatial processing mechanisms that are specific to signed syntax. Effects of hearing status were observed in the right superior temporal cortex (STC): deaf but not hearing native signers showed greater activation for semantic violations than for syntactic violations in this region. 
Taken together, the present findings suggest that the neural correlates of language processing are partly determined by biological constraints, but that they may additionally be influenced by the unique processing demands of the language modality and different sensory experiences.
Collapse
Affiliation(s)
- Anna-Lena Stroh
- Biological Psychology and Neuropsychology, University of Hamburg, Germany.
| | - Frank Rösler
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
| | - Giulia Dormal
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
| | - Uta Salden
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
| | - Nils Skotara
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
| | - Barbara Hänel-Faulhaber
- Biological Psychology and Neuropsychology, University of Hamburg, Germany; Special Education, University of Hamburg, Germany
| | - Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
| |
Collapse
|
17
|
Malaia E, Wilbur RB. Visual and linguistic components of short-term memory: Generalized Neural Model (GNM) for spoken and sign languages. Cortex 2019; 112:69-79. [DOI: 10.1016/j.cortex.2018.05.020] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/11/2018] [Revised: 04/02/2018] [Accepted: 05/29/2018] [Indexed: 10/14/2022]
|
18
|
Prieur J, Lemasson A, Barbu S, Blois‐Heulin C. History, development and current advances concerning the evolutionary roots of human right‐handedness and language: Brain lateralisation and manual laterality in non‐human primates. Ethology 2018. [DOI: 10.1111/eth.12827] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/21/2022]
Affiliation(s)
- Jacques Prieur
- CNRS, EthoS (Éthologie animale et humaine) – UMR 6552, Université de Rennes, Normandie Université, Paimpont, France
| | - Alban Lemasson
- CNRS, EthoS (Éthologie animale et humaine) – UMR 6552, Université de Rennes, Normandie Université, Paimpont, France
| | - Stéphanie Barbu
- CNRS, EthoS (Éthologie animale et humaine) – UMR 6552, Université de Rennes, Normandie Université, Paimpont, France
| | - Catherine Blois‐Heulin
- CNRS, EthoS (Éthologie animale et humaine) – UMR 6552, Université de Rennes, Normandie Université, Paimpont, France
| |
Collapse
|
19
|
Johnson L, Fitzhugh MC, Yi Y, Mickelsen S, Baxter LC, Howard P, Rogalsky C. Functional Neuroanatomy of Second Language Sentence Comprehension: An fMRI Study of Late Learners of American Sign Language. Front Psychol 2018; 9:1626. [PMID: 30237778 PMCID: PMC6136263 DOI: 10.3389/fpsyg.2018.01626] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/13/2018] [Accepted: 08/14/2018] [Indexed: 01/16/2023] Open
Abstract
The neurobiology of sentence comprehension is well studied, but the properties and characteristics of sentence processing networks remain unclear and highly debated. Sign languages (i.e., visual-manual languages), like spoken languages, have complex grammatical structures and can therefore provide valuable insights into the specificity and function of brain regions supporting sentence comprehension. The present study aims to characterize how these well-studied spoken language networks can adapt in adults to become responsive to sign language sentences, which contain combinatorial semantic and syntactic visual-spatial linguistic information. Twenty native English-speaking undergraduates who had completed introductory American Sign Language (ASL) courses viewed videos of the following conditions during fMRI acquisition: signed sentences, signed word lists, English sentences, and English word lists. Overall, our results indicate that native language (L1) sentence processing resources are responsive to ASL sentence structures in late L2 learners, but that certain L1 sentence processing regions respond differently to L2 ASL sentences, likely due to the nature of their contribution to language comprehension. For example, L1 sentence regions in Broca's area were significantly more responsive to L2 than to L1 sentences, supporting the hypothesis that Broca's area contributes to sentence comprehension as a cognitive resource when increased processing is required. Anterior temporal L1 sentence regions were sensitive to L2 ASL sentence structure but showed no significant difference in activation between L1 and L2 sentences, suggesting that their contribution to sentence processing is modality-independent. Posterior superior temporal L1 sentence regions also responded to ASL sentence structure but were more activated by English than by ASL sentences.
An exploratory analysis of the neural correlates of L2 ASL proficiency indicates that ASL proficiency is positively correlated with increased activation in response to ASL sentences in L1 sentence processing regions. Overall, these results suggest that well-established fronto-temporal spoken language networks involved in sentence processing exhibit functional plasticity with late L2 ASL exposure, and thus are adaptable to syntactic structures widely different from those in an individual's native language. Our findings also provide valuable insights into the unique contributions of the inferior frontal and superior temporal regions that are frequently implicated in sentence comprehension but whose exact roles remain highly debated.
Collapse
Affiliation(s)
- Lisa Johnson
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States
| | - Megan C Fitzhugh
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States.,Interdisciplinary Graduate Neuroscience Program, Arizona State University, Tempe, AZ, United States
| | - Yuji Yi
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States
| | - Soren Mickelsen
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States
| | - Leslie C Baxter
- Barrow Neurological Institute and St. Joseph's Hospital and Medical Center, Phoenix, AZ, United States
| | - Pamela Howard
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States
| | - Corianne Rogalsky
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States
| |
Collapse
|
20
|
Peressotti F, Scaltritti M, Miozzo M. Can sign language make you better at hand processing? PLoS One 2018; 13:e0194771. [PMID: 29590204 PMCID: PMC5874053 DOI: 10.1371/journal.pone.0194771] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/05/2017] [Accepted: 03/11/2018] [Indexed: 11/19/2022] Open
Abstract
The languages developed by deaf communities are unique in their use of visual signs produced by the hand. In the present study, we explored the cognitive effects of employing the hand as an articulator. We focused on the arbitrariness of the form-meaning relationship—a fundamental feature of natural languages—and asked whether sign languages change the processing of arbitrary non-linguistic stimulus-response (S-R) associations involving the hand. This was tested using the Simon effect, which specifically requires this type of association. Differences between signers and speakers (non-signers) appeared in the Simon task only when hand stimuli were shown. Response-time analyses revealed that the distinctiveness of signers' responses derived from an increased ability to process memory traces of arbitrary S-R pairs related to the hand. These results shed light on the interplay between language and cognition as well as on the effects of sign language acquisition.
Collapse
Affiliation(s)
- Francesca Peressotti
- Dipartimento di Psicologia dello Sviluppo e della Socializzazione, Università di Padova, Padova, Italy
- * E-mail:
| | - Michele Scaltritti
- Dipartimento di Psicologia e Scienze Cognitive, Università di Trento, Rovereto, Italy
| | - Michele Miozzo
- Department of Psychology, The New School for Social Research, New York, New York, United States of America
| |
Collapse
|
21
|
Moreno A, Limousin F, Dehaene S, Pallier C. Brain correlates of constituent structure in sign language comprehension. Neuroimage 2018; 167:151-161. [PMID: 29175202 PMCID: PMC6044420 DOI: 10.1016/j.neuroimage.2017.11.040] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/20/2017] [Revised: 10/27/2017] [Accepted: 11/19/2017] [Indexed: 01/16/2023] Open
Abstract
During sentence processing, areas of the left superior temporal sulcus, inferior frontal gyrus and left basal ganglia exhibit a systematic increase in brain activity as a function of constituent size, suggesting their involvement in the computation of syntactic and semantic structures. Here, we asked whether these areas play a universal role in language and therefore contribute to the processing of non-spoken sign language. Congenitally deaf adults who acquired French sign language as a first language and written French as a second language were scanned while watching sequences of signs in which the size of syntactic constituents was manipulated. An effect of constituent size was found in the basal ganglia, including the head of the caudate and the putamen. A smaller effect was also detected in temporal and frontal regions previously shown to be sensitive to constituent size in written language in hearing French subjects (Pallier et al., 2011). When the deaf participants read sentences versus word lists, the same network of language areas was observed. While reading and sign language processing yielded identical effects of linguistic structure in the basal ganglia, the effect of structure was stronger in all cortical language areas for written language relative to sign language. Furthermore, cortical activity was partially modulated by age of acquisition and reading proficiency. Our results stress the important role of the basal ganglia, within the language network, in the representation of the constituent structure of language, regardless of the input modality.
Collapse
Affiliation(s)
- Antonio Moreno
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin Center, 91191 Gif/Yvette, France.
| | - Fanny Limousin
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin Center, 91191 Gif/Yvette, France
| | - Stanislas Dehaene
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin Center, 91191 Gif/Yvette, France; Collège de France, 11 Place Marcelin Berthelot, 75005 Paris, France
| | - Christophe Pallier
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin Center, 91191 Gif/Yvette, France.
| |
Collapse
|
22
|
Le HB, Zhang HH, Wu QL, Zhang J, Yin JJ, Ma SH. Neural Activity During Mental Rotation in Deaf Signers: The Influence of Long-Term Sign Language Experience. Ear Hear 2018; 39:1015-1024. [PMID: 29298164 DOI: 10.1097/aud.0000000000000540] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/05/2023]
Abstract
OBJECTIVES Mental rotation is the brain's visuospatial understanding of what objects are and where they belong. Previous research indicated that deaf signers show behavioral enhancement on nonlinguistic visual tasks, including mental rotation. In this study, we investigated the neural differences in mental rotation processing between deaf signers and hearing nonsigners using blood oxygen level-dependent (BOLD) functional magnetic resonance imaging (fMRI). DESIGN The participants performed a block-designed experiment consisting of alternating blocks of comparison and rotation periods, separated by a baseline or fixation period. Mental rotation tasks were performed using three-dimensional figures. fMRI images were acquired during the entire experiment, and the fMRI data were analyzed with Analysis of Functional NeuroImages. A factorial-design analysis of variance was used for the fMRI analyses. Differences in activation were analyzed for the main effects of group and task, as well as for the group-by-task interaction. RESULTS The study showed differences in activated areas between deaf signers and hearing nonsigners on the mental rotation of three-dimensional figures. Subtracting activations of fixation from activations of rotation, both groups showed consistent activation in the bilateral occipital lobe, bilateral parietal lobe, and bilateral posterior temporal lobe. There were main effects of task (rotation versus comparison) with significant activation clusters in the bilateral precuneus, the right middle frontal gyrus, the bilateral medial frontal gyrus, the right inferior frontal gyrus, the right superior frontal gyrus, the right anterior cingulate, and the bilateral posterior cingulate. There were significant group-by-task interaction effects in the bilateral anterior cingulate, the right inferior frontal gyrus, the left superior frontal gyrus, the left posterior cingulate, the left middle temporal gyrus, and the right inferior parietal lobe.
In the simple effects of the deaf and hearing groups with rotation minus comparison, deaf signers mainly showed activity in the right hemisphere, while hearing nonsigners showed bilateral activity. In the simple effects of the rotation task, deaf signers showed decreased activity compared with hearing nonsigners throughout several regions, including the bilateral parahippocampal gyrus, the left posterior cingulate cortex, the right anterior cingulate cortex, and the right inferior parietal lobe. CONCLUSION Decreased activations in several brain regions of deaf signers compared with hearing nonsigners reflected increased neural efficiency and a more precise functional circuitry generated through long-term experience with sign language processing. In addition, we tentatively inferred that there may be a right-hemisphere lateralization pattern for deaf signers when performing mental rotation tasks.
Collapse
Affiliation(s)
- Hong-Bo Le
- Department of Radiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
| | - Hui-Hong Zhang
- Department of Radiology, Shenzhen Hospital of Southern Medical University, Shenzhen, China
- MR Division, Shantou Central Hospital, Shantou, China
| | - Qiu-Lin Wu
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
| | - Jiong Zhang
- Department of Radiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
| | - Jing-Jing Yin
- Department of Radiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
| | - Shu-Hua Ma
- Department of Radiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
| |
Collapse
|
23
|
de Schonen S, Bertoncini J, Petroff N, Couloigner V, Van Den Abbeele T. Visual cortical activity before and after cochlear implantation: A follow up ERP prospective study in deaf children. Int J Psychophysiol 2017; 123:88-102. [PMID: 29108924 DOI: 10.1016/j.ijpsycho.2017.10.009] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/15/2016] [Revised: 10/20/2017] [Accepted: 10/23/2017] [Indexed: 11/20/2022]
Abstract
ERPs were recorded in response to the presentation of static colored patterned stimuli in 25 children (19 to 80 months of age at cochlear implantation, CI) with very early prelingual profound deafness (PreLD), 21 postlingually profoundly deaf children (PostLD) (34 to 180 months of age at CI), and gender- and age-matched hearing control children. Recording sessions were performed before CI, then 6 and 24 months after CI. Results showed that prelingual and, to a lesser degree, postlingual auditory deprivation altered cortical visual neural activity associated with colored shapes from both the P1 and N1 cortical processing stages. The P1 and N1 amplitude modifications vanished about 24 months after CI in both PreLD and PostLD deaf children. In PreLD, the visual processing pattern became similar to the typical one, essentially through a P1 amplitude decrease over the left hemisphere together with an N1 amplitude increase over the right hemisphere. Finally, in PreLD, an increased LH advantage over the RH in N1 amplitude on the cerebellar-occipito-parietal region before CI showed a significant inverse relationship with speech perception outcomes 3 years after CI. Investigating early visual processing development and its neural substrates in deaf children would help to understand the variability of CI outcomes, because their cortical visual organization diverges from that of typically developing hearing children and cannot be predicted from what is observed in deaf adults.
Collapse
Affiliation(s)
- Scania de Schonen
- Laboratory Psychology of Perception, University Paris Descartes-CNRS (UMR8242), Neuroscience and Cognition Institute, Paris, France.
| | - Josiane Bertoncini
- Laboratory Psychology of Perception, University Paris Descartes-CNRS (UMR8242), Neuroscience and Cognition Institute, Paris, France.
| | - Nathalie Petroff
- Department of Otorhinolaryngology and ENT Surgery, University Hospital (CHU), Hôpital Robert Debré, Paris, France.
| | - Vincent Couloigner
- Department of Otorhinolaryngology and ENT Surgery, University Hospital (CHU), Hôpital Robert Debré, Paris, France.
| | - Thierry Van Den Abbeele
- Department of Otorhinolaryngology and ENT Surgery, University Hospital (CHU), Hôpital Robert Debré, Paris, France.
| |
Collapse
|
24
|
Thiessen ED, Girard S, Erickson LC. Statistical learning and the critical period: how a continuous learning mechanism can give rise to discontinuous learning. Wiley Interdiscip Rev Cogn Sci 2016; 7:276-88. [DOI: 10.1002/wcs.1394] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/01/2015] [Revised: 03/31/2016] [Accepted: 04/06/2016] [Indexed: 11/08/2022]
Affiliation(s)
- Erik D. Thiessen
- Department of Psychology, Carnegie Mellon University, Pittsburgh, PA, USA
| | - Sandrine Girard
- Department of Psychology, Carnegie Mellon University, Pittsburgh, PA, USA
| | - Lucy C. Erickson
- Department of Psychology, Carnegie Mellon University, Pittsburgh, PA, USA
| |
Collapse
|
25
|
Ferjan Ramirez N, Leonard MK, Davenport TS, Torres C, Halgren E, Mayberry RI. Neural Language Processing in Adolescent First-Language Learners: Longitudinal Case Studies in American Sign Language. Cereb Cortex 2016; 26:1015-26. [PMID: 25410427 PMCID: PMC4737603 DOI: 10.1093/cercor/bhu273] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
One key question in neurolinguistics is the extent to which the neural processing system for language requires linguistic experience during early life to develop fully. We conducted a longitudinal anatomically constrained magnetoencephalography (aMEG) analysis of lexico-semantic processing in 2 deaf adolescents who had no sustained language input until 14 years of age, when they became fully immersed in American Sign Language. After 2 to 3 years of language experience, the adolescents' neural responses to signed words were highly atypical, localizing mainly to right dorsal frontoparietal regions and often responding more strongly to semantically primed words (Ferjan Ramirez N, Leonard MK, Torres C, Hatrak M, Halgren E, Mayberry RI. 2014. Neural language processing in adolescent first-language learners. Cereb Cortex. 24 (10): 2772-2783). Here, we show that after an additional 15 months of language experience, the adolescents' neural responses remained atypical in terms of polarity. While their responses to less familiar signed words still showed atypical localization patterns, the localization of responses to highly familiar signed words became more concentrated in the left perisylvian language network. Our findings suggest that the timing of language experience affects the organization of neural language processing; however, even in adolescence, language representation in the human brain continues to evolve with experience.
Collapse
Affiliation(s)
- Naja Ferjan Ramirez
- Department of Linguistics
- Multimodal Imaging Laboratory
- Institute for Learning and Brain Sciences, University of Washington, Seattle, WA 98195, USA
| | - Matthew K. Leonard
- Multimodal Imaging Laboratory
- Department of Radiology
- Department of Neurological Surgery, University of California, San Francisco, CA 94158, USA
| | | | | | - Eric Halgren
- Multimodal Imaging Laboratory
- Department of Radiology
- Department of Neuroscience and
- Kavli Institute for Brain and Mind, University of California, San Diego, La Jolla, CA 92093, USA
| | | |
Collapse
|
26
|
Williams JT, Darcy I, Newman SD. Modality-specific processing precedes amodal linguistic processing during L2 sign language acquisition: A longitudinal study. Cortex 2016; 75:56-67. [DOI: 10.1016/j.cortex.2015.11.015] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/13/2015] [Revised: 09/22/2015] [Accepted: 11/17/2015] [Indexed: 12/13/2022]
|
27
|
Shen H, Sabaliauskas N, Yang L, Aoki C, Smith SS. Role of α4-containing GABAA receptors in limiting synaptic plasticity and spatial learning of female mice during the pubertal period. Brain Res 2016; 1654:116-122. [PMID: 26826007 DOI: 10.1016/j.brainres.2016.01.020] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/10/2015] [Accepted: 01/10/2016] [Indexed: 10/22/2022]
Abstract
Expression of α4βδ GABAA receptors (GABARs) increases at the onset of puberty on dendritic spines of CA1 hippocampal pyramidal cells. These receptors reduce activation of NMDA receptors (NMDARs), impair induction of long-term potentiation (LTP), and reduce hippocampal-dependent spatial learning. These effects are not seen in the δ-/- mouse, implicating α4βδ GABARs. Here we show that knock-out of α4 also restores synaptic plasticity and spatial learning in female mice at the onset of puberty (verified by vaginal opening). To this end, field excitatory post-synaptic potentials (fEPSPs) were recorded from the stratum radiatum of CA1 hippocampus in slices from +/+ and α4-/- pubertal mice (PND 35-44). Induction of LTP, in response to stimulation of the Schaffer collaterals with theta burst stimulation (TBS), was unsuccessful in the +/+ hippocampus, but was reinstated by α4 knock-out (~65% potentiation), though not by blockade of α5-GABARs with L-655,708 (50 nM). To compare spatial learning in the two groups of mice, animals were trained in an active place avoidance task in which the latency to first enter a shock zone is a measure of learning. α4-/- mice had significantly longer latencies by the third learning trial, suggesting better spatial learning, compared to +/+ animals, which did not reach the criterion for learning (120 s latency). These findings suggest that knock-out of the GABAR α4 subunit restores synaptic plasticity and spatial learning at puberty, consistent with the concept that the dendritic α4βδ GABARs which emerge at puberty selectively impair CNS plasticity. This article is part of a Special Issue entitled SI: Adolescent plasticity.
Collapse
Affiliation(s)
- Hui Shen
- School of Biomedical Engineering, Tianjin Medical University, No. 22 Qixiangtai Road, Heping District, Tianjin 300070 China; Department of Physiology and Pharmacology, SUNY Downstate Medical Center, 450 Clarkson Ave., Brooklyn, NY 11203, USA
| | - Nicole Sabaliauskas
- Department of Physiology and Pharmacology, SUNY Downstate Medical Center, 450 Clarkson Ave., Brooklyn, NY 11203, USA; Center for Neural Science, New York University, 4 Washington Place, New York, NY 10003, USA
| | - Lie Yang
- Department of Physiology and Pharmacology, SUNY Downstate Medical Center, 450 Clarkson Ave., Brooklyn, NY 11203, USA
| | - Chiye Aoki
- Center for Neural Science, New York University, 4 Washington Place, New York, NY 10003, USA
| | - Sheryl S Smith
- Department of Physiology and Pharmacology, SUNY Downstate Medical Center, 450 Clarkson Ave., Brooklyn, NY 11203, USA.
| |
Collapse
|
28
|
Wei M, Joshi AA, Zhang M, Mei L, Manis FR, He Q, Beattie RL, Xue G, Shattuck DW, Leahy RM, Xue F, Houston SM, Chen C, Dong Q, Lu ZL. How age of acquisition influences brain architecture in bilinguals. J Neurolinguistics 2015; 36:35-55. [PMID: 27695193 PMCID: PMC5045052 DOI: 10.1016/j.jneuroling.2015.05.001] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/06/2023]
Abstract
In the present study, we explored how Age of Acquisition (AoA) of L2 affected brain structure in bilingual individuals. Thirty-six bilingual native English speakers were scanned with high-resolution MRI. After MRI signal intensity inhomogeneity correction, we applied both voxel-based morphometry (VBM) and surface-based morphometry (SBM) approaches to the data. VBM analysis was performed using FSL's standard VBM processing pipeline. For the SBM analysis, we utilized a semi-automated sulci delineation procedure, registered the brains to an atlas, and extracted measures of twenty-four pre-selected regions of interest. We addressed three questions: (1) Which areas are more susceptible to differences in AoA? (2) How do AoA, proficiency, and current level of exposure work together in predicting structural differences in the brain? (3) What is the direction of the effect of AoA on regional volumetric and surface measures? Both VBM and SBM results suggested that earlier second language exposure was associated with larger volumes in the right parietal cortex. Consistently, SBM showed that the cortical area of the right superior parietal lobule increased as AoA decreased. In contrast, in the right pars orbitalis of the inferior frontal gyrus, AoA, proficiency, and current level of exposure were equally important in accounting for the structural differences. We interpret our results in terms of current theory and research on the effects of L2 learning on brain structures and functions.
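The question "which areas are susceptible to AoA" reduces, in the VBM framework, to a mass-univariate association between grey-matter density and AoA at every voxel. A minimal sketch of that idea with toy data (an illustration of the statistical logic, not the FSL pipeline the authors used; all names and values are invented):

```python
import numpy as np

def voxelwise_aoa_effect(gm_maps, aoa):
    """Correlate grey-matter density with Age of Acquisition at every voxel.

    gm_maps: (n_subjects, n_voxels) array of smoothed GM density values.
    aoa:     (n_subjects,) ages of L2 acquisition.
    Returns one Pearson r per voxel; a negative r means earlier AoA goes
    with larger values, the direction reported for right parietal cortex.
    """
    gm = np.asarray(gm_maps, dtype=float)
    x = np.asarray(aoa, dtype=float)
    gm_c = gm - gm.mean(axis=0)          # center each voxel across subjects
    x_c = x - x.mean()                   # center the AoA regressor
    num = gm_c.T @ x_c                   # per-voxel covariance (unnormalized)
    den = np.sqrt((gm_c ** 2).sum(axis=0) * (x_c ** 2).sum())
    return num / den

# Toy data: 36 subjects, 5 voxels; voxel 0 tracks -AoA, the rest are noise
rng = np.random.default_rng(1)
aoa = rng.uniform(1, 18, 36)
gm = rng.standard_normal((36, 5))
gm[:, 0] = -aoa + 0.1 * rng.standard_normal(36)
r = voxelwise_aoa_effect(gm, aoa)
print(r[0])  # strongly negative for the AoA-tracking voxel
```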
Collapse
Affiliation(s)
- Miao Wei
- Department of Psychology, University of Southern California, Los Angeles, CA 90089-1061, USA
| | - Anand A. Joshi
- Signal and Image Processing Institute, University of Southern California, Los Angeles, CA 90089-2564, USA
| | - Mingxia Zhang
- National Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875, China
| | - Leilei Mei
- Center for Studies of Psychological Application and School of Psychology, South China Normal University, Guangzhou 510631, China
| | - Franklin R. Manis
- Department of Psychology, University of Southern California, Los Angeles, CA 90089-1061, USA
| | - Qinghua He
- Department of Psychology, University of Southern California, Los Angeles, CA 90089-1061, USA
| | - Rachel L. Beattie
- Center for Cognitive and Behavioral Brain Imaging and Department of Psychology, The Ohio State University, Columbus, OH 43210, USA
| | - Gui Xue
- National Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875, China
| | - David W. Shattuck
- Ahmanson-Lovelace Brain Mapping Center, Department of Neurology, David Geffen School of Medicine, University of California, Los Angeles, CA 90095-7334, USA
| | - Richard M. Leahy
- Signal and Image Processing Institute, University of Southern California, Los Angeles, CA 90089-2564, USA
| | - Feng Xue
- Department of Psychology, University of Southern California, Los Angeles, CA 90089-1061, USA
| | - Suzanne M. Houston
- Department of Psychology, University of Southern California, Los Angeles, CA 90089-1061, USA
| | - Chuansheng Chen
- Department of Psychology and Social Behavior, University of California Irvine, Irvine, CA 92697, USA
| | - Qi Dong
- National Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875, China
| | - Zhong-Lin Lu
- Center for Cognitive and Behavioral Brain Imaging and Department of Psychology, The Ohio State University, Columbus, OH 43210, USA
| |
Collapse
|
29
|
Román P, González J, Ventura-Campos N, Rodríguez-Pujadas A, Sanjuán A, Ávila C. Neural differences between monolinguals and early bilinguals in their native language during comprehension. BRAIN AND LANGUAGE 2015; 150:80-89. [PMID: 26340683 DOI: 10.1016/j.bandl.2015.07.011] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/23/2014] [Revised: 07/15/2015] [Accepted: 07/20/2015] [Indexed: 06/05/2023]
Abstract
Research has shown that semantic processing of sentences engages more activity in the bilingual than in the monolingual brain, more specifically in the inferior frontal gyrus. The present study aims to extend those results and examines whether semantic and grammatical sentence processing involve different cerebral structures when testing in the native language. To this end, highly proficient Spanish/Catalan bilinguals and Spanish monolinguals made grammatical and semantic judgments in Spanish while being scanned. Results showed that both types of judgments recruited more cerebral activity in bilinguals in language-related areas, including the superior and middle temporal gyri. Such neural differences co-occurred with similar performance at the behavioral level. Taken together, these data suggest that early bilingualism shapes the brain and cognitive processes in sentence comprehension even in the native language; on the other hand, they indicate that brain overactivation in bilinguals is not constrained to a specific area.
Collapse
Affiliation(s)
- P Román
- Department of Basic and Clinical Psychology and Psychobiology, Universitat Jaume I, Castelló 12701, Spain; Department of Psychology, Pennsylvania State University, University Park 16802, USA
| | - J González
- Department of Basic and Clinical Psychology and Psychobiology, Universitat Jaume I, Castelló 12701, Spain
| | - N Ventura-Campos
- Department of Basic and Clinical Psychology and Psychobiology, Universitat Jaume I, Castelló 12701, Spain
| | - A Rodríguez-Pujadas
- Department of Basic and Clinical Psychology and Psychobiology, Universitat Jaume I, Castelló 12701, Spain
| | - A Sanjuán
- Department of Basic and Clinical Psychology and Psychobiology, Universitat Jaume I, Castelló 12701, Spain; Wellcome Trust Centre for Neuroimaging, University College London, London WC1N 3BG, UK
| | - C Ávila
- Department of Basic and Clinical Psychology and Psychobiology, Universitat Jaume I, Castelló 12701, Spain
| |
Collapse
|
30
|
Neural systems supporting linguistic structure, linguistic experience, and symbolic communication in sign language and gesture. Proc Natl Acad Sci U S A 2015; 112:11684-9. [PMID: 26283352 DOI: 10.1073/pnas.1510527112] [Citation(s) in RCA: 51] [Impact Index Per Article: 5.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
Sign languages used by deaf communities around the world possess the same structural and organizational properties as spoken languages: In particular, they are richly expressive and also tightly grammatically constrained. They therefore offer the opportunity to investigate the extent to which the neural organization for language is modality independent, as well as to identify ways in which modality influences this organization. The fact that sign languages share the visual-manual modality with a nonlinguistic symbolic communicative system-gesture-further allows us to investigate where the boundaries lie between language and symbolic communication more generally. In the present study, we had three goals: to investigate the neural processing of linguistic structure in American Sign Language (using verbs of motion classifier constructions, which may lie at the boundary between language and gesture); to determine whether we could dissociate the brain systems involved in deriving meaning from symbolic communication (including both language and gesture) from those specifically engaged by linguistically structured content (sign language); and to assess whether sign language experience influences the neural systems used for understanding nonlinguistic gesture. The results demonstrated that even sign language constructions that appear on the surface to be similar to gesture are processed within the left-lateralized frontal-temporal network used for spoken languages-supporting claims that these constructions are linguistically structured. Moreover, although nonsigners engage regions involved in human action perception to process communicative, symbolic gestures, signers instead engage parts of the language-processing network-demonstrating an influence of experience on the perception of nonlinguistic stimuli.
Collapse
|
31
|
Hall ML, Ferreira VS, Mayberry RI. Syntactic priming in American Sign Language. PLoS One 2015; 10:e0119611. [PMID: 25786230 PMCID: PMC4364966 DOI: 10.1371/journal.pone.0119611] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/13/2014] [Accepted: 02/02/2015] [Indexed: 11/18/2022] Open
Abstract
Psycholinguistic studies of sign language processing provide valuable opportunities to assess whether language phenomena, which are primarily studied in spoken language, are fundamentally shaped by peripheral biology. For example, we know that when given a choice between two syntactically permissible ways to express the same proposition, speakers tend to choose structures that were recently used, a phenomenon known as syntactic priming. Here, we report two experiments testing syntactic priming of a noun phrase construction in American Sign Language (ASL). Experiment 1 shows that second language (L2) signers with normal hearing exhibit syntactic priming in ASL and that priming is stronger when the head noun is repeated between prime and target (the lexical boost effect). Experiment 2 shows that syntactic priming is equally strong among deaf native L1 signers, deaf late L1 learners, and hearing L2 signers. Experiment 2 also tested for, but did not find evidence of, phonological or semantic boosts to syntactic priming in ASL. These results show that despite the profound differences between spoken and signed languages in terms of how they are produced and perceived, the psychological representation of sentence structure (as assessed by syntactic priming) operates similarly in sign and speech.
Collapse
Affiliation(s)
- Matthew L. Hall
- Linguistics, University of Connecticut, Storrs, Connecticut, United States of America
- * E-mail:
| | - Victor S. Ferreira
- Psychology, UC San Diego, San Diego, California, United States of America
| | - Rachel I. Mayberry
- Linguistics, UC San Diego, San Diego, California, United States of America
| |
Collapse
|
32
|
Nielsen K. Phonetic imitation by young children and its developmental changes. JOURNAL OF SPEECH, LANGUAGE, AND HEARING RESEARCH : JSLHR 2014; 57:2065-2075. [PMID: 25076096 DOI: 10.1044/2014_jslhr-s-13-0093] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/11/2013] [Accepted: 07/04/2014] [Indexed: 06/03/2023]
Abstract
PURPOSE In the current study, the author investigated the developmental course of phonetic imitation in childhood and further evaluated existing accounts of phonetic imitation. METHOD Sixteen preschoolers, 15 third graders, and 18 college students participated. An experiment with a modified imitation paradigm and a picture-naming task was conducted, in which participants' voice-onset time (VOT) was compared before and after they were exposed to target speech with artificially increased VOT. RESULTS Extended VOT in the target speech was imitated by preschoolers and third graders as well as adults, confirming previous findings on phonetic imitation. Furthermore, an age effect of phonetic imitation was observed: children showed greater imitation than adults, whereas the degree of imitation was comparable between preschoolers and third graders. No significant effect of gender or word specificity was observed. CONCLUSIONS Young children imitated fine phonetic details of the target speech, and a greater degree of phonetic imitation was observed in children than in adults. These findings suggest that the degree of phonetic imitation negatively correlates with phonological development.
Collapse
|
33
|
Baus C, Gutiérrez E, Carreiras M. The role of syllables in sign language production. Front Psychol 2014; 5:1254. [PMID: 25431562 PMCID: PMC4230165 DOI: 10.3389/fpsyg.2014.01254] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/14/2014] [Accepted: 10/15/2014] [Indexed: 11/19/2022] Open
Abstract
The aim of the present study was to investigate the functional role of syllables in sign language and how different phonological combinations influence sign production. Moreover, the influence of age of acquisition was evaluated. Deaf signers (native and non-native) of Catalan Sign Language (LSC) were asked, in a picture-sign interference task, to sign picture names while ignoring distractor-signs with which they shared two phonological parameters (out of the three main sign parameters: Location, Movement, and Handshape). The results revealed a different impact of the three phonological combinations. While no effect was observed for the phonological combination Handshape-Location, the combination Handshape-Movement slowed down signing latencies, but only in the non-native group. A facilitatory effect was observed for both groups when pictures and distractors shared Location-Movement. Importantly, linguistic models have considered this phonological combination to be a privileged unit in the composition of signs, as syllables are in spoken languages. Thus, our results support the functional role of syllable units during phonological articulation in sign language production.
Collapse
Affiliation(s)
- Cristina Baus
- Laboratoire de Psychologie Cognitive, Centre National de la Recherche Scientifique (CNRS), Université d'Aix-Marseille, Marseille, France
| | - Eva Gutiérrez
- Deafness, Cognition and Language Research Centre, University College London, London, UK
| | - Manuel Carreiras
- BCBL - Basque Research Center on Cognition, Brain and Language, Donostia, Spain; IKERBASQUE, Basque Foundation for Science, Bilbao, Spain; Departamento de Lengua Vasca y Comunicación, Universidad del País Vasco, Donostia, Spain
| |
Collapse
|
34
|
Ferjan Ramirez N, Leonard MK, Torres C, Hatrak M, Halgren E, Mayberry RI. Neural language processing in adolescent first-language learners. Cereb Cortex 2014; 24:2772-83. [PMID: 23696277 PMCID: PMC4153811 DOI: 10.1093/cercor/bht137] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
The relation between the timing of language input and development of neural organization for language processing in adulthood has been difficult to tease apart because language is ubiquitous in the environment of nearly all infants. However, within the congenitally deaf population are individuals who do not experience language until after early childhood. Here, we investigated the neural underpinnings of American Sign Language (ASL) in 2 adolescents who had no sustained language input until they were approximately 14 years old. Using anatomically constrained magnetoencephalography, we found that recently learned signed words mainly activated right superior parietal, anterior occipital, and dorsolateral prefrontal areas in these 2 individuals. This spatiotemporal activity pattern was significantly different from the left fronto-temporal pattern observed in young deaf adults who acquired ASL from birth, and from that of hearing young adults learning ASL as a second language for a similar length of time as the cases. These results provide direct evidence that the timing of language experience over human development affects the organization of neural language processing.
Collapse
Affiliation(s)
| | | | | | | | - Eric Halgren
- Multimodal Imaging Laboratory
- Department of Radiology
- Department of Neurosciences
- Kavli Institute for Brain and Mind, University of California, San Diego, USA
| | | |
Collapse
|
35
|
Komeilipoor N, Vicario CM, Daffertshofer A, Cesari P. Talking hands: tongue motor excitability during observation of hand gestures associated with words. Front Hum Neurosci 2014; 8:767. [PMID: 25324761 PMCID: PMC4179693 DOI: 10.3389/fnhum.2014.00767] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/27/2014] [Accepted: 09/10/2014] [Indexed: 11/15/2022] Open
Abstract
Perception of speech and gestures engages common brain areas. Neural regions involved in speech perception overlap with those involved in speech production in an articulator-specific manner. Yet, it is unclear whether motor cortex also has a role in processing communicative actions like gesture and sign language. We asked whether the mere observation of hand gestures, paired or not paired with words, results in changes in the excitability of the hand and tongue areas of motor cortex. Using single-pulse transcranial magnetic stimulation (TMS), we measured motor excitability in the tongue and hand areas of the left primary motor cortex while participants viewed video sequences of bimanual hand movements associated or not associated with nouns. We found higher motor excitability in the tongue area during the presentation of meaningful (noun-associated) gestures as opposed to meaningless ones, while the excitability of the hand motor area was not differentially affected by gesture observation. Our results suggest that the observation of gestures associated with a word activates the articulatory motor network that accompanies speech production.
Collapse
Affiliation(s)
- Naeem Komeilipoor
- Department of Neurological and Movement Sciences, University of Verona, Verona, Italy; MOVE Research Institute Amsterdam, VU University Amsterdam, Amsterdam, Netherlands
| | | | | | - Paola Cesari
- Department of Neurological and Movement Sciences, University of Verona, Verona, Italy
| |
Collapse
|
36
|
Kovelman I, Shalinsky MH, Berens MS, Petitto LA. Words in the bilingual brain: an fNIRS brain imaging investigation of lexical processing in sign-speech bimodal bilinguals. Front Hum Neurosci 2014; 8:606. [PMID: 25191247 PMCID: PMC4139656 DOI: 10.3389/fnhum.2014.00606] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/27/2014] [Accepted: 07/21/2014] [Indexed: 11/29/2022] Open
Abstract
Early bilingual exposure, especially exposure to two languages in different modalities such as speech and sign, can profoundly affect an individual's language, culture, and cognition. Here we explore the hypothesis that bimodal dual language exposure can also affect the brain's organization for language. These changes occur across brain regions universally important for language and parietal regions especially critical for sign language (Newman et al., 2002). We investigated three groups of participants (N = 29) that completed a word repetition task in American Sign Language (ASL) during fNIRS brain imaging. Those groups were (1) hearing ASL-English bimodal bilinguals (n = 5), (2) deaf ASL signers (n = 7), and (3) English monolinguals naïve to sign language (n = 17). The key finding of the present study is that bimodal bilinguals showed reduced activation in left parietal regions relative to deaf ASL signers when asked to use only ASL. In contrast, this group of bimodal signers showed greater activation in left temporo-parietal regions relative to English monolinguals when asked to switch between their two languages (Kovelman et al., 2009). Converging evidence now suggests that bimodal bilingual experience changes the brain bases of language, including the left temporo-parietal regions known to be critical for sign language processing (Emmorey et al., 2007). The results provide insight into the resilience and constraints of neural plasticity for language and bilingualism.
Collapse
Affiliation(s)
- Ioulia Kovelman
- Department of Psychology and Center for Human Growth and Development, University of Michigan, Ann Arbor, MI, USA
| | - Mark H Shalinsky
- Department of Psychology and Center for Human Growth and Development, University of Michigan, Ann Arbor, MI, USA
| | | | - Laura-Ann Petitto
- Visual Language and Visual Learning (VL2), Science of Learning Center, Gallaudet University, National Science Foundation, Washington, DC, USA
| |
Collapse
|
37
|
Li Q, Xia S, Zhao F, Qi J. Functional changes in people with different hearing status and experiences of using Chinese sign language: an fMRI study. JOURNAL OF COMMUNICATION DISORDERS 2014; 50:51-60. [PMID: 24958241 DOI: 10.1016/j.jcomdis.2014.05.001] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/14/2013] [Revised: 03/22/2014] [Accepted: 05/16/2014] [Indexed: 06/03/2023]
Abstract
The purpose of this study was to assess functional changes in the cerebral cortex in people with different sign language experience and hearing status whilst observing and imitating Chinese Sign Language (CSL), using functional magnetic resonance imaging (fMRI). Fifty participants took part in the study and were divided into four groups according to their hearing status and experience of using sign language: a prelingual deaf signer group (PDS), a normal-hearing non-signer group (HnS), a native signer group with normal hearing (HNS), and an acquired signer group with normal hearing (HLS). fMRI images were acquired from all subjects while they performed block-designed tasks that involved observing and imitating sign language stimuli. Nine activation areas were found in response to either the observation or the imitation CSL task, and three activated areas were found only during the imitation task. Of those, the PDS group had significantly greater activation, in terms of the cluster size of the activated voxels, in the bilateral superior parietal lobule, cuneate lobe, and lingual gyrus in response to either the observation or the imitation CSL task than the HnS, HNS, and HLS groups. The PDS group also showed significantly greater activation in the bilateral inferior frontal gyrus, which was also found in the HNS and HLS groups but not in the HnS group. This indicates that deaf signers have better sign language proficiency because they engage more actively with the phonetic and semantic elements. In addition, activation of the bilateral superior temporal gyrus and inferior parietal lobule was found only in the PDS and HNS groups, which indicates that the area for sign language processing appears to be sensitive to the age of language acquisition. LEARNING OUTCOMES After reading this article, readers will be able to: discuss the relationship between sign language and its neural mechanisms.
Collapse
Affiliation(s)
- Qiang Li
- Engineering College for the Deaf, Tianjin University of Technology, Tianjin, China
| | - Shuang Xia
- Department of Radiology, Tianjin First Central Hospital, Tianjin 300192, China.
| | - Fei Zhao
- Centre for Hearing and Balance Studies, University of Bristol, UK
| | - Ji Qi
- Department of Radiology, Tianjin First Central Hospital, Tianjin 300192, China
| |
Collapse
|
38
|
Malaia E, Talavage TM, Wilbur RB. Functional connectivity in task-negative network of the Deaf: effects of sign language experience. PeerJ 2014; 2:e446. [PMID: 25024915 PMCID: PMC4081178 DOI: 10.7717/peerj.446] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/16/2014] [Accepted: 06/02/2014] [Indexed: 01/23/2023] Open
Abstract
Prior studies investigating cortical processing in Deaf signers suggest that life-long experience with sign language and/or auditory deprivation may alter the brain's anatomical structure and the function of brain regions typically recruited for auditory processing (Emmorey et al., 2010; Pénicaud et al., 2013, inter alia). We report the first investigation of the task-negative network in Deaf signers and its functional connectivity, that is, the temporal correlations among spatially remote neurophysiological events. We show that Deaf signers manifest increased functional connectivity between the posterior cingulate/precuneus and the left medial temporal gyrus (MTG), and also between the inferior parietal lobe and the medial temporal gyrus in the right hemisphere: areas that have been found to show functional recruitment specifically during sign language processing. These findings suggest that the organization of the brain at the level of inter-network connectivity is likely affected by experience with processing visual language, although sensory deprivation could be another source of the difference. We hypothesize that connectivity alterations in the task-negative network reflect predictive/automatized processing of the visual signal.
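Functional connectivity in this sense is simply the temporal correlation between two regions' BOLD time series. A minimal sketch with synthetic signals (the ROI names and signal model are illustrative assumptions, not the authors' pipeline):

```python
import numpy as np

def functional_connectivity(ts_a, ts_b):
    """Functional connectivity as the Pearson correlation between two
    ROI time series (e.g., posterior cingulate/precuneus and left MTG)."""
    ts_a = np.asarray(ts_a, dtype=float)
    ts_b = np.asarray(ts_b, dtype=float)
    return float(np.corrcoef(ts_a, ts_b)[0, 1])

# Synthetic example: a shared slow fluctuation plus independent noise
rng = np.random.default_rng(0)
shared = np.sin(np.linspace(0, 8 * np.pi, 200))
roi_pcc = shared + 0.5 * rng.standard_normal(200)
roi_mtg = shared + 0.5 * rng.standard_normal(200)
print(functional_connectivity(roi_pcc, roi_mtg))  # high positive r
```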
Collapse
Affiliation(s)
- Evie Malaia
- Center for Mind, Brain, and Education, University of Texas at Arlington, TX, USA
| | - Thomas M Talavage
- Weldon School of Biomedical Engineering, Purdue University, IN, USA; School of Electrical and Computer Engineering, Purdue University, IN, USA
| | - Ronnie B Wilbur
- Speech, Language, and Hearing Sciences, and Linguistics Program, Purdue University, IN, USA
| |
Collapse
|
39
|
Hirshorn EA, Dye MWG, Hauser PC, Supalla TR, Bavelier D. Neural networks mediating sentence reading in the deaf. Front Hum Neurosci 2014; 8:394. [PMID: 24959127 PMCID: PMC4050738 DOI: 10.3389/fnhum.2014.00394] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2014] [Accepted: 05/17/2014] [Indexed: 11/13/2022] Open
Abstract
The present work addresses the neural bases of sentence reading in deaf populations. To better understand the relative roles of deafness and spoken language knowledge in shaping the neural networks that mediate sentence reading, three populations with different degrees of English knowledge and depth of hearing loss were included: deaf signers, oral deaf, and hearing individuals. The three groups were matched for reading comprehension and scanned while reading sentences. A similar neural network of left perisylvian areas was observed, supporting the view of a shared network of areas for reading despite differences in hearing and English knowledge. However, differences were observed, in particular in the auditory cortex, with deaf signers and oral deaf showing the greatest bilateral superior temporal gyrus (STG) recruitment as compared to hearing individuals. Importantly, within deaf individuals, the same STG area in the left hemisphere showed greater recruitment as hearing loss increased. To further understand the functional role of such auditory cortex reorganization after deafness, connectivity analyses were performed from the STG regions identified above. Connectivity from the left STG toward areas typically associated with semantic processing (BA45 and thalami) was greater in deaf signers and in oral deaf as compared to hearing individuals. In contrast, connectivity from the left STG toward areas identified with speech-based processing was greater in hearing individuals and in oral deaf as compared to deaf signers. These results support the growing literature indicating recruitment of auditory areas after congenital deafness for visually-mediated language functions, and establish that both auditory deprivation and language experience shape its functional reorganization. Implications for differential reliance on semantic vs. phonological pathways during reading in the three groups are discussed.
Collapse
Affiliation(s)
- Elizabeth A Hirshorn
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA; Learning Research and Development Center, University of Pittsburgh, Pittsburgh, PA, USA
| | - Matthew W G Dye
- Department of Speech and Hearing Science, University of Illinois at Urbana-Champaign, Champaign, IL, USA; Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, Urbana, IL, USA; Department of Psychology, University of Illinois at Urbana-Champaign, Champaign, IL, USA
| | - Peter C Hauser
- National Technical Institute for the Deaf, Rochester Institute of Technology, Rochester, NY, USA
| | - Ted R Supalla
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
| | - Daphne Bavelier
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA; Faculté de Psychologie et des Sciences de l'éducation, Université de Genève, Geneva, Switzerland
| |
Collapse
|
40
|
Hänel-Faulhaber B, Skotara N, Kügow M, Salden U, Bottari D, Röder B. ERP correlates of German Sign Language processing in deaf native signers. BMC Neurosci 2014; 15:62. [PMID: 24884527 PMCID: PMC4018965 DOI: 10.1186/1471-2202-15-62] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/18/2013] [Accepted: 04/28/2014] [Indexed: 11/27/2022] Open
Abstract
Background The present study investigated the neural correlates of sign language processing in Deaf people who had learned German Sign Language (Deutsche Gebärdensprache, DGS) from their Deaf parents as their first language. Correct and incorrect signed sentences were presented sign by sign on a computer screen. At the end of each sentence the participants had to judge whether or not the sentence was an appropriate DGS sentence. Two types of violations were introduced: (1) semantically incorrect sentences containing a selectional restriction violation (implausible object); (2) morphosyntactically incorrect sentences containing a verb that was incorrectly inflected (i.e., incorrect direction of movement). Event-related brain potentials (ERPs) were recorded from 74 scalp electrodes. Results Semantic violations (implausible signs) elicited an N400 effect followed by a positivity. Sentences with a morphosyntactic violation (verb agreement violation) elicited a negativity followed by a broad centro-parietal positivity. Conclusions ERP correlates of semantic and morphosyntactic aspects of DGS clearly differed from each other and showed a number of similarities with those observed in other signed and oral languages. These data suggest a similar functional organization of signed and oral languages despite the visual-spatial modality of sign language.
Collapse
Affiliation(s)
- Barbara Hänel-Faulhaber
- University of Hamburg, Biological Psychology and Neuropsychology, Von-Melle-Park 11, 20146 Hamburg, Germany.
| | | | | | | | | | | |
Collapse
|
41
|
Scott GD, Karns CM, Dow MW, Stevens C, Neville HJ. Enhanced peripheral visual processing in congenitally deaf humans is supported by multiple brain regions, including primary auditory cortex. Front Hum Neurosci 2014; 8:177. [PMID: 24723877 PMCID: PMC3972453 DOI: 10.3389/fnhum.2014.00177] [Citation(s) in RCA: 64] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/13/2014] [Accepted: 03/10/2014] [Indexed: 11/17/2022] Open
Abstract
Brain reorganization associated with altered sensory experience clarifies the critical role of neuroplasticity in development. An example is enhanced peripheral visual processing associated with congenital deafness, but the neural systems supporting this have not been fully characterized. A gap in our understanding of deafness-enhanced peripheral vision is the contribution of primary auditory cortex. Previous studies of auditory cortex that use anatomical normalization across participants were limited by inter-subject variability of Heschl's gyrus. In addition to reorganized auditory cortex (cross-modal plasticity), a second gap in our understanding is the contribution of altered modality-specific cortices (visual intramodal plasticity in this case), as well as supramodal and multisensory cortices, especially when target detection is required across contrasts. Here we address these gaps by comparing fMRI signal change for peripheral vs. perifoveal visual stimulation (11-15° vs. 2-7°) in congenitally deaf and hearing participants in a blocked experimental design with two analytical approaches: a Heschl's gyrus region of interest analysis and a whole brain analysis. Our results using individually-defined primary auditory cortex (Heschl's gyrus) indicate that fMRI signal change for more peripheral stimuli was greater than perifoveal in deaf but not in hearing participants. Whole-brain analyses revealed differences between deaf and hearing participants for peripheral vs. perifoveal visual processing in primary auditory cortex, superior-temporal auditory cortex, extrastriate visual cortex including MT+/V5, and multisensory and/or supramodal regions, such as posterior parietal cortex (PPC), frontal eye fields, anterior cingulate, and supplementary eye fields. Overall, these data demonstrate the contribution of neuroplasticity in multiple systems including primary auditory cortex, supramodal, and multisensory regions, to altered visual processing in congenitally deaf adults.
Affiliation(s)
- Gregory D. Scott
- Brain Development Lab, Department of Psychology, University of Oregon, Eugene, OR, USA
- Division of Pulmonary and Critical Care Medicine, Department of Medicine, Oregon Health and Science University, Portland, OR, USA
- Christina M. Karns
- Brain Development Lab, Department of Psychology, University of Oregon, Eugene, OR, USA
- Mark W. Dow
- Brain Development Lab, Department of Psychology, University of Oregon, Eugene, OR, USA
- Helen J. Neville
- Brain Development Lab, Department of Psychology, University of Oregon, Eugene, OR, USA
42
|
Inubushi T, Sakai KL. Functional and anatomical correlates of word-, sentence-, and discourse-level integration in sign language. Front Hum Neurosci 2013; 7:681. [PMID: 24155706 PMCID: PMC3804906 DOI: 10.3389/fnhum.2013.00681] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/30/2013] [Accepted: 09/27/2013] [Indexed: 11/17/2022] Open
Abstract
In both vocal and sign languages, we can distinguish word-, sentence-, and discourse-level integration in terms of hierarchical processes, which integrate various elements into another higher level of constructs. In the present study, we used functional magnetic resonance imaging (fMRI) and voxel-based morphometry (VBM) to test three language tasks in Japanese Sign Language (JSL): word-level (Word), sentence-level (Sent), and discourse-level (Disc) decision tasks. We analyzed cortical activity and gray matter (GM) volumes of Deaf signers, and clarified three major points. First, we found that the activated regions in the frontal language areas gradually expanded in the dorso-ventral axis, corresponding to a difference in linguistic units for the three tasks. Moreover, the activations in each region of the frontal language areas were incrementally modulated with the level of linguistic integration. These dual mechanisms of the frontal language areas may reflect a basic organization principle of hierarchically integrating linguistic information. Secondly, activations in the lateral premotor cortex and inferior frontal gyrus were left-lateralized. Direct comparisons among the language tasks exhibited more focal activation in these regions, suggesting their functional localization. Thirdly, we found significantly positive correlations between individual task performances and GM volumes in localized regions, even when the ages of acquisition (AOAs) of JSL and Japanese were factored out. More specifically, correlations with the performances of the Word and Sent tasks were found in the left precentral/postcentral gyrus and insula, respectively, while correlations with those of the Disc task were found in the left ventral inferior frontal gyrus and precuneus. The unification of functional and anatomical studies would thus be fruitful for understanding human language systems from the aspects of both universality and individuality.
Affiliation(s)
- Tomoo Inubushi
- Department of Basic Science, Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan; Japan Society for the Promotion of Science, Tokyo, Japan
43
Smith SS. The influence of stress at puberty on mood and learning: role of the α4βδ GABAA receptor. Neuroscience 2013; 249:192-213. [PMID: 23079628 PMCID: PMC3586385 DOI: 10.1016/j.neuroscience.2012.09.065] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2012] [Revised: 09/21/2012] [Accepted: 09/25/2012] [Indexed: 11/22/2022]
Abstract
It is well-known that the onset of puberty is associated with changes in mood as well as cognition. Stress can have an impact on these outcomes, which, in many cases, can be more influential in females, suggesting that gender differences exist. The adolescent period is a vulnerable time for the onset of certain psychopathologies, including anxiety disorders, depression and eating disorders, which are also more prevalent in females. One factor which may contribute to stress-triggered anxiety at puberty is the GABAA receptor (GABAR), which is known to play a pivotal role in anxiety. Expression of α4βδ GABARs increases on the dendrites of CA1 pyramidal cells at the onset of puberty in the hippocampus, part of the limbic circuitry which governs emotion. This receptor is a sensitive target for the stress steroid 3α-OH-5[α]β-pregnan-20-one or [allo]pregnanolone, which paradoxically reduces inhibition and increases anxiety during the pubertal period (post-natal day ∼35-44) of female mice, in contrast to its usual effect to enhance inhibition and reduce anxiety. Spatial learning and synaptic plasticity are also adversely impacted at puberty, likely a result of increased expression of α4βδ GABARs on the dendritic spines of CA1 hippocampal pyramidal cells, which are essential for consolidation of memory. This review will focus on the role of these receptors in mediating behavioral changes at puberty. Stress-mediated changes in mood and cognition in early adolescence may have relevance for the expression of psychopathologies in adulthood.
Affiliation(s)
- S S Smith
- Department of Physiology and Pharmacology, SUNY Downstate Medical Center, 450 Clarkson Avenue, Brooklyn, NY 11203, USA.
44
Communicative and noncommunicative point-light actions featuring high-resolution representation of the hands and fingers. Behav Res Methods 2013; 45:319-28. [PMID: 23073730 DOI: 10.3758/s13428-012-0273-2] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
We describe the creation of a set of point-light movies depicting 43 communicative gestures and 43 noncommunicative, pantomimed actions. These actions were recorded using a motion capture system that is worn on the body and provides accurate capture of the positions and movements of individual fingers. The movies created thus include point-lights on the fingers, allowing for representation of actions and gestures that would not be possible with a conventional, line-of-sight-based motion capture system. These videos would be suitable for use in cognitive and cognitive neuroscientific studies of biological motion and gesture perception. Each video is described, along with an H statistic indicating the consistency of the descriptive labels that 20 observers gave to the actions. We also produced a scrambled version of each movie, in which the starting position of each point was randomized but its local motion vector was preserved. These scrambled movies would be suitable for use as control stimuli in experimental studies. As supplementary materials, we provide QuickTime movie files of each action, along with text files specifying the three-dimensional coordinates of each point-light in each frame of each movie.
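The scrambling procedure described above — randomize each point-light's starting position while preserving its frame-to-frame motion vectors — can be sketched as follows. This is an illustrative sketch only, not the authors' code; the function name, data layout (a list of frames, each a list of (x, y) tuples), and normalized coordinate ranges are assumptions.

```python
import random

def scramble_pointlights(frames, x_range=(0.0, 1.0), y_range=(0.0, 1.0), seed=0):
    """Scramble a point-light movie: give each point a new random starting
    position, then re-apply that point's original per-frame displacement
    vectors so its local motion is preserved."""
    rng = random.Random(seed)
    n_points = len(frames[0])
    # Random starting position for every point-light.
    starts = [(rng.uniform(*x_range), rng.uniform(*y_range))
              for _ in range(n_points)]
    scrambled = [list(starts)]
    # Walk the original movie frame by frame, copying each point's
    # displacement (local motion vector) onto its relocated counterpart.
    for prev, cur in zip(frames, frames[1:]):
        new_frame = []
        for (sx, sy), (px, py), (cx, cy) in zip(scrambled[-1], prev, cur):
            dx, dy = cx - px, cy - py  # original local motion vector
            new_frame.append((sx + dx, sy + dy))
        scrambled.append(new_frame)
    return scrambled
```

Because only the starting positions are perturbed, the scrambled movie retains the low-level motion energy of the original while destroying its global configuration — the property that makes such movies useful as control stimuli.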
45
How age of bilingual exposure can change the neural systems for language in the developing brain: a functional near infrared spectroscopy investigation of syntactic processing in monolingual and bilingual children. Dev Cogn Neurosci 2013; 6:87-101. [PMID: 23974273 PMCID: PMC6987800 DOI: 10.1016/j.dcn.2013.06.005] [Citation(s) in RCA: 51] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/15/2012] [Revised: 06/23/2013] [Accepted: 06/30/2013] [Indexed: 11/23/2022] Open
Abstract
Highlights
- Early life bilingual language experience can change the developing brain.
- Age of first bilingual exposure predicts neural activation for language.
- Bilinguals show greater extent and variability of neural activity in language areas.
- Early-exposed bilinguals show greater activation in IFG and STG vs. monolinguals.
- Later-exposed bilinguals have greater DLPFC activity vs. early bilinguals.
Is the developing bilingual brain fundamentally similar to the monolingual brain (e.g., neural resources supporting language and cognition)? Or, does early-life bilingual language experience change the brain? If so, how does age of first bilingual exposure impact neural activation for language? We compared how typically-developing bilingual and monolingual children (ages 7–10) and adults recruit brain areas during sentence processing using functional Near Infrared Spectroscopy (fNIRS) brain imaging. Bilingual participants included early-exposed (bilingual exposure from birth) and later-exposed individuals (bilingual exposure between ages 4–6). Both bilingual children and adults showed greater neural activation in left-hemisphere classic language areas, and additionally, right-hemisphere homologues (Right Superior Temporal Gyrus, Right Inferior Frontal Gyrus). However, important differences were observed between early-exposed and later-exposed bilinguals in their earliest-exposed language. Early bilingual exposure imparts fundamental changes to classic language areas instead of alterations to brain regions governing higher cognitive executive functions. However, age of first bilingual exposure does matter. Later-exposed bilinguals showed greater recruitment of the prefrontal cortex relative to early-exposed bilinguals and monolinguals. The findings provide fascinating insight into the neural resources that facilitate bilingual language use and are discussed in terms of how early-life language experiences can modify the neural systems underlying human language processing.
46
Leonard MK, Ferjan Ramirez N, Torres C, Hatrak M, Mayberry RI, Halgren E. Neural stages of spoken, written, and signed word processing in beginning second language learners. Front Hum Neurosci 2013; 7:322. [PMID: 23847496 PMCID: PMC3698463 DOI: 10.3389/fnhum.2013.00322] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/07/2013] [Accepted: 06/11/2013] [Indexed: 11/23/2022] Open
Abstract
We combined magnetoencephalography (MEG) and magnetic resonance imaging (MRI) to examine how sensory modality, language type, and language proficiency interact during two fundamental stages of word processing: (1) an early word encoding stage, and (2) a later supramodal lexico-semantic stage. Adult native English speakers who were learning American Sign Language (ASL) performed a semantic task for spoken and written English words, and ASL signs. During the early time window, written words evoked responses in left ventral occipitotemporal cortex, and spoken words in left superior temporal cortex. Signed words evoked activity in right intraparietal sulcus that was marginally greater than for written words. During the later time window, all three types of words showed significant activity in the classical left fronto-temporal language network, the first demonstration of such activity in individuals with so little second language (L2) instruction in sign. In addition, a dissociation between semantic congruity effects and overall MEG response magnitude for ASL responses suggested shallower and more effortful processing, presumably reflecting novice L2 learning. Consistent with previous research on non-dominant language processing in spoken languages, the L2 ASL learners also showed recruitment of right hemisphere and lateral occipital cortex. These results demonstrate that late lexico-semantic processing utilizes a common substrate, independent of modality, and that proficiency effects in sign language are comparable to those in spoken language.
Affiliation(s)
- Matthew K Leonard
- Department of Radiology, University of California San Diego, La Jolla, CA, USA; Multimodal Imaging Laboratory, Department of Radiology, University of California San Diego, La Jolla, CA, USA
47
Lexical processing in deaf readers: an FMRI investigation of reading proficiency. PLoS One 2013; 8:e54696. [PMID: 23359269 PMCID: PMC3554651 DOI: 10.1371/journal.pone.0054696] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/24/2012] [Accepted: 12/17/2012] [Indexed: 11/19/2022] Open
Abstract
Individuals with significant hearing loss often fail to attain competency in reading orthographic scripts which encode the sound properties of spoken language. Nevertheless, some profoundly deaf individuals do learn to read at age-appropriate levels. The question of what differentiates proficient deaf readers from less-proficient readers is poorly understood but topical, as efforts to develop appropriate and effective interventions are needed. This study uses functional magnetic resonance imaging (fMRI) to examine brain activation in deaf readers (N = 21), comparing proficient (N = 11) and less proficient (N = 10) readers’ performance in a widely used test of implicit reading. Proficient deaf readers activated left inferior frontal gyrus and left middle and superior temporal gyrus in a pattern that is consistent with regions reported in hearing readers. In contrast, the less-proficient readers exhibited a pattern of response characterized by inferior and middle frontal lobe activation (right>left) which bears some similarity to areas reported in studies of logographic reading, raising the possibility that these individuals are using a qualitatively different mode of orthographic processing than is traditionally observed in hearing individuals reading sound-based scripts. The evaluation of proficient and less-proficient readers points to different modes of processing printed English words. Importantly, these preliminary findings allow us to begin to establish the impact of linguistic and educational factors on the neural systems that underlie reading achievement in profoundly deaf individuals.
48
Corina DP, Lawyer LA, Cates D. Cross-linguistic differences in the neural representation of human language: evidence from users of signed languages. Front Psychol 2013; 3:587. [PMID: 23293624 PMCID: PMC3534395 DOI: 10.3389/fpsyg.2012.00587] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/06/2012] [Accepted: 12/11/2012] [Indexed: 11/13/2022] Open
Abstract
Studies of deaf individuals who are users of signed languages have provided profound insight into the neural representation of human language. Case studies of deaf signers who have incurred left- and right-hemisphere damage have shown that left-hemisphere resources are a necessary component of sign language processing. These data suggest that, despite frank differences in the input and output modality of language, core left perisylvian regions universally serve linguistic function. Neuroimaging studies of deaf signers have generally provided support for this claim. However, more fine-tuned studies of linguistic processing in deaf signers are beginning to show evidence of important differences in the representation of signed and spoken languages. In this paper, we provide a critical review of this literature and present compelling evidence for language-specific cortical representations in deaf signers. These data lend support to the claim that the neural representation of language may show substantive cross-linguistic differences. We discuss the theoretical implications of these findings with respect to an emerging understanding of the neurobiology of language.
Affiliation(s)
- David P Corina
- Cognitive Neurolinguistics Laboratory, Center for Mind and Brain, Department of Linguistics, University of California, Davis, Davis, CA, USA
49
Zou L, Ding G, Abutalebi J, Shu H, Peng D. Structural plasticity of the left caudate in bimodal bilinguals. Cortex 2012; 48:1197-206. [DOI: 10.1016/j.cortex.2011.05.022] [Citation(s) in RCA: 119] [Impact Index Per Article: 9.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/21/2011] [Revised: 05/04/2011] [Accepted: 05/23/2011] [Indexed: 11/25/2022]
50
Gutiérrez E, Müller O, Baus C, Carreiras M. Electrophysiological evidence for phonological priming in Spanish Sign Language lexical access. Neuropsychologia 2012; 50:1335-46. [DOI: 10.1016/j.neuropsychologia.2012.02.018] [Citation(s) in RCA: 32] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/18/2011] [Revised: 02/13/2012] [Accepted: 02/22/2012] [Indexed: 11/29/2022]