1
Buyle M, Crollen V. Deafness and early language deprivation influence arithmetic performances. Front Hum Neurosci 2022; 16:1000598. DOI: 10.3389/fnhum.2022.1000598. Received 07/22/2022; accepted 11/07/2022. Open access.
Abstract
It has been consistently reported that deaf individuals experience mathematical difficulties compared to their hearing peers. However, whether deafness and early language deprivation might differently affect verbal (i.e., multiplication) vs. visuospatial (i.e., subtraction) arithmetic performance is still under debate. In the present paper, three groups of 21 adults (deaf signers, hearing signers, and hearing controls) were therefore asked to perform subtraction and multiplication operations as fast and as accurately as possible. No significant group effect was found for accuracy. However, reaction time results demonstrated that the deaf group performed both arithmetic operations more slowly than the hearing groups. This group difference was even more pronounced for multiplication problems than for subtraction problems. Weaker language-based phonological representations for retrieving multiplication facts and sensitivity to interference are two hypotheses discussed to explain the observed dissociation.
2
Hänel-Faulhaber B, Groen MA, Röder B, Friedrich CK. Ongoing Sign Processing Facilitates Written Word Recognition in Deaf Native Signing Children. Front Psychol 2022; 13:917700. PMID: 35992405; PMCID: PMC9390089. DOI: 10.3389/fpsyg.2022.917700. Received 04/11/2022; accepted 06/24/2022. Open access.
Abstract
Signed and written languages are intimately related in proficient signing readers. Here, we tested whether deaf native signing beginning readers are able to make rapid use of ongoing sign language to facilitate recognition of written words. Deaf native signing children (mean age 10 years, 7 months) received prime-target pairs with sign word onsets as primes and written words as targets. In a control group of hearing children (matched to the deaf children in reading ability; mean age 8 years, 8 months), spoken word onsets were used as primes instead. Targets (written German words) were completions of either the German signs or the spoken word onsets. The participants' task was to decide whether the target word was a possible German word. Sign onsets facilitated processing of written targets in deaf children, much as spoken word onsets facilitated processing of written targets in hearing children. In both groups, priming elicited similar effects in the simultaneously recorded event-related potentials (ERPs), starting as early as 200 ms after the onset of the written target. These results suggest that beginning readers can use ongoing lexical processing in their native language, be it signed or spoken, to facilitate written word recognition. We conclude that intimate interactions between sign and written language might in turn facilitate reading acquisition in deaf beginning readers.
Affiliation(s)
- Brigitte Röder
- Biological Psychology and Neuropsychology, Universität Hamburg, Hamburg, Germany
- Claudia K. Friedrich
- Department of Developmental Psychology, University of Tübingen, Tübingen, Germany
3
Event-related potential correlates of visuo-tactile motion processing in congenitally deaf humans. Neuropsychologia 2022; 170:108209. DOI: 10.1016/j.neuropsychologia.2022.108209. Received 08/28/2021; revised 02/23/2022; accepted 03/08/2022.
4
Caldwell HB. Sign and Spoken Language Processing Differences in the Brain: A Brief Review of Recent Research. Ann Neurosci 2022; 29:62-70. PMID: 35875424; PMCID: PMC9305909. DOI: 10.1177/09727531211070538. Received 05/19/2021; accepted 11/29/2021. Open access.
Abstract
Background: It is currently accepted that sign languages and spoken languages share significant processing commonalities. The evidence supporting this claim, however, is often limited to frontotemporal pathways, perisylvian language areas, hemispheric lateralization, and event-related potentials in typical settings. Recent work has looked beyond these measures and uncovered numerous modality-dependent processing differences between sign languages and spoken languages, by accounting for confounds that previously invalidated processing comparisons and by delving into the specific conditions under which the differences arise. These processing differences are nevertheless often dismissed as unspecific to language. Summary: This review examined recent neuroscientific evidence for processing differences between sign and spoken language modalities, and the arguments against the importance of these differences. Key distinctions exist in the topography of the left anterior negativity (LAN) and in modulations of event-related potential (ERP) components such as the N400. There is also differential activation of typical spoken language processing areas, such as the conditional role of the temporal areas in sign language (SL) processing. Importantly, sign language processing uniquely recruits parietal areas for processing phonology and syntax and requires the mapping of spatial information onto internal representations. Additionally, modality-specific feedback mechanisms distinctively involve proprioceptive post-output monitoring in sign languages, in contrast to the auditory and visual feedback mechanisms of spoken languages. The only study to find ERP differences post-production revealed earlier lexical access in sign than in spoken languages. Themes of temporality, the validity of viewing the two modalities as relying on analogous anatomical mechanisms, and the comprehensiveness of current language models are also discussed to suggest improvements for future research.
Key message: Current neuroscience evidence suggests various ways in which processing differs between the sign and spoken language modalities, extending beyond simple differences between languages. Consideration and further exploration of these differences will be integral to developing a more comprehensive view of language in the brain.
Affiliation(s)
- Hayley Bree Caldwell
- Cognitive and Systems Neuroscience Research Hub (CSN-RH), School of Justice and Society, University of South Australia Magill Campus, Magill, South Australia, Australia
5
Berteletti I, Kimbley SE, Sullivan SJ, Quandt LC, Miyakoshi M. Different Language Modalities Yet Similar Cognitive Processes in Arithmetic Fact Retrieval. Brain Sci 2022; 12(2):145. PMID: 35203909; PMCID: PMC8870392. DOI: 10.3390/brainsci12020145. Received 12/01/2021; revised 01/10/2022; accepted 01/17/2022. Open access.
Abstract
Does experience with a signed language impact the neurocognitive processes recruited by adults solving arithmetic problems? We used event-related potentials (ERPs) to identify the components that are modulated by operation type and problem size in Deaf native signers of American Sign Language (ASL) and in hearing English-speaking participants. Participants were presented with single-digit subtraction and multiplication problems in a delayed verification task. Problem size was manipulated (small vs. large), with an additional extra-large subtraction condition to equate the overall magnitude of the large multiplication problems. Behavioral results were comparable across groups, as were the ERP dissociations. First, an early operation-type effect is observed around 200 ms post-problem onset, suggesting that both groups show a similar attentional differentiation for processing subtraction and multiplication problems. Second, for the posterior-occipital component between 240 ms and 300 ms, subtraction problems show a similar modulation with problem size in both groups, suggesting that only subtraction problems recruit quantity-related processes. Control analyses exclude possible perceptual and cross-operation magnitude-related effects. These results are the first evidence that the two operation types rely on distinct cognitive processes within the ASL native signing population and that these processes are equivalent to those observed in the English-speaking population.
Affiliation(s)
- Ilaria Berteletti (corresponding author)
- Ph.D. in Educational Neuroscience Program, Gallaudet University, Washington, DC 20002, USA
- Sarah E. Kimbley
- Ph.D. in Educational Neuroscience Program, Gallaudet University, Washington, DC 20002, USA
- SaraBeth J. Sullivan
- Ph.D. in Educational Neuroscience Program, Gallaudet University, Washington, DC 20002, USA
- Lorna C. Quandt
- Ph.D. in Educational Neuroscience Program, Gallaudet University, Washington, DC 20002, USA
- Makoto Miyakoshi
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, CA 92093, USA
6
Quandt LC, Kubicek E, Willis A, Lamberton J. Enhanced biological motion perception in deaf native signers. Neuropsychologia 2021; 161:107996. PMID: 34425145. DOI: 10.1016/j.neuropsychologia.2021.107996. Received 10/20/2020; revised 07/22/2021; accepted 08/17/2021.
Abstract
We conducted two studies to test how deaf signed language users perceive biological motion. We created 18 Biological Motion point-light displays (PLDs) depicting everyday human actions and 18 Scrambled control PLDs. First, we conducted an online behavioral rating survey in which deaf and hearing raters identified the Biological Motion PLDs and rated how easy the actions were to identify. We then conducted an EEG study in which Deaf Signers and Hearing Non-Signers watched both the Biological Motion PLDs and the Scrambled PLDs, and we computed the time-frequency responses within the theta, alpha, and beta EEG rhythms. In the behavioral rating task, deaf raters reported significantly less effort in identifying the Biological Motion PLDs across all stimuli. The EEG results revealed that Deaf Signers differentiated between Scrambled and Biological PLDs in the theta, mu, and beta rhythms earlier and more consistently than Hearing Non-Signers. We conclude that native ASL users exhibit experience-dependent neuroplasticity in the domain of biological human motion perception.
Affiliation(s)
- Lorna C Quandt
- Ph.D. in Educational Neuroscience Program, Gallaudet University, 800 Florida Ave NE, Washington, D.C. 20002, USA
- Emily Kubicek
- Ph.D. in Educational Neuroscience Program, Gallaudet University, 800 Florida Ave NE, Washington, D.C. 20002, USA
- Athena Willis
- Ph.D. in Educational Neuroscience Program, Gallaudet University, 800 Florida Ave NE, Washington, D.C. 20002, USA
- Jason Lamberton
- VL2 Center, Gallaudet University, 800 Florida Ave NE, Washington, D.C. 20002, USA
7
Bosworth RG, Binder EM, Tyler SC, Morford JP. Automaticity of lexical access in deaf and hearing bilinguals: Cross-linguistic evidence from the color Stroop task across five languages. Cognition 2021; 212:104659. PMID: 33798950. DOI: 10.1016/j.cognition.2021.104659. Received 04/21/2020; revised 12/08/2020; accepted 03/07/2021.
Abstract
The well-known Stroop interference effect has been instrumental in revealing the highly automated nature of lexical processing as well as providing new insights to the underlying lexical organization of first and second languages within proficient bilinguals. The present cross-linguistic study had two goals: 1) to examine Stroop interference for dynamic signs and printed words in deaf ASL-English bilinguals who report no reliance on speech or audiological aids; 2) to compare Stroop interference effects in several groups of bilinguals whose two languages range from very distinct to very similar in their shared orthographic patterns: ASL-English bilinguals (very distinct), Chinese-English bilinguals (low similarity), Korean-English bilinguals (moderate similarity), and Spanish-English bilinguals (high similarity). Reaction time and accuracy were measured for the Stroop color naming and word reading tasks, for congruent and incongruent color font conditions. Results confirmed strong Stroop interference for both dynamic ASL stimuli and English printed words in deaf bilinguals, with stronger Stroop interference effects in ASL for deaf bilinguals who scored higher in a direct assessment of ASL proficiency. Comparison of the four groups of bilinguals revealed that the same-script bilinguals (Spanish-English bilinguals) exhibited significantly greater Stroop interference effects for color naming than the other three bilingual groups. The results support three conclusions. First, Stroop interference effects are found for both signed and spoken languages. Second, contrary to some claims in the literature about deaf signers who do not use speech being poor readers, deaf bilinguals' lexical processing of both signs and written words is highly automated. Third, cross-language similarity is a critical factor shaping bilinguals' experience of Stroop interference in their two languages. 
This study represents the first comparison of both deaf and hearing bilinguals on the Stroop task, offering a critical test of theories about bilingual lexical access and cognitive control.
Affiliation(s)
- Rain G Bosworth
- National Technical Institute for the Deaf, Rochester Institute of Technology, USA
- Sarah C Tyler
- Department of Psychology, University of California, San Diego, USA
- Jill P Morford
- Department of Linguistics, University of New Mexico, USA
8
Language development in deaf bilinguals: Deaf middle school students co-activate written English and American Sign Language during lexical processing. Cognition 2021; 211:104642. PMID: 33752155. DOI: 10.1016/j.cognition.2021.104642. Received 07/27/2020; revised 02/07/2021; accepted 02/19/2021.
Abstract
Bilinguals, both hearing and deaf, activate multiple languages simultaneously even in contexts that require only one language. To date, the point in development at which bilingual signers experience cross-language activation of a signed and a spoken language remains unknown. We investigated the processing of written words by ASL-English bilingual deaf middle school students. Deaf bilinguals were faster to respond to English word pairs with phonologically related translations in ASL than to English word pairs with unrelated translations, but no difference was found for hearing controls with no knowledge of ASL. The results indicate that co-activation of signs and written words is not the outcome of years of bilingual experience, but instead characterizes bilingual language development.
9
Kubicek E, Quandt LC. A Positive Relationship Between Sign Language Comprehension and Mental Rotation Abilities. J Deaf Stud Deaf Educ 2021; 26:1-12. PMID: 32978623. DOI: 10.1093/deafed/enaa030. Received 04/01/2020; revised 06/29/2020; accepted 08/05/2020.
Abstract
Past work investigating spatial cognition suggests better mental rotation abilities for those who are fluent in a signed language. However, no prior work has assessed whether fluency is needed to achieve this performance benefit or what it may look like on the neurobiological level. We conducted an electroencephalography experiment and assessed accuracy on a classic mental rotation task given to deaf fluent signers, hearing fluent signers, hearing non-fluent signers, and hearing non-signers. Two of the main findings of the study are as follows: (1) Sign language comprehension and mental rotation abilities are positively correlated and (2) Behavioral performance differences between signers and non-signers are not clearly reflected in brain activity typically associated with mental rotation. In addition, we propose that the robust impact sign language appears to have on mental rotation abilities strongly suggests that "sign language use" should be added to future measures of spatial experiences.
Affiliation(s)
- Emily Kubicek
- Educational Neuroscience Program, Gallaudet University
- Lorna C Quandt
- Educational Neuroscience Program, Gallaudet University
- Department of Psychology, Gallaudet University
10
Emmorey K, Mott M, Meade G, Holcomb PJ, Midgley KJ. Lexical selection in bimodal bilinguals: ERP evidence from picture-word interference. Lang Cogn Neurosci 2020; 36:840-853. PMID: 34485589; PMCID: PMC8411899. DOI: 10.1080/23273798.2020.1821905. Received 11/25/2019; accepted 09/04/2020.
Abstract
The picture word interference (PWI) paradigm and ERPs were used to investigate whether lexical selection in deaf and hearing ASL-English bilinguals occurs via lexical competition or whether the response exclusion hypothesis (REH) for PWI effects is supported. The REH predicts that semantic interference should not occur for bimodal bilinguals because sign and word responses do not compete within an output buffer. Bimodal bilinguals named pictures in ASL, preceded by either a translation equivalent, semantically-related, or unrelated English written word. In both the translation and semantically-related conditions bimodal bilinguals showed facilitation effects: reduced RTs and N400 amplitudes for related compared to unrelated prime conditions. We also observed an unexpected focal left anterior positivity that was stronger in the translation condition, which we speculate may be due to articulatory priming. Overall, the results support the REH and models of bilingual language production that assume lexical selection occurs without competition between languages.
Affiliation(s)
- Karen Emmorey (corresponding author)
- Laboratory for Language and Cognitive Neuroscience, 6495 Alvarado Road, Suite 200, San Diego, CA 92120
- Megan Mott
- Psychology Department, San Diego State University
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University and University of California, San Diego
11
How Bilingualism Contributes to Healthy Development in Deaf Children: A Public Health Perspective. Matern Child Health J 2020; 24:1330-1338. PMID: 32632844. DOI: 10.1007/s10995-020-02976-6.
Abstract
The aim of this article is to increase awareness of language practices in the deaf community that affect communication needs and health outcomes, focusing particularly on the prevalence of bilingualism among deaf adults. Language deprivation and poor health outcomes in the deaf population are risks that cannot be addressed solely by hearing intervention. We propose that bilingualism acts as a protective measure to minimize the health risks faced by deaf individuals. Provision of culturally and linguistically appropriate services to deaf stakeholders, and particularly hearing families of deaf children, requires familiarity with the developmental and social ramifications of bilingualism.
12
Thierfelder P, Wigglesworth G, Tang G. Sign phonological parameters modulate parafoveal preview effects in deaf readers. Cognition 2020; 201:104286. PMID: 32521285. DOI: 10.1016/j.cognition.2020.104286. Received 09/21/2019; revised 03/23/2020; accepted 03/30/2020.
Abstract
Research has found that deaf readers unconsciously activate sign translations of written words while reading. However, the ways in which the different sign phonological parameters of these sign translations tie into reading processes have received little attention in the literature. In this study of Chinese reading, we used a parafoveal preview paradigm to investigate how four types of sign phonologically related preview affect reading processes in adult deaf signers of Hong Kong Sign Language (HKSL). The four types of sign phonologically related preview-target pair were: (1) pairs whose HKSL translations overlapped in three parameters (handshape, location, and movement); (2) pairs that overlapped in only handshape and location; (3) pairs that overlapped in only handshape and movement; and (4) pairs that overlapped in only location and movement. Results showed that the handshape parameter was of particular importance, as only sign translation pairs with handshape among their overlapping phonological parameters led to early sign activation. Furthermore, we found that, compared to control previews, deaf readers took longer to read targets when the sign translation previews overlapped with targets in either handshape and movement or handshape, movement, and location. In contrast, fixation times on targets were shorter when previews and targets overlapped in location and any one additional parameter (either handshape or movement). These results indicate that the phonological parameters of handshape, location, and movement are activated via orthography during Chinese reading and can have different effects on parafoveal processing in deaf signers of HKSL.
Affiliation(s)
- Philip Thierfelder
- ARC Centre of Excellence for the Dynamics of Language, The University of Melbourne, Australia
- Gillian Wigglesworth
- ARC Centre of Excellence for the Dynamics of Language, The University of Melbourne, Australia
- Gladys Tang
- The Centre for Sign Linguistics and Deaf Studies, The Chinese University of Hong Kong, Hong Kong
13
Morford JP, Occhino C, Zirnstein M, Kroll JF, Wilkinson E, Piñar P. What is the Source of Bilingual Cross-Language Activation in Deaf Bilinguals? J Deaf Stud Deaf Educ 2019; 24:356-365. PMID: 31398721. DOI: 10.1093/deafed/enz024. Received 01/09/2019; revised 04/16/2019; accepted 05/12/2019.
Abstract
When deaf bilinguals are asked to make semantic similarity judgments of two written words, their responses are influenced by the sublexical relationship of the signed language translations of the target words. This study investigated whether the observed effects of American Sign Language (ASL) activation on English print depend on (a) an overlap in syllabic structure of the signed translations or (b) on initialization, an effect of contact between ASL and English that has resulted in a direct representation of English orthographic features in ASL sublexical form. Results demonstrate that neither of these conditions is required or enhances effects of cross-language activation. The experimental outcomes indicate that deaf bilinguals discover the optimal mapping between their two languages in a manner that is not constrained by privileged sublexical associations.
14
Kubicek E, Quandt LC. Sensorimotor system engagement during ASL sign perception: An EEG study in deaf signers and hearing non-signers. Cortex 2019; 119:457-469. PMID: 31505437. DOI: 10.1016/j.cortex.2019.07.016. Received 02/22/2019; revised 06/04/2019; accepted 07/29/2019.
Abstract
When a person observes someone else performing an action, the observer's sensorimotor cortex activates as if the observer were performing the action, a phenomenon known as action simulation. While this process is well established for basic (e.g., grasping) and complex (e.g., dancing) actions, it remains unknown whether the framework of action simulation applies to visual languages such as American Sign Language (ASL). We conducted an EEG experiment with deaf signers and hearing non-signers to compare overall sensorimotor EEG between groups, and to test whether sensorimotor systems are differentially sensitive to signs produced with one hand ("1H") or two hands ("2H"). We predicted greater alpha and beta event-related desynchronization (previously correlated with action simulation) during the perception of 2H signs compared to 1H signs, given the greater demands that two-handed actions place on sensorimotor processing systems. We recorded EEG from both groups as they observed videos of ASL signs, half 1H and half 2H. Event-related spectral perturbations (ERSPs) in the alpha and beta ranges were computed for the two conditions at central electrode sites overlying the sensorimotor cortex. Sensorimotor EEG responses in both the Hearing and Deaf groups were sensitive to the gross motor characteristics of the observed signs. We show for the first time that, although hearing non-signers showed more overall sensorimotor cortex involvement during sign observation, mirroring-related processes are nevertheless involved when deaf signers observe signs.
Affiliation(s)
- Emily Kubicek
- Educational Neuroscience Program, Gallaudet University, Washington, DC, USA
- Lorna C Quandt
- Educational Neuroscience Program, Gallaudet University, Washington, DC, USA; Department of Psychology, Gallaudet University, Washington, DC, USA.