1. Lemmetyinen S, Hokkanen L, Vehviläinen V, Klippi A. Recovery of gestures for persons with severe non-fluent aphasia and limb apraxia: A long-term follow-up study. Appl Neuropsychol Adult 2024:1-12. [PMID: 38801404; DOI: 10.1080/23279095.2024.2355668]
Abstract
Persons with severe non-fluent aphasia could benefit from using gestures to substitute for the speech they have lost. The use of gestures, however, is challenging for persons with aphasia and concomitant limb apraxia. Research on the long-term recovery of gestures is scant, and it is unclear whether gesture performance can recover over time. This study evaluated the recovery of emblems and tool-use pantomimes in persons with severe non-fluent aphasia and limb apraxia after a left-hemisphere stroke. The Florida Apraxia Screening Test-Revised (FAST-R) was used for measurement. The test includes 30 gestures to be performed (i) after an oral request, (ii) with the aid of a pictorial cue, or (iii) as an imitation. The gestures were rated on their degree of comprehensibility. The comprehensibility of gestures after an oral request improved significantly in five out of seven participants between the first (1-3 months after the stroke) and the last (3 years after the stroke) examination. Improvement continued for all five in the period between six months and three years. The imitation model improved the comprehensibility of gestures for all participants, whereas the pictorial cue did so only slightly. The skill of producing gestures can improve even in the late phase post-stroke. Because of this potential, we suggest that gesture training should be systematically included in the rehabilitation of communication for persons with severe non-fluent aphasia.
Affiliation(s)
- Sanna Lemmetyinen: Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland; Services of Speech and Language Therapy, Wellbeing Services County of North Karelia, Joensuu, Finland
- Laura Hokkanen: Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland
- Viivi Vehviläinen: Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland
- Anu Klippi: Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland
2. Funayama M, Nakajima A. Development of Self-made Gestures as an Adaptive Strategy for Communication in an Individual With Childhood Apraxia of Speech. Cogn Behav Neurol 2023; 36:249-258. [PMID: 37724738; DOI: 10.1097/wnn.0000000000000354]
Abstract
Individuals with childhood apraxia of speech often exhibit greater difficulty with expressive language than with receptive language. As a result, they may benefit from alternative modes of communication. Here, we present a patient with childhood apraxia of speech who used pointing as a means of communication at age 2¼ years and self-made gestures at age 3½, when he had severe difficulties speaking despite probably normal comprehension abilities. His original gestures included not only word-level expressions but also sentence-length ones. For example, when expressing "I am going to bed," he pointed his index finger at himself (meaning I) and then put both his hands together near his ear (sleep). When trying to convey the meaning of "I enjoyed the meal and am leaving," he covered his mouth with his right hand (delicious), then joined both of his hands in front of himself (finish), and finally waved his hands (goodbye). These original gestures and pointing peaked at the age of 4, then subsided, and completely disappeared by the age of 7, when he was able to make himself understood to some extent with spoken words. The present case demonstrates an adaptive strategy for communication that might be an inherent competence for human beings.
Affiliation(s)
- Asuka Nakajima: Rehabilitation, Ashikaga Red Cross Hospital, Tochigi, Japan
3. Zhang H, Hinzen W. Temporal Overlap Between Gestures and Speech in Poststroke Aphasia: Is There a Compensatory Effect? J Speech Lang Hear Res 2022; 65:4797-4811. [PMID: 36455133; DOI: 10.1044/2022_jslhr-22-00130]
Abstract
PURPOSE If language production is impaired, will gestures compensate? Evidence in favor of this prediction has often been argued to come from aphasia, but it remains contested. Here, we tested whether thought content not present in speech due to language impairment is manifested in gestures, in 20 people with dysfluent (Broca's) aphasia, 20 people with fluent (Wernicke's) aphasia, and 20 matched neurotypical controls. METHOD A new annotation scheme was created distinguishing types of gestures and whether they co-occurred with fluent or dysfluent/absent speech and were temporally aligned in content with coproduced speech. RESULTS Across both aphasia types, noncontent (beat) gestures, which by their nature cannot compensate for lost speech content, constituted the greatest proportion of all types of gestures produced. Content (i.e., descriptive, referential, and metaphorical) gestures were largely coproduced with fluent rather than dysfluent speech and tended to be aligned with the content conveyed in speech. They also did not differ in quantity depending on whether the dysfluencies were eventually resolved or not. Neither aphasia severity nor comprehension ability had an impact on the total amount of content gesture produced in people with aphasia, which was instead positively correlated with speech fluency. CONCLUSIONS Together, these results suggest that gestures are unlikely to have a role in compensating for linguistic deficits and to serve as a representational system conveying thought content independent of language. Surprisingly, aphasia rather is a model of how gesture and language are inherently integrated and aligned: Even when language is impaired, it remains the essential provider of content.
Affiliation(s)
- Han Zhang: Department of Translation and Language Sciences, Universitat Pompeu Fabra, Barcelona, Spain
- Wolfram Hinzen: Department of Translation and Language Sciences, Universitat Pompeu Fabra, Barcelona, Spain; Catalan Institute for Advanced Studies and Research (ICREA), Barcelona, Spain
4. van Nispen K, Sekine K, van der Meulen I, Preisig BC. Gesture in the eye of the beholder: An eye-tracking study on factors determining the attention for gestures produced by people with aphasia. Neuropsychologia 2022; 174:108315. [DOI: 10.1016/j.neuropsychologia.2022.108315]
5. Kuyler A, Johnson E, Bornman J. Unaided communication behaviours displayed by adults with severe cerebrovascular accidents and little or no functional speech: A scoping review. Int J Lang Commun Disord 2022; 57:403-421. [PMID: 34967962; DOI: 10.1111/1460-6984.12691]
Abstract
BACKGROUND Unaided communication behaviours may provide communication support for persons with severe cerebrovascular accidents (CVA), as these individuals often experience severe communication difficulties, regardless of the aetiology. Though often subtle, these behaviours are present during all stages of recovery, and therefore communication partners need to know not only which unaided strategies are used as communication attempts, but also what their function is (i.e., what the person aims to achieve with the communication). AIM To identify the unaided communication behaviours that adults with severe CVA and little or no functional speech use to communicate, and to determine the communication functions addressed by these behaviours. METHODS & PROCEDURES The study used a scoping review methodology and included articles on communication partners of persons with CVA published between 1986 and 2020. Initially the searches yielded 732 studies, from which 211 duplicates were identified. The remaining studies (n = 531) were then screened at title, abstract and full-text level, resulting in a final inclusion of 18 studies. Of the 18 studies, five used qualitative and 13 quantitative methodologies. MAIN CONTRIBUTION The subtle communication behaviours used by persons with CVA (and resultant severe communication difficulties) are often misinterpreted or overlooked by their partners. If partners are trained to recognise such subtle or unaided communication behaviours, they can provide adequate support to access a range of communication functions. The unaided communication behaviours, which include 13 primary behaviours ranging from non-linguistic to linguistic, were utilised to convey 31 communication functions classified into four main communication categories. CONCLUSIONS & IMPLICATIONS Although unaided communication behaviours often appear limiting, they can be utilised to communicate various communication functions. The findings of this review support the training of partners to identify these behaviours and improve person-partner communication. WHAT IS KNOWN: Unaided communication has been widely researched. However, a summary is needed of the various unaided communication behaviours and of the different communication functions addressed by these behaviours. WHAT THIS PAPER ADDS: This paper emphasises that unaided communication behaviours range from non-linguistic to linguistic, and that they can support unintentional, pre-intentional and intentional communication functions. CLINICAL IMPLICATIONS: Even though aided communication is preferred, unaided communication behaviours are generally used in contexts with limited resources, as well as among culturally and linguistically diverse populations. This study advocates the identification of unaided communication behaviours by partners, as well as the support and provision of access to communication strategies for persons with severe CVA. Future research should include more untrained communication partners.
Affiliation(s)
- Ariné Kuyler: Centre for Augmentative and Alternative Communication, University of Pretoria, Private Bag X20, Hatfield, 0028, South Africa
- Ensa Johnson: Centre for Augmentative and Alternative Communication, University of Pretoria, Private Bag X20, Hatfield, 0028, South Africa
- Juan Bornman: Centre for Augmentative and Alternative Communication, University of Pretoria, Private Bag X20, Hatfield, 0028, South Africa
6. What is Functional Communication? A Theoretical Framework for Real-World Communication Applied to Aphasia Rehabilitation. Neuropsychol Rev 2022; 32:937-973. [PMID: 35076868; PMCID: PMC9630202; DOI: 10.1007/s11065-021-09531-2]
Abstract
Aphasia is an impairment of language, caused by acquired brain damage such as stroke or traumatic brain injury, that affects a person’s ability to communicate effectively. The aim of rehabilitation in aphasia is to improve everyday communication, thereby improving an individual’s ability to function in their day-to-day life. For that reason, a thorough understanding of naturalistic communication and its underlying mechanisms is imperative. The field of aphasiology currently lacks an agreed, comprehensive, theoretically founded definition of communication. Instead, multiple disparate interpretations of functional communication are used. We argue that this makes it nearly impossible to validly and reliably assess a person’s communicative performance, to target this behaviour through therapy, and to measure improvements post-therapy. In this article we propose a structured, theoretical approach to defining the concept of functional communication. We argue for a view of communication as “situated language use”, borrowed from empirical psycholinguistic studies with non-brain-damaged adults. This framework defines language use as: (1) interactive, (2) multimodal, and (3) contextual. Existing research on each component of the framework from non-brain-damaged adults and people with aphasia is reviewed. The consequences of adopting this approach to assessment and therapy for aphasia rehabilitation are discussed. The aim of this article is to encourage a more systematic, comprehensive approach to the study and treatment of situated language use in aphasia.
7. Stark BC, Cofoid C. Task-Specific Iconic Gesturing During Spoken Discourse in Aphasia. Am J Speech Lang Pathol 2022; 31:30-47. [PMID: 34033493; PMCID: PMC9135014; DOI: 10.1044/2021_ajslp-20-00271]
Abstract
PURPOSE In persons living with aphasia, we explored the relationship between iconic gesture production during spontaneous speech and discourse task, spoken language, and demographic information. METHOD Employing the AphasiaBank database, we coded iconic gestures in 75 speakers with aphasia during two spoken discourse tasks: a procedural narrative, which involved participants telling the experimenter how to make a sandwich ("Sandwich"), and a picture sequence narrative, which had participants describe the picture sequence to the experimenter ("Window"). Forty-three participants produced a gesture during both tasks, and we further evaluated data from this subgroup as a more direct comparison between tasks. RESULTS More iconic gestures, at a higher rate, were produced during the procedural narrative. For both tasks, there was a relationship between iconic gesture rate, modeled as iconic gestures per word, and metrics of language dysfluency extracted from the discourse task, as well as a metric of fluency extracted from a standardized battery. Iconic gesture production was correlated with aphasia duration, a correlation driven by performance during only a single task (Window), but not with other demographic metrics, such as aphasia severity or age. We also provide preliminary evidence for task differences shown through the lens of two types of iconic gestures. CONCLUSIONS While speech-language pathologists have utilized gesture in therapy for poststroke aphasia, due to its possible facilitatory role in spoken language, there has been considerably less work on understanding how gesture differs across naturalistic tasks and how we can best utilize this information to better assess gesture in aphasia and improve multimodal treatment for aphasia. Furthermore, our results contribute to gesture theory, particularly about the role of gesture across naturalistic tasks and its relationship with spoken language. Supplemental Material: https://doi.org/10.23641/asha.14614941.
Affiliation(s)
- Brielle C. Stark: Department of Speech, Language and Hearing Sciences, Indiana University Bloomington
- Caroline Cofoid: Department of Speech, Language and Hearing Sciences, Indiana University Bloomington
8. Rombouts E, Maes B, Zink I. An investigation into the relationship between quality of pantomime gestures and visuospatial skills. Augment Altern Commun 2020; 36:179-189. [PMID: 33043713; DOI: 10.1080/07434618.2020.1811760]
Abstract
While children with developmental language disorder or Williams syndrome appear to use hand gestures to compensate for specific cognitive and communicative difficulties, they have different cognitive strength-weakness profiles. Their semantic and visuospatial skills potentially affect gesture quality such as iconicity. The present study focuses on untangling the unique contribution of these skills in the quality of gestures. An explicit gesture elicitation task was presented to 25 participants with developmental language disorder between 7 and 10 years of age, 25 age-matched peers with typical development, and 14 participants with Williams syndrome (8-23 years). They gestured pictures of objects without using speech (pantomime). The iconicity, semantic richness, and representation technique of the pantomimes were coded. Participants' semantic association and visuospatial skills were formally assessed. Iconicity was slightly lower in individuals with Williams syndrome, which seems related to their visuospatial deficit. While semantic saliency was similar across participant groups, small differences in representation technique were found. Partial correlations showed that visuospatial skills and semantic skills were instrumental in producing clear pantomimes. These findings indicate that clinicians aiming to enhance individuals' natural iconic gestures should consider achieved iconicity, particularly in individuals with low visuospatial skills.
Affiliation(s)
- Ellen Rombouts: Department of Neurosciences, Experimental Otorinolaryngology, KU Leuven, Belgium
- Bea Maes: Parenting and Special Education Research Group, KU Leuven, Belgium
- Inge Zink: Department of Neurosciences, Experimental Otorinolaryngology, KU Leuven, Belgium
9. Trujillo JP, Simanova I, Bekkering H, Özyürek A. The communicative advantage: how kinematic signaling supports semantic comprehension. Psychol Res 2020; 84:1897-1911. [PMID: 31079227; PMCID: PMC7772160; DOI: 10.1007/s00426-019-01198-y]
Abstract
Humans are unique in their ability to communicate information through representational gestures which visually simulate an action (e.g., moving hands as if opening a jar). Previous research indicates that the intention to communicate modulates the kinematics (e.g., velocity, size) of such gestures. If and how this modulation influences addressees' comprehension of gestures has not been investigated. Here we ask whether communicative kinematic modulation enhances semantic comprehension (i.e., identification) of gestures. We additionally investigate whether any comprehension advantage is due to enhanced early identification or late identification. Participants (n = 20) watched videos of representational gestures produced in a more-communicative (n = 60) or less-communicative (n = 60) context and performed a forced-choice recognition task. We tested the isolated role of kinematics by removing visibility of the actors' faces in Experiment I and by reducing the stimuli to stick-light figures in Experiment II. Three video lengths were used to disentangle early identification from late identification. Accuracy and response time quantified main effects. Kinematic modulation was tested for correlations with task performance. We found higher gesture identification performance for more-communicative compared to less-communicative gestures. However, early identification was only enhanced within a full visual context, while late identification occurred even when viewing isolated kinematics. Additionally, temporally segmented acts with more post-stroke holds were associated with higher accuracy. Our results demonstrate that communicative signaling, interacting with other visual cues, generally supports gesture identification, while kinematic modulation specifically enhances late identification in the absence of other cues. The results provide insights into mutual-understanding processes as well as into the design of artificial communicative agents.
Affiliation(s)
- James P Trujillo: Donders Institute for Brain, Cognition and Behaviour, Radboud University, Montessorilaan 3, B.01.25, 6525GR, Nijmegen, The Netherlands; Centre for Language Studies, Radboud University, Nijmegen, The Netherlands
- Irina Simanova: Donders Institute for Brain, Cognition and Behaviour, Radboud University, Montessorilaan 3, B.01.25, 6525GR, Nijmegen, The Netherlands
- Harold Bekkering: Donders Institute for Brain, Cognition and Behaviour, Radboud University, Montessorilaan 3, B.01.25, 6525GR, Nijmegen, The Netherlands
- Asli Özyürek: Centre for Language Studies, Radboud University, Nijmegen, The Netherlands; Max Planck Institute for Psycholinguistics, Wundtlaan 1, 6525XD, Nijmegen, The Netherlands
10. de Beer C, Hogrefe K, Hielscher-Fastabend M, de Ruiter JP. Evaluating Models of Gesture and Speech Production for People With Aphasia. Cogn Sci 2020; 44:e12890. [PMID: 32939773; DOI: 10.1111/cogs.12890]
Abstract
People with aphasia use gestures not only to communicate relevant content but also to compensate for their verbal limitations. The Sketch Model (De Ruiter, 2000) assumes a flexible relationship between gesture and speech, with the possibility of a compensatory use of the two modalities. In the successor of the Sketch Model, the AR-Sketch Model (De Ruiter, 2017), the relationship between iconic gestures and speech is no longer assumed to be flexible and compensatory; instead, iconic gestures are assumed to express information that is redundant to speech. In this study, we evaluated the contradictory predictions of the Sketch Model and the AR-Sketch Model using data collected from people with aphasia as well as a group of people without language impairment. We found compensatory use of gesture only in the people with aphasia, whereas the people without language impairment made very little compensatory use of gestures. Hence, the people with aphasia gestured according to the prediction of the Sketch Model, whereas the people without language impairment did not. We conclude that aphasia fundamentally changes the relationship of gesture and speech.
Affiliation(s)
- Carola de Beer: Cognitive Sciences, Linguistics Department, University of Potsdam
- Katharina Hogrefe: Clinical Neuropsychology Research Group (EKN), Institute of Phonetics and Speech Processing, Ludwig-Maximilians-Universität München
- Jan P de Ruiter: Departments of Psychology and Computer Science, Tufts University
11. Clough S, Duff MC. The Role of Gesture in Communication and Cognition: Implications for Understanding and Treating Neurogenic Communication Disorders. Front Hum Neurosci 2020; 14:323. [PMID: 32903691; PMCID: PMC7438760; DOI: 10.3389/fnhum.2020.00323]
Abstract
When people talk, they gesture. Gesture is a fundamental component of language that contributes meaningful and unique information to a spoken message and reflects the speaker's underlying knowledge and experiences. Theoretical perspectives of speech and gesture propose that they share a common conceptual origin and have a tightly integrated relationship, overlapping in time, meaning, and function to enrich the communicative context. We review a robust literature from the field of psychology documenting the benefits of gesture for communication for both speakers and listeners, as well as its important cognitive functions for organizing spoken language, and facilitating problem-solving, learning, and memory. Despite this evidence, gesture has been relatively understudied in populations with neurogenic communication disorders. While few studies have examined the rehabilitative potential of gesture in these populations, others have ignored gesture entirely or even discouraged its use. We review the literature characterizing gesture production and its role in intervention for people with aphasia, as well as describe the much sparser literature on gesture in cognitive communication disorders including right hemisphere damage, traumatic brain injury, and Alzheimer's disease. The neuroanatomical and behavioral profiles of these patient populations provide a unique opportunity to test theories of the relationship of speech and gesture and advance our understanding of their neural correlates. This review highlights several gaps in the field of communication disorders which may serve as a bridge for applying the psychological literature of gesture to the study of language disorders. Such future work would benefit from considering theoretical perspectives of gesture and using more rigorous and quantitative empirical methods in its approaches. We discuss implications for leveraging gesture to explore its untapped potential in understanding and rehabilitating neurogenic communication disorders.
Affiliation(s)
- Sharice Clough: Communication and Memory Lab, Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN, United States
12. Rombouts E, Maessen B, Maes B, Zink I. Key Word Signing Has Higher Iconicity Than Sign Language. J Speech Lang Hear Res 2020; 63:2418-2424. [PMID: 32546041; DOI: 10.1044/2020_jslhr-20-00034]
Abstract
Purpose Key word signing (KWS) entails using manual signs to support the natural speech of individuals with normal hearing who have communication difficulties. While manual signs from the local sign language may be used for this purpose, some KWS systems have opted for a distinct KWS lexicon. A distinct KWS lexicon typically aims for higher sign iconicity or recognizability to make the lexicon more accessible to individuals with intellectual disabilities. We sought to determine whether, in the Belgian Dutch context, signs from such a distinct KWS lexicon (Spreken Met Ondersteuning van Gebaren [Speaking With Support of Signs; SMOG]) were indeed more iconic than their Flemish Sign Language (FSL) counterparts. Method Participants were 224 adults with typical development who had no signing experience. They rated the resemblance between an FSL sign and its meaning. Raw data on the iconicity of SMOG from a previous study were used. Translucency was statistically and qualitatively compared between the SMOG lexicon and the FSL counterparts. Results SMOG had an overall higher translucency than FSL and contained a higher number of iconic signs. Conclusion This finding may support the value of a separate sign lexicon over using sign language signs. Nevertheless, other aspects, such as wide availability and inclusion, need to be considered.
Affiliation(s)
- Ellen Rombouts: Department of Neurosciences, Experimental Otorinolaryngology, KU Leuven, Belgium
- Babette Maessen: Department of Neurosciences, Experimental Otorinolaryngology, KU Leuven, Belgium
- Bea Maes: Parenting and Special Education Research Group, KU Leuven, Belgium
- Inge Zink: Department of Neurosciences, Experimental Otorinolaryngology, KU Leuven, Belgium
13. Vibrac C, Avias A, François PO, Isner-Horobeti ME, Krasny-Pacini A. Charlie Chaplin and gesture training in severe aphasia: A controlled double-blind single-case experimental design. Ann Phys Rehabil Med 2020; 64:101356. [PMID: 32032804; DOI: 10.1016/j.rehab.2019.12.010]
Abstract
BACKGROUND Aphasia following a stroke is a frequent and disabling condition that decreases quality of life. The use of gesture has been proposed as a way to enhance aphasia recovery. OBJECTIVE We aimed to explore whether 2 types of gesture interventions could improve communication in individuals with severe aphasia. METHODS This was a pilot study performed at home in routine care by an outreach team. The study had a controlled double-blind single-case experimental design (SCED): a controlled multiple-baseline design across 3 participants and 2 behaviors (gesture and naming). Three male patients with stroke-induced severe chronic aphasia, non-functional perseverative speech and severe associated impairments underwent a passive gesture intervention, in which participants watched movies selected for their intensive use of gesture, and an active gesture intervention, in which they actively practiced gestures using visual action therapy. The main outcome measures were naming score, gesture score and the nonverbal subscale score of the Lillois Test of Communication, with 3-month follow-up. RESULTS In all 3 participants, gesture interventions improved the ability to gesture a list of words (Tau-U = 0.38-0.67 for the combined gesture intervention effect) and increased nonverbal communication activity. Benefits were maintained at 3-month follow-up. CONCLUSIONS Silent films that make intensive use of nonverbal communication may be a useful add-on to speech therapy for individuals with aphasia. Improving naming in severe and chronic aphasia may not be feasible, and more effort could be devoted to improving gesture-based and nonverbal communication.
Collapse
Affiliation(s)
- Clemence Vibrac
- Pôle Ambroise-Paré, service d'ORL, Hôpitaux civils de Colmar, 39, avenue de la Liberté, 68024 Colmar cedex, France; Pôle psychiatrie, service de psychiatrie infanto juvénile, Hôpitaux civils de Colmar, 39, avenue de la Liberté, 68024 Colmar cedex, France; Centre de formation universitaire en orthophonie de Strasbourg, 4, rue Kirschleger, 67000 Strasbourg, France
| | - Amelie Avias
- Centre de formation universitaire en orthophonie de Strasbourg, 4, rue Kirschleger, 67000 Strasbourg, France
| | - Pierre-Olivier François
- Pôle de médecine physique et de réadaptation, Institut Universitaire de réadaptation Clemenceau-Strasbourg, 45, boulevard Clémenceau, 67082 Strasbourg cedex, France
| | - Marie-Eve Isner-Horobeti
- Pôle de médecine physique et de réadaptation, Institut Universitaire de réadaptation Clemenceau-Strasbourg, 45, boulevard Clémenceau, 67082 Strasbourg cedex, France; Strasbourg university, Fédération de médecine translationnelle de Strasbourg, EA 3072 "mitochondrie, stress oxydant et protection musculaire", Strasbourg, France
- Agata Krasny-Pacini
- Pôle de médecine physique et de réadaptation, Institut Universitaire de réadaptation Clemenceau-Strasbourg, 45, boulevard Clémenceau, 67082 Strasbourg cedex, France; Strasbourg university, unité Inserm 1114 Neuropsychologie cognitive et physiopathologie de la schizophrénie, département de psychiatrie, Hôpital civil de Strasbourg, 1, place de l'Hôpital, 67091 Strasbourg cedex, France.
14
de Beer C, de Ruiter JP, Hielscher-Fastabend M, Hogrefe K. The Production of Gesture and Speech by People With Aphasia: Influence of Communicative Constraints. JOURNAL OF SPEECH, LANGUAGE, AND HEARING RESEARCH : JSLHR 2019; 62:4417-4432. [PMID: 31710512 DOI: 10.1044/2019_jslhr-l-19-0020] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
Purpose People with aphasia (PWA) use different kinds of gesture spontaneously when they communicate. Although there is evidence that the nature of the communicative task influences the linguistic performance of PWA, so far little is known about the influence of the communicative task on the production of gestures by PWA. We aimed to investigate the influence of varying communicative constraints on the production of gesture and spoken expression by PWA in comparison to persons without language impairment. Method Twenty-six PWA with varying aphasia severities and 26 control participants (CP) without language impairment participated in the study. Spoken expression and gesture production were investigated in 2 different tasks: (a) spontaneous conversation about topics of daily living and (b) a cartoon narration task, that is, retellings of short cartoon clips. The frequencies of words and gestures as well as of different gesture types produced by the participants were analyzed and tested for potential effects of group and task. Results Main results for task effects revealed that PWA and CP used more iconic gestures and pantomimes in the cartoon narration task than in spontaneous conversation. Metaphoric gestures, deictic gestures, number gestures, and emblems were more frequently used in spontaneous conversation than in cartoon narrations by both participant groups. Group effects show that, in both tasks, PWA's gesture-to-word ratios were higher than those for the CP. Furthermore, PWA produced more interactive gestures than the CP in both tasks, as well as more number gestures and pantomimes in spontaneous conversation. Conclusions The current results suggest that PWA use gestures to compensate for their verbal limitations under varying communicative constraints. The properties of the communicative task influence the use of different gesture types in people with and without aphasia. Thus, the influence of communicative constraints needs to be considered when assessing PWA's multimodal communicative abilities.
Affiliation(s)
- Carola de Beer
- Cognitive Sciences, Linguistics Department, Potsdam University, Germany
- Jan P de Ruiter
- Cognitive Sciences, Departments of Psychology and Computer Science, Tufts University, Medford, MA
- Katharina Hogrefe
- Clinical Neuropsychology Research Group (EKN), Institute of Phonetics and Speech Processing, Ludwig-Maximilians-Universität München, Germany
15
Murteira A, Nickels L. Can gesture observation help people with aphasia name actions? Cortex 2019; 123:86-112. [PMID: 31760340 DOI: 10.1016/j.cortex.2019.10.005] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/16/2019] [Revised: 09/04/2019] [Accepted: 10/15/2019] [Indexed: 11/19/2022]
Abstract
It has been suggested that gesture can play a role in the treatment of naming impairments in aphasia; however, investigation is still sparse, especially when compared to research on verbal treatments. Critically, previous studies have included either verbal or gesture production in the training. However, although action naming is facilitated by gesture observation in speakers without language impairment, no study has yet systematically determined whether gesture observation alone influences word retrieval in people with aphasia. This is the aim of the research presented here. In a gesture priming experiment, participants with aphasia named actions that were preceded by the observation of videos of congruent or unrelated gestures or a non-gesture control condition. At the group level, action naming was facilitated by observation of congruent gestures. However, single-case analyses revealed variability in the extent to which the participants benefited from gesture cueing. The potential mechanisms underlying the effects of gesture observation on action picture naming in people with aphasia were examined by exploring participant-related and item-related predictors of improvement. It is concluded that gesture observation may facilitate verb retrieval at either semantic or lexical levels. In addition, and despite variability across individuals, gesture observation seems more likely to facilitate action naming in people with spared gesture semantics and mild-to-moderate deficits in lexical-semantic or post-semantic processing.
Affiliation(s)
- Ana Murteira
- Department of Cognitive Science, Macquarie University, Sydney, Australia; International Doctorate of Experimental Approaches to Language and Brain - IDEALAB, Universities of Trento, Groningen, Potsdam, Newcastle and Macquarie University, Australia.
- Lyndsey Nickels
- Department of Cognitive Science, Macquarie University, Sydney, Australia
16
Macedonia M, Hammer F, Weichselbaum O. Guided Embodiment and Potential Applications of Tutor Systems in Language Instruction and Rehabilitation. Front Psychol 2018; 9:927. [PMID: 29951017 PMCID: PMC6008518 DOI: 10.3389/fpsyg.2018.00927] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/09/2017] [Accepted: 05/22/2018] [Indexed: 11/13/2022] Open
Abstract
Intelligent tutor systems (ITSs) in mobile devices take us through learning tasks and make learning ubiquitous, autonomous, and low cost (Nye, 2015). In this paper, we describe guided embodiment as an essential ITS feature for second language learning (L2) and aphasia rehabilitation (ARe) that enhances efficiency in the learning process. In embodiment, cognitive processes, here specifically language (re)learning, are grounded in actions and gestures (Pecher and Zwaan, 2005; Fischer and Zwaan, 2008; Dijkstra and Post, 2015). In order to guide users through embodiment, ITSs must track action and gesture and give corrective feedback to help users achieve their goals. Therefore, sensor systems are essential to guided embodiment. In the following sections, we describe sensor systems that can be implemented in ITSs for guided embodiment.
Affiliation(s)
- Manuela Macedonia
- Information Engineering, Johannes Kepler Universität Linz, Linz, Austria; Neural Mechanisms of Human Communication, Max-Planck-Institut für Kognitions- und Neurowissenschaften, Leipzig, Germany
- Otto Weichselbaum
- Information Engineering, Johannes Kepler Universität Linz, Linz, Austria; Sew Systems GmbH, Linz, Austria
17
van Nispen K, van de Sandt-Koenderman WME, Krahmer E. The comprehensibility of pantomimes produced by people with aphasia. INTERNATIONAL JOURNAL OF LANGUAGE & COMMUNICATION DISORDERS 2018; 53:85-100. [PMID: 28691196 DOI: 10.1111/1460-6984.12328] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/01/2016] [Revised: 03/17/2017] [Accepted: 05/02/2017] [Indexed: 06/07/2023]
Abstract
BACKGROUND People with aphasia (PWA) use pantomime, gesture in the absence of speech, differently from non-brain-damaged people (NBDP). AIMS To evaluate through an exploratory study the comprehensibility of PWA's pantomimes and to find out whether these can compensate for information PWA are unable to convey in speech. METHODS & PROCEDURES A total of 273 naïve observers participated in one of two judgement tasks: forced-choice and open-ended questions. These were used to determine the comprehensibility of pantomimes produced to depict objects by PWA as compared with NBDP. Furthermore, we compared the information conveyed in pantomime with the information in speech. We also examined factors influencing the comprehensibility of pantomime: individual factors, manner of depiction, and the information to be depicted. OUTCOME & RESULTS Although comprehensibility scores for PWA's pantomimes were lower than those for pantomimes produced by NBDP, all PWA were able to convey information in pantomime that they could not convey in speech. Comprehensibility of pantomimes was predicted by apraxia. The inability to use the right hand was related to slightly lower comprehensibility scores. Objects whose use was depicted were best understood. CONCLUSION & IMPLICATIONS Our findings highlight the potential benefit of pantomime for clinical practice. Pantomimes, even though sometimes impaired, can convey information that PWA cannot convey in speech. Clinical implications are discussed.
Affiliation(s)
- Karin van Nispen
- Tilburg Center for Cognition and Communication (TiCC), Tilburg University, the Netherlands
- Mieke W M E van de Sandt-Koenderman
- Tilburg Center for Cognition and Communication (TiCC), Tilburg University, the Netherlands
- Emiel Krahmer
- Tilburg Center for Cognition and Communication (TiCC), Tilburg University, the Netherlands
18
de Beer C, Carragher M, van Nispen K, Hogrefe K, de Ruiter JP, Rose ML. How Much Information Do People With Aphasia Convey via Gesture? AMERICAN JOURNAL OF SPEECH-LANGUAGE PATHOLOGY 2017; 26:483-497. [PMID: 28492911 DOI: 10.1044/2016_ajslp-15-0027] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/27/2015] [Accepted: 03/15/2016] [Indexed: 06/07/2023]
Abstract
PURPOSE People with aphasia (PWA) face significant challenges in verbally expressing their communicative intentions. Different types of gestures are produced spontaneously by PWA, and a potentially compensatory function of these gestures has been discussed. The current study aimed to investigate how much information PWA communicate through 3 types of gesture and the communicative effectiveness of such gestures. METHOD Listeners without language impairment rated the information content of short video clips taken from PWA in conversation. Listeners were asked to rate communication within a speech-only condition and a gesture + speech condition. RESULTS The results revealed that the participants' interpretations of the communicative intentions expressed in the clips of PWA were significantly more accurate in the gesture + speech condition for all tested gesture types. CONCLUSION It was concluded that all 3 gesture types under investigation contributed to the expression of semantic meaning communicated by PWA. Gestures are an important communicative means for PWA and should be regarded as such by their interlocutors. Gestures have been shown to enhance listeners' interpretation of PWA's overall communication.
Affiliation(s)
- Carola de Beer
- Department of Linguistics and Literature Science, Bielefeld University, Bielefeld, Germany; Department of Special Education and Rehabilitation, University of Cologne, Germany
- Marcella Carragher
- Rose Aphasia Lab, La Trobe University, Melbourne, Australia; Centre for Clinical Research Excellence in Aphasia Rehabilitation (CCRE), University of Brisbane, Australia
- Karin van Nispen
- Department of Communication and Information Sciences, Tilburg University, Tilburg, the Netherlands
- Katharina Hogrefe
- Clinical Neuropsychology Research Group (EKN), Institute of Phonetics and Speech Processing, Ludwig-Maximilians-Universität München, Munich, Germany
- Jan P de Ruiter
- Departments of Psychology and Computer Science, Tufts University, Medford, Massachusetts
- Miranda L Rose
- Rose Aphasia Lab, La Trobe University, Melbourne, Australia; Centre for Clinical Research Excellence in Aphasia Rehabilitation (CCRE), University of Brisbane, Australia