1. What is Functional Communication? A Theoretical Framework for Real-World Communication Applied to Aphasia Rehabilitation. Neuropsychol Rev 2022; 32:937-973. [PMID: 35076868] [PMCID: PMC9630202] [DOI: 10.1007/s11065-021-09531-2]
Abstract
Aphasia is an impairment of language caused by acquired brain damage, such as stroke or traumatic brain injury, that affects a person’s ability to communicate effectively. The aim of rehabilitation in aphasia is to improve everyday communication, thereby enhancing an individual’s ability to function in day-to-day life. For that reason, a thorough understanding of naturalistic communication and its underlying mechanisms is imperative. The field of aphasiology currently lacks an agreed, comprehensive, theoretically founded definition of communication. Instead, multiple disparate interpretations of functional communication are used. We argue that this makes it nearly impossible to validly and reliably assess a person’s communicative performance, to target this behaviour through therapy, and to measure improvements post-therapy. In this article we propose a structured, theoretical approach to defining the concept of functional communication. We argue for a view of communication as “situated language use”, borrowed from empirical psycholinguistic studies with non-brain-damaged adults. This framework defines language use as: (1) interactive, (2) multimodal, and (3) contextual. Existing research on each component of the framework from non-brain-damaged adults and people with aphasia is reviewed. The consequences of adopting this approach for assessment and therapy in aphasia rehabilitation are discussed. The aim of this article is to encourage a more systematic, comprehensive approach to the study and treatment of situated language use in aphasia.
2. Preisig BC, Eggenberger N, Cazzoli D, Nyffeler T, Gutbrod K, Annoni JM, Meichtry JR, Nef T, Müri RM. Multimodal Communication in Aphasia: Perception and Production of Co-speech Gestures During Face-to-Face Conversation. Front Hum Neurosci 2018; 12:200. [PMID: 29962942] [PMCID: PMC6010555] [DOI: 10.3389/fnhum.2018.00200]
Abstract
The role of nonverbal communication in patients with post-stroke language impairment (aphasia) is not yet fully understood. This study investigated how aphasic patients perceive and produce co-speech gestures during face-to-face interaction, and whether distinct brain lesions would predict the frequency of spontaneous co-speech gesturing. For this purpose, we recorded samples of conversations in patients with aphasia and healthy participants. Gesture perception was assessed by means of a head-mounted eye-tracking system, and the produced co-speech gestures were coded according to a linguistic classification system. The main results are that meaning-laden gestures (e.g., iconic gestures representing object shapes) are more likely to attract visual attention than meaningless hand movements, and that patients with aphasia are more likely to fixate co-speech gestures overall than healthy participants. This implies that patients with aphasia may benefit from the multimodal information provided by co-speech gestures. On the level of co-speech gesture production, we found that patients with damage to the anterior part of the arcuate fasciculus showed a higher frequency of meaning-laden gestures. This area lies in close vicinity to the premotor cortex and is considered to be important for speech production. This may suggest that the use of meaning-laden gestures depends on the integrity of patients’ speech production abilities.
Affiliation(s)
- Basil C Preisig: Perception and Eye Movement Laboratory, Department of Neurology and Clinical Research, University of Bern Inselspital, Bern, Switzerland; Donders Centre for Cognitive Neuroimaging, Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
- Noëmi Eggenberger: Perception and Eye Movement Laboratory, Department of Neurology and Clinical Research, University of Bern Inselspital, Bern, Switzerland
- Dario Cazzoli: Gerontechnology and Rehabilitation Group, ARTORG Center for Biomedical Engineering Research, University of Bern, Bern, Switzerland
- Thomas Nyffeler: Perception and Eye Movement Laboratory, Department of Neurology and Clinical Research, University of Bern Inselspital, Bern, Switzerland; Center of Neurology and Neurorehabilitation, Luzerner Kantonsspital, Luzern, Switzerland
- Klemens Gutbrod: University Neurorehabilitation Clinics, Department of Neurology, University of Bern Inselspital, Bern, Switzerland
- Jean-Marie Annoni: Neurology Unit, Laboratory for Cognitive and Neurological Sciences, Department of Medicine, Faculty of Science, University of Fribourg, Fribourg, Switzerland
- Jurka R Meichtry: Perception and Eye Movement Laboratory, Department of Neurology and Clinical Research, University of Bern Inselspital, Bern, Switzerland; University Neurorehabilitation Clinics, Department of Neurology, University of Bern Inselspital, Bern, Switzerland
- Tobias Nef: Gerontechnology and Rehabilitation Group, ARTORG Center for Biomedical Engineering Research, University of Bern, Bern, Switzerland
- René M Müri: Perception and Eye Movement Laboratory, Department of Neurology and Clinical Research, University of Bern Inselspital, Bern, Switzerland; Gerontechnology and Rehabilitation Group, ARTORG Center for Biomedical Engineering Research, University of Bern, Bern, Switzerland; University Neurorehabilitation Clinics, Department of Neurology, University of Bern Inselspital, Bern, Switzerland
3. Comprehension of Co-Speech Gestures in Aphasic Patients: An Eye Movement Study. PLoS One 2016; 11:e0146583. [PMID: 26735917] [PMCID: PMC4703302] [DOI: 10.1371/journal.pone.0146583]
Abstract
BACKGROUND: Co-speech gestures are omnipresent and a crucial element of human interaction, facilitating language comprehension. However, it is unclear whether gestures also support language comprehension in aphasic patients. Using visual exploration behavior analysis, the present study investigated the influence of congruence between speech and co-speech gestures on comprehension, measured as accuracy in a decision task.
METHOD: Twenty aphasic patients and 30 healthy controls watched videos in which speech was combined with meaningless (baseline condition), congruent, or incongruent gestures. Comprehension was assessed with a decision task, while remote eye-tracking allowed analysis of visual exploration.
RESULTS: In aphasic patients, the incongruent condition resulted in a significant decrease in accuracy, while the congruent condition led to a significant increase in accuracy compared to baseline. In the control group, the incongruent condition resulted in a decrease in accuracy, while the congruent condition did not significantly increase accuracy. Visual exploration analysis showed that patients fixated the face significantly less and tended to fixate the gesturing hands more than controls.
CONCLUSION: Co-speech gestures play an important role for aphasic patients, as they modulate comprehension. Incongruent gestures evoke significant interference and deteriorate patients' comprehension, whereas congruent gestures enhance comprehension in aphasic patients, which might be valuable for clinical and therapeutic purposes.
4. Perception of co-speech gestures in aphasic patients: a visual exploration study during the observation of dyadic conversations. Cortex 2014; 64:157-68. [PMID: 25461716] [DOI: 10.1016/j.cortex.2014.10.013]
Abstract
BACKGROUND: Co-speech gestures are part of nonverbal communication during conversations. They either support the verbal message or provide the interlocutor with additional information. Furthermore, as nonverbal cues they prompt the cooperative process of turn-taking. In the present study, we investigated the influence of co-speech gestures on the perception of dyadic dialogue in aphasic patients. In particular, we analysed the impact of co-speech gestures on gaze direction (towards speaker or listener) and fixation of body parts. We hypothesized that aphasic patients, who are restricted in verbal comprehension, adapt their visual exploration strategies.
METHODS: Sixteen aphasic patients and 23 healthy control subjects participated in the study. Visual exploration behaviour was measured by means of a contact-free infrared eye-tracker while subjects were watching videos depicting spontaneous dialogues between two individuals. Cumulative fixation duration and mean fixation duration were calculated for the factors co-speech gesture (present or absent), gaze direction (to the speaker or to the listener), and region of interest (ROI), including hands, face, and body.
RESULTS: Both aphasic patients and healthy controls mainly fixated the speaker's face. We found a significant co-speech gesture × ROI interaction, indicating that the presence of a co-speech gesture encouraged subjects to look at the speaker. Further, a significant gaze direction × ROI × group interaction revealed that aphasic patients showed reduced cumulative fixation duration on the speaker's face compared to healthy controls.
CONCLUSION: Co-speech gestures guide the observer's attention towards the speaker, the source of semantic input. We discuss whether an underlying semantic processing deficit or a deficit in integrating audio-visual information may cause aphasic patients to explore the speaker's face less.
5. Silkes JP. Providing Audiological Services to Individuals With Aphasia: Considerations, Preliminary Recommendations, and a Call for Research. Am J Audiol 2012; 21:3-12. [DOI: 10.1044/1059-0889(2012/10-0002)]
Abstract
Purpose
The populations most susceptible to hearing loss and to aphasia overlap substantially, creating a high likelihood that audiologists will be called on to assess and treat individuals with aphasia. There is, however, scarce research available to guide best practices for serving this population.
Method
The available relevant literature is reviewed to summarize what is already known, providing basic information about aphasia and its potential impact on audiological diagnostic and intervention processes.
Conclusion
Suggestions for managing aphasia in the clinical audiology setting are provided, and areas of needed research are identified so that services for individuals with aphasia can be optimized.
6.
Abstract
BACKGROUND: The study of communicative gestures is of considerable interest for aphasia, in relation to theory, diagnosis, and treatment. Significant limitations currently permeate the general (psycho)linguistic literature on gesture production, and attention to these limitations is essential for both continued investigation and clinical application of gesture for people with aphasia.
AIMS: The aims of this paper are to discuss issues imperative to advancing the gesture production literature and to provide specific suggestions for applying the material herein to studies of gesture production in people with aphasia.
MAIN CONTRIBUTION: Two primary perspectives in the gesture production literature are distinct in their proposals about the function of gesture and about where gesture arises in the communication stream. These two perspectives are discussed, along with three elements considered prerequisites for advancing research on gesture production: operational definitions, coding systems, and the temporal synchrony characteristics of gesture.
CONCLUSIONS: Addressing the specific elements discussed in this paper will provide essential information for both continued investigation and clinical application of gesture for people with aphasia.
7. Skipper JI, Goldin-Meadow S, Nusbaum HC, Small SL. Speech-associated gestures, Broca's area, and the human mirror system. Brain Lang 2007; 101:260-77. [PMID: 17533001] [PMCID: PMC2703472] [DOI: 10.1016/j.bandl.2007.02.008]
Abstract
Speech-associated gestures are hand and arm movements that not only convey semantic information to listeners but are themselves actions. Broca's area has been assumed to play an important role both in semantic retrieval or selection (as part of a language comprehension system) and in action recognition (as part of a "mirror" or "observation-execution matching" system). We asked whether the role that Broca's area plays in processing speech-associated gestures is consistent with the semantic retrieval/selection account (predicting relatively weak interactions between Broca's area and other cortical areas, because the meaningful information that speech-associated gestures convey reduces semantic ambiguity and thus reduces the need for semantic retrieval/selection) or the action recognition account (predicting strong interactions between Broca's area and other cortical areas, because speech-associated gestures are goal-directed actions that are "mirrored"). We compared the functional connectivity of Broca's area with other cortical areas when participants listened to stories while watching meaningful speech-associated gestures, speech-irrelevant self-grooming hand movements, or no hand movements. A network analysis of neuroimaging data showed that interactions involving Broca's area and other cortical areas were weakest when spoken language was accompanied by meaningful speech-associated gestures, and strongest when spoken language was accompanied by self-grooming hand movements or by no hand movements at all. Results are discussed with respect to the role that the human mirror system plays in processing speech-associated movements.
Affiliation(s)
- Jeremy I Skipper: Department of Psychology, The University of Chicago, Chicago, IL, USA
8. Braunack-Mayer A, Hersh D. An Ethical Voice in the Silence of Aphasia: Judging Understanding and Consent in People with Aphasia. J Clin Ethics 2001. [DOI: 10.1086/jce200112407]