1
Congdon EL, Novack MA, Wakefield EM. Exploring Individual Differences: A Case for Measuring Children's Spontaneous Gesture Production as a Predictor of Learning From Gesture Instruction. Top Cogn Sci 2024. PMID: 38284283. DOI: 10.1111/tops.12722.
Abstract
Decades of research have established that learners benefit when instruction includes hand gestures. This benefit is seen when learners watch an instructor gesture, as well as when they are taught or encouraged to gesture themselves. However, there is substantial individual variability in this phenomenon: not all individuals benefit equally from gesture instruction. In the current paper, we explore the sources of this variability. First, we review the existing research on individual differences that do or do not predict learning from gesture instruction, including differences that are either context-dependent (linked to the particular task at hand) or context-independent (linked to the learner across multiple tasks). Next, we focus on one understudied measure of individual difference: the learner's own spontaneous gesture rate. We present data showing rates of "non-gesturers" across a number of studies, and we provide theoretical motivation for why this is a fruitful area for future research. We end by suggesting ways in which research on individual differences will help gesture researchers further refine existing theories and develop specific predictions about targeted gesture interventions for all kinds of learners.
Affiliation(s)
- Miriam A Novack
- Department of Medical Social Sciences, Northwestern University Feinberg School of Medicine
2
Emotion is perceived accurately from isolated body parts, especially hands. Cognition 2023; 230:105260. PMID: 36058103. DOI: 10.1016/j.cognition.2022.105260.
Abstract
Body posture and configuration provide important visual cues about the emotion states of other people. We know that bodily form is processed holistically; however, emotion recognition may depend on different mechanisms, and certain body parts, such as the hands, may be especially important for perceiving emotion. This study therefore compared participants' emotion recognition performance when shown images of full bodies, or of isolated hands, arms, heads, and torsos. Across three experiments, emotion recognition accuracy was above chance for all body parts. While emotions were recognized most accurately from full bodies, recognition performance from the hands was more accurate than for other body parts. Representational similarity analysis further showed that the pattern of errors for the hands was related to that for full bodies. Performance was reduced when stimuli were inverted, showing a clear body inversion effect. The high performance for hands was not due only to the fact that there are two hands, as performance remained well above chance even when just one hand was shown. These results demonstrate that emotions can be decoded from body parts. Furthermore, certain features, such as the hands, are more important to emotion perception than others.
STATEMENT OF RELEVANCE: Successful social interaction relies on accurately perceiving emotional information from others. Bodies provide an abundance of emotion cues; however, the way in which emotional bodies and body parts are perceived is unclear. We investigated this perceptual process by comparing emotion recognition for body parts with that for full bodies. Crucially, we found that while emotions were most accurately recognized from full bodies, emotions were also classified accurately when images of isolated hands, arms, heads, and torsos were seen. Of the body parts shown, emotion recognition from the hands was most accurate. Furthermore, shared patterns of emotion classification for hands and full bodies suggested that emotion recognition mechanisms are shared for full bodies and body parts. That the hands are key to emotion perception is important evidence in its own right; it could also be applied to interventions for individuals who find it difficult to read emotions from faces and bodies.
3
Stewart JR, Crutchfield R, Chang WL. Prelinguistic gesture and developmental abilities: A multi-ethnic comparative study. Infant Behav Dev 2022; 68:101748. PMID: 35908421. DOI: 10.1016/j.infbeh.2022.101748.
Abstract
The present study examined the frequency of gesture use, and the relationship between frequency of gesture use and developmental abilities, in typically developing, prelinguistic Hispanic and non-Hispanic White children aged 9 to 15 months. Data were collected through parent questionnaires, the Mullen Scales of Early Learning (MSEL), and two 15-min video samples for each participant (semi-structured and structured settings). All video samples were coded for the frequency of the following gestures: total frequency, behavior regulation, social interaction, and joint attention. Results showed that children from both ethnicities used fewer gestures in the semi-structured setting than in the structured setting, and that non-Hispanic White children produced higher frequencies of behavior regulation and joint attention gestures but lower frequencies of social interaction gestures. When controlling for ethnicity, gender, and age, total frequency of gesture and frequencies of behavior regulation and social interaction gestures were predictive of various developmental abilities. Furthermore, participant gender, age, and ethnicity were significantly related to several of the developmental abilities explored, and these relationships depended on setting. An understanding of gesture use, and of its relationship to developmental abilities, in prelinguistic children from different ethnic backgrounds has implications for early identification of delays and differences, and is important to consider when exploring the connection between gesture and language and whether there are gesture-language, gesture-motor, and/or gesture-cognition integrated systems.
Affiliation(s)
- Wan-Lin Chang
- University of Texas Rio Grande Valley, United States of America
4
Holler J, Drijvers L, Rafiee A, Majid A. Embodied Space-pitch Associations are Shaped by Language. Cogn Sci 2022; 46:e13083. PMID: 35188682. DOI: 10.1111/cogs.13083.
Abstract
Height-pitch associations are claimed to be universal and independent of language, but this claim remains controversial. The present study sheds new light on this debate with a multimodal analysis of individual sound and melody descriptions obtained in an interactive communication paradigm with speakers of Dutch and Farsi. The findings reveal that, in contrast to Dutch speakers, Farsi speakers do not use a height-pitch metaphor consistently in speech. Both Dutch and Farsi speakers' co-speech gestures did reveal a mapping of higher pitches to higher space and lower pitches to lower space, and this gesture space-pitch mapping tended to co-occur with corresponding spatial words (high-low). However, this mapping was much weaker in Farsi speakers than Dutch speakers. This suggests that cross-linguistic differences shape the conceptualization of pitch and further calls into question the universality of height-pitch associations.
Affiliation(s)
- Judith Holler
- Donders Institute for Brain, Cognition and Behaviour, Radboud University; Language & Cognition and Neurobiology of Language Departments, Max Planck Institute for Psycholinguistics
- Linda Drijvers
- Donders Institute for Brain, Cognition and Behaviour, Radboud University; Language & Cognition and Neurobiology of Language Departments, Max Planck Institute for Psycholinguistics
- Asifa Majid
- Centre for Language Studies, Radboud University; Department of Psychology, University of York
5
Billot-Vasquez K, Lian Z, Hirata Y, Kelly SD. Emblem Gestures Improve Perception and Evaluation of Non-native Speech. Front Psychol 2020; 11:574418. PMID: 33071912. PMCID: PMC7536367. DOI: 10.3389/fpsyg.2020.574418.
Abstract
Traditionally, much of the attention on the communicative effects of non-native accent has focused on the accent itself rather than how it functions within a more natural context. The present study explores how the bodily context of co-speech emblematic gestures affects perceptual and social evaluation of non-native accent. In two experiments in two different languages, Mandarin and Japanese, we filmed learners performing a short utterance in three different within-subjects conditions: speech alone, culturally familiar gesture, and culturally unfamiliar gesture. Native Mandarin participants watched videos of foreign-accented Mandarin speakers (Experiment 1), and native Japanese participants watched videos of foreign-accented Japanese speakers (Experiment 2). Following each video, native language participants were asked a set of questions targeting speech perception and social impressions of the learners. Results from both experiments demonstrate that familiar—and occasionally unfamiliar—emblems facilitated speech perception and enhanced social evaluations compared to the speech alone baseline. The variability in our findings suggests that gesture may serve varied functions in the perception and evaluation of non-native accent.
Affiliation(s)
- Kiana Billot-Vasquez
- Department of Psychological and Brain Sciences, Colgate University, Hamilton, NY, United States; Center for Language and Brain, Hamilton, NY, United States
- Zhongwen Lian
- Center for Language and Brain, Hamilton, NY, United States; Linguistics Program, Colgate University, Hamilton, NY, United States
- Yukari Hirata
- Center for Language and Brain, Hamilton, NY, United States; Linguistics Program, Colgate University, Hamilton, NY, United States; Department of East Asian Languages, Colgate University, Hamilton, NY, United States
- Spencer D Kelly
- Department of Psychological and Brain Sciences, Colgate University, Hamilton, NY, United States; Center for Language and Brain, Hamilton, NY, United States; Linguistics Program, Colgate University, Hamilton, NY, United States
6
Sparrow K, Lind C, van Steenbrugge W. Gesture, communication, and adult acquired hearing loss. J Commun Disord 2020; 87:106030. PMID: 32707420. DOI: 10.1016/j.jcomdis.2020.106030.
Abstract
Nonverbal communication, specifically hand and arm movements (commonly known as gesture), has long been recognized and explored as a significant element in human interaction, as well as a potential compensatory behavior for individuals with communication difficulties. The use of gesture as a compensatory communication method in expressive and receptive human communication disorders has been the subject of much investigation. Yet within the context of adult acquired hearing loss, gesture has received limited research attention, and much remains unknown about patterns of nonverbal behavior in conversations in which hearing loss is a factor. This paper presents key elements of the background of gesture studies and the theories of gesture function and production, followed by a review of research focused on adults with hearing loss and the role of gesture and gaze in rehabilitation. This examination of co-speech gesture as a visual resource in everyday interactions involving adults with acquired hearing loss suggests the need to develop an evidence base that can inform enhancements and changes in how rehabilitation services are delivered.
Affiliation(s)
- Karen Sparrow
- Audiology, College of Nursing & Health Sciences, Flinders University, GPO Box 2100, Adelaide, 5001, South Australia, Australia.
- Christopher Lind
- Audiology, College of Nursing & Health Sciences, Flinders University, GPO Box 2100, Adelaide, 5001, South Australia, Australia.
- Willem van Steenbrugge
- Speech Pathology, College of Nursing & Health Sciences, Flinders University, GPO Box 2100, Adelaide, 5001, South Australia, Australia.
7
ten Brinke L, Weisbuch M. How verbal-nonverbal consistency shapes the truth. J Exp Soc Psychol 2020. DOI: 10.1016/j.jesp.2020.103978.
8
Gampe A, Hartmann L, Daum MM. Dynamic interaction patterns of monolingual and bilingual infants with their parents. J Child Lang 2020; 47:45-63. PMID: 31865931. DOI: 10.1017/s0305000919000631.
Abstract
Bilingual children show a number of advantages in the domain of communication. The aim of the current study was to investigate whether differences in interactions are present before productive language skills emerge. For a duration of 5 minutes, 64 parents and their 14-month-old infants explored a decorated room together. The coordination of their behaviors in the modalities of action, language, and gesture was coded. The results showed no differences in interactions across different language statuses. In two additional analyses, we first compared monolinguals and bilinguals with caregivers who shared the same language and culture. Results showed the same pattern of non-difference. Second, we compared bilinguals with caregivers from different cultures. The rate and duration of coordination differed across infants with different cultural backgrounds. The findings suggest that exposure to two languages is not sufficient to explain the previously identified beneficial effects in the communicative interactions of bilingual children.
9
Debreslioska S, van de Weijer J, Gullberg M. Addressees Are Sensitive to the Presence of Gesture When Tracking a Single Referent in Discourse. Front Psychol 2019; 10:1775. PMID: 31456709. PMCID: PMC6700288. DOI: 10.3389/fpsyg.2019.01775.
Abstract
Production studies show that anaphoric reference is bimodal. Speakers can introduce a referent in speech by also using a localizing gesture, assigning a specific locus in space to it. Referring back to that referent, speakers then often accompany a spoken anaphor with a localizing anaphoric gesture (i.e., indicating the same locus). Speakers thus create visual anaphoricity in parallel to the anaphoric process in speech. In the current perception study, we examine whether addressees are sensitive to localizing anaphoric gestures and specifically to the (mis)match between recurrent use of space and spoken anaphora. The results of two reaction time experiments show that, when a single referent is gesturally tracked, addressees are sensitive to the presence of localizing gestures, but not to their spatial congruence. Addressees thus seem to integrate gestural information when processing bimodal anaphora, but their use of locational information in gestures is not obligatory in every discourse context.
Affiliation(s)
- Sandra Debreslioska
- Centre for Languages and Literature, Lund University, Lund, Sweden
- Joost van de Weijer
- Centre for Languages and Literature, Lund University, Lund, Sweden
- Lund University Humanities Lab, Lund University, Lund, Sweden
- Marianne Gullberg
- Centre for Languages and Literature, Lund University, Lund, Sweden
- Lund University Humanities Lab, Lund University, Lund, Sweden
10
The function of primate multimodal communication. Anim Cogn 2018; 21:619-629. DOI: 10.1007/s10071-018-1197-8.
11
Kong APH, Law SP, Cheung CKY. Use of co-verbal gestures during word-finding difficulty among Cantonese speakers with fluent aphasia and unimpaired controls. Aphasiology 2018; 33:216-233. PMID: 30853744. PMCID: PMC6402778. DOI: 10.1080/02687038.2018.1463085.
Abstract
BACKGROUND: Co-verbal gestures refer to hand or arm movements made during speaking. Spoken language and gestures have been shown to be tightly integrated in human communication.
AIMS: The present study investigated whether co-verbal gesture use was associated with lexical retrieval in connected speech in unimpaired speakers and persons with aphasia (PWA).
METHODS & PROCEDURES: Narrative samples of 58 fluent PWA and 58 control speakers were extracted from the Cantonese AphasiaBank. Based on indicators of word-finding difficulty (WFD) in connected speech adapted from previous research, and a gesture annotation system with independent coding of gesture forms and functions, all WFD instances were identified. The presence and type of gestures accompanying each incident of WFD were then annotated. Finally, whether the use of gesture was accompanied by resolution of the WFD, i.e., whether the corresponding target word could be retrieved, was examined.
OUTCOMES & RESULTS: Employment of co-verbal gesture did not appear to be related to the success of word retrieval. PWA's naming ability at the single-word level and their overall language ability (as reflected by the aphasia quotient of the Cantonese version of the Western Aphasia Battery) were found to be the two strongest predictors of the success rate of resolving WFD.
CONCLUSIONS: The Lexical Retrieval Hypothesis, which highlights the facilitative functions of iconic and metaphoric gestures in lexical retrieval, was not supported. Challenges in conducting research related to WFD and the clinical implications for gesture-based language intervention for PWA are discussed.
Affiliation(s)
- Anthony Pak-Hin Kong
- Department of Communication Sciences and Disorders, University of Central Florida, Orlando, FL, USA
- Sam-Po Law
- Division of Speech and Hearing Sciences, University of Hong Kong, Hong Kong
12
Taking turns across channels: Conversation-analytic tools in animal communication. Neurosci Biobehav Rev 2017; 80:201-209. DOI: 10.1016/j.neubiorev.2017.05.005.
13
Debreslioska S, Gullberg M. Discourse Reference Is Bimodal: How Information Status in Speech Interacts with Presence and Viewpoint of Gestures. Discourse Processes 2017. DOI: 10.1080/0163853x.2017.1351909.
Affiliation(s)
- Marianne Gullberg
- Centre for Languages and Literature and Lund University Humanities Lab, Lund University, Lund, Sweden
14
Galati A, Weisberg SM, Newcombe NS, Avraamides MN. When gestures show us the way: Co-thought gestures selectively facilitate navigation and spatial memory. Spatial Cognition and Computation 2017. DOI: 10.1080/13875868.2017.1332064.
Affiliation(s)
- Alexia Galati
- Department of Psychology, University of Cyprus, Nicosia, Cyprus
- Cognitive and Information Sciences, University of California–Merced, Merced, CA, USA
- Steven M. Weisberg
- Department of Neurology, University of Pennsylvania, Philadelphia, PA, USA
- Nora S. Newcombe
- Department of Psychology, Temple University, Philadelphia, PA, USA
- Marios N. Avraamides
- Department of Psychology, University of Cyprus, Nicosia, Cyprus
- Centre for Applied Neuroscience, University of Cyprus, Nicosia, Cyprus
15
Hogrefe K, Ziegler W, Weidinger N, Goldenberg G. Comprehensibility and neural substrate of communicative gestures in severe aphasia. Brain Lang 2017; 171:62-71. PMID: 28535366. DOI: 10.1016/j.bandl.2017.04.007.
Abstract
Communicative gestures can compensate for the incomprehensibility of oral speech in severe aphasia, but the brain damage that causes aphasia may also affect the production of gestures. We compared the comprehensibility of the gestural communication of persons with severe aphasia and non-aphasic persons and used voxel-based lesion symptom mapping (VLSM) to determine lesion sites that are responsible for poor gestural expression in aphasia. At the group level, persons with aphasia conveyed more information via gestures than controls, indicating a compensatory use of gestures in persons with severe aphasia. However, individual analysis showed a broad range of gestural comprehensibility. VLSM suggested that poor gestural expression was associated with lesions in anterior temporal and inferior frontal regions. We hypothesize that likely functional correlates of these localizations are the selection of, and flexible changes between, communication channels, as well as between different types of gestures and between features of actions and objects that are expressed by gestures.
Affiliation(s)
- Katharina Hogrefe
- Clinical Neuropsychology Research Group (EKN), Institute of Phonetics and Speech Processing, Ludwig-Maximilians-Universität München, Munich, Germany.
- Wolfram Ziegler
- Clinical Neuropsychology Research Group (EKN), Institute of Phonetics and Speech Processing, Ludwig-Maximilians-Universität München, Munich, Germany
- Nicole Weidinger
- Institute for German as a Foreign Language, Ludwig-Maximilians-Universität München, Munich, Germany
- Georg Goldenberg
- Department of Neurology, Technical University Munich, Munich, Germany
16
Kong APH, Law SP, Chak GWC. A Comparison of Coverbal Gesture Use in Oral Discourse Among Speakers With Fluent and Nonfluent Aphasia. J Speech Lang Hear Res 2017; 60:2031-2046. PMID: 28609510. PMCID: PMC5831092. DOI: 10.1044/2017_jslhr-l-16-0093.
Abstract
PURPOSE: Coverbal gesture use, which is affected by the presence and degree of aphasia, can be culturally specific. The purpose of this study was to compare gesture use among Cantonese-speaking individuals: 23 neurologically healthy speakers, 23 speakers with fluent aphasia, and 21 speakers with nonfluent aphasia.
METHOD: Multimedia data of discourse samples from these speakers were extracted from the Cantonese AphasiaBank. Gestures were independently annotated for their forms and functions to determine how gesturing rate and distribution of gestures differed across speaker groups. A multiple regression was conducted to determine the most predictive variable(s) for gesture-to-word ratio.
RESULTS: Although speakers with nonfluent aphasia gestured most frequently, the rate of gesture use in counterparts with fluent aphasia did not differ significantly from controls. Different patterns of gesture functions in the 3 speaker groups revealed that gesture plays a minor role in lexical retrieval, whereas its role in enhancing communication dominates among the speakers with aphasia. The percentages of complete sentences and dysfluency strongly predicted the gesturing rate in aphasia.
CONCLUSIONS: The current results supported the sketch model of language-gesture association. The relationship between gesture production and linguistic abilities and the clinical implications for gesture-based language intervention for speakers with aphasia are also discussed.
Affiliation(s)
- Anthony Pak-Hin Kong
- Department of Communication Sciences and Disorders, University of Central Florida, Orlando
- Sam-Po Law
- Division of Speech and Hearing Sciences, University of Hong Kong
17
Sheikholeslami S, Moon AJ, Croft EA. Cooperative gestures for industry: Exploring the efficacy of robot hand configurations in expression of instructional gestures for human–robot interaction. Int J Rob Res 2017. DOI: 10.1177/0278364917709941.
Affiliation(s)
- Sara Sheikholeslami
- Department of Mechanical Engineering, University of British Columbia, Canada
- AJung Moon
- Department of Mechanical Engineering, University of British Columbia, Canada
- Elizabeth A Croft
- Department of Mechanical Engineering, University of British Columbia, Canada
18
Lickiss KP, Wellens AR. Effects of Visual Accessibility and Hand Restraint on Fluency of Gesticulator and Effectiveness of Message. Percept Mot Skills 2016. DOI: 10.2466/pms.1978.46.3.925.
Abstract
The role of hand gestures in human communication was examined in an experiment that manipulated communicator-receiver visual accessibility and freedom of the communicator's hand movements. While gesturing occurred primarily during periods of speech rather than silence, the visual availability of 10 speakers' hand gestures did not significantly enhance receivers' ability to decode and act upon task-related messages. Hand restraint did not significantly affect speakers' verbal fluency or total verbal output. The mere visual presence of an interactant had a greater impact on speech disfluency than did hand restraint.
19
Rauscher FH, Krauss RM, Chen Y. Gesture, Speech, and Lexical Access: The Role of Lexical Movements in Speech Production. Psychol Sci 2016. DOI: 10.1111/j.1467-9280.1996.tb00364.x.
Abstract
In a within-subjects design that varied whether speakers were allowed to gesture and the difficulty of lexical access, speakers were videotaped as they described animated action cartoons to a listener. When speakers were permitted to gesture, they gestured more often during phrases with spatial content than during phrases with other content. Speech with spatial content was less fluent when speakers could not gesture than when they could gesture; speech with nonspatial content was not affected by gesture condition. Preventing gesturing increased the relative frequency of nonjuncture filled pauses in speech with spatial content, but not in speech with other content. Overall, the effects of preventing speakers from gesturing resembled those of increasing the difficulty of lexical access by other means, except that the effects of gesture restriction were specific to speech with spatial content. The findings support the hypothesis that gestural accompaniments to spontaneous speech can facilitate access to the mental lexicon.
20
Marentette P, Pettenati P, Bello A, Volterra V. Gesture and Symbolic Representation in Italian and English-Speaking Canadian 2-Year-Olds. Child Dev 2016; 87:944-61. PMID: 27079825. DOI: 10.1111/cdev.12523.
Abstract
Analyses of elicited pantomime, primarily of English-speaking children, show that preschool-aged children are more likely to symbolically represent an object with gestures depicting an object's form rather than its function. In contrast, anecdotal reports of spontaneous gesture production in younger children suggest that children use multiple representational techniques. This study examined the spontaneous gestures of sixty-four 2-year-old Italian children and English-speaking Canadian children, primarily from middle-class Caucasian families. The Italian children produced twice as many gestures as Canadian children in a picture-naming task but produced a similar range of representational techniques. Two-year-olds were equally likely to produce gestures depicting function as form. These data suggest young children's communicative skills are supported by a symbolic capacity that reflects contextual communicative demands.
Affiliation(s)
- Paola Pettenati
- Università di Parma; Academy of Developmental Neuropsychology, Parma
- Virginia Volterra
- Institute of Cognitive Sciences and Technologies, Italian National Research Council
21
Nicoladis E, Marentette P, Navarro S. Gesture Frequency Linked Primarily to Story Length in 4-10-Year Old Children's Stories. J Psycholinguist Res 2016; 45:189-204. PMID: 25430692. DOI: 10.1007/s10936-014-9342-2.
Abstract
Previous studies have shown that older children gesture more while telling a story than younger children. This increase in gesture use has been attributed to increased story complexity. In adults, both narrative complexity and imagery predict gesture frequency. In this study, we tested the strength of three predictors of children's gesture use in a narrative context: age, narrative complexity (measured by discourse connectors), and use of imagery (measured by story length). French-, Spanish-, and English-speaking children between 4 and 10 years of age participated in this study. Including these three groups allows us to test for the generalizability of our results and for cross-linguistic differences in gesture frequency. All the children watched cartoons and retold the story. The results showed that the length of the story was a significant predictor of children's gesture rate, while age and discourse connectors were not. There were no differences between language groups. One possible interpretation of these results is that children's gesture frequency is strongly linked to activation of imagery.
Affiliation(s)
- Elena Nicoladis
- Department of Psychology, University of Alberta, P2-17 Biological Sciences Bdg., Edmonton, AB, T6G 2E9, Canada.
22
Richland LE. Linking Gestures: Cross-Cultural Variation During Instructional Analogies. COGNITION AND INSTRUCTION 2015. [DOI: 10.1080/07370008.2015.1091459] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
23
Goldin-Meadow S, Beilock SL. Action's Influence on Thought: The Case of Gesture. PERSPECTIVES ON PSYCHOLOGICAL SCIENCE 2015; 5:664-74. [PMID: 21572548 DOI: 10.1177/1745691610388764] [Citation(s) in RCA: 100] [Impact Index Per Article: 11.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Recent research has shown that people's actions can influence how they think. A separate body of research has shown that the gestures people produce when they speak can also influence how they think. In this article, we bring these two literatures together to explore whether gesture has an effect on thinking by virtue of its ability to reflect real-world actions. We first argue that gestures contain detailed perceptual-motor information about the actions they represent, information often not found in the speech that accompanies the gestures. We then show that the action features in gesture do not just reflect the gesturer's thinking--they can feed back and alter that thinking. Gesture actively brings action into a speaker's mental representations, and those mental representations then affect behavior--at times more powerfully than do the actions on which the gestures are based. Gesture thus has the potential to serve as a unique bridge between action and abstract thought.
24
Pritchard M, Dipper L, Morgan G, Cocks N. Language and iconic gesture use in procedural discourse by speakers with aphasia. APHASIOLOGY 2015; 29:826-844. [PMID: 25999636 PMCID: PMC4409036 DOI: 10.1080/02687038.2014.993912] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/16/2014] [Accepted: 11/28/2014] [Indexed: 05/26/2023]
Abstract
Background: Conveying instructions is an everyday use of language, and gestures are likely to be a key feature of this. Although co-speech iconic gestures are tightly integrated with language, and people with aphasia (PWA) produce procedural discourses impaired at a linguistic level, no previous studies have investigated how PWA use co-speech iconic gestures in these contexts. Aims: This study investigated how PWA communicated meaning using gesture and language in procedural discourses, compared with neurologically healthy people (NHP). We aimed to identify the relative relationship of gesture and speech, in the context of impaired language, both overall and in individual events. Methods & Procedures: Twenty-nine PWA and 29 NHP produced two procedural discourses. The structure and semantic content of language of the whole discourses were analysed through predicate argument structure and spatial motor terms, and gestures were analysed for frequency and semantic form. Gesture and language were analysed in two key events, to determine the relative information presented in each modality. Outcomes & Results: PWA and NHP used similar frequencies and forms of gestures, although PWA used syntactically simpler language and fewer spatial words. This meant, overall, relatively more information was present in PWA gesture. This finding was also reflected in the key events, where PWA used gestures conveying rich semantic information alongside semantically impoverished language more often than NHP. Conclusions: PWA gestures, containing semantic information omitted from the concurrent speech, may help listeners with meaning when language is impaired. This finding indicates gesture should be included in clinical assessments of meaning-making.
Affiliation(s)
- Lucy Dipper
- Division of Language and Communication Science, City University, London, UK
- Gary Morgan
- Division of Language and Communication Science, City University, London, UK
- Naomi Cocks
- School of Psychology and Speech Pathology, Curtin University, Perth, Australia
25
Kong APH, Law SP, Wat WKC, Lai C. Co-verbal gestures among speakers with aphasia: Influence of aphasia severity, linguistic and semantic skills, and hemiplegia on gesture employment in oral discourse. JOURNAL OF COMMUNICATION DISORDERS 2015; 56:88-102. [PMID: 26186256 PMCID: PMC4530578 DOI: 10.1016/j.jcomdis.2015.06.007] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/20/2014] [Revised: 06/08/2015] [Accepted: 06/29/2015] [Indexed: 05/20/2023]
Abstract
The use of co-verbal gestures is common in human communication and has been reported to assist word retrieval and to facilitate verbal interactions. This study systematically investigated the impact of aphasia severity, integrity of semantic processing, and hemiplegia on the use of co-verbal gestures, with reference to gesture forms and functions, by 131 normal speakers, 48 individuals with aphasia, and their controls. All participants were native Cantonese speakers. It was found that greater severity of aphasia and of verbal-semantic impairment was associated with significantly more co-verbal gestures. However, there was no relationship between right-sided hemiplegia and gesture employment. Moreover, significantly more gestures were employed by the speakers with aphasia, although about 10% of them did not gesture. Among those who used gestures, content-carrying gestures, including iconic, metaphoric, and deictic gestures and emblems, served the function of enhancing language content and providing information additional to the language content. As for the non-content-carrying gestures, beats were used primarily for reinforcing speech prosody or guiding speech flow, while non-identifiable gestures were associated with assisting lexical retrieval or with no specific function. These findings enhance our understanding of the various forms of co-verbal gestures in aphasic discourse production and their functions. Speech-language pathologists may also refer to the current annotation system and results to guide clinical evaluation and remediation of gesture use in aphasia.
Affiliation(s)
- Anthony Pak-Hin Kong
- Department of Communication Sciences and Disorders, University of Central Florida, HPA-2 106, PO Box 162215, Orlando, FL 32816-2215, USA.
- Sam-Po Law
- Division of Speech and Hearing Sciences, The University of Hong Kong, Hong Kong SAR, Hong Kong
- Watson Ka-Chun Wat
- Division of Speech and Hearing Sciences, The University of Hong Kong, Hong Kong SAR, Hong Kong
- Christy Lai
- Division of Speech and Hearing Sciences, The University of Hong Kong, Hong Kong SAR, Hong Kong
26
Özyürek A. Hearing and seeing meaning in speech and gesture: insights from brain and behaviour. Philos Trans R Soc Lond B Biol Sci 2015; 369:20130296. [PMID: 25092664 DOI: 10.1098/rstb.2013.0296] [Citation(s) in RCA: 74] [Impact Index Per Article: 8.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
As we speak, we use not only the arbitrary form-meaning mappings of the speech channel but also motivated form-meaning correspondences, i.e. iconic gestures that accompany speech (e.g. an inverted V-shaped hand wiggling across gesture space to depict walking). This article reviews what we know about the processing of semantic information from speech and iconic gestures in spoken languages during comprehension of such composite utterances. Several studies have shown that comprehension of iconic gestures involves brain activations known to be involved in semantic processing of speech: i.e. modulation of the electrophysiological component N400, which is sensitive to the ease of semantic integration of a word into the previous context, and recruitment of the left-lateralized frontal-posterior temporal network (left inferior frontal gyrus (IFG), middle temporal gyrus (MTG), and superior temporal gyrus/sulcus (STG/S)). Furthermore, integrating the information from the two channels recruits brain areas such as left IFG, posterior superior temporal sulcus (STS)/MTG, and even motor cortex. Finally, this integration is flexible: the temporal synchrony between the iconic gesture and the speech segment, as well as the perceived communicative intent of the speaker, modulates the integration process. Whether these findings are special to gestures or are shared with actions, other visual accompaniments to speech (e.g. lips), or other visual symbols such as pictures is discussed, as are the implications for a multimodal view of language.
Affiliation(s)
- Aslı Özyürek
- Department of Linguistics, Radboud University Nijmegen, Erasmusplein 1, 6500 HD Nijmegen, The Netherlands; Max Planck Institute for Psycholinguistics, Wundtlaan 1, 6525 JT Nijmegen, The Netherlands
27
28
Pouw WTJL, de Nooijer JA, van Gog T, Zwaan RA, Paas F. Toward a more embedded/extended perspective on the cognitive function of gestures. Front Psychol 2014; 5:359. [PMID: 24795687 PMCID: PMC4006024 DOI: 10.3389/fpsyg.2014.00359] [Citation(s) in RCA: 57] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/30/2014] [Accepted: 04/06/2014] [Indexed: 11/30/2022] Open
Abstract
Gestures are often considered demonstrative of the embodied nature of the mind (Hostetter and Alibali, 2008). In this article, we review current theories and research targeted at the intra-cognitive role of gestures. We ask: how can gestures support the internal cognitive processes of the gesturer? We suggest that extant theories are in a sense disembodied, because they focus solely on embodiment in terms of the sensorimotor neural precursors of gestures. As a result, current theories on the intra-cognitive role of gestures lack the explanatory scope to address how gestures-as-bodily-acts fulfill a cognitive function. On the basis of recent theoretical appeals that focus on the possibly embedded/extended cognitive role of gestures (Clark, 2013), we suggest that gestures are external physical tools of the cognitive system that replace and support otherwise solely internal cognitive processes. That is, gestures give the cognitive system a stable external physical and visual presence that offers a means to think with. We show that there is considerable overlap between the way the human cognitive system has been found to use its environment and how gestures are used during cognitive processes. Lastly, we provide several suggestions for investigating the embedded/extended perspective on the cognitive function of gestures.
Affiliation(s)
- Wim T J L Pouw
- Department of Social Sciences, Institute of Psychology, Erasmus University Rotterdam, Rotterdam, South Holland, Netherlands
- Jacqueline A de Nooijer
- Department of Social Sciences, Institute of Psychology, Erasmus University Rotterdam, Rotterdam, South Holland, Netherlands
- Tamara van Gog
- Department of Social Sciences, Institute of Psychology, Erasmus University Rotterdam, Rotterdam, South Holland, Netherlands
- Rolf A Zwaan
- Department of Social Sciences, Institute of Psychology, Erasmus University Rotterdam, Rotterdam, South Holland, Netherlands
- Fred Paas
- Department of Social Sciences, Institute of Psychology, Erasmus University Rotterdam, Rotterdam, South Holland, Netherlands; Early Start Research Institute, University of Wollongong, Wollongong, NSW, Australia
29
Gurney DJ, Pine KJ, Wiseman R. The gestural misinformation effect: skewing eyewitness testimony through gesture. AMERICAN JOURNAL OF PSYCHOLOGY 2013; 126:301-14. [PMID: 24027944 DOI: 10.5406/amerjpsyc.126.3.0301] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
The susceptibility of eyewitnesses to verbal suggestion has been well documented, although little attention has been paid to the role of nonverbal communication in misinformation. Three experiments are reported; in each, participants watched footage of a crime scene before being questioned about what they had observed. In Experiments 1 and 2, an on-screen interviewer accompanied identically worded questions with gestures that either conveyed accurate information about the scene or conveyed false, misleading information. The misleading gestures significantly influenced recall, and participants' responses were consistent with the gestured information. In Experiment 3, a live interview was conducted, and the gestural misinformation effect was found to be robust; participants were influenced by misleading gestures performed by the interviewer during questioning. These findings provide compelling evidence for the gestural misinformation effect, whereby subtle hand gestures can implant information and distort the testimony of eyewitnesses. The practical and legal implications of these findings are discussed.
Affiliation(s)
- Daniel J Gurney
- School of Psychology, University of Hertfordshire, Hatfield, UK.
30
Ahlsén E, Schwarz A. Features of aphasic gesturing--an exploratory study of features in gestures produced by persons with and without aphasia. CLINICAL LINGUISTICS & PHONETICS 2013; 27:823-836. [PMID: 23889213 DOI: 10.3109/02699206.2013.813077] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/02/2023]
Abstract
The purpose of this study was to see how features of gestures produced by persons with aphasia (PWA) are affected and to relate the findings to possible underlying factors. Spontaneous gestures were studied in two contexts: (i) associated with the production of nouns and verbs and (ii) in relation to word-finding or production difficulties. The method involved assembling two datasets of co-speech gestures, produced by PWA and by persons without aphasia, and coding the gestures for a number of features of expression and content. Features that were affected in the aphasia dataset were gaze, head movements, hand use, and semantic features. The results point to possibly converging explanations, such as generally lower semantic complexity as a direct effect of the aphasia, more cognitive effort, and/or a greater dependence on one-hand gestures leading more indirectly to increased gaze aversion, more head shakes, and lower complexity in gestures in PWA.
Affiliation(s)
- Elisabeth Ahlsén
- SCCIIL Interdisciplinary Center and Division of Communication and Cognition, Department of Applied Information Technology, University of Gothenburg, Göteborg, Sweden
31
Oi M, Saito H, Li Z, Zhao W. Co-speech gesture production in an animation-narration task by bilinguals: a near-infrared spectroscopy study. BRAIN AND LANGUAGE 2013; 125:77-81. [PMID: 23454618 DOI: 10.1016/j.bandl.2013.01.004] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/18/2012] [Revised: 12/02/2012] [Accepted: 01/14/2013] [Indexed: 06/01/2023]
Abstract
To examine the neural mechanism of co-speech gesture production, we measured brain activity of bilinguals during an animation-narration task using near-infrared spectroscopy. The task of the participants was to watch two stories presented as animated cartoons, and then narrate the contents in their first language (L1) and second language (L2), respectively. The participants produced significantly more gestures in L2 than in L1. The number of gestures decreased toward the end of the narration in L1, but not in L2. Analyses of concentration changes of oxygenated hemoglobin revealed that activation of the left inferior frontal gyrus (IFG) significantly increased during gesture production, while activation of the left posterior superior temporal sulcus (pSTS) significantly decreased in line with the increase in the left IFG. These brain activation patterns suggest that the left IFG is involved in gesture production, and that the left pSTS is modulated by the speech load.
Affiliation(s)
- Misato Oi
- Department of Cognitive Informatics, Graduate School of Information Science, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8601, Japan
32
Fu WT, D'Andrea L, Bertel S. Effects of Communication Methods on Communication Patterns and Performance in a Remote Spatial Orientation Task. SPATIAL COGNITION AND COMPUTATION 2013. [DOI: 10.1080/13875868.2012.701680] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/25/2022]
33
34
Maricchiolo F, Livi S, Bonaiuto M, Gnisci A. Hand Gestures and Perceived Influence in Small Group Interaction. SPANISH JOURNAL OF PSYCHOLOGY 2013; 14:755-64. [DOI: 10.5209/rev_sjop.2011.v14.n2.23] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
Abstract
A laboratory study was carried out to establish the relative importance of verbal and gestural behavior, as well as their interaction, for perceived social influence in more or less competitive small groups. Forty women (psychology students) participated in leaderless small-group discussions of different sizes (four-member and eight-member): at the end, each member rated the perceived influence in decision-making of every other member. Verbal dominance coding was based on traditional quantitative conversational dominance (number of talk turns). Gestural coding (conversational, ideational, object-adaptor, and self-adaptor gestures) was based on classical gesture classifications. Besides a substantial effect of verbal dominance, the main result is that the frequency of object-adaptor gestures, conversational gestures (only in large groups), and ideational gestures (in both small and large groups) increases perceived influence scores, particularly when the verbal dominance of the speaker is low.
35
Park E, Kim KJ, del Pobil AP. Facial Recognition Patterns of Children and Adults Looking at Robotic Faces. INT J ADV ROBOT SYST 2012. [DOI: 10.5772/47836] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022] Open
Abstract
The present study investigates whether adults and children exhibit different eye-fixation patterns when they look at human faces, machinelike robotic faces, and humanlike robotic faces. The results from two between-subject experiments showed that children and adults did have different facial recognition patterns; children tended to fixate more on the mouth of both machinelike and humanlike robotic faces than they do on human faces, while adults focused more on the eyes. The implications of notable findings and the limitations of the experiment are discussed.
Affiliation(s)
- Eunil Park
- Department of Interaction Science, Sungkyunkwan University, South Korea
- Ki Joon Kim
- Department of Interaction Science, Sungkyunkwan University, South Korea
- Angel P. del Pobil
- Department of Interaction Science, Sungkyunkwan University, South Korea
- Robotic Intelligence Laboratory, University Jaume-I, Spain
36
37
Wu YC, Coulson S. Are depictive gestures like pictures? commonalities and differences in semantic processing. BRAIN AND LANGUAGE 2011; 119:184-195. [PMID: 21864890 PMCID: PMC3196291 DOI: 10.1016/j.bandl.2011.07.002] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/06/2011] [Revised: 07/04/2011] [Accepted: 07/21/2011] [Indexed: 05/29/2023]
Abstract
Conversation is multi-modal, involving both talk and gesture. Does understanding depictive gestures engage processes similar to those recruited in the comprehension of drawings or photographs? Event-related brain potentials (ERPs) were recorded from neurotypical adults as they viewed spontaneously produced depictive gestures preceded by congruent and incongruent contexts. Gestures were presented either dynamically in short, soundless video-clips, or statically as freeze frames extracted from gesture videos. In a separate ERP experiment, the same participants viewed related or unrelated pairs of photographs depicting common real-world objects. Both object photos and gesture stimuli elicited less negative ERPs from 400 to 600 ms post-stimulus when preceded by matching versus mismatching contexts (dN450). Object photos and static gesture stills also elicited less negative ERPs between 300 and 400 ms post-stimulus (dN300). Findings demonstrate commonalities between the conceptual integration processes underlying the interpretation of iconic gestures and other types of image-based representations of the visual world.
Affiliation(s)
- Ying Choon Wu
- Center for Research in Language, UC San Diego 0526, 9500 Gilman Dr., La Jolla, CA 92093
- Swartz Center for Computational Neuroscience, UC San Diego 0559, 9500 Gilman Dr., La Jolla, CA 92093
- Seana Coulson
- Center for Research in Language, UC San Diego 0526, 9500 Gilman Dr., La Jolla, CA 92093
- Dept. of Cognitive Science, UC San Diego 0515, 9500 Gilman Dr., La Jolla, CA 92093
38
Rowbotham S, Holler J, Lloyd D, Wearden A. How Do We Communicate About Pain? A Systematic Analysis of the Semantic Contribution of Co-speech Gestures in Pain-focused Conversations. JOURNAL OF NONVERBAL BEHAVIOR 2011. [DOI: 10.1007/s10919-011-0122-5] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
39
Raising the Ante of Communication: Evidence for Enhanced Gesture Use in High Stakes Situations. INFORMATION 2011. [DOI: 10.3390/info2040579] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022] Open
40
Hostetter AB, Skirving CJ. The Effect of Visual vs. Verbal Stimuli on Gesture Production. JOURNAL OF NONVERBAL BEHAVIOR 2011. [DOI: 10.1007/s10919-011-0109-2] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022]
41
Galati A, Samuel AG. The role of speech-gesture congruency and delay in remembering action events. LANGUAGE AND COGNITIVE PROCESSES 2011. [DOI: 10.1080/01690965.2010.494846] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
42
Co-Speech Gesture Mimicry in the Process of Collaborative Referring During Face-to-Face Dialogue. JOURNAL OF NONVERBAL BEHAVIOR 2011. [DOI: 10.1007/s10919-011-0105-6] [Citation(s) in RCA: 69] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
43
44
Silverman LB, Bennetto L, Campana E, Tanenhaus MK. Speech-and-gesture integration in high functioning autism. Cognition 2010; 115:380-93. [PMID: 20356575 DOI: 10.1016/j.cognition.2010.01.002] [Citation(s) in RCA: 55] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/17/2008] [Revised: 01/05/2010] [Accepted: 01/11/2010] [Indexed: 11/26/2022]
Abstract
This study examined iconic gesture comprehension in autism, with the goal of assessing whether cross-modal processing difficulties impede speech-and-gesture integration. Participants were 19 adolescents with high functioning autism (HFA) and 20 typical controls matched on age, gender, verbal IQ, and socio-economic status (SES). Gesture comprehension was assessed via quantitative analyses of visual fixations during a video-based task, using the visual world paradigm. Participants' eye movements were recorded while they watched videos of a person describing one of four shapes shown on a computer screen, using speech-and-gesture or speech-only descriptions. Participants clicked on the shape that the speaker described. Since gesture naturally precedes speech, earlier visual fixations to the target shape during speech-and-gesture compared to speech-only trials would suggest immediate integration of auditory and visual information. Analyses of eye movements supported this pattern in control participants but not in individuals with autism: iconic gestures facilitated comprehension in typical individuals but hindered comprehension in those with autism. Cross-modal processing difficulties in autism were not accounted for by impaired unimodal speech or gesture processing. The results have important implications for the treatment of children and adults with this disorder.
Affiliation(s)
- Laura B Silverman
- Department of Clinical and Social Sciences in Psychology, Division of Neurodevelopmental and Behavioral Pediatrics, University of Rochester, Rochester, NY 14627, USA.
45
Habets B, Kita S, Shao Z, Ozyurek A, Hagoort P. The role of synchrony and ambiguity in speech-gesture integration during comprehension. J Cogn Neurosci 2010; 23:1845-54. [PMID: 20201632 DOI: 10.1162/jocn.2010.21462] [Citation(s) in RCA: 62] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
During face-to-face communication, one not only hears speech but also sees a speaker's communicative hand movements. It has been shown that such hand gestures play an important role in communication, where the two modalities influence each other's interpretation. A gesture typically overlaps temporally with coexpressive speech, but the gesture is often initiated before (though not after) the coexpressive speech. The present ERP study investigated what degree of asynchrony between speech and gesture onsets is optimal for semantic integration of the concurrent gesture and speech. Videos of a person gesturing were combined with speech segments that were either semantically congruent or incongruent with the gesture. Although gesture and speech always overlapped in time, they were presented with three different degrees of asynchrony. In the SOA 0 condition, gesture onset and speech onset were simultaneous. In the SOA 160 and 360 conditions, speech was delayed by 160 and 360 msec, respectively. ERPs time-locked to speech onset showed a significant difference between semantically congruent and incongruent gesture-speech combinations on the N400 for the SOA 0 and 160 conditions. No significant difference was found for the SOA 360 condition. These results imply that speech and gesture are integrated most efficiently when the difference in onsets does not exceed a certain time span, because iconic gestures need speech to be disambiguated in a way relevant to the speech context.
46
Hugill N, Fink B, Neave N. The role of human body movements in mate selection. EVOLUTIONARY PSYCHOLOGY 2010; 8:66-89. [PMID: 22947780 PMCID: PMC10480986 DOI: 10.1177/147470491000800107] [Citation(s) in RCA: 25] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/05/2009] [Accepted: 02/05/2010] [Indexed: 10/17/2023] Open
Abstract
It is common scientific knowledge that much of what we convey in conversation is expressed not through the words' meaning alone, but also through our gestures, postures, and body movements. This non-verbal mode is possibly rooted firmly in our human evolutionary heritage, and as such, some scientists argue that it serves as a fundamental assessment and expression tool for our inner qualities. Studies of non-verbal communication have established that a universal, culture-free, non-verbal sign system exists that is available to all individuals for negotiating social encounters. Thus, it is not only the kind of gestures and expressions humans use in social communication that matters, but also the way these movements are performed, as this seems to convey key information about an individual's quality. Dance, for example, is a special form of movement that can be observed in human courtship displays. Recent research suggests that people are sensitive to variation in dance movements, and that dance performance provides information about an individual's mate quality in terms of health and strength. This article reviews the role of body movement in human non-verbal communication, and highlights its significance in human mate preferences in order to promote future work in this research area within the evolutionary psychology framework.
Affiliation(s)
- Nadine Hugill
- Department of Sociobiology/Anthropology, Institute of Zoology and Anthropology, University of Göttingen, Göttingen, Germany
- Bernhard Fink
- Department of Sociobiology/Anthropology, Institute of Zoology and Anthropology, University of Göttingen, Göttingen, Germany
- Nick Neave
- Department of Psychology, School of Psychology and Sport Sciences, Northumbria University, Newcastle upon Tyne, United Kingdom
Collapse
|
47
|
Nicoladis E, Pika S, Marentette P. Do French-English bilingual children gesture more than monolingual children? JOURNAL OF PSYCHOLINGUISTIC RESEARCH 2009; 38:573-85. [PMID: 19521776 DOI: 10.1007/s10936-009-9121-7] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/13/2007] [Accepted: 05/29/2009] [Indexed: 05/23/2023]
Abstract
Previous studies have shown that bilingual adults use more gestures than English monolinguals. Because no study has compared the gestures of bilinguals and monolinguals in both languages, the high gesture rate could be due to transfer from a high-gesture language or could result from the use of gesture to aid linguistic access. In this study we tried to distinguish between these causes by comparing the gesture rate of 10 French-English bilingual preschoolers with that of 10 French and 10 English monolinguals. All were between 4 and 6 years of age. The children were asked to watch a cartoon and tell the story back. The results showed that the bilingual children gestured more than either group of monolinguals, and at the same rate in both French and English. These results suggest that the bilinguals were not gesturing because they were transferring a high gesture rate from one language to the other. We argue that bilinguals might gesture more than monolinguals to help formulate their spoken message.
Affiliation(s)
- Elena Nicoladis, Department of Psychology, University of Alberta, P-217 Biological Sciences Building, Edmonton, AB, Canada
|
48
|
McGregor KK, Rohlfing KJ, Bean A, Marschner E. Gesture as a support for word learning: the case of under. JOURNAL OF CHILD LANGUAGE 2009; 36:807-828. [PMID: 18947455 PMCID: PMC3328788 DOI: 10.1017/s0305000908009173] [Citation(s) in RCA: 26] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
Forty children, aged 1;8 to 2;0, participated in one of three training conditions meant to enhance their comprehension of the spatial term under: the +Gesture group viewed a symbolic gesture for under during training; the +Photo group viewed a still photograph of objects in the under relationship; the Model Only group did not receive supplemental symbolic support. Children's knowledge of under was measured before, immediately after, and two to three days after training. A gesture advantage was revealed when the gains exhibited by the groups on untrained materials (but not trained materials) were compared at delayed post-test (but not immediate post-test). Gestured input promoted more robust knowledge of the meaning of under, knowledge that was less tied to contextual familiarity and more prone to consolidation. Gestured input likely reduced cognitive load while emphasizing both the location and the movement relevant to the meaning of under.
|
49
|
Cook SW, Tanenhaus MK. Embodied communication: speakers' gestures affect listeners' actions. Cognition 2009; 113:98-104. [PMID: 19682672 DOI: 10.1016/j.cognition.2009.06.006] [Citation(s) in RCA: 95] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/14/2007] [Revised: 05/28/2009] [Accepted: 06/05/2009] [Indexed: 11/18/2022]
Abstract
We explored how speakers and listeners use hand gestures as a source of perceptual-motor information during naturalistic communication. After solving the Tower of Hanoi task either with real objects or on a computer, speakers explained the task to listeners. Speakers' hand gestures, but not their speech, reflected properties of the particular objects and the actions that they had previously used to solve the task. Speakers who solved the problem with real objects used more grasping handshapes and produced more curved trajectories during the explanation. Listeners who observed explanations from speakers who had previously solved the problem with real objects subsequently treated computer objects more like real objects; their mouse trajectories revealed that they lifted the objects in conjunction with moving them sideways, and this behavior was related to the particular gestures that were observed. These findings demonstrate that hand gestures are a reliable source of perceptual-motor information during human communication.
Affiliation(s)
- Susan Wagner Cook, Department of Psychology, The University of Iowa, E11 Seashore Hall, Iowa City, IA 52242, USA
|