1. Hyusein G, Göksun T. Give your ideas a hand: the role of iconic hand gestures in enhancing divergent creative thinking. Psychological Research 2024; 88:1298-1313. PMID: 38538819; PMCID: PMC11142943; DOI: 10.1007/s00426-024-01932-1.
Abstract
Hand gestures play an integral role in multimodal language and communication. Even though the self-oriented functions of gestures, such as activating a speaker's lexicon and maintaining visuospatial imagery, have been emphasized, gestures' functions in creative thinking are not well established. In the current study, we investigated the role of iconic gestures in verbal divergent thinking, a creative thinking process related to generating many novel ideas. Based on previous findings, we hypothesized that iconic gesture use would facilitate divergent thinking in young adults, especially those with high mental imagery skills. Participants performed Guilford's Alternative Uses Task in a gesture-spontaneous and in a gesture-encouraged condition. We measured fluency (number of ideas), originality (uniqueness of ideas), flexibility (number of idea categories), and elaboration (number of details) in divergent thinking. The results showed that producing iconic gestures in the gesture-encouraged condition positively predicted fluency, originality, and elaboration. In the gesture-spontaneous condition, producing iconic gestures also positively predicted elaboration but negatively predicted flexibility. Mental imagery skills did not interact with the effects of gestures on divergent thinking. These results suggest that iconic gestures are a promising candidate for enhancing almost all aspects of divergent thinking. Overall, the current study adds a new dimension to the self-oriented function of iconic gestures, that is, their contribution to creative thinking.
Affiliations
- Tilbe Göksun: Department of Psychology, Koç University, Istanbul, Turkey
2. Kim J, Hazan V, Tuomainen O, Davis C. Partner-directed gaze and co-speech hand gestures: effects of age, hearing loss and noise. Front Psychol 2024; 15:1324667. PMID: 38882511; PMCID: PMC11178134; DOI: 10.3389/fpsyg.2024.1324667.
Abstract
Research on the adaptations talkers make to different communication conditions during interactive conversations has primarily focused on speech signals. We extended this type of investigation to two other important communicative signals, partner-directed gaze and iconic co-speech hand gestures, with the aim of determining whether the adaptations made by older adults differ from those of younger adults across communication conditions. We recruited 57 pairs of participants, comprising 57 primary talkers and 57 secondary talkers. Primary talkers consisted of three groups: 19 older adults with mild hearing loss (older adult-HL); 17 older adults with normal hearing (older adult-NH); and 21 younger adults. The DiapixUK "spot the difference" conversation-based task was used to elicit conversations in participant pairs. One easy (No Barrier: NB) and three difficult communication conditions were tested. The three difficult conditions consisted of two in which the primary talker could hear clearly but the secondary talker could not, due to multi-talker babble noise (BAB1) or a less familiar hearing loss simulation (HLS), and one in which both the primary and secondary talkers heard each other in babble noise (BAB2). For primary talkers, we measured the mean number of partner-directed gazes, the mean total gaze duration, and the mean number of co-speech hand gestures. We found robust effects of communication condition that interacted with participant group. Effects of age were found for both gaze and gesture in BAB1: older adult-NH participants looked and gestured less than younger adults did when the secondary talker experienced babble noise. For hearing status, a difference in gaze between older adult-NH and older adult-HL was found for the BAB1 condition; for gesture, this difference was significant in all three difficult communication conditions (older adult-HL participants gazed and gestured more).
We propose that the age effect may be due to a decline in older adults' attention to cues signaling how well a conversation is progressing. To explain the hearing status effect, we suggest that older adults' attentional decline is offset by hearing loss because these participants have learned to pay greater attention to visual cues for understanding speech.
Affiliations
- Jeesun Kim: The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, NSW, Australia
- Valerie Hazan: Speech Hearing and Phonetic Sciences, University College London, London, United Kingdom
- Outi Tuomainen: Department of Linguistics, University of Potsdam, Potsdam, Germany
- Chris Davis: The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, NSW, Australia
3. Parma C, Doria F, Zulueta A, Boscarino M, Giani L, Lunetta C, Parati EA, Picozzi M, Sattin D. Does body memory exist? A review of models, approaches and recent findings useful for neurorehabilitation. Brain Sci 2024; 14:542. PMID: 38928542; PMCID: PMC11201876; DOI: 10.3390/brainsci14060542.
Abstract
Over the past twenty years, scientific research on body representations has grown significantly, with Body Memory (BM) emerging as a prominent area of interest in neurorehabilitation. Compared to other body representations, BM stands out as one of the most obscure due to the multifaceted nature of the concept of "memory" itself, which includes various aspects (such as implicit vs. explicit, conscious vs. unconscious). The concept of body memory originates from the field of phenomenology and has been developed by research groups studying embodied cognition. In this narrative review, we aim to present compelling evidence from recent studies that explore various definitions and explanatory models of BM. Additionally, we will provide a comprehensive overview of the empirical settings used to examine BM. The results can be categorized into two main areas: (i) how the body influences our memories, and (ii) how memories, in their broadest sense, could generate and/or influence metarepresentations: the ability to reflect on or make inferences about one's own cognitive representations or those of others. We present studies that emphasize the significance of BM in experimental settings involving patients with neurological and psychiatric disorders, ultimately analyzing these findings from an ontogenic perspective.
Affiliations
- Chiara Parma: Istituti Clinici Scientifici Maugeri IRCCS, Health Directorate, Via Camaldoli 64, 20138 Milan, Italy; PhD Program, Medicina Clinica e Sperimentale e Medical Humanities, Insubria University, 21100 Varese, Italy
- Federica Doria: Istituti Clinici Scientifici Maugeri IRCCS, Health Directorate, Via Camaldoli 64, 20138 Milan, Italy
- Aida Zulueta: Istituti Clinici Scientifici Maugeri IRCCS, Labion, Via Camaldoli 64, 20138 Milan, Italy
- Marilisa Boscarino: Neurorehabilitation Department, Istituti Clinici Scientifici Maugeri IRCCS, Via Camaldoli 64, 20138 Milan, Italy
- Luca Giani: Neurorehabilitation Department, Istituti Clinici Scientifici Maugeri IRCCS, Via Camaldoli 64, 20138 Milan, Italy
- Christian Lunetta: Amyotrophic Lateral Sclerosis Unit, Neurorehabilitation Department, Istituti Clinici Scientifici Maugeri IRCCS, Via Camaldoli 64, 20138 Milan, Italy
- Eugenio Agostino Parati: Neurorehabilitation Department, Istituti Clinici Scientifici Maugeri IRCCS, Via Camaldoli 64, 20138 Milan, Italy
- Mario Picozzi: Center for Clinical Ethics, Biotechnology and Life Sciences Department, Insubria University, 21100 Varese, Italy
- Davide Sattin: Istituti Clinici Scientifici Maugeri IRCCS, Health Directorate, Via Camaldoli 64, 20138 Milan, Italy
4. Hostetter AB, Rascon-Powell DK. F@#k pain! The effect of taboo language and gesture on the experience of pain. Psychol Rep 2024; 127:577-593. PMID: 36075480; DOI: 10.1177/00332941221125776.
Abstract
Swearing has been shown to reduce the experience of pain in a cold pressor task, and this effect has been suggested to be due to state aggression. In the present experiment, we examined whether producing a taboo gesture (i.e., the American gesture of raising the middle finger) reduces the experience of pain in a manner similar to the effect shown for producing a taboo word. A total of 111 participants completed two cold pressor trials in a 2 (Language vs. Gesture) × 2 (Taboo vs. Neutral) mixed design. We found that producing a taboo act in either language or gesture increased pain tolerance on the cold pressor task and reduced perceived pain compared to producing a neutral act. We found no changes in state aggression or heart rate. These results suggest that the pain-reducing effect of swearing is shared by taboo gesture and that these effects are likely not due to changes in state aggression.
5. Patterson ML, Fridlund AJ, Crivelli C. Four misconceptions about nonverbal communication. Perspectives on Psychological Science 2023; 18:1388-1411. PMID: 36791676; PMCID: PMC10623623; DOI: 10.1177/17456916221148142.
Abstract
Research and theory in nonverbal communication have made great advances toward understanding the patterns and functions of nonverbal behavior in social settings. Progress has been hindered, we argue, by presumptions about nonverbal behavior that follow from both received wisdom and faulty evidence. In this article, we document four persistent misconceptions about nonverbal communication: that people communicate using decodable body language; that they have a stable personal space by which they regulate contact with others; that they express emotion using universal, evolved, iconic, categorical facial expressions; and that they can deceive and detect deception using dependable telltale clues. We show how these misconceptions permeate research as well as the practices of popular behavior experts, with consequences that extend from intimate relationships to the boardroom and courtroom and even to the arena of international security. Notwithstanding these misconceptions, existing frameworks of nonverbal communication are being challenged by more comprehensive systems approaches and by virtual technologies that ambiguate the roles and identities of interactants and the contexts of interaction.
Affiliations
- Alan J. Fridlund: Department of Psychological and Brain Sciences, University of California, Santa Barbara
6. Giovannelli F, Borgheresi A, Lucidi G, Squitieri M, Gavazzi G, Suppa A, Berardelli A, Viggiano MP, Cincotta M. Language-related motor facilitation in Italian Sign Language signers. Cereb Cortex 2023. PMID: 36646456; DOI: 10.1093/cercor/bhac536.
Abstract
Linguistic tasks facilitate corticospinal excitability, as revealed by increased motor evoked potentials (MEPs) induced by transcranial magnetic stimulation (TMS) in the dominant hand. This modulation of primary motor cortex (M1) excitability may reflect the relationship between speech and gestures. It is conceivable that in healthy individuals who use a sign language this cortical excitability modulation could be rearranged. The aim of this study was to evaluate the effect of spoken language tasks on M1 excitability in a group of hearing signers. Ten hearing Italian Sign Language (LIS) signers and 16 non-signer healthy controls participated. Single-pulse TMS was applied to either M1 hand area at baseline and during different tasks: (i) reading aloud, (ii) silent reading, (iii) oral movements, (iv) syllabic phonation, and (v) looking at meaningless non-letter strings. Overall, M1 excitability during the linguistic and non-linguistic tasks was higher in the LIS group than in the control group. In the LIS group, MEPs were significantly larger during reading aloud, silent reading, and non-verbal oral movements, regardless of the hemisphere. These results suggest that in hearing signers there is a different modulation of the functional connectivity between the speech-related brain network and the motor system.
Affiliations
- Fabio Giovannelli: Department of Neuroscience, Psychology, Drug Research and Child's Health (NEUROFARBA), Section of Psychology, University of Florence, Florence 50135, Italy
- Alessandra Borgheresi: Unit of Neurology of Florence, Central Tuscany Local Health Authority, Florence 50143, Italy
- Giulia Lucidi: Unit of Neurology of Florence, Central Tuscany Local Health Authority, Florence 50143, Italy
- Martina Squitieri: Unit of Neurology of Florence, Central Tuscany Local Health Authority, Florence 50143, Italy
- Gioele Gavazzi: Department of Neuroscience, Psychology, Drug Research and Child's Health (NEUROFARBA), Section of Psychology, University of Florence, Florence 50135, Italy
- Antonio Suppa: Department of Human Neurosciences, Sapienza University of Rome, Rome 00185, Italy; IRCCS Neuromed, Pozzilli (IS) 86077, Italy
- Alfredo Berardelli: Department of Human Neurosciences, Sapienza University of Rome, Rome 00185, Italy; IRCCS Neuromed, Pozzilli (IS) 86077, Italy
- Maria Pia Viggiano: Department of Neuroscience, Psychology, Drug Research and Child's Health (NEUROFARBA), Section of Psychology, University of Florence, Florence 50135, Italy
- Massimo Cincotta: Unit of Neurology of Florence, Central Tuscany Local Health Authority, Florence 50143, Italy
7. Nicoladis E, Aneja A, Sidhu J, Dhanoa A. Is there a correlation between the use of representational gestures and self-adaptors? Journal of Nonverbal Behavior 2022. DOI: 10.1007/s10919-022-00401-w.
8. Arslan B, Göksun T. Aging, gesture production, and disfluency in speech: a comparison of younger and older adults. Cogn Sci 2022; 46:e13098. PMID: 35122305; DOI: 10.1111/cogs.13098.
Abstract
Age-related changes are observed in the speech and gestures of neurotypical individuals. Older adults are more disfluent in speech and use fewer representational gestures (e.g., holding two hands close to each other to mean small) compared to younger adults. Using gestures, especially representational gestures, is common in difficult tasks to aid the conceptualization process and to facilitate lexical access. This study investigates how aging can affect gesture production and the co-occurrence of gesture and speech disfluency. We elicited speech and gesture samples from younger and older adults (N = 60) using a painting description task that provided concrete and abstract contexts. Results indicated that although the two age groups showed comparable overall speech disfluency and gesture rates, they differed in how their disfluencies and gestures were distributed across specific categories. Moreover, the proportion of speech disfluencies that occurred with a gesture was significantly higher for younger than for older adults. However, the two age groups were comparable in the proportion of gestures accompanied by a speech disfluency. These findings suggest that younger adults' language production system might be better at drawing on other modalities, that is, gesture, to resolve temporary problems in speech planning. However, from a gesture perspective, it might be difficult to differentiate between gestures' self-oriented and communicative functions and to understand their role in speech facilitation. Focusing on specific cases where speech disfluency and gestures co-occur, and considering individual differences, might bring insight into multimodal communication.
9. Pasternak R, Tieu L. Co-linguistic content inferences: from gestures to sound effects and emoji. Q J Exp Psychol (Hove) 2022; 75:1828-1843. PMID: 35114858; DOI: 10.1177/17470218221080645.
Abstract
Among other uses, co-speech gestures can contribute additional semantic content to the spoken utterances with which they coincide. A growing body of research is dedicated to understanding how inferences from gestures interact with logical operators in speech, including negation ("not"/"n't"), modals (e.g., "might"), and quantifiers (e.g., "each", "none", "exactly one"). A related but less-addressed question is what kinds of meaningful content other than gestures can evince this same behavior; this is in turn connected to the much broader question of what properties of gestures are responsible for how they interact with logical operators. We present two experiments investigating sentences with co-speech sound effects and co-text emoji in lieu of gestures, revealing a remarkably similar inference pattern to that of co-speech gestures. The results suggest that gestural inferences do not behave the way they do because of any traits specific to gestures, and that the inference pattern extends to a much broader range of content.
Affiliations
- Lyn Tieu: Western Sydney University, Office of the Pro Vice-Chancellor (Research & Innovation), Penrith, Australia
10. Maessen B, Rombouts E, Maes B, Zink I. The relation between gestures and stuttering in individuals with Down syndrome. Journal of Applied Research in Intellectual Disabilities 2022; 35:761-776. DOI: 10.1111/jar.12980.
Affiliations
- Babette Maessen: Department of Neurosciences, Experimental Otorhinolaryngology, Leuven, Belgium
- Ellen Rombouts: Department of Neurosciences, Experimental Otorhinolaryngology, Leuven, Belgium
- Bea Maes: Faculty of Psychology and Educational Sciences, Parenting and Special Education Research Group, Leuven, Belgium
- Inge Zink: Department of Neurosciences, Experimental Otorhinolaryngology, Leuven, Belgium
11. Stark BC, Cofoid C. Task-specific iconic gesturing during spoken discourse in aphasia. American Journal of Speech-Language Pathology 2022; 31:30-47. PMID: 34033493; PMCID: PMC9135014; DOI: 10.1044/2021_ajslp-20-00271.
Abstract
Purpose: In persons living with aphasia, we explored the relationship between iconic gesture production during spontaneous speech and discourse task, spoken language, and demographic information.
Method: Employing the AphasiaBank database, we coded iconic gestures in 75 speakers with aphasia during two spoken discourse tasks: a procedural narrative, which involved participants telling the experimenter how to make a sandwich ("Sandwich"), and a picture sequence narrative, which had participants describe a picture sequence to the experimenter ("Window"). Forty-three participants produced a gesture during both tasks, and we further evaluated data from this subgroup as a more direct comparison between tasks.
Results: More iconic gestures, at a higher rate, were produced during the procedural narrative. For both tasks, there was a relationship between iconic gesture rate, modeled as iconic gestures per word, and metrics of language dysfluency extracted from the discourse task, as well as a metric of fluency extracted from a standardized battery. Iconic gesture production was correlated with aphasia duration, driven by performance during only a single task (Window), but not with other demographic metrics, such as aphasia severity or age. We also provide preliminary evidence for task differences shown through the lens of two types of iconic gestures.
Conclusions: While speech-language pathologists have utilized gesture in therapy for poststroke aphasia, due to its possible facilitatory role in spoken language, there has been considerably less work on understanding how gesture differs across naturalistic tasks and how we can best use this information to assess gesture in aphasia and improve multimodal treatment. Furthermore, our results contribute to gesture theory, particularly regarding the role of gesture across naturalistic tasks and its relationship with spoken language. Supplemental material: https://doi.org/10.23641/asha.14614941.
Affiliations
- Brielle C. Stark: Department of Speech, Language and Hearing Sciences, Indiana University Bloomington
- Caroline Cofoid: Department of Speech, Language and Hearing Sciences, Indiana University Bloomington
12. Sun J, Wang Z, Tian X. Manual gestures modulate early neural responses in loudness perception. Front Neurosci 2021; 15:634967. PMID: 34539324; PMCID: PMC8440995; DOI: 10.3389/fnins.2021.634967.
Abstract
How different sensory modalities interact to shape perception is a fundamental question in cognitive neuroscience. Previous studies in audiovisual interaction have focused on abstract levels such as categorical representation (e.g., McGurk effect). It is unclear whether the cross-modal modulation can extend to low-level perceptual attributes. This study used motional manual gestures to test whether and how the loudness perception can be modulated by visual-motion information. Specifically, we implemented a novel paradigm in which participants compared the loudness of two consecutive sounds whose intensity changes around the just noticeable difference (JND), with manual gestures concurrently presented with the second sound. In two behavioral experiments and two EEG experiments, we investigated our hypothesis that the visual-motor information in gestures would modulate loudness perception. Behavioral results showed that the gestural information biased the judgment of loudness. More importantly, the EEG results demonstrated that early auditory responses around 100 ms after sound onset (N100) were modulated by the gestures. These consistent results in four behavioral and EEG experiments suggest that visual-motor processing can integrate with auditory processing at an early perceptual stage to shape the perception of a low-level perceptual attribute such as loudness, at least under challenging listening conditions.
Affiliations
- Jiaqiu Sun: Division of Arts and Sciences, New York University Shanghai, Shanghai, China; NYU-ECNU Institute of Brain and Cognitive Science, New York University Shanghai, Shanghai, China
- Ziqing Wang: NYU-ECNU Institute of Brain and Cognitive Science, New York University Shanghai, Shanghai, China; Shanghai Key Laboratory of Brain Functional Genomics, Ministry of Education, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
- Xing Tian: Division of Arts and Sciences, New York University Shanghai, Shanghai, China; NYU-ECNU Institute of Brain and Cognitive Science, New York University Shanghai, Shanghai, China; Shanghai Key Laboratory of Brain Functional Genomics, Ministry of Education, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
13. Tversky B, Jamalian A. Thinking tools: gestures change thought about time. Top Cogn Sci 2021; 13:750-776. PMID: 34298590; DOI: 10.1111/tops.12566.
Abstract
Our earliest tools are our bodies. Our hands raise and turn and toss and carry and push and pull; our legs walk and climb and kick, allowing us to move and act in the world and to create the multitude of artifacts that improve our lives. The list of actions made by our hands and feet and other parts of our bodies is long. What is more remarkable is that we turn those actions in the world into actions on thought through gestures, language, and graphics, thereby creating cognitive tools that expand the mind. The focus here is gesture: gestures transform actions on perceptible objects into actions on imagined thoughts, carrying meaning with them rapidly, precisely, and directly. We review evidence showing that gestures enhance our own thinking and change the thought of others. We illustrate the power of gestures in studies showing that gestures uniquely change conceptions of time (from sequential to simultaneous, from sequential to cyclical, and from a perspective embedded in a timeline to an external perspective looking onto a timeline) and by so doing obviate the ambiguities of an embedded perspective. We draw parallels between representations in gesture and in graphics; both use marks or actions arrayed in space to communicate more immediately than symbolic language.
Affiliations
- Barbara Tversky: Human Development, Columbia Teachers College; Department of Psychology, Stanford University
- Azadeh Jamalian: Human Development, Columbia Teachers College; The GIANT Room, New York
14. Gordon R, Ramani GB. Integrating embodied cognition and information processing: a combined model of the role of gesture in children's mathematical environments. Front Psychol 2021; 12:650286. PMID: 33897559; PMCID: PMC8062855; DOI: 10.3389/fpsyg.2021.650286.
Abstract
Children learn and use various strategies to solve math problems. One way children's math learning can be supported is through their use of and exposure to hand gestures. Children's self-produced gestures can reveal unique, math-relevant knowledge that is not contained in their speech. Additionally, these gestures can assist with their math learning and problem solving by supporting their cognitive processes, such as executive function. The gestures that children observe during math instruction are also linked to supporting cognition. Specifically, children are better able to learn, retain, and generalize knowledge about math when that information is presented within the gestures that accompany an instructor's speech. To date, no conceptual model outlines how these gestures and the math environment are connected, or how they may interact with children's underlying cognitive capacities such as executive function. In this review, we propose a new model based on an integration of the information processing approach and the theory of embodied cognition. We provide an in-depth review of the related literature and consider how prior research aligns with each link within the proposed model. Finally, we discuss the utility of the proposed model as it pertains to future research endeavors.
Affiliations
- Raychel Gordon: Department of Human Development and Quantitative Methodology, University of Maryland, College Park, MD, United States
- Geetha B. Ramani: Department of Human Development and Quantitative Methodology, University of Maryland, College Park, MD, United States
15. Clingan-Siverly S, Nelson PM, Göksun T, Demir-Lira ÖE. Spatial thinking in term and preterm-born preschoolers: relations to parent-child speech and gesture. Front Psychol 2021; 12:651678. PMID: 33967912; PMCID: PMC8103033; DOI: 10.3389/fpsyg.2021.651678.
Abstract
Spatial skills predict important life outcomes, such as mathematical achievement or entrance into Science, Technology, Engineering, and Mathematics (STEM) disciplines. Children vary significantly in their spatial performance even before they enter formal schooling. One correlate of children's spatial performance is the spatial language they produce and hear from others, such as their parents. Because the emphasis has been on spatial language, less is known about the role of hand gestures in children's spatial development. Some children are more likely to fall behind in their spatial skills than others; children born premature (gestational age <37 weeks) constitute such a risk group. Here, we compared the performance of term and preterm-born children on two non-verbal spatial tasks: mental transformation and block design. We also examined relations of children's performance on these tasks to parental spatial language and gesture input, and to their own production of spatial language and gesture during an independent puzzle play interaction. We found that while term and preterm-born children (n = 40) as a group did not differ in mental transformation or block design performance, children varied widely in their performance within each group. The variability in mental transformation scores was predicted by both a subset of spatial words (what aspects of spatial information) and all spatial gestures children produced. Children's spatial language and gesture were in turn related to their parents' spatial language and gesture. Parental spatial language and gesture had an indirect relation to children's mental transformation, but not block design, scores via children's spatial language and gesture use. Overall, the results highlight the unique contributions of speech and gesture in communicating spatial information and predicting children's spatial performance.
Affiliation(s)
- Sam Clingan-Siverly
- Department of Psychological and Brain Sciences, University of Iowa, Iowa City, IA, United States
- Paige M. Nelson
- Department of Psychological and Brain Sciences, University of Iowa, Iowa City, IA, United States
- Tilbe Göksun
- Department of Psychology, Koç University, Istanbul, Turkey
- Ö. Ece Demir-Lira
- Department of Psychological and Brain Sciences, University of Iowa, Iowa City, IA, United States
- DeLTA Center, University of Iowa, Iowa City, IA, United States
- Iowa Neuroscience Institute, University of Iowa, Iowa City, IA, United States
|
16
|
Morin-Lessard E, Hentges RF, Tough SC, Graham SA. Developmental Pathways Between Infant Gestures and Symbolic Actions, and Children's Communicative Skills at Age 5: Findings From the All Our Families Pregnancy Cohort. Child Dev 2021; 92:799-810. [PMID: 33835495 DOI: 10.1111/cdev.13567] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
Abstract
Using data from the All Our Families study, a longitudinal study of 1,992 mother-child dyads in Canada (47.7% female; 81.9% White), we examined the developmental pathways between infant gestures and symbolic actions and communicative skills at age 5. Communicative gestures at age 12 months (e.g., pointing, nodding the head "yes"), obtained via parental report, predicted stronger general communicative skills at age 5 years. Moreover, greater use of symbolic actions (e.g., "feeding" a stuffed animal with a bottle) indirectly predicted increased communicative skills at age 5 via increased productive vocabulary at 24 months. These pathways support the hypothesis that children's communicative skills during the transition to kindergarten emerge from a chain of developmental abilities starting with gestures and symbolic actions during infancy.
|
17
|
Trettenbrein PC, Zaccarella E. Controlling Video Stimuli in Sign Language and Gesture Research: The OpenPoseR Package for Analyzing OpenPose Motion-Tracking Data in R. Front Psychol 2021; 12:628728. [PMID: 33679550 PMCID: PMC7932993 DOI: 10.3389/fpsyg.2021.628728] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/12/2020] [Accepted: 01/29/2021] [Indexed: 01/08/2023] Open
Abstract
Researchers in the fields of sign language and gesture studies frequently present their participants with video stimuli showing actors performing linguistic signs or co-speech gestures. Up to now, such video stimuli have mostly been controlled only for some of the technical aspects of the video material (e.g., duration of clips, encoding, framerate, etc.), leaving open the possibility that systematic differences in video stimulus materials may be concealed in the actual motion properties of the actor's movements. Computer vision methods such as OpenPose enable the fitting of body-pose models to the consecutive frames of a video clip and thereby make it possible to recover the movements performed by the actor in a particular video clip without the use of a marker-based motion-tracking system during recording. The OpenPoseR package provides a straightforward and reproducible way of working with these body-pose model data extracted from video clips using OpenPose, allowing researchers in the fields of sign language and gesture studies to quantify the amount of motion (velocity and acceleration) pertaining only to the movements performed by the actor in a video clip. These quantitative measures can be used for controlling differences in the movements of an actor in stimulus video clips or, for example, between different conditions of an experiment. In addition, the package also provides a set of functions for generating plots for data visualization, as well as an easy-to-use way of automatically extracting metadata (e.g., duration, framerate, etc.) from large sets of video files.
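The abstract's core quantitative measures (per-clip velocity and acceleration of the actor's movements, derived from frame-by-frame keypoint positions) can be illustrated with a short sketch. This is a hypothetical NumPy reconstruction of the kind of computation involved, not OpenPoseR's actual API; the function and variable names are assumptions for illustration only.

```python
import numpy as np

def motion_measures(keypoints, fps):
    """Summarize motion in a body-pose time series.

    keypoints: array of shape (n_frames, n_points, 2) with x/y pixel
    coordinates of tracked keypoints (e.g., wrists) in each video frame.
    fps: frame rate of the clip.

    Returns mean speed and mean acceleration magnitude, analogous to
    per-clip motion summaries used to compare stimulus videos.
    """
    # Frame-to-frame displacement scaled to pixels per second.
    velocity = np.diff(keypoints, axis=0) * fps          # (n_frames-1, n_points, 2)
    speed = np.linalg.norm(velocity, axis=-1)            # scalar speed per keypoint
    acceleration = np.diff(velocity, axis=0) * fps       # change in velocity per second
    accel_mag = np.linalg.norm(acceleration, axis=-1)
    return float(speed.mean()), float(accel_mag.mean())

# Two toy clips at 30 fps: a static "actor" and one whose single tracked
# keypoint moves 2 pixels per frame at constant velocity.
static = np.zeros((30, 1, 2))
moving = np.array([[[t * 2.0, 0.0]] for t in range(30)])
v_moving, a_moving = motion_measures(moving, 30)
```

For the moving clip, constant 2 px/frame motion at 30 fps yields a mean speed of 60 px/s and zero acceleration, which is the sense in which such summaries separate "how much" an actor moves from merely technical clip properties.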
Affiliation(s)
- Patrick C Trettenbrein
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; International Max Planck Research School on Neuroscience of Communication: Structure, Function, and Plasticity (IMPRS NeuroCom), Leipzig, Germany
- Emiliano Zaccarella
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
|
18
|
Clough S, Duff MC. The Role of Gesture in Communication and Cognition: Implications for Understanding and Treating Neurogenic Communication Disorders. Front Hum Neurosci 2020; 14:323. [PMID: 32903691 PMCID: PMC7438760 DOI: 10.3389/fnhum.2020.00323] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/02/2020] [Accepted: 07/21/2020] [Indexed: 01/20/2023] Open
Abstract
When people talk, they gesture. Gesture is a fundamental component of language that contributes meaningful and unique information to a spoken message and reflects the speaker's underlying knowledge and experiences. Theoretical perspectives of speech and gesture propose that they share a common conceptual origin and have a tightly integrated relationship, overlapping in time, meaning, and function to enrich the communicative context. We review a robust literature from the field of psychology documenting the benefits of gesture for communication for both speakers and listeners, as well as its important cognitive functions for organizing spoken language and facilitating problem-solving, learning, and memory. Despite this evidence, gesture has been relatively understudied in populations with neurogenic communication disorders. While a few studies have examined the rehabilitative potential of gesture in these populations, others have ignored gesture entirely or even discouraged its use. We review the literature characterizing gesture production and its role in intervention for people with aphasia, and describe the much sparser literature on gesture in cognitive communication disorders including right hemisphere damage, traumatic brain injury, and Alzheimer's disease. The neuroanatomical and behavioral profiles of these patient populations provide a unique opportunity to test theories of the relationship of speech and gesture and advance our understanding of their neural correlates. This review highlights several gaps in the field of communication disorders which may serve as a bridge for applying the psychological literature of gesture to the study of language disorders. Such future work would benefit from considering theoretical perspectives of gesture and using more rigorous and quantitative empirical methods in its approach. We discuss implications for leveraging gesture to explore its untapped potential in understanding and rehabilitating neurogenic communication disorders.
Affiliation(s)
- Sharice Clough
- Communication and Memory Lab, Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN, United States
|
19
|
Miller HE, Andrews CA, Simmering VR. Speech and Gesture Production Provide Unique Insights Into Young Children's Spatial Reasoning. Child Dev 2020; 91:1934-1952. [PMID: 32720714 DOI: 10.1111/cdev.13396] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
Abstract
This study took a novel approach to understanding the role of language in spatial development by combining approaches from spatial language and gesture research. It analyzed forty-three 4.5- to 6-year-olds' speech and gesture production during explanations of their reasoning on the Spatial Analogies and Children's Mental Transformation Tasks. Results showed that speech and gesture relevant for solving the trials (disambiguating correct choices) predicted spatial performance when controlling for age, gender, and the total spatial words and gestures produced. Children performed the spatial tasks well if they produced relevant information either verbally through speech or nonverbally through gesture. These results highlight the importance of focusing not only on the concepts children can reference but also on how such concepts are used in spatial tasks.
|
20
|
Arslan B, Göksun T. Ageing, working memory, and mental imagery: Understanding gestural communication in younger and older adults. Q J Exp Psychol (Hove) 2020; 74:29-44. [PMID: 32640872 DOI: 10.1177/1747021820944696] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Ageing affects both language and gestural communication skills. Although overall gesture use is similar between younger and older adults, the use of representational gestures (e.g., drawing a line with the fingers in the air to indicate a road) decreases with age. This study investigates whether this change in the production of representational gestures is related to individuals' working memory and/or mental imagery skills. We used three gesture tasks (daily activity description, story completion, and address description) to obtain spontaneous co-speech gestures from younger and older individuals (N = 60). Participants also completed the Corsi working memory task and a mental imagery task. Results showed that although the two age groups' overall gesture frequencies were similar across the three tasks, the younger adults used relatively higher proportions of representational gestures than the older adults only in the address description task. Regardless of age, mental imagery but not working memory scores were associated with the use of representational gestures in this task. However, the use of spatial words in the address description task did not differ between the two age groups, and neither mental imagery nor working memory scores were associated with spatial word use. These findings suggest that mental imagery can play a role in gesture production, and that gesture and speech production might be affected by the ageing process on separate timelines, particularly for spatial content.
Affiliation(s)
- Burcu Arslan
- Department of Psychology, Koç University, Istanbul, Turkey
- Tilbe Göksun
- Department of Psychology, Koç University, Istanbul, Turkey
|
21
|
Tobiansky DJ, Fuxjager MJ. Sex Steroids as Regulators of Gestural Communication. Endocrinology 2020; 161:5822602. [PMID: 32307535 PMCID: PMC7316366 DOI: 10.1210/endocr/bqaa064] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/10/2020] [Accepted: 04/16/2020] [Indexed: 12/13/2022]
Abstract
Gestural communication is ubiquitous throughout the animal kingdom, occurring in species that range from humans to arthropods. Individuals produce gestural signals when their nervous system triggers the production of limb and body movement, which in turn functions to help mediate communication between or among individuals. Like many stereotyped motor patterns, the probability of a gestural display in a given social context can be modulated by sex steroid hormones. Here, we review how steroid hormones mediate the neural mechanisms that underlie gestural communication in humans and nonhumans alike. This is a growing area of research, and thus we explore how sex steroids mediate brain areas involved in language production, social behavior, and motor performance. We also examine the way that sex steroids can regulate behavioral output by acting in the periphery via skeletal muscle. Altogether, we outline a new avenue of behavioral endocrinology research that aims to uncover the hormonal basis for one of the most common modes of communication among animals on Earth.
Affiliation(s)
- Daniel J Tobiansky
- Department of Ecology and Evolutionary Biology, Brown University, Providence, Rhode Island
- Correspondence: Daniel J. Tobiansky, Department of Ecology and Evolutionary Biology, Brown University, Providence, RI 02912.
- Matthew J Fuxjager
- Department of Ecology and Evolutionary Biology, Brown University, Providence, Rhode Island
|
22
|
Speech Discrimination in Real-World Group Communication Using Audio-Motion Multimodal Sensing. SENSORS 2020; 20:s20102948. [PMID: 32456031 PMCID: PMC7287755 DOI: 10.3390/s20102948] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 03/31/2020] [Revised: 05/18/2020] [Accepted: 05/20/2020] [Indexed: 11/16/2022]
Abstract
Speech discrimination that determines whether a participant is speaking at a given moment is essential in investigating human verbal communication. In dynamic real-world situations, where multiple people participate in and form groups in the same space, simultaneous speakers make speech discrimination based solely on audio sensing difficult. In this study, we focused on physical activity during speech and hypothesized that combining audio and physical motion data acquired by wearable sensors can improve speech discrimination. We therefore recorded utterance and physical activity data of students in a university participatory class, using smartphones worn around their necks. First, we tested the temporal relationship between manually identified utterances and physical motions and confirmed that physical activities across wide frequency ranges co-occurred with utterances. Second, we trained and tested classifiers for each participant and found higher performance with the audio-motion classifier (average accuracy 92.2%) than with either the audio-only (80.4%) or motion-only (87.8%) classifier. Finally, we tested inter-individual classification and again obtained higher performance with the combined audio-motion classifier (83.2%) than with the audio-only (67.7%) or motion-only (71.9%) classifiers. These results show that audio-motion multimodal sensing using widely available smartphones can provide effective utterance discrimination in dynamic group communication.
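The reported gain from fusing modalities can be illustrated with a toy simulation. The sketch below is a hypothetical NumPy illustration, assuming synthetic per-window "audio energy" and "motion energy" features and a simple midpoint-threshold classifier; it does not reproduce the study's actual smartphone features or classifiers.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000
speaking = rng.integers(0, 2, n)  # ground truth: 1 = participant is speaking

# Hypothetical per-window features: each cue carries the speaking signal
# but is corrupted by independent noise (e.g., overlapping speakers for
# audio, fidgeting for motion).
audio = speaking + rng.normal(0.0, 1.0, n)
motion = speaking + rng.normal(0.0, 1.0, n)

def accuracy(score, labels, threshold):
    # Threshold classifier: predict "speaking" when the score exceeds
    # the midpoint between the two class means.
    return float(np.mean((score > threshold) == labels))

acc_audio = accuracy(audio, speaking, 0.5)           # audio-only
acc_fused = accuracy(audio + motion, speaking, 1.0)  # summed (fused) cues
```

Because both cues are noisy measurements of the same latent state, summing them averages out independent noise, so the fused score separates the classes better than either cue alone, mirroring the direction of the study's result.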
|
23
|
Grounded procedures: A proximate mechanism for the psychology of cleansing and other physical actions. Behav Brain Sci 2020; 44:e1. [PMID: 32390575 DOI: 10.1017/s0140525x20000308] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
Experimental work has revealed causal links between physical cleansing and various psychological variables. Empirically, how robust are they? Theoretically, how do they operate? Major prevailing accounts focus on morality or disgust, capturing a subset of cleansing effects, but cannot easily handle cleansing effects in non-moral, non-disgusting contexts. Building on grounded views on cognitive processes and known properties of mental procedures, we propose grounded procedures of separation as a proximate mechanism underlying cleansing effects. This account differs from prevailing accounts in terms of explanatory kind, interpretive parsimony, and predictive scope. Its unique and falsifiable predictions have received empirical support: Cleansing attenuates or eliminates otherwise observed influences of prior events (1) across domains and (2) across valences. (3) Cleansing manipulations produce stronger effects the more strongly they engage sensorimotor capacities. (4) Reversing the causal arrow, motivation for cleansing is triggered more readily by negative than positive entities. (5) Conceptually similar effects extend to other physical actions of separation. On the flipside, grounded procedures of connection are also observed. Together, separation and connection organize prior findings relevant to multiple perspectives (e.g., conceptual metaphor, sympathetic magic) and open up new questions. Their predictions are more generalizable than the specific mappings in conceptual metaphors, but more fine-grained than the broad assumptions of grounded cognition. This intermediate level of analysis sheds light on the interplay between mental and physical processes.
|
24
|
Moretti S, Greco A. Nodding and shaking of the head as simulated approach and avoidance responses. Acta Psychol (Amst) 2020; 203:102988. [PMID: 31935659 DOI: 10.1016/j.actpsy.2019.102988] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/09/2019] [Revised: 12/18/2019] [Accepted: 12/20/2019] [Indexed: 10/25/2022] Open
Abstract
Our recent study within the embodiment perspective showed that the evaluation of true and false information activates the simulation of vertical and horizontal head movements involved in nodding and shaking of the head (Moretti & Greco, 2018). This result was found in an explicit evaluation task where motion detection software was deployed to enable participants to assess a series of objectively true or false statements by moving them with the head vertically and horizontally on a computer screen, under conditions of compatibility and incompatibility between simulated and performed action. This study replicated that experiment, but with subjective statements about liked and disliked food, in both explicit and implicit evaluation tasks. Two experiments, plus one control experiment, were devised to test the presence of a motor-affective compatibility effect (vertical-liked; horizontal-disliked) and whether the motor-semantic compatibility found with objective statements (vertical-true; horizontal-false) could be a sub-effect of a more general and automatic association (vertical-accepted; horizontal-refused). As expected, response times were shorter when statements about liked foods and disliked foods were moved vertically and horizontally respectively by making head movements, even when participants were not explicitly required to evaluate them. In contrast, the truth compatibility effect only occurred in the explicit evaluation task. Overall results support the idea that head-nodding and shaking are simulated approach-avoidance responses. Different aspects of the meaning of these gestures and the practical implications of the study for cognitive and social research are discussed.
|
26
|
Hearing non-signers use their gestures to predict iconic form-meaning mappings at first exposure to signs. Cognition 2019; 191:103996. [PMID: 31238248 DOI: 10.1016/j.cognition.2019.06.008] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/13/2018] [Revised: 06/04/2019] [Accepted: 06/06/2019] [Indexed: 11/20/2022]
Abstract
The sign languages of deaf communities and the gestures produced by hearing people are communicative systems that exploit the manual-visual modality as a means of expression. Despite their striking differences, they share the property of iconicity, understood as a direct relationship between a symbol and its referent. Here we investigate whether non-signing hearing adults exploit their implicit knowledge of gestures to bootstrap accurate understanding of the meaning of iconic signs they have never seen before. In Study 1 we show that for some concepts gestures exhibit systematic forms across participants, and share different degrees of form overlap with the signs for the same concepts (full, partial, and no overlap). In Study 2 we found that signs with stronger resemblance to these gestures are guessed more accurately and are assigned higher iconicity ratings by non-signers than signs with low overlap. In addition, when more people produced a systematic gesture resembling a sign, they assigned higher iconicity ratings to that sign. Furthermore, participants had a bias to assume that signs represent actions rather than objects. The similarities between some signs and gestures could be explained by deaf signers and hearing gesturers sharing a conceptual substrate that is rooted in our embodied experiences with the world. The finding that gestural knowledge can ease the interpretation of novel signs and predicts iconicity ratings is in line with embodied accounts of cognition and the influence of prior knowledge in acquiring new schemas. Through these mechanisms, we propose that iconic gestures that overlap in form with signs may serve as a type of 'manual cognate' that helps non-signing adults break into a new language at first exposure.
|
27
|
Monaco E, Jost LB, Gygax PM, Annoni JM. Embodied Semantics in a Second Language: Critical Review and Clinical Implications. Front Hum Neurosci 2019; 13:110. [PMID: 30983983 PMCID: PMC6449436 DOI: 10.3389/fnhum.2019.00110] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/05/2018] [Accepted: 03/12/2019] [Indexed: 12/20/2022] Open
Abstract
The role of the sensorimotor system in second language (L2) semantic processing as well as its clinical implications for bilingual patients has hitherto been neglected. We offer an overview of the issues at stake in this under-investigated field, presenting the theoretical and clinical relevance of studying L2 embodiment and reviewing the few studies on this topic. We highlight that (a) the sensorimotor network is involved in L2 processing, and that (b) in most studies, L2 is differently embodied than L1, reflected in a lower degree or in a different pattern of L2 embodiment. Importantly, we outline critical issues to be addressed in order to guide future research. We also delineate the subsequent steps needed to confirm or dismiss the value of language therapeutic approaches based on embodiment theories as a complement of speech and language therapies in adult bilinguals.
Affiliation(s)
- Elisa Monaco
- Laboratory for Cognitive and Neurological Sciences, Neurology Unit, Medicine Section, Department of Neuroscience and Movement Science, Faculty of Science and Medicine, University of Fribourg, Fribourg, Switzerland
- Lea B. Jost
- Laboratory for Cognitive and Neurological Sciences, Neurology Unit, Medicine Section, Department of Neuroscience and Movement Science, Faculty of Science and Medicine, University of Fribourg, Fribourg, Switzerland
- Pascal M. Gygax
- Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Jean-Marie Annoni
- Laboratory for Cognitive and Neurological Sciences, Neurology Unit, Medicine Section, Department of Neuroscience and Movement Science, Faculty of Science and Medicine, University of Fribourg, Fribourg, Switzerland
- Neurology Unit, Fribourg Cantonal Hospital, Fribourg, Switzerland
|
28
|
Zhen A, Van Hedger S, Heald S, Goldin-Meadow S, Tian X. Manual directional gestures facilitate cross-modal perceptual learning. Cognition 2019; 187:178-187. [PMID: 30877849 DOI: 10.1016/j.cognition.2019.03.004] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/06/2018] [Revised: 03/04/2019] [Accepted: 03/06/2019] [Indexed: 11/24/2022]
Abstract
Action and perception interact in complex ways to shape how we learn. In the context of language acquisition, for example, hand gestures can facilitate learning novel sound-to-meaning mappings that are critical to successfully understanding a second language. However, the mechanisms by which motor and visual information influence auditory learning are still unclear. We hypothesize that the extent to which cross-modal learning occurs is directly related to the common representational format of perceptual features across motor, visual, and auditory domains (i.e., the extent to which changes in one domain trigger similar changes in another). Furthermore, to the extent that information across modalities can be mapped onto a common representation, training in one domain may lead to learning in another domain. To test this hypothesis, we taught native English speakers Mandarin tones using directional pitch gestures. Watching or performing gestures that were congruent with pitch direction (e.g., an up gesture moving up, and a down gesture moving down, in the vertical plane) significantly enhanced tone category learning, compared to auditory-only training. Moreover, when gestures were rotated (e.g., an up gesture moving away from the body, and a down gesture moving toward the body, in the horizontal plane), performing the gestures resulted in significantly better learning, compared to watching the rotated gestures. Our results suggest that when a common representational mapping can be established between motor and sensory modalities, auditory perceptual learning is likely to be enhanced.
Affiliation(s)
- Anna Zhen
- Division of Arts and Sciences, New York University Shanghai, Shanghai, China; Shanghai Key Laboratory of Brain Functional Genomics (Ministry of Education), School of Psychology and Cognitive Science, East China Normal University, Shanghai, China; NYU-ECNU Institute of Brain and Cognitive Science at NYU Shanghai, Shanghai, China; Department of Psychology, The University of Chicago, 5848 S. University Ave., Chicago, IL 60637 USA
- Stephen Van Hedger
- Department of Psychology, The University of Chicago, 5848 S. University Ave., Chicago, IL 60637 USA
- Shannon Heald
- Department of Psychology, The University of Chicago, 5848 S. University Ave., Chicago, IL 60637 USA
- Susan Goldin-Meadow
- Department of Psychology, The University of Chicago, 5848 S. University Ave., Chicago, IL 60637 USA
- Xing Tian
- Division of Arts and Sciences, New York University Shanghai, Shanghai, China; Shanghai Key Laboratory of Brain Functional Genomics (Ministry of Education), School of Psychology and Cognitive Science, East China Normal University, Shanghai, China; NYU-ECNU Institute of Brain and Cognitive Science at NYU Shanghai, Shanghai, China.
|
30
|
Graziano M, Gullberg M. When Speech Stops, Gesture Stops: Evidence From Developmental and Crosslinguistic Comparisons. Front Psychol 2018; 9:879. [PMID: 29910761 PMCID: PMC5992892 DOI: 10.3389/fpsyg.2018.00879] [Citation(s) in RCA: 21] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/31/2017] [Accepted: 05/15/2018] [Indexed: 11/20/2022] Open
Abstract
There is plenty of evidence that speech and gesture form a tightly integrated system, as reflected in parallelisms in language production, comprehension, and development (McNeill, 1992; Kendon, 2004). Yet, it is a common assumption that speakers use gestures to compensate for their expressive difficulties, a notion found in developmental studies of both first and second language acquisition, and in theoretical proposals concerning the gesture-speech relationship. If gestures are compensatory, they should mainly occur in disfluent stretches of speech. However, the evidence is sparse and conflicting. This study extends previous studies and tests the putative compensatory role of gestures by comparing the gestural behavior in fluent vs. disfluent stretches of narratives by competent speakers in two languages (Dutch and Italian), and by language learners (children and adult L2 learners). The results reveal that (1) in all groups speakers overwhelmingly produce gestures during fluent speech and only rarely during disfluencies. However, L2 learners are significantly more likely to gesture in disfluency than the other groups; (2) in all groups gestures during disfluencies tend to be holds; (3) in all groups the rare gestures completed in disfluencies have both referential and pragmatic functions. Overall, the data strongly suggest that when speech stops, so does gesture. The findings constitute an important challenge to both gesture and language acquisition theories assuming a mainly (lexical) compensatory role for (referential) gestures. Instead, the results provide strong support for the notion that speech and gestures form an integrated system.
Affiliation(s)
- Maria Graziano
- Lund University Humanities Lab, Lund University, Lund, Sweden
- Marianne Gullberg
- Lund University Humanities Lab, Lund University, Lund, Sweden
- Centre for Languages and Literature, Lund University, Lund, Sweden
|
31
|
Moretti S, Greco A. Truth is in the head. A nod and shake compatibility effect. Acta Psychol (Amst) 2018; 185:203-218. [PMID: 29501975 DOI: 10.1016/j.actpsy.2018.02.010] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/30/2018] [Revised: 02/14/2018] [Accepted: 02/19/2018] [Indexed: 11/19/2022] Open
Abstract
Studies from the embodiment perspective on language processing have shown facilitation or interference effects depending on the compatibility between verbal contents, concrete or abstract, and the motion of various parts of the body. The aim of the present study was to test whether such compatibility effects can be found when a higher cognitive process like truth evaluation is accomplished with head movements. Since nodding is a vertical head gesture typically performed with positive and affirmative responses, and shaking is a horizontal head gesture associated with negative and dissenting contents, faster response times can be expected when true information is evaluated by making a vertical head movement and false information by making a horizontal head movement. Three experiments were designed to test this motor compatibility effect. In the first experiment, participants evaluated a series of very simple sentences as true or false by dragging them vertically or horizontally with the head. Truth-value was assessed faster when it was compatible with the direction of the head movement than when it was incompatible. In the second experiment, participants evaluated the same sentences but moved them with the mouse. In the third experiment, a non-evaluative classification task was given, in which sentences concerning animals or objects were to be dragged by vertical and horizontal head movements. In the second and third experiments no compatibility effect was observed. Overall, the results support the hypothesis of an embodiment effect between the abstract processing of truth evaluation and the direction of the two head movements of nodding and shaking. Cultural aspects, cognitive implications, and the limits of these findings are discussed.
Affiliation(s)
- Stefania Moretti
- Lab. of Psychology and Cognitive Sciences, COGNILAB-DISFOR, University of Genova, Italy
- Alberto Greco
- Lab. of Psychology and Cognitive Sciences, COGNILAB-DISFOR, University of Genova, Italy
|
32
|
Wong MKY, So WC. Absence of delay in spontaneous use of gestures in spoken narratives among children with Autism Spectrum Disorders. RESEARCH IN DEVELOPMENTAL DISABILITIES 2018; 72:128-139. [PMID: 29132079 DOI: 10.1016/j.ridd.2017.11.004] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/13/2016] [Revised: 10/29/2017] [Accepted: 11/06/2017] [Indexed: 06/07/2023]
Abstract
BACKGROUND Gestures are spontaneous hand movements produced when speaking. Despite the communicative significance of gestures, little is known about gestural production in the spoken narratives of six- to 12-year-old children with Autism Spectrum Disorders (ASD). AIMS The present study examined whether six- to 12-year-old children with ASD show a delay in gestural production in a spoken narrative task, in comparison to their typically-developing (TD) peers. METHODS AND PROCEDURES Six- to 12-year-old children with ASD (N=14) and their age- and IQ-matched TD peers (N=12) narrated a story designed to elicit spontaneous speech and gestures. Their speech and gestures were then transcribed and coded. OUTCOMES AND RESULTS Both groups of children had comparable expressive language skills. Children with ASD produced a similar number of pointing and marker gestures to TD children and significantly more iconic gestures in their spoken narratives. While children with ASD produced more reinforcing gestures than their TD counterparts, both groups produced comparable numbers of disambiguating and supplementary gestures. CONCLUSIONS Our findings indicate that children with ASD may be as capable as TD children of spontaneously producing gestures when they engage in spoken narratives.
Affiliation(s)
- Miranda Kit-Yi Wong, Department of Educational Psychology, The Chinese University of Hong Kong, Hong Kong.
- Wing-Chee So, Department of Educational Psychology, The Chinese University of Hong Kong, Hong Kong.
34
Rising tones and rustling noises: Metaphors in gestural depictions of sounds. PLoS One 2017; 12:e0181786. [PMID: 28750071] [PMCID: PMC5547699] [DOI: 10.1371/journal.pone.0181786]
Abstract
Communicating an auditory experience with words is a difficult task and, in consequence, people often rely on imitative non-verbal vocalizations and gestures. This work explored the combination of such vocalizations and gestures to communicate auditory sensations and representations elicited by non-vocal everyday sounds. Whereas our previous studies analyzed vocal imitations, the present research focused on gestural depictions of sounds. To this end, two studies investigated the combination of gestures and non-verbal vocalizations. A first, observational study examined, with manual annotations, a set of vocal and gestural imitations of recordings of sounds representative of a typical everyday environment (ecological sounds). A second, experimental study used non-ecological sounds whose parameters had been specifically designed to elicit the behaviors highlighted in the observational study, and used quantitative measures and inferential statistics. The results showed that these depicting gestures are based on systematic analogies between a referent sound, as interpreted by a receiver, and the visual aspects of the gestures: auditory-visual metaphors. The results also suggested different roles for vocalizations and gestures. Whereas the vocalizations reproduce all features of the referent sounds as faithfully as vocally possible, the gestures focus on one salient feature, using metaphors based on auditory-visual correspondences. Both studies highlighted two metaphors consistently shared across participants: the spatial metaphor of pitch (mapping different pitches to different positions on the vertical dimension), and the rustling metaphor of random fluctuations (rapid shaking of the hands and fingers). We interpret these metaphors as the result of two kinds of representations elicited by sounds: auditory sensations (pitch and loudness) mapped to spatial position, and causal representations of the sound sources (e.g. rain drops, rustling leaves) pantomimed and embodied by the participants' gestures.
36
Syntax response-space biases for hands, not feet. Atten Percept Psychophys 2017; 79:989-999. [PMID: 28078554] [DOI: 10.3758/s13414-016-1271-8]
Abstract
A number of studies have shown a relationship between comprehending transitive sentences and spatial processing (e.g., Chatterjee, Trends in Cognitive Sciences, 5(2), 55-61, 2001), in which there is an advantage for responding to images that depict the agent of an action to the left of the patient. Boiteau and Almor (Cognitive Science, 2016) demonstrated that a similar effect is found for pure linguistic information, such that after reading a sentence, identifying a word that had appeared earlier as the agent is faster on the left than on the right, but only for left-hand responses. In this study, we examined the role of lateralized manual motor processes in this effect and found that such spatial effects occur even when only the responses, but not the stimuli, have a spatial dimension. In support of the specific role of manual motor processes, we found a response-space effect with manual but not with pedal responses. Our results support an effector-specific (as opposed to an effector-general) hypothesis: Manual responses showed spatial effects compatible with those in previous research, whereas pedal responses did not. This is consistent with theoretical and empirical work arguing that the hands are generally involved with, and perhaps more sensitive to, linguistic information.
37
Macoun A, Sweller N. Listening and watching: The effects of observing gesture on preschoolers’ narrative comprehension. Cognitive Development 2016. [DOI: 10.1016/j.cogdev.2016.08.005]
38
Kang S, Tversky B. From hands to minds: Gestures promote understanding. Cognitive Research: Principles and Implications 2016; 1:4. [PMID: 28180155] [PMCID: PMC5256437] [DOI: 10.1186/s41235-016-0004-9]
Abstract
Gestures serve many roles in communication, learning and understanding both for those who view them and those who create them. Gestures are especially effective when they bear resemblance to the thought they represent, an advantage they have over words. Here, we examine the role of conceptually congruent gestures in deepening understanding of dynamic systems. Understanding the structure of dynamic systems is relatively easy, but understanding the actions of dynamic systems can be challenging. We found that seeing gestures representing actions enhanced understanding of the dynamics of a complex system as revealed in invented language, gestures and visual explanations. Gestures can map many meanings more directly than language, representing many concepts congruently. Designing and using gestures congruent with meaning can augment comprehension and learning.
Affiliation(s)
- Seokmin Kang, Wisconsin Center for Education Research, University of Wisconsin-Madison, Educational Sciences Building, 1025 West Johnson Street, Madison, WI 53706, USA.
- Barbara Tversky, Department of Psychology, Stanford University, Stanford, CA, USA; Department of Human Development, Teachers College, Columbia University, New York, NY, USA.
39
Braddock BA, Gabany C, Shah M, Armbrecht ES, Twyman KA. Patterns of Gesture Use in Adolescents With Autism Spectrum Disorder. American Journal of Speech-Language Pathology 2016; 25:408-415. [PMID: 27258802] [DOI: 10.1044/2015_ajslp-14-0112]
Abstract
PURPOSE The purpose of this study was to examine patterns of spontaneous gesture use in a sample of adolescents with autism spectrum disorder (ASD). METHOD Thirty-five adolescents with ASD ages 11 to 16 years participated (mean age = 13.51 years; 29 boys, 6 girls). Participants' spontaneous speech and gestures produced during a narrative task were later coded from videotape. Parents were also asked to complete questionnaires to quantify adolescents' general communication ability and autism severity. RESULTS No significant differences in general communication ability or autism severity were apparent between adolescents who did not gesture and those who produced at least one gesture. Subanalyses including only the adolescents who gestured indicated a statistically significant negative association between gesture rate and general communication ability, specifically speech and syntax subscale scores. Adolescents who gestured produced higher proportions of iconic gestures and used gesture mostly to add information to speech. CONCLUSIONS The findings relate spontaneous gesture use to underlying strengths and weaknesses in adolescents' speech and syntactic language development. More research examining co-speech gesture in fluent speakers with ASD is needed.
40
Abstract
This article explores how the link between the hand and the mind might be exploited in the making of strategy. Using Mintzberg’s image of a potter undergoing iterative and recursive learning and knowledge-building processes as a point of departure, the authors develop a three-level theoretical schema, progressing from the physiological to the psychological to the social, to trace the consequences of the hand-mind link. To illustrate their theoretical schema, the authors present an illustrative case of managers from a large telecommunications firm experimenting with a process for strategy making in which they actively use their hands to construct representations of their organization and its environment. The authors conclude that new and potent forms of strategy making might be attained if the fundamental human experience of using one’s hands is put in the service of all kinds of organizational learning.
41
Abstract
Comprehension of a phenomenon involves identifying its origin, structure, substrate, and function, and representing these factors in some formal system. Aristotle provided a clear specification of these kinds of explanation, which he called efficient causes (triggers), formal causes (models), material causes (substrates or mechanisms), and final causes (functions). In this article, Aristotle's framework is applied to conditioning and the computation-versus-association debate. The critical empirical issue is early versus late reduction of information to disposition. Automata theory provides a grammar for models of conditioning and information processing in which that constraint can be represented.
Affiliation(s)
- Peter R Killeen, Department of Psychology, Arizona State University, Tempe, Arizona.
42
Krivokapić J. Gestural coordination at prosodic boundaries and its role for prosodic structure and speech planning processes. Philos Trans R Soc Lond B Biol Sci 2015; 369:20130397. [PMID: 25385775] [DOI: 10.1098/rstb.2013.0397]
Abstract
Prosodic structure is a grammatical component that serves multiple functions in the production, comprehension and acquisition of language. Prosodic boundaries are critical for understanding the nature of the prosodic structure of language, and important progress has been made in the past decades in illuminating their properties. We first review recent prosodic boundary research from the point of view of gestural coordination. We then relate this work to questions of speech planning and of manual and head movement. We conclude with an outline of a new direction of research which is needed for a full understanding of prosodic boundaries and their role in the speech production process.
Affiliation(s)
- Jelena Krivokapić, Department of Linguistics, University of Michigan, 440 Lorch Hall, 611 Tappan St., Ann Arbor, MI 48109-1220, USA; Haskins Laboratories, 300 George Street No. 900, New Haven, CT 06511, USA.
43
Abner N, Cooperrider K, Goldin-Meadow S. Gesture for Linguists: A Handy Primer. Language and Linguistics Compass 2015; 9:437-451. [PMID: 26807141] [PMCID: PMC4721265] [DOI: 10.1111/lnc3.12168]
Abstract
Humans communicate using language, but they also communicate using gesture - spontaneous movements of the hands and body that universally accompany speech. Gestures can be distinguished from other movements, segmented, and assigned meaning based on their forms and functions. Moreover, gestures systematically integrate with language at all levels of linguistic structure, as evidenced in both production and perception. Viewed typologically, gesture is universal, but nevertheless exhibits constrained variation across language communities (as does language itself). Finally, gesture has rich cognitive dimensions in addition to its communicative dimensions. In overviewing these and other topics, we show that the study of language is incomplete without the study of its communicative partner, gesture.
44
Nathan MJ, Martinez CV. Gesture as model enactment: the role of gesture in mental model construction and inference making when learning from text. 2015. [DOI: 10.1080/23735082.2015.1006758]
45
Nagels A, Kircher T, Steines M, Grosvald M, Straube B. A brief self-rating scale for the assessment of individual differences in gesture perception and production. Learning and Individual Differences 2015. [DOI: 10.1016/j.lindif.2015.03.008]
46
Baus C, Costa A. On the temporal dynamics of sign production: An ERP study in Catalan Sign Language (LSC). Brain Res 2015; 1609:40-53. [PMID: 25801115] [DOI: 10.1016/j.brainres.2015.03.013]
Abstract
This study investigates the temporal dynamics of sign production and how particular aspects of the signed modality influence the early stages of lexical access. To that end, we explored the electrophysiological correlates associated with sign frequency and iconicity in a picture-signing task in a group of bimodal bilinguals. Moreover, a subset of the same participants performed the same task but named the pictures instead. Our results revealed that both frequency and iconicity influenced lexical access in sign production. At the ERP level, iconicity effects originated very early in the course of signing (while absent in the spoken modality), suggesting a stronger activation of the semantic properties of iconic signs. Moreover, frequency effects were modulated by iconicity, suggesting that lexical access in signed language is determined by the iconic properties of the signs. These results support the idea that lexical access is sensitive to the same phenomena in word and sign production, but that its time-course is modulated by particular aspects of the modality in which a lexical item will finally be articulated.
Affiliation(s)
- Cristina Baus, Center of Brain and Cognition, CBC, Universitat Pompeu Fabra, Barcelona, Spain; Laboratoire de Psychologie Cognitive, CNRS and Université d'Aix-Marseille, Marseille, France.
- Albert Costa, Center of Brain and Cognition, CBC, Universitat Pompeu Fabra, Barcelona, Spain; Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain.
47
How leaders influence followers through the use of nonverbal communication. Leadership & Organization Development Journal 2015. [DOI: 10.1108/lodj-07-2013-0107]
Abstract
Purpose
– The purpose of this paper is to address the relationship between a leader’s use of nonverbal immediacy (specific hand gestures) and followers’ attraction to the leader. This study provides initial evidence that certain hand gestures are more effective than others at creating immediacy between leaders and followers.
Design/methodology/approach
– In an experimental study, participants (male=89; female=121) were shown one of three videos of an actor, acting as a leader, who used either three positive hand gestures, three defensive hand gestures, or no hand gestures; these gestures had not previously been operationalized and were grouped arbitrarily by the experimenter. Three hypotheses were tested using a 3×2 ANOVA (by group and gender) for main and interactional effects.
Findings
– The positive hand gestures condition (M=2.4) was perceived by participants as more immediate than the other two conditions, defensive hand gestures (M=−19.2) and no hand gestures (M=−21.6). Analysis of the data indicates that participants perceived leaders using no hand gestures or defensive hand gestures to be distant or non-immediate, and the leader using positive hand gestures to be more immediate or attractive.
Research limitations/implications
– As a pilot study establishing differences between specific hand gestures for the first time, this study is limited in scope.
Practical implications
– The research provides initial evidence that the hand gestures arbitrarily defined as “positive” create more immediacy between the followers and the leader than usage of “negative” gestures and no gestures.
Social implications
– The current research can motivate leaders to fast-track relationships with followers through the use of specific hand gestures.
Originality/value
– The results suggest the possibility that some hand gestures are more effective than others.
49
Hostetter AB. Action Attenuates the Effect of Visibility on Gesture Rates. Cogn Sci 2014; 38:1468-81. [DOI: 10.1111/cogs.12113]
50
Verbal working memory predicts co-speech gesture: evidence from individual differences. Cognition 2014; 132:174-80. [PMID: 24813571] [DOI: 10.1016/j.cognition.2014.03.012]
Abstract
Gesture facilitates language production, but there is debate surrounding its exact role. It has been argued that gestures lighten the load on verbal working memory (VWM; Goldin-Meadow, Nusbaum, Kelly, & Wagner, 2001), but gestures have also been argued to aid in lexical retrieval (Krauss, 1998). In the current study, 50 speakers completed an individual differences battery that included measures of VWM and lexical retrieval. To elicit gesture, each speaker described short cartoon clips immediately after viewing. Measures of lexical retrieval did not predict spontaneous gesture rates, but lower VWM was associated with higher gesture rates, suggesting that gestures can facilitate language production by supporting VWM when resources are taxed. These data also suggest that individual variability in the propensity to gesture is partly linked to cognitive capacities.