151
Caligiore D, Fischer MH. Vision, action and language unified through embodiment. Psychol Res 2012; 77:1-6. [DOI: 10.1007/s00426-012-0417-0]
152
Cacciari C, Bolognini N, Senna I, Pellicciari MC, Miniussi C, Papagno C. Literal, fictive and metaphorical motion sentences preserve the motion component of the verb: a TMS study. Brain Lang 2011; 119:149-157. [PMID: 21684590] [DOI: 10.1016/j.bandl.2011.05.004]
Abstract
We used Transcranial Magnetic Stimulation (TMS) to assess whether reading literal, non-literal (i.e., metaphorical, idiomatic) and fictive motion sentences modulates the activity of the motor system. Sentences were divided into three segments visually presented one at a time: the noun phrase, the verb and the final part of the sentence. Single-pulse TMS was delivered at the end of the sentence over the leg motor area in the left hemisphere and motor evoked potentials (MEPs) were recorded from the right gastrocnemius and tibialis anterior muscles. MEPs were larger when participants were presented with literal, fictive and metaphorical motion sentences than with idiomatic motion or mental sentences. These results suggest that the excitability of the motor system is modulated by the motor component of the verb, which is preserved in fictive and metaphorical motion sentences.
Affiliation(s)
- C Cacciari
- Department of Biomedical Sciences, University of Modena & Reggio-Emilia, Italy
153
Tomasino B, Ceschia M, Fabbro F, Skrap M. Motor simulation during action word processing in neurosurgical patients. J Cogn Neurosci 2011; 24:736-48. [PMID: 22098262] [DOI: 10.1162/jocn_a_00168]
Abstract
The role that human motor areas play in linguistic processing is the subject of a stimulating debate. Data from nine neurosurgical patients with selective lesions of the precentral and postcentral sulcus could provide a direct answer as to whether motor area activation is necessary for action word processing. Action-related verbs (face-, hand-, and feet-related verbs plus neutral verbs) silently read were processed for (i) motor imagery by vividness ratings and (ii) frequency ratings. Although no stimulus- or task-dependent modulation was found in the RTs of healthy controls, patients showed a task × stimulus interaction resulting in a stimulus-dependent somatotopic pattern of RTs for the imagery task. A lesion affecting a part of the cortex that represents a body part also led to slower RTs during the creation of mental images for verbs describing actions involving that same body part. By contrast, no category-related differences were seen in the frequency judgment task. This task-related dissociation suggests that the sensorimotor area is critically involved in processing action verbs only when subjects are simulating the corresponding movement. These findings have important implications for the ongoing discussion regarding the involvement of the sensorimotor cortex in linguistic processing.
154
Raymer AM, McHose B, Smith KG, Iman L, Ambrose A, Casselton C. Contrasting effects of errorless naming treatment and gestural facilitation for word retrieval in aphasia. Neuropsychol Rehabil 2011; 22:235-66. [PMID: 22047100] [DOI: 10.1080/09602011.2011.618306]
Abstract
We compared the effects of two treatments for aphasic word retrieval impairments, errorless naming treatment (ENT) and gestural facilitation of naming (GES), within the same individuals, anticipating that the use of gesture would enhance the effect of treatment over errorless treatment alone. In addition to picture naming, we evaluated results for other outcome measures that were largely untested in earlier ENT studies. In a single participant crossover treatment design, we examined the effects of ENT and GES in eight individuals with stroke-induced aphasia and word retrieval impairments (three semantic anomia, five phonological anomia) in counterbalanced phases across participants. We evaluated effects of the two treatments for a daily picture naming/gesture production probe measure and in standardised aphasia tests and communication rating scales administered across phases of the experiment. Both treatments led to improvements in naming of trained words (small-to-large effect sizes) in individuals with semantic and phonological anomia. Small generalised naming improvements were noted for three individuals with phonological anomia. GES improved use of corresponding gestures for trained words (large effect sizes). Results were largely maintained at one month post-treatment completion. Increases in scores on standardised aphasia testing also occurred for both ENT and GES training. Both ENT and GES led to improvements in naming measures, with no clear difference between treatments. Increased use of gestures following GES provided a potential compensatory means of communication for those who did not improve verbal skills. Both treatments are considered to be effective methods to promote recovery of word retrieval and verbal production skills in individuals with aphasia.
Affiliation(s)
- Anastasia M Raymer
- Department of Communication Disorders and Special Education, Old Dominion University, Norfolk, VA 23529-0136, USA.
155
Abstract
Speakers convey meaning not only through words, but also through gestures. Although children are exposed to co-speech gestures from birth, we do not know how the developing brain comes to connect meaning conveyed in gesture with speech. We used functional magnetic resonance imaging (fMRI) to address this question and scanned 8- to 11-year-old children and adults listening to stories accompanied by hand movements, either meaningful co-speech gestures or meaningless self-adaptors. When listening to stories accompanied by both types of hand movement, both children and adults recruited inferior frontal, inferior parietal, and posterior temporal brain regions known to be involved in processing language not accompanied by hand movements. There were, however, age-related differences in activity in posterior superior temporal sulcus (STSp), inferior frontal gyrus, pars triangularis (IFGTr), and posterior middle temporal gyrus (MTGp) regions previously implicated in processing gesture. Both children and adults showed sensitivity to the meaning of hand movements in IFGTr and MTGp, but in different ways. Finally, we found that hand movement meaning modulates interactions between STSp and other posterior temporal and inferior parietal regions for adults, but not for children. These results shed light on the developing neural substrate for understanding meaning contributed by co-speech gesture.
Affiliation(s)
- Anthony Steven Dick
- Department of Psychology, Florida International University, Modesto A. Maidique Campus, Deuxieme Maison 296B, 11200 S. W. 8th Street, Miami, FL 33199, USA.
156
Esopenko C, Crossley M, Haugrud N, Borowsky R. Naming and semantic processing of action-related stimuli following right versus left hemispherectomy. Epilepsy Behav 2011; 22:261-71. [PMID: 21831717] [DOI: 10.1016/j.yebeh.2011.06.017]
Abstract
Previous neuroimaging research has shown left hemisphere dominance during the semantic processing of embodied action-related stimuli. The goal of our research was to examine how action-related stimuli are processed in individuals after right or left hemispherectomy. S.M. (right hemispherectomy), J.H. (left hemispherectomy), and healthy control participants completed naming and semantic generation tasks with picture and word stimuli with referents that are used by arms or legs. Our results showed evidence of a dissociation for pictures of objects used by legs. Specifically, the naming task showed that, relative to controls, S.M. is impaired on accuracy, whereas J.H. performs closer to normal levels. For the semantic generation task, the opposite result was obtained and is consistent with the response time data. Our results suggest that the right hemisphere is critical for normal picture naming, whereas the left hemisphere is critical for normal semantic generation of action-related knowledge.
Affiliation(s)
- C Esopenko
- Department of Psychology, College of Arts and Science, University of Saskatchewan, Saskatoon, SK, Canada.
157
Enrici I, Adenzato M, Cappa S, Bara BG, Tettamanti M. Intention Processing in Communication: A Common Brain Network for Language and Gestures. J Cogn Neurosci 2011; 23:2415-31. [DOI: 10.1162/jocn.2010.21594]
Abstract
Human communicative competence is based on the ability to process a specific class of mental states, namely, communicative intention. The present fMRI study aims to analyze whether intention processing in communication is affected by the expressive means through which a communicative intention is conveyed, that is, the linguistic or extralinguistic gestural means. Combined factorial and conjunction analyses were used to test two sets of predictions: first, that a common brain network is recruited for the comprehension of communicative intentions independently of the modality through which they are conveyed; second, that additional brain areas are specifically recruited depending on the communicative modality used, reflecting distinct sensorimotor gateways. Our results clearly showed that a common neural network is engaged in communicative intention processing independently of the modality used. This network includes the precuneus, the left and right posterior STS and TPJ, and the medial pFC. Additional brain areas outside those involved in intention processing are specifically engaged by the particular communicative modality, that is, a peri-sylvian language network for the linguistic modality and a sensorimotor network for the extralinguistic modality. Thus, common representation of communicative intention may be accessed by modality-specific gateways, which are distinct for linguistic versus extralinguistic expressive means. Taken together, our results indicate that the information acquired by different communicative modalities is equivalent from a mental processing standpoint, in particular, at the point at which the actor's communicative intention has to be reconstructed.
Affiliation(s)
- Mauro Adenzato
- University of Torino, Italy
- Neuroscience Institute of Turin, Italy
- Stefano Cappa
- Vita-Salute San Raffaele University, Milan, Italy
- Scientific Institute HSR, Milan, Italy
- Bruno G. Bara
- University of Torino, Italy
- Neuroscience Institute of Turin, Italy
158
Chiou RYC, Wu DH, Tzeng OJL, Hung DL, Chang EC. Relative size of numerical magnitude induces a size-contrast effect on the grip scaling of reach-to-grasp movements. Cortex 2011; 48:1043-51. [PMID: 21889134] [DOI: 10.1016/j.cortex.2011.08.001]
Abstract
Previous research found that quantitative information labelled on target objects of grasping movement modulates grip apertures. While the interaction between numerical cognition and sensorimotor control may reflect a general representation of magnitude underpinned by the parietal cortex, the nature of this embodied cognitive processing remains unclear. In the present study, we examined whether the numerical effects on grip aperture can be flexibly modulated by the relative magnitude between numbers under a context, which suggests a trial-by-trial comparison mechanism to underlie this effect. The participants performed visual open-loop grasping towards one of two adjacent objects that were of the same physical size but labelled with different Arabic digits. Analysis of participants' grip apertures revealed a numerical size-contrast effect, in which the same numerical label (i.e., 5) led to larger grip apertures when it was accompanied by a smaller number (i.e., 2) than by a larger number (i.e., 8). The corrected grip aperture over the time course of movement showed that the numerical size-contrast effect remained significant throughout the grasping movement, despite a trend of gradual dissipation. Our findings demonstrated that interactions between number and action critically depend on the size-contrast of magnitude information in the context. Such a size-contrast effect might result from a general system, which is sensitive to relative magnitude, for different quantity domains. Alternatively, the magnitude representations of numbers and action might be processed separately and interact at a later stage of motor programming.
Affiliation(s)
- Rocco Y-C Chiou
- Laboratories for Cognitive Neuroscience, National Yang-Ming University, Taiwan
159
Reading action word affects the visual perception of biological motion. Acta Psychol (Amst) 2011; 137:330-4. [PMID: 21514548] [DOI: 10.1016/j.actpsy.2011.04.001]
Abstract
In the present study, we investigate whether reading an action-word can influence subsequent visual perception of biological motion. The participant's task was to perceptually judge whether a human action identifiable in the biological motion of a point-light display embedded in a high density mask was present or not in the visual sequence, which lasted for 633 ms on average. Prior to the judgement task, participants were exposed to an abstract verb or an action verb for 500 ms, which was related to the human action according to a congruent or incongruent semantic relation. Data analysis showed that correct judgements were not affected by action verbs, whereas a facilitation effect on response time (49 ms on average) was observed when a congruent action verb primed the judgement of biological movements. In relation with the existing literature, this finding suggests that the perception, the planning and the linguistic coding of motor action are subtended by common motor representations.
160
Willems RM, Labruna L, D'Esposito M, Ivry R, Casasanto D. A functional role for the motor system in language understanding: evidence from theta-burst transcranial magnetic stimulation. Psychol Sci 2011; 22:849-54. [PMID: 21705521] [DOI: 10.1177/0956797611412387]
Abstract
Does language comprehension depend, in part, on neural systems for action? In previous studies, motor areas of the brain were activated when people read or listened to action verbs, but it remains unclear whether such activation is functionally relevant for comprehension. In the experiments reported here, we used off-line theta-burst transcranial magnetic stimulation to investigate whether a causal relationship exists between activity in premotor cortex and action-language understanding. Right-handed participants completed a lexical decision task, in which they read verbs describing manual actions typically performed with the dominant hand (e.g., "to throw," "to write") and verbs describing nonmanual actions (e.g., "to earn," "to wander"). Responses to manual-action verbs (but not to nonmanual-action verbs) were faster after stimulation of the hand area in left premotor cortex than after stimulation of the hand area in right premotor cortex. These results suggest that premotor cortex has a functional role in action-language understanding.
Affiliation(s)
- Roel M Willems
- Helen Wills Neuroscience Institute, University of California, Berkeley, USA.
161
Willems RM, Casasanto D. Flexibility in embodied language understanding. Front Psychol 2011; 2:116. [PMID: 21779264] [PMCID: PMC3132681] [DOI: 10.3389/fpsyg.2011.00116]
Abstract
Do people use sensori-motor cortices to understand language? Here we review neurocognitive studies of language comprehension in healthy adults and evaluate their possible contributions to theories of language in the brain. We start by sketching the minimal predictions that an embodied theory of language understanding makes for empirical research, and then survey studies that have been offered as evidence for embodied semantic representations. We explore four debated issues: first, does activation of sensori-motor cortices during action language understanding imply that action semantics relies on mirror neurons? Second, what is the evidence that activity in sensori-motor cortices plays a functional role in understanding language? Third, to what extent do responses in perceptual and motor areas depend on the linguistic and extra-linguistic context? And finally, can embodied theories accommodate language about abstract concepts? Based on the available evidence, we conclude that sensori-motor cortices are activated during a variety of language comprehension tasks, for both concrete and abstract language. Yet, this activity depends on the context in which perception and action words are encountered. Although modality-specific cortical activity is not a sine qua non of language processing even for language about perception and action, sensori-motor regions of the brain appear to make functional contributions to the construction of meaning, and should therefore be incorporated into models of the neurocognitive architecture of language.
Affiliation(s)
- Roel M Willems
- Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, The Netherlands
162
Struiksma ME, Noordzij ML, Postma A. Embodied representation of the body contains veridical spatial information. Q J Exp Psychol (Hove) 2011; 64:1124-37. [DOI: 10.1080/17470218.2011.552982]
Abstract
In two experiments, the extent to which mental body representations contain spatial information was examined. Participants were asked to compare distances between various body parts. Similar to what happens when people compare distances on a real visual stimulus, they were faster as the distance differences between body parts became larger (Experiment 1), and this effect could not (only) be explained by the crossing of major bodily categories (umbilicus to knee vs. knee to ankle; Experiment 2). In addition, participants also performed simple animate/inanimate verification on a set of nouns. The nouns describing animate items were names of body parts. A spatial priming effect was found: Verification was faster for body part items preceded by body parts in close spatial proximity. This suggests automatic activation of spatial body information. Taken together, results from the distance comparison task and the property verification task showed that mental body representations contain both categorical and more metric spatial information. These findings are further discussed in terms of recent embodied cognition theories.
Affiliation(s)
- Marijn E. Struiksma
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Matthijs L. Noordzij
- Department of Cognitive Psychology and Ergonomics, University of Twente, Enschede, The Netherlands
- Albert Postma
- Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
163
Potgieser ARE, de Jong BM. Different distal-proximal movement balances in right- and left-hand writing may hint at differential premotor cortex involvement. Hum Mov Sci 2011; 30:1072-8. [PMID: 21612835] [DOI: 10.1016/j.humov.2011.02.005]
Abstract
Right-handed people generally write with their right hand. Language expressed in script is thus performed with the hand also preferred for skilled motor tasks. This may suggest an efficient functional interaction between the language area of Broca and the adjacent ventral premotor cortex (PMv) in the left (dominant) hemisphere. Pilot observations suggested that distal movements are particularly implicated in cursive writing with the right hand and proximal movements in left-hand writing, which generated ideas concerning hemisphere-specific roles of PMv and dorsal premotor cortex (PMd). Now we examined upper-limb movements in 30 right-handed participants during right- and left-hand writing, respectively. Quantitative description of distal and proximal movements demonstrated a significant difference between movements in right- and left-hand writing (p<.001, Wilcoxon signed-rank test). A Distal Movement Excess (DME) characterized writing with the right hand, while proximal and distal movements similarly contributed to left-hand writing. Although differences between non-language drawings were not tested, we propose that the DME in right-hand writing may reflect functional dominance of PMv in the left hemisphere. More proximal movements in left-hand writing might be related to PMd dominance in right-hemisphere motor control, logically implicated in spatial visuomotor transformations as seen in reaching.
Affiliation(s)
- A R E Potgieser
- Department of Neurology, University Medical Center Groningen, University of Groningen, Hanzeplein 1, PO Box 30.001, Groningen, The Netherlands
164
Bedny M, Caramazza A. Perception, action, and word meanings in the human brain: the case from action verbs. Ann N Y Acad Sci 2011; 1224:81-95. [PMID: 21486297] [DOI: 10.1111/j.1749-6632.2011.06013.x]
Affiliation(s)
- Marina Bedny
- Brain and Cognitive Sciences Department, Massachusetts Institute of Technology, Cambridge, Massachusetts; Department of Psychology, Harvard University, Cambridge, Massachusetts; Center for Mind/Brain Sciences (CIMeC), University of Trento, Trento, Italy
- Alfonso Caramazza
- Brain and Cognitive Sciences Department, Massachusetts Institute of Technology, Cambridge, Massachusetts; Department of Psychology, Harvard University, Cambridge, Massachusetts; Center for Mind/Brain Sciences (CIMeC), University of Trento, Trento, Italy
165
Iverson JM, Braddock BA. Gesture and motor skill in relation to language in children with language impairment. J Speech Lang Hear Res 2011; 54:72-86. [PMID: 20719867] [DOI: 10.1044/1092-4388(2010/08-0197)]
Abstract
PURPOSE: To examine gesture and motor abilities in relation to language in children with language impairment (LI).
METHOD: Eleven children with LI (aged 2;7 to 6;1 [years;months]) and 16 typically developing (TD) children of similar chronological ages completed 2 picture narration tasks, and their language (rate of verbal utterances, mean length of utterance, and number of different words) and gestures (coded for type, co-occurrence with language, and informational relationship to language) were examined. Fine and gross motor items from the Battelle Developmental Screening Inventory (J. Newborg, J. R. Stock, L. Wneck, J. Guidubaldi, & J. Suinick, 1994) and the Child Development Inventory (H. R. Ireton, 1992) were administered.
RESULTS: Relative to TD peers, children with LI used gestures at a higher rate and produced greater proportions of gesture-only communications, conventional gestures, and gestures that added unique information to co-occurring language. However, they performed more poorly on measures of fine and gross motor abilities. Regression analyses indicated that within the LI but not the TD group, poorer expressive language was related to more frequent gesture production.
CONCLUSIONS: When language is impaired, difficulties are also apparent in motor abilities, but gesture assumes a compensatory role. These findings underscore the utility of including spontaneous gesture and motor abilities in clinical assessment of and intervention for preschool children with language concerns.
Affiliation(s)
- Jana M Iverson
- Department of Psychology, University of Pittsburgh, 3415 Sennott Square, 210 South Bouquet Street, Pittsburgh, PA 15260, USA.
166
Ibáñez A, Toro P, Cornejo C, Urquina H, Manes F, Weisbrod M, Schröder J. High contextual sensitivity of metaphorical expressions and gesture blending: a video event-related potential design. Psychiatry Res 2011; 191:68-75. [PMID: 21129937] [DOI: 10.1016/j.pscychresns.2010.08.008]
Abstract
Human communication in a natural context implies the dynamic coordination of contextual clues, paralinguistic information and literal as well as figurative language use. In the present study we constructed a paradigm with four types of video clips: literal and metaphorical expressions accompanied by congruent and incongruent gesture actions. Participants were instructed to classify the gesture accompanying the expression as congruent or incongruent by pressing two different keys while electrophysiological activity was being recorded. We compared behavioral measures and event related potential (ERP) differences triggered by the gesture stroke onset. Accuracy data showed that incongruent metaphorical expressions were more difficult to classify. Reaction times were modulated by incongruent gestures, by metaphorical expressions and by a gesture-expression interaction. No behavioral differences were found between the literal and metaphorical expressions when the gesture was congruent. N400-like and LPC-like (late positive complex) components from metaphorical expressions produced greater negativity. The N400-like modulation of metaphorical expressions showed a greater difference between congruent and incongruent categories over the left anterior region, compared with the literal expressions. More importantly, the literal congruent as well as the metaphorical congruent categories did not show any difference. Accuracy, reaction times and ERPs provide convergent support for a greater contextual sensitivity of the metaphorical expressions.
Affiliation(s)
- Agustín Ibáñez
- Laboratory of Experimental Psychology and Neurosciences, Institute of Cognitive Neurology (INECO), Buenos Aires, Argentina.
167
Courtin C, Jobard G, Vigneau M, Beaucousin V, Razafimandimby A, Hervé PY, Mellet E, Zago L, Petit L, Mazoyer B, Tzourio-Mazoyer N. A common neural system is activated in hearing non-signers to process French Sign Language and spoken French. Brain Res Bull 2011; 84:75-87. [DOI: 10.1016/j.brainresbull.2010.09.013]
168
Language sensorimotor specificity modulates the motor system. Cortex 2010; 48:849-56. [PMID: 21227411] [DOI: 10.1016/j.cortex.2010.12.003]
Abstract
Embodied approaches to language understanding hold that comprehension of linguistic material entails a situated simulation of the situation described. Some recent studies have shown that implicit, explicit, and relational properties of objects implied in a sentence are part of this simulation. However, the issue concerning the extent to which language sensorimotor specificity expressed by linguistic constituents of a sentence, contributes to situating the simulation process has not yet been adequately addressed. To fill this gap, we combined a concrete action verb with a noun denoting a graspable or non-graspable object, to form a sensible or non-sensible sentence. Verbs could express a specific action with low degrees of freedom (DoF) or an action with high DoF. Participants were asked to respond indicating whether the sentences were sensible or not. We found that simulation was active in understanding both sensible and non-sensible sentences. Moreover, the simulation was more situated with sentences containing a verb referring to an action with low DoF. Language sensorimotor specificity expressed by the noun, played a role in situating the simulation, only when the noun was preceded by a verb denoting an action with high DoF in sensible sentences. The simulation process in understanding non-sensible sentences evoked both the representations related to the verb and to the noun, these remaining separated rather than being integrated as in sensible sentences. Overall our findings are in keeping with embodied approaches to language understanding and suggest that the language sensorimotor specificity of sentence constituents affects the extent to which the simulation is situated.
169
Willems RM, Varley R. Neural Insights into the Relation between Language and Communication. Front Hum Neurosci 2010; 4:203. [PMID: 21151364] [PMCID: PMC2996040] [DOI: 10.3389/fnhum.2010.00203]
Abstract
The human capacity to communicate has been hypothesized to be causally dependent upon language. Intuitively this seems plausible since most communication relies on language. Moreover, intention recognition abilities (as a necessary prerequisite for communication) and language development seem to co-develop. Here we review evidence from neuroimaging as well as from neuropsychology to evaluate the relationship between communicative and linguistic abilities. Our review indicates that communicative abilities are best considered as neurally distinct from language abilities. This conclusion is based upon evidence showing that humans rely on different cortical systems when designing a communicative message for someone else as compared to when performing core linguistic tasks, as well as upon observations of individuals with severe language loss after extensive lesions to the language system, who are still able to perform tasks involving intention understanding.
Affiliation(s)
- Roel M Willems: Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA, USA
|
170
|
Willems RM, Toni I, Hagoort P, Casasanto D. Neural Dissociations between Action Verb Understanding and Motor Imagery. J Cogn Neurosci 2010; 22:2387-400. [DOI: 10.1162/jocn.2009.21386] [Citation(s) in RCA: 118] [Impact Index Per Article: 8.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/01/2023]
Abstract
According to embodied theories of language, people understand a verb like throw, at least in part, by mentally simulating throwing. This implicit simulation is often assumed to be similar or identical to motor imagery. Here we used fMRI to test whether implicit simulations of actions during language understanding involve the same cortical motor regions as explicit motor imagery. Healthy participants were presented with verbs related to hand actions (e.g., to throw) and nonmanual actions (e.g., to kneel). They either read these verbs (lexical decision task) or actively imagined performing the actions named by the verbs (imagery task). Primary motor cortex showed effector-specific activation during imagery, but not during lexical decision. Parts of premotor cortex distinguished manual from nonmanual actions during both lexical decision and imagery, but there was no overlap or correlation between regions activated during the two tasks. These dissociations suggest that implicit simulation and explicit imagery cued by action verbs may involve different types of motor representations and that the construct of “mental simulation” should be distinguished from “mental imagery” in embodied theories of language.
Affiliation(s)
- Roel M. Willems: Radboud University Nijmegen, The Netherlands; University of California, Berkeley
- Ivan Toni: Radboud University Nijmegen, The Netherlands
- Peter Hagoort: Radboud University Nijmegen, The Netherlands; Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Daniel Casasanto: Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
|
171
|
Weiskopf DA. Understanding is not simulating: a reply to Gibbs and Perlman. STUDIES IN HISTORY AND PHILOSOPHY OF SCIENCE 2010; 41:309-312. [PMID: 21466122 DOI: 10.1016/j.shpsa.2010.07.002] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/30/2023]
Abstract
In this response, I do four things. First, I defend the claim that the action compatibility effect does not distinguish between embodied and traditional accounts of language comprehension. Second, I present neuroimaging and neuropsychological results that seem to support the traditional account. Third, I argue that metaphorical language poses no special challenge to the arguments I gave against embodied theories of comprehension. Fourth, I lay out the architecture of language I advocate and suggest the sorts of data that would decide between traditional and embodied accounts.
Affiliation(s)
- Daniel A Weiskopf: Department of Philosophy, Georgia State University, P.O. Box 4089, Atlanta, GA 30302, USA
|
172
|
Jirak D, Menz MM, Buccino G, Borghi AM, Binkofski F. Grasping language--a short story on embodiment. Conscious Cogn 2010; 19:711-20. [PMID: 20739194 DOI: 10.1016/j.concog.2010.06.020] [Citation(s) in RCA: 98] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/14/2009] [Revised: 06/26/2010] [Accepted: 06/28/2010] [Indexed: 12/29/2022]
Abstract
The concept of embodied cognition has been enthusiastically studied by the cognitive sciences as well as by such disparate disciplines as philosophy, anthropology, neuroscience, and robotics. Embodiment theory provides the framework for ongoing discussions on the linkage between "low" cognitive processes such as perception and "high" cognition such as language processing and comprehension. This review gives an overview of the lines of argumentation in the ongoing debate on the embodiment of language and employs an ALE meta-analysis to illustrate and weigh previous findings. The collected evidence on the somatotopic activation of motor areas, on abstract and concrete word processing, and from reported patient and timing studies emphasizes the important role of sensorimotor areas in language processing and supports the hypothesis that the motor system is activated during language comprehension.
Affiliation(s)
- Doreen Jirak: Department of Systems Neuroscience and Neuroimage Nord, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
|
173
|
Aravena P, Hurtado E, Riveros R, Cardona JF, Manes F, Ibáñez A. Applauding with closed hands: neural signature of action-sentence compatibility effects. PLoS One 2010; 5:e11751. [PMID: 20676367 PMCID: PMC2911376 DOI: 10.1371/journal.pone.0011751] [Citation(s) in RCA: 82] [Impact Index Per Article: 5.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/20/2010] [Accepted: 06/25/2010] [Indexed: 11/18/2022] Open
Abstract
BACKGROUND: Behavioral studies have provided evidence for an action-sentence compatibility effect (ACE) that suggests a coupling of motor mechanisms and action-sentence comprehension. When both processes are concurrent, the action sentence primes the actual movement and, simultaneously, the action affects comprehension. The aim of the present study was to investigate brain markers of the bidirectional impact of language comprehension and motor processes.
METHODOLOGY/PRINCIPAL FINDINGS: Participants listened to sentences describing an action that involved an open hand, a closed hand, or no manual action. Each participant was asked to press a button to indicate his/her understanding of the sentence. Each participant was assigned a hand-shape, either closed or open, which had to be used to activate the button. There were two groups (depending on the assigned hand-shape) and three categories (compatible, incompatible, and neutral) defined according to the compatibility between the response and the sentence. ACEs were found in both groups. Brain markers of semantic processing exhibited an N400-like component around the Cz electrode position. This component distinguishes between compatible and incompatible, with a greater negative deflection for incompatible. The motor response elicited a motor potential (MP) and a re-afferent potential (RAP), both of which were enhanced in the compatible condition.
CONCLUSIONS/SIGNIFICANCE: The present findings provide the first cortical ACE measurements of semantic processing and the motor response. The N400-like effects suggest that incompatibility with motor processes interferes with sentence comprehension in a semantic fashion. Modulation of the motor potentials (MP and RAP) revealed a multimodal semantic facilitation of the motor response. Both results provide neural evidence of a bidirectional action-sentence relationship. Our results suggest that the ACE is not an epiphenomenal post-sentence comprehension process; rather, motor-language integration occurring at verb onset supports a genuine and ongoing brain motor-language interaction.
Affiliation(s)
- Pia Aravena: Laboratory of Experimental Psychology & Neuroscience, Institute of Cognitive Neurology (INECO), Buenos Aires, Capital Federal, Argentina; Laboratory of Cognitive Neuroscience, Universidad Diego Portales, Santiago, Chile
- Esteban Hurtado: Doctoral Program, Psychology School, Pontificia Universidad Católica de Chile, Santiago, Chile
- Rodrigo Riveros: Laboratory of Cognitive Neuroscience, Universidad Diego Portales, Santiago, Chile
- Juan Felipe Cardona: Laboratory of Experimental Psychology & Neuroscience, Institute of Cognitive Neurology (INECO), Buenos Aires, Capital Federal, Argentina
- Facundo Manes: Laboratory of Experimental Psychology & Neuroscience, Institute of Cognitive Neurology (INECO), Buenos Aires, Capital Federal, Argentina; Institute of Neuroscience, Favaloro University, Buenos Aires, Capital Federal, Argentina
- Agustín Ibáñez: Laboratory of Experimental Psychology & Neuroscience, Institute of Cognitive Neurology (INECO), Buenos Aires, Capital Federal, Argentina; Laboratory of Cognitive Neuroscience, Universidad Diego Portales, Santiago, Chile; Institute of Neuroscience, Favaloro University, Buenos Aires, Capital Federal, Argentina; National Scientific and Technical Research Council (CONICET), Buenos Aires, Capital Federal, Argentina
|
174
|
Borghi AM, Gianelli C, Scorolli C. Sentence comprehension: effectors and goals, self and others. An overview of experiments and implications for robotics. Front Neurorobot 2010; 4:3. [PMID: 20589241 PMCID: PMC2892993 DOI: 10.3389/fnbot.2010.00003] [Citation(s) in RCA: 12] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2009] [Accepted: 04/27/2010] [Indexed: 11/13/2022] Open
Abstract
According to theories referring to embodied and grounded cognition (Barsalou, 2008), language comprehension encompasses an embodied simulation of actions. The neural underpinnings of this simulation could be found in wide neural circuits that involve canonical and mirror neurons (Rizzolatti et al., 1996). In keeping with this view, we review behavioral and kinematic studies conducted in our lab which help characterize the relationship existing between language and the motor system. Overall, our results reveal that the simulation evoked during sentence comprehension is fine-grained, primarily in its sensitivity to the different effectors we employ to perform actions. In addition, they suggest that linguistic comprehension also relies on the representation of actions in terms of goals and of the chains of motor acts necessary to accomplish them. Finally, they indicate that these goals are modulated by both the object features the sentence refers to as well as by social aspects such as the characteristics of the agents implied by sentences. We will discuss the implications of these studies for embodied robotics.
Affiliation(s)
- Anna M. Borghi: Department of Psychology, University of Bologna, Bologna, Italy; Institute of Sciences and Technologies of Cognition, National Research Council, Rome, Italy
|
175
|
Willems RM, Clevis K, Hagoort P. Add a picture for suspense: neural correlates of the interaction between language and visual information in the perception of fear. Soc Cogn Affect Neurosci 2010; 6:404-16. [PMID: 20530540 DOI: 10.1093/scan/nsq050] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022] Open
Abstract
We investigated how visual and linguistic information interact in the perception of emotion. We borrowed a phenomenon from film theory which states that presentation of an as such neutral visual scene intensifies the percept of fear or suspense induced by a different channel of information, such as language. Our main aim was to investigate how neutral visual scenes can enhance responses to fearful language content in parts of the brain involved in the perception of emotion. Healthy participants' brain activity was measured (using functional magnetic resonance imaging) while they read fearful and less fearful sentences presented with or without a neutral visual scene. The main idea is that the visual scenes intensify the fearful content of the language by subtly implying and concretizing what is described in the sentence. Activation levels in the right anterior temporal pole were selectively increased when a neutral visual scene was paired with a fearful sentence, compared to reading the sentence alone, as well as to reading of non-fearful sentences presented with the same neutral scene. We conclude that the right anterior temporal pole serves a binding function of emotional information across domains such as visual and linguistic information.
Affiliation(s)
- Roel M Willems: Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, PO Box 9101, 6500 HB Nijmegen, The Netherlands
|
176
|
|
177
|
Nelissen N, Pazzaglia M, Vandenbulcke M, Sunaert S, Fannes K, Dupont P, Aglioti SM, Vandenberghe R. Gesture discrimination in primary progressive aphasia: the intersection between gesture and language processing pathways. J Neurosci 2010; 30:6334-41. [PMID: 20445059 PMCID: PMC6632725 DOI: 10.1523/jneurosci.0321-10.2010] [Citation(s) in RCA: 33] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/19/2010] [Revised: 03/03/2010] [Accepted: 03/25/2010] [Indexed: 01/25/2023] Open
Abstract
The issue of the relationship between language and gesture processing and the partial overlap of their neural representations is of fundamental importance to neurology, psychology, and social sciences. Patients suffering from primary progressive aphasia, a clinical syndrome characterized by comparatively isolated language deficits, may provide direct evidence for anatomical and functional association between specific language deficits and gesture discrimination deficits. A consecutive series of 16 patients with primary progressive aphasia and 16 matched control subjects participated. Our nonverbal gesture discrimination task consisted of 19 trials. In each trial, participants observed three video clips showing the same gesture performed correctly in one clip and incorrectly in the other two. Subjects had to indicate which of the three versions was correct. Language and gesture production were evaluated by means of conventional tasks. All participants underwent high-resolution structural and diffusion tensor magnetic resonance imaging. Ten of the primary progressive aphasia patients showed a significant deficit on the nonverbal gesture discrimination task. A factor analysis revealed that this deficit clustered with gesture imitation, word and pseudoword repetition, and writing-to-dictation. Individual scores on this cluster correlated with volume in the left anterior inferior parietal cortex extending into the posterior superior temporal gyrus. Probabilistic tractography indicated this region comprised the cortical relay station of the indirect pathway connecting the inferior frontal gyrus and the superior temporal cortex. Thus, the left perisylvian temporoparietal area may underpin verbal imitative behavior, gesture imitation, and gesture discrimination indicative of a partly shared neural substrate for language and gesture resonance.
Affiliation(s)
- Natalie Nelissen: Laboratory for Cognitive Neurology, Experimental Neurology Section, Katholieke Universiteit Leuven, 3000 Leuven, Belgium
- Mariella Pazzaglia: Dipartimento di Psicologia, Università degli Studi di Roma “La Sapienza,” 00185 Rome, Italy; Istituto di Ricovero e Cura a Carattere Scientifico, Fondazione Santa Lucia, 00142 Rome, Italy
- Katrien Fannes: Laboratory for Cognitive Neurology, Experimental Neurology Section, Katholieke Universiteit Leuven, 3000 Leuven, Belgium
- Patrick Dupont: Laboratory for Cognitive Neurology, Experimental Neurology Section, Katholieke Universiteit Leuven, 3000 Leuven, Belgium
- Salvatore M. Aglioti: Dipartimento di Psicologia, Università degli Studi di Roma “La Sapienza,” 00185 Rome, Italy; Istituto di Ricovero e Cura a Carattere Scientifico, Fondazione Santa Lucia, 00142 Rome, Italy
- Rik Vandenberghe: Laboratory for Cognitive Neurology, Experimental Neurology Section, Katholieke Universiteit Leuven, 3000 Leuven, Belgium; Neurology, University Hospitals Leuven, 3000 Leuven, Belgium
|
178
|
Tomasino B, Weiss PH, Fink GR. To move or not to move: imperatives modulate action-related verb processing in the motor system. Neuroscience 2010; 169:246-58. [PMID: 20420884 DOI: 10.1016/j.neuroscience.2010.04.039] [Citation(s) in RCA: 111] [Impact Index Per Article: 7.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/05/2010] [Revised: 04/12/2010] [Accepted: 04/18/2010] [Indexed: 12/01/2022]
Abstract
It has been suggested that the processing of action-related words involves activation of the motor circuitry. Using fMRI (functional magnetic resonance imaging), the current study further explored the interaction between action and language by investigating whether the linguistic context, in which an action word occurs, modulates motor circuitry activity related to the processing of action words. To this end, we examined whether the presentation of hand action-related verbs as positive or negative imperatives, for example, "Do grasp" or "Don't write," modulates neural activity in the hand area of primary motor cortex (M1) or premotor cortex (Pm). Subjects (n = 19) were asked to read silently the imperative phrases, in which both meaningful action verbs and meaningless pseudo-verbs were presented, and to decide whether they made sense (lexical decision task). At the behavioral level, response times in the lexical decision task were significantly longer for negative, compared to positive, imperatives. At the neural level, activity was differentially decreased by action verbs presented as negative imperatives for the premotor and the primary motor cortex of both hemispheres. The data suggest that context (here: positive vs. negative imperatives), in which an action verb is encountered, modulates the neural activity within key areas of the motor system. The finding implies that motor simulation (or motor planning) rather than semantic processing per se may underlie previously observed motor system activation related to action verb processing. Furthermore, the current data suggest that negative imperatives may inhibit motor simulation or motor planning processes.
Affiliation(s)
- B Tomasino: Cognitive Neurology Section, Institute of Neuroscience and Medicine (INM-3), Research Centre Juelich, Germany
|
179
|
The Semantic Specificity Hypothesis: When Gestures Do Not Depend Upon the Presence of a Listener. JOURNAL OF NONVERBAL BEHAVIOR 2010. [DOI: 10.1007/s10919-010-0089-7] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
|
180
|
Kelly SD, Creigh P, Bartolotti J. Integrating Speech and Iconic Gestures in a Stroop-like Task: Evidence for Automatic Processing. J Cogn Neurosci 2010; 22:683-94. [DOI: 10.1162/jocn.2009.21254] [Citation(s) in RCA: 78] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Previous research has demonstrated a link between language and action in the brain. The present study investigates the strength of this neural relationship by focusing on a potential interface between the two systems: cospeech iconic gesture. Participants performed a Stroop-like task in which they watched videos of a man and a woman speaking and gesturing about common actions. The videos differed as to whether the gender of the speaker and gesturer was the same or different and whether the content of the speech and gesture was congruent or incongruent. The task was to identify whether a man or a woman produced the spoken portion of the videos while accuracy rates, RTs, and ERPs were recorded to the words. Although not relevant to the task, participants paid attention to the semantic relationship between the speech and the gesture, producing a larger N400 to words accompanied by incongruent versus congruent gestures. In addition, RTs were slower to incongruent versus congruent gesture–speech stimuli, but this effect was greater when the gender of the gesturer and speaker was the same versus different. These results suggest that the integration of gesture and speech during language comprehension is automatic but also under some degree of neurocognitive control.
|
181
|
Iverson JM. Developing language in a developing body: the relationship between motor development and language development. JOURNAL OF CHILD LANGUAGE 2010; 37:229-61. [PMID: 20096145 PMCID: PMC2833284 DOI: 10.1017/s0305000909990432] [Citation(s) in RCA: 390] [Impact Index Per Article: 27.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/10/2023]
Abstract
During the first eighteen months of life, infants acquire and refine a whole set of new motor skills that significantly change the ways in which the body moves in and interacts with the environment. In this review article, I argue that motor acquisitions provide infants with an opportunity to practice skills relevant to language acquisition before they are needed for that purpose; and that the emergence of new motor skills changes infants' experience with objects and people in ways that are relevant for both general communicative development and the acquisition of language. Implications of this perspective for current views of co-occurring language and motor impairments and for methodology in the field of child language research are also considered.
|
182
|
Bergen B, Wheeler K. Grammatical aspect and mental simulation. BRAIN AND LANGUAGE 2010; 112:150-158. [PMID: 19656554 DOI: 10.1016/j.bandl.2009.07.002] [Citation(s) in RCA: 43] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/03/2008] [Revised: 06/09/2009] [Accepted: 07/03/2009] [Indexed: 05/28/2023]
Abstract
When processing sentences about perceptible scenes and performable actions, language understanders activate perceptual and motor systems to perform mental simulations of those events. But little is known about exactly what linguistic elements activate modality-specific systems during language processing. While it is known that content words, like nouns and verbs, influence the content of a mental simulation, the role of grammar is less well understood. We investigate the role of grammatical markers in mental simulation through two experiments in which we manipulate the meanings of sentences by modifying the grammatical aspect they use. Using the Action-sentence Compatibility Effect (ACE) methodology [Glenberg, A., Kaschak, M. (2002). Grounding language in action. Psychonomic Bulletin and Review, 9, 558-565], we show that progressive sentences about hand motion facilitate manual action in the same direction, while perfect sentences that are identical in every way except their aspect do not. The broader implication of this finding for language processing is that while content words tell understanders what to mentally simulate and what brain regions to use in performing these simulations, grammatical constructions such as aspect modulate how those simulations are performed.
Affiliation(s)
- Benjamin Bergen: University of Hawai'i at Manoa, Department of Linguistics, Honolulu, HI 96822, United States
|
183
|
|
184
|
Dick AS, Goldin-Meadow S, Hasson U, Skipper JI, Small SL. Co-speech gestures influence neural activity in brain regions associated with processing semantic information. Hum Brain Mapp 2010; 30:3509-26. [PMID: 19384890 DOI: 10.1002/hbm.20774] [Citation(s) in RCA: 113] [Impact Index Per Article: 8.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022] Open
Abstract
Everyday communication is accompanied by visual information from several sources, including co-speech gestures, which provide semantic information listeners use to help disambiguate the speaker's message. Using fMRI, we examined how gestures influence neural activity in brain regions associated with processing semantic information. The BOLD response was recorded while participants listened to stories under three audiovisual conditions and one auditory-only (speech alone) condition. In the first audiovisual condition, the storyteller produced gestures that naturally accompany speech. In the second, the storyteller made semantically unrelated hand movements. In the third, the storyteller kept her hands still. In addition to inferior parietal and posterior superior and middle temporal regions, bilateral posterior superior temporal sulcus and left anterior inferior frontal gyrus responded more strongly to speech when it was further accompanied by gesture, regardless of the semantic relation to speech. However, the right inferior frontal gyrus was sensitive to the semantic import of the hand movements, demonstrating more activity when hand movements were semantically unrelated to the accompanying speech. These findings show that perceiving hand movements during speech modulates the distributed pattern of neural activation involved in both biological motion perception and discourse comprehension, suggesting listeners attempt to find meaning, not only in the words speakers produce, but also in the hand movements that accompany speech.
Affiliation(s)
- Anthony Steven Dick: Department of Neurology, The University of Chicago, Chicago, Illinois 60637, USA
|
185
|
Wartenburger I, Kühn E, Sassenberg U, Foth M, Franz EA, der Meer EV. On the relationship between fluid intelligence, gesture production, and brain structure. INTELLIGENCE 2010. [DOI: 10.1016/j.intell.2009.11.001] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
|
186
|
Kelly SD, Ozyürek A, Maris E. Two sides of the same coin: speech and gesture mutually interact to enhance comprehension. Psychol Sci 2009; 21:260-7. [PMID: 20424055 DOI: 10.1177/0956797609357327] [Citation(s) in RCA: 148] [Impact Index Per Article: 9.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022] Open
Abstract
Gesture and speech are assumed to form an integrated system during language production. Based on this view, we propose the integrated-systems hypothesis, which explains two ways in which gesture and speech are integrated--through mutual and obligatory interactions--in language comprehension. Experiment 1 presented participants with action primes (e.g., someone chopping vegetables) and bimodal speech and gesture targets. Participants related primes to targets more quickly and accurately when they contained congruent information (speech: "chop"; gesture: chop) than when they contained incongruent information (speech: "chop"; gesture: twist). Moreover, the strength of the incongruence affected processing, with fewer errors for weak incongruities (speech: "chop"; gesture: cut) than for strong incongruities (speech: "chop"; gesture: twist). Crucial for the integrated-systems hypothesis, this influence was bidirectional. Experiment 2 demonstrated that gesture's influence on speech was obligatory. The results confirm the integrated-systems hypothesis and demonstrate that gesture and speech form an integrated system in language comprehension.
Affiliation(s)
- Spencer D Kelly: Department of Psychology, Colgate University, Hamilton, NY 13346, USA
|
187
|
Abstract
There have been relatively few discussions of systematic studies of language, including neuroscience studies, in the psychoanalytic literature. To address this dearth, a detailed review of research on embodied language in neuroscience and related disciplines is presented, after which their findings are considered in light of diverse views of language in psychoanalysis, specifically the models of the Boston Change Process Study Group, Wilma Bucci, Fonagy and Target, David Olds, and Hans Loewald. The juxtaposition of psychoanalytic models with the findings of research on embodied language shows that scientific studies can focus psychoanalytic understanding of verbal processes, and that integrations with neuroscience neither inherently threaten the traditional psychoanalytic focus on verbal meanings nor reduce the richness and complexity of psychoanalytic theory.
Affiliation(s)
- Jeanine M Vivona: The College of New Jersey; adjunct clinical faculty, Department of Psychiatry, Pennsylvania Hospital, PA, USA
|
188
|
Abstract
According to theories of embodied cognition, understanding a verb like throw involves unconsciously simulating the action of throwing, using areas of the brain that support motor planning. If understanding action words involves mentally simulating one’s own actions, then the neurocognitive representation of word meanings should differ for people with different kinds of bodies, who perform actions in systematically different ways. In a test of the body-specificity hypothesis, we used functional magnetic resonance imaging to compare premotor activity correlated with action verb understanding in right- and left-handers. Right-handers preferentially activated the left premotor cortex during lexical decisions on manual-action verbs (compared with nonmanual-action verbs), whereas left-handers preferentially activated right premotor areas. This finding helps refine theories of embodied semantics, suggesting that implicit mental simulation during language processing is body specific: Right- and left-handers, who perform actions differently, use correspondingly different areas of the brain for representing action verb meanings.
Affiliation(s)
- Roel M. Willems: Donders Institute for Brain, Cognition and Behaviour, Centre for Cognitive Neuroimaging, Radboud University Nijmegen; Helen Wills Neuroscience Institute, University of California, Berkeley
- Peter Hagoort: Donders Institute for Brain, Cognition and Behaviour, Centre for Cognitive Neuroimaging, Radboud University Nijmegen; Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Daniel Casasanto: Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
|
189
|
Willems RM, Toni I, Hagoort P, Casasanto D. Body-specific motor imagery of hand actions: neural evidence from right- and left-handers. Front Hum Neurosci 2009; 3:39. [PMID: 19949484 PMCID: PMC2784680 DOI: 10.3389/neuro.09.039.2009] [Citation(s) in RCA: 57] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/20/2009] [Accepted: 10/09/2009] [Indexed: 11/26/2022] Open
Abstract
If motor imagery uses neural structures involved in action execution, then the neural correlates of imagining an action should differ between individuals who tend to execute the action differently. Here we report fMRI data showing that motor imagery is influenced by the way people habitually perform motor actions with their particular bodies; that is, motor imagery is ‘body-specific’ (Casasanto, 2009). During mental imagery for complex hand actions, activation of cortical areas involved in motor planning and execution was left-lateralized in right-handers but right-lateralized in left-handers. We conclude that motor imagery involves the generation of an action plan that is grounded in the participant's motor habits, not just an abstract representation at the level of the action's goal. People with different patterns of motor experience form correspondingly different neurocognitive representations of imagined actions.
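The hemispheric lateralization reported above is conventionally quantified with a lateralization index computed over homologous regions of interest. A minimal sketch of that calculation (the ROI activation values below are hypothetical, not data from the study):

```python
def lateralization_index(left_activation: float, right_activation: float) -> float:
    """Standard fMRI lateralization index: LI = (L - R) / (L + R).

    Positive values indicate left-lateralized activation,
    negative values right-lateralized activation.
    """
    total = left_activation + right_activation
    if total == 0:
        return 0.0
    return (left_activation - right_activation) / total

# Hypothetical premotor ROI activations (arbitrary units):
right_hander = lateralization_index(left_activation=12.0, right_activation=4.0)
left_hander = lateralization_index(left_activation=4.0, right_activation=12.0)
print(right_hander, left_hander)  # 0.5 (left-lateralized), -0.5 (right-lateralized)
```

An LI near +1 or -1 marks strongly unilateral activation; values near 0 mark bilateral activation.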
Affiliation(s)
- Roel M Willems
- Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, The Netherlands.
|
190
|
Differential roles for left inferior frontal and superior temporal cortex in multimodal integration of action and language. Neuroimage 2009; 47:1992-2004. [DOI: 10.1016/j.neuroimage.2009.05.066] [Citation(s) in RCA: 116] [Impact Index Per Article: 7.7] [Received: 11/16/2008] [Revised: 05/13/2009] [Accepted: 05/21/2009] [Indexed: 11/20/2022]
|
191
|
Green A, Straube B, Weis S, Jansen A, Willmes K, Konrad K, Kircher T. Neural integration of iconic and unrelated coverbal gestures: a functional MRI study. Hum Brain Mapp 2009; 30:3309-24. [PMID: 19350562 PMCID: PMC6870774 DOI: 10.1002/hbm.20753] [Citation(s) in RCA: 94] [Impact Index Per Article: 6.3] [Received: 07/14/2008] [Revised: 12/31/2008] [Accepted: 01/20/2009] [Indexed: 11/09/2022]
Abstract
Gestures are an important part of interpersonal communication, for example by illustrating physical properties of speech content (e.g., "the ball is round"). The meaning of these so-called iconic gestures is strongly intertwined with speech. We investigated the neural correlates of the semantic integration of verbal and gestural information. Participants watched short videos of five speech and gesture conditions performed by an actor, including variation of language (familiar German vs. unfamiliar Russian), variation of gesture (iconic vs. unrelated), as well as isolated familiar language, while brain activation was measured using functional magnetic resonance imaging. For familiar speech with either gesture type contrasted with Russian speech-gesture pairs, activation increases were observed at the left temporo-occipital junction. Apart from this shared location, speech with iconic gestures exclusively engaged left occipital areas, whereas speech with unrelated gestures activated bilateral parietal and posterior temporal regions. Our results demonstrate that the processing of speech with speech-related versus speech-unrelated gestures occurs in two distinct but partly overlapping networks. The distinct processing streams (visual versus linguistic/spatial) are interpreted in terms of "auxiliary systems" allowing the integration of speech and gesture in the left temporo-occipital region.
Affiliation(s)
- Antonia Green
- Department of Psychiatry and Psychotherapy-Section of Experimental Psychopathology, RWTH Aachen University, Aachen, Germany.
|
192
|
Rudner M, Andin J, Rönnberg J. Working memory, deafness and sign language. Scand J Psychol 2009; 50:495-505. [DOI: 10.1111/j.1467-9450.2009.00744.x] [Citation(s) in RCA: 33] [Impact Index Per Article: 2.2] [Indexed: 11/29/2022]
|
193
|
Proudfoot D. Meaning and mind: Wittgenstein's relevance for the 'Does Language Shape Thought?' debate. NEW IDEAS IN PSYCHOLOGY 2009. [DOI: 10.1016/j.newideapsych.2008.04.012] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.5] [Indexed: 11/28/2022]
|
194
|
Barbieri F, Buonocore A, Volta RD, Gentilucci M. How symbolic gestures and words interact with each other. BRAIN AND LANGUAGE 2009; 110:1-11. [PMID: 19233459 DOI: 10.1016/j.bandl.2009.01.002] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.4] [Received: 02/27/2008] [Revised: 01/07/2009] [Accepted: 01/12/2009] [Indexed: 05/27/2023]
Abstract
Previous repetitive Transcranial Magnetic Stimulation and neuroimaging studies showed that Broca's area is involved in the interaction between gestures and words. However, these studies did not fully characterize the nature of this interaction, so we addressed it in three behavioral experiments. Compared with producing either signal alone, simultaneous production of congruent words and gestures slowed arm kinematics and amplified voice parameters (experiment 1). When word and gesture were incongruent, arm kinematics did not change regardless of word category, whereas the gesture induced variation in vocal parameters of communicative and action words only (experiments 2 and 3). The data are discussed according to the hypothesis that integration between gesture and word occurs by transferring the social intention to interact directly with the interlocutor from the gesture to the word.
Affiliation(s)
- Filippo Barbieri
- Dipartimento di Neuroscienze, Università di Parma, Parma 43100, Italy
|
195
|
Chieffi S, Secchi C, Gentilucci M. Deictic word and gesture production: Their interaction. Behav Brain Res 2009; 203:200-6. [PMID: 19433113 DOI: 10.1016/j.bbr.2009.05.003] [Citation(s) in RCA: 23] [Impact Index Per Article: 1.5] [Received: 12/22/2008] [Revised: 04/27/2009] [Accepted: 05/03/2009] [Indexed: 10/20/2022]
Abstract
We examined whether and how deictic gestures and words influence each other when the content of the gesture was congruent or incongruent with that of the simultaneously produced word. Two experiments were carried out. In Experiment 1, the participants read aloud the deictic word 'QUA' ('here') or 'LÀ' ('there'), printed on a token placed near to or far from their body. Simultaneously, they pointed towards their own body when the token was placed near, or at a remote position when the token was placed far. In this way, participants read 'QUA' ('here') and pointed towards themselves (congruent condition) or a remote position (incongruent condition); or they read 'LÀ' ('there') and pointed towards a remote position (congruent condition) or themselves (incongruent condition). In a control condition, in which a string of 'X' letters was printed on the token, the participants were silent and only pointed towards themselves (token placed near) or a remote position (token placed far). In Experiment 2, the participants read aloud the deictic word placed in the near or far position without gesturing. The results showed that the congruence/incongruence between the content of the deictic word and that of the gesture affected gesture kinematics and voice spectra. Indeed, the movement was faster in the congruent than in the control and incongruent conditions, and slower in the incongruent than in the control condition. Regarding voice spectra, formant 2 (F2) decreased in the incongruent conditions. The results suggest the existence of a bidirectional interaction between the speech and gesture production systems.
Affiliation(s)
- Sergio Chieffi
- Department of Neuroscience, Section of Physiology, University of Parma, Via Volturno 39, 43100 Parma, Italy.
|
196
|
Straube B, Green A, Weis S, Chatterjee A, Kircher T. Memory Effects of Speech and Gesture Binding: Cortical and Hippocampal Activation in Relation to Subsequent Memory Performance. J Cogn Neurosci 2009; 21:821-36. [DOI: 10.1162/jocn.2009.21053] [Citation(s) in RCA: 66] [Impact Index Per Article: 4.4] [Indexed: 11/04/2022]
Abstract
In human face-to-face communication, the content of speech is often illustrated by coverbal gestures. Behavioral evidence suggests that gestures provide advantages in the comprehension and memory of speech. Yet, how the human brain integrates abstract auditory and visual information into a common representation is not known. Our study investigates the neural basis of memory for bimodal speech and gesture representations. In this fMRI study, 12 participants were presented with video clips showing an actor performing meaningful metaphoric gestures (MG), unrelated, free gestures (FG), and no arm and hand movements (NG) accompanying sentences with an abstract content. After the fMRI session, the participants performed a recognition task. Behaviorally, the participants showed the highest hit rate for sentences accompanied by meaningful metaphoric gestures. Despite comparable old/new discrimination performances (d′) for the three conditions, we obtained distinct memory-related left-hemispheric activations in the inferior frontal gyrus (IFG), the premotor cortex (BA 6), and the middle temporal gyrus (MTG), as well as significant correlations between hippocampal activation and memory performance in the metaphoric gesture condition. In contrast, unrelated speech and gesture information (FG) was processed in areas of the left occipito-temporal and cerebellar region and the right IFG just like the no-gesture condition (NG). We propose that the specific left-lateralized activation pattern for the metaphoric speech–gesture sentences reflects semantic integration of speech and gestures. These results provide novel evidence about the neural integration of abstract speech and gestures as it contributes to subsequent memory performance.
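The old/new discrimination measure d′ reported above is computed from hit and false-alarm rates via their inverse-normal transforms: d′ = z(H) − z(FA). A minimal sketch of the calculation (the example rates are illustrative, not the study's data):

```python
from statistics import NormalDist


def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Signal-detection sensitivity: d' = z(H) - z(FA),
    where z is the inverse of the standard normal CDF.
    Rates must lie strictly between 0 and 1."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)


# Illustrative recognition-memory rates:
print(round(d_prime(0.80, 0.20), 2))  # ≈ 1.68
```

Conditions can differ in raw hit rate yet yield comparable d′ when false-alarm rates shift accordingly, which is why d′ rather than hit rate is the appropriate discrimination measure here.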
|
197
|
Willems RM, Hagoort P. Hand preference influences neural correlates of action observation. Brain Res 2009; 1269:90-104. [PMID: 19272363 DOI: 10.1016/j.brainres.2009.02.057] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.1] [Received: 02/07/2008] [Revised: 01/28/2009] [Accepted: 02/25/2009] [Indexed: 10/21/2022]
Abstract
It has been argued that we map observed actions onto our own motor system. Here we addressed this issue by investigating whether hand preference influences the neural correlates of action observation for simple, essentially meaningless hand actions. Such an influence would argue for an intricate neural coupling between action production and action observation, going beyond effects of motor repertoire or explicit motor training, as has been suggested before. Indeed, parts of the human motor system exhibited a close coupling between action production and action observation. Ventral premotor and inferior and superior parietal cortices showed differential activation for left- and right-handers that was similar during action production and during action observation. This suggests that mapping observed actions onto the observer's own motor system is a core feature of action observation - at least for actions that do not have a clear goal or meaning. Basic differences in the way we act upon the world are not only reflected in neural correlates of action production, but can also influence the brain basis of action observation.
Affiliation(s)
- Roel M Willems
- Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, PO Box 9101, 6500 HB Nijmegen, The Netherlands.
|
198
|
Willems RM, Hagoort P. Broca's region: battles are not won by ignoring half of the facts. Trends Cogn Sci 2009; 13:101; author reply 102. [DOI: 10.1016/j.tics.2008.12.001] [Citation(s) in RCA: 12] [Impact Index Per Article: 0.8] [Received: 12/08/2008] [Revised: 12/08/2008] [Accepted: 12/19/2008] [Indexed: 11/24/2022]
|
199
|
Hubbard AL, Wilson SM, Callan DE, Dapretto M. Giving speech a hand: gesture modulates activity in auditory cortex during speech perception. Hum Brain Mapp 2009; 30:1028-37. [PMID: 18412134 PMCID: PMC2644740 DOI: 10.1002/hbm.20565] [Citation(s) in RCA: 73] [Impact Index Per Article: 4.9] [Received: 09/26/2007] [Revised: 01/20/2008] [Accepted: 02/15/2008] [Indexed: 11/07/2022]
Abstract
Viewing hand gestures during face-to-face communication affects speech perception and comprehension. Despite the visible role played by gesture in social interactions, relatively little is known about how the brain integrates hand gestures with co-occurring speech. Here we used functional magnetic resonance imaging (fMRI) and an ecologically valid paradigm to investigate how beat gesture-a fundamental type of hand gesture that marks speech prosody-might impact speech perception at the neural level. Subjects underwent fMRI while listening to spontaneously-produced speech accompanied by beat gesture, nonsense hand movement, or a still body; as additional control conditions, subjects also viewed beat gesture, nonsense hand movement, or a still body all presented without speech. Validating behavioral evidence that gesture affects speech perception, bilateral nonprimary auditory cortex showed greater activity when speech was accompanied by beat gesture than when speech was presented alone. Further, the left superior temporal gyrus/sulcus showed stronger activity when speech was accompanied by beat gesture than when speech was accompanied by nonsense hand movement. Finally, the right planum temporale was identified as a putative multisensory integration site for beat gesture and speech (i.e., here activity in response to speech accompanied by beat gesture was greater than the summed responses to speech alone and beat gesture alone), indicating that this area may be pivotally involved in synthesizing the rhythmic aspects of both speech and gesture. Taken together, these findings suggest a common neural substrate for processing speech and gesture, likely reflecting their joint communicative role in social interactions.
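The multisensory integration criterion applied above (the bimodal response exceeding the sum of the unimodal responses) can be expressed as a simple contrast on condition-wise responses. A minimal sketch, with hypothetical response values rather than the study's data:

```python
def is_superadditive(bimodal: float, unimodal_a: float, unimodal_b: float) -> bool:
    """Superadditivity criterion for multisensory integration:
    the response to the combined stimulus must exceed the sum
    of the responses to each stimulus presented alone (AV > A + V)."""
    return bimodal > unimodal_a + unimodal_b


# Hypothetical ROI responses (e.g., percent signal change):
speech_plus_gesture = 1.2
speech_alone = 0.5
gesture_alone = 0.4
print(is_superadditive(speech_plus_gesture, speech_alone, gesture_alone))  # True
```

A region passing this test responds to the combination beyond what mere co-presentation of the two inputs predicts, which is the standard rationale for calling it an integration site.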
Affiliation(s)
- Amy L Hubbard
- Ahmanson-Lovelace Brain Mapping Center, University of California, Los Angeles, California 90095-7085, USA.
|
200
|
Kircher T, Straube B, Leube D, Weis S, Sachs O, Willmes K, Konrad K, Green A. Neural interaction of speech and gesture: Differential activations of metaphoric co-verbal gestures. Neuropsychologia 2009; 47:169-79. [DOI: 10.1016/j.neuropsychologia.2008.08.009] [Citation(s) in RCA: 76] [Impact Index Per Article: 5.1] [Received: 01/21/2008] [Revised: 07/23/2008] [Accepted: 08/01/2008] [Indexed: 10/21/2022]
|