401
Hauk O, Shtyrov Y, Pulvermüller F. The sound of actions as reflected by mismatch negativity: rapid activation of cortical sensory-motor networks by sounds associated with finger and tongue movements. Eur J Neurosci 2006; 23:811-21. [PMID: 16487161] [DOI: 10.1111/j.1460-9568.2006.04586.x]
Abstract
In order to explore the activation dynamics of the human action recognition system, we investigated electrophysiological distinctions between the brain responses to sounds produced by human finger and tongue movements. Of special interest were the questions of how early these differences may occur, and whether the neural activation at the early stages of processing involves cortical motor representations related to the generation of these sounds. For this purpose we employed a high-density EEG set-up and recorded mismatch negativity (MMN) using a recently developed multideviant paradigm that allows acquisition of a large number of trials within a given time period. Deviant stimuli were naturally recorded finger and tongue clicks, as well as control stimuli with similar physical features but without the clear action associations (this was tested in a separate behavioural experiment). Both natural stimuli produced larger MMNs than their respective control stimuli at approximately 100 ms, indicating activation of memory traces for familiar action-related sounds. Furthermore, MMN topography at this latency differed between the brain responses to the natural finger and natural tongue sounds. Source estimation revealed the strongest sources for finger sounds in centrolateral areas of the left hemisphere, suggesting that hearing a sound related to finger actions evokes activity in motor areas associated with the dominant hand. Tongue sounds, in contrast, produced activation in more inferior brain areas. Our data suggest that motor areas in the human brain are part of neural systems subserving the early automatic recognition of action-related sounds.
Affiliation(s)
- O Hauk
- MRC Cognition and Brain Sciences Unit, 15 Chaucer Road, Cambridge CB2 2EF, UK.
402
Keysers C, Gazzola V. Towards a unifying neural theory of social cognition. Prog Brain Res 2006; 156:379-401. [PMID: 17015092] [DOI: 10.1016/s0079-6123(06)56021-2]
Abstract
Humans can effortlessly understand much of what is going on in other people's minds. Understanding the neural basis of this capacity has proven quite difficult. Since the discovery of mirror neurons, a number of successful experiments have approached the question of how we understand the actions of others from the perspective of sharing their actions. Recently we have demonstrated that a similar logic may apply to understanding the emotions and sensations of others. Here, we therefore review evidence that a single mechanism (shared circuits) applies to actions, sensations and emotions: witnessing the actions, sensations and emotions of other individuals activates brain areas normally involved in performing the same actions and feeling the same sensations and emotions. We propose that these circuits, shared between the first-person (I do, I feel) and third-person perspective (seeing her do, seeing her feel), translate the vision and sound of what other people do and feel into the language of the observer's own actions and feelings. This translation could help us understand the actions and feelings of others by providing intuitive insights into their inner life. We propose a mechanism for the development of shared circuits on the basis of Hebbian learning, and underline that shared circuits could integrate with more cognitive functions during social cognition.
Affiliation(s)
- Christian Keysers
- BCN Neuro-Imaging-Centre, University Medical Center Groningen, University of Groningen, A. Deusinglaan 2, 9713AW Groningen, The Netherlands.
403
Abstract
Observing actions and understanding sentences about actions activates corresponding motor processes in the observer-comprehender. In 5 experiments, the authors addressed 2 novel questions regarding language-based motor resonance. The 1st question asks whether visual motion that is associated with an action produces motor resonance in sentence comprehension. The 2nd question asks whether motor resonance is modulated during sentence comprehension. The authors' experiments provide an affirmative response to both questions. A rotating visual stimulus affects both actual manual rotation and the comprehension of manual rotation sentences. Motor resonance is modulated by the linguistic input and is a rather immediate and localized phenomenon. The results are discussed in the context of theories of action observation and mental simulation.
Affiliation(s)
- Rolf A Zwaan
- Department of Psychology, Florida State University, Tallahassee, FL 32306-1270, USA.
405
Bernardis P, Gentilucci M. Speech and gesture share the same communication system. Neuropsychologia 2006; 44:178-90. [PMID: 16005477] [DOI: 10.1016/j.neuropsychologia.2005.05.007]
Abstract
Humans speak and produce symbolic gestures. Do these two forms of communication interact, and how? First, we tested whether the two communication signals influenced each other when emitted simultaneously. Participants either pronounced words, or executed symbolic gestures, or emitted the two communication signals simultaneously. Relative to the unimodal conditions, multimodal voice spectra were enhanced by gestures, whereas multimodal gesture parameters were reduced by words. In other words, gesture reinforced word, whereas word inhibited gesture. In contrast, aimless arm movements and pseudo-words had no comparable effects. Next, we tested whether observing word pronunciation during gesture execution affected verbal responses in the same way as emitting the two signals. Participants responded verbally to either spoken words, or to gestures, or to the simultaneous presentation of the two signals. We observed the same reinforcement in the voice spectra as during simultaneous emission. These results suggest that spoken word and symbolic gesture are coded as a single signal by a unique communication system. This signal represents the intention to engage in a closer interaction with a hypothetical interlocutor, and it may carry a meaning different from that of word and gesture encoded singly.
Affiliation(s)
- Paolo Bernardis
- Dipartimento di Neuroscienze, Università di Parma, via Volturno 39, 43100 Parma, Italy
406
Noppeney U, Josephs O, Kiebel S, Friston KJ, Price CJ. Action selectivity in parietal and temporal cortex. Cogn Brain Res 2005; 25:641-9. [PMID: 16242924] [DOI: 10.1016/j.cogbrainres.2005.08.017]
Abstract
The sensory-action theory proposes that the neural substrates underlying action representations are related to a visuomotor action system encompassing the left ventral premotor cortex, the anterior intraparietal area (AIP) and the left posterior middle temporal gyrus (LPMT). Using fMRI, we demonstrate that semantic decisions on action, relative to non-action words, increased activation in the left AIP and LPMT irrespective of whether the words were presented in written or spoken form. Left AIP and LPMT might thus play the role of amodal semantic regions that can be activated via auditory as well as visual input. Left AIP and LPMT did not distinguish between different types of actions such as hand actions and whole-body movements, although a right superior temporal sulcus (STS) region responded selectively to whole-body movements.
Affiliation(s)
- U Noppeney
- Wellcome Department of Imaging Neuroscience, Institute of Neurology, University College London, 12 Queen Square, London WC1N 3BG, UK.
407
Brass M, Heyes C. Imitation: is cognitive neuroscience solving the correspondence problem? Trends Cogn Sci 2005; 9:489-95. [PMID: 16126449] [DOI: 10.1016/j.tics.2005.08.007]
Abstract
Imitation poses a unique problem: how does the imitator know what pattern of motor activation will make their action look like that of the model? Specialist theories suggest that this correspondence problem has a unique solution; there are functional and neurological mechanisms dedicated to controlling imitation. Generalist theories propose that the problem is solved by general mechanisms of associative learning and action control. Recent research in cognitive neuroscience, stimulated by the discovery of mirror neurons, supports generalist solutions. Imitation is based on the automatic activation of motor representations by movement observation. These externally triggered motor representations are then used to reproduce the observed behaviour. This imitative capacity depends on learned perceptual-motor links. Finally, mechanisms distinguishing self from other are implicated in the inhibition of imitative behaviour.
Affiliation(s)
- Marcel Brass
- Department of Cognitive Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstrasse 1A, 04103 Leipzig, Germany.
408
Gentilucci M, Cattaneo L. Automatic audiovisual integration in speech perception. Exp Brain Res 2005; 167:66-75. [PMID: 16034571] [DOI: 10.1007/s00221-005-0008-z]
Abstract
Two experiments aimed to determine whether features of both the visual and acoustical inputs are always merged into the perceived representation of speech and whether this audiovisual integration is based on either cross-modal binding functions or on imitation. In a McGurk paradigm, observers were required to repeat aloud a string of phonemes uttered by an actor (acoustical presentation of the phonemic string) whose mouth, in contrast, mimicked pronunciation of a different string (visual presentation). In a control experiment participants read the same printed strings of letters. This condition aimed to analyze the pattern of voice and lip kinematics while controlling for imitation. In the control experiment and in the congruent audiovisual presentation, i.e. when the articulatory mouth gestures were congruent with the emission of the string of phonemes, the voice spectrum and the lip kinematics varied according to the pronounced strings of phonemes. In the McGurk paradigm the participants were unaware of the incongruence between visual and acoustical stimuli. The acoustical analysis of the participants' spoken responses showed three distinct patterns: fusion of the two stimuli (the McGurk effect), repetition of the acoustically presented string of phonemes, and, less frequently, repetition of the string of phonemes corresponding to the mouth gestures mimicked by the actor. However, the analysis of the latter two responses showed that formant 2 of the participants' voice spectra always differed from the value recorded in the congruent audiovisual presentation: it approached the value of formant 2 of the string of phonemes presented in the other modality, which was apparently ignored. The lip kinematics of the participants repeating the acoustically presented string of phonemes were influenced by the observation of the lip movements mimicked by the actor, but only when pronouncing a labial consonant.
The data are discussed in favor of the hypothesis that features of both the visual and acoustical inputs always contribute to the representation of a string of phonemes and that cross-modal integration occurs by extracting mouth articulation features peculiar for the pronunciation of that string of phonemes.
Affiliation(s)
- Maurizio Gentilucci
- Dipartimento di Neuroscienze, Università di Parma, Via Volturno 39, 43100 Parma, Italy.
409
Tettamanti M, Buccino G, Saccuman MC, Gallese V, Danna M, Scifo P, Fazio F, Rizzolatti G, Cappa SF, Perani D. Listening to action-related sentences activates fronto-parietal motor circuits. J Cogn Neurosci 2005; 17:273-81. [PMID: 15811239] [DOI: 10.1162/0898929053124965]
Abstract
Observing actions made by others activates the cortical circuits responsible for the planning and execution of those same actions. This observation-execution matching system (mirror-neuron system) is thought to play an important role in the understanding of actions made by others. In an fMRI experiment, we tested whether this system also becomes active during the processing of action-related sentences. Participants listened to sentences describing actions performed with the mouth, the hand, or the leg. Abstract sentences of comparable syntactic structure were used as control stimuli. The results showed that listening to action-related sentences activates a left fronto-parieto-temporal network that includes the pars opercularis of the inferior frontal gyrus (Broca's area), those sectors of the premotor cortex where the actions described are motorically coded, as well as the inferior parietal lobule, the intraparietal sulcus, and the posterior middle temporal gyrus. These data provide the first direct evidence that listening to sentences that describe actions engages the visuomotor circuits which subserve action execution and observation.
Affiliation(s)
- Marco Tettamanti
- Neuroscience Department, Scientific Institute San Raffaele, Segrate (Milan), Italy.
410
Buccino G, Riggio L, Melli G, Binkofski F, Gallese V, Rizzolatti G. Listening to action-related sentences modulates the activity of the motor system: a combined TMS and behavioral study. Cogn Brain Res 2005; 24:355-63. [PMID: 16099349] [DOI: 10.1016/j.cogbrainres.2005.02.020]
Abstract
Transcranial magnetic stimulation (TMS) and a behavioral paradigm were used to assess whether listening to action-related sentences modulates the activity of the motor system. By means of single-pulse TMS, either the hand or the foot/leg motor area in the left hemisphere was stimulated in distinct experimental sessions, while participants were listening to sentences expressing hand and foot actions. Listening to sentences with abstract content served as a control. Motor evoked potentials (MEPs) were recorded from hand and foot muscles. Results showed that MEPs recorded from hand muscles were specifically modulated by listening to hand-action-related sentences, as were MEPs recorded from foot muscles by listening to foot-action-related sentences. This modulation consisted of an amplitude decrease of the recorded MEPs. In the behavioral task, participants had to respond with the hand or the foot while listening to sentences expressing hand and foot actions, as compared to abstract sentences. Consistent with the TMS results, when the response was given with the hand, reaction times were slower while listening to hand-action-related sentences; when the response was given with the foot, reaction times were slower while listening to foot-action-related sentences. The present data show that processing verbally presented actions activates different sectors of the motor system, depending on the effector used in the listened-to action.
Affiliation(s)
- G Buccino
- Department of Neuroscience, University of Parma, Via Volturno 39, 43100 Parma, Italy.
411
Uddin LQ, Kaplan JT, Molnar-Szakacs I, Zaidel E, Iacoboni M. Self-face recognition activates a frontoparietal “mirror” network in the right hemisphere: an event-related fMRI study. Neuroimage 2005; 25:926-35. [PMID: 15808992] [DOI: 10.1016/j.neuroimage.2004.12.018]
Abstract
Self-recognition has been demonstrated by a select number of primate species and is often used as an index of self-awareness. Whether a specialized neural mechanism for self-face recognition in humans exists remains unclear. We used event-related fMRI to investigate brain regions selectively activated by images of one's own face. Ten right-handed normal subjects viewed digital morphs between their own face and a gender-matched familiar other presented in a random sequence. Subjects were instructed to press a button with the right hand if the image looked like their own face, and another button if it looked like a familiar or scrambled face. Contrasting the trials in which images contain more "self" with those containing more familiar "other" revealed signal changes in the right hemisphere (RH) including the inferior parietal lobule, inferior frontal gyrus, and inferior occipital gyrus. The opposite contrast revealed voxels with higher signal intensity for images of "other" than for "self" in the medial prefrontal cortex and precuneus. Additional contrasts against baseline revealed that activity in the "self" minus "other" contrasts represent signal increases compared to baseline (null events) in "self" trials, while activity in the "other" minus "self" contrasts represent deactivations relative to baseline during "self" trials. Thus, a unique network involving frontoparietal structures described as part of the "mirror neuron system" in the RH underlies self-face recognition, while regions comprising the "default/resting state" network deactivate less for familiar others. We provide a model that reconciles these findings and previously published work to account for the modulations in these two networks previously implicated in social cognition.
Affiliation(s)
- Lucina Q Uddin
- Department of Psychology, University of California, Box 951563, B627 Franz Hall, Los Angeles, CA 90095, USA.
412
Abstract
Broca's region, classically considered a motor speech-production area, is involved in action understanding and imitation. It also seems to contribute to the sequencing of actions. Broca's region might have evolved for interindividual communication, both by gestures and by speech.
Affiliation(s)
- Nobuyuki Nishitani
- Cognitive Functions Section, Department of Rehabilitation for Sensory Functions, Research Institute, National Rehabilitation Centre for Persons with Disabilities, Tokorozawa, Japan
413
Costantini M, Galati G, Ferretti A, Caulo M, Tartaro A, Romani GL, Aglioti SM. Neural Systems Underlying Observation of Humanly Impossible Movements: An fMRI Study. Cereb Cortex 2005; 15:1761-7. [PMID: 15728741] [DOI: 10.1093/cercor/bhi053]
Abstract
Previous studies have indicated that largely overlapping parts of a complex, mainly fronto-parietal, neural network are activated during both observation and execution of an action. If these two processes are inextricably linked, increases of neural activity contingent upon action observation should be found only for movements that can actually be performed. Using functional magnetic resonance imaging, we investigated whether observation of possible and biomechanically impossible movements of fingers activated the same neural systems. Thirteen healthy subjects were scanned during observation of video-clips showing abduction/adduction movements of the right index or the little finger, which were defined as biomechanically possible or impossible according to the range of their angular displacement at the metacarpo-phalangeal joint. The mere observation of possible and impossible hand movements induced a selective activation of left precentral and left inferior frontal regions, thus indicating that motor-related areas map body actions even when they violate the constraints of human anatomy. An increase of the blood oxygen level-dependent signal selectively linked to observation of impossible hand movements was found in sensorimotor parietal regions. Our results suggest that while premotor areas code human actions regardless of whether they are biologically possible or impossible, sensorimotor parietal regions may be important for coding the plausibility of actions.
Affiliation(s)
- Marcello Costantini
- Department of Clinical Sciences and Bio-imaging, University of Chieti G. D'Annunzio, Chieti, Italy
414
Schürmann M, Hesse MD, Stephan KE, Saarela M, Zilles K, Hari R, Fink GR. Yearning to yawn: the neural basis of contagious yawning. Neuroimage 2005; 24:1260-4. [PMID: 15670705] [DOI: 10.1016/j.neuroimage.2004.10.022]
Abstract
Yawning is contagious: watching another person yawn may trigger us to do the same. Here we studied brain activation with functional magnetic resonance imaging (fMRI) while subjects watched videotaped yawns. Significant increases in the blood oxygen level dependent (BOLD) signal, specific to yawn viewing as contrasted with viewing non-nameable mouth movements, were observed in the right posterior superior temporal sulcus (STS) and bilaterally in the anterior STS, in agreement with the high affinity of the STS for social cues. However, no additional yawn-specific activation was observed in Broca's area, the core region of the human mirror-neuron system (MNS) that matches action observation and execution. Thus, activation associated with viewing another person yawn seems to circumvent the essential parts of the MNS, in line with the nature of contagious yawns as automatically released behavioural acts rather than truly imitated motor patterns that would require detailed action understanding. The subjects' self-reported tendency to yawn covaried negatively with activation of the left periamygdalar region, suggesting a connection between yawn contagiousness and amygdalar activation.
Affiliation(s)
- Martin Schürmann
- Brain Research Unit, Low Temperature Laboratory, PO Box 2200, Helsinki University of Technology, 02015 HUT, Espoo, Finland.
415
Affiliation(s)
- Akira Murata
- Department of Physiology, School of Medicine, Kinki University
416
The Intentional Attunement Hypothesis: The Mirror Neuron System and Its Role in Interpersonal Relations. 2005. [DOI: 10.1007/11521082_2]
417
Calvo-Merino B, Glaser DE, Grèzes J, Passingham RE, Haggard P. Action Observation and Acquired Motor Skills: An fMRI Study with Expert Dancers. Cereb Cortex 2004; 15:1243-9. [PMID: 15616133] [DOI: 10.1093/cercor/bhi007]
Abstract
When we observe someone performing an action, do our brains simulate making that action? Acquired motor skills offer a unique way to test this question, since people differ widely in the actions they have learned to perform. We used functional magnetic resonance imaging to study differences in brain activity between watching an action that one has learned to do and an action that one has not, in order to assess whether the brain processes of action observation are modulated by the expertise and motor repertoire of the observer. Experts in classical ballet, experts in capoeira and inexpert control subjects viewed videos of ballet or capoeira actions. Comparing the brain activity when dancers watched their own dance style versus the other style therefore reveals the influence of motor expertise on action observation. We found greater bilateral activations in premotor cortex and intraparietal sulcus, right superior parietal lobe and left posterior superior temporal sulcus when expert dancers viewed movements that they had been trained to perform compared to movements they had not. Our results show that this 'mirror system' integrates observed actions of others with an individual's personal motor repertoire, and suggest that the human brain understands actions by motor simulation.
Affiliation(s)
- B Calvo-Merino
- Institute of Movement Neuroscience, University College London and Department of Basic Psychology, Faculty of Psychology, Universidad Complutense, Madrid, Spain
418
Abstract
A category of stimuli of great importance for primates, humans in particular, is that formed by actions done by other individuals. If we want to survive, we must understand the actions of others. Furthermore, without action understanding, social organization is impossible. In the case of humans, there is another faculty that depends on the observation of others' actions: imitation learning. Unlike most species, we are able to learn by imitation, and this faculty is at the basis of human culture. In this review we present data on a neurophysiological mechanism--the mirror-neuron mechanism--that appears to play a fundamental role in both action understanding and imitation. We describe first the functional properties of mirror neurons in monkeys. We review next the characteristics of the mirror-neuron system in humans. We stress, in particular, those properties specific to the human mirror-neuron system that might explain the human capacity to learn by imitation. We conclude by discussing the relationship between the mirror-neuron system and language.
Affiliation(s)
- Giacomo Rizzolatti
- Dipartimento di Neuroscienze, Sezione di Fisiologia, Università di Parma, via Volturno 3, 43100 Parma, Italy.
419
Buccino G, Vogt S, Ritzl A, Fink GR, Zilles K, Freund HJ, Rizzolatti G. Neural Circuits Underlying Imitation Learning of Hand Actions. Neuron 2004; 42:323-34. [PMID: 15091346] [DOI: 10.1016/s0896-6273(04)00181-3]
Abstract
The neural bases of imitation learning are virtually unknown. In the present study, we addressed this issue using an event-related fMRI paradigm. Musically naive participants were scanned during four events: (1) observation of guitar chords played by a guitarist, (2) a pause following model observation, (3) execution of the observed chords, and (4) rest. The results showed that the basic circuit underlying imitation learning consists of the inferior parietal lobule and the posterior part of the inferior frontal gyrus plus the adjacent premotor cortex (mirror neuron circuit). This circuit, known to be involved in action understanding, starts to be active during the observation of the guitar chords. During pause, the middle frontal gyrus (area 46) plus structures involved in motor preparation (dorsal premotor cortex, superior parietal lobule, rostral mesial areas) also become active. Given the functional properties of area 46, a model of imitation learning is proposed based on interactions between this area and the mirror neuron system.
Affiliation(s)
- Giovanni Buccino
- Dipartimento di Neuroscienze, Università di Parma, Parma 43100, Italy