51
|
Aspell J, Lenggenhager B, Blanke O. Multisensory Perception and Bodily Self-Consciousness. Front Neurosci 2011. [DOI: 10.1201/9781439812174-30] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022] Open
|
52
|
Aspell J, Lenggenhager B, Blanke O. Multisensory Perception and Bodily Self-Consciousness. Front Neurosci 2011. [DOI: 10.1201/b11092-30] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
|
53
|
Sambo CF, Forster B. When far is near: ERP correlates of crossmodal spatial interactions between tactile and mirror-reflected visual stimuli. Neurosci Lett 2011; 500:10-5. [PMID: 21683122 DOI: 10.1016/j.neulet.2011.05.233] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/18/2011] [Revised: 05/27/2011] [Accepted: 05/30/2011] [Indexed: 12/01/2022]
Abstract
Visuo-tactile integration occurs in a privileged way in peripersonal space, namely when visual and tactile stimuli are in spatial proximity. Here, we investigated whether crossmodal spatial effects (i.e. stronger crossmodal interactions for spatially congruent compared to incongruent visual and tactile stimuli) are also present when visual stimuli presented near the body are indirectly viewed in a mirror, thus appearing in far space. Participants had to attend to one of their hands throughout a block of stimuli in order to detect infrequent tactile target stimuli at that hand while ignoring tactile targets at the unattended hand, all tactile non-target stimuli, and any visual stimuli. Visual stimuli were presented simultaneously with tactile stimuli, in the same (congruent) or opposite (incongruent) hemispace with respect to the tactile stimuli. In one group of participants the visual stimuli were delivered near the participants' hands and were observed as indirect mirror reflections ('mirror' condition), while in the other group these were presented at a distance from the hands ('far' condition). The main finding was that crossmodal spatial modulations of ERPs recorded over and close to somatosensory cortex were present in the 'mirror' condition but not the 'far' condition. That is, ERPs were enhanced in response to tactile stimuli coupled with spatially congruent versus incongruent visual stimuli when the latter were viewed through a mirror. These effects emerged around 190 ms after stimuli onset, and were modulated by the focus of spatial attention. These results provide evidence that visual stimuli observed in far space via a mirror are coded as near-the-body stimuli according to their known rather than to their perceived location. This suggests that crossmodal interactions between vision and touch may be modulated by previous knowledge of reflecting surfaces (i.e. top-down processing).
Collapse
|
54
|
Northoff G, Hayes DJ. Is our self nothing but reward? Biol Psychiatry 2011; 69:1019-25. [PMID: 21276963 DOI: 10.1016/j.biopsych.2010.12.014] [Citation(s) in RCA: 125] [Impact Index Per Article: 9.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 09/23/2010] [Revised: 11/24/2010] [Accepted: 12/14/2010] [Indexed: 11/28/2022]
Abstract
Neuroscience has increasingly explored the neural mechanisms underlying our sense of self. Recent studies have demonstrated the recruitment of regions like the ventral tegmental area, ventromedial prefrontal cortex, and the ventral striatum to self-specific stimuli-regions typically associated with reward-related processing. This raises the question of whether there is a relationship between self and reward and, if so, how these different fields can be linked. Three relationship models that aim to explore the relationship between self and reward are discussed here: integration, segregation, and parallel processing. Their pros and cons are reviewed in light of the most recent findings. The conclusion is that both the fields of self and reward may benefit from increased interaction. This interaction may help to fill in some of the missing pieces regarding reward-related processing, as well as illuminate how brain function can bring forward the philosophical concept and psychological reality of self.
Collapse
Affiliation(s)
- Georg Northoff
- Mind, Brain Imaging and Neuroethics Unit, Institute of Mental Health Research, Royal Ottawa Health Care Group, University of Ottawa, Ontario, Canada.
| | | |
Collapse
|
55
|
Bekrater-Bodmann R, Foell J, Flor H. Relationship between bodily illusions and pain syndromes. Pain Manag 2011; 1:217-28. [DOI: 10.2217/pmt.11.20] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
SUMMARY Apart from their contribution to the overall knowledge of perception and related processes, sensory illusions have been used in recent years to treat and better understand pain disorders such as phantom limb pain or complex regional pain syndrome. With the help of modern imaging techniques, we can examine connections between basic processes of integrative perception and the occurrence of chronic pain. This article gives an overview of recent developments in the area of body illusions and pain, and provides suggestions on how they might lead to novel and effective treatments for chronic pain.
Collapse
Affiliation(s)
- Robin Bekrater-Bodmann
- Department of Cognitive & Clinical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
| | - Jens Foell
- Department of Cognitive & Clinical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
| | | |
Collapse
|
56
|
Affiliation(s)
- James R Anderson
- Department of Psychology, University of Stirling, Stirling, Scotland.
| | | |
Collapse
|
57
|
Caggiano V, Fogassi L, Rizzolatti G, Pomper JK, Thier P, Giese MA, Casile A. View-Based Encoding of Actions in Mirror Neurons of Area F5 in Macaque Premotor Cortex. Curr Biol 2011; 21:144-8. [DOI: 10.1016/j.cub.2010.12.022] [Citation(s) in RCA: 153] [Impact Index Per Article: 11.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/24/2010] [Revised: 11/02/2010] [Accepted: 12/10/2010] [Indexed: 11/28/2022]
|
58
|
Bonifazi S, Farnè A, Rinaldesi L, Làdavas E. Dynamic size-change of peri-hand space through tool-use: Spatial extension or shift of the multi-sensory area. J Neuropsychol 2010; 1:101-14. [DOI: 10.1348/174866407x180846] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022]
|
59
|
Macellini S, Ferrari PF, Bonini L, Fogassi L, Paukner A. A modified mark test for own-body recognition in pig-tailed macaques (Macaca nemestrina). Anim Cogn 2010; 13:631-9. [PMID: 20148344 PMCID: PMC3638247 DOI: 10.1007/s10071-010-0313-1] [Citation(s) in RCA: 13] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/01/2009] [Revised: 01/21/2010] [Accepted: 01/26/2010] [Indexed: 10/19/2022]
Abstract
Classic mirror self-recognition mark tests involve familiarizing the subject with its mirror image, surreptitiously applying a mark on the subject's eyebrow, nose, or ear, and measuring self-directed behaviors toward the mark. For many non-human primate species, however, direct gaze at the face constitutes an aggressive and threatening signal. It is therefore possible that monkeys fail the mark test because they do not closely inspect their faces in a mirror and hence they have no expectations about their physical appearance. In the current study, we prevented two pig-tailed macaques (Macaca nemestrina) from seeing their own faces in a mirror, and we adopted a modified version of the classic mark test in which monkeys were marked on the chest, a body region to which they normally have direct visual access but that in the current study was visible only via a mirror. Neither monkey tried to touch the mark on its chest, possibly due to a failure to understand the mirror as a reflective surface. To further the monkeys' understanding of the mirror image, we trained them to reach for food using the mirror as the only source of information. After both monkeys had learned mirror-mediated reaching, we replicated the mark test. In this latter phase of the study, only one monkey scratched the red dye on the chest once. The results are consistent with other findings suggesting that monkeys are not capable of passing a mark test and imply that face and body recognition rely on the same cognitive abilities.
Collapse
Affiliation(s)
- Sara Macellini
- Dipartimento di Biologia Evolutiva e Funzionale, Università degli Studi di Parma, Via Usberti 11/A, Parma, Italy.
| | | | | | | | | |
Collapse
|
60
|
Haggard P, Jundi S. Rubber hand illusions and size-weight illusions: self-representation modulates representation of external objects. Perception 2010; 38:1796-803. [PMID: 20192129 DOI: 10.1068/p6399] [Citation(s) in RCA: 36] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
Abstract
Bodily illusions offer an experimental method to investigate the origins and functional role of the sense of one's own body. Using the rubber hand illusion (RHI) we show that a representation of one's own body is implicitly used to calibrate perception of external objects. Twelve participants experienced the RHI while watching stimulation of a large or small glove simultaneously with stimulation of their own hand. They then grasped cylinders of identical size but varying weight. RHI with the large glove caused the cylinders to feel heavier. We suggest that an illusory increase in hand size made the subsequently grasped cylinder feel correspondingly small, evoking a size-weight illusion. Self-representation thus influenced exteroception. The sense of one's own body provides a fundamental reference for perception in general.
Collapse
Affiliation(s)
- Patrick Haggard
- Institute of Cognitive Neuroscience, University College London, 17 Queen Square, London WC1N 3AR, UK.
| | | |
Collapse
|
61
|
Bonaiuto J, Arbib MA. Extending the mirror neuron system model, II: what did I just do? A new role for mirror neurons. BIOLOGICAL CYBERNETICS 2010; 102:341-359. [PMID: 20217428 DOI: 10.1007/s00422-010-0371-0] [Citation(s) in RCA: 47] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/17/2009] [Accepted: 02/09/2010] [Indexed: 05/28/2023]
Abstract
A mirror system is active both when an animal executes a class of actions (self-actions) and when it sees another execute an action of that class. Much attention has been given to the possible roles of mirror systems in responding to the actions of others but there has been little attention paid to their role in self-actions. In the companion article (Bonaiuto et al. Biol Cybern 96:9-38, 2007) we presented MNS2, an extension of the Mirror Neuron System model of the monkey mirror system trained to recognize the external appearance of its own actions as a basis for recognizing the actions of other animals when they perform similar actions. Here we further extend the study of the mirror system by introducing the novel hypotheses that a mirror system may additionally help in monitoring the success of a self-action and may also be activated by recognition of one's own apparent actions as well as efference copy from one's intended actions. The framework for this computational demonstration is a model of action sequencing, called augmented competitive queuing, in which action choice is based on the desirability of executable actions. We show how this "what did I just do?" function of mirror neurons can contribute to the learning of both executability and desirability which in certain cases supports rapid reorganization of motor programs in the face of disruptions.
Collapse
Affiliation(s)
- James Bonaiuto
- University of Southern California, Hedco Neuroscience Building, 120E, Room 10B, Mailing Code 2520, 3641 Watt Way, Los Angeles, CA, 90089-2520, USA.
| | | |
Collapse
|
62
|
Magosso E, Zavaglia M, Serino A, di Pellegrino G, Ursino M. Visuotactile representation of peripersonal space: a neural network study. Neural Comput 2010; 22:190-243. [PMID: 19764874 DOI: 10.1162/neco.2009.01-08-694] [Citation(s) in RCA: 35] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Neurophysiological and behavioral studies suggest that the peripersonal space is represented in a multisensory fashion by integrating stimuli of different modalities. We developed a neural network to simulate the visual-tactile representation of the peripersonal space around the right and left hands. The model is composed of two networks (one per hemisphere), each with three areas of neurons: two are unimodal (visual and tactile) and communicate by synaptic connections with a third downstream multimodal (visual-tactile) area. The hemispheres are interconnected by inhibitory synapses. We applied a combination of analytic and computer simulation techniques. The analytic approach requires some simplifying assumptions and approximations (linearization and a reduced number of neurons) and is used to investigate network stability as a function of parameter values, providing some emergent properties. These are then tested and extended by computer simulations of a more complex nonlinear network that does not rely on the previous simplifications. With basal parameter values, the extended network reproduces several in vivo phenomena: multisensory coding of peripersonal space, reinforcement of unisensory perception by multimodal stimulation, and coexistence of simultaneous right- and left-hand representations in bilateral stimulation. By reducing the strength of the synapses from the right tactile neurons, the network is able to mimic the responses characteristic of right-brain-damaged patients with left tactile extinction: perception of unilateral left tactile stimulation, cross-modal extinction and cross-modal facilitation in bilateral stimulation. Finally, a variety of sensitivity analyses on some key parameters was performed to shed light on the contribution of single-model components in network behaviour. The model may help us understand the neural circuitry underlying peripersonal space representation and identify its alterations explaining neurological deficits. 
In perspective, it could help in interpreting results of psychophysical and behavioral trials and clarifying the neural correlates of multisensory-based rehabilitation procedures.
Collapse
Affiliation(s)
- Elisa Magosso
- Department of Electronics, Computer Science and Systems, University of Bologna, Cesena, Italy.
| | | | | | | | | |
Collapse
|
63
|
|
64
|
Fuke S, Ogino M, Asada M. Acquisition of the Head-Centered Peri-Personal Spatial Representation Found in VIP Neuron. ACTA ACUST UNITED AC 2009. [DOI: 10.1109/tamd.2009.2031013] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
|
65
|
Asada M, Hosoda K, Kuniyoshi Y, Ishiguro H, Inui T, Yoshikawa Y, Ogino M, Yoshida C. Cognitive Developmental Robotics: A Survey. ACTA ACUST UNITED AC 2009. [DOI: 10.1109/tamd.2009.2021702] [Citation(s) in RCA: 362] [Impact Index Per Article: 24.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
|
66
|
|
67
|
Yamazaki Y, Namba H, Iriki A. Acquisition of an externalized eye by Japanese monkeys. Exp Brain Res 2009; 194:131-42. [PMID: 19139869 DOI: 10.1007/s00221-008-1677-1] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/04/2008] [Accepted: 12/02/2008] [Indexed: 10/21/2022]
Abstract
Many animals use tools to augment motor function ("motor tools", like rake), while the use of tools to acquire sensory information or to augment sensory function ("sensory tools", like endoscope) has been reported only in humans. In the present study, we trained Japanese monkeys to acquire the sensory tool use to re-construct a possible developmental course of the human-specific tool use via motor tool use training. After they mastered the rake use, we systematically introduced a series of external mirror and video arrangements, so as to separate visual cues from their actual origins in visuomotor space. Finally, the monkeys could acquire the use of sensory tool-a sort of endoscope attached to a rake-to explore the experimental space to find and retrieve the food. The results indicated a critical role of environmental control to develop even higher order behavioral sequences like human-specific sensory tool use in nonhuman primates.
Collapse
Affiliation(s)
- Yumiko Yamazaki
- Laboratory for Symbolic Cognitive Development, RIKEN Brain Science Institute, 2-1 Hirosawa, Wako, 351-0198, Saitama, Japan.
| | | | | |
Collapse
|
68
|
Higuchi T, Hatano N, Soma K, Imanaka K. Perception of spatial requirements for wheelchair locomotion in experienced users with Tetraplegia. J Physiol Anthropol 2009; 28:15-21. [DOI: 10.2114/jpa2.28.15] [Citation(s) in RCA: 29] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/02/2022] Open
|
69
|
Anderson JR, Kuroshima H, Paukner A, Fujita K. Capuchin monkeys (Cebus apella) respond to video images of themselves. Anim Cogn 2009; 12:55-62. [PMID: 18574604 PMCID: PMC3639483 DOI: 10.1007/s10071-008-0170-3] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/05/2007] [Revised: 06/04/2008] [Accepted: 06/05/2008] [Indexed: 11/26/2022]
Abstract
Many studies have used mirror-image stimulation in attempts to find self-recognition in monkeys. However, very few studies have presented monkeys with video images of themselves; the present study is the first to do so with capuchin monkeys. Six tufted capuchin monkeys were individually exposed to live face-on and side-on video images of themselves (experimental Phase 1). Both video screens initially elicited considerable interest. Two adult males looked preferentially at their face-on image, whereas two adult females looked preferentially at their side-on image; the latter elicited lateral movements and head-cocking. Only males showed communicative facial expressions, which were directed towards the face-on screen. In Phase 2 monkeys discriminated between real-time, face-on images and identical images delayed by 1 s, with the adult females especially preferring real-time images. In this phase both screens elicited facial expressions, shown by all monkeys. In Phase 3 there was no evidence of discrimination between previously recorded video images of self and similar images of a familiar conspecific. Although they showed no signs of explicit self-recognition, the monkeys' behaviour strongly suggests recognition of the correspondence between kinaesthetic information and external visual effects. In species such as humans and great apes, this type of self-awareness feeds into a system that gives rise to explicit self-recognition.
Collapse
Affiliation(s)
- James R Anderson
- Department of Psychology, University of Stirling, Stirling, FK9 4LA, Scotland, UK.
| | | | | | | |
Collapse
|
70
|
Iriki A, Sakura O. The neuroscience of primate intellectual evolution: natural selection and passive and intentional niche construction. Philos Trans R Soc Lond B Biol Sci 2008; 363:2229-41. [PMID: 18426757 PMCID: PMC2394573 DOI: 10.1098/rstb.2008.2274] [Citation(s) in RCA: 97] [Impact Index Per Article: 6.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
We trained Japanese macaque monkeys to use tools, an advanced cognitive function monkeys do not exhibit in the wild, and then examined their brains for signs of modification. Following tool-use training, we observed neurophysiological, molecular genetic and morphological changes within the monkey brain. Despite being 'artificially' induced, these novel behaviours and neural connectivity patterns reveal overlap with those of humans. Thus, they may provide us with a novel experimental platform for studying the mechanisms of human intelligence, for revealing the evolutionary path that created these mechanisms from the 'raw material' of the non-human primate brain, and for deepening our understanding of what cognitive abilities are and of those that are not uniquely human. On these bases, we propose a theory of 'intentional niche construction' as an extension of natural selection in order to reveal the evolutionary mechanisms that forged the uniquely intelligent human brain.
Collapse
Affiliation(s)
- Atsushi Iriki
- RIKEN Brain Science Institute, Saitama 351-0198, Japan.
| | | |
Collapse
|
71
|
Corradi-Dell’Acqua C, Ueno K, Ogawa A, Cheng K, Rumiati RI, Iriki A. Effects of shifting perspective of the self: An fMRI study. Neuroimage 2008; 40:1902-11. [DOI: 10.1016/j.neuroimage.2007.12.062] [Citation(s) in RCA: 50] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/02/2007] [Revised: 11/14/2007] [Accepted: 12/31/2007] [Indexed: 11/24/2022] Open
|
72
|
Whiteley L, Spence C, Haggard P. Visual processing and the bodily self. Acta Psychol (Amst) 2008; 127:129-36. [PMID: 17499204 DOI: 10.1016/j.actpsy.2007.03.005] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/04/2006] [Revised: 03/15/2007] [Accepted: 03/19/2007] [Indexed: 11/24/2022] Open
Abstract
The 'body schema' has traditionally been defined as a passively updated, proprioceptive representation of the body. However, recent work has suggested that body representations are more complex and flexible than previously thought. They may integrate current perceptual information from all sensory modalities, and can be extended to incorporate indirect representations of the body and functional portions of tools. In the present study, we investigate the source of a facilitatory effect of viewing the body on speeded visual discrimination reaction times. Participants responded to identical visual stimuli that varied only in their context: being presented on the participant's own body, on the experimenter's body, or in a neutral context. The stimuli were filmed and viewed in real-time on a projector screen. Careful controls for attention, biological saliency, and attribution confirmed that the facilitatory effect depends critically on participants attributing the context to a real body. An intermediate effect was observed when the stimuli were presented on another person's body, suggesting that the effect of viewing one's own body might represent a conjunction of an interpersonal body effect and an egocentric effect.
Collapse
Affiliation(s)
- Louise Whiteley
- Institute of Cognitive Neuroscience and Department of Psychology, University College London, 17 Queen Square, London, WC1N 3AR, UK.
| | | | | |
Collapse
|
73
|
Ogawa K, Inui T. Lateralization of the Posterior Parietal Cortex for Internal Monitoring of Self- versus Externally Generated Movements. J Cogn Neurosci 2007; 19:1827-35. [DOI: 10.1162/jocn.2007.19.11.1827] [Citation(s) in RCA: 37] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Abstract
Internal monitoring or state estimation of movements is essential for human motor control to compensate for inherent delays and noise in sensorimotor loops. Two types of internal estimation of movements exist: self-generated movements, and externally generated movements. We used functional magnetic resonance imaging to investigate differences in brain activity for internal monitoring of self- versus externally generated movements during visual occlusion. Participants tracked a sinusoidally moving target with a mouse cursor. On some trials, vision of either target (externally generated) or cursor (self-generated) movement was transiently occluded, during which subjects continued tracking by estimating current position of either the invisible target or cursor on screen. Analysis revealed that both occlusion conditions were associated with increased activity in the presupplementary motor area and decreased activity in the right lateral occipital cortex compared to a control condition with no occlusion. Moreover, the right and left posterior parietal cortex (PPC) showed greater activation during occlusion of target and cursor movements, respectively. This study suggests lateralization of the PPC for internal monitoring of internally versus externally generated movements, fully consistent with previously reported clinical findings.
Collapse
|
74
|
Roma PG, Silberberg A, Huntsberry ME, Christensen CJ, Ruggiero AM, Suomi SJ. Mark tests for mirror self-recognition in capuchin monkeys (Cebus apella) trained to touch marks. Am J Primatol 2007; 69:989-1000. [PMID: 17253635 DOI: 10.1002/ajp.20404] [Citation(s) in RCA: 26] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022]
Abstract
In Experiment 1, three capuchin monkeys (Cebus apella) were exposed to a mirror in their home cage for 3 days and then given food treats for touching orange marks located on the surface of an experimental chamber. Following training, a mirror was added to the chamber to see if the monkeys would use it to guide non-reinforced contacts with an orange mark on their foreheads that was only visible as a mirror reflection (mark test). Two monkeys touched the head-mark more often with the mirror present than absent, but no mark touches were emitted while looking at the mirror. In Experiment 2, the monkeys were rewarded for touching orange marks on their bodies that were directly visible, followed by another head-mark test. Again, two monkeys touched the mark more often with the mirror present than absent, but these contacts were not emitted while looking at the mirror. Since facing the mirror while mark touching was not required for reinforcement during training, Experiment 3 further tested the possibility that increased mark touching in the presence of the mirror during Experiments 1 and 2 was the result of a memorial process. For this, a final, novel mark test was conducted using an orange mark on the neck that was only visible as a reflection (Experiment 3). No monkeys passed this test. These are the first mark tests given to capuchin monkeys. The results are consistent with the finding that no monkey species is capable of spontaneous mirror self-recognition. The implications of sequential training and mark testing for comparative evaluations of mirror self-recognition capacity are discussed.
Collapse
Affiliation(s)
- Peter G Roma
- Department of Psychology, American University, Washington, DC 20016, USA.
| | | | | | | | | | | |
Collapse
|
75
|
Legrand D, Brozzoli C, Rossetti Y, Farnè A. Close to me: Multisensory space representations for action and pre-reflexive consciousness of oneself-in-the-world. Conscious Cogn 2007; 16:687-99. [PMID: 17683948 DOI: 10.1016/j.concog.2007.06.003] [Citation(s) in RCA: 28] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/24/2007] [Revised: 05/24/2007] [Accepted: 06/06/2007] [Indexed: 10/23/2022]
Abstract
Philosophical considerations as well as several recent studies from neurophysiology, neuropsychology, and psychophysics converged in showing that the peripersonal space (i.e. closely surrounding the body-parts) is structured in a body-centred manner and represented through integrated sensory inputs. Multisensory representations may deserve the function of coding peripersonal space for avoiding or interacting with objects. Neuropsychological evidence is reviewed for dynamic interactions between space representations and action execution, as revealed by the behavioural effects that the use of a tool, as a physical extension of the reachable space, produces on visual-tactile extinction. In particular, tool-use transiently modifies action space representation in a functionally effective way. The possibility is discussed that the investigation of multisensory space representations for action provides an empirical way to consider in its specificity pre-reflexive self-consciousness by considering the intertwining of self-relatedness and object-directness of spatial experience shaped by multisensory and sensorimotor integrations.
Collapse
|
76
|
Holmes NP, Calvert GA, Spence C. Tool use changes multisensory interactions in seconds: evidence from the crossmodal congruency task. Exp Brain Res 2007; 183:465-76. [PMID: 17665178 PMCID: PMC2084481 DOI: 10.1007/s00221-007-1060-7] [Citation(s) in RCA: 57] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/29/2006] [Accepted: 07/04/2007] [Indexed: 11/24/2022]
Abstract
Active tool use in human and non-human primates has been claimed to alter the neural representations of multisensory peripersonal space. To date, most studies suggest that a short period of tool use leads to an expansion or elongation of these spatial representations, which lasts several minutes after the last tool use action. However, the possibility that multisensory interactions also change on a much shorter time scale following or preceding individual tool use movements has not yet been investigated. We measured crossmodal (visual-tactile) congruency effects as an index of multisensory integration during two tool use tasks. In the regular tool use task, the participants used one of two tools in a spatiotemporally predictable sequence after every fourth crossmodal congruency trial. In the random tool use task, the required timing and spatial location of the tool use task varied unpredictably. Multisensory integration effects increased as a function of the number of trials since tool use in the regular tool use group, but remained relatively constant in the random tool use group. The spatial distribution of these multisensory effects, however, was unaffected by tool use predictability, with significant spatial interactions found only near the hands and at the tips of the tools. These data suggest that endogenously preparing to use a tool enhances visual-tactile interactions near the tools. Such enhancements are likely due to the increased behavioural relevance of visual stimuli as each tool use action is prepared before execution.
Affiliation(s)
- Nicholas P Holmes
- Department of Experimental Psychology, University of Oxford, South Parks Road, Oxford OX1 3UD, UK.
|
77
|
Moizumi S, Yamamoto S, Kitazawa S. Referral of tactile stimuli to action points in virtual reality with reaction force. Neurosci Res 2007; 59:60-7. [PMID: 17617482 DOI: 10.1016/j.neures.2007.05.013] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/30/2006] [Revised: 04/28/2007] [Accepted: 05/29/2007] [Indexed: 11/21/2022]
Abstract
When we touch something with a tool, we feel the touch at the tip of the tool rather than at the hand. Yamamoto and Kitazawa [Yamamoto, S., Kitazawa, S., 2001b. Sensation at the tips of invisible tools. Nat. Neurosci. 4, 979-980] previously showed that the judgment of the temporal order of two successive stimuli, delivered to the tips of sticks held in each hand, was dramatically altered by crossing the sticks without changing the positions of the hands. This provided evidence for the referral of tactile signals to the tip of a tool in hand. In this study, we examined the importance of force feedback from the tool in this referral by manipulating the direction of force feedback in a virtual reality. The virtual tool consisted of a spherical action point that was moved with a stylus in hand. Subjects held two styli, one in each hand, put each action point on each of two buttons in the virtual reality, and were required to judge the order of successive taps delivered to the two styli. We manipulated the direction of reaction force from each button so that it was congruent or incongruent with the visual configuration of the button. When the arms were uncrossed, judgment depended primarily on whether the action points were crossed in the visual space. But when the arms were crossed, judgment depended critically on the direction of force feedback. The results show that tactile signals can be referred to the action point in the virtual reality and that force feedback becomes a critical factor when the arms are crossed.
Affiliation(s)
- Shunjiro Moizumi
- Department of Neurophysiology, Juntendo University Graduate School of Medicine, Tokyo 113-8421, Japan
|
78
|
Obayashi S, Matsumoto R, Suhara T, Nagai Y, Iriki A, Maeda J. Functional organization of monkey brain for abstract operation. Cortex 2007; 43:389-96. [PMID: 17533762 DOI: 10.1016/s0010-9452(08)70464-7] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]
Abstract
When humans manipulate a control device under operational rules, with the goal of indirectly controlling a remote tool to achieve a desired outcome, they may rely on the power of internal representation to organize individual moves of the controller and tool into a set of sequences by mapping the motor space among hand, controller and tool. We recently used functional brain imaging (PET) to investigate activations in monkey brain associated with joystick-controlled remote operation of a shovel to obtain food. Activated areas included the prefrontal cortex, posterior parietal cortex and cerebellum, regardless of the rules relating movements of the joystick to those of the shovel (Obayashi et al., 2004). If those areas are engaged in the mental manipulation of internal representation, then we should expect brain activity in the same regions during any similar remote operation, even with different controllers and/or operational rules. To address the above hypothesis in the current study, we used PET to measure regional cerebral blood flow (rCBF) of two monkeys during a task in which they were required to control a shovel remotely (to fetch a food pellet) by manipulating dual dials. Compared to unplanned movement of the dials, the active dual-dial operation was associated with robust activation of the prefrontal cortex, higher-order motor cortex, posterior parietal cortex and cerebellum, quite similar to that observed during remote operation with a joystick. The present study suggests that monkeys might be able to organize abstract sequential operations according to learned rules, and perhaps indeed to have insight into the nature of the causal relationships, implying the existence of a relatively sophisticated system of internal representation in the absence of language. 
The fact that the present results are consistent with our previous PET studies strengthens the view that the underlying mechanism for implicit manipulation of internal representations may involve a cerebro-cerebellar neural circuit including the frontal and parietal cortex.
Affiliation(s)
- Shigeru Obayashi
- Department of Molecular Neuroimaging, Molecular Imaging Center, National Institute of Radiological Sciences, Chiba, Japan.
|
79
|
Fujii N, Hihara S, Iriki A. Dynamic social adaptation of motion-related neurons in primate parietal cortex. PLoS One 2007; 2:e397. [PMID: 17460764 PMCID: PMC1851098 DOI: 10.1371/journal.pone.0000397] [Citation(s) in RCA: 54] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/17/2007] [Accepted: 04/01/2007] [Indexed: 11/19/2022] Open
Abstract
Social brain function, which allows us to adapt our behavior to social context, is poorly understood at the single-cell level due largely to technical limitations. But the questions involved are vital: How do neurons recognize and modulate their activity in response to social context? To probe the mechanisms involved, we developed a novel recording technique, called multi-dimensional recording, and applied it simultaneously in the left parietal cortices of two monkeys while they shared a common social space. When the monkeys sat near each other but did not interact, each monkey's parietal activity showed robust response preference to action by his own right arm and almost no response to action by the other's arm. But the preference was broken if social conflict emerged between the monkeys—specifically, if both were able to reach for the same food item placed on the table between them. Under these circumstances, parietal neurons started to show complex combinatorial responses to motion of self and other. Parietal cortex adapted its response properties in the social context by discarding and recruiting different neural populations. Our results suggest that parietal neurons can recognize social events in the environment linked with current social context and form part of a larger social brain network.
Affiliation(s)
- Naotaka Fujii
- Laboratory for Symbolic Cognitive Development, RIKEN Brain Science Institute, Wako, Japan.
|
80
|
Shibuya S, Takahashi T, Kitazawa S. Effects of visual stimuli on temporal order judgments of unimanual finger stimuli. Exp Brain Res 2007; 179:709-21. [PMID: 17216148 DOI: 10.1007/s00221-006-0829-4] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/27/2006] [Accepted: 12/13/2006] [Indexed: 10/23/2022]
Abstract
Successive tactile stimuli, delivered one to each hand, are referred to spatial representation before they are ordered in time (Yamamoto and Kitazawa in Nat Neurosci 4:759-765 2001a). In the present study, we examined if this applies even when they are delivered unilaterally to fingers of a single hand. Tactile stimuli were delivered left-to-rightward relative to the body (2nd-3rd-4th) or in reverse with stimulus onset asynchrony of 100 ms. Simultaneously with the delivery of tactile stimuli, three of nine small squares arranged in a matrix of 3 x 3 were turned on as if they appeared near the tips of the fingers. Although subjects were instructed to ignore the visual stimuli and make a forced choice between the two orders of tactile stimuli, the correct-judgment probability depended on the direction of visual stimuli. It was greater than 95% when the direction of visual stimuli matched that of the tactile stimuli, but less than 50% when they were opposite to each other. When the right hand was rotated counterclockwise on the horizontal plane (90 degrees ) so that the fingers were pointing to the left, the preferred direction of visual stimuli that yielded the peak correct judgment was also rotated, although not to the full extent. These results show that subjects cannot be basing their tactile temporal order judgment solely on a somatotopic map, but rather on a spatial map on which both visual and tactile signals converge.
Affiliation(s)
- Satoshi Shibuya
- Department of Integrative Physiology, Kyorin University School of Medicine, Tokyo, Japan
|
81
|
Farnè A, Serino A, Làdavas E. Dynamic Size-Change of Peri-Hand Space Following Tool-Use: Determinants and Spatial Characteristics Revealed Through Cross-Modal Extinction. Cortex 2007; 43:436-43. [PMID: 17533766 DOI: 10.1016/s0010-9452(08)70468-4] [Citation(s) in RCA: 62] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
Abstract
In human and non-human primates, evidence has been reported supporting the idea that near peripersonal space is represented through integrated multisensory processing. In humans, the interaction between near peripersonal space representation and action execution can be revealed in brain-damaged patients through the use of tools that, by extending the reachable space, modify the strength of visual-tactile extinction, thus showing that tool-mediated actions modify the multisensory coding of near peripersonal space. For example, following the use of a rake to retrieve distant, otherwise non-reachable objects, the peri-hand multisensory area has been documented to extend to include the distal part of the rake (Farnè and Làdavas, 2000). The re-sizing of peri-hand space seems to be selective for tool-use, as directional motor activity alone (i.e., pointing without the tool) and visual/proprioceptive experience alone (protracted passive exposure to the tool) do not vary the extent of the visual-tactile peri-hand space (Farnè et al., 2005a). Moreover, the amount of dynamic re-sizing varies with the length of the used tool, and is specifically centred on the functionally relevant part of the tool (Farnè et al., 2005b). Here, besides reviewing and discussing these results, we report new evidence, based on a single-case study, supporting the idea that dynamic re-sizing of peri-hand space consists of a real spatial extension of the visual-tactile integrative area along the tool axis.
|
82
|
Holmes NP, Sanabria D, Calvert GA, Spence C. Tool-Use: Capturing Multisensory Spatial Attention or Extending Multisensory Peripersonal Space? Cortex 2007; 43:469-89. [PMID: 17533769 PMCID: PMC1885399 DOI: 10.1016/s0010-9452(08)70471-4] [Citation(s) in RCA: 73] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
Abstract
The active and skilful use of tools has been claimed to lead to the "extension" of the visual receptive fields of single neurons representing peripersonal space--the visual space immediately surrounding one's body parts. While this hypothesis provides an attractive and potentially powerful explanation for one neural basis of tool-use behaviours in human and nonhuman primates, a number of competing hypotheses for the reported behavioural effects of tool-use have not yet been subjected to empirical test. Here, we report five behavioural experiments in healthy human participants (n=120) involving the effects of tool-use on visual-tactile interactions in peripersonal space. Specifically, we address the possibility that the use of only a single tool, which is typical of many neuropsychological studies of tool-use, induces a spatial allocation of attention towards the side where the tool is held. Participants' tactile discrimination responses were more strongly affected by visual stimuli presented on the right side when they held a single tool on the right, compared to visual stimuli presented on the left. When [corrected] two tools were held, one in each hand, this spatial effect disappeared. Our results are incompatible with the hypothesis that tool-use extends peripersonal space, and suggest instead that the use and/or manipulation of [corrected] tools results in an automatic multisensory shift of spatial attention to the side of space where the tip of the tool is actively held. These results have implications for many of the cognitive neuroscientific studies of tool-use published to date.
Affiliation(s)
- Nicholas P Holmes
- Department of Experimental Psychology, Oxford University, Oxford, UK; Department of Psychology, Bath University, Bath, UK.
|
83
|
Higuchi T, Imanaka K, Patla AE. Action-oriented representation of peripersonal and extrapersonal space: Insights from manual and locomotor actions. Jpn Psychol Res 2006. [DOI: 10.1111/j.1468-5884.2006.00314.x] [Citation(s) in RCA: 14] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
|
84
|
Padberg J, Krubitzer L. Thalamocortical connections of anterior and posterior parietal cortical areas in New World titi monkeys. J Comp Neurol 2006; 497:416-35. [PMID: 16736469 DOI: 10.1002/cne.21005] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
We examined the thalamocortical connections of electrophysiologically identified locations in the hand and forelimb representations in areas 3b, 1, and 5 in the New World titi monkeys (Callicebus moloch), and of area 7b/AIP. Labeled cells and terminals in the thalamus resulting from the injections were related to architectonic boundaries. As in previous studies in primates, the hand representation of area 3b has dense, restricted projections predominantly from the lateral division of the ventral posterior nucleus (VPl). Projections to area 1 were highly convergent from several thalamic nuclei including the ventral lateral nucleus (VL), anterior pulvinar (PA), VPl, and the superior division of the ventral posterior nucleus (VPs). In cortex immediately caudal to area 1, what we term area 5, thalamocortical connections were also highly convergent and predominantly from nuclei of the thalamus associated with motor, visual, or somatic processing such as VL, the medial pulvinar (PM), and PA, respectively; with moderate projections from VP, central lateral nucleus (CL), lateral posterior nucleus (LP), and VPs. Finally, thalamocortical connections of area 7b/AIP were from a range of nuclei including PA, PM, LP/LD, VL, CL, PL, and CM. The current data support two conclusions drawn from previous studies in titi monkeys and other primates. First, cortex caudal to area 1 in New World monkeys is more like area 5 than area 2. Second, the presence of thalamic input to area 5 from both motor nuclei and somatosensory nuclei of the thalamus, suggests that area 5 could be considered a highly specialized sensorimotor area.
Affiliation(s)
- Jeffrey Padberg
- Center for Neuroscience and Department of Psychology, University of California-Davis, Davis, California 95616, USA
|
85
|
Feintuch U, Raz L, Hwang J, Josman N, Katz N, Kizony R, Rand D, Rizzo AS, Shahar M, Yongseok J, Weiss PLT. Integrating Haptic-Tactile Feedback into a Video-Capture–Based Virtual Environment for Rehabilitation. Cyberpsychol Behav 2006; 9:129-32. [PMID: 16640464 DOI: 10.1089/cpb.2006.9.129] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022]
Abstract
Video-capture virtual reality (VR) systems are gaining popularity as intervention tools. To date, these platforms offer visual and audio feedback but do not provide haptic feedback. We contend that adding haptic feedback may enhance the quality of intervention for various theoretical and empirical reasons. This study aims to integrate haptic-tactile feedback into a video capture system (GX VR), which is currently applied for rehabilitation. The proposed multi-modal system can deliver audio-visual as well as vibrotactile feedback. The latter is provided via small vibratory discs attached to the patient's limbs. This paper describes the system, the guidelines of its design, and the ongoing usability study.
Affiliation(s)
- Uri Feintuch
- School of Occupational Therapy, Hadassah-Hebrew University Medical Center, Jerusalem, Israel.
|
86
|
Heschl A, Burkart J. A new mark test for mirror self-recognition in non-human primates. Primates 2006; 47:187-98. [PMID: 16432640 DOI: 10.1007/s10329-005-0170-8] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2004] [Accepted: 10/31/2005] [Indexed: 10/25/2022]
Abstract
For 30 years Gallup's (Science 167:86-87, 1970) mark test, which consists of confronting a mirror-experienced test animal with its own previously altered mirror image, usually a color mark on the forehead, eyebrow or ear, has delivered valuable results about the distribution of visual self-recognition in non-human primates. Chimpanzees, bonobos, orangutans and, less frequently, gorillas can learn to correctly understand the reflection of their body in a mirror. However, the standard version of the mark test is good only for positively proving the existence of self-recognition. Conclusive statements about the lack of self-recognition are more difficult because of the methodological constraints of the test. This situation has led to a persistent controversy about the power of Gallup's original technique. We devised a new variant of the test which permits more unequivocal decisions about both the presence and absence of self-recognition. This new procedure was tested with marmoset monkeys (Callithrix jacchus), following extensive training with mirror-related tasks to facilitate performance in the standard mark test. The results show that a slightly altered mark test with a new marking substance (chocolate cream) can help to reliably discriminate true negative results, indicating a real lack of the ability to recognize oneself in a mirror, from false negative results that are due to methodological particularities of the standard test. Finally, an evolutionary hypothesis is put forward as to why many primates can use a mirror instrumentally - i.e. know how to use it for grasping at hidden objects - while failing in the decisive mark test.
Affiliation(s)
- Adolf Heschl
- Konrad Lorenz Institute for Evolution and Cognition Research, Adolf Lorenz Gasse 2, 3422, Altenberg, Austria.
|
87
|
Hihara S, Notoya T, Tanaka M, Ichinose S, Ojima H, Obayashi S, Fujii N, Iriki A. Extension of corticocortical afferents into the anterior bank of the intraparietal sulcus by tool-use training in adult monkeys. Neuropsychologia 2006; 44:2636-46. [PMID: 16427666 DOI: 10.1016/j.neuropsychologia.2005.11.020] [Citation(s) in RCA: 162] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/27/2005] [Revised: 11/05/2005] [Accepted: 11/28/2005] [Indexed: 11/28/2022]
Abstract
When humans use a tool, it becomes an extension of the hand physically and perceptually. Common introspection might occur in monkeys trained in tool-use, which should depend on brain operations that constantly update and automatically integrate information about the current intrinsic (somatosensory) and the extrinsic (visual) status of the body parts and the tools. The parietal cortex plays an important role in using tools. Intraparietal neurones of naïve monkeys mostly respond unimodally to somatosensory stimuli; however, after training these neurones become bimodally active and respond to visual stimuli. The response properties of these neurones change to code the body images modified by assimilation of the tool to the hand holding it. In this study, we compared the projection patterns between visually related areas and the intraparietal cortex in trained and naïve monkeys using tracer techniques. Light microscopy analyses revealed the emergence of novel projections from the higher visual centres in the vicinity of the temporo-parietal junction and the ventrolateral prefrontal areas to the intraparietal area in monkeys trained in tool-use, but not in naïve monkeys. Functionally active synapses of intracortical afferents arising from higher visual centres to the intraparietal cortex of the trained monkeys were confirmed by electron microscopy. These results provide the first concrete evidence for the induction of novel neural connections in the adult monkey cerebral cortex, which accompanies a process of demanding behaviour in these animals.
Affiliation(s)
- Sayaka Hihara
- Laboratory for Symbolic Cognitive Development, RIKEN Brain Science Institute, Wako 351-0198, Japan.
|
88
|
Abstract
This study explored whether hand location affected spatial attention. The authors used a visual covert-orienting paradigm to examine whether spatial attention mechanisms--location prioritization and shifting attention--were supported by bimodal, hand-centered representations of space. Placing 1 hand next to a target location, participants detected visual targets following highly predictive visual cues. There was no a priori reason for the hand to influence task performance unless hand presence influenced attention. Results showed that target detection near the hand was facilitated relative to detection away from the hand, regardless of cue validity. Similar facilitation was found with only proprioceptive or visual hand location information but not with arbitrary visual anchors or distant targets. Hand presence affected attentional prioritization of space, not the shifting of attention.
Affiliation(s)
- Catherine L Reed
- Department of Psychology, University of Denver, Denver, CO 80208, USA.
|
89
|
Abstract
Virtual reality (VR) has recently emerged as a potentially effective way to provide general and specialty health care services, and appears poised to enter mainstream psychotherapy delivery. Because VR could be part of the future of clinical psychology, it is critical to all psychotherapists that it be defined broadly. To ensure appropriate development of VR applications, clinicians must have a clear understanding of the opportunities and challenges it will provide in professional practice. This review outlines the current state of clinical research relevant to the development of virtual environments for use in psychotherapy. In particular, the paper focuses its analysis on both actual applications of VR in clinical psychology and how different clinical perspectives can use this approach to improve the process of therapeutic change.
Affiliation(s)
- Giuseppe Riva
- Applied Technology for Neuro-Psychology Laboratory, Istituto Auxologico Italiano, Milan, Italy.
|
90
|
Yamamoto S, Moizumi S, Kitazawa S. Referral of Tactile Sensation to the Tips of L-Shaped Sticks. J Neurophysiol 2005; 93:2856-63. [PMID: 15634708 DOI: 10.1152/jn.01015.2004] [Citation(s) in RCA: 35] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
When we touch something with a tool, we feel the touch at the tip of the tool rather than at the hand that holds the tool. We reported previously that judging the temporal order of two successive stimuli delivered to the tips of straight sticks held in each hand was dramatically altered by crossing the sticks without changing hand position. The results suggested that tactile signals are referred to the tip of a tool held in the hand. Here we examined temporal order judgement using L-shaped sticks instead of straight ones to determine whether the shape of a tool affects the way tactile signals are referred. Subjects reported the order of stimuli correctly in most trials when the tip of each L-shaped stick occupied the hemispace ipsilateral to the anatomical laterality of the hand holding the L-shaped stick. The subjects, however, misreported the order of stimuli presented at moderately short intervals (<300 ms) when the tip of the stick occupied the hemispace contralateral to the anatomical laterality of the hand holding it. The judgment reversal occurred irrespective of the number of physical crossings between the sticks and the arms (0, 1, and 3), as long as the tips of L-shaped tools were placed in the contralateral hemispace. Our results suggest that our brain refers tactile signals from the hand directly to the location of the tip without much accounting for the route that connects hand and tip.
Affiliation(s)
- Shinya Yamamoto
- Neuroscience Research Institute, National Institute of Advanced Industrial Science and Technology, Umezono Tsukuba, Japan
|
91
|
Padberg J, Disbrow E, Krubitzer L. The organization and connections of anterior and posterior parietal cortex in titi monkeys: do New World monkeys have an area 2? Cereb Cortex 2005; 15:1938-63. [PMID: 15758196 DOI: 10.1093/cercor/bhi071] [Citation(s) in RCA: 94] [Impact Index Per Article: 4.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
Abstract
We used multiunit electrophysiological recording techniques to examine the topographic organization of somatosensory area 3b and cortex posterior to area 3b, including area 1 and the presumptive area 5, in the New World titi monkey, Callicebus moloch. We also examined the ipsilateral and contralateral connections of these fields, as well as those in a region of cortex that appeared to be similar to both area 7b and the anterior intraparietal area (7b/AIP) described in macaque monkeys. All data were combined with architectonic analysis to generate comprehensive reconstructions. These studies led to several observations. First, area 1 in titi monkeys is not as precisely organized in terms of topographic order and receptive field size as is area 1 in macaque monkeys and a few New World monkeys. Second, cortex caudal to area 1 in titi monkeys is dominated by the representation of the hand and forelimb, and contains neurons that are often responsive to visual stimulation as well as somatic stimulation. This organization is more like area 5 described in macaque monkeys than like area 2. Third, ipsilateral and contralateral cortical connections become more broadly distributed away from area 3b towards the posterior parietal cortex. Specifically, area 3b has a relatively restricted pattern of connectivity with adjacent somatosensory fields 3a, 1, S2 and PV; area 1 has more broadly distributed connections than area 3b; and the presumptive areas 5 and 7b/AIP have highly diverse connections, including connections with motor and premotor cortex, extrastriate visual areas, auditory areas and somatosensory areas of the lateral sulcus. Fourth, the hand representation of the presumptive area 5 has dense callosal connections. 
Our results, together with previous studies in other primates, suggest that anterior parietal cortex has expanded in some primate lineages, perhaps in relation to manual abilities, and that the region of cortex we term area 5 is involved in integrating somatic inputs with the motor system and across hemispheres. Such connections could form the substrate for intentional reaching, grasping and intermanual transfer of information necessary for bilateral coordination of the hands.
Affiliation(s)
- Jeffrey Padberg
- Center for Neuroscience, University of California, Davis, CA 95616, USA
|
92
|
Abstract
What happens in our brain when we use a tool to reach for a distant object? Recent neurophysiological, psychological and neuropsychological research suggests that this extended motor capability is followed by changes in specific neural networks that hold an updated map of body shape and posture (the putative "Body Schema" of classical neurology). These changes are compatible with the notion of the inclusion of tools in the "Body Schema", as if our own effector (e.g. the hand) were elongated to the tip of the tool. In this review we present empirical support for this intriguing idea from both single-neuron recordings in the monkey brain and behavioural performance of normal and brain-damaged humans. These relatively simple neural and behavioural aspects of tool-use shed light on more complex evolutionary and cognitive aspects of body representation and multisensory space coding for action.
Affiliation(s)
- Angelo Maravita
- Dipartimento di Psicologia, Università di Milano-Bicocca, Piazza dell'Ateneo Nuovo 1, 20126 Milan, Italy.
|
93
|
Abstract
Here we report findings from neuropsychological investigations showing the existence, in humans, of intersensory integrative systems representing space through the multisensory coding of visual and tactile events. In addition, these findings show that visuo-tactile integration may take place in a privileged manner within a limited sector of space closely surrounding the body surface, i.e., the near-peripersonal space. They also demonstrate that the representation of near-peripersonal space is not static, as objects in the out-of-reach space can be processed as nearer, depending upon the (illusory) visual information about hand position in space, and the use of tools as physical extensions of the reachable space. Finally, new evidence is provided suggesting the multisensory coding of peripersonal space can be achieved through bottom-up processing that, at least in some instances, is not necessarily modulated by more "cognitive" top-down processing, such as the expectation regarding the possibility of being touched. These findings are entirely consistent with the functional properties of multisensory neuronal structures coding near-peripersonal space in monkeys, as well as with behavioral, and neuroimaging evidence for the cross-modal coding of space in normal subjects. This high level of convergence ultimately favors the idea that multisensory space coding is achieved through similar multimodal structures in both humans and non-human primates.
Affiliation(s)
- Elisabetta Làdavas
- Dipartimento di Psicologia, Università degli Studi di Bologna, Viale Berti Pichat, 5 - 40127 Bologna, Italy.
|
94
|
Spence C, Pavani F, Maravita A, Holmes N. Multisensory contributions to the 3-D representation of visuotactile peripersonal space in humans: evidence from the crossmodal congruency task. J Physiol Paris 2004; 98:171-89. [PMID: 15477031 DOI: 10.1016/j.jphysparis.2004.03.008] [Citation(s) in RCA: 106] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
Abstract
In order to determine precisely the location of a tactile stimulus presented to the hand it is necessary to know not only which part of the body has been stimulated, but also where that part of the body lies in space. This involves the multisensory integration of visual, tactile, proprioceptive, and even auditory cues regarding limb position. In recent years, researchers have become increasingly interested in the question of how these various sensory cues are weighted and integrated in order to enable people to localize tactile stimuli, as well as to give rise to the 'felt' position of our limbs, and ultimately the multisensory representation of 3-D peripersonal space. We highlight recent research on this topic using the crossmodal congruency task, in which participants make speeded elevation discrimination responses to vibrotactile targets presented to the thumb or index finger, while simultaneously trying to ignore irrelevant visual distractors presented from either the same (i.e., congruent) or a different (i.e., incongruent) elevation. Crossmodal congruency effects (calculated as performance on incongruent-congruent trials) are greatest when visual and vibrotactile stimuli are presented from the same azimuthal location, thus providing an index of common position across different sensory modalities. The crossmodal congruency task has been used to investigate a number of questions related to the representation of space in both normal participants and brain-damaged patients. In this review, we detail the major findings from this research, and highlight areas of convergence with other cognitive neuroscience disciplines.
Affiliation(s)
- Charles Spence
- Department of Experimental Psychology, University of Oxford, South Parks Road, Oxford OX1 3UD, UK.
|
95
|
Ochiai T, Mushiake H, Tanji J. Involvement of the ventral premotor cortex in controlling image motion of the hand during performance of a target-capturing task. Cereb Cortex 2005; 15:929-37. [PMID: 15483048 DOI: 10.1093/cercor/bhh193] [Citation(s) in RCA: 17] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
The ventral premotor cortex (PMv) has been implicated in the visual guidance of movement. To examine whether neuronal activity in the PMv is involved in controlling the direction of motion of a visual image of the hand or the actual movement of the hand, we trained a monkey to capture a target that was presented on a video display using the same side of its hand as was displayed on the video display. We found that PMv neurons predominantly exhibited premovement activity that reflected the image motion to be controlled, rather than the physical motion of the hand. We also found that the activity of half of such direction-selective PMv neurons depended on which side (left versus right) of the video image of the hand was used to capture the target. Furthermore, this selectivity for a portion of the hand was not affected by changing the starting position of the hand movement. These findings suggest that PMv neurons play a crucial role in determining which part of the body moves in which direction, at least under conditions in which a visual image of a limb is used to guide limb movements.
Affiliation(s)
- Tetsuji Ochiai
- Department of Physiology, Tohoku University School of Medicine, Japan.
|
96
|
Affiliation(s)
- Matthew Botvinick
- Center for Cognitive Neuroscience, University of Pennsylvania, Philadelphia, PA 19104-6241, USA.
|
97
|
Abstract
Electrophysiological recordings in monkeys have now revealed several brain regions that contain bimodal visuotactile neurons capable of responding to either tactile or visual stimuli placed on or near the hands, arms, and face. These cells have now been found in frontal, parietal, and subcortical areas of the monkey brain, suggesting a cortical network of neurons that preferentially represent near peripersonal space. The degree to which the visual responses of such cells rely on input from the primary visual cortex and the extent to which they may contribute to visual perception is not completely understood. Nonetheless, recent neuropsychological studies suggest that a similar representation of near space may be bimodally coded in humans as well. Given the accumulating evidence for specialized processing of visual stimuli placed near the hands and arms, we hypothesized that arm position may be capable of modulating human visual ability. Here we report the case of WM, who lost his ability to see in his left visual hemifield after sustaining damage to his right primary visual cortex. Interestingly, the placement of WM's left arm into his "blind" field resulted in significantly better detection of left visual field stimuli compared to when his hand was placed in his lap at midline. Moreover, we found this enhancement to be confined to stimuli presented within reaching distance (unless a tool that extended WM's reach was held while he performed the test). These findings are highly consistent with the characteristics of the bimodal visuo-tactile neurons that have been described in monkeys. Thus, it seems that arm position can modulate human visual ability, even after damage to the primary visual cortex. This study provides an exciting bridge between monkey neurophysiology and human visual capacity while also offering a novel approach for improving visual defects acquired via cortical injury.
Affiliation(s)
- Krista Schendel
- University of California-Berkeley and Veterans Affairs Northern California Health Care System, Martinez 94553, USA.
|
98
|
Abstract
In order to guide the movement of the body through space, the brain must constantly monitor the position and movement of the body in relation to nearby objects. The effective 'piloting' of the body to avoid or manipulate objects in pursuit of behavioural goals (Popper & Eccles, 1977, p. 129) requires an integrated neural representation of the body (the 'body schema') and of the space around the body ('peripersonal space'). In the review that follows, we describe and evaluate recent results from neurophysiology, neuropsychology, and psychophysics in both human and non-human primates that support the existence of an integrated representation of visual, somatosensory, and auditory peripersonal space. Such a representation involves primarily visual, somatosensory, and proprioceptive modalities, operates in body part-centred reference frames, and demonstrates significant plasticity. Recent research shows that the use of tools, and the viewing of one's body or body parts in mirrors and in video monitors, may also modulate the visuotactile representation of peripersonal space.
Affiliation(s)
- Charles Spence
- Department of Experimental Psychology, University of Oxford, Oxford, UK
|
99
|
Obayashi S, Suhara T, Nagai Y, Okauchi T, Maeda J, Iriki A. Monkey brain areas underlying remote-controlled operation. Eur J Neurosci 2004; 19:1397-407. [PMID: 15016097 DOI: 10.1111/j.1460-9568.2004.03200.x] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
Abstract
We can control distant tools effectively by manipulating other objects as controllers in various remote-operated ways, even when the mechanics of the two are altered. To master remote operation, we may rely on an internal representation that organizes individual moves of the controller and tool into a set of sequences by mapping the motor space among hand, controller and tool as a continuum. The present study confirmed that monkeys can also organize such a sequence by mapping this motor space, and can reorganize it by remapping even after an alteration. In addition, to investigate the neural substrates underlying such mapping/remapping, we used positron emission tomography to measure the regional cerebral blood flow of two monkeys during joystick-controlled operation in which the function of the mechanics could be altered. The monkeys were scanned during three different tasks produced by altering the directional gains of the x or y axis of the joystick: the two mechanics were either congruent (standard task) or incongruent (reversed in the x or y axis; X reverse or Y reverse task, respectively). Compared with random movement of the joystick as the control task, increased activity was detected in the prefrontal cortex, higher-order motor cortex, posterior parietal cortex and cerebellum during the standard task. The brain areas commonly active during performance of the X reverse and Y reverse tasks showed almost the same pattern as during the standard task. These shared areas may not simply be associated with the organization of individual motor imagery, but also with context-dependent processing of reorganization based on current functions by means of internal representation.
Affiliation(s)
- Shigeru Obayashi
- Brain Imaging Project, National Institute of Radiological Sciences, Chiba 263-8555, Japan.
|
100
|
Riva G, Alcañiz M, Anolli L, Bacchetta M, Baños R, Buselli C, Beltrame F, Botella C, Castelnuovo G, Cesa G, Conti S, Galimberti C, Gamberini L, Gaggioli A, Klinger E, Legeron P, Mantovani F, Mantovani G, Molinari E, Optale G, Ricciardiello L, Perpiñá C, Roy S, Spagnolli A, Troiani R, Weddle C. The VEPSY UPDATED Project: clinical rationale and technical approach. Cyberpsychol Behav 2003; 6:433-9. [PMID: 14511457 DOI: 10.1089/109493103322278835] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022]
Abstract
More than 10 years ago, Tart (1990) described virtual reality (VR) as a technological model of consciousness, offering intriguing possibilities for developing diagnostic, inductive, psychotherapeutic, and training techniques that can extend and supplement current ones. Exploiting and understanding this potential is the overall goal of "Telemedicine and Portable Virtual Environment in Clinical Psychology" (VEPSY UPDATED), a European Community-funded research project (IST-2000-25323, www.cybertherapy.info). Its specific goal is the development of different PC-based virtual reality modules to be used in the clinical assessment and treatment of social phobia, panic disorders, male sexual disorders, obesity, and eating disorders. The paper describes the clinical and technical rationale behind the clinical applications developed by the project. Moreover, the paper focuses its analysis on the possible role of VR in clinical psychology and on how it can be used for therapeutic change.
Affiliation(s)
- G Riva
- Applied Technology for Neuro-Psychology Laboratory, Istituto Auxologico Italiano, Verbania, Italy.
|