1. Caggiano P, Cocchini G, Stefano DD, Romano D. The different impact of attention, movement, and sensory information on body metric representation. Q J Exp Psychol (Hove) 2024;77:1044-1051. PMID: 37382243; PMCID: PMC11032629; DOI: 10.1177/17470218231187385.
Abstract
A growing body of research investigating the relationship between body representation and tool-use has shown that body representation is highly malleable. Body representation comprises not only sensory attributes but also motor, action-oriented qualities, which may modulate the subjective experience of our own body. However, how these multisensory factors and their integration specifically guide and constrain the plasticity of body representation has been under-investigated. In this study, we used a forearm bisection task to selectively investigate the contribution of motor, sensory, and attentional aspects in guiding body representation malleability. Results show that the perceived forearm midpoint deviates from the real one. This shift is further modulated by a motor task but not by a sensory task, whereas the attentional task generates more uncertain results. Our findings provide novel insight into the individual roles of movement, somatosensation, and attention in modulating body metric representation.
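For readers unfamiliar with the forearm bisection measure used here, the sketch below (Python, with invented measurements; not the authors' scoring code) illustrates one common way such a shift is quantified: the signed deviation of the perceived midpoint from the true midpoint, expressed as a percentage of forearm length.

    # Hypothetical measurements for a single trial, in cm (example values only).
    forearm_length = 26.0                 # elbow crease to wrist crease
    true_midpoint = forearm_length / 2.0  # 13.0 cm
    perceived_midpoint = 14.1             # point indicated by the participant

    # Signed shift as a percentage of forearm length; the sign convention
    # (positive = toward the wrist) is an assumption made for illustration.
    shift_percent = (perceived_midpoint - true_midpoint) / forearm_length * 100.0
    print(f"Bisection shift: {shift_percent:+.1f}% of forearm length")  # +4.2%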
Affiliation(s)
- Pietro Caggiano, School of Life & Medical Sciences, University of Hertfordshire, Hatfield, UK
- Gianna Cocchini, Psychology Department, Goldsmiths, University of London, London, UK
- Daniele Romano, Psychology Department, University of Milano-Bicocca, Milan, Italy
2. McManus R, Thomas LE. Action does not drive visual biases in peri-tool space. Atten Percept Psychophys 2024;86:525-535. PMID: 38127254; DOI: 10.3758/s13414-023-02826-x.
Abstract
Observers experience visual biases in the area around handheld tools. These biases may occur when active use leads an observer to incorporate a tool into the body schema. However, the visual salience of a handheld tool may instead create an attentional prioritization that is not reliant on body-based representations. We investigated these competing explanations of near-tool visual biases in two experiments during which participants performed a target detection task. Targets could appear near or far from a tool positioned next to a display. In Experiment 1, participants showed facilitation in detecting targets that appeared near a simple handheld rake tool regardless of whether they first used the rake to retrieve objects, but participants who only viewed the tool without holding it were no faster to detect targets appearing near the rake than targets that appeared on the opposite side of the display. In a second experiment, participants who held a novel magnetic tool again showed a near-tool bias even when they refrained from using the tool. Taken together, these results suggest active use is unnecessary, but visual salience is not sufficient, to introduce visual biases in peri-tool space.
Affiliation(s)
- Robert McManus, Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND, USA
- Laura E Thomas, Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND, USA
3. Bell JD, Macuga KL. Knowing your boundaries: no effect of tool-use on body representation following a gather-and-sort task. Exp Brain Res 2023;241:2275-2285. PMID: 37552269; DOI: 10.1007/s00221-023-06669-8.
Abstract
Internal representations of the body have received considerable attention in recent years, particularly in the context of tool-use. Results have supported the notion that these representations are plastic and that tool-use engenders an extension of the internal representation of the arm. However, the limitations of the literature underlying this tool embodiment process have not been adequately considered or tested. For example, there is some evidence that tool-use effects do not extend beyond simplistic tool-use tasks. To further clarify this issue, 66 participants engaged in a period of tool-augmented reaches in a speeded gather-and-sort task. If task characteristics inherent to simplistic tasks are responsible for putative embodiment effects, no effect of tool-use on tactile distance judgments or forearm bisections was predicted. A Bayesian analysis found considerable support for the null hypothesis in both outcome measures, suggesting that some of the evidence for tool embodiment may be based on task characteristics inherent in the narrow range of tool-use tasks used to study it, rather than on a tool incorporation process. Potential sources of influence stemming from these characteristics are discussed.
Affiliation(s)
- Joshua D Bell, School of Psychological Science, Oregon State University, 2950 SW Jefferson Dr., Reed Lodge, Corvallis, OR 97331, USA
- Kristen L Macuga, School of Psychological Science, Oregon State University, 2950 SW Jefferson Dr., Reed Lodge, Corvallis, OR 97331, USA
4. Pathak A, Jovanov K, Nitsche M, Mazalek A, Welsh TN. Do Changes in the Body-Part Compatibility Effect Index Tool-Embodiment? J Mot Behav 2023;55:135-151. PMID: 36642420; DOI: 10.1080/00222895.2022.2132201.
Abstract
Tool-embodiment is said to occur when the representation of the body extends to incorporate the representation of a tool following goal-directed tool-use. The present study was designed to determine whether a tool-embodiment-like phenomenon emerges following different interventions. Participants completed a body-part compatibility task in which they responded with foot or hand presses to colored targets presented on the foot or hand of a model, or on a rake held by the model. This response time (RT) task was performed before and after one of four interventions. In the Virtual-Tangible and the Virtual-Keyboard interventions, participants used customized controllers or keyboards, respectively, to move a virtual rake and ball around a course. Participants in the Tool-Perception intervention manually pointed to targets presented on static images of the virtual tool-use task. Participants in the Tool-Absent group completed math problems and were not exposed to a tool task. Results revealed that all four interventions led to a pattern of pre-/post-intervention changes in RT thought to indicate the emergence of tool-embodiment. Overall, the study indicated that tool-embodiment can occur through repeated exposure to the body-part compatibility paradigm in the absence of any active tool-use, and that the paradigm may tap into more than just the body schema.
Affiliation(s)
- Aarohi Pathak, Centre for Motor Control, Faculty of Kinesiology & Physical Education, University of Toronto, Toronto, ON, Canada
- Kimberley Jovanov, Centre for Motor Control, Faculty of Kinesiology & Physical Education, University of Toronto, Toronto, ON, Canada
- Michael Nitsche, School of Literature, Media, and Communication, Georgia Tech, Atlanta, GA, USA
- Ali Mazalek, Synaesthetic Media Lab, Ryerson University, Toronto, ON, Canada
- Timothy N Welsh, Centre for Motor Control, Faculty of Kinesiology & Physical Education, University of Toronto, Toronto, ON, Canada
5. Gherri E, Xu A, Ambron E, Sedda A. Peripersonal space around the upper and the lower limbs. Exp Brain Res 2022;240:2039-2050. PMID: 35727366; PMCID: PMC9288357; DOI: 10.1007/s00221-022-06387-7.
Abstract
Peripersonal space (PPS), the space closely surrounding the body, is typically characterised by enhanced multisensory integration. Neurophysiological and behavioural studies have consistently shown stronger visuo-tactile integration when a visual stimulus is presented close to the tactually stimulated body part in near space (within PPS) than in far space. However, in the majority of these studies, tactile stimuli were delivered to the upper limbs, torso and face. Therefore, it is not known whether the space surrounding the lower limbs is characterised by similar multisensory properties. To address this question, we asked participants to complete two versions of the classic visuo-tactile crossmodal congruency task in which they had to perform speeded elevation judgements of tactile stimuli presented to the dorsum of the hand and foot while a simultaneous visual distractor was presented at spatially congruent or incongruent locations either in near or far space. In line with existing evidence, when the tactile target was presented to the hand, the size of the crossmodal congruency effect (CCE) decreased in far as compared to near space, suggesting stronger visuo-tactile multisensory integration within PPS. In contrast, when the tactile target was presented to the foot, the CCE was smaller for visual distractors presented in near than in far space. These findings show systematic differences between the representation of PPS around the upper and lower limbs, suggesting that the multisensory properties of the different body part-centred representations of PPS are likely to depend on the potential actions performed by the different body parts.
Affiliation(s)
- Elena Gherri, Department of Philosophy and Communication, University of Bologna, Via Azzo Gardino 23, 40122 Bologna, Italy; Human Cognitive Neuroscience, University of Edinburgh, Edinburgh, UK
- Aolong Xu, Human Cognitive Neuroscience, University of Edinburgh, Edinburgh, UK
- Elisabetta Ambron, Laboratory for Cognition and Neural Stimulation, Neurology Department, School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- Anna Sedda, Department of Psychology, Heriot-Watt University, Edinburgh, UK
6. Empathy as a predictor of peripersonal space: Evidence from the crossmodal congruency task. Conscious Cogn 2022;98:103267. PMID: 34998269; DOI: 10.1016/j.concog.2021.103267.
Abstract
To investigate whether individual differences in empathy predict the characteristics of peripersonal space (PPS) representations, we asked participants to complete the Interpersonal Reactivity Index (IRI) questionnaire and a visuo-tactile crossmodal congruency task (CCT) as an index of PPS. In the CCT, they responded to the elevation of a tactile target while ignoring a visual distractor presented at the same (i.e., congruent) or a different (i.e., incongruent) elevation. The target-distractor distance was also manipulated in depth, with visual distractors randomly presented at near, middle, or far locations (0 cm, 25 cm, or 50 cm). The near and middle crossmodal congruency effects (CCE) were inversely related to participants' scores on the Empathic Concern (EC) sub-scale. Furthermore, the slope of participants' CCE across locations was related to EC scores, with flatter slopes for higher-EC individuals. Thus, higher-EC individuals showed reduced visuo-tactile integration responses within PPS and a reduced differentiation between PPS and extra-personal space (EPS).
7. Peripersonal space in the front, rear, left and right directions for audio-tactile multisensory integration. Sci Rep 2021;11:11303. PMID: 34050213; PMCID: PMC8163804; DOI: 10.1038/s41598-021-90784-5.
Abstract
Peripersonal space (PPS) is important for humans to perform body–environment interactions. However, many previous studies have focused only on a specific region of PPS, such as the front space, even though PPS is thought to exist in all directions. We aimed to measure and compare the peri-trunk PPS in four directions (front, rear, left, and right). To measure the PPS, we used a tactile and an audio stimulus, because auditory information is available at any time in all directions. We used approaching and receding task-irrelevant sounds in the experiment. Observers were asked to respond as quickly as possible when a tactile stimulus was applied to a vibrator on their chest. We found that peri-trunk PPS representations exist with an approaching sound, irrespective of the direction.
8. Cardinali L, Zanini A, Yanofsky R, Roy AC, de Vignemont F, Culham JC, Farnè A. The toolish hand illusion: embodiment of a tool based on similarity with the hand. Sci Rep 2021;11:2024. PMID: 33479395; PMCID: PMC7820319; DOI: 10.1038/s41598-021-81706-6.
Abstract
A tool can function as a body part yet not feel like one: Putting down a fork after dinner does not feel like losing a hand. However, studies show fake body-parts are embodied and experienced as parts of oneself. Typically, embodiment illusions have only been reported when the fake body-part visually resembles the real one. Here we reveal that participants can experience an illusion that a mechanical grabber, which looks scarcely like a hand, is part of their body. We found changes in three signatures of embodiment: the real hand’s perceived location, the feeling that the grabber belonged to the body, and autonomic responses to visible threats to the grabber. These findings show that artificial objects can become embodied even though they bear little visual resemblance to the hand.
Affiliation(s)
- Lucilla Cardinali, Cognition, Motion and Neuroscience Lab, Istituto Italiano di Tecnologia, Genova, Italy
- Alessandro Zanini, Integrative Multisensory Perception Action and Cognition Team (ImpAct), Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, Lyon, France; University UCBL Lyon 1, University of Lyon, Lyon, France
- Russell Yanofsky, Cognition, Motion and Neuroscience Lab, Istituto Italiano di Tecnologia, Genova, Italy
- Alice C Roy, Dynamique du Langage, UMR 5596, Institut des Sciences de l'Homme, CNRS - Lyon University, Lyon, France; University of Lyon II, Lyon, France
- Jody C Culham, Department of Psychology, University of Western Ontario, London, ON, Canada
- Alessandro Farnè, Integrative Multisensory Perception Action and Cognition Team (ImpAct), Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, Lyon, France; University UCBL Lyon 1, University of Lyon, Lyon, France; Neuro-Immersion - Mouvement et Handicap, Hospices Civils de Lyon, Lyon, France; Center for Mind/Brain Sciences (CIMeC), University of Trento, Trento, Italy
9.
Abstract
Previous studies have shown that after actively using a handheld tool for a period of time, participants show visual biases toward stimuli presented near the end of the tool. Research suggests this is driven by an incorporation of the tool into the observer's body schema, extending peripersonal space to surround the tool. This study aims to investigate whether the same visual biases might be seen near remotely operated tools. Participants used one of four tools (a handheld rake in Experiment 1, a remote-controlled drone in Experiment 2, a remote-controlled excavator in Experiment 3, or a handheld excavator in Experiment 4) to rake sand for several minutes, then performed a target-detection task in which they made speeded responses to targets appearing near and far from the tool. In Experiment 1, participants detected targets appearing near the rake significantly faster than targets appearing far from the rake, replicating previous findings. We failed to find strong evidence of improved target detection near remotely operated tools in Experiments 2 and 3, but found clear evidence of near-tool facilitation in Experiment 4 when participants physically picked up the excavator and used it as a handheld tool. These results suggest that observers may not incorporate remotely operated tools into the body schema in the same manner as handheld tools. We discuss potential mechanisms that may drive these differences in embodiment between handheld and remote-controlled tools.
10. Schettler A, Raja V, Anderson ML. The Embodiment of Objects: Review, Analysis, and Future Directions. Front Neurosci 2019;13:1332. PMID: 31920499; PMCID: PMC6923672; DOI: 10.3389/fnins.2019.01332.
Abstract
Here we offer a thorough review of the empirical literature on the conditions under which an object, such as a tool or a prosthetic (whether real or virtual), can be experienced as being in some sense a part or extension of one's body. We discuss this literature both from the standpoint of the apparent malleability of our body representations, and also from within the framework of radical embodied cognition, which understands the phenomenon to result not from an alteration to a representation, but rather from the achievement of a certain kind of sensory/motor coupling. We highlight both the tensions between these frameworks, and also areas where they can productively complement one another for future research.
Affiliation(s)
- Aubrie Schettler, Department of Philosophy, Western University Canada, London, ON, Canada; Rotman Institute of Philosophy, Western University Canada, London, ON, Canada
- Vicente Raja, Rotman Institute of Philosophy, Western University Canada, London, ON, Canada
- Michael L Anderson, Department of Philosophy, Western University Canada, London, ON, Canada; Rotman Institute of Philosophy, Western University Canada, London, ON, Canada; Brain and Mind Institute, Western University Canada, London, ON, Canada
11. Miller LE, Longo MR, Saygin AP. Tool Use Modulates Somatosensory Cortical Processing in Humans. J Cogn Neurosci 2019;31:1782-1795. PMID: 31368823; DOI: 10.1162/jocn_a_01452.
Abstract
Tool use leads to plastic changes in sensorimotor body representations underlying tactile perception. The neural correlates of this tool-induced plasticity in humans have not been adequately characterized. This study used ERPs to investigate the stage of sensory processing modulated by tool use. Somatosensory evoked potentials, elicited by median nerve stimulation, were recorded before and after two forms of object interaction: tool use and hand use. Compared with baseline, tool use-but not use of the hand alone-modulated the amplitude of the P100. The P100 is a mid-latency component that indexes the construction of multisensory models of the body and has generators in secondary somatosensory and posterior parietal cortices. These results mark one of the first demonstrations of the neural correlates of tool-induced plasticity in humans and suggest that tool use modulates relatively late stages of somatosensory processing outside primary somatosensory cortex. This finding is consistent with what has been observed in tool-trained monkeys and suggests that the mechanisms underlying tool-induced plasticity have been preserved across primate evolution.
Affiliation(s)
- Luke E Miller, University of California, San Diego, USA; Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, Bron Cedex, France
12. Blustein D, Gill S, Wilson A, Sensinger J. Crossmodal congruency effect scores decrease with repeat test exposure. PeerJ 2019;7:e6976. PMID: 31179180; PMCID: PMC6535039; DOI: 10.7717/peerj.6976.
Abstract
The incorporation of feedback into a person’s body schema is well established. The crossmodal congruency task (CCT) is used to objectively quantify incorporation without being susceptible to experimenter biases. This visual-tactile interference task is used to calculate the crossmodal congruency effect (CCE) score as a difference in response time between incongruent and congruent trials. Here we show that this metric is susceptible to a learning effect that causes attenuation of the CCE score due to repeated task exposure sessions. We demonstrate that this learning effect is persistent, even after a 6 month hiatus in testing. Two mitigation strategies are proposed: 1. Only use CCE scores that are taken after learning has stabilized, or 2. Use a modified CCT protocol that decreases the task exposure time. We show that the modified and shortened CCT protocol, which may be required to meet time or logistical constraints in laboratory or clinical settings, reduced the impact of the learning effect on CCT results. Importantly, the CCE scores from the modified protocol were not significantly more variable than results obtained with the original protocol. This study highlights the importance of considering exposure time to the CCT when designing experiments and suggests two mitigation strategies to improve the utility of this psychophysical assessment.
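To make the arithmetic behind the CCE score explicit, the sketch below (Python, with made-up response times; not code from the cited study) computes a CCE as the mean incongruent response time minus the mean congruent response time.

    import statistics

    # Hypothetical response times in ms (example values only).
    # Congruent: visual distractor at the same elevation as the tactile target.
    # Incongruent: distractor at the opposite elevation.
    congruent_rts = [512, 498, 530, 505, 521]
    incongruent_rts = [578, 590, 565, 602, 584]

    # Larger CCE values indicate stronger visuo-tactile interference.
    cce = statistics.mean(incongruent_rts) - statistics.mean(congruent_rts)
    print(f"CCE = {cce:.1f} ms")

Under this scoring, the learning effect reported above would appear as a smaller difference between the two means on later testing sessions.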
Affiliation(s)
- Daniel Blustein, Department of Psychology and Neuroscience Program, Rhodes College, Memphis, TN, USA
- Satinder Gill, Institute of Biomedical Engineering, University of New Brunswick, Fredericton, New Brunswick, Canada
- Adam Wilson, Department of Electrical and Computer Engineering, University of New Brunswick, Fredericton, New Brunswick, Canada
- Jon Sensinger, Institute of Biomedical Engineering, University of New Brunswick, Fredericton, New Brunswick, Canada; Department of Electrical and Computer Engineering, University of New Brunswick, Fredericton, New Brunswick, Canada
13. Forsberg A, O'Dowd A, Gherri E. Tool use modulates early stages of visuo-tactile integration in far space: Evidence from event-related potentials. Biol Psychol 2019;145:42-54. PMID: 30970269; DOI: 10.1016/j.biopsycho.2019.03.020.
Abstract
The neural representation of multisensory space near the body is modulated by the active use of long tools in non-human primates. Here, we investigated whether the electrophysiological correlates of visuo-tactile integration in near and far space were modulated by active tool use in healthy humans. Participants responded to a tactile target delivered to one hand while an irrelevant visual stimulus was presented ipsilaterally in near or far space. This crossmodal task was performed after the use of either short or long tools. Crucially, the P100 component elicited by visuo-tactile stimuli was enhanced on far as compared to near space trials after the use of long tools, while no such difference was present after short tool use. Thus, we found increased neural responses in brain areas encoding tactile stimuli to the body when visual stimuli were presented close to the tip of the tool after long tool use. This increased visuo-tactile integration on far space trials following the use of long tools might indicate a transient remapping of multisensory space. We speculate that performing voluntary actions with long tools strengthens the representation of sensory information arising within portions of space (i.e. the hand and the tip of the tool) that are most functionally relevant to one's behavioural goals.
Affiliation(s)
- Alicia Forsberg, Human Cognitive Neuroscience, Psychology, University of Edinburgh, UK
- Alan O'Dowd, Human Cognitive Neuroscience, Psychology, University of Edinburgh, UK
- Elena Gherri, Human Cognitive Neuroscience, Psychology, University of Edinburgh, UK
14. Bufacchi RJ, Iannetti GD. An Action Field Theory of Peripersonal Space. Trends Cogn Sci 2018;22:1076-1090. PMID: 30337061; PMCID: PMC6237614; DOI: 10.1016/j.tics.2018.09.004.
Abstract
Predominant conceptual frameworks often describe peripersonal space (PPS) as a single, distance-based, in-or-out zone within which stimuli elicit enhanced neural and behavioural responses. Here we argue that this intuitive framework is contradicted by neurophysiological and behavioural data. First, PPS-related measures are not binary, but graded with proximity. Second, they are strongly influenced by factors other than proximity, such as walking, tool use, stimulus valence, and social cues. Third, many different PPS-related responses exist, and each can be used to describe a different space. Here, we reconceptualise PPS as a set of graded fields describing behavioural relevance of actions aiming to create or avoid contact between objects and the body. This reconceptualisation incorporates PPS into mainstream theories of action selection and behaviour.
Affiliation(s)
- Rory J Bufacchi, Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK; Centre for Mathematics and Physics in the Life Sciences and Experimental Biology (CoMPLEX), University College London, London, UK
- Gian Domenico Iannetti, Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK; Centre for Mathematics and Physics in the Life Sciences and Experimental Biology (CoMPLEX), University College London, London, UK; Neuroscience and Behaviour Laboratory, Istituto Italiano di Tecnologia, Rome, Italy
15. Hunley SB, Lourenco SF. What is peripersonal space? An examination of unresolved empirical issues and emerging findings. Wiley Interdiscip Rev Cogn Sci 2018;9:e1472. PMID: 29985555; DOI: 10.1002/wcs.1472.
Abstract
Findings from diverse fields of study, including neuroscience, psychology, zoology, and sociology, demonstrate that human and non-human primates maintain a representation of the space immediately surrounding the body, known as peripersonal space (PPS). However, progress in this field has been hampered by the lack of an agreed upon definition of PPS. Since the beginning of its formal study, scientists have argued that PPS plays a crucial role in both defensive and non-defensive actions. Yet consensus is lacking about the cognitive and neural instantiation of these functions. In particular, researchers have begun to ask whether a single, unified system of spatial-attentional resources supports both the defensive and non-defensive functions of PPS or, rather, whether there are multiple, independent systems. Moreover, there are open questions about the specificity of PPS. For example: Does PPS dissociate from other well-known phenomena such as personal space and the body schema? Finally, emerging research has brought attention to important questions about individual differences in the flexibility of PPS and the distribution of PPS in front compared to behind the body. In this advanced review, we shed light on questions about the nature of PPS, offering answers when the research permits or providing recommendations for achieving answers in future research. In so doing, we lay the groundwork for a comprehensive definition of PPS. This article is categorized under: Cognitive Biology > Evolutionary Roots of Cognition; Psychology > Attention; Psychology > Perception and Psychophysics; Neuroscience > Plasticity.
Affiliation(s)
- Samuel B Hunley, Department of Psychology, Emory University, Atlanta, Georgia
16. Heimler B, Baruffaldi F, Bonmassar C, Venturini M, Pavani F. Multisensory Interference in Early Deaf Adults. J Deaf Stud Deaf Educ 2017;22:422-433. PMID: 28961871; DOI: 10.1093/deafed/enx025.
Abstract
Multisensory interactions in deaf cognition are largely unexplored. Unisensory studies suggest that behavioral/neural changes may be more prominent for visual compared to tactile processing in early deaf adults. Here we test whether such an asymmetry results in increased saliency of vision over touch during visuo-tactile interactions. A group of 23 early deaf and 25 hearing adults performed two consecutive visuo-tactile spatial interference tasks. Participants responded either to the elevation of the tactile target while ignoring a concurrent visual distractor at central or peripheral locations (respond to touch/ignore vision), or they performed the opposite task (respond to vision/ignore touch). Multisensory spatial interference emerged in both tasks for both groups. Crucially, deaf participants showed increased interference compared to hearing adults when they attempted to respond to tactile targets and ignore visual distractors, with enhanced difficulties with ipsilateral visual distractors. Analyses of task order revealed that in deaf adults, interference from visual distractors on tactile targets was much stronger when this task followed the task in which vision was behaviorally relevant (respond to vision/ignore touch). These novel results suggest that behavioral/neural changes related to early deafness determine enhanced visual dominance during visuo-tactile multisensory conflict.
Affiliation(s)
- Benedetta Heimler, Department of Medical Neurobiology, Institute for Medical Research Israel-Canada, Faculty of Medicine, The Hebrew University of Jerusalem, Hadassah Ein-Kerem, Building 3, 5th Floor, Jerusalem 91120, Israel; The Edmond and Lily Safra Center for Brain Research, The Hebrew University of Jerusalem, Hadassah Ein-Kerem, Building 3, 5th Floor, Jerusalem 91120, Israel
- Claudia Bonmassar, Center for Mind/Brain Sciences (CIMeC), University of Trento, Corso Bettini 31, Rovereto TN 38068, Italy
- Marta Venturini, Department of Psychology and Cognitive Sciences, University of Trento, Corso Bettini 31, Rovereto TN 38068, Italy
- Francesco Pavani, Center for Mind/Brain Sciences (CIMeC), University of Trento, Corso Bettini 31, Rovereto TN 38068, Italy; Department of Psychology and Cognitive Sciences, University of Trento, Corso Bettini 31, Rovereto TN 38068, Italy
17. von Suchodoletz A, Fäsche A, Skuballa IT. The Role of Attention Shifting in Orthographic Competencies: Cross-Sectional Findings from 1st, 3rd, and 8th Grade Students. Front Psychol 2017;8:1665. PMID: 29018387; PMCID: PMC5622960; DOI: 10.3389/fpsyg.2017.01665.
Abstract
Attention shifting refers to one core component of executive functions, a set of higher-order cognitive processes that predict different aspects of academic achievement. To date, few studies have investigated the role of attention shifting in orthographic competencies during middle childhood and early adolescence. In the present study, 69 first-grade, 121 third-grade, and 85 eighth-grade students' attention shifting was tested with a computer version of the Dimensional Change Card Sort (DCCS; Zelazo, 2006). General spelling skills and specific writing and spelling strategies were assessed with the Hamburger Writing Test (May, 2002). Results suggested associations between attention shifting and various orthographic competencies that differ across age groups and by sex. Across all age groups, better attention shifting was associated with fewer errors in applying alphabetical strategies. In third graders, better attention shifting was furthermore related to better general spelling skills and fewer errors in using orthographical strategies. In this age group, associations did not differ by sex. Among first graders, attention shifting was negatively related to general spelling skills, but only for boys. In contrast, attention shifting was positively related to general spelling skills in eighth graders, but only for girls. Finally, better attention shifting was associated with fewer case-related errors in eighth graders, independent of students' sex. In sum, the data provide insight into both variability and consistency in the pattern of relations between attention shifting and various orthographic competencies among elementary and middle school students.
Affiliation(s)
- Antje von Suchodoletz, Department of Psychology, University of Freiburg, Freiburg, Germany; Department of Psychology, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
- Anika Fäsche, Department of Psychology, University of Freiburg, Freiburg, Germany
- Irene T. Skuballa, Department of Psychology, University of Freiburg, Freiburg, Germany; Department of Psychology, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
18. Miller LE, Cawley-Bennett A, Longo MR, Saygin AP. The recalibration of tactile perception during tool use is body-part specific. Exp Brain Res 2017;235:2917-2926. PMID: 28702834; DOI: 10.1007/s00221-017-5028-y.
Abstract
Two decades of research have demonstrated that using a tool modulates spatial representations of the body. Whether this embodiment is specific to representations of the tool-using limb or extends to representations of other body parts has received little attention. Several studies of other perceptual phenomena have found that modulations to the primary somatosensory representation of the hand transfer to the face, due in part to their close proximity in primary somatosensory cortex. In the present study, we investigated whether tool-induced recalibration of tactile perception on the hand transfers to the cheek. Participants verbally estimated the distance between two tactile points applied to either their hand or face, before and after using a hand-shaped tool. Tool use recalibrated tactile distance perception on the hand, in line with previous findings, but left perception on the cheek unchanged. This finding provides support for the idea that embodiment is body-part specific. Furthermore, it suggests that tool-induced perceptual recalibration occurs at a level of somatosensory processing where representations of the hand and face have become functionally disentangled.
Affiliation(s)
- Luke E Miller, Department of Cognitive Science, University of California, San Diego, USA; Kavli Institute for Brain and Mind, University of California, San Diego, USA
- Matthew R Longo, Department of Psychological Sciences, Birkbeck, University of London, London, UK
- Ayse P Saygin, Department of Cognitive Science, University of California, San Diego, USA; Kavli Institute for Brain and Mind, University of California, San Diego, USA
19. von Suchodoletz A, Slot PL, Shroff DM. Measuring executive function in Indian mothers and their 4-year-old daughters. Psych J 2017;6:16-28. PMID: 28371553; DOI: 10.1002/pchj.156.
Abstract
Executive function (EF), including cognitive flexibility, attention shifting, and inhibitory control, has been linked to a range of outcomes across the lifespan, such as school readiness and academic functioning, job performance, health, and social-emotional well-being. Yet, research investigating links between parent EF and child EF is still limited. This is partly due to challenges in measuring the same EF abilities in parents and their children. The current study investigated the applicability of a computer-based battery of various EF tasks for use with both mothers and children. The battery included the following EF tasks: Dimensional Change Card Sort, Hearts and Flowers, and Fish Flanker. Participants were 80 Indian mothers and their 4-year-old daughters. EF was measured with regard to accuracy scores, response time, and inverse efficiency (IE) scores of the most complex blocks of each task. Scoring patterns indicated that children's task performance appeared to be determined by their ability to recognize the cue indicating which task to perform at any given trial and to inhibit an incorrect response. In contrast, mothers' performance appeared to be determined by response time, that is, their ability to be quick in giving the correct response. However, for both children and mothers, IE scores best captured individual differences in EF performance between participants. Furthermore, confirmatory factor analyses found that, for both children and mothers, all EF measures loaded on a latent factor, suggesting that the measures shared common variance in EF. There appeared to be no significant association between mothers' and children's EF scores, controlling for several background variables. Directions for further research include examining the applicability of the EF task battery to reliably describe developmental trajectories of EF abilities over time, and further examining variability in the parent-child EF association across the lifespan.
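As a concrete illustration of the inverse efficiency (IE) measure mentioned above, the sketch below (Python, with invented values; the abstract does not spell out its exact scoring procedure) assumes the standard definition of IE as mean correct response time divided by proportion correct.

    # Hypothetical scores for one participant on one task block (example values only).
    mean_correct_rt_ms = 850.0   # mean response time on correct trials, in ms
    proportion_correct = 0.80    # accuracy for the same block

    # Inverse efficiency penalizes speed by accuracy: slower responding or
    # more errors both push the score upward.
    inverse_efficiency = mean_correct_rt_ms / proportion_correct
    print(f"IE = {inverse_efficiency:.1f} ms")  # 1062.5 ms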
Affiliation(s)
- Pauline L Slot, Department of Child, Family, and Education Studies, Utrecht University, Utrecht, Netherlands
- Delshad M Shroff, Department of Psychology, New York University Abu Dhabi, Abu Dhabi, UAE
20. Marini F, Romano D, Maravita A. The contribution of response conflict, multisensory integration, and body-mediated attention to the crossmodal congruency effect. Exp Brain Res 2016;235:873-887. PMID: 27913817; DOI: 10.1007/s00221-016-4849-4.
Abstract
The crossmodal congruency task is a consolidated paradigm for investigating interactions between vision and touch. In this task, participants judge the elevation of a tactile target stimulus while ignoring a visual distracter stimulus that may occur at a congruent or incongruent elevation, thus engendering a measure of visuo-tactile interference (crossmodal congruency effect, CCE). The CCE reflects perceptual, attentional, and response-related factors, but their respective roles and interactions have not been set out yet. In two experiments, we used the original version of the crossmodal congruency task as well as ad hoc manipulations of the experimental setting and of the participants' posture for characterizing the contributions of multisensory integration, body-mediated attention, and response conflict to the CCE. Results of the two experiments consistently showed that the largest amount of variance in the CCE is explained by the reciprocal elevation of visual and tactile stimuli. This finding is compatible with a major role of response conflict for the CCE. Weaker yet distinguishable contributions come from multisensory integration, observed in the absence of response conflict, and from hand-mediated attentional binding, observed with the modified posture and in the presence of response conflict. Overall, this study informs the long-standing debate about mechanisms underlying the CCE by revealing that the visuo-tactile interference in this task is primarily due to the competition between opposite response tendencies, with an additional contribution of multisensory integration and hand-mediated attentional binding.
Affiliation(s)
- Francesco Marini, Department of Psychology, University of Milano-Bicocca, Milan, Italy; Department of Psychology, University of Nevada Reno, 1664 N Virginia St, Reno, NV 89557, USA
- Daniele Romano, Department of Psychology, University of Milano-Bicocca, Milan, Italy; Milan Center for Neuroscience, Milan, Italy
- Angelo Maravita, Department of Psychology, University of Milano-Bicocca, Milan, Italy; Milan Center for Neuroscience, Milan, Italy
21. Martel M, Cardinali L, Roy AC, Farnè A. Tool-use: An open window into body representation and its plasticity. Cogn Neuropsychol 2016;33:82-101. PMID: 27315277; PMCID: PMC4975077; DOI: 10.1080/02643294.2016.1167678.
Abstract
Over the last decades, scientists have questioned the origin of the exquisite human mastery of tools. Seminal studies in monkeys, healthy participants and brain-damaged patients have primarily focused on the plastic changes that tool-use induces on spatial representations. More recently, we focused on the modifications tool-use must exert on the sensorimotor system and highlighted plastic changes at the level of the body representation used by the brain to control our movements, i.e., the Body Schema. Evidence is emerging that tool-use also affects more visually and conceptually based representations of the body, such as the Body Image. Here we offer a critical review of the way different tool-use paradigms have been, and should be, used to disentangle the critical features that are responsible for tool incorporation into different body representations. We conclude that tool-use may offer a very valuable means to investigate high-order body representations and their plasticity.
Affiliation(s)
- Marie Martel, Laboratoire Dynamique du Langage, CNRS UMR 5596, Lyon 69007, France; University of Lyon, Lyon 69000, France
- Lucilla Cardinali, The Brain and Mind Institute, University of Western Ontario, London, ON, Canada
- Alice C. Roy, Laboratoire Dynamique du Langage, CNRS UMR 5596, Lyon 69007, France; University of Lyon, Lyon 69000, France
- Alessandro Farnè, University of Lyon, Lyon 69000, France; Integrative Multisensory Perception Action & Cognition team (ImpAct), Lyon Neuroscience Research Center, INSERM U1028, CNRS UMR5292, Lyon 69000, France; Hospices Civils de Lyon, Mouvement et Handicap & Neuro-immersion, Lyon 69000, France
22. Aymerich-Franch L, Ganesh G. The role of functionality in the body model for self-attribution. Neurosci Res 2015;104:31-37. PMID: 26602981; DOI: 10.1016/j.neures.2015.11.001.
Abstract
Bodily self-attribution, the feeling that a body (or parts of it) is owned by me, is a fundamental component of one's self. Previous studies have suggested that, in addition to a necessary multi-sensory stimulation, the sense of body ownership is determined by the body model, a representation of our body in the brain. It is however unclear what features constitute the body representation. To examine this issue, we first briefly review results on embodiment of artificial limbs, whole bodies and virtual avatars to understand the apparent anatomical, volumetric and spatial constraints associated with the sense of ownership toward external entities. We then discuss how considering limb functionality in the body model can provide an integrated explanation for most of the varied embodiment results in literature. We propose that the self-attribution of an entity may be determined, not just by its physical features, but by whether the entity can afford actions that the brain has associated with the limb which it replaces.
23. Park GD, Reed CL. Haptic over visual information in the distribution of visual attention after tool-use in near and far space. Exp Brain Res 2015;233:2977-2988. PMID: 26126805; DOI: 10.1007/s00221-015-4368-8.
Abstract
Despite attentional prioritization for grasping space near the hands, tool-use appears to transfer attentional bias to the tool's end/functional part. The contributions of haptic and visual inputs to attentional distribution along a tool were investigated as a function of tool-use in near (Experiment 1) and far (Experiment 2) space. Visual attention was assessed with a 50/50, go/no-go, target discrimination task, while a tool was held next to targets appearing near the tool-occupied hand or tool-end. Target response times (RTs) and sensitivity (d-prime) were measured at target locations, before and after functional tool practice, for three conditions: (1) open-tool: tool-end visible (visual + haptic inputs), (2) hidden-tool: tool-end visually obscured (haptic input only), and (3) short-tool: stick missing the tool's length/end (control condition: hand occupied but no visual/haptic input). In near space, both open- and hidden-tool groups showed a tool-end attentional bias (faster RTs toward the tool-end) before practice; after practice, RTs near the hand improved. In far space, the open-tool group showed no bias before practice; after practice, target RTs near the tool-end improved. However, the hidden-tool group showed a consistent tool-end bias despite practice. The absence of comparable effects in the short-tool group suggested that the hidden-tool group's results were specific to haptic inputs. In conclusion, (1) the allocation of visual attention along a tool due to tool practice differs in near and far space, and (2) visual attention is drawn toward the tool's end even when visually obscured, suggesting that haptic input provides sufficient information for directing attention along the tool.
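As a note on the sensitivity measure named above, the sketch below (Python, with made-up trial counts; not the cited study's analysis code) shows the standard d-prime computation for a detection task, d' = z(hit rate) - z(false-alarm rate).

    from statistics import NormalDist

    # Hypothetical trial counts for one condition (example values only).
    hits, misses = 45, 5                        # go (target-present) trials
    false_alarms, correct_rejections = 8, 42    # no-go (target-absent) trials

    hit_rate = hits / (hits + misses)                             # 0.90
    fa_rate = false_alarms / (false_alarms + correct_rejections)  # 0.16

    # d-prime: separation of signal and noise distributions in z units.
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)
    print(f"d' = {d_prime:.2f}")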
Affiliation(s)
- George D Park, Department of Psychology, Claremont Graduate University, Claremont, CA, USA; Systems Technology, Inc., Hawthorne, CA, USA
- Catherine L Reed, Department of Psychology, Claremont Graduate University, Claremont, CA, USA; Psychology Department, Claremont McKenna College, Claremont, CA, USA
24. Galli G, Noel JP, Canzoneri E, Blanke O, Serino A. The wheelchair as a full-body tool extending the peripersonal space. Front Psychol 2015;6:639. PMID: 26042069; PMCID: PMC4435246; DOI: 10.3389/fpsyg.2015.00639.
Abstract
Dedicated multisensory mechanisms in the brain represent peripersonal space (PPS), a limited portion of space immediately surrounding the body. Previous studies have illustrated the malleability of PPS representation through hand-object interaction, showing that tool use extends the limits of the hand-centered PPS. In the present study we investigated the effects of a special tool, the wheelchair, in extending the action possibilities of the whole body. We used a behavioral measure to quantify the extension of the PPS around the body before and after Active (Experiment 1) and Passive (Experiment 2) training with a wheelchair and when participants were blindfolded (Experiment 3). Results suggest that a wheelchair-mediated passive exploration of far space extended PPS representation. This effect was specifically related to the possibility of receiving information from the environment through vision, since no extension effect was found when participants were blindfolded. Surprisingly, the active motor training did not induce any modification in PPS representation, probably because the wheelchair maneuver was demanding for non-expert users and thus they may have prioritized processing of information from close to the wheelchair rather than at far spatial locations. Our results suggest that plasticity in PPS representation after tool use seems not to strictly depend on active use of the tool itself, but is triggered by simultaneous processing of information from the body and the space where the body acts in the environment, which is more extended in the case of wheelchair use. These results contribute to our understanding of the mechanisms underlying body-environment interaction for developing and improving applications of assistive technological devices in different clinical populations.
Affiliation(s)
- Giulia Galli, Istituto di Ricovero e Cura a Carattere Scientifico, Santa Lucia Foundation, Rome, Italy; Laboratory of Cognitive Neuroscience, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Jean Paul Noel, Laboratory of Cognitive Neuroscience, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland; Center for Neuroprosthetics, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland; Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Elisa Canzoneri, Laboratory of Cognitive Neuroscience, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland; Center for Neuroprosthetics, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Olaf Blanke, Laboratory of Cognitive Neuroscience, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland; Center for Neuroprosthetics, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland; Department of Neurology, University Hospital Geneva, Geneva, Switzerland
- Andrea Serino, Laboratory of Cognitive Neuroscience, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland; Center for Neuroprosthetics, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
25. de Vignemont F, Iannetti G. How many peripersonal spaces? Neuropsychologia 2015;70:327-334. PMID: 25448854; DOI: 10.1016/j.neuropsychologia.2014.11.018.
26. Dynamic expansion of alert responses to incoming painful stimuli following tool use. Neuropsychologia 2015;70:486-494. PMID: 25595342; DOI: 10.1016/j.neuropsychologia.2015.01.019.
27. Van der Stoep N, Nijboer T, Van der Stigchel S, Spence C. Multisensory interactions in the depth plane in front and rear space: A review. Neuropsychologia 2015;70:335-349. DOI: 10.1016/j.neuropsychologia.2014.12.007.
28. Ganesh G, Yoshioka T, Osu R, Ikegami T. Immediate tool incorporation processes determine human motor planning with tools. Nat Commun 2014;5:4524. PMID: 25077612; PMCID: PMC4279266; DOI: 10.1038/ncomms5524.
Abstract
Human dexterity with tools is believed to stem from our ability to incorporate and use tools as parts of our body. However, tool incorporation, evident as extensions in our body representation and peri-personal space, has been observed predominantly after extended tool exposure and does not explain our immediate motor behaviours when we change tools. Here we utilize two novel experiments to elucidate the presence of additional, immediate tool incorporation effects that determine motor planning with tools. Interestingly, tools were observed to immediately induce a trial-by-trial, tool-length-dependent shortening of the perceived limb lengths, opposite to the elongations observed after extended tool use. Our results thus show that tools induce a dual effect on our body representation: an immediate shortening that critically affects motor planning with a new tool, and a slow elongation, probably a consequence of skill-related changes in sensory-motor mappings with the repeated use of the tool.
Affiliation(s)
- G Ganesh, Center for Information and Neural Networks (CiNet), National Institute of Information and Communications Technology, 1-4 Yamadaoka, Osaka University Campus, Suita 565-0871, Japan; CNRS-AIST JRL (Joint Robotics Laboratory), UMI3218/CRT, Intelligent Systems Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba Central 2, 1-1-1 Umezono, Tsukuba, Ibaraki 305-8568, Japan
- T Yoshioka, ATR Brain Information Communications Research Laboratories, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
- R Osu, ATR Brain Information Communications Research Laboratories, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
- T Ikegami, Center for Information and Neural Networks (CiNet), National Institute of Information and Communications Technology, 1-4 Yamadaoka, Osaka University Campus, Suita 565-0871, Japan
Collapse
|
29
|
|
30
|
Ward J, Wright T. Sensory substitution as an artificially acquired synaesthesia. Neurosci Biobehav Rev 2014; 41:26-35. [DOI: 10.1016/j.neubiorev.2012.07.007] [Citation(s) in RCA: 27] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/20/2012] [Revised: 07/18/2012] [Accepted: 07/26/2012] [Indexed: 10/28/2022]
|
31
|
van Stralen HE, van Zandvoort MJ, Hoppenbrouwers SS, Vissers LM, Kappelle LJ, Dijkerman HC. Affective touch modulates the rubber hand illusion. Cognition 2014; 131:147-58. [DOI: 10.1016/j.cognition.2013.11.020] [Citation(s) in RCA: 86] [Impact Index Per Article: 8.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/10/2013] [Revised: 11/25/2013] [Accepted: 11/29/2013] [Indexed: 12/01/2022]
|
32
|
|
33
|
Brown LE, Goodale MA. A brief review of the role of training in near-tool effects. Front Psychol 2013; 4:576. [PMID: 24027545 PMCID: PMC3759798 DOI: 10.3389/fpsyg.2013.00576] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/17/2013] [Accepted: 08/11/2013] [Indexed: 11/30/2022] Open
Abstract
Research suggests that, like near-hand effects, visual targets appearing near the tip of a hand-held real or virtual tool are treated differently than other targets. This paper reviews neurological and behavioral evidence relevant to near-tool effects and describes how the effect varies with the functional properties of the tool and the knowledge of the participant. In particular, the paper proposes that motor knowledge plays a key role in the appearance of near-tool effects.
Collapse
Affiliation(s)
- Liana E Brown
- Department of Psychology, Trent University, Peterborough, ON, Canada
| | | |
Collapse
|
34
|
The effect of limb crossing and limb congruency on multisensory integration in peripersonal space for the upper and lower extremities. Conscious Cogn 2013; 22:545-55. [PMID: 23579198 DOI: 10.1016/j.concog.2013.02.006] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/14/2012] [Revised: 02/16/2013] [Accepted: 02/18/2013] [Indexed: 11/24/2022]
Abstract
The present study investigated how multisensory integration in peripersonal space is modulated by limb posture (i.e. whether the limbs are crossed or uncrossed) and limb congruency (i.e. whether the observed body part matches the actual position of one's limb). This was done separately for the upper limbs (Experiment 1) and the lower limbs (Experiment 2). The crossmodal congruency task was used to measure peripersonal space integration for the hands and the feet. It was found that the peripersonal space representation for the hands but not for the feet is dynamically updated based on both limb posture and limb congruency. Together these findings show how dynamic cues from vision, proprioception, and touch are integrated in peripersonal limb space and highlight fundamental differences in the way in which peripersonal space is represented for the upper and lower extremity.
Collapse
|
35
|
Rognini G, Sengül A, Aspell JE, Salomon R, Bleuler H, Blanke O. Visuo-tactile integration and body ownership during self-generated action. Eur J Neurosci 2013; 37:1120-9. [PMID: 23351116 DOI: 10.1111/ejn.12128] [Citation(s) in RCA: 32] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2011] [Revised: 12/10/2012] [Accepted: 12/13/2012] [Indexed: 02/02/2023]
Abstract
Although there is increasing knowledge about how visual and tactile cues from the hands are integrated, little is known about how self-generated hand movements affect such multisensory integration. Visuo-tactile integration often occurs under highly dynamic conditions requiring sensorimotor updating. Here, we quantified visuo-tactile integration by measuring cross-modal congruency effects (CCEs) in different bimanual hand movement conditions with the use of a robotic platform. We found that classical CCEs also occurred during bimanual self-generated hand movements, and that such movements lowered the magnitude of visuo-tactile CCEs as compared to static conditions. Visuo-tactile integration, body ownership and the sense of agency were decreased by adding a temporal visuo-motor delay between hand movements and visual feedback. These data show that visual stimuli interfere less with the perception of tactile stimuli during movement than during static conditions, especially when decoupled from predictive motor information. The results suggest that current models of visuo-tactile integration need to be extended to account for multisensory integration in dynamic conditions.
Collapse
Affiliation(s)
- G Rognini
- Center for Neuroprosthetics, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
| | | | | | | | | | | |
Collapse
|
36
|
I feel who I see: Visual body identity affects visual–tactile integration in peripersonal space. Conscious Cogn 2012; 21:1355-64. [DOI: 10.1016/j.concog.2012.06.012] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/13/2012] [Revised: 06/20/2012] [Accepted: 06/26/2012] [Indexed: 12/22/2022]
|
37
|
|
38
|
Abstract
Plasticity of body representation fundamentally underpins human tool use. Recent studies have demonstrated remarkably complex plasticity of body representation in humans, showing that such plasticity (1) occurs flexibly across multiple time scales and (2) involves multiple body representations responding differently to tool use. Such findings reveal remarkable sophistication of body plasticity in humans, suggesting that Vaesen may overestimate the similarity of such mechanisms in humans and non-human primates.
Collapse
|
39
|
Gozli DG, Brown LE. Agency and control for the integration of a virtual tool into the peripersonal space. Perception 2012; 40:1309-19. [PMID: 22416589 DOI: 10.1068/p7027] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/14/2022]
Abstract
Our representation of the peripersonal space is tied to our representation of our bodies. This representation appears to be flexible and it can be updated to include the space in which tools work, particularly when the tool is actively used. One indicator of this update is the increased efficiency with which sensory events near the tool are processed. In the present study we examined the role of visuomotor control in extending peripersonal space to a common virtual tool: a computer mouse cursor. In particular, after participants were exposed to different spatial mappings between movements of the mouse cursor and movements of their hand, participants' performance in a motion-onset detection task was measured, with the mouse cursor as the stimulus. When participants, during exposure, had the ability to move the cursor efficiently and accurately (familiar hand-cursor mapping), they detected motion-onset targets more quickly than when they could not move the cursor at all during exposure (no hand-cursor mapping). Importantly, reversing the spatial correspondence between the movements of the hand and the cursor (unfamiliar hand-cursor mapping) during exposure, which was thought to preserve the ability to move the cursor (ie agency) while weakening the ability to make the movements efficiently and accurately (ie control), eliminated the detection-facilitation effect. These results provide evidence for the possible extension of peripersonal space to frequently used objects in the virtual domain. Importantly, these extensions seem to depend on the participant's knowledge of the dynamic spatial mapping between the acting limb and the visible virtual tool.
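For readers who want a concrete picture of the three exposure conditions, the sketch below (illustrative only; not the authors' experimental code, and the function name is invented here) shows how a hand displacement could be mapped to a cursor displacement under the familiar, unfamiliar (reversed), and absent hand-cursor mappings described in the abstract.

```python
def cursor_update(hand_dx, hand_dy, mapping="familiar"):
    """Map one hand (mouse) displacement to a cursor displacement."""
    if mapping == "familiar":      # cursor follows the hand normally
        return hand_dx, hand_dy
    if mapping == "unfamiliar":    # spatial correspondence reversed
        return -hand_dx, -hand_dy
    if mapping == "none":          # cursor cannot be moved during exposure
        return 0, 0
    raise ValueError(f"unknown mapping: {mapping}")

# Example: the same rightward hand movement under each mapping.
for m in ("familiar", "unfamiliar", "none"):
    print(m, cursor_update(10, 0, m))
```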
Collapse
Affiliation(s)
- Davood G Gozli
- Department of Psychology, University of Toronto, 100 St George Street, Toronto, ON M5S 3G3, Canada.
| | | |
Collapse
|
40
|
Does tool use extend peripersonal space? A review and re-analysis. Exp Brain Res 2012; 218:273-82. [DOI: 10.1007/s00221-012-3042-7] [Citation(s) in RCA: 77] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/31/2011] [Accepted: 02/15/2012] [Indexed: 11/28/2022]
|
41
|
Brown LE, Doole R, Malfait N. The role of motor learning in spatial adaptation near a tool. PLoS One 2011; 6:e28999. [PMID: 22174944 PMCID: PMC3236781 DOI: 10.1371/journal.pone.0028999] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/18/2011] [Accepted: 11/18/2011] [Indexed: 11/24/2022] Open
Abstract
Some visual-tactile (bimodal) cells have visual receptive fields (vRFs) that overlap and extend moderately beyond the skin of the hand. Neurophysiological evidence suggests, however, that a vRF will grow to encompass a hand-held tool following active tool use but not after passive holding. Why does active tool use, and not passive holding, lead to spatial adaptation near a tool? We asked whether spatial adaptation could be the result of motor or visual experience with the tool, and we distinguished between these alternatives by isolating motor from visual experience with the tool. Participants learned to use a novel, weighted tool. The active training group received both motor and visual experience with the tool, the passive training group received visual experience with the tool, but no motor experience, and finally, a no-training control group received neither visual nor motor experience using the tool. After training, we used a cueing paradigm to measure how quickly participants detected targets, varying whether the tool was placed near or far from the target display. Only the active training group detected targets more quickly when the tool was placed near, rather than far, from the target display. This effect of tool location was not present for either the passive-training or control groups. These results suggest that motor learning influences how visual space around the tool is represented.
Collapse
Affiliation(s)
- Liana E Brown
- Department of Psychology, Trent University, Peterborough, Ontario, Canada.
| | | | | |
Collapse
|
42
|
van Elk M, Blanke O. Manipulable objects facilitate cross-modal integration in peripersonal space. PLoS One 2011; 6:e24641. [PMID: 21949738 PMCID: PMC3176228 DOI: 10.1371/journal.pone.0024641] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2011] [Accepted: 08/15/2011] [Indexed: 11/18/2022] Open
Abstract
Previous studies have shown that tool use often modifies one's peripersonal space – i.e. the space directly surrounding our body. Given our profound experience with manipulable objects (e.g. a toothbrush, a comb or a teapot), in the present study we hypothesized that the observation of pictures representing manipulable objects would result in a remapping of peripersonal space as well. Subjects were required to report the location of vibrotactile stimuli delivered to the right hand, while ignoring visual distractors superimposed on pictures representing everyday objects. Pictures could represent objects that were of high manipulability (e.g. a cell phone), medium manipulability (e.g. a soap dispenser) and low manipulability (e.g. a computer screen). In the first experiment, when subjects attended to the action associated with the objects, a strong cross-modal congruency effect (CCE) was observed for pictures representing medium and high manipulability objects, reflected in faster reaction times if the vibrotactile stimulus and the visual distractor were in the same location, whereas no CCE was observed for low manipulability objects. This finding was replicated in a second experiment in which subjects attended to the visual properties of the objects. These findings suggest that the observation of manipulable objects facilitates cross-modal integration in peripersonal space.
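As a rough illustration of how the cross-modal congruency effect (CCE) reported above is quantified, the sketch below computes mean incongruent minus mean congruent reaction time for a set of trials; the numbers are hypothetical and the code is not taken from the study.

```python
from statistics import mean

def cce_ms(trials):
    """Cross-modal congruency effect: trials are (congruent: bool, rt_ms: float) pairs."""
    congruent = [rt for c, rt in trials if c]
    incongruent = [rt for c, rt in trials if not c]
    return mean(incongruent) - mean(congruent)

# Hypothetical data: a larger CCE for a high-manipulability object than for a low one.
high_manip = [(True, 520), (True, 540), (False, 600), (False, 620)]
low_manip = [(True, 530), (True, 545), (False, 550), (False, 560)]
print(cce_ms(high_manip), cce_ms(low_manip))  # 80.0 and 17.5
```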
Collapse
Affiliation(s)
- Michiel van Elk
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland.
| | | |
Collapse
|
43
|
|
44
|
|
45
|
|
46
|
|
47
|
Cuppini C, Magosso E, Ursino M. Organization, maturation, and plasticity of multisensory integration: insights from computational modeling studies. Front Psychol 2011; 2:77. [PMID: 21687448 PMCID: PMC3110383 DOI: 10.3389/fpsyg.2011.00077] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/29/2010] [Accepted: 04/12/2011] [Indexed: 11/15/2022] Open
Abstract
In this paper, we present two neural network models – devoted to two specific and widely investigated aspects of multisensory integration – in order to demonstrate the potential of computational models to provide insight into the neural mechanisms underlying organization, development, and plasticity of multisensory integration in the brain. The first model considers visual–auditory interaction in a midbrain structure named superior colliculus (SC). The model is able to reproduce and explain the main physiological features of multisensory integration in SC neurons and to describe how SC integrative capability – not present at birth – develops gradually during postnatal life depending on sensory experience with cross-modal stimuli. The second model tackles the problem of how tactile stimuli on a body part and visual (or auditory) stimuli close to the same body part are integrated in multimodal parietal neurons to form the perception of peripersonal (i.e., near) space. The model investigates how the extension of peripersonal space – where multimodal integration occurs – may be modified by experience such as use of a tool to interact with the far space. The utility of the modeling approach lies in several aspects: (i) The two models, although devoted to different problems and simulating different brain regions, share some common mechanisms (lateral inhibition and excitation, non-linear neuron characteristics, recurrent connections, competition, Hebbian rules of potentiation and depression) that may govern more generally the fusion of senses in the brain, and the learning and plasticity of multisensory integration. (ii) The models may help the interpretation of behavioral and psychophysical responses in terms of neural activity and synaptic connections. (iii) The models can make testable predictions that can help guide future experiments in order to validate, reject, or modify the main assumptions.
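To make one of the shared mechanisms listed above more tangible, here is a minimal, purely illustrative sketch of Hebbian potentiation of visual-to-multimodal connections when a far visual stimulus is repeatedly paired with touch, the kind of pairing produced by tool use; it is not the authors' model and it omits lateral inhibition, non-linear units, and depression terms.

```python
import numpy as np

n_positions = 10                               # visual positions: 0 = on the hand, 9 = far space
w = np.linspace(1.0, 0.0, n_positions)         # initial visual weights fall off with distance
eta = 0.05                                     # learning rate

def hebbian_step(w, visual_pos, touch):
    v = np.zeros(n_positions)
    v[visual_pos] = 1.0                        # visual input at one position
    t = 1.0 if touch else 0.0                  # tactile input on the hand
    y = w @ v + t                              # multimodal unit activity (linear, for simplicity)
    return np.clip(w + eta * y * v, 0.0, 1.5)  # Hebbian potentiation, bounded

# "Tool use": a far visual position repeatedly co-occurs with touch on the hand.
for _ in range(200):
    w = hebbian_step(w, visual_pos=8, touch=True)
print(w.round(2))                              # the weight at the far position has grown
```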
Collapse
Affiliation(s)
- Cristiano Cuppini
- Department of Electronics, Computer Science and Systems, University of Bologna, Bologna, Italy
| | | | | |
Collapse
|
48
|
de Grave DDJ, Brenner E, Smeets JBJ. Using a stick does not necessarily alter judged distances or reachability. PLoS One 2011; 6:e16697. [PMID: 21390215 PMCID: PMC3044725 DOI: 10.1371/journal.pone.0016697] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2010] [Accepted: 12/26/2010] [Indexed: 11/19/2022] Open
Abstract
BACKGROUND It has been reported that participants judge an object to be closer after a stick has been used to touch it than after touching it with the hand. In this study we try to find out why this is so. METHODOLOGY We showed six participants a cylindrical object on a table. On separate trials (randomly intermixed) participants either estimated verbally how far the object is from their body or they touched a remembered location. Touching was done either with the hand or with a stick (in separate blocks). In three different sessions, participants touched either the object location or the location halfway to the object location. Verbal judgments were given either in centimeters or in terms of whether the object would be reachable with the hand. No differences in verbal distance judgments or touching responses were found between the blocks in which the stick or the hand was used. CONCLUSION Instead of finding out why the judged distance changes when using a tool, we found that using a stick does not necessarily alter judged distances or judgments about the reachability of objects.
Collapse
Affiliation(s)
- Denise D J de Grave
- Research Institute MOVE, Faculty of Human Movement Sciences, VU University, Amsterdam, The Netherlands.
| | | | | |
Collapse
|
49
|
Beauchamp MS, Pasalar S, Ro T. Neural substrates of reliability-weighted visual-tactile multisensory integration. Front Syst Neurosci 2010; 4:25. [PMID: 20631844 PMCID: PMC2903191 DOI: 10.3389/fnsys.2010.00025] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/03/2010] [Accepted: 05/25/2010] [Indexed: 02/03/2023] Open
Abstract
As sensory systems deteriorate in aging or disease, the brain must relearn the appropriate weights to assign each modality during multisensory integration. Using blood-oxygen level dependent functional magnetic resonance imaging of human subjects, we tested a model for the neural mechanisms of sensory weighting, termed “weighted connections.” This model holds that the connection weights between early and late areas vary depending on the reliability of the modality, independent of the level of early sensory cortex activity. When subjects detected viewed and felt touches to the hand, a network of brain areas was active, including visual areas in lateral occipital cortex, somatosensory areas in inferior parietal lobe, and multisensory areas in the intraparietal sulcus (IPS). In agreement with the weighted connection model, the connection weight measured with structural equation modeling between somatosensory cortex and IPS increased for somatosensory-reliable stimuli, and the connection weight between visual cortex and IPS increased for visual-reliable stimuli. This double dissociation of connection strengths was similar to the pattern of behavioral responses during incongruent multisensory stimulation, suggesting that weighted connections may be a neural mechanism for behavioral reliability weighting.
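The idea of weighting each modality by its reliability can be illustrated with the standard inverse-variance rule from cue-combination models; this formula is a textbook illustration and is not claimed to be the specific model tested in the paper.

```python
def reliability_weighted_estimate(x_vis, var_vis, x_tac, var_tac):
    """Combine a visual and a tactile estimate, weighting each by its reliability (1/variance)."""
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_tac)
    w_tac = 1 - w_vis
    return w_vis * x_vis + w_tac * x_tac

# Hypothetical example: when vision is noisy (high variance), touch dominates the estimate.
print(reliability_weighted_estimate(x_vis=10.0, var_vis=4.0, x_tac=12.0, var_tac=1.0))  # 11.6
```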
Collapse
Affiliation(s)
- Michael S Beauchamp
- Department of Neurobiology and Anatomy, University of Texas Health Science Center at Houston, Houston, TX, USA
| | | | | |
Collapse
|
50
|
|