1. Dupin L, Gerardin E, Térémetz M, Hamdoun S, Turc G, Maier MA, Baron JC, Lindberg PG. Alterations of tactile and anatomical spatial representations of the hand after stroke. Cortex 2024;177:68-83. PMID: 38838560. DOI: 10.1016/j.cortex.2024.04.015.
Abstract
Stroke often causes long-term motor and somatosensory impairments. Motor planning and tactile perception rely on spatial body representations. However, the link between altered spatial body representations, motor deficit and tactile spatial coding remains unclear. This study investigates the relationship between motor deficits and alterations of anatomical (body) and tactile spatial representations of the hand in 20 post-stroke patients with upper limb hemiparesis. Anatomical and tactile spatial representations were assessed from 10 targets (nails and knuckles) respectively cued verbally by their anatomical name or using tactile stimulations. Two distance metrics (hand width and finger length) and two structural measures (relative organization of targets positions and angular deviation of fingers from their physical posture) were computed and compared to clinical assessments, normative data and lesions sites. Over half of the patients had altered anatomical and/or tactile spatial representations. Metrics of tactile and anatomical representations showed common variations, where a wider hand representation was linked to more severe motor deficits. In contrast, alterations in structural measures were not concomitantly observed in tactile and anatomical representations and did not correlate with clinical assessments. Finally, a preliminary analysis showed that specific alterations in tactile structural measures were associated with dorsolateral prefrontal stroke lesions. This study reveals shared and distinct characteristics of anatomical and tactile hand spatial representations, reflecting different mechanisms that can be affected differently after stroke: metrics and location of tactile and anatomical representations were partially shared while the structural measures of tactile and anatomical representations had distinct characteristics.
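The two distance metrics mentioned above can be illustrated with a short sketch (an assumption for illustration, not the authors' code; the landmark coordinates and function names are hypothetical): hand width as the span across the knuckles and finger length as the knuckle-to-nail distance, computed from judged 2-D target positions.

```python
import math

def dist(p, q):
    """Euclidean distance between two 2-D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Hypothetical judged positions (cm) of three landmarks, e.g. from the
# tactile localization task
knuckle_index = (2.0, 5.0)
knuckle_little = (8.0, 4.0)
nail_index = (2.5, 12.0)

hand_width = dist(knuckle_index, knuckle_little)   # span across the knuckles
finger_length = dist(knuckle_index, nail_index)    # knuckle to nail
```

Comparing such metrics between the anatomical (verbally cued) and tactile (stimulation-cued) conditions, and against normative data, is what allows representation-specific distortions to be quantified.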
Affiliation(s)
- Lucile Dupin
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, F-75014 Paris, France; Université Paris Cité, INCC UMR 8002, CNRS, F-75006 Paris, France
- Eloïse Gerardin
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, F-75014 Paris, France
- Maxime Térémetz
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, F-75014 Paris, France
- Sonia Hamdoun
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, F-75014 Paris, France; Service de Médecine Physique et de Réadaptation, GHU-Paris Psychiatrie et Neurosciences, Hôpital Sainte Anne, F-75014 Paris, France
- Guillaume Turc
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, F-75014 Paris, France; Department of Neurology, GHU-Paris Psychiatrie et Neurosciences, FHU Neurovasc, Paris, France
- Marc A Maier
- Université Paris Cité, INCC UMR 8002, CNRS, F-75006 Paris, France
- Jean-Claude Baron
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, F-75014 Paris, France; Department of Neurology, GHU-Paris Psychiatrie et Neurosciences, FHU Neurovasc, Paris, France
- Påvel G Lindberg
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, F-75014 Paris, France
2. Abdulrabba S, Tremblay L, Manson GA. Investigating the online control of goal-directed actions to a tactile target on the body. Exp Brain Res 2022;240:2773-2782. PMID: 36100753. DOI: 10.1007/s00221-022-06445-0.
Abstract
Movement corrections to somatosensory targets have been found to be shorter in latency and larger in magnitude than corrections to external visual targets. Somatosensory targets (e.g., body positions) can be identified using both tactile (i.e., skin receptors) and proprioceptive information (e.g., the sense of body position derived from sensory organs in the muscles and joints). Here, we investigated whether changes in tactile information alone, without changes in proprioception, can elicit shorter correction latencies and larger correction magnitudes than those to external visual targets. Participants made reaching movements to a myofilament touching the index finger of the non-reaching hand (i.e., a tactile target) and to a light-emitting diode (i.e., a visual target). In one-third of the trials, target perturbations occurred 100 ms after movement onset, such that the target was displaced 3 cm either away from or toward the participant. We found that participants demonstrated larger correction magnitudes to visual than to tactile target perturbations. Moreover, we found no differences in correction latency between movements to perturbed tactile and visual targets. Further, we found that while participants detected tactile stimuli earlier than visual stimuli, they took longer to initiate reaching movements to an unperturbed tactile target than to an unperturbed visual target. These results provide evidence that additional processes may be required when planning movements to tactile versus visual targets, and that corrections to changes in tactile target position alone may not confer the latency and magnitude advantages observed for corrections to somatosensory targets (i.e., proprioceptive-tactile targets).
Affiliation(s)
- Sadiya Abdulrabba
- Perceptual-Motor Behaviour Laboratory, Centre for Motor Control, Faculty of Kinesiology and Physical Education, University of Toronto, 55 Harbord Street, Toronto, M5S 2W6, Canada; Sensorimotor Exploration Laboratory, School of Kinesiology and Health Studies, Queen's University, 28 Division St, Kingston, ON, K7L 3N6, Canada
- Luc Tremblay
- Perceptual-Motor Behaviour Laboratory, Centre for Motor Control, Faculty of Kinesiology and Physical Education, University of Toronto, 55 Harbord Street, Toronto, M5S 2W6, Canada
- Gerome Aleandro Manson
- Sensorimotor Exploration Laboratory, School of Kinesiology and Health Studies, Queen's University, 28 Division St, Kingston, ON, K7L 3N6, Canada
3. Huang CW, Lin CH, Lin YH, Tsai HY, Tseng MT. Neural Basis of Somatosensory Spatial and Temporal Discrimination in Humans: The Role of Sensory Detection. Cereb Cortex 2021;32:1480-1493. PMID: 34427294. DOI: 10.1093/cercor/bhab301.
Abstract
While detecting somatic stimuli from the external environment, an accurate determination of their spatial and temporal properties is essential for human behavior. Whether and how detection relates to the human capacity for somatosensory spatial discrimination (SD) and temporal discrimination (TD) remains unclear. Here, participants underwent functional magnetic resonance imaging while simply detecting vibrotactile stimuli on the leg, judging their location (SD), or deciding their number over time (TD). By conceptualizing tactile discrimination as consisting of detection and determination processes, we found that tactile detection elicited activation specifically involved in SD within the right inferior and superior parietal lobules, two regions previously implicated in the control of spatial attention. These two regions remained activated in the determination process, during which functional connectivity between them predicted individual SD ability. In contrast, tactile detection produced little activation specifically related to TD. Participants' TD ability was implemented in brain regions implicated in coding temporal structures of somatic stimuli (primary somatosensory cortex) and time estimation (anterior cingulate, pre-supplementary motor area, and putamen). Together, our findings indicate a close link between somatosensory detection and SD (but not TD) at the neural level, which helps explain why we can promptly respond toward detected somatic stimuli.
Affiliation(s)
- Cheng-Wei Huang
- Graduate Institute of Clinical Medicine, National Taiwan University College of Medicine, Taipei, Taiwan
- Chin-Hsien Lin
- Department of Neurology, National Taiwan University Hospital, Taipei, Taiwan
- Yi-Hsuan Lin
- Taiwan International Graduate Program in Interdisciplinary Neuroscience, National Taiwan University and Academia Sinica, Taipei, Taiwan
- Hsin-Yun Tsai
- Taiwan International Graduate Program in Interdisciplinary Neuroscience, National Taiwan University and Academia Sinica, Taipei, Taiwan
- Ming-Tsung Tseng
- Graduate Institute of Brain and Mind Sciences, National Taiwan University College of Medicine, Taipei, Taiwan
4. Ossandón JP, König P, Heed T. No Evidence for a Role of Spatially Modulated α-Band Activity in Tactile Remapping and Short-Latency, Overt Orienting Behavior. J Neurosci 2020;40:9088-9102. PMID: 33087476. PMCID: PMC7672998. DOI: 10.1523/jneurosci.0581-19.2020.
Abstract
Oscillatory α-band activity is commonly associated with spatial attention and multisensory prioritization. It has also been suggested to reflect the automatic transformation of tactile stimuli from a skin-based, somatotopic reference frame into an external one. Previous research has not convincingly separated these two possible roles of α-band activity. Previous experimental paradigms have used artificially long delays between tactile stimuli and behavioral responses to aid relating oscillatory activity to these different events. However, this strategy potentially blurs the temporal relationship of α-band activity relative to behavioral indicators of tactile-spatial transformations. Here, we assessed α-band modulation with massive univariate deconvolution, an analysis approach that disentangles brain signals overlapping in time and space. Thirty-one male and female human participants performed a delay-free visual search task in which saccade behavior was unrestricted. A tactile cue to uncrossed or crossed hands was either informative or uninformative about visual target location. α-Band suppression following tactile stimulation was lateralized relative to the stimulated hand over central-parietal electrodes, but relative to its external location over parieto-occipital electrodes. α-Band suppression reflected external touch location only after informative cues, suggesting that posterior α-band lateralization does not index automatic tactile transformation. Moreover, α-band suppression occurred at the time of, or after, the production of the saccades guided by tactile stimulation. These findings challenge the idea that α-band activity is directly involved in tactile-spatial transformation and suggest instead that it reflects delayed, supramodal processes related to attentional reorienting.

Significance statement: Localizing a touch in space requires integrating somatosensory information about skin location and proprioceptive or visual information about posture. The automatic remapping of skin-based tactile information to a location in external space has been proposed to rely on the modulation of oscillatory brain activity in the α-band range across the multiple cortical areas involved in tactile, multisensory, and spatial processing. We report two findings that are inconsistent with this view. First, α-band activity reflected the remapped stimulus location only when touch was task relevant. Second, α-band modulation occurred too late to account for spatially directed behavioral responses and, thus, only after remapping must have taken place. These characteristics contradict the idea that α-band activity directly reflects automatic tactile remapping processes.
Affiliation(s)
- José P Ossandón
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg 20146, Germany
- Peter König
- Institute of Cognitive Science, University of Osnabrück, Osnabrück 49069, Germany
- Department of Neurophysiology and Pathophysiology, Center of Experimental Medicine, University Medical Center Hamburg-Eppendorf, Hamburg 20251, Germany
- Tobias Heed
- Biopsychology and Cognitive Neuroscience, Faculty of Psychology and Movement Science, Bielefeld University, Bielefeld 33615, Germany
- Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld 33615, Germany
5. Chen S, Shi Z, Zang X, Zhu X, Assumpção L, Müller HJ, Geyer T. Crossmodal learning of target-context associations: When would tactile context predict visual search? Atten Percept Psychophys 2020;82:1682-1694. PMID: 31845105. PMCID: PMC7297845. DOI: 10.3758/s13414-019-01907-0.
Abstract
It is well established that statistical learning of visual target locations in relation to constantly positioned visual distractors facilitates visual search. In the present study, we investigated whether such a contextual-cueing effect would also work crossmodally, from touch onto vision. Participants responded to the orientation of a visual target singleton presented among seven homogeneous visual distractors. Four tactile stimuli, two to different fingers of each hand, were presented either simultaneously with or prior to the visual stimuli. The identity of the stimulated fingers provided the crossmodal context cue: in half of the trials, a given visual target location was consistently paired with a given tactile configuration. The visual stimuli were presented above the unseen fingers, ensuring spatial correspondence between vision and touch. We found no evidence of crossmodal contextual cueing when the two sets of items (tactile, visual) were presented simultaneously (Experiment 1). However, a reliable crossmodal effect emerged when the tactile distractors preceded the onset of the visual stimuli by 700 ms (Experiment 2). Crossmodal cueing disappeared again when, after an initial learning phase, participants flipped their hands, making the tactile distractors appear at different positions in external space while their somatotopic positions remained unchanged (Experiment 3). In all experiments, participants were unable to explicitly discriminate learned from novel multisensory arrays. These findings indicate that search-facilitating context memory can be established across vision and touch. However, in order to guide visual search, the (predictive) tactile configurations must be remapped from their initial somatotopic format into a common external representational format.
Affiliation(s)
- Siyi Chen
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Zhuanghua Shi
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Xuelian Zang
- Center for Cognition and Brain Disorders, Institute of Psychological Sciences, Hangzhou Normal University, Hangzhou, China
- Xiuna Zhu
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Leonardo Assumpção
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Hermann J Müller
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Thomas Geyer
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
6.
Abstract
When intercepting a moving target, we typically rely on vision to determine where the target is and where it will soon be. The accuracy of visually guided interception can be represented by a model that combines the perceived position and velocity of the target to estimate when and where to hit it and guides the finger accordingly with a short delay. We might expect the accuracy of interception to similarly depend on haptic judgments of position and velocity. To test this, we conducted separate experiments to measure the precision and any biases in tactile perception of position and velocity, and used our findings to predict the precision and biases that would be present in an interception task if it were performed according to the principle described earlier. We then performed a tactile interception task to test our predictions. We found that interception of tactile targets is guided by similar principles as interception of visual targets.
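The position-plus-velocity extrapolation principle this abstract describes can be illustrated with a toy sketch (an assumption for illustration, not the authors' model or code; all function names and parameter values are hypothetical): the controller aims at where the target will be after a short sensorimotor delay, so any perceptual bias in position or velocity propagates directly into the interception error.

```python
def predicted_position(perceived_pos_cm, perceived_vel_cm_s, delay_s):
    """Linearly extrapolate the target's position over the sensorimotor delay."""
    return perceived_pos_cm + perceived_vel_cm_s * delay_s

def interception_error(true_pos, true_vel, pos_bias, vel_bias, delay_s):
    """Spatial error (cm) caused by perceptual biases in position and velocity."""
    aimed = predicted_position(true_pos + pos_bias, true_vel + vel_bias, delay_s)
    actual = true_pos + true_vel * delay_s  # where the target really ends up
    return aimed - actual

# Hypothetical example: a +0.5 cm position bias and a +2 cm/s velocity bias
# with a 100 ms delay yield a 0.7 cm interception error
err = interception_error(true_pos=10.0, true_vel=8.0,
                         pos_bias=0.5, vel_bias=2.0, delay_s=0.1)
```

Measuring the perceptual biases separately, as the authors did, lets such a model predict the errors expected in the interception task itself.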
Affiliation(s)
- J S Nelson
- Department of Human Movement Sciences, Vrije Universiteit Amsterdam, the Netherlands
- G Baud-Bovy
- Department of Robotics, Brain and Cognitive Sciences, Italian Institute of Technology, Genoa, Italy
- E Brenner
- Department of Human Movement Sciences, Vrije Universiteit Amsterdam, the Netherlands
7. Chinn LK, Hoffmann M, Leed JE, Lockman JJ. Reaching with one arm to the other: Coordinating touch, proprioception, and action during infancy. J Exp Child Psychol 2019;183:19-32. PMID: 30851626. DOI: 10.1016/j.jecp.2019.01.014.
Abstract
Reaching to target locations on the body has received little study despite its importance for adaptive behaviors such as feeding, grooming, and indicating a source of discomfort. This behavior requires multisensory integration, given that it involves coordination of touch, proprioception, and sometimes vision, as well as action. Here we examined the origins of this skill by investigating how infants begin to localize targets on the body and the motor strategies by which they do so. Infants (7-21 months of age) were prompted to reach to a vibrating target placed at five arm/hand locations (elbow, crook of elbow, forearm, palm, and top of hand), one at a time. To manually localize the target, infants needed to reach with one arm to the other. Results suggest that the strategies infants use to localize body targets become increasingly coordinated with age. Most infants showed bimanual coordination and usually moved the target arm toward the reaching arm to assist reaching. Furthermore, intersensory coordination increased with age: simultaneous movements of the two arms became more frequent, as did coordination between vision and reaching. The results provide new information about the development of multisensory integration during tactile localization and how such integration is linked to action.
Affiliation(s)
- L K Chinn
- Department of Psychology, Tulane University, New Orleans, LA 70118, USA
- M Hoffmann
- Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, 160 00 Prague 6, Czech Republic
- J E Leed
- Department of Psychology, Tulane University, New Orleans, LA 70118, USA
- J J Lockman
- Department of Psychology, Tulane University, New Orleans, LA 70118, USA
8. Tamè L, Azañón E, Longo MR. A Conceptual Model of Tactile Processing across Body Features of Size, Shape, Side, and Spatial Location. Front Psychol 2019;10:291. PMID: 30863333. PMCID: PMC6399380. DOI: 10.3389/fpsyg.2019.00291.
Abstract
The processing of touch depends on multiple factors, such as the properties of the skin and the type of receptors stimulated, as well as features related to the actual configuration and shape of the body itself. A large body of research has focused on the effect that the nature of the stimuli has on tactile processing. Less research, however, has focused on features beyond the nature of the touch. In this review, we focus on body-related features that have been investigated for less time and in a more fragmented way. These include the symmetrical quality of the two sides of the body, the postural configuration of the body, and the size and shape of different body parts. We describe what we consider three key aspects: (1) how and at which stages tactile information is integrated between different parts and sides of the body; (2) how tactile signals are integrated with online and stored postural configurations of the body, regarded as priors; and (3) how tactile signals are integrated with representations of body size and shape. We describe how these different body dimensions affect the integration of tactile information and guide motor behavior, integrating them in a single model of tactile processing. We review a wide range of neuropsychological, neuroimaging, and neurophysiological data and suggest a revised model of tactile integration on the basis of the one proposed previously by Longo et al.
Affiliation(s)
- Luigi Tamè
- Department of Psychological Sciences, Birkbeck University of London, London, United Kingdom; School of Psychology, University of Kent, Canterbury, United Kingdom
- Elena Azañón
- Institute of Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany; Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Magdeburg, Germany
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck University of London, London, United Kingdom
9. Task-irrelevant sounds influence both temporal order and apparent-motion judgments about tactile stimuli applied to crossed and uncrossed hands. Atten Percept Psychophys 2017;80:773-783. DOI: 10.3758/s13414-017-1476-5.
10. Schubert JTW, Badde S, Röder B, Heed T. Task demands affect spatial reference frame weighting during tactile localization in sighted and congenitally blind adults. PLoS One 2017;12:e0189067. PMID: 29228023. PMCID: PMC5724835. DOI: 10.1371/journal.pone.0189067.
Abstract
Task demands modulate tactile localization in sighted humans, presumably through weight adjustments in the spatial integration of anatomical, skin-based, and external, posture-based information. In contrast, previous studies have suggested that congenitally blind humans, by default, refrain from automatic spatial integration and localize touch using only skin-based information. Here, sighted and congenitally blind participants localized tactile targets on the palm or back of one hand, while ignoring simultaneous tactile distractors at congruent or incongruent locations on the other hand. We probed the interplay of anatomical and external location codes for spatial congruency effects by varying hand posture: the palms either both faced down, or one faced down and one up. In the latter posture, externally congruent target and distractor locations were anatomically incongruent and vice versa. Target locations had to be reported either anatomically ("palm" or "back" of the hand), or externally ("up" or "down" in space). Under anatomical instructions, performance was more accurate for anatomically congruent than incongruent target-distractor pairs. In contrast, under external instructions, performance was more accurate for externally congruent than incongruent pairs. These modulations were evident in sighted and blind individuals. Notably, distractor effects were overall far smaller in blind than in sighted participants, despite comparable target-distractor identification performance. Thus, the absence of developmental vision seems to be associated with an increased ability to focus tactile attention towards a non-spatially defined target. Nevertheless, that blind individuals exhibited effects of hand posture and task instructions in their congruency effects suggests that, like the sighted, they automatically integrate anatomical and external information during tactile localization. Moreover, spatial integration in tactile processing is, thus, flexibly adapted by top-down information—here, task instruction—even in the absence of developmental vision.
Affiliation(s)
- Jonathan T. W. Schubert
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Stephanie Badde
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Department of Psychology, New York University, New York, United States of America
- Brigitte Röder
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Tobias Heed
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Biopsychology & Cognitive Neuroscience, Faculty of Psychology & Sports Science, Bielefeld University, Bielefeld, Germany
- Center of Excellence in Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
11. Badde S, Heed T. Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cogn Neuropsychol 2016;33:26-47. PMID: 27327353. PMCID: PMC4975087. DOI: 10.1080/02643294.2016.1168791.
Abstract
Touch is bound to the skin – that is, to the boundaries of the body. Yet the activity of neurons in primary somatosensory cortex merely mirrors the spatial distribution of the sensors across the skin. To determine the location of a tactile stimulus on the body, the body's spatial layout must be considered. Moreover, to relate touch to the external world, body posture has to be evaluated. In this review, we argue that posture is incorporated, by default, for any tactile stimulus. However, the relevance of the external location and, thus, its expression in behaviour, depends on various sensory and cognitive factors. Together, these factors imply that an external representation of touch dominates over the skin-based, anatomical one when our focus is on the world rather than on our own body. We conclude that touch localization is a reconstructive process that is adjusted to the context while maintaining all available spatial information.
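The weighted-integration idea this review develops can be sketched in a few lines (a minimal illustration under assumed names and weights, not the authors' formal model): the perceived location of a touch is a weighted combination of an anatomical, skin-based code and an external, posture-dependent code, with the weights set by sensory and cognitive context.

```python
def integrate_location(anatomical, external, w_external):
    """Weighted average of two 1-D location codes; w_external is clamped to [0, 1]."""
    w_external = min(max(w_external, 0.0), 1.0)
    return (1.0 - w_external) * anatomical + w_external * external

# Crossed-hands conflict: the right hand (anatomical code +1) lies in the
# left hemispace (external code -1). A world-focused context (high external
# weight) pulls the percept toward the external side; a body-focused context
# pulls it toward the anatomical side. Weights here are illustrative.
world_focused = integrate_location(anatomical=1.0, external=-1.0, w_external=0.8)
body_focused = integrate_location(anatomical=1.0, external=-1.0, w_external=0.2)
```

Such a scheme captures the review's central claim: both codes are always maintained, and behaviour reflects whichever one the current weighting favours.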
Affiliation(s)
- Stephanie Badde
- Department of Psychology, New York University, New York, NY, USA
- Tobias Heed
- Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany