1
Abstract
Perhaps the most recognizable “sensory map” in neuroscience is the somatosensory homunculus. Although the homunculus suggests a direct link between cortical territory and body part, the relationship is actually ambiguous without a decoder that knows this mapping. How the somatosensory system derives a spatial code from an activation in the homunculus is a longstanding mystery we aimed to solve. We propose that touch location is disambiguated using multilateration, a computation used by surveying and global positioning systems to localize objects. We develop a Bayesian formulation of multilateration, which we implement in a neural network to identify its computational signature. We then detect this signature in psychophysical experiments. Our results suggest that multilateration provides the homunculus-to-body mapping necessary for localizing touch.

Perhaps the most recognizable sensory map in all of neuroscience is the somatosensory homunculus. Although it seems straightforward, this simple representation belies the complex link between an activation in a somatotopic map and the associated touch location on the body. Any isolated activation is spatially ambiguous without a neural decoder that can read its position within the entire map, but how this is computed by neural networks is unknown. We propose that the somatosensory system implements multilateration, a common computation used by surveying and global positioning systems to localize objects. Specifically, to decode touch location on the body, multilateration estimates the relative distance between the afferent input and the boundaries of a body part (e.g., the joints of a limb). We show that a simple feedforward neural network, which captures several fundamental receptive field properties of cortical somatosensory neurons, can implement a Bayes-optimal multilateral computation. Simulations demonstrated that this decoder produced a pattern of localization variability between two boundaries that was unique to multilateration. Finally, we identify this computational signature of multilateration in actual psychophysical experiments, suggesting that it is a candidate computational mechanism underlying tactile localization.
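The boundary-based decoding described here lends itself to a brief numerical illustration. The Python sketch below is not the authors' implementation: it simply assumes that each boundary-referenced distance estimate carries Weber-like noise (noise proportional to distance) and that the two estimates are fused with inverse-variance weights, the Bayes-optimal rule for independent Gaussian cues. Under these assumptions the fused estimate's variability is lowest near the limb boundaries and highest mid-segment, one concrete example of the boundary-dependent variability pattern the abstract refers to. All names and parameter values (`multilateration_sketch`, `weber`, the 40 cm limb) are illustrative.

```python
import numpy as np

def multilateration_sketch(limb_length=40.0, weber=0.1, n_trials=5000, seed=0):
    """Toy simulation of touch localization by multilateration.

    Each trial draws two noisy distance estimates -- one from each limb
    boundary (e.g., wrist and elbow) -- whose noise grows with distance
    (Weber-like scaling), then fuses them with inverse-variance weights.
    """
    rng = np.random.default_rng(seed)
    true_locs = np.linspace(1.0, limb_length - 1.0, 25)   # touch sites along the limb
    fused_sd = []
    for x in true_locs:
        d1, d2 = x, limb_length - x                        # distances to the two boundaries
        s1, s2 = weber * d1, weber * d2                    # noise scales with distance
        est1 = d1 + rng.normal(0.0, s1, n_trials)          # noisy estimate from boundary 1
        est2 = limb_length - (d2 + rng.normal(0.0, s2, n_trials))  # re-expressed from boundary 2
        w1 = s2**2 / (s1**2 + s2**2)                       # inverse-variance weight for estimate 1
        fused = w1 * est1 + (1.0 - w1) * est2              # Bayes-optimal fusion
        fused_sd.append(fused.std())
    return true_locs, np.array(fused_sd)

locs, sds = multilateration_sketch()
print(sds.round(2))  # variability is lowest near the boundaries, highest near mid-limb
```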
2
Unwalla K, Goldreich D, Shore DI. Exploring Reference Frame Integration Using Response Demands in a Tactile Temporal-Order Judgement Task. Multisens Res 2021; 34:1-32. PMID: 34375947. DOI: 10.1163/22134808-bja10057.
Abstract
Exploring the world through touch requires the integration of internal (e.g., anatomical) and external (e.g., spatial) reference frames - you only know what you touch when you know where your hands are in space. The deficit observed in tactile temporal-order judgements when the hands are crossed over the midline provides one tool to explore this integration. We used foot pedals and required participants to focus on either the hand that was stimulated first (an anatomical bias condition) or the location of the hand that was stimulated first (a spatiotopic bias condition). Spatiotopic-based responses produce a larger crossed-hands deficit, presumably by focusing observers on the external reference frame. In contrast, anatomical-based responses focus the observer on the internal reference frame and produce a smaller deficit. This manipulation thus provides evidence that observers can change the relative weight given to each reference frame. We quantify this effect using a probabilistic model that produces a population estimate of the relative weight given to each reference frame. We show that a spatiotopic bias can result in either a larger external weight (Experiment 1) or a smaller internal weight (Experiment 2) and provide an explanation of when each one would occur.
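To make the weighting idea concrete, here is a minimal, hypothetical sketch (not the authors' probabilistic model) of how relative internal and external weights could translate into a crossed-hands deficit in a TOJ task: with uncrossed hands the two reference frames agree, whereas with crossed hands the external frame votes for the wrong hand, so a larger external weight (or a smaller internal weight) widens the uncrossed-crossed accuracy gap. All function names and parameter values are illustrative.

```python
import numpy as np
from scipy.stats import norm

def toj_accuracy(soa_ms, w_internal, w_external, crossed, noise_ms=150.0):
    """Toy model: probability of reporting the correct first-stimulated hand.

    Evidence for the correct hand grows with the SOA, weighted by the internal
    frame; the external frame adds evidence when the hands are uncrossed but
    subtracts it when they are crossed (it points to the wrong hand).
    """
    frame_sign = -1.0 if crossed else 1.0
    evidence = (w_internal + frame_sign * w_external) * soa_ms
    return norm.cdf(evidence / noise_ms)   # cumulative-Gaussian response rule

soas = np.array([30.0, 60.0, 120.0, 240.0])
for crossed in (False, True):
    print("crossed" if crossed else "uncrossed",
          toj_accuracy(soas, w_internal=0.8, w_external=0.5, crossed=crossed).round(2))
# Increasing w_external (a spatiotopic bias) or decreasing w_internal widens the
# gap between the two printed rows, i.e., enlarges the crossed-hands deficit.
```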
Affiliation(s)
- Kaian Unwalla
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4K1, Canada
- Daniel Goldreich
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4K1, Canada
- David I Shore
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4K1, Canada
- Multisensory Perception Laboratory, a Division of the Multisensory Mind Inc., Hamilton, ON, Canada
3
Unwalla K, Kearney H, Shore DI. Reliability of the Crossed-Hands Deficit in Tactile Temporal Order Judgements. Multisens Res 2020; 34:387-421. PMID: 33706262. DOI: 10.1163/22134808-bja10039.
Abstract
Crossing the hands over the midline impairs performance on a tactile temporal order judgement (TOJ) task, resulting in the crossed-hands deficit. This deficit results from a conflict between two reference frames - one internal (somatotopic) and the other external (spatial) - for coding stimulus location. The substantial individual differences observed in the crossed-hands deficit highlight the differential reliance on these reference frames. For example, women have been reported to place a greater emphasis on the external reference frame than men, resulting in a larger crossed-hands deficit for women. It has also been speculated that individuals with an eating disorder place a greater weight on the external reference frame. Further exploration of individual differences in reference frame weighting using a tactile TOJ task requires that the reliability of the task be established. In Experiment 1, we investigated the reliability of the tactile TOJ task across two sessions separated by one week and found high reliability in the magnitude of the crossed-hands deficit. In Experiment 2, we report the split-half reliability across multiple experiments (both published and unpublished). Overall, tactile TOJ reliability was high. Experiments with small to moderate crossed-hands deficits showed good reliability; those with larger deficits showed even higher reliability. Researchers should try to maximize the size of the effect when interested in individual differences in the use of the internal and external reference frames.
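Split-half reliability of an individual-differences measure such as the crossed-hands deficit is typically computed by correlating per-participant scores from two halves of the trials and correcting for test length. The short sketch below uses hypothetical data and is not the authors' analysis pipeline; it shows the standard Spearman-Brown-corrected calculation.

```python
import numpy as np

def split_half_reliability(odd_scores, even_scores):
    """Split-half reliability of a per-participant effect (e.g., the crossed-hands
    deficit computed separately from odd- and even-numbered trials), corrected to
    full test length with the Spearman-Brown prophecy formula."""
    r = np.corrcoef(odd_scores, even_scores)[0, 1]   # correlation between the two halves
    return 2.0 * r / (1.0 + r)                       # Spearman-Brown correction

# Hypothetical deficit scores (one value per participant, in ms) from each half of the trials.
rng = np.random.default_rng(1)
true_deficit = rng.normal(100, 40, size=30)
odd = true_deficit + rng.normal(0, 25, size=30)
even = true_deficit + rng.normal(0, 25, size=30)
print(round(split_half_reliability(odd, even), 2))
```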
Affiliation(s)
- Kaian Unwalla
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- Hannah Kearney
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- David I Shore
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
4
Maij F, Seegelke C, Medendorp WP, Heed T. External location of touch is constructed post-hoc based on limb choice. eLife 2020; 9:e57804. PMID: 32945257. PMCID: PMC7561349. DOI: 10.7554/elife.57804.
Abstract
When humans indicate on which hand a tactile stimulus occurred, they often err when their hands are crossed. This finding seemingly supports the view that the automatically determined touch location in external space affects limb assignment: the crossed right hand is localized in left space, and this conflict presumably provokes hand assignment errors. Here, participants judged on which hand the first of two stimuli, presented during a bimanual movement, had occurred, and then indicated its external location by a reach-to-point movement. When participants incorrectly chose the hand stimulated second, they pointed to where that hand had been at the correct, first time point, though no stimulus had occurred at that location. This behavior suggests that stimulus localization depended on hand assignment, not vice versa. It is, thus, incompatible with the notion of automatic computation of external stimulus location upon occurrence. Instead, humans construct external touch location post-hoc and on demand.
Affiliation(s)
- Femke Maij
- Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
- Christian Seegelke
- Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany; Center for Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
- W Pieter Medendorp
- Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
- Tobias Heed
- Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany; Center for Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
5
Immersive virtual reality reveals that visuo-proprioceptive discrepancy enlarges the hand-centred peripersonal space. Neuropsychologia 2020; 146:107540. PMID: 32593721. DOI: 10.1016/j.neuropsychologia.2020.107540.
Abstract
Vision and proprioception, which inform the system about the body's position in space, seem crucial in defining the boundary of peripersonal space (PPS). What happens to the PPS representation when a conflict between vision and proprioception arises? We capitalize on immersive virtual reality to dissociate vision and proprioception by presenting the participants' 3D hand image in congruent/incongruent positions with respect to the participants' real hand. To measure the hand-centred PPS, we exploit the multisensory integration that occurs when visual stimuli are delivered simultaneously with tactile stimuli applied to a body region, i.e., the visual enhancement of touch (VET). Participants are instructed to respond to tactile stimuli while ignoring visual stimuli (a red LED), which can appear either near to or far from the hand receiving the tactile (electrical) stimuli. The results show that, when vision and proprioception are congruent (i.e., real and virtual hand coincide), a space-dependent modulation of the VET effect occurs (with faster responses when visual stimuli are near to than far from the stimulated hand). In contrast, when vision and proprioception are incongruent (i.e., a discrepancy between real and virtual hand is present), a comparable VET effect is observed when visual stimuli occur near to the real hand and when they occur far from it, but close to the virtual hand. These findings, also confirmed by the independent estimate of a Bayesian causal inference model, suggest that, when the visuo-proprioceptive discrepancy makes the coding of the hand position less precise, the hand-centred PPS is enlarged, likely to optimize reactions to external events.
6
Badde S, Navarro KT, Landy MS. Modality-specific attention attenuates visual-tactile integration and recalibration effects by reducing prior expectations of a common source for vision and touch. Cognition 2020; 197:104170. PMID: 32036027. DOI: 10.1016/j.cognition.2019.104170.
Abstract
At any moment in time, streams of information reach the brain through the different senses. Given this wealth of noisy information, it is essential that we select information of relevance - a function fulfilled by attention - and infer its causal structure to eventually take advantage of redundancies across the senses. Yet, the role of selective attention during causal inference in cross-modal perception is unknown. We tested experimentally whether the distribution of attention across vision and touch enhances cross-modal spatial integration (visual-tactile ventriloquism effect, Expt. 1) and recalibration (visual-tactile ventriloquism aftereffect, Expt. 2) compared to modality-specific attention, and then used causal-inference modeling to isolate the mechanisms behind the attentional modulation. In both experiments, we found stronger effects of vision on touch under distributed than under modality-specific attention. Model comparison confirmed that participants used Bayes-optimal causal inference to localize visual and tactile stimuli presented as part of a visual-tactile stimulus pair, whereas simultaneously collected unity judgments - indicating whether the visual-tactile pair was perceived as spatially-aligned - relied on a sub-optimal heuristic. The best-fitting model revealed that attention modulated sensory and cognitive components of causal inference. First, distributed attention led to an increase of sensory noise compared to selective attention toward one modality. Second, attending to both modalities strengthened the stimulus-independent expectation that the two signals belong together, the prior probability of a common source for vision and touch. Yet, only the increase in the expectation of vision and touch sharing a common source was able to explain the observed enhancement of visual-tactile integration and recalibration effects with distributed attention. In contrast, the change in sensory noise explained only a fraction of the observed enhancements, as its consequences vary with the overall level of noise and stimulus congruency. Increased sensory noise leads to enhanced integration effects for visual-tactile pairs with a large spatial discrepancy, but reduced integration effects for stimuli with a small or no cross-modal discrepancy. In sum, our study indicates a weak a priori association between visual and tactile spatial signals that can be strengthened by distributing attention across both modalities.
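The causal-inference machinery referred to above can be sketched compactly. The code below follows the standard Körding et al. (2007) formulation rather than the specific model fitted in this paper: it computes the posterior probability that a visual-tactile pair shares a common source and model-averages the fused and segregated estimates of tactile location. Raising the common-source prior, as distributed attention is reported to do here, pulls the tactile estimate toward the visual stimulus. Parameter values and the zero-mean spatial prior are illustrative assumptions.

```python
import numpy as np

def causal_inference_estimate(x_vis, x_tac, sigma_vis, sigma_tac, sigma_prior, p_common):
    """Toy Bayesian causal-inference estimate of tactile location for one
    visual-tactile pair, assuming a zero-mean Gaussian spatial prior."""
    # Likelihood of the two measurements under a single common source ...
    var_c = (sigma_vis**2 * sigma_tac**2 + sigma_vis**2 * sigma_prior**2
             + sigma_tac**2 * sigma_prior**2)
    like_c = np.exp(-((x_vis - x_tac)**2 * sigma_prior**2
                      + x_vis**2 * sigma_tac**2 + x_tac**2 * sigma_vis**2)
                    / (2 * var_c)) / (2 * np.pi * np.sqrt(var_c))
    # ... and under two independent sources.
    var_v, var_t = sigma_vis**2 + sigma_prior**2, sigma_tac**2 + sigma_prior**2
    like_i = (np.exp(-x_vis**2 / (2 * var_v)) / np.sqrt(2 * np.pi * var_v)
              * np.exp(-x_tac**2 / (2 * var_t)) / np.sqrt(2 * np.pi * var_t))
    post_c = like_c * p_common / (like_c * p_common + like_i * (1 - p_common))
    # Tactile-location estimates under each causal structure (reliability-weighted).
    fused = ((x_vis / sigma_vis**2 + x_tac / sigma_tac**2 + 0.0 / sigma_prior**2)
             / (1 / sigma_vis**2 + 1 / sigma_tac**2 + 1 / sigma_prior**2))
    alone = (x_tac / sigma_tac**2) / (1 / sigma_tac**2 + 1 / sigma_prior**2)
    return post_c * fused + (1 - post_c) * alone     # model averaging

# A higher common-source prior (e.g., under distributed attention) shifts the
# perceived tactile location further toward the discrepant visual stimulus.
for p in (0.3, 0.8):
    print(p, round(causal_inference_estimate(x_vis=10.0, x_tac=0.0, sigma_vis=2.0,
                                             sigma_tac=4.0, sigma_prior=20.0,
                                             p_common=p), 2))
```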
Affiliation(s)
- Stephanie Badde
- Department of Psychology and Center of Neural Science, New York University, 6 Washington Place, New York, NY, 10003, USA.
- Karen T Navarro
- Department of Psychology, University of Minnesota, 75 E River Rd., Minneapolis, MN, 55455, USA
- Michael S Landy
- Department of Psychology and Center of Neural Science, New York University, 6 Washington Place, New York, NY, 10003, USA
7
Badde S, Röder B, Heed T. Feeling a Touch to the Hand on the Foot. Curr Biol 2019; 29:1491-1497.e4. PMID: 30955931. DOI: 10.1016/j.cub.2019.02.060.
Abstract
Where we perceive a touch putatively depends on topographic maps that code the touch's location on the skin [1] as well as its position in external space [2-5]. However, neither somatotopic nor external-spatial representations can account for atypical tactile percepts in some neurological patients and amputees; referral of touch to an absent or anaesthetized hand after stimulation of a foot [6, 7] or the contralateral hand [8-10] challenges the role of topographic representations when attributing touch to the limbs. Here, we show that even healthy adults systematically misattribute touch to other limbs. Participants received two tactile stimuli, each to a different limb-hand or foot-and reported which of all four limbs had been stimulated first. Hands and feet were either uncrossed or crossed to dissociate body-based and external-spatial representations [11-14]. Remarkably, participants regularly attributed the first touch to a limb that had received neither of the two stimuli. The erroneously reported, non-stimulated limb typically matched the correct limb with respect to limb type or body side. Touch was misattributed to non-stimulated limbs of the other limb type and body side only if they were placed at the correct limb's canonical (default) side of space. The touch's actual location in external space was irrelevant. These errors replicated across several contexts, and modeling linked them to incoming sensory evidence rather than to decision strategies. The results highlight the importance of the touched body part's identity and canonical location but challenge the role of external-spatial tactile representations when attributing touch to a limb.
Affiliation(s)
- Stephanie Badde
- Department of Psychology and Center of Neural Sciences, New York University, 6 Washington Place, New York, NY 10003, USA; Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany.
- Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Tobias Heed
- Biopsychology & Cognitive Neuroscience, Bielefeld University, Universitätsstrasse 25, 33615 Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1, 33619 Bielefeld, Germany
8
Tamè L, Azañón E, Longo MR. A Conceptual Model of Tactile Processing across Body Features of Size, Shape, Side, and Spatial Location. Front Psychol 2019; 10:291. PMID: 30863333. PMCID: PMC6399380. DOI: 10.3389/fpsyg.2019.00291.
Abstract
The processing of touch depends on multiple factors, such as the properties of the skin and the type of receptors stimulated, as well as features related to the actual configuration and shape of the body itself. A large body of research has focused on the effect that the nature of the stimuli has on tactile processing. Less research, however, has focused on features beyond the nature of the touch. In this review, we focus on body-related features that have been investigated for less time and in a more fragmented way. These include the symmetrical quality of the two sides of the body, the postural configuration of the body, and the size and shape of different body parts. We describe what we consider to be three key aspects: (1) how and at which stages tactile information is integrated between different parts and sides of the body; (2) how tactile signals are integrated with online and stored postural configurations of the body, regarded as priors; and (3) how tactile signals are integrated with representations of body size and shape. We describe how these different body dimensions affect the integration of tactile information and guide motor behavior, integrating them into a single model of tactile processing. We review a wide range of neuropsychological, neuroimaging, and neurophysiological data and suggest a revised model of tactile integration on the basis of the one proposed previously by Longo et al.
Affiliation(s)
- Luigi Tamè
- Department of Psychological Sciences, Birkbeck University of London, London, United Kingdom; School of Psychology, University of Kent, Canterbury, United Kingdom
- Elena Azañón
- Institute of Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany; Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Magdeburg, Germany
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck University of London, London, United Kingdom
9
Effects of horizontal distance and limb crossing on perceived hand spacing and ownership: Differential sensory processing across hand configurations. Sci Rep 2018; 8:17699. PMID: 30531927. PMCID: PMC6286308. DOI: 10.1038/s41598-018-35895-2.
Abstract
We have previously shown that, with the hands apart vertically, passively grasping an artificial finger induces a sense of ownership over the artificial finger and a coming-together of the hands. The present study investigated this grasp illusion in the horizontal plane. Thirty healthy participants were tested in two conditions (grasp and no grasp) with their hands at different distances apart, either crossed or uncrossed. After 3 min, participants reported the perceived spacing between their index fingers, the perceived index finger location, and, for the grasp condition, perceived ownership over the artificial finger. On average, there was no ownership at any of the hand configurations. With the hands uncrossed and 7.5, 15, or 24 cm apart, there was no difference in perceived spacing between the grasp and no-grasp conditions. With the hands crossed and 15 cm apart, perceived spacing between index fingers was 3.2 cm [0.7 to 5.7] (mean [95% CI]) smaller in the grasp condition than in the no-grasp condition. Therefore, compared to when the hands are separated vertically, there is an almost complete lack of a grasp illusion in the horizontal plane, which indicates that the brain may process sensory inputs from the hands differently depending on whether the hands are separated horizontally or vertically.
10
Murphy S, Dalton P. Inattentional numbness and the influence of task difficulty. Cognition 2018; 178:1-6. PMID: 29753983. DOI: 10.1016/j.cognition.2018.05.001.
Abstract
Research suggests that clearly detectable stimuli can be missed when attention is focused elsewhere, particularly when the observer is engaged in a complex task. Although this phenomenon has been demonstrated in vision and audition, much less is known about the possibility of a similar phenomenon within touch. Across two experiments, we investigated reported awareness of an unexpected tactile event as a function of the difficulty of a concurrent tactile task. Participants were presented with sequences of tactile stimuli to one hand and performed either an easy or a difficult counting task. On the final trial, an additional tactile stimulus was concurrently presented to the unattended hand. Retrospective reports revealed that more participants in the difficult (vs. easy) condition remained unaware of this unexpected stimulus, even though it was clearly detectable under full-attention conditions. These experiments are the first to demonstrate that inattentional numbness is modulated by concurrent tactile task difficulty.
Affiliation(s)
- Sandra Murphy
- Department of Psychology, Royal Holloway, University of London, United Kingdom
- Polly Dalton
- Department of Psychology, Royal Holloway, University of London, United Kingdom
11
Task-irrelevant sounds influence both temporal order and apparent-motion judgments about tactile stimuli applied to crossed and uncrossed hands. Atten Percept Psychophys 2017; 80:773-783. DOI: 10.3758/s13414-017-1476-5.
12
Abstract
A growing literature shows that body posture modulates the perception of touch, as well as somatosensory processing more widely. In this study, I investigated the effects of changes in the internal postural configuration of the hand on the perceived distance between touches. In two experiments participants positioned their hand in two postures, with the fingers splayed (Apart posture) or pressed together (Together posture). In Experiment 1, participants made forced-choice judgments of which of two tactile distances felt bigger, one oriented with the proximal-distal hand axis (Along orientation) and one oriented with the medio-lateral hand axis (Across orientation). In Experiment 2, participants made verbal estimates of the absolute distance between a single pair of touches, in one of the two orientations. Consistent with previous results, there was a clear bias to perceive distances in the across orientation as larger than those in the along orientation. Perceived tactile distance was also modulated by posture, with increased judgments in both orientations when the fingers were splayed. These results show that changes in the internal posture of the hand modulate the perceived distance between touches on the hand, and add to a growing literature showing postural modulation of touch.
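A standard way to quantify an across-versus-along bias from such forced-choice comparisons is to fit a psychometric function and read off the point of subjective equality (PSE). The sketch below uses hypothetical response proportions and is not the author's analysis code; variable names and values are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cum_gauss(x, pse, slope):
    """Cumulative-Gaussian psychometric function."""
    return norm.cdf((x - pse) / slope)

# Hypothetical 2AFC data: proportion of "across felt bigger" responses as a function
# of the physical along/across length ratio (1.0 = physically equal distances).
ratios = np.array([0.70, 0.85, 1.00, 1.15, 1.30, 1.45])
p_across_bigger = np.array([0.95, 0.88, 0.74, 0.55, 0.33, 0.15])

# Fit the complementary proportion ("along felt bigger"), which increases with the ratio.
(pse, slope), _ = curve_fit(cum_gauss, ratios, 1.0 - p_across_bigger, p0=(1.0, 0.2))
print(f"PSE = {pse:.2f}")  # PSE > 1: along distances must be physically longer to feel
                           # equal, i.e., across distances are perceptually overestimated.
```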
13
Azañón E, Camacho K, Morales M, Longo MR. The Sensitive Period for Tactile Remapping Does Not Include Early Infancy. Child Dev 2017; 89:1394-1404. PMID: 28452406. DOI: 10.1111/cdev.12813.
Abstract
Visual input during development seems crucial in tactile spatial perception, given that late, but not congenitally, blind people are impaired when skin-based and tactile external representations are in conflict (when crossing the limbs). To test whether there is a sensitive period during which visual input is necessary, 14 children (age = 7.95) and a teenager (LM; age = 17.38) deprived of early vision by cataracts, and whose sight was restored during the first 5 months and at age 7, respectively, were tested. Tactile localization with arms crossed and uncrossed was measured. Children showed a crossing effect indistinguishable from a control group (Ns = 28, age = 8.24), whereas LM showed no crossing effect (Ns controls = 14, age = 20.78). This demonstrates a sensitive period which, critically, does not include early infancy.