1
Otsuka S, Gao H, Hiraoka K. Contribution of external reference frame to tactile localization. Exp Brain Res 2024;242:1957-1970. PMID: 38918211. DOI: 10.1007/s00221-024-06877-w.
Abstract
The purpose of the present study was to elucidate whether an external reference frame contributes to tactile localization in blindfolded healthy humans. In one session, the right forearm was passively moved until the elbow reached a target angle, and participants then reached with the left index finger to the right middle fingertip. The locus of the right middle fingertip indicated by the participants deviated in the direction of elbow extension when vibration was applied to the biceps brachii muscle during the passive movement. This finding indicates that proprioception contributes to identifying the spatial coordinate of a specific body part in an external reference frame. In another session, a tactile stimulus was delivered to the dorsum of the right hand during the passive movement, and participants reached with the left index finger to the spatial locus at which the tactile stimulus had been delivered. Vibration of the biceps brachii muscle did not change the perceived locus of the tactile stimulus indicated by the left index finger. This finding indicates that an external reference frame does not contribute to tactile localization during passive movement. Humans may instead estimate the spatial coordinate of a tactile stimulus from the time elapsed between movement onset and stimulus delivery.
Affiliation(s)
- Shunsuke Otsuka
- College of Health and Human Sciences, Osaka Prefecture University, Habikino city, Japan
- Han Gao
- Graduate School of Rehabilitation Science, Osaka Metropolitan University, Habikino city, Japan
- Koichi Hiraoka
- Department of Rehabilitation Science, School of Medicine, Osaka Metropolitan University, Habikino city, Japan
2
Alouit A, Gavaret M, Ramdani C, Lindberg PG, Dupin L. Cortical activations associated with spatial remapping of finger touch using EEG. Cereb Cortex 2024;34:bhae161. PMID: 38642106. DOI: 10.1093/cercor/bhae161.
Abstract
The spatial coding of tactile information is functionally essential for touch-based shape perception and motor control. However, the spatiotemporal dynamics of how tactile information is remapped from the somatotopic reference frame in the primary somatosensory cortex to the spatiotopic reference frame remain unclear. This study investigated how hand position in space or hand posture influences cortical somatosensory processing. Twenty-two healthy subjects received electrical stimulation to the right thumb (D1) or little finger (D5) in three position conditions: palm down on the right side of the body (baseline), hand crossing the body midline (effect of position), and palm up (effect of posture). Somatosensory-evoked potentials (SEPs) were recorded using electroencephalography. One early-, two mid-, and two late-latency neurophysiological components were identified for both fingers: P50, P1, N125, P200, and N250. D1 and D5 showed different cortical activation patterns: compared with baseline, the crossing condition showed significant clustering at P1 for D1, and at P50 and N125 for D5; the change in posture showed a significant cluster at N125 for D5. Clusters predominated at centro-parietal electrodes. These results suggest that tactile remapping of fingers after electrical stimulation occurs around 100-125 ms in the parietal cortex.
Affiliation(s)
- Anaëlle Alouit
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
- Martine Gavaret
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
- GHU-Paris Psychiatrie et Neurosciences, Hôpital Sainte Anne, Service de neurophysiologie clinique, 1 Rue Cabanis, F-75014 Paris, France
- Céline Ramdani
- Service de Santé des Armées, Institut de Recherche Biomédicale des Armées, 1 Place du Général Valérie André, 91220 Brétigny-sur-Orge, France
- Påvel G Lindberg
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
- Lucile Dupin
- Université Paris Cité, INCC UMR 8002, CNRS, 45 Rue des Saints-Pères, F-75006 Paris, France
3
Do motor plans affect sensorimotor state estimates during temporal decision-making with crossed vs. uncrossed hands? Failure to replicate the dynamic crossed-hand effect. Exp Brain Res 2022;240:1529-1545. PMID: 35332358. DOI: 10.1007/s00221-022-06349-z.
Abstract
Hermosillo et al. (J Neurosci 31: 10019-10022, 2011) suggested that action planning of hand movements impacts temporal order judgements regarding vibrotactile stimulation of the hands. Specifically, these authors reported that the crossed-hand effect, a confusion about which hand is which when the hands are held in a crossed posture, gradually reverses some 320 ms before the arms begin to move from an uncrossed to a crossed posture or vice versa, such that the effect is reversed at movement onset in anticipation of the movement's end position. However, to date, no other study has attempted to replicate this dynamic crossed-hand effect. In the present study, we therefore conducted four experiments to revisit the question of whether preparing uncrossed-to-crossed or crossed-to-uncrossed movements affects the temporo-spatial perception of tactile stimulation of the hands. We used a temporal order judgement (TOJ) task at different stages of action planning to test whether TOJs are more difficult with crossed than uncrossed hands (the "static crossed-hand effect") and, crucially, whether planning to cross or uncross the hands shows the opposite pattern of difficulties (the "dynamic crossed-hand effect"). As expected, our results confirmed the static crossed-hand effect. However, the dynamic crossed-hand effect could not be replicated. In addition, we observed that participants delayed their movements when somatosensory stimulation from the TOJ task arrived late, even when the stimulation was meaningless, suggesting that the TOJ task produced cross-modal distraction. Whereas the current findings are not inconsistent with a contribution of motor signals to posture perception, they cast doubt on observations that motor signals impact state estimates well before movement onset.
4
Lorentz L, Unwalla K, Shore DI. Imagine Your Crossed Hands as Uncrossed: Visual Imagery Impacts the Crossed-Hands Deficit. Multisens Res 2021;35:1-29. PMID: 34690111. DOI: 10.1163/22134808-bja10065.
Abstract
Successful interaction with our environment requires accurate tactile localization. Although we seem to localize tactile stimuli effortlessly, the processes underlying this ability are complex. This is evidenced by the crossed-hands deficit, in which tactile localization performance suffers when the hands are crossed. The deficit results from the conflict between an internal reference frame, based in somatotopic coordinates, and an external reference frame, based in external spatial coordinates. Previous evidence in favour of the integration model employed manipulations of the external reference frame (e.g., blindfolding participants), which reduced the deficit by reducing conflict between the two reference frames. The present study extends this finding by asking blindfolded participants to visually imagine their crossed arms as uncrossed. This imagery manipulation further decreased the magnitude of the crossed-hands deficit by bringing the information in the two reference frames into alignment. The manipulation affected males and females differently, consistent with the previously observed sex difference in this effect: females tend to show a larger crossed-hands deficit than males, and females were more strongly affected by the imagery manipulation. Results are discussed in terms of the integration model of the crossed-hands deficit.
Affiliation(s)
- Lisa Lorentz
- Department of Psychology, Neuroscience and Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- Kaian Unwalla
- Department of Psychology, Neuroscience and Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- David I Shore
- Department of Psychology, Neuroscience and Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- Multisensory Perception Laboratory, a Division of the Multisensory Mind Inc., Hamilton, ON, Canada
5
Perquin MN, Taylor M, Lorusso J, Kolasinski J. Directional biases in whole hand motion perception revealed by mid-air tactile stimulation. Cortex 2021;142:221-236. PMID: 34280867. PMCID: PMC8422163. DOI: 10.1016/j.cortex.2021.03.033.
Abstract
Many emerging technologies are attempting to leverage the tactile domain to convey complex spatiotemporal information translated directly from the visual domain, such as shape and motion. Despite the intuitive appeal of touch for communication, we do not know to what extent the hand can substitute for the retina in this way. Here we ask whether the tactile system can be used to perceive complex whole hand motion stimuli, and whether it exhibits the same kind of established perceptual biases as reported in the visual domain. Using ultrasound stimulation, we were able to project complex moving dot percepts onto the palm in mid-air, over 30 cm above an emitter device. We generated dot kinematogram stimuli involving motion in three different directional axes ('Horizontal', 'Vertical', and 'Oblique') on the ventral surface of the hand. Using Bayesian statistics, we found clear evidence that participants were able to discriminate tactile motion direction. Furthermore, there was a marked directional bias in motion perception: participants were both better and more confident at discriminating motion in the vertical and horizontal axes of the hand, compared to stimuli moving obliquely. This pattern directly mirrors the perceptual biases that have been robustly reported in vision, termed the 'Oblique Effect'. These data demonstrate the existence of biases in motion perception that transcend sensory modality. Furthermore, we extend the Oblique Effect to a whole hand scale, using motion stimuli presented on the broad and relatively low acuity surface of the palm, away from the densely innervated and much studied fingertips. These findings highlight targeted ultrasound stimulation as a versatile method to convey potentially complex spatial and temporal information without the need for a user to wear or touch a device.
Affiliation(s)
- Marlou N Perquin
- Cardiff University Brain Research Imaging Centre, School of Psychology, Cardiff University, UK; Biopsychology & Cognitive Neuroscience, Faculty of Psychology and Sports Science, Bielefeld University, Germany; Cognitive Neuroscience, Faculty of Biology, Bielefeld University, Germany
- Mason Taylor
- Cardiff University Brain Research Imaging Centre, School of Psychology, Cardiff University, UK
- Jarred Lorusso
- Cardiff University Brain Research Imaging Centre, School of Psychology, Cardiff University, UK; School of Biological Sciences, University of Manchester, Manchester, UK
- James Kolasinski
- Cardiff University Brain Research Imaging Centre, School of Psychology, Cardiff University, UK
6
Schneider C, Marquis R, Jöhr J, Lopes da Silva M, Ryvlin P, Serino A, De Lucia M, Diserens K. Disentangling the percepts of illusory movement and sensory stimulation during tendon vibration in the EEG. Neuroimage 2021;241:118431. PMID: 34329723. DOI: 10.1016/j.neuroimage.2021.118431.
Abstract
Mechanical vibration of muscle tendons at specific frequencies, termed functional proprioceptive stimulation (FPS), can induce the illusion of a movement congruent with a lengthening of the vibrated tendon and muscle. Most previous reports of the brain correlates of this illusion are based on functional neuroimaging. Unlike electroencephalography (EEG), however, such technologies are not suitable for bedside or ambulatory use. While a handful of studies have shown EEG changes during FPS, it remains unclear whether these changes were due to the perceived illusion or the perceived vibration. Here, we aimed to disentangle the neural correlates of the illusory movement from those produced by the vibration sensation by comparing the neural responses to two vibration types, one that did and one that did not elicit an illusion. We recruited 40 naïve participants, 20 for the EEG experiment and 20 for a supporting behavioral study, who received functional tendon co-vibration on the biceps and triceps tendons at the left elbow, pseudo-randomly switching between illusion and non-illusion trials. Time-frequency decomposition uncovered a strong and lasting event-related desynchronization (ERD) in the mu and beta bands in both conditions, suggesting a strong somatosensory response to the vibration. Additionally, analysis of the evoked potentials revealed a significant difference between the two experimental conditions from 310 to 990 ms post stimulus onset. Training classifiers on the frequency-based and voltage-based correlates of illusion perception yielded above-chance accuracies for 17 and 13 of the 20 subjects, respectively. Our findings show that FPS-induced illusions produce EEG correlates that are distinct from a vibration-based control and that can be classified reliably in a large number of participants. These results encourage pursuing EEG-based detection of kinesthetic illusions as a tool for clinical use, e.g., to uncover aspects of cognitive perception in unresponsive patients.
Affiliation(s)
- Christoph Schneider
- Acute Neurorehabilitation Unit (LRNA), Division of Neurology, Department of Clinical Neurosciences, Centre Hospitalier Universitaire Vaudois (CHUV), Lausanne, Switzerland
- Renaud Marquis
- Acute Neurorehabilitation Unit (LRNA), Division of Neurology, Department of Clinical Neurosciences, Centre Hospitalier Universitaire Vaudois (CHUV), Lausanne, Switzerland
- Jane Jöhr
- Acute Neurorehabilitation Unit (LRNA), Division of Neurology, Department of Clinical Neurosciences, Centre Hospitalier Universitaire Vaudois (CHUV), Lausanne, Switzerland; Division of Neurorehabilitation and Neuropsychology, Department of Clinical Neurosciences, Centre Hospitalier Universitaire Vaudois (CHUV), Lausanne, Switzerland
- Marina Lopes da Silva
- Acute Neurorehabilitation Unit (LRNA), Division of Neurology, Department of Clinical Neurosciences, Centre Hospitalier Universitaire Vaudois (CHUV), Lausanne, Switzerland
- Philippe Ryvlin
- Division of Neurology, Department of Clinical Neurosciences, Centre Hospitalier Universitaire Vaudois (CHUV), Lausanne, Switzerland
- Andrea Serino
- MySpace Laboratory, Department of Clinical Neurosciences, Centre Hospitalier Universitaire Vaudois (CHUV), Lausanne, Switzerland
- Marzia De Lucia
- Laboratory for Research in Neuroimaging (LREN), Department of Clinical Neurosciences, Centre Hospitalier Universitaire Vaudois (CHUV), Lausanne, Switzerland
- Karin Diserens
- Acute Neurorehabilitation Unit (LRNA), Division of Neurology, Department of Clinical Neurosciences, Centre Hospitalier Universitaire Vaudois (CHUV), Lausanne, Switzerland
7
Unwalla K, Goldreich D, Shore DI. Exploring Reference Frame Integration Using Response Demands in a Tactile Temporal-Order Judgement Task. Multisens Res 2021;34:1-32. PMID: 34375947. DOI: 10.1163/22134808-bja10057.
Abstract
Exploring the world through touch requires the integration of internal (e.g., anatomical) and external (e.g., spatial) reference frames - you only know what you touch when you know where your hands are in space. The deficit observed in tactile temporal-order judgements when the hands are crossed over the midline provides one tool to explore this integration. We used foot pedals and required participants to focus on either the hand that was stimulated first (an anatomical bias condition) or the location of the hand that was stimulated first (a spatiotopic bias condition). Spatiotopic-based responses produce a larger crossed-hands deficit, presumably by focusing observers on the external reference frame. In contrast, anatomical-based responses focus the observer on the internal reference frame and produce a smaller deficit. This manipulation thus provides evidence that observers can change the relative weight given to each reference frame. We quantify this effect using a probabilistic model that produces a population estimate of the relative weight given to each reference frame. We show that a spatiotopic bias can result in either a larger external weight (Experiment 1) or a smaller internal weight (Experiment 2) and provide an explanation of when each one would occur.
Affiliation(s)
- Kaian Unwalla
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4K1, Canada
- Daniel Goldreich
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4K1, Canada
- David I Shore
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4K1, Canada
- Multisensory Perception Laboratory, a Division of the Multisensory Mind Inc., Hamilton, ON, Canada
8
Applying a novel visual-to-touch sensory substitution for studying tactile reference frames. Sci Rep 2021;11:10636. PMID: 34017027. PMCID: PMC8137949. DOI: 10.1038/s41598-021-90132-7.
Abstract
Perceiving the spatial location and physical dimensions of touched objects is crucial for goal-directed actions. To achieve this, our brain transforms skin-based coordinates into an external reference frame by integrating visual and postural information. In the current study, we examine the role of posture in mapping tactile sensations to a visual image. We developed a new visual-to-touch sensory substitution device that transforms images into a sequence of vibrations on the arm. Fifty-two blindfolded participants performed spatial recognition tasks in three different arm postures and had to switch postures between trial blocks. As participants were not told which side of the device was up and which was down, they could choose how to map its vertical axis in their responses. Contrary to previous findings, we show that new proprioceptive inputs can be overridden in mapping tactile sensations. We discuss the results within the context of the spatial task and the various sensory contributions to the process.
9
de Klerk CCJM, Filippetti ML, Rigato S. The development of body representations: an associative learning account. Proc Biol Sci 2021;288:20210070. PMID: 33906399. DOI: 10.1098/rspb.2021.0070.
Abstract
Representing one's own body is of fundamental importance to interact with our environment, yet little is known about how body representations develop. One account suggests that the ability to represent one's own body is present from birth and supports infants' ability to detect similarities between their own and others' bodies. However, in recent years evidence has been accumulating for alternative accounts that emphasize the role of multisensory experience obtained through acting and interacting with our own body in the development of body representations. Here, we review this evidence, and propose an integrative account that suggests that through experience, infants form multisensory associations that facilitate the development of body representations. This associative account provides a coherent explanation for previous developmental findings, and generates novel hypotheses for future research.
Affiliation(s)
- Carina C J M de Klerk
- Centre for Brain Science, Department of Psychology, University of Essex, Colchester, UK
- Silvia Rigato
- Centre for Brain Science, Department of Psychology, University of Essex, Colchester, UK
10
Goettker A, Fiehler K, Voudouris D. Somatosensory target information is used for reaching but not for saccadic eye movements. J Neurophysiol 2020;124:1092-1102. DOI: 10.1152/jn.00258.2020.
Abstract
A systematic investigation of the contributions of different somatosensory modalities (proprioception, kinesthesia, touch) to goal-directed movements has been missing. Here we demonstrate that while eye movements are not affected by different types of somatosensory information, reach precision improves when two different types of information are available. Moreover, reach accuracy and gaze precision to unseen somatosensory targets improve when performing coordinated eye-hand movements, suggesting bidirectional contributions of efferent information to reach and eye movement control.
Affiliation(s)
- Alexander Goettker
- Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
- Katja Fiehler
- Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
- Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University, Giessen, Germany
- Dimitris Voudouris
- Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
11
Unwalla K, Kearney H, Shore DI. Reliability of the Crossed-Hands Deficit in Tactile Temporal Order Judgements. Multisens Res 2020;34:387-421. PMID: 33706262. DOI: 10.1163/22134808-bja10039.
Abstract
Crossing the hands over the midline impairs performance on a tactile temporal order judgement (TOJ) task, resulting in the crossed-hands deficit. This deficit results from a conflict between two reference frames for coding stimulus location, one internal (somatotopic) and the other external (spatial). The substantial individual differences observed in the crossed-hands deficit highlight the differential reliance on these reference frames. For example, women have been reported to place a greater emphasis on the external reference frame than men, resulting in a larger crossed-hands deficit for women. It has also been speculated that individuals with an eating disorder place a greater weight on the external reference frame. Further exploration of individual differences in reference frame weighting using a tactile TOJ task requires that the reliability of the task be established. In Experiment 1, we investigated the reliability of the tactile TOJ task across two sessions separated by one week and found high reliability in the magnitude of the crossed-hands deficit. In Experiment 2, we report the split-half reliability across multiple experiments (both published and unpublished). Overall, tactile TOJ reliability was high. Experiments with small to moderate crossed-hands deficits showed good reliability; those with larger deficits showed even higher reliability. Researchers interested in individual differences in the use of the internal and external reference frames should therefore try to maximize the size of the effect.
Affiliation(s)
- Kaian Unwalla
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- Hannah Kearney
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- David I Shore
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
12
Maij F, Seegelke C, Medendorp WP, Heed T. External location of touch is constructed post-hoc based on limb choice. eLife 2020;9:57804. PMID: 32945257. PMCID: PMC7561349. DOI: 10.7554/elife.57804.
Abstract
When humans indicate on which hand a tactile stimulus occurred, they often err when their hands are crossed. This finding seemingly supports the view that the automatically determined touch location in external space affects limb assignment: the crossed right hand is localized in left space, and this conflict presumably provokes hand assignment errors. Here, participants judged on which hand the first of two stimuli, presented during a bimanual movement, had occurred, and then indicated its external location by a reach-to-point movement. When participants incorrectly chose the hand stimulated second, they pointed to where that hand had been at the correct, first time point, though no stimulus had occurred at that location. This behavior suggests that stimulus localization depended on hand assignment, not vice versa. It is, thus, incompatible with the notion of automatic computation of external stimulus location upon occurrence. Instead, humans construct external touch location post-hoc and on demand.
Affiliation(s)
- Femke Maij
- Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
- Christian Seegelke
- Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany; Center for Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
- W Pieter Medendorp
- Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
- Tobias Heed
- Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany; Center for Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
13
Hidaka S, Tucciarelli R, Azañón E, Longo MR. Tactile distance adaptation aftereffects do not transfer to perceptual hand maps. Acta Psychol (Amst) 2020;208:103090. PMID: 32485337. DOI: 10.1016/j.actpsy.2020.103090.
Abstract
Recent studies have demonstrated that mental representations of the hand dorsum are distorted even in healthy participants. Perceptual hand maps estimated by pointing to specific landmarks (e.g., the knuckles and fingertips) are stretched along the medio-lateral axis and shrunk along the proximo-distal axis. Similarly, the perceived tactile distance between two touches is longer along the medio-lateral axis than along the proximo-distal axis. The congruency of the two types of distortion suggests that common perceptual and neural representations may be involved in these processes. Prolonged stimulation with two simultaneous touches at a particular distance can bias subsequent perception of tactile distances (e.g., after adaptation to a long distance, shorter stimuli are perceived as even shorter). This tactile distance adaptation aftereffect has been suggested to arise from modulations of perceptual and neural responses at low somatosensory processing stages. The current study investigated whether tactile distance adaptation aftereffects also alter the pattern of distortions in perceptual hand maps. Participants localized positions on the hand dorsum cued either by tactile stimulation (Experiment 1) or by visually presented landmarks on a hand silhouette (Experiment 2). Each trial was preceded by adaptation to either a small (2 cm) or large (4 cm) tactile distance. We found clear tactile distance aftereffects. However, no changes were observed in the distorted pattern of the perceptual hand maps following adaptation to a tactile distance. Our results suggest that the internal body representations involved in tactile distance perception and in the perceptual hand maps underlying position sense may be distinct.
Affiliation(s)
- Souta Hidaka
- Department of Psychology, Rikkyo University, 1-2-26, Kitano, Niiza-shi, Saitama 352-8558, Japan; Department of Psychological Sciences, Birkbeck, University of London, United Kingdom
- Raffaele Tucciarelli
- Department of Psychological Sciences, Birkbeck, University of London, United Kingdom
- Elena Azañón
- Institute of Psychology, Otto-von-Guericke University, Universitätsplatz 2, 39106 Magdeburg, Germany; Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Brenneckestraße 6, Magdeburg 39118, Germany; Center for Behavioral Brain Sciences, Universitätsplatz 2, Magdeburg 39106, Germany
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck, University of London, United Kingdom
14
Chen S, Shi Z, Zang X, Zhu X, Assumpção L, Müller HJ, Geyer T. Crossmodal learning of target-context associations: When would tactile context predict visual search? Atten Percept Psychophys 2020;82:1682-1694. PMID: 31845105. PMCID: PMC7297845. DOI: 10.3758/s13414-019-01907-0.
Abstract
It is well established that statistical learning of visual target locations in relation to constantly positioned visual distractors facilitates visual search. In the present study, we investigated whether such a contextual-cueing effect would also work crossmodally, from touch onto vision. Participants responded to the orientation of a visual target singleton presented among seven homogeneous visual distractors. Four tactile stimuli, two to different fingers of each hand, were presented either simultaneously with or prior to the visual stimuli. The identity of the stimulated fingers provided the crossmodal context cue: in half of the trials, a given visual target location was consistently paired with a given tactile configuration. The visual stimuli were presented above the unseen fingers, ensuring spatial correspondence between vision and touch. We found no evidence of crossmodal contextual cueing when the two sets of items (tactile, visual) were presented simultaneously (Experiment 1). However, a reliable crossmodal effect emerged when the tactile distractors preceded the onset of the visual stimuli by 700 ms (Experiment 2). Crossmodal cueing disappeared again when, after an initial learning phase, participants flipped their hands, making the tactile distractors appear at different positions in external space while their somatotopic positions remained unchanged (Experiment 3). In all experiments, participants were unable to explicitly discriminate learned from novel multisensory arrays. These findings indicate that search-facilitating context memory can be established across vision and touch. However, in order to guide visual search, the (predictive) tactile configurations must be remapped from their initial somatotopic format into a common external representational format.
Affiliation(s)
- Siyi Chen
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Zhuanghua Shi
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Xuelian Zang
- Center for Cognition and Brain Disorders, Institute of Psychological Sciences, Hangzhou Normal University, Hangzhou, China
- Xiuna Zhu
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Leonardo Assumpção
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Hermann J Müller
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany
- Thomas Geyer
- General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802, Munich, Germany

15
Chen YP, Yeh CI, Lee TC, Huang JJ, Pei YC. Relative posture between head and finger determines perceived tactile direction of motion. Sci Rep 2020; 10:5494. [PMID: 32218502] [PMCID: PMC7099024] [DOI: 10.1038/s41598-020-62327-x] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Received: 11/06/2018] [Accepted: 03/12/2020] [Indexed: 11/09/2022]
Abstract
The hand explores the environment to obtain tactile information that can be fruitfully integrated with other functions, such as vision, audition, and movement. In theory, somatosensory signals gathered by the hand are accurately mapped in the world-centered (allocentric) reference frame such that multimodal signals, whether visual-tactile or motor-tactile, are perfectly aligned. However, an accumulating body of evidence indicates that the perceived tactile orientation or direction is inaccurate, yielding a surprisingly large perceptual bias. To investigate this bias, the present study presented tactile motion stimuli to healthy adult participants in a variety of finger and head postures and asked them to report the perceived direction of motion mapped onto a video screen placed on the frontoparallel plane in front of the eyes. The results showed that the perceptual bias could be divided into systematic and nonsystematic components. Systematic bias, defined as the mean difference between the perceived and veridical directions, correlated linearly with the relative posture between the finger and the head. By contrast, nonsystematic bias, defined as the residual difference in bias across stimulus directions, was highly individualized and phase-locked to the stimulus orientation presented on the skin. Overall, the present findings on systematic bias indicate that the transformation bias among reference frames is dominated by the finger-to-head posture. Moreover, the highly individualized nature of nonsystematic bias reflects how information is encoded by orientation-selective units in the S1 cortex.
Affiliation(s)
- Yueh-Peng Chen
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital, Taoyuan, Taiwan; Center of Vascularized Tissue Allograft, Chang Gung Memorial Hospital at Linkou, Taoyuan, Taiwan; School of Medicine, Chang Gung University, Taoyuan, Taiwan; Healthy Aging Research Center, Chang Gung University, Taoyuan, Taiwan; Center for Artificial Intelligence in Medicine, Chang Gung Memorial Hospital at Linkou, Taoyuan, Taiwan
- Chun-I Yeh
- Department of Psychology, National Taiwan University, Taipei, Taiwan
- Tsung-Chi Lee
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital, Taoyuan, Taiwan
- Jian-Jia Huang
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital, Taoyuan, Taiwan; Center of Vascularized Tissue Allograft, Chang Gung Memorial Hospital at Linkou, Taoyuan, Taiwan; School of Medicine, Chang Gung University, Taoyuan, Taiwan
- Yu-Cheng Pei
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital, Taoyuan, Taiwan; Center of Vascularized Tissue Allograft, Chang Gung Memorial Hospital at Linkou, Taoyuan, Taiwan; School of Medicine, Chang Gung University, Taoyuan, Taiwan; Healthy Aging Research Center, Chang Gung University, Taoyuan, Taiwan

16
Azañón E, Longo MR. Tactile Perception: Beyond the Somatotopy of the Somatosensory Cortex. Curr Biol 2019; 29:R322-R324. [PMID: 31063723] [DOI: 10.1016/j.cub.2019.03.037] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Indexed: 11/29/2022]
Abstract
New research demonstrates systematic errors of tactile localisation, involving confusions of body parts and body sides. Such errors do not follow the organisation of topographic maps in somatosensory cortex, suggesting that tactile localisation involves coding of abstract features of limbs.
Affiliation(s)
- Elena Azañón
- Institute of Psychology, Otto von Guericke University Magdeburg, 39106 Magdeburg, Germany; Center for Behavioral Brain Sciences, 39106 Magdeburg, Germany; Department of Behavioral Neurology, Leibniz Institute for Neurobiology, 39118 Magdeburg, Germany
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK

17
Christie BP, Charkhkar H, Shell CE, Marasco PD, Tyler DJ, Triolo RJ. Visual inputs and postural manipulations affect the location of somatosensory percepts elicited by electrical stimulation. Sci Rep 2019; 9:11699. [PMID: 31406122] [PMCID: PMC6690924] [DOI: 10.1038/s41598-019-47867-1] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.8] [Received: 05/08/2019] [Accepted: 07/25/2019] [Indexed: 12/02/2022]
Abstract
The perception of somatosensation requires the integration of multimodal information, yet the effects of vision and posture on somatosensory percepts elicited by neural stimulation are not well established. In this study, we applied electrical stimulation directly to the residual nerves of trans-tibial amputees to elicit sensations referred to their missing feet. We evaluated the influence of congruent and incongruent visual inputs and postural manipulations on the perceived size and location of stimulation-evoked somatosensory percepts. We found that although standing upright may cause percept size to change, congruent visual inputs and/or body posture resulted in better localization. We also observed visual capture: the location of a somatosensory percept shifted toward a visual input when vision was incongruent with stimulation-induced sensation. Visual capture did not occur when an adopted posture was incongruent with somatosensation. Our results suggest that internal model predictions based on postural manipulations reinforce perceived sensations, but do not alter them. These characterizations of multisensory integration are important for the development of somatosensory-enabled prostheses because current neural stimulation paradigms cannot replicate the afferent signals of natural tactile stimuli. Nevertheless, multisensory inputs can improve perceptual precision and highlight regions of the foot important for balance and locomotion.
Affiliation(s)
- Breanne P Christie
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA; Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
- Hamid Charkhkar
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA; Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
- Courtney E Shell
- Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA; Department of Biomedical Engineering, Lerner Research Institute, Cleveland Clinic, Cleveland, OH, USA
- Paul D Marasco
- Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA; Department of Biomedical Engineering, Lerner Research Institute, Cleveland Clinic, Cleveland, OH, USA
- Dustin J Tyler
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA; Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
- Ronald J Triolo
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA; Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA

18
Rahman MS, Yau JM. Somatosensory interactions reveal feature-dependent computations. J Neurophysiol 2019; 122:5-21. [DOI: 10.1152/jn.00168.2019] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Indexed: 11/22/2022]
Abstract
Our ability to perceive and discriminate textures is based on the processing of high-frequency vibrations generated on the fingertip as it scans across a surface. Although much is known about the processing of vibration amplitude and frequency information when cutaneous stimulation is experienced at a single location on the body, how these stimulus features are processed when touch occurs at multiple locations is poorly understood. We evaluated participants’ ability to discriminate tactile cues (100–300 Hz) on one hand while they ignored distractor cues experienced on their other hand. We manipulated the relative positions of the hands to characterize how limb position influenced cutaneous touch interactions. In separate experiments, participants judged either the frequency or intensity of mechanical vibrations. We found that vibrations experienced on one hand always systematically modulated the perception of vibrations on the other hand. Notably, bimanual interaction patterns and their sensitivity to hand locations differed according to stimulus feature. Somatosensory interactions in intensity perception were marked only by attenuation that was invariant to hand position manipulations. In contrast, interactions in frequency perception consisted of both bias and sensitivity changes that were more pronounced when the hands were held in close proximity. We implemented models to infer the neural computations that mediate somatosensory interactions in the intensity and frequency dimensions. Our findings reveal obligatory and feature-dependent somatosensory interactions that may be supported by both feature-specific and feature-general operations.

NEW & NOTEWORTHY Little is known about the neural computations mediating feature-specific sensory interactions between the hands. We show that vibrations experienced on one hand systematically modulate the perception of vibrations felt on the other hand. Critically, interaction patterns and their dependence on the relative positions of the hands differed depending on whether participants judged vibration intensity or frequency. These results, which we recapitulate with models, imply that somatosensory interactions are mediated by feature-dependent neural computations.
Affiliation(s)
- Jeffrey M. Yau
- Department of Neuroscience, Baylor College of Medicine, Houston, Texas

19
Badde S, Röder B, Heed T. Feeling a Touch to the Hand on the Foot. Curr Biol 2019; 29:1491-1497.e4. [PMID: 30955931] [DOI: 10.1016/j.cub.2019.02.060] [Citation(s) in RCA: 26] [Impact Index Per Article: 5.2] [Received: 10/30/2017] [Revised: 02/15/2019] [Accepted: 02/27/2019] [Indexed: 10/27/2022]
Abstract
Where we perceive a touch putatively depends on topographic maps that code the touch's location on the skin [1] as well as its position in external space [2-5]. However, neither somatotopic nor external-spatial representations can account for atypical tactile percepts in some neurological patients and amputees; referral of touch to an absent or anaesthetized hand after stimulation of a foot [6, 7] or the contralateral hand [8-10] challenges the role of topographic representations when attributing touch to the limbs. Here, we show that even healthy adults systematically misattribute touch to other limbs. Participants received two tactile stimuli, each to a different limb (hand or foot), and reported which of all four limbs had been stimulated first. Hands and feet were either uncrossed or crossed to dissociate body-based and external-spatial representations [11-14]. Remarkably, participants regularly attributed the first touch to a limb that had received neither of the two stimuli. The erroneously reported, non-stimulated limb typically matched the correct limb with respect to limb type or body side. Touch was misattributed to non-stimulated limbs of the other limb type and body side only if they were placed at the correct limb's canonical (default) side of space. The touch's actual location in external space was irrelevant. These errors replicated across several contexts, and modeling linked them to incoming sensory evidence rather than to decision strategies. The results highlight the importance of the touched body part's identity and canonical location but challenge the role of external-spatial tactile representations when attributing touch to a limb.
Affiliation(s)
- Stephanie Badde
- Department of Psychology and Center of Neural Sciences, New York University, 6 Washington Place, New York, NY 10003, USA; Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
- Tobias Heed
- Biopsychology & Cognitive Neuroscience, Bielefeld University, Universitätsstrasse 25, 33615 Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1, 33619 Bielefeld, Germany

20
Tamè L, Azañón E, Longo MR. A Conceptual Model of Tactile Processing across Body Features of Size, Shape, Side, and Spatial Location. Front Psychol 2019; 10:291. [PMID: 30863333] [PMCID: PMC6399380] [DOI: 10.3389/fpsyg.2019.00291] [Citation(s) in RCA: 37] [Impact Index Per Article: 7.4] [Received: 09/05/2018] [Accepted: 01/29/2019] [Indexed: 11/30/2022]
Abstract
The processing of touch depends on multiple factors, such as the properties of the skin and the type of receptors stimulated, as well as features related to the actual configuration and shape of the body itself. A large body of research has focused on the effect that the nature of the stimuli has on tactile processing. Less research, however, has focused on features beyond the nature of the touch. In this review, we focus on body-related features that have been investigated for less time and in a more fragmented way. These include the symmetrical quality of the two sides of the body, the postural configuration of the body, and the size and shape of different body parts. We describe what we consider three key aspects: (1) how and at which stages tactile information is integrated between different parts and sides of the body; (2) how tactile signals are integrated with online and stored postural configurations of the body, regarded as priors; and (3) how tactile signals are integrated with representations of body size and shape. We describe how these different body dimensions affect the integration of tactile information and how they guide motor behavior, bringing them together in a single model of tactile processing. We review a wide range of neuropsychological, neuroimaging, and neurophysiological data and suggest a revised model of tactile integration on the basis of the one proposed previously by Longo et al.
Affiliation(s)
- Luigi Tamè
- Department of Psychological Sciences, Birkbeck, University of London, London, United Kingdom; School of Psychology, University of Kent, Canterbury, United Kingdom
- Elena Azañón
- Institute of Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany; Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Magdeburg, Germany
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck, University of London, London, United Kingdom

21
Dupin L, Haggard P. Dynamic Displacement Vector Interacts with Tactile Localization. Curr Biol 2019; 29:492-498.e3. [PMID: 30686734] [PMCID: PMC6370943] [DOI: 10.1016/j.cub.2018.12.032] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Received: 06/24/2018] [Revised: 11/15/2018] [Accepted: 12/18/2018] [Indexed: 11/25/2022]
Abstract
Locating a tactile stimulus on the body seems effortless and straightforward. However, the perceived location of a tactile stimulation can differ from its physical location [1, 2, 3]. Tactile mislocalizations can depend on the timing of successive stimulations [2, 4, 5], tactile motion mechanisms [6], or processes that “remap” stimuli from skin locations to external space coordinates [7, 8, 9, 10, 11]. We report six experiments demonstrating that the perception of tactile localization on a static body part is strongly affected by the displacement between the locations of two successive task-irrelevant actions. Participants moved their index finger between two keys. Each keypress triggered synchronous tactile stimulation at a randomized location on the immobilized wrist or forehead. Participants reported the location of the second tactile stimulation relative to the first. The direction of either active finger movements or passive finger displacements biased participants’ tactile orientation judgements (experiment 1). The effect generalized to tactile stimuli delivered to other body sites (experiment 2). Two successive keypresses, by different fingers at distinct locations, reproduced the effect (experiment 3). The effect remained even when the hand that moved was placed far from the tactile stimulation site (experiments 4 and 5). Temporal synchrony within 600 ms between the movement and the tactile stimulations was necessary for the effect (experiment 6). Our results indicate that a dynamic displacement vector, defined as the location of one sensorimotor event relative to the one before, plays a strong role in structuring tactile spatial perception.

Highlights: Human tactile localization is biased by simultaneous finger displacement. The shift between two successive events biases the relative localization of touches. Both active and passive movements induce a bias, even if far from the touched site. The bias effect is vectorially organized.
Affiliation(s)
- Lucile Dupin
- Institute of Cognitive Neuroscience, University College London, London WC1N 3AR, UK
- Patrick Haggard
- Institute of Cognitive Neuroscience, University College London, London WC1N 3AR, UK

22
Effects of horizontal distance and limb crossing on perceived hand spacing and ownership: Differential sensory processing across hand configurations. Sci Rep 2018; 8:17699. [PMID: 30531927] [PMCID: PMC6286308] [DOI: 10.1038/s41598-018-35895-2] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Received: 05/29/2018] [Accepted: 10/01/2018] [Indexed: 11/08/2022]
Abstract
We have previously shown that, with the hands apart vertically, passively grasping an artificial finger induces a sense of ownership over the artificial finger and a coming-together of the hands. The present study investigated this grasp illusion in the horizontal plane. Thirty healthy participants were tested in two conditions (grasp and no grasp) with their hands at different distances apart, either crossed or uncrossed. After 3 min, participants reported perceived spacing between index fingers, perceived index finger location, and, for the grasp condition, perceived ownership over the artificial finger. On average, there was no ownership at any of the hand configurations. With the hands uncrossed 7.5, 15 or 24 cm apart, there was no difference in perceived spacing between the grasp and no grasp conditions. With the hands crossed and 15 cm apart, perceived spacing between index fingers was 3.2 cm [0.7 to 5.7] (mean [95% CI]) smaller during the grasp condition compared to no grasp. Therefore, compared to when the hands are vertically separated, there is an almost complete lack of a grasp illusion in the horizontal plane, which indicates that the brain may process sensory inputs from the hands differently depending on whether they are separated horizontally or vertically.
23
Murphy S, Dalton P. Inattentional numbness and the influence of task difficulty. Cognition 2018; 178:1-6. [PMID: 29753983] [DOI: 10.1016/j.cognition.2018.05.001] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Received: 08/09/2017] [Revised: 04/30/2018] [Accepted: 05/02/2018] [Indexed: 10/16/2022]
Abstract
Research suggests that clearly detectable stimuli can be missed when attention is focused elsewhere, particularly when the observer is engaged in a complex task. Although this phenomenon has been demonstrated in vision and audition, much less is known about the possibility of a similar phenomenon within touch. Across two experiments, we investigated reported awareness of an unexpected tactile event as a function of the difficulty of a concurrent tactile task. Participants were presented with sequences of tactile stimuli to one hand and performed either an easy or a difficult counting task. On the final trial, an additional tactile stimulus was concurrently presented to the unattended hand. Retrospective reports revealed that more participants in the difficult (vs. easy) condition remained unaware of this unexpected stimulus, even though it was clearly detectable under full attention conditions. These experiments are the first demonstrating the phenomenon of inattentional numbness modulated by concurrent tactile task difficulty.
Affiliation(s)
- Sandra Murphy
- Department of Psychology, Royal Holloway, University of London, United Kingdom
- Polly Dalton
- Department of Psychology, Royal Holloway, University of London, United Kingdom

24
Schubert JTW, Badde S, Röder B, Heed T. Task demands affect spatial reference frame weighting during tactile localization in sighted and congenitally blind adults. PLoS One 2017; 12:e0189067. [PMID: 29228023] [PMCID: PMC5724835] [DOI: 10.1371/journal.pone.0189067] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Received: 05/17/2017] [Accepted: 11/17/2017] [Indexed: 11/18/2022]
Abstract
Task demands modulate tactile localization in sighted humans, presumably through weight adjustments in the spatial integration of anatomical, skin-based, and external, posture-based information. In contrast, previous studies have suggested that congenitally blind humans, by default, refrain from automatic spatial integration and localize touch using only skin-based information. Here, sighted and congenitally blind participants localized tactile targets on the palm or back of one hand, while ignoring simultaneous tactile distractors at congruent or incongruent locations on the other hand. We probed the interplay of anatomical and external location codes for spatial congruency effects by varying hand posture: the palms either both faced down, or one faced down and one up. In the latter posture, externally congruent target and distractor locations were anatomically incongruent and vice versa. Target locations had to be reported either anatomically (“palm” or “back” of the hand), or externally (“up” or “down” in space). Under anatomical instructions, performance was more accurate for anatomically congruent than incongruent target-distractor pairs. In contrast, under external instructions, performance was more accurate for externally congruent than incongruent pairs. These modulations were evident in sighted and blind individuals. Notably, distractor effects were overall far smaller in blind than in sighted participants, despite comparable target-distractor identification performance. Thus, the absence of developmental vision seems to be associated with an increased ability to focus tactile attention towards a non-spatially defined target. Nevertheless, the fact that blind individuals exhibited effects of hand posture and task instructions in their congruency effects suggests that, like the sighted, they automatically integrate anatomical and external information during tactile localization. Moreover, spatial integration in tactile processing is thus flexibly adapted by top-down information (here, task instruction) even in the absence of developmental vision.
Affiliation(s)
- Jonathan T. W. Schubert
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Stephanie Badde
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany; Department of Psychology, New York University, New York, United States of America
- Brigitte Röder
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Tobias Heed
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany; Biopsychology & Cognitive Neuroscience, Faculty of Psychology & Sports Science, Bielefeld University, Bielefeld, Germany; Center of Excellence in Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany

25
Visual Experience Shapes the Neural Networks Remapping Touch into External Space. J Neurosci 2017; 37:10097-10103. [PMID: 28947578] [DOI: 10.1523/jneurosci.1213-17.2017] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.3] [Received: 05/04/2017] [Revised: 07/26/2017] [Indexed: 11/21/2022]
Abstract
Localizing touch relies on the activation of skin-based and externally defined spatial frames of reference. Psychophysical studies have demonstrated that early visual deprivation prevents the automatic remapping of touch into external space. We used fMRI to characterize how visual experience impacts the brain circuits dedicated to the spatial processing of touch. Sighted and congenitally blind humans performed a tactile temporal order judgment (TOJ) task, either with the hands uncrossed or crossed over the body midline. Behavioral data confirmed that crossing the hands has a detrimental effect on TOJ judgments in sighted but not in early blind people. Crucially, the crossed hand posture elicited enhanced activity, when compared with the uncrossed posture, in a frontoparietal network in the sighted group only. Psychophysiological interaction analysis revealed, however, that the congenitally blind showed enhanced functional connectivity between parietal and frontal regions in the crossed versus uncrossed hand postures. Our results demonstrate that visual experience scaffolds the neural implementation of the location of touch in space.

SIGNIFICANCE STATEMENT In daily life, we seamlessly localize touch in external space for action planning toward a stimulus making contact with the body. For efficient sensorimotor integration, the brain therefore has to compute the current position of our limbs in the external world. In the present study, we demonstrate that early visual deprivation alters the brain activity in a dorsal parietofrontal network typically supporting touch localization in the sighted. Our results therefore conclusively demonstrate the intrinsic role that developmental vision plays in scaffolding the neural implementation of touch perception.
26
Świder K, Wronka E, Oosterman JM, van Rijn CM, Jongsma MLA. Influence of transient spatial attention on the P3 component and perception of painful and non-painful electric stimuli in crossed and uncrossed hands positions. PLoS One 2017; 12:e0182616. [PMID: 28873414] [PMCID: PMC5584947] [DOI: 10.1371/journal.pone.0182616] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Received: 08/30/2016] [Accepted: 07/22/2017] [Indexed: 11/19/2022]
Abstract
Recent reports show that focusing attention on the location where pain is expected can enhance its perception. Moreover, crossing the hands over the body’s midline is known to impair the ability to localise stimuli and decrease tactile and pain sensations in healthy participants. The present study investigated the role of transient spatial attention on the perception of painful and non-painful electrical stimuli in conditions in which a match or a mismatch was induced between skin-based and external frames of reference (uncrossed and crossed hands positions, respectively). We measured the subjective experience (Numerical Rating Scale scores) and the electrophysiological response elicited by brief electric stimuli by analysing the P3 component of Event-Related Potentials (ERPs). Twenty-two participants underwent eight painful and eight non-painful stimulus blocks. The electrical stimuli were applied to either the left or the right hand, held in either a crossed or uncrossed position. Each stimulus was preceded by a direction cue (leftward or rightward arrow). In 80% of the trials, the arrow correctly pointed to the spatial regions where the stimulus would appear (congruent cueing). Our results indicated that congruent cues resulted in increased pain NRS scores compared to incongruent ones. For non-painful stimuli such an effect was observed only in the uncrossed hands position. For both non-painful and painful stimuli the P3 peak amplitudes were higher and occurred later for incongruently cued stimuli compared to congruent ones. However, we found that crossing the hands substantially reduced the cueing effect of the P3 peak amplitudes elicited by painful stimuli. Taken together, our results showed a strong influence of transient attention manipulations on the NRS ratings and on the brain activity. Our results also suggest that hand position may modulate the strength of the cueing effect, although differences between painful and non-painful stimuli exist.
Affiliation(s)
- Karolina Świder: Institute of Psychology, Jagiellonian University, Kraków, Poland; Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, The Netherlands
- Eligiusz Wronka: Institute of Psychology, Jagiellonian University, Kraków, Poland
- Joukje M. Oosterman: Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, The Netherlands
- Clementina M. van Rijn: Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, The Netherlands
- Marijtje L. A. Jongsma: Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, The Netherlands; Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands
27
Kolasinski J, Logan JP, Hinson EL, Manners D, Divanbeighi Zand AP, Makin TR, Emir UE, Stagg CJ. A Mechanistic Link from GABA to Cortical Architecture and Perception. Curr Biol 2017; 27:1685-1691.e3. [PMID: 28552355] [PMCID: PMC5462622] [DOI: 10.1016/j.cub.2017.04.055]
Abstract
Understanding both the organization of the human cortex and its relation to the performance of distinct functions is fundamental in neuroscience. The primary sensory cortices display topographic organization, whereby receptive fields follow a characteristic pattern, from tonotopy to retinotopy to somatotopy [1]. GABAergic signaling is vital to the maintenance of cortical receptive fields [2]; however, it is unclear how this fine-grain inhibition relates to measurable patterns of perception [3, 4]. Based on perceptual changes following perturbation of the GABAergic system, it is conceivable that the resting level of cortical GABAergic tone directly relates to the spatial specificity of activation in response to a given input [5, 6, 7]. The specificity of cortical activation can be considered in terms of cortical tuning: greater cortical tuning yields more localized recruitment of cortical territory in response to a given input. We applied a combination of fMRI, MR spectroscopy, and psychophysics to substantiate the link between the cortical neurochemical milieu, the tuning of cortical activity, and variability in perceptual acuity, using human somatosensory cortex as a model. We provide data that explain human perceptual acuity in terms of both the underlying cellular and metabolic processes. Specifically, higher concentrations of sensorimotor GABA are associated with more selective cortical tuning, which in turn is associated with enhanced perception. These results show anatomical and neurochemical specificity and are replicated in an independent cohort. The mechanistic link from neurochemistry to perception provides a vital step in understanding population variability in sensory behavior, informing metabolic therapeutic interventions to restore perceptual abilities clinically. 
Highlights: GABAergic tone correlates with perceptual acuity in the human somatosensory system. This relationship is mediated by the tuning of activity in somatosensory cortex. We explain perceptual acuity via the underlying cellular and metabolic processes.
Affiliation(s)
- James Kolasinski: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford OX3 9DU, UK; Cardiff University Brain Research Imaging Centre, School of Psychology, Cardiff University, Cardiff CF24 4HQ, UK; University College, Oxford OX1 4BH, UK
- John P Logan: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford OX3 9DU, UK
- Emily L Hinson: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford OX3 9DU, UK; Oxford Centre for Human Brain Activity, Department of Psychiatry, University of Oxford, Oxford OX3 7JX, UK
- Daniel Manners: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford OX3 9DU, UK
- Amir P Divanbeighi Zand: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford OX3 9DU, UK
- Tamar R Makin: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford OX3 9DU, UK
- Uzay E Emir: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford OX3 9DU, UK
- Charlotte J Stagg: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford OX3 9DU, UK; Oxford Centre for Human Brain Activity, Department of Psychiatry, University of Oxford, Oxford OX3 7JX, UK
28
Azañón E, Camacho K, Morales M, Longo MR. The Sensitive Period for Tactile Remapping Does Not Include Early Infancy. Child Dev 2017; 89:1394-1404. [PMID: 28452406] [DOI: 10.1111/cdev.12813]
Abstract
Visual input during development seems crucial for tactile spatial perception, given that late, but not congenitally, blind people are impaired when skin-based and external tactile representations are in conflict (when crossing the limbs). To test whether there is a sensitive period during which visual input is necessary, 14 children (age = 7.95) and a teenager (LM; age = 17.38) deprived of early vision by cataracts, whose sight was restored during the first 5 months of life and at age 7, respectively, were tested. Tactile localization with arms crossed and uncrossed was measured. The children showed a crossing effect indistinguishable from a control group (N = 28, age = 8.24), whereas LM showed no crossing effect (N controls = 14, age = 20.78). This demonstrates a sensitive period which, critically, does not include early infancy.
29
Tactile localization performance in children with developmental coordination disorder (DCD) corresponds to their motor skill and not their cognitive ability. Hum Mov Sci 2017; 53:72-83. [PMID: 28109545] [DOI: 10.1016/j.humov.2016.12.008]
Abstract
When localizing touches to the hands, typically developing children and adults show a "crossed hands effect", whereby identifying which hand received a tactile stimulus is less accurate when the hands are crossed than uncrossed. This demonstrates the use of an external frame of reference for locating touches to one's own body. Given that studies indicate that developmental vision plays a role in the emergence of external representations of touch, and that reliance on vision for representing the body during action is atypical in developmental coordination disorder (DCD), we investigated external spatial representations of touch in children with DCD using the "crossed hands effect". Nineteen children with DCD aged 7-11 years completed a tactile localization task in which posture (uncrossed, crossed) and view (hands seen, unseen) were varied systematically. Their performance was compared to that of 35 typically developing controls (19 of a similar age and cognitive ability, and 16 of a younger age but similar fine motor ability). Like controls, the DCD group exhibited a crossed hands effect, whilst their overall tactile localization performance was weaker than that of their peers of similar age and cognitive ability, but in line with younger controls of similar motor ability. For children with movement difficulties, these findings indicate tactile localization impairments relative to age expectations, but apparently typical use of an external reference frame for localizing touch.
30
Kolasinski J, Makin TR, Logan JP, Jbabdi S, Clare S, Stagg CJ, Johansen-Berg H. Perceptually relevant remapping of human somatotopy in 24 hours. eLife 2016; 5. [PMID: 28035900] [PMCID: PMC5241114] [DOI: 10.7554/eLife.17280]
Abstract
Experience-dependent reorganisation of functional maps in the cerebral cortex is well described in the primary sensory cortices. However, there is relatively little evidence for such cortical reorganisation over the short term. Using human somatosensory cortex as a model, we investigated the effects of a 24 hr gluing manipulation in which the right index and right middle fingers (digits 2 and 3) were adjoined with surgical glue. Somatotopic representations, assessed with two 7 tesla fMRI protocols, revealed rapid off-target reorganisation in the non-manipulated fingers following gluing, with the representation of the ring finger (digit 4) shifted towards the little finger (digit 5) and away from the middle finger (digit 3). These shifts were also evident in two behavioural tasks conducted in an independent cohort, showing reduced sensitivity for discriminating the temporal order of stimuli to the ring and little fingers, and increased substitution errors across this pair on a speeded reaction time task.
The areas of the brain that receive inputs from our senses have a map-like structure. In an area called the visual cortex this map represents our field of vision; in the auditory cortex, it represents the range of different tones we can hear. The sense of touch is processed in the somatosensory cortex: an area of the brain that is organised around a map of the body, with adjacent regions of the cortex representing adjacent regions of the body. The clear structure of these brain regions makes them ideal for exploring how the organisation of the brain changes over time. How quickly can changes to the touch inputs that the brain receives cause the map in the somatosensory cortex to reorganise? Can these effects be produced in just 24 hours? And would this remapping affect how we perceive touch? To investigate these questions, Kolasinski et al. glued together the right index and right middle fingers of healthy human volunteers. This separated the middle and ring fingers: a pair that usually move together due to the anatomical structure of the hand. Functional magnetic resonance imaging of the brain's activity revealed that within 24 hours of the gluing, the brain's representation of the ring finger moved away from that of the middle finger, and towards the representation of the little finger. A perceptual judgment task mirrored this finding: after 24 hours of gluing, the participants became better at distinguishing between the middle and ring fingers and worse at distinguishing between the ring and little fingers. This is a powerful demonstration of the human brain's potential to adapt and reorganise rapidly in response to changes in sensory inputs. The sense of touch declines gradually with age and may also be reduced as a result of disease such as stroke. A long-term challenge is to understand how the sensory regions of the brain change during this loss of sensation. Further research could then investigate how to maintain the structure of the cortical map to prolong or restore high-quality touch sensation.
Affiliation(s)
- James Kolasinski: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom; University College, Oxford, United Kingdom
- Tamar R Makin: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom; Institute of Cognitive Neuroscience, University College London, London, United Kingdom
- John P Logan: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom
- Saad Jbabdi: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom
- Stuart Clare: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom
- Charlotte J Stagg: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom; Oxford Centre for Human Brain Activity, Department of Psychiatry, University of Oxford, Oxford, United Kingdom
- Heidi Johansen-Berg: Oxford Centre for fMRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom
31
Vizzari V, Barba S, Gindri P, Duca S, Giobbe D, Cerrato P, Geminiani G, Torta DM. Mechanical pinprick pain in patients with unilateral spatial neglect: The influence of space representation on the perception of nociceptive stimuli. Eur J Pain 2016; 21:738-749. [PMID: 27977072] [DOI: 10.1002/ejp.978]
Abstract
BACKGROUND: Crossing the hands over the midline can reduce the perceived intensity of nociceptive stimuli applied to the hands. It remains unclear to what extent an intact representation of peripersonal space influences this effect. Here we used the crossed-hands paradigm in patients with unilateral spatial neglect, a neuropsychological condition characterized by the inability to detect, attend and respond to contralesional (most often left) stimuli, with spared ability to process stimuli in the non-affected space.
METHODS: Sixteen post-stroke patients without unilateral neglect and 11 patients with unilateral spatial neglect received punctate mechanical pinprick stimuli on their crossed or uncrossed hands. We tested (i) whether deficits in space representation reduce the possibility of observing 'crossed-hands analgesia', and (ii) whether placing the contralesional hand, which normally lies in the affected space, in the healthy space would increase the number of detected stimuli.
RESULTS: Neglect patients did not exhibit 'crossed-hands' analgesia, but our results did not provide strong evidence for an improvement in the number of detected stimuli when the contralesional hand was in the healthy space.
CONCLUSION: These findings uphold the notion that the perception of nociceptive stimuli is modulated by the relative position of the hands in space, but raise questions about the conditions under which these effects may arise.
SIGNIFICANCE: We show that deficits in space representation can influence the processing of mechanical pinprick stimuli. Our results raise several questions about the mechanisms underlying these effects, which are relevant for clinical practice.
Affiliation(s)
- V Vizzari: Department of Psychology, Università degli Studi di Torino, Italy
- S Barba: San Camillo Hospital, Torino, Italy
- P Gindri: San Camillo Hospital, Torino, Italy
- S Duca: Koelliker Hospital, Torino, Italy
- D Giobbe: Division of Neurology, Città della Salute e della Scienza, Torino, Italy
- P Cerrato: Stroke Unit, Division of Neurology, Città della Salute e della Scienza, Torino, Italy
- G Geminiani: Department of Psychology, Università degli Studi di Torino, Italy
- D M Torta: Institute of Neuroscience, Université catholique de Louvain, Brussels, Belgium
32
Integration of anatomical and external response mappings explains crossing effects in tactile localization: A probabilistic modeling approach. Psychon Bull Rev 2016; 23:387-404. [PMID: 26350763] [DOI: 10.3758/s13423-015-0918-0]
Abstract
To act upon a tactile stimulus its original skin-based, anatomical spatial code has to be transformed into an external, posture-dependent reference frame, a process known as tactile remapping. When the limbs are crossed, anatomical and external location codes are in conflict, leading to a decline in tactile localization accuracy. It is unknown whether this impairment originates from the integration of the resulting external localization response with the original, anatomical one or from a failure of tactile remapping in crossed postures. We fitted probabilistic models based on these diverging accounts to the data from three tactile localization experiments. Hand crossing disturbed tactile left-right location choices in all experiments. Furthermore, the size of these crossing effects was modulated by stimulus configuration and task instructions. The best model accounted for these results by integration of the external response mapping with the original, anatomical one, while applying identical integration weights for uncrossed and crossed postures. Thus, the model explained the data without assuming failures of remapping. Moreover, performance differences across tasks were accounted for by non-individual parameter adjustments, indicating that individual participants' task adaptation results from one common functional mechanism. These results suggest that remapping is an automatic and accurate process, and that the observed localization impairments in touch result from a cognitively controlled integration process that combines anatomically and externally coded responses.
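The weighted-integration account tested in this modeling paper can be illustrated with a minimal numerical sketch. The weights and probability values below are hypothetical, chosen only to show the mechanism by which integrating conflicting anatomical and external response codes degrades crossed-hands performance; they are not the paper's fitted parameters:

```python
# Hypothetical sketch of integrating an anatomical and an external location
# code with fixed weights (illustrative values, not fitted parameters).
# Each code is the probability of assigning the stimulus to the RIGHT side.

def integrate(p_anatomical: float, p_external: float,
              w_anatomical: float = 0.6, w_external: float = 0.4) -> float:
    """Weighted average of the anatomical and external response mappings."""
    return w_anatomical * p_anatomical + w_external * p_external

# Uncrossed hands: the two codes agree, so the integrated estimate stays accurate.
p_uncrossed = integrate(p_anatomical=0.9, p_external=0.9)

# Crossed hands: the codes conflict, pulling the estimate toward chance (0.5)
# and mimicking the observed decline in localization accuracy.
p_crossed = integrate(p_anatomical=0.9, p_external=0.1)

print(round(p_uncrossed, 3))
print(round(p_crossed, 3))
```

Note that the weights are identical in both postures: the crossed-hands decline falls out of the conflict between the codes alone, without assuming any failure of remapping, which is the abstract's central point.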
33
Tamè L, Wühle A, Petri CD, Pavani F, Braun C. Concurrent use of somatotopic and external reference frames in a tactile mislocalization task. Brain Cogn 2016; 111:25-33. [PMID: 27816777] [DOI: 10.1016/j.bandc.2016.10.005]
Abstract
Localizing tactile stimuli on our body requires sensory information to be represented in multiple frames of reference along the sensory pathways. These reference frames include the representation of sensory information in skin coordinates, in which the spatial relationship of skin regions is maintained. The organization of the primary somatosensory cortex matches such a somatotopic reference frame. In contrast, higher-order representations are based on external coordinates, in which body posture and gaze direction are taken into account in order to localise touch in other meaningful ways according to task demands. Dominance of one representation over the other, or the use of multiple representations with different weights, is thought to depend on contextual factors of cognitive and/or sensory origin. However, it is unclear under which situations one reference frame takes over from another, or when different reference frames are jointly used at the same time. The study of tactile mislocalizations at the fingers has shown a key role of the somatotopic frame of reference, both when touches are delivered unilaterally to a single hand and when they are delivered bilaterally to both hands. Here, we took advantage of a well-established tactile mislocalization paradigm to investigate whether the reference frame used to integrate bilateral tactile stimuli can change as a function of the spatial relationship between the two hands. Specifically, supra-threshold interference stimuli were applied to the index or little finger of the left hand 200 ms prior to the application of a test stimulus on a finger of the right hand. Crucially, different hand postures were adopted (uncrossed or crossed). Results show that introducing a change in hand posture triggered the concurrent use of somatotopic and external reference frames when processing bilateral touch at the fingers. This demonstrates that both somatotopic and external reference frames can be used concurrently to localise tactile stimuli on the fingers.
Affiliation(s)
- Luigi Tamè: Department of Psychological Sciences, Birkbeck, University of London, London, UK
- Anja Wühle: MEG-Centre, University of Tübingen, Germany
- Francesco Pavani: Centre for Mind/Brain Sciences, University of Trento, Rovereto, Italy; Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, Italy; INSERM U1028, CNRS UMR5292, Lyon Neuroscience Research Centre, Lyon, France
- Christoph Braun: MEG-Centre, University of Tübingen, Germany; Centre for Mind/Brain Sciences, University of Trento, Rovereto, Italy; Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, Italy; Centre for Integrative Neuroscience, University of Tübingen, Tübingen, Germany
34
Azañón E, Mihaljevic K, Longo MR. A three-dimensional spatial characterization of the crossed-hands deficit. Cognition 2016; 157:289-295. [PMID: 27697737] [DOI: 10.1016/j.cognition.2016.09.007]
Abstract
To perceive the location of touch in space, we integrate information about skin-location with information about the location of that body part in space. Most research investigating this process of tactile spatial remapping has used the so-called crossed-hands deficit, in which the ability to judge the temporal order of touches on the two hands is impaired when the arms are crossed. This posture induces a conflict between skin-based and tactile external spatial representations, specifically in the left-right dimension. Thus, it is unknown whether touch is affected by posture when spatial relations other than the right-left dimension are available. Here, we tested the extent to which the crossed-hands deficit is a measure of tactile remapping, reflecting tactile encoding in three-dimensional space. Participants judged the temporal order of tactile stimuli presented to crossed and uncrossed hands. The arms were placed at different elevations (up-down dimension; Experiments 1 and 2), or at different distances from the body in the depth plane (close-far dimension; Experiment 3). The crossed-hands deficit was reduced when other sources of spatial information, orthogonal to the left-right dimension (i.e., close-far, up-down), were available. Nonetheless, the deficit persisted in all conditions, even when processing of non-conflicting information in the close-far or up-down dimensions was enough to solve the task. Together, these results demonstrate that the processing underlying the crossed-hands deficit is related to the encoding of tactile localization in three-dimensional space, rather than related uniquely to the cost of processing information in the right-left dimension. Furthermore, the persistence of the crossing effect provides evidence for automatic integration of all available information during the encoding of tactile information.
Affiliation(s)
- Elena Azañón: Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK
- Kim Mihaljevic: Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK
- Matthew R Longo: Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK
35
Disentangling the External Reference Frames Relevant to Tactile Localization. PLoS One 2016; 11:e0158829. [PMID: 27391805] [PMCID: PMC4938545] [DOI: 10.1371/journal.pone.0158829]
Abstract
Different reference frames appear to be relevant for tactile spatial coding. When participants give temporal order judgments (TOJ) of two tactile stimuli, one on each hand, performance declines when the hands are crossed. This effect is attributed to a conflict between anatomical and external location codes: hand crossing places the anatomically right hand into the left side of external space. However, hand crossing alone does not specify the anchor of the external reference frame, such as gaze, trunk, or the stimulated limb. Experiments that used explicit localization responses, such as pointing to tactile stimuli rather than crossing manipulations, have consistently implicated gaze-centered coding for touch. To test whether crossing effects can be explained by gaze-centered coding alone, participants made TOJ while the position of the hands was manipulated relative to gaze and trunk. The two hands either lay on different sides of space relative to gaze or trunk, or they both lay on one side of the respective space. In the latter posture, one hand was on its "regular side of space" despite hand crossing, thus reducing overall conflict between anatomical and external codes. TOJ crossing effects were significantly reduced when the hands were both located on the same side of space relative to gaze, indicating gaze-centered coding. Evidence for trunk-centered coding was tentative, with an effect in reaction time but not in accuracy. These results link paradigms that use explicit localization and TOJ, and corroborate the relevance of gaze-related coding for touch. Yet, gaze and trunk-centered coding did not account for the total size of crossing effects, suggesting that tactile localization relies on additional, possibly limb-centered, reference frames. Thus, tactile location appears to be estimated by integrating multiple anatomical and external reference frames.
36
Badde S, Heed T. Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cogn Neuropsychol 2016; 33:26-47. [PMID: 27327353] [PMCID: PMC4975087] [DOI: 10.1080/02643294.2016.1168791]
Abstract
Touch is bound to the skin – that is, to the boundaries of the body. Yet, the activity of neurons in primary somatosensory cortex just mirrors the spatial distribution of the sensors across the skin. To determine the location of a tactile stimulus on the body, the body's spatial layout must be considered. Moreover, to relate touch to the external world, body posture has to be evaluated. In this review, we argue that posture is incorporated, by default, for any tactile stimulus. However, the relevance of the external location and, thus, its expression in behaviour, depends on various sensory and cognitive factors. Together, these factors imply that an external representation of touch dominates over the skin-based, anatomical one when our focus is on the world rather than on our own body. We conclude that touch localization is a reconstructive process that is adjusted to the context while maintaining all available spatial information.
Affiliation(s)
- Stephanie Badde: Department of Psychology, New York University, New York, NY, USA
- Tobias Heed: Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
37
Noel JP, Wallace M. Relative contributions of visual and auditory spatial representations to tactile localization. Neuropsychologia 2016; 82:84-90. [PMID: 26768124] [DOI: 10.1016/j.neuropsychologia.2016.01.005]
Abstract
Spatial localization of touch is critically dependent upon coordinate transformation between different reference frames, which must ultimately allow for alignment between somatotopic and external representations of space. Although prior work has shown an important role for cues such as body posture in influencing the spatial localization of touch, the relative contributions of the different sensory systems to this process are unknown. In the current study, we had participants perform a tactile temporal order judgment (TOJ) under different body postures and conditions of sensory deprivation. Specifically, participants performed non-speeded judgments about the order of two tactile stimuli presented in rapid succession on their ankles during conditions in which their legs were either uncrossed or crossed (the latter bringing somatotopic and external reference frames into conflict). These judgments were made in the absence of 1) visual, 2) auditory, or 3) combined audio-visual spatial information, by blindfolding and/or placing participants in an anechoic chamber. As expected, results revealed that tactile temporal acuity was poorer under crossed than uncrossed leg postures. Intriguingly, results also revealed that auditory and audio-visual deprivation exacerbated the difference in tactile temporal acuity between uncrossed and crossed leg postures, an effect not seen for visual-only deprivation. Furthermore, the effects under combined audio-visual deprivation were greater than those seen for auditory deprivation. Collectively, these results indicate that the mechanisms governing the alignment between somatotopic and external reference frames extend beyond those imposed by body posture to include spatial features conveyed by the auditory and visual modalities, with a heavier weighting of auditory than visual spatial information. Thus, sensory modalities conveying exteroceptive spatial information contribute to judgments regarding the localization of touch.
Affiliation(s)
- Jean-Paul Noel: Neuroscience Graduate Program, Vanderbilt University Medical School, Vanderbilt University, Nashville, TN 37235, USA; Vanderbilt Brain Institute, Vanderbilt University Medical School, Vanderbilt University, Nashville, TN 37235, USA
- Mark Wallace: Vanderbilt Brain Institute, Vanderbilt University Medical School, Vanderbilt University, Nashville, TN 37235, USA; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN 37235, USA; Department of Psychology, Vanderbilt University, Nashville, TN 37235, USA; Department of Psychiatry, Vanderbilt University, Nashville, TN 37235, USA
38
Azañón E, Tamè L, Maravita A, Linkenauger S, Ferrè E, Tajadura-Jiménez A, Longo M. Multimodal Contributions to Body Representation. Multisens Res 2016. [DOI: 10.1163/22134808-00002531]
Abstract
Our body is a unique entity by which we interact with the external world. Consequently, the way we represent our body has profound implications for the way we process and locate sensations and, in turn, perform appropriate actions. The body can be the subject, but also the object, of our experience, providing information from sensations on the body surface and viscera, but also knowledge of the body as a physical object. However, the extent to which different senses contribute to constructing the rich and unified body representations we all experience remains unclear. In this review, we aim to bring together recent research showing important roles for several different sensory modalities in constructing body representations. At the same time, we hope to generate new ideas about how, and at which level, the senses contribute to the different levels of body representation and how they interact. We will present an overview of some of the most recent neuropsychological evidence about multisensory control of pain, and the way that the visual, auditory, vestibular and tactile systems contribute to the creation of coherent representations of the body. We will focus particularly on some of the topics discussed in the symposium on Multimodal Contributions to Body Representation held at the 15th International Multisensory Research Forum (2015, Pisa, Italy).
Affiliation(s)
- Elena Azañón: Department of Psychological Sciences, Birkbeck, University of London, WC1E 7HX, London, UK
- Luigi Tamè: Department of Psychological Sciences, Birkbeck, University of London, WC1E 7HX, London, UK
- Angelo Maravita: Department of Psychology, Università degli studi di Milano-Bicocca, Italy; Neuromi: Milan Center for Neuroscience, Milano, Italy
- Elisa R. Ferrè: Institute of Cognitive Neuroscience, University College London, UK; Department of Psychology, Royal Holloway University of London, UK
- Ana Tajadura-Jiménez: Laboratorio de Neurociencia Humana, Departamento de Psicología, Universidad Loyola Andalucía, Spain; UCL Interaction Centre, University College London, UK
- Matthew R. Longo: Department of Psychological Sciences, Birkbeck, University of London, WC1E 7HX, London, UK
39
Colon E, Legrain V, Huang G, Mouraux A. Frequency tagging of steady-state evoked potentials to explore the crossmodal links in spatial attention between vision and touch. Psychophysiology 2015; 52:1498-510. [PMID: 26329531 DOI: 10.1111/psyp.12511]
Abstract
The sustained periodic modulation of a stimulus induces an entrainment of cortical neurons responding to the stimulus, appearing as a steady-state evoked potential (SS-EP) in the EEG frequency spectrum. Here, we used frequency tagging of SS-EPs to study the crossmodal links in spatial attention between touch and vision. We hypothesized that a visual stimulus approaching the left or right hand orients spatial attention toward the approached hand, and thereby enhances the processing of vibrotactile input originating from that hand. Twenty-five subjects took part in the experiment: 16-s trains of vibrotactile stimuli (4.2 and 7.2 Hz) were applied simultaneously to the left and right hand, concomitantly with a punctate visual stimulus blinking at 9.8 Hz. The visual stimulus approached the left or right hand. The hands were either uncrossed (left and right hands to the left and right of the participant) or crossed (left and right hands to the right and left of the participant). The vibrotactile stimuli elicited two distinct SS-EPs with scalp topographies compatible with activity in the contralateral primary somatosensory cortex. The visual stimulus elicited a third SS-EP with a topography compatible with activity in visual areas. When the visual stimulus was over one of the hands, the amplitude of the vibrotactile SS-EP elicited by stimulation of that hand was enhanced, regardless of whether the hands were uncrossed or crossed. This demonstrates a crossmodal effect of spatial attention between vision and touch, integrating proprioceptive and/or visual information to map the position of the limbs in external space.
Affiliation(s)
- Elisabeth Colon: Institute of Neuroscience, Université catholique de Louvain, Brussels, Belgium
- Valéry Legrain: Institute of Neuroscience, Université catholique de Louvain, Brussels, Belgium
- Gan Huang: Institute of Neuroscience, Université catholique de Louvain, Brussels, Belgium
- André Mouraux: Institute of Neuroscience, Université catholique de Louvain, Brussels, Belgium
40