1
Otsuka S, Gao H, Hiraoka K. Contribution of external reference frame to tactile localization. Exp Brain Res 2024; 242:1957-1970. PMID: 38918211. DOI: 10.1007/s00221-024-06877-w.
Abstract
The purpose of the present study was to elucidate whether an external reference frame contributes to tactile localization in blindfolded healthy humans. In one session, the right forearm was passively moved until the elbow reached a target angle, and participants pointed with the left index finger to the right middle fingertip. The locus of the right middle fingertip indicated by the participants deviated in the direction of elbow extension when vibration was applied to the biceps brachii muscle during the passive movement. This finding indicates that proprioception contributes to identifying the spatial coordinate of a specific body part in an external reference frame. In another session, a tactile stimulus was delivered to the dorsum of the right hand during the passive movement, and participants pointed with the left index finger to the spatial locus at which the tactile stimulus had been delivered. Vibration of the biceps brachii muscle did not change the perceived locus of the tactile stimulus indicated by the left index finger. This finding indicates that an external reference frame does not contribute to tactile localization during passive movement. Humans may instead estimate the spatial coordinate of the tactile stimulus from the time elapsed between movement onset and stimulus delivery.
Affiliation(s)
- Shunsuke Otsuka
- College of Health and Human Sciences, Osaka Prefecture University, Habikino City, Japan
- Han Gao
- Graduate School of Rehabilitation Science, Osaka Metropolitan University, Habikino City, Japan
- Koichi Hiraoka
- Department of Rehabilitation Science, School of Medicine, Osaka Metropolitan University, Habikino City, Japan

2
Coppi S, Jensen KB, Ehrsson HH. Eliciting the rubber hand illusion by the activation of nociceptive C and Aδ fibers. Pain 2024:00006396-990000000-00611. PMID: 38787634. DOI: 10.1097/j.pain.0000000000003245.
Abstract
The coherent perceptual experience of one's own body depends on the processing and integration of signals from multiple sensory modalities, including vision, touch, and proprioception. Although nociception provides critical information about damage to the tissues of one's body, little is known about how nociception contributes to own-body perception. A classic experimental approach to investigating the perceptual and neural mechanisms involved in the multisensory experience of one's own body is the rubber hand illusion (RHI). During the RHI, synchronized stroking of a rubber hand in the participant's view and of the participant's hidden real hand causes people to experience the rubber hand as part of their own body (sense of body ownership). We examined whether the RHI can be elicited by visual and "pure" nociceptive stimulation, i.e., without tactile costimulation, and if so, whether it follows the basic perceptual rules of the illusion. In 6 separate experiments involving a total of 180 healthy participants, we used a Nd:YAP laser stimulator to specifically target C and Aδ fibers in the skin and compared the illusion condition (congruent visuonociceptive stimulation) to control conditions of incongruent visuonociceptive, incongruent visuoproprioceptive, and no nociceptive stimulation. The illusion was quantified through direct (questionnaire) and indirect (proprioceptive drift) behavioral measures. We found that a nociceptive rubber hand illusion (N-RHI) could be elicited and that it depended on the spatiotemporal congruence of visuonociceptive signals, consistent with basic principles of multisensory integration. Our results suggest that nociceptive information shapes multisensory bodily awareness and contributes to the sense of body ownership.
Affiliation(s)
- Karin B Jensen
- Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden

3
Alouit A, Gavaret M, Ramdani C, Lindberg PG, Dupin L. Cortical activations associated with spatial remapping of finger touch using EEG. Cereb Cortex 2024; 34:bhae161. PMID: 38642106. DOI: 10.1093/cercor/bhae161.
Abstract
The spatial coding of tactile information is functionally essential for touch-based shape perception and motor control. However, the spatiotemporal dynamics of how tactile information is remapped from the somatotopic reference frame in the primary somatosensory cortex to the spatiotopic reference frame remain unclear. This study investigated how hand position in space or posture influences cortical somatosensory processing. Twenty-two healthy subjects received electrical stimulation to the right thumb (D1) or little finger (D5) in three position conditions: palm down on the right side of the body (baseline), hand crossing the body midline (effect of position), and palm up (effect of posture). Somatosensory-evoked potentials (SEPs) were recorded using electroencephalography. One early-, two mid-, and two late-latency neurophysiological components were identified for both fingers: P50, P1, N125, P200, and N250. D1 and D5 showed different cortical activation patterns: compared with baseline, the crossing condition showed significant clustering at P1 for D1, and at P50 and N125 for D5; the change in posture showed a significant cluster at N125 for D5. Clusters predominated at centro-parietal electrodes. These results suggest that tactile remapping of fingers after electrical stimulation occurs around 100-125 ms in the parietal cortex.
Affiliation(s)
- Anaëlle Alouit
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
- Martine Gavaret
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
- GHU-Paris Psychiatrie et Neurosciences, Hôpital Sainte Anne, Service de neurophysiologie clinique, 1 Rue Cabanis, F-75014 Paris, France
- Céline Ramdani
- Service de Santé des Armées, Institut de Recherche Biomédicale des Armées, 1 Place du Général Valérie André, 91220 Brétigny-sur-Orge, France
- Påvel G Lindberg
- Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
- Lucile Dupin
- Université Paris Cité, INCC UMR 8002, CNRS, 45 Rue des Saints-Pères, F-75006 Paris, France

4
Fabio C, Salemme R, Farnè A, Miller LE. Alpha oscillations reflect similar mapping mechanisms for localizing touch on hands and tools. iScience 2024; 27:109092. PMID: 38405611. PMCID: PMC10884914. DOI: 10.1016/j.isci.2024.109092.
Abstract
It has been suggested that our brain re-uses body-based computations to localize touch on tools, but the neural implementation of this process remains unclear. Neural oscillations in the alpha and beta frequency bands are known to map touch on the body in external and skin-centered coordinates, respectively. Here, we pinpointed the role of these oscillations during tool-extended sensing by delivering tactile stimuli to either participants' hands or the tips of hand-held rods. To disentangle brain responses related to each coordinate system, we had participants' hands/tool tips crossed or uncrossed at their body midline. We found that midline crossing modulated alpha (but not beta) band activity similarly for hands and tools, also involving a similar network of cortical regions. Our findings strongly suggest that the brain uses similar oscillatory mechanisms for mapping touch on the body and tools, supporting the idea that body-based neural processes are repurposed for tool use.
Affiliation(s)
- Cécile Fabio
- Integrative Multisensory Perception Action & Cognition Team of the Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, University of Lyon 1, Lyon, France
- Department for Cognitive Neuroscience, Faculty of Biology, Bielefeld University, Bielefeld, Germany
- Romeo Salemme
- Integrative Multisensory Perception Action & Cognition Team of the Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, University of Lyon 1, Lyon, France
- Hospices Civils de Lyon, Neuro-immersion, Lyon, France
- Alessandro Farnè
- Integrative Multisensory Perception Action & Cognition Team of the Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, University of Lyon 1, Lyon, France
- Hospices Civils de Lyon, Neuro-immersion, Lyon, France
- Luke E. Miller
- Integrative Multisensory Perception Action & Cognition Team of the Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, University of Lyon 1, Lyon, France
- Hospices Civils de Lyon, Neuro-immersion, Lyon, France
- Donders Institute for Brain, Cognition and Behaviour, Nijmegen, the Netherlands

5
Girondini M, Montanaro M, Gallace A. Spatial tactile localization depends on sensorimotor binding: preliminary evidence from virtual reality. Front Hum Neurosci 2024; 18:1354633. PMID: 38445099. PMCID: PMC10912179. DOI: 10.3389/fnhum.2024.1354633.
Abstract
Introduction Our brain continuously maps our body in space. It has been suggested that at least two main frames of reference are used to process somatosensory stimuli presented on our own body: the anatomical frame of reference (based on the somatotopic representation of our body in the somatosensory cortex) and the spatial frame of reference (where body parts are mapped in external space). Interestingly, a mismatch between somatotopic and spatial information significantly affects the processing of bodily information, as demonstrated by the "crossed hands" effect. However, it is not clear whether this impairment occurs only when the conflict between these frames of reference is produced by a static change in body position (e.g., by crossing the hands) or also when new associations between motor and sensory responses are artificially created (e.g., by presenting feedback stimuli on a side of the body that is not involved in the movement). Methods In the present study, 16 participants performed a temporal order judgment task before and after a congruent or incongruent visual-tactile-motor task in virtual reality. During the VR task, participants had to move a cube using a virtual stick. In the congruent condition, haptic feedback during the interaction with the cube was provided to the right hand (the one used to control the stick). In the incongruent condition, haptic feedback was provided to the contralateral hand, simulating a sort of 'active' crossed feedback during the interaction. Using a psychophysical approach, the point of subjective equality (PSE, i.e., the stimulus onset asynchrony at which participants were equally likely to report either hand as stimulated first) and the JND (a measure of temporal resolution) were calculated for both conditions, before and after the VR task.
Results After the VR task, compared to the baseline condition, the PSE shifted toward the hand that received the haptic feedback during the interaction (toward the right hand in the congruent condition and toward the left hand in the incongruent condition). Discussion This study demonstrates the possibility of inducing spatial biases in the processing of bodily information by modulating the sensory-motor interaction between stimuli in virtual environments (while keeping the actual position of the body in space constant).
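The PSE and JND described above are conventionally obtained by fitting a cumulative Gaussian psychometric function to the proportion of "right first" responses across stimulus onset asynchronies (SOAs). A minimal, dependency-free sketch; the SOAs, response proportions, and grid ranges below are illustrative assumptions, not values from the study:

```python
from math import erf, sqrt

# Illustrative TOJ data: SOA in ms (negative = left hand stimulated first)
# and the proportion of "right first" responses at each SOA.
soa = [-200, -90, -30, 0, 30, 90, 200]
p_right_first = [0.03, 0.12, 0.34, 0.52, 0.69, 0.88, 0.97]

def cum_gauss(x, mu, sigma):
    """Cumulative Gaussian: mu is the 50% point (the PSE), sigma sets the slope."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# Crude grid-search least-squares fit (a simple stand-in for maximum-likelihood
# psychometric fitting).
best_err, pse, sigma = float("inf"), 0.0, 0.0
for mu in range(-50, 51):        # candidate PSE values, ms
    for s in range(20, 201):     # candidate slope values, ms
        err = sum((cum_gauss(x, mu, s) - p) ** 2
                  for x, p in zip(soa, p_right_first))
        if err < best_err:
            best_err, pse, sigma = err, mu, s

jnd = 0.6745 * sigma  # half the 25%-75% interval of the fitted curve
print(f"PSE = {pse} ms, JND = {jnd:.1f} ms")
```

A PSE shift toward negative SOAs after the VR task would correspond to the bias toward the left hand reported for the incongruent condition.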
Affiliation(s)
- Matteo Girondini
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Mind and Behavior Technological Center, University of Milano-Bicocca, Milan, Italy
- MySpace Lab, Department of Clinical Neuroscience, University Hospital of Lausanne, Lausanne, Switzerland
- Massimo Montanaro
- Mind and Behavior Technological Center, University of Milano-Bicocca, Milan, Italy
- Alberto Gallace
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Mind and Behavior Technological Center, University of Milano-Bicocca, Milan, Italy

6
Klautke J, Foster C, Medendorp WP, Heed T. Dynamic spatial coding in parietal cortex mediates tactile-motor transformation. Nat Commun 2023; 14:4532. PMID: 37500625. PMCID: PMC10374589. DOI: 10.1038/s41467-023-39959-4.
Abstract
Movements towards touch on the body require integrating tactile location and body posture information. Tactile processing and movement planning both rely on posterior parietal cortex (PPC), but their interplay is not understood. Here, human participants received tactile stimuli on their crossed and uncrossed feet, dissociating stimulus location relative to anatomy versus external space. Participants pointed to the touch or to the equivalent location on the other foot, which dissociates sensory and motor locations. Multi-voxel pattern analysis of concurrently recorded fMRI signals revealed that tactile location was coded anatomically in anterior PPC but spatially in posterior PPC during sensory processing. After movement instructions were specified, PPC exclusively represented the movement goal in space, in regions associated with visuo-motor planning and with regional overlap for sensory, rule-related, and movement coding. Thus, PPC flexibly updates its spatial codes to accommodate rule-based transformation of sensory input, generating movements toward the environment and one's own body alike.
Affiliation(s)
- Janina Klautke
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
- Celia Foster
- Biopsychology & Cognitive Neuroscience, Bielefeld University, Bielefeld, Germany
- Center of Excellence in Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- W Pieter Medendorp
- Radboud University, Donders Institute for Brain, Cognition and Behaviour, Nijmegen, The Netherlands
- Tobias Heed
- Biopsychology & Cognitive Neuroscience, Bielefeld University, Bielefeld, Germany
- Center of Excellence in Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Cognitive Psychology, Department of Psychology, University of Salzburg, Salzburg, Austria
- Centre for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria

7
Hanyu N, Watanabe K, Kitazawa S. Ready to detect a reversal of time's arrow: a psychophysical study using short video clips in daily scenes. R Soc Open Sci 2023; 10:230036. PMID: 37090963. PMCID: PMC10113813. DOI: 10.1098/rsos.230036.
Abstract
It is generally believed that time flows in one direction and that a reversal of time's arrow would render the external world non-sensical. We evaluated our ability to tell the direction of time's arrow in a wide range of dynamic scenes from daily life by presenting 360 video clips played in the correct or reversed direction. Participants, who judged the direction in a speeded manner, erred in 39% of trials when a video was played in reverse, but in only 9% when it was played normally. Owing to a bias favouring the 'forward' judgement, reactions were generally faster for forward responses. However, reactions became paradoxically faster and more synchronous for the detection of reversal on some critical occasions, such as forward motion, free fall, diffusion, and the division or addition of materials by hand. Another experiment with a fraction of the video clips revealed that reversed replay of these videos provided instantaneous evidence strong enough to overcome the forward judgement bias. We suggest that our brain is equipped with a system that predicts how external organisms and materials behave or move on these critical occasions and that the prediction error of this system contributes to fast 'reversal' detection.
Affiliation(s)
- Nao Hanyu
- Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, Osaka University, Osaka 565-0871, Japan
- Kei Watanabe
- Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, and Department of Brain Physiology, Graduate School of Medicine, Osaka University, Osaka 565-0871, Japan
- Shigeru Kitazawa
- Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, and Department of Brain Physiology, Graduate School of Medicine, Osaka University, Osaka 565-0871, Japan
- Center for Information and Neural Networks (CiNet), National Institute of Information and Communications Technology, Osaka 565-0871, Japan

8
Moharramipour A, Takahashi T, Kitazawa S. Distinctive modes of cortical communications in tactile temporal order judgment. Cereb Cortex 2023; 33:2982-2996. PMID: 35811300. DOI: 10.1093/cercor/bhac255.
Abstract
Temporal order judgment of two successive tactile stimuli delivered to our hands is often inverted when we cross our hands. The present study aimed to identify time-frequency profiles of the interactions across the cortical network associated with the crossed-hand tactile temporal order judgment task using magnetoencephalography. We found that the interactions across the cortical network were channeled to a low-frequency band (5-10 Hz) when the hands were uncrossed. However, the interactions became activated in a higher band (12-18 Hz) when the hands were crossed. The participants with fewer inverted judgments relied mainly on the higher band, whereas those with more frequent inverted judgments (reversers) utilized both. Moreover, reversers showed greater cortical interactions in the higher band when their judgment was correct compared to when it was inverted. Overall, the results show that the cortical network communicates in two distinctive frequency modes during the crossed-hand tactile temporal order judgment task. A default mode of communications in the low-frequency band encourages inverted judgments, and correct judgment is robustly achieved by recruiting the high-frequency mode.
Affiliation(s)
- Ali Moharramipour
- Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, Osaka University, 1-3 Yamadaoka, Suita, Osaka 565-0871, Japan
- Laboratory for Consciousness, Center for Brain Science (CBS), RIKEN, 2-1 Hirosawa, Wako, Saitama 351-0106, Japan
- Toshimitsu Takahashi
- Department of Physiology, Dokkyo Medical University, 880 Kitakobayashi, Mibu, Shimotsuga, Tochigi 321-0293, Japan
- Shigeru Kitazawa
- Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, Osaka University, 1-3 Yamadaoka, Suita, Osaka 565-0871, Japan
- Department of Brain Physiology, Graduate School of Medicine, Osaka University, 1-3 Yamadaoka, Suita, Osaka 565-0871, Japan
- Center for Information and Neural Networks (CiNet), National Institute of Information and Communications Technology, 1-4 Yamadaoka, Suita, Osaka 565-0871, Japan

9
Wada M, Umesawa Y, Sano M, Tajima S, Kumagaya S, Miyazaki M. Weakened Bayesian Calibration for Tactile Temporal Order Judgment in Individuals with Higher Autistic Traits. J Autism Dev Disord 2023; 53:378-389. PMID: 35064873. PMCID: PMC9889458. DOI: 10.1007/s10803-022-05442-0.
Abstract
Previous psychophysical studies reported a positive aftereffect in tactile temporal order judgments, which can be explained by the Bayesian estimation model ('Bayesian calibration'). We investigated the relationship between Bayesian calibration and autistic traits in participants with typical development (TD) and autism spectrum disorder (ASD). Bayesian calibration was weakened in TD participants with high autistic traits, consistent with the 'hypo-priors' hypothesis for autistic perceptions. The results from the ASD group were generally observed as a continuation of those from the TD groups. Meanwhile, two ASD participants showed irregularly large positive or negative aftereffects. We discussed the mechanisms behind the general results among TD and ASD participants and two particular results among ASD participants based on the Bayesian estimation model.
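Bayesian calibration of this kind is usually modeled as reliability-weighted integration of a sensory measurement with a Gaussian prior over stimulus onset asynchronies; under the 'hypo-priors' hypothesis, a broader prior produces a smaller aftereffect. A small sketch in which the prior mean, standard deviations, and measured SOA are illustrative assumptions, not parameters fitted in the study:

```python
def calibrated_soa(measured_soa, sigma_sensory, prior_mean, sigma_prior):
    """Posterior-mean SOA estimate: a precision-weighted average of the
    sensory measurement and the prior (both assumed Gaussian)."""
    w = sigma_prior**2 / (sigma_prior**2 + sigma_sensory**2)
    return w * measured_soa + (1.0 - w) * prior_mean

# Suppose repeated exposure shifts the prior mean to +50 ms. A physically
# simultaneous pair (measured SOA = 0 ms) is then perceived as shifted:
typical = calibrated_soa(0.0, 50.0, 50.0, 100.0)  # narrower (stronger) prior
hypo = calibrated_soa(0.0, 50.0, 50.0, 300.0)     # broader ("hypo") prior
print(typical, hypo)
```

With these numbers the perceived shift shrinks from 10 ms under the stronger prior to about 1.4 ms under the broader one, mirroring the weakened aftereffect reported for participants with higher autistic traits.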
Affiliation(s)
- Makoto Wada
- Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, 4-1 Namiki, Tokorozawa, Saitama 359-8555, Japan
- Faculty of Informatics, Shizuoka University, Hamamatsu, Shizuoka 432-8011, Japan
- Yumi Umesawa
- Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, 4-1 Namiki, Tokorozawa, Saitama 359-8555, Japan
- Faculty of Medicine, Kyorin University, Mitaka, Tokyo 181-8611, Japan
- Misako Sano
- Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, 4-1 Namiki, Tokorozawa, Saitama 359-8555, Japan
- Graduate School of Medicine, Nagoya University, Nagoya, Aichi 461-8673, Japan
- Seiki Tajima
- Department of Child Psychiatry, Hospital of National Rehabilitation Center for Persons with Disabilities, Tokorozawa, Saitama 359-8555, Japan
- Shinichiro Kumagaya
- Research Center for Advanced Science and Technology, The University of Tokyo, Meguro, Tokyo 153-8904, Japan
- Makoto Miyazaki
- Faculty of Informatics, Shizuoka University, Hamamatsu, Shizuoka 432-8011, Japan

10
Segil JL, Roldan LM, Graczyk EL. Measuring embodiment: A review of methods for prosthetic devices. Front Neurorobot 2022; 16:902162. PMID: 36590084. PMCID: PMC9797051. DOI: 10.3389/fnbot.2022.902162.
Abstract
The development of neural interfaces to provide improved control and somatosensory feedback from prosthetic limbs has initiated a new ability to probe the various dimensions of embodiment. Scientists in the field of neuroprosthetics require dependable measures of ownership, body representation, and agency to quantify the sense of embodiment felt by patients for their prosthetic limbs. These measures are critical to perform generalizable experiments and compare the utility of the new technologies being developed. Here, we review outcome measures used in the literature to evaluate the senses of ownership, body-representation, and agency. We categorize these existing measures based on the fundamental psychometric property measured and whether it is a behavioral or physiological measure. We present arguments for the efficacy and pitfalls of each measure to guide better experimental designs and future outcome measure development. The purpose of this review is to aid prosthesis researchers and technology developers in understanding the concept of embodiment and selecting metrics to assess embodiment in their research. Advances in the ability to measure the embodiment of prosthetic devices have far-reaching implications in the improvement of prosthetic limbs as well as promoting a broader understanding of ourselves as embodied agents.
Affiliation(s)
- Jacob L. Segil
- Department of Mechanical Engineering, University of Colorado, Boulder, CO, United States
- Rocky Mountain Regional VA Medical Center, Aurora, CO, United States
- Leah Marie Roldan
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, United States
- Louis Stokes Cleveland VA Medical Center, Cleveland, OH, United States
- Emily L. Graczyk
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, United States
- Louis Stokes Cleveland VA Medical Center, Cleveland, OH, United States

11
Shibuya S, Oosone H, Ohki Y. Tactile temporal order judgment during rubber hand illusion: Distinct modulation of the point of subjective simultaneity and temporal resolution. Conscious Cogn 2022; 105:103402. PMID: 36067686. DOI: 10.1016/j.concog.2022.103402.
Abstract
During the rubber hand illusion (RHI), individuals feel a fake hand as their own (ownership), and the perceived position of their real hand shifts toward the fake hand (proprioceptive drift; PD), reflecting an updating of multisensory hand representations. Bimanual tactile temporal order judgment (TOJ) involves localizing tactile stimuli in space, a process for which multisensory hand representations are essential. Given these common processes, we examined tactile TOJ performance during the RHI and non-RHI conditions. Temporal resolution (TR), an index of TOJ accuracy, was worse in the non-RHI condition than in the RHI condition. Additionally, a significant correlation between TR and PD was observed only in the non-RHI condition. However, the point of subjective simultaneity (PSS), which reflects the relative weighting of tactile inputs from the right and left hands, was correlated with illusory hand ownership. These results suggest that the PSS and TR from tactile TOJ during the RHI relate to self-attribution and localization of the hand, respectively.
Affiliation(s)
- Satoshi Shibuya
- Department of Integrative Physiology, School of Medicine, Kyorin University, 6-20-2 Shinkawa, Mitaka, Tokyo 181-8611, Japan
- Hiroki Oosone
- Chiba Minato Rehabilitation Hospital, 1-17-18 Chuo-minato, Chuo-ku, Chiba City, Chiba 260-0024, Japan
- Yukari Ohki
- Department of Integrative Physiology, School of Medicine, Kyorin University, 6-20-2 Shinkawa, Mitaka, Tokyo 181-8611, Japan

12
Abdulrabba S, Tremblay L, Manson GA. Investigating the online control of goal-directed actions to a tactile target on the body. Exp Brain Res 2022; 240:2773-2782. PMID: 36100753. DOI: 10.1007/s00221-022-06445-0.
Abstract
Movement corrections to somatosensory targets have been found to be shorter in latency and larger in magnitude than corrections to external visual targets. Somatosensory targets (e.g., body positions) can be identified using both tactile (i.e., skin receptors) and proprioceptive information (e.g., the sense of body position derived from sensory organs in the muscles and joints). Here, we investigated whether changes in tactile information alone, without changes in proprioception, can elicit shorter correction latencies and larger correction magnitudes than those to external visual targets. Participants made reaching movements to a myofilament touching the index finger of the non-reaching hand (i.e., a tactile target) or to a light-emitting diode (i.e., a visual target). In one-third of the trials, target perturbations occurred 100 ms after movement onset, such that the target was displaced 3 cm either away from or toward the participant. We found that participants demonstrated larger correction magnitudes to visual than to tactile target perturbations. Moreover, we found no differences in correction latency between movements to perturbed tactile and visual targets. Further, we found that while participants detected tactile stimuli earlier than visual stimuli, they took longer to initiate reaching movements to an unperturbed tactile target than to an unperturbed visual target. These results provide evidence that additional processes may be required when planning movements to tactile versus visual targets and that corrections to changes in tactile target position alone may not confer the latency and magnitude advantages observed for corrections to somatosensory (i.e., proprioceptive-tactile) targets.
Affiliation(s)
- Sadiya Abdulrabba
- Perceptual-Motor Behaviour Laboratory, Centre for Motor Control, Faculty of Kinesiology and Physical Education, University of Toronto, 55 Harbord Street, Toronto, M5S 2W6, Canada
- Sensorimotor Exploration Laboratory, School of Kinesiology and Health Studies, Queen's University, 28 Division St, Kingston, ON, K7L 3N6, Canada
- Luc Tremblay
- Perceptual-Motor Behaviour Laboratory, Centre for Motor Control, Faculty of Kinesiology and Physical Education, University of Toronto, 55 Harbord Street, Toronto, M5S 2W6, Canada
- Gerome Aleandro Manson
- Sensorimotor Exploration Laboratory, School of Kinesiology and Health Studies, Queen's University, 28 Division St, Kingston, ON, K7L 3N6, Canada

13
Li H, Song L, Wang P, Weiss PH, Fink GR, Zhou X, Chen Q. Impaired body-centered sensorimotor transformations in congenitally deaf people. Brain Commun 2022; 4:fcac148. PMID: 35774184. PMCID: PMC9240416. DOI: 10.1093/braincomms/fcac148.
Abstract
Congenital deafness modifies an individual's daily interaction with the environment and alters the fundamental perception of the external world. How congenital deafness shapes the interface between the internal and external worlds remains poorly understood. To interact efficiently with the external world, visuospatial representations of external target objects need to be effectively transformed into sensorimotor representations with reference to the body. Here, we tested the hypothesis that egocentric body-centred sensorimotor transformation is impaired in congenital deafness. Consistent with this hypothesis, we found that congenital deafness induced impairments in egocentric judgements, associating the external objects with the internal body. These impairments were due to deficient body-centred sensorimotor transformation per se, rather than the reduced fidelity of the visuospatial representations of the egocentric positions. At the neural level, we first replicated the previously well-documented critical involvement of the frontoparietal network in egocentric processing, in both congenitally deaf participants and hearing controls. However, both the strength of neural activity and the intra-network connectivity within the frontoparietal network alone could not account for egocentric performance variance. Instead, the inter-network connectivity between the task-positive frontoparietal network and the task-negative default-mode network was significantly correlated with egocentric performance: the more cross-talking between them, the worse the egocentric judgement. Accordingly, the impaired egocentric performance in the deaf group was related to increased inter-network connectivity between the frontoparietal network and the default-mode network and decreased intra-network connectivity within the default-mode network. The altered neural network dynamics in congenital deafness were observed for both evoked neural activity during egocentric processing and intrinsic neural activity during rest. Our findings thus not only demonstrate the optimal network configurations between the task-positive and -negative neural networks underlying coherent body-centred sensorimotor transformations but also unravel a critical cause (i.e. impaired body-centred sensorimotor transformation) of a variety of hitherto unexplained difficulties in sensory-guided movements the deaf population experiences in their daily life.
Collapse
Affiliation(s)
- Hui Li
- Key Laboratory of Brain, Cognition and Education Sciences, Ministry of Education, China
- School of Psychology, Center for Studies of Psychological Application, and Guangdong Key Laboratory of Mental Health and Cognitive Science, South China Normal University, China
| | - Li Song
- Key Laboratory of Brain, Cognition and Education Sciences, Ministry of Education, China
- School of Psychology, Center for Studies of Psychological Application, and Guangdong Key Laboratory of Mental Health and Cognitive Science, South China Normal University, China
| | - Pengfei Wang
- Key Laboratory of Brain, Cognition and Education Sciences, Ministry of Education, China
- School of Psychology, Center for Studies of Psychological Application, and Guangdong Key Laboratory of Mental Health and Cognitive Science, South China Normal University, China
| | - Peter H. Weiss
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Research Centre Jülich, Wilhelm-Johnen-Strasse, 52428 Jülich, Germany
- Department of Neurology, University Hospital Cologne, Cologne University, 50937 Cologne, Germany
| | - Gereon R. Fink
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Research Centre Jülich, Wilhelm-Johnen-Strasse, 52428 Jülich, Germany
- Department of Neurology, University Hospital Cologne, Cologne University, 50937 Cologne, Germany
| | - Xiaolin Zhou
- Shanghai Key Laboratory of Mental Health and Psychological Crisis Intervention, School of Psychology and Cognitive Science, East China Normal University, 200062 Shanghai, China
| | - Qi Chen
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Research Centre Jülich, Wilhelm-Johnen-Strasse, 52428 Jülich, Germany
- Key Laboratory of Brain, Cognition and Education Sciences, Ministry of Education, China
- School of Psychology, Center for Studies of Psychological Application, and Guangdong Key Laboratory of Mental Health and Cognitive Science, South China Normal University, China
| |
Collapse
|
14
|
Do motor plans affect sensorimotor state estimates during temporal decision-making with crossed vs. uncrossed hands? Failure to replicate the dynamic crossed-hand effect. Exp Brain Res 2022; 240:1529-1545. [PMID: 35332358 DOI: 10.1007/s00221-022-06349-z] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/27/2021] [Accepted: 03/10/2022] [Indexed: 11/04/2022]
Abstract
Hermosillo et al. (J Neurosci 31: 10019-10022, 2011) have suggested that action planning of hand movements impacts temporal order judgments regarding vibrotactile stimulation of the hands. Specifically, these authors reported that the crossed-hand effect, a confusion about which hand is which when held in a crossed posture, gradually reverses some 320 ms before the arms begin to move from an uncrossed to a crossed posture or vice versa, such that the crossed-hand effect is reversed at the time of movement onset in anticipation of the movement's end position. However, to date, no other study has attempted to replicate this dynamic crossed-hand effect. Therefore, in the present study, we conducted four experiments to revisit the question of whether preparing uncrossed-to-crossed or crossed-to-uncrossed movements affects the temporo-spatial perception of tactile stimulation of the hands. We used a temporal order judgement (TOJ) task at different time stages during action planning to test whether TOJs are more difficult with crossed than uncrossed hands ("static crossed-hand effect") and, crucially, whether planning to cross or uncross the hands shows the opposite pattern of difficulties ("dynamic crossed-hand effect"). As expected, our results confirmed the static crossed-hand effect. However, the dynamic crossed-hand effect could not be replicated. In addition, we observed that participants delayed their movements with late somatosensory stimulation from the TOJ task, even when the stimulations were meaningless, suggesting that the TOJ task resulted in cross-modal distractions. Whereas the current findings are not inconsistent with a contribution of motor signals to posture perception, they cast doubt on observations that motor signals impact state estimates well before movement onset.
Collapse
|
15
|
Martel M, Fuchs X, Trojan J, Gockel V, Habets B, Heed T. Illusory tactile movement crosses arms and legs and is coded in external space. Cortex 2022; 149:202-225. [DOI: 10.1016/j.cortex.2022.01.014] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/16/2021] [Revised: 11/08/2021] [Accepted: 01/24/2022] [Indexed: 11/03/2022]
|
16
|
Fabio C, Salemme R, Koun E, Farnè A, Miller LE. Alpha Oscillations Are Involved in Localizing Touch on Handheld Tools. J Cogn Neurosci 2022; 34:675-686. [DOI: 10.1162/jocn_a_01820] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
The sense of touch is not restricted to the body but can also extend to external objects. When we use a handheld tool to contact an object, we feel the touch on the tool, not in the hand holding it. The ability to perceive touch on a tool actually extends along its entire surface, allowing the user to localize where it is touched as accurately as they would on their own body. Although the neural mechanisms underlying the ability to localize touch on the body have been investigated extensively, those that allow touch to be localized on a tool are still unknown. We aimed to fill this gap by recording the electroencephalography signal of participants while they localized tactile stimuli on a handheld rod. We focused on oscillatory activity in the alpha (7–14 Hz) and beta (15–30 Hz) ranges, as these have previously been linked to distinct spatial codes used to localize touch on the body: beta activity reflects the mapping of touch in skin-based coordinates, whereas alpha activity reflects the mapping of touch in external space. We found that only alpha activity was modulated by the location of tactile stimuli applied to the handheld rod. Source reconstruction suggested that this alpha power modulation was localized in a network of fronto-parietal regions previously implicated in higher-order tactile and spatial processing. These findings are the first to implicate alpha oscillations in tool-extended sensing and suggest an important role for processing touch in external space when localizing touch on a tool.
Collapse
Affiliation(s)
- Cécile Fabio
- ImpAct, Lyon Neuroscience Research Center, France
- University of Lyon 1, France
| | - Romeo Salemme
- ImpAct, Lyon Neuroscience Research Center, France
- University of Lyon 1, France
- Hospices Civils de Lyon, Neuro-immersion, France
| | - Eric Koun
- ImpAct, Lyon Neuroscience Research Center, France
- University of Lyon 1, France
- Hospices Civils de Lyon, Neuro-immersion, France
| | - Alessandro Farnè
- ImpAct, Lyon Neuroscience Research Center, France
- University of Lyon 1, France
- Hospices Civils de Lyon, Neuro-immersion, France
- University of Trento, Rovereto, Italy
| | - Luke E. Miller
- ImpAct, Lyon Neuroscience Research Center, France
- University of Lyon 1, France
- Hospices Civils de Lyon, Neuro-immersion, France
- Donders Institute for Brain, Nijmegen, The Netherlands
| |
Collapse
|
17
|
Gori M, Campus C, Signorini S, Rivara E, Bremner AJ. Multisensory spatial perception in visually impaired infants. Curr Biol 2021; 31:5093-5101.e5. [PMID: 34555348 PMCID: PMC8612739 DOI: 10.1016/j.cub.2021.09.011] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/23/2020] [Revised: 07/29/2021] [Accepted: 09/03/2021] [Indexed: 12/02/2022]
Abstract
Congenitally blind infants are not only deprived of visual input but also of visual influences on the intact senses. The important role that vision plays in the early development of multisensory spatial perception1, 2, 3, 4, 5, 6, 7 (e.g., in crossmodal calibration8, 9, 10 and in the formation of multisensory spatial representations of the body and the world1,2) raises the possibility that impairments in spatial perception are at the heart of the wide range of difficulties that visually impaired infants show across spatial,8, 9, 10, 11, 12 motor,13, 14, 15, 16, 17 and social domains.8,18,19 But investigations of early development are needed to clarify how visually impaired infants’ spatial hearing and touch support their emerging ability to make sense of their body and the outside world. We compared sighted (S) and severely visually impaired (SVI) infants’ responses to auditory and tactile stimuli presented on their hands. No statistically reliable differences in the direction or latency of responses to auditory stimuli emerged, but significant group differences emerged in responses to tactile and audiotactile stimuli. The visually impaired infants showed attenuated audiotactile spatial integration and interference, weighted more tactile than auditory cues when the two were presented in conflict, and showed a more limited influence of representations of the external layout of the body on tactile spatial perception.20 These findings uncover a distinct phenotype of multisensory spatial perception in early postnatal visual deprivation. Importantly, evidence of audiotactile spatial integration in visually impaired infants, albeit to a lesser degree than in sighted infants, signals the potential of multisensory rehabilitation methods in early development.
- Visually impaired infants have a distinct phenotype of audiotactile perception
- Infants with severe visual impairment (SVI) place more weight on tactile locations
- SVI infants show attenuated audiotactile spatial integration and interference
- SVI infants do not show an influence of body representations on tactile space
Collapse
Affiliation(s)
- Monica Gori
- Unit for Visually Impaired People, Istituto Italiano di Tecnologia, 16152 Genova, Italy.
| | - Claudio Campus
- Unit for Visually Impaired People, Istituto Italiano di Tecnologia, 16152 Genova, Italy
| | - Sabrina Signorini
- Centre of Child Neurophthalmology, IRCCS Mondino Foundation, 27100 Pavia, Italy
| | | | - Andrew J Bremner
- School of Psychology, University of Birmingham, Birmingham B15 2SB, UK
| |
Collapse
|
18
|
Lorentz L, Unwalla K, Shore DI. Imagine Your Crossed Hands as Uncrossed: Visual Imagery Impacts the Crossed-Hands Deficit. Multisens Res 2021; 35:1-29. [PMID: 34690111 DOI: 10.1163/22134808-bja10065] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/12/2021] [Accepted: 10/06/2021] [Indexed: 11/19/2022]
Abstract
Successful interaction with our environment requires accurate tactile localization. Although we seem to localize tactile stimuli effortlessly, the processes underlying this ability are complex. This is evidenced by the crossed-hands deficit, in which tactile localization performance suffers when the hands are crossed. The deficit results from the conflict between an internal reference frame, based in somatotopic coordinates, and an external reference frame, based in external spatial coordinates. Previous evidence in favour of the integration model employed manipulations of the external reference frame (e.g., blindfolding participants), which reduced the deficit by reducing conflict between the two reference frames. The present study extends this finding by asking blindfolded participants to visually imagine their crossed arms as uncrossed. This imagery manipulation further decreased the magnitude of the crossed-hands deficit by bringing information in the two reference frames into alignment. The imagery manipulation affected males and females differently, consistent with the previously observed sex difference in this effect: females tend to show a larger crossed-hands deficit than males, and females were more impacted by the imagery manipulation. Results are discussed in terms of the integration model of the crossed-hands deficit.
Collapse
Affiliation(s)
- Lisa Lorentz
- Department of Psychology, Neuroscience and Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
| | - Kaian Unwalla
- Department of Psychology, Neuroscience and Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
| | - David I Shore
- Department of Psychology, Neuroscience and Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- Multisensory Perception Laboratory, a Division of the Multisensory Mind Inc., Hamilton, ON, Canada
| |
Collapse
|
19
|
Different mechanisms of magnitude and spatial representation for tactile and auditory modalities. Exp Brain Res 2021; 239:3123-3132. [PMID: 34415367 PMCID: PMC8536643 DOI: 10.1007/s00221-021-06196-4] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/06/2021] [Accepted: 08/11/2021] [Indexed: 11/04/2022]
Abstract
The human brain creates a representation of the external world based on magnitude judgments, such as estimates of distance, numerosity, or size. Magnitude and spatial representation are hypothesized to rely on common mechanisms shared by different sensory modalities. We explored the relationship between magnitude and spatial representation using two different sensory systems, hypothesizing that space and magnitude are combined differently depending on the sensory modality. Furthermore, we aimed to understand the role of the spatial reference frame in magnitude representation. We used stimulus–response compatibility (SRC) to investigate these processes, assuming that performance improves when stimulus and response share common features. We designed auditory and tactile SRC tasks with conflicting spatial and magnitude mapping. Our results showed that sensory modality modulates the relationship between space and magnitude: a larger effect of magnitude over spatial congruency occurred in the tactile task, whereas magnitude and space carried similar weight in the auditory task, with neither spatial congruency nor magnitude congruency having a significant effect. Moreover, we observed that the spatial frame activated during the tasks was elicited by the sensory inputs. In the tactile task, participants' performance reversed between the uncrossed and crossed-hands postures, suggesting an internal coordinate system; in contrast, crossing the hands did not alter performance in the auditory task (i.e., an allocentric frame of reference was used). Overall, these results suggest that the interaction between space and magnitude differs across the auditory and tactile modalities, supporting the idea that these sensory modalities use different magnitude and spatial representation mechanisms.
Collapse
|
20
|
Unwalla K, Goldreich D, Shore DI. Exploring Reference Frame Integration Using Response Demands in a Tactile Temporal-Order Judgement Task. Multisens Res 2021; 34:1-32. [PMID: 34375947 DOI: 10.1163/22134808-bja10057] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2020] [Accepted: 06/10/2021] [Indexed: 11/19/2022]
Abstract
Exploring the world through touch requires the integration of internal (e.g., anatomical) and external (e.g., spatial) reference frames: you only know what you touch when you know where your hands are in space. The deficit observed in tactile temporal-order judgements when the hands are crossed over the midline provides one tool to explore this integration. We used foot pedals and required participants to focus on either the hand that was stimulated first (an anatomical bias condition) or the location of the hand that was stimulated first (a spatiotopic bias condition). Spatiotopic-based responses produce a larger crossed-hands deficit, presumably by focusing observers on the external reference frame. In contrast, anatomical-based responses focus the observer on the internal reference frame and produce a smaller deficit. This manipulation thus provides evidence that observers can change the relative weight given to each reference frame. We quantify this effect using a probabilistic model that produces a population estimate of the relative weight given to each reference frame. We show that a spatiotopic bias can result in either a larger external weight (Experiment 1) or a smaller internal weight (Experiment 2) and provide an explanation of when each one would occur.
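The weighted-integration logic behind the crossed-hands deficit can be illustrated with a minimal sketch. This is not the authors' probabilistic model: the function name, the logistic form, and all parameter values below are illustrative assumptions. The idea it captures is only that with crossed hands the external frame votes for the wrong hand, so a larger external weight hurts performance.

```python
import math

def toj_accuracy(soa_ms, w_internal, w_external, hands_crossed, noise_ms=80.0):
    """Illustrative P(correct order judgment) for two taps separated by soa_ms.

    When the hands are crossed, the external reference frame points to the
    wrong hand, so its weight subtracts from the effective evidence; when
    uncrossed, both frames agree and their weights add. The logistic link
    and noise scale are assumptions, not fitted values.
    """
    frame_agreement = (w_internal - w_external) if hands_crossed else (w_internal + w_external)
    effective_soa = soa_ms * frame_agreement
    return 1.0 / (1.0 + math.exp(-effective_soa / noise_ms))

# Crossing the hands lowers accuracy at the same SOA (the deficit):
uncrossed = toj_accuracy(100, w_internal=0.7, w_external=0.3, hands_crossed=False)
crossed = toj_accuracy(100, w_internal=0.7, w_external=0.3, hands_crossed=True)
```

In this toy form, reducing `w_external` (as the anatomical-bias instructions are argued to do) shrinks the gap between the crossed and uncrossed conditions, which is the qualitative pattern the abstract describes.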
Collapse
Affiliation(s)
- Kaian Unwalla
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4K1, Canada
| | - Daniel Goldreich
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4K1, Canada
| | - David I Shore
- Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4K1, Canada
- Multisensory Perception Laboratory, a Division of the Multisensory Mind Inc., Hamilton, ON, Canada
| |
Collapse
|
21
|
Bollini A, Campus C, Gori M. The development of allocentric spatial frame in the auditory system. J Exp Child Psychol 2021; 211:105228. [PMID: 34242896 DOI: 10.1016/j.jecp.2021.105228] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/10/2020] [Revised: 06/15/2021] [Accepted: 06/15/2021] [Indexed: 10/20/2022]
Abstract
The ability to encode space is a crucial aspect of interacting with the external world, and it appears to be fundamental for the correct development of the capacity to integrate different spatial reference frames. Spatial reference frames seem to be present in all the sensory modalities. However, it has been demonstrated that different sensory modalities follow different developmental courses. To date, these courses have been investigated only in people with sensory impairments, where a possible bias arises from compensatory strategies and it is complicated to assess the exact age at which these skills emerge. For these reasons, we investigated the development of the allocentric frame in the auditory domain in a group of typically developing children aged 6-10 years. To do so, we used an auditory Simon task, a paradigm that involves implicit spatial processing, and asked children to perform the task in both the uncrossed and crossed hands postures. We demonstrated that the crossed hands posture affected performance only in younger children (6-7 years), whereas at 10 years of age children performed like adults and were not affected by posture. Moreover, we found that performance on this task correlated with age and with developmental differences in spatial abilities. Our results support the hypothesis that the developmental course of auditory spatial cognition is similar to that of the visual modality, as reported in the literature.
Collapse
Affiliation(s)
- Alice Bollini
- Unit for Visually Impaired People, Center for Human Technologies, Istituto Italiano di Tecnologia, 16163 Genova, Italy.
| | - Claudio Campus
- Unit for Visually Impaired People, Center for Human Technologies, Istituto Italiano di Tecnologia, 16163 Genova, Italy
| | - Monica Gori
- Unit for Visually Impaired People, Center for Human Technologies, Istituto Italiano di Tecnologia, 16163 Genova, Italy
| |
Collapse
|
22
|
Abstract
Accurate localization of touch requires the integration of two reference frames: an internal (e.g., anatomical) one and an external (e.g., spatial) one. Using a tactile temporal order judgement task with the hands crossed over the midline, we investigated the integration of these two reference frames. We manipulated the reliability of the visual and vestibular information, both of which contribute to the external reference frame. Visual information was manipulated between experiments (Experiment 1 was done with full vision and Experiment 2 while wearing a blindfold). Vestibular information was manipulated in both experiments by having the two groups of participants complete the task both in an upright posture and while lying on their side. Using a Bayesian hierarchical model, we estimated the perceptual weight applied to these reference frames. Laying participants on their side reduced the weight applied to the external reference frame and produced a smaller deficit; blindfolding resulted in similar reductions. These findings reinforce the importance of the visual system when weighting tactile reference frames, and highlight the importance of the vestibular system in this integration.
Collapse
Affiliation(s)
- Kaian Unwalla
- Department of Psychology, Neuroscience and Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada.
| | - Michelle L Cadieux
- Department of Psychology, Neuroscience and Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
| | - David I Shore
- Department of Psychology, Neuroscience and Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- Multisensory Perception Laboratory, a Division of the Multisensory Mind Inc., Hamilton, ON, Canada
| |
Collapse
|
23
|
Measuring the sensitivity of tactile temporal order judgments in sighted and blind participants using the adaptive psi method. Atten Percept Psychophys 2021; 83:2995-3007. [PMID: 34036536 DOI: 10.3758/s13414-021-02301-5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 03/14/2021] [Indexed: 11/08/2022]
Abstract
Spatial locations of somatosensory stimuli are coded according to somatotopic (anatomical distribution of the sensory receptors on the skin surface) and spatiotopic (position of the body parts in external space) reference frames. This has mostly been evidenced by means of temporal order judgment (TOJ) tasks in which participants discriminate the temporal order of two tactile stimuli, one applied to each hand. Because crossing the hands generates a conflict between anatomical and spatial responses, TOJ performance decreases in that posture, except in congenitally blind people, suggesting a role of visual experience in somatosensory perception. In previous TOJ studies, stimuli were generally presented using the method of constant stimuli, that is, the repetition of a predefined sample of stimulus-onset asynchronies (SOAs) separating the two stimuli. This method has the disadvantage that a large number of trials is needed to obtain reliable data when aiming to dissociate the performance of groups characterized by different cognitive abilities, because each SOA among a large variety of different SOAs must be presented the same number of times irrespective of the participant's performance. This study aimed to replicate previous tactile TOJ data in sighted and blind participants with the adaptive psi method, in order to validate a novel method that adapts the presented SOA according to the participant's performance. This makes it possible to precisely estimate the temporal sensitivity of each participant while the presented stimuli are adapted to the participant's individual discrimination threshold. We successfully replicated previous findings in both sighted and blind participants, corroborating previous data with a more suitable psychophysical tool.
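The difference between the method of constant stimuli and an adaptive procedure of this kind can be sketched as follows. The code below is a simple entropy-minimizing trial-placement loop in the spirit of the psi method, not the authors' implementation: the psychometric function, threshold grid, and all parameter values are illustrative assumptions. Each trial's SOA is chosen to be maximally informative about the participant's threshold, rather than drawn from a fixed predefined sample.

```python
import math

def psychometric(soa, threshold, slope=0.05):
    """Illustrative P(correct) for a TOJ trial at a given SOA (logistic)."""
    return 0.5 + 0.5 / (1.0 + math.exp(-slope * (soa - threshold)))

def entropy(posterior):
    """Shannon entropy of a discrete posterior over threshold candidates."""
    return -sum(p * math.log(p) for p in posterior if p > 0)

def update(posterior, thresholds, soa, correct):
    """Bayesian update of the threshold posterior after one trial."""
    post = [p * (psychometric(soa, t) if correct else 1 - psychometric(soa, t))
            for p, t in zip(posterior, thresholds)]
    z = sum(post)
    return [q / z for q in post]

def next_soa(posterior, thresholds, soas):
    """Pick the SOA minimizing expected posterior entropy (psi-style placement)."""
    best_soa, best_h = None, float("inf")
    for soa in soas:
        # Predictive probability of a correct response at this SOA.
        p_correct = sum(p * psychometric(soa, t) for p, t in zip(posterior, thresholds))
        h = 0.0
        for correct, p_out in ((True, p_correct), (False, 1 - p_correct)):
            if p_out <= 0:
                continue
            h += p_out * entropy(update(posterior, thresholds, soa, correct))
        if h < best_h:
            best_soa, best_h = soa, h
    return best_soa

# One step of the loop: start from a flat prior, place a trial, observe, update.
thresholds = list(range(0, 201, 20))            # candidate thresholds in ms
prior = [1.0 / len(thresholds)] * len(thresholds)
soas = [10, 40, 80, 120, 160]                   # SOAs available to the procedure
chosen = next_soa(prior, thresholds, soas)
posterior = update(prior, thresholds, chosen, correct=True)
```

Iterating `next_soa`/`update` concentrates the posterior around the participant's threshold, which is why far fewer trials are needed than with constant stimuli, where every SOA is presented equally often regardless of performance.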
Collapse
|
24
|
Moharramipour A, Kitazawa S. What Underlies a Greater Reversal in Tactile Temporal Order Judgment When the Hands Are Crossed? A Structural MRI Study. Cereb Cortex Commun 2021; 2:tgab025. [PMID: 34296170 PMCID: PMC8152922 DOI: 10.1093/texcom/tgab025] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2020] [Revised: 03/31/2021] [Accepted: 03/31/2021] [Indexed: 02/02/2023] Open
Abstract
Our subjective temporal order of two successive tactile stimuli, delivered one to each hand, is often inverted when our hands are crossed. However, there is great variability among individuals. We addressed the question of why some show almost complete reversal while others show little reversal. To this end, we obtained structural magnetic resonance imaging data from 42 participants who also participated in the tactile temporal order judgment (TOJ) task. We extracted the cortical thickness and the convoluted surface area as cortical characteristics in 68 regions. We found that participants with a thinner, larger, and more convoluted cerebral cortex in 10 regions, including the right pars orbitalis, right and left postcentral gyri, left precuneus, left superior parietal lobule, right middle temporal gyrus, left superior temporal gyrus, right cuneus, left supramarginal gyrus, and right rostral middle frontal gyrus, showed a smaller degree of judgment reversal. In light of major theoretical accounts, we suggest that cortical elaboration in the aforementioned regions improves crossed-hand TOJ performance through better integration of the tactile stimuli with the correct spatial representations in the left parietal regions, better representation of spatial information in the postcentral gyrus, or improvement of top-down inhibitory control by the right pars orbitalis.
Collapse
Affiliation(s)
- Ali Moharramipour
- Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, Osaka University, Osaka 565-0871, Japan
| | - Shigeru Kitazawa
- Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, Osaka University, Osaka 565-0871, Japan
- Department of Brain Physiology, Graduate School of Medicine, Osaka University, Osaka 565-0871, Japan
- Center for Information and Neural Networks, National Institute of Information and Communications Technology, Osaka University, Osaka 565-0871, Japan
| |
Collapse
|
25
|
Kimura T. Multiple Spatial Coordinates Influence the Prediction of Tactile Events Facilitated by Approaching Visual Stimuli. Multisens Res 2021; 34:1-21. [PMID: 33725668 DOI: 10.1163/22134808-bja10045] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/02/2020] [Accepted: 02/23/2021] [Indexed: 11/19/2022]
Abstract
Interaction with other sensory information is important for the prediction of tactile events. Recent studies have reported that the approach of visual information toward the body facilitates the prediction of subsequent tactile events. However, the processing of tactile events is influenced by multiple spatial coordinates, and it remains unclear how this approach effect influences tactile events in different spatial coordinates, i.e., spatial reference frames. We investigated the relationship between the prediction of a tactile stimulus via this approach effect and spatial coordinates by comparing ERPs. Participants were asked to place their arms on a desk and to respond to tactile stimuli, which were presented to the left (or right) index finger with a high probability (80%) or to the opposite index finger with a low probability (20%). Before the presentation of each tactile stimulus, visual stimuli sequentially approached the hand to which the high-probability tactile stimulus was presented. In the uncrossed condition, each hand was placed on the corresponding side. In the crossed condition, the hands were crossed and placed on the opposite side, i.e., the left (right) hand on the right (left) side. Thus, the spatial location of the tactile stimulus and the hand was consistent in the uncrossed condition and inconsistent in the crossed condition. The results showed that N1 amplitudes elicited by high-probability tactile stimuli decreased only in the uncrossed condition. These results suggest that the prediction of a tactile stimulus facilitated by approaching visual information is influenced by multiple spatial coordinates.
Collapse
Affiliation(s)
- Tsukasa Kimura
- The Institute of Scientific and Industrial Research (ISIR), Osaka University, Ibaraki, 567-0047, Japan
| |
Collapse
|
26
|
Manfron L, Vanderclausen C, Legrain V. No Evidence for an Effect of the Distance Between the Hands on Tactile Temporal Order Judgments. Perception 2021; 50:294-307. [PMID: 33653176 DOI: 10.1177/0301006621998877] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Localizing somatosensory stimuli is an important process, as it allows us to spatially guide our actions toward the object entering into contact with the body. Accordingly, the positions of tactile inputs are coded according to both somatotopic and spatiotopic representations, the latter considering the position of the stimulated limbs in external space. The spatiotopic representation has often been evidenced by means of temporal order judgment (TOJ) tasks: participants' judgments about the order of appearance of two successive somatosensory stimuli are less accurate when the hands are crossed over the body midline than when uncrossed, but also when the hands are placed close together compared with farther apart. Moreover, these postural effects might depend on vision of the stimulated limbs. The aim of this study was to test the influence of seeing the hands on the modulation of tactile TOJ by the spatial distance between the stimulated limbs. The results showed no influence of the distance between the stimulated hands on TOJ performance, and prevented us from concluding whether vision of the hands affects TOJ performance, or whether these variables interact. The reliability of such a distance effect for investigating the spatial representations of tactile inputs is questioned.
Collapse
|
27
|
Di Cosmo G, Costantini M, Ambrosini E, Salone A, Martinotti G, Corbo M, Di Giannantonio M, Ferri F. Body-environment integration: Temporal processing of tactile and auditory inputs along the schizophrenia continuum. J Psychiatr Res 2021; 134:208-214. [PMID: 33418447 DOI: 10.1016/j.jpsychires.2020.12.034] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/05/2020] [Revised: 11/06/2020] [Accepted: 12/09/2020] [Indexed: 12/20/2022]
Abstract
According to the dimensional approach to psychosis, there is a continuum from low schizotypy to schizophrenia. The temporal aspect of sensory processing seems to be compromised across this continuum, as suggested by different studies separately investigating unisensory and multisensory domains. Most of these studies have so far focused primarily on the temporal processing of visual and auditory stimuli, either in schizotypy or schizophrenia, while leaving the tactile domain and the integration of touch with other senses mostly unexplored. Given the relevance of body-related perceptual abnormalities for psychosis proneness, we aimed to fill this gap in the literature across two studies. We asked participants with increasing levels of schizotypy (study 1) and schizophrenia patients (study 2) to perform three simultaneity judgement tasks: a unimodal tactile task, a unimodal auditory task, and a bimodal audio-tactile task. Each task allowed us to estimate a simultaneity range (SR) as a proxy of the individual tolerance to asynchronies in the tactile, auditory, and audio-tactile domains, respectively. Results showed larger SRs as the level of schizotypy increased. Specifically, the linear effect of schizotypy levels on the audio-tactile task was stronger than on the auditory task, which in turn was greater than the effect on the tactile task (study 1). By contrast, schizophrenia patients showed larger SRs than controls in all three tasks (study 2). The current study is the first empirical investigation, across the continuum from low schizotypy to schizophrenia, of the tolerance to asynchronies in the processing of external (auditory) and body-related (tactile) inputs.
Affiliation(s)
- Giulio Di Cosmo, Institute for Advanced Biomedical Technologies, ITAB, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy
- Marcello Costantini, Institute for Advanced Biomedical Technologies, ITAB, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy; Department of Psychological, Health and Territorial Sciences, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy
- Anatolia Salone, Institute for Advanced Biomedical Technologies, ITAB, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy; Department of Neuroscience, Imaging and Clinical Sciences, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy
- Giovanni Martinotti, Institute for Advanced Biomedical Technologies, ITAB, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy; Department of Neuroscience, Imaging and Clinical Sciences, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy
- Mariangela Corbo, Institute for Advanced Biomedical Technologies, ITAB, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy; Department of Neuroscience, Imaging and Clinical Sciences, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy
- Massimo Di Giannantonio, Institute for Advanced Biomedical Technologies, ITAB, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy; Department of Neuroscience, Imaging and Clinical Sciences, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy
- Francesca Ferri, Institute for Advanced Biomedical Technologies, ITAB, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy; Department of Neuroscience, Imaging and Clinical Sciences, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy
28
Abstract
Hearing aid and cochlear implant (CI) users often struggle to locate and segregate sounds. The dominant sound-localisation cues are time and intensity differences across the ears. A recent study showed that CI users locate sounds substantially better when these cues are provided through haptic stimulation on each wrist. However, the sensitivity of the wrists to these cues and the robustness of this sensitivity to aging is unknown. The current study showed that time difference sensitivity is much poorer across the wrists than across the ears and declines with age. In contrast, high sensitivity to across-wrist intensity differences was found that was robust to aging. This high sensitivity was observed across a range of stimulation intensities for both amplitude modulated and unmodulated sinusoids and matched across-ear intensity difference sensitivity for normal-hearing individuals. Furthermore, the usable dynamic range for haptic stimulation on the wrists was found to be around four times larger than for CIs. These findings suggest that high-precision haptic sound-localisation can be achieved, which could aid many hearing-impaired listeners. Furthermore, the finding that high-fidelity across-wrist intensity information can be transferred could be exploited in human-machine interfaces to enhance virtual reality and improve remote control of military, medical, or research robots.
29
Fletcher MD. Using haptic stimulation to enhance auditory perception in hearing-impaired listeners. Expert Rev Med Devices 2020; 18:63-74. [PMID: 33372550] [DOI: 10.1080/17434440.2021.1863782]
Abstract
INTRODUCTION: Hearing-assistive devices, such as hearing aids and cochlear implants, transform the lives of hearing-impaired people. However, users often struggle to locate and segregate sounds. This leads to impaired threat detection and an inability to understand speech in noisy environments. Recent evidence suggests that segregation and localization can be improved by providing missing sound-information through haptic stimulation.
AREAS COVERED: This article reviews the evidence that haptic stimulation can effectively provide sound information. It then discusses the research and development required for this approach to be implemented in a clinically viable device. This includes discussion of what sound information should be provided and how that information can be extracted and delivered.
EXPERT OPINION: Although this research area has only recently emerged, it builds on a significant body of work showing that sound information can be effectively transferred through haptic stimulation. Current evidence suggests that haptic stimulation is highly effective at providing missing sound-information to cochlear implant users. However, a great deal of work remains to implement this approach in an effective wearable device. If successful, such a device could offer an inexpensive, noninvasive means of improving educational, work, and social experiences for hearing-impaired individuals, including those without access to hearing-assistive devices.
Affiliation(s)
- Mark D Fletcher, University of Southampton Auditory Implant Service, Southampton, UK; Institute of Sound and Vibration Research, University of Southampton, Southampton, UK
30
Togoli I, Marlair C, Collignon O, Arrighi R, Crollen V. Tactile numerosity is coded in external space. Cortex 2020; 134:43-51. [PMID: 33249299] [DOI: 10.1016/j.cortex.2020.10.008]
Abstract
Humans, and several non-human species, possess the ability to make approximate but reliable estimates of the number of objects around them. Like other perceptual features, numerosity perception is susceptible to adaptation: exposure to a high number of items causes underestimation of the numerosity of a subsequent set of items, and vice versa. Several studies have investigated adaptation in the auditory and visual modalities, in which stimuli are preferentially encoded in an external coordinate system. As tactile stimuli are primarily coded in an internal (body-centered) reference frame, here we ask whether tactile numerosity adaptation operates on internal or external spatial coordinates, as it does in vision and audition. Twenty participants performed an adaptation task with their right hand located either in the right (uncrossed) or left (crossed) hemispace, so that the two hands occupied either two completely different positions or the same position in space, respectively. Tactile adaptor and test stimuli were passively delivered either to the same (adapted) or different (non-adapted) hands. Our results show a clear signature of tactile numerosity adaptation aftereffects, with over- and under-estimation according to the adaptation rate (low and high, respectively). In the uncrossed position, we observed stronger adaptation when adaptor and test stimuli were delivered to the adapted hand. However, when both hands were aligned in the same spatial position (crossed condition), the magnitude of adaptation was similar irrespective of which hand received the adaptor and test stimuli. These results demonstrate that numerosity information is automatically coded in external coordinates even in the tactile modality, suggesting that such a spatial reference frame is an intrinsic property of numerosity processing irrespective of sensory modality.
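An over-/under-estimation aftereffect of this kind can be summarized with a simple normalized index contrasting estimates after low- and high-numerosity adaptors. A hypothetical sketch (the function, the normalization, and all numbers are illustrative assumptions, not the authors' measure):

```python
import numpy as np

def adaptation_index(est_after_low, est_after_high, baseline):
    """Normalized aftereffect: positive values mean overestimation after a
    low-numerosity adaptor and/or underestimation after a high one."""
    est_after_low = np.asarray(est_after_low, float)
    est_after_high = np.asarray(est_after_high, float)
    return (est_after_low.mean() - est_after_high.mean()) / baseline

# Hypothetical estimates of a 10-item test sequence
low = [12, 13, 11, 12]    # after adapting to few items: overestimation
high = [8, 7, 9, 8]       # after adapting to many items: underestimation
idx = adaptation_index(low, high, baseline=10)
```

Comparing this index between adapted and non-adapted hands, and between crossed and uncrossed postures, is one way to express the pattern the abstract describes.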
Affiliation(s)
- Irene Togoli, International School for Advanced Studies (SISSA), Trieste, Italy
- Cathy Marlair, Psychological Sciences Research Institute (IPSY) and Institute of NeuroScience (IoNS), Université Catholique de Louvain, Louvain-la-Neuve, Belgium
- Olivier Collignon, Psychological Sciences Research Institute (IPSY) and Institute of NeuroScience (IoNS), Université Catholique de Louvain, Louvain-la-Neuve, Belgium
- Roberto Arrighi, University of Florence, Department of Neuroscience, Psychology and Child Health, Florence, Italy
- Virginie Crollen, Psychological Sciences Research Institute (IPSY) and Institute of NeuroScience (IoNS), Université Catholique de Louvain, Louvain-la-Neuve, Belgium
31
Unwalla K, Kearney H, Shore DI. Reliability of the Crossed-Hands Deficit in Tactile Temporal Order Judgements. Multisens Res 2020; 34:387-421. [PMID: 33706262] [DOI: 10.1163/22134808-bja10039]
Abstract
Crossing the hands over the midline impairs performance on a tactile temporal order judgement (TOJ) task, producing the crossed-hands deficit. This deficit results from a conflict between two reference frames for coding stimulus location: one internal (somatotopic) and the other external (spatial). The substantial individual differences observed in the crossed-hands deficit highlight differential reliance on these reference frames. For example, women have been reported to place greater emphasis on the external reference frame than men, resulting in a larger crossed-hands deficit for women. It has also been speculated that individuals with an eating disorder place greater weight on the external reference frame. Further exploration of individual differences in reference frame weighting using a tactile TOJ task requires that the reliability of the task be established. In Experiment 1, we investigated the reliability of the tactile TOJ task across two sessions separated by one week and found high reliability in the magnitude of the crossed-hands deficit. In Experiment 2, we report the split-half reliability across multiple experiments (both published and unpublished). Overall, tactile TOJ reliability was high: experiments with small to moderate crossed-hands deficits showed good reliability, and those with larger deficits showed even higher reliability. Researchers interested in individual differences in the use of the internal and external reference frames should try to maximize the size of the effect.
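The magnitude of a crossed-hands deficit in such TOJ tasks is commonly quantified by fitting a cumulative Gaussian to the response proportions in each posture and comparing just-noticeable differences (JNDs). A minimal sketch under those standard assumptions (illustrative; not the authors' analysis pipeline, and all numbers are made up):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cum_gauss(soa, pss, sd):
    """P('right hand first') as a function of SOA (right-leading positive, ms)."""
    return norm.cdf(soa, loc=pss, scale=sd)

def jnd(soas, p_right_first):
    """Just-noticeable difference: half the 25%-75% span of the fitted
    psychometric function, i.e. norm.ppf(0.75) * fitted sd."""
    (pss, sd), _ = curve_fit(cum_gauss, soas, p_right_first, p0=[0.0, 50.0])
    return norm.ppf(0.75) * abs(sd)

soas = np.array([-200, -90, -55, -30, -15, 15, 30, 55, 90, 200], float)
uncrossed = norm.cdf(soas, 0, 40)   # steep function: precise ordering
crossed = norm.cdf(soas, 0, 120)    # shallow function: crossed-hands deficit
deficit = jnd(soas, crossed) - jnd(soas, uncrossed)
```

Test-retest or split-half reliability of the deficit would then be assessed by correlating `deficit` values across sessions or trial halves.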
Affiliation(s)
- Kaian Unwalla, Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- Hannah Kearney, Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
- David I Shore, Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main Street West, Hamilton, ON, L8S 4K1, Canada
32
Maij F, Seegelke C, Medendorp WP, Heed T. External location of touch is constructed post-hoc based on limb choice. eLife 2020; 9:e57804. [PMID: 32945257] [PMCID: PMC7561349] [DOI: 10.7554/elife.57804]
Abstract
When humans indicate on which hand a tactile stimulus occurred, they often err when their hands are crossed. This finding seemingly supports the view that the automatically determined touch location in external space affects limb assignment: the crossed right hand is localized in left space, and this conflict presumably provokes hand assignment errors. Here, participants judged on which hand the first of two stimuli, presented during a bimanual movement, had occurred, and then indicated its external location by a reach-to-point movement. When participants incorrectly chose the hand stimulated second, they pointed to where that hand had been at the correct, first time point, though no stimulus had occurred at that location. This behavior suggests that stimulus localization depended on hand assignment, not vice versa. It is, thus, incompatible with the notion of automatic computation of external stimulus location upon occurrence. Instead, humans construct external touch location post-hoc and on demand.
Affiliation(s)
- Femke Maij, Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
- Christian Seegelke, Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany; Center for Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
- W Pieter Medendorp, Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
- Tobias Heed, Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany; Center for Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
33
Wada M, Ikeda H, Kumagaya S. Atypical Effects of Visual Interference on Tactile Temporal Order Judgment in Individuals With Autism Spectrum Disorder. Multisens Res 2020; 34:129-151. [PMID: 33706272] [DOI: 10.1163/22134808-bja10033]
Abstract
Visual distractors interfere with tactile temporal order judgment (TOJ) at moderately short stimulus onset asynchronies (SOAs) in typically developing participants. Presenting a rubber hand in a forward direction relative to the participant's hand enhances this effect, while presenting it in an inverted direction weakens it. Individuals with autism spectrum disorder (ASD) show atypical multisensory processing; however, the effects of visual interference on tactile TOJ in ASD remain unclear. In this study, we examined these effects. Two successive tactile stimuli were delivered to the index and ring fingers of the participant's right hand, placed in an opaque box. A rubber hand was placed on the box in a forward or inverted direction. Concurrently, visual stimuli from light-emitting diodes on the fingers of the rubber hand were delivered in a congruent or incongruent order. Participants were required to judge the temporal order of the tactile stimuli regardless of the visual distractors. In the absence of a visual stimulus, participants with ASD tended, compared with typically developing (TD) controls, to judge simultaneous stimuli as if the ring finger had been stimulated first, and congruent visual stimuli eliminated this bias. When incongruent visual stimuli were delivered, judgment was notably reversed in participants with ASD, regardless of the direction of the rubber hand. These findings demonstrate that visual interference has considerable effects on tactile TOJ in individuals with ASD.
Affiliation(s)
- Makoto Wada, Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, Tokorozawa, Saitama, 359-8555, Japan; Faculty of Informatics, Shizuoka University, Hamamatsu, Shizuoka 432-8011, Japan
- Hanako Ikeda, Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, Tokorozawa, Saitama, 359-8555, Japan
- Shinichiro Kumagaya, Research Center for Advanced Science and Technology, The University of Tokyo, Meguro, Tokyo 153-8904, Japan
34
Immersive virtual reality reveals that visuo-proprioceptive discrepancy enlarges the hand-centred peripersonal space. Neuropsychologia 2020; 146:107540. [PMID: 32593721] [DOI: 10.1016/j.neuropsychologia.2020.107540]
Abstract
Vision and proprioception, which inform the system about the body's position in space, seem crucial in defining the boundary of the peripersonal space (PPS). What happens to the PPS representation when a conflict between vision and proprioception arises? We capitalize on immersive virtual reality to dissociate vision and proprioception by presenting the participants' 3D hand image in positions congruent or incongruent with the participants' real hand. To measure the hand-centred PPS, we exploit the multisensory integration that occurs when visual stimuli are delivered simultaneously with tactile stimuli applied to a body district, i.e., the visual enhancement of touch (VET). Participants are instructed to respond to tactile stimuli while ignoring visual stimuli (a red LED), which can appear either near to or far from the hand receiving the tactile (electrical) stimuli. The results show that, when vision and proprioception are congruent (i.e., real and virtual hand coincide), a space-dependent modulation of the VET effect occurs (faster responses when visual stimuli are near to than far from the stimulated hand). Conversely, when vision and proprioception are incongruent (i.e., real and virtual hand are displaced), a comparable VET effect is observed when visual stimuli occur near to the real hand and when they occur far from it but close to the virtual hand. These findings, also confirmed by the independent estimate of a Bayesian Causal Inference model, suggest that, when the visuo-proprioceptive discrepancy makes the coding of the hand position less precise, the hand-centred PPS is enlarged, likely to optimize reactions to external events.
35
Di Pino G, Romano D, Spaccasassi C, Mioli A, D'Alonzo M, Sacchetti R, Guglielmelli E, Zollo L, Di Lazzaro V, Denaro V, Maravita A. Sensory- and Action-Oriented Embodiment of Neurally-Interfaced Robotic Hand Prostheses. Front Neurosci 2020; 14:389. [PMID: 32477046] [PMCID: PMC7232597] [DOI: 10.3389/fnins.2020.00389]
Abstract
Embodiment is the percept that something not originally belonging to the self becomes part of the body. Feeling embodiment for a prosthesis may counteract amputees' altered body image and increase prosthesis acceptability. Prosthesis embodiment was studied longitudinally in an amputee receiving feedback through intraneural and perineural multichannel electrodes implanted in her stump. Three factors (invasive vs non-invasive stimulation, training, and anthropomorphism) were tested through two multisensory integration tasks: visuo-tactile integration (VTI) and the crossed-hands effect in temporal order judgment (TOJ), the former more sensitive to the extension of a safe margin around the body and the latter to action-oriented remapping. Results from the amputee participant were compared with those from healthy controls. Testing the participant with intraneural stimulation produced an extension of peripersonal space, a sign of prosthesis embodiment. One month of training extended the peripersonal space selectively on the side wearing the prosthesis. Both the more and the less anthropomorphic prostheses benefited from intraneural feedback and extended the peripersonal space. However, the worsening of TOJ performance following arm crossing was present only when wearing the more trained, though less anthropomorphic, prosthesis, suggesting that training was critical for our participant to achieve operative, tool-like embodiment.
Affiliation(s)
- Giovanni Di Pino, Research Unit of Neurophysiology and Neuroengineering of Human-Technology Interaction (NeXTlab), Università Campus Bio-Medico di Roma, Rome, Italy
- Daniele Romano, Psychology Department & NeuroMi, Milan Center for Neuroscience, University of Milan-Bicocca, Milan, Italy
- Chiara Spaccasassi, Psychology Department & NeuroMi, Milan Center for Neuroscience, University of Milan-Bicocca, Milan, Italy
- Alessandro Mioli, Research Unit of Neurophysiology and Neuroengineering of Human-Technology Interaction (NeXTlab), Università Campus Bio-Medico di Roma, Rome, Italy
- Marco D'Alonzo, Research Unit of Neurophysiology and Neuroengineering of Human-Technology Interaction (NeXTlab), Università Campus Bio-Medico di Roma, Rome, Italy
- Rinaldo Sacchetti, National Institute for Insurance Against Accidents at Work, Bologna, Italy
- Eugenio Guglielmelli, Research Unit of Advanced Robotics and Human-Centred Technologies, Università Campus Bio-Medico di Roma, Rome, Italy
- Loredana Zollo, Research Unit of Advanced Robotics and Human-Centred Technologies, Università Campus Bio-Medico di Roma, Rome, Italy
- Vincenzo Di Lazzaro, Research Unit of Neurology, Neurophysiology, Neurobiology, Università Campus Bio-Medico di Roma, Rome, Italy
- Vincenzo Denaro, Research Unit of Orthopedics and Traumatology, Università Campus Bio-Medico di Roma, Rome, Italy
- Angelo Maravita, Psychology Department & NeuroMi, Milan Center for Neuroscience, University of Milan-Bicocca, Milan, Italy
36
De Paepe AL, Legrain V, Van der Biest L, Hollevoet N, Van Tongel A, De Wilde L, Jacobs H, Crombez G. An investigation of perceptual biases in complex regional pain syndrome. PeerJ 2020; 8:e8819. [PMID: 32274265] [PMCID: PMC7130113] [DOI: 10.7717/peerj.8819]
Abstract
Patients with complex regional pain syndrome (CRPS) report cognitive difficulties affecting the ability to represent, perceive and use their affected limb. Moseley, Gallace & Spence (2009) observed that CRPS patients tend to bias the perception of tactile stimulation away from the pathological limb. Interestingly, this bias was reversed when CRPS patients were asked to cross their arms, implying that it is embedded in a complex representation of the body that takes the position of body parts into account. Other studies have failed to replicate this finding (Filbrich et al., 2017) or have even found a bias in the opposite direction (Sumitani et al., 2007). Moreover, perceptual biases in CRPS patients have rarely been compared to those of other chronic pain patients. Chronic pain patients are often characterized by an excessive focus of attention on bodily sensations; we might therefore expect non-CRPS pain patients to show a bias towards, rather than away from, their affected limb. The aim of this study was to replicate the study of Moseley, Gallace & Spence (2009) and to extend it by comparing perceptual biases in a CRPS group with two non-CRPS pain control groups (chronic unilateral wrist and shoulder pain patients). In a temporal order judgment (TOJ) task, participants reported which of two tactile stimuli, one applied to either hand at various intervals, was perceived as occurring first. TOJs were made either with the arms in a normal (uncrossed) position or with the arms crossed over the body midline. We found no consistent perceptual biases in either patient group in either condition (crossed/uncrossed). Individual differences were large and might, at least partly, be explained by other variables, such as pain duration and temperature differences between the pathological and non-pathological hand. Additional studies need to take these variables into account by, for example, comparing biases in CRPS (and non-CRPS) patients in an acute versus a chronic pain state.
Affiliation(s)
- Annick L. De Paepe, Department of Experimental-Clinical and Health Psychology, Faculty of Psychology and Educational Sciences, Ghent University, Ghent, Belgium
- Valéry Legrain, Institute of Neuroscience, Université Catholique de Louvain, Brussels, Belgium
- Lien Van der Biest, Department of Experimental-Clinical and Health Psychology, Faculty of Psychology and Educational Sciences, Ghent University, Ghent, Belgium
- Nadine Hollevoet, Department of Orthopaedic Surgery and Traumatology, Ghent University Hospital, Ghent, Belgium
- Alexander Van Tongel, Department of Orthopaedic Surgery and Traumatology, Ghent University Hospital, Ghent, Belgium
- Lieven De Wilde, Department of Orthopaedic Surgery and Traumatology, Ghent University Hospital, Ghent, Belgium
- Herlinde Jacobs, Unit of Physical Medicine, AZ Maria Middelares Hospital, Ghent, Belgium
- Geert Crombez, Department of Experimental-Clinical and Health Psychology, Faculty of Psychology and Educational Sciences, Ghent University, Ghent, Belgium
37
Abstract
Recent studies have shown that body-representations can be altered by dynamic changes in sound. In the so-called “auditory Pinocchio illusion”, participants feel their finger to be longer when the action of pulling the finger is paired with a rising pitch. Here, we investigated whether preschool children, an age group in which multisensory body-representations are still being fine-tuned, are also sensitive to this illusion. In two studies, sixty adult and sixty child participants heard sounds rising or falling in pitch while the experimenter concurrently pulled or pressed their index finger on a vertical (Experiment 1) or horizontal axis (Experiment 2). Results showed that the illusion depended on axis and age: both adults and children reported their finger to be longer in Experiment 1, but not in Experiment 2. However, while in adults the feeling of finger elongation corresponded to a recalibration of the fingertip’s felt position upwards, this was not the case in children, who showed a dissociation between the feeling of finger elongation and the perceived fingertip position. Our results reveal that the “auditory Pinocchio illusion” is constrained to the vertical dimension and suggest that multisensory interactions contribute differently to subjective feelings and sense of position depending on developmental stage.
38
Cutaneous and stick rabbit illusions in individuals with autism spectrum disorder. Sci Rep 2020; 10:1665. [PMID: 32020035] [PMCID: PMC7000771] [DOI: 10.1038/s41598-020-58536-z]
Abstract
Prediction is the process by which future events are anticipated based on past events; in contrast, postdiction is the retrospective interpretation of past events based on later, more recent events. Theoretical models suggest that prediction and postdiction are similar. Previous studies suggest that prediction is impaired in individuals with autism spectrum disorder (ASD); however, it is unclear whether postdiction is also impaired. In this study, we evaluated postdiction in individuals with ASD using the cutaneous and stick rabbit illusion paradigms, in which the perceived location of a touch shifts postdictively in response to a subsequent touch stimulus. We observed significant cutaneous and stick rabbit illusions in both typically developing (TD) and ASD groups; therefore, postdiction is functional in individuals with ASD. Our results suggest that postdiction involves a different neuronal process than prediction. We also observed that the ASD group exhibited significantly larger individual differences than the TD group in the stick rabbit illusion, which is considered to reflect extension of the body schema to external objects. We discuss the implications of these individual differences among the ASD participants in the context of sports requiring interactions between the body and external objects.
39
Electro-Haptic Enhancement of Spatial Hearing in Cochlear Implant Users. Sci Rep 2020; 10:1621. [PMID: 32005889] [PMCID: PMC6994470] [DOI: 10.1038/s41598-020-58503-8]
Abstract
Cochlear implants (CIs) have enabled hundreds of thousands of profoundly hearing-impaired people to perceive sounds by electrically stimulating the auditory nerve. However, CI users are often very poor at locating sounds, which leads to impaired sound segregation and threat detection. We provided missing spatial hearing cues through haptic stimulation to augment the electrical CI signal. We found that this "electro-haptic" stimulation dramatically improved sound localisation. Furthermore, participants were able to effectively integrate spatial information transmitted through these two senses, performing better with combined audio and haptic stimulation than with either alone. Our haptic signal was presented to the wrists and could readily be delivered by a low-cost wearable device. This approach could provide a non-invasive means of improving outcomes for the vast majority of CI users who have only one implant, without the expense and risk of a second implantation.
40
Testing the exteroceptive function of nociception: The role of visual experience in shaping the spatial representations of nociceptive inputs. Cortex 2020; 126:26-38. [PMID: 32062141] [DOI: 10.1016/j.cortex.2019.12.024]
Abstract
Adequately localizing pain is crucial to protect the body against physical damage and to react to the stimulus in external space that caused the damage. Accordingly, it is hypothesized that nociceptive inputs are remapped from a somatotopic reference frame, representing the skin surface, towards a spatiotopic frame, representing the body parts in external space, an ability thought to be developed and shaped by early visual experience. To test this hypothesis, normally sighted and early blind participants performed temporal order judgment tasks in which they judged which of two nociceptive stimuli, applied to each hand's dorsum, was delivered first. Crucially, the tasks were performed with the hands either in an uncrossed posture or crossed over the body midline. While early blind participants were not affected by posture, performance of the normally sighted participants decreased in the crossed relative to the uncrossed condition. This indicates that nociceptive stimuli were automatically remapped into a spatiotopic representation that interfered with somatotopy in normally sighted individuals, whereas early blind participants seemed to rely mostly on a somatotopic representation to localize nociceptive inputs. Accordingly, the plasticity of the nociceptive system would not depend purely on bodily experience but also on crossmodal interactions between nociception and vision during early sensory experience.
41
Neural correlates of tactile simultaneity judgement: a functional magnetic resonance imaging study. Sci Rep 2019; 9:19481. [PMID: 31862896] [PMCID: PMC6925270] [DOI: 10.1038/s41598-019-54323-7]
Abstract
Simultaneity judgement (SJ) is a temporal discrimination task in which observers judge whether the interval between two stimuli is zero (simultaneous) or not (successive). Psychophysical studies suggest that SJ is adequate to probe the perceptual components of human time processing in pure form. Thus far, time-relevant neural correlates for tactile SJ are unclear. We performed functional magnetic resonance imaging (fMRI) to investigate the neural correlates of tactile SJ using tactile number judgement as a time-irrelevant control task. As our main result, we demonstrated that the right inferior parietal lobule (IPL) is an SJ-specific region. The right IPL was detected by both parametric and non-parametric statistical analyses, and its activation intensity fulfilled a strict statistical criterion. In addition, we observed that some left-dominant regions (e.g., the striatum) were specifically activated by successive stimuli during SJ. Meanwhile, no region was specifically activated by simultaneous stimuli during SJ. Accordingly, we infer that the neural process for tactile SJ is as follows: the striatum estimates the time interval between tactile stimuli; based on this interval, the right IPL discriminates the successiveness or simultaneity of the stimuli. Moreover, taking detailed behavioural results into account, we further discuss possible concurrent or alternative mechanisms that can explain the fMRI results.
|
42
|
Medendorp WP, Heed T. State estimation in posterior parietal cortex: Distinct poles of environmental and bodily states. Prog Neurobiol 2019; 183:101691. [DOI: 10.1016/j.pneurobio.2019.101691] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/20/2019] [Revised: 08/12/2019] [Accepted: 08/29/2019] [Indexed: 01/06/2023]
|
43
|
The influence of visual experience and cognitive goals on the spatial representations of nociceptive stimuli. Pain 2019; 161:328-337. [DOI: 10.1097/j.pain.0000000000001721] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/10/2023]
|
44
|
Hense M, Badde S, Köhne S, Dziobek I, Röder B. Visual and Proprioceptive Influences on Tactile Spatial Processing in Adults with Autism Spectrum Disorders. Autism Res 2019; 12:1745-1757. [PMID: 31507084 DOI: 10.1002/aur.2202] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/30/2019] [Revised: 06/25/2019] [Accepted: 08/14/2019] [Indexed: 12/19/2022]
Abstract
Children with autism spectrum disorders (ASDs) often exhibit altered representations of the external world. Consistently, when localizing touch, children with ASDs were less influenced than their peers by changes of the stimulated limb's location in external space [Wada et al., Scientific Reports 2015, 4(1), 5985]. However, given the protracted development of an external-spatial dominance in tactile processing in typically developing children, this difference might reflect a developmental delay rather than a set suppression of external space in ASDs. Here, adults with ASDs and matched control participants completed (a) the tactile temporal order judgment (TOJ) task previously used to test external-spatial representation of touch in children with ASDs and (b) a tactile-visual cross-modal congruency (CC) task which assesses benefits of task-irrelevant visual stimuli on tactile localization in external space. In both experiments, participants localized tactile stimuli to the fingers of each hand, while holding their hands either crossed or uncrossed. Performance differences between hand postures reflect the influence of external-spatial codes. In both groups, tactile TOJ performance markedly decreased when participants crossed their hands, and CC effects were especially large if the visual stimulus was presented at the same side of external space as the task-relevant touch. The absence of group differences was statistically confirmed using Bayesian statistical modeling: adults with ASDs weighted external-spatial codes comparably to typically developed adults during tactile and visual-tactile spatio-temporal tasks. Thus, atypicalities in the spatial coding of touch for children with ASDs appear to reflect a developmental delay rather than a stable characteristic of ASD.
LAY SUMMARY: A touched limb's location can be described twofold, with respect to the body (right hand) or the external world (right side). Children and adolescents with autism spectrum disorder (ASD) reportedly rely less than their peers on the external world. Here, adults with and without ASDs completed two tactile localization tasks. Both groups relied to the same degree on external world locations. This opens the possibility that the tendency to relate touch to the external world is typical in individuals with ASDs but emerges with a delay.
Affiliation(s)
- Marlene Hense
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
| | - Stephanie Badde
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany.,Department of Psychology, New York University, New York, New York
| | - Svenja Köhne
- Berlin School of Mind and Brain, Department of Psychology, Humboldt University Berlin, Berlin, Germany
| | - Isabel Dziobek
- Berlin School of Mind and Brain, Department of Psychology, Humboldt University Berlin, Berlin, Germany
| | - Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
| |
|
45
|
Developmental changes in the perception of audiotactile simultaneity. J Exp Child Psychol 2019; 183:208-221. [DOI: 10.1016/j.jecp.2019.02.006] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/02/2018] [Revised: 01/29/2019] [Accepted: 02/13/2019] [Indexed: 11/23/2022]
|
46
|
Rahman MS, Yau JM. Somatosensory interactions reveal feature-dependent computations. J Neurophysiol 2019; 122:5-21. [DOI: 10.1152/jn.00168.2019] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Our ability to perceive and discriminate textures is based on the processing of high-frequency vibrations generated on the fingertip as it scans across a surface. Although much is known about the processing of vibration amplitude and frequency information when cutaneous stimulation is experienced at a single location on the body, how these stimulus features are processed when touch occurs at multiple locations is poorly understood. We evaluated participants' ability to discriminate tactile cues (100-300 Hz) on one hand while they ignored distractor cues experienced on their other hand. We manipulated the relative positions of the hands to characterize how limb position influenced cutaneous touch interactions. In separate experiments, participants judged either the frequency or intensity of mechanical vibrations. We found that vibrations experienced on one hand always systematically modulated the perception of vibrations on the other hand. Notably, bimanual interaction patterns and their sensitivity to hand locations differed according to stimulus feature. Somatosensory interactions in intensity perception were only marked by attenuation that was invariant to hand position manipulations. In contrast, interactions in frequency perception consisted of both bias and sensitivity changes that were more pronounced when the hands were held in close proximity. We implemented models to infer the neural computations that mediate somatosensory interactions in the intensity and frequency dimensions. Our findings reveal obligatory and feature-dependent somatosensory interactions that may be supported by both feature-specific and feature-general operations. NEW & NOTEWORTHY Little is known about the neural computations mediating feature-specific sensory interactions between the hands. We show that vibrations experienced on one hand systematically modulate the perception of vibrations felt on the other hand. Critically, interaction patterns and their dependence on the relative positions of the hands differed depending on whether participants judged vibration intensity or frequency. These results, which we recapitulate with models, imply that somatosensory interactions are mediated by feature-dependent neural computations.
Affiliation(s)
| | - Jeffrey M. Yau
- Department of Neuroscience, Baylor College of Medicine, Houston, Texas
| |
|
47
|
Alpha-band oscillations reflect external spatial coding for tactile stimuli in sighted, but not in congenitally blind humans. Sci Rep 2019; 9:9215. [PMID: 31239467 PMCID: PMC6592921 DOI: 10.1038/s41598-019-45634-w] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/25/2018] [Accepted: 06/11/2019] [Indexed: 12/02/2022] Open
Abstract
We investigated the function of oscillatory alpha-band activity in the neural coding of spatial information during tactile processing. Sighted humans concurrently encode tactile location in skin-based and, after integration with posture, external spatial reference frames, whereas congenitally blind humans preferentially use skin-based coding. Accordingly, lateralization of alpha-band activity in parietal regions during attentional orienting in expectation of tactile stimulation reflected external spatial coding in sighted, but skin-based coding in blind humans. Here, we asked whether alpha-band activity plays a similar role in spatial coding for tactile processing, that is, after the stimulus has been received. Sighted and congenitally blind participants were cued to attend to one hand in order to detect rare tactile deviant stimuli at this hand while ignoring tactile deviants at the other hand and tactile standard stimuli at both hands. The reference frames encoded by oscillatory activity during tactile processing were probed by adopting either an uncrossed or crossed hand posture. In sighted participants, attended relative to unattended standard stimuli suppressed the power in the alpha-band over ipsilateral centro-parietal and occipital cortex. Hand crossing attenuated this attentional modulation predominantly over ipsilateral posterior-parietal cortex. In contrast, although contralateral alpha-activity was enhanced for attended versus unattended stimuli in blind participants, no crossing effects were evident in the oscillatory activity of this group. These findings suggest that oscillatory alpha-band activity plays a pivotal role in the neural coding of external spatial information for touch.
|
48
|
Badde S, Röder B, Heed T. Feeling a Touch to the Hand on the Foot. Curr Biol 2019; 29:1491-1497.e4. [PMID: 30955931 DOI: 10.1016/j.cub.2019.02.060] [Citation(s) in RCA: 26] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2017] [Revised: 02/15/2019] [Accepted: 02/27/2019] [Indexed: 10/27/2022]
Abstract
Where we perceive a touch putatively depends on topographic maps that code the touch's location on the skin [1] as well as its position in external space [2-5]. However, neither somatotopic nor external-spatial representations can account for atypical tactile percepts in some neurological patients and amputees; referral of touch to an absent or anaesthetized hand after stimulation of a foot [6, 7] or the contralateral hand [8-10] challenges the role of topographic representations when attributing touch to the limbs. Here, we show that even healthy adults systematically misattribute touch to other limbs. Participants received two tactile stimuli, each to a different limb (hand or foot), and reported which of the four limbs had been stimulated first. Hands and feet were either uncrossed or crossed to dissociate body-based and external-spatial representations [11-14]. Remarkably, participants regularly attributed the first touch to a limb that had received neither of the two stimuli. The erroneously reported, non-stimulated limb typically matched the correct limb with respect to limb type or body side. Touch was misattributed to non-stimulated limbs of the other limb type and body side only if they were placed at the correct limb's canonical (default) side of space. The touch's actual location in external space was irrelevant. These errors replicated across several contexts, and modeling linked them to incoming sensory evidence rather than to decision strategies. The results highlight the importance of the touched body part's identity and canonical location but challenge the role of external-spatial tactile representations when attributing touch to a limb.
Affiliation(s)
- Stephanie Badde
- Department of Psychology and Center of Neural Sciences, New York University, 6 Washington Place, New York, NY 10003, USA; Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany.
| | - Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
| | - Tobias Heed
- Biopsychology & Cognitive Neuroscience, Bielefeld University, Universitätsstrasse 25, 33615 Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1, 33619 Bielefeld, Germany
| |
|
49
|
Tamè L, Azañón E, Longo MR. A Conceptual Model of Tactile Processing across Body Features of Size, Shape, Side, and Spatial Location. Front Psychol 2019; 10:291. [PMID: 30863333 PMCID: PMC6399380 DOI: 10.3389/fpsyg.2019.00291] [Citation(s) in RCA: 37] [Impact Index Per Article: 7.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/05/2018] [Accepted: 01/29/2019] [Indexed: 11/30/2022] Open
Abstract
The processing of touch depends on multiple factors, such as the properties of the skin and the type of receptors stimulated, as well as features related to the actual configuration and shape of the body itself. A large body of research has focused on the effect that the nature of the stimuli has on tactile processing. Less research, however, has focused on features beyond the nature of the touch. In this review, we focus on body-related features that have received less, and more fragmented, attention. These include the symmetrical quality of the two sides of the body, the postural configuration of the body, as well as the size and shape of different body parts. We will describe what we consider three key aspects: (1) how and at which stages tactile information is integrated between different parts and sides of the body; (2) how tactile signals are integrated with online and stored postural configurations of the body, regarded as priors; (3) and how tactile signals are integrated with representations of body size and shape. Here, we describe how these different body dimensions affect integration of tactile information as well as guide motor behavior by integrating them in a single model of tactile processing. We review a wide range of neuropsychological, neuroimaging, and neurophysiological data and suggest a revised model of tactile integration on the basis of the one proposed previously by Longo et al.
Affiliation(s)
- Luigi Tamè
- Department of Psychological Sciences, Birkbeck University of London, London, United Kingdom.,School of Psychology, University of Kent, Canterbury, United Kingdom
| | - Elena Azañón
- Institute of Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany.,Center for Behavioral Brain Sciences, Magdeburg, Germany.,Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Magdeburg, Germany
| | - Matthew R Longo
- Department of Psychological Sciences, Birkbeck University of London, London, United Kingdom
| |
|
50
|
Sadibolova R, Tamè L, Longo MR. More than skin-deep: Integration of skin-based and musculoskeletal reference frames in localization of touch. J Exp Psychol Hum Percept Perform 2018; 44:1672-1682. [PMID: 30160504 PMCID: PMC6205026 DOI: 10.1037/xhp0000562] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2018] [Revised: 04/25/2018] [Accepted: 04/26/2018] [Indexed: 11/08/2022]
Abstract
The skin of the forearm is, in one sense, a flat 2-dimensional (2D) sheet, but in another sense approximately cylindrical, mirroring the 3-dimensional (3D) volumetric shape of the arm. The role of frames of reference based on the skin as a 2D sheet versus the musculoskeletal structure of the arm remains unclear. When we rotate the forearm from a pronated to a supinated posture, the skin on its surface is displaced. Thus, a marked location will slide with the skin across the underlying flesh, and a touch perceived at this location should follow this displacement if it is localized within a skin-based reference frame. We investigated, however, whether the perceived tactile locations were also affected by the rearrangement of the underlying musculoskeletal structure, that is, displaced medially and laterally on a pronated and supinated forearm, respectively. Participants pointed to perceived touches (Experiment 1), or marked them on a 3D size-matched forearm on a computer screen (Experiment 2). The perceived locations were indeed displaced medially after forearm pronation in both response modalities. This misperception was reduced (Experiment 1), or absent altogether (Experiment 2), in the supinated posture, when the actual stimulus grid moved laterally with the displaced skin. The grid was perceptually stretched along the medial-lateral axis and displaced distally, which suggests the influence of skin-based factors. Our study extends the tactile localization literature focused on the skin-based reference frame and on the effects of spatial positions of body parts by implicating musculoskeletal factors in the localization of touch on the body.
|