1. Alouit A, Gavaret M, Ramdani C, Lindberg PG, Dupin L. Cortical activations associated with spatial remapping of finger touch using EEG. Cereb Cortex 2024; 34:bhae161. [PMID: 38642106] [DOI: 10.1093/cercor/bhae161]
Abstract
The spatial coding of tactile information is functionally essential for touch-based shape perception and motor control. However, the spatiotemporal dynamics of how tactile information is remapped from the somatotopic reference frame in the primary somatosensory cortex to the spatiotopic reference frame remain unclear. This study investigated how hand position in space or posture influences cortical somatosensory processing. Twenty-two healthy subjects received electrical stimulation to the right thumb (D1) or little finger (D5) in three position conditions: palm down on the right side of the body (baseline), hand crossing the body midline (effect of position), and palm up (effect of posture). Somatosensory-evoked potentials (SEPs) were recorded using electroencephalography. One early-, two mid-, and two late-latency neurophysiological components were identified for both fingers: P50, P1, N125, P200, and N250. D1 and D5 showed different cortical activation patterns: compared with baseline, the crossing condition showed significant clustering at P1 for D1, and at P50 and N125 for D5; the change in posture showed a significant cluster at N125 for D5. Clusters predominated at centro-parietal electrodes. These results suggest that tactile remapping of fingers after electrical stimulation occurs around 100-125 ms in the parietal cortex.
Affiliations
- Anaëlle Alouit
  - Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
- Martine Gavaret
  - Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
  - GHU-Paris Psychiatrie et Neurosciences, Hôpital Sainte Anne, Service de neurophysiologie clinique, 1 Rue Cabanis, F-75014 Paris, France
- Céline Ramdani
  - Service de Santé des Armées, Institut de Recherche Biomédicale des Armées, 1 Place du Général Valérie André, 91220 Brétigny-sur-Orge, France
- Påvel G Lindberg
  - Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
- Lucile Dupin
  - Université Paris Cité, INCC UMR 8002, CNRS, 45 Rue des Saints-Pères, F-75006 Paris, France
2. Girondini M, Montanaro M, Gallace A. Spatial tactile localization depends on sensorimotor binding: preliminary evidence from virtual reality. Front Hum Neurosci 2024; 18:1354633. [PMID: 38445099] [PMCID: PMC10912179] [DOI: 10.3389/fnhum.2024.1354633]
Abstract
Introduction: Our brain continuously maps our body in space. At least two main frames of reference are thought to be used to process somatosensory stimuli presented on our own body: an anatomical frame of reference (based on the somatotopic representation of our body in the somatosensory cortex) and a spatial frame of reference (in which body parts are mapped in external space). Interestingly, a mismatch between somatotopic and spatial information significantly affects the processing of bodily information, as demonstrated by the "crossed hands" effect. However, it is not clear whether this impairment occurs only when the conflict between these frames of reference arises from a static change in body position (e.g., crossing the hands), or also when new associations between motor and sensory responses are created artificially (e.g., by presenting feedback stimuli on a side of the body that is not involved in the movement).
Methods: In the present study, 16 participants performed a temporal order judgment (TOJ) task before and after a congruent or incongruent visual-tactile-motor task in virtual reality. During the VR task, participants had to move a cube using a virtual stick. In the congruent condition, the haptic feedback during the interaction with the cube was delivered to the right hand (the one used to control the stick). In the incongruent condition, the haptic feedback was delivered to the contralateral hand, simulating a kind of 'active' crossed feedback during the interaction. Using a psychophysical approach, the point of subjective equality (PSE, i.e., the stimulus onset asynchrony at which each stimulus is judged first equally often) and the just noticeable difference (JND, an index of precision) were calculated for both conditions, before and after the VR task.
Results: After the VR task, compared with the baseline condition, the PSE shifted toward the hand that had received the haptic feedback during the interaction (toward the right hand in the congruent condition and toward the left hand in the incongruent condition).
Discussion: This study demonstrates that spatial biases in the processing of bodily information can be induced by modulating the sensorimotor interaction between stimuli in virtual environments, while keeping the actual position of the body in space constant.
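The PSE and JND reported above are standard outputs of a psychometric fit: the proportion of (say) "right-first" responses is plotted against the stimulus onset asynchrony and fitted with a sigmoid. The paper does not publish its analysis code; the sketch below is a minimal stdlib-only illustration using a cumulative Gaussian and a brute-force grid search. The function names, grid ranges, and the JND convention (0.6745σ, half the 25%-75% spread) are illustrative assumptions, not the authors' pipeline.

```python
import math

def cum_gauss(x, mu, sigma):
    """Cumulative Gaussian psychometric function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def fit_pse_jnd(soas, p_right_first):
    """Fit a cumulative Gaussian to response proportions by grid search.

    soas: stimulus onset asynchronies (ms); p_right_first: proportion of
    'right-first' responses at each SOA. Returns (PSE, JND): the PSE is
    the fitted 50% point (mu); the JND is taken here as 0.6745 * sigma.
    """
    best_err, best_mu, best_sigma = float("inf"), 0.0, 1.0
    for mu in range(-150, 151):        # candidate PSEs, in ms (illustrative range)
        for sigma in range(1, 151):    # candidate slopes, in ms
            err = sum((cum_gauss(x, mu, sigma) - p) ** 2
                      for x, p in zip(soas, p_right_first))
            if err < best_err:
                best_err, best_mu, best_sigma = err, mu, sigma
    return float(best_mu), 0.6745 * best_sigma
```

A leftward or rightward shift of the fitted PSE between pre- and post-VR sessions is then read as the spatial bias the abstract describes; real analyses would use a maximum-likelihood fit rather than a least-squares grid search.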
Affiliations
- Matteo Girondini
  - Department of Psychology, University of Milano-Bicocca, Milan, Italy
  - Mind and Behavior Technological Center, University of Milano-Bicocca, Milan, Italy
  - MySpace Lab, Department of Clinical Neuroscience, University Hospital of Lausanne, Lausanne, Switzerland
- Massimo Montanaro
  - Mind and Behavior Technological Center, University of Milano-Bicocca, Milan, Italy
- Alberto Gallace
  - Department of Psychology, University of Milano-Bicocca, Milan, Italy
  - Mind and Behavior Technological Center, University of Milano-Bicocca, Milan, Italy
3. Moharramipour A, Takahashi T, Kitazawa S. Distinctive modes of cortical communications in tactile temporal order judgment. Cereb Cortex 2023; 33:2982-2996. [PMID: 35811300] [DOI: 10.1093/cercor/bhac255]
Abstract
Temporal order judgment of two successive tactile stimuli delivered to our hands is often inverted when we cross our hands. The present study aimed to identify the time-frequency profiles of interactions across the cortical network associated with the crossed-hand tactile temporal order judgment task, using magnetoencephalography. We found that interactions across the cortical network were channeled to a low-frequency band (5-10 Hz) when the hands were uncrossed, but became active in a higher band (12-18 Hz) when the hands were crossed. Participants with fewer inverted judgments relied mainly on the higher band, whereas those with more frequent inverted judgments (reversers) utilized both. Moreover, reversers showed greater cortical interactions in the higher band when their judgment was correct than when it was inverted. Overall, the results show that the cortical network communicates in two distinctive frequency modes during the crossed-hand tactile temporal order judgment task: a default mode of communication in the low-frequency band encourages inverted judgments, and correct judgment is robustly achieved by recruiting the high-frequency mode.
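The two communication modes contrasted here are defined by band-limited (5-10 Hz versus 12-18 Hz) spectral estimates of cortical interactions. As a toy illustration of the underlying idea of restricting an analysis to a frequency band, here is a stdlib-only band-power sketch; it is not the MEG connectivity measure the authors used, and the function name is illustrative.

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Mean power of the DFT bins falling inside [f_lo, f_hi] Hz.

    A naive O(n^2) DFT is used for clarity; real analyses would use an
    FFT plus windowing (and, for connectivity, cross-spectra between
    source signals rather than single-channel power).
    """
    n = len(signal)
    total, count = 0.0, 0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            coef = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                       for t in range(n))
            total += (abs(coef) / n) ** 2
            count += 1
    return total / count if count else 0.0
```

For a signal dominated by ~8 Hz activity, `band_power(sig, fs, 5, 10)` would greatly exceed `band_power(sig, fs, 12, 18)`, mirroring the uncrossed-hands "low-band" mode described above.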
Affiliations
- Ali Moharramipour
  - Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, Osaka University, 1-3 Yamadaoka, Suita, Osaka 565-0871, Japan
  - Laboratory for Consciousness, Center for Brain Science (CBS), RIKEN, 2-1 Hirosawa, Wako, Saitama 351-0106, Japan
- Toshimitsu Takahashi
  - Department of Physiology, Dokkyo Medical University, 880 Kitakobayashi, Mibu, Shimotsuga, Tochigi 321-0293, Japan
- Shigeru Kitazawa
  - Dynamic Brain Network Laboratory, Graduate School of Frontier Biosciences, Osaka University, 1-3 Yamadaoka, Suita, Osaka 565-0871, Japan
  - Department of Brain Physiology, Graduate School of Medicine, Osaka University, 1-3 Yamadaoka, Suita, Osaka 565-0871, Japan
  - Center for Information and Neural Networks (CiNet), National Institute of Information and Communications Technology, 1-4 Yamadaoka, Suita, Osaka 565-0871, Japan
4. De Havas J, Ito S, Bestmann S, Gomi H. Neural dynamics of illusory tactile pulling sensations. iScience 2022; 25:105018. [PMID: 36105590] [PMCID: PMC9464957] [DOI: 10.1016/j.isci.2022.105018]
Abstract
Directional tactile pulling sensations are integral to everyday life, but their neural mechanisms remain unknown. Prior accounts hold that primary somatosensory (SI) activity is sufficient to generate pulling sensations, with alternative proposals suggesting that amodal frontal or parietal regions may be critical. We combined high-density EEG with asymmetric vibration, which creates an illusory pulling sensation, thereby unconfounding pulling sensations from unrelated sensorimotor processes. Oddballs that created opposite-direction pulls to common stimuli were compared to the same oddballs after neutral common stimuli (symmetric vibration) and to neutral oddballs. We found evidence against the sensory-frontal N140 and in favor of the midline P200 tracking the emergence of pulling sensations, specifically contralateral parietal lobe activity 264-320 ms post-stimulus, centered on the intraparietal sulcus. This suggests that SI is not sufficient to generate pulling sensations, which instead depend on the parietal association cortex, and may reflect the extraction of orientation information and related spatial processing.
Highlights:
- Tactile pulling sensations are difficult to isolate in the human brain
- Illusory pulls from asymmetric vibration allow neural activity to be isolated
- Pulling sensations are driven by parietal lobe activity 264-320 ms post-stimulus
- Spatial processing in the parietal lobe may be essential for pulling sensations
5. Akbari S, Soltanlou M, Sabourimoghaddam H, Nuerk HC, Leuthold H. The complexity of simple counting: ERP findings reveal early perceptual and late numerical processes in different arrangements. Sci Rep 2022; 12:6763. [PMID: 35474225] [PMCID: PMC9042952] [DOI: 10.1038/s41598-022-10206-y]
Abstract
The counting process can only be fully understood when taking into account the visual characteristics of the sets counted. Comparing behavioral data as well as event-related brain potentials (ERPs) evoked by different task-irrelevant arrangements of dots during an exact enumeration task, we aimed to investigate the effect of illusory contour detection on the counting process while other grouping cues like proximity were controlled and dot sparsity did not provide a cue to the numerosity of sets. Adult participants (N = 37) enumerated dots (8-12) in irregular and two different types of regular arrangements which differed in the shape of their illusory dot lattices. Enumeration speed was affected by both arrangement and magnitude. The type of arrangement influenced an early ERP negativity peaking at about 270 ms after stimulus onset, whereas numerosity only affected later ERP components (> 300 ms). We also observed that without perceptual cues, magnitude was constructed at a later stage of cognitive processing. We suggest that chunking is a prerequisite for more fluent counting which influences automatic processing (< 300 ms) during enumeration. We conclude that the procedure of exact enumeration depends on the interaction of several perceptual and numerical processes that are influenced by magnitude and arrangement.
Affiliations
- Shadi Akbari
  - Cognitive Neuroscience Lab, Department of Psychology, University of Tabriz, Tabriz, Iran
- Mojtaba Soltanlou
  - Department of Psychology, University of Tuebingen, Schleichstreet 4, 72076, Tuebingen, Germany
  - School of Psychology, University of Surrey, Guildford, UK
- Hans-Christoph Nuerk
  - Department of Psychology, University of Tuebingen, Schleichstreet 4, 72076, Tuebingen, Germany
  - Leibniz-Institut Für Wissensmedien, Tuebingen, Germany
  - LEAD Graduate School and Research Network, University of Tuebingen, Tuebingen, Germany
- Hartmut Leuthold
  - Department of Psychology, University of Tuebingen, Schleichstreet 4, 72076, Tuebingen, Germany
6. Fabio C, Salemme R, Koun E, Farnè A, Miller LE. Alpha Oscillations Are Involved in Localizing Touch on Handheld Tools. J Cogn Neurosci 2022; 34:675-686. [DOI: 10.1162/jocn_a_01820]
Abstract
The sense of touch is not restricted to the body but can also extend to external objects. When we use a handheld tool to contact an object, we feel the touch on the tool, not in the hand holding it. The ability to perceive touch on a tool extends along its entire surface, allowing the user to localize where it is touched about as accurately as on their own body. Although the neural mechanisms underlying the ability to localize touch on the body have been investigated extensively, those underlying touch localization on a tool remain unknown. We aimed to fill this gap by recording the electroencephalography signal of participants while they localized tactile stimuli on a handheld rod. We focused on oscillatory activity in the alpha (7–14 Hz) and beta (15–30 Hz) ranges, as they have previously been linked to distinct spatial codes used to localize touch on the body: beta activity reflects the mapping of touch in skin-based coordinates, whereas alpha activity reflects the mapping of touch in external space. We found that only alpha activity was modulated by the location of tactile stimuli applied to a handheld rod. Source reconstruction suggested that this alpha power modulation was localized to a network of fronto-parietal regions previously implicated in higher-order tactile and spatial processing. These findings are the first to implicate alpha oscillations in tool-extended sensing and suggest an important role for processing touch in external space when localizing touch on a tool.
Affiliations
- Cécile Fabio
  - ImpAct, Lyon Neuroscience Research Center, France
  - University of Lyon 1, France
- Romeo Salemme
  - ImpAct, Lyon Neuroscience Research Center, France
  - University of Lyon 1, France
  - Hospices Civils de Lyon, Neuro-immersion, France
- Eric Koun
  - ImpAct, Lyon Neuroscience Research Center, France
  - University of Lyon 1, France
  - Hospices Civils de Lyon, Neuro-immersion, France
- Alessandro Farnè
  - ImpAct, Lyon Neuroscience Research Center, France
  - University of Lyon 1, France
  - Hospices Civils de Lyon, Neuro-immersion, France
  - University of Trento, Rovereto, Italy
- Luke E. Miller
  - ImpAct, Lyon Neuroscience Research Center, France
  - University of Lyon 1, France
  - Hospices Civils de Lyon, Neuro-immersion, France
  - Donders Institute for Brain, Nijmegen, The Netherlands
7. Manfron L, Vanderclausen C, Legrain V. No Evidence for an Effect of the Distance Between the Hands on Tactile Temporal Order Judgments. Perception 2021; 50:294-307. [PMID: 33653176] [DOI: 10.1177/0301006621998877]
Abstract
Localizing somatosensory stimuli is an important process, as it allows us to spatially guide our actions toward the object entering into contact with the body. Accordingly, the positions of tactile inputs are coded according to both somatotopic and spatiotopic representations, the latter taking into account the position of the stimulated limbs in external space. The spatiotopic representation has often been evidenced by means of temporal order judgment (TOJ) tasks: participants' judgments about the order of appearance of two successive somatosensory stimuli are less accurate when the hands are crossed over the body midline than when uncrossed, but also when the hands are placed close together as compared with farther apart. Moreover, these postural effects might depend on vision of the stimulated limbs. The aim of this study was to test the influence of seeing the hands on the modulation of tactile TOJ by the spatial distance between the stimulated limbs. The results showed no influence of the distance between the stimulated hands on TOJ performance, which prevented us from concluding whether vision of the hands affects TOJ performance or whether these variables interact. The reliability of such a distance effect for investigating the spatial representations of tactile inputs is therefore questioned.
8. Van der Looven R, Deschrijver M, Hermans L, De Muynck M, Vingerhoets G. Hand size representation in healthy children and young adults. J Exp Child Psychol 2020; 203:105016. [PMID: 33246254] [DOI: 10.1016/j.jecp.2020.105016]
Abstract
Whereas we experience our body as a coherent volumetric object, the brain appears to maintain highly fragmented representations of individual body parts. Little is known about how body representations of hand size and shape are built and evolve during infancy and young adulthood. This study aimed to investigate the effect of hand side, handedness, and age on the development of central hand size representation. The observational study with comparison groups was conducted with 90 typically developing Belgian school children and young adults (48 male and 42 female; age range = 5.0-23.0 years; 49 left-handed and 41 right-handed). Participants estimated their hand size and shape using two different tasks. In the localization task, participants were verbally cued to judge the locations of 10 anatomical landmarks of an occluded hand; an implicit hand size map was constructed and compared with actual hand dimensions. In the template selection task, the explicit hand shape was measured with a depictive method. Hand shape indexes were calculated and compared for the actual, implicit, and explicit conditions. Participants were divided into four age groups (5-8 years, 9-10 years, 11-16 years, and 17-23 years). Implicit hand maps featured underestimation of finger length and overestimation of hand width, already present in the youngest children. Linear mixed modeling revealed no influence of hand side on finger length underestimation; however, a significant main effect of age (p = .001) was found. Sinistrals aged 11 to 16 years showed significantly less underestimation (p = .03) than dextrals of the same age. As for hand shape, the implicit condition differed significantly from the actual and explicit conditions (p < .001). Again, the implicit shape index was subject to handedness and age effects, with significant differences between sinistrals and dextrals in the age groups of 9 and 10 years (p = .029) and 11 to 16 years (p < .001).
In conclusion, the implicit metric component of the hand representation in children and young adults is misperceived, featuring shortened fingers and broadened hands from a very young age. Crucially, the finger length underestimation increases with age and shows a different developmental trajectory for sinistrals and dextrals. In contrast, the explicit hand shape is approximately veridical and seems immune to age and handedness effects. This study confirms the dual character of somatoperception and establishes a point of reference for children and young adults.
Affiliations
- Ruth Van der Looven
  - Child Rehabilitation Centre, Department of Physical Medicine and Rehabilitation, Ghent University Hospital, 9000 Ghent, Belgium
- Miguel Deschrijver
  - Department of Physical Medicine and Rehabilitation, Ghent University Hospital, 9000 Ghent, Belgium
- Linda Hermans
  - Child Rehabilitation Centre, Department of Physical Medicine and Rehabilitation, Ghent University Hospital, 9000 Ghent, Belgium
- Martine De Muynck
  - Department of Physical Medicine and Rehabilitation, Ghent University Hospital, 9000 Ghent, Belgium
- Guy Vingerhoets
  - Department of Experimental Psychology, Faculty of Psychology and Educational Sciences, Ghent University, 9000 Ghent, Belgium
9. Spatial Information of Somatosensory Stimuli in the Brain: Multivariate Pattern Analysis of Functional Magnetic Resonance Imaging Data. Neural Plast 2020; 2020:8307580. [PMID: 32684924] [PMCID: PMC7341392] [DOI: 10.1155/2020/8307580]
Abstract
Background: Multivoxel pattern analysis has provided new evidence on somatotopic representation in the human brain. However, the effects of stimulus modality (e.g., penetrating needle versus non-penetrating touch) and level of classification (e.g., multiclass versus binary classification) on the patterns of brain activity encoding spatial information about body parts have not yet been studied. We hypothesized that the performance of brain-based prediction models may vary across stimulus types, and that neural patterns of voxels in the primary somatosensory cortex (SI) and parietal cortex would contribute significantly to the prediction of stimulated locations.
Objective: We aimed to (1) test whether brain responses to tactile stimuli can distinguish among stimulated locations on the body surface, (2) investigate whether stimulus modality and the number of classes affect classification performance, and (3) localize brain regions encoding the spatial information of somatosensory stimuli.
Methods: Fifteen healthy participants completed two functional magnetic resonance imaging (fMRI) scans and were stimulated via the insertion of acupuncture needles or by non-invasive touch stimuli (size 5.46 von Frey filament). Participants received the stimuli at four different locations on the upper and lower limbs (two sites each) for 5 min while blood-oxygen-level-dependent (BOLD) activity was measured using 3-Tesla MRI. We performed multivariate pattern analysis (MVPA) using parameter estimate images of each trial for each participant and a support vector classifier (SVC), and prediction accuracy and other MVPA outcomes were evaluated using stratified five-fold cross-validation. We estimated the significance of the classification accuracy using a permutation test with randomly labeled training data (n = 10,000). Searchlight analysis was conducted to identify brain regions with significantly higher accuracy than the chance level obtained from a random classifier.
Results: For the four-class classification (classifying four stimulated points on the body), SVC analysis of whole-brain beta values in response to acupuncture stimulation discriminated among stimulated locations (mean accuracy, 0.31; q < 0.01). Searchlight analysis found that predictions based on the right primary somatosensory cortex (SI) and intraparietal sulcus were significantly more accurate than chance (p < 0.01). In contrast, the same classifier did not accurately predict stimulated locations for touch stimulation (mean accuracy, 0.25; q = 0.66). For the binary classification (discriminating between two stimulated body parts, i.e., the arm or the leg), the SVC algorithm successfully predicted the stimulated body parts for both acupuncture (mean accuracy, 0.63; q < 0.001) and touch stimulation (mean accuracy, 0.60; q < 0.01). Searchlight analysis revealed that predictions based on the right SI, primary motor cortex (MI), paracentral gyrus, and superior frontal gyrus were significantly more accurate than chance (p < 0.05).
Conclusion: Our findings suggest that the SI, as well as the MI, intraparietal sulcus, paracentral gyrus, and superior frontal gyrus, is responsible for the somatotopic representation of body parts stimulated by tactile stimuli. The MVPA approach for identifying neural patterns encoding spatial information of somatosensory stimuli may be affected by the stimulus type (penetrating needle versus non-invasive touch) and the number of classes (classification of four small points on the body versus two large body parts). Future studies with larger samples will identify stimulus-specific neural patterns representing stimulated locations, independent of subjective tactile perception and emotional responses. Identification of distinct neural patterns of body surfaces will help improve neural biomarkers for pain and other sensory percepts.
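The significance estimate used here (comparing observed classification accuracy against a null distribution built from relabeled data) can be sketched in a few lines. The study retrained its SVC on permuted training labels; the stdlib-only sketch below is a simplified variant that instead shuffles the labels of already-scored predictions, which conveys the logic of the test without the classifier. The function name is illustrative.

```python
import random

def permutation_p_value(labels, predictions, n_perm=10000, seed=0):
    """One-sided permutation p-value for classification accuracy.

    Shuffles the labels n_perm times and counts how often the shuffled
    accuracy reaches the observed accuracy. The +1 correction keeps the
    p-value away from exactly zero.
    """
    rng = random.Random(seed)
    n = len(labels)
    observed = sum(a == b for a, b in zip(labels, predictions)) / n
    count = 0
    for _ in range(n_perm):
        shuffled = labels[:]
        rng.shuffle(shuffled)
        if sum(a == b for a, b in zip(shuffled, predictions)) / n >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)
```

A mean accuracy of 0.31 on a four-class problem (chance 0.25) can thus be judged significant or not by where it falls in the shuffled-label distribution; in practice one would use a library routine such as scikit-learn's `permutation_test_score`, which retrains on each permutation as the study did.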
10. Miller LE, Fabio C, Ravenda V, Bahmad S, Koun E, Salemme R, Luauté J, Bolognini N, Hayward V, Farnè A. Somatosensory Cortex Efficiently Processes Touch Located Beyond the Body. Curr Biol 2019; 29:4276-4283.e5. [PMID: 31813607] [DOI: 10.1016/j.cub.2019.10.043]
Abstract
The extent to which a tool is an extension of its user is a question that has fascinated writers and philosophers for centuries [1]. Despite two decades of research [2-7], it remains unknown how this could be instantiated at the neural level. To this aim, the present study combined behavior, electrophysiology, and neuronal modeling to characterize how the human brain could treat a tool like an extended sensory "organ." As with the body, participants localize touches on a hand-held tool with near-perfect accuracy [7]. This behavior reflects the ability of the somatosensory system to rapidly and efficiently use the tool as a tactile extension of the body. Using electroencephalography (EEG), we found that where a hand-held tool was touched was immediately coded in the neural dynamics of the primary somatosensory and posterior parietal cortices of healthy participants. We found similar neural responses in a proprioceptively deafferented patient with spared touch perception, suggesting that location information is extracted from the rod's vibrational patterns. Simulations of mechanoreceptor responses [8] suggested that these patterns are processed with high efficiency. A second EEG experiment showed that touches on the tool and arm surfaces were localized by similar stages of cortical processing. Multivariate decoding algorithms and cortical source reconstruction provided further evidence that early limb-based processes were repurposed to map touch on a tool. We propose that an elementary strategy the human brain uses to sense with tools is to recruit primary somatosensory dynamics otherwise devoted to the body.
Affiliations
- Luke E Miller
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France; Hospices Civils de Lyon, Neuro-immersion, 16 Avenue Doyen Lépine, Bron 69676, France
- Cécile Fabio
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France
- Valeria Ravenda
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France; Department of Psychology & Milan Center for Neuroscience-NeuroMi, University of Milano Bicocca, Building U6, 1 Piazza dell'Ateneo Nuovo, Milan 20126, Italy
- Salam Bahmad
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France
- Eric Koun
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France; Hospices Civils de Lyon, Neuro-immersion, 16 Avenue Doyen Lépine, Bron 69676, France
- Romeo Salemme
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France; Hospices Civils de Lyon, Neuro-immersion, 16 Avenue Doyen Lépine, Bron 69676, France
- Jacques Luauté
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France; Hospices Civils de Lyon, Neuro-immersion, 16 Avenue Doyen Lépine, Bron 69676, France
- Nadia Bolognini
  - Department of Psychology & Milan Center for Neuroscience-NeuroMi, University of Milano Bicocca, Building U6, 1 Piazza dell'Ateneo Nuovo, Milan 20126, Italy; Laboratory of Neuropsychology, IRCSS Istituto Auxologico Italiano, 28 Via G. Mercalli, Milan 20122, Italy
- Vincent Hayward
  - Sorbonne Université, Institut des Systèmes Intelligents et de Robotique (ISIR), 4 Place Jussieu, Paris 75005, France; Centre for the Study of the Senses, School of Advanced Study, University of London, Senate House, Malet Street, London WC1E 7HU, UK
- Alessandro Farnè
  - Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France; Hospices Civils de Lyon, Neuro-immersion, 16 Avenue Doyen Lépine, Bron 69676, France; Center for Mind/Brain Sciences, University of Trento, 31 Corso Bettini, Rovereto 38068, Italy
Collapse
|
11
The influence of visual experience and cognitive goals on the spatial representations of nociceptive stimuli. Pain 2019; 161:328-337. [DOI: 10.1097/j.pain.0000000000001721]
12
Dupin L, Haggard P. Dynamic Displacement Vector Interacts with Tactile Localization. Curr Biol 2019; 29:492-498.e3. [PMID: 30686734] [PMCID: PMC6370943] [DOI: 10.1016/j.cub.2018.12.032]
Abstract
Locating a tactile stimulus on the body seems effortless and straightforward. However, the perceived location of a tactile stimulation can differ from its physical location [1, 2, 3]. Tactile mislocalizations can depend on the timing of successive stimulations [2, 4, 5], tactile motion mechanisms [6], or processes that “remap” stimuli from skin locations to external space coordinates [7, 8, 9, 10, 11]. We report six experiments demonstrating that the perception of tactile localization on a static body part is strongly affected by the displacement between the locations of two successive task-irrelevant actions. Participants moved their index finger between two keys. Each keypress triggered synchronous tactile stimulation at a randomized location on the immobilized wrist or forehead. Participants reported the location of the second tactile stimulation relative to the first. The direction of either active finger movements or passive finger displacements biased participants’ tactile orientation judgements (experiment 1). The effect generalized to tactile stimuli delivered to other body sites (experiment 2). Two successive keypresses, by different fingers at distinct locations, reproduced the effect (experiment 3). The effect remained even when the hand that moved was placed far from the tactile stimulation site (experiments 4 and 5). Temporal synchrony within 600 ms between the movement and tactile stimulations was necessary for the effect (experiment 6). Our results indicate that a dynamic displacement vector, defined as the location of one sensorimotor event relative to the one before, plays a strong role in structuring tactile spatial perception. Highlights: human tactile localization is biased by simultaneous finger displacement; the shift between two successive events biases the relative localization of touches; both active and passive movements induce a bias, even if far from the touched site; the bias effect is vectorially organized.
Affiliation(s)
- Lucile Dupin
- Institute of Cognitive Neuroscience, University College London, London WC1N 3AR, UK.
- Patrick Haggard
- Institute of Cognitive Neuroscience, University College London, London WC1N 3AR, UK
13
Ozgun N, Bennewitz R, Strauss DJ. Friction in Passive Tactile Perception Induces Phase Coherency in Late Somatosensory Single Trial Sequences. IEEE Trans Neural Syst Rehabil Eng 2019; 27:129-138. [PMID: 30629510] [DOI: 10.1109/tnsre.2019.2891915]
Abstract
Event-related potentials represent a noninvasive means for studying sensory and cognitive processes that occur in response to particular stimuli. Here, we report on a phase measure for estimating single-trial interaction of late somatosensory potentials (LSPs) following tribologically well-defined mechanical stimulation of the human fingertip. Stimuli were presented via a programmable Braille display with actively switchable pins that was slid along the apex of the passive fingertip; the fingertip rested stationary in a finger-holding system with a circular opening at the bottom. The event was the raising and lowering of either one, three, or five lines of pins. Differences were identified by measures based on instantaneous phase synchronization to the stimuli across trials, in particular the wavelet phase synchronization stability (WPSS) measure for single-trial sequences of LSPs. We show that the higher the friction, the stronger and more localized the induced phase coherency. We conclude that WPSS analysis of single-trial sequences of LSPs is a reliable method that allows quantification of brain responses to distinct tactile stimuli.
14
Legrain V, Manfron L, Garcia M, Filbrich L. Does Body Perception Shape Visuospatial Perception? Perception 2018; 47:507-520. [DOI: 10.1177/0301006618763269]
Abstract
How we perceive our body is shaped by sensory experiences with our surrounding environment, as witnessed by the poor performance of participants who, with their hands crossed, judge the temporal order of two somatosensory stimuli, one applied to each hand. This suggests that somatosensory stimuli are processed not only according to a somatotopic representation but also according to a spatiotopic representation of the body. We investigated whether the perception of stimuli occurring in external space, such as visual stimuli, can also be influenced by body posture and somatosensory stimuli. Participants performed temporal order judgements on pairs of visual stimuli, one on each side of space, with their hands uncrossed or crossed. In Experiment 1, participants’ hands were placed either near or far from the visual stimuli. In Experiment 2, the visual stimuli were preceded, by either 60 ms or 360 ms, by tactile stimuli applied to the hands placed near the visual stimuli. Manipulating the time interval was intended to activate either a somatotopic or a spatiotopic representation of somatic inputs. We did not obtain any evidence for an influence of body posture on visual temporal order judgment, suggesting that body perception is less relevant for processing extrabody stimuli than the reverse.
Affiliation(s)
- Valéry Legrain
- Institute of Neuroscience, Université catholique de Louvain, Brussels, Belgium
- Louise Manfron
- Institute of Neuroscience, Université catholique de Louvain, Brussels, Belgium
- Marynn Garcia
- Institute of Neuroscience, Université catholique de Louvain, Brussels, Belgium
- Lieve Filbrich
- Institute of Neuroscience, Université catholique de Louvain, Brussels, Belgium
15
Hiramoto R, Kanayama N, Nakao T, Matsumoto T, Konishi H, Sakurai S, Okada G, Okamoto Y, Yamawaki S. BDNF as a possible modulator of EEG oscillatory response at the parietal cortex during visuo-tactile integration processes using a rubber hand. Neurosci Res 2017; 124:16-24. [PMID: 28668502] [DOI: 10.1016/j.neures.2017.05.006]
Abstract
Multisensory integration of visuo-tactile information presented on the body or a dummy body has a strong impact on body image. Previous research shows that alteration of body image induced by visuo-tactile integration is closely related to activation of the parietal cortex, a sensory association area. The expression of brain-derived neurotrophic factor (BDNF) in the parietal area of macaque monkeys is thought to modulate activation of the parietal cortex and alter the extension of body image during tool-use learning. However, the relationship between parietal cortex activation related to body image alterations and BDNF levels in humans remains unclear. We investigated the relationship between human serum BDNF levels and electroencephalography responses during a visuo-tactile integration task involving a rubber hand. We found cortical oscillatory components in the high-frequency (gamma) band in the left parietal cortex. Moreover, the power values of these oscillations were positively correlated (p<0.05) with serum BDNF levels. Our results suggest that serum BDNF could play a role in modulating cortical activity in response to visuo-tactile integration processes related to body image alteration in humans.
Affiliation(s)
- Ryosuke Hiramoto
- Department of Psychology, Graduate School of Education, Hiroshima University, Hiroshima, Japan
- Noriaki Kanayama
- Department of Psychiatry and Neurosciences, Institute of Biomedical & Health Sciences, Hiroshima University, 1-2-3, Kasumi, Minami-ku, Hiroshima, 734-8551, Japan; Center of KANSEI Innovation, Hiroshima University, 1-2-3, Kasumi, Minami-ku, Hiroshima, 734-8551, Japan.
- Takashi Nakao
- Department of Psychology, Graduate School of Education, Hiroshima University, Hiroshima, Japan
- Tomoya Matsumoto
- Department of Psychiatry and Neurosciences, Institute of Biomedical & Health Sciences, Hiroshima University, 1-2-3, Kasumi, Minami-ku, Hiroshima, 734-8551, Japan; Center of KANSEI Innovation, Hiroshima University, 1-2-3, Kasumi, Minami-ku, Hiroshima, 734-8551, Japan
- Hirona Konishi
- Faculty of Medicine, Hiroshima University, 1-2-3, Kasumi, Minami-ku, Hiroshima, 734-8551, Japan
- Satoru Sakurai
- Faculty of Medicine, Hiroshima University, 1-2-3, Kasumi, Minami-ku, Hiroshima, 734-8551, Japan
- Go Okada
- Department of Psychiatry and Neurosciences, Institute of Biomedical & Health Sciences, Hiroshima University, 1-2-3, Kasumi, Minami-ku, Hiroshima, 734-8551, Japan
- Yasumasa Okamoto
- Department of Psychiatry and Neurosciences, Institute of Biomedical & Health Sciences, Hiroshima University, 1-2-3, Kasumi, Minami-ku, Hiroshima, 734-8551, Japan
- Shigeto Yamawaki
- Department of Psychiatry and Neurosciences, Institute of Biomedical & Health Sciences, Hiroshima University, 1-2-3, Kasumi, Minami-ku, Hiroshima, 734-8551, Japan; Center of KANSEI Innovation, Hiroshima University, 1-2-3, Kasumi, Minami-ku, Hiroshima, 734-8551, Japan
16
Bremner AJ, Spence C. The Development of Tactile Perception. Adv Child Dev Behav 2017; 52:227-268. [PMID: 28215286] [DOI: 10.1016/bs.acdb.2016.12.002]
Abstract
Touch is the first of our senses to develop, providing us with the sensory scaffold on which we come to perceive our own bodies and our sense of self. Touch also provides us with direct access to the external world of physical objects, via haptic exploration. Furthermore, a recent area of interest in tactile research across studies of developing children and adults is its social function, mediating interpersonal bonding. Although there are a range of demonstrations of early competence with touch, particularly in the domain of haptics, the review presented here indicates that many of the tactile perceptual skills that we take for granted as adults (e.g., perceiving touches in the external world as well as on the body) take some time to develop in the first months of postnatal life, likely as a result of an extended process of connection with other sense modalities which provide new kinds of information from birth (e.g., vision and audition). Here, we argue that because touch is of such fundamental importance across a wide range of social and cognitive domains, it should be placed much more centrally in the study of early perceptual development than it currently is.
Affiliation(s)
- A J Bremner
- Goldsmiths, University of London, London, United Kingdom.
- C Spence
- University of Oxford, Oxford, United Kingdom
17
Integration of anatomical and external response mappings explains crossing effects in tactile localization: A probabilistic modeling approach. Psychon Bull Rev 2016; 23:387-404. [PMID: 26350763] [DOI: 10.3758/s13423-015-0918-0]
Abstract
To act upon a tactile stimulus, its original skin-based, anatomical spatial code has to be transformed into an external, posture-dependent reference frame, a process known as tactile remapping. When the limbs are crossed, anatomical and external location codes are in conflict, leading to a decline in tactile localization accuracy. It is unknown whether this impairment originates from the integration of the resulting external localization response with the original, anatomical one or from a failure of tactile remapping in crossed postures. We fitted probabilistic models based on these diverging accounts to the data from three tactile localization experiments. Hand crossing disturbed tactile left-right location choices in all experiments. Furthermore, the size of these crossing effects was modulated by stimulus configuration and task instructions. The best model accounted for these results by integration of the external response mapping with the original, anatomical one, while applying identical integration weights for uncrossed and crossed postures. Thus, the model explained the data without assuming failures of remapping. Moreover, performance differences across tasks were accounted for by non-individual parameter adjustments, indicating that individual participants' task adaptation results from one common functional mechanism. These results suggest that remapping is an automatic and accurate process, and that the observed localization impairments in touch result from a cognitively controlled integration process that combines anatomically and externally coded responses.
18
Ferri F, Ambrosini E, Costantini M. Spatiotemporal processing of somatosensory stimuli in schizotypy. Sci Rep 2016; 6:38735. [PMID: 27934937] [PMCID: PMC5146666] [DOI: 10.1038/srep38735]
Abstract
Unusual interaction behaviors and perceptual aberrations, like those occurring in schizotypy and schizophrenia, may in part originate from impaired remapping of environmental stimuli into body space. Such remapping relies on the integration of tactile and proprioceptive information about current body posture with other exteroceptive spatial information. Surprisingly, no study has investigated whether alterations in such remapping occur in psychosis-prone individuals. Four hundred eleven students were screened for schizotypal traits using the Schizotypal Personality Questionnaire. A subgroup of them, classified as low, moderate, or high schizotypes, performed a temporal order judgment task on tactile stimuli delivered to their hands, with both uncrossed and crossed arms. Results revealed marked differences in touch remapping in high schizotypes compared with low and moderate schizotypes. For the first time, we show that the remapping of environmental stimuli into body space, an essential function for demarcating the boundaries between self and external world, is altered in schizotypy. Results are discussed in relation to recent models of 'self-disorders' as due to perceptual incoherence.
Affiliation(s)
- Francesca Ferri
- Centre for Brain Science, Department of Psychology, University of Essex, Colchester, UK
- Marcello Costantini
- Centre for Brain Science, Department of Psychology, University of Essex, Colchester, UK; Laboratory of Neuropsychology and Cognitive Neuroscience, Department of Neuroscience and Imaging, University G. d'Annunzio & Institute for Advanced Biomedical Technologies - ITAB, Foundation University G. d'Annunzio, Chieti, Italy
19
Saby JN, Meltzoff AN, Marshall PJ. Beyond the N1: A review of late somatosensory evoked responses in human infants. Int J Psychophysiol 2016; 110:146-152. [PMID: 27553531] [DOI: 10.1016/j.ijpsycho.2016.08.008]
Abstract
Somatosensory evoked potentials (SEPs) have been used for decades to study the development of somatosensory processing in human infants. Research on infant SEPs has focused on the initial cortical component (N1) and its clinical utility for predicting neurological outcome in at-risk infants. However, recent studies suggest that examining the later components in the infant somatosensory evoked response will greatly advance our understanding of somatosensory processing in infancy. The purpose of this review is to synthesize the existing electroencephalography (EEG) and magnetoencephalography (MEG) studies on late somatosensory evoked responses in infants. We describe the late responses that have been reported and discuss the utility of such responses for illuminating key aspects of somatosensory processing in typical and atypical development.
Affiliation(s)
- Joni N Saby
- Institute for Learning & Brain Sciences, University of Washington, Box 357988, Seattle, WA 98195, United States.
- Andrew N Meltzoff
- Institute for Learning & Brain Sciences, University of Washington, Box 357988, Seattle, WA 98195, United States
- Peter J Marshall
- Department of Psychology, Temple University, 1701 North 13th Street, Philadelphia, PA 19122, United States
20
Badde S, Heed T. Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cogn Neuropsychol 2016; 33:26-47. [PMID: 27327353] [PMCID: PMC4975087] [DOI: 10.1080/02643294.2016.1168791]
Abstract
Touch is bound to the skin – that is, to the boundaries of the body. Yet, the activity of neurons in primary somatosensory cortex just mirrors the spatial distribution of the sensors across the skin. To determine the location of a tactile stimulus on the body, the body's spatial layout must be considered. Moreover, to relate touch to the external world, body posture has to be evaluated. In this review, we argue that posture is incorporated, by default, for any tactile stimulus. However, the relevance of the external location and, thus, its expression in behaviour, depends on various sensory and cognitive factors. Together, these factors imply that an external representation of touch dominates over the skin-based, anatomical one when our focus is on the world rather than on our own body. We conclude that touch localization is a reconstructive process that is adjusted to the context while maintaining all available spatial information.
Affiliation(s)
- Stephanie Badde
- Department of Psychology, New York University, New York, NY, USA
- Tobias Heed
- Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
21
Arm crossing updates brain functional connectivity of the left posterior parietal cortex. Sci Rep 2016; 6:28105. [PMID: 27302746] [PMCID: PMC4908406] [DOI: 10.1038/srep28105]
Abstract
The unusual configuration of body parts can cause illusions. For example, when tactile stimuli are delivered to crossed arms, a reversal of subjective temporal ordering occurs. Our group has previously demonstrated that arm crossing without sensory stimuli causes activity changes in the left posterior parietal cortex (PPC), and an assessment of tactile temporal order judgments (TOJs) revealed a positive association between activity in this area, especially the left intraparietal sulcus (IPS), and the degree of the crossed-hand illusion. Thus, the present study investigated how the IPS actively relates to other cortical areas under arms-crossed and -uncrossed conditions by analyzing the functional connectivity of the IPS. Regions showing connectivity with the IPS overlapped with regions within the default mode network (DMN), but the IPS also showed connectivity with other brain areas, including the frontoparietal control network (FPCN). The right middle/inferior frontal gyrus (MFG/IFG), which is included in the FPCN, showed greater connectivity in the arms-crossed condition than in the arms-uncrossed condition. These findings suggest that there is state-dependent connectivity during arm crossing, and that the left IPS may play an important role during the spatio-temporal updating of arm positions.
22
Brandes J, Heed T. Reach Trajectories Characterize Tactile Localization for Sensorimotor Decision Making. J Neurosci 2015; 35:13648-58. [PMID: 26446218] [PMCID: PMC6605379] [DOI: 10.1523/jneurosci.1873-14.2015]
Abstract
Spatial target information for movement planning appears to be coded in a gaze-centered reference frame. In touch, however, location is initially coded with reference to the skin. Therefore, the tactile spatial location must be derived by integrating skin location and posture. It has been suggested that this recoding is impaired when the limb is placed in the opposite hemispace, for example, by limb crossing. Here, human participants reached toward visual and tactile targets located at uncrossed and crossed feet in a sensorimotor decision task. We characterized stimulus recoding by analyzing the timing and spatial profile of hand reaches. For tactile targets at crossed feet, skin-based information implicates the incorrect side, and only recoded information points to the correct location. Participants initiated straight reaches and redirected the hand toward a target presented in midflight. Trajectories to visual targets were unaffected by foot crossing. In contrast, trajectories to tactile targets were redirected later with crossed than uncrossed feet. Reaches to crossed feet usually continued straight until they were directed toward the correct tactile target and were not biased toward the skin-based target location. Occasional, far deflections toward the incorrect target were most likely when this target was implicated by trial history. These results are inconsistent with the suggestion that spatial transformations in touch are impaired by limb crossing, but are consistent with tactile location being recoded rapidly and efficiently, followed by integration of skin-based and external information to specify the reach target. This process may be implemented in a bounded integrator framework. SIGNIFICANCE STATEMENT: How do you touch yourself, for instance, to scratch an itch? The place you need to reach is defined by a sensation on the skin, but our bodies are flexible, so this skin location could be anywhere in 3D space. The movement toward the tactile sensation must therefore be specified by merging skin location and body posture. By investigating human hand reach trajectories toward tactile stimuli on the feet, we provide experimental evidence that this transformation process is quick and efficient, and that its output is integrated with the original skin location in a fashion consistent with bounded integrator decision-making frameworks.
Affiliation(s)
- Janina Brandes
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, 20146 Hamburg, Germany
- Tobias Heed
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, 20146 Hamburg, Germany
23
Rubber hand presentation modulates visuotactile interference effect especially in persons with high autistic traits. Exp Brain Res 2015; 234:51-65. [DOI: 10.1007/s00221-015-4429-z]
24
Heed T, Buchholz VN, Engel AK, Röder B. Tactile remapping: from coordinate transformation to integration in sensorimotor processing. Trends Cogn Sci 2015; 19:251-8. [DOI: 10.1016/j.tics.2015.03.001]
25
Ley P, Steinberg U, Hanganu-Opatz IL, Röder B. Event-related potential evidence for a dynamic (re-)weighting of somatotopic and external coordinates of touch during visual-tactile interactions. Eur J Neurosci 2015; 41:1466-74. [PMID: 25879770] [DOI: 10.1111/ejn.12896]
Abstract
The localization of touch in external space requires the remapping of somatotopically represented tactile information into an external frame of reference. Several recent studies have highlighted the role of posterior parietal areas for this remapping process, yet its temporal dynamics are poorly understood. The present study combined cross-modal stimulation with electrophysiological recordings in humans to trace the time course of tactile spatial remapping during visual-tactile interactions. Adopting an uncrossed or crossed hand posture, participants made speeded elevation judgments about rare vibrotactile stimuli within a stream of frequent, task-irrelevant vibrotactile events presented to the left or right hand. Simultaneous but spatially independent visual stimuli had to be ignored. An analysis of the recorded event-related potentials to the task-irrelevant vibrotactile stimuli revealed a somatotopic coding of tactile stimuli within the first 100 ms. Between 180 and 250 ms, neither an external nor a somatotopic representation dominated, suggesting that both coordinates were active in parallel. After 250 ms, tactile stimuli were coded in a somatotopic frame of reference. Our results indicate that cross-modal interactions start before the termination of tactile spatial remapping, that is, within the first 100 ms. Thereafter, tactile stimuli are represented simultaneously in both somatotopic and external spatial coordinates, which are dynamically (re-)weighted as a function of processing stage.
Affiliation(s)
- Pia Ley
- Biological Psychology and Neuropsychology, Von-Melle-Park 11, Hamburg, D-20146, Germany
- Ulf Steinberg
- Biological Psychology and Neuropsychology, Von-Melle-Park 11, Hamburg, D-20146, Germany
- Ileana L Hanganu-Opatz
- Developmental Neurophysiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Brigitte Röder
- Biological Psychology and Neuropsychology, Von-Melle-Park 11, Hamburg, D-20146, Germany
26
Nishikawa N, Shimo Y, Wada M, Hattori N, Kitazawa S. Effects of aging and idiopathic Parkinson's disease on tactile temporal order judgment. PLoS One 2015; 10:e0118331. [PMID: 25760621] [PMCID: PMC4356579] [DOI: 10.1371/journal.pone.0118331]
Abstract
It is generally accepted that the basal ganglia play an important role in interval timing that requires the measurement of temporal durations. By contrast, it remains controversial whether the basal ganglia play an essential role in temporal order judgment (TOJ) of successive stimuli, a behavior that does not necessarily require the measurement of durations in time. To address this issue, we compared the effects of idiopathic Parkinson’s disease (PD) on the TOJ of two successive taps delivered to each hand, with the arms uncrossed in one condition and crossed in another. In addition to age-matched elderly participants without PD (non-PD), we examined young healthy participants so that the effect of aging could serve as a control for evaluating the effects of PD. There was no significant difference between PD and non-PD participants in any parameter of TOJ under either arm posture, although reaction time was significantly longer in PD compared with non-PD participants. By contrast, the effect of aging was apparent in both conditions. With their arms uncrossed, the temporal resolution (the interstimulus interval that yielded 84% correct responses) in elderly participants was significantly worse compared with young participants. With their arms crossed, elderly participants made more errors at longer intervals (~1 s) than young participants, although both age groups showed similar judgment reversal at moderately short intervals (~200 ms). These results indicate that the basal ganglia and dopaminergic systems do not play essential roles in tactile TOJ involving both hands and that the effect of aging on TOJ is mostly independent of the dopaminergic systems.
Affiliation(s)
- Natsuko Nishikawa
- Department of Neurology, Juntendo University School of Medicine, Tokyo, 113-8421, Japan
- Yasushi Shimo
- Department of Neurology, Juntendo University School of Medicine, Tokyo, 113-8421, Japan
- Makoto Wada
- Department of Neurophysiology, Graduate School of Medicine, Juntendo University, Tokyo, Japan
- Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, Tokorozawa, Japan
- Nobutaka Hattori
- Department of Neurology, Juntendo University School of Medicine, Tokyo, 113-8421, Japan
- Shigeru Kitazawa
- Department of Neurophysiology, Graduate School of Medicine, Juntendo University, Tokyo, Japan
- Department of Brain Physiology, Graduate School of Medicine, Osaka University, Osaka, Japan
- Dynamic Brain Network Laboratory, Graduate School of Frontiers Bioscience, Osaka University, Osaka, Japan
27
28
Sustained maintenance of somatotopic information in brain regions recruited by tactile working memory. J Neurosci 2015; 35:1390-5. [PMID: 25632117] [DOI: 10.1523/jneurosci.3535-14.2015]
Abstract
To adaptively guide ongoing behavior, representations in working memory (WM) often have to be modified in line with changing task demands. We used event-related potentials (ERPs) to demonstrate that tactile WM representations are stored in modality-specific cortical regions, that the goal-directed modulation of these representations is mediated through hemispheric-specific activation of somatosensory areas, and that the rehearsal of somatotopic coordinates in memory is accomplished by modality-specific spatial attention mechanisms. Participants encoded two tactile sample stimuli presented simultaneously to the left and right hands, before visual retro-cues indicated which of these stimuli had to be retained to be matched with a subsequent test stimulus on the same hand. Retro-cues triggered a sustained tactile contralateral delay activity component with a scalp topography over somatosensory cortex contralateral to the cued hand. Early somatosensory ERP components to task-irrelevant probe stimuli (that were presented after the retro-cues) and to subsequent test stimuli were enhanced when these stimuli appeared at the currently memorized location relative to other locations on the cued hand, demonstrating that a precise focus of spatial attention was established during the selective maintenance of tactile events in WM. These effects were observed regardless of whether participants performed the matching task with uncrossed or crossed hands, indicating that WM representations in this task were based on somatotopic rather than allocentric spatial coordinates. In conclusion, spatial rehearsal in tactile WM operates within somatotopically organized sensory brain areas that have been recruited for information storage.
29
Honeine JL, Schieppati M. Time-interval for integration of stabilizing haptic and visual information in subjects balancing under static and dynamic conditions. Front Syst Neurosci 2014; 8:190. [PMID: 25339872] [PMCID: PMC4186340] [DOI: 10.3389/fnsys.2014.00190]
Abstract
Maintaining equilibrium is basically a sensorimotor integration task. The central nervous system (CNS) continually and selectively weights and rapidly integrates sensory inputs from multiple sources, and coordinates multiple outputs. The weighting process is based on the availability and accuracy of afferent signals at a given instant, on the time-period required to process each input, and possibly on the plasticity of the relevant pathways. The likelihood that sensory inflow changes while balancing under static or dynamic conditions is high, because subjects can pass from a dark to a well-lit environment or from a tactile-guided stabilization to loss of haptic inflow. This review article presents recent data on the temporal events accompanying sensory transition, on which basic information is fragmentary. The processing time from sensory shift to reaching a new steady state includes the time to (a) subtract or integrate sensory inputs; (b) move from allocentric to egocentric reference or vice versa; and (c) adjust the calibration of motor activity in time and amplitude to the new sensory set. We present examples of processes of integration of posture-stabilizing information, and of the respective sensorimotor time-intervals while allowing or occluding vision or adding or subtracting tactile information. These intervals are short, in the order of 1–2 s for different postural conditions, modalities and deliberate or passive shift. They are just longer for haptic than visual shift, just shorter on withdrawal than on addition of stabilizing input, and on deliberate than unexpected mode. The delays are the shortest (for haptic shift) in blind subjects. Since automatic balance stabilization may be vulnerable to sensory-integration delays and to interference from concurrent cognitive tasks in patients with sensorimotor problems, insight into the processing time for balance control represents a critical step in the design of new balance- and locomotion training devices.
Affiliation(s)
- Jean-Louis Honeine
- Department of Public Health, Experimental and Forensic Medicine, University of Pavia, Pavia, Italy
- Marco Schieppati
- Department of Public Health, Experimental and Forensic Medicine, University of Pavia, Pavia, Italy; Centro Studi Attività Motorie (CSAM), Fondazione Salvatore Maugeri (IRCCS), Scientific Institute of Pavia, Pavia, Italy
30
Medina J, McCloskey M, Coslett HB, Rapp B. Somatotopic representation of location: evidence from the Simon effect. J Exp Psychol Hum Percept Perform 2014; 40:2131-42. [PMID: 25243674] [DOI: 10.1037/a0037975]
Abstract
Representing the locations of tactile stimulation can involve somatotopic reference frames in which locations are defined relative to a position on the skin surface, and also external reference frames that take into account stimulus position in external space. Locations in somatotopic and external reference frames can conflict in terms of left/right assignment when the hands are crossed or positioned outside of their typical hemispace. To investigate the spatial codes of the representation of both tactile stimuli and responses to touch, a Simon effect task, often used in the visual modality to examine issues of spatial reference frames, was deployed in the tactile modality. Participants performed the task with stimuli delivered to the hands with arms in crossed or uncrossed postures and responses were produced with foot pedals. Across all 4 experiments, participants were faster on somatotopically congruent trials (e.g., left hand stimulus, left foot response) than on somatotopically incongruent trials (left hand stimulus, right foot response), regardless of arm or leg position. However, some evidence of an externally based Simon effect also appeared in 1 experiment in which arm (stimulus) and leg (response) position were both manipulated. Overall, the results demonstrate that tactile stimulus and response codes are primarily generated based on their somatotopic identity. However, stimulus and response coding based on an external reference frame can become more salient when both hands and feet can be crossed, creating a situation in which somatotopic and external representations can differ for both stimulus and response codes.
Affiliation(s)
- Jared Medina
- Department of Psychology, University of Delaware
- Brenda Rapp
- Department of Cognitive Science, Johns Hopkins University
31
Abstract
Correctly localising sensory stimuli in space is a formidable challenge for the newborn brain. A new study provides a first glimpse into how human brain mechanisms for sensory remapping develop in the first year of life.
32
Rigato S, Begum Ali J, van Velzen J, Bremner AJ. The neural basis of somatosensory remapping develops in human infancy. Curr Biol 2014; 24:1222-6. [PMID: 24856214] [DOI: 10.1016/j.cub.2014.04.004]
Abstract
When we sense a touch, our brains take account of our current limb position to determine the location of that touch in external space [1, 2]. Here we show that changes in the way the brain processes somatosensory information in the first year of life underlie the origins of this ability [3]. In three experiments we recorded somatosensory evoked potentials (SEPs) from 6.5-, 8-, and 10-month-old infants while presenting vibrotactile stimuli to their hands across uncrossed- and crossed-hands postures. At all ages we observed SEPs over central regions contralateral to the stimulated hand. Somatosensory processing was influenced by arm posture from 8 months onward. At 8 months, posture influenced mid-latency SEP components, but by 10 months effects were observed at early components associated with feed-forward stages of somatosensory processing. Furthermore, sight of the hands was a necessary pre-requisite for somatosensory remapping at 10 months. Thus, the cortical networks [4] underlying the ability to dynamically update the location of a perceived touch across limb movements become functional during the first year of life. Up until at least 6.5 months of age, it seems that human infants' perceptions of tactile stimuli in the external environment are heavily dependent upon limb position.
Affiliation(s)
- Silvia Rigato
- Department of Psychology, University of Essex, Colchester CO4 3SQ, UK; Sensorimotor Development Research Unit, Department of Psychology, Goldsmiths, University of London, New Cross, London SE14 6NW, UK
- Jannath Begum Ali
- Sensorimotor Development Research Unit, Department of Psychology, Goldsmiths, University of London, New Cross, London SE14 6NW, UK
- José van Velzen
- Sensorimotor Development Research Unit, Department of Psychology, Goldsmiths, University of London, New Cross, London SE14 6NW, UK
- Andrew J Bremner
- Sensorimotor Development Research Unit, Department of Psychology, Goldsmiths, University of London, New Cross, London SE14 6NW, UK
33
Heed T, Azañón E. Using time to investigate space: a review of tactile temporal order judgments as a window onto spatial processing in touch. Front Psychol 2014; 5:76. [PMID: 24596561] [PMCID: PMC3925972] [DOI: 10.3389/fpsyg.2014.00076]
Abstract
To respond to a touch, it is often necessary to localize it in space, and not just on the skin. The computation of this external spatial location involves the integration of somatosensation with visual and proprioceptive information about current body posture. In the past years, the study of touch localization has received substantial attention and has become a central topic in the research field of multisensory integration. In this review, we will explore important findings from this research, zooming in on one specific experimental paradigm, the temporal order judgment (TOJ) task, which has proven particularly fruitful for the investigation of tactile spatial processing. In a typical TOJ task participants perform non-speeded judgments about the order of two tactile stimuli presented in rapid succession to different skin sites. This task could be solved without relying on external spatial coordinates. However, postural manipulations affect TOJ performance, indicating that external coordinates are in fact computed automatically. We show that this makes the TOJ task a reliable indicator of spatial remapping, and provide an overview over the versatile analysis options for TOJ. We introduce current theories of TOJ and touch localization, and then relate TOJ to behavioral and electrophysiological evidence from other paradigms, probing the benefit of TOJ for the study of spatial processing as well as related topics such as multisensory plasticity, body processing, and pain.
Affiliation(s)
- Tobias Heed
- Department of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Elena Azañón
- Action and Body Group, Institute of Cognitive Neuroscience, University College London, London, UK
34
Ruzzoli M, Soto-Faraco S. Alpha stimulation of the human parietal cortex attunes tactile perception to external space. Curr Biol 2014; 24:329-32. [PMID: 24440394] [DOI: 10.1016/j.cub.2013.12.029]
Abstract
An intriguing question in neuroscience concerns how somatosensory events on the skin are represented in the human brain. Since Head and Holmes' [1] neuropsychological dissociation between localizing touch on the skin and localizing body parts in external space, touch is considered to operate in a variety of spatial reference frames [2]. At least two representations of space are in competition during orienting to touch: a somatotopic one, reflecting the organization of the somatosensory cortex (S1) [3], and a more abstract, external reference frame that factors postural changes in relation to body parts and/or external space [4, 5]. Previous transcranial magnetic stimulation (TMS) studies suggest that the posterior parietal cortex (PPC) plays a key role in supporting representations as well as orienting attention in an external reference frame [4, 6]. Here, we capitalized on the TMS entrainment approach [7, 8], targeting the intraparietal sulcus (IPS). We found that frequency-specific (10 Hz) tuning of the PPC induced spatially specific enhancement of tactile detection that was expressed in an external reference frame. This finding establishes a tight causal link between a concrete form of brain activity (10 Hz oscillation) and a specific type of spatial representation, revealing a fundamental property of how the parietal cortex encodes information.
Affiliation(s)
- Manuela Ruzzoli
- Center for Brain and Cognition, Departament de Tecnologies de la Informació i les Comunicacions, Universitat Pompeu Fabra, 08018 Barcelona, Spain.
- Salvador Soto-Faraco
- Center for Brain and Cognition, Departament de Tecnologies de la Informació i les Comunicacions, Universitat Pompeu Fabra, 08018 Barcelona, Spain; Institució Catalana de Recerca i Estudis Avançats (ICREA), 08010 Barcelona, Spain