1. Alouit A, Gavaret M, Ramdani C, Lindberg PG, Dupin L. Cortical activations associated with spatial remapping of finger touch using EEG. Cereb Cortex 2024; 34:bhae161. PMID: 38642106. DOI: 10.1093/cercor/bhae161.
Abstract
The spatial coding of tactile information is functionally essential for touch-based shape perception and motor control. However, the spatiotemporal dynamics of how tactile information is remapped from the somatotopic reference frame in the primary somatosensory cortex to the spatiotopic reference frame remains unclear. This study investigated how hand position in space or posture influences cortical somatosensory processing. Twenty-two healthy subjects received electrical stimulation to the right thumb (D1) or little finger (D5) in three position conditions: palm down on right side of the body (baseline), hand crossing the body midline (effect of position), and palm up (effect of posture). Somatosensory-evoked potentials (SEPs) were recorded using electroencephalography. One early-, two mid-, and two late-latency neurophysiological components were identified for both fingers: P50, P1, N125, P200, and N250. D1 and D5 showed different cortical activation patterns: compared with baseline, the crossing condition showed significant clustering at P1 for D1, and at P50 and N125 for D5; the change in posture showed a significant cluster at N125 for D5. Clusters predominated at centro-parietal electrodes. These results suggest that tactile remapping of fingers after electrical stimulation occurs around 100-125 ms in the parietal cortex.
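The cluster-level comparison described in this abstract can be approximated with a cluster-based permutation test, for example via MNE-Python. The sketch below is illustrative only, not the authors' pipeline: it simulates subject-level SEP waveforms at a single electrode, injects a difference in an assumed 90-130 ms window, and tests for significant temporal clusters; all shapes, windows, and parameter values are assumptions.

```python
# Hedged sketch: cluster-based permutation comparison of SEP conditions.
# Simulated data only; windows and parameters are illustrative assumptions.
import numpy as np
from mne.stats import permutation_cluster_test

rng = np.random.default_rng(0)
n_subjects, n_times = 22, 300
times = np.linspace(-0.1, 0.5, n_times)               # -100 to 500 ms

# Subject-level SEP waveforms at one centro-parietal electrode (arbitrary µV units)
baseline = rng.normal(0.0, 1.0, (n_subjects, n_times))
crossed = rng.normal(0.0, 1.0, (n_subjects, n_times))
crossed[:, (times > 0.09) & (times < 0.13)] += 0.8    # injected "P1-like" effect

f_obs, clusters, cluster_pv, _ = permutation_cluster_test(
    [baseline, crossed], n_permutations=1000, tail=1, seed=0, out_type="mask")

for clu, p in zip(clusters, cluster_pv):
    sl = clu[0]                                        # slice over the time axis
    if p < 0.05:
        print(f"cluster {times[sl.start] * 1e3:.0f}-{times[sl.stop - 1] * 1e3:.0f} ms, "
              f"p = {p:.3f}")
```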
Affiliation(s)
- Anaëlle Alouit: Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
- Martine Gavaret: Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France; GHU-Paris Psychiatrie et Neurosciences, Hôpital Sainte Anne, Service de neurophysiologie clinique, 1 Rue Cabanis, F-75014 Paris, France
- Céline Ramdani: Service de Santé des Armées, Institut de Recherche Biomédicale des Armées, 1 Place du Général Valérie André, 91220 Brétigny-sur-Orge, France
- Påvel G Lindberg: Université Paris Cité, Institute of Psychiatry and Neuroscience of Paris (IPNP), INSERM U1266, 102-108 Rue de la Santé, 75014 Paris, France
- Lucile Dupin: Université Paris Cité, INCC UMR 8002, CNRS, 45 Rue des Saints-Pères, F-75006 Paris, France
2. Guo G, Wang N, Sun C, Geng H. Embodied Cross-Modal Interactions Based on an Altercentric Reference Frame. Brain Sci 2024; 14:314. PMID: 38671966. PMCID: PMC11048532. DOI: 10.3390/brainsci14040314.
Abstract
Accurate comprehension of others' thoughts and intentions is crucial for smooth social interactions, wherein understanding their perceptual experiences serves as a fundamental basis for this high-level social cognition. However, previous research has predominantly focused on the visual modality when investigating perceptual processing from others' perspectives, leaving the exploration of multisensory inputs during this process largely unexplored. By incorporating auditory stimuli into visual perspective-taking (VPT) tasks, we have designed a novel experimental paradigm in which the spatial correspondence between visual and auditory stimuli was limited to the altercentric rather than the egocentric reference frame. Overall, we found that when individuals engaged in explicit or implicit VPT to process visual stimuli from an avatar's viewpoint, the concomitantly presented auditory stimuli were also processed within this avatar-centered reference frame, revealing altercentric cross-modal interactions.
Affiliation(s)
- Guanchen Guo: School of Psychological and Cognitive Sciences, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China
- Nanbo Wang: Department of Psychology, School of Health, Fujian Medical University, Fuzhou 350122, China
- Chu Sun: School of Psychological and Cognitive Sciences, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China
- Haiyan Geng: School of Psychological and Cognitive Sciences, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China
3. Fabio C, Salemme R, Farnè A, Miller LE. Alpha oscillations reflect similar mapping mechanisms for localizing touch on hands and tools. iScience 2024; 27:109092. PMID: 38405611. PMCID: PMC10884914. DOI: 10.1016/j.isci.2024.109092.
Abstract
It has been suggested that our brain re-uses body-based computations to localize touch on tools, but the neural implementation of this process remains unclear. Neural oscillations in the alpha and beta frequency bands are known to map touch on the body in external and skin-centered coordinates, respectively. Here, we pinpointed the role of these oscillations during tool-extended sensing by delivering tactile stimuli to either participants' hands or the tips of hand-held rods. To disentangle brain responses related to each coordinate system, we had participants' hands/tool tips crossed or uncrossed at their body midline. We found that midline crossing modulated alpha (but not beta) band activity similarly for hands and tools, also involving a similar network of cortical regions. Our findings strongly suggest that the brain uses similar oscillatory mechanisms for mapping touch on the body and tools, supporting the idea that body-based neural processes are repurposed for tool use.
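A minimal, illustrative way to obtain the kind of post-stimulus alpha-band (8-13 Hz) power that such crossing analyses compare is a Morlet wavelet decomposition, for example with MNE-Python's tfr_array_morlet. The sketch below uses simulated epochs; the shapes, frequency range, and cycle settings are assumptions rather than the authors' settings.

```python
# Hedged sketch: alpha-band power via Morlet wavelets on simulated epochs.
import numpy as np
from mne.time_frequency import tfr_array_morlet

rng = np.random.default_rng(1)
sfreq = 500.0
n_epochs, n_channels, n_times = 40, 2, 500             # 1-s epochs, 2 channels
data = rng.normal(0.0, 1e-6, (n_epochs, n_channels, n_times))   # volts

freqs = np.arange(8, 14)                                # assumed alpha band, 8-13 Hz
power = tfr_array_morlet(data, sfreq=sfreq, freqs=freqs,
                         n_cycles=freqs / 2.0, output="power")
# power has shape (n_epochs, n_channels, n_freqs, n_times)
alpha_power = power.mean(axis=(0, 2))                   # epoch- and band-average
print(alpha_power.shape)                                # (n_channels, n_times)
```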
Affiliation(s)
- Cécile Fabio: Integrative Multisensory Perception Action & Cognition Team of the Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, University of Lyon 1, Lyon, France; Department for Cognitive Neuroscience, Faculty of Biology, Bielefeld University, Bielefeld, Germany
- Romeo Salemme: Integrative Multisensory Perception Action & Cognition Team of the Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, University of Lyon 1, Lyon, France; Hospices Civils de Lyon, Neuro-immersion, Lyon, France
- Alessandro Farnè: Integrative Multisensory Perception Action & Cognition Team of the Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, University of Lyon 1, Lyon, France; Hospices Civils de Lyon, Neuro-immersion, Lyon, France
- Luke E. Miller: Integrative Multisensory Perception Action & Cognition Team of the Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, University of Lyon 1, Lyon, France; Hospices Civils de Lyon, Neuro-immersion, Lyon, France; Donders Institute for Brain, Cognition and Behaviour, Nijmegen, the Netherlands
4. Matsuda Y, Sugawara Y, Akaiwa M, Saito H, Shibata E, Sasaki T, Sugawara K. Event-Related Brain Potentials N140 and P300 during Somatosensory Go/NoGo Tasks Are Modulated by Movement Preparation. Brain Sci 2023; 14:38. PMID: 38248253. PMCID: PMC10813311. DOI: 10.3390/brainsci14010038.
Abstract
The Go/NoGo task requires attention and sensory processing to distinguish a motor action cue or 'Go stimulus' from a 'NoGo stimulus' requiring no action, as well as motor preparation for a rapid Go stimulus response. The neural activity mediating these response phases can be examined non-invasively by measuring specific event-related brain potentials (ERPs) using electroencephalography. However, it is critical to determine how different task conditions, such as the relationship between attention site and movement site, influence ERPs and task performance. In this study, we compared attention-associated ERP components N140 and P300, the performance metrics reaction time (RT) and accuracy (%Error) and movement-related cortical potentials (MRCPs) between Go/NoGo task trials in which attention target and movement site were the same (right index finger movement in response to right index finger stimulation) or different (right index finger movement in response to fifth finger stimulation). In other Count trials, participants kept a running count of target stimuli presented but did not initiate a motor response. The N140 amplitudes at electrode site Cz were significantly larger in Movement trials than in Count trials regardless of the stimulation site-movement site condition. In contrast, the P300 amplitude at Cz was significantly smaller in Movement trials than in Count trials. The temporal windows of N140 and P300 overlapped with the MRCP. This superposition may influence N140 and P300 through summation, possibly independent of changes in attentional allocation.
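The component measures reported above (e.g. N140 and P300 amplitudes at Cz) are conventionally taken as the mean voltage within a latency window of the averaged ERP. A minimal sketch follows; the windows and data are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: mean-amplitude measurement of ERP components at one electrode.
import numpy as np

rng = np.random.default_rng(2)
sfreq = 1000.0
times = np.arange(-0.1, 0.6, 1.0 / sfreq)               # -100 to 600 ms
erp_cz = rng.normal(0.0, 1.0, times.size)               # placeholder average ERP (µV)

def mean_amplitude(erp, times, t_min, t_max):
    """Mean voltage within a latency window (classic ERP amplitude measure)."""
    mask = (times >= t_min) & (times <= t_max)
    return erp[mask].mean()

n140 = mean_amplitude(erp_cz, times, 0.12, 0.16)        # assumed N140 window
p300 = mean_amplitude(erp_cz, times, 0.25, 0.45)        # assumed P300 window
print(f"N140 = {n140:.2f} uV, P300 = {p300:.2f} uV")
```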
Affiliation(s)
- Yuya Matsuda: Graduate School of Health Sciences, Sapporo Medical University, Sapporo 060-8556, Hokkaido, Japan
- Yasushi Sugawara: Graduate School of Health Sciences, Sapporo Medical University, Sapporo 060-8556, Hokkaido, Japan
- Mayu Akaiwa: Graduate School of Health Sciences, Sapporo Medical University, Sapporo 060-8556, Hokkaido, Japan
- Hidekazu Saito: Department of Occupational Therapy, School of Health Science, Sapporo Medical University, Sapporo 060-8556, Hokkaido, Japan
- Eriko Shibata: Major of Physical Therapy, Department of Rehabilitation, Faculty of Healthcare and Science, Hokkaido Bunkyo University, Eniwa 061-1449, Hokkaido, Japan
- Takeshi Sasaki: Department of Physical Therapy, School of Health Science, Sapporo Medical University, Sapporo 060-8556, Hokkaido, Japan
- Kazuhiro Sugawara: Department of Physical Therapy, School of Health Science, Sapporo Medical University, Sapporo 060-8556, Hokkaido, Japan
5. Kida T, Kaneda T, Nishihira Y. ERP evidence of attentional somatosensory processing and stimulus-response coupling under different hand and arm postures. Front Hum Neurosci 2023; 17:1252686. PMID: 38021238. PMCID: PMC10676239. DOI: 10.3389/fnhum.2023.1252686.
Abstract
We investigated (1) the effects of divided and focused attention on event-related brain potentials (ERPs) elicited by somatosensory stimulation under different response modes, (2) the effects of hand position (closely-placed vs. separated hands) and arm posture (crossed vs. uncrossed forearms) on the attentional modulation of somatosensory ERPs, and (3) changes in the coupling of stimulus- and response-related processes by somatosensory attention using a single-trial analysis of P300 latency and reaction times. Electrocutaneous stimulation was presented randomly to the thumb or middle finger of the left or right hand at random interstimulus intervals (700-900 ms). Subjects attended unilaterally or bilaterally to stimuli in order to detect target stimuli by a motor response or counting. The effects of unilaterally-focused attention were also tested under different hand and arm positions. The amplitude of N140 in the divided attention condition was intermediate between unilaterally attended and unattended stimuli in the unilaterally-focused attention condition in both the mental counting and motor response tasks. Attended infrequent (target) stimuli elicited a greater P300 in the unilaterally-focused attention condition than in the divided attention condition. P300 latency was longer in the divided attention condition than in the unilaterally-focused attention condition in the motor response task, but remained unchanged in the counting task. Closely locating the hands had no impact, whereas crossing the forearms decreased the attentional enhancement of N140 amplitude. In contrast, these two manipulations uniformly decreased P300 amplitude and increased P300 latency. The correlation between single-trial P300 latency and RT was decreased by crossed forearms, but not by divided attention or closely-placed hands. Therefore, the present results indicate that focused and divided attention differently affected middle-latency and late processing, and that hand position and arm posture also differently affected attentional processes and stimulus-response coupling.
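The stimulus-response coupling analysis mentioned above rests on estimating P300 latency trial by trial and correlating it with reaction time. The sketch below shows the generic form of such an analysis on simulated data; the search window, electrode, and peak-picking rule are assumptions, not the authors' choices.

```python
# Hedged sketch: single-trial P300 peak latency correlated with reaction time.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
sfreq, n_trials = 500.0, 120
times = np.arange(-0.1, 0.8, 1.0 / sfreq)
trials = rng.normal(0.0, 1.0, (n_trials, times.size))  # single-trial EEG at Pz (µV)
rts = rng.normal(0.45, 0.08, n_trials)                  # reaction times (s)

win = (times >= 0.25) & (times <= 0.60)                 # assumed P300 search window
p300_latency = times[win][np.argmax(trials[:, win], axis=1)]

r, p = pearsonr(p300_latency, rts)
print(f"single-trial P300 latency vs RT: r = {r:.2f}, p = {p:.3f}")
```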
Affiliation(s)
- Tetsuo Kida: Higher Brain Function Unit, Department of Functioning and Disability, Institute for Developmental Research, Aichi Developmental Disability Center, Kasugai, Japan
- Yoshiaki Nishihira: Graduate School of Comprehensive Human Sciences, University of Tsukuba, Tsukuba, Japan
6. Klautke J, Foster C, Medendorp WP, Heed T. Dynamic spatial coding in parietal cortex mediates tactile-motor transformation. Nat Commun 2023; 14:4532. PMID: 37500625. PMCID: PMC10374589. DOI: 10.1038/s41467-023-39959-4.
Abstract
Movements towards touch on the body require integrating tactile location and body posture information. Tactile processing and movement planning both rely on posterior parietal cortex (PPC) but their interplay is not understood. Here, human participants received tactile stimuli on their crossed and uncrossed feet, dissociating stimulus location relative to anatomy versus external space. Participants pointed to the touch or the equivalent location on the other foot, which dissociates sensory and motor locations. Multi-voxel pattern analysis of concurrently recorded fMRI signals revealed that tactile location was coded anatomically in anterior PPC but spatially in posterior PPC during sensory processing. After movement instructions were specified, PPC exclusively represented the movement goal in space, in regions associated with visuo-motor planning and with regional overlap for sensory, rule-related, and movement coding. Thus, PPC flexibly updates its spatial codes to accommodate rule-based transformation of sensory input to generate movement to environment and own body alike.
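The multi-voxel pattern analysis referred to above is, in its generic form, a cross-validated classifier applied to ROI voxel patterns. The sketch below illustrates that form with simulated data and a linear SVM in scikit-learn; it is not the authors' pipeline, and all shapes, labels, and cross-validation choices are assumptions.

```python
# Hedged sketch: leave-one-run-out decoding of stimulus location from voxel patterns.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)
n_trials, n_voxels, n_runs = 96, 200, 8
X = rng.normal(0.0, 1.0, (n_trials, n_voxels))           # simulated ROI patterns
y = rng.integers(0, 2, n_trials)                         # e.g. left vs right foot
runs = np.repeat(np.arange(n_runs), n_trials // n_runs)  # run labels for CV folds

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())
print(f"mean decoding accuracy: {scores.mean():.2f}")     # ~0.5 for random labels
```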
Affiliation(s)
- Janina Klautke: Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
- Celia Foster: Biopsychology & Cognitive Neuroscience, Bielefeld University, Bielefeld, Germany; Center of Excellence in Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- W Pieter Medendorp: Radboud University, Donders Institute for Brain, Cognition and Behaviour, Nijmegen, The Netherlands
- Tobias Heed: Biopsychology & Cognitive Neuroscience, Bielefeld University, Bielefeld, Germany; Center of Excellence in Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany; Cognitive Psychology, Department of Psychology, University of Salzburg, Salzburg, Austria; Centre for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
7. Gherri E, White F, Venables E. On the spread of spatial attention in touch: Evidence from Event-Related Brain potentials. Biol Psychol 2023; 178:108544. PMID: 36931591. DOI: 10.1016/j.biopsycho.2023.108544.
Abstract
To investigate the distribution of tactile spatial attention near the current attentional focus, participants were cued to attend to one of four body locations (hand or shoulder on the left or right side) to respond to infrequent tactile targets. In this Narrow attention task, effects of spatial attention on the ERPs elicited by tactile stimuli delivered to the hands were compared as a function of the distance from the attentional focus (Focus on the hand vs. Focus on the shoulder). When participants focused on the hand, attentional modulations of the sensory-specific P100 and N140 components were followed by the longer latency Nd component. Notably, when participants focused on the shoulder, they were unable to restrict their attentional resources to the cued location, as revealed by the presence of reliable attentional modulations at the hands. This effect of attention outside the attentional focus was delayed and reduced compared to that observed within the attentional focus, revealing the presence of an attentional gradient. In addition, to investigate whether the size of the attentional focus modulated the effects of tactile spatial attention on somatosensory processing, participants also completed the Broad attention task, in which they were cued to attend to two locations (both the hand and the shoulder) on the left or right side. Attentional modulations at the hands emerged later and were reduced in the Broad compared to the Narrow attention task, suggesting reduced attentional resources for a wider attentional focus.
Affiliation(s)
- Elena Gherri: Human Cognitive Neuroscience, University of Edinburgh, UK; Università di Bologna, Italy
- Felicity White: Human Cognitive Neuroscience, University of Edinburgh, UK
8. Gherri E, White F, Ambron E. Searching on the Back: Attentional Selectivity in the Periphery of the Tactile Field. Front Psychol 2022; 13:934573. PMID: 35911043. PMCID: PMC9328746. DOI: 10.3389/fpsyg.2022.934573.
Abstract
Recent evidence has identified the N140cc lateralized component of event-related potentials as a reliable index of the deployment of attention to task-relevant items in touch. However, existing ERP studies have presented the tactile search array to participants' limbs, most often to the hands. Here, we investigated distractor interference effects when the tactile search array was presented to a portion of the body that is less lateralized and peripheral compared to the hands. Participants were asked to localize a tactile target presented among distractors in a circular arrangement to their back. The N140cc was elicited contralateral to the target when the singleton distractor was absent. Its amplitude was reduced when the singleton distractor was present and contralateral to the target, suggesting that attention was directed at least in part to the distractor when the singletons are on opposite sides. However, similar N140cc were observed when the singleton distractor was ipsilateral to the target compared to distractor absent trials. We suggest that when target and singleton distractor are ipsilateral, the exact localization of the target requires the attentional processing of all items on the same side of the array, similar to distractor absent trials. Together, these observations replicate the distractor interference effects previously observed for the hands, suggesting that analogous mechanisms guide attentional selectivity across different body parts.
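Lateralized components such as the N140cc are conventionally computed as contralateral-minus-ipsilateral difference waves collapsed over target side. A minimal sketch on simulated grand averages follows; the electrode pair and measurement window are assumptions.

```python
# Hedged sketch: contralateral-minus-ipsilateral (N140cc-style) difference wave.
import numpy as np

rng = np.random.default_rng(5)
sfreq = 500.0
times = np.arange(-0.1, 0.5, 1.0 / sfreq)
# Grand-average ERPs at an assumed lateral pair (e.g. C3/C4), per target side (µV)
c3_left, c4_left = rng.normal(0, 1, times.size), rng.normal(0, 1, times.size)
c3_right, c4_right = rng.normal(0, 1, times.size), rng.normal(0, 1, times.size)

# Contralateral minus ipsilateral activity, averaged over left and right targets
lateralized = 0.5 * ((c4_left - c3_left) + (c3_right - c4_right))
win = (times >= 0.14) & (times <= 0.20)                  # assumed N140cc window
print(f"mean lateralized amplitude: {lateralized[win].mean():.2f} uV")
```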
Affiliation(s)
- Elena Gherri: Dipartimento di Filosofia e Comunicazione, University of Bologna, Bologna, Italy; Human Cognitive Neuroscience, University of Edinburgh, Edinburgh, United Kingdom
- Felicity White: Human Cognitive Neuroscience, University of Edinburgh, Edinburgh, United Kingdom
- Elisabetta Ambron: Laboratory for Cognition and Neural Stimulation, Neurology Department, School of Medicine, University of Pennsylvania, Philadelphia, PA, United States
9. Liu Y, Wang W, Xu W, Cheng Q, Ming D. Quantifying the Generation Process of Multi-Level Tactile Sensations via ERP Component Investigation. Int J Neural Syst 2021; 31:2150049. PMID: 34635035. DOI: 10.1142/s0129065721500490.
Abstract
Humans obtain characteristic information, such as the texture and weight of external objects, through the brain's integration and classification of tactile information; however, how multi-level tactile information is decoded over time remains relatively elusive. In this paper, electrotactile stimulation with a fixed frequency and varying pulse width was used to generate multi-level pressure sensations. Event-related potentials (ERPs) were measured to investigate the whole temporal course of tactile processing. Five ERP components (P100, N140, P200, N200, and P300) were observed. By relating stimulation parameters to ERP component amplitudes, we found that (1) P200 is the most informative component for distinguishing multi-level tactile sensations, and (2) P300 correlates well with the subjective judgment of tactile sensation. The temporal sequence of brain topographies was used to clarify the spatiotemporal characteristics of tactile processing, which conformed to the serial processing model in neurophysiology and to the cortical response areas described by fMRI. These results can help clarify the mechanism of sequential tactile processing and can be applied to improve tactile BCI performance, sensory enhancement, and clinical diagnosis of tactile processing disorders based on temporal ERP components.
Affiliation(s)
- Yuan Liu: Academy of Medical Engineering and Translational Medicine, Tianjin University, 92 Weijin Road, Nankai District, Tianjin, P. R. China
- Wenjie Wang: Academy of Medical Engineering and Translational Medicine, Tianjin University, 92 Weijin Road, Nankai District, Tianjin, P. R. China
- Weiguo Xu: Tianjin Hospital, Tianjin University, 406 South Jiefang Road, Hexi District, Tianjin, P. R. China
- Qian Cheng: Academy of Medical Engineering and Translational Medicine, Tianjin University, 92 Weijin Road, Nankai District, Tianjin, P. R. China
- Dong Ming: College of Precision Instruments and Optoelectronics Engineering, Academy of Medical Engineering and Translational Medicine, Tianjin University, 92 Weijin Road, Nankai District, Tianjin, P. R. China
10. Spinal and Cerebral Integration of Noxious Inputs in Left-handed Individuals. Brain Topogr 2021; 34:568-586. PMID: 34338897. DOI: 10.1007/s10548-021-00864-y.
Abstract
Some pain-related information is processed preferentially in the right cerebral hemisphere. Considering that functional lateralization can be affected by handedness, spinal and cerebral pain-related responses may differ between right- and left-handed individuals. Therefore, this study aimed to investigate the cortical and spinal mechanisms of nociceptive integration when nociceptive stimuli are applied to right-handed vs. left-handed individuals. The nociceptive flexion reflex (NFR), evoked potentials (ERP: P45, N100, P260), and event-related spectral perturbations (ERSP: theta, alpha, beta, and gamma band oscillations) were compared between ten right-handed and ten left-handed participants. Pain was induced by transcutaneous electrical stimulation of the lower limbs and left upper limb. Stimulation intensity was adjusted individually in five counterbalanced conditions of 21 stimuli each: three unilateral (right lower limb, left lower limb, and left upper limb stimulation) and two bilateral conditions (right and left lower limbs, and the right lower limb and left upper limb stimulation). The amplitude of the NFR, ERP, ERSP, and pain ratings were compared between groups and conditions using a mixed ANOVA. A significant increase of responses was observed in bilateral compared with unilateral conditions for pain intensity, NFR amplitude, N100, theta oscillations, and gamma oscillations. However, these effects were not significantly different between right- and left-handed individuals. These results suggest that spinal and cerebral integration of bilateral nociceptive inputs is similar between right- and left-handed individuals. They also imply that the pain-related responses measured in this study may be examined independently of handedness.
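The group-by-condition comparison described above is a standard mixed ANOVA (handedness as a between-subject factor, stimulation condition as a within-subject factor). The sketch below shows that generic analysis on simulated NFR amplitudes using pingouin, which is an assumed tool choice rather than necessarily the one used in the study.

```python
# Hedged sketch: mixed ANOVA on simulated NFR amplitudes (group x condition).
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(6)
subjects = [f"s{i:02d}" for i in range(20)]
groups = ["right-handed"] * 10 + ["left-handed"] * 10
conditions = ["unilateral", "bilateral"]

rows = []
for subj, grp in zip(subjects, groups):
    for cond in conditions:
        boost = 1.0 if cond == "bilateral" else 0.0     # simulated bilateral facilitation
        rows.append({"subject": subj, "group": grp, "condition": cond,
                     "nfr": rng.normal(5.0 + boost, 1.0)})
df = pd.DataFrame(rows)

aov = pg.mixed_anova(data=df, dv="nfr", within="condition",
                     subject="subject", between="group")
print(aov[["Source", "F", "p-unc"]])
```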
11. Berchicci M, Russo Y, Bianco V, Quinzi F, Rum L, Macaluso A, Committeri G, Vannozzi G, Di Russo F. Stepping forward, stepping backward: a movement-related cortical potential study unveils distinctive brain activities. Behav Brain Res 2020; 388:112663. DOI: 10.1016/j.bbr.2020.112663.
12. Chen S, Shi Z, Zang X, Zhu X, Assumpção L, Müller HJ, Geyer T. Crossmodal learning of target-context associations: When would tactile context predict visual search? Atten Percept Psychophys 2020; 82:1682-1694. PMID: 31845105. PMCID: PMC7297845. DOI: 10.3758/s13414-019-01907-0.
Abstract
It is well established that statistical learning of visual target locations in relation to constantly positioned visual distractors facilitates visual search. In the present study, we investigated whether such a contextual-cueing effect would also work crossmodally, from touch onto vision. Participants responded to the orientation of a visual target singleton presented among seven homogeneous visual distractors. Four tactile stimuli, two to different fingers of each hand, were presented either simultaneously with or prior to the visual stimuli. The identity of the stimulated fingers provided the crossmodal context cue: in half of the trials, a given visual target location was consistently paired with a given tactile configuration. The visual stimuli were presented above the unseen fingers, ensuring spatial correspondence between vision and touch. We found no evidence of crossmodal contextual cueing when the two sets of items (tactile, visual) were presented simultaneously (Experiment 1). However, a reliable crossmodal effect emerged when the tactile distractors preceded the onset of the visual stimuli by 700 ms (Experiment 2). But crossmodal cueing disappeared again when, after an initial learning phase, participants flipped their hands, making the tactile distractors appear at different positions in external space while their somatotopic positions remained unchanged (Experiment 3). In all experiments, participants were unable to explicitly discriminate learned from novel multisensory arrays. These findings indicate that search-facilitating context memory can be established across vision and touch. However, in order to guide visual search, the (predictive) tactile configurations must be remapped from their initial somatotopic format into a common external representational format.
Affiliation(s)
- Siyi Chen: General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802 Munich, Germany
- Zhuanghua Shi: General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802 Munich, Germany
- Xuelian Zang: Center for Cognition and Brain Disorders, Institute of Psychological Sciences, Hangzhou Normal University, Hangzhou, China
- Xiuna Zhu: General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802 Munich, Germany
- Leonardo Assumpção: General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802 Munich, Germany
- Hermann J Müller: General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802 Munich, Germany
- Thomas Geyer: General and Experimental Psychology, Department of Psychology, LMU Munich, Leopoldstr 13, 80802 Munich, Germany
13. Scandola M, Aglioti SM, Lazzeri G, Avesani R, Ionta S, Moro V. Visuo-motor and interoceptive influences on peripersonal space representation following spinal cord injury. Sci Rep 2020; 10:5162. PMID: 32198431. PMCID: PMC7083926. DOI: 10.1038/s41598-020-62080-1.
Abstract
Peripersonal space (PPS) representation is modulated by information coming from the body. In paraplegic individuals, whose lower limb sensory-motor functions are impaired or completely lost, the representation of PPS around the feet is reduced. However, passive motion can have short-term restorative effects. What remains unclear is the mechanisms underlying this recovery, in particular with regard to the contribution of visual and motor feedback and of interoception. Using virtual reality technology, we dissociated the motor and visual feedback during passive motion in paraplegics with complete and incomplete lesions and in healthy controls. The results show that in the case of paraplegics, the presence of motor feedback was necessary for the recovery of PPS representation, both when the motor feedback was congruent and when it was incongruent with the visual feedback. In contrast, visuo-motor incongruence led to an inhibition of PPS representation in the control group. There were no differences in sympathetic responses between the three groups. Nevertheless, in individuals with incomplete lesions, greater interoceptive sensitivity was associated with a better representation of PPS around the feet in the visuo-motor incongruent conditions. These results shed new light on the modulation of PPS representation, and demonstrate the importance of residual motor feedback and its integration with other bodily information in maintaining space representation.
Affiliation(s)
- Michele Scandola: NPSY-Lab.VR, Department of Human Sciences, University of Verona, Verona, Italy; IRCCS Fondazione Santa Lucia, Rome, Italy
- Salvatore Maria Aglioti: IRCCS Fondazione Santa Lucia, Rome, Italy; Department of Psychology, University of Rome "Sapienza", Rome, Italy; Istituto Italiano di Tecnologia, Rome, Italy
- Renato Avesani: Department of Rehabilitation, IRCSS Sacro Cuore - Don Calabria Hospital, Verona, Italy
- Silvio Ionta: Sensory-Motor Lab (SeMoLa), Department of Ophthalmology, University of Lausanne, Jules Gonin Eye Hospital, Fondation Asile des Aveugles, Lausanne, Switzerland
- Valentina Moro: NPSY-Lab.VR, Department of Human Sciences, University of Verona, Verona, Italy
14.
Abstract
Humans localize touch on hand-held tools by interpreting the unique vibratory patterns elicited by impact to different parts of the tool. This perceptual strategy differs markedly from localizing touch on the skin. A new study shows that, nonetheless, touch location is probably processed similarly for skin and tool already early in somatosensory cortex.
Affiliation(s)
- Tobias Heed: Faculty of Psychology and Sports Science and Cluster of Excellence "Cognitive Interaction Technology", Bielefeld University, Universitätsstr. 25, 33615 Bielefeld, Germany
15. Christie BP, Charkhkar H, Shell CE, Marasco PD, Tyler DJ, Triolo RJ. Visual inputs and postural manipulations affect the location of somatosensory percepts elicited by electrical stimulation. Sci Rep 2019; 9:11699. PMID: 31406122. PMCID: PMC6690924. DOI: 10.1038/s41598-019-47867-1.
Abstract
The perception of somatosensation requires the integration of multimodal information, yet the effects of vision and posture on somatosensory percepts elicited by neural stimulation are not well established. In this study, we applied electrical stimulation directly to the residual nerves of trans-tibial amputees to elicit sensations referred to their missing feet. We evaluated the influence of congruent and incongruent visual inputs and postural manipulations on the perceived size and location of stimulation-evoked somatosensory percepts. We found that although standing upright may cause percept size to change, congruent visual inputs and/or body posture resulted in better localization. We also observed visual capture: the location of a somatosensory percept shifted toward a visual input when vision was incongruent with stimulation-induced sensation. Visual capture did not occur when an adopted posture was incongruent with somatosensation. Our results suggest that internal model predictions based on postural manipulations reinforce perceived sensations, but do not alter them. These characterizations of multisensory integration are important for the development of somatosensory-enabled prostheses because current neural stimulation paradigms cannot replicate the afferent signals of natural tactile stimuli. Nevertheless, multisensory inputs can improve perceptual precision and highlight regions of the foot important for balance and locomotion.
Affiliation(s)
- Breanne P Christie: Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA; Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
- Hamid Charkhkar: Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA; Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
- Courtney E Shell: Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA; Department of Biomedical Engineering, Lerner Research Institute, Cleveland Clinic, Cleveland, OH, USA
- Paul D Marasco: Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA; Department of Biomedical Engineering, Lerner Research Institute, Cleveland Clinic, Cleveland, OH, USA
- Dustin J Tyler: Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA; Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
- Ronald J Triolo: Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA; Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
16. Miller LE, Longo MR, Saygin AP. Tool Use Modulates Somatosensory Cortical Processing in Humans. J Cogn Neurosci 2019; 31:1782-1795. PMID: 31368823. DOI: 10.1162/jocn_a_01452.
Abstract
Tool use leads to plastic changes in sensorimotor body representations underlying tactile perception. The neural correlates of this tool-induced plasticity in humans have not been adequately characterized. This study used ERPs to investigate the stage of sensory processing modulated by tool use. Somatosensory evoked potentials, elicited by median nerve stimulation, were recorded before and after two forms of object interaction: tool use and hand use. Compared with baseline, tool use-but not use of the hand alone-modulated the amplitude of the P100. The P100 is a mid-latency component that indexes the construction of multisensory models of the body and has generators in secondary somatosensory and posterior parietal cortices. These results mark one of the first demonstrations of the neural correlates of tool-induced plasticity in humans and suggest that tool use modulates relatively late stages of somatosensory processing outside primary somatosensory cortex. This finding is consistent with what has been observed in tool-trained monkeys and suggests that the mechanisms underlying tool-induced plasticity have been preserved across primate evolution.
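The core contrast reported above is a within-subject comparison of P100 amplitude before versus after tool use, i.e. a paired test on component amplitudes. A minimal sketch with simulated values follows; the sample size and effect size are assumptions.

```python
# Hedged sketch: paired comparison of P100 amplitude before vs after tool use.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(7)
n_subjects = 18
p100_pre = rng.normal(2.0, 0.5, n_subjects)              # µV, baseline block
p100_post = p100_pre + rng.normal(0.4, 0.5, n_subjects)  # µV, after tool use

t, p = ttest_rel(p100_post, p100_pre)
print(f"P100 post vs pre: t({n_subjects - 1}) = {t:.2f}, p = {p:.3f}")
```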
Affiliation(s)
- Luke E Miller: University of California, San Diego; Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, Bron Cedex, France
17. Rahman MS, Yau JM. Somatosensory interactions reveal feature-dependent computations. J Neurophysiol 2019; 122:5-21. DOI: 10.1152/jn.00168.2019.
Abstract
Our ability to perceive and discriminate textures is based on the processing of high-frequency vibrations generated on the fingertip as it scans across a surface. Although much is known about the processing of vibration amplitude and frequency information when cutaneous stimulation is experienced at a single location on the body, how these stimulus features are processed when touch occurs at multiple locations is poorly understood. We evaluated participants’ ability to discriminate tactile cues (100–300 Hz) on one hand while they ignored distractor cues experienced on their other hand. We manipulated the relative positions of the hands to characterize how limb position influenced cutaneous touch interactions. In separate experiments, participants judged either the frequency or intensity of mechanical vibrations. We found that vibrations experienced on one hand always systematically modulated the perception of vibrations on the other hand. Notably, bimanual interaction patterns and their sensitivity to hand locations differed according to stimulus feature. Somatosensory interactions in intensity perception were only marked by attenuation that was invariant to hand position manipulations. In contrast, interactions in frequency perception consisted of both bias and sensitivity changes that were more pronounced when the hands were held in close proximity. We implemented models to infer the neural computations that mediate somatosensory interactions in the intensity and frequency dimensions. Our findings reveal obligatory and feature-dependent somatosensory interactions that may be supported by both feature-specific and feature-general operations. NEW & NOTEWORTHY Little is known about the neural computations mediating feature-specific sensory interactions between the hands. We show that vibrations experienced on one hand systematically modulate the perception of vibrations felt on the other hand. Critically, interaction patterns and their dependence on the relative positions of the hands differed depending on whether participants judged vibration intensity or frequency. These results, which we recapitulate with models, imply that somatosensory interactions are mediated by feature-dependent neural computations.
Affiliation(s)
- Jeffrey M. Yau: Department of Neuroscience, Baylor College of Medicine, Houston, Texas
18. Alpha-band oscillations reflect external spatial coding for tactile stimuli in sighted, but not in congenitally blind humans. Sci Rep 2019; 9:9215. PMID: 31239467. PMCID: PMC6592921. DOI: 10.1038/s41598-019-45634-w.
Abstract
We investigated the function of oscillatory alpha-band activity in the neural coding of spatial information during tactile processing. Sighted humans concurrently encode tactile location in skin-based and, after integration with posture, external spatial reference frames, whereas congenitally blind humans preferably use skin-based coding. Accordingly, lateralization of alpha-band activity in parietal regions during attentional orienting in expectance of tactile stimulation reflected external spatial coding in sighted, but skin-based coding in blind humans. Here, we asked whether alpha-band activity plays a similar role in spatial coding for tactile processing, that is, after the stimulus has been received. Sighted and congenitally blind participants were cued to attend to one hand in order to detect rare tactile deviant stimuli at this hand while ignoring tactile deviants at the other hand and tactile standard stimuli at both hands. The reference frames encoded by oscillatory activity during tactile processing were probed by adopting either an uncrossed or crossed hand posture. In sighted participants, attended relative to unattended standard stimuli suppressed the power in the alpha-band over ipsilateral centro-parietal and occipital cortex. Hand crossing attenuated this attentional modulation predominantly over ipsilateral posterior-parietal cortex. In contrast, although contralateral alpha-activity was enhanced for attended versus unattended stimuli in blind participants, no crossing effects were evident in the oscillatory activity of this group. These findings suggest that oscillatory alpha-band activity plays a pivotal role in the neural coding of external spatial information for touch.
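Attentional modulation of alpha activity in such designs is often summarized as a normalized lateralization index contrasting power contralateral versus ipsilateral to the attended hand. The sketch below illustrates one common formulation on simulated power values; it is not necessarily the index used in this study.

```python
# Hedged sketch: a normalized alpha lateralization index across participants.
import numpy as np

rng = np.random.default_rng(8)
n_participants = 30
alpha_contra = rng.uniform(1.0, 2.0, n_participants)  # alpha power contralateral to attention
alpha_ipsi = rng.uniform(1.0, 2.0, n_participants)    # alpha power ipsilateral to attention

lat_index = (alpha_contra - alpha_ipsi) / (alpha_contra + alpha_ipsi)
print(f"mean lateralization index: {lat_index.mean():+.3f}")
```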
19. Tamè L, Azañón E, Longo MR. A Conceptual Model of Tactile Processing across Body Features of Size, Shape, Side, and Spatial Location. Front Psychol 2019; 10:291. PMID: 30863333. PMCID: PMC6399380. DOI: 10.3389/fpsyg.2019.00291.
Abstract
The processing of touch depends on multiple factors, such as the properties of the skin and the type of receptors stimulated, as well as features related to the actual configuration and shape of the body itself. A large body of research has focused on the effect that the nature of the stimuli has on tactile processing. Less research, however, has focused on features beyond the nature of the touch. In this review, we focus on some features related to the body that have been investigated for less time and in a more fragmented way. These include the symmetrical quality of the two sides of the body, the postural configuration of the body, as well as the size and shape of different body parts. We will describe what we consider three key aspects: (1) how and at which stages tactile information is integrated between different parts and sides of the body; (2) how tactile signals are integrated with online and stored postural configurations of the body, regarded as priors; and (3) how tactile signals are integrated with representations of body size and shape. Here, we describe how these different body dimensions affect the integration of tactile information and guide motor behavior, bringing them together in a single model of tactile processing. We review a wide range of neuropsychological, neuroimaging, and neurophysiological data and suggest a revised model of tactile integration on the basis of the one proposed previously by Longo et al.
Affiliation(s)
- Luigi Tamè: Department of Psychological Sciences, Birkbeck University of London, London, United Kingdom; School of Psychology, University of Kent, Canterbury, United Kingdom
- Elena Azañón: Institute of Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany; Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Magdeburg, Germany
- Matthew R Longo: Department of Psychological Sciences, Birkbeck University of London, London, United Kingdom
20. Kida T, Tanaka E, Kakigi R. Adaptive flexibility of the within-hand attentional gradient in touch: An MEG study. Neuroimage 2018; 179:373-384. DOI: 10.1016/j.neuroimage.2018.06.063.
21. Dall'Orso S, Steinweg J, Allievi AG, Edwards AD, Burdet E, Arichi T. Somatotopic Mapping of the Developing Sensorimotor Cortex in the Preterm Human Brain. Cereb Cortex 2018; 28:2507-2515. PMID: 29901788. PMCID: PMC5998947. DOI: 10.1093/cercor/bhy050.
Abstract
In the mature mammalian brain, the primary somatosensory and motor cortices are known to be spatially organized such that neural activity relating to specific body parts can be somatotopically mapped onto an anatomical "homunculus". This organization creates an internal body representation which is fundamental for precise motor control, spatial awareness and social interaction. Although it is unknown when this organization develops in humans, animal studies suggest that it may emerge even before the time of normal birth. We therefore characterized the somatotopic organization of the primary sensorimotor cortices using functional MRI and a set of custom-made robotic tools in 35 healthy preterm infants aged from 31 + 6 to 36 + 3 weeks postmenstrual age. Functional responses induced by somatosensory stimulation of the wrists, ankles, and mouth had a distinct spatial organization, as seen in the characteristic mature homunculus map. In comparison to the ankle, activation related to wrist stimulation was significantly larger and more commonly involved additional areas, including the supplementary motor area and the ipsilateral sensorimotor cortex. These results are in keeping with early intrinsic determination of a somatotopic map within the primary sensorimotor cortices. This may explain why acquired brain injury in this region during the preterm period cannot be compensated for by cortical reorganization and therefore can lead to long-lasting motor and sensory impairment.
Affiliation(s)
- S Dall'Orso: Department of Bioengineering, Imperial College London, London, UK; Centre for the Developing Brain, School of Biomedical Engineering and Imaging Sciences, King's College London, King's Health Partners, St Thomas' Hospital, London, UK
- J Steinweg: Centre for the Developing Brain, School of Biomedical Engineering and Imaging Sciences, King's College London, King's Health Partners, St Thomas' Hospital, London, UK
- A G Allievi: Department of Bioengineering, Imperial College London, London, UK
- A D Edwards: Department of Bioengineering, Imperial College London, London, UK; Centre for the Developing Brain, School of Biomedical Engineering and Imaging Sciences, King's College London, King's Health Partners, St Thomas' Hospital, London, UK
- E Burdet: Department of Bioengineering, Imperial College London, London, UK
- T Arichi: Department of Bioengineering, Imperial College London, London, UK; Centre for the Developing Brain, School of Biomedical Engineering and Imaging Sciences, King's College London, King's Health Partners, St Thomas' Hospital, London, UK; Paediatric Neurosciences, Evelina London Children's Hospital, St Thomas' Hospital, London, UK
22. Meltzoff AN, Saby JN, Marshall PJ. Neural representations of the body in 60-day-old human infants. Dev Sci 2018; 22:e12698. PMID: 29938877. DOI: 10.1111/desc.12698.
Abstract
The organization of body representations in the adult brain has been well documented. Little is understood about this aspect of brain organization in human infancy. The current study employed electroencephalography (EEG) with 60-day-old infants to test the distribution of brain responses to tactile stimulation of three different body parts: hand, foot, and lip. Analyses focused on a prominent positive response occurring at 150-200 ms in the somatosensory evoked potential at central and parietal electrode sites. The results show differential electrophysiological signatures for touch of these three body parts. Stimulation of the left hand was associated with greater positive amplitude over the lateral central region contralateral to the side stimulated. Left foot stimulation was associated with greater positivity over the midline parietal site. Stimulation of the midline of the upper lip was associated with a strong bilateral response over the central region. These findings provide new insights into the neural representation of the body in infancy and shed light on research and theories about the involvement of somatosensory cortex in infant imitation and social perception.
Affiliation(s)
- Andrew N Meltzoff: Institute for Learning & Brain Sciences, University of Washington, Seattle, Washington
- Joni N Saby: Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania
- Peter J Marshall: Department of Psychology, Temple University, Philadelphia, Pennsylvania
23. Task-irrelevant sounds influence both temporal order and apparent-motion judgments about tactile stimuli applied to crossed and uncrossed hands. Atten Percept Psychophys 2017; 80:773-783. DOI: 10.3758/s13414-017-1476-5.
24. Schubert JTW, Badde S, Röder B, Heed T. Task demands affect spatial reference frame weighting during tactile localization in sighted and congenitally blind adults. PLoS One 2017; 12:e0189067. PMID: 29228023. PMCID: PMC5724835. DOI: 10.1371/journal.pone.0189067.
Abstract
Task demands modulate tactile localization in sighted humans, presumably through weight adjustments in the spatial integration of anatomical, skin-based, and external, posture-based information. In contrast, previous studies have suggested that congenitally blind humans, by default, refrain from automatic spatial integration and localize touch using only skin-based information. Here, sighted and congenitally blind participants localized tactile targets on the palm or back of one hand, while ignoring simultaneous tactile distractors at congruent or incongruent locations on the other hand. We probed the interplay of anatomical and external location codes for spatial congruency effects by varying hand posture: the palms either both faced down, or one faced down and one up. In the latter posture, externally congruent target and distractor locations were anatomically incongruent and vice versa. Target locations had to be reported either anatomically (“palm” or “back” of the hand), or externally (“up” or “down” in space). Under anatomical instructions, performance was more accurate for anatomically congruent than incongruent target-distractor pairs. In contrast, under external instructions, performance was more accurate for externally congruent than incongruent pairs. These modulations were evident in sighted and blind individuals. Notably, distractor effects were overall far smaller in blind than in sighted participants, despite comparable target-distractor identification performance. Thus, the absence of developmental vision seems to be associated with an increased ability to focus tactile attention towards a non-spatially defined target. Nevertheless, that blind individuals exhibited effects of hand posture and task instructions in their congruency effects suggests that, like the sighted, they automatically integrate anatomical and external information during tactile localization. Moreover, spatial integration in tactile processing is, thus, flexibly adapted by top-down information—here, task instruction—even in the absence of developmental vision.
Affiliation(s)
- Jonathan T. W. Schubert: Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Stephanie Badde: Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany; Department of Psychology, New York University, New York, United States of America
- Brigitte Röder: Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
- Tobias Heed: Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany; Biopsychology & Cognitive Neuroscience, Faculty of Psychology & Sports Science, Bielefeld University, Bielefeld, Germany; Center of Excellence in Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
25. Shen G, Smyk NJ, Meltzoff AN, Marshall PJ. Using somatosensory mismatch responses as a window into somatotopic processing of tactile stimulation. Psychophysiology 2017; 55:e13030. PMID: 29139557. DOI: 10.1111/psyp.13030.
Abstract
Brain responses to tactile stimulation have often been studied through the examination of ERPs elicited to touch on the body surface. Here, we examined two factors potentially modulating the amplitude of the somatosensory mismatch negativity (sMMN) and P300 responses elicited by touch to pairs of body parts: (a) the distance between the representation of these body parts in somatosensory cortex, and (b) the physical distances between the stimulated points on the body surface. The sMMN and the P300 response were elicited by tactile stimulation in two oddball protocols. One protocol leveraged a discontinuity in cortical somatotopic organization, and involved stimulation of either the neck or the hand in relation to stimulation of the lip. The other protocol involved stimulation to the third or fifth finger in relation to the second finger. The neck-lip pairing resulted in significantly larger sMMN responses (with shorter latencies) than the hand-lip pairing, whereas the reverse was true for the amplitude of the P300. Mean sMMN amplitude and latency did not differ between finger pairings. However, larger P300 responses were elicited to stimulation of the fifth finger than the third finger. These results suggest that, for certain combinations of body parts, early automatic somatosensory mismatch responses may be influenced by distance between the cortical representations of these body parts, whereas the later P300 response may be more influenced by the distance between stimulated body parts on the body surface. Future investigations can shed more light on this novel suggestion.
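The somatosensory mismatch negativity is conventionally quantified from the deviant-minus-standard difference wave. A minimal sketch on simulated grand-average ERPs follows; the measurement window is an assumption, not the one used in the paper.

```python
# Hedged sketch: deviant-minus-standard difference wave for an sMMN measure.
import numpy as np

rng = np.random.default_rng(9)
sfreq = 500.0
times = np.arange(-0.1, 0.5, 1.0 / sfreq)
erp_standard = rng.normal(0.0, 1.0, times.size)       # grand-average ERP to standards (µV)
erp_deviant = rng.normal(0.0, 1.0, times.size)        # grand-average ERP to deviants (µV)

smmn_wave = erp_deviant - erp_standard
win = (times >= 0.10) & (times <= 0.20)               # assumed sMMN window
print(f"mean sMMN amplitude: {smmn_wave[win].mean():.2f} uV")
```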
Collapse
Affiliation(s)
- Guannan Shen
- Department of Psychology, Temple University, Philadelphia, Pennsylvania, USA
| | - Nathan J Smyk
- Department of Psychology, Temple University, Philadelphia, Pennsylvania, USA
| | - Andrew N Meltzoff
- Institute for Learning and Brain Sciences, University of Washington, Seattle, Washington, USA
| | - Peter J Marshall
- Department of Psychology, Temple University, Philadelphia, Pennsylvania, USA
| |
Collapse
|
26
|
Aggius-Vella E, Campus C, Finocchietti S, Gori M. Audio Spatial Representation Around the Body. Front Psychol 2017; 8:1932. [PMID: 29249999 PMCID: PMC5715385 DOI: 10.3389/fpsyg.2017.01932] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/05/2017] [Accepted: 10/19/2017] [Indexed: 11/13/2022] Open
Abstract
Studies have found that portions of the space around our body are coded differently by our brain. Numerous works have investigated visual and auditory spatial representation, focusing mostly on stimuli presented at head level, especially in frontal space. Only a few studies have investigated spatial representation around the entire body and its relationship with motor activity. Moreover, it is still not clear whether the space surrounding us is represented as a unitary dimension or whether it is split into different portions, each shaped differently by our senses and motor activity. To clarify these points, we investigated audio localization of dynamic and static sounds at different body levels. To understand the role of motor action in auditory space representation, we asked subjects to localize sounds by pointing with the hand or the foot, or by giving a verbal answer. We found that sound localization differed depending on the body part considered. Moreover, a different pattern of responses was observed when subjects localized sounds with an action than when they responded verbally. These results suggest that the audio space around our body is split into several portions that are perceived differently: front, back, around the chest, and around the foot. These four areas may thus be modulated differently by our senses and our actions.
Collapse
Affiliation(s)
- Elena Aggius-Vella
- Unit for Visually Impaired People, Center for Human Technologies, Istituto Italiano di Tecnologia, Genoa, Italy
| | - Claudio Campus
- Unit for Visually Impaired People, Center for Human Technologies, Istituto Italiano di Tecnologia, Genoa, Italy
| | - Sara Finocchietti
- Unit for Visually Impaired People, Center for Human Technologies, Istituto Italiano di Tecnologia, Genoa, Italy
| | - Monica Gori
- Unit for Visually Impaired People, Center for Human Technologies, Istituto Italiano di Tecnologia, Genoa, Italy
| |
Collapse
|
27
|
Świder K, Wronka E, Oosterman JM, van Rijn CM, Jongsma MLA. Influence of transient spatial attention on the P3 component and perception of painful and non-painful electric stimuli in crossed and uncrossed hands positions. PLoS One 2017; 12:e0182616. [PMID: 28873414 PMCID: PMC5584947 DOI: 10.1371/journal.pone.0182616] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/30/2016] [Accepted: 07/22/2017] [Indexed: 11/19/2022] Open
Abstract
Recent reports show that focusing attention on the location where pain is expected can enhance its perception. Moreover, crossing the hands over the body’s midline is known to impair the ability to localise stimuli and to decrease tactile and pain sensations in healthy participants. The present study investigated the role of transient spatial attention in the perception of painful and non-painful electrical stimuli in conditions in which a match or a mismatch was induced between skin-based and external frames of reference (uncrossed and crossed hands positions, respectively). We measured the subjective experience (Numerical Rating Scale scores) and the electrophysiological response elicited by brief electric stimuli by analysing the P3 component of Event-Related Potentials (ERPs). Twenty-two participants underwent eight painful and eight non-painful stimulus blocks. The electrical stimuli were applied to either the left or the right hand, held in either a crossed or uncrossed position. Each stimulus was preceded by a direction cue (leftward or rightward arrow). In 80% of the trials, the arrow correctly pointed to the spatial region where the stimulus would appear (congruent cueing). Our results indicated that congruent cues resulted in increased pain NRS scores compared to incongruent ones. For non-painful stimuli such an effect was observed only in the uncrossed hands position. For both non-painful and painful stimuli the P3 peak amplitudes were higher and occurred later for incongruently cued stimuli compared with congruently cued ones. However, we found that crossing the hands substantially reduced the cueing effect on the P3 peak amplitudes elicited by painful stimuli. Taken together, our results showed a strong influence of transient attention manipulations on NRS ratings and on brain activity. Our results also suggest that hand position may modulate the strength of the cueing effect, although differences between painful and non-painful stimuli exist.
Collapse
Affiliation(s)
- Karolina Świder
- Institute of Psychology, Jagiellonian University, Kraków, Poland
- Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, The Netherlands
| | - Eligiusz Wronka
- Institute of Psychology, Jagiellonian University, Kraków, Poland
| | - Joukje M. Oosterman
- Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, The Netherlands
| | - Clementina M. van Rijn
- Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, The Netherlands
| | - Marijtje L. A. Jongsma
- Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, The Netherlands
- Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands
| |
Collapse
|
28
|
Pitti A, Pugach G, Gaussier P, Shimada S. Spatio-Temporal Tolerance of Visuo-Tactile Illusions in Artificial Skin by Recurrent Neural Network with Spike-Timing-Dependent Plasticity. Sci Rep 2017; 7:41056. [PMID: 28106139 PMCID: PMC5247701 DOI: 10.1038/srep41056] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/23/2016] [Accepted: 12/16/2016] [Indexed: 12/15/2022] Open
Abstract
Perceptual illusions across multiple modalities, such as the rubber-hand illusion, show how dynamically the brain adapts its body image and determines what is part of it (the self) and what is not (others). Several studies have shown that redundancy and contingency among sensory signals are essential for perceiving the illusion, and that a lag of 200-300 ms is the critical limit for the brain to represent a signal as belonging to one's own body. In an experimental setup with artificial skin, we replicate the visuo-tactile illusion within artificial neural networks. Our model is composed of an associative map and a recurrent map of spiking neurons that learn to predict the contingent activity across visuo-tactile signals. Depending on the temporal delay added between the visuo-tactile signals or on the spatial distance between two distinct stimuli, the two maps detect contingency differently. Spiking neurons organized into complex networks, together with synchrony detection at different temporal intervals, can thus account for multisensory integration related to the bodily self.
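The model's maps learn visuo-tactile contingency through spike-timing-dependent plasticity. The toy update below illustrates the general principle of an asymmetric STDP window, where only spike pairs within a narrow temporal window strengthen a visuo-tactile connection; the constants and the function are illustrative assumptions, not the parameters or code of the published model.

```python
# Toy spike-timing-dependent plasticity (STDP) update for a synapse between a
# visual and a tactile unit. Time constants and learning rates are illustrative.
import numpy as np

A_PLUS, A_MINUS = 0.05, 0.05     # potentiation / depression amplitudes (assumed)
TAU = 0.02                       # STDP time constant, 20 ms (assumed)

def stdp_dw(t_pre, t_post):
    """Weight change for a single pre/post spike pair (times in seconds)."""
    dt = t_post - t_pre
    if dt >= 0:                                  # pre before post: potentiate
        return A_PLUS * np.exp(-dt / TAU)
    return -A_MINUS * np.exp(dt / TAU)           # post before pre: depress

# Contingent visuo-tactile pairing (10 ms lag) strengthens the connection,
# whereas a 300 ms lag, beyond the ~200-300 ms limit, barely changes it.
print(stdp_dw(0.000, 0.010))    # ~ +0.03
print(stdp_dw(0.000, 0.300))    # ~ +1.5e-8, effectively no learning
```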
Collapse
Affiliation(s)
- Alexandre Pitti
- ETIS Laboratory, UMR CNRS 8051, University of Cergy-Pontoise, ENSEA, Cergy-Pontoise, France
| | - Ganna Pugach
- ETIS Laboratory, UMR CNRS 8051, University of Cergy-Pontoise, ENSEA, Cergy-Pontoise, France
- Energy and Metallurgy Department, Donetsk National Technical University, Krasnoarmeysk, Ukraine
| | - Philippe Gaussier
- ETIS Laboratory, UMR CNRS 8051, University of Cergy-Pontoise, ENSEA, Cergy-Pontoise, France
| | - Sotaro Shimada
- Dept. of Electronics and Bioinformatics, School of Science and Technology, Meiji University, Kawasaki, Japan
| |
Collapse
|
29
|
Integration of anatomical and external response mappings explains crossing effects in tactile localization: A probabilistic modeling approach. Psychon Bull Rev 2016; 23:387-404. [PMID: 26350763 DOI: 10.3758/s13423-015-0918-0] [Citation(s) in RCA: 28] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
To act upon a tactile stimulus its original skin-based, anatomical spatial code has to be transformed into an external, posture-dependent reference frame, a process known as tactile remapping. When the limbs are crossed, anatomical and external location codes are in conflict, leading to a decline in tactile localization accuracy. It is unknown whether this impairment originates from the integration of the resulting external localization response with the original, anatomical one or from a failure of tactile remapping in crossed postures. We fitted probabilistic models based on these diverging accounts to the data from three tactile localization experiments. Hand crossing disturbed tactile left-right location choices in all experiments. Furthermore, the size of these crossing effects was modulated by stimulus configuration and task instructions. The best model accounted for these results by integration of the external response mapping with the original, anatomical one, while applying identical integration weights for uncrossed and crossed postures. Thus, the model explained the data without assuming failures of remapping. Moreover, performance differences across tasks were accounted for by non-individual parameter adjustments, indicating that individual participants' task adaptation results from one common functional mechanism. These results suggest that remapping is an automatic and accurate process, and that the observed localization impairments in touch result from a cognitively controlled integration process that combines anatomically and externally coded responses.
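A minimal sketch of the kind of weighted-integration account described here: the left/right choice combines an anatomically coded and an externally coded mapping with fixed weights, and only the external mapping flips when the hands are crossed, which alone produces a crossing effect. The functional form, parameter values, and function name are ours for illustration, not the authors' fitted model.

```python
# Sketch of a weighted-integration account of crossing effects (illustrative,
# not the published model code). The external code flips sides when the hands
# are crossed; the integration weights stay identical across postures.
def p_correct(w_anat, w_ext, crossed, reliability=0.9):
    """Probability of a correct left/right choice for a right-hand stimulus."""
    p_anat = reliability                                   # anatomical code points to the correct side
    p_ext = 1 - reliability if crossed else reliability    # external code flips when crossed
    return (w_anat * p_anat + w_ext * p_ext) / (w_anat + w_ext)

uncrossed = p_correct(w_anat=1.0, w_ext=1.5, crossed=False)
crossed = p_correct(w_anat=1.0, w_ext=1.5, crossed=True)
print(f"crossing effect: {uncrossed - crossed:.2f}")       # accuracy drop with crossed hands
```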
Collapse
|
30
|
Tamè L, Wühle A, Petri CD, Pavani F, Braun C. Concurrent use of somatotopic and external reference frames in a tactile mislocalization task. Brain Cogn 2016; 111:25-33. [PMID: 27816777 DOI: 10.1016/j.bandc.2016.10.005] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/23/2016] [Revised: 10/22/2016] [Accepted: 10/24/2016] [Indexed: 10/20/2022]
Abstract
Localizing tactile stimuli on our body requires sensory information to be represented in multiple frames of reference along the sensory pathways. These reference frames include the representation of sensory information in skin coordinates, in which the spatial relationship of skin regions is maintained. The organization of the primary somatosensory cortex matches such a somatotopic reference frame. In contrast, higher-order representations are based on external coordinates, in which body posture and gaze direction are taken into account in order to localise touch in other meaningful ways according to task demands. Dominance of one representation or the other, or the use of multiple representations with different weights, is thought to depend on contextual factors of cognitive and/or sensory origin. However, it is unclear in which situations one reference frame takes over from another, or when different reference frames are used jointly at the same time. The study of tactile mislocalizations at the fingers has shown a key role of the somatotopic frame of reference, both when touches are delivered unilaterally to a single hand and when they are delivered bilaterally to both hands. Here, we took advantage of a well-established tactile mislocalization paradigm to investigate whether the reference frame used to integrate bilateral tactile stimuli can change as a function of the spatial relationship between the two hands. Specifically, supra-threshold interference stimuli were applied to the index or little finger of the left hand 200 ms prior to the application of a test stimulus on a finger of the right hand. Crucially, different hand postures were adopted (uncrossed or crossed). Results show that a change in hand posture triggered the concurrent use of somatotopic and external reference frames when processing bilateral touch at the fingers. This demonstrates that both somatotopic and external reference frames can be used concurrently to localise tactile stimuli on the fingers.
Collapse
Affiliation(s)
- Luigi Tamè
- Department of Psychological Sciences, Birkbeck, University of London, London, UK.
| | - Anja Wühle
- MEG-Centre, University of Tübingen, Germany
| | | | - Francesco Pavani
- Centre for Mind/Brain Sciences, University of Trento, Rovereto, Italy; Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, Italy; INSERM U1028, CNRS UMR5292, Lyon Neuroscience Research Centre, Lyon, France
| | - Christoph Braun
- MEG-Centre, University of Tübingen, Germany; Centre for Mind/Brain Sciences, University of Trento, Rovereto, Italy; Department of Psychology and Cognitive Sciences, University of Trento, Rovereto, Italy; Centre for Integrative Neuroscience, University of Tübingen, Tübingen, Germany
| |
Collapse
|
31
|
Saby JN, Meltzoff AN, Marshall PJ. Beyond the N1: A review of late somatosensory evoked responses in human infants. Int J Psychophysiol 2016; 110:146-152. [PMID: 27553531 DOI: 10.1016/j.ijpsycho.2016.08.008] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2016] [Revised: 08/17/2016] [Accepted: 08/18/2016] [Indexed: 01/05/2023]
Abstract
Somatosensory evoked potentials (SEPs) have been used for decades to study the development of somatosensory processing in human infants. Research on infant SEPs has focused on the initial cortical component (N1) and its clinical utility for predicting neurological outcome in at-risk infants. However, recent studies suggest that examining the later components in the infant somatosensory evoked response will greatly advance our understanding of somatosensory processing in infancy. The purpose of this review is to synthesize the existing electroencephalography (EEG) and magnetoencephalography (MEG) studies on late somatosensory evoked responses in infants. We describe the late responses that have been reported and discuss the utility of such responses for illuminating key aspects of somatosensory processing in typical and atypical development.
Collapse
Affiliation(s)
- Joni N Saby
- Institute for Learning & Brain Sciences, University of Washington, Box 357988, Seattle, WA 98195, United States.
| | - Andrew N Meltzoff
- Institute for Learning & Brain Sciences, University of Washington, Box 357988, Seattle, WA 98195, United States
| | - Peter J Marshall
- Department of Psychology, Temple University, 1701 North 13th Street, Philadelphia, PA 19122, United States
| |
Collapse
|
32
|
Disentangling the External Reference Frames Relevant to Tactile Localization. PLoS One 2016; 11:e0158829. [PMID: 27391805 PMCID: PMC4938545 DOI: 10.1371/journal.pone.0158829] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/27/2015] [Accepted: 06/22/2016] [Indexed: 12/03/2022] Open
Abstract
Different reference frames appear to be relevant for tactile spatial coding. When participants give temporal order judgments (TOJ) of two tactile stimuli, one on each hand, performance declines when the hands are crossed. This effect is attributed to a conflict between anatomical and external location codes: hand crossing places the anatomically right hand into the left side of external space. However, hand crossing alone does not specify the anchor of the external reference frame, such as gaze, trunk, or the stimulated limb. Experiments that used explicit localization responses, such as pointing to tactile stimuli rather than crossing manipulations, have consistently implicated gaze-centered coding for touch. To test whether crossing effects can be explained by gaze-centered coding alone, participants made TOJ while the position of the hands was manipulated relative to gaze and trunk. The two hands either lay on different sides of space relative to gaze or trunk, or they both lay on one side of the respective space. In the latter posture, one hand was on its "regular side of space" despite hand crossing, thus reducing overall conflict between anatomical and external codes. TOJ crossing effects were significantly reduced when the hands were both located on the same side of space relative to gaze, indicating gaze-centered coding. Evidence for trunk-centered coding was tentative, with an effect in reaction time but not in accuracy. These results link paradigms that use explicit localization and TOJ, and corroborate the relevance of gaze-related coding for touch. Yet, gaze and trunk-centered coding did not account for the total size of crossing effects, suggesting that tactile localization relies on additional, possibly limb-centered, reference frames. Thus, tactile location appears to be estimated by integrating multiple anatomical and external reference frames.
Collapse
|
33
|
Badde S, Heed T. Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cogn Neuropsychol 2016; 33:26-47. [PMID: 27327353 PMCID: PMC4975087 DOI: 10.1080/02643294.2016.1168791] [Citation(s) in RCA: 43] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
Touch is bound to the skin – that is, to the boundaries of the body. Yet, the activity of neurons in primary somatosensory cortex merely mirrors the spatial distribution of the sensors across the skin. To determine the location of a tactile stimulus on the body, the body's spatial layout must be considered. Moreover, to relate touch to the external world, body posture has to be evaluated. In this review, we argue that posture is incorporated, by default, for any tactile stimulus. However, the relevance of the external location and, thus, its expression in behaviour, depends on various sensory and cognitive factors. Together, these factors imply that an external representation of touch dominates over the skin-based, anatomical one when our focus is on the world rather than on our own body. We conclude that touch localization is a reconstructive process that is adjusted to the context while maintaining all available spatial information.
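The review's central claim, that the perceived location reflects a context-weighted combination of anatomical and external codes, can be condensed into a single expression. The notation below is ours and purely illustrative; it is not taken from the review.

```latex
% Weighted combination of the anatomical (A) and external (E) location codes;
% notation is illustrative and not taken from the review.
\[
\hat{\ell} \;=\; \frac{w_A\,\ell_A + w_E\,\ell_E}{w_A + w_E},
\qquad \text{with } w_E \text{ up-weighted when the task focus lies on the external world.}
\]
```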
Collapse
Affiliation(s)
- Stephanie Badde
- Department of Psychology, New York University, New York, NY, USA
| | - Tobias Heed
- Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
| |
Collapse
|
34
|
Noel JP, Lukowska M, Wallace M, Serino A. Multisensory simultaneity judgment and proximity to the body. J Vis 2016; 16:21. [PMID: 26891828 PMCID: PMC4777235 DOI: 10.1167/16.3.21] [Citation(s) in RCA: 37] [Impact Index Per Article: 4.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/03/2022] Open
Abstract
The integration of information across different sensory modalities is known to be dependent upon the statistical characteristics of the stimuli to be combined. For example, the spatial and temporal proximity of stimuli are important determinants with stimuli that are close in space and time being more likely to be bound. These multisensory interactions occur not only for singular points in space/time, but over “windows” of space and time that likely relate to the ecological statistics of real-world stimuli. Relatedly, human psychophysical work has demonstrated that individuals are highly prone to judge multisensory stimuli as co-occurring over a wide range of time—a so-called simultaneity window (SW). Similarly, there exists a spatial representation of peripersonal space (PPS) surrounding the body in which stimuli related to the body and to external events occurring near the body are highly likely to be jointly processed. In the current study, we sought to examine the interaction between these temporal and spatial dimensions of multisensory representation by measuring the SW for audiovisual stimuli through proximal–distal space (i.e., PPS and extrapersonal space). Results demonstrate that the audiovisual SWs within PPS are larger than outside PPS. In addition, we suggest that this effect is likely due to an automatic and additional computation of these multisensory events in a body-centered reference frame. We discuss the current findings in terms of the spatiotemporal constraints of multisensory interactions and the implication of distinct reference frames on this process.
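As a concrete illustration of how a simultaneity window can be quantified, the sketch below fits a Gaussian to the proportion of "simultaneous" responses across audiovisual asynchronies and reads off its width. The data points, the width convention, and the function names are invented for illustration; this is not the study's analysis code.

```python
# Sketch of estimating an audiovisual simultaneity window (SW): fit a Gaussian
# to the proportion of "simultaneous" responses across stimulus onset
# asynchronies (SOAs) and take its width as the SW. Data are invented.
import numpy as np
from scipy.optimize import curve_fit

soa = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400])          # ms
p_simult = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 0.85, 0.55, 0.20, 0.10])

def gauss(x, amp, mu, sigma):
    return amp * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

(amp, mu, sigma), _ = curve_fit(gauss, soa, p_simult, p0=[1.0, 0.0, 150.0])
sw = 2 * sigma      # one common convention: full width of the fitted Gaussian
print(f"point of subjective simultaneity ≈ {mu:.0f} ms, SW ≈ {sw:.0f} ms")
```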
Collapse
|
35
|
Forster B, Tziraki M, Jones A. The attentive homunculus: ERP evidence for somatotopic allocation of attention in tactile search. Neuropsychologia 2016; 84:158-66. [DOI: 10.1016/j.neuropsychologia.2016.02.009] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/17/2015] [Revised: 02/13/2016] [Accepted: 02/15/2016] [Indexed: 11/24/2022]
|
36
|
Juravle G, Heed T, Spence C, Röder B. Neural correlates of tactile perception during pre-, peri-, and post-movement. Exp Brain Res 2016; 234:1293-305. [DOI: 10.1007/s00221-016-4589-5] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2015] [Accepted: 01/30/2016] [Indexed: 11/29/2022]
|
37
|
Pozeg P, Galli G, Blanke O. Those are Your Legs: The Effect of Visuo-Spatial Viewpoint on Visuo-Tactile Integration and Body Ownership. Front Psychol 2015; 6:1749. [PMID: 26635663 PMCID: PMC4646976 DOI: 10.3389/fpsyg.2015.01749] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/26/2015] [Accepted: 10/31/2015] [Indexed: 11/13/2022] Open
Abstract
Experiencing a body part as one's own, i.e., body ownership, depends on the integration of multisensory bodily signals (including visual, tactile, and proprioceptive information) with visual top-down signals from peripersonal space. Although it has been shown that the visuo-spatial viewpoint from which the body is seen is an important visual top-down factor for body ownership, different studies have reported diverging results. Furthermore, the role of the visuo-spatial viewpoint (sometimes also called first-person perspective) has only been studied for the hands or the whole body, but not for the lower limbs. We thus investigated whether and how leg visuo-tactile integration and leg ownership depend on the visuo-spatial viewpoint from which the legs are seen and on the anatomical similarity of the visual leg stimuli. Using a virtual leg illusion, we tested the strength of visuo-tactile integration of leg stimuli using the crossmodal congruency effect (CCE), as well as the subjective sense of leg ownership (assessed by a questionnaire). Fifteen participants viewed virtual legs or non-corporeal control objects, presented either from their habitual first-person viewpoint or from a viewpoint rotated by 90° (third-person viewpoint), while visuo-tactile stroking was applied to the participants' legs and to the virtual legs shown on a head-mounted display. The data show that the first-person visuo-spatial viewpoint significantly boosts visuo-tactile integration as well as the sense of leg ownership. Moreover, the viewpoint-dependent increase in visuo-tactile integration was found only when participants viewed the virtual legs (it was absent for the control objects). These results confirm the importance of the first-person visuo-spatial viewpoint for the integration of visuo-tactile stimuli and extend findings from the upper extremity and the trunk to visuo-tactile integration and ownership for the legs.
Collapse
Affiliation(s)
- Polona Pozeg
- Center for Neuroprosthetics, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland; Laboratory of Cognitive Neuroscience, Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
| | - Giulia Galli
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland; Istituti di Ricovero e Cura a Carattere Scientifico, Fondazione Santa Lucia, Rome, Italy
| | - Olaf Blanke
- Center for Neuroprosthetics, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland; Laboratory of Cognitive Neuroscience, Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland; Department of Neurology, University Hospital of Geneva, Geneva, Switzerland
| |
Collapse
|
38
|
Posture modulates implicit hand maps. Conscious Cogn 2015; 36:96-102. [DOI: 10.1016/j.concog.2015.06.009] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/20/2015] [Revised: 06/12/2015] [Accepted: 06/15/2015] [Indexed: 01/04/2023]
|
39
|
Brandes J, Heed T. Reach Trajectories Characterize Tactile Localization for Sensorimotor Decision Making. J Neurosci 2015; 35:13648-58. [PMID: 26446218 PMCID: PMC6605379 DOI: 10.1523/jneurosci.1873-14.2015] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/09/2014] [Revised: 08/24/2015] [Accepted: 08/27/2015] [Indexed: 11/21/2022] Open
Abstract
Spatial target information for movement planning appears to be coded in a gaze-centered reference frame. In touch, however, location is initially coded with reference to the skin. Therefore, the tactile spatial location must be derived by integrating skin location and posture. It has been suggested that this recoding is impaired when the limb is placed in the opposite hemispace, for example, by limb crossing. Here, human participants reached toward visual and tactile targets located at uncrossed and crossed feet in a sensorimotor decision task. We characterized stimulus recoding by analyzing the timing and spatial profile of hand reaches. For tactile targets at crossed feet, skin-based information implicates the incorrect side, and only recoded information points to the correct location. Participants initiated straight reaches and redirected the hand toward a target presented in midflight. Trajectories to visual targets were unaffected by foot crossing. In contrast, trajectories to tactile targets were redirected later with crossed than uncrossed feet. Reaches to crossed feet usually continued straight until they were directed toward the correct tactile target and were not biased toward the skin-based target location. Occasional, far deflections toward the incorrect target were most likely when this target was implicated by trial history. These results are inconsistent with the suggestion that spatial transformations in touch are impaired by limb crossing, but are consistent with tactile location being recoded rapidly and efficiently, followed by integration of skin-based and external information to specify the reach target. This process may be implemented in a bounded integrator framework.
SIGNIFICANCE STATEMENT: How do you touch yourself, for instance, to scratch an itch? The place you need to reach is defined by a sensation on the skin, but our bodies are flexible, so this skin location could be anywhere in 3D space. The movement toward the tactile sensation must therefore be specified by merging skin location and body posture. By investigating human hand reach trajectories toward tactile stimuli on the feet, we provide experimental evidence that this transformation process is quick and efficient, and that its output is integrated with the original skin location in a fashion consistent with bounded integrator decision-making frameworks.
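To make the bounded-integrator idea concrete, the toy simulation below accumulates noisy evidence for the externally correct target side up to a bound that triggers redirection of the reach; before remapping finishes, crossed-limb trials push the accumulator toward the skin-based (wrong) side, delaying the redirection. All parameters, the noise model, and the function name are illustrative assumptions, not the framework proposed in the paper.

```python
# Toy bounded-integrator sketch of target selection during an ongoing reach.
# Parameters and the noise model are illustrative only.
import numpy as np

def time_to_redirect(crossed, drift=0.8, bound=1.0, dt=0.01, remap_delay=0.10,
                     noise=0.2, seed=0):
    rng = np.random.default_rng(seed)
    evidence, t = 0.0, 0.0
    while abs(evidence) < bound and t < 2.0:
        # before remapping finishes, crossed trials push toward the skin-based (wrong) side
        direction = -1.0 if (crossed and t < remap_delay) else 1.0
        evidence += direction * drift * dt + rng.normal(0, noise) * np.sqrt(dt)
        t += dt
    return t

print(f"uncrossed: redirect after ~{time_to_redirect(False):.2f} s")
print(f"crossed:   redirect after ~{time_to_redirect(True):.2f} s")   # later with crossed limbs
```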
Collapse
Affiliation(s)
- Janina Brandes
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, 20146 Hamburg, Germany
| | - Tobias Heed
- Biological Psychology and Neuropsychology, Faculty of Psychology and Human Movement Science, University of Hamburg, 20146 Hamburg, Germany
| |
Collapse
|
40
|
Abstract
To perceive the location of a tactile stimulus in external space (external tactile localisation), information about the location of the stimulus on the skin surface (tactile localisation on the skin) must be combined with proprioceptive information about the spatial location of body parts (position sense), a process often referred to as 'tactile spatial remapping'. Recent research has revealed that both of these component processes rely on highly distorted implicit body representations. For example, on the dorsal hand surface position sense relies on a squat, wide hand representation. In contrast, tactile localisation on the same skin surface shows large biases towards the knuckles. These distortions can be seen as behavioural 'signatures' of these respective perceptual processes. Here, we investigated the role of implicit body representation in tactile spatial remapping by testing whether the distortions of each of the two component processes (tactile localisation and position sense) also appear when participants localise the external spatial location of touch. Our study reveals strong distortions characteristic of position sense (i.e., overestimation of distances across vs along the hand) in tactile spatial remapping. In contrast, distortions characteristic of tactile localisation on the skin (i.e., biases towards the knuckles) were not apparent in tactile spatial remapping. These results demonstrate that a common implicit hand representation underlies position sense and external tactile localisation. Furthermore, the present findings imply that tactile spatial remapping does not require mapping the same signals in a frame of reference centred on a specific body part.
Collapse
Affiliation(s)
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck, University of London.
| | - Flavia Mancini
- Department of Neuroscience, Physiology and Pharmacology, University College London
| | - Patrick Haggard
- Institute of Cognitive Neuroscience, University College London
| |
Collapse
|
41
|
Oscillatory activity reflects differential use of spatial reference frames by sighted and blind individuals in tactile attention. Neuroimage 2015; 117:417-28. [DOI: 10.1016/j.neuroimage.2015.05.068] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2014] [Revised: 04/24/2015] [Accepted: 05/24/2015] [Indexed: 11/19/2022] Open
|
42
|
Body maps in the infant brain. Trends Cogn Sci 2015; 19:499-505. [PMID: 26231760 DOI: 10.1016/j.tics.2015.06.012] [Citation(s) in RCA: 93] [Impact Index Per Article: 9.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/29/2015] [Revised: 06/27/2015] [Accepted: 06/29/2015] [Indexed: 11/22/2022]
Abstract
Researchers have examined representations of the body in the adult brain but relatively little attention has been paid to ontogenetic aspects of neural body maps in human infants. Novel applications of methods for recording brain activity in infants are delineating cortical body maps in the first months of life. Body maps may facilitate infants' registration of similarities between self and other - an ability that is foundational to developing social cognition. Alterations in interpersonal aspects of body representations might also contribute to social deficits in certain neurodevelopmental disorders.
Collapse
|
43
|
Gherri E, Forster B. Independent effects of eye gaze and spatial attention on the processing of tactile events: Evidence from event-related potentials. Biol Psychol 2015; 109:239-47. [PMID: 26101088 DOI: 10.1016/j.biopsycho.2015.05.008] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/11/2014] [Revised: 04/30/2015] [Accepted: 05/31/2015] [Indexed: 10/23/2022]
Abstract
Directing one's gaze at a body part shortens detection times and enhances the processing of tactile stimuli presented at the gazed location. Given the close links between spatial attention and the oculomotor system, it is possible that these gaze-dependent modulations of touch are mediated by attentional mechanisms. To investigate this possibility, gaze direction and sustained tactile attention were orthogonally manipulated in the present study. Participants covertly attended to one hand to perform a tactile target-nontarget discrimination while they gazed at the same or the opposite hand. Spatial attention resulted in enhancements of the somatosensory P100 and Nd components. In contrast, gaze resulted in modulations of the N140 component, with more positive ERPs for gazed than non-gazed stimuli. This dissociation in the pattern and timing of the effects of gaze and attention on somatosensory processing reveals that gaze and attention have independent effects on touch.
Collapse
Affiliation(s)
- Elena Gherri
- Cognitive Neuroscience Research Unit, City University London, UK.
| | - Bettina Forster
- Cognitive Neuroscience Research Unit, City University London, UK
| |
Collapse
|
44
|
Saby JN, Meltzoff AN, Marshall PJ. Neural body maps in human infants: Somatotopic responses to tactile stimulation in 7-month-olds. Neuroimage 2015; 118:74-8. [PMID: 26070263 DOI: 10.1016/j.neuroimage.2015.05.097] [Citation(s) in RCA: 35] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/21/2015] [Revised: 05/19/2015] [Accepted: 05/21/2015] [Indexed: 12/26/2022] Open
Abstract
A large literature has examined somatotopic representations of the body in the adult brain, but little attention has been paid to the development of somatotopic neural organization in human infants. In the present study we examined whether the somatosensory evoked potential (SEP) elicited by brief tactile stimulation of infants' hands and feet shows a somatotopic response pattern at 7 months postnatal age. The tactile stimuli elicited a prominent positive component in the SEP at central sites that peaked around 175 ms after stimulus onset. Consistent with a somatotopic response pattern, the amplitude of the response to hand stimulation was greater at lateral central electrodes (C3 and C4) than at the midline central electrode (Cz). As expected, the opposite pattern was obtained to foot stimulation, with greater peak amplitude at Cz than at C3 and C4. These results provide evidence of somatotopy in human infants and suggest that the developing body map can be delineated using readily available methods such as EEG. These findings open up possibilities for further work investigating the organization and plasticity of infant body maps.
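The somatotopy check boils down to comparing the peak amplitude around 175 ms at lateral central sites (C3/C4) against the midline site (Cz) for hand versus foot stimulation. The sketch below shows that comparison on plain NumPy arrays; the array layout, channel order, time window, and simulated data are assumptions for illustration, not the study's pipeline.

```python
# Sketch of a lateral-vs-midline SEP peak comparison; layout and window assumed.
import numpy as np

fs = 250
times = np.arange(-0.1, 0.6, 1 / fs)
channels = ["C3", "Cz", "C4"]

def peak_amplitude(evoked, window=(0.125, 0.225)):
    """Max amplitude per channel within the window; evoked is (n_channels, n_samples)."""
    mask = (times >= window[0]) & (times <= window[1])
    return evoked[:, mask].max(axis=1)

def somatotopy_contrast(evoked):
    peaks = dict(zip(channels, peak_amplitude(evoked)))
    lateral = (peaks["C3"] + peaks["C4"]) / 2
    return lateral - peaks["Cz"]    # > 0 suggests a hand-like, < 0 a foot-like pattern

# Simulated hand-stimulation evoked response with a larger lateral peak
rng = np.random.default_rng(1)
hand_evoked = rng.normal(0, 0.1, (len(channels), times.size))
hand_evoked[[0, 2], 55:80] += 2.0           # illustrative bump at C3/C4 around 150-200 ms
print(somatotopy_contrast(hand_evoked))     # positive, i.e. hand-like pattern
```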
Collapse
Affiliation(s)
- Joni N Saby
- Institute for Learning & Brain Sciences, University of Washington, 1715 NE Columbia Road, Seattle, WA 98195.
| | - Andrew N Meltzoff
- Institute for Learning & Brain Sciences, University of Washington, 1715 NE Columbia Road, Seattle, WA 98195
| | - Peter J Marshall
- Department of Psychology, Temple University, 1701 North 13th Street, Philadelphia, PA 19122
| |
Collapse
|
45
|
Heed T, Buchholz VN, Engel AK, Röder B. Tactile remapping: from coordinate transformation to integration in sensorimotor processing. Trends Cogn Sci 2015; 19:251-8. [DOI: 10.1016/j.tics.2015.03.001] [Citation(s) in RCA: 65] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/09/2014] [Revised: 03/04/2015] [Accepted: 03/05/2015] [Indexed: 10/23/2022]
|
46
|
Ley P, Steinberg U, Hanganu-Opatz IL, Röder B. Event-related potential evidence for a dynamic (re-)weighting of somatotopic and external coordinates of touch during visual-tactile interactions. Eur J Neurosci 2015; 41:1466-74. [PMID: 25879770 DOI: 10.1111/ejn.12896] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/20/2014] [Revised: 03/09/2015] [Accepted: 03/16/2015] [Indexed: 02/05/2023]
Abstract
The localization of touch in external space requires the remapping of somatotopically represented tactile information into an external frame of reference. Several recent studies have highlighted the role of posterior parietal areas for this remapping process, yet its temporal dynamics are poorly understood. The present study combined cross-modal stimulation with electrophysiological recordings in humans to trace the time course of tactile spatial remapping during visual-tactile interactions. Adopting an uncrossed or crossed hand posture, participants made speeded elevation judgments about rare vibrotactile stimuli within a stream of frequent, task-irrelevant vibrotactile events presented to the left or right hand. Simultaneous but spatially independent visual stimuli had to be ignored. An analysis of the recorded event-related potentials to the task-irrelevant vibrotactile stimuli revealed a somatotopic coding of tactile stimuli within the first 100 ms. Between 180 and 250 ms, neither an external nor a somatotopic representation dominated, suggesting that both coordinates were active in parallel. After 250 ms, tactile stimuli were coded in a somatotopic frame of reference. Our results indicate that cross-modal interactions start before the termination of tactile spatial remapping, that is within the first 100 ms. Thereafter, tactile stimuli are represented simultaneously in both somatotopic and external spatial coordinates, which are dynamically (re-)weighted as a function of processing stage.
Collapse
Affiliation(s)
- Pia Ley
- Biological Psychology and Neuropsychology, Von-Melle-Park 11, Hamburg, D-20146, Germany
| | - Ulf Steinberg
- Biological Psychology and Neuropsychology, Von-Melle-Park 11, Hamburg, D-20146, Germany
| | - Ileana L Hanganu-Opatz
- Developmental Neurophysiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
| | - Brigitte Röder
- Biological Psychology and Neuropsychology, Von-Melle-Park 11, Hamburg, D-20146, Germany
| |
Collapse
|
47
|
Nishikawa N, Shimo Y, Wada M, Hattori N, Kitazawa S. Effects of aging and idiopathic Parkinson's disease on tactile temporal order judgment. PLoS One 2015; 10:e0118331. [PMID: 25760621 PMCID: PMC4356579 DOI: 10.1371/journal.pone.0118331] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/29/2014] [Accepted: 01/13/2015] [Indexed: 11/22/2022] Open
Abstract
It is generally accepted that the basal ganglia play an important role in interval timing, which requires the measurement of temporal durations. By contrast, it remains controversial whether the basal ganglia play an essential role in temporal order judgment (TOJ) of successive stimuli, a behavior that does not necessarily require the measurement of durations in time. To address this issue, we compared the effects of idiopathic Parkinson’s disease (PD) on the TOJ of two successive taps, one delivered to each hand, with the arms uncrossed in one condition and crossed in another. In addition to age-matched elderly participants without PD (non-PD), we examined young healthy participants so that the effect of aging could serve as a control for evaluating the effects of PD. There was no significant difference between PD and non-PD participants in any parameter of TOJ under either arm posture, although reaction time was significantly longer in PD compared with non-PD participants. By contrast, the effect of aging was apparent in both conditions. With their arms uncrossed, the temporal resolution (the interstimulus interval that yielded 84% correct responses) of elderly participants was significantly worse than that of young participants. With their arms crossed, elderly participants made more errors at longer intervals (~1 s) than young participants, although both age groups showed similar judgment reversal at moderately short intervals (~200 ms). These results indicate that the basal ganglia and dopaminergic systems do not play essential roles in tactile TOJ involving both hands and that the effect of aging on TOJ is mostly independent of the dopaminergic systems.
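The temporal-resolution measure mentioned here (the interstimulus interval yielding 84% correct) is typically read off a psychometric fit to the TOJ data. The sketch below fits a cumulative Gaussian to the proportion of "right hand first" responses across asynchronies and reports the fitted sigma, which corresponds to the ~84% point; the data are invented, and this is not the study's analysis code.

```python
# Sketch of quantifying TOJ temporal resolution with a cumulative-Gaussian fit.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

soa = np.array([-900, -300, -100, -50, 50, 100, 300, 900])        # ms, right leads if > 0
p_right_first = np.array([0.03, 0.10, 0.25, 0.40, 0.65, 0.80, 0.92, 0.98])

def cum_gauss(x, mu, sigma):
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(cum_gauss, soa, p_right_first, p0=[0.0, 100.0])
print(f"point of subjective simultaneity ≈ {mu:.0f} ms; "
      f"84%-correct interval ≈ {sigma:.0f} ms")
```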
Collapse
Affiliation(s)
- Natsuko Nishikawa
- Department of Neurology, Juntendo University School of Medicine, Tokyo, 113-8421, Japan
| | - Yasushi Shimo
- Department of Neurology, Juntendo University School of Medicine, Tokyo, 113-8421, Japan
| | - Makoto Wada
- Department of Neurophysiology, Graduate School of Medicine, Juntendo University, Tokyo, Japan
- Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, Tokorozawa, Japan
| | - Nobutaka Hattori
- Department of Neurology, Juntendo University School of Medicine, Tokyo, 113-8421, Japan
| | - Shigeru Kitazawa
- Department of Neurophysiology, Graduate School of Medicine, Juntendo University, Tokyo, Japan
- Department of Brain Physiology, Graduate School of Medicine, Osaka University, Osaka, Japan
- Dynamic Brain Network Laboratory, Graduate School of Frontiers Bioscience, Osaka University, Osaka, Japan
| |
Collapse
|
48
|
Sustained maintenance of somatotopic information in brain regions recruited by tactile working memory. J Neurosci 2015; 35:1390-5. [PMID: 25632117 DOI: 10.1523/jneurosci.3535-14.2015] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
To adaptively guide ongoing behavior, representations in working memory (WM) often have to be modified in line with changing task demands. We used event-related potentials (ERPs) to demonstrate that tactile WM representations are stored in modality-specific cortical regions, that the goal-directed modulation of these representations is mediated through hemispheric-specific activation of somatosensory areas, and that the rehearsal of somatotopic coordinates in memory is accomplished by modality-specific spatial attention mechanisms. Participants encoded two tactile sample stimuli presented simultaneously to the left and right hands, before visual retro-cues indicated which of these stimuli had to be retained to be matched with a subsequent test stimulus on the same hand. Retro-cues triggered a sustained tactile contralateral delay activity component with a scalp topography over somatosensory cortex contralateral to the cued hand. Early somatosensory ERP components to task-irrelevant probe stimuli (that were presented after the retro-cues) and to subsequent test stimuli were enhanced when these stimuli appeared at the currently memorized location relative to other locations on the cued hand, demonstrating that a precise focus of spatial attention was established during the selective maintenance of tactile events in WM. These effects were observed regardless of whether participants performed the matching task with uncrossed or crossed hands, indicating that WM representations in this task were based on somatotopic rather than allocentric spatial coordinates. In conclusion, spatial rehearsal in tactile WM operates within somatotopically organized sensory brain areas that have been recruited for information storage.
Collapse
|
49
|
Processing load impairs coordinate integration for the localization of touch. Atten Percept Psychophys 2015; 76:1136-50. [PMID: 24550040 DOI: 10.3758/s13414-013-0590-2] [Citation(s) in RCA: 27] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
To perform an action toward a touch, the tactile spatial representation must be transformed from a skin-based, anatomical reference frame into an external reference frame. Evidence suggests that, after transformation, both anatomical and external coordinates are integrated for the location estimate. The present study investigated whether the calculation and integration of external coordinates are automatic processes. Participants made temporal order judgments (TOJs) of two tactile stimuli, one applied to each hand, in crossed and uncrossed postures. The influence of the external coordinates of touch was indicated by the performance difference between crossed and uncrossed postures, referred to as the crossing effect. To assess automaticity, the TOJ task was combined with a working memory task that varied in difficulty (size of the working memory set) and quality (verbal vs. spatial). In two studies, the crossing effect was consistently reduced under processing load. When the load level was adaptively adjusted to individual performance (Study 2), the crossing effect additionally varied as a function of the difficulty of the secondary task. These modulatory effects of processing load on the crossing effect were independent of the type of working memory. The sensitivity of the crossing effect to processing load suggests that coordinate integration for touch localization is not fully automatic. To reconcile the present results with previous findings, we suggest that the genuine remapping process, that is, the transformation of anatomical into external coordinates, proceeds automatically, whereas their integration in service of a combined location estimate is subject to top-down control.
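The key dependent measure is simple: the crossing effect is the accuracy difference between uncrossed and crossed postures, computed separately for each load condition so that its reduction under load can be assessed. The numbers and dictionary layout below are a made-up stand-in for real single-subject data, shown only to make the measure explicit.

```python
# Sketch of the crossing-effect computation per working-memory load condition.
accuracy = {
    # (posture, load): proportion correct TOJ responses (illustrative numbers)
    ("uncrossed", "no_load"): 0.95, ("crossed", "no_load"): 0.70,
    ("uncrossed", "high_load"): 0.93, ("crossed", "high_load"): 0.83,
}

def crossing_effect(load):
    return accuracy[("uncrossed", load)] - accuracy[("crossed", load)]

for load in ("no_load", "high_load"):
    print(f"{load}: crossing effect = {crossing_effect(load):.2f}")
# A smaller effect under high load mirrors the reported reduction of coordinate
# integration when central resources are occupied.
```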
Collapse
|
50
|
Badde S, Röder B, Heed T. Flexibly weighted integration of tactile reference frames. Neuropsychologia 2014; 70:367-74. [PMID: 25447059 DOI: 10.1016/j.neuropsychologia.2014.10.001] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/26/2014] [Revised: 09/29/2014] [Accepted: 10/01/2014] [Indexed: 10/24/2022]
Abstract
To estimate the location of a tactile stimulus, the brain seems to integrate different types of spatial information such as skin-based, anatomical coordinates and external, spatiotopic coordinates. The aim of the present study was to test whether the use of these coordinates is fixed, or whether they are weighted according to the task context. Participants made judgments about two tactile stimuli with different vibration characteristics, one applied to each hand. First, they always performed temporal order judgments (TOJ) of the tactile stimuli with respect to the stimulated hands that were either crossed or uncrossed. The resulting crossing effect, that is, impaired performance in crossed compared to uncrossed conditions, was used as a measure of reference frame weighting and was compared across conditions. Second, in dual judgment conditions participants subsequently made judgments about the stimulus vibration characteristics, either with respect to spatial location or with respect to temporal order. Responses in the spatial secondary task either accented anatomical (Experiment 1) or external (Experiment 2) coding. A TOJ crossing effect emerged in all conditions, and secondary tasks did not affect primary task performance in the uncrossed posture. Yet, the spatial secondary task resulted in improved crossed hands performance in the primary task, but only if the secondary judgment stressed the anatomical reference frame (Experiment 1), rather than the external reference frames (Experiment 2). Like the anatomically coded spatial secondary task, the temporal secondary task improved crossed hand performance of the primary task. The differential influence of the varying secondary tasks implies that integration weights assigned to the anatomical and external reference frames are not fixed. Rather, they are flexibly adjusted to the context, presumably through top-down modulation.
Collapse
Affiliation(s)
- Stephanie Badde
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany.
| | - Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
| | - Tobias Heed
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, 20146 Hamburg, Germany
| |
Collapse
|