1. Chen Q, Dong Y, Gai Y. Tactile Location Perception Encoded by Gamma-Band Power. Bioengineering (Basel) 2024; 11:377. [PMID: 38671798; PMCID: PMC11048554; DOI: 10.3390/bioengineering11040377]
Abstract
BACKGROUND The perception of tactile-stimulation locations is an important function of the human somatosensory system during body movements and interactions with the surroundings. Previous psychophysical and neurophysiological studies have focused on spatial location perception of the upper body. In this study, we recorded single-trial electroencephalography (EEG) responses evoked by four vibrotactile stimulators placed on the buttocks and thighs while the human subject was sitting in a chair with a cushion. METHODS Briefly, 14 human subjects were instructed to sit in a chair for 1 h or 1 h 45 min. Two types of cushions were tested with each subject: a foam cushion and an air-cell-based cushion designed for wheelchair users to alleviate tissue stress. Vibrotactile stimulations were applied to the sitting interface at the beginning and end of the sitting period. Somatosensory-evoked potentials were obtained using a 32-channel EEG. An artificial neural net was used to predict the tactile locations based on the evoked EEG power. RESULTS We found that single-trial beta (13-30 Hz) and gamma (30-50 Hz) waves can best predict the tactor locations, with an accuracy of up to 65%. Female subjects showed the highest performances, while males' sensitivity tended to degrade after the sitting period. A three-way ANOVA indicated that the air-cell cushion maintained location sensitivity better than the foam cushion. CONCLUSION Our findings show that tactile location information is encoded in EEG responses, providing insights into the fundamental mechanisms of the tactile system as well as applications in brain-computer interfaces that rely on tactile stimulation.
Affiliation(s)
- Yan Gai: Biomedical Engineering, School of Science and Engineering, Saint Louis University, 3507 Lindell Blvd, St. Louis, MO 63103, USA (Q.C.; Y.D.)
2. Dupin L, Haggard P. Dynamic Displacement Vector Interacts with Tactile Localization. Curr Biol 2019; 29:492-498.e3. [PMID: 30686734; PMCID: PMC6370943; DOI: 10.1016/j.cub.2018.12.032]
Abstract
Locating a tactile stimulus on the body seems effortless and straightforward. However, the perceived location of a tactile stimulation can differ from its physical location [1, 2, 3]. Tactile mislocalizations can depend on the timing of successive stimulations [2, 4, 5], tactile motion mechanisms [6], or processes that “remap” stimuli from skin locations to external space coordinates [7, 8, 9, 10, 11]. We report six experiments demonstrating that the perception of tactile localization on a static body part is strongly affected by the displacement between the locations of two successive task-irrelevant actions. Participants moved their index finger between two keys. Each keypress triggered synchronous tactile stimulation at a randomized location on the immobilized wrist or forehead. Participants reported the location of the second tactile stimulation relative to the first. The direction of either active finger movements or passive finger displacements biased participants’ tactile orientation judgements (experiment 1). The effect generalized to tactile stimuli delivered to other body sites (experiment 2). Two successive keypresses, by different fingers at distinct locations, reproduced the effect (experiment 3). The effect remained even when the hand that moved was placed far from the tactile stimulation site (experiments 4 and 5). Temporal synchrony within 600 ms between the movement and tactile stimulations was necessary for the effect (experiment 6). Our results indicate that a dynamic displacement vector, defined as the location of one sensorimotor event relative to the one before, plays a strong role in structuring tactile spatial perception.
Highlights:
- Human tactile localization is biased by simultaneous finger displacement
- The shift between two successive events biases the relative localization of touches
- Both active and passive movements induce a bias, even if far from the touched site
- The bias effect is vectorially organized
Affiliation(s)
- Lucile Dupin: Institute of Cognitive Neuroscience, University College London, London WC1N 3AR, UK
- Patrick Haggard: Institute of Cognitive Neuroscience, University College London, London WC1N 3AR, UK
3. Sadibolova R, Tamè L, Longo MR. More than skin-deep: Integration of skin-based and musculoskeletal reference frames in localization of touch. J Exp Psychol Hum Percept Perform 2018; 44:1672-1682. [PMID: 30160504; PMCID: PMC6205026; DOI: 10.1037/xhp0000562]
Abstract
The skin of the forearm is, in one sense, a flat 2-dimensional (2D) sheet, but in another sense approximately cylindrical, mirroring the 3-dimensional (3D) volumetric shape of the arm. The role of frames of reference based on the skin as a 2D sheet versus the musculoskeletal structure of the arm remains unclear. When we rotate the forearm from a pronated to a supinated posture, the skin on its surface is displaced. Thus, a marked location will slide with the skin across the underlying flesh, and a touch perceived at this location should follow this displacement if it is localized within a skin-based reference frame. We investigated, however, whether perceived tactile locations were also affected by the rearrangement of the underlying musculoskeletal structure, that is, whether they were displaced medially or laterally on a pronated or supinated forearm, respectively. Participants pointed to perceived touches (Experiment 1), or marked them on a 3D size-matched forearm on a computer screen (Experiment 2). The perceived locations were indeed displaced medially after forearm pronation in both response modalities. This misperception was reduced (Experiment 1), or absent altogether (Experiment 2), in the supinated posture, when the actual stimulus grid moved laterally with the displaced skin. The grid was perceptually stretched along the medial-lateral axis and displaced distally, which suggests the influence of skin-based factors. Our study extends the tactile localization literature focused on the skin-based reference frame and on the effects of spatial positions of body parts by implicating musculoskeletal factors in localization of touch on the body.
4. Arnold G, Spence C, Auvray M. A unity of the self or a multiplicity of locations? How the graphesthesia task sheds light on the role of spatial perspectives in bodily self-consciousness. Conscious Cogn 2017; 56:100-114. [DOI: 10.1016/j.concog.2017.06.012]
5. Medina S, Tamè L, Longo MR. Tactile localization biases are modulated by gaze direction. Exp Brain Res 2017; 236:31-42. [PMID: 29018928; DOI: 10.1007/s00221-017-5105-2]
Abstract
Identifying the spatial location of touch on the skin surface is a fundamental function of our somatosensory system. Despite the fact that stimulation of even single mechanoreceptive afferent fibres is sufficient to produce clearly localised percepts, tactile localisation can also be modulated by higher-level processes such as body posture. This suggests that tactile events are coded using multiple representations in different coordinate systems. Recent reports provide evidence for systematic biases in tactile localisation tasks, which are thought to result from a supramodal representation of the skin surface. While the influence of non-informative vision of the body and gaze direction on tactile discrimination tasks has been extensively studied, their effects on tactile localisation tasks remain largely unexplored. To address this question, participants performed a tactile localisation task on their left hand under different visual conditions by means of a mirror box: in the mirror condition, a single stimulus was delivered to the participants' hand while the reflection of the right hand was seen through the mirror; in the object condition, participants looked at a box through the mirror; and in the right-hand condition, participants looked directly at their right hand. Participants reported the location of the tactile stimuli using a silhouette of a hand. Results showed a shift in the localisation of the touches towards the tip of the fingers (distal bias) and the thumb (radial bias) across conditions. Critically, distal biases were reduced when participants looked towards the mirror compared to when they looked at their right hand, suggesting that gaze direction reduces the typical proximo-distal biases in tactile localisation. Moreover, vision of the hand modulates the internal configuration of the points' locations by elongating it along the radio-ulnar axis.
Affiliation(s)
- Sonia Medina: Department of Psychological Sciences, Birkbeck, University of London, London, WC1E 7HX, UK
- Luigi Tamè: Department of Psychological Sciences, Birkbeck, University of London, London, WC1E 7HX, UK
- Matthew R Longo: Department of Psychological Sciences, Birkbeck, University of London, London, WC1E 7HX, UK
6. Disentangling the External Reference Frames Relevant to Tactile Localization. PLoS One 2016; 11:e0158829. [PMID: 27391805; PMCID: PMC4938545; DOI: 10.1371/journal.pone.0158829]
Abstract
Different reference frames appear to be relevant for tactile spatial coding. When participants give temporal order judgments (TOJ) of two tactile stimuli, one on each hand, performance declines when the hands are crossed. This effect is attributed to a conflict between anatomical and external location codes: hand crossing places the anatomically right hand into the left side of external space. However, hand crossing alone does not specify the anchor of the external reference frame, such as gaze, trunk, or the stimulated limb. Experiments that used explicit localization responses, such as pointing to tactile stimuli rather than crossing manipulations, have consistently implicated gaze-centered coding for touch. To test whether crossing effects can be explained by gaze-centered coding alone, participants made TOJ while the position of the hands was manipulated relative to gaze and trunk. The two hands either lay on different sides of space relative to gaze or trunk, or they both lay on one side of the respective space. In the latter posture, one hand was on its "regular side of space" despite hand crossing, thus reducing overall conflict between anatomical and external codes. TOJ crossing effects were significantly reduced when the hands were both located on the same side of space relative to gaze, indicating gaze-centered coding. Evidence for trunk-centered coding was tentative, with an effect in reaction time but not in accuracy. These results link paradigms that use explicit localization and TOJ, and corroborate the relevance of gaze-related coding for touch. Yet, gaze and trunk-centered coding did not account for the total size of crossing effects, suggesting that tactile localization relies on additional, possibly limb-centered, reference frames. Thus, tactile location appears to be estimated by integrating multiple anatomical and external reference frames.
7. Badde S, Heed T. Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cogn Neuropsychol 2016; 33:26-47. [PMID: 27327353; PMCID: PMC4975087; DOI: 10.1080/02643294.2016.1168791]
Abstract
Touch is bound to the skin – that is, to the boundaries of the body. Yet, the activity of neurons in primary somatosensory cortex just mirrors the spatial distribution of the sensors across the skin. To determine the location of a tactile stimulus on the body, the body's spatial layout must be considered. Moreover, to relate touch to the external world, body posture has to be evaluated. In this review, we argue that posture is incorporated, by default, for any tactile stimulus. However, the relevance of the external location and, thus, its expression in behaviour, depends on various sensory and cognitive factors. Together, these factors imply that an external representation of touch dominates over the skin-based, anatomical one when our focus is on the world rather than on our own body. We conclude that touch localization is a reconstructive process that is adjusted to the context while maintaining all available spatial information.
Affiliation(s)
- Stephanie Badde: Department of Psychology, New York University, New York, NY, USA
- Tobias Heed: Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
8. Krüger HM, Collins T, Englitz B, Cavanagh P. Saccades create similar mislocalizations in visual and auditory space. J Neurophysiol 2016; 115:2237-45. [PMID: 26888101; DOI: 10.1152/jn.00853.2014]
Abstract
Orienting our eyes to a light, a sound, or a touch occurs effortlessly, despite the fact that sound and touch have to be converted from head- and body-based coordinates to eye-based coordinates to do so. We asked whether the oculomotor representation is also used for localization of sounds even when there is no saccade to the sound source. To address this, we examined whether saccades introduced similar errors in localization judgments for both visual and auditory stimuli. Sixteen subjects indicated the direction of a visual or auditory apparent motion seen or heard between two targets presented either during fixation or straddling a saccade. Compared with the fixation baseline, saccades introduced errors in direction judgments for both visual and auditory stimuli: in both cases, apparent motion judgments were biased in the direction of the saccade. These saccade-induced effects across modalities give rise to the possibility of shared, cross-modal location coding for perception and action.
Affiliation(s)
- Hannah M Krüger: Laboratoire Psychologie de la Perception, Université Paris Descartes and Centre National de la Recherche Scientifique (UMR 8242), Paris, France; Faculteit der Sociale Wetenschappen, Radboud University, Nijmegen, The Netherlands
- Thérèse Collins: Laboratoire Psychologie de la Perception, Université Paris Descartes and Centre National de la Recherche Scientifique (UMR 8242), Paris, France
- Bernhard Englitz: Department of Neurophysiology, Donders Institute, Radboud University, Nijmegen, The Netherlands
- Patrick Cavanagh: Laboratoire Psychologie de la Perception, Université Paris Descartes and Centre National de la Recherche Scientifique (UMR 8242), Paris, France; Department of Psychological and Brain Sciences, Dartmouth College, Hanover, New Hampshire
9. Gherri E, Forster B. Independent effects of eye gaze and spatial attention on the processing of tactile events: Evidence from event-related potentials. Biol Psychol 2015; 109:239-47. [PMID: 26101088; DOI: 10.1016/j.biopsycho.2015.05.008]
Abstract
Directing one's gaze at a body part speeds detection and enhances the processing of tactile stimuli presented at the gazed location. Given the close links between spatial attention and the oculomotor system, it is possible that these gaze-dependent modulations of touch are mediated by attentional mechanisms. To investigate this possibility, gaze direction and sustained tactile attention were orthogonally manipulated in the present study. Participants covertly attended to one hand to perform a tactile target-nontarget discrimination while they gazed at the same or opposite hand. Spatial attention resulted in enhancements of the somatosensory P100 and Nd components. In contrast, gaze resulted in modulations of the N140 component, with more positive ERPs for gazed than non-gazed stimuli. This dissociation in the pattern and timing of the effects of gaze and attention on somatosensory processing reveals that gaze and attention have independent effects on touch.
Affiliation(s)
- Elena Gherri: Cognitive Neuroscience Research Unit, City University London, UK
- Bettina Forster: Cognitive Neuroscience Research Unit, City University London, UK
10. Harris LR, Carnevale MJ, D’Amour S, Fraser LE, Harrar V, Hoover AEN, Mander C, Pritchett LM. How our body influences our perception of the world. Front Psychol 2015; 6:819. [PMID: 26124739; PMCID: PMC4464078; DOI: 10.3389/fpsyg.2015.00819]
Abstract
Incorporating the fact that the senses are embodied is necessary for an organism to interpret sensory information. Before a unified perception of the world can be formed, sensory signals must be processed with reference to body representation. The various attributes of the body, such as shape, proportion, posture, and movement, can be derived from the various sensory systems and can affect perception of the world (including the body itself). In this review we examine the relationships between sensory and motor information, body representations, and perceptions of the world and the body. We provide several examples of how the body affects perception (including but not limited to body perception). First, we show that body orientation affects visual distance perception and object orientation. Also, visual-auditory crossmodal correspondences depend on the orientation of the body: audio "high" frequencies correspond to a visual "up" defined by both gravity and body coordinates. Next, we show that the perceived location of touch is affected by the orientation of the head and eyes on the body, suggesting a visual component to coding body locations. Additionally, the reference frame used for coding touch locations seems to depend on whether gaze is static or moved relative to the body during the tactile task. Perceived attributes of the body, such as body size, affect tactile perception even at the level of detection thresholds and two-point discrimination. Next, long-range tactile masking provides clues to the posture of the body in a canonical body schema. Finally, ownership of seen body parts depends on the orientation and perspective of the body part in view. Together, all of these findings demonstrate how sensory and motor information, body representations, and perceptions (of the body and the world) are interdependent.
Affiliation(s)
- Laurence R. Harris: Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada; Department of Psychology, York University, Toronto, ON, Canada
- Michael J. Carnevale: Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada; Department of Psychology, York University, Toronto, ON, Canada
- Sarah D’Amour: Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada; Department of Psychology, York University, Toronto, ON, Canada
- Lindsey E. Fraser: Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada; Department of Psychology, York University, Toronto, ON, Canada
- Vanessa Harrar: School of Optometry, University of Montreal, Montreal, QC, Canada
- Adria E. N. Hoover: Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada; Department of Psychology, York University, Toronto, ON, Canada
- Charles Mander: Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada; Department of Psychology, York University, Toronto, ON, Canada
- Lisa M. Pritchett: Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada; Department of Psychology, York University, Toronto, ON, Canada
11. Margolis AN, Longo MR. Visual detail about the body modulates tactile localisation biases. Exp Brain Res 2014; 233:351-8. [DOI: 10.1007/s00221-014-4118-3]
12. Mueller S, Fiehler K. Effector movement triggers gaze-dependent spatial coding of tactile and proprioceptive-tactile reach targets. Neuropsychologia 2014; 62:184-93. [DOI: 10.1016/j.neuropsychologia.2014.07.025]
13.
Abstract
We investigated whether the relative position of objects and the body would influence haptic recognition. People felt objects on the right or left side of their body midline, using their right hand. Their head was turned towards or away from the object, and they could not see their hands or the object. People were better at naming 2-D raised line drawings and 3-D small-scale models of objects and also real, everyday objects when they looked towards them. However, this head-towards benefit was reliable only when their right hand crossed their body midline to feel objects on their left side. Thus, haptic object recognition was influenced by people's head position, although vision of their hand and the object was blocked. This benefit of turning the head towards the object being explored suggests that proprioceptive and haptic inputs are remapped into an external coordinate system and that this remapping is harder when the body is in an unusual position (with the hand crossing the body midline and the head turned away from the hand). The results indicate that haptic processes align sensory inputs from the hand and head even though either hand-centered or object-centered coordinate systems should suffice for haptic object recognition.
14. Gherri E, Forster B. Attention to the body depends on eye-in-orbit position. Front Psychol 2014; 5:683. [PMID: 25071653; PMCID: PMC4086396; DOI: 10.3389/fpsyg.2014.00683]
Abstract
Attentional selectivity in touch is modulated by the position of the body in external space. For instance, during endogenous attention tasks in which tactile stimuli are presented to the hands, the effect of attention is reduced when the hands are placed far apart compared to when they are close together, and when the hands are crossed compared to when they are in their anatomical position. This suggests that both somatotopic and external spatial reference frames coding the hands’ locations contribute to the spatial selection of the relevant hand. Here we investigate whether tactile selection of hands is also modulated by the position of other body parts not directly involved in tactile perception, such as the eye in the orbit (gaze direction). We asked participants to perform the same sustained tactile attention task while gazing laterally toward an eccentric fixation point (Eccentric gaze) or toward a central fixation point (Central gaze). Event-related potentials recorded in response to tactile non-target stimuli presented to the attended or unattended hand were compared as a function of gaze direction (Eccentric vs. Central conditions). Results revealed that attentional modulations were reduced in the Eccentric gaze condition as compared to the Central gaze condition in the time range of the Nd component (200–260 ms post-stimulus), demonstrating for the first time that the attentional selection of one of the hands is affected by the position of the eye in the orbit. Directing the eyes toward an eccentric position might be sufficient to create a misalignment between external and somatotopic frames of reference, reducing tactile attention. This suggests that the eye-in-orbit position contributes to the spatial selection of the task-relevant body part.
Affiliation(s)
- Elena Gherri: Department of Psychology, University of Edinburgh, Edinburgh, UK
15. Vibrotactile masking through the body. Exp Brain Res 2014; 232:2859-63. [DOI: 10.1007/s00221-014-3955-4]
16. Longo MR. The effects of immediate vision on implicit hand maps. Exp Brain Res 2014; 232:1241-7. [PMID: 24449015; DOI: 10.1007/s00221-014-3840-1]
Abstract
Perceiving the external spatial location of the limbs using position sense requires that immediate proprioceptive afferent signals be combined with a stored body model specifying the size and shape of the body. Longo and Haggard (Proc Natl Acad Sci USA 107:11727-11732, 2010) developed a method to isolate and measure this body model in the case of the hand in which participants judge the perceived location in external space of several landmarks on their occluded hand. The spatial layout of judgments of different landmarks is used to construct implicit hand maps, which can then be compared with actual hand shape. Studies using this paradigm have revealed that the body model of the hand is massively distorted, in a highly stereotyped way across individuals, with large underestimation of finger length and overestimation of hand width. Previous studies using this paradigm have allowed participants to see the locations of their judgments on the occluding board. Several previous studies have demonstrated that immediate vision, even when wholly non-informative, can alter processing of somatosensory signals and alter the reference frame in which they are localised. The present study therefore investigated whether immediate vision contributes to the distortions of implicit hand maps described previously. Participants judged the external spatial location of the tips and knuckles of their occluded left hand either while being able to see where they were pointing (as in previous studies) or while blindfolded. The characteristic distortions of implicit hand maps reported previously were clearly apparent in both conditions, demonstrating that the distortions are not an artefact of immediate vision. However, there were significant differences in the magnitude of distortions in the two conditions, suggesting that vision may modulate representations of body size and shape, even when entirely non-informative.
Affiliation(s)
- Matthew R Longo: Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London, WC1E 7HX, UK
17. Steenbergen P, Buitenweg JR, Trojan J, Klaassen B, Veltink PH. Subject-level differences in reported locations of cutaneous tactile and nociceptive stimuli. Front Hum Neurosci 2012; 6:325. [PMID: 23226126; PMCID: PMC3510457; DOI: 10.3389/fnhum.2012.00325]
Abstract
Recent theoretical advances on the topic of body representations have raised the question whether spatial perception of touch and nociception involves the same representations. Various authors have established that subjective localizations of touch and nociception are displaced in a systematic manner. The relation between veridical stimulus locations and localizations can be described in the form of a perceptual map; these maps differ between subjects. Recently, evidence was found that a common set of body representations underlies spatial perception of touch and slow and fast pain, receiving information from modality-specific primary representations. There are neurophysiological clues that the various cutaneous senses may not share the same primary representation. If this is the case, then differences in primary representations between touch and nociception may cause subject-dependent differences in perceptual maps of these modalities. We studied localization of tactile and nociceptive sensations on the forearm using electrocutaneous stimulation. The perceptual maps of these modalities differed at the group level. When assessed for individual subjects, the differences in localization varied in nature between subjects. The agreement between the perceptual maps of the two modalities was moderate. These findings are consistent with a common internal body representation underlying spatial perception of touch and nociception. The subject-level differences suggest that, in addition to these representations, other aspects, possibly differences in primary representation and/or the influence of stimulus parameters, lead to differences in perceptual maps in individuals.
Affiliation(s)
- Peter Steenbergen: Biomedical Signals and Systems, Mira Institute for Biomedical Technology and Technical Medicine, University of Twente, Enschede, Netherlands
18. Pritchett LM, Carnevale MJ, Harris LR. Reference frames for coding touch location depend on the task. Exp Brain Res 2012; 222:437-45. [PMID: 22941315; DOI: 10.1007/s00221-012-3231-4]
Abstract
The position of gaze (eye plus head position) relative to the body is known to alter the perceived locations of sensory targets. This effect suggests that perceptual space is at least partially coded in a gaze-centered reference frame. However, the direction of the effects reported has not been consistent. Here, we investigate the cause of a discrepancy between reported directions of shift in tactile localization related to head position. We demonstrate that head eccentricity can cause errors in touch localization in either the same or the opposite direction to the head turn, depending on the procedure used. When head position is held eccentric during both the presentation of a touch and the response, there is a shift in the direction opposite to the head. When the head is returned to center before reporting, the shift is in the same direction as head eccentricity. We rule out a number of possible explanations for the difference and conclude that when the head is moved between a touch and response, the touch is coded in a predominantly gaze-centered reference frame, whereas when the head remains stationary, a predominantly body-centered reference frame is used. The mechanism underlying these displacements in perceived location is proposed to involve an underestimated gaze signal. We propose a model demonstrating how this single neural error could cause localization errors in either direction depending on whether the gaze or body midline is used as a reference. This model may be useful in explaining gaze-related localization errors in other modalities.
Affiliation(s)
- Lisa M Pritchett: Centre for Vision Research, York University, Toronto, ON, Canada
19. Meyer GF, Noppeney U. Multisensory integration: from fundamental principles to translational research. Exp Brain Res 2011; 213:163-6. [PMID: 21800253; DOI: 10.1007/s00221-011-2803-z]