1. Otsuka S, Gao H, Hiraoka K. Contribution of external reference frame to tactile localization. Exp Brain Res 2024;242:1957-1970. PMID: 38918211. DOI: 10.1007/s00221-024-06877-w.
Abstract
The purpose of the present study was to elucidate whether an external reference frame contributes to tactile localization in blindfolded healthy humans. In one session, the right forearm was passively moved until the elbow reached the target angle, and participants reached with the left index finger to the right middle fingertip. The locus of the right middle fingertip indicated by the participants deviated in the direction of elbow extension when vibration was applied to the biceps brachii muscle during the passive movement. This finding indicates that proprioception contributes to identifying the spatial coordinate of a specific body part in an external reference frame. In another session, a tactile stimulus was delivered to the dorsum of the right hand during the passive movement, and participants reached with the left index finger to the spatial locus at which the tactile stimulus had been delivered. Vibration of the biceps brachii muscle did not change the perceived locus of the tactile stimulus indicated by the left index finger. This finding indicates that an external reference frame does not contribute to tactile localization during passive movement. Humans may instead estimate the spatial coordinate of a tactile stimulus from the time elapsed between movement onset and stimulus delivery.
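The timing account in the final sentence can be made concrete: if the passive rotation is roughly constant in speed, the elapsed time at stimulus onset maps linearly onto elbow angle. A minimal sketch of that idea follows; the function name and all numbers are hypothetical, not taken from the study.

```python
# Hypothetical sketch of the timing account of tactile localization during
# passive movement: the stimulus location is inferred from elapsed time under
# an internal model of the movement, not from an external reference frame.
def estimated_elbow_angle(t_stimulus_s: float,
                          start_angle_deg: float = 90.0,
                          velocity_deg_per_s: float = 10.0) -> float:
    """Elbow angle attributed to the stimulus, assuming constant passive rotation."""
    return start_angle_deg + velocity_deg_per_s * t_stimulus_s

# A touch delivered 2 s after movement onset is attributed to the posture the
# arm would occupy at 110 degrees under this internal model.
print(estimated_elbow_angle(2.0))  # 110.0
```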
Affiliation(s)
- Shunsuke Otsuka
- College of Health and Human Sciences, Osaka Prefecture University, Habikino City, Japan
- Han Gao
- Graduate School of Rehabilitation Science, Osaka Metropolitan University, Habikino City, Japan
- Koichi Hiraoka
- Department of Rehabilitation Science, School of Medicine, Osaka Metropolitan University, Habikino City, Japan
2. Chen Q, Dong Y, Gai Y. Tactile Location Perception Encoded by Gamma-Band Power. Bioengineering (Basel) 2024;11:377. PMID: 38671798. PMCID: PMC11048554. DOI: 10.3390/bioengineering11040377.
Abstract
BACKGROUND The perception of tactile-stimulation locations is an important function of the human somatosensory system during body movements and interactions with the surroundings. Previous psychophysical and neurophysiological studies have focused on spatial location perception of the upper body. In this study, we recorded single-trial electroencephalography (EEG) responses evoked by four vibrotactile stimulators placed on the buttocks and thighs while the human subject was sitting in a chair with a cushion. METHODS Fourteen human subjects were instructed to sit in a chair for a duration of 1 h or 1 h and 45 min. Two types of cushions were tested with each subject: a foam cushion and an air-cell-based cushion designed for wheelchair users to alleviate tissue stress. Vibrotactile stimulations were applied to the sitting interface at the beginning and end of the sitting period. Somatosensory-evoked potentials were obtained using 32-channel EEG. An artificial neural network was used to predict the tactile locations based on the evoked EEG power. RESULTS We found that single-trial beta (13-30 Hz) and gamma (30-50 Hz) waves best predicted the tactor locations, with an accuracy of up to 65%. Female subjects showed the highest performance, while males' sensitivity tended to degrade after the sitting period. A three-way ANOVA indicated that the air-cell cushion maintained location sensitivity better than the foam cushion. CONCLUSION Our findings show that tactile location information is encoded in EEG responses, and they provide insights into the fundamental mechanisms of the tactile system as well as into applications in brain-computer interfaces that rely on tactile stimulation.
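The decoding pipeline described above, band-limited EEG power fed to a neural-network classifier, can be sketched as follows. The sampling rate, filter design, epoch shapes, and network size are illustrative assumptions, not the authors' exact configuration.

```python
# Hedged sketch of decoding tactor location from single-trial EEG band power.
# Shapes, filter settings, and classifier choice are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.neural_network import MLPClassifier

FS = 250  # assumed sampling rate in Hz

def band_power(epochs: np.ndarray, low: float, high: float) -> np.ndarray:
    """Mean power per channel in a frequency band; epochs: (trials, channels, samples)."""
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="band")
    filtered = filtfilt(b, a, epochs, axis=-1)
    return (filtered ** 2).mean(axis=-1)

# Random data standing in for 32-channel single-trial epochs and 4 tactor labels.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 32, FS))
labels = rng.integers(0, 4, size=200)

features = np.hstack([band_power(epochs, 13, 30),   # beta band
                      band_power(epochs, 30, 50)])  # gamma band
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(features, labels)
print(clf.score(features, labels))
```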
Affiliation(s)
- Yan Gai
- Biomedical Engineering, School of Science and Engineering, Saint Louis University, 3507 Lindell Blvd, St. Louis, MO 63103, USA; (Q.C.); (Y.D.)
3. Merz S, Frings C, Spence C. Motion perception in touch: resolving contradictory findings by varying probabilities of different trial types. Psychol Res 2024;88:148-155. PMID: 37369933. PMCID: PMC10805958. DOI: 10.1007/s00426-023-01849-1.
Abstract
Representational momentum describes the typical overestimation of the final location of a moving stimulus in the direction of stimulus motion. While systematically observed in other sensory modalities, especially vision and audition, empirical findings in touch are mixed, with some published studies reporting the phenomenon and others not. In the present study, one possible moderating variable, the relative probability of different trial types, was explored in an attempt to resolve the seemingly contradictory findings in the literature. In some studies, only consistently moving target stimuli were presented and no representational momentum was observed, while other studies included inconsistently moving target stimuli in the same experimental block and did observe representational momentum. Therefore, the present study systematically compared the localization of consistently moving target stimuli across two experimental blocks, in which consistent motion trials were presented either alone or mixed with inconsistent target motion trials. The results indicate a strong influence of the probability of different trial types on the occurrence of representational momentum: representational momentum only occurred when both trial types (inconsistent and consistent target motion) were presented within one experimental block. The results are discussed in light of recent theoretical advances in the literature, namely the speed prior account of motion perception.
Affiliation(s)
- Simon Merz
- Department of Psychology, Cognitive Psychology, University of Trier, Universitätsring 15, 54286 Trier, Germany
- Christian Frings
- Department of Psychology, Cognitive Psychology, University of Trier, Universitätsring 15, 54286 Trier, Germany
- Charles Spence
- Department of Experimental Psychology, University of Oxford, Oxford, UK
4. Nakamura F, Verhulst A, Sakurada K, Fukuoka M, Sugimoto M. Evaluation of Spatial Directional Guidance Using Cheek Haptic Stimulation in a Virtual Environment. Front Comput Sci 2022. DOI: 10.3389/fcomp.2022.733844.
Abstract
Spatial cues play an important role in navigating people through both physical and virtual spaces. In spatial navigation, visual information supplemented by additional cues, such as haptic cues, enables effective guidance. Most haptic devices deliver mechanical stimuli to various body parts, yet few stimulate the head despite its excellent sensitivity. This article presents Virtual Whiskers, a spatial directional guidance technique that stimulates the cheeks using tiny robot arms attached to a head-mounted display (HMD). The tip of each robot arm carries photo-reflective sensors to detect the distance between the tip and the cheek surface. Using the robot arms, we stimulate the point on the cheek obtained by calculating the intersection between the cheek surface and the target direction. In the directional guidance experiment, we investigated how accurately participants identify the target direction provided by our guidance method, evaluating the error between the actual target direction and the direction indicated by the participant. Our method achieved an average absolute directional error of 2.54° in the azimuth plane and 6.54° in the elevation plane. We also conducted a spatial guidance experiment to evaluate task performance in a target search task, comparing visual-only, visual+audio, and visual+haptic conditions on task completion time, System Usability Scale (SUS) score, and NASA-TLX score. Mean task completion times were 6.39 s (SD = 3.34), 5.62 s (SD = 3.12), and 4.35 s (SD = 2.26) in the visual-only, visual+audio, and visual+haptic conditions, respectively. SUS scores were 55.83 (SD = 20.40), 47.78 (SD = 20.09), and 80.42 (SD = 10.99), and NASA-TLX scores were 75.81 (SD = 16.89), 67.57 (SD = 14.96), and 38.83 (SD = 18.52), respectively. Statistical tests revealed significant differences in task completion time, SUS score, and NASA-TLX score between the visual-only and visual+haptic conditions and between the visual+audio and visual+haptic conditions.
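The stimulation point described above is the intersection of the target direction with the cheek surface. Below is a minimal sketch of that geometry, approximating the cheek as a sphere; the spherical approximation and all coordinates are assumptions for illustration, not the authors' model.

```python
# Illustrative sketch of the guidance geometry: find where a ray from the head
# center along the target direction meets the cheek surface. The spherical
# cheek approximation and the example numbers are assumptions.
import numpy as np

def cheek_contact_point(target_dir: np.ndarray,
                        cheek_center: np.ndarray,
                        cheek_radius: float) -> np.ndarray | None:
    """Intersect a ray from the origin (head center) with a sphere approximating the cheek."""
    d = target_dir / np.linalg.norm(target_dir)
    # Solve |t*d - c|^2 = r^2 for the smallest positive t (quadratic in t).
    b = -2.0 * d.dot(cheek_center)
    c = cheek_center.dot(cheek_center) - cheek_radius ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # the target direction misses the cheek sphere
    t = (-b - np.sqrt(disc)) / 2.0
    return t * d if t > 0 else None

# Example: target slightly down-right; cheek sphere 6 cm to the right, 5 cm radius.
print(cheek_contact_point(np.array([1.0, -0.2, 0.1]),
                          np.array([0.06, -0.03, 0.0]), 0.05))
```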
5. Fabio C, Salemme R, Koun E, Farnè A, Miller LE. Alpha Oscillations Are Involved in Localizing Touch on Handheld Tools. J Cogn Neurosci 2022;34:675-686. DOI: 10.1162/jocn_a_01820.
Abstract
The sense of touch is not restricted to the body but can also extend to external objects. When we use a handheld tool to contact an object, we feel the touch on the tool and not in the hand holding the tool. The ability to perceive touch on a tool actually extends along its entire surface, allowing the user to localize where it is touched much as they would on their own body. Although the neural mechanisms underlying the ability to localize touch on the body have been extensively investigated, those enabling touch localization on a tool remain unknown. We aimed to fill this gap by recording the electroencephalographic (EEG) signals of participants while they localized tactile stimuli on a handheld rod. We focused on oscillatory activity in the alpha (7–14 Hz) and beta (15–30 Hz) ranges, as these have previously been linked to distinct spatial codes used to localize touch on the body: beta activity reflects the mapping of touch in skin-based coordinates, whereas alpha activity reflects the mapping of touch in external space. We found that only alpha activity was modulated by the location of tactile stimuli applied to the handheld rod. Source reconstruction suggested that this alpha power modulation was localized in a network of fronto-parietal regions previously implicated in higher-order tactile and spatial processing. These findings are the first to implicate alpha oscillations in tool-extended sensing and suggest an important role for processing touch in external space when localizing touch on a tool.
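Single-trial alpha power of the kind analysed here is commonly obtained by band-pass filtering and taking the squared Hilbert envelope; here is a minimal sketch with an assumed sampling rate and filter order, not the authors' exact processing chain.

```python
# Minimal sketch of extracting single-trial alpha (7-14 Hz) power from EEG,
# as commonly done in oscillation studies; all parameters are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 500  # assumed sampling rate, Hz

def alpha_power(trial: np.ndarray) -> np.ndarray:
    """Alpha-band instantaneous power per channel; trial: (channels, samples)."""
    b, a = butter(4, [7 / (FS / 2), 14 / (FS / 2)], btype="band")
    analytic = hilbert(filtfilt(b, a, trial, axis=-1), axis=-1)
    return np.abs(analytic) ** 2  # squared envelope = instantaneous power

trial = np.random.default_rng(1).standard_normal((64, 2 * FS))
print(alpha_power(trial).mean(axis=-1).shape)  # (64,) mean alpha power per channel
```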
Affiliation(s)
- Cécile Fabio
- ImpAct, Lyon Neuroscience Research Center, France
- University of Lyon 1, France
- Romeo Salemme
- ImpAct, Lyon Neuroscience Research Center, France
- University of Lyon 1, France
- Hospices Civils de Lyon, Neuro-immersion, France
- Eric Koun
- ImpAct, Lyon Neuroscience Research Center, France
- University of Lyon 1, France
- Hospices Civils de Lyon, Neuro-immersion, France
- Alessandro Farnè
- ImpAct, Lyon Neuroscience Research Center, France
- University of Lyon 1, France
- Hospices Civils de Lyon, Neuro-immersion, France
- University of Trento, Rovereto, Italy
- Luke E. Miller
- ImpAct, Lyon Neuroscience Research Center, France
- University of Lyon 1, France
- Hospices Civils de Lyon, Neuro-immersion, France
- Donders Institute for Brain, Nijmegen, The Netherlands
6. Dupin L, Cuenca M, Baron JC, Maier MA, Lindberg PG. Shrinking of spatial hand representation but not of objects across the lifespan. Cortex 2021;146:173-185. PMID: 34883309. DOI: 10.1016/j.cortex.2021.10.009.
Abstract
Perception and action are based on cerebral spatial representations of the body and the external world. However, spatial representations differ from the physical characteristics of the body and of external space (e.g., objects). It remains unclear whether these discrepancies are related to the functional requirements of action and are shared between different spatial representations, which would indicate common brain processes. We hypothesized that distortions of spatial hand representation would be affected by age, sensorimotor practice, and external space representation. We assessed hand representations using tactile and verbal localization tasks and quantified object representation in three age groups (20-79 years, total n = 60). Our results show significant shrinking of spatial hand representation (hand width) with age, unrelated to sensorimotor function. No such shrinking occurred in spatial object representation, despite some characteristics common to hand representation. Therefore, the spatial properties of body representation partially share characteristics of object representation but also evolve independently across the lifespan.
Affiliation(s)
- Lucile Dupin
- Institut de Psychiatrie et Neurosciences de Paris, Inserm U1266, Université de Paris, Paris, France
- Macarena Cuenca
- Centre de Recherche Clinique, GHU, Hôpital Sainte-Anne, Paris, France
- Jean-Claude Baron
- Institut de Psychiatrie et Neurosciences de Paris, Inserm U1266, Université de Paris, Paris, France
- Marc A Maier
- Université de Paris, INCC UMR 8002, CNRS, Paris, France
- Påvel G Lindberg
- Institut de Psychiatrie et Neurosciences de Paris, Inserm U1266, Université de Paris, Paris, France
7. Ujitoko Y, Tokuhisa R, Sakurai S, Hirota K. Impact Vibration Source Localization in Two-Dimensional Space Around Hand. IEEE Trans Haptics 2021;14:862-873. PMID: 34061752. DOI: 10.1109/toh.2021.3085756.
Abstract
This article investigated the ability to localize an impulse vibration source outside the body in two-dimensional space. We tested whether humans can recognize the direction or distance of an impulse vibration source when using their hand to pick up the spatiotemporal vibrotactile information carried by the wave propagating from the source. Specifically, we had users place their hands on a silicone rubber sheet in several postures and asked them to indicate the position of the vibration source when a location on the sheet was indented. Experimental results suggested that the direction of the impact vibration source can be recognized to some extent, although recognition accuracy depends on hand posture and the position of the vibration source. The best results were achieved when the fingers and palm were grounded and the vibration source was presented around the middle fingertip; the directional recognition error in this case was 6°. In contrast, the results suggest that it is difficult to accurately recognize the distance of the vibration source. These findings point to a new possibility for directional displays in which vibrotactile actuators are embedded at a distance from the user's hand.
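As one illustrative way to think about the spatiotemporal information carried by the propagating wave (this is not the authors' analysis), the arrival-time difference between two skin contact points constrains the source direction if the surface-wave speed is known:

```python
# Illustrative model only: direction of an impact source estimated from the
# arrival-time difference at two contact points on the sheet, assuming a
# known, constant surface-wave speed. All values are made up.
import numpy as np

WAVE_SPEED = 40.0  # m/s, assumed propagation speed in the rubber sheet

def direction_from_delay(p1: np.ndarray, p2: np.ndarray, dt: float) -> float:
    """Angle (deg) of a far-field source from the arrival-time difference dt = t1 - t2."""
    baseline = np.linalg.norm(p2 - p1)
    # Far-field approximation: path difference = baseline * cos(theta) = wave_speed * dt.
    cos_theta = np.clip(WAVE_SPEED * dt / baseline, -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))

# Contact points 8 cm apart; the wave arrives 1 ms earlier at p1 than at p2.
print(direction_from_delay(np.array([0.0, 0.0]),
                           np.array([0.08, 0.0]), -0.001))  # ~120 deg from the p1->p2 axis
```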
8. Omnidirectional Haptic Guidance for the Hearing Impaired to Track Sound Sources. Signals 2021. DOI: 10.3390/signals2030030.
Abstract
We developed a hearing assistance system that enables hearing-impaired people to track the horizontal movement of a single sound source. The movement of the sound source is presented to the subject by vibrators on both shoulders, driven according to the distance to and direction of the sound source, which are estimated from the acoustic signals detected by microphones attached to both ears. The direction is conveyed by changing the ratio of the intensities of the two vibrators, and the distance by increasing the overall intensity as the source comes closer. The subject can recognize an approaching sound source as a change in vibration intensity and can face it by turning toward the direction at which the intensities of both vibrators are equal. The direction of the moving sound source could be tracked with an error of less than 5° when an analog vibration pattern was added to indicate the direction of the sound source. By presenting the direction of the sound source with high accuracy, the system can show subjects the approach and departure of a sound source.
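The presentation scheme amounts to intensity panning across the two shoulder vibrators plus a distance-to-gain mapping. Here is a minimal sketch; the linear mappings and constants are assumptions for illustration, not the system's actual calibration.

```python
# Hedged sketch of the two-vibrator presentation scheme: direction sets the
# left/right intensity ratio, distance scales the overall intensity.
def shoulder_intensities(direction_deg: float, distance_m: float,
                         max_intensity: float = 1.0, max_range_m: float = 5.0):
    """Return (left, right) vibration intensities.

    direction_deg: source direction relative to the face, -90 (left) to +90 (right).
    A frontal source (0 deg) yields equal intensities, so the user can face it.
    """
    pan = (direction_deg + 90.0) / 180.0          # 0 = full left, 1 = full right
    gain = max_intensity * max(0.0, 1.0 - distance_m / max_range_m)  # closer = stronger
    return (gain * (1.0 - pan), gain * pan)

print(shoulder_intensities(0.0, 2.5))    # equal intensities when facing the source
print(shoulder_intensities(30.0, 2.5))   # stronger on the right shoulder
```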
9. Intact tactile detection yet biased tactile localization in a hand-centered frame of reference: Evidence from a dissociation. Neuropsychologia 2020;147:107585. PMID: 32841632. DOI: 10.1016/j.neuropsychologia.2020.107585.
Abstract
We examined the performance of an individual with subcortical damage but an intact somatosensory thalamocortical pathway in order to investigate the functional architecture of tactile detection and tactile localization processes. Consistent with the intact somatosensory thalamocortical pathway, tactile detection on the contralesional hand was well within the normal range. Despite intact detection, the individual demonstrated substantial localization biases. Across all localization experiments, he consistently localized tactile stimuli toward the left side of space relative to the long axis of his hand. This was observed when the contralesional hand was palm up, palm down, rotated 90° relative to the trunk, and when making verbal responses. Furthermore, control experiments demonstrated that this response pattern was unlikely to reflect a motor response error. These findings indicate that tactile localization on the body is influenced by proprioceptive information specifically in a hand-centered frame of reference. They also provide evidence that aspects of tactile localization are mediated by pathways outside the primary somatosensory thalamocortical pathway.
10. Christie BP, Charkhkar H, Shell CE, Marasco PD, Tyler DJ, Triolo RJ. Visual inputs and postural manipulations affect the location of somatosensory percepts elicited by electrical stimulation. Sci Rep 2019;9:11699. PMID: 31406122. PMCID: PMC6690924. DOI: 10.1038/s41598-019-47867-1.
Abstract
The perception of somatosensation requires the integration of multimodal information, yet the effects of vision and posture on somatosensory percepts elicited by neural stimulation are not well established. In this study, we applied electrical stimulation directly to the residual nerves of trans-tibial amputees to elicit sensations referred to their missing feet. We evaluated the influence of congruent and incongruent visual inputs and postural manipulations on the perceived size and location of stimulation-evoked somatosensory percepts. We found that although standing upright may cause percept size to change, congruent visual inputs and/or body posture resulted in better localization. We also observed visual capture: the location of a somatosensory percept shifted toward a visual input when vision was incongruent with stimulation-induced sensation. Visual capture did not occur when an adopted posture was incongruent with somatosensation. Our results suggest that internal model predictions based on postural manipulations reinforce perceived sensations, but do not alter them. These characterizations of multisensory integration are important for the development of somatosensory-enabled prostheses because current neural stimulation paradigms cannot replicate the afferent signals of natural tactile stimuli. Nevertheless, multisensory inputs can improve perceptual precision and highlight regions of the foot important for balance and locomotion.
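The precision benefit noted in the closing sentences is commonly formalized as reliability-weighted cue combination. The following maximum-likelihood rule is a textbook reference model, not the authors' analysis; x_V and x_S denote the visual and somatosensory location estimates with variances σ_V² and σ_S²:

\[ \hat{x} = \frac{\sigma_V^{-2}\,x_V + \sigma_S^{-2}\,x_S}{\sigma_V^{-2} + \sigma_S^{-2}}, \qquad \sigma_{\hat{x}}^{2} = \frac{\sigma_V^{2}\,\sigma_S^{2}}{\sigma_V^{2} + \sigma_S^{2}} \le \min\!\big(\sigma_V^{2}, \sigma_S^{2}\big). \]

When vision is the more reliable cue, the combined estimate is pulled toward the visual input, which is one way of reading the visual-capture result, and the combined variance is never worse than that of the better single cue.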
Affiliation(s)
- Breanne P Christie
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA
- Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
- Hamid Charkhkar
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA
- Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
- Courtney E Shell
- Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
- Department of Biomedical Engineering, Lerner Research Institute, Cleveland Clinic, Cleveland, OH, USA
- Paul D Marasco
- Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
- Department of Biomedical Engineering, Lerner Research Institute, Cleveland Clinic, Cleveland, OH, USA
- Dustin J Tyler
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA
- Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
- Ronald J Triolo
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, USA
- Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, USA
11. Arnold G, Sarlegna FR, Fernandez LG, Auvray M. Somatosensory Loss Influences the Adoption of Self-Centered Versus Decentered Perspectives. Front Psychol 2019;10:419. PMID: 30914989. PMCID: PMC6421312. DOI: 10.3389/fpsyg.2019.00419.
Abstract
The body and the self are commonly experienced as forming a unity. Experiencing the external world as distinct from the self and the body relies strongly on adopting a single self-centered perspective, which results in integrating multisensory sensations into one egocentric, body-centered reference frame. Body posture and somatosensory representations have been reported to influence perception, and specifically the reference frame relative to which multisensory sensations are coded. In the study reported here, we investigated the role of somatosensory and visual information in adopting self-centered and decentered spatial perspectives. Two deafferented patients, who have neither tactile nor proprioceptive perception below the head, and a group of age-matched control participants performed a graphesthesia task consisting of the recognition of ambiguous letters (b, d, p, and q) drawn tactilely on the surface of the head. To answer which letter was drawn, participants could adopt either a self-centered perspective or a decentered one (i.e., centered on a body part or on an external location). Their responses can, in turn, be used to infer how the letters' left-right and top-bottom axes were assigned with respect to the corresponding axes of their body. In order to evaluate the influence of body posture, the ambiguous letters were drawn on the forehead and on the left and right surfaces of the head, with the head aligned or rotated in yaw relative to the trunk. In order to evaluate the role of external information, the participants completed the task with their eyes open in one session and closed in another. The results obtained in control participants revealed that their preferred perspective varied with body posture but not with vision. Different results were obtained with the deafferented patients, who overall showed no significant effect of body posture on their preferred perspective. This result suggests that the orientation of their self is not influenced by their physical body. There was an effect of vision for only one of the two patients. The deafferented patients relied on strategies that are more prone to interindividual differences, which highlights the crucial role of somatosensory information in adopting self-centered spatial perspectives.
Affiliation(s)
- Gabriel Arnold
- Caylar, Villebon-sur-Yvette, France
- Institut des Systèmes Intelligents et de Robotique (ISIR), CNRS UMR 7222, Sorbonne Université, Paris, France
- Laura G Fernandez
- Institut des Systèmes Intelligents et de Robotique (ISIR), CNRS UMR 7222, Sorbonne Université, Paris, France
- Malika Auvray
- Institut des Systèmes Intelligents et de Robotique (ISIR), CNRS UMR 7222, Sorbonne Université, Paris, France
12.
Abstract
We report two experiments designed to investigate how the implied motion of tactile stimuli influences their perceived location. Predicting the location of sensory input is especially important for perceiving, and interacting with, the external world. Using two different experimental approaches, we report an overall pattern of localization shifts analogous to what has previously been described in the visual and auditory modalities: participants perceive the last location of a dynamic stimulus further along its trajectory than is objectively the case. In Experiment 1, participants judged whether the last vibration in a sequence of three was located closer to the wrist or to the elbow. In Experiment 2, they indicated the last location on a ruler attached to their forearm. We further pinpoint the effects of implied motion on tactile localization by investigating the independent influences of motion direction and perceptual uncertainty. Taken together, these findings underline the importance of dynamic information in localizing tactile stimuli on the skin.
13. Wang D, Peng C, Afzal N, Li W, Wu D, Zhang Y. Localization Performance of Multiple Vibrotactile Cues on Both Arms. IEEE Trans Haptics 2018;11:97-106. PMID: 28841557. DOI: 10.1109/toh.2017.2742507.
Abstract
To present information using vibrotactile stimuli in wearable devices, it is fundamental to understand how well humans can localize vibrotactile cues across the skin surface. In this paper, we studied the human ability to identify the locations of multiple vibrotactile cues activated simultaneously on both arms. Two haptic bands were mounted near the elbow and shoulder joints on each arm, and two vibrotactile motors were mounted on each band to provide vibration cues to the dorsal and palmar sides of the arm. Localization performance was compared under four conditions, with the number of simultaneously activated cues varying from one to four. Experimental results show that the rate of correct localization decreases linearly as the number of activated cues increases: it was 27.8 percent for three activated cues, and lower still for four. An analysis of the correct rates and error patterns shows that the layout of vibrotactile cues can have significant effects on the localization of multiple vibrotactile cues. These findings may provide guidelines for using vibrotactile cues to guide the simultaneous motion of multiple joints on both arms.
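With two bands per arm and two motors per band, the setup has eight cue sites, so chance performance for naming exactly which cues are active falls steeply with the number of cues; a quick check (reading chance as 1/C(8, k) is my assumption):

```python
# Chance level for identifying exactly which k of the 8 cue sites are active
# (2 arms x 2 bands x 2 motors = 8 sites, per the setup described above).
from math import comb

for k in range(1, 5):
    print(f"{k} cue(s): chance = {100 / comb(8, k):.2f}%")
# 1 cue(s): 12.50%   2: 3.57%   3: 1.79%   4: 1.43%
```

On this reading, the reported 27.8 percent correct for three cues is well above the roughly 1.8 percent chance level.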
14. Arnold G, Spence C, Auvray M. A unity of the self or a multiplicity of locations? How the graphesthesia task sheds light on the role of spatial perspectives in bodily self-consciousness. Conscious Cogn 2017;56:100-114. DOI: 10.1016/j.concog.2017.06.012.
15. Medina S, Tamè L, Longo MR. Tactile localization biases are modulated by gaze direction. Exp Brain Res 2017;236:31-42. PMID: 29018928. DOI: 10.1007/s00221-017-5105-2.
Abstract
Identifying the spatial location of touch on the skin surface is a fundamental function of our somatosensory system. Despite the fact that stimulation of even single mechanoreceptive afferent fibers is sufficient to produce clearly localized percepts, tactile localization can also be modulated by higher-level processes such as body posture. This suggests that tactile events are coded using multiple representations based on different coordinate systems. Recent reports provide evidence for systematic biases in tactile localization tasks, which are thought to result from a supramodal representation of the skin surface. While the influence of non-informative vision of the body and of gaze direction on tactile discrimination tasks has been extensively studied, their effects on tactile localization tasks remain largely unexplored. To address this question, participants performed a tactile localization task on their left hand under different visual conditions by means of a mirror box: in the mirror condition, a single stimulus was delivered to the participant's left hand while the reflection of the right hand was seen through the mirror; in the object condition, participants looked at a box through the mirror; and in the right-hand condition, participants looked directly at their right hand. Participants reported the location of the tactile stimuli on a silhouette of a hand. Results showed a shift in the localization of the touches towards the tip of the fingers (distal bias) and towards the thumb (radial bias) across conditions. Critically, distal biases were reduced when participants looked towards the mirror compared to when they looked at their right hand, suggesting that gaze direction reduces the typical proximo-distal biases in tactile localization. Moreover, vision of the hand modulates the internal configuration of the points' locations, elongating it along the radio-ulnar axis.
Affiliation(s)
- Sonia Medina
- Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK
- Luigi Tamè
- Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK
16. Tamè L, Dransfield E, Quettier T, Longo MR. Finger posture modulates structural body representations. Sci Rep 2017;7:43019. PMID: 28223685. PMCID: PMC5320438. DOI: 10.1038/srep43019.
Abstract
Patients with lesions of the left posterior parietal cortex commonly fail to identify their fingers, a condition known as finger agnosia, yet are relatively unimpaired in sensation and skilled action. Such dissociations have traditionally been interpreted as evidence that structural body representations (BSRs), such as the body structural description, are distinct from sensorimotor representations, such as the body schema. We investigated whether performance on tasks commonly used to assess finger agnosia is modulated by changes in hand posture. We used the 'in-between' test, in which participants estimate the number of unstimulated fingers between two touched fingers, and a localization task, in which participants judge which two fingers were stimulated. Across blocks, the fingers were placed at three levels of splay. Judged finger numerosity was analysed by direct report in Experiment 1 and as the actual number of fingers between the two fingers named in Experiment 2. In both experiments, judgments were greater when non-adjacent stimulated fingers were positioned far apart than when they were close together or touching, whereas judgments were unaltered when adjacent fingers were stimulated. This demonstrates that BSRs are not fixed, but are modulated by the real-time physical distances between body parts.
Affiliation(s)
- Luigi Tamè
- Department of Psychological Sciences, Birkbeck, University of London, London, United Kingdom
- Elanah Dransfield
- Department of Psychological Sciences, Birkbeck, University of London, London, United Kingdom
- Thomas Quettier
- Department of Psychological Sciences, Birkbeck, University of London, London, United Kingdom
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck, University of London, London, United Kingdom
17. Nociceptive-Evoked Potentials Are Sensitive to Behaviorally Relevant Stimulus Displacements in Egocentric Coordinates. eNeuro 2016;3:eN-NWR-0151-15. PMID: 27419217. PMCID: PMC4939400. DOI: 10.1523/eneuro.0151-15.2016.
Abstract
Feature selection has been extensively studied in the context of goal-directed behavior, where it is heavily driven by top-down factors. A more primitive version of this function is the detection of bottom-up changes in stimulus features in the environment. Indeed, the nervous system is tuned to detect fast-rising, intense stimuli that are likely to reflect threats, such as nociceptive somatosensory stimuli. These stimuli elicit large brain potentials that are maximal at the scalp vertex. When elicited by nociceptive laser stimuli, these responses are labeled laser-evoked potentials (LEPs). Although changes in stimulus modality and increases in stimulus intensity are known to evoke large LEPs, it has yet to be determined whether stimulus displacements affect the amplitude of the main LEP waves (N1, N2, and P2). Here, in three experiments, we characterized a set of rules that the human nervous system obeys when detecting changes in the spatial location of a nociceptive stimulus. We showed that the N2 wave is sensitive to (1) large displacements between consecutive stimuli in egocentric, but not somatotopic, coordinates, and (2) displacements that entail a behaviorally relevant change in the stimulus location. These findings indicate that nociceptive-evoked vertex potentials are sensitive to behaviorally relevant changes in the location of a nociceptive stimulus with respect to the body, and that the hand is a particularly behaviorally important site.
18. Disentangling the External Reference Frames Relevant to Tactile Localization. PLoS One 2016;11:e0158829. PMID: 27391805. PMCID: PMC4938545. DOI: 10.1371/journal.pone.0158829.
Abstract
Different reference frames appear to be relevant for tactile spatial coding. When participants give temporal order judgments (TOJs) for two tactile stimuli, one on each hand, performance declines when the hands are crossed. This effect is attributed to a conflict between anatomical and external location codes: hand crossing places the anatomically right hand on the left side of external space. However, hand crossing alone does not specify the anchor of the external reference frame, such as gaze, trunk, or the stimulated limb. Experiments that used explicit localization responses, such as pointing to tactile stimuli, rather than crossing manipulations have consistently implicated gaze-centered coding for touch. To test whether crossing effects can be explained by gaze-centered coding alone, participants made TOJs while the position of the hands was manipulated relative to gaze and trunk. The two hands either lay on different sides of space relative to gaze or trunk, or both lay on one side of the respective space. In the latter posture, one hand was on its "regular side of space" despite hand crossing, thus reducing the overall conflict between anatomical and external codes. TOJ crossing effects were significantly reduced when the hands were both located on the same side of space relative to gaze, indicating gaze-centered coding. Evidence for trunk-centered coding was tentative, with an effect in reaction time but not in accuracy. These results link paradigms that use explicit localization with TOJ paradigms and corroborate the relevance of gaze-related coding for touch. Yet gaze- and trunk-centered coding did not account for the total size of the crossing effects, suggesting that tactile localization relies on additional, possibly limb-centered, reference frames. Thus, tactile location appears to be estimated by integrating multiple anatomical and external reference frames.
19.
Abstract
Vestibular signals are integrated with signals from other sensory modalities. This convergence could reflect an important mechanism for maintaining the perception of the body. Here we review the current literature in order to develop a framework for understanding how the vestibular system contributes to body representation. According to recent models, we distinguish between three processes for body representation, and we look at whether vestibular signals might influence each process. These are (i) somatosensation, the primary sensory processing of somatic stimuli, (ii) somatoperception, the processes of constructing percepts and experiences of somatic objects and events and (iii) somatorepresentation, the knowledge about the body as a physical object in the world. Vestibular signals appear to contribute to all three levels in this model of body processing. Thus, the traditional view of the vestibular system as a low-level, dedicated orienting module tends to underestimate the pervasive role of vestibular input in bodily self-awareness.
Affiliation(s)
- Elisa Raffaella Ferrè
- Department of Psychology, Royal Holloway University of London, Egham, UK
- Institute of Cognitive Neuroscience, University College London, London, UK
- Patrick Haggard
- Institute of Cognitive Neuroscience, University College London, London, UK
20. Badde S, Heed T. Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cogn Neuropsychol 2016;33:26-47. PMID: 27327353. PMCID: PMC4975087. DOI: 10.1080/02643294.2016.1168791.
Abstract
Touch is bound to the skin – that is, to the boundaries of the body. Yet the activity of neurons in primary somatosensory cortex merely mirrors the spatial distribution of the sensors across the skin. To determine the location of a tactile stimulus on the body, the body's spatial layout must be considered. Moreover, to relate touch to the external world, body posture has to be evaluated. In this review, we argue that posture is incorporated, by default, for any tactile stimulus. However, the relevance of the external location, and thus its expression in behaviour, depends on various sensory and cognitive factors. Together, these factors imply that an external representation of touch dominates over the skin-based, anatomical one when our focus is on the world rather than on our own body. We conclude that touch localization is a reconstructive process that is adjusted to the context while maintaining all available spatial information.
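The weighted integration named in the title can be written schematically; the notation here is mine, and only the existence of context-dependent weights, not any particular values, reflects the review's argument:

\[ \hat{x}_{\mathrm{touch}} = w_{\mathrm{anat}}\,x_{\mathrm{anat}} + w_{\mathrm{ext}}\,x_{\mathrm{ext}}, \qquad w_{\mathrm{anat}} + w_{\mathrm{ext}} = 1, \]

where x_anat is the skin-based (anatomical) location code, x_ext the posture-derived external code, and w_ext grows when the focus is on the world rather than on one's own body.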
Affiliation(s)
- Stephanie Badde
- Department of Psychology, New York University, New York, NY, USA
- Tobias Heed
- Faculty of Psychology and Human Movement Science, University of Hamburg, Hamburg, Germany
21. Arnold G, Spence C, Auvray M. Taking someone else’s spatial perspective: Natural stance or effortful decentring? Cognition 2016;148:27-33. DOI: 10.1016/j.cognition.2015.12.006.
22. Tamè L, Longo MR. Inter-hemispheric integration of tactile-motor responses across body parts. Front Hum Neurosci 2015;9:345. PMID: 26124718. PMCID: PMC4466437. DOI: 10.3389/fnhum.2015.00345.
Abstract
In simple detection tasks, reaction times (RTs) are faster when stimuli are presented to the visual field or side of the body ipsilateral to the body part used to respond. This advantage, the crossed-uncrossed difference (CUD), is thought to reflect the inter-hemispheric interactions needed for sensorimotor information to be integrated between the two cerebral hemispheres. However, it is unknown whether the tactile CUD is invariant across stimulated body parts. The structure most likely mediating such processing is the corpus callosum (CC). Neurophysiological studies have shown that callosal connections are denser between regions representing proximal body parts near the body midline and sparser for regions representing the distal extremities. Therefore, if the transfer of information between the two hemispheres is affected by the density of callosal connections, stimuli presented to more distal regions of the body should produce a greater CUD than stimuli presented to more proximal regions, because interhemispheric transfer from regions with sparse callosal connections will be less efficient, and hence slower. Here, we investigated whether the CUD is modulated as a function of the body part stimulated by presenting tactile stimuli unpredictably on body parts at different distances from the body midline (the middle finger, forearm, or forehead of each side of the body). Participants detected the stimulus and responded as fast as possible using either their left or right foot. Results showed that the magnitude of the CUD was larger on the finger (~2.6 ms) and forearm (~1.8 ms) than on the forehead (~0.9 ms). This suggests that the interhemispheric transfer of tactile stimuli varies as a function of the strength of the callosal connections of the stimulated body parts.
Affiliation(s)
- Luigi Tamè
- Department of Psychological Sciences, Birkbeck, University of London, London, UK
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck, University of London, London, UK
23. Harris LR, Carnevale MJ, D’Amour S, Fraser LE, Harrar V, Hoover AEN, Mander C, Pritchett LM. How our body influences our perception of the world. Front Psychol 2015;6:819. PMID: 26124739. PMCID: PMC4464078. DOI: 10.3389/fpsyg.2015.00819.
Abstract
Incorporating the fact that the senses are embodied is necessary for an organism to interpret sensory information. Before a unified perception of the world can be formed, sensory signals must be processed with reference to body representation. The various attributes of the body, such as shape, proportion, posture, and movement, can be derived from the various sensory systems and can in turn affect perception of the world (including of the body itself). In this review we examine the relationships between sensory and motor information, body representations, and perceptions of the world and the body. We provide several examples of how the body affects perception (including, but not limited to, body perception). First, we show that body orientation affects visual distance perception and object orientation. Also, visual-auditory crossmodal correspondences depend on the orientation of the body: audio "high" frequencies correspond to a visual "up" defined by both gravity and body coordinates. Next, we show that the perceived location of touch is affected by the orientation of the head and eyes on the body, suggesting a visual component to the coding of body locations. Additionally, the reference frame used for coding touch locations seems to depend on whether gaze is static or moves relative to the body during the tactile task. Perceived attributes of the body, such as body size, affect tactile perception even at the level of detection thresholds and two-point discrimination. Next, long-range tactile masking provides clues to the posture of the body in a canonical body schema. Finally, ownership of seen body parts depends on the orientation and perspective of the body part in view. Together, these findings demonstrate how sensory and motor information, body representations, and perceptions (of the body and the world) are interdependent.
Affiliation(s)
- Laurence R. Harris
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Michael J. Carnevale
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Sarah D’Amour
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Lindsey E. Fraser
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Vanessa Harrar
- School of Optometry, University of Montreal, Montreal, QC, Canada
- Adria E. N. Hoover
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Charles Mander
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Lisa M. Pritchett
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
24.
Abstract
We investigated whether the position of objects relative to the body influences haptic recognition. People felt objects on the right or left side of their body midline, using their right hand. Their head was turned towards or away from the object, and they could not see their hands or the object. People were better at naming 2-D raised-line drawings, 3-D small-scale models of objects, and real, everyday objects when they looked towards them. However, this head-towards benefit was reliable only when the right hand crossed the body midline to feel objects on the left side. Thus, haptic object recognition was influenced by head position even though vision of the hand and the object was blocked. This benefit of turning the head towards the object being explored suggests that proprioceptive and haptic inputs are remapped into an external coordinate system, and that this remapping is harder when the body is in an unusual position (with the hand crossing the body midline and the head turned away from the hand). The results indicate that haptic processing aligns sensory inputs from the hand and head, even though either hand-centered or object-centered coordinate systems should suffice for haptic object recognition.
25. Gherri E, Forster B. Attention to the body depends on eye-in-orbit position. Front Psychol 2014;5:683. PMID: 25071653. PMCID: PMC4086396. DOI: 10.3389/fpsyg.2014.00683.
Abstract
Attentional selectivity in touch is modulated by the position of the body in external space. For instance, during endogenous attention tasks in which tactile stimuli are presented to the hands, the effect of attention is reduced when the hands are placed far apart compared to when they are close together, and when the hands are crossed compared to when they are in their anatomical position. This suggests that both somatotopic and external spatial reference frames coding the hands' locations contribute to the spatial selection of the relevant hand. Here we investigate whether tactile selection of the hands is also modulated by the position of body parts not directly involved in tactile perception, such as the eye in its orbit (gaze direction). We asked participants to perform the same sustained tactile attention task while gazing laterally toward an eccentric fixation point (Eccentric gaze) or toward a central fixation point (Central gaze). Event-related potentials recorded in response to tactile non-target stimuli presented to the attended or unattended hand were compared as a function of gaze direction (Eccentric vs. Central conditions). Results revealed that attentional modulations were reduced in the Eccentric gaze condition compared to the Central gaze condition in the time range of the Nd component (200–260 ms post-stimulus), demonstrating for the first time that the attentional selection of one of the hands is affected by the position of the eye in the orbit. Directing the eyes toward an eccentric position might be sufficient to create a misalignment between external and somatotopic frames of reference, reducing tactile attention. This suggests that eye-in-orbit position contributes to the spatial selection of the task-relevant body part.
Affiliation(s)
- Elena Gherri
- Department of Psychology, University of Edinburgh, Edinburgh, UK
26. Development of a simple pressure and heat stimulator for intra- and interdigit functional magnetic resonance imaging. Behav Res Methods 2013;46:396-405. PMID: 23861087. DOI: 10.3758/s13428-013-0371-9.
Abstract
For this study, we developed a simple pressure and heat stimulator that can quantitatively control pressure and deliver heat stimulation to intra- and interdigit areas. The stimulator consists of a control unit, drive units, and tactors. The control unit sets the stimulation parameters, such as stimulation type, intensity, duration, and channel, and transmits the generated stimulation signal to the drive units. The drive units operate the pressure and heat tactors in response to commands from the control unit. The pressure and heat tactors can deliver various stimulation intensities quantitatively, apply stimulation continuously, and be adjusted to different stimulation areas; they can also easily be attached to and detached from the digits. The stimulator is small, easy to install, and inexpensive to manufacture. It operated stably in a magnetic resonance imaging (MRI) environment without affecting the acquired images. A preliminary functional magnetic resonance imaging (fMRI) experiment confirmed that differences in the activation of somatosensory areas were induced by the pressure and heat stimulation. The developed pressure and heat stimulator is expected to be useful for future intra- and interdigit fMRI studies of pressure and heat stimulation.
27. Cowie D, Makin TR, Bremner AJ. Children’s Responses to the Rubber-Hand Illusion Reveal Dissociable Pathways in Body Representation. Psychol Sci 2013;24:762-769. DOI: 10.1177/0956797612462902.
Abstract
The bodily self is constructed from multisensory information. However, little is known of the relation between multisensory development and the emerging sense of self. We investigated this question by measuring the strength of the rubber-hand illusion in young children (4 to 9 years old) and adults. Intermanual pointing showed that children were as sensitive as adults to visual-tactile synchrony cues for hand position, which indicates that a visual-tactile pathway to the bodily self matures by at least 4 years of age. However, regardless of synchrony cues, children’s perceived hand position was closer to the rubber hand than adults’ perceived hand position was. This indicates a second, later-maturing process based on visual-proprioceptive information. Furthermore, explicit feelings of embodiment were related only to the visual-tactile process. These findings demonstrate two dissociable processes underlying body representation in early life, and they call into question current models of body representation and ownership in adulthood.
Affiliation(s)
- Dorothy Cowie
- Sensorimotor Development Research Unit, Department of Psychology, Goldsmiths, University of London
- Tamar R. Makin
- FMRIB Centre, Nuffield Department of Clinical Neurosciences, University of Oxford
- Andrew J. Bremner
- Sensorimotor Development Research Unit, Department of Psychology, Goldsmiths, University of London
28. Pritchett LM, Carnevale MJ, Harris LR. Reference frames for coding touch location depend on the task. Exp Brain Res 2012;222:437-445. PMID: 22941315. DOI: 10.1007/s00221-012-3231-4.
Abstract
The position of gaze (eye plus head position) relative to the body is known to alter the perceived locations of sensory targets, suggesting that perceptual space is at least partially coded in a gaze-centered reference frame. However, the direction of the effects reported has not been consistent. Here, we investigate the cause of a discrepancy between reported directions of head-position-related shifts in tactile localization. We demonstrate that head eccentricity can cause errors in touch localization either in the same direction as the head is turned or in the opposite direction, depending on the procedure used. When head position is held eccentric during both the presentation of a touch and the response, the shift is in the direction opposite to the head. When the head is returned to center before reporting, the shift is in the same direction as the head eccentricity. We rule out a number of possible explanations for the difference and conclude that when the head is moved between touch and response, the touch is coded in a predominantly gaze-centered reference frame, whereas when the head remains stationary a predominantly body-centered reference frame is used. The mechanism underlying these displacements in perceived location is proposed to involve an underestimated gaze signal. We propose a model demonstrating how this single neural error could cause localization errors in either direction, depending on whether the gaze or the body midline is used as a reference. This model may be useful in explaining gaze-related localization errors in other modalities.
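A schematic version of the proposed model, with notation of my own choosing: let g be the gaze (head) eccentricity, let \(\hat{g} = \alpha g\) be the brain's underestimated gaze signal (\(\alpha < 1\)), and let a touch at body location x be encoded in one frame and read out in another:

\[ x_{\mathrm{report}} = \big(x - \hat{g}_{\mathrm{enc}}\big) + \hat{g}_{\mathrm{resp}}. \]

If the underestimate enters only at encoding (\(\hat{g}_{\mathrm{enc}} = \alpha g\), \(\hat{g}_{\mathrm{resp}} = g\)), the report shifts by \(+(1-\alpha)g\), toward the head; if it enters only at response (\(\hat{g}_{\mathrm{enc}} = g\), \(\hat{g}_{\mathrm{resp}} = \alpha g\)), the report shifts by \(-(1-\alpha)g\), away from the head. A single underestimated signal can therefore produce shifts in either direction, depending on whether the gaze or the body midline anchors each stage, which is the logic of the model described above.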
Collapse
Affiliation(s)
- Lisa M Pritchett
- Centre for Vision Research, York University, Toronto, ON, Canada.
| |
Collapse
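The model proposed above derives both directions of mislocalization from a single underestimated gaze signal. A minimal Python sketch of that idea follows, assuming an internal gain k < 1 on the gaze signal, a body-centered report whose midline reference absorbs the gaze error when the head stays eccentric, and a gaze-centered store-and-restore with a fully registered return movement when the head is recentered; the gain and geometry values are illustrative, not the authors' parameters.

```python
# Toy illustration of how one underestimated gaze signal (gain K < 1)
# can shift perceived touch location in opposite directions depending
# on the reporting procedure. All values are assumptions for
# illustration; the paper does not publish an implementation.

K = 0.8          # assumed internal gain on the gaze signal (underestimation)
G = 30.0         # head/gaze eccentricity during the touch, in degrees
B_TRUE = 0.0     # true touch azimuth in body coordinates, in degrees

gaze_error = (1.0 - K) * G   # unregistered part of the gaze eccentricity

# Procedure 1: head held eccentric at both touch and response.
# The report is body-centered; the perceived body midline is dragged
# toward gaze by the unregistered error, so the touch appears shifted
# OPPOSITE to the head.
perceived_eccentric = B_TRUE - gaze_error

# Procedure 2: head returned to center before the response.
# The touch is stored relative to the underestimated gaze (B - K*G),
# and the return movement is assumed to be registered in full, so
# restoring the location leaves a residual shift in the SAME direction
# as the earlier head eccentricity.
stored_gaze_centered = B_TRUE - K * G
perceived_recentered = stored_gaze_centered + G   # = B_TRUE + (1 - K) * G

print(f"head eccentric throughout: shift = {perceived_eccentric - B_TRUE:+.1f} deg")
print(f"head recentered first:     shift = {perceived_recentered - B_TRUE:+.1f} deg")
# -> -6.0 deg vs +6.0 deg: equal magnitude, opposite sign, one error.
```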
|
29
|
Kim HS, Choi MH, Yeon HW, Jun JH, Yi JH, Park JR, Lim DW, Chung SC. A new tactile stimulator using a planar coil type actuator. SENSORS AND ACTUATORS A: PHYSICAL 2012; 178:209-216. [DOI: 10.1016/j.sna.2012.02.044] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 09/14/2024]
|
30
|
Kim HS, Yeon HW, Choi MH, Kim JH, Choi JS, Park JY, Jun JH, Yi JH, Tack GR, Chung SC. Development of a tactile stimulator with simultaneous visual and auditory stimulation using E-Prime software. Comput Methods Biomech Biomed Engin 2011; 16:481-7. [PMID: 22149159 DOI: 10.1080/10255842.2011.625018] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/14/2022]
Abstract
In this study, a tactile stimulator was developed that can deliver visual and auditory stimulation simultaneously with tactile stimulation by using the E-Prime software. The study addresses the stimulation-control and other systematic problems of previously developed tactile stimulators. The new system consists of three units: a control unit, a drive unit and a vibrator. It is small, lightweight and simple in structure, has low electrical consumption, provides a maximum of 35 stimulation channels, and supports various visual and auditory stimulation combinations without delay, thereby correcting the systematic problems of earlier devices. The system was designed to stimulate any part of the body, including the fingers. Because the stimulator is driven by E-Prime, which is widely used in visual and auditory research, it is expected to be highly practical, supporting diverse stimulus combinations such as tactile-visual, tactile-auditory, visual-auditory and tactile-visual-auditory stimulation.
Collapse
Affiliation(s)
- Hyung-Sik Kim
- Department of Biomedical Engineering, College of Biomedical and Health Science, Research Institute of Biomedical Engineering, Konkuk University, 322 Danwol-dong, Chungju-si, Chungcheongbuk-do 380-701, South Korea
| |
Collapse
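The abstract above specifies the architecture (control unit, drive unit, vibrator; up to 35 channels triggered alongside E-Prime visual and auditory events) but not the wire protocol. The sketch below is therefore hypothetical throughout: the serial link, the 4-byte command frame, and all constants are invented for illustration, and only the pyserial calls are real API.

```python
# Hypothetical driver for a multi-channel vibrotactile stimulator of
# the kind described above. The command format (start byte, channel,
# amplitude, coarse duration) is invented; the actual device protocol
# is not described in the abstract. Requires pyserial.
import serial

N_CHANNELS = 35        # the paper reports a maximum of 35 channels
START_BYTE = 0xAA      # assumed frame delimiter (illustrative)

def trigger_vibrator(port: serial.Serial, channel: int,
                     amplitude: int, duration_ms: int) -> None:
    """Send one (assumed) stimulation frame to the drive unit."""
    if not 0 <= channel < N_CHANNELS:
        raise ValueError(f"channel must be 0..{N_CHANNELS - 1}")
    if not 0 <= amplitude <= 255:
        raise ValueError("amplitude must fit in one byte")
    # Duration is encoded in 10 ms steps, capped at one byte (assumed).
    frame = bytes([START_BYTE, channel, amplitude,
                   min(duration_ms // 10, 255)])
    port.write(frame)

if __name__ == "__main__":
    # Port name and baud rate are placeholders for whatever the drive
    # unit actually uses.
    with serial.Serial("COM3", 115200, timeout=1) as dev:
        trigger_vibrator(dev, channel=0, amplitude=128, duration_ms=200)
```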
|
31
|
Pritchett LM, Harris LR. Perceived touch location is coded using a gaze signal. Exp Brain Res 2011; 213:229-34. [PMID: 21559744 DOI: 10.1007/s00221-011-2713-0] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/12/2010] [Accepted: 04/26/2011] [Indexed: 10/18/2022]
Abstract
The location of a touch to the skin, first coded in body coordinates, may be transformed into retinotopic coordinates to facilitate visual-tactile integration. In order for the touch location to be transformed into a retinotopic reference frame, the location of the eyes and head must be taken into account. Previous studies have found eye position-related errors (Harrar and Harris in Exp Brain Res 203:615-620, 2010) and head position-related errors (Ho and Spence in Brain Res 1144:136-141, 2007) in tactile localization, indicating that imperfect versions of eye and head signals may be used in the body-to-visual coordinate transformation. Here, we investigated the combined effects of head and eye position on the perceived location of a mechanical touch to the arm. Subjects reported the perceived position of a touch that was presented while their head was positioned to the left, right, or center of the body and their eyes were positioned to the left, right, or center in their orbits. The perceived location of a touch shifted in the direction of both the head and the eyes by approximately the same amount. We interpret these shifts as being consistent with touch location being coded in a visual reference frame with a gaze signal used to compute the transformation.
Collapse
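The equal shifts for head and eye eccentricity are what an additive gaze signal (gaze = eye-in-head + head-on-body) predicts if a slightly misregistered version of that combined signal drives the body-to-visual transformation. A short sketch of this prediction follows; the gain error is an assumed value for illustration.

```python
# Sketch of the additive-gaze prediction: if touch is coded in a
# visual (gaze-centered) frame using gaze = eye_in_head + head_on_body,
# and the gaze signal carries a fixed fractional error, then eye and
# head eccentricity should shift the percept by the same amount per
# degree. The gain value is an assumption, not a fitted parameter.

GAIN_ERROR = 0.05   # assumed fractional error in the registered gaze signal

def predicted_shift(eye_in_head_deg: float, head_on_body_deg: float) -> float:
    """Predicted mislocalization (deg), proportional to total gaze."""
    gaze = eye_in_head_deg + head_on_body_deg
    return GAIN_ERROR * gaze

for eye, head in [(25, 0), (0, 25), (25, 25), (-25, 0), (0, -25)]:
    print(f"eye {eye:+3d}, head {head:+3d} -> "
          f"shift {predicted_shift(eye, head):+.2f} deg")
# Eye-only and head-only eccentricities of equal size produce equal
# shifts, and they sum when combined: the signature reported above.
```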
|
32
|
Harrar V, Harris LR. Touch used to guide action is partially coded in a visual reference frame. Exp Brain Res 2010; 203:615-20. [PMID: 20428854 DOI: 10.1007/s00221-010-2252-0] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2010] [Accepted: 04/08/2010] [Indexed: 11/25/2022]
Abstract
The perceived location of touch on the skin is affected by the position of the eyes in the head, suggesting that it is at least partially coded in a visual reference frame. This observation was made by comparing the perceived location of a touch to a visual reference. Here, we ask whether the location of a touch is coded differently when it guides an action. We tested the perceived position of four touches on the arm (approximately 5 cm apart) while participants adopted one of four eccentric fixations. A touch-sensitive screen was positioned over the stimulated left arm and subjects pointed, using their right arm, to the perceived touch location. The location that subjects pointed to varied with eye position, shifting by 0.016 cm/deg in the direction of eye eccentricity. The dependence on eye position suggests that tactile coding for action is also at least partially coded in a visual reference frame, as it is for perception.
Collapse
Affiliation(s)
- Vanessa Harrar
- Centre for Vision Research, York University, Toronto, ON, M3J 1P3, Canada
| |
Collapse
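The 0.016 cm/deg figure is the slope of pointing responses regressed on fixation eccentricity. As a worked example of how such a slope is estimated, the sketch below fits a line to synthetic pointing data generated with that slope; the noise level and exact fixation angles are assumptions patterned on the design described above.

```python
# Worked example: estimating the eye-position slope (cm/deg) from
# pointing responses with an ordinary least-squares fit. The data are
# synthetic, generated with the slope reported above (0.016 cm/deg)
# plus Gaussian motor noise; only the design (4 touch sites roughly
# 5 cm apart x 4 fixations) is taken from the abstract.
import numpy as np

rng = np.random.default_rng(0)
TRUE_SLOPE = 0.016                       # cm of shift per degree of eccentricity
touch_sites_cm = np.array([0.0, 5.0, 10.0, 15.0])    # sites along the arm
fixations_deg = np.array([-20.0, -7.0, 7.0, 20.0])   # assumed eccentricities

# One pointing response per (site, fixation) cell, 20 repeats,
# 0.5 cm of motor noise (assumed).
sites, eyes = np.meshgrid(touch_sites_cm, fixations_deg)
sites = np.tile(sites.ravel(), 20)
eyes = np.tile(eyes.ravel(), 20)
responses = sites + TRUE_SLOPE * eyes + rng.normal(0.0, 0.5, sites.size)

# Regress the pointing error on eye position to recover the slope.
errors = responses - sites
slope, intercept = np.polyfit(eyes, errors, 1)
print(f"fitted slope: {slope:.4f} cm/deg (generating value {TRUE_SLOPE})")
```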
|
33
|
|
34
|
Medina J, Coslett HB. From maps to form to space: touch and the body schema. Neuropsychologia 2009; 48:645-54. [PMID: 19699214 DOI: 10.1016/j.neuropsychologia.2009.08.017] [Citation(s) in RCA: 129] [Impact Index Per Article: 8.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/21/2009] [Revised: 07/14/2009] [Accepted: 08/16/2009] [Indexed: 11/24/2022]
Abstract
Evidence from patients has shown that primary somatosensory representations are plastic, dynamically changing in response to central or peripheral alterations, as well as experience. Furthermore, recent research has also demonstrated that altering body posture results in changes in the perceived sensation and localization of tactile stimuli. Using evidence from behavioral studies with brain-damaged and healthy subjects, as well as functional imaging, we propose that the traditional concept of the body schema should be divided into three components. First are primary somatosensory representations, which are representations of the skin surface that are typically somatotopically organized, and have been shown to change dynamically due to peripheral (usage, amputation, deafferentation) or central (lesion) modifications. Second, we argue for a mapping from a primary somatosensory representation to a secondary representation of body size and shape (body form representation). Finally, we review evidence for a third set of representations that encodes limb position and is used to represent the location of tactile stimuli relative to the subject using external, non-somatotopic reference frames (postural representations).
Collapse
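The three proposed components amount to a processing pipeline: a somatotopic map identifies the skin site, a body form representation gives that site metric coordinates on the limb, and a postural representation places the limb point in external space. The sketch below renders the pipeline as three explicit mappings; the coordinates, limb dimensions, and names are illustrative assumptions, not the paper's model.

```python
# Illustrative pipeline for the three proposed components of the body
# schema: (1) a somatotopic map from receptor to skin site, (2) a body
# form representation assigning that site a metric position on the
# limb, and (3) a postural representation placing the limb point in
# external space. All numbers and names are invented.
import math

# 1. Primary somatotopic representation: receptor id -> named skin site.
SOMATOTOPIC_MAP = {101: "forearm_quarter", 102: "forearm_mid", 103: "forearm_wrist"}

# 2. Body form: named site -> distance (m) from the elbow along the forearm.
BODY_FORM = {"forearm_quarter": 0.07, "forearm_mid": 0.13, "forearm_wrist": 0.25}

def external_location(receptor_id: int, elbow_xy: tuple[float, float],
                      forearm_angle_deg: float) -> tuple[float, float]:
    """3. Postural representation: combine the skin site's position on
    the limb with the current limb posture to get external coordinates."""
    site = SOMATOTOPIC_MAP[receptor_id]
    d = BODY_FORM[site]
    a = math.radians(forearm_angle_deg)
    return (round(elbow_xy[0] + d * math.cos(a), 3),
            round(elbow_xy[1] + d * math.sin(a), 3))

# The same receptor maps to two external locations under two postures,
# which is exactly the job of the third (postural) representation.
print(external_location(102, elbow_xy=(0.0, 0.0), forearm_angle_deg=0.0))
print(external_location(102, elbow_xy=(0.0, 0.0), forearm_angle_deg=90.0))
```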
|
35
|
Harrar V, Harris LR. Eye position affects the perceived location of touch. Exp Brain Res 2009; 198:403-10. [PMID: 19533110 DOI: 10.1007/s00221-009-1884-4] [Citation(s) in RCA: 33] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/01/2008] [Accepted: 05/27/2009] [Indexed: 12/23/2022]
Abstract
Here, we demonstrate a systematic shift in the perceived location of a tactile stimulus on the arm toward where the eye is looking. Participants reported the perceived position of touches presented between the elbow and the wrist while maintaining eye positions at various eccentricities. The perceived location of the touch was shifted by between 1 and 5 cm (1.9-9.5 degrees of visual angle) by a change in eye position of +/-25 degrees from straight ahead. In a control condition, we repeated the protocol with the eyes fixating straight ahead. Changes in attention accounted for only 17% of the shift due to eye position. The pattern of tactile shifts due to eye position was comparable whether or not the arm was visible. However, touches at locations along the forearm were perceived as being farther apart when the arm was visible compared to when it was covered. These results are discussed in terms of the coding of tactile space, which seems to require integration of tactile, visual and eye position information.
Collapse
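Pairing the centimetre shifts with the quoted visual angles (1 cm with 1.9 degrees, 5 cm with 9.5 degrees) implies a viewing distance of roughly 30 cm, since angle = atan(shift / distance). The check below makes that arithmetic explicit; the 30 cm distance is inferred from the reported numbers rather than stated in the abstract.

```python
# Back-of-envelope check: converting a tactile shift on the arm (cm)
# into visual angle (deg) with angle = atan(shift / viewing_distance).
# A viewing distance of ~30 cm reproduces the pairing quoted above;
# that distance is inferred from the numbers, not reported.
import math

VIEWING_DISTANCE_CM = 30.0   # assumed arm-to-eye distance

def shift_to_visual_angle(shift_cm: float) -> float:
    """Visual angle (deg) subtended by a shift at the assumed distance."""
    return math.degrees(math.atan(shift_cm / VIEWING_DISTANCE_CM))

for shift in (1.0, 5.0):
    print(f"{shift:.0f} cm -> {shift_to_visual_angle(shift):.1f} deg")
# 1 cm -> 1.9 deg, 5 cm -> 9.5 deg, matching the reported range.
```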
|
36
|
Ho C, Santangelo V, Spence C. Multisensory warning signals: when spatial correspondence matters. Exp Brain Res 2009; 195:261-72. [PMID: 19381621 DOI: 10.1007/s00221-009-1778-5] [Citation(s) in RCA: 41] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/15/2009] [Accepted: 03/17/2009] [Indexed: 10/20/2022]
|
37
|
Spence C, Ho C. Multisensory warning signals for event perception and safe driving. THEORETICAL ISSUES IN ERGONOMICS SCIENCE 2008. [DOI: 10.1080/14639220701816765] [Citation(s) in RCA: 42] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/21/2022]
|