1. Jelonek W, Malik J, Łochyński D. Effects of attentional focus on spatial localization of distal body parts and touch in two-arm position matching. Exp Brain Res 2024;243:27. PMID: 39699636. DOI: 10.1007/s00221-024-06976-8.
Abstract
This study investigated how the judgment of proximal joint position can be affected by touch alone, focused attention on the distal body part, or spatial localization of touch. Participants completed a two-arm elbow joint position-matching task, in which they indicated the location of one forearm by the placement of the other. In four test conditions, matching was performed during (1) detection of touch (tactile stimulation of index finger pads), (2) spatial localization of fingers (attention focused on the position of index finger pads), (3) spatial localization of touch on fingers (attention focused on tactile stimulation of index finger pads), and (4) detection of touch but localization of fingers (tactile stimulation of index finger pads, but attention focused on the spatial position of the pads). In the first experiment (n = 23), the sensitivity of muscle spindle receptors in both reference and indicator arms was reduced and equalized by both-slack conditioning. In the second experiment (n = 20), the illusion of excessive elbow flexion in the reference arm and excessive extension in the indicator arm was generated through extension-flexion conditioning. In the first experiment, the accuracy and precision of matching were unaffected in all test conditions. In the second experiment, participants made amplified undershooting errors under attention-focused conditions. In conclusion, focused attention on the location of a distal body part and on touch affects both the spatial localization of the limb and tactile remapping only when the perceived forearm position is misinterpreted due to imbalanced proprioceptive input from antagonistic arm muscles.
Affiliation(s)
- Wojciech Jelonek
- Department of Neuromuscular Physiotherapy, Poznan University of Physical Education, Królowej Jadwigi 27/39, Poznan, 61-871, Poland
- Jakub Malik
- Department of Pedagogy, Poznan University of Physical Education, Królowej Jadwigi 27/39, Poznan, 61-871, Poland
- Dawid Łochyński
- Department of Neuromuscular Physiotherapy, Poznan University of Physical Education, Królowej Jadwigi 27/39, Poznan, 61-871, Poland
2. Peviani VC, Joosten MGA, Miller LE, Medendorp WP. Bayesian inference in arm posture perception. J Neurophysiol 2024;132:1639-1649. PMID: 39412564. DOI: 10.1152/jn.00297.2024.
Abstract
To configure our limbs in space, the brain must compute their position based on sensory information provided by mechanoreceptors in the skin, muscles, and joints. Because this information is corrupted by noise, the brain is thought to process it probabilistically and integrate it with prior beliefs about arm posture, following Bayes' rule. Here, we combined computational modeling with behavioral experimentation to test this hypothesis. The model conceives the perception of arm posture as the combination of a probabilistic kinematic chain, composed of the shoulder, elbow, and wrist angles corrupted by additive Gaussian noise, with a Gaussian prior over these joint angles. We tested whether this model explains errors in a virtual reality (VR)-based posture-matching task better than a model that assumes a uniform prior. Human participants (N = 20) were required to align their unseen right arm to a target posture, presented as a visual configuration of the arm in the horizontal plane. Results show idiosyncratic biases in how participants matched their unseen arm to the target posture. We used maximum likelihood estimation to fit the Bayesian model to these observations and estimate key parameters, including the prior mean and its variance-covariance structure. The Bayesian model with a Gaussian prior explained the response biases and variance much better than a model with a uniform prior. The prior varied across participants, consistent with the idiosyncrasies in arm posture perception and in alignment with previous behavioral research. Our work clarifies the biases in arm posture perception and offers a new perspective on the nature of proprioceptive computations.

NEW & NOTEWORTHY: We modeled the perception of arm posture as a Bayesian computation. A VR posture-matching task was used to empirically test this Bayesian model. The Bayesian model, including a nonuniform postural prior, explained individual participants' biases in arm posture matching well.
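The Gaussian-likelihood-plus-Gaussian-prior computation described in this abstract can be sketched in a few lines for a single joint angle; the function name and all numbers below are illustrative choices, not values from the paper:

```python
import math

def posterior_angle(sensed, sigma_sense, prior_mean, sigma_prior):
    """Precision-weighted fusion of a noisy sensed joint angle (in degrees)
    with a Gaussian prior, i.e. Bayes' rule for two Gaussians."""
    w_sense = 1.0 / sigma_sense ** 2   # precision of the sensory likelihood
    w_prior = 1.0 / sigma_prior ** 2   # precision of the prior
    mean = (w_sense * sensed + w_prior * prior_mean) / (w_sense + w_prior)
    sd = math.sqrt(1.0 / (w_sense + w_prior))
    return mean, sd

# A sensed elbow angle of 90 deg with a prior centred on 70 deg:
# the percept is biased toward the prior, more so when sensory noise is high.
est_precise, _ = posterior_angle(90.0, sigma_sense=2.0, prior_mean=70.0, sigma_prior=10.0)
est_noisy, _ = posterior_angle(90.0, sigma_sense=8.0, prior_mean=70.0, sigma_prior=10.0)
```

The pull toward the prior mean grows with sensory noise, which is the signature of the posture-matching biases the study fits per participant.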
Affiliation(s)
- Valeria C Peviani
- Donders Center for Cognition, Radboud University, Nijmegen, The Netherlands
- Manon G A Joosten
- Donders Center for Cognition, Radboud University, Nijmegen, The Netherlands
- Luke E Miller
- Donders Center for Cognition, Radboud University, Nijmegen, The Netherlands
- W Pieter Medendorp
- Donders Center for Cognition, Radboud University, Nijmegen, The Netherlands
3. Tanner J, Orthlieb G, Helms Tillery S. Effect of touch on proprioception: non-invasive trigeminal nerve stimulation suggests general arousal rather than tactile-proprioceptive integration. Front Hum Neurosci 2024;18:1429843. PMID: 39469503. PMCID: PMC11513270. DOI: 10.3389/fnhum.2024.1429843.
Abstract
Introduction: Proprioceptive error in estimating fingertip position in two-dimensional space is reduced by tactile stimulation applied at the fingertip. Tactile input does not disrupt participants' estimation strategy, as the individual error-vector maps maintain their overall structure. This relationship suggests that integrating proprioceptive and tactile information improves proprioceptive estimation, which can also be improved with trained mental focus and attention. Task attention and arousal are physiologically regulated by the reticular activating system (RAS), a brainstem circuit that includes the locus coeruleus (LC). There is direct and indirect evidence that these structures can be modulated by non-invasive trigeminal nerve stimulation (nTNS), providing an opportunity to examine the effect of nTNS on the integration of proprioceptive and tactile information.
Methods: Fifteen right-handed participants performed a simple right-handed proprioceptive estimation task with tactile (touch) and no-tactile (hover) feedback. Participants repeated the task after nTNS administration. Stimulation was delivered for 10 min at 3,000 Hz, with a 50 μs pulse width and a mean amplitude of 7 mA. Error maps across the workspace were generated using polynomial models of the participants' target responses.
Results: Error maps did not show significant changes in vector direction between conditions for any participant, indicating that nTNS does not disrupt spatial proprioceptive estimation strategies. A linear mixed-model regression with nTNS epoch, tactile condition, and their interaction as factors showed that nTNS reduced proprioceptive error under the hover condition only.
Discussion: We argue that nTNS does not disrupt spatial proprioceptive error maps but can improve proprioceptive estimation in the absence of tactile feedback. However, we observe no evidence that nTNS enhances tactile-proprioceptive integration, as the touch condition did not show significantly reduced error after nTNS.
Affiliation(s)
- Justin Tanner
- School of Biological and Health Systems Engineering, Arizona State University, Tempe, AZ, United States
4. Wang T, Morehead RJ, Tsay JS, Ivry RB. The origin of movement biases during reaching. bioRxiv [Preprint] 2024. PMID: 38562840. PMCID: PMC10983854. DOI: 10.1101/2024.03.15.585272.
Abstract
Goal-directed movements can fail due to errors in our perceptual and motor systems. While these errors may arise from random noise within these sources, they also reflect systematic motor biases that vary with the location of the target. The origin of these systematic biases remains controversial. Drawing on data from an extensive array of reaching tasks conducted over the past 30 years, we evaluated the merits of various computational models regarding the origin of motor biases. Contrary to previous theories, we show that motor biases do not arise from systematic errors associated with the sensed hand position during motor planning, or from the biomechanical constraints imposed during motor execution. Rather, motor biases are primarily caused by a misalignment between eye-centric and body-centric representations of position. This model can account for motor biases across a wide range of contexts, encompassing movements with the right versus left hand, proximal and distal effectors, visible and occluded starting positions, as well as before and after sensorimotor adaptation.
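The eye-centric versus body-centric account in this abstract can be illustrated with a minimal sketch: a target coded in one frame is read out in a second frame rotated by a small misalignment angle, producing a systematic error vector that rotates with target direction. The 2° misalignment and the function name are our assumptions, not fitted values from the paper:

```python
import math

def reach_bias(target_angle_deg, target_dist, rotation_deg=2.0):
    """Endpoint bias predicted by a small angular misalignment between the
    frame where the target is coded and the frame where the movement is
    planned. `rotation_deg` is a free parameter of this toy model."""
    a = math.radians(target_angle_deg)
    r = math.radians(rotation_deg)
    # target position in the first (e.g. eye-centred) frame
    tx, ty = target_dist * math.cos(a), target_dist * math.sin(a)
    # the same point read out in a frame rotated by r
    px = math.cos(r) * tx - math.sin(r) * ty
    py = math.sin(r) * tx + math.cos(r) * ty
    return px - tx, py - ty  # systematic error vector at this target

# The bias direction rotates with target location, as in radial reaching data;
# its magnitude is constant for targets at a fixed distance.
errs = [reach_bias(theta, 10.0) for theta in (0, 90, 180, 270)]
```

Note that this toy misalignment predicts equal-magnitude biases whose direction varies smoothly around the workspace, the qualitative pattern the paper's frame-misalignment model is meant to capture.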
Affiliation(s)
- Tianhe Wang
- Department of Psychology, University of California, Berkeley
- Department of Neuroscience, University of California, Berkeley
- Richard B. Ivry
- Department of Psychology, University of California, Berkeley
- Department of Neuroscience, University of California, Berkeley
5. Héroux ME, Fisher G, Axelson LH, Butler AA, Gandevia SC. How we perceive the width of grasped objects: insights into the central processes that govern proprioceptive judgements. J Physiol 2024;602:2899-2916. PMID: 38734987. DOI: 10.1113/jp286322.
Abstract
Low-level proprioceptive judgements involve a single frame of reference, whereas high-level proprioceptive judgements are made across different frames of reference. The present study systematically compared low-level (grasp→grasp) and high-level (vision→grasp, grasp→vision) proprioceptive tasks, and quantified the consistency of grasp→vision judgements and the possibly reciprocal nature of related high-level proprioceptive tasks. Experiment 1 (n = 30) compared performance across vision→grasp, grasp→vision and grasp→grasp tasks. Experiment 2 (n = 30) compared performance on the grasp→vision task between hands and over time. Participants were accurate (mean absolute error 0.27 cm [0.20 to 0.34]; mean [95% CI]) and precise (R² = 0.95 [0.93 to 0.96]) for grasp→grasp judgements, with a strong correlation between outcomes (r = -0.85 [-0.93 to -0.70]). Accuracy and precision decreased in the two high-level tasks (R² = 0.86 and 0.89; mean absolute error = 1.34 and 1.41 cm), with most participants overestimating perceived width in the vision→grasp task and underestimating it in the grasp→vision task. There was minimal correlation between accuracy and precision for these two tasks. Converging evidence indicated that performance was largely reciprocal (inverse) between the vision→grasp and grasp→vision tasks. Performance on the grasp→vision task was consistent between dominant and non-dominant hands, and across repeated sessions a day or week apart. Overall, there are fundamental differences between low- and high-level proprioceptive judgements that reflect differences in the cortical processes underpinning these perceptions. Moreover, the central transformations that govern high-level proprioceptive judgements of grasp are personalised, stable and reciprocal for reciprocal tasks.

KEY POINTS: Low-level proprioceptive judgements involve a single frame of reference (e.g. indicating the width of a grasped object by selecting from a series of objects of different width), whereas high-level proprioceptive judgements are made across different frames of reference (e.g. indicating the width of a grasped object by selecting from a series of visible lines of different length). We highlight fundamental differences in the precision and accuracy of low- and high-level proprioceptive judgements. We provide converging evidence that the neural transformations between frames of reference that govern high-level proprioceptive judgements of grasp are personalised, stable and reciprocal for reciprocal tasks. This stability is likely key to precise judgements and accurate predictions in high-level proprioception.
Affiliation(s)
- Martin E Héroux
- Neuroscience Research Australia, Randwick, Australia
- University of New South Wales, Sydney, Australia
- Georgia Fisher
- Neuroscience Research Australia, Randwick, Australia
- Australian Institute of Health Innovation, Macquarie University, Macquarie Park, Australia
- Annie A Butler
- Neuroscience Research Australia, Randwick, Australia
- University of New South Wales, Sydney, Australia
- Simon C Gandevia
- Neuroscience Research Australia, Randwick, Australia
- University of New South Wales, Sydney, Australia
6. Bertoni T, Mastria G, Akulenko N, Perrin H, Zbinden B, Bassolino M, Serino A. The self and the Bayesian brain: testing probabilistic models of body ownership through a self-localization task. Cortex 2023;167:247-272. PMID: 37586137. DOI: 10.1016/j.cortex.2023.06.019.
Abstract
Simple multisensory manipulations can induce the illusory misattribution of external objects to one's own body, allowing body ownership to be investigated experimentally. In this context, body ownership has been conceptualized as the result of online Bayesian optimal estimation of the probability that an object belongs to the body, given the congruence of multisensory inputs. This idea has been highly influential, as it provides a quantitative basis for bottom-up accounts of self-consciousness. However, empirical evidence fully supporting this view is scarce, as the optimality of the putative inference process has not been assessed rigorously. This pre-registered study aimed to fill this gap by testing a Bayesian model of hand ownership based on spatial and temporal visuo-proprioceptive congruence. Model predictions were compared to data from a virtual-reality reaching task, in which reaching errors induced by a spatio-temporally mismatching virtual hand were used as an implicit proxy of hand ownership. To rigorously test optimality, we compared the Bayesian model with alternative non-Bayesian models of multisensory integration, and independently assessed unisensory components, comparing them to model estimates. We found that individually measured values of proprioceptive precision correlated with those fitted from our reaching task, providing compelling evidence that the underlying visuo-proprioceptive integration process approximates Bayesian optimality. Furthermore, reaching errors correlated with explicit ownership ratings at the single-individual and single-trial level. Taken together, these results provide novel evidence that body ownership, a key component of self-consciousness, can be described as bottom-up, behaviourally optimal processing of multisensory inputs.
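A core prediction of the optimal-integration account tested here is that reaching drift toward a displaced virtual hand scales with the visual cue's relative precision, so participants with poorer proprioception drift more. A minimal sketch (names and numbers are illustrative, not the authors' parameters):

```python
def predicted_drift(visual_offset_cm, sigma_vis, sigma_prop):
    """Reaching drift toward a displaced virtual hand under optimal
    (precision-weighted) visuo-proprioceptive fusion. The fraction of the
    visual offset that leaks into the reach equals the visual cue's
    relative precision."""
    w_vis = (1 / sigma_vis ** 2) / (1 / sigma_vis ** 2 + 1 / sigma_prop ** 2)
    return w_vis * visual_offset_cm

# The same 3-cm visual mismatch drags the reach further for an observer
# with poorer proprioceptive precision (larger sigma_prop).
drift_sharp = predicted_drift(3.0, sigma_vis=0.5, sigma_prop=1.0)   # good proprioception
drift_blurry = predicted_drift(3.0, sigma_vis=0.5, sigma_prop=3.0)  # poor proprioception
```

This monotonic link between independently measured proprioceptive precision and fitted drift is the kind of correspondence the study reports as evidence for near-optimal integration.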
Affiliation(s)
- Tommaso Bertoni
- MySpace Lab, Department of Clinical Neurosciences, University Hospital of Lausanne, University of Lausanne, Lausanne, Switzerland
- Giulio Mastria
- MySpace Lab, Department of Clinical Neurosciences, University Hospital of Lausanne, University of Lausanne, Lausanne, Switzerland
- Nikita Akulenko
- MySpace Lab, Department of Clinical Neurosciences, University Hospital of Lausanne, University of Lausanne, Lausanne, Switzerland
- Henri Perrin
- School of Medicine, Faculty of Biology and Medicine, University of Lausanne, Lausanne, Switzerland
- Boris Zbinden
- MySpace Lab, Department of Clinical Neurosciences, University Hospital of Lausanne, University of Lausanne, Lausanne, Switzerland
- Andrea Serino
- MySpace Lab, Department of Clinical Neurosciences, University Hospital of Lausanne, University of Lausanne, Lausanne, Switzerland
7. Oh K, Prilutsky BI. Transformation from arm joint coordinates to hand external coordinates explains non-uniform precision of hand position sense in horizontal workspace. Hum Mov Sci 2022;86:103020. DOI: 10.1016/j.humov.2022.103020.
8. Oh J, Mahnan A, Xu J, Block HJ, Konczak J. Typical development of finger position sense from late childhood to adolescence. J Mot Behav 2022;55:102-110. PMID: 36257920. DOI: 10.1080/00222895.2022.2134287.
Abstract
Finger position sense is a proprioceptive modality highly important for fine motor control. Its developmental time course is largely unknown. This cross-sectional study examined its typical development in 138 children (8-17 years) and a group of 14 healthy young adults, using a fast and novel psychophysical test that yielded objective measures of position sense acuity. Participants placed their hands underneath a computer tablet and judged the perceived position of their unseen index finger relative to two visible areas displayed on the tablet, following a two-alternative forced-choice paradigm. Responses were fitted to a psychometric acuity function from which the difference between the point-of-subjective-equality and the veridical finger position (ΔPSE) was derived as a measure of position sense bias, and the uncertainty area (UA) as a measure of precision. The main results are: First, children under 12 exhibited a significantly greater UA than adults, while adolescent children (13-17 years) exhibited no significant differences when compared to adults. Second, no significant age-related differences in ΔPSE were found across the age range of 8-17 years. This implies that the typical development of finger position sense from late childhood to adulthood is characterized by an age-dependent increase in proprioceptive precision rather than a decrease in bias.
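The psychometric analysis described above can be sketched by fitting a cumulative Gaussian to two-alternative forced-choice responses and reading off the PSE (bias) and an uncertainty-area-style precision measure. The grid-search fit and all numbers below are our simplifications, not the authors' exact procedure:

```python
import math

def cum_gauss(x, mu, sigma):
    """Psychometric function: probability of judging the finger 'to the right'."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def fit_pse_ua(offsets, p_right):
    """Grid-search fit of a cumulative Gaussian (a stand-in for a proper
    maximum-likelihood fit). Returns the PSE (mu) and an uncertainty area,
    here taken as the width of the 25-75% interval."""
    best = None
    for mu in [m / 10 for m in range(-30, 31)]:
        for sigma in [s / 10 for s in range(1, 51)]:
            err = sum((cum_gauss(x, mu, sigma) - p) ** 2
                      for x, p in zip(offsets, p_right))
            if best is None or err < best[0]:
                best = (err, mu, sigma)
    _, mu, sigma = best
    ua = 2 * 0.6745 * sigma  # distance between the 25% and 75% points
    return mu, ua

# Synthetic observer whose subjective midline is shifted 0.5 cm (the bias, ΔPSE)
offsets = [-3, -2, -1, 0, 1, 2, 3]
p_right = [cum_gauss(x, 0.5, 1.2) for x in offsets]
pse, ua = fit_pse_ua(offsets, p_right)
```

On this noiseless synthetic data the fit recovers the generating bias and spread exactly; with real binary responses one would fit by maximum likelihood, as the paper does.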
Affiliation(s)
- Jinseok Oh
- Human Sensorimotor Control Laboratory, School of Kinesiology, University of Minnesota, Minneapolis, MN, USA
- Arash Mahnan
- Human Sensorimotor Control Laboratory, School of Kinesiology, University of Minnesota, Minneapolis, MN, USA
- Reality Labs Health and Safety UXR, Meta, Redmond, WA, USA
- Jiapeng Xu
- Human Sensorimotor Control Laboratory, School of Kinesiology, University of Minnesota, Minneapolis, MN, USA
- Hannah J Block
- Sensorimotor Neurophysiology Laboratory, School of Public Health, Indiana University Bloomington, IN, USA
- Jürgen Konczak
- Human Sensorimotor Control Laboratory, School of Kinesiology, University of Minnesota, Minneapolis, MN, USA
- Center for Clinical Movement Science, University of Minnesota, Minneapolis, MN, USA
9. Salas MA, Bell J, Niketeghad S, Oswalt D, Bosking W, Patel U, Dorn JD, Yoshor D, Greenberg R, Bari A, Pouratian N. Sequence of visual cortex stimulation affects phosphene brightness in blind subjects. Brain Stimul 2022;15:605-614. DOI: 10.1016/j.brs.2022.03.008.
10. Tanner J, Orthlieb G, Shumate D, Helms Tillery S. Effect of tactile sensory substitution on the proprioceptive error map of the arm. Front Neurosci 2021;15:586740. PMID: 34305509. PMCID: PMC8292232. DOI: 10.3389/fnins.2021.586740.
Abstract
Proprioceptive error of estimated fingertip position in two-dimensional space is reduced by the addition of tactile stimulation to the fingertip. This tactile input does not disrupt the subjects' estimation strategy, as the individual error vector maps maintain their overall geometric structure. This relationship suggests an integration of proprioceptive and tactile sensory information that enhances proprioceptive estimation. To better understand this multisensory integration, we explored the effect of electrotactile and vibrotactile stimulation of the fingertips in place of actual contact, thus limiting interaction forces. This allowed us to discern any proprioceptive estimation improvement that arose from purely tactile stimulation. Ten right-handed and ten left-handed subjects performed a simple right-handed proprioceptive estimation task under four tactile feedback conditions: hover, touch, electrotactile, and vibrotactile. Target sets were generated for each subject, persisted across all feedback modalities, and were presented in randomized orders. Error maps across the workspace were generated using polynomial models of the subjects' responses. Error maps did not change shape between conditions for any right-handed subject, and changed for a single condition for two left-handed subjects. Non-parametric statistical analysis of the error magnitude shows that both modes of sensory substitution significantly reduce error for right-handed subjects, but not to the level of actual touch. Left-handed subjects demonstrated increased error in all feedback conditions compared to hover, and more error than right-handed subjects in every condition except hover. This is consistent with the hypothesis that the non-dominant hand is specialized for position control, while the dominant hand is specialized for velocity control. Notably, our results suggest that non-dominant-hand estimation strategies are hindered by stimuli to the fingertip. We conclude that electrotactile and vibrotactile sensory substitution succeed in multisensory integration only when applied to the dominant hand. These feedback modalities do not disrupt established dominant-hand proprioceptive error maps, and existing strategies adapt to the novel input and minimize error. Since actual touch provides the best error reduction, sensory substitution lacks some unidentified beneficial information, such as familiarity or natural sensation. This missing component could also be what confounds subjects using their non-dominant hand for positional tasks.
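The polynomial error-map idea used in this and the related studies above can be sketched as follows; the degree-2 polynomial, the toy data, and the helper names are our assumptions rather than the study's exact model:

```python
import numpy as np

def fit_error_map(targets, responses, degree=2):
    """Fit one 2-D polynomial per error component (x and y), in the spirit
    of polynomial error-vector maps; `degree` is our choice."""
    t = np.asarray(targets, dtype=float)
    e = np.asarray(responses, dtype=float) - t      # error vectors
    # design matrix with all monomials x^i * y^j, i + j <= degree
    cols = [t[:, 0] ** i * t[:, 1] ** j
            for i in range(degree + 1) for j in range(degree + 1 - i)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, e, rcond=None)

    def predict(x, y):
        phi = np.array([x ** i * y ** j for i in range(degree + 1)
                        for j in range(degree + 1 - i)])
        return phi @ coef   # predicted (ex, ey) error at this location

    return predict

# Toy data: responses undershoot by 10% plus a constant 1-cm rightward shift.
rng = np.random.default_rng(0)
targets = rng.uniform(-20, 20, size=(40, 2))
responses = 0.9 * targets + np.array([1.0, 0.0])
predict = fit_error_map(targets, responses)
```

Comparing such fitted maps across feedback conditions (e.g. by the direction of the predicted error vectors) is one way to ask whether a condition changed the map's shape rather than just its magnitude.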
Affiliation(s)
- Justin Tanner
- School of Biological and Health Systems Engineering, Arizona State University, Tempe, AZ, United States
- Gerrit Orthlieb
- Stanford School of Medicine, Stanford University, Stanford, CA, United States
- David Shumate
- School of Biological and Health Systems Engineering, Arizona State University, Tempe, AZ, United States
- Stephen Helms Tillery
- School of Biological and Health Systems Engineering, Arizona State University, Tempe, AZ, United States
11. Three-dimensional assessment of upper limb proprioception via a wearable exoskeleton. Appl Sci (Basel) 2021. DOI: 10.3390/app11062615.
Abstract
Proprioception—the sense of body segments' position and movement—plays a crucial role in human motor control, integrating the sensory information necessary for the correct execution of daily life activities. Although scientific evidence recognizes that several neurological diseases hamper proprioceptive encoding, with a consequent inability to correctly perform movements, proprioceptive assessment in clinical settings is still limited to standard scales. The literature on the physiology of upper limb proprioception is mainly focused on experimental approaches involving planar setups, while the present work provides a novel paradigm for assessing proprioception during single- and multi-joint matching tasks in a three-dimensional workspace. To this end, a six-degrees-of-freedom exoskeleton, ALEx-RS (Arm Light Exoskeleton Rehab Station), was used to evaluate 18 healthy subjects' abilities in matching proprioceptive targets during combined single- and multi-joint arm movements: shoulder abduction/adduction, shoulder flexion/extension, and elbow flexion/extension. Results provided evidence that proprioceptive abilities depend on the number of joints simultaneously involved in the task and on their anatomical location, since muscle spindles work along their preferred direction, modulating the streaming of sensory information accordingly. These findings suggest solutions for clinical sensorimotor evaluation after neurological disease, where assessing proprioceptive deficits can improve the recovery path and complement rehabilitation outcomes.
12. Tanner J, Newman N, Helms Tillery S. Anisotropic psychophysical trends in the discrimination of tactile direction in a precision grip. Front Neurosci 2021;14:576020. PMID: 33510606. PMCID: PMC7835715. DOI: 10.3389/fnins.2020.576020.
Abstract
Tactile cues arising from interactions with objects have a sense of directionality which affects grasp. Low-latency responses to varied grip perturbations indicate that grasp safety margins are exaggerated in certain directions and conditions. In a grip with the ulnar-radial axis vertical, evidence suggests that the distal and downward directions are more sensitive to task parameters and have larger safety margins. This suggests that, for the purpose of applying forces with the fingers, reference frames with respect to both the hand and gravity are in operation. In this experiment, we examined human sensitivity to the direction of tactile movement in the context of precision grip, in orientations either orthogonal or parallel to gravity. Subjects performed a two-alternative forced-choice task involving a textured cube which moved orthogonal to their grip axis. Subjects' arms were placed in a brace that allowed finger movement but minimized arm movement. Movements of the thumb and index finger joints were monitored via PhaseSpace motion capture. Each subject was presented with the textured cube and instructed to grasp it lightly, as if it were slipping. In each trial the object was first translated 1 cm in 0° (proximal), 90° (radial), 180° (distal), or 270° (ulnar) and returned to its origin. This primary stimulus was immediately followed by a 10 mm secondary stimulus offset from the primary direction by a random multiple of 5° between -30° and 30°. After each pair of stimuli, the subject reported whether the test direction felt the same as or different from the primary stimulus. Traditional bias and sensitivity analyses did not provide conclusive results but suggested that performance is best along the ulnar-radial axis regardless of gravity. Modeling of the response curve generated a detection threshold for each primary stimulus. Lower thresholds, indicating improved detection, persisted along the ulnar-radial axis. These anisotropic detection thresholds appear to coincide with digit displacement and to be independent of grasp orientation.
Affiliation(s)
- Justin Tanner
- School of Biological and Health Systems Engineering, Arizona State University, Tempe, AZ, United States
- Naomi Newman
- University Medical Center, Banner Health, Phoenix, AZ, United States
- Stephen Helms Tillery
- School of Biological and Health Systems Engineering, Arizona State University, Tempe, AZ, United States
13. Bertoni T, Magosso E, Serino A. From statistical regularities in multisensory inputs to peripersonal space representation and body ownership: insights from a neural network model. Eur J Neurosci 2021;53:611-636. PMID: 32965729. PMCID: PMC7894138. DOI: 10.1111/ejn.14981.
Abstract
Peripersonal space (PPS), the interface between the self and the environment, is represented by a network of multisensory neurons with visual (or auditory) receptive fields anchored to specific body parts, and tactile receptive fields covering the same body parts. Neurophysiological and behavioural features of hand PPS representation have previously been modelled through a neural network consisting of one multisensory population integrating tactile inputs with visual/auditory external stimuli. Reference frame transformations were not explicitly modelled, as stimuli were encoded in pre-computed hand-centred coordinates. Here we present a novel model that aims to overcome this limitation by including a proprioceptive population encoding hand position. We behaviourally confirmed the plausibility of the proposed architecture, showing that visuo-proprioceptive information is integrated to enhance tactile processing on the hand. Moreover, the network's connectivity was spontaneously tuned through a Hebbian-like mechanism, under two minimal assumptions. First, the plasticity rule was designed to learn the statistical regularities of visual, proprioceptive and tactile inputs. Second, these statistical regularities were simply those imposed by the body structure. The network learned to integrate proprioceptive and visual stimuli, and to compute their hand-centred coordinates to predict tactile stimulation. Through the same mechanism, the network reproduced behavioural correlates of manipulations implicated in subjective body ownership: the invisible hand and the rubber hand illusions. We thus propose that PPS representation and body ownership may emerge through a unified neurocomputational process: the integration of multisensory information, consistent with a model of the body in the environment, learned from the natural statistics of sensory inputs.
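A toy version of such a Hebbian-like mechanism, where a weight grows with the co-activation of an input unit and the tactile unit and otherwise decays, can be sketched as follows. The exact rule, `eta`, `decay`, and the two-stimulus training set are our illustrative choices, not the paper's:

```python
def train_hebbian(pairs, n_pre, eta=0.1, decay=0.01):
    """Hebbian-like rule: each weight grows with the product of presynaptic
    (visual/proprioceptive) activity and postsynaptic tactile activity,
    with a slow multiplicative decay."""
    w = [0.0] * n_pre
    for pre, post in pairs:                # pre: activity vector, post: 0 or 1
        for k in range(n_pre):
            w[k] += eta * pre[k] * post - decay * w[k]
    return w

# Statistical regularity imposed by the body: touch (post = 1) co-occurs with
# visual stimuli near the hand (unit 0), never with far stimuli (unit 2).
near = ([1.0, 0.2, 0.0], 1)
far = ([0.0, 0.2, 1.0], 0)
w = train_hebbian([near, far] * 200, n_pre=3)
```

After training, connections from near-hand inputs dominate, which is the qualitative outcome the model needs to predict touch from visuo-proprioceptive co-occurrence.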
Affiliation(s)
- Tommaso Bertoni
- MySpace Lab, Department of Clinical Neuroscience, Lausanne University Hospital (CHUV), University of Lausanne, Lausanne, Switzerland
- Elisa Magosso
- Department of Electrical, Electronic, and Information Engineering “Guglielmo Marconi”, University of Bologna, Cesena, Italy
- Andrea Serino
- MySpace Lab, Department of Clinical Neuroscience, Lausanne University Hospital (CHUV), University of Lausanne, Lausanne, Switzerland
14
Zhou Q, Yu D, Reinoso MN, Newn J, Goncalves J, Velloso E. Eyes-free Target Acquisition During Walking in Immersive Mixed Reality. IEEE Transactions on Visualization and Computer Graphics 2020; 26:3423-3433. PMID: 32941144; DOI: 10.1109/tvcg.2020.3023570.
Abstract
Reaching towards out-of-sight objects while walking is a common task in daily life; however, the same task can be challenging when wearing an immersive Head-Mounted Display (HMD). In this paper, we investigate the effects of spatial reference frame, walking path curvature, and target placement relative to the body on user performance in manually acquiring out-of-sight targets located around the body, as users walk in a spatial-mapping Mixed Reality (MR) environment wearing an immersive HMD. We found that walking and increased path curvature negatively affected the overall spatial accuracy of the performance, and that performance benefited more from using the torso as the reference frame than from using the head. We also found that targets placed at maximum reaching distance yielded less error in the angular rotation and depth of the reaching arm. We discuss our findings with regard to human walking kinesthetics and sensory integration in the peripersonal space during locomotion in immersive MR. We provide design guidelines for future immersive MR experiences featuring spatial mapping and full-body motion tracking to provide a better embodied experience.
15
Accuracy of hand localization is subject-specific and improved without performance feedback. Sci Rep 2020; 10:19188. PMID: 33154521; PMCID: PMC7645785; DOI: 10.1038/s41598-020-76220-0.
Abstract
Accumulating evidence indicates that the spatial error of human hand localization is subject-specific. However, whether this idiosyncratic pattern persists across time with good within-subject consistency has not been adequately examined. Here we measured the hand localization map with a visual matching task in multiple sessions over two days. Interestingly, we found that participants improved their hand localization accuracy when tested repetitively without performance feedback. Importantly, despite the reduction of average error, the spatial pattern of hand localization errors remained idiosyncratic. Based on individuals' hand localization performance, a standard convolutional neural network classifier could identify participants with good accuracy. Moreover, we did not find supporting evidence that participants' baseline hand localization performance could predict their motor performance in a visual trajectory-matching task, even though both tasks require accurate mapping of hand position to visual targets in the same workspace. In a separate experiment, we not only replicated these findings but also ruled out the possibility that performance feedback during a few familiarization trials caused the observed improvement in hand localization. We conclude that the conventional hand localization test itself, even without feedback, can improve hand localization while leaving the idiosyncrasy of the hand localization map unchanged.
16
Kuling IA, Laan L, Lopik EV, Smeets JBJ, Brenner E. Looking Precisely at Your Fingertip Requires Visual Guidance of Gaze. Perception 2020; 49:1252-1259. PMID: 33086914; PMCID: PMC7675761; DOI: 10.1177/0301006620965133.
Abstract
People often look at objects that they are holding in their hands. It is therefore reasonable to expect them to be able to direct their gaze precisely with respect to their fingers. However, we know that people make reproducible idiosyncratic errors of up to a few centimetres when they try to align a visible cursor with their own finger hidden below a surface. To find out whether they also make idiosyncratic errors when they try to look at their finger, we asked participants to hold their finger in front of their head in the dark and look at it. Participants made idiosyncratic errors of a magnitude similar to those previously found when matching a visual cursor to their hidden finger. This shows that the proprioceptive position senses of finger and gaze are not aligned, suggesting that people rely on vision to guide their gaze to their own finger.
Affiliation(s)
- Lotte Laan
- Vrije Universiteit Amsterdam, The Netherlands
- Eli Brenner
- Vrije Universiteit Amsterdam, The Netherlands
17
Zimmet AM, Cao D, Bastian AJ, Cowan NJ. Cerebellar patients have intact feedback control that can be leveraged to improve reaching. eLife 2020; 9:e53246. PMID: 33025903; PMCID: PMC7577735; DOI: 10.7554/eLife.53246.
Abstract
It is thought that the brain does not simply react to sensory feedback, but rather uses an internal model of the body to predict the consequences of motor commands before sensory feedback arrives. Time-delayed sensory feedback can then be used to correct for the unexpected: perturbations, motor noise, or a moving target. The cerebellum has been implicated in this predictive control process. Here, we show that the feedback gain in patients with cerebellar ataxia matches that of healthy subjects, but that patients exhibit substantially more phase lag. This difference is captured by a computational model incorporating a Smith predictor in healthy subjects that is missing in patients, supporting the predictive role of the cerebellum in feedback control. Lastly, we improve cerebellar patients' movement control by altering (phase advancing) the visual feedback they receive from their own self-movement in a simplified virtual reality setup.
Affiliation(s)
- Amanda M Zimmet
- Kennedy Krieger Institute, Baltimore, United States; Department of Biomedical Engineering, Johns Hopkins University, Baltimore, United States
- Di Cao
- Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, United States
- Amy J Bastian
- Kennedy Krieger Institute, Baltimore, United States; Department of Neuroscience, Johns Hopkins University, Baltimore, United States
- Noah J Cowan
- Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, United States
18
Goettker A, Fiehler K, Voudouris D. Somatosensory target information is used for reaching but not for saccadic eye movements. J Neurophysiol 2020; 124:1092-1102. DOI: 10.1152/jn.00258.2020.
Abstract
A systematic investigation of the contributions of different somatosensory modalities (proprioceptive, kinesthetic, tactile) to goal-directed movements is missing. Here we demonstrate that while eye movements are not affected by the type of somatosensory information available, reach precision improves when two different types of information are available. Moreover, reach accuracy and gaze precision to unseen somatosensory targets improve when performing coordinated eye-hand movements, suggesting bidirectional contributions of efferent information to reach and eye movement control.
Affiliation(s)
- Alexander Goettker
- Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
- Katja Fiehler
- Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
- Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University, Giessen, Germany
- Dimitris Voudouris
- Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
19
Peviani V, Bottini G. Proprioceptive errors in the localization of hand landmarks: What can be learnt about the hand metric representation? PLoS One 2020; 15:e0236416. PMID: 32735572; PMCID: PMC7394425; DOI: 10.1371/journal.pone.0236416.
Abstract
Proprioception plays a crucial role in estimating the configuration of our body segments in space when visual information is not available. Proprioceptive accuracy is assessed by asking participants to indicate the perceived position of an unseen body landmark through reaching movements. This task has also been used to study the perceived hand structure by computing the relative distances between averaged proprioceptive judgments (hand localization task). However, the pattern of proprioceptive errors underlying the misperceived hand structure is unexplored. Here, we aimed to characterize this pattern across different hand landmarks, which have different anatomo-physiological properties and cortical representations. Furthermore, we sought to describe the consistency of the errors and their stability over time. To this purpose, we analyzed the proprioceptive errors of 43 healthy participants during the hand localization task. We found larger but more consistent errors for the fingertips compared to the knuckles, possibly due to a poorer proprioceptive signal compensated by other sources of spatial information. Furthermore, we found a shift (overlap effect) and a temporal drift of the perceived hand position towards the shoulder of origin, which was consistent within and between subjects. The overlap effect had a greater influence on lateral compared to medial landmarks, leading to overestimation of hand width. Our results are compatible with domain-general and body-specific spatial biases affecting the proprioceptive localization of hand landmarks, and thus with the apparent misperception of hand structure.
Affiliation(s)
- Valeria Peviani
- Department of Neuroscience, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
- Department of Brain and Behavioural Sciences, University of Pavia, Pavia, Italy
- Gabriella Bottini
- Department of Neuroscience, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
- Cognitive Neuropsychology Center, ASST Grande Ospedale Metropolitano Niguarda, Milan, Italy
- NeuroMI, Milan Center for Neuroscience, Milan, Italy
20
Judgements of hand location and hand spacing show minimal proprioceptive drift. Exp Brain Res 2020; 238:1759-1767. PMID: 32462377; DOI: 10.1007/s00221-020-05836-5.
Abstract
When we have only a visual memory of where our hands are, their perceived location drifts. We investigated whether the perceived location of one hand, or the spacing between two hands, drifts in the absence of visual memories or cues. In 30 participants (17 females, mean age 27 years, range 20-45 years), the perceived location of the right index finger was assessed when it was 10 cm to the right or left of the midline. Perceived spacing between the index fingers was assessed when they were spaced 20 cm apart, centred on the midline. Testing included two conditions: one with ten measures at 30 s intervals, and another in which a 3 min delay was introduced after the fifth measure. Participants responded by selecting a point on a ruler or a line from a series of lines of different lengths. Overall, participants mislocalised their hands closer to the midline. However, there was little to no drift in perceived index finger location when measures were taken at regular intervals (ipsilateral slope: 0.073 cm/measure [[Formula: see text] to 0.160], mean [99% CI]; contralateral slope: 0.045 cm/measure [[Formula: see text] to 0.120]), or across a 3 min delay (ipsilateral: [Formula: see text] cm [[Formula: see text] to 0.17]; contralateral: [Formula: see text] cm [[Formula: see text] to 0.24]). There was a slight drift in perceived spacing when measures were taken at regular intervals (slope: [Formula: see text] cm/measure [[Formula: see text] to [Formula: see text]]), but none across a 3 min delay (0.08 cm [[Formula: see text] to 1.24]). Thus, proprioceptive-based perceptions of where our hands are located or how they are spaced drift minimally or not at all, indicating that these perceptions are stable.
21
Dandu B, Kuling IA, Visell Y. Proprioceptive Localization of the Fingers: Coarse, Biased, and Context-Sensitive. IEEE Transactions on Haptics 2020; 13:259-269. PMID: 30762567; DOI: 10.1109/toh.2019.2899302.
Abstract
The proprioceptive sense provides somatosensory information about the positions of parts of the body, information that is essential for guiding behavior and monitoring the body. Few studies have investigated the perceptual localization of individual fingers, despite their importance for tactile exploration and fine manipulation. We present two experiments assessing proprioceptive localization of multiple fingers, either alone or in combination with visual cues. In the first experiment, we used a virtual reality paradigm to assess localization of multiple fingers. Surprisingly, the errors averaged 3.7 cm per digit, a significant fraction of the range of motion of any finger. Both random and systematic errors were large. The latter included participant-specific biases and participant-independent distortions that resembled observations from prior studies of perceptual representations of hand shape. In a second experiment, we introduced visual cues about the positions of nearby fingers and observed that this contextual information could greatly decrease localization errors. The results suggest that only coarse proprioceptive information is available through somatosensation, and that finer information may not be necessary for fine motor behavior. These findings may help elucidate human hand function and inform new applications in the design of human-computer interfaces or interactions in virtual reality.
22
The hidden hand is perceived closer to midline. Exp Brain Res 2019; 237:1773-1779. PMID: 31037326; DOI: 10.1007/s00221-019-05546-7.
Abstract
Whether visible or not, knowing the location of our hands is fundamental to how we perceive ourselves and interact with our environment. The present study investigated perceived hand location in the absence of vision in 30 participants. Their right index finger was placed 10, 20, or 30 cm to either side of the body midline, with or without their left index finger placed 10 cm to the left of the right index finger. On average, at each position, participants perceived their right hand closer to the body midline than it actually was. This underestimation increased linearly with increased distance of the hand from the body midline [slope 0.77 (0.74 to 0.81), mean (95% CI)]. Participants made smaller errors in perceived hand location when the right hand was in the contralateral workspace [mean difference 2.13 cm (1.57 to 2.69)]. Presence of the left hand on the support surface had little or no effect on perceived location of the right hand [mean difference [Formula: see text] cm ([Formula: see text] to 0.02)]. Overall, participants made systematic perceptual errors immediately after hand placement. The magnitude of these errors grew linearly as the hand got further away from the body midline. Because of their magnitude, these errors may contribute to errors in motor planning when visual feedback is not available. These errors are also important for studies in which perceived hand location is assessed after some time, for example, when studying illusions of body ownership and proprioceptive drift.
23
Ingram LA, Butler AA, Gandevia SC, Walsh LD. Proprioceptive measurements of perceived hand position using pointing and verbal localisation tasks. PLoS One 2019; 14:e0210911. PMID: 30653568; PMCID: PMC6336330; DOI: 10.1371/journal.pone.0210911.
Abstract
Previous studies have revealed that healthy individuals consistently misjudge the size and shape of their hidden hand during a localisation task. Specifically, they overestimate the width of their hand and underestimate the length of their fingers. This implies that the same individuals misjudge the actual location of at least some parts of their hand during the task. Therefore, the primary aim of the current study was to determine whether healthy individuals can accurately locate the actual position of their hand when it is hidden from view, and whether accuracy depends on the type of localisation task used, the orientation of the hidden hand, and whether the left or right hand is tested. Sixteen healthy right-handed participants performed a hand localisation task that involved both pointing to and verbally indicating the perceived position of landmarks on their hidden hand. Hand position was consistently misjudged as closer to the wrist (proximal bias) and, to a lesser extent, away from the thumb (ulnar bias). The magnitude of these biases depended on the localisation task (pointing vs. verbal), the orientation of the hand (straight vs. rotated), and the hand tested (left vs. right). Furthermore, the proximal location bias increased in size as the duration of the experiment increased, while the magnitude of the ulnar bias remained stable throughout the experiment. Finally, the resultant maps of perceived hand location appear to replicate the previously reported overestimation of hand width and underestimation of finger length. Once again, the magnitude of these distortions depends on the task, orientation, and hand tested. These findings underscore the need to control and standardise each component of the hand localisation task in future studies.
Affiliation(s)
- Lewis A. Ingram
- Neuroscience Research Australia, Sydney, New South Wales, Australia
- University of New South Wales, Sydney, New South Wales, Australia
- Annie A. Butler
- Neuroscience Research Australia, Sydney, New South Wales, Australia
- University of New South Wales, Sydney, New South Wales, Australia
- Simon C. Gandevia
- Neuroscience Research Australia, Sydney, New South Wales, Australia
- University of New South Wales, Sydney, New South Wales, Australia
- Lee D. Walsh
- Platypus Technical Consultants Pty Ltd, Canberra, Australia
24
Kuling IA, de Brouwer AJ, Smeets JBJ, Flanagan JR. Correcting for natural visuo-proprioceptive matching errors based on reward as opposed to error feedback does not lead to higher retention. Exp Brain Res 2018; 237:735-741. PMID: 30560507; PMCID: PMC6394780; DOI: 10.1007/s00221-018-5456-3.
Abstract
When asked to move their unseen hand to visual targets, people exhibit idiosyncratic but reliable visuo-proprioceptive matching errors. Unsurprisingly, vision and proprioception quickly align when these errors are made apparent by providing visual feedback of the position of the hand. However, retention of this learning is limited, such that the original matching errors soon reappear when visual feedback is removed. Several recent motor learning studies have shown that reward feedback can improve retention relative to error feedback. Here, using a visuo-proprioceptive position-matching task, we examined whether binary reward feedback can be effectively exploited to reduce matching errors and, if so, whether this learning leads to improved retention relative to learning based on error feedback. The results show that participants were able to adjust the visuo-proprioceptive mapping with reward feedback, but that the level of retention was similar to that observed when the adjustment was accomplished with error feedback. Therefore, similar to error feedback, reward feedback allows for temporary recalibration, but does not support long-lasting retention of this recalibration.
Affiliation(s)
- Irene A Kuling
- Department of Human Movement Sciences, Vrije Universiteit Amsterdam, van der Boechorststraat 9, 1081 BT, Amsterdam, The Netherlands; Institute of Neuroscience, Université Catholique de Louvain, Brussels, Belgium
- Anouk J de Brouwer
- Centre for Neuroscience Studies, Queen's University, Kingston, ON, Canada; Department of Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, BC, Canada
- Jeroen B J Smeets
- Department of Human Movement Sciences, Vrije Universiteit Amsterdam, van der Boechorststraat 9, 1081 BT, Amsterdam, The Netherlands
- J Randall Flanagan
- Centre for Neuroscience Studies, Queen's University, Kingston, ON, Canada; Department of Psychology, Queen's University, Kingston, ON, Canada
25
Kuling IA, de Bruijne WJ, Burgering K, Brenner E, Smeets JBJ. Visuo-Proprioceptive Matching Errors Are Consistent with Biases in Distance Judgments. J Mot Behav 2018; 51:572-579. PMID: 30375949; DOI: 10.1080/00222895.2018.1528435.
Abstract
People make systematic errors when matching the location of an unseen index finger with that of a visual target. These errors are consistent over time, but idiosyncratic and surprisingly task-specific. The errors made when moving the unseen index finger to a visual target are not consistent with the errors made when moving a visual target to the unseen index finger. To test whether such inconsistencies arise because a large part of the matching errors originates during movement execution, we compared errors in moving the unseen finger to a target with biases in deciding which of two visual targets was closer to the index finger before the movement. We found that the judgment of which target was closer was consistent with the matching errors. This means that inconsistencies in visuo-proprioceptive matching errors are not caused by systematic errors in movement execution, but are likely related to biases in sensory transformations.
Affiliation(s)
- Irene A Kuling
- Department of Human Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands; Institute of Neuroscience, Université Catholique de Louvain, Brussels, Belgium
- Willem J de Bruijne
- Department of Human Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Kimberley Burgering
- Department of Human Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Eli Brenner
- Department of Human Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Jeroen B J Smeets
- Department of Human Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
26
Liu Y, Sexton BM, Block HJ. Spatial bias in estimating the position of visual and proprioceptive targets. J Neurophysiol 2018; 119:1879-1888. PMID: 29465330; DOI: 10.1152/jn.00633.2017.
Abstract
When people match an unseen hand to a visual or proprioceptive target, they make both variable and systematic (bias) errors. Variance is a well-established factor in behavior, but the origin and implications of bias, and its connection to variance, are poorly understood. Eighty healthy adults matched their unseen right index finger to proprioceptive (left index finger) and visual targets with no performance feedback. We asked whether matching bias was related to target modality and to the magnitude or spatial properties of matching variance. Bias errors were affected by target modality, with subjects estimating visual and proprioceptive targets 20 mm apart. We found three pieces of evidence to suggest a connection between bias and variable errors: 1) for most subjects, the target modality that yielded greater spatial bias was also estimated with greater variance; 2) magnitudes of matching bias and variance were somewhat correlated for each target modality (R = 0.24 and 0.29); and 3) bias direction was closely related to the angle of the major axis of the confidence ellipse (R = 0.60 and 0.63). However, whereas variance was significantly correlated with visuo-proprioceptive weighting as predicted by multisensory integration theory (R = -0.29 and 0.27 for visual and proprioceptive variance, respectively), bias was not. In a second session, subjects improved their matching variance, but not bias, for both target modalities, indicating a difference in stability. Taken together, these results suggest bias and variance are related only in some respects, which should be considered in the study of multisensory behavior. NEW & NOTEWORTHY People matching visual or proprioceptive targets make both variable and systematic (bias) errors. Multisensory integration is thought to minimize variance, but if the less variable modality has more bias, behavioral accuracy will decrease. Our data set suggests this is unusual. However, although bias and variable errors were spatially related, they differed in both stability and correlation with multisensory weighting. This suggests the bias-variance relationship is not straightforward, and both should be considered in multisensory behavior.
Affiliation(s)
- Yang Liu
- Department of Kinesiology and Program in Neuroscience, Indiana University Bloomington, Bloomington, Indiana
- Brandon M Sexton
- Department of Kinesiology and Program in Neuroscience, Indiana University Bloomington, Bloomington, Indiana
- Hannah J Block
- Department of Kinesiology and Program in Neuroscience, Indiana University Bloomington, Bloomington, Indiana
27
Fraser LE, Harris LR. The effect of hand position on perceived finger orientation in left- and right-handers. Exp Brain Res 2017; 235:3683-3693. PMID: 28929312; PMCID: PMC5671529; DOI: 10.1007/s00221-017-5090-5.
Abstract
In the absence of visual feedback, the perceived orientation of the fingers is systematically biased. In right-handers these biases are asymmetrical between the left and right hands in the horizontal plane and may reflect common functional postures of the two hands. Here we compared finger orientation perception in right- and left-handed participants for both hands, across various hand positions in the horizontal plane. Participants rotated a white line on a screen, optically superimposed over their hand, to indicate the perceived orientation of a finger that was rotated to one of seven orientations, with the hand either aligned with the body midline, aligned with the shoulder, or displaced by twice the shoulder-to-midline distance from the midline. We replicated the asymmetric pattern of biases previously reported in right-handed participants (left hand biased towards an orientation ~30° inward, right hand ~10° inward). However, no such asymmetry was found for left-handers, suggesting that left-handers may use different strategies when mapping proprioception to body or space coordinates and/or have less specialization of function between the hands. Both groups' responses rotated further outward as the distance of the hand from the body midline increased, consistent with other research showing that spatial orientation estimates diverge outward in the periphery. Finally, for right-handers, precision of responses was best when the hand was aligned with the shoulder, compared to the other two conditions. These results highlight the unique role of hand dominance and hand position in the perception of finger orientation, and provide insight into the proprioceptive position sense of the upper limbs.
Affiliation(s)
- Lindsey E Fraser
- Department of Psychology, Center for Vision Research, York University, 4700 Keele St, Toronto, ON, M3J 1P3, Canada
- Laurence R Harris
- Department of Psychology, Center for Vision Research, York University, 4700 Keele St, Toronto, ON, M3J 1P3, Canada
28
Matching locations is not just matching sensory representations. Exp Brain Res 2016; 235:533-545. PMID: 27807607; PMCID: PMC5272888; DOI: 10.1007/s00221-016-4815-1.
Abstract
People make systematic errors when matching the location of an unseen index finger with the index finger of the other hand, or with a visual target. In this study, we present two experiments that test the consistency of such matching errors across different combinations of matching methods. In the first experiment, subjects had to move their unseen index fingers to visually presented targets. We examined the consistency between matching errors for the two hands and for different postures (hand above a board or below it). We found very little consistency: the matching error depends on the posture and differs between the hands. In the second experiment, we designed sets of tasks that involved the same matching configurations. For example, we compared matching errors when moving the unseen index finger to a visual target with errors when moving a visual target to the unseen index finger. We found that matching errors are not invertible. Furthermore, moving both index fingers to the same visual target results in a different mismatch between the hands than directly matching the two index fingers. We conclude that the errors that we make when matching locations cannot arise solely from systematic mismatches between sensory representations of the positions of the fingers and of visually perceived space. We discuss how these results can be interpreted in terms of sensory transformations that depend on the movement that needs to be made.
29
Kuling IA, Brenner E, Smeets JBJ. Proprioceptive Localization of the Hand Changes When Skin Stretch around the Elbow Is Manipulated. Front Psychol 2016; 7:1620. [PMID: 27818638] [PMCID: PMC5073131] [DOI: 10.3389/fpsyg.2016.01620]
Abstract
Cutaneous information has been shown to influence proprioceptive position sense when subjects had to judge or match the posture of their limbs. In the present study, we tested whether cutaneous information also affects proprioceptive localization of the hand when moving it to a target. In an explorative study, we manipulated the skin stretch around the elbow by attaching elastic sports tape to one side of the arm. Subjects were asked to move the unseen manipulated arm to visually presented targets. We found that the tape induced a significant shift of the end-points of these hand movements. Surprisingly, this shift corresponded with an increase in elbow extension, irrespective of the side of the arm that was taped. A control experiment showed that this cannot be explained by how the skin stretches, because the skin near the elbow stretches to a similar extent on the inside and outside of the arm when the elbow angle increases and decreases, respectively. A second control experiment reproduced and extended the results of the main experiment for tape on the inside of the arm, and showed that the asymmetry was not just a consequence of the tape originally being applied slightly differently to the outside of the arm. However, the way in which the tape was applied does appear to matter, because applying the tape in the same way to the outside of the arm as to the inside of the arm influenced different subjects quite differently, suggesting that the relationship between skin stretch and sensed limb posture is quite complex. We conclude that the way the skin is stretched during a goal-directed movement provides information that helps guide the hand toward the target.
Affiliation(s)
- Irene A Kuling
- Department of Human Movement Sciences, MOVE Research Institute Amsterdam, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Eli Brenner
- Department of Human Movement Sciences, MOVE Research Institute Amsterdam, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Jeroen B J Smeets
- Department of Human Movement Sciences, MOVE Research Institute Amsterdam, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
30
van Beek FE, Kuling IA, Brenner E, Bergmann Tiest WM, Kappers AML. Correcting for Visuo-Haptic Biases in 3D Haptic Guidance. PLoS One 2016; 11:e0158709. [PMID: 27438009] [PMCID: PMC4954687] [DOI: 10.1371/journal.pone.0158709]
Abstract
Visuo-haptic biases are observed when bringing your unseen hand to a visual target. These biases differ between participants but are consistent within them. We investigated the usefulness of adjusting haptic guidance to these user-specific biases in aligning haptic and visual perception. By adjusting haptic guidance according to the biases, we aimed to reduce the conflict between the modalities. We first measured the biases using an adaptive procedure. Next, we measured performance in a pointing task using three conditions: (1) visual images that were adjusted to user-specific biases, without haptic guidance; (2) veridical visual images combined with haptic guidance; and (3) shifted visual images combined with haptic guidance. Adding haptic guidance increased precision. Combining haptic guidance with user-specific visual information yielded the highest accuracy and the lowest level of conflict with the guidance at the end point. These results show the potential of correcting for user-specific perceptual biases when designing haptic guidance.
Affiliation(s)
- Femke E. van Beek
- MOVE Research Institute Amsterdam, Department of Human Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Irene A. Kuling
- MOVE Research Institute Amsterdam, Department of Human Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Eli Brenner
- MOVE Research Institute Amsterdam, Department of Human Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Wouter M. Bergmann Tiest
- MOVE Research Institute Amsterdam, Department of Human Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Astrid M. L. Kappers
- MOVE Research Institute Amsterdam, Department of Human Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
31
Kuling IA, Brenner E, Smeets JB. Errors in visuo-haptic and haptic-haptic location matching are stable over long periods of time. Acta Psychol (Amst) 2016; 166:31-36. [PMID: 27043253] [DOI: 10.1016/j.actpsy.2016.03.011]
Abstract
People make systematic errors when they move their unseen dominant hand to a visual target (visuo-haptic matching) or to their other unseen hand (haptic-haptic matching). Why they make such errors is still unknown. A key question in determining the reason is to what extent individual participants' errors are stable over time. To examine this, we developed a method to quantify the consistency. With this method, we studied the stability of systematic matching errors across time intervals of at least a month. Within this time period, individual subjects' matches were as consistent as one could expect on the basis of the variability in the individual participants' performance within each session. Thus individual participants make quite different systematic errors, but in similar circumstances they make the same errors across long periods of time.
32
Mugge W, Kuling IA, Brenner E, Smeets JBJ. Haptic Guidance Needs to Be Intuitive Not Just Informative to Improve Human Motor Accuracy. PLoS One 2016; 11:e0150912. [PMID: 26982481] [PMCID: PMC4794196] [DOI: 10.1371/journal.pone.0150912]
Abstract
Humans make both random and systematic errors when reproducing learned movements. Intuitive haptic guidance that assists one to make the movements reduces such errors. Our study examined whether any additional haptic information about the location of the target reduces errors in a position reproduction task, or whether the haptic guidance needs to be assistive to do so. Holding a haptic device, subjects made reaches to visible targets without time constraints. They did so in a no-guidance condition, and in guidance conditions in which the direction of the force with respect to the target differed, but the force scaled with the distance to the target in the same way. We examined whether guidance forces directed towards the target would reduce subjects' errors in reproducing a prior position to the same extent as forces rotated by 90 or 180 degrees, as one might expect given that the forces provide the same information in all three cases. Without vision of the arm, both accuracy and precision were significantly better with guidance directed towards the target than in all other conditions. The errors with rotated guidance did not differ from those without guidance. Not surprisingly, the movements tended to be faster when guidance forces directed the reaches to the target. This study shows that haptic guidance significantly improved motor performance when using it was intuitive, while non-intuitively presented information did not lead to any improvement and seemed to be ignored even in our simple paradigm with static targets and no time constraints.
Affiliation(s)
- Winfred Mugge
- MOVE Research Institute Amsterdam, Department of Human Movement Sciences, VU University, Amsterdam, Netherlands
- Irene A. Kuling
- MOVE Research Institute Amsterdam, Department of Human Movement Sciences, VU University, Amsterdam, Netherlands
- Eli Brenner
- MOVE Research Institute Amsterdam, Department of Human Movement Sciences, VU University, Amsterdam, Netherlands
- Jeroen B. J. Smeets
- MOVE Research Institute Amsterdam, Department of Human Movement Sciences, VU University, Amsterdam, Netherlands
33
Azañón E, Tamè L, Maravita A, Linkenauger S, Ferrè E, Tajadura-Jiménez A, Longo M. Multimodal Contributions to Body Representation. Multisens Res 2016. [DOI: 10.1163/22134808-00002531]
Abstract
Our body is a unique entity by which we interact with the external world. Consequently, the way we represent our body has profound implications for the way we process and locate sensations and, in turn, perform appropriate actions. The body can be the subject, but also the object, of our experience, providing information from sensations on the body surface and viscera, but also knowledge of the body as a physical object. However, the extent to which different senses contribute to constructing the rich and unified body representations we all experience remains unclear. In this review, we aim to bring together recent research showing important roles for several different sensory modalities in constructing body representations. At the same time, we hope to generate new ideas about how, and at which level, the senses contribute to the different levels of body representation and how they interact. We will present an overview of some of the most recent neuropsychological evidence on the multisensory control of pain, and the way that the visual, auditory, vestibular and tactile systems contribute to the creation of coherent representations of the body. We will focus particularly on some of the topics discussed in the symposium on Multimodal Contributions to Body Representation held at the 15th International Multisensory Research Forum (2015, Pisa, Italy).
Affiliation(s)
- Elena Azañón
- Department of Psychological Sciences, Birkbeck, University of London, WC1E 7HX, London, UK
- Luigi Tamè
- Department of Psychological Sciences, Birkbeck, University of London, WC1E 7HX, London, UK
- Angelo Maravita
- Department of Psychology, Università degli studi di Milano-Bicocca, Italy
- Neuromi: Milan Center for Neuroscience, Milano, Italy
- Elisa R. Ferrè
- Institute of Cognitive Neuroscience, University College London, UK
- Department of Psychology, Royal Holloway University of London, UK
- Ana Tajadura-Jiménez
- Laboratorio de Neurociencia Humana, Departamento de Psicología, Universidad Loyola Andalucía, Spain
- UCL Interaction Centre, University College London, UK
- Matthew R. Longo
- Department of Psychological Sciences, Birkbeck, University of London, WC1E 7HX, London, UK
34
Boisgontier MP, Swinnen SP. Age-related deficit in a bimanual joint position matching task is amplitude dependent. Front Aging Neurosci 2015; 7:162. [PMID: 26347649] [PMCID: PMC4543861] [DOI: 10.3389/fnagi.2015.00162]
Abstract
The cognitive load associated with joint position sense increases with age but does not necessarily result in impaired performance in a joint position matching task. It is still unclear which factors interact with age to predict matching performance. To test whether movement amplitude and direction are among such predictors, young and older adults performed a bimanual wrist joint position matching task. Results revealed an age-related deficit when the target limb was positioned far from the neutral position (25°), but not when it was positioned close to it (15° or 5°), irrespective of the direction. These results suggest that the difficulty associated with comparing two musculoskeletal states increases toward extreme joint amplitudes and that older adults are more vulnerable to this increased difficulty.
Affiliation(s)
- Matthieu P Boisgontier
- Movement Control and Neuroplasticity Research Group, Department of Kinesiology, Biomedical Sciences Group, KU Leuven, Leuven, Belgium
- Stephan P Swinnen
- Movement Control and Neuroplasticity Research Group, Department of Kinesiology, Biomedical Sciences Group, KU Leuven, Leuven, Belgium; Leuven Research Institute for Neuroscience and Disease (LIND), KU Leuven, Leuven, Belgium
35
Cameron BD, de la Malla C, López-Moliner J. Why do movements drift in the dark? Passive versus active mechanisms of error accumulation. J Neurophysiol 2015; 114:390-399. [PMID: 25925317] [DOI: 10.1152/jn.00032.2015]
Abstract
When vision of the hand is unavailable, movements drift systematically away from their targets. It is unclear, however, why this drift occurs. We investigated whether drift is an active process, in which people deliberately modify their movements based on biased position estimates, causing the real hand to move away from the real target location, or a passive process, in which execution error accumulates because people have diminished sensory feedback and fail to adequately compensate for the execution error. In our study participants reached back and forth between two targets when vision of the hand, targets, or both the hand and targets was occluded. We observed the most drift when hand vision and target vision were occluded and equivalent amounts of drift when either hand vision or target vision was occluded. In a second experiment, we observed movement drift even when no visual target was ever present, providing evidence that drift is not driven by a visual-proprioceptive discrepancy. The observed drift in both experiments was consistent with a model of passive error accumulation in which the amount of drift is determined by the precision of the sensory estimate of movement error.
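The passive-accumulation account described above can be illustrated with a small toy simulation (all noise values and the reliability-weighted correction rule below are illustrative assumptions, not the authors' actual model): each reach adds execution noise, and the correction before the next reach is based on a noisy sensory estimate of that error, so the uncorrected residue accumulates like a random walk whose spread is governed by sensory precision.

```python
import numpy as np

def simulate_drift(n_reaches=50, exec_noise=1.0, sense_noise=4.0,
                   n_runs=2000, seed=0):
    """Std of the final hand error across simulated back-and-forth reaching runs.

    Each reach adds Gaussian execution error. Before the next reach the mover
    corrects by a reliability-weighted, noisy estimate of that error: the
    poorer the sensory estimate, the smaller the effective correction, so
    residual error accumulates like a random walk.
    """
    rng = np.random.default_rng(seed)
    # crude reliability weight: full correction when sensing is noiseless
    gain = exec_noise**2 / (exec_noise**2 + sense_noise**2)
    err = np.zeros(n_runs)
    for _ in range(n_reaches):
        err += rng.normal(0.0, exec_noise, n_runs)           # execution noise
        sensed = err + rng.normal(0.0, sense_noise, n_runs)  # noisy error estimate
        err -= gain * sensed                                 # partial correction
    return err.std()
```

With perfect sensing (`sense_noise=0`) every error is fully corrected and no drift accumulates; as sensory noise grows, the effective correction shrinks and the spread of final positions grows, which is the qualitative signature of passive error accumulation.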
Affiliation(s)
- Brendan D Cameron
- Vision and Control of Action Group, Departament de Psicologia Bàsica, Universitat de Barcelona, Barcelona, Catalonia, Spain; and Institute for Brain, Cognition, and Behaviour (IR3C), Universitat de Barcelona, Barcelona, Catalonia, Spain
- Cristina de la Malla
- Vision and Control of Action Group, Departament de Psicologia Bàsica, Universitat de Barcelona, Barcelona, Catalonia, Spain; and Institute for Brain, Cognition, and Behaviour (IR3C), Universitat de Barcelona, Barcelona, Catalonia, Spain
- Joan López-Moliner
- Vision and Control of Action Group, Departament de Psicologia Bàsica, Universitat de Barcelona, Barcelona, Catalonia, Spain; and Institute for Brain, Cognition, and Behaviour (IR3C), Universitat de Barcelona, Barcelona, Catalonia, Spain
36
Samad M, Chung AJ, Shams L. Perception of body ownership is driven by Bayesian sensory inference. PLoS One 2015; 10:e0117178. [PMID: 25658822] [PMCID: PMC4320053] [DOI: 10.1371/journal.pone.0117178]
Abstract
Recent studies have shown that human perception of body ownership is highly malleable. A well-known example is the rubber hand illusion (RHI), wherein ownership over a dummy hand is experienced, and which is generally believed to require synchronized stroking of the real and dummy hands. Our goal was to elucidate the computational principles governing this phenomenon. We adopted the Bayesian causal inference model of multisensory perception and applied it to visual, proprioceptive, and tactile stimuli. The model reproduced the RHI and predicted that it can occur without tactile stimulation and that synchronous stroking would enhance it. Various measures of ownership across two experiments confirmed the predictions: a large percentage of individuals experienced the illusion in the absence of any tactile stimulation, and synchronous stroking strengthened the illusion. Altogether, these findings suggest that perception of body ownership is governed by Bayesian causal inference, i.e., the same rule that appears to govern the perception of the outside world.
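The core computation of Bayesian causal inference can be sketched numerically (a generic implementation of the standard two-hypothesis comparison with illustrative noise parameters; this is not the authors' code): the observer weighs the likelihood that a visual and a proprioceptive hand-position signal arose from one common cause against the likelihood that they arose from two independent causes.

```python
import numpy as np

def p_common(x_vis, x_prop, sigma_vis=1.0, sigma_prop=2.0,
             sigma_prior=10.0, prior_common=0.5):
    """Posterior probability that two position signals share a common cause.

    Likelihoods are integrated numerically over a Gaussian prior on the
    true position(s); all parameter values are illustrative assumptions.
    """
    s = np.linspace(-50.0, 50.0, 5001)   # grid over candidate true positions
    ds = s[1] - s[0]

    def gauss(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    prior_s = gauss(s, 0.0, sigma_prior)
    # common cause: a single position s generates both signals
    like_c1 = np.sum(gauss(x_vis, s, sigma_vis) * gauss(x_prop, s, sigma_prop)
                     * prior_s) * ds
    # separate causes: each signal has its own independently drawn position
    like_vis = np.sum(gauss(x_vis, s, sigma_vis) * prior_s) * ds
    like_prop = np.sum(gauss(x_prop, s, sigma_prop) * prior_s) * ds
    like_c2 = like_vis * like_prop

    return prior_common * like_c1 / (prior_common * like_c1
                                     + (1 - prior_common) * like_c2)
```

When the two signals agree, the common-cause hypothesis dominates (supporting ownership of the seen hand); as the signals diverge, the posterior probability of a common cause falls off, which is the mechanism the model uses to explain when the illusion does and does not arise.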
Affiliation(s)
- Majed Samad
- Department of Psychology, University of California, Los Angeles, CA, USA
- Albert Jin Chung
- Department of Psychology, University of California, Los Angeles, CA, USA
- Ladan Shams
- Department of Psychology, University of California, Los Angeles, CA, USA
- Department of Bioengineering, University of California, Los Angeles, CA, USA
37
Dynamic Tuning of Tactile Localization to Body Posture. Curr Biol 2015; 25:512-517. [DOI: 10.1016/j.cub.2014.12.038]
38
Schaap TS, Gonzales TI, Janssen TWJ, Brown SH. Proprioceptively guided reaching movements in 3D space: effects of age, task complexity and handedness. Exp Brain Res 2014; 233:631-639. [DOI: 10.1007/s00221-014-4142-3]
39
Efficiency of visual feedback integration differs between dominant and non-dominant arms during a reaching task. Exp Brain Res 2014; 233:317-327. [DOI: 10.1007/s00221-014-4116-5]
40
van Beek FE, Bergmann Tiest WM, Kappers AML. Haptic discrimination of distance. PLoS One 2014; 9:e104769. [PMID: 25116638] [PMCID: PMC4130575] [DOI: 10.1371/journal.pone.0104769]
Abstract
While a fair amount of research has focused on the accuracy of haptic perception of distance, information on its precision is still scarce, particularly regarding distances perceived by making arm movements. In this study, eight conditions were measured to answer four main questions: what is the influence of reference distance, movement axis, perceptual mode (active or passive), and stimulus type on the precision of this kind of distance perception? A discrimination experiment was performed with twelve participants. The participants were presented with two distances, using either a haptic device or a real stimulus. Participants compared the distances by moving their hand from a start to an end position. They were then asked to judge which of the distances was the longer, from which the discrimination threshold was determined for each participant and condition. Precision was influenced by reference distance. No effect of movement axis was found. Precision was higher for active than for passive movements and somewhat lower for real stimuli than for rendered stimuli, but it was not affected by adding cutaneous information. Overall, the Weber fraction for the active perception of a distance of 25 or 35 cm was about 11% for all cardinal axes. The recorded position data suggest that participants, in order to be able to judge which distance was the longer, tried to produce similar speed profiles in both movements. This knowledge could be useful in the design of haptic devices.
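As a quick illustration of what the reported ~11% Weber fraction implies at the two reference distances used in the study (the helper function is hypothetical, and Weber's law is assumed to hold exactly):

```python
def discrimination_threshold(reference_cm, weber_fraction=0.11):
    """Just-noticeable difference predicted by Weber's law:
    the threshold scales linearly with the reference magnitude."""
    return weber_fraction * reference_cm

# Reference distances from the study: 25 cm and 35 cm
jnd_25 = discrimination_threshold(25.0)   # about 2.75 cm
jnd_35 = discrimination_threshold(35.0)   # about 3.85 cm
```

The absolute threshold grows with distance, but the ratio of threshold to reference stays constant, which is exactly what reporting a single Weber fraction across conditions expresses.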
Affiliation(s)
- Femke E. van Beek
- MOVE Research Institute Amsterdam, Faculty of Human Movement Sciences, VU University, Amsterdam, The Netherlands
- Wouter M. Bergmann Tiest
- MOVE Research Institute Amsterdam, Faculty of Human Movement Sciences, VU University, Amsterdam, The Netherlands
- Astrid M. L. Kappers
- MOVE Research Institute Amsterdam, Faculty of Human Movement Sciences, VU University, Amsterdam, The Netherlands
41
Kodaka K, Ishihara Y. Crossed hands strengthen and diversify proprioceptive drift in the self-touch illusion. Front Hum Neurosci 2014; 8:422. [PMID: 24987345] [PMCID: PMC4060641] [DOI: 10.3389/fnhum.2014.00422]
Abstract
In the self-touch illusion (STI), some people can feel their two hands touching each other even when the hands are actually separated. This is achieved by giving synchronized touches to both hands. Because the STI involves both hands (an administrating hand and a receptive hand) of a single person, two types of proprioceptive drift (PD) occur simultaneously, such that the two hands are attracted toward each other. It is known that the PD distance is generally larger for the administrating hand than for the receptive hand when the two hands are uncrossed. However, it remains unclear why such an asymmetrical relationship is observed universally. In this study, we conducted two types of experiment to induce the STI. The first experiment involved four conditions combining two factors, whether the hands were uncrossed or crossed and whether the administrating hand was resting or active on the surface, with the receptive (left) hand located at the body's midline. The results demonstrated that crossing the hands and resting on the surface (ROS) induced the STI. Specifically, crossing the hands increased the PD distance more than two- to threefold. Moreover, and interestingly, a strong PD dominated by the receptive hand, which did not appear in the uncrossed condition, was frequently observed. The second experiment recruited seven "illusion-sensitive" participants from the first experiment, all of whom had a strong tendency to feel the self-touch, and examined the effect of the location of the body midline on the PD when the hands were crossed and the administrating hand was resting on the surface. The results demonstrated that which hand dominated the PD differed completely among participants but was relatively stable across midline positions and over time within the same person. We also found that a small number of participants exhibited quite a different pattern of PD in an identical posture. On the basis of these results, we analyze in detail how the dominant hand in the PD is determined in the STI.
Affiliation(s)
- Kenri Kodaka
- Graduate School of Design and Architecture, Nagoya City University, Nagoya, Japan
- Yuki Ishihara
- Graduate School of Design and Architecture, Nagoya City University, Nagoya, Japan
42
Kuling IA, van der Graaff MCW, Brenner E, Smeets JBJ. Proprioceptive Biases in Different Experimental Designs. Haptics: Neuroscience, Devices, Modeling, and Applications 2014. [DOI: 10.1007/978-3-662-44193-0_3]
43
King J, Karduna A. Joint position sense during a reaching task improves at targets located closer to the head but is unaffected by instruction. Exp Brain Res 2013; 232:865-874. [PMID: 24352607] [DOI: 10.1007/s00221-013-3799-3]
Abstract
The purpose of the present study was twofold. Our first purpose was to test whether joint position sense is similar under instructions to memorize hand position and instructions to memorize shoulder and elbow angles. We hypothesized that instructions to memorize hand position would produce smaller errors due to evidence suggesting that the CNS directly determines hand position but indirectly determines joint angles from proprioceptive information. Our second purpose was to assess biases in joint position sense at various joint angles in a sagittal workspace. We hypothesized that akin to previous single-joint investigations, the shoulder and elbow would demonstrate better joint position sense as joint angles approached 90° during our multi-joint task. Sixteen healthy and right-hand-dominant subjects participated in the present investigation. Subjects were required to actively position their right upper extremity to one of three targets for a memorization period. After returning to the rest position, subjects then actively repositioned back into the target. We did not find evidence of a substantial difference in joint position sense between instructions to memorize the hand position or joint angle. This finding, when considered in conjunction with other evidence, suggests that studies employing either a joint angle protocol or a hand estimation protocol likely produce results that are similar enough to be compared. Proprioception has been shown to be non-uniform across a two-dimensional horizontal workspace. The present investigation provides evidence that proprioception is also non-uniform across a two-dimensional sagittal workspace. Specifically, angular errors decrease as upper extremity joint angles approach 90° of flexion and endpoint errors decrease as targets are located increasingly closer to the head.
Affiliation(s)
- Jacqlyn King
- Department of Human Physiology, University of Oregon, Eugene, OR, 97403, USA
44
Bingham GP, Mon-Williams MA. The dynamics of sensorimotor calibration in reaching-to-grasp movements. J Neurophysiol 2013; 110:2857-2862. [PMID: 24068760] [DOI: 10.1152/jn.00112.2013]
Abstract
Reach-to-grasp movements require information about the distance and size of target objects. Calibration of this information could be achieved via feedback information (visual and/or haptic) regarding terminal accuracy when target objects are grasped. A number of reports suggest that the nervous system alters reach-to-grasp behavior following either a visual or haptic error signal indicating inaccurate reaching. Nevertheless, the reported modification is generally partial (reaching is changed less than predicted by the feedback error), a finding that has been ascribed to slow adaptation rates. It is possible, however, that the modified reaching reflects the system's weighting of the visual and haptic information in the presence of noise rather than calibration per se. We modeled the dynamics of calibration and showed that the discrepancy between reaching behavior and the feedback error results from an incomplete calibration process. Our results provide evidence for calibration being an intrinsic feature of reach-to-grasp behavior.
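The incomplete-calibration argument can be made concrete with a first-order update sketch (a generic model with an assumed learning rate, not the authors' published model): if each trial corrects only a fixed fraction of the current feedback error, the residual error decays geometrically and is necessarily non-zero after any finite block of trials, so partial adaptation can reflect unfinished convergence rather than a failure to calibrate.

```python
def residual_error(initial_error, learning_rate, n_trials):
    """Feedback error remaining after n trials when each trial corrects
    a fixed fraction (learning_rate) of the current error.

    Closed form: initial_error * (1 - learning_rate) ** n_trials.
    """
    error = initial_error
    for _ in range(n_trials):
        error -= learning_rate * error   # partial correction on each trial
    return error

# With 20% correction per trial, 10 trials still leave roughly 11% of the error
remaining = residual_error(10.0, 0.2, 10)   # about 1.07
```

Measured midway through such a process, behavior looks like "partial" adaptation even though the underlying calibration rule is complete in the limit; more trials simply shrink the residue further.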
Affiliation(s)
- Geoffrey P Bingham
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana
45
Kuling IA, Brenner E, Smeets JBJ. Proprioception is robust under external forces. PLoS One 2013; 8:e74236. [PMID: 24019959] [PMCID: PMC3760830] [DOI: 10.1371/journal.pone.0074236]
Abstract
Information from cutaneous, muscle and joint receptors is combined with efferent information to create a reliable percept of the configuration of our body (proprioception). We exposed the hand to several horizontal force fields to examine whether external forces influence this percept. In an end-point task subjects reached visually presented positions with their unseen hand. In a vector reproduction task, subjects had to judge a distance and direction visually and reproduce the corresponding vector by moving the unseen hand. We found systematic individual errors in the reproduction of the end-points and vectors, but these errors did not vary systematically with the force fields. This suggests that human proprioception accounts for external forces applied to the hand when sensing the position of the hand in the horizontal plane.
Affiliation(s)
- Irene A. Kuling
- MOVE Research Institute Amsterdam, Faculty of Human Movement Sciences, VU University, Amsterdam, The Netherlands
- Eli Brenner
- MOVE Research Institute Amsterdam, Faculty of Human Movement Sciences, VU University, Amsterdam, The Netherlands
- Jeroen B. J. Smeets
- MOVE Research Institute Amsterdam, Faculty of Human Movement Sciences, VU University, Amsterdam, The Netherlands
46
Song W, Francis JT. Tactile information processing in primate hand somatosensory cortex (S1) during passive arm movement. J Neurophysiol 2013; 110:2061-2070. [PMID: 23945783] [DOI: 10.1152/jn.00893.2012]
Abstract
Motor output mostly depends on sensory input, which also can be affected by action. To further our understanding of how tactile information is processed in the primary somatosensory cortex (S1) in dynamic environments, we recorded neural responses to tactile stimulation of the hand in three awake monkeys under arm/hand passive movement and rest. We found that neurons generally responded to tactile stimulation under both conditions and were modulated by movement: with a higher baseline firing rate, a suppressed peak rate, and a smaller dynamic range during passive movement than during rest, while the area under the response curve was stable across these two states. By using an information theory-based method, the mutual information between tactile stimulation and neural responses was quantified with rate and spatial coding models under the two conditions. The two potential encoding models showed different contributions depending on behavioral contexts. Tactile information encoded with rate coding from individual units was lower than spatial coding of unit pairs, especially during movement; however, spatial coding had redundant information between unit pairs. Passive movement regulated the mutual information, and such regulation might play different roles depending on the encoding strategies used. The underlying mechanisms of our observation most likely come from a bottom-up strategy, where neurons in S1 were regulated through the activation of the peripheral tactile/proprioceptive receptors and the interactions between these different types of information.
Affiliation(s)
- Weiguo Song
- Department of Physiology and Pharmacology, State University of New York Downstate Medical Center, Brooklyn, New York
|
47
|
Tramper JJ, Flanders M. Predictive mechanisms in the control of contour following. Exp Brain Res 2013; 227:535-46. [PMID: 23649968] [DOI: 10.1007/s00221-013-3529-x]
Abstract
In haptic exploration, when running a fingertip along a surface, the control system may attempt to anticipate upcoming changes in curvature in order to maintain a consistent level of contact force. Such predictive mechanisms are well known in the visual system but have yet to be studied in the somatosensory system. The present experiment was therefore designed to reveal human capabilities for different types of haptic prediction. A robot arm with a large 3D workspace was attached to the index fingertip and programmed to produce virtual surfaces with curvatures that varied within and across trials. With eyes closed, subjects moved the fingertip around elliptical hoops with flattened regions or around Limaçon shapes, where the curvature varied continuously. Subjects anticipated the corner of the flattened region rather poorly, but for the Limaçon shapes they varied finger speed with upcoming curvature according to the two-thirds power law. Furthermore, although the Limaçon shapes were presented randomly in various 3D orientations, modulation of contact force also indicated good anticipation of upcoming changes in curvature. The results demonstrate that it is difficult to haptically anticipate the spatial location of an abrupt change in curvature, whereas anticipation of smooth changes in curvature appears to be supported by predictive mechanisms.
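The two-thirds power law invoked above links movement speed to path curvature; in its equivalent one-third form, tangential speed scales as v = K * kappa^(-1/3). A minimal sketch, where the ellipse dimensions and gain are illustrative choices rather than the study's stimuli:

```python
import numpy as np

def power_law_speed(curvature, gain=1.0):
    """Tangential speed under the two-thirds power law, written in its
    equivalent one-third form: v = K * kappa**(-1/3)."""
    return gain * curvature ** (-1.0 / 3.0)

# Curvature along an ellipse with semi-axes a, b, parameterized by t:
# kappa(t) = a*b / (a^2 sin^2 t + b^2 cos^2 t)^(3/2)
a, b = 2.0, 1.0
t = np.linspace(0.0, 2.0 * np.pi, 200)
kappa = (a * b) / (a**2 * np.sin(t)**2 + b**2 * np.cos(t)**2) ** 1.5

v = power_law_speed(kappa)
# The law predicts the finger moves fastest where the path is flattest:
print(v.max() / v.min())  # ratio close to 2 for this 2:1 ellipse
```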
Affiliation(s)
- Julian J Tramper
- Department of Neuroscience, University of Minnesota, 321 Church St SE, Minneapolis, MN 55455, USA
|
48
|
van der Kooij K, Brenner E, van Beers RJ, Schot WD, Smeets JBJ. Alignment to natural and imposed mismatches between the senses. J Neurophysiol 2013; 109:1890-9. [DOI: 10.1152/jn.00845.2012]
Abstract
Does the nervous system continuously realign the senses so that objects are seen and felt in the same place? Conflicting answers to this question have been given. Research imposing a sensory mismatch has provided evidence that the nervous system realigns the senses to reduce the mismatch. Other studies have shown that when subjects point with the unseen hand to visual targets, their end points show visual-proprioceptive biases that do not disappear after episodes of visual feedback. These biases indicate intersensory mismatches that the nervous system does not correct for. Here, we directly compare how the nervous system deals with natural and imposed mismatches. Subjects moved a hand-held cube to virtual cubes appearing at pseudorandom locations in three-dimensional space. We alternated blocks in which subjects moved without visual feedback of the hand with feedback blocks in which we rendered a cube representing the hand-held cube. In feedback blocks, we rotated the visual feedback by 5° relative to the subject's head, creating an imposed mismatch between vision and proprioception on top of any natural mismatches. Realignment occurred quickly but was incomplete, and subjects realigned more to imposed mismatches than to natural ones. We propose that this difference is related to how the visual information changed when subjects entered the experiment: the imposed mismatches differed from the mismatch of daily life, so alignment started from scratch, whereas the natural mismatches were not imposed by the experimenter, so subjects likely entered the experiment already partly aligned.
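The pattern of fast but incomplete realignment reported above can be captured by a first-order update in which each feedback trial corrects a fixed proportion of the remaining mismatch. This is an illustrative sketch, not the authors' model; the rates, mismatch sizes, and starting alignment are invented values:

```python
def realign(mismatch, rate, trials, start=0.0):
    """First-order alignment: each feedback trial corrects a fixed
    proportion `rate` of the remaining visuo-proprioceptive mismatch."""
    alignment = start
    history = []
    for _ in range(trials):
        alignment += rate * (mismatch - alignment)
        history.append(alignment)
    return history

# A novel imposed mismatch (e.g. a 5-degree rotation) is corrected from
# scratch; a natural mismatch may start out partly aligned already.
imposed = realign(mismatch=5.0, rate=0.4, trials=10)
natural = realign(mismatch=2.0, rate=0.4, trials=10, start=1.2)
print(round(imposed[-1], 2), round(natural[-1], 2))
```

Under this model realignment approaches the mismatch exponentially but never reaches it in a finite block, consistent with the "quick but incomplete" description.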
Affiliation(s)
- K. van der Kooij
- MOVE Research Institute Amsterdam, Faculty of Human Movement Sciences, VU University Amsterdam, Amsterdam, The Netherlands
- E. Brenner
- MOVE Research Institute Amsterdam, Faculty of Human Movement Sciences, VU University Amsterdam, Amsterdam, The Netherlands
- R. J. van Beers
- MOVE Research Institute Amsterdam, Faculty of Human Movement Sciences, VU University Amsterdam, Amsterdam, The Netherlands
- W. D. Schot
- MOVE Research Institute Amsterdam, Faculty of Human Movement Sciences, VU University Amsterdam, Amsterdam, The Netherlands
- Center for Motor and Cognitive Disabilities, Utrecht University, Utrecht, The Netherlands
- J. B. J. Smeets
- MOVE Research Institute Amsterdam, Faculty of Human Movement Sciences, VU University Amsterdam, Amsterdam, The Netherlands
|
49
|
Abstract
Motor learning is driven by movement errors. The speed of learning can be quantified by the learning rate, which is the proportion of an error that is corrected for in the planning of the next movement. Previous studies have shown that the learning rate depends on the reliability of the error signal and on the uncertainty of the motor system's own state. These dependences agree with the predictions of the Kalman filter, a state estimator that can be used to determine the optimal learning rate for each movement such that the expected movement error is minimized. Here we test not only whether the average behaviour is optimal, as previous studies showed, but whether the learning rate is chosen optimally in every individual movement. Subjects made repeated movements to visual targets with their unseen hand and received visual feedback about their endpoint error immediately after each movement. The reliability of these error signals was varied across three conditions. The results are inconsistent with the predictions of the Kalman filter: correction for large errors at the beginning of a series of movements to a fixed target was not as fast as predicted, and the learning rates for the extent and the direction of the movements did not differ in the way the Kalman filter predicts. Instead, a simpler model that uses the same learning rate for all movements with the same error-signal reliability can explain the data. We conclude that the brain does not apply state estimation to determine the optimal planning correction for every individual movement but employs a simpler strategy: a fixed learning rate for all movements with the same level of error-signal reliability.
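The fixed-learning-rate strategy the abstract favors can be sketched as a trial-by-trial update in which every plan is corrected by the same proportion of the previous endpoint error. The target, noise level, and rate below are illustrative assumptions, not the study's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(learning_rate, target=10.0, trials=50, noise=0.5):
    """Trial-by-trial planning with a fixed learning rate: every plan is
    corrected by the same proportion of the previous endpoint error."""
    plan = 0.0
    errors = []
    for _ in range(trials):
        endpoint = plan + rng.normal(0.0, noise)  # noisy execution
        error = endpoint - target                 # signed endpoint error
        errors.append(error)
        plan -= learning_rate * error             # same rate every trial
    return np.array(errors)

errs = simulate(learning_rate=0.5)
# Early errors are large and shrink toward the noise floor:
print(abs(errs[0]), abs(errs[-10:]).mean())
```

A Kalman filter would instead shrink the gain trial by trial as state uncertainty falls; the constant-rate update above is the simpler alternative the data supported.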
|
50
|
Rincon-Gonzalez L, Naufel SN, Santos VJ, Helms Tillery S. Interactions between tactile and proprioceptive representations in haptics. J Mot Behav 2012; 44:391-401. [DOI: 10.1080/00222895.2012.746281]
|