1
Verdel D, Bastide S, Geffard F, Bruneau O, Vignais N, Berret B. Reoptimization of single-joint motor patterns to non-Earth gravity torques induced by a robotic exoskeleton. iScience 2023;26:108350. PMID: 38026148; PMCID: PMC10665922; DOI: 10.1016/j.isci.2023.108350.
Abstract
Gravity is a ubiquitous component of our environment that we have learned to optimally integrate in movement control. Yet altered gravity conditions arise in numerous applications, from space exploration to rehabilitation, thereby pressing the sensorimotor system to adapt. Here, we used a robotic exoskeleton to reproduce the elbow joint-level effects of arbitrary gravity fields ranging from 1g to -1g, passing through Mars- and Moon-like gravities, and tested whether humans can reoptimize their motor patterns accordingly. By comparing the motor patterns of actual arm movements with those predicted by an optimal control model, we show that our participants (N = 61) adapted optimally to each gravity-like torque. These findings suggest that the joint-level effects of a large range of gravities can be efficiently apprehended by humans, thus opening new perspectives for arm weight support in manipulation tasks, whether for patients or astronauts.
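The joint-level torque such an exoskeleton must reproduce follows from elementary statics: the gravitational torque about the elbow scales linearly with the gravity fraction. A minimal sketch, where the limb parameters (mass, center-of-mass distance) and the Moon/Mars fractions are illustrative assumptions, not values taken from the paper:

```python
import math

def elbow_gravity_torque(g_fraction, m=1.5, l_c=0.15, theta_deg=0.0, g0=9.81):
    """Gravitational torque (N·m) on the forearm about the elbow,
    tau = m * (g_fraction * g0) * l_c * cos(theta),
    with theta the elbow angle measured from the horizontal."""
    return m * g_fraction * g0 * l_c * math.cos(math.radians(theta_deg))

# Torque the exoskeleton would add or cancel for each gravity-like field.
for label, frac in [("Earth", 1.0), ("Mars", 0.38), ("Moon", 0.17), ("Inverted", -1.0)]:
    print(f"{label:>8}: {elbow_gravity_torque(frac):+.3f} N·m")
```

Because the torque is linear in the gravity fraction, a -1g field is simply the Earth torque with its sign flipped, which is what makes the full 1g to -1g range implementable with a single joint-torque controller.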
Affiliation(s)
- Dorian Verdel
- Université Paris-Saclay, CIAMS, 91405 Orsay, France
- CIAMS, Université d'Orléans, Orléans, France
- Simon Bastide
- Université Paris-Saclay, CIAMS, 91405 Orsay, France
- CIAMS, Université d'Orléans, Orléans, France
- Olivier Bruneau
- LURPA, Mechanical Engineering Department, ENS Paris-Saclay, Université Paris-Saclay, 91190 Gif-sur-Yvette, France
- Nicolas Vignais
- Université Paris-Saclay, CIAMS, 91405 Orsay, France
- CIAMS, Université d'Orléans, Orléans, France
- Bastien Berret
- Université Paris-Saclay, CIAMS, 91405 Orsay, France
- CIAMS, Université d'Orléans, Orléans, France
- Institut Universitaire de France, Paris, France
2
Carriot J, Mackrous I, Cullen KE. Challenges to the Vestibular System in Space: How the Brain Responds and Adapts to Microgravity. Front Neural Circuits 2021;15:760313. PMID: 34803615; PMCID: PMC8595211; DOI: 10.3389/fncir.2021.760313.
Abstract
In the next century, flying civilians to space or humans to Mars will no longer be a subject of science fiction. The altered gravitational environment experienced during space flight, as well as that experienced following landing, results in impaired perceptual and motor performance, particularly in the first days of the new environmental challenge. Notably, the absence of gravity unloads the vestibular otolith organs such that they are no longer stimulated as they would be on Earth. Understanding how the brain responds initially and then adapts to altered sensory input has important implications for understanding the inherent abilities as well as the limitations of human performance. Space-based experiments have shown that altered gravity causes structural and functional changes at multiple stages of vestibular processing, spanning from the hair cells of its sensory organs to the Purkinje cells of the vestibular cerebellum. Furthermore, ground-based experiments have established the adaptive capacity of vestibular pathways and the neural mechanisms that likely underlie this adaptation. We review these studies and suggest that the brain likely uses two key strategies to adapt to changes in gravity: (i) updating a cerebellum-based internal model of the sensory consequences of gravity; and (ii) re-weighting extra-vestibular information as the vestibular system becomes less reliable (i.e., entering microgravity) and then more reliable again (i.e., on return to Earth).
Affiliation(s)
- Jérôme Carriot
- Department of Physiology, McGill University, Montreal, QC, Canada
- Kathleen E. Cullen
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD, United States
3
Clément G, Bukley A, Loureiro N, Lindblad L, Sousa D, Zandvliet A. Horizontal and Vertical Distance Perception in Altered Gravity. Sci Rep 2020;10:5471. PMID: 32214172; PMCID: PMC7096486; DOI: 10.1038/s41598-020-62405-0.
Abstract
The perception of the horizontal and vertical distances of a visual target to an observer was investigated in parabolic flight during alternating short periods of normal gravity (1 g), microgravity (0 g), and hypergravity (1.8 g). The methods used for obtaining absolute judgments of egocentric distance included verbal reports and visually directed motion toward a memorized visual target by pulling on a rope with the arms (blind pulling). The results showed that, for all gravity levels, the verbal reports of distance judgments were accurate for targets located between 0.6 and 6.0 m. During blind pulling, subjects increasingly underestimated horizontal distances as distance increased, and this underestimation decreased in 0 g. Vertical distances for up targets were overestimated and vertical distances for down targets were underestimated in both 1 g and 1.8 g; this vertical asymmetry was absent in 0 g. The results of the present study confirm that blind pulling and verbal reports are independently influenced by gravity. The changes in distance judgments during blind pulling in 0 g compared to 1 g support the view that, during an action-based task, subjects base their perception of distance on the estimated motor effort of navigating to the perceived object.
Affiliation(s)
- Angie Bukley
- International Space University Org., Inc., Webster, Massachusetts, USA
- Nuno Loureiro
- Champalimaud Research, Champalimaud Centre for the Unknown, Lisbon, Portugal
- André Zandvliet
- European Space Research and Technology Center, Noordwijk, The Netherlands
4
La Scaleia B, Lacquaniti F, Zago M. Body orientation contributes to modelling the effects of gravity for target interception in humans. J Physiol 2019;597:2021-2043. PMID: 30644996; DOI: 10.1113/jp277469.
Abstract
Key points: It is known that interception of targets accelerated by gravity involves internal models coupled with visual signals. Non-visual signals related to head and body orientation relative to gravity may also contribute, although their role is poorly understood. In a novel experiment, we asked pitched observers to hit a virtual target approaching with an acceleration that was either coherent or incoherent with their pitch tilt. Initially, the timing errors were large and independent of the coherence between target acceleration and the observer's pitch. With practice, however, the timing errors became substantially smaller in the coherent conditions. The results show that information about head and body orientation can contribute to modelling the effects of gravity on a moving target. Orientation cues from vestibular and somatosensory signals might be integrated with visual signals in the vestibular cortex, where the internal model of gravity is assumed to be encoded.
Interception of moving targets relies on visual signals and internal models. Less is known about the additional contribution of non-visual cues about head and body orientation relative to gravity. We took advantage of Galileo's law of motion along an incline to demonstrate the effects of vestibular and somatosensory cues about head and body orientation on interception timing. Participants were asked to hit a ball rolling in a gutter towards the eyes, resulting in image expansion. The scene was presented in a head-mounted display, without any visual information about gravity direction. In separate blocks of trials, participants were pitched backwards by 20° or 60°, whereas ball acceleration was randomized across trials so as to be compatible with rolling down a slope of 20° or 60°. Initially, the timing errors were large, independently of the coherence between ball acceleration and pitch angle, consistent with responses based exclusively on visual information, because the visual stimuli were identical at both tilts. At the end of the experiment, however, the timing errors were systematically smaller in the coherent conditions than in the incoherent ones. Moreover, responses were significantly (P = 0.007) earlier when participants were pitched by 60° than by 20°. Therefore, practice with the task led to the incorporation of information about head and body orientation relative to gravity into response timing. By contrast, posture did not affect response timing in a control experiment in which participants hit a static target in synchrony with the last of a predictable series of stationary audiovisual stimuli.
Affiliation(s)
- Barbara La Scaleia
- Laboratory of Neuromotor Physiology, IRCCS Fondazione Santa Lucia, Rome, Italy
- Francesco Lacquaniti
- Laboratory of Neuromotor Physiology, IRCCS Fondazione Santa Lucia, Rome, Italy; Department of Systems Medicine, University of Rome Tor Vergata, Rome, Italy; Centre of Space Bio-medicine, University of Rome Tor Vergata, Rome, Italy
- Myrka Zago
- Laboratory of Neuromotor Physiology, IRCCS Fondazione Santa Lucia, Rome, Italy; Department of Civil Engineering and Computer Science Engineering, University of Rome Tor Vergata, Rome, Italy
5
Kato T, Imaizumi S, Tanno Y. Metaphorical Action Retrospectively but Not Prospectively Alters Emotional Judgment. Front Psychol 2018;9:1927. PMID: 30356744; PMCID: PMC6189424; DOI: 10.3389/fpsyg.2018.01927.
Abstract
Metaphorical association between vertical space and emotional valence is activated by bodily movement toward the corresponding space. Upward or downward manual movement "following" observation of emotional images has been reported to alter the perceived valence as more positive or negative. This study aimed to clarify this retrospective emotional modulation. Experiment 1 investigated the effects of the temporal order of emotional stimuli and manual movements. Participants performed upward, downward, or horizontal manual movements immediately before or after observation of emotional images; they then rated the valence of the image. The images were rated as more negative in downward- than in horizontal-movement conditions, but only when the movements followed the image observation; upward movement showed no effect. Experiment 2 examined the effects of temporal proximity between images, movements, and ratings. The results showed that a 2-s interval either between image and movement or between movement and rating nullified the retrospective effect. Thus, bodily movement that corresponds to the space-valence metaphor retrospectively, but not prospectively, alters the perceived valence of emotional stimuli, and this effect requires temporal proximity between the emotional stimulus, the subsequent movement, and the rating of the stimulus. With respect to the absent effect of the upward-positive correspondence, an anisotropy in the effects of movement direction is discussed.
Affiliation(s)
- Tatsuya Kato
- Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan
- Shu Imaizumi
- Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan
- Japan Society for the Promotion of Science, Tokyo, Japan
- Yoshihiko Tanno
- Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan
6
Jörges B, López-Moliner J. Gravity as a Strong Prior: Implications for Perception and Action. Front Hum Neurosci 2017;11:203. PMID: 28503140; PMCID: PMC5408029; DOI: 10.3389/fnhum.2017.00203.
Abstract
In the future, humans are likely to be exposed to environments with altered gravity conditions, be it only visually (Virtual and Augmented Reality) or visually and bodily (space travel). As visually and bodily perceived gravity, as well as an interiorized representation of Earth gravity, are involved in a series of tasks such as catching, grasping, body orientation estimation, and spatial inference, humans will need to adapt to these new gravity conditions. Performance under Earth-gravity-discrepant conditions has been shown to be relatively poor, and the few studies conducted on gravity adaptation are rather discouraging. Especially in VR on Earth, conflicts between bodily and visual gravity cues seem to make full adaptation to visually perceived Earth-discrepant gravities nearly impossible, and even in space, when visual and bodily cues are congruent, adaptation is extremely slow. We invoke a Bayesian framework for gravity-related perceptual processes, in which Earth gravity holds the status of a so-called "strong prior". Like other strong priors, the gravity prior has developed through years of experience in an Earth-gravity environment. For this reason, the reliability of this representation is extremely high, and it overrules any sensory information to the contrary. While other factors, such as the multisensory nature of gravity perception, also need to be taken into account, we present the strong-prior account as a unifying explanation for empirical results in gravity perception and adaptation to Earth-discrepant gravities.
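The "strong prior" account can be illustrated with standard Gaussian cue combination: the posterior mean is a precision-weighted average of prior and sensory likelihood, so a very low-variance Earth-gravity prior barely moves even under strongly conflicting evidence. A minimal sketch; the variance values are illustrative assumptions, not estimates from the paper:

```python
def fuse(mu_prior, var_prior, mu_sense, var_sense):
    """Gaussian prior-likelihood fusion: posterior precision is the sum of
    precisions, and the posterior mean is a precision-weighted average."""
    w = (1 / var_prior) / (1 / var_prior + 1 / var_sense)
    mu_post = w * mu_prior + (1 - w) * mu_sense
    var_post = 1 / (1 / var_prior + 1 / var_sense)
    return mu_post, var_post

# A "strong" Earth-gravity prior (9.81 m/s^2, tiny variance) dominates even
# when visual evidence signals 0 g; a weak prior is overruled by the senses.
strong, _ = fuse(9.81, 0.01, 0.0, 1.0)
weak, _ = fuse(9.81, 100.0, 0.0, 1.0)
print(strong, weak)  # strong stays near 9.81; weak collapses toward 0
```

On this account, slow adaptation corresponds to the prior's variance growing only very gradually with exposure to the discrepant environment.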
Affiliation(s)
- Björn Jörges
- Department of Cognition, Development and Psychology of Education, Faculty of Psychology, Universitat de Barcelona, Catalonia, Spain; Institut de Neurociències, Universitat de Barcelona, Catalonia, Spain
- Joan López-Moliner
- Department of Cognition, Development and Psychology of Education, Faculty of Psychology, Universitat de Barcelona, Catalonia, Spain; Institut de Neurociències, Universitat de Barcelona, Catalonia, Spain
7
Török Á, Ferrè ER, Kokkinara E, Csépe V, Swapp D, Haggard P. Up, Down, Near, Far: An Online Vestibular Contribution to Distance Judgement. PLoS One 2017;12:e0169990. PMID: 28085939; PMCID: PMC5235368; DOI: 10.1371/journal.pone.0169990.
Abstract
Whether a visual stimulus seems near or far away depends partly on its vertical elevation. Contrasting theories suggest that perception of distance could vary with elevation either because of the memory of previous upward efforts in climbing to overcome gravity, or because of the fear of falling associated with the downward direction. The vestibular system provides a fundamental signal for the downward direction of gravity, but the relation between this signal and depth perception remains unexplored. Here we report an experiment on vestibular contributions to depth perception using Virtual Reality. We asked participants to judge the absolute distance of an object presented on a plane at different elevations during brief artificial vestibular inputs. Relative to distance estimates collected with the object at the level of the horizon, participants tended to overestimate distances when the object was presented above the horizon and the head was tilted upward, and to underestimate them when the object was presented below the horizon. Interestingly, adding artificial vestibular inputs strengthened these distance biases, showing that online multisensory signals, and not only stored information, contribute to such distance illusions. Our results support the gravity theory of depth perception and show that vestibular signals make an online contribution to the perception of effort, and thus of distance.
Affiliation(s)
- Ágoston Török
- Brain Imaging Centre, Research Centre for Natural Sciences, Hungarian Academy of Sciences, Budapest, Hungary
- Elisa Raffaella Ferrè
- Institute of Cognitive Neuroscience, University College London, London, United Kingdom
- Department of Psychology, Royal Holloway University of London, Egham, United Kingdom
- Elena Kokkinara
- Department of Personality, Assessment and Psychological Treatments, University of Barcelona, Barcelona, Spain
- Valéria Csépe
- Brain Imaging Centre, Research Centre for Natural Sciences, Hungarian Academy of Sciences, Budapest, Hungary
- David Swapp
- Department of Computer Science, University College London, London, United Kingdom
- Patrick Haggard
- Institute of Cognitive Neuroscience, University College London, London, United Kingdom
8
Rousseau C, Papaxanthis C, Gaveau J, Pozzo T, White O. Initial information prior to movement onset influences kinematics of upward arm pointing movements. J Neurophysiol 2016;116:1673-1683. PMID: 27486106; DOI: 10.1152/jn.00616.2015.
Abstract
To elaborate a motor plan and perform online control in the gravity field, the brain relies on priors and on multisensory integration of information. In particular, afferent and efferent inputs related to the initial state are thought to convey the sensorimotor information needed to plan the upcoming action. Yet it is still unclear to what extent these cues impact motor planning. Here we examined the role of initial information in the planning and execution of arm movements. Participants performed upward arm movements around the shoulder at three speeds and in two arm conditions. In the first condition, the arm was outstretched horizontally and required a significant muscular command to compensate for the gravitational shoulder torque before movement onset. In contrast, in the second condition the arm was passively maintained in the same position by a cushioned support and did not require any muscle contraction before movement execution. We quantified differences in motor performance by comparing shoulder velocity profiles. Previous studies showed that asymmetric velocity profiles reflect an optimal integration of the effects of gravity on upward movements. Consistent with this, we found shortened acceleration durations in both arm conditions. However, early differences in kinematic asymmetries and EMG patterns between the two conditions signaled a change in the motor plan. This different behavior persisted across trials when the arm was at rest before movement onset and may reveal a distinct motor strategy chosen in the context of uncertainty. Altogether, we suggest that the information available online must be complemented by accurate initial information.
Affiliation(s)
- Célia Rousseau
- Université de Bourgogne Franche-Comté (UBFC), Cognition Action et Plasticité Sensorimotrice (CAPS) UMR1093, Dijon, France; Institut National de Santé et de Recherche Médicale (INSERM U1093), Cognition Action et Plasticité Sensorimotrice (CAPS) UMR1093, Dijon, France
- Charalambos Papaxanthis
- Université de Bourgogne Franche-Comté (UBFC), Cognition Action et Plasticité Sensorimotrice (CAPS) UMR1093, Dijon, France; Institut National de Santé et de Recherche Médicale (INSERM U1093), Cognition Action et Plasticité Sensorimotrice (CAPS) UMR1093, Dijon, France
- Jérémie Gaveau
- Université de Bourgogne Franche-Comté (UBFC), Cognition Action et Plasticité Sensorimotrice (CAPS) UMR1093, Dijon, France; Institut National de Santé et de Recherche Médicale (INSERM U1093), Cognition Action et Plasticité Sensorimotrice (CAPS) UMR1093, Dijon, France
- Thierry Pozzo
- Université de Bourgogne Franche-Comté (UBFC), Cognition Action et Plasticité Sensorimotrice (CAPS) UMR1093, Dijon, France; Institut National de Santé et de Recherche Médicale (INSERM U1093), Cognition Action et Plasticité Sensorimotrice (CAPS) UMR1093, Dijon, France; Institut Universitaire de France (IUF), Paris, France
- Olivier White
- Université de Bourgogne Franche-Comté (UBFC), Cognition Action et Plasticité Sensorimotrice (CAPS) UMR1093, Dijon, France; Institut National de Santé et de Recherche Médicale (INSERM U1093), Cognition Action et Plasticité Sensorimotrice (CAPS) UMR1093, Dijon, France
9
Duda KR, Vasquez RA, Middleton AJ, Hansberry ML, Newman DJ, Jacobs SE, West JJ. The Variable Vector Countermeasure Suit (V2Suit) for space habitation and exploration. Front Syst Neurosci 2015;9:55. PMID: 25914631; PMCID: PMC4392692; DOI: 10.3389/fnsys.2015.00055.
Abstract
The “Variable Vector Countermeasure Suit (V2Suit) for Space Habitation and Exploration” is a novel system concept that provides a platform for integrating sensors and actuators with daily astronaut intravehicular activities to improve health and performance, while reducing the mass and volume of physiologic adaptation countermeasure systems as well as the required exercise time during long-duration space exploration missions. The V2Suit system leverages wearable kinematic monitoring technology, using inertial measurement units (IMUs) and control moment gyroscopes (CMGs) within miniaturized modules placed on body segments to provide a “viscous resistance” during movements against a specified direction of “down”. It is intended initially as a countermeasure to the sensorimotor adaptation performance decrements that manifest while living and working in microgravity and during the gravitational transitions of long-duration spaceflight, including post-flight recovery and rehabilitation. Several aspects of the V2Suit system concept were explored and simulated prior to developing a brassboard prototype for technology demonstration. This included a system architecture identifying the key components and their interconnects, initial identification of key human-system integration challenges, development of a simulation architecture for CMG selection and parameter sizing, and the detailed mechanical design and fabrication of a module. The brassboard prototype demonstrates closed-loop control from “down” initialization through CMG actuation, and provides a research platform for human performance evaluations to mitigate sensorimotor adaptation, as well as a tool for determining the performance requirements when used as a musculoskeletal deconditioning countermeasure. This type of countermeasure system also has Earth benefits, particularly in gait or movement stabilization and rehabilitation.
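The “viscous resistance” idea can be sketched as a damping force that opposes only the component of segment velocity along the tracked “down” direction, like moving a limb through a fluid in that one direction. A minimal sketch under assumed parameters; the function name, damping gain, and vectors are illustrative, not taken from the V2Suit design:

```python
import math

def viscous_resistance(velocity, down, b=5.0):
    """Return a damping force (per axis) opposing the component of
    `velocity` along the unit direction `down`; lateral motion is free.
    velocity, down: 3-element sequences; b: damping gain (assumed)."""
    n = math.sqrt(sum(d * d for d in down))
    down = [d / n for d in down]                      # normalize "down"
    v_par = sum(v * d for v, d in zip(velocity, down))  # signed speed along "down"
    return [-b * v_par * d for d in down]             # oppose only that component

# Moving straight "down" is resisted; moving sideways is not.
print(viscous_resistance([0.0, 0.0, -1.0], [0.0, 0.0, -1.0]))
print(viscous_resistance([1.0, 0.0, 0.0], [0.0, 0.0, -1.0]))
```

In a wearable system the `down` vector would come from the IMU-based “down” initialization and tracking, and the force would be realized through CMG torques rather than applied directly.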
Affiliation(s)
- Kevin R Duda
- The Charles Stark Draper Laboratory, Inc., Cambridge, MA, USA
- Dava J Newman
- Massachusetts Institute of Technology, Cambridge, MA, USA
- John J West
- The Charles Stark Draper Laboratory, Inc., Cambridge, MA, USA
10
How do visual and postural cues combine for self-tilt perception during slow pitch rotations? Acta Psychol (Amst) 2014;153:51-59. PMID: 25299446; DOI: 10.1016/j.actpsy.2014.09.005.
Abstract
Self-orientation perception relies on the integration of multiple sensory inputs that convey spatially related visual and postural cues. In the present study, an experimental set-up was used to tilt the body and/or the visual scene to investigate how these postural and visual cues are integrated for self-tilt perception (the subjective sensation of being tilted). Participants were required to repeatedly rate a confidence level for self-tilt perception during slow (0.05°·s⁻¹) body and/or visual scene pitch tilts of up to 19° relative to vertical. Concurrently, subjects also had to perform arm reaching movements toward a body-fixed target at specific tilt angles. While performance of the concurrent motor task did not influence the main perceptual task, self-tilt detection did vary according to the visuo-postural stimuli. Slow forward or backward tilts of the visual scene alone did not induce a marked sensation of self-tilt, in contrast to actual body tilt. However, combined body and visual scene tilt influenced self-tilt perception more strongly, although this effect depended on the direction of visual scene tilt: only a forward visual scene tilt combined with a forward body tilt facilitated self-tilt detection. In such a case, visual scene tilt did not seem to induce vection but rather may have produced a deviation of the perceived orientation of the longitudinal body axis in the forward direction, which may have lowered the self-tilt detection threshold during actual forward body tilt.
11
Sarlegna FR, Mutha PK. The influence of visual target information on the online control of movements. Vision Res 2014;110:144-154. PMID: 25038472; DOI: 10.1016/j.visres.2014.07.001.
Abstract
The continuously changing properties of our environment require constant monitoring of our actions and updating of our motor commands based on the task goals. Such updating relies upon our predictions about the sensory consequences of our movement commands, as well as on sensory feedback received during movement execution. Here we focus on how visual information about target location is used to update and guide ongoing actions so that the task goal is successfully achieved. We review several studies that have manipulated vision of the target in a variety of ways, ranging from complete removal of visual target information to changes in visual target properties after movement onset, to examine how such changes are accounted for during motor execution. We also examine the specific role of a critical neural structure, the parietal cortex, and argue that a fundamental challenge for the future is to understand how visual information about target location is integrated with other streams of information during movement execution to estimate the state of the body and the environment and ensure optimal motor performance.
Affiliation(s)
- Pratik K Mutha
- Indian Institute of Technology Gandhinagar, Ahmedabad 382424, Gujarat, India