1. Ugolini G, Graf W. Pathways from the superior colliculus and the nucleus of the optic tract to the posterior parietal cortex in macaque monkeys: Functional frameworks for representation updating and online movement guidance. Eur J Neurosci 2024; 59:2792-2825. PMID: 38544445. DOI: 10.1111/ejn.16314.
Abstract
The posterior parietal cortex (PPC) integrates multisensory and motor-related information for generating and updating body representations and movement plans. We used retrograde transneuronal transfer of rabies virus combined with a conventional tracer in macaque monkeys to identify direct and disynaptic pathways to the arm-related rostral medial intraparietal area (MIP), the ventral lateral intraparietal area (LIPv), belonging to the parietal eye field, and the pursuit-related lateral subdivision of the medial superior temporal area (MSTl). We found that these areas receive major disynaptic pathways via the thalamus from the nucleus of the optic tract (NOT) and the superior colliculus (SC), mainly ipsilaterally. NOT pathways, targeting MSTl most prominently, serve to process the sensory consequences of slow eye movements for which the NOT is the key sensorimotor interface. They potentially contribute to the directional asymmetry of the pursuit and optokinetic systems. MSTl and LIPv receive feedforward inputs from SC visual layers, which are potential correlates for fast detection of motion, perceptual saccadic suppression and visual spatial attention. MSTl is the target of efference copy pathways from saccade- and head-related compartments of SC motor layers and head-related reticulospinal neurons. They are potential sources of extraretinal signals related to eye and head movement in MSTl visual-tracking neurons. LIPv and rostral MIP receive efference copy pathways from all SC motor layers, providing online estimates of eye, head and arm movements. Our findings have important implications for understanding the role of the PPC in representation updating, internal models for online movement guidance, eye-hand coordination and optic ataxia.
Affiliations
- Gabriella Ugolini: Paris-Saclay Institute of Neuroscience (NeuroPSI), UMR9197 CNRS - Université Paris-Saclay, Campus CEA Saclay, Saclay, France
- Werner Graf: Department of Physiology and Biophysics, Howard University, Washington, DC, USA
2. Kang JU, Mooshagian E, Snyder LH. Functional organization of posterior parietal cortex circuitry based on inferred information flow. Cell Rep 2024; 43:114028. PMID: 38581681. PMCID: PMC11090617. DOI: 10.1016/j.celrep.2024.114028.
Abstract
Many studies infer the role of neurons by asking what information can be decoded from their activity or by observing the consequences of perturbing their activity. An alternative approach is to consider information flow between neurons. We applied this approach to the parietal reach region (PRR) and the lateral intraparietal area (LIP) in posterior parietal cortex. Two complementary methods imply that across a range of reaching tasks, information flows primarily from PRR to LIP. This indicates that during a coordinated reach task, LIP has minimal influence on PRR and rules out the idea that LIP forms a general-purpose spatial processing hub for action and cognition. Instead, we conclude that PRR and LIP operate in parallel to plan arm and eye movements, respectively, with asymmetric interactions that likely support eye-hand coordination. Similar methods can be applied to other areas to infer their functional relationships from information flow.
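The directionality logic behind inferring information flow can be illustrated with a toy lead-lag cross-correlation analysis. This is a deliberate simplification of the paper's methods, not a reimplementation: the series names `prr` and `lip`, the 5-sample delay, and all parameter values below are synthetic stand-ins. A cross-correlation peak at a positive lag indicates that the first series leads the second.

```python
import random

def cross_correlation(x, y, max_lag):
    """Normalized cross-correlation of x and y at lags -max_lag..max_lag.
    A peak at a positive lag means x leads y by that many samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((v - mx) ** 2 for v in x) ** 0.5
    sy = sum((v - my) ** 2 for v in y) ** 0.5
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        pairs = [(x[t], y[t + lag]) for t in range(n) if 0 <= t + lag < n]
        out[lag] = sum((a - mx) * (b - my) for a, b in pairs) / (sx * sy)
    return out

rng = random.Random(0)
# Synthetic stand-in signals: "lip" echoes "prr" five samples later, plus noise.
prr = [rng.gauss(0, 1) for _ in range(2000)]
lip = [0.0] * 5 + [v + rng.gauss(0, 0.5) for v in prr[:-5]]

cc = cross_correlation(prr, lip, max_lag=20)
best = max(cc, key=cc.get)  # lag with the strongest correlation
```

With these synthetic series the correlation peaks at a positive lag, i.e., the `prr` stand-in leads the `lip` stand-in, which is the kind of asymmetry such analyses look for.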
Affiliations
- Jung Uk Kang: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
- Eric Mooshagian: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
- Lawrence H Snyder: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
3. Jana S, Gopal A, Murthy A. Computational Mechanisms Mediating Inhibitory Control of Coordinated Eye-Hand Movements. Brain Sci 2021; 11:607. PMID: 34068477. PMCID: PMC8150398. DOI: 10.3390/brainsci11050607.
Abstract
Significant progress has been made in understanding the computational and neural mechanisms that mediate eye and hand movements made in isolation. However, less is known about the mechanisms that control these movements when they are coordinated. Here, we outline our computational approaches using accumulation-to-threshold and race-to-threshold models to elucidate the mechanisms that initiate and inhibit these movements. We suggest that, depending on the behavioral context, the initiation and inhibition of coordinated eye-hand movements can operate in two modes: coupled and decoupled. The coupled mode operates when the task context requires a tight coupling between the effectors; a common command initiates both effectors, and a unitary inhibitory process is responsible for stopping them. Conversely, the decoupled mode operates when the task context demands weaker coupling between the effectors; separate commands initiate the eye and hand, and separate inhibitory processes are responsible for stopping them. We hypothesize that the higher-order control processes assess the behavioral context and choose the most appropriate mode. This computational mechanism can explain the heterogeneous results observed across many studies that have investigated the control of coordinated eye-hand movements and may also serve as a general framework to understand the control of complex multi-effector movements.
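The coupled vs. decoupled modes map naturally onto race-to-threshold simulations. Below is a minimal sketch, not the authors' model: all function names, rates, and noise values are illustrative assumptions. In coupled mode a single GO command and a single unitary STOP process govern both effectors, so eye and hand always share the same fate; in decoupled mode each effector runs its own GO/STOP race, so their outcomes can diverge.

```python
import random

THRESHOLD = 1.0   # decision threshold (arbitrary units)
DT = 0.005        # simulation step (s)
GO_RATE = 4.0     # mean drift of the GO accumulator (threshold units / s)
STOP_RATE = 8.0   # STOP accumulator drifts faster but starts later
NOISE = 0.5       # diffusion noise (illustrative value)

def accumulate(rate, start_t, t_end, rng):
    """Time at which a noisy accumulator starting at start_t crosses THRESHOLD, or None."""
    x, t = 0.0, start_t
    while t < t_end:
        x += rate * DT + rng.gauss(0.0, NOISE) * DT ** 0.5
        t += DT
        if x >= THRESHOLD:
            return t
    return None

def stop_trial(mode, stop_signal_t=0.15, t_end=1.0, rng=None):
    """Simulate one stop-signal trial; returns (eye_stopped, hand_stopped)."""
    rng = rng or random.Random()
    if mode == "coupled":
        go = accumulate(GO_RATE, 0.0, t_end, rng)                # one shared GO command
        stop = accumulate(STOP_RATE, stop_signal_t, t_end, rng)  # one unitary STOP process
        stopped = stop is not None and (go is None or stop < go)
        return stopped, stopped
    # Decoupled: independent GO and STOP races for each effector.
    outcomes = []
    for _effector in ("eye", "hand"):
        go = accumulate(GO_RATE, 0.0, t_end, rng)
        stop = accumulate(STOP_RATE, stop_signal_t, t_end, rng)
        outcomes.append(stop is not None and (go is None or stop < go))
    return tuple(outcomes)
```

The behavioral signature separating the modes is exactly this (in)dependence: coupled trials always stop or go as a unit, while decoupled trials can stop one effector and not the other.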
Affiliations
- Sumitash Jana: Department of Psychology, University of California San Diego, La Jolla, CA 92093, USA
- Atul Gopal: Laboratory of Sensorimotor Research, National Eye Institute, Bethesda, MD 20814, USA
- Aditya Murthy: Centre for Neuroscience, Indian Institute of Science, Bangalore, Karnataka 560012, India
4. de Brouwer AJ, Flanagan JR, Spering M. Functional Use of Eye Movements for an Acting System. Trends Cogn Sci 2021; 25:252-263. PMID: 33436307. DOI: 10.1016/j.tics.2020.12.006.
Abstract
Movements of the eyes assist vision and support hand and body movements in a cooperative way. Despite their strong functional coupling, different types of movements are usually studied independently. We integrate knowledge from behavioral, neurophysiological, and clinical studies on how eye movements are coordinated with goal-directed hand movements and how they facilitate motor learning. Understanding the coordinated control of eye and hand movements can provide important insights into brain functions that are essential for performing or learning daily tasks in health and disease. This knowledge can also inform applications such as robotic manipulation and clinical rehabilitation.
Affiliations
- Anouk J de Brouwer: Department of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, Canada
- J Randall Flanagan: Centre for Neuroscience Studies, Queen's University, Kingston, Canada; Department of Psychology, Queen's University, Kingston, Canada
- Miriam Spering: Department of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, Canada; Djavad Mowafaghian Centre for Brain Health, University of British Columbia, Vancouver, Canada
5. O'Rielly JL, Ma-Wyatt A. Saccade dynamics during an online updating task change with healthy aging. J Vis 2020; 20:2. PMID: 33270828. PMCID: PMC7718816. DOI: 10.1167/jov.20.13.2.
Abstract
Goal-directed movements rely on the integration of both visual and motor information, especially during the online control of movement, to fluidly and flexibly control coordinated action. Eye-hand coordination typically plays an important role in goal-directed movements. As people age, various aspects of motor control and visual performance decline (Haegerstrom-Portnoy, Schneck, & Brabyn, 1999; Seidler et al., 2010), including an increase in saccade latencies (Munoz, Broughton, Goldring, & Armstrong, 1998). However, there is limited insight into how age-related changes in saccadic performance impact eye-hand coordination during online control. We investigated this question through the use of a target perturbation paradigm. Older and younger participants completed a perturbation task where target perturbations could occur either early (0 ms) or later (200 ms) after reach onset. We analyzed reach correction latencies and the frequency of the reach correction, coupled with analyses of saccades across all stages of movement. Older participants had slower correction latencies and initiated corrections less frequently compared to younger participants, with this trend being exacerbated in the later (200 ms) target perturbation condition. Older participants also produced slower saccade latencies toward both the initial target and the perturbed target. For trials in which a correction occurred to a late perturbation, touch responses were more accurate when there was more time between the saccade landing and the touch. Altogether, our results suggest that these age-related effects may be due to the delayed acquisition of visual and oculomotor information used to inform the reaching movement, stemming from the increase in saccade latencies before and after target perturbation.
Affiliations
- Jessica L O'Rielly: School of Psychology, University of Adelaide, Adelaide, South Australia, Australia
- Anna Ma-Wyatt: School of Psychology, University of Adelaide, Adelaide, South Australia, Australia
6. Goettker A, Fiehler K, Voudouris D. Somatosensory target information is used for reaching but not for saccadic eye movements. J Neurophysiol 2020; 124:1092-1102. DOI: 10.1152/jn.00258.2020.
Abstract
A systematic investigation of the contributions of different somatosensory modalities (proprioception, kinesthesia, touch) to goal-directed movements has been missing. Here we demonstrate that while eye movements are not affected by the different types of somatosensory information, reach precision improves when two different types of information are available. Moreover, reach accuracy and gaze precision to unseen somatosensory targets improve when performing coordinated eye-hand movements, suggesting bidirectional contributions of efferent information to reach and eye movement control.
Affiliations
- Alexander Goettker: Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
- Katja Fiehler: Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany; Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University, Giessen, Germany
- Dimitris Voudouris: Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
7. Hadjidimitrakis K. Coupling of head and hand movements during eye-head-hand coordination: there is more to reaching than meets the eye. J Neurophysiol 2020; 123:1579-1582. PMID: 32233904. DOI: 10.1152/jn.00099.2020.
Abstract
Does arm reaching affect eye-head shifts? Does the head alter eye-hand coordinated movements? Sensorimotor research has focused on either eye-head or eye-hand coordination, with only occasional works studying all these effectors together. Arora et al. (Arora HK, Bharmauria V, Yan X, Sun S, Wang H, Crawford JD. J Neurophysiol 122: 1946-1961, 2019) examined eye-head-hand coordination for the first time in nonhuman primates and provide evidence suggesting that head and hand movements are more coupled than traditionally considered.
8. Battaglia-Mayer A. A Brief History of the Encoding of Hand Position by the Cerebral Cortex: Implications for Motor Control and Cognition. Cereb Cortex 2020; 29:716-731. PMID: 29373634. DOI: 10.1093/cercor/bhx354.
Abstract
Encoding hand position by the cerebral cortex is essential not only for the neural representation of the body image but also for different actions based on eye-hand coordination. These include reaching for visual objects as well as complex movement sequences, such as tea-making, tool use, and object construction, among many others. All these functions depend on a continuous refreshing of the hand position representation, relying on both predictive signaling and afferent information. The influence of hand position on neural activity in the parietofrontal system, together with eye position signals, provides the basic elements of an eye-hand matrix from which all the above functions can emerge; these could be regarded as key features of a network with several entry points, command nodes, and outflow pathways, as confirmed by the discovery of a direct parietospinal projection for the control of hand action. The integrity of this system is crucial for daily life, as testified by the consequences of cortical lesions, spanning from severe paralysis to complex forms of apraxia. In this review, I will sketch my personal understanding of the scientific and conceptual trajectory of a line of investigation with many unexpected influences on cortical function and disease, from motor behavior to cognition.
9. Open-Source Joystick Manipulandum for Decision-Making, Reaching, and Motor Control Studies in Mice. eNeuro 2020; 7:ENEURO.0523-19.2020. PMID: 32094292. PMCID: PMC7131984. DOI: 10.1523/eneuro.0523-19.2020.
Abstract
To make full use of optogenetic and molecular techniques in the study of motor control, rich behavioral paradigms for rodents must rise to the same level of sophistication and applicability. We describe the layout, construction, use and analysis of data from joystick-based reaching in a head-fixed mouse. The step-by-step guide is designed for both experienced rodent motor labs and new groups looking to enter into this research space. Using this platform, mice learn to consistently perform large, easily quantified reaches, including during a two-armed bandit probabilistic learning task. The metrics of performance (reach trajectory, amplitude, speed, duration, and inter-reach interval) can be used to quantify behavior or administer stimulation in closed loop with behavior. We provide a highly customizable, low-cost and reproducible open-source behavior training platform for studying motor control, decision-making, and reaching reaction time. The development of this software and hardware platform enables behavioral work to complement recent advances in rodents, while remaining accessible to smaller institutions and labs, thus providing a high-throughput method to study unexplored features of action selection, motivation, and value-based decisions.
Collapse
|
10. Su L, Chang CJ, Lynch N. Spike-Based Winner-Take-All Computation: Fundamental Limits and Order-Optimal Circuits. Neural Comput 2019; 31:2523-2561. PMID: 31614103. DOI: 10.1162/neco_a_01242.
Abstract
Winner-take-all (WTA) refers to the neural operation that selects a (typically small) group of neurons from a large neuron pool. It is conjectured to underlie many of the brain's fundamental computational abilities. However, not much is known about the robustness of a spike-based WTA network to the inherent randomness of the input spike trains. In this work, we consider a spike-based k-WTA model wherein n randomly generated input spike trains compete with each other based on their underlying firing rates and k winners are to be selected. We divide time evenly into slots of length 1 ms and model the n input spike trains as n independent Bernoulli processes. We analytically characterize the minimum waiting time needed so that a target minimax decision accuracy (success probability) can be reached. We first derive an information-theoretic lower bound on the waiting time. We show that to guarantee a (minimax) decision error ≤δ (where δ∈(0,1)), the waiting time of any WTA circuit is at least [Formula: see text], where R⊆(0,1) is a finite set of rates and TR is a difficulty parameter of a WTA task with respect to set R for independent input spike trains. Additionally, TR is independent of δ, n, and k. We then design a simple WTA circuit whose waiting time is [Formula: see text], provided that the local memory of each output neuron is sufficiently long. It turns out that for any fixed δ, this decision time is order-optimal (i.e., it matches the above lower bound up to a multiplicative constant factor) in terms of its scaling in n, k, and TR.
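The setup can be made concrete with a toy counting circuit. This is a naive sketch under assumed parameters, not the order-optimal circuit from the paper: each of n inputs is a Bernoulli process over 1-ms slots, and after a chosen waiting time the k inputs with the most observed spikes are declared winners. Accuracy grows with waiting time and shrinks as the rate gap between winners and losers narrows, which is the intuition behind the difficulty parameter TR.

```python
import random

def simulate_kwta(rates, k, n_slots, rng):
    """Count spikes from Bernoulli input trains over n_slots 1-ms slots, pick the top-k.
    Ties are broken by input index; a deliberately naive winner-selection rule."""
    counts = [sum(rng.random() < r for _ in range(n_slots)) for r in rates]
    return set(sorted(range(len(rates)), key=lambda i: counts[i], reverse=True)[:k])

def success_probability(rates, k, n_slots, trials=200, seed=0):
    """Fraction of trials in which counting recovers the k truly highest-rate inputs."""
    rng = random.Random(seed)
    truth = set(sorted(range(len(rates)), key=lambda i: rates[i], reverse=True)[:k])
    hits = sum(simulate_kwta(rates, k, n_slots, rng) == truth for _ in range(trials))
    return hits / trials
```

For a wide rate gap (e.g., one input at 0.9 spikes/slot against distractors at 0.1) a short waiting time already suffices, whereas a narrow gap (0.55 vs. 0.45) needs far more slots to reach the same accuracy, mirroring the waiting-time/difficulty trade-off the bounds formalize.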
Affiliations
- Lili Su: Computer Science and Artificial Intelligence Laboratory, MIT, Cambridge, MA 02142, USA
- Chia-Jung Chang: Brain and Cognitive Sciences, MIT, Cambridge, MA 02142, USA
- Nancy Lynch: Computer Science and Artificial Intelligence Laboratory, MIT, Cambridge, MA 02142, USA
11. Freedman DJ, Ibos G. An Integrative Framework for Sensory, Motor, and Cognitive Functions of the Posterior Parietal Cortex. Neuron 2019; 97:1219-1234. PMID: 29566792. DOI: 10.1016/j.neuron.2018.01.044.
Abstract
Throughout the history of modern neuroscience, the parietal cortex has been associated with a wide array of sensory, motor, and cognitive functions. The use of non-human primates as a model organism has been instrumental in our current understanding of how areas in the posterior parietal cortex (PPC) modulate our perception and influence our behavior. In this Perspective, we highlight a series of influential studies over the last five decades examining the role of the PPC in visual perception and motor planning. We also integrate long-standing views of PPC functions with more recent evidence to propose a more general model framework to explain integrative sensory, motor, and cognitive functions of the PPC.
Affiliations
- David J Freedman: Department of Neurobiology, The University of Chicago, Chicago, IL 60637, USA; Grossman Institute for Neuroscience, Quantitative Biology and Human Behavior, The University of Chicago, Chicago, IL 60637, USA
- Guilhem Ibos: Department of Neurobiology, The University of Chicago, Chicago, IL 60637, USA; Institut de Neuroscience de la Timone, UMR 7289 CNRS & Aix-Marseille Université, Marseille, France
12. Malienko A, Harrar V, Khan AZ. Contrasting effects of exogenous cueing on saccades and reaches. J Vis 2018; 18:4. DOI: 10.1167/18.9.4.
Affiliations
- Anton Malienko: Vision, Attention and Action Laboratory (VISATTAC), School of Optometry, University of Montreal, Montreal, Quebec, Canada
- Vanessa Harrar: Vision, Attention and Action Laboratory (VISATTAC), School of Optometry, University of Montreal, Montreal, Quebec, Canada
- Aarlenne Z. Khan: Vision, Attention and Action Laboratory (VISATTAC), School of Optometry, University of Montreal, Montreal, Quebec, Canada
13. Independent selection of eye and hand targets suggests effector-specific attentional mechanisms. Sci Rep 2018; 8:9434. PMID: 29930389. PMCID: PMC6013452. DOI: 10.1038/s41598-018-27723-4.
Abstract
Both eye and hand movements bind visual attention to their target locations during movement preparation. However, it remains contentious whether eye and hand targets are selected jointly by a single selection system, or individually by independent systems. To unravel the controversy, we investigated the deployment of visual attention – a proxy of motor target selection – in coordinated eye-hand movements. Results show that attention builds up in parallel both at the eye and the hand target. Importantly, the allocation of attention to one effector’s motor target was not affected by the concurrent preparation of the other effector’s movement at any time during movement preparation. This demonstrates that eye and hand targets are represented in separate, effector-specific maps of action-relevant locations. The eye-hand synchronisation that is frequently observed on the behavioral level must emerge from mutual influences of the two effector systems at later, post-attentional processing stages.
14. Lateral intraparietal area (LIP) is largely effector-specific in free-choice decisions. Sci Rep 2018; 8:8611. PMID: 29872059. PMCID: PMC5988653. DOI: 10.1038/s41598-018-26366-9.
Abstract
Despite many years of intense research, there is no strong consensus about the role of the lateral intraparietal area (LIP) in decision making. One view of LIP function is that it guides spatial attention, providing a “saliency map” of the external world. If this were the case, it would contribute to target selection regardless of which action would be performed to implement the choice. On the other hand, LIP inactivation has been shown to influence spatial selection and oculomotor metrics in free-choice decisions, which are made using eye movements, arguing that it contributes to saccade decisions. To dissociate between a more general attention role and a more effector-specific saccade role, we reversibly inactivated LIP while non-human primates freely selected between two targets, presented in the two hemifields, with either saccades or reaches. Unilateral LIP inactivation induced a strong choice bias to ipsilesional targets when decisions were made with saccades. Interestingly, the inactivation also caused a reduction of contralesional choices when decisions were made with reaches, although the effect was less pronounced. These findings suggest that LIP is part of a network for making oculomotor decisions and is largely effector-specific in free-choice decisions.
15. O'Rielly JL, Ma-Wyatt A. Changes to online control and eye-hand coordination with healthy ageing. Hum Mov Sci 2018; 59:244-257. PMID: 29747069. DOI: 10.1016/j.humov.2018.04.013.
Abstract
Goal-directed movements are typically accompanied by a saccade to the target location. Online control plays an important part in the correction of a reach, especially if the target or goal of the reach moves during the reach. While there are notable changes to visual processing and motor control with healthy ageing, there is limited evidence about how eye-hand coordination during online updating changes with healthy ageing. We sought to quantify differences between older and younger people in eye-hand coordination during online updating. Participants completed a double-step reaching task implemented under time pressure. The target perturbation could occur 200, 400 and 600 ms into a reach. We measured eye position and hand position throughout the trials to investigate changes to saccade latency, movement latency, movement time, reach characteristics and eye-hand latency and accuracy. Both groups were able to update their reach in response to a target perturbation that occurred at 200 or 400 ms into the reach. All participants demonstrated incomplete online updating for the 600 ms perturbation time. Saccade latencies, measured from the first target presentation, were generally longer for older participants. Older participants had significantly increased movement times but there was no significant difference between groups for touch accuracy. We speculate that the longer movement times enable the use of new visual information about the target location for online updating towards the end of the movement. Interestingly, older participants also produced a greater proportion of secondary saccades within the target perturbation condition and had generally shorter eye-hand latencies. This is perhaps a compensatory mechanism, as there was no significant group effect on final saccade accuracy. Overall, the pattern of results suggests that online control of movements may be qualitatively different in older participants.
Affiliations
- Anna Ma-Wyatt: School of Psychology, University of Adelaide, SA 5005, Australia
16. Spatial eye-hand coordination during bimanual reaching is not systematically coded in either LIP or PRR. Proc Natl Acad Sci U S A 2018; 115:E3817-E3826. PMID: 29610356. PMCID: PMC5910835. DOI: 10.1073/pnas.1718267115.
Abstract
When we reach for something, we also look at it. If we reach for two objects at once, one with each hand, we look first at one and then the other. It is not known which brain areas underlie this coordination. We studied two parietal areas known to be involved in eye and arm movements. Neither area was sensitive to the order in which the targets were looked at. This implies that coordinated saccades are driven by downstream areas and not by the parietal cortex as is commonly assumed.

We often orient to where we are about to reach. Spatial and temporal correlations in eye and arm movements may depend on the posterior parietal cortex (PPC). Spatial representations of saccade and reach goals preferentially activate cells in the lateral intraparietal area (LIP) and the parietal reach region (PRR), respectively. With unimanual reaches, eye and arm movement patterns are highly stereotyped. This makes it difficult to study the neural circuits involved in coordination. Here, we employ bimanual reaching to two different targets. Animals naturally make a saccade first to one target and then the other, resulting in different patterns of limb–gaze coordination on different trials. Remarkably, neither LIP nor PRR cells code which target the eyes will move to first. These results suggest that the parietal cortex plays at best only a permissive role in some aspects of eye–hand coordination and makes the role of LIP in saccade generation unclear.
17. Cortical Afferents and Myeloarchitecture Distinguish the Medial Intraparietal Area (MIP) from Neighboring Subdivisions of the Macaque Cortex. eNeuro 2017; 4:eN-NWR-0344-17. PMID: 29379868. PMCID: PMC5779118. DOI: 10.1523/eneuro.0344-17.2017.
Abstract
The parietal reach region (PRR) in the medial bank of the macaque intraparietal sulcus has been a subject of considerable interest in research aimed at the development of brain-controlled prosthetic arms, but its anatomical organization remains poorly characterized. We examined the anatomical organization of the putative PRR territory based on myeloarchitecture and retrograde tracer injections. We found that the medial bank includes three areas: an extension of the dorsal subdivision of V6A (V6Ad), the medial intraparietal area (MIP), and a subdivision of area PE (PEip). Analysis of corticocortical connections revealed that both V6Ad and MIP receive inputs from visual area V6; the ventral subdivision of V6A (V6Av); medial (PGm, 31), superior (PEc), and inferior (PFG/PF) parietal association areas; and intraparietal areas AIP and VIP. They also receive long-range projections from the superior temporal sulcus (MST, TPO), cingulate area 23, and the dorsocaudal (area F2) and ventral (areas F4/F5) premotor areas. In comparison with V6Ad, MIP receives denser input from somatosensory areas, the primary motor cortex, and the medial motor fields, as well as from visual cortex in the ventral precuneate cortex and frontal regions associated with oculomotor guidance. Unlike MIP, V6Ad receives stronger visual input, from the caudal inferior parietal cortex (PG/Opt) and V6Av, whereas PEip shows marked emphasis on anterior parietal, primary motor, and ventral premotor connections. These anatomical results suggest that MIP and V6A have complementary roles in sensorimotor behavior, with MIP more directly involved in movement planning and execution in comparison with V6A.
18.
19. Kreyenmeier P, Fooken J, Spering M. Context effects on smooth pursuit and manual interception of a disappearing target. J Neurophysiol 2017; 118:404-415. PMID: 28515287. DOI: 10.1152/jn.00217.2017.
Abstract
In our natural environment, we interact with moving objects that are surrounded by richly textured, dynamic visual contexts. Yet most laboratory studies on vision and movement show visual objects in front of uniform gray backgrounds. Context effects on eye movements have been widely studied, but it is less well known how visual contexts affect hand movements. Here we ask whether eye and hand movements integrate motion signals from target and context similarly or differently, and whether context effects on eye and hand change over time. We developed a track-intercept task requiring participants to track the initial launch of a moving object ("ball") with smooth pursuit eye movements. The ball disappeared after a brief presentation, and participants had to intercept it in a designated "hit zone." In two experiments (n = 18 human observers each), the ball was shown in front of a uniform or a textured background that either was stationary or moved along with the target. Eye and hand movement latencies and speeds were similarly affected by the visual context, but eye and hand interception (eye position at time of interception, and hand interception timing error) did not differ significantly between context conditions. Eye and hand interception timing errors were strongly correlated on a trial-by-trial basis across all context conditions, highlighting the close relation between these responses in manual interception tasks. Our results indicate that visual contexts similarly affect eye and hand movements but that these effects may be short-lasting, affecting movement trajectories more than movement end points.NEW & NOTEWORTHY In a novel track-intercept paradigm, human observers tracked a briefly shown object moving across a textured, dynamic context and intercepted it with their finger after it had disappeared. 
Context motion significantly affected eye and hand movement latency and speed, but not interception accuracy; eye and hand position at interception were correlated on a trial-by-trial basis. Visual context effects may be short-lasting, affecting movement trajectories more than movement end points.
Affiliation(s)
- Philipp Kreyenmeier
- Department of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, Canada; Graduate Program in Neuro-Cognitive Psychology, Ludwig Maximilian University, Munich, Germany
- Jolande Fooken
- Department of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, Canada; Graduate Program in Neuroscience, University of British Columbia, Vancouver, Canada
- Miriam Spering
- Department of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, Canada; Graduate Program in Neuroscience, University of British Columbia, Vancouver, Canada; Center for Brain Health, University of British Columbia, Vancouver, Canada; Institute for Information, Computing and Cognitive Systems, University of British Columbia, Vancouver, Canada; International Collaboration on Repair Discoveries, Vancouver, Canada

20
Neromyliotis E, Moschovakis AK. Response Properties of Motor Equivalence Neurons of the Primate Premotor Cortex. Front Behav Neurosci 2017; 11:61. [PMID: 28446867 PMCID: PMC5388740 DOI: 10.3389/fnbeh.2017.00061] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/25/2017] [Accepted: 03/27/2017] [Indexed: 11/23/2022] Open
Abstract
To study the response properties of cells that could participate in eye-hand coordination, we trained two macaque monkeys to perform center-out saccades and pointing movements with their right or left forelimb toward visual targets presented on a video display. We analyzed the phasic movement-related discharges of periarcuate cortex neurons that fire before and during saccades and movements of the hand, whether accompanied by movements of the other effector or not. Because such cells could encode an abstract form of the desired displacement vector without regard to the effector that would execute the movement, we refer to them as motor equivalence neurons (Meq). Most of them (75%) were found in or near the smooth pursuit region and the grasp-related region in the caudal bank of the arcuate sulcus. The onset of their phasic discharges preceded saccades by about 70 ms and hand movements by about 150 ms, and was often correlated with both the onset of saccades and the onset of hand movements. The on-direction of Meq cells was uniformly distributed, without preference for ipsiversive or contraversive movements. In about half of the Meq cells, the preferred direction for saccades was the preferred direction for hand movements as well. In the remaining cells the difference was considerable (>90 deg), and the on-direction for eye-hand movements resembled that for isolated saccades in some cells and for isolated hand movements in others. A three-layer neural network model that used Meq cells as its input layer showed that the combination of effector-invariant discharges with non-invariant discharges could help reduce the number of decoding errors when the network attempts to compute the correct movement metrics of the right effector.
Affiliation(s)
- Eleftherios Neromyliotis
- Institute of Applied and Computational Mathematics, Foundation for Research and Technology, Heraklion, Greece; Department of Basic Sciences, Faculty of Medicine, University of Crete, Heraklion, Greece
- A K Moschovakis
- Institute of Applied and Computational Mathematics, Foundation for Research and Technology, Heraklion, Greece; Department of Basic Sciences, Faculty of Medicine, University of Crete, Heraklion, Greece

21
Computational Architecture of the Parieto-Frontal Network Underlying Cognitive-Motor Control in Monkeys. eNeuro 2017; 4:eN-NWR-0306-16. [PMID: 28275714 PMCID: PMC5329620 DOI: 10.1523/eneuro.0306-16.2017] [Citation(s) in RCA: 50] [Impact Index Per Article: 7.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/10/2016] [Revised: 01/31/2017] [Accepted: 02/01/2017] [Indexed: 11/21/2022] Open
Abstract
The statistical structure of intrinsic parietal and parieto-frontal connectivity in monkeys was studied through hierarchical cluster analysis. Based on their inputs, parietal and frontal areas were grouped into different clusters, including a variable number of areas that in most instances occupied contiguous architectonic fields. Connectivity tended to be stronger locally: that is, within areas of the same cluster. Distant frontal and parietal areas were targeted through connections that in most instances were reciprocal and often of different strength. These connections linked parietal and frontal clusters formed by areas sharing basic functional properties. This led to five different medio-laterally oriented pillar domains spanning the entire extent of the parieto-frontal system, in the posterior parietal, anterior parietal, cingulate, frontal, and prefrontal cortex. Different information processing streams could be identified thanks to inter-domain connectivity. These streams encode fast hand reaching and its control, complex visuomotor action spaces, hand grasping, action/intention recognition, oculomotor intention and visual attention, behavioral goals and strategies, and reward and decision value outcome. Most of these streams converge on the cingulate domain, the main hub of the system. All of them are embedded within a larger eye–hand coordination network, from which they can be selectively set in motion by task demands.
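The input-based grouping of areas can be illustrated with a toy version of this kind of analysis. In the sketch below, the area names and connection weights are invented for illustration (the study applied formal hierarchical cluster analysis to tracer-derived connection strengths); it simply merges areas whose input profiles correlate strongly:

```python
def corr(a, b):
    """Pearson correlation between two input profiles."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def cluster_by_inputs(profiles, threshold=0.8):
    """Toy single-linkage agglomeration: two clusters merge whenever any
    pair of their members has input-profile correlation above threshold."""
    clusters = [{a} for a in profiles]
    merged = True
    while merged:
        merged = False
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                if any(corr(profiles[a], profiles[b]) > threshold
                       for a in clusters[i] for b in clusters[j]):
                    clusters[i] |= clusters.pop(j)
                    merged = True
                    break
            if merged:
                break
    return clusters

# Hypothetical input profiles (rows of a connectivity matrix; weights invented):
profiles = {
    "MIP":  [5, 1, 0, 4],
    "V6Ad": [4, 2, 0, 5],
    "F2":   [0, 5, 4, 1],
    "F4":   [1, 4, 5, 0],
}
```

With these made-up weights, `cluster_by_inputs(profiles)` groups the two parietal areas together and the two premotor areas together, the same logic by which areas with shared input patterns end up in one cluster.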
22
Chen J, Valsecchi M, Gegenfurtner KR. Role of motor execution in the ocular tracking of self-generated movements. J Neurophysiol 2016; 116:2586-2593. [PMID: 27628207 DOI: 10.1152/jn.00574.2016] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/15/2016] [Accepted: 09/09/2016] [Indexed: 11/22/2022] Open
Abstract
When human observers track the movements of their own hand with their gaze, the eyes can start moving before the finger (i.e., anticipatory smooth pursuit). The signals driving anticipation could come from motor commands during finger motor execution or from motor intention and decision processes associated with self-initiated movements. For the present study, we built a mechanical device that could move a visual target either in the same direction as the participant's hand or in the opposite direction. Gaze pursuit of the target showed stronger anticipation if it moved in the same direction as the hand compared with the opposite direction, as evidenced by decreased pursuit latency, increased positional lead of the eye relative to target, increased pursuit gain, decreased saccade rate, and decreased delay at the movement reversal. Some degree of anticipation occurred for incongruent pursuit, indicating that there is a role for higher-level movement prediction in pursuit anticipation. The fact that anticipation was larger when target and finger moved in the same direction provides evidence for a direct coupling between finger and eye motor commands.
Affiliation(s)
- Jing Chen
- Abteilung Allgemeine Psychologie, Justus-Liebig-Universität Giessen, Giessen, Germany
- Matteo Valsecchi
- Abteilung Allgemeine Psychologie, Justus-Liebig-Universität Giessen, Giessen, Germany
- Karl R Gegenfurtner
- Abteilung Allgemeine Psychologie, Justus-Liebig-Universität Giessen, Giessen, Germany

23
Gopal A, Murthy A. Eye-hand coordination during a double-step task: evidence for a common stochastic accumulator. J Neurophysiol 2015; 114:1438-54. [PMID: 26084906 PMCID: PMC4556852 DOI: 10.1152/jn.00276.2015] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/16/2015] [Accepted: 06/15/2015] [Indexed: 11/22/2022] Open
Abstract
Many studies of reaching and pointing have shown significant spatial and temporal correlations between eye and hand movements. Nevertheless, it remains unclear whether these correlations are incidental, arising from common inputs (independent model); whether these correlations represent an interaction between otherwise independent eye and hand systems (interactive model); or whether these correlations arise from a single dedicated eye-hand system (common command model). Subjects were instructed to redirect gaze and pointing movements in a double-step task in an attempt to decouple eye-hand movements and causally distinguish between the three architectures. We used a drift-diffusion framework in the context of a race model, which has been previously used to explain redirect behavior for eye and hand movements separately, to predict the pattern of eye-hand decoupling. We found that the common command architecture could best explain the observed frequency of different eye and hand response patterns to the target step. A common stochastic accumulator for eye-hand coordination also predicts comparable variances, despite significant difference in the means of the eye and hand reaction time (RT) distributions, which we tested. Consistent with this prediction, we observed that the variances of the eye and hand RTs were similar, despite much larger hand RTs (∼90 ms). Moreover, changes in mean eye RTs, which also increased eye RT variance, produced a similar increase in mean and variance of the associated hand RT. Taken together, these data suggest that a dedicated circuit underlies coordinated eye-hand planning.
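The common-command idea lends itself to a simple simulation. In this toy sketch (all parameter values are invented for illustration, not fitted to the paper's data), a single noisy drift-diffusion process drives both effectors; each effector adds a fixed efferent delay (the hand's about 90 ms longer) plus small independent motor jitter. This reproduces the qualitative signature reported here: different RT means, comparable RT variances, and strong trial-by-trial correlation.

```python
import random

def simulate_common_accumulator(n_trials=2000, drift=0.02, noise=0.12,
                                thresh=1.0, eye_delay=60.0, hand_delay=150.0,
                                motor_jitter=5.0, dt=1.0, seed=0):
    """Toy 'common command' model (illustrative parameters, not fitted).

    One shared drift-diffusion decision variable crosses one threshold;
    both effectors inherit that crossing time plus a fixed efferent
    delay and a little independent motor noise.
    """
    rng = random.Random(seed)
    eye_rts, hand_rts = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while x < thresh:
            t += dt
            x += drift * dt + noise * rng.gauss(0.0, 1.0) * dt ** 0.5
        eye_rts.append(t + eye_delay + rng.gauss(0.0, motor_jitter))
        hand_rts.append(t + hand_delay + rng.gauss(0.0, motor_jitter))
    return eye_rts, hand_rts
</antml>```

Because the stochastic accumulation time is shared and only constant delays differ, the simulated hand RTs exceed eye RTs by ~90 ms on average while the two RT distributions have nearly identical variances, mirroring the pattern the authors observed.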
Affiliation(s)
- Atul Gopal
- National Brain Research Centre, Manesar, Haryana, India
- Aditya Murthy
- Centre for Neuroscience, Indian Institute of Science, Bangalore, Karnataka, India

24
Caminiti R, Innocenti GM, Battaglia-Mayer A. Organization and evolution of parieto-frontal processing streams in macaque monkeys and humans. Neurosci Biobehav Rev 2015; 56:73-96. [PMID: 26112130 DOI: 10.1016/j.neubiorev.2015.06.014] [Citation(s) in RCA: 56] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/02/2015] [Revised: 05/08/2015] [Accepted: 06/09/2015] [Indexed: 01/01/2023]
Abstract
The functional organization of the parieto-frontal system is crucial for understanding cognitive-motor behavior and provides the basis for interpreting the consequences of parietal lesions in humans from a neurobiological perspective. The parieto-frontal connectivity defines some main information streams that, rather than being devoted to restricted functions, underlie a rich behavioral repertoire. Surprisingly, from macaques to humans, evolution has added only a few new functional streams, increasing however their complexity and encoding power. In fact, the characterization of the conduction times of parietal and frontal areas to different target structures has recently opened a new window on cortical dynamics, suggesting that evolution has amplified the probability of dynamic interactions between the nodes of the network, thanks to communication patterns based on temporally-dispersed conduction delays. This might allow the representation of sensory-motor signals within multiple neural assemblies and reference frames, so as to optimize sensory-motor remapping within an action space characterized by different and more complex demands across evolution.
Affiliation(s)
- Roberto Caminiti
- Department of Physiology and Pharmacology, University of Rome SAPIENZA, P.le Aldo Moro 5, 00185 Rome, Italy
- Giorgio M Innocenti
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden; Brain and Mind Institute, Federal Institute of Technology, EPFL, Lausanne, Switzerland
- Alexandra Battaglia-Mayer
- Department of Physiology and Pharmacology, University of Rome SAPIENZA, P.le Aldo Moro 5, 00185 Rome, Italy

25
Battaglia-Mayer A, Ferrari-Toniolo S, Visco-Comandini F. Timing and communication of parietal cortex for visuomotor control. Curr Opin Neurobiol 2015; 33:103-9. [PMID: 25841091 DOI: 10.1016/j.conb.2015.03.005] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/25/2014] [Revised: 02/27/2015] [Accepted: 03/10/2015] [Indexed: 11/30/2022]
Abstract
In both monkeys and humans, motor cognition emerges from a parietal-frontal network containing discrete dominant domains of visual, eye and hand signals, where neurons are responsible for goal and effector selection. Within these domains, the combination of different inputs shapes the tuning properties of neurons, while local and long-range cortico-cortical connections outline the architecture of the distributed network and determine the conduction times underlying eye-hand coordination, which is necessary for visually guided operations in the action space. The analysis of communication timing between parietal and frontal nodes of the network helps in understanding the sensorimotor cortical delays associated with different functions, such as online control of movement and eye-hand coordination, and opens a new perspective on the study of parieto-frontal interactions.
Affiliation(s)
- Alexandra Battaglia-Mayer
- Department of Physiology and Pharmacology, SAPIENZA University of Rome, P.le Aldo Moro 5, 00185 Rome, Italy
- Simone Ferrari-Toniolo
- Department of Physiology and Pharmacology, SAPIENZA University of Rome, P.le Aldo Moro 5, 00185 Rome, Italy
- Federica Visco-Comandini
- Department of Physiology and Pharmacology, SAPIENZA University of Rome, P.le Aldo Moro 5, 00185 Rome, Italy

26
Chang SWC, Calton JL, Lawrence BM, Dickinson AR, Snyder LH. Region-Specific Summation Patterns Inform the Role of Cortical Areas in Selecting Motor Plans. Cereb Cortex 2015; 26:2154-66. [PMID: 25778345 DOI: 10.1093/cercor/bhv047] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022] Open
Abstract
Given an instruction regarding which effector to move and what location to move to, simply adding the effector and spatial signals together will not lead to movement selection. For this, a nonlinearity is required. Thresholds, for example, can be used to select a particular response and reject others. Here we consider another useful nonlinearity, a supralinear multiplicative interaction. To help select a motor plan, spatial and effector signals could multiply and thereby amplify each other. Such an amplification could constitute one step within a distributed network involved in response selection, effectively boosting one response while suppressing others. We therefore asked whether effector and spatial signals sum supralinearly for planning eye versus arm movements from the parietal reach region (PRR), the lateral intraparietal area (LIP), the frontal eye field (FEF), and a portion of area 5 (A5) lying just anterior to PRR. Unlike LIP neurons, PRR, FEF, and, to a lesser extent, A5 neurons show a supralinear interaction. Our results suggest that selecting visually guided eye versus arm movements is likely to be mediated by PRR and FEF but not LIP.
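The additive-versus-supralinear distinction can be made concrete with a toy rate model. In this hypothetical sketch (function names, parameters, and numbers are illustrative, not the paper's actual statistic), a gain-field-like neuron whose spatial and effector inputs interact multiplicatively exceeds the linear prediction formed from its single-signal responses:

```python
def summation_index(r_combined, r_spatial, r_effector, r_baseline=0.0):
    """Combined response minus the linear prediction (baseline plus the
    two single-signal responses above baseline). Positive values indicate
    supralinear summation. A generic illustrative index, not the paper's
    exact metric."""
    linear = r_baseline + (r_spatial - r_baseline) + (r_effector - r_baseline)
    return r_combined - linear

def multiplicative_neuron(spatial, effector, gain=0.5, base=2.0):
    """Hypothetical firing rate with a multiplicative (gain-field-like)
    interaction between spatial and effector signals."""
    return base + spatial + effector + gain * spatial * effector
</antml>```

For this toy neuron, probing it with each signal alone and then both together recovers exactly the multiplicative term `gain * spatial * effector` as the supralinear excess, whereas a purely additive neuron (`gain=0`) yields an index of zero, the LIP-like case.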
Affiliation(s)
- Steve W C Chang
- Department of Psychology, Yale University, New Haven, CT 06511, USA; Department of Neurobiology, Yale University School of Medicine, New Haven, CT 06520, USA
- Jeffrey L Calton
- Department of Psychology, Sacramento State University, Sacramento, CA 95819, USA
- Bonnie M Lawrence
- Department of Psychology, New York University, New York, NY 10003, USA
- Anthony R Dickinson
- Department of Anatomy and Neurobiology, Washington University in St Louis School of Medicine, St Louis, MO 63110, USA
- Lawrence H Snyder
- Department of Anatomy and Neurobiology, Washington University in St Louis School of Medicine, St Louis, MO 63110, USA

27
Abstract
Parietal cortex is central to spatial cognition. Lesions of parietal cortex often lead to hemispatial neglect, an impairment of choices of targets in space. It has been unclear whether parietal cortex implements target choice at the general cognitive level, or whether parietal cortex subserves the choice of targets of particular actions. To address this question, monkeys engaged in choice tasks in two distinct action contexts--eye movements and arm movements. We placed focused reversible lesions into specific parietal circuits using the GABAA receptor agonist muscimol and validated the lesion placement using MRI. We found that lesions on the lateral bank of the intraparietal sulcus [lateral intraparietal area (LIP)] specifically biased choices made using eye movements, whereas lesions on the medial bank of the intraparietal sulcus [parietal reach region (PRR)] specifically biased choices made using arm movements. This double dissociation suggests that target choice is implemented in dedicated parietal circuits in the context of specific actions. This finding emphasizes a motor role of parietal cortex in spatial choice making and contributes to our understanding of hemispatial neglect.
28
A learning-based approach to artificial sensory feedback leads to optimal integration. Nat Neurosci 2014; 18:138-44. [PMID: 25420067 PMCID: PMC4282864 DOI: 10.1038/nn.3883] [Citation(s) in RCA: 124] [Impact Index Per Article: 12.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/16/2014] [Accepted: 10/27/2014] [Indexed: 11/08/2022]
Abstract
Proprioception—the sense of the body’s position in space—plays an important role in natural movement planning and execution and will likewise be necessary for successful motor prostheses and Brain–Machine Interfaces (BMIs). Here, we demonstrated that monkeys could learn to use an initially unfamiliar multi–channel intracortical microstimulation (ICMS) signal, which provided continuous information about hand position relative to an unseen target, to complete accurate reaches. Furthermore, monkeys combined this artificial signal with vision to form an optimal, minimum–variance estimate of relative hand position. These results demonstrate that a learning–based approach can be used to provide a rich artificial sensory feedback signal, suggesting a new strategy for restoring proprioception to patients using BMIs as well as a powerful new tool for studying the adaptive mechanisms of sensory integration.
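The minimum-variance combination rule the monkeys approximated is standard cue-integration math: each cue is weighted by its inverse variance (its reliability), and the combined variance is (1/σ²_vision + 1/σ²_ICMS)⁻¹, always at or below the better single cue. A minimal sketch of that rule (a generic illustration, not the study's analysis code):

```python
def optimal_estimate(x_vision, var_vision, x_icms, var_icms):
    """Minimum-variance (maximum-likelihood) fusion of two independent,
    unbiased cues: weight each by its inverse variance."""
    w_vision = (1.0 / var_vision) / (1.0 / var_vision + 1.0 / var_icms)
    x_hat = w_vision * x_vision + (1.0 - w_vision) * x_icms
    var_hat = 1.0 / (1.0 / var_vision + 1.0 / var_icms)
    return x_hat, var_hat
</antml>```

For example, two equally reliable cues (variances 4 and 4) at positions 1 and 3 yield the midpoint estimate 2 with combined variance 2, half that of either cue alone, which is the benefit the monkeys' reach endpoints exhibited.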
29
Abstract
Coordinated eye movements are crucial for precision control of our hands. A commonly believed neural mechanism underlying eye-hand coordination is interaction between the neural networks controlling each effector, exchanging, and matching information, such as movement target location and onset time. Alternatively, eye-hand coordination may result simply from common inputs to independent eye and hand control pathways. Thus far, it remains unknown whether and where either of these two possible mechanisms exists. A candidate location for the former mechanism, interpathway communication, includes the posterior parietal cortex (PPC) where distinct effector-specific areas reside. If the PPC were within the network for eye-hand coordination, perturbing it would affect both eye and hand movements that are concurrently planned. In contrast, if eye-hand coordination arises solely from common inputs, perturbing one effector pathway, e.g., the parietal reach region (PRR), would not affect the other effector. To test these hypotheses, we inactivated part of PRR in the macaque, located in the medial bank of the intraparietal sulcus encompassing the medial intraparietal area and area 5V. When each effector moved alone, PRR inactivation shortened reach but not saccade amplitudes, compatible with the known reach-selective activity of PRR. However, when both effectors moved concurrently, PRR inactivation shortened both reach and saccade amplitudes, and decoupled their reaction times. Therefore, consistent with the interpathway communication hypothesis, we propose that the planning of concurrent eye and hand movements causes the spatial information in PRR to influence the otherwise independent eye control pathways, and that their temporal coupling requires an intact PRR.
30
Andersen RA, Andersen KN, Hwang EJ, Hauschild M. Optic ataxia: from Balint's syndrome to the parietal reach region. Neuron 2014; 81:967-983. [PMID: 24607223 DOI: 10.1016/j.neuron.2014.02.025] [Citation(s) in RCA: 81] [Impact Index Per Article: 8.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 02/18/2014] [Indexed: 01/10/2023]
Abstract
Optic ataxia is a high-order deficit in reaching to visual goals that occurs with posterior parietal cortex (PPC) lesions. It is a component of Balint's syndrome that also includes attentional and gaze disorders. Aspects of optic ataxia are misreaching in the contralesional visual field, difficulty preshaping the hand for grasping, and an inability to correct reaches online. Recent research in nonhuman primates (NHPs) suggests that many aspects of Balint's syndrome and optic ataxia are a result of damage to specific functional modules for reaching, saccades, grasp, attention, and state estimation. The deficits from large lesions in humans are probably composite effects from damage to combinations of these functional modules. Interactions between these modules, either within posterior parietal cortex or downstream within frontal cortex, may account for more complex behaviors such as hand-eye coordination and reach-to-grasp.
Affiliation(s)
- Richard A Andersen
- Division of Biology and Biological Engineering, California Institute of Technology, Pasadena, CA 91125, USA
- Kristen N Andersen
- Departments of Neurology and Pediatrics, University of California, Los Angeles Medical Center, Los Angeles, CA 90095, USA
- Eun Jung Hwang
- Division of Biological Sciences, University of California, San Diego, La Jolla, CA 92093, USA
- Markus Hauschild
- Division of Biology and Biological Engineering, California Institute of Technology, Pasadena, CA 91125, USA

31
Yttri EA, Wang C, Liu Y, Snyder LH. The parietal reach region is limb specific and not involved in eye-hand coordination. J Neurophysiol 2013; 111:520-32. [PMID: 24198328 DOI: 10.1152/jn.00058.2013] [Citation(s) in RCA: 38] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Primates frequently reach toward visual targets. Neurons in early visual areas respond to stimuli in the contralateral visual hemifield and without regard to which limb will be used to reach toward that target. In contrast, neurons in motor areas typically respond when reaches are performed using the contralateral limb and with minimal regard to the visuospatial location of the target. The parietal reach region (PRR) is located early in the visuomotor processing hierarchy. PRR neurons are significantly modulated when targets for either limb or eye movement appear, similar to early sensory areas; however, they respond to targets in either visual field, similar to motor areas. The activity could reflect the subject's attentional locus, movement of a specific effector, or a related function, such as coordinating eye-arm movements. To examine the role of PRR in the visuomotor pathway, we reversibly inactivated PRR. Inactivation effects were specific to contralateral limb movements, leaving ipsilateral limb and saccadic movements intact. Neither visual hemifield bias nor visual attention deficits were observed. Thus our results are consistent with a motoric rather than visual organization in PRR, despite its early location in the visuomotor pathway. We found no effects on the temporal coupling of coordinated saccades and reaches, suggesting that this mechanism lies downstream of PRR. In sum, this study clarifies the role of PRR in the visuomotor hierarchy: despite its early position, it is a limb-specific area influencing reach planning and is positioned upstream from an active eye-hand coordination-coupling mechanism.
Affiliation(s)
- Eric A Yttri
- Department of Anatomy and Neurobiology, Washington University School of Medicine, St. Louis, Missouri