1. Generalization in de novo learning of virtual upper limb movements is influenced by motor exploration. Front Sports Act Living 2024; 6:1370621. PMID: 38510523; PMCID: PMC10950898; DOI: 10.3389/fspor.2024.1370621.
Abstract
The acquisition of new motor skills from scratch, also known as de novo learning, is an essential aspect of motor development. In de novo learning, the ability to generalize skills acquired under one condition to others is crucial because of the inherently limited range of motor experiences available for learning. However, the presence of generalization in de novo learning and its influencing factors remain unclear. This study aimed to elucidate the generalization of de novo motor learning by examining the motor exploration process, which is the accumulation of motor experiences. To this end, we manipulated the exploration process during practice by changing the target shape, using either a small circular target or a bar-shaped target. Our findings demonstrated that the amount of learning during practice was generalized across different conditions. Furthermore, the extent of generalization was influenced by movement variability in the control space, which is irrelevant to the task, rather than by the target shapes themselves. These results confirm the occurrence of generalization in de novo learning and suggest that the exploration process within the control space plays a significant role in facilitating this generalization.
2. Distributed and specific encoding of sensory, motor, and decision information in the mouse neocortex during goal-directed behavior. Cell Rep 2024; 43:113618. PMID: 38150365; DOI: 10.1016/j.celrep.2023.113618.
Abstract
Goal-directed behaviors involve coordinated activity in many cortical areas, but whether the encoding of task variables is distributed across areas or is more specifically represented in distinct areas remains unclear. Here, we compared representations of sensory, motor, and decision information in the whisker primary somatosensory cortex, medial prefrontal cortex, and tongue-jaw primary motor cortex in mice trained to lick in response to a whisker stimulus with mice that were not taught this association. Irrespective of learning, properties of the sensory stimulus were best encoded in the sensory cortex, whereas fine movement kinematics were best represented in the motor cortex. However, movement initiation and the decision to lick in response to the whisker stimulus were represented in all three areas, with decision neurons in the medial prefrontal cortex being more selective, showing minimal sensory responses in miss trials and motor responses during spontaneous licks. Our results reconcile previous studies indicating highly specific vs. highly distributed sensorimotor processing.
3. Done in 65 ms: Express Visuomotor Responses in Upper Limb Muscles in Rhesus Macaques. eNeuro 2023; 10:ENEURO.0078-23.2023. PMID: 37507227; PMCID: PMC10449271; DOI: 10.1523/eneuro.0078-23.2023.
Abstract
How rapidly can the brain transform vision into action? Work in humans has established that the transformation for visually guided reaching can be remarkably rapid, with the first phase of upper limb muscle recruitment, the express visuomotor response, beginning within less than 100 ms of visual target presentation. Such short-latency responses limit the opportunities for extensive cortical processing, leading to the hypothesis that they are generated via the subcortical tecto-reticulo-spinal pathway. Here, we examine whether nonhuman primates (NHPs) exhibit express visuomotor responses. Two male macaques made visually guided reaches in a behavioral paradigm known to elicit express visuomotor responses in humans, while we acquired intramuscular recordings from the deltoid muscle. Across several variants of this paradigm, express visuomotor responses began within 65 ms (range: 48-91 ms) of target presentation. Although the timing of the express visuomotor response did not co-vary with reaction time, larger express visuomotor responses tended to precede shorter-latency reaches. Further, we observed that the magnitude of the express visuomotor response could be muted by context, although this effect was quite variable. Overall, the response properties in NHPs resemble those in humans. Our results establish a new benchmark for visuomotor transformations underlying visually guided reaches, setting the stage for experiments that can directly compare the role of cortical and subcortical areas in reaching when time is of the essence.
4. Drifting population dynamics with transient resets characterize sensorimotor transformation in the monkey superior colliculus. bioRxiv [Preprint] 2023:2023.01.03.522634. PMID: 36711849; PMCID: PMC9881850; DOI: 10.1101/2023.01.03.522634.
Abstract
To produce goal-directed eye movements known as saccades, we must channel sensory input from our environment through a process known as sensorimotor transformation. The behavioral output of this phenomenon (an accurate eye movement) is straightforward, but the coordinated activity of neurons underlying its dynamics is not well understood. We searched for a neural correlate of sensorimotor transformation in the activity patterns of simultaneously recorded neurons in the superior colliculus (SC) of three male rhesus monkeys performing a visually guided, delayed saccade task. Neurons in the intermediate layers produce a burst of spikes both following the appearance of a visual (sensory) stimulus and preceding an eye movement command, but many also exhibit a sustained activity level during the intervening time ("delay period"). This sustained activity could be representative of visual processing or motor preparation, along with countless cognitive processes. Using a novel measure we call the Visuomotor Proximity Index (VMPI), we pitted visual and motor signals against each other by measuring the degree to which each session's population activity (as summarized in a low-dimensional framework) could be considered more visual-like or more motor-like. The analysis highlighted two salient features of sensorimotor transformation. One, population activity on average drifted systematically toward a motor-like representation and intermittently reverted to a visual-like representation following a microsaccade. Two, activity patterns that drift to a stronger motor-like representation by the end of the delay period may enable a more rapid initiation of a saccade, substantiating the idea that this movement initiation mechanism is conserved across motor systems.
5. Dorsolateral Striatum is a Bottleneck for Responding to Task-Relevant Stimuli in a Learned Whisker Detection Task in Mice. J Neurosci 2023; 43:2126-2139. PMID: 36810226; PMCID: PMC10039746; DOI: 10.1523/jneurosci.1506-22.2023.
Abstract
A learned sensory-motor behavior engages multiple brain regions, including the neocortex and the basal ganglia. How a target stimulus is detected by these regions and converted to a motor response remains poorly understood. Here, we performed electrophysiological recordings and pharmacological inactivations of whisker motor cortex and dorsolateral striatum to determine the representations within, and functions of, each region during performance in a selective whisker detection task in male and female mice. From the recording experiments, we observed robust, lateralized sensory responses in both structures. We also observed bilateral choice probability and preresponse activity in both structures, with these features emerging earlier in whisker motor cortex than dorsolateral striatum. These findings establish both whisker motor cortex and dorsolateral striatum as potential contributors to the sensory-to-motor (sensorimotor) transformation. We performed pharmacological inactivation studies to determine the necessity of these brain regions for this task. We found that suppressing the dorsolateral striatum severely disrupts responding to task-relevant stimuli, without disrupting the ability to respond, whereas suppressing whisker motor cortex resulted in more subtle changes in sensory detection and response criterion. Together these data support the dorsolateral striatum as an essential node in the sensorimotor transformation of this whisker detection task.
SIGNIFICANCE STATEMENT Selecting an item in a grocery store, hailing a cab: these daily practices require us to transform sensory stimuli into motor responses. Many decades of previous research have studied goal-directed sensory-to-motor transformations within various brain structures, including the neocortex and the basal ganglia. Yet, our understanding of how these regions coordinate to perform sensory-to-motor transformations is limited, because these brain structures are often studied by different researchers and through different behavioral tasks. Here, we record and perturb specific regions of the neocortex and the basal ganglia and compare their contributions during performance of a goal-directed somatosensory detection task. We find notable differences in the activities and functions of these regions, which suggests specific contributions to the sensory-to-motor transformation process.
6. Cortical-subcortical interactions in goal-directed behavior. Physiol Rev 2023; 103:347-389. PMID: 35771984; PMCID: PMC9576171; DOI: 10.1152/physrev.00048.2021.
Abstract
Flexibly selecting appropriate actions in response to complex, ever-changing environments requires both cortical and subcortical regions, which are typically described as participating in a strict hierarchy. In this traditional view, highly specialized subcortical circuits allow for efficient responses to salient stimuli, at the cost of adaptability and context specificity, which are attributed to the neocortex. Their interactions are often described as the cortex providing top-down command signals for subcortical structures to implement; however, as available technologies develop, studies increasingly demonstrate that behavior is represented by brainwide activity and that even subcortical structures contain early signals of choice, suggesting that behavioral functions emerge as a result of different regions interacting as truly collaborative networks. In this review, we discuss the field's evolving understanding of how cortical and subcortical regions in placental mammals interact cooperatively, not only via top-down cortical-subcortical inputs but through bottom-up interactions, especially via the thalamus. We describe our current understanding of the circuitry of both the cortex and two exemplar subcortical structures, the superior colliculus and striatum, to identify which information is prioritized by which regions. We then describe the functional circuits these regions form with one another, and the thalamus, to create parallel loops and complex networks for brainwide information flow. Finally, we challenge the classic view that functional modules are contained within specific brain regions; instead, we propose that certain regions prioritize specific types of information over others, but the subnetworks they form, defined by their anatomical connections and functional dynamics, are the basis of true specialization.
7. Parietofrontal oscillations show hand-specific interactions with top-down movement plans. J Neurophysiol 2022; 128:1518-1533. PMID: 36321728; DOI: 10.1152/jn.00240.2022.
Abstract
To generate a hand-specific reach plan, the brain must integrate hand-specific signals with the desired movement strategy. Although various neurophysiology/imaging studies have investigated hand-target interactions in simple reach-to-target tasks, the whole-brain timing and distribution of this process remain unclear, especially for more complex, instruction-dependent motor strategies. Previously, we showed that a pro/anti pointing instruction influences magnetoencephalographic (MEG) signals in frontal cortex that then propagate recurrently through parietal cortex (Blohm G, Alikhanian H, Gaetz W, Goltz HC, DeSouza JF, Cheyne DO, Crawford JD. NeuroImage 197: 306-319, 2019). Here, we contrasted left- versus right-hand pointing in the same task to investigate 1) which cortical regions of interest show hand specificity and 2) which of those areas interact with the instructed motor plan. Eight bilateral areas, the parietooccipital junction (POJ), superior parietooccipital cortex (SPOC), supramarginal gyrus (SMG), medial/anterior interparietal sulcus (mIPS/aIPS), primary somatosensory/motor cortex (S1/M1), and dorsal premotor cortex (PMd), showed hand-specific changes in beta-band power, with four of these (M1, S1, SMG, aIPS) showing robust activation before movement onset. M1, SMG, SPOC, and aIPS showed significant interactions between contralateral hand specificity and the instructed motor plan but not with bottom-up target signals. Separate hand/motor signals emerged relatively early and lasted through execution, whereas hand-motor interactions only occurred close to movement onset. Taken together with our previous results, these findings show that instruction-dependent motor plans emerge in frontal cortex and interact recurrently with hand-specific parietofrontal signals before movement onset to produce hand-specific motor behaviors.
NEW & NOTEWORTHY The brain must generate different motor signals depending on which hand is used. The distribution and timing of the integration of hand use into the instructed motor plan are not understood at the whole-brain level. Using MEG, we show that different action-planning subnetworks code for hand usage and integrate hand use into a hand-specific motor plan. The timing indicates that frontal cortex first creates a general motor plan and then integrates hand specificity to produce a hand-specific motor plan.
8. Decoding the Time Course of Spatial Information from Spiking and Local Field Potential Activities in the Superior Colliculus. eNeuro 2022; 9:ENEURO.0347-22.2022. PMID: 36379711; PMCID: PMC9718355; DOI: 10.1523/eneuro.0347-22.2022.
Abstract
Place code representation is ubiquitous in circuits that encode spatial parameters. For visually guided eye movements, neurons in many brain regions emit spikes when a stimulus is presented in their receptive fields and/or when a movement is directed into their movement fields. Crucially, individual neurons respond for a broad range of directions or eccentricities away from the optimal vector, making it difficult to decode the stimulus location or the saccade vector from each cell's activity. We investigated whether it is possible to decode the spatial parameter with a population-level analysis, even when the optimal vectors are similar across neurons. Spiking activity and local field potentials (LFPs) in the superior colliculus (SC) were recorded with a laminar probe as monkeys performed a delayed saccade task to one of eight targets radially equidistant in direction. A classifier was applied offline to decode the spatial configuration as the trial progresses from sensation to action. For spiking activity, decoding performance across all eight directions was highest during the visual and motor epochs and lower but well above chance during the delay period. Classification performance followed a similar pattern for LFP activity too, except the performance during the delay period was limited mostly to the preferred direction. Increasing the number of neurons in the population consistently increased classifier performance for both modalities. Overall, this study demonstrates the power of population activity for decoding spatial information not possible from individual neurons.
9.
Abstract
Sensorimotor transformation, a process that converts sensory stimuli into motor actions, is critical for the brain to initiate behaviors. Although the circuitry involved in sensorimotor transformation has been well delineated, the molecular logic behind this process remains poorly understood. Here, we performed high-throughput and circuit-specific single-cell transcriptomic analyses of neurons in the superior colliculus (SC), a midbrain structure implicated in early sensorimotor transformation. We found that SC neurons in distinct laminae expressed discrete marker genes. Of particular interest, Cbln2 and Pitx2 were key markers that define glutamatergic projection neurons in the optic nerve (Op) and intermediate gray (InG) layers, respectively. The Cbln2+ neurons responded to visual stimuli mimicking cruising predators, while the Pitx2+ neurons encoded prey-derived vibrissal tactile cues. By forming distinct input and output connections with other brain areas, these neuronal subtypes independently mediated behaviors of predator avoidance and prey capture. Our results reveal that, in the midbrain, sensorimotor transformation for different behaviors may be performed by separate circuit modules that are molecularly defined by distinct transcriptomic codes.
10. Delay tactics for action in the cortex. Neuron 2021; 109:2045-2046. PMID: 34237277; DOI: 10.1016/j.neuron.2021.06.017.
Abstract
How the brain computes with sensory input to execute a delayed motor response remains elusive. In this issue of Neuron, Esmaeili et al. (2021) reveal a key cortical circuit that underlies sensorimotor transformation to execute a delayed motor output following a specific sensory input.
11. Rapid suppression and sustained activation of distinct cortical regions for a delayed sensory-triggered motor response. Neuron 2021; 109:2183-2201.e9. PMID: 34077741; PMCID: PMC8285666; DOI: 10.1016/j.neuron.2021.05.005.
Abstract
The neuronal mechanisms generating a delayed motor response initiated by a sensory cue remain elusive. Here, we tracked the precise sequence of cortical activity in mice transforming a brief whisker stimulus into delayed licking using wide-field calcium imaging, multiregion high-density electrophysiology, and time-resolved optogenetic manipulation. Rapid activity evoked by whisker deflection acquired two prominent features for task performance: (1) an enhanced excitation of secondary whisker motor cortex, suggesting its important role connecting whisker sensory processing to lick motor planning; and (2) a transient reduction of activity in orofacial sensorimotor cortex, which contributed to suppressing premature licking. Subsequent widespread cortical activity during the delay period largely correlated with anticipatory movements, but when these were accounted for, a focal sustained activity remained in frontal cortex, which was causally essential for licking in the response period. Our results demonstrate key cortical nodes for motor plan generation and timely execution in delayed goal-directed licking.
12. Movement and Performance Explain Widespread Cortical Activity in a Visual Detection Task. Cereb Cortex 2021; 30:421-437. PMID: 31711133; DOI: 10.1093/cercor/bhz206.
Abstract
Recent studies in mice reveal widespread cortical signals during task performance; however, the various task-related and task-independent processes underlying this activity are incompletely understood. Here, we recorded wide-field neural activity, as revealed by GCaMP6s, from dorsal cortex while simultaneously monitoring orofacial movements, walking, and arousal (pupil diameter) of head-fixed mice performing a Go/NoGo visual detection task, and examined the ability of task performance and spontaneous or task-related movements to predict cortical activity. A linear model explained a significant fraction (33-55% of variance) of wide-field dorsal cortical activity, with the largest factors being movements (facial, walk, eye), response choice (hit, miss, false alarm), and arousal, indicating that a significant fraction of trial-to-trial variability arises from both spontaneous and task-related changes in state (e.g., movements, arousal). Importantly, secondary motor cortex was highly correlated with lick rate, critical for optimal task performance (high d'), and was the first region to significantly predict the lick response on target trials. These findings suggest that secondary motor cortex is critically involved in the decision and performance of learned movements, and that a significant fraction of trial-to-trial variation in cortical activity results from spontaneous and task-related movements and variations in behavioral/arousal state.
13. Central Vestibular Tuning Arises from Patterned Convergence of Otolith Afferents. Neuron 2020; 108:748-762.e4. PMID: 32937099; DOI: 10.1016/j.neuron.2020.08.019.
Abstract
As sensory information moves through the brain, higher-order areas exhibit more complex tuning than lower areas. Though models predict that complexity arises via convergent inputs from neurons with diverse response properties, in most vertebrate systems, convergence has only been inferred rather than tested directly. Here, we measure sensory computations in zebrafish vestibular neurons across multiple axes in vivo. We establish that whole-cell physiological recordings reveal tuning of individual vestibular afferent inputs and their postsynaptic targets. Strong, sparse synaptic inputs can be distinguished by their amplitudes, permitting analysis of afferent convergence in vivo. An independent approach, serial-section electron microscopy, supports the inferred connectivity. We find that afferents with similar or differing preferred directions converge on central vestibular neurons, conferring more simple or complex tuning, respectively. Together, these results provide a direct, quantifiable demonstration of feedforward input convergence in vivo.
14. Neuronal Correlates of Many-To-One Sensorimotor Mapping in Lateral Intraparietal Cortex. Cereb Cortex 2020; 30:5583-5596. PMID: 32488241; DOI: 10.1093/cercor/bhaa145.
Abstract
Efficiently mapping sensory stimuli onto motor programs is crucial for rapidly choosing appropriate behavioral responses. While the neuronal mechanisms underlying simple, one-to-one sensorimotor mapping have been extensively studied, how the brain achieves complex, many-to-one sensorimotor mapping remains unclear. Here, we recorded single-neuron activity from the lateral intraparietal (LIP) cortex of monkeys trained to map multiple spatial positions of a visual cue onto two opposite saccades. We found that LIP neurons' activity was consistent with directly mapping multiple cue positions to the associated saccadic direction (SDir), regardless of whether the visual cue appeared in or outside neurons' receptive fields. Unlike the explicit encoding of visual categories, such cue-target mapping (CTM)-related activity covaried with the associated SDirs. Furthermore, the CTM was preferentially mediated by visual neurons identified in a memory-guided saccade task. These results indicate that LIP plays a crucial role in the early stage of many-to-one sensorimotor transformation.
15. Projection-specific Activity of Layer 2/3 Neurons Imaged in Mouse Primary Somatosensory Barrel Cortex During a Whisker Detection Task. Function (Oxf) 2020; 1:zqaa008. PMID: 35330741; PMCID: PMC8788860; DOI: 10.1093/function/zqaa008.
Abstract
The brain processes sensory information in a context- and learning-dependent manner for adaptive behavior. Through reward-based learning, relevant sensory stimuli can become linked to execution of specific actions associated with positive outcomes. The neuronal circuits involved in such goal-directed sensory-to-motor transformations remain to be precisely determined. Studying simple learned sensorimotor transformations in head-restrained mice offers the opportunity for detailed measurements of cellular activity during task performance. Here, we trained mice to lick a reward spout in response to a whisker deflection and an auditory tone. Through two-photon calcium imaging of retrogradely labeled neurons, we found that neurons located in primary whisker somatosensory barrel cortex projecting to secondary whisker somatosensory cortex had larger calcium signals than neighboring neurons projecting to primary whisker motor cortex in response to whisker deflection and auditory stimulation, as well as before spontaneous licking. Longitudinal imaging of the same neurons revealed that these projection-specific responses were relatively stable across 3 days. In addition, the activity of neurons projecting to secondary whisker somatosensory cortex was more highly correlated than for neurons projecting to primary whisker motor cortex. The large and correlated activity of neurons projecting to secondary whisker somatosensory cortex might enhance the pathway-specific signaling of important sensory information contributing to task execution. Our data support the hypothesis that communication between primary and secondary somatosensory cortex might be an early critical step in whisker sensory perception. More generally, our data suggest the importance of investigating projection-specific neuronal activity in distinct populations of intermingled excitatory neocortical neurons during task performance.
16. Dynamic Representation of Taste-Related Decisions in the Gustatory Insular Cortex of Mice. Curr Biol 2020; 30:1834-1844.e5. PMID: 32243860; PMCID: PMC7239762; DOI: 10.1016/j.cub.2020.03.012.
Abstract
Research over the past decade has established the gustatory insular cortex (GC) as a model for studying how primary sensory cortices integrate sensory, affective, and cognitive signals. This integration occurs through time-varying patterns of neural activity. Selective silencing of GC activity during specific temporal windows provided evidence for GC's role in mediating taste palatability and expectation. Recent results also suggest that this area may play a role in decision making. However, existing data are limited to GC involvement in controlling the timing of stereotyped, orofacial reactions to aversive tastants during consumption. Here, we present electrophysiological, chemogenetic, and optogenetic results demonstrating the key role of GC in the execution of a taste-guided, reward-directed decision-making task. Mice were trained in a two-alternative choice task, in which they had to associate tastants sampled from a central spout with different actions (i.e., licking either a left or a right spout). Stimulus sampling and action were separated by a delay period. Electrophysiological recordings revealed chemosensory processing during the sampling period and the emergence of task-related, cognitive signals during the delay period. Chemogenetic silencing of GC impaired task performance. Optogenetic silencing of GC allowed us to tease apart the contribution of activity during sampling and delay periods. Although silencing during the sampling period had no effect, silencing during the delay period significantly impacted behavioral performance, demonstrating the importance of the cognitive signals processed by GC in driving decision making. Altogether, our data highlight a novel role of GC in controlling taste-guided, reward-directed choices and actions.
Relying on behavioral electrophysiology and neural manipulations, Vincis, Chen, et al. demonstrate that neurons in the gustatory cortex (GC) encode perceptual and cognitive signals important for taste-guided choices. These data demonstrate a novel role of GC as a key area for sensorimotor transformations related to gustatory perceptual decision making.
17. TwoLumps Ascending Neurons Mediate Touch-Evoked Reversal of Walking Direction in Drosophila. Curr Biol 2019; 29:4337-4344.e5. PMID: 31813606; DOI: 10.1016/j.cub.2019.11.004.
Abstract
External cues, including touch, enable walking animals to flexibly maneuver around obstacles and extricate themselves from dead-ends (for reviews, see [1-3]). In a screen for neurons that enable Drosophila melanogaster to retreat when it encounters a dead-end, we identified a pair of ascending neurons, the TwoLumps Ascending (TLA) neurons. Silencing TLA activity impairs backward locomotion, whereas optogenetic activation triggers backward walking. TLA-induced reversal is mediated in part by the Moonwalker Descending Neurons (MDNs) [4], which receive excitatory input from the TLAs. Silencing the TLAs decreases the extent to which freely walking flies back up upon encountering a physical barrier in the dark, and TLAs show calcium responses to optogenetic activation of neurons expressing the mechanosensory channel NOMPC. We infer that TLAs convey feedforward mechanosensory stimuli to transiently activate MDNs in response to anterior body touch.
Collapse
|
18
|
Shared Song Detector Neurons in Drosophila Male and Female Brains Drive Sex-Specific Behaviors. Curr Biol 2019; 29:3200-3215.e5. [PMID: 31564492 PMCID: PMC6885007 DOI: 10.1016/j.cub.2019.08.008] [Citation(s) in RCA: 20] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/15/2019] [Revised: 07/10/2019] [Accepted: 08/02/2019] [Indexed: 10/25/2022]
Abstract
Males and females often produce distinct responses to the same sensory stimuli. How such differences arise (at the level of sensory processing or in the circuits that generate behavior) remains largely unresolved across sensory modalities. We address this issue in the acoustic communication system of Drosophila. During courtship, males generate time-varying songs, and each sex responds with specific behaviors. We characterize male and female behavioral tuning for all aspects of song and show that feature tuning is similar between sexes, suggesting sex-shared song detectors drive divergent behaviors. We then identify higher-order neurons in the Drosophila brain, called pC2, that are tuned for multiple temporal aspects of one mode of the male's song and drive sex-specific behaviors. We thus uncover neurons that are specifically tuned to an acoustic communication signal and that reside at the sensory-motor interface, flexibly linking auditory perception with sex-specific behavioral responses.
Collapse
|
19
|
Impact of precisely-timed inhibition of gustatory cortex on taste behavior depends on single-trial ensemble dynamics. eLife 2019; 8:e45968. [PMID: 31232693 PMCID: PMC6625792 DOI: 10.7554/elife.45968] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/10/2019] [Accepted: 06/21/2019] [Indexed: 11/21/2022] Open
Abstract
Sensation and action are necessarily coupled during stimulus perception: while tasting, for instance, perception happens while an animal decides to expel or swallow the substance in the mouth (the former via a behavior known as 'gaping'). Taste responses in the rodent gustatory cortex (GC) span this sensorimotor divide, progressing through firing-rate epochs that culminate in the emergence of action-related firing. Population analyses reveal this emergence to be a sudden, coherent and variably-timed ensemble transition that reliably precedes gaping onset by 0.2-0.3 s. Here, we tested whether this transition drives gaping, by delivering 0.5 s GC perturbations in tasting trials. Perturbations significantly delayed gaping, but only when they preceded the action-related transition; thus, the same perturbation impacted behavior or not, depending on the transition latency in that particular trial. Our results suggest a distributed attractor network model of taste processing, and a dynamical role for cortex in driving motor behavior.
Collapse
|
20
|
Sensorimotor pathway controlling stopping behavior during chemotaxis in the Drosophila melanogaster larva. eLife 2018; 7:38740. [PMID: 30465650 PMCID: PMC6264072 DOI: 10.7554/elife.38740] [Citation(s) in RCA: 41] [Impact Index Per Article: 6.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/29/2018] [Accepted: 11/07/2018] [Indexed: 02/02/2023] Open
Abstract
Sensory navigation results from coordinated transitions between distinct behavioral programs. During chemotaxis in the Drosophila melanogaster larva, the detection of positive odor gradients extends runs while negative gradients promote stops and turns. This algorithm represents a foundation for the control of sensory navigation across phyla. In the present work, we identified an olfactory descending neuron, PDM-DN, which plays a pivotal role in the organization of stops and turns in response to the detection of graded changes in odor concentrations. Artificial activation of this descending neuron induces deterministic stops followed by the initiation of turning maneuvers through head casts. Using electron microscopy, we reconstructed the main pathway that connects the PDM-DN neuron to the peripheral olfactory system and to the pre-motor circuit responsible for the actuation of forward peristalsis. Our results set the stage for a detailed mechanistic analysis of the sensorimotor conversion of graded olfactory inputs into action selection to perform goal-oriented navigation.
Collapse
|
21
|
The Mouse Superior Colliculus as a Model System for Investigating Cell Type-Based Mechanisms of Visual Motor Transformation. Front Neural Circuits 2018; 12:59. [PMID: 30140205 PMCID: PMC6094993 DOI: 10.3389/fncir.2018.00059] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/30/2018] [Accepted: 07/03/2018] [Indexed: 11/13/2022] Open
Abstract
The mouse superior colliculus (SC) is a laminar midbrain structure involved in processing and transforming multimodal sensory stimuli into ethologically relevant behaviors such as escape, defense, and orienting movements. The SC is unique in that the sensory (visual, auditory, and somatosensory) and motor maps are overlaid. In the mouse, the SC receives inputs from more retinal ganglion cells than any other visual area. This makes the mouse SC an ideal model system for understanding how visual signals processed by retinal circuits are used to mediate visually guided behaviors. This Perspective provides an overview of the current understanding of visual motor transformations operated by the mouse SC and discusses the challenges to be overcome when investigating the input–output relationships in single collicular cell types.
Collapse
|
22
|
Comparison of Visually Guided Flight in Insects and Birds. Front Neurosci 2018; 12:157. [PMID: 29615852 PMCID: PMC5864886 DOI: 10.3389/fnins.2018.00157] [Citation(s) in RCA: 21] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/09/2017] [Accepted: 02/27/2018] [Indexed: 11/14/2022] Open
Abstract
Over the last half century, work with flies, bees, and moths has revealed a number of visual guidance strategies for controlling different aspects of flight. Some algorithms, such as the use of pattern velocity in forward flight, are employed by all insects studied so far, and are used to control multiple flight tasks such as regulation of speed, measurement of distance, and positioning through narrow passages. Although much attention has been devoted to long-range navigation and homing in birds, until recently, very little was known about how birds control flight in a moment-to-moment fashion. A bird that flies rapidly through dense foliage to land on a branch—as birds often do—engages in a veritable three-dimensional slalom, in which it has to continually dodge branches and leaves, and find, and possibly even plan, a collision-free path to the goal in real time. Each mode of flight from take-off to goal could potentially involve a different visual guidance algorithm. Here, we briefly review strategies for visual guidance of flight in insects, synthesize recent work on short-range visual guidance in birds, and offer a general comparison between the two groups of organisms.
Collapse
|
23
|
Vocal Tract Images Reveal Neural Representations of Sensorimotor Transformation During Speech Imitation. Cereb Cortex 2018; 27:3064-3079. [PMID: 28334401 PMCID: PMC5939209 DOI: 10.1093/cercor/bhx056] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/12/2016] [Indexed: 12/23/2022] Open
Abstract
Imitating speech necessitates the transformation from sensory targets to vocal tract motor output, yet little is known about the representational basis of this process in the human brain. Here, we address this question by using real-time MR imaging (rtMRI) of the vocal tract and functional MRI (fMRI) of the brain in a speech imitation paradigm. Participants trained on imitating a native vowel and a similar nonnative vowel that required lip rounding. Later, participants imitated these vowels and an untrained vowel pair during separate fMRI and rtMRI runs. Univariate fMRI analyses revealed that regions including left inferior frontal gyrus were more active during sensorimotor transformation (ST) and production of nonnative vowels, compared with native vowels; further, ST for nonnative vowels activated somatomotor cortex bilaterally, compared with ST of native vowels. Using test representational similarity analysis (RSA) models constructed from participants’ vocal tract images and from stimulus formant distances, we found that RSA searchlight analyses of fMRI data showed either type of model could be represented in somatomotor, temporal, cerebellar, and hippocampal neural activation patterns during ST. We thus provide the first evidence of widespread and robust cortical and subcortical neural representation of vocal tract and/or formant parameters, during prearticulatory ST.
Collapse
|
24
|
Activity of primate V1 neurons during the gap saccade task. J Neurophysiol 2017; 118:1361-1375. [PMID: 28615338 DOI: 10.1152/jn.00758.2016] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/20/2016] [Revised: 06/14/2017] [Accepted: 06/14/2017] [Indexed: 12/18/2022] Open
Abstract
When a saccadic eye movement is made toward a visual stimulus, the variability in accompanying primary visual cortex (V1) activity is related to saccade latency in both humans and simians. To understand the nature of this relationship, we examined the functional link between V1 activity and the initiation of visually guided saccades during the gap saccade task, in which a brief temporal gap is inserted between the turning off of a fixation stimulus and the appearance of a saccadic target. The insertion of such a gap robustly reduces saccade latency and facilitates the occurrence of extremely short-latency (express) saccades. Here we recorded single-cell activity from macaque V1 while monkeys performed the gap saccade task. In parallel with the gap effect on saccade latency, the neural latency (time of first spike) of the V1 response elicited by the saccade target became shorter, and the firing rate increased as the gap duration increased. Similarly, neural latency was shorter and firing rate was higher before express saccades relative to regular-latency saccades. In addition to these posttarget changes, the level of spontaneous spike activity during the pretarget period was negatively correlated with both neural and saccade latencies. NEW & NOTEWORTHY: The link between neural activity in monkey primary visual cortex (V1) and visually guided behavioral response is confirmed with the gap saccade paradigm. Results indicated that the variability in neural latency of V1 spike activity correlates with the gap effect on saccade latency and that the trial-to-trial variability in the state of V1 before the onset of the saccade target correlates with the variability in neural and behavioral latencies.
Collapse
|
25
|
Gain Control in Predictive Smooth Pursuit Eye Movements: Evidence for an Acceleration-Based Predictive Mechanism. eNeuro 2017; 4:eN-NWR-0343-16. [PMID: 28560317 PMCID: PMC5446489 DOI: 10.1523/eneuro.0343-16.2017] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/17/2016] [Revised: 04/01/2017] [Accepted: 04/06/2017] [Indexed: 11/23/2022] Open
Abstract
The smooth pursuit eye movement system incorporates various control features enabling adaptation to specific tracking situations. In this work, we analyzed the interplay between two of these mechanisms: gain control and predictive pursuit. We tested human responses to high-frequency perturbations during step-ramp pursuit, as well as the pursuit of a periodically moving target. For the latter task, we found a nonlinear interaction between perturbation response and carrier acceleration. Responses to perturbations where the initial perturbation acceleration was contradirectional to carrier acceleration increased with carrier velocity, in a manner similar to that observed during step-ramp pursuit. In contrast, responses to perturbations with ipsidirectional initial perturbation and carrier acceleration were large for all carrier velocities. Modeling the pursuit system suggests that gain control and short-term prediction are separable elements. The observed effect may be explained by combining the standard gain control mechanism with a derivative-based short-term predictive mechanism. The nonlinear interaction between perturbation and carrier acceleration can be reproduced by assuming a signal saturation, which is acting on the derivative of the target velocity signal. Our results therefore argue for the existence of an internal estimate of target acceleration as a basis for a simple yet efficient short-term predictive mechanism.
Collapse
|
26
|
Secondary Motor Cortex: Where 'Sensory' Meets 'Motor' in the Rodent Frontal Cortex. Trends Neurosci 2017; 40:181-193. [PMID: 28012708 PMCID: PMC5339050 DOI: 10.1016/j.tins.2016.11.006] [Citation(s) in RCA: 132] [Impact Index Per Article: 18.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/20/2016] [Revised: 11/28/2016] [Accepted: 11/29/2016] [Indexed: 12/15/2022]
Abstract
In rodents, the medial aspect of the secondary motor cortex (M2) is known by other names, including medial agranular cortex (AGm), medial precentral cortex (PrCm), and frontal orienting field (FOF). As a subdivision of the medial prefrontal cortex (mPFC), M2 can be defined by a distinct set of afferent and efferent connections, microstimulation responses, and lesion outcomes. However, the behavioral role of M2 remains mysterious. Here, we focus on evidence from rodent studies, highlighting recent findings of early and context-dependent choice-related activity in M2 during voluntary behavior. Based on the current understanding, we suggest that a major function for M2 is to flexibly map antecedent signals such as sensory cues to motor actions, thereby enabling adaptive choice behavior.
Collapse
|
27
|
Emergence of Selectivity to Looming Stimuli in a Spiking Network Model of the Optic Tectum. Front Neural Circuits 2016; 10:95. [PMID: 27932957 PMCID: PMC5121234 DOI: 10.3389/fncir.2016.00095] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2016] [Accepted: 11/08/2016] [Indexed: 11/13/2022] Open
Abstract
The neural circuits in the optic tectum of Xenopus tadpoles are selectively responsive to looming visual stimuli that resemble objects approaching the animal at a collision trajectory. This selectivity is required for adaptive collision avoidance behavior in this species, but its underlying mechanisms are not known. In particular, it is still unclear how the balance between the recurrent spontaneous network activity and the newly arriving sensory flow is set in this structure, and to what degree this balance is important for collision detection. Also, despite the clear indication for the presence of strong recurrent excitation and spontaneous activity, the exact topology of recurrent feedback circuits in the tectum remains elusive. In this study we take advantage of recently published detailed cell-level data from tadpole tectum to build an informed computational model of it, and investigate whether dynamic activation in excitatory recurrent retinotopic networks may on its own underlie collision detection. We consider several possible recurrent connectivity configurations and compare their performance for collision detection under different levels of spontaneous neural activity. We show that even in the absence of inhibition, a retinotopic network of quickly inactivating spiking neurons is naturally selective for looming stimuli, but this selectivity is not robust to neuronal noise, and is sensitive to the balance between direct and recurrent inputs. We also describe how homeostatic modulation of intrinsic properties of individual tectal cells can change selectivity thresholds in this network, and qualitatively verify our predictions in a behavioral experiment in freely swimming tadpoles.
Collapse
|
28
|
The hand that 'sees' to grasp. eLife 2016; 5. [PMID: 27471905 PMCID: PMC4966891 DOI: 10.7554/elife.18887] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/27/2016] [Accepted: 07/27/2016] [Indexed: 11/18/2022] Open
Abstract
New findings advance our understanding of how vision is used to guide the hand during object grasping.
Collapse
|
29
|
Object vision to hand action in macaque parietal, premotor, and motor cortices. eLife 2016; 5. [PMID: 27458796 PMCID: PMC4961460 DOI: 10.7554/elife.15278] [Citation(s) in RCA: 73] [Impact Index Per Article: 9.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/16/2016] [Accepted: 06/13/2016] [Indexed: 12/02/2022] Open
Abstract
Grasping requires translating object geometries into appropriate hand shapes. How the brain computes these transformations is currently unclear. We investigated three key areas of the macaque cortical grasping circuit with microelectrode arrays and found cooperative but anatomically separated visual and motor processes. The parietal area AIP operated primarily in a visual mode. Its neuronal population revealed a specialization for shape processing, even for abstract geometries, and processed object features ultimately important for grasping. Premotor area F5 acted as a hub that shared the visual coding of AIP only temporarily and switched to highly dominant motor signals towards movement planning and execution. We visualize these non-discrete premotor signals that drive the primary motor cortex M1 to reflect the movement of the grasping hand. Our results reveal visual and motor features encoded in the grasping circuit and their communication to achieve transformation for grasping. DOI:http://dx.doi.org/10.7554/eLife.15278.001 In order to grasp and manipulate objects, our brains have to transform information about an object (such as its size, shape and position) into commands about movement that are sent to our hands. Previous work suggests that in primates (including humans and monkeys), this transformation is coordinated in three key brain areas: the parietal cortex, the premotor cortex and the motor cortex. But exactly how these transformations are computed is still not clear. Schaffelhofer and Scherberger attempted to find out how this transformation happens by recording the electrical activity from different brain areas as monkeys reached out to grasp different objects. The specific brain areas studied were the anterior intraparietal (AIP) area of the parietal cortex, a part of the premotor cortex known as F5, and the region of the motor cortex that controls hand movements. The exact movement made by the monkeys’ hands was also recorded. 
Analysing the recorded brain activity revealed that the three brain regions worked together to transform information about an object into commands for the hand, although each region also had its own specific, separate role in this process. Neurons in the AIP area of the parietal cortex mostly dealt with visual information about the object. These neurons specialized in processing information about the shape of an object, including information that was ultimately important for grasping it. In contrast, the premotor area F5 represented visual information about the object only briefly, quickly switching to representing information about the upcoming movement as it was planned and carried out. Finally, the neurons in the primary motor cortex were only active during the actual hand movement, and their activity strongly reflected the action of the hand as it grasped the object. Overall, the results presented by Schaffelhofer and Scherberger suggest that grasping movements are generated from visual information about the object via AIP and F5 neurons communicating with each other. The strong links between the premotor and motor cortex also suggest that a common network related to movement executes and refines the prepared plan of movement. Further investigations are now needed to reveal how such networks process the information they receive. DOI:http://dx.doi.org/10.7554/eLife.15278.002
Collapse
|
30
|
Odor-identity dependent motor programs underlie behavioral responses to odors. eLife 2015; 4. [PMID: 26439011 PMCID: PMC4868540 DOI: 10.7554/elife.11092] [Citation(s) in RCA: 32] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/24/2015] [Accepted: 10/05/2015] [Indexed: 02/01/2023] Open
Abstract
All animals use olfactory information to perform tasks essential to their survival. Odors typically activate multiple olfactory receptor neuron (ORN) classes and are therefore represented by the patterns of active ORNs. How the patterns of active ORN classes are decoded to drive behavior is under intense investigation. In this study, using Drosophila as a model system, we investigate the logic by which odors modulate locomotion. We designed a novel behavioral arena in which we could examine a fly's locomotion under precisely controlled stimulus conditions. In this arena, in response to similarly attractive odors, flies modulate their locomotion differently, implying that odors have a more diverse effect on locomotion than was anticipated. Three features underlie odor-guided locomotion: First, in response to odors, flies modulate a surprisingly large number of motor parameters. Second, similarly attractive odors elicit changes in different motor programs. Third, different ORN classes modulate different subsets of motor parameters. DOI:http://dx.doi.org/10.7554/eLife.11092.001 Humans rely chiefly on vision to understand and navigate the world around them. But for many organisms, the world is dominated by their sense of smell. For these animals, everyday activities, like finding food, depend on being able to change behavior based on odor-based cues. To meet the challenges of detecting and discriminating between different odors, animals have many odorant receptors that bind to the odors, which are found on olfactory receptor neurons (ORNs). Each odor activates multiple ORNs, and different odors activate different combinations of ORNs. But it is not clear how activities from different classes of ORN are combined to create the perception of an odor or to guide behavior. Now, Jung et al. have investigated the logic by which odors can alter a fruit fly's movements. The olfactory system of the fruit fly is organized along similar lines to that of a mammal, but is much simpler.
Moreover, many genetic tools are available in fruit flies to allow neuroscientists to activate and inactivate specific neurons and assess the effect this has on behavior. The results suggest that odor-guided movement in fruit flies has two noteworthy features. Firstly, in the presence of odors, flies alter their walking in an unexpectedly large number of ways. Therefore, one needs to consider many different factors, or "motor parameters", to describe how odors affect a fly's movement. For instance, instead of just walking faster or slower, a fly can change how long it stops (stop duration), how long it runs (run duration) and how fast it runs (run speed) – all of which will affect overall speed. Secondly, a single class of ORN can strongly affect some parameters (like run duration) without affecting others (like stop duration). These data indicate that the neural circuits involved have a modular organization in which each ORN class affects a subset of motor parameters, and each motor parameter is affected by a subset of ORN classes. These findings were largely unexpected. Jung et al.'s study focused on attractive odors. Future work will study repulsive odors to investigate if similar results are seen when studying repulsion versus attraction. DOI:http://dx.doi.org/10.7554/eLife.11092.002
Collapse
|
31
|
Abstract
Sensory events in the space around us trigger specific motor patterns directed toward or away from the spatial location of the sensory source. Spatially-defined sensorimotor associations are well-known in the visual domain but less so for the auditory modality. In particular no spatially-directed audio-motor association has been described for the upper limb. We tested the instantaneous directional tuning of the corticospinal system by means of single-pulse transcranial magnetic stimulation (TMS) over the left motor cortex in 16 healthy volunteers while at rest. We recorded the lateral accelerations of the TMS-evoked movement by means of an accelerometer placed on the forearm. Acoustic stimuli (pure tone frequency=1000Hz, duration=50ms) coming from 25 different directions lying in the axial anterior half-plane at the height of the participant's ears were played on earphones. The entire set of sound directions covered a span of 160° (±80° where 0° is the frontal direction) at a fixed azimuth angle. Six different intervals between sound onset and TMS (0, 25, 50, 100, 150 and 200ms) were tested for each sound direction. Significant correlations were found between sound origin and TMS-evoked arm accelerations only when TMS was delivered 50ms prior to sound onset. We show the presence in the upper limb motor system of auditory spatial tuning. Sound information accesses the motor system at very short latency, potentially compatible with both a subcortical and a cortical origin of the response. The use of TMS-evoked accelerations allowed us to disclose a strict directional tuning in audio-motor associations.
Collapse
|
32
|
Dissociable contribution of the parietal and frontal cortex to coding movement direction and amplitude. Front Hum Neurosci 2015; 9:241. [PMID: 25999837 PMCID: PMC4422032 DOI: 10.3389/fnhum.2015.00241] [Citation(s) in RCA: 27] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/29/2015] [Accepted: 04/14/2015] [Indexed: 11/13/2022] Open
Abstract
To reach for an object, we must convert its spatial location into an appropriate motor command, merging movement direction and amplitude. In humans, it has been suggested that this visuo-motor transformation occurs in a dorsomedial parieto-frontal pathway, although the causal contribution of the areas constituting the “reaching circuit” remains unknown. Here we used transcranial magnetic stimulation (TMS) in healthy volunteers to disrupt the function of either the medial intraparietal area (mIPS) or dorsal premotor cortex (PMd), in each hemisphere. The task consisted in performing step-tracking movements with the right wrist towards targets located in different directions and eccentricities; targets were either visible for the whole trial (Target-ON) or flashed for 200 ms (Target-OFF). Left and right mIPS disruption led to errors in the initial direction of movements performed towards contralateral targets. These errors were corrected online in the Target-ON condition but when the target was flashed for 200 ms, mIPS TMS manifested as a larger endpoint spreading. In contrast, left PMd virtual lesions led to higher acceleration and velocity peaks—two parameters typically used to probe the planned movement amplitude—irrespective of the target position, hemifield and presentation condition; in the Target-OFF condition, left PMd TMS induced overshooting and increased the endpoint dispersion along the axis of the target direction. These results indicate that left PMd intervenes in coding amplitude during movement preparation. The critical TMS timings leading to errors in direction and amplitude were different, namely 160–100 ms before movement onset for mIPS and 100–40 ms for left PMd. TMS applied over right PMd had no significant effect. These results demonstrate that, during motor preparation, direction and amplitude of goal-directed movements are processed by different cortical areas, at distinct timings, and according to a specific hemispheric organization.
Collapse
|
33
|
Visual-Motor Transformations Within Frontal Eye Fields During Head-Unrestrained Gaze Shifts in the Monkey. Cereb Cortex 2014; 25:3932-52. [PMID: 25491118 PMCID: PMC4585524 DOI: 10.1093/cercor/bhu279] [Citation(s) in RCA: 39] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022] Open
Abstract
A fundamental question in sensorimotor control concerns the transformation of spatial signals from the retina into eye and head motor commands required for accurate gaze shifts. Here, we investigated these transformations by identifying the spatial codes embedded in visually evoked and movement-related responses in the frontal eye fields (FEFs) during head-unrestrained gaze shifts. Monkeys made delayed gaze shifts to the remembered location of briefly presented visual stimuli, with delay serving to dissociate visual and movement responses. A statistical analysis of nonparametric model fits to response field data from 57 neurons (38 with visual and 49 with movement activities) eliminated most effector-specific, head-fixed, and space-fixed models, but confirmed the dominance of eye-centered codes observed in head-restrained studies. More importantly, the visual response encoded target location, whereas the movement response mainly encoded the final position of the imminent gaze shift (including gaze errors). This spatiotemporal distinction between target and gaze coding was present not only at the population level, but even at the single-cell level. We propose that an imperfect visual–motor transformation occurs during the brief memory interval between perception and action, and further transformations from the FEF's eye-centered gaze motor code to effector-specific codes in motor frames occur downstream in the subcortical areas.
|
34
|
Abstract
Driver behavior and vehicle-road kinematics have been shown to change over prolonged periods of driving; however, the interaction between these two indices has not been examined. Here we develop a measure of how drivers turn the steering wheel relative to heading error velocity, which we call the relative steering wheel compensation (RSWC). The RSWC changes transiently on a short time scale coincident with a verbal query embedded within the study paradigm. In contrast, more traditional variables are dynamic over longer time scales, consistent with previous research. The results suggest drivers alter their behavioral output (steering wheel correction) relative to sensory input (vehicle heading error velocity) on a distinct temporal scale, which may reflect an interaction of alerting and control.
|
35
|
Excitation and inhibition in recurrent networks mediate collision avoidance in Xenopus tadpoles. Eur J Neurosci 2014; 40:2948-62. [PMID: 24995793 DOI: 10.1111/ejn.12664]
Abstract
Information processing in the vertebrate brain is thought to be mediated through distributed neural networks, but it is still unclear how sensory stimuli are encoded and detected by these networks, and what role synaptic inhibition plays in this process. Here we used a collision avoidance behavior in Xenopus tadpoles as a model for stimulus discrimination and recognition. We showed that the visual system of the tadpole is selective for behaviorally relevant looming stimuli, and that the detection of these stimuli first occurs in the optic tectum. By comparing visually guided behavior, optic nerve recordings, excitatory and inhibitory synaptic currents, and the spike output of tectal neurons, we showed that collision detection in the tadpole relies on the emergent properties of distributed recurrent networks within the tectum. We found that synaptic inhibition was temporally correlated with excitation, and did not actively sculpt stimulus selectivity, but rather regulated the amount of integration between direct inputs from the retina and recurrent inputs from the tectum. Both pharmacological suppression and enhancement of synaptic inhibition disrupted emergent selectivity for looming stimuli. Taken together, these findings suggest that, by regulating the amount of network activity, inhibition plays a critical role in maintaining selective sensitivity to behaviorally relevant visual stimuli.
|
36
|
Cortical processing of object affordances for self and others' action. Front Psychol 2014; 5:538. [PMID: 24987381 PMCID: PMC4060298 DOI: 10.3389/fpsyg.2014.00538]
Abstract
The perception of objects does not rely only on visual brain areas, but also involves cortical motor regions. In particular, different parietal and premotor areas host neurons discharging during both object observation and grasping. Most of these cells often show similar visual and motor selectivity for a specific object (or set of objects), suggesting that they might play a crucial role in representing the “potential motor act” afforded by the object. The existence of such a mechanism for the visuomotor transformation of object physical properties in the most appropriate motor plan for interacting with them has been convincingly demonstrated in humans as well. Interestingly, human studies have shown that visually presented objects can automatically trigger the representation of an action provided that they are located within the observer's reaching space (peripersonal space). The “affordance effect” also occurs when the presented object is outside the observer's peripersonal space, but inside the peripersonal space of an observed agent. These findings recently received direct support by single neuron studies in monkey, indicating that space-constrained processing of objects in the ventral premotor cortex might be relevant to represent objects as potential targets for one's own or others' action.
|
37
|
Abstract
When using lever tools, subjects have to deal with two, not necessarily concordant, effects of their motor behavior: body-related proximal effects, like tactile sensations from the moving hand, and/or more external distal effects, like the moving effect points of the lever. As a consequence, spatial compatibility relationships between the stimulus (S; at which the effect points of the lever aim), the responding hand (R), and the effect point of the lever (E) play a critical role in response generation. In the present study we examine whether the occurrence of compatibility effects requires real tool movements or whether a similar response pattern can already be evoked by purely mental imagination of the tool effects. In general, response times and errors observed with real and imagined tool movements showed a similar pattern of results, but there were also differences. With incompatible relationships, and thus more difficult tasks, response times were shorter with imagined than with real tool movements. Conversely, with compatible relationships, and thus high overlap between proximal and distal action effects, response times were longer with imagined tool movements. The results are only partly consistent with the ideomotor theory of motor control.
|
38
|
Simultaneous optogenetic manipulation and calcium imaging in freely moving C. elegans. Front Neural Circuits 2014; 8:28. [PMID: 24715856 PMCID: PMC3970007 DOI: 10.3389/fncir.2014.00028]
Abstract
Understanding how an organism's nervous system transforms sensory input into behavioral outputs requires recording and manipulating its neural activity during unrestrained behavior. Here we present an instrument to simultaneously monitor and manipulate neural activity while observing behavior in a freely moving animal, the nematode Caenorhabditis elegans. Neural activity is recorded optically from cells expressing the calcium indicator GCaMP3. Neural activity is manipulated optically by illuminating targeted neurons expressing the optogenetic protein Channelrhodopsin. Real-time computer vision software tracks the animal's behavior and identifies the location of targeted neurons in the nematode as it crawls. Patterned illumination from a digital micromirror device (DMD) is used to selectively illuminate subsets of neurons for either calcium imaging or optogenetic stimulation. The computer vision software constantly updates the illumination pattern in response to the worm's movement, thereby allowing independent optical recording or activation of different neurons in the worm as it moves freely. We use the instrument to directly observe the relationship between sensory neuron activation, interneuron dynamics, and locomotion in the worm's mechanosensory circuit. We record and compare calcium transients in the backward locomotion command interneurons AVA in response to optical activation of the anterior mechanosensory neurons ALM, AVM, or both.
|
39
|
Shared action spaces: a basis function framework for social re-calibration of sensorimotor representations supporting joint action. Front Hum Neurosci 2013; 7:800. [PMID: 24324425 PMCID: PMC3840313 DOI: 10.3389/fnhum.2013.00800]
Abstract
The article explores the possibilities of formalizing and explaining the mechanisms that support spatial and social perspective alignment sustained over the duration of a social interaction. The basic proposed principle is that in social contexts the mechanisms for sensorimotor transformations and multisensory integration (learn to) incorporate information relative to the other actor(s), similar to the “re-calibration” of visual receptive fields in response to repeated tool use. This process aligns or merges the co-actors’ spatial representations and creates a “Shared Action Space” (SAS) supporting key computations of social interactions and joint actions; for example, the remapping between the coordinate systems and frames of reference of the co-actors, including perspective taking, the sensorimotor transformations required for jointly lifting an object, and the predictions of the sensory effects of such joint action. The social re-calibration is proposed to be based on common basis function maps (BFMs) and could constitute an optimal solution to sensorimotor transformation and multisensory integration in joint action or, more generally, in social interaction contexts. However, certain situations, such as discrepant postural and viewpoint alignment and associated differences in perspectives between the co-actors, could constrain the process quite differently. We discuss how alignment is achieved in the first place and how it is maintained over time, providing a taxonomy of various forms and mechanisms of space alignment and overlap based, for instance, on automaticity vs. control of the transformations between the two agents. Finally, we discuss the link between low-level mechanisms for the sharing of space and high-level mechanisms for the sharing of cognitive representations.
|
40
|
Reprogramming movements: extraction of motor intentions from cortical ensemble activity when movement goals change. Front Neuroeng 2012; 5:16. [PMID: 22826698 PMCID: PMC3399119 DOI: 10.3389/fneng.2012.00016]
Abstract
The ability to inhibit unwanted movements and change motor plans is essential for behaviors of advanced organisms. The neural mechanisms by which the primate motor system rejects undesired actions have received much attention during the last decade, but it is not well understood how this neural function could be utilized to improve the efficiency of brain-machine interfaces (BMIs). Here we employed linear discriminant analysis (LDA) and a Wiener filter to extract motor plan transitions from the activity of ensembles of sensorimotor cortex neurons. Two rhesus monkeys, chronically implanted with multielectrode arrays in primary motor (M1) and primary sensory (S1) cortices, were overtrained to produce reaching movements with a joystick toward visual targets upon their presentation. Then, the behavioral task was modified to include a distracting target that flashed for 50, 150, or 250 ms (25% of trials each) followed by the true target that appeared at a different screen location. In the remaining 25% of trials, the initial target stayed on the screen and was the target to be approached. M1 and S1 neuronal activity represented both the true and distracting targets, even for the shortest duration of the distracting event. This dual representation persisted both when the monkey initiated movements toward the distracting target and then made corrections and when they moved directly toward the second, true target. The Wiener filter effectively decoded the location of the true target, whereas the LDA classifier extracted the location of both targets from ensembles of 50–250 neurons. Based on these results, we suggest developing real-time BMIs that inhibit unwanted movements represented by brain activity while enacting the desired motor outcome concomitantly.
|
41
|
Three-dimensional eye position signals shape both peripersonal space and arm movement activity in the medial posterior parietal cortex. Front Integr Neurosci 2012; 6:37. [PMID: 22754511 PMCID: PMC3385520 DOI: 10.3389/fnint.2012.00037]
Abstract
Research conducted over the last decades has established that the medial part of the posterior parietal cortex (PPC) is crucial for controlling visually guided actions in human and non-human primates. Within this cortical sector lies area V6A, a crucial node of the parietofrontal network involved in arm movement control in both monkeys and humans. However, the encoding of action-in-depth by V6A cells had not been studied until recently. Recent neurophysiological studies show the existence in V6A neurons of signals related to the distance of targets from the eyes. These signals are integrated, often at the level of single cells, with information about the direction of gaze, thus encoding spatial location in 3D space. Moreover, 3D eye position signals seem to be further exploited at two additional levels of neural processing: (a) in determining whether targets are located in the peripersonal space or not, and (b) in shaping the spatial tuning of arm movement-related activity toward reachable targets. These findings are in line with studies in putative homolog regions in humans and together point to a role of the medial PPC in encoding both the vergence angle of the eyes and peripersonal space. Besides its role in spatial encoding, including depth, several findings demonstrate the involvement of this cortical sector in non-spatial processes.
|
42
|
Abstract
Fitts’ law describes the fundamental trade-off between movement accuracy and speed: it states that the duration of reaching movements is a function of target size (TS) and distance. While Fitts’ law has been extensively studied in ergonomics and has guided the design of human–computer interfaces, there have been few studies on its neuronal correlates. To elucidate sensorimotor cortical activity underlying Fitts’ law, we implanted two monkeys with multielectrode arrays in the primary motor (M1) and primary somatosensory (S1) cortices. The monkeys performed reaches with a joystick-controlled cursor toward targets of different size. The reaction time (RT), movement time, and movement velocity changed with TS, and M1 and S1 activity reflected these changes. Moreover, modifications of cortical activity could not be explained by changes of movement parameters alone, but required TS as an additional parameter. Neuronal representation of TS was especially prominent during the early RT period where it influenced the slope of the firing rate rise preceding movement initiation. During the movement period, cortical activity was correlated with movement velocity. Neural decoders were applied to simultaneously decode TS and motor parameters from cortical modulations. We suggest that sensorimotor cortex activity reflects the characteristics of both the movement and the target. Classifiers that extract these parameters from cortical ensembles could improve neuroprosthetic control.
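For reference, Fitts' law as invoked in this abstract is conventionally written as a logarithmic speed-accuracy trade-off; the classic formulation below is a standard textbook form, and the abstract does not state which variant the study fitted:

```latex
% Classic (Fitts, 1954) formulation: movement time MT grows with an
% index of difficulty set by target distance D and target width W.
MT = a + b \log_2\!\left(\frac{2D}{W}\right)
% a, b are empirically fitted constants; increasing D or shrinking W
% (smaller targets, as manipulated in the study) lengthens MT.
```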
|
43
|
What is 'anti' about anti-reaches? Reference frames selectively affect reaction times and endpoint variability. Exp Brain Res 2011; 208:287-96. [PMID: 21076817 PMCID: PMC3015212 DOI: 10.1007/s00221-010-2481-2]
Abstract
Reach movement planning involves the representation of spatial target information in different reference frames. Neurons at parietal and premotor stages of the cortical sensorimotor system represent target information in eye- or hand-centered reference frames, respectively. How the different neuronal representations affect behavioral parameters of motor planning and control, i.e. which stage of neural representation is relevant for which aspect of behavior, is not obvious from the physiology. Here, we test with a behavioral experiment if different kinematic movement parameters are affected to a different degree by either an eye- or hand-reference frame. We used a generalized anti-reach task to test the influence of stimulus-response compatibility (SRC) in eye- and hand-reference frames on reach reaction times, movement times, and endpoint variability. While in a standard anti-reach task, the SRC is identical in the eye- and hand-reference frames, we could separate SRC for the two reference frames. We found that reaction times were influenced by the SRC in eye- and hand-reference frame. In contrast, movement times were only influenced by the SRC in hand-reference frame, and endpoint variability was only influenced by the SRC in eye-reference frame. Since movement time and endpoint variability are the result of planning and control processes, while reaction times are consequences of only the planning process, we suggest that SRC effects on reaction times are highly suited to investigate reference frames of movement planning, and that eye- and hand-reference frames have distinct effects on different phases of motor action and different kinematic movement parameters.
|
44
|
Sensorimotor transformation deficits for smooth pursuit in first-episode affective psychoses and schizophrenia. Biol Psychiatry 2010; 67:217-23. [PMID: 19782964 PMCID: PMC2879155 DOI: 10.1016/j.biopsych.2009.08.005]
Abstract
BACKGROUND Smooth pursuit deficits are an intermediate phenotype for schizophrenia that may result from disturbances in visual motion perception, sensorimotor transformation, predictive mechanisms, or alterations in basic oculomotor control. Which of these components are the primary causes of smooth pursuit impairments, and whether they are impaired similarly across psychotic disorders, remain to be established. METHODS First-episode psychotic patients with bipolar disorder (n = 34), unipolar depression (n = 24), or schizophrenia (n = 77) and matched healthy participants (n = 130) performed three smooth pursuit tasks designed to evaluate different components of pursuit tracking. RESULTS On ramp tasks, maintenance pursuit velocity was reduced in all three patient groups, with psychotic bipolar patients exhibiting the most severe impairments. Open loop pursuit velocity was reduced in psychotic bipolar and schizophrenia patients. Motion perception during pursuit initiation, as indicated by the accuracy of saccades to moving targets, was not impaired in any patient group. Analyses in 138 participants followed for 6 weeks, during which patients were treated and psychotic symptom severity decreased, revealed no significant change in performance in any group. CONCLUSIONS Sensorimotor transformation deficits in all patient groups suggest a common alteration in frontostriatal networks that dynamically regulate gain control of pursuit responses using sensory input and feedback about performance. Predictive mechanisms appear to be sufficiently intact to compensate for this deficit across psychotic disorders. The absence of significant changes after acute treatment and symptom reduction suggests that these deficits are stable over time.
|
45
|
Relationship between the phases of sensory and motor activity during a looming-evoked multistage escape behavior. J Neurosci 2007; 27:10047-59. [PMID: 17855619 PMCID: PMC2081158 DOI: 10.1523/jneurosci.1515-07.2007]
Abstract
The firing patterns of visual neurons tracking approaching objects need to be translated into appropriate motor activation sequences to generate escape behaviors. Locusts possess an identified neuron highly sensitive to approaching objects (looming stimuli), thought to play an important role in collision avoidance through its motor projections. To study how the activity of this neuron relates to escape behaviors, we monitored jumps evoked by looming stimuli in freely behaving animals. By comparing electrophysiological and high-speed video recordings, we found that the initial preparatory phase of jumps occurs on average during the rising phase of the firing rate of the looming-sensitive neuron. The coactivation period of leg flexors and extensors, which is used to store the energy required for the jump, coincides with the timing of the peak firing rate of the neuron. The final preparatory phase occurs after the peak and takeoff happens when the firing rate of the looming-sensitive neuron has decayed to <10% of its peak. Both the initial and the final preparatory phases and takeoff are triggered when the approaching object crosses successive threshold angular sizes on the animal's retina. Our results therefore suggest that distinct phases of the firing patterns of individual sensory neurons may actively contribute to distinct phases of complex, multistage motor behaviors.
|
46
|
Spatial interference during bimanual coordination: differential brain networks associated with control of movement amplitude and direction. Hum Brain Mapp 2005; 26:286-300. [PMID: 15965999 PMCID: PMC6871760 DOI: 10.1002/hbm.20151]
Abstract
Bimanual interference emerges when spatial features, such as movement direction or amplitude, differ between the limbs, as indicated by a mutual bias of limb trajectories. Although first insights into the neural basis of directional interference have been revealed recently, little is known about the neural network associated with amplitude interference. We investigated whether amplitude versus directional interference activates differential networks. Functional magnetic resonance imaging (fMRI) was applied while subjects performed cyclical, bimanual joystick movements with either the same vs. different amplitudes, directions, or both. The kinematic analysis confirmed that subjects experienced amplitude interference when they moved with different as compared to the same amplitude, and directional interference when they moved along different as compared to the same direction. On the brain level, amplitude and directional interference both resulted in activation of a bilateral superior parietal-premotor network, which is known to contribute to sensorimotor transformations during goal-directed movements. Interestingly, amplitude but not directional interference exclusively activated a bilateral network containing the dorsolateral prefrontal cortex, anterior cingulate, and supramarginal gyrus, which was previously shown to contribute to executive functions. Even though the encoding of amplitude and directional information converged on the same neural substrate, our data thus show that additional and partly independent mechanisms are involved in bimanual amplitude control as compared to directional control.
|
47
|
Developmental learning in a pain-related system: evidence for a cross-modality mechanism. J Neurosci 2003; 23:7719-25. [PMID: 12930812 PMCID: PMC6740755]
Abstract
The nociceptive spinal reflex system performs highly precise sensorimotor transformations that require functionally specified synaptic strengths. The specification is gradually attained during early development and appears to be learning dependent. Here we determine the time course of this specification for heat-nociceptive tail withdrawal reflexes and analyze which types of primary afferents are important for the learning by applying various forms of noninvasive sensory deprivations. The percentage of erroneous heat-nociceptive tail withdrawal reflexes (i.e., movements directed toward the stimulation) decreased gradually from 64.1 +/- 2.5% (mean +/- SEM) to <10% during postnatal days 10-21. This improvement was completely blocked by anesthetizing the tail during the adaptation period, confirming that an experience-dependent mechanism is involved in the specification of synaptic strengths. However, the results show that the adaptation occurs to a significant extent despite local analgesia and protection of the tail from noxious input, provided that tactile sensitivity is preserved. Therefore, it appears that a nociceptive input is not necessary for the adaptation, and that input from tactile receptors can be used to guide the nociceptive synaptic organization during development. Sensory deprivation in the adult rat failed to affect the heat-nociceptive withdrawal reflex system, indicating that the adaptation has a "critical period" during early development. These findings provide a key to the puzzle of how pain-related systems can be functionally adapted through experience despite the rare occurrence of noxious input during early life.
|
48
|
Effects of gaze shifts on maintenance of spatial memory in macaque frontal eye field. J Neurosci 2003; 23:5446-54. [PMID: 12843243 PMCID: PMC6741219]
Abstract
The activity of 91 neurons in the frontal eye fields (FEFs) of two macaque monkeys was recorded while the animals performed a delayed spatial match-to-sample task. During the delay, the animals were required to shift their gaze to one of four eccentric locations. Neuronal activity during the delay was analyzed for sensitivity to cue location and eye position. One-third of the neurons showed significant delay activity selective for cue location, whereas slightly more than one-half of the neurons showed significant modulation of delay activity when the gaze was shifted to an eccentric location. Despite this modulation, the neurons continued to signal their preferred cue location during most of the delay. However, after recentering saccades, the memory signal was temporarily abolished and then reemerged over a period of a few hundred milliseconds. This is consistent with the idea that spatial working memory is buffered outside of the FEF. For most neurons, delay activity tended to increase when the gaze was shifted away from the preferred location and to decrease when the gaze was shifted toward the preferred location. This pattern of modulation is consistent with a vector subtraction mechanism that allows for the superposition of multiple saccade plans.
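The vector subtraction mechanism mentioned in this abstract can be sketched in a few lines: a remembered target stored in eye-centered coordinates stays spatially accurate across a gaze shift if the gaze displacement is subtracted from the stored vector. The 2-D coordinates, the `remap` helper, and the numbers below are hypothetical illustrations, not the study's analysis.

```python
def remap(cue_eye, gaze_shift):
    """Update an eye-centered target vector after a gaze shift by
    vector subtraction (hypothetical 2-D coordinates, in degrees)."""
    return (cue_eye[0] - gaze_shift[0], cue_eye[1] - gaze_shift[1])

cue_eye = (10.0, 5.0)     # remembered cue location, eye-centered
gaze_shift = (4.0, -3.0)  # intervening eccentric gaze shift

cue_updated = remap(cue_eye, gaze_shift)

# The spatial location recovered from the new gaze position matches
# the one recovered before the shift, so the memory stays accurate.
old_space = (0.0 + cue_eye[0], 0.0 + cue_eye[1])
new_space = (gaze_shift[0] + cue_updated[0], gaze_shift[1] + cue_updated[1])
assert old_space == new_space
```

This also matches the reported modulation: the stored eye-centered vector grows when gaze moves away from the preferred location and shrinks when gaze moves toward it.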
|
49
|
Role of primate superior colliculus in preparation and execution of anti-saccades and pro-saccades. J Neurosci 1999; 19:2740-54. [PMID: 10087086 PMCID: PMC6786089]
Abstract
We investigated how the brain switches between the preparation of a movement where a stimulus is the target of the movement, and a movement where a stimulus serves as a landmark for an instructed movement elsewhere. Monkeys were trained on a pro-/anti-saccade paradigm in which they either had to generate a pro-saccade toward a visual stimulus or an anti-saccade away from the stimulus to its mirror position, depending on the color of an initial fixation point. Neural activity was recorded in the superior colliculus (SC), a structure that is known to be involved in the generation of fast saccades, to determine whether it was also involved in the generation of anti-saccades. On anti-saccade trials, fixation during the instruction period was associated with an increased activity of collicular fixation-related neurons and a decreased activity of saccade-related neurons. Stimulus-related and saccade-related activity was reduced on anti-saccade trials. Our results demonstrate that the anti-saccade task involves (and may require) the attenuation of preparatory and stimulus-related activity in the SC to avoid unwanted pro-saccades. Because the attenuated pre-saccade activity that we found in the SC may be insufficient by itself to elicit correct anti-saccades, additional movement signals from other brain areas are presumably required.
|
50
|
Quantitative analysis of a directed behavior in the medicinal leech: implications for organizing motor output. J Neurosci 1998; 18:1571-82. [PMID: 9454862 PMCID: PMC6792712]
Abstract
The local bend is a directed behavior produced by the leech, Hirudo medicinalis, in response to a light touch. Contraction of longitudinal muscles near the touched location results in a bend directed away from the stimulus. We quantify the relationship between the location of touch around the body perimeter and the behavioral output by using video analysis, muscle tension measurements, and electromyography. On average, the direction of the behavioral output differed from the touch location by <8% of the total body perimeter. We discuss our results in the context of two contrasting behavioral strategies: a Continuous strategy, in which the local bend is directed exactly opposite to stimulus location, and a Categorical strategy, in which there are four distinct bend directions, each elicited by stimuli given in a single quadrant of the body perimeter. To distinguish between these strategies, we delivered two competing stimuli simultaneously. The resulting behavioral output is best described by an average of the effects of each stimulus given alone and thus provides support for the Continuous strategy. We also use a simple model, based on anatomical and physiological data, to predict the responses of the known motor neurons to different stimulus locations. The model shows that the activation of two of the motor neurons (D and V) is inconsistent with a Categorical strategy. However, these neurons are known to be active during the local bend behavior. This result, along with our experimental observations, suggests that the local bend network uses a Continuous strategy to encode stimulus location and produce directed behavioral output.
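The Continuous strategy described here can be illustrated with a toy model in which the bend is directed exactly opposite the touched location, and two simultaneous touches drive the circular average of the two single-stimulus responses. The angle convention and the `bend_direction`/`combined_bend` helpers are hypothetical simplifications, not the paper's actual quantification.

```python
import math

def bend_direction(touch_angle_deg):
    """Continuous strategy: the local bend is directed exactly
    opposite the touched location on the body perimeter."""
    return (touch_angle_deg + 180.0) % 360.0

def combined_bend(angle1_deg, angle2_deg):
    """Two simultaneous touches: the output approximates the circular
    average of the responses to each stimulus given alone."""
    responses = [bend_direction(angle1_deg), bend_direction(angle2_deg)]
    x = sum(math.cos(math.radians(a)) for a in responses)
    y = sum(math.sin(math.radians(a)) for a in responses)
    return math.degrees(math.atan2(y, x)) % 360.0

# Touches at 30 and 90 degrees each alone drive bends at 210 and 270
# degrees; together the Continuous strategy predicts a bend near 240,
# rather than one of four fixed quadrant-wise directions.
assert round(combined_bend(30.0, 90.0)) == 240
```

A Categorical strategy would instead map every touch within a quadrant to a single fixed bend direction, which is what the two-stimulus experiment rules out.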
|