1
Scott DN, Frank MJ. Adaptive control of synaptic plasticity integrates micro- and macroscopic network function. Neuropsychopharmacology 2023; 48:121-144. PMID: 36038780; PMCID: PMC9700774; DOI: 10.1038/s41386-022-01374-6.
Abstract
Synaptic plasticity configures interactions between neurons and is therefore likely to be a primary driver of behavioral learning and development. How this microscopic-macroscopic interaction occurs is poorly understood, as researchers frequently examine models within particular ranges of abstraction and scale. Computational neuroscience and machine learning models offer theoretically powerful analyses of plasticity in neural networks, but results are often siloed and only coarsely linked to biology. In this review, we examine connections between these areas, asking how network computations change as a function of diverse features of plasticity and vice versa. We review how plasticity can be controlled at synapses by calcium dynamics and neuromodulatory signals, the manifestation of these changes in networks, and their impacts in specialized circuits. We conclude that metaplasticity (defined broadly as the adaptive control of plasticity) forges connections across scales by governing what groups of synapses can and can't learn about, when, and to what ends. The metaplasticity we discuss acts by co-opting Hebbian mechanisms, shifting network properties, and routing activity within and across brain systems. Asking how these operations can go awry should also be useful for understanding pathology, which we address in the context of autism, schizophrenia, and Parkinson's disease.
Affiliation(s)
- Daniel N Scott: Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, RI, USA; Carney Institute for Brain Science, Brown University, Providence, RI, USA.
- Michael J Frank: Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, RI, USA; Carney Institute for Brain Science, Brown University, Providence, RI, USA.
2
Gao T, Deng B, Wang J, Wang J, Yi G. The passive properties of dendrites modulate the propagation of slowly-varying firing rate in feedforward networks. Neural Netw 2022; 150:377-391. DOI: 10.1016/j.neunet.2022.03.001.
3
Setareh H, Deger M, Gerstner W. Excitable neuronal assemblies with adaptation as a building block of brain circuits for velocity-controlled signal propagation. PLoS Comput Biol 2018; 14:e1006216. PMID: 29979674; PMCID: PMC6051644; DOI: 10.1371/journal.pcbi.1006216.
Abstract
The time scale of neuronal network dynamics is determined by synaptic interactions and neuronal signal integration, both of which occur on the time scale of milliseconds. Yet many behaviors, like the generation of movements or vocalizations, occur on the much slower time scale of seconds. Here we ask how neuronal networks of the brain can support reliable behavior on this time scale. We argue that excitable neuronal assemblies with spike-frequency adaptation may serve as building blocks that can flexibly adjust the speed of execution of neural circuit function. We show in simulations that a chain of neuronal assemblies can propagate signals reliably, similar to the well-known synfire chain, but with the crucial difference that the propagation speed is slower and tunable to the behaviorally relevant range. Moreover, we study a grid of excitable neuronal assemblies as a simplified model of the somatosensory barrel cortex of the mouse and demonstrate that various patterns of experimentally observed spatial activity propagation can be explained.

Models of activity propagation in cortical networks have often been based on feedforward structures. Here we propose a model of activity propagation, called the excitation chain, which does not need such a feedforward structure. The model is composed of excitable neural assemblies with spike-frequency adaptation, connected bidirectionally in a row or a grid. This prototypical neural circuit can propagate activity forwards, backwards, or in both directions. Furthermore, the propagation speed is slow enough to trigger the generation of behaviors on the time scale of hundreds of milliseconds. A two-dimensional variant of the model is able to generate different activity propagation patterns, similar to spontaneous activity and stimulus-evoked responses in anesthetized mouse barrel cortex. We propose the excitation chain model as a basic component that can be employed in various ways to create spiking neural circuit models that generate signals on behavioral time scales. In contrast to abstract models of excitable media, our model makes an explicit link to the time scale of neuronal spikes.
Affiliation(s)
- Hesam Setareh: School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Switzerland.
- Moritz Deger: School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Switzerland; Institute for Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, Köln, Germany.
- Wulfram Gerstner: School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Switzerland.
4
Polyakov F. Affine differential geometry and smoothness maximization as tools for identifying geometric movement primitives. Biol Cybern 2017; 111:5-24. PMID: 27822891; DOI: 10.1007/s00422-016-0705-7.
Abstract
Neuroscientific studies of drawing-like movements usually analyze the neural representation of either geometric (e.g., direction, shape) or temporal (e.g., speed) parameters of trajectories, rather than the representation of the trajectory as a whole. This work identifies geometric building blocks of movements by unifying different empirically supported mathematical descriptions of the relationship between the geometric and temporal aspects of biological motion. Movement primitives are thought to make the brain's representation of movement more efficient and to comply with criteria for biological movement such as kinematic smoothness and geometric constraint. The minimum-jerk model formalizes the criterion of maximal trajectory smoothness of order 3. I derive a class of differential equations obeyed by movement paths whose nth-order maximally smooth trajectories accumulate a path measure at a constant rate. A constant rate of accumulation of equi-affine arc length is consistent with the two-thirds power-law model. Candidate primitive shapes, identified as the equations' solutions for arcs in different planar and spatial geometries, are presented. A connection between geometric invariance, motion smoothness, compositionality, and the performance of a compromised motor control system is proposed within a single invariance-smoothness framework. The derived class of differential equations is a novel tool for discovering candidate geometric movement primitives.
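The constant-rate equi-affine criterion mentioned above can be checked numerically. For an ellipse (a cos t, b sin t), the parameter t is already (up to scale) the equi-affine arc length, so the two-thirds power law predicts that speed times curvature to the one-third power is constant along the path (analytically, (ab)^(1/3)). A minimal sketch; the function name and sampling density are illustrative.

```python
import math

def ellipse_power_law_check(a=2.0, b=1.0, n=2000):
    """Trace the ellipse (a*cos t, b*sin t) at constant t-rate, its
    equi-affine parametrization, and return the min and max of
    v * kappa**(1/3) along the path. The two-thirds power law predicts
    this product is constant (analytically equal to (a*b)**(1/3))."""
    vals = []
    for i in range(n):
        t = 2 * math.pi * i / n
        dx, dy = -a * math.sin(t), b * math.cos(t)      # velocity
        ddx, ddy = -a * math.cos(t), -b * math.sin(t)   # acceleration
        v = math.hypot(dx, dy)                          # speed
        kappa = abs(dx * ddy - dy * ddx) / v ** 3       # curvature
        vals.append(v * kappa ** (1.0 / 3.0))
    return min(vals), max(vals)
```

Here v·κ^(1/3) reduces algebraically to (ab)^(1/3), so the spread between min and max is pure floating-point noise; for a trajectory that violates the power law the spread would be large.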
Affiliation(s)
- Felix Polyakov: Department of Mathematics, Ariel University, Ariel, Israel; Department of Mathematics, Bar Ilan University, Ramat Gan, Israel.
5
Trengove C, Diesmann M, van Leeuwen C. Dynamic effective connectivity in cortically embedded systems of recurrently coupled synfire chains. J Comput Neurosci 2015; 40:1-26. PMID: 26560334; PMCID: PMC4762935; DOI: 10.1007/s10827-015-0581-5.
Abstract
As a candidate mechanism of neural representation, large numbers of synfire chains can efficiently be embedded in a balanced recurrent cortical network model. Here we study a model in which multiple synfire chains of variable strength are randomly coupled together to form a recurrent system. The system can be implemented both as a large-scale network of integrate-and-fire neurons and as a reduced model. The latter has binary-state pools as basic units but is otherwise isomorphic to the large-scale model, and provides an efficient tool for studying its behavior. Both the large-scale system and its reduced counterpart are able to sustain ongoing endogenous activity in the form of synfire waves, the proliferation of which is regulated by negative feedback caused by collateral noise. Within this equilibrium, diverse repertoires of ongoing activity are observed, including meta-stability and multiple steady states. These states arise in concert with an effective connectivity structure (ECS). The ECS admits a family of effective connectivity graphs (ECGs), parametrized by the mean global activity level. Of these graphs, the strongly connected components and their associated out-components account to a large extent for the observed steady states of the system. These results imply a notion of dynamic effective connectivity as governing neural computation with synfire chains, and related forms of cortical circuitry with complex topologies.
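The notion of an effective connectivity graph (ECG) parametrized by the activity level, with strongly connected components accounting for steady states, can be illustrated with a toy construction: threshold a matrix of chain-coupling strengths to obtain a directed graph, then extract its strongly connected components. The thresholding here is a crude stand-in for the paper's mean-field construction, and all names and numbers are illustrative.

```python
def effective_graph(weights, threshold):
    """Keep only the couplings strong enough to propagate activity at a
    given level (modeled here as a simple threshold on strength);
    returns an adjacency dict over node indices."""
    n = len(weights)
    return {i: [j for j in range(n) if j != i and weights[i][j] >= threshold]
            for i in range(n)}

def strongly_connected_components(adj):
    """Kosaraju's algorithm on an adjacency-dict graph."""
    order, seen = [], set()
    def dfs(u):                       # first pass: record finish order
        seen.add(u)
        for v in adj[u]:
            if v not in seen:
                dfs(v)
        order.append(u)
    for u in adj:
        if u not in seen:
            dfs(u)
    radj = {u: [] for u in adj}       # reversed graph
    for u in adj:
        for v in adj[u]:
            radj[v].append(u)
    comps, seen = [], set()
    for u in reversed(order):         # second pass: collect components
        if u in seen:
            continue
        stack, comp = [u], []
        seen.add(u)
        while stack:
            x = stack.pop()
            comp.append(x)
            for v in radj[x]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        comps.append(sorted(comp))
    return comps
```

Raising the threshold (lower mean activity) fragments a single recurrent component into several smaller ones, mirroring how the family of ECGs changes with the global activity level.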
Affiliation(s)
- Chris Trengove: Perceptual Dynamics Laboratory, University of Leuven, Leuven, Belgium.
- Markus Diesmann: Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany; Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany.
- Cees van Leeuwen: Perceptual Dynamics Laboratory, University of Leuven, Leuven, Belgium; TU Kaiserslautern, Kaiserslautern, Germany.
6
Abeles M, Diesmann M, Flash T, Geisel T, Herrmann M, Teicher M. Compositionality in neural control: an interdisciplinary study of scribbling movements in primates. Front Comput Neurosci 2013; 7:103. PMID: 24062679; PMCID: PMC3771313; DOI: 10.3389/fncom.2013.00103.
Abstract
This article discusses the compositional structure of hand movements by analyzing and modeling neural and behavioral data obtained from experiments where a monkey (Macaca fascicularis) performed scribbling movements induced by a search task. Using geometrically based approaches to movement segmentation, it is shown that the hand trajectories are composed of elementary segments that are primarily parabolic in shape. The segments could be categorized into a small number of classes on the basis of decreasing intra-class variance over the course of training. A separate classification of the neural data employing a hidden Markov model showed a coincidence of the neural states with the behavioral categories. An additional analysis of both types of data by a data mining method provided evidence that the neural activity patterns underlying the behavioral primitives were formed by sets of specific and precise spike patterns. A geometric description of the movement trajectories, together with precise neural timing data, indicates a compositional variant of a realistic synfire chain model. This model reproduces the typical shapes and temporal properties of the trajectories; hence the structure and composition of the primitives may reflect meaningful behavior.
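A minimal version of the "primarily parabolic segments" test is to fit a candidate segment with a parabola by least squares and inspect the residual: parabola-like strokes fit almost exactly, other shapes do not. This sketch (normal equations in pure Python) is an illustrative stand-in, not the segmentation method used in the paper.

```python
import math

def fit_parabola(xs, ys):
    """Least-squares fit of y = c0 + c1*x + c2*x^2 via the normal
    equations; returns (coefficients, rms residual). A low residual
    flags a segment as parabola-like."""
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for x, y in zip(xs, ys):
        phi = [1.0, x, x * x]                  # basis functions at x
        for i in range(3):
            b[i] += phi[i] * y
            for j in range(3):
                A[i][j] += phi[i] * phi[j]
    for col in range(3):                       # Gaussian elimination
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for row in range(col + 1, 3):
            fac = A[row][col] / A[col][col]
            for j in range(col, 3):
                A[row][j] -= fac * A[col][j]
            b[row] -= fac * b[col]
    c = [0.0] * 3
    for i in (2, 1, 0):                        # back substitution
        c[i] = (b[i] - sum(A[i][j] * c[j] for j in range(i + 1, 3))) / A[i][i]
    res = [y - (c[0] + c[1] * x + c[2] * x * x) for x, y in zip(xs, ys)]
    return c, math.sqrt(sum(e * e for e in res) / len(res))
```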
Affiliation(s)
- Moshe Abeles: Gonda Brain Research Center, Bar Ilan University, Ramat Gan, Israel; Department of Physiology, The Hebrew University of Jerusalem, Jerusalem, Israel.
7
High-capacity embedding of synfire chains in a cortical network model. J Comput Neurosci 2012; 34:185-209. PMID: 22878688; PMCID: PMC3605496; DOI: 10.1007/s10827-012-0413-9.
Abstract
Synfire chains, sequences of pools linked by feedforward connections, support the propagation of precisely timed spike sequences, or synfire waves. An important question remains, how synfire chains can efficiently be embedded in cortical architecture. We present a model of synfire chain embedding in a cortical scale recurrent network using conductance-based synapses, balanced chains, and variable transmission delays. The network attains substantially higher embedding capacities than previous spiking neuron models and allows all its connections to be used for embedding. The number of waves in the model is regulated by recurrent background noise. We computationally explore the embedding capacity limit, and use a mean field analysis to describe the equilibrium state. Simulations confirm the mean field analysis over broad ranges of pool sizes and connectivity levels; the number of pools embedded in the system trades off against the firing rate and the number of waves. An optimal inhibition level balances the conflicting requirements of stable synfire propagation and limited response to background noise. A simplified analysis shows that the present conductance-based synapses achieve higher contrast between the responses to synfire input and background noise compared to current-based synapses, while regulation of wave numbers is traced to the use of variable transmission delays.
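The basic synfire propagation regime behind this abstract can be illustrated with a toy binary-neuron chain: a full wave entering a pool reliably ignites the next pool, while a weak one dies out. The pool size, connection probability, and threshold below are illustrative toy values, not the paper's conductance-based model.

```python
import random

def run_synfire(n_pools=10, pool_size=100, p_connect=0.5,
                threshold=30, n_start=100, seed=0):
    """Binary-neuron synfire chain: a neuron in pool l+1 fires one time
    step later iff at least `threshold` of its inputs from pool l were
    active. Returns the number of active neurons in each pool as the
    wave passes through."""
    rng = random.Random(seed)
    # conn[l][j]: indices in pool l that feed neuron j of pool l+1
    conn = [[[i for i in range(pool_size) if rng.random() < p_connect]
             for _ in range(pool_size)]
            for _ in range(n_pools - 1)]
    active = set(range(n_start))               # ignite the first pool
    counts = [len(active)]
    for l in range(n_pools - 1):
        active = {j for j in range(pool_size)
                  if sum(i in active for i in conn[l][j]) >= threshold}
        counts.append(len(active))
    return counts
```

With a full starting pool, each downstream neuron receives about 50 active inputs against a threshold of 30, so the wave is transmitted with high fidelity; igniting only 40 neurons leaves most downstream neurons below threshold and the wave collapses, the all-or-none behavior that makes synfire waves a candidate unit of neural computation.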
8
An imperfect dopaminergic error signal can drive temporal-difference learning. PLoS Comput Biol 2011; 7:e1001133. PMID: 21589888; PMCID: PMC3093351; DOI: 10.1371/journal.pcbi.1001133.
Abstract
An open problem in the field of computational neuroscience is how to link synaptic plasticity to system-level learning. A promising framework in this context is temporal-difference (TD) learning. Experimental evidence that supports the hypothesis that the mammalian brain performs temporal-difference learning includes the resemblance of the phasic activity of the midbrain dopaminergic neurons to the TD error and the discovery that cortico-striatal synaptic plasticity is modulated by dopamine. However, as the phasic dopaminergic signal does not reproduce all the properties of the theoretical TD error, it is unclear whether it is capable of driving behavior adaptation in complex tasks. Here, we present a spiking temporal-difference learning model based on the actor-critic architecture. The model dynamically generates a dopaminergic signal with realistic firing rates and exploits this signal to modulate the plasticity of synapses as a third factor. The predictions of our proposed plasticity dynamics are in good agreement with experimental results with respect to dopamine, pre- and post-synaptic activity. An analytical mapping from the parameters of our proposed plasticity dynamics to those of the classical discrete-time TD algorithm reveals that the biological constraints of the dopaminergic signal entail a modified TD algorithm with self-adapting learning parameters and an adapting offset. We show that the neuronal network is able to learn a task with sparse positive rewards as fast as the corresponding classical discrete-time TD algorithm. However, the performance of the neuronal network is impaired with respect to the traditional algorithm on a task with both positive and negative rewards and breaks down entirely on a task with purely negative rewards. Our model demonstrates that the asymmetry of a realistic dopaminergic signal enables TD learning when learning is driven by positive rewards but not when driven by negative rewards. 
What are the physiological changes that take place in the brain when we solve a problem or learn a new skill? It is commonly assumed that behavior adaptations are realized on the microscopic level by changes in synaptic efficacies. However, this is hard to verify experimentally due to the difficulties of identifying the relevant synapses and monitoring them over long periods during a behavioral task. To address this question computationally, we develop a spiking neuronal network model of actor-critic temporal-difference learning, a variant of reinforcement learning for which neural correlates have already been partially established. The network learns a complex task by means of an internally generated reward signal constrained by recent findings on the dopaminergic system. Our model combines top-down and bottom-up modelling approaches to bridge the gap between synaptic plasticity and system-level learning. It paves the way for further investigations of the dopaminergic system in reward learning in the healthy brain and in pathological conditions such as Parkinson's disease, and can be used as a module in functional models based on brain-scale circuitry.
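The paper's central manipulation, an asymmetric dopamine-like TD error, can be sketched in a tabular actor-critic on a 1-D chain task: clipping the TD error below at a small negative floor mimics a phasic signal whose low baseline firing rate limits negative excursions, and learning driven by positive rewards still succeeds. This is a schematic stand-in for the spiking model; the task, function name, and parameters are all illustrative assumptions.

```python
import math
import random

def train_chain_task(n_states=6, episodes=500, alpha=0.1,
                     gamma=0.95, floor=-0.1, seed=1):
    """Tabular actor-critic TD(0) on a 1-D chain with reward +1 at the
    right end. The TD error is clipped below at `floor`: a crude stand-in
    for a phasic dopamine signal whose low baseline firing rate limits
    how negative an error it can encode. Returns the actor preferences
    theta[s] (positive means 'step right' is preferred in state s)."""
    rng = random.Random(seed)
    V = [0.0] * (n_states + 1)                 # critic: state values
    theta = [0.0] * n_states                   # actor: preference for 'right'
    for _ in range(episodes):
        s = 0
        for _ in range(10 * n_states):         # step cap per episode
            p_right = 1.0 / (1.0 + math.exp(-theta[s]))
            go_right = rng.random() < p_right
            s2 = s + 1 if go_right else max(s - 1, 0)
            r = 1.0 if s2 == n_states else 0.0
            v_next = 0.0 if s2 == n_states else V[s2]
            delta = max(r + gamma * v_next - V[s], floor)   # asymmetric error
            V[s] += alpha * delta
            # logistic policy-gradient update of the actor
            theta[s] += alpha * delta * ((1.0 - p_right) if go_right else -p_right)
            if s2 == n_states:
                break
            s = s2
    return theta
```

With only positive rewards, the clipped error barely differs from the ideal TD error and the policy converges on stepping right everywhere; making the rewards negative would saturate the floor and stall learning, mirroring the paper's finding.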
9
Schrader S, Diesmann M, Morrison A. A compositionality machine realized by a hierarchic architecture of synfire chains. Front Comput Neurosci 2011; 4:154. PMID: 21258641; PMCID: PMC3020397; DOI: 10.3389/fncom.2010.00154.
Abstract
The composition of complex behavior is thought to rely on the concurrent and sequential activation of simpler action components, or primitives. Systems of synfire chains have previously been proposed to account for either the simultaneous or the sequential aspects of compositionality; however, the compatibility of the two aspects has so far not been addressed. Moreover, the simultaneous activation of primitives has up until now only been investigated in the context of reactive computations, i.e., the perception of stimuli. In this study we demonstrate how a hierarchical organization of synfire chains is capable of generating both aspects of compositionality for proactive computations such as the generation of complex and ongoing action. To this end, we develop a network model consisting of two layers of synfire chains. Using simple drawing strokes as a visualization of abstract primitives, we map the feed-forward activity of the upper level synfire chains to motion in two-dimensional space. Our model is capable of producing drawing strokes that are combinations of primitive strokes by binding together the corresponding chains. Moreover, when the lower layer of the network is constructed in a closed-loop fashion, drawing strokes are generated sequentially. The generated pattern can be random or deterministic, depending on the connection pattern between the lower level chains. We propose quantitative measures for simultaneity and sequentiality, revealing a wide parameter range in which both aspects are fulfilled. Finally, we investigate the spiking activity of our model to propose candidate signatures of synfire chain computation in measurements of neural activity during action execution.