251
Clopath C, Gerstner W. Voltage and Spike Timing Interact in STDP - A Unified Model. Front Synaptic Neurosci 2010; 2:25. [PMID: 21423511] [PMCID: PMC3059665] [DOI: 10.3389/fnsyn.2010.00025]
Abstract
A phenomenological model of synaptic plasticity is able to account for a large body of experimental data on spike-timing-dependent plasticity (STDP). The basic ingredient of the model is the correlation of presynaptic spike arrival with postsynaptic voltage. The local membrane voltage is used twice: one term accounts for the instantaneous voltage and a second for a low-pass-filtered voltage trace. Spike-timing effects emerge as a special case. We hypothesize that the voltage dependence can explain differential effects of STDP in dendrites, since the amplitude and time course of backpropagating action potentials or dendritic spikes influence the plasticity results in the model. The dendritic effects are simulated by variable choices of voltage time course at the site of the synapse, i.e., without an explicit model of the spatial structure of the neuron.
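As an illustration of the kind of rule this abstract describes, here is a minimal discrete-time Python sketch in which the postsynaptic voltage is "used twice" (instantaneously and low-pass filtered). All class names, thresholds, and constants are illustrative assumptions, not the published parameter values.

```python
class VoltagePlasticity:
    """Sketch of a voltage-based plasticity rule: depression pairs
    presynaptic spikes with filtered depolarization; potentiation pairs
    a filtered presynaptic trace with strong instantaneous voltage.
    Constants are illustrative, not the published values."""

    def __init__(self, dt=1.0, tau_x=15.0, tau_u=10.0,
                 theta_minus=-65.0, theta_plus=-45.0,
                 a_ltd=1e-3, a_ltp=1e-5):
        self.dt, self.tau_x, self.tau_u = dt, tau_x, tau_u
        self.theta_minus, self.theta_plus = theta_minus, theta_plus
        self.a_ltd, self.a_ltp = a_ltd, a_ltp
        self.x_bar = 0.0      # low-pass-filtered presynaptic spike trace
        self.u_bar = -70.0    # low-pass-filtered postsynaptic voltage

    def step(self, w, u, pre_spike):
        """Advance one time step; u is the instantaneous voltage (mV)."""
        # the voltage enters twice: instantaneous (u) and filtered (u_bar)
        self.x_bar += self.dt * (-self.x_bar / self.tau_x) + (1.0 if pre_spike else 0.0)
        self.u_bar += self.dt * (u - self.u_bar) / self.tau_u
        # depression: presynaptic spike paired with filtered depolarization
        ltd = self.a_ltd * (1.0 if pre_spike else 0.0) * max(self.u_bar - self.theta_minus, 0.0)
        # potentiation: filtered pre-trace paired with strong instantaneous
        # depolarization, additionally gated by the filtered voltage
        ltp = (self.a_ltp * self.x_bar * max(u - self.theta_plus, 0.0)
               * max(self.u_bar - self.theta_minus, 0.0))
        return w + self.dt * (ltp - ltd)
```

With these (made-up) constants, sustained strong depolarization paired with presynaptic spikes yields net potentiation, moderate depolarization yields net depression, and a hyperpolarized cell leaves the weight untouched, which is the qualitative behavior the abstract attributes to the voltage dependence.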
Affiliation(s)
- Claudia Clopath
- Laboratory of Computational Neuroscience, Brain-Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
252
Froemke RC, Debanne D, Bi GQ. Temporal modulation of spike-timing-dependent plasticity. Front Synaptic Neurosci 2010; 2:19. [PMID: 21423505] [PMCID: PMC3059714] [DOI: 10.3389/fnsyn.2010.00019]
Abstract
Spike-timing-dependent plasticity (STDP) has attracted considerable experimental and theoretical attention over the last decade. In the most basic formulation, STDP provides a fundamental unit – a spike pair – for quantifying the induction of long-term changes in synaptic strength. However, many factors, both pre- and postsynaptic, can affect synaptic transmission and integration, especially when multiple spikes are considered. Here we review the experimental evidence for multiple types of nonlinear temporal interactions in STDP, focusing on the contributions of individual spike pairs, overall spike rate, and precise spike timing for modification of cortical and hippocampal excitatory synapses. We discuss the underlying processes that determine the specific learning rules at different synapses, such as postsynaptic excitability and short-term depression. Finally, we describe the success of efforts toward building predictive, quantitative models of how complex and natural spike trains induce long-term synaptic modifications.
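The "fundamental unit" the review starts from, a single spike pair, is conventionally modeled with a double-exponential timing window; a naive all-pairs sum over two spike trains is then exactly the linear baseline from which real synapses deviate. A short Python sketch (amplitudes and time constants are illustrative, not measured values):

```python
import math

def stdp_pair(dt_ms, a_plus=0.01, a_minus=0.012,
              tau_plus=20.0, tau_minus=20.0):
    """Weight change for one pre/post spike pair, dt_ms = t_post - t_pre.
    Pre-before-post (dt_ms > 0) potentiates, post-before-pre depresses,
    both decaying exponentially with |dt_ms|."""
    if dt_ms >= 0:
        return a_plus * math.exp(-dt_ms / tau_plus)
    return -a_minus * math.exp(dt_ms / tau_minus)

def total_change(pre_times, post_times):
    """Naive all-pairs summation over two spike trains. The nonlinear
    deviations from this linear sum are the review's central topic."""
    return sum(stdp_pair(t_post - t_pre)
               for t_pre in pre_times for t_post in post_times)
```

For example, `total_change([0.0, 100.0], [10.0, 110.0])` is dominated by the two tightly timed pre-before-post pairings and comes out positive.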
Affiliation(s)
- Robert C Froemke
- Molecular Neurobiology Program, Departments of Otolaryngology and Physiology/Neuroscience, The Helen and Martin Kimmel Center for Biology and Medicine, Skirball Institute of Biomolecular Medicine, New York University School of Medicine, New York, NY, USA
253
Meredith RM, Mansvelder HD. STDP and Mental Retardation: Dysregulation of Dendritic Excitability in Fragile X Syndrome. Front Synaptic Neurosci 2010; 2:10. [PMID: 21423496] [PMCID: PMC3059693] [DOI: 10.3389/fnsyn.2010.00010]
Abstract
Development of cognitive function requires the formation and refinement of synaptic networks of neurons in the brain. Morphological abnormalities of synaptic spines occur throughout the brain in a wide variety of syndromic and non-syndromic disorders of mental retardation (MR). In both neurons from human post-mortem tissue and mouse models of retardation, the changes observed in synaptic spine and dendritic morphology can be subtle, in the range of 10-20% alterations for spine protrusion length and density. Functionally, synapses in hippocampus and cortex show deficits in long-term potentiation (LTP) and long-term depression (LTD) in an array of neurodevelopmental disorders including Down, Angelman, Fragile X, and Rett syndromes. Recent studies have shown that in principle the machinery for synaptic plasticity is in place in these synapses, but that significant alterations in spike-timing-dependent plasticity (STDP) induction rules exist in cortical synaptic pathways of Fragile X MR syndrome. In this model, the threshold for inducing timing-dependent long-term potentiation (tLTP) is increased in these synapses. Increased postsynaptic activity can overcome this threshold and induce normal levels of tLTP. In this review, we bring together recent studies investigating STDP in neurodevelopmental learning disorders using Fragile X syndrome as a model, and we argue that alterations in dendritic excitability underlie the deficits seen in STDP. Known and candidate dendritic mechanisms that may underlie the plasticity deficits are discussed. Studying STDP in monogenic MR syndromes with clear deficits in information processing at the cognitive level also provides the field with an opportunity to make direct links between cognition and processing rules at the synapse during development.
Affiliation(s)
- Rhiannon M. Meredith
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Neuroscience Campus Amsterdam, VU University, Amsterdam, Netherlands
- Huibert D. Mansvelder
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Neuroscience Campus Amsterdam, VU University, Amsterdam, Netherlands
254
Watt AJ, Desai NS. Homeostatic Plasticity and STDP: Keeping a Neuron's Cool in a Fluctuating World. Front Synaptic Neurosci 2010; 2:5. [PMID: 21423491] [PMCID: PMC3059670] [DOI: 10.3389/fnsyn.2010.00005]
Abstract
Spike-timing-dependent plasticity (STDP) offers a powerful means of forming and modifying neural circuits. Experimental and theoretical studies have demonstrated its potential usefulness for functions as varied as cortical map development, sharpening of sensory receptive fields, working memory, and associative learning. Even so, it is unlikely that STDP works alone. Unless changes in synaptic strength are coordinated across multiple synapses and with other neuronal properties, it is difficult to maintain the stability and functionality of neural circuits. Moreover, there are certain features of early postnatal development (e.g., rapid changes in sensory input) that threaten neural circuit stability in ways that STDP may not be well placed to counter. These considerations have led researchers to investigate additional types of plasticity, complementary to STDP, that may serve to constrain synaptic weights and/or neuronal firing. These are collectively known as “homeostatic plasticity” and include schemes that control the total synaptic strength of a neuron, that modulate its intrinsic excitability as a function of average activity, or that make the ability of synapses to undergo Hebbian modification depend upon their history of use. In this article, we review the experimental evidence for homeostatic forms of plasticity and consider how they might interact with STDP during development, learning, and memory.
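One of the homeostatic schemes mentioned, controlling a neuron's total synaptic strength, is often modeled as multiplicative synaptic scaling. A minimal Python sketch (function name, learning rate, and rate units are assumptions for illustration):

```python
def scale_weights(weights, avg_rate, target_rate, eta=0.1):
    """Multiplicative homeostatic scaling: firing above the target rate
    shrinks every weight by the same factor; firing below it grows them.
    Relative differences between synapses (e.g. those set up by STDP)
    are preserved, so the Hebbian information survives the correction."""
    factor = 1.0 + eta * (target_rate - avg_rate) / target_rate
    return [w * factor for w in weights]
```

The multiplicative form is the key design choice: an additive correction would erase weight differences that STDP has carved out, whereas a common factor leaves their ratios intact.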
Affiliation(s)
- Alanna J Watt
- Wolfson Institute for Biomedical Research, University College London, London, UK
255
Burst-induced anti-Hebbian depression acts through short-term synaptic dynamics to cancel redundant sensory signals. J Neurosci 2010; 30:6152-69. [PMID: 20427673] [DOI: 10.1523/jneurosci.0303-10.2010]
Abstract
Weakly electric fish can enhance the detection and localization of important signals, such as those of prey, in part by cancelling redundant, spatially diffuse electric signals such as those produced by their own tail bending. The cancellation mechanism is based on descending input, conveyed by parallel fibers emanating from cerebellar granule cells, that produces a negative image of the global low-frequency signals in pyramidal cells within the first-order electrosensory region, the electrosensory lateral line lobe (ELL). Here we demonstrate that the parallel fiber synaptic input to ELL pyramidal cells undergoes long-term depression (LTD) whenever both parallel fiber afferents and their target cells are stimulated to produce paired burst discharges. Paired large bursts (4-4) induce robust LTD over pre-post delays of up to +/-50 ms, whereas smaller bursts (2-2) induce weaker LTD. Single spikes (either presynaptic or postsynaptic) paired with bursts did not induce LTD. Tetanic presynaptic stimulation was also ineffective in inducing LTD. Thus, we have demonstrated a form of anti-Hebbian LTD that depends on the temporal correlation of burst discharge. We then demonstrated that the burst-induced LTD is postsynaptic and requires the NR2B subunit of the NMDA receptor, elevation of postsynaptic Ca(2+), and activation of CaMKIIbeta. A model incorporating local inhibitory circuitry and previously identified short-term presynaptic potentiation of the parallel fiber synapses further suggests that the combination of burst-induced LTD, presynaptic potentiation, and local inhibition may be sufficient to explain the generation of the negative image and cancellation of redundant sensory input by ELL pyramidal cells.
256
Modelling the molecular mechanisms of synaptic plasticity using systems biology approaches. Nat Rev Neurosci 2010; 11:239-51. [PMID: 20300102] [DOI: 10.1038/nrn2807]
Abstract
Synaptic plasticity is thought to underlie learning and memory, but the complexity of the interactions between the ion channels, enzymes and genes that are involved in synaptic plasticity impedes a deep understanding of this phenomenon. Computer modelling has been used to investigate the information processing that is performed by the signalling pathways involved in synaptic plasticity in principal neurons of the hippocampus, striatum and cerebellum. In the past few years, new software developments that combine computational neuroscience techniques with systems biology techniques have made it possible to build large-scale kinetic models of the molecular mechanisms underlying long-term potentiation and long-term depression. We highlight important advances produced by these quantitative modelling efforts and introduce promising approaches that exploit recent developments in live-cell imaging.
257
258
Kozloski J, Cecchi GA. A theory of loop formation and elimination by spike timing-dependent plasticity. Front Neural Circuits 2010; 4:7. [PMID: 20407633] [PMCID: PMC2856591] [DOI: 10.3389/fncir.2010.00007]
Abstract
We show that the local spike timing-dependent plasticity (STDP) rule has the effect of regulating the trans-synaptic weights of loops of any length within a simulated network of neurons. We show that depending on STDP's polarity, functional loops are formed or eliminated in networks driven to normal spiking conditions by random, partially correlated inputs, where functional loops comprise synaptic weights that exceed a positive threshold. We further prove that STDP is a form of loop-regulating plasticity for the case of a linear network driven by noise. Thus a notable local synaptic learning rule makes a specific prediction about synapses in the brain in which standard STDP is present: that under normal spiking conditions, they should participate in predominantly feed-forward connections at all scales. Our model implies that any deviations from this prediction would require a substantial modification to the hypothesized role for standard STDP. Given its widespread occurrence in the brain, we predict that STDP could also regulate long range functional loops among individual neurons across all brain scales, up to, and including, the scale of global brain network topology.
Affiliation(s)
- James Kozloski
- Biometaphorical Computing, Computational Biology Center, IBM Research Division, IBM T. J. Watson Research Center, Yorktown Heights, NY, USA
259
Possible role of cooperative action of NMDA receptor and GABA function in developmental plasticity. J Comput Neurosci 2010; 28:347-59. [DOI: 10.1007/s10827-010-0212-0]
260
Ansorg R, Schwabe L. Domain-Specific Modeling as a Pragmatic Approach to Neuronal Model Descriptions. Brain Inform 2010. [DOI: 10.1007/978-3-642-15314-3_16]
261
Elliott T. Discrete States of Synaptic Strength in a Stochastic Model of Spike-Timing-Dependent Plasticity. Neural Comput 2010; 22:244-72. [DOI: 10.1162/neco.2009.07-08-814]
Abstract
A stochastic model of spike-timing-dependent plasticity (STDP) postulates that single synapses presented with a single spike pair exhibit all-or-none quantal jumps in synaptic strength. The amplitudes of the jumps are independent of spike timing, but their probabilities do depend on spike timing. Because the amplitudes of upward and downward transitions are made equal, synapses occupy only a discrete set of states of synaptic strength. We explore the impact of a finite, discrete set of strength states on our model, finding three principal results. First, a finite set of strength states limits the capacity of a single synapse to express the standard, exponential STDP curve. We derive the expression for the expected change in synaptic strength in response to a standard, experimental spike pair protocol, finding a deviation from exponential behavior. We fit our prediction to recent data from single dendritic spine heads, finding results that are somewhat better than exponential fits. Second, we show that the fixed-point dynamics of our model regulate the upward and downward transition probabilities so that these are on average equal, leading to a uniform distribution of synaptic strength states. However, third, under long-term potentiation (LTP) and long-term depression (LTD) protocols, these probabilities are unequal, skewing the distribution away from uniformity. If the number of states of strength is at least of order 10, then we find that three effective states of synaptic strength appear, consistent with some experimental data on ternary-strength synapses. On this view, LTP and LTD protocols may therefore be saturating protocols.
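The core mechanism, fixed-amplitude jumps whose probability (not size) depends on spike timing, clipped to a finite ladder of states, can be sketched in a few lines of Python. The exponential probability shape and all constants are illustrative assumptions:

```python
import math, random

def transition_prob(dt_ms, p_max=0.5, tau=20.0):
    """Probability that a single spike pair triggers a jump; only this
    probability, not the jump amplitude, depends on spike timing."""
    return p_max * math.exp(-abs(dt_ms) / tau)

def expected_change(state, dt_ms, n_states=10):
    """Expected strength change per pairing on a ladder of n_states
    discrete states. Clipping at the boundary states is one source of
    deviation from a pure exponential STDP curve."""
    step = 1 if dt_ms >= 0 else -1
    clipped = min(max(state + step, 0), n_states - 1)
    return transition_prob(dt_ms) * (clipped - state)

def stochastic_update(state, dt_ms, n_states=10, rng=random):
    """Sample one all-or-none, fixed-amplitude transition."""
    if rng.random() < transition_prob(dt_ms):
        step = 1 if dt_ms >= 0 else -1
        state = min(max(state + step, 0), n_states - 1)
    return state
```

Note how `expected_change` vanishes at the boundary states: a saturated synapse cannot move further, which is exactly the finite-state deviation from exponential behavior the abstract analyzes.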
Affiliation(s)
- Terry Elliott
- Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton, SO17 1BJ, U.K
262
263
Burst-time-dependent plasticity robustly guides ON/OFF segregation in the lateral geniculate nucleus. PLoS Comput Biol 2009; 5:e1000618. [PMID: 20041207] [PMCID: PMC2790088] [DOI: 10.1371/journal.pcbi.1000618]
Abstract
Spontaneous retinal activity (known as "waves") remodels synaptic connectivity to the lateral geniculate nucleus (LGN) during development. Analysis of retinal waves recorded with multielectrode arrays in mouse suggested that a cue for the segregation of functionally distinct (ON and OFF) retinal ganglion cells (RGCs) in the LGN may be a desynchronization in their firing, where ON cells precede OFF cells by one second. Using the recorded retinal waves as input, with two different modeling approaches we explore timing-based plasticity rules for the evolution of synaptic weights to identify key features underlying ON/OFF segregation. First, we analytically derive a linear model for the evolution of ON and OFF weights, to understand how synaptic plasticity rules extract input firing properties to guide segregation. Second, we simulate postsynaptic activity with a nonlinear integrate-and-fire model to compare findings with the linear model. We find that spike-time-dependent plasticity, which modifies synaptic weights based on millisecond-long timing and order of pre- and postsynaptic spikes, fails to segregate ON and OFF retinal inputs in the absence of normalization. Implementing homeostatic mechanisms results in segregation, but only with carefully-tuned parameters. Furthermore, extending spike integration timescales to match the second-long input correlation timescales always leads to ON segregation because ON cells fire before OFF cells. We show that burst-time-dependent plasticity can robustly guide ON/OFF segregation in the LGN without normalization, by integrating pre- and postsynaptic bursts irrespective of their firing order and over second-long timescales. We predict that an LGN neuron will become ON- or OFF-responsive based on a local competition of the firing patterns of neighboring RGCs connecting to it. Finally, we demonstrate consistency with ON/OFF segregation in ferret, despite differences in the firing properties of retinal waves. Our model suggests that diverse input statistics of retinal waves can be robustly interpreted by a burst-based rule, which underlies retinogeniculate plasticity across different species.
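The distinguishing features of the burst rule described above, integration over second-long timescales and indifference to firing order, can be captured in a one-line sketch. The triangular window shape and all constants are assumptions chosen only to illustrate those two properties:

```python
def btdp_update(delta_t_s, a_plus=0.05, a_minus=0.01, window_s=1.0):
    """One plausible shape for a burst-time-dependent rule: pre/post
    bursts falling within `window_s` seconds of each other potentiate
    (regardless of which came first); more distant bursts depress.
    Constants are illustrative, not fitted values."""
    if abs(delta_t_s) <= window_s:
        return a_plus * (1.0 - abs(delta_t_s) / window_s)
    return -a_minus
```

Because the rule depends only on |delta_t|, a one-second ON/OFF latency offset, rather than millisecond spike order, determines which inputs win, which is the robustness property the paper exploits.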
264
Elliott T, Lagogiannis K. Taming Fluctuations in a Stochastic Model of Spike-Timing-Dependent Plasticity. Neural Comput 2009; 21:3363-407. [DOI: 10.1162/neco.2009.12-08-916]
Abstract
A stochastic model of spike-timing-dependent plasticity proposes that single synapses express fixed-amplitude jumps in strength, the amplitudes being independent of the spike time difference. However, the probability that a jump in strength occurs does depend on spike timing. Although the model has a number of desirable features, the stochasticity of response of a synapse introduces potentially large fluctuations into changes in synaptic strength. These can destabilize the segregated patterns of afferent connectivity characteristic of neuronal development. Previously we have taken these jumps to be small relative to overall synaptic strengths to control fluctuations, but doing so increases developmental timescales unacceptably. Here, we explore three alternative ways of taming fluctuations. First, a calculation of the variance for the change in synaptic strength shows that the mean change eventually dominates fluctuations, but on timescales that are too long. Second, it is possible that fluctuations in strength may cancel between synapses, but we show that correlations between synapses emasculate the law of large numbers. Finally, by separating plasticity induction and expression, we introduce a temporal window during which induction signals are low-pass-filtered before expression. In this way, fluctuations in strength are tamed, stabilizing segregated states of afferent connectivity.
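The third mechanism, low-pass-filtering induction signals inside a temporal window before they are expressed as strength changes, is easy to demonstrate numerically. A Python sketch (function name and time constant are illustrative):

```python
def express_filtered(induction_events, tau=50.0):
    """Low-pass filter a stream of noisy, all-or-none induction signals
    before expressing them as strength changes; the smoothed trace has
    far smaller variance than the raw all-or-none events."""
    trace, expressed = 0.0, []
    for s in induction_events:
        trace += (s - trace) / tau      # exponential moving average
        expressed.append(trace)
    return expressed
```

For an uncorrelated stream of unit-size up/down events, an exponential filter with rate 1/tau reduces the variance by roughly a factor of 2*tau, which is the sense in which separating induction from expression tames fluctuations.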
Affiliation(s)
- Terry Elliott
- Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton SO17 1BJ, U.K
- Konstantinos Lagogiannis
- Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton SO17 1BJ, U.K
265
Meisel C, Gross T. Adaptive self-organization in a realistic neural network model. Phys Rev E 2009; 80:061917. [PMID: 20365200] [DOI: 10.1103/physreve.80.061917]
Abstract
Information processing in complex systems is often found to be maximally efficient close to critical states associated with phase transitions. It is therefore conceivable that also neural information processing operates close to criticality. This is further supported by the observation of power-law distributions, which are a hallmark of phase transitions. An important open question is how neural networks could remain close to a critical point while undergoing a continual change in the course of development, adaptation, learning, and more. An influential contribution was made by Bornholdt and Rohlf, introducing a generic mechanism of robust self-organized criticality in adaptive networks. Here, we address the question whether this mechanism is relevant for real neural networks. We show in a realistic model that spike-time-dependent synaptic plasticity can self-organize neural networks robustly toward criticality. Our model reproduces several empirical observations and makes testable predictions on the distribution of synaptic strength, relating them to the critical state of the network. These results suggest that the interplay between dynamics and topology may be essential for neural information processing.
Affiliation(s)
- Christian Meisel
- Max-Planck-Institut für Physik komplexer Systeme, Nöthnitzer Strasse 38, 01187 Dresden, Germany.
266
Gilson M, Burkitt AN, Grayden DB, Thomas DA, van Hemmen JL. Emergence of network structure due to spike-timing-dependent plasticity in recurrent neuronal networks IV: structuring synaptic pathways among recurrent connections. Biol Cybern 2009; 101:427-444. [PMID: 19937070] [DOI: 10.1007/s00422-009-0346-1]
Abstract
In neuronal networks, the changes of synaptic strength (or weight) performed by spike-timing-dependent plasticity (STDP) are hypothesized to give rise to functional network structure. This article investigates how this phenomenon occurs for the excitatory recurrent connections of a network with fixed input weights that is stimulated by external spike trains. We develop a theoretical framework based on the Poisson neuron model to analyze the interplay between the neuronal activity (firing rates and the spike-time correlations) and the learning dynamics, when the network is stimulated by correlated pools of homogeneous Poisson spike trains. STDP can lead to both a stabilization of all the neuron firing rates (homeostatic equilibrium) and a robust weight specialization. The pattern of specialization for the recurrent weights is determined by a relationship between the input firing-rate and correlation structures, the network topology, the STDP parameters and the synaptic response properties. We find conditions for feed-forward pathways or areas with strengthened self-feedback to emerge in an initially homogeneous recurrent network.
Affiliation(s)
- Matthieu Gilson
- Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, VIC 3010, Australia.
267
Vasilaki E, Frémaux N, Urbanczik R, Senn W, Gerstner W. Spike-based reinforcement learning in continuous state and action space: when policy gradient methods fail. PLoS Comput Biol 2009; 5:e1000586. [PMID: 19997492] [PMCID: PMC2778872] [DOI: 10.1371/journal.pcbi.1000586]
Abstract
Changes of synaptic connections between neurons are thought to be the physiological basis of learning. These changes can be gated by neuromodulators that encode the presence of reward. We study a family of reward-modulated synaptic learning rules for spiking neurons on a learning task in continuous space inspired by the Morris water maze. The synaptic update rule modifies the release probability of synaptic transmission and depends on the timing of presynaptic spike arrival, postsynaptic action potentials, as well as the membrane potential of the postsynaptic neuron. The family of learning rules includes an optimal rule derived from policy gradient methods as well as reward-modulated Hebbian learning. The synaptic update rule is implemented in a population of spiking neurons using a network architecture that combines feedforward input with lateral connections. Actions are represented by a population of hypothetical action cells with strong Mexican-hat connectivity and are read out at theta frequency. We show that in this architecture, a standard policy gradient rule fails to solve the Morris water maze task, whereas a variant with a Hebbian bias can learn the task within 20 trials, consistent with experiments. This result does not depend on implementation details such as the size of the neuronal populations. Our theoretical approach shows how learning new behaviors can be linked to reward-modulated plasticity at the level of single synapses and makes predictions about the voltage and spike-timing dependence of synaptic plasticity and the influence of neuromodulators such as dopamine. It is an important step towards connecting formal theories of reinforcement learning with neuronal and synaptic properties.
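The general scheme of reward-gated plasticity that this family of rules builds on is usually implemented with an eligibility trace: Hebbian coincidences are stored at the synapse and only converted into a weight change when a (possibly delayed) reward signal arrives. A minimal Python sketch (names, constants, and the exact trace dynamics are assumptions, not the paper's update rule):

```python
def reward_modulated_step(w, eligibility, hebbian_coincidence, reward,
                          tau_e=200.0, eta=0.01):
    """Pre/post coincidences charge a decaying eligibility trace; the
    weight changes only when a reward signal arrives and reads the
    trace out. Constants are illustrative."""
    eligibility += -eligibility / tau_e + hebbian_coincidence
    w += eta * reward * eligibility
    return w, eligibility
```

The trace's decay time constant sets how long after a coincidence a reward can still credit the synapse, which is what lets a single delayed reward at the end of a trial shape the whole trajectory.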
Affiliation(s)
- Eleni Vasilaki
- Laboratory of Computational Neuroscience, EPFL, Lausanne, Switzerland.
268
Gilson M, Burkitt AN, Grayden DB, Thomas DA, van Hemmen JL. Emergence of network structure due to spike-timing-dependent plasticity in recurrent neuronal networks III: Partially connected neurons driven by spontaneous activity. Biol Cybern 2009; 101:411-426. [PMID: 19937071] [DOI: 10.1007/s00422-009-0343-4]
Abstract
In contrast to a feed-forward architecture, the weight dynamics induced by spike-timing-dependent plasticity (STDP) in a recurrent neuronal network is not yet well understood. In this article, we extend a previous study of the impact of additive STDP in a recurrent network that is driven by spontaneous activity (no external stimulating inputs) from a fully connected network to one that is only partially connected. The asymptotic state of the network is analyzed, and it is found that the equilibrium and stability conditions for the firing rates are similar for both full and partial connectivity: STDP causes the firing rates to converge toward the same value and remain quasi-homogeneous. However, when STDP induces strong weight competition, the connectivity affects the weight dynamics in that the distribution of the weights disperses more quickly for lower density than for higher density. The asymptotic weight distribution strongly depends upon that at the beginning of the learning epoch; consequently, homogeneous connectivity alone is not sufficient to obtain homogeneous neuronal activity. In the absence of external inputs, STDP can nevertheless generate structure in the network through autocorrelation effects, for example, by introducing asymmetry in network topology.
Affiliation(s)
- Matthieu Gilson
- Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, VIC 3010, Australia.
269
Baroni F, Varona P. Spike timing-dependent plasticity is affected by the interplay of intrinsic and network oscillations. J Physiol Paris 2009; 104:91-8. [PMID: 19913095] [DOI: 10.1016/j.jphysparis.2009.11.007]
Abstract
Spike timing-dependent plasticity (STDP) is a form of Hebbian learning which is thought to underlie structure formation during development, and learning and memory in later life. In this paper we show that the intrinsic properties of the postsynaptic neuron might have a deep influence on STDP dynamics by shaping the causal correlation between the pre- and the postsynaptic spike trains. The cell-specific effect of STDP is particularly evident in the presence of an oscillatory component in a cell input. In this case, the cell-specific phase response to an oscillatory modulation biases the oscillating afferents towards potentiation or depression, depending upon the intrinsic dynamics of the postsynaptic neuron and the period of the modulation.
Affiliation(s)
- Fabiano Baroni
- GNB, Dpto. de Ing. Informatica, Escuela Politecnica Superior, Universidad Autonoma de Madrid, Spain.
270
Gilson M, Burkitt AN, Grayden DB, Thomas DA, van Hemmen JL. Emergence of network structure due to spike-timing-dependent plasticity in recurrent neuronal networks. II. Input selectivity--symmetry breaking. Biol Cybern 2009; 101:103-114. [PMID: 19536559] [DOI: 10.1007/s00422-009-0320-y]
Abstract
Spike-timing-dependent plasticity (STDP) is believed to structure neuronal networks by slowly changing the strengths (or weights) of the synaptic connections between neurons depending upon their spiking activity, which in turn modifies the neuronal firing dynamics. In this paper, we investigate the change in synaptic weights induced by STDP in a recurrently connected network in which the input weights are plastic but the recurrent weights are fixed. The inputs are divided into two pools with identical constant firing rates and equal within-pool spike-time correlations, but with no between-pool correlations. Our analysis uses the Poisson neuron model in order to predict the evolution of the input synaptic weights and focuses on the asymptotic weight distribution that emerges due to STDP. The learning dynamics induces a symmetry breaking for the individual neurons, namely for sufficiently strong within-pool spike-time correlation each neuron specializes to one of the input pools. We show that the presence of fixed excitatory recurrent connections between neurons induces a group symmetry-breaking effect, in which neurons tend to specialize to the same input pool. Consequently STDP generates a functional structure on the input connections of the network.
Affiliation(s)
- Matthieu Gilson
- Department of Electrical and Electronic Engineering, University of Melbourne, Melbourne, VIC 3010, Australia.
271
Gilson M, Burkitt AN, Grayden DB, Thomas DA, van Hemmen JL. Emergence of network structure due to spike-timing-dependent plasticity in recurrent neuronal networks. I. Input selectivity--strengthening correlated input pathways. BIOLOGICAL CYBERNETICS 2009; 101:81-102. [PMID: 19536560 DOI: 10.1007/s00422-009-0319-4] [Citation(s) in RCA: 45] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/23/2008] [Accepted: 05/13/2009] [Indexed: 05/27/2023]
Abstract
Spike-timing-dependent plasticity (STDP) determines the evolution of the synaptic weights according to their pre- and post-synaptic activity, which in turn changes the neuronal activity. In this paper, we extend previous studies of input selectivity induced by STDP for single neurons to the biologically interesting case of a neuronal network with fixed recurrent connections and plastic connections from external pools of input neurons. We use a theoretical framework based on the Poisson neuron model to analytically describe the network dynamics (firing rates and spike-time correlations) and thus the evolution of the synaptic weights. This framework incorporates the time course of the post-synaptic potentials and synaptic delays. Our analysis focuses on the asymptotic states of a network stimulated by two homogeneous pools of "steady" inputs, namely Poisson spike trains which have fixed firing rates and spike-time correlations. The STDP model extends rate-based learning in that it can implement, at the same time, both a stabilization of the individual neuron firing rates and a slower weight specialization depending on the input spike-time correlations. When one input pathway has stronger within-pool correlations, the resulting synaptic dynamics induced by STDP are shown to be similar to those arising in the case of a purely feed-forward network: the weights from the more correlated inputs are potentiated at the expense of the remaining input connections.
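The drift analysis this abstract summarizes — expected weight change equal to the learning window integrated against the pre/post spike-time correlation — can be sketched numerically. The window parameters and the exponential correlation bump below are illustrative assumptions, not the values used in the paper:

```python
import math

# Pair-based STDP window (illustrative parameters, not the paper's):
# potentiation when pre precedes post (dt > 0), depression otherwise.
A_P, TAU_P = 0.005, 17.0    # LTP amplitude and time constant (ms)
A_M, TAU_M = 0.00525, 34.0  # LTD amplitude and time constant (ms)

def window(dt):
    """STDP learning window as a function of dt = t_post - t_pre (ms)."""
    if dt >= 0:
        return A_P * math.exp(-dt / TAU_P)
    return -A_M * math.exp(dt / TAU_M)

def drift(c, rate=10.0, tau_c=5.0, step=0.1, span=200.0):
    """Expected weight drift: the window integrated against the
    pre/post cross-correlation.  The baseline is the product of the
    firing rates (uncorrelated Poisson); within-pool correlation c
    adds extra coincidences at small causal lags (pre leading post),
    modelled here as an exponential bump."""
    total, t = 0.0, -span
    r = rate * 1e-3  # spikes/ms
    while t <= span:
        corr = r * r  # uncorrelated baseline
        if c > 0 and t >= 0:
            corr += c * r * math.exp(-t / tau_c) / tau_c
        total += window(t) * corr * step
        t += step
    return total

# The depression-dominated window makes uncorrelated inputs drift
# slightly downward; correlated inputs are potentiated relative to them.
print(drift(0.0), drift(0.3))
```

The sign pattern — negative drift for the uncorrelated pool, positive for the correlated one — is the weight specialization at the correlated pathway's expense that the paper derives.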
Affiliation(s)
- Matthieu Gilson
- Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, VIC 3010, Australia.
272
Liu JK, She ZS. A spike-timing pattern based neural network model for the study of memory dynamics. PLoS One 2009; 4:e6247. [PMID: 19629179 PMCID: PMC2710501 DOI: 10.1371/journal.pone.0006247] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/22/2009] [Accepted: 06/18/2009] [Indexed: 11/24/2022] Open
Abstract
It is well accepted that the brain's computation relies on spatiotemporal activity of neural networks. In particular, there is growing evidence of the importance of continuously and precisely timed spiking activity. Therefore, it is important to characterize memory states in terms of spike-timing patterns that give both reliable memory of firing activities and precise memory of firing timings. The relationship between memory states and spike-timing patterns has been studied empirically with large-scale recordings of neuronal populations in recent years. Here, by using a recurrent neural network model with dynamics at two time scales, we construct a dynamical memory network model which embeds both fast neural and synaptic variation and slow learning dynamics. A state vector is proposed to describe memory states in terms of spike-timing patterns of the neural population, and a distance measure on state vectors is defined to study several important phenomena of memory dynamics: partial memory recall, learning efficiency, and learning with correlated stimuli. We show that the distance measure can capture the timing difference of memory states. In addition, we examine the influence of network topology on learning ability, and show that local connections can increase the network's ability to embed more memory states. Together, these results suggest that the proposed system based on spike-timing patterns gives a productive model for the study of detailed learning and memory dynamics.
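The abstract does not spell out the state-vector construction; a minimal, hypothetical reading — one entry per neuron holding its first spike time in a window, with a normalised distance mixing activity and timing mismatches — can be sketched as:

```python
def state_vector(spike_trains, t0, t1):
    """Memory state as the first spike time of each neuron in the
    window [t0, t1); None marks a silent neuron.  (A plausible
    reading of the paper's construction, not its exact definition.)"""
    vec = []
    for train in spike_trains:
        first = next((t for t in train if t0 <= t < t1), None)
        vec.append(first)
    return vec

def state_distance(u, v, tau=10.0):
    """Distance between two state vectors: a silent/active mismatch
    costs 1; matched active units contribute their spike-timing
    difference saturated at tau (ms).  Normalised to [0, 1]."""
    assert len(u) == len(v)
    d = 0.0
    for a, b in zip(u, v):
        if (a is None) != (b is None):
            d += 1.0                         # firing-activity mismatch
        elif a is not None:
            d += min(abs(a - b) / tau, 1.0)  # spike-timing mismatch
    return d / len(u)

# Identical patterns are at distance 0; pure timing jitter gives a
# small distance, while silencing a neuron would cost a full unit.
s1 = state_vector([[5.0, 30.0], [12.0], []], 0.0, 25.0)
s2 = state_vector([[7.0], [12.0], []], 0.0, 25.0)
print(s1, s2, state_distance(s1, s2))
```

Such a measure distinguishes "which neurons fired" (partial recall) from "when they fired" (timing precision), the two ingredients the paper's memory states combine.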
Affiliation(s)
- Jian K. Liu
- Department of Mathematics, University of California Los Angeles, Los Angeles, California, United States of America
- Zhen-Su She
- State Key Lab for Turbulence and Complex Systems and Center for Theoretical Biology, Peking University, Beijing, China
273
Hippocampus, microcircuits and associative memory. Neural Netw 2009; 22:1120-8. [PMID: 19647982 DOI: 10.1016/j.neunet.2009.07.009] [Citation(s) in RCA: 47] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/02/2009] [Revised: 05/20/2009] [Accepted: 07/14/2009] [Indexed: 11/22/2022]
Abstract
The hippocampus is one of the most widely studied brain regions. One of its functional roles is the storage and recall of declarative memories. Recent hippocampus research has yielded a wealth of data on network architecture, cell types, the anatomy and membrane properties of pyramidal cells and interneurons, and synaptic plasticity. Understanding the functional roles of different families of hippocampal neurons in information processing, synaptic plasticity and network oscillations poses a great challenge but also promises deep insight into one of the major brain systems. Computational and mathematical models play an instrumental role in exploring such functions. In this paper, we provide an overview of abstract and biophysical models of associative memory with particular emphasis on the operations performed by the diverse (inter)neurons in encoding and retrieval of memories in the hippocampus.
274
Abstract
Proper wiring up of the nervous system is critical to the development of organisms capable of complex and adaptable behaviors. Besides the many experimental advances in determining the cellular and molecular machinery that carries out this remarkable task precisely and robustly, theoretical approaches have also proven to be useful tools in analyzing this machinery. A quantitative understanding of these processes can allow us to make predictions, test hypotheses, and appraise established concepts in a new light. Three areas that have been fruitful in this regard are axon guidance, retinotectal mapping, and activity-dependent development. This chapter reviews some of the contributions made by mathematical modeling in these areas, illustrated by important examples of models in each section. For axon guidance, we discuss models of how growth cones respond to their environment, and how this environment can place constraints on growth cone behavior. Retinotectal mapping looks at computational models for how topography can be generated in populations of neurons based on molecular gradients and other mechanisms such as competition. In activity-dependent development, we discuss theoretical approaches largely based on Hebbian synaptic plasticity rules, and how they can generate maps in the visual cortex very similar to those seen in vivo. We show how theoretical approaches have substantially contributed to the advancement of developmental neuroscience, and discuss future directions for mathematical modeling in the field.
275
Nageswaran JM, Dutt N, Krichmar JL, Nicolau A, Veidenbaum AV. A configurable simulation environment for the efficient simulation of large-scale spiking neural networks on graphics processors. Neural Netw 2009; 22:791-800. [PMID: 19615853 DOI: 10.1016/j.neunet.2009.06.028] [Citation(s) in RCA: 90] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/06/2009] [Revised: 06/04/2009] [Accepted: 06/25/2009] [Indexed: 10/20/2022]
276
Urakubo H, Honda M, Tanaka K, Kuroda S. Experimental and computational aspects of signaling mechanisms of spike-timing-dependent plasticity. HFSP JOURNAL 2009; 3:240-54. [PMID: 20119481 DOI: 10.2976/1.3137602] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/05/2008] [Accepted: 04/27/2009] [Indexed: 11/19/2022]
Abstract
STDP (spike-timing-dependent synaptic plasticity) is thought to be a synaptic learning rule that embeds spike-timing information into a specific pattern of synaptic strengths in neuronal circuits, resulting in a memory. STDP consists of bidirectional long-term changes in synaptic strengths. This process includes long-term potentiation and long-term depression, which are dependent on the timing of presynaptic and postsynaptic spiking. In this review, we focus on computational aspects of signaling mechanisms that induce and maintain STDP as a key step toward the definition of a general synaptic learning rule. In addition, we discuss the temporal and spatial aspects of STDP, and the requirement for a homeostatic mechanism of STDP in vivo.
277
Potjans W, Morrison A, Diesmann M. A spiking neural network model of an actor-critic learning agent. Neural Comput 2009; 21:301-39. [PMID: 19196231 DOI: 10.1162/neco.2008.08-07-593] [Citation(s) in RCA: 66] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
The ability to adapt behavior to maximize reward as a result of interactions with the environment is crucial for the survival of any higher organism. In the framework of reinforcement learning, temporal-difference learning algorithms provide an effective strategy for such goal-directed adaptation, but it is unclear to what extent these algorithms are compatible with neural computation. In this article, we present a spiking neural network model that implements actor-critic temporal-difference learning by combining local plasticity rules with a global reward signal. The network is capable of solving a nontrivial gridworld task with sparse rewards. We derive a quantitative mapping of plasticity parameters and synaptic weights to the corresponding variables in the standard algorithmic formulation and demonstrate that the network learns with a similar speed to its discrete time counterpart and attains the same equilibrium performance.
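In its standard algorithmic formulation, the actor-critic scheme that such a network maps onto is compact. The corridor task, learning rates, and episode count below are illustrative assumptions, not the paper's gridworld or parameters:

```python
import math
import random

random.seed(0)
N, GOAL, GAMMA, ALPHA, BETA = 5, 4, 0.9, 0.1, 0.1
V = [0.0] * N                          # critic: state-value estimates
pref = [[0.0, 0.0] for _ in range(N)]  # actor: preferences (left, right)

def sample_action(s):
    """Softmax (Gibbs) policy over the two actions in state s."""
    e0, e1 = math.exp(pref[s][0]), math.exp(pref[s][1])
    return 0 if random.random() < e0 / (e0 + e1) else 1

for episode in range(300):
    s = 0
    while s != GOAL:
        a = sample_action(s)
        s2 = max(0, s - 1) if a == 0 else s + 1
        r = 1.0 if s2 == GOAL else 0.0
        # TD error: updates the critic and, as a global signal, the actor
        delta = r + (0.0 if s2 == GOAL else GAMMA * V[s2]) - V[s]
        V[s] += ALPHA * delta          # critic update
        pref[s][a] += BETA * delta     # actor update
        s = s2

# After training, the critic's values increase toward the goal and
# the actor prefers the rightward action in every non-goal state.
print([round(v, 2) for v in V])
```

In the spiking implementation the paper describes, the per-state updates above become local synaptic plasticity rules and `delta` becomes a globally broadcast reward signal.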
Affiliation(s)
- Wiebke Potjans
- Computational Neuroscience Group, RIKEN Brain Science Institute, Wako City, Saitama 351-0198, Japan.
278
Abstract
Short-term synaptic plasticity (STP) can significantly alter the amplitudes of synaptic responses in ways that depend on presynaptic history. Thus, it is widely assumed that STP acts as a filter for specific patterns of presynaptic inputs, and as a result can play key roles in neuronal information processing. To evaluate this assumption and directly quantify the effects of STP on information transmission, we consider a population of independent synaptic inputs to a model neuron. We show using standard information theoretic approaches that the changes in synaptic response amplitude resulting from STP interact with the related effects on fluctuations in membrane conductance, such that information transmission is broadband (no frequency-dependent filtering occurs), regardless of whether synaptic depression or facilitation dominates. Interestingly, this broadband transmission is preserved in the postsynaptic spike train as long as the postsynaptic neuron's baseline firing rate is relatively high; in contrast, low baseline firing rates lead to STP-dependent effects. Thus, background inputs that control the firing state of a postsynaptic neuron can gate the effects of STP on information transmission.
279
Billings G, van Rossum MCW. Memory retention and spike-timing-dependent plasticity. J Neurophysiol 2009; 101:2775-88. [PMID: 19297513 DOI: 10.1152/jn.91007.2008] [Citation(s) in RCA: 52] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Memory systems should be plastic to allow for learning; however, they should also retain earlier memories. Here we explore how synaptic weights and memories are retained in models of single neurons and networks equipped with spike-timing-dependent plasticity. We show that for single neuron models, the precise learning rule has a strong effect on the memory retention time. In particular, a soft-bound, weight-dependent learning rule has a very short retention time as compared with a learning rule that is independent of the synaptic weights. Next, we explore how the retention time is reflected in receptive field stability in networks. As in the single neuron case, the weight-dependent learning rule yields less stable receptive fields than a weight-independent rule. However, receptive fields stabilize in the presence of sufficient lateral inhibition, demonstrating that plasticity in networks can be regulated by inhibition and suggesting a novel role for inhibition in neural circuits.
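The contrast between the soft-bound (weight-dependent) rule and the weight-independent rule can be caricatured with a deterministic expected-update iteration. This only captures the drift component — the additive rule's retention is in reality limited by diffusion of the weights, which a mean-update sketch ignores — and all amplitudes are illustrative:

```python
A_P, A_M = 0.01, 0.01  # balanced potentiation/depression amplitudes

def remaining_memory(weight_dependent, steps=2000):
    """Iterate the expected STDP update for two synapses that store a
    memory (one strong, one weak) under ongoing uncorrelated activity,
    with potentiation and depression events equally likely (rate 0.5
    each per step).  Returns the surviving weight difference."""
    w_hi, w_lo = 0.9, 0.1
    for _ in range(steps):
        if weight_dependent:
            # soft-bound rule: LTP scales with (1 - w), LTD with w,
            # so both weights relax to the common fixed point
            # w* = A_P / (A_P + A_M) and the stored difference decays
            w_hi += 0.5 * (A_P * (1 - w_hi) - A_M * w_hi)
            w_lo += 0.5 * (A_P * (1 - w_lo) - A_M * w_lo)
        else:
            # additive rule: weight-independent, balanced updates
            # cancel in expectation; hard bounds clip at [0, 1]
            w_hi = min(1.0, max(0.0, w_hi + 0.5 * (A_P - A_M)))
            w_lo = min(1.0, max(0.0, w_lo + 0.5 * (A_P - A_M)))
    return w_hi - w_lo

print(remaining_memory(True), remaining_memory(False))
```

The soft-bound iteration erases the stored difference at rate proportional to (A_P + A_M) per update, which is the short retention time the abstract contrasts with the weight-independent rule.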
Affiliation(s)
- Guy Billings
- Neuroinformatics Doctoral Training Centre, University of Edinburgh, Edinburgh, United Kingdom.
280
Eppler JM, Helias M, Muller E, Diesmann M, Gewaltig MO. PyNEST: A Convenient Interface to the NEST Simulator. Front Neuroinform 2009; 2:12. [PMID: 19198667 PMCID: PMC2636900 DOI: 10.3389/neuro.11.012.2008] [Citation(s) in RCA: 118] [Impact Index Per Article: 7.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2008] [Accepted: 12/30/2008] [Indexed: 11/13/2022] Open
Abstract
The neural simulation tool NEST (http://www.nest-initiative.org) is a simulator for heterogeneous networks of point neurons or neurons with a small number of compartments. It aims at simulations of large neural systems with more than 10^4 neurons and 10^7 to 10^9 synapses. NEST is implemented in C++ and can be used on a large range of architectures from single-core laptops over multi-core desktop computers to super-computers with thousands of processor cores. Python (http://www.python.org) is a modern programming language that has recently received considerable attention in Computational Neuroscience. Python is easy to learn and has many extension modules for scientific computing (e.g. http://www.scipy.org). In this contribution we describe PyNEST, the new user interface to NEST. PyNEST combines NEST's efficient simulation kernel with the simplicity and flexibility of Python. Compared to NEST's native simulation language SLI, PyNEST makes it easier to set up simulations, generate stimuli, and analyze simulation results. We describe how PyNEST connects NEST and Python and how it is implemented. With a number of examples, we illustrate how it is used.
281
Helias M, Rotter S, Gewaltig MO, Diesmann M. Structural plasticity controlled by calcium based correlation detection. Front Comput Neurosci 2008; 2:7. [PMID: 19129936 PMCID: PMC2614616 DOI: 10.3389/neuro.10.007.2008] [Citation(s) in RCA: 32] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/12/2008] [Accepted: 12/02/2008] [Indexed: 11/24/2022] Open
Abstract
Hebbian learning in cortical networks during development and adulthood relies on the presence of a mechanism to detect correlation between the presynaptic and the postsynaptic spiking activity. Recently, the calcium concentration in spines was experimentally shown to be a correlation sensitive signal with the necessary properties: it is confined to the spine volume, it depends on the relative timing of pre- and postsynaptic action potentials, and it is independent of the spine's location along the dendrite. NMDA receptors are a candidate mediator for the correlation dependent calcium signal. Here, we present a quantitative model of correlation detection in synapses based on the calcium influx through NMDA receptors under realistic conditions of irregular pre- and postsynaptic spiking activity with pairwise correlation. Our analytical framework captures the interaction of the learning rule and the correlation dynamics of the neurons. We find that a simple thresholding mechanism can act as a sensitive and reliable correlation detector at physiological firing rates. Furthermore, the mechanism is sensitive to correlation among afferent synapses by cooperation and competition. In our model this mechanism controls synapse formation and elimination. We explain how synapse elimination leads to firing rate homeostasis and show that the connectivity structure is shaped by the correlations between neighboring inputs.
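The thresholding mechanism can be illustrated with a toy simulation. The influx amplitudes, decay time constant, and correlation model below are illustrative assumptions, not the paper's calibrated NMDA-receptor model:

```python
import random

def detections(p_joint, rate_pre=0.02, rate_post=0.02,
               steps=5000, tau=20.0, theta=1.0, seed=1):
    """Toy spine-calcium correlation detector on 1 ms time steps.
    A presynaptic spike alone admits a small NMDA-mediated calcium
    transient; a pre spike coinciding with a (pairwise-correlated)
    postsynaptic spike admits a supralinearly boosted influx.  The
    calcium trace decays with time constant tau, and upward crossings
    of the threshold theta are counted as correlation events."""
    rng = random.Random(seed)
    ca, hits, above = 0.0, 0, False
    for _ in range(steps):
        joint = rng.random() < p_joint          # injected coincidence
        pre = joint or rng.random() < rate_pre
        post = joint or rng.random() < rate_post
        if pre:
            ca += 1.2 if post else 0.15         # supralinear pairing
        ca *= 1.0 - 1.0 / tau                   # leaky calcium decay
        crossing = ca > theta
        if crossing and not above:
            hits += 1                           # count upward crossings
        above = crossing
    return hits

# Pairwise-correlated trains cross the threshold far more often than
# independent trains at matched firing rates.
print(detections(0.01), detections(0.0))
```

With such a detector gating synapse formation and elimination, threshold events become rare for uncorrelated afferents, which is the cooperation/competition behavior the abstract describes.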
Affiliation(s)
- Moritz Helias
- Bernstein Center for Computational Neuroscience Freiburg, Germany
282
Goodman D, Brette R. Brian: a simulator for spiking neural networks in python. Front Neuroinform 2008; 2:5. [PMID: 19115011 PMCID: PMC2605403 DOI: 10.3389/neuro.11.005.2008] [Citation(s) in RCA: 224] [Impact Index Per Article: 14.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/11/2008] [Accepted: 10/26/2008] [Indexed: 11/29/2022] Open
Abstract
“Brian” is a new simulator for spiking neural networks, written in Python (http://brian.di.ens.fr). It is an intuitive and highly flexible tool for rapidly developing new models, especially networks of single-compartment neurons. In addition to using standard types of neuron models, users can define models by writing arbitrary differential equations in ordinary mathematical notation. Python scientific libraries can also be used for defining models and analysing data. Vectorisation techniques allow efficient simulations despite the overheads of an interpreted language. Brian will be especially valuable for working on non-standard neuron models not easily covered by existing software, and as an alternative to using Matlab or C for simulations. With its easy and intuitive syntax, Brian is also very well suited for teaching computational neuroscience.
Affiliation(s)
- Dan Goodman
- Département d'Informatique, Ecole Normale Supérieure Paris, France