1
Miehl C, Onasch S, Festa D, Gjorgjieva J. Formation and computational implications of assemblies in neural circuits. J Physiol 2022. [PMID: 36068723] [DOI: 10.1113/jp282750]
Abstract
In the brain, patterns of neural activity represent sensory information and store it in non-random synaptic connectivity. A prominent theoretical hypothesis states that assemblies, groups of neurons that are strongly connected to each other, are the key computational units underlying perception and memory formation. Consistent with these hypothesised assemblies, experiments have revealed groups of neurons that display synchronous activity, either spontaneously or upon stimulus presentation, and exhibit behavioural relevance. While it remains unclear how assemblies form in the brain, theoretical work has contributed substantially to the understanding of the various interacting mechanisms in this process. Here, we review the recent theoretical literature on assembly formation by categorising the involved mechanisms into four components: synaptic plasticity, symmetry breaking, competition and stability. We highlight different approaches and assumptions behind assembly formation and discuss recent ideas of assemblies as the key computational unit in the brain.
Abstract figure legend: Assembly formation. Assemblies are groups of strongly connected neurons formed by the interaction of multiple mechanisms, with vast computational implications. Four interacting components are thought to drive assembly formation: synaptic plasticity, symmetry breaking, competition and stability.
Affiliation(s)
- Christoph Miehl
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany; School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
- Sebastian Onasch
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany; School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
- Dylan Festa
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany; School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
- Julijana Gjorgjieva
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany; School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
2
Akil AE, Rosenbaum R, Josić K. Balanced networks under spike-time dependent plasticity. PLoS Comput Biol 2021; 17:e1008958. [PMID: 33979336] [PMCID: PMC8143429] [DOI: 10.1371/journal.pcbi.1008958]
Abstract
The dynamics of local cortical networks are irregular, but correlated. Dynamic excitatory-inhibitory balance is a plausible mechanism that generates such irregular activity, but it remains unclear how balance is achieved and maintained in plastic neural networks. In particular, it is not fully understood how plasticity-induced changes in the network affect balance, and in turn, how correlated, balanced activity impacts learning. How do the dynamics of balanced networks change under different plasticity rules? How does correlated spiking activity in recurrent networks change the evolution of weights, their eventual magnitude, and structure across the network? To address these questions, we develop a theory of spike-timing-dependent plasticity in balanced networks. We show that balance can be attained and maintained under plasticity-induced weight changes. We find that correlations in the input mildly affect the evolution of synaptic weights. Under certain plasticity rules, we find an emergence of correlations between firing rates and synaptic weights. Under these rules, synaptic weights converge to a stable manifold in weight space, with their final configuration dependent on the initial state of the network. Lastly, we show that our framework can also describe the dynamics of plastic balanced networks when subsets of neurons receive targeted optogenetic input.

Animals are able to learn complex tasks through changes in individual synapses between cells. Such changes lead to the coevolution of neural activity patterns and the structure of neural connectivity, but the consequences of these interactions are not fully understood. We consider plasticity in model neural networks which achieve an average balance between the excitatory and inhibitory synaptic inputs to different cells, and display cortical-like, irregular activity. We extend the theory of balanced networks to account for synaptic plasticity and show which rules can maintain balance, and which will drive the network into a different state. This theory of plasticity can provide insights into the relationship between stimuli, network dynamics, and synaptic circuitry.
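The excitatory-inhibitory cancellation at the heart of balanced-network theory can be sketched numerically. This follows the classic van Vreeswijk-Sompolinsky scaling, not the specific model of Akil et al.; all rates and weights below are hypothetical.

```python
import math

# Weights scale as 1/sqrt(K), so each input stream grows as sqrt(K) with the
# number of inputs K, while rates settle so the large terms cancel.
def input_streams(K, j_e=1.0, j_i=-1.1, r_e=10.0):
    r_i = j_e * r_e / -j_i                 # inhibitory rate at the balanced fixed point
    exc = K * (j_e / math.sqrt(K)) * r_e   # total excitatory input
    inh = K * (j_i / math.sqrt(K)) * r_i   # total inhibitory input
    return exc, inh, exc + inh

exc_small, _, net_small = input_streams(100)
exc_big, _, net_big = input_streams(10_000)
# excitation alone grows tenfold with K, yet the net input stays near zero
```

The point of the sketch is that each stream individually diverges with network size while their sum remains order one, which is the "balance" that plasticity rules must preserve.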
Affiliation(s)
- Alan Eric Akil
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana, United States of America
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana, United States of America
- Krešimir Josić
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
3
Rhythmogenesis evolves as a consequence of long-term plasticity of inhibitory synapses. Sci Rep 2018; 8:13050. [PMID: 30158555] [PMCID: PMC6115462] [DOI: 10.1038/s41598-018-31412-7]
Abstract
Brain rhythms are widely believed to reflect numerous cognitive processes, and changes in rhythmicity have been associated with pathological states. However, the mechanism underlying these rhythms remains unknown. Here, we present a theoretical analysis of the evolution of rhythm-generating capabilities in neuronal circuits. We tested the hypothesis that brain rhythms can be acquired via an intrinsic unsupervised learning process of activity-dependent plasticity. Specifically, we focused on spike-timing-dependent plasticity (STDP) of inhibitory synapses. We detail how rhythmicity can develop via STDP under certain conditions, which serve as a natural prediction of the hypothesis. We show how global features of the STDP rule govern and stabilize the resultant rhythmic activity. Finally, we demonstrate how rhythmicity is retained even in the face of synaptic variability. This study suggests a role for inhibitory plasticity beyond homeostatic processes.
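A rate-level sketch of homeostatic inhibitory plasticity, in the spirit of the Vogels et al. (2011) rule, illustrates the kind of inhibitory STDP invoked here; the cited paper's actual spike-based rule and parameters differ, and all numbers below are hypothetical.

```python
# Inhibitory weight w grows when the postsynaptic rate exceeds a target rho0,
# driving the neuron toward a set point where excitation is balanced by inhibition.
eta, rho0 = 0.01, 5.0          # learning rate, target postsynaptic rate (Hz)
exc_drive, r_inh = 20.0, 8.0   # excitatory drive and inhibitory presynaptic rate (Hz)

w = 0.0
for _ in range(500):
    r_post = max(0.0, exc_drive - w * r_inh)        # threshold-linear rate
    w = max(0.0, w + eta * r_inh * (r_post - rho0)) # rate-averaged plasticity step
# r_post converges to rho0, with w near (exc_drive - rho0) / r_inh
```

At the fixed point the inhibitory weight exactly cancels the excess drive; in the cited work it is additional structure in the inhibitory STDP window, beyond this homeostatic set point, that supports rhythm generation.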
4
Delay-Induced Multistability and Loop Formation in Neuronal Networks with Spike-Timing-Dependent Plasticity. Sci Rep 2018; 8:12068. [PMID: 30104713] [PMCID: PMC6089910] [DOI: 10.1038/s41598-018-30565-9]
Abstract
Spike-timing-dependent plasticity (STDP) adjusts synaptic strengths according to the precise timing of pre- and postsynaptic spike pairs. Theoretical and computational studies have revealed that STDP may contribute to the emergence of a variety of structural and dynamical states in plastic neuronal populations. In this manuscript, we show that by incorporating dendritic and axonal propagation delays in recurrent networks of oscillatory neurons, the asymptotic connectivity displays multistability, where different structures emerge depending on the initial distribution of the synaptic strengths. In particular, we show that the standard deviation of the initial distribution of synaptic weights, besides its mean, determines the main properties of the emergent structural connectivity such as the mean final synaptic weight, the number of two-neuron loops and the symmetry of the final structure. We also show that the firing rates of the neurons affect the evolution of the network, and a more symmetric configuration of the synapses emerges at higher firing rates. We justify the network results based on a two-neuron framework and show how the results translate to large recurrent networks.
5
Min B, Zhou D, Cai D. Effects of Firing Variability on Network Structures with Spike-Timing-Dependent Plasticity. Front Comput Neurosci 2018; 12:1. [PMID: 29410621] [PMCID: PMC5787127] [DOI: 10.3389/fncom.2018.00001]
Abstract
Synaptic plasticity is believed to be the biological substrate underlying learning and memory. One of the most widespread forms of synaptic plasticity, spike-timing-dependent plasticity (STDP), uses the spike timing information of presynaptic and postsynaptic neurons to induce synaptic potentiation or depression. An open question is how STDP organizes the connectivity patterns in neuronal circuits. Previous studies have placed much emphasis on the role of firing rate in shaping connectivity patterns. Here, we go beyond the firing rate description to develop a self-consistent linear response theory that incorporates the information of both firing rate and firing variability. By decomposing the pairwise spike correlation into one component associated with local direct connections and the other associated with indirect connections, we identify two distinct regimes regarding the network structures learned through STDP. In one regime, the contribution of the direct-connection correlations dominates over that of the indirect-connection correlations in the learning dynamics; this gives rise to a network structure consistent with the firing rate description. In the other regime, the contribution of the indirect-connection correlations dominates in the learning dynamics, leading to a network structure different from the firing rate description. We demonstrate that the heterogeneity of firing variability across neuronal populations induces a temporally asymmetric structure of indirect-connection correlations. This temporally asymmetric structure underlies the emergence of the second regime. Our study provides a new perspective that emphasizes the role of high-order statistics of spiking activity in the spike-correlation-sensitive learning dynamics.
Affiliation(s)
- Bin Min
- Center for Neural Science, Courant Institute of Mathematical Sciences, New York University, New York, NY, United States; NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
- Douglas Zhou
- School of Mathematical Sciences, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
- David Cai
- Center for Neural Science, Courant Institute of Mathematical Sciences, New York University, New York, NY, United States; NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates; School of Mathematical Sciences, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
6
Lajoie G, Krouchev NI, Kalaska JF, Fairhall AL, Fetz EE. Correlation-based model of artificially induced plasticity in motor cortex by a bidirectional brain-computer interface. PLoS Comput Biol 2017; 13:e1005343. [PMID: 28151957] [PMCID: PMC5313237] [DOI: 10.1371/journal.pcbi.1005343]
Abstract
Experiments show that spike-triggered stimulation performed with Bidirectional Brain-Computer-Interfaces (BBCI) can artificially strengthen connections between separate neural sites in motor cortex (MC). When spikes from a neuron recorded at one MC site trigger stimuli at a second target site after a fixed delay, the connections between sites eventually strengthen. It was also found that effective spike-stimulus delays are consistent with experimentally derived spike-timing-dependent plasticity (STDP) rules, suggesting that STDP is key to drive these changes. However, the impact of STDP at the level of circuits, and the mechanisms governing its modification with neural implants remain poorly understood. The present work describes a recurrent neural network model with probabilistic spiking mechanisms and plastic synapses capable of capturing both neural and synaptic activity statistics relevant to BBCI conditioning protocols. Our model successfully reproduces key experimental results, both established and new, and offers mechanistic insights into spike-triggered conditioning. Using analytical calculations and numerical simulations, we derive optimal operational regimes for BBCIs, and formulate predictions concerning the efficacy of spike-triggered conditioning in different regimes of cortical activity.
Affiliation(s)
- Guillaume Lajoie
- University of Washington Institute for Neuroengineering, University of Washington, Seattle, WA, USA
- John F. Kalaska
- Groupe de recherche sur le système nerveux central, Département de neurosciences, Université de Montréal, Montréal, QC, Canada
| | - Adrienne L. Fairhall
- University of Washington Institute for Neuroengineering, University of Washington, Seattle, WA, USA
- Dept. of Physiology and Biophysics, University of Washington, Seattle, WA, USA
- Dept. of Physics, University of Washington, Seattle, WA, USA
| | - Eberhard E. Fetz
- University of Washington Institute for Neuroengineering, University of Washington, Seattle, WA, USA
- Dept. of Physiology and Biophysics, University of Washington, Seattle, WA, USA
7
Madadi Asl M, Valizadeh A, Tass PA. Dendritic and Axonal Propagation Delays Determine Emergent Structures of Neuronal Networks with Plastic Synapses. Sci Rep 2017; 7:39682. [PMID: 28045109] [PMCID: PMC5206725] [DOI: 10.1038/srep39682]
Abstract
Spike-timing-dependent plasticity (STDP) modifies synaptic strengths based on the relative timing of pre- and postsynaptic spikes; the temporal order of spikes has turned out to be crucial. Here we take into account how propagation delays, composed of dendritic and axonal delay times, may affect the temporal order of spikes. In a minimal setting that neglects dendritic and axonal propagation delays, STDP eliminates bidirectional connections between two coupled neurons and turns them into unidirectional connections. In this paper, however, we show that depending on the dendritic and axonal propagation delays, the temporal order of spikes at the synapses can differ from that at the cell bodies and, consequently, qualitatively different connectivity patterns emerge. In particular, we show that for a system of two coupled oscillatory neurons, bidirectional synapses can be preserved and potentiated. Intriguingly, this finding also translates to large networks of type-II phase oscillators and hence crucially impacts the overall hierarchical connectivity patterns of oscillatory neuronal networks.
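The central observation, that delays shift the spike-time difference seen at the synapse, can be sketched as follows. The delay values and STDP parameters are hypothetical, not those of the paper.

```python
import math

def stdp(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP kernel; dt = t_post - t_pre measured at the synapse."""
    return a_plus * math.exp(-dt / tau) if dt > 0 else -a_minus * math.exp(dt / tau)

# Near-synchronous somatic spike times (ms) of two mutually coupled neurons.
t1, t2 = 0.0, 1.0
# For a connection pre -> post, the presynaptic spike reaches the synapse after
# the axonal delay, while the back-propagating postsynaptic spike arrives after
# the dendritic delay. Here the dendritic delay exceeds the axonal one.
d_dend, d_axon = 5.0, 1.0

dt_12 = (t2 + d_dend) - (t1 + d_axon)   # effective timing at the synapse 1 -> 2
dt_21 = (t1 + d_dend) - (t2 + d_axon)   # effective timing at the synapse 2 -> 1
# both effective timings are positive, so both directions potentiate,
# preserving the bidirectional (two-neuron loop) connection
```

With zero delays the two synapses would see opposite-sign timings (+1 ms and -1 ms), and the depressed direction would eventually be pruned; the delays flip both effective timings to the potentiating side.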
Affiliation(s)
- Mojtaba Madadi Asl
- Institute for Advanced Studies in Basic Sciences (IASBS), Department of Physics, Zanjan, 45195-1159, Iran
- Alireza Valizadeh
- Institute for Advanced Studies in Basic Sciences (IASBS), Department of Physics, Zanjan, 45195-1159, Iran; Institute for Research in Fundamental Sciences (IPM), School of Cognitive Sciences, Tehran, 19395-5746, Iran
- Peter A Tass
- Institute of Neuroscience and Medicine - Neuromodulation (INM-7), Research Center Jülich, Jülich, 52425, Germany; Stanford University, Department of Neurosurgery, Stanford, CA, 94305, USA; University of Cologne, Department of Neuromodulation, Cologne, 50937, Germany
8
Luz Y, Shamir M. Oscillations via Spike-Timing Dependent Plasticity in a Feed-Forward Model. PLoS Comput Biol 2016; 12:e1004878. [PMID: 27082118] [PMCID: PMC4833372] [DOI: 10.1371/journal.pcbi.1004878]
Abstract
Neuronal oscillatory activity has been reported in relation to a wide range of cognitive processes including the encoding of external stimuli, attention, and learning. Although the specific role of these oscillations has yet to be determined, it is clear that neuronal oscillations are abundant in the central nervous system. This raises the question of the origin of these oscillations: are the mechanisms for generating these oscillations genetically hard-wired or can they be acquired via a learning process? Here, we study the conditions under which oscillatory activity emerges through a process of spike timing dependent plasticity (STDP) in a feed-forward architecture. First, we analyze the effect of oscillations on STDP-driven synaptic dynamics of a single synapse, and study how the parameters that characterize the STDP rule and the oscillations affect the resultant synaptic weight. Next, we analyze STDP-driven synaptic dynamics of a pre-synaptic population of neurons onto a single post-synaptic cell. The pre-synaptic neural population is assumed to be oscillating at the same frequency, albeit with different phases, such that the net activity of the pre-synaptic population is constant in time. Thus, in the homogeneous case in which all synapses are equal, the post-synaptic neuron receives constant input and hence does not oscillate. To investigate the transition to oscillatory activity, we develop a mean-field Fokker-Planck approximation of the synaptic dynamics. We analyze the conditions causing the homogeneous solution to lose its stability. The findings show that oscillatory activity appears through a mechanism of spontaneous symmetry breaking. However, in the general case the homogeneous solution is unstable, and the synaptic dynamics does not converge to a different fixed point, but rather to a limit cycle. We show how the temporal structure of the STDP rule determines the stability of the homogeneous solution and the drift velocity of the limit cycle. 
Oscillatory activity in the brain has been described in relation to many cognitive states and tasks, including the encoding of external stimuli, attention, learning and consolidation of memory. However, without tuning of synaptic weights to the preferred phase of firing, the oscillatory signal may not be able to propagate downstream, due to destructive interference. Here we investigate how synaptic plasticity can facilitate the transmission of an oscillatory signal downstream along the information-processing pathway in the brain. We show that basic synaptic plasticity rules, which have been reported empirically, are sufficient to generate the required tuning that enables the propagation of the oscillatory signal. In addition, our work presents a synaptic learning process that does not converge to a stationary state, but rather remains dynamic. We demonstrate how the functionality of the system, i.e., transmission of oscillatory activity, can be maintained in the face of constant remodeling of synaptic weights.
Affiliation(s)
- Yotam Luz
- Department of Physiology and Cell Biology, Faculty of Health Sciences, Ben-Gurion University of the Negev, Beer-Sheva, Israel
- Zlotowski Center for Neuroscience, Ben-Gurion University of the Negev, Beer-Sheva, Israel
- Maoz Shamir
- Department of Physiology and Cell Biology, Faculty of Health Sciences, Ben-Gurion University of the Negev, Beer-Sheva, Israel
- Zlotowski Center for Neuroscience, Ben-Gurion University of the Negev, Beer-Sheva, Israel
- Department of Physics, Faculty of Natural Sciences, Ben-Gurion University of the Negev, Beer-Sheva, Israel
9
Bi Z, Zhou C. Spike Pattern Structure Influences Synaptic Efficacy Variability under STDP and Synaptic Homeostasis. I: Spike Generating Models on Converging Motifs. Front Comput Neurosci 2016; 10:14. [PMID: 26941634] [PMCID: PMC4763167] [DOI: 10.3389/fncom.2016.00014]
Abstract
In neural systems, synaptic plasticity is usually driven by spike trains. Due to the inherent noises of neurons and synapses as well as the randomness of connection details, spike trains typically exhibit variability such as spatial randomness and temporal stochasticity, resulting in variability of synaptic changes under plasticity, which we call efficacy variability. How the variability of spike trains influences the efficacy variability of synapses remains unclear. In this paper, we try to understand this influence under pair-wise additive spike-timing dependent plasticity (STDP) when the mean strength of plastic synapses into a neuron is bounded (synaptic homeostasis). Specifically, we systematically study, analytically and numerically, how four aspects of statistical features, i.e., synchronous firing, burstiness/regularity, heterogeneity of rates and heterogeneity of cross-correlations, as well as their interactions influence the efficacy variability in converging motifs (simple networks in which one neuron receives from many other neurons). Neurons (including the post-synaptic neuron) in a converging motif generate spikes according to statistical models with tunable parameters. In this way, we can explicitly control the statistics of the spike patterns, and investigate their influence onto the efficacy variability, without worrying about the feedback from synaptic changes onto the dynamics of the post-synaptic neuron. We separate efficacy variability into two parts: the drift part (DriftV) induced by the heterogeneity of change rates of different synapses, and the diffusion part (DiffV) induced by weight diffusion caused by stochasticity of spike trains. Our main findings are: (1) synchronous firing and burstiness tend to increase DiffV, (2) heterogeneity of rates induces DriftV when potentiation and depression in STDP are not balanced, and (3) heterogeneity of cross-correlations induces DriftV together with heterogeneity of rates. 
We anticipate that our work will be important for understanding functional processes of neuronal networks (such as memory) and neural development.
Affiliation(s)
- Zedong Bi
- State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, Chinese Academy of Sciences, Beijing, China; Department of Physics, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Changsong Zhou
- Department of Physics, Hong Kong Baptist University, Kowloon Tong, Hong Kong; Centre for Nonlinear Studies, Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems, Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong; Beijing Computational Science Research Center, Beijing, China; Research Centre, HKBU Institute of Research and Continuing Education, Shenzhen, China
10
Vogt SM, Hofmann UG. Balancing the critical period of spiking neurons with attractor-less STDP. BMC Neurosci 2015. [PMCID: PMC4697491] [DOI: 10.1186/1471-2202-16-s1-p103]
11
Ocker GK, Litwin-Kumar A, Doiron B. Self-Organization of Microcircuits in Networks of Spiking Neurons with Plastic Synapses. PLoS Comput Biol 2015; 11:e1004458. [PMID: 26291697] [PMCID: PMC4546203] [DOI: 10.1371/journal.pcbi.1004458]
Abstract
The synaptic connectivity of cortical networks features an overrepresentation of certain wiring motifs compared to simple random-network models. This structure is shaped, in part, by synaptic plasticity that promotes or suppresses connections between neurons depending on their joint spiking activity. Frequently, theoretical studies focus on how feedforward inputs drive plasticity to create this network structure. We study the complementary scenario of self-organized structure in a recurrent network, with spike timing-dependent plasticity driven by spontaneous dynamics. We develop a self-consistent theory for the evolution of network structure by combining fast spiking covariance with a slow evolution of synaptic weights. Through a finite-size expansion of network dynamics we obtain a low-dimensional set of nonlinear differential equations for the evolution of two-synapse connectivity motifs. With this theory in hand, we explore how the form of the plasticity rule drives the evolution of microcircuits in cortical networks. When potentiation and depression are in approximate balance, synaptic dynamics depend on weighted divergent, convergent, and chain motifs. For additive, Hebbian STDP these motif interactions create instabilities in synaptic dynamics that either promote or suppress the initial network structure. Our work provides a consistent theoretical framework for studying how spiking activity in recurrent networks interacts with synaptic plasticity to determine network structure.
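One common way to quantify the two-synapse motifs this theory tracks is to sum products of weight-matrix entries; the toy version below is unnormalised and the paper's motif coefficients may be defined differently.

```python
# Two-synapse motifs in a weight matrix W (W[i][j] = weight from i to j):
#   chain:      W[i][j] * W[j][k]   (i -> j -> k)
#   divergent:  W[i][j] * W[i][k]   (one source, two targets, j != k)
#   convergent: W[i][j] * W[k][j]   (two sources, one target, i != k)
N = 3
W = [[0, 1, 0],
     [0, 0, 1],
     [0, 0, 0]]        # a simple chain 0 -> 1 -> 2

chain = sum(W[i][j] * W[j][k] for i in range(N) for j in range(N) for k in range(N))
div = sum(W[i][j] * W[i][k] for i in range(N) for j in range(N) for k in range(N) if j != k)
conv = sum(W[i][j] * W[k][j] for i in range(N) for j in range(N) for k in range(N) if i != k)
# only the chain motif is present in this toy network
```

In the cited framework the growth rate of a synapse depends on weighted versions of exactly these motif sums, which is why the low-dimensional motif equations can predict whether initial structure is amplified or suppressed.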
Affiliation(s)
- Gabriel Koch Ocker
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, Pennsylvania, United States of America
- Ashok Litwin-Kumar
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, Pennsylvania, United States of America
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Center for Theoretical Neuroscience, Columbia University, New York, New York, United States of America
- Brent Doiron
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, Pennsylvania, United States of America
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
12
Einarsson H, Lengler J, Steger A. A high-capacity model for one shot association learning in the brain. Front Comput Neurosci 2014; 8:140. [PMID: 25426060] [PMCID: PMC4224099] [DOI: 10.3389/fncom.2014.00140]
Abstract
We present a high-capacity model for one-shot association learning (hetero-associative memory) in sparse networks. We assume that basic patterns are pre-learned in networks and associations between two patterns are presented only once and have to be learned immediately. The model is a combination of an Amit-Fusi like network sparsely connected to a Willshaw type network. The learning procedure is palimpsest and comes from earlier work on one-shot pattern learning. However, in our setup we can enhance the capacity of the network by iterative retrieval. This yields a model for sparse brain-like networks in which populations of a few thousand neurons are capable of learning hundreds of associations even if they are presented only once. The analysis of the model is based on a novel result by Janson et al. on bootstrap percolation in random graphs.
Collapse
Affiliation(s)
- Hafsteinn Einarsson
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zürich, Zürich, Switzerland
- Johannes Lengler
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zürich, Zürich, Switzerland
- Angelika Steger
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zürich, Zürich, Switzerland; Collegium Helveticum, Zürich, Switzerland
13
Interplay between short- and long-term plasticity in cell-assembly formation. PLoS One 2014; 9:e101535. [PMID: 25007209] [PMCID: PMC4090127] [DOI: 10.1371/journal.pone.0101535]
Abstract
Various hippocampal and neocortical synapses of the mammalian brain show both short-term plasticity and long-term plasticity, which are considered to underlie learning and memory by the brain. According to Hebb's postulate, synaptic plasticity encodes memory traces of past experiences into cell assemblies in cortical circuits. However, it remains unclear how the various forms of long-term and short-term synaptic plasticity cooperatively create and reorganize such cell assemblies. Here, we investigate how the three forms of synaptic plasticity known in cortical circuits, i.e., spike-timing-dependent plasticity (STDP), short-term depression (STD) and homeostatic plasticity, cooperatively generate, retain and reorganize cell assemblies in a recurrent neuronal network model. We show that multiple cell assemblies generated by external stimuli can survive noisy spontaneous network activity for an adequate range of the strength of STD. Furthermore, our model predicts that a symmetric temporal window of STDP, such as observed in dopaminergic modulations on hippocampal neurons, is crucial for the retention and integration of multiple cell assemblies. These results may have implications for the understanding of cortical memory processes.
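The short-term depression component can be sketched with the standard Tsodyks-Markram model; the cited study's exact formulation and parameters may differ, and all values below are hypothetical.

```python
import math

# Each presynaptic spike uses a fraction U of the available resources x,
# which recover back toward 1 with time constant tau_d between spikes.
tau_d, U = 200.0, 0.5        # recovery time constant (ms), utilisation fraction
rate = 20.0                  # presynaptic firing rate (Hz)
isi = 1000.0 / rate          # inter-spike interval (ms)

x, efficacies = 1.0, []
for _ in range(50):
    efficacies.append(U * x)                        # efficacy of this spike
    x *= (1.0 - U)                                  # depletion at the spike
    x = 1.0 - (1.0 - x) * math.exp(-isi / tau_d)    # recovery until the next spike
# efficacy decays from U toward a lower steady state during sustained firing
```

This spike-rate-dependent weakening is what lets STD limit runaway reactivation of an assembly: the more vigorously an assembly fires, the weaker its recurrent synapses transiently become.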
14
Coexistence of reward and unsupervised learning during the operant conditioning of neural firing rates. PLoS One 2014; 9:e87123. [PMID: 24475240] [PMCID: PMC3903641] [DOI: 10.1371/journal.pone.0087123]
Abstract
A fundamental goal of neuroscience is to understand how cognitive processes, such as operant conditioning, are performed by the brain. Typical and well-studied examples of operant conditioning, in which the firing rates of individual cortical neurons in monkeys are increased using rewards, provide an opportunity to gain insight into this question. Studies of reward-modulated spike-timing-dependent plasticity (RSTDP), and of other models such as R-max, have reproduced this learning behavior, but they have assumed that no unsupervised learning is present (i.e., no learning occurs without, or independent of, rewards). We show that these models cannot elicit firing rate reinforcement while exhibiting both reward learning and ongoing, stable unsupervised learning. To fix this issue, we propose a new RSTDP model of synaptic plasticity based upon the observed effects that dopamine has on long-term potentiation and depression (LTP and LTD). We show, both analytically and through simulations, that our new model can exhibit unsupervised learning and lead to firing rate reinforcement. This requires that the strengthening of LTP by the reward signal is greater than the strengthening of LTD and that the reinforced neuron exhibits irregular firing. We show the robustness of our findings to spike-timing correlations, to the synaptic weight dependence that is assumed, and to changes in the mean reward. We also consider our model in the differential reinforcement of two nearby neurons. Our model aligns more closely with experimental studies than previous models and makes testable predictions for future experiments.
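A minimal sketch of a reward-modulated STDP update in the spirit described above, where the reward signal scales LTP more strongly than LTD, the condition the abstract identifies for firing-rate reinforcement. Parameter names and values are illustrative assumptions, not the paper's model.

```python
import math

def rstdp_update(dt, reward, a_ltp=0.01, a_ltd=0.012,
                 tau=20.0, k_ltp=2.0, k_ltd=1.0):
    """Reward-modulated STDP step for one pre/post spike pair with
    dt = t_post - t_pre (ms). The reward multiplies potentiation
    more strongly than depression (k_ltp > k_ltd)."""
    if dt > 0:   # pre before post: potentiation
        return reward * k_ltp * a_ltp * math.exp(-dt / tau)
    # post before pre: depression
    return -reward * k_ltd * a_ltd * math.exp(dt / tau)
```

Because both branches scale linearly with the reward, the asymmetry k_ltp > k_ltd biases weight growth, and hence the postsynaptic firing rate, upward whenever rewards arrive.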
|
15
|
Babadi B, Abbott LF. Pairwise analysis can account for network structures arising from spike-timing dependent plasticity. PLoS Comput Biol 2013; 9:e1002906. [PMID: 23436986 PMCID: PMC3578766 DOI: 10.1371/journal.pcbi.1002906] [Citation(s) in RCA: 32] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/13/2012] [Accepted: 12/14/2012] [Indexed: 11/18/2022] Open
Abstract
Spike timing-dependent plasticity (STDP) modifies synaptic strengths based on timing information available locally at each synapse. Despite this, it induces global structures within a recurrently connected network. We study such structures both through simulations and by analyzing the effects of STDP on pair-wise interactions of neurons. We show how conventional STDP acts as a loop-eliminating mechanism and organizes neurons into in- and out-hubs. Loop-elimination increases when depression dominates and turns into loop-generation when potentiation dominates. STDP with a shifted temporal window such that coincident spikes cause depression enhances recurrent connections and functions as a strict buffering mechanism that maintains a roughly constant average firing rate. STDP with the opposite temporal shift functions as a loop eliminator at low rates and as a potent loop generator at higher rates. In general, studying pairwise interactions of neurons provides important insights about the structures that STDP can produce in large networks.
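The pairwise picture above can be illustrated with a short sketch: a conventional pair-based window whose net area indicates whether depression or potentiation dominates on average, plus a `shift` parameter standing in for the shifted windows the abstract discusses. All constants are illustrative assumptions, not the paper's values.

```python
import numpy as np

def stdp_window(dt, a_plus=1.0, a_minus=1.1,
                tau_plus=17.0, tau_minus=34.0, shift=0.0):
    """Pair-based STDP window W(dt), dt = t_post - t_pre in ms.
    A nonzero `shift` moves the LTP/LTD boundary so that coincident
    spikes can fall on the depression side."""
    d = np.asarray(dt, dtype=float) - shift
    return np.where(d > 0,
                    a_plus * np.exp(-d / tau_plus),
                    -a_minus * np.exp(d / tau_minus))

# The sign of the window's net area indicates whether depression or
# potentiation dominates on average (here: depression-dominated,
# the loop-eliminating regime).
dts = np.linspace(-200.0, 200.0, 4001)
net = stdp_window(dts).sum() * (dts[1] - dts[0])
```

With these constants the net area is roughly a_plus*tau_plus - a_minus*tau_minus, i.e. negative, corresponding to the depression-dominated regime.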
Affiliation(s)
- Baktash Babadi
- Center for Theoretical Neuroscience, Department of Neuroscience, Columbia University, New York, New York, United States of America.
|
16
|
Kerr RR, Burkitt AN, Thomas DA, Gilson M, Grayden DB. Delay selection by spike-timing-dependent plasticity in recurrent networks of spiking neurons receiving oscillatory inputs. PLoS Comput Biol 2013; 9:e1002897. [PMID: 23408878 PMCID: PMC3567188 DOI: 10.1371/journal.pcbi.1002897] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/12/2012] [Accepted: 12/10/2012] [Indexed: 11/28/2022] Open
Abstract
Learning rules, such as spike-timing-dependent plasticity (STDP), change the structure of networks of neurons based on their firing activity. A network-level understanding of these mechanisms can help infer how the brain learns patterns and processes information. Previous studies have shown that STDP selectively potentiates feed-forward connections that have specific axonal delays, and that this underlies behavioral functions such as sound localization in the auditory brainstem of the barn owl. In this study, we investigate how STDP leads to the selective potentiation of recurrent connections with different axonal and dendritic delays during oscillatory activity. We develop analytical models of learning with additive STDP in recurrent networks driven by oscillatory inputs, and support the results using simulations with leaky integrate-and-fire neurons. Our results show selective potentiation of connections with specific axonal delays, which depends on the input frequency. In addition, we demonstrate how this can lead to a network becoming selective in the amplitude of its oscillatory response to this frequency. We extend this model of axonal delay selection within a single recurrent network in two ways. First, we show the selective potentiation of connections with a range of both axonal and dendritic delays. Second, we show axonal delay selection between multiple groups receiving out-of-phase, oscillatory inputs. We discuss the application of these models to the formation and activation of neuronal ensembles or cell assemblies in the cortex, and also to missing fundamental pitch perception in the auditory brainstem. Our brain's ability to perform cognitive processes, such as object identification, problem solving, and decision making, comes from the specific connections between neurons. The neurons carry information as spikes that are transmitted to other neurons via connections with different strengths and propagation delays.
Experimentally observed learning rules can modify the strengths of connections between neurons based on the timing of their spikes. The learning that occurs in neuronal networks due to these rules is thought to be vital to creating the structures necessary for different cognitive processes as well as for memory. The spiking rate of populations of neurons has been observed to oscillate at particular frequencies in various brain regions, and there is evidence that these oscillations play a role in cognition. Here, we use analytical and numerical methods to investigate the changes to the network structure caused by a specific learning rule during oscillatory neural activity. We find the conditions under which connections with propagation delays that resonate with the oscillations are strengthened relative to the other connections. We demonstrate that networks learn to respond more strongly to oscillations at the frequency to which they were exposed during learning. We discuss the possible application of these results to specific areas of the brain.
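The delay-selection effect can be sketched as follows: under oscillatory input, the pre/post spike cross-correlation is approximately cosine-shaped and shifted by the axonal delay, and the average weight drift is the STDP window integrated against it. This is an illustrative toy model, not the paper's exact derivation; all constants are assumed.

```python
import numpy as np

def drift_for_delay(delay, freq, a_plus=1.0, a_minus=0.8,
                    tau_plus=10.0, tau_minus=10.0):
    """Average weight drift for a connection with axonal `delay` (ms)
    under oscillatory activity at `freq` (Hz): the pair-based STDP
    window is integrated against a cosine-shaped, delay-shifted
    spike cross-correlation."""
    dts = np.linspace(-100.0, 100.0, 2001)
    w = np.where(dts > 0, a_plus * np.exp(-dts / tau_plus),
                 -a_minus * np.exp(dts / tau_minus))
    corr = np.cos(2.0 * np.pi * freq * 1e-3 * (dts + delay))
    return float((w * corr).sum() * (dts[1] - dts[0]))
```

Delays that resonate with the oscillation yield positive drift and are potentiated; delays half a period away yield drift of the opposite sign and are depressed.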
Affiliation(s)
- Robert R. Kerr
- NeuroEngineering Laboratory, Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Centre for Neural Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Anthony N. Burkitt
- NeuroEngineering Laboratory, Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Centre for Neural Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Bionics Institute, Melbourne, Victoria, Australia
- Doreen A. Thomas
- Department of Mechanical Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Matthieu Gilson
- NeuroEngineering Laboratory, Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Centre for Neural Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute, Saitama, Japan
- David B. Grayden
- NeuroEngineering Laboratory, Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Centre for Neural Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Bionics Institute, Melbourne, Victoria, Australia
|
17
|
Spectral analysis of input spike trains by spike-timing-dependent plasticity. PLoS Comput Biol 2012; 8:e1002584. [PMID: 22792056 PMCID: PMC3390410 DOI: 10.1371/journal.pcbi.1002584] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2011] [Accepted: 05/14/2012] [Indexed: 11/19/2022] Open
Abstract
Spike-timing-dependent plasticity (STDP) has been observed in many brain areas such as sensory cortices, where it is hypothesized to structure synaptic connections between neurons. Previous studies have demonstrated how STDP can capture spiking information at short timescales using specific input configurations, such as coincident spiking, spike patterns and oscillatory spike trains. However, the corresponding computation in the case of arbitrary input signals is still unclear. This paper provides an overarching picture of the algorithm inherent to STDP, tying together many previous results for commonly used models of pairwise STDP. For a single neuron with plastic excitatory synapses, we show how STDP performs a spectral analysis on the temporal cross-correlograms between its afferent spike trains. The postsynaptic responses and STDP learning window determine kernel functions that specify how the neuron “sees” the input correlations. We thus denote this unsupervised learning scheme as ‘kernel spectral component analysis’ (kSCA). In particular, the whole input correlation structure must be considered since all plastic synapses compete with each other. We find that kSCA is enhanced when weight-dependent STDP induces gradual synaptic competition. For a spiking neuron with a “linear” response and pairwise STDP alone, we find that kSCA resembles principal component analysis (PCA). However, plain STDP does not isolate correlation sources in general, e.g., when they are mixed among the input spike trains. In other words, it does not perform independent component analysis (ICA). Tuning the neuron to a single correlation source can be achieved when STDP is paired with a homeostatic mechanism that reinforces the competition between synaptic inputs. Our results suggest that neuronal networks equipped with STDP can process signals encoded in the transient spiking activity at timescales of tens of milliseconds for typical STDP parameters.
Tuning feature extraction of sensory stimuli is an important function for synaptic plasticity models. A widely studied example is the development of orientation preference in the primary visual cortex, which can emerge using moving bars in the visual field. A crucial point is the decomposition of stimuli into basic information tokens, e.g., selecting individual bars even though they are presented in overlapping pairs (vertical and horizontal). Among classical unsupervised learning models, independent component analysis (ICA) is capable of isolating basic tokens, whereas principal component analysis (PCA) cannot. This paper focuses on spike-timing-dependent plasticity (STDP), whose functional implications for neural information processing have been intensively studied both theoretically and experimentally in the last decade. Following recent studies demonstrating that STDP can perform ICA for specific cases, we show how STDP relates to PCA or ICA, and in particular explains the conditions under which it switches between them. Here information at the neuronal level is assumed to be encoded in temporal cross-correlograms of spike trains. We find that a linear spiking neuron equipped with pairwise STDP requires additional mechanisms, such as a homeostatic regulation of its output firing, in order to separate mixed correlation sources and thus perform ICA.
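The PCA-like outcome described above can be sketched with power iteration: for a "linear" neuron with pairwise STDP, the weight drift is approximately linear in the input correlation matrix, and a homeostatic norm constraint drives the weights toward its dominant eigenvector. The correlation matrix below is an illustrative two-source example, not data from the paper.

```python
import numpy as np

# Illustrative input correlation matrix with two correlated source
# groups (inputs 0-1 strongly correlated, inputs 2-3 weakly correlated).
C = np.array([[1.0, 0.8, 0.0, 0.0],
              [0.8, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.3],
              [0.0, 0.0, 0.3, 1.0]])

w = np.ones(4) / 2.0          # initial weights, unit norm
for _ in range(200):
    w = C @ w                 # linearized drift: dw proportional to C w
    w /= np.linalg.norm(w)    # homeostatic constraint on total weight
# w converges to the dominant eigenvector of C (PCA-like outcome):
# the strongly correlated pair (inputs 0 and 1) wins the competition.
```

Note that this finds the principal component, not the independent components; separating mixed sources (ICA) requires the stronger homeostatic competition the abstract describes.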
|
18
|
Gilson M, Bürck M, Burkitt AN, van Hemmen JL. Frequency selectivity emerging from spike-timing-dependent plasticity. Neural Comput 2012; 24:2251-79. [PMID: 22734488 DOI: 10.1162/neco_a_00331] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Periodic neuronal activity has been observed in various areas of the brain, from lower sensory to higher cortical levels. Specific frequency components contained in this periodic activity can be identified by a neuronal circuit that behaves as a bandpass filter with given preferred frequency, or best modulation frequency (BMF). For BMFs typically ranging from 10 to 200 Hz, a plausible and minimal configuration consists of a single neuron with adjusted excitatory and inhibitory synaptic connections. How such neuronal circuitry emerges, however, is still unclear. In this letter, we demonstrate how spike-timing-dependent plasticity (STDP) can give rise to frequency-dependent learning, thus leading to an input selectivity that enables frequency identification. We use an in-depth mathematical analysis of the learning dynamics in a population of plastic inhibitory connections. These provide inhomogeneous postsynaptic responses that depend on their dendritic location. We find that synaptic delays play a crucial role in organizing the weight specialization induced by STDP. Under suitable conditions on the synaptic delays and postsynaptic potentials (PSPs), the BMF of a neuron after learning can match the training frequency. In particular, proximal (distal) synapses with shorter (longer) dendritic delay and somatically measured PSP time constants respond better to higher (lower) frequencies. As a result, the neuron will respond maximally to any stimulating frequency (in a given range) with which it has been trained in an unsupervised manner. The model predicts that synapses responding to a given BMF form clusters on dendritic branches.
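A toy illustration of how adjusted excitatory and inhibitory inputs yield a bandpass response: each exponential PSP kernel acts as a low-pass filter on rate modulations, and fast excitation minus slow inhibition leaves a band-pass profile with a best modulation frequency in between. Time constants are illustrative assumptions, not fitted values from the letter.

```python
import numpy as np

def psp_gain(freq, tau_ms):
    """Response amplitude of an exponential PSP kernel with time
    constant tau_ms (ms) to a rate modulation at `freq` Hz."""
    omega = 2.0 * np.pi * freq * 1e-3   # rad per ms
    return 1.0 / np.sqrt(1.0 + (omega * tau_ms) ** 2)

def bmf_response(freq, tau_exc=2.0, tau_inh=10.0):
    """Fast excitation minus slow inhibition: both pass low
    frequencies equally (difference near zero), excitation alone
    survives at intermediate frequencies, and both die off at high
    frequencies, yielding a band-pass profile."""
    return psp_gain(freq, tau_exc) - psp_gain(freq, tau_inh)
```
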
Affiliation(s)
- Matthieu Gilson
- NeuroEngineering Laboratory, Department of Electrical and Electronic Engineering, University of Melbourne, VIC 3010, Australia; The Bionics Institute, East Melbourne, VIC 3002, Australia; NICTA the Victorian Research Lab, University of Melbourne, VIC 3010, Australia; and RIKEN Brain Science Institute, Saitama 351-0198, Japan
- Moritz Bürck
- Physik Department T35, Technische Universität München, 85748 Garching bei München, Germany, and Bernstein Center for Computational Neuroscience—München, 82152 Martinsried, Germany
- Anthony N. Burkitt
- Neuroengineering Laboratory, Department of Electrical and Electronic Engineering, University of Melbourne, VIC 3010, Australia; The Bionics Institute, East Melbourne, VIC 3010, Australia; and Centre for Neural Engineering, University of Melbourne, VIC 3010, Australia
- J. Leo van Hemmen
- Physik Department T35, Technische Universität München, 85748 Garching bei München, Germany, and Bernstein Center for Computational Neuroscience—München, 82152 Martinsried, Germany
|
19
|
Luz Y, Shamir M. Balancing feed-forward excitation and inhibition via Hebbian inhibitory synaptic plasticity. PLoS Comput Biol 2012; 8:e1002334. [PMID: 22291583 PMCID: PMC3266879 DOI: 10.1371/journal.pcbi.1002334] [Citation(s) in RCA: 52] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/14/2011] [Accepted: 11/16/2011] [Indexed: 12/02/2022] Open
Abstract
It has been suggested that excitatory and inhibitory inputs to cortical cells are balanced, and that this balance is important for the highly irregular firing observed in the cortex. There are two hypotheses as to the origin of this balance. One assumes that it results from a stable solution of the recurrent neuronal dynamics. This model can account for a balance of steady-state excitation and inhibition without fine-tuning of parameters, but not for transient inputs. The second hypothesis suggests that the feed-forward excitatory and inhibitory inputs to a postsynaptic cell are already balanced. The latter hypothesis can thus account for the balance of transient inputs. However, it remains unclear what mechanism underlies the fine-tuning required for balancing feed-forward excitatory and inhibitory inputs. Here we investigated whether inhibitory synaptic plasticity is responsible for the balance of transient feed-forward excitation and inhibition. We address this issue in the framework of a model characterizing the stochastic dynamics of temporally anti-symmetric Hebbian spike-timing-dependent plasticity of feed-forward excitatory and inhibitory synaptic inputs to a single postsynaptic cell. Our analysis shows that inhibitory Hebbian plasticity generates 'negative feedback' that balances excitation and inhibition, which contrasts with the 'positive feedback' of excitatory Hebbian synaptic plasticity. As a result, this balance may increase the sensitivity of the learning dynamics to the correlation structure of the excitatory inputs.
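The 'negative feedback' character of inhibitory plasticity can be sketched with a rule in this spirit: inhibition strengthens when the postsynaptic rate exceeds a target and weakens below it, pulling excitation and inhibition into balance. The rule, the target rate, and all constants are illustrative assumptions, not the paper's exact model.

```python
def inhib_stdp_step(w_inh, pre_rate, post_rate, target=5.0, eta=0.001):
    """Inhibitory Hebbian step with negative feedback: inhibitory
    weight change is proportional to the deviation of the
    postsynaptic rate from a target rate."""
    return w_inh + eta * pre_rate * (post_rate - target)

# Toy loop: the postsynaptic rate is excitatory drive minus inhibition;
# the rule drives the rate to the 5 Hz target, balancing E and I.
w_inh, exc_drive, pre_inh_rate = 0.0, 20.0, 10.0
for _ in range(2000):
    post = max(exc_drive - w_inh * pre_inh_rate, 0.0)
    w_inh = inhib_stdp_step(w_inh, pre_inh_rate, post)
```

The fixed point sits where the postsynaptic rate equals the target, which is exactly the balanced condition; excitatory Hebbian plasticity, by contrast, would grow without such a stabilizing deviation term.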
Affiliation(s)
- Yotam Luz
- Department of Physiology and Neurobiology, Ben-Gurion University of the Negev, Beer-Sheva, Israel
- Maoz Shamir
- Department of Physiology and Neurobiology, Ben-Gurion University of the Negev, Beer-Sheva, Israel
- Department of Physics, Ben-Gurion University of the Negev, Beer-Sheva, Israel
|
20
|
Gilson M, Masquelier T, Hugues E. STDP allows fast rate-modulated coding with Poisson-like spike trains. PLoS Comput Biol 2011; 7:e1002231. [PMID: 22046113 PMCID: PMC3203056 DOI: 10.1371/journal.pcbi.1002231] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2011] [Accepted: 09/01/2011] [Indexed: 11/18/2022] Open
Abstract
Spike timing-dependent plasticity (STDP) has been shown to enable single neurons to detect repeatedly presented spatiotemporal spike patterns. This holds even when such patterns are embedded in equally dense random spiking activity, that is, in the absence of external reference times such as a stimulus onset. Here we demonstrate, both analytically and numerically, that STDP can also learn repeating rate-modulated patterns, which have received more experimental evidence, for example, through post-stimulus time histograms (PSTHs). Each input spike train is generated from a rate function using a stochastic sampling mechanism, chosen to be an inhomogeneous Poisson process here. Learning is feasible provided significant covarying rate modulations occur within the typical timescale of STDP (∼10–20 ms) for sufficiently many inputs (∼100 among 1000 in our simulations), a condition that is met by many experimental PSTHs. Repeated pattern presentations induce spike-time correlations that are captured by STDP. Despite imprecise input spike times and even variable spike counts, a single trained neuron robustly detects the pattern just a few milliseconds after its presentation. Therefore, temporal imprecision and Poisson-like firing variability are not an obstacle to fast temporal coding. STDP provides an appealing mechanism to learn such rate patterns, which, beyond sensory processing, may also be involved in many cognitive tasks. In vivo neural responses to stimuli are known to have a lot of variability across trials. If the same number of spikes is emitted from trial to trial, the neuron is said to be reliable. If the timing of such spikes is roughly preserved across trials, the neuron is said to be precise. 
Here we demonstrate both analytically and numerically that the well-established Hebbian learning rule of spike-timing-dependent plasticity (STDP) can learn response patterns despite relatively low reliability (Poisson-like variability) and low temporal precision (10–20 ms). These features are in line with many experimental observations, in which a poststimulus time histogram (PSTH) is evaluated over multiple trials. In our model, however, information is extracted from the relative spike times between afferents without the need of an absolute reference time, such as a stimulus onset. Relevantly, recent experiments show that relative timing is often more informative than the absolute timing. Furthermore, the scope of application for our study is not restricted to sensory systems. Taken together, our results suggest a fine temporal resolution for the neural code, and that STDP is an appropriate candidate for encoding and decoding such activity.
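The stochastic sampling mechanism described above (an inhomogeneous Poisson process generating each input spike train from a rate function) can be sketched by thinning on a fine grid. The example rate function with a transient bump is an illustrative stand-in for a PSTH-like pattern, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def inhom_poisson(rate_fn, t_max, dt=0.1):
    """Sample spike times (ms) from an inhomogeneous Poisson process
    by Bernoulli thinning on a fine grid; rate_fn maps times (ms)
    to instantaneous rates (Hz)."""
    ts = np.arange(0.0, t_max, dt)
    p_spike = rate_fn(ts) * dt * 1e-3   # spike probability per bin
    return ts[rng.random(ts.size) < p_spike]

# e.g. a rate pattern with a transient bump around t = 50 ms:
rate = lambda t: 10.0 + 40.0 * np.exp(-0.5 * ((t - 50.0) / 10.0) ** 2)
spikes = inhom_poisson(rate, 200.0)
```

Repeated draws from the same rate function produce trials with Poisson-like count variability and imprecise spike times, exactly the input regime in which the paper shows STDP can still learn the pattern.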
Affiliation(s)
- Matthieu Gilson
- Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Australia
- Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute, Wako-shi, Saitama, Japan
- Timothée Masquelier
- Unit for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Etienne Hugues
- Unit for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
|
21
|
Stability versus neuronal specialization for STDP: long-tail weight distributions solve the dilemma. PLoS One 2011; 6:e25339. [PMID: 22003389 PMCID: PMC3189213 DOI: 10.1371/journal.pone.0025339] [Citation(s) in RCA: 58] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/04/2011] [Accepted: 09/01/2011] [Indexed: 11/19/2022] Open
Abstract
Spike-timing-dependent plasticity (STDP) modifies the weight (or strength) of synaptic connections between neurons and is considered to be crucial for generating network structure. It has been observed in physiology that, in addition to spike timing, the weight update also depends on the current value of the weight. The functional implications of this feature are still largely unclear. Additive STDP gives rise to strong competition among synapses, but due to the absence of weight dependence, it requires hard boundaries to secure the stability of weight dynamics. Multiplicative STDP with linear weight dependence for depression ensures stability, but it lacks sufficiently strong competition required to obtain a clear synaptic specialization. A solution to this stability-versus-function dilemma can be found with an intermediate parametrization between additive and multiplicative STDP. Here we propose a novel solution to the dilemma, named log-STDP, whose key feature is a sublinear weight dependence for depression. Due to its specific weight dependence, this new model can produce significantly broad weight distributions with no hard upper bound, similar to those recently observed in experiments. Log-STDP induces graded competition between synapses, such that synapses receiving stronger input correlations are pushed further in the tail of (very) large weights. Strong weights are functionally important to enhance the neuronal response to synchronous spike volleys. Depending on the input configuration, multiple groups of correlated synaptic inputs exhibit either winner-share-all or winner-take-all behavior. When the configuration of input correlations changes, individual synapses quickly and robustly readapt to represent the new configuration. We also demonstrate the advantages of log-STDP for generating a stable structure of strong weights in a recurrently connected network. These properties of log-STDP are compared with those of previous models. 
Through long-tail weight distributions, log-STDP achieves both stable synaptic dynamics and robust competition among synapses, which are crucial for spike-based information processing.
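The key feature named in this abstract, a sublinear weight dependence for depression, can be sketched as follows. Below a reference weight the depression factor grows roughly linearly; above it only logarithmically, so strong weights are penalized ever more gently and no hard upper bound is needed. The functional form and constants are illustrative, not the paper's exact parametrization.

```python
import math

def ltd_factor(w, w0=1.0, alpha=5.0):
    """Weight dependence of depression in a log-STDP-style rule:
    linear below the reference weight w0, logarithmic above it
    (continuous at w0)."""
    if w <= w0:
        return w / w0
    return 1.0 + math.log1p(alpha * (w / w0 - 1.0)) / alpha
```

Compared with multiplicative STDP (linear in w everywhere), the logarithmic branch lets a few synapses run far into the tail of large weights, producing the broad, heavy-tailed distributions described above.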
|
22
|
Gilson M, Burkitt AN, van Hemmen JL. STDP in Recurrent Neuronal Networks. Front Comput Neurosci 2010; 4. [PMID: 20890448 PMCID: PMC2947928 DOI: 10.3389/fncom.2010.00023] [Citation(s) in RCA: 53] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2010] [Accepted: 06/28/2010] [Indexed: 11/13/2022] Open
Abstract
Recent results about spike-timing-dependent plasticity (STDP) in recurrently connected neurons are reviewed, with a focus on the relationship between the weight dynamics and the emergence of network structure. In particular, the evolution of synaptic weights in the two cases of incoming connections for a single neuron and recurrent connections are compared and contrasted. A theoretical framework is used that is based upon Poisson neurons with a temporally inhomogeneous firing rate and the asymptotic distribution of weights generated by the learning dynamics. Different network configurations examined in recent studies are discussed and an overview of the current understanding of STDP in recurrently connected neuronal networks is presented.
|