1. Albesa-González A, Froc M, Williamson O, van Rossum MCW. Weight dependence in BCM leads to adjustable synaptic competition. J Comput Neurosci 2022; 50:431-444. PMID: 35764852; PMCID: PMC9666303; DOI: 10.1007/s10827-022-00824-w.
Abstract
Models of synaptic plasticity have been used to better understand neural development as well as learning and memory. One prominent classic model is the Bienenstock-Cooper-Munro (BCM) model, which has been particularly successful in explaining plasticity of the visual cortex. Here, in an effort to include more biophysical detail in the BCM model, we incorporate (1) feedforward inhibition, and (2) the experimental observation that large synapses are relatively harder to potentiate than weak ones, while synaptic depression is proportional to synaptic strength. These modifications change the outcome of unsupervised plasticity under the BCM model. The amount of feedforward inhibition adds a parameter to BCM that turns out to determine the strength of synaptic competition. In the limit of strong inhibition, the learning outcome is identical to standard BCM and the neuron becomes selective to one stimulus only (winner-take-all). For smaller values of inhibition, competition is weaker and the receptive fields are less selective. However, both BCM variants can yield realistic receptive fields.
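The modified rule described above can be sketched in a few lines. The following toy update is illustrative only: parameter names such as `gamma` (feedforward-inhibition strength) and the soft bound `w_max` are assumptions for the sketch, not the paper's exact equations. It combines weight-dependent potentiation/depression with the sliding BCM threshold:

```python
import numpy as np

def bcm_step(w, x, theta, eta=0.01, gamma=0.2, w_max=1.0, tau_theta=50.0):
    """One weight update under a BCM-like rule with feedforward inhibition
    and weight-dependent potentiation/depression (toy parameterization)."""
    # Postsynaptic rate: excitatory drive minus feedforward inhibition
    # proportional to the total input (gamma sets inhibition strength).
    y = max(0.0, np.dot(w, x) - gamma * x.sum())
    phi = y * (y - theta)  # BCM nonlinearity around the sliding threshold
    if phi > 0:
        # Potentiation: harder for strong synapses (soft upper bound).
        dw = eta * phi * x * (w_max - w)
    else:
        # Depression: proportional to the current weight.
        dw = eta * phi * x * w
    w_new = np.clip(w + dw, 0.0, w_max)
    # Sliding threshold tracks a running average of y^2.
    theta_new = theta + (y**2 - theta) / tau_theta
    return w_new, y, theta_new
```

Setting `gamma` large makes the inhibition dominate, pushing the rule toward the winner-take-all regime the abstract describes; small `gamma` weakens competition.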
Affiliation(s)
- Albert Albesa-González
- School of Psychology and School of Mathematical Sciences, University of Nottingham, Nottingham NG7 2RD, UK
- Maxime Froc
- School of Psychology and School of Mathematical Sciences, University of Nottingham, Nottingham NG7 2RD, UK
- Oliver Williamson
- School of Psychology and School of Mathematical Sciences, University of Nottingham, Nottingham NG7 2RD, UK
- Mark C W van Rossum
- School of Psychology and School of Mathematical Sciences, University of Nottingham, Nottingham NG7 2RD, UK
2. Geyer T, Seitz W, Zinchenko A, Müller HJ, Conci M. Why Are Acquired Search-Guiding Context Memories Resistant to Updating? Front Psychol 2021; 12:650245. PMID: 33732200; PMCID: PMC7956950; DOI: 10.3389/fpsyg.2021.650245.
Abstract
Looking for goal-relevant objects in our various environments is one of the most ubiquitous tasks the human visual system has to accomplish (Wolfe, 1998). Visual search is guided by a number of separable selective-attention mechanisms that can be categorized as bottom-up driven - guidance by salient physical properties of the current stimuli - or top-down controlled - guidance by observers' "online" knowledge of search-critical object properties (e.g., Liesefeld and Müller, 2019). In addition, observers' expectations based on past experience also play a significant role in goal-directed visual selection. Because sensory environments are typically stable, it is beneficial for the visual system to extract and learn the environmental regularities that are predictive of (the location of) the target stimulus. This perspective article is concerned with one of these predictive mechanisms: statistical context learning of consistent spatial patterns of target and distractor items in visual search. We review recent studies on context learning and its adaptability to incorporate consistent changes, with the aim of providing new directions for the study of the processes involved in the acquisition of search-guiding context memories and their adaptation to consistent contextual changes - from a three-pronged psychological, computational, and neurobiological perspective.
Affiliation(s)
- Thomas Geyer
- Department Psychologie, Ludwig-Maximilians-Universität München, Munich, Germany
- Munich Center for Neurosciences – Brain & Mind, Ludwig-Maximilians-Universität München, Munich, Germany
- Werner Seitz
- Department Psychologie, Ludwig-Maximilians-Universität München, Munich, Germany
- Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität München, Munich, Germany
- Artyom Zinchenko
- Department Psychologie, Ludwig-Maximilians-Universität München, Munich, Germany
- Hermann J. Müller
- Department Psychologie, Ludwig-Maximilians-Universität München, Munich, Germany
- Munich Center for Neurosciences – Brain & Mind, Ludwig-Maximilians-Universität München, Munich, Germany
- Markus Conci
- Department Psychologie, Ludwig-Maximilians-Universität München, Munich, Germany
3. Energetics of stochastic BCM type synaptic plasticity and storing of accurate information. J Comput Neurosci 2021; 49:71-106. PMID: 33528721; PMCID: PMC8046702; DOI: 10.1007/s10827-020-00775-0.
Abstract
Excitatory synaptic signaling in cortical circuits is thought to be metabolically expensive. Two fundamental brain functions, learning and memory, are associated with long-term synaptic plasticity, but we know very little about the energetics of these slow biophysical processes. This study investigates the energy required to store information in plastic synapses for an extended version of BCM plasticity with a decay term, stochastic noise, and a nonlinear dependence of the neuron's firing rate on synaptic current (adaptation). It is shown that synaptic weights in this model exhibit bistability. In order to analyze the system analytically, it is reduced to a simple dynamic mean-field for a population-averaged plastic synaptic current. Next, using the concepts of nonequilibrium thermodynamics, we derive the energy rate (entropy production rate) for plastic synapses and the corresponding Fisher information for coding presynaptic input. That energy, which is of chemical origin, is primarily used for battling fluctuations in the synaptic weights and presynaptic firing rates; it increases steeply with synaptic weight, and more uniformly, though nonlinearly, with presynaptic firing. At the onset of synaptic bistability, Fisher information and memory lifetime both increase sharply, by a few orders of magnitude, but the plasticity energy rate changes only mildly. This implies that a huge gain in the precision of stored information need not cost large amounts of metabolic energy, which suggests that synaptic information is not directly limited by energy consumption. Interestingly, for very weak synaptic noise, such a limit on synaptic coding accuracy is instead imposed by the derivative of the plasticity energy rate with respect to the mean presynaptic firing, and this relationship has a general character independent of the plasticity type. An estimate for primate neocortex reveals that the relative metabolic cost of BCM-type synaptic plasticity, as a fraction of the neuronal cost related to fast synaptic transmission and spiking, can vary from negligible to substantial, depending on the synaptic noise level and presynaptic firing.
4. Brivio S, Ly DRB, Vianello E, Spiga S. Non-linear Memristive Synaptic Dynamics for Efficient Unsupervised Learning in Spiking Neural Networks. Front Neurosci 2021; 15:580909. PMID: 33633531; PMCID: PMC7901913; DOI: 10.3389/fnins.2021.580909.
Abstract
Spiking neural networks (SNNs) are a computational tool in which information is coded into spikes, as in some parts of the brain, unlike conventional neural networks (NNs), which compute over real numbers. SNNs can therefore implement intelligent information extraction in real time at the edge of data acquisition, and they represent a solution complementary to conventional NNs operating in the cloud. Both NN classes face hardware constraints due to limited computing parallelism and the separation of logic and memory. Emerging memory devices, such as resistive switching memories, phase change memories, or memristive devices in general, are strong candidates to remove these hurdles for NN applications. The well-established training procedures of conventional NNs helped define the desiderata for memristive device dynamics implementing synaptic units. The generally agreed requirements are a linear evolution of memristive conductance upon stimulation with trains of identical pulses, and a symmetric conductance change for conductance increase and decrease. Conversely, little work has been done to understand the main properties of memristive devices supporting efficient SNN operation. The reason lies in the lack of a background theory for their training. As a consequence, requirements for NNs have been taken as a reference to develop memristive devices for SNNs. In the present work, we show that, for efficient CMOS/memristive SNNs, the requirements for synaptic memristive dynamics are very different from the needs of a conventional NN. System-level simulations of an SNN trained to classify hand-written digit images through a spike-timing-dependent plasticity protocol are performed considering various linear and non-linear plausible synaptic memristive dynamics. We consider memristive dynamics bounded by artificial hard conductance values and limited by the natural evolution of the dynamics toward asymptotic values (soft boundaries). We quantitatively analyze the impact of the resolution and non-linearity properties of the synapses on network training and classification performance. Finally, we demonstrate that non-linear synapses with hard boundary values enable higher classification performance and realize the best trade-off between classification accuracy and required training time. With reference to the obtained results, we discuss how memristive devices with non-linear dynamics constitute a technologically convenient solution for the development of on-line SNN training.
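The soft-boundary pulse dynamics contrasted with hard bounds above can be sketched as a per-pulse conductance update. This is an illustrative model with assumed parameters (`dg`, `nonlinearity`), not a fit to any specific device; setting `nonlinearity=0` recovers a linear update with hard clipping:

```python
def pulse_update(g, direction, g_min=0.0, g_max=1.0, dg=0.05, nonlinearity=2.0):
    """Conductance change for one identical programming pulse.
    Soft-boundary dynamics: the step shrinks as g approaches the
    asymptote; nonlinearity=0 gives the linear, hard-bounded case."""
    span = g_max - g_min
    if direction > 0:  # potentiating pulse
        g += dg * ((g_max - g) / span) ** nonlinearity
    else:              # depressing pulse
        g -= dg * ((g - g_min) / span) ** nonlinearity
    return min(max(g, g_min), g_max)
```

Repeatedly calling `pulse_update` with identical pulses reproduces the saturating conductance traces that distinguish soft from hard boundaries.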
Affiliation(s)
- Stefano Brivio
- CNR - IMM, Unit of Agrate Brianza, Agrate Brianza, Italy
- Denys R B Ly
- Université Grenoble Alpes, CEA, Leti, Grenoble, France
- Sabina Spiga
- CNR - IMM, Unit of Agrate Brianza, Agrate Brianza, Italy
5. López-Madrona VJ, Matias FS, Mirasso CR, Canals S, Pereda E. Inferring correlations associated to causal interactions in brain signals using autoregressive models. Sci Rep 2019; 9:17041. PMID: 31745163; PMCID: PMC6863873; DOI: 10.1038/s41598-019-53453-2.
Abstract
The specific connectivity of a neuronal network is reflected in the dynamics of the signals recorded on its nodes. Analyzing how the activity in one node predicts the behaviour of another gives the directionality of their relationship. However, each node is composed of many different elements which define the properties of the links. For instance, excitatory and inhibitory neuronal subtypes determine the functionality of the connection. Classic indices such as Granger causality (GC) quantify these interactions, but they do not reveal the mechanism behind them. Here, we introduce an extension of the well-known GC that analyses the correlation associated with the specific influence that a transmitter node has over the receiver. This way, the G-causal link has a positive or negative effect depending on whether the predicted activity follows the dynamics of the sender directly or inversely, respectively. The method is validated in a neuronal population model, testing the paradigm that excitatory and inhibitory neurons have a differential effect on connectivity. Our approach correctly infers the positive or negative coupling produced by different types of neurons. These results suggest that the proposed approach provides additional information for characterizing G-causal connections, which is potentially relevant for understanding interactions in brain circuits.
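The signed-GC idea can be illustrated with a least-squares AR(1) fit: the usual GC value comes from the variance reduction of the full model over the restricted one, and the sign of the influence is read off the cross coefficient. This is a simplified sketch of the concept, not the authors' estimator:

```python
import numpy as np

def signed_gc(x, y, lag=1):
    """Granger influence of x on y with a sign, from an order-1
    least-squares autoregressive fit (conceptual illustration)."""
    Y = y[lag:]
    R = y[:-lag].reshape(-1, 1)                  # restricted: y's own past
    F = np.column_stack([y[:-lag], x[:-lag]])    # full: add x's past
    br = np.linalg.lstsq(R, Y, rcond=None)[0]
    bf = np.linalg.lstsq(F, Y, rcond=None)[0]
    var_r = np.mean((Y - R @ br) ** 2)
    var_f = np.mean((Y - F @ bf) ** 2)
    gc = np.log(var_r / var_f)    # classic Granger causality
    sign = np.sign(bf[1])         # sign of the x -> y coefficient
    return gc, sign
```

On data where x drives y with a negative coupling, `gc` is large and `sign` is negative, matching the inhibitory-versus-excitatory distinction the abstract describes.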
Affiliation(s)
- Fernanda S Matias
- Cognitive Neuroimaging Unit, Commissariat à l'Energie Atomique (CEA), INSERM U992, NeuroSpin Center, 91191, Gif-sur-Yvette, France
- Instituto de Física, Universidade Federal de Alagoas, 57072-970, Maceió, Alagoas, Brazil
- Claudio R Mirasso
- Instituto de Física Interdisciplinar y Sistemas Complejos, IFISC (UIB-CSIC), Campus Universitat de les Illes Balears, E-07122, Palma de Mallorca, Spain
- Santiago Canals
- Instituto de Neurociencias, CSIC-UMH, Sant Joan d'Alacant, 03550, Spain
- Ernesto Pereda
- Departamento de Ingeniería Industrial, Escuela Superior de Ingeniería y Tecnología, IUNE, Universidad de La Laguna, Tenerife, 38205, Spain
- Laboratory of Cognitive and Computational Neuroscience, CTB, UPM, Madrid, Spain
6. Deger M, Seeholzer A, Gerstner W. Multicontact Co-operativity in Spike-Timing-Dependent Structural Plasticity Stabilizes Networks. Cereb Cortex 2019; 28:1396-1415. PMID: 29300903; PMCID: PMC6041941; DOI: 10.1093/cercor/bhx339.
Abstract
Excitatory synaptic connections in the adult neocortex consist of multiple synaptic contacts, almost exclusively formed on dendritic spines. Changes of spine volume, a correlate of synaptic strength, can be tracked in vivo for weeks. Here, we present a combined model of structural and spike-timing-dependent plasticity that explains the multicontact configuration of synapses in adult neocortical networks under steady-state and lesion-induced conditions. Our plasticity rule with Hebbian and anti-Hebbian terms stabilizes both the postsynaptic firing rate and correlations between the pre- and postsynaptic activity at an active synaptic contact. Contacts appear spontaneously at a low rate and disappear if their strength approaches zero. Many presynaptic neurons compete to make strong synaptic connections onto a postsynaptic neuron, whereas the synaptic contacts of a given presynaptic neuron co-operate via postsynaptic firing. We find that co-operation of multiple synaptic contacts is crucial for stable, long-term synaptic memories. In simulations of a simplified network model of barrel cortex, our plasticity rule reproduces whisker-trimming-induced rewiring of thalamocortical and recurrent synaptic connectivity on realistic time scales.
Affiliation(s)
- Moritz Deger
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
- Institute for Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, 50674 Cologne, Germany
- Alexander Seeholzer
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
7. Humble J, Hiratsuka K, Kasai H, Toyoizumi T. Intrinsic Spine Dynamics Are Critical for Recurrent Network Learning in Models With and Without Autism Spectrum Disorder. Front Comput Neurosci 2019; 13:38. PMID: 31263407; PMCID: PMC6585147; DOI: 10.3389/fncom.2019.00038.
Abstract
It is often assumed that Hebbian synaptic plasticity forms a cell assembly, a mutually interacting group of neurons that encodes memory. However, in recurrently connected networks with pure Hebbian plasticity, cell assemblies typically diverge or fade under ongoing changes of synaptic strength. Previously assumed mechanisms that stabilize cell assemblies do not robustly reproduce the experimentally reported unimodal and long-tailed distribution of synaptic strengths. Here, we show that augmenting Hebbian plasticity with experimentally observed intrinsic spine dynamics can stabilize cell assemblies and reproduce the distribution of synaptic strengths. Moreover, we posit that strong intrinsic spine dynamics impair learning performance. Our theory explains how excessively strong spine dynamics, experimentally observed in several animal models of autism spectrum disorder, impair learning associations in the brain.
Affiliation(s)
- James Humble
- Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, Saitama, Japan
- Kazuhiro Hiratsuka
- Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, Saitama, Japan
- Haruo Kasai
- Laboratory of Structural Physiology, Faculty of Medicine, Center for Disease Biology and Integrative Medicine, University of Tokyo, Tokyo, Japan
- Taro Toyoizumi
- Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, Saitama, Japan
8. Brivio S, Conti D, Nair MV, Frascaroli J, Covi E, Ricciardi C, Indiveri G, Spiga S. Extended memory lifetime in spiking neural networks employing memristive synapses with nonlinear conductance dynamics. Nanotechnology 2019; 30:015102. PMID: 30378572; DOI: 10.1088/1361-6528/aae81c.
Abstract
Spiking neural networks (SNNs) employing memristive synapses are capable of life-long online learning. Because of their ability to process and classify large amounts of data in real time using compact and low-power electronic systems, they promise a substantial technology breakthrough. However, the critical issue that memristor-based SNNs have to face is the fundamental limitation of their memory capacity, due to the finite resolution of the synaptic elements, which leads to the replacement of old memories with new ones and to a finite memory lifetime. In this study we demonstrate that the nonlinear conductance dynamics of memristive devices can be exploited to improve the memory lifetime of a network. The network is simulated on the basis of a spiking neuron model of mixed-signal digital-analogue sub-threshold neuromorphic CMOS circuits, and on memristive synapse models derived from the experimental nonlinear conductance dynamics of resistive memory devices stimulated by trains of identical pulses. The network learning circuits implement a spike-based plasticity rule compatible with both spike-timing and rate-based learning rules. To gain insight into the memory lifetime of the network, we analyse the learning dynamics in the context of a classical benchmark of neural network learning, namely hand-written digit classification. In the proposed architecture, the memory lifetime and the performance of the network are improved for memristive synapses with nonlinear dynamics relative to linear synapses of similar resolution. These results demonstrate the importance of holistic approaches that combine the study of theoretical learning models with the development of neuromorphic CMOS SNNs with memristive devices used to implement life-long on-chip learning.
Affiliation(s)
- S Brivio
- CNR-IMM, Unit of Agrate Brianza, via C. Olivetti 2, I-20864 Agrate Brianza, Italy
9. Triplett MA, Avitan L, Goodhill GJ. Emergence of spontaneous assembly activity in developing neural networks without afferent input. PLoS Comput Biol 2018; 14:e1006421. PMID: 30265665; PMCID: PMC6161857; DOI: 10.1371/journal.pcbi.1006421.
Abstract
Spontaneous activity is a fundamental characteristic of the developing nervous system. Intriguingly, it often takes the form of multiple structured assemblies of neurons. Such assemblies can form even in the absence of afferent input, for instance in the zebrafish optic tectum after bilateral enucleation early in life. While the development of neural assemblies based on structured afferent input has been theoretically well-studied, it is less clear how they could arise in systems without afferent input. Here we show that a recurrent network of binary threshold neurons with initially random weights can form neural assemblies based on a simple Hebbian learning rule. Over development the network becomes increasingly modular while being driven by initially unstructured spontaneous activity, leading to the emergence of neural assemblies. Surprisingly, the set of neurons making up each assembly then continues to evolve, despite the number of assemblies remaining roughly constant. In the mature network assembly activity builds over several timesteps before the activation of the full assembly, as recently observed in calcium-imaging experiments. Our results show that Hebbian learning is sufficient to explain the emergence of highly structured patterns of neural activity in the absence of structured input.
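A toy reduction of such a network, binary threshold units with random initial weights, driven only by unstructured noise and updated by a simple Hebbian rule with decay, can be sketched as follows. All parameter values here are illustrative assumptions, not the paper's model:

```python
import numpy as np

def simulate(n=40, steps=500, eta=0.05, theta=0.0, seed=0):
    """Recurrent network of binary threshold units, driven only by
    spontaneous (random) input, with Hebbian weight updates plus a
    mild decay to keep weights bounded (toy sketch)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n, n)) * 0.1   # random initial weights
    np.fill_diagonal(W, 0.0)                # no self-connections
    s = (rng.random(n) < 0.2).astype(float) # initial binary state
    for _ in range(steps):
        noise = rng.standard_normal(n) * 0.5          # spontaneous drive
        s = (W @ s + noise > theta).astype(float)     # threshold update
        # Hebbian: strengthen weights between co-active pairs;
        # decay term prevents unbounded growth.
        W += eta * (np.outer(s, s) - 0.02 * W)
        np.fill_diagonal(W, 0.0)
        W = np.clip(W, -1.0, 1.0)
    return W, s
```

In the full model, running this kind of dynamics long enough makes the weight matrix increasingly modular, with the emergent modules playing the role of assemblies.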
Affiliation(s)
- Marcus A. Triplett
- Queensland Brain Institute, University of Queensland, St Lucia, Queensland, Australia
- School of Mathematics and Physics, University of Queensland, St Lucia, Queensland, Australia
- Lilach Avitan
- Queensland Brain Institute, University of Queensland, St Lucia, Queensland, Australia
- Geoffrey J. Goodhill
- Queensland Brain Institute, University of Queensland, St Lucia, Queensland, Australia
- School of Mathematics and Physics, University of Queensland, St Lucia, Queensland, Australia
10. Kornijcuk V, Lim H, Kim I, Park JK, Lee WS, Choi JH, Choi BJ, Jeong DS. Scalable excitatory synaptic circuit design using floating gate based leaky integrators. Sci Rep 2017; 7:17579. PMID: 29242504; PMCID: PMC5730552; DOI: 10.1038/s41598-017-17889-8.
Abstract
We propose a scalable synaptic circuit realizing spike-timing-dependent plasticity (STDP), compatible with randomly spiking neurons. The feasibility of the circuit was examined by circuit simulation using the BSIM 4.6.0 model. A distinguishing feature of the circuit is the use of floating-gate integrators that provide a compact implementation of biologically plausible relaxation time scales. This relaxation occurs on the basis of charge tunneling, which mainly relies upon area-independent tunnel barrier properties (e.g. barrier width and height) rather than capacitance. The circuit simulations feature (i) weight-dependent STDP that spontaneously limits synaptic weight growth, and (ii) competitive synaptic adaptation within both unsupervised and supervised frameworks with randomly spiking neurons. The estimated power consumption is merely 34 pW, perhaps meeting one of the most crucial principles of neuromorphic engineering: power efficiency. Finally, a means of fine-tuning the STDP behavior is provided.
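Weight-dependent STDP of the kind reported here, where potentiation shrinks as the weight approaches a ceiling and thereby limits growth, can be sketched as a pair-based kernel. This is an illustrative software analogue with assumed amplitudes and time constants, not the transistor-level behaviour of the circuit:

```python
import math

def stdp_dw(dt, w, a_plus=0.01, a_minus=0.012, tau=20.0, w_max=1.0):
    """Weight change for one pre/post spike pair separated by dt (ms,
    positive = pre before post). Potentiation scales with the remaining
    headroom (w_max - w), so weight growth is self-limiting; depression
    scales with the current weight."""
    if dt > 0:   # pre before post -> potentiation
        return a_plus * (w_max - w) * math.exp(-dt / tau)
    else:        # post before pre (or coincident) -> depression
        return -a_minus * w * math.exp(dt / tau)
```

Because the potentiation amplitude vanishes at `w_max`, repeated pairing drives weights toward an equilibrium instead of saturating them all at the ceiling, which is what enables the competitive adaptation described in the abstract.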
Affiliation(s)
- Vladimir Kornijcuk
- Center for Electronic Materials, Korea Institute of Science and Technology, Seoul, 02792, Republic of Korea
- Department of Nanomaterials, University of Science and Technology, Daejeon, 34113, Republic of Korea
- Hyungkwang Lim
- Center for Electronic Materials, Korea Institute of Science and Technology, Seoul, 02792, Republic of Korea
- Inho Kim
- Center for Electronic Materials, Korea Institute of Science and Technology, Seoul, 02792, Republic of Korea
- Jong-Keuk Park
- Center for Electronic Materials, Korea Institute of Science and Technology, Seoul, 02792, Republic of Korea
- Wook-Seong Lee
- Center for Electronic Materials, Korea Institute of Science and Technology, Seoul, 02792, Republic of Korea
- Jung-Hae Choi
- Center for Electronic Materials, Korea Institute of Science and Technology, Seoul, 02792, Republic of Korea
- Byung Joon Choi
- Department of Materials Science and Engineering, Seoul National University of Science and Technology, Seoul, 01811, Republic of Korea
- Doo Seok Jeong
- Center for Electronic Materials, Korea Institute of Science and Technology, Seoul, 02792, Republic of Korea
- Department of Nanomaterials, University of Science and Technology, Daejeon, 34113, Republic of Korea
11. Costa RP, Mizusaki BEP, Sjöström PJ, van Rossum MCW. Functional consequences of pre- and postsynaptic expression of synaptic plasticity. Philos Trans R Soc Lond B Biol Sci 2017; 372:20160153. PMID: 28093547; PMCID: PMC5247585; DOI: 10.1098/rstb.2016.0153.
Abstract
Growing experimental evidence shows that both homeostatic and Hebbian synaptic plasticity can be expressed presynaptically as well as postsynaptically. In this review, we start by discussing this evidence and the methods used to determine expression loci. Next, we discuss the functional consequences of this diversity in pre- and postsynaptic expression of both homeostatic and Hebbian synaptic plasticity. In particular, we explore the functional consequences of a biologically tuned model of pre- and postsynaptically expressed spike-timing-dependent plasticity, complemented with postsynaptic homeostatic control. The pre- and postsynaptic expression in this model predicts (i) more reliable receptive fields and sensory perception, (ii) rapid recovery of forgotten information (memory savings), and (iii) reduced response latencies, compared with a model with postsynaptic expression only. Finally, we discuss open questions that will require a considerable research effort to better elucidate how the specific locus of expression of homeostatic and Hebbian plasticity alters synaptic and network computations. This article is part of the themed issue 'Integrating Hebbian and homeostatic plasticity'.
Affiliation(s)
- Rui Ponte Costa
- Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh, UK
- Centre for Neural Circuits and Behaviour, University of Oxford, Oxford, UK
- Beatriz E P Mizusaki
- Instituto de Física, Universidade Federal do Rio Grande do Sul, Porto Alegre, Brazil
- Centre for Research in Neuroscience, Department of Neurology and Neurosurgery, Program for Brain Repair and Integrative Neuroscience, The Research Institute of the McGill University Health Centre, McGill University, Montreal, Quebec, Canada
- P Jesper Sjöström
- Centre for Research in Neuroscience, Department of Neurology and Neurosurgery, Program for Brain Repair and Integrative Neuroscience, The Research Institute of the McGill University Health Centre, McGill University, Montreal, Quebec, Canada
- Mark C W van Rossum
- Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh, UK
12. Suen JY, Navlakha S. Using Inspiration from Synaptic Plasticity Rules to Optimize Traffic Flow in Distributed Engineered Networks. Neural Comput 2017; 29:1204-1228. DOI: 10.1162/neco_a_00945.
Abstract
Controlling the flow and routing of data is a fundamental problem in many distributed networks, including transportation systems, integrated circuits, and the Internet. In the brain, synaptic plasticity rules have been discovered that regulate network activity in response to environmental inputs, enabling circuits to be stable yet flexible. Here, we develop a new neuro-inspired model for network flow control that depends only on modifying edge weights in an activity-dependent manner. We show how two fundamental plasticity rules, long-term potentiation and long-term depression, can be cast as a distributed gradient descent algorithm for regulating traffic flow in engineered networks. We then characterize, both by simulation and analytically, how different forms of edge-weight-update rules affect network routing efficiency and robustness. We find a close correspondence between certain classes of synaptic weight update rules derived experimentally in the brain and rules commonly used in engineering, suggesting principles common to both.
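The LTP/LTD analogy for flow control can be illustrated with an AIMD-style edge-rate update: under-used edges are "potentiated" additively, congested edges are "depressed" multiplicatively. This is a sketch of the analogy with hypothetical parameters, not the paper's exact algorithm:

```python
def update_edge_rates(rates, loads, capacity, up=0.1, down=0.5):
    """One round of neuro-inspired edge-rate control.
    rates: current sending rate per edge; loads: observed traffic per
    edge; capacity: congestion threshold (toy, scalar for simplicity)."""
    new = []
    for r, load in zip(rates, loads):
        if load < capacity:
            new.append(r + up)           # LTP-like: additive increase
        else:
            new.append(r * (1 - down))   # LTD-like: multiplicative back-off
    return new
```

Iterating this rule across all edges implements the distributed, purely local weight adjustment that the abstract casts as gradient descent on traffic flow.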
Affiliation(s)
- Jonathan Y. Suen
- Duke University, Department of Electrical and Computer Engineering, Durham, NC 27708, U.S.A.
- Saket Navlakha
- Salk Institute for Biological Studies, Integrative Biology Laboratory, La Jolla, CA 92037, U.S.A.
13. Zenke F, Gerstner W, Ganguli S. The temporal paradox of Hebbian learning and homeostatic plasticity. Curr Opin Neurobiol 2017; 43:166-176. DOI: 10.1016/j.conb.2017.03.015.
14. Lajoie G, Krouchev NI, Kalaska JF, Fairhall AL, Fetz EE. Correlation-based model of artificially induced plasticity in motor cortex by a bidirectional brain-computer interface. PLoS Comput Biol 2017; 13:e1005343. PMID: 28151957; PMCID: PMC5313237; DOI: 10.1371/journal.pcbi.1005343.
Abstract
Experiments show that spike-triggered stimulation performed with bidirectional brain-computer interfaces (BBCIs) can artificially strengthen connections between separate neural sites in motor cortex (MC). When spikes from a neuron recorded at one MC site trigger stimuli at a second target site after a fixed delay, the connections between the sites eventually strengthen. It was also found that effective spike-stimulus delays are consistent with experimentally derived spike-timing-dependent plasticity (STDP) rules, suggesting that STDP is key to driving these changes. However, the impact of STDP at the level of circuits, and the mechanisms governing its modification with neural implants, remain poorly understood. The present work describes a recurrent neural network model with probabilistic spiking mechanisms and plastic synapses capable of capturing both the neural and synaptic activity statistics relevant to BBCI conditioning protocols. Our model successfully reproduces key experimental results, both established and new, and offers mechanistic insights into spike-triggered conditioning. Using analytical calculations and numerical simulations, we derive optimal operational regimes for BBCIs and formulate predictions concerning the efficacy of spike-triggered conditioning in different regimes of cortical activity.
Affiliation(s)
- Guillaume Lajoie
- University of Washington Institute for Neuroengineering, University of Washington, Seattle, WA, USA
- John F. Kalaska
- Groupe de recherche sur le système nerveux central, Département de neurosciences, Université de Montreal, Montreal, QC, Canada
- Adrienne L. Fairhall
- University of Washington Institute for Neuroengineering, University of Washington, Seattle, WA, USA
- Dept. of Physiology and Biophysics, University of Washington, Seattle, WA, USA
- Dept. of Physics, University of Washington, Seattle, WA, USA
- Eberhard E. Fetz
- University of Washington Institute for Neuroengineering, University of Washington, Seattle, WA, USA
- Dept. of Physiology and Biophysics, University of Washington, Seattle, WA, USA
15
Robinson BS, Berger TW, Song D. Identification of Stable Spike-Timing-Dependent Plasticity from Spiking Activity with Generalized Multilinear Modeling. Neural Comput 2016; 28:2320-2351. [PMID: 27557101 DOI: 10.1162/neco_a_00883] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Characterization of long-term activity-dependent plasticity from behaviorally driven spiking activity is important for understanding the underlying mechanisms of learning and memory. In this letter, we present a computational framework for quantifying spike-timing-dependent plasticity (STDP) during behavior by identifying a functional plasticity rule solely from spiking activity. First, we formulate a flexible point-process spiking neuron model structure with STDP, which includes functions that characterize the stationary and plastic properties of the neuron. The STDP model includes a novel function for prolonged plasticity induction, as well as a more typical function for synaptic weight change based on the relative timing of input-output spike pairs. Consideration for system stability is incorporated with weight-dependent synaptic modification. Next, we formalize an estimation technique using a generalized multilinear model (GMLM) structure with basis function expansion. The weight-dependent synaptic modification adds a nonlinearity to the model, which is addressed with an iterative unconstrained optimization approach. Finally, we demonstrate successful model estimation on simulated spiking data and show that all model functions can be estimated accurately with this method across a variety of simulation parameters, such as number of inputs, output firing rate, input firing type, and simulation time. Since this approach requires only naturally generated spikes, it can be readily applied to behaving animal studies to characterize the underlying mechanisms of learning and memory.
Affiliation(s)
- Brian S Robinson
- Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089, U.S.A.
- Theodore W Berger
- Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089, U.S.A.
- Dong Song
- Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089, U.S.A.
16
Vogt SM, Hofmann UG. Balancing the critical period of spiking neurons with attractor-less STDP. BMC Neurosci 2015. [PMCID: PMC4697491 DOI: 10.1186/1471-2202-16-s1-p103] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2022] Open
17
Yger P, Gilson M. Models of Metaplasticity: A Review of Concepts. Front Comput Neurosci 2015; 9:138. [PMID: 26617512 PMCID: PMC4639700 DOI: 10.3389/fncom.2015.00138] [Citation(s) in RCA: 55] [Impact Index Per Article: 6.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/02/2015] [Accepted: 10/27/2015] [Indexed: 11/16/2022] Open
Abstract
Part of hippocampal and cortical plasticity is characterized by synaptic modifications that depend on the joint activity of the pre- and post-synaptic neurons. To what extent those changes are determined by the exact spike timing versus the average firing rates is still a matter of debate; this may vary from brain area to brain area, as well as across neuron types. However, it has been robustly observed both in vitro and in vivo that plasticity itself slowly adapts as a function of the dynamical context, a phenomenon commonly referred to as metaplasticity. An alternative concept considers the regulation of groups of synapses with an objective at the neuronal level, for example, maintaining a given average firing rate. In that case, the change in the strength of a particular synapse of the group (e.g., due to Hebbian learning) affects the others' strengths, which has been termed heterosynaptic plasticity. Classically, Hebbian synaptic plasticity is paired in neuron network models with such mechanisms in order to stabilize the activity and/or the weight structure. Here, we present an oriented review that brings together various concepts from heterosynaptic plasticity to metaplasticity, and show how they interact with Hebbian-type learning. We focus on approaches that are nowadays used to incorporate those mechanisms into state-of-the-art models of spiking plasticity inspired by experimental observations in the hippocampus and cortex. Making the point that metaplasticity is a ubiquitous mechanism acting on top of classical Hebbian learning and promoting the stability of neural function over multiple timescales, we stress the need to incorporate it as a key element in the framework of plasticity models. Bridging theoretical and experimental results suggests a more functional role for metaplasticity mechanisms than simply stabilizing neural activity.
Affiliation(s)
- Pierre Yger
- Sorbonne Université, UPMC Univ Paris 06, UMRS 968, Paris, France; Institut de la Vision, INSERM, U968, Centre National de la Recherche Scientifique, UMR 7210, Paris, France
- Matthieu Gilson
- Computational Neurosciences Group, Departament de Tecnologies de la Informació i les Comunicacions, Universitat Pompeu Fabra, Barcelona, Spain
18
Self-Organized Near-Zero-Lag Synchronization Induced by Spike-Timing Dependent Plasticity in Cortical Populations. PLoS One 2015; 10:e0140504. [PMID: 26474165 PMCID: PMC4608682 DOI: 10.1371/journal.pone.0140504] [Citation(s) in RCA: 21] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2015] [Accepted: 09/21/2015] [Indexed: 12/30/2022] Open
Abstract
Several cognitive tasks related to learning and memory exhibit synchronization of macroscopic cortical areas together with synaptic plasticity at the neuronal level. Therefore, there is a growing effort among computational neuroscientists to understand the underlying mechanisms relating synchrony and plasticity in the brain. Here we numerically study the interplay between spike-timing dependent plasticity (STDP) and anticipated synchronization (AS). AS emerges when a dominant flux of information from one area to another is accompanied by a negative time lag (or phase), meaning that the receiver region pulses before the sender does. In this paper we study the interplay between different synchronization regimes and STDP at the level of three-neuron microcircuits as well as cortical populations. We show that STDP can promote self-organized zero-lag synchronization in unidirectionally coupled neuronal populations. We also find synchronization regimes with negative phase difference (AS) that are stable against plasticity. Finally, we show that the interplay between negative phase difference and STDP yields a bounded synaptic weight distribution without the need to impose artificial boundaries.
19
Qiao N, Mostafa H, Corradi F, Osswald M, Stefanini F, Sumislawska D, Indiveri G. A reconfigurable on-line learning spiking neuromorphic processor comprising 256 neurons and 128K synapses. Front Neurosci 2015; 9:141. [PMID: 25972778 PMCID: PMC4413675 DOI: 10.3389/fnins.2015.00141] [Citation(s) in RCA: 159] [Impact Index Per Article: 17.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/21/2014] [Accepted: 04/06/2015] [Indexed: 11/13/2022] Open
Abstract
Implementing compact, low-power artificial neural processing systems with real-time on-line learning abilities is still an open challenge. In this paper we present a full-custom mixed-signal VLSI device with neuromorphic learning circuits that emulate the biophysics of real spiking neurons and dynamic synapses, for exploring the properties of computational neuroscience models and for building brain-inspired computing systems. The proposed architecture allows the on-chip configuration of a wide range of network connectivities, including recurrent and deep networks, with short-term and long-term plasticity. The device comprises 128K analog synapse and 256 neuron circuits with biologically plausible dynamics and bi-stable spike-based plasticity mechanisms that endow it with on-line learning abilities. In addition to the analog circuits, the device also comprises asynchronous digital logic circuits for setting different synapse and neuron properties as well as different network configurations. This prototype device, fabricated using a 180 nm 1P6M CMOS process, occupies an area of 51.4 mm² and consumes approximately 4 mW in typical experiments, for example those involving attractor networks. Here we describe the details of the overall architecture and of the individual circuits, and present experimental results that showcase its potential. By supporting a wide range of cortical-like computational modules comprising plasticity mechanisms, this device will enable the realization of intelligent autonomous systems with on-line learning capabilities.
Affiliation(s)
- Ning Qiao
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Hesham Mostafa
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Federico Corradi
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Marc Osswald
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Fabio Stefanini
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Dora Sumislawska
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Giacomo Indiveri
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
20
Higgins D, Graupner M, Brunel N. Memory maintenance in synapses with calcium-based plasticity in the presence of background activity. PLoS Comput Biol 2014; 10:e1003834. [PMID: 25275319 PMCID: PMC4183374 DOI: 10.1371/journal.pcbi.1003834] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/21/2014] [Accepted: 07/28/2014] [Indexed: 11/19/2022] Open
Abstract
Most models of learning and memory assume that memories are maintained in neuronal circuits by persistent synaptic modifications induced by specific patterns of pre- and postsynaptic activity. For this scenario to be viable, synaptic modifications must survive the ubiquitous ongoing activity present in neural circuits in vivo. In this paper, we investigate the time scales of memory maintenance in a calcium-based synaptic plasticity model that has recently been shown to fit different experimental datasets from hippocampal and neocortical preparations. We find that in the presence of background activity on the order of 1 Hz, parameters that fit data from neocortical layer 5 pyramidal neurons lead to a very fast decay of synaptic efficacy, with time scales of minutes. We then identify two ways in which this memory time scale can be extended: (i) the extracellular calcium concentration in the experiments used to fit the model is larger than estimated concentrations in vivo, and lowering the extracellular calcium concentration to in vivo levels increases memory time scales by several orders of magnitude; (ii) adding a bistability mechanism, so that each synapse has two stable states at sufficiently low background activity, further boosts the memory time scale, since memory decay is no longer described by an exponential relaxation from an initial state, but by an escape from a potential well. We argue that both features are expected to be present in synapses in vivo. These results are obtained first in a single synapse connecting two independent Poisson neurons, and then in simulations of a large network of excitatory and inhibitory integrate-and-fire neurons. Our results emphasise the need for studying plasticity at physiological extracellular calcium concentration, and highlight the role of synaptic bi- or multistability in the stability of learned synaptic structures.
Synaptic plasticity is widely believed to be the main mechanism underlying learning and memory. In recent years, several mathematical plasticity rules have been shown to satisfactorily fit a wide range of experimental data from hippocampal and neocortical in vitro preparations. In particular, a model in which plasticity is driven by the postsynaptic calcium concentration was shown to successfully reproduce how synaptic changes depend on spike timing, specific spike patterns, and firing rate. The advantage of calcium-based rules is the possibility of predicting how changes in extracellular concentrations will affect plasticity. This is particularly significant given that in vitro studies are typically done at higher concentrations than those measured in vivo. Using such a rule, with parameters fitted to in vitro data, we explore how long the memory of a particular synaptic change can be maintained in the presence of background neuronal activity, which is ubiquitously observed in cortex. We find that memory time scales increase by several orders of magnitude when calcium concentrations are lowered from those of typical in vitro experiments to in vivo levels. Furthermore, we find that synaptic bistability further extends the memory time scale, and we estimate that synaptic changes in vivo could be stable on the scale of weeks to months.
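To make the mechanism concrete, here is a minimal sketch of a threshold-based calcium rule of the kind this entry analyses: calcium jumps at pre- and postsynaptic spikes and decays exponentially, and the synaptic efficacy is depressed or potentiated while calcium sits above the corresponding threshold. All parameter values below are illustrative placeholders, not the paper's fitted model.

```python
# Illustrative placeholder parameters (not the paper's fitted values)
TAU_CA = 20.0                   # ms, calcium decay time constant
C_PRE, C_POST = 1.0, 2.0        # calcium jump per pre-/postsynaptic spike
THETA_D, THETA_P = 1.0, 1.3     # depression / potentiation thresholds
GAMMA_D, GAMMA_P = 0.01, 0.02   # depression / potentiation rates (per ms)

def run_calcium_rule(pre_spikes, post_spikes, w=0.5, t_max=200.0, dt=0.1):
    """Evolve efficacy w: depress while calcium > THETA_D,
    potentiate (soft-bounded toward 1) while calcium > THETA_P."""
    ca = 0.0
    pre, post = sorted(pre_spikes), sorted(post_spikes)
    i = j = 0
    for step in range(int(t_max / dt)):
        t = step * dt
        while i < len(pre) and pre[i] <= t:    # presynaptic spike arrives
            ca += C_PRE; i += 1
        while j < len(post) and post[j] <= t:  # postsynaptic spike fires
            ca += C_POST; j += 1
        if ca > THETA_P:
            w += GAMMA_P * (1.0 - w) * dt
        elif ca > THETA_D:
            w -= GAMMA_D * w * dt
        ca -= ca * dt / TAU_CA                 # exponential calcium decay
    return w

# With these placeholder thresholds, an isolated presynaptic spike never
# crosses THETA_D, while a pre-then-post pairing drives net potentiation.
w_pre_only = run_calcium_rule([10.0], [])
w_paired = run_calcium_rule([10.0], [12.0])
```

Lowering the calcium jumps (mimicking lower extracellular calcium) or adding a bistable drift term to `w` would then lengthen memory retention, in the spirit of the two mechanisms the entry identifies.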
Affiliation(s)
- David Higgins
- IBENS, École Normale Supérieure, Paris, France
- Departments of Statistics and Neurobiology, University of Chicago, Chicago, Illinois, United States of America
- Michael Graupner
- Center for Neural Science, New York University, New York, New York, United States of America
- Nicolas Brunel
- Departments of Statistics and Neurobiology, University of Chicago, Chicago, Illinois, United States of America
21
Interplay between short- and long-term plasticity in cell-assembly formation. PLoS One 2014; 9:e101535. [PMID: 25007209 PMCID: PMC4090127 DOI: 10.1371/journal.pone.0101535] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/23/2013] [Accepted: 06/08/2014] [Indexed: 11/19/2022] Open
Abstract
Various hippocampal and neocortical synapses of mammalian brain show both short-term plasticity and long-term plasticity, which are considered to underlie learning and memory by the brain. According to Hebb’s postulate, synaptic plasticity encodes memory traces of past experiences into cell assemblies in cortical circuits. However, it remains unclear how the various forms of long-term and short-term synaptic plasticity cooperatively create and reorganize such cell assemblies. Here, we investigate the mechanism in which the three forms of synaptic plasticity known in cortical circuits, i.e., spike-timing-dependent plasticity (STDP), short-term depression (STD) and homeostatic plasticity, cooperatively generate, retain and reorganize cell assemblies in a recurrent neuronal network model. We show that multiple cell assemblies generated by external stimuli can survive noisy spontaneous network activity for an adequate range of the strength of STD. Furthermore, our model predicts that a symmetric temporal window of STDP, such as observed in dopaminergic modulations on hippocampal neurons, is crucial for the retention and integration of multiple cell assemblies. These results may have implications for the understanding of cortical memory processes.
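For readers unfamiliar with short-term depression, here is a minimal sketch of a standard resource-based STD model (in the spirit of Tsodyks–Markram dynamics; the parameter values are illustrative, not those of the paper):

```python
import math

def std_psp_amplitudes(spike_times, U=0.5, tau_rec=200.0):
    """Relative PSP amplitudes under short-term depression.

    Each presynaptic spike releases a fraction U of the available
    resources x; x recovers exponentially toward 1 with time constant
    tau_rec (ms). The amplitude of spike k is U * x(t_k).
    """
    x, last_t, amps = 1.0, None, []
    for t in spike_times:
        if last_t is not None:
            # exponential recovery of resources since the last spike
            x = 1.0 - (1.0 - x) * math.exp(-(t - last_t) / tau_rec)
        amps.append(U * x)   # response scales with available resources
        x -= U * x           # a fraction U of resources is consumed
        last_t = t
    return amps

# A fast train (10 ms inter-spike intervals) depresses: successive
# amplitudes shrink, weakening sustained drive within an assembly.
amps = std_psp_amplitudes([0.0, 10.0, 20.0, 30.0])
```

This is the basic effect the entry builds on: STD weakens sustained recurrent excitation, which limits runaway assembly activity during spontaneous dynamics.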
22
Zheng Y, Schwabe L. Shaping synaptic learning by the duration of postsynaptic action potential in a new STDP model. PLoS One 2014; 9:e88592. [PMID: 24551122 PMCID: PMC3925143 DOI: 10.1371/journal.pone.0088592] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/03/2012] [Accepted: 01/13/2014] [Indexed: 12/04/2022] Open
Abstract
Single spikes and their timing matter in changing synaptic efficacy, a phenomenon known as spike-timing-dependent plasticity (STDP). Most previous studies treated spikes as all-or-none events and considered their duration and magnitude negligible. Here we explore the effects of action potential (AP) duration on synaptic plasticity in a simplified model neuron using computer simulations. We propose a novel STDP model (dSTDP) that depresses synapses using an AP-duration-dependent LTD window and potentiates synaptic strength when presynaptic spikes arrive before and during a postsynaptic AP. We demonstrate that AP duration is another key factor in desensitizing postsynaptic firing and in controlling the shape of the synaptic weight distribution. Extended AP durations produce a wide unimodal weight distribution that resembles those reported experimentally, and leave the postsynaptic neuron tranquil when disturbed by Poisson noise spike trains while remaining equally sensitive to synchronized input. Our results suggest that the impact of AP duration, modeled here as an AP-dependent STDP window, on synaptic plasticity can be dramatic and should motivate future STDP studies.
Affiliation(s)
- Youwei Zheng
- Faculty of Computer Science and Electrical Engineering, University of Rostock, Rostock, Germany
- Lars Schwabe
- Faculty of Computer Science and Electrical Engineering, University of Rostock, Rostock, Germany
23
Sinha DB, Ledbetter NM, Barbour DL. Spike-timing computation properties of a feed-forward neural network model. Front Comput Neurosci 2014; 8:5. [PMID: 24478688 PMCID: PMC3904091 DOI: 10.3389/fncom.2014.00005] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/04/2013] [Accepted: 01/09/2014] [Indexed: 11/13/2022] Open
Abstract
Brain function is characterized by dynamical interactions among networks of neurons. These interactions are mediated by network topology at many scales ranging from microcircuits to brain areas. Understanding how networks operate can be aided by understanding how the transformation of inputs depends upon network connectivity patterns, e.g., serial and parallel pathways. To tractably determine how single synapses or groups of synapses in such pathways shape these transformations, we modeled feed-forward networks of 7–22 neurons in which synaptic strength changed according to a spike-timing dependent plasticity (STDP) rule. We investigated how activity varied when dynamics were perturbed by an activity-dependent electrical stimulation protocol (spike-triggered stimulation; STS) in networks of different topologies and background input correlations. STS can successfully reorganize functional brain networks in vivo, but with a variability in effectiveness that may derive partially from the underlying network topology. In a simulated network with a single disynaptic pathway driven by uncorrelated background activity, structured spike-timing relationships between polysynaptically connected neurons were not observed. When background activity was correlated or parallel disynaptic pathways were added, however, robust polysynaptic spike timing relationships were observed, and application of STS yielded predictable changes in synaptic strengths and spike-timing relationships. These observations suggest that precise input-related or topologically induced temporal relationships in network activity are necessary for polysynaptic signal propagation. Such constraints for polysynaptic computation suggest potential roles for higher-order topological structure in network organization, such as maintaining polysynaptic correlation in the face of relatively weak synapses.
Affiliation(s)
- Drew B Sinha
- Laboratory of Sensory Neuroscience and Neuroengineering, Department of Biomedical Engineering, Washington University in St. Louis, St. Louis, MO, USA
- Noah M Ledbetter
- Laboratory of Sensory Neuroscience and Neuroengineering, Department of Biomedical Engineering, Washington University in St. Louis, St. Louis, MO, USA
- Dennis L Barbour
- Laboratory of Sensory Neuroscience and Neuroengineering, Department of Biomedical Engineering, Washington University in St. Louis, St. Louis, MO, USA
24
Zenke F, Hennequin G, Gerstner W. Synaptic plasticity in neural networks needs homeostasis with a fast rate detector. PLoS Comput Biol 2013; 9:e1003330. [PMID: 24244138 PMCID: PMC3828150 DOI: 10.1371/journal.pcbi.1003330] [Citation(s) in RCA: 91] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/19/2013] [Accepted: 09/25/2013] [Indexed: 01/17/2023] Open
Abstract
Hebbian changes of excitatory synapses are driven by, and further enhance, correlations between pre- and postsynaptic activities. Hence, Hebbian plasticity forms a positive feedback loop that can lead to instability in simulated neural networks. To keep activity at healthy, low levels, plasticity must therefore incorporate homeostatic control mechanisms. We find in numerical simulations of recurrent networks with a realistic triplet-based spike-timing-dependent plasticity rule (triplet STDP) that homeostasis has to detect rate changes on a timescale of seconds to minutes to keep the activity stable. We confirm this result in a generic mean-field formulation of network activity and homeostatic plasticity. Our results strongly suggest the existence of a homeostatic regulatory mechanism that reacts to firing rate changes on the order of seconds to minutes.
Learning and memory in the brain are thought to be mediated through Hebbian plasticity: when a group of neurons is repetitively active together, their connections get strengthened. This can cause co-activation even in the absence of the stimulus that triggered the change. To avoid runaway behavior it is important to prevent neurons from forming excessively strong connections. This is achieved by regulatory homeostatic mechanisms that constrain the overall activity. Here we study the stability of background activity in a recurrent network model with a plausible Hebbian learning rule and homeostasis. We find that the activity in our model is unstable unless homeostasis reacts to rate changes on a timescale of minutes or faster. Since this timescale is incompatible with most known forms of homeostasis, this implies the existence of a previously unknown, rapid homeostatic regulatory mechanism capable of either gating the rate of plasticity or otherwise affecting synaptic efficacies on a short timescale.
Affiliation(s)
- Friedemann Zenke
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, Ecole polytechnique fédérale de Lausanne, Lausanne, Switzerland
- Guillaume Hennequin
- Computational and Biological Learning Laboratory, Department of Engineering, University of Cambridge, Cambridge, United Kingdom
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, Ecole polytechnique fédérale de Lausanne, Lausanne, Switzerland
25
The interplay between STDP rules and anticipated synchronization in the organization of neuronal networks. BMC Neurosci 2013. [PMCID: PMC3704836 DOI: 10.1186/1471-2202-14-s1-p71] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022] Open
26
Srinivasa N, Jiang Q. Stable learning of functional maps in self-organizing spiking neural networks with continuous synaptic plasticity. Front Comput Neurosci 2013; 7:10. [PMID: 23450808 PMCID: PMC3583036 DOI: 10.3389/fncom.2013.00010] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/05/2012] [Accepted: 02/09/2013] [Indexed: 11/13/2022] Open
Abstract
This study describes a spiking model that self-organizes for stable formation and maintenance of orientation and ocular dominance maps in the visual cortex (V1). This self-organization process simulates three development phases: an early experience-independent phase, a late experience-independent phase, and a subsequent refinement phase during which experience acts to shape the map properties. The ocular dominance maps that emerge accommodate the two sets of monocular inputs that arise from the lateral geniculate nucleus (LGN) to layer 4 of V1. The orientation selectivity maps that emerge feature well-developed iso-orientation domains and fractures. During the last two phases of development, the orientation preferences at some locations appear to rotate continuously through ±180° along circular paths, forming pinwheel-like patterns but without any corresponding point discontinuities in the orientation gradient maps. The formation of these functional maps is driven by balanced excitatory and inhibitory currents that are established via spike-timing-based synaptic plasticity at both excitatory and inhibitory synapses. The stability and maintenance of the formed maps under continuous synaptic plasticity are enabled by homeostasis caused by inhibitory plasticity. However, prolonged exposure to repeated stimuli does alter the formed maps over time due to plasticity. The results from this study suggest that continuous synaptic plasticity in both excitatory neurons and interneurons could play a critical role in the formation, stability, and maintenance of functional maps in the cortex.
Affiliation(s)
- Narayan Srinivasa
- Center for Neural and Emergent Systems, HRL Laboratories LLC, Malibu, CA, USA
27
van Rossum MCW, Shippi M, Barrett AB. Soft-bound synaptic plasticity increases storage capacity. PLoS Comput Biol 2012; 8:e1002836. [PMID: 23284281 PMCID: PMC3527223 DOI: 10.1371/journal.pcbi.1002836] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/30/2012] [Accepted: 10/24/2012] [Indexed: 12/02/2022] Open
Abstract
Accurate models of synaptic plasticity are essential to understand the adaptive properties of the nervous system and for realistic models of learning and memory. Experiments have shown that synaptic plasticity depends not only on pre- and post-synaptic activity patterns, but also on the strength of the connection itself: weaker synapses are more easily strengthened than already strong ones. This so-called soft-bound plasticity automatically constrains the synaptic strengths. It is known that this has important consequences for the dynamics of plasticity and the synaptic weight distribution, but its impact on information storage is unknown. In this modeling study we introduce an information-theoretic framework to analyse memory storage in an online learning setting. We show that soft-bound plasticity increases a variety of performance criteria by about 18% over hard-bound plasticity, and likely maximizes the storage capacity of synapses.
It is generally believed that our memories are stored in the synaptic connections between neurons. Numerous experimental studies have therefore examined when and how the synaptic connections change. In parallel, many computational studies have examined the properties of memory and synaptic plasticity, aiming to better understand human memory and allow for neural network models of the brain. However, the plasticity rules used in most studies are highly simplified and do not take into account the rich behaviour found in experiments. For instance, it has been observed in experiments that it is hard to make strong synapses even stronger. Here we show that this saturation of plasticity enhances the number of memories that can be stored, and we introduce a general framework to calculate information storage in online learning paradigms.
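The contrast between hard-bound and soft-bound weight updates that this entry refers to can be sketched in a few lines (a generic illustration of the update rules, not the paper's information-theoretic framework):

```python
def hard_bound_update(w, dw, w_max=1.0):
    """Additive update with hard clipping to [0, w_max]."""
    return min(max(w + dw, 0.0), w_max)

def soft_bound_update(w, dw, w_max=1.0):
    """Weight-dependent update: potentiation scales with the remaining
    headroom (w_max - w), depression scales with w itself, so weights
    saturate smoothly instead of piling up at the bounds."""
    if dw > 0:
        return w + dw * (w_max - w)
    return w + dw * w

# Under soft bounds, a weak synapse potentiates more than a strong one
# for the same plasticity event dw.
gain_weak = soft_bound_update(0.1, 0.1) - 0.1
gain_strong = soft_bound_update(0.9, 0.1) - 0.9
```

The hard-bound rule produces weight distributions that cluster at 0 and w_max, while the soft-bound rule yields graded, unimodal distributions; the entry's result is that the latter regime also stores more information per synapse.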
Affiliation(s)
- Mark C W van Rossum
- Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh, United Kingdom.
28
Bamford SA, Murray AF, Willshaw DJ. Spike-timing-dependent plasticity with weight dependence evoked from physical constraints. IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS 2012; 6:385-398. [PMID: 23853183 DOI: 10.1109/tbcas.2012.2184285] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/02/2023]
Abstract
Analogue and mixed-signal VLSI implementations of Spike-Timing-Dependent Plasticity (STDP) are reviewed. A circuit is presented with a compact implementation of STDP suitable for parallel integration in large synaptic arrays. In contrast to previously published circuits, it uses the limitations of the silicon substrate to achieve various forms and degrees of weight dependence of STDP. It also uses reverse-biased transistors to reduce leakage from a capacitance representing weight. Chip results are presented showing: various ways in which the learning rule may be shaped; how synaptic weights may retain some indication of their learned values over periods of minutes; and how distributions of weights for synapses convergent on single neurons may shift between more or less extreme bimodality according to the strength of correlational cues in their inputs.
Affiliation(s)
- Simeon A Bamford
- Neuroinformatics Doctoral Training Centre, University of Edinburgh, Edinburgh, Scotland EH8 9AB, UK.
29
Abstract
Long-term synaptic plasticity requires postsynaptic influx of Ca²⁺ and is accompanied by changes in dendritic spine size. Unless Ca²⁺ influx mechanisms and spine volume scale proportionally, changes in spine size will modify spine Ca²⁺ concentrations during subsequent synaptic activation. We show that the relationship between Ca²⁺ influx and spine volume is a fundamental determinant of synaptic stability. If Ca²⁺ influx is undercompensated for increases in spine size, then strong synapses are stabilized and synaptic strength distributions have a single peak. In contrast, overcompensation of Ca²⁺ influx leads to binary, persistent synaptic strengths with double-peaked distributions. Biophysical simulations predict that CA1 pyramidal neuron spines are undercompensating. This unifies experimental findings that weak synapses are more plastic than strong synapses, that synaptic strengths are unimodally distributed, and that potentiation saturates for a given stimulus strength. We conclude that structural plasticity provides a simple, local, and general mechanism that allows dendritic spines to foster both rapid memory formation and persistent memory storage.
30
Basalyga G, Gleiser PM, Wennekers T. Emergence of small-world structure in networks of spiking neurons through STDP plasticity. ADVANCES IN EXPERIMENTAL MEDICINE AND BIOLOGY 2011; 718:33-9. [PMID: 21744208 DOI: 10.1007/978-1-4614-0164-3_4] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/09/2023]
Abstract
In this work, we use a complex network approach to investigate how a neural network structure changes under synaptic plasticity. In particular, we consider a network of conductance-based, single-compartment integrate-and-fire excitatory and inhibitory neurons. Initially the neurons are connected randomly with uniformly distributed synaptic weights. The weights of excitatory connections can be strengthened or weakened during spiking activity by the mechanism known as spike-timing-dependent plasticity (STDP). We extract a binary directed connection matrix by thresholding the weights of the excitatory connections at every simulation step and calculate its major topological characteristics such as the network clustering coefficient, characteristic path length and small-world index. We numerically demonstrate that, under certain conditions, a nontrivial small-world structure can emerge from a random initial network subject to STDP learning.
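The analysis pipeline described here (threshold the excitatory weights into a binary graph, then compute the clustering coefficient and characteristic path length) can be sketched as below. For simplicity this sketch symmetrizes the thresholded matrix and uses the undirected definitions of both metrics, whereas the paper works with a directed connection matrix.

```python
import numpy as np
from collections import deque

def small_world_stats(weights, threshold):
    """Threshold a weight matrix into a binary graph and return
    (mean clustering coefficient, characteristic path length)."""
    a = np.abs(weights) > threshold
    a = a | a.T                      # symmetrize: undirected simplification
    np.fill_diagonal(a, False)
    n = len(a)
    cs = []                          # local clustering coefficients
    for i in range(n):
        nb = np.flatnonzero(a[i])
        k = len(nb)
        if k >= 2:
            links = a[np.ix_(nb, nb)].sum() / 2   # edges among neighbours
            cs.append(links / (k * (k - 1) / 2))
    c = float(np.mean(cs)) if cs else 0.0
    dists = []                       # all finite shortest-path lengths (BFS)
    for s in range(n):
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in np.flatnonzero(a[u]):
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        dists.extend(d for node, d in dist.items() if node != s)
    path_len = float(np.mean(dists)) if dists else float("inf")
    return c, path_len

# Example: a ring lattice (each node linked to its 2 nearest neighbours
# on each side) is highly clustered, with C = 0.5 exactly.
n = 20
w = np.zeros((n, n))
for i in range(n):
    for d in (1, 2):
        w[i, (i + d) % n] = 1.0
        w[i, (i - d) % n] = 1.0
c, path_len = small_world_stats(w, threshold=0.5)
```

A small-world index would then compare these values against the same metrics for a degree-matched random graph.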
Affiliation(s)
- Gleb Basalyga
- Centre for Robotics and Neural Systems (CRNS), University of Plymouth, Plymouth, PL4 8AA, UK.
31
Zheng Y, Schwabe L. Robustness of STDP-induced memory to perturbations of presynaptic activity: a simulation study. BMC Neurosci 2011. [PMCID: PMC3240401 DOI: 10.1186/1471-2202-12-s1-p290] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 04/03/2023] Open
32
Gilson M, Masquelier T, Hugues E. STDP allows fast rate-modulated coding with Poisson-like spike trains. PLoS Comput Biol 2011; 7:e1002231. [PMID: 22046113 PMCID: PMC3203056 DOI: 10.1371/journal.pcbi.1002231] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2011] [Accepted: 09/01/2011] [Indexed: 11/18/2022] Open
Abstract
Spike timing-dependent plasticity (STDP) has been shown to enable single neurons to detect repeatedly presented spatiotemporal spike patterns. This holds even when such patterns are embedded in equally dense random spiking activity, that is, in the absence of external reference times such as a stimulus onset. Here we demonstrate, both analytically and numerically, that STDP can also learn repeating rate-modulated patterns, which have received more experimental evidence, for example, through post-stimulus time histograms (PSTHs). Each input spike train is generated from a rate function using a stochastic sampling mechanism, chosen to be an inhomogeneous Poisson process here. Learning is feasible provided significant covarying rate modulations occur within the typical timescale of STDP (∼10–20 ms) for sufficiently many inputs (∼100 among 1000 in our simulations), a condition that is met by many experimental PSTHs. Repeated pattern presentations induce spike-time correlations that are captured by STDP. Despite imprecise input spike times and even variable spike counts, a single trained neuron robustly detects the pattern just a few milliseconds after its presentation. Therefore, temporal imprecision and Poisson-like firing variability are not an obstacle to fast temporal coding. STDP provides an appealing mechanism to learn such rate patterns, which, beyond sensory processing, may also be involved in many cognitive tasks. In vivo neural responses to stimuli are known to show considerable variability across trials. If the same number of spikes is emitted from trial to trial, the neuron is said to be reliable. If the timing of such spikes is roughly preserved across trials, the neuron is said to be precise.
Here we demonstrate both analytically and numerically that the well-established Hebbian learning rule of spike-timing-dependent plasticity (STDP) can learn response patterns despite relatively low reliability (Poisson-like variability) and low temporal precision (10–20 ms). These features are in line with many experimental observations, in which a poststimulus time histogram (PSTH) is evaluated over multiple trials. In our model, however, information is extracted from the relative spike times between afferents without the need for an absolute reference time, such as a stimulus onset. Notably, recent experiments show that relative timing is often more informative than absolute timing. Furthermore, the scope of application for our study is not restricted to sensory systems. Taken together, our results suggest a fine temporal resolution for the neural code, and that STDP is an appropriate candidate for encoding and decoding such activity.
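The input model used above, in which each afferent spike train is sampled from a rate function via an inhomogeneous Poisson process, can be sketched with standard thinning. The rate function below is an arbitrary stand-in modulated on the ~20 ms timescale the paper identifies, not one taken from the study.

```python
import numpy as np

def inhomogeneous_poisson(rate_fn, t_max, rate_max, rng):
    """Sample spike times on [0, t_max) from rate_fn (in Hz) by thinning:
    draw candidate spikes at the bounding rate rate_max, then keep each
    candidate at time t with probability rate_fn(t) / rate_max."""
    n = rng.poisson(rate_max * t_max)
    candidates = np.sort(rng.uniform(0.0, t_max, n))
    keep = rng.uniform(0.0, rate_max, n) < rate_fn(candidates)
    return candidates[keep]

rng = np.random.default_rng(0)
# Stand-in rate: 20-50 Hz, modulated on a ~20 ms timescale
rate = lambda t: 20.0 + 30.0 * np.sin(2 * np.pi * t / 0.02) ** 2
spikes = inhomogeneous_poisson(rate, t_max=1.0, rate_max=50.0, rng=rng)
```

Repeating the draw for many afferents with covarying rate functions yields the rate-modulated patterns (with Poisson-like trial-to-trial variability) that the trained neuron learns to detect.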
Affiliation(s)
- Matthieu Gilson
- Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Australia
- Lab for Neural Circuit Theory, RIKEN Brain Science Institute, Wako-shi, Saitama, Japan
- Timothée Masquelier
- Unit for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Etienne Hugues
- Unit for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
33
Stability versus neuronal specialization for STDP: long-tail weight distributions solve the dilemma. PLoS One 2011; 6:e25339. [PMID: 22003389 PMCID: PMC3189213 DOI: 10.1371/journal.pone.0025339] [Citation(s) in RCA: 58] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/04/2011] [Accepted: 09/01/2011] [Indexed: 11/19/2022] Open
Abstract
Spike-timing-dependent plasticity (STDP) modifies the weight (or strength) of synaptic connections between neurons and is considered to be crucial for generating network structure. It has been observed in physiology that, in addition to spike timing, the weight update also depends on the current value of the weight. The functional implications of this feature are still largely unclear. Additive STDP gives rise to strong competition among synapses, but due to the absence of weight dependence, it requires hard boundaries to secure the stability of weight dynamics. Multiplicative STDP with linear weight dependence for depression ensures stability, but it lacks sufficiently strong competition required to obtain a clear synaptic specialization. A solution to this stability-versus-function dilemma can be found with an intermediate parametrization between additive and multiplicative STDP. Here we propose a novel solution to the dilemma, named log-STDP, whose key feature is a sublinear weight dependence for depression. Due to its specific weight dependence, this new model can produce significantly broad weight distributions with no hard upper bound, similar to those recently observed in experiments. Log-STDP induces graded competition between synapses, such that synapses receiving stronger input correlations are pushed further in the tail of (very) large weights. Strong weights are functionally important to enhance the neuronal response to synchronous spike volleys. Depending on the input configuration, multiple groups of correlated synaptic inputs exhibit either winner-share-all or winner-take-all behavior. When the configuration of input correlations changes, individual synapses quickly and robustly readapt to represent the new configuration. We also demonstrate the advantages of log-STDP for generating a stable structure of strong weights in a recurrently connected network. These properties of log-STDP are compared with those of previous models. 
Through long-tail weight distributions, log-STDP achieves both stable dynamics and robust competition of synapses, which are crucial for spike-based information processing.
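The contrast between additive, multiplicative, and sublinear (log) weight dependence of depression can be sketched as follows. The functional forms are only loosely modeled on log-STDP, and w0 and beta are illustrative parameters, not the paper's.

```python
import math

def depression_scale(w, rule, w0=1.0, beta=2.0):
    """Scaling of the depression step as a function of the current
    weight w under three STDP variants (illustrative forms)."""
    if rule == "additive":
        return 1.0                 # no weight dependence: hard bounds needed
    if rule == "multiplicative":
        return w / w0              # linear weight dependence: stable but weakly competitive
    if rule == "log":
        if w <= w0:
            return w / w0          # roughly multiplicative below w0
        # Sublinear (logarithmic) above w0: strong weights are not pulled
        # back proportionally, allowing a long tail with no hard upper bound.
        return 1.0 + math.log(1.0 + beta * (w / w0 - 1.0)) / beta
    raise ValueError(rule)
```

At w = w0 the multiplicative and log rules coincide; well above w0 the log rule depresses far less than the multiplicative one, which is what lets strongly correlated inputs be pushed into the tail of large weights.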
34
Branch-specific plasticity enables self-organization of nonlinear computation in single neurons. J Neurosci 2011; 31:10787-802. [PMID: 21795531 DOI: 10.1523/jneurosci.5684-10.2011] [Citation(s) in RCA: 98] [Impact Index Per Article: 7.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
It has been conjectured that nonlinear processing in dendritic branches endows individual neurons with the capability to perform complex computational operations that are needed to solve, for example, the binding problem. However, it is not clear how single neurons could acquire such functionality in a self-organized manner, because most theoretical studies of synaptic plasticity and learning concentrate on neuron models without nonlinear dendritic properties. In the meantime, a complex picture of information processing with dendritic spikes and a variety of plasticity mechanisms in single neurons has emerged from experiments. In particular, new experimental data on dendritic branch strength potentiation in rat hippocampus have not yet been incorporated into such models. In this article, we investigate how experimentally observed plasticity mechanisms, such as depolarization-dependent spike-timing-dependent plasticity and branch-strength potentiation, could be integrated to self-organize nonlinear neural computations with dendritic spikes. We provide a mathematical proof that, in a simplified setup, these plasticity mechanisms induce a competition between dendritic branches, a novel concept in the analysis of single neuron adaptivity. We show via computer simulations that such dendritic competition enables a single neuron to become a member of several neuronal ensembles and to acquire nonlinear computational capabilities, such as the capability to bind multiple input features. Hence, our results suggest that nonlinear neural computation may self-organize in single neurons through the interaction of local synaptic and dendritic plasticity mechanisms.
35
Zheng Y, Schwabe L. Knowledge Representation Meets Simulation to Investigate Memory Problems after Seizures. Brain Inform 2011. [DOI: 10.1007/978-3-642-23605-1_11] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022] Open
36
Abstract
Neocortical neurons in vivo process each of their individual inputs in the context of ongoing synaptic background activity, produced by the thousands of presynaptic partners a typical neuron has. Previous work has shown that background activity affects multiple aspects of neuronal and network function. However, its effect on the induction of spike-timing dependent plasticity (STDP) is not clear. Here we report that injections of simulated background conductances (produced by a dynamic-clamp system) into pyramidal cells in rat brain slices selectively reduced the magnitude of timing-dependent synaptic potentiation while leaving the magnitude of timing-dependent synaptic depression unchanged. The conductance-dependent suppression also sharpened the STDP curve, with reliable synaptic potentiation induced only when EPSPs and action potentials (APs) were paired within 8 ms of each other. Dual somatic and dendritic patch recordings suggested that the deficit in synaptic potentiation arose from shunting of dendritic EPSPs and APs. Using a biophysically detailed computational model, we were not only able to replicate the conductance-dependent shunting of dendritic potentials, but show that synaptic background can truncate calcium dynamics within dendritic spines in a way that affects potentiation more strongly than depression. This conductance-dependent regulation of synaptic plasticity may constitute a novel homeostatic mechanism that can prevent the runaway synaptic potentiation to which Hebbian networks are vulnerable.
37
Hennequin G, Gerstner W, Pfister JP. STDP in Adaptive Neurons Gives Close-To-Optimal Information Transmission. Front Comput Neurosci 2010; 4:143. [PMID: 21160559 PMCID: PMC3001990 DOI: 10.3389/fncom.2010.00143] [Citation(s) in RCA: 23] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/19/2010] [Accepted: 09/28/2010] [Indexed: 11/13/2022] Open
Abstract
Spike-frequency adaptation is known to enhance the transmission of information in sensory spiking neurons by rescaling the dynamic range for input processing, matching it to the temporal statistics of the sensory stimulus. Achieving maximal information transmission has also been recently postulated as a role for spike-timing-dependent plasticity (STDP). However, the link between optimal plasticity and STDP in cortex remains loose, as does the relationship between STDP and adaptation processes. We investigate how STDP, as described by recent minimal models derived from experimental data, influences the quality of information transmission in an adapting neuron. We show that a phenomenological model based on triplets of spikes yields almost the same information rate as an optimal model specially designed to this end. In contrast, the standard pair-based model of STDP does not improve information transmission as much. This result holds not only for additive STDP with hard weight bounds, known to produce bimodal distributions of synaptic weights, but also for weight-dependent STDP in the context of unimodal but skewed weight distributions. We analyze the similarities between the triplet model and the optimal learning rule, and find that the triplet effect is an important feature of the optimal model when the neuron is adaptive. If STDP is optimized for information transmission, it must take into account the dynamical properties of the postsynaptic cell, which might explain the target-cell specificity of STDP. In particular, it accounts for the differences found in vitro between STDP at excitatory synapses onto principal cells and those onto fast-spiking interneurons.
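The pair-versus-triplet comparison can be made concrete with a minimal event-driven sketch of the triplet rule (after Pfister and Gerstner's model: the time constants are in the usual ballpark, but the amplitudes are illustrative, and the second presynaptic trace of the full model is omitted).

```python
import math

def triplet_stdp(pre, post, w=0.5,
                 a2_plus=5e-3, a2_minus=7e-3, a3_plus=6e-3,
                 tau_plus=16.8e-3, tau_minus=33.7e-3, tau_y=114e-3):
    """Minimal event-driven triplet STDP. pre/post are sorted
    spike-time lists in seconds; returns the final weight."""
    events = sorted([(t, "pre") for t in pre] + [(t, "post") for t in post])
    r1 = o1 = o2 = 0.0   # presynaptic trace; fast and slow postsynaptic traces
    t_last = 0.0
    for t, kind in events:
        dt = t - t_last
        r1 *= math.exp(-dt / tau_plus)
        o1 *= math.exp(-dt / tau_minus)
        o2 *= math.exp(-dt / tau_y)
        if kind == "pre":
            w -= o1 * a2_minus                  # pair-based depression
            r1 += 1.0
        else:
            w += r1 * (a2_plus + a3_plus * o2)  # pair + triplet potentiation
            o1 += 1.0
            o2 += 1.0                           # slow trace updated after the rule
        t_last = t
    return w
```

The a3_plus * o2 term is the triplet contribution: potentiation grows with recent postsynaptic activity, which is what couples the rule to the postsynaptic cell's adaptation state. Setting a3_plus = 0 recovers the standard pair-based model.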
Affiliation(s)
- Guillaume Hennequin
- School of Computer and Communication Sciences, Brain-Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
38
Bush D, Philippides A, Husbands P, O'Shea M. Reconciling the STDP and BCM models of synaptic plasticity in a spiking recurrent neural network. Neural Comput 2010; 22:2059-85. [PMID: 20438333 DOI: 10.1162/neco_a_00003-bush] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Rate-coded Hebbian learning, as characterized by the BCM formulation, is an established computational model of synaptic plasticity. Recently it has been demonstrated that changes in the strength of synapses in vivo can also depend explicitly on the relative timing of pre- and postsynaptic firing. Computational modeling of this spike-timing-dependent plasticity (STDP) has demonstrated that it can provide inherent stability or competition based on local synaptic variables. However, it has also been demonstrated that these properties rely on synaptic weights being either depressed or unchanged by an increase in mean stochastic firing rates, which directly contradicts empirical data. Several analytical studies have addressed this apparent dichotomy and identified conditions under which distinct and disparate STDP rules can be reconciled with rate-coded Hebbian learning. The aim of this research is to verify, unify, and expand on these previous findings by manipulating each element of a standard computational STDP model in turn. This allows us to identify the conditions under which this plasticity rule can replicate experimental data obtained using both rate and temporal stimulation protocols in a spiking recurrent neural network. Our results describe how the relative scale of mean synaptic weights and their dependence on stochastic pre- or postsynaptic firing rates can be manipulated by adjusting the exact profile of the asymmetric learning window and temporal restrictions on spike pair interactions respectively. These findings imply that previously disparate models of rate-coded autoassociative learning and temporally coded heteroassociative learning, mediated by symmetric and asymmetric connections respectively, can be implemented in a single network using a single plasticity rule. 
However, we also demonstrate that forms of STDP that can be reconciled with rate-coded Hebbian learning do not generate inherent synaptic competition, and thus some additional mechanism is required to guarantee long-term input-output selectivity.
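For reference, the rate-coded BCM formulation that the STDP variants are being reconciled with can be sketched in a few lines; eta, tau_theta, and y0 are illustrative parameters.

```python
def bcm_step(w, x, y, theta, eta=1e-3, tau_theta=100.0, y0=1.0):
    """One BCM update with a sliding modification threshold.
    w: synaptic weight, x: presynaptic rate, y: postsynaptic rate,
    theta: current modification threshold. Returns the new (w, theta)."""
    w = w + eta * y * (y - theta) * x                  # LTD below theta, LTP above
    theta = theta + (y ** 2 / y0 - theta) / tau_theta  # theta tracks <y^2>
    return w, theta
```

The sliding threshold is what gives BCM its stability: sustained high postsynaptic rates raise theta, converting further Hebbian potentiation into depression.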
Affiliation(s)
- Daniel Bush
- Centre for Computational Neuroscience and Robotics, University of Sussex, Brighton, Sussex, UK
39
Graupner M, Brunel N. Mechanisms of induction and maintenance of spike-timing dependent plasticity in biophysical synapse models. Front Comput Neurosci 2010; 4. [PMID: 20948584 PMCID: PMC2953414 DOI: 10.3389/fncom.2010.00136] [Citation(s) in RCA: 82] [Impact Index Per Article: 5.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/04/2010] [Accepted: 08/25/2010] [Indexed: 01/02/2023] Open
Abstract
We review biophysical models of synaptic plasticity, with a focus on spike-timing dependent plasticity (STDP). The common property of the discussed models is that synaptic changes depend on the dynamics of the intracellular calcium concentration, which itself depends on pre- and postsynaptic activity. We start by discussing simple models in which plasticity changes are based directly on calcium amplitude and dynamics. We then consider models in which dynamic intracellular signaling cascades form the link between the calcium dynamics and the plasticity changes. Both mechanisms of induction of STDP (through the ability of pre/postsynaptic spikes to evoke changes in the state of the synapse) and of maintenance of the evoked changes (through bistability) are discussed.
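The calcium-threshold logic common to the simple models reviewed (depression above one calcium threshold, potentiation dominating above a higher one) can be sketched as an instantaneous drift of the synaptic variable; the thresholds and gains below are arbitrary illustration values.

```python
def calcium_plasticity_rate(ca, w, theta_d=1.0, theta_p=1.3,
                            gamma_p=1.0, gamma_d=0.5):
    """Instantaneous drift dw/dt as a function of calcium, in the spirit
    of calcium-threshold plasticity models. Below theta_d: no change.
    Between theta_d and theta_p: depression. Above theta_p: the
    potentiation term dominates."""
    dw = 0.0
    if ca > theta_p:
        dw += gamma_p * (1.0 - w)   # drive toward the potentiated state
    if ca > theta_d:
        dw -= gamma_d * w           # depression whenever above theta_d
    return dw
```

Bistability, the maintenance mechanism discussed in the review, would be added by making the zero-calcium dynamics of w itself have two stable fixed points rather than none.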
Affiliation(s)
- Michael Graupner
- Center for Neural Science, New York University, New York, NY, USA
40
Watt AJ, Desai NS. Homeostatic Plasticity and STDP: Keeping a Neuron's Cool in a Fluctuating World. Front Synaptic Neurosci 2010; 2:5. [PMID: 21423491 PMCID: PMC3059670 DOI: 10.3389/fnsyn.2010.00005] [Citation(s) in RCA: 98] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/18/2010] [Accepted: 05/17/2010] [Indexed: 11/23/2022] Open
Abstract
Spike-timing-dependent plasticity (STDP) offers a powerful means of forming and modifying neural circuits. Experimental and theoretical studies have demonstrated its potential usefulness for functions as varied as cortical map development, sharpening of sensory receptive fields, working memory, and associative learning. Even so, it is unlikely that STDP works alone. Unless changes in synaptic strength are coordinated across multiple synapses and with other neuronal properties, it is difficult to maintain the stability and functionality of neural circuits. Moreover, there are certain features of early postnatal development (e.g., rapid changes in sensory input) that threaten neural circuit stability in ways that STDP may not be well placed to counter. These considerations have led researchers to investigate additional types of plasticity, complementary to STDP, that may serve to constrain synaptic weights and/or neuronal firing. These are collectively known as “homeostatic plasticity” and include schemes that control the total synaptic strength of a neuron, that modulate its intrinsic excitability as a function of average activity, or that make the ability of synapses to undergo Hebbian modification depend upon their history of use. In this article, we will review the experimental evidence for homeostatic forms of plasticity and consider how they might interact with STDP during development, and learning and memory.
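One commonly invoked homeostatic scheme, multiplicative synaptic scaling toward a target firing rate, can be sketched as below (an illustrative form and learning rate; the review discusses several such schemes).

```python
import numpy as np

def synaptic_scaling(w, rate, target_rate, eta=0.01):
    """Multiplicative homeostatic scaling: scale all of a neuron's weights
    up when its average firing rate is below target and down when above,
    preserving the relative differences set by Hebbian plasticity."""
    return w * (1.0 + eta * (target_rate - rate) / target_rate)
```

Because the update is multiplicative, the ratio between any two weights is unchanged, so scaling constrains total drive without erasing the structure that STDP has learned.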
Affiliation(s)
- Alanna J Watt
- Wolfson Institute for Biomedical Research, University College London, London, UK
41
Bamford SA, Murray AF, Willshaw DJ. Synaptic rewiring for topographic mapping and receptive field development. Neural Netw 2010; 23:517-27. [DOI: 10.1016/j.neunet.2010.01.005] [Citation(s) in RCA: 13] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/12/2009] [Revised: 01/30/2010] [Accepted: 01/31/2010] [Indexed: 11/26/2022]
42
Morrison A, Diesmann M, Gerstner W. Phenomenological models of synaptic plasticity based on spike timing. BIOLOGICAL CYBERNETICS 2008; 98:459-78. [PMID: 18491160 PMCID: PMC2799003 DOI: 10.1007/s00422-008-0233-1] [Citation(s) in RCA: 284] [Impact Index Per Article: 17.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/16/2008] [Accepted: 04/09/2008] [Indexed: 05/20/2023]
Abstract
Synaptic plasticity is considered to be the biological substrate of learning and memory. In this document we review phenomenological models of short-term and long-term synaptic plasticity, in particular spike-timing dependent plasticity (STDP). The aim of the document is to provide a framework for classifying and evaluating different models of plasticity. We focus on phenomenological synaptic models that are compatible with integrate-and-fire type neuron models where each neuron is described by a small number of variables. This implies that synaptic update rules for short-term or long-term plasticity can only depend on spike timing and, potentially, on membrane potential, as well as on the value of the synaptic weight, or on low-pass filtered (temporally averaged) versions of the above variables. We examine the ability of the models to account for experimental data and to fulfill expectations derived from theoretical considerations. We further discuss their relations to teacher-based rules (supervised learning) and reward-based rules (reinforcement learning). All models discussed in this paper are suitable for large-scale network simulations.
Affiliation(s)
- Abigail Morrison
- Computational Neuroscience Group, RIKEN Brain Science Institute, Wako City, Japan
- Markus Diesmann
- Computational Neuroscience Group, RIKEN Brain Science Institute, Wako City, Japan
- Bernstein Center for Computational Neuroscience, Albert-Ludwigs-University, Freiburg, Germany
- Wulfram Gerstner
- Laboratory of Computational Neuroscience, LCN, Brain Mind Institute and School of Computer and Communication Sciences, Ecole Polytechnique Fédérale de Lausanne (EPFL), Station 15, 1015 Lausanne, Switzerland