201
Kerr RR, Burkitt AN, Thomas DA, Gilson M, Grayden DB. Delay selection by spike-timing-dependent plasticity in recurrent networks of spiking neurons receiving oscillatory inputs. PLoS Comput Biol 2013; 9:e1002897. [PMID: 23408878] [PMCID: PMC3567188] [DOI: 10.1371/journal.pcbi.1002897]
Abstract
Learning rules, such as spike-timing-dependent plasticity (STDP), change the structure of networks of neurons based on their firing activity. A network-level understanding of these mechanisms can help infer how the brain learns patterns and processes information. Previous studies have shown that STDP selectively potentiates feed-forward connections that have specific axonal delays, and that this underlies behavioral functions such as sound localization in the auditory brainstem of the barn owl. In this study, we investigate how STDP leads to the selective potentiation of recurrent connections with different axonal and dendritic delays during oscillatory activity. We develop analytical models of learning with additive STDP in recurrent networks driven by oscillatory inputs, and support the results using simulations with leaky integrate-and-fire neurons. Our results show selective potentiation of connections with specific axonal delays, which depend on the input frequency. In addition, we demonstrate how this can lead to a network becoming selective in the amplitude of its oscillatory response to this frequency. We extend this model of axonal delay selection within a single recurrent network in two ways. First, we show the selective potentiation of connections with a range of both axonal and dendritic delays. Second, we show axonal delay selection between multiple groups receiving out-of-phase, oscillatory inputs. We discuss the application of these models to the formation and activation of neuronal ensembles or cell assemblies in the cortex, and also to missing fundamental pitch perception in the auditory brainstem.

Our brain's ability to perform cognitive processes, such as object identification, problem solving, and decision making, comes from the specific connections between neurons. The neurons carry information as spikes that are transmitted to other neurons via connections with different strengths and propagation delays. Experimentally observed learning rules can modify the strengths of connections between neurons based on the timing of their spikes. The learning that occurs in neuronal networks due to these rules is thought to be vital to creating the structures necessary for different cognitive processes as well as for memory. The spiking rate of populations of neurons has been observed to oscillate at particular frequencies in various brain regions, and there is evidence that these oscillations play a role in cognition. Here, we use analytical and numerical methods to investigate the changes to the network structure caused by a specific learning rule during oscillatory neural activity. We find the conditions under which connections with propagation delays that resonate with the oscillations are strengthened relative to the other connections. We demonstrate that networks learn to oscillate more strongly to oscillations at the frequency they were presented with during learning. We discuss the possible application of these results to specific areas of the brain.
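The delay-selection effect described in this abstract can be illustrated numerically: with an additive STDP window and pre/post correlations modulated at the input frequency, the average weight drift of a connection varies with its axonal delay through the phase of the modulation. The sketch below is a toy calculation, not the paper's model; the window parameters and the raised-cosine correlogram are illustrative assumptions.

```python
import math

def stdp_window(dt, a_plus=1.0, a_minus=1.0, tau_plus=0.017, tau_minus=0.034):
    """Additive pairwise STDP window W(dt), dt = t_post - t_pre (seconds)."""
    if dt >= 0:
        return a_plus * math.exp(-dt / tau_plus)
    return -a_minus * math.exp(dt / tau_minus)

def mean_drift(axonal_delay, freq, n=2000, span=0.1):
    """Average weight drift of a connection with the given axonal delay (s)
    when the pre/post cross-correlogram is modulated at freq (Hz).
    The correlogram is modelled as a raised cosine shifted by the delay."""
    total = 0.0
    for i in range(n):
        dt = -span + 2.0 * span * i / (n - 1)
        corr = 1.0 + math.cos(2.0 * math.pi * freq * (dt + axonal_delay))
        total += stdp_window(dt) * corr
    return total * (2.0 * span / n)

def best_delay(freq, delays):
    """Delay whose connections drift upward fastest at this input frequency."""
    return max(delays, key=lambda d: mean_drift(d, freq))
```

For example, scanning delays between 0 and 24 ms, the drift-maximizing delay for a 40 Hz drive differs from that for a 20 Hz drive, which is the frequency-dependent delay selection the abstract describes.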
Affiliation(s)
- Robert R. Kerr
- NeuroEngineering Laboratory, Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Centre for Neural Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Anthony N. Burkitt
- NeuroEngineering Laboratory, Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Centre for Neural Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Bionics Institute, Melbourne, Victoria, Australia
- Doreen A. Thomas
- Department of Mechanical Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Matthieu Gilson
- NeuroEngineering Laboratory, Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Centre for Neural Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute, Saitama, Japan
- David B. Grayden
- NeuroEngineering Laboratory, Department of Electrical and Electronic Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Centre for Neural Engineering, The University of Melbourne, Melbourne, Victoria, Australia
- Bionics Institute, Melbourne, Victoria, Australia
202
Yousefi A, Dibazar AA, Berger TW. Synaptic dynamics: linear model and adaptation algorithm. Annu Int Conf IEEE Eng Med Biol Soc 2013; 2012:1362-5. [PMID: 23366152] [DOI: 10.1109/embc.2012.6346191]
Abstract
A linear model for synaptic temporal dynamics and a learning algorithm for synaptic adaptation in spiking neural networks are presented. The proposed linear model substantially simplifies the analysis and training of spiking neural networks while accurately modeling facilitation and depression dynamics in the synapse. The learning rule is biologically plausible and capable of simultaneously adjusting both the LTP and STP parameters of individual synapses in a network. To demonstrate the efficiency of the system, a small spiking neural network is trained to generate different spike and bursting patterns of cortical neurons. The simulation results reveal that the linear model of synaptic dynamics, together with the proposed STDP-based learning algorithm, can provide a practical tool for simulating and training very large-scale spiking neural circuits comprising large numbers of synapses and neurons.
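The facilitation and depression dynamics that the proposed linear model approximates are commonly described by Tsodyks-Markram-style dynamic synapses. Below is a minimal sketch of those nonlinear reference dynamics; parameter names and values are illustrative assumptions, not taken from the paper.

```python
import math

class DynamicSynapse:
    """Facilitating/depressing synapse in the style of Tsodyks and Markram.
    A sketch of the dynamics the paper linearizes, not the paper's own model."""

    def __init__(self, U=0.2, tau_fac=0.5, tau_rec=0.1):
        self.U, self.tau_fac, self.tau_rec = U, tau_fac, tau_rec
        self.u, self.x = U, 1.0       # utilization and available resources
        self.last_spike = None

    def release(self, t):
        """Return the relative PSP amplitude for a presynaptic spike at time t."""
        if self.last_spike is not None:
            dt = t - self.last_spike
            # exponential recovery/decay between spikes
            self.x = 1.0 - (1.0 - self.x) * math.exp(-dt / self.tau_rec)
            self.u = self.U + (self.u - self.U) * math.exp(-dt / self.tau_fac)
        amp = self.u * self.x
        self.x -= amp                        # depression: resources consumed
        self.u += self.U * (1.0 - self.u)    # facilitation: utilization jumps
        self.last_spike = t
        return amp
```

With a low baseline utilization U, the second of two closely spaced spikes releases more than the first (facilitation); with a high U, it releases less (depression).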
Affiliation(s)
- Ali Yousefi
- Neural Dynamics Laboratory, University of Southern California, USA.
203
O'Brien MJ, Srinivasa N. A Spiking Neural Model for Stable Reinforcement of Synapses Based on Multiple Distal Rewards. Neural Comput 2013; 25:123-56. [DOI: 10.1162/neco_a_00387]
Abstract
In this letter, a novel critic-like algorithm was developed to extend the synaptic plasticity rule described in Florian (2007) and Izhikevich (2007) in order to solve the problem of learning multiple distal rewards simultaneously. The system is augmented with short-term plasticity (STP) to stabilize the learning dynamics, thereby increasing the system's learning capacity. A theoretical threshold is estimated for the number of distal rewards that this system can learn. The validity of the novel algorithm was verified by computer simulations.
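The underlying plasticity rule from Florian (2007) and Izhikevich (2007) gates STDP with a delayed reward acting on a decaying eligibility trace. A minimal sketch of that base rule for a single synapse, with exponential traces; the letter's multi-reward critic and STP stabilization are not reproduced here, and all parameter values are illustrative.

```python
import math

TAU_C = 1.0                 # eligibility-trace time constant (s)
A_PLUS, A_MINUS = 0.1, 0.12
TAU_STDP = 0.02

def stdp(dt):
    """Pairwise STDP contribution for dt = t_post - t_pre."""
    if dt >= 0:
        return A_PLUS * math.exp(-dt / TAU_STDP)
    return -A_MINUS * math.exp(dt / TAU_STDP)

class Synapse:
    """Weight changes only when a (possibly delayed) reward converts the
    decaying eligibility trace into an actual update."""

    def __init__(self, w=0.5):
        self.w, self.c, self.t = w, 0.0, 0.0

    def advance(self, t):
        # decay the eligibility trace up to time t (events assumed in order)
        self.c *= math.exp(-(t - self.t) / TAU_C)
        self.t = t

    def on_pair(self, t_pre, t_post):
        self.advance(max(t_pre, t_post))
        self.c += stdp(t_post - t_pre)   # tag the synapse, no weight change yet

    def on_reward(self, t, reward):
        self.advance(t)
        self.w += reward * self.c        # reward converts the tag into change
```

A causal pre-before-post pairing leaves a positive trace, so a reward arriving half a second later still potentiates the synapse; an anti-causal pairing is depressed instead.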
Affiliation(s)
- Michael J. O'Brien
- Department of Mathematics, University of California at Los Angeles, Los Angeles, CA 90095, U.S.A., and Center for Neural and Emergent Systems, Information and System Sciences Lab, HRL Laboratories LLC, Malibu CA 90265, U.S.A
- Narayan Srinivasa
- Center for Neural and Emergent Systems, Information and System Sciences Lab, HRL Laboratories LLC, Malibu CA 90265, U.S.A
204
Galtier M, Wainrib G. Multiscale analysis of slow-fast neuronal learning models with noise. J Math Neurosci 2012; 2:13. [PMID: 23174307] [PMCID: PMC3571918] [DOI: 10.1186/2190-8567-2-13]
Abstract
This paper deals with the application of temporal averaging methods to recurrent networks of noisy neurons undergoing a slow and unsupervised modification of their connectivity matrix called learning. Three time-scales arise for these models: (i) the fast neuronal dynamics, (ii) the intermediate external input to the system, and (iii) the slow learning mechanisms. Based on this time-scale separation, we apply an extension of the mathematical theory of stochastic averaging with periodic forcing in order to derive a reduced deterministic model for the connectivity dynamics. We focus on a class of models where the activity is linear, in order to understand the specificity of several learning rules (Hebbian, trace, or anti-symmetric learning). In a weakly connected regime, we study the equilibrium connectivity, which gathers the entire 'knowledge' of the network about the inputs. We develop an asymptotic method to approximate this equilibrium. We show that the symmetric part of the connectivity post-learning encodes the correlation structure of the inputs, whereas the anti-symmetric part corresponds to the cross-correlation between the inputs and their time derivative. Moreover, the time-scales ratio appears as an important parameter revealing temporal correlations.
Affiliation(s)
- Mathieu Galtier
- NeuroMathComp Project Team, INRIA/ENS Paris, 23 avenue d’Italie, Paris, 75013, France
- School of Engineering and Science, Jacobs University Bremen gGmbH, College Ring 1, P.O. Box 750 561, Bremen, 28725, Germany
- Gilles Wainrib
- Laboratoire Analyse Géométrie et Applications, Université Paris 13, 99 avenue Jean-Baptiste Clément, Villetaneuse, Paris, France
205
Abstract
In spike-timing-dependent plasticity (STDP), the order and precise temporal interval between presynaptic and postsynaptic spikes determine the sign and magnitude of long-term potentiation (LTP) or depression (LTD). STDP is widely utilized in models of circuit-level plasticity, development, and learning. However, spike timing is just one of several factors (including firing rate, synaptic cooperativity, and depolarization) that govern plasticity induction, and its relative importance varies across synapses and activity regimes. This review summarizes this broader view of plasticity, including the forms and cellular mechanisms of the spike-timing dependence of plasticity and the evidence that spike timing is an important determinant of plasticity in vivo.
Affiliation(s)
- Daniel E Feldman
- Department of Molecular and Cell Biology, and Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA 94720-3200, USA.
206
Helias M, Kunkel S, Masumoto G, Igarashi J, Eppler JM, Ishii S, Fukai T, Morrison A, Diesmann M. Supercomputers ready for use as discovery machines for neuroscience. Front Neuroinform 2012; 6:26. [PMID: 23129998] [PMCID: PMC3486988] [DOI: 10.3389/fninf.2012.00026]
Abstract
NEST is a widely used tool to simulate biological spiking neural networks. Here we explain the improvements, guided by a mathematical model of memory consumption, that enable us to exploit for the first time the computational power of the K supercomputer for neuroscience. Multi-threaded components for wiring and simulation combine 8 cores per MPI process to achieve excellent scaling. K is capable of simulating networks corresponding to a brain area with 10^8 neurons and 10^12 synapses in the worst case scenario of random connectivity; for larger networks of the brain its hierarchical organization can be exploited to constrain the number of communicating computer nodes. We discuss the limits of the software technology, comparing maximum filling scaling plots for K and the JUGENE BG/P system. The usability of these machines for network simulations has become comparable to running simulations on a single PC. Turn-around times in the range of minutes even for the largest systems enable a quasi interactive working style and render simulations on this scale a practical tool for computational neuroscience.
Affiliation(s)
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6), Computational and Systems Neuroscience, Jülich Research Centre, Jülich, Germany; RIKEN Brain Science Institute, Wako, Japan
207
Humble J, Denham S, Wennekers T. Spatio-temporal pattern recognizers using spiking neurons and spike-timing-dependent plasticity. Front Comput Neurosci 2012; 6:84. [PMID: 23087641] [PMCID: PMC3467690] [DOI: 10.3389/fncom.2012.00084]
Abstract
It has previously been shown that by using spike-timing-dependent plasticity (STDP), neurons can adapt to the beginning of a repeating spatio-temporal firing pattern in their input. In the present work, we demonstrate that this mechanism can be extended to train recognizers for longer spatio-temporal input signals. Using a number of neurons that are mutually connected by plastic synapses and subject to a global winner-takes-all mechanism, chains of neurons can form where each neuron is selective to a different segment of a repeating input pattern, and the neurons are feed-forwardly connected in such a way that both the correct input segment and the firing of the previous neurons are required in order to activate the next neuron in the chain. This is akin to a simple class of finite state automata. We show that nearest-neighbor STDP (where only the pre-synaptic spike most recent to a post-synaptic one is considered) leads to "nearest-neighbor" chains where connections only form between subsequent states in a chain (similar to classic "synfire chains"). In contrast, "all-to-all spike-timing-dependent plasticity" (where all pre- and post-synaptic spike pairs matter) leads to multiple connections that can span several temporal stages in the chain; these connections respect the temporal order of the neurons. It is also demonstrated that previously learnt individual chains can be "stitched together" by repeatedly presenting them in a fixed order. This way longer sequence recognizers can be formed, and potentially also nested structures. Robustness of recognition with respect to speed variations in the input patterns is shown to depend on rise-times of post-synaptic potentials and the membrane noise. It is argued that the memory capacity of the model is high, but could theoretically be increased using sparse codes.
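The distinction the abstract draws between nearest-neighbor and all-to-all STDP can be made concrete by summing the STDP window over the admissible spike pairs. This is a sketch under simplifying assumptions: one common nearest-neighbor pairing convention among several used in the literature, with illustrative parameters.

```python
import math

A_P, A_M, TAU = 1.0, 0.5, 0.02   # potentiation/depression amplitudes, time constant (s)

def w_pair(dt):
    """STDP window for a single pair, dt = t_post - t_pre."""
    if dt > 0:
        return A_P * math.exp(-dt / TAU)
    return -A_M * math.exp(dt / TAU)

def all_to_all(pre, post):
    """Sum the STDP window over every pre/post spike pair."""
    return sum(w_pair(tp - tq) for tq in pre for tp in post if tp != tq)

def nearest_neighbor(pre, post):
    """For each postsynaptic spike, only the most recent earlier presynaptic
    spike contributes potentiation (depression conventions vary)."""
    total = 0.0
    for tp in post:
        earlier = [tq for tq in pre if tq < tp]
        if earlier:
            total += w_pair(tp - max(earlier))
    return total
```

For pre spikes at 0 and 10 ms and a post spike at 15 ms, all-to-all counts both causal pairs, while nearest-neighbor keeps only the 5 ms pair; this is why all-to-all pairing can form the longer-range, order-respecting connections described above.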
Affiliation(s)
- James Humble
- Centre for Robotic and Neural Systems, Cognition Institute, Plymouth University, Plymouth, UK
208
Deger M, Helias M, Rotter S, Diesmann M. Spike-timing dependence of structural plasticity explains cooperative synapse formation in the neocortex. PLoS Comput Biol 2012; 8:e1002689. [PMID: 23028287] [PMCID: PMC3447982] [DOI: 10.1371/journal.pcbi.1002689]
Abstract
Structural plasticity governs the long-term development of synaptic connections in the neocortex. While the underlying processes at the synapses are not fully understood, there is strong evidence that a process of random, independent formation and pruning of excitatory synapses can be ruled out. Instead, there must be some cooperation between the synaptic contacts connecting a single pre- and postsynaptic neuron pair. So far, the mechanism of cooperation is not known. Here we demonstrate that local correlation detection at the postsynaptic dendritic spine suffices to explain the synaptic cooperation effect, without assuming any hypothetical direct interaction pathway between the synaptic contacts. Candidate biomolecular mechanisms for dendritic correlation detection have been identified previously, as well as for structural plasticity based thereon. By analyzing and fitting a simple model, we show that spike-timing-correlation-dependent structural plasticity, without additional mechanisms of cross-synapse interaction, can reproduce the experimentally observed distributions of numbers of synaptic contacts between pairs of neurons in the neocortex. Furthermore, the model yields a first explanation for the existence of both transient and persistent dendritic spines and allows predictions to be made for future experiments.

Structural plasticity has been observed even in the adult mammalian neocortex: in seemingly static neuronal circuits, structural remodeling is continuously at work. Still, it has been shown that the connection patterns between pairs of neurons are not random. Rather, there is evidence that the synaptic contacts between a pair of neurons cooperate: several experimental studies report either zero or about 3–6 synapses between neuron pairs. The mechanism by which the synapses cooperate, however, has not yet been identified. Here we propose a model for structural plasticity that relies on local processes at the dendritic spine. We combine and extend previous models and determine the equilibrium probability distribution of synaptic contact numbers for the model. By optimizing the parameters numerically for each of three reference datasets, we obtain equilibrium contact-number distributions that fit the references very well. We conclude that the local dendritic mechanisms we assume suffice to explain cooperative synapse formation in the neocortex.
Affiliation(s)
- Moritz Deger
- Bernstein Center Freiburg, Albert-Ludwig University, Freiburg, Germany.
209
VLSI circuits implementing computational models of neocortical circuits. J Neurosci Methods 2012; 210:93-109. [DOI: 10.1016/j.jneumeth.2012.01.019]
210
Knoblauch A, Hauser F, Gewaltig MO, Körner E, Palm G. Does spike-timing-dependent synaptic plasticity couple or decouple neurons firing in synchrony? Front Comput Neurosci 2012; 6:55. [PMID: 22936909] [PMCID: PMC3424530] [DOI: 10.3389/fncom.2012.00055]
Abstract
Spike synchronization is thought to have a constructive role for feature integration, attention, associative learning, and the formation of bidirectionally connected Hebbian cell assemblies. By contrast, theoretical studies on spike-timing-dependent plasticity (STDP) report an inherently decoupling influence of spike synchronization on synaptic connections of coactivated neurons. For example, bidirectional synaptic connections as found in cortical areas could be reproduced only by assuming realistic models of STDP and rate coding. We resolve this conflict by theoretical analysis and simulation of various simple and realistic STDP models that provide a more complete characterization of conditions when STDP leads to either coupling or decoupling of neurons firing in synchrony. In particular, we show that STDP consistently couples synchronized neurons if key model parameters are matched to physiological data: First, synaptic potentiation must be significantly stronger than synaptic depression for small (positive or negative) time lags between presynaptic and postsynaptic spikes. Second, spike synchronization must be sufficiently imprecise, for example, within a time window of 5-10 ms instead of 1 ms. Third, axonal propagation delays should not be much larger than dendritic delays. Under these assumptions synchronized neurons will be strongly coupled leading to a dominance of bidirectional synaptic connections even for simple STDP models and low mean firing rates at the level of spontaneous activity.
211
Bamford SA, Murray AF, Willshaw DJ. Spike-timing-dependent plasticity with weight dependence evoked from physical constraints. IEEE Trans Biomed Circuits Syst 2012; 6:385-398. [PMID: 23853183] [DOI: 10.1109/tbcas.2012.2184285]
Abstract
Analogue and mixed-signal VLSI implementations of Spike-Timing-Dependent Plasticity (STDP) are reviewed. A circuit is presented with a compact implementation of STDP suitable for parallel integration in large synaptic arrays. In contrast to previously published circuits, it uses the limitations of the silicon substrate to achieve various forms and degrees of weight dependence of STDP. It also uses reverse-biased transistors to reduce leakage from a capacitance representing weight. Chip results are presented showing: various ways in which the learning rule may be shaped; how synaptic weights may retain some indication of their learned values over periods of minutes; and how distributions of weights for synapses convergent on single neurons may shift between more or less extreme bimodality according to the strength of correlational cues in their inputs.
Affiliation(s)
- Simeon A Bamford
- Neuroinformatics Doctoral Training Centre, University of Edinburgh, Edinburgh, Scotland EH8 9AB, UK.
212
Vogt SM, Hofmann UG. Neuromodulation of STDP through short-term changes in firing causality. Cogn Neurodyn 2012; 6:353-66. [PMID: 24995051] [DOI: 10.1007/s11571-012-9202-4]
Abstract
Spike-timing-dependent plasticity (STDP) likely plays an important role in forming and changing connectivity patterns between neurons in our brain. In a unidirectional synaptic connection between two neurons, it uses the causal relation between the spiking activity of a presynaptic input neuron and a postsynaptic output neuron to change the strength of this connection. While the nature of STDP lends itself to unsupervised learning of correlated inputs, any incorporation of value into the learning process needs some form of reinforcement. Chemical neuromodulators such as dopamine or acetylcholine are thought to signal changes between external reward and internal expectation to many brain regions, including the basal ganglia. This effect is often modelled by directly including the level of dopamine as a third factor in the STDP rule. While this gives the benefit of direct control over synaptic modification, it does not account for the instantaneous effects on neuronal activity observed on application of dopamine agonists. Specifically, an instant facilitation of neuronal excitability in the striatum cannot be explained by the merely indirect effect that dopamine-modulated STDP has on a neuron's firing pattern. We therefore propose a model for synaptic transmission in which the level of neuromodulator does not directly influence synaptic plasticity but instead alters the relative firing causality between pre- and postsynaptic neurons. Through the direct effect on postsynaptic activity, our rule allows indirect modulation of the learning outcome even with unmodulated, two-factor STDP. However, it also does not prohibit joint operation together with three-factor STDP rules.
Affiliation(s)
- Simon M Vogt
- Institute for Signal Processing, University of Luebeck, Ratzeburger Allee 160, Lübeck, Germany
- Ulrich G Hofmann
- Institute for Signal Processing, University of Luebeck, Ratzeburger Allee 160, Lübeck, Germany
213
Pfeil T, Potjans TC, Schrader S, Potjans W, Schemmel J, Diesmann M, Meier K. Is a 4-bit synaptic weight resolution enough? Constraints on enabling spike-timing dependent plasticity in neuromorphic hardware. Front Neurosci 2012; 6:90. [PMID: 22822388] [PMCID: PMC3398398] [DOI: 10.3389/fnins.2012.00090]
Abstract
Large-scale neuromorphic hardware systems typically bear a trade-off between detail level and required chip resources. Especially when implementing spike-timing dependent plasticity, reduction in resources leads to limitations as compared to floating-point precision. By design, a natural modification that saves resources is to reduce the synaptic weight resolution. In this study, we give an estimate for the impact of synaptic weight discretization on different levels, ranging from random walks of individual weights to computer simulations of spiking neural networks. The FACETS wafer-scale hardware system offers a 4-bit resolution of synaptic weights, which is shown to be sufficient within the scope of our network benchmark. Our findings indicate that increasing the resolution may not even be useful in light of further restrictions of customized mixed-signal synapses. In addition, variations due to production imperfections are investigated and shown not to be critical in the context of the presented study. Our results represent a general framework for setting up and configuring hardware-constrained synapses. We suggest how weight discretization could be considered for other backends dedicated to large-scale simulations. Thus, our proposition of a good hardware verification practice may give rise to synergies between hardware developers and neuroscientists.
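The effect of a 4-bit weight resolution can be sketched by mapping continuous weights onto 16 evenly spaced levels; the actual FACETS discretization and rounding scheme may differ from this simple nearest-level rule.

```python
def discretize(w, w_max=1.0, bits=4):
    """Map a continuous weight in [0, w_max] onto 2**bits evenly spaced
    levels, rounding to the nearest level."""
    levels = 2 ** bits - 1           # 15 steps -> 16 representable values
    step = w_max / levels
    clipped = min(max(w, 0.0), w_max)
    return round(clipped / step) * step
```

Rounding a learned analog weight to the nearest of the 16 levels is exactly the kind of perturbation whose network-level impact the study quantifies.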
Affiliation(s)
- Thomas Pfeil
- Kirchhoff Institute for Physics, Ruprecht-Karls-University Heidelberg, Heidelberg, Germany
214
Spectral analysis of input spike trains by spike-timing-dependent plasticity. PLoS Comput Biol 2012; 8:e1002584. [PMID: 22792056] [PMCID: PMC3390410] [DOI: 10.1371/journal.pcbi.1002584]
Abstract
Spike-timing-dependent plasticity (STDP) has been observed in many brain areas such as sensory cortices, where it is hypothesized to structure synaptic connections between neurons. Previous studies have demonstrated how STDP can capture spiking information at short timescales using specific input configurations, such as coincident spiking, spike patterns and oscillatory spike trains. However, the corresponding computation in the case of arbitrary input signals is still unclear. This paper provides an overarching picture of the algorithm inherent to STDP, tying together many previous results for commonly used models of pairwise STDP. For a single neuron with plastic excitatory synapses, we show how STDP performs a spectral analysis on the temporal cross-correlograms between its afferent spike trains. The postsynaptic responses and STDP learning window determine kernel functions that specify how the neuron “sees” the input correlations. We thus denote this unsupervised learning scheme as ‘kernel spectral component analysis’ (kSCA). In particular, the whole input correlation structure must be considered since all plastic synapses compete with each other. We find that kSCA is enhanced when weight-dependent STDP induces gradual synaptic competition. For a spiking neuron with a “linear” response and pairwise STDP alone, we find that kSCA resembles principal component analysis (PCA). However, plain STDP does not isolate correlation sources in general, e.g., when they are mixed among the input spike trains. In other words, it does not perform independent component analysis (ICA). Tuning the neuron to a single correlation source can be achieved when STDP is paired with a homeostatic mechanism that reinforces the competition between synaptic inputs. Our results suggest that neuronal networks equipped with STDP can process signals encoded in the transient spiking activity at the timescales of tens of milliseconds for usual STDP. 
Tuning feature extraction of sensory stimuli is an important function for synaptic plasticity models. A widely studied example is the development of orientation preference in the primary visual cortex, which can emerge using moving bars in the visual field. A crucial point is the decomposition of stimuli into basic information tokens, e.g., selecting individual bars even though they are presented in overlapping pairs (vertical and horizontal). Among classical unsupervised learning models, independent component analysis (ICA) is capable of isolating basic tokens, whereas principal component analysis (PCA) cannot. This paper focuses on spike-timing-dependent plasticity (STDP), whose functional implications for neural information processing have been intensively studied both theoretically and experimentally in the last decade. Following recent studies demonstrating that STDP can perform ICA for specific cases, we show how STDP relates to PCA or ICA, and in particular explains the conditions under which it switches between them. Here information at the neuronal level is assumed to be encoded in temporal cross-correlograms of spike trains. We find that a linear spiking neuron equipped with pairwise STDP requires additional mechanisms, such as a homeostatic regulation of its output firing, in order to separate mixed correlation sources and thus perform ICA.
215
Droste F, Schwalger T, Lindner B. Heterogeneous short-term plasticity enables spectral separation of information in the neural spike train. BMC Neurosci 2012. [PMCID: PMC3403657] [DOI: 10.1186/1471-2202-13-s1-p98]
216
Bayati M, Valizadeh A. Effect of synaptic plasticity on the structure and dynamics of disordered networks of coupled neurons. Phys Rev E Stat Nonlin Soft Matter Phys 2012; 86:011925. [PMID: 23005470] [DOI: 10.1103/physreve.86.011925]
Abstract
In an all-to-all network of integrate-and-fire neurons with disorder in the intrinsic oscillation frequencies of the neurons, we show that, through spike-timing-dependent plasticity, the synapses whose presynaptic neurons are high-frequency tend to be potentiated, while the links originating from the low-frequency neurons are weakened. The emergent effective flow of directed connections establishes the high-frequency neurons as the more influential elements in the network and facilitates synchronization by decreasing the synaptic cost for the onset of synchronization.
Affiliation(s)
- M Bayati
- Institute for Advanced Studies in Basic Sciences, PO Box 45195-1159, Zanjan, Iran
|
217
|
Gilson M, Bürck M, Burkitt AN, van Hemmen JL. Frequency selectivity emerging from spike-timing-dependent plasticity. Neural Comput 2012; 24:2251-79. [PMID: 22734488 DOI: 10.1162/neco_a_00331] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Periodic neuronal activity has been observed in various areas of the brain, from lower sensory to higher cortical levels. Specific frequency components contained in this periodic activity can be identified by a neuronal circuit that behaves as a bandpass filter with given preferred frequency, or best modulation frequency (BMF). For BMFs typically ranging from 10 to 200 Hz, a plausible and minimal configuration consists of a single neuron with adjusted excitatory and inhibitory synaptic connections. The emergence, however, of such a neuronal circuitry is still unclear. In this letter, we demonstrate how spike-timing-dependent plasticity (STDP) can give rise to frequency-dependent learning, thus leading to an input selectivity that enables frequency identification. We use an in-depth mathematical analysis of the learning dynamics in a population of plastic inhibitory connections. These provide inhomogeneous postsynaptic responses that depend on their dendritic location. We find that synaptic delays play a crucial role in organizing the weight specialization induced by STDP. Under suitable conditions on the synaptic delays and postsynaptic potentials (PSPs), the BMF of a neuron after learning can match the training frequency. In particular, proximal (distal) synapses with shorter (longer) dendritic delay and somatically measured PSP time constants respond better to higher (lower) frequencies. As a result, the neuron will respond maximally to any stimulating frequency (in a given range) with which it has been trained in an unsupervised manner. The model predicts that synapses responding to a given BMF form clusters on dendritic branches.
Affiliation(s)
- Matthieu Gilson
- NeuroEngineering Laboratory, Department of Electrical and Electronic Engineering, University of Melbourne, VIC 3010, Australia; The Bionics Institute, East Melbourne, VIC 3002, Australia; NICTA the Victorian Research Lab, University of Melbourne, VIC 3010, Australia; and RIKEN Brain Science Institute, Saitama 351-0198, Japan
- Moritz Bürck
- Physik Department T35, Technische Universität München, 85748 Garching bei München, Germany, and Bernstein Center for Computational Neuroscience München, 82152 Martinsried, Germany
- Anthony N. Burkitt
- Neuroengineering Laboratory, Department of Electrical and Electronic Engineering, University of Melbourne, VIC 3010, Australia; The Bionics Institute, East Melbourne, VIC 3010, Australia; and Centre for Neural Engineering, University of Melbourne, VIC 3010, Australia
- J. Leo van Hemmen
- Physik Department T35, Technische Universität München, 85748 Garching bei München, Germany, and Bernstein Center for Computational Neuroscience München, 82152 Martinsried, Germany
|
218
|
Guo D, Wang Q, Perc M. Complex synchronous behavior in interneuronal networks with delayed inhibitory and fast electrical synapses. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2012; 85:061905. [PMID: 23005125 DOI: 10.1103/physreve.85.061905] [Citation(s) in RCA: 44] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/18/2011] [Revised: 04/17/2012] [Indexed: 06/01/2023]
Abstract
Networks of fast-spiking interneurons are crucial for the generation of neural oscillations in the brain. Here we study the synchronous behavior of interneuronal networks that are coupled by delayed inhibitory and fast electrical synapses. We find that both coupling modes play a crucial role in the synchronization of the network. In addition, delayed inhibitory synapses affect the emerging oscillatory patterns. By increasing the inhibitory synaptic delay, we observe a transition from regular to mixed oscillatory patterns at a critical value. We also examine how the unreliability of inhibitory synapses influences the emergence of synchronization and the oscillatory patterns. We find that low levels of reliability tend to destroy synchronization and, moreover, that interneuronal networks with long inhibitory synaptic delays require a minimal level of reliability for the mixed oscillatory pattern to be maintained.
Affiliation(s)
- Daqing Guo
- Key Laboratory for NeuroInformation of Ministry of Education, School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu 610054, People's Republic of China.
|
219
|
Leen TK, Friel R. Stochastic perturbation methods for spike-timing-dependent plasticity. Neural Comput 2012; 24:1109-46. [PMID: 22295984 DOI: 10.1162/neco_a_00267] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Online machine learning rules and many biological spike-timing-dependent plasticity (STDP) learning rules generate jump process Markov chains for the synaptic weights. We give a perturbation expansion for the dynamics that, unlike the usual approximation by a Fokker-Planck equation (FPE), is well justified. Our approach extends the related system size expansion by giving an expansion for the probability density as well as its moments. We apply the approach to two observed STDP learning rules and show that in regimes where the FPE breaks down, the new perturbation expansion agrees well with Monte Carlo simulations. The methods are also applicable to the dynamics of stochastic neural activity. Like previous ensemble analyses of STDP, we focus on equilibrium solutions, although the methods can in principle be applied to transients as well.
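The jump-process Markov chain the abstract refers to can be sampled directly by Monte Carlo, which is the baseline such perturbation expansions are checked against. The rule and parameters below are illustrative, not the specific learning rules analyzed in the paper.

```python
import random

def equilibrium_weight(n_steps=200_000, p_pot=0.5, a_plus=0.01,
                       a_minus=0.012, seed=1):
    """Direct Monte Carlo of a synaptic weight driven by discrete STDP
    jumps: potentiation is an additive jump, depression a multiplicative
    one, so the weight stays in [0, 1] without hard clipping at the lower
    bound. Returns the equilibrium mean after discarding a burn-in."""
    rng = random.Random(seed)
    w, total, count = 0.5, 0.0, 0
    for step in range(n_steps):
        if rng.random() < p_pot:
            w = min(w + a_plus, 1.0)   # additive potentiation jump
        else:
            w = w * (1.0 - a_minus)    # multiplicative depression jump
        if step >= n_steps // 2:       # discard burn-in
            total += w
            count += 1
    return total / count
```

Balancing the mean jump sizes predicts an equilibrium near a_plus / a_minus ≈ 0.83; histograms of the sampled weights are what an expansion for the probability density must reproduce where the Fokker-Planck approximation breaks down.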
Affiliation(s)
- Todd K Leen
- Department of Biomedical Engineering, Oregon Health & Science University, Portland, OR 97239, USA.
|
220
|
Luz Y, Shamir M. Balancing feed-forward excitation and inhibition via Hebbian inhibitory synaptic plasticity. PLoS Comput Biol 2012; 8:e1002334. [PMID: 22291583 PMCID: PMC3266879 DOI: 10.1371/journal.pcbi.1002334] [Citation(s) in RCA: 52] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/14/2011] [Accepted: 11/16/2011] [Indexed: 12/02/2022] Open
Abstract
It has been suggested that excitatory and inhibitory inputs to cortical cells are balanced, and that this balance is important for the highly irregular firing observed in the cortex. There are two hypotheses as to the origin of this balance. One assumes that it results from a stable solution of the recurrent neuronal dynamics. This model can account for a balance of steady state excitation and inhibition without fine tuning of parameters, but not for transient inputs. The second hypothesis suggests that the feed forward excitatory and inhibitory inputs to a postsynaptic cell are already balanced. This latter hypothesis thus does account for the balance of transient inputs. However, it remains unclear what mechanism underlies the fine tuning required for balancing feed forward excitatory and inhibitory inputs. Here we investigated whether inhibitory synaptic plasticity is responsible for the balance of transient feed forward excitation and inhibition. We address this issue in the framework of a model characterizing the stochastic dynamics of temporally anti-symmetric Hebbian spike timing dependent plasticity of feed forward excitatory and inhibitory synaptic inputs to a single post-synaptic cell. Our analysis shows that inhibitory Hebbian plasticity generates 'negative feedback' that balances excitation and inhibition, which contrasts with the 'positive feedback' of excitatory Hebbian synaptic plasticity. As a result, this balance may increase the sensitivity of the learning dynamics to the correlation structure of the excitatory inputs.
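The negative-feedback character of inhibitory Hebbian plasticity can be seen in a rate-level caricature: excess excitatory drive raises the postsynaptic rate, which (via the Hebbian product of pre- and postsynaptic activity) strengthens inhibition until the net drive is balanced. These are illustrative dynamics under assumed parameters, not the paper's stochastic STDP model.

```python
def inhibitory_balance(g_exc=2.0, eta=0.05, n_steps=500):
    """Hebbian growth of a feed-forward inhibitory weight acts as negative
    feedback on the net drive g_exc - g_inh * r_pre, driving it toward
    zero (excitation-inhibition balance). Returns the net-drive trace."""
    r_pre, g_inh = 1.0, 0.5
    net_drive = []
    for _ in range(n_steps):
        net = g_exc - g_inh * r_pre       # net feed-forward drive
        r_post = max(net, 0.0)            # rectified postsynaptic rate
        g_inh += eta * r_pre * r_post     # Hebbian growth of inhibition
        net_drive.append(net)
    return net_drive
```

An excitatory Hebbian rule with the same product term would instead grow the imbalance, which is the 'positive feedback' the abstract contrasts it with.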
Affiliation(s)
- Yotam Luz
- Department of Physiology and Neurobiology, Ben-Gurion University of the Negev, Beer-Sheva, Israel
- Maoz Shamir
- Department of Physiology and Neurobiology, Ben-Gurion University of the Negev, Beer-Sheva, Israel
- Department of Physics, Ben-Gurion University of the Negev, Beer-Sheva, Israel
|
221
|
Scheller B, Castellano M, Vicente R, Pipa G. Spike train auto-structure impacts post-synaptic firing and timing-based plasticity. Front Comput Neurosci 2011; 5:60. [PMID: 22203800 PMCID: PMC3243878 DOI: 10.3389/fncom.2011.00060] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/17/2010] [Accepted: 11/29/2011] [Indexed: 11/13/2022] Open
Abstract
Cortical neurons are typically driven by several thousand synapses. The precise spatiotemporal pattern formed by these inputs can modulate the response of a post-synaptic cell. In this work, we explore how the temporal structure of pre-synaptic inhibitory and excitatory inputs impacts the post-synaptic firing of a conductance-based integrate and fire neuron. Both the excitatory and the inhibitory inputs were modeled by gamma renewal processes, with the shape factor varied to cover regular as well as temporally random (Poisson) activity. We demonstrate that the temporal structure of mutually independent inputs affects the post-synaptic firing, while the strength of the effect depends on the firing rates of both the excitatory and inhibitory inputs. In a second step, we explore the effect of temporal structure of mutually independent inputs on a simple version of Hebbian learning, i.e., hard bound spike-timing-dependent plasticity. We explore both the equilibrium weight distribution and the speed of the transient weight dynamics for different mutually independent gamma processes. We find that both the equilibrium distribution of the synaptic weights and the speed of synaptic changes are modulated by the temporal structure of the input. Finally, we highlight that the sensitivity of both the post-synaptic firing and the spike-timing-dependent plasticity to the auto-structure of the input of a neuron could be used to modulate the learning rate of synaptic modification.
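Gamma renewal inputs of the kind described above can be generated with a few lines of standard-library Python; a minimal sketch, with the scale chosen so the mean rate is fixed while the shape factor sets the regularity.

```python
import random

def gamma_spike_train(rate_hz, shape, t_max_s, seed=0):
    """Renewal spike train with Gamma-distributed inter-spike intervals:
    shape = 1 recovers a Poisson process (temporally random firing), while
    larger shape values give increasingly regular trains at the same mean
    rate. Returns spike times in seconds up to t_max_s."""
    rng = random.Random(seed)
    scale = 1.0 / (rate_hz * shape)   # mean ISI = shape * scale = 1 / rate
    t, spikes = 0.0, []
    while True:
        t += rng.gammavariate(shape, scale)
        if t > t_max_s:
            return spikes
        spikes.append(t)
```

A 10 Hz train over 100 s yields about 1000 spikes for any shape factor, but the spike-count and ISI variability shrink as the shape grows, which is exactly the auto-structure manipulation the study exploits.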
Affiliation(s)
- Bertram Scheller
- Clinic for Anesthesia, Intensive Care Medicine and Pain Therapy, Johann Wolfgang Goethe University, Frankfurt am Main, Germany
- Marta Castellano
- Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
- Department of Neurophysiology, Max-Planck-Institute for Brain Research, Frankfurt am Main, Germany
- Frankfurt Institute for Advanced Studies, Johann Wolfgang Goethe University, Frankfurt am Main, Germany
- Raul Vicente
- Department of Neurophysiology, Max-Planck-Institute for Brain Research, Frankfurt am Main, Germany
- Frankfurt Institute for Advanced Studies, Johann Wolfgang Goethe University, Frankfurt am Main, Germany
- Gordon Pipa
- Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
- Department of Neurophysiology, Max-Planck-Institute for Brain Research, Frankfurt am Main, Germany
- Frankfurt Institute for Advanced Studies, Johann Wolfgang Goethe University, Frankfurt am Main, Germany
|
222
|
Knoblauch A, Hauser F. STDP, Hebbian cell assemblies, and temporal coding by spike synchronization. BMC Neurosci 2011. [PMCID: PMC3240237 DOI: 10.1186/1471-2202-12-s1-p142] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open
|
223
|
Nekorkin VI, Dmitrichev AS, Kasatkin DV, Afraimovich VS. Relating the sequential dynamics of excitatory neural networks to synaptic cellular automata. CHAOS (WOODBURY, N.Y.) 2011; 21:043124. [PMID: 22225361 DOI: 10.1063/1.3657384] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/31/2023]
Abstract
We have developed a new approach for the description of sequential dynamics of excitatory neural networks. Our approach is based on the dynamics of synapses possessing the short-term plasticity property. We suggest a model of such synapses in the form of a second-order system of nonlinear ODEs. In the framework of the model, two types of responses are realized: the fast and the slow ones. Under some relations between their timescales, a cellular automaton (CA) on the graph of connections is constructed. Such a CA has only a finite number of attractors, all of which are periodic orbits. The attractors of the CA determine the regimes of sequential dynamics of the original neural network, i.e., itineraries along the network and the times of successive firing of neurons in the form of bunches of spikes. We illustrate our approach with the example of a Morris-Lecar neural network.
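The claim that such a CA has only periodic attractors follows from a general fact: any deterministic map on a finite state space must eventually revisit a state, so every trajectory ends on a periodic orbit. A generic sketch (the ring example is a toy, not the paper's synaptic automaton):

```python
def find_attractor(step, state):
    """Iterate a deterministic map on a finite state space until a state
    repeats; every trajectory then ends on a periodic orbit.
    Returns (transient_length, period)."""
    seen = {}
    t = 0
    while state not in seen:
        seen[state] = t
        state = step(state)
        t += 1
    return seen[state], t - seen[state]

# toy example: activity rotating around a 3-neuron ring
rotate = lambda s: (s[2],) + s[:2]
```

For the rotation map, a single active cell yields a period-3 orbit with no transient; maps with contracting dynamics produce a transient followed by a fixed point (period 1).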
Affiliation(s)
- V I Nekorkin
- Institute of Applied Physics of RAS, 46 Ul'yanov Street, 603950, Nizhny Novgorod, Russia
|
224
|
A biophysically-based neuromorphic model of spike rate- and timing-dependent plasticity. Proc Natl Acad Sci U S A 2011; 108:E1266-74. [PMID: 22089232 DOI: 10.1073/pnas.1106161108] [Citation(s) in RCA: 52] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/31/2023] Open
Abstract
Current advances in neuromorphic engineering have made it possible to emulate complex neuronal ion channel and intracellular ionic dynamics in real time using highly compact and power-efficient complementary metal-oxide-semiconductor (CMOS) analog very-large-scale-integrated circuit technology. Recently, there has been growing interest in the neuromorphic emulation of the spike-timing-dependent plasticity (STDP) Hebbian learning rule by phenomenological modeling using CMOS, memristor or other analog devices. Here, we propose a CMOS circuit implementation of a biophysically grounded neuromorphic (iono-neuromorphic) model of synaptic plasticity that is capable of capturing both the spike rate-dependent plasticity (SRDP, of the Bienenstock-Cooper-Munro or BCM type) and STDP rules. The iono-neuromorphic model reproduces bidirectional synaptic changes with NMDA receptor-dependent and intracellular calcium-mediated long-term potentiation or long-term depression assuming retrograde endocannabinoid signaling as a second coincidence detector. Changes in excitatory or inhibitory synaptic weights are registered and stored in a nonvolatile and compact digital format analogous to the discrete insertion and removal of AMPA or GABA receptor channels. The versatile Hebbian synapse device is applicable to a variety of neuroprosthesis, brain-machine interface, neurorobotics, neuromimetic computation, machine learning, and neural-inspired adaptive control problems.
|
225
|
Hanuschkin A, Diesmann M, Morrison A. A reafferent and feed-forward model of song syntax generation in the Bengalese finch. J Comput Neurosci 2011; 31:509-32. [PMID: 21404048 PMCID: PMC3232349 DOI: 10.1007/s10827-011-0318-z] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/08/2010] [Revised: 01/28/2011] [Accepted: 02/03/2011] [Indexed: 12/04/2022]
Abstract
Adult Bengalese finches generate a variable song that obeys a distinct and individual syntax. The syntax is gradually lost over a period of days after deafening and is recovered when hearing is restored. We present a spiking neuronal network model of the song syntax generation and its loss, based on the assumption that the syntax is stored in reafferent connections from the auditory to the motor control area. Propagating synfire activity in the HVC codes for individual syllables of the song and priming signals from the auditory network reduce the competition between syllables to allow only those transitions that are permitted by the syntax. Both imprinting of song syntax within HVC and the interaction of the reafferent signal with an efference copy of the motor command are sufficient to explain the gradual loss of syntax in the absence of auditory feedback. The model also reproduces for the first time experimental findings on the influence of altered auditory feedback on the song syntax generation, and predicts song- and species-specific low frequency components in the LFP. This study illustrates how sequential compositionality following a defined syntax can be realized in networks of spiking neurons.
Affiliation(s)
- Alexander Hanuschkin
- Functional Neural Circuits Group, Faculty of Biology, Albert-Ludwig University of Freiburg, Schänzlestrasse 1, 79104 Freiburg, Germany.
|
226
|
Stability versus neuronal specialization for STDP: long-tail weight distributions solve the dilemma. PLoS One 2011; 6:e25339. [PMID: 22003389 PMCID: PMC3189213 DOI: 10.1371/journal.pone.0025339] [Citation(s) in RCA: 58] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/04/2011] [Accepted: 09/01/2011] [Indexed: 11/19/2022] Open
Abstract
Spike-timing-dependent plasticity (STDP) modifies the weight (or strength) of synaptic connections between neurons and is considered to be crucial for generating network structure. It has been observed in physiology that, in addition to spike timing, the weight update also depends on the current value of the weight. The functional implications of this feature are still largely unclear. Additive STDP gives rise to strong competition among synapses, but due to the absence of weight dependence, it requires hard boundaries to secure the stability of weight dynamics. Multiplicative STDP with linear weight dependence for depression ensures stability, but it lacks sufficiently strong competition required to obtain a clear synaptic specialization. A solution to this stability-versus-function dilemma can be found with an intermediate parametrization between additive and multiplicative STDP. Here we propose a novel solution to the dilemma, named log-STDP, whose key feature is a sublinear weight dependence for depression. Due to its specific weight dependence, this new model can produce significantly broad weight distributions with no hard upper bound, similar to those recently observed in experiments. Log-STDP induces graded competition between synapses, such that synapses receiving stronger input correlations are pushed further in the tail of (very) large weights. Strong weights are functionally important to enhance the neuronal response to synchronous spike volleys. Depending on the input configuration, multiple groups of correlated synaptic inputs exhibit either winner-share-all or winner-take-all behavior. When the configuration of input correlations changes, individual synapses quickly and robustly readapt to represent the new configuration. We also demonstrate the advantages of log-STDP for generating a stable structure of strong weights in a recurrently connected network. These properties of log-STDP are compared with those of previous models. 
Through long-tail weight distributions, log-STDP achieves both stable dynamics for and robust competition of synapses, which are crucial for spike-based information processing.
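The defining feature named above, a sublinear weight dependence for depression, can be sketched as follows. This is an illustrative parametrization under assumed constants w0 and alpha; the paper's exact functional form may differ.

```python
import math

def log_depression(w, w0=1.0, alpha=5.0):
    """Sublinear weight dependence for the depression side of log-STDP:
    below a reference weight w0 depression scales roughly linearly with w,
    while above w0 it grows only logarithmically, so very large weights are
    depressed proportionally less than under multiplicative STDP."""
    if w <= w0:
        return w / w0
    return 1.0 + math.log(1.0 + alpha * (w / w0 - 1.0)) / alpha
```

Because the depression factor flattens out for large w, occasional strongly potentiated synapses are not pulled back hard, which is what lets a long-tailed weight distribution survive without a hard upper bound.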
|
227
|
Liu JK. Learning rule of homeostatic synaptic scaling: presynaptic dependent or not. Neural Comput 2011; 23:3145-61. [PMID: 21919784 DOI: 10.1162/neco_a_00210] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
It has been established that homeostatic synaptic scaling plasticity can maintain neural network activity in a stable regime. However, the underlying learning rule for this mechanism is still unclear. Whether it is dependent on the presynaptic site remains a topic of debate. Here we focus on two forms of learning rules: traditional synaptic scaling (SS) without presynaptic effect and presynaptic-dependent synaptic scaling (PSD). Analysis of the synaptic matrices reveals that transition matrices between consecutive synaptic matrices are distinct: they are diagonal and linear to neural activity under SS, but become nondiagonal and nonlinear under PSD. These differences produce different dynamics in recurrent neural networks. Numerical simulations show that network dynamics are stable under PSD but not SS, which suggests that PSD is a better form to describe homeostatic synaptic scaling plasticity. Matrix analysis used in the study may provide a novel way to examine the stability of learning dynamics.
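The diagonal-versus-nondiagonal distinction drawn in the abstract can be made concrete with two toy update rules. These are illustrative forms under assumed rate variables; the paper's exact equations may differ.

```python
def scale_ss(weights, r_post, r_target, eta=0.01):
    """Traditional synaptic scaling (SS): every weight is rescaled by the
    same factor, which depends only on postsynaptic activity. The map from
    old to new weight vector is diagonal and linear in the activity."""
    factor = 1.0 + eta * (r_target - r_post)
    return [w * factor for w in weights]

def scale_psd(weights, r_pre, r_post, r_target, eta=0.01):
    """Presynaptic-dependent synaptic scaling (PSD): each weight's update
    is additionally weighted by its own presynaptic rate, making the
    transition between weight vectors nondiagonal and nonlinear."""
    return [w * (1.0 + eta * r * (r_target - r_post))
            for w, r in zip(weights, r_pre)]
```

Under SS all synapses shrink or grow by the same proportion; under PSD a silent presynaptic input leaves its weight untouched while active inputs are rescaled.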
Affiliation(s)
- Jian K Liu
- Laboratory of Neurophysics and Physiology, CNRS UMR 8119, Université Paris Descartes, Paris, France.
|
228
|
Bourjaily MA, Miller P. Excitatory, inhibitory, and structural plasticity produce correlated connectivity in random networks trained to solve paired-stimulus tasks. Front Comput Neurosci 2011; 5:37. [PMID: 21991253 PMCID: PMC3170885 DOI: 10.3389/fncom.2011.00037] [Citation(s) in RCA: 27] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/15/2010] [Accepted: 08/02/2011] [Indexed: 11/26/2022] Open
Abstract
The pattern of connections among cortical excitatory cells with overlapping arbors is non-random. In particular, correlations among connections produce clustering – cells in cliques connect to each other with high probability, but with lower probability to cells in other spatially intertwined cliques. In this study, we model initially randomly connected sparse recurrent networks of spiking neurons with random, overlapping inputs, to investigate what functional and structural synaptic plasticity mechanisms sculpt network connections into the patterns measured in vitro. Our Hebbian implementation of structural plasticity causes a removal of connections between uncorrelated excitatory cells, followed by their random replacement. To model a biconditional discrimination task, we stimulate the network via pairs (A + B, C + D, A + D, and C + B) of four inputs (A, B, C, and D). We find that networks producing neurons most responsive to specific paired inputs – a building block of computation and an essential role for cortex – contain the excess clustering of excitatory synaptic connections observed in cortical slices. The same networks produce the best performance in a behavioral readout of the networks' ability to complete the task. A plasticity mechanism operating on inhibitory connections, long-term potentiation of inhibition, when combined with structural plasticity, indirectly enhances clustering of excitatory cells via excitatory connections. A rate-dependent (triplet) form of spike-timing-dependent plasticity (STDP) between excitatory cells is less effective, and basic STDP is detrimental. Clustering also arises in networks stimulated with single stimuli and in networks undergoing raised levels of spontaneous activity when structural plasticity is combined with functional plasticity.
In conclusion, spatially intertwined clusters or cliques of connected excitatory cells can arise via a Hebbian form of structural plasticity operating in initially randomly connected networks.
|
229
|
Accuracy evaluation of numerical methods used in state-of-the-art simulators for spiking neural networks. J Comput Neurosci 2011; 32:309-26. [DOI: 10.1007/s10827-011-0353-9] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/06/2010] [Revised: 05/17/2011] [Accepted: 07/06/2011] [Indexed: 10/17/2022]
|
230
|
Lourens MAJ, Nirody JA, Meijer HGE, Heida T, van Gils SA. The effect of spike time dependent plasticity on activity patterns in the basal ganglia. BMC Neurosci 2011. [PMCID: PMC3240469 DOI: 10.1186/1471-2202-12-s1-p351] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open
|
231
|
Brüderle D, Petrovici MA, Vogginger B, Ehrlich M, Pfeil T, Millner S, Grübl A, Wendt K, Müller E, Schwartz MO, de Oliveira DH, Jeltsch S, Fieres J, Schilling M, Müller P, Breitwieser O, Petkov V, Muller L, Davison AP, Krishnamurthy P, Kremkow J, Lundqvist M, Muller E, Partzsch J, Scholze S, Zühl L, Mayr C, Destexhe A, Diesmann M, Potjans TC, Lansner A, Schüffny R, Schemmel J, Meier K. A comprehensive workflow for general-purpose neural modeling with highly configurable neuromorphic hardware systems. BIOLOGICAL CYBERNETICS 2011; 104:263-296. [PMID: 21618053 DOI: 10.1007/s00422-011-0435-9] [Citation(s) in RCA: 35] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/10/2010] [Accepted: 04/19/2011] [Indexed: 05/30/2023]
Abstract
In this article, we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from taking this platform into operation. More specifically, we aim at the establishment of this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: The integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for the maturity of the model-to-hardware mapping software. The functionality and flexibility of the latter is proven with a variety of experimental results.
Affiliation(s)
- Daniel Brüderle
- Kirchhoff Institute for Physics, Ruprecht-Karls-Universität Heidelberg, Heidelberg, Germany.
|
232
|
Brette R, Goodman DFM. Vectorized algorithms for spiking neural network simulation. Neural Comput 2011; 23:1503-35. [PMID: 21395437 DOI: 10.1162/neco_a_00123] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
High-level languages (Matlab, Python) are popular in neuroscience because they are flexible and accelerate development. However, for simulating spiking neural networks, the cost of interpretation is a bottleneck. We describe a set of algorithms to simulate large spiking neural networks efficiently with high-level languages using vector-based operations. These algorithms constitute the core of Brian, a spiking neural network simulator written in the Python language. Vectorized simulation makes it possible to combine the flexibility of high-level languages with the computational efficiency usually associated with compiled languages.
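The core idea, replacing the per-neuron Python loop with whole-array operations, can be shown with a single leaky integrate-and-fire update step. A minimal sketch assuming NumPy is available; it is not Brian's actual implementation.

```python
import numpy as np

def lif_step(v, I, dt=0.1, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """One update of N leaky integrate-and-fire neurons as three array
    expressions: Euler integration, threshold detection, and reset are
    each applied to the whole population at once."""
    v = v + dt * (I - v) / tau        # forward-Euler membrane update
    spiked = v >= v_thresh            # boolean spike mask for all neurons
    v = np.where(spiked, v_reset, v)  # reset only the neurons that spiked
    return v, spiked
```

A simulation is then a plain Python loop over time steps whose per-iteration cost is independent of N (up to memory bandwidth), which is how vectorization recovers compiled-language throughput in an interpreted language.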
Affiliation(s)
- Romain Brette
- Laboratoire Psychologie de la Perception, CNRS and Université Paris Descartes, Paris 75006, France, and Département d'Etudes Cognitives, Ecole Normale Supérieure, Paris Cedex 05, 75230 France.
|
233
|
Helias M, Deger M, Rotter S, Diesmann M. Finite post synaptic potentials cause a fast neuronal response. Front Neurosci 2011; 5:19. [PMID: 21427776 PMCID: PMC3047297 DOI: 10.3389/fnins.2011.00019] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/23/2010] [Accepted: 02/07/2011] [Indexed: 01/23/2023] Open
Abstract
A generic property of the communication between neurons is the exchange of pulses at discrete time points, the action potentials. However, the prevalent theory of spiking neuronal networks of integrate-and-fire model neurons relies on two assumptions: the superposition of many afferent synaptic impulses is approximated by Gaussian white noise, equivalent to a vanishing magnitude of the synaptic impulses, and the transfer of time varying signals by neurons is assessable by linearization. Going beyond both approximations, we find that in the presence of synaptic impulses the response to transient inputs differs qualitatively from previous predictions. It is instantaneous rather than exhibiting low-pass characteristics, depends non-linearly on the amplitude of the impulse, is asymmetric for excitation and inhibition and is promoted by a characteristic level of synaptic background noise. These findings resolve contradictions between the earlier theory and experimental observations. Here we review the recent theoretical progress that enabled these insights. We explain why the membrane potential near threshold is sensitive to properties of the afferent noise and show how this shapes the neural response. A further extension of the theory to time evolution in discrete steps quantifies simulation artifacts and yields improved methods to cross check results.
Affiliation(s)
- Moritz Deger
- Bernstein Center Freiburg, Albert-Ludwig University, Freiburg, Germany
- Stefan Rotter
- Bernstein Center Freiburg, Albert-Ludwig University, Freiburg, Germany
- Computational Neuroscience, Faculty of Biology, Albert-Ludwig University, Freiburg, Germany
- Markus Diesmann
- RIKEN Brain Science Institute, Wako City, Japan
- Bernstein Center Freiburg, Albert-Ludwig University, Freiburg, Germany
- Institute for Neuroscience and Medicine (INM-6), Computational and Systems Neuroscience, Research Center Jülich, Germany
- Brain and Neural Systems Team, Computational Science Research Program, RIKEN, Wako City, Japan
|
234
|
Kunkel S, Diesmann M, Morrison A. Limits to the development of feed-forward structures in large recurrent neuronal networks. Front Comput Neurosci 2011; 4:160. [PMID: 21415913 PMCID: PMC3042733 DOI: 10.3389/fncom.2010.00160] [Citation(s) in RCA: 25] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/15/2010] [Accepted: 12/25/2010] [Indexed: 11/25/2022] Open
Abstract
Spike-timing dependent plasticity (STDP) has traditionally been of great interest to theoreticians, as it seems to provide an answer to the question of how the brain can develop functional structure in response to repeated stimuli. However, despite this high level of interest, convincing demonstrations of this capacity in large, initially random networks have not been forthcoming. Such demonstrations as there are typically rely on constraining the problem artificially. Techniques include employing additional pruning mechanisms or STDP rules that enhance symmetry breaking, simulating networks with low connectivity that magnify competition between synapses, or combinations of the above. In this paper, we first review modeling choices that carry particularly high risks of producing non-generalizable results in the context of STDP in recurrent networks. We then develop a theory for the development of feed-forward structure in random networks and conclude that an unstable fixed point in the dynamics prevents the stable propagation of structure in recurrent networks with weight-dependent STDP. We demonstrate that the key predictions of the theory hold in large-scale simulations. The theory provides insight into the reasons why such development does not take place in unconstrained systems and enables us to identify biologically motivated candidate adaptations to the balanced random network model that might enable it to do so.
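The contrast between additive and weight-dependent pair-based STDP that this line of work builds on can be sketched in a few lines. The rule below is the standard soft-bounds formulation (Gütig-style exponent `mu`); the amplitudes and timing distribution are illustrative assumptions, not the parameters used in the paper.

```python
import math, random

def stdp_dw(w, dt_spike, additive, a_plus=0.01, a_minus=0.012,
            tau=0.02, w_max=1.0, mu=1.0):
    """Pair-based STDP update for one spike pair; dt_spike = t_post - t_pre (s).
    additive=True: amplitudes independent of w (hard bounds at 0 and w_max).
    additive=False: weight-dependent amplitudes (soft bounds, exponent mu)."""
    if dt_spike >= 0:   # pre before post -> potentiation
        scale = 1.0 if additive else (1.0 - w / w_max) ** mu
        w += a_plus * scale * math.exp(-dt_spike / tau)
    else:               # post before pre -> depression
        scale = 1.0 if additive else (w / w_max) ** mu
        w -= a_minus * scale * math.exp(dt_spike / tau)
    return min(max(w, 0.0), w_max)

def final_weight(additive, steps=20000, seed=0):
    """Drive one synapse with random, uncorrelated pre/post timings."""
    rng = random.Random(seed)
    w = 0.5
    for _ in range(steps):
        w = stdp_dw(w, rng.uniform(-0.05, 0.05), additive)
    return w
```

Under uncorrelated spiking the additive rule drifts to a bound (here 0, since a_minus > a_plus), while the weight-dependent rule settles near its fixed point w* = a_plus / (a_plus + a_minus) ≈ 0.45 — the kind of qualitative difference between the two rule classes that the theory developed in this paper works with.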
Affiliation(s)
- Susanne Kunkel
- Functional Neural Circuits Group, Faculty of Biology, Albert-Ludwig University of Freiburg, Germany
235
Coulon A, Beslon G, Soula HA. Enhanced stimulus encoding capabilities with spectral selectivity in inhibitory circuits by STDP. Neural Comput 2011; 23:882-908. [PMID: 21222530 DOI: 10.1162/neco_a_00100] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
The ability to encode and transmit a signal is an essential property that many neuronal circuits in sensory areas must demonstrate, in addition to any processing they may provide. It is known that an appropriate level of lateral inhibition, as observed in these areas, can significantly improve the encoding ability of a population of neurons. We show here a homeostatic mechanism by which a spike-timing-dependent plasticity (STDP) rule with a symmetric timing window (swSTDP) spontaneously drives the inhibitory coupling to a level that ensures accurate encoding in response to input signals within a certain frequency range. Interpreting these results mathematically, we find that this coupling level depends on the overlap of spectral information between the stimulus and the STDP window function. Generalization to arbitrary swSTDP and arbitrary stimuli reveals that the signals for which this improvement of encoding takes place can be finely selected on spectral criteria. We finally show that this spectral overlap principle holds for a variety of neuron types and network characteristics. The highly tunable frequency-power domain of efficiency of this mechanism, together with its ability to operate in a wide variety of neuronal contexts, suggests that it may be at work in most sensory areas.
Affiliation(s)
- Antoine Coulon
- Université de Lyon, INSA-Lyon, CNRS UMR5205, INRIA, Laboratoire d'InfoRmatique en Image et Systèmes d'information (LIRIS), F-69621 Lyon, France
236
Elliott T. The Mean Time to Express Synaptic Plasticity in Integrate-and-Express, Stochastic Models of Synaptic Plasticity Induction. Neural Comput 2011; 23:124-59. [DOI: 10.1162/neco_a_00061] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Stochastic models of synaptic plasticity propose that single synapses perform a directed random walk of fixed step sizes in synaptic strength, thereby embracing the view that the mechanisms of synaptic plasticity constitute a stochastic dynamical system. However, fluctuations in synaptic strength present a formidable challenge to such an approach. We have previously proposed that single synapses must interpose an integration and filtering mechanism between the induction of synaptic plasticity and the expression of synaptic plasticity in order to control fluctuations. We analyze a class of three such mechanisms in the presence of possibly non-Markovian plasticity induction processes, deriving expressions for the mean expression time in these models. One of these filtering mechanisms constitutes a discrete low-pass filter that could be implemented on a small collection of molecules at single synapses, such as CaMKII, and we analyze this discrete filter in some detail. After considering Markov induction processes, we examine our own stochastic model of spike-timing-dependent plasticity, for which the probability density functions of the induction of plasticity steps have previously been derived. We determine the dependence of the mean time to express a plasticity step on pre- and postsynaptic firing rates in this model, and we also consider, numerically, the long-term stability against fluctuations of patterns of neuronal connectivity that typically emerge during neuronal development.
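The simplest member of the filtering class analyzed here — a pure event counter with no decay, i.e. the zero-leak limit of the discrete low-pass filter — already exhibits the central quantity of interest, the mean time to express a plasticity step. A hedged sketch; the rate and threshold values are invented for the example:

```python
import random

def mean_expression_time(rate=10.0, theta=8, trials=2000, seed=3):
    """Estimate the mean time for an 'integrate-and-express' synapse to
    accumulate theta plasticity-induction events arriving as a Poisson
    process. For this leak-free counter the exact answer is the Erlang
    mean theta / rate."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        t, count = 0.0, 0
        while count < theta:
            t += rng.expovariate(rate)  # waiting time to the next induction event
            count += 1
        total += t
    return total / trials
```

With rate = 10 Hz and theta = 8 the estimate clusters around 0.8 s. Adding leak to the counter, as in the filters analyzed in the paper, lengthens this mean time and suppresses expression triggered by isolated induction events.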
Affiliation(s)
- Terry Elliott
- Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton, SO17 1BJ, U.K.
237
Abstract
Neocortical neurons in vivo process each of their individual inputs in the context of ongoing synaptic background activity, produced by the thousands of presynaptic partners a typical neuron has. Previous work has shown that background activity affects multiple aspects of neuronal and network function. However, its effect on the induction of spike-timing dependent plasticity (STDP) is not clear. Here we report that injections of simulated background conductances (produced by a dynamic-clamp system) into pyramidal cells in rat brain slices selectively reduced the magnitude of timing-dependent synaptic potentiation while leaving the magnitude of timing-dependent synaptic depression unchanged. The conductance-dependent suppression also sharpened the STDP curve, with reliable synaptic potentiation induced only when EPSPs and action potentials (APs) were paired within 8 ms of each other. Dual somatic and dendritic patch recordings suggested that the deficit in synaptic potentiation arose from shunting of dendritic EPSPs and APs. Using a biophysically detailed computational model, we were not only able to replicate the conductance-dependent shunting of dendritic potentials, but show that synaptic background can truncate calcium dynamics within dendritic spines in a way that affects potentiation more strongly than depression. This conductance-dependent regulation of synaptic plasticity may constitute a novel homeostatic mechanism that can prevent the runaway synaptic potentiation to which Hebbian networks are vulnerable.
238
Manninen T, Hituri K, Kotaleski JH, Blackwell KT, Linne ML. Postsynaptic signal transduction models for long-term potentiation and depression. Front Comput Neurosci 2010; 4:152. [PMID: 21188161 PMCID: PMC3006457 DOI: 10.3389/fncom.2010.00152] [Citation(s) in RCA: 42] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/15/2010] [Accepted: 11/22/2010] [Indexed: 01/01/2023] Open
Abstract
More than a hundred biochemical species, activated by neurotransmitters binding to transmembrane receptors, are important in long-term potentiation (LTP) and long-term depression (LTD). To investigate which species and interactions are critical for synaptic plasticity, many computational postsynaptic signal transduction models have been developed. The models range from simple models with a single reversible reaction to detailed models with several hundred kinetic reactions. In this study, more than a hundred models are reviewed, and their features are compared and contrasted so that similarities and differences are more readily apparent. The models are classified according to the type of synaptic plasticity that is modeled (LTP or LTD) and whether they include diffusion or electrophysiological phenomena. Other characteristics that discriminate the models include the phase of synaptic plasticity modeled (induction, expression, or maintenance) and the simulation method used (deterministic or stochastic). We find that models are becoming increasingly sophisticated, by including stochastic properties, integrating with electrophysiological properties of entire neurons, or incorporating diffusion of signaling molecules. Simpler models continue to be developed because they are computationally efficient and allow theoretical analysis. The more complex models permit investigation of mechanisms underlying specific properties and experimental verification of model predictions. Nonetheless, it is difficult to fully comprehend the evolution of these models because (1) several models are not described in detail in the publications, (2) only a few models are provided in existing model databases, and (3) comparison to previous models is lacking. We conclude that the value of these models for understanding molecular mechanisms of synaptic plasticity is increasing and will be enhanced further with more complete descriptions and sharing of the published models.
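At the simple end of the model spectrum surveyed here sits a single reversible mass-action reaction. A minimal forward-Euler sketch of that limiting case; the rate constants are arbitrary placeholders, not values from any of the reviewed models:

```python
def reversible_reaction(a0=1.0, b0=1.0, c0=0.0, kf=2.0, kb=1.0,
                        dt=1e-3, t_end=10.0):
    """Deterministic mass-action kinetics for A + B <-> C, integrated
    with the forward Euler method; returns final concentrations."""
    a, b, c = a0, b0, c0
    for _ in range(int(t_end / dt)):
        flux = kf * a * b - kb * c   # net forward reaction flux
        a -= flux * dt
        b -= flux * dt
        c += flux * dt
    return a, b, c
```

At equilibrium kf·a·b = kb·c; for these parameters that gives a = b = c = 0.5, with mass conserved (a + c = a0 + c0). The stochastic simulation methods also covered in the review replace this ODE by discrete reaction events.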
Affiliation(s)
- Tiina Manninen
- Department of Signal Processing, Tampere University of Technology, Tampere, Finland
239
Hennequin G, Gerstner W, Pfister JP. STDP in Adaptive Neurons Gives Close-To-Optimal Information Transmission. Front Comput Neurosci 2010; 4:143. [PMID: 21160559 PMCID: PMC3001990 DOI: 10.3389/fncom.2010.00143] [Citation(s) in RCA: 23] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/19/2010] [Accepted: 09/28/2010] [Indexed: 11/13/2022] Open
Abstract
Spike-frequency adaptation is known to enhance the transmission of information in sensory spiking neurons by rescaling the dynamic range for input processing, matching it to the temporal statistics of the sensory stimulus. Achieving maximal information transmission has also been recently postulated as a role for spike-timing-dependent plasticity (STDP). However, the link between optimal plasticity and STDP in cortex remains loose, as does the relationship between STDP and adaptation processes. We investigate how STDP, as described by recent minimal models derived from experimental data, influences the quality of information transmission in an adapting neuron. We show that a phenomenological model based on triplets of spikes yields almost the same information rate as an optimal model specially designed to this end. In contrast, the standard pair-based model of STDP does not improve information transmission as much. This result holds not only for additive STDP with hard weight bounds, known to produce bimodal distributions of synaptic weights, but also for weight-dependent STDP in the context of unimodal but skewed weight distributions. We analyze the similarities between the triplet model and the optimal learning rule, and find that the triplet effect is an important feature of the optimal model when the neuron is adaptive. If STDP is optimized for information transmission, it must take into account the dynamical properties of the postsynaptic cell, which might explain the target-cell specificity of STDP. In particular, it accounts for the differences found in vitro between STDP at excitatory synapses onto principal cells and those onto fast-spiking interneurons.
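The pair-based versus triplet distinction at the heart of this paper can be made concrete with the standard trace formulation of the triplet rule (Pfister–Gerstner style): each synapse keeps fast and slow presynaptic traces (r1, r2) and postsynaptic traces (o1, o2), and the triplet amplitudes a3p, a3m multiply in the slow traces. The amplitudes and time constants below are illustrative, not the fitted values:

```python
import math

def triplet_stdp(pre, post, w0=0.5,
                 a2p=5e-3, a3p=6e-3, a2m=7e-3, a3m=2e-4,
                 tp=0.017, tm=0.034, tx=0.101, ty=0.125):
    """All-to-all triplet STDP over sorted pre/post spike-time lists (s).
    Setting a3p = a3m = 0 recovers the plain pair-based rule."""
    r1 = r2 = o1 = o2 = 0.0   # pre traces (r*) and post traces (o*)
    t_last, w = 0.0, w0
    events = sorted([(t, 'pre') for t in pre] + [(t, 'post') for t in post])
    for t, kind in events:
        dt = t - t_last       # decay all traces to the current event time
        r1 *= math.exp(-dt / tp); r2 *= math.exp(-dt / tx)
        o1 *= math.exp(-dt / tm); o2 *= math.exp(-dt / ty)
        if kind == 'pre':
            w -= o1 * (a2m + a3m * r2)   # depression, triplet-boosted by r2
            r1 += 1.0; r2 += 1.0
        else:
            w += r1 * (a2p + a3p * o2)   # potentiation, triplet-boosted by o2
            o1 += 1.0; o2 += 1.0
        t_last = t
    return w
```

A pre→post pairing at +10 ms potentiates and post→pre depresses, as in the pair rule; the o2 term additionally makes potentiation grow with pairing frequency, which is the rate-dependence that purely pair-based STDP misses.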
Affiliation(s)
- Guillaume Hennequin
- School of Computer and Communication Sciences, Brain-Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
240
Potjans W, Morrison A, Diesmann M. Enabling functional neural circuit simulations with distributed computing of neuromodulated plasticity. Front Comput Neurosci 2010; 4:141. [PMID: 21151370 PMCID: PMC2996144 DOI: 10.3389/fncom.2010.00141] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/01/2010] [Accepted: 09/15/2010] [Indexed: 11/13/2022] Open
Abstract
A major puzzle in the field of computational neuroscience is how to relate system-level learning in higher organisms to synaptic plasticity. Recently, plasticity rules depending not only on pre- and post-synaptic activity but also on a third, non-local neuromodulatory signal have emerged as key candidates to bridge the gap between the macroscopic and the microscopic level of learning. Crucial insights into this topic are expected to be gained from simulations of neural systems, as these allow the simultaneous study of the multiple spatial and temporal scales that are involved in the problem. In particular, synaptic plasticity can be studied during the whole learning process, i.e., on a time scale of minutes to hours and across multiple brain areas. Implementing neuromodulated plasticity in large-scale network simulations where the neuromodulatory signal is dynamically generated by the network itself is challenging, because the network structure is commonly defined purely by the connectivity graph without explicit reference to the embedding of the nodes in physical space. Furthermore, the simulation of networks with realistic connectivity entails the use of distributed computing. A neuromodulated synapse must therefore be informed in an efficient way about the neuromodulatory signal, which is typically generated by a population of neurons located on different machines than either the pre- or post-synaptic neuron. Here, we develop a general framework to solve the problem of implementing neuromodulated plasticity in a time-driven distributed simulation, without reference to a particular implementation language, neuromodulator, or neuromodulated plasticity mechanism. We implement our framework in the simulator NEST and demonstrate excellent scaling up to 1024 processors for simulations of a recurrent network incorporating neuromodulated spike-timing dependent plasticity.
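The class of third-factor rules this framework targets can be summarized, at the single-synapse level, by an eligibility trace gated by a neuromodulatory concentration. A generic sketch — the names and constants are assumptions, not the paper's implementation, and in the paper the neuromodulatory signal is produced by a neuron population that may live on a different machine than the synapse:

```python
import math

def neuromodulated_step(w, e, stdp_event, dopamine, dt,
                        tau_e=1.0, lr=0.1):
    """One update step of a generic neuromodulated STDP rule: raw STDP
    changes feed an eligibility trace e; the weight only moves while the
    neuromodulatory signal (dopamine concentration) is nonzero."""
    e = e * math.exp(-dt / tau_e) + stdp_event
    w = w + lr * dopamine * e * dt
    return w, e
```

A pre/post coincidence followed half a second later by a dopamine pulse still changes the weight, because the eligibility trace bridges the delay; without the pulse the weight is untouched.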
Affiliation(s)
- Wiebke Potjans
- Institute of Neuroscience and Medicine (INM-6), Computational and Systems Neuroscience, Research Center Jülich, Jülich, Germany
241
Gilson M, Burkitt AN, Grayden DB, Thomas DA, van Hemmen JL. Emergence of network structure due to spike-timing-dependent plasticity in recurrent neuronal networks V: self-organization schemes and weight dependence. Biol Cybern 2010; 103:365-386. [PMID: 20882297 DOI: 10.1007/s00422-010-0405-7] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/25/2009] [Accepted: 08/23/2010] [Indexed: 05/29/2023]
Abstract
Spike-timing-dependent plasticity (STDP) determines the evolution of the synaptic weights according to their pre- and post-synaptic activity, which in turn changes the neuronal activity on a (much) slower time scale. This paper examines the effect of STDP in a recurrently connected network stimulated by external pools of input spike trains, where both input and recurrent synapses are plastic. Our previously developed theoretical framework is extended to incorporate weight-dependent STDP and dendritic delays. The weight dynamics is determined by an interplay between the neuronal activation mechanisms, the input spike-time correlations, and the learning parameters. For the case of two external input pools, the resulting learning scheme can exhibit a symmetry breaking of the input connections such that two neuronal groups emerge, each specialized to one input pool only. In addition, we show how the recurrent connections within each neuronal group can be strengthened by STDP at the expense of those between the two groups. This neuronal self-organization can be seen as a basic dynamical ingredient for the emergence of neuronal maps induced by activity-dependent plasticity.
Affiliation(s)
- Matthieu Gilson
- Department of Electrical and Electronic Engineering, University of Melbourne, Melbourne, VIC 3010, Australia
242
Bush D, Philippides A, Husbands P, O'Shea M. Reconciling the STDP and BCM models of synaptic plasticity in a spiking recurrent neural network. Neural Comput 2010; 22:2059-85. [PMID: 20438333 DOI: 10.1162/neco_a_00003-bush] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Rate-coded Hebbian learning, as characterized by the BCM formulation, is an established computational model of synaptic plasticity. Recently it has been demonstrated that changes in the strength of synapses in vivo can also depend explicitly on the relative timing of pre- and postsynaptic firing. Computational modeling of this spike-timing-dependent plasticity (STDP) has demonstrated that it can provide inherent stability or competition based on local synaptic variables. However, it has also been demonstrated that these properties rely on synaptic weights being either depressed or unchanged by an increase in mean stochastic firing rates, which directly contradicts empirical data. Several analytical studies have addressed this apparent dichotomy and identified conditions under which distinct and disparate STDP rules can be reconciled with rate-coded Hebbian learning. The aim of this research is to verify, unify, and expand on these previous findings by manipulating each element of a standard computational STDP model in turn. This allows us to identify the conditions under which this plasticity rule can replicate experimental data obtained using both rate and temporal stimulation protocols in a spiking recurrent neural network. Our results describe how the relative scale of mean synaptic weights and their dependence on stochastic pre- or postsynaptic firing rates can be manipulated by adjusting the exact profile of the asymmetric learning window and temporal restrictions on spike pair interactions respectively. These findings imply that previously disparate models of rate-coded autoassociative learning and temporally coded heteroassociative learning, mediated by symmetric and asymmetric connections respectively, can be implemented in a single network using a single plasticity rule. 
However, we also demonstrate that forms of STDP that can be reconciled with rate-coded Hebbian learning do not generate inherent synaptic competition, and thus some additional mechanism is required to guarantee long-term input-output selectivity.
Affiliation(s)
- Daniel Bush
- Centre for Computational Neuroscience and Robotics, University of Sussex, Brighton, Sussex, UK
243
Bill J, Schuch K, Brüderle D, Schemmel J, Maass W, Meier K. Compensating Inhomogeneities of Neuromorphic VLSI Devices Via Short-Term Synaptic Plasticity. Front Comput Neurosci 2010; 4:129. [PMID: 21031027 PMCID: PMC2965017 DOI: 10.3389/fncom.2010.00129] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2010] [Accepted: 08/11/2010] [Indexed: 11/17/2022] Open
Abstract
Recent developments in neuromorphic hardware engineering make mixed-signal VLSI neural network models promising candidates for neuroscientific research tools and massively parallel computing devices, especially for tasks which exhaust the computing power of software simulations. Still, like all analog hardware systems, neuromorphic models suffer from constricted configurability and production-related fluctuations of device characteristics. Since future systems, involving ever-smaller structures, will also inevitably exhibit such inhomogeneities on the unit level, self-regulation properties become a crucial requirement for their successful operation. By applying a cortically inspired self-adjusting network architecture, we show that the activity of generic spiking neural networks emulated on a neuromorphic hardware system can be kept within a biologically realistic firing regime and gain a remarkable robustness against transistor-level variations. As a first approach of this kind in engineering practice, the short-term synaptic depression and facilitation mechanisms implemented within an analog VLSI model of I&F neurons are functionally utilized for network-level stabilization. We present experimental data acquired both from the hardware model and from comparative software simulations which demonstrate the applicability of the employed paradigm to neuromorphic VLSI devices.
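The short-term depression and facilitation dynamics exploited here is the classic Tsodyks–Markram phenomenological model. A software sketch of that model (parameter values are illustrative; the paper's point is that the analog VLSI circuits realize this dynamics physically rather than in software):

```python
import math

def tm_efficacies(spike_times, U=0.2, tau_rec=0.5, tau_fac=0.3):
    """Tsodyks-Markram short-term plasticity: relative efficacy u*x of
    each spike in a train. u (utilization) mediates facilitation,
    x (available resources) mediates depression."""
    u, x, t_last = 0.0, 1.0, None
    out = []
    for t in spike_times:
        if t_last is not None:
            dt = t - t_last
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_rec)  # resources recover
            u = u * math.exp(-dt / tau_fac)                # facilitation decays
        u = u + U * (1.0 - u)   # utilization jumps on each spike
        out.append(u * x)       # fraction of resources released = efficacy
        x = x * (1.0 - u)       # deplete resources
        t_last = t
    return out
```

With small U the train facilitates (efficacy grows from spike to spike); with large U and short tau_fac it depresses — and the depressing regime is what caps runaway activity at the network level.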
Affiliation(s)
- Johannes Bill
- Kirchhoff Institute for Physics, University of Heidelberg, Heidelberg, Germany
244
Menzies JRW, Porrill J, Dutia M, Dean P. Synaptic plasticity in medial vestibular nucleus neurons: comparison with computational requirements of VOR adaptation. PLoS One 2010; 5. [PMID: 20957149 PMCID: PMC2950150 DOI: 10.1371/journal.pone.0013182] [Citation(s) in RCA: 42] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/20/2010] [Accepted: 09/01/2010] [Indexed: 11/18/2022] Open
Abstract
BACKGROUND Vestibulo-ocular reflex (VOR) gain adaptation, a longstanding experimental model of cerebellar learning, utilizes sites of plasticity in both cerebellar cortex and brainstem. However, the mechanisms by which the activity of cortical Purkinje cells may guide synaptic plasticity in brainstem vestibular neurons are unclear. Theoretical analyses indicate that vestibular plasticity should depend upon the correlation between Purkinje cell and vestibular afferent inputs, so that, in gain-down learning for example, increased cortical activity should induce long-term depression (LTD) at vestibular synapses. METHODOLOGY/PRINCIPAL FINDINGS Here we expressed this correlational learning rule in its simplest form, as an anti-Hebbian, heterosynaptic spike-timing dependent plasticity interaction between excitatory (vestibular) and inhibitory (floccular) inputs converging on medial vestibular nucleus (MVN) neurons (input-spike-timing dependent plasticity, iSTDP). To test this rule, we stimulated vestibular afferents to evoke EPSCs in rat MVN neurons in vitro. Control EPSC recordings were followed by an induction protocol where membrane hyperpolarizing pulses, mimicking IPSPs evoked by flocculus inputs, were paired with single vestibular nerve stimuli. A robust LTD developed at vestibular synapses when the afferent EPSPs coincided with membrane hyperpolarization, while EPSPs occurring before or after the simulated IPSPs induced no lasting change. Furthermore, the iSTDP rule also successfully predicted the effects of a complex protocol using EPSP trains designed to mimic classical conditioning. CONCLUSIONS These results, in strong support of theoretical predictions, suggest that the cerebellum alters the strength of vestibular synapses on MVN neurons through hetero-synaptic, anti-Hebbian iSTDP. Since the iSTDP rule does not depend on post-synaptic firing, it suggests a possible mechanism for VOR adaptation without compromising gaze-holding and VOR performance in vivo.
Affiliation(s)
- John R. W. Menzies
- Centre for Integrative Physiology, School of Biomedical Sciences, University of Edinburgh, Edinburgh, United Kingdom
- John Porrill
- Department of Psychology, University of Sheffield, Sheffield, United Kingdom
- Mayank Dutia
- Centre for Integrative Physiology, School of Biomedical Sciences, University of Edinburgh, Edinburgh, United Kingdom
- Paul Dean
- Department of Psychology, University of Sheffield, Sheffield, United Kingdom
245
Thivierge JP, Cisek P. Spiking neurons that keep the rhythm. J Comput Neurosci 2010; 30:589-605. [PMID: 20886275 DOI: 10.1007/s10827-010-0280-1] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/11/2009] [Revised: 09/16/2010] [Accepted: 09/21/2010] [Indexed: 10/19/2022]
Abstract
Detecting the temporal relationship among events in the environment is a fundamental goal of the brain. Following pulses of rhythmic stimuli, neurons of the retina and cortex produce activity that closely approximates the timing of an omitted pulse. This omitted stimulus response (OSR) is generally interpreted as a transient response to rhythmic input and is thought to form a basis of short-term perceptual memories. Despite its ubiquity across species and experimental protocols, the mechanisms underlying OSRs remain poorly understood. In particular, the highly transient nature of OSRs, typically limited to a single cycle after stimulation, cannot be explained by a simple mechanism that would remain locked to the frequency of stimulation. Here, we describe a set of realistic simulations that capture OSRs over a range of stimulation frequencies matching experimental work. The model does not require an explicit mechanism for learning temporal sequences. Instead, it relies on spike timing-dependent plasticity (STDP), a form of synaptic modification that is sensitive to the timing of pre- and post-synaptic action potentials. In the model, the transient nature of OSRs is attributed to the heterogeneous nature of neural properties and connections, creating intricate forms of activity that are continuously changing over time. Combined with STDP, neural heterogeneity enabled OSRs to complex rhythmic patterns as well as OSRs following a delay period. These results link the response of neurons to rhythmic patterns with the capacity of heterogeneous circuits to produce transient and highly flexible forms of neural activity.
Affiliation(s)
- Jean-Philippe Thivierge
- Department of Psychological and Brain Sciences, Indiana University, 1101 East Tenth Street, Bloomington, IN 47405, USA
246
Li N, DiCarlo JJ. Unsupervised natural visual experience rapidly reshapes size-invariant object representation in inferior temporal cortex. Neuron 2010; 67:1062-75. [PMID: 20869601 PMCID: PMC2946943 DOI: 10.1016/j.neuron.2010.08.029] [Citation(s) in RCA: 75] [Impact Index Per Article: 5.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 08/05/2010] [Indexed: 11/25/2022]
Abstract
We easily recognize objects and faces across a myriad of retinal images produced by each object. One hypothesis is that this tolerance (a.k.a. "invariance") is learned by relying on the fact that object identities are temporally stable. While we previously found neuronal evidence supporting this idea at the top of the nonhuman primate ventral visual stream (inferior temporal cortex, or IT), we here test if this is a general tolerance learning mechanism. First, we found that the same type of unsupervised experience that reshaped IT position tolerance also predictably reshaped IT size tolerance, and the magnitude of reshaping was quantitatively similar. Second, this tolerance reshaping can be induced under naturally occurring dynamic visual experience, even without eye movements. Third, unsupervised temporal contiguous experience can build new neuronal tolerance. These results suggest that the ventral visual stream uses a general unsupervised tolerance learning algorithm to build its invariant object representation.
Affiliation(s)
- Nuo Li
- McGovern Institute for Brain Research, Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139
- James J. DiCarlo
- McGovern Institute for Brain Research, Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139
247
Graupner M, Brunel N. Mechanisms of induction and maintenance of spike-timing dependent plasticity in biophysical synapse models. Front Comput Neurosci 2010; 4. [PMID: 20948584 PMCID: PMC2953414 DOI: 10.3389/fncom.2010.00136] [Citation(s) in RCA: 82] [Impact Index Per Article: 5.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/04/2010] [Accepted: 08/25/2010] [Indexed: 01/02/2023] Open
Abstract
We review biophysical models of synaptic plasticity, with a focus on spike-timing dependent plasticity (STDP). The common property of the discussed models is that synaptic changes depend on the dynamics of the intracellular calcium concentration, which itself depends on pre- and postsynaptic activity. We start by discussing simple models in which plasticity changes are based directly on calcium amplitude and dynamics. We then consider models in which dynamic intracellular signaling cascades form the link between the calcium dynamics and the plasticity changes. Both mechanisms of induction of STDP (through the ability of pre/postsynaptic spikes to evoke changes in the state of the synapse) and of maintenance of the evoked changes (through bistability) are discussed.
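The first model family reviewed — plasticity read directly off the calcium transient — reduces to a threshold scheme: each pre/post spike adds calcium, and the weight is potentiated while calcium exceeds a high threshold and depressed in an intermediate band. A toy sketch with invented amplitudes and thresholds, not one of the calibrated models from the review:

```python
import math

def calcium_plasticity(pre, post, c_pre=0.6, c_post=1.2, tau_ca=0.02,
                       theta_d=1.0, theta_p=1.5, g_p=20.0, g_d=2.0,
                       dt=1e-4, t_end=0.2, w0=0.5):
    """Threshold-on-calcium plasticity: pre/post spikes (times in s) add
    calcium jumps; w is potentiated while ca > theta_p and depressed
    while theta_d < ca <= theta_p."""
    pre_steps = {round(t / dt) for t in pre}
    post_steps = {round(t / dt) for t in post}
    ca, w = 0.0, w0
    for i in range(int(t_end / dt)):
        if i in pre_steps:
            ca += c_pre
        if i in post_steps:
            ca += c_post
        ca *= math.exp(-dt / tau_ca)           # calcium decay
        if ca > theta_p:
            w += g_p * (1.0 - w) * dt          # potentiation band
        elif ca > theta_d:
            w -= g_d * w * dt                  # depression band
    return w
```

With these settings, pairing pre at 0 ms with post at +5 ms drives calcium high enough for net potentiation, while the reverse order yields net depression — a minimal version of how calcium dynamics can encode spike order.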
Affiliation(s)
- Michael Graupner
- Center for Neural Science, New York University, New York City, NY, USA
248
Gilson M, Burkitt A, van Hemmen LJ. STDP in Recurrent Neuronal Networks. Front Comput Neurosci 2010; 4. [PMID: 20890448 PMCID: PMC2947928 DOI: 10.3389/fncom.2010.00023] [Citation(s) in RCA: 53] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2010] [Accepted: 06/28/2010] [Indexed: 11/13/2022] Open
Abstract
Recent results about spike-timing-dependent plasticity (STDP) in recurrently connected neurons are reviewed, with a focus on the relationship between the weight dynamics and the emergence of network structure. In particular, the evolution of synaptic weights in the two cases of incoming connections for a single neuron and recurrent connections are compared and contrasted. A theoretical framework is used that is based upon Poisson neurons with a temporally inhomogeneous firing rate and the asymptotic distribution of weights generated by the learning dynamics. Different network configurations examined in recent studies are discussed and an overview of the current understanding of STDP in recurrently connected neuronal networks is presented.
249
Helias M, Deger M, Rotter S, Diesmann M. Instantaneous non-linear processing by pulse-coupled threshold units. PLoS Comput Biol 2010; 6. [PMID: 20856583 PMCID: PMC2936519 DOI: 10.1371/journal.pcbi.1000929] [Citation(s) in RCA: 26] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2010] [Accepted: 08/10/2010] [Indexed: 11/18/2022] Open
Abstract
Contemporary theory of spiking neuronal networks is based on the linear response of the integrate-and-fire neuron model derived in the diffusion limit. We find that for non-zero synaptic weights, the response to transient inputs differs qualitatively from this approximation. The response is instantaneous rather than exhibiting low-pass characteristics, non-linearly dependent on the input amplitude, asymmetric for excitation and inhibition, and is promoted by a characteristic level of synaptic background noise. We show that at threshold the probability density of the potential drops to zero within the range of one synaptic weight and explain how this shapes the response. The novel mechanism is exhibited on the network level and is a generic property of pulse-coupled networks of threshold units. Our work demonstrates a fast-firing response of nerve cells that remained unconsidered in network analysis, because it is inaccessible by the otherwise successful linear response theory. For the sake of analytic tractability, this theory assumes infinitesimally weak synaptic coupling. However, realistic synaptic impulses cause a measurable deflection of the membrane potential. Here we quantify the effect of this pulse-coupling on the firing rate and the membrane-potential distribution. We demonstrate how the postsynaptic potentials give rise to a fast, non-linear rate transient present for excitatory, but not for inhibitory, inputs. It is particularly pronounced in the presence of a characteristic level of synaptic background noise. We show that feed-forward inhibition enhances the fast response on the network level. This enables a mode of information processing based on short-lived activity transients. Moreover, the non-linear neural response appears on a time scale that critically interacts with spike-timing dependent synaptic plasticity rules. Our results are derived for biologically realistic synaptic amplitudes, but also extend earlier work based on Gaussian white noise. 
The novel theoretical framework is generically applicable to any threshold unit governed by a stochastic differential equation driven by finite jumps. Therefore, our results are relevant for a wide range of biological, physical, and technical systems.
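The distinction the abstract draws between the diffusion limit and finite synaptic jumps can be illustrated with a minimal simulation: a leaky integrate-and-fire neuron whose membrane potential is deflected by the full synaptic weight at each Poisson input spike, rather than by an infinitesimal increment. This is only a sketch of the general setting, not the authors' model; the function name and all parameter values are illustrative assumptions.

```python
import numpy as np

def simulate_lif(T=1.0, dt=1e-4, tau=0.02, v_th=0.015, v_reset=0.0,
                 rate_in=8000.0, w=0.0001, seed=0):
    """Leaky integrate-and-fire neuron driven by finite synaptic jumps.

    Each presynaptic Poisson spike deflects the membrane potential by the
    full weight w (shot noise), in contrast to the diffusion approximation,
    which replaces the jumps by Gaussian white noise of matched mean and
    variance. Returns the output firing rate in spikes per second.
    """
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    v = v_reset
    spikes = 0
    for _ in range(n_steps):
        k = rng.poisson(rate_in * dt)   # number of input spikes this step
        v += -v / tau * dt + k * w      # leak term plus finite jumps
        if v >= v_th:                   # threshold crossing: emit a spike
            spikes += 1
            v = v_reset
    return spikes / T
```

Because each jump has finite size, the membrane potential can cross threshold within a single input event, which is the substrate of the instantaneous response discussed above; in the diffusion limit, threshold crossings are instead driven by the accumulated Gaussian fluctuation.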
|
250
|
Mayr CG, Partzsch J. Rate and pulse based plasticity governed by local synaptic state variables. Front Synaptic Neurosci 2010; 2:33. [PMID: 21423519 PMCID: PMC3059700 DOI: 10.3389/fnsyn.2010.00033] [Citation(s) in RCA: 28] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2010] [Accepted: 07/08/2010] [Indexed: 11/17/2022] Open
Abstract
Classically, action-potential-based learning paradigms such as the Bienenstock–Cooper–Munro (BCM) rule for pulse rates or spike-timing-dependent plasticity for pulse pairings have been experimentally demonstrated to evoke long-lasting synaptic weight changes (i.e., plasticity). However, several recent experiments have shown that plasticity also depends on the local dynamics at the synapse, such as membrane voltage, calcium time course and level, or dendritic spikes. In this paper, we introduce a formulation of the BCM rule which is based on the instantaneous postsynaptic membrane potential as well as the transmission profile of the presynaptic spike. While this rule incorporates only simple local voltage and current dynamics and is thus neither directly rate nor timing based, it can replicate a range of experiments, such as various rate and spike-pairing protocols, combinations of the two, as well as voltage-dependent plasticity. A detailed comparison of current plasticity models with respect to this range of experiments also demonstrates the efficacy of the new plasticity rule. All experiments can be replicated with a limited set of parameters, avoiding the overfitting problem of more involved plasticity rules.
Affiliation(s)
- Christian G Mayr
- Endowed Chair of Highly Parallel VLSI Systems and Neural Microelectronics, Institute of Circuits and Systems, Faculty of Electrical Engineering and Information Science, University of Technology Dresden Dresden, Sachsen, Germany
|