51
Zenke F, Gerstner W. Hebbian plasticity requires compensatory processes on multiple timescales. Philos Trans R Soc Lond B Biol Sci 2017; 372:20160259. [PMID: 28093557] [PMCID: PMC5247595] [DOI: 10.1098/rstb.2016.0259]
Abstract
We review a body of theoretical and experimental research on Hebbian and homeostatic plasticity, starting from a puzzling observation: while homeostasis of synapses found in experiments is a slow compensatory process, most mathematical models of synaptic plasticity use rapid compensatory processes (RCPs). Even worse, with the slow homeostatic plasticity reported in experiments, simulations of existing plasticity models cannot maintain network stability unless further control mechanisms are implemented. To solve this paradox, we suggest that in addition to slow forms of homeostatic plasticity there are RCPs which stabilize synaptic plasticity on short timescales. These rapid processes may include heterosynaptic depression triggered by episodes of high postsynaptic firing rate. While slower forms of homeostatic plasticity are not sufficient to stabilize Hebbian plasticity, they are important for fine-tuning neural circuits. Taken together, we suggest that learning and memory rely on an intricate interplay of diverse plasticity mechanisms on different timescales which jointly ensure stability and plasticity of neural circuits. This article is part of the themed issue 'Integrating Hebbian and homeostatic plasticity'.
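The core claim above, that Hebbian growth alone diverges while a rapid compensatory process bounds it, can be illustrated with a toy rate model (our own sketch, not the authors' simulations; all parameter values are illustrative assumptions):

```python
import numpy as np

# A single linear neuron with Hebbian plasticity alone is unstable; adding a
# rapid compensatory process (here, a fast multiplicative constraint on the
# total weight) keeps the weights bounded.

rng = np.random.default_rng(0)
x = rng.random(10)            # fixed presynaptic rates
eta = 0.1                     # Hebbian learning rate
w_target = 1.0                # target for the total synaptic weight

def simulate(rcp, steps=200):
    w = np.full(10, 0.1)
    for _ in range(steps):
        y = w @ x                     # postsynaptic rate (linear neuron)
        w += eta * y * x              # Hebbian growth: dw proportional to pre * post
        if rcp:                       # rapid compensatory process:
            w *= w_target / w.sum()   # renormalize on the plasticity timescale
    return w

w_unstable = simulate(rcp=False)
w_stable = simulate(rcp=True)
```

Without the fast renormalization the weight vector grows without bound; with it, the total weight stays pinned at `w_target` while relative weights can still change.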
Affiliation(s)
- Friedemann Zenke
- Department of Applied Physics, Stanford University, Stanford, CA 94305, USA
- Wulfram Gerstner
- Brain Mind Institute, School of Life Sciences and School of Computer and Communication Sciences, Ecole Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
52
Richter LMA, Gjorgjieva J. Understanding neural circuit development through theory and models. Curr Opin Neurobiol 2017; 46:39-47. [DOI: 10.1016/j.conb.2017.07.004]
53
Natural Firing Patterns Imply Low Sensitivity of Synaptic Plasticity to Spike Timing Compared with Firing Rate. J Neurosci 2016; 36:11238-11258. [PMID: 27807166] [DOI: 10.1523/jneurosci.0104-16.2016]
Abstract
Synaptic plasticity is sensitive to the rate and the timing of presynaptic and postsynaptic action potentials. In experimental protocols inducing plasticity, the imposed spike trains are typically regular and the relative timing between every presynaptic and postsynaptic spike is fixed. This is at odds with firing patterns observed in the cortex of intact animals, where cells fire irregularly and the timing between presynaptic and postsynaptic spikes varies. To investigate synaptic changes elicited by in vivo-like firing, we used numerical simulations and mathematical analysis of synaptic plasticity models. We found that the influence of spike timing on plasticity is weaker than expected from regular stimulation protocols. Moreover, when neurons fire irregularly, synaptic changes induced by precise spike timing can be equivalently induced by a modest firing rate variation. Our findings bridge the gap between existing results on synaptic plasticity and plasticity occurring in vivo, and challenge the dominant role of spike timing in plasticity.

SIGNIFICANCE STATEMENT: Synaptic plasticity, the change in efficacy of connections between neurons, is thought to underlie learning and memory. The dominant paradigm posits that the precise timing of neural action potentials (APs) is central for plasticity induction. This concept is based on experiments using highly regular and stereotyped patterns of APs, in stark contrast with natural neuronal activity. Using synaptic plasticity models, we investigated how irregular, in vivo-like activity shapes synaptic plasticity. We found that synaptic changes induced by precise timing of APs are much weaker than suggested by regular stimulation protocols, and can be equivalently induced by modest variations of the AP rate alone. Our results call into question the dominant role of precise AP timing for plasticity in natural conditions.
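As a rough illustration of this point (our own sketch, not the authors' models), one can average a standard pair-based STDP window over broadly jittered pre-post lags; the amplitudes, time constant, and jitter width below are illustrative assumptions:

```python
import numpy as np

# Expected weight change of a pair-based STDP window, averaged over jittered
# post-minus-pre lags, compared with the change from one fixed, precise lag.

A_plus, A_minus = 1.0, 1.0     # potentiation / depression amplitudes (assumed)
tau = 20.0                     # STDP time constant in ms (assumed)

def stdp(dt):
    """Weight change for post-minus-pre lag dt (ms)."""
    return np.where(dt >= 0, A_plus * np.exp(-dt / tau),
                    -A_minus * np.exp(dt / tau))

rng = np.random.default_rng(1)
fixed_lag = stdp(np.array([10.0]))[0]                    # regular protocol: +10 ms pairs
jittered = stdp(rng.normal(10.0, 50.0, 100_000)).mean()  # irregular, in vivo-like lags

# With broad jitter, potentiating and depressing lags largely cancel:
ratio = abs(jittered) / fixed_lag
```

The timing-driven change under jitter is a small fraction of the fixed-lag change, consistent with the paper's conclusion that timing effects are weak under irregular firing.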
54
Pedretti G, Milo V, Ambrogio S, Carboni R, Bianchi S, Calderoni A, Ramaswamy N, Spinelli AS, Ielmini D. Memristive neural network for on-line learning and tracking with brain-inspired spike timing dependent plasticity. Sci Rep 2017; 7:5288. [PMID: 28706303] [PMCID: PMC5509735] [DOI: 10.1038/s41598-017-05480-0]
Abstract
Brain-inspired computation can revolutionize information technology by introducing machines capable of recognizing patterns (images, speech, video) and interacting with the external world in a cognitive, humanlike way. Achieving this goal requires first gaining a detailed understanding of brain operation, and second identifying a scalable microelectronic technology capable of reproducing some of the inherent functions of the human brain, such as the high synaptic connectivity (~10^4) and the peculiar time-dependent synaptic plasticity. Here we demonstrate unsupervised learning and tracking in a spiking neural network with memristive synapses, where synaptic weights are updated via brain-inspired spike timing dependent plasticity (STDP). The synaptic conductance is updated by the local time-dependent superposition of pre- and post-synaptic spikes within a hybrid one-transistor/one-resistor (1T1R) memristive synapse. Only 2 synaptic states, namely the low resistance state (LRS) and the high resistance state (HRS), are sufficient to learn and recognize patterns. Unsupervised learning of a static pattern and tracking of a dynamic pattern of up to 4 × 4 pixels are demonstrated, paving the way for intelligent hardware technology with up-scaled memristive neural networks.
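A minimal sketch of the binary (LRS/HRS) STDP update described above, abstracted away from the device physics (the coincidence window below is an assumed parameter, not the chip's measured value):

```python
# Each 1T1R-style synapse holds one of two states: LRS (on) or HRS (off).
# A pre spike shortly before a post spike sets the synapse to LRS
# (potentiation); a pre spike shortly after resets it to HRS (depression).

LRS, HRS = 1, 0
window = 10.0  # ms, assumed STDP coincidence window

def update(state, t_pre, t_post):
    dt = t_post - t_pre
    if 0 < dt <= window:
        return LRS      # potentiation: set to low-resistance state
    if -window <= dt < 0:
        return HRS      # depression: reset to high-resistance state
    return state        # outside the window: no change
```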
Affiliation(s)
- G Pedretti
- Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano and IU.NET, Piazza L. da Vinci 32, 20133, Milano, Italy
- V Milo
- Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano and IU.NET, Piazza L. da Vinci 32, 20133, Milano, Italy
- S Ambrogio
- Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano and IU.NET, Piazza L. da Vinci 32, 20133, Milano, Italy
- R Carboni
- Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano and IU.NET, Piazza L. da Vinci 32, 20133, Milano, Italy
- S Bianchi
- Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano and IU.NET, Piazza L. da Vinci 32, 20133, Milano, Italy
- A Calderoni
- Micron Technology, Inc., Boise, ID, 83707, USA
- N Ramaswamy
- Micron Technology, Inc., Boise, ID, 83707, USA
- A S Spinelli
- Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano and IU.NET, Piazza L. da Vinci 32, 20133, Milano, Italy
- D Ielmini
- Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano and IU.NET, Piazza L. da Vinci 32, 20133, Milano, Italy
55
Christie IK, Miller P, Van Hooser SD. Cortical amplification models of experience-dependent development of selective columns and response sparsification. J Neurophysiol 2017; 118:874-893. [PMID: 28515285] [DOI: 10.1152/jn.00177.2017]
Abstract
The development of direction-selective cortical columns requires visual experience, but the neural circuits and plasticity mechanisms that are responsible for this developmental transition are unknown. To gain insight into the mechanisms that could underlie experience-dependent increases in selectivity, we explored families of cortical amplifier models that enhance weakly biased feedforward signals. Here we focused exclusively on possible contributions of cortico-cortical connections and took feedforward input to be constant. We modeled pairs of interconnected columns that received equal and oppositely biased inputs. In a single-element model of cortical columns, we found two ways that cortical columns could receive biased feedforward input and exhibit strong but unselective responses to stimuli: 1) within-column recurrent excitatory connections could be strong enough to amplify both strong and weak feedforward input, or 2) columns that received differently biased inputs could have strong excitatory cross-connections that destroy selectivity. A Hebbian plasticity rule combined with simulated experience with stimuli weakened these strong cross-connections across cortical columns, allowing the individual columns to respond selectively to their biased inputs. In a model that included both excitatory and inhibitory neurons in each column, an additional means of obtaining selectivity through the cortical circuit was uncovered: cross-column suppression of inhibition-stabilized networks. When each column operated as an inhibition-stabilized network, cross-column excitation onto inhibitory neurons forced competition between the columns but in a manner that did not involve strong null-direction inhibition, consistent with experimental measurements of direction selectivity in visual cortex. Experimental predictions of these possible contributions of cortical circuits are discussed.

NEW & NOTEWORTHY: Sensory circuits are initially constructed via mechanisms that are independent of sensory experience, but later refinement requires experience. We constructed models of how circuits that receive biased feedforward inputs can be initially unselective and then be modified by experience and plasticity so that the resulting circuit exhibits increased selectivity. We propose that neighboring cortical columns may initially exhibit coupling that is too strong for selectivity. Experience-dependent mechanisms decrease this coupling so individual columns can exhibit selectivity.
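The cross-connection scenario can be caricatured with a linear two-column amplifier (our own sketch, not the paper's full model; all weights and inputs are illustrative assumptions): solving the steady state r* = (I - W)^-1 h shows that strong cross-connections wash out the input bias, while pruning them restores selectivity.

```python
import numpy as np

# Two columns receive weakly biased feedforward input h. Strong cross-column
# excitation destroys selectivity; weakening it (as Hebbian plasticity does
# in the paper) restores selective responses.

h = np.array([1.0, 0.8])       # biased feedforward input to columns 1 and 2

def steady_state(w_self, w_cross):
    W = np.array([[w_self, w_cross], [w_cross, w_self]])
    return np.linalg.solve(np.eye(2) - W, h)   # linear fixed point r* = (I - W)^-1 h

def selectivity(r):
    return (r[0] - r[1]) / (r[0] + r[1])

sel_coupled = selectivity(steady_state(0.4, 0.4))   # strong cross-connections
sel_pruned = selectivity(steady_state(0.4, 0.0))    # after experience weakens them
```

With the cross-connections pruned, the columns inherit the full input selectivity (1 - 0.8)/(1 + 0.8) = 1/9; with strong coupling the response difference is diluted.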
Affiliation(s)
- Ian K Christie
- Department of Biology, Brandeis University, Waltham, Massachusetts; Volen National Center for Complex Systems, Brandeis University, Waltham, Massachusetts
- Paul Miller
- Department of Biology, Brandeis University, Waltham, Massachusetts; Volen National Center for Complex Systems, Brandeis University, Waltham, Massachusetts; Sloan-Swartz Center for Theoretical Neurobiology, Brandeis University, Waltham, Massachusetts
- Stephen D Van Hooser
- Department of Biology, Brandeis University, Waltham, Massachusetts; Volen National Center for Complex Systems, Brandeis University, Waltham, Massachusetts; Sloan-Swartz Center for Theoretical Neurobiology, Brandeis University, Waltham, Massachusetts
56
Scellier B, Bengio Y. Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation. Front Comput Neurosci 2017; 11:24. [PMID: 28522969] [PMCID: PMC5415673] [DOI: 10.3389/fncom.2017.00024]
Abstract
We introduce Equilibrium Propagation, a learning framework for energy-based models. It involves only one kind of neural computation, performed in both the first phase (when the prediction is made) and the second phase of training (after the target or prediction error is revealed). Although this algorithm computes the gradient of an objective function just like Backpropagation, it does not need a special computation or circuit for the second phase, where errors are implicitly propagated. Equilibrium Propagation shares similarities with Contrastive Hebbian Learning and Contrastive Divergence while solving the theoretical issues of both algorithms: our algorithm computes the gradient of a well-defined objective function. Because the objective function is defined in terms of local perturbations, the second phase of Equilibrium Propagation corresponds to only nudging the prediction (fixed point or stationary distribution) toward a configuration that reduces prediction error. In the case of a recurrent multi-layer supervised network, the output units are slightly nudged toward their target in the second phase, and the perturbation introduced at the output layer propagates backward in the hidden layers. We show that the signal "back-propagated" during this second phase corresponds to the propagation of error derivatives and encodes the gradient of the objective function, when the synaptic update corresponds to a standard form of spike-timing dependent plasticity. This work makes it more plausible that a mechanism similar to Backpropagation could be implemented by brains, since leaky integrator neural computation performs both inference and error back-propagation in our model. The only local difference between the two phases is whether synaptic changes are allowed or not. We also show experimentally that multi-layer recurrently connected networks with 1, 2, and 3 hidden layers can be trained by Equilibrium Propagation on the permutation-invariant MNIST task.
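The two-phase gradient estimate can be checked in a deliberately tiny scalar version of the framework (a sketch under strong simplifying assumptions, not the paper's recurrent multi-layer networks; all values are illustrative):

```python
# Scalar Equilibrium Propagation check: with energy E(s) = s^2/2 - w*x*s and
# cost C = (s - t)^2/2, the two-phase EP estimate of dC/dw converges to the
# analytic gradient as the nudging strength beta -> 0.

w, x, t = 0.5, 1.2, 1.0   # weight, input, target (illustrative values)
beta = 1e-4               # small nudging strength

s_free = w * x                               # free phase: solves dE/ds = 0
s_nudged = (w * x + beta * t) / (1 + beta)   # nudged phase: solves d(E + beta*C)/ds = 0

# EP estimate: dC/dw = lim_{beta->0} (1/beta) * [dE/dw(s_nudged) - dE/dw(s_free)],
# where dE/dw = -x*s
ep_grad = (1 / beta) * ((-x * s_nudged) - (-x * s_free))
true_grad = (w * x - t) * x                  # analytic dC/dw at the free fixed point
```

In this scalar case the EP estimate equals x(wx - t)/(1 + beta), which approaches the true gradient (wx - t)x as beta shrinks; the second phase is literally "nudging the prediction toward the target," as the abstract describes.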
Affiliation(s)
- Benjamin Scellier
- Département d'Informatique et de Recherche Opérationnelle, Montreal Institute for Learning Algorithms, Université de Montréal, Montreal, QC, Canada
57
Gopalakrishnan R, Basu A. Triplet Spike Time-Dependent Plasticity in a Floating-Gate Synapse. IEEE Trans Neural Netw Learn Syst 2017; 28:778-790. [PMID: 26841419] [DOI: 10.1109/tnnls.2015.2506740]
Abstract
Synapses play an important role in learning in a neural network; the learning rules that modify synaptic strength based on the timing difference between the pre- and postsynaptic spike occurrence are termed spike time-dependent plasticity (STDP) rules. The most commonly used rule posits a weight change based on the time difference between one presynaptic spike and one postsynaptic spike and is hence termed doublet STDP (D-STDP). However, D-STDP could not reproduce the results of many biological experiments; a triplet STDP (T-STDP) rule that considers triplets of spikes as the fundamental unit has recently been proposed to explain these observations. This paper describes the compact implementation of a synapse using a single floating-gate (FG) transistor that can store a weight in a nonvolatile manner and demonstrates the T-STDP learning rule by modifying drain voltages according to triplets of spikes. We describe a mathematical procedure to obtain control voltages for the FG device for T-STDP and also show measurement results from an FG synapse fabricated in a TSMC 0.35-μm CMOS process to support the theory. A possible very large scale integration implementation of the drain voltage waveform generator circuits is also presented with simulation results.
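The T-STDP rule itself (in the standard triplet form with pre and post traces, after Pfister and Gerstner) can be sketched in software; the trace time constants and amplitudes below are illustrative assumptions, not the measured device parameters:

```python
import numpy as np

# All-to-all triplet STDP: a pre trace r1 and two post traces o1 (fast) and
# o2 (slow). A pre spike depresses via o1 (doublet term); a post spike
# potentiates via r1, with an extra triplet term gated by o2.

tau_plus, tau_minus, tau_y = 16.8, 33.7, 114.0   # trace time constants (ms)
A2p, A2m, A3p = 5e-3, 7e-3, 6e-3                 # doublet / triplet amplitudes

def run(pre_times, post_times, w=0.5):
    events = sorted([(t, 'pre') for t in pre_times] +
                    [(t, 'post') for t in post_times])
    r1 = o1 = o2 = 0.0
    t_last = 0.0
    for t, kind in events:
        dt = t - t_last
        r1 *= np.exp(-dt / tau_plus)    # decay all traces to the current time
        o1 *= np.exp(-dt / tau_minus)
        o2 *= np.exp(-dt / tau_y)
        t_last = t
        if kind == 'pre':
            w -= o1 * A2m                # doublet depression (post before pre)
            r1 += 1.0
        else:
            w += r1 * (A2p + A3p * o2)   # doublet + triplet potentiation
            o1 += 1.0                    # o2 updated after the weight change
            o2 += 1.0
    return w

# pre -> post pairing at +10 ms, repeated: potentiation dominates
w_pairs = run(pre_times=np.arange(0, 1000, 100.0),
              post_times=np.arange(10, 1010, 100.0))
# post -> pre pairing: depression dominates
w_rev = run(pre_times=np.arange(10, 1010, 100.0),
            post_times=np.arange(0, 1000, 100.0))
```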
58
Zenke F, Gerstner W, Ganguli S. The temporal paradox of Hebbian learning and homeostatic plasticity. Curr Opin Neurobiol 2017; 43:166-176. [DOI: 10.1016/j.conb.2017.03.015]
59
Azghadi MR, Linares-Barranco B, Abbott D, Leong PHW. A Hybrid CMOS-Memristor Neuromorphic Synapse. IEEE Trans Biomed Circuits Syst 2017; 11:434-445. [PMID: 28026782] [DOI: 10.1109/tbcas.2016.2618351]
Abstract
Although data processing technology continues to advance at an astonishing rate, computers with brain-like processing capabilities still elude us. It is envisioned that such computers may be achieved by the fusion of neuroscience and nano-electronics to realize a brain-inspired platform. This paper proposes a high-performance nano-scale Complementary Metal Oxide Semiconductor (CMOS)-memristive circuit, which mimics a number of essential learning properties of biological synapses. The proposed synaptic circuit, composed of memristors and CMOS transistors, alters its memristance in response to timing differences between its pre- and post-synaptic action potentials, giving rise to a family of Spike Timing Dependent Plasticity (STDP) behaviours. The presented design advances preceding memristive synapse designs with regard to its ability to replicate essential behaviours characterised in a number of electrophysiological experiments performed in the animal brain, which involve higher-order spike interactions. Furthermore, the proposed hybrid device CMOS area is estimated as [Formula: see text] in a [Formula: see text] process; this represents a factor of ten reduction in area with respect to prior CMOS art. The new design is integrated with silicon neurons in a crossbar array structure amenable to large-scale neuromorphic architectures and may pave the way for future neuromorphic systems with spike timing-dependent learning features. These systems are emerging for deployment in various applications ranging from basic neuroscience research, to pattern recognition, to Brain-Machine-Interfaces.
60
Bengio Y, Mesnard T, Fischer A, Zhang S, Wu Y. STDP-Compatible Approximation of Backpropagation in an Energy-Based Model. Neural Comput 2017; 29:555-577. [DOI: 10.1162/neco_a_00934]
Abstract
We show that Langevin Markov chain Monte Carlo inference in an energy-based model with latent variables has the property that the early steps of inference, starting from a stationary point, correspond to propagating error gradients into internal layers, similar to backpropagation. The backpropagated error is with respect to output units that have received an outside driving force pushing them away from the stationary point. Backpropagated error gradients correspond to temporal derivatives with respect to the activation of hidden units. These lead to a weight update proportional to the product of the presynaptic firing rate and the temporal rate of change of the postsynaptic firing rate. Simulations and a theoretical argument suggest that this rate-based update rule is consistent with those associated with spike-timing-dependent plasticity. The ideas presented in this article could be an element of a theory for explaining how brains perform credit assignment in deep hierarchies as efficiently as backpropagation does, with neural computation corresponding to both approximate inference in continuous-valued latent variables and error backpropagation, at the same time.
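The rate-based update described above, presynaptic rate times the temporal derivative of the postsynaptic rate, admits a very small discretized sketch (our own illustration; `eta` and the rate values are assumptions):

```python
import numpy as np

# Weight change proportional to pre_rate * d(post_rate)/dt: a rising
# postsynaptic rate potentiates, a falling one depresses.

def weight_change(pre_rate, post_rates, dt=1.0, eta=0.01):
    d_post = np.diff(post_rates) / dt          # temporal derivative of post rate
    return eta * np.sum(pre_rate * d_post)

rising = weight_change(pre_rate=5.0, post_rates=np.array([1.0, 2.0, 3.0]))
falling = weight_change(pre_rate=5.0, post_rates=np.array([3.0, 2.0, 1.0]))
```

The sign of the update tracks whether the output unit is being driven toward or away from its stationary point, which is what lets the rule carry backpropagated error information.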
Affiliation(s)
- Yoshua Bengio
- Montreal Institute for Learning Algorithms, University of Montreal, Montreal H3T 1J4, Quebec, Canada, and Canadian Institute for Advanced Research
- Thomas Mesnard
- Computer Science Department, École Normale Supérieure, Paris 75005, France
- Asja Fischer
- Institute of Computer Science, University of Bonn, Bonn 53117, Germany
- Saizheng Zhang
- Montreal Institute for Learning Algorithms, University of Montreal, Montreal H3T 1J4, Quebec, Canada
- Yuhuai Wu
- University of Toronto, Toronto, Ontario, M5S 1A1, Canada
61
Pedrosa V, Clopath C. The Role of Neuromodulators in Cortical Plasticity. A Computational Perspective. Front Synaptic Neurosci 2017; 8:38. [PMID: 28119596] [PMCID: PMC5222801] [DOI: 10.3389/fnsyn.2016.00038]
Abstract
Neuromodulators play a ubiquitous role across the brain in regulating plasticity. With recent advances in experimental techniques, it is possible to study the effects of diverse neuromodulatory states in specific brain regions. Neuromodulators are thought to impact plasticity predominantly through two mechanisms: the gating of plasticity and the upregulation of neuronal activity. However, the consequences of these mechanisms are poorly understood and there is a need for both experimental and theoretical exploration. Here we illustrate how neuromodulatory state affects cortical plasticity through these two mechanisms. First, we explore the ability of neuromodulators to gate plasticity by reshaping the learning window for spike-timing-dependent plasticity. Using a simple computational model, we implement four different learning rules and demonstrate their effects on receptive field plasticity. We then compare the neuromodulatory effects of upregulating learning rate versus the effects of upregulating neuronal activity. We find that these seemingly similar mechanisms do not yield the same outcome: upregulating neuronal activity can lead to either a broadening or a sharpening of receptive field tuning, whereas upregulating learning rate only intensifies the sharpening of receptive field tuning. This simple model demonstrates the need for further exploration of the rich landscape of neuromodulator-mediated plasticity. Future experiments, coupled with biologically detailed computational models, will elucidate the diversity of mechanisms by which neuromodulatory state regulates cortical plasticity.
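The first mechanism, gating plasticity by reshaping the STDP learning window, can be sketched as a neuromodulatory gain on each lobe of the window (our own illustration of the general idea, not the paper's four specific rules; all gains are assumptions):

```python
import numpy as np

# A neuromodulatory factor independently rescales the potentiation (LTP) and
# depression (LTD) lobes of a pair-based STDP window, reshaping the window.

tau = 20.0  # ms, assumed window time constant

def stdp_window(dt, g_ltp=1.0, g_ltd=1.0):
    """Weight change for post-minus-pre lag dt, gated by neuromodulation."""
    return np.where(dt >= 0, g_ltp * np.exp(-dt / tau),
                    -g_ltd * np.exp(dt / tau))

dt = 10.0
baseline = stdp_window(dt)
gated_up = stdp_window(dt, g_ltp=2.0)              # neuromodulator boosts LTP
ltd_only = stdp_window(dt, g_ltp=0.0, g_ltd=1.0)   # LTP gated off entirely
```

Setting the gains per neuromodulatory state is one way to express "gating of plasticity" separately from "upregulation of neuronal activity," which would instead scale the firing rates entering the rule.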
Affiliation(s)
- Victor Pedrosa
- Department of Bioengineering, Imperial College London, London, UK; CAPES Foundation, Ministry of Education of Brazil, Brasilia, Brazil
- Claudia Clopath
- Department of Bioengineering, Imperial College London, London, UK
62
Han R, Wang J, Miao R, Deng B, Qin Y, Yu H, Wei X. Propagation of Collective Temporal Regularity in Noisy Hierarchical Networks. IEEE Trans Neural Netw Learn Syst 2017; 28:191-205. [PMID: 28055909] [DOI: 10.1109/tnnls.2015.2502993]
Abstract
Neuronal communication between different brain areas is achieved in terms of spikes. Consequently, spike-time regularity is closely related to many cognitive tasks and to the timing precision of neural information processing. A recent experiment on primate parietal cortex reports that spike-time regularity increases consistently from primary sensory to higher cortical regions. This observation conflicts with the influential view that spikes in the neocortex are fundamentally irregular. To uncover the underlying network mechanism, we construct a multilayered feedforward neural information transmission pathway and investigate how spike-time regularity evolves across subsequent layers. Numerical results reveal that despite the obviously irregular spiking patterns in the first several layers, neurons in downstream layers can generate rather regular spikes, depending on the network topology. In particular, we find that collective temporal regularity in deeper layers exhibits resonance-like behavior with respect to both synaptic connection probability and synaptic weight, i.e., an optimal topology parameter maximizes the spike-timing regularity. Furthermore, it is demonstrated that synaptic properties, including inhibition, synaptic transient dynamics, and plasticity, have significant impacts on spike-timing regularity propagation. The emergence of increasingly regular spiking (RS) patterns in higher parietal regions can thus be viewed as a natural consequence of spiking activity propagation between different brain areas. Finally, we validate an important function served by increased RS: promoting reliable propagation of spike-rate signals across downstream layers.
63
Brito CSN, Gerstner W. Nonlinear Hebbian Learning as a Unifying Principle in Receptive Field Formation. PLoS Comput Biol 2016; 12:e1005070. [PMID: 27690349] [PMCID: PMC5045191] [DOI: 10.1371/journal.pcbi.1005070]
Abstract
The development of sensory receptive fields has been modeled in the past by a variety of models including normative models such as sparse coding or independent component analysis and bottom-up models such as spike-timing dependent plasticity or the Bienenstock-Cooper-Munro model of synaptic plasticity. Here we show that the above variety of approaches can all be unified into a single common principle, namely nonlinear Hebbian learning. When nonlinear Hebbian learning is applied to natural images, receptive field shapes were strongly constrained by the input statistics and preprocessing, but exhibited only modest variation across different choices of nonlinearities in neuron models or synaptic plasticity rules. Neither overcompleteness nor sparse network activity are necessary for the development of localized receptive fields. The analysis of alternative sensory modalities such as auditory models or V2 development leads to the same conclusions. In all examples, receptive fields can be predicted a priori by reformulating an abstract model as nonlinear Hebbian learning. Thus nonlinear Hebbian learning and natural statistics can account for many aspects of receptive field formation across models and sensory modalities.

The question of how the brain self-organizes to develop precisely tuned neurons has puzzled neuroscientists at least since the discoveries of Hubel and Wiesel. In the past decades, a variety of theories and models have been proposed to describe receptive field formation, notably of V1 simple cells, from natural inputs. We cut through the jungle of candidate explanations by demonstrating that in fact a single principle is sufficient to explain receptive field development. Our results follow from two major insights. First, we show that many representative models of sensory development are in fact implementing variations of a common principle: nonlinear Hebbian learning. Second, we reveal that nonlinear Hebbian learning is sufficient for receptive field formation through sensory inputs. The surprising result is that our findings are robust to the specific details of a model, which allows for robust predictions about the learned receptive fields. Nonlinear Hebbian learning is therefore general in two senses: it applies to many models developed by theoreticians, and to many sensory modalities studied by experimental neuroscientists.
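The principle can be sketched in a few lines (our own illustration, not the paper's simulations): a normalized nonlinear Hebbian rule dw proportional to f(w·x)x with an expansive f picks out the heavy-tailed input direction, as in ICA-style receptive field formation. All parameters are illustrative assumptions.

```python
import numpy as np

# Whitened 2-D input: one heavy-tailed (Laplacian) and one Gaussian component,
# both unit variance. Nonlinear Hebbian learning with f(y) = y^3 plus weight
# normalization aligns the weight vector with the heavy-tailed direction.

rng = np.random.default_rng(2)
n = 50_000
x = np.stack([rng.laplace(0, 1 / np.sqrt(2), n),   # unit-variance Laplacian
              rng.normal(0, 1, n)])                # unit-variance Gaussian

w = np.array([0.6, 0.8])           # start tilted toward the Gaussian direction
eta = 0.1
f = lambda y: y ** 3               # expansive nonlinearity

for _ in range(100):
    y = w @ x
    w = w + eta * (f(y) @ x.T) / n   # batch-averaged nonlinear Hebbian step
    w = w / np.linalg.norm(w)        # normalization (a rapid compensatory step)

alignment = abs(w[0])   # approaches 1 as w aligns with the heavy-tailed component
```

Swapping in a different expansive nonlinearity changes the trajectory only modestly, which is the robustness the abstract emphasizes.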
Affiliation(s)
- Carlos S. N. Brito
- School of Computer and Communication Sciences and School of Life Science, Brain Mind Institute, Ecole Polytechnique Federale de Lausanne, Lausanne EPFL, Switzerland
- Gatsby Computational Neuroscience Unit, University College London, London, United Kingdom
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Science, Brain Mind Institute, Ecole Polytechnique Federale de Lausanne, Lausanne EPFL, Switzerland
64
Li Y, Kulvicius T, Tetzlaff C. Induction and Consolidation of Calcium-Based Homo- and Heterosynaptic Potentiation and Depression. PLoS One 2016; 11:e0161679. [PMID: 27560350] [PMCID: PMC4999190] [DOI: 10.1371/journal.pone.0161679]
Abstract
The adaptive mechanisms of homo- and heterosynaptic plasticity play an important role in learning and memory. In order to maintain plasticity-induced changes for longer time scales (up to several days), they have to be consolidated by transferring them from a short-lasting early-phase to a long-lasting late-phase state. The underlying processes of this synaptic consolidation are already well known for homosynaptic plasticity; however, it is not clear whether the same processes also enable the induction and consolidation of heterosynaptic plasticity. In this study, by extending a generic calcium-based plasticity model with the processes of synaptic consolidation, we show in simulations that heterosynaptic plasticity can indeed be induced and, furthermore, consolidated by the same underlying processes as homosynaptic plasticity. We also show that local diffusion processes can restrict the heterosynaptic effect to a few synapses neighboring the homosynaptically changed ones. Taken together, this generic model reproduces many experimental results of synaptic tagging and consolidation, provides several predictions for heterosynaptic induction and consolidation, and yields insights into the complex interactions between homo- and heterosynaptic plasticity over a broad variety of timescales (minutes to days) and spatial scales (several micrometers).
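The threshold logic of a generic calcium-based rule of this kind can be sketched as follows (illustrative thresholds and rates, not the paper's fitted values; the paper's model additionally includes consolidation and diffusion processes):

```python
# Calcium-threshold plasticity: calcium above theta_p drives potentiation,
# calcium between theta_d and theta_p drives depression, and calcium below
# theta_d leaves the weight unchanged.

theta_d, theta_p = 1.0, 1.8    # depression and potentiation thresholds (assumed)
gamma_d, gamma_p = 0.02, 0.05  # depression / potentiation rates (assumed)

def weight_step(w, calcium, dt=1.0):
    if calcium >= theta_p:
        return w + gamma_p * dt   # potentiation (homo- or heterosynaptic)
    if calcium >= theta_d:
        return w - gamma_d * dt   # depression
    return w                      # below both thresholds: no change

w_pot = weight_step(0.5, calcium=2.0)
w_dep = weight_step(0.5, calcium=1.2)
w_none = weight_step(0.5, calcium=0.5)
```

Because the rule depends only on the local calcium level, calcium spillover from a strongly stimulated synapse can push a neighbor across a threshold, which is one way heterosynaptic changes arise in such models.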
Affiliation(s)
- Yinyun Li
- III. Institute of Physics – Biophysics, Georg-August-University, 37077 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, 37077 Göttingen, Germany
- School of System Science, Beijing Normal University, 100875 Beijing, China
- Tomas Kulvicius
- III. Institute of Physics – Biophysics, Georg-August-University, 37077 Göttingen, Germany
- Maersk Mc-Kinney Moller Institute, University of Southern Denmark, 5230 Odense, Denmark
- Christian Tetzlaff
- Bernstein Center for Computational Neuroscience, Georg-August-University, 37077 Göttingen, Germany
- Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany
- Department of Neurobiology, Weizmann Institute of Science, 76100 Rehovot, Israel
65
Robinson BS, Berger TW, Song D. Identification of Stable Spike-Timing-Dependent Plasticity from Spiking Activity with Generalized Multilinear Modeling. Neural Comput 2016; 28:2320-2351. [PMID: 27557101] [DOI: 10.1162/neco_a_00883]
Abstract
Characterization of long-term activity-dependent plasticity from behaviorally driven spiking activity is important for understanding the underlying mechanisms of learning and memory. In this letter, we present a computational framework for quantifying spike-timing-dependent plasticity (STDP) during behavior by identifying a functional plasticity rule solely from spiking activity. First, we formulate a flexible point-process spiking neuron model structure with STDP, which includes functions that characterize the stationary and plastic properties of the neuron. The STDP model includes a novel function for prolonged plasticity induction, as well as a more typical function for synaptic weight change based on the relative timing of input-output spike pairs. Consideration for system stability is incorporated with weight-dependent synaptic modification. Next, we formalize an estimation technique using a generalized multilinear model (GMLM) structure with basis function expansion. The weight-dependent synaptic modification adds a nonlinearity to the model, which is addressed with an iterative unconstrained optimization approach. Finally, we demonstrate successful model estimation on simulated spiking data and show that all model functions can be estimated accurately with this method across a variety of simulation parameters, such as number of inputs, output firing rate, input firing type, and simulation time. Since this approach requires only naturally generated spikes, it can be readily applied to behaving animal studies to characterize the underlying mechanisms of learning and memory.
Affiliation(s)
- Brian S Robinson
- Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089, U.S.A.
- Theodore W Berger
- Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089, U.S.A.
- Dong Song
- Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089, U.S.A.

66
Sweeney Y, Clopath C. Emergent spatial synaptic structure from diffusive plasticity. Eur J Neurosci 2016; 45:1057-1067. [DOI: 10.1111/ejn.13279]
Affiliation(s)
- Yann Sweeney
- Department of Bioengineering, Imperial College London, South Kensington Campus, London SW7 2AZ, UK
- Claudia Clopath
- Department of Bioengineering, Imperial College London, South Kensington Campus, London SW7 2AZ, UK

67
Gjorgjieva J, Drion G, Marder E. Computational implications of biophysical diversity and multiple timescales in neurons and synapses for circuit performance. Curr Opin Neurobiol 2016; 37:44-52. [PMID: 26774694] [PMCID: PMC4860045] [DOI: 10.1016/j.conb.2015.12.008]
Abstract
Despite advances in experimental and theoretical neuroscience, we are still trying to identify key biophysical details that are important for characterizing the operation of brain circuits. Biological mechanisms at the level of single neurons and synapses can be combined as 'building blocks' to generate circuit function. We focus on the importance of capturing multiple timescales when describing these intrinsic and synaptic components. Whether inherent in the ionic currents, the neuron's complex morphology, or the neurotransmitter composition of synapses, these multiple timescales prove crucial for capturing the variability and richness of circuit output and enhancing the information-carrying capacity observed across nervous systems.
Affiliation(s)
- Julijana Gjorgjieva
- Volen Center and Biology Department, Brandeis University, Waltham, MA 02454, United States
- Guillaume Drion
- Volen Center and Biology Department, Brandeis University, Waltham, MA 02454, United States; Department of Electrical Engineering and Computer Science, University of Liège, Liège B-4000, Belgium
- Eve Marder
- Volen Center and Biology Department, Brandeis University, Waltham, MA 02454, United States

68
Roy A, Osik JJ, Ritter NJ, Wang S, Shaw JT, Fiser J, Van Hooser SD. Optogenetic spatial and temporal control of cortical circuits on a columnar scale. J Neurophysiol 2015; 115:1043-1062. [PMID: 26631152] [DOI: 10.1152/jn.00960.2015]
Abstract
Many circuits in the mammalian brain are organized in a topographic or columnar manner. These circuits could be activated-in ways that reveal circuit function or restore function after disease-by an artificial stimulation system that is capable of independently driving local groups of neurons. Here we present a simple custom microscope called ProjectorScope 1 that incorporates off-the-shelf parts and a liquid crystal display (LCD) projector to stimulate surface brain regions that express channelrhodopsin-2 (ChR2). In principle, local optogenetic stimulation of the brain surface with optical projection systems might not produce local activation of a highly interconnected network like the cortex, because of potential stimulation of axons of passage or extended dendritic trees. However, here we demonstrate that the combination of virally mediated ChR2 expression levels and the light intensity of ProjectorScope 1 is capable of producing local spatial activation with a resolution of ∼200-300 μm. We use the system to examine the role of cortical activity in the experience-dependent emergence of motion selectivity in immature ferret visual cortex. We find that optogenetic cortical activation alone-without visual stimulation-is sufficient to produce increases in motion selectivity, suggesting the presence of a sharpening mechanism that does not require precise spatiotemporal activation of the visual system. These results demonstrate that optogenetic stimulation can sculpt the developing brain.
Affiliation(s)
- Arani Roy
- Department of Biology, Brandeis University, Waltham, Massachusetts; Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts
- Jason J Osik
- Department of Biology, Brandeis University, Waltham, Massachusetts; Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts
- Neil J Ritter
- Department of Biology, Brandeis University, Waltham, Massachusetts; Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts
- Shen Wang
- Department of Biology, Brandeis University, Waltham, Massachusetts; Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts
- James T Shaw
- Department of Biology, Brandeis University, Waltham, Massachusetts
- József Fiser
- Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts; Sloan-Swartz Center for Theoretical Neurobiology, Brandeis University, Waltham, Massachusetts; Department of Cognitive Sciences, Central European University, Budapest, Hungary
- Stephen D Van Hooser
- Department of Biology, Brandeis University, Waltham, Massachusetts; Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts; Sloan-Swartz Center for Theoretical Neurobiology, Brandeis University, Waltham, Massachusetts

69
Capitán JA, Manrubia S. Demography-based adaptive network model reproduces the spatial organization of human linguistic groups. Phys Rev E Stat Nonlin Soft Matter Phys 2015; 92:062811. [PMID: 26764748] [DOI: 10.1103/physreve.92.062811]
Abstract
The distribution of human linguistic groups presents a number of interesting and nontrivial patterns. The distributions of the number of speakers per language and the area each group covers follow log-normal distributions, while population and area fulfill an allometric relationship. The topology of networks of spatial contacts between different linguistic groups has been recently characterized, showing atypical properties of the degree distribution and clustering, among others. Human demography, spatial conflicts, and the construction of networks of contacts between linguistic groups are mutually dependent processes. Here we introduce an adaptive network model that takes all of them into account and successfully reproduces, using only four model parameters, not only those features of linguistic groups already described in the literature, but also correlations between demographic and topological properties uncovered in this work. Besides their relevance when modeling and understanding processes related to human biogeography, our adaptive network model admits a number of generalizations that broaden its scope and make it suitable to represent interactions between agents based on population dynamics and competition for space.
Affiliation(s)
- José A Capitán
- Departamento de Matemática Aplicada, Universidad Politécnica de Madrid, Av. Juan de Herrera 4, 28040 Madrid, Spain
- Susanna Manrubia
- Centro Nacional de Biotecnología (CSIC), C/ Darwin 3, 28049 Madrid, Spain

70

71
Veliz-Cuba A, Shouval HZ, Josić K, Kilpatrick ZP. Networks that learn the precise timing of event sequences. J Comput Neurosci 2015; 39:235-254. [PMID: 26334992] [DOI: 10.1007/s10827-015-0574-4]
Abstract
Neuronal circuits can learn and replay firing patterns evoked by sequences of sensory stimuli. After training, a brief cue can trigger a spatiotemporal pattern of neural activity similar to that evoked by a learned stimulus sequence. Network models show that such sequence learning can occur through the shaping of feedforward excitatory connectivity via long term plasticity. Previous models describe how event order can be learned, but they typically do not explain how precise timing can be recalled. We propose a mechanism for learning both the order and precise timing of event sequences. In our recurrent network model, long term plasticity leads to the learning of the sequence, while short term facilitation enables temporally precise replay of events. Learned synaptic weights between populations determine the time necessary for one population to activate another. Long term plasticity adjusts these weights so that the trained event times are matched during playback. While we chose short term facilitation as a time-tracking process, we also demonstrate that other mechanisms, such as spike rate adaptation, can fulfill this role. We also analyze the impact of trial-to-trial variability, showing how observational errors as well as neuronal noise result in variability in learned event times. The dynamics of the playback process determines how stochasticity is inherited in learned sequence timings. Future experiments that characterize such variability can therefore shed light on the neural mechanisms of sequence learning.
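The "time-tracking" role the model assigns to short-term facilitation can be pictured with a minimal facilitation variable in the style of Tsodyks-Markram dynamics: the variable builds up across a spike train and decays between volleys, so its value implicitly encodes elapsed time. This is a sketch under assumed parameters (`U`, `tau_f`), not the network model of the paper.

```python
import math

def facilitation_trace(spike_times, U=0.1, tau_f=1000.0):
    """Facilitation variable u sampled just after each spike.

    u relaxes toward its baseline U with time constant tau_f (ms) and jumps
    by U * (1 - u) at every spike, so closely spaced spikes drive u upward
    while long gaps let it decay back toward U.
    """
    u, t_prev, trace = U, None, []
    for t in spike_times:
        if t_prev is not None:
            # exponential decay toward baseline since the previous spike
            u = U + (u - U) * math.exp(-(t - t_prev) / tau_f)
        u = u + U * (1.0 - u)  # facilitation increment on the spike
        trace.append(u)
        t_prev = t
    return trace
```

Because the increment is proportional to (1 - u), the variable stays below 1 while remaining monotonically sensitive to recent activity.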
Affiliation(s)
- Alan Veliz-Cuba
- Department of Mathematics, University of Houston, Houston, TX, 77204, USA
- Harel Z Shouval
- Department of Neurobiology and Anatomy, University of Texas Medical School, Houston, TX, 77030, USA
- Krešimir Josić
- Department of Mathematics, University of Houston, Houston, TX, 77204, USA; Department of Biology, University of Houston, Houston, TX, 77204, USA

72
Logiaco L, Quilodran R, Procyk E, Arleo A. Spatiotemporal Spike Coding of Behavioral Adaptation in the Dorsal Anterior Cingulate Cortex. PLoS Biol 2015; 13:e1002222. [PMID: 26266537] [PMCID: PMC4534466] [DOI: 10.1371/journal.pbio.1002222]
Abstract
The frontal cortex controls behavioral adaptation in environments governed by complex rules. Many studies have established the relevance of firing rate modulation after informative events signaling whether and how to update the behavioral policy. However, whether the spatiotemporal features of these neuronal activities contribute to encoding imminent behavioral updates remains unclear. We investigated this issue in the dorsal anterior cingulate cortex (dACC) of monkeys while they adapted their behavior based on their memory of feedback from past choices. We analyzed spike trains of both single units and pairs of simultaneously recorded neurons using an algorithm that emulates different biologically plausible decoding circuits. This method permits the assessment of the performance of both spike-count and spike-timing sensitive decoders. In response to the feedback, single neurons emitted stereotypical spike trains whose temporal structure identified informative events with higher accuracy than mere spike count. The optimal decoding time scale was in the range of 70-200 ms, which is significantly shorter than the memory time scale required by the behavioral task. Importantly, the temporal spiking patterns of single units were predictive of the monkeys' behavioral response time. Furthermore, some features of these spiking patterns often varied between jointly recorded neurons. All together, our results suggest that dACC drives behavioral adaptation through complex spatiotemporal spike coding. They also indicate that downstream networks, which decode dACC feedback signals, are unlikely to act as mere neural integrators.
Affiliation(s)
- Laureline Logiaco
- INSERM, U968, Paris, France
- Sorbonne Universités, UPMC Univ Paris 06, UMR_S 968, Institut de la Vision, Paris, France
- CNRS, UMR_7210, Paris, France
- René Quilodran
- Escuela de Medicina, Departamento de Pre-clínicas, Universidad de Valparaíso, Hontaneda, Valparaíso, Chile
- Emmanuel Procyk
- Stem Cell and Brain Research Institute, Institut National de la Santé et de la Recherche Médicale U846, 69500 Bron, France
- Université de Lyon, Université Lyon 1, Lyon, France
- Angelo Arleo
- INSERM, U968, Paris, France
- Sorbonne Universités, UPMC Univ Paris 06, UMR_S 968, Institut de la Vision, Paris, France
- CNRS, UMR_7210, Paris, France

73
Abstract
Synapses are highly plastic and are modified by changes in patterns of neural activity or sensory experience. Plasticity of cortical excitatory synapses is thought to be important for learning and memory, leading to alterations in sensory representations and cognitive maps. However, these changes must be coordinated across other synapses within local circuits to preserve neural coding schemes and the organization of excitatory and inhibitory inputs, i.e., excitatory-inhibitory balance. Recent studies indicate that inhibitory synapses are also plastic and are controlled directly by a large number of neuromodulators, particularly during episodes of learning. Many modulators transiently alter excitatory-inhibitory balance by decreasing inhibition, and thus disinhibition has emerged as a major mechanism by which neuromodulation might enable long-term synaptic modifications naturally. This review examines the relationships between neuromodulation and synaptic plasticity, focusing on the induction of long-term changes that collectively enhance cortical excitatory-inhibitory balance for improving perception and behavior.
Affiliation(s)
- Robert C Froemke
- Skirball Institute for Biomolecular Medicine, Neuroscience Institute, and Departments of Otolaryngology, Neuroscience, and Physiology, New York University School of Medicine, New York, NY 10016

74
Sadeh S, Clopath C, Rotter S. Emergence of Functional Specificity in Balanced Networks with Synaptic Plasticity. PLoS Comput Biol 2015; 11:e1004307. [PMID: 26090844] [PMCID: PMC4474917] [DOI: 10.1371/journal.pcbi.1004307]
Abstract
In rodent visual cortex, synaptic connections between orientation-selective neurons are unspecific at the time of eye opening, and become to some degree functionally specific only later during development. An explanation for this two-stage process was proposed in terms of Hebbian plasticity based on visual experience that would eventually enhance connections between neurons with similar response features. For this to work, however, two conditions must be satisfied: First, orientation selective neuronal responses must exist before specific recurrent synaptic connections can be established. Second, Hebbian learning must be compatible with the recurrent network dynamics contributing to orientation selectivity, and the resulting specific connectivity must remain stable for unspecific background activity. Previous studies have mainly focused on very simple models, where the receptive fields of neurons were essentially determined by feedforward mechanisms, and where the recurrent network was small, lacking the complex recurrent dynamics of large-scale networks of excitatory and inhibitory neurons. Here we studied the emergence of functionally specific connectivity in large-scale recurrent networks with synaptic plasticity. Our results show that balanced random networks, which already exhibit highly selective responses at eye opening, can develop feature-specific connectivity if appropriate rules of synaptic plasticity are invoked within and between excitatory and inhibitory populations. If these conditions are met, the initial orientation selectivity guides the process of Hebbian learning and, as a result, functionally specific and a surplus of bidirectional connections emerge. 
Our results thus demonstrate the cooperation of synaptic plasticity and recurrent dynamics in large-scale functional networks with realistic receptive fields, highlight the role of inhibition as a critical element in this process, and pave the way for further computational studies of sensory processing in neocortical network models equipped with synaptic plasticity.
In primary visual cortex of mammals, neurons are selective to the orientation of contrast edges. In some species, such as cats and monkeys, neurons preferring similar orientations are adjacent on the cortical surface, leading to smooth orientation maps. In rodents, in contrast, such spatial orientation maps do not exist, and neurons of different specificities are mixed in a salt-and-pepper fashion. During development, however, a “functional” map of orientation selectivity emerges, where connections between neurons of similar preferred orientations are selectively enhanced. Here we show how such feature-specific connectivity can arise in realistic neocortical networks of excitatory and inhibitory neurons. Our results demonstrate how recurrent dynamics can work in cooperation with synaptic plasticity to form networks where neurons preferring similar stimulus features connect more strongly together. Such networks, in turn, are known to enhance the specificity of neuronal responses to a stimulus. Our study thus reveals how self-organizing connectivity in neuronal networks enables them to achieve new or enhanced functions, and it underlines the essential role of recurrent inhibition and plasticity in this process.
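As a toy illustration of why a correlation-driven Hebbian rule favors like-to-like connectivity, the expected connection bias between two orientation-tuned neurons can be written as a decreasing function of the difference in their preferred orientations (180-degree periodic, since orientation is defined modulo 180°). The kernel and the sharpness parameter `kappa` are hypothetical, not taken from the study.

```python
import numpy as np

def like_to_like(pref_orients, kappa=2.0):
    """Pairwise connection bias for orientation preferences given in degrees.

    Similarly tuned neurons respond to the same stimuli and therefore
    co-activate, so a Hebbian rule preferentially strengthens their mutual
    connections. The bias is 1 for identical preferences and decays toward
    exp(-2 * kappa) for orthogonal ones; the factor of 2 inside the cosine
    makes the kernel 180-degree periodic.
    """
    d = np.subtract.outer(pref_orients, pref_orients)         # pairwise differences (deg)
    return np.exp(kappa * (np.cos(np.deg2rad(2.0 * d)) - 1))  # 1 on the diagonal
```

The matrix is symmetric, consistent with the surplus of bidirectional connections between similarly tuned neurons that the simulations report.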
Affiliation(s)
- Sadra Sadeh
- Bernstein Center Freiburg & Faculty of Biology, University of Freiburg, Freiburg im Breisgau, Germany
- Bioengineering Department, Imperial College London, London, United Kingdom
- Claudia Clopath
- Bioengineering Department, Imperial College London, London, United Kingdom
- Stefan Rotter
- Bernstein Center Freiburg & Faculty of Biology, University of Freiburg, Freiburg im Breisgau, Germany

75
Bauer R, Zubler F, Pfister S, Hauri A, Pfeiffer M, Muir DR, Douglas RJ. Developmental self-construction and -configuration of functional neocortical neuronal networks. PLoS Comput Biol 2014; 10:e1003994. [PMID: 25474693] [PMCID: PMC4256067] [DOI: 10.1371/journal.pcbi.1003994]
Abstract
The prenatal development of neural circuits must provide sufficient configuration to support at least a set of core postnatal behaviors. Although knowledge of various genetic and cellular aspects of development is accumulating rapidly, there is less systematic understanding of how these various processes play together in order to construct such functional networks. Here we make some steps toward such understanding by demonstrating through detailed simulations how a competitive co-operative (‘winner-take-all’, WTA) network architecture can arise by development from a single precursor cell. This precursor is granted a simplified gene regulatory network that directs cell mitosis, differentiation, migration, neurite outgrowth and synaptogenesis. Once initial axonal connection patterns are established, their synaptic weights undergo homeostatic unsupervised learning that is shaped by wave-like input patterns. We demonstrate how this autonomous genetically directed developmental sequence can give rise to self-calibrated WTA networks, and compare our simulation results with biological data. Models of learning in artificial neural networks generally assume that the neurons and approximate network are given, and then learning tunes the synaptic weights. By contrast, we address the question of how an entire functional neuronal network containing many differentiated neurons and connections can develop from only a single progenitor cell. We chose a winner-take-all network as the developmental target, because it is a computationally powerful circuit, and a candidate motif of neocortical networks. The key aspect of this challenge is that the developmental mechanisms must be locally autonomous as in Biology: They cannot depend on global knowledge or supervision. 
We have explored this developmental process by simulating in physical detail the fundamental biological behaviors, such as cell proliferation, neurite growth and synapse formation that give rise to the structural connectivity observed in the superficial layers of the neocortex. These differentiated, approximately connected neurons then adapt their synaptic weights homeostatically to obtain a uniform electrical signaling activity before going on to organize themselves according to the fundamental correlations embedded in a noisy wave-like input signal. In this way the precursor expands itself through development and unsupervised learning into winner-take-all functionality and orientation selectivity in a biologically plausible manner.
Affiliation(s)
- Roman Bauer
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
- School of Computing Science, Newcastle University, Newcastle upon Tyne, United Kingdom
- Frédéric Zubler
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
- Department of Neurology, Inselspital Bern, Bern University Hospital, University of Bern, Bern, Switzerland
- Sabina Pfister
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
- Andreas Hauri
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
- Michael Pfeiffer
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
- Dylan R. Muir
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
- Biozentrum, University of Basel, Basel, Switzerland
- Rodney J. Douglas
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland

76
Van Hooser SD, Escobar GM, Maffei A, Miller P. Emerging feed-forward inhibition allows the robust formation of direction selectivity in the developing ferret visual cortex. J Neurophysiol 2014; 111:2355-2373. [PMID: 24598528] [PMCID: PMC4099478] [DOI: 10.1152/jn.00891.2013]
Abstract
The computation of direction selectivity requires that a cell respond to joint spatial and temporal characteristics of the stimulus that cannot be separated into independent components. Direction selectivity in ferret visual cortex is not present at the time of eye opening but instead develops in the days and weeks following eye opening in a process that requires visual experience with moving stimuli. Classic Hebbian or spike timing-dependent modification of excitatory feed-forward synaptic inputs is unable to produce direction-selective cells from unselective or weakly directionally biased initial conditions because inputs eventually grow so strong that they can independently drive cortical neurons, violating the joint spatial-temporal activation requirement. Furthermore, without some form of synaptic competition, cells cannot develop direction selectivity in response to training with bidirectional stimulation, as cells in ferret visual cortex do. We show that imposing a maximum lateral geniculate nucleus (LGN)-to-cortex synaptic weight allows neurons to develop direction-selective responses that maintain the requirement for joint spatial and temporal activation. We demonstrate that a novel form of inhibitory plasticity, postsynaptic activity-dependent long-term potentiation of inhibition (POSD-LTPi), which operates in the developing cortex at the time of eye opening, can provide synaptic competition and enables robust development of direction-selective receptive fields with unidirectional or bidirectional stimulation. We propose a general model of the development of spatiotemporal receptive fields that consists of two phases: an experience-independent establishment of initial biases, followed by an experience-dependent amplification or modification of these biases via correlation-based plasticity of excitatory inputs that compete against gradually increasing feed-forward inhibition.
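The key constraint, a hard maximum on each LGN-to-cortex weight, amounts to clipping after every correlation-based update. A minimal sketch follows; the learning rate `eta` and cap `w_max` are illustrative, and the plasticity rule here is plain Hebbian rather than the paper's full model with inhibitory plasticity.

```python
import numpy as np

def hebbian_update_capped(w, pre, post, eta=1e-3, w_max=0.5):
    """One correlation-based update of feed-forward weights with a hard cap.

    w has shape (n_post, n_pre). Capping each synapse at w_max prevents any
    single input from growing strong enough to drive the cortical cell on
    its own, preserving the requirement for joint spatial and temporal
    activation discussed above.
    """
    w = w + eta * np.outer(post, pre)  # Hebbian correlation term
    return np.clip(w, 0.0, w_max)      # hard bounds on every synapse
```

Repeated application saturates the most correlated inputs at `w_max` instead of letting them grow without bound.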
Affiliation(s)
- Stephen D Van Hooser
- Department of Biology, Brandeis University, Waltham, Massachusetts; Sloan-Swartz Center for Theoretical Neurobiology, Brandeis University, Waltham, Massachusetts; Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts
- Gina M Escobar
- Department of Biology, Brandeis University, Waltham, Massachusetts; Sloan-Swartz Center for Theoretical Neurobiology, Brandeis University, Waltham, Massachusetts; Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts
- Arianna Maffei
- Department of Neurobiology and Behavior, State University of New York-Stony Brook, Stony Brook, New York; SUNY Eye Institute, State University of New York-Stony Brook, Stony Brook, New York
- Paul Miller
- Department of Biology, Brandeis University, Waltham, Massachusetts; Sloan-Swartz Center for Theoretical Neurobiology, Brandeis University, Waltham, Massachusetts; Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts

77
Kleberg FI, Fukai T, Gilson M. Excitatory and inhibitory STDP jointly tune feedforward neural circuits to selectively propagate correlated spiking activity. Front Comput Neurosci 2014; 8:53. [PMID: 24847242] [PMCID: PMC4019846] [DOI: 10.3389/fncom.2014.00053]
Abstract
Spike-timing-dependent plasticity (STDP) has been well established between excitatory neurons and several computational functions have been proposed in various neural systems. Despite some recent efforts, however, there is a significant lack of functional understanding of inhibitory STDP (iSTDP) and its interplay with excitatory STDP (eSTDP). Here, we demonstrate by analytical and numerical methods that iSTDP contributes crucially to the balance of excitatory and inhibitory weights for the selection of a specific signaling pathway among other pathways in a feedforward circuit. This pathway selection is based on the high sensitivity of STDP to correlations in spike times, which complements a recent proposal for the role of iSTDP in firing-rate based selection. Our model predicts that asymmetric anti-Hebbian iSTDP exceeds asymmetric Hebbian iSTDP for supporting pathway-specific balance, which we show is useful for propagating transient neuronal responses. Furthermore, we demonstrate how STDPs at excitatory-excitatory, excitatory-inhibitory, and inhibitory-excitatory synapses cooperate to improve the pathway selection. We propose that iSTDP is crucial for shaping the network structure that achieves efficient processing of synchronous spikes.
Affiliation(s)
- Florence I Kleberg
- Lab for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan
- Tomoki Fukai
- Lab for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan
- Matthieu Gilson
- Lab for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan

78
Chistiakova M, Bannon NM, Bazhenov M, Volgushev M. Heterosynaptic plasticity: multiple mechanisms and multiple roles. Neuroscientist 2014; 20:483-498. [PMID: 24727248] [DOI: 10.1177/1073858414529829]
Abstract
Plasticity is a universal property of synapses. It is expressed in a variety of forms mediated by a multitude of mechanisms. Here we consider two broad kinds of plasticity that differ in their requirement for presynaptic activity during the induction. Homosynaptic plasticity occurs at synapses that were active during the induction. It is also called input specific or associative, and it is governed by Hebbian-type learning rules. Heterosynaptic plasticity can be induced by episodes of strong postsynaptic activity also at synapses that were not active during the induction, thus making any synapse at a cell a target to heterosynaptic changes. Both forms can be induced by typical protocols used for plasticity induction and operate on the same time scales but have differential computational properties and play different roles in learning systems. Homosynaptic plasticity mediates associative modifications of synaptic weights. Heterosynaptic plasticity counteracts runaway dynamics introduced by Hebbian-type rules and balances synaptic changes. It provides learning systems with stability and enhances synaptic competition. We conclude that homosynaptic and heterosynaptic plasticity represent complementary properties of modifiable synapses, and both are necessary for normal operation of neural systems with plastic synapses.
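The stabilizing division of labor described here, Hebbian changes at active synapses followed by heterosynaptic balancing across all synapses of the cell, can be caricatured as a weight update followed by multiplicative renormalization. This is a toy illustration of the principle, not a model taken from the review.

```python
import numpy as np

def hebbian_with_heterosynaptic(w, pre, post, eta=0.01):
    """Homosynaptic (Hebbian) update followed by a heterosynaptic correction.

    Only synapses with presynaptic activity (pre > 0) are changed by the
    Hebbian term; the subsequent rescaling conserves the cell's total
    synaptic weight, so inactive synapses are depressed. This counteracts
    the runaway dynamics a pure Hebbian rule would introduce.
    """
    total_before = w.sum()
    w = w + eta * post * pre      # homosynaptic: only active inputs change
    w = w * (total_before / w.sum())  # heterosynaptic: renormalize the total
    return w
```

After the update, active synapses have gained at the expense of inactive ones while the summed weight, and hence the cell's overall drive, is unchanged.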
Affiliation(s)
- Nicholas M Bannon
- Department of Psychology, University of Connecticut, Storrs, CT, USA
- Maxim Bazhenov
- Department of Cell Biology and Neuroscience, University of California, Riverside, CA, USA
- Maxim Volgushev
- Department of Psychology, University of Connecticut, Storrs, CT, USA

79
Vasilaki E, Giugliano M. Emergence of connectivity motifs in networks of model neurons with short- and long-term plastic synapses. PLoS One 2014; 9:e84626. [PMID: 24454735] [PMCID: PMC3893143] [DOI: 10.1371/journal.pone.0084626]
Abstract
Recent experimental data from the rodent cerebral cortex and olfactory bulb indicate that specific connectivity motifs are correlated with short-term dynamics of excitatory synaptic transmission. It was observed that neurons with short-term facilitating synapses form predominantly reciprocal pairwise connections, while neurons with short-term depressing synapses form predominantly unidirectional pairwise connections. The cause of these structural differences in excitatory synaptic microcircuits is unknown. We show that these connectivity motifs emerge in networks of model neurons, from the interactions between short-term synaptic dynamics (SD) and long-term spike-timing dependent plasticity (STDP). While the impact of STDP on SD was shown in simultaneous neuronal pair recordings in vitro, the mutual interactions between STDP and SD in large networks are still the subject of intense research. Our approach combines an SD phenomenological model with an STDP model that faithfully captures long-term plasticity dependence on both spike times and frequency. As a proof of concept, we first simulate and analyze recurrent networks of spiking neurons with random initial connection efficacies and where synapses are either all short-term facilitating or all depressing. For identical external inputs to the network, and as a direct consequence of internally generated activity, we find that networks with depressing synapses evolve unidirectional connectivity motifs, while networks with facilitating synapses evolve reciprocal connectivity motifs. We then show that the same results hold for heterogeneous networks, including both facilitating and depressing synapses. This does not contradict a recent theory that proposes that motifs are shaped by external inputs, but rather complements it by examining the role of both the external inputs and the internally generated network activity. 
Our study highlights the conditions under which SD-STDP might explain the correlation between facilitation and reciprocal connectivity motifs, as well as between depression and unidirectional motifs.
Collapse
Affiliation(s)
- Eleni Vasilaki
- Department of Computer Science, University of Sheffield, Sheffield, United Kingdom
- Department of Biomedical Sciences, University of Antwerp, Wilrijk, Belgium
| | - Michele Giugliano
- Department of Computer Science, University of Sheffield, Sheffield, United Kingdom
- Department of Biomedical Sciences, University of Antwerp, Wilrijk, Belgium
- Brain Mind Institute, Swiss Federal Institute of Technology of Lausanne, Lausanne, Switzerland
| |
Collapse
|
80
|
Zenke F, Hennequin G, Gerstner W. Synaptic plasticity in neural networks needs homeostasis with a fast rate detector. PLoS Comput Biol 2013; 9:e1003330. [PMID: 24244138 PMCID: PMC3828150 DOI: 10.1371/journal.pcbi.1003330] [Citation(s) in RCA: 91] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/19/2013] [Accepted: 09/25/2013] [Indexed: 01/17/2023] Open
Abstract
Hebbian changes of excitatory synapses are driven by and further enhance correlations between pre- and postsynaptic activities. Hence, Hebbian plasticity forms a positive feedback loop that can lead to instability in simulated neural networks. To keep activity at healthy, low levels, plasticity must therefore incorporate homeostatic control mechanisms. We find in numerical simulations of recurrent networks with a realistic triplet-based spike-timing-dependent plasticity rule (triplet STDP) that homeostasis has to detect rate changes on a timescale of seconds to minutes to keep the activity stable. We confirm this result in a generic mean-field formulation of network activity and homeostatic plasticity. Our results strongly suggest the existence of a homeostatic regulatory mechanism that reacts to firing rate changes on the order of seconds to minutes. Learning and memory in the brain are thought to be mediated through Hebbian plasticity. When a group of neurons is repetitively active together, their connections get strengthened. This can cause co-activation even in the absence of the stimulus that triggered the change. To avoid runaway behavior it is important to prevent neurons from forming excessively strong connections. This is achieved by regulatory homeostatic mechanisms that constrain the overall activity. Here we study the stability of background activity in a recurrent network model with a plausible Hebbian learning rule and homeostasis. We find that the activity in our model is unstable unless homeostasis reacts to rate changes on a timescale of minutes or faster. Since this timescale is incompatible with most known forms of homeostasis, this implies the existence of a previously unknown, rapid homeostatic regulatory mechanism capable of either gating the rate of plasticity or otherwise adjusting synaptic efficacies on a short timescale.
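The mean-field instability argument in this abstract can be illustrated with a toy rate model (illustrative parameters and a BCM-like sliding threshold standing in for the triplet rule; not the authors' equations). Hebbian growth of a weight is a positive feedback loop; it is tamed only when the homeostatic rate detector tracks activity quickly.

```python
def simulate(tau_h, T=200.0, dt=0.01, w0=1.2, cap=10.0):
    """Toy mean-field loop (illustrative, not the paper's model):
    rate r follows the weight w (unit input), the weight follows a
    BCM-like Hebbian rule with sliding threshold theta = r_bar**2,
    and the homeostatic rate estimate r_bar tracks r with time
    constant tau_h. Returns the final rate, clipped at `cap`."""
    eta = 0.05
    w, r_bar = w0, 1.0
    for _ in range(int(T / dt)):
        r = w                                # firing rate tracks the weight
        theta = r_bar ** 2                   # sliding modification threshold
        w += eta * r * (r - theta) * dt      # Hebbian LTP/LTD
        r_bar += (r - r_bar) * dt / tau_h    # homeostatic rate detector
        if w >= cap:                         # positive feedback won: runaway
            return cap
    return w

stable = simulate(tau_h=0.5)     # fast detector: settles near the set point r = 1
runaway = simulate(tau_h=200.0)  # slow detector: Hebbian feedback hits the cap
```

With a fast detector the threshold overtakes the rate and pulls it back to the fixed point; with a slow detector the threshold lags and the weight diverges, mirroring the paper's seconds-to-minutes requirement.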
Collapse
Affiliation(s)
- Friedemann Zenke
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, Ecole polytechnique fédérale de Lausanne, Lausanne, Switzerland
| | - Guillaume Hennequin
- Computational and Biological Learning Laboratory, Department of Engineering, University of Cambridge, Cambridge, United Kingdom
| | - Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, Ecole polytechnique fédérale de Lausanne, Lausanne, Switzerland
| |
Collapse
|
81
|
Yger P, Harris KD. The Convallis rule for unsupervised learning in cortical networks. PLoS Comput Biol 2013; 9:e1003272. [PMID: 24204224 PMCID: PMC3808450 DOI: 10.1371/journal.pcbi.1003272] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/01/2013] [Accepted: 08/28/2013] [Indexed: 01/26/2023] Open
Abstract
The phenomenology and cellular mechanisms of cortical synaptic plasticity are becoming known in increasing detail, but the computational principles by which cortical plasticity enables the development of sensory representations are unclear. Here we describe a framework for cortical synaptic plasticity termed the "Convallis rule", mathematically derived from a principle of unsupervised learning via constrained optimization. Implementation of the rule caused a recurrent cortex-like network of simulated spiking neurons to develop rate representations of real-world speech stimuli, enabling classification by a downstream linear decoder. Applied to spike patterns used in in vitro plasticity experiments, the rule reproduced multiple results including and beyond STDP; STDP alone, however, produced poorer learning performance. The mathematical form of the rule is consistent with a dual coincidence detector mechanism that has been suggested by experiments in several synaptic classes of juvenile neocortex. Based on this confluence of normative, phenomenological, and mechanistic evidence, we suggest that the rule may approximate a fundamental computational principle of the neocortex.
Collapse
Affiliation(s)
- Pierre Yger
- UCL Institute of Neurology and UCL Department of Neuroscience, Physiology, and Pharmacology, London, United Kingdom
| | - Kenneth D. Harris
- UCL Institute of Neurology and UCL Department of Neuroscience, Physiology, and Pharmacology, London, United Kingdom
| |
Collapse
|
82
|
Trousdale J, Hu Y, Shea-Brown E, Josić K. A generative spike train model with time-structured higher order correlations. Front Comput Neurosci 2013; 7:84. [PMID: 23908626 PMCID: PMC3727174 DOI: 10.3389/fncom.2013.00084] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/10/2013] [Accepted: 06/12/2013] [Indexed: 11/16/2022] Open
Abstract
Emerging technologies are revealing the spiking activity in ever larger neural ensembles. Frequently, this spiking is far from independent, with correlations in the spike times of different cells. Understanding how such correlations impact the dynamics and function of neural ensembles remains an important open problem. Here we describe a new, generative model for correlated spike trains that can exhibit many of the features observed in data. Extending prior work in mathematical finance, this generalized thinning and shift (GTaS) model creates marginally Poisson spike trains with diverse temporal correlation structures. We give several examples which highlight the model's flexibility and utility. For instance, we use it to examine how a neural network responds to highly structured patterns of inputs. We then show that the GTaS model is analytically tractable, and derive cumulant densities of all orders in terms of model parameters. The GTaS framework can therefore be an important tool in the experimental and theoretical exploration of neural dynamics.
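The generalized thinning and shift construction can be sketched in a few lines (our reading of the construction, with hypothetical parameters; not the authors' code): spikes of a mother Poisson process are copied into a chosen subset of output trains, each copy delayed by a per-neuron shift, so the marginals stay Poisson while joint correlations are time-structured.

```python
import random

def gtas(rate, T, n, p_subset, shifts, seed=0):
    """Minimal GTaS sketch: draw a mother Poisson process at `rate`
    on [0, T], assign each spike to a subset S of the n output trains
    with probability p_subset[S], and copy it into train i at time
    t + shifts[i]. (Shifted copies may land slightly past T; ignored
    here for simplicity.)"""
    rng = random.Random(seed)
    trains = [[] for _ in range(n)]
    subsets, probs = zip(*p_subset.items())
    t = 0.0
    while True:
        t += rng.expovariate(rate)                 # next mother-process spike
        if t > T:
            break
        S = rng.choices(subsets, weights=probs)[0] # which trains get a copy
        for i in S:
            trains[i].append(t + shifts[i])        # copy with per-neuron delay
    return trains

# Two trains: mostly independent spikes, plus 20% shared events in which
# neuron 1 fires 5 ms after neuron 0 (a time-structured correlation).
trains = gtas(rate=100.0, T=50.0, n=2,
              p_subset={(0,): 0.4, (1,): 0.4, (0, 1): 0.2},
              shifts=[0.0, 0.005], seed=1)
```

Each marginal train here is Poisson with rate 100 Hz times the probability that the neuron belongs to the drawn subset (0.6), while the cross-correlogram has a peak at +5 ms, illustrating the separation of marginal statistics from joint temporal structure.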
Collapse
Affiliation(s)
- James Trousdale
- Department of Mathematics, University of Houston, Houston, TX, USA
| | | | | | | |
Collapse
|
83
|
Testing the nanoparticle-allostatic cross-adaptation-sensitization model for homeopathic remedy effects. HOMEOPATHY 2013; 102:66-81. [PMID: 23290882 DOI: 10.1016/j.homp.2012.10.005] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/28/2012] [Revised: 10/25/2012] [Accepted: 10/25/2012] [Indexed: 02/08/2023]
Abstract
Key concepts of the Nanoparticle-Allostatic Cross-Adaptation-Sensitization (NPCAS) Model for the action of homeopathic remedies in living systems include source nanoparticles as low-level environmental stressors, heterotypic hormesis, cross-adaptation, allostasis (stress response network), time-dependent sensitization with endogenous amplification and bidirectional change, and self-organizing complex adaptive systems. The model accommodates the requirement for measurable physical agents in the remedy (source nanoparticles and/or source adsorbed to silica nanoparticles). Its further elements include hormetic adaptive responses in the organism, triggered by nanoparticles; bipolar, metaplastic change, dependent on the history of the organism; clinical matching of the patient's symptom picture, including modalities, to the symptom pattern that the source material can cause (cross-adaptation and cross-sensitization); and evidence for nanoparticle-related quantum macro-entanglement in homeopathic pathogenetic trials. This paper examines research implications of the model, discussing the following hypotheses: variability in nanoparticle size, morphology, and aggregation affects remedy properties and reproducibility of findings; homeopathic remedies modulate adaptive allostatic responses, with multiple dynamic short- and long-term effects; and simillimum remedy nanoparticles, as novel mild stressors corresponding to the organism's dysfunction, initiate time-dependent cross-sensitization, reversing the direction of dysfunctional reactivity to environmental stressors. The NPCAS model suggests a way forward for systematic research on homeopathy. The central proposition is that homeopathic treatment is a form of nanomedicine acting by modulation of endogenous adaptation and metaplastic amplification processes in the organism to enhance long-term systemic resilience and health.
Collapse
|
84
|
Uramoto T, Torikai H. A calcium-based simple model of multiple spike interactions in spike-timing-dependent plasticity. Neural Comput 2013; 25:1853-69. [PMID: 23607556 DOI: 10.1162/neco_a_00462] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Spike-timing-dependent plasticity (STDP) is a form of synaptic modification that depends on the relative timings of presynaptic and postsynaptic spikes. In this letter, we propose a calcium-based simple STDP model, described by an ordinary differential equation having only three state variables: one represents the density of intracellular calcium, one represents the fraction of open-state NMDA receptors (NMDARs), and one represents the synaptic weight. We show that in spite of its simplicity, the model can reproduce the properties of the plasticity that have been experimentally measured in various brain areas (e.g., layer 2/3 and 5 visual cortical slices, hippocampal cultures, and layer 2/3 somatosensory cortical slices) with respect to various patterns of presynaptic and postsynaptic spikes. In addition, comparisons with other STDP models are made, and the significance and advantages of the proposed model are discussed.
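A three-variable calcium-control model of this kind can be sketched as follows (a hedged toy version in the spirit of the abstract; the paper's actual equations, thresholds, and constants differ): presynaptic spikes open NMDARs, postsynaptic spikes admit calcium in proportion to the open fraction, and the weight potentiates above a high calcium threshold and depresses in an intermediate band.

```python
def run(pre_times, post_times, T=0.2, dt=1e-4,
        tau_f=0.05, tau_c=0.02, c_post=0.35,
        th_ltd=0.3, th_ltp=0.6, eta_p=5.0, eta_d=2.0):
    """Toy calcium-based STDP: f = open NMDAR fraction, c = calcium,
    w = weight. Pre spikes open NMDARs; post spikes add calcium via
    open NMDARs (coincidence term f) plus a fixed VDCC-like term
    c_post. High calcium potentiates, intermediate calcium depresses."""
    f, c, w = 0.0, 0.0, 0.5
    pre = {int(round(t / dt)) for t in pre_times}
    post = {int(round(t / dt)) for t in post_times}
    for k in range(int(round(T / dt))):
        if k in pre:
            f = min(1.0, f + 0.8)   # glutamate opens NMDARs
        if k in post:
            c += f + c_post         # Ca influx: NMDAR coincidence + VDCC
        if c > th_ltp:
            w += eta_p * dt         # high calcium: potentiation
        elif c > th_ltd:
            w -= eta_d * dt         # intermediate calcium: depression
        f -= f * dt / tau_f         # NMDAR closing
        c -= c * dt / tau_c         # calcium clearance
    return w

w_pot = run([0.05], [0.06])  # pre 10 ms before post: net potentiation
w_dep = run([0.06], [0.05])  # post 10 ms before pre: net depression
```

Pre-before-post drives calcium through the LTP zone (f is still elevated when the post spike arrives), while post-before-pre only reaches the LTD band, reproducing the basic STDP asymmetry from calcium levels alone.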
Collapse
Affiliation(s)
- Takumi Uramoto
- Graduate School of Engineering Science, Osaka University, Toyonaka, Osaka 560-8531, Japan.
| | | |
Collapse
|
85
|
Rahimi Azghadi M, Al-Sarawi S, Abbott D, Iannella N. A neuromorphic VLSI design for spike timing and rate based synaptic plasticity. Neural Netw 2013; 45:70-82. [PMID: 23566339 DOI: 10.1016/j.neunet.2013.03.003] [Citation(s) in RCA: 28] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2012] [Revised: 12/14/2012] [Accepted: 03/03/2013] [Indexed: 11/27/2022]
Abstract
Triplet-based Spike Timing Dependent Plasticity (TSTDP) is a powerful synaptic plasticity rule that acts beyond conventional pair-based STDP (PSTDP). The TSTDP rule is capable of reproducing the outcomes from a variety of biological experiments that the PSTDP rule fails to reproduce. Additionally, it has been shown that the behaviour inherent to the spike rate-based Bienenstock-Cooper-Munro (BCM) synaptic plasticity rule can also emerge from the TSTDP rule. This paper proposes an analogue implementation of the TSTDP rule. The proposed VLSI circuit has been designed using the AMS 0.35 μm CMOS process and has been simulated using design kits for Synopsys and Cadence tools. Simulation results demonstrate how well the proposed circuit can alter synaptic weights according to the timing difference amongst a set of different patterns of spikes. Furthermore, the circuit is shown to give rise to a BCM-like learning rule, which is a rate-based rule. To mimic an implementation environment, a 1000-run Monte Carlo (MC) analysis was conducted on the proposed circuit. The presented MC simulation analysis and the simulation results from fine-tuned circuits show that it is possible to mitigate the effect of process variations in the proof-of-concept circuit; however, a practical variation-aware design technique is required to guarantee high circuit performance in a large-scale neural network. We believe that the proposed design can play a significant role in future VLSI implementations of both spike timing and rate based neuromorphic learning systems.
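The triplet rule that the circuit targets can be sketched in software (a minimal, all-to-all variant with parameter values in the style of Pfister and Gerstner's "minimal" set; illustrative, not the circuit's exact rule): LTD uses the fast postsynaptic trace at each pre spike, and LTP uses the presynaptic trace multiplied by a slow postsynaptic trace, which is what makes potentiation grow with firing rate (the BCM-like behaviour).

```python
import math

def triplet_dw(rate, n_pairs=60, dt_pair=0.002,
               tau_p=0.0168, tau_m=0.0337, tau_y=0.114,
               a2_m=7.0e-3, a3_p=6.5e-3):
    """Event-driven minimal triplet STDP: pre spikes at `rate`, each
    followed by a post spike dt_pair later. Traces: r1 (pre, tau_p),
    o1 (fast post, tau_m), o2 (slow post, tau_y). Returns total dw."""
    T = 1.0 / rate
    r1 = o1 = o2 = 0.0
    w, t_last = 0.0, 0.0
    events = sorted([(k * T, 'pre') for k in range(n_pairs)] +
                    [(k * T + dt_pair, 'post') for k in range(n_pairs)])
    for t, kind in events:
        gap = t - t_last                     # decay all traces to time t
        r1 *= math.exp(-gap / tau_p)
        o1 *= math.exp(-gap / tau_m)
        o2 *= math.exp(-gap / tau_y)
        t_last = t
        if kind == 'pre':
            w -= a2_m * o1                   # pair term: LTD
            r1 += 1.0
        else:
            w += a3_p * r1 * o2              # triplet term: LTP (o2 pre-update)
            o2 += 1.0
            o1 += 1.0
    return w

dw_high = triplet_dw(40.0)  # high pairing rate: strong potentiation
dw_low = triplet_dw(5.0)    # low pairing rate: near-zero change
```

Because the LTP term is gated by the slow post trace o2, the same pre-post timing yields much larger weight change at high rates, the rate dependence a pair-based rule cannot capture.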
Collapse
Affiliation(s)
- Mostafa Rahimi Azghadi
- School of Electrical and Electronic Engineering, The University of Adelaide, Adelaide, SA 5005, Australia.
| | | | | | | |
Collapse
|
86
|
Nessler B, Pfeiffer M, Buesing L, Maass W. Bayesian computation emerges in generic cortical microcircuits through spike-timing-dependent plasticity. PLoS Comput Biol 2013; 9:e1003037. [PMID: 23633941 PMCID: PMC3636028 DOI: 10.1371/journal.pcbi.1003037] [Citation(s) in RCA: 112] [Impact Index Per Article: 10.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2012] [Accepted: 03/04/2013] [Indexed: 11/24/2022] Open
Abstract
The principles by which networks of neurons compute, and how spike-timing dependent plasticity (STDP) of synaptic weights generates and maintains their computational function, are unknown. Preceding work has shown that soft winner-take-all (WTA) circuits, where pyramidal neurons inhibit each other via interneurons, are a common motif of cortical microcircuits. We show through theoretical analysis and computer simulations that Bayesian computation is induced in these network motifs through STDP in combination with activity-dependent changes in the excitability of neurons. The fundamental components of this emergent Bayesian computation are priors that result from adaptation of neuronal excitability and implicit generative models for hidden causes that are created in the synaptic weights through STDP. In fact, a surprising result is that STDP is able to approximate a powerful principle for fitting such implicit generative models to high-dimensional spike inputs: Expectation Maximization. Our results suggest that the experimentally observed spontaneous activity and trial-to-trial variability of cortical neurons are essential features of their information processing capability, since their functional role is to represent probability distributions rather than static neural codes. Furthermore, our results suggest networks of Bayesian computation modules as a new model for distributed information processing in the cortex.
Collapse
Affiliation(s)
- Bernhard Nessler
- Institute for Theoretical Computer Science, Graz University of Technology, Graz, Austria.
| | | | | | | |
Collapse
|
87
|
The BCM theory of synapse modification at 30: interaction of theory with experiment. Nat Rev Neurosci 2012; 13:798-810. [PMID: 23080416 DOI: 10.1038/nrn3353] [Citation(s) in RCA: 225] [Impact Index Per Article: 18.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/06/2023]
Abstract
Thirty years have passed since the publication of Elie Bienenstock, Leon Cooper and Paul Munro's 'Theory for the development of neuron selectivity: orientation specificity and binocular interaction in visual cortex', known as the BCM theory of synaptic plasticity. This theory has guided experimentalists to discover some fundamental properties of synaptic plasticity and has provided a mathematical structure that bridges molecular mechanisms and systems-level consequences of learning and memory storage.
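The core of the BCM theory is the rule dw/dt = η·x·y·(y − θ), where θ is a sliding threshold that tracks a running average of the squared postsynaptic rate y². A minimal sketch (illustrative parameters, not any specific experiment) shows the theory's hallmark outcome, the development of selectivity: alternating two orthogonal inputs, the neuron ends up responding strongly to one and not the other.

```python
def train_bcm(steps=50000, eta=0.01, lr_theta=0.05):
    """Minimal BCM neuron with two inputs. y = w . x; the weight update
    is eta * x * y * (y - theta); theta slides toward the running
    average of y**2, which makes the symmetric solution unstable and
    drives selectivity. Returns the final responses to both patterns."""
    w = [0.50, 0.45]                          # slight asymmetry seeds selectivity
    patterns = [[1.0, 0.0], [0.0, 1.0]]       # two orthogonal stimuli, alternated
    theta = 0.3
    for k in range(steps):
        x = patterns[k % 2]
        y = sum(wi * xi for wi, xi in zip(w, x))
        w = [wi + eta * y * (y - theta) * xi for wi, xi in zip(w, x)]
        theta += lr_theta * (y * y - theta)   # sliding modification threshold
    return [sum(wi * xi for wi, xi in zip(w, p)) for p in patterns]

responses = train_bcm()  # response to the initially favored pattern grows,
                         # the other collapses toward zero
```

The sliding threshold is the theory's key stabilizer: it rises faster than linearly with activity, so unconstrained Hebbian growth self-limits while competition between stimuli remains.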
Collapse
|
88
|
Spectral analysis of input spike trains by spike-timing-dependent plasticity. PLoS Comput Biol 2012; 8:e1002584. [PMID: 22792056 PMCID: PMC3390410 DOI: 10.1371/journal.pcbi.1002584] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2011] [Accepted: 05/14/2012] [Indexed: 11/19/2022] Open
Abstract
Spike-timing-dependent plasticity (STDP) has been observed in many brain areas such as sensory cortices, where it is hypothesized to structure synaptic connections between neurons. Previous studies have demonstrated how STDP can capture spiking information at short timescales using specific input configurations, such as coincident spiking, spike patterns and oscillatory spike trains. However, the corresponding computation in the case of arbitrary input signals is still unclear. This paper provides an overarching picture of the algorithm inherent to STDP, tying together many previous results for commonly used models of pairwise STDP. For a single neuron with plastic excitatory synapses, we show how STDP performs a spectral analysis on the temporal cross-correlograms between its afferent spike trains. The postsynaptic responses and STDP learning window determine kernel functions that specify how the neuron “sees” the input correlations. We thus denote this unsupervised learning scheme as ‘kernel spectral component analysis’ (kSCA). In particular, the whole input correlation structure must be considered since all plastic synapses compete with each other. We find that kSCA is enhanced when weight-dependent STDP induces gradual synaptic competition. For a spiking neuron with a “linear” response and pairwise STDP alone, we find that kSCA resembles principal component analysis (PCA). However, plain STDP does not isolate correlation sources in general, e.g., when they are mixed among the input spike trains. In other words, it does not perform independent component analysis (ICA). Tuning the neuron to a single correlation source can be achieved when STDP is paired with a homeostatic mechanism that reinforces the competition between synaptic inputs. Our results suggest that neuronal networks equipped with STDP can process signals encoded in the transient spiking activity at the timescales of tens of milliseconds for usual STDP. 
Tuning feature extraction of sensory stimuli is an important function for synaptic plasticity models. A widely studied example is the development of orientation preference in the primary visual cortex, which can emerge using moving bars in the visual field. A crucial point is the decomposition of stimuli into basic information tokens, e.g., selecting individual bars even though they are presented in overlapping pairs (vertical and horizontal). Among classical unsupervised learning models, independent component analysis (ICA) is capable of isolating basic tokens, whereas principal component analysis (PCA) cannot. This paper focuses on spike-timing-dependent plasticity (STDP), whose functional implications for neural information processing have been intensively studied both theoretically and experimentally in the last decade. Following recent studies demonstrating that STDP can perform ICA for specific cases, we show how STDP relates to PCA or ICA, and in particular explains the conditions under which it switches between them. Here information at the neuronal level is assumed to be encoded in temporal cross-correlograms of spike trains. We find that a linear spiking neuron equipped with pairwise STDP requires additional mechanisms, such as a homeostatic regulation of its output firing, in order to separate mixed correlation sources and thus perform ICA.
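The correlogram picture used above can be made concrete: under pairwise STDP, the expected weight drift of a synapse is the integral of the learning window W(s) against the pre-post cross-correlogram. A minimal sketch with illustrative exponential kernels (not the paper's exact kernel functions):

```python
import math

def stdp_drift(corr, dt=1e-3, a_plus=1.0, a_minus=1.1, tau=0.02):
    """Expected weight drift under pairwise STDP: sum W(s) * corr(s) * dt
    over a binned pre-post cross-correlogram `corr` ({lag_s: density}).
    W is the usual asymmetric window: LTP for pre-before-post lags
    (s > 0), LTD for post-before-pre lags (s < 0)."""
    def W(s):
        if s > 0:
            return a_plus * math.exp(-s / tau)     # pre leads post: LTP
        if s < 0:
            return -a_minus * math.exp(s / tau)    # post leads pre: LTD
        return 0.0
    return sum(W(s) * c * dt for s, c in corr.items())

# Gaussian correlograms peaked at +5 ms and -5 ms (sigma = 2 ms):
lags = [k * 1e-3 for k in range(-50, 51)]
peak_pos = {s: math.exp(-((s - 0.005) ** 2) / (2 * 0.002 ** 2)) for s in lags}
peak_neg = {s: math.exp(-((s + 0.005) ** 2) / (2 * 0.002 ** 2)) for s in lags}
```

Inputs whose spikes tend to precede the output spike (correlogram mass at positive lags) drift up, mirrored inputs drift down; how the window "sees" the correlation structure is exactly the kernel view the abstract describes.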
Collapse
|
89
|
Gilson M, Bürck M, Burkitt AN, van Hemmen JL. Frequency selectivity emerging from spike-timing-dependent plasticity. Neural Comput 2012; 24:2251-79. [PMID: 22734488 DOI: 10.1162/neco_a_00331] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Periodic neuronal activity has been observed in various areas of the brain, from lower sensory to higher cortical levels. Specific frequency components contained in this periodic activity can be identified by a neuronal circuit that behaves as a bandpass filter with given preferred frequency, or best modulation frequency (BMF). For BMFs typically ranging from 10 to 200 Hz, a plausible and minimal configuration consists of a single neuron with adjusted excitatory and inhibitory synaptic connections. The emergence, however, of such a neuronal circuitry is still unclear. In this letter, we demonstrate how spike-timing-dependent plasticity (STDP) can give rise to frequency-dependent learning, thus leading to an input selectivity that enables frequency identification. We use an in-depth mathematical analysis of the learning dynamics in a population of plastic inhibitory connections. These provide inhomogeneous postsynaptic responses that depend on their dendritic location. We find that synaptic delays play a crucial role in organizing the weight specialization induced by STDP. Under suitable conditions on the synaptic delays and postsynaptic potentials (PSPs), the BMF of a neuron after learning can match the training frequency. In particular, proximal (distal) synapses with shorter (longer) dendritic delay and somatically measured PSP time constants respond better to higher (lower) frequencies. As a result, the neuron will respond maximally to any stimulating frequency (in a given range) with which it has been trained in an unsupervised manner. The model predicts that synapses responding to a given BMF form clusters on dendritic branches.
Collapse
Affiliation(s)
- Matthieu Gilson
- NeuroEngineering Laboratory, Department of Electrical and Electronic Engineering, University of Melbourne, VIC 3010, Australia; The Bionics Institute, East Melbourne, VIC 3002, Australia; NICTA the Victorian Research Lab, University of Melbourne, VIC 3010, Australia; and RIKEN Brain Science Institute, Saitama 351-0198, Japan
| | - Moritz Bürck
- Physik Department T35, Technische Universität München, 85748 Garching bei München, Germany, and Bernstein Center for Computational Neuroscience München, 82152 Martinsried, Germany
| | - Anthony N. Burkitt
- Neuroengineering Laboratory, Department of Electrical and Electronic Engineering, University of Melbourne, VIC 3010, Australia; The Bionics Institute, East Melbourne, VIC 3010, Australia; and Centre for Neural Engineering, University of Melbourne, VIC 3010, Australia
| | - J. Leo van Hemmen
- Physik Department T35, Technische Universität München, 85748 Garching bei München, Germany, and Bernstein Center for Computational Neuroscience München, 82152 Martinsried, Germany
| |
Collapse
|
90
|
Hunzinger JF, Chan VH, Froemke RC. Learning complex temporal patterns with resource-dependent spike timing-dependent plasticity. J Neurophysiol 2012; 108:551-66. [PMID: 22496526 DOI: 10.1152/jn.01150.2011] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Studies of spike timing-dependent plasticity (STDP) have revealed that long-term changes in the strength of a synapse may be modulated substantially by temporal relationships between multiple presynaptic and postsynaptic spikes. Whereas long-term potentiation (LTP) and long-term depression (LTD) of synaptic strength have been modeled as distinct functional mechanisms, here we propose a new shared resource model. A functional consequence of our model is fast, stable, and diverse unsupervised learning of temporal multispike patterns with a biologically consistent spiking neural network. Due to interdependencies between LTP and LTD, dendritic delays, and proactive homeostatic aspects of the model, neurons are equipped to learn to decode temporally coded information within spike bursts. Moreover, neurons learn spike timing with few exposures in substantial noise and jitter. Surprisingly, despite having only one parameter, the model also accurately predicts in vitro observations of STDP in more complex multispike trains, as well as rate-dependent effects. We discuss candidate commonalities in natural long-term plasticity mechanisms.
Collapse
|
91
|
Vasilaki E, Giugliano M. Emergence of Connectivity Patterns from Long-Term and Short-Term Plasticities. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING – ICANN 2012 2012. [DOI: 10.1007/978-3-642-33269-2_25] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/05/2023]
|
92
|
What is the appropriate description level for synaptic plasticity? Proc Natl Acad Sci U S A 2011; 108:19103-4. [PMID: 22089234 DOI: 10.1073/pnas.1117027108] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
|