1
Rodrigues YE, Tigaret CM, Marie H, O'Donnell C, Veltz R. A stochastic model of hippocampal synaptic plasticity with geometrical readout of enzyme dynamics. eLife 2023; 12:e80152. PMID: 37589251; PMCID: PMC10435238; DOI: 10.7554/eLife.80152.
Abstract
Discovering the rules of synaptic plasticity is an important step for understanding brain learning. Existing plasticity models are either (1) top-down and interpretable, but not flexible enough to account for experimental data, or (2) bottom-up and biologically realistic, but too intricate to interpret and hard to fit to data. To avoid the shortcomings of these approaches, we present a new plasticity rule based on a geometrical readout mechanism that flexibly maps synaptic enzyme dynamics to predict plasticity outcomes. We apply this readout to a multi-timescale model of hippocampal synaptic plasticity induction that includes electrical dynamics, calcium, CaMKII and calcineurin, and accurate representation of intrinsic noise sources. Using a single set of model parameters, we demonstrate the robustness of this plasticity rule by reproducing nine published ex vivo experiments covering various spike-timing and frequency-dependent plasticity induction protocols, animal ages, and experimental conditions. Our model also predicts that in vivo-like spike timing irregularity strongly shapes plasticity outcome. This geometrical readout modelling approach can be readily applied to other excitatory or inhibitory synapses to discover their synaptic plasticity rules.
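As a rough illustration of such a geometrical readout, the sketch below classifies a plasticity outcome by the time a joint (CaMKII, calcineurin) trajectory spends inside hypothetical LTP/LTD regions of enzyme-activity space; all region boundaries and thresholds are illustrative, not the model's fitted values:

```python
import numpy as np

def geometrical_readout(camkii, can, dt=1.0, theta_ltp=50.0, theta_ltd=50.0):
    """Classify plasticity from enzyme activity trajectories (a.u.).

    The joint (CaMKII, calcineurin) trajectory is compared against two
    hypothetical regions of activity space; the outcome is decided by the
    time spent in each region relative to activation thresholds (in the
    same units as dt).
    """
    camkii = np.asarray(camkii, dtype=float)
    can = np.asarray(can, dtype=float)
    in_ltp = (camkii > 0.6) & (can <= 0.5)   # high kinase, low phosphatase
    in_ltd = (can > 0.5) & (camkii <= 0.6)   # high phosphatase, low kinase
    t_ltp = dt * np.count_nonzero(in_ltp)
    t_ltd = dt * np.count_nonzero(in_ltd)
    if t_ltp >= theta_ltp and t_ltp > t_ltd:
        return "LTP"
    if t_ltd >= theta_ltd and t_ltd > t_ltd * 0 + t_ltp:
        return "LTD"
    return "no change"
```

Protocols whose trajectories stay near baseline then yield "no change", mirroring how the readout separates potentiating from depressing induction protocols.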
Affiliation(s)
- Yuri Elias Rodrigues
- Université Côte d’Azur, Nice, France
- Institut de Pharmacologie Moléculaire et Cellulaire (IPMC), CNRS, Valbonne, France
- Inria Center of University Côte d’Azur (Inria), Sophia Antipolis, France
- Cezar M Tigaret
- Neuroscience and Mental Health Research Innovation Institute, Division of Psychological Medicine and Clinical Neurosciences, School of Medicine, Cardiff University, Cardiff, United Kingdom
- Hélène Marie
- Université Côte d’Azur, Nice, France
- Institut de Pharmacologie Moléculaire et Cellulaire (IPMC), CNRS, Valbonne, France
- Cian O'Donnell
- School of Computing, Engineering, and Intelligent Systems, Magee Campus, Ulster University, Londonderry, United Kingdom
- School of Computer Science, Electrical and Electronic Engineering, and Engineering Mathematics, University of Bristol, Bristol, United Kingdom
- Romain Veltz
- Inria Center of University Côte d’Azur (Inria), Sophia Antipolis, France
2
Stapmanns J, Hahne J, Helias M, Bolten M, Diesmann M, Dahmen D. Event-Based Update of Synapses in Voltage-Based Learning Rules. Front Neuroinform 2021; 15:609147. PMID: 34177505; PMCID: PMC8222618; DOI: 10.3389/fninf.2021.609147.
Abstract
Due to the point-like nature of neuronal spiking, efficient neural network simulators often employ event-based simulation schemes for synapses. Yet many types of synaptic plasticity rely on the membrane potential of the postsynaptic cell as a third factor in addition to pre- and postsynaptic spike times. In some learning rules, membrane potentials influence synaptic weight changes not only at the time points of spike events but also continuously between them. In these cases, synapses require information on the full time course of the membrane potential to update their strength, which a priori suggests a continuous, time-driven update. Such a scheme hinders scaling of simulations to realistic cortical network sizes and to time scales relevant for learning. Here, we derive two efficient algorithms for archiving postsynaptic membrane potentials, both compatible with modern simulation engines based on event-based synapse updates. We theoretically contrast the two algorithms with a time-driven synapse update scheme to analyze advantages in terms of memory and computation. We further present a reference implementation in the spiking neural network simulator NEST for two prototypical voltage-based plasticity rules: the Clopath rule and the Urbanczik-Senn rule. For both rules, the two event-based algorithms significantly outperform the time-driven scheme. Depending on the amount of data to be stored for plasticity, which differs strongly between the rules, a further performance increase can be achieved by compressing or sampling the archived membrane potentials. Our results on the computational cost of archiving provide guidelines for designing learning rules that remain practical in large-scale networks.
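A toy sketch of the archiving idea (not one of the paper's two algorithms): the postsynaptic neuron records its voltage trace, and a synapse consumes the archived samples lazily when a presynaptic spike event arrives. The voltage-dependent rule used here is purely illustrative:

```python
from collections import deque

class PostNeuronArchive:
    """Ring buffer of (time, voltage) samples kept by the postsynaptic
    neuron so that synapses can update lazily at spike events."""
    def __init__(self, maxlen=1000):
        self.trace = deque(maxlen=maxlen)

    def record(self, t, v):
        self.trace.append((t, v))

    def samples_since(self, t_last):
        return [(t, v) for (t, v) in self.trace if t > t_last]

class EventDrivenSynapse:
    """Toy voltage-based rule: potentiate in proportion to the time the
    archived voltage spent above a threshold (illustrative only, not the
    Clopath or Urbanczik-Senn rule)."""
    def __init__(self, w=1.0, theta=-55.0, eta=0.001, dt=0.1):
        self.w, self.theta, self.eta, self.dt = w, theta, eta, dt
        self.t_last = -float("inf")

    def on_pre_spike(self, t, archive):
        # consume all voltage samples archived since the last update
        for (ts, v) in archive.samples_since(self.t_last):
            if v > self.theta:
                self.w += self.eta * (v - self.theta) * self.dt
        self.t_last = t
        return self.w
```

The point of the event-based scheme is that the loop over samples runs only at spike events instead of at every simulation time step.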
Affiliation(s)
- Jonas Stapmanns
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA Institute Brain Structure Function Relationship (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Institute for Theoretical Solid State Physics, RWTH Aachen University, Aachen, Germany
- Jan Hahne
- School of Mathematics and Natural Sciences, Bergische Universität Wuppertal, Wuppertal, Germany
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA Institute Brain Structure Function Relationship (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Institute for Theoretical Solid State Physics, RWTH Aachen University, Aachen, Germany
- Matthias Bolten
- School of Mathematics and Natural Sciences, Bergische Universität Wuppertal, Wuppertal, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA Institute Brain Structure Function Relationship (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany
- David Dahmen
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA Institute Brain Structure Function Relationship (INM-10), Jülich Research Centre, Jülich, Germany
3
George R, Chiappalone M, Giugliano M, Levi T, Vassanelli S, Partzsch J, Mayr C. Plasticity and Adaptation in Neuromorphic Biohybrid Systems. iScience 2020; 23:101589. PMID: 33083749; PMCID: PMC7554028; DOI: 10.1016/j.isci.2020.101589.
Abstract
Neuromorphic systems take inspiration from the principles of biological information processing to form hardware platforms that enable large-scale implementation of neural networks. Recent years have seen both advances in the theory of spiking neural networks for classification and control tasks and progress in electrophysiological methods that is pushing the frontiers of intelligent neural interfacing and signal-processing technologies. At the forefront of these new technologies, artificial and biological neural networks are tightly coupled, offering a novel "biohybrid" experimental framework for engineers and neurophysiologists. Indeed, biohybrid systems can constitute a new class of neuroprostheses, opening important perspectives in the treatment of neurological disorders. Moreover, the use of biologically plausible learning rules allows the formation of an overall fault-tolerant system of co-developing subsystems. To identify opportunities and challenges in neuromorphic biohybrid systems, we discuss the field from the perspectives of neurobiology, computational neuroscience, and neuromorphic engineering.
Affiliation(s)
- Richard George
- Department of Electrical Engineering and Information Technology, Technical University of Dresden, Dresden, Germany
- Michele Giugliano
- Neuroscience Area, International School of Advanced Studies, Trieste, Italy
- Timothée Levi
- Laboratoire de l’Intégration du Matériau au Système, University of Bordeaux, Bordeaux, France
- LIMMS/CNRS, Institute of Industrial Science, The University of Tokyo, Tokyo, Japan
- Stefano Vassanelli
- Department of Biomedical Sciences and Padova Neuroscience Center, University of Padova, Padova, Italy
- Johannes Partzsch
- Department of Electrical Engineering and Information Technology, Technical University of Dresden, Dresden, Germany
- Christian Mayr
- Department of Electrical Engineering and Information Technology, Technical University of Dresden, Dresden, Germany
4
Ivans RC, Dahl SG, Cantley KD. A Model for R(t) Elements and R(t)-Based Spike-Timing-Dependent Plasticity With Basic Circuit Examples. IEEE Trans Neural Netw Learn Syst 2020; 31:4206-4216. PMID: 31869804; DOI: 10.1109/TNNLS.2019.2952768.
Abstract
Spike-timing-dependent plasticity (STDP) is a fundamental synaptic learning rule observed in biology that leads to numerous behavioral and cognitive outcomes. Emulating STDP in electronic spiking neural networks with high-density memristive synapses is therefore of significant interest. While one popular method involves pulse-shaping the spiking neuron output voltages, an alternative approach is outlined in this article. The proposed STDP implementation uses time-varying dynamic resistance [R(t)] elements to achieve local synaptic learning from spike-pair STDP, spike-triplet STDP, and firing rates. The R(t) elements are connected to each neuron circuit, thereby maintaining synaptic density and leveraging voltage division as a means of altering synaptic weight (memristor voltage). Example R(t) elements with their corresponding behaviors are demonstrated through simulation. A three-input, two-output network using single-memristor synaptic connections and R(t) elements is also simulated. Network-level effects, such as nonspecific synaptic plasticity, are discussed. Finally, spatiotemporal pattern recognition (STPR) using R(t) elements is demonstrated in simulation.
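The voltage-division idea can be sketched as follows; the R(t) waveform and all component values here are hypothetical, chosen only to show how a spike transiently shifts more of the applied voltage onto the memristor:

```python
import math

def r_t(t, t_spike, r_rest=100e3, r_dip=10e3, tau=0.02):
    """Hypothetical R(t) element: resistance drops to r_dip at a spike at
    t_spike and relaxes back to r_rest with time constant tau (seconds).
    Values are illustrative, not from the paper."""
    if t < t_spike:
        return r_rest
    return r_rest + (r_dip - r_rest) * math.exp(-(t - t_spike) / tau)

def memristor_voltage(v_applied, r_mem, t, t_spike):
    """Voltage divider between the memristive synapse and the R(t)
    element: the fraction of v_applied dropped across the memristor
    depends on the instantaneous value of R(t)."""
    r = r_t(t, t_spike)
    return v_applied * r_mem / (r_mem + r)
```

Right after a spike, R(t) is small and the memristor sees most of the applied voltage (enabling a weight change); long after the spike, the divider returns most of the voltage to the R(t) element.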
5
Diederich N, Bartsch T, Kohlstedt H, Ziegler M. A memristive plasticity model of voltage-based STDP suitable for recurrent bidirectional neural networks in the hippocampus. Sci Rep 2018; 8:9367. PMID: 29921840; PMCID: PMC6008480; DOI: 10.1038/s41598-018-27616-6.
Abstract
Memristive systems have gained considerable attention in the field of neuromorphic engineering, because they allow the emulation of synaptic functionality in solid-state nano-physical systems. In this study, we show that memristive behavior provides a broad working framework for the phenomenological modelling of cellular synaptic mechanisms. In particular, we seek to understand how closely a memristive system can approximate biological realism. The basic characteristics of memristive systems, i.e., voltage dependence and memory, are used to derive a voltage-based plasticity rule. We show that this model can account for a variety of electrophysiological plasticity data. Furthermore, we incorporate the plasticity model into an all-to-all connected network scheme. Motivated by the auto-associative CA3 network of the hippocampus, we show that the implemented network allows the discrimination and processing of mnemonic pattern information, i.e., the formation of functional bidirectional connections resulting in local receptive fields. Since the presented plasticity model can also be applied to real memristive devices, the theoretical framework can support both the design of appropriate memristive devices for neuromorphic computing and the development of complex neuromorphic networks that exploit the specific advantages of memristive devices.
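The essence of such a voltage-based memristive rule can be sketched as a thresholded, state-dependent update (a generic illustration, not the paper's fitted model):

```python
def memristive_update(w, v, v_th=0.5, eta=0.1):
    """Threshold-based voltage rule for a device state w in [0, 1].

    The state changes only when |v| exceeds the switching threshold v_th,
    and the update is scaled by the current state (the 'memory'), giving
    soft bounds. All parameter values are illustrative.
    """
    if v > v_th:
        w += eta * (v - v_th) * (1.0 - w)   # potentiation, saturates at 1
    elif v < -v_th:
        w += eta * (v + v_th) * w           # depression, saturates at 0
    return min(max(w, 0.0), 1.0)
```

Sub-threshold voltages read the synapse without modifying it, which is exactly the property that makes such devices usable as non-volatile synaptic weights.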
Affiliation(s)
- Nick Diederich
- Nanoelektronik, Technische Fakultät, Christian-Albrechts-Universität zu Kiel, D-24143 Kiel, Germany
- Department of Neurology, Memory Disorders and Plasticity Group, University Hospital Schleswig-Holstein, Kiel, Germany
- Thorsten Bartsch
- Department of Neurology, Memory Disorders and Plasticity Group, University Hospital Schleswig-Holstein, Kiel, Germany
- Hermann Kohlstedt
- Nanoelektronik, Technische Fakultät, Christian-Albrechts-Universität zu Kiel, D-24143 Kiel, Germany
- Martin Ziegler
- Nanoelektronik, Technische Fakultät, Christian-Albrechts-Universität zu Kiel, D-24143 Kiel, Germany
6
Weissenberger F, Gauy MM, Lengler J, Meier F, Steger A. Voltage dependence of synaptic plasticity is essential for rate based learning with short stimuli. Sci Rep 2018; 8:4609. PMID: 29545553; PMCID: PMC5854671; DOI: 10.1038/s41598-018-22781-0.
Abstract
In computational neuroscience, synaptic plasticity rules are often formulated in terms of firing rates. The predominant description of in vivo neuronal activity, however, is the instantaneous rate (or spiking probability). In this article we resolve this discrepancy by showing that fluctuations of the membrane potential carry enough information to permit a precise estimate of the instantaneous rate in balanced networks. As a consequence, we find that rate-based plasticity rules are not restricted to neuronal activity that is stable for hundreds of milliseconds to seconds, but can be carried over to situations in which it changes every few milliseconds. We illustrate this by showing that a voltage-dependent realization of the classical BCM rule achieves input selectivity even if stimulus duration is reduced to a few milliseconds.
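A toy version of the argument: read an instantaneous rate off the membrane potential, then feed it into a BCM-style update. The sigmoid transfer function and all constants are assumptions for illustration, not the paper's estimator:

```python
import math

def rate_estimate(v, v0=-55.0, beta=0.5, rate_max=100.0):
    """Map a (filtered) membrane potential (mV) to an instantaneous firing
    rate (Hz) via a sigmoid transfer function (an illustrative choice)."""
    return rate_max / (1.0 + math.exp(-beta * (v - v0)))

def bcm_step(w, pre_rate, v_post, theta, eta=1e-5, tau_theta=100.0, dt=1.0):
    """One Euler step of a voltage-dependent BCM rule.

    The postsynaptic 'activity' is read from the membrane potential
    rather than a slow rate average, which is what lets the rule follow
    stimuli lasting only a few milliseconds. The sliding threshold theta
    follows a common BCM variant proportional to recent squared activity.
    """
    post = rate_estimate(v_post)
    w += dt * eta * pre_rate * post * (post - theta)
    theta += dt * (post ** 2 / 100.0 - theta) / tau_theta
    return w, theta
```

With activity above the sliding threshold the weight grows; sustained high activity raises theta and eventually reverses the sign of the update, producing the selectivity described above.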
Affiliation(s)
- Felix Weissenberger
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092 Zürich, Switzerland
- Marcelo Matheus Gauy
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092 Zürich, Switzerland
- Johannes Lengler
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092 Zürich, Switzerland
- Florian Meier
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092 Zürich, Switzerland
- Angelika Steger
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092 Zürich, Switzerland
7
Gopalakrishnan R, Basu A. Triplet Spike Time-Dependent Plasticity in a Floating-Gate Synapse. IEEE Trans Neural Netw Learn Syst 2017; 28:778-790. PMID: 26841419; DOI: 10.1109/TNNLS.2015.2506740.
Abstract
Synapses play an important role in learning in a neural network; the learning rules that modify synaptic strength based on the timing difference between pre- and postsynaptic spikes are termed spike-timing-dependent plasticity (STDP) rules. The most commonly used rule posits a weight change based on the time difference between one presynaptic spike and one postsynaptic spike and is hence termed doublet STDP (D-STDP). However, D-STDP cannot reproduce the results of many biological experiments; a triplet STDP (T-STDP) that considers triplets of spikes as the fundamental unit has been proposed to explain these observations. This paper describes a compact implementation of a synapse using a single floating-gate (FG) transistor that can store a weight in a nonvolatile manner, and demonstrates the T-STDP learning rule by modifying drain voltages according to triplets of spikes. We describe a mathematical procedure to obtain the control voltages for the FG device for T-STDP and show measurement results from an FG synapse fabricated in a TSMC 0.35-μm CMOS process to support the theory. A possible very-large-scale integration implementation of the drain-voltage waveform generator circuits is also presented, along with simulation results.
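For reference, a minimal software version of an all-to-all triplet rule in the spirit of Pfister and Gerstner's T-STDP model; parameter values are illustrative, not those of the floating-gate circuit:

```python
import math

class TripletSTDP:
    """Minimal all-to-all triplet STDP: a presynaptic trace r1, a fast
    postsynaptic trace o1 (pair depression), and a slow postsynaptic
    trace o2 that supplies the triplet contribution to potentiation."""
    def __init__(self, w=0.5, a2p=5e-3, a2m=7e-3, a3p=6e-3,
                 tau_plus=16.8e-3, tau_minus=33.7e-3, tau_y=114e-3):
        self.w = w
        self.a2p, self.a2m, self.a3p = a2p, a2m, a3p
        self.tau_plus, self.tau_minus, self.tau_y = tau_plus, tau_minus, tau_y
        self.r1 = self.o1 = self.o2 = 0.0
        self.t = 0.0

    def _decay(self, t):
        dt = t - self.t
        self.r1 *= math.exp(-dt / self.tau_plus)
        self.o1 *= math.exp(-dt / self.tau_minus)
        self.o2 *= math.exp(-dt / self.tau_y)
        self.t = t

    def pre_spike(self, t):
        self._decay(t)
        self.w -= self.o1 * self.a2m          # pair-based depression
        self.r1 += 1.0

    def post_spike(self, t):
        self._decay(t)
        # pair potentiation plus triplet term using the slow trace value
        # from before this spike
        self.w += self.r1 * (self.a2p + self.a3p * self.o2)
        self.o1 += 1.0
        self.o2 += 1.0
```

A pre-before-post pairing potentiates, a post-before-pre pairing depresses, and repeated postsynaptic spikes boost potentiation through o2, which is what the drain-voltage waveforms emulate in hardware.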
8
Mayr C, Partzsch J, Noack M, Hänzsche S, Scholze S, Höppner S, Ellguth G, Schüffny R. A Biological-Realtime Neuromorphic System in 28 nm CMOS Using Low-Leakage Switched Capacitor Circuits. IEEE Trans Biomed Circuits Syst 2016; 10:243-254. PMID: 25680215; DOI: 10.1109/TBCAS.2014.2379294.
Abstract
A switched-capacitor (SC) neuromorphic system for closed-loop neural coupling in 28 nm CMOS is presented, occupying 600 μm × 600 μm. It offers 128 input channels (i.e., presynaptic terminals), 8192 synapses, and 64 output channels (i.e., neurons). Biologically realistic neuron and synapse dynamics are achieved via a faithful translation of the behavioral equations to SC circuits. As leakage currents significantly affect circuit behavior at this technology node, dedicated compensation techniques are employed to achieve biological real-time operation, with faithful reproduction of time constants of several hundred milliseconds at room temperature. The power draw of the overall system is 1.9 mW.
10
Partzsch J, Schüffny R. Network-driven design principles for neuromorphic systems. Front Neurosci 2015; 9:386. PMID: 26539079; PMCID: PMC4611986; DOI: 10.3389/fnins.2015.00386.
Abstract
Synaptic connectivity is typically the most resource-demanding part of neuromorphic systems. Commonly, the architecture of these systems is chosen mainly on the basis of technical considerations. As a consequence, the potential for optimization arising from the inherent constraints of connectivity models is left unused. In this article, we develop an alternative, network-driven approach to neuromorphic architecture design. We describe methods to analyse the performance of existing neuromorphic architectures in emulating certain connectivity models. Furthermore, we show step by step how to derive a neuromorphic architecture from a given connectivity model. For this, we introduce a generalized description for architectures with a synapse matrix, which takes into account shared use of circuit components to reduce total silicon area. Architectures designed with this approach are fitted to a connectivity model, essentially adapting to its connection density. They guarantee faithful reproduction of the model on chip while requiring less total silicon area. Overall, our methods allow designers to implement more area-efficient neuromorphic systems and to verify the usability of the connectivity resources in these systems.
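A minimal sketch of the resource comparison implied here, assuming a boolean connectivity matrix (sources × targets); `fitted_synapses` is a hypothetical simplification of the paper's generalized synapse-matrix description, not its actual method:

```python
import numpy as np

def crossbar_synapses(n_src, n_tgt):
    """Full synapse matrix: one physical synapse per (source, target) pair."""
    return n_src * n_tgt

def fitted_synapses(conn):
    """Rough area estimate for an architecture fitted to the model's
    connection density: provision each target column only for the maximum
    fan-in actually required by the connectivity model (a simplification;
    shared-component constraints are ignored here)."""
    fan_in_max = int(conn.sum(axis=0).max())
    return fan_in_max * conn.shape[1]
```

For sparse models the fitted count is far below the full crossbar, which is the source of the area savings the article quantifies.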
Affiliation(s)
- Johannes Partzsch
- Chair for Highly Parallel VLSI Systems and Neuromorphic Circuits, Department of Electrical Engineering and Information Technology, Technische Universität Dresden, Dresden, Germany
- Rene Schüffny
- Chair for Highly Parallel VLSI Systems and Neuromorphic Circuits, Department of Electrical Engineering and Information Technology, Technische Universität Dresden, Dresden, Germany
11
Jedlicka P, Benuskova L, Abraham WC. A Voltage-Based STDP Rule Combined with Fast BCM-Like Metaplasticity Accounts for LTP and Concurrent "Heterosynaptic" LTD in the Dentate Gyrus In Vivo. PLoS Comput Biol 2015; 11:e1004588. PMID: 26544038; PMCID: PMC4636250; DOI: 10.1371/journal.pcbi.1004588.
Abstract
Long-term potentiation (LTP) and long-term depression (LTD) are widely accepted to be synaptic mechanisms involved in learning and memory. It remains uncertain, however, which particular activity rules are utilized by hippocampal neurons to induce LTP and LTD in behaving animals. Recent experiments in the dentate gyrus of freely moving rats revealed an unexpected pattern of LTP and LTD from high-frequency perforant path stimulation. While 400 Hz theta-burst stimulation (400-TBS) and 400 Hz delta-burst stimulation (400-DBS) elicited substantial LTP of the tetanized medial path input and, concurrently, LTD of the non-tetanized lateral path input, 100 Hz theta-burst stimulation (100-TBS, a normally efficient LTP protocol for in vitro preparations) produced only weak LTP and concurrent LTD. Here we show in a biophysically realistic compartmental granule cell model that this pattern of results can be accounted for by a voltage-based spike-timing-dependent plasticity (STDP) rule combined with a relatively fast Bienenstock-Cooper-Munro (BCM)-like homeostatic metaplasticity rule, all on a background of ongoing spontaneous activity in the input fibers. Our results suggest that, at least for dentate granule cells, the interplay of STDP-BCM plasticity rules and ongoing pre- and postsynaptic background activity determines not only the degree of input-specific LTP elicited by various plasticity-inducing protocols, but also the degree of associated LTD in neighboring non-tetanized inputs, as generated by the ongoing constitutive activity at these synapses.
The vast majority of computational studies that model synaptic plasticity neglect the fact that in vivo neurons exhibit ongoing spontaneous spiking, which affects the dynamics of synaptic changes. Here we study how key components of learning mechanisms in the brain, namely spike-timing-dependent plasticity and metaplasticity, interact with spontaneous activity in the input pathways of the neuron. Using biologically realistic simulations, we show that ongoing background activity is a key determinant of the degree of long-term potentiation and long-term depression of synaptic transmission between nerve cells in the hippocampus of freely moving animals. This work helps to better understand the computational rules that drive synaptic plasticity in vivo.
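A toy sketch of how a voltage-based STDP rule can be combined with a fast BCM-like sliding threshold; all constants are illustrative, not the granule-cell model's fitted values:

```python
class VoltageSTDPWithMetaplasticity:
    """Voltage-based STDP with a fast BCM-like sliding LTP threshold.

    At each presynaptic spike the weight change depends on the local
    postsynaptic voltage: mild depolarization depresses, strong
    depolarization potentiates. Recent postsynaptic activity slides the
    LTP threshold, making further potentiation harder after strong
    activity (the metaplasticity component)."""
    def __init__(self, w=1.0, theta_minus=-70.0, theta_plus=-45.0,
                 a_ltd=1e-3, a_ltp=1e-3, tau_theta=10.0):
        self.w = w
        self.theta_minus = theta_minus   # LTD voltage threshold (mV)
        self.theta_plus = theta_plus     # LTP voltage threshold (mV)
        self.avg_post = 0.0              # running postsynaptic activity
        self.a_ltd, self.a_ltp = a_ltd, a_ltp
        self.tau_theta = tau_theta

    def on_pre_spike(self, v_post):
        if v_post > self.theta_plus:
            self.w += self.a_ltp * (v_post - self.theta_plus)
        elif v_post > self.theta_minus:
            self.w -= self.a_ltd * (v_post - self.theta_minus)

    def update_metaplasticity(self, post_rate, dt=1.0):
        # fast BCM-like rule: recent activity raises the LTP threshold
        self.avg_post += dt * (post_rate - self.avg_post) / self.tau_theta
        self.theta_plus = -45.0 + 0.1 * self.avg_post
```

With ongoing background activity included, non-tetanized inputs spend most of their time in the mildly depolarized (LTD) regime, which is one intuition for the concurrent heterosynaptic LTD described above.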
Affiliation(s)
- Peter Jedlicka
- Institute of Clinical Neuroanatomy, Neuroscience Center, Goethe University Frankfurt, Frankfurt, Germany
- Lubica Benuskova
- Department of Computer Science, University of Otago, Dunedin, New Zealand
- Brain Health Research Centre and Brain Research New Zealand, University of Otago, Dunedin, New Zealand
- Wickliffe C. Abraham
- Brain Health Research Centre and Brain Research New Zealand, University of Otago, Dunedin, New Zealand
- Department of Psychology, University of Otago, Dunedin, New Zealand
12
Mostafa H, Khiat A, Serb A, Mayr CG, Indiveri G, Prodromakis T. Implementation of a spike-based perceptron learning rule using TiO2-x memristors. Front Neurosci 2015; 9:357. PMID: 26483629; PMCID: PMC4591430; DOI: 10.3389/fnins.2015.00357.
Abstract
Synaptic plasticity plays a crucial role in allowing neural networks to learn and adapt to various input environments. Neuromorphic systems need to implement plastic synapses to obtain basic "cognitive" capabilities such as learning. One promising and scalable approach for implementing neuromorphic synapses is to use nano-scale memristors as synaptic elements. In this paper we propose a hybrid CMOS-memristor system comprising CMOS neurons interconnected through TiO2-x memristors, and spike-based learning circuits that modulate the conductance of the memristive synapse elements according to a spike-based Perceptron plasticity rule. We highlight a number of advantages for using this spike-based plasticity rule as compared to other forms of spike timing dependent plasticity (STDP) rules. We provide experimental proof-of-concept results with two silicon neurons connected through a memristive synapse that show how the CMOS plasticity circuits can induce stable changes in memristor conductances, giving rise to increased synaptic strength after a potentiation episode and to decreased strength after a depression episode.
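Schematically, a spike-based perceptron rule updates a weight only when the neuron's output disagrees with a teacher signal, in contrast to STDP, which updates on every causal spike pair. This sketch abstracts away the memristor programming circuits and uses a hypothetical fixed step size:

```python
def perceptron_spike_update(w, pre_active, post_spiked, target_spiked,
                            delta=0.05):
    """Spike-based perceptron rule (schematic).

    The weight of an active presynaptic input changes only on an output
    error: potentiate when the neuron failed to fire but should have,
    depress when it fired but should not have. delta is an illustrative
    step size, not a device-derived value.
    """
    if not pre_active:
        return w
    if target_spiked and not post_spiked:
        w += delta     # missed spike: strengthen the input
    elif post_spiked and not target_spiked:
        w -= delta     # spurious spike: weaken the input
    return w
```

Because updates stop once the output matches the teacher, the rule drives memristor conductances toward a stable point instead of pushing them to their bounds, one of the advantages over plain STDP highlighted above.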
Affiliation(s)
- Hesham Mostafa
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Ali Khiat
- Nanoelectronics and Nanotechnology Research Group, School of Electronics and Computer Science, University of Southampton, United Kingdom
- Alexander Serb
- Nanoelectronics and Nanotechnology Research Group, School of Electronics and Computer Science, University of Southampton, United Kingdom
- Christian G Mayr
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Giacomo Indiveri
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Themis Prodromakis
- Nanoelectronics and Nanotechnology Research Group, School of Electronics and Computer Science, University of Southampton, United Kingdom
13
Du N, Kiani M, Mayr CG, You T, Bürger D, Skorupa I, Schmidt OG, Schmidt H. Single pairing spike-timing dependent plasticity in BiFeO3 memristors with a time window of 25 ms to 125 μs. Front Neurosci 2015; 9:227. PMID: 26175666; PMCID: PMC4485154; DOI: 10.3389/fnins.2015.00227.
Abstract
Memristive devices are popular among neuromorphic engineers for their ability to emulate forms of spike-driven synaptic plasticity by applying specific voltage and current waveforms at their two terminals. In this paper, we investigate spike-timing dependent plasticity (STDP) with a single pairing of one presynaptic and one postsynaptic voltage spike in a BiFeO3 memristive device. In most memristive materials, the learning window is primarily a function of the material characteristics and not of the applied waveform. In contrast, we show that the analog resistive switching of the developed artificial synapses allows the learning time constant of the STDP function to be adjusted from 25 ms to 125 μs via the duration of the applied voltage spikes. Since the induced weight change may degrade, we also investigate the remanence of the resistance change for several hours after analog resistive switching, thus emulating the processes expected in biological synapses. As power consumption is a major constraint in neuromorphic circuits, we show methods to reduce the consumed energy per setting pulse to only 4.5 pJ in the developed artificial synapses.
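A minimal sketch of an STDP window whose time constant follows the applied pulse duration; the direct proportionality `tau = pulse_width` is an illustrative assumption, not a measured device law:

```python
import math

def stdp_window(delta_t, pulse_width, a_plus=1.0, a_minus=1.0):
    """Exponential STDP window with a tunable time constant.

    delta_t: post-minus-pre spike time difference (s).
    pulse_width: duration of the applied voltage spikes (s); here it is
    assumed to set the window's time constant directly, mimicking the
    tunable 25 ms to 125 us learning window of the device.
    """
    tau = pulse_width              # assumed: tau tracks pulse duration
    if delta_t >= 0:               # pre before post: potentiation
        return a_plus * math.exp(-delta_t / tau)
    return -a_minus * math.exp(delta_t / tau)
```

Shrinking `pulse_width` thus compresses the whole window, so the same spike-pair timing yields a much more temporally selective weight change.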
Affiliation(s)
- Nan Du
- Material Systems for Nanoelectronics, Faculty of Electrical and Information Engineering, Chemnitz University of Technology, Chemnitz, Germany
- Mahdi Kiani
- Material Systems for Nanoelectronics, Faculty of Electrical and Information Engineering, Chemnitz University of Technology, Chemnitz, Germany
- Christian G Mayr
- Neuromorphic Cognitive Systems Group, Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Tiangui You
- Material Systems for Nanoelectronics, Faculty of Electrical and Information Engineering, Chemnitz University of Technology, Chemnitz, Germany
- Danilo Bürger
- Material Systems for Nanoelectronics, Faculty of Electrical and Information Engineering, Chemnitz University of Technology, Chemnitz, Germany
- Ilona Skorupa
- Material Systems for Nanoelectronics, Faculty of Electrical and Information Engineering, Chemnitz University of Technology, Chemnitz, Germany
- Semiconductor Materials, Institute of Ion Beam Physics and Materials Research, HZDR Innovation GmbH, Dresden, Germany
- Oliver G Schmidt
- Material Systems for Nanoelectronics, Faculty of Electrical and Information Engineering, Chemnitz University of Technology, Chemnitz, Germany
- Institute for Integrative Nanosciences, IFW Dresden, Dresden, Germany
- Heidemarie Schmidt
- Material Systems for Nanoelectronics, Faculty of Electrical and Information Engineering, Chemnitz University of Technology, Chemnitz, Germany
14
Saïghi S, Mayr CG, Serrano-Gotarredona T, Schmidt H, Lecerf G, Tomas J, Grollier J, Boyn S, Vincent AF, Querlioz D, La Barbera S, Alibart F, Vuillaume D, Bichler O, Gamrat C, Linares-Barranco B. Plasticity in memristive devices for spiking neural networks. Front Neurosci 2015; 9:51. PMID: 25784849; PMCID: PMC4345885; DOI: 10.3389/fnins.2015.00051.
Abstract
Memristive devices represent a new device technology allowing the realization of compact non-volatile memories; some are already in the process of industrialization. Additionally, they exhibit complex multilevel and plastic behaviors, which make them good candidates for the implementation of artificial synapses in neuromorphic engineering. However, memristive effects rely on diverse physical mechanisms, and their plastic behaviors differ strongly from one technology to another. Here, we present measurements performed on different memristive devices and the opportunities that they provide. We show that they can be used to implement different learning rules whose properties emerge directly from device physics: real-time or accelerated operation, deterministic or stochastic behavior, long-term or short-term plasticity. We then discuss how such devices might be integrated into a complete architecture. These results highlight that there is no unique way to exploit memristive devices in neuromorphic systems; understanding and embracing device physics is the key to their optimal use.
Affiliation(s)
- Sylvain Saïghi
- Laboratoire d'Intégration du Matériau au Système, UMR CNRS 5218, Université de Bordeaux, Talence, France
- Christian G. Mayr
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Heidemarie Schmidt
- Faculty of Electrical Engineering and Information Technology, Technische Universität Chemnitz, Chemnitz, Germany
- Gwendal Lecerf
- Laboratoire d'Intégration du Matériau au Système, UMR CNRS 5218, Université de Bordeaux, Talence, France
- Jean Tomas
- Laboratoire d'Intégration du Matériau au Système, UMR CNRS 5218, Université de Bordeaux, Talence, France
- Julie Grollier
- Unité Mixte de Physique CNRS/Thales, Palaiseau, France, associated with Université Paris-Sud, Orsay, France
- Sören Boyn
- Unité Mixte de Physique CNRS/Thales, Palaiseau, France, associated with Université Paris-Sud, Orsay, France
- Adrien F. Vincent
- Institut d'Electronique Fondamentale, Université Paris-Sud, CNRS, Orsay, France
- Damien Querlioz
- Institut d'Electronique Fondamentale, Université Paris-Sud, CNRS, Orsay, France
- Selina La Barbera
- Institut d'Electronique, Microelectronique et Nanotechnologies, UMR CNRS 8520, Villeneuve d'Ascq, France
- Fabien Alibart
- Institut d'Electronique, Microelectronique et Nanotechnologies, UMR CNRS 8520, Villeneuve d'Ascq, France
- Dominique Vuillaume
- Institut d'Electronique, Microelectronique et Nanotechnologies, UMR CNRS 8520, Villeneuve d'Ascq, France
- Bernabé Linares-Barranco
- Instituto de Microelectrónica de Sevilla, IMSE-CNM, Universidad de Sevilla and CSIC, Sevilla, Spain
15
|
Noack M, Partzsch J, Mayr CG, Hänzsche S, Scholze S, Höppner S, Ellguth G, Schüffny R. Switched-capacitor realization of presynaptic short-term-plasticity and stop-learning synapses in 28 nm CMOS. Front Neurosci 2015; 9:10. [PMID: 25698914 PMCID: PMC4313588 DOI: 10.3389/fnins.2015.00010] [Citation(s) in RCA: 22] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2014] [Accepted: 01/09/2015] [Indexed: 11/13/2022] Open
Abstract
Synaptic dynamics, such as long- and short-term plasticity, play an important role in the complexity and biological realism achievable when running neural networks on a neuromorphic IC. For example, they endow the IC with an ability to adapt and learn from its environment. In order to achieve the millisecond to second time constants required for these synaptic dynamics, analog subthreshold circuits are usually employed. However, due to process variation and leakage problems, it is almost impossible to port these types of circuits to modern sub-100 nm technologies. In contrast, we present a neuromorphic system in a 28 nm CMOS process that employs switched-capacitor (SC) circuits to implement 128 short-term-plasticity presynapses as well as 8192 stop-learning synapses. The neuromorphic system consumes an area of 0.36 mm² and runs at a power consumption of 1.9 mW. The circuit makes use of a technique for minimizing leakage effects, allowing for real-time operation with time constants up to several seconds. Since we rely on SC techniques for all calculations, the system is composed of only generic mixed-signal building blocks. These generic building blocks make the system easy to port between technologies, and the large digital circuit part inherent in an SC system benefits fully from technology scaling.
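The second-scale time constants mentioned here follow from standard switched-capacitor theory: each clock cycle a small capacitor Cs removes a fraction Cs/Ci of the charge on the integration capacitor Ci, so the state obeys x[n+1] = (1 − Cs/Ci)·x[n] and the effective time constant is roughly T_clk·Ci/Cs. A minimal sketch of that discrete-time leak (capacitor ratio and clock rate are illustrative, not the paper's values):

```python
# Discrete-time decay computed by a switched-capacitor leak: each clock cycle
# removes the fraction cs_over_ci (= Cs/Ci) of the stored charge, so the
# effective time constant is tau ~ T_clk * Ci / Cs. Values are illustrative.
import math

def sc_decay(x0, cs_over_ci, n_steps):
    """State remaining after n_steps clock cycles of SC leakage."""
    x = x0
    for _ in range(n_steps):
        x *= 1.0 - cs_over_ci
    return x

# With Cs/Ci = 1/1000, the state falls to roughly 1/e after 1000 cycles;
# at a 10 kHz clock that is a ~100 ms time constant from a modest cap ratio.
```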
Collapse
Affiliation(s)
- Marko Noack
- Chair of Highly-Parallel VLSI-Systems and Neuromorphic Circuits, Technische Universität Dresden, Dresden, Germany
- Johannes Partzsch
- Chair of Highly-Parallel VLSI-Systems and Neuromorphic Circuits, Technische Universität Dresden, Dresden, Germany
- Christian G. Mayr
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Stefan Hänzsche
- Chair of Highly-Parallel VLSI-Systems and Neuromorphic Circuits, Technische Universität Dresden, Dresden, Germany
- Stefan Scholze
- Chair of Highly-Parallel VLSI-Systems and Neuromorphic Circuits, Technische Universität Dresden, Dresden, Germany
- Sebastian Höppner
- Chair of Highly-Parallel VLSI-Systems and Neuromorphic Circuits, Technische Universität Dresden, Dresden, Germany
- Georg Ellguth
- Chair of Highly-Parallel VLSI-Systems and Neuromorphic Circuits, Technische Universität Dresden, Dresden, Germany
- Rene Schüffny
- Chair of Highly-Parallel VLSI-Systems and Neuromorphic Circuits, Technische Universität Dresden, Dresden, Germany
Collapse
|
16
|
Tunable low energy, compact and high performance neuromorphic circuit for spike-based synaptic plasticity. PLoS One 2014; 9:e88326. [PMID: 24551089 PMCID: PMC3923791 DOI: 10.1371/journal.pone.0088326] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/04/2013] [Accepted: 01/12/2014] [Indexed: 11/26/2022] Open
Abstract
Cortical circuits in the brain have long been recognised for their information processing capabilities and have been studied both experimentally and theoretically via spiking neural networks. Neuromorphic engineers are primarily concerned with translating the computational capabilities of biological cortical circuits, using the Spiking Neural Network (SNN) paradigm, into in silico applications that can mimic the behaviour and capabilities of real biological circuits/systems. These capabilities include low power consumption, compactness, and relevant dynamics. In this paper, we propose a new accelerated-time circuit that has several advantages over its previous neuromorphic counterparts in terms of compactness, power consumption, and capability to mimic the outcomes of biological experiments. Circuit simulation results demonstrate that, compared to previously published synaptic plasticity circuits, the new circuit achieves reduced silicon area and lower energy consumption per spike. In addition, it can be tuned in order to closely mimic the outcomes of various spike timing- and rate-based synaptic plasticity experiments. The proposed circuit is also investigated and compared to other designs in terms of tolerance to mismatch and process variation. Monte Carlo simulation results show that the proposed design is much more stable than its previous counterparts in terms of vulnerability to transistor mismatch, which is a significant challenge in analog neuromorphic design. All these features make the proposed design an ideal circuit for use in large scale SNNs, which aim at implementing neuromorphic systems with an inherent capability to adapt to a continuously changing environment, thus leading to systems with significant learning and computational abilities.
Collapse
|
17
|
Standage D, Trappenberg T, Blohm G. Calcium-dependent calcium decay explains STDP in a dynamic model of hippocampal synapses. PLoS One 2014; 9:e86248. [PMID: 24465987 PMCID: PMC3899242 DOI: 10.1371/journal.pone.0086248] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/18/2013] [Accepted: 12/12/2013] [Indexed: 11/18/2022] Open
Abstract
It is widely accepted that the direction and magnitude of synaptic plasticity depend on post-synaptic calcium flux, where high levels of calcium lead to long-term potentiation and moderate levels lead to long-term depression. At synapses onto neurons in region CA1 of the hippocampus (and many other synapses), NMDA receptors provide the relevant source of calcium. In this regard, post-synaptic calcium captures the coincidence of pre- and post-synaptic activity, due to the blockage of these receptors at low voltage. Previous studies show that under spike timing dependent plasticity (STDP) protocols, potentiation at CA1 synapses requires post-synaptic bursting and an inter-pairing frequency in the range of the hippocampal theta rhythm. We hypothesize that these requirements reflect the saturation of the mechanisms of calcium extrusion from the post-synaptic spine. We test this hypothesis with a minimal model of NMDA receptor-dependent plasticity, simulating slow extrusion with a calcium-dependent calcium time constant. In simulations of STDP experiments, the model accounts for latency-dependent depression with either post-synaptic bursting or theta-frequency pairing (or neither) and accounts for latency-dependent potentiation when both of these requirements are met. The model makes testable predictions for STDP experiments and our simple implementation is tractable at the network level, demonstrating associative learning in a biophysical network model with realistic synaptic dynamics.
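The core hypothesis — extrusion that saturates, modeled as a calcium-dependent calcium decay time constant — can be sketched in a toy integrator. This is an illustrative reconstruction, not the authors' published model; all parameter values and thresholds are invented:

```python
# Toy sketch of calcium-based plasticity with a calcium-dependent decay time
# constant: extrusion slows as calcium rises, so closely spaced pairings
# (e.g. theta-frequency) summate to LTP-level calcium while sparse pairings
# do not. All parameters are illustrative, not fitted values.
def simulate(pulse_times, dt=0.1, t_end=200.0,
             ca_pulse=0.4, tau0=20.0, k=80.0, kd=1.0,
             theta_d=0.35, theta_p=0.55):
    """Integrate a calcium trace (ms units); return (peak_ca, outcome)."""
    ca, t, peak = 0.0, 0.0, 0.0
    pulses, i = sorted(pulse_times), 0
    while t < t_end:
        if i < len(pulses) and t >= pulses[i]:
            ca += ca_pulse                    # influx at each pairing event
            i += 1
        tau = tau0 + k * ca / (kd + ca)       # extrusion saturates at high ca
        ca += dt * (-ca / tau)                # forward-Euler decay step
        peak = max(peak, ca)
        t += dt
    if peak >= theta_p:
        return peak, "LTP"
    if peak >= theta_d:
        return peak, "LTD"
    return peak, "no change"
```

With these toy numbers, pairings 50 ms apart accumulate a higher calcium peak than pairings 500 ms apart, mirroring the frequency requirement the abstract describes.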
Collapse
Affiliation(s)
- Dominic Standage
- Department of Biomedical and Molecular Sciences and Center for Neuroscience Studies, Queen’s University, Kingston, Ontario, Canada
- Thomas Trappenberg
- Faculty of Computer Science, Dalhousie University, Halifax, Nova Scotia, Canada
- Gunnar Blohm
- Department of Biomedical and Molecular Sciences and Center for Neuroscience Studies, Queen’s University, Kingston, Ontario, Canada
Collapse
|
18
|
Rahimi Azghadi M, Al-Sarawi S, Abbott D, Iannella N. A neuromorphic VLSI design for spike timing and rate based synaptic plasticity. Neural Netw 2013; 45:70-82. [PMID: 23566339 DOI: 10.1016/j.neunet.2013.03.003] [Citation(s) in RCA: 28] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2012] [Revised: 12/14/2012] [Accepted: 03/03/2013] [Indexed: 11/27/2022]
Abstract
Triplet-based Spike Timing Dependent Plasticity (TSTDP) is a powerful synaptic plasticity rule that acts beyond conventional pair-based STDP (PSTDP). TSTDP is capable of reproducing the outcomes of a variety of biological experiments that the PSTDP rule fails to reproduce. Additionally, it has been shown that the behaviour inherent to the spike rate-based Bienenstock-Cooper-Munro (BCM) synaptic plasticity rule can also emerge from the TSTDP rule. This paper proposes an analogue implementation of the TSTDP rule. The proposed VLSI circuit has been designed using the AMS 0.35 μm CMOS process and has been simulated using design kits for Synopsys and Cadence tools. Simulation results demonstrate how well the proposed circuit can alter synaptic weights according to the timing differences among a set of different spike patterns. Furthermore, the circuit is shown to give rise to a BCM-like learning rule, which is a rate-based rule. To mimic an implementation environment, a 1000 run Monte Carlo (MC) analysis was conducted on the proposed circuit. The presented MC simulation analysis and the simulation results from fine-tuned circuits show that it is possible to mitigate the effect of process variations in the proof of concept circuit; however, a practical variation-aware design technique is required to ensure high circuit performance in a large-scale neural network. We believe that the proposed design can play a significant role in future VLSI implementations of both spike timing and rate based neuromorphic learning systems.
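For reference, the triplet rule the circuit emulates can be sketched in software using the standard trace formulation (a pre trace pair r1/r2 and a post trace pair o1/o2, with the triplet terms read just before the current spike increments its traces). The time constants below are commonly quoted fits for this rule; the amplitudes and initial weight are illustrative:

```python
# Sketch of the triplet STDP (TSTDP) weight update, trace formulation.
# Depression on a pre spike uses the post trace o1 plus a triplet term in r2;
# potentiation on a post spike uses the pre trace r1 plus a triplet term in o2.
# Amplitudes (a2p, a2m, a3p, a3m) are illustrative, not fitted values.
import math

def triplet_stdp(pre_spikes, post_spikes, w=0.5,
                 tau_plus=16.8, tau_minus=33.7, tau_x=101.0, tau_y=125.0,
                 a2p=5e-3, a2m=7e-3, a3p=6e-3, a3m=2e-4):
    """Return the weight after replaying pre/post spike trains (times in ms)."""
    events = sorted([(t, "pre") for t in pre_spikes] +
                    [(t, "post") for t in post_spikes])
    r1 = r2 = o1 = o2 = 0.0   # pre (r) and post (o) exponential spike traces
    t_last = 0.0
    for t, kind in events:
        dt = t - t_last
        r1 *= math.exp(-dt / tau_plus)   # decay all traces to event time
        r2 *= math.exp(-dt / tau_x)
        o1 *= math.exp(-dt / tau_minus)
        o2 *= math.exp(-dt / tau_y)
        if kind == "pre":
            w -= o1 * (a2m + a3m * r2)   # pair + triplet depression
            r1 += 1.0
            r2 += 1.0
        else:
            w += r1 * (a2p + a3p * o2)   # pair + triplet potentiation
            o1 += 1.0
            o2 += 1.0
        t_last = t
    return w
```

Setting `a3p = a3m = 0` recovers plain pair-based STDP, which is the comparison the abstract draws.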
Collapse
Affiliation(s)
- Mostafa Rahimi Azghadi
- School of Electrical and Electronic Engineering, The University of Adelaide, Adelaide, SA 5005, Australia.
Collapse
|
19
|
Bugmann G. Modeling fast stimulus-response association learning along the occipito-parieto-frontal pathway following rule instructions. Brain Res 2011; 1434:73-89. [PMID: 22041227 DOI: 10.1016/j.brainres.2011.09.028] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/15/2011] [Revised: 06/24/2011] [Accepted: 09/15/2011] [Indexed: 10/17/2022]
Abstract
On the basis of instructions, humans are able to set up associations between sensory and motor areas of the brain separated by several neuronal relays, within a few seconds. This paper proposes a model of fast learning along the dorsal pathway, from primary visual areas to pre-motor cortex. A new synaptic learning rule is proposed where synaptic efficacies converge rapidly toward a specific value determined by the number of active inputs of a neuron, respecting a resource-limitation principle on the total synaptic input efficacy available to a neuron. The efficacies are stable with regard to repeated arrival of spikes in a spike train. This rule reproduces the inverse relationship between initial and final synaptic efficacy observed in long-term potentiation (LTP) experiments. Simulations of learning experiments are conducted in a multilayer network of leaky integrate-and-fire (LIF) spiking neuron models. It is proposed that cortical feedback connections convey a top-down learning-enabling signal that guides bottom-up learning in "hidden" neurons that are not directly exposed to input or output activity. Simulations of repeated presentation of the same stimulus-response pair show that, under conditions of fast learning with probabilistic synaptic transmission, the networks tend to recruit a new sub-network at each presentation to represent the association, rather than re-using a previously trained one. This increasing allocation of neural resources results in progressively shorter execution times, in line with experimentally observed reduction in response time with practice. This article is part of a Special Issue entitled: Neural Coding.
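As a toy illustration of the resource-limitation principle described in the abstract — active synapses converging toward an equal share of a fixed total efficacy budget — one might write the following. The function, rate, and budget are hypothetical, not the paper's rule:

```python
# Toy sketch of a resource-limited fast learning rule: efficacies of active
# synapses relax toward w_total / len(active), so synapses that start small
# change the most -- the inverse relationship between initial and final
# efficacy noted in the abstract. All parameters are hypothetical.
def fast_learn(weights, active, w_total=1.0, rate=0.5, steps=10):
    """Relax the efficacies of the active synapses toward an equal share of
    the neuron's fixed total input-efficacy budget; inactive ones are kept."""
    w = list(weights)
    target = w_total / len(active)
    for _ in range(steps):
        for i in active:
            w[i] += rate * (target - w[i])
    return w
```

Convergence is fast by construction: the distance to the target shrinks by the factor (1 − rate) per step, matching the "within a few seconds" regime the model targets.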
Collapse
Affiliation(s)
- Guido Bugmann
- Centre for Robotic and Neural Systems, University of Plymouth, Drake Circus, Plymouth PL4 8AA, UK.
Collapse
|