1. Khanjanianpak M, Azimi-Tafreshi N, Valizadeh A. Emergence of complex oscillatory dynamics in the neuronal networks with long activity time of inhibitory synapses. iScience 2024; 27:109401. PMID: 38532887; PMCID: PMC10963234; DOI: 10.1016/j.isci.2024.109401.
Abstract
The brain displays complex dynamics, including collective oscillations, and extensive research has been conducted to understand their generation. However, our understanding of how biological constraints influence these oscillations is incomplete. This study investigates the essential properties neuronal networks need in order to generate oscillations resembling those in the brain. A simple discrete-time model of interconnected excitable elements is developed that closely reproduces the complex oscillations observed in biological neural networks. In the model, synaptic connections remain active for a duration exceeding the activity of individual neurons. We show that inhibitory synapses must remain active longer than excitatory synapses to produce a diverse range of dynamical states, including biologically plausible oscillations. Once this condition is met, transitions between the different dynamical states can be controlled by external stochastic input to the neurons. The study provides a comprehensive explanation for the emergence of distinct dynamical states in neural networks based on specific parameters.
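The abstract's three ingredients (discrete-time excitable units, sparse coupling, and synapses that stay active longer than the neuron that fired them, with the inhibitory duration exceeding the excitatory one) can be sketched as a toy simulation. This is a minimal illustration assuming a binary-unit network; all names and parameter values (`tau_exc`, `tau_inh`, `threshold`, etc.) are illustrative, not the paper's model.

```python
import numpy as np

def simulate(n=200, steps=500, p_ext=0.01, tau_exc=2, tau_inh=6,
             frac_inh=0.2, threshold=0.02, seed=0):
    """Toy discrete-time network in which a spiking neuron's synaptic output
    stays active for tau_exc or tau_inh steps, longer than the spike itself."""
    rng = np.random.default_rng(seed)
    inh = rng.random(n) < frac_inh                  # inhibitory subpopulation
    w = (rng.random((n, n)) < 0.1).astype(float)    # sparse random coupling
    np.fill_diagonal(w, 0.0)
    timer = np.zeros(n, dtype=int)                  # remaining synaptic-activity steps
    rate = []
    for _ in range(steps):
        syn_on = timer > 0                          # synapses still active this step
        drive = (w[:, syn_on & ~inh].sum(axis=1)
                 - w[:, syn_on & inh].sum(axis=1)) / n
        spike = (drive > threshold) | (rng.random(n) < p_ext)
        # a spiking neuron re-arms its synapse for its type-specific duration
        timer = np.where(spike, np.where(inh, tau_inh, tau_exc),
                         np.maximum(timer - 1, 0))
        rate.append(spike.mean())
    return np.array(rate)
```

With `tau_inh > tau_exc`, inhibition outlasts excitation after each population event, which is the asymmetry the paper identifies as necessary for the oscillatory regimes.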
Affiliation(s)
- Mozhgan Khanjanianpak: Physics Department, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan 45137-66731, Iran; Pasargad Institute for Advanced Innovative Solutions (PIAIS), Tehran 1991633357, Iran
- Nahid Azimi-Tafreshi: Physics Department, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan 45137-66731, Iran
- Alireza Valizadeh: Physics Department, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan 45137-66731, Iran; Pasargad Institute for Advanced Innovative Solutions (PIAIS), Tehran 1991633357, Iran
2. Peace ST, Johnson BC, Werth JC, Li G, Kaiser ME, Fukunaga I, Schaefer AT, Molnar AC, Cleland TA. Coherent olfactory bulb gamma oscillations arise from coupling independent columnar oscillators. J Neurophysiol 2024; 131:492-508. PMID: 38264784; PMCID: PMC7615692; DOI: 10.1152/jn.00361.2023.
Abstract
Spike timing-based representations of sensory information depend on embedded dynamical frameworks within neuronal networks that establish the rules of local computation and interareal communication. Here, we investigated the dynamical properties of olfactory bulb circuitry in mice of both sexes using microelectrode array recordings from slice and in vivo preparations. Neurochemical activation or optogenetic stimulation of sensory afferents evoked persistent gamma oscillations in the local field potential. These oscillations arose from slower, GABA(A) receptor-independent intracolumnar oscillators coupled by GABA(A)-ergic synapses into a faster, broadly coherent network oscillation. Consistent with the theoretical properties of coupled-oscillator networks, the spatial extent of zero-phase coherence was bounded in slices by the reduced density of lateral interactions. The intact in vivo network, however, exhibited long-range lateral interactions that suffice in simulation to enable zero-phase gamma coherence across the olfactory bulb. The timing of action potentials in a subset of principal neurons was phase-constrained with respect to evoked gamma oscillations. Coupled-oscillator dynamics in the olfactory bulb thereby enable a common clock, robust to biological heterogeneities, that is capable of supporting gamma-band spike synchronization and phase coding across the ensemble of activated principal neurons.

NEW & NOTEWORTHY: Odor stimulation evokes rhythmic gamma oscillations in the field potential of the olfactory bulb, but the dynamical mechanisms governing these oscillations have remained unclear. Establishing these mechanisms is important, as they determine the biophysical capacities of the bulbar circuit to, for example, maintain zero-phase coherence across a spatially extended network, or coordinate the timing of action potentials in principal neurons. These properties in turn constrain and suggest hypotheses of sensory coding.
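The coupled-oscillator argument can be illustrated with a generic Kuramoto model rather than the authors' biophysical circuit: heterogeneous oscillators pulled by a mean field reach near-zero-phase coherence once coupling is strong enough. Here the coupling strength `K` loosely stands in for the density of lateral interactions; all values are illustrative.

```python
import numpy as np

def kuramoto_coherence(K, n=100, dt=0.05, steps=2000, seed=1):
    """Order parameter r in [0, 1] after simulating n all-to-all coupled
    phase oscillators; r -> 1 means zero-phase coherence."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 1.0, n)          # frequency spread (rotating frame)
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    for _ in range(steps):
        z = np.exp(1j * theta).mean()        # population mean field
        # Euler step of dtheta_i/dt = omega_i + K*|z|*sin(arg(z) - theta_i)
        theta += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta))
    return float(np.abs(np.exp(1j * theta).mean()))
```

Raising `K` past its critical value synchronizes the population despite heterogeneous natural frequencies, mirroring the slice (sparse lateral coupling, bounded coherence) versus in vivo (dense coupling, bulb-wide coherence) contrast in the paper.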
Affiliation(s)
- Shane T Peace: Department of Neurobiology & Behavior, Cornell University, Ithaca, New York, United States
- Benjamin C Johnson: Department of Electrical and Computer Engineering, Cornell University, Ithaca, New York, United States
- Jesse C Werth: Department of Psychology, Cornell University, Ithaca, New York, United States
- Guoshi Li: Department of Psychology, Cornell University, Ithaca, New York, United States
- Martin E Kaiser: Behavioural Neurophysiology, Max Planck Institute for Medical Research, Heidelberg, Germany
- Izumi Fukunaga: Behavioural Neurophysiology, Max Planck Institute for Medical Research, Heidelberg, Germany; Neurophysiology of Behaviour Laboratory, The Francis Crick Institute, London, United Kingdom; Sensory and Behavioural Neuroscience Unit, Okinawa Institute of Science and Technology Graduate University, Onna, Japan
- Andreas T Schaefer: Behavioural Neurophysiology, Max Planck Institute for Medical Research, Heidelberg, Germany; Neurophysiology of Behaviour Laboratory, The Francis Crick Institute, London, United Kingdom; Department of Neuroscience, Physiology & Pharmacology, University College London, London, United Kingdom; Department of Anatomy and Cell Biology, Faculty of Medicine, University of Heidelberg, Heidelberg, Germany
- Alyosha C Molnar: Department of Electrical and Computer Engineering, Cornell University, Ithaca, New York, United States
- Thomas A Cleland: Department of Psychology, Cornell University, Ithaca, New York, United States
3. Park E, Jang S, Noh G, Jo Y, Lee DK, Kim IS, Song HC, Kim S, Kwak JY. Indium-Gallium-Zinc Oxide-Based Synaptic Charge Trap Flash for Spiking Neural Network-Restricted Boltzmann Machine. Nano Lett 2023; 23:9626-9633. PMID: 37819875; DOI: 10.1021/acs.nanolett.3c03510.
Abstract
Recently, neuromorphic computing has been proposed to overcome the drawbacks of the current von Neumann computing architecture. In particular, spiking neural networks (SNNs) have received significant attention for their ability to mimic the spike-driven behavior of biological neurons and synapses, which can lead to low power consumption among other advantages. In this work, we designed an indium-gallium-zinc oxide (IGZO)-channel charge-trap flash (CTF) synaptic device based on a HfO2/Al2O3/Si3N4/Al2O3 layer stack. Our IGZO-based CTF device exhibits synaptic functions with 128 synaptic-weight states and spike-timing-dependent plasticity. An SNN-restricted Boltzmann machine was used to simulate the fabricated CTF device and evaluate its efficiency in an SNN system, achieving a high pattern-recognition accuracy of 83.9%. We believe these results show the suitability of the fabricated IGZO CTF device as a synaptic device for neuromorphic computing.
Affiliation(s)
- Eunpyo Park: Center for Neuromorphic Engineering, Korea Institute of Science and Technology (KIST), Seoul 02792, Republic of Korea; Department of Materials Science & Engineering, Seoul National University, Seoul 08826, Republic of Korea
- Suyeon Jang: Department of Materials Science & Engineering, Seoul National University, Seoul 08826, Republic of Korea; Research Institute of Advanced Materials (RIAM), Seoul National University, Seoul 08826, Republic of Korea; Inter-University Semiconductor Research Center (ISRC), Seoul National University, Seoul 08826, Republic of Korea
- Gichang Noh: Center for Neuromorphic Engineering, Korea Institute of Science and Technology (KIST), Seoul 02792, Republic of Korea
- Yooyeon Jo: Center for Neuromorphic Engineering, Korea Institute of Science and Technology (KIST), Seoul 02792, Republic of Korea
- Dae Kyu Lee: Center for Neuromorphic Engineering, Korea Institute of Science and Technology (KIST), Seoul 02792, Republic of Korea
- In Soo Kim: Nanophotonics Research Center, Korea Institute of Science and Technology (KIST), Seoul 02792, Republic of Korea; KIST-SKKU Carbon-Neutral Research Center, Sungkyunkwan University (SKKU), Suwon 16419, Republic of Korea
- Hyun-Cheol Song: KIST-SKKU Carbon-Neutral Research Center, Sungkyunkwan University (SKKU), Suwon 16419, Republic of Korea; Electronic Materials Research Center, Korea Institute of Science and Technology (KIST), Seoul 02792, Republic of Korea
- Sangbum Kim: Department of Materials Science & Engineering, Seoul National University, Seoul 08826, Republic of Korea; Research Institute of Advanced Materials (RIAM), Seoul National University, Seoul 08826, Republic of Korea; Inter-University Semiconductor Research Center (ISRC), Seoul National University, Seoul 08826, Republic of Korea
- Joon Young Kwak: Center for Neuromorphic Engineering, Korea Institute of Science and Technology (KIST), Seoul 02792, Republic of Korea; Division of Nanoscience and Technology, Korea University of Science and Technology (UST), Daejeon 34113, Republic of Korea
4. Madar A, Dong C, Sheffield M. BTSP, not STDP, Drives Shifts in Hippocampal Representations During Familiarization. bioRxiv [Preprint] 2023:2023.10.17.562791. PMID: 37904999; PMCID: PMC10614909; DOI: 10.1101/2023.10.17.562791.
Abstract
Synaptic plasticity is widely thought to support memory storage in the brain, but the rules determining impactful synaptic changes in vivo are not known. We considered the trial-by-trial shifting dynamics of hippocampal place fields (PFs) as an indicator of ongoing plasticity during memory formation. By implementing different plasticity rules in computational models of spiking place cells and comparing them to experimentally measured PFs from mice navigating familiar and novel environments, we found that behavioral-timescale synaptic plasticity (BTSP), rather than Hebbian spike-timing-dependent plasticity (STDP), is the principal mechanism governing PF shifting dynamics. BTSP-triggering events are rare but more frequent during novel experiences. During exploration, their probability is dynamic: it decays after PF onset but continually drives population-level representational drift. Finally, our results show that BTSP occurs in CA3 but is less frequent and phenomenologically different than in CA1. Overall, our study provides a new framework for understanding how synaptic plasticity shapes neuronal representations during learning.
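The key contrast with STDP is the width of the plasticity window: BTSP pairs a seconds-long eligibility trace of presynaptic activity with a rare plateau potential. A schematic eligibility-trace version (not the authors' fitted model; `tau` and `lr` are illustrative) can be sketched as:

```python
import numpy as np

def btsp_weight_change(pre_spike_times, plateau_time, tau=1.5, lr=0.5):
    """Schematic BTSP: each presynaptic spike leaves an exponentially decaying
    eligibility trace (tau in seconds); a plateau potential converts whatever
    trace remains into potentiation, regardless of millisecond spike order."""
    traces = np.exp(-np.abs(plateau_time - np.asarray(pre_spike_times)) / tau)
    return lr * traces.sum()

# Synapses active a few hundred ms before the plateau are strongly potentiated,
# well outside classical STDP's tens-of-milliseconds window.
near = btsp_weight_change([9.8, 9.9], plateau_time=10.0)
far = btsp_weight_change([2.0], plateau_time=10.0)
```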
Affiliation(s)
- A.D. Madar: Department of Neurobiology, Neuroscience Institute, University of Chicago
- C. Dong: Department of Neurobiology, Neuroscience Institute, University of Chicago; current affiliation: Department of Neurobiology, Stanford University School of Medicine
- M.E.J. Sheffield: Department of Neurobiology, Neuroscience Institute, University of Chicago
5. Etter G, Carmichael JE, Williams S. Linking temporal coordination of hippocampal activity to memory function. Front Cell Neurosci 2023; 17:1233849. PMID: 37720546; PMCID: PMC10501408; DOI: 10.3389/fncel.2023.1233849.
Abstract
Oscillations in neural activity are widespread throughout the brain and can be observed at the population level through the local field potential. These rhythmic patterns are associated with cycles of excitability and are thought to coordinate networks of neurons, in turn facilitating effective communication both within local circuits and across brain regions. In the hippocampus, theta rhythms (4-12 Hz) could contribute to several key physiological mechanisms, including long-range synchrony and plasticity, and, at the behavioral scale, support memory encoding and retrieval. While neurons in the hippocampus appear to be temporally coordinated by theta oscillations, they also tend to fire in sequences that are developmentally preconfigured. Although loss of theta rhythmicity impairs memory, these sequences of spatiotemporal representations persist in conditions of altered hippocampal oscillations. The focus of this review is to disentangle the relative contribution of hippocampal oscillations from single-neuron activity in learning and memory. We first review cellular, anatomical, and physiological mechanisms underlying the generation and maintenance of hippocampal rhythms and how they contribute to memory function. We propose candidate hypotheses for how septohippocampal oscillations could support memory function while not contributing directly to hippocampal sequences. In particular, we explore how theta rhythms could coordinate the integration of upstream signals in the hippocampus to form future decisions, the relevance of such integration to downstream regions, as well as setting the stage for behavioral timescale synaptic plasticity. Finally, we leverage stimulation-based treatment in Alzheimer's disease conditions as an opportunity to assess the sufficiency of hippocampal oscillations for memory function.
Affiliation(s)
- Sylvain Williams: Department of Psychiatry, Douglas Mental Health Research Institute, McGill University, Montreal, QC, Canada
6. Saponati M, Vinck M. Sequence anticipation and spike-timing-dependent plasticity emerge from a predictive learning rule. Nat Commun 2023; 14:4985. PMID: 37604825; PMCID: PMC10442404; DOI: 10.1038/s41467-023-40651-w.
Abstract
Intelligent behavior depends on the brain's ability to anticipate future events. However, the learning rules that enable neurons to predict and fire ahead of sensory inputs remain largely unknown. We propose a plasticity rule based on predictive processing, in which the neuron learns a low-rank model of the synaptic input dynamics in its membrane potential. Neurons thereby amplify those synapses that maximally predict other synaptic inputs based on their temporal relations, providing a solution to an optimization problem that can be implemented at the single-neuron level using only local information. Consequently, neurons learn sequences over long timescales and shift their spikes towards the first inputs in a sequence. We show that this mechanism can explain the development of anticipatory signalling and recall in a recurrent network. Furthermore, we demonstrate that the learning rule gives rise to several experimentally observed STDP (spike-timing-dependent plasticity) mechanisms. These findings suggest prediction as a guiding principle to orchestrate learning and synaptic plasticity in single neurons.
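For reference, the pair-based STDP phenomenology that the predictive rule is shown to reproduce can be written in its textbook form (the parameter values below are illustrative defaults, not the paper's fitted values):

```python
import math

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=0.020, tau_minus=0.020):
    """Classical pair-based STDP kernel. delta_t = t_post - t_pre in seconds:
    pre-before-post (delta_t > 0) potentiates, post-before-pre depresses,
    with exponentially decaying magnitude away from coincidence."""
    if delta_t >= 0:
        return a_plus * math.exp(-delta_t / tau_plus)
    return -a_minus * math.exp(delta_t / tau_minus)
```

The paper's point is that this antisymmetric window need not be postulated; it can emerge from a single-neuron predictive objective.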
Affiliation(s)
- Matteo Saponati: Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, 60528, Frankfurt am Main, Germany; IMPRS for Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt am Main, Germany; Donders Centre for Neuroscience, Department of Neuroinformatics, Radboud University, 6525, Nijmegen, The Netherlands
- Martin Vinck: Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, 60528, Frankfurt am Main, Germany; Donders Centre for Neuroscience, Department of Neuroinformatics, Radboud University, 6525, Nijmegen, The Netherlands
7. Moldwin T, Kalmenson M, Segev I. Asymmetric Voltage Attenuation in Dendrites Can Enable Hierarchical Heterosynaptic Plasticity. eNeuro 2023; 10:ENEURO.0014-23.2023. PMID: 37414554; PMCID: PMC10354808; DOI: 10.1523/eneuro.0014-23.2023.
Abstract
Long-term synaptic plasticity is mediated via cytosolic calcium concentrations ([Ca2+]). Using a synaptic model that implements calcium-based long-term plasticity via two sources of Ca2+ - NMDA receptors and voltage-gated calcium channels (VGCCs) - we show in dendritic cable simulations that the interplay between these two calcium sources can result in a diverse array of heterosynaptic effects. When spatially clustered synaptic input produces a local NMDA spike, the resulting dendritic depolarization can activate VGCCs at nonactivated spines, resulting in heterosynaptic plasticity. NMDA spike activation at a given dendritic location will tend to depolarize dendritic regions that are located distally to the input site more than dendritic sites that are proximal to it. This asymmetry can produce a hierarchical effect in branching dendrites, where an NMDA spike at a proximal branch can induce heterosynaptic plasticity primarily at branches that are distal to it. We also explored how simultaneously activated synaptic clusters located at different dendritic locations synergistically affect the plasticity at the active synapses, as well as the heterosynaptic plasticity of an inactive synapse "sandwiched" between them. We conclude that the inherent electrical asymmetry of dendritic trees enables sophisticated schemes for spatially targeted supervision of heterosynaptic plasticity.
Affiliation(s)
- Menachem Kalmenson: Department of Neurobiology, The Hebrew University of Jerusalem, 91904 Jerusalem, Israel
- Idan Segev: Edmond and Lily Safra Center for Brain Sciences; Department of Neurobiology, The Hebrew University of Jerusalem, 91904 Jerusalem, Israel
8. Siddique A, Vai MI, Pun SH. A low cost neuromorphic learning engine based on a high performance supervised SNN learning algorithm. Sci Rep 2023; 13:6280. PMID: 37072443; PMCID: PMC10113267; DOI: 10.1038/s41598-023-32120-7.
Abstract
Spiking neural networks (SNNs) are more energy- and resource-efficient than artificial neural networks (ANNs). However, supervised SNN learning is a challenging task due to the non-differentiability of spikes and the computation of complex terms. Moreover, the design of SNN learning engines is not easy given limited hardware resources and tight energy constraints. In this article, a novel hardware-efficient SNN back-propagation scheme that offers fast convergence is proposed. The learning scheme requires no complex operations such as error normalization and weight-threshold balancing, and achieves an accuracy of around 97.5% on the MNIST dataset using only 158,800 synapses. The multiplier-less inference engine trained using the proposed hard sigmoid SNN training (HaSiST) scheme can operate at a frequency of 135 MHz, consumes only 1.03 slice registers per synapse and 2.8 slice look-up tables, and can infer about 0.03[Formula: see text] features per second, equivalent to 9.44 giga synaptic operations per second (GSOPS). The article also presents a high-speed, cost-efficient SNN training engine that consumes only 2.63 slice registers per synapse and 37.84 slice look-up tables per synapse, and can operate at a maximum computational frequency of around 50 MHz on a Virtex 6 FPGA.
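The hard-sigmoid idea can be illustrated generically: the spike nonlinearity has no useful derivative, so training substitutes the piecewise-linear slope of a hard sigmoid around threshold. This is the standard surrogate-gradient trick, not the paper's exact HaSiST pipeline; the slope and window below are illustrative.

```python
import numpy as np

def hard_sigmoid(x):
    """Piecewise-linear sigmoid: only a multiply, add, and clamp,
    which is cheap in hardware (no exponential)."""
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)

def surrogate_grad(v, v_th=1.0):
    """Backward pass: the spike's derivative is replaced by the hard
    sigmoid's slope (0.2) inside a window around threshold, making the
    non-differentiable spike trainable by back-propagation."""
    return np.where(np.abs(v - v_th) < 2.5, 0.2, 0.0)
```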
Affiliation(s)
- Ali Siddique: Department of Electrical and Computer Engineering, Faculty of Science and Technology, University of Macau, Taipa, 999078, Macau
- Mang I Vai: Department of Electrical and Computer Engineering, Faculty of Science and Technology, University of Macau, Taipa, 999078, Macau
- Sio Hang Pun: Department of Electrical and Computer Engineering, Faculty of Science and Technology, University of Macau, Taipa, 999078, Macau
9. Tanim MMH, Templin Z, Zhao F. Natural Organic Materials Based Memristors and Transistors for Artificial Synaptic Devices in Sustainable Neuromorphic Computing Systems. Micromachines 2023; 14:235. PMID: 36837935; PMCID: PMC9963886; DOI: 10.3390/mi14020235.
Abstract
Natural organic materials such as proteins and carbohydrates are abundant in nature, renewable, and biodegradable, making them desirable for the construction of artificial synaptic devices for emerging neuromorphic computing systems with energy-efficient operation and environmentally friendly disposal. These artificial synaptic devices are based on memristors or transistors whose memristive layer or gate dielectric is formed from natural organic materials. The fundamental requirement for these synaptic devices is the ability to mimic the memory and learning behaviors of biological synapses. This paper reviews the synaptic functions emulated by a variety of artificial synaptic devices based on natural organic materials and provides useful guidance for testing and investigating more such devices.
10. Arsalan M, Santra A, Issakov V. Power-efficient gesture sensing for edge devices: mimicking Fourier transforms with spiking neural networks. Appl Intell 2022. DOI: 10.1007/s10489-022-04258-w.
Abstract
One of the key design requirements for any portable or mobile device is low power. To enable such a low-powered device, we propose an embedded gesture-detection system that applies spiking neural networks (SNNs) directly to the raw ADC data of a 60 GHz frequency-modulated continuous-wave radar. SNNs can facilitate low-power systems because they are sparse in time and space and are event-driven. The proposed system, in contrast to earlier state-of-the-art methods, relies solely on the target's raw ADC data, thus avoiding the overhead of slow-time and fast-time Fourier transform (FFT) processing. The proposed architecture mimics the discrete Fourier transform within the SNN itself, avoiding the need for FFT accelerators and tailoring the FFT processing to the specific application, in this case gesture sensing. The experimental results demonstrate that the proposed system classifies 8 different gestures with an accuracy of 98.7%, comparable to conventional approaches while offering lower complexity, lower power consumption, and faster computation.
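What makes the FFT stage foldable into a network at all is that the DFT is a fixed linear map, and fixed linear maps can be absorbed into input weights. A minimal sketch of that observation (illustrating the linear-algebra fact, not the authors' SNN architecture):

```python
import numpy as np

def dft_matrix(n):
    """The DFT is X = W @ x with W[k, m] = exp(-2*pi*i*k*m/n); being a fixed
    matrix, it can in principle be baked into a network's input weights."""
    k = np.arange(n).reshape(-1, 1)
    m = np.arange(n).reshape(1, -1)
    return np.exp(-2j * np.pi * k * m / n)

x = np.random.default_rng(0).normal(size=64)  # stand-in for raw ADC samples
X = dft_matrix(64) @ x                        # matches the FFT of x
```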
11. Dorman DB, Blackwell KT. Synaptic Plasticity Is Predicted by Spatiotemporal Firing Rate Patterns and Robust to In Vivo-like Variability. Biomolecules 2022; 12:1402. PMID: 36291612; PMCID: PMC9599115; DOI: 10.3390/biom12101402.
Abstract
Synaptic plasticity, the experience-induced change in connections between neurons, underlies learning and memory in the brain. Most of our understanding of synaptic plasticity derives from in vitro experiments with precisely repeated stimulus patterns; however, neurons exhibit significant variability in vivo during repeated experiences. Further, the spatial pattern of synaptic inputs to the dendritic tree influences synaptic plasticity, yet it is not considered in most synaptic plasticity rules. Here, we investigate how spatiotemporal synaptic input patterns produce plasticity under in vivo-like conditions using a data-driven computational model with a plasticity rule based on calcium dynamics. Using in vivo spike train recordings as inputs to different-sized clusters of spines, we show that plasticity is strongly robust to trial-to-trial variability of spike timing. In addition, we derive general synaptic plasticity rules describing how spatiotemporal patterns of synaptic inputs control the magnitude and direction of plasticity. Strongly potentiating synapses have greater firing rates and calcium concentrations later in the trial, whereas strongly depressing synapses have higher firing rates early in the trial. Neighboring synaptic activity influences the direction and magnitude of synaptic plasticity, with small clusters of spines producing the greatest increase in synaptic strength. Together, our results reveal that calcium dynamics can unify diverse plasticity rules and reveal how spatiotemporal firing rate patterns control synaptic plasticity.
Affiliation(s)
- Daniel B. Dorman: Interdisciplinary Program in Neuroscience, George Mason University, Fairfax, VA 22030, USA
- Kim T. Blackwell: Interdisciplinary Program in Neuroscience, George Mason University, Fairfax, VA 22030, USA; Department of Bioengineering, Volgenau School of Engineering, George Mason University, Fairfax, VA 22030, USA
12. Recognition Memory Induces Natural LTP-like Hippocampal Synaptic Excitation and Inhibition. Int J Mol Sci 2022; 23:10806. PMID: 36142727; PMCID: PMC9501019; DOI: 10.3390/ijms231810806.
Abstract
Synaptic plasticity is a cellular process involved in learning and memory by which specific patterns of neural activity adapt the synaptic strength and efficacy of synaptic transmission. Its induction is governed by fine-tuning of the balance between excitatory and inhibitory synaptic transmission. Under experimental conditions, synaptic plasticity can be artificially evoked at hippocampal CA1 pyramidal neurons by repeated stimulation of Schaffer collaterals. However, studies of long-lasting synaptic modifications during memory formation under physiological conditions in freely moving animals are very scarce. Here, to study synaptic plasticity phenomena during recognition memory in the dorsal hippocampus, field postsynaptic potentials (fPSPs) evoked at the CA3-CA1 synapse were recorded in freely moving mice during performance of an object-recognition task. Paired-pulse stimuli were applied to Schaffer collaterals at the moment the animal explored a new or a familiar object during different phases of the test. Stimulation evoked a complex synaptic response composed of an ionotropic excitatory glutamatergic fEPSP, followed by two inhibitory responses: an ionotropic, GABAA-mediated fIPSP and a metabotropic, G-protein-gated inwardly rectifying potassium (GirK) channel-mediated fIPSP. Our data showed the induction of LTP-like enhancements of both the glutamatergic and GirK-dependent components of the dorsal hippocampal CA3-CA1 synapse during the exploration of novel but not familiar objects. These results support the contention that the synaptic plasticity processes underlying hippocampal-dependent memory are sustained by fine-tuning mechanisms that control the balance between excitatory and inhibitory neurotransmission.
13. Chindemi G, Abdellah M, Amsalem O, Benavides-Piccione R, Delattre V, Doron M, Ecker A, Jaquier AT, King J, Kumbhar P, Monney C, Perin R, Rössert C, Tuncel AM, Van Geit W, DeFelipe J, Graupner M, Segev I, Markram H, Muller EB. A calcium-based plasticity model for predicting long-term potentiation and depression in the neocortex. Nat Commun 2022; 13:3038. PMID: 35650191; PMCID: PMC9160074; DOI: 10.1038/s41467-022-30214-w.
Abstract
Pyramidal cells (PCs) form the backbone of the layered structure of the neocortex, and plasticity of their synapses is thought to underlie learning in the brain. However, such long-term synaptic changes have been experimentally characterized between only a few types of PCs, posing a significant barrier for studying neocortical learning mechanisms. Here we introduce a model of synaptic plasticity based on data-constrained postsynaptic calcium dynamics, and show in a neocortical microcircuit model that a single parameter set is sufficient to unify the available experimental findings on long-term potentiation (LTP) and long-term depression (LTD) of PC connections. In particular, we find that the diverse plasticity outcomes across the different PC types can be explained by cell-type-specific synaptic physiology, cell morphology and innervation patterns, without requiring type-specific plasticity. Generalizing the model to in vivo extracellular calcium concentrations, we predict qualitatively different plasticity dynamics from those observed in vitro. This work provides a first comprehensive null model for LTP/LTD between neocortical PC types in vivo, and an open framework for further developing models of cortical synaptic plasticity.
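The core of calcium-based plasticity models of this family can be sketched as a two-threshold rule in the spirit of Graupner and Brunel (2012), on which this line of work builds: calcium above a depression threshold pulls the weight down, and above a higher potentiation threshold pushes it up. The thresholds and rates below are illustrative, not the paper's data-constrained values.

```python
def calcium_plasticity_step(w, ca, theta_d=1.0, theta_p=1.3,
                            gamma_d=0.1, gamma_p=0.3, dt=0.001):
    """One Euler step of a two-threshold calcium rule: calcium above theta_p
    drives the weight toward 1 (LTP); calcium above theta_d drives it toward
    0 (LTD); when both thresholds are crossed, the stronger potentiation
    drive wins. Below theta_d the weight is unchanged."""
    dw = 0.0
    if ca > theta_p:
        dw += gamma_p * (1.0 - w)   # potentiation drive
    if ca > theta_d:
        dw -= gamma_d * w           # depression drive
    return w + dt * dw
```

Moderate calcium (between the two thresholds) depresses, while high calcium potentiates; the paper's contribution is constraining such dynamics with data so that one parameter set covers the measured PC-to-PC connection types.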
Affiliation(s)
- Giuseppe Chindemi: Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Marwan Abdellah: Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Oren Amsalem: Department of Neurobiology, the Hebrew University of Jerusalem, Jerusalem, Israel; Division of Endocrinology, Diabetes and Metabolism, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA 02215, USA
- Ruth Benavides-Piccione: Instituto Cajal, Consejo Superior de Investigaciones Científicas, Madrid, Spain; Laboratorio Cajal de Circuitos Corticales, Centro de Tecnología Biomédica, Universidad Politécnica de Madrid, Madrid, Spain
- Vincent Delattre: Laboratory of Neural Microcircuitry, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Michael Doron: Edmond and Lily Safra Center for Brain Sciences, the Hebrew University of Jerusalem, Jerusalem, Israel
- András Ecker: Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Aurélien T Jaquier: Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- James King: Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Pramod Kumbhar: Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Caitlin Monney: Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Rodrigo Perin: Laboratory of Neural Microcircuitry, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Christian Rössert: Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Anil M Tuncel: Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Werner Van Geit: Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Javier DeFelipe: Instituto Cajal, Consejo Superior de Investigaciones Científicas, Madrid, Spain; Laboratorio Cajal de Circuitos Corticales, Centro de Tecnología Biomédica, Universidad Politécnica de Madrid, Madrid, Spain
- Michael Graupner: Université de Paris, SPPIN - Saints-Pères Paris Institute for the Neurosciences, CNRS, Paris, France
- Idan Segev: Department of Neurobiology, the Hebrew University of Jerusalem, Jerusalem, Israel; Edmond and Lily Safra Center for Brain Sciences, the Hebrew University of Jerusalem, Jerusalem, Israel
- Henry Markram: Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland; Laboratory of Neural Microcircuitry, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Eilif B Muller: Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland; Department of Neurosciences, Faculty of Medicine, Université de Montréal, Montréal, QC, Canada; CHU Sainte-Justine Research Center, Montréal, QC, Canada; Quebec Artificial Intelligence Institute (Mila), Montréal, Canada
14
|
Kurikawa T, Kaneko K. Multiple-Timescale Neural Networks: Generation of History-Dependent Sequences and Inference Through Autonomous Bifurcations. Front Comput Neurosci 2021; 15:743537. [PMID: 34955798 PMCID: PMC8702558 DOI: 10.3389/fncom.2021.743537] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/18/2021] [Accepted: 11/09/2021] [Indexed: 11/17/2022] Open
Abstract
Sequential transitions between metastable states are ubiquitously observed in the nervous system and underlie various cognitive functions such as perception and decision making. Although a number of studies with asymmetric Hebbian connectivity have investigated how such sequences are generated, the sequences considered have been simple Markov ones. Recurrent neural networks trained with supervised machine learning methods, on the other hand, can generate complex non-Markov sequences, but these sequences are vulnerable to perturbations, and such learning methods are biologically implausible. How stable, complex sequences are generated in the nervous system remains unclear. We have developed a neural network with fast and slow dynamics, inspired by the hierarchy of timescales of neural activity in the cortex. The slow dynamics store the history of inputs and outputs and modulate the fast dynamics according to that stored history. We show that a learning rule requiring only local information can form a network that generates complex and robust sequences in the fast dynamics. The slow dynamics act as bifurcation parameters for the fast dynamics, stabilizing the next pattern of the sequence before the current pattern is destabilized, in a manner that depends on the preceding patterns. This coexistence period enables a stable transition between the current and next patterns in a non-Markov sequence. We further find that the balance of timescales is critical to the coexistence period. Our study provides a novel mechanism for generating robust, complex sequences with multiple timescales. Given that multiple timescales are widely observed in neural activity, this mechanism advances our understanding of temporal processing in the nervous system.
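The fast-slow interplay described above can be sketched with a toy rate network: slow variables low-pass filter the fast activity and feed it back as a bias, so the timescale ratio lets the slow dynamics act as a slowly moving bifurcation parameter for the fast dynamics. The random weights and all constants below are arbitrary illustrations, not the paper's learned connectivity.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50
J = rng.normal(0, 1 / np.sqrt(N), (N, N))   # fast recurrent weights (random stand-in)
K = 0.5 * np.eye(N)                          # slow-to-fast feedback (illustrative)

def simulate(steps=2000, dt=0.1, tau_fast=1.0, tau_slow=50.0):
    """Fast rates x driven by recurrence plus a slow bias y.

    y low-pass filters x with a much longer time constant, so it stores
    a history of the fast activity and biases which fast attractor is
    currently stable.
    """
    x = rng.normal(0, 0.1, N)
    y = np.zeros(N)
    traj = []
    for _ in range(steps):
        x += dt / tau_fast * (-x + np.tanh(J @ x + K @ y))
        y += dt / tau_slow * (-y + x)        # slow variable tracks fast activity
        traj.append(x.copy())
    return np.array(traj), y

traj, y = simulate()
```

The tanh saturation keeps the fast rates bounded while the slow bias drifts, which is the ingredient the paper exploits to stabilize the next pattern before the current one is destabilized.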
Collapse
Affiliation(s)
- Tomoki Kurikawa
- Department of Physics, Kansai Medical University, Hirakata, Japan
| | - Kunihiko Kaneko
- Department of Basic Science, Graduate School of Arts and Sciences, University of Tokyo, Tokyo, Japan.,Center for Complex Systems Biology, Universal Biology Institute, University of Tokyo, Tokyo, Japan
| |
Collapse
|
15
|
Milstein AD, Li Y, Bittner KC, Grienberger C, Soltesz I, Magee JC, Romani S. Bidirectional synaptic plasticity rapidly modifies hippocampal representations. eLife 2021; 10:e73046. [PMID: 34882093 PMCID: PMC8776257 DOI: 10.7554/elife.73046] [Citation(s) in RCA: 40] [Impact Index Per Article: 13.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/13/2021] [Accepted: 12/08/2021] [Indexed: 11/13/2022] Open
Abstract
Learning requires neural adaptations thought to be mediated by activity-dependent synaptic plasticity. A relatively non-standard form of synaptic plasticity driven by dendritic calcium spikes, or plateau potentials, has been reported to underlie place field formation in rodent hippocampal CA1 neurons. Here, we found that this behavioral timescale synaptic plasticity (BTSP) can also reshape existing place fields via bidirectional synaptic weight changes that depend on the temporal proximity of plateau potentials to pre-existing place fields. When evoked near an existing place field, plateau potentials induced less synaptic potentiation and more depression, suggesting BTSP might depend inversely on postsynaptic activation. However, manipulations of place cell membrane potential and computational modeling indicated that this anti-correlation actually results from a dependence on current synaptic weight such that weak inputs potentiate and strong inputs depress. A network model implementing this bidirectional synaptic learning rule suggested that BTSP enables population activity, rather than pairwise neuronal correlations, to drive neural adaptations to experience.
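A minimal sketch of a weight-dependent bidirectional rule of the kind the study proposes: at each plateau event, potentiation scales with the remaining headroom (w_max - w) and depression with the current weight w, so weak inputs potentiate and strong inputs depress. The constants, and hence the fixed point, are illustrative, not fitted to the data.

```python
import numpy as np

def btsp_update(w, eligibility, k_pot=0.5, k_dep=0.3):
    """One plateau-triggered update of synaptic weights.

    eligibility in [0, 1] measures the temporal proximity of each
    input's activity to the plateau potential. With potentiation
    proportional to (w_max - w) and depression proportional to w, all
    synapses are driven toward the fixed point
    w* = k_pot / (k_pot + k_dep) (illustrative constants).
    """
    w_max = 1.0
    dw = eligibility * (k_pot * (w_max - w) - k_dep * w)
    return np.clip(w + dw, 0.0, w_max)

w = np.array([0.1, 0.5, 0.9])       # weak, middling, strong synapses
e = np.array([1.0, 1.0, 1.0])       # all equally close to the plateau
w_new = btsp_update(w, e)
# The weak synapse rises and the strong synapse falls, both toward
# w* = 0.5 / (0.5 + 0.3) = 0.625.
```

Repeated plateaus contract all weights toward w*, which is how such a rule can overwrite an existing place field rather than only strengthening it.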
Collapse
Affiliation(s)
- Aaron D Milstein
- Department of Neurosurgery and Stanford Neurosciences Institute, Stanford University School of MedicineStanfordUnited States
- Department of Neuroscience and Cell Biology, Robert Wood Johnson Medical School and Center for Advanced Biotechnology and Medicine, Rutgers UniversityPiscatawayUnited States
| | - Yiding Li
- Howard Hughes Medical Institute, Baylor College of MedicineHoustonUnited States
| | - Katie C Bittner
- Howard Hughes Medical Institute, Janelia Research CampusAshburnUnited States
| | | | - Ivan Soltesz
- Department of Neurosurgery and Stanford Neurosciences Institute, Stanford University School of MedicineStanfordUnited States
| | - Jeffrey C Magee
- Howard Hughes Medical Institute, Baylor College of MedicineHoustonUnited States
| | - Sandro Romani
- Howard Hughes Medical Institute, Janelia Research CampusAshburnUnited States
| |
Collapse
|
16
|
Li E, He W, Yu R, He L, Wu X, Chen Q, Liu Y, Chen H, Guo T. High-Density Reconfigurable Synaptic Transistors Targeting a Minimalist Neural Network. ACS APPLIED MATERIALS & INTERFACES 2021; 13:28564-28573. [PMID: 34100580 DOI: 10.1021/acsami.1c05484] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
Enormous synaptic devices are required to build a parallel, precise, and efficient neural computing system. To further improve the energy efficiency of neuromorphic computing, a single high-density synaptic (HDS) device with multiple nonvolatile synaptic states is suggested to reduce the number of synaptic devices in the neural network, although such a powerful synaptic device is rarely demonstrated. Here, a photoisomerism material, namely, diarylethene, whose energy level varies with the wavelength of illumination is first introduced to construct a powerful HDS device. The multiple synaptic states of the HDS device are intrinsically converted under UV-vis regulation and remain nonvolatile after the removal of illumination. More importantly, the conversion is reconfigurable and reversible under different light conditions, and the synaptic characteristics are comprehensively mimicked in each state. Finally, compared with a two-layer multilayer perceptron (MLP) architecture based on static synaptic devices, the HDS device-based architecture reduces the device number by 16 times to achieve a minimalist neural computing structure. The invention of the HDS device opens up a revolutionary paradigm for the establishment of a brain-like network.
Collapse
Affiliation(s)
- Enlong Li
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology, Fuzhou University, Fuzhou 350002, China
- Fujian Science & Technology Innovation Laboratory for Optoelectronic Information of China, Fuzhou 350100, China
| | - Weixin He
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology, Fuzhou University, Fuzhou 350002, China
| | - Rengjian Yu
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology, Fuzhou University, Fuzhou 350002, China
| | - Lihua He
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology, Fuzhou University, Fuzhou 350002, China
| | - Xiaomin Wu
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology, Fuzhou University, Fuzhou 350002, China
| | - Qizhen Chen
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology, Fuzhou University, Fuzhou 350002, China
| | - Yuan Liu
- State Key Laboratory for Chemo/Biosensing and Chemometrics, School of Physics and Electronics, Hunan University, Changsha 410082, China
| | - Huipeng Chen
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology, Fuzhou University, Fuzhou 350002, China
- Fujian Science & Technology Innovation Laboratory for Optoelectronic Information of China, Fuzhou 350100, China
| | - Tailiang Guo
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology, Fuzhou University, Fuzhou 350002, China
- Fujian Science & Technology Innovation Laboratory for Optoelectronic Information of China, Fuzhou 350100, China
| |
Collapse
|
17
|
Reyes-García SE, Escobar ML. Calcineurin Participation in Hebbian and Homeostatic Plasticity Associated With Extinction. Front Cell Neurosci 2021; 15:685838. [PMID: 34220454 PMCID: PMC8242195 DOI: 10.3389/fncel.2021.685838] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/25/2021] [Accepted: 05/25/2021] [Indexed: 12/21/2022] Open
Abstract
In nature, animals need to adapt to constant changes in their environment. Learning and memory are cognitive capabilities that allow this to happen. Extinction, the reduction of a certain behavior or learning previously established, refers to a very particular and interesting type of learning that has been the basis of a series of therapies to diminish non-adaptive behaviors. In recent years, the exploration of the cellular and molecular mechanisms underlying this type of learning has received increasing attention. Hebbian plasticity (the activity-dependent modification of the strength or efficacy of synaptic transmission), and homeostatic plasticity (the homeostatic regulation of plasticity) constitute processes intimately associated with memory formation and maintenance. Particularly, long-term depression (LTD) has been proposed as the underlying mechanism of extinction, while the protein phosphatase calcineurin (CaN) has been widely related to both the extinction process and LTD. In this review, we focus on the available evidence that sustains CaN modulation of LTD and its association with extinction. Beyond the classic view, we also examine the interconnection among extinction, Hebbian and homeostatic plasticity, as well as emergent evidence of the participation of kinases and long-term potentiation (LTP) on extinction learning, highlighting the importance of the balance between kinases and phosphatases in the expression of extinction. Finally, we also integrate data that shows the association between extinction and less-studied phenomena, such as synaptic silencing and engram formation that open new perspectives in the field.
Collapse
Affiliation(s)
- Salma E Reyes-García
- Laboratorio de Neurobiología del Aprendizaje y la Memoria, División de Investigación y Estudios de Posgrado, Facultad de Psicología, Universidad Nacional Autónoma de México, Ciudad de México, Mexico
| | - Martha L Escobar
- Laboratorio de Neurobiología del Aprendizaje y la Memoria, División de Investigación y Estudios de Posgrado, Facultad de Psicología, Universidad Nacional Autónoma de México, Ciudad de México, Mexico
| |
Collapse
|
18
|
Cone I, Shouval HZ. Behavioral Time Scale Plasticity of Place Fields: Mathematical Analysis. Front Comput Neurosci 2021; 15:640235. [PMID: 33732128 PMCID: PMC7959845 DOI: 10.3389/fncom.2021.640235] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/10/2020] [Accepted: 02/08/2021] [Indexed: 11/17/2022] Open
Abstract
Traditional synaptic plasticity experiments and models depend on tight temporal correlations between pre- and postsynaptic activity. These tight temporal correlations, on the order of tens of milliseconds, are incompatible with significantly longer behavioral time scales and as such might not be able to account for plasticity induced by behavior. Indeed, recent findings in hippocampus suggest that rapid, bidirectional synaptic plasticity, which modifies place fields in CA1, operates at behavioral time scales. These experimental results suggest that presynaptic activity generates synaptic eligibility traces, both for potentiation and for depression, which last on the order of seconds. These traces can be converted to changes in synaptic efficacies by the activation of an instructive signal that depends on naturally occurring or experimentally induced plateau potentials. We have developed a simple mathematical model that is consistent with these observations. The model can be fully analyzed to find the fixed points of induced place fields and how these fixed points depend on system parameters such as the size and shape of presynaptic place fields, the animal's velocity during induction, and the parameters of the plasticity rule. We also make predictions about the convergence time to these fixed points, both for induced and for pre-existing place fields.
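The eligibility-trace mechanism in the abstract can be sketched directly: each presynaptic spike leaves seconds-long potentiation and depression traces, and an instructive plateau converts whatever trace remains at its arrival into a weight change. Time constants and amplitudes below are illustrative stand-ins, not the paper's parameters.

```python
import numpy as np

def plateau_weight_change(pre_times, plateau_time, w,
                          tau_p=1.5, tau_d=0.5, a_p=0.2, a_d=0.1):
    """Weight change produced by an instructive plateau potential.

    Each presynaptic spike leaves potentiation and depression
    eligibility traces decaying with tau_p and tau_d (seconds). The
    plateau reads out both traces at plateau_time, with depression
    scaled by the current weight w. All constants are illustrative.
    """
    lags = plateau_time - np.asarray(pre_times, dtype=float)
    lags = lags[lags >= 0]                  # only spikes before the plateau count
    e_pot = np.sum(np.exp(-lags / tau_p))   # potentiation eligibility
    e_dep = np.sum(np.exp(-lags / tau_d))   # depression eligibility
    return a_p * e_pot * (1.0 - w) - a_d * e_dep * w

# A spike 0.5 s before the plateau still carries eligibility; the same
# spike 5 s before has essentially none.
dw_recent = plateau_weight_change([9.5], plateau_time=10.0, w=0.2)
dw_stale = plateau_weight_change([5.0], plateau_time=10.0, w=0.2)
```

Because the traces span seconds rather than milliseconds, inputs active well before the plateau, such as place fields traversed earlier in a run, can still be modified, which is the point of the behavioral-timescale analysis.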
Collapse
Affiliation(s)
- Ian Cone
- Department of Neurobiology and Anatomy, University of Texas Medical School, Houston, TX, United States
- Applied Physics Program, Rice University, Houston, TX, United States
| | - Harel Z. Shouval
- Department of Neurobiology and Anatomy, University of Texas Medical School, Houston, TX, United States
| |
Collapse
|
19
|
Masoli S, Ottaviani A, Casali S, D’Angelo E. Cerebellar Golgi cell models predict dendritic processing and mechanisms of synaptic plasticity. PLoS Comput Biol 2020; 16:e1007937. [PMID: 33378395 PMCID: PMC7837495 DOI: 10.1371/journal.pcbi.1007937] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/05/2020] [Revised: 01/26/2021] [Accepted: 11/13/2020] [Indexed: 02/06/2023] Open
Abstract
Golgi cells are the main inhibitory interneurons of the cerebellar granular layer. Although recent work has highlighted the complexity of their dendritic organization and synaptic inputs, the mechanisms through which these neurons integrate complex input patterns have remained unknown. Here we used eight detailed morphological reconstructions to develop multicompartmental models of Golgi cells, in which Na, Ca, and K channels were distributed along the dendrites, soma, axonal initial segment, and axon. The models faithfully reproduced a rich set of electrophysiological and pharmacological properties and predicted the operating mechanisms of these neurons. Basal dendrites turned out to be more tightly electrically coupled to the axon initial segment than apical dendrites. During synaptic transmission, parallel fibers caused slow Ca-dependent depolarizations in apical dendrites that boosted the axon initial segment encoder and Na-spike backpropagation into basal dendrites, while inhibitory synapses effectively shunted backpropagating currents. This oriented dendritic processing sets up a coincidence detector controlling voltage-dependent NMDA receptor unblock in basal dendrites, which, by regulating local calcium influx, may provide the basis for the spike-timing-dependent plasticity anticipated by theory.
Collapse
Affiliation(s)
- Stefano Masoli
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
| | | | - Stefano Casali
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
| | - Egidio D’Angelo
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
- Brain Connectivity Center, IRCCS Mondino Foundation, Pavia, Italy
| |
Collapse
|
20
|
Ivans RC, Dahl SG, Cantley KD. A Model for R(t) Elements and R(t) -Based Spike-Timing-Dependent Plasticity With Basic Circuit Examples. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2020; 31:4206-4216. [PMID: 31869804 DOI: 10.1109/tnnls.2019.2952768] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
Spike-timing-dependent plasticity (STDP) is a fundamental synaptic learning rule observed in biology that leads to numerous behavioral and cognitive outcomes. Emulating STDP in electronic spiking neural networks with high-density memristive synapses is therefore of significant interest. While one popular method involves pulse-shaping the spiking neuron output voltages, an alternative approach is outlined in this article. The proposed STDP implementation uses time-varying dynamic resistance [R(t)] elements to achieve local synaptic learning from spike-pair STDP, spike-triplet STDP, and firing rates. The R(t) elements are connected to each neuron circuit, thereby maintaining synaptic density and leveraging voltage division as a means of altering synaptic weight (memristor voltage). Example R(t) elements and their corresponding behaviors are demonstrated through simulation. A three-input, two-output network using single-memristor synaptic connections and R(t) elements is also simulated. Network-level effects, such as nonspecific synaptic plasticity, are discussed. Finally, spatiotemporal pattern recognition (STPR) using R(t) elements is demonstrated in simulation.
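The voltage-division idea can be sketched in a few lines: a memristive synapse sits in series with an R(t) element whose resistance relaxes back to baseline after each postsynaptic spike, so the fraction of a presynaptic spike's voltage dropped across the memristor, and hence the drive for weight change, depends on the spike timing. The relaxation shape and all resistance values below are illustrative assumptions, not the circuits from the article.

```python
import math

def memristor_voltage(v_spike, r_mem, t_since_post,
                      r0=1e3, r_peak=1e4, tau=0.02):
    """Voltage dropped across a memristive synapse in series with a
    time-varying R(t) element.

    R(t) relaxes exponentially from r_peak back to r0 after each
    postsynaptic spike, so a presynaptic spike arriving shortly after a
    postsynaptic one sees a larger series resistance and a smaller
    memristor voltage. All values are illustrative.
    """
    r_t = r0 + (r_peak - r0) * math.exp(-t_since_post / tau)
    return v_spike * r_mem / (r_mem + r_t)   # simple voltage divider

# The same 1 V spike drives the memristor less when R(t) is elevated.
v_early = memristor_voltage(1.0, r_mem=5e3, t_since_post=0.001)
v_late = memristor_voltage(1.0, r_mem=5e3, t_since_post=0.5)
```

Mapping this timing-dependent voltage through a memristor's switching threshold is what yields an STDP-like weight update without pulse-shaping the neuron outputs.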
Collapse
|
21
|
Ebner C, Clopath C, Jedlicka P, Cuntz H. Unifying Long-Term Plasticity Rules for Excitatory Synapses by Modeling Dendrites of Cortical Pyramidal Neurons. Cell Rep 2020; 29:4295-4307.e6. [PMID: 31875541 PMCID: PMC6941234 DOI: 10.1016/j.celrep.2019.11.068] [Citation(s) in RCA: 23] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/26/2018] [Revised: 05/02/2019] [Accepted: 11/15/2019] [Indexed: 11/30/2022] Open
Abstract
A large number of experiments have indicated that precise spike times, firing rates, and synapse locations crucially determine the dynamics of long-term plasticity induction in excitatory synapses. However, it remains unknown how plasticity mechanisms of synapses distributed along dendritic trees cooperate to produce the wide spectrum of outcomes for various plasticity protocols. Here, we propose a four-pathway plasticity framework that is well grounded in experimental evidence and apply it to a biophysically realistic cortical pyramidal neuron model. We show in computer simulations that several seemingly contradictory experimental landmark studies are consistent with one unifying set of mechanisms when considering the effects of signal propagation in dendritic trees with respect to synapse location. Our model identifies specific spatiotemporal contributions of dendritic and axo-somatic spikes as well as of subthreshold activation of synaptic clusters, providing a unified parsimonious explanation not only for rate and timing dependence but also for location dependence of synaptic changes. Highlights: A phenomenological synaptic plasticity rule is applied to a pyramidal neuron model. The model reproduces rate-, timing-, and location-dependent plasticity results. Active dendrites allow plasticity via dendritic spikes and subthreshold events. Cooperative plasticity exists across the dendritic tree and within single branches.
Collapse
Affiliation(s)
- Christian Ebner
- Frankfurt Institute for Advanced Studies, 60438 Frankfurt am Main, Germany; Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, 60528 Frankfurt am Main, Germany; NeuroCure Cluster of Excellence, Charité-Universitätsmedizin Berlin, 10117 Berlin, Germany; Institute for Biology, Humboldt-Universität zu Berlin, 10117 Berlin, Germany.
| | - Claudia Clopath
- Computational Neuroscience Laboratory, Bioengineering Department, Imperial College London, London SW7 2AZ, UK
| | - Peter Jedlicka
- Frankfurt Institute for Advanced Studies, 60438 Frankfurt am Main, Germany; Institute of Clinical Neuroanatomy, Neuroscience Center, Goethe University Frankfurt, 60528 Frankfurt am Main, Germany; ICAR3R-Interdisciplinary Centre for 3Rs in Animal Research, Faculty of Medicine, Justus-Liebig-University, 35392 Giessen, Germany
| | - Hermann Cuntz
- Frankfurt Institute for Advanced Studies, 60438 Frankfurt am Main, Germany; Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, 60528 Frankfurt am Main, Germany
| |
Collapse
|
22
|
Eidum DM, Henriquez CS. Modeling the effects of sinusoidal stimulation and synaptic plasticity on linked neural oscillators. CHAOS (WOODBURY, N.Y.) 2020; 30:033105. [PMID: 32237786 DOI: 10.1063/1.5126104] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/29/2019] [Accepted: 02/13/2020] [Indexed: 06/11/2023]
Abstract
The brain exhibits intrinsic oscillatory behavior, which plays a vital role in communication and information processing. Abnormalities in brain rhythms have been linked to numerous disorders, including depression and schizophrenia. Rhythmic electrical stimulation (e.g., transcranial magnetic stimulation and transcranial alternating current stimulation) has been used to modulate these oscillations and produce lasting changes in neural activity. In this computational study, we investigate the combined effects of sinusoidal stimulation and synaptic plasticity on model networks composed of simple, tunable four-neuron oscillators. While not intended to model a specific brain circuit, this idealization was created to provide some intuition on how electrical modulation can induce plastic changes in the oscillatory state. Linked pairs of oscillators were stimulated with sinusoidal current, and their behavior was measured as a function of their intrinsic frequencies, inter-oscillator synaptic strengths, and stimulus strength and frequency. Under certain stimulus conditions, sinusoidal current can disrupt the network's natural firing patterns. Synaptic plasticity can induce weight imbalances that permanently change the characteristic firing behavior of the network. Grids of 100 oscillators with random frequencies were also subjected to a wide array of stimulus conditions. The characteristics of the post-stimulus network activity depend heavily on the stimulus frequency and amplitude as well as the initial strength of inter-oscillator connections. Synchronization arises at the network level from complex patterns of activity propagation, which are enhanced or disrupted by different stimuli. The findings may prove important to the design of novel neuromodulation treatments and techniques seeking to affect oscillatory activity in the brain.
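The interplay of sinusoidal drive and plasticity can be sketched with a phase-reduced stand-in for the paper's four-neuron oscillators: two phase oscillators under a common sinusoidal drive, with a phase-difference-dependent (Hebbian-like) rule that strengthens coupling when they oscillate in phase. All parameters, and the phase reduction itself, are illustrative assumptions rather than the study's model.

```python
import numpy as np

def driven_pair(w1=1.0, w2=1.2, drive_amp=0.8, drive_freq=1.1,
                k0=0.5, eta=0.005, T=200.0, dt=0.01):
    """Two phase oscillators with plastic mutual coupling k.

    Both receive the same sinusoidal drive; k relaxes toward
    cos(theta1 - theta2), so sustained in-phase activity strengthens
    the coupling and antiphase activity weakens it (illustrative rule).
    """
    th1, th2, k = 0.0, 0.0, k0
    for i in range(int(T / dt)):
        drive = drive_amp * np.sin(drive_freq * i * dt)
        dth1 = w1 + k * np.sin(th2 - th1) + drive
        dth2 = w2 + k * np.sin(th1 - th2) + drive
        th1 += dt * dth1
        th2 += dt * dth2
        k += dt * eta * (np.cos(th1 - th2) - k)   # Hebbian-like weight drift
        k = max(k, 0.0)
    return k, np.cos(th1 - th2)

k_final, coherence = driven_pair()
```

With detuned frequencies the pair phase-locks at a small phase lag, the plastic coupling grows, and the locked state becomes self-reinforcing, a toy version of stimulation-induced weight changes that outlast the stimulus.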
Collapse
Affiliation(s)
- Derek M Eidum
- Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708, USA
| | - Craig S Henriquez
- Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708, USA
| |
Collapse
|
23
|
Yang S, Deng B, Wang J, Li H, Lu M, Che Y, Wei X, Loparo KA. Scalable Digital Neuromorphic Architecture for Large-Scale Biophysically Meaningful Neural Network With Multi-Compartment Neurons. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2020; 31:148-162. [PMID: 30892250 DOI: 10.1109/tnnls.2019.2899936] [Citation(s) in RCA: 83] [Impact Index Per Article: 20.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/28/2023]
Abstract
Multi-compartment emulation is an essential step toward enhancing the biological realism of neuromorphic systems and further understanding the computational power of neurons. In this paper, we present a hardware-efficient, scalable, and real-time computing strategy for implementing large-scale, biologically meaningful neural networks with one million multi-compartment neurons (CMNs). The hardware platform uses four Altera Stratix III field-programmable gate arrays and addresses both the cellular and the network levels, providing an efficient implementation of a large-scale spiking neural network with biophysically plausible dynamics. At the cellular level, a cost-efficient multi-CMN model is presented that reproduces detailed neuronal dynamics with representative neuronal morphology. A set of efficient neuromorphic techniques for single-CMN implementation is presented; it eliminates the hardware cost of memory and multiplier resources and enhances computational speed by 56.59% in comparison with the classical digital implementation method. At the network level, a scalable network-on-chip (NoC) architecture with a novel routing algorithm is proposed to improve NoC throughput and computational latency, yielding higher computational efficiency and capability than state-of-the-art projects. The experimental results demonstrate that the proposed work provides an efficient model and architecture for large-scale, biologically meaningful networks, while the hardware synthesis results show low area utilization and high computational speed, supporting the scalability of the approach.
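The kind of per-step update that such multi-compartment emulations discretize in hardware can be sketched in software with a minimal two-compartment (soma plus dendrite) leaky neuron. The Euler step and all conductance values below are generic integrate-and-fire choices, not the paper's hardware model.

```python
def two_compartment_step(v_s, v_d, i_ext, dt=0.1,
                         g_c=0.5, g_l=0.1, e_l=-65.0,
                         v_th=-50.0, v_reset=-65.0):
    """One Euler step of a leaky two-compartment neuron.

    The dendritic compartment receives external current i_ext and is
    coupled to the soma through conductance g_c; the soma fires and
    resets when it crosses v_th. Parameters are generic illustrative
    values (mV, arbitrary conductance units).
    """
    dv_d = dt * (-g_l * (v_d - e_l) + g_c * (v_s - v_d) + i_ext)
    dv_s = dt * (-g_l * (v_s - e_l) + g_c * (v_d - v_s))
    v_d += dv_d
    v_s += dv_s
    spiked = v_s >= v_th
    if spiked:
        v_s = v_reset               # somatic spike-and-reset
    return v_s, v_d, spiked

# Constant dendritic drive produces repetitive somatic firing.
v_s, v_d, spikes = -65.0, -65.0, 0
for _ in range(2000):
    v_s, v_d, sp = two_compartment_step(v_s, v_d, i_ext=6.0)
    spikes += sp
```

A hardware pipeline evaluates exactly this kind of per-compartment state update every tick, which is why removing multipliers from it pays off at the scale of a million neurons.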
Collapse
|
24
|
Li E, Lin W, Yan Y, Yang H, Wang X, Chen Q, Lv D, Chen G, Chen H, Guo T. Synaptic Transistor Capable of Accelerated Learning Induced by Temperature-Facilitated Modulation of Synaptic Plasticity. ACS APPLIED MATERIALS & INTERFACES 2019; 11:46008-46016. [PMID: 31724851 DOI: 10.1021/acsami.9b17227] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/28/2023]
Abstract
Neuromorphic computation, which emulates the signal processing of the human brain, is considered a feasible route to future computing. Dynamic modulation of synaptic plasticity and accelerated learning, which improve the processing capacity and learning ability of artificial synaptic devices, could further improve the energy efficiency of neuromorphic computation. Nevertheless, dynamic regulation of synaptic weight without an external regulating terminal, and methods that endow artificial synaptic devices with the ability to modulate learning speed, have rarely been reported. Furthermore, finding suitable materials that fully mimic the response to photoelectric stimulation remains challenging for photoelectric synapses. Here, a floating-gate synaptic transistor based on L-type-ligand-modified all-inorganic CsPbBr3 perovskite quantum dots is demonstrated. This work provides the first clear experimental evidence that synaptic plasticity can be dynamically regulated by changing the waveform of the action potential and the ambient temperature, both of which originate in, and are crucial to, higher organisms. Moreover, benefiting from the excellent photoelectric properties and stability of the quantum dots, a temperature-facilitated learning process is illustrated by a classical conditioning experiment with and without illumination, and the mechanism of the synaptic plasticity is also demonstrated. This work offers a feasible way to dynamically modulate synaptic weight and accelerate the learning process of artificial synapses, showing great potential for reducing energy consumption and improving the efficiency of future neuromorphic computing.
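Temperature-facilitated learning of the kind reported here can be caricatured with a hypothetical Boltzmann-like learning-rate factor: the same pre/post pairing produces a larger weight change at higher temperature. The activation energy, pairing window, and every other constant below are invented for illustration, not measurements from the device.

```python
import math

def temperature_weight_update(w, dt_pair, temp_c,
                              lr0=0.05, e_a=0.3, temp_ref=25.0):
    """Hypothetical temperature-facilitated potentiation step.

    The learning rate grows with temperature via an Arrhenius-like
    factor (activation energy e_a in eV, illustrative), and the weight
    change decays with the pre/post pairing interval dt_pair (seconds).
    """
    k_b = 8.617e-5                          # Boltzmann constant, eV/K
    t_k = temp_c + 273.15
    t_ref_k = temp_ref + 273.15
    rate = lr0 * math.exp(-e_a / k_b * (1.0 / t_k - 1.0 / t_ref_k))
    return w + rate * math.exp(-abs(dt_pair) / 0.05) * (1.0 - w)

w_cold = temperature_weight_update(0.2, 0.01, temp_c=25.0)
w_warm = temperature_weight_update(0.2, 0.01, temp_c=45.0)
# The same pairing yields a larger weight change at the higher temperature.
```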
Collapse
Affiliation(s)
- Enlong Li
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology , Fuzhou University , Fuzhou 350002 , China
| | - Weikun Lin
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology , Fuzhou University , Fuzhou 350002 , China
| | - Yujie Yan
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology , Fuzhou University , Fuzhou 350002 , China
| | - Huihuang Yang
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology , Fuzhou University , Fuzhou 350002 , China
| | - Xiumei Wang
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology , Fuzhou University , Fuzhou 350002 , China
| | - Qizhen Chen
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology , Fuzhou University , Fuzhou 350002 , China
| | - DongXu Lv
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology , Fuzhou University , Fuzhou 350002 , China
| | - Gengxu Chen
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology , Fuzhou University , Fuzhou 350002 , China
| | - Huipeng Chen
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology , Fuzhou University , Fuzhou 350002 , China
| | - Tailiang Guo
- Institute of Optoelectronic Display, National & Local United Engineering Lab of Flat Panel Display Technology , Fuzhou University , Fuzhou 350002 , China
| |
Collapse
|
25
|
Xu R, Jang H, Lee MH, Amanov D, Cho Y, Kim H, Park S, Shin HJ, Ham D. Vertical MoS 2 Double-Layer Memristor with Electrochemical Metallization as an Atomic-Scale Synapse with Switching Thresholds Approaching 100 mV. NANO LETTERS 2019; 19:2411-2417. [PMID: 30896171 DOI: 10.1021/acs.nanolett.8b05140] [Citation(s) in RCA: 120] [Impact Index Per Article: 24.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
Atomically thin two-dimensional (2D) materials, such as transition metal dichalcogenide (TMD) monolayers and hexagonal boron nitride (hBN), and their van der Waals layered preparations have been actively researched to build electronic devices such as field-effect transistors, junction diodes, tunneling devices, and, more recently, memristors. Two-dimensional material memristors built in lateral form, with horizontal placement of the electrodes and the 2D material layers, have provided an intriguing window into the motions of ions along the atomically thin layers. On the other hand, 2D material memristors built in vertical form, with top and bottom electrodes sandwiching the 2D material layers, may provide opportunities to explore the limits of memristive performance with atomic-scale interelectrode distances. In particular, they may help push switching voltages to a lower limit, an important pursuit in memristor research in general given its role in neuromorphic computing. Recently, Akinwande et al. demonstrated a pioneering vertical memristor that sandwiches a single MoS2 monolayer between two inert Au electrodes, but it could neither attain switching voltages below 1 V nor control the switching polarity, yielding both unipolar and bipolar switching devices. Here, we report a vertical memristor that sandwiches two MoS2 monolayers between an active Cu top electrode and an inert Au bottom electrode. Cu ions diffuse through the MoS2 double layers to form atomic-scale filaments. The atomic-scale thickness, combined with the electrochemical metallization, lowers switching voltages to 0.1-0.2 V, on par with the state of the art. Furthermore, our memristor achieves consistent bipolar and analog switching, and thus exhibits synapse-like learning behavior such as spike-timing-dependent plasticity (STDP), the first STDP demonstration among all 2D-material-based vertical memristors. The demonstrated STDP with low switching voltages is promising not only for low-power neuromorphic computing but also because the voltage range approaches that of biological action potentials, opening up the possibility of direct interfacing with mammalian neuronal networks.
Affiliation(s)
- Renjing Xu
- School of Engineering and Applied Sciences, Harvard University, Cambridge, Massachusetts 02138, United States
| | - Houk Jang
- School of Engineering and Applied Sciences, Harvard University, Cambridge, Massachusetts 02138, United States
| | - Min-Hyun Lee
- Samsung Advanced Institute of Technology, Samsung Electronics, Suwon 443-803, South Korea
| | - Dovran Amanov
- School of Engineering and Applied Sciences, Harvard University, Cambridge, Massachusetts 02138, United States
| | - Yeonchoo Cho
- Samsung Advanced Institute of Technology, Samsung Electronics, Suwon 443-803, South Korea
| | - Haeryong Kim
- Samsung Advanced Institute of Technology, Samsung Electronics, Suwon 443-803, South Korea
| | - Seongjun Park
- Samsung Advanced Institute of Technology, Samsung Electronics, Suwon 443-803, South Korea
| | - Hyeon-Jin Shin
- Samsung Advanced Institute of Technology, Samsung Electronics, Suwon 443-803, South Korea
| | - Donhee Ham
- School of Engineering and Applied Sciences, Harvard University, Cambridge, Massachusetts 02138, United States
| |
26
Madadi Asl M, Valizadeh A, Tass PA. Dendritic and Axonal Propagation Delays May Shape Neuronal Networks With Plastic Synapses. Front Physiol 2018; 9:1849. [PMID: 30618847] [PMCID: PMC6307091] [DOI: 10.3389/fphys.2018.01849]
Abstract
Biological neuronal networks are highly adaptive and plastic. For instance, spike-timing-dependent plasticity (STDP) is a core mechanism that adapts synaptic strengths based on the relative timing of pre- and postsynaptic spikes. In various fields of physiology, time delays give rise to a plethora of biologically relevant dynamical phenomena. However, time delays also increase the complexity of model systems, along with the computational and theoretical analysis burden. Accordingly, propagation delays have often been neglected in computational neuronal network studies. As a downside, a classic STDP rule in oscillatory neurons without propagation delays is unable to give rise to bidirectional synaptic couplings (loops) or uncoupled states. This is at variance with basic experimental results. In this mini review, we consider recent theoretical studies of how this picture changes in the presence of propagation delays. Realistic propagation delays may lead to the emergence of neuronal activity and synaptic connectivity patterns that cannot be captured by classic STDP models. In fact, propagation delays determine the inventory of attractor states and shape their basins of attraction. The results reviewed here make it possible to overcome fundamental discrepancies between theory and experiment. Furthermore, these findings are relevant for the development of therapeutic brain stimulation techniques that aim to shift the diseased brain to more favorable attractor states.
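The core effect described in this abstract, that propagation delays can change the outcome of STDP, can be illustrated with a generic pair-based STDP kernel. This is a minimal sketch with arbitrary illustrative parameter values, not the authors' specific model: the timing seen at the synapse is the somatic spike-time difference shifted by the difference between the dendritic and axonal delays, which can flip potentiation into depression.

```python
import math

A_PLUS, A_MINUS, TAU = 0.010, 0.012, 20.0  # illustrative STDP parameters

def stdp_dw(dt):
    """Generic pair-based STDP kernel; dt = t_post - t_pre (ms)."""
    if dt > 0:
        return A_PLUS * math.exp(-dt / TAU)    # pre before post: LTP
    if dt < 0:
        return -A_MINUS * math.exp(dt / TAU)   # post before pre: LTD
    return 0.0

def dw_with_delays(dt_soma, d_dend, d_axon):
    """Plasticity as seen at the synapse: the backpropagating postsynaptic
    spike arrives after the dendritic delay, the presynaptic spike after
    the axonal delay, shifting the effective timing."""
    return stdp_dw(dt_soma + d_dend - d_axon)

# Somatic timing says "post 2 ms after pre", so with zero delays the
# pair is potentiating; a 5 ms axonal delay flips it to depression.
print(dw_with_delays(2.0, d_dend=0.0, d_axon=0.0) > 0)  # True
print(dw_with_delays(2.0, d_dend=0.0, d_axon=5.0) < 0)  # True
```

The sign flip above is the sense in which delays reshape the attractor inventory of plastic networks: the same somatic spike pattern can strengthen or weaken a synapse depending on where the delays fall.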
Affiliation(s)
- Mojtaba Madadi Asl
- Department of Physics, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan, Iran
| | - Alireza Valizadeh
- Department of Physics, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan, Iran; School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), Tehran, Iran
| | - Peter A Tass
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, CA, United States
| |
27
Henderson JA, Gong P. Functional mechanisms underlie the emergence of a diverse range of plasticity phenomena. PLoS Comput Biol 2018; 14:e1006590. [PMID: 30419014] [PMCID: PMC6258383] [DOI: 10.1371/journal.pcbi.1006590]
Abstract
Diverse plasticity mechanisms are orchestrated to shape the spatiotemporal dynamics underlying brain functions. However, why these plasticity rules emerge and how their dynamics interact with neural activity to give rise to complex neural circuit dynamics remain largely unknown. Here we show that both Hebbian and homeostatic plasticity rules emerge from a functional perspective on neuronal dynamics, whereby each neuron learns to encode its own activity in the population activity, so that the activity of a presynaptic neuron can be decoded from the activity of its postsynaptic neurons. We explain how a range of experimentally observed plasticity phenomena with widely separated time scales, including STDP, its frequency dependence, and metaplasticity, emerge from learning this encoding function. We show that when implemented in neural circuits, these plasticity rules naturally give rise to essential neural response properties, including variable neural dynamics with balanced excitation and inhibition and approximately log-normal distributions of synaptic strengths, while simultaneously encoding a complex real-world visual stimulus. These findings establish a novel function-based account of diverse plasticity mechanisms, providing a unifying framework relating plasticity, dynamics, and neural computation.
Many experiments have documented a variety of ways in which the connectivity strengths between neurons change in response to neural activity. These changes are an important part of learning. However, it is not understood how such a diverse range of observations can be explained as consequences of an underlying learning algorithm used by brains. To understand such a learning algorithm, it is also necessary to understand the neural computation being learned, that is, how the functions of the brain are encoded in the activity of its neurons and its connectivity. In this work we propose a simple way in which information can be encoded and decoded in a network of neurons operating on real-world stimuli, and show how this can be learned using two fundamental plasticity rules that change the strength of connections between neurons in response to neural activity. Surprisingly, many experimental observations follow as consequences of this approach, indicating that studying the learning of function provides a novel framework for unifying plasticity, dynamics, and neural computation.
Affiliation(s)
- James A. Henderson
- School of Physics, The University of Sydney, Sydney, NSW, Australia
- ARC Centre of Excellence for Integrative Brain Function, The University of Sydney, Sydney, NSW, Australia
| | - Pulin Gong
- School of Physics, The University of Sydney, Sydney, NSW, Australia
- ARC Centre of Excellence for Integrative Brain Function, The University of Sydney, Sydney, NSW, Australia
| |
28
Neftci EO. Data and Power Efficient Intelligence with Neuromorphic Learning Machines. iScience 2018; 5:52-68. [PMID: 30240646] [PMCID: PMC6123858] [DOI: 10.1016/j.isci.2018.06.010]
Abstract
The success of deep networks and recent industry involvement in brain-inspired computing are igniting widespread interest in neuromorphic hardware that emulates the biological processes of the brain on an electronic substrate. This review explores interdisciplinary approaches anchored in machine learning theory that enable the application of neuromorphic technologies to real-world, human-centric tasks. We find that (1) recent work on binary deep networks and approximate gradient descent learning is strikingly compatible with a neuromorphic substrate; (2) where real-time adaptability and autonomy are necessary, neuromorphic technologies can achieve significant advantages over mainstream ones; and (3) challenges in memory technologies, compounded by a tradition of bottom-up approaches in the field, block the road to major breakthroughs. We suggest that a neuromorphic learning framework, tuned specifically for the spatial and temporal constraints of the neuromorphic substrate, will help guide hardware-algorithm co-design and the deployment of neuromorphic hardware for proactive learning of real-world data.
Affiliation(s)
- Emre O Neftci
- Department of Cognitive Sciences, UC Irvine, Irvine, CA 92697-5100, USA; Department of Computer Science, UC Irvine, Irvine, CA 92697-5100, USA.
| |
29
Manninen T, Aćimović J, Havela R, Teppola H, Linne ML. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures. Front Neuroinform 2018; 12:20. [PMID: 29765315] [PMCID: PMC5938413] [DOI: 10.3389/fninf.2018.00020]
Abstract
The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, thousands of models are available. However, it is rarely possible to reimplement a model from the information in the original publication, let alone rerun it, because the model implementations have typically not been made publicly available. We evaluate and discuss the comparability of a versatile selection of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. Replicability and reproducibility issues are considered for computational models that are equally diverse, including models of intracellular signal transduction in neurons and glial cells, as well as single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another, to assess whether the studied models can be used to answer similar research questions. In addition to presenting the challenges to reproducibility and replicability of published results in computational neuroscience, we highlight the need to develop recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tools used to simulate and develop computational models. Constant improvement in experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, whether for constructing multiscale models or for extending models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, and enabling the progressive building of advanced computational models and tools, can be achieved only by adopting publishing standards that emphasize the replicability and reproducibility of research results.
Affiliation(s)
- Tiina Manninen
- Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland
- Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
| | - Jugoslava Aćimović
- Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland
- Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
| | - Riikka Havela
- Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland
- Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
| | - Heidi Teppola
- Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland
- Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
| | - Marja-Leena Linne
- Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland
- Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
| |
30
Guo R, Zhou Y, Wu L, Wang Z, Lim Z, Yan X, Lin W, Wang H, Yoong HY, Chen S, Venkatesan T, Wang J, Chow GM, Gruverman A, Miao X, Zhu Y, Chen J. Control of Synaptic Plasticity Learning of Ferroelectric Tunnel Memristor by Nanoscale Interface Engineering. ACS Appl Mater Interfaces 2018; 10:12862-12869. [PMID: 29617112] [DOI: 10.1021/acsami.8b01469]
Abstract
Brain-inspired computing is an emerging field that aims to extend the capabilities of information technology beyond digital logic. Progress in the field relies on artificial synaptic devices as building blocks for brain-like computing systems. Here, we report an electronic synapse based on a ferroelectric tunnel memristor whose synaptic plasticity learning properties can be controlled by nanoscale interface engineering. The effect of the interface engineering on device performance was studied. Different memristor interfaces lead to opposite virgin resistance states of the devices. More importantly, nanoscale interface engineering can tune the intrinsic band alignment of the ferroelectric/metal-semiconductor heterostructure over a large range of 1.28 eV, which ultimately results in different memristive and spike-timing-dependent plasticity (STDP) properties. Bidirectional and unidirectional gradual resistance modulation of the devices can therefore be controlled by tuning the band alignment. This study gives useful insights into tuning device functionality through nanoscale interface engineering. The diverse STDP forms of memristors with different interfaces may play different specific roles in various spiking neural networks.
Affiliation(s)
- Rui Guo
- Department of Materials Science and Engineering, National University of Singapore, 117575, Singapore
- NUSNNI-Nanocore, National University of Singapore, 117411, Singapore
| | - Yaxiong Zhou
- Wuhan National Laboratory for Optoelectronics, School of Optical and Electronic Information, Huazhong University of Science and Technology, Wuhan 430074, China
| | - Lijun Wu
- Condensed Matter Physics & Materials Science Division, Brookhaven National Laboratory, Upton, New York 11973, United States
| | - Zhuorui Wang
- Wuhan National Laboratory for Optoelectronics, School of Optical and Electronic Information, Huazhong University of Science and Technology, Wuhan 430074, China
| | - Zhishiuh Lim
- NUSNNI-Nanocore, National University of Singapore, 117411, Singapore
| | - Xiaobing Yan
- Department of Materials Science and Engineering, National University of Singapore, 117575, Singapore
| | - Weinan Lin
- Department of Materials Science and Engineering, National University of Singapore, 117575, Singapore
| | - Han Wang
- Department of Materials Science and Engineering, National University of Singapore, 117575, Singapore
| | - Herng Yau Yoong
- Department of Materials Science and Engineering, National University of Singapore, 117575, Singapore
| | - Shaohai Chen
- Department of Materials Science and Engineering, National University of Singapore, 117575, Singapore
| | - Thirumalai Venkatesan
- Department of Materials Science and Engineering, National University of Singapore, 117575, Singapore
- NUSNNI-Nanocore, National University of Singapore, 117411, Singapore
- Department of Physics, National University of Singapore, 117542, Singapore
- Department of Electrical and Computer Engineering, National University of Singapore, 117583, Singapore
| | - John Wang
- Department of Materials Science and Engineering, National University of Singapore, 117575, Singapore
| | - Gan Moog Chow
- Department of Materials Science and Engineering, National University of Singapore, 117575, Singapore
| | - Alexei Gruverman
- Department of Physics and Astronomy, University of Nebraska-Lincoln, Lincoln, Nebraska 68588, United States
| | - Xiangshui Miao
- Wuhan National Laboratory for Optoelectronics, School of Optical and Electronic Information, Huazhong University of Science and Technology, Wuhan 430074, China
| | - Yimei Zhu
- Condensed Matter Physics & Materials Science Division, Brookhaven National Laboratory, Upton, New York 11973, United States
| | - Jingsheng Chen
- Department of Materials Science and Engineering, National University of Singapore, 117575, Singapore
- NUSNNI-Nanocore, National University of Singapore, 117411, Singapore
| |
31
Olde Scheper TV, Meredith RM, Mansvelder HD, van Pelt J, van Ooyen A. Dynamic Hebbian Cross-Correlation Learning Resolves the Spike Timing Dependent Plasticity Conundrum. Front Comput Neurosci 2018; 11:119. [PMID: 29375358] [PMCID: PMC5768644] [DOI: 10.3389/fncom.2017.00119]
Abstract
Spike-timing-dependent plasticity has been found to assume many different forms. The classic STDP curve, with one potentiating and one depressing window, is only one of many possible curves that describe synaptic learning via the STDP mechanism. It has been shown experimentally that STDP curves may contain multiple LTP and LTD windows of variable width, and even inverted windows. The underlying STDP mechanism capable of producing such an extensive, and apparently incompatible, range of learning curves is still under investigation. In this paper, we show that STDP originates from a combination of two dynamic Hebbian cross-correlations of local activity at the synapse. The correlation of presynaptic activity with local postsynaptic activity is a robust and reliable indicator of the discrepancy between the presynaptic and postsynaptic neurons' activity. The second correlation, between local postsynaptic activity and dendritic activity, is a good indicator of matching local synaptic and dendritic activity. We show that this simple time-independent learning rule can give rise to many forms of the STDP learning curve. The rule regulates synaptic strength without the need for spike matching or other supervisory learning mechanisms. Local differences in dendritic activity at the synapse greatly affect the cross-correlation difference, which determines the relative contributions of the different sources of neural activity. Dendritic activity due to nearby synapses, forward- and back-propagating action potentials, and inhibitory synapses dynamically modifies the local activity at the synapse and hence the resulting STDP learning rule. Furthermore, the dynamic Hebbian learning rule ensures that the resulting synaptic strength is dynamically stable and that interactions between synapses do not result in local instabilities. The rule clearly demonstrates that synapses function as independent localized computational entities, each contributing to the global activity not in a simple linear fashion, but in a manner appropriate to achieve local and global stability of the neuron and the entire dendritic structure.
Affiliation(s)
- Tjeerd V Olde Scheper
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Vrije Universiteit Amsterdam, Amsterdam, Netherlands; Department of Computing and Communication Technologies, Faculty of Technology, Design and Environment, Oxford Brookes University, Oxford, United Kingdom
| | - Rhiannon M Meredith
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
| | - Huibert D Mansvelder
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
| | - Jaap van Pelt
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
| | - Arjen van Ooyen
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
| |
32
Matsubara T. Conduction Delay Learning Model for Unsupervised and Supervised Classification of Spatio-Temporal Spike Patterns. Front Comput Neurosci 2017; 11:104. [PMID: 29209191] [PMCID: PMC5702355] [DOI: 10.3389/fncom.2017.00104]
Abstract
Precise spike timing is considered to play a fundamental role in communication and signal processing in biological neural networks. Understanding the mechanisms of spike timing adjustment would deepen our understanding of biological systems and enable advanced engineering applications such as efficient computational architectures. However, the biological mechanisms that adjust and maintain spike timing remain unclear. Existing algorithms adopt a supervised approach, adjusting the axonal conduction delay and synaptic efficacy until the spike timings approximate the desired timings. This study proposes a spike-timing-dependent learning model that adjusts the axonal conduction delay and synaptic efficacy in both unsupervised and supervised manners. The proposed learning algorithm approximates the Expectation-Maximization algorithm and classifies input data encoded into spatio-temporal spike patterns. Unlike existing algorithms, even in supervised classification it requires no external spikes indicating the desired spike timings. Furthermore, because the algorithm is consistent with biological models and hypotheses from existing biological studies, it may capture the mechanism underlying biological delay learning.
Affiliation(s)
- Takashi Matsubara
- Computational Intelligence, Fundamentals of Computational Science, Department of Computational Science, Graduate School of System Informatics, Kobe University, Hyogo, Japan
| |
33
Modeling somatic and dendritic spike mediated plasticity at the single neuron and network level. Nat Commun 2017; 8:706. [PMID: 28951585] [PMCID: PMC5615054] [DOI: 10.1038/s41467-017-00740-z]
Abstract
Synaptic plasticity is thought to be the principal neuronal mechanism underlying learning. Models of plastic networks typically combine point neurons with spike-timing-dependent plasticity (STDP) as the learning rule. However, a point neuron does not capture the local non-linear processing of synaptic inputs allowed for by dendrites. Furthermore, experimental evidence suggests that STDP is not the only learning rule available to neurons. By implementing biophysically realistic neuron models, we study how dendrites enable multiple synaptic plasticity mechanisms to coexist in a single cell. In these models, we compare the conditions for STDP and for synaptic strengthening by local dendritic spikes. We also explore how the connectivity between two cells is affected by these plasticity rules and by different synaptic distributions. Finally, we show how memory retention during associative learning can be prolonged in networks of neurons by including dendrites. Synaptic plasticity is the neuronal mechanism underlying learning. Here the authors construct biophysical models of pyramidal neurons that reproduce observed plasticity gradients along the dendrite and show that dendritic-spike-dependent LTP, which is predominant in distal sections, can prolong memory retention.
34
Natural Firing Patterns Imply Low Sensitivity of Synaptic Plasticity to Spike Timing Compared with Firing Rate. J Neurosci 2016; 36:11238-11258. [PMID: 27807166] [DOI: 10.1523/jneurosci.0104-16.2016]
Abstract
Synaptic plasticity is sensitive to the rate and the timing of presynaptic and postsynaptic action potentials. In experimental protocols inducing plasticity, the imposed spike trains are typically regular and the relative timing between every presynaptic and postsynaptic spike is fixed. This is at odds with firing patterns observed in the cortex of intact animals, where cells fire irregularly and the timing between presynaptic and postsynaptic spikes varies. To investigate synaptic changes elicited by in vivo-like firing, we used numerical simulations and mathematical analysis of synaptic plasticity models. We found that the influence of spike timing on plasticity is weaker than expected from regular stimulation protocols. Moreover, when neurons fire irregularly, synaptic changes induced by precise spike timing can be equivalently induced by a modest firing rate variation. Our findings bridge the gap between existing results on synaptic plasticity and plasticity occurring in vivo, and challenge the dominant role of spike timing in plasticity.
Significance statement: Synaptic plasticity, the change in efficacy of connections between neurons, is thought to underlie learning and memory. The dominant paradigm posits that the precise timing of neural action potentials (APs) is central for plasticity induction. This concept is based on experiments using highly regular and stereotyped patterns of APs, in stark contrast with natural neuronal activity. Using synaptic plasticity models, we investigated how irregular, in vivo-like activity shapes synaptic plasticity. We found that synaptic changes induced by precise timing of APs are much weaker than suggested by regular stimulation protocols, and can be equivalently induced by modest variations of the AP rate alone. Our results call into question the dominant role of precise AP timing for plasticity in natural conditions.
35
Xu W, Baker SN. Timing Intervals Using Population Synchrony and Spike Timing Dependent Plasticity. Front Comput Neurosci 2016; 10:123. [PMID: 27990109] [PMCID: PMC5133049] [DOI: 10.3389/fncom.2016.00123]
Abstract
We present a computational model by which ensembles of regularly spiking neurons can encode different time intervals through synchronous firing. We show that a neuron responding to a large population of convergent inputs has the potential to learn to produce an appropriately-timed output via spike-time dependent plasticity. We explain why temporal variability of this population synchrony increases with increasing time intervals. We also show that the scalar property of timing and its violation at short intervals can be explained by the spike-wise accumulation of jitter in the inter-spike intervals of timing neurons. We explore how the challenge of encoding longer time intervals can be overcome and conclude that this may involve a switch to a different population of neurons with lower firing rate, with the added effect of producing an earlier bias in response. Experimental data on human timing performance show features in agreement with the model's output.
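The jitter-accumulation argument in this abstract can be sketched numerically. The toy below (NumPy, with made-up parameter values; not the paper's spiking model) sums n independent inter-spike intervals: the standard deviation of the timed interval grows as sqrt(n) times the per-interval jitter, so relative variability stays roughly constant across durations, consistent with the scalar property of timing.

```python
import numpy as np

rng = np.random.default_rng(0)

def interval_std(n_isis, isi_mean=10.0, sigma=1.0, trials=100_000):
    """Std of a timed interval built from n_isis jittered inter-spike
    intervals, each drawn independently from Normal(isi_mean, sigma)."""
    isis = rng.normal(isi_mean, sigma, size=(trials, n_isis))
    return isis.sum(axis=1).std()

# Jitter accumulates spike by spike: std grows as sqrt(n) * sigma, so the
# coefficient of variation of the interval stays roughly constant.
for n in (4, 16, 64):
    print(f"{n * 10.0:5.0f} ms interval: std ~ {interval_std(n):.2f} ms "
          f"(sqrt(n) * sigma = {np.sqrt(n):.1f} ms)")
```

Because longer intervals need more ISIs, absolute temporal variability increases with interval duration even though each spike is equally precise, matching the trend the model predicts for population-synchrony timing.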
Affiliation(s)
- Wei Xu
- Movement Laboratory, Institute of Neuroscience, Medical School, Newcastle University, Newcastle upon Tyne, UK
| | - Stuart N Baker
- Movement Laboratory, Institute of Neuroscience, Medical School, Newcastle University, Newcastle upon Tyne, UK
| |
36
Babadi B, Abbott LF. Stability and Competition in Multi-spike Models of Spike-Timing Dependent Plasticity. PLoS Comput Biol 2016; 12:e1004750. [PMID: 26939080] [PMCID: PMC4777380] [DOI: 10.1371/journal.pcbi.1004750]
Abstract
Spike-timing dependent plasticity (STDP) is a widespread plasticity mechanism in the nervous system. The simplest description of STDP only takes into account pairs of pre- and postsynaptic spikes, with potentiation of the synapse when a presynaptic spike precedes a postsynaptic spike and depression otherwise. In light of experiments that explored a variety of spike patterns, the pair-based STDP model has been augmented to account for multiple pre- and postsynaptic spike interactions. As a result, a number of different "multi-spike" STDP models have been proposed based on different experimental observations. The behavior of these models at the population level is crucial for understanding mechanisms of learning and memory. The challenging balance between the stability of a population of synapses and their competitive modification is well studied for pair-based models, but it has not yet been fully analyzed for multi-spike models. Here, we address this issue through numerical simulations of an integrate-and-fire model neuron with excitatory synapses subject to STDP described by three different proposed multi-spike models. We also analytically calculate average synaptic changes and fluctuations about these averages. Our results indicate that the different multi-spike models behave quite differently at the population level. Although each model can produce synaptic competition in certain parameter regions, none of them induces synaptic competition with its originally fitted parameters. The dichotomy between synaptic stability and Hebbian competition, which is well characterized for pair-based STDP models, persists in multi-spike models. However, anti-Hebbian competition can coexist with synaptic stability in some models. We propose that the collective behavior of synaptic plasticity models at the population level should be used as an additional guideline in applying phenomenological models based on observations of single synapses.
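The pair-based rule that this abstract takes as its baseline can be sketched as follows; the amplitudes and time constants below are conventional illustrative values, not the fitted parameters of any of the multi-spike models studied in the paper:

```python
import numpy as np

def pair_stdp(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for one pre/post spike pair under pair-based STDP.

    dt = t_post - t_pre (ms): positive (pre before post) potentiates,
    negative (post before pre) depresses, each with an exponential window.
    """
    if dt > 0:
        return a_plus * np.exp(-dt / tau_plus)
    elif dt < 0:
        return -a_minus * np.exp(dt / tau_minus)
    return 0.0

print(pair_stdp(+10.0))   # potentiation: pre leads post
print(pair_stdp(-10.0))   # depression: post leads pre
```

The multi-spike models compared in the paper augment this pairwise kernel with triplet or suppression terms; the population-level behavior then depends on which augmentation is chosen.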
Collapse
Affiliation(s)
- Baktash Babadi
- Center for Theoretical Neuroscience, Department of Neuroscience, Columbia University, New York, New York, United States of America
- Swartz Program in Theoretical Neuroscience, Center for Brain Science, Harvard University, Cambridge, Massachusetts, United States of America
| | - L. F. Abbott
- Center for Theoretical Neuroscience, Department of Neuroscience, Columbia University, New York, New York, United States of America
| |
Collapse
|
37
|
Tigaret CM, Olivo V, Sadowski JHLP, Ashby MC, Mellor JR. Coordinated activation of distinct Ca(2+) sources and metabotropic glutamate receptors encodes Hebbian synaptic plasticity. Nat Commun 2016; 7:10289. [PMID: 26758963 PMCID: PMC4735496 DOI: 10.1038/ncomms10289] [Citation(s) in RCA: 50] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/03/2015] [Accepted: 11/26/2015] [Indexed: 01/10/2023] Open
Abstract
At glutamatergic synapses, induction of associative synaptic plasticity requires time-correlated presynaptic and postsynaptic spikes to activate postsynaptic NMDA receptors (NMDARs). The magnitudes of the ensuing Ca2+ transients within dendritic spines are thought to determine the amplitude and direction of synaptic change. In contrast, we show that at mature hippocampal Schaffer collateral synapses the magnitudes of Ca2+ transients during plasticity induction do not match this rule. Indeed, LTP induced by time-correlated pre- and postsynaptic spikes instead requires the sequential activation of NMDARs followed by voltage-sensitive Ca2+ channels within dendritic spines. Furthermore, LTP requires inhibition of SK channels by mGluR1, which removes a negative feedback loop that constitutively regulates NMDARs. Therefore, rather than being controlled simply by the magnitude of the postsynaptic calcium rise, LTP induction requires the coordinated activation of distinct sources of Ca2+ and mGluR1-dependent facilitation of NMDAR function. During STDP, the magnitude of postsynaptic Ca2+ transients is hypothesized to determine the strength of synaptic plasticity. Here, the authors find that STDP in mature hippocampal synapses does not obey this rule but instead relies on the coordinated activation of NMDARs and VGCCs and their regulation by mGluRs and SK channels.
Collapse
Affiliation(s)
- Cezar M Tigaret
- Centre for Synaptic Plasticity, School of Physiology, Pharmacology and Neuroscience, University of Bristol, University Walk, Bristol BS8 1TD, UK
| | - Valeria Olivo
- Centre for Synaptic Plasticity, School of Physiology, Pharmacology and Neuroscience, University of Bristol, University Walk, Bristol BS8 1TD, UK
| | - Josef H L P Sadowski
- Centre for Synaptic Plasticity, School of Physiology, Pharmacology and Neuroscience, University of Bristol, University Walk, Bristol BS8 1TD, UK
| | - Michael C Ashby
- Centre for Synaptic Plasticity, School of Physiology, Pharmacology and Neuroscience, University of Bristol, University Walk, Bristol BS8 1TD, UK
| | - Jack R Mellor
- Centre for Synaptic Plasticity, School of Physiology, Pharmacology and Neuroscience, University of Bristol, University Walk, Bristol BS8 1TD, UK
| |
Collapse
|
38
|
Almási AD, Woźniak S, Cristea V, Leblebici Y, Engbersen T. Review of advances in neural networks: Neural design technology stack. Neurocomputing 2016. [DOI: 10.1016/j.neucom.2015.02.092] [Citation(s) in RCA: 47] [Impact Index Per Article: 5.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
|
39
|
Costa RP, Froemke RC, Sjöström PJ, van Rossum MCW. Unified pre- and postsynaptic long-term plasticity enables reliable and flexible learning. eLife 2015; 4:e09457. [PMID: 26308579 PMCID: PMC4584257 DOI: 10.7554/elife.09457] [Citation(s) in RCA: 37] [Impact Index Per Article: 4.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2015] [Accepted: 08/25/2015] [Indexed: 12/26/2022] Open
Abstract
Although it is well known that long-term synaptic plasticity can be expressed both pre- and postsynaptically, the functional consequences of this arrangement have remained elusive. We show that spike-timing-dependent plasticity with both pre- and postsynaptic expression develops receptive fields with reduced variability and improved discriminability compared to postsynaptic plasticity alone. These long-term modifications in receptive field statistics match recent sensory perception experiments. Moreover, learning with this form of plasticity leaves a hidden postsynaptic memory trace that enables fast relearning of previously stored information, providing a cellular substrate for memory savings. Our results reveal essential roles for presynaptic plasticity that are missed when only postsynaptic expression of long-term plasticity is considered, and suggest an experience-dependent distribution of pre- and postsynaptic strength changes.
Collapse
Affiliation(s)
- Rui Ponte Costa
- Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh, United Kingdom
- Neuroinformatics Doctoral Training Centre, School of Informatics, University of Edinburgh, Edinburgh, United Kingdom
- The Research Institute of the McGill University Health Centre, Department of Neurology and Neurosurgery, McGill University, Montreal, Canada
- Centre for Neural Circuits and Behaviour, University of Oxford, Oxford, United Kingdom
| | - Robert C Froemke
- Skirball Institute for Biomolecular Medicine, Departments of Otolaryngology, Neuroscience and Physiology, New York University School of Medicine, New York, United States
- Center for Neural Science, New York University, New York, United States
| | - P Jesper Sjöström
- The Research Institute of the McGill University Health Centre, Department of Neurology and Neurosurgery, McGill University, Montreal, Canada
| | - Mark CW van Rossum
- Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh, United Kingdom
| |
Collapse
|
40
|
Duarte RCF, Morrison A. Dynamic stability of sequential stimulus representations in adapting neuronal networks. Front Comput Neurosci 2014; 8:124. [PMID: 25374534 PMCID: PMC4205815 DOI: 10.3389/fncom.2014.00124] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/07/2014] [Accepted: 09/16/2014] [Indexed: 12/16/2022] Open
Abstract
The ability to acquire and maintain appropriate representations of time-varying, sequential stimulus events is a fundamental feature of neocortical circuits and a necessary first step toward more specialized information processing. The dynamical properties of such representations depend on the current state of the circuit, which is determined primarily by the ongoing, internally generated activity, setting the ground state from which input-specific transformations emerge. Here, we begin by demonstrating that timing-dependent synaptic plasticity mechanisms have an important role to play in the active maintenance of an ongoing dynamics characterized by asynchronous and irregular firing, closely resembling cortical activity in vivo. Incoming stimuli, acting as perturbations of the local balance of excitation and inhibition, require fast adaptive responses to prevent the development of unstable activity regimes, such as those characterized by a high degree of population-wide synchrony. We establish a link between such pathological network activity, which is circumvented by the action of plasticity, and a reduced computational capacity. Additionally, we demonstrate that the action of plasticity shapes and stabilizes the transient network states exhibited in the presence of sequentially presented stimulus events, allowing the development of adequate and discernible stimulus representations. The main feature responsible for the increased discriminability of stimulus-driven population responses in plastic networks is shown to be the decorrelating action of inhibitory plasticity and the consequent maintenance of the asynchronous irregular dynamic regime both for ongoing activity and stimulus-driven responses, whereas excitatory plasticity is shown to play only a marginal role.
Collapse
Affiliation(s)
- Renato C F Duarte
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6), Jülich Research Center and JARA, Jülich, Germany
- Bernstein Center Freiburg, Albert-Ludwig University of Freiburg, Freiburg im Breisgau, Germany
- Faculty of Biology, Albert-Ludwig University of Freiburg, Freiburg im Breisgau, Germany
- School of Informatics, Institute of Adaptive and Neural Computation, University of Edinburgh, Edinburgh, UK
| | - Abigail Morrison
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6), Jülich Research Center and JARA, Jülich, Germany
- Bernstein Center Freiburg, Albert-Ludwig University of Freiburg, Freiburg im Breisgau, Germany
- Faculty of Biology, Albert-Ludwig University of Freiburg, Freiburg im Breisgau, Germany
- Faculty of Psychology, Institute of Cognitive Neuroscience, Ruhr-University Bochum, Bochum, Germany
| |
Collapse
|
41
|
Echeveste R, Gros C. Generating Functionals for Computational Intelligence: The Fisher Information as an Objective Function for Self-Limiting Hebbian Learning Rules. Front Robot AI 2014. [DOI: 10.3389/frobt.2014.00001] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/23/2022] Open
|
42
|
Coexistence of reward and unsupervised learning during the operant conditioning of neural firing rates. PLoS One 2014; 9:e87123. [PMID: 24475240 PMCID: PMC3903641 DOI: 10.1371/journal.pone.0087123] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2013] [Accepted: 12/21/2013] [Indexed: 11/24/2022] Open
Abstract
A fundamental goal of neuroscience is to understand how cognitive processes, such as operant conditioning, are performed by the brain. Typical and well studied examples of operant conditioning, in which the firing rates of individual cortical neurons in monkeys are increased using rewards, provide an opportunity for insight into this. Studies of reward-modulated spike-timing-dependent plasticity (RSTDP), and of other models such as R-max, have reproduced this learning behavior, but they have assumed that no unsupervised learning is present (i.e., no learning occurs without, or independent of, rewards). We show that these models cannot elicit firing rate reinforcement while exhibiting both reward learning and ongoing, stable unsupervised learning. To fix this issue, we propose a new RSTDP model of synaptic plasticity based upon the observed effects that dopamine has on long-term potentiation and depression (LTP and LTD). We show, both analytically and through simulations, that our new model can exhibit unsupervised learning and lead to firing rate reinforcement. This requires that the strengthening of LTP by the reward signal is greater than the strengthening of LTD and that the reinforced neuron exhibits irregular firing. We show the robustness of our findings to spike-timing correlations, to the synaptic weight dependence that is assumed, and to changes in the mean reward. We also consider our model in the differential reinforcement of two nearby neurons. Our model aligns more strongly with experimental studies than previous models and makes testable predictions for future experiments.
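The reward-gated learning scheme discussed above can be sketched with a generic eligibility-trace formulation of RSTDP (this is the standard textbook scheme, not the specific dopamine-dependent model the paper proposes; all parameter values are illustrative):

```python
def rstdp_step(w, elig, pair_dw, reward, tau_e=200.0, dt=1.0, lr=0.1):
    """One time step of a generic reward-modulated STDP (RSTDP) scheme:
    pairwise STDP updates accumulate in a decaying eligibility trace, and
    the weight changes only when a reward signal gates that trace."""
    elig = elig * (1.0 - dt / tau_e) + pair_dw   # leaky eligibility trace
    w = w + lr * reward * elig                   # reward gates learning
    return w, elig

# Without reward the synapse stores eligibility but does not change;
# a later reward converts the stored trace into a weight change.
w, e = rstdp_step(0.5, 0.0, pair_dw=0.02, reward=0.0)
w, e = rstdp_step(w, e, pair_dw=0.0, reward=1.0)
print(w, e)
```

In this formulation, unsupervised learning corresponds to a nonzero weight drift even when `reward` is zero; the paper's argument concerns how such a baseline drift can coexist stably with reward-driven reinforcement.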
Collapse
|
43
|
Standage D, Trappenberg T, Blohm G. Calcium-dependent calcium decay explains STDP in a dynamic model of hippocampal synapses. PLoS One 2014; 9:e86248. [PMID: 24465987 PMCID: PMC3899242 DOI: 10.1371/journal.pone.0086248] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/18/2013] [Accepted: 12/12/2013] [Indexed: 11/18/2022] Open
Abstract
It is widely accepted that the direction and magnitude of synaptic plasticity depends on post-synaptic calcium flux, where high levels of calcium lead to long-term potentiation and moderate levels lead to long-term depression. At synapses onto neurons in region CA1 of the hippocampus (and many other synapses), NMDA receptors provide the relevant source of calcium. In this regard, post-synaptic calcium captures the coincidence of pre- and post-synaptic activity, due to the blockage of these receptors at low voltage. Previous studies show that under spike timing dependent plasticity (STDP) protocols, potentiation at CA1 synapses requires post-synaptic bursting and an inter-pairing frequency in the range of the hippocampal theta rhythm. We hypothesize that these requirements reflect the saturation of the mechanisms of calcium extrusion from the post-synaptic spine. We test this hypothesis with a minimal model of NMDA receptor-dependent plasticity, simulating slow extrusion with a calcium-dependent calcium time constant. In simulations of STDP experiments, the model accounts for latency-dependent depression with either post-synaptic bursting or theta-frequency pairing (or neither) and accounts for latency-dependent potentiation when both of these requirements are met. The model makes testable predictions for STDP experiments and our simple implementation is tractable at the network level, demonstrating associative learning in a biophysical network model with realistic synaptic dynamics.
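The saturating-extrusion hypothesis above can be sketched by making the calcium decay time constant itself grow with calcium; the functional form tau(Ca) = tau0*(1 + k*Ca) and all parameter values here are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def simulate_calcium(influx_times, tau0=20.0, k=5.0, dt=0.1, t_max=500.0):
    """Calcium transient with a calcium-dependent decay time constant:
    tau(Ca) = tau0 * (1 + k * Ca), so extrusion slows (saturates) as
    calcium accumulates. Times in ms; unit influx per pairing event."""
    times = np.arange(0.0, t_max, dt)
    ca = np.zeros_like(times)
    influx = set(round(t / dt) for t in influx_times)
    for i in range(1, len(times)):
        c = ca[i - 1]
        tau = tau0 * (1.0 + k * c)      # slower decay at high calcium
        c += -c * dt / tau              # forward-Euler extrusion step
        if i in influx:
            c += 1.0
        ca[i] = c
    return times, ca

# Theta-frequency pairing lets calcium pile up before it is extruded,
# whereas slower pairing lets each transient decay almost completely.
_, ca_theta = simulate_calcium([10, 110, 210])   # ~10 Hz pairing
_, ca_slow = simulate_calcium([10, 210, 410])    # ~5 Hz pairing
print(ca_theta.max(), ca_slow.max())
```

With these (assumed) parameters, the faster pairing reaches a higher peak calcium, which is the qualitative effect the abstract invokes to explain the theta-frequency requirement for potentiation.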
Collapse
Affiliation(s)
- Dominic Standage
- Department of Biomedical and Molecular Sciences and Center for Neuroscience Studies, Queen’s University, Kingston, Ontario, Canada
| | - Thomas Trappenberg
- Faculty of Computer Science, Dalhousie University, Halifax, Nova Scotia, Canada
| | - Gunnar Blohm
- Department of Biomedical and Molecular Sciences and Center for Neuroscience Studies, Queen’s University, Kingston, Ontario, Canada
| |
Collapse
|
44
|
Miyata R, Ota K, Aonishi T. Optimal design for hetero-associative memory: hippocampal CA1 phase response curve and spike-timing-dependent plasticity. PLoS One 2013; 8:e77395. [PMID: 24204822 PMCID: PMC3812027 DOI: 10.1371/journal.pone.0077395] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/08/2013] [Accepted: 09/02/2013] [Indexed: 11/29/2022] Open
Abstract
Recently reported experimental findings suggest that the hippocampal CA1 network stores spatio-temporal spike patterns and retrieves temporally reversed and spread-out patterns. In this paper, we explore the idea that the properties of the neural interactions and the synaptic plasticity rule in the CA1 network enable it to function as a hetero-associative memory recalling such reversed and spread-out spike patterns. In line with Lengyel’s speculation (Lengyel et al., 2005), we first derive optimally designed spike-timing-dependent plasticity (STDP) rules that are matched to neural interactions formalized in terms of phase response curves (PRCs) for performing the hetero-associative memory function. By maximizing objective functions formulated in terms of mutual information for evaluating memory retrieval performance, we search for STDP window functions that are optimal for retrieval of normal and doubly spread-out patterns under the constraint that the PRCs are those of CA1 pyramidal neurons. The system, which can retrieve normal and doubly spread-out patterns, can also retrieve reversed patterns with the same quality. Finally, we demonstrate that the purposely designed STDP window functions qualitatively conform to typical ones found in CA1 pyramidal neurons.
Collapse
Affiliation(s)
- Ryota Miyata
- Interdisciplinary Graduate School of Science and Engineering, Tokyo Institute of Technology, Kanagawa, Japan
- Research Fellow of the Japan Society for the Promotion of Science, Tokyo, Japan
| | - Keisuke Ota
- Brain Science Institute, RIKEN, Saitama, Japan
| | - Toru Aonishi
- Interdisciplinary Graduate School of Science and Engineering, Tokyo Institute of Technology, Kanagawa, Japan
| |
Collapse
|
45
|
Abstract
In pyramidal cells, the induction of spike-timing-dependent plasticity (STDP) follows a simple Hebbian rule in which the order of presynaptic and postsynaptic firing dictates the induction of LTP or LTD. In contrast, cortical fast-spiking (FS) interneurons, which control the rate and timing of pyramidal cell firing, reportedly express timing-dependent LTD, but not timing-dependent LTP. Because a mismatch in STDP rules could impact the maintenance of the excitation/inhibition balance, we examined the neuromodulation of STDP in FS cells of mouse visual cortex. We found that stimulation of adrenergic receptors enables the induction of Hebbian bidirectional STDP in FS cells in a manner consistent with a pull-push mechanism previously characterized in pyramidal cells. However, in pyramidal cells, STDP induction depends on NMDA receptors, whereas in FS cells it depends on mGluR5 receptors. We propose that neuromodulators control the polarity of STDP in different synapses in the same manner, and independently of the induction mechanism, by acting downstream in the plasticity cascade. By doing so, neuromodulators may allow coordinated plastic changes in FS and pyramidal cells.
Collapse
|
46
|
Yger P, Harris KD. The Convallis rule for unsupervised learning in cortical networks. PLoS Comput Biol 2013; 9:e1003272. [PMID: 24204224 PMCID: PMC3808450 DOI: 10.1371/journal.pcbi.1003272] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/01/2013] [Accepted: 08/28/2013] [Indexed: 01/26/2023] Open
Abstract
The phenomenology and cellular mechanisms of cortical synaptic plasticity are becoming known in increasing detail, but the computational principles by which cortical plasticity enables the development of sensory representations are unclear. Here we describe a framework for cortical synaptic plasticity termed the "Convallis rule", mathematically derived from a principle of unsupervised learning via constrained optimization. Implementation of the rule caused a recurrent cortex-like network of simulated spiking neurons to develop rate representations of real-world speech stimuli, enabling classification by a downstream linear decoder. Applied to spike patterns used in in vitro plasticity experiments, the rule reproduced multiple results including and beyond STDP. However, STDP alone produced poorer learning performance. The mathematical form of the rule is consistent with a dual coincidence detector mechanism that has been suggested by experiments in several synaptic classes of juvenile neocortex. Based on this confluence of normative, phenomenological, and mechanistic evidence, we suggest that the rule may approximate a fundamental computational principle of the neocortex.
Collapse
Affiliation(s)
- Pierre Yger
- UCL Institute of Neurology and UCL Department of Neuroscience, Physiology, and Pharmacology, London, United Kingdom
| | - Kenneth D. Harris
- UCL Institute of Neurology and UCL Department of Neuroscience, Physiology, and Pharmacology, London, United Kingdom
| |
Collapse
|
47
|
Wilson MT, Goodwin DP, Brownjohn PW, Shemmell J, Reynolds JNJ. Numerical modelling of plasticity induced by transcranial magnetic stimulation. J Comput Neurosci 2013; 36:499-514. [PMID: 24150916 DOI: 10.1007/s10827-013-0485-1] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/10/2013] [Revised: 09/09/2013] [Accepted: 10/02/2013] [Indexed: 10/26/2022]
Abstract
We use neural field theory and spike-timing dependent plasticity to make a simple but biophysically reasonable model of long-term plasticity changes in the cortex due to transcranial magnetic stimulation (TMS). We show how common TMS protocols can be captured and studied within existing neural field theory. Specifically, we look at repetitive TMS protocols such as theta burst stimulation and paired-pulse protocols. Continuous repetitive protocols result mostly in depression, but intermittent repetitive protocols mostly in potentiation. A paired-pulse protocol results in depression at short (<∼10 ms) and long (>∼100 ms) interstimulus intervals, but potentiation for mid-range intervals. The model is sensitive to the choice of neural populations that are driven by the TMS pulses, and to the parameters that describe plasticity, which may aid interpretation of the high variability in existing experimental results. Driving excitatory populations results in greater plasticity changes than driving inhibitory populations. Modelling also shows the merit in optimizing a TMS protocol based on an individual's electroencephalogram. Moreover, the model can be used to make predictions about protocols that may lead to improvements in repetitive TMS outcomes.
Collapse
Affiliation(s)
- M T Wilson
- School of Engineering, Faculty of Science and Engineering, University of Waikato, Private Bag 3105, Hamilton, 3240, New Zealand,
| |
Collapse
|
48
|
Pawlak V, Greenberg DS, Sprekeler H, Gerstner W, Kerr JND. Changing the responses of cortical neurons from sub- to suprathreshold using single spikes in vivo. eLife 2013; 2:e00012. [PMID: 23359858 PMCID: PMC3552422 DOI: 10.7554/elife.00012] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/04/2012] [Accepted: 11/29/2012] [Indexed: 11/13/2022] Open
Abstract
Action potential (AP) patterns of sensory cortex neurons encode a variety of stimulus features, but how can a neuron change the feature to which it responds? Here, we show that in vivo a spike-timing-dependent plasticity (STDP) protocol—consisting of pairing a postsynaptic AP with visually driven presynaptic inputs—modifies a neuron's AP response in a bidirectional way that depends on the relative AP timing during pairing. Whereas postsynaptic APs repeatedly following presynaptic activation can convert subthreshold into suprathreshold responses, APs repeatedly preceding presynaptic activation reduce AP responses to visual stimulation. These changes were paralleled by restructuring of the neuron's response to surround stimulus locations and of its membrane-potential time course. Computational simulations could reproduce the observed subthreshold voltage changes only when presynaptic temporal jitter was included. Together, this shows that STDP rules can modify the output patterns of sensory neurons and that the timing of single APs plays a crucial role in sensory coding and plasticity. DOI:http://dx.doi.org/10.7554/eLife.00012.001 Nerve cells, called neurons, are one of the core components of the brain and form complex networks by connecting to other neurons via long, thin ‘wire-like’ processes called axons. Axons can extend across the brain, enabling neurons to form connections—or synapses—with thousands of others. It is through these complex networks that incoming information from sensory organs, such as the eye, is propagated through the brain and encoded. The basic unit of communication between neurons is the action potential, often called a ‘spike’, which propagates along the network of axons and, through a chemical process at synapses, communicates with the postsynaptic neurons that the axon is connected to.
These action potentials excite the neuron that they arrive at, and this excitatory process can generate a new action potential that then propagates along the axon to excite additional target neurons. In the visual areas of the cortex, neurons respond with action potentials when they ‘recognize’ a particular feature in a scene—a process called tuning. How a neuron becomes tuned to certain features in the world and not to others is unclear, as are the rules that enable a neuron to change what it is tuned to. What is clear, however, is that to understand this process is to understand the basis of sensory perception. Memory storage and formation is thought to occur at synapses. The efficiency of signal transmission between neurons can increase or decrease over time, and this process is often referred to as synaptic plasticity. But for these synaptic changes to be transmitted to target neurons, the changes must alter the number of action potentials. Although it has been shown in vitro that the efficiency of synaptic transmission—that is the strength of the synapse—can be altered by changing the order in which the pre- and postsynaptic cells are activated (referred to as ‘Spike-timing-dependent plasticity’), this has never been shown to have an effect on the number of action potentials generated in a single neuron in vivo. It is therefore unknown whether this process is functionally relevant. Now Pawlak et al. report that spike-timing-dependent plasticity in the visual cortex of anaesthetized rats can change the spiking of neurons in the visual cortex. They used a visual stimulus (a bar flashed up for half a second) to activate a presynaptic cell, and triggered a single action potential in the postsynaptic cell a very short time later. By repeatedly activating the cells in this way, they increased the strength of the synaptic connection between the two neurons. 
After a small number of these pairing activations, presenting the visual stimulus alone to the presynaptic cell was enough to trigger an action potential (a suprathreshold response) in the postsynaptic neuron—even though this was not the case prior to the pairing. This study shows that timing rules known to change the strength of synaptic connections—and proposed to underlie learning and memory—have functional relevance in vivo, and that the timing of single action potentials can change the functional status of a cortical neuron. DOI:http://dx.doi.org/10.7554/eLife.00012.002
Collapse
Affiliation(s)
- Verena Pawlak
- Network Imaging Group , Max Planck Institute for Biological Cybernetics , Tübingen , Germany
| |
Collapse
|
49
|
Abstract
Metaplasticity, the adaptive change of long-term potentiation (LTP) and long-term depression (LTD) in response to fluctuations in neural activity, is well documented in visual cortex, where dark rearing shifts the frequency threshold for the induction of LTP and LTD. Here we studied metaplasticity affecting spike-timing-dependent plasticity, in which the polarity of plasticity is determined not by the stimulation frequency but by the temporal relationship between near-coincidental presynaptic and postsynaptic firing. We found that in mouse visual cortex the same regime of deprivation that restricts the frequency range for inducing rate-dependent LTD extends the integration window for inducing timing-dependent LTD, enabling LTD induction with random presynaptic and postsynaptic firing. Notably, the underlying mechanism for the changes in both rate-dependent and timing-dependent LTD appears to be an increase of NR2b-containing NMDARs at the synapse. Thus, the rules of metaplasticity might manifest in opposite directions, depending on the plasticity-induction paradigm.
Collapse
|
50
|
Sun XR, Badura A, Pacheco DA, Lynch LA, Schneider ER, Taylor MP, Hogue IB, Enquist LW, Murthy M, Wang SSH. Fast GCaMPs for improved tracking of neuronal activity. Nat Commun 2013; 4:2170. [PMID: 23863808 PMCID: PMC3824390 DOI: 10.1038/ncomms3170] [Citation(s) in RCA: 98] [Impact Index Per Article: 8.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/01/2013] [Accepted: 06/19/2013] [Indexed: 01/31/2023] Open
Abstract
The use of genetically encodable calcium indicator proteins to monitor neuronal activity is hampered by slow response times and a narrow Ca(2+)-sensitive range. Here we identify three performance-limiting features of GCaMP3, a popular genetically encodable calcium indicator protein. First, we find that affinity is regulated by the calmodulin domain's Ca(2+)-chelating residues. Second, we find that off-responses to Ca(2+) are rate-limited by dissociation of the RS20 domain from calmodulin's hydrophobic pocket. Third, we find that on-responses are limited by fast binding to the N-lobe at high Ca(2+) and by slow binding to the C-lobe at lower Ca(2+). We develop Fast-GCaMPs, which have up to 20-fold accelerated off-responses and show that they have a 200-fold range of K(D), allowing coexpression of multiple variants to span an expanded range of Ca(2+) concentrations. Finally, we show that Fast-GCaMPs track natural song in Drosophila auditory neurons and generate rapid responses in mammalian neurons, supporting the utility of our approach.
Collapse
Affiliation(s)
- Xiaonan R Sun
- Department of Molecular Biology, Princeton University, Princeton, New Jersey 08544, USA
- Neuroscience Institute, Princeton University, Princeton, New Jersey 08544, USA
| | - Aleksandra Badura
- Department of Molecular Biology, Princeton University, Princeton, New Jersey 08544, USA
- Neuroscience Institute, Princeton University, Princeton, New Jersey 08544, USA
| | - Diego A Pacheco
- Department of Molecular Biology, Princeton University, Princeton, New Jersey 08544, USA
- Neuroscience Institute, Princeton University, Princeton, New Jersey 08544, USA
| | - Laura A Lynch
- Department of Molecular Biology, Princeton University, Princeton, New Jersey 08544, USA
- Neuroscience Institute, Princeton University, Princeton, New Jersey 08544, USA
| | - Eve R Schneider
- Department of Molecular Biology, Princeton University, Princeton, New Jersey 08544, USA
- Neuroscience Institute, Princeton University, Princeton, New Jersey 08544, USA
| | - Matthew P Taylor
- Department of Molecular Biology, Princeton University, Princeton, New Jersey 08544, USA
- Neuroscience Institute, Princeton University, Princeton, New Jersey 08544, USA
| | - Ian B Hogue
- Department of Molecular Biology, Princeton University, Princeton, New Jersey 08544, USA
- Neuroscience Institute, Princeton University, Princeton, New Jersey 08544, USA
| | - Lynn W Enquist
- Department of Molecular Biology, Princeton University, Princeton, New Jersey 08544, USA
- Neuroscience Institute, Princeton University, Princeton, New Jersey 08544, USA
| | - Mala Murthy
- Department of Molecular Biology, Princeton University, Princeton, New Jersey 08544, USA
- Neuroscience Institute, Princeton University, Princeton, New Jersey 08544, USA
| | - Samuel S-H Wang
- Department of Molecular Biology, Princeton University, Princeton, New Jersey 08544, USA
- Neuroscience Institute, Princeton University, Princeton, New Jersey 08544, USA
| |
Collapse
|