101
Xu X, Cang J, Riecke H. Development and binocular matching of orientation selectivity in visual cortex: a computational model. J Neurophysiol 2020; 123:1305-1319. PMID: 31913758. DOI: 10.1152/jn.00386.2019.
Abstract
In mouse visual cortex, binocular cells immediately after eye opening have different preferred orientations for input from the two eyes. With normal visual experience during a critical period, these preferred orientations evolve and eventually become well matched. To gain insight into the matching process, we developed a computational model of a cortical cell receiving orientation-selective inputs via plastic synapses. The model captures the experimentally observed matching of the preferred orientations, the dependence of matching on the ocular dominance of the cell, and the relationship between the degree of matching and the resulting monocular orientation selectivity. Moreover, our model puts forward testable predictions: 1) The matching speed increases with initial ocular dominance. 2) While the matching improves more slowly for cells that are more orientation selective, the selectivity increases faster for better-matched cells during the matching process. This suggests that matching drives orientation selectivity but not vice versa. 3) There are two main routes to matching: the preferred orientations either drift toward each other, or one of the orientations switches suddenly. The latter occurs for cells with large initial mismatch and can render the cells monocular. We expect that these results provide insight more generally into the development of neuronal systems that integrate inputs from multiple sources, including different sensory modalities.

NEW & NOTEWORTHY Animals gather information through multiple modalities (vision, audition, touch, etc.). These information streams have to be merged coherently to provide a meaningful representation of the world. Thus, for neurons in visual cortex V1, the orientation selectivities for inputs from the two eyes have to match to enable binocular vision. We analyze the postnatal process underlying this matching using computational modeling. The model captures recent experimental results and reveals the interdependence between matching, ocular dominance, and orientation selectivity.
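The drift route to matching (prediction 3 above) can be illustrated with a toy gradient dynamic on the orientation circle (period 180°). This is a schematic sketch, not the authors' model; the learning rate `eta` and the `-cos(2d)` potential are invented for illustration:

```python
import math

def evolve_mismatch(d0, eta=0.05, steps=2000):
    """Toy drift dynamics: the mismatch d (radians) between the two eyes'
    preferred orientations relaxes down the gradient of -cos(2*d),
    respecting the period-pi symmetry of orientation."""
    d = d0
    for _ in range(steps):
        d -= eta * math.sin(2.0 * d)
    return d

small = evolve_mismatch(math.radians(60))  # moderate mismatch: drifts to zero
edge = evolve_mismatch(math.pi / 2)        # 90 deg mismatch sits at the unstable fixed point
```

In this sketch, mismatches below 90° drift smoothly to zero, while 90° is an unstable fixed point; the sudden-switch route the abstract describes would require noise or competing inputs to push the system past that point.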
Affiliation(s)
- Xize Xu
- Department of Engineering Science and Applied Mathematics, Northwestern University, Evanston, Illinois
- Jianhua Cang
- Department of Biology and Department of Psychology, University of Virginia, Charlottesville, Virginia
- Hermann Riecke
- Department of Engineering Science and Applied Mathematics, Northwestern University, Evanston, Illinois
102
Frequency cluster formation and slow oscillations in neural populations with plasticity. PLoS One 2019; 14:e0225094. PMID: 31725782. PMCID: PMC6855470. DOI: 10.1371/journal.pone.0225094.
Abstract
We report the phenomenon of frequency clustering in a network of Hodgkin-Huxley neurons with spike timing-dependent plasticity. The clustering leads to a splitting of a neural population into a few groups synchronized at different frequencies. In this regime, the amplitude of the mean field undergoes low-frequency modulations, which may contribute to the mechanism of the emergence of slow oscillations of neural activity observed in spectral power of local field potentials or electroencephalographic signals at high frequencies. In addition to numerical simulations of such multi-clusters, we investigate the mechanisms of the observed phenomena using the simplest case of two clusters. In particular, we propose a phenomenological model which describes the dynamics of two clusters taking into account the adaptation of coupling weights. We also determine the set of plasticity functions (update rules), which lead to multi-clustering.
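The spike-timing-dependent plasticity driving such weight adaptation is commonly modeled with an asymmetric exponential window. A generic pair-based sketch (the amplitudes and time constants here are illustrative, not the values used in the paper):

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a pre/post spike pair separated by dt = t_post - t_pre (ms).

    Positive dt (pre fires before post) potentiates; negative dt depresses,
    with exponentially decaying influence of the pairing interval.
    """
    if dt >= 0:
        return a_plus * math.exp(-dt / tau_plus)
    return -a_minus * math.exp(dt / tau_minus)
```

Varying the shape of this update rule (e.g., the asymmetry between `a_plus` and `a_minus`) is what the paper's final analysis of "plasticity functions" explores when determining which rules produce multi-clustering.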
103
Gastaldi C, Muscinelli S, Gerstner W. Optimal Stimulation Protocol in a Bistable Synaptic Consolidation Model. Front Comput Neurosci 2019; 13:78. PMID: 31798436. PMCID: PMC6874130. DOI: 10.3389/fncom.2019.00078.
Abstract
Synaptic changes induced by neural activity need to be consolidated to maintain memory over a timescale of hours. In experiments, synaptic consolidation can be induced by repeating a stimulation protocol several times, and the effectiveness of consolidation depends crucially on the repetition frequency of the stimulations. We address the question: is there an understandable reason why induction protocols with repetitions at some frequency work better than sustained protocols, even though the accumulated stimulation strength might be exactly the same in both cases? In real synapses, plasticity occurs on multiple time scales, from seconds (induction) to several minutes (early phase of long-term potentiation) to hours and days (late phase of synaptic consolidation). We use a simplified mathematical model of just two time scales to elucidate the above question in a purified setting. Our mathematical results show that, even in such a simple model, the repetition frequency of stimulation plays an important role for the successful induction, and stabilization, of potentiation.
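The two-timescale structure described here can be illustrated with a minimal pair of equations: a fast synaptic efficacy `w` that decays unless stimulated, driving a slow, bistable consolidation variable `z` (stable states 0 and 1). This is a schematic reduction, not the authors' model; all parameter values are invented for illustration:

```python
def simulate(stim_duration, T=60.0, dt=0.01, tau_w=2.0, c=0.1):
    """Euler-integrate fast weight w and slow bistable variable z.

    z obeys dz/dt = z(1-z)(z-0.5) + c*w: the cubic makes 0 and 1 stable,
    so consolidation (z -> 1) occurs only if w stays elevated long enough
    to push z past the unstable point at 0.5.
    """
    w, z, t = 0.0, 0.0, 0.0
    while t < T:
        stim = 1.0 if t < stim_duration else 0.0   # external induction drive
        w += dt * (stim - w / tau_w)
        w = min(w, 1.0)                            # saturate the fast weight
        z += dt * (z * (1.0 - z) * (z - 0.5) + c * w)
        t += dt
    return z

# Sustained induction crosses the barrier and consolidates; a brief pulse does not.
z_long, z_short = simulate(40.0), simulate(1.0)
```

The threshold behavior of `z` is the toy analogue of why stimulation protocols must keep the early-phase trace elevated long enough for late-phase consolidation to take hold.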
Affiliation(s)
- Chiara Gastaldi
- School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Samuel Muscinelli
- School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
104
Inhibitory microcircuits for top-down plasticity of sensory representations. Nat Commun 2019; 10:5055. PMID: 31699994. PMCID: PMC6838080. DOI: 10.1038/s41467-019-12972-2.
Abstract
Rewards influence plasticity of early sensory representations, but the underlying changes in circuitry are unclear. Recent experimental findings suggest that inhibitory circuits regulate learning. In addition, inhibitory neurons are highly modulated by diverse long-range inputs, including reward signals. We, therefore, hypothesise that inhibitory plasticity plays a major role in adjusting stimulus representations. We investigate how top-down modulation by rewards interacts with local plasticity to induce long-lasting changes in circuitry. Using a computational model of layer 2/3 primary visual cortex, we demonstrate how interneuron circuits can store information about rewarded stimuli to instruct long-term changes in excitatory connectivity in the absence of further reward. In our model, stimulus-tuned somatostatin-positive interneurons develop strong connections to parvalbumin-positive interneurons during reward such that they selectively disinhibit the pyramidal layer henceforth. This triggers excitatory plasticity, leading to increased stimulus representation. We make specific testable predictions and show that this two-stage model allows for translation invariance of the learned representation.

Rewards can improve stimulus processing in early sensory areas but the underlying neural circuit mechanisms are unknown. Here, the authors build a computational model of layer 2/3 primary visual cortex and suggest that plastic inhibitory circuits change first and then increase excitatory representations beyond the presence of rewards.
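The disinhibitory logic (SST inhibits PV, PV inhibits pyramidal cells, so strengthening SST→PV raises pyramidal rates) can be checked with a tiny linear rate sketch. This is a schematic of the sign structure only, with invented weights, not the paper's network:

```python
def steady_pyr_rate(w_sst_pv, stim=1.0):
    """Fixed point of a three-population rate sketch: the stimulus drives SST
    and Pyr; PV inhibits Pyr; SST inhibits PV. A stronger SST->PV weight
    means less PV inhibition and hence a higher pyramidal rate."""
    r_sst = stim
    r_pv = max(0.0, 1.0 - w_sst_pv * r_sst)  # baseline PV drive of 1, inhibited by SST
    r_pyr = max(0.0, stim - 0.5 * r_pv)      # Pyr: feedforward drive minus PV inhibition
    return r_pyr

before, after = steady_pyr_rate(0.2), steady_pyr_rate(0.8)  # weak vs. reward-potentiated SST->PV
```

Strengthening only the SST→PV weight (the plastic synapse in the model's first stage) is enough to raise the pyramidal steady state, consistent with the disinhibition mechanism described above.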
105
From space to time: Spatial inhomogeneities lead to the emergence of spatiotemporal sequences in spiking neuronal networks. PLoS Comput Biol 2019; 15:e1007432. PMID: 31652259. PMCID: PMC6834288. DOI: 10.1371/journal.pcbi.1007432.
Abstract
Spatio-temporal sequences of neuronal activity are observed in many brain regions in a variety of tasks and are thought to form the basis of meaningful behavior. However, the mechanisms by which a neuronal network can generate spatio-temporal activity sequences have remained obscure. Existing models are biologically untenable because they require either manual embedding of a feedforward network within a random network or supervised learning to train the connectivity of a network to generate sequences. Here, we propose a biologically plausible, generative rule to create spatio-temporal activity sequences in a network of spiking neurons with distance-dependent connectivity. We show that the emergence of spatio-temporal activity sequences requires two features: (1) individual neurons preferentially project a small fraction of their axons in a specific direction, and (2) the preferential projection directions of neighboring neurons are similar. Thus, an anisotropic but correlated connectivity of neuron groups suffices to generate spatio-temporal activity sequences in an otherwise random neuronal network model.
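The two ingredients named here (a small directional bias per neuron and spatial correlation of those biases) can be sketched as a connectivity-generation rule on a 1D ring. This is a minimal illustration with invented parameters, not the paper's 2D network:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200

# Ingredient (2): locally correlated projection biases, made by smoothing
# a random field with a Gaussian kernel (wrap-around padding for the ring).
raw = rng.standard_normal(N)
kernel = np.exp(-0.5 * (np.arange(-20, 21) / 5.0) ** 2)
shift = np.convolve(np.concatenate([raw[-20:], raw, raw[:20]]), kernel, mode="valid")
shift = 4.0 * shift / np.abs(shift).max()  # projection offset, in units of neurons

# Ingredient (1): distance-dependent connection probability whose center is
# displaced from neuron i by shift[i], giving each neuron an anisotropic bias.
W = np.zeros((N, N))
for i in range(N):
    d = (np.arange(N) - i - shift[i] + N / 2) % N - N / 2  # circular distance to shifted center
    p = 0.3 * np.exp(-0.5 * (d / 3.0) ** 2)
    W[i] = rng.random(N) < p
np.fill_diagonal(W, 0)
```

The resulting matrix is locally dense but directionally asymmetric, which is the structural property the paper identifies as sufficient for activity sequences to propagate.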
106
Taherkhani A, Belatreche A, Li Y, Cosma G, Maguire LP, McGinnity TM. A review of learning in biologically plausible spiking neural networks. Neural Netw 2019; 122:253-272. PMID: 31726331. DOI: 10.1016/j.neunet.2019.09.036.
Abstract
Artificial neural networks have been used as a powerful processing tool in various areas such as pattern recognition, control, robotics, and bioinformatics. Their wide applicability has encouraged researchers to improve artificial neural networks by investigating the biological brain. Neurological research has significantly progressed in recent years and continues to reveal new characteristics of biological neurons. New technologies can now capture temporal changes in the internal activity of the brain in more detail and help clarify the relationship between brain activity and the perception of a given stimulus. This new knowledge has led to a new type of artificial neural network, the Spiking Neural Network (SNN), that draws more faithfully on biological properties to provide higher processing abilities. A review of recent developments in learning of spiking neurons is presented in this paper. First the biological background of SNN learning algorithms is reviewed. The important elements of a learning algorithm such as the neuron model, synaptic plasticity, information encoding and SNN topologies are then presented. Then, a critical review of the state-of-the-art learning algorithms for SNNs using single and multiple spikes is presented. Additionally, deep spiking neural networks are reviewed, and challenges and opportunities in the SNN field are discussed.
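The neuron model is one of the review's core building blocks; the most widely used choice in SNN work is the leaky integrate-and-fire (LIF) neuron. A minimal Euler-integrated sketch (parameter values are generic textbook choices, not taken from the review):

```python
def lif_spikes(i_ext, T=200.0, dt=0.1, tau=20.0,
               v_rest=-70.0, v_th=-54.0, v_reset=-70.0, r=10.0):
    """Leaky integrate-and-fire: tau * dV/dt = -(V - v_rest) + R * I_ext.

    The membrane potential leaks toward v_rest, is driven by the input
    current, and emits a spike (then resets) whenever it crosses v_th.
    Returns the list of spike times (ms).
    """
    v, spikes, t = v_rest, [], 0.0
    while t < T:
        v += dt / tau * (-(v - v_rest) + r * i_ext)
        if v >= v_th:
            spikes.append(t)
            v = v_reset
        t += dt
    return spikes
```

With this parameterization the neuron has a sharp rheobase: a drive whose steady-state potential stays below threshold produces no spikes at all, while slightly stronger drive produces regular firing, the all-or-none temporal coding that distinguishes SNNs from rate-based artificial networks.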
Affiliation(s)
- Aboozar Taherkhani
- School of Computer Science and Informatics, Faculty of Computing, Engineering and Media, De Montfort University, Leicester, UK
- Ammar Belatreche
- Department of Computer and Information Sciences, Northumbria University, Newcastle upon Tyne, UK
- Yuhua Li
- School of Computer Science and Informatics, Cardiff University, Cardiff, UK
- Georgina Cosma
- Department of Computer Science, Loughborough University, Loughborough, UK
- Liam P Maguire
- Intelligent Systems Research Centre, Ulster University, Derry, Northern Ireland, UK
- T M McGinnity
- Intelligent Systems Research Centre, Ulster University, Derry, Northern Ireland, UK; School of Science and Technology, Nottingham Trent University, Nottingham, UK
107
Berner R, Fialkowski J, Kasatkin D, Nekorkin V, Yanchuk S, Schöll E. Hierarchical frequency clusters in adaptive networks of phase oscillators. Chaos 2019; 29:103134. PMID: 31675820. DOI: 10.1063/1.5097835.
Abstract
Adaptive dynamical networks appear in various real-world systems. One of the simplest phenomenological models for investigating basic properties of adaptive networks is a system of coupled phase oscillators with adaptive couplings. In this paper, we investigate the dynamics of this system. We extend recent results on the appearance of hierarchical frequency multiclusters by investigating the effect of time scale separation. We show that slow adaptation, in comparison with the fast phase dynamics, is necessary for the emergence of the multiclusters and their stability. Additionally, we study the role of double antipodal clusters, which appear to be unstable for all considered parameter values. We show that such states can nevertheless be observed for a relatively long time, i.e., they are metastable. A geometrical explanation for this effect is based on the emergence of a heteroclinic orbit.
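The model class studied here, phase oscillators whose couplings slowly adapt to the phase differences, can be sketched as follows. The equations are the standard form used in this line of work; the specific parameter values, and the choice of phase-lag parameters `alpha` and `beta` (which select the adaptation rule), are illustrative:

```python
import numpy as np

def adaptive_kuramoto(N=10, steps=2000, dt=0.01, eps=0.01,
                      alpha=0.0, beta=-np.pi / 2, seed=1):
    """Coupled phase oscillators with adaptive couplings:

        d(phi_i)/dt   = omega_i - (1/N) * sum_j kappa_ij * sin(phi_i - phi_j + alpha)
        d(kappa_ij)/dt = -eps * (kappa_ij + sin(phi_i - phi_j + beta))

    eps << 1 enforces the slow adaptation / fast phase time-scale separation.
    """
    rng = np.random.default_rng(seed)
    phi = rng.uniform(0.0, 2.0 * np.pi, N)
    omega = rng.uniform(-0.5, 0.5, N)
    kappa = rng.uniform(-1.0, 1.0, (N, N))
    for _ in range(steps):
        dphi = phi[:, None] - phi[None, :]                    # phi_i - phi_j
        coupling = (kappa * np.sin(dphi + alpha)).mean(axis=1)
        phi = (phi + dt * (omega - coupling)) % (2.0 * np.pi)
        kappa += dt * (-eps) * (kappa + np.sin(dphi + beta))  # slow adaptation
    return phi, kappa

phi, kappa = adaptive_kuramoto()
```

Because each coupling is pulled toward `-sin(dphi + beta)`, the weights stay bounded in [-1, 1]; the multicluster states discussed in the abstract emerge for small `eps`, when the phases equilibrate much faster than the couplings adapt.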
Affiliation(s)
- Rico Berner
- Institute of Theoretical Physics, Technische Universität Berlin, Hardenbergstr. 36, D-10623 Berlin, Germany
- Jan Fialkowski
- Institute of Theoretical Physics, Technische Universität Berlin, Hardenbergstr. 36, D-10623 Berlin, Germany
- Dmitry Kasatkin
- Institute of Applied Physics of RAS, 46 Ul'yanov Street, 603950 Nizhny Novgorod, Russia
- Vladimir Nekorkin
- Institute of Applied Physics of RAS, 46 Ul'yanov Street, 603950 Nizhny Novgorod, Russia
- Serhiy Yanchuk
- Institute of Mathematics, Technische Universität Berlin, Strasse des 17. Juni 136, D-10623 Berlin, Germany
- Eckehard Schöll
- Institute of Theoretical Physics, Technische Universität Berlin, Hardenbergstr. 36, D-10623 Berlin, Germany
108
Busch SE, Khakhalin AS. Intrinsic temporal tuning of neurons in the optic tectum is shaped by multisensory experience. J Neurophysiol 2019; 122:1084-1096. PMID: 31291161. DOI: 10.1152/jn.00099.2019.
Abstract
For a biological neural network to be functional, its neurons need to be connected with synapses of appropriate strength, and each neuron needs to appropriately respond to its synaptic inputs. This second aspect of network tuning is maintained by intrinsic plasticity; yet it is often considered secondary to changes in connectivity and mostly limited to adjustments of overall excitability of each neuron. Here we argue that even nonoscillatory neurons can be tuned to inputs of different temporal dynamics and that they can routinely adjust this tuning to match the statistics of their synaptic activation. Using the dynamic clamp technique, we show that, in the tectum of Xenopus tadpole, neurons become selective for faster inputs when animals are exposed to fast visual stimuli but remain responsive to longer inputs in animals exposed to slower, looming, or multisensory stimulation. We also report a homeostatic cotuning between synaptic and intrinsic temporal properties of individual tectal cells. These results expand our understanding of intrinsic plasticity in the brain and suggest that there may exist an additional dimension of network tuning that has been so far overlooked.

NEW & NOTEWORTHY We use dynamic clamp to show that individual neurons in the tectum of Xenopus tadpoles are selectively tuned to either shorter (more synchronous) or longer (less synchronous) synaptic inputs. We also demonstrate that this intrinsic temporal tuning is strongly shaped by sensory experiences. This new phenomenon, which is likely to be mediated by changes in sodium channel inactivation, is bound to have important consequences for signal processing and the development of local recurrent connections.
Affiliation(s)
- Silas E Busch
- Biology Program, Bard College, Annandale-on-Hudson, New York
109
Marshel JH, Kim YS, Machado TA, Quirin S, Benson B, Kadmon J, Raja C, Chibukhchyan A, Ramakrishnan C, Inoue M, Shane JC, McKnight DJ, Yoshizawa S, Kato HE, Ganguli S, Deisseroth K. Cortical layer-specific critical dynamics triggering perception. Science 2019; 365:eaaw5202. PMID: 31320556. PMCID: PMC6711485. DOI: 10.1126/science.aaw5202.
Abstract
Perceptual experiences may arise from neuronal activity patterns in mammalian neocortex. We probed mouse neocortex during visual discrimination using a red-shifted channelrhodopsin (ChRmine, discovered through structure-guided genome mining) alongside multiplexed multiphoton-holography (MultiSLM), achieving control of individually specified neurons spanning large cortical volumes with millisecond precision. Stimulating a critical number of stimulus-orientation-selective neurons drove widespread recruitment of functionally related neurons, a process enhanced by (but not requiring) orientation-discrimination task learning. Optogenetic targeting of orientation-selective ensembles elicited correct behavioral discrimination. Cortical layer-specific dynamics were apparent, as emergent neuronal activity asymmetrically propagated from layer 2/3 to layer 5, and smaller layer 5 ensembles were as effective as larger layer 2/3 ensembles in eliciting orientation discrimination behavior. Population dynamics emerging after optogenetic stimulation both correctly predicted behavior and resembled natural internal representations of visual stimuli at cellular resolution over volumes of cortex.
Affiliation(s)
- James H Marshel
- CNC Department, Stanford University, Stanford, CA 94305, USA
- Yoon Seok Kim
- Department of Bioengineering, Stanford University, Stanford, CA 94305, USA
- Timothy A Machado
- CNC Department, Stanford University, Stanford, CA 94305, USA
- Department of Bioengineering, Stanford University, Stanford, CA 94305, USA
- Sean Quirin
- CNC Department, Stanford University, Stanford, CA 94305, USA
- Brandon Benson
- Department of Applied Physics, Stanford University, Stanford, CA 94305, USA
- Jonathan Kadmon
- Department of Applied Physics, Stanford University, Stanford, CA 94305, USA
- Cephra Raja
- Department of Bioengineering, Stanford University, Stanford, CA 94305, USA
- Charu Ramakrishnan
- Department of Bioengineering, Stanford University, Stanford, CA 94305, USA
- Masatoshi Inoue
- Department of Bioengineering, Stanford University, Stanford, CA 94305, USA
- Susumu Yoshizawa
- Department of Natural Environmental Studies, Graduate School of Frontier Sciences, University of Tokyo, Kashiwa 277-8564, Japan
- Hideaki E Kato
- Department of Molecular and Cellular Physiology, Stanford University, Stanford, CA 94305, USA
- Surya Ganguli
- Department of Applied Physics, Stanford University, Stanford, CA 94305, USA
- Karl Deisseroth
- CNC Department, Stanford University, Stanford, CA 94305, USA
- Department of Bioengineering, Stanford University, Stanford, CA 94305, USA
- Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, CA 94305, USA
- Howard Hughes Medical Institute, Stanford University, Stanford, CA 94305, USA
110
van Gils T, Tiesinga PHE, Englitz B, Martens MB. Sensitivity to Stimulus Irregularity Is Inherent in Neural Networks. Neural Comput 2019; 31:1789-1824. PMID: 31335294. DOI: 10.1162/neco_a_01215.
Abstract
Behavior is controlled by complex neural networks in which neurons process thousands of inputs. However, even short spike trains evoked in a single cortical neuron were demonstrated to be sufficient to influence behavior in vivo. Specifically, irregular sequences of interspike intervals (ISIs) had a more reliable influence on behavior despite their resemblance to stochastic activity. Similarly, irregular tactile stimulation led to higher rates of behavioral responses. In this study, we identify the mechanisms enabling this sensitivity to stimulus irregularity (SSI) on the neuronal and network levels using simulated spiking neural networks. Matching in vivo experiments, we find that irregular stimulation elicits more detectable network events (bursts) than regular stimulation. Dissecting the stimuli, we identify short ISIs, which occur more frequently in irregular stimulation, as the main drivers of SSI, rather than complex irregularity per se. In addition, we find that short-term plasticity modulates SSI. We subsequently eliminate the different mechanisms in turn to assess their role in generating SSI. Removing inhibitory interneurons, we find that SSI is retained, suggesting that SSI is not dependent on inhibition. Removing recurrency, we find that SSI is retained due to the ability of individual neurons to integrate activity over short timescales ("cell memory"). Removing single-neuron dynamics, we find that SSI is retained based on the short-term retention of activity within the recurrent network structure ("network memory"). Finally, using a further simplified probabilistic model, we find that local network structure is not required for SSI. Hence, SSI is identified as a general property that we hypothesize to be ubiquitous in neural networks with different structures and biophysical properties. Irregular sequences contain shorter ISIs, which are the main drivers underlying SSI. The experimentally observed SSI should thus generalize to other systems, suggesting a functional role for irregular activity in cortex.
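The statistical claim at the heart of this abstract, that irregular trains contain more short ISIs than regular trains of the same mean rate, is easy to verify directly. A quick sketch comparing a perfectly regular train with an exponential (Poisson-like) ISI distribution at the same rate (the rate and threshold values are arbitrary choices for illustration):

```python
import random

def count_short_isis(isis, threshold=0.01):
    """Number of inter-spike intervals shorter than threshold (seconds)."""
    return sum(isi < threshold for isi in isis)

random.seed(0)
rate = 20.0                                 # spikes per second
n = 1000
regular = [1.0 / rate] * n                  # perfectly regular: every ISI is 50 ms
irregular = [random.expovariate(rate) for _ in range(n)]  # same mean ISI, exponential

short_reg = count_short_isis(regular)       # zero: no ISI is below 10 ms
short_irr = count_short_isis(irregular)     # many: P(ISI < 10 ms) = 1 - exp(-0.2), about 18%
```

Both trains deliver the same average number of spikes, but only the irregular one contains the brief, near-coincident intervals that the study identifies as the drivers of burst detection.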
Affiliation(s)
- Teun van Gils
- Department of Neuroinformatics and Department of Neurophysiology, Donders Institute for Brain, Cognition, and Behaviour, 6525 AJ Nijmegen, Gelderland, The Netherlands
- Paul H E Tiesinga
- Department of Neuroinformatics, Donders Institute for Brain, Cognition, and Behaviour, 6525 AJ Nijmegen, Gelderland, The Netherlands
- Bernhard Englitz
- Department of Neurophysiology, Donders Institute for Brain, Cognition, and Behaviour, 6525 AJ Nijmegen, Gelderland, The Netherlands
- Marijn B Martens
- Department of Neuroinformatics, Donders Institute for Brain, Cognition, and Behaviour, 6525 AJ Nijmegen, Gelderland, The Netherlands
111
Sammons RP, Clopath C, Barnes SJ. Size-Dependent Axonal Bouton Dynamics following Visual Deprivation In Vivo. Cell Rep 2018; 22:576-584. PMID: 29346758. PMCID: PMC5792425. DOI: 10.1016/j.celrep.2017.12.065.
Abstract
Persistent synapses are thought to underpin the storage of sensory experience, yet little is known about their structural plasticity in vivo. We investigated how persistent presynaptic structures respond to the loss of primary sensory input. Using in vivo two-photon (2P) imaging, we measured fluctuations in the size of excitatory axonal boutons in L2/3 of adult mouse visual cortex after monocular enucleation. The average size of boutons did not change after deprivation, but the range of bouton sizes was reduced. Large boutons decreased, and small boutons increased. Reduced bouton variance was accompanied by a reduced range of correlated calcium-mediated neural activity in L2/3 of awake animals. Network simulations predicted that size-dependent plasticity may promote conditions of greater bidirectional plasticity. These predictions were supported by electrophysiological measures of short- and long-term plasticity. We propose size-dependent dynamics facilitate cortical reorganization by maximizing the potential for bidirectional plasticity.

Highlights:
- The range of persistent axonal bouton sizes is reduced following visual deprivation
- Bouton sizes move toward the mean in a size-dependent manner
- Bouton plasticity is accompanied by a reduced range of correlated network activity
- Deprived cortex exhibits greater bidirectional functional presynaptic plasticity
Affiliation(s)
- Rosanna P Sammons
- Department of Neuroscience, Physiology and Pharmacology, University College London, 21 University St., London WC1E 6DE, UK
- Claudia Clopath
- Department of Biomedical Engineering, Imperial College London, South Kensington Campus, London SW7 2AZ, UK
- Samuel J Barnes
- Division of Brain Sciences, Department of Medicine, Imperial College London, Hammersmith Hospital Campus, Du Cane Road, London W12 0NN, UK
112
Kulkarni MR, John RA, Tiwari N, Nirmal A, Ng SE, Nguyen AC, Mathews N. Field-Driven Athermal Activation of Amorphous Metal Oxide Semiconductors for Flexible Programmable Logic Circuits and Neuromorphic Electronics. Small 2019; 15:e1901457. PMID: 31120199. DOI: 10.1002/smll.201901457.
Abstract
Despite extensive research, large-scale realization of metal-oxide electronics is still impeded by high-temperature fabrication, incompatible with flexible substrates. Ideally, an athermal treatment modifying the electronic structure of amorphous metal oxide semiconductors (AMOS) to generate sufficient carrier concentration would help mitigate such high-temperature requirements, enabling realization of high-performance electronics on flexible substrates. Here, a novel field-driven athermal activation of AMOS channels is demonstrated via an electrolyte-gating approach. By facilitating the migration of charged oxygen species across the semiconductor-dielectric interface, this approach modulates the local electronic structure of the channel, generating sufficient carriers for charge transport and activating oxygen-compensated thin films. The thin-film transistors (TFTs) investigated here show an enhancement of linear mobility from 51 to 105.25 cm2 V-1 s-1 (ionic-gated) and from 8.09 to 14.49 cm2 V-1 s-1 (back-gated) through the creation of additional oxygen vacancies. The accompanying stoichiometric transformations, monitored via X-ray photoelectron spectroscopy, corroborate the detailed electrical (TFT, current evolution) parameter analyses, providing critical insights into the underlying oxygen-vacancy generation mechanism and clearly demonstrating field-induced activation as a promising alternative to conventional high-temperature annealing strategies. By enabling on-demand active programming of transistor operation modes (enhancement vs. depletion), this technique paves the way for facile fabrication of logic circuits and neuromorphic transistors for bioinspired computing.
Affiliation(s)
- Mohit Rameshchandra Kulkarni
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Rohit Abraham John
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Nidhi Tiwari
- Energy Research Institute @ NTU (ERI@N), Nanyang Technological University, Singapore, 637553, Singapore
- Amoolya Nirmal
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Si En Ng
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Anh Chien Nguyen
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Nripan Mathews
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Energy Research Institute @ NTU (ERI@N), Nanyang Technological University, Singapore, 637553, Singapore
113
Deger M, Seeholzer A, Gerstner W. Multicontact Co-operativity in Spike-Timing-Dependent Structural Plasticity Stabilizes Networks. Cereb Cortex 2018; 28:1396-1415. PMID: 29300903. PMCID: PMC6041941. DOI: 10.1093/cercor/bhx339.
Abstract
Excitatory synaptic connections in the adult neocortex consist of multiple synaptic contacts, almost exclusively formed on dendritic spines. Changes of spine volume, a correlate of synaptic strength, can be tracked in vivo for weeks. Here, we present a combined model of structural and spike-timing–dependent plasticity that explains the multicontact configuration of synapses in adult neocortical networks under steady-state and lesion-induced conditions. Our plasticity rule with Hebbian and anti-Hebbian terms stabilizes both the postsynaptic firing rate and correlations between the pre- and postsynaptic activity at an active synaptic contact. Contacts appear spontaneously at a low rate and disappear if their strength approaches zero. Many presynaptic neurons compete to make strong synaptic connections onto a postsynaptic neuron, whereas the synaptic contacts of a given presynaptic neuron co-operate via postsynaptic firing. We find that co-operation of multiple synaptic contacts is crucial for stable, long-term synaptic memories. In simulations of a simplified network model of barrel cortex, our plasticity rule reproduces whisker-trimming–induced rewiring of thalamocortical and recurrent synaptic connectivity on realistic time scales.
Affiliation(s)
- Moritz Deger
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
- Institute for Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, 50674 Cologne, Germany
- Alexander Seeholzer
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
114
|
Nicola W, Clopath C. A diversity of interneurons and Hebbian plasticity facilitate rapid compressible learning in the hippocampus. Nat Neurosci 2019; 22:1168-1181. [PMID: 31235906 DOI: 10.1038/s41593-019-0415-2] [Citation(s) in RCA: 39] [Impact Index Per Article: 7.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/19/2018] [Accepted: 04/23/2019] [Indexed: 11/09/2022]
Abstract
The hippocampus is able to rapidly learn incoming information, even if that information is only observed once. Furthermore, this information can be replayed in a compressed format in either forward or reverse modes during sharp wave-ripples (SPW-Rs). We leveraged state-of-the-art techniques in training recurrent spiking networks to demonstrate how primarily interneuron networks can achieve the following: (1) generate internal theta sequences to bind externally elicited spikes in the presence of inhibition from the medial septum; (2) compress learned spike sequences in the form of a SPW-R when septal inhibition is removed; (3) generate and refine high-frequency assemblies during SPW-R-mediated compression; and (4) regulate the inter-SPW interval timing between SPW-Rs in ripple clusters. From the fast timescale of neurons to the slow timescale of behaviors, interneuron networks serve as the scaffolding for one-shot learning by replaying, reversing, refining, and regulating spike sequences.
Affiliation(s)
- Wilten Nicola
- Department of Bioengineering, Imperial College London, London, UK
- Claudia Clopath
- Department of Bioengineering, Imperial College London, London, UK
|
115
|
Clancy KJ, Baisley SK, Albizu A, Kartvelishvili N, Ding M, Li W. Lasting connectivity increase and anxiety reduction via transcranial alternating current stimulation. Soc Cogn Affect Neurosci 2019; 13:1305-1316. [PMID: 30380131 PMCID: PMC6277743 DOI: 10.1093/scan/nsy096] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/08/2018] [Accepted: 10/28/2018] [Indexed: 12/12/2022] Open
Abstract
Growing evidence of transcranial alternating current stimulation (tACS) modulating intrinsic neural oscillations has spawned interest in applying tACS to treat psychiatric disorders associated with aberrant neural oscillations. Alpha rhythmic activity is known to dominate neural oscillations in the awake, restful state, while attenuated resting-state alpha activity has been implicated in anxious mood. Administering repeated alpha-frequency tACS (α-tACS; at the individual peak alpha frequency, 8–12 Hz) over four consecutive days in the experimental group (sham stimulation in the control group), we demonstrated immediate and lasting (>24 h) increases in resting-state posterior-to-frontal connectivity in the alpha frequency, quantified by Granger causality. Critically, this connectivity enhancement was accompanied by sustained reductions in both anxious arousal and negative perception of sensory stimuli. Resting-state alpha power also increased, albeit only transiently, reversing to the baseline level within 24 h after tACS. Therefore, the lasting enhancement of long-range alpha connectivity due to α-tACS differs from local alpha activity that is nonetheless conserved, highlighting the adaptability of alpha oscillatory networks. In light of increasing recognition of large-scale network dysfunctions as a transdiagnostic pathophysiology of psychiatric disorders, this enduring connectivity plasticity, along with the behavioral improvements, paves the way for tACS applications in clinical interventions of psychiatric 'oscillopathies'.
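Granger causality, the directed-connectivity measure used here, asks whether past values of one signal improve the prediction of another beyond that signal's own past. A minimal time-domain sketch follows; the function name, AR order, and toy data are our assumptions, and the study itself uses a spectral estimator on EEG, not this simplified form:

```python
import numpy as np

def granger(x, y, p=1):
    """Log ratio of residual sums of squares for predicting y with
    and without p lags of x; larger values mean x helps predict y."""
    n = len(y)
    target = y[p:]
    own = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
    full = np.column_stack(
        [own] + [x[p - k - 1:n - k - 1] for k in range(p)])
    def rss(preds):
        a = np.column_stack([np.ones(len(target)), preds])
        beta, *_ = np.linalg.lstsq(a, target, rcond=None)
        resid = target - a @ beta
        return resid @ resid
    return np.log(rss(own) / rss(full))

# Toy signals with a known direction: y is driven by the previous x.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
gc_x_to_y = granger(x, y)
gc_y_to_x = granger(y, x)
```

On this toy data the measure is large in the true driving direction and near zero in the reverse direction, mirroring how directed posterior-to-frontal influence is separated from the reverse flow.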
Affiliation(s)
- Kevin J Clancy
- Department of Psychology, Florida State University, Tallahassee, FL, USA
- Sarah K Baisley
- Department of Psychology, Florida State University, Tallahassee, FL, USA
- Alejandro Albizu
- Department of Psychology, Florida State University, Tallahassee, FL, USA
- Mingzhou Ding
- J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, FL, USA
- Wen Li
- Department of Psychology, Florida State University, Tallahassee, FL, USA
|
116
|
Pinto MA, Rosso OA, Matias FS. Inhibitory autapse mediates anticipated synchronization between coupled neurons. Phys Rev E 2019; 99:062411. [PMID: 31330650 DOI: 10.1103/physreve.99.062411] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/06/2019] [Indexed: 06/10/2023]
Abstract
Two identical autonomous dynamical systems unidirectionally coupled in a sender-receiver configuration can exhibit anticipated synchronization (AS) if the receiver neuron also receives a delayed negative self-feedback. Recently, AS was shown to occur in a three-neuron motif with standard chemical synapses where the delayed inhibition was provided by an interneuron. Here, we show that a two-neuron model in the presence of an inhibitory autapse, which is a massive self-innervation present in the cortical architecture, may exhibit AS. The GABAergic autapse regulates the internal dynamics of the receiver neuron and acts as the negative delayed self-feedback required by dynamical systems in order to exhibit AS. In this biologically plausible scenario, a smooth transition from the usual delayed synchronization (DS) to AS typically occurs when the inhibitory conductance is increased. The phenomenon is shown to be robust when model parameters are varied within a physiological range. For extremely large values of the inhibitory autapse the system undergoes a transition to a phase-drift regime in which the receiver is faster than the sender. Furthermore, we show that the inhibitory autapse promotes faster internal dynamics of the free-running receiver when the two neurons are uncoupled, which could be the mechanism underlying anticipated synchronization and the DS-AS transition.
Affiliation(s)
- Marcel A Pinto
- Instituto de Física, Universidade Federal de Alagoas, Maceió, Alagoas 57072-970, Brazil
- Osvaldo A Rosso
- Instituto de Física, Universidade Federal de Alagoas, Maceió, Alagoas 57072-970, Brazil
- Departamento de Informática en Salud, Hospital Italiano de Buenos Aires and CONICET, C1199ABB, Ciudad Autónoma de Buenos Aires, Argentina
- Fernanda S Matias
- Instituto de Física, Universidade Federal de Alagoas, Maceió, Alagoas 57072-970, Brazil
|
117
|
Beyeler M, Rounds EL, Carlson KD, Dutt N, Krichmar JL. Neural correlates of sparse coding and dimensionality reduction. PLoS Comput Biol 2019; 15:e1006908. [PMID: 31246948 PMCID: PMC6597036 DOI: 10.1371/journal.pcbi.1006908] [Citation(s) in RCA: 39] [Impact Index Per Article: 7.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/05/2023] Open
Abstract
Supported by recent computational studies, there is increasing evidence that a wide range of neuronal responses can be understood as an emergent property of nonnegative sparse coding (NSC), an efficient population coding scheme based on dimensionality reduction and sparsity constraints. We review evidence that NSC might be employed by sensory areas to efficiently encode external stimulus spaces, by some associative areas to conjunctively represent multiple behaviorally relevant variables, and possibly by the basal ganglia to coordinate movement. In addition, NSC might provide a useful theoretical framework under which to understand the often complex and nonintuitive response properties of neurons in other brain areas. Although NSC might not apply to all brain areas (for example, motor or executive function areas) the success of NSC-based models, especially in sensory areas, warrants further investigation for neural correlates in other regions.
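The inference step of nonnegative sparse coding can be sketched with Hoyer-style multiplicative updates. The dictionary, data sizes, and penalty weight below are illustrative assumptions for a toy demo, not a model from the review:

```python
import numpy as np

def nsc_codes(X, D, lam=0.1, n_iter=200):
    """Infer nonnegative sparse codes S for data X under a fixed
    nonnegative dictionary D, via multiplicative updates on the
    objective ||X - D @ S||^2 + lam * S.sum(); S stays >= 0."""
    rng = np.random.default_rng(0)
    S = rng.random((D.shape[1], X.shape[1]))
    for _ in range(n_iter):
        S *= (D.T @ X) / (D.T @ D @ S + lam + 1e-12)
    return S

# Tiny demo: 4-dimensional stimuli encoded by 3 nonnegative basis vectors.
rng = np.random.default_rng(1)
D = rng.random((4, 3))
X = D @ rng.random((3, 5))   # data the dictionary can explain
S = nsc_codes(X, D)
recon_err = np.linalg.norm(X - D @ S)
```

The two constraints named in the abstract appear directly: dimensionality reduction (3 basis vectors for 4-dimensional data) and sparsity (the `lam` penalty drives weak code entries toward zero).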
Affiliation(s)
- Michael Beyeler
- Department of Psychology, University of Washington, Seattle, Washington, United States of America
- Institute for Neuroengineering, University of Washington, Seattle, Washington, United States of America
- eScience Institute, University of Washington, Seattle, Washington, United States of America
- Department of Computer Science, University of California, Irvine, California, United States of America
- Emily L. Rounds
- Department of Cognitive Sciences, University of California, Irvine, California, United States of America
- Kristofor D. Carlson
- Department of Cognitive Sciences, University of California, Irvine, California, United States of America
- Sandia National Laboratories, Albuquerque, New Mexico, United States of America
- Nikil Dutt
- Department of Computer Science, University of California, Irvine, California, United States of America
- Department of Cognitive Sciences, University of California, Irvine, California, United States of America
- Jeffrey L. Krichmar
- Department of Computer Science, University of California, Irvine, California, United States of America
- Department of Cognitive Sciences, University of California, Irvine, California, United States of America
|
118
|
Ocker GK, Doiron B. Training and Spontaneous Reinforcement of Neuronal Assemblies by Spike Timing Plasticity. Cereb Cortex 2019; 29:937-951. [PMID: 29415191 PMCID: PMC7963120 DOI: 10.1093/cercor/bhy001] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/07/2016] [Revised: 01/01/2018] [Accepted: 01/05/2018] [Indexed: 12/15/2022] Open
Abstract
The synaptic connectivity of cortex is plastic, with experience shaping the ongoing interactions between neurons. Theoretical studies of spike timing-dependent plasticity (STDP) have focused on either just pairs of neurons or large-scale simulations. A simple analytic account for how fast spike time correlations affect both microscopic and macroscopic network structure is lacking. We develop a low-dimensional mean field theory for STDP in recurrent networks and show the emergence of assemblies of strongly coupled neurons with shared stimulus preferences. After training, this connectivity is actively reinforced by spike train correlations during the spontaneous dynamics. Furthermore, the stimulus coding by cell assemblies is actively maintained by these internally generated spiking correlations, suggesting a new role for noise correlations in neural coding. Assembly formation has often been associated with firing rate-based plasticity schemes; our theory provides an alternative and complementary framework, where fine temporal correlations and STDP form and actively maintain learned structure in cortical networks.
Affiliation(s)
- Gabriel Koch Ocker
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, PA, USA
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Allen Institute for Brain Science, Seattle, WA, USA
- Brent Doiron
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA, USA
|
119
|
Thornton C, Hutchings F, Kaiser M. The Virtual Electrode Recording Tool for EXtracellular Potentials (VERTEX) Version 2.0: Modelling in vitro electrical stimulation of brain tissue. Wellcome Open Res 2019; 4:20. [PMID: 30984877 PMCID: PMC6439485 DOI: 10.12688/wellcomeopenres.15058.1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 01/21/2019] [Indexed: 11/20/2022] Open
Abstract
Neuronal circuits can be modelled in detail, allowing us to predict the effects of stimulation on individual neurons. Electrical stimulation of neuronal circuits in vitro and in vivo excites a range of neurons within the tissue, and measurements of neural activity, e.g. the local field potential (LFP), are likewise an aggregate over a large pool of cells. The previous version of our Virtual Electrode Recording Tool for EXtracellular Potentials (VERTEX) allowed for the simulation of the LFP generated by a patch of brain tissue. Here, we extend VERTEX to simulate the effect of electrical stimulation through a focal electric field. We observe both direct changes in neural activity and changes in synaptic plasticity. Testing our software in a model of a rat neocortical slice, we determine the currents contributing to the LFP, the effects of paired-pulse stimulation to induce short-term plasticity (STP), and the effect of theta burst stimulation (TBS) to induce long-term potentiation (LTP).
Affiliation(s)
- Christopher Thornton
- Interdisciplinary Computing and Complex bioSystems (ICOS) Research Group, Newcastle University, Newcastle upon Tyne NE4 5TG, UK
- Frances Hutchings
- Interdisciplinary Computing and Complex bioSystems (ICOS) Research Group, Newcastle University, Newcastle upon Tyne NE4 5TG, UK
- Marcus Kaiser
- Interdisciplinary Computing and Complex bioSystems (ICOS) Research Group, Newcastle University, Newcastle upon Tyne NE4 5TG, UK
- Institute of Neuroscience, Newcastle University, Newcastle upon Tyne NE2 4HH, UK
|
120
|
Madadi Asl M, Valizadeh A, Tass PA. Dendritic and Axonal Propagation Delays May Shape Neuronal Networks With Plastic Synapses. Front Physiol 2018; 9:1849. [PMID: 30618847 PMCID: PMC6307091 DOI: 10.3389/fphys.2018.01849] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/25/2018] [Accepted: 12/07/2018] [Indexed: 12/27/2022] Open
Abstract
Biological neuronal networks are highly adaptive and plastic. For instance, spike-timing-dependent plasticity (STDP) is a core mechanism which adapts the synaptic strengths based on the relative timing of pre- and postsynaptic spikes. In various fields of physiology, time delays cause a plethora of biologically relevant dynamical phenomena. However, time delays increase the complexity of model systems together with the computational and theoretical analysis burden. Accordingly, propagation delays have often been neglected in computational neuronal network studies. As a downside, a classic STDP rule in oscillatory neurons without propagation delays is unable to give rise to bidirectional synaptic couplings, i.e., loops or uncoupled states. This is at variance with basic experimental results. In this mini-review, we survey recent theoretical studies of how this picture changes in the presence of propagation delays. Realistic propagation delays may lead to the emergence of neuronal activity and synaptic connectivity patterns which cannot be captured by classic STDP models. In fact, propagation delays determine the inventory of attractor states and shape their basins of attraction. The results reviewed here make it possible to overcome fundamental discrepancies between theory and experiments. Furthermore, these findings are relevant for the development of therapeutic brain stimulation techniques aiming at shifting the diseased brain to more favorable attractor states.
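The key effect can be illustrated in a few lines: with delays, the lag that enters the STDP window is the lag at the synapse, not the lag between somatic spike times. The window shape and delay values below are illustrative assumptions:

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Exponential STDP window over dt = t_post - t_pre (ms)."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

def synaptic_lag(t_pre, t_post, d_axon, d_dend):
    """The presynaptic spike reaches the synapse after the axonal
    delay; the back-propagating postsynaptic spike arrives there
    after the dendritic delay."""
    return (t_post + d_dend) - (t_pre + d_axon)

# Somatic timing says 'pre leads post' by 2 ms (potentiation), but a
# 5 ms axonal delay reverses the order seen at the synapse (depression).
dw_no_delay = stdp_dw(synaptic_lag(0.0, 2.0, 0.0, 0.0))
dw_delayed = stdp_dw(synaptic_lag(0.0, 2.0, 5.0, 0.0))
```

Because delays can flip the sign of the weight change for the same somatic spike times, the two directions of a reciprocal connection can be driven in opposite directions, which is how delays reshape the set of reachable connectivity patterns.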
Affiliation(s)
- Mojtaba Madadi Asl
- Department of Physics, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan, Iran
- Alireza Valizadeh
- Department of Physics, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan, Iran
- School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), Tehran, Iran
- Peter A Tass
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, CA, United States
|
121
|
Matheus Gauy M, Lengler J, Einarsson H, Meier F, Weissenberger F, Yanik MF, Steger A. A Hippocampal Model for Behavioral Time Acquisition and Fast Bidirectional Replay of Spatio-Temporal Memory Sequences. Front Neurosci 2018; 12:961. [PMID: 30618583 PMCID: PMC6306028 DOI: 10.3389/fnins.2018.00961] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/17/2018] [Accepted: 12/03/2018] [Indexed: 01/09/2023] Open
Abstract
The hippocampus is known to play a crucial role in the formation of long-term memory. For this, fast replays of previously experienced activities during sleep or after reward experiences are believed to be crucial. How such replays are generated, however, remains unclear. In this paper we propose a possible mechanism: we present a model that can store experienced trajectories on a behavioral timescale after a single run, and can subsequently replay such trajectories bidirectionally, omitting specifics of the previous behavior such as speed while allowing repetitions of events, even with different subsequent events. Our solution builds on well-known concepts, one-shot learning and synfire chains, enhancing them with additional mechanisms based on global inhibition and disinhibition. For replays, our approach relies on dendritic spikes and cholinergic modulation, as supported by experimental data. We also hypothesize a functional role of disinhibition as a pacemaker during behavioral time.
Affiliation(s)
- Marcelo Matheus Gauy
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Johannes Lengler
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Hafsteinn Einarsson
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Florian Meier
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Felix Weissenberger
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Mehmet Fatih Yanik
- Department of Information Technology and Electrical Engineering, Institute for Neuroinformatics, ETH Zurich, Zurich, Switzerland
- Angelika Steger
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
|
122
|
Khalil R, Karim AA, Khedr E, Moftah M, Moustafa AA. Dynamic Communications Between GABAA Switch, Local Connectivity, and Synapses During Cortical Development: A Computational Study. Front Cell Neurosci 2018; 12:468. [PMID: 30618625 PMCID: PMC6304749 DOI: 10.3389/fncel.2018.00468] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2018] [Accepted: 11/16/2018] [Indexed: 11/13/2022] Open
Abstract
Several factors regulate cortical development, such as changes in local connectivity and the influences of dynamical synapses. In this study, we simulated various factors affecting the regulation of neural network activity during cortical development. Previous studies have shown that during early cortical development, the reversal potential of GABAA shifts from depolarizing to hyperpolarizing. Here we provide the first integrative computational model to simulate the combined effects of these factors in a unified framework (building on our prior work: Khalil et al., 2017a,b). In the current study, we extend our model to monitor firing activity in response to the excitatory action of GABAA. Specifically, we created a spiking neural network model that included certain biophysical parameters for lateral connectivity (the distance between adjacent neurons) and nearby local connectivity (complex connections, including those between neuronal groups). We simulated different network scenarios (for immature and mature conditions) based on these biophysical parameters. Then, we implemented two forms of short-term synaptic plasticity (STP): depression and facilitation. Each form comes in two distinct kinds according to its synaptic time constant. Finally, in both sets of networks, we compared firing rate responses before and after simulating dynamical synapses. Based on simulation results, we found that the modulatory effect of dynamical synapses on the firing activity of the neural network depends strongly on the physiological state of GABAA. Moreover, the STP mechanism acts differently in every network scenario, mirroring the crucial modulating roles of these critical parameters during cortical development. Clinical implications for pathological alterations of GABAergic signaling in neurological and psychiatric disorders are discussed.
Affiliation(s)
- Radwa Khalil
- Department of Psychology and Methods, Jacobs University Bremen, Bremen, Germany
- Ahmed A Karim
- Department of Psychology and Methods, Jacobs University Bremen, Bremen, Germany
- University Clinic of Psychiatry and Psychotherapy, Tübingen, Germany
- Eman Khedr
- Department of Neuropsychiatry, Faculty of Medicine, Assiut University, Assiut, Egypt
- Marie Moftah
- Zoology Department, Faculty of Science, Alexandria University, Alexandria, Egypt
- Ahmed A Moustafa
- MARCS Institute for Brain and Behaviour, Western Sydney University, Sydney, NSW, Australia
- Department of Social Sciences, College of Arts and Sciences, Qatar University, Doha, Qatar
|
123
|
Knight JC, Nowotny T. GPUs Outperform Current HPC and Neuromorphic Solutions in Terms of Speed and Energy When Simulating a Highly-Connected Cortical Model. Front Neurosci 2018; 12:941. [PMID: 30618570 PMCID: PMC6299048 DOI: 10.3389/fnins.2018.00941] [Citation(s) in RCA: 26] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2018] [Accepted: 11/29/2018] [Indexed: 11/15/2022] Open
Abstract
While neuromorphic systems may be the ultimate platform for deploying spiking neural networks (SNNs), their distributed nature and optimization for specific types of models makes them unwieldy tools for developing them. Instead, SNN models tend to be developed and simulated on computers or clusters of computers with standard von Neumann CPU architectures. Over the last decade, as well as becoming a common fixture in many workstations, NVIDIA GPU accelerators have entered the High Performance Computing field and are now used in 50% of the top 10 supercomputing sites worldwide. In this paper we use our GeNN code generator to re-implement two neocortex-inspired, circuit-scale, point neuron network models on GPU hardware. We verify the correctness of our GPU simulations against prior results obtained with NEST running on traditional HPC hardware and compare the performance with respect to speed and energy consumption against published data from CPU-based HPC and neuromorphic hardware. A full-scale model of a cortical column can be simulated at speeds approaching 0.5× real-time using a single NVIDIA Tesla V100 accelerator, faster than is currently possible using a CPU-based cluster or the SpiNNaker neuromorphic system. In addition, we find that, across a range of GPU systems, the energy to solution as well as the energy per synaptic event of the microcircuit simulation is as much as 14× lower than either on SpiNNaker or in CPU-based simulations. Besides performance in terms of speed and energy consumption of the simulation, efficient initialization of models is also a crucial concern, particularly in a research context where repeated runs and parameter-space exploration are required. Therefore, we also introduce in this paper some of the novel parallel initialization methods implemented in the latest version of GeNN and demonstrate how they can enable further speed and energy advantages.
Affiliation(s)
- James C. Knight
- Centre for Computational Neuroscience and Robotics, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom
|
124
|
Demin V, Nekhaev D. Recurrent Spiking Neural Network Learning Based on a Competitive Maximization of Neuronal Activity. Front Neuroinform 2018; 12:79. [PMID: 30498439 PMCID: PMC6250118 DOI: 10.3389/fninf.2018.00079] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/27/2018] [Accepted: 10/18/2018] [Indexed: 12/21/2022] Open
Abstract
Spiking neural networks (SNNs) are believed to be highly efficient, both computationally and in energy terms, for real-time solutions on dedicated neurochip hardware. However, there is a lack of learning algorithms for complex SNNs with recurrent connections that are comparable in efficiency with back-propagation techniques and capable of unsupervised training. Here we suppose that each neuron in a biological neural network tends to maximize its activity in competition with other neurons, and put this principle at the basis of a new SNN learning algorithm. In such a way, a spiking network with learned feed-forward, reciprocal, and intralayer inhibitory connections is applied to digit recognition on the MNIST database. It has been demonstrated that this SNN can be trained without a teacher, after a short supervised initialization of weights by the same algorithm. Also, it has been shown that neurons are grouped into families of hierarchical structures, corresponding to different digit classes and their associations. This property is expected to be useful for reducing the number of layers in deep neural networks and for modeling the formation of various functional structures in a biological nervous system. Comparison of the learning properties of the suggested algorithm with those of the Sparse Distributed Representation approach shows similarity in coding but also some advantages of the former. The basic principle of the proposed algorithm is believed to be practically applicable to the construction of much more complicated SNNs solving diverse tasks. We refer to this new approach as "Family-Engaged Execution and Learning of Induced Neuron Groups," or FEELING.
Affiliation(s)
- Vyacheslav Demin
- National Research Center "Kurchatov Institute", Moscow, Russia
- Moscow Institute of Physics and Technology, Dolgoprudny, Russia
- Dmitry Nekhaev
- National Research Center "Kurchatov Institute", Moscow, Russia
|
125
|
Henderson JA, Gong P. Functional mechanisms underlie the emergence of a diverse range of plasticity phenomena. PLoS Comput Biol 2018; 14:e1006590. [PMID: 30419014 PMCID: PMC6258383 DOI: 10.1371/journal.pcbi.1006590] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2018] [Revised: 11/26/2018] [Accepted: 10/23/2018] [Indexed: 11/18/2022] Open
Abstract
Diverse plasticity mechanisms are orchestrated to shape the spatiotemporal dynamics underlying brain functions. However, why these plasticity rules emerge and how their dynamics interact with neural activity to give rise to complex neural circuit dynamics remains largely unknown. Here we show that both Hebbian and homeostatic plasticity rules emerge from a functional perspective of neuronal dynamics whereby each neuron learns to encode its own activity in the population activity, so that the activity of the presynaptic neuron can be decoded from the activity of its postsynaptic neurons. We explain how a range of experimentally observed plasticity phenomena with widely separated time scales emerge from learning this encoding function, including STDP and its frequency dependence, and metaplasticity. We show that when implemented in neural circuits, these plasticity rules naturally give rise to essential neural response properties, including variable neural dynamics with balanced excitation and inhibition, and approximately log-normal distributions of synaptic strengths, while simultaneously encoding a complex real-world visual stimulus. These findings establish a novel function-based account of diverse plasticity mechanisms, providing a unifying framework relating plasticity, dynamics and neural computation. Many experiments have documented a variety of ways in which the connectivity strengths between neurons change in response to the activity of neurons. These changes are an important part of learning. However, it is not understood how such a diverse range of observations can be understood as consequences of an underlying algorithm used by brains for learning. In order to understand such a learning algorithm it is also necessary to understand the neural computation that is being learned, that is, how the functions of the brain are encoded in the activity of its neurons and its connectivity. 
In this work we propose a simple way in which information can be encoded and decoded in a network of neurons for operating on real-world stimuli, and how this can be learned using two fundamental plasticity rules that change the strength of connections between neurons in response to neural activity. Surprisingly, many experimental observations result as consequences of this approach, indicating that studying the learning of function provides a novel framework for unifying plasticity, dynamics, and neural computation.
Affiliation(s)
- James A. Henderson
- School of Physics, The University of Sydney, Sydney, NSW, Australia
- ARC Centre of Excellence for Integrative Brain Function, The University of Sydney, Sydney, NSW, Australia
- Pulin Gong
- School of Physics, The University of Sydney, Sydney, NSW, Australia
- ARC Centre of Excellence for Integrative Brain Function, The University of Sydney, Sydney, NSW, Australia
|
126
|
Goodhill GJ. Theoretical Models of Neural Development. iScience 2018; 8:183-199. [PMID: 30321813 PMCID: PMC6197653 DOI: 10.1016/j.isci.2018.09.017] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2018] [Revised: 08/06/2018] [Accepted: 09/19/2018] [Indexed: 12/22/2022] Open
Abstract
Constructing a functioning nervous system requires the precise orchestration of a vast array of mechanical, molecular, and neural-activity-dependent cues. Theoretical models can play a vital role in helping to frame quantitative issues, reveal mathematical commonalities between apparently diverse systems, identify what is and what is not possible in principle, and test the abilities of specific mechanisms to explain the data. This review focuses on the progress that has been made over the last decade in our theoretical understanding of neural development.
Affiliation(s)
- Geoffrey J Goodhill
- Queensland Brain Institute and School of Mathematics and Physics, The University of Queensland, St Lucia, QLD 4072, Australia
|
127
|
Theodoni P, Rovira B, Wang Y, Roxin A. Theta-modulation drives the emergence of connectivity patterns underlying replay in a network model of place cells. eLife 2018; 7:37388. [PMID: 30355442 PMCID: PMC6224194 DOI: 10.7554/elife.37388] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/09/2018] [Accepted: 10/24/2018] [Indexed: 01/05/2023] Open
Abstract
Place cells of the rodent hippocampus fire action potentials when the animal traverses a particular spatial location in any environment. Therefore for any given trajectory one observes a repeatable sequence of place cell activations. When the animal is quiescent or sleeping, one can observe similar sequences of activation known as replay, which underlie the process of memory consolidation. However, it remains unclear how replay is generated. Here we show how a temporally asymmetric plasticity rule during spatial exploration gives rise to spontaneous replay in a model network by shaping the recurrent connectivity to reflect the topology of the learned environment. Crucially, the rate of this encoding is strongly modulated by ongoing rhythms. Oscillations in the theta range optimize learning by generating repeated pre-post pairings on a time-scale commensurate with the window for plasticity, while lower and higher frequencies generate learning rates which are lower by orders of magnitude.
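The asymmetric pairing rule at the heart of such models can be sketched in a few lines. The following is a minimal illustration of pairwise STDP applied to oscillation-paced spike trains; the window shape, the amplitudes `A_PLUS`/`A_MINUS`, and the time constants are generic assumptions for illustration, not parameters taken from this paper.

```python
import math

A_PLUS, A_MINUS = 0.01, 0.012     # potentiation / depression amplitudes (assumed)
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # window time constants in ms (assumed)

def stdp_window(dt_ms):
    """Weight change for one pre-post pair; dt = t_post - t_pre (ms)."""
    if dt_ms >= 0:
        return A_PLUS * math.exp(-dt_ms / TAU_PLUS)
    return -A_MINUS * math.exp(dt_ms / TAU_MINUS)

def total_weight_change(pre_spikes, post_spikes):
    """All-to-all pairwise accumulation over two spike trains (times in ms)."""
    return sum(stdp_window(t_post - t_pre)
               for t_pre in pre_spikes for t_post in post_spikes)

def paced_train(freq_hz, n_cycles, offset_ms=0.0):
    """One spike per oscillation cycle, optionally phase-shifted."""
    period_ms = 1000.0 / freq_hz
    return [k * period_ms + offset_ms for k in range(n_cycles)]

# Theta pacing (8 Hz) with a causal +5 ms pre-before-post lag: every cycle
# contributes a pairing inside the potentiation window, so LTP accumulates.
pre = paced_train(8.0, 50)
post = paced_train(8.0, 50, offset_ms=5.0)
dw_theta = total_weight_change(pre, post)
```

With pairings repeated on a time scale commensurate with the plasticity window, the same causal lag is re-registered on every cycle; this is the mechanistic ingredient the abstract refers to when relating oscillation frequency to the learning rate.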
Affiliation(s)
- Panagiota Theodoni
- Centre de Recerca Matemàtica, Bellaterra, Spain; New York University Shanghai, Shanghai, China; NYU-ECNU Institute of Brain and Cognitive Science at NYU Shanghai, Shanghai, China
- Bernat Rovira
- Centre de Recerca Matemàtica, Bellaterra, Spain; Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
- Yingxue Wang
- Max Planck Florida Institute for Neuroscience, Jupiter, United States
- Alex Roxin
- Centre de Recerca Matemàtica, Bellaterra, Spain; Barcelona Graduate School of Mathematics, Barcelona, Spain
128
Madadi Asl M, Valizadeh A, Tass PA. Propagation delays determine neuronal activity and synaptic connectivity patterns emerging in plastic neuronal networks. CHAOS (WOODBURY, N.Y.) 2018; 28:106308. [PMID: 30384625 DOI: 10.1063/1.5037309] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/23/2018] [Accepted: 08/01/2018] [Indexed: 06/08/2023]
Abstract
In plastic neuronal networks, the synaptic strengths are adapted to the neuronal activity. Specifically, spike-timing-dependent plasticity (STDP) is a fundamental mechanism that modifies the synaptic strengths based on the relative timing of pre- and postsynaptic spikes, taking into account the spikes' temporal order. In many studies, propagation delays were neglected to avoid additional dynamic complexity or computational costs. So far, networks equipped with a classic STDP rule typically rule out bidirectional couplings (i.e., either loops or uncoupled states) and are, hence, not able to reproduce fundamental experimental findings. In this review paper, we consider additional features, e.g., extensions of the classic STDP rule or additional aspects like noise, in order to overcome the contradictions between theory and experiment. In addition, we review in detail recent studies showing that a classic STDP rule combined with realistic propagation patterns is able to capture relevant experimental findings. In two coupled oscillatory neurons with propagation delays, bidirectional synapses can be preserved and potentiated. This result also holds for large networks of type-II phase oscillators. In addition, not only the mean of the initial distribution of synaptic weights, but also its standard deviation crucially determines the emergent structural connectivity, i.e., the mean final synaptic weight, the number of two-neuron loops, and the symmetry of the final connectivity pattern. The latter is affected by the firing rates, where more symmetric synaptic configurations emerge at higher firing rates. Finally, we discuss these findings in the context of the computational neuroscience-based development of desynchronizing brain stimulation techniques.
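The core bookkeeping behind this result, namely the spike-time difference actually seen at a synapse once propagation delays are included, can be sketched as follows. The delay values and the sign convention are illustrative assumptions, not the paper's exact formulation.

```python
def effective_dt(t_pre_ms, t_post_ms, axonal_ms, dendritic_ms):
    """Pre-post interval as seen at the synapse: the presynaptic spike
    arrives after the axonal delay, while the back-propagating postsynaptic
    spike arrives after the dendritic delay (one common convention)."""
    return (t_post_ms + dendritic_ms) - (t_pre_ms + axonal_ms)

# Two reciprocally coupled neurons firing in synchrony at t = 100 ms:
dt_ab = effective_dt(100.0, 100.0, axonal_ms=2.0, dendritic_ms=8.0)
dt_ba = effective_dt(100.0, 100.0, axonal_ms=2.0, dendritic_ms=8.0)
# Both synapses see the same positive (+6 ms) effective interval, so both
# fall on the potentiation side of the STDP window and the two-neuron loop
# can be preserved -- unlike the delay-free case, where one synapse of any
# bidirectional pair is always pushed toward depression.
```

This is why neglecting delays makes classic STDP rule out bidirectional couplings: without them, synchronous or near-synchronous firing gives the two reciprocal synapses effective intervals of opposite sign.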
Affiliation(s)
- Mojtaba Madadi Asl
- Department of Physics, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan 45195-1159, Iran
- Alireza Valizadeh
- Department of Physics, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan 45195-1159, Iran
- Peter A Tass
- Department of Neurosurgery, School of Medicine, Stanford University, Stanford, California 94305, USA
129
Richards BA, Lillicrap TP. Dendritic solutions to the credit assignment problem. Curr Opin Neurobiol 2018; 54:28-36. [PMID: 30205266 DOI: 10.1016/j.conb.2018.08.003] [Citation(s) in RCA: 50] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/07/2018] [Revised: 07/19/2018] [Accepted: 08/07/2018] [Indexed: 11/27/2022]
Abstract
Guaranteeing that synaptic plasticity leads to effective learning requires a means for assigning credit to each neuron for its contribution to behavior. The 'credit assignment problem' refers to the fact that credit assignment is non-trivial in hierarchical networks with multiple stages of processing. One difficulty is that if credit signals are integrated with other inputs, then it is hard for synaptic plasticity rules to distinguish credit-related activity from non-credit-related activity. A potential solution is to use the spatial layout and non-linear properties of dendrites to distinguish credit signals from other inputs. In cortical pyramidal neurons, evidence hints that top-down feedback signals are integrated in the distal apical dendrites and have a distinct impact on spike-firing and synaptic plasticity. This suggests that the distal apical dendrites of pyramidal neurons help the brain to solve the credit assignment problem.
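The gating idea can be made concrete with a toy two-compartment update in which basal (feedforward) synapses change only when an apical credit signal crosses a plateau threshold. All names and parameter values here are hypothetical illustrations of the principle, not the circuit model proposed by the authors.

```python
def apical_plateau(apical_input, threshold=1.0):
    """A distal apical 'credit' signal triggers a dendritic plateau event
    only when it is strong enough (threshold is an assumed parameter)."""
    return apical_input > threshold

def update_basal_weights(weights, basal_inputs, apical_input, lr=0.05):
    """Hebbian-style update on basal synapses, gated by the apical signal,
    so non-credit-related activity leaves the weights untouched."""
    if not apical_plateau(apical_input):
        return list(weights)
    return [w + lr * x for w, x in zip(weights, basal_inputs)]

# Without top-down credit the weights are frozen; with it, active inputs move.
frozen = update_basal_weights([0.5, 0.2], [1.0, 0.0], apical_input=0.3)
updated = update_basal_weights([0.5, 0.2], [1.0, 0.0], apical_input=2.0)
```

The spatial segregation does the work: because credit arrives on a separate compartment, the plasticity rule never has to disentangle credit-related from non-credit-related activity within a single summed input.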
Affiliation(s)
- Blake A Richards
- Department of Biological Sciences, University of Toronto Scarborough, Toronto, ON, Canada; Department of Cell and Systems Biology, University of Toronto, Toronto, ON, Canada; Learning in Machines and Brains Program, Canadian Institute for Advanced Research, Toronto, ON, Canada
130
Osipov V, Osipova M. Space–time signal binding in recurrent neural networks with controlled elements. Neurocomputing 2018. [DOI: 10.1016/j.neucom.2018.05.009] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/16/2022]
131
Detorakis G, Sheik S, Augustine C, Paul S, Pedroni BU, Dutt N, Krichmar J, Cauwenberghs G, Neftci E. Neural and Synaptic Array Transceiver: A Brain-Inspired Computing Framework for Embedded Learning. Front Neurosci 2018; 12:583. [PMID: 30210274 PMCID: PMC6123384 DOI: 10.3389/fnins.2018.00583] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/20/2018] [Accepted: 08/03/2018] [Indexed: 11/13/2022] Open
Abstract
Embedded, continual learning for autonomous and adaptive behavior is a key application of neuromorphic hardware. However, neuromorphic implementations of embedded learning at large scales that are both flexible and efficient have been hindered by a lack of a suitable algorithmic framework. As a result, most neuromorphic hardware is trained off-line on large clusters of dedicated processors or GPUs and transferred post hoc to the device. We address this by introducing the neural and synaptic array transceiver (NSAT), a neuromorphic computational framework facilitating flexible and efficient embedded learning by matching algorithmic requirements and neural and synaptic dynamics. NSAT supports event-driven supervised, unsupervised and reinforcement learning algorithms including deep learning. We demonstrate the NSAT in a wide range of tasks, including the simulation of the Mihalas-Niebur neuron, dynamic neural fields, event-driven random back-propagation for event-based deep learning, event-based contrastive divergence for unsupervised learning, and voltage-based learning rules for sequence learning. We anticipate that this contribution will establish the foundation for a new generation of devices enabling adaptive mobile systems, wearable devices, and robots with data-driven autonomy.
Affiliation(s)
- Georgios Detorakis
- Department of Cognitive Sciences, University of California, Irvine, Irvine, CA, United States
- Sadique Sheik
- Biocircuits Institute, University of California, San Diego, La Jolla, CA, United States
- Charles Augustine
- Intel Corporation-Circuit Research Lab, Hillsboro, OR, United States
- Somnath Paul
- Intel Corporation-Circuit Research Lab, Hillsboro, OR, United States
- Bruno U. Pedroni
- Department of Bioengineering and Institute for Neural Computation, University of California, San Diego, La Jolla, CA, United States
- Nikil Dutt
- Department of Cognitive Sciences, University of California, Irvine, Irvine, CA, United States
- Department of Computer Science, University of California, Irvine, Irvine, CA, United States
- Jeffrey Krichmar
- Department of Cognitive Sciences, University of California, Irvine, Irvine, CA, United States
- Department of Computer Science, University of California, Irvine, Irvine, CA, United States
- Gert Cauwenberghs
- Department of Bioengineering and Institute for Neural Computation, University of California, San Diego, La Jolla, CA, United States
- Emre Neftci
- Department of Cognitive Sciences, University of California, Irvine, Irvine, CA, United States
- Department of Computer Science, University of California, Irvine, Irvine, CA, United States
132
Zanos S, Rembado I, Chen D, Fetz EE. Phase-Locked Stimulation during Cortical Beta Oscillations Produces Bidirectional Synaptic Plasticity in Awake Monkeys. Curr Biol 2018; 28:2515-2526.e4. [PMID: 30100342 PMCID: PMC6108550 DOI: 10.1016/j.cub.2018.07.009] [Citation(s) in RCA: 64] [Impact Index Per Article: 10.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/22/2017] [Revised: 04/04/2018] [Accepted: 07/04/2018] [Indexed: 12/19/2022]
Abstract
The functional role of cortical beta oscillations, if any, remains unresolved. During oscillations, the periodic fluctuation in excitability of entrained cells modulates transmission of neural impulses and periodically enhances synaptic interactions. The extent to which oscillatory episodes affect activity-dependent synaptic plasticity remains to be determined. In nonhuman primates, we delivered single-pulse electrical cortical stimulation to a "stimulated" site in sensorimotor cortex triggered on a specific phase of ongoing beta (12-25 Hz) field potential oscillations recorded at a separate "triggering" site. Corticocortical connectivity from the stimulated to the triggering site as well as to other (non-triggering) sites was assessed by cortically evoked potentials elicited by test stimuli to the stimulated site, delivered outside of oscillatory episodes. In separate experiments, connectivity was assessed by intracellular recordings of evoked excitatory postsynaptic potentials. The conditioning paradigm produced transient (1-2 s long) changes in connectivity between the stimulated and the triggering site that outlasted the duration of the oscillatory episodes. The direction of the plasticity effect depended on the phase from which stimulation was triggered: potentiation in depolarizing phases, depression in hyperpolarizing phases. Plasticity effects were also seen at non-triggering sites that exhibited oscillations synchronized with those at the triggering site. These findings indicate that cortical beta oscillations provide a spatial and temporal substrate for short-term, activity-dependent synaptic plasticity in primate neocortex and may help explain the role of oscillations in attention, learning, and cortical reorganization.
Affiliation(s)
- Stavros Zanos
- Center for Bioelectronic Medicine, Feinstein Institute for Medical Research, 350 Community Drive, Manhasset NY 11030, USA; Department of Physiology & Biophysics, University of Washington, 1705 NE Pacific St, Seattle, WA 98195, USA.
- Irene Rembado
- Department of Physiology & Biophysics, University of Washington, 1705 NE Pacific St, Seattle, WA 98195, USA.
- Daofen Chen
- Division of Neuroscience, National Institute of Neurological Disorders and Stroke, National Institutes of Health, 6001 Executive Boulevard, Bethesda, MD 20892, USA.
- Eberhard E Fetz
- Department of Physiology & Biophysics, University of Washington, 1705 NE Pacific St, Seattle, WA 98195, USA.
133
Interplay of multiple pathways and activity-dependent rules in STDP. PLoS Comput Biol 2018; 14:e1006184. [PMID: 30106953 PMCID: PMC6112684 DOI: 10.1371/journal.pcbi.1006184] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/15/2018] [Revised: 08/28/2018] [Accepted: 05/09/2018] [Indexed: 12/13/2022] Open
Abstract
Hebbian plasticity describes a basic mechanism for synaptic plasticity whereby synaptic weights evolve depending on the relative timing of paired activity of the pre- and postsynaptic neurons. Spike-timing-dependent plasticity (STDP) constitutes a central experimental and theoretical synaptic Hebbian learning rule. Various mechanisms, mostly calcium-based, account for the induction and maintenance of STDP. Classically STDP is assumed to gradually emerge in a monotonic way as the number of pairings increases. However, non-monotonic STDP accounting for fast associative learning led us to challenge this monotonicity hypothesis and explore how the existence of multiple plasticity pathways affects the dynamical establishment of plasticity. To account for distinct forms of STDP emerging from increasing numbers of pairings and the variety of signaling pathways involved, we developed a general class of simple mathematical models of plasticity based on calcium transients and accommodating various calcium-based plasticity mechanisms. These mechanisms can either compete or cooperate for the establishment of long-term potentiation (LTP) and depression (LTD), that emerge depending on past calcium activity. Our model reproduces accurately the striatal STDP that involves endocannabinoid and NMDAR signaling pathways. Moreover, we predict how stimulus frequency alters plasticity, and how triplet rules are affected by the number of pairings. We further investigate the general model with an arbitrary number of pathways and show that depending on those pathways and their properties, a variety of plasticities may emerge upon variation of the number and/or the frequency of pairings, even when the outcome after large numbers of pairings is identical. These findings, built upon a biologically realistic example and generalized to other applications, argue that in order to fully describe synaptic plasticity it is not sufficient to record STDP curves at fixed pairing numbers and frequencies. 
In fact, considering the whole spectrum of activity-dependent parameters could have a great impact on the description of plasticity, and a better understanding of the engram.

The brain’s capacity to process information, learn, and store memories relies on synaptic connectivity patterns, which are altered through synaptic plasticity mechanisms. Experimentally, such plasticities were evidenced through protocols involving numerous repetitive stimulations of a given synapse, and were shown to be supported by multiple pathways. Using a simple biologically grounded mathematical model, we show how activation timescales and inactivation levels of each pathway interact and alter plasticity in an intricate manner as stimuli are presented. Building upon data from the synapse between cortex and striatum, we show that synaptic changes may revert or re-emerge as stimuli are presented, and predict specific responses to changes in stimulus frequency or to distinct stimulation patterns. Our general model shows that a given plasticity profile emerging in response to a repetitive stimulation protocol can unfold into various scenarios upon variations of the number of stimulus presentations or patterns, which tightly depends on the underlying activated pathways. Altogether, these results argue that, in order to better understand learning and memory, single plasticity responses obtained through intensive stimulations do not reveal the complexity of the responses for smaller numbers of presentations, which may have a strong impact on fast learning of stimuli with low numbers of presentations.
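One way to see how competing pathways can make plasticity non-monotonic in the number of pairings is a toy model in which a fast-saturating LTD pathway competes with a slowly accumulating LTP pathway. The saturating form and every parameter below are illustrative assumptions, not fits to the corticostriatal data modeled in the paper.

```python
import math

def plasticity_vs_pairings(n_pairings, tau_fast=10.0, tau_slow=60.0,
                           a_ltd=1.0, a_ltp=1.4):
    """Net weight change after n pairings when a fast-saturating LTD pathway
    (time constant tau_fast, in units of pairings) competes with a slowly
    accumulating LTP pathway (tau_slow). All parameters are assumed."""
    ltd = a_ltd * (1.0 - math.exp(-n_pairings / tau_fast))
    ltp = a_ltp * (1.0 - math.exp(-n_pairings / tau_slow))
    return ltp - ltd

# Few pairings: the fast LTD pathway dominates (net depression); many
# pairings: the slow LTP pathway overtakes it (net potentiation). The sign
# of the measured "STDP" thus depends on the number of pairings used.
dw_few = plasticity_vs_pairings(5)
dw_many = plasticity_vs_pairings(200)
```

This reproduces the qualitative point of the abstract: an outcome recorded at one fixed pairing number hides a reversal that only appears when the protocol length is varied.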
134
Lee C, Panda P, Srinivasan G, Roy K. Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning. Front Neurosci 2018; 12:435. [PMID: 30123103 PMCID: PMC6085488 DOI: 10.3389/fnins.2018.00435] [Citation(s) in RCA: 51] [Impact Index Per Article: 8.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/18/2018] [Accepted: 06/11/2018] [Indexed: 12/02/2022] Open
Abstract
Spiking Neural Networks (SNNs) are fast becoming a promising candidate for brain-inspired neuromorphic computing because of their inherent power efficiency and impressive inference accuracy across several cognitive tasks such as image classification and speech recognition. The recent efforts in SNNs have been focused on implementing deeper networks with multiple hidden layers to incorporate exponentially more difficult functional representations. In this paper, we propose a pre-training scheme using biologically plausible unsupervised learning, namely Spike-Timing-Dependent-Plasticity (STDP), in order to better initialize the parameters in multi-layer systems prior to supervised optimization. The multi-layer SNN is comprised of alternating convolutional and pooling layers followed by fully-connected layers, which are populated with leaky integrate-and-fire spiking neurons. We train the deep SNNs in two phases wherein, first, convolutional kernels are pre-trained in a layer-wise manner with unsupervised learning followed by fine-tuning the synaptic weights with spike-based supervised gradient descent backpropagation. Our experiments on digit recognition demonstrate that the STDP-based pre-training with gradient-based optimization provides improved robustness, faster (~2.5 ×) training time and better generalization compared with purely gradient-based training without pre-training.
Affiliation(s)
- Chankyu Lee
- Nanoelectronics Research Laboratory, School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN, United States
135
Gerstner W, Lehmann M, Liakoni V, Corneil D, Brea J. Eligibility Traces and Plasticity on Behavioral Time Scales: Experimental Support of NeoHebbian Three-Factor Learning Rules. Front Neural Circuits 2018; 12:53. [PMID: 30108488 PMCID: PMC6079224 DOI: 10.3389/fncir.2018.00053] [Citation(s) in RCA: 117] [Impact Index Per Article: 19.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/11/2018] [Accepted: 06/19/2018] [Indexed: 11/13/2022] Open
Abstract
Most elementary behaviors such as moving the arm to grasp an object or walking into the next room to explore a museum evolve on the time scale of seconds; in contrast, neuronal action potentials occur on the time scale of a few milliseconds. Learning rules of the brain must therefore bridge the gap between these two different time scales. Modern theories of synaptic plasticity have postulated that the co-activation of pre- and postsynaptic neurons sets a flag at the synapse, called an eligibility trace, that leads to a weight change only if an additional factor is present while the flag is set. This third factor, signaling reward, punishment, surprise, or novelty, could be implemented by the phasic activity of neuromodulators or specific neuronal inputs signaling special events. While the theoretical framework has been developed over the last decades, experimental evidence in support of eligibility traces on the time scale of seconds has been collected only during the last few years. Here we review, in the context of three-factor rules of synaptic plasticity, four key experiments that support the role of synaptic eligibility traces in combination with a third factor as a biological implementation of neoHebbian three-factor learning rules.
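The rule class reviewed here has a compact computational form: a coincidence-driven eligibility trace plus a gating third factor. The sketch below uses assumed time constants and learning rate, purely to show the mechanics of how a seconds-long trace bridges a millisecond coincidence and a delayed reward.

```python
def step(e_trace, w, pre, post, third_factor, dt=1.0, tau_e=1000.0, lr=0.1):
    """One 1-ms update of a neoHebbian three-factor rule: the pre/post
    coincidence sets a decaying eligibility trace (the synaptic 'flag'),
    and the weight changes only while a third factor (e.g. phasic
    neuromodulation) is present. tau_e and lr are assumed values."""
    e_trace += dt * (-e_trace / tau_e + pre * post)  # flag at the synapse
    w += lr * third_factor * e_trace                 # gated consolidation
    return e_trace, w

# Coincident pre/post activity at t = 0 tags two identical synapses ...
e1, w_rewarded = step(0.0, 0.0, pre=1.0, post=1.0, third_factor=0.0)
e2, w_unrewarded = step(0.0, 0.0, pre=1.0, post=1.0, third_factor=0.0)
# ... the traces decay for 500 ms with no further activity ...
for _ in range(500):
    e1, w_rewarded = step(e1, w_rewarded, 0.0, 0.0, third_factor=0.0)
    e2, w_unrewarded = step(e2, w_unrewarded, 0.0, 0.0, third_factor=0.0)
# ... then a delayed third-factor pulse consolidates only the first synapse.
e1, w_rewarded = step(e1, w_rewarded, 0.0, 0.0, third_factor=1.0)
```

Without the third factor the flag simply decays and no weight change survives, which is exactly the behavioral-time-scale credit assignment the review describes.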
Affiliation(s)
- Wulfram Gerstner
- School of Computer Science and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
136
Foncelle A, Mendes A, Jędrzejewska-Szmek J, Valtcheva S, Berry H, Blackwell KT, Venance L. Modulation of Spike-Timing Dependent Plasticity: Towards the Inclusion of a Third Factor in Computational Models. Front Comput Neurosci 2018; 12:49. [PMID: 30018546 PMCID: PMC6037788 DOI: 10.3389/fncom.2018.00049] [Citation(s) in RCA: 31] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/15/2018] [Accepted: 06/06/2018] [Indexed: 11/13/2022] Open
Abstract
In spike-timing dependent plasticity (STDP), the change in synaptic strength depends on the timing of pre- vs. postsynaptic spiking activity. Since STDP is in compliance with Hebb's postulate, it is considered one of the major mechanisms of memory storage and recall. STDP comprises a system of two coincidence detectors, with N-methyl-D-aspartate receptor (NMDAR) activation often posited as one of the main components. Numerous studies have unveiled a third component of this coincidence detection system, namely neuromodulation and glial activity shaping STDP. Even though dopaminergic control of STDP has most often been reported, acetylcholine, noradrenaline, nitric oxide (NO), brain-derived neurotrophic factor (BDNF), and gamma-aminobutyric acid (GABA) have also been shown to effectively modulate STDP. Furthermore, it has been demonstrated that astrocytes, via the release or uptake of glutamate, gate STDP expression. At the most fundamental level, the timing properties of STDP are expected to depend on the spatiotemporal dynamics of the underlying signaling pathways. However, in most cases, technical limitations mean that experiments grant only indirect access to these pathways. Computational models carefully constrained by experiments allow for a better qualitative understanding of the molecular basis of STDP and its regulation by neuromodulators. Recently, computational models of calcium dynamics and signaling pathway molecules have started to explore STDP emergence in ex vivo and in vivo-like conditions. These models are expected to better reproduce at least part of the complex modulation of STDP as an emergent property of the underlying molecular pathways. Elucidation of the mechanisms underlying STDP modulation and its consequences for network dynamics is of critical importance and will allow a better understanding of the major mechanisms of memory storage and recall, both in health and disease.
Affiliation(s)
- Alexandre Foncelle
- INRIA, Villeurbanne, France
- LIRIS UMR 5205 CNRS-INSA, University of Lyon, Villeurbanne, France
- Alexandre Mendes
- Dynamic and Pathophysiology of Neuronal Networks, Center for Interdisciplinary Research in Biology (CIRB), College de France, INSERM U1050, CNRS UMR7241, Labex Memolife, Paris, France
- University Pierre et Marie Curie, ED 158, Paris, France
- Silvana Valtcheva
- Dynamic and Pathophysiology of Neuronal Networks, Center for Interdisciplinary Research in Biology (CIRB), College de France, INSERM U1050, CNRS UMR7241, Labex Memolife, Paris, France
- University Pierre et Marie Curie, ED 158, Paris, France
- Hugues Berry
- INRIA, Villeurbanne, France
- LIRIS UMR 5205 CNRS-INSA, University of Lyon, Villeurbanne, France
- Kim T. Blackwell
- The Krasnow Institute for Advanced Studies, George Mason University, Fairfax, VA, United States
- Laurent Venance
- Dynamic and Pathophysiology of Neuronal Networks, Center for Interdisciplinary Research in Biology (CIRB), College de France, INSERM U1050, CNRS UMR7241, Labex Memolife, Paris, France
- University Pierre et Marie Curie, ED 158, Paris, France
137
Diederich N, Bartsch T, Kohlstedt H, Ziegler M. A memristive plasticity model of voltage-based STDP suitable for recurrent bidirectional neural networks in the hippocampus. Sci Rep 2018; 8:9367. [PMID: 29921840 PMCID: PMC6008480 DOI: 10.1038/s41598-018-27616-6] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/04/2017] [Accepted: 06/04/2018] [Indexed: 01/02/2023] Open
Abstract
Memristive systems have gained considerable attention in the field of neuromorphic engineering because they allow the emulation of synaptic functionality in solid-state nano-physical systems. In this study, we show that memristive behavior provides a broad working framework for the phenomenological modelling of cellular synaptic mechanisms. In particular, we seek to understand how closely a memristive system can approach biological realism. The basic characteristics of memristive systems, i.e. voltage and memory behavior, are used to derive a voltage-based plasticity rule. We show that this model is suitable to account for a variety of electrophysiological plasticity data. Furthermore, we incorporate the plasticity model into an all-to-all connecting network scheme. Motivated by the auto-associative CA3 network of the hippocampus, we show that the implemented network allows the discrimination and processing of mnemonic pattern information, i.e. the formation of functional bidirectional connections resulting in local receptive fields. Since the presented plasticity model can be applied to real memristive devices as well, the presented theoretical framework can support both the design of appropriate memristive devices for neuromorphic computing and the development of complex neuromorphic networks that exploit the specific advantages of memristive devices.
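Voltage-based plasticity rules of this general kind have a simple skeleton: the sign of the update follows the postsynaptic voltage relative to two thresholds. The thresholds and amplitudes below are placeholder values chosen for illustration, not the fitted parameters of the memristive model in this paper.

```python
def voltage_based_dw(pre_active, v_post_mv, theta_minus=-65.0,
                     theta_plus=-45.0, a_ltd=1e-4, a_ltp=1e-4):
    """Weight update for one presynaptic event as a function of postsynaptic
    voltage, in the spirit of voltage-based rules: moderate depolarization
    above theta_minus yields LTD, strong depolarization above theta_plus
    yields LTP, and nothing happens without presynaptic activity.
    All parameter values are assumptions."""
    if not pre_active:
        return 0.0
    if v_post_mv > theta_plus:
        return a_ltp * (v_post_mv - theta_plus)
    if v_post_mv > theta_minus:
        return -a_ltd * (v_post_mv - theta_minus)
    return 0.0
```

Because the rule reads only a local voltage and the presynaptic event, it maps naturally onto a two-terminal memristive device, where the applied voltage plays the role of the postsynaptic depolarization.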
Affiliation(s)
- Nick Diederich
- Nanoelektronik, Technische Fakultät, Christian-Albrechts-Universität zu Kiel, D-24143, Kiel, Germany
- Department of Neurology, Memory Disorders and Plasticity Group, University Hospital Schleswig-Holstein, Kiel, Germany
- Thorsten Bartsch
- Department of Neurology, Memory Disorders and Plasticity Group, University Hospital Schleswig-Holstein, Kiel, Germany
- Hermann Kohlstedt
- Nanoelektronik, Technische Fakultät, Christian-Albrechts-Universität zu Kiel, D-24143, Kiel, Germany
- Martin Ziegler
- Nanoelektronik, Technische Fakultät, Christian-Albrechts-Universität zu Kiel, D-24143, Kiel, Germany
138
Bridging structure and function: A model of sequence learning and prediction in primary visual cortex. PLoS Comput Biol 2018; 14:e1006187. [PMID: 29870532 PMCID: PMC6003695 DOI: 10.1371/journal.pcbi.1006187] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/16/2017] [Revised: 06/15/2018] [Accepted: 05/09/2018] [Indexed: 11/29/2022] Open
Abstract
Recent experiments have demonstrated that visual cortex engages in spatio-temporal sequence learning and prediction. The cellular basis of this learning remains unclear, however. Here we present a spiking neural network model that explains a recent study on sequence learning in the primary visual cortex of rats. The model posits that the sequence learning and prediction abilities of cortical circuits result from the interaction of spike-timing dependent plasticity (STDP) and homeostatic plasticity mechanisms. It also reproduces changes in stimulus-evoked multi-unit activity during learning. Furthermore, it makes precise predictions regarding how training shapes network connectivity to establish its prediction ability. Finally, it predicts that the adapted connectivity gives rise to systematic changes in spontaneous network activity. Taken together, our model establishes a new conceptual bridge between the structure and function of cortical circuits in the context of sequence learning and prediction.

A central goal of Neuroscience is to understand the relationship between the structure and function of brain networks. Of particular interest are the circuits of the neocortex, the seat of our highest cognitive abilities. Here we provide a new link between the structure and function of neocortical circuits in the context of sequence learning. We study a spiking neural network model that self-organizes its connectivity and activity via a combination of different plasticity mechanisms known to operate in cortical circuits. We use this model to explain various findings from a recent experimental study on sequence learning and prediction in rat visual cortex. Our model reproduces the changes in activity patterns as the animal learns the sequential pattern of visual stimulation. In addition, the model predicts what stimulation-induced structural changes underlie this sequence learning ability.
Finally, the model also predicts how the adapted network structure influences spontaneous network activity when there is no visual stimulation. Hence, our model provides new insights about the relationship between structure and function of cortical circuits.
139
John RA, Liu F, Chien NA, Kulkarni MR, Zhu C, Fu Q, Basu A, Liu Z, Mathews N. Synergistic Gating of Electro-Iono-Photoactive 2D Chalcogenide Neuristors: Coexistence of Hebbian and Homeostatic Synaptic Metaplasticity. ADVANCED MATERIALS (DEERFIELD BEACH, FLA.) 2018; 30:e1800220. [PMID: 29726076 DOI: 10.1002/adma.201800220] [Citation(s) in RCA: 43] [Impact Index Per Article: 7.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/10/2018] [Revised: 02/25/2018] [Indexed: 05/22/2023]
Abstract
Emulation of brain-like signal processing with thin-film devices can lay the foundation for building artificially intelligent learning circuitry in the future. Encompassing higher functionalities into single artificial neural elements will allow the development of robust neuromorphic circuitry emulating biological adaptation mechanisms with drastically fewer neural elements, mitigating the strict process challenges and high circuit density requirements necessary to match the computational complexity of the human brain. Here, 2D transition metal di-chalcogenide (MoS2) neuristors are designed to mimic intracellular ion endocytosis-exocytosis dynamics/neurotransmitter release in chemical synapses using three approaches: (i) electronic mode: a defect modulation approach where the traps at the semiconductor-dielectric interface are perturbed; (ii) ionotronic mode: where electronic responses are modulated via ionic gating; and (iii) photoactive mode: harnessing persistent photoconductivity or trap-assisted slow recombination mechanisms. Exploiting a novel multigated architecture incorporating electrical and optical biases, this incarnation not only addresses different charge-trapping probabilities to finely modulate the synaptic weights, but also amalgamates neuromodulation schemes to achieve "plasticity of plasticity" (metaplasticity) via dynamic control of Hebbian spike-timing-dependent plasticity and homeostatic regulation. Coexistence of such multiple forms of synaptic plasticity increases the efficacy of memory storage and the processing capacity of artificial neuristors, enabling the design of highly efficient novel neural architectures.
Affiliation(s)
- Rohit Abraham John
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798
- Fucai Liu
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798
- Nguyen Anh Chien
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798
- Mohit R Kulkarni
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798
- Chao Zhu
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798
- Qundong Fu
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798
- Arindam Basu
- School of Electrical and Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798
- Zheng Liu
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798
- Nripan Mathews
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798
- Energy Research Institute @ NTU (ERI@N), Nanyang Technological University, Singapore, 637553
140
Cui Y, Prokin I, Mendes A, Berry H, Venance L. Robustness of STDP to spike timing jitter. Sci Rep 2018; 8:8139. [PMID: 29802357 PMCID: PMC5970212 DOI: 10.1038/s41598-018-26436-y] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/13/2018] [Accepted: 05/09/2018] [Indexed: 01/26/2023] Open
Abstract
In Hebbian plasticity, neural circuits adjust their synaptic weights depending on patterned firing. Spike-timing-dependent plasticity (STDP), a synaptic Hebbian learning rule, relies on the order and timing of the paired activities in pre- and postsynaptic neurons. Classically, in ex vivo experiments, STDP is assessed with deterministic (constant) spike timings and time intervals between successive pairings, thus exhibiting a regularity that differs from biological variability. Hence, the emergence of STDP from noisy inputs, as occurs with in vivo-like firing, remains unresolved. Here, we used noisy STDP pairings in which the spike timing and/or the interval between pairings was jittered. Using electrophysiology and mathematical modeling, we explored the impact of jitter on three forms of STDP at corticostriatal synapses: NMDAR-LTP, endocannabinoid-LTD, and endocannabinoid-LTP. We found that NMDAR-LTP was highly fragile to jitter, whereas endocannabinoid plasticity appeared more resistant. When the frequency or number of pairings was increased, NMDAR-LTP became more robust and could be expressed despite strong jittering. Our results identify endocannabinoid plasticity as a robust form of STDP, whereas the sensitivity of NMDAR-LTP to jitter varies with activity frequency. This provides new insights into the mechanisms at play during the different phases of learning and memory and into the emergence of Hebbian plasticity under in vivo-like activity.
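The pair-based STDP kernel that such jitter protocols perturb can be sketched as follows. This is the generic textbook exponential window with assumed amplitudes and time constant, not the corticostriatal models fitted in the paper:

```python
import math
import random

def stdp_weight_change(delta_t, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Classic pair-based STDP window: potentiation when the presynaptic
    spike precedes the postsynaptic spike (delta_t > 0), depression otherwise."""
    if delta_t > 0:
        return a_plus * math.exp(-delta_t / tau)
    return -a_minus * math.exp(delta_t / tau)

def total_change_with_jitter(n_pairings=100, delta_t=10.0, jitter_sd=0.0, seed=0):
    """Accumulate weight changes over repeated pairings whose spike timing
    is jittered by Gaussian noise, as in a noisy-pairing protocol."""
    rng = random.Random(seed)
    return sum(stdp_weight_change(delta_t + rng.gauss(0.0, jitter_sd))
               for _ in range(n_pairings))

# With no jitter, a +10 ms pairing repeated 100 times gives reliable
# potentiation; strong jitter lets some pairings fall on the depression
# side of the window, eroding the net change.
clean = total_change_with_jitter(jitter_sd=0.0)
noisy = total_change_with_jitter(jitter_sd=15.0)
```

Under this toy kernel the jitter robustness studied in the paper reduces to how the signed window integrates over the jitter distribution.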
Affiliation(s)
- Yihui Cui
- Dynamics and Pathophysiology of Neuronal Networks Team, Center for Interdisciplinary Research in Biology (CIRB), College de France, CNRS, INSERM, PSL Research University, Paris, France
- Ilya Prokin
- INRIA, Villeurbanne, France; University of Lyon, LIRIS UMR5205, Villeurbanne, France
- Alexandre Mendes
- Dynamics and Pathophysiology of Neuronal Networks Team, Center for Interdisciplinary Research in Biology (CIRB), College de France, CNRS, INSERM, PSL Research University, Paris, France
- Hugues Berry
- INRIA, Villeurbanne, France; University of Lyon, LIRIS UMR5205, Villeurbanne, France
- Laurent Venance
- Dynamics and Pathophysiology of Neuronal Networks Team, Center for Interdisciplinary Research in Biology (CIRB), College de France, CNRS, INSERM, PSL Research University, Paris, France
141
Sollini J, Chapuis GA, Clopath C, Chadderton P. ON-OFF receptive fields in auditory cortex diverge during development and contribute to directional sweep selectivity. Nat Commun 2018; 9:2084. [PMID: 29802383 PMCID: PMC5970219 DOI: 10.1038/s41467-018-04548-3] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/19/2017] [Accepted: 05/03/2018] [Indexed: 01/06/2023] Open
Abstract
Neurons in the auditory cortex exhibit distinct frequency tuning to the onset and offset of sounds, but the cause and significance of ON and OFF receptive field (RF) organisation are not understood. Here we demonstrate that distinct ON and OFF frequency tuning is largely absent in immature mouse auditory cortex and is thus a consequence of cortical development. Simulations using a novel implementation of a standard Hebbian plasticity model show that the natural alternation of sound onset and offset is sufficient for the formation of non-overlapping adjacent ON and OFF RFs in cortical neurons. Our model predicts that ON/OFF RF arrangement contributes towards direction selectivity to frequency-modulated tone sweeps, which we confirm by neuronal recordings. These data reveal that a simple and universally accepted learning rule can explain the organisation of ON and OFF RFs and direction selectivity in the developing auditory cortex. Auditory cortex neurons exhibit distinct frequency tuning to sound onset and offset. Here the authors demonstrate that during development ON-OFF receptive fields diverge to occupy adjacent frequency ranges that may underlie their direction selective responses to frequency modulated sweeps.
Affiliation(s)
- Joseph Sollini
- Department of Bioengineering, Imperial College London, London, SW7 2AZ, United Kingdom
- Gaëlle A Chapuis
- Department of Bioengineering, Imperial College London, London, SW7 2AZ, United Kingdom
- Claudia Clopath
- Department of Bioengineering, Imperial College London, London, SW7 2AZ, United Kingdom
- Paul Chadderton
- Department of Bioengineering, Imperial College London, London, SW7 2AZ, United Kingdom; School of Physiology, Pharmacology and Neuroscience, Biomedical Sciences Building, University of Bristol, University Walk, Bristol, BS8 1TD, United Kingdom
142
González-Rueda A, Pedrosa V, Feord RC, Clopath C, Paulsen O. Activity-Dependent Downscaling of Subthreshold Synaptic Inputs during Slow-Wave-Sleep-like Activity In Vivo. Neuron 2018; 97:1244-1252.e5. [PMID: 29503184 PMCID: PMC5873548 DOI: 10.1016/j.neuron.2018.01.047] [Citation(s) in RCA: 69] [Impact Index Per Article: 11.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/14/2017] [Revised: 12/19/2017] [Accepted: 01/26/2018] [Indexed: 01/13/2023]
Abstract
Activity-dependent synaptic plasticity is critical for cortical circuit refinement. The synaptic homeostasis hypothesis suggests that synaptic connections are strengthened during wake and downscaled during sleep; however, it is not obvious how the same plasticity rules could explain both outcomes. Using whole-cell recordings and optogenetic stimulation of presynaptic input in urethane-anesthetized mice, which exhibit slow-wave-sleep (SWS)-like activity, we show that synaptic plasticity rules are gated by cortical dynamics in vivo. While Down states support conventional spike timing-dependent plasticity, Up states are biased toward depression such that presynaptic stimulation alone leads to synaptic depression, while connections contributing to postsynaptic spiking are protected against this synaptic weakening. We find that this novel activity-dependent and input-specific downscaling mechanism has two important computational advantages: (1) improved signal-to-noise ratio, and (2) preservation of previously stored information. Thus, these synaptic plasticity rules provide an attractive mechanism for SWS-related synaptic downscaling and circuit refinement.
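The state-gated rule described above can be written as a toy function. The state labels, amplitudes, and the "protection" flag are illustrative assumptions, not the authors' fitted model:

```python
def sws_plasticity(w, state, delta_t, post_spiked, a_ltp=0.01, a_ltd=0.008):
    """Sketch of cortical-state-gated plasticity: Down states support
    conventional timing-dependent potentiation/depression, while Up states
    bias toward depression, except for inputs that drive a postsynaptic
    spike, which are protected from the downscaling."""
    if state == "down":
        # Conventional STDP: pre-before-post potentiates, post-before-pre depresses.
        return w + (a_ltp if delta_t > 0 else -a_ltd)
    # Up state: presynaptic stimulation alone leads to depression ...
    if post_spiked:
        return w  # ... but contributing inputs are spared.
    return w - a_ltd
```

Iterated over SWS-like activity, such a rule downscales unused inputs while preserving those that still drive spikes, which is the signal-to-noise argument made in the abstract.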
Affiliation(s)
- Ana González-Rueda
- Department of Physiology, Development and Neuroscience, University of Cambridge, Cambridge, CB2 3EG, UK; Neurobiology Division, Medical Research Council (MRC) Laboratory of Molecular Biology, Cambridge, CB2 0QH, UK.
- Victor Pedrosa
- Department of Bioengineering, Imperial College London, London, SW7 2AZ, UK; CAPES Foundation, Ministry of Education of Brazil, Brasilia, 70040-020, Brazil
- Rachael C Feord
- Department of Physiology, Development and Neuroscience, University of Cambridge, Cambridge, CB2 3EG, UK
- Claudia Clopath
- Department of Bioengineering, Imperial College London, London, SW7 2AZ, UK
- Ole Paulsen
- Department of Physiology, Development and Neuroscience, University of Cambridge, Cambridge, CB2 3EG, UK
143
Weissenberger F, Gauy MM, Lengler J, Meier F, Steger A. Voltage dependence of synaptic plasticity is essential for rate based learning with short stimuli. Sci Rep 2018; 8:4609. [PMID: 29545553 PMCID: PMC5854671 DOI: 10.1038/s41598-018-22781-0] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2018] [Accepted: 02/28/2018] [Indexed: 11/09/2022] Open
Abstract
In computational neuroscience, synaptic plasticity rules are often formulated in terms of firing rates. The predominant description of in vivo neuronal activity, however, is the instantaneous rate (or spiking probability). In this article, we resolve this discrepancy by showing that fluctuations of the membrane potential carry enough information to permit a precise estimate of the instantaneous rate in balanced networks. As a consequence, we find that rate-based plasticity rules are not restricted to neuronal activity that is stable for hundreds of milliseconds to seconds, but can be carried over to situations in which it changes every few milliseconds. We illustrate this by showing that a voltage-dependent realization of the classical BCM rule achieves input selectivity even if stimulus duration is reduced to a few milliseconds each.
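A minimal rate-based BCM sketch is shown below, with assumed parameter values. In the voltage-dependent reading of the paper, the postsynaptic rate term would be replaced by an estimate read out from membrane-potential fluctuations:

```python
def bcm_update(w, pre_rate, post_rate, theta, eta=1e-3):
    """Classical BCM rule: the synapse potentiates when postsynaptic
    activity exceeds the sliding threshold theta and depresses below it."""
    return w + eta * pre_rate * post_rate * (post_rate - theta)

def update_threshold(theta, post_rate, tau=100.0):
    """Sliding threshold: a running average of the squared postsynaptic
    rate, which stabilises the rule against runaway potentiation."""
    return theta + (post_rate ** 2 - theta) / tau
```

Alternating these two updates over brief stimuli is the regime the paper examines: selectivity can still emerge when each stimulus lasts only a few milliseconds, provided the instantaneous-rate estimate is accurate.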
Affiliation(s)
- Felix Weissenberger
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092, Zürich, Switzerland.
- Marcelo Matheus Gauy
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092, Zürich, Switzerland
- Johannes Lengler
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092, Zürich, Switzerland
- Florian Meier
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092, Zürich, Switzerland
- Angelika Steger
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092, Zürich, Switzerland
144
Mikaitis M, Pineda García G, Knight JC, Furber SB. Neuromodulated Synaptic Plasticity on the SpiNNaker Neuromorphic System. Front Neurosci 2018. [PMID: 29535600 PMCID: PMC5835099 DOI: 10.3389/fnins.2018.00105] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/04/2022] Open
Abstract
SpiNNaker is a digital neuromorphic architecture, designed specifically for the low-power simulation of large-scale spiking neural networks at speeds close to biological real-time. Unlike other neuromorphic systems, SpiNNaker allows users to develop their own neuron and synapse models as well as specify arbitrary connectivity. As a result, SpiNNaker has proved to be a powerful tool for studying different neuron models as well as synaptic plasticity, believed to be one of the main mechanisms behind learning and memory in the brain. A number of Spike-Timing-Dependent Plasticity (STDP) rules have already been implemented on SpiNNaker and have been shown to be capable of solving various learning tasks in real-time. However, while STDP is an important biological theory of learning, it is a form of Hebbian or unsupervised learning and therefore does not explain behaviors that depend on feedback from the environment. Instead, learning rules based on neuromodulated STDP (three-factor learning rules) have been shown to be capable of solving reinforcement learning tasks in a biologically plausible manner. In this paper we demonstrate for the first time how a model of three-factor STDP, with the third factor representing spikes from dopaminergic neurons, can be implemented on the SpiNNaker neuromorphic system. Using this learning rule we first show how reward and punishment signals can be delivered to a single synapse before going on to demonstrate it in a larger network which solves the credit assignment problem in a Pavlovian conditioning experiment. Because of its extra complexity, we find that our three-factor learning rule requires approximately 2× as much processing time as the existing SpiNNaker STDP learning rules. However, we show that it is still possible to run our Pavlovian conditioning model with up to 1 × 10⁴ neurons in real-time, opening up new research opportunities for modeling behavioral learning on SpiNNaker.
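The eligibility-trace formulation underlying such three-factor rules can be sketched in a few lines. The function below is an illustrative reduction with assumed amplitudes and time constants, not the SpiNNaker implementation itself:

```python
import math

def stdp_window(delta_t, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP kernel whose value is written into the eligibility trace."""
    return (a_plus * math.exp(-delta_t / tau) if delta_t > 0
            else -a_minus * math.exp(delta_t / tau))

def three_factor_weight(pairing_times, delta_t, dopamine_times, tau_c=200.0, w0=0.5):
    """Neuromodulated STDP sketch: each pre/post pairing at time t adds
    stdp_window(delta_t) to an eligibility trace that decays with time
    constant tau_c; a dopamine pulse at time td converts whatever trace
    remains at td into an actual weight change. No dopamine, no learning."""
    w = w0
    for td in dopamine_times:
        trace = sum(stdp_window(delta_t) * math.exp(-(td - t) / tau_c)
                    for t in pairing_times if t <= td)
        w += trace
    return w

# A reward arriving soon after a pre-before-post pairing potentiates more
# than a late reward, because the eligibility trace has decayed by then.
early = three_factor_weight(pairing_times=[0.0], delta_t=10.0, dopamine_times=[50.0])
late = three_factor_weight(pairing_times=[0.0], delta_t=10.0, dopamine_times=[800.0])
```

The gap between `early` and `late` is the credit-assignment window: pairings remain eligible for reinforcement only as long as their trace survives.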
Affiliation(s)
- Mantas Mikaitis
- Advanced Processor Technologies, Faculty of Science and Engineering, School of Computer Science, University of Manchester, Manchester, United Kingdom
- Garibaldi Pineda García
- Advanced Processor Technologies, Faculty of Science and Engineering, School of Computer Science, University of Manchester, Manchester, United Kingdom
- James C Knight
- Centre for Computational Neuroscience and Robotics, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom
- Steve B Furber
- Advanced Processor Technologies, Faculty of Science and Engineering, School of Computer Science, University of Manchester, Manchester, United Kingdom
145
Min B, Zhou D, Cai D. Effects of Firing Variability on Network Structures with Spike-Timing-Dependent Plasticity. Front Comput Neurosci 2018; 12:1. [PMID: 29410621 PMCID: PMC5787127 DOI: 10.3389/fncom.2018.00001] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/17/2017] [Accepted: 01/03/2018] [Indexed: 11/17/2022] Open
Abstract
Synaptic plasticity is believed to be the biological substrate underlying learning and memory. One of the most widespread forms of synaptic plasticity, spike-timing-dependent plasticity (STDP), uses the spike timing information of presynaptic and postsynaptic neurons to induce synaptic potentiation or depression. An open question is how STDP organizes the connectivity patterns in neuronal circuits. Previous studies have placed much emphasis on the role of firing rate in shaping connectivity patterns. Here, we go beyond the firing rate description to develop a self-consistent linear response theory that incorporates the information of both firing rate and firing variability. By decomposing the pairwise spike correlation into one component associated with local direct connections and the other associated with indirect connections, we identify two distinct regimes regarding the network structures learned through STDP. In one regime, the contribution of the direct-connection correlations dominates over that of the indirect-connection correlations in the learning dynamics; this gives rise to a network structure consistent with the firing rate description. In the other regime, the contribution of the indirect-connection correlations dominates in the learning dynamics, leading to a network structure different from the firing rate description. We demonstrate that the heterogeneity of firing variability across neuronal populations induces a temporally asymmetric structure of indirect-connection correlations. This temporally asymmetric structure underlies the emergence of the second regime. Our study provides a new perspective that emphasizes the role of high-order statistics of spiking activity in the spike-correlation-sensitive learning dynamics.
Affiliation(s)
- Bin Min
- Center for Neural Science, Courant Institute of Mathematical Sciences, New York University, New York, NY, United States; NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
- Douglas Zhou
- School of Mathematical Sciences, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
- David Cai
- Center for Neural Science, Courant Institute of Mathematical Sciences, New York University, New York, NY, United States; NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates; School of Mathematical Sciences, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
146
Olde Scheper TV, Meredith RM, Mansvelder HD, van Pelt J, van Ooyen A. Dynamic Hebbian Cross-Correlation Learning Resolves the Spike Timing Dependent Plasticity Conundrum. Front Comput Neurosci 2018; 11:119. [PMID: 29375358 PMCID: PMC5768644 DOI: 10.3389/fncom.2017.00119] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/05/2016] [Accepted: 12/22/2017] [Indexed: 11/13/2022] Open
Abstract
Spike Timing-Dependent Plasticity has been found to assume many different forms. The classic STDP curve, with one potentiating and one depressing window, is only one of many possible curves that describe synaptic learning using the STDP mechanism. It has been shown experimentally that STDP curves may contain multiple LTP and LTD windows of variable width, and even inverted windows. The underlying STDP mechanism that is capable of producing such an extensive, and apparently incompatible, range of learning curves is still under investigation. In this paper, it is shown that STDP originates from a combination of two dynamic Hebbian cross-correlations of local activity at the synapse. The correlation of the presynaptic activity with the local postsynaptic activity is a robust and reliable indicator of the discrepancy between the presynaptic neuron and the postsynaptic neuron's activity. The second correlation is between the local postsynaptic activity with dendritic activity which is a good indicator of matching local synaptic and dendritic activity. We show that this simple time-independent learning rule can give rise to many forms of the STDP learning curve. The rule regulates synaptic strength without the need for spike matching or other supervisory learning mechanisms. Local differences in dendritic activity at the synapse greatly affect the cross-correlation difference which determines the relative contributions of different neural activity sources. Dendritic activity due to nearby synapses, action potentials, both forward and back-propagating, as well as inhibitory synapses will dynamically modify the local activity at the synapse, and the resulting STDP learning rule. The dynamic Hebbian learning rule ensures furthermore, that the resulting synaptic strength is dynamically stable, and that interactions between synapses do not result in local instabilities. 
The rule clearly demonstrates that synapses function as independent localized computational entities, each contributing to the global activity, not in a simply linear fashion, but in a manner that is appropriate to achieve local and global stability of the neuron and the entire dendritic structure.
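The correlation computations at the heart of such a rule can be illustrated with a plain zero-lag cross-correlation of activity traces. The two-correlation update below is an assumed toy form for illustration; the paper's rule uses dynamic, locally filtered activities rather than raw sample traces:

```python
def zero_lag_correlation(x, y):
    """Zero-lag cross-correlation (covariance) of two equal-length
    activity traces sampled at the synapse."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / n

def weight_update(w, pre, post_local, dendrite, eta=0.1):
    """Toy two-correlation Hebbian update (assumed form): the pre/local-post
    correlation signals the discrepancy the rule corrects, gated by how well
    the local postsynaptic activity matches the dendritic activity."""
    drive = zero_lag_correlation(pre, post_local)
    match = zero_lag_correlation(post_local, dendrite)
    return w + eta * drive * match
```

Because the update depends only on correlations of continuous local activities, no explicit spike matching or supervisory signal is required, which is the point the abstract makes.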
Affiliation(s)
- Tjeerd V Olde Scheper
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Vrije Universiteit Amsterdam, Amsterdam, Netherlands; Department of Computing and Communication Technologies, Faculty of Technology, Design and Environment, Oxford Brookes University, Oxford, United Kingdom
- Rhiannon M Meredith
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Huibert D Mansvelder
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Jaap van Pelt
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Arjen van Ooyen
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
147
Muir DR, Molina-Luna P, Roth MM, Helmchen F, Kampa BM. Specific excitatory connectivity for feature integration in mouse primary visual cortex. PLoS Comput Biol 2017; 13:e1005888. [PMID: 29240769 PMCID: PMC5746254 DOI: 10.1371/journal.pcbi.1005888] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2017] [Revised: 12/28/2017] [Accepted: 11/23/2017] [Indexed: 11/21/2022] Open
Abstract
Local excitatory connections in mouse primary visual cortex (V1) are stronger and more prevalent between neurons that share similar functional response features. However, the details of how functional rules for local connectivity shape neuronal responses in V1 remain unknown. We hypothesised that complex responses to visual stimuli may arise as a consequence of rules for selective excitatory connectivity within the local network in the superficial layers of mouse V1. In mouse V1 many neurons respond to overlapping grating stimuli (plaid stimuli) with highly selective and facilitatory responses, which are not simply predicted by responses to single gratings presented alone. This complexity is surprising, since excitatory neurons in V1 are considered to be mainly tuned to single preferred orientations. Here we examined the consequences for visual processing of two alternative connectivity schemes: in the first case, local connections are aligned with visual properties inherited from feedforward input (a 'like-to-like' scheme specifically connecting neurons that share similar preferred orientations); in the second case, local connections group neurons into excitatory subnetworks that combine and amplify multiple feedforward visual properties (a 'feature binding' scheme). By comparing predictions from large scale computational models with in vivo recordings of visual representations in mouse V1, we found that responses to plaid stimuli were best explained by assuming feature binding connectivity. Unlike under the like-to-like scheme, selective amplification within feature-binding excitatory subnetworks replicated experimentally observed facilitatory responses to plaid stimuli; explained selective plaid responses not predicted by grating selectivity; and was consistent with broad anatomical selectivity observed in mouse V1. 
Our results show that visual feature binding can occur through local recurrent mechanisms without requiring feedforward convergence, and that such a mechanism is consistent with visual responses and cortical anatomy in mouse V1.
Affiliation(s)
- Dylan R. Muir
- Biozentrum, University of Basel, Basel, Switzerland
- Laboratory of Neural Circuit Dynamics, Brain Research Institute, University of Zurich, Zurich, Switzerland
- Patricia Molina-Luna
- Laboratory of Neural Circuit Dynamics, Brain Research Institute, University of Zurich, Zurich, Switzerland
- Morgane M. Roth
- Biozentrum, University of Basel, Basel, Switzerland
- Laboratory of Neural Circuit Dynamics, Brain Research Institute, University of Zurich, Zurich, Switzerland
- Fritjof Helmchen
- Laboratory of Neural Circuit Dynamics, Brain Research Institute, University of Zurich, Zurich, Switzerland
- Björn M. Kampa
- Laboratory of Neural Circuit Dynamics, Brain Research Institute, University of Zurich, Zurich, Switzerland
- Department of Neurophysiology, Institute of Biology 2, RWTH Aachen University, Aachen, Germany
- JARA-BRAIN, Aachen, Germany
148
Zenke F, Gerstner W. Hebbian plasticity requires compensatory processes on multiple timescales. Philos Trans R Soc Lond B Biol Sci 2017; 372:rstb.2016.0259. [PMID: 28093557 PMCID: PMC5247595 DOI: 10.1098/rstb.2016.0259] [Citation(s) in RCA: 94] [Impact Index Per Article: 13.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/09/2016] [Indexed: 01/19/2023] Open
Abstract
We review a body of theoretical and experimental research on Hebbian and homeostatic plasticity, starting from a puzzling observation: while homeostasis of synapses found in experiments is a slow compensatory process, most mathematical models of synaptic plasticity use rapid compensatory processes (RCPs). Even worse, with the slow homeostatic plasticity reported in experiments, simulations of existing plasticity models cannot maintain network stability unless further control mechanisms are implemented. To solve this paradox, we suggest that in addition to slow forms of homeostatic plasticity there are RCPs which stabilize synaptic plasticity on short timescales. These rapid processes may include heterosynaptic depression triggered by episodes of high postsynaptic firing rate. While slower forms of homeostatic plasticity are not sufficient to stabilize Hebbian plasticity, they are important for fine-tuning neural circuits. Taken together we suggest that learning and memory rely on an intricate interplay of diverse plasticity mechanisms on different timescales which jointly ensure stability and plasticity of neural circuits. This article is part of the themed issue 'Integrating Hebbian and homeostatic plasticity'.
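A toy version of this interplay pairs a Hebbian term with a rapid heterosynaptic term that engages only during episodes of high postsynaptic firing. The threshold and rate constants below are assumptions chosen so the fast term can actually win:

```python
def plasticity_step(weights, pre_rates, post_rate, eta=1e-3, beta=0.01, theta=10.0):
    """One update of a Hebbian rule plus a rapid compensatory process (RCP)
    sketch: all synapses are depressed heterosynaptically in proportion to
    how far the postsynaptic rate exceeds theta, stabilising runaway
    potentiation on a fast timescale."""
    burst = max(0.0, post_rate - theta)  # active only during high-rate episodes
    return [w + eta * pre * post_rate - beta * burst * w
            for w, pre in zip(weights, pre_rates)]
```

At moderate postsynaptic rates the Hebbian term dominates and weights grow; once the rate crosses the threshold, the compensatory term overwhelms it and pulls weights back down, which is the short-timescale stabilisation the review argues for.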
Affiliation(s)
- Friedemann Zenke
- Department of Applied Physics, Stanford University, Stanford, CA 94305, USA
- Wulfram Gerstner
- Brain Mind Institute, School of Life Sciences and School of Computer and Communication Sciences, Ecole Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
149
Costa RP, Mizusaki BEP, Sjöström PJ, van Rossum MCW. Functional consequences of pre- and postsynaptic expression of synaptic plasticity. Philos Trans R Soc Lond B Biol Sci 2017; 372:rstb.2016.0153. [PMID: 28093547 PMCID: PMC5247585 DOI: 10.1098/rstb.2016.0153] [Citation(s) in RCA: 37] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/02/2016] [Indexed: 01/23/2023] Open
Abstract
Growing experimental evidence shows that both homeostatic and Hebbian synaptic plasticity can be expressed presynaptically as well as postsynaptically. In this review, we start by discussing this evidence and methods used to determine expression loci. Next, we discuss the functional consequences of this diversity in pre- and postsynaptic expression of both homeostatic and Hebbian synaptic plasticity. In particular, we explore the functional consequences of a biologically tuned model of pre- and postsynaptically expressed spike-timing-dependent plasticity complemented with postsynaptic homeostatic control. The pre- and postsynaptic expression in this model predicts (i) more reliable receptive fields and sensory perception, (ii) rapid recovery of forgotten information (memory savings), and (iii) reduced response latencies, compared with a model with postsynaptic expression only. Finally, we discuss open questions that will require a considerable research effort to better elucidate how the specific locus of expression of homeostatic and Hebbian plasticity alters synaptic and network computations. This article is part of the themed issue 'Integrating Hebbian and homeostatic plasticity'.
Affiliation(s)
- Rui Ponte Costa
- Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh, UK; Centre for Neural Circuits and Behaviour, University of Oxford, Oxford, UK
- Beatriz E P Mizusaki
- Instituto de Física, Universidade Federal do Rio Grande do Sul, Porto Alegre, Brazil; Centre for Research in Neuroscience, Department of Neurology and Neurosurgery, Program for Brain Repair and Integrative Neuroscience, The Research Institute of the McGill University Health Centre, McGill University, Montreal, Quebec, Canada
- P Jesper Sjöström
- Centre for Research in Neuroscience, Department of Neurology and Neurosurgery, Program for Brain Repair and Integrative Neuroscience, The Research Institute of the McGill University Health Centre, McGill University, Montreal, Quebec, Canada
- Mark C W van Rossum
- Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh, UK
150
Clopath C, Bonhoeffer T, Hübener M, Rose T. Variance and invariance of neuronal long-term representations. Philos Trans R Soc Lond B Biol Sci 2017; 372:rstb.2016.0161. [PMID: 28093555 PMCID: PMC5247593 DOI: 10.1098/rstb.2016.0161] [Citation(s) in RCA: 67] [Impact Index Per Article: 9.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 10/01/2016] [Indexed: 12/13/2022] Open
Abstract
The brain extracts behaviourally relevant sensory input to produce appropriate motor output. On the one hand, our constantly changing environment requires this transformation to be plastic. On the other hand, plasticity is thought to be balanced by mechanisms ensuring constancy of neuronal representations in order to achieve stable behavioural performance. Yet, prominent changes in synaptic strength and connectivity also occur during normal sensory experience, indicating a certain degree of constitutive plasticity. This raises the question of how stable neuronal representations are on the population level and also on the single neuron level. Here, we review recent data from longitudinal electrophysiological and optical recordings of single-cell activity that assess the long-term stability of neuronal stimulus selectivities under conditions of constant sensory experience, during learning, and after reversible modification of sensory input. The emerging picture is that neuronal representations are stabilized by behavioural relevance and that the degree of long-term tuning stability and perturbation resistance directly relates to the functional role of the respective neurons, cell types and circuits. Using a 'toy' model, we show that stable baseline representations and precise recovery from perturbations in visual cortex could arise from a 'backbone' of strong recurrent connectivity between similarly tuned cells together with a small number of 'anchor' neurons exempt from plastic changes. This article is part of the themed issue 'Integrating Hebbian and homeostatic plasticity'.
Affiliation(s)
- Claudia Clopath
- Bioengineering Department, Imperial College London, South Kensington Campus, London SW7 2AZ, UK
- Tobias Bonhoeffer
- Max Planck Institute of Neurobiology, Am Klopferspitz 18, 82152 Martinsried, Germany
- Mark Hübener
- Max Planck Institute of Neurobiology, Am Klopferspitz 18, 82152 Martinsried, Germany
- Tobias Rose
- Max Planck Institute of Neurobiology, Am Klopferspitz 18, 82152 Martinsried, Germany