1. de Brito CSN, Gerstner W. Learning what matters: Synaptic plasticity with invariance to second-order input correlations. PLoS Comput Biol 2024; 20:e1011844. PMID: 38346073; PMCID: PMC10890752; DOI: 10.1371/journal.pcbi.1011844.
Abstract
Cortical populations of neurons develop sparse representations adapted to the statistics of the environment. To learn efficient population codes, synaptic plasticity mechanisms must differentiate relevant latent features from spurious input correlations, which are omnipresent in cortical networks. Here, we develop a theory for sparse coding and synaptic plasticity that is invariant to second-order correlations in the input. Going beyond classical Hebbian learning, our learning objective explains the functional form of observed excitatory plasticity mechanisms, showing how Hebbian long-term depression (LTD) cancels the sensitivity to second-order correlations so that receptive fields become aligned with features hidden in higher-order statistics. Invariance to second-order correlations enhances the versatility of biologically realistic learning models, supporting optimal decoding from noisy inputs and sparse population coding from spatially correlated stimuli. In a spiking model with triplet spike-timing-dependent plasticity (STDP), we show that individual neurons can learn localized oriented receptive fields, circumventing the need for input preprocessing, such as whitening, or population-level lateral inhibition. The theory advances our understanding of local unsupervised learning in cortical circuits, offers new interpretations of the Bienenstock-Cooper-Munro and triplet STDP models, and assigns a specific functional role to synaptic LTD mechanisms in pyramidal neurons.
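The contrast with classical Hebbian learning can be made concrete with a short simulation (a generic Oja-rule sketch, not the paper's model): plain Hebbian learning with multiplicative normalization is driven entirely by second-order input correlations and converges to the leading eigenvector of the input covariance, which is exactly the sensitivity that the proposed LTD term is designed to cancel.

```python
import numpy as np

# Generic Oja-rule sketch (not the paper's learning rule): classical
# Hebbian learning with multiplicative normalization converges to the
# leading eigenvector of the input covariance, i.e. it is dominated by
# second-order correlations.
rng = np.random.default_rng(0)

# Inputs with a dominant second-order correlation along direction u.
u = np.array([3.0, 1.0]) / np.sqrt(10.0)
x = rng.standard_normal((20000, 2))
x += 2.0 * np.outer(x @ u, u)            # amplify variance along u

w = rng.standard_normal(2)
eta = 1e-3
for xi in x:
    y = w @ xi                            # linear postsynaptic activity
    w += eta * y * (xi - y * w)           # Oja's rule: Hebb plus decay

# w aligns (up to sign) with the top eigenvector of the covariance.
top = np.linalg.eigh(np.cov(x.T))[1][:, -1]
alignment = float(abs(w @ top) / np.linalg.norm(w))
```

A rule that is invariant to second-order correlations, as proposed here, would instead align with directions defined by higher-order statistics.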
Affiliation(s)
- Carlos Stein Naves de Brito
- École Polytechnique Fédérale de Lausanne, EPFL, Lausanne, Switzerland
- Champalimaud Research, Champalimaud Centre for the Unknown, Lisbon, Portugal
- Wulfram Gerstner
- École Polytechnique Fédérale de Lausanne, EPFL, Lausanne, Switzerland
2. Duchet B, Bick C, Byrne Á. Mean-Field Approximations With Adaptive Coupling for Networks With Spike-Timing-Dependent Plasticity. Neural Comput 2023; 35:1481-1528. PMID: 37437202; PMCID: PMC10422128; DOI: 10.1162/neco_a_01601.
Abstract
Understanding the effect of spike-timing-dependent plasticity (STDP) is key to elucidating how neural networks change over long timescales and to designing interventions aimed at modulating such networks in neurological disorders. However, progress is restricted by the significant computational cost of simulating neural network models with STDP and by the lack of low-dimensional descriptions that could provide analytical insights. Phase-difference-dependent plasticity (PDDP) rules approximate STDP in phase oscillator networks by prescribing synaptic changes based on the phase differences of neuron pairs rather than differences in spike timing. Here we construct mean-field approximations for phase oscillator networks with STDP to describe part of the phase space of this very high-dimensional system. We first show that single-harmonic PDDP rules can approximate a simple form of symmetric STDP, while multiharmonic rules are required to accurately approximate causal STDP. We then derive exact expressions for the evolution of the average PDDP coupling weight in terms of network synchrony. For adaptive networks of Kuramoto oscillators that form clusters, we formulate a family of low-dimensional descriptions based on the mean-field dynamics of each cluster and the average coupling weights between and within clusters. Finally, we show that such a two-cluster mean-field model can be fitted to synthetic data to provide a low-dimensional approximation of a full adaptive network with symmetric STDP. Our framework represents a step toward a low-dimensional description of adaptive networks with STDP, and could, for example, inform the development of new therapies aimed at maximizing the long-lasting effects of brain stimulation.
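The adaptive-coupling setup can be illustrated with a toy simulation (illustrative parameters, not the paper's fitted model): a small Kuramoto network under a single-harmonic, symmetric PDDP rule in which each coupling weight relaxes toward the cosine of the phase difference, so near-synchronous pairs potentiate and anti-phase pairs depress.

```python
import numpy as np

# Toy adaptive Kuramoto network with a single-harmonic symmetric PDDP
# rule: dk_ij/dt = eps * (cos(theta_i - theta_j) - k_ij).
# Parameters are illustrative, not taken from the paper.
rng = np.random.default_rng(1)
N, dt, steps = 10, 0.01, 4000
omega = rng.normal(0.0, 0.1, N)           # similar natural frequencies
theta = rng.uniform(0.0, 2 * np.pi, N)
k = np.full((N, N), 0.5)                  # adaptive coupling weights
np.fill_diagonal(k, 0.0)
eps = 0.02                                # slow plasticity timescale

for _ in range(steps):
    diff = theta[None, :] - theta[:, None]             # theta_j - theta_i
    theta = theta + dt * (omega + (k * np.sin(diff)).mean(axis=1))
    k = k + dt * eps * (np.cos(diff) - k)              # single-harmonic PDDP
    np.fill_diagonal(k, 0.0)

r = float(abs(np.exp(1j * theta).mean()))  # Kuramoto order parameter in [0, 1]
mean_k = float(k[~np.eye(N, dtype=bool)].mean())
```

With similar natural frequencies the network synchronizes, and the average coupling weight grows with synchrony, in line with the paper's exact expressions relating mean PDDP weight to the order parameter.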
Affiliation(s)
- Benoit Duchet
- Nuffield Department of Clinical Neuroscience, University of Oxford, Oxford OX3 9DU, U.K.
- MRC Brain Network Dynamics Unit, University of Oxford, Oxford OX1 3TH, U.K.
- Christian Bick
- Department of Mathematics, Vrije Universiteit Amsterdam, Amsterdam 1081 HV, the Netherlands
- Amsterdam Neuroscience-Systems and Network Neuroscience, Amsterdam 1081 HV, the Netherlands
- Mathematical Institute, University of Oxford, Oxford OX2 6GG, U.K.
- Áine Byrne
- School of Mathematics and Statistics, University College Dublin, Dublin D04 V1W8, Ireland
3. Fernandez-Ruiz A, Sirota A, Lopes-Dos-Santos V, Dupret D. Over and above frequency: Gamma oscillations as units of neural circuit operations. Neuron 2023; 111:936-953. PMID: 37023717; PMCID: PMC7614431; DOI: 10.1016/j.neuron.2023.02.026.
Abstract
Gamma oscillations (∼30-150 Hz) are widespread correlates of neural circuit functions. These network activity patterns have been described across multiple animal species, brain structures, and behaviors, and are usually identified based on their spectral peak frequency. Yet, despite intensive investigation, whether gamma oscillations implement causal mechanisms of specific brain functions or represent a general dynamic mode of neural circuit operation remains unclear. In this perspective, we review recent advances in the study of gamma oscillations toward a deeper understanding of their cellular mechanisms, neural pathways, and functional roles. We discuss that a given gamma rhythm does not per se implement any specific cognitive function but rather constitutes an activity motif reporting the cellular substrates, communication channels, and computational operations underlying information processing in its generating brain circuit. Accordingly, we propose shifting the attention from a frequency-based to a circuit-level definition of gamma oscillations.
Affiliation(s)
- Anton Sirota
- Bernstein Center for Computational Neuroscience, Faculty of Medicine, Ludwig-Maximilians-Universität München, Planegg-Martinsried, Germany.
- Vítor Lopes-Dos-Santos
- Medical Research Council Brain Network Dynamics Unit, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK.
- David Dupret
- Medical Research Council Brain Network Dynamics Unit, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK.
4. Scott DN, Frank MJ. Adaptive control of synaptic plasticity integrates micro- and macroscopic network function. Neuropsychopharmacology 2023; 48:121-144. PMID: 36038780; PMCID: PMC9700774; DOI: 10.1038/s41386-022-01374-6.
Abstract
Synaptic plasticity configures interactions between neurons and is therefore likely to be a primary driver of behavioral learning and development. How this microscopic-macroscopic interaction occurs is poorly understood, as researchers frequently examine models within particular ranges of abstraction and scale. Computational neuroscience and machine learning models offer theoretically powerful analyses of plasticity in neural networks, but results are often siloed and only coarsely linked to biology. In this review, we examine connections between these areas, asking how network computations change as a function of diverse features of plasticity and vice versa. We review how plasticity can be controlled at synapses by calcium dynamics and neuromodulatory signals, the manifestation of these changes in networks, and their impacts in specialized circuits. We conclude that metaplasticity-defined broadly as the adaptive control of plasticity-forges connections across scales by governing what groups of synapses can and can't learn about, when, and to what ends. The metaplasticity we discuss acts by co-opting Hebbian mechanisms, shifting network properties, and routing activity within and across brain systems. Asking how these operations can go awry should also be useful for understanding pathology, which we address in the context of autism, schizophrenia and Parkinson's disease.
Affiliation(s)
- Daniel N Scott
- Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, RI, USA.
- Carney Institute for Brain Science, Brown University, Providence, RI, USA.
- Michael J Frank
- Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, RI, USA.
- Carney Institute for Brain Science, Brown University, Providence, RI, USA.
5. Miehl C, Gjorgjieva J. Stability and learning in excitatory synapses by nonlinear inhibitory plasticity. PLoS Comput Biol 2022; 18:e1010682. PMID: 36459503; PMCID: PMC9718420; DOI: 10.1371/journal.pcbi.1010682.
Abstract
Synaptic changes are hypothesized to underlie learning and memory formation in the brain. But Hebbian plasticity of excitatory synapses is unstable on its own, leading without additional homeostatic mechanisms to either unlimited growth of synaptic strengths or silencing of neuronal activity. To control excitatory synaptic strengths, we propose a novel form of synaptic plasticity at inhibitory synapses. Using computational modeling, we suggest two key features of inhibitory plasticity: dominance of inhibition over excitation, and a nonlinear dependence on the firing rate of postsynaptic excitatory neurons whereby inhibitory synaptic strengths change with the same sign (potentiate or depress) as excitatory synaptic strengths. We demonstrate that the stable synaptic strengths realized by this novel inhibitory plasticity model affect excitatory/inhibitory weight ratios in agreement with experimental results. Applying a disinhibitory signal can gate plasticity, leading to the generation of receptive fields and strong bidirectional connectivity in a recurrent network. Hence, a novel form of nonlinear inhibitory plasticity can simultaneously stabilize excitatory synaptic strengths and enable learning upon disinhibition.
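The stabilization idea can be sketched in a minimal rate model (using a classic linear homeostatic inhibitory rule in this spirit, not the paper's specific nonlinear form): inhibitory weights grow whenever the excitatory neuron fires above a target rate, pulling the rate back regardless of the strength of the excitatory drive.

```python
# Minimal rate-model sketch of rate stabilization by inhibitory
# plasticity (a classic linear homeostatic rule, not the paper's
# nonlinear rule). All parameters are illustrative.
def simulate(w_exc, steps=5000, dt=0.1, rho=5.0, eta=0.01):
    """Steady-state rate of a neuron with fixed excitation w_exc and a
    plastic inhibitory weight driven toward the target rate rho."""
    w_inh, r_pre = 1.0, 10.0                # inhibitory input rate (Hz)
    r = 0.0
    for _ in range(steps):
        drive = w_exc * r_pre - w_inh * r_pre
        r += dt * (max(drive, 0.0) - r)     # rectified rate dynamics
        # inhibition potentiates when the rate exceeds the target rho
        w_inh = max(w_inh + dt * eta * r_pre * (r - rho), 0.0)
    return r, w_inh

r_weak, _ = simulate(w_exc=2.0)
r_strong, _ = simulate(w_exc=6.0)
```

Both weak and strong excitatory drive end at the same target rate, illustrating how inhibitory plasticity can absorb changes in excitatory strength; the paper's nonlinear rule adds the sign-matching behavior needed for stable learning.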
Affiliation(s)
- Christoph Miehl
- Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- School of Life Sciences, Technical University of Munich, Freising, Germany
- Julijana Gjorgjieva
- Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- School of Life Sciences, Technical University of Munich, Freising, Germany
6. Miehl C, Onasch S, Festa D, Gjorgjieva J. Formation and computational implications of assemblies in neural circuits. J Physiol 2022. PMID: 36068723; DOI: 10.1113/jp282750.
Abstract
In the brain, patterns of neural activity represent sensory information and store it in non-random synaptic connectivity. A prominent theoretical hypothesis states that assemblies, groups of neurons that are strongly connected to each other, are the key computational units underlying perception and memory formation. Compatible with these hypothesised assemblies, experiments have revealed groups of neurons that display synchronous activity, either spontaneously or upon stimulus presentation, and exhibit behavioural relevance. While it remains unclear how assemblies form in the brain, theoretical work has vastly contributed to the understanding of various interacting mechanisms in this process. Here, we review the recent theoretical literature on assembly formation by categorising the involved mechanisms into four components: synaptic plasticity, symmetry breaking, competition and stability. We highlight different approaches and assumptions behind assembly formation and discuss recent ideas of assemblies as the key computational unit in the brain.

Abstract figure legend: Assembly Formation. Assemblies are groups of strongly connected neurons formed by the interaction of multiple mechanisms and with vast computational implications. Four interacting components are thought to drive assembly formation: synaptic plasticity, symmetry breaking, competition and stability.
Affiliation(s)
- Christoph Miehl
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
- Sebastian Onasch
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
- Dylan Festa
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
- Julijana Gjorgjieva
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
7. Drifting assemblies for persistent memory: Neuron transitions and unsupervised compensation. Proc Natl Acad Sci U S A 2021; 118:e2023832118. PMID: 34772802; DOI: 10.1073/pnas.2023832118.
Abstract
Change is ubiquitous in living beings. In particular, the connectome and neural representations can change. Nevertheless, behaviors and memories often persist over long times. In a standard model, associative memories are represented by assemblies of strongly interconnected neurons. For faithful storage these assemblies are assumed to consist of the same neurons over time. Here we propose a contrasting memory model with complete temporal remodeling of assemblies, based on experimentally observed changes of synapses and neural representations. The assemblies drift freely as noisy autonomous network activity and spontaneous synaptic turnover induce neuron exchange. The gradual exchange allows activity-dependent and homeostatic plasticity to conserve the representational structure and keep inputs, outputs, and assemblies consistent. This leads to persistent memory. Our findings explain recent experimental results on temporal evolution of fear memory representations and suggest that memory systems need to be understood in their completeness as individual parts may constantly change.
8. Hu X, Zeng Z. Bridging the Functional and Wiring Properties of V1 Neurons Through Sparse Coding. Neural Comput 2021; 34:104-137. PMID: 34758484; DOI: 10.1162/neco_a_01453.
Abstract
The functional properties of neurons in the primary visual cortex (V1) are thought to be closely related to the structural properties of this network, but the specific relationships remain unclear. Previous theoretical studies have suggested that sparse coding, an energy-efficient coding method, might underlie the orientation selectivity of V1 neurons. We thus aimed to delineate how the neurons are wired to produce this feature. We constructed a model and endowed it with a simple Hebbian learning rule to encode images of natural scenes. The excitatory neurons fired sparsely in response to images and developed strong orientation selectivity. After learning, the connectivity between excitatory neuron pairs, inhibitory neuron pairs, and excitatory-inhibitory neuron pairs depended on firing pattern and receptive field similarity between the neurons. The receptive fields (RFs) of excitatory neurons and inhibitory neurons were well predicted by the RFs of presynaptic excitatory neurons and inhibitory neurons, respectively. The excitatory neurons formed a small-world network, in which certain local connection patterns were significantly overrepresented. Bidirectionally manipulating the firing rates of inhibitory neurons caused linear transformations of the firing rates of excitatory neurons, and vice versa. These wiring properties and modulatory effects were congruent with a wide variety of data measured in V1, suggesting that the sparse coding principle might underlie both the functional and wiring properties of V1 neurons.
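The sparse coding objective underlying this account can be illustrated generically (plain ISTA inference with a fixed random dictionary; the paper instead learns recurrent connectivity with a Hebbian rule in a spiking E-I network): each input is explained by a few active coefficients of an overcomplete code.

```python
import numpy as np

# Generic sparse-coding sketch: ISTA inference of a sparse code for a
# fixed dictionary D. Illustrative of the objective only; the paper's
# model learns connectivity rather than running ISTA.
def ista(x, D, lam=0.1, steps=200):
    """Minimize 0.5 * ||x - D @ a||^2 + lam * ||a||_1 over the code a."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(steps):
        z = a - D.T @ (D @ a - x) / L      # gradient step on the quadratic
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

rng = np.random.default_rng(2)
D = rng.standard_normal((16, 32))
D /= np.linalg.norm(D, axis=0)             # unit-norm dictionary atoms
a_true = np.zeros(32)
a_true[[3, 17]] = [1.5, -2.0]              # 2-sparse ground truth
x = D @ a_true

a_hat = ista(x, D)
active_fraction = float(np.mean(np.abs(a_hat) > 1e-3))
residual = float(np.linalg.norm(D @ a_hat - x))
```

The L1 penalty keeps most coefficients at exactly zero while the reconstruction stays accurate, which is the energy-efficiency property the paper links to orientation selectivity.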
Affiliation(s)
- Xiaolin Hu
- Department of Computer Science and Technology, State Key Laboratory of Intelligent Technology and Systems, BNRist, Tsinghua Laboratory of Brain and Intelligence, and IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing 100084, China
- Zhigang Zeng
- School of Artificial Intelligence and Automation, Huazhong University of Science and Technology, Wuhan 430074, China, and Key Laboratory of Image Processing and Intelligent Control, Education Ministry of China, Wuhan 430074, China
9. Gallinaro JV, Clopath C. Memories in a network with excitatory and inhibitory plasticity are encoded in the spiking irregularity. PLoS Comput Biol 2021; 17:e1009593. PMID: 34762644; PMCID: PMC8610285; DOI: 10.1371/journal.pcbi.1009593.
Abstract
Cell assemblies are thought to be the substrate of memory in the brain. Theoretical studies have previously shown that assemblies can be formed in networks with multiple types of plasticity. But how exactly they are formed and how they encode information is yet to be fully understood. One possibility is that memories are stored in silent assemblies. Here we used a computational model to study the formation of silent assemblies in a network of spiking neurons with excitatory and inhibitory plasticity. We found that even though the formed assemblies were silent in terms of mean firing rate, they had an increased coefficient of variation of inter-spike intervals. We also found that this spiking irregularity could be read out with support of short-term plasticity, and that it could contribute to the longevity of memories.
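The readout quantity here, spiking irregularity, is commonly measured by the coefficient of variation (CV) of inter-spike intervals. A short generic sketch (not the paper's network) shows the two extremes:

```python
import numpy as np

# CV of inter-spike intervals: std(ISI) / mean(ISI).
# A clock-like regular train has CV near 0; a Poisson-like irregular
# train (exponential ISIs) has CV near 1.
rng = np.random.default_rng(4)

def cv_isi(spike_times):
    isi = np.diff(np.sort(spike_times))
    return float(isi.std() / isi.mean())

regular = np.arange(0.0, 10.0, 0.1)                # 10 Hz clock-like train
irregular = np.cumsum(rng.exponential(0.1, 100))   # Poisson-like train

cv_reg = cv_isi(regular)
cv_irr = cv_isi(irregular)
```

In the study, assemblies that are silent in mean rate still carry an elevated CV, which is what the proposed short-term-plasticity readout detects.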
Affiliation(s)
- Júlia V. Gallinaro
- Bioengineering Department, Imperial College London, London, United Kingdom
- Claudia Clopath
- Bioengineering Department, Imperial College London, London, United Kingdom
10. Schulz A, Miehl C, Berry MJ, Gjorgjieva J. The generation of cortical novelty responses through inhibitory plasticity. eLife 2021; 10:e65309. PMID: 34647889; PMCID: PMC8516419; DOI: 10.7554/elife.65309.
Abstract
Animals depend on fast and reliable detection of novel stimuli in their environment. Neurons in multiple sensory areas respond more strongly to novel than to familiar stimuli. Yet it remains unclear which circuit, cellular, and synaptic mechanisms underlie those responses. Here, we show that spike-timing-dependent plasticity of inhibitory-to-excitatory synapses generates novelty responses in a recurrent spiking network model. Inhibitory plasticity increases the inhibition onto excitatory neurons tuned to familiar stimuli, while inhibition for novel stimuli remains low, leading to a network novelty response. The generation of novelty responses does not depend on the periodicity but rather on the distribution of presented stimuli. By including tuning of inhibitory neurons, the network further captures stimulus-specific adaptation. Finally, we suggest that disinhibition can control the amplification of novelty responses. Inhibitory plasticity therefore provides a flexible, biologically plausible mechanism to detect the novelty of bottom-up stimuli, and it yields experimentally testable predictions.
Affiliation(s)
- Auguste Schulz
- Max Planck Institute for Brain Research, Frankfurt, Germany
- Technical University of Munich, Department of Electrical and Computer Engineering, Munich, Germany
- Christoph Miehl
- Max Planck Institute for Brain Research, Frankfurt, Germany
- Technical University of Munich, School of Life Sciences, Freising, Germany
- Michael J Berry
- Princeton University, Princeton Neuroscience Institute, Princeton, United States
- Julijana Gjorgjieva
- Max Planck Institute for Brain Research, Frankfurt, Germany
- Technical University of Munich, School of Life Sciences, Freising, Germany
12. Wang J, Shi C, Sushko ML, Lan J, Sun K, Zhao J, Liu X, Yan X. Boost of the Bio-memristor Performance for Artificial Electronic Synapses by Surface Reconstruction. ACS Appl Mater Interfaces 2021; 13:39641-39651. PMID: 34374517; DOI: 10.1021/acsami.1c07687.
Abstract
Biomaterial-based memristors (bio-memristors) are often adopted to emulate biological synapse functions and applied to construct neural computing networks in brain-inspired chip systems. However, the randomness of conductive filament formation in bio-memristors inhibits their switching performance by causing the dispersion of the device-switching parameters. In this case, a facile porous silk fibroin (p-SF) memristor was obtained through a protein surface reconstruction strategy, in which the size of the hole can be adjusted by the density of hybrid nanoseeds. The porous SF memristors exhibit greatly enhanced electrical characteristics, including uniform I-V cycles, centralized distribution of the switching voltages, and both high and low resistances, compared to devices without pores. The results of three-dimensional (3D) simulations based on classical density functional theory (cDFT) suggest that the reconstructed pores in the SF layers guide the formation and fracture of Ag filaments under an electric field and enhance the overall conductivity by separating Ag+ ion and electron diffusion pathways. Ag+ ions are predicted to preferentially diffuse through pores, whereas electrons diffuse through the SF network. Interestingly, the device conductance can be bidirectionally modulated gradually by positive and negative voltages, can faithfully simulate short-term and long-term plasticity, and can even realize the triplet-spike-timing-dependent plasticity (triplet-STDP) rule, which can be used for pattern recognition in biological systems. The simulation results reveal that a memristor network of this type has an accuracy of ∼95.78% in memory learning and the capability of pattern learning. This work provides a facile technology route to improve the performance of bionic-material memristors.
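The triplet-STDP rule the device reproduces can be sketched with the standard trace formulation (Pfister-Gerstner style; parameter values below are illustrative, not fitted to the memristor): LTD is pair-based and triggered by presynaptic spikes, while LTP on postsynaptic spikes includes a triplet term gated by a slow postsynaptic trace.

```python
import numpy as np

# Trace-based triplet STDP (Pfister-Gerstner style). Illustrative
# parameters, not fitted to the memristor device in the paper.
def triplet_stdp(pre_times, post_times, dt=1e-3, T=1.0,
                 a2p=5e-3, a2m=5e-3, a3p=6e-3,
                 tau_p=16.8e-3, tau_m=33.7e-3, tau_y=114e-3):
    """Net weight change for the given pre/post spike times (seconds)."""
    n = int(T / dt)
    pre = np.zeros(n)
    post = np.zeros(n)
    pre[np.round(np.asarray(pre_times) / dt).astype(int)] = 1
    post[np.round(np.asarray(post_times) / dt).astype(int)] = 1
    r1 = o1 = o2 = 0.0     # pre trace, fast post trace, slow post trace
    dw = 0.0
    for t in range(n):
        if pre[t]:
            dw -= a2m * o1                  # pair-based LTD on pre spike
        if post[t]:
            dw += r1 * (a2p + a3p * o2)     # pair + triplet LTP on post spike
        r1 += -dt / tau_p * r1 + pre[t]     # traces updated after the rule
        o1 += -dt / tau_m * o1 + post[t]
        o2 += -dt / tau_y * o2 + post[t]
    return dw

pairs = np.arange(0.1, 0.9, 0.1)                  # 8 pairings at 10 Hz
dw_causal = triplet_stdp(pairs, pairs + 0.010)    # pre 10 ms before post
dw_acausal = triplet_stdp(pairs + 0.010, pairs)   # post 10 ms before pre
```

Causal pre-before-post pairings potentiate and acausal pairings depress, which is the qualitative behavior the gradually modulated device conductance has to match.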
Affiliation(s)
- Jingjuan Wang
- National-Local Joint Engineering Laboratory of New Energy Photovoltaic Devices, Key Laboratory of Brain-Like Neuromorphic Devices and Systems of Hebei Province, College of Electron and Information Engineering, Hebei University, Baoding 071002, China
- Chenyang Shi
- Department of Chemistry, University of Washington, Seattle, Washington 98195, United States
- Research Institute for Biomimetics and Soft Matter, College of Materials, Fujian Provincial Key Laboratory for Soft Functional Materials Research, Xiamen University, Xiamen 361005, China
- Maria L Sushko
- Physical and Computational Sciences Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States
- Jinling Lan
- National-Local Joint Engineering Laboratory of New Energy Photovoltaic Devices, Key Laboratory of Brain-Like Neuromorphic Devices and Systems of Hebei Province, College of Electron and Information Engineering, Hebei University, Baoding 071002, China
- Kaixuan Sun
- National-Local Joint Engineering Laboratory of New Energy Photovoltaic Devices, Key Laboratory of Brain-Like Neuromorphic Devices and Systems of Hebei Province, College of Electron and Information Engineering, Hebei University, Baoding 071002, China
- Jianhui Zhao
- National-Local Joint Engineering Laboratory of New Energy Photovoltaic Devices, Key Laboratory of Brain-Like Neuromorphic Devices and Systems of Hebei Province, College of Electron and Information Engineering, Hebei University, Baoding 071002, China
- XiangYang Liu
- College of Ocean and Earth Sciences, State Key Laboratory of Marine Environmental Science (MEL), Xiamen University, Xiamen 361005, China
- Xiaobing Yan
- National-Local Joint Engineering Laboratory of New Energy Photovoltaic Devices, Key Laboratory of Brain-Like Neuromorphic Devices and Systems of Hebei Province, College of Electron and Information Engineering, Hebei University, Baoding 071002, China
13. Aljadeff J, Gillett M, Pereira Obilinovic U, Brunel N. From synapse to network: models of information storage and retrieval in neural circuits. Curr Opin Neurobiol 2021; 70:24-33. PMID: 34175521; DOI: 10.1016/j.conb.2021.05.005.
Abstract
The mechanisms of information storage and retrieval in brain circuits are still the subject of debate. It is widely believed that information is stored at least in part through changes in synaptic connectivity in networks that encode this information and that these changes lead in turn to modifications of network dynamics, such that the stored information can be retrieved at a later time. Here, we review recent progress in deriving synaptic plasticity rules from experimental data and in understanding how plasticity rules affect the dynamics of recurrent networks. We show that the dynamics generated by such networks exhibit a large degree of diversity, depending on parameters, similar to experimental observations in vivo during delayed response tasks.
Affiliation(s)
- Johnatan Aljadeff
- Neurobiology Section, Division of Biological Sciences, UC San Diego, USA
- Nicolas Brunel
- Department of Neurobiology, Duke University, USA; Department of Physics, Duke University, USA.
14. Akil AE, Rosenbaum R, Josić K. Balanced networks under spike-time dependent plasticity. PLoS Comput Biol 2021; 17:e1008958. PMID: 33979336; PMCID: PMC8143429; DOI: 10.1371/journal.pcbi.1008958.
Abstract
The dynamics of local cortical networks are irregular, but correlated. Dynamic excitatory-inhibitory balance is a plausible mechanism that generates such irregular activity, but it remains unclear how balance is achieved and maintained in plastic neural networks. In particular, it is not fully understood how plasticity-induced changes in the network affect balance, and in turn, how correlated, balanced activity impacts learning. How do the dynamics of balanced networks change under different plasticity rules? How does correlated spiking activity in recurrent networks change the evolution of weights, their eventual magnitude, and structure across the network? To address these questions, we develop a theory of spike-timing-dependent plasticity in balanced networks. We show that balance can be attained and maintained under plasticity-induced weight changes. We find that correlations in the input mildly affect the evolution of synaptic weights. Under certain plasticity rules, we find an emergence of correlations between firing rates and synaptic weights. Under these rules, synaptic weights converge to a stable manifold in weight space, with their final configuration dependent on the initial state of the network. Lastly, we show that our framework can also describe the dynamics of plastic balanced networks when subsets of neurons receive targeted optogenetic input.

Author summary: Animals are able to learn complex tasks through changes in individual synapses between cells. Such changes lead to the coevolution of neural activity patterns and the structure of neural connectivity, but the consequences of these interactions are not fully understood. We consider plasticity in model neural networks which achieve an average balance between the excitatory and inhibitory synaptic inputs to different cells, and display cortical-like, irregular activity. We extend the theory of balanced networks to account for synaptic plasticity and show which rules can maintain balance, and which will drive the network into a different state. This theory of plasticity can provide insights into the relationship between stimuli, network dynamics, and synaptic circuitry.
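The notion of dynamic balance underlying this analysis can be illustrated in a minimal way (a generic sketch, not the paper's plastic spiking network): with synaptic weights scaled as 1/sqrt(N), mean excitation and inhibition cancel while input fluctuations remain of order one, producing the irregular, fluctuation-driven activity the theory builds on.

```python
import numpy as np

# Balanced-input sketch: N excitatory and N inhibitory Poisson sources
# with equal rates and weights of size 1/sqrt(N). The mean net drive
# cancels while its fluctuations stay O(1) as N grows.
rng = np.random.default_rng(3)

def net_drive(N, bins=2000):
    """Net synaptic input per time bin to a single model cell."""
    exc = rng.poisson(0.05, (bins, N)).sum(axis=1)   # excitatory counts
    inh = rng.poisson(0.05, (bins, N)).sum(axis=1)   # inhibitory counts
    return (exc - inh) / np.sqrt(N)                  # 1/sqrt(N) weights

d_small = net_drive(100)
d_large = net_drive(10000)
```

In the full theory the cancellation is not imposed by hand but emerges from the network's rate dynamics, and the question the paper addresses is which STDP rules preserve it.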
Affiliation(s)
- Alan Eric Akil
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana, United States of America
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana, United States of America
- Krešimir Josić
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
15. Wu YK, Hengen KB, Turrigiano GG, Gjorgjieva J. Homeostatic mechanisms regulate distinct aspects of cortical circuit dynamics. Proc Natl Acad Sci U S A 2020; 117:24514-24525. PMID: 32917810; PMCID: PMC7533694; DOI: 10.1073/pnas.1918368117.
Abstract
Homeostasis is indispensable to counteract the destabilizing effects of Hebbian plasticity. Although it is commonly assumed that homeostasis modulates synaptic strength, membrane excitability, and firing rates, its role at the neural circuit and network level is unknown. Here, we identify changes in higher-order network properties of freely behaving rodents during prolonged visual deprivation. Strikingly, our data reveal that functional pairwise correlations and their structure are subject to homeostatic regulation. Using a computational model, we demonstrate that the interplay of different plasticity and homeostatic mechanisms can capture the initial drop and delayed recovery of firing rates and correlations observed experimentally. Moreover, our model indicates that synaptic scaling is crucial for the recovery of correlations and network structure, while intrinsic plasticity is essential for the rebound of firing rates, suggesting that synaptic scaling and intrinsic plasticity can serve distinct functions in homeostatically regulating network dynamics.
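The synaptic-scaling component can be sketched in a minimal rate model (illustrative parameters, not the paper's spiking network): after a deprivation-like drop in input, multiplicative scaling of the synaptic gain slowly restores the firing rate to its set point, mirroring the drop-and-recovery time course observed in the data.

```python
# Minimal sketch of firing-rate homeostasis by multiplicative synaptic
# scaling (illustrative rate model, not the paper's network model).
target, dt, tau_h = 5.0, 0.1, 50.0   # set-point rate, step, homeostatic timescale
g, x = 1.0, 10.0                     # synaptic gain, afferent drive
rates = []
for step in range(6000):
    if step == 2000:
        x = 4.0                      # deprivation: afferent drive drops
    r = g * x                        # instantaneous firing rate
    # multiplicative scaling of the gain toward the set-point rate
    g += dt / tau_h * g * (target - r) / target
    rates.append(r)

r_before = rates[1999]               # settled rate before deprivation
r_drop = rates[2000]                 # immediately after the input drop
r_recovered = rates[-1]              # after homeostatic recovery
```

In the paper's model this scaling is what restores correlations and network structure, while a separate intrinsic-plasticity mechanism (not sketched here) is needed for the rebound of firing rates.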
Affiliation(s)
- Yue Kris Wu
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- Keith B Hengen
- Department of Biology, Brandeis University, Waltham, MA 02454
- Department of Biology, Washington University in St. Louis, St. Louis, MO 63130
- Julijana Gjorgjieva
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354 Freising, Germany