1. Chauhan K, Neiman AB, Tass PA. Synaptic reorganization of synchronized neuronal networks with synaptic weight and structural plasticity. PLoS Comput Biol 2024; 20:e1012261. PMID: 38980898; PMCID: PMC11259284; DOI: 10.1371/journal.pcbi.1012261.
Abstract
Abnormally strong neural synchronization may impair brain function, as observed in several brain disorders. We computationally study how neuronal dynamics, synaptic weights, and network structure co-emerge, in particular, during (de)synchronization processes and how they are affected by external perturbation. To investigate the impact of different types of plasticity mechanisms, we combine a network of excitatory integrate-and-fire neurons with different synaptic weight and/or structural plasticity mechanisms: (i) only spike-timing-dependent plasticity (STDP), (ii) only homeostatic structural plasticity (hSP), i.e., without weight-dependent pruning and without STDP, (iii) a combination of STDP and hSP, i.e., without weight-dependent pruning, and (iv) a combination of STDP and structural plasticity (SP) that includes hSP and weight-dependent pruning. To accommodate the diverse time scales of neuronal firing, STDP, and SP, we introduce a simple stochastic SP model, enabling detailed numerical analyses. With tools from network theory, we reveal that structural reorganization may remarkably enhance the network's level of synchrony. When weaker contacts are preferentially eliminated by weight-dependent pruning, synchrony is achieved with significantly sparser connections than in randomly structured networks in the STDP-only model. In particular, the strengthening of contacts from neurons with higher natural firing rates to those with lower rates and the weakening of contacts in the opposite direction, followed by selective removal of weak contacts, allows for strong synchrony with fewer connections. This activity-led network reorganization results in the emergence of degree-frequency, degree-degree correlations, and a mixture of degree assortativity. We compare the stimulation-induced desynchronization of synchronized states in the STDP-only model (i) with the desynchronization of models (iii) and (iv). 
The latter require stimuli of significantly higher intensity to achieve long-term desynchronization. These findings may inform future pre-clinical and clinical studies with invasive or non-invasive stimulus modalities aiming at inducing long-lasting relief of symptoms, e.g., in Parkinson's disease.
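The interplay described above — pair-based STDP adjusting contact weights while a stochastic structural step creates contacts at a low rate and weight-dependent pruning preferentially removes weak ones — can be made concrete with a minimal sketch. Parameter values, thresholds, and helper names are assumed for illustration; this is not the authors' implementation:

```python
import math
import random

A_PLUS, A_MINUS = 0.01, 0.012   # STDP amplitudes (assumed values)
TAU = 20.0                      # STDP time constant in ms (assumed)

def stdp_dw(dt_ms):
    """Pair-based STDP: dt_ms = t_post - t_pre.
    Pre-before-post (dt >= 0) potentiates; the reverse order depresses."""
    if dt_ms >= 0:
        return A_PLUS * math.exp(-dt_ms / TAU)
    return -A_MINUS * math.exp(dt_ms / TAU)

def structural_step(weights, p_create=0.01, p_prune=0.05, w_thresh=0.1,
                    n_neurons=100, rng=random):
    """One stochastic structural-plasticity step: weak contacts
    (w < w_thresh) are removed with probability p_prune (the paper's
    weight-dependent pruning, model (iv)); a new contact appears at a
    low rate and starts weak."""
    new = {}
    for pair, w in weights.items():
        if w < w_thresh and rng.random() < p_prune:
            continue                              # prune weak contact
        new[pair] = w
    if rng.random() < p_create:
        pair = (rng.randrange(n_neurons), rng.randrange(n_neurons))
        new.setdefault(pair, 0.5 * w_thresh)      # nascent contact is weak
    return new
```

In a full simulation these two updates would run on their separate time scales (milliseconds for STDP, much slower for the structural step), which is exactly the gap the paper's stochastic SP model is designed to bridge.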
Affiliation(s)
- Kanishk Chauhan
- Department of Physics and Astronomy, Ohio University, Athens, Ohio, United States of America
- Neuroscience Program, Ohio University, Athens, Ohio, United States of America
- Alexander B. Neiman
- Department of Physics and Astronomy, Ohio University, Athens, Ohio, United States of America
- Neuroscience Program, Ohio University, Athens, Ohio, United States of America
- Peter A. Tass
- Department of Neurosurgery, Stanford University, Stanford, California, United States of America
2. Schmitt FJ, Rostami V, Nawrot MP. Efficient parameter calibration and real-time simulation of large-scale spiking neural networks with GeNN and NEST. Front Neuroinform 2023; 17:941696. PMID: 36844916; PMCID: PMC9950635; DOI: 10.3389/fninf.2023.941696.
Abstract
Spiking neural networks (SNNs) represent the state-of-the-art approach to the biologically realistic modeling of nervous system function. The systematic calibration of multiple free model parameters is necessary to achieve robust network function and demands high computing power and large memory resources. Special requirements arise from closed-loop model simulation in virtual environments and from real-time simulation in robotic applications. Here, we compare two complementary approaches to efficient large-scale and real-time SNN simulation. The widely used NEural Simulation Tool (NEST) parallelizes simulation across multiple CPU cores. The GPU-enhanced Neural Network (GeNN) simulator uses the highly parallel GPU-based architecture to gain simulation speed. We quantify fixed and variable simulation costs on single machines with different hardware configurations. As a benchmark model, we use a spiking cortical attractor network with a topology of densely connected excitatory and inhibitory neuron clusters with homogeneous or distributed synaptic time constants, and compare it to the random balanced network. We show that simulation time scales linearly with the simulated biological model time and, for large networks, approximately linearly with the model size as dominated by the number of synaptic connections. Additional fixed costs with GeNN are almost independent of model size, while fixed costs with NEST increase linearly with model size. We demonstrate how GeNN can be used for simulating networks with up to 3.5 × 10^6 neurons (>3 × 10^12 synapses) on a high-end GPU, and up to 250,000 neurons (25 × 10^9 synapses) on a low-cost GPU. Real-time simulation was achieved for networks with 100,000 neurons. Network calibration and parameter grid search can be efficiently achieved using batch processing. We discuss the advantages and disadvantages of both approaches for different use cases.
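The reported scaling can be summarized in a toy cost model: wall-clock time = fixed setup cost + variable cost proportional to biological model time and synapse count, with GeNN's fixed cost roughly size-independent and NEST's growing with model size. All coefficients below are illustrative placeholders, not the paper's measured values:

```python
def wallclock_estimate(t_model_s, n_synapses, backend):
    """Toy cost model matching the benchmark's qualitative findings.
    Variable cost ~ model time * number of synapses; fixed (setup/code
    generation) cost is ~constant for GeNN but grows with model size
    for NEST. Coefficients are assumed, purely for illustration."""
    per_syn_s = {"genn": 2e-9, "nest": 8e-9}[backend]  # s per synapse-second
    fixed_s = 30.0 if backend == "genn" else 1e-6 * n_synapses
    return fixed_s + per_syn_s * n_synapses * t_model_s

def realtime_factor(t_model_s, n_synapses, backend):
    """Simulated biological time divided by wall-clock time;
    values >= 1 mean real-time capable."""
    return t_model_s / wallclock_estimate(t_model_s, n_synapses, backend)
```

Such a two-term model also explains why batch processing helps calibration: the fixed cost is paid once per build, while grid-search runs only add variable cost.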
Affiliation(s)
- Martin Paul Nawrot
- Computational Systems Neuroscience, Institute of Zoology, University of Cologne, Cologne, Germany
3. The molecular memory code and synaptic plasticity: A synthesis. Biosystems 2023; 224:104825. PMID: 36610586; DOI: 10.1016/j.biosystems.2022.104825.
Abstract
The most widely accepted view of memory in the brain holds that synapses are the storage sites of memory, and that memories are formed through associative modification of synapses. This view has been challenged on conceptual and empirical grounds. As an alternative, it has been proposed that molecules within the cell body are the storage sites of memory, and that memories are formed through biochemical operations on these molecules. This paper proposes a synthesis of these two views, grounded in a computational model of memory. Synapses are conceived as storage sites for the parameters of an approximate posterior probability distribution over latent causes. Intracellular molecules are conceived as storage sites for the parameters of a generative model. The model stipulates how these two components work together as part of an integrated algorithm for learning and inference.
4. Spiking neural P systems with cooperative synapses. Neurocomputing 2022. DOI: 10.1016/j.neucom.2022.05.088.
5. Bouhadjar Y, Wouters DJ, Diesmann M, Tetzlaff T. Sequence learning, prediction, and replay in networks of spiking neurons. PLoS Comput Biol 2022; 18:e1010233. PMID: 35727857; PMCID: PMC9273101; DOI: 10.1371/journal.pcbi.1010233.
Abstract
Sequence learning, prediction and replay have been proposed to constitute the universal computations performed by the neocortex. The Hierarchical Temporal Memory (HTM) algorithm realizes these forms of computation. It learns sequences in an unsupervised and continuous manner using local learning rules, permits a context specific prediction of future sequence elements, and generates mismatch signals in case the predictions are not met. While the HTM algorithm accounts for a number of biological features such as topographic receptive fields, nonlinear dendritic processing, and sparse connectivity, it is based on abstract discrete-time neuron and synapse dynamics, as well as on plasticity mechanisms that can only partly be related to known biological mechanisms. Here, we devise a continuous-time implementation of the temporal-memory (TM) component of the HTM algorithm, which is based on a recurrent network of spiking neurons with biophysically interpretable variables and parameters. The model learns high-order sequences by means of a structural Hebbian synaptic plasticity mechanism supplemented with a rate-based homeostatic control. In combination with nonlinear dendritic input integration and local inhibitory feedback, this type of plasticity leads to the dynamic self-organization of narrow sequence-specific subnetworks. These subnetworks provide the substrate for a faithful propagation of sparse, synchronous activity, and, thereby, for a robust, context specific prediction of future sequence elements as well as for the autonomous replay of previously learned sequences. By strengthening the link to biology, our implementation facilitates the evaluation of the TM hypothesis based on experimentally accessible quantities. The continuous-time implementation of the TM algorithm permits, in particular, an investigation of the role of sequence timing for sequence learning, prediction and replay. 
We demonstrate this aspect by studying the effect of the sequence speed on the sequence learning performance and on the speed of autonomous sequence replay.

Essentially all data processed by mammals and many other living organisms is sequential. This holds true for all types of sensory input data as well as motor output activity. Being able to form memories of such sequential data, to predict future sequence elements, and to replay learned sequences is a necessary prerequisite for survival. It has been hypothesized that sequence learning, prediction and replay constitute the fundamental computations performed by the neocortex. The Hierarchical Temporal Memory (HTM) constitutes an abstract powerful algorithm implementing this form of computation and has been proposed to serve as a model of neocortical processing. In this study, we are reformulating this algorithm in terms of known biological ingredients and mechanisms to foster the verifiability of the HTM hypothesis based on electrophysiological and behavioral data. The proposed model learns continuously in an unsupervised manner by biologically plausible, local plasticity mechanisms, and successfully predicts and replays complex sequences. Apart from establishing contact to biology, the study sheds light on the mechanisms determining at what speed we can process sequences and provides an explanation of fast sequence replay observed in the hippocampus and in the neocortex.
Affiliation(s)
- Younes Bouhadjar
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Peter Grünberg Institute (PGI-7,10), Jülich Research Centre and JARA, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Dirk J. Wouters
- Institute of Electronic Materials (IWE 2) and JARA-FIT, RWTH Aachen University, Aachen, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, and Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany
- Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
6. Madadi Asl M, Vahabie AH, Valizadeh A, Tass PA. Spike-Timing-Dependent Plasticity Mediated by Dopamine and its Role in Parkinson's Disease Pathophysiology. Front Netw Physiol 2022; 2:817524. PMID: 36926058; PMCID: PMC10013044; DOI: 10.3389/fnetp.2022.817524.
Abstract
Parkinson's disease (PD) is a multi-systemic neurodegenerative brain disorder. Motor symptoms of PD are linked to the significant dopamine (DA) loss in substantia nigra pars compacta (SNc) followed by basal ganglia (BG) circuit dysfunction. Increasing experimental and computational evidence indicates that (synaptic) plasticity plays a key role in the emergence of PD-related pathological changes following DA loss. Spike-timing-dependent plasticity (STDP) mediated by DA provides a mechanistic model for synaptic plasticity to modify synaptic connections within the BG according to the neuronal activity. To shed light on how DA-mediated STDP can shape neuronal activity and synaptic connectivity in the PD condition, we reviewed experimental and computational findings addressing the modulatory effect of DA on STDP as well as other plasticity mechanisms and discussed their potential role in PD pathophysiology and related network dynamics and connectivity. In particular, reshaping of STDP profiles together with other plasticity-mediated processes following DA loss may abnormally modify synaptic connections in competing pathways of the BG. The cascade of plasticity-induced maladaptive or compensatory changes can impair the excitation-inhibition balance towards the BG output nuclei, leading to the emergence of pathological activity-connectivity patterns in PD. Pre-clinical, clinical as well as computational studies reviewed here provide an understanding of the impact of synaptic plasticity and other plasticity mechanisms on PD pathophysiology, especially PD-related network activity and connectivity, after DA loss. This review may provide further insights into the abnormal structure-function relationship within the BG contributing to the emergence of pathological states in PD. 
Specifically, this review is intended to provide detailed information for the development of computational network models for PD, serving as testbeds for the development and optimization of invasive and non-invasive brain stimulation techniques. Computationally derived hypotheses may accelerate the development of therapeutic stimulation techniques and potentially reduce the number of related animal experiments.
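The core idea — dopamine acting as a third, modulatory factor that scales the STDP window — can be illustrated with a simple gated update. This is a generic sketch rather than any specific model from the review; amplitudes and time constants are assumed:

```python
import math

def da_modulated_stdp(dt_ms, dopamine, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Illustrative dopamine-gated STDP update.
    dt_ms = t_post - t_pre; dopamine in [0, 1], with 1 = normal tone.
    Here DA scales only the potentiation lobe, so DA depletion (as after
    SNc degeneration in PD) shrinks LTP while LTD is left intact,
    biasing the affected pathway toward net weakening."""
    if dt_ms >= 0:
        return dopamine * a_plus * math.exp(-dt_ms / tau)
    return -a_minus * math.exp(dt_ms / tau)
```

Which lobe(s) of the window DA actually reshapes differs between synapse types and experimental reports surveyed in the review; the asymmetric choice above is just one possibility.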
Affiliation(s)
- Mojtaba Madadi Asl
- Department of Physics, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan, Iran
- Abdol-Hossein Vahabie
- School of Electrical and Computer Engineering, College of Engineering, University of Tehran, Tehran, Iran
- Department of Psychology, Faculty of Psychology and Education, University of Tehran, Tehran, Iran
- Alireza Valizadeh
- Department of Physics, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan, Iran
- Peter A Tass
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, CA, United States
7. Computational roles of intrinsic synaptic dynamics. Curr Opin Neurobiol 2021; 70:34-42. PMID: 34303124; DOI: 10.1016/j.conb.2021.06.002.
Abstract
Conventional theories assume that long-term information storage in the brain is implemented by modifying synaptic efficacy. Recent experimental findings challenge this view by demonstrating that dendritic spine sizes, and hence their corresponding synaptic weights, are highly volatile even in the absence of neural activity. Here, we review previous computational work on the roles of these intrinsic synaptic dynamics. We first present the possibility that neuronal networks can sustain stable performance in their presence, and we then hypothesize that intrinsic dynamics may be more than mere noise to be withstood: they may actively improve information processing in the brain.
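One common phenomenological way to model such activity-independent spine volatility is a multiplicative random walk on spine size, which naturally produces the skewed, long-tailed size distributions seen experimentally. The sketch below makes that assumption explicit; all parameter values are illustrative:

```python
import math
import random

def intrinsic_spine_dynamics(w, dt=1.0, drift=-0.01, sigma=0.1,
                             w_min=1e-3, rng=random):
    """One step of activity-independent ('intrinsic') spine-size
    fluctuation, modeled as a multiplicative (log-space) random walk --
    a common phenomenological choice, not the only one in the reviewed
    literature. A small floor w_min stands in for spine elimination
    thresholds."""
    log_w = math.log(max(w, w_min))
    log_w += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
    return max(math.exp(log_w), w_min)

# Spine sizes drift apart over time with no neural activity at all:
rng = random.Random(1)
sizes = [0.5] * 1000
for _ in range(100):
    sizes = [intrinsic_spine_dynamics(w, rng=rng) for w in sizes]
```

A network model whose performance must survive this walk is exactly the setting the reviewed works analyze.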
8. Sinha A, Metzner C, Davey N, Adams R, Schmuker M, Steuber V. Growth rules for the repair of Asynchronous Irregular neuronal networks after peripheral lesions. PLoS Comput Biol 2021; 17:e1008996. PMID: 34061830; PMCID: PMC8195387; DOI: 10.1371/journal.pcbi.1008996.
Abstract
Several homeostatic mechanisms enable the brain to maintain desired levels of neuronal activity. One of these, homeostatic structural plasticity, has been reported to restore activity in networks disrupted by peripheral lesions by altering their neuronal connectivity. While multiple lesion experiments have studied the changes in neurite morphology that underlie modifications of synapses in these networks, the underlying mechanisms that drive these changes are yet to be explained. Evidence suggests that neuronal activity modulates neurite morphology and may stimulate neurites to selectively sprout or retract to restore network activity levels. We developed a new spiking network model of peripheral lesioning that accurately reproduces the experimentally reported characteristics of network repair after deafferentation, and used it to study the activity-dependent growth regimes of neurites. To ensure that our simulations closely resemble the behaviour of networks in the brain, we model deafferentation in a biologically realistic balanced network model that exhibits low-frequency Asynchronous Irregular (AI) activity as observed in cerebral cortex. Our simulation results indicate that the re-establishment of activity in neurons both within and outside the deprived region, the Lesion Projection Zone (LPZ), requires opposite activity-dependent growth rules for excitatory and inhibitory post-synaptic elements. Analysis of these growth regimes indicates that they also contribute to the maintenance of activity levels in individual neurons. Furthermore, in our model, the directional formation of synapses that is observed in experiments requires that pre-synaptic excitatory and inhibitory elements also follow opposite growth rules. Lastly, we observe that our proposed structural plasticity growth rules and the inhibitory synaptic plasticity mechanism that also balances our AI network both contribute to the restoration of the network to pre-deafferentation stable activity levels.
An accumulating body of evidence suggests that our brain can compensate for peripheral lesions by adaptive rewiring of its neuronal circuitry. The underlying process, structural plasticity, can modify the connectivity of neuronal networks in the brain, thus affecting their function. To better understand the mechanisms of structural plasticity in the brain, we have developed a novel model of peripheral lesions and the resulting activity-dependent rewiring in a simplified balanced cortical network model that exhibits biologically realistic Asynchronous Irregular (AI) activity. In order to accurately reproduce the directionality and course of network rewiring after injury that is observed in peripheral lesion experiments, we derive activity dependent growth rules for different synaptic elements: dendritic and axonal contacts. Our simulation results suggest that excitatory and inhibitory synaptic elements have to react to changes in neuronal activity in opposite ways. We show that these rules result in a homeostatic stabilisation of activity in individual neurons. In our simulations, both synaptic and structural plasticity mechanisms contribute to network repair. Furthermore, our simulations indicate that while activity is restored in neurons deprived by the peripheral lesion, the temporal firing characteristics of the network may not be retained by the rewiring process.
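The key claim — excitatory and inhibitory synaptic elements must react to activity deviations with opposite signs — can be written as a one-line homeostatic growth rule. The linear form and the gain parameter below are assumptions for illustration, not the paper's fitted growth curves:

```python
def element_growth(activity, setpoint, nu=1.0, kind="exc"):
    """Sketch of an activity-dependent homeostatic growth rule.
    Returns the rate of change of the number of free synaptic elements:
    excitatory elements sprout (positive rate) when activity falls below
    the homeostatic setpoint and retract above it; inhibitory elements
    follow the opposite rule, as the repair simulations require."""
    if kind not in ("exc", "inh"):
        raise ValueError("kind must be 'exc' or 'inh'")
    delta = nu * (setpoint - activity)
    return delta if kind == "exc" else -delta
```

Under this rule a deafferented neuron (activity far below setpoint) gains excitatory and sheds inhibitory inputs, which is the direction of rewiring observed in the LPZ.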
Affiliation(s)
- Ankur Sinha
- UH Biocomputation Research Group, Centre for Computer Science and Informatics Research, University of Hertfordshire, Hatfield, United Kingdom
- Christoph Metzner
- UH Biocomputation Research Group, Centre for Computer Science and Informatics Research, University of Hertfordshire, Hatfield, United Kingdom
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Neil Davey
- UH Biocomputation Research Group, Centre for Computer Science and Informatics Research, University of Hertfordshire, Hatfield, United Kingdom
- Roderick Adams
- UH Biocomputation Research Group, Centre for Computer Science and Informatics Research, University of Hertfordshire, Hatfield, United Kingdom
- Michael Schmuker
- UH Biocomputation Research Group, Centre for Computer Science and Informatics Research, University of Hertfordshire, Hatfield, United Kingdom
- Volker Steuber
- UH Biocomputation Research Group, Centre for Computer Science and Informatics Research, University of Hertfordshire, Hatfield, United Kingdom
9. Limbacher T, Legenstein R. Emergence of Stable Synaptic Clusters on Dendrites Through Synaptic Rewiring. Front Comput Neurosci 2020; 14:57. PMID: 32848681; PMCID: PMC7424032; DOI: 10.3389/fncom.2020.00057.
Abstract
The connectivity structure of neuronal networks in cortex is highly dynamic. This ongoing cortical rewiring is assumed to serve important functions for learning and memory. We analyze in this article a model for the self-organization of synaptic inputs onto dendritic branches of pyramidal cells. The model combines a generic stochastic rewiring principle with a simple synaptic plasticity rule that depends on local dendritic activity. In computer simulations, we find that this synaptic rewiring model leads to synaptic clustering, that is, temporally correlated inputs become locally clustered on dendritic branches. This empirical finding is backed up by a theoretical analysis which shows that rewiring in our model favors network configurations with synaptic clustering. We propose that synaptic clustering plays an important role in the organization of computation and memory in cortical circuits: we find that synaptic clustering through the proposed rewiring mechanism can serve as a mechanism to protect memories from subsequent modifications on a medium time scale. Rewiring of synaptic connections onto specific dendritic branches may thus counteract the general problem of catastrophic forgetting in neural networks.
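The mechanism — generic stochastic rewiring biased by local dendritic co-activity so that correlated inputs end up clustered on the same branch — can be caricatured with a Metropolis-style move rule. This is a toy version under stated assumptions (acceptance function, correlation groups, branch count are all invented for illustration), not the paper's dynamics:

```python
import math
import random

def rewire_step(branch_of, corr_group, n_branches=8, beta=2.0, rng=random):
    """One stochastic rewiring move: a random synapse proposes a jump to a
    random branch; the move is accepted with higher probability when the
    target branch holds more synapses from the same correlation group
    (standing in for local dendritic co-activation). Repeated moves bias
    the configuration toward synaptic clustering."""
    syn = rng.randrange(len(branch_of))
    new_branch = rng.randrange(n_branches)
    def same_group_count(branch):
        return sum(1 for j, b in enumerate(branch_of)
                   if j != syn and b == branch
                   and corr_group[j] == corr_group[syn])
    gain = same_group_count(new_branch) - same_group_count(branch_of[syn])
    p_accept = 1.0 / (1.0 + math.exp(-beta * gain))  # sigmoidal acceptance
    if rng.random() < p_accept:
        branch_of[syn] = new_branch
    return branch_of
```

With beta > 0 the stationary distribution favors clustered configurations, mirroring the paper's theoretical argument that rewiring favors network configurations with synaptic clustering.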
Affiliation(s)
- Robert Legenstein
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
10. Deger M, Seeholzer A, Gerstner W. Multicontact Co-operativity in Spike-Timing-Dependent Structural Plasticity Stabilizes Networks. Cereb Cortex 2019; 28:1396-1415. PMID: 29300903; PMCID: PMC6041941; DOI: 10.1093/cercor/bhx339.
Abstract
Excitatory synaptic connections in the adult neocortex consist of multiple synaptic contacts, almost exclusively formed on dendritic spines. Changes of spine volume, a correlate of synaptic strength, can be tracked in vivo for weeks. Here, we present a combined model of structural and spike-timing–dependent plasticity that explains the multicontact configuration of synapses in adult neocortical networks under steady-state and lesion-induced conditions. Our plasticity rule with Hebbian and anti-Hebbian terms stabilizes both the postsynaptic firing rate and correlations between the pre- and postsynaptic activity at an active synaptic contact. Contacts appear spontaneously at a low rate and disappear if their strength approaches zero. Many presynaptic neurons compete to make strong synaptic connections onto a postsynaptic neuron, whereas the synaptic contacts of a given presynaptic neuron co-operate via postsynaptic firing. We find that co-operation of multiple synaptic contacts is crucial for stable, long-term synaptic memories. In simulations of a simplified network model of barrel cortex, our plasticity rule reproduces whisker-trimming–induced rewiring of thalamocortical and recurrent synaptic connectivity on realistic time scales.
Affiliation(s)
- Moritz Deger
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
- Institute for Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, 50674 Cologne, Germany
- Alexander Seeholzer
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
11. Humble J, Hiratsuka K, Kasai H, Toyoizumi T. Intrinsic Spine Dynamics Are Critical for Recurrent Network Learning in Models With and Without Autism Spectrum Disorder. Front Comput Neurosci 2019; 13:38. PMID: 31263407; PMCID: PMC6585147; DOI: 10.3389/fncom.2019.00038.
Abstract
It is often assumed that Hebbian synaptic plasticity forms a cell assembly, a mutually interacting group of neurons that encodes memory. However, in recurrently connected networks with pure Hebbian plasticity, cell assemblies typically diverge or fade under ongoing changes of synaptic strength. Previously assumed mechanisms that stabilize cell assemblies do not robustly reproduce the experimentally reported unimodal and long-tailed distribution of synaptic strengths. Here, we show that augmenting Hebbian plasticity with experimentally observed intrinsic spine dynamics can stabilize cell assemblies and reproduce the distribution of synaptic strengths. Moreover, we posit that strong intrinsic spine dynamics impair learning performance. Our theory explains how excessively strong spine dynamics, experimentally observed in several animal models of autism spectrum disorder, impair learning associations in the brain.
Affiliation(s)
- James Humble
- Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, Saitama, Japan
- Kazuhiro Hiratsuka
- Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, Saitama, Japan
- Haruo Kasai
- Laboratory of Structural Physiology, Faculty of Medicine, Center for Disease Biology and Integrative Medicine, University of Tokyo, Tokyo, Japan
- Taro Toyoizumi
- Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, Saitama, Japan
12. Fauth MJ, van Rossum MC. Self-organized reactivation maintains and reinforces memories despite synaptic turnover. eLife 2019; 8:e43717. PMID: 31074745; PMCID: PMC6546393; DOI: 10.7554/eLife.43717.
Abstract
Long-term memories are believed to be stored in the synapses of cortical neuronal networks. However, recent experiments report continuous creation and removal of cortical synapses, which raises the question of how memories can survive on such a variable substrate. Here, we study the formation and retention of associative memory in a computational model based on Hebbian cell assemblies in the presence of both synaptic and structural plasticity. During rest periods, such as may occur during sleep, the assemblies reactivate spontaneously, reinforcing memories against ongoing synapse removal and replacement. Brief daily reactivations during rest periods suffice not only to maintain the assemblies, but even to strengthen them and improve pattern completion, consistent with offline memory gains observed experimentally. While the connectivity inside memory representations is strengthened during rest phases, connections in the rest of the network decay and vanish, thus reconciling apparently conflicting hypotheses about the influence of sleep on cortical connectivity.
Affiliation(s)
- Michael Jan Fauth
- School of Informatics, University of Edinburgh, Edinburgh, United Kingdom
- Third Physics Institute, University of Göttingen, Göttingen, Germany
- Mark CW van Rossum
- School of Psychology, University of Nottingham, Nottingham, United Kingdom
- School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
13. Shi Y, Nguyen L, Oh S, Liu X, Kuzum D. A Soft-Pruning Method Applied During Training of Spiking Neural Networks for In-memory Computing Applications. Front Neurosci 2019; 13:405. PMID: 31080402; PMCID: PMC6497807; DOI: 10.3389/fnins.2019.00405.
Abstract
Inspired by the computational efficiency of the biological brain, spiking neural networks (SNNs) emulate biological neural networks, neural codes, dynamics, and circuitry. SNNs show great potential for the implementation of unsupervised learning using in-memory computing. Here, we report an algorithmic optimization that improves the energy efficiency of online learning with SNNs on emerging non-volatile memory (eNVM) devices. We develop a pruning method for SNNs by exploiting the output firing characteristics of neurons. Our pruning method can be applied during network training, in contrast to previous approaches in the literature that prune already-trained networks. This approach prevents unnecessary updates of network parameters during training. This algorithmic optimization can complement the energy efficiency of eNVM technology, which offers a unique in-memory computing platform for the parallelization of neural network operations. Our SNN maintains ~90% classification accuracy on the MNIST dataset with up to ~75% pruning, significantly reducing the number of weight updates. The SNN and pruning scheme developed in this work can pave the way toward applications of eNVM-based neuro-inspired systems for energy-efficient online learning in low-power applications.
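The idea of pruning driven by output firing characteristics during training can be sketched as follows: connections onto the least-active output neurons are zeroed and excluded from further updates, saving eNVM write operations. The ranking criterion and fraction below are assumptions for illustration; the paper's exact soft-pruning criterion may differ:

```python
def soft_prune(weights, spike_counts, prune_fraction=0.75):
    """Sketch of firing-driven pruning applied mid-training.
    weights: list of rows (one row per input), one column per output neuron.
    spike_counts: output spike counts observed so far in training.
    The prune_fraction of outputs with the fewest spikes have their
    incoming weights frozen at zero, so they receive no further updates."""
    n_prune = int(len(spike_counts) * prune_fraction)
    ranked = sorted(range(len(spike_counts)), key=spike_counts.__getitem__)
    pruned = set(ranked[:n_prune])          # least-active output neurons
    new_weights = [[0.0 if j in pruned else w for j, w in enumerate(row)]
                   for row in weights]
    return new_weights, pruned
```

In an eNVM setting, every skipped weight update is a skipped device write, which is where the reported energy savings come from.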
Affiliation(s)
- Yuhan Shi
- Electrical and Computer Engineering Department, University of California, San Diego, San Diego, CA, United States
- Leon Nguyen
- Electrical and Computer Engineering Department, University of California, San Diego, San Diego, CA, United States
- Sangheon Oh
- Electrical and Computer Engineering Department, University of California, San Diego, San Diego, CA, United States
- Xin Liu
- Electrical and Computer Engineering Department, University of California, San Diego, San Diego, CA, United States
- Duygu Kuzum
- Electrical and Computer Engineering Department, University of California, San Diego, San Diego, CA, United States
14
Gerstner W, Lehmann M, Liakoni V, Corneil D, Brea J. Eligibility Traces and Plasticity on Behavioral Time Scales: Experimental Support of NeoHebbian Three-Factor Learning Rules. Front Neural Circuits 2018; 12:53. [PMID: 30108488 PMCID: PMC6079224 DOI: 10.3389/fncir.2018.00053]
Abstract
Most elementary behaviors such as moving the arm to grasp an object or walking into the next room to explore a museum evolve on the time scale of seconds; in contrast, neuronal action potentials occur on the time scale of a few milliseconds. Learning rules of the brain must therefore bridge the gap between these two different time scales. Modern theories of synaptic plasticity have postulated that the co-activation of pre- and postsynaptic neurons sets a flag at the synapse, called an eligibility trace, that leads to a weight change only if an additional factor is present while the flag is set. This third factor, signaling reward, punishment, surprise, or novelty, could be implemented by the phasic activity of neuromodulators or specific neuronal inputs signaling special events. While the theoretical framework has been developed over the last decades, experimental evidence in support of eligibility traces on the time scale of seconds has been collected only during the last few years. Here we review, in the context of three-factor rules of synaptic plasticity, four key experiments that support the role of synaptic eligibility traces in combination with a third factor as a biological implementation of neoHebbian three-factor learning rules.
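The rule reviewed here can be sketched in a few lines; the time constants, learning rate, and multiplicative gating below are illustrative choices, not values from the reviewed experiments:

```python
# Sketch of a neoHebbian three-factor rule: pre/post co-activation sets an
# eligibility trace that decays over seconds; the weight changes only when
# a third factor (reward, surprise, novelty) arrives while the trace is set.
dt, tau_e, lr = 0.01, 2.0, 0.5     # step (s), trace time constant (s), rate

def step(trace, pre, post, third_factor, w):
    trace += dt * (-trace / tau_e + pre * post)  # flag set by coincidence
    w += dt * lr * third_factor * trace          # gated weight change
    return trace, w

trace, w = 0.0, 0.0
trace, w = step(trace, 1.0, 1.0, 0.0, w)   # coincident spikes, no reward yet
for _ in range(100):                        # reward arrives over the next second
    trace, w = step(trace, 0.0, 0.0, 1.0, w)
```

Because tau_e is on the order of seconds, a reward arriving a second after the coincidence still finds a residual trace, which is how the rule bridges the behavioral and spike time scales.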
Affiliation(s)
- Wulfram Gerstner
- School of Computer Science and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
15
Concurrence of form and function in developing networks and its role in synaptic pruning. Nat Commun 2018; 9:2236. [PMID: 29884799 PMCID: PMC5993834 DOI: 10.1038/s41467-018-04537-6]
Abstract
A fundamental question in neuroscience is how structure and function of neural systems are related. We study this interplay by combining a familiar auto-associative neural network with an evolving mechanism for the birth and death of synapses. A feedback loop then arises, leading to two qualitatively different types of behaviour. In one, the network structure becomes heterogeneous and disassortative, and the system displays good memory performance; furthermore, the structure is optimised for the particular memory patterns stored during the process. In the other, the structure remains homogeneous and incapable of pattern retrieval. These findings provide an inspiring picture of brain structure and dynamics that is compatible with experimental results on early brain development, and may help to explain synaptic pruning. Other evolving networks, such as those of protein interactions, might share the basic ingredients for this feedback loop, and indeed many of their structural features are as predicted by our model. How structure and function coevolve in developing brains is little understood. Here, the authors study a coupled model of network development and memory, and find that, due to this feedback, networks with some initial memory capacity evolve into heterogeneous structures with high memory performance.
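A crude stand-in for the mechanism above, assuming a Hopfield-type network and replacing the paper's stochastic birth/death dynamics with a single weight-dependent pruning step (sizes and thresholds are illustrative), shows that structure adapted to the stored patterns can preserve retrieval:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 5
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weights of an auto-associative (Hopfield-type) network
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

# Stand-in for synapse death: delete the weakest half of the links
# (the paper evolves births and deaths stochastically; this only
# illustrates that pruning guided by the learned weights keeps recall)
keep = np.abs(W) > np.quantile(np.abs(W), 0.5)
W_sparse = W * keep

def recall(W, cue, steps=10):
    s = cue.copy().astype(float)
    for _ in range(steps):
        s = np.sign(W @ s + 1e-12)           # synchronous sign update
    return s

noisy = patterns[0] * np.where(rng.random(N) < 0.1, -1, 1)  # 10% bit flips
overlap = (recall(W_sparse, noisy) * patterns[0]).mean()
```

Despite losing half its synapses, the diluted network still completes the noisy cue, because the surviving strong links are exactly those that encode the stored patterns.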
16
Mongillo G, Rumpel S, Loewenstein Y. Intrinsic volatility of synaptic connections — a challenge to the synaptic trace theory of memory. Curr Opin Neurobiol 2017; 46:7-13. [DOI: 10.1016/j.conb.2017.06.006]
17
Fauth M, Tetzlaff C. Opposing Effects of Neuronal Activity on Structural Plasticity. Front Neuroanat 2016; 10:75. [PMID: 27445713 PMCID: PMC4923203 DOI: 10.3389/fnana.2016.00075]
Abstract
The connectivity of the brain is continuously adjusted to new environmental influences by several activity-dependent adaptive processes. The most investigated adaptive mechanism is activity-dependent functional or synaptic plasticity, which regulates the transmission efficacy of existing synapses. Another important but less prominently discussed adaptive process is structural plasticity, which changes the connectivity by the formation and deletion of synapses. In this review, we show, based on experimental evidence, that structural plasticity can be classified, similarly to synaptic plasticity, into two categories: (i) Hebbian structural plasticity, which leads to an increase (decrease) in the number of synapses during phases of high (low) neuronal activity, and (ii) homeostatic structural plasticity, which balances these changes by removing and adding synapses. Furthermore, based on experimental and theoretical insights, we argue that each type of structural plasticity fulfills a different function. While Hebbian structural changes enhance memory lifetime, storage capacity, and memory robustness, homeostatic structural plasticity self-organizes the connectivity of the neural network to ensure stability. However, the link between functional synaptic and structural plasticity, as well as the detailed interactions between Hebbian and homeostatic structural plasticity, are more complex. This implies even richer dynamics, requiring further experimental and theoretical investigations.
Affiliation(s)
- Michael Fauth
- Department of Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August University, Göttingen, Germany; Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Christian Tetzlaff
- Bernstein Center for Computational Neuroscience, Göttingen, Germany; Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
18
Knoblauch A, Sommer FT. Structural Plasticity, Effectual Connectivity, and Memory in Cortex. Front Neuroanat 2016; 10:63. [PMID: 27378861 PMCID: PMC4909771 DOI: 10.3389/fnana.2016.00063]
Abstract
Learning and memory are commonly attributed to the modification of synaptic strengths in neuronal networks. More recent experiments have also revealed a major role of structural plasticity, including elimination and regeneration of synapses, growth and retraction of dendritic spines, and remodeling of axons and dendrites. Here we work out the idea that one likely function of structural plasticity is to increase "effectual connectivity" in order to improve the capacity of sparsely connected networks to store Hebbian cell assemblies that are supposed to represent memories. For this, we define effectual connectivity as the fraction of synaptically linked neuron pairs within a cell assembly representing a memory. We show by theory and numerical simulation the close links between effectual connectivity and both the information storage capacity of neural networks and the effective connectivity commonly employed in functional brain imaging and connectome analysis. Then, by applying our model to a recently proposed memory model, we give improved estimates of the number of cell assemblies that can be stored in a cortical macrocolumn assuming realistic connectivity. Finally, we derive a simplified model of structural plasticity to enable large-scale simulation of memory phenomena, and apply our model to link ongoing adult structural plasticity to recent behavioral data on the spacing effect of learning.
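The central quantity defined above is easy to compute; a small sketch, assuming a random binary connectivity matrix and a randomly chosen assembly (sizes and connection probability are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
N, p = 1000, 0.1
A = rng.random((N, N)) < p                     # sparse random adjacency
assembly = rng.choice(N, size=50, replace=False)

# Effectual connectivity as defined in the abstract: the fraction of
# synaptically linked (ordered) neuron pairs within the cell assembly
sub = A[np.ix_(assembly, assembly)]
pairs = len(assembly) * (len(assembly) - 1)
c_eff = (sub.sum() - np.trace(sub)) / pairs
```

Structural plasticity that grows synapses preferentially inside assemblies raises c_eff above the baseline connectivity p, which is the effect the authors link to storage capacity.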
Affiliation(s)
- Andreas Knoblauch
- Informatics Faculty, Albstadt-Sigmaringen University, Albstadt, Germany
- Friedrich T Sommer
- Redwood Center for Theoretical Neuroscience, University of California at Berkeley, Berkeley, CA, USA
19
Hiratani N, Fukai T. Hebbian Wiring Plasticity Generates Efficient Network Structures for Robust Inference with Synaptic Weight Plasticity. Front Neural Circuits 2016; 10:41. [PMID: 27303271 PMCID: PMC4885844 DOI: 10.3389/fncir.2016.00041]
Abstract
In the adult mammalian cortex, a small fraction of spines are created and eliminated every day, and the resultant synaptic connection structure is highly nonrandom, even in local circuits. However, it remains unknown whether a particular synaptic connection structure is functionally advantageous in local circuits, and why the creation and elimination of synaptic connections are necessary in addition to rich synaptic weight plasticity. To answer these questions, we studied an inference task model through theoretical and numerical analyses. We demonstrate that a robustly beneficial network structure naturally emerges by combining Hebbian-type synaptic weight plasticity and wiring plasticity. Especially in a sparsely connected network, wiring plasticity achieves reliable computation by enabling efficient information transmission. Furthermore, the proposed rule reproduces the experimentally observed correlation between spine dynamics and task performance.
Affiliation(s)
- Naoki Hiratani
- Department of Complexity Science and Engineering, The University of Tokyo, Kashiwa, Japan; Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan
- Tomoki Fukai
- Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan
20
Fauth M, Wörgötter F, Tetzlaff C. Formation and Maintenance of Robust Long-Term Information Storage in the Presence of Synaptic Turnover. PLoS Comput Biol 2015; 11:e1004684. [PMID: 26713858 PMCID: PMC4699846 DOI: 10.1371/journal.pcbi.1004684]
Abstract
A long-standing problem is how memories can be stored for very long times despite the volatility of the underlying neural substrate, most notably the high turnover of dendritic spines and synapses. To address this problem, we here use a generic and simple probabilistic model for the creation and removal of synapses. We show that information can be stored for several months when utilizing the intrinsic dynamics of multi-synapse connections. In such systems, single synapses can still show high turnover, which enables fast learning of new information, but this does not perturb previously stored information (slow forgetting), which is represented by the compound state of the connections. The model matches the time course of recent experimental spine data during learning and memory in mice, supporting the assumption of multi-synapse connections as the basis for long-term storage. It is widely believed that information is stored in the connectivity, i.e., the synapses, of neural networks. Yet the morphological correlates of excitatory synapses, the dendritic spines, have been found to undergo a remarkable turnover on a daily basis. This poses the question of how information can be retained on such a variable substrate. In this study, using connections with multiple synapses, we show that connections which follow the experimentally measured bimodal distribution in the number of synapses can store information orders of magnitude longer than the lifetime of a single synapse. This is a consequence of the underlying bistable collective dynamics of multiple synapses: single synapses can appear and disappear without disturbing the memory as a whole. Furthermore, increasing or decreasing neural activity changes the distribution of the number of synapses of multi-synaptic connections such that only one of the peaks remains. This leads to a desirable property: information about these altered activities can be stored much faster than it is forgotten. Remarkably, the resulting model dynamics match recent experimental data investigating the long-term effect of learning on the dynamics of dendritic spines.
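A generic birth/death sketch of this mechanism, with illustrative rates rather than the paper's fitted ones, shows the effect: a connection backed by several synapses with cooperative creation outlives a lone synapse by orders of magnitude:

```python
import numpy as np

rng = np.random.default_rng(3)
p_del = 0.1                                   # per-synapse removal probability

def lifetime(n0, p_add, max_steps=2_000):
    """Steps until the whole connection (n -> 0) has disappeared."""
    n = n0
    for t in range(1, max_steps + 1):
        n -= rng.binomial(n, p_del)           # independent synapse removals
        if n > 0 and rng.random() < p_add(n):
            n += 1                            # creation while connection lives
        if n == 0:
            return t
    return max_steps                          # still alive at the horizon

# A lone synapse lives ~1/p_del steps; a 5-synapse connection whose
# creation probability grows with n (hypothetical form) persists far longer.
single = np.mean([lifetime(1, lambda n: 0.0) for _ in range(100)])
multi = np.mean([lifetime(5, lambda n: min(1.0, 0.3 * n)) for _ in range(100)])
```

Single synapses keep turning over inside the multi-synapse connection, yet the compound state, the existence of the connection, is effectively stable, which is the paper's storage mechanism.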
Affiliation(s)
- Michael Fauth
- Third Physics Institute, Georg-August University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Florentin Wörgötter
- Third Physics Institute, Georg-August University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Christian Tetzlaff
- Max-Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel
21
Kappel D, Habenschuss S, Legenstein R, Maass W. Network Plasticity as Bayesian Inference. PLoS Comput Biol 2015; 11:e1004485. [PMID: 26545099 PMCID: PMC4636322 DOI: 10.1371/journal.pcbi.1004485]
Abstract
General results from statistical learning theory suggest understanding not only brain computations but also brain plasticity as probabilistic inference; a concrete model for this, however, has been missing. We propose that inherently stochastic features of synaptic plasticity and spine motility enable cortical networks of neurons to carry out probabilistic inference by sampling from a posterior distribution of network configurations. This model provides a viable alternative to existing models that propose convergence of parameters to maximum likelihood values. It explains how priors on weight distributions and connection probabilities can be merged optimally with learned experience, how cortical networks can generalize learned information so well to novel experiences, and how they can compensate continuously for unforeseen disturbances of the network. The resulting new theory of network plasticity explains, from a functional perspective, a number of experimental data on stochastic aspects of synaptic plasticity that previously appeared quite puzzling.
Affiliation(s)
- David Kappel
- Institute for Theoretical Computer Science, Graz University of Technology, A-8010 Graz, Austria
- Stefan Habenschuss
- Institute for Theoretical Computer Science, Graz University of Technology, A-8010 Graz, Austria
- Robert Legenstein
- Institute for Theoretical Computer Science, Graz University of Technology, A-8010 Graz, Austria
- Wolfgang Maass
- Institute for Theoretical Computer Science, Graz University of Technology, A-8010 Graz, Austria
22
Liu KKL, Bartsch RP, Lin A, Mantegna RN, Ivanov PC. Plasticity of brain wave network interactions and evolution across physiologic states. Front Neural Circuits 2015; 9:62. [PMID: 26578891 PMCID: PMC4620446 DOI: 10.3389/fncir.2015.00062]
Abstract
Neural plasticity spans a range of spatio-temporal scales and serves as the basis of various brain activities and physiologic functions. At the microscopic level, it enables the emergence of brain waves with complex temporal dynamics. At the macroscopic level, the presence and dominance of specific brain waves are associated with important brain functions. How neural plasticity at different levels generates distinct brain rhythms, and how brain rhythms communicate with each other across brain areas to generate physiologic states and functions, remains poorly understood. Here we perform an empirical exploration of neural plasticity at the level of brain wave network interactions representing dynamical communications within and between different brain areas in the frequency domain. We introduce the concept of time delay stability (TDS) to quantify coordinated bursts in the activity of brain waves, and we employ a system-wide Network Physiology integrative approach to probe the network of coordinated brain wave activations and its evolution across physiologic states. We find an association between network structure and physiologic states. We uncover a hierarchical reorganization in the brain wave networks in response to changes in physiologic state, indicating new aspects of neural plasticity at the integrated level. Globally, we find that the entire brain network undergoes a pronounced transition from low connectivity in Deep Sleep and REM to high connectivity in Light Sleep and Wake. In contrast, locally, different brain areas exhibit different network dynamics of brain wave interactions to achieve differentiation in function during different sleep stages. Moreover, our analyses indicate that plasticity also emerges in frequency-specific networks, which represent interactions across brain locations mediated through a specific frequency band. Comparing frequency-specific networks within the same physiologic state, we find very different degrees of network connectivity and link strength, while each frequency-specific network is characterized by a different signature pattern of sleep-stage stratification, reflecting a remarkable flexibility in response to changes in physiologic state. These new aspects of neural plasticity demonstrate that, in addition to dominant brain waves, the network of brain wave interactions is a previously unrecognized hallmark of physiologic state and function.
Affiliation(s)
- Kang K. L. Liu
- Laboratory for Network Physiology, Department of Physics, Boston University, Boston, MA, USA
- Department of Neurology, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, USA
- Aijing Lin
- Laboratory for Network Physiology, Department of Physics, Boston University, Boston, MA, USA
- Department of Mathematics, School of Science, Beijing Jiaotong University, Beijing, China
- Rosario N. Mantegna
- Dipartimento di Fisica e Chimica, Viale delle Scienze, University of Palermo, Palermo, Italy
- Center for Network Science and Department of Economics, Central European University, Budapest, Hungary
- Plamen Ch. Ivanov
- Laboratory for Network Physiology, Department of Physics, Boston University, Boston, MA, USA
- Division of Sleep Medicine, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA
- Institute of Solid State Physics, Bulgarian Academy of Sciences, Sofia, Bulgaria
23
Zaytsev YV, Morrison A, Deger M. Reconstruction of recurrent synaptic connectivity of thousands of neurons from simulated spiking activity. J Comput Neurosci 2015; 39:77-103. [PMID: 26041729 PMCID: PMC4493949 DOI: 10.1007/s10827-015-0565-5]
Abstract
Dynamics and function of neuronal networks are determined by their synaptic connectivity. Current experimental methods to analyze synaptic network structure on the cellular level, however, cover only small fractions of functional neuronal circuits, typically without a simultaneous record of neuronal spiking activity. Here we present a method for the reconstruction of large recurrent neuronal networks from thousands of parallel spike train recordings. We employ maximum likelihood estimation of a generalized linear model of the spiking activity in continuous time. For this model the point process likelihood is concave, such that a global optimum of the parameters can be obtained by gradient ascent. Previous methods, including those of the same class, did not allow recurrent networks of that order of magnitude to be reconstructed due to prohibitive computational cost and numerical instabilities. We describe a minimal model that is optimized for large networks and an efficient scheme for its parallelized numerical optimization on generic computing clusters. For a simulated balanced random network of 1000 neurons, synaptic connectivity is recovered with a misclassification error rate of less than 1 % under ideal conditions. We show that the error rate remains low in a series of example cases under progressively less ideal conditions. Finally, we successfully reconstruct the connectivity of a hidden synfire chain that is embedded in a random network, which requires clustering of the network connectivity to reveal the synfire groups. Our results demonstrate how synaptic connectivity could potentially be inferred from large-scale parallel spike train recordings.
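The paper works in continuous time on thousands of neurons; a discrete-time toy version of the same idea (a GLM with concave log-likelihood, fit by gradient ascent to a global optimum) can be sketched for a handful of neurons. All sizes, rates, and the learning schedule below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
N, T = 5, 20_000

# Ground-truth coupling matrix (discrete-time stand-in for the paper's
# continuous-time point-process GLM)
J = rng.normal(0.0, 1.0, (N, N)) * (rng.random((N, N)) < 0.4)
np.fill_diagonal(J, 0.0)
b = -2.0                                      # baseline log-odds of spiking

# Simulate spiking: P(spike in bin t) depends on spikes in bin t-1
S = np.zeros((T, N))
for t in range(1, T):
    p = 1.0 / (1.0 + np.exp(-(b + S[t - 1] @ J)))
    S[t] = rng.random(N) < p

# Maximum-likelihood estimation by gradient ascent; since the
# log-likelihood is concave, this converges to the global optimum
X, Y = S[:-1], S[1:]
Jh, bh = np.zeros((N, N)), np.zeros(N)
for _ in range(500):
    P = 1.0 / (1.0 + np.exp(-(bh + X @ Jh)))
    Jh += 5.0 * (X.T @ (Y - P)) / len(X)      # gradient of mean log-likelihood
    bh += 5.0 * (Y - P).mean(axis=0)
np.fill_diagonal(Jh, 0.0)
```

With enough data the estimated couplings Jh correlate tightly with the ground truth J; scaling this optimization to thousands of neurons on computing clusters is exactly the problem the paper addresses.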
Affiliation(s)
- Yury V. Zaytsev
- Simulation Laboratory Neuroscience – Bernstein Facility for Simulation and Database Technology, Institute for Advanced Simulation, Jülich Aachen Research Alliance, Jülich Research Center, Jülich, Germany
- Faculty of Biology, Albert-Ludwig University of Freiburg, Freiburg im Breisgau, Germany
- Forschungszentrum Jülich GmbH, Jülich Supercomputing Center (JSC), 52425 Jülich, Germany
- Abigail Morrison
- Simulation Laboratory Neuroscience – Bernstein Facility for Simulation and Database Technology, Institute for Advanced Simulation, Jülich Aachen Research Alliance, Jülich Research Center, Jülich, Germany
- Institute for Advanced Simulation (IAS-6), Theoretical Neuroscience & Institute of Neuroscience and Medicine (INM-6), Computational and Systems Neuroscience, Jülich Research Center and JARA, Jülich, Germany
- Institute of Cognitive Neuroscience, Faculty of Psychology, Ruhr-University Bochum, Bochum, Germany
- Moritz Deger
- School of Life Sciences, Brain Mind Institute and School of Computer and Communication Sciences, École polytechnique fédérale de Lausanne (EPFL), 1015 Lausanne, Switzerland
24
Fauth M, Wörgötter F, Tetzlaff C. The formation of multi-synaptic connections by the interaction of synaptic and structural plasticity and their functional consequences. PLoS Comput Biol 2015; 11:e1004031. [PMID: 25590330 PMCID: PMC4295841 DOI: 10.1371/journal.pcbi.1004031]
Abstract
Cortical connectivity emerges from the permanent interaction between neuronal activity and synaptic as well as structural plasticity. An important experimentally observed feature of this connectivity is the distribution of the number of synapses from one neuron to another, which has been measured in several cortical layers. All of these distributions are bimodal, with one peak at zero and a second one at a small number (3-8) of synapses. In this study, using a probabilistic model of structural plasticity that depends on the synaptic weights, we explore how these distributions can emerge and which functional consequences they have. We find that bimodal distributions arise generically from the interaction of structural plasticity with synaptic plasticity rules that fulfill the following biologically realistic constraints: First, the synaptic weights have to grow with the postsynaptic activity. Second, this growth curve and/or the input-output relation of the postsynaptic neuron have to change sub-linearly (negative curvature). As most neurons show such input-output relations, these constraints can be fulfilled by many biologically plausible systems. Given such a system, we show that the different activities which can explain the layer-specific distributions correspond to experimentally observed activities. Considering these activities as the working point of the system and varying the pre- or postsynaptic stimulation reveals a hysteresis in the number of synapses. As a consequence, the connectivity between two neurons can be controlled by activity but is also safeguarded against overly fast changes. These results indicate that the complex dynamics between activity and plasticity will, already between a pair of neurons, induce a variety of possible stable synaptic distributions, which could support memory mechanisms.
The connectivity between neurons is modified by different mechanisms. On a time scale of minutes to hours one finds synaptic plasticity, whereas mechanisms for structural changes at axons or dendrites may take days. One main factor determining structural changes is the weight of a connection, which, in turn, is adapted by synaptic plasticity. Both mechanisms, synaptic and structural plasticity, are influenced and determined by the activity pattern in the network. Hence, it is important to understand how activity and the different plasticity mechanisms influence each other. In particular, how activity influences rewiring in adult networks is still an open question. We present a model which captures these complex interactions by abstracting structural plasticity with weight-dependent probabilities. This allows for calculating the distribution of the number of synapses between two neurons analytically. We report that biologically realistic connection patterns for different cortical layers generically arise with synaptic plasticity rules in which the synaptic weights grow with postsynaptic activity. The connectivity patterns also lead to different activity levels resembling those found in the different cortical layers. Interestingly, such a system exhibits hysteresis, by which connections remain stable longer than expected, which may add to the stability of information storage in the network.
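The abstraction described above, structural plasticity as weight-dependent probabilities over a pool of potential synapses, can be sketched as follows; the functional forms, constants, and the stand-in weight growth are illustrative, not the paper's fitted rules:

```python
import numpy as np

rng = np.random.default_rng(5)

def p_delete(w, w0=0.5):
    """Illustrative weight-dependent removal probability:
    strong synapses are much less likely to be deleted."""
    return 0.05 / (1.0 + (w / w0) ** 2)

p_create = 0.02          # creation probability per empty potential synapse
M = 8                    # potential synapses between the neuron pair
w = np.zeros(M)          # weight 0 marks an empty site

counts = []
for _ in range(5_000):
    live = w > 0
    die = live & (rng.random(M) < p_delete(w))
    born = ~live & (rng.random(M) < p_create)
    w[die] = 0.0
    w[born] = 0.1                                  # nascent synapse
    grow = w > 0
    w[grow] = np.minimum(w[grow] + 0.01, 1.0)      # stand-in for weight growth
    counts.append(int((w > 0).sum()))
mean_n = float(np.mean(counts[-1000:]))            # stationary synapse number
```

Because deletion probability falls off with weight, occupied sites are sticky once their synapses have grown strong; this stickiness is the hysteresis-like stability of connections that the authors report.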
Affiliation(s)
- Michael Fauth
- Georg-August University Göttingen, Third Institute of Physics, Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Florentin Wörgötter
- Georg-August University Göttingen, Third Institute of Physics, Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Christian Tetzlaff
- Georg-August University Göttingen, Third Institute of Physics, Bernstein Center for Computational Neuroscience, Göttingen, Germany
25
Knoblauch A, Körner E, Körner U, Sommer FT. Structural synaptic plasticity has high memory capacity and can explain graded amnesia, catastrophic forgetting, and the spacing effect. PLoS One 2014; 9:e96485. [PMID: 24858841 PMCID: PMC4032253 DOI: 10.1371/journal.pone.0096485]
Abstract
Although William James and, more explicitly, Donald Hebb's theory of cell assemblies already suggested that activity-dependent rewiring of neuronal networks is the substrate of learning and memory, over the last six decades most theoretical work on memory has focused on plasticity of existing synapses in prewired networks. Research in the last decade has emphasized that structural modification of synaptic connectivity is common in the adult brain and tightly correlated with learning and memory. Here we present a parsimonious computational model for learning by structural plasticity. The basic modeling units are "potential synapses", defined as locations in the network where synapses can potentially grow to connect two neurons. This model generalizes well-known previous models for associative learning based on weight plasticity; therefore, existing theory can be applied to analyze how many memories and how much information structural plasticity can store in a synapse. Surprisingly, we find that structural plasticity largely outperforms weight plasticity and can achieve a much higher storage capacity per synapse. The effect of structural plasticity on the structure of sparsely connected networks is quite intuitive: structural plasticity increases the "effectual network connectivity", that is, the network wiring that specifically supports storage and recall of the memories. Further, this model of structural plasticity produces gradients of effectual connectivity in the course of learning, thereby explaining various cognitive phenomena including graded amnesia, catastrophic forgetting, and the spacing effect.
Affiliation(s)
- Andreas Knoblauch
- Engineering Faculty, Albstadt-Sigmaringen University, Albstadt, Germany
- Honda Research Institute Europe, Offenbach am Main, Germany
- Edgar Körner
- Honda Research Institute Europe, Offenbach am Main, Germany
- Ursula Körner
- Honda Research Institute Europe, Offenbach am Main, Germany
- Friedrich T. Sommer
- Redwood Center for Theoretical Neuroscience, University of California, Berkeley, California, United States of America
26
Butz M, van Ooyen A. A simple rule for dendritic spine and axonal bouton formation can account for cortical reorganization after focal retinal lesions. PLoS Comput Biol 2013; 9:e1003259. [PMID: 24130472 PMCID: PMC3794906 DOI: 10.1371/journal.pcbi.1003259]
Abstract
Lasting alterations in sensory input trigger massive structural and functional adaptations in cortical networks. The principles governing these experience-dependent changes are, however, poorly understood. Here, we examine whether a simple rule based on the neurons' need for homeostasis in electrical activity may serve as the driving force for cortical reorganization. According to this rule, a neuron creates new spines and boutons when its level of electrical activity is below a homeostatic set-point and decreases the number of spines and boutons when its activity exceeds this set-point. In addition, neurons need a minimum level of activity to form spines and boutons. Spine and bouton formation depends solely on the neuron's own activity level, and synapses are formed by merging spines and boutons independently of activity. Using a novel computational model, we show that this simple growth rule produces neuron and network changes as observed in the visual cortex after focal retinal lesions. In the model, as in the cortex, the turnover of dendritic spines was increased most strongly in the center of the lesion projection zone, while axonal boutons displayed a marked overshoot followed by pruning. Moreover, the decrease in external input was compensated for by the formation of new horizontal connections, which caused a retinotopic remapping. Homeostatic regulation may provide a unifying framework for understanding cortical reorganization, including network repair in degenerative diseases or following focal stroke.
The adult brain is less hard-wired than traditionally thought. About ten percent of the synapses in the mature visual cortex are continually replaced by new ones (structural plasticity). This percentage greatly increases after lasting changes in visual input. Due to the topographically organized nerve connections from the retina in the eye to the primary visual cortex in the brain, a small circumscribed lesion in the retina leads to a defined area in the cortex that is deprived of input. Recent experimental studies have revealed that axonal sprouting and dendritic spine turnover are massively increased in and around the cortical area that is deprived of input. However, the driving forces for this structural plasticity remain unclear. Using a novel computational model, we examine whether the need for activity homeostasis of individual neurons may drive cortical reorganization after lasting changes in input activity. We show that homeostatic growth rules indeed give rise to structural and functional reorganization of neuronal networks similar to the cortical reorganization observed experimentally. Understanding the principles of structural plasticity may eventually lead to novel treatment strategies for stimulating functional reorganization after brain damage and neurodegeneration.
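The growth rule described above is simple enough to state in a few lines; the constants and the linear activity model below are illustrative stand-ins for the model's neurons, not values from the paper:

```python
# Sketch of the homeostatic growth rule: a neuron adds synaptic elements
# (spines/boutons) when its activity is below a set-point, removes them
# when above, and needs a minimum activity to form anything at all.
SET_POINT, ETA, A_MIN = 1.0, 0.05, 0.1

def update_elements(elements, activity):
    if activity < A_MIN:            # too silent to build new elements
        return elements
    return max(0.0, elements + ETA * (SET_POINT - activity))

# Toy neuron whose activity grows with its synapse count; after input
# loss (external drive below the set-point) it regrows elements until
# homeostasis is restored
elements, drive = 5.0, 0.3
for _ in range(500):
    activity = drive + 0.1 * elements
    elements = update_elements(elements, activity)
```

With these numbers the fixed point is activity = SET_POINT, i.e. elements = 7: the deprived neuron compensates for the lost drive by growing new connections, which is the mechanism behind the remapping described in the abstract.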
Affiliation(s)
- Markus Butz
- Simulation Lab Neuroscience - Bernstein Facility for Simulation and Database Technology, Institute for Advanced Simulation, Jülich Aachen Research Alliance, Forschungszentrum Jülich, Jülich, Germany