1
Eckmann S, Young EJ, Gjorgjieva J. Synapse-type-specific competitive Hebbian learning forms functional recurrent networks. Proc Natl Acad Sci U S A 2024; 121:e2305326121. [PMID: 38870059 PMCID: PMC11194505 DOI: 10.1073/pnas.2305326121]
Abstract
Cortical networks exhibit complex stimulus-response patterns that are based on specific recurrent interactions between neurons. For example, the balance between excitatory and inhibitory currents has been identified as a central component of cortical computations. However, it remains unclear how the required synaptic connectivity can emerge in developing circuits where synapses between excitatory and inhibitory neurons are simultaneously plastic. Using theory and modeling, we propose that a wide range of cortical response properties can arise from a single plasticity paradigm that acts simultaneously at all excitatory and inhibitory connections: Hebbian learning that is stabilized by the synapse-type-specific competition for a limited supply of synaptic resources. In plastic recurrent circuits, this competition enables the formation and decorrelation of inhibition-balanced receptive fields. Networks develop an assembly structure with stronger synaptic connections between similarly tuned excitatory and inhibitory neurons and exhibit response normalization and orientation-specific center-surround suppression, reflecting the stimulus statistics during training. These results demonstrate how neurons can self-organize into functional networks and suggest an essential role for synapse-type-specific competitive learning in the development of cortical circuits.
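The stabilization mechanism summarized in this abstract, Hebbian growth held in check by a synapse-type-specific competition for a limited pool of synaptic resources, can be sketched in a toy rate model. The rule below is an illustrative sketch, not the paper's actual formulation: the pool sizes `R_E` and `R_I`, the learning rate, and the rectified-rate neuron are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_exc, n_inh = 20, 5          # presynaptic excitatory / inhibitory inputs
W_E = rng.random(n_exc)       # excitatory weights onto one postsynaptic cell
W_I = rng.random(n_inh)       # inhibitory weights
R_E, R_I = 10.0, 4.0          # limited resource per synapse type (assumed)
eta = 0.05                    # learning rate (assumed)

def step(x_E, x_I):
    """One Hebbian update, stabilized by synapse-type-specific competition."""
    global W_E, W_I
    r = max(W_E @ x_E - W_I @ x_I, 0.0)   # rectified postsynaptic rate
    W_E += eta * r * x_E                   # Hebbian: pre * post
    W_I += eta * r * x_I
    W_E *= R_E / W_E.sum()                 # competition: each synapse type
    W_I *= R_I / W_I.sum()                 # shares a fixed resource pool

for _ in range(200):
    step(rng.random(n_exc), rng.random(n_inh))
```

After any number of updates the total excitatory and inhibitory weights stay pinned to their resource budgets, so individual synapses can only grow at the expense of others of the same type.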
Affiliation(s)
- Samuel Eckmann
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt am Main 60438, Germany
- Computational and Biological Learning Lab, Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, United Kingdom
- Edward James Young
- Computational and Biological Learning Lab, Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, United Kingdom
- Julijana Gjorgjieva
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt am Main 60438, Germany
- School of Life Sciences, Technical University Munich, Freising 85354, Germany
2
Bressloff PC. Asymptotic analysis of particle cluster formation in the presence of anchoring sites. Eur Phys J E Soft Matter 2024; 47:30. [PMID: 38720027 PMCID: PMC11078859 DOI: 10.1140/epje/s10189-024-00425-8]
Abstract
The aggregation or clustering of proteins and other macromolecules plays an important role in the formation of large-scale molecular assemblies within cell membranes. Examples of such assemblies include lipid rafts and postsynaptic domains (PSDs) at excitatory and inhibitory synapses in neurons. PSDs are rich in scaffolding proteins that can transiently trap transmembrane neurotransmitter receptors, thus localizing them at specific spatial positions. Hence, PSDs play a key role in determining the strength of synaptic connections and their regulation during learning and memory. Recently, a two-dimensional (2D) diffusion-mediated aggregation model of PSD formation has been developed in which the spatial locations of the clusters are determined by a set of fixed anchoring sites. The system is kept out of equilibrium by the recycling of particles between the cell membrane and interior. This results in a stationary distribution consisting of multiple clusters, whose average size can be determined using an effective mean-field description of the particle concentration around each anchored cluster. In this paper, we derive corrections to the mean-field approximation by applying the theory of diffusion in singularly perturbed domains. The latter is a powerful analytical method for solving 2D and three-dimensional (3D) diffusion problems in domains from whose interior small holes or perforations have been removed. Applications range from modeling intracellular diffusion, where interior holes could represent subcellular structures such as organelles or biological condensates, to tracking the spread of chemical pollutants or heat from localized sources. Here, we take the bounded domain to be the cell membrane and the holes to represent anchored clusters. The analysis proceeds by partitioning the membrane into a set of inner regions around each cluster and an outer region where mean-field interactions occur. Asymptotically matching the inner and outer stationary solutions generates an asymptotic expansion of the particle concentration that includes higher-order corrections to mean-field theory, which depend on the positions of the clusters and the boundary of the domain. Motivated by a recent study of light-activated protein oligomerization in cells, we also develop the analogous theory for cluster formation in a 3D domain. The details of the asymptotic analysis differ from the 2D case due to the contrasting singularity structure of the 2D and 3D Green's functions.
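The matched-asymptotics structure described in this abstract follows the standard framework for strongly localized perturbations in 2D. As a hedged, generic sketch (representative of this class of problems, not the paper's exact equations): with clusters of small radius $\varepsilon$ at positions $\mathbf{x}_j$, the outer concentration is expanded in the Neumann Green's function $G$ with cluster strengths $S_j$,

```latex
u(\mathbf{x}) \;\sim\; \bar{u} \;-\; 2\pi\nu \sum_{j=1}^{N} S_j\, G(\mathbf{x};\mathbf{x}_j),
\qquad \nu \equiv -\frac{1}{\ln \varepsilon},
```

and matching the logarithmic inner behaviour near each cluster to this outer field yields a linear system for the $S_j$,

```latex
S_i\bigl(1 + 2\pi\nu\, R(\mathbf{x}_i)\bigr)
 \;+\; 2\pi\nu \sum_{j\neq i} S_j\, G(\mathbf{x}_i;\mathbf{x}_j) \;=\; \bar{u},
\qquad i = 1,\dots,N,
```

where $R(\mathbf{x}_i)$ is the regular part of $G$. It is through $G$ and $R$ that the cluster positions and the domain boundary enter the higher-order corrections to mean-field theory.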
Affiliation(s)
- Paul C Bressloff
- Department of Mathematics, Imperial College London, London, SW7 2AZ, UK.
3
Koesters AG, Rich MM, Engisch KL. Diverging from the Norm: Reevaluating What Miniature Excitatory Postsynaptic Currents Tell Us about Homeostatic Synaptic Plasticity. Neuroscientist 2024; 30:49-70. [PMID: 35904350 DOI: 10.1177/10738584221112336]
Abstract
The idea that the nervous system maintains a set point of network activity, and homeostatically returns to that set point in the face of dramatic disruption (during development, after injury, in pathologic states, and during sleep/wake cycles), is rapidly becoming accepted as a key plasticity behavior, placing it alongside long-term potentiation and depression. The dramatic growth in studies of homeostatic synaptic plasticity of miniature excitatory postsynaptic currents (mEPSCs) is attributable, in part, to the simple yet elegant mechanism of uniform multiplicative scaling proposed by Turrigiano and colleagues: that neurons sense their own activity and globally multiply the strength of every synapse by a single factor to return activity to the set point without altering established differences in synaptic weights. We have recently shown that for mEPSCs recorded from control and activity-blocked cultures of mouse cortical neurons, the synaptic scaling factor is not uniform: it is close to 1 for the smallest mEPSC amplitudes and progressively increases as mEPSC amplitudes increase, a phenomenon we term divergent scaling. Using insights gained from simulating uniform multiplicative scaling, we review evidence from published studies and conclude that divergent synaptic scaling is the norm rather than the exception. This conclusion has implications for hypotheses about the molecular mechanisms underlying synaptic scaling.
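The distinction between uniform multiplicative scaling and divergent scaling can be made concrete with a few lines of arithmetic. The linear amplitude-dependent factor below is purely illustrative (the review does not commit to this functional form), as are the amplitude range and constants.

```python
import numpy as np

mepsc = np.linspace(5.0, 60.0, 12)   # control mEPSC amplitudes (pA), assumed

# Uniform multiplicative scaling: every synapse multiplied by one factor.
uniform = 1.3 * mepsc

# Divergent scaling (illustrative form): the factor is ~1 for the smallest
# amplitudes and grows with amplitude; the linear form is an assumption.
factor = 1.0 + 0.01 * (mepsc - mepsc.min())
divergent = factor * mepsc
```

Under the uniform rule every amplitude changes by the same ratio; under the divergent rule the smallest events are nearly untouched while the largest change the most, which is what distinguishes the two hypotheses in a rank-ordered amplitude comparison.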
Affiliation(s)
- Andrew G Koesters
- Department of Behavior, Cognition, and Neurophysiology, Environmental Health Effects Laboratory, Naval Medical Research Unit-Dayton, Wright-Patterson AFB, OH, USA
- Mark M Rich
- Department of Neuroscience, Cell Biology, and Physiology, College of Science and Mathematics, and Boonshoft School of Medicine, Wright State University, Dayton, OH, USA
- Kathrin L Engisch
- Department of Neuroscience, Cell Biology, and Physiology, College of Science and Mathematics, and Boonshoft School of Medicine, Wright State University, Dayton, OH, USA
4
Pache A, van Rossum MCW. Energetically efficient learning in neuronal networks. Curr Opin Neurobiol 2023; 83:102779. [PMID: 37672980 DOI: 10.1016/j.conb.2023.102779]
Abstract
Human and animal experiments have shown that acquiring and storing information can require substantial amounts of metabolic energy. However, computational models of neural plasticity rarely take this cost into account and might thereby miss an important constraint on biological learning. This review explores various ways to reduce the energy requirements of learning in neural networks. By comparing the resulting learning rules to cognitive and neurophysiological observations, we discuss how energy efficiency might have shaped biological learning.
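One recurring idea in this literature is to treat the total amount of synaptic change as a proxy for metabolic cost and to suppress updates that buy little error reduction. The sketch below is a generic illustration of that idea under assumed constants (threshold, learning rate, quadratic loss); it is not a specific rule from the review.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))     # inputs
w_true = rng.standard_normal(10)
y = X @ w_true                          # targets from a linear teacher

def train(lam):
    """Gradient descent on MSE; lam thresholds tiny weight updates,
    treating the summed |dw| as a proxy for metabolic cost."""
    w = np.zeros(10)
    energy = 0.0
    for _ in range(100):
        grad = X.T @ (X @ w - y) / len(y)
        dw = -0.05 * grad
        dw[np.abs(dw) < lam] = 0.0     # skip energetically wasteful updates
        w += dw
        energy += np.abs(dw).sum()     # energy spent ∝ total synaptic change
    loss = np.mean((X @ w - y) ** 2)
    return loss, energy

loss0, e0 = train(0.0)     # standard learner
loss1, e1 = train(1e-3)    # energy-aware learner
```

Near convergence the gradients shrink, so the thresholded learner stops paying for vanishingly small updates while still reaching a low loss, which is the qualitative trade-off the review discusses.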
Affiliation(s)
- Aaron Pache
- School of Mathematical Sciences, University of Nottingham, Nottingham NG7 2RD, United Kingdom
- Mark C W van Rossum
- School of Mathematical Sciences, University of Nottingham, Nottingham NG7 2RD, United Kingdom; School of Psychology, University of Nottingham, Nottingham, United Kingdom.
5
Wei H, Li F. The storage capacity of a directed graph and nodewise autonomous, ubiquitous learning. Front Comput Neurosci 2023; 17:1254355. [PMID: 37927548 PMCID: PMC10620732 DOI: 10.3389/fncom.2023.1254355]
Abstract
The brain, an exceedingly intricate information processing system, poses a constant challenge to memory research, particularly in comprehending how it encodes, stores, and retrieves information. Cognitive psychology studies memory mechanisms at the level of behavioral experiments and fMRI, while neurobiology studies them at the level of anatomy and electrophysiology. Current research findings are insufficient to provide a comprehensive, detailed explanation of memory processes within the brain. Numerous unknown details must be addressed to establish a complete information processing mechanism connecting micro molecular and cellular levels with macro cognitive and behavioral levels. Key issues include how memory content is characterized and distributed within biological neural networks, how information with varying content coexists, and how limited resources and storage capacity are shared. By contrast, the organization of a computer hard disk is well understood at every layer, from the polarity of magnetic particles at the bottom, through the division of tracks and sectors in the middle, to the directory tree and file management system at the top; our understanding of biological memory is nowhere near as complete. Here, biological neural networks are abstracted as directed graphs, and the encoding, storage, and retrieval of information within directed graphs at the cellular level are explored. A memory computational model based on active directed graphs and node-adaptive learning is proposed. First, based on neurobiological characteristics such as neuronal local perspectives, autonomous initiative, and limited-resource competition, a resource-based adaptive learning algorithm for directed graph nodes is designed. To minimize the resource consumption of memory content in directed graphs, two resource-occupancy optimization strategies, lateral inhibition and path pruning, are proposed.
Second, this paper introduces a novel memory mechanism grounded in graph theory, which considers connected subgraphs as the physical manifestation of memory content in directed graphs. The encoding, storage, consolidation, and retrieval operations of the brain's memory system correspond to forming subgraphs, accommodating multiple subgraphs, strengthening the connections and connectivity of subgraphs, and activating subgraphs, respectively. Lastly, a series of experiments was designed to simulate cognitive processes and evaluate the performance of the directed graph model. The results reveal that the proposed adaptive connectivity learning algorithm for directed graphs has four key features: (1) it is distributed, self-organizing, and self-adaptive, achieving global-level functions through local node interactions; (2) it enables incremental storage and supports continual learning; (3) it displays stable memory performance, surpassing the Hopfield network in memory accuracy, capacity, and diversity in experimental comparisons, and maintains high memory performance on large-scale datasets; (4) it exhibits a degree of generalization, as its macroscopic performance is unaffected by the topological structure of the directed graph. Large-scale, decentralized, node-autonomous directed graphs are thus a suitable simulation method. Examining storage problems within directed graphs can reveal the essence of the phenomena and uncover fundamental storage rules hidden within complex neuronal mechanisms such as synaptic plasticity, ion channels, neurotransmitters, and electrochemical activities.
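The subgraph-as-memory idea can be illustrated with a minimal toy: each stored pattern becomes a small connected subgraph, and retrieval activates whatever is reachable from a partial cue. The ring-shaped encoding and the plain reachability rule are assumptions for illustration, far simpler than the paper's resource-based adaptive algorithm.

```python
from collections import defaultdict

edges = defaultdict(set)          # node -> successors (the directed graph)

def store(pattern):
    """Encode a pattern as a ring subgraph over its active nodes."""
    nodes = sorted(pattern)
    for a, b in zip(nodes, nodes[1:] + nodes[:1]):
        edges[a].add(b)           # create/strengthen connection a -> b

def retrieve(cue):
    """Activate everything reachable from the cue along stored edges."""
    active, frontier = set(cue), list(cue)
    while frontier:
        for nxt in edges[frontier.pop()]:
            if nxt not in active:
                active.add(nxt)
                frontier.append(nxt)
    return active

store({1, 2, 3, 4})
store({10, 11, 12})
```

A partial cue then completes its memory (`retrieve({1})` yields the whole first pattern), and non-overlapping subgraphs coexist without interfering, which is the coexistence property the abstract emphasizes.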
Affiliation(s)
- Hui Wei
- Laboratory of Algorithms for Cognitive Models, School of Computer Science, Shanghai Key Laboratory of Data Science, Fudan University, Shanghai, China
6
Madar A, Dong C, Sheffield M. BTSP, not STDP, Drives Shifts in Hippocampal Representations During Familiarization. bioRxiv 2023:2023.10.17.562791. [PMID: 37904999 PMCID: PMC10614909 DOI: 10.1101/2023.10.17.562791]
Abstract
Synaptic plasticity is widely thought to support memory storage in the brain, but the rules determining impactful synaptic changes in vivo are not known. We considered the trial-by-trial shifting dynamics of hippocampal place fields (PFs) as an indicator of ongoing plasticity during memory formation. By implementing different plasticity rules in computational models of spiking place cells, and comparing to experimentally measured PFs from mice navigating familiar and novel environments, we found that behavioral-timescale synaptic plasticity (BTSP), rather than Hebbian spike-timing-dependent plasticity (STDP), is the principal mechanism governing PF shifting dynamics. BTSP-triggering events are rare, but more frequent during novel experiences. During exploration, their probability is dynamic: it decays after PF onset, but continually drives a population-level representational drift. Finally, our results show that BTSP occurs in CA3 but is less frequent and phenomenologically different than in CA1. Overall, our study provides a new framework to understand how synaptic plasticity shapes neuronal representations during learning.
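The defining feature of BTSP, that a rare plateau potential potentiates inputs active seconds earlier via a slowly decaying eligibility trace, can be sketched as follows. All constants (trace time constant, spike and plateau times) are illustrative, not fitted values from the study.

```python
import numpy as np

dt, tau_e = 0.01, 1.5                  # seconds; eligibility decays over seconds
t = np.arange(0.0, 10.0, dt)
spikes = np.zeros_like(t)
spikes[round(2.0 / dt)] = 1.0          # presynaptic spike at t = 2 s
plateau_t = 4.0                        # dendritic plateau potential at t = 4 s

elig, w = 0.0, 0.0
for i, ti in enumerate(t):
    elig += -dt * elig / tau_e + spikes[i]   # eligibility trace dynamics
    if abs(ti - plateau_t) < dt / 2:
        w += elig                             # plateau captures the trace

# The pre->plateau gap is 2 s, far beyond STDP's ~10 ms coincidence window,
# yet the synapse is still potentiated (w ≈ exp(-2 / tau_e)).
```

Under a classical STDP rule the same 2 s gap would produce essentially zero weight change, which is the behavioral-timescale contrast the study exploits to distinguish the two rules.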
Affiliation(s)
- A.D. Madar
- Department of Neurobiology, Neuroscience Institute, University of Chicago
- C. Dong
- Department of Neurobiology, Neuroscience Institute, University of Chicago
- current affiliation: Department of Neurobiology, Stanford University School of Medicine
- M.E.J. Sheffield
- Department of Neurobiology, Neuroscience Institute, University of Chicago
7
Fung CCA, Fukai T. Competition on presynaptic resources enhances the discrimination of interfering memories. PNAS Nexus 2023; 2:pgad161. [PMID: 37275260 PMCID: PMC10235910 DOI: 10.1093/pnasnexus/pgad161]
Abstract
Evidence suggests that hippocampal adult neurogenesis is critical for discriminating highly interfering memories. During adult neurogenesis, synaptic competition modifies the weights of synaptic connections nonlocally across neurons, thus providing a different form of unsupervised learning from Hebb's local plasticity rule. However, how synaptic competition achieves the separation of similar memories remains largely unknown. Here, we aim to link synaptic competition with such pattern separation. In synaptic competition, adult-born neurons are integrated into the existing neuronal pool by competing with mature neurons for synaptic connections from the entorhinal cortex. We show that synaptic competition and neuronal maturation play distinct roles in separating interfering memory patterns. Furthermore, we demonstrate that a feedforward neural network trained by a competition-based learning rule can outperform a multilayer perceptron trained by the backpropagation algorithm when only a small number of samples are available. Our results unveil the functional implications and potential applications of synaptic competition in neural computation.
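The nonlocal character of this competition can be seen in a toy rule where all synapses made by one presynaptic (entorhinal) input share a fixed resource budget, so strengthening a connection onto one postsynaptic neuron weakens that input's connections onto the others. This sketch, with its winner-take-all activation and assumed budget `R`, illustrates competition-based learning in general, not the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_out = 8, 4
W = rng.random((n_out, n_in))     # W[j, i]: synapse from input i to neuron j
R = 1.0                            # presynaptic resource per input (assumed)

def competitive_step(x, eta=0.1):
    """Hebbian growth followed by competition over presynaptic resources:
    all synapses made by input i share one budget, coupling weights
    nonlocally across postsynaptic neurons (unlike a purely local rule)."""
    global W
    j = int(np.argmax(W @ x))      # winner-take-all postsynaptic activation
    W[j] += eta * x                # strengthen the winner's active inputs
    W *= R / W.sum(axis=0)         # renormalize each input's outgoing weights

base = np.array([1, 1, 1, 1, 0, 0, 0, 0], float)
for _ in range(300):
    p = base.copy()
    p[4 + rng.integers(4)] = 1.0   # interfering variants of one pattern
    competitive_step(p)
```

Because the budget constraint acts column-wise (per presynaptic input) rather than row-wise (per postsynaptic neuron), a weight can change even when its own postsynaptic neuron was silent, which is exactly the nonlocal effect the abstract contrasts with Hebb's local rule.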
Affiliation(s)
- Tomoki Fukai
- To whom correspondence should be addressed: (C.C.A. Fung); (T. Fukai)
8
Hafner AS, Triesch J. Synaptic logistics: Competing over shared resources. Mol Cell Neurosci 2023; 125:103858. [PMID: 37172922 DOI: 10.1016/j.mcn.2023.103858]
Abstract
High turnover rates of synaptic proteins imply that synapses constantly need to replace their constituent building blocks. This requires sophisticated supply chains and potentially exposes synapses to shortages as they compete for limited resources. Interestingly, competition in neurons has been observed at different scales, whether between receptors for binding sites inside a single synapse or between synapses fighting for resources to grow. Here we review the implications of such competition for synaptic function and plasticity. We identify multiple mechanisms that synapses use to safeguard themselves against supply shortages and identify a fundamental neurologistic trade-off governing the sizes of the reserve pools of essential synaptic building blocks.
Affiliation(s)
- Anne-Sophie Hafner
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands.
- Jochen Triesch
- Frankfurt Institute for Advanced Studies, Frankfurt am Main, Germany; Goethe University, Frankfurt am Main, Germany
9
Wagle S, Kraynyukova N, Hafner AS, Tchumatchenko T. Computational insights into mRNA and protein dynamics underlying synaptic plasticity rules. Mol Cell Neurosci 2023; 125:103846. [PMID: 36963534 DOI: 10.1016/j.mcn.2023.103846]
Abstract
Recent advances in experimental techniques provide an unprecedented peek into the intricate molecular dynamics inside synapses and dendrites. The experimental insights into molecular turnover revealed that processes such as diffusion, active transport, spine uptake, and local protein synthesis can dynamically modulate the copy numbers of plasticity-related molecules in synapses. Subsequently, theoretical models were designed to better understand the interaction of these processes and to explain how local synaptic plasticity cues can up- or down-regulate molecular copy numbers across synapses. In this review, we discuss recent advances in experimental techniques and computational models to highlight how these complementary approaches can provide insight into molecular cross-talk across synapses, ultimately allowing us to develop biologically inspired neural network models to understand brain function.
Affiliation(s)
- Surbhit Wagle
- Institute for Physiological Chemistry, University Medical Center of the Johannes Gutenberg-University Mainz, Anselm-Franz-von-Bentzel-Weg 3, 55128 Mainz, Germany
- Nataliya Kraynyukova
- Institute of Experimental Epileptology and Cognition Research, University of Bonn, Venusberg-Campus 1, 53127 Bonn, Germany
- Anne-Sophie Hafner
- Donders Institute for Brain, Cognition and Behaviour, Nijmegen, Netherlands; Faculty of Science, Radboud University, Nijmegen, Netherlands
- Tatjana Tchumatchenko
- Institute for Physiological Chemistry, University Medical Center of the Johannes Gutenberg-University Mainz, Anselm-Franz-von-Bentzel-Weg 3, 55128 Mainz, Germany; Institute of Experimental Epileptology and Cognition Research, University of Bonn, Venusberg-Campus 1, 53127 Bonn, Germany.
10
Kasai H. Unraveling the mysteries of dendritic spine dynamics: Five key principles shaping memory and cognition. Proc Jpn Acad Ser B Phys Biol Sci 2023; 99:254-305. [PMID: 37821392 PMCID: PMC10749395 DOI: 10.2183/pjab.99.018]
Abstract
Recent research extends our understanding of brain processes beyond just action potentials and chemical transmissions within neural circuits, emphasizing the mechanical forces generated by excitatory synapses on dendritic spines to modulate presynaptic function. From in vivo and in vitro studies, we outline five central principles of synaptic mechanics in brain function:
- P1: Stability - underpinning the integral relationship between the structure and function of the spine synapses.
- P2: Extrinsic dynamics - highlighting synapse-selective structural plasticity, which plays a crucial role in Hebbian associative learning, distinct from pathway-selective long-term potentiation (LTP) and depression (LTD).
- P3: Neuromodulation - analyzing the role of G-protein-coupled receptors, particularly dopamine receptors, in time-sensitive modulation of associative learning frameworks such as Pavlovian classical conditioning and Thorndike's reinforcement learning (RL).
- P4: Instability - addressing the intrinsic dynamics crucial to memory management during continual learning, spotlighting their role in the "spine dysgenesis" associated with mental disorders.
- P5: Mechanics - exploring how synaptic mechanics influence both sides of synapses to establish structural traces of short- and long-term memory, thereby aiding the integration of mental functions.
We also delve into the historical background and foresee impending challenges.
Affiliation(s)
- Haruo Kasai
- International Research Center for Neurointelligence (WPI-IRCN), UTIAS, The University of Tokyo, Bunkyo-ku, Tokyo, Japan
- Laboratory of Structural Physiology, Center for Disease Biology and Integrative Medicine, Faculty of Medicine, The University of Tokyo, Bunkyo-ku, Tokyo, Japan
11
Meng JH, Riecke H. Structural spine plasticity: Learning and forgetting of odor-specific subnetworks in the olfactory bulb. PLoS Comput Biol 2022; 18:e1010338. [PMID: 36279303 PMCID: PMC9632792 DOI: 10.1371/journal.pcbi.1010338]
Abstract
Learning to discriminate between different sensory stimuli is essential for survival. In rodents, the olfactory bulb, which contributes to odor discrimination via pattern separation, exhibits extensive structural synaptic plasticity involving the formation and removal of synaptic spines, even in adult animals. The network connectivity resulting from this plasticity is still poorly understood. To gain insight into this connectivity we present here a computational model for the structural plasticity of the reciprocal synapses between the dominant population of excitatory principal neurons and inhibitory interneurons. It incorporates the observed modulation of spine stability by odor exposure. The model captures the striking experimental observation that the exposure to odors does not always enhance their discriminability: while training with similar odors enhanced their discriminability, training with dissimilar odors actually reduced the discriminability of the training stimuli. Strikingly, this differential learning does not require the activity-dependence of the spine stability and occurs also in a model with purely random spine dynamics in which the spine density is changed homogeneously, e.g., due to a global signal. However, the experimentally observed odor-specific reduction in the response of principal cells as a result of extended odor exposure and the concurrent disinhibition of a subset of principal cells arise only in the activity-dependent model. Moreover, this model predicts the experimentally testable recovery of odor response through weak but not through strong odor re-exposure and the forgetting of odors via exposure to interfering odors. Combined with the experimental observations, the computational model provides strong support for the prediction that odor exposure leads to the formation of odor-specific subnetworks in the olfactory bulb.

A key feature of the brain is its ability to learn through the plasticity of its network. The olfactory bulb in the olfactory system is a remarkable brain area whose anatomical structure still evolves substantially in adult animals by establishing new synaptic connections and removing existing ones. We present a computational model for this process and employ it to interpret recent experimental results. By comparing the results of our model with those of a random control model we identify various experimental observations that lend strong support to the notion that the network of the olfactory bulb comprises learned, odor-specific subnetworks. Moreover, our model explains the recent observation that the learning of odors does not always improve their discriminability and provides testable predictions for the recovery of odor response after repeated odor exposure and for when the learning of new odors interferes with retaining the memory of familiar odors.
Affiliation(s)
- John Hongyu Meng
- Engineering Sciences and Applied Mathematics, Northwestern University, Evanston, Illinois, United States of America
- Hermann Riecke
- Engineering Sciences and Applied Mathematics, Northwestern University, Evanston, Illinois, United States of America
12
Amano R, Nakao M, Matsumiya K, Miwakeichi F. A computational model to explore how temporal stimulation patterns affect synapse plasticity. PLoS One 2022; 17:e0275059. [PMID: 36149886 PMCID: PMC9506666 DOI: 10.1371/journal.pone.0275059]
Abstract
Plasticity-related proteins (PRPs), which are synthesized in a synapse activation-dependent manner, are shared by multiple synapses to a limited spatial extent for a specific period. In addition, stimulated synapses can utilize shared PRPs through synaptic tagging and capture (STC). In particular, the phenomenon by which short-lived early long-term potentiation is transformed into long-lived late long-term potentiation using shared PRPs is called “late-associativity,” which is the underlying principle of “cluster plasticity.” We hypothesized that the competitive capture of PRPs by multiple synapses modulates late-associativity and affects the fate of each synapse in terms of whether it is integrated into a synapse cluster. We tested our hypothesis by developing a computational model to simulate STC, late-associativity, and the competitive capture of PRPs. The experimental results obtained using the model revealed that the number of competing synapses, timing of stimulation to each synapse, and basal PRP level in the dendritic compartment altered the effective temporal window of STC and influenced the conditions under which late-associativity occurs. Furthermore, it is suggested that the competitive capture of PRPs results in the selection of synapses to be integrated into a synapse cluster via late-associativity.
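The competitive capture of PRPs described above can be sketched as a small dynamical system: stimulated synapses set decaying tags, stimulation also feeds a shared, decaying PRP pool, and tagged synapses split the pool in proportion to their tags. All constants and the proportional-sharing rule are assumptions for illustration, not the paper's model equations.

```python
import numpy as np

dt, tau_tag, tau_prp = 0.1, 30.0, 60.0   # minutes (assumed time constants)
tags = np.zeros(3)                        # three competing synapses
captured = np.zeros(3)                    # PRPs captured so far (consolidation)
prp = 0.0                                 # shared dendritic PRP pool

def simulate(stim_times, t_end=120.0):
    """Each synapse is strongly stimulated once, at stim_times[syn]."""
    global tags, captured, prp
    for t in np.arange(0.0, t_end, dt):
        for syn, t_stim in enumerate(stim_times):
            if abs(t - t_stim) < dt / 2:
                tags[syn] = 1.0           # stimulation sets the synaptic tag
                prp += 1.0                #   ...and triggers PRP synthesis
        tags -= dt * tags / tau_tag       # tags decay
        prp -= dt * prp / tau_prp         # the pool decays
        total = tags.sum()
        if total > 0:                     # competitive capture: tagged
            share = dt * 0.1 * prp * tags / total   # synapses split the pool
            captured += share
            prp -= share.sum()
    return captured

c = simulate([0.0, 5.0, 80.0])
```

The synapse stimulated late (at 80 min) finds a pool already depleted by its earlier competitors and captures less, so within this sketch it is the one least likely to cross a consolidation threshold and join the cluster, matching the abstract's effective-temporal-window argument.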
Affiliation(s)
- Ryota Amano
- Graduate School of Information Sciences, Tohoku University, Sendai, Japan
- Mitsuyuki Nakao
- Graduate School of Information Sciences, Tohoku University, Sendai, Japan
- Fumikazu Miwakeichi
- Graduate School of Information Sciences, Tohoku University, Sendai, Japan
- Department of Statistical Modeling, The Institute of Statistical Mathematics, Tachikawa-Shi, Japan
13
Habibey R, Rojo Arias JE, Striebel J, Busskamp V. Microfluidics for Neuronal Cell and Circuit Engineering. Chem Rev 2022; 122:14842-14880. [PMID: 36070858 PMCID: PMC9523714 DOI: 10.1021/acs.chemrev.2c00212]
Abstract
The widespread adoption of microfluidic devices among the neuroscience and neurobiology communities has enabled addressing a broad range of questions at the molecular, cellular, circuit, and system levels. Here, we review biomedical engineering approaches that harness the power of microfluidics for bottom-up generation of neuronal cell types and for the assembly and analysis of neural circuits. Microfluidics-based approaches are instrumental to generate the knowledge necessary for the derivation of diverse neuronal cell types from human pluripotent stem cells, as they enable the isolation and subsequent examination of individual neurons of interest. Moreover, microfluidic devices make it possible to engineer neural circuits with specific orientations and directionality by providing control over neuronal cell polarity and permitting the isolation of axons in individual microchannels. Similarly, the use of microfluidic chips enables the construction not only of 2D but also of 3D brain, retinal, and peripheral nervous system model circuits. Such brain-on-a-chip and organoid-on-a-chip technologies are promising platforms for studying these organs as they closely recapitulate some aspects of in vivo biological processes. Microfluidic 3D neuronal models, together with 2D in vitro systems, are widely used in many applications ranging from drug development and toxicology studies to neurological disease modeling and personalized medicine. Altogether, microfluidics provides researchers with powerful systems that complement and partially replace animal models.
Affiliation(s)
- Rouhollah Habibey
- Department of Ophthalmology, Universitäts-Augenklinik Bonn, University of Bonn, Ernst-Abbe-Straße 2, D-53127 Bonn, Germany
- Jesús Eduardo Rojo Arias
- Wellcome─MRC Cambridge Stem Cell Institute, Jeffrey Cheah Biomedical Centre, Cambridge Biomedical Campus, University of Cambridge, Cambridge CB2 0AW, United Kingdom
| | - Johannes Striebel
- Department of Ophthalmology, Universitäts-Augenklinik Bonn, University of Bonn, Ernst-Abbe-Straße 2, D-53127 Bonn, Germany
| | - Volker Busskamp
- Department of Ophthalmology, Universitäts-Augenklinik Bonn, University of Bonn, Ernst-Abbe-Straße 2, D-53127 Bonn, Germany
| |
Collapse
|
14
|
Miehl C, Onasch S, Festa D, Gjorgjieva J. Formation and computational implications of assemblies in neural circuits. J Physiol 2022. [PMID: 36068723 DOI: 10.1113/jp282750] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/20/2022] [Accepted: 08/22/2022] [Indexed: 11/08/2022] Open
Abstract
In the brain, patterns of neural activity represent sensory information and store it in non-random synaptic connectivity. A prominent theoretical hypothesis states that assemblies, groups of neurons that are strongly connected to each other, are the key computational units underlying perception and memory formation. Consistent with these hypothesised assemblies, experiments have revealed groups of neurons that display synchronous activity, either spontaneously or upon stimulus presentation, and that exhibit behavioural relevance. While it remains unclear how assemblies form in the brain, theoretical work has contributed substantially to the understanding of the various interacting mechanisms in this process. Here, we review the recent theoretical literature on assembly formation by categorising the mechanisms involved into four components: synaptic plasticity, symmetry breaking, competition and stability. We highlight different approaches and assumptions behind assembly formation and discuss recent ideas of assemblies as the key computational unit in the brain. Abstract figure legend: Assembly Formation. Assemblies are groups of strongly connected neurons formed by the interaction of multiple mechanisms and with vast computational implications. Four interacting components are thought to drive assembly formation: synaptic plasticity, symmetry breaking, competition and stability.
Collapse
Affiliation(s)
- Christoph Miehl
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany.,School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
| | - Sebastian Onasch
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany.,School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
| | - Dylan Festa
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany.,School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
| | - Julijana Gjorgjieva
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany.,School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
| |
Collapse
|
15
|
Schumm RD, Bressloff PC. Local accumulation times in a diffusion-trapping model of receptor dynamics at proximal axodendritic synapses. Phys Rev E 2022; 105:064407. [PMID: 35854532 DOI: 10.1103/physreve.105.064407] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/30/2022] [Accepted: 05/18/2022] [Indexed: 11/07/2022]
Abstract
The lateral diffusion and trapping of neurotransmitter receptors within the postsynaptic membrane of a neuron play a key role in determining synaptic strength and plasticity. Trapping is mediated by the reversible binding of receptors to scaffolding proteins (slots) within a synapse. In this paper we introduce a method for analyzing the transient dynamics of proximal axodendritic synapses in a diffusion-trapping model of receptor trafficking. Given a population of spatially distributed synapses, each of which has a fixed number of slots, we calculate the rate of relaxation to the steady-state distribution of bound slots (synaptic weights) in terms of a set of local accumulation times. Assuming that the rates of exocytosis and endocytosis are sufficiently slow, we show that the steady-state synaptic weights are independent of each other (purely local). On the other hand, the local accumulation time of a given synapse depends on the number of slots and the spatial location of all the synapses, indicating a form of transient heterosynaptic plasticity. This suggests that local accumulation time measurements could provide useful information regarding the distribution of synaptic weights within a dendrite.
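To make the diffusion-trapping picture concrete, here is a minimal zero-dimensional sketch of the slot-binding kinetics for a single synapse (rates, concentration, and slot counts are illustrative values, not taken from the paper). For these linear kinetics, the local accumulation time, defined as the integral of one minus the normalized bound fraction starting from an empty synapse, reduces to the inverse relaxation rate:

```python
def bound_steady_state(n_slots, conc, k_on, k_off):
    # Fixed point of dw/dt = conc*k_on*(n_slots - w) - k_off*w:
    # receptors at ambient concentration `conc` bind reversibly to slots.
    return n_slots * conc * k_on / (conc * k_on + k_off)

def accumulation_time(n_slots, conc, k_on, k_off, t_max=10.0, dt=1e-4):
    # Local accumulation time: integral of 1 - w(t)/w* with w(0) = 0.
    # For these linear kinetics it equals 1 / (conc*k_on + k_off).
    w, acc = 0.0, 0.0
    w_star = bound_steady_state(n_slots, conc, k_on, k_off)
    for _ in range(int(t_max / dt)):
        acc += (1.0 - w / w_star) * dt          # accumulate the deficit
        w += (conc * k_on * (n_slots - w) - k_off * w) * dt  # Euler step
    return acc
```

In the full model the ambient concentration is itself shaped by diffusion between synapses, which is how the accumulation time picks up its dependence on synapse locations; this sketch only shows the local trapping step.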
Collapse
Affiliation(s)
- Ryan D Schumm
- Department of Mathematics, University of Utah, 155 South 1400 East, Salt Lake City, Utah 84112, USA
| | - P C Bressloff
- Department of Mathematics, University of Utah, 155 South 1400 East, Salt Lake City, Utah 84112, USA
| |
Collapse
|
16
|
Sleep promotes the formation of dendritic filopodia and spines near learning-inactive existing spines. Proc Natl Acad Sci U S A 2021; 118:2114856118. [PMID: 34873044 DOI: 10.1073/pnas.2114856118] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/01/2021] [Indexed: 01/20/2023] Open
Abstract
Changes in synaptic connections are believed to underlie long-term memory storage. Previous studies have suggested that sleep is important for synapse formation after learning, but how sleep is involved in the process of synapse formation remains unclear. To address this question, we used transcranial two-photon microscopy to investigate the effect of postlearning sleep on the location of newly formed dendritic filopodia and spines of layer 5 pyramidal neurons in the primary motor cortex of adolescent mice. We found that newly formed filopodia and spines were partially clustered with existing spines along individual dendritic segments 24 h after motor training. Notably, posttraining sleep was critical for promoting the formation of dendritic filopodia and spines clustered with existing spines within 8 h. A fraction of these filopodia was converted into new spines and contributed to clustered spine formation 24 h after motor training. This sleep-dependent spine formation via filopodia was different from retraining-induced new spine formation, which emerged from dendritic shafts without prior presence of filopodia. Furthermore, sleep-dependent new filopodia and spines tended to be formed away from existing spines that were active at the time of motor training. Taken together, these findings reveal a role of postlearning sleep in regulating the number and location of new synapses via promoting filopodial formation.
Collapse
|
17
|
Kourosh-Arami M, Hosseini N, Komaki A. Brain is modulated by neuronal plasticity during postnatal development. J Physiol Sci 2021; 71:34. [PMID: 34789147 PMCID: PMC10716960 DOI: 10.1186/s12576-021-00819-9] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2021] [Accepted: 10/27/2021] [Indexed: 11/10/2022]
Abstract
Neuroplasticity refers to the ability of the nervous system to change its structure or function in response to prior stimuli. It is a plausible mechanism underlying a dynamic brain, operating through the adaptation of neural structure and activity patterns. Nevertheless, it is still unclear how plastic neural systems achieve and maintain their equilibrium, and how balanced brain dynamics are altered under different plasticity rules. Therefore, the present article primarily reviews recent research on homosynaptic and heterosynaptic neuroplasticity characterized by the manipulation of excitatory and inhibitory synaptic inputs. Moreover, it attempts to understand the mechanisms underlying the main forms of synaptic plasticity at excitatory and inhibitory synapses during brain development. The study surveyed articles published since 1988 and available through the PubMed, Google Scholar, and ScienceDirect databases using a keyword-based search. Overall, the results provide extensive, corroborative evidence for the main types of plasticity, including long-term potentiation (LTP) and long-term depression (LTD) of excitatory and inhibitory postsynaptic potentials (EPSPs and IPSPs).
Collapse
Affiliation(s)
- Masoumeh Kourosh-Arami
- Department of Neuroscience, School of Advanced Technologies in Medicine, Iran University of Medical Sciences, Tehran, Iran.
| | - Nasrin Hosseini
- Neuroscience Research Center, Iran University of Medical Sciences, Tehran, Iran.
| | - Alireza Komaki
- Neurophysiology Research Center, Hamadan University of Medical Sciences, Hamadan, Iran
| |
Collapse
|
18
|
Shen Y, Wang J, Navlakha S. A Correspondence Between Normalization Strategies in Artificial and Biological Neural Networks. Neural Comput 2021; 33:3179-3203. [PMID: 34474484 PMCID: PMC8662716 DOI: 10.1162/neco_a_01439] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/15/2021] [Accepted: 06/14/2021] [Indexed: 12/24/2022]
Abstract
A fundamental challenge at the interface of machine learning and neuroscience is to uncover computational principles that are shared between artificial and biological neural networks. In deep learning, normalization methods such as batch normalization, weight normalization, and their many variants help to stabilize hidden unit activity and accelerate network training, and these methods have been called one of the most important recent innovations for optimizing deep networks. In the brain, homeostatic plasticity represents a set of mechanisms that also stabilize and normalize network activity to lie within certain ranges, and these mechanisms are critical for maintaining normal brain function. In this article, we discuss parallels between artificial and biological normalization methods at four spatial scales: normalization of a single neuron's activity, normalization of synaptic weights of a neuron, normalization of a layer of neurons, and normalization of a network of neurons. We argue that both types of methods are functionally equivalent; that is, both push activation patterns of hidden units toward a homeostatic state, where all neurons are equally used, and we argue that such representations can improve coding capacity, discrimination, and regularization. As a proof of concept, we develop an algorithm, inspired by a neural normalization technique called synaptic scaling, and show that this algorithm performs competitively against existing normalization methods on several data sets. Overall, we hope this bidirectional connection will inspire neuroscientists and machine learning researchers in three ways: to uncover new normalization algorithms based on established neurobiological principles; to help quantify the trade-offs of different homeostatic plasticity mechanisms used in the brain; and to offer insights about how stability may not hinder, but may actually promote, plasticity.
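The synaptic-scaling idea behind this comparison can be sketched in a few lines. This is not the authors' algorithm; it is a generic toy in which each unit's incoming weights are scaled multiplicatively toward a target firing rate, with a linear rate model and illustrative parameters:

```python
import numpy as np

def synaptic_scaling_step(W, rates, target, eta=0.1):
    # Multiplicatively scale each unit's incoming weights (one row of W)
    # up when the unit fires below target and down when it fires above.
    factors = 1.0 + eta * (target - rates) / target
    return W * factors[:, None]

rng = np.random.default_rng(0)
W = rng.uniform(0.0, 1.0, (5, 10))   # 5 units, 10 inputs
x = np.ones(10)                      # constant drive

for _ in range(200):
    W = synaptic_scaling_step(W, W @ x, target=1.0)

r_final = W @ x                      # all rates pulled to the target
```

Because the scaling is multiplicative, the relative pattern of each unit's weights is preserved while its overall gain is homeostatically adjusted, which is the property the article highlights as shared with normalization layers in deep networks.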
Collapse
Affiliation(s)
- Yang Shen
- Cold Spring Harbor Laboratory, Simons Center for Quantitative Biology, Cold Spring Harbor, NY 11724, U.S.A.
| | - Julia Wang
- Cold Spring Harbor Laboratory, Simons Center for Quantitative Biology, Cold Spring Harbor, NY 11724, U.S.A.
| | - Saket Navlakha
- Cold Spring Harbor Laboratory, Simons Center for Quantitative Biology, Cold Spring Harbor, NY 11724, U.S.A.
| |
Collapse
|
19
|
Schmalz JT, Kumar G. A computational model of dopaminergic modulation of hippocampal Schaffer collateral-CA1 long-term plasticity. J Comput Neurosci 2021; 50:51-90. [PMID: 34431067 DOI: 10.1007/s10827-021-00793-6] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/19/2021] [Revised: 05/14/2021] [Accepted: 05/28/2021] [Indexed: 10/20/2022]
Abstract
Dopamine plays a critical role in modulating the long-term synaptic plasticity of the hippocampal Schaffer collateral-CA1 pyramidal neuron synapses (SC-CA1), a widely accepted cellular model of learning and memory. Limited results from hippocampal slice experiments over the last four decades have shown that the timing of the activation of dopamine D1/D5 receptors relative to a high/low-frequency stimulation (HFS/LFS) in SC-CA1 synapses regulates the modulation of HFS/LFS-induced long-term potentiation/depression (LTP/LTD) in these synapses. However, the existing literature lacks a complete picture of how various concentrations of D1/D5 agonists, and the relative timing between the activation of D1/D5 receptors and LTP/LTD induction by HFS/LFS, affect the spatiotemporal modulation of SC-CA1 synaptic dynamics. In this paper, we have developed a computational model, the first of its kind, to make quantitative predictions of the temporal dose-dependent modulation of HFS/LFS-induced LTP/LTD in SC-CA1 synapses by various D1/D5 agonists. Our model combines the biochemical effects with the electrical effects at the electrophysiological level. We have estimated the model parameters from published electrophysiological data, available from diverse hippocampal CA1 slice experiments, in a Bayesian framework. Our modeling results demonstrate the capability of our model to make quantitative predictions of the available experimental results under diverse HFS/LFS protocols. The predictions from our model show a strong nonlinear dependency of the LTP/LTD modulation by D1/D5 agonists on the relative timing between the activated D1/D5 receptors and the HFS/LFS protocol and on the applied concentration of D1/D5 agonists.
Collapse
|
20
|
Auth JM, Nachstedt T, Tetzlaff C. The Interplay of Synaptic Plasticity and Scaling Enables Self-Organized Formation and Allocation of Multiple Memory Representations. Front Neural Circuits 2020; 14:541728. [PMID: 33117130 PMCID: PMC7575689 DOI: 10.3389/fncir.2020.541728] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2020] [Accepted: 08/19/2020] [Indexed: 12/23/2022] Open
Abstract
It is commonly assumed that memories of experienced stimuli are represented by groups of highly interconnected neurons called cell assemblies. This requires allocating and storing information in the neural circuitry, which happens through synaptic weight adaptations at different types of synapses. In general, memory allocation is associated with synaptic changes at feed-forward synapses, while memory storage is linked with the adaptation of recurrent connections. It remains, however, largely unknown how memory allocation and storage can be achieved, and how the adaptation of the different synapses involved can be coordinated, to allow a faithful representation of multiple memories without disruptive interference between them. In this theoretical study, using network simulations and phase space analyses, we show that the interplay between long-term synaptic plasticity and homeostatic synaptic scaling simultaneously organizes the adaptations of feed-forward and recurrent synapses such that a new stimulus forms a new memory and different stimuli are assigned to distinct cell assemblies. The resulting dynamics can reproduce experimental in vivo data, focusing on how diverse factors, such as neuronal excitability and network connectivity, influence memory formation. Thus, the model presented here suggests that a few fundamental synaptic mechanisms may suffice to implement memory allocation and storage in neural circuitry.
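The core interplay, Hebbian growth that would run away on its own, tamed by a homeostatic constraint that forces stimuli to compete for synaptic resources, can be shown in miniature. In this toy (not the authors' network; patterns, rates, and the hard weight-norm constraint standing in for homeostatic scaling are all invented for illustration), a single unit with a small initial bias is allocated to one of two stimuli:

```python
import numpy as np

A = np.array([1.0, 1.0, 0.0, 0.0])   # stimulus driving inputs 0-1
B = np.array([0.0, 0.0, 1.0, 1.0])   # stimulus driving inputs 2-3
w = np.array([0.6, 0.6, 0.5, 0.5])   # slight initial bias toward A
w /= np.linalg.norm(w)               # fixed total synaptic resource

eta = 0.1
for _ in range(500):
    for x in (A, B):
        r = w @ x                    # linear rate response
        w += eta * r**2 * x          # supralinear Hebbian growth
        w /= np.linalg.norm(w)       # renormalize: stimuli compete

# The initially favored stimulus wins the competition: weights for A
# approach 1/sqrt(2) per input while weights for B collapse toward zero.
```

The positive feedback of Hebbian plasticity amplifies the initial bias (symmetry breaking), while the normalization caps total weight (stability) so that growth on one pattern necessarily suppresses the other (competition).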
Collapse
Affiliation(s)
- Johannes Maria Auth
- Department of Computational Neuroscience, Third Institute of Physics, Georg-August-Universität, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
| | - Timo Nachstedt
- Department of Computational Neuroscience, Third Institute of Physics, Georg-August-Universität, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
| | - Christian Tetzlaff
- Department of Computational Neuroscience, Third Institute of Physics, Georg-August-Universität, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
| |
Collapse
|
21
|
Ebner C, Clopath C, Jedlicka P, Cuntz H. Unifying Long-Term Plasticity Rules for Excitatory Synapses by Modeling Dendrites of Cortical Pyramidal Neurons. Cell Rep 2020; 29:4295-4307.e6. [PMID: 31875541 PMCID: PMC6941234 DOI: 10.1016/j.celrep.2019.11.068] [Citation(s) in RCA: 23] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/26/2018] [Revised: 05/02/2019] [Accepted: 11/15/2019] [Indexed: 11/30/2022] Open
Abstract
A large number of experiments have indicated that precise spike times, firing rates, and synapse locations crucially determine the dynamics of long-term plasticity induction in excitatory synapses. However, it remains unknown how plasticity mechanisms of synapses distributed along dendritic trees cooperate to produce the wide spectrum of outcomes for various plasticity protocols. Here, we propose a four-pathway plasticity framework that is well grounded in experimental evidence and apply it to a biophysically realistic cortical pyramidal neuron model. We show in computer simulations that several seemingly contradictory experimental landmark studies are consistent with one unifying set of mechanisms when considering the effects of signal propagation in dendritic trees with respect to synapse location. Our model identifies specific spatiotemporal contributions of dendritic and axo-somatic spikes as well as of subthreshold activation of synaptic clusters, providing a unified parsimonious explanation not only for rate and timing dependence but also for location dependence of synaptic changes.
Highlights:
- A phenomenological synaptic plasticity rule is applied to a pyramidal neuron model
- The model reproduces rate-, timing-, and location-dependent plasticity results
- Active dendrites allow plasticity via dendritic spikes and subthreshold events
- Cooperative plasticity exists across the dendritic tree and within single branches
Collapse
Affiliation(s)
- Christian Ebner
- Frankfurt Institute for Advanced Studies, 60438 Frankfurt am Main, Germany; Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, 60528 Frankfurt am Main, Germany; NeuroCure Cluster of Excellence, Charité-Universitätsmedizin Berlin, 10117 Berlin, Germany; Institute for Biology, Humboldt-Universität zu Berlin, 10117 Berlin, Germany.
| | - Claudia Clopath
- Computational Neuroscience Laboratory, Bioengineering Department, Imperial College London, London SW7 2AZ, UK
| | - Peter Jedlicka
- Frankfurt Institute for Advanced Studies, 60438 Frankfurt am Main, Germany; Institute of Clinical Neuroanatomy, Neuroscience Center, Goethe University Frankfurt, 60528 Frankfurt am Main, Germany; ICAR3R-Interdisciplinary Centre for 3Rs in Animal Research, Faculty of Medicine, Justus-Liebig-University, 35392 Giessen, Germany
| | - Hermann Cuntz
- Frankfurt Institute for Advanced Studies, 60438 Frankfurt am Main, Germany; Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, 60528 Frankfurt am Main, Germany
| |
Collapse
|
22
|
Krüppel S, Tetzlaff C. The self-organized learning of noisy environmental stimuli requires distinct phases of plasticity. Netw Neurosci 2020; 4:174-199. [PMID: 32166207 PMCID: PMC7055647 DOI: 10.1162/netn_a_00118] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/31/2019] [Accepted: 12/09/2019] [Indexed: 11/25/2022] Open
Abstract
Along sensory pathways, representations of environmental stimuli become increasingly sparse and expanded. If additionally the feed-forward synaptic weights are structured according to the inherent organization of stimuli, the increase in sparseness and expansion leads to a reduction of sensory noise. However, it is unknown how the synapses in the brain form the required structure, especially given the omnipresent noise of environmental stimuli. Here, we employ a combination of synaptic plasticity and intrinsic plasticity—adapting the excitability of each neuron individually—and present stimuli with an inherent organization to a feed-forward network. We observe that intrinsic plasticity maintains the sparseness of the neural code and thereby allows synaptic plasticity to learn the organization of stimuli in low-noise environments. Nevertheless, even high levels of noise can be handled after a subsequent phase of readaptation of the neuronal excitabilities by intrinsic plasticity. Interestingly, during this phase the synaptic structure has to be maintained. These results demonstrate that learning and recalling in the presence of noise requires the coordinated interplay between plasticity mechanisms adapting different properties of the neuronal circuit. Everyday life requires living beings to continuously recognize and categorize perceived stimuli from the environment. To master this task, the representations of these stimuli become increasingly sparse and expanded along the sensory pathways of the brain. In addition, the underlying neuronal network has to be structured according to the inherent organization of the environmental stimuli. However, how the neuronal network learns the required structure even in the presence of noise remains unknown. 
In this theoretical study, we show that the interplay between synaptic plasticity—controlling the synaptic efficacies—and intrinsic plasticity—adapting the neuronal excitabilities—enables the network to encode the organization of environmental stimuli. It thereby structures the network to correctly categorize stimuli even in the presence of noise. After having encoded the stimuli’s organization, consolidating the synaptic structure while keeping the neuronal excitabilities dynamic enables the neuronal system to readapt to arbitrary levels of noise resulting in a near-optimal classification performance for all noise levels. These results provide new insights into the interplay between different plasticity mechanisms and how this interplay enables sensory systems to reliably learn and categorize stimuli from the surrounding environment.
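The intrinsic-plasticity ingredient described above can be sketched on its own. In this toy (network size, input statistics, and learning rate are illustrative assumptions, not the authors' parameters), each neuron adapts its individual threshold until its average binary activity settles at a target sparseness:

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_inputs, target, eta = 20, 10, 0.1, 0.05
W = rng.uniform(0.0, 1.0, (n_neurons, n_inputs))
theta = np.zeros(n_neurons)           # per-neuron excitability threshold

acts = []
for step in range(5000):
    x = rng.uniform(0.0, 1.0, n_inputs)
    y = (W @ x > theta).astype(float)  # binary response of each neuron
    theta += eta * (y - target)        # fired too often -> raise threshold
    if step >= 3000:                   # record after a burn-in period
        acts.append(y.mean())

mean_activity = float(np.mean(acts))   # settles near the target sparseness
```

Because the threshold drifts up whenever a neuron fires and down otherwise, its firing probability is pulled to the target regardless of the synaptic weights, which is what lets a separate synaptic learning rule reshape `W` without destroying the sparseness of the code.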
Collapse
Affiliation(s)
- Steffen Krüppel
- Department of Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August-University, Göttingen, Germany
| | - Christian Tetzlaff
- Department of Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August-University, Göttingen, Germany
| |
Collapse
|
23
|
Activity Dependent and Independent Determinants of Synaptic Size Diversity. J Neurosci 2020; 40:2828-2848. [PMID: 32127494 DOI: 10.1523/jneurosci.2181-19.2020] [Citation(s) in RCA: 26] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2019] [Revised: 02/04/2020] [Accepted: 02/13/2020] [Indexed: 11/21/2022] Open
Abstract
The extraordinary diversity of excitatory synapse sizes is commonly attributed to activity-dependent processes that drive synaptic growth and diminution. Recent studies also point to activity-independent size fluctuations, possibly driven by innate synaptic molecule dynamics, as important generators of size diversity. To examine the contributions of activity-dependent and independent processes to excitatory synapse size diversity, we studied glutamatergic synapse size dynamics and diversification in cultured rat cortical neurons (both sexes), silenced from plating. We found that in networks with no history of activity whatsoever, synaptic size diversity was no less extensive than that observed in spontaneously active networks. Synapses in silenced networks were larger, size distributions were broader, yet these were rightward-skewed and similar in shape when scaled by mean synaptic size. Silencing reduced the magnitude of size fluctuations and weakened constraints on size distributions, yet these were sufficient to explain synaptic size diversity in silenced networks. Model-based exploration followed by experimental testing indicated that silencing-associated changes in innate molecular dynamics and fluctuation characteristics might negatively impact synaptic persistence, resulting in reduced synaptic numbers. This, in turn, would increase synaptic molecule availability, promote synaptic enlargement, and ultimately alter fluctuation characteristics. These findings suggest that activity-independent size fluctuations are sufficient to fully diversify glutamatergic synaptic sizes, with activity-dependent processes primarily setting the scale rather than the shape of size distributions. Moreover, they point to reciprocal relationships between synaptic size fluctuations, size distributions, and synaptic numbers mediated by the innate dynamics of synaptic molecules as they move in, out, and between synapses.
Significance Statement: Sizes of glutamatergic synapses vary tremendously, even when formed on the same neuron. This diversity is commonly thought to reflect the outcome of activity-dependent forms of synaptic plasticity, yet activity-independent processes might also play some part. Here we show that in neurons with no history of activity whatsoever, synaptic sizes are no less diverse. We show that this diversity is the product of activity-independent size fluctuations, which are sufficient to generate a full repertoire of synaptic sizes at correct proportions. By combining modeling and experimentation we expose reciprocal relationships between size fluctuations, synaptic sizes and synaptic counts, and show how these phenomena might be connected through the dynamics of synaptic molecules as they move in, out, and between synapses.
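A standard minimal model of such activity-independent fluctuations is a Kesten-style process: each synapse is repeatedly multiplied by a random factor and receives additive noise. The sketch below (all parameter choices are illustrative, not fitted to the study's data) reproduces the two qualitative signatures from the abstract: a rightward-skewed stationary size distribution whose shape, measured by the coefficient of variation, is unchanged when the noise scale, and hence the mean size, is rescaled:

```python
import numpy as np

def simulate_sizes(n_syn=5000, steps=500, noise_scale=0.2, seed=0):
    # Kesten-style update: multiplicative kick (shrinking on average)
    # plus additive positive noise, per synapse and per time step.
    rng = np.random.default_rng(seed)
    s = np.ones(n_syn)
    for _ in range(steps):
        mult = rng.uniform(0.6, 1.1, n_syn)        # random growth factor
        add = rng.exponential(noise_scale, n_syn)  # additive noise
        s = mult * s + add
    return s

sizes = simulate_sizes()
skew_gap = float(sizes.mean() - np.median(sizes))  # > 0: rightward skew

cv_small = float(sizes.std() / sizes.mean())
bigger = simulate_sizes(noise_scale=0.4, seed=1)   # doubled noise scale
cv_big = float(bigger.std() / bigger.mean())       # shape ~ unchanged
```

Because the update is linear in the sizes, rescaling the additive noise rescales the whole stationary distribution, changing the scale but not the shape, which matches the observation that silenced and active networks differ mainly in mean size.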
Collapse
|
24
|
Egbert MD, Gruenert G, Ibrahim B, Dittrich P. Combining evolution and self-organization to find natural Boolean representations in unconventional computational media. Biosystems 2019; 184:104011. [DOI: 10.1016/j.biosystems.2019.104011] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/03/2019] [Revised: 07/16/2019] [Accepted: 07/27/2019] [Indexed: 10/26/2022]
|
25
|
How mRNA Localization and Protein Synthesis Sites Influence Dendritic Protein Distribution and Dynamics. Neuron 2019; 103:1109-1122.e7. [PMID: 31350097 DOI: 10.1016/j.neuron.2019.06.022] [Citation(s) in RCA: 35] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/12/2018] [Revised: 02/21/2019] [Accepted: 06/22/2019] [Indexed: 01/23/2023]
Abstract
Proteins drive the function of neuronal synapses. Synapses are distributed throughout the dendritic arbor, often hundreds of micrometers away from the soma. It is still unclear how somatic and dendritic sources of proteins shape protein distribution and how each contributes to local protein changes during synaptic plasticity. Here, we present a unique computational framework describing, for a given protein species, the distribution of the mRNA and the corresponding protein along a dendrite. Using CaMKIIα as a test case, our model reveals the key role active transport plays in the maintenance of dendritic mRNA and protein levels and predicts the short and long timescales of protein dynamics. Our model reveals the fundamental role of mRNA localization and dendritic mRNA translation in synaptic maintenance and plasticity in distal compartments. We developed a web application for neuroscientists to explore the dynamics of the mRNA or protein of interest.
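One ingredient of such transport frameworks has a simple closed form: mRNA carried anterogradely at speed v and degraded at rate k settles into an exponential dendritic profile with length constant v/k. The sketch below (parameter values are illustrative, not the paper's CaMKIIα fits) recovers that length constant from an upwind steady-state discretization:

```python
import numpy as np

v, k = 1.0, 0.01      # transport speed (um/s) and degradation rate (1/s)
dx, n = 1.0, 400      # compartment size (um) and number of compartments

m = np.ones(n)        # m[0]: somatic source held at 1
for i in range(1, n):
    # upwind steady state of v * dm/dx = -k * m:
    # v*(m[i] - m[i-1])/dx = -k*m[i]  =>  m[i] = m[i-1] * v/(v + k*dx)
    m[i] = m[i-1] * v / (v + k * dx)

# distance at which the level falls to 1/e of the somatic value;
# the analytic length constant is v/k = 100 um for these parameters
length_constant = float(np.argmax(m < np.exp(-1.0)) * dx)
```

This is only the transport-degradation backbone; the full framework layers diffusion, translation, and protein dynamics on top of it.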
Collapse
|
26
|
Letellier M, Levet F, Thoumine O, Goda Y. Differential role of pre- and postsynaptic neurons in the activity-dependent control of synaptic strengths across dendrites. PLoS Biol 2019; 17:e2006223. [PMID: 31166943 PMCID: PMC6576792 DOI: 10.1371/journal.pbio.2006223] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/02/2018] [Revised: 06/17/2019] [Accepted: 05/17/2019] [Indexed: 01/07/2023] Open
Abstract
Neurons receive a large number of active synaptic inputs from their many presynaptic partners across their dendritic tree. However, little is known about how the strengths of individual synapses are controlled in balance with other synapses to effectively encode information while maintaining network homeostasis. This is in part due to the difficulty in assessing the activity of individual synapses with identified afferent and efferent connections for a synapse population in the brain. Here, to gain insights into the basic cellular rules that drive the activity-dependent spatial distribution of pre- and postsynaptic strengths across incoming axons and dendrites, we combine patch-clamp recordings with live-cell imaging of hippocampal pyramidal neurons in dissociated cultures and organotypic slices. Under basal conditions, both pre- and postsynaptic strengths cluster on single dendritic branches according to the identity of the presynaptic neurons, thus highlighting the ability of single dendritic branches to exhibit input specificity. Stimulating a single presynaptic neuron induces input-specific and dendritic branchwise spatial clustering of presynaptic strengths, which accompanies a widespread multiplicative scaling of postsynaptic strengths in dissociated cultures and heterosynaptic plasticity at distant synapses in organotypic slices. Our study provides evidence for a potential homeostatic mechanism by which the rapid changes in global or distant postsynaptic strengths compensate for input-specific presynaptic plasticity.
Collapse
Affiliation(s)
- Mathieu Letellier
- RIKEN Brain Science Institute, Wako, Saitama, Japan
- Interdisciplinary Institute for Neuroscience, University of Bordeaux, Bordeaux, France
- Interdisciplinary Institute for Neuroscience, Centre National de la Recherche Scientifique (CNRS) UMR 5297, Bordeaux, France
- * E-mail: (ML); (YG)
| | - Florian Levet
- Interdisciplinary Institute for Neuroscience, University of Bordeaux, Bordeaux, France
- Interdisciplinary Institute for Neuroscience, Centre National de la Recherche Scientifique (CNRS) UMR 5297, Bordeaux, France
- Bordeaux Imaging Center, University of Bordeaux, Bordeaux, France
- Bordeaux Imaging Center, CNRS UMS 3420, Bordeaux, France
- Bordeaux Imaging Center, INSERM US04, Bordeaux, France
| | - Olivier Thoumine
- Interdisciplinary Institute for Neuroscience, University of Bordeaux, Bordeaux, France
- Interdisciplinary Institute for Neuroscience, Centre National de la Recherche Scientifique (CNRS) UMR 5297, Bordeaux, France
| | - Yukiko Goda
- RIKEN Center for Brain Science, Wako, Saitama, Japan
- * E-mail: (ML); (YG)
| |
Collapse
|
27
|
Kim T, Tanaka-Yamamoto K. Postsynaptic Stability and Variability Described by a Stochastic Model of Endosomal Trafficking. Front Cell Neurosci 2019; 13:72. [PMID: 30863286 PMCID: PMC6399135 DOI: 10.3389/fncel.2019.00072] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/12/2018] [Accepted: 02/13/2019] [Indexed: 12/04/2022] Open
Abstract
Neurons undergo dynamic processes of constitutive AMPA-type glutamate receptor (AMPAR) trafficking, such as the insertion and internalization of AMPARs by exocytosis and endocytosis, while stably maintaining synaptic efficacy. Studies using advanced imaging techniques suggest that the frequency of these constitutive trafficking events, as well as the number of AMPARs involved in any particular event, fluctuates strongly. In addition, mechanisms that trigger some forms of synaptic plasticity include not only these processes but also additional fluctuating steps, such as the sorting of AMPARs to late endosomes (LEs). The regulation of postsynaptic AMPARs by the endosomal trafficking system thus appears to combine seemingly conflicting properties: stability and organized control of plasticity on the one hand, and highly fluctuating, stochastic processes on the other. How the endosomal trafficking system reconciles and exploits these conflicting properties remains unclear. Deterministic models have effectively described the stable maintenance of synaptic AMPAR numbers by constitutive recycling, as well as the involvement of endosomal trafficking in synaptic plasticity, but they do not take stochasticity into account. Here, we introduced stochasticity into the model of each key component of the endosomal trafficking system. Our improved model addresses two specific questions: whether stability can be achieved even with a combination of fluctuating processes, and how overall variability arises while long-term synaptic depression (LTD) is controlled. The new stochastic model indeed demonstrated stable regulation of postsynaptic AMPAR numbers at the basal state and during LTD maintenance, despite fast fluctuations in AMPAR numbers and high variability in the time course and magnitude of LTD. In addition, our analysis suggested that the high variability arising from this stochasticity is beneficial for reproducing the relatively constant timing of LE sorting for LTD. We therefore propose that the coexistence of stability and stochasticity in the endosomal trafficking system supports stable synaptic transmission and the reliable induction of synaptic plasticity with the variable properties that have been observed experimentally.
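The coexistence of stability and stochasticity described in this abstract can be sketched with a minimal Gillespie-style simulation of constitutive AMPAR recycling between a surface pool and a recycling-endosome pool. This is not the authors' model: the two-pool structure, the rate constants, and the pool sizes are illustrative assumptions, chosen only to show that a stochastically fluctuating exchange settles around a stable mean surface-receptor number.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(t_end, k_endo=0.1, k_exo=0.2, n_surface=100, n_endo=50):
    """Gillespie simulation of constitutive AMPAR recycling between a
    surface pool S and a recycling-endosome pool E (illustrative rates)."""
    t, S, E = 0.0, n_surface, n_endo
    trace = [(t, S)]
    while t < t_end:
        rates = np.array([k_endo * S, k_exo * E])  # endocytosis, exocytosis
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)          # waiting time to next event
        if rng.random() < rates[0] / total:
            S -= 1; E += 1                         # internalization: S -> E
        else:
            S += 1; E -= 1                         # insertion: E -> S
        trace.append((t, S))
    return trace

trace = simulate(200.0)
surface = np.array([s for _, s in trace])
# The surface pool fluctuates event by event, but its mean settles near the
# balance value (k_exo / (k_endo + k_exo)) * (S0 + E0) = 100 in this setup.
print(surface[len(surface) // 2:].mean())
```

Individual trajectories are noisy, yet the time-averaged surface count is tightly regulated, which is the qualitative point the abstract makes; modeling LTD would correspond to adding a further stochastic LE-sorting reaction that removes receptors from the endosomal pool.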
Affiliation(s)
- Taegon Kim
- Center for Functional Connectomics, Korea Institute of Science and Technology (KIST), Seoul, South Korea
- Keiko Tanaka-Yamamoto
- Center for Functional Connectomics, Korea Institute of Science and Technology (KIST), Seoul, South Korea
- Division of Bio-Medical Science & Technology, KIST School, Korea University of Science and Technology, Seoul, South Korea
|