1. Piette C, Gervasi N, Venance L. Synaptic plasticity through a naturalistic lens. Front Synaptic Neurosci 2023; 15:1250753. PMID: 38145207; PMCID: PMC10744866; DOI: 10.3389/fnsyn.2023.1250753.
Abstract
From the myriad of studies on neuronal plasticity, spanning its underlying molecular mechanisms up to its behavioral relevance, a very complex landscape has emerged. Recent efforts have moved toward more naturalistic investigations in an attempt to better capture the synaptic plasticity underpinnings of learning and memory, fostered by the development of in vivo electrophysiological and imaging tools. In this review, we examine these naturalistic investigations, devoting the first part to synaptic plasticity rules derived from naturalistic, in vivo-like activity patterns. We next give an overview of novel tools that enable increased spatio-temporal specificity for detecting and manipulating plasticity, from individual spines up to the neuronal circuit level, during behavior. Finally, we put particular emphasis on work considering brain-body communication loops and macroscale contributors to synaptic plasticity, such as internal bodily states and brain energy metabolism.
Affiliation(s)
- Charlotte Piette
- Center for Interdisciplinary Research in Biology (CIRB), College de France, CNRS, INSERM, Université PSL, Paris, France
- Laurent Venance
- Center for Interdisciplinary Research in Biology (CIRB), College de France, CNRS, INSERM, Université PSL, Paris, France
2. Madar A, Dong C, Sheffield M. BTSP, not STDP, drives shifts in hippocampal representations during familiarization. bioRxiv [Preprint] 2023:2023.10.17.562791. PMID: 37904999; PMCID: PMC10614909; DOI: 10.1101/2023.10.17.562791.
Abstract
Synaptic plasticity is widely thought to support memory storage in the brain, but the rules determining impactful synaptic changes in vivo are not known. We considered the trial-by-trial shifting dynamics of hippocampal place fields (PFs) as an indicator of ongoing plasticity during memory formation. By implementing different plasticity rules in computational models of spiking place cells and comparing them to experimentally measured PFs from mice navigating familiar and novel environments, we found that Behavioral-Timescale Synaptic Plasticity (BTSP), rather than Hebbian Spike-Timing-Dependent Plasticity, is the principal mechanism governing PF shifting dynamics. BTSP-triggering events are rare, but more frequent during novel experiences. During exploration, their probability is dynamic: it decays after PF onset, but continually drives a population-level representational drift. Finally, our results show that BTSP occurs in CA3 but is less frequent and phenomenologically different from that in CA1. Overall, our study provides a new framework to understand how synaptic plasticity shapes neuronal representations during learning.
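The two candidate rules compared in this entry differ most visibly in the width of their temporal windows. A minimal sketch (the time constants are illustrative orders of magnitude, not the paper's fitted parameters):

```python
import math

def stdp_kernel(dt, a_plus=1.0, a_minus=1.0, tau=0.02):
    """Pairwise Hebbian STDP: potentiation when pre precedes post
    (dt = t_post - t_pre > 0), depression otherwise; the window is a
    few tens of milliseconds."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

def btsp_kernel(dt, a=1.0, tau_pre=1.5, tau_post=0.5):
    """BTSP-style kernel: a rare dendritic plateau potentiates inputs active
    within seconds before or after it (dt = t_plateau - t_input), with an
    asymmetric, seconds-wide window."""
    if dt > 0:
        return a * math.exp(-dt / tau_pre)   # input preceded the plateau
    return a * math.exp(dt / tau_post)       # input followed the plateau

# At a one-second separation, typical of behavioral timescales, STDP yields
# essentially zero change while BTSP still potentiates strongly.
assert abs(stdp_kernel(1.0)) < 1e-9
assert btsp_kernel(1.0) > 0.5
assert stdp_kernel(-0.01) < 0 < stdp_kernel(0.01)
```

This width difference is exactly why only BTSP-like events can account for place-field shifts accumulating across whole traversals of an environment.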
Affiliation(s)
- A.D. Madar
- Department of Neurobiology, Neuroscience Institute, University of Chicago
- C. Dong
- Department of Neurobiology, Neuroscience Institute, University of Chicago
- current affiliation: Department of Neurobiology, Stanford University School of Medicine
- M.E.J. Sheffield
- Department of Neurobiology, Neuroscience Institute, University of Chicago
3. O'Donnell C. Nonlinear slow-timescale mechanisms in synaptic plasticity. Curr Opin Neurobiol 2023; 82:102778. PMID: 37657186; DOI: 10.1016/j.conb.2023.102778.
Abstract
Learning and memory rely on synapses changing their strengths in response to neural activity. However, there is a substantial gap between the timescales of neural electrical dynamics (1-100 ms) and organism behaviour during learning (seconds-minutes). What mechanisms bridge this timescale gap? What are the implications for theories of brain learning? Here I first cover experimental evidence for slow-timescale factors in plasticity induction. Then I review possible underlying cellular and synaptic mechanisms, and insights from recent computational models that incorporate such slow-timescale variables. I conclude that future progress in understanding brain learning across timescales will require both experimental and computational modelling studies that map out the nonlinearities implemented by both fast and slow plasticity mechanisms at synapses, and crucially, their joint interactions.
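One concrete bridge across the timescale gap discussed here is a synaptic eligibility trace: a fast pre/post coincidence tags the synapse, and a slower factor arriving seconds later converts the remaining tag into a weight change. A minimal sketch with an illustrative seconds-long decay:

```python
import math

def consolidated_change(t_tag, t_factor, tau_e=2.0):
    """A fast coincidence at t_tag sets an eligibility trace that decays with
    a seconds-long time constant tau_e; a slow factor (e.g. a neuromodulator
    pulse) at t_factor converts whatever trace remains into an actual weight
    change. Times in seconds; values illustrative."""
    delay = t_factor - t_tag
    if delay < 0:
        return 0.0                      # factor arrived before the tag
    return math.exp(-delay / tau_e)     # remaining trace at t_factor

assert consolidated_change(0.0, 0.0) == 1.0
assert consolidated_change(0.0, 2.0) == math.exp(-1.0)   # still large at 2 s
assert consolidated_change(0.0, 30.0) < 1e-6             # gone after tens of s
assert consolidated_change(5.0, 1.0) == 0.0
```

The nonlinearity the review emphasizes enters when such traces interact multiplicatively with other slow variables, rather than summing linearly.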
Affiliation(s)
- Cian O'Donnell
- School of Computing, Engineering, and Intelligent Systems, Magee Campus, Ulster University, Derry/Londonderry, UK; School of Computer Science, Electrical and Electronic Engineering, and Engineering Maths, University of Bristol, Bristol, UK.
4. Huang CH, Lin CCK. New biophysical rate-based modeling of long-term plasticity in mean-field neuronal population models. Comput Biol Med 2023; 163:107213. PMID: 37413849; DOI: 10.1016/j.compbiomed.2023.107213.
Abstract
The formation of customized neural networks as the basis of brain functions such as receptive field selectivity, learning or memory depends heavily on the long-term plasticity of synaptic connections. However, the current mean-field population models commonly used to simulate large-scale neural network dynamics lack explicit links to the underlying cellular mechanisms of long-term plasticity. In this study, we developed a new mean-field population model, the plastic density-based neural mass model (pdNMM), by incorporating a newly developed rate-based plasticity model based on the calcium control hypothesis into an existing density-based neural mass model. Derivation of the plasticity model was carried out using population density methods. Our results showed that the synaptic plasticity represented by the resulting rate-based plasticity model exhibited Bienenstock-Cooper-Munro-like learning rules. Furthermore, we demonstrated that the pdNMM accurately reproduced previous experimental observations of long-term plasticity, including characteristics of Hebbian plasticity such as longevity, associativity and input specificity, on hippocampal slices, and the formation of receptive field selectivity in the visual cortex. In conclusion, the pdNMM is a novel approach that can confer long-term plasticity to conventional mean-field neuronal population models.
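The Bienenstock-Cooper-Munro (BCM)-like behavior mentioned in the abstract can be sketched as a rate rule with a sliding modification threshold; the parameters below are illustrative, not the pdNMM's:

```python
def bcm_step(w, x, y, theta, dt=0.01, eta=0.1, tau_theta=1.0):
    """One Euler step of a BCM-like rate rule: dw/dt = eta * x * y * (y - theta),
    with a sliding modification threshold theta relaxing toward y**2.
    x and y are pre- and postsynaptic rates; all values illustrative."""
    w_new = w + dt * eta * x * y * (y - theta)
    theta_new = theta + dt * (y ** 2 - theta) / tau_theta
    return w_new, theta_new

# Postsynaptic activity above the threshold potentiates; below, it depresses.
w_up, _ = bcm_step(w=1.0, x=1.0, y=2.0, theta=1.0)
w_dn, _ = bcm_step(w=1.0, x=1.0, y=0.5, theta=1.0)
assert w_up > 1.0 > w_dn

# The threshold chases y**2, which is what stabilizes runaway potentiation.
_, theta_new = bcm_step(w=1.0, x=1.0, y=2.0, theta=1.0)
assert theta_new > 1.0
```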
Affiliation(s)
- Chih-Hsu Huang
- Department of Neurology, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Chou-Ching K Lin
- Department of Neurology, College of Medicine, National Cheng Kung University, Tainan, Taiwan; Department of Neurology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan; Innovation Center of Medical Devices and Technology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan; Medical Device Innovation Center, National Cheng Kung University, Tainan, Taiwan.
5. Rodrigues YE, Tigaret CM, Marie H, O'Donnell C, Veltz R. A stochastic model of hippocampal synaptic plasticity with geometrical readout of enzyme dynamics. eLife 2023; 12:e80152. PMID: 37589251; PMCID: PMC10435238; DOI: 10.7554/eLife.80152.
Abstract
Discovering the rules of synaptic plasticity is an important step for understanding brain learning. Existing plasticity models are either (1) top-down and interpretable, but not flexible enough to account for experimental data, or (2) bottom-up and biologically realistic, but too intricate to interpret and hard to fit to data. To avoid the shortcomings of these approaches, we present a new plasticity rule based on a geometrical readout mechanism that flexibly maps synaptic enzyme dynamics to predict plasticity outcomes. We apply this readout to a multi-timescale model of hippocampal synaptic plasticity induction that includes electrical dynamics, calcium, CaMKII and calcineurin, and accurate representation of intrinsic noise sources. Using a single set of model parameters, we demonstrate the robustness of this plasticity rule by reproducing nine published ex vivo experiments covering various spike-timing and frequency-dependent plasticity induction protocols, animal ages, and experimental conditions. Our model also predicts that in vivo-like spike timing irregularity strongly shapes plasticity outcome. This geometrical readout modelling approach can be readily applied to other excitatory or inhibitory synapses to discover their synaptic plasticity rules.
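The geometrical readout can be caricatured as a dwell-time classifier in the joint enzyme plane: the outcome depends on how long the (CaMKII, calcineurin) trajectory spends in potentiation- versus depression-associated regions. The regions and thresholds below are invented for illustration and are far simpler than the paper's fitted geometry:

```python
def geometric_readout(trajectory, region_time=3):
    """Count time steps the joint (CaMKII, calcineurin) activity spends in
    illustrative LTP / LTD regions of the enzyme plane, and let the dominant
    dwell time pick the plasticity outcome."""
    ltp_t = sum(1 for camkii, can in trajectory if camkii > 2.0 * can)
    ltd_t = sum(1 for camkii, can in trajectory if can > 2.0 * camkii)
    if ltp_t >= region_time and ltp_t > ltd_t:
        return "LTP"
    if ltd_t >= region_time and ltd_t > ltp_t:
        return "LTD"
    return "no change"

high_freq = [(1.0, 0.2)] * 5                      # sustained CaMKII dominance
low_freq = [(0.2, 1.0)] * 5                       # sustained CaN dominance
brief = [(1.0, 0.2)] * 2 + [(0.1, 0.1)] * 3       # too short to commit
assert geometric_readout(high_freq) == "LTP"
assert geometric_readout(low_freq) == "LTD"
assert geometric_readout(brief) == "no change"
```

Requiring a minimum dwell time is one simple way such a readout becomes sensitive to spike-timing irregularity, as the abstract's in vivo prediction suggests.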
Affiliation(s)
- Yuri Elias Rodrigues
- Université Côte d’Azur, Nice, France
- Institut de Pharmacologie Moléculaire et Cellulaire (IPMC), CNRS, Valbonne, France
- Inria Center of University Côte d’Azur (Inria), Sophia Antipolis, France
- Cezar M Tigaret
- Neuroscience and Mental Health Research Innovation Institute, Division of Psychological Medicine and Clinical Neurosciences, School of Medicine, Cardiff University, Cardiff, United Kingdom
- Hélène Marie
- Université Côte d’Azur, Nice, France
- Institut de Pharmacologie Moléculaire et Cellulaire (IPMC), CNRS, Valbonne, France
- Cian O'Donnell
- School of Computing, Engineering, and Intelligent Systems, Magee Campus, Ulster University, Londonderry, United Kingdom
- School of Computer Science, Electrical and Electronic Engineering, and Engineering Mathematics, University of Bristol, Bristol, United Kingdom
- Romain Veltz
- Inria Center of University Côte d’Azur (Inria), Sophia Antipolis, France
6. Shervani-Tabar N, Rosenbaum R. Meta-learning biologically plausible plasticity rules with random feedback pathways. Nat Commun 2023; 14:1805. PMID: 37002222; PMCID: PMC10066328; DOI: 10.1038/s41467-023-37562-1.
Abstract
Backpropagation is widely used to train artificial neural networks, but its relationship to synaptic plasticity in the brain is unknown. Some biological models of backpropagation rely on feedback projections that are symmetric with feedforward connections, but experiments do not corroborate the existence of such symmetric backward connectivity. Random feedback alignment offers an alternative model in which errors are propagated backward through fixed, random backward connections. This approach successfully trains shallow models, but learns slowly and does not perform well with deeper models or online learning. In this study, we develop a meta-learning approach to discover interpretable, biologically plausible plasticity rules that improve online learning performance with fixed random feedback connections. The resulting plasticity rules show improved online training of deep models in the low data regime. Our results highlight the potential of meta-learning to discover effective, interpretable learning rules satisfying biological constraints.
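Random feedback alignment, the starting point of this work, replaces the transposed forward weights of backpropagation with fixed random feedback. A scalar two-layer sketch (the task, learning rate, and seed are all illustrative):

```python
import random

random.seed(0)

def train_fa(steps=4000, lr=0.05):
    """Two-layer linear net trained with feedback alignment: the hidden-layer
    error is carried by a fixed random feedback weight b instead of the
    transpose of the forward weight w2. Task: learn to scale the input by 3."""
    w1, w2 = random.uniform(-1, 1), random.uniform(-1, 1)
    b = random.uniform(0.1, 1.0)       # fixed random feedback pathway
    for _ in range(steps):
        x = random.uniform(-1, 1)
        target = 3.0 * x
        h = w1 * x                     # hidden activation
        err = w2 * h - target          # output error
        w2 -= lr * err * h
        w1 -= lr * (b * err) * x       # backprop would use w2 here; FA uses b
    return w1, w2

w1, w2 = train_fa()
assert abs(w1 * w2 - 3.0) < 0.1        # composed map approximates the task
```

Even with an arbitrary fixed b, the forward weights come to align with the feedback, which is the behavior the paper's meta-learned rules are designed to accelerate in deeper networks.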
Affiliation(s)
- Navid Shervani-Tabar
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, 46556, USA.
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, 46556, USA
7. Chen S, Yang Q, Lim S. Efficient inference of synaptic plasticity rule with Gaussian process regression. iScience 2023; 26:106182. PMID: 36879810; PMCID: PMC9985048; DOI: 10.1016/j.isci.2023.106182.
Abstract
Finding the form of synaptic plasticity is critical to understanding its functions underlying learning and memory. We investigated an efficient method to infer synaptic plasticity rules in various experimental settings. We considered biologically plausible models fitting a wide range of in vitro studies and examined the recovery of their firing-rate dependence from sparse and noisy data. Among methods that assume low-rankness or smoothness of plasticity rules, Gaussian process regression (GPR), a nonparametric Bayesian approach, performs best. GPR performs well both when changes in synaptic weights are measured directly and when changes in neural activity serve as indirect observables of synaptic plasticity, two settings that pose different inference problems. GPR can also recover multiple plasticity rules simultaneously and performs robustly across plasticity rules and noise levels. Such flexibility and efficiency, particularly in the low-sampling regime, make GPR well suited to recent experimental developments and to inferring a broader class of plasticity models.
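A toy version of the inference problem: recover a firing-rate-dependent plasticity rule from sparse measurements with GP regression. The rule, sampling grid, and kernel length scale below are invented for illustration, and the GP posterior mean is implemented from scratch to stay dependency-free:

```python
import math

def rbf(a, b, ell=0.5):
    return math.exp(-(a - b) ** 2 / (2.0 * ell ** 2))

def solve(A, y):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gpr_predict(xs, ys, x_star, noise=1e-4):
    """Posterior mean of a zero-mean GP with an RBF kernel, observed at
    (xs, ys) with a small noise term on the diagonal."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, ys)
    return sum(rbf(x_star, xs[i]) * alpha[i] for i in range(n))

# Toy "plasticity rule": weight change vs. presynaptic rate, with a BCM-like
# shape (depression at low rates, potentiation at high rates).
rule = lambda r: r * (r - 1.0)
rates = [0.25 * i for i in range(9)]    # sparse sampling of the rate axis
dw = [rule(r) for r in rates]           # noiseless toy measurements

# The GP posterior mean recovers the rule at a held-out rate.
assert abs(gpr_predict(rates, dw, 1.1) - rule(1.1)) < 0.05
```

The smoothness assumption enters entirely through the kernel, which is what lets the method interpolate between the few rates an experiment can probe.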
Affiliation(s)
- Shirui Chen
- Department of Applied Mathematics, University of Washington, Lewis Hall 201, Box 353925, Seattle, WA 98195-3925, USA
- Neural Science, New York University Shanghai, 1555 Century Avenue, Shanghai, 200122, China
- Qixin Yang
- The Edmond and Lily Safra Center for Brain Sciences, The Hebrew University, The Suzanne and Charles Goodman Brain Sciences Building, Edmond J. Safra Campus, Jerusalem, 9190401, Israel
- Neural Science, New York University Shanghai, 1555 Century Avenue, Shanghai, 200122, China
- Sukbin Lim
- Neural Science, New York University Shanghai, 1555 Century Avenue, Shanghai, 200122, China
- NYU-ECNU Institute of Brain and Cognitive Science at NYU Shanghai, 3663 Zhongshan Road North, Shanghai, 200062, China
8. Goldt S, Krzakala F, Zdeborová L, Brunel N. Bayesian reconstruction of memories stored in neural networks from their connectivity. PLoS Comput Biol 2023; 19:e1010813. PMID: 36716332; PMCID: PMC9910750; DOI: 10.1371/journal.pcbi.1010813.
Abstract
The advent of comprehensive synaptic wiring diagrams of large neural circuits has created the field of connectomics and given rise to a number of open research questions. One such question is whether it is possible to reconstruct the information stored in a recurrent network of neurons, given its synaptic connectivity matrix. Here, we address this question by determining when solving such an inference problem is theoretically possible in specific attractor network models and by providing a practical algorithm to do so. The algorithm builds on ideas from statistical physics to perform approximate Bayesian inference and is amenable to exact analysis. We study its performance on three different models, compare the algorithm to standard algorithms such as PCA, and explore the limitations of reconstructing stored patterns from synaptic connectivity.
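A toy version of the reconstruction problem: store one binary pattern in Hopfield-style connectivity and recover it, up to a global sign, by power iteration on the weight matrix. This corresponds to the simple PCA-style baseline the paper compares against, not its Bayesian algorithm:

```python
import random

random.seed(2)

n = 30
xi = [random.choice([-1, 1]) for _ in range(n)]                 # stored pattern
W = [[xi[i] * xi[j] / n for j in range(n)] for i in range(n)]   # Hebbian weights

# Power iteration: for this rank-1 connectivity, the top eigenvector IS the
# stored pattern, so repeated multiplication recovers it up to sign.
v = [random.random() for _ in range(n)]
for _ in range(50):
    v = [sum(W[i][j] * v[j] for j in range(n)) for i in range(n)]
    norm = max(abs(x) for x in v)
    v = [x / norm for x in v]

recovered = [1 if x > 0 else -1 for x in v]
match = sum(r == x for r, x in zip(recovered, xi))
assert match == n or match == 0       # exact recovery, up to global sign
```

With several stored patterns the weight matrix is no longer rank-1 and the eigenvectors mix the patterns, which is where the paper's approximate Bayesian inference outperforms this spectral baseline.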
Affiliation(s)
- Sebastian Goldt
- International School of Advanced Studies (SISSA), Trieste, Italy
- Florent Krzakala
- IdePHICS laboratory, Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland
- Lenka Zdeborová
- SPOC laboratory, Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland
- Nicolas Brunel
- Department of Neurobiology, Duke University, Durham, North Carolina, United States of America
- Department of Physics, Duke University, Durham, North Carolina, United States of America
9. Dorman DB, Blackwell KT. Synaptic plasticity is predicted by spatiotemporal firing rate patterns and robust to in vivo-like variability. Biomolecules 2022; 12:1402. PMID: 36291612; PMCID: PMC9599115; DOI: 10.3390/biom12101402.
Abstract
Synaptic plasticity, the experience-induced change in connections between neurons, underlies learning and memory in the brain. Most of our understanding of synaptic plasticity derives from in vitro experiments with precisely repeated stimulus patterns; however, neurons exhibit significant variability in vivo during repeated experiences. Further, the spatial pattern of synaptic inputs to the dendritic tree influences synaptic plasticity, yet is not considered in most synaptic plasticity rules. Here, we investigate how spatiotemporal synaptic input patterns produce plasticity under in vivo-like conditions, using a data-driven computational model with a plasticity rule based on calcium dynamics. Using in vivo spike train recordings as inputs to clusters of spines of different sizes, we show that plasticity is highly robust to trial-to-trial variability of spike timing. In addition, we derive general synaptic plasticity rules describing how spatiotemporal patterns of synaptic input control the magnitude and direction of plasticity. Synapses that potentiate strongly have greater firing rates and calcium concentrations later in the trial, whereas strongly depressing synapses have higher firing rates early in the trial. Neighboring synaptic activity influences the direction and magnitude of plasticity, with small clusters of spines producing the greatest increase in synaptic strength. Together, our results reveal that calcium dynamics can unify diverse plasticity rules and show how spatiotemporal firing rate patterns control synaptic plasticity.
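The calcium-based rule and the cluster-size effect can be caricatured with a two-threshold calcium control scheme; all thresholds and per-input calcium values below are invented for illustration:

```python
def calcium_rule(ca, theta_d=0.35, theta_p=0.55):
    """Calcium control hypothesis: low spine calcium gives no change,
    intermediate calcium gives depression, high calcium gives potentiation.
    Returns the sign of the weight change; thresholds illustrative."""
    if ca >= theta_p:
        return +1   # LTP
    if ca >= theta_d:
        return -1   # LTD
    return 0        # no change

def cluster_plasticity(n_coactive, ca_per_input=0.15):
    """Crude stand-in for spatial clustering: co-active neighboring spines
    sum their calcium contributions before thresholding."""
    return calcium_rule(n_coactive * ca_per_input)

assert cluster_plasticity(1) == 0     # isolated input stays subthreshold
assert cluster_plasticity(3) == -1    # moderate shared drive depresses
assert cluster_plasticity(4) == +1    # clustered drive potentiates
```

Even this caricature reproduces the qualitative finding that the same input can depress or potentiate depending on how many neighbors are co-active.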
Affiliation(s)
- Daniel B. Dorman
- Interdisciplinary Program in Neuroscience, George Mason University, Fairfax, VA 22030, USA
- Kim T. Blackwell
- Interdisciplinary Program in Neuroscience, George Mason University, Fairfax, VA 22030, USA
- Department of Bioengineering, Volgenau School of Engineering, George Mason University, Fairfax, VA 22030, USA
10. Wang B, Aljadeff J. Multiplicative shot-noise: a new route to stability of plastic networks. Phys Rev Lett 2022; 129:068101. PMID: 36018633; DOI: 10.1103/PhysRevLett.129.068101.
Abstract
Fluctuations of synaptic weights, among many other physical, biological, and ecological quantities, are driven by coincident events of two "parent" processes. We propose a multiplicative shot-noise model that can capture the behaviors of a broad range of such natural phenomena, and analytically derive an approximation that accurately predicts its statistics. We apply our results to study the effects of a multiplicative synaptic plasticity rule that was recently extracted from measurements in physiological conditions. Using mean-field theory analysis and network simulations, we investigate how this rule shapes the connectivity and dynamics of recurrent spiking neural networks. The multiplicative plasticity rule is shown to support efficient learning of input stimuli, and it gives a stable, unimodal synaptic-weight distribution with a large fraction of strong synapses. The strong synapses remain stable over long times but do not "run away." Our results suggest that the multiplicative shot-noise offers a new route to understand the tradeoff between flexibility and stability in neural circuits and other dynamic networks.
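A minimal simulation of the "coincident events of two parent processes" picture: pre- and postsynaptic Poisson trains whose coincidences drive multiplicative potentiation, with multiplicative depression on lone presynaptic events. The rates and learning rates are invented; the point is that multiplicative updates keep the weight strictly positive without needing a hard bound:

```python
import random

random.seed(1)

def simulate_weight(steps=10000, dt=0.001, r_pre=20.0, r_post=20.0,
                    eta_p=0.02, eta_d=0.004, w0=1.0):
    """Coincident events of the two parent Poisson processes (pre and post
    spikes in the same dt bin) potentiate the weight multiplicatively; lone
    pre events depress it multiplicatively."""
    w = w0
    for _ in range(steps):
        pre = random.random() < r_pre * dt
        post = random.random() < r_post * dt
        if pre and post:          # coincidence of the two parent processes
            w *= 1.0 + eta_p
        elif pre:
            w *= 1.0 - eta_d
    return w

ws = [simulate_weight() for _ in range(20)]
assert all(w > 0.0 for w in ws)   # multiplicative updates cannot cross zero
assert all(w < 2.0 for w in ws)   # and no runaway over 10 s of activity
```

Because each update scales the current weight, the log-weight performs a random walk with bounded increments, which is the intuition behind the stable, unimodal weight distribution the paper derives.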
Affiliation(s)
- Bin Wang
- Department of Physics, University of California San Diego, La Jolla, California 92093, USA
- Johnatan Aljadeff
- Department of Neurobiology, University of California San Diego, La Jolla, California 92093, USA
11. Wert-Carvajal C, Reneaux M, Tchumatchenko T, Clopath C. Dopamine and serotonin interplay for valence-based spatial learning. Cell Rep 2022; 39:110645. PMID: 35417691; DOI: 10.1016/j.celrep.2022.110645.
Abstract
Dopamine (DA) and serotonin (5-HT) are important neuromodulators of synaptic plasticity that have been linked to learning from positive or negative outcomes, i.e., valence-based learning. In the hippocampus, both affect long-term plasticity but play different roles in encoding uncertainty or predicted reward. DA has been related to positive valence, from reward consumption or avoidance behavior, and 5-HT to aversive encoding. We propose that DA produces overall LTP while 5-HT elicits LTD. Here, we compare two reward-modulated spike-timing-dependent plasticity (R-STDP) rules to describe the action of these neuromodulators. We examined their role in cognitive performance and flexibility in computational models of the Morris water maze task and reversal learning. Our results show that the interplay of DA and 5-HT improves learning performance and can explain experimental evidence. This study reinforces the importance of neuromodulation in determining the direction of plasticity.
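An R-STDP update can be sketched as a timing-based Hebbian eligibility gated by a neuromodulatory factor; following the paper's proposal, we let a dopamine-like signal (+1) express it as LTP and a serotonin-like signal (-1) as LTD. The constants and the symmetric eligibility shape are our simplifications:

```python
import math

def r_stdp(dt, modulator, a=1.0, tau=0.02):
    """dt = t_post - t_pre sets a symmetric Hebbian eligibility; the
    neuromodulatory factor (DA-like: +1, 5-HT-like: -1) decides whether it
    is expressed as LTP or LTD. A sketch, not the paper's fitted rule."""
    eligibility = a * math.exp(-abs(dt) / tau)
    return modulator * eligibility

assert r_stdp(0.01, +1) > 0           # dopamine-gated: potentiation
assert r_stdp(0.01, -1) < 0           # serotonin-gated: depression
assert abs(r_stdp(0.5, +1)) < 1e-6    # distant spikes leave little eligibility
```

Splitting the update into eligibility times modulator is what lets the same spike pairing support either valence, depending on which neuromodulator dominates.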
Affiliation(s)
- Carlos Wert-Carvajal
- Bioengineering Department, Imperial College London, London SW7 2AZ, UK; Theory of Neural Dynamics Group, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany; Institute of Experimental Epileptology and Cognition Research, Life and Brain Center, University of Bonn Medical Center, 53127 Bonn, Germany
- Melissa Reneaux
- Bioengineering Department, Imperial College London, London SW7 2AZ, UK
- Tatjana Tchumatchenko
- Theory of Neural Dynamics Group, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany; Institute of Experimental Epileptology and Cognition Research, Life and Brain Center, University of Bonn Medical Center, 53127 Bonn, Germany; Institute of Physiological Chemistry, University of Mainz Medical Center, 55131 Mainz, Germany
- Claudia Clopath
- Bioengineering Department, Imperial College London, London SW7 2AZ, UK
12. Takeda Y, Hata K, Yamazaki T, Kaneko M, Yokoi O, Tsai C, Umemura K, Nikuni T. Numerical simulation: fluctuation in background synaptic activity regulates synaptic plasticity. Front Syst Neurosci 2021; 15:771661. PMID: 34880734; PMCID: PMC8646040; DOI: 10.3389/fnsys.2021.771661.
Abstract
Synaptic plasticity is vital for learning and memory in the brain. It consists of long-term potentiation (LTP) and long-term depression (LTD). Spike frequency is a major determinant of synaptic plasticity in the brain, a noisy environment. Recently, we mathematically analyzed frequency-dependent synaptic plasticity (FDP) in vivo and found that LTP is more likely to occur with an increase in the frequency of background synaptic activity. Meanwhile, previous studies suggest statistical fluctuation in the amplitude of background synaptic activity, but little is understood about its contribution to synaptic plasticity. To address this issue, we performed numerical simulations of a calcium-based synapse model. We found that increasing the fluctuation of background synaptic activity attenuates the tendency toward LTD, leading to an enhancement of synaptic weight. Our results suggest that such fluctuations shape synaptic plasticity in the brain.
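The reported attenuation of LTD falls out of any two-threshold calcium rule: when mean calcium sits in the depression zone, larger background fluctuations push some transients past the potentiation threshold. A Monte-Carlo sketch with invented thresholds and calcium levels:

```python
import random

random.seed(3)

def mean_dw(sigma, mu=0.5, n=20000, theta_d=0.35, theta_p=0.55):
    """Average weight change under a two-threshold calcium rule when
    background activity adds Gaussian fluctuations of size sigma to a mean
    calcium level mu sitting inside the depression zone [theta_d, theta_p).
    All numbers illustrative."""
    total = 0.0
    for _ in range(n):
        ca = mu + random.gauss(0.0, sigma)
        if ca >= theta_p:
            total += 1.0      # LTP
        elif ca >= theta_d:
            total -= 1.0      # LTD
    return total / n

# Small fluctuations: almost every sample lands in the LTD zone.
assert mean_dw(0.01) < -0.95
# Larger fluctuations spill over the LTP threshold and cancel the depression.
assert mean_dw(0.2) > -0.1
assert mean_dw(0.01) < mean_dw(0.2)
```

Because the potentiation zone is open-ended while the depression zone is bounded, widening the calcium distribution asymmetrically favors LTP, matching the paper's observation.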
Affiliation(s)
- Yuto Takeda
- Department of Physics, Tokyo University of Science, Tokyo, Japan
- Katsuhiko Hata
- Department of Physics, Tokyo University of Science, Tokyo, Japan
- Department of Neuroscience, Research Center for Mathematical Medicine, Tokyo, Japan
- Department of Sports and Medical Science, Kokushikan University, Tokyo, Japan
- Graduate School of Emergency Medical System, Kokushikan University, Tokyo, Japan
- Tokio Yamazaki
- Department of Physics, Tokyo University of Science, Tokyo, Japan
- Masaki Kaneko
- KYB Medical Service Co., Ltd., Tokyo, Japan
- The Institute of Physical Education, Kokushikan University, Tokyo, Japan
- Osamu Yokoi
- Department of Neuroscience, Research Center for Mathematical Medicine, Tokyo, Japan
- Chengta Tsai
- Department of Neuroscience, Research Center for Mathematical Medicine, Tokyo, Japan
- Graduate School of Emergency Medical System, Kokushikan University, Tokyo, Japan
- Kazuo Umemura
- Department of Physics, Tokyo University of Science, Tokyo, Japan
- Tetsuro Nikuni
- Department of Physics, Tokyo University of Science, Tokyo, Japan
13. Sarazin MXB, Victor J, Medernach D, Naudé J, Delord B. Online learning and memory of neural trajectory replays for prefrontal persistent and dynamic representations in the irregular asynchronous state. Front Neural Circuits 2021; 15:648538. PMID: 34305535; PMCID: PMC8298038; DOI: 10.3389/fncir.2021.648538.
Abstract
In the prefrontal cortex (PFC), higher-order cognitive functions and adaptive flexible behaviors rely on continuous dynamical sequences of spiking activity that constitute neural trajectories in the state space of activity. Neural trajectories subserve diverse representations, from explicit mappings in physical spaces to generalized mappings in the task space, up to complex abstract transformations such as working memory, decision-making and behavioral planning. Computational models have separately assessed learning and replay of neural trajectories, often using unrealistic learning rules or decoupling learning simulations from replay. Hence, the question remains open of how neural trajectories are learned, memorized and replayed online, with permanently acting biological plasticity rules. The asynchronous irregular regime characterizing cortical dynamics in awake conditions is a major source of disorder that may jeopardize plasticity and replay of locally ordered activity. Here, we show that a recurrent model of local PFC circuitry endowed with realistic synaptic spike timing-dependent plasticity and scaling processes can learn, memorize and replay large-size neural trajectories online under asynchronous irregular dynamics, at regular or fast (sped-up) timescales. Presented trajectories are quickly learned (within seconds) as synaptic engrams in the network, and the model is able to chunk overlapping trajectories presented separately. These trajectory engrams persist over the long term (dozens of hours), and trajectory replays can be triggered over an hour. In turn, we show the conditions under which trajectory engrams and replays preserve asynchronous irregular dynamics in the network. Functionally, spiking activity during trajectory replays at the regular timescale accounts for dynamical coding with temporal tuning in individual neurons, persistent activity at the population level, and large levels of variability consistent with observed cognitive-related PFC dynamics. Together, these results offer a consistent theoretical framework accounting for how neural trajectories can be learned, memorized and replayed in PFC network circuits to subserve flexible dynamic representations and adaptive behaviors.
Affiliation(s)
- Matthieu X B Sarazin
- Institut des Systèmes Intelligents et de Robotique, CNRS, Inserm, Sorbonne Université, Paris, France
- Julie Victor
- CEA Paris-Saclay, CNRS, NeuroSpin, Saclay, France
- David Medernach
- Institut des Systèmes Intelligents et de Robotique, CNRS, Inserm, Sorbonne Université, Paris, France
- Jérémie Naudé
- Neuroscience Paris Seine - Institut de biologie Paris Seine, CNRS, Inserm, Sorbonne Université, Paris, France
- Bruno Delord
- Institut des Systèmes Intelligents et de Robotique, CNRS, Inserm, Sorbonne Université, Paris, France
14. Dong C, Madar AD, Sheffield MEJ. Distinct place cell dynamics in CA1 and CA3 encode experience in new environments. Nat Commun 2021; 12:2977. PMID: 34016996; PMCID: PMC8137926; DOI: 10.1038/s41467-021-23260-3.
Abstract
When exploring new environments, animals form spatial memories that are updated with experience and retrieved upon re-exposure to the same environment. The hippocampus is thought to support these memory processes, but how this is achieved by different subnetworks such as CA1 and CA3 remains unclear. To understand how hippocampal spatial representations emerge and evolve during familiarization, we performed 2-photon calcium imaging in mice running in new virtual environments and compared the trial-to-trial dynamics of place cells in CA1 and CA3 over days. We find that place fields in CA1 emerge rapidly but tend to shift backwards from trial to trial and remap upon re-exposure to the environment a day later. In contrast, place fields in CA3 emerge gradually but show more stable trial-to-trial and day-to-day dynamics. These results reflect distinct roles for CA1 and CA3 in spatial memory processing during familiarization to new environments and constrain the potential mechanisms that support them.
Collapse
MESH Headings
- Animals
- Behavior Observation Techniques
- Behavior, Animal/physiology
- CA1 Region, Hippocampal/cytology
- CA1 Region, Hippocampal/diagnostic imaging
- CA1 Region, Hippocampal/physiology
- CA3 Region, Hippocampal/cytology
- CA3 Region, Hippocampal/diagnostic imaging
- CA3 Region, Hippocampal/physiology
- Craniotomy
- Intravital Microscopy/instrumentation
- Intravital Microscopy/methods
- Male
- Mice
- Microscopy, Confocal/instrumentation
- Microscopy, Confocal/methods
- Models, Animal
- Optical Imaging/instrumentation
- Optical Imaging/methods
- Place Cells/physiology
- Space Perception/physiology
- Spatial Memory/physiology
Affiliation(s)
- Can Dong
- Department of Neurobiology and Institute for Neuroscience, University of Chicago, Chicago, IL, USA
- Antoine D Madar
- Department of Neurobiology and Institute for Neuroscience, University of Chicago, Chicago, IL, USA
- Mark E J Sheffield
- Department of Neurobiology and Institute for Neuroscience, University of Chicago, Chicago, IL, USA
15
Akil AE, Rosenbaum R, Josić K. Balanced networks under spike-time dependent plasticity. PLoS Comput Biol 2021; 17:e1008958. [PMID: 33979336 PMCID: PMC8143429 DOI: 10.1371/journal.pcbi.1008958] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Received: 04/25/2020] [Revised: 05/24/2021] [Accepted: 04/12/2021] [Indexed: 11/28/2022] Open
Abstract
The dynamics of local cortical networks are irregular, but correlated. Dynamic excitatory-inhibitory balance is a plausible mechanism that generates such irregular activity, but it remains unclear how balance is achieved and maintained in plastic neural networks. In particular, it is not fully understood how plasticity-induced changes in the network affect balance and, in turn, how correlated, balanced activity impacts learning. How do the dynamics of balanced networks change under different plasticity rules? How does correlated spiking activity in recurrent networks change the evolution of weights, their eventual magnitude, and structure across the network? To address these questions, we develop a theory of spike-timing-dependent plasticity in balanced networks. We show that balance can be attained and maintained under plasticity-induced weight changes. We find that correlations in the input mildly affect the evolution of synaptic weights. Under certain plasticity rules, we find an emergence of correlations between firing rates and synaptic weights. Under these rules, synaptic weights converge to a stable manifold in weight space, with their final configuration dependent on the initial state of the network. Lastly, we show that our framework can also describe the dynamics of plastic balanced networks when subsets of neurons receive targeted optogenetic input. Animals are able to learn complex tasks through changes in individual synapses between cells. Such changes lead to the coevolution of neural activity patterns and the structure of neural connectivity, but the consequences of these interactions are not fully understood. We consider plasticity in model neural networks which achieve an average balance between the excitatory and inhibitory synaptic inputs to different cells, and display cortical-like, irregular activity.
We extend the theory of balanced networks to account for synaptic plasticity and show which rules can maintain balance and which will drive the network into a different state. This theory of plasticity can provide insights into the relationship between stimuli, network dynamics, and synaptic circuitry.
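As a concrete point of reference for the kind of rule this theory builds on, here is a minimal sketch of a classical pair-based STDP update; the function name and parameter values are illustrative, not taken from the paper:

```python
import math

def stdp_dw(dt, a_plus=0.005, a_minus=0.00525, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP: weight change for a pre/post spike-time difference
    dt = t_post - t_pre (in ms). Post-after-pre (dt >= 0) potentiates,
    pre-after-post (dt < 0) depresses; both decay exponentially in |dt|."""
    if dt >= 0:
        return a_plus * math.exp(-dt / tau_plus)
    return -a_minus * math.exp(dt / tau_minus)

# Causal pairing (post 10 ms after pre) potentiates; the reverse depresses.
print(stdp_dw(10.0) > 0, stdp_dw(-10.0) < 0)  # prints: True True
```

A slightly larger depression amplitude than potentiation amplitude, as sketched here, is one common way such rules are kept from runaway potentiation.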
Affiliation(s)
- Alan Eric Akil
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana, United States of America
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana, United States of America
- Krešimir Josić
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
16
Deperrois N, Graupner M. Short-term depression and long-term plasticity together tune sensitive range of synaptic plasticity. PLoS Comput Biol 2020; 16:e1008265. [PMID: 32976516 PMCID: PMC7549837 DOI: 10.1371/journal.pcbi.1008265] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Received: 07/24/2019] [Revised: 10/12/2020] [Accepted: 08/17/2020] [Indexed: 01/24/2023] Open
Abstract
Synaptic efficacy is subject to activity-dependent changes on short and long time scales. While short-term changes decay over minutes, long-term modifications last from hours up to a lifetime and are thought to constitute the basis of learning and memory. Both plasticity mechanisms have been studied extensively, but little is known about how their interaction shapes synaptic dynamics. To investigate how short- and long-term plasticity together control the induction of synaptic depression and potentiation, we used numerical simulations and mathematical analysis of a calcium-based model, in which pre- and postsynaptic activity induces calcium transients driving synaptic long-term plasticity. We found that the model, implementing known synaptic short-term dynamics in the calcium transients, can be successfully fitted to long-term plasticity data obtained in visual and somatosensory cortex. Interestingly, the impact of spike-timing and firing rate changes on plasticity occurs in the prevalent firing rate range, which differs between the two cortical areas considered here. Our findings suggest that short- and long-term plasticity are together tuned to adapt plasticity to area-specific activity statistics such as firing rates. Synaptic long-term plasticity, the long-lasting change in efficacy of connections between neurons, is believed to underlie learning and memory. Synapses furthermore change their efficacy reversibly in an activity-dependent manner on the subsecond time scale, referred to as short-term plasticity. It is not known how the two synaptic plasticity mechanisms, long- and short-term, interact during activity epochs. To address this question, we used a biologically inspired plasticity model in which calcium drives changes in synaptic efficacy. We applied the model to plasticity data from visual and somatosensory cortex and found that synaptic changes occur in very different firing rate ranges, which correspond to the prevalent firing rates in the two structures.
Our results suggest that short- and long-term plasticity act in a well-concerted fashion.
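For readers unfamiliar with calcium-based plasticity models of this general kind, the following is a minimal sketch of the idea: spikes add decaying calcium transients, and the calcium level relative to two thresholds selects potentiation or depression. All amplitudes, time constants, and thresholds below are invented for illustration, not the paper's fitted parameters:

```python
import math

def calcium_trace(t, pre_times, post_times, c_pre=0.6, c_post=1.2, tau_ca=20.0):
    """Total calcium at time t (ms): each pre/post spike contributes an
    exponentially decaying transient with illustrative amplitude."""
    c = 0.0
    for s in pre_times:
        if t >= s:
            c += c_pre * math.exp(-(t - s) / tau_ca)
    for s in post_times:
        if t >= s:
            c += c_post * math.exp(-(t - s) / tau_ca)
    return c

def plasticity_drive(c, theta_d=1.0, theta_p=1.3):
    """Calcium above theta_p drives potentiation (+1), calcium between
    theta_d and theta_p drives depression (-1), below theta_d nothing (0)."""
    if c >= theta_p:
        return +1
    if c >= theta_d:
        return -1
    return 0

# A near-coincident pre-post pair pushes calcium over the potentiation
# threshold; a lone presynaptic spike stays below both thresholds.
print(plasticity_drive(calcium_trace(11.0, [0.0], [10.0])))  # prints: 1
print(plasticity_drive(calcium_trace(1.0, [0.0], [])))       # prints: 0
```

The paper's contribution can be read against this template: short-term dynamics modulate the per-spike calcium amplitudes, shifting which firing rates reach each threshold.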
Affiliation(s)
- Nicolas Deperrois
- Université de Paris, CNRS, SPPIN - Saints-Pères Paris Institute for the Neurosciences, F-75006 Paris, France
- Michael Graupner
- Université de Paris, CNRS, SPPIN - Saints-Pères Paris Institute for the Neurosciences, F-75006 Paris, France
17
Multicoding in neural information transfer suggested by mathematical analysis of the frequency-dependent synaptic plasticity in vivo. Sci Rep 2020; 10:13974. [PMID: 32811844 PMCID: PMC7435278 DOI: 10.1038/s41598-020-70876-4] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Received: 02/22/2020] [Accepted: 08/04/2020] [Indexed: 11/29/2022] Open
Abstract
Two elements of neural information processing have primarily been proposed: the firing rate and the spike timing of neurons. In the case of synaptic plasticity, although spike-timing-dependent plasticity (STDP), which depends on presynaptic and postsynaptic spike times, had been considered the most common rule, recent studies have shown that the inhibition-dominated state of the brain in vivo disrupts the precise spike timing that is key to STDP. Thus, the importance of firing frequency in synaptic plasticity in vivo has been recognized again. However, little is understood about how frequency-dependent synaptic plasticity (FDP) is regulated in vivo. Here, we focused on the presynaptic input pattern, the intracellular calcium decay time constants, and the background synaptic activity, which vary depending on neuron type and on the anatomical and physiological environment in the brain. By analyzing a calcium-based model, we found that the synaptic weight differs depending on these factors characteristic of in vivo conditions, even if neurons receive the same input rate. This finding suggests the involvement of multifaceted factors other than input frequency in FDP, and even in neural coding, in vivo.
18
Thomas CW, Guillaumin MC, McKillop LE, Achermann P, Vyazovskiy VV. Global sleep homeostasis reflects temporally and spatially integrated local cortical neuronal activity. eLife 2020; 9:54148. [PMID: 32614324 PMCID: PMC7332296 DOI: 10.7554/elife.54148] [Citation(s) in RCA: 18] [Impact Index Per Article: 4.5] [Received: 12/03/2019] [Accepted: 06/19/2020] [Indexed: 12/16/2022] Open
Abstract
Sleep homeostasis manifests as a relative constancy of its daily amount and intensity. Theoretical descriptions define ‘Process S’, a variable with dynamics dependent on global sleep-wake history, and reflected in electroencephalogram (EEG) slow wave activity (SWA, 0.5–4 Hz) during sleep. The notion of sleep as a local, activity-dependent process suggests that activity history must be integrated to determine the dynamics of global Process S. Here, we developed novel mathematical models of Process S based on cortical activity recorded in freely behaving mice, describing local Process S as a function of the deviation of neuronal firing rates from a locally defined set-point, independent of global sleep-wake state. Averaging locally derived Processes S and their rate parameters yielded values resembling those obtained from EEG SWA and global vigilance states. We conclude that local Process S dynamics reflects neuronal activity integrated over time, and global Process S reflects local processes integrated over space.
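A minimal, rate-based caricature of such a local Process S can make the idea concrete: S charges toward an upper bound while the local firing rate exceeds its set point and discharges otherwise. The function name, time constants, and set point below are illustrative, not the fitted values from the study:

```python
def simulate_process_s(rates, set_point=5.0, dt=1.0,
                       tau_rise=600.0, tau_fall=150.0, s0=0.5):
    """Euler-integrate a local 'Process S' in [0, 1]: it increases toward 1
    while the local firing rate is above its set point and decays toward 0
    otherwise, independently of any global sleep-wake label."""
    s, traj = s0, []
    for r in rates:
        if r > set_point:
            s += dt * (1.0 - s) / tau_rise   # wake-like activity: S builds up
        else:
            s += dt * (0.0 - s) / tau_fall   # sleep-like activity: S discharges
        traj.append(s)
    return traj

# High firing ("wake") for 300 steps raises S; low firing lowers it again.
traj = simulate_process_s([10.0] * 300 + [1.0] * 300)
print(traj[299] > traj[0], traj[-1] < traj[299])  # prints: True True
```

Averaging many such locally derived trajectories, each with its own set point, is the spirit in which the paper relates local processes to the global EEG-derived Process S.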
Affiliation(s)
- Christopher W Thomas
- Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
- Mathilde C C Guillaumin
- Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom
- Laura E McKillop
- Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
- Peter Achermann
- Institute of Pharmacology and Toxicology, University of Zurich, Zurich, Switzerland
- The KEY Institute for Brain-Mind Research, Department of Psychiatry, Psychotherapy and Psychosomatics, University Hospital of Psychiatry, Zurich, Switzerland
- Vladyslav V Vyazovskiy
- Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
19
Montangie L, Miehl C, Gjorgjieva J. Autonomous emergence of connectivity assemblies via spike triplet interactions. PLoS Comput Biol 2020; 16:e1007835. [PMID: 32384081 PMCID: PMC7239496 DOI: 10.1371/journal.pcbi.1007835] [Citation(s) in RCA: 19] [Impact Index Per Article: 4.8] [Received: 06/20/2019] [Revised: 05/20/2020] [Accepted: 03/31/2020] [Indexed: 01/08/2023] Open
Abstract
Non-random connectivity can emerge without structured external input driven by activity-dependent mechanisms of synaptic plasticity based on precise spiking patterns. Here we analyze the emergence of global structures in recurrent networks based on a triplet model of spike timing dependent plasticity (STDP), which depends on the interactions of three precisely-timed spikes, and can describe plasticity experiments with varying spike frequency better than the classical pair-based STDP rule. We derive synaptic changes arising from correlations up to third-order and describe them as the sum of structural motifs, which determine how any spike in the network influences a given synaptic connection through possible connectivity paths. This motif expansion framework reveals novel structural motifs under the triplet STDP rule, which support the formation of bidirectional connections and ultimately the spontaneous emergence of global network structure in the form of self-connected groups of neurons, or assemblies. We propose that under triplet STDP assembly structure can emerge without the need for externally patterned inputs or assuming a symmetric pair-based STDP rule common in previous studies. The emergence of non-random network structure under triplet STDP occurs through internally-generated higher-order correlations, which are ubiquitous in natural stimuli and neuronal spiking activity, and important for coding. We further demonstrate how neuromodulatory mechanisms that modulate the shape of the triplet STDP rule or the synaptic transmission function differentially promote structural motifs underlying the emergence of assemblies, and quantify the differences using graph theoretic measures. Emergent non-random connectivity structures in different brain regions are tightly related to specific patterns of neural activity and support diverse brain functions. 
For instance, self-connected groups of neurons, known as assemblies, have been proposed to represent functional units in brain circuits and can emerge even without patterned external instruction. Here we investigate the emergence of non-random connectivity in recurrent networks using a particular plasticity rule, triplet STDP, which relies on the interaction of spike triplets and can capture higher-order statistical dependencies in neural activity. We derive the evolution of the synaptic strengths in the network and explore the conditions for the self-organization of connectivity into assemblies. We demonstrate key differences of the triplet STDP rule compared to the classical pair-based rule in terms of how assemblies are formed, including the realistic asymmetric shape and influence of novel connectivity motifs on network plasticity driven by higher-order correlations. Assembly formation depends on the specific shape of the STDP window and synaptic transmission function, pointing towards an important role of neuromodulatory signals on formation of intrinsically generated assemblies.
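A minimal event-driven sketch of a Pfister-Gerstner-style triplet STDP update, the kind of rule the analysis builds on, may help: each pre spike depresses via the fast post trace, and each post spike potentiates via the pre trace, boosted by a slow trace of earlier post spikes (the triplet term). Trace time constants and amplitudes below are illustrative:

```python
import math

def run_triplet_stdp(pre_times, post_times, w0=0.5,
                     a2_minus=0.007, a2_plus=0.005, a3_plus=0.006,
                     tau_plus=16.8, tau_minus=33.7, tau_y=114.0):
    """Minimal triplet STDP: a pre spike depresses in proportion to the
    fast post trace o1; a post spike potentiates in proportion to the pre
    trace r1, scaled up by the slow post trace o2 left by *earlier* post
    spikes (the interaction of three spikes)."""
    events = sorted([(t, 'pre') for t in pre_times] +
                    [(t, 'post') for t in post_times])
    w, r1, o1, o2, t_last = w0, 0.0, 0.0, 0.0, 0.0
    for t, kind in events:
        gap = t - t_last
        r1 *= math.exp(-gap / tau_plus)
        o1 *= math.exp(-gap / tau_minus)
        o2 *= math.exp(-gap / tau_y)
        if kind == 'pre':
            w -= a2_minus * o1                   # pair depression
            r1 += 1.0
        else:
            w += r1 * (a2_plus + a3_plus * o2)   # pair + triplet potentiation
            o2 += 1.0                            # updated after the weight step
            o1 += 1.0
        t_last = t
    return w

# Repeated post-after-pre pairings potentiate more than a single pairing,
# because the slow trace o2 accumulates across post spikes.
single = run_triplet_stdp([0.0], [10.0])
burst = run_triplet_stdp([0.0, 20.0, 40.0], [10.0, 30.0, 50.0])
print(burst > single > 0.5)  # prints: True
```

This frequency sensitivity of the triplet term is what lets such rules capture rate-dependent plasticity data that pure pair-based STDP misses.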
Affiliation(s)
- Lisandro Montangie
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
- Christoph Miehl
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
- Technical University of Munich, School of Life Sciences, Freising, Germany
- Julijana Gjorgjieva
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
- Technical University of Munich, School of Life Sciences, Freising, Germany
20
Lappalainen J, Herpich J, Tetzlaff C. A Theoretical Framework to Derive Simple, Firing-Rate-Dependent Mathematical Models of Synaptic Plasticity. Front Comput Neurosci 2019; 13:26. [PMID: 31133837 PMCID: PMC6517541 DOI: 10.3389/fncom.2019.00026] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Received: 11/26/2018] [Accepted: 04/10/2019] [Indexed: 11/13/2022] Open
Abstract
Synaptic plasticity serves as an essential mechanism underlying cognitive processes such as learning and memory. For a better understanding, detailed theoretical models combine the experimental underpinnings of synaptic plasticity and match experimental results. However, these models are mathematically complex, impeding comprehensive investigation of their link to cognitive processes, which are generally executed at the neuronal network level. Here, we derive a mathematical framework enabling the simplification of such detailed models of synaptic plasticity, facilitating further mathematical analyses. With this framework we obtain a compact, firing-rate-dependent mathematical formulation, which includes the essential dynamics of the detailed model and, thus, of experimentally verified properties of synaptic plasticity. Among other results, by testing our framework on the dynamics of two well-established calcium-dependent synaptic plasticity models, we derive that the synaptic changes depend on the square of the presynaptic firing rate, in contrast to previous assumptions. Thus, the framework presented here enables the derivation of biologically plausible but simple mathematical models of synaptic plasticity, allowing one to analyze how synaptic dynamics depend on neuronal properties such as the firing rate and to investigate their implications in complex neuronal networks.
Affiliation(s)
- Janne Lappalainen
- Department of Computational Neuroscience, Third Institute of Physics-Biophysics, Georg-August-University, Göttingen, Germany
- Juliane Herpich
- Department of Computational Neuroscience, Third Institute of Physics-Biophysics, Georg-August-University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
- Christian Tetzlaff
- Department of Computational Neuroscience, Third Institute of Physics-Biophysics, Georg-August-University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
21
Ocker GK, Doiron B. Training and Spontaneous Reinforcement of Neuronal Assemblies by Spike Timing Plasticity. Cereb Cortex 2019; 29:937-951. [PMID: 29415191 PMCID: PMC7963120 DOI: 10.1093/cercor/bhy001] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Received: 10/07/2016] [Revised: 01/01/2018] [Accepted: 01/05/2018] [Indexed: 12/15/2022] Open
Abstract
The synaptic connectivity of cortex is plastic, with experience shaping the ongoing interactions between neurons. Theoretical studies of spike timing-dependent plasticity (STDP) have focused on either just pairs of neurons or large-scale simulations. A simple analytic account for how fast spike time correlations affect both microscopic and macroscopic network structure is lacking. We develop a low-dimensional mean field theory for STDP in recurrent networks and show the emergence of assemblies of strongly coupled neurons with shared stimulus preferences. After training, this connectivity is actively reinforced by spike train correlations during the spontaneous dynamics. Furthermore, the stimulus coding by cell assemblies is actively maintained by these internally generated spiking correlations, suggesting a new role for noise correlations in neural coding. Assembly formation has often been associated with firing rate-based plasticity schemes; our theory provides an alternative and complementary framework, where fine temporal correlations and STDP form and actively maintain learned structure in cortical networks.
Affiliation(s)
- Gabriel Koch Ocker
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, PA, USA
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Allen Institute for Brain Science, Seattle, WA, USA
- Brent Doiron
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA, USA
22
Theodoni P, Rovira B, Wang Y, Roxin A. Theta-modulation drives the emergence of connectivity patterns underlying replay in a network model of place cells. eLife 2018; 7:37388. [PMID: 30355442 PMCID: PMC6224194 DOI: 10.7554/elife.37388] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Received: 04/09/2018] [Accepted: 10/24/2018] [Indexed: 01/05/2023] Open
Abstract
Place cells of the rodent hippocampus fire action potentials when the animal traverses a particular spatial location in any environment. Therefore, for any given trajectory, one observes a repeatable sequence of place cell activations. When the animal is quiescent or sleeping, one can observe similar sequences of activation known as replay, which underlie the process of memory consolidation. However, it remains unclear how replay is generated. Here we show how a temporally asymmetric plasticity rule during spatial exploration gives rise to spontaneous replay in a model network by shaping the recurrent connectivity to reflect the topology of the learned environment. Crucially, the rate of this encoding is strongly modulated by ongoing rhythms. Oscillations in the theta range optimize learning by generating repeated pre-post pairings on a time scale commensurate with the window for plasticity, while lower and higher frequencies generate learning rates that are orders of magnitude lower.
Affiliation(s)
- Panagiota Theodoni
- Centre de Recerca Matemàtica, Bellaterra, Spain
- New York University Shanghai, Shanghai, China
- NYU-ECNU Institute of Brain and Cognitive Science at NYU Shanghai, Shanghai, China
- Bernat Rovira
- Centre de Recerca Matemàtica, Bellaterra, Spain
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
- Yingxue Wang
- Max Planck Florida Institute for Neuroscience, Jupiter, United States
- Alex Roxin
- Centre de Recerca Matemàtica, Bellaterra, Spain
- Barcelona Graduate School of Mathematics, Barcelona, Spain
23
Rongala UB, Spanne A, Mazzoni A, Bengtsson F, Oddo CM, Jörntell H. Intracellular Dynamics in Cuneate Nucleus Neurons Support Self-Stabilizing Learning of Generalizable Tactile Representations. Front Cell Neurosci 2018; 12:210. [PMID: 30108485 PMCID: PMC6079306 DOI: 10.3389/fncel.2018.00210] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.8] [Received: 03/26/2018] [Accepted: 06/26/2018] [Indexed: 12/12/2022] Open
Abstract
How the brain represents the external world is an unresolved issue for neuroscience, which could provide fundamental insights into brain circuitry operation and solutions for artificial intelligence and robotics. The neurons of the cuneate nucleus form the first interface for the sense of touch in the brain. They were previously shown to have a highly skewed synaptic weight distribution for tactile primary afferent inputs, suggesting that their connectivity is strongly shaped by learning. Here we first characterized the intracellular dynamics and inhibitory synaptic inputs of cuneate neurons in vivo and modeled their integration of tactile sensory inputs. We then replaced the tactile inputs with input from a sensorized bionic fingertip and modeled the learning-induced representations that emerged from varied sensory experiences. The model reproduced both the intrinsic membrane dynamics and the synaptic weight distributions observed in cuneate neurons in vivo. In terms of higher level model properties, individual cuneate neurons learnt to identify specific sets of correlated sensors, which at the population level resulted in a decomposition of the sensor space into its recurring high-dimensional components. Such vector components could be applied to identify both past and novel sensory experiences and likely correspond to the fundamental haptic input features these neurons encode in vivo. In addition, we show that the cuneate learning architecture is robust to a wide range of intrinsic parameter settings due to the neuronal intrinsic dynamics. Therefore, the architecture is a potentially generic solution for forming versatile representations of the external world in different sensor systems.
Affiliation(s)
- Udaya B Rongala
- The BioRobotics Institute, Scuola Superiore Sant'Anna, Pisa, Italy
- Anton Spanne
- Section for Neurobiology, Department of Experimental Medical Sciences, Biomedical Center, Lund University, Lund, Sweden
- Alberto Mazzoni
- The BioRobotics Institute, Scuola Superiore Sant'Anna, Pisa, Italy
- Fredrik Bengtsson
- Section for Neurobiology, Department of Experimental Medical Sciences, Biomedical Center, Lund University, Lund, Sweden
- Calogero M Oddo
- The BioRobotics Institute, Scuola Superiore Sant'Anna, Pisa, Italy
- Henrik Jörntell
- Section for Neurobiology, Department of Experimental Medical Sciences, Biomedical Center, Lund University, Lund, Sweden
24
Foncelle A, Mendes A, Jędrzejewska-Szmek J, Valtcheva S, Berry H, Blackwell KT, Venance L. Modulation of Spike-Timing Dependent Plasticity: Towards the Inclusion of a Third Factor in Computational Models. Front Comput Neurosci 2018; 12:49. [PMID: 30018546 PMCID: PMC6037788 DOI: 10.3389/fncom.2018.00049] [Citation(s) in RCA: 31] [Impact Index Per Article: 5.2] [Received: 03/15/2018] [Accepted: 06/06/2018] [Indexed: 11/13/2022] Open
Abstract
In spike-timing dependent plasticity (STDP), the change in synaptic strength depends on the timing of pre- vs. postsynaptic spiking activity. Since STDP is in compliance with Hebb's postulate, it is considered one of the major mechanisms of memory storage and recall. STDP comprises a system of two coincidence detectors, with N-methyl-D-aspartate receptor (NMDAR) activation often posited as one of the main components. Numerous studies have unveiled a third component of this coincidence detection system, namely neuromodulation and glial activity shaping STDP. Even though dopaminergic control of STDP has most often been reported, acetylcholine, noradrenaline, nitric oxide (NO), brain-derived neurotrophic factor (BDNF), and gamma-aminobutyric acid (GABA) have also been shown to effectively modulate STDP. Furthermore, it has been demonstrated that astrocytes, via the release or uptake of glutamate, gate STDP expression. At the most fundamental level, the timing properties of STDP are expected to depend on the spatiotemporal dynamics of the underlying signaling pathways. However, in most cases, technical limitations mean that experiments grant only indirect access to these pathways. Computational models carefully constrained by experiments allow for a better qualitative understanding of the molecular basis of STDP and its regulation by neuromodulators. Recently, computational models of calcium dynamics and signaling pathway molecules have started to explore the emergence of STDP in ex vivo and in vivo-like conditions. These models are expected to better reproduce at least part of the complex modulation of STDP as an emergent property of the underlying molecular pathways. Elucidation of the mechanisms underlying STDP modulation and of its consequences for network dynamics is of critical importance and will allow a better understanding of the major mechanisms of memory storage and recall, both in health and disease.
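A common computational abstraction of such third-factor control, not taken from the review itself, stores the Hebbian timing signal as an eligibility trace that a delayed neuromodulatory signal then gates and scales. All function names, parameters, and the multiplicative form below are illustrative assumptions:

```python
import math

def eligibility(dt, tau=20.0):
    """Hebbian eligibility set by pre/post timing (dt = t_post - t_pre, ms):
    a standard antisymmetric STDP kernel, stored at the synapse instead of
    being applied immediately."""
    return math.copysign(math.exp(-abs(dt) / tau), dt)

def three_factor_dw(dt, modulator, lr=0.01, tau_e=1000.0, delay=500.0):
    """Third-factor sketch: the eligibility trace decays while waiting for
    a neuromodulatory signal (e.g. dopamine) arriving `delay` ms later; the
    modulator gates, scales, and can even invert the stored change."""
    return lr * modulator * eligibility(dt) * math.exp(-delay / tau_e)

# Same pre/post timing, three different outcomes set by the third factor:
# a positive modulator consolidates potentiation, zero cancels it, and a
# negative one converts it into depression.
print(three_factor_dw(10.0, +1.0) > 0,
      three_factor_dw(10.0, 0.0) == 0.0,
      three_factor_dw(10.0, -1.0) < 0)  # prints: True True True
```

Sign inversion of the STDP window by a modulator, as this sketch allows, is one of the experimentally reported effects the review catalogs.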
Affiliation(s)
- Alexandre Foncelle
- INRIA, Villeurbanne, France
- LIRIS UMR 5205 CNRS-INSA, University of Lyon, Villeurbanne, France
- Alexandre Mendes
- Dynamic and Pathophysiology of Neuronal Networks, Center for Interdisciplinary Research in Biology (CIRB), College de France, INSERM U1050, CNRS UMR7241, Labex Memolife, Paris, France
- University Pierre et Marie Curie, ED 158, Paris, France
- Silvana Valtcheva
- Dynamic and Pathophysiology of Neuronal Networks, Center for Interdisciplinary Research in Biology (CIRB), College de France, INSERM U1050, CNRS UMR7241, Labex Memolife, Paris, France
- University Pierre et Marie Curie, ED 158, Paris, France
- Hugues Berry
- INRIA, Villeurbanne, France
- LIRIS UMR 5205 CNRS-INSA, University of Lyon, Villeurbanne, France
- Kim T. Blackwell
- The Krasnow Institute for Advanced Studies, George Mason University, Fairfax, VA, United States
- Laurent Venance
- Dynamic and Pathophysiology of Neuronal Networks, Center for Interdisciplinary Research in Biology (CIRB), College de France, INSERM U1050, CNRS UMR7241, Labex Memolife, Paris, France
- University Pierre et Marie Curie, ED 158, Paris, France
25
Cui Y, Prokin I, Mendes A, Berry H, Venance L. Robustness of STDP to spike timing jitter. Sci Rep 2018; 8:8139. [PMID: 29802357 PMCID: PMC5970212 DOI: 10.1038/s41598-018-26436-y] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.7] [Received: 02/13/2018] [Accepted: 05/09/2018] [Indexed: 01/26/2023] Open
Abstract
In Hebbian plasticity, neural circuits adjust their synaptic weights depending on patterned firing. Spike-timing-dependent plasticity (STDP), a synaptic Hebbian learning rule, relies on the order and timing of the paired activities in pre- and postsynaptic neurons. Classically, in ex vivo experiments, STDP is assessed with deterministic (constant) spike timings and time intervals between successive pairings, thus exhibiting a regularity that differs from biological variability. Hence, whether STDP emerges from noisy inputs, as occurs during in vivo-like firing, remains unresolved. Here, we used noisy STDP pairings in which the spike timing and/or the interval between pairings were jittered. Using electrophysiology and mathematical modeling, we explored the impact of jitter on three forms of STDP at corticostriatal synapses: NMDAR-LTP, endocannabinoid-LTD, and endocannabinoid-LTP. We found that NMDAR-LTP was highly fragile to jitter, whereas endocannabinoid plasticity appeared more resistant. When the frequency or number of pairings was increased, NMDAR-LTP became more robust and could be expressed despite strong jittering. Our results identify endocannabinoid plasticity as a robust form of STDP, whereas the sensitivity of NMDAR-LTP to jitter varies with activity frequency. This provides new insights into the mechanisms at play during the different phases of learning and memory and into the emergence of Hebbian plasticity under in vivo-like activity.
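Why jitter erodes a purely timing-based rule can be illustrated with a simple Monte Carlo sketch: a generic antisymmetric pair-based kernel stands in for the corticostriatal rules (it is not the paper's model), and Gaussian jitter on the pairing interval mixes depressing pairings into a nominally potentiating protocol. Amplitudes, time constants, and jitter statistics are illustrative:

```python
import math
import random

def pair_dw(dt, a_plus=1.0, a_minus=1.0, tau=20.0):
    """Antisymmetric pair-based STDP kernel (arbitrary units),
    dt = t_post - t_pre in ms."""
    if dt >= 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

def mean_dw_with_jitter(dt_nominal, jitter_sd, n=20000, seed=0):
    """Average weight change over n pairings whose spike-time difference is
    the nominal dt plus Gaussian jitter: a simplified stand-in for the
    jittered-pairing protocols used in the study."""
    rng = random.Random(seed)
    return sum(pair_dw(dt_nominal + rng.gauss(0.0, jitter_sd))
               for _ in range(n)) / n

# With a nominal dt of +10 ms, strong jitter mixes in post-before-pre
# pairings, shrinking the mean potentiation toward zero.
print(mean_dw_with_jitter(10.0, 0.0) > mean_dw_with_jitter(10.0, 20.0) > 0.0)
```

In this caricature the mean change shrinks but keeps its sign; the experimental point of the paper is that real induction pathways differ sharply in how much of this degradation they tolerate.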
Affiliation(s)
- Yihui Cui
- Dynamics and Pathophysiology of Neuronal Networks Team, Center for Interdisciplinary Research in Biology (CIRB), College de France, CNRS, INSERM, PSL Research University, Paris, France
- Ilya Prokin
- INRIA, Villeurbanne, France
- University of Lyon, LIRIS UMR5205, Villeurbanne, France
- Alexandre Mendes
- Dynamics and Pathophysiology of Neuronal Networks Team, Center for Interdisciplinary Research in Biology (CIRB), College de France, CNRS, INSERM, PSL Research University, Paris, France
- Hugues Berry
- INRIA, Villeurbanne, France
- University of Lyon, LIRIS UMR5205, Villeurbanne, France
- Laurent Venance
- Dynamics and Pathophysiology of Neuronal Networks Team, Center for Interdisciplinary Research in Biology (CIRB), College de France, CNRS, INSERM, PSL Research University, Paris, France
26
Weissenberger F, Gauy MM, Lengler J, Meier F, Steger A. Voltage dependence of synaptic plasticity is essential for rate based learning with short stimuli. Sci Rep 2018; 8:4609. [PMID: 29545553 PMCID: PMC5854671 DOI: 10.1038/s41598-018-22781-0] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Received: 01/08/2018] [Accepted: 02/28/2018] [Indexed: 11/09/2022] Open
Abstract
In computational neuroscience, synaptic plasticity rules are often formulated in terms of firing rates. The predominant description of in vivo neuronal activity, however, is the instantaneous rate (or spiking probability). In this article we resolve this discrepancy by showing that fluctuations of the membrane potential carry enough information to permit a precise estimate of the instantaneous rate in balanced networks. As a consequence, we find that rate-based plasticity rules are not restricted to neuronal activity that is stable for hundreds of milliseconds to seconds, but can be carried over to situations in which it changes every few milliseconds. We illustrate this by showing that a voltage-dependent realization of the classical BCM rule achieves input selectivity even when each stimulus lasts only a few milliseconds.
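For context, here is a minimal rate-based sketch of the classical BCM rule on which the paper's voltage-dependent realization is built; the learning rate, threshold dynamics, and rate values are illustrative assumptions, not the paper's formulation:

```python
def bcm_step(w, pre_rate, post_rate, theta, lr=1e-4):
    """Classic BCM update: dw is proportional to the presynaptic rate times
    phi(post) = post * (post - theta), so postsynaptic activity above the
    sliding threshold theta potentiates and activity below it depresses."""
    return w + lr * pre_rate * post_rate * (post_rate - theta)

def update_threshold(theta, post_rate, tau_theta=100.0, dt=1.0):
    """Sliding threshold relaxes toward the square of recent postsynaptic
    activity, which stabilizes the rule against runaway potentiation."""
    return theta + dt * (post_rate ** 2 - theta) / tau_theta

# Same synapse, same threshold: strong postsynaptic activity potentiates,
# weak activity depresses.
w_hi = bcm_step(0.5, pre_rate=10.0, post_rate=20.0, theta=10.0)
w_lo = bcm_step(0.5, pre_rate=10.0, post_rate=5.0, theta=10.0)
print(w_hi > 0.5 > w_lo)  # prints: True
```

The paper's observation can then be phrased against this template: if membrane-potential fluctuations yield a fast estimate of `post_rate`, updates of this form remain meaningful even for stimuli lasting only milliseconds.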
Affiliation(s)
- Felix Weissenberger
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092 Zürich, Switzerland
- Marcelo Matheus Gauy
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092 Zürich, Switzerland
- Johannes Lengler
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092 Zürich, Switzerland
- Florian Meier
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092 Zürich, Switzerland
- Angelika Steger
- Institute of Theoretical Computer Science, Department of Computer Science, ETHZ, 8092 Zürich, Switzerland
27
Richter LMA, Gjorgjieva J. Understanding neural circuit development through theory and models. Curr Opin Neurobiol 2017; 46:39-47. DOI: 10.1016/j.conb.2017.07.004.
28
Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, Shea-Brown E. From the statistics of connectivity to the statistics of spike times in neuronal networks. Curr Opin Neurobiol 2017; 46:109-119. PMID: 28863386; DOI: 10.1016/j.conb.2017.07.011.
Abstract
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits. The first is that local features of network connectivity can be surprisingly effective in predicting global statistics of activity across a network. The second is that, for the important case of large networks with excitatory-inhibitory balance, correlated spiking persists or vanishes depending on the spatial scales of recurrent and feedforward connectivity. We close by showing how these ideas, together with plasticity rules, can help to close the loop between network structure and activity statistics.
Affiliation(s)
- Yu Hu
- Center for Brain Science, Harvard University, United States
- Michael A Buice
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, United States; Center for the Neural Basis of Cognition, Pittsburgh, United States
- Krešimir Josić
- Department of Mathematics, University of Houston, United States; Department of Biology and Biochemistry, University of Houston, United States; Department of BioSciences, Rice University, United States
- Robert Rosenbaum
- Department of Mathematics, University of Notre Dame, United States
- Eric Shea-Brown
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States; Department of Physiology and Biophysics, and University of Washington Institute for Neuroengineering, United States