1
Bin Ibrahim MZ, Wang Z, Sajikumar S. Synapses tagged, memories kept: synaptic tagging and capture hypothesis in brain health and disease. Philos Trans R Soc Lond B Biol Sci 2024; 379:20230237. PMID: 38853570. DOI: 10.1098/rstb.2023.0237.
Abstract
The synaptic tagging and capture (STC) hypothesis provides a framework for the synapse-specific mechanism of protein synthesis-dependent long-term plasticity upon synaptic induction. Activated synapses display a transient tag that can capture plasticity-related products (PRPs). These two events, tag setting and PRP synthesis, can be teased apart and have been studied extensively, from their electrophysiological and pharmacological properties to the molecular events involved. The hypothesis also permits interactions between synaptic populations that encode different memories within the same neuronal population, and hence gives rise to the associativity of plasticity. In this review, we share the advances made since the experimental debut of the STC hypothesis, including the role of neuromodulation in PRP synthesis and tag integrity, behavioural correlates of the hypothesis and in silico modelling. As a sensitive assay of synaptic health, STC can also reveal neuronal aberrations. We further describe how synaptic plasticity and associativity are altered in ageing-related decline and in pathological conditions such as juvenile stress, cancer, sleep deprivation and Alzheimer's disease. This article is part of a discussion meeting issue 'Long-term potentiation: 50 years on'.
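The tag-and-capture interaction summarized in this abstract can be illustrated with a minimal toy simulation: a weak tetanus sets only a decaying tag, a strong tetanus sets a tag and triggers neuron-wide PRP synthesis, and late-phase consolidation occurs only where tag and PRP overlap in time. All parameters and time constants below are illustrative assumptions, not values from the paper.

```python
def simulate_stc(weak_at, strong_at, t_end=300.0, dt=1.0,
                 tag_tau=60.0, prp_tau=90.0, capture_rate=0.05):
    """Two pathways onto one neuron: synapse 0 gets a weak tetanus (tag only),
    synapse 1 a strong tetanus (tag + PRP synthesis). A tagged synapse
    consolidates (late-LTP) only while PRPs are available."""
    tag = [0.0, 0.0]      # per-synapse tags, decaying with tag_tau
    prp = 0.0             # shared pool of plasticity-related products
    w_late = [0.0, 0.0]   # consolidated late-phase weight component
    t = 0.0
    while t < t_end:
        if t == weak_at:
            tag[0] = 1.0              # weak induction: tag only
        if t == strong_at:
            tag[1] = 1.0
            prp = 1.0                 # strong induction: tag + PRPs
        for i in range(2):
            # capture: consolidation is driven by the product tag * PRP
            w_late[i] += capture_rate * tag[i] * prp * dt
            tag[i] *= (1 - dt / tag_tau)
        prp *= (1 - dt / prp_tau)
        t += dt
    return w_late

# Weak input shortly before the strong one is rescued ("capture")...
near = simulate_stc(weak_at=30.0, strong_at=40.0)
# ...whereas a weak input long before PRP synthesis has a decayed tag.
far = simulate_stc(weak_at=0.0, strong_at=250.0)
```

In this sketch the associativity of plasticity falls out of the shared PRP pool: the weak pathway consolidates only when its tag overlaps the PRP window opened by the strong pathway.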
Affiliation(s)
- Mohammad Zaki Bin Ibrahim
- Department of Physiology, Yong Loo Lin School of Medicine, National University of Singapore, Singapore 117597, Singapore
- Neurobiology Programme, Life Sciences Institute, National University of Singapore, Singapore 119077, Singapore
- Zijun Wang
- Department of Physiology, Yong Loo Lin School of Medicine, National University of Singapore, Singapore 117597, Singapore
- Neurobiology Programme, Life Sciences Institute, National University of Singapore, Singapore 119077, Singapore
- Sreedharan Sajikumar
- Department of Physiology, Yong Loo Lin School of Medicine, National University of Singapore, Singapore 117597, Singapore
- Neurobiology Programme, Life Sciences Institute, National University of Singapore, Singapore 119077, Singapore
- Healthy Longevity Translational Research Programme, Yong Loo Lin School of Medicine, National University of Singapore, Singapore 117597, Singapore
2
Faress I, Khalil V, Hou WH, Moreno A, Andersen N, Fonseca R, Piriz J, Capogna M, Nabavi S. Non-Hebbian plasticity transforms transient experiences into lasting memories. eLife 2024; 12:RP91421. PMID: 39023519. PMCID: PMC11257676. DOI: 10.7554/elife.91421.
Abstract
The dominant models of learning and memory, such as Hebbian plasticity, propose that experiences are transformed into memories through input-specific synaptic plasticity at the time of learning. However, synaptic plasticity is neither strictly input-specific nor restricted to the time of its induction. The impact of such forms of non-Hebbian plasticity on memory has been difficult to test and hence remains poorly understood. Here, we demonstrate that synaptic manipulations can deviate from the Hebbian model of learning yet produce a lasting memory. First, we established a weak associative conditioning protocol in mice, in which optogenetic stimulation of sensory thalamic input to the amygdala was paired with a footshock but no detectable memory was formed. However, when the same input was potentiated minutes before or after, or even 24 hr later, the associative experience was converted into a lasting memory. Importantly, potentiating an independent input to the amygdala minutes, but not 24 hr, after the pairing produced a lasting memory. Thus, our findings suggest that the transformation of a transient experience into a memory is restricted neither to the time of the experience nor to the synapses triggered by it; instead, it can be influenced by past and future events.
Affiliation(s)
- Islam Faress
- Department of Molecular Biology and Genetics, Aarhus University, Aarhus, Denmark
- Department of Biomedicine, Aarhus University, Aarhus, Denmark
- DANDRITE, The Danish Research Institute of Translational Neuroscience, Aarhus University, Aarhus, Denmark
- Center for Proteins in Memory – PROMEMO, Danish National Research Foundation, Aarhus University, Aarhus, Denmark
- Valentina Khalil
- Department of Molecular Biology and Genetics, Aarhus University, Aarhus, Denmark
- DANDRITE, The Danish Research Institute of Translational Neuroscience, Aarhus University, Aarhus, Denmark
- Center for Proteins in Memory – PROMEMO, Danish National Research Foundation, Aarhus University, Aarhus, Denmark
- Wen-Hsien Hou
- Department of Biomedicine, Aarhus University, Aarhus, Denmark
- Andrea Moreno
- Department of Molecular Biology and Genetics, Aarhus University, Aarhus, Denmark
- DANDRITE, The Danish Research Institute of Translational Neuroscience, Aarhus University, Aarhus, Denmark
- Center for Proteins in Memory – PROMEMO, Danish National Research Foundation, Aarhus University, Aarhus, Denmark
- Niels Andersen
- Department of Molecular Biology and Genetics, Aarhus University, Aarhus, Denmark
- DANDRITE, The Danish Research Institute of Translational Neuroscience, Aarhus University, Aarhus, Denmark
- Center for Proteins in Memory – PROMEMO, Danish National Research Foundation, Aarhus University, Aarhus, Denmark
- Rosalina Fonseca
- Cellular and Systems Neurobiology, Universidade Nova de Lisboa, Lisbon, Portugal
- Joaquin Piriz
- Instituto de Fisiología, Biología Molecular y Neurociencias (IFIBYNE), Universidad de Buenos Aires, Buenos Aires, Argentina
- Marco Capogna
- Department of Biomedicine, Aarhus University, Aarhus, Denmark
- Center for Proteins in Memory – PROMEMO, Danish National Research Foundation, Aarhus University, Aarhus, Denmark
- Sadegh Nabavi
- Department of Molecular Biology and Genetics, Aarhus University, Aarhus, Denmark
- DANDRITE, The Danish Research Institute of Translational Neuroscience, Aarhus University, Aarhus, Denmark
- Center for Proteins in Memory – PROMEMO, Danish National Research Foundation, Aarhus University, Aarhus, Denmark
3
Bredenberg C, Savin C. Desiderata for Normative Models of Synaptic Plasticity. Neural Comput 2024; 36:1245-1285. PMID: 38776950. DOI: 10.1162/neco_a_01671.
Abstract
Normative models of synaptic plasticity use computational rationales to arrive at predictions of behavioral and network-level adaptive phenomena. In recent years, there has been an explosion of theoretical work in this realm, but experimental confirmation remains limited. In this review, we organize work on normative plasticity models in terms of a set of desiderata that, when satisfied, are designed to ensure that a given model demonstrates a clear link between plasticity and adaptive behavior, is consistent with known biological evidence about neural plasticity and yields specific testable predictions. As a prototype, we include a detailed analysis of the REINFORCE algorithm. We also discuss how new models have begun to improve on the identified criteria and suggest avenues for further development. Overall, we provide a conceptual guide to help develop neural learning theories that are precise, powerful, and experimentally testable.
Affiliation(s)
- Colin Bredenberg
- Center for Neural Science, New York University, New York, NY 10003, USA
- Mila-Quebec AI Institute, Montréal, QC H2S 3H1, Canada
- Cristina Savin
- Center for Neural Science, New York University, New York, NY 10003, USA
- Center for Data Science, New York University, New York, NY 10011, USA
4
Vignoud G, Venance L, Touboul JD. Anti-Hebbian plasticity drives sequence learning in striatum. Commun Biol 2024; 7:555. PMID: 38724614. PMCID: PMC11082161. DOI: 10.1038/s42003-024-06203-8.
Abstract
Spatio-temporal activity patterns have been observed in a variety of brain areas in spontaneous activity, prior to or during action, or in response to stimuli. The biological mechanisms endowing neurons with the ability to distinguish between different sequences remain largely unknown. Learning sequences of spikes raises multiple challenges, such as maintaining spike history in memory and discriminating partially overlapping sequences. Here, we show that anti-Hebbian spike-timing-dependent plasticity (STDP), as observed at cortico-striatal synapses, can naturally lead to learning spike sequences. We design a spiking model of the striatal output neuron receiving spike patterns defined as sequential input from a fixed set of cortical neurons. We use a simple synaptic plasticity rule that combines anti-Hebbian STDP and non-associative potentiation for a subset of the presented patterns, called rewarded patterns. We study the ability of striatal output neurons to discriminate rewarded from non-rewarded patterns by firing only after the presentation of a rewarded pattern. In particular, we show that two biological properties of striatal networks, spiking latency and collateral inhibition, contribute to an increase in accuracy by allowing a better discrimination of partially overlapping sequences. These results suggest that anti-Hebbian STDP may serve as a biological substrate for learning sequences of spikes.
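The sign convention of the anti-Hebbian kernel described here can be sketched in a few lines: pre-before-post pairings depress the synapse and post-before-pre pairings potentiate it, the mirror image of classical Hebbian STDP. The amplitudes and time constant below are illustrative assumptions, not the paper's fitted cortico-striatal values.

```python
import math

def anti_hebbian_stdp(dt_spike, a_plus=0.8, a_minus=1.0, tau=20.0):
    """Weight change for a single pre/post spike pair.
    dt_spike = t_post - t_pre, in ms."""
    if dt_spike > 0:
        # pre leads post -> depression (anti-Hebbian sign)
        return -a_minus * math.exp(-dt_spike / tau)
    else:
        # post leads pre -> potentiation
        return a_plus * math.exp(dt_spike / tau)

# Pre fires 10 ms before post: the synapse is weakened.
dw_pre_first = anti_hebbian_stdp(10.0)
# Post fires 10 ms before pre: the synapse is strengthened.
dw_post_first = anti_hebbian_stdp(-10.0)
```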
Affiliation(s)
- Gaëtan Vignoud
- Center for Interdisciplinary Research in Biology (CIRB), Collège de France, CNRS, INSERM, Université PSL, Paris, France
- Laurent Venance
- Center for Interdisciplinary Research in Biology (CIRB), Collège de France, CNRS, INSERM, Université PSL, Paris, France
- Jonathan D Touboul
- Department of Mathematics and Volen National Center for Complex Systems, Brandeis University, Waltham, MA, USA
5
Albesa-González A, Clopath C. Learning with filopodia and spines: Complementary strong and weak competition lead to specialized, graded, and protected receptive fields. PLoS Comput Biol 2024; 20:e1012110. PMID: 38743789. PMCID: PMC11125506. DOI: 10.1371/journal.pcbi.1012110.
Abstract
Filopodia are thin synaptic protrusions that have long been known to play an important role in early development. Recently, they have been found to be more abundant in the adult cortex than previously thought, and more plastic than spines (button-shaped mature synapses). Inspired by these findings, we introduce a new model of synaptic plasticity that jointly describes learning of filopodia and spines. The model assumes that filopodia exhibit strongly competitive learning dynamics, similar to additive spike-timing-dependent plasticity (STDP). At the same time, it proposes that, if filopodia undergo sufficient potentiation, they consolidate into spines. Spines follow weakly competitive learning, classically associated with multiplicative, soft-bounded models of STDP. This makes spines more stable and sensitive to the fine structure of input correlations. We show that our learning rule has a selectivity comparable to additive STDP and captures input correlations as well as multiplicative models of STDP do. We also show how it can protect previously formed memories and perform synaptic consolidation. Overall, our results can be seen as a phenomenological description of how filopodia and spines could cooperate to overcome the individual difficulties faced by strong and weak competition mechanisms.
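The three ingredients named in this abstract can be caricatured as follows: filopodia follow an additive, hard-bounded (strongly competitive) update; spines follow a multiplicative, soft-bounded (weakly competitive) update; and a filopodium consolidates into a spine once it potentiates past a threshold. The update forms and all numbers here are assumptions for demonstration, not the authors' exact equations.

```python
def update(weight, is_spine, dw, w_max=1.0, consolidate_at=0.5):
    """One plasticity event for a single contact."""
    if is_spine:
        # multiplicative / soft-bounded: LTP scales with remaining headroom,
        # LTD scales with current weight -> weak competition, graded weights
        weight += dw * (w_max - weight) if dw > 0 else dw * weight
    else:
        # additive with hard bounds -> strong competition among filopodia
        weight = min(w_max, max(0.0, weight + dw))
        if weight >= consolidate_at:
            is_spine = True   # filopodium consolidates into a spine
    return weight, is_spine

# A repeatedly potentiated filopodium crosses the threshold, becomes a
# spine, and from then on grows in a graded, saturating fashion.
w, spine = 0.1, False
for _ in range(6):
    w, spine = update(w, spine, 0.1)
```

The switch from hard to soft bounds at consolidation is what, in this toy picture, makes the resulting spine both specialized (it won the strong competition) and protected (further updates are graded and bounded).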
Affiliation(s)
- Claudia Clopath
- Department of Bioengineering, Imperial College London, London, United Kingdom
6
Fitz H, Hagoort P, Petersson KM. Neurobiological Causal Models of Language Processing. Neurobiology of Language 2024; 5:225-247. PMID: 38645618. PMCID: PMC11025648. DOI: 10.1162/nol_a_00133.
Abstract
The language faculty is physically realized in the neurobiological infrastructure of the human brain. Despite significant efforts, an integrated understanding of this system remains a formidable challenge. What is missing from most theoretical accounts is a specification of the neural mechanisms that implement language function. Computational models that have been put forward generally lack an explicit neurobiological foundation. We propose a neurobiologically informed causal modeling approach which offers a framework for how to bridge this gap. A neurobiological causal model is a mechanistic description of language processing that is grounded in, and constrained by, the characteristics of the neurobiological substrate. It intends to model the generators of language behavior at the level of implementational causality. We describe key features and neurobiological component parts from which causal models can be built and provide guidelines on how to implement them in model simulations. Then we outline how this approach can shed new light on the core computational machinery for language, the long-term storage of words in the mental lexicon and combinatorial processing in sentence comprehension. In contrast to cognitive theories of behavior, causal models are formulated in the "machine language" of neurobiology which is universal to human cognition. We argue that neurobiological causal modeling should be pursued in addition to existing approaches. Eventually, this approach will allow us to develop an explicit computational neurobiology of language.
Affiliation(s)
- Hartmut Fitz
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- Neurobiology of Language Department, Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Peter Hagoort
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- Neurobiology of Language Department, Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Karl Magnus Petersson
- Neurobiology of Language Department, Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Faculty of Medicine and Biomedical Sciences, University of Algarve, Faro, Portugal
7
Tomé DF, Zhang Y, Aida T, Mosto O, Lu Y, Chen M, Sadeh S, Roy DS, Clopath C. Dynamic and selective engrams emerge with memory consolidation. Nat Neurosci 2024; 27:561-572. PMID: 38243089. PMCID: PMC10917686. DOI: 10.1038/s41593-023-01551-w.
Abstract
Episodic memories are encoded by experience-activated neuronal ensembles that remain necessary and sufficient for recall. However, the temporal evolution of memory engrams after initial encoding is unclear. In this study, we employed computational and experimental approaches to examine how the neural composition and selectivity of engrams change with memory consolidation. Our spiking neural network model yielded testable predictions: memories transition from unselective to selective as neurons drop out of and drop into engrams; inhibitory activity during recall is essential for memory selectivity; and inhibitory synaptic plasticity during memory consolidation is critical for engrams to become selective. Using activity-dependent labeling, longitudinal calcium imaging and a combination of optogenetic and chemogenetic manipulations in mouse dentate gyrus, we conducted contextual fear conditioning experiments that supported our model's predictions. Our results reveal that memory engrams are dynamic and that changes in engram composition mediated by inhibitory plasticity are crucial for the emergence of memory selectivity.
Affiliation(s)
- Douglas Feitosa Tomé
- Department of Bioengineering, Imperial College London, London, UK
- Institute of Science and Technology Austria, Klosterneuburg, Austria
- Ying Zhang
- McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA
- Center for Life Sciences & IDG/McGovern Institute for Brain Research, School of Life Sciences, Tsinghua University, Beijing, China
- Tomomi Aida
- McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA
- Olivia Mosto
- McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA
- Yifeng Lu
- Center for Life Sciences & IDG/McGovern Institute for Brain Research, School of Life Sciences, Tsinghua University, Beijing, China
- Mandy Chen
- McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA
- Sadra Sadeh
- Department of Bioengineering, Imperial College London, London, UK
- Department of Brain Sciences, Imperial College London, London, UK
- Dheeraj S Roy
- Stanley Center for Psychiatric Research, Broad Institute of MIT and Harvard, Cambridge, MA, USA
- Department of Physiology and Biophysics, Jacobs School of Medicine and Biomedical Sciences, State University of New York at Buffalo, Buffalo, NY, USA
- Claudia Clopath
- Department of Bioengineering, Imperial College London, London, UK
8
Feng Y, Brunel N. Attractor neural networks with double well synapses. PLoS Comput Biol 2024; 20:e1011354. PMID: 38324630. PMCID: PMC10878535. DOI: 10.1371/journal.pcbi.1011354.
Abstract
It is widely believed that memory storage depends on activity-dependent synaptic modifications. Classical studies of learning and memory in neural networks describe synaptic efficacy either as continuous or discrete. However, recent results suggest an intermediate scenario in which synaptic efficacy can be described by a continuous variable, but whose distribution is peaked around a small set of discrete values. Motivated by these results, we explored a model in which each synapse is described by a continuous variable that evolves in a potential with multiple minima. External inputs to the network can switch synapses from one potential well to another. Our analytical and numerical results show that this model can interpolate between models with discrete synapses which correspond to the deep potential limit, and models in which synapses evolve in a single quadratic potential. We find that the storage capacity of the network with double well synapses exhibits a power law dependence on the network size, rather than the logarithmic dependence observed in models with single well synapses. In addition, synapses with deeper potential wells lead to more robust information storage in the presence of noise. When memories are sparsely encoded, the scaling of the capacity with network size is similar to previously studied network models in the sparse coding limit.
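The central object of this abstract, a synaptic efficacy evolving in a potential with multiple minima, can be sketched with a single bistable well. Below, the efficacy follows gradient descent on the illustrative potential U(w) = (w² − 1)²/4, whose minima at w = ±1 stand in for the discrete depressed/potentiated states; the specific potential, step size, and kick sizes are assumptions for demonstration.

```python
def relax(w, steps=2000, dt=0.01):
    """Let the efficacy settle by gradient descent on the double well
    U(w) = (w^2 - 1)^2 / 4, i.e. dw/dt = -dU/dw = -w (w^2 - 1)."""
    for _ in range(steps):
        w -= dt * w * (w * w - 1.0)
    return w

w = relax(0.9)                  # settles into the well near w = +1
w_small_kick = relax(w - 0.5)   # weak input: stays in the +1 basin
w_large_kick = relax(w - 2.0)   # strong input: switches to the -1 well
```

This is the sense in which the model interpolates between regimes: a very deep well behaves like a discrete binary synapse (only large inputs switch it), while a shallow, single quadratic well would track inputs continuously.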
Affiliation(s)
- Yu Feng
- Department of Physics, Duke University, Durham, North Carolina, United States of America
- Nicolas Brunel
- Department of Physics, Duke University, Durham, North Carolina, United States of America
- Department of Neurobiology, Duke University, Durham, North Carolina, United States of America
9
O'Donnell C. Nonlinear slow-timescale mechanisms in synaptic plasticity. Curr Opin Neurobiol 2023; 82:102778. PMID: 37657186. DOI: 10.1016/j.conb.2023.102778.
Abstract
Learning and memory rely on synapses changing their strengths in response to neural activity. However, there is a substantial gap between the timescales of neural electrical dynamics (1-100 ms) and organism behaviour during learning (seconds-minutes). What mechanisms bridge this timescale gap? What are the implications for theories of brain learning? Here I first cover experimental evidence for slow-timescale factors in plasticity induction. Then I review possible underlying cellular and synaptic mechanisms, and insights from recent computational models that incorporate such slow-timescale variables. I conclude that future progress in understanding brain learning across timescales will require both experimental and computational modelling studies that map out the nonlinearities implemented by both fast and slow plasticity mechanisms at synapses, and crucially, their joint interactions.
Affiliation(s)
- Cian O'Donnell
- School of Computing, Engineering, and Intelligent Systems, Magee Campus, Ulster University, Derry/Londonderry, UK; School of Computer Science, Electrical and Electronic Engineering, and Engineering Maths, University of Bristol, Bristol, UK.
10
Bredenberg C, Savin C. Desiderata for normative models of synaptic plasticity. arXiv 2023: arXiv:2308.04988v1. PMID: 37608931. PMCID: PMC10441445.
Abstract
Normative models of synaptic plasticity use a combination of mathematics and computational simulations to arrive at predictions of behavioral and network-level adaptive phenomena. In recent years, there has been an explosion of theoretical work on these models, but experimental confirmation is relatively limited. In this review, we organize work on normative plasticity models in terms of a set of desiderata which, when satisfied, are designed to guarantee that a model has a clear link between plasticity and adaptive behavior, consistency with known biological evidence about neural plasticity, and specific testable predictions. We then discuss how new models have begun to improve on these criteria and suggest avenues for further development. As prototypes, we provide detailed analyses of two specific models - REINFORCE and the Wake-Sleep algorithm. We provide a conceptual guide to help develop neural learning theories that are precise, powerful, and experimentally testable.
Affiliation(s)
- Colin Bredenberg
- Center for Neural Science, New York University, New York, NY 10003, USA
- Mila-Quebec AI Institute, 6666 Rue Saint-Urbain, Montréal, QC H2S 3H1, Canada
- Cristina Savin
- Center for Neural Science, New York University, New York, NY 10003, USA
- Center for Data Science, New York University, New York, NY 10011, USA
11
Wagle S, Kraynyukova N, Hafner AS, Tchumatchenko T. Computational insights into mRNA and protein dynamics underlying synaptic plasticity rules. Mol Cell Neurosci 2023; 125:103846. PMID: 36963534. DOI: 10.1016/j.mcn.2023.103846.
Abstract
Recent advances in experimental techniques provide an unprecedented peek into the intricate molecular dynamics inside synapses and dendrites. The experimental insights into the molecular turnover revealed that such processes as diffusion, active transport, spine uptake, and local protein synthesis could dynamically modulate the copy numbers of plasticity-related molecules in synapses. Subsequently, theoretical models were designed to understand the interaction of these processes better and to explain how local synaptic plasticity cues can up or down-regulate the molecular copy numbers across synapses. In this review, we discuss the recent advances in experimental techniques and computational models to highlight how these complementary approaches can provide insight into molecular cross-talk across synapses, ultimately allowing us to develop biologically-inspired neural network models to understand brain function.
Affiliation(s)
- Surbhit Wagle
- Institute for Physiological Chemistry, University Medical Center of the Johannes Gutenberg-University Mainz, Anselm-Franz-von-Bentzel-Weg 3, 55128 Mainz, Germany
- Nataliya Kraynyukova
- Institute of Experimental Epileptology and Cognition Research, University of Bonn, Venusberg-Campus 1, 53127 Bonn, Germany
- Anne-Sophie Hafner
- Donders Institute for Brain, Cognition and Behaviour, Nijmegen, Netherlands; Faculty of Science, Radboud University, Nijmegen, Netherlands
- Tatjana Tchumatchenko
- Institute for Physiological Chemistry, University Medical Center of the Johannes Gutenberg-University Mainz, Anselm-Franz-von-Bentzel-Weg 3, 55128 Mainz, Germany; Institute of Experimental Epileptology and Cognition Research, University of Bonn, Venusberg-Campus 1, 53127 Bonn, Germany
12
Sheynikhovich D, Otani S, Bai J, Arleo A. Long-term memory, synaptic plasticity and dopamine in rodent medial prefrontal cortex: Role in executive functions. Front Behav Neurosci 2023; 16:1068271. PMID: 36710953. PMCID: PMC9875091. DOI: 10.3389/fnbeh.2022.1068271.
Abstract
Mnemonic functions supporting rodent behavior in complex tasks include both long-term and (short-term) working memory components. While working memory is thought to rely on persistent activity states in an active neural network, long-term memory and synaptic plasticity contribute to the formation of the underlying synaptic structure, determining the range of possible states. Whereas the implication of working memory in executive functions, mediated by the prefrontal cortex (PFC) in primates and rodents, has been extensively studied, the contribution of the long-term memory component to these tasks has received little attention. This review summarizes available experimental data and theoretical work concerning cellular mechanisms of synaptic plasticity in the medial region of rodent PFC and the link between plasticity, memory and behavior in PFC-dependent tasks. Special attention is devoted to the unique properties of dopaminergic modulation of prefrontal synaptic plasticity and its contribution to executive functions.
Affiliation(s)
- Denis Sheynikhovich
- Sorbonne Université, INSERM, CNRS, Institut de la Vision, Paris, France
- Satoru Otani
- Sorbonne Université, INSERM, CNRS, Institut de la Vision, Paris, France
- Jing Bai
- Institute of Psychiatry and Neuroscience of Paris, INSERM U1266, Paris, France
- Angelo Arleo
- Sorbonne Université, INSERM, CNRS, Institut de la Vision, Paris, France
13
Organization and Priming of Long-term Memory Representations with Two-phase Plasticity. Cognit Comput 2022. DOI: 10.1007/s12559-022-10021-7.
Abstract
Background / Introduction
In recurrent neural networks in the brain, memories are represented by so-called Hebbian cell assemblies. Such assemblies are groups of neurons with particularly strong synaptic connections formed by synaptic plasticity and consolidated by synaptic tagging and capture (STC). To link these synaptic mechanisms to long-term memory on the level of cognition and behavior, their functional implications on the level of neural networks have to be understood.
Methods
We employ a biologically detailed recurrent network of spiking neurons featuring synaptic plasticity and STC to model the learning and consolidation of long-term memory representations. Using this, we investigate the effects of different organizational paradigms, and of priming stimulation, on the functionality of multiple memory representations. We quantify these effects by the spontaneous activation of memory representations driven by background noise.
Results
We find that the learning order of the memory representations significantly biases the likelihood of activation towards more recently learned representations, and that hub-like overlap structure counters this effect. We identify long-term depression as the mechanism underlying these findings. Finally, we demonstrate that STC has functional consequences for the interaction of long-term memory representations: (1) intermediate consolidation in between learning the individual representations strongly alters the previously described effects, and (2) STC enables the priming of a long-term memory representation on a timescale of minutes to hours.
Conclusion
Our findings show how synaptic and neuronal mechanisms can provide an explanatory basis for known cognitive effects.
14
Chindemi G, Abdellah M, Amsalem O, Benavides-Piccione R, Delattre V, Doron M, Ecker A, Jaquier AT, King J, Kumbhar P, Monney C, Perin R, Rössert C, Tuncel AM, Van Geit W, DeFelipe J, Graupner M, Segev I, Markram H, Muller EB. A calcium-based plasticity model for predicting long-term potentiation and depression in the neocortex. Nat Commun 2022; 13:3038. PMID: 35650191. PMCID: PMC9160074. DOI: 10.1038/s41467-022-30214-w.
Abstract
Pyramidal cells (PCs) form the backbone of the layered structure of the neocortex, and plasticity of their synapses is thought to underlie learning in the brain. However, such long-term synaptic changes have been experimentally characterized between only a few types of PCs, posing a significant barrier for studying neocortical learning mechanisms. Here we introduce a model of synaptic plasticity based on data-constrained postsynaptic calcium dynamics, and show in a neocortical microcircuit model that a single parameter set is sufficient to unify the available experimental findings on long-term potentiation (LTP) and long-term depression (LTD) of PC connections. In particular, we find that the diverse plasticity outcomes across the different PC types can be explained by cell-type-specific synaptic physiology, cell morphology and innervation patterns, without requiring type-specific plasticity. Generalizing the model to in vivo extracellular calcium concentrations, we predict qualitatively different plasticity dynamics from those observed in vitro. This work provides a first comprehensive null model for LTP/LTD between neocortical PC types in vivo, and an open framework for further developing models of cortical synaptic plasticity.
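The family of calcium-based rules this model belongs to can be sketched with two thresholds on a decaying postsynaptic calcium trace: each pre/post spike adds a calcium transient, efficacy is driven up while calcium exceeds a potentiation threshold, and down while it sits in an intermediate band above a depression threshold. All amplitudes, thresholds and rates below are illustrative assumptions, not the paper's fitted neocortical parameters.

```python
def run(pre_spikes, post_spikes, t_end=200.0, dt=0.1,
        c_pre=0.6, c_post=0.9, tau_ca=20.0,
        theta_d=0.5, theta_p=1.1, gamma_d=0.005, gamma_p=0.05):
    """Two-threshold calcium rule for one synapse; spike times in ms."""
    c, w, t = 0.0, 0.5, 0.0
    pre, post = sorted(pre_spikes), sorted(post_spikes)
    while t < t_end:
        if pre and t >= pre[0]:
            c += c_pre; pre.pop(0)    # presynaptic calcium influx
        if post and t >= post[0]:
            c += c_post; post.pop(0)  # postsynaptic influx (back-prop. spike)
        if c > theta_p:
            w += gamma_p * dt         # above potentiation threshold -> LTP
        elif c > theta_d:
            w -= gamma_d * dt         # intermediate calcium band -> LTD
        c *= (1 - dt / tau_ca)        # calcium decays between spikes
        t += dt
    return w

# Near-coincident pre/post pairs summate over the LTP threshold...
w_paired = run([10.0, 60.0, 110.0], [12.0, 62.0, 112.0])
# ...while pre spikes alone only reach the depression band.
w_pre_only = run([10.0, 60.0, 110.0], [])
```

Because plasticity outcomes depend only on how often the calcium trace visits each band, cell-type-specific physiology and innervation can change the outcome without changing the rule itself, which is the logic the abstract describes.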
Collapse
Affiliation(s)
- Giuseppe Chindemi
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland.
| | - Marwan Abdellah
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
| | - Oren Amsalem
- Department of Neurobiology, the Hebrew University of Jerusalem, Jerusalem, Israel; Division of Endocrinology, Diabetes and Metabolism, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, 02215, USA
| | - Ruth Benavides-Piccione
- Instituto Cajal, Consejo Superior de Investigaciones Científicas, Madrid, Spain; Laboratorio Cajal de Circuitos Corticales, Centro de Tecnología Biomédica, Universidad Politécnica de Madrid, Madrid, Spain
| | - Vincent Delattre
- Laboratory of Neural Microcircuitry, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
| | - Michael Doron
- Edmond and Lily Safra Center for Brain Sciences, the Hebrew University of Jerusalem, Jerusalem, Israel
| | - András Ecker
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
| | - Aurélien T Jaquier
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
| | - James King
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
| | - Pramod Kumbhar
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
| | - Caitlin Monney
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
| | - Rodrigo Perin
- Laboratory of Neural Microcircuitry, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
| | - Christian Rössert
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
| | - Anil M Tuncel
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
| | - Werner Van Geit
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
| | - Javier DeFelipe
- Instituto Cajal, Consejo Superior de Investigaciones Científicas, Madrid, Spain; Laboratorio Cajal de Circuitos Corticales, Centro de Tecnología Biomédica, Universidad Politécnica de Madrid, Madrid, Spain
| | - Michael Graupner
- Université de Paris, SPPIN - Saints-Pères Paris Institute for the Neurosciences, CNRS, Paris, France
| | - Idan Segev
- Department of Neurobiology, the Hebrew University of Jerusalem, Jerusalem, Israel; Edmond and Lily Safra Center for Brain Sciences, the Hebrew University of Jerusalem, Jerusalem, Israel
| | - Henry Markram
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland; Laboratory of Neural Microcircuitry, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
| | - Eilif B Muller
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland; Department of Neurosciences, Faculty of Medicine, Université de Montréal, Montréal, QC, Canada; CHU Sainte-Justine Research Center, Montréal, QC, Canada; Quebec Artificial Intelligence Institute (Mila), Montréal, Canada
| |
Collapse
|
15
|
Ding Y, Wang Y, Cao L. A Simplified Plasticity Model Based on Synaptic Tagging and Capture Theory: Simplified STC. Front Comput Neurosci 2022; 15:798418. [PMID: 35221955 PMCID: PMC8873158 DOI: 10.3389/fncom.2021.798418] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/20/2021] [Accepted: 12/27/2021] [Indexed: 01/06/2023] Open
Abstract
The formation and consolidation of memory play a vital role for survival in an ever-changing environment. In the brain, the change and stabilization of potentiated and depressed synapses are the neural basis of memory formation and maintenance. These changes can be induced by rather short stimuli (only a few seconds or even less) but should then be stable for months or years. Recently, the neural mechanisms that convert the rapid changes of the early phase of synaptic plasticity into a stable memory trace in the late phase have become increasingly clear at the protein and molecular levels, and the synaptic tagging and capture (STC) theory is one of the most popular accounts. According to the STC theory, the change and stabilization of synaptic efficacy depend mainly on three calcium-related processes: synaptic tagging, synthesis of plasticity-related products (PRPs), and the capture of PRPs by tagged synapses. Several computational models have been proposed on the basis of the STC theory, but they rarely achieve simplicity and biological interpretability at the same time. Here, we propose a simplified STC (SM-STC) model to address this issue. In the SM-STC model, the calcium concentration in each neuronal compartment and synapse is first calculated; the tag state of each synapse and the PRP level are then updated; and the coupling of tagged synapses with PRPs determines the plasticity state of the synapse, either potentiation or depression. We simulated the Schaffer collateral pathway of the hippocampus targeting a multicompartment CA1 neuron over several hours of biological time. The results show that the SM-STC model reproduces a broad range of phenomena known from physiological experiments, including long-term potentiation induced by high-frequency stimuli, long-term depression induced by low-frequency stimuli, and cross-capture with two stimuli separated by a delay. Thus, the SM-STC model proposed in this study provides an effective learning rule for brain-like computation while ensuring biological plausibility and computational efficiency.
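The tag/PRP/capture loop this abstract describes can be condensed into a toy update rule. The thresholds, time constants and capture rate below are invented for illustration; they are not the SM-STC model's parameters.

```python
# Illustrative sketch of the tag / PRP / capture logic (invented numbers).
TAG_THRESHOLD = 0.7            # calcium level that sets a synaptic tag
PRP_THRESHOLD = 0.9            # stronger drive also triggers PRP synthesis
TAU_TAG, TAU_PRP = 60.0, 120.0 # decay time constants (minutes)

def step(state, calcium, dt=1.0):
    """Advance tag, PRP and weight by dt given the current calcium level."""
    tag, prp, w = state
    tag *= (1.0 - dt / TAU_TAG)          # tags are transient
    prp *= (1.0 - dt / TAU_PRP)          # PRPs degrade too
    if calcium > TAG_THRESHOLD:
        tag = 1.0                        # early-phase change sets a tag
    if calcium > PRP_THRESHOLD:
        prp = 1.0                        # strong induction makes PRPs
    # capture: a tagged synapse that sees PRPs consolidates (late phase)
    w += 0.05 * tag * prp * dt
    return (tag, prp, w)

# Weak stimulus alone: a tag is set but no PRPs, so no late-phase change.
state = (0.0, 0.0, 0.0)
state = step(state, calcium=0.8)
for _ in range(120):
    state = step(state, calcium=0.0)
w_weak = state[2]

# Weak stimulus followed by a strong one: PRPs are captured by the tag.
state = (0.0, 0.0, 0.0)
state = step(state, calcium=0.8)
state = step(state, calcium=1.0)
for _ in range(120):
    state = step(state, calcium=0.0)
w_rescued = state[2]
```

The second run reproduces, in miniature, the capture logic behind cross-capture: a weakly stimulated, tagged synapse consolidates only if PRPs become available within the tag's lifetime.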
Collapse
Affiliation(s)
- Yiwen Ding
- State Key Laboratory of Media Convergence and Communication, Communication University of China, Beijing, China
- Neuroscience and Intelligent Media Institute, Communication University of China, Beijing, China
| | - Ye Wang
- State Key Laboratory of Media Convergence and Communication, Communication University of China, Beijing, China
- Neuroscience and Intelligent Media Institute, Communication University of China, Beijing, China
- *Correspondence: Ye Wang
| | - Lihong Cao
- State Key Laboratory of Media Convergence and Communication, Communication University of China, Beijing, China
- Neuroscience and Intelligent Media Institute, Communication University of China, Beijing, China
- State Key Laboratory of Mathematical Engineering and Advanced Computing, Wuxi, China
| |
Collapse
|
16
|
Tomé DF, Sadeh S, Clopath C. Coordinated hippocampal-thalamic-cortical communication crucial for engram dynamics underneath systems consolidation. Nat Commun 2022; 13:840. [PMID: 35149680 PMCID: PMC8837777 DOI: 10.1038/s41467-022-28339-z] [Citation(s) in RCA: 10] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2021] [Accepted: 01/13/2022] [Indexed: 11/09/2022] Open
Abstract
Systems consolidation refers to the time-dependent reorganization of memory representations or engrams across brain regions. Despite recent advancements in unravelling this process, the exact mechanisms behind engram dynamics and the role of associated pathways remain largely unknown. Here we propose a biologically-plausible computational model to address this knowledge gap. By coordinating synaptic plasticity timescales and incorporating a hippocampus-thalamus-cortex circuit, our model is able to couple engram reactivations across these regions and thereby reproduce key dynamics of cortical and hippocampal engram cells along with their interdependencies. Decoupling hippocampal-thalamic-cortical activity disrupts systems consolidation. Critically, our model yields testable predictions regarding hippocampal and thalamic engram cells, inhibitory engrams, thalamic inhibitory input, and the effect of thalamocortical synaptic coupling on retrograde amnesia induced by hippocampal lesions. Overall, our results suggest that systems consolidation emerges from coupled reactivations of engram cells in distributed brain regions enabled by coordinated synaptic plasticity timescales in multisynaptic subcortical-cortical circuits.
Collapse
Affiliation(s)
| | - Sadra Sadeh
- Department of Bioengineering, Imperial College London, London, UK
| | - Claudia Clopath
- Department of Bioengineering, Imperial College London, London, UK.
| |
Collapse
|
17
|
Sarazin MXB, Victor J, Medernach D, Naudé J, Delord B. Online Learning and Memory of Neural Trajectory Replays for Prefrontal Persistent and Dynamic Representations in the Irregular Asynchronous State. Front Neural Circuits 2021; 15:648538. [PMID: 34305535 PMCID: PMC8298038 DOI: 10.3389/fncir.2021.648538] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/31/2020] [Accepted: 05/31/2021] [Indexed: 11/13/2022] Open
Abstract
In the prefrontal cortex (PFC), higher-order cognitive functions and adaptive flexible behaviors rely on continuous dynamical sequences of spiking activity that constitute neural trajectories in the state space of activity. Neural trajectories subserve diverse representations, from explicit mappings in physical spaces to generalized mappings in the task space, and up to complex abstract transformations such as working memory, decision-making and behavioral planning. Computational models have assessed learning and replay of neural trajectories separately, often using unrealistic learning rules or decoupling the simulations for learning from those for replay. Hence, the question remains open of how neural trajectories are learned, memorized and replayed online, with permanently acting biological plasticity rules. The asynchronous irregular regime that characterizes cortical dynamics in awake conditions is a major source of disorder that may jeopardize plasticity and replay of locally ordered activity. Here, we show that a recurrent model of local PFC circuitry endowed with realistic synaptic spike timing-dependent plasticity and scaling processes can learn, memorize and replay large-size neural trajectories online under asynchronous irregular dynamics, at regular or fast (sped-up) timescales. Presented trajectories are quickly learned (within seconds) as synaptic engrams in the network, and the model is able to chunk overlapping trajectories presented separately. These trajectory engrams last long-term (dozens of hours) and trajectory replays can be triggered over an hour later. In turn, we show the conditions under which trajectory engrams and replays preserve asynchronous irregular dynamics in the network. Functionally, spiking activity during trajectory replays at the regular timescale accounts for dynamical coding with temporal tuning in individual neurons, persistent activity at the population level, and large levels of variability consistent with observed cognitive-related PFC dynamics. Together, these results offer a consistent theoretical framework for how neural trajectories can be learned, memorized and replayed in PFC network circuits to subserve flexible dynamic representations and adaptive behaviors.
Collapse
Affiliation(s)
- Matthieu X B Sarazin
- Institut des Systèmes Intelligents et de Robotique, CNRS, Inserm, Sorbonne Université, Paris, France
| | - Julie Victor
- CEA Paris-Saclay, CNRS, NeuroSpin, Saclay, France
| | - David Medernach
- Institut des Systèmes Intelligents et de Robotique, CNRS, Inserm, Sorbonne Université, Paris, France
| | - Jérémie Naudé
- Neuroscience Paris Seine - Institut de biologie Paris Seine, CNRS, Inserm, Sorbonne Université, Paris, France
| | - Bruno Delord
- Institut des Systèmes Intelligents et de Robotique, CNRS, Inserm, Sorbonne Université, Paris, France
| |
Collapse
|
18
|
Bin Ibrahim MZ, Benoy A, Sajikumar S. Long-term plasticity in the hippocampus: maintaining within and 'tagging' between synapses. FEBS J 2021; 289:2176-2201. [PMID: 34109726 DOI: 10.1111/febs.16065] [Citation(s) in RCA: 32] [Impact Index Per Article: 10.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2021] [Revised: 05/15/2021] [Accepted: 06/01/2021] [Indexed: 12/11/2022]
Abstract
Synapses between neurons are malleable biochemical structures, strengthening and diminishing over time dependent on the type of information they receive. This phenomenon known as synaptic plasticity underlies learning and memory, and its different forms, long-term potentiation (LTP) and long-term depression (LTD), perform varied cognitive roles in reinforcement, relearning and associating memories. Moreover, both LTP and LTD can exist in an early transient form (early-LTP/LTD) or a late persistent form (late-LTP/LTD), which are triggered by different induction protocols, and also differ in their dependence on protein synthesis and the involvement of key molecular players. Beyond homosynaptic modifications, synapses can also interact with one another. This is encapsulated in the synaptic tagging and capture hypothesis (STC), where synapses expressing early-LTP/LTD present a 'tag' that can capture the protein synthesis products generated during a temporally proximal late-LTP/LTD induction. This 'tagging' phenomenon forms the framework of synaptic interactions in various conditions and accounts for the cellular basis of the time-dependent associativity of short-lasting and long-lasting memories. All these synaptic modifications take place under controlled neuronal conditions, regulated by subcellular elements such as epigenetic regulation, proteasomal degradation and neuromodulatory signals. Here, we review current understanding of the different forms of synaptic plasticity and its regulatory mechanisms in the hippocampus, a brain region critical for memory formation. We also discuss expression of plasticity in hippocampal CA2 area, a long-overlooked narrow hippocampal subfield and the behavioural correlate of STC. Lastly, we put forth perspectives for an integrated view of memory representation in synapses.
Collapse
Affiliation(s)
- Mohammad Zaki Bin Ibrahim
- Department of Physiology, Yong Loo Lin School of Medicine, National University of Singapore, Singapore; Life Sciences Institute Neurobiology Programme, National University of Singapore, Singapore
| | - Amrita Benoy
- Department of Physiology, Yong Loo Lin School of Medicine, National University of Singapore, Singapore; Life Sciences Institute Neurobiology Programme, National University of Singapore, Singapore
| | - Sreedharan Sajikumar
- Department of Physiology, Yong Loo Lin School of Medicine, National University of Singapore, Singapore; Life Sciences Institute Neurobiology Programme, National University of Singapore, Singapore; Healthy Longevity Translational Research Programme, Yong Loo Lin School of Medicine, National University of Singapore, Singapore
| |
Collapse
|
19
|
Luboeinski J, Tetzlaff C. Memory consolidation and improvement by synaptic tagging and capture in recurrent neural networks. Commun Biol 2021; 4:275. [PMID: 33658641 PMCID: PMC7977149 DOI: 10.1038/s42003-021-01778-y] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/29/2020] [Accepted: 01/21/2021] [Indexed: 11/09/2022] Open
Abstract
The synaptic-tagging-and-capture (STC) hypothesis posits that, at each synapse, the concurrence of a tag with protein synthesis maintains the changes induced by synaptic plasticity. This hypothesis provides a biological principle underlying the synaptic consolidation of memories, but it had not been verified for recurrent neural circuits. We developed a theoretical model integrating the mechanisms underlying the STC hypothesis with calcium-based synaptic plasticity in a recurrent spiking neural network. In the model, calcium-based synaptic plasticity yields the formation of strongly interconnected cell assemblies encoding memories, followed by consolidation through the STC mechanisms. Furthermore, we show for the first time that STC mechanisms modify the storage of memories such that memory recall is significantly improved after several hours. We identify two contributing processes: a merely time-dependent passive improvement, and an active improvement during recall. The described characteristics can provide a new principle for storing information in biological and artificial neural circuits.
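A common way to formalize this consolidation step in STC-style models is to split each weight into an early-phase part h, which decays back to baseline, and a stable late-phase part z, which changes only when a tag coincides with available protein. The sketch below follows that convention with invented constants and deliberately simplified dynamics.

```python
# Early-/late-phase weight decomposition sketch (illustrative constants).
H0 = 0.5          # baseline early-phase weight
TAG_THETA = 0.2   # a tag is set while |h - H0| exceeds this
TAU_H = 60.0      # early-phase relaxation time (minutes)

def consolidate(h, z, protein, minutes):
    """Early-phase change h decays; if tagged and protein is present,
    the change is transferred into the stable late-phase weight z."""
    for _ in range(minutes):
        tagged = abs(h - H0) > TAG_THETA
        if tagged and protein > 0.5:
            z += 0.01 * (h - H0)          # capture: write into late phase
        h += (H0 - h) / TAU_H             # early phase relaxes to baseline
    return h, z

# With protein available, the induced change is consolidated into z;
# without it, the early-phase change simply decays away.
h1, z1 = consolidate(h=1.0, z=0.0, protein=1.0, minutes=180)
h2, z2 = consolidate(h=1.0, z=0.0, protein=0.0, minutes=180)
```

After three simulated hours the early-phase variable has relaxed in both runs; only the protein-supplied run retains a late-phase trace, which is the consolidation the abstract refers to.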
Collapse
Affiliation(s)
- Jannik Luboeinski
- Department of Computational Neuroscience, III. Institute of Physics-Biophysics, University of Göttingen, Göttingen, Germany.
- Bernstein Center for Computational Neuroscience, Göttingen, Germany.
| | - Christian Tetzlaff
- Department of Computational Neuroscience, III. Institute of Physics-Biophysics, University of Göttingen, Göttingen, Germany.
- Bernstein Center for Computational Neuroscience, Göttingen, Germany.
| |
Collapse
|
20
|
Embracing Change: Continual Learning in Deep Neural Networks. Trends Cogn Sci 2020; 24:1028-1040. [PMID: 33158755 DOI: 10.1016/j.tics.2020.09.004] [Citation(s) in RCA: 46] [Impact Index Per Article: 11.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/29/2020] [Revised: 09/04/2020] [Accepted: 09/08/2020] [Indexed: 11/21/2022]
Abstract
Artificial intelligence research has seen enormous progress over the past few decades, but it predominantly relies on fixed datasets and stationary environments. Continual learning is an increasingly relevant area of study that asks how artificial systems might learn sequentially, as biological systems do, from a continuous stream of correlated data. In the present review, we relate continual learning to the learning dynamics of neural networks, highlighting the potential it has to considerably improve data efficiency. We further consider the many new biologically inspired approaches that have emerged in recent years, focusing on those that utilize regularization, modularity, memory, and meta-learning, and highlight some of the most promising and impactful directions.
Collapse
|
21
|
Liu Z, Luo R, Fu R, Yuan C, Xu X, Zhou D, Zhao M, Yuan TF, Du J. The Influences of Impulsivity and Education Levels on Severity of Alcohol Dependence. Front Psychiatry 2020; 11:737. [PMID: 32848917 PMCID: PMC7419695 DOI: 10.3389/fpsyt.2020.00737] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/16/2020] [Accepted: 07/14/2020] [Indexed: 11/16/2022] Open
Abstract
BACKGROUND Impulsivity contributes to the severity of alcohol use disorder (AUD). The association is affected by expectation towards alcohol use, emotional regulation and self-control. Here we investigated the influences of self-reported impulsivity and level of education on the severity of alcohol dependence. METHOD We retrospectively analyzed basic demographic information, alcohol consumption, years of education, depression and anxiety state, the Alcohol Use Disorder Identification Test (AUDIT) and the Barratt Impulsiveness Scale (BIS) from a group of 66 AUD patients. RESULT Impulsivity significantly predicted alcohol dependence severity (R² = 0.069, F = 4.724, p = 0.034). In addition, years of education moderated the relationship between impulsivity and alcohol dependence severity (ΔR² = 0.059, F = 4.414, p = 0.040). CONCLUSION Self-reported impulsivity affects the severity of alcohol dependence, and this effect may differ between patients with different education levels.
Collapse
Affiliation(s)
- Ziqi Liu
- Shanghai Key Laboratory of Psychotic Disorders, Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai, China
| | - Ruyan Luo
- Department of Substance Abuse and Addiction, Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai, China
| | - Rao Fu
- Department of Substance Abuse and Addiction, Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai, China
| | - Chenxin Yuan
- Department of Substance Abuse and Addiction, Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai, China
| | - Xueming Xu
- Department of Addiction, Taizhou Second People's Hospital, Taizhou, China
| | - Dongsheng Zhou
- Department of Psychiatry, Ningbo Kangning Hospital, Ningbo, China
| | - Min Zhao
- Shanghai Key Laboratory of Psychotic Disorders, Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai, China; Department of Substance Abuse and Addiction, Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai, China
| | - Ti-Fei Yuan
- Shanghai Key Laboratory of Psychotic Disorders, Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai, China; Co-innovation Center of Neuroregeneration, Nantong University, Nantong, China
| | - Jiang Du
- Shanghai Key Laboratory of Psychotic Disorders, Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai, China; Department of Substance Abuse and Addiction, Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai, China
| |
Collapse
|
22
|
Helfer P, Shultz TR. A computational model of systems memory consolidation and reconsolidation. Hippocampus 2019; 30:659-677. [PMID: 31872960 DOI: 10.1002/hipo.23187] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/05/2019] [Revised: 10/05/2019] [Accepted: 12/04/2019] [Indexed: 12/27/2022]
Abstract
In the mammalian brain, newly acquired memories depend on the hippocampus (HPC) for maintenance and recall, but over time, the neocortex takes over these functions, rendering memories HPC-independent. The process responsible for this transformation is called systems memory consolidation. Reactivation of a well-consolidated memory can trigger a temporary return to a HPC-dependent state, a phenomenon known as systems memory reconsolidation. The neural mechanisms underlying systems memory consolidation and reconsolidation are not well understood. Here, we propose a neural model based on well-documented mechanisms of synaptic plasticity and stability and describe a computational implementation that demonstrates the model's ability to account for a range of findings from the systems consolidation and reconsolidation literature. We derive several predictions from the computational model and suggest experiments that may test its validity.
Collapse
Affiliation(s)
- Peter Helfer
- Department of Psychology, McGill University, 2001 McGill College, Montreal, QC, Canada
| | - Thomas R Shultz
- Department of Psychology, McGill University, 2001 McGill College, Montreal, QC, Canada
| |
Collapse
|
23
|
Gastaldi C, Muscinelli S, Gerstner W. Optimal Stimulation Protocol in a Bistable Synaptic Consolidation Model. Front Comput Neurosci 2019; 13:78. [PMID: 31798436 PMCID: PMC6874130 DOI: 10.3389/fncom.2019.00078] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/10/2019] [Accepted: 10/21/2019] [Indexed: 01/12/2023] Open
Abstract
Synaptic changes induced by neural activity need to be consolidated to maintain memory over a timescale of hours. In experiments, synaptic consolidation can be induced by repeating a stimulation protocol several times and the effectiveness of consolidation depends crucially on the repetition frequency of the stimulations. We address the question: is there an understandable reason why induction protocols with repetitions at some frequency work better than sustained protocols—even though the accumulated stimulation strength might be exactly the same in both cases? In real synapses, plasticity occurs on multiple time scales from seconds (induction), to several minutes (early phase of long-term potentiation) to hours and days (late phase of synaptic consolidation). We use a simplified mathematical model of just two times scales to elucidate the above question in a purified setting. Our mathematical results show that, even in such a simple model, the repetition frequency of stimulation plays an important role for the successful induction, and stabilization, of potentiation.
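The two-timescale idea can be illustrated with a toy model: a fast variable tracks the induction protocol and decays quickly, while a slow bistable variable consolidates only if repetitions push it past its unstable fixed point before it relaxes back. All parameters below are illustrative, not the paper's.

```python
# Two-timescale toy: repetition frequency decides whether the slow
# (consolidation) variable crosses its unstable fixed point.
TAU_FAST = 5.0      # early-phase (fast) decay time constant, minutes
TAU_SLOW = 50.0     # consolidation (slow) timescale, minutes
THETA = 0.5         # unstable fixed point between the two stable states

def run(pulse_times, t_end, dt=1.0):
    """Return the slow variable after repeating an induction pulse."""
    fast, slow = 0.0, 0.0
    for k in range(int(t_end / dt)):
        t = k * dt
        if any(abs(t - p) < dt / 2 for p in pulse_times):
            fast = 1.0                         # each repetition resets fast
        fast -= fast / TAU_FAST * dt           # early phase decays quickly
        if fast > 0.5:
            slow += 0.02 * dt                  # fast phase pushes slow up
        # bistable slow dynamics: 0 and 1 stable, THETA unstable
        slow += slow * (slow - THETA) * (1.0 - slow) / TAU_SLOW * dt
        slow = min(max(slow, 0.0), 1.0)
    return slow

# Twelve repetitions every 10 min accumulate enough drive to cross THETA
# and consolidate; the same twelve repetitions 2 h apart decay away.
slow_frequent = run([10 * i for i in range(12)], t_end=2000)
slow_sparse = run([120 * i for i in range(12)], t_end=2000)
```

The total stimulation is identical in both runs; only the repetition frequency differs, which is the purified question the abstract poses.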
Collapse
Affiliation(s)
- Chiara Gastaldi
- School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
| | - Samuel Muscinelli
- School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
| | - Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
| |
Collapse
|
24
|
Elliott T. Dynamic Integrative Synaptic Plasticity Explains the Spacing Effect in the Transition from Short- to Long-Term Memory. Neural Comput 2019; 31:2212-2251. [PMID: 31525308 DOI: 10.1162/neco_a_01227] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Repeated stimuli that are spaced apart in time promote the transition from short- to long-term memory, while massing repetitions together does not. Previously, we showed that a model of integrative synaptic plasticity, in which plasticity induction signals are integrated by a low-pass filter before plasticity is expressed, gives rise to a natural timescale at which to repeat stimuli, hinting at a partial account of this spacing effect. The account was only partial because the important role of neuromodulation was not considered. We now show that by extending the model to allow dynamic integrative synaptic plasticity, the model permits synapses to robustly discriminate between spaced and massed repetition protocols, suppressing the response to massed stimuli while maintaining that to spaced stimuli. This is achieved by dynamically coupling the filter decay rate to neuromodulatory signaling in a very simple model of the signaling cascades downstream from cAMP production. In particular, the model's parameters may be interpreted as corresponding to the duration and amplitude of the waves of activity in the MAPK pathway. We identify choices of parameters and repetition times for stimuli in this model that optimize the ability of synapses to discriminate between spaced and massed repetition protocols. The model is very robust to reasonable changes around these optimal parameters and times, but for large changes in parameters, the model predicts that massed and spaced stimuli cannot be distinguished or that the responses to both patterns are suppressed. A model of dynamic integrative synaptic plasticity therefore explains the spacing effect under normal conditions and also predicts its breakdown under abnormal conditions.
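A toy version of the filter idea: induction signals pass through a low-pass filter whose decay rate is transiently sped up by recent stimulation (a crude stand-in for the neuromodulatory coupling in the model), so massed repetitions are largely absorbed while spaced ones each drive the filter across threshold. All numbers below are invented for illustration.

```python
# Low-pass filter with stimulation-dependent decay: spaced vs massed.
def run(stim_times, t_end, dt=1.0):
    filt = 0.0          # low-pass filter of plasticity induction signals
    suppress = 0.0      # recent-stimulation trace that speeds filter decay
    crossings = 0       # threshold crossings, i.e. plasticity expressions
    above = False
    for k in range(int(t_end / dt)):
        t = k * dt
        stim = 1.0 if any(abs(t - s) < dt / 2 for s in stim_times) else 0.0
        suppress += (-suppress / 30.0 + stim) * dt
        decay_rate = 0.05 * (1.0 + 2.0 * suppress)   # faster when massed
        filt += (-decay_rate * filt + stim) * dt
        if filt > 0.8 and not above:
            crossings += 1               # plasticity expressed at crossing
        above = filt > 0.8
    return crossings

spaced = run([0, 60, 120, 180], t_end=300)   # four spaced repetitions
massed = run([0, 1, 2, 3], t_end=300)        # the same four, massed
```

With these numbers the four spaced stimuli each cross threshold, whereas the four massed stimuli register as a single event, mirroring the spaced/massed discrimination discussed above.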
Collapse
Affiliation(s)
- Terry Elliott
- Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton, SO17 1BJ, U.K.
| |
Collapse
|
25
|
Roelfsema PR, Holtmaat A. Control of synaptic plasticity in deep cortical networks. Nat Rev Neurosci 2019; 19:166-180. [PMID: 29449713 DOI: 10.1038/nrn.2018.6] [Citation(s) in RCA: 105] [Impact Index Per Article: 21.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/07/2023]
Abstract
Humans and many other animals have an enormous capacity to learn about sensory stimuli and to master new skills. However, many of the mechanisms that enable us to learn remain to be understood. One of the greatest challenges of systems neuroscience is to explain how synaptic connections change to support maximally adaptive behaviour. Here, we provide an overview of factors that determine the change in the strength of synapses, with a focus on synaptic plasticity in sensory cortices. We review the influence of neuromodulators and feedback connections in synaptic plasticity and suggest a specific framework in which these factors can interact to improve the functioning of the entire network.
Collapse
Affiliation(s)
- Pieter R Roelfsema
- Department of Vision and Cognition, Netherlands Institute for Neuroscience, Royal Netherlands Academy of Arts and Sciences, Amsterdam, Netherlands; Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, VU University, Amsterdam, Netherlands; Psychiatry Department, Academic Medical Center, Amsterdam, Netherlands
| | - Anthony Holtmaat
- Department of Basic Neurosciences, Geneva Neuroscience Center, Faculty of Medicine, University of Geneva, Geneva, Switzerland
| |
Collapse
|
26
|
Deger M, Seeholzer A, Gerstner W. Multicontact Co-operativity in Spike-Timing-Dependent Structural Plasticity Stabilizes Networks. Cereb Cortex 2019; 28:1396-1415. [PMID: 29300903 PMCID: PMC6041941 DOI: 10.1093/cercor/bhx339] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/14/2017] [Accepted: 11/30/2017] [Indexed: 12/12/2022] Open
Abstract
Excitatory synaptic connections in the adult neocortex consist of multiple synaptic contacts, almost exclusively formed on dendritic spines. Changes of spine volume, a correlate of synaptic strength, can be tracked in vivo for weeks. Here, we present a combined model of structural and spike-timing–dependent plasticity that explains the multicontact configuration of synapses in adult neocortical networks under steady-state and lesion-induced conditions. Our plasticity rule with Hebbian and anti-Hebbian terms stabilizes both the postsynaptic firing rate and correlations between the pre- and postsynaptic activity at an active synaptic contact. Contacts appear spontaneously at a low rate and disappear if their strength approaches zero. Many presynaptic neurons compete to make strong synaptic connections onto a postsynaptic neuron, whereas the synaptic contacts of a given presynaptic neuron co-operate via postsynaptic firing. We find that co-operation of multiple synaptic contacts is crucial for stable, long-term synaptic memories. In simulations of a simplified network model of barrel cortex, our plasticity rule reproduces whisker-trimming–induced rewiring of thalamocortical and recurrent synaptic connectivity on realistic time scales.
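The birth-death logic of multicontact connections can be caricatured in a few lines: contacts appear spontaneously at a low rate, drift up or down depending on whether the connection as a whole is active (a stand-in for co-operation via shared postsynaptic firing), and are pruned when their strength reaches zero. All rates and drifts below are invented for illustration.

```python
import random

# Toy birth-death sketch of a multicontact synaptic connection.
random.seed(1)

def simulate(connection_active, steps=2000, birth_rate=0.01):
    contacts = [0.5]                       # start with one contact
    for _ in range(steps):
        if random.random() < birth_rate:
            contacts.append(0.1)           # spontaneous new contact
        drift = 0.01 if connection_active else -0.005
        contacts = [w + drift + random.gauss(0, 0.01) for w in contacts]
        # prune contacts whose strength reaches zero; cap strength at 1
        contacts = [min(w, 1.0) for w in contacts if w > 0.0]
    return contacts

active = simulate(True)    # co-operating contacts survive and multiply
silent = simulate(False)   # contacts decay and are pruned
```

An active connection accumulates and retains many strong contacts, while an inactive one is whittled down, echoing the co-operation the abstract identifies as crucial for stable long-term synaptic memories.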
Collapse
Affiliation(s)
- Moritz Deger
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland; Institute for Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, 50674 Cologne, Germany
| | - Alexander Seeholzer
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
| | - Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
| |
Collapse
|
27
|
Herpich J, Tetzlaff C. Principles underlying the input-dependent formation and organization of memories. Netw Neurosci 2019; 3:606-634. [PMID: 31157312 PMCID: PMC6542621 DOI: 10.1162/netn_a_00086] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/26/2018] [Accepted: 03/21/2019] [Indexed: 11/29/2022] Open
Abstract
The neuronal system exhibits the remarkable ability to dynamically store and organize incoming information into a web of memory representations (items), which is essential for the generation of complex behaviors. Central to memory function is that such memory items must be (1) discriminated from each other, (2) associated with each other, or (3) brought into a sequential order. However, how these three basic mechanisms are robustly implemented in an input-dependent manner by the underlying complex neuronal and synaptic dynamics is still unknown. Here, we develop a mathematical framework that provides a direct link between different synaptic mechanisms, which determine the neuronal and synaptic dynamics of the network, and the emergence of these basic memory operations. Combining correlation-based synaptic plasticity and homeostatic synaptic scaling, we demonstrate that these mechanisms enable the reliable formation of sequences and associations between two memory items, but still lack the capability for discrimination. We show that this shortcoming can be removed by additionally considering inhibitory synaptic plasticity. Thus, the framework presented here provides a new, functionally motivated link between different known synaptic mechanisms, leading to the self-organization of fundamental memory mechanisms.
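The combination this abstract describes can be caricatured in a few lines: a correlation-driven Hebbian term plus a homeostatic scaling term that pulls the postsynaptic rate back toward a target. This is a generic sketch; the functional form, names, and parameter values are illustrative, not the paper's equations.

```python
def dw(w, pre, post, target=1.0, mu=0.01, kappa=0.001):
    """Illustrative weight derivative: Hebbian growth plus homeostatic scaling.

    pre, post: pre- and postsynaptic firing rates; target: desired
    postsynaptic rate. The scaling term weakens the synapse when the
    postsynaptic rate exceeds the target and strengthens it otherwise.
    """
    hebbian = mu * pre * post              # correlation-based growth
    scaling = kappa * (target - post) * w  # rate homeostasis
    return hebbian + scaling
```

With correlated activity below the target rate both terms potentiate; with the postsynaptic rate above target and no correlation, only the depressing scaling term acts.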
Affiliation(s)
- Juliane Herpich
- Department of Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August-University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
- Christian Tetzlaff
- Department of Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August-University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
28
Ocker GK, Doiron B. Training and Spontaneous Reinforcement of Neuronal Assemblies by Spike Timing Plasticity. Cereb Cortex 2019; 29:937-951. [PMID: 29415191] [PMCID: PMC7963120] [DOI: 10.1093/cercor/bhy001]
Abstract
The synaptic connectivity of cortex is plastic, with experience shaping the ongoing interactions between neurons. Theoretical studies of spike timing-dependent plasticity (STDP) have focused on either just pairs of neurons or large-scale simulations. A simple analytic account for how fast spike time correlations affect both microscopic and macroscopic network structure is lacking. We develop a low-dimensional mean field theory for STDP in recurrent networks and show the emergence of assemblies of strongly coupled neurons with shared stimulus preferences. After training, this connectivity is actively reinforced by spike train correlations during the spontaneous dynamics. Furthermore, the stimulus coding by cell assemblies is actively maintained by these internally generated spiking correlations, suggesting a new role for noise correlations in neural coding. Assembly formation has often been associated with firing rate-based plasticity schemes; our theory provides an alternative and complementary framework, where fine temporal correlations and STDP form and actively maintain learned structure in cortical networks.
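As background, the classic pair-based STDP window that such theories build on can be sketched as follows; the amplitudes and time constants here are illustrative, not taken from the paper.

```python
import math

# Pair-based STDP window: potentiation when the presynaptic spike
# precedes the postsynaptic one, depression for the reverse order.
# Parameter values are illustrative.
A_PLUS, A_MINUS = 0.01, 0.012     # amplitudes of potentiation / depression
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # decay time constants (ms)

def stdp_dw(dt_ms):
    """Weight change for a spike-time difference dt = t_post - t_pre (ms)."""
    if dt_ms > 0:    # pre fires before post: potentiate
        return A_PLUS * math.exp(-dt_ms / TAU_PLUS)
    elif dt_ms < 0:  # post fires before pre: depress
        return -A_MINUS * math.exp(dt_ms / TAU_MINUS)
    return 0.0
```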
Affiliation(s)
- Gabriel Koch Ocker
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, PA, USA
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Allen Institute for Brain Science, Seattle, WA, USA
- Brent Doiron
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA, USA
29
Gerstner W, Lehmann M, Liakoni V, Corneil D, Brea J. Eligibility Traces and Plasticity on Behavioral Time Scales: Experimental Support of NeoHebbian Three-Factor Learning Rules. Front Neural Circuits 2018; 12:53. [PMID: 30108488] [PMCID: PMC6079224] [DOI: 10.3389/fncir.2018.00053]
Abstract
Most elementary behaviors such as moving the arm to grasp an object or walking into the next room to explore a museum evolve on the time scale of seconds; in contrast, neuronal action potentials occur on the time scale of a few milliseconds. Learning rules of the brain must therefore bridge the gap between these two different time scales. Modern theories of synaptic plasticity have postulated that the co-activation of pre- and postsynaptic neurons sets a flag at the synapse, called an eligibility trace, that leads to a weight change only if an additional factor is present while the flag is set. This third factor, signaling reward, punishment, surprise, or novelty, could be implemented by the phasic activity of neuromodulators or specific neuronal inputs signaling special events. While the theoretical framework has been developed over the last decades, experimental evidence in support of eligibility traces on the time scale of seconds has been collected only during the last few years. Here we review, in the context of three-factor rules of synaptic plasticity, four key experiments that support the role of synaptic eligibility traces in combination with a third factor as a biological implementation of neoHebbian three-factor learning rules.
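The eligibility-trace idea summarized above can be sketched in a few lines: a Hebbian coincidence sets a decaying trace (the "flag"), and the weight changes only if a third factor arrives while the trace is still nonzero. All names and parameter values here are illustrative.

```python
# Three-factor learning rule sketch. The eligibility trace decays on
# the time scale of seconds, bridging the gap between millisecond
# spike coincidences and behaviorally delayed reward or surprise.
TAU_E = 1.0  # eligibility-trace time constant (s)
ETA = 0.1    # learning rate

def update(weight, trace, dt, hebbian, third_factor):
    """Advance one time step of length dt (seconds)."""
    trace += -dt * trace / TAU_E + hebbian  # decaying flag set by co-activity
    weight += ETA * third_factor * trace    # consolidated only with 3rd factor
    return weight, trace
```

A coincidence alone leaves the weight untouched; a reward signal arriving half a second later, while the trace has only partly decayed, produces the actual weight change.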
Affiliation(s)
- Wulfram Gerstner
- School of Computer Science and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
30
Diederich N, Bartsch T, Kohlstedt H, Ziegler M. A memristive plasticity model of voltage-based STDP suitable for recurrent bidirectional neural networks in the hippocampus. Sci Rep 2018; 8:9367. [PMID: 29921840] [PMCID: PMC6008480] [DOI: 10.1038/s41598-018-27616-6]
Abstract
Memristive systems have gained considerable attention in the field of neuromorphic engineering because they allow the emulation of synaptic functionality in solid-state nano-physical systems. In this study, we show that memristive behavior provides a broad working framework for the phenomenological modelling of cellular synaptic mechanisms. In particular, we seek to understand how closely a memristive system can account for biological realism. The basic characteristics of memristive systems, i.e. voltage and memory behavior, are used to derive a voltage-based plasticity rule. We show that this model is suitable to account for a variety of electrophysiological plasticity data. Furthermore, we incorporate the plasticity model into an all-to-all connecting network scheme. Motivated by the auto-associative CA3 network of the hippocampus, we show that the implemented network allows the discrimination and processing of mnemonic pattern information, i.e. the formation of functional bidirectional connections resulting in the formation of local receptive fields. Since the presented plasticity model can be applied to real memristive devices as well, the presented theoretical framework can support both the design of appropriate memristive devices for neuromorphic computing and the development of complex neuromorphic networks that account for the specific advantages of memristive devices.
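The two device properties this study builds on, a voltage threshold and state-dependent "memory" behavior, can be caricatured in a toy update rule. The thresholds and rates below are made up for illustration and are not the paper's device model.

```python
# Toy memristive update: the "weight" is a conductance that changes
# only when the applied voltage exceeds a threshold, and the size of
# the change depends on the current state (soft bounds), mimicking
# the memory behavior of the device. Parameters are illustrative.
def memristor_step(g, v, dt, v_th=0.5, rate=1.0, g_min=0.0, g_max=1.0):
    if v > v_th:        # potentiating pulse: conductance grows toward g_max
        g += dt * rate * (g_max - g)
    elif v < -v_th:     # depressing pulse: conductance decays toward g_min
        g -= dt * rate * (g - g_min)
    return g            # sub-threshold voltages leave the state untouched
```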
Affiliation(s)
- Nick Diederich
- Nanoelektronik, Technische Fakultät, Christian-Albrechts-Universität zu Kiel, D-24143, Kiel, Germany
- Department of Neurology, Memory Disorders and Plasticity Group, University Hospital Schleswig-Holstein, Kiel, Germany
- Thorsten Bartsch
- Department of Neurology, Memory Disorders and Plasticity Group, University Hospital Schleswig-Holstein, Kiel, Germany
- Hermann Kohlstedt
- Nanoelektronik, Technische Fakultät, Christian-Albrechts-Universität zu Kiel, D-24143, Kiel, Germany
- Martin Ziegler
- Nanoelektronik, Technische Fakultät, Christian-Albrechts-Universität zu Kiel, D-24143, Kiel, Germany.
31
Helfer P, Shultz TR. Coupled feedback loops maintain synaptic long-term potentiation: A computational model of PKMzeta synthesis and AMPA receptor trafficking. PLoS Comput Biol 2018; 14:e1006147. [PMID: 29813048] [PMCID: PMC5993340] [DOI: 10.1371/journal.pcbi.1006147]
Abstract
In long-term potentiation (LTP), one of the most studied types of neural plasticity, synaptic strength is persistently increased in response to stimulation. Although a number of different proteins have been implicated in the sub-cellular molecular processes underlying induction and maintenance of LTP, the precise mechanisms remain unknown. A particular challenge is to demonstrate that a proposed molecular mechanism can provide the level of stability needed to maintain memories for months or longer, in spite of the fact that many of the participating molecules have much shorter life spans. Here we present a computational model that combines simulations of several biochemical reactions that have been suggested in the LTP literature and show that the resulting system does exhibit the required stability. At the core of the model are two interlinked feedback loops of molecular reactions, one involving the atypical protein kinase PKMζ and its messenger RNA, the other involving PKMζ and GluA2-containing AMPA receptors. We demonstrate that robust bistability (stable equilibria in both the synapse's potentiated and unpotentiated states) can arise from a set of simple molecular reactions. The model is able to account for a wide range of empirical results, including induction and maintenance of late-phase LTP, cellular memory reconsolidation and the effects of different pharmaceutical interventions.
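Although the paper's model couples two feedback loops, the core point that molecular positive feedback can yield bistability is illustrable with a single-variable toy system: Hill-type self-activation plus linear decay. Parameters here are illustrative, not taken from the model.

```python
# Minimal bistable switch: protein x promotes its own synthesis via a
# sigmoidal (Hill) term and decays linearly. For these parameters the
# system has a low ("unpotentiated") and a high ("potentiated") stable
# state, so transient stimulation can flip a persistent switch.
def dxdt(x, alpha=4.0, k=1.0, basal=0.05, decay=1.0):
    return basal + alpha * x**2 / (k**2 + x**2) - decay * x

def relax(x0, dt=0.01, steps=5000):
    """Forward-Euler relaxation from initial condition x0."""
    x = x0
    for _ in range(steps):
        x += dt * dxdt(x)
    return x
```

Starting below the unstable middle fixed point the system settles at the low state; starting above it, at the high state, which is what gives the switch its memory.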
Affiliation(s)
- Peter Helfer
- Department of Psychology and Integrated Program in Neuroscience, McGill University, Montreal, Quebec, Canada
- Thomas R. Shultz
- Department of Psychology and School of Computer Science, McGill University, Montreal, Quebec, Canada
32
Sharma M, Razali NB, Sajikumar S. Inhibition of G9a/GLP Complex Promotes Long-Term Potentiation and Synaptic Tagging/Capture in Hippocampal CA1 Pyramidal Neurons. Cereb Cortex 2018; 27:3161-3171. [PMID: 27252354] [DOI: 10.1093/cercor/bhw170]
Abstract
Epigenetic regulation plays an important role in learning and memory processes. The G9a/G9a-like protein (GLP) lysine dimethyltransferase complex catalyzes a prominent histone H3 lysine 9 dimethylation (H3K9me2) mark that results in transcriptional silencing of the chromatin. Here, we report that inhibition of the G9a/GLP complex by either of the substrate-competitive inhibitors UNC 0638 or BIX 01294 reinforces protein synthesis-independent long-term potentiation (early-LTP) into protein synthesis-dependent long-term potentiation (late-LTP). The reinforcement effect was observed if the inhibitors were present during the induction of early-LTP, and also when G9a/GLP complex inhibition was carried out by priming of synapses within an interval of 30 min before or after the induction of early-LTP. Surprisingly, the LTP reinforced by G9a/GLP complex inhibition was able to associate with a weak plasticity event from nearby independent synaptic populations, resulting in synaptic tagging/capture (STC). We have identified brain-derived neurotrophic factor (BDNF) as a critical plasticity protein that maintains G9a/GLP complex inhibition-mediated LTP facilitation and its STC. Our study reveals an epigenetic mechanism for promoting plasticity and associativity by G9a/GLP complex inhibition, which may offer a promising epigenetic target for enhancing memory in neural networks.
Affiliation(s)
- Mahima Sharma
- Department of Physiology, Yong Loo Lin School of Medicine, National University of Singapore, Singapore 117 597, Singapore
- Neurobiology/Aging Program, #04-44, 28 Medical Drive, Life Sciences Institute (LSI), National University of Singapore, Singapore 117 456, Singapore
- Nuralyah Bte Razali
- Department of Physiology, Yong Loo Lin School of Medicine, National University of Singapore, Singapore 117 597, Singapore
- Neurobiology/Aging Program, #04-44, 28 Medical Drive, Life Sciences Institute (LSI), National University of Singapore, Singapore 117 456, Singapore
- Sreedharan Sajikumar
- Department of Physiology, Yong Loo Lin School of Medicine, National University of Singapore, Singapore 117 597, Singapore
- Neurobiology/Aging Program, #04-44, 28 Medical Drive, Life Sciences Institute (LSI), National University of Singapore, Singapore 117 456, Singapore
33
Mehta A. Storing and retrieving long-term memories: cooperation and competition in synaptic dynamics. Advances in Physics: X 2018. [DOI: 10.1080/23746149.2018.1480415]
Affiliation(s)
- Anita Mehta
- Max Planck Institute for Mathematics in the Sciences, Leipzig, Germany
34
Kastellakis G, Silva AJ, Poirazi P. Linking Memories across Time via Neuronal and Dendritic Overlaps in Model Neurons with Active Dendrites. Cell Rep 2017; 17:1491-1504. [PMID: 27806290] [PMCID: PMC5149530] [DOI: 10.1016/j.celrep.2016.10.015]
Abstract
Memories are believed to be stored in distributed neuronal assemblies through activity-induced changes in synaptic and intrinsic properties. However, the specific mechanisms by which different memories become associated or linked remain a mystery. Here, we develop a simplified, biophysically inspired network model that incorporates multiple plasticity processes and explains linking of information at three different levels: (1) learning of a single associative memory, (2) rescuing of a weak memory when paired with a strong one, and (3) linking of multiple memories across time. By dissecting synaptic from intrinsic plasticity and neuron-wide from dendritically restricted protein capture, the model reveals a simple, unifying principle: linked memories share synaptic clusters within the dendrites of overlapping populations of neurons. The model generates numerous experimentally testable predictions regarding the cellular and sub-cellular properties of memory engrams as well as their spatiotemporal interactions.
- Network model with active dendrites and synaptic, somatic, homeostatic plasticity
- Linked memories are stored in overlapping populations of neurons
- Linked memories share synaptic clusters in common dendritic branches
- The locus of protein synthesis or capture shapes the structure of the memory trace
Affiliation(s)
- George Kastellakis
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology, Hellas (FORTH), N. Plastira 100, P.O. Box 1385, Heraklion, Crete 70013, Greece; Department of Biology, University of Crete, P.O. Box 2208, Heraklion, Crete 70013, Greece
- Alcino J Silva
- Integrative Center for Learning and Memory, Departments of Neurobiology, Psychology, and Psychiatry, and Brain Research Institute, UCLA, 2554 Gonda Center, Los Angeles, CA 90095, USA
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology, Hellas (FORTH), N. Plastira 100, P.O. Box 1385, Heraklion, Crete 70013, Greece.
35
Costa RP, Padamsey Z, D'Amour JA, Emptage NJ, Froemke RC, Vogels TP. Synaptic Transmission Optimization Predicts Expression Loci of Long-Term Plasticity. Neuron 2017; 96:177-189.e7. [PMID: 28957667] [PMCID: PMC5626823] [DOI: 10.1016/j.neuron.2017.09.021]
Abstract
Long-term modifications of neuronal connections are critical for reliable memory storage in the brain. However, their locus of expression (pre- or postsynaptic) is highly variable. Here we introduce a theoretical framework in which long-term plasticity performs an optimization of the postsynaptic response statistics toward a given mean with minimal variance. Consequently, the state of the synapse at the time of plasticity induction determines the ratio of pre- and postsynaptic modifications. Our theory explains the experimentally observed expression loci of the hippocampal and neocortical synaptic potentiation studies we examined. Moreover, the theory predicts presynaptic expression of long-term depression, consistent with experimental observations. At inhibitory synapses, the theory suggests a statistically efficient excitatory-inhibitory balance in which changes in inhibitory postsynaptic response statistics specifically target the mean excitation. Our results provide a unifying theory for understanding the expression mechanisms and functions of long-term synaptic transmission plasticity.
Affiliation(s)
- Rui Ponte Costa
- Centre for Neural Circuits and Behaviour, Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, UK.
- Zahid Padamsey
- Department of Pharmacology, University of Oxford, Oxford, UK
- James A D'Amour
- Skirball Institute, Neuroscience Institute, Departments of Otolaryngology, Neuroscience and Physiology, New York University School of Medicine, New York, NY, USA
- Nigel J Emptage
- Department of Pharmacology, University of Oxford, Oxford, UK
- Robert C Froemke
- Skirball Institute, Neuroscience Institute, Departments of Otolaryngology, Neuroscience and Physiology, New York University School of Medicine, New York, NY, USA; Center for Neural Science, New York University, New York, NY, USA; Howard Hughes Medical Institute Faculty Scholar
- Tim P Vogels
- Centre for Neural Circuits and Behaviour, Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, UK
36
Jȩdrzejewska-Szmek J, Luczak V, Abel T, Blackwell KT. β-adrenergic signaling broadly contributes to LTP induction. PLoS Comput Biol 2017; 13:e1005657. [PMID: 28742159] [PMCID: PMC5546712] [DOI: 10.1371/journal.pcbi.1005657]
Abstract
Long-lasting forms of long-term potentiation (LTP) represent one of the major cellular mechanisms underlying learning and memory. One of the fundamental questions in the field of LTP is why different molecules are critical for long-lasting forms of LTP induced by diverse experimental protocols. Further complexity stems from spatial aspects of signaling networks, such that some molecules function in the dendrite and some are critical in the spine. We investigated whether the diverse experimental evidence can be unified by creating a spatial, mechanistic model of multiple signaling pathways in hippocampal CA1 neurons. Our results show that the combination of activity of several key kinases can predict the occurrence of long-lasting forms of LTP for multiple experimental protocols. Specifically Ca2+/calmodulin activated kinase II, protein kinase A and exchange protein activated by cAMP (Epac) together predict the occurrence of LTP in response to strong stimulation (multiple trains of 100 Hz) or weak stimulation augmented by isoproterenol. Furthermore, our analysis suggests that activation of the β-adrenergic receptor either via canonical (Gs-coupled) or non-canonical (Gi-coupled) pathways underpins most forms of long-lasting LTP. Simulations make the experimentally testable prediction that a complete antagonist of the β-adrenergic receptor will likely block long-lasting LTP in response to strong stimulation. Collectively these results suggest that converging molecular mechanisms allow CA1 neurons to flexibly utilize signaling mechanisms best tuned to temporal pattern of synaptic input to achieve long-lasting LTP and memory storage. Long-term potentiation of the strength of synaptic connections is a mechanism of learning and memory storage. One of the most confusing aspects of hippocampal synaptic potentiation is that numerous experiments have revealed the requirement for a plethora of signaling molecules. 
Furthermore, the degree to which molecules activated by the stress response modify hippocampal synaptic potentiation and memory is still unclear. We used a computational model to demonstrate that this molecular diversity can be explained by considering a combination of several key molecules. We also show that activation of β-adrenergic receptors by the stress response appears to be involved in most forms of synaptic potentiation, though in some cases unconventional mechanisms are utilized. This suggests that novel treatments for stress-related disorders may have more success if they target unconventional mechanisms activated by β-adrenergic receptors.
Affiliation(s)
- Joanna Jȩdrzejewska-Szmek
- The Krasnow Institute for Advanced Studies, George Mason University, Fairfax, Virginia, United States of America
- Vincent Luczak
- Department of Biology, University of Pennsylvania, Philadelphia, Pennsylvania, United States of America
- Ted Abel
- Department of Biology, University of Pennsylvania, Philadelphia, Pennsylvania, United States of America
- Kim T Blackwell
- The Krasnow Institute for Advanced Studies, George Mason University, Fairfax, Virginia, United States of America
37
Grogan JP, Tsivos D, Smith L, Knight BE, Bogacz R, Whone A, Coulthard EJ. Effects of dopamine on reinforcement learning and consolidation in Parkinson's disease. eLife 2017; 6. [PMID: 28691905] [PMCID: PMC5531832] [DOI: 10.7554/elife.26801]
Abstract
Emerging evidence suggests that dopamine may modulate learning and memory with important implications for understanding the neurobiology of memory and future therapeutic targeting. An influential hypothesis posits that dopamine biases reinforcement learning. More recent data also suggest an influence during both consolidation and retrieval. Eighteen Parkinson's disease patients learned through feedback ON or OFF medication, with memory tested 24 hr later ON or OFF medication (4 conditions, within-subjects design with matched healthy control group). Patients OFF medication during learning decreased in memory accuracy over the following 24 hr. In contrast to previous studies, however, dopaminergic medication during learning and testing did not affect expression of positive or negative reinforcement. Two further experiments were run without the 24 hr delay, but they too failed to reproduce effects of dopaminergic medication on reinforcement learning. While supportive of a dopaminergic role in consolidation, this study failed to replicate previous findings on reinforcement learning.
Affiliation(s)
- John P Grogan
- Institute of Clinical Neurosciences, School of Clinical Sciences, University of Bristol, Bristol, United Kingdom
- Demitra Tsivos
- Clinical Neurosciences, North Bristol NHS Trust, Bristol, United Kingdom
- Laura Smith
- Institute of Clinical Neurosciences, School of Clinical Sciences, University of Bristol, Bristol, United Kingdom
- Brogan E Knight
- Clinical Neurosciences, North Bristol NHS Trust, Bristol, United Kingdom
- Rafal Bogacz
- MRC Brain Network Dynamics Unit, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom
- Alan Whone
- Institute of Clinical Neurosciences, School of Clinical Sciences, University of Bristol, Bristol, United Kingdom
- Elizabeth J Coulthard
- Institute of Clinical Neurosciences, School of Clinical Sciences, University of Bristol, Bristol, United Kingdom
- Clinical Neurosciences, North Bristol NHS Trust, Bristol, United Kingdom
| |
Collapse
|
38
|
Abstract
Humans regularly perform new learning without losing memory for previous information, but neural network models suffer from the phenomenon of catastrophic forgetting in which new learning impairs prior function. A recent article presents an algorithm that spares learning at synapses important for previously learned function, reducing catastrophic forgetting.
Affiliation(s)
- Michael E Hasselmo
- Center for Systems Neuroscience, Boston University, 2 Cummington Mall, Boston, MA 02215, USA.
39
Zenke F, Gerstner W, Ganguli S. The temporal paradox of Hebbian learning and homeostatic plasticity. Curr Opin Neurobiol 2017; 43:166-176. [DOI: 10.1016/j.conb.2017.03.015]
40
Gershman SJ, Monfils MH, Norman KA, Niv Y. The computational nature of memory modification. eLife 2017; 6. [PMID: 28294944] [PMCID: PMC5391211] [DOI: 10.7554/elife.23763]
Abstract
Retrieving a memory can modify its influence on subsequent behavior. We develop a computational theory of memory modification, according to which modification of a memory trace occurs through classical associative learning, but which memory trace is eligible for modification depends on a structure learning mechanism that discovers the units of association by segmenting the stream of experience into statistically distinct clusters (latent causes). New memories are formed when the structure learning mechanism infers that a new latent cause underlies current sensory observations. By the same token, old memories are modified when old and new sensory observations are inferred to have been generated by the same latent cause. We derive this framework from probabilistic principles, and present a computational implementation. Simulations demonstrate that our model can reproduce the major experimental findings from studies of memory modification in the Pavlovian conditioning literature.

Our memories contain our expectations about the world that we can retrieve to make predictions about the future. For example, most people would expect a chocolate bar to taste good, because they have previously learned to associate chocolate with pleasure. When a surprising event occurs, such as tasting an unpalatable chocolate bar, the brain therefore faces a dilemma. Should it update the existing memory and overwrite the association between chocolate and pleasure? Or should it create an additional memory? In the latter case, the brain would form a new association between chocolate and displeasure that competes with, but does not overwrite, the original one between chocolate and pleasure. Previous studies have shown that surprising events tend to create new memories unless the existing memory is briefly reactivated before the surprising event occurs. In other words, retrieving old memories makes them more malleable. Gershman et al. have now developed a computational model for how the brain decides whether to update an old memory or create a new one. The idea at the heart of the model is that the brain will attempt to infer what caused the surprising event. The reason the chocolate bar tastes unpalatable, for example, might be because it was old and had spoiled. Every time the brain infers a new possible cause for a surprising event, it will create an additional memory to store this new set of expectations. In the future we will know that spoiled chocolate bars taste bad. However, if the brain cannot infer a new cause for the surprising event (because, for example, there appears to be nothing unusual about the unpalatable chocolate bar) it will instead opt to update the existing memory. The next time we buy a chocolate bar, we will have slightly lower expectations about how good it will taste. The dilemma of whether to update an existing memory or create a new one thus boils down to the question: is the surprising event the consequence of a new cause or an old one? This theory implies that retrieving a memory nudges the brain to infer that its associated cause is once again active and, since this is an old cause, it means that the memory will be eligible for updating. Many experiments have been performed on the topic of modifying memories, but this is the first computational model that offers a unifying explanation for the results. The next step is to work out how to apply the model, which is phrased in abstract terms, to networks of neurons that are more biologically realistic.
Affiliation(s)
- Samuel J Gershman
- Department of Psychology and Center for Brain Science, Harvard University, Cambridge, United States
- Marie-H Monfils
- Department of Psychology, University of Texas, Austin, United States
- Kenneth A Norman
- Princeton Neuroscience Institute and Department of Psychology, Princeton University, Princeton, United States
- Yael Niv
- Princeton Neuroscience Institute and Department of Psychology, Princeton University, Princeton, United States
41
Abstract
The ability to learn tasks in a sequential fashion is crucial to the development of artificial intelligence. Until now, neural networks have not been capable of this, and it has been widely thought that catastrophic forgetting is an inevitable feature of connectionist models. We show that it is possible to overcome this limitation and train networks that can maintain expertise on tasks that they have not experienced for a long time. Our approach remembers old tasks by selectively slowing down learning on the weights important for those tasks. We demonstrate that our approach is scalable and effective by solving a set of classification tasks based on a hand-written digit dataset and by learning several Atari 2600 games sequentially.
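The selective slowing this abstract describes (elastic weight consolidation) amounts to a quadratic penalty that anchors each weight to its old-task value in proportion to that weight's estimated importance, e.g. its Fisher information. A minimal sketch, with illustrative names and values:

```python
# Quadratic consolidation penalty: lam/2 * sum_i F_i * (theta_i - theta_old_i)^2.
# Weights that were important for the old task (large F_i) are pulled back
# strongly when the new task tries to move them; unimportant weights stay free.
def ewc_penalty(theta, theta_old, importance, lam=1.0):
    return 0.5 * lam * sum(f * (t - t0) ** 2
                           for f, t, t0 in zip(importance, theta, theta_old))

def ewc_grad(theta, theta_old, importance, lam=1.0):
    """Gradient of the penalty, added to the new task's loss gradient."""
    return [lam * f * (t - t0) for f, t, t0 in zip(importance, theta, theta_old)]
```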
42
Nguyen-Vu TB, Zhao GQ, Lahiri S, Kimpo RR, Lee H, Ganguli S, Shatz CJ, Raymond JL. A saturation hypothesis to explain both enhanced and impaired learning with enhanced plasticity. eLife 2017; 6. [PMID: 28234229 PMCID: PMC5386593 DOI: 10.7554/elife.20147] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/29/2016] [Accepted: 02/02/2017] [Indexed: 11/19/2022] Open
Abstract
Across many studies, animals with enhanced synaptic plasticity exhibit either enhanced or impaired learning, raising a conceptual puzzle: how can enhanced plasticity yield opposite learning outcomes? Here, we show that the recent history of experience can determine whether mice with enhanced plasticity exhibit enhanced or impaired learning in response to the same training. Mice with enhanced cerebellar LTD, due to double knockout (DKO) of MHCI H2-Kb/H2-Db (KbDb−/−), exhibited oculomotor learning deficits. However, the same mice exhibited enhanced learning after appropriate pre-training. Theoretical analysis revealed that synapses with history-dependent learning rules could recapitulate the data, and suggested that saturation may be a key factor limiting the ability of enhanced plasticity to enhance learning. Optogenetic stimulation designed to saturate LTD produced the same impairment in WT as observed in DKO mice. Overall, our results suggest that the recent history of activity and the threshold for synaptic plasticity conspire to effect divergent learning outcomes. DOI:http://dx.doi.org/10.7554/eLife.20147.001 All animals can learn from their experiences. One of the main ideas for how learning occurs is that it involves changes in the strength of the connections between neurons, known as synapses. The ability of synapses to become stronger or weaker is referred to as synaptic plasticity. High levels of synaptic plasticity are generally thought to be good for learning, while low levels of synaptic plasticity make learning more difficult. Nevertheless, studies have also reported that high levels of synaptic plasticity can sometimes impair learning. To explain these mixed results, Nguyen-Vu, Zhao, Lahiri et al. studied mice that had been genetically modified to show greater synaptic plasticity than normal mice.
The same individual mutant animals were sometimes less able to learn an eye-movement task than unmodified mice, and at other times better able to learn exactly the same task. The main factor that determined how well the mice could learn was what the mice had experienced shortly before they began the training. Nguyen-Vu et al. propose that some experiences change the strength of synapses so much that they temporarily prevent those synapses from undergoing any further changes. Animals with these “saturated” synapses will struggle to learn a new task, even if their brains are normally capable of high levels of synaptic plasticity. Notably, even normal activity appears to be able to put the synapses of the mutant mice into a saturated state, whereas this saturation would only occur in normal mice under a restricted set of circumstances. Consistent with this idea, Nguyen-Vu et al. showed that a specific type of pre-training that desaturates synapses improved the ability of the modified mice to learn the eye-movement task. Conversely, a different procedure that is known to saturate synapses impaired the learning ability of the unmodified mice. A future challenge is to test these predictions experimentally by measuring changes in synaptic plasticity directly, both in brain slices and in living animals. The results could ultimately help to develop treatments that improve the ability to learn and so could provide benefits to a wide range of individuals, including people who have suffered a brain injury or stroke. DOI:http://dx.doi.org/10.7554/eLife.20147.002
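The saturation account can be illustrated with a toy hard-bounded synapse: the same depression-inducing training changes a mid-range weight but does nothing to one already driven to its floor, while potentiating "pre-training" restores the ability to express LTD. This sketch is our own construction, not the paper's cerebellar model:

```python
def train(w, inductions, lr=0.2, w_min=0.0, w_max=1.0):
    """Apply a sequence of plasticity induction signals (+1 for LTP,
    -1 for LTD) to a hard-bounded synaptic weight."""
    for s in inductions:
        w = min(w_max, max(w_min, w + lr * s))
    return w

# Identical LTD-heavy training, different recent histories:
fresh = train(0.5, [-1, -1])            # mid-range synapse: expresses LTD
saturated = train(0.0, [-1, -1])        # already at floor: no change at all
rescued = train(0.0, [+1, +1, -1, -1])  # pre-training desaturates it first
print(fresh, saturated, rescued)
```

The "enhanced plasticity" analogue is a larger `lr`: it speeds learning from mid-range but also drives weights to the bounds sooner, so recent history decides which effect dominates.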
Affiliation(s)
- Td Barbara Nguyen-Vu
- Department of Neurobiology, Stanford School of Medicine, Stanford, United States; Department of Molecular and Cellular Physiology, Stanford School of Medicine, Stanford, United States
- Grace Q Zhao
- Department of Neurobiology, Stanford School of Medicine, Stanford, United States
- Subhaneil Lahiri
- Department of Applied Physics, Stanford University, Stanford, United States
- Rhea R Kimpo
- Department of Neurobiology, Stanford School of Medicine, Stanford, United States
- Hanmi Lee
- Department of Neurobiology, Stanford School of Medicine, Stanford, United States
- Surya Ganguli
- Department of Neurobiology, Stanford School of Medicine, Stanford, United States; Department of Applied Physics, Stanford University, Stanford, United States
- Carla J Shatz
- Department of Neurobiology, Stanford School of Medicine, Stanford, United States; Department of Biology, Stanford University, Stanford, United States
- Jennifer L Raymond
- Department of Neurobiology, Stanford School of Medicine, Stanford, United States
43
Benna MK, Fusi S. Computational principles of synaptic memory consolidation. Nat Neurosci 2016; 19:1697-1706. [PMID: 27694992 DOI: 10.1038/nn.4401] [Citation(s) in RCA: 89] [Impact Index Per Article: 11.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/06/2016] [Accepted: 09/01/2016] [Indexed: 02/07/2023]
Abstract
Memories are stored and retained through complex, coupled processes operating on multiple timescales. To understand the computational principles behind these intricate networks of interactions, we construct a broad class of synaptic models that efficiently harness biological complexity to preserve numerous memories by protecting them against the adverse effects of overwriting. The memory capacity scales almost linearly with the number of synapses, which is a substantial improvement over the square root scaling of previous models. This was achieved by combining multiple dynamical processes that initially store memories in fast variables and then progressively transfer them to slower variables. Notably, the interactions between fast and slow variables are bidirectional. The proposed models are robust to parameter perturbations and can explain several properties of biological memory, including delayed expression of synaptic modifications, metaplasticity, and spacing effects.
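The fast-to-slow transfer with bidirectional coupling can be sketched as a chain of hidden variables per synapse, each relaxing toward its neighbors, with the first variable receiving the plasticity events. This is a simplified rendering; the paper's coupling constants and variable capacities differ:

```python
import numpy as np

def step_chain(u, event=0.0, dt=0.1):
    """One update of a chain of coupled synaptic variables. u[0] is the
    fast, visible weight; deeper variables are progressively slower
    (coupling strengths fall off geometrically). A plasticity event
    perturbs u[0]; diffusion then carries the trace into the slow
    variables, which leak back and stabilize u[0]."""
    n = len(u)
    g = 2.0 ** -np.arange(n)          # coupling between u[k] and u[k+1]
    du = np.zeros(n)
    du[0] += event
    for k in range(n - 1):
        flow = g[k] * (u[k] - u[k + 1])
        du[k] -= flow
        du[k + 1] += flow
    return u + dt * du

u = np.zeros(4)
u = step_chain(u, event=1.0)          # a potentiation event hits u[0]
for _ in range(1000):                 # the trace then diffuses inward
    u = step_chain(u)
print(u)  # deeper, slower variables now carry a share of the memory
```

Because the slow variables both absorb the trace and feed it back, the visible weight forgets far more slowly than a single fast variable would, which is the intuition behind the improved capacity scaling.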
44
Elliott T. Variations on the Theme of Synaptic Filtering: A Comparison of Integrate-and-Express Models of Synaptic Plasticity for Memory Lifetimes. Neural Comput 2016; 28:2393-2460. [PMID: 27626970 DOI: 10.1162/neco_a_00889] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Integrate-and-express models of synaptic plasticity propose that synapses integrate plasticity induction signals before expressing synaptic plasticity. By discerning trends in their induction signals, synapses can control destabilizing fluctuations in synaptic strength. In a feedforward perceptron framework with binary-strength synapses for associative memory storage, we have previously shown that such a filter-based model outperforms other, nonintegrative, "cascade"-type models of memory storage in most regions of biologically relevant parameter space. Here, we consider some natural extensions of our earlier filter model, including one specifically tailored to binary-strength synapses and one that demands a fixed, consecutive number of same-type induction signals rather than merely an excess before expressing synaptic plasticity. With these extensions, we show that filter-based models outperform nonintegrative models in all regions of biologically relevant parameter space except for a small sliver in which all models encode memories only weakly. In this sliver, which model is superior depends on the metric used to gauge memory lifetimes (whether a signal-to-noise ratio or a mean first passage time). After comparing and contrasting these various filter models, we discuss the multiple mechanisms and timescales that underlie both synaptic plasticity and memory phenomena and suggest that multiple, different filtering mechanisms may operate at single synapses.
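The filtering mechanism is easy to state concretely: induction signals accumulate in a hidden counter, and the synapse flips its binary strength only when the counter reaches a threshold, then resets. A minimal sketch of an excess-style filter (threshold and signal sequences are illustrative):

```python
def filtered_synapse(inductions, theta=3, strength=0):
    """Binary-strength synapse with an integrate-and-express filter.
    Plasticity induction signals (+1 for LTP, -1 for LTD) accumulate in
    a counter; strength changes only when |counter| reaches theta."""
    counter = 0
    for s in inductions:
        counter += s
        if counter >= theta:
            strength, counter = 1, 0    # express potentiation, reset
        elif counter <= -theta:
            strength, counter = 0, 0    # express depression, reset
    return strength

# Noisy, alternating signals never reach threshold: no change expressed.
print(filtered_synapse([+1, -1, +1, -1, +1, -1]))   # 0
# A consistent trend does: the filter discerns it and expresses LTP.
print(filtered_synapse([+1, -1, +1, +1, +1]))        # 1
```

The "fixed, consecutive" variant mentioned above would instead reset the counter whenever the signal type changes, so only an unbroken run of same-type inductions is expressed.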
Affiliation(s)
- Terry Elliott
- Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton, SO17 1BJ, U.K.
45
Li Y, Kulvicius T, Tetzlaff C. Induction and Consolidation of Calcium-Based Homo- and Heterosynaptic Potentiation and Depression. PLoS One 2016; 11:e0161679. [PMID: 27560350 PMCID: PMC4999190 DOI: 10.1371/journal.pone.0161679] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/26/2015] [Accepted: 08/10/2016] [Indexed: 11/19/2022] Open
Abstract
The adaptive mechanisms of homo- and heterosynaptic plasticity play an important role in learning and memory. In order to maintain plasticity-induced changes for longer time scales (up to several days), they have to be consolidated by transferring them from a short-lasting early-phase to a long-lasting late-phase state. The underlying processes of this synaptic consolidation are already well known for homosynaptic plasticity; however, it is not clear whether the same processes also enable the induction and consolidation of heterosynaptic plasticity. In this study, by extending a generic calcium-based plasticity model with the processes of synaptic consolidation, we show in simulations that heterosynaptic plasticity can indeed be induced and consolidated by the same underlying processes as homosynaptic plasticity. We also show that local diffusion processes can restrict the heterosynaptic effect to a few synapses neighboring the homosynaptically changed ones. Taken together, this generic model reproduces many experimental results of synaptic tagging and consolidation, provides several predictions for heterosynaptic induction and consolidation, and yields insights into the complex interactions between homo- and heterosynaptic plasticity over a broad range of temporal (minutes to days) and spatial (several micrometers) scales.
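The induction step of a generic calcium-based rule can be sketched with two thresholds: calcium above a high threshold drives potentiation, and calcium between a lower and the high threshold drives depression. This is a toy rule in the spirit of such models, not the paper's equations; weak, diffusion-attenuated calcium at neighboring synapses then naturally yields heterosynaptic depression:

```python
def calcium_update(w, ca_trace, theta_d=0.4, theta_p=0.8,
                   gamma_p=0.05, gamma_d=0.02, w_min=0.0, w_max=1.0):
    """Update a weight from a sampled calcium trace: potentiate above
    theta_p, depress between theta_d and theta_p, do nothing below."""
    for ca in ca_trace:
        if ca >= theta_p:
            w += gamma_p * (w_max - w)     # soft-bounded LTP
        elif ca >= theta_d:
            w -= gamma_d * (w - w_min)     # soft-bounded LTD
    return w

# Strong (homosynaptic) calcium transients mostly potentiate...
w_homo = calcium_update(0.5, [0.9] * 50)
# ...while weaker, diffusion-attenuated (heterosynaptic) ones depress...
w_hetero = calcium_update(0.5, [0.5] * 50)
# ...and distant synapses, below both thresholds, are untouched.
w_far = calcium_update(0.5, [0.1] * 50)
print(w_homo, w_hetero, w_far)
```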
Affiliation(s)
- Yinyun Li
- III. Institute of Physics – Biophysics, Georg-August-University, 37077 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, 37077 Göttingen, Germany
- School of System Science, Beijing Normal University, 100875 Beijing, China
- Tomas Kulvicius
- III. Institute of Physics – Biophysics, Georg-August-University, 37077 Göttingen, Germany
- Maersk Mc-Kinney Moller Institute, University of Southern Denmark, 5230 Odense, Denmark
- Christian Tetzlaff
- Bernstein Center for Computational Neuroscience, Georg-August-University, 37077 Göttingen, Germany
- Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany
- Department of Neurobiology, Weizmann Institute of Science, 76100 Rehovot, Israel
46
Elliott T. The Enhanced Rise and Delayed Fall of Memory in a Model of Synaptic Integration: Extension to Discrete State Synapses. Neural Comput 2016; 28:1927-84. [PMID: 27391686 DOI: 10.1162/neco_a_00867] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Integrate-and-express models of synaptic plasticity propose that synapses may act as low-pass filters, integrating synaptic plasticity induction signals in order to discern trends before expressing synaptic plasticity. We have previously shown that synaptic filtering strongly controls destabilizing fluctuations in developmental models. When applied to palimpsest memory systems that learn new memories by forgetting old ones, we have also shown that with binary-strength synapses, integrative synapses lead to an initial memory signal rise before its fall back to equilibrium. Such an initial rise is in dramatic contrast to nonintegrative synapses, in which the memory signal falls monotonically. We now extend our earlier analysis of palimpsest memories with synaptic filters to consider the more general case of discrete state, multilevel synapses. We derive exact results for the memory signal dynamics and then consider various simplifying approximations. We show that multilevel synapses enhance the initial rise in the memory signal and then delay its subsequent fall by inducing a plateau-like region in the memory signal. Such dynamics significantly increase memory lifetimes, defined by a signal-to-noise ratio (SNR). We derive expressions for optimal choices of synaptic parameters (filter size, number of strength states, number of synapses) that maximize SNR memory lifetimes. However, we find that with memory lifetimes defined via mean-first-passage times, such optimality conditions do not exist, suggesting that optimality may be an artifact of SNRs.
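The palimpsest setting itself is easy to simulate: binary synapses are repeatedly overwritten by new random patterns, and the tracked memory's signal, its re-centered overlap with the synaptic state, decays as later memories are stored. A minimal non-integrative baseline (our own construction, without the paper's filters or multilevel states):

```python
import numpy as np

rng = np.random.default_rng(0)
n_syn, n_mem = 2000, 40            # synapses; memories stored in sequence

w = rng.integers(0, 2, n_syn)                   # binary strengths
patterns = rng.integers(0, 2, (n_mem, n_syn))   # desired synaptic states

signals = []
for t in range(n_mem):
    touched = rng.random(n_syn) < 0.5           # each memory rewrites a
    w = np.where(touched, patterns[t], w)       # random half of the synapses
    # tracked signal of memory 0: re-centered overlap with its pattern
    signals.append(2.0 * np.mean(w == patterns[0]) - 1.0)

print(signals[0], signals[-1])   # strong at storage, then decays toward zero
```

In this non-integrative baseline the signal falls monotonically from the moment of storage; the filtered and multilevel synapses analyzed above instead produce an initial rise and a plateau before the fall.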
Affiliation(s)
- Terry Elliott
- Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton, SO17 1BJ, U.K.
47
Brea J, Gaál AT, Urbanczik R, Senn W. Prospective Coding by Spiking Neurons. PLoS Comput Biol 2016; 12:e1005003. [PMID: 27341100 PMCID: PMC4920376 DOI: 10.1371/journal.pcbi.1005003] [Citation(s) in RCA: 26] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/25/2015] [Accepted: 06/01/2016] [Indexed: 11/18/2022] Open
Abstract
Animals learn to make predictions, such as associating the sound of a bell with upcoming feeding or predicting a movement that a motor command is eliciting. How predictions are realized on the neuronal level and what plasticity rule underlies their learning is not well understood. Here we propose a biologically plausible synaptic plasticity rule to learn predictions on a single neuron level on a timescale of seconds. The learning rule allows a spiking two-compartment neuron to match its current firing rate to its own expected future discounted firing rate. For instance, if an originally neutral event is repeatedly followed by an event that elevates the firing rate of a neuron, the originally neutral event will eventually also elevate the neuron's firing rate. The plasticity rule is a form of spike timing dependent plasticity in which a presynaptic spike followed by a postsynaptic spike leads to potentiation. Even if the plasticity window has a width of 20 milliseconds, associations on the time scale of seconds can be learned. We illustrate prospective coding with three examples: learning to predict a time varying input, learning to predict the next stimulus in a delayed paired-associate task and learning with a recurrent network to reproduce a temporally compressed version of a sequence. We discuss the potential role of the learning mechanism in classical trace conditioning. In the special case that the signal to be predicted encodes reward, the neuron learns to predict the discounted future reward and learning is closely related to the temporal difference learning algorithm TD(λ).
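The connection to temporal difference learning can be made concrete with tabular TD(λ), where an eligibility trace lets value propagate backward from a rewarded event to an earlier neutral cue, much as the neuron described above learns to fire to the originally neutral event. Parameter values are illustrative:

```python
import numpy as np

def td_lambda(episodes, n_states=3, alpha=0.3, gamma=0.9, lam=0.8):
    """Tabular TD(lambda): learn state values from (state, reward,
    next_state) transitions; next_state is None at episode end."""
    V = np.zeros(n_states)
    for episode in episodes:
        e = np.zeros(n_states)                 # eligibility traces
        for (s, r, s_next) in episode:
            target = r + (gamma * V[s_next] if s_next is not None else 0.0)
            delta = target - V[s]              # prediction error
            e[s] += 1.0                        # mark the visited state
            V += alpha * delta * e             # credit all eligible states
            e *= gamma * lam                   # traces decay
    return V

# Cue (state 0) -> delay (state 1) -> outcome (state 2) with reward 1.
episode = [(0, 0.0, 1), (1, 0.0, 2), (2, 1.0, None)]
V = td_lambda([episode] * 200)
print(V)  # the neutral cue acquires a discounted prediction of the reward
```

In the paper's special case where the predicted signal encodes reward, the neuron's firing rate plays the role of `V` and the synaptic rule plays the role of the trace-weighted update.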
Affiliation(s)
- Johanni Brea
- Department of Physiology, University of Bern, Bern, Switzerland
- School of Computer and Communication Sciences and School of Life Sciences, Brain-Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Alexisz Tamás Gaál
- Department of Physiology, University of Bern, Bern, Switzerland
- Courant Institute of Mathematical Sciences, New York University, New York, New York, United States of America
- Walter Senn
- Department of Physiology, University of Bern, Bern, Switzerland
- Center for Cognition, Learning and Memory, University of Bern, Bern, Switzerland
48
Tully PJ, Lindén H, Hennig MH, Lansner A. Spike-Based Bayesian-Hebbian Learning of Temporal Sequences. PLoS Comput Biol 2016; 12:e1004954. [PMID: 27213810 PMCID: PMC4877102 DOI: 10.1371/journal.pcbi.1004954] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2015] [Accepted: 04/28/2016] [Indexed: 11/25/2022] Open
Abstract
Many cognitive and motor functions are enabled by the temporal representation and processing of stimuli, but it remains an open issue how neocortical microcircuits can reliably encode and replay such sequences of information. To better understand this, a modular attractor memory network is proposed in which meta-stable sequential attractor transitions are learned through changes to synaptic weights and intrinsic excitabilities via the spike-based Bayesian Confidence Propagation Neural Network (BCPNN) learning rule. We find that the formation of distributed memories, embodied by increased periods of firing in pools of excitatory neurons, together with asymmetrical associations between these distinct network states, can be acquired through plasticity. The model’s feasibility is demonstrated using simulations of adaptive exponential integrate-and-fire model neurons (AdEx). We show that the learning and speed of sequence replay depends on a confluence of biophysically relevant parameters including stimulus duration, level of background noise, ratio of synaptic currents, and strengths of short-term depression and adaptation. Moreover, sequence elements are shown to flexibly participate multiple times in the sequence, suggesting that spiking attractor networks of this type can support an efficient combinatorial code. The model provides a principled approach towards understanding how multiple interacting plasticity mechanisms can coordinate hetero-associative learning in unison. From one moment to the next, in an ever-changing world, and awash in a deluge of sensory data, the brain fluidly guides our actions throughout an astonishing variety of tasks. Processing this ongoing bombardment of information is a fundamental problem faced by its underlying neural circuits. 
Given that the structure of our actions along with the organization of the environment in which they are performed can be intuitively decomposed into sequences of simpler patterns, an encoding strategy reflecting the temporal nature of these patterns should offer an efficient approach for assembling more complex memories and behaviors. We present a model that demonstrates how activity could propagate through recurrent cortical microcircuits as a result of a learning rule based on neurobiologically plausible time courses and dynamics. The model predicts that the interaction between several learning and dynamical processes constitute a compound mnemonic engram that can flexibly generate sequential step-wise increases of activity within neural populations.
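At its core, the BCPNN weight is a log-odds of co-activation, w_ij = log(p_ij / (p_i p_j)). The paper estimates these probabilities with online exponentially filtered spike traces; the sketch below uses batch averages over binary activity instead, with a small epsilon to avoid taking the log of zero:

```python
import numpy as np

def bcpnn_weights(activity, eps=1e-3):
    """Bayesian Confidence Propagation weights from binary activity.
    activity: (timesteps, units) array of 0/1 firing. Returns a
    (units, units) matrix w_ij = log(p_ij / (p_i * p_j))."""
    p_i = np.clip(activity.mean(axis=0), eps, 1.0)
    p_ij = np.clip(
        (activity[:, :, None] * activity[:, None, :]).mean(axis=0),
        eps, 1.0)
    return np.log(p_ij / np.outer(p_i, p_i))

# Units 0 and 1 fire together; unit 2 fires on its own.
a = np.array([[1, 1, 0],
              [1, 1, 0],
              [0, 0, 1],
              [0, 0, 1]])
w = bcpnn_weights(a)
print(w[0, 1], w[0, 2])  # positive for the correlated pair, negative otherwise
```

In the sequence-learning model, asymmetric filtering of pre- and postsynaptic traces makes p_ij directional, so the weights encode which attractor tends to follow which.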
49
Kastner DB, Schwalger T, Ziegler L, Gerstner W. A Model of Synaptic Reconsolidation. Front Neurosci 2016; 10:206. [PMID: 27242410 PMCID: PMC4870270 DOI: 10.3389/fnins.2016.00206] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/14/2015] [Accepted: 04/25/2016] [Indexed: 11/18/2022] Open
Abstract
Reconsolidation of memories has mostly been studied at the behavioral and molecular level. Here, we put forward a simple extension of existing computational models of synaptic consolidation to capture hippocampal slice experiments that have been interpreted as reconsolidation at the synaptic level. The model implements reconsolidation through stabilization of consolidated synapses by stabilizing entities combined with an activity-dependent reservoir of stabilizing entities that are immune to protein synthesis inhibition (PSI). We derive a reduced version of our model to explore the conditions under which synaptic reconsolidation does or does not occur, often referred to as the boundary conditions of reconsolidation. We find that our computational model of synaptic reconsolidation displays complex boundary conditions. Our results suggest that a limited resource of hypothetical stabilizing molecules or complexes, which may be implemented by protein phosphorylation or different receptor subtypes, can underlie the phenomenon of synaptic reconsolidation.
Affiliation(s)
- David B Kastner
- School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Tilo Schwalger
- School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Lorric Ziegler
- School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Wulfram Gerstner
- School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
50
Frémaux N, Gerstner W. Neuromodulated Spike-Timing-Dependent Plasticity, and Theory of Three-Factor Learning Rules. Front Neural Circuits 2016; 9:85. [PMID: 26834568 PMCID: PMC4717313 DOI: 10.3389/fncir.2015.00085] [Citation(s) in RCA: 138] [Impact Index Per Article: 17.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/12/2015] [Accepted: 12/14/2015] [Indexed: 11/13/2022] Open
Abstract
Classical Hebbian learning puts the emphasis on joint pre- and postsynaptic activity, but neglects the potential role of neuromodulators. Since neuromodulators convey information about novelty or reward, the influence of neuromodulators on synaptic plasticity is useful not just for action learning in classical conditioning, but also to decide "when" to create new memories in response to a flow of sensory stimuli. In this review, we focus on timing requirements for pre- and postsynaptic activity in conjunction with one or several phasic neuromodulatory signals. While the emphasis of the text is on conceptual models and mathematical theories, we also discuss some experimental evidence for neuromodulation of Spike-Timing-Dependent Plasticity. We highlight the importance of synaptic mechanisms in bridging the temporal gap between sensory stimulation and neuromodulatory signals, and develop a framework for a class of neo-Hebbian three-factor learning rules that depend on presynaptic activity, postsynaptic variables as well as the influence of neuromodulators.
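The skeleton of such a neo-Hebbian three-factor rule fits in a few lines: coincident pre- and postsynaptic activity sets a decaying eligibility trace, and the weight changes only when a neuromodulatory signal arrives while the trace is still alive, bridging the gap between activity and delayed reward. Time constants and rates here are illustrative:

```python
import numpy as np

def three_factor(pre, post, modulator, lr=0.1, tau_e=20.0, dt=1.0):
    """Eligibility-trace learning rule: Hebbian coincidences are stored
    in a trace e that decays with time constant tau_e; the weight only
    changes as lr * modulator * e (the third factor gates learning)."""
    w, e = 0.0, 0.0
    for t in range(len(pre)):
        e += pre[t] * post[t]              # Hebbian coincidence -> trace
        e *= np.exp(-dt / tau_e)           # trace decays over time
        w += lr * modulator[t] * e         # gated weight change
    return w

T = 100
pre = np.zeros(T); post = np.zeros(T)
pre[10] = post[10] = 1.0                    # coincident pre/post spike

mod = np.zeros(T); mod[25] = 1.0            # reward 15 steps later
print(three_factor(pre, post, mod))         # > 0: trace bridges the delay

mod_late = np.zeros(T); mod_late[95] = 1.0  # reward far outside the trace
print(three_factor(pre, post, mod_late))    # ~ 0: trace has decayed away
```

With no modulator at all the weight never changes, which is exactly the "when to learn" gating the review emphasizes.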
Affiliation(s)
- Nicolas Frémaux
- School of Computer Science and Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Wulfram Gerstner
- School of Computer Science and Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
|