1. Boyer M, Baudin P, Stengel C, Valero-Cabré A, Lohof AM, Charpier S, Sherrard RM, Mahon S. In vivo low-intensity magnetic pulses durably alter neocortical neuron excitability and spontaneous activity. J Physiol 2022; 600:4019-4037. [PMID: 35899578] [DOI: 10.1113/jp283244] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3]
Abstract
KEY POINTS: Repetitive transcranial magnetic stimulation (rTMS) is a promising technique to alleviate neurological and psychiatric disorders caused by alterations in cortical activity. Our knowledge of the cellular mechanisms underlying rTMS-based therapies remains limited. We combined in vivo focal application of low-intensity rTMS (LI-rTMS) to the rat somatosensory cortex with intracellular recordings of subjacent pyramidal neurons to characterize the effects of weak magnetic fields at the single-cell level. Ten minutes of LI-rTMS delivered at 10 Hz reliably evoked action potentials in cortical neurons during the stimulation period and induced a durable attenuation of their intrinsic excitability, synaptic activity, and spontaneous firing. These results improve our understanding of the mechanisms of weak magnetic stimulation and should help optimize the effectiveness of stimulation protocols for clinical use.

ABSTRACT: Magnetic brain stimulation is a promising treatment for neurological and psychiatric disorders. However, a better understanding of its effects at the level of individual neurons is essential to improve its clinical application. We combined focal low-intensity repetitive transcranial magnetic stimulation (LI-rTMS) of the rat somatosensory cortex with intracellular recordings of subjacent pyramidal neurons in vivo. Continuous 10 Hz LI-rTMS reliably evoked firing at ∼4-5 Hz during the stimulation period and induced a durable attenuation of synaptic activity and spontaneous firing in cortical neurons, through membrane hyperpolarization and reduced intrinsic excitability. However, inducing firing in individual neurons by repeated intracellular current injection did not reproduce the effects of LI-rTMS on neuronal properties. These data provide new insight into the mechanisms underlying magnetic brain stimulation, showing that, in addition to inducing biochemical plasticity, even weak magnetic fields can activate neurons and enduringly modulate their excitability.

ABSTRACT FIGURE LEGEND: Using in vivo intracellular recordings in the rodent, we examined the effects of low-intensity (10 mT) repetitive transcranial magnetic stimulation (LI-rTMS) on the functional properties of primary somatosensory cortex pyramidal neurons. After a baseline period during which cortical spontaneous activity and excitability were measured (Pre), LI-rTMS was applied at 10 Hz for 10 minutes. Despite their low intensity, the magnetic pulses reliably evoked action potentials in cortical neurons. Ten minutes of LI-rTMS induced a progressive and long-lasting hyperpolarization of the neuronal membrane and a marked decrease in cell firing rate (Post). This was associated with altered intrinsic neuronal excitability, characterized by a reduced membrane input resistance and an increased minimal current required to induce neuronal firing. A portion of this figure was created with biorender.com.
Affiliation(s)
- Manon Boyer: IBPS-B2A, UMR 8256 Biological Adaptation and Ageing, Sorbonne Université & CNRS, Paris, 75005, France; Paris Brain Institute-ICM, INSERM, CNRS, APHP, Pitié-Salpêtrière Hospital, team 'Network Dynamics and cellular excitability', Sorbonne Université, Paris, 75013, France
- Paul Baudin: Paris Brain Institute-ICM, INSERM, CNRS, APHP, Pitié-Salpêtrière Hospital, team 'Network Dynamics and cellular excitability', Sorbonne Université, Paris, 75013, France
- Chloé Stengel: Paris Brain Institute-ICM, INSERM, CNRS, Pitié-Salpêtrière Hospital, Cerebral Dynamics, Plasticity and Rehabilitation Group, FRONTLAB team, Sorbonne Université, Paris, 75013, France
- Antoni Valero-Cabré: Paris Brain Institute-ICM, INSERM, CNRS, Pitié-Salpêtrière Hospital, Cerebral Dynamics, Plasticity and Rehabilitation Group, FRONTLAB team, Sorbonne Université, Paris, 75013, France
- Ann M Lohof: IBPS-B2A, UMR 8256 Biological Adaptation and Ageing, Sorbonne Université & CNRS, Paris, 75005, France
- Stéphane Charpier: Paris Brain Institute-ICM, INSERM, CNRS, APHP, Pitié-Salpêtrière Hospital, team 'Network Dynamics and cellular excitability', Sorbonne Université, Paris, 75013, France
- Rachel M Sherrard: IBPS-B2A, UMR 8256 Biological Adaptation and Ageing, Sorbonne Université & CNRS, Paris, 75005, France
- Séverine Mahon: Paris Brain Institute-ICM, INSERM, CNRS, APHP, Pitié-Salpêtrière Hospital, team 'Network Dynamics and cellular excitability', Sorbonne Université, Paris, 75013, France
2. Sarazin MXB, Victor J, Medernach D, Naudé J, Delord B. Online Learning and Memory of Neural Trajectory Replays for Prefrontal Persistent and Dynamic Representations in the Irregular Asynchronous State. Front Neural Circuits 2021; 15:648538. [PMID: 34305535] [PMCID: PMC8298038] [DOI: 10.3389/fncir.2021.648538] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5]
Abstract
In the prefrontal cortex (PFC), higher-order cognitive functions and adaptive flexible behaviors rely on continuous dynamical sequences of spiking activity that constitute neural trajectories in the state space of activity. Neural trajectories subserve diverse representations, from explicit mappings in physical space to generalized mappings in task space, up to complex abstract transformations such as working memory, decision-making and behavioral planning. Computational models have assessed learning and replay of neural trajectories separately, often using unrealistic learning rules or decoupling the simulations used for learning from those used for replay. It therefore remains an open question how neural trajectories are learned, memorized and replayed online, with permanently acting biological plasticity rules. The asynchronous irregular regime that characterizes cortical dynamics in awake conditions constitutes a major source of disorder that may jeopardize plasticity and the replay of locally ordered activity. Here, we show that a recurrent model of local PFC circuitry endowed with realistic synaptic spike-timing-dependent plasticity and scaling processes can learn, memorize and replay large neural trajectories online under asynchronous irregular dynamics, at a regular or fast (sped-up) timescale. Presented trajectories are learned quickly (within seconds) as synaptic engrams in the network, and the model is able to chunk overlapping trajectories presented separately. These trajectory engrams last long-term (dozens of hours) and trajectory replays can be triggered for over an hour. In turn, we show the conditions under which trajectory engrams and replays preserve asynchronous irregular dynamics in the network. Functionally, spiking activity during trajectory replays at the regular timescale accounts for dynamical coding with temporal tuning in individual neurons, persistent activity at the population level, and the large levels of variability consistent with observed cognition-related PFC dynamics. Together, these results offer a consistent theoretical framework for how neural trajectories can be learned, memorized and replayed in PFC circuits to subserve flexible dynamic representations and adaptive behaviors.
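For orientation, the following minimal sketch (not the authors' model) shows how the two plasticity processes named above, online pair-based spike-timing-dependent plasticity and multiplicative synaptic scaling, can act together in a small recurrent leaky integrate-and-fire network driven by noisy input; the network size and all parameter values are illustrative assumptions.

```python
# Minimal sketch (not the authors' model): online pair-based STDP combined with
# multiplicative synaptic scaling in a small recurrent LIF network driven by
# noisy input. All parameter values and the network size are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, dt, steps = 100, 1e-3, 2000          # neurons, time step (s), number of steps
tau_m, v_th, v_reset = 20e-3, 1.0, 0.0  # membrane time constant (s), threshold, reset
tau_stdp = 20e-3                        # time constant of pre/post spike traces (s)
A_plus, A_minus = 5e-3, 6e-3            # potentiation / depression amplitudes
w_max, w_row_target = 0.5, 2.0          # weight bound, per-neuron target summed input

W = rng.uniform(0.0, 0.05, (N, N))      # recurrent weights (excitatory only, for brevity)
np.fill_diagonal(W, 0.0)
v = rng.uniform(0.0, 1.0, N)            # membrane potentials
x_pre = np.zeros(N)                     # presynaptic spike traces
x_post = np.zeros(N)                    # postsynaptic spike traces
spikes = np.zeros(N, dtype=bool)
n_spikes = 0

for t in range(steps):
    # leaky integration of recurrent input plus noisy external drive
    I = W @ spikes.astype(float) + rng.normal(0.04, 0.05, N)
    v += dt / tau_m * (-v) + I
    spikes = v >= v_th
    v[spikes] = v_reset
    n_spikes += int(spikes.sum())

    # exponentially decaying spike traces used by the pair-based STDP rule
    decay = np.exp(-dt / tau_stdp)
    x_pre *= decay
    x_post *= decay
    x_pre[spikes] += 1.0
    x_post[spikes] += 1.0

    # STDP: pre-before-post potentiates (rows = postsynaptic), post-before-pre depresses
    W[spikes, :] += A_plus * x_pre
    W[:, spikes] -= A_minus * x_post[:, None]
    np.clip(W, 0.0, w_max, out=W)

    # slow multiplicative scaling keeps each neuron's summed input weight near target
    row_sums = W.sum(axis=1, keepdims=True) + 1e-12
    W *= 1.0 + 1e-3 * (w_row_target / row_sums - 1.0)

print("mean weight: %.3f   mean firing rate: %.1f Hz"
      % (W.mean(), n_spikes / (N * steps * dt)))
```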
Affiliation(s)
- Matthieu X B Sarazin: Institut des Systèmes Intelligents et de Robotique, CNRS, Inserm, Sorbonne Université, Paris, France
- Julie Victor: CEA Paris-Saclay, CNRS, NeuroSpin, Saclay, France
- David Medernach: Institut des Systèmes Intelligents et de Robotique, CNRS, Inserm, Sorbonne Université, Paris, France
- Jérémie Naudé: Neuroscience Paris Seine - Institut de biologie Paris Seine, CNRS, Inserm, Sorbonne Université, Paris, France
- Bruno Delord: Institut des Systèmes Intelligents et de Robotique, CNRS, Inserm, Sorbonne Université, Paris, France
3. Manninen T, Aćimović J, Havela R, Teppola H, Linne ML. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures. Front Neuroinform 2018; 12:20. [PMID: 29765315] [PMCID: PMC5938413] [DOI: 10.3389/fninf.2018.00020] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.9]
Abstract
The possibility of replicating and reproducing published research results is one of the biggest challenges in all areas of science. In computational neuroscience, thousands of models are available. However, it is rarely possible to reimplement a model based on the information in the original publication, let alone to rerun it, simply because the model implementation has not been made publicly available. We evaluate and discuss the comparability of a diverse selection of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. Replicability and reproducibility issues are considered for an equally diverse set of computational models, including models of intracellular signal transduction in neurons and glial cells, as well as single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address how comparable the simulation results are with one another, to assess whether the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need to develop recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of any tool used to simulate and develop computational models. Constant improvements in experimental techniques and recording protocols lead to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, and enabling the progressive building of advanced computational models and tools, can be achieved only by adopting publishing standards that emphasize the replicability and reproducibility of research results.
Affiliation(s)
- Tiina Manninen: Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland; Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
- Jugoslava Aćimović: Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland; Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
- Riikka Havela: Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland; Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
- Heidi Teppola: Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland; Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
- Marja-Leena Linne: Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland; Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
4. Higgins D, Graupner M, Brunel N. Memory maintenance in synapses with calcium-based plasticity in the presence of background activity. PLoS Comput Biol 2014; 10:e1003834. [PMID: 25275319] [PMCID: PMC4183374] [DOI: 10.1371/journal.pcbi.1003834] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.8]
Abstract
Most models of learning and memory assume that memories are maintained in neuronal circuits by persistent synaptic modifications induced by specific patterns of pre- and postsynaptic activity. For this scenario to be viable, synaptic modifications must survive the ubiquitous ongoing activity present in neural circuits in vivo. In this paper, we investigate the time scales of memory maintenance in a calcium-based synaptic plasticity model that has recently been shown to fit different experimental data sets from hippocampal and neocortical preparations. We find that, in the presence of background activity on the order of 1 Hz, parameters that fit data from layer 5 neocortical pyramidal cells lead to a very fast decay of synaptic efficacy, with time scales of minutes. We then identify two ways in which this memory time scale can be extended: (i) the extracellular calcium concentration in the experiments used to fit the model is larger than estimated concentrations in vivo; lowering the extracellular calcium concentration to in vivo levels increases memory time scales by several orders of magnitude; (ii) adding a bistability mechanism, so that each synapse has two stable states at sufficiently low background activity, leads to a further boost in memory time scale, since memory decay is no longer described by an exponential decay from an initial state but by an escape from a potential well. We argue that both features are expected to be present in synapses in vivo. These results are obtained first in a single synapse connecting two independent Poisson neurons, and then in simulations of a large network of excitatory and inhibitory integrate-and-fire neurons. Our results emphasise the need to study plasticity at physiological extracellular calcium concentrations, and highlight the role of synaptic bi- or multistability in the stability of learned synaptic structures.

Synaptic plasticity is widely believed to be the main mechanism underlying learning and memory. In recent years, several mathematical plasticity rules have been shown to fit a wide range of experimental data from hippocampal and neocortical in vitro preparations satisfactorily. In particular, a model in which plasticity is driven by the postsynaptic calcium concentration was shown to successfully reproduce how synaptic changes depend on spike timing, specific spike patterns, and firing rate. The advantage of calcium-based rules is the possibility of predicting how changes in extracellular concentrations will affect plasticity. This is particularly significant given that in vitro studies are typically done at higher concentrations than those measured in vivo. Using such a rule, with parameters fitting in vitro data, we explore how long the memory of a particular synaptic change can be maintained in the presence of the background neuronal activity ubiquitously observed in cortex. We find that memory time scales increase by several orders of magnitude when calcium concentrations are lowered from those of typical in vitro experiments to in vivo levels. Furthermore, we find that synaptic bistability further extends the memory time scale, and we estimate that synaptic changes in vivo could be stable on the scale of weeks to months.
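As an illustration of the class of rule studied here, the sketch below simulates a generic calcium-based plasticity rule: pre- and postsynaptic Poisson spikes at background rates evoke calcium transients, and a synaptic efficacy variable drifts whenever calcium crosses a depression or potentiation threshold, with a cubic term producing the bistability discussed in point (ii). All parameter values are illustrative assumptions, not the fitted values used in the paper.

```python
# Minimal sketch of a calcium-based plasticity rule of the kind studied here:
# pre- and postsynaptic Poisson spikes at background rates evoke calcium
# transients, and the synaptic efficacy rho drifts whenever calcium crosses a
# depression or potentiation threshold; a cubic term makes rho bistable.
# Parameter values are illustrative, not the paper's fitted values.
import numpy as np

rng = np.random.default_rng(1)
dt, T = 1e-3, 600.0                        # time step (s), simulated time (s)
rate_pre = rate_post = 1.0                 # background firing rates (Hz)
tau_ca, C_pre, C_post = 20e-3, 1.0, 2.0    # calcium decay (s) and jump amplitudes
theta_d, theta_p = 1.0, 1.3                # depression / potentiation thresholds
gamma_d, gamma_p = 200.0, 320.0            # depression / potentiation rates
tau_rho, rho_star = 150.0, 0.5             # efficacy time constant (s), unstable point
sigma = 2.8                                # activity-dependent noise amplitude

c, rho = 0.0, 1.0                          # start from a "potentiated" synapse
for step in range(int(T / dt)):
    c *= np.exp(-dt / tau_ca)              # calcium decay
    if rng.random() < rate_pre * dt:       # presynaptic spike
        c += C_pre
    if rng.random() < rate_post * dt:      # postsynaptic spike
        c += C_post
    above_p = float(c > theta_p)
    above_d = float(c > theta_d)
    drift = (-rho * (1.0 - rho) * (rho_star - rho)   # bistability (two stable states)
             + gamma_p * (1.0 - rho) * above_p       # potentiation above theta_p
             - gamma_d * rho * above_d)              # depression above theta_d
    noise = sigma * np.sqrt(above_p + above_d) * rng.normal()
    rho += dt / tau_rho * drift + np.sqrt(dt / tau_rho) * noise
    rho = min(max(rho, 0.0), 1.0)

print("efficacy after %.0f s of ~1 Hz background activity: %.2f" % (T, rho))
```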
Affiliation(s)
- David Higgins: IBENS, École Normale Supérieure, Paris, France; Departments of Statistics and Neurobiology, University of Chicago, Chicago, Illinois, United States of America
- Michael Graupner: Center for Neural Science, New York University, New York, New York, United States of America
- Nicolas Brunel: Departments of Statistics and Neurobiology, University of Chicago, Chicago, Illinois, United States of America
5. Effects of cellular homeostatic intrinsic plasticity on dynamical and computational properties of biological recurrent neural networks. J Neurosci 2013; 33:15032-43. [PMID: 24048833] [DOI: 10.1523/jneurosci.0870-13.2013] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.6]
Abstract
Homeostatic intrinsic plasticity (HIP) is a ubiquitous cellular mechanism regulating neuronal activity, cardinal for the proper functioning of nervous systems. In invertebrates, HIP is critical for orchestrating stereotyped activity patterns. The functional impact of HIP remains more obscure in vertebrate networks, where higher order cognitive processes rely on complex neural dynamics. The hypothesis has emerged that HIP might control the complexity of activity dynamics in recurrent networks, with important computational consequences. However, conflicting results about the causal relationships between cellular HIP, network dynamics, and computational performance have arisen from machine-learning studies. Here, we assess how cellular HIP effects translate into collective dynamics and computational properties in biological recurrent networks. We develop a realistic multiscale model including a generic HIP rule regulating the neuronal threshold with actual molecular signaling pathways kinetics, Dale's principle, sparse connectivity, synaptic balance, and Hebbian synaptic plasticity (SP). Dynamic mean-field analysis and simulations unravel that HIP sets a working point at which inputs are transduced by large derivative ranges of the transfer function. This cellular mechanism ensures increased network dynamics complexity, robust balance with SP at the edge of chaos, and improved input separability. Although critically dependent upon balanced excitatory and inhibitory drives, these effects display striking robustness to changes in network architecture, learning rates, and input features. Thus, the mechanism we unveil might represent a ubiquitous cellular basis for complex dynamics in neural networks. Understanding this robustness is an important challenge to unraveling principles underlying self-organization around criticality in biological recurrent neural networks.
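A much simplified sketch of the kind of rule described above is given below: in a sparse random rate network with separate excitatory and inhibitory populations, each unit's activation threshold drifts so that its time-averaged rate approaches a target. The molecular signaling kinetics, synaptic balance tuning and Hebbian synaptic plasticity of the paper's model are not reproduced; all parameter values are illustrative assumptions.

```python
# Much simplified sketch of a homeostatic intrinsic plasticity (HIP) rule: in a
# sparse random rate network obeying Dale's principle, each unit's activation
# threshold drifts so that its time-averaged rate approaches a target.
# Parameter values are illustrative; the paper's signaling kinetics and Hebbian
# synaptic plasticity are not modeled here.
import numpy as np

rng = np.random.default_rng(2)
N, p_conn, frac_exc = 200, 0.1, 0.8
N_e = int(frac_exc * N)                  # number of excitatory units
dt, steps = 1e-3, 20000                  # time step (s), number of steps
tau, gain = 10e-3, 4.0                   # rate time constant (s), transfer gain
r_target, tau_hip = 0.1, 5.0             # target mean rate, HIP time constant (s)

# sparse random weights; columns are excitatory (+) or inhibitory (-)
W = (rng.random((N, N)) < p_conn) * rng.random((N, N))
W[:, N_e:] *= -4.0                       # inhibitory columns, scaled for rough balance
W /= np.sqrt(p_conn * N)

theta = np.zeros(N)                      # adaptive thresholds (the HIP variable)
x = 0.1 * rng.random(N)                  # firing rates
r_avg = np.full(N, r_target)             # slow running average of each unit's rate

def transfer(u, theta):
    """Sigmoid transfer function with a per-unit threshold."""
    return 1.0 / (1.0 + np.exp(-gain * (u - theta)))

for t in range(steps):
    u = W @ x                                    # recurrent input
    x += dt / tau * (-x + transfer(u, theta))
    r_avg += dt / 1.0 * (x - r_avg)              # activity sensor (1 s averaging)
    theta += dt / tau_hip * (r_avg - r_target)   # HIP: too active -> raise threshold

print("mean rate %.3f (target %.3f), threshold mean %.2f, spread %.2f"
      % (x.mean(), r_target, theta.mean(), theta.std()))
```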
6. Mahon S, Charpier S. Bidirectional plasticity of intrinsic excitability controls sensory inputs efficiency in layer 5 barrel cortex neurons in vivo. J Neurosci 2012; 32:11377-89. [PMID: 22895720] [PMCID: PMC6621180] [DOI: 10.1523/jneurosci.0415-12.2012] [Citation(s) in RCA: 50] [Impact Index Per Article: 3.8]
Abstract
Responsiveness of cortical neurons to sensory inputs can be altered by experience and learning. While synaptic plasticity is generally proposed as the underlying cellular mechanism, the possible contribution of activity-dependent changes in intrinsic excitability remains poorly investigated. Here, we show that periods of rhythmic firing in rat barrel cortex layer 5 pyramidal neurons can trigger a long-lasting increase or decrease in their membrane excitability in vivo. Potentiation of cortical excitability consisted of increased firing in response to intracellular stimulation and a reduction in the threshold current for spike initiation. Conversely, depression of cortical excitability was evidenced by an augmented firing threshold leading to reduced current-evoked spiking. The direction of plasticity depended on the baseline level of spontaneous firing rate and cell excitability. We also found that changes in intrinsic excitability were accompanied by corresponding modifications in the effectiveness of sensory inputs. Potentiation and depression of cortical neuron excitability resulted, respectively, in an increased or decreased firing probability in response to whisker-evoked synaptic inputs, without modification of the synaptic strength itself. These data suggest that bidirectional intrinsic plasticity could play an important role in the experience-dependent refinement of sensory cortical networks.
Affiliation(s)
- Séverine Mahon: Centre de Recherche de l'Institut du Cerveau et de la Moelle épinière, Université Pierre et Marie Curie (UPMC), INSERM UMR-S 975, CNRS UMR 7225, Hôpital Pitié-Salpêtrière, F-75013, Paris, France
7. Naudé J, Paz JT, Berry H, Delord B. A theory of rate coding control by intrinsic plasticity effects. PLoS Comput Biol 2012; 8:e1002349. [PMID: 22275858] [PMCID: PMC3261921] [DOI: 10.1371/journal.pcbi.1002349] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.3]
Abstract
Intrinsic plasticity (IP) is a ubiquitous activity-dependent process regulating neuronal excitability and a cellular correlate of behavioral learning and neuronal homeostasis. Because IP is induced rapidly and maintained long-term, it likely represents a major determinant of adaptive collective neuronal dynamics. However, assessing the exact impact of IP has remained elusive. Indeed, it is extremely difficult to disentangle the complex non-linear interaction between IP effects, by which conductance changes alter neuronal activity, and IP rules, whereby activity modifies conductances via signaling pathways. Moreover, the mechanisms of the two major IP effects on firing rate, threshold and gain modulation, remain unknown. Here, using extensive simulations and sensitivity analysis of Hodgkin-Huxley models, we show that threshold and gain modulation are accounted for by maximal-conductance plasticity of conductances that lie in two separate domains of the parameter space, corresponding to sub- and supra-threshold conductances (i.e. activating below or above the spike-onset threshold potential). Analyzing equivalent integrate-and-fire models, we provide formal expressions for the sensitivities to conductance parameters, unraveling previously undescribed mechanisms governing IP effects. Our results generalize to the IP of other conductance parameters and allow strong inference for calcium-gated conductances, yielding a general picture that accounts for a large repertoire of experimental observations. The expressions we provide can be combined with IP rules in rate or spiking models, offering a general framework to systematically assess the computational consequences of the IP of pharmacologically identified conductances, with both fine-grained description and mathematical tractability. We provide an example of such an IP loop model addressing the important issue of the homeostatic regulation of spontaneous discharge. Because we make no assumptions about modification rules, the present theory is also relevant to other neural processes involving excitability changes, such as neuromodulation, development, aging and neural disorders.

Over the past decades, experimental and theoretical studies of the cellular basis of learning and memory have mainly focused on synaptic plasticity, the experience-dependent modification of synapses. However, behavioral learning has also been correlated with experience-dependent changes in non-synaptic voltage-dependent ion channels. This intrinsic plasticity changes the neuron's propensity to fire action potentials in response to synaptic inputs. A fundamental problem is thus to relate changes in the neuron's input-output function to modifications of voltage-gated conductances. Using a sensitivity analysis in biophysically realistic models, we describe a generic dichotomy between two classes of voltage-dependent ion channels. These two classes modify the threshold and the slope of the neuron's input-output relation, allowing neurons to regulate, respectively, the range of inputs they respond to and the gain of their response. We further provide analytical descriptions that illuminate the dynamical mechanisms underlying these effects and propose a concise and realistic framework for assessing the computational impact of intrinsic plasticity in neural network models. Our results account for a large repertoire of empirical observations and may illuminate functional changes that characterize development, aging and several neural diseases, which also involve changes in voltage-dependent ion channels.
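A toy illustration of the threshold/gain dichotomy discussed above (not the paper's Hodgkin-Huxley sensitivity analysis): in a leaky integrate-and-fire neuron, increasing the leak conductance, a subthreshold parameter, mainly shifts the rheobase of the f-I curve, whereas lengthening the refractory period, used here as a crude stand-in for spike-related suprathreshold currents, mainly reduces its slope. All values are in arbitrary units and are assumptions for illustration only.

```python
# Toy illustration (not the paper's Hodgkin-Huxley analysis) of the
# threshold/gain dichotomy in a leaky integrate-and-fire neuron. Arbitrary units.
import numpy as np

C = 1.0          # membrane capacitance
V_th = 20.0      # spike threshold (reset at 0)

def fI(I, g_L, t_ref):
    """Stationary firing rate of an LIF neuron driven by constant current I."""
    tau_m = C / g_L                       # membrane time constant
    rheobase = g_L * V_th                 # minimal current producing spikes
    I = np.asarray(I, dtype=float)
    rate = np.zeros_like(I)
    supra = I > rheobase
    rate[supra] = 1.0 / (t_ref + tau_m * np.log(I[supra] / (I[supra] - rheobase)))
    return rate

I = np.linspace(0.0, 10.0, 6)
print("I            :", I)
print("baseline f(I):", np.round(fI(I, g_L=0.10, t_ref=2.0), 3))
print("g_L doubled  :", np.round(fI(I, g_L=0.20, t_ref=2.0), 3))  # rheobase shift
print("t_ref tripled:", np.round(fI(I, g_L=0.10, t_ref=6.0), 3))  # gain reduction
```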
Affiliation(s)
- J. Naudé: Institut des Systèmes Intelligents et de Robotique, CNRS – UMR 7222, Université Pierre et Marie Curie (UPMC), Paris, France
- J. T. Paz: Department of Neurology & Neurological Sciences, Stanford University Medical Center, Stanford, California, United States of America
- H. Berry: Project-Team BEAGLE, INRIA Rhone-Alpes, LIRIS UMR5205, Université de Lyon, Lyon, France
- B. Delord: Institut des Systèmes Intelligents et de Robotique, CNRS – UMR 7222, Université Pierre et Marie Curie (UPMC), Paris, France
8. Modeling signal transduction leading to synaptic plasticity: evaluation and comparison of five models. EURASIP Journal on Bioinformatics & Systems Biology 2011; 2011:797250. [PMID: 21559300] [DOI: 10.1155/2011/797250] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.6]
Abstract
An essential phenomenon of the functional brain is synaptic plasticity, which is associated with changes in the strength of synapses between neurons. These changes are affected by both extracellular and intracellular mechanisms. For example, intracellular phosphorylation-dephosphorylation cycles have been shown to play a special role in synaptic plasticity. Here, we provide the first computational comparison of models for synaptic plasticity by evaluating five models describing postsynaptic signal transduction networks. Our simulation results show that some of the models change their behavior completely when the total concentrations of protein kinase and phosphatase are varied. Furthermore, the responses of the models differ when they are compared with one another. Based on our study, we conclude that there is a need for a general setup to objectively compare the models, and an urgent demand for minimum criteria that a computational model for synaptic plasticity needs to meet.
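For orientation, the sketch below integrates a single generic phosphorylation-dephosphorylation cycle with Michaelis-Menten kinetics to steady state for several total kinase concentrations, illustrating how strongly such a cycle's output can depend on total kinase and phosphatase levels. This is a textbook cycle, not one of the five compared models, and all rate constants and concentrations are arbitrary assumptions.

```python
# Minimal sketch of the kind of reaction these models contain: one generic
# phosphorylation-dephosphorylation cycle with Michaelis-Menten kinetics,
# integrated to steady state for several total kinase concentrations.
# Not one of the five compared models; rate constants are arbitrary.

S_total = 1.0                  # total substrate (phosphorylated + unphosphorylated)
P_total = 0.5                  # total phosphatase concentration
k_cat_K, K_m_K = 1.0, 0.1      # kinase catalytic rate and Michaelis constant
k_cat_P, K_m_P = 1.0, 0.1      # phosphatase catalytic rate and Michaelis constant
dt, steps = 0.01, 50000        # Euler time step and number of steps

for K_total in (0.1, 0.4, 0.5, 0.6, 1.0):    # total kinase concentration
    Sp = 0.0                                 # phosphorylated substrate
    for _ in range(steps):
        S = S_total - Sp
        phosphorylation = k_cat_K * K_total * S / (K_m_K + S)
        dephosphorylation = k_cat_P * P_total * Sp / (K_m_P + Sp)
        Sp += dt * (phosphorylation - dephosphorylation)
    print("total kinase %.1f -> steady-state phosphorylated fraction %.2f"
          % (K_total, Sp / S_total))
```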
9. Manninen T, Hituri K, Kotaleski JH, Blackwell KT, Linne ML. Postsynaptic signal transduction models for long-term potentiation and depression. Front Comput Neurosci 2010; 4:152. [PMID: 21188161] [PMCID: PMC3006457] [DOI: 10.3389/fncom.2010.00152] [Citation(s) in RCA: 42] [Impact Index Per Article: 2.8]
Abstract
More than a hundred biochemical species, activated by neurotransmitters binding to transmembrane receptors, are important in long-term potentiation (LTP) and long-term depression (LTD). To investigate which species and interactions are critical for synaptic plasticity, many computational postsynaptic signal transduction models have been developed. The models range from simple models with a single reversible reaction to detailed models with several hundred kinetic reactions. In this study, more than a hundred models are reviewed, and their features are compared and contrasted so that similarities and differences are more readily apparent. The models are classified according to the type of synaptic plasticity that is modeled (LTP or LTD) and whether they include diffusion or electrophysiological phenomena. Other characteristics that discriminate the models include the phase of synaptic plasticity modeled (induction, expression, or maintenance) and the simulation method used (deterministic or stochastic). We find that models are becoming increasingly sophisticated, by including stochastic properties, integrating with electrophysiological properties of entire neurons, or incorporating diffusion of signaling molecules. Simpler models continue to be developed because they are computationally efficient and allow theoretical analysis. The more complex models permit investigation of mechanisms underlying specific properties and experimental verification of model predictions. Nonetheless, it is difficult to fully comprehend the evolution of these models because (1) several models are not described in detail in the publications, (2) only a few models are provided in existing model databases, and (3) comparison to previous models is lacking. We conclude that the value of these models for understanding molecular mechanisms of synaptic plasticity is increasing and will be enhanced further with more complete descriptions and sharing of the published models.
Affiliation(s)
- Tiina Manninen: Department of Signal Processing, Tampere University of Technology, Tampere, Finland
10. Graupner M, Brunel N. Mechanisms of induction and maintenance of spike-timing dependent plasticity in biophysical synapse models. Front Comput Neurosci 2010; 4. [PMID: 20948584] [PMCID: PMC2953414] [DOI: 10.3389/fncom.2010.00136] [Citation(s) in RCA: 67] [Impact Index Per Article: 4.5]
Abstract
We review biophysical models of synaptic plasticity, with a focus on spike-timing dependent plasticity (STDP). The common property of the discussed models is that synaptic changes depend on the dynamics of the intracellular calcium concentration, which itself depends on pre- and postsynaptic activity. We start by discussing simple models in which plasticity changes are based directly on calcium amplitude and dynamics. We then consider models in which dynamic intracellular signaling cascades form the link between the calcium dynamics and the plasticity changes. Both mechanisms of induction of STDP (through the ability of pre/postsynaptic spikes to evoke changes in the state of the synapse) and of maintenance of the evoked changes (through bistability) are discussed.
Affiliation(s)
- Michael Graupner: Center for Neural Science, New York University, New York City, NY, USA
11. Manninen T, Hituri K, Toivari E, Linne ML. Modeling signal transduction in synaptic plasticity: comparison of models and methods. BMC Neurosci 2010. [PMCID: PMC3090899] [DOI: 10.1186/1471-2202-11-s1-p190] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
12. Dembrow NC, Pettit DL, Zakon HH. Calcium dynamics encode the magnitude of a graded memory underlying sensorimotor adaptation. J Neurophysiol 2010; 103:2372-81. [PMID: 20181728] [DOI: 10.1152/jn.00109.2010] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1]
Abstract
The role of Ca(2+) in the induction of neural correlates of memory has frequently been described in binary terms despite the fact that many forms of memory are graded in their strength and/or persistence. We find that Ca(2+) dynamics encode the magnitude of sensorimotor adaptation of the electromotor output in a weakly electric fish. The neural correlate of this memory is a synaptically induced Ca(2+)-dependent enhancement of intrinsic excitability of neurons responsible for setting the electromotor output. Changes in Ca(2+) during induction accurately predict the magnitude of this graded memory over a wide range of stimuli. Thus despite operating over a range from seconds to tens of minutes, the encoding of graded memory can be mediated by a relatively simple cellular mechanism.
Affiliation(s)
- Nikolai C Dembrow: Center for Learning and Memory, University of Texas at Austin, Austin, TX 78712-0805, USA
13. Urakubo H, Honda M, Tanaka K, Kuroda S. Experimental and computational aspects of signaling mechanisms of spike-timing-dependent plasticity. HFSP J 2009; 3:240-54. [PMID: 20119481] [DOI: 10.2976/1.3137602] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.6]
Abstract
STDP (spike-timing-dependent synaptic plasticity) is thought to be a synaptic learning rule that embeds spike-timing information into a specific pattern of synaptic strengths in neuronal circuits, resulting in a memory. STDP consists of bidirectional long-term changes in synaptic strengths. This process includes long-term potentiation and long-term depression, which depend on the relative timing of presynaptic and postsynaptic spiking. In this review, we focus on computational aspects of the signaling mechanisms that induce and maintain STDP as a key step toward the definition of a general synaptic learning rule. In addition, we discuss the temporal and spatial aspects of STDP and the requirement for a homeostatic mechanism of STDP in vivo.
14. Paz JT, Mahon S, Tiret P, Genet S, Delord B, Charpier S. Multiple forms of activity-dependent intrinsic plasticity in layer V cortical neurones in vivo. J Physiol 2009; 587:3189-205. [PMID: 19433575] [DOI: 10.1113/jphysiol.2009.169334] [Citation(s) in RCA: 46] [Impact Index Per Article: 2.9]
Abstract
Synaptic plasticity is classically considered the neuronal substrate for learning and memory. However, activity-dependent changes in neuronal intrinsic excitability have been reported in several learning-related brain regions, suggesting that intrinsic plasticity could also participate in information storage. Compared to synaptic plasticity, there has been little exploration of the properties of induction and expression of intrinsic plasticity in the intact brain. Here, by means of in vivo intracellular recordings in the rat, we have examined how the intrinsic excitability of layer V motor cortex pyramidal neurones is altered following brief periods of repeated firing. Changes in membrane excitability were assessed from modifications of the discharge frequency versus injected current (F-I) curves. Most (approximately 64%) conditioned neurones exhibited a long-lasting intrinsic plasticity, expressed either as selective changes in the current threshold or in the slope of the F-I curve, or as concomitant changes in both parameters. These modifications of the neuronal input-output relationship led to a global increase or decrease in intrinsic excitability. Passive electrical membrane properties were unaffected by the intracellular conditioning, indicating that intrinsic plasticity resulted from modifications of voltage-gated ion channels. These results demonstrate that neocortical pyramidal neurones can express a bidirectional use-dependent intrinsic plasticity in vivo, modifying their sensitivity to weak inputs and/or the gain of their input-output function. These multiple forms of experience-dependent intrinsic changes, which expand the computational abilities of individual neurones, could shape new network dynamics and thus might participate in the formation of mnemonic motor engrams.
15. Siri B, Berry H, Cessac B, Delord B, Quoy M. A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks. Neural Comput 2009; 20:2937-66. [PMID: 18624656] [DOI: 10.1162/neco.2008.05-07-530] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.3]
Abstract
We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule that includes passive forgetting and different timescales for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, which involve a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on neural network evolution. Furthermore, we show that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
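The sketch below illustrates this setting in miniature: a discrete-time random recurrent network updated with a Hebbian rule that includes passive forgetting, while the spectral radii of the weight matrix and of the Jacobian of the network map (a proxy for the largest Lyapunov exponent) are tracked over learning. Network size, coupling strength, learning rate and forgetting factor are illustrative assumptions, not the values analyzed in the paper.

```python
# Minimal sketch in the spirit of this analysis: a discrete-time random recurrent
# network updated with a Hebbian rule including passive forgetting, while the
# spectral radii of the weight matrix and of the Jacobian of the network map
# (a proxy for the largest Lyapunov exponent) are tracked. Values are illustrative.
import numpy as np

rng = np.random.default_rng(3)
N = 100
g = 3.0                                    # initial coupling strength (chaotic regime)
alpha, lam = 0.01, 0.99                    # Hebbian learning rate, passive forgetting
W = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
x = rng.uniform(-1.0, 1.0, N)              # network state
pattern = rng.uniform(-0.5, 0.5, N)        # static input pattern being learned

def spectral_radius(M):
    return np.max(np.abs(np.linalg.eigvals(M)))

for t in range(301):
    u = W @ x + pattern                    # total input
    x = np.tanh(u)                         # discrete-time neuronal update
    if t % 50 == 0:
        J = (1.0 - x ** 2)[:, None] * W    # Jacobian of the map: f'(u_i) * W_ij
        print("t=%3d  spectral radius: W %.2f  Jacobian %.2f"
              % (t, spectral_radius(W), spectral_radius(J)))
    # Hebbian update with passive forgetting (lam < 1 slowly erases old structure)
    W = lam * W + (alpha / N) * np.outer(x, x)
```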
Affiliation(s)
- Benoît Siri: Team Alchemy, INRIA, Parc Club Orsay Université, Orsay Cedex, France
16. Siri B, Quoy M, Delord B, Cessac B, Berry H. Effects of Hebbian learning on the dynamics and structure of random networks with inhibitory and excitatory neurons. J Physiol Paris 2007; 101:136-48. [PMID: 18042357] [DOI: 10.1016/j.jphysparis.2007.10.003] [Citation(s) in RCA: 44] [Impact Index Per Article: 2.4]
Abstract
The aim of the present paper is to study the effects of Hebbian learning in random recurrent neural networks with biological connectivity, i.e. sparse connections and separate populations of excitatory and inhibitory neurons. We furthermore consider that the neuron dynamics may occur at a (shorter) time scale than synaptic plasticity and consider the possibility of learning rules with passive forgetting. We show that the application of such Hebbian learning leads to drastic changes in the network dynamics and structure. In particular, the learning rule contracts the norm of the weight matrix and yields a rapid decay of the dynamics complexity and entropy. In other words, the network is rewired by Hebbian learning into a new synaptic structure that emerges with learning on the basis of the correlations that progressively build up between neurons. We also observe that, within this emerging structure, the strongest synapses organize as a small-world network. The second effect of the decay of the weight matrix spectral radius consists in a rapid contraction of the spectral radius of the Jacobian matrix. This drives the system through the "edge of chaos" where sensitivity to the input pattern is maximal. Taken together, this scenario is remarkably predicted by theoretical arguments derived from dynamical systems and graph theory.
Affiliation(s)
- Benoît Siri: INRIA, Futurs Research Centre, Project-Team Alchemy, 4 rue J Monod, 91893, Orsay Cedex, France