51. Men K, Chen X, Yang B, Zhu J, Yi J, Wang S, Li Y, Dai J. Automatic segmentation of three clinical target volumes in radiotherapy using lifelong learning. Radiother Oncol 2021;157:1-7. [PMID: 33418008] [DOI: 10.1016/j.radonc.2020.12.034]
Abstract
BACKGROUND AND PURPOSE Convolutional neural networks (CNNs) have achieved human-level performance in automatic segmentation. An important challenge CNNs face in segmentation is catastrophic forgetting: they lose performance on previously learned tasks when trained on a new one. In this study, we propose a lifelong learning method to learn multiple segmentation tasks continuously without forgetting previous tasks. MATERIALS AND METHODS The cohort covered three tumour types: 800 patients had nasopharyngeal cancer (NPC), 800 had breast cancer, and 800 had rectal cancer. The tasks were segmentation of the clinical target volume (CTV) of these three cancers. The proposed lifelong learning network adopted a dilation adapter to learn the three segmentation tasks one by one. Only the newly added dilation adapter (seven layers) was fine-tuned for each incoming task, whereas all previously learned layers were frozen. RESULTS Compared with single-task, multi-task, or transfer learning, the proposed lifelong learning achieved better or comparable segmentation accuracy, with a DSC of 0.86 for NPC, 0.89 for breast cancer, and 0.87 for rectal cancer. Lifelong learning avoided forgetting in sequential learning and yielded good performance with less training data. Furthermore, it was more efficient than single-task or transfer learning, reducing the number of parameters, model size, and training time by ~58.8%, ~55.6%, and ~25.0%, respectively. CONCLUSION The proposed method preserved the knowledge of previous tasks while learning a new one using a dilation adapter. It yielded comparable performance with much less training data, fewer model parameters, and shorter training time.
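The freeze-and-extend idea in this abstract can be sketched in a few lines: keep a shared feature extractor fixed and train only a small per-task module, so learning a new task cannot disturb old ones. The toy numpy sketch below is an assumption-laden stand-in, not the authors' network: `AdapterNet`, a linear readout in place of the seven-layer dilation adapter, and all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

class AdapterNet:
    """Toy freeze-and-extend learner: a frozen shared backbone plus one
    small trainable adapter per task. A linear readout stands in for the
    paper's seven-layer dilation adapter; all sizes are illustrative."""

    def __init__(self, d_in=8, d_hid=16):
        self.W = rng.normal(0.0, 0.3, (d_hid, d_in))  # backbone, frozen after init
        self.adapters = {}                            # task name -> trainable head

    def features(self, X):
        return np.tanh(X @ self.W.T)                  # frozen feature extractor

    def train_task(self, name, X, y, lr=0.05, steps=1000):
        H = self.features(X)
        A = np.zeros(H.shape[1])                      # fresh adapter for this task only
        for _ in range(steps):
            A -= lr * H.T @ (H @ A - y) / len(y)      # least-squares gradient step
        self.adapters[name] = A

    def predict(self, name, X):
        return self.features(X) @ self.adapters[name]

X = rng.normal(size=(200, 8))
net = AdapterNet()
net.train_task("task_A", X, X[:, 0])
before = net.predict("task_A", X).copy()
net.train_task("task_B", X, X[:, 1])     # learning task B touches only adapter B,
after = net.predict("task_A", X)         # so task A's outputs cannot drift
print(np.allclose(before, after))
```

Because only the new adapter is updated, task A's predictions are bit-for-bit unchanged after training task B, which is the mechanism behind the "no forgetting" claim.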
Affiliation(s)
- Kuo Men, Xinyuan Chen, Bining Yang, Ji Zhu, Junlin Yi, Shulian Wang, Yexiong Li, Jianrong Dai
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
52. Liang H. Evaluation of fitness state of sports training based on self-organizing neural network. Neural Comput Appl 2021. [DOI: 10.1007/s00521-020-05551-w]
53. Huang Y, Liu J, Harkin J, McDaid L, Luo Y. A memristor-based synapse implementation using BCM learning rule. Neurocomputing 2021. [DOI: 10.1016/j.neucom.2020.10.106]
54. Gabrieli D, Schumm SN, Vigilante NF, Meaney DF. NMDA Receptor Alterations After Mild Traumatic Brain Injury Induce Deficits in Memory Acquisition and Recall. Neural Comput 2020;33:67-95. [PMID: 33253030] [DOI: 10.1162/neco_a_01343]
Abstract
Mild traumatic brain injury (mTBI) presents a significant health concern with potential persisting deficits that can last decades. Although a growing body of literature improves our understanding of the brain network response and corresponding underlying cellular alterations after injury, the effects of cellular disruptions on local circuitry after mTBI are poorly understood. Our group recently reported how mTBI in neuronal networks affects the functional wiring of neural circuits and how neuronal inactivation influences the synchrony of coupled microcircuits. Here, we utilized a computational neural network model to investigate the circuit-level effects of N-methyl-D-aspartate receptor dysfunction. The initial increase in activity in injured neurons spread to downstream neurons, but this increase was partially reduced by restructuring the network with spike-timing-dependent plasticity. As a model of network-based learning, we also investigated how injury alters pattern acquisition, recall, and maintenance of a conditioned response to a stimulus. Although pattern acquisition and maintenance were impaired in injured networks, the greatest deficits arose in recall of previously trained patterns. These results demonstrate how one specific mechanism of cellular-level damage in mTBI affects the overall function of a neural network and point to the importance of reversing cellular-level changes to recover important properties of learning and memory in a microcircuit.
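The spike-timing-dependent plasticity used to restructure the model network follows the standard pair-based kernel: potentiation when a presynaptic spike precedes the postsynaptic one, depression otherwise. A minimal sketch with generic textbook constants (not the paper's parameters):

```python
import numpy as np

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP kernel. dt_ms = t_post - t_pre in milliseconds.
    Constants are generic textbook values, not the paper's parameters."""
    dt_ms = np.asarray(dt_ms, dtype=float)
    return np.where(dt_ms >= 0,
                    a_plus * np.exp(-dt_ms / tau),    # pre before post: potentiation
                    -a_minus * np.exp(dt_ms / tau))   # post before pre: depression

print(stdp_dw(10.0) > 0, stdp_dw(-10.0) < 0)   # causal pairs strengthen, acausal weaken
```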
Affiliation(s)
- David Gabrieli, Samantha N Schumm, Nicholas F Vigilante
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, PA 19104, U.S.A.
- David F Meaney
- Department of Bioengineering, School of Engineering and Applied Sciences, and Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA 19104, U.S.A.
55. Muret D, Makin TR. The homeostatic homunculus: rethinking deprivation-triggered reorganisation. Curr Opin Neurobiol 2020;67:115-122. [PMID: 33248404] [DOI: 10.1016/j.conb.2020.08.008]
Abstract
While amputation was considered a prominent model for cortical reorganisation, recent evidence highlights persistent representation of the missing hand. We offer a new perspective on the literature on amputation-triggered sensorimotor plasticity by emphasising the need for homeostasis and emerging evidence of latent activity distributed across the homunculus. We argue that deprivation uncovers pre-existing latent activity, which can manifest as remapping, but that since this activity was already there, remapping could in some instances correspond to functional stability of the system rather than reorganisation. Adaptive behaviour and Hebbian-like plasticity may also play crucial roles in maintaining the functional organisation of the homunculus when deprivation occurs in adulthood or in early development. Collectively, we suggest that the brain's need for stability may underlie several key phenotypes of brain remapping previously interpreted as consequences of reorganisation. Nevertheless, reorganisation may still be possible, especially when cortical changes contribute to the stability of the system.
Affiliation(s)
- Dollyane Muret, Tamar R Makin
- Institute of Cognitive Neuroscience, University College London, London, United Kingdom
56. Chen H, Xie L, Wang Y, Zhang H. Memory retention in pyramidal neurons: a unified model of energy-based homo- and heterosynaptic plasticity with homeostasis. Cogn Neurodyn 2020;15:675-692. [PMID: 34367368] [DOI: 10.1007/s11571-020-09652-z]
Abstract
The brain can learn new tasks without forgetting old ones. This memory retention is closely associated with the long-term stability of synaptic strength. To understand the capacity of pyramidal neurons to preserve memory under different tasks, we established a plasticity model based on the postsynaptic membrane energy state, in which the change in synaptic strength depends on the difference between the energy state after stimulation and the resting energy state. If the post-stimulation energy state is higher than the resting energy state, synaptic depression occurs; otherwise, the synapse is strengthened. Our model unifies homo- and heterosynaptic plasticity and can reproduce the synaptic plasticity observed in multiple experiments, such as spike-timing-dependent plasticity and cooperative plasticity, with a few common parameters. Based on the proposed plasticity model, we conducted a simulation study of how the activation patterns of dendritic branches by different tasks affect the synaptic connection strength of pyramidal neurons. We further investigated the mechanism by which different tasks come to activate different dendritic branches. Simulation results show that, compared with the classic plasticity model, our proposed model achieves a better spatial separation of the branches activated by different tasks in pyramidal neurons, which deepens our insight into the memory retention mechanism of the brain.
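The qualitative rule in the abstract (depression when the post-stimulation energy state exceeds the resting state, potentiation otherwise) can be sketched with a simple signed update. The linear form and the rate `eta` are assumptions; the paper's full model is considerably more detailed.

```python
def energy_dw(e_post, e_rest, pre, eta=0.05):
    """Signed update sketch: the weight change is driven by the gap between
    the post-stimulation energy state and the resting energy state. The
    linear form and eta are assumptions, not the paper's model."""
    return -eta * (e_post - e_rest) * pre

print(energy_dw(1.2, 1.0, pre=1.0) < 0)   # above rest: depression
print(energy_dw(0.8, 1.0, pre=1.0) > 0)   # below rest: potentiation
```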
Affiliation(s)
- Huanwen Chen, Yijun Wang, Hang Zhang
- School of Automation, Central South University, Changsha 410083, Hunan, China
- Lijuan Xie
- Institute of Physiology and Psychology, Changsha University of Science and Technology, Changsha 410076, Hunan, China
57. Andrushko JW, Gould LA, Renshaw DW, Ekstrand C, Hortobágyi T, Borowsky R, Farthing JP. High Force Unimanual Handgrip Contractions Increase Ipsilateral Sensorimotor Activation and Functional Connectivity. Neuroscience 2020;452:111-125. [PMID: 33197497] [DOI: 10.1016/j.neuroscience.2020.10.031]
Abstract
Imaging and brain stimulation studies have revised the classical understanding of unimanual voluntary force generation, indicating that it is controlled by brain networks rather than focal contralateral areas. However, the scaling and hemispheric specificity of network activation remain less understood. Using fMRI, we examined the effects of parametrically increasing right-handgrip force on activation and functional connectivity across the bilateral sensorimotor network at 25%, 50%, and 75% of maximal voluntary contraction (MVC). High-force (75% MVC) unimanual handgrip contractions resulted in greater ipsilateral motor activation and functional connectivity with the contralateral hemisphere compared with a low-force 25% MVC condition. Ipsilateral motor cortex activation and network strength correlated with relative handgrip force (% MVC). Thus, increases in unimanual handgrip force produced greater ipsilateral sensorimotor activation and greater interhemispheric functional connectivity within the sensorimotor network.
Affiliation(s)
- Justin W Andrushko
- College of Kinesiology, University of Saskatchewan, Saskatchewan, Canada
- Layla A Gould
- College of Medicine, Division of Neurosurgery, University of Saskatchewan, Saskatchewan, Canada
- Doug W Renshaw
- College of Kinesiology, University of Saskatchewan, Saskatchewan, Canada
- Chelsea Ekstrand
- The Brain and Mind Institute, Western University, London, Ontario, Canada
- Tibor Hortobágyi
- Center for Human Movement Sciences, University Medical Center Groningen, University of Groningen, Groningen, the Netherlands
- Ron Borowsky
- College of Medicine, Division of Neurosurgery, University of Saskatchewan, Saskatchewan, Canada; College of Arts and Science, Department of Psychology, Saskatchewan, Canada
58. Sadeh S, Clopath C. Inhibitory stabilization and cortical computation. Nat Rev Neurosci 2020;22:21-37. [PMID: 33177630] [DOI: 10.1038/s41583-020-00390-z]
Abstract
Neuronal networks with strong recurrent connectivity provide the brain with a powerful means to perform complex computational tasks. However, high-gain excitatory networks are susceptible to instability, which can lead to runaway activity, as manifested in pathological regimes such as epilepsy. Inhibitory stabilization offers a dynamic, fast and flexible compensatory mechanism to balance otherwise unstable networks, thus enabling the brain to operate in its most efficient regimes. Here we review recent experimental evidence for the presence of such inhibition-stabilized dynamics in the brain and discuss their consequences for cortical computation. We show how the study of inhibition-stabilized networks in the brain has been facilitated by recent advances in the technological toolbox and perturbative techniques, as well as a concomitant development of biologically realistic computational models. By outlining future avenues, we suggest that inhibitory stabilization can offer an exemplary case of how experimental neuroscience can progress in tandem with technology and theory to advance our understanding of the brain.
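The defining "paradoxical effect" of an inhibition-stabilized network can be reproduced in a two-population rate model: the excitatory subnetwork alone is unstable, feedback inhibition stabilizes it, and extra drive to the inhibitory population lowers its steady-state rate. The weights below are illustrative choices, not fits to any dataset in the review.

```python
import numpy as np

def steady_rates(g_e, g_i, w_ee=2.0, w_ei=1.0, w_ie=2.0, w_ii=0.5,
                 dt=0.01, steps=20000):
    """Two-population rate model of an inhibition-stabilized network.
    The excitatory subnetwork alone is unstable (w_ee > 1) but feedback
    inhibition stabilizes it. Weights are illustrative, not data fits."""
    e = i = 0.0
    relu = lambda x: max(x, 0.0)
    for _ in range(steps):
        de = -e + relu(w_ee * e - w_ei * i + g_e)
        di = -i + relu(w_ie * e - w_ii * i + g_i)
        e += dt * de
        i += dt * di
    return e, i

# ISN signature ("paradoxical effect"): extra drive to the inhibitory
# population *lowers* its steady-state rate.
_, i_base = steady_rates(g_e=2.0, g_i=1.0)
_, i_stim = steady_rates(g_e=2.0, g_i=1.5)
print(i_stim < i_base)   # True for these ISN weights
```

Solving the fixed-point equations for these weights gives inhibitory rates of 6 and 5, so the added inhibitory drive indeed reduces the inhibitory rate.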
Affiliation(s)
- Sadra Sadeh, Claudia Clopath
- Bioengineering Department, Imperial College London, London, UK
59. Luo Y, Yin L, Bai W, Mao K. An Appraisal of Incremental Learning Methods. Entropy (Basel) 2020;22:1190. [PMID: 33286958] [PMCID: PMC7712976] [DOI: 10.3390/e22111190]
Abstract
As a special case of machine learning, incremental learning can acquire useful knowledge from incoming data continuously without needing to access the original data. It is expected to retain previously learned knowledge, and it is regarded as one of the ultimate goals of artificial intelligence. However, incremental learning remains a long-term challenge. Modern deep neural network models achieve outstanding performance with batch training on stationary data distributions; this restriction leads to catastrophic forgetting in incremental learning scenarios, since the distribution of incoming data is unknown and may differ greatly from that of the old data. A model must therefore be both plastic, to acquire new knowledge, and stable, to consolidate existing knowledge. This review aims to provide a systematic account of the state of the art in incremental learning methods. Published reports were selected from the Web of Science, IEEE Xplore, and DBLP databases up to May 2020. Each paper is reviewed according to its strategy type: architectural, regularization, or rehearsal and pseudo-rehearsal. We compare and discuss the different methods, and outline the development trends and research focus. We conclude that incremental learning is still an active research area and will remain one for a long period. More attention should be paid to the exploration of both biological systems and computational models.
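The regularization strategy surveyed by the review can be illustrated with an EWC-style quadratic penalty: while training on a new task, parameters that were important for the old task are pulled back toward their old values. Everything in this sketch (the toy quadratic tasks, the anchor, the importance weights) is an illustrative assumption, not the review's experiments.

```python
import numpy as np

def train(theta0, grad_new, anchor=None, fisher=None, lam=1.0, lr=0.1, steps=300):
    """Gradient descent on a new task, optionally with an EWC-style quadratic
    penalty pulling important parameters back toward the old-task optimum."""
    theta = np.array(theta0, dtype=float)
    for _ in range(steps):
        g = grad_new(theta)
        if anchor is not None:
            g = g + lam * fisher * (theta - anchor)   # penalty gradient
        theta -= lr * g
    return theta

anchor = np.array([1.0, 0.0])                     # old-task optimum
fisher = np.array([5.0, 0.0])                     # only the first parameter mattered
grad_new = lambda th: th - np.array([0.0, 1.0])   # new task wants [0, 1]

plain = train([1.0, 0.0], grad_new)               # forgets: ends near [0, 1]
ewc = train([1.0, 0.0], grad_new, anchor, fisher) # compromise: first param held near 1
print(plain.round(2), ewc.round(2))
```

With these numbers the penalized optimum for the first parameter is 5/6, a compromise between the old value 1 and the new target 0, while the unimportant second parameter moves freely.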
Affiliation(s)
- Keming Mao
- College of Software, Northeastern University, Shenyang 110004, China; (Y.L.); (L.Y.); (W.B.)
60. Temporal learning of bottom-up connections via spatially nonspecific top-down inputs. Neurocomputing 2020. [DOI: 10.1016/j.neucom.2020.06.030]
61. Drix D, Hafner VV, Schmuker M. Sparse coding with a somato-dendritic rule. Neural Netw 2020;131:37-49. [PMID: 32750603] [DOI: 10.1016/j.neunet.2020.06.007]
Abstract
Cortical neurons are silent most of the time: sparse activity enables low-energy computation in the brain, and promises to do the same in neuromorphic hardware. Beyond power efficiency, sparse codes have favourable properties for associative learning, as they can store more information than local codes but are easier to read out than dense codes. Auto-encoders with a sparse constraint can learn sparse codes, and so can single-layer networks that combine recurrent inhibition with unsupervised Hebbian learning. But the latter usually require fast homeostatic plasticity, which could lead to catastrophic forgetting in embodied agents that learn continuously. Here we set out to explore whether plasticity at recurrent inhibitory synapses could take up that role instead, regulating both the population sparseness and the firing rates of individual neurons. We put the idea to the test in a network that employs compartmentalised inputs to solve the task: rate-based dendritic compartments integrate the feedforward input, while spiking integrate-and-fire somas compete through recurrent inhibition. A somato-dendritic learning rule allows somatic inhibition to modulate nonlinear Hebbian learning in the dendrites. Trained on MNIST digits and natural images, the network discovers independent components that form a sparse encoding of the input and support linear decoding. These findings confirm that intrinsic homeostatic plasticity is not strictly required for regulating sparseness: inhibitory synaptic plasticity can have the same effect. Our work illustrates the usefulness of compartmentalised inputs, and makes the case for moving beyond point neuron models in artificial spiking neural networks.
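The paper's core proposal, letting inhibitory synaptic plasticity rather than intrinsic homeostasis regulate firing rates, can be caricatured with a single threshold unit whose inhibitory weight grows whenever it fires above a target rate. The unit model and all constants are illustrative assumptions, far simpler than the paper's compartmentalised network.

```python
import numpy as np

rng = np.random.default_rng(1)

def regulate(target=0.1, eta=0.05, trials=4000, thresh=2.0):
    """Single threshold unit with a plastic inhibitory weight: the weight
    grows when the unit fires above its target rate and shrinks otherwise.
    Unit model and constants are illustrative assumptions."""
    w_inh = 0.0
    spikes = []
    for _ in range(trials):
        drive = rng.normal(1.0, 1.0)                  # feedforward input
        spike = 1.0 if drive - w_inh > thresh else 0.0
        w_inh += eta * (spike - target)               # homeostatic inhibitory rule
        spikes.append(spike)
    return w_inh, float(np.mean(spikes[-1000:]))

w_inh, rate = regulate()
print(rate)   # hovers near the 0.1 target
```

The inhibitory weight settles at whatever value makes the firing probability equal the target, so rate control emerges from synaptic plasticity alone, with no change to the unit's intrinsic excitability.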
Affiliation(s)
- Damien Drix
- Biocomputation group, Department of Computer Science, University of Hertfordshire, Hatfield, United Kingdom; Adaptive Systems laboratory, Institut für Informatik, Humboldt-Universität zu Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience, Berlin, Germany
- Verena V Hafner
- Adaptive Systems laboratory, Institut für Informatik, Humboldt-Universität zu Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience, Berlin, Germany
- Michael Schmuker
- Biocomputation group, Department of Computer Science, University of Hertfordshire, Hatfield, United Kingdom; Bernstein Center for Computational Neuroscience, Berlin, Germany
62. Kozachkov L, Lundqvist M, Slotine JJ, Miller EK. Achieving stable dynamics in neural circuits. PLoS Comput Biol 2020;16:e1007659. [PMID: 32764745] [PMCID: PMC7446801] [DOI: 10.1371/journal.pcbi.1007659]
Abstract
The brain consists of many interconnected networks with time-varying, partially autonomous activity. There are multiple sources of noise and variation, yet activity has to eventually converge to a stable, reproducible state (or sequence of states) for its computations to make sense. We approached this problem from a control-theory perspective by applying contraction analysis to recurrent neural networks. This allowed us to find mechanisms for achieving stability in multiple connected networks with biologically realistic dynamics, including synaptic plasticity and time-varying inputs. These mechanisms included inhibitory Hebbian plasticity, excitatory anti-Hebbian plasticity, synaptic sparsity and excitatory-inhibitory balance. Our findings shed light on how stable computations might be achieved despite biological complexity. Crucially, our analysis is not limited to the stability of fixed geometric objects in state space (e.g., points, lines, planes), but extends to the stability of state trajectories, which may be complex and time-varying.
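Contraction analysis gives a sufficient condition for this kind of stability: if the symmetric part of the system Jacobian is uniformly negative definite, all trajectories converge toward one another regardless of initial state. The sketch below enforces a simple sufficient condition for a generic rate network (spectral norm of the weight matrix below 1); the network and scaling are illustrative, not the paper's circuits.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(W, x0, steps=20000, dt=0.01):
    """Euler-integrate the rate network dx/dt = -x + W @ tanh(x)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += dt * (-x + W @ np.tanh(x))
    return x

# For dx/dt = -x + W tanh(x), a spectral norm of W below 1 keeps the
# symmetric part of the Jacobian -I + W @ diag(sech^2) negative definite,
# so the system is contracting in the identity metric.
n = 10
W = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))
W *= 0.9 / np.linalg.norm(W, 2)          # rescale so the condition holds

xa = simulate(W, rng.normal(size=n))     # two different initial states...
xb = simulate(W, rng.normal(size=n))
print(np.linalg.norm(xa - xb) < 1e-3)    # ...forget their initial conditions
```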
Affiliation(s)
- Leo Kozachkov
- The Picower Institute for Learning & Memory, Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts, United States of America
- Department of Brain & Cognitive Sciences, Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts, United States of America
- Nonlinear Systems Laboratory, Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts, United States of America
- Mikael Lundqvist
- The Picower Institute for Learning & Memory, Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts, United States of America
- Department of Brain & Cognitive Sciences, Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts, United States of America
- Department of Psychology, Stockholm University, Stockholm, Sweden
- Jean-Jacques Slotine
- The Picower Institute for Learning & Memory, Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts, United States of America
- Nonlinear Systems Laboratory, Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts, United States of America
- Earl K. Miller
- The Picower Institute for Learning & Memory, Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts, United States of America
- Department of Brain & Cognitive Sciences, Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts, United States of America
63. Bianchi S, Muñoz-Martin I, Ielmini D. Bio-Inspired Techniques in a Fully Digital Approach for Lifelong Learning. Front Neurosci 2020;14:379. [PMID: 32425749] [PMCID: PMC7203347] [DOI: 10.3389/fnins.2020.00379]
Abstract
Lifelong learning deeply underpins the resilience of biological organisms in a constantly changing environment. This flexibility has allowed the evolution of parallel-distributed systems able to merge past information with new stimuli for accurate and efficient brain computation. There is now a strong effort to reproduce such intelligent systems in standard artificial neural networks (ANNs). However, despite great results on specific tasks, ANNs still appear too rigid and static in real-life settings compared with biological systems. It is therefore necessary to define a new neural paradigm capable of merging the lifelong resilience of biological organisms with the high accuracy of ANNs. Here, we present a digital implementation of a novel mixed supervised-unsupervised neural network capable of performing lifelong learning. The network uses a set of convolutional filters to extract features from the input images of the MNIST and Fashion-MNIST training datasets. This information defines an original combination of responses of both trained and non-trained classes by transfer learning. The responses are then used in subsequent unsupervised learning based on spike-timing-dependent plasticity (STDP). This procedure allows the clustering of non-trained information thanks to bio-inspired algorithms such as neuronal redundancy and spike-frequency adaptation. We demonstrate the implementation of the neural network in a fully digital environment, the Xilinx Zynq-7000 System on Chip (SoC). We illustrate a user-friendly interface to test the network by choosing the number and type of non-trained classes, or by drawing a custom pattern on a tablet. Finally, we compare this work with networks based on memristive synaptic devices capable of continual learning, highlighting the main differences and capabilities with respect to a fully digital approach.
Affiliation(s)
- Daniele Ielmini
- Dipartimento di Elettronica, Informazione e Bioingegneria (DEIB), Politecnico di Milano, Milan, Italy
64. Divergent Synaptic Scaling of Miniature EPSCs following Activity Blockade in Dissociated Neuronal Cultures. J Neurosci 2020;40:4090-4102. [PMID: 32312887] [DOI: 10.1523/jneurosci.1393-19.2020]
Abstract
Neurons can respond to decreased network activity with a homeostatic increase in the amplitudes of miniature EPSCs (mEPSCs). The prevailing view is that mEPSC amplitudes are uniformly multiplied by a single factor, termed "synaptic scaling." Deviations from purely multiplicative scaling have been attributed to biological differences, or to a distortion imposed by a detection threshold limit. Here, we demonstrate in neurons dissociated from cortices of male and female mice that the shift in mEPSC amplitudes observed in the experimental data cannot be reproduced by simulation of uniform multiplicative scaling, with or without the distortion caused by applying a detection threshold. Furthermore, we demonstrate explicitly that the scaling factor is not uniform but is close to 1 for small mEPSCs, and increases with increasing mEPSC amplitude across a substantial portion of the data. This pattern was also observed for previously published data from dissociated mouse hippocampal neurons and dissociated rat cortical neurons. The finding of "divergent scaling" shifts the current view of homeostatic plasticity as a process that alters all synapses on a neuron equally to one that must accommodate the differential effect observed for small versus large mEPSCs. Divergent scaling still accomplishes the essential homeostatic task of modifying synaptic strengths in the opposite direction of the activity change, but the consequences are greatest for those synapses which individually are more likely to bring a neuron to threshold. SIGNIFICANCE STATEMENT: In homeostatic plasticity, the responses to chronic increases or decreases in network activity act in the opposite direction to restore normal activity levels. Homeostatic plasticity is likely to play a role in diseases associated with long-term changes in brain function, such as epilepsy and neuropsychiatric illnesses. One homeostatic response is the increase in synaptic strength following a chronic block of activity. Research has focused on finding a globally expressed signaling pathway, because it has been proposed that the plasticity is uniformly expressed across all synapses. Here, we show that the plasticity is not uniform. Our work suggests that homeostatic signaling molecules are likely to be differentially expressed across synapses.
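The two hypotheses the study contrasts can be illustrated directly: under uniform scaling the ratio of scaled to baseline amplitude is a single constant, while under divergent scaling the factor grows with baseline amplitude. The log-normal amplitudes and the 1.3 and 0.3 factors below are arbitrary choices for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Baseline mEPSC amplitudes drawn from a log-normal purely for illustration.
base = rng.lognormal(mean=2.5, sigma=0.4, size=5000)

uniform = 1.3 * base                         # classic view: one factor for all
factor = 1.0 + 0.3 * base / base.max()       # divergent view: factor grows with amplitude
divergent = factor * base

small = base < np.median(base)
large = ~small
print(np.allclose(uniform / base, 1.3))                                     # constant ratio
print((divergent / base)[large].mean() > (divergent / base)[small].mean())  # growing ratio
```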
65. Field RE, D'amour JA, Tremblay R, Miehl C, Rudy B, Gjorgjieva J, Froemke RC. Heterosynaptic Plasticity Determines the Set Point for Cortical Excitatory-Inhibitory Balance. Neuron 2020;106:842-854.e4. [PMID: 32213321] [DOI: 10.1016/j.neuron.2020.03.002]
Abstract
Excitation in neural circuits must be carefully controlled by inhibition to regulate information processing and network excitability. During development, cortical inhibitory and excitatory inputs are initially mismatched but become co-tuned or balanced with experience. However, little is known about how excitatory-inhibitory balance is defined at most synapses or about the mechanisms for establishing or maintaining this balance at specific set points. Here we show how coordinated long-term plasticity calibrates populations of excitatory-inhibitory inputs onto mouse auditory cortical pyramidal neurons. Pairing pre- and postsynaptic activity induced plasticity at paired inputs and different forms of heterosynaptic plasticity at the strongest unpaired synapses, which required minutes of activity and dendritic Ca2+ signaling to be computed. Theoretical analyses demonstrated how the relative rate of heterosynaptic plasticity could normalize and stabilize synaptic strengths to achieve any possible excitatory-inhibitory correlation. Thus, excitatory-inhibitory balance is dynamic and cell specific, determined by distinct plasticity rules across multiple excitatory and inhibitory synapses.
Affiliation(s)
- Rachel E Field, James A D'amour
- Skirball Institute for Biomolecular Medicine, New York University School of Medicine, New York, NY 10016, USA; Neuroscience Institute, New York University School of Medicine, New York, NY 10016, USA; Department of Otolaryngology, New York University School of Medicine, New York, NY 10016, USA; Department of Neuroscience and Physiology, New York University School of Medicine, New York, NY 10016, USA
- Robin Tremblay, Bernardo Rudy
- Neuroscience Institute, New York University School of Medicine, New York, NY 10016, USA; Department of Neuroscience and Physiology, New York University School of Medicine, New York, NY 10016, USA; Department of Anesthesiology, New York University School of Medicine, New York, NY 10016, USA
- Christoph Miehl, Julijana Gjorgjieva
- Max Planck Institute for Brain Research, 60438 Frankfurt, Germany; School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
- Robert C Froemke
- Skirball Institute for Biomolecular Medicine, New York University School of Medicine, New York, NY 10016, USA; Neuroscience Institute, New York University School of Medicine, New York, NY 10016, USA; Department of Otolaryngology, New York University School of Medicine, New York, NY 10016, USA; Department of Neuroscience and Physiology, New York University School of Medicine, New York, NY 10016, USA; Center for Neural Science, New York University, New York, NY 10003, USA.
Collapse
|
66
|
Activity Dependent and Independent Determinants of Synaptic Size Diversity. J Neurosci 2020; 40:2828-2848. [PMID: 32127494 DOI: 10.1523/jneurosci.2181-19.2020] [Citation(s) in RCA: 26] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2019] [Revised: 02/04/2020] [Accepted: 02/13/2020] [Indexed: 11/21/2022] Open
Abstract
The extraordinary diversity of excitatory synapse sizes is commonly attributed to activity-dependent processes that drive synaptic growth and diminution. Recent studies also point to activity-independent size fluctuations, possibly driven by innate synaptic molecule dynamics, as important generators of size diversity. To examine the contributions of activity-dependent and independent processes to excitatory synapse size diversity, we studied glutamatergic synapse size dynamics and diversification in cultured rat cortical neurons (both sexes), silenced from plating. We found that in networks with no history of activity whatsoever, synaptic size diversity was no less extensive than that observed in spontaneously active networks. Synapses in silenced networks were larger, size distributions were broader, yet these were rightward-skewed and similar in shape when scaled by mean synaptic size. Silencing reduced the magnitude of size fluctuations and weakened constraints on size distributions, yet these were sufficient to explain synaptic size diversity in silenced networks. Model-based exploration followed by experimental testing indicated that silencing-associated changes in innate molecular dynamics and fluctuation characteristics might negatively impact synaptic persistence, resulting in reduced synaptic numbers. This, in turn, would increase synaptic molecule availability, promote synaptic enlargement, and ultimately alter fluctuation characteristics. These findings suggest that activity-independent size fluctuations are sufficient to fully diversify glutamatergic synaptic sizes, with activity-dependent processes primarily setting the scale rather than the shape of size distributions. 
Moreover, they point to reciprocal relationships between synaptic size fluctuations, size distributions, and synaptic numbers mediated by the innate dynamics of synaptic molecules as they move in, out, and between synapses. SIGNIFICANCE STATEMENT: Sizes of glutamatergic synapses vary tremendously, even when formed on the same neuron. This diversity is commonly thought to reflect the outcome of activity-dependent forms of synaptic plasticity, yet activity-independent processes might also play some part. Here we show that in neurons with no history of activity whatsoever, synaptic sizes are no less diverse. We show that this diversity is the product of activity-independent size fluctuations, which are sufficient to generate a full repertoire of synaptic sizes at correct proportions. By combining modeling and experimentation we expose reciprocal relationships between size fluctuations, synaptic sizes and synaptic counts, and show how these phenomena might be connected through the dynamics of synaptic molecules as they move in, out, and between synapses.
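Activity-independent size fluctuations of this kind are often formalized in the literature as a Kesten-style stochastic process: a multiplicative random step plus a small additive one. The sketch below is a generic illustration of that idea, not the specific model explored in this study, and every parameter value is arbitrary:

```python
import random

def simulate_sizes(n=2000, steps=300, seed=7):
    """Kesten-style activity-independent dynamics: each synapse's size is
    repeatedly multiplied by a random factor and receives a small additive kick.
    Such dynamics converge to a broad, rightward-skewed stationary distribution,
    echoing the shape of measured synaptic size distributions."""
    rng = random.Random(seed)
    sizes = [1.0] * n
    for _ in range(steps):
        sizes = [s * max(0.0, rng.gauss(0.98, 0.15)) + abs(rng.gauss(0.0, 0.05))
                 for s in sizes]
    return sizes
```

With a multiplicative factor whose mean is just below one, the population relaxes to a stationary, right-skewed distribution (sample mean above the median) regardless of the uniform starting sizes, mirroring the observation that size diversity emerges without any activity.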
|
67
|
Todo M. Towards the interpretation of complex visual hallucinations in terms of self-reorganization of neural networks. Neurosci Res 2020; 156:147-158. [PMID: 32112785 DOI: 10.1016/j.neures.2020.02.009] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/23/2019] [Revised: 10/25/2019] [Accepted: 12/28/2019] [Indexed: 10/24/2022]
Abstract
Patients suffering from dementia with Lewy bodies (DLB) often experience complex visual hallucinations (CVH). Despite many pathological, clinical, and neuroimaging studies, the mechanism of CVH remains unknown. One possible scenario is that top-down information is used to compensate for the lack of bottom-up information. To investigate this possibility and understand the underlying mathematical structure of the CVH mechanism, we propose a simple computational model of synaptic plasticity, with particular focus on how selective damage to the bottom-up network affects self-reorganization. We show that, during the reorganization process, some neurons shift their activity from the bottom-up to the top-down network, a change that can be understood in terms of state transitions. Assuming that the pre-reorganization representation of such a neuron persists after reorganization, the neural response induced by top-down information can be interpreted as the sensation of bottom-up information. This situation might correspond to hallucination in DLB patients. Our results agree with existing experimental evidence and provide new insights into observations in patients with DLB that have not yet been experimentally validated.
Affiliation(s)
- Masato Todo
- Department of Mathematics, School of Science, Hokkaido University, Sapporo, Hokkaido, Japan.
|
68
|
Pereira U, Brunel N. Unsupervised Learning of Persistent and Sequential Activity. Front Comput Neurosci 2020; 13:97. [PMID: 32009924 PMCID: PMC6978734 DOI: 10.3389/fncom.2019.00097] [Citation(s) in RCA: 12] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/03/2018] [Accepted: 12/23/2019] [Indexed: 11/25/2022] Open
Abstract
Two strikingly distinct types of activity have been observed in various brain structures during delay periods of delayed response tasks: Persistent activity (PA), in which a sub-population of neurons maintains an elevated firing rate throughout an entire delay period; and Sequential activity (SA), in which sub-populations of neurons are activated sequentially in time. It has been hypothesized that both types of dynamics can be “learned” by the relevant networks from the statistics of their inputs, thanks to mechanisms of synaptic plasticity. However, the necessary conditions for a synaptic plasticity rule and input statistics to learn these two types of dynamics in a stable fashion are still unclear. In particular, it is unclear whether a single learning rule is able to learn both types of activity patterns, depending on the statistics of the inputs driving the network. Here, we first characterize the complete bifurcation diagram of a firing rate model of multiple excitatory populations with an inhibitory mechanism, as a function of the parameters characterizing its connectivity. We then investigate how an unsupervised temporally asymmetric Hebbian plasticity rule shapes the dynamics of the network. Consistent with previous studies, we find that for stable learning of PA and SA, an additional stabilization mechanism is necessary. We show that a generalized version of the standard multiplicative homeostatic plasticity (Renart et al., 2003; Toyoizumi et al., 2014) stabilizes learning by effectively masking excitatory connections during stimulation and unmasking those connections during retrieval. Using the bifurcation diagram derived for fixed connectivity, we study analytically the temporal evolution and the steady state of the learned recurrent architecture as a function of parameters characterizing the external inputs. Slow changing stimuli lead to PA, while fast changing stimuli lead to SA. 
Our network model shows how a network with plastic synapses can stably and flexibly learn PA and SA in an unsupervised manner.
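The two ingredients highlighted above, a temporally asymmetric Hebbian term and a multiplicative stabilization of each neuron's total input, can be sketched in a rate-based caricature. This is an illustration of the general mechanism, not the paper's equations; the function name and all constants are assumptions:

```python
def asymmetric_hebb_with_scaling(W, pre_prev, post_now, lr=0.05, target=1.0):
    """One learning step on weight matrix W (W[i][j]: from neuron j to neuron i).
    Temporally asymmetric Hebbian term: presynaptic activity at time t-1 is
    paired with postsynaptic activity at time t. Afterwards, each neuron's
    incoming weights are multiplicatively rescaled to a fixed total, a
    homeostatic normalization that keeps learning stable."""
    n = len(W)
    for i in range(n):
        for j in range(n):
            W[i][j] += lr * post_now[i] * pre_prev[j]
        total = sum(W[i])
        if total > 0:
            W[i] = [wij * target / total for wij in W[i]]
    return W
```

Driving this rule with a fast-changing sequence (pattern 0, then 1, then 2) concentrates each neuron's incoming weight on its predecessor, the feedforward structure supporting sequential activity; a slowly changing, self-repeating input would instead strengthen recurrent self-connections, the structure supporting persistent activity, consistent with the abstract's conclusion.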
Affiliation(s)
- Ulises Pereira
- Department of Statistics, The University of Chicago, Chicago, IL, United States
- Nicolas Brunel
- Department of Statistics, The University of Chicago, Chicago, IL, United States; Department of Neurobiology, The University of Chicago, Chicago, IL, United States; Department of Neurobiology, Duke University, Durham, NC, United States; Department of Physics, Duke University, Durham, NC, United States
|
69
|
Rapid and sustained homeostatic control of presynaptic exocytosis at a central synapse. Proc Natl Acad Sci U S A 2019; 116:23783-23789. [PMID: 31685637 PMCID: PMC6876255 DOI: 10.1073/pnas.1909675116] [Citation(s) in RCA: 26] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/13/2022] Open
Abstract
Homeostatic mechanisms stabilize neural activity, and there are genetic links between homeostatic plasticity and neural disease. While homeostatic plasticity in the central nervous system (CNS) operates on relatively slow time scales of hours to days, activity-dependent forms of synaptic plasticity alter neural activity on much faster time scales. It is unclear if homeostatic plasticity stabilizes CNS synapses on rapid time scales. Here, we uncovered that cerebellar synapses stabilize transmission within minutes upon activity perturbation. This is achieved through homeostatic control of presynaptic exocytosis. We show that synergistic modulation of distinct presynaptic mechanisms maintains synaptic efficacy not only on rapid but also on prolonged time scales. Homeostatic control of presynaptic exocytosis may be a general mechanism for stabilizing CNS function. Animal behavior is remarkably robust despite constant changes in neural activity. Homeostatic plasticity stabilizes central nervous system (CNS) function on time scales of hours to days. If and how CNS function is stabilized on more rapid time scales remains unknown. Here, we discovered that mossy fiber synapses in the mouse cerebellum homeostatically control synaptic efficacy within minutes after pharmacological glutamate receptor impairment. This rapid form of homeostatic plasticity is expressed presynaptically. We show that modulations of readily releasable vesicle pool size and release probability normalize synaptic strength in a hierarchical fashion upon acute pharmacological and prolonged genetic receptor perturbation. Presynaptic membrane capacitance measurements directly demonstrate regulation of vesicle pool size upon receptor impairment. Moreover, presynaptic voltage-clamp analysis revealed increased Ca2+-current density under specific experimental conditions.
Thus, homeostatic modulation of presynaptic exocytosis through specific mechanisms stabilizes synaptic transmission in a CNS circuit on time scales ranging from minutes to months. Rapid presynaptic homeostatic plasticity may ensure stable neural circuit function in light of rapid activity-dependent plasticity.
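In the standard quantal framework implicit in this abstract, synaptic strength is the product of readily releasable pool size (N), release probability (p), and quantal size (q). A hedged sketch of the hierarchical compensation described (receptor impairment reduces q; release probability and then pool size are adjusted to restore strength) might look like this; the ordering and the cap on p are illustrative assumptions, not measured values:

```python
def epsc(n_vesicles, release_prob, quantal_size):
    """Synaptic strength in the quantal framework: EPSC ~ N * p * q."""
    return n_vesicles * release_prob * quantal_size

def compensate(n, q_impaired, target, p_max=0.9):
    """Hierarchical compensation sketch: first raise release probability to
    restore the target strength; if p alone cannot do it (p would exceed
    p_max), additionally grow the readily releasable pool."""
    p_needed = target / (n * q_impaired)
    if p_needed <= p_max:
        return n, p_needed                    # p adjustment suffices
    return target / (p_max * q_impaired), p_max   # pool size must also grow
```

For example, with a baseline of N = 100, p = 0.5, q = 1 (strength 50), halving q forces both a maximal p and a larger pool, whereas a milder impairment (q = 0.7) is absorbed by raising p alone.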
|
70
|
Neuroscience out of control: control-theoretic perspectives on neural circuit dynamics. Curr Opin Neurobiol 2019; 58:122-129. [DOI: 10.1016/j.conb.2019.09.001] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/27/2019] [Revised: 07/16/2019] [Accepted: 09/03/2019] [Indexed: 12/19/2022]
|
71
|
Rathour RK, Narayanan R. Degeneracy in hippocampal physiology and plasticity. Hippocampus 2019; 29:980-1022. [PMID: 31301166 PMCID: PMC6771840 DOI: 10.1002/hipo.23139] [Citation(s) in RCA: 39] [Impact Index Per Article: 7.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/28/2018] [Revised: 05/27/2019] [Accepted: 06/25/2019] [Indexed: 12/17/2022]
Abstract
Degeneracy, defined as the ability of structurally disparate elements to perform analogous function, has largely been assessed from the perspective of maintaining robustness of physiology or plasticity. How does the framework of degeneracy assimilate into an encoding system where the ability to change is an essential ingredient for storing new incoming information? Could degeneracy maintain the balance between the apparently contradictory goals of the need to change for encoding and the need to resist change towards maintaining homeostasis? In this review, we explore these fundamental questions with the mammalian hippocampus as an example encoding system. We systematically catalog lines of evidence, spanning multiple scales of analysis that point to the expression of degeneracy in hippocampal physiology and plasticity. We assess the potential of degeneracy as a framework to achieve the conjoint goals of encoding and homeostasis without cross-interferences. We postulate that biological complexity, involving interactions among the numerous parameters spanning different scales of analysis, could establish disparate routes towards accomplishing these conjoint goals. These disparate routes then provide several degrees of freedom to the encoding-homeostasis system in accomplishing its tasks in an input- and state-dependent manner. Finally, the expression of degeneracy spanning multiple scales offers an ideal reconciliation to several outstanding controversies, through the recognition that the seemingly contradictory disparate observations are merely alternate routes that the system might recruit towards accomplishment of its goals.
Affiliation(s)
- Rahul K. Rathour
- Cellular Neurophysiology Laboratory, Molecular Biophysics Unit, Indian Institute of Science, Bangalore, India
- Rishikesh Narayanan
- Cellular Neurophysiology Laboratory, Molecular Biophysics Unit, Indian Institute of Science, Bangalore, India
|
72
|
Abstract
The structure of neuronal circuits that subserve cognitive functions in the brain is shaped and refined throughout development and into adulthood. Evidence from human and animal studies suggests that the cellular and synaptic substrates of these circuits are atypical in neuropsychiatric disorders, indicating that altered structural plasticity may be an important part of the disease biology. Advances in genetics have redefined our understanding of neuropsychiatric disorders and have revealed a spectrum of risk factors that impact pathways known to influence structural plasticity. In this Review, we discuss the importance of recent genetic findings on the different mechanisms of structural plasticity and propose that these converge on shared pathways that can be targeted with novel therapeutics.
|
73
|
Hamel R, Côté K, Matte A, Lepage JF, Bernier PM. Rewards interact with repetition-dependent learning to enhance long-term retention of motor memories. Ann N Y Acad Sci 2019; 1452:34-51. [PMID: 31294872 DOI: 10.1111/nyas.14171] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/19/2019] [Revised: 04/26/2019] [Accepted: 05/29/2019] [Indexed: 11/28/2022]
Abstract
The combination of behavioral experiences that enhances long-term retention remains largely unknown. Informed by neurophysiological lines of work, this study tested the hypothesis that performance-contingent monetary rewards potentiate repetition-dependent forms of learning, as induced by extensive practice at asymptote, to enhance long-term retention of motor memories. To this end, six groups of 14 participants (n = 84) acquired novel motor behaviors by adapting to a gradual visuomotor rotation while these factors were manipulated. Retention was assessed 24 h later. While all groups similarly acquired the novel motor behaviors, results from the retention session revealed an interaction indicating that rewards enhanced long-term retention, but only when practice was extended to asymptote. Specifically, the interaction indicated that this effect selectively occurred when rewards were intermittently available (i.e., 50%), but not when they were absent (i.e., 0%) or continuously available (i.e., 100%) during acquisition. This suggests that the influence of rewards on extensive practice and long-term retention is nonlinear, as continuous rewards did not further enhance retention as compared with intermittent rewards. One possibility is that intermittent availability allows rewards to retain their subjective value during acquisition, which may be key to potentiating long-term retention.
Affiliation(s)
- Raphaël Hamel
- Département de Pédiatrie, Faculté de Médecine et des Sciences de la Santé, Université de Sherbrooke, Sherbrooke, Québec, Canada; Département de Kinanthropologie, Faculté des Sciences de l'Activité Physique, Université de Sherbrooke, Sherbrooke, Québec, Canada
- Kathleen Côté
- Département de Pédiatrie, Faculté de Médecine et des Sciences de la Santé, Université de Sherbrooke, Sherbrooke, Québec, Canada
- Alexia Matte
- Département de Pédiatrie, Faculté de Médecine et des Sciences de la Santé, Université de Sherbrooke, Sherbrooke, Québec, Canada
- Jean-François Lepage
- Département de Pédiatrie, Faculté de Médecine et des Sciences de la Santé, Université de Sherbrooke, Sherbrooke, Québec, Canada
- Pierre-Michel Bernier
- Département de Kinanthropologie, Faculté des Sciences de l'Activité Physique, Université de Sherbrooke, Sherbrooke, Québec, Canada
|
74
|
Sandvig A, Sandvig I. Connectomics of Morphogenetically Engineered Neurons as a Predictor of Functional Integration in the Ischemic Brain. Front Neurol 2019; 10:630. [PMID: 31249553 PMCID: PMC6582372 DOI: 10.3389/fneur.2019.00630] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/18/2019] [Accepted: 05/28/2019] [Indexed: 11/13/2022] Open
Abstract
Recent advances in cell reprogramming technologies enable the in vitro generation of theoretically unlimited numbers of cells, including cells of neural lineage and specific neuronal subtypes from human, including patient-specific, somatic cells. Similarly, as demonstrated in recent animal studies, by applying morphogenetic neuroengineering principles in situ, it is possible to reprogram resident brain cells to the desired phenotype. These developments open exciting new possibilities for cell replacement therapy in stroke, albeit not without caveats. The main challenges include achieving successful integration of engineered cells in the ischemic brain to promote functional restoration, and the fact that the underlying mechanisms of action are not fully understood. In this review, we aim to provide new insights into the above in the context of connectomics of morphogenetically engineered neural networks. Specifically, we discuss the relevance of combining advanced interdisciplinary approaches to: validate the functionality of engineered neurons by studying their self-organizing behavior into neural networks as well as responses to stroke-related pathology in vitro; derive structural and functional connectomes from these networks in healthy and perturbed conditions; and identify and extract key elements regulating neural network dynamics, which might predict the behavior of grafted engineered neurons post-transplantation in the stroke-injured brain.
Affiliation(s)
- Axel Sandvig
- Department of Neuromedicine and Movement Science, Faculty of Medicine and Health Sciences, Norwegian University of Science and Technology, Trondheim, Norway; Department of Neurology, St. Olav's Hospital, Trondheim University Hospital, Trondheim, Norway; Department of Pharmacology and Clinical Neurosciences, Division of Neuro, Head, and Neck, Umeå University Hospital, Umeå, Sweden
- Ioanna Sandvig
- Department of Neuromedicine and Movement Science, Faculty of Medicine and Health Sciences, Norwegian University of Science and Technology, Trondheim, Norway
|
75
|
Deger M, Seeholzer A, Gerstner W. Multicontact Co-operativity in Spike-Timing-Dependent Structural Plasticity Stabilizes Networks. Cereb Cortex 2019; 28:1396-1415. [PMID: 29300903 PMCID: PMC6041941 DOI: 10.1093/cercor/bhx339] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/14/2017] [Accepted: 11/30/2017] [Indexed: 12/12/2022] Open
Abstract
Excitatory synaptic connections in the adult neocortex consist of multiple synaptic contacts, almost exclusively formed on dendritic spines. Changes of spine volume, a correlate of synaptic strength, can be tracked in vivo for weeks. Here, we present a combined model of structural and spike-timing–dependent plasticity that explains the multicontact configuration of synapses in adult neocortical networks under steady-state and lesion-induced conditions. Our plasticity rule with Hebbian and anti-Hebbian terms stabilizes both the postsynaptic firing rate and correlations between the pre- and postsynaptic activity at an active synaptic contact. Contacts appear spontaneously at a low rate and disappear if their strength approaches zero. Many presynaptic neurons compete to make strong synaptic connections onto a postsynaptic neuron, whereas the synaptic contacts of a given presynaptic neuron co-operate via postsynaptic firing. We find that co-operation of multiple synaptic contacts is crucial for stable, long-term synaptic memories. In simulations of a simplified network model of barrel cortex, our plasticity rule reproduces whisker-trimming–induced rewiring of thalamocortical and recurrent synaptic connectivity on realistic time scales.
Affiliation(s)
- Moritz Deger
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland; Institute for Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, 50674 Cologne, Germany
- Alexander Seeholzer
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
|
76
|
Letellier M, Levet F, Thoumine O, Goda Y. Differential role of pre- and postsynaptic neurons in the activity-dependent control of synaptic strengths across dendrites. PLoS Biol 2019; 17:e2006223. [PMID: 31166943 PMCID: PMC6576792 DOI: 10.1371/journal.pbio.2006223] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/02/2018] [Revised: 06/17/2019] [Accepted: 05/17/2019] [Indexed: 01/07/2023] Open
Abstract
Neurons receive a large number of active synaptic inputs from their many presynaptic partners across their dendritic tree. However, little is known about how the strengths of individual synapses are controlled in balance with other synapses to effectively encode information while maintaining network homeostasis. This is in part due to the difficulty in assessing the activity of individual synapses with identified afferent and efferent connections for a synapse population in the brain. Here, to gain insights into the basic cellular rules that drive the activity-dependent spatial distribution of pre- and postsynaptic strengths across incoming axons and dendrites, we combine patch-clamp recordings with live-cell imaging of hippocampal pyramidal neurons in dissociated cultures and organotypic slices. Under basal conditions, both pre- and postsynaptic strengths cluster on single dendritic branches according to the identity of the presynaptic neurons, thus highlighting the ability of single dendritic branches to exhibit input specificity. Stimulating a single presynaptic neuron induces input-specific and dendritic branchwise spatial clustering of presynaptic strengths, which accompanies a widespread multiplicative scaling of postsynaptic strengths in dissociated cultures and heterosynaptic plasticity at distant synapses in organotypic slices. Our study provides evidence for a potential homeostatic mechanism by which the rapid changes in global or distant postsynaptic strengths compensate for input-specific presynaptic plasticity.
Affiliation(s)
- Mathieu Letellier
- RIKEN Brain Science Institute, Wako, Saitama, Japan
- Interdisciplinary Institute for Neuroscience, University of Bordeaux, Bordeaux, France
- Interdisciplinary Institute for Neuroscience, Centre National de la Recherche Scientifique (CNRS) UMR 5297, Bordeaux, France
- * E-mail: (ML); (YG)
- Florian Levet
- Interdisciplinary Institute for Neuroscience, University of Bordeaux, Bordeaux, France
- Interdisciplinary Institute for Neuroscience, Centre National de la Recherche Scientifique (CNRS) UMR 5297, Bordeaux, France
- Bordeaux Imaging Center, University of Bordeaux, Bordeaux, France
- Bordeaux Imaging Center, CNRS UMS 3420, Bordeaux, France
- Bordeaux Imaging Center, INSERM US04, Bordeaux, France
- Olivier Thoumine
- Interdisciplinary Institute for Neuroscience, University of Bordeaux, Bordeaux, France
- Interdisciplinary Institute for Neuroscience, Centre National de la Recherche Scientifique (CNRS) UMR 5297, Bordeaux, France
- Yukiko Goda
- RIKEN Center for Brain Science, Wako, Saitama, Japan
- * E-mail: (ML); (YG)
|
77
|
Herpich J, Tetzlaff C. Principles underlying the input-dependent formation and organization of memories. Netw Neurosci 2019; 3:606-634. [PMID: 31157312 PMCID: PMC6542621 DOI: 10.1162/netn_a_00086] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/26/2018] [Accepted: 03/21/2019] [Indexed: 11/29/2022] Open
Abstract
The neuronal system exhibits the remarkable ability to dynamically store and organize incoming information into a web of memory representations (items), which is essential for the generation of complex behaviors. Central to memory function is that such memory items must be (1) discriminated from each other, (2) associated with each other, or (3) brought into a sequential order. However, how these three basic mechanisms are robustly implemented in an input-dependent manner by the underlying complex neuronal and synaptic dynamics is still unknown. Here, we develop a mathematical framework that links the different synaptic mechanisms determining the neuronal and synaptic dynamics of the network to the emergence of these basic memory functions. Combining correlation-based synaptic plasticity and homeostatic synaptic scaling, we demonstrate that these mechanisms enable the reliable formation of sequences and associations between two memory items, but still lack the capability for discrimination. We show that this shortcoming can be removed by additionally considering inhibitory synaptic plasticity. Thus, the framework presented here provides a new, functionally motivated link between different known synaptic mechanisms, leading to the self-organization of fundamental memory mechanisms.
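The combination of a correlation-based (Hebbian) term with homeostatic synaptic scaling can be written as a single weight update. One common form, used here purely as an illustration with arbitrary constants and not necessarily the paper's equations, is:

```python
def plasticity_step(w, r_pre, r_post, target=1.0, mu=0.01, gamma=0.005):
    """One Euler step of Hebbian growth plus homeostatic synaptic scaling:
        dw/dt = mu * r_pre * r_post + gamma * (target**2 - r_post**2) * w
    The scaling term multiplies the weight by a factor acting as negative
    feedback on postsynaptic activity, preventing runaway Hebbian growth."""
    return w + mu * r_pre * r_post + gamma * (target ** 2 - r_post ** 2) * w
```

Iterating this with a linear neuron (r_post = w * r_pre, r_pre = 1) shows the point of the combination: the pure Hebbian term alone would diverge, whereas the scaling term pulls the weight to a stable fixed point (here w^2 = 1 + mu/gamma = 3, i.e., w = sqrt(3)).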
Affiliation(s)
- Juliane Herpich
- Department of Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August-University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
- Christian Tetzlaff
- Department of Computational Neuroscience, Third Institute of Physics - Biophysics, Georg-August-University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
|
78
|
Deep learning in bioinformatics: Introduction, application, and perspective in the big data era. Methods 2019; 166:4-21. [PMID: 31022451 DOI: 10.1016/j.ymeth.2019.04.008] [Citation(s) in RCA: 132] [Impact Index Per Article: 26.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/14/2018] [Revised: 03/23/2019] [Accepted: 04/15/2019] [Indexed: 12/13/2022] Open
Abstract
Deep learning, which is especially powerful in handling big data, has achieved great success in various fields, including bioinformatics. With the advances of the big data era in biology, it is foreseeable that deep learning will become increasingly important in the field and will be incorporated into the vast majority of analysis pipelines. In this review, we provide both an accessible introduction to deep learning and concrete examples and implementations of its representative applications in bioinformatics. We start from the recent achievements of deep learning in the bioinformatics field, pointing out the problems that are well suited to deep learning. We then introduce deep learning in an easy-to-understand fashion, from shallow neural networks to convolutional neural networks, recurrent neural networks, graph neural networks, generative adversarial networks, variational autoencoders, and the most recent state-of-the-art architectures. After that, we provide eight examples, covering five bioinformatics research directions and all four kinds of data types, with implementations written in Tensorflow and Keras. Finally, we discuss common issues, such as overfitting and interpretability, that users will encounter when adopting deep learning methods, and provide corresponding suggestions. The implementations are freely available at https://github.com/lykaust15/Deep_learning_examples.
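The simplest member of the model family such reviews walk through is a shallow network with a sigmoid output, i.e., logistic regression trained by gradient descent. The sketch below is dependency-free plain Python rather than the review's Tensorflow/Keras examples, so it stands alone; the function names and hyperparameters are illustrative:

```python
import math

def train_logreg(X, y, lr=0.5, epochs=200):
    """Minimal 'shallow neural network': logistic regression trained by
    stochastic gradient descent on the cross-entropy loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))        # sigmoid activation
            g = p - yi                            # gradient of cross-entropy w.r.t. z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, xi):
    """Threshold the linear score at 0 (sigmoid at 0.5)."""
    return 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0
```

Trained on a small linearly separable dataset (e.g., the logical AND of two inputs), it learns a correct decision boundary; deeper architectures extend this same forward pass/gradient pattern with more layers and nonlinearities.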
|
79
|
Gray JM, Spiegel I. Cell-type-specific programs for activity-regulated gene expression. Curr Opin Neurobiol 2018; 56:33-39. [PMID: 30529822 DOI: 10.1016/j.conb.2018.11.001] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2018] [Revised: 11/04/2018] [Accepted: 11/05/2018] [Indexed: 12/20/2022]
Abstract
Experience leaves a lasting mark on neural circuit function in part through activity-regulated gene (ARG) expression. New genome-wide approaches have revealed that ARG programs are highly cell-type-specific, raising the possibility that they mediate different forms of experience-dependent plasticity in different cell types. The cell-type specificity of these gene programs is achieved by a combination of cell-intrinsic mechanisms, which determine the transcriptional response of each neuronal subtype to a given stimulus, and cell-extrinsic mechanisms, which influence the nature of the stimulus a cell receives. A better understanding of these mechanisms could usher in an era of molecular systems neuroscience in which genetic perturbations of cell-type-specific plasticity are assessed using electrophysiology and in vivo imaging to reveal the neural basis of adaptive behaviors.
Affiliation(s)
- Jesse M Gray
- Department of Genetics, Harvard Medical School, Boston, United States.
- Ivo Spiegel
- Department of Neurobiology, Weizmann Institute of Science, 76100 Rehovot, Israel.
|
80
|
Modulator-Gated, SUMOylation-Mediated, Activity-Dependent Regulation of Ionic Current Densities Contributes to Short-Term Activity Homeostasis. J Neurosci 2018; 39:596-611. [PMID: 30504282 DOI: 10.1523/jneurosci.1379-18.2018] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/30/2018] [Revised: 10/23/2018] [Accepted: 11/03/2018] [Indexed: 02/07/2023] Open
Abstract
Neurons operate within defined activity limits, and feedback control mechanisms dynamically tune ionic currents to maintain this optimal range. This study describes a novel, rapid feedback mechanism that uses SUMOylation to continuously adjust ionic current densities according to changes in activity. Small ubiquitin-like modifier (SUMO) is a peptide that can be post-translationally conjugated to ion channels to influence their surface expression and biophysical properties. Neuronal activity can regulate the extent of protein SUMOylation. This study on the single, unambiguously identifiable lateral pyloric neuron (LP), a component of the pyloric network in the stomatogastric nervous system of male and female spiny lobsters (Panulirus interruptus), focused on dynamic SUMOylation in the context of activity homeostasis. There were four major findings: First, neuronal activity adjusted the balance between SUMO conjugation and deconjugation to continuously and bidirectionally fine-tune the densities of two opposing conductances: the hyperpolarization-activated current (Ih) and the transient potassium current (IA). Second, tonic 5 nM dopamine (DA) gated activity-dependent SUMOylation to permit and prevent activity-dependent regulation of Ih and IA, respectively. Third, DA-gated, activity-dependent SUMOylation contributed to a feedback mechanism that restored the timing and duration of LP activity during prolonged modulation by 5 μM DA, which initially altered these and other activity features. Fourth, DA modulatory and metamodulatory (gating) effects were tailored to simultaneously alter and stabilize neuronal output. Our findings suggest that modulatory tone may select a subset of rapid activity-dependent mechanisms from a larger menu to achieve homeostasis under varying conditions. SIGNIFICANCE STATEMENT Post-translational SUMOylation of ion channel subunits controls their interactions. When subunit SUMOylation is dysregulated, conductance densities mediated by the channels are distorted, leading to nervous system disorders, such as seizures and chronic pain. Regulation of ion channel SUMOylation is poorly understood. This study demonstrated that neuronal activity can regulate SUMOylation to reconfigure ionic current densities over minutes, and this regulation was gated by tonic nanomolar dopamine. Dynamic SUMOylation was necessary to maintain specific aspects of neuronal output while the neuron was being modulated by high (5 μM) concentrations of dopamine, suggesting that the gating function may ensure neuronal homeostasis during extrinsic modulation of a circuit.
|
81
|
Parisi GI, Tani J, Weber C, Wermter S. Lifelong Learning of Spatiotemporal Representations With Dual-Memory Recurrent Self-Organization. Front Neurorobot 2018; 12:78. [PMID: 30546302 PMCID: PMC6279894 DOI: 10.3389/fnbot.2018.00078] [Citation(s) in RCA: 25] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/29/2018] [Accepted: 11/06/2018] [Indexed: 11/28/2022] Open
Abstract
Artificial autonomous agents and robots interacting in complex environments are required to continually acquire and fine-tune knowledge over sustained periods of time. The ability to learn from continuous streams of information is referred to as lifelong learning and represents a long-standing challenge for neural network models due to catastrophic forgetting in which novel sensory experience interferes with existing representations and leads to abrupt decreases in the performance on previously acquired knowledge. Computational models of lifelong learning typically alleviate catastrophic forgetting in experimental scenarios with given datasets of static images and limited complexity, thereby differing significantly from the conditions artificial agents are exposed to. In more natural settings, sequential information may become progressively available over time and access to previous experience may be restricted. Therefore, specialized neural network mechanisms are required that adapt to novel sequential experience while preventing disruptive interference with existing representations. In this paper, we propose a dual-memory self-organizing architecture for lifelong learning scenarios. The architecture comprises two growing recurrent networks with the complementary tasks of learning object instances (episodic memory) and categories (semantic memory). Both growing networks can expand in response to novel sensory experience: the episodic memory learns fine-grained spatiotemporal representations of object instances in an unsupervised fashion while the semantic memory uses task-relevant signals to regulate structural plasticity levels and develop more compact representations from episodic experience. For the consolidation of knowledge in the absence of external sensory input, the episodic memory periodically replays trajectories of neural reactivations. 
We evaluate the proposed model on the CORe50 benchmark dataset for continuous object recognition, showing that we significantly outperform current methods of lifelong learning in three different incremental learning scenarios.
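The growing recurrent networks described above expand by inserting new neurons when existing ones match novel input poorly. The following is a heavily simplified, non-recurrent sketch of that grow-when-required idea; the threshold, learning rate, and toy clustered data are illustrative assumptions, not the paper's actual model.

```python
import math
import random

def grow_when_required(samples, insert_threshold=0.5, lr=0.1):
    """Simplified growing-network step: if no existing prototype matches
    the input well enough, insert a new node; otherwise move the winner
    toward the input."""
    nodes = [list(samples[0])]                # seed with the first sample
    for x in samples[1:]:
        dists = [math.dist(x, n) for n in nodes]
        winner = min(range(len(nodes)), key=dists.__getitem__)
        activity = math.exp(-dists[winner])   # near 1 for a close match
        if activity < insert_threshold:
            nodes.append(list(x))             # grow the network
        else:
            for d in range(len(x)):           # adapt the best-matching node
                nodes[winner][d] += lr * (x[d] - nodes[winner][d])
    return nodes

random.seed(0)
cluster_a = [(random.gauss(0, 0.05), random.gauss(0, 0.05)) for _ in range(20)]
cluster_b = [(random.gauss(5, 0.05), random.gauss(5, 0.05)) for _ in range(20)]
prototypes = grow_when_required(cluster_a + cluster_b)
```

With two well-separated clusters, the network ends up with one prototype per cluster: the second cluster's first sample triggers an insertion, after which each winner is only fine-tuned.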
Affiliation(s)
- German I. Parisi
- Knowledge Technology, Department of Informatics, Universität Hamburg, Hamburg, Germany
- Jun Tani
- Cognitive Neurorobotics Research Unit, Okinawa Institute of Science and Technology, Okinawa, Japan
- Cornelius Weber
- Knowledge Technology, Department of Informatics, Universität Hamburg, Hamburg, Germany
- Stefan Wermter
- Knowledge Technology, Department of Informatics, Universität Hamburg, Hamburg, Germany
|
82
|
Henderson JA, Gong P. Functional mechanisms underlie the emergence of a diverse range of plasticity phenomena. PLoS Comput Biol 2018; 14:e1006590. [PMID: 30419014 PMCID: PMC6258383 DOI: 10.1371/journal.pcbi.1006590] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2018] [Revised: 11/26/2018] [Accepted: 10/23/2018] [Indexed: 11/18/2022] Open
Abstract
Diverse plasticity mechanisms are orchestrated to shape the spatiotemporal dynamics underlying brain functions. However, why these plasticity rules emerge and how their dynamics interact with neural activity to give rise to complex neural circuit dynamics remains largely unknown. Here we show that both Hebbian and homeostatic plasticity rules emerge from a functional perspective of neuronal dynamics whereby each neuron learns to encode its own activity in the population activity, so that the activity of the presynaptic neuron can be decoded from the activity of its postsynaptic neurons. We explain how a range of experimentally observed plasticity phenomena with widely separated time scales emerge from learning this encoding function, including STDP and its frequency dependence, and metaplasticity. We show that when implemented in neural circuits, these plasticity rules naturally give rise to essential neural response properties, including variable neural dynamics with balanced excitation and inhibition, and approximately log-normal distributions of synaptic strengths, while simultaneously encoding a complex real-world visual stimulus. These findings establish a novel function-based account of diverse plasticity mechanisms, providing a unifying framework relating plasticity, dynamics and neural computation. Many experiments have documented a variety of ways in which the connectivity strengths between neurons change in response to the activity of neurons. These changes are an important part of learning. However, it is not understood how such a diverse range of observations can be understood as consequences of an underlying algorithm used by brains for learning. In order to understand such a learning algorithm it is also necessary to understand the neural computation that is being learned, that is, how the functions of the brain are encoded in the activity of its neurons and its connectivity. 
In this work we propose a simple way in which information can be encoded and decoded in a network of neurons for operating on real-world stimuli, and how this can be learned using two fundamental plasticity rules that change the strength of connections between neurons in response to neural activity. Surprisingly, many experimental observations result as consequences of this approach, indicating that studying the learning of function provides a novel framework for unifying plasticity, dynamics, and neural computation.
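The interplay of Hebbian and homeostatic rules discussed above can be caricatured in a few lines: a Hebbian term strengthens synapses whose inputs co-occur with postsynaptic firing, while a homeostatic term multiplicatively scales all weights toward a target rate. This is a generic rate-based sketch with illustrative parameters, not the encoding-based rule the paper derives.

```python
def update_weights(w, pre_rates, post_rate, target_rate,
                   eta_hebb=0.01, eta_home=0.1):
    """Hebbian term strengthens synapses with co-active pre/post firing;
    homeostatic term scales all weights by one factor to pull the
    postsynaptic rate back toward its target."""
    w = [wi + eta_hebb * pre * post_rate for wi, pre in zip(w, pre_rates)]
    scale = 1.0 + eta_home * (target_rate - post_rate) / target_rate
    return [wi * scale for wi in w]

w = [0.5, 0.5]
pre = [1.0, 2.0]          # fixed presynaptic rates
for _ in range(200):
    post = sum(wi * p for wi, p in zip(w, pre))   # linear rate model
    w = update_weights(w, pre, post, target_rate=1.0)
```

Note that the steady-state rate settles somewhat above the target: the Hebbian drive never vanishes, so the two terms balance rather than the homeostatic term winning outright, and the synapse with the stronger input ends up heavier.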
Affiliation(s)
- James A. Henderson
- School of Physics, The University of Sydney, Sydney, NSW, Australia
- ARC Centre of Excellence for Integrative Brain Function, The University of Sydney, Sydney, NSW, Australia
- Pulin Gong
- School of Physics, The University of Sydney, Sydney, NSW, Australia
- ARC Centre of Excellence for Integrative Brain Function, The University of Sydney, Sydney, NSW, Australia
|
83
|
Goodhill GJ. Theoretical Models of Neural Development. iScience 2018; 8:183-199. [PMID: 30321813 PMCID: PMC6197653 DOI: 10.1016/j.isci.2018.09.017] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2018] [Revised: 08/06/2018] [Accepted: 09/19/2018] [Indexed: 12/22/2022] Open
Abstract
Constructing a functioning nervous system requires the precise orchestration of a vast array of mechanical, molecular, and neural-activity-dependent cues. Theoretical models can play a vital role in helping to frame quantitative issues, reveal mathematical commonalities between apparently diverse systems, identify what is and what is not possible in principle, and test the abilities of specific mechanisms to explain the data. This review focuses on the progress that has been made over the last decade in our theoretical understanding of neural development.
Affiliation(s)
- Geoffrey J Goodhill
- Queensland Brain Institute and School of Mathematics and Physics, The University of Queensland, St Lucia, QLD 4072, Australia.
|
84
|
Removal of area CA3 from hippocampal slices induces postsynaptic plasticity at Schaffer collateral synapses that normalizes CA1 pyramidal cell discharge. Neurosci Lett 2018; 678:55-61. [PMID: 29738844 DOI: 10.1016/j.neulet.2018.05.011] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/07/2018] [Revised: 04/24/2018] [Accepted: 05/04/2018] [Indexed: 11/23/2022]
Abstract
Neural networks that undergo acute insults display remarkable reorganization. This injury-related plasticity is thought to permit recovery of function in the face of damage that cannot be reversed. Previously, an increase in the transmission strength at Schaffer collateral to CA1 pyramidal cell synapses was observed after long-term activity reduction in organotypic hippocampal slices. Here we report that, following acute preparation of adult rat hippocampal slices and surgical removal of area CA3, input to area CA1 was reduced and Schaffer collateral synapses underwent functional strengthening. This increase in synaptic strength was limited to Schaffer collateral inputs (no alteration to temporoammonic synapses) and acted to normalize postsynaptic discharge, supporting a homeostatic or compensatory response. Short-term plasticity was not altered, but an increase in immunohistochemical labeling of GluA1 subunits was observed in the stratum radiatum (but not stratum moleculare), suggesting increased numbers of α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid receptors (AMPARs) and a postsynaptic locus of expression. Combined, these data support the idea that, in response to the reduction in presynaptic activity caused by removal of area CA3, Schaffer collateral synapses undergo a relatively rapid increase in functional efficacy likely supported by insertion of more AMPARs, which maintains postsynaptic excitability in CA1 pyramidal neurons. This novel fast compensatory plasticity exhibits properties that would allow it to maintain optimal network activity levels in the hippocampus, a brain structure lauded for its ongoing experience-dependent malleability.
|
85
|
Bogodvid TK, Andrianov VV, Deryabina IB, Muranova LN, Silantyeva DI, Vinarskaya A, Balaban PM, Gainutdinov KL. Responses of Withdrawal Interneurons to Serotonin Applications in Naïve and Learned Snails Are Different. Front Cell Neurosci 2017; 11:403. [PMID: 29311833 PMCID: PMC5735116 DOI: 10.3389/fncel.2017.00403] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/18/2017] [Accepted: 12/04/2017] [Indexed: 02/04/2023] Open
Abstract
Long-term changes in membrane potential after associative training were described previously in identified premotor interneurons for withdrawal of the terrestrial snail Helix. Serotonin was shown to be a major transmitter involved in triggering the long-term changes in mollusks. In the present study we compared the changes in electrophysiological characteristics of identifiable premotor interneurons for withdrawal in response to bath applications of serotonin (5-HT) or serotonin precursor 5-hydroxytryptophan (5-HTP) in preparations from naïve, neurotoxin-injected or associatively trained snails. It was found that 5-HT or 5-HTP applications caused a significant decrease of membrane potential in premotor interneurons of naïve snails, associatively trained snails and snails with impaired serotonergic system by injection of a selective neurotoxin 5,7-dihydroxytryptamine (5,7-DHT) 1 week before the experiments. Applications of 5-HT or 5-HTP did not cause significant changes in the action potential (AP) threshold potential of these neurons in naïve snails. Conversely, applications of 5-HT or 5-HTP to the premotor interneurons of previously trained or 5,7-DHT-injected snails caused a significant increase in the firing threshold potential in spite of a depolarizing shift of the resting membrane potential. Results demonstrate that responsiveness of premotor interneurons to extracellularly applied 5-HT or 5-HTP changes for days after the associative training or serotonin depletion. Similarity of the effects in trained and 5,7-DHT-injected animals may be due to massive release of serotonin elicited by 5,7-DHT injection. Our results suggest that serotonin release due to aversive conditioning or elicited by the neurotoxin administration triggers similar changes in resting membrane potential and AP threshold in response to bath applications of 5-HT or its precursor 5-HTP.
Affiliation(s)
- Tatiana K. Bogodvid
- Laboratory of Neuroreabilitation of Motor Disorders, Institute of Fundamental Medicine and Biology, Kazan Federal University, Kazan, Russia
- Department of Biomedical Sciences, Volga Region State Academy of Physical Culture, Sport and Tourism, Kazan, Russia
- Vyatcheslav V. Andrianov
- Laboratory of Neuroreabilitation of Motor Disorders, Institute of Fundamental Medicine and Biology, Kazan Federal University, Kazan, Russia
- Irina B. Deryabina
- Laboratory of Neuroreabilitation of Motor Disorders, Institute of Fundamental Medicine and Biology, Kazan Federal University, Kazan, Russia
- Lyudmila N. Muranova
- Laboratory of Neuroreabilitation of Motor Disorders, Institute of Fundamental Medicine and Biology, Kazan Federal University, Kazan, Russia
- Dinara I. Silantyeva
- Laboratory of Neuroreabilitation of Motor Disorders, Institute of Fundamental Medicine and Biology, Kazan Federal University, Kazan, Russia
- Aliya Vinarskaya
- Laboratory of Cellular Neurobiology of Learning, Institute of Higher Nervous Activity and Neurophysiology, Russian Academy of Sciences, Moscow, Russia
- Pavel M. Balaban
- Laboratory of Cellular Neurobiology of Learning, Institute of Higher Nervous Activity and Neurophysiology, Russian Academy of Sciences, Moscow, Russia
- Khalil L. Gainutdinov
- Laboratory of Neuroreabilitation of Motor Disorders, Institute of Fundamental Medicine and Biology, Kazan Federal University, Kazan, Russia
|
86
|
Parisi GI, Tani J, Weber C, Wermter S. Lifelong learning of human actions with deep neural network self-organization. Neural Netw 2017; 96:137-149. [DOI: 10.1016/j.neunet.2017.09.001] [Citation(s) in RCA: 41] [Impact Index Per Article: 5.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2017] [Revised: 08/23/2017] [Accepted: 09/01/2017] [Indexed: 10/18/2022]
|
87
|
Qiao N, Bartolozzi C, Indiveri G. An Ultralow Leakage Synaptic Scaling Homeostatic Plasticity Circuit With Configurable Time Scales up to 100 ks. IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS 2017; 11:1271-1277. [PMID: 29293423 DOI: 10.1109/tbcas.2017.2754383] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/07/2023]
Abstract
Homeostatic plasticity is a stabilizing mechanism commonly observed in real neural systems that allows neurons to maintain their activity around a functional operating point. This phenomenon can be used in neuromorphic systems to compensate for slowly changing conditions or chronic shifts in the system configuration. However, to avoid interference with other adaptation or learning processes active in the neuromorphic system, it is important that the homeostatic plasticity mechanism operates on time scales that are much longer than conventional synaptic plasticity ones. In this paper we present an ultralow leakage circuit, integrated into an automatic gain control scheme, that can implement the synaptic scaling homeostatic process over extremely long time scales. Synaptic scaling consists in globally scaling the synaptic weights of all synapses impinging onto a neuron maintaining their relative differences, to preserve the effects of learning. The scheme we propose controls the global gain of analog log-domain synapse circuits to keep the neuron's average firing rate constant around a set operating point, over extremely long time scales. To validate the proposed scheme, we implemented the ultralow leakage synaptic scaling homeostatic plasticity circuit in a standard 0.18 μm complementary metal-oxide-semiconductor process, and integrated it in an array of dynamic synapses connected to an adaptive integrate and fire neuron. The circuit occupies a silicon area of 84 μm × 22 μm and consumes approximately 10.8 nW with a 1.8 V supply voltage. We present experimental results from the homeostatic circuit and demonstrate how it can be configured to exhibit time scales of up to 100 ks, thanks to a controllable leakage current that can be scaled down to 0.45 aA (2.8 electrons per second).
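In software terms, the synaptic scaling operation this circuit implements in analog hardware is simple: multiply every synaptic weight by the same activity-dependent factor, so the relative differences installed by learning are preserved while the average rate is steered toward the set point. A toy sketch follows; the gain, rates, and linear rate model are illustrative assumptions, not the chip's parameters.

```python
def synaptic_scaling(weights, avg_rate, target_rate, gain=0.05):
    """Scale every synapse by one common factor, nudging the neuron's
    average rate toward the set point while preserving the relative
    weight differences installed by learning."""
    factor = 1.0 + gain * (target_rate - avg_rate) / target_rate
    return [w * factor for w in weights]

weights = [0.2, 0.4, 0.8]              # learned weights; ratios 1:2:4
target = 5.0                           # Hz set point
for _ in range(300):
    rate = 15.0 * sum(weights) / 1.4   # toy linear rate model (15 Hz initially)
    weights = synaptic_scaling(weights, rate, target)
rate = 15.0 * sum(weights) / 1.4
```

Because every weight is multiplied by the same factor, the 1:2:4 ratios survive even as the absolute weights shrink to bring the rate down to the set point.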
|
88
|
Plasticity of intrinsic excitability during LTD is mediated by bidirectional changes in h-channel activity. Sci Rep 2017; 7:14418. [PMID: 29089586 PMCID: PMC5663755 DOI: 10.1038/s41598-017-14874-z] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/29/2017] [Accepted: 10/18/2017] [Indexed: 11/12/2022] Open
Abstract
The polarity of excitability changes associated with induction of Long-Term synaptic Depression (LTD) in CA1 pyramidal neurons is a contentious issue. Postsynaptic neuronal excitability after LTD induction is found to be reduced in certain cases (i.e. synergistic changes) but enhanced in others (i.e. compensatory or homeostatic). We examined here whether these divergent findings could result from the activation of two separate mechanisms converging onto a single learning rule linking synergistic and homeostatic plasticity. We show that the magnitude of LTD induced with low frequency stimulation (LFS) of the Schaffer collaterals determines the polarity of intrinsic changes in CA1 pyramidal neurons. Apparent input resistance (Rin) is reduced following induction of moderate LTD (<20–30%). In contrast, Rin is increased after induction of large LTD (>40%) induced by repetitive episodes of LFS. The up-regulation of Ih observed after moderate LTD results from the activation of NMDA receptors whereas the down-regulation of Ih is due to activation of mGluR1 receptors. These changes in Rin were associated with changes in intrinsic excitability. In conclusion, our study indicates that changes in excitability after LTD induction follow a learning rule describing a continuum linking synergistic and compensatory changes in excitability.
|
89
|
Sprekeler H. Functional consequences of inhibitory plasticity: homeostasis, the excitation-inhibition balance and beyond. Curr Opin Neurobiol 2017; 43:198-203. [PMID: 28500933 DOI: 10.1016/j.conb.2017.03.014] [Citation(s) in RCA: 45] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/28/2016] [Revised: 03/12/2017] [Accepted: 03/22/2017] [Indexed: 11/18/2022]
Abstract
Computational neuroscience has a long-standing tradition of investigating the consequences of excitatory synaptic plasticity. In contrast, the functions of inhibitory plasticity are still largely nebulous, particularly given the bewildering diversity of interneurons in the brain. Here, we review recent computational advances that provide first suggestions for the functional roles of inhibitory plasticity, such as a maintenance of the excitation-inhibition balance, a stabilization of recurrent network dynamics and a decorrelation of sensory responses. The field is still in its infancy, but given the existing body of theory for excitatory plasticity, it is likely to mature quickly and deliver important insights into the self-organization of inhibitory circuits in the brain.
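One concrete rule from this literature strengthens inhibition when the postsynaptic rate exceeds a target and weakens it otherwise, driving the neuron toward an excitation-inhibition balance at its set point. Below is a rate-based simplification of that idea (in the spirit of the inhibitory STDP rule of Vogels and colleagues); all parameters are illustrative, not taken from the review.

```python
def inhibitory_plasticity(w_inh, pre_rate, post_rate, target_rate, eta=0.001):
    """Strengthen inhibition when the postsynaptic rate is above target,
    weaken it when below; the weight is kept non-negative."""
    return max(0.0, w_inh + eta * pre_rate * (post_rate - target_rate))

w_exc, w_inh = 2.0, 0.5            # fixed excitation, plastic inhibition
pre_exc = pre_inh = 10.0           # presynaptic rates (Hz)
for _ in range(500):
    post = max(0.0, w_exc * pre_exc - w_inh * pre_inh)   # rectified linear rate
    w_inh = inhibitory_plasticity(w_inh, pre_inh, post, target_rate=5.0)
```

Starting with too little inhibition (post well above target), the inhibitory weight grows until the postsynaptic rate sits at the 5 Hz set point, illustrating the homeostatic role discussed above.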
Affiliation(s)
- Henning Sprekeler
- Department for Electrical Engineering and Computer Science, Berlin Institute of Technology, and Bernstein Center for Computational Neuroscience, Marchstr. 23, 10587 Berlin, Germany.
|