1. Rößler N, Smilovic D, Vuksic M, Jedlicka P, Deller T. Maintenance of Lognormal-Like Skewed Dendritic Spine Size Distributions in Dentate Granule Cells of TNF, TNF-R1, TNF-R2, and TNF-R1/2-Deficient Mice. J Comp Neurol 2024; 532:e25645. PMID: 38943486. DOI: 10.1002/cne.25645.
Abstract
Dendritic spines are sites of synaptic plasticity, and their head size correlates with the strength of the corresponding synapse. We recently showed that the distribution of spine head sizes follows a lognormal-like distribution even after blockade of activity or induction of plasticity. Because the cytokine tumor necrosis factor (TNF) influences synaptic transmission, and constitutive deficiencies in TNF and its receptors (TNF-R) alter spine head size distributions, we tested whether these genetic alterations disrupt the lognormality of spine head sizes. Furthermore, we distinguished between spines containing the actin-modulating protein synaptopodin (SP-positive), which is present in large, strong, and stable spines, and those lacking it (SP-negative). Our analysis revealed that neither TNF deficiency nor the absence of TNF-R1, TNF-R2, or both TNF-R1 and TNF-R2 (TNF-R1/R2) degrades the general lognormal-like, skewed distribution of spine head sizes (all spines, SP-positive spines, SP-negative spines). However, TNF, TNF-R1, and TNF-R2 deficiency affected the width of the lognormal distribution, and TNF-R1/2 deficiency shifted the distribution to the left. Our findings demonstrate the robustness of the lognormal-like, skewed distribution, which is maintained even in the face of genetic manipulations that alter spine head sizes. Our observations are in line with homeostatic mechanisms by which neurons regulate the distribution of spines and their head sizes.
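The lognormal-like skewness described here has a simple diagnostic that can be sketched in a few lines (the distribution parameters below are illustrative assumptions, not values from the study): simulated spine head sizes drawn from a lognormal distribution are strongly right-skewed in the raw domain, while their logarithms are approximately symmetric.

```python
import math
import random

def skewness(xs):
    """Sample skewness: third standardized central moment."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    return sum(((x - mean) / sd) ** 3 for x in xs) / n

random.seed(0)
# Hypothetical spine head sizes; mu and sigma are illustrative, not fitted values.
sizes = [random.lognormvariate(-1.0, 0.5) for _ in range(10_000)]

raw_skew = skewness(sizes)                          # clearly positive (right-skewed)
log_skew = skewness([math.log(s) for s in sizes])   # near zero if lognormal
```

Lognormality is often assessed exactly this way in spine-size studies: if the log-transformed sizes pass a symmetry or normality check, the raw distribution is consistent with a lognormal.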
MESH Headings
- Animals
- Dendritic Spines/metabolism
- Mice
- Receptors, Tumor Necrosis Factor, Type I/deficiency
- Receptors, Tumor Necrosis Factor, Type I/metabolism
- Receptors, Tumor Necrosis Factor, Type I/genetics
- Mice, Knockout
- Dentate Gyrus/metabolism
- Dentate Gyrus/cytology
- Tumor Necrosis Factor-alpha/metabolism
- Mice, Inbred C57BL
- Receptors, Tumor Necrosis Factor, Type II/deficiency
- Receptors, Tumor Necrosis Factor, Type II/metabolism
- Receptors, Tumor Necrosis Factor, Type II/genetics
- Neurons/metabolism
- Male
- Microfilament Proteins/metabolism
- Microfilament Proteins/genetics
- Microfilament Proteins/deficiency
Affiliation(s)
- Nina Rößler: Institute of Clinical Neuroanatomy, Dr. Senckenberg Anatomy, Neuroscience Center, Goethe University Frankfurt, Frankfurt, Germany; ICAR3R - Interdisciplinary Centre for 3Rs in Animal Research, Computer-Based Modelling, Faculty of Medicine, Justus-Liebig-University, Giessen, Germany
- Dinko Smilovic: Institute of Clinical Neuroanatomy, Dr. Senckenberg Anatomy, Neuroscience Center, Goethe University Frankfurt, Frankfurt, Germany; Croatian Institute for Brain Research, School of Medicine, University of Zagreb, Zagreb, Croatia
- Mario Vuksic: Institute of Clinical Neuroanatomy, Dr. Senckenberg Anatomy, Neuroscience Center, Goethe University Frankfurt, Frankfurt, Germany; Croatian Institute for Brain Research, School of Medicine, University of Zagreb, Zagreb, Croatia
- Peter Jedlicka: Institute of Clinical Neuroanatomy, Dr. Senckenberg Anatomy, Neuroscience Center, Goethe University Frankfurt, Frankfurt, Germany; ICAR3R - Interdisciplinary Centre for 3Rs in Animal Research, Computer-Based Modelling, Faculty of Medicine, Justus-Liebig-University, Giessen, Germany
- Thomas Deller: Institute of Clinical Neuroanatomy, Dr. Senckenberg Anatomy, Neuroscience Center, Goethe University Frankfurt, Frankfurt, Germany
2. Chini M, Hnida M, Kostka JK, Chen YN, Hanganu-Opatz IL. Preconfigured architecture of the developing mouse brain. Cell Rep 2024; 43:114267. PMID: 38795344. DOI: 10.1016/j.celrep.2024.114267.
Abstract
In the adult brain, structural and functional parameters, such as synaptic sizes and neuronal firing rates, follow right-skewed and heavy-tailed distributions. While this organization is thought to have significant implications, its development is still largely unknown. Here, we address this knowledge gap by investigating a large-scale dataset recorded from the prefrontal cortex and the olfactory bulb of mice aged 4-60 postnatal days. We show that firing rates and spike train interactions have a largely stable distribution shape throughout the first 60 postnatal days and that the prefrontal cortex displays a functional small-world architecture. Moreover, early brain activity exhibits an oligarchical organization, where high-firing neurons have hub-like properties. In a neural network model, we show that analogously right-skewed and heavy-tailed synaptic parameters are instrumental to consistently recapitulate the experimental data. Thus, functional and structural parameters in the developing brain are already extremely distributed, suggesting that this organization is preconfigured and not experience dependent.
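The right-skewed, heavy-tailed organization described here has a signature that a short numerical sketch can illustrate (the distribution and its parameters are assumptions for illustration, not fits to the dataset): the mean firing rate far exceeds the median, and a small set of high-rate, hub-like units carries a disproportionate share of the total activity.

```python
import random

random.seed(1)
# Illustrative lognormal firing rates (Hz); parameters are assumed, not from the paper.
rates = sorted(random.lognormvariate(0.0, 1.0) for _ in range(5000))

mean_rate = sum(rates) / len(rates)
median_rate = rates[len(rates) // 2]

# Share of total activity carried by the top 5% of neurons (the "oligarchy").
top_5pct = rates[int(0.95 * len(rates)):]
hub_share = sum(top_5pct) / sum(rates)
```

For a lognormal with unit log-variance, the top 5% of units carry roughly a quarter of all spikes, which is the kind of oligarchical structure the abstract refers to.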
Affiliation(s)
- Mattia Chini, Marilena Hnida, Johanna K Kostka, Yu-Nan Chen, Ileana L Hanganu-Opatz: Institute of Developmental Neurophysiology, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
3. Albesa-González A, Clopath C. Learning with filopodia and spines: Complementary strong and weak competition lead to specialized, graded, and protected receptive fields. PLoS Comput Biol 2024; 20:e1012110. PMID: 38743789. PMCID: PMC11125506. DOI: 10.1371/journal.pcbi.1012110.
Abstract
Filopodia are thin synaptic protrusions that have long been known to play an important role in early development. Recently, they have been found to be more abundant in the adult cortex than previously thought, and more plastic than spines (button-shaped mature synapses). Inspired by these findings, we introduce a new model of synaptic plasticity that jointly describes learning in filopodia and spines. The model assumes that filopodia exhibit strongly competitive learning dynamics, similar to additive spike-timing-dependent plasticity (STDP). At the same time, it proposes that filopodia consolidate into spines if they undergo sufficient potentiation. Spines follow weakly competitive learning, classically associated with multiplicative, soft-bounded models of STDP. This makes spines more stable and more sensitive to the fine structure of input correlations. We show that our learning rule achieves selectivity comparable to additive STDP and captures input correlations as well as multiplicative models of STDP do. We also show how it can protect previously formed memories and perform synaptic consolidation. Overall, our results can be seen as a phenomenological description of how filopodia and spines could cooperate to overcome the individual difficulties faced by strong and weak competition mechanisms.
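A minimal sketch of the two learning regimes the model combines may help (all constants and the consolidation threshold are illustrative assumptions, not the paper's fitted rule): additive updates are weight-independent and hard-bounded, giving strong competition; multiplicative updates scale with the current weight and are soft-bounded, giving weak competition; and a filopodium that potentiates past a threshold consolidates into a spine.

```python
W_MAX = 1.0
THRESHOLD = 0.5  # assumed consolidation point (filopodium -> spine)

def additive(w, potentiate, a_plus=0.01, a_minus=0.012):
    """Additive STDP: step size independent of w; hard bounds -> strong competition."""
    w += a_plus if potentiate else -a_minus
    return min(W_MAX, max(0.0, w))

def multiplicative(w, potentiate, a_plus=0.01, a_minus=0.012):
    """Soft-bounded STDP: potentiation scales with headroom, depression with w."""
    return w + (a_plus * (W_MAX - w) if potentiate else -a_minus * w)

def step(w, is_spine, potentiate):
    """Filopodia follow the additive rule; consolidated spines the multiplicative one."""
    w = multiplicative(w, potentiate) if is_spine else additive(w, potentiate)
    return w, is_spine or w >= THRESHOLD

w, is_spine = 0.1, False
for _ in range(200):  # sustained potentiation drives consolidation
    w, is_spine = step(w, is_spine, potentiate=True)
```

Under sustained potentiation the synapse first climbs linearly as a filopodium, crosses the assumed threshold, and then approaches the bound only asymptotically as a spine, which is the stabilizing effect of the soft bound.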
Affiliation(s)
- Claudia Clopath: Department of Bioengineering, Imperial College London, London, United Kingdom
4. Zhou H, Bi GQ, Liu G. Intracellular magnesium optimizes transmission efficiency and plasticity of hippocampal synapses by reconfiguring their connectivity. Nat Commun 2024; 15:3406. PMID: 38649706. PMCID: PMC11035601. DOI: 10.1038/s41467-024-47571-3.
Abstract
Synapses at dendritic branches exhibit specific properties for information processing. However, how synapses are orchestrated to dynamically modify their properties, and thereby optimize information processing, remains elusive. Here, we observed diverse configurations of synaptic connectivity at hippocampal dendritic branches, the two extremes of which are characterized either by low transmission efficiency with high plasticity and coding capacity, or by the inverse. The former favors information encoding, pertinent to learning, while the latter favors information storage, relevant to memory. Presynaptic intracellular Mg2+ crucially mediates the continuous, dynamic transition between these two extreme configurations. Consequently, varying intracellular Mg2+ levels endow individual branches with diverse synaptic computations, modulating their ability to process information. Notably, elevating brain Mg2+ levels in aging animals restores a synaptic configuration resembling that of young animals, coincident with improved learning and memory. These findings establish intracellular Mg2+ as a crucial factor that reconfigures synaptic connectivity at dendrites, optimizing their branch-specific properties for information processing.
Affiliation(s)
- Hang Zhou: Faculty of Life and Health Sciences, Shenzhen University of Advanced Technology, Shenzhen 518107, China; Interdisciplinary Center for Brain Information, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China
- Guo-Qiang Bi: Faculty of Life and Health Sciences, Shenzhen University of Advanced Technology, Shenzhen 518107, China; Interdisciplinary Center for Brain Information, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China; Shenzhen-Hong Kong Institute of Brain Science, Shenzhen 518055, China; Hefei National Laboratory for Physical Sciences at the Microscale, and School of Life Sciences, University of Science and Technology of China, Hefei 230031, China
- Guosong Liu: School of Medicine, Tsinghua University, Beijing 100084, China; NeuroCentria Inc., Walnut Creek, CA 94596, USA
5. Li C, Qiu J, Huang H. Meta predictive learning model of languages in neural circuits. Phys Rev E 2024; 109:044309. PMID: 38755909. DOI: 10.1103/PhysRevE.109.044309.
Abstract
Large language models based on self-attention mechanisms have achieved astonishing performance, not only on natural language itself but also on a variety of tasks of a different nature. However, the human brain may not process language by the same principle, and a debate has arisen about the connection between brain computation and the artificial self-supervision adopted in large language models. One of the most influential hypotheses in brain computation is the predictive coding framework, which proposes to minimize prediction error by local learning. However, the role of predictive coding, and of the associated credit assignment, in language processing remains unknown. Here, we propose a mean-field learning model within the predictive coding framework, assuming that the synaptic weight of each connection follows a spike-and-slab distribution and that only the distribution, rather than specific weights, is trained. This meta predictive learning is successfully validated on classifying handwritten digits, where pixels are input to the network in sequence, and on toy and real language corpora. Our model reveals that most connections become deterministic after learning, while the output connections retain a higher level of variability. The performance of the resulting network ensemble changes continuously with data load and improves further with more training data, in analogy with the emergent behavior of large language models. Our model therefore provides a starting point for investigating the connections among brain computation, next-token prediction, and general intelligence.
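The spike-and-slab weight distribution at the heart of this scheme is easy to sketch (the mixing probability and slab parameters below are illustrative assumptions): each connection is exactly zero with some probability (the spike) and Gaussian otherwise (the slab), so training the distribution means training these parameters per connection rather than a point weight.

```python
import random

random.seed(2)

def sample_weight(p_spike, mu, sigma):
    """Spike-and-slab draw: zero with probability p_spike, else N(mu, sigma^2)."""
    if random.random() < p_spike:
        return 0.0
    return random.gauss(mu, sigma)

# An illustrative connection that is still variable; a connection that has become
# "deterministic" after learning would have p_spike near 0 or 1 and sigma near 0.
weights = [sample_weight(0.3, 0.0, 0.1) for _ in range(10_000)]
zero_fraction = sum(w == 0.0 for w in weights) / len(weights)
```

The empirical fraction of exact zeros recovers the spike probability, which is the quantity the meta-learning rule adapts alongside the slab mean and variance.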
Affiliation(s)
- Chan Li: PMI Laboratory, School of Physics, Sun Yat-sen University, Guangzhou 510275, People's Republic of China; Department of Physics, University of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093, USA
- Junbin Qiu: PMI Laboratory, School of Physics, Sun Yat-sen University, Guangzhou 510275, People's Republic of China
- Haiping Huang: PMI Laboratory, School of Physics, Sun Yat-sen University, Guangzhou 510275, People's Republic of China; Guangdong Provincial Key Laboratory of Magnetoelectric Physics and Devices, Sun Yat-sen University, Guangzhou 510275, People's Republic of China
6. Shao Y, Ostojic S. Relating local connectivity and global dynamics in recurrent excitatory-inhibitory networks. PLoS Comput Biol 2023; 19:e1010855. PMID: 36689488. PMCID: PMC9894562. DOI: 10.1371/journal.pcbi.1010855.
Abstract
How the connectivity of cortical networks determines the neural dynamics and the resulting computations is one of the key questions in neuroscience. Previous works have pursued two complementary approaches to quantify the structure in connectivity. One approach starts from the perspective of biological experiments where only the local statistics of connectivity motifs between small groups of neurons are accessible. Another approach is based instead on the perspective of artificial neural networks where the global connectivity matrix is known, and in particular its low-rank structure can be used to determine the resulting low-dimensional dynamics. A direct relationship between these two approaches is however currently missing. Specifically, it remains to be clarified how local connectivity statistics and the global low-rank connectivity structure are inter-related and shape the low-dimensional activity. To bridge this gap, here we develop a method for mapping local connectivity statistics onto an approximate global low-rank structure. Our method rests on approximating the global connectivity matrix using dominant eigenvectors, which we compute using perturbation theory for random matrices. We demonstrate that multi-population networks defined from local connectivity statistics for which the central limit theorem holds can be approximated by low-rank connectivity with Gaussian-mixture statistics. We specifically apply this method to excitatory-inhibitory networks with reciprocal motifs, and show that it yields reliable predictions for both the low-dimensional dynamics, and statistics of population activity. Importantly, it analytically accounts for the activity heterogeneity of individual neurons in specific realizations of local connectivity. Altogether, our approach allows us to disentangle the effects of mean connectivity and reciprocal motifs on the global recurrent feedback, and provides an intuitive picture of how local connectivity shapes global network dynamics.
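The decomposition underlying this line of work (a dominant low-rank part plus a random bulk) can be illustrated numerically; the network size, gain, and rank-one structure below are assumptions for illustration. The eigenspectrum of rank-one structure plus i.i.d. noise shows a circular bulk whose radius is set by the noise gain, and an isolated outlier set by the low-rank overlap.

```python
import numpy as np

rng = np.random.default_rng(0)
N, g = 400, 0.5                       # network size and random gain (illustrative)

m = rng.normal(0.0, 1.0, N)
J_lowrank = np.outer(m, m) / N        # rank-one structure; outlier ~ |m|^2 / N ~ 1
J_random = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # bulk of radius ~ g

eigs = np.linalg.eigvals(J_lowrank + J_random)
moduli = np.sort(np.abs(eigs))
outlier, bulk_edge = moduli[-1], moduli[-2]
```

The isolated outlier is what carries the low-dimensional dynamics; the perturbative mapping described in the abstract predicts such outliers directly from local connectivity statistics.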
Affiliation(s)
- Yuxiu Shao: Laboratoire de Neurosciences Cognitives et Computationnelles, INSERM U960, École Normale Supérieure - PSL Research University, Paris, France
- Srdjan Ostojic: Laboratoire de Neurosciences Cognitives et Computationnelles, INSERM U960, École Normale Supérieure - PSL Research University, Paris, France
7. Fosque LJ, Alipour A, Zare M, Williams-García RV, Beggs JM, Ortiz G. Quasicriticality explains variability of human neural dynamics across life span. Front Comput Neurosci 2022; 16:1037550. PMID: 36532868. PMCID: PMC9747757. DOI: 10.3389/fncom.2022.1037550.
Abstract
Aging impacts the brain's structural and functional organization and over time leads to various disorders, such as Alzheimer's disease and cognitive impairment. The process also impacts sensory function, bringing about a general slowing of various perceptual and cognitive functions. Here, we analyze the Cambridge Centre for Ageing and Neuroscience (Cam-CAN) resting-state magnetoencephalography (MEG) dataset (the largest aging cohort available) in light of the quasicriticality framework, a novel organizing principle for brain functionality that relates the information-processing and scaling properties of brain activity to brain connectivity and stimulus. Examining the data through this framework reveals interesting correlations with the age and gender of test subjects. Using simulated data as verification, our results suggest a link between aging-related changes in brain connectivity and increased dynamical fluctuations of neuronal firing rates. Our findings suggest a platform for developing biomarkers of neurological health.
Affiliation(s)
- Leandro J. Fosque: Department of Physics, Indiana University, Bloomington, IN, United States
- Abolfazl Alipour: Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, United States
- John M. Beggs: Department of Physics, Indiana University, Bloomington, IN, United States
- Gerardo Ortiz: Department of Physics, Indiana University, Bloomington, IN, United States
8. A sequential two-step priming scheme reproduces diversity in synaptic strength and short-term plasticity. Proc Natl Acad Sci U S A 2022; 119:e2207987119. PMID: 35969787. PMCID: PMC9407230. DOI: 10.1073/pnas.2207987119.
Abstract
Central nervous system synapses are diverse in strength and plasticity. Short-term plasticity has traditionally been evaluated with models postulating a single pool of functionally homogeneous fusion-competent synaptic vesicles. Many observations are not easily explainable by such simple models. We established and experimentally validated a scheme of synaptic vesicle priming consisting of two sequential and reversible steps of release-machinery assembly. This sequential two-step priming scheme faithfully reproduced plasticity at a glutamatergic model synapse. The proposed priming and fusion scheme was consistent with the measured mean responses and with the experimentally observed heterogeneity between synapses. Vesicle fusion probability was found to be relatively uniform among synapses, while the resting priming equilibrium between mature and immature vesicle states differed greatly.

Glutamatergic synapses display variable strength and diverse short-term plasticity (STP), even for a given type of connection. Using nonnegative tensor factorization and conventional state modeling, we demonstrate that a kinetic scheme consisting of two sequential and reversible steps of release-machinery assembly and a final step of synaptic vesicle (SV) fusion reproduces STP and its diversity among synapses. Analyzing transmission at calyx of Held synapses reveals that differences in synaptic strength and STP are not primarily caused by variable fusion probability (pfusion) but are determined by the fraction of docked synaptic vesicles equipped with a mature release machinery. Our simulations show that traditional quantal analysis methods do not necessarily report the pfusion of SVs with a mature release machinery but reflect both pfusion and the resting distribution between mature and immature priming states. Thus, the approach holds promise for a better mechanistic dissection of the roles of presynaptic proteins in the sequence of SV docking, two-step priming, and fusion, and it suggests a mechanism for activity-induced redistribution of synaptic efficacy.
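A toy discrete-time version of such a sequential scheme can make the idea concrete (all rate constants are illustrative, not the fitted values): vesicles move reversibly from an empty/undocked pool to a loosely docked, immature-primed state and then to a maturely primed state, and at rest the equilibrium between the last two states, rather than fusion probability, determines how strong the synapse appears.

```python
def simulate(steps, k1=0.2, b1=0.05, k2=0.1, b2=0.05, p_fusion=0.0):
    """Occupancy of a 3-state priming chain:
    empty <-> immature-primed <-> mature-primed (-> fused, recycled to empty)."""
    empty, immature, mature = 1.0, 0.0, 0.0
    for _ in range(steps):
        f1 = k1 * empty - b1 * immature    # net flux: empty -> immature
        f2 = k2 * immature - b2 * mature   # net flux: immature -> mature
        released = p_fusion * mature       # fused vesicles recycle to the empty pool
        empty += -f1 + released
        immature += f1 - f2
        mature += f2 - released
    return empty, immature, mature

# At rest (no fusion), the equilibrium ratios are set by the rate constants:
# immature/empty -> k1/b1 and mature/immature -> k2/b2.
empty, immature, mature = simulate(5000)
```

Changing only the k2/b2 balance shifts the resting mature fraction, and hence the apparent synaptic strength, without touching fusion probability, which is the paper's central point.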
9. Herbert E, Ostojic S. The impact of sparsity in low-rank recurrent neural networks. PLoS Comput Biol 2022; 18:e1010426. PMID: 35944030. PMCID: PMC9390915. DOI: 10.1371/journal.pcbi.1010426.
Abstract
Neural population dynamics are often highly coordinated, allowing task-related computations to be understood as neural trajectories through low-dimensional subspaces. How the network connectivity and input structure give rise to such activity can be investigated with the aid of low-rank recurrent neural networks, a recently-developed class of computational models which offer a rich theoretical framework linking the underlying connectivity structure to emergent low-dimensional dynamics. This framework has so far relied on the assumption of all-to-all connectivity, yet cortical networks are known to be highly sparse. Here we investigate the dynamics of low-rank recurrent networks in which the connections are randomly sparsified, which makes the network connectivity formally full-rank. We first analyse the impact of sparsity on the eigenvalue spectrum of low-rank connectivity matrices, and use this to examine the implications for the dynamics. We find that in the presence of sparsity, the eigenspectra in the complex plane consist of a continuous bulk and isolated outliers, a form analogous to the eigenspectra of connectivity matrices composed of a low-rank and a full-rank random component. This analogy allows us to characterise distinct dynamical regimes of the sparsified low-rank network as a function of key network parameters. Altogether, we find that the low-dimensional dynamics induced by low-rank connectivity structure are preserved even at high levels of sparsity, and can therefore support rich and robust computations even in networks sparsified to a biologically-realistic extent.

In large networks of neurons, the activity displayed by the population depends on the strength of the connections between each neuron. In cortical regions engaged in cognitive tasks, this population activity is often seen to be highly coordinated and low-dimensional. A recent line of theoretical work explores how such coordinated activity can arise in a network of neurons in which the matrix defining the connections is constrained to be mathematically low-rank. Until now, this connectivity structure has only been explored in fully-connected networks, in which every neuron is connected to every other. However, in the brain, network connections are often highly sparse, in the sense that most neurons do not share direct connections. Here, we test the robustness of the theoretical framework of low-rank networks to the reality of sparsity present in biological networks. By mathematically analysing the impact of removing connections, we find that the low-dimensional dynamics previously found in dense low-rank networks can in fact persist even at very high levels of sparsity. This has promising implications for the proposal that complex cortical computations which appear to rely on low-dimensional dynamics may be underpinned by a network which has a fundamentally low-rank structure, albeit with only a small fraction of possible connections present.
Affiliation(s)
- Elizabeth Herbert: Laboratoire de Neurosciences Cognitives et Computationnelles, Département d'Études Cognitives, INSERM U960, École Normale Supérieure - PSL University, Paris, France
- Srdjan Ostojic: Laboratoire de Neurosciences Cognitives et Computationnelles, Département d'Études Cognitives, INSERM U960, École Normale Supérieure - PSL University, Paris, France
10. Randomly fluctuating neural connections may implement a consolidation mechanism that explains classic memory laws. Sci Rep 2022; 12:13423. PMID: 35927567. PMCID: PMC9352731. DOI: 10.1038/s41598-022-17639-5.
Abstract
How can we reconcile the massive fluctuations in neural connections with a stable long-term memory? Two-photon microscopy studies have revealed that large portions of neural connections (spines, synapses) are unexpectedly active, changing unpredictably over time. This appears to invalidate the main assumption underlying the majority of memory models in cognitive neuroscience, which rely on stable connections that retain information over time. Here, we show that such random fluctuations may in fact implement a type of memory consolidation mechanism with a stable very long-term memory that offers novel explanations for several classic memory 'laws', namely Jost's Law (1897: superiority of spaced learning) and Ribot's Law (1881: loss of recent memories in retrograde amnesia), for which a common neural basis has been postulated but not established, as well as other general 'laws' of learning and forgetting. We show how these phenomena emerge naturally from massively fluctuating neural connections.
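One way to see how consolidation across multiple timescales can yield Jost's Law is a simple multi-trace decay sketch (the time constants are illustrative assumptions, not the paper's mechanism in detail): a memory held as a sum of exponentially decaying traces loses strength at a relative rate that falls with age, so of two memories the older one decays more slowly.

```python
import math

# Illustrative consolidation cascade: fast, medium, and slow stores.
TAUS = [1.0, 10.0, 100.0]

def trace(age):
    """Total strength of a memory of the given age (equal initial weight per store)."""
    return sum(math.exp(-age / tau) for tau in TAUS)

def relative_decay(age):
    """Fraction of the current strength lost over the next unit of time."""
    return (trace(age) - trace(age + 1)) / trace(age)

young, old = relative_decay(1), relative_decay(50)
```

An old memory survives mostly in the slow store, so its proportional loss per unit time is much smaller than that of a recent memory; this age-dependent decay rate is exactly the regularity Jost described.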
11. Organization and Priming of Long-term Memory Representations with Two-phase Plasticity. Cognit Comput 2022. DOI: 10.1007/s12559-022-10021-7.
Abstract
Background/Introduction
In recurrent neural networks in the brain, memories are represented by so-called Hebbian cell assemblies. Such assemblies are groups of neurons with particularly strong synaptic connections formed by synaptic plasticity and consolidated by synaptic tagging and capture (STC). To link these synaptic mechanisms to long-term memory on the level of cognition and behavior, their functional implications on the level of neural networks have to be understood.
Methods
We employ a biologically detailed recurrent network of spiking neurons featuring synaptic plasticity and STC to model the learning and consolidation of long-term memory representations. Using this, we investigate the effects of different organizational paradigms, and of priming stimulation, on the functionality of multiple memory representations. We quantify these effects by the spontaneous activation of memory representations driven by background noise.
Results
We find that the learning order of the memory representations significantly biases the likelihood of activation towards more recently learned representations, and that hub-like overlap structure counters this effect. We identify long-term depression as the mechanism underlying these findings. Finally, we demonstrate that STC has functional consequences for the interaction of long-term memory representations: 1. intermediate consolidation in between learning the individual representations strongly alters the previously described effects, and 2. STC enables the priming of a long-term memory representation on a timescale of minutes to hours.
Conclusion
Our findings show how synaptic and neuronal mechanisms can provide an explanatory basis for known cognitive effects.
12. Diversified physiological sensory input connectivity questions the existence of distinct classes of spinal interneurons. iScience 2022; 25:104083. PMID: 35372805. PMCID: PMC8971951. DOI: 10.1016/j.isci.2022.104083.
Abstract
The spinal cord is engaged in all forms of motor performance but its functions are far from understood. Because network connectivity defines function, we explored the connectivity of muscular, tendon, and tactile sensory inputs among a wide population of spinal interneurons in the lower cervical segments. Using low noise intracellular whole cell recordings in the decerebrated, non-anesthetized cat in vivo, we could define mono-, di-, and trisynaptic inputs as well as the weights of each input. Whereas each neuron had a highly specific input, and each indirect input could moreover be explained by inputs in other recorded neurons, we unexpectedly also found the input connectivity of the spinal interneuron population to form a continuum. Our data hence contrasts with the currently widespread notion of distinct classes of interneurons. We argue that this suggested diversified physiological connectivity, which likely requires a major component of circuitry learning, implies a more flexible functionality.
13. He X, Ewing AG. Concentration of stimulant regulates initial exocytotic molecular plasticity at single cells. Chem Sci 2022; 13:1815-1822. PMID: 35282618. PMCID: PMC8826951. DOI: 10.1039/d1sc05278k.
Abstract
Activity-induced synaptic plasticity has been intensively studied but is not yet well understood. We examined the temporal and concentration effects of exocytotic molecular plasticity during and immediately after chemical stimulation (30 s K+ stimulation) via single-cell amperometry, comparing the first and second 15 s event periods of individual event traces. Remarkably, we found that the amount of catecholamine released, and the release dynamics, depend on the stimulant concentration. No changes were observed at 10 mM K+ stimulation, but the changes observed at 30 and 50 mM (potentiation: an increased number of molecules) were opposite to those at 100 mM (depression: a decreased number of events), revealing changes in exocytotic plasticity based on the concentration of the stimulant solution. These results show that the molecular changes initiating exocytotic plasticity can be regulated by the strength of the stimulant concentration. These differing effects on early plasticity offer a possible link between stimulation intensity and synaptic (or adrenal) plasticity. In summary, amperometric measurement of exocytosis (SCA) and vesicle content (IVIEC) over 15 s intervals reveals plasticity (none, potentiation, or depression) that is regulated by the concentration of the stimulant solution (30 s stimulation with 10, 30, 50, or 100 mM K+).
Affiliation(s)
- Xiulan He
- Department of Chemistry and Molecular Biology, University of Gothenburg, 412 96 Gothenburg, Sweden
- Andrew G Ewing
- Department of Chemistry and Molecular Biology, University of Gothenburg, 412 96 Gothenburg, Sweden
|
14
|
Ho S, Lajaunie R, Lerat M, Le M, Crépel V, Loulier K, Livet J, Kessler JP, Marcaggi P. A stable proportion of Purkinje cell inputs from parallel fibers are silent during cerebellar maturation. Proc Natl Acad Sci U S A 2021; 118:e2024890118. [PMID: 34740966 PMCID: PMC8609448 DOI: 10.1073/pnas.2024890118] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 09/23/2021] [Indexed: 11/18/2022] Open
Abstract
Cerebellar Purkinje neurons integrate information transmitted at excitatory synapses formed by granule cells. Although these synapses are considered essential sites for learning, most of them appear not to transmit any detectable electrical information and have been defined as silent. It has been proposed that silent synapses are required to maximize information storage capacity and ensure its reliability, and hence to optimize cerebellar operation. Such optimization is expected to occur once the cerebellar circuitry is in place, during its maturation and the natural and steady improvement of animal agility. We therefore investigated whether the proportion of silent synapses varies over this period, from the third to the sixth postnatal week in mice. Selective expression of a calcium indicator in granule cells enabled quantitative mapping of presynaptic activity, while postsynaptic responses were recorded by patch clamp in acute slices. Through this approach and the assessment of two anatomical features (the distance that separates adjacent planar Purkinje dendritic trees and the synapse density), we determined the average excitatory postsynaptic potential per synapse. Its value was four to eight times smaller than responses from paired-recorded detectable connections, consistent with over 70% of synapses being silent. These figures remained remarkably stable across maturation stages. According to the proposed role for silent synapses, our results suggest that information storage capacity and reliability are optimized early during cerebellar maturation. Alternatively, silent synapses may have roles other than adjusting the information storage capacity and reliability.
Affiliation(s)
- Shu Ho
- Aix-Marseille Université, INSERM, INMED, Marseille 13009, France
- Rebecca Lajaunie
- Department of Neuroscience, Physiology and Pharmacology, University College London, London WC1E 6BT, United Kingdom
- Marion Lerat
- Sorbonne Université, INSERM, CNRS, Institut de la Vision, Paris F-75012, France
- Mickaël Le
- Sorbonne Université, INSERM, CNRS, Institut de la Vision, Paris F-75012, France
- Valérie Crépel
- Aix-Marseille Université, INSERM, INMED, Marseille 13009, France
- Karine Loulier
- Sorbonne Université, INSERM, CNRS, Institut de la Vision, Paris F-75012, France
- Jean Livet
- Sorbonne Université, INSERM, CNRS, Institut de la Vision, Paris F-75012, France
- Jean-Pierre Kessler
- Aix-Marseille Université, CNRS, Institut de Biologie du Développement de Marseille, UMR 7288, Marseille 13288, France
- Païkan Marcaggi
- Aix-Marseille Université, INSERM, INMED, Marseille 13009, France
- Department of Neuroscience, Physiology and Pharmacology, University College London, London WC1E 6BT, United Kingdom
- Unité de Neurobiologie des Canaux Ioniques et de la Synapse, UMR 1072, INSERM, Aix-Marseille Université, Marseille 13015, France
|
15
|
Chipman PH, Fung CCA, Pazo Fernandez A, Sawant A, Tedoldi A, Kawai A, Ghimire Gautam S, Kurosawa M, Abe M, Sakimura K, Fukai T, Goda Y. Astrocyte GluN2C NMDA receptors control basal synaptic strengths of hippocampal CA1 pyramidal neurons in the stratum radiatum. eLife 2021; 10:e70818. [PMID: 34693906 PMCID: PMC8594917 DOI: 10.7554/elife.70818] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2021] [Accepted: 10/22/2021] [Indexed: 12/12/2022] Open
Abstract
Experience-dependent plasticity is a key feature of brain synapses for which neuronal N-methyl-D-aspartate receptors (NMDARs) play a major role, from developmental circuit refinement to learning and memory. Astrocytes also express NMDARs, although their exact function has remained controversial. Here, we identify, in mouse hippocampus, a circuit function for GluN2C NMDARs, a subtype highly expressed in astrocytes, in the layer-specific tuning of synaptic strengths in CA1 pyramidal neurons. Interfering with astrocyte NMDAR or GluN2C NMDAR activity reduces the range of the presynaptic strength distribution specifically in the stratum radiatum inputs, without an appreciable change in the mean presynaptic strength. Mathematical modeling shows that narrowing the width of the presynaptic release probability distribution compromises the expression of long-term synaptic plasticity. Our findings suggest a novel feedback signaling system that uses astrocyte GluN2C NMDARs to adjust the basal synaptic weight distribution of Schaffer collateral inputs, which in turn impacts computations performed by the CA1 pyramidal neuron.
Affiliation(s)
- Chi Chung Alan Fung
- Neural Coding and Brain Computing Unit, Okinawa Institute of Science and Technology Graduate University, Onna-son, Japan
- Angelo Tedoldi
- RIKEN Center for Brain Science, Wako-shi, Saitama, Japan
- Atsushi Kawai
- RIKEN Center for Brain Science, Wako-shi, Saitama, Japan
- Manabu Abe
- Department of Animal Model Development, Brain Research Institute, Niigata University, Niigata, Japan
- Kenji Sakimura
- Department of Animal Model Development, Brain Research Institute, Niigata University, Niigata, Japan
- Tomoki Fukai
- Neural Coding and Brain Computing Unit, Okinawa Institute of Science and Technology Graduate University, Onna-son, Japan
- Yukiko Goda
- RIKEN Center for Brain Science, Wako-shi, Saitama, Japan
|
16
|
Computational roles of intrinsic synaptic dynamics. Curr Opin Neurobiol 2021; 70:34-42. [PMID: 34303124 DOI: 10.1016/j.conb.2021.06.002] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/10/2021] [Revised: 05/14/2021] [Accepted: 06/15/2021] [Indexed: 12/26/2022]
Abstract
Conventional theories assume that long-term information storage in the brain is implemented by modifying synaptic efficacy. Recent experimental findings challenge this view by demonstrating that dendritic spine sizes, or their corresponding synaptic weights, are highly volatile even in the absence of neural activity. Here, we review previous computational work on the roles of these intrinsic synaptic dynamics. We first present the possibility for neuronal networks to sustain stable performance in their presence, and we then hypothesize that intrinsic dynamics may be more than mere noise to be withstood: they may actively improve information processing in the brain.
|
17
|
Moments of characteristic polynomials in certain random neural networks. Stat Probab Lett 2021. [DOI: 10.1016/j.spl.2021.109044] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
|
18
|
Sachdeva PS, Livezey JA, Dougherty ME, Gu BM, Berke JD, Bouchard KE. Improved inference in coupling, encoding, and decoding models and its consequence for neuroscientific interpretation. J Neurosci Methods 2021; 358:109195. [PMID: 33905791 DOI: 10.1016/j.jneumeth.2021.109195] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2020] [Revised: 04/08/2021] [Accepted: 04/10/2021] [Indexed: 10/21/2022]
Abstract
BACKGROUND: A central goal of systems neuroscience is to understand the relationships amongst constituent units in neural populations, and their modulation by external factors, using high-dimensional and stochastic neural recordings. Parametric statistical models (e.g., coupling, encoding, and decoding models) play an instrumental role in accomplishing this goal. However, extracting conclusions from a parametric model requires that it be fit using an inference algorithm capable of selecting the correct parameters and properly estimating their values. Traditional approaches to parameter inference have been shown to suffer from failures in both selection and estimation. The recent development of algorithms that ameliorate these deficiencies raises the question of whether past work relying on such inference procedures has produced inaccurate systems neuroscience models, thereby impairing their interpretation. NEW METHOD: We used algorithms based on Union of Intersections (UoI), a statistical inference framework built on stability principles and capable of improved selection and estimation. COMPARISON: We fit functional coupling, encoding, and decoding models across a battery of neural datasets using both UoI and baseline inference procedures (e.g., ℓ1-penalized GLMs), and compared the structure of their fitted parameters. RESULTS: Across recording modality, brain region, and task, we found that UoI inferred models with increased sparsity, improved stability, and qualitatively different parameter distributions, while maintaining predictive performance. We obtained highly sparse functional coupling networks with substantially different community structure, more parsimonious encoding models, and decoding models that relied on fewer single units. CONCLUSIONS: Together, these results demonstrate that improved parameter inference, achieved via UoI, reshapes interpretation in diverse neuroscience contexts.
Affiliation(s)
- Pratik S Sachdeva
- Redwood Center for Theoretical Neuroscience, University of California, Berkeley, 94720, CA, USA; Department of Physics, University of California, Berkeley, 94720, CA, USA; Biological Systems and Engineering Division, Lawrence Berkeley National Laboratory, Berkeley, 94720, CA, USA
- Jesse A Livezey
- Redwood Center for Theoretical Neuroscience, University of California, Berkeley, 94720, CA, USA; Biological Systems and Engineering Division, Lawrence Berkeley National Laboratory, Berkeley, 94720, CA, USA
- Maximilian E Dougherty
- Biological Systems and Engineering Division, Lawrence Berkeley National Laboratory, Berkeley, 94720, CA, USA
- Bon-Mi Gu
- Department of Neurology, University of California, San Francisco, San Francisco, 94143, CA, USA
- Joshua D Berke
- Department of Neurology, University of California, San Francisco, San Francisco, 94143, CA, USA; Department of Psychiatry; Neuroscience Graduate Program; Kavli Institute for Fundamental Neuroscience; Weill Institute for Neurosciences, University of California, San Francisco, San Francisco, 94143, CA, USA
- Kristofer E Bouchard
- Redwood Center for Theoretical Neuroscience, University of California, Berkeley, 94720, CA, USA; Biological Systems and Engineering Division, Lawrence Berkeley National Laboratory, Berkeley, 94720, CA, USA; Computational Resources Division, Lawrence Berkeley National Laboratory, Berkeley, 94720, CA, USA; Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, 94720, CA, USA
|
19
|
Becker MFP, Tetzlaff C. The biophysical basis underlying the maintenance of early phase long-term potentiation. PLoS Comput Biol 2021; 17:e1008813. [PMID: 33750943 PMCID: PMC8016278 DOI: 10.1371/journal.pcbi.1008813] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/13/2020] [Revised: 04/01/2021] [Accepted: 02/17/2021] [Indexed: 11/18/2022] Open
Abstract
The maintenance of synaptic changes resulting from long-term potentiation (LTP) is essential for brain functions such as memory and learning. Different LTP phases have been associated with diverse molecular processes and pathways, and the molecular underpinnings of LTP on short as well as long time scales are well established. However, the principles on the intermediate time scale of 1-6 hours that mediate the early phase of LTP (E-LTP) remain elusive. We hypothesize that the interplay between specific features of postsynaptic receptor trafficking is responsible for sustaining synaptic changes during this LTP phase. We test this hypothesis by formalizing a biophysical model that integrates several experimentally motivated mechanisms. The model captures a wide range of experimental findings and predicts that synaptic changes are preserved for hours when the receptor dynamics are shaped by the interplay of structural changes of the spine in conjunction with increased trafficking from recycling endosomes and the cooperative binding of receptors. Furthermore, our model provides several predictions by which our findings can be verified experimentally.

The cognitive ability of learning is associated with plasticity-induced changes in synaptic transmission efficacy mediated by AMPA receptors. Synaptic changes depend on a multitude of molecular and physiological mechanisms that build complex interaction networks. By formalizing and employing a biophysical model of AMPAR trafficking, we unravel and evaluate the interplay between key mechanisms such as receptor binding, exocytosis, morphological changes, and cooperative receptor binding. Our findings indicate that cooperative receptor binding, in conjunction with morphological changes of the spine and increased trafficking from recycling endosomes, leads to the maintenance of synaptic changes on behaviorally relevant time spans.
Affiliation(s)
- Moritz F. P. Becker
- III. Institute of Physics – Biophysics, Georg-August University, Göttingen, Germany
- Christian Tetzlaff
- III. Institute of Physics – Biophysics, Georg-August University, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
|
20
|
Dabaghian Y. From Topological Analyses to Functional Modeling: The Case of Hippocampus. Front Comput Neurosci 2021; 14:593166. [PMID: 33505262 PMCID: PMC7829363 DOI: 10.3389/fncom.2020.593166] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/09/2020] [Accepted: 12/02/2020] [Indexed: 11/13/2022] Open
Abstract
Topological data analyses are widely used for describing and conceptualizing large volumes of neurobiological data, e.g., for quantifying spiking outputs of large neuronal ensembles and thus understanding the functions of the corresponding networks. Below we discuss an approach in which convergent topological analyses produce insights into how information may be processed in mammalian hippocampus—a brain part that plays a key role in learning and memory. The resulting functional model provides a unifying framework for integrating spiking data at different timescales and following the course of spatial learning at different levels of spatiotemporal granularity. This approach allows accounting for contributions from various physiological phenomena into spatial cognition—the neuronal spiking statistics, the effects of spiking synchronization by different brain waves, the roles played by synaptic efficacies and so forth. In particular, it is possible to demonstrate that networks with plastic and transient synaptic architectures can encode stable cognitive maps, revealing the characteristic timescales of memory processing.
Affiliation(s)
- Yuri Dabaghian
- Department of Neurology, The University of Texas McGovern Medical School, Houston, TX, United States
|
21
|
Chu D, Le Nguyen H. Constraints on Hebbian and STDP learned weights of a spiking neuron. Neural Netw 2021; 135:192-200. [PMID: 33401225 DOI: 10.1016/j.neunet.2020.12.012] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/22/2020] [Revised: 12/09/2020] [Accepted: 12/10/2020] [Indexed: 10/22/2022]
Abstract
We analyse mathematically the constraints on weights resulting from Hebbian and STDP learning rules applied to a spiking neuron with weight normalisation. In the case of pure Hebbian learning, we find that the normalised weights equal the promotion probabilities of weights up to correction terms that depend on the learning rate and are usually small. A similar relation can be derived for STDP algorithms, where the normalised weight values reflect a difference between the promotion and demotion probabilities of the weight. These relations are practically useful in that they allow checking for convergence of Hebbian and STDP algorithms. Another application is novelty detection. We demonstrate this using the MNIST dataset.
Affiliation(s)
- Dominique Chu
- CEMS, School of Computing, University of Kent, CT2 7NF, Canterbury, UK
- Huy Le Nguyen
- CEMS, School of Computing, University of Kent, CT2 7NF, Canterbury, UK
|
22
|
Heterosynaptic cross-talk of pre- and postsynaptic strengths along segments of dendrites. Cell Rep 2021; 34:108693. [PMID: 33503435 DOI: 10.1016/j.celrep.2021.108693] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/05/2020] [Revised: 08/13/2020] [Accepted: 01/05/2021] [Indexed: 11/20/2022] Open
Abstract
Dendrites are crucial for integrating incoming synaptic information. Individual dendritic branches are thought to constitute a signal processing unit, yet how neighboring synapses shape the boundaries of functional dendritic units is not well understood. Here, we address the cellular basis underlying the organization of the strengths of neighboring Schaffer collateral-CA1 synapses by optical quantal analysis and spine size measurements. Inducing potentiation at clusters of spines produces NMDA-receptor-dependent heterosynaptic plasticity. The direction of postsynaptic strength change depends on distance from the stimulated synapses: proximal synapses predominantly depress, whereas distal synapses potentiate; potentiation and depression are regulated by CaMKII and calcineurin, respectively. In contrast, heterosynaptic presynaptic plasticity is confined to a weakening of presynaptic strength at nearby synapses, which requires CaMKII and the retrograde messenger nitric oxide. Our findings highlight the parallel engagement of multiple signaling pathways, each with characteristic spatial dynamics, in shaping the local pattern of synaptic strengths.
|
23
|
Ingrosso A. Optimal learning with excitatory and inhibitory synapses. PLoS Comput Biol 2020; 16:e1008536. [PMID: 33370266 PMCID: PMC7793294 DOI: 10.1371/journal.pcbi.1008536] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/27/2020] [Revised: 01/08/2021] [Accepted: 11/13/2020] [Indexed: 11/22/2022] Open
Abstract
Characterizing the relation between weight structure and input/output statistics is fundamental for understanding the computational capabilities of neural circuits. In this work, I study the problem of storing associations between analog signals in the presence of correlations, using methods from statistical mechanics. I characterize the typical learning performance in terms of the power spectrum of random input and output processes. I show that optimal synaptic weight configurations reach a capacity of 0.5 for any fraction of excitatory to inhibitory weights and have a peculiar synaptic distribution with a finite fraction of silent synapses. I further provide a link between typical learning performance and principal components analysis in single cases. These results may shed light on the synaptic profile of brain circuits, such as cerebellar structures, that are thought to engage in processing time-dependent signals and performing on-line prediction.
Affiliation(s)
- Alessandro Ingrosso
- Zuckerman Mind, Brain, Behavior Institute, Columbia University, New York, New York, United States of America
|
24
|
Scale free topology as an effective feedback system. PLoS Comput Biol 2020; 16:e1007825. [PMID: 32392249 PMCID: PMC7241857 DOI: 10.1371/journal.pcbi.1007825] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/03/2019] [Revised: 05/21/2020] [Accepted: 03/26/2020] [Indexed: 12/13/2022] Open
Abstract
Biological networks are often heterogeneous in their connectivity pattern, with degree distributions featuring a heavy tail of highly connected hubs. The implications of this heterogeneity on dynamical properties are a topic of much interest. Here we show that interpreting topology as a feedback circuit can provide novel insights on dynamics. Based on the observation that in finite networks a small number of hubs have a disproportionate effect on the entire system, we construct an approximation by lumping these nodes into a single effective hub, which acts as a feedback loop with the rest of the nodes. We use this approximation to study dynamics of networks with scale-free degree distributions, focusing on their probability of convergence to fixed points. We find that the approximation preserves convergence statistics over a wide range of settings. Our mapping provides a parametrization of scale free topology which is predictive at the ensemble level and also retains properties of individual realizations. Specifically, outgoing hubs have an organizing role that can drive the network to convergence, in analogy to suppression of chaos by an external drive. In contrast, incoming hubs have no such property, resulting in a marked difference between the behavior of networks with outgoing vs. incoming scale free degree distribution. Combining feedback analysis with mean field theory predicts a transition between convergent and divergent dynamics which is corroborated by numerical simulations. Furthermore, they highlight the effect of a handful of outlying hubs, rather than of the connectivity distribution law as a whole, on network dynamics.

Nature abounds with complex networks of interacting elements—from the proteins in our cells, through neural networks in our brains, to species interacting in ecosystems. In all of these fields, the relation between network structure and dynamics is an important research question. A recurring feature of natural networks is their heterogeneous structure: individual elements exhibit a huge diversity of connectivity patterns, which complicates the understanding of network dynamics. To address this problem, we devised a simplified approximation for complex structured networks which captures their dynamical properties. Separating out the largest “hubs”—a small number of nodes with disproportionately high connectivity—we represent them by a single node linked to the rest of the network. This enables us to borrow concepts from control theory, where a system’s output is linked back to itself forming a feedback loop. In this analogy, hubs in heterogeneous networks implement a feedback circuit with the rest of the network. The analogy reveals how these hubs can coordinate the network and drive it more easily towards stable states. Our approach enables analyzing dynamical properties of heterogeneous networks, which is difficult to achieve with existing techniques. It is potentially applicable to many fields where heterogeneous networks are important.
|
25
|
Ray S, Aldworth ZN, Stopfer MA. Feedback inhibition and its control in an insect olfactory circuit. eLife 2020; 9:e53281. [PMID: 32163034 PMCID: PMC7145415 DOI: 10.7554/elife.53281] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/03/2019] [Accepted: 03/09/2020] [Indexed: 01/20/2023] Open
Abstract
Inhibitory neurons play critical roles in regulating and shaping olfactory responses in vertebrates and invertebrates. In insects, these roles are performed by relatively few neurons, which can be interrogated efficiently, revealing fundamental principles of olfactory coding. Here, with electrophysiological recordings from the locust and a large-scale biophysical model, we analyzed the properties and functions of GGN, a unique giant GABAergic neuron that plays a central role in structuring olfactory codes in the locust mushroom body. Our simulations suggest that depolarizing GGN at its input branch can globally inhibit Kenyon cells (KCs) several hundred microns away. Our in vivo recordings show that GGN responds to odors with complex temporal patterns of depolarization and hyperpolarization that can vary with odors and across animals, leading our model to predict the existence of a yet-undiscovered olfactory pathway. Our analysis reveals basic new features of GGN and the olfactory network surrounding it.
Affiliation(s)
- Subhasis Ray
- Section on Sensory Coding and Neural Ensembles, NICHD, NIH, Bethesda, United States
- Zane N Aldworth
- Section on Sensory Coding and Neural Ensembles, NICHD, NIH, Bethesda, United States
- Mark A Stopfer
- Section on Sensory Coding and Neural Ensembles, NICHD, NIH, Bethesda, United States
|
26
|
Activity Dependent and Independent Determinants of Synaptic Size Diversity. J Neurosci 2020; 40:2828-2848. [PMID: 32127494 DOI: 10.1523/jneurosci.2181-19.2020] [Citation(s) in RCA: 26] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2019] [Revised: 02/04/2020] [Accepted: 02/13/2020] [Indexed: 11/21/2022] Open
Abstract
The extraordinary diversity of excitatory synapse sizes is commonly attributed to activity-dependent processes that drive synaptic growth and diminution. Recent studies also point to activity-independent size fluctuations, possibly driven by innate synaptic molecule dynamics, as important generators of size diversity. To examine the contributions of activity-dependent and independent processes to excitatory synapse size diversity, we studied glutamatergic synapse size dynamics and diversification in cultured rat cortical neurons (both sexes), silenced from plating. We found that in networks with no history of activity whatsoever, synaptic size diversity was no less extensive than that observed in spontaneously active networks. Synapses in silenced networks were larger, size distributions were broader, yet these were rightward-skewed and similar in shape when scaled by mean synaptic size. Silencing reduced the magnitude of size fluctuations and weakened constraints on size distributions, yet these were sufficient to explain synaptic size diversity in silenced networks. Model-based exploration followed by experimental testing indicated that silencing-associated changes in innate molecular dynamics and fluctuation characteristics might negatively impact synaptic persistence, resulting in reduced synaptic numbers. This, in turn, would increase synaptic molecule availability, promote synaptic enlargement, and ultimately alter fluctuation characteristics. These findings suggest that activity-independent size fluctuations are sufficient to fully diversify glutamatergic synaptic sizes, with activity-dependent processes primarily setting the scale rather than the shape of size distributions. Moreover, they point to reciprocal relationships between synaptic size fluctuations, size distributions, and synaptic numbers mediated by the innate dynamics of synaptic molecules as they move in, out, and between synapses.

SIGNIFICANCE STATEMENT: Sizes of glutamatergic synapses vary tremendously, even when formed on the same neuron. This diversity is commonly thought to reflect the outcome of activity-dependent forms of synaptic plasticity, yet activity-independent processes might also play some part. Here we show that in neurons with no history of activity whatsoever, synaptic sizes are no less diverse. We show that this diversity is the product of activity-independent size fluctuations, which are sufficient to generate a full repertoire of synaptic sizes at correct proportions. By combining modeling and experimentation we expose reciprocal relationships between size fluctuations, synaptic sizes and synaptic counts, and show how these phenomena might be connected through the dynamics of synaptic molecules as they move in, out, and between synapses.
|
27
|
Qi G, Yang D, Ding C, Feldmeyer D. Unveiling the Synaptic Function and Structure Using Paired Recordings From Synaptically Coupled Neurons. Front Synaptic Neurosci 2020; 12:5. [PMID: 32116641 PMCID: PMC7026682 DOI: 10.3389/fnsyn.2020.00005] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2019] [Accepted: 01/22/2020] [Indexed: 11/24/2022] Open
Abstract
Synaptic transmission between neurons is the basic mechanism for information processing in cortical microcircuits. To date, paired recording from synaptically coupled neurons is the most widely used method that allows a detailed functional characterization of unitary synaptic transmission at the cellular and synaptic level, in combination with a structural characterization of both pre- and postsynaptic neurons at the light and electron microscopic level. In this review, we summarize the many applications of paired recordings to investigate synaptic function and structure. Paired recordings have been used to study the detailed electrophysiological and anatomical properties of synaptically coupled cell pairs within a synaptic microcircuit; this is critical in order to understand the connectivity rules and dynamic properties of synaptic transmission. Paired recordings can also be adopted for quantal analysis of an identified synaptic connection and to study the regulation of synaptic transmission by neuromodulators such as acetylcholine, the monoamines, neuropeptides, and adenosine. Taken together, paired recordings from synaptically coupled neurons will remain a very useful approach for a detailed characterization of synaptic transmission, not only in the rodent brain but also in that of other species, including humans.
Affiliation(s)
- Guanxiao Qi
- Institute of Neuroscience and Medicine, INM-10, Jülich Research Centre, Jülich, Germany
- Danqing Yang
- Institute of Neuroscience and Medicine, INM-10, Jülich Research Centre, Jülich, Germany
- Chao Ding
- Institute of Neuroscience and Medicine, INM-10, Jülich Research Centre, Jülich, Germany
- Dirk Feldmeyer
- Institute of Neuroscience and Medicine, INM-10, Jülich Research Centre, Jülich, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen University Hospital, Aachen, Germany
- Jülich-Aachen Research Alliance, Translational Brain Medicine (JARA Brain), Aachen, Germany
Collapse
|
28
|
Sammons RP, Clopath C, Barnes SJ. Size-Dependent Axonal Bouton Dynamics following Visual Deprivation In Vivo. Cell Rep 2019; 22:576-584. [PMID: 29346758 PMCID: PMC5792425 DOI: 10.1016/j.celrep.2017.12.065] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/09/2017] [Revised: 12/04/2017] [Accepted: 12/20/2017] [Indexed: 11/26/2022] Open
Abstract
Persistent synapses are thought to underpin the storage of sensory experience, yet little is known about their structural plasticity in vivo. We investigated how persistent presynaptic structures respond to the loss of primary sensory input. Using in vivo two-photon (2P) imaging, we measured fluctuations in the size of excitatory axonal boutons in L2/3 of adult mouse visual cortex after monocular enucleation. The average size of boutons did not change after deprivation, but the range of bouton sizes was reduced: large boutons decreased in size, and small boutons increased. Reduced bouton variance was accompanied by a reduced range of correlated calcium-mediated neural activity in L2/3 of awake animals. Network simulations predicted that size-dependent plasticity may promote conditions of greater bidirectional plasticity. These predictions were supported by electrophysiological measures of short- and long-term plasticity. We propose that size-dependent dynamics facilitate cortical reorganization by maximizing the potential for bidirectional plasticity. Highlights: the range of persistent axonal bouton sizes is reduced following visual deprivation; bouton sizes move toward the mean in a size-dependent manner; bouton plasticity is accompanied by a reduced range of correlated network activity; deprived cortex exhibits greater bidirectional functional presynaptic plasticity.
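The size-dependent dynamics this abstract describes can be caricatured in a few lines: if each bouton relaxes toward the population mean at a rate proportional to its deviation, the mean size is preserved while the variance contracts. This is a hypothetical sketch for illustration, not the authors' network model; the relaxation rate `alpha` and the lognormal initial sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical regression-to-the-mean rule (not the authors' model):
# each bouton moves toward the population mean at rate alpha.
sizes = rng.lognormal(mean=0.0, sigma=0.5, size=1000)  # initial bouton sizes (a.u.)
target = sizes.mean()

alpha = 0.1  # assumed relaxation rate per imaging interval
deprived = sizes.copy()
for _ in range(20):
    deprived += alpha * (target - deprived)  # large boutons shrink, small ones grow

# The mean is preserved while the range (variance) of sizes contracts,
# mirroring the reported reduction in bouton size variance after deprivation.
print(round(sizes.std(), 3), round(deprived.std(), 3))
```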
Affiliation(s)
- Rosanna P Sammons
- Department of Neuroscience, Physiology and Pharmacology, University College London, 21 University St., London WC1E 6DE, UK
- Claudia Clopath
- Department of Biomedical Engineering, Imperial College London, South Kensington Campus, London SW7 2AZ, UK
- Samuel J Barnes
- Division of Brain Sciences, Department of Medicine, Imperial College London, Hammersmith Hospital Campus, Du Cane Road, London W12 0NN, UK

29
Letellier M, Levet F, Thoumine O, Goda Y. Differential role of pre- and postsynaptic neurons in the activity-dependent control of synaptic strengths across dendrites. PLoS Biol 2019; 17:e2006223. [PMID: 31166943 PMCID: PMC6576792 DOI: 10.1371/journal.pbio.2006223] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/02/2018] [Revised: 06/17/2019] [Accepted: 05/17/2019] [Indexed: 01/07/2023] Open
Abstract
Neurons receive a large number of active synaptic inputs from their many presynaptic partners across their dendritic tree. However, little is known about how the strengths of individual synapses are controlled in balance with other synapses to effectively encode information while maintaining network homeostasis. This is in part due to the difficulty in assessing the activity of individual synapses with identified afferent and efferent connections for a synapse population in the brain. Here, to gain insights into the basic cellular rules that drive the activity-dependent spatial distribution of pre- and postsynaptic strengths across incoming axons and dendrites, we combine patch-clamp recordings with live-cell imaging of hippocampal pyramidal neurons in dissociated cultures and organotypic slices. Under basal conditions, both pre- and postsynaptic strengths cluster on single dendritic branches according to the identity of the presynaptic neurons, thus highlighting the ability of single dendritic branches to exhibit input specificity. Stimulating a single presynaptic neuron induces input-specific and dendritic branchwise spatial clustering of presynaptic strengths, which accompanies a widespread multiplicative scaling of postsynaptic strengths in dissociated cultures and heterosynaptic plasticity at distant synapses in organotypic slices. Our study provides evidence for a potential homeostatic mechanism by which the rapid changes in global or distant postsynaptic strengths compensate for input-specific presynaptic plasticity.
Affiliation(s)
- Mathieu Letellier
- RIKEN Brain Science Institute, Wako, Saitama, Japan
- Interdisciplinary Institute for Neuroscience, University of Bordeaux, Bordeaux, France
- Interdisciplinary Institute for Neuroscience, Centre National de la Recherche Scientifique (CNRS) UMR 5297, Bordeaux, France
- Florian Levet
- Interdisciplinary Institute for Neuroscience, University of Bordeaux, Bordeaux, France
- Interdisciplinary Institute for Neuroscience, Centre National de la Recherche Scientifique (CNRS) UMR 5297, Bordeaux, France
- Bordeaux Imaging Center, University of Bordeaux, Bordeaux, France
- Bordeaux Imaging Center, CNRS UMS 3420, Bordeaux, France
- Bordeaux Imaging Center, INSERM US04, Bordeaux, France
- Olivier Thoumine
- Interdisciplinary Institute for Neuroscience, University of Bordeaux, Bordeaux, France
- Interdisciplinary Institute for Neuroscience, Centre National de la Recherche Scientifique (CNRS) UMR 5297, Bordeaux, France
- Yukiko Goda
- RIKEN Center for Brain Science, Wako, Saitama, Japan

30
Zhou JF, Yuan WJ, Chen D, Wang BH, Zhou Z, Boccaletti S, Wang Z. Synaptic modifications driven by spike-timing-dependent plasticity in weakly coupled bursting neurons. Phys Rev E 2019; 99:032419. [PMID: 30999534 DOI: 10.1103/physreve.99.032419] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/16/2017] [Indexed: 12/25/2022]
Abstract
In the course of development, sleep, or mental disorders, certain neurons in the brain display spontaneous spike-burst activity. The synaptic plasticity evoked by such activity is studied here in the presence of spike-timing-dependent plasticity (STDP). In two chemically coupled bursting model neurons, the spike-burst activity can translate the STDP related to pre- and postsynaptic spike activity into burst-timing-dependent plasticity (BTDP), based on the timing of bursts of pre- and postsynaptic neurons. The resulting BTDP exhibits exponential decays with the same time scales as those of STDP. In weakly coupled bursting neuron networks, the synaptic modification driven by the spike-burst activity obeys a power-law distribution. The model can also produce a power-law distribution of synaptic weights. The bursting behavior considered here consists of stereotypical groups of spikes, with bursts evenly spaced by long intervals.
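For reference, the exponential decays mentioned in this abstract are those of the standard pair-based STDP window. A minimal sketch follows, with assumed amplitudes `A_PLUS`, `A_MINUS` and time constant `TAU`; in the burst-based (BTDP) picture, the lag is measured between burst onsets rather than individual spikes, preserving the same exponential time scale.

```python
import math

# Minimal pair-based STDP window (parameter values are assumptions):
# potentiation when the presynaptic spike precedes the postsynaptic one,
# depression otherwise, both decaying exponentially with the spike-time lag.
A_PLUS, A_MINUS, TAU = 0.01, 0.012, 20.0  # amplitudes; time constant in ms

def stdp(dt_ms):
    """Weight change for dt = t_post - t_pre (ms)."""
    if dt_ms >= 0:
        return A_PLUS * math.exp(-dt_ms / TAU)   # pre before post: LTP
    return -A_MINUS * math.exp(dt_ms / TAU)      # post before pre: LTD

print(stdp(10.0), stdp(-10.0))
```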
Affiliation(s)
- Jian-Fang Zhou
- College of Physics and Electronic Information, Huaibei Normal University, Huaibei 235000, China
- Wu-Jie Yuan
- College of Physics and Electronic Information, Huaibei Normal University, Huaibei 235000, China
- Debao Chen
- College of Physics and Electronic Information, Huaibei Normal University, Huaibei 235000, China
- Bing-Hong Wang
- Department of Modern Physics, University of Science and Technology of China, Hefei 230026, China
- Zhao Zhou
- College of Physics and Electronic Information, Huaibei Normal University, Huaibei 235000, China
- Stefano Boccaletti
- CNR-Institute of Complex Systems, Via Madonna del Piano, 10, 50019 Sesto Fiorentino, Florence, Italy; Unmanned Systems Research Institute, Northwestern Polytechnical University, Xi'an, 710072 Shanxi, China
- Zhen Wang
- Center for OPTical IMagery Analysis and Learning (OPTIMAL), Northwestern Polytechnical University, Xi'an, 710072 Shanxi, China

31
Dabaghian Y. Through synapses to spatial memory maps via a topological model. Sci Rep 2019; 9:572. [PMID: 30679520 PMCID: PMC6345962 DOI: 10.1038/s41598-018-36807-0] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/29/2018] [Accepted: 11/22/2018] [Indexed: 12/16/2022] Open
Abstract
Various neurophysiological and cognitive functions are based on transferring information between spiking neurons via a complex system of synaptic connections. In particular, the capacity of presynaptic inputs to influence the postsynaptic outputs (the efficacy of the synapses) plays a principal role in all aspects of hippocampal neurophysiology. However, a direct link between the information processed at the level of individual synapses and the animal's ability to form memories at the organismal level has not yet been fully established. Here, we investigate the effect of synaptic transmission probabilities on the ability of hippocampal place cell ensembles to produce a cognitive map of the environment. Using methods from algebraic topology, we find that weakening synaptic connections increases spatial learning times, produces topological defects in the large-scale representation of the ambient space, and restricts the range of parameters for which place cell ensembles are capable of producing a map with correct topological structure. On the other hand, the results indicate a possibility of compensatory phenomena, namely that spatial learning deficiencies may be mitigated through enhancement of neuronal activity.
Affiliation(s)
- Yuri Dabaghian
- Department of Neurology, The University of Texas McGovern Medical School, 6431 Fannin St, Houston, TX, 77030, USA

32
Yousefzadeh A, Stromatias E, Soto M, Serrano-Gotarredona T, Linares-Barranco B. On Practical Issues for Stochastic STDP Hardware With 1-bit Synaptic Weights. Front Neurosci 2018; 12:665. [PMID: 30374283 PMCID: PMC6196279 DOI: 10.3389/fnins.2018.00665] [Citation(s) in RCA: 34] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/05/2017] [Accepted: 09/04/2018] [Indexed: 11/21/2022] Open
Abstract
In computational neuroscience, synaptic plasticity learning rules are typically studied using the full 64-bit floating-point precision computers provide. However, for dedicated hardware implementations, the precision used directly affects not only the required memory resources but also the computing, communication, and energy resources. When it comes to hardware engineering, a key question is always to find the minimum number of bits necessary to keep the neurocomputational system working satisfactorily. Here we present techniques and results obtained when limiting synaptic weights to 1-bit precision, applied to a spike-timing-dependent plasticity (STDP) learning rule in spiking neural networks (SNNs). We first illustrate 1-bit-synapse STDP operation by replicating a classical biological experiment on visual orientation tuning, using a simple four-neuron setup. After this, we apply 1-bit STDP learning to the hidden feature-extraction layer of a two-layer system, where for the second (output) layer we use previously reported SNN classifiers. The systems are tested on two spiking datasets: a Dynamic Vision Sensor (DVS) recorded poker card symbols dataset and a Poisson-distributed spike representation MNIST dataset version. Tests are performed using the in-house MegaSim event-driven behavioral simulator and by implementing the systems on FPGA (Field-Programmable Gate Array) hardware.
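The core idea of stochastic STDP with 1-bit weights can be illustrated in a few lines: instead of accumulating small analog updates, each spike pairing flips the binary weight with some probability. The probabilities and the fraction of causal pairings below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Sketch of stochastic STDP with 1-bit weights (probabilities are assumed):
# the analog update magnitude is replaced by a probability of flipping the bit.
p_up, p_down = 0.1, 0.05            # flip probabilities: potentiation / depression
w = np.zeros(1000, dtype=np.uint8)  # 1-bit synaptic weights

def pairing(w, causal):
    """One stochastic STDP event; `causal` marks pre-before-post pairings."""
    flips = rng.random(w.size)
    w[causal & (flips < p_up)] = 1      # stochastic switch ON
    w[~causal & (flips < p_down)] = 0   # stochastic switch OFF
    return w

for _ in range(200):
    causal = rng.random(w.size) < 0.7   # assumed fraction of causal pairings
    w = pairing(w, causal)

# The ON fraction settles near 0.7*p_up / (0.7*p_up + 0.3*p_down) ~ 0.82,
# so the binary population approximates an analog weight on average.
print(w.mean())
```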
Affiliation(s)
- Amirreza Yousefzadeh
- Instituto de Microelectrónica de Sevilla (IMSE-CNM), CSIC and Universidad de Sevilla, Sevilla, Spain
- Evangelos Stromatias
- Instituto de Microelectrónica de Sevilla (IMSE-CNM), CSIC and Universidad de Sevilla, Sevilla, Spain
- Miguel Soto
- Instituto de Microelectrónica de Sevilla (IMSE-CNM), CSIC and Universidad de Sevilla, Sevilla, Spain
- Bernabé Linares-Barranco
- Instituto de Microelectrónica de Sevilla (IMSE-CNM), CSIC and Universidad de Sevilla, Sevilla, Spain

33
Abstract
Neurons integrate information from many neighbors when they process information. Inputs to a given neuron are thus indistinguishable from one another. Under the assumption that neurons maximize their information storage, indistinguishability is shown to place a strong constraint on the distribution of strengths between neurons. The distribution of individual synapse strengths is found to follow a modified Boltzmann distribution with strength proportional to [Formula: see text]. The model is shown to be consistent with experimental data from Caenorhabditis elegans connectivity and in vivo synaptic strength measurements. The [Formula: see text] dependence helps account for the observation of many zero or weak connections between neurons or sparsity of the neural network.
Affiliation(s)
- Joseph Snider
- Institute for Neural Computation, University of California, San Diego, La Jolla, CA 90039, U.S.A.

34
Hebbian Learning in a Random Network Captures Selectivity Properties of the Prefrontal Cortex. J Neurosci 2017; 37:11021-11036. [PMID: 28986463 DOI: 10.1523/jneurosci.1222-17.2017] [Citation(s) in RCA: 31] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/02/2017] [Revised: 09/22/2017] [Accepted: 09/27/2017] [Indexed: 12/18/2022] Open
Abstract
Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear "mixed" selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit mechanisms. We show that PFC cells recorded from male and female rhesus macaques during a complex task show a moderate level of specialization and structure that is not replicated by a model wherein cells receive random feedforward inputs. While random connectivity can be effective at generating mixed selectivity, the data show significantly more mixed selectivity than predicted by a model with otherwise matched parameters. A simple Hebbian learning rule applied to the random connectivity, however, increases mixed selectivity and enables the model to match the data more accurately. To explain how learning achieves this, we provide analysis along with a clear geometric interpretation of the impact of learning on selectivity. After learning, the model also matches the data on measures of noise, response density, clustering, and the distribution of selectivities. Of two styles of Hebbian learning tested, the simpler and more biologically plausible option better matches the data. These modeling results provide clues about how neural properties important for cognition can arise in a circuit and make clear experimental predictions regarding how various measures of selectivity would evolve during animal training.
SIGNIFICANCE STATEMENT The prefrontal cortex is a brain region believed to support the ability of animals to engage in complex behavior. How neurons in this area respond to stimuli, and in particular to combinations of stimuli ("mixed selectivity"), is a topic of interest. Even though models with random feedforward connectivity are capable of creating computationally relevant mixed selectivity, such a model does not match the levels of mixed selectivity seen in the data analyzed in this study. Adding simple Hebbian learning to the model increases mixed selectivity to the correct level and makes the model match the data on several other relevant measures. This study thus offers predictions on how mixed selectivity and other properties evolve with training.
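The circuit mechanism argued for in this abstract (random feedforward input followed by a simple Hebbian update) can be sketched generically. The dimensions, ReLU nonlinearity, and learning rate below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sketch: a "mixed-selectivity" layer with random feedforward weights,
# plus a Hebbian update strengthening weights between co-active units.
n_in, n_out, eta = 50, 200, 0.002   # assumed sizes and learning rate
W = rng.normal(0, 1 / np.sqrt(n_in), size=(n_out, n_in))

def respond(W, x):
    return np.maximum(W @ x, 0.0)   # ReLU stands in for the neural nonlinearity

stimulus = rng.normal(size=n_in)
r_before = respond(W, stimulus)

for _ in range(20):
    r = respond(W, stimulus)
    W += eta * np.outer(r, stimulus)  # Hebbian: pre * post correlation

r_after = respond(W, stimulus)

# Hebbian learning amplifies responses to the trained stimulus, making the
# population more strongly tuned (selective) to it than the random model.
print(r_before.mean(), r_after.mean())
```

Note that this pure Hebbian term grows without bound over many repetitions, which is exactly why stabilizing mechanisms are studied elsewhere in this listing.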
35
Striatopallidal Neuron NMDA Receptors Control Synaptic Connectivity, Locomotor, and Goal-Directed Behaviors. J Neurosci 2017; 36:4976-92. [PMID: 27147651 DOI: 10.1523/jneurosci.2717-15.2016] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/17/2015] [Accepted: 03/07/2016] [Indexed: 01/08/2023] Open
Abstract
The basal ganglia (BG) control action selection, motor programs, habits, and goal-directed learning. The striatum, the principal input structure of the BG, is predominantly composed of medium-sized spiny neurons (MSNs). Arising from these spatially intermixed MSNs, two inhibitory outputs form the two main efferent pathways, the direct and indirect pathways. Striatonigral MSNs give rise to the activating, direct pathway and striatopallidal MSNs to the inhibitory, indirect pathway (iMSNs). BG output nuclei integrate information from both pathways to fine-tune motor procedures and to acquire complex habits and skills. Therefore, balanced activity between both pathways is crucial for the harmonious functioning of the BG. Despite the increase in knowledge concerning the role of glutamate NMDA receptors (NMDA-Rs) in the striatum, understanding of the specific functions of NMDA-Rs in iMSNs is still lacking. For this purpose, we generated a conditional knock-out mouse to address the functions of the NMDA-R in the indirect pathway. At the cellular level, deletion of GluN1 in iMSNs leads to a reduction in the number and strength of the excitatory corticostriatopallidal synapses. The subsequent scaling down of input integration leads to dysfunctional changes in BG output, seen as reduced habituation, delayed goal-directed learning, lack of associative behavior, and impairment in action selection or skill learning. The NMDA-R deletion in iMSNs causes a decrease in the synaptic strength of striatopallidal neurons, which in turn might lead to an imbalanced integration between the direct and indirect MSN pathways, making mice less sensitive to environmental change and significantly affecting their ability to learn from and adapt to experience.
SIGNIFICANCE STATEMENT The striatum controls habits, locomotion, and goal-directed behaviors by coordinated activation of two antagonistic pathways. Insofar as NMDA receptors (NMDA-Rs) play a key role in the synaptic plasticity essential for sustaining these behaviors, we generated a mouse model lacking NMDA-Rs specifically in striatopallidal neurons. To our knowledge, this is the first time that a specific deletion of inhibitory, indirect-pathway medium-sized spiny neuron (iMSN) NMDA-Rs has been used to address the role of these receptors in the inhibitory pathway. Importantly, we found that this deletion led to a significant reduction in the number and strength of the cortico-iMSN synapses, which resulted in significant impairments of behaviors orchestrated by the basal ganglia. Our findings indicate that the NMDA-Rs of the indirect pathway are essential for habituation, action selection, and goal-directed learning.
36
Abstract
In this paper, we present data for the lognormal distributions of spike rates, synaptic weights, and intrinsic excitability (gain) for neurons in various brain areas, such as auditory or visual cortex, hippocampus, cerebellum, striatum, and midbrain nuclei. We find a remarkable consistency of heavy-tailed, specifically lognormal, distributions for rates, weights, and gains in all brain areas examined. The difference between strongly recurrent and feed-forward connectivity (cortex vs. striatum and cerebellum), neurotransmitter (GABA (striatum) or glutamate (cortex)), or the level of activation (low in cortex, high in Purkinje cells and midbrain nuclei) turns out to be irrelevant for this feature. Logarithmic-scale distribution of weights and gains appears to be a general, functional property in all cases analyzed. We then created a generic neural model to investigate adaptive learning rules that create and maintain lognormal distributions. We conclusively demonstrate that not only weights but also intrinsic gains need to undergo strong Hebbian learning in order to produce and maintain the experimentally attested distributions. This provides a solution to the long-standing question about the type of plasticity exhibited by intrinsic excitability.
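A quick way to see how Hebbian learning can maintain lognormality is to note that multiplicative (proportional) updates make the log-weight a sum of random increments. The sketch below assumes a pure multiplicative random update; it illustrates the statistical mechanism only, not the paper's model, and the rate `eta` and step count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Multiplicative updates: each change is proportional to the current weight,
# so log(w) performs a Gaussian random walk and w becomes lognormal.
n, steps, eta = 10_000, 100, 0.05
w = np.ones(n)
for _ in range(steps):
    w *= np.exp(eta * rng.standard_normal(n))  # proportional (multiplicative) update

def skewness(x):
    x = x - x.mean()
    return (x**3).mean() / (x**2).mean() ** 1.5

# Raw weights are heavy-tailed and right-skewed; their logs are ~Gaussian.
print(round(skewness(w), 2), round(skewness(np.log(w)), 3))
```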
Affiliation(s)
- Gabriele Scheler
- Carl Correns Foundation for Mathematical Biology, Mountain View, CA, 94040, USA

37
Abstract
In this paper, we document lognormal distributions of spike rates, synaptic weights, and intrinsic excitability (gain) for neurons in various brain areas, such as auditory or visual cortex, hippocampus, cerebellum, striatum, and midbrain nuclei. We find a remarkable consistency of heavy-tailed, specifically lognormal, distributions for rates, weights, and gains in all brain areas. The difference between strongly recurrent and feed-forward connectivity (cortex vs. striatum and cerebellum), neurotransmitter (GABA (striatum) or glutamate (cortex)), or the level of activation (low in cortex, high in Purkinje cells and midbrain nuclei) turns out to be irrelevant for this feature. Logarithmic-scale distribution of weights and gains appears to be a functional property that is present everywhere. Second, we created a generic neural model to show that Hebbian learning will create and maintain lognormal distributions. With the model, we could show that not only weights but also intrinsic gains need strong Hebbian learning in order to produce and maintain the experimentally attested distributions. This settles a long-standing question about the type of plasticity exhibited by intrinsic excitability.
Affiliation(s)
- Gabriele Scheler
- Carl Correns Foundation for Mathematical Biology, Mountain View, CA, 94040, USA

38
Park Y, Choi W, Paik SB. Symmetry of learning rate in synaptic plasticity modulates formation of flexible and stable memories. Sci Rep 2017; 7:5671. [PMID: 28720795 PMCID: PMC5516032 DOI: 10.1038/s41598-017-05929-2] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/18/2016] [Accepted: 06/06/2017] [Indexed: 01/06/2023] Open
Abstract
Spike-timing-dependent plasticity (STDP) is considered critical to learning and memory functions in the human brain. Across various types of synapse, STDP is observed as different profiles of Hebbian and anti-Hebbian learning rules. However, the specific roles of diverse STDP profiles in memory formation remain elusive. Here, we show that the symmetry of the learning-rate profile in STDP is crucial in determining the character of stored memory. Using computer simulations, we found that an asymmetric learning rate generates flexible memory that is volatile and easily overwritten by newly appended information. Moreover, a symmetric learning rate generates stable memory that can coexist with newly appended information. In addition, by combining these two conditions, we could realize a hybrid memory type that operates in a way intermediate between stable and flexible memory. Our results demonstrate that various attributes of memory function may originate from differences in synaptic stability.
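The distinction this abstract draws can be made concrete with the two kernel shapes it contrasts: an asymmetric (Hebbian) profile whose sign follows spike order, and a symmetric profile depending only on coincidence. The exponential forms and the time constant below are conventional assumptions, not the paper's exact parameters.

```python
import numpy as np

TAU = 20.0  # assumed STDP time constant, ms

def asymmetric(dt):
    """Sign follows spike order: LTP for pre-before-post, LTD otherwise."""
    return np.where(dt >= 0, np.exp(-dt / TAU), -np.exp(dt / TAU))

def symmetric(dt):
    """Magnitude depends only on the coincidence |dt|, not on spike order."""
    return np.exp(-np.abs(dt) / TAU)

# The asymmetric kernel nearly cancels over symmetric lags (odd function),
# while the symmetric kernel always potentiates coincident activity.
dts = np.linspace(-100, 100, 201)
print(asymmetric(dts).sum(), symmetric(dts).sum())
```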
Affiliation(s)
- Youngjin Park
- Department of Bio and Brain Engineering, Korea Advanced Institute of Science and Technology, Daejeon, 34141, Republic of Korea
- Woochul Choi
- Department of Bio and Brain Engineering, Korea Advanced Institute of Science and Technology, Daejeon, 34141, Republic of Korea; Program of Brain and Cognitive Engineering, Korea Advanced Institute of Science and Technology, Daejeon, 34141, Republic of Korea
- Se-Bum Paik
- Department of Bio and Brain Engineering, Korea Advanced Institute of Science and Technology, Daejeon, 34141, Republic of Korea; Program of Brain and Cognitive Engineering, Korea Advanced Institute of Science and Technology, Daejeon, 34141, Republic of Korea

39
Behavioral and Single-Neuron Sensitivity to Millisecond Variations in Temporally Patterned Communication Signals. J Neurosci 2017; 36:8985-9000. [PMID: 27559179 DOI: 10.1523/jneurosci.0648-16.2016] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/27/2016] [Accepted: 07/05/2016] [Indexed: 01/09/2023] Open
Abstract
In many sensory pathways, central neurons serve as temporal filters for timing patterns in communication signals. However, how a population of neurons with diverse temporal filtering properties codes for natural variation in communication signals is unknown. Here we addressed this question in the weakly electric fish Brienomyrus brachyistius, which varies the time intervals between successive electric organ discharges to communicate. These fish produce an individually stereotyped signal called a scallop, which consists of a distinctive temporal pattern of ∼8-12 electric pulses. We manipulated the temporal structure of natural scallops during behavioral playback and in vivo electrophysiology experiments to probe the temporal sensitivity of scallop encoding and recognition. We found that presenting time-reversed, randomized, or jittered scallops increased behavioral response thresholds, demonstrating that the fish's electric signaling behavior was sensitive to the precise temporal structure of scallops. Next, using in vivo intracellular recordings and discriminant function analysis, we found that the responses of interval-selective midbrain neurons were also sensitive to the precise temporal structure of scallops. Subthreshold changes in membrane potential recorded from single neurons discriminated natural scallops from time-reversed, randomized, and jittered sequences. Pooling the responses of multiple neurons improved the discriminability of natural sequences from temporally manipulated sequences. Finally, we found that single-neuron responses were sensitive to interindividual variation in scallop sequences, raising the question of whether fish may analyze scallop structure to gain information about the sender. Collectively, these results demonstrate that a population of interval-selective neurons can encode behaviorally relevant temporal patterns with millisecond precision.
SIGNIFICANCE STATEMENT The timing patterns of action potentials, or spikes, play important roles in representing information in the nervous system. However, how these temporal patterns are recognized by downstream neurons is not well understood. Here we use the electrosensory system of mormyrid weakly electric fish to investigate how a population of neurons with diverse temporal filtering properties encodes behaviorally relevant input timing patterns, and how this relates to behavioral sensitivity. We show that fish are behaviorally sensitive to millisecond variations in natural, temporally patterned communication signals, and that the responses of individual midbrain neurons are also sensitive to variation in these patterns. In fact, the output of single neurons contains enough information to discriminate stereotyped communication signals produced by different individuals.
40
Puggioni P, Jelitai M, Duguid I, van Rossum MCW. Extraction of Synaptic Input Properties in Vivo. Neural Comput 2017; 29:1745-1768. [PMID: 28562220 DOI: 10.1162/neco_a_00975] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/08/2023]
Abstract
Knowledge of synaptic input is crucial for understanding synaptic integration and ultimately neural function. However, in vivo, the rates at which synaptic inputs arrive are high, so that it is typically impossible to detect single events. We show here that it is nevertheless possible to extract the properties of the events and, in particular, to extract the event rate, the synaptic time constants, and the properties of the event size distribution from in vivo voltage-clamp recordings. Applied to cerebellar interneurons, our method reveals that the synaptic input rate increases from 600 Hz during rest to 1000 Hz during locomotion, while the amplitude and shape of the synaptic events are unaffected by this state change. This method thus complements existing methods to measure neural function in vivo.
Affiliation(s)
- Paolo Puggioni
- Neuroinformatics Doctoral Training Centre and Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, U.K.
- Marta Jelitai
- Centre for Integrative Physiology, University of Edinburgh, Edinburgh EH8 9XD, U.K.
- Ian Duguid
- Centre for Integrative Physiology, University of Edinburgh, Edinburgh EH8 9XD, U.K.
- Mark C W van Rossum
- Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, U.K.

41
Einarsson H, Gauy MM, Lengler J, Steger A. A Model of Fast Hebbian Spike Latency Normalization. Front Comput Neurosci 2017; 11:33. [PMID: 28555102 PMCID: PMC5430963 DOI: 10.3389/fncom.2017.00033] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/13/2016] [Accepted: 04/13/2017] [Indexed: 11/13/2022] Open
Abstract
Hebbian changes of excitatory synapses are driven by, and enhance, correlations between pre- and postsynaptic neuronal activations, forming a positive feedback loop that can lead to instability in simulated neural networks. Because Hebbian learning may occur on time scales of seconds to minutes, it is conjectured that some form of fast stabilization of neural firing is necessary to avoid runaway excitation, but both the theoretical underpinning and the biological implementation of such a homeostatic mechanism remain to be fully investigated. Supported by analytical and computational arguments, we show that a Hebbian spike-timing-dependent metaplasticity rule accounts for inherently stable, rapid tuning of the total input weight of a single neuron in the general scenario of asynchronous neural firing characterized by UP and DOWN states of activity.
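The quantity being tuned here, the total input weight of a neuron, can be stabilized by a fast multiplicative normalization step; the sketch below pairs an unstable Hebbian-like growth term with such a step. The set point, rates, and the normalization rule itself are illustrative assumptions, not the paper's metaplasticity rule.

```python
import numpy as np

rng = np.random.default_rng(7)

# Sketch: Hebbian growth (unstable on its own) followed by a fast
# homeostatic rescaling of all input weights toward a fixed total.
W_TOTAL = 10.0  # assumed set point for the summed input weight
w = rng.uniform(0.0, 0.1, size=500)

for _ in range(100):
    hebb = 0.01 * rng.random(w.size)   # stand-in for correlation-driven growth
    w += hebb                          # positive-feedback Hebbian term
    w *= W_TOTAL / w.sum()             # fast multiplicative normalization

# The total input weight is pinned at the set point despite ongoing growth,
# while relative differences between synapses persist.
print(round(w.sum(), 6))
```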
Affiliation(s)
- Hafsteinn Einarsson
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Marcelo M. Gauy
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Johannes Lengler
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Angelika Steger
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Collegium Helveticum, Zurich, Switzerland

42
Exact firing time statistics of neurons driven by discrete inhibitory noise. Sci Rep 2017; 7:1577. [PMID: 28484244 PMCID: PMC5431561 DOI: 10.1038/s41598-017-01658-8] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/22/2017] [Accepted: 03/29/2017] [Indexed: 12/15/2022] Open
Abstract
Neurons in the intact brain receive a continuous and irregular synaptic bombardment from excitatory and inhibitory presynaptic neurons, which determines the firing activity of the stimulated neuron. In order to investigate the influence of inhibitory stimulation on the firing time statistics, we consider Leaky Integrate-and-Fire neurons subject to inhibitory instantaneous postsynaptic potentials. In particular, we report exact results for the firing rate, the coefficient of variation and the spike train spectrum for various synaptic weight distributions. Our results are not limited to stimulations of infinitesimal amplitude, but apply as well to finite-amplitude postsynaptic potentials, and are thus able to capture the effect of rare and large spikes. The developed methods can also reproduce the average firing properties of heterogeneous neuronal populations.
Collapse
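The model class analyzed in this entry, a leaky integrate-and-fire neuron with constant excitatory drive and finite-amplitude inhibitory kicks, is easy to simulate. The Monte Carlo sketch below estimates the firing rate and coefficient of variation numerically; all parameters are illustrative and the simulation reproduces the setup only schematically, not the paper's exact analytical results.

```python
import numpy as np

# LIF neuron with constant excitatory drive and discrete inhibitory kicks
# arriving as a Poisson process (illustrative parameters, not the paper's).
rng = np.random.default_rng(1)
tau, mu, v_th = 20.0, 1.2, 20.0      # time constant (ms), drive (mV/ms), threshold (mV)
nu_inh, a_inh = 0.1, 1.0             # inhibitory kick rate (1/ms) and amplitude (mV)
dt, t_max = 0.01, 5000.0             # time step and duration (ms)

v, t, spikes = 0.0, 0.0, []
while t < t_max:
    v += dt * (-v / tau + mu)        # leaky integration of the mean drive
    if rng.random() < nu_inh * dt:   # Poisson arrival of an inhibitory kick
        v -= a_inh
    if v >= v_th:                    # threshold crossing: spike and reset
        spikes.append(t)
        v = 0.0
    t += dt

isis = np.diff(spikes)
rate = 1000.0 * len(spikes) / t_max  # firing rate in Hz
cv = isis.std() / isis.mean()        # coefficient of variation of the ISIs
print(f"rate = {rate:.1f} Hz, CV = {cv:.2f}")
```

Sweeping `nu_inh` and `a_inh` while holding their product fixed illustrates the point made in the abstract: finite, rare kicks shape the interspike-interval statistics differently from an equivalent infinitesimal (diffusion) approximation.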
|
43
|
Lee WW, Kukreja SL, Thakor NV. CONE: Convex-Optimized-Synaptic Efficacies for Temporally Precise Spike Mapping. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2017; 28:849-861. [PMID: 27046881 DOI: 10.1109/tnnls.2015.2509479] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/05/2023]
Abstract
Spiking neural networks are well suited to performing time-dependent pattern recognition problems by encoding the temporal dimension in precise spike times. With an appropriate set of weights, a spiking neuron can emit precisely timed action potentials in response to spatiotemporal input spikes. However, deriving supervised learning rules for spike mapping is nontrivial due to the increased complexity. Existing methods rely on heuristic approaches that do not guarantee a convex objective function and, therefore, may not converge to a global minimum. In this paper, we present a novel technique for obtaining the weights of spiking neurons by formulating the problem in a convex optimization framework, rendering it compatible with established methods. We introduce techniques to influence the weight distribution and membrane trajectory, and then study how these factors affect robustness in the presence of noise. In addition, we show how the existence of a solution can be determined and assess the memory capacity limits of a neuron model using synthetic examples. The practical utility of our technique is further demonstrated by its application to gait-event detection using experimental data.
Collapse
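The core convexity observation is that the membrane potential of a simple spiking neuron is linear in its synaptic weights, so the requirement "stay subthreshold before the desired spike time and cross threshold at it" becomes a set of linear inequalities. The sketch below sets up such a feasibility problem with an alpha-shaped PSP kernel and solves it with `scipy.optimize.linprog`; the kernel, spike times, and margin are invented for illustration and this is not the paper's CONE formulation in detail.

```python
import numpy as np
from scipy.optimize import linprog

# Toy convex feasibility problem (hypothetical parameters): find weights so
# the neuron fires at t_des and nowhere on the earlier time grid.
tau, theta, margin = 5.0, 1.0, 0.05
t_in = np.array([1.0, 2.0, 3.0, 4.0, 6.0])   # one input spike per synapse (ms)
t_des = 10.0                                  # desired output spike time (ms)
grid = np.arange(0.0, t_des, 0.5)             # times that must stay subthreshold

def psp(s):
    # alpha-shaped postsynaptic potential kernel, zero before the input spike
    return np.where(s > 0, (s / tau) * np.exp(1.0 - s / tau), 0.0)

K = psp(grid[:, None] - t_in[None, :])        # (time, synapse) design matrix
k_des = psp(t_des - t_in)                     # kernel values at the target time

# Linear inequalities: K w <= theta - margin (subthreshold on the grid),
# and -k_des . w <= -theta (threshold reached at t_des).
A_ub = np.vstack([K, -k_des])
b_ub = np.concatenate([np.full(len(grid), theta - margin), [-theta]])
res = linprog(c=np.zeros(len(t_in)), A_ub=A_ub, b_ub=b_ub,
              bounds=[(-10.0, 10.0)] * len(t_in))
print("feasible:", res.success, "weights:", np.round(res.x, 3))
```

Because the constraint set is a polytope, any linear objective over it (for instance, one shaping the weight distribution, as the abstract mentions) keeps the problem convex and globally solvable.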
|
44
|
M De la Fuente I, Malaina I, Pérez-Samartín A, Boyano MD, Pérez-Yarza G, Bringas C, Villarroel Á, Fedetz M, Arellano R, Cortes JM, Martínez L. Dynamic properties of calcium-activated chloride currents in Xenopus laevis oocytes. Sci Rep 2017; 7:41791. [PMID: 28198817 PMCID: PMC5304176 DOI: 10.1038/srep41791] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/22/2016] [Accepted: 12/30/2016] [Indexed: 11/18/2022] Open
Abstract
Chloride is the most abundant permeable anion in the cell, and numerous studies over the last two decades highlight the great importance and broad physiological role of chloride-current-mediated anion transport. These currents participate in a multiplicity of key processes, such as the regulation of electrical excitability, apoptosis, the cell cycle, epithelial secretion and neuronal excitability. In addition, dysfunction of Cl− channels is involved in a variety of human diseases such as epilepsy, osteoporosis and different cancer types. Historically, chloride channels have attracted less interest than cation channels; in fact, there have been practically no quantitative studies of the dynamics of chloride currents. Here, for the first time, we have quantitatively studied experimental calcium-activated chloride fluxes from Xenopus laevis oocytes. The main results show that the experimental Cl− currents present an informational structure characterized by highly organized data sequences, long-term memory properties and inherent “crossover” dynamics, in which persistent correlations arise at short time intervals while anti-persistent behaviors become dominant at long time intervals. Our work sheds some light on the informational properties of ion currents, a key element in elucidating their physiological functional coupling with the integrative dynamics of metabolic processes.
Collapse
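One standard way to quantify the long-term memory and persistent/anti-persistent "crossover" behavior mentioned in this abstract is detrended fluctuation analysis (DFA), whose scaling exponent is ~0.5 for uncorrelated data, above 0.5 for persistent correlations, and below 0.5 for anti-persistent ones. The minimal implementation below is a generic illustration of the estimator, not necessarily the authors' exact analysis pipeline.

```python
import numpy as np

# Generic detrended fluctuation analysis (DFA) for a 1-D signal x:
# integrate, detrend in windows of each scale, and fit the log-log slope.
def dfa(x, scales):
    y = np.cumsum(x - x.mean())                   # integrated profile
    fluct = []
    for s in scales:
        n = len(y) // s
        segments = y[:n * s].reshape(n, s)
        t = np.arange(s)
        mse = []
        for seg in segments:                      # linearly detrend each window
            coeff = np.polyfit(t, seg, 1)
            mse.append(np.mean((seg - np.polyval(coeff, t)) ** 2))
        fluct.append(np.sqrt(np.mean(mse)))
    # scaling exponent: ~0.5 uncorrelated, >0.5 persistent, <0.5 anti-persistent
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rng = np.random.default_rng(0)
white = rng.standard_normal(20000)
scales = np.array([16, 32, 64, 128, 256])
print(round(dfa(white, scales), 2))               # close to 0.5 for white noise
```

A crossover like the one the authors report would appear as two different slopes when the fit is done separately over short and long scale ranges.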
Affiliation(s)
- Ildefonso M De la Fuente
- Department of Nutrition, CEBAS-CSIC Institute, Espinardo University Campus, Murcia, Spain.,Department of Mathematics, Faculty of Science and Technology, University of the Basque Country, UPV/EHU, Leioa, Spain
| | - Iker Malaina
- Department of Mathematics, Faculty of Science and Technology, University of the Basque Country, UPV/EHU, Leioa, Spain
| | - Alberto Pérez-Samartín
- Department of Neurosciences, Faculty of Medicine and Dentistry, University of the Basque Country, UPV/EHU, Leioa, Spain
| | - María Dolores Boyano
- Department of Cell Biology and Histology, Faculty of Medicine and Dentistry, University of the Basque Country, UPV/EHU, Leioa, Spain
| | - Gorka Pérez-Yarza
- Department of Cell Biology and Histology, Faculty of Medicine and Dentistry, University of the Basque Country, UPV/EHU, Leioa, Spain
| | - Carlos Bringas
- Department of Cell Biology and Histology, Faculty of Medicine and Dentistry, University of the Basque Country, UPV/EHU, Leioa, Spain
| | - Álvaro Villarroel
- Biophysics Unit, CSIC, University of the Basque Country, UPV/EHU, Leioa, Spain
| | - María Fedetz
- Department of Biochemistry and Pharmacology, Institute of Parasitology and Biomedicine "López-Neyra", CSIC, Granada, Spain
| | - Rogelio Arellano
- Laboratory of Cellular Neurophysiology, Neurobiology Institute, UNAM, Querétaro, México
| | - Jesus M Cortes
- Department of Cell Biology and Histology, Faculty of Medicine and Dentistry, University of the Basque Country, UPV/EHU, Leioa, Spain.,BioCruces Health Research Institute, Cruces University Hospital, Barakaldo, Spain.,IKERBASQUE: The Basque Foundation for Science, Bilbao, Spain
| | - Luis Martínez
- Department of Mathematics, Faculty of Science and Technology, University of the Basque Country, UPV/EHU, Leioa, Spain
| |
Collapse
|
45
|
Goldt S, Seifert U. Stochastic Thermodynamics of Learning. PHYSICAL REVIEW LETTERS 2017; 118:010601. [PMID: 28106416 DOI: 10.1103/physrevlett.118.010601] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/11/2016] [Indexed: 06/06/2023]
Abstract
Virtually every organism gathers information about its noisy environment and builds models from those data, mostly using neural networks. Here, we use stochastic thermodynamics to analyze the learning of a classification rule by a neural network. We show that the information acquired by the network is bounded by the thermodynamic cost of learning and introduce a learning efficiency η≤1. We discuss the conditions for optimal learning and analyze Hebbian learning in the thermodynamic limit.
Collapse
Affiliation(s)
- Sebastian Goldt
- II. Institut für Theoretische Physik, Universität Stuttgart, 70550 Stuttgart, Germany
| | - Udo Seifert
- II. Institut für Theoretische Physik, Universität Stuttgart, 70550 Stuttgart, Germany
| |
Collapse
|
46
|
Li G, Deng L, Wang D, Wang W, Zeng F, Zhang Z, Li H, Song S, Pei J, Shi L. Hierarchical Chunking of Sequential Memory on Neuromorphic Architecture with Reduced Synaptic Plasticity. Front Comput Neurosci 2016; 10:136. [PMID: 28066223 PMCID: PMC5168929 DOI: 10.3389/fncom.2016.00136] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/12/2016] [Accepted: 12/01/2016] [Indexed: 11/30/2022] Open
Abstract
Chunking refers to a phenomenon whereby individuals group items together when performing a memory task, improving the performance of sequential memory. In this work, we build a bio-plausible hierarchical chunking of sequential memory (HCSM) model to explain why such improvement happens. We address this issue by linking hierarchical chunking with synaptic plasticity and neuromorphic engineering. We find that a chunking mechanism reduces the requirements on synaptic plasticity, since it allows synapses with narrow dynamic range and low precision to perform a memory task. We validate a hardware version of the model through simulation, based on measured memristor behavior with narrow dynamic range in neuromorphic circuits, which reveals how chunking works and what role it plays in encoding sequential memory. Our work deepens the understanding of sequential memory and enables its incorporation into the investigation of brain-inspired computing on neuromorphic architectures.
Collapse
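The precision-reducing effect of chunking can be seen in a toy recursive grouping: a flat sequence of N items requires distinguishing N ordinal positions, while a hierarchy of chunks of size k only ever requires ordering k items at each node. The sketch below is a schematic illustration of that idea, not the HCSM model itself.

```python
# Toy illustration (not the paper's HCSM model): chunk a sequence into a
# k-ary hierarchy so each level only needs to order k items, reducing the
# required "synaptic" dynamic range per level from N to k.
def chunk(seq, k):
    # recursively group items into chunks of size k until one level fits
    if len(seq) <= k:
        return list(seq)
    return chunk([tuple(seq[i:i + k]) for i in range(0, len(seq), k)], k)

def recall(node):
    # depth-first traversal of the chunk tree reproduces the sequence
    if isinstance(node, tuple):
        return [item for child in node for item in recall(child)]
    return [node]

seq = list("abcdefgh")
tree = chunk(seq, 2)                 # hierarchy of pairwise chunks
out = [item for top in tree for item in recall(top)]
print(out == seq)
```

With k = 2, eight items are recalled through three levels of binary ordering decisions rather than one eight-way ordering, which is why low-precision synapses suffice at each level.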
Affiliation(s)
- Guoqi Li
- Department of Precision Instrument, Center for Brain Inspired Computing Research, Tsinghua University, Beijing, China
| | - Lei Deng
- Department of Precision Instrument, Center for Brain Inspired Computing Research, Tsinghua University, Beijing, China
| | - Dong Wang
- Department of Precision Instrument, Center for Brain Inspired Computing Research, Tsinghua University, Beijing, China
| | - Wei Wang
- School of Automation Science and Electrical Engineering, Beihang University, Beijing, China
| | - Fei Zeng
- Department of Materials Science and Engineering, Tsinghua University, Beijing, China
| | - Ziyang Zhang
- Department of Precision Instrument, Center for Brain Inspired Computing Research, Tsinghua University, Beijing, China
| | - Huanglong Li
- Department of Precision Instrument, Center for Brain Inspired Computing Research, Tsinghua University, Beijing, China
| | - Sen Song
- School of Medicine, Tsinghua University, Beijing, China
| | - Jing Pei
- Department of Precision Instrument, Center for Brain Inspired Computing Research, Tsinghua University, Beijing, China
| | - Luping Shi
- Department of Precision Instrument, Center for Brain Inspired Computing Research, Tsinghua University, Beijing, China
| |
Collapse
|
47
|
|
48
|
Miconi T, McKinstry JL, Edelman GM. Spontaneous emergence of fast attractor dynamics in a model of developing primary visual cortex. Nat Commun 2016; 7:13208. [PMID: 27796298 PMCID: PMC5095518 DOI: 10.1038/ncomms13208] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2015] [Accepted: 09/12/2016] [Indexed: 11/17/2022] Open
Abstract
Recent evidence suggests that neurons in primary sensory cortex arrange into competitive groups, representing stimuli by their joint activity rather than as independent feature analysers. A possible explanation for these results is that sensory cortex implements attractor dynamics, although this proposal remains controversial. Here we report that fast attractor dynamics emerge naturally in a computational model of a patch of primary visual cortex endowed with realistic plasticity (at both feedforward and lateral synapses) and mutual inhibition. When exposed to natural images (but not random pixels), the model spontaneously arranges into competitive groups of reciprocally connected, similarly tuned neurons, while developing realistic, orientation-selective receptive fields. Importantly, the same groups are observed in both stimulus-evoked and spontaneous (stimulus-absent) activity. The resulting network is inhibition-stabilized and exhibits fast, non-persistent attractor dynamics. Our results suggest that realistic plasticity, mutual inhibition and natural stimuli are jointly necessary and sufficient to generate attractor dynamics in primary sensory cortex. Sensory cortices represent stimuli through joint activity of competing neuronal assemblies. Here the authors show that a model of visual cortex with plastic feedforward and recurrent synapses, exposed to natural images, spontaneously develops attractor dynamics between groups of similarly tuned neurons.
Collapse
Affiliation(s)
- Thomas Miconi
- The Neurosciences Institute, 800 Silverado Street, Suite 302, La Jolla, California 92037-4234, USA
| | - Jeffrey L McKinstry
- The Neurosciences Institute, 800 Silverado Street, Suite 302, La Jolla, California 92037-4234, USA
| | - Gerald M Edelman
- The Neurosciences Institute, 800 Silverado Street, Suite 302, La Jolla, California 92037-4234, USA
| |
Collapse
|
49
|
TARP γ-2 and γ-8 Differentially Control AMPAR Density Across Schaffer Collateral/Commissural Synapses in the Hippocampal CA1 Area. J Neurosci 2016; 36:4296-312. [PMID: 27076426 DOI: 10.1523/jneurosci.4178-15.2016] [Citation(s) in RCA: 27] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/20/2015] [Accepted: 02/19/2016] [Indexed: 11/21/2022] Open
Abstract
The number of AMPA-type glutamate receptors (AMPARs) at synapses is the major determinant of synaptic strength and varies from synapse to synapse. To clarify the underlying molecular mechanisms, the densities of AMPARs, PSD-95, and transmembrane AMPAR regulatory proteins (TARPs) were compared at Schaffer collateral/commissural (SCC) synapses in the adult mouse hippocampal CA1 by quantitative immunogold electron microscopy using serial sections. We examined four types of SCC synapses: perforated and nonperforated synapses on pyramidal cells, and axodendritic synapses on parvalbumin-positive (PV synapse) and parvalbumin-negative interneurons (non-PV synapse). SCC synapses were categorized into those expressing high-density (perforated and PV synapses) or low-density (nonperforated and non-PV synapses) AMPARs. Although the density of PSD-95 labeling was fairly constant, the density and composition of TARP isoforms was highly variable depending on the synapse type. Of the three TARPs expressed in hippocampal neurons, the disparity in TARP γ-2 labeling was closely related to that of AMPAR labeling. Importantly, AMPAR density was significantly reduced at perforated and PV synapses in TARP γ-2-knock-out (KO) mice, resulting in a virtual loss of AMPAR disparity among SCC synapses. In comparison, TARP γ-8 was the only TARP expressed at nonperforated synapses, where AMPAR labeling further decreased to a background level in TARP γ-8-KO mice. These results show that synaptic inclusion of TARP γ-2 potently increases AMPAR expression and transforms low-density synapses into high-density ones, whereas TARP γ-8 is essential for low-density or basal expression of AMPARs at nonperforated synapses. Therefore, these TARPs are critically involved in AMPAR density control at SCC synapses.
SIGNIFICANCE STATEMENT Although converging evidence implicates the importance of transmembrane AMPA-type glutamate receptor (AMPAR) regulatory proteins (TARPs) in AMPAR stabilization during basal transmission and synaptic plasticity, how they control large disparities in AMPAR numbers or densities across central synapses remains largely unknown. We compared the density of AMPARs with that of TARPs among four types of Schaffer collateral/commissural (SCC) hippocampal synapses in wild-type and TARP-knock-out mice. We show that the density of AMPARs correlates with that of TARP γ-2 across SCC synapses and its high expression is linked to high-density AMPAR expression at perforated type of pyramidal cell synapses and synapses on parvalbumin-positive interneurons. In comparison, TARP γ-8 is the only TARP expressed at nonperforated type of pyramidal cell synapses, playing an essential role in low-density or basal AMPAR expression.
Collapse
|
50
|
Wan CJ, Liu YH, Feng P, Wang W, Zhu LQ, Liu ZP, Shi Y, Wan Q. Flexible Metal Oxide/Graphene Oxide Hybrid Neuromorphic Transistors on Flexible Conducting Graphene Substrates. ADVANCED MATERIALS (DEERFIELD BEACH, FLA.) 2016; 28:5878-5885. [PMID: 27159546 DOI: 10.1002/adma.201600820] [Citation(s) in RCA: 29] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/11/2016] [Revised: 04/11/2016] [Indexed: 06/05/2023]
Abstract
Flexible metal oxide/graphene oxide hybrid multi-gate neuromorphic transistors are fabricated on flexible conducting graphene substrates. Dendritic integrations in both spatial and temporal modes are emulated, and spatiotemporal correlated logics are obtained. A proof-of-principle visual system model for emulating the Lobula Giant Motion Detector neuron is also investigated. The results are of great significance for flexible sensors and neuromorphic cognitive systems.
Collapse
Affiliation(s)
- Chang Jin Wan
- School of Electronic Science and Engineering and Collaborative Innovation Center of Advanced Microstructures, Nanjing University, Nanjing, 210093, China
- Ningbo Institute of Material Technology and Engineering, Chinese Academy of Sciences, Ningbo, 315201, China
| | - Yang Hui Liu
- Ningbo Institute of Material Technology and Engineering, Chinese Academy of Sciences, Ningbo, 315201, China
| | - Ping Feng
- School of Electronic Science and Engineering and Collaborative Innovation Center of Advanced Microstructures, Nanjing University, Nanjing, 210093, China
| | - Wei Wang
- School of Electronic Science and Engineering and Collaborative Innovation Center of Advanced Microstructures, Nanjing University, Nanjing, 210093, China
- Ningbo Institute of Material Technology and Engineering, Chinese Academy of Sciences, Ningbo, 315201, China
| | - Li Qiang Zhu
- Ningbo Institute of Material Technology and Engineering, Chinese Academy of Sciences, Ningbo, 315201, China
| | - Zhao Ping Liu
- Ningbo Institute of Material Technology and Engineering, Chinese Academy of Sciences, Ningbo, 315201, China
| | - Yi Shi
- School of Electronic Science and Engineering and Collaborative Innovation Center of Advanced Microstructures, Nanjing University, Nanjing, 210093, China
| | - Qing Wan
- School of Electronic Science and Engineering and Collaborative Innovation Center of Advanced Microstructures, Nanjing University, Nanjing, 210093, China
- Ningbo Institute of Material Technology and Engineering, Chinese Academy of Sciences, Ningbo, 315201, China
| |
Collapse
|