1
Devalle F, Roxin A. How plasticity shapes the formation of neuronal assemblies driven by oscillatory and stochastic inputs. J Comput Neurosci 2025; 53:9-23. [PMID: 39661297] [DOI: 10.1007/s10827-024-00885-z] [Received: 07/31/2024] [Revised: 11/21/2024] [Accepted: 11/25/2024]
Abstract
Synaptic connections in neuronal circuits are modulated by pre- and post-synaptic spiking activity. Previous theoretical work has studied how such Hebbian plasticity rules shape network connectivity when firing rates are constant, or slowly varying in time. However, oscillations and fluctuations, which can arise through sensory inputs or intrinsic brain mechanisms, are ubiquitous in neuronal circuits. Here we study how oscillatory and fluctuating inputs shape recurrent network connectivity given a temporally asymmetric plasticity rule. We do this analytically using a separation of time scales approach for pairs of neurons, and then show that the analysis can be extended to understand the structure in large networks. In the case of oscillatory inputs, the resulting network structure is strongly affected by the phase relationship between drive to different neurons. In large networks, distributed phases tend to lead to hierarchical clustering. The analysis for stochastic inputs reveals a rich phase plane in which there is multistability between different possible connectivity motifs. Our results may be of relevance for understanding the effect of sensory-driven inputs, which are by nature time-varying, on synaptic plasticity, and hence on learning and memory.
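The temporally asymmetric plasticity rule studied here can be illustrated with a minimal pair-based STDP sketch. This is not the paper's exact model; the amplitudes `a_plus`, `a_minus` and time constant `tau` are illustrative choices. A synapse potentiates when the presynaptic spike precedes the postsynaptic one and depresses otherwise:

```python
import numpy as np

def stdp_kernel(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Temporally asymmetric STDP kernel: potentiation when pre leads post
    (dt = t_post - t_pre > 0), depression when post leads pre."""
    return np.where(dt > 0, a_plus * np.exp(-dt / tau),
                    -a_minus * np.exp(dt / tau))

def total_weight_change(pre_spikes, post_spikes):
    """Sum the kernel over all pre/post spike pairs (all-to-all pairing)."""
    dts = np.subtract.outer(np.asarray(post_spikes), np.asarray(pre_spikes))
    return float(stdp_kernel(dts).sum())
```

Under such a kernel, an oscillatory drive with a phase lead of neuron A over neuron B tends to strengthen the A-to-B synapse and weaken B-to-A, which is the pairwise intuition the paper's separation-of-time-scales analysis builds on.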
Affiliation(s)
- Federico Devalle
- Computational Neuroscience Group, Centre de Recerca Matemàtica, Campus de Bellaterra, Edifici C, 08193, Bellaterra, Spain
- Alex Roxin
- Computational Neuroscience Group, Centre de Recerca Matemàtica, Campus de Bellaterra, Edifici C, 08193, Bellterra, Spain.
2
Ying X, Xie Q, Zhao Y, Shen J, Huang J, Feng Z, Chu L, Xu J, Jiang D, Wu P, Zuo Y, Li S, Jiang C, Li X, Wang Z. Exercise therapy facilitates neural remodeling and functional recovery post-spinal cord injury via PKA/CREB signaling pathway modulation in rats. Burns Trauma 2025; 13:tkae058. [PMID: 39845195] [PMCID: PMC11751360] [DOI: 10.1093/burnst/tkae058] [Received: 12/10/2023] [Revised: 03/14/2024] [Accepted: 08/29/2024]
Abstract
Background Neuronal structure is disrupted after spinal cord injury (SCI), causing functional impairment. The effectiveness of exercise therapy (ET) in clinical settings for nerve remodeling post-SCI, and its underlying mechanisms, remain unclear. This study aims to explore the effects and related mechanisms of ET on nerve remodeling in SCI rats. Methods We randomly assigned rats to six groups: sham-operated, sham-operated + ET, SCI alone, SCI + H89, SCI + ET, and SCI + ET + H89. Techniques including motor-evoked potential (MEP) recording, video capture and analysis, the Basso-Beattie-Bresnahan (BBB) scale, western blotting, transmission electron microscopy, hematoxylin and eosin staining, Nissl staining, glycine silver staining, immunofluorescence, and Golgi staining were used to assess signal conduction, neurological deficits, hindlimb performance, protein expression, neuronal ultrastructure, and tissue morphology. H89, an inhibitor of the protein kinase A (PKA)/cAMP response element-binding protein (CREB) signaling pathway, was employed to investigate the molecular mechanism. Results ET reduced neuronal damage in rats with SCI, protected residual tissue, promoted the remodeling of motor neurons, neurofilaments, dendrites/axons, synapses, and myelin sheaths, reorganized neural circuits, and promoted motor function recovery. Mechanistically, ET acted mainly by mediating the PKA/CREB signaling pathway in neurons. Conclusions Our findings indicate that: (1) ET counteracted the H89-induced suppression of the PKA/CREB signaling pathway following SCI; (2) ET significantly alleviated neuronal injury and improved motor dysfunction; (3) ET facilitated neuronal regeneration by mediating the PKA/CREB signaling pathway; and (4) ET enhanced synaptic and dendritic spine plasticity, as well as myelin sheath remodeling, post-SCI through the PKA/CREB signaling pathway.
Affiliation(s)
- Xinwang Ying
- The Orthopaedic Center, The Affiliated Wenling Hospital of Wenzhou Medical University (The First People’s Hospital of Wenling), 333 Chuanan Road, Chengxi Street, Wenling City, Zhejiang Province 317500, China
- Department of Physical Medicine and Rehabilitation, The Second Affiliated Hospital and Yuying Children's Hospital of Wenzhou Medical University, 109 Xueyuan West Road, Lucheng District, Wenzhou City, Zhejiang Province 325000, China
- National Key Laboratory of Macromolecular Drug Development and Manufacturing, School of Pharmaceutical Science, Wenzhou Medical University, Zhongxin North Road, Chashan Higher Education Park, Ouhai District, Wenzhou City, Zhejiang Province 325035, China
- Qingfeng Xie
- Department of Physical Medicine and Rehabilitation, The Second Affiliated Hospital and Yuying Children's Hospital of Wenzhou Medical University, 109 Xueyuan West Road, Lucheng District, Wenzhou City, Zhejiang Province 325000, China
- Yanfang Zhao
- National Key Laboratory of Macromolecular Drug Development and Manufacturing, School of Pharmaceutical Science, Wenzhou Medical University, Zhongxin North Road, Chashan Higher Education Park, Ouhai District, Wenzhou City, Zhejiang Province 325035, China
- Jiamen Shen
- National Key Laboratory of Macromolecular Drug Development and Manufacturing, School of Pharmaceutical Science, Wenzhou Medical University, Zhongxin North Road, Chashan Higher Education Park, Ouhai District, Wenzhou City, Zhejiang Province 325035, China
- Junqing Huang
- Oujiang Laboratory (Zhejiang Lab for Regenerative Medicine, Vision and Brain Health), School of Pharmaceutical Science, Wenzhou Medical University, No. 999 Jinshi Road, Yongzhong Street, Longwan District, Wenzhou City, Zhejiang Province 325000, China
- Zhiyi Feng
- National Key Laboratory of Macromolecular Drug Development and Manufacturing, School of Pharmaceutical Science, Wenzhou Medical University, Zhongxin North Road, Chashan Higher Education Park, Ouhai District, Wenzhou City, Zhejiang Province 325035, China
- Liuxi Chu
- Oujiang Laboratory (Zhejiang Lab for Regenerative Medicine, Vision and Brain Health), School of Pharmaceutical Science, Wenzhou Medical University, No. 999 Jinshi Road, Yongzhong Street, Longwan District, Wenzhou City, Zhejiang Province 325000, China
- Junpeng Xu
- Wenzhou Medical University, Affiliated Cixi Hospital, No. 999, South Second Ring Road East, Hushan Street, Cixi City, Ningbo City, Zhejiang Province 315300, China
- Dawei Jiang
- Wenzhou Medical University, Affiliated Cixi Hospital, No. 999, South Second Ring Road East, Hushan Street, Cixi City, Ningbo City, Zhejiang Province 315300, China
- Ping Wu
- National Key Laboratory of Macromolecular Drug Development and Manufacturing, School of Pharmaceutical Science, Wenzhou Medical University, Zhongxin North Road, Chashan Higher Education Park, Ouhai District, Wenzhou City, Zhejiang Province 325035, China
- Yanming Zuo
- National Key Laboratory of Macromolecular Drug Development and Manufacturing, School of Pharmaceutical Science, Wenzhou Medical University, Zhongxin North Road, Chashan Higher Education Park, Ouhai District, Wenzhou City, Zhejiang Province 325035, China
- Shengcun Li
- Department of Physical Medicine and Rehabilitation, The Second Affiliated Hospital and Yuying Children's Hospital of Wenzhou Medical University, 109 Xueyuan West Road, Lucheng District, Wenzhou City, Zhejiang Province 325000, China
- Chang Jiang
- The Orthopaedic Center, The Affiliated Wenling Hospital of Wenzhou Medical University (The First People’s Hospital of Wenling), 333 Chuanan Road, Chengxi Street, Wenling City, Zhejiang Province 317500, China
- Xiaokun Li
- Oujiang Laboratory (Zhejiang Lab for Regenerative Medicine, Vision and Brain Health), School of Pharmaceutical Science, Wenzhou Medical University, No. 999 Jinshi Road, Yongzhong Street, Longwan District, Wenzhou City, Zhejiang Province 325000, China
- Zhouguang Wang
- The Orthopaedic Center, The Affiliated Wenling Hospital of Wenzhou Medical University (The First People’s Hospital of Wenling), 333 Chuanan Road, Chengxi Street, Wenling City, Zhejiang Province 317500, China
- Oujiang Laboratory (Zhejiang Lab for Regenerative Medicine, Vision and Brain Health), School of Pharmaceutical Science, Wenzhou Medical University, No. 999 Jinshi Road, Yongzhong Street, Longwan District, Wenzhou City, Zhejiang Province 325000, China
3
Fink AJP, Muscinelli SP, Wang S, Hogan MI, English DF, Axel R, Litwin-Kumar A, Schoonover CE. Experience-dependent reorganization of inhibitory neuron synaptic connectivity. bioRxiv 2025:2025.01.16.633450. [PMID: 39868262] [PMCID: PMC11761011] [DOI: 10.1101/2025.01.16.633450]
Abstract
Organisms continually tune their perceptual systems to the features they encounter in their environment [1-3]. We have studied how ongoing experience reorganizes the synaptic connectivity of neurons in the olfactory (piriform) cortex of the mouse. We developed an approach to measure synaptic connectivity in vivo, training a deep convolutional network to reliably identify monosynaptic connections from the spike-time cross-correlograms of 4.4 million single-unit pairs. This revealed that excitatory piriform neurons with similar odor tuning are more likely to be connected. We asked whether experience enhances this like-to-like connectivity but found that it was unaffected by odor exposure. Experience did, however, alter the logic of interneuron connectivity. Following repeated encounters with a set of odorants, inhibitory neurons that responded differentially to these stimuli exhibited a high degree of both incoming and outgoing synaptic connections within the cortical network. This reorganization depended only on the odor tuning of the inhibitory interneuron and not on the tuning of its pre- or postsynaptic partners. A computational model of this reorganized connectivity predicts that it increases the dimensionality of the entire network's responses to familiar stimuli, thereby enhancing their discriminability. We confirmed that this network-level property is present in physiological measurements, which showed increased dimensionality and separability of the evoked responses to familiar versus novel odorants. Thus, a simple, non-Hebbian reorganization of interneuron connectivity may selectively enhance an organism's discrimination of the features of its environment.
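The raw statistic behind these connectivity measurements, the spike-time cross-correlogram, is simple to compute even though the deep-network classifier itself is beyond a short sketch. The toy function below (illustrative window and bin sizes, not the authors' pipeline) histograms spike-time lags between a reference and a target unit; a sharp peak at a short positive latency is the classic signature of a monosynaptic connection:

```python
import numpy as np

def cross_correlogram(ref, target, window=10.0, bin_size=0.5):
    """Histogram of spike-time lags (target - ref) within +/- window ms."""
    ref = np.asarray(ref, dtype=float)
    target = np.asarray(target, dtype=float)
    edges = np.arange(-window, window + bin_size, bin_size)
    lags = []
    for t in ref:
        d = target - t
        lags.append(d[(d >= -window) & (d <= window)])
    all_lags = np.concatenate(lags) if lags else np.array([])
    counts, _ = np.histogram(all_lags, bins=edges)
    return counts, edges
```

For example, a target unit that reliably fires ~1.2 ms after the reference unit produces a concentrated peak in the bin covering that lag.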
Affiliation(s)
- Andrew J P Fink
- Department of Neurobiology, Northwestern University, Evanston, IL
- Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY
- Samuel P Muscinelli
- Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY
- Shuqi Wang
- Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY
- École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Marcus I Hogan
- Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY
- Neuroscience Graduate Program, University of California Berkeley, Berkeley, CA
- Richard Axel
- Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY
- Howard Hughes Medical Institute
- Ashok Litwin-Kumar
- Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY
- Carl E Schoonover
- Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY
- Allen Institute for Neural Dynamics, Seattle, WA
4
Mayzel J, Schneidman E. Homeostatic synaptic normalization optimizes learning in network models of neural population codes. eLife 2024; 13:RP96566. [PMID: 39680435] [DOI: 10.7554/eLife.96566]
Abstract
Studying and understanding the code of large neural populations hinge on accurate statistical models of population activity. A novel class of models, based on learning to weigh sparse nonlinear Random Projections (RP) of the population, has demonstrated high accuracy, efficiency, and scalability. Importantly, these RP models have a clear and biologically plausible implementation as shallow neural networks. We present a new class of RP models that are learned by optimizing the randomly selected sparse projections themselves. This 'reshaping' of projections is akin to changing synaptic connections in just one layer of the corresponding neural circuit model. We show that Reshaped RP models are more accurate and efficient than the standard RP models in recapitulating the code of tens of cortical neurons from behaving monkeys. Incorporating more biological features and utilizing synaptic normalization in the learning process results in accurate models that are more efficient. Remarkably, these models exhibit homeostasis in firing rates and total synaptic weights of projection neurons. We further show that these sparse homeostatic reshaped RP models outperform fully connected neural network models. Thus, our new scalable, efficient, and highly accurate population code models are not only biologically plausible but are actually optimized due to their biological features. These findings suggest a dual functional role of synaptic normalization in neural circuits: maintaining spiking and synaptic homeostasis while concurrently optimizing network performance and efficiency in encoding information and learning.
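The two ingredients named above, sparse random projections and homeostatic synaptic normalization, can be sketched minimally as follows. This is our illustration, not the paper's code, and it assumes the simplest reading of normalization: each projection neuron's incoming weights are rescaled to a fixed total budget.

```python
import numpy as np

def sparse_random_projections(n_neurons, n_projections, in_degree, seed=0):
    """Each projection neuron receives a sparse random subset of inputs."""
    rng = np.random.default_rng(seed)
    W = np.zeros((n_projections, n_neurons))
    for i in range(n_projections):
        idx = rng.choice(n_neurons, size=in_degree, replace=False)
        W[i, idx] = rng.random(in_degree)
    return W

def normalize_synapses(W, total=1.0):
    """Homeostatic normalization: rescale each neuron's incoming weights
    so they sum to a fixed budget, preserving their relative strengths."""
    sums = W.sum(axis=1, keepdims=True)
    return total * W / np.where(sums == 0, 1.0, sums)
```

Applying `normalize_synapses` after each learning step keeps the total incoming weight of every projection neuron constant while the 'reshaping' of individual projections proceeds.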
Affiliation(s)
- Jonathan Mayzel
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot, Israel
- Elad Schneidman
- Department of Brain Sciences, Weizmann Institute of Science, Rehovot, Israel
5
Bauer J, Lewin U, Herbert E, Gjorgjieva J, Schoonover CE, Fink AJP, Rose T, Bonhoeffer T, Hübener M. Sensory experience steers representational drift in mouse visual cortex. Nat Commun 2024; 15:9153. [PMID: 39443498] [PMCID: PMC11499870] [DOI: 10.1038/s41467-024-53326-x] [Received: 12/20/2023] [Accepted: 10/08/2024]
Abstract
Representational drift, the gradual and continuous change of neuronal representations, has been observed across many brain areas. It is unclear whether drift is caused by synaptic plasticity elicited by sensory experience, or by the intrinsic volatility of synapses. Here, using chronic two-photon calcium imaging in primary visual cortex of female mice, we find that the preferred stimulus orientation of individual neurons slowly drifts over the course of weeks. By using cylinder lens goggles to limit visual experience to a narrow range of orientations, we show that the direction of drift, but not its magnitude, is biased by the statistics of visual input. A network model suggests that drift of preferred orientation largely results from synaptic volatility, which under normal visual conditions is counteracted by experience-driven Hebbian mechanisms, stabilizing preferred orientation. Under deprivation conditions these Hebbian mechanisms enable adaptation. Thus, Hebbian synaptic plasticity steers drift to match the statistics of the environment.
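The interplay described here, random synaptic volatility versus an experience-driven Hebbian pull, can be caricatured as a circular random walk on preferred orientation with an optional attraction term. This is a toy illustration with made-up parameters, not the authors' network model:

```python
import numpy as np

def simulate_drift(n_steps, volatility=5.0, hebb=0.05, stim_ori=None, seed=0):
    """Toy drift of preferred orientation (degrees, period 180): Gaussian
    random-walk steps (synaptic volatility), optionally pulled toward an
    overrepresented stimulus orientation (Hebbian term)."""
    rng = np.random.default_rng(seed)
    ori = np.zeros(n_steps)
    ori[0] = rng.uniform(0, 180)
    for t in range(1, n_steps):
        step = rng.normal(0.0, volatility)
        if stim_ori is not None:
            # pull toward stim_ori along the shortest circular distance
            d = (stim_ori - ori[t - 1] + 90) % 180 - 90
            step += hebb * d
        ori[t] = (ori[t - 1] + step) % 180
    return ori
```

With `stim_ori` set, orientations settle into a narrow band around the overrepresented stimulus, mimicking biased drift under goggle rearing; without it, the walk wanders freely, mimicking undirected drift.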
Affiliation(s)
- Joel Bauer
- Max Planck Institute for Biological Intelligence, Martinsried, Germany.
- International Max Planck Research School for Molecular Life Sciences, Martinsried, Germany.
- Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London, London, UK.
- Uwe Lewin
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität München, Planegg, Germany
- Elizabeth Herbert
- School of Life Sciences, Technical University of Munich, Freising, Germany
- Carl E Schoonover
- Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY, USA
- Allen Institute for Neural Dynamics, Seattle, WA, USA
- Andrew J P Fink
- Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY, USA
- Department of Neurobiology, Northwestern University, Evanston, IL, USA
- Tobias Rose
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Institute for Experimental Epileptology and Cognition Research, University of Bonn, Medical Center, Bonn, Germany
- Tobias Bonhoeffer
- Max Planck Institute for Biological Intelligence, Martinsried, Germany
- Mark Hübener
- Max Planck Institute for Biological Intelligence, Martinsried, Germany.
6
Monk T, Dennler N, Ralph N, Rastogi S, Afshar S, Urbizagastegui P, Jarvis R, van Schaik A, Adamatzky A. Electrical Signaling Beyond Neurons. Neural Comput 2024; 36:1939-2029. [PMID: 39141803] [DOI: 10.1162/neco_a_01696] [Received: 12/21/2023] [Accepted: 05/21/2024]
Abstract
Neural action potentials (APs) are difficult to interpret as signal encoders and/or computational primitives. Their relationships with stimuli and behaviors are obscured by the staggering complexity of nervous systems themselves. We can reduce this complexity by observing that "simpler" neuron-less organisms also transduce stimuli into transient electrical pulses that affect their behaviors. Without a complicated nervous system, APs are often easier to understand as signal/response mechanisms. We review examples of nonneural stimulus transductions in domains of life largely neglected by theoretical neuroscience: bacteria, protozoans, plants, fungi, and neuron-less animals. We report the properties of those electrical signals, for example their amplitudes, durations, ionic bases, refractory periods, and, in particular, their ecological purposes. We compare those properties with those of neurons to infer the tasks and selection pressures that neurons satisfy. Throughout the tree of life, nonneural stimulus transductions time behavioral responses to environmental changes. Nonneural organisms represent the presence or absence of a stimulus with the presence or absence of an electrical signal. Their transductions usually exhibit high sensitivity and specificity to a stimulus, but are often slow compared to neurons. Neurons appear to be sacrificing the specificity of their stimulus transductions for sensitivity and speed. We interpret cellular stimulus transductions as a cell's assertion that it detected something important at that moment in time. In particular, we consider neural APs as fast but noisy detection assertions. We infer that a principal goal of nervous systems is to detect extremely weak signals from noisy sensory spikes under enormous time pressure. We discuss neural computation proposals that address this goal by casting neurons as devices that implement online, analog, probabilistic computations with their membrane potentials. Those proposals imply a measurable relationship between afferent neural spiking statistics and efferent neural membrane electrophysiology.
Affiliation(s)
- Travis Monk
- International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Sydney, NSW 2747, Australia
- Nik Dennler
- International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Sydney, NSW 2747, Australia
- Biocomputation Group, University of Hertfordshire, Hatfield, Hertfordshire AL10 9AB, U.K.
- Nicholas Ralph
- International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Sydney, NSW 2747, Australia
- Shavika Rastogi
- International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Sydney, NSW 2747, Australia
- Biocomputation Group, University of Hertfordshire, Hatfield, Hertfordshire AL10 9AB, U.K.
- Saeed Afshar
- International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Sydney, NSW 2747, Australia
- Pablo Urbizagastegui
- International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Sydney, NSW 2747, Australia
- Russell Jarvis
- International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Sydney, NSW 2747, Australia
- André van Schaik
- International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Sydney, NSW 2747, Australia
- Andrew Adamatzky
- Unconventional Computing Laboratory, University of the West of England, Bristol BS16 1QY, U.K.
7
Yang X, La Camera G. Co-existence of synaptic plasticity and metastable dynamics in a spiking model of cortical circuits. PLoS Comput Biol 2024; 20:e1012220. [PMID: 38950068] [PMCID: PMC11244818] [DOI: 10.1371/journal.pcbi.1012220] [Received: 01/05/2024] [Revised: 07/12/2024] [Accepted: 06/01/2024]
Abstract
Evidence for metastable dynamics and its role in brain function is emerging at a fast pace and is changing our understanding of neural coding by putting an emphasis on hidden states of transient activity. Clustered networks of spiking neurons have enhanced synaptic connections among groups of neurons forming structures called cell assemblies; such networks are capable of producing metastable dynamics that is in agreement with many experimental results. However, it is unclear how a clustered network structure producing metastable dynamics may emerge from a fully local plasticity rule, i.e., a plasticity rule where each synapse only has access to the activity of the neurons it connects (as opposed to the activity of other neurons or other synapses). Here, we propose a local plasticity rule producing ongoing metastable dynamics in a deterministic, recurrent network of spiking neurons. The metastable dynamics co-exists with ongoing plasticity and is the consequence of a self-tuning mechanism that keeps the synaptic weights close to the instability line where memories are spontaneously reactivated. In turn, the synaptic structure is stable to ongoing dynamics and random perturbations, yet it remains sufficiently plastic to remap sensory representations to encode new sets of stimuli. Both the plasticity rule and the metastable dynamics scale well with network size, with synaptic stability increasing with the number of neurons. Overall, our results show that it is possible to generate metastable dynamics over meaningful hidden states using a simple but biologically plausible plasticity rule which co-exists with ongoing neural dynamics.
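Locality here means that a synapse's update uses only its own pre- and post-synaptic activity. A classic rate-based rule with exactly that property is BCM, sketched below as an illustration of the constraint. This is not the spiking rule proposed in the paper, and the parameter values are arbitrary:

```python
import numpy as np

def local_update(w, r_pre, r_post, theta, lr=1e-3):
    """BCM-style fully local rule: the synapse sees only its own pre- and
    post-synaptic rates; potentiation when r_post exceeds the sliding
    threshold theta, depression otherwise. Weights are kept non-negative."""
    dw = lr * r_pre * r_post * (r_post - theta)
    return float(np.clip(w + dw, 0.0, None))

def update_threshold(theta, r_post, tau=100.0):
    """Slow self-tuning threshold tracking the squared post-synaptic rate;
    this homeostatic feedback is what stabilizes the weights."""
    return theta + (r_post**2 - theta) / tau
```

The sliding threshold plays a role loosely analogous to the paper's self-tuning mechanism: it moves with activity so that runaway potentiation or depression is checked, keeping the network near the regime where assemblies can form without destabilizing.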
Affiliation(s)
- Xiaoyu Yang
- Graduate Program in Physics and Astronomy, Stony Brook University, Stony Brook, New York, United States of America
- Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York, United States of America
- Center for Neural Circuit Dynamics, Stony Brook University, Stony Brook, New York, United States of America
- Giancarlo La Camera
- Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York, United States of America
- Center for Neural Circuit Dynamics, Stony Brook University, Stony Brook, New York, United States of America
8
Yang X, La Camera G. Co-existence of synaptic plasticity and metastable dynamics in a spiking model of cortical circuits. bioRxiv 2024:2023.12.07.570692. [PMID: 38106233] [PMCID: PMC10723399] [DOI: 10.1101/2023.12.07.570692]
Abstract
Evidence for metastable dynamics and its role in brain function is emerging at a fast pace and is changing our understanding of neural coding by putting an emphasis on hidden states of transient activity. Clustered networks of spiking neurons have enhanced synaptic connections among groups of neurons forming structures called cell assemblies; such networks are capable of producing metastable dynamics that is in agreement with many experimental results. However, it is unclear how a clustered network structure producing metastable dynamics may emerge from a fully local plasticity rule, i.e., a plasticity rule where each synapse only has access to the activity of the neurons it connects (as opposed to the activity of other neurons or other synapses). Here, we propose a local plasticity rule producing ongoing metastable dynamics in a deterministic, recurrent network of spiking neurons. The metastable dynamics co-exists with ongoing plasticity and is the consequence of a self-tuning mechanism that keeps the synaptic weights close to the instability line where memories are spontaneously reactivated. In turn, the synaptic structure is stable to ongoing dynamics and random perturbations, yet it remains sufficiently plastic to remap sensory representations to encode new sets of stimuli. Both the plasticity rule and the metastable dynamics scale well with network size, with synaptic stability increasing with the number of neurons. Overall, our results show that it is possible to generate metastable dynamics over meaningful hidden states using a simple but biologically plausible plasticity rule which co-exists with ongoing neural dynamics.
Affiliation(s)
- Xiaoyu Yang
- Graduate Program in Physics and Astronomy, Stony Brook University
- Department of Neurobiology & Behavior, Stony Brook University
- Center for Neural Circuit Dynamics, Stony Brook University
- Giancarlo La Camera
- Department of Neurobiology & Behavior, Stony Brook University
- Center for Neural Circuit Dynamics, Stony Brook University
9
Sun SED, Levenstein D, Li B, Mandelberg N, Chenouard N, Suutari BS, Sanchez S, Tian G, Rinzel J, Buzsáki G, Tsien RW. Synaptic homeostasis transiently leverages Hebbian mechanisms for a multiphasic response to inactivity. Cell Rep 2024; 43:113839. [PMID: 38507409] [DOI: 10.1016/j.celrep.2024.113839] [Received: 09/13/2022] [Revised: 12/19/2023] [Accepted: 02/05/2024]
Abstract
Homeostatic regulation of synapses is vital for nervous system function and key to understanding a range of neurological conditions. Synaptic homeostasis is proposed to operate over hours to counteract the destabilizing influence of long-term potentiation (LTP) and long-term depression (LTD). The prevailing view holds that synaptic scaling is a slow first-order process that regulates postsynaptic glutamate receptors and fundamentally differs from LTP or LTD. Surprisingly, we find that the dynamics of scaling induced by neuronal inactivity are not exponential or monotonic, and the mechanism requires calcineurin and CaMKII, molecules dominant in LTD and LTP. Our quantitative model of these enzymes reconstructs the unexpected dynamics of homeostatic scaling and reveals how synapses can efficiently safeguard future capacity for synaptic plasticity. This mechanism of synaptic adaptation supports a broader set of homeostatic changes, including action potential autoregulation, and invites further inquiry into how such a mechanism varies in health and disease.
Affiliation(s)
- Simón E D Sun
- Center for Neural Science, New York University, New York, NY 10003, USA; Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA; Cold Spring Harbor Laboratory, Cold Spring Harbor, NY 11724, USA
- Daniel Levenstein
- Center for Neural Science, New York University, New York, NY 10003, USA; Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA; Montreal Neurological Institute, Department of Neurology and Neurosurgery, McGill University, 3810 University Street, Montreal, QC, Canada
- Boxing Li
- Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA; Neuroscience Program, Guangdong Provincial Key Laboratory of Brain Function and Disease, Zhongshan School of Medicine and the Fifth Affiliated Hospital, Sun Yat-sen University, Guangzhou 510810, China
- Nataniel Mandelberg
- Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA
- Nicolas Chenouard
- Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA; Sorbonne Université, INSERM U1127, UMR CNRS 7225, Institut du Cerveau (ICM), 47 bld de l'hôpital, 75013 Paris, France
- Benjamin S Suutari
- Center for Neural Science, New York University, New York, NY 10003, USA; Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA
- Sandrine Sanchez
- Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA
- Guoling Tian
- Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA
- John Rinzel
- Center for Neural Science, New York University, New York, NY 10003, USA
- György Buzsáki
- Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA
- Richard W Tsien
- Center for Neural Science, New York University, New York, NY 10003, USA; Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA.
10
Wang Y, Wang Y, Zhang X, Du J, Zhang T, Xu B. Brain topology improved spiking neural network for efficient reinforcement learning of continuous control. Front Neurosci 2024; 18:1325062. [PMID: 38694900] [PMCID: PMC11062182] [DOI: 10.3389/fnins.2024.1325062] [Received: 10/20/2023] [Accepted: 03/27/2024]
Abstract
Brain topology closely reflects the complex cognitive functions of the biological brain, shaped by millions of years of evolution. Learning from these biological topologies is a smarter and easier way to achieve brain-like intelligence with features of efficiency, robustness, and flexibility. Here we propose a brain topology-improved spiking neural network (BT-SNN) for efficient reinforcement learning. First, hundreds of biological topologies are generated and selected as subsets of the Allen mouse brain topology with the help of the Tanimoto hierarchical clustering algorithm, which has been widely used in analyzing key features of the brain connectome. Second, a few biological constraints are used to filter out three key topology candidates, including but not limited to the proportion of node functions (e.g., sensation, memory, and motor types) and network sparsity. Third, the network topology is integrated with leaky integrate-and-fire neurons improved by a hybrid numerical solver. Fourth, the algorithm is tuned with an evolutionary algorithm, adaptive random search, instead of backpropagation, to guide synaptic modifications without affecting the raw key features of the topology. Fifth, on four animal-survival-like RL tasks (i.e., dynamic control in MuJoCo), BT-SNN achieves higher scores than not only a counterpart SNN using a random topology but also some classical ANNs (i.e., long short-term memory and multi-layer perceptron networks). This result indicates that the effort of incorporating biological topology and evolutionary learning rules holds much promise for the future.
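The leaky integrate-and-fire (LIF) dynamics underlying the third step can be sketched as follows; this is a minimal Euler-integration illustration with assumed parameter values, not the paper's hybrid numerical solver:

```python
def simulate_lif(i_ext, dt=1e-4, tau=0.02, v_rest=-0.07,
                 v_th=-0.05, v_reset=-0.07, r_m=1e8):
    """Euler integration of a leaky integrate-and-fire neuron:
    dv/dt = (v_rest - v + r_m * i_ext) / tau, with spike and reset
    whenever v crosses the threshold v_th. Returns spike times (s)."""
    v = v_rest
    spikes = []
    for step, i in enumerate(i_ext):
        v += dt * (v_rest - v + r_m * i) / tau
        if v >= v_th:
            spikes.append(step * dt)
            v = v_reset
    return spikes

# Constant 0.3 nA drive for 200 ms yields a regular spike train;
# zero drive yields none.
spike_times = simulate_lif([3e-10] * 2000)
```

The parameter names (`tau`, `r_m`, etc.) and values are generic textbook choices, chosen only so the sketch fires at a plausible rate.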
Collapse
Affiliation(s)
- Yongjian Wang
- Institute of Automation, Chinese Academy of Sciences, Beijing, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
| | - Yansong Wang
- Institute of Neuroscience, State Key Laboratory of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
- School of Future Technology, University of Chinese Academy of Sciences, Beijing, China
| | - Xinhe Zhang
- Institute of Automation, Chinese Academy of Sciences, Beijing, China
| | - Jiulin Du
- Institute of Neuroscience, State Key Laboratory of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
- School of Life Science and Technology, ShanghaiTech University, Shanghai, China
| | - Tielin Zhang
- Institute of Automation, Chinese Academy of Sciences, Beijing, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
- Institute of Neuroscience, State Key Laboratory of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
| | - Bo Xu
- Institute of Automation, Chinese Academy of Sciences, Beijing, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
- Institute of Neuroscience, State Key Laboratory of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
- School of Future Technology, University of Chinese Academy of Sciences, Beijing, China
| |
Collapse
|
11
|
González-Ramírez LR. A fractional-order Wilson-Cowan formulation of cortical disinhibition. J Comput Neurosci 2024; 52:109-123. [PMID: 37787876 DOI: 10.1007/s10827-023-00862-y] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/23/2023] [Revised: 08/13/2023] [Accepted: 09/08/2023] [Indexed: 10/04/2023]
Abstract
This work presents a fractional-order Wilson-Cowan model derivation under Caputo's formalism, considering an order 0 < α ≤ 1. To that end, we propose memory-dependent response functions and average neuronal excitation functions that permit us to arrive naturally at a fractional-order model incorporating past dynamics into the description of the activity of synaptically coupled neuronal populations. We then shift our focus to a particular example, aiming to analyze the fractional-order dynamics of the disinhibited cortex. This system mimics cortical activity observed during neurological disorders such as epileptic seizures, where an imbalance between excitation and inhibition is present, allowing brain dynamics to transition to a hyperexcited activity state. In the context of the first-order mathematical model, we recover traditional results showing a transition from a low-level activity state to a potentially pathological high-level activity state as an external factor modifies cortical inhibition. Under the fractional-order formulation, by contrast, we establish novel results showing that the system resists this transition as the order is decreased, permitting the system to remain in the low-activity state even with increased disinhibition. Furthermore, under the memory-index interpretation of the fractional-order model developed here, our results establish that increasing the memory index makes the system more resistant to transitioning towards the high-level activity state. That is, one possible effect of the memory index is to stabilize neuronal activity. Notably, this stabilizing effect resembles homeostatic plasticity mechanisms. To summarize our results, we present a two-parameter structural portrait describing the system's dynamics as a function of a proposed disinhibition parameter and the fractional order. We also explore numerical simulations of the model to validate our results.
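In standard Wilson-Cowan notation, a Caputo fractional-order formulation of the kind described here takes the generic form below; the paper's specific response functions, refractory terms, and coefficients may differ:

```latex
\begin{aligned}
\tau_E \, {}^{C}\!D^{\alpha}_{t} E(t) &= -E(t) + S_E\!\big(c_{EE}\,E(t) - c_{EI}\,I(t) + P(t)\big),\\
\tau_I \, {}^{C}\!D^{\alpha}_{t} I(t) &= -I(t) + S_I\!\big(c_{IE}\,E(t) - c_{II}\,I(t) + Q(t)\big),
\qquad 0 < \alpha \le 1,
\end{aligned}
```

where ${}^{C}\!D^{\alpha}_{t}$ is the Caputo fractional derivative, $S_{E,I}$ are sigmoidal response functions, and $P, Q$ are external drives. The classical first-order model is recovered at $\alpha = 1$; a disinhibition parameter would scale the inhibitory couplings $c_{EI}$ or $c_{II}$.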
Collapse
Affiliation(s)
- L R González-Ramírez
- Escuela Superior de Física y Matemáticas, Instituto Politécnico Nacional, Unidad Profesional Adolfo López Mateos Edificio 9, 07738, Cd. de México, México.
| |
Collapse
|
12
|
de Brito CSN, Gerstner W. Learning what matters: Synaptic plasticity with invariance to second-order input correlations. PLoS Comput Biol 2024; 20:e1011844. [PMID: 38346073 PMCID: PMC10890752 DOI: 10.1371/journal.pcbi.1011844] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2022] [Revised: 02/23/2024] [Accepted: 01/18/2024] [Indexed: 02/25/2024] Open
Abstract
Cortical populations of neurons develop sparse representations adapted to the statistics of the environment. To learn efficient population codes, synaptic plasticity mechanisms must differentiate relevant latent features from spurious input correlations, which are omnipresent in cortical networks. Here, we develop a theory for sparse coding and synaptic plasticity that is invariant to second-order correlations in the input. Going beyond classical Hebbian learning, our learning objective explains the functional form of observed excitatory plasticity mechanisms, showing how Hebbian long-term depression (LTD) cancels the sensitivity to second-order correlations so that receptive fields become aligned with features hidden in higher-order statistics. Invariance to second-order correlations enhances the versatility of biologically realistic learning models, supporting optimal decoding from noisy inputs and sparse population coding from spatially correlated stimuli. In a spiking model with triplet spike-timing-dependent plasticity (STDP), we show that individual neurons can learn localized oriented receptive fields, circumventing the need for input preprocessing, such as whitening, or population-level lateral inhibition. The theory advances our understanding of local unsupervised learning in cortical circuits, offers new interpretations of the Bienenstock-Cooper-Munro and triplet STDP models, and assigns a specific functional role to synaptic LTD mechanisms in pyramidal neurons.
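The Bienenstock-Cooper-Munro (BCM) rule that the theory reinterprets can be illustrated with a minimal rate-based sketch; parameters are assumed, and this is the classic BCM form rather than the authors' correlation-invariant objective:

```python
import numpy as np

rng = np.random.default_rng(0)

def bcm_step(w, x, theta, eta=1e-3, tau_theta=100.0):
    """One BCM update: Hebbian LTP when the response y exceeds the
    sliding threshold theta, LTD below it; theta tracks the running
    average of y**2, which stabilizes the rule."""
    y = float(w @ x)
    w = w + eta * y * (y - theta) * x
    theta = theta + (y ** 2 - theta) / tau_theta
    return w, theta

w, theta = rng.normal(scale=0.1, size=8), 1.0
for _ in range(2000):
    w, theta = bcm_step(w, rng.normal(size=8), theta)
```

The LTD branch (responses below threshold) is the term the paper assigns a specific functional role: canceling sensitivity to second-order input correlations.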
Collapse
Affiliation(s)
- Carlos Stein Naves de Brito
- École Polytechnique Fédérale de Lausanne, EPFL, Lausanne, Switzerland
- Champalimaud Research, Champalimaud Centre for the Unknown, Lisbon, Portugal
| | - Wulfram Gerstner
- École Polytechnique Fédérale de Lausanne, EPFL, Lausanne, Switzerland
| |
Collapse
|
13
|
Han K, Liu G, Liu N, Li J, Li J, Cui L, Cheng M, Long J, Liao X, Tang Z, Liu Y, Liu J, Chen J, Lu H, Zhang H. Effects of Mobile Intelligent Cognitive Training for Patients with Post-Stroke Cognitive Impairment: A 12-Week, Multicenter, Randomized Controlled Study. J Alzheimers Dis 2024; 100:999-1015. [PMID: 38968051 DOI: 10.3233/jad-240356] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 07/07/2024]
Abstract
Background: Reported effects of computerized cognitive intervention are inconsistent and largely limited to hospital rehabilitation settings. Objective: To investigate the effect of mobile intelligent cognitive training (MICT) on patients with post-stroke cognitive impairment (PSCI). Methods: This study was a multicenter, prospective, open-label, blinded-endpoint, cluster-randomized controlled trial (RCT). 518 patients with PSCI were stratified and assigned to four rehabilitation settings, and patients in each setting were then randomized into experimental and control groups through cluster randomization. All patients received comprehensive management for PSCI, while the experimental group additionally received the MICT intervention. Treatment was 30 minutes daily, 5 days per week, for 12 weeks. Cognitive function, activities of daily living (ADL), and quality of life (QOL) were assessed before treatment, at weeks 6 and 12 post-treatment, and at a 16-week follow-up. Results: Linear mixed-effects models showed that patients with PSCI improved on each outcome measure relative to pre-treatment (p < 0.05). Additionally, the improvement of these outcomes in the experimental group was significantly greater than in the control group at week 6 post-treatment and at the 16-week follow-up (p < 0.05). The rehabilitation setting also affected the cognitive efficacy of the MICT intervention, with the degree of improvement in each outcome highest in the hospital setting, followed by the community, nursing-home, and home settings. Conclusions: Long-term MICT intervention can improve cognition, ADL, and QOL in patients with PSCI, with effects sustained for at least one month. Notably, different rehabilitation settings affect the cognitive intervention efficacy of MICT in patients with PSCI, though this requires further confirmation in future studies.
Collapse
Affiliation(s)
- Kaiyue Han
- School of Rehabilitation, Capital Medical University, Beijing, China
- Beijing Bo'ai Hospital, China Rehabilitation Research Center, Beijing, China
| | | | - Nan Liu
- Beijing Puren Hospital, Beijing, China
| | - Jiangyi Li
- Beijing Dongcheng District Kangfu One Two Three Health Training Center, Beijing, China
| | - Jianfeng Li
- Beijing Yangfangdian Hospital, Beijing, China
| | - Lihua Cui
- Beijing Fengtai District Jiaxiang Nursing-Home for the Elderly, Beijing, China
- Beijing Fengtai You Anmen Hospital, Beijing, China
| | - Ming Cheng
- Beijing Haidian District Guolilai Elderly Care Center, Beijing, China
| | - Junzi Long
- School of Rehabilitation, Capital Medical University, Beijing, China
- Beijing Bo'ai Hospital, China Rehabilitation Research Center, Beijing, China
- Changping Laboratory, Beijing, China
| | - Xingxing Liao
- School of Rehabilitation, Capital Medical University, Beijing, China
- Beijing Bo'ai Hospital, China Rehabilitation Research Center, Beijing, China
- Changping Laboratory, Beijing, China
| | - Zhiqing Tang
- School of Rehabilitation, Capital Medical University, Beijing, China
- Beijing Bo'ai Hospital, China Rehabilitation Research Center, Beijing, China
| | - Ying Liu
- School of Rehabilitation, Capital Medical University, Beijing, China
- Beijing Bo'ai Hospital, China Rehabilitation Research Center, Beijing, China
| | - Jiajie Liu
- School of Rehabilitation, Capital Medical University, Beijing, China
- Beijing Bo'ai Hospital, China Rehabilitation Research Center, Beijing, China
| | - Jiarou Chen
- Beijing Bo'ai Hospital, China Rehabilitation Research Center, Beijing, China
- The Second School of Medicine, Wenzhou Medical University, Wenzhou, China
| | - Haitao Lu
- School of Rehabilitation, Capital Medical University, Beijing, China
- Beijing Bo'ai Hospital, China Rehabilitation Research Center, Beijing, China
| | - Hao Zhang
- School of Rehabilitation, Capital Medical University, Beijing, China
- Beijing Bo'ai Hospital, China Rehabilitation Research Center, Beijing, China
- University of Health and Rehabilitation Sciences, Qingdao, China
- Cheeloo College of Medicine, Shandong University, Jinan, China
| |
Collapse
|
14
|
Hutt A, Trotter D, Pariz A, Valiante TA, Lefebvre J. Diversity-induced trivialization and resilience of neural dynamics. CHAOS (WOODBURY, N.Y.) 2024; 34:013147. [PMID: 38285722 DOI: 10.1063/5.0165773] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/30/2023] [Accepted: 01/01/2024] [Indexed: 01/31/2024]
Abstract
Heterogeneity is omnipresent across all living systems. Diversity enriches the dynamical repertoire of these systems but remains challenging to reconcile with their manifest robustness and dynamical persistence over time, a fundamental feature called resilience. To better understand the mechanism underlying resilience in neural circuits, we considered a nonlinear network model, extracting the relationship between excitability heterogeneity and resilience. To measure resilience, we quantified the number of stationary states of this network and how they are affected by various control parameters. We analyzed, both analytically and numerically, gradient and non-gradient systems modeled as nonlinear sparse neural networks evolving over long time scales. Our analysis shows that neuronal heterogeneity quenches the number of stationary states while decreasing the susceptibility to bifurcations: a phenomenon known as trivialization. Heterogeneity was found to implement a homeostatic control mechanism enhancing network resilience to changes in network size and connection probability by quenching the system's dynamic volatility.
Collapse
Affiliation(s)
- Axel Hutt
- MLMS, MIMESIS, Université de Strasbourg, CNRS, Inria, ICube, 67000 Strasbourg, France
| | - Daniel Trotter
- Department of Physics, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
| | - Aref Pariz
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Department of Biology, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
| | - Taufik A Valiante
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Department of Electrical and Computer Engineering, Institute of Medical Science, Institute of Biomedical Engineering, Division of Neurosurgery, Department of Surgery, CRANIA (Center for Advancing Neurotechnological Innovation to Application), Max Planck-University of Toronto Center for Neural Science and Technology, University of Toronto, Toronto, Ontario M5S 3G8, Canada
| | - Jérémie Lefebvre
- Department of Physics, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Department of Biology, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Department of Mathematics, University of Toronto, Toronto, Ontario M5S 2E4, Canada
| |
Collapse
|
15
|
Yamada T, Watanabe T, Sasaki Y. Plasticity-stability dynamics during post-training processing of learning. Trends Cogn Sci 2024; 28:72-83. [PMID: 37858389 PMCID: PMC10842181 DOI: 10.1016/j.tics.2023.09.002] [Citation(s) in RCA: 4] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/08/2023] [Revised: 09/13/2023] [Accepted: 09/14/2023] [Indexed: 10/21/2023]
Abstract
Learning continues beyond the end of training. Post-training learning is supported by changes in plasticity and stability in the brain during both wakefulness and sleep. However, the lack of a unified measure for assessing plasticity and stability dynamics during training and post-training periods has limited our understanding of how these dynamics shape learning. Focusing primarily on procedural learning, we integrate work using behavioral paradigms and a recently developed measure, the excitatory-to-inhibitory (E/I) ratio, to explore the delicate balance between plasticity and stability and its relationship to post-training learning. This reveals plasticity-stability cycles during both wakefulness and sleep that enhance learning and protect it from new learning during post-training processing.
Collapse
Affiliation(s)
- Takashi Yamada
- Department of Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, RI, USA
| | - Takeo Watanabe
- Department of Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, RI, USA
| | - Yuka Sasaki
- Department of Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, RI, USA.
| |
Collapse
|
16
|
Boscaglia M, Gastaldi C, Gerstner W, Quian Quiroga R. A dynamic attractor network model of memory formation, reinforcement and forgetting. PLoS Comput Biol 2023; 19:e1011727. [PMID: 38117859 PMCID: PMC10766193 DOI: 10.1371/journal.pcbi.1011727] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/07/2023] [Revised: 01/04/2024] [Accepted: 12/02/2023] [Indexed: 12/22/2023] Open
Abstract
Empirical evidence shows that memories that are frequently revisited are easy to recall, and that familiar items involve larger hippocampal representations than less familiar ones. In line with these observations, here we develop a modelling approach to provide a mechanistic understanding of how hippocampal neural assemblies evolve differently, depending on the frequency of presentation of the stimuli. For this, we added an online Hebbian learning rule, background firing activity, neural adaptation and heterosynaptic plasticity to a rate attractor network model, thus creating dynamic memory representations that can persist, increase or fade according to the frequency of presentation of the corresponding memory patterns. Specifically, we show that a dynamic interplay between Hebbian learning and background firing activity can explain the relationship between the memory assembly sizes and their frequency of stimulation. Frequently stimulated assemblies increase their size independently from each other (i.e. creating orthogonal representations that do not share neurons, thus avoiding interference). Importantly, connections between neurons of assemblies that are not further stimulated become labile so that these neurons can be recruited by other assemblies, providing a neuronal mechanism of forgetting.
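The Hebbian assembly formation and recall described above build on classical attractor dynamics, which can be sketched with a Hopfield-style network; this is a didactic sketch of attractor recall only, not the authors' rate model with background firing, adaptation, and heterosynaptic plasticity:

```python
import numpy as np

rng = np.random.default_rng(1)

def store(patterns):
    """Hebbian outer-product storage of binary (+/-1) memory patterns."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)          # no self-connections
    return w

def recall(w, cue, steps=20):
    """Iterate the sign dynamics until the state settles in an attractor."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(w @ s)
        s[s == 0] = 1.0
    return s

patterns = rng.choice([-1.0, 1.0], size=(3, 200))
w = store(patterns)
cue = patterns[0].copy()
cue[:40] *= -1                        # corrupt 20% of the stored pattern
recovered = recall(w, cue)
```

At this low memory load, the corrupted cue falls back into the stored pattern's basin of attraction; the paper's contribution is to make such assemblies dynamic, so they grow, persist, or fade with stimulation frequency.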
Collapse
Affiliation(s)
- Marta Boscaglia
- Centre for Systems Neuroscience, University of Leicester, United Kingdom
- School of Psychology and Vision Sciences, University of Leicester, United Kingdom
| | - Chiara Gastaldi
- School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne (EPFL), Switzerland
| | - Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne (EPFL), Switzerland
| | - Rodrigo Quian Quiroga
- Centre for Systems Neuroscience, University of Leicester, United Kingdom
- Hospital del Mar Medical Research Institute (IMIM), Barcelona, Spain
- Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
- Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, People’s Republic of China
| |
Collapse
|
17
|
Bertocchi I, Rocha-Almeida F, Romero-Barragán MT, Cambiaghi M, Carretero-Guillén A, Botta P, Dogbevia GK, Treviño M, Mele P, Oberto A, Larkum ME, Gruart A, Sprengel R, Delgado-García JM, Hasan MT. Pre- and postsynaptic N-methyl-D-aspartate receptors are required for sequential printing of fear memory engrams. iScience 2023; 26:108050. [PMID: 37876798 PMCID: PMC10590821 DOI: 10.1016/j.isci.2023.108050] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2023] [Revised: 07/24/2023] [Accepted: 09/22/2023] [Indexed: 10/26/2023] Open
Abstract
The organization of fear memory involves the participation of multiple brain regions. However, it is largely unknown how fear memory is formed, which circuit pathways are used for "printing" memory engrams across brain regions, and the role of identified brain circuits in memory retrieval. With advanced genetic methods, we combinatorially blocked presynaptic output and manipulated N-methyl-D-aspartate receptor (NMDAR) in the basolateral amygdala (BLA) and medial prefrontal cortex (mPFC) before and after cued fear conditioning. Further, we tagged fear-activated neurons during associative learning for optogenetic memory recall. We found that presynaptic mPFC and postsynaptic BLA NMDARs are required for fear memory formation, but not expression. Our results provide strong evidence that NMDAR-dependent synaptic plasticity drives multi-trace systems consolidation for the sequential printing of fear memory engrams from BLA to mPFC and, subsequently, to the other regions, for flexible memory retrieval.
Collapse
Affiliation(s)
- Ilaria Bertocchi
- Department of Molecular Neurobiology, Max Planck Institute for Medical Research, Jahnstrasse 29, 69120 Heidelberg, Germany
- Department of Neuroscience "Rita Levi Montalcini", Neuroscience Institute Cavalieri Ottolenghi (NICO), University of Turin, 10043 Turin, Italy
| | - Florbela Rocha-Almeida
- Division of Neurosciences, University Pablo de Olavide, Ctra. de Utrera, km. 1 41013 Seville, Spain
| | | | - Marco Cambiaghi
- Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Strada le Grazie 8, Verona, Italy
| | - Alejandro Carretero-Guillén
- Laboratory of Brain Circuits Therapeutics, Achucarro Basque Center for Neuroscience, Science Park of the UPV/EHU, Sede Building, Barrio Sarriena, s/n, 48940 Leioa, Spain
| | - Paolo Botta
- CNS drug development, Copenhagen, Capital Region, Denmark
| | - Godwin K. Dogbevia
- Department of Molecular Neurobiology, Max Planck Institute for Medical Research, Jahnstrasse 29, 69120 Heidelberg, Germany
- Health Canada, 70 Colombine Driveway, Ottawa, ON K1A0K9, Canada
| | - Mario Treviño
- Department of Molecular Neurobiology, Max Planck Institute for Medical Research, Jahnstrasse 29, 69120 Heidelberg, Germany
- Laboratorio de Plasticidad Cortical y Aprendizaje Perceptual, Instituto de Neurociencias, Universidad de Guadalajara, Guadalajara, Mexico
| | - Paolo Mele
- Department of Neuroscience "Rita Levi Montalcini", Neuroscience Institute Cavalieri Ottolenghi (NICO), University of Turin, 10043 Turin, Italy
| | - Alessandra Oberto
- Department of Neuroscience "Rita Levi Montalcini", Neuroscience Institute Cavalieri Ottolenghi (NICO), University of Turin, 10043 Turin, Italy
| | - Matthew E. Larkum
- NeuroCure, Charité-Universitatsmedizin, Virchowweg 6, 10117 Berlin, Germany
| | - Agnes Gruart
- Division of Neurosciences, University Pablo de Olavide, Ctra. de Utrera, km. 1 41013 Seville, Spain
| | - Rolf Sprengel
- Department of Molecular Neurobiology, Max Planck Institute for Medical Research, Jahnstrasse 29, 69120 Heidelberg, Germany
| | | | - Mazahir T. Hasan
- Department of Molecular Neurobiology, Max Planck Institute for Medical Research, Jahnstrasse 29, 69120 Heidelberg, Germany
- Laboratory of Brain Circuits Therapeutics, Achucarro Basque Center for Neuroscience, Science Park of the UPV/EHU, Sede Building, Barrio Sarriena, s/n, 48940 Leioa, Spain
- Ikerbasque – Basque Foundation for Science, Bilbao, Spain
| |
Collapse
|
18
|
Andrei AR, Akil AE, Kharas N, Rosenbaum R, Josić K, Dragoi V. Rapid compensatory plasticity revealed by dynamic correlated activity in monkeys in vivo. Nat Neurosci 2023; 26:1960-1969. [PMID: 37828225 DOI: 10.1038/s41593-023-01446-w] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2022] [Accepted: 09/01/2023] [Indexed: 10/14/2023]
Abstract
To produce adaptive behavior, neural networks must balance between plasticity and stability. Computational work has demonstrated that network stability requires plasticity mechanisms to be counterbalanced by rapid compensatory processes. However, such processes have yet to be experimentally observed. Here we demonstrate that repeated optogenetic activation of excitatory neurons in monkey visual cortex (area V1) induces a population-wide dynamic reduction in the strength of neuronal interactions over the timescale of minutes during the awake state, but not during rest. This new form of rapid plasticity was observed only in the correlation structure, with firing rates remaining stable across trials. A computational network model operating in the balanced regime confirmed experimental findings and revealed that inhibitory plasticity is responsible for the decrease in correlated activity in response to repeated light stimulation. These results provide the first experimental evidence for rapid homeostatic plasticity that primarily operates during wakefulness, which stabilizes neuronal interactions during strong network co-activation.
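The trial-by-trial interaction strength analyzed here is typically quantified as a noise correlation between spike counts, which can be computed as follows (a standard definition, not code from the study):

```python
import numpy as np

def noise_correlation(counts_a, counts_b):
    """Pearson correlation of trial-by-trial spike counts for two
    neurons, after removing each neuron's mean response (the 'noise'
    component of the variability)."""
    a = counts_a - counts_a.mean()
    b = counts_b - counts_b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Perfectly covarying counts give r = 1; tracking r across repeated
# stimulation blocks would reveal the reported correlation decrease
# even while mean rates stay stable.
r = noise_correlation(np.array([1.0, 2.0, 3.0, 4.0]),
                      np.array([2.0, 4.0, 6.0, 8.0]))
```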
Collapse
Affiliation(s)
- Ariana R Andrei
- Department of Neurobiology and Anatomy, University of Texas, Houston, TX, USA.
| | - Alan E Akil
- Departments of Mathematics, Biology and Biochemistry, University of Houston, Houston, TX, USA
| | - Natasha Kharas
- Department of Neurobiology and Anatomy, University of Texas, Houston, TX, USA
| | - Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
| | - Krešimir Josić
- Departments of Mathematics, Biology and Biochemistry, University of Houston, Houston, TX, USA
| | - Valentin Dragoi
- Department of Neurobiology and Anatomy, University of Texas, Houston, TX, USA.
- Department of Electrical and Computer Engineering, Rice University, Houston, TX, USA.
| |
Collapse
|
19
|
Eissa TL, Kilpatrick ZP. Learning efficient representations of environmental priors in working memory. PLoS Comput Biol 2023; 19:e1011622. [PMID: 37943956 PMCID: PMC10662764 DOI: 10.1371/journal.pcbi.1011622] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/22/2022] [Revised: 11/21/2023] [Accepted: 10/20/2023] [Indexed: 11/12/2023] Open
Abstract
Experience shapes our expectations and helps us learn the structure of the environment. Inference models render such learning as a gradual refinement of the observer's estimate of the environmental prior. For instance, when retaining an estimate of an object's features in working memory, learned priors may bias the estimate in the direction of common feature values. Humans display such biases when retaining color estimates over short time intervals. We propose that these systematic biases emerge from modulation of synaptic connectivity in a neural circuit based on the experienced stimulus history, shaping the persistent and collective neural activity that encodes the stimulus estimate. The resulting neural activity attractors are aligned to common stimulus values. Using recently published human response data from a delayed-estimation task in which stimuli (colors) were drawn from a heterogeneous distribution that did not necessarily correspond with reported population biases, we confirm that most subjects' response distributions are better described by experience-dependent learning models than by models with fixed biases. This work suggests that systematic limitations in working memory reflect efficient representations of inferred environmental structure, providing new insights into how humans integrate environmental knowledge into their cognitive strategies.
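The two ingredients posited here, gradual refinement of the prior and attractor-aligned bias of the report, can be sketched minimally; the parameter names `kappa` and `lr` are hypothetical, and the paper's neural-circuit implementation is far richer:

```python
def biased_estimate(stimulus, prior_mean, kappa=0.2):
    """Working-memory report drifts toward the learned prior mean,
    mimicking attractors aligned with common stimulus values."""
    return stimulus + kappa * (prior_mean - stimulus)

def update_prior(prior_mean, stimulus, lr=0.05):
    """Gradually refine the environmental prior from each stimulus."""
    return prior_mean + lr * (stimulus - prior_mean)

# After many exposures near 0, a stimulus at 10 is reported with a
# systematic bias toward the learned prior.
prior = 5.0
for s in [0.0] * 100:
    prior = update_prior(prior, s)
report = biased_estimate(10.0, prior)
```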
Collapse
Affiliation(s)
- Tahra L. Eissa
- Department of Applied Mathematics, University of Colorado Boulder, Boulder, Colorado, United States of America
| | - Zachary P. Kilpatrick
- Department of Applied Mathematics, University of Colorado Boulder, Boulder, Colorado, United States of America
- Institute of Cognitive Science, University of Colorado Boulder, Boulder, Colorado, United States of America
| |
Collapse
|
20
|
Halvagal MS, Zenke F. The combination of Hebbian and predictive plasticity learns invariant object representations in deep sensory networks. Nat Neurosci 2023; 26:1906-1915. [PMID: 37828226 PMCID: PMC10620089 DOI: 10.1038/s41593-023-01460-y] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/25/2022] [Accepted: 09/08/2023] [Indexed: 10/14/2023]
Abstract
Recognition of objects from sensory stimuli is essential for survival. To that end, sensory networks in the brain must form object representations invariant to stimulus changes, such as size, orientation and context. Although Hebbian plasticity is known to shape sensory networks, it fails to create invariant object representations in computational models, raising the question of how the brain achieves such processing. In the present study, we show that combining Hebbian plasticity with a predictive form of plasticity leads to invariant representations in deep neural network models. We derive a local learning rule that generalizes to spiking neural networks and naturally accounts for several experimentally observed properties of synaptic plasticity, including metaplasticity and spike-timing-dependent plasticity. Finally, our model accurately captures neuronal selectivity changes observed in the primate inferotemporal cortex in response to altered visual experience. Thus, we provide a plausible normative theory emphasizing the importance of predictive plasticity mechanisms for successful representational learning.
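The combination described above, a predictive term pulling temporally adjacent responses together plus a Hebbian variance term preventing representational collapse, can be sketched as a rate-based gradient step; this is an illustrative sketch under assumed parameters, not the authors' full learning rule:

```python
import numpy as np

def predictive_hebbian_step(w, x_prev, x_curr, z_mean, z_var,
                            eta=1e-3, lam=1.0, tau=100.0):
    """Predictive term descends (z_curr - z_prev)**2; the Hebbian term
    pushes z_curr away from its running mean, scaled by the running
    variance so responses neither collapse nor explode."""
    z_prev, z_curr = float(w @ x_prev), float(w @ x_curr)
    grad_pred = (z_curr - z_prev) * x_curr            # prediction error
    grad_hebb = (z_curr - z_mean) * x_curr / (z_var + 1e-6)
    w = w - eta * (grad_pred - lam * grad_hebb)
    z_mean += (z_curr - z_mean) / tau                 # running statistics
    z_var += ((z_curr - z_mean) ** 2 - z_var) / tau
    return w, z_mean, z_var

rng = np.random.default_rng(3)
w, z_mean, z_var = rng.normal(scale=0.1, size=16), 0.0, 1.0
x = rng.normal(size=16)
for _ in range(1000):
    x_next = x + 0.1 * rng.normal(size=16)            # slowly varying input
    w, z_mean, z_var = predictive_hebbian_step(w, x, x_next, z_mean, z_var)
    x = x_next
```

Slowly varying inputs stand in for the stimulus changes (size, orientation, context) across which the learned representation should be invariant.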
Collapse
Affiliation(s)
- Manu Srinath Halvagal
- Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland
- Faculty of Science, University of Basel, Basel, Switzerland
| | - Friedemann Zenke
- Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland.
- Faculty of Science, University of Basel, Basel, Switzerland.
| |
Collapse
|
21
|
Madar A, Dong C, Sheffield M. BTSP, not STDP, Drives Shifts in Hippocampal Representations During Familiarization. BIORXIV : THE PREPRINT SERVER FOR BIOLOGY 2023:2023.10.17.562791. [PMID: 37904999 PMCID: PMC10614909 DOI: 10.1101/2023.10.17.562791] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/02/2023]
Abstract
Synaptic plasticity is widely thought to support memory storage in the brain, but the rules determining impactful synaptic changes in vivo are not known. We considered the trial-by-trial shifting dynamics of hippocampal place fields (PFs) as an indicator of ongoing plasticity during memory formation. By implementing different plasticity rules in computational models of spiking place cells and comparing to experimentally measured PFs from mice navigating familiar and novel environments, we found that behavioral-timescale synaptic plasticity (BTSP), rather than Hebbian spike-timing-dependent plasticity (STDP), is the principal mechanism governing PF shifting dynamics. BTSP-triggering events are rare, but more frequent during novel experiences. During exploration, their probability is dynamic: it decays after PF onset, but continually drives a population-level representational drift. Finally, our results show that BTSP occurs in CA3 but is less frequent and phenomenologically different than in CA1. Overall, our study provides a new framework for understanding how synaptic plasticity shapes neuronal representations during learning.
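The key phenomenological contrast (a millisecond-scale, order-dependent STDP window versus a seconds-long, largely order-independent BTSP window) can be sketched with two weight-change kernels; amplitudes and time constants are illustrative assumptions, not fitted values from the study:

```python
import numpy as np

def stdp_kernel(dt, a_plus=0.01, a_minus=0.012, tau=0.020):
    """Classical asymmetric STDP: potentiation when the presynaptic
    spike precedes the postsynaptic spike (dt > 0, in seconds),
    depression otherwise, with a ~20 ms window."""
    return np.where(dt > 0, a_plus * np.exp(-dt / tau),
                    -a_minus * np.exp(dt / tau))

def btsp_kernel(dt, a=0.5, tau=2.0):
    """BTSP-like kernel: a rare plateau potential potentiates inputs
    active within seconds of it, with weak dependence on spike order."""
    return a * np.exp(-np.abs(dt) / tau)

# An input active 1 s before the plateau is still strongly potentiated
# under BTSP, while the STDP window has long since closed.
dw_btsp = float(btsp_kernel(np.array(-1.0)))
dw_stdp = float(stdp_kernel(np.array(-1.0)))
```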
Affiliation(s)
- A.D. Madar
- Department of Neurobiology, Neuroscience Institute, University of Chicago
- C. Dong
- Department of Neurobiology, Neuroscience Institute, University of Chicago
- current affiliation: Department of Neurobiology, Stanford University School of Medicine
- M.E.J. Sheffield
- Department of Neurobiology, Neuroscience Institute, University of Chicago
22
Hutt A, Rich S, Valiante TA, Lefebvre J. Intrinsic neural diversity quenches the dynamic volatility of neural networks. Proc Natl Acad Sci U S A 2023; 120:e2218841120. [PMID: 37399421 PMCID: PMC10334753 DOI: 10.1073/pnas.2218841120] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/03/2022] [Accepted: 05/19/2023] [Indexed: 07/05/2023] Open
Abstract
Heterogeneity is the norm in biology. The brain is no different: neuronal cell types are myriad, reflected through their cellular morphology, type, excitability, connectivity motifs, and ion channel distributions. While this biophysical diversity enriches neural systems' dynamical repertoire, it remains challenging to reconcile with the robustness and persistence of brain function over time (resilience). To better understand the relationship between excitability heterogeneity (variability in excitability within a population of neurons) and resilience, we analyzed, both analytically and numerically, a nonlinear sparse neural network with balanced excitatory and inhibitory connections evolving over long time scales. Homogeneous networks demonstrated increases in excitability and strong firing rate correlations (signs of instability) in response to a slowly varying modulatory fluctuation. Excitability heterogeneity tuned network stability in a context-dependent way by restraining responses to modulatory challenges and limiting firing rate correlations, while enriching dynamics during states of low modulatory drive. Excitability heterogeneity was found to implement a homeostatic control mechanism enhancing network resilience to changes in population size, connection probability, and the strength and variability of synaptic weights, by quenching the volatility of the network's dynamics (i.e., their susceptibility to critical transitions). Together, these results highlight the fundamental role played by cell-to-cell heterogeneity in the robustness of brain function in the face of change.
Affiliation(s)
- Axel Hutt
- Université de Strasbourg, CNRS, Inria, ICube, MLMS, MIMESIS, Strasbourg F-67000, France
- Scott Rich
- Krembil Brain Institute, Division of Clinical and Computational Neuroscience, University Health Network, Toronto, ON M5T 0S8, Canada
- Taufik A. Valiante
- Krembil Brain Institute, Division of Clinical and Computational Neuroscience, University Health Network, Toronto, ON M5T 0S8, Canada
- Department of Electrical and Computer Engineering, University of Toronto, Toronto, ON M5S 3G8, Canada
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON M5S 3G9, Canada
- Institute of Medical Sciences, University of Toronto, Toronto, ON M5S 1A8, Canada
- Division of Neurosurgery, Department of Surgery, University of Toronto, Toronto, ON M5G 2C4, Canada
- Center for Advancing Neurotechnological Innovation to Application, University of Toronto, Toronto, ON M5G 2A2, Canada
- Max Planck-University of Toronto Center for Neural Science and Technology, University of Toronto, Toronto, ON M5S 3G8, Canada
- Jérémie Lefebvre
- Krembil Brain Institute, Division of Clinical and Computational Neuroscience, University Health Network, Toronto, ON M5T 0S8, Canada
- Department of Biology, University of Ottawa, Ottawa, ON K1N 6N5, Canada
- Department of Mathematics, University of Toronto, Toronto, ON M5S 2E4, Canada
23
Tikidji-Hamburyan RA, Govindaiah G, Guido W, Colonnese MT. Synaptic and circuit mechanisms prevent detrimentally precise correlation in the developing mammalian visual system. eLife 2023; 12:e84333. [PMID: 37211984 PMCID: PMC10202458 DOI: 10.7554/elife.84333] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/20/2022] [Accepted: 04/25/2023] [Indexed: 05/23/2023] Open
Abstract
Through synaptic plasticity, the developing visual thalamus and cortex extract positional information encoded in the correlated activity of retinal ganglion cells, allowing connectivity to be refined. Here, we use a biophysical model of the visual thalamus during the initial visual circuit refinement period to explore the role of synaptic and circuit properties in the regulation of such neural correlations. We find that NMDA receptor dominance, combined with the weak recurrent excitation and inhibition characteristic of this age, prevents the emergence of spike correlations between thalamocortical neurons on the millisecond timescale. Such precise correlations, which would emerge due to the broad, unrefined connections from the retina to the thalamus, reduce the spatial information carried by thalamic spikes; we therefore term them 'parasitic' correlations. Our results suggest that developing synapses and circuits evolved mechanisms to compensate for such detrimental parasitic correlations arising from the unrefined and immature circuit.
Affiliation(s)
- Gubbi Govindaiah
- Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, United States
- William Guido
- Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, United States
- Matthew T Colonnese
- Department of Pharmacology and Physiology, The George Washington University, Washington, United States
24
Mishra R, Suri M. A survey and perspective on neuromorphic continual learning systems. Front Neurosci 2023; 17:1149410. [PMID: 37214407 PMCID: PMC10194827 DOI: 10.3389/fnins.2023.1149410] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/21/2023] [Accepted: 04/03/2023] [Indexed: 05/24/2023] Open
Abstract
With the advent of low-power neuromorphic computing systems, new possibilities have emerged for deployment in various sectors, such as healthcare and transport, that require intelligent autonomous applications. These applications require reliable low-power solutions for sequentially adapting to new relevant data without loss of learning. Neuromorphic systems are inherently inspired by biological neural networks and have the potential to offer an efficient solution for continual learning. With increasing attention in this area, we present the first comprehensive review of state-of-the-art neuromorphic continual learning (NCL) paradigms. The significance of our study is multi-fold. We summarize recent progress and propose a plausible roadmap for developing end-to-end NCL systems. We also attempt to identify the gap between research and real-world deployment of NCL systems in multiple applications. We do so by assessing recent contributions to neuromorphic continual learning at multiple levels: applications, algorithms, architectures, and hardware. We discuss the relevance of NCL systems and draw out application-specific requisites. We analyze the biological underpinnings that are used for acquiring high-level performance. At the hardware level, we assess the ability of current neuromorphic platforms and emerging nano-device-based architectures to support these algorithms under several constraints. Further, we propose refinements to continual learning metrics for applying them to NCL systems. Finally, the review identifies gaps, and possible solutions not yet explored, for deploying application-specific NCL systems in real-life scenarios.
25
Vegué M, Thibeault V, Desrosiers P, Allard A. Dimension reduction of dynamics on modular and heterogeneous directed networks. PNAS NEXUS 2023; 2:pgad150. [PMID: 37215634 PMCID: PMC10198746 DOI: 10.1093/pnasnexus/pgad150] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 10/07/2022] [Revised: 02/17/2023] [Accepted: 04/12/2023] [Indexed: 05/24/2023]
Abstract
Dimension reduction is a common strategy to study nonlinear dynamical systems composed of a large number of variables. The goal is to find a smaller version of the system whose time evolution is easier to predict while preserving some of the key dynamical features of the original system. Finding such a reduced representation for complex systems is, however, a difficult task. We address this problem for dynamics on weighted directed networks, with special emphasis on modular and heterogeneous networks. We propose a two-step dimension-reduction method that takes into account the properties of the adjacency matrix. First, units are partitioned into groups of similar connectivity profiles. Each group is associated with an observable that is a weighted average of the nodes' activities within the group. Second, we derive a set of equations that must be fulfilled for these observables to properly represent the original system's behavior, together with a method for approximately solving them. The result is a reduced adjacency matrix and an approximate system of ODEs for the observables' evolution. We show that the reduced system can be used to predict some characteristic features of the complete dynamics for different types of connectivity structures, both synthetic and derived from real data, including neuronal, ecological, and social networks. Our formalism opens a way to a systematic comparison of the effect of various structural properties on the overall network dynamics. It can thus help to identify the main structural driving forces guiding the evolution of dynamical processes on networks.
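The two-step reduction described here can be sketched for linear dynamics, assuming uniform within-group averages for simplicity; the paper instead derives the averaging weights from compatibility equations:

```python
import numpy as np

def reduce_network(A, groups):
    """Sketch of a group-based reduction: (1) nodes are partitioned into
    groups of similar connectivity; (2) each group's observable is the
    average activity of its nodes, and the reduced adjacency couples
    these observables. Uniform averages are an illustrative choice."""
    labels = np.unique(groups)
    M = np.zeros((len(labels), A.shape[0]))   # observables: x_red = M @ x
    for i, g in enumerate(labels):
        mask = groups == g
        M[i, mask] = 1.0 / mask.sum()         # uniform within-group average
    # for linear dynamics x' = A x, the observables obey x_red' ~ (M A M^+) x_red
    return M, M @ A @ np.linalg.pinv(M)
```

On a network with exactly homogeneous blocks this reduction is lossless: the reduced matrix collects the total input each group receives from every other group.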
Affiliation(s)
- Marina Vegué
- Département de physique, de génie physique et d'optique, Université Laval, 2325 rue de l'Université, G1V 0A6 Québec, Canada
- Centre interdisciplinaire en modélisation mathématique, Université Laval, 2325 rue de l'Université, G1V 0A6 Québec, Canada
- Vincent Thibeault
- Département de physique, de génie physique et d'optique, Université Laval, 2325 rue de l'Université, G1V 0A6 Québec, Canada
- Centre interdisciplinaire en modélisation mathématique, Université Laval, 2325 rue de l'Université, G1V 0A6 Québec, Canada
- Patrick Desrosiers
- Département de physique, de génie physique et d'optique, Université Laval, 2325 rue de l'Université, G1V 0A6 Québec, Canada
- Centre interdisciplinaire en modélisation mathématique, Université Laval, 2325 rue de l'Université, G1V 0A6 Québec, Canada
- CERVO Brain Research Center, 2301 avenue d'Estimauville, G1E 1T2 Québec, Canada
- Antoine Allard
- Département de physique, de génie physique et d'optique, Université Laval, 2325 rue de l'Université, G1V 0A6 Québec, Canada
- Centre interdisciplinaire en modélisation mathématique, Université Laval, 2325 rue de l'Université, G1V 0A6 Québec, Canada
26
Lea-Carnall CA, Tanner LI, Montemurro MA. Noise-modulated multistable synapses in a Wilson-Cowan-based model of plasticity. Front Comput Neurosci 2023; 17:1017075. [PMID: 36817317 PMCID: PMC9931909 DOI: 10.3389/fncom.2023.1017075] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/12/2022] [Accepted: 01/10/2023] [Indexed: 02/05/2023] Open
Abstract
Frequency-dependent plasticity refers to changes in synaptic strength in response to different stimulation frequencies. Resonance is known to be an important factor in such frequency dependence; however, the role of neural noise in the process remains elusive. Because the brain is an inherently noisy system, understanding noise effects may prove beneficial in shaping therapeutic interventions based on non-invasive brain stimulation protocols. The Wilson-Cowan (WC) model is a well-established model for describing the average dynamics of neural populations and has been shown to exhibit bistability in the presence of noise. However, the important question of how the different stable regimes in the WC model can affect synaptic plasticity when cortical populations interact has not yet been addressed. We therefore investigated plasticity dynamics in a WC-based model of interacting neural populations coupled with activity-dependent synapses, in which periodic stimulation was applied in the presence of noise of controlled intensity. The results indicate that synaptic strength can be optimized for a narrow range of the noise variance. In particular, there is a regime of noise intensity in which synaptic strength exhibits three stable states. Regulating noise intensity affects the probability that the system settles in one of these states, thereby controlling plasticity. These results suggest that noise is a highly influential factor in determining the outcome of plasticity induced by stimulation.
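As a minimal illustration of the ingredients involved, here is a stochastic Wilson-Cowan excitatory-inhibitory pair integrated with the Euler-Maruyama scheme. The couplings, inputs, and noise level are generic illustrative values, not those of the study:

```python
import numpy as np

def wilson_cowan_noisy(steps=5000, dt=0.01, noise_std=0.1, seed=0):
    """Euler-Maruyama integration of one noisy Wilson-Cowan E-I pair.
    Returns the excitatory activity trace. Parameters are illustrative."""
    rng = np.random.default_rng(seed)
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))   # population gain function
    E, I = 0.1, 0.1
    trace = np.empty(steps)
    for t in range(steps):
        dE = -E + sigmoid(12.0 * E - 10.0 * I - 2.0)
        dI = -I + sigmoid(10.0 * E - 4.0 * I - 3.0)
        # additive noise of controlled intensity on the E population
        E += dt * dE + noise_std * np.sqrt(dt) * rng.standard_normal()
        I += dt * dI
        trace[t] = E
    return trace
```

In the paper, pairs of such populations are further coupled by activity-dependent synapses and driven periodically; sweeping `noise_std` is the knob whose narrow optimal range the abstract describes.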
Affiliation(s)
- Caroline A Lea-Carnall
- School of Health Sciences, Manchester Academic Health Science Centre, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, United Kingdom
- Lisabel I Tanner
- School of Health Sciences, Manchester Academic Health Science Centre, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, United Kingdom
- Marcelo A Montemurro
- School of Mathematics and Statistics, Faculty of Science, Technology, Engineering and Mathematics, The Open University, Milton Keynes, United Kingdom
27
Miehl C, Gjorgjieva J. Stability and learning in excitatory synapses by nonlinear inhibitory plasticity. PLoS Comput Biol 2022; 18:e1010682. [PMID: 36459503 PMCID: PMC9718420 DOI: 10.1371/journal.pcbi.1010682] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/05/2022] [Accepted: 10/25/2022] [Indexed: 12/03/2022] Open
Abstract
Synaptic changes are hypothesized to underlie learning and memory formation in the brain. But Hebbian synaptic plasticity of excitatory synapses on its own is unstable, leading to either unlimited growth of synaptic strengths or silencing of neuronal activity without additional homeostatic mechanisms. To control excitatory synaptic strengths, we propose a novel form of synaptic plasticity at inhibitory synapses. Using computational modeling, we suggest two key features of inhibitory plasticity: dominance of inhibition over excitation, and a nonlinear dependence on the firing rate of postsynaptic excitatory neurons whereby inhibitory synaptic strengths change with the same sign (potentiate or depress) as excitatory synaptic strengths. We demonstrate that the stable synaptic strengths realized by this novel inhibitory plasticity model affect excitatory/inhibitory weight ratios in agreement with experimental results. Applying a disinhibitory signal can gate plasticity and lead to the generation of receptive fields and strong bidirectional connectivity in a recurrent network. Hence, a novel form of nonlinear inhibitory plasticity can simultaneously stabilize excitatory synaptic strengths and enable learning upon disinhibition.
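A toy version of an inhibitory rule with the sign structure described here: below a postsynaptic-rate threshold both excitatory and inhibitory weights depress, above it both potentiate. The threshold and the exact nonlinearity are hypothetical, not the paper's fitted form:

```python
def inhibitory_dw(r_pre, r_post, theta=5.0, eta=1e-3):
    """Sketch of a nonlinear inhibitory plasticity rule: the weight
    change depends nonlinearly on the postsynaptic excitatory rate
    r_post, switching sign at the threshold theta so that inhibitory
    weights move with the same sign as Hebbian excitatory weights.
    Parameters are illustrative."""
    return eta * r_pre * r_post * (r_post - theta)
```

Because inhibition potentiates whenever excitation does (and dominates it), runaway excitatory growth is checked without a separate homeostatic process.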
Affiliation(s)
- Christoph Miehl
- Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- School of Life Sciences, Technical University of Munich, Freising, Germany
- Julijana Gjorgjieva
- Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- School of Life Sciences, Technical University of Munich, Freising, Germany
28
Meng JH, Riecke H. Structural spine plasticity: Learning and forgetting of odor-specific subnetworks in the olfactory bulb. PLoS Comput Biol 2022; 18:e1010338. [PMID: 36279303 PMCID: PMC9632792 DOI: 10.1371/journal.pcbi.1010338] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/29/2022] [Revised: 11/03/2022] [Accepted: 09/28/2022] [Indexed: 11/05/2022] Open
Abstract
Learning to discriminate between different sensory stimuli is essential for survival. In rodents, the olfactory bulb, which contributes to odor discrimination via pattern separation, exhibits extensive structural synaptic plasticity involving the formation and removal of synaptic spines, even in adult animals. The network connectivity resulting from this plasticity is still poorly understood. To gain insight into this connectivity, we present here a computational model for the structural plasticity of the reciprocal synapses between the dominant population of excitatory principal neurons and inhibitory interneurons. It incorporates the observed modulation of spine stability by odor exposure. The model captures the striking experimental observation that the exposure to odors does not always enhance their discriminability: while training with similar odors enhanced their discriminability, training with dissimilar odors actually reduced the discriminability of the training stimuli. Strikingly, this differential learning does not require the activity-dependence of the spine stability and occurs also in a model with purely random spine dynamics in which the spine density is changed homogeneously, e.g., due to a global signal. However, the experimentally observed odor-specific reduction in the response of principal cells as a result of extended odor exposure and the concurrent disinhibition of a subset of principal cells arise only in the activity-dependent model. Moreover, this model predicts the experimentally testable recovery of odor response through weak but not through strong odor re-exposure and the forgetting of odors via exposure to interfering odors. Combined with the experimental observations, the computational model provides strong support for the prediction that odor exposure leads to the formation of odor-specific subnetworks in the olfactory bulb.

A key feature of the brain is its ability to learn through the plasticity of its network. The olfactory bulb in the olfactory system is a remarkable brain area whose anatomical structure still evolves substantially in adult animals by establishing new synaptic connections and removing existing ones. We present a computational model for this process and employ it to interpret recent experimental results. By comparing the results of our model with those of a random control model, we identify various experimental observations that lend strong support to the notion that the network of the olfactory bulb comprises learned, odor-specific subnetworks. Moreover, our model explains the recent observation that the learning of odors does not always improve their discriminability, and provides testable predictions for the recovery of odor response after repeated odor exposure and for when the learning of new odors interferes with retaining the memory of familiar odors.
Affiliation(s)
- John Hongyu Meng
- Engineering Sciences and Applied Mathematics, Northwestern University, Evanston, Illinois, United States of America
- Hermann Riecke
- Engineering Sciences and Applied Mathematics, Northwestern University, Evanston, Illinois, United States of America
29
Bialas M, Mandziuk J. Spike-Timing-Dependent Plasticity With Activation-Dependent Scaling for Receptive Fields Development. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2022; 33:5215-5228. [PMID: 33844634 DOI: 10.1109/tnnls.2021.3069683] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
Spike-timing-dependent plasticity (STDP) is one of the most popular and deeply biologically motivated forms of unsupervised Hebbian-type learning. In this article, we propose a variant of STDP extended by an additional activation-dependent scale factor. The consequent learning rule is an efficient algorithm, which is simple to implement and applicable to spiking neural networks (SNNs). It is demonstrated that the proposed plasticity mechanism combined with competitive learning can serve as an effective mechanism for the unsupervised development of receptive fields (RFs). Furthermore, the relationship between synaptic scaling and lateral inhibition is explored in the context of the successful development of RFs. Specifically, we demonstrate that maintaining a high level of synaptic scaling followed by its rapid increase is crucial for the development of neuronal mechanisms of selectivity. The strength of the proposed solution is assessed in classification tasks performed on the Modified National Institute of Standards and Technology (MNIST) data set with an accuracy level of 94.65% (a single network) and 95.17% (a network committee), comparable to the state-of-the-art results of single-layer SNN architectures trained in an unsupervised manner. Furthermore, the training process leads to sparse data representation and the developed RFs have the potential to serve as local feature detectors in multilayered spiking networks. We also prove theoretically that when applied to linear Poisson neurons, our rule conserves total synaptic strength, guaranteeing the convergence of the learning process.
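The rule class described here, pair-based STDP modulated by an extra multiplicative factor, can be sketched as follows. The precise form of the activation-dependent scaling in the article may differ; `scale` here is simply a placeholder for it:

```python
import numpy as np

def stdp_dw(delta_t, scale=1.0, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Classic pair-based STDP kernel times an activation-dependent
    scale factor. delta_t = t_post - t_pre in milliseconds; amplitudes
    and the time constant are illustrative values."""
    if delta_t >= 0:                                   # pre before post
        return scale * a_plus * np.exp(-delta_t / tau)  # potentiation
    return -scale * a_minus * np.exp(delta_t / tau)     # depression
```

The scale factor leaves the temporal shape of the learning window intact and only modulates its magnitude, which is what allows it to interact with synaptic scaling and lateral inhibition as described above.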
30
Stock CH, Harvey SE, Ocko SA, Ganguli S. Synaptic balancing: A biologically plausible local learning rule that provably increases neural network noise robustness without sacrificing task performance. PLoS Comput Biol 2022; 18:e1010418. [PMID: 36121844 PMCID: PMC9522011 DOI: 10.1371/journal.pcbi.1010418] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/08/2020] [Revised: 09/29/2022] [Accepted: 07/20/2022] [Indexed: 11/26/2022] Open
Abstract
We introduce a novel, biologically plausible local learning rule that provably increases the robustness of neural dynamics to noise in nonlinear recurrent neural networks with homogeneous nonlinearities. Our learning rule achieves higher noise robustness without sacrificing performance on the task and without requiring any knowledge of the particular task. The plasticity dynamics (an integrable dynamical system operating on the weights of the network) maintains a multiplicity of conserved quantities, most notably the network's entire temporal map of input to output trajectories. The outcome of our learning rule is a synaptic balancing between the incoming and outgoing synapses of every neuron. This synaptic balancing rule is consistent with many known aspects of experimentally observed heterosynaptic plasticity, and moreover makes new experimentally testable predictions relating plasticity at the incoming and outgoing synapses of individual neurons. Overall, this work provides a novel, practical local learning rule that exactly preserves overall network function and, in doing so, provides new conceptual bridges between the disparate worlds of the neurobiology of heterosynaptic plasticity, the engineering of regularized noise-robust networks, and the mathematics of integrable Lax dynamical systems.
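The balanced outcome can be illustrated for a ReLU network: rescaling a hidden unit's incoming weights by s and its outgoing weights by 1/s leaves the input-output map unchanged (positive homogeneity), and choosing s to equalize the incoming and outgoing norms gives a balanced state. The one-shot rescaling below is an illustrative stand-in for the paper's continuous local plasticity flow:

```python
import numpy as np

def balance_hidden_unit(W_in, W_out, j):
    """Rebalance ReLU hidden unit j: scale its incoming weight row by s
    and its outgoing weight column by 1/s, with s chosen so the two
    norms become equal. Function-preserving for s > 0 because
    relu(s * z) = s * relu(z)."""
    s = np.sqrt(np.linalg.norm(W_out[:, j]) / np.linalg.norm(W_in[j]))
    W_in, W_out = W_in.copy(), W_out.copy()
    W_in[j] *= s
    W_out[:, j] /= s
    return W_in, W_out
```

After the rescaling, both norms equal the geometric mean of the originals, while every input is mapped to exactly the same output as before.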
Affiliation(s)
- Christopher H. Stock
- Neuroscience Graduate Program, Stanford University School of Medicine, Stanford, California, United States of America
- Sarah E. Harvey
- Department of Applied Physics, Stanford University, Stanford, California, United States of America
- Samuel A. Ocko
- Department of Applied Physics, Stanford University, Stanford, California, United States of America
- Surya Ganguli
- Department of Applied Physics, Stanford University, Stanford, California, United States of America
- Stanford Institute for Human-Centered Artificial Intelligence, Stanford University, Stanford, California, United States of America
31
Hummos A, Wang BA, Drammis S, Halassa MM, Pleger B. Thalamic regulation of frontal interactions in human cognitive flexibility. PLoS Comput Biol 2022; 18:e1010500. [PMID: 36094955 PMCID: PMC9499289 DOI: 10.1371/journal.pcbi.1010500] [Citation(s) in RCA: 18] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2022] [Revised: 09/22/2022] [Accepted: 08/19/2022] [Indexed: 11/19/2022] Open
Abstract
Interactions across frontal cortex are critical for cognition. Animal studies suggest a role for mediodorsal thalamus (MD) in these interactions, but the computations performed and direct relevance to human decision making are unclear. Here, inspired by animal work, we extended a neural model of an executive frontal-MD network and trained it on a human decision-making task for which neuroimaging data were collected. Using a biologically plausible learning rule, we found that the model MD thalamus compressed its cortical inputs (dorsolateral prefrontal cortex, dlPFC) underlying stimulus-response representations. Through direct feedback to dlPFC, this thalamic operation efficiently partitioned cortical activity patterns and enhanced task switching across different contingencies. To account for interactions with other frontal regions, we expanded the model to compute higher-order strategy signals outside dlPFC, and found that the MD offered a more efficient route for such signals to switch dlPFC activity patterns. Human fMRI data provided evidence that the MD engaged in feedback to dlPFC, and had a role in routing orbitofrontal cortex inputs when subjects switched behavioral strategy. Collectively, our findings contribute to the emerging evidence for thalamic regulation of frontal interactions in the human brain.
Affiliation(s)
- Ali Hummos
- McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America
- Bin A. Wang
- Department of Neurology, BG University Hospital Bergmannsheil, Ruhr-University Bochum, Bochum, Germany
- Collaborative Research Centre 874 "Integration and Representation of Sensory Processes", Ruhr University Bochum, Bochum, Germany
- Sabrina Drammis
- McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America
- Computer Science & Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America
- Michael M. Halassa
- McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America
- Burkhard Pleger
- Department of Neurology, BG University Hospital Bergmannsheil, Ruhr-University Bochum, Bochum, Germany
- Collaborative Research Centre 874 "Integration and Representation of Sensory Processes", Ruhr University Bochum, Bochum, Germany
32
Organization and Priming of Long-term Memory Representations with Two-phase Plasticity. Cognit Comput 2022. [DOI: 10.1007/s12559-022-10021-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
Abstract
Background / Introduction
In recurrent neural networks in the brain, memories are represented by so-called Hebbian cell assemblies. Such assemblies are groups of neurons with particularly strong synaptic connections, formed by synaptic plasticity and consolidated by synaptic tagging and capture (STC). To link these synaptic mechanisms to long-term memory at the level of cognition and behavior, their functional implications at the level of neural networks have to be understood.
Methods
We employ a biologically detailed recurrent network of spiking neurons featuring synaptic plasticity and STC to model the learning and consolidation of long-term memory representations. Using this, we investigate the effects of different organizational paradigms, and of priming stimulation, on the functionality of multiple memory representations. We quantify these effects by the spontaneous activation of memory representations driven by background noise.
Results
We find that the learning order of the memory representations significantly biases the likelihood of activation towards more recently learned representations, and that hub-like overlap structure counters this effect. We identify long-term depression as the mechanism underlying these findings. Finally, we demonstrate that STC has functional consequences for the interaction of long-term memory representations: (1) intermediate consolidation between learning the individual representations strongly alters the previously described effects, and (2) STC enables the priming of a long-term memory representation on a timescale of minutes to hours.
Conclusion
Our findings show how synaptic and neuronal mechanisms can provide an explanatory basis for known cognitive effects.
33
Kreutzer E, Senn W, Petrovici MA. Natural-gradient learning for spiking neurons. eLife 2022; 11:e66526. [PMID: 35467527 PMCID: PMC9038192 DOI: 10.7554/elife.66526] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/13/2021] [Accepted: 02/21/2022] [Indexed: 11/16/2022] Open
Abstract
In many normative theories of synaptic plasticity, weight updates implicitly depend on the chosen parametrization of the weights. This problem relates, for example, to neuronal morphology: synapses which are functionally equivalent in terms of their impact on somatic firing can differ substantially in spine size due to their different positions along the dendritic tree. Classical theories based on Euclidean-gradient descent can easily lead to inconsistencies due to such parametrization dependence. The issues are solved in the framework of Riemannian geometry, in which we propose that plasticity instead follows natural-gradient descent. Under this hypothesis, we derive a synaptic learning rule for spiking neurons that couples functional efficiency with the explanation of several well-documented biological phenomena such as dendritic democracy, multiplicative scaling, and heterosynaptic plasticity. We therefore suggest that in its search for functional synaptic plasticity, evolution might have come up with its own version of natural-gradient descent.
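A one-parameter toy example shows why natural-gradient updates sidestep the parametrization dependence described above: preconditioning the Euclidean gradient by the inverse Fisher information makes the induced change in the neuron's output independent of how the weight is parametrized. The function name and numbers below are illustrative:

```python
def natural_gradient_step(theta, df_dtheta, dC_df, eta=0.1, sigma2=1.0):
    """One natural-gradient step for a model whose output mean is
    f(theta) with Gaussian noise of variance sigma2. The one-parameter
    Fisher information is df_dtheta**2 / sigma2, and the Euclidean
    gradient dC/dtheta is preconditioned by its inverse."""
    dC_dtheta = dC_df * df_dtheta        # chain rule: Euclidean gradient
    fisher = df_dtheta ** 2 / sigma2     # Fisher metric in this parametrization
    return theta - eta * dC_dtheta / fisher
```

The induced change in the output mean is f'(theta) * delta_theta = -eta * sigma2 * dC/df, with no dependence on f'(theta): two parametrizations of the same synapse (e.g. spines at different dendritic positions) produce identical functional updates, whereas a plain Euclidean step would scale with f'(theta)**2.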
Affiliation(s)
- Elena Kreutzer
- Department of Physiology, University of Bern, Bern, Switzerland
- Walter Senn
- Department of Physiology, University of Bern, Bern, Switzerland
- Mihai A Petrovici
- Department of Physiology, University of Bern, Bern, Switzerland
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
| |
34
Rule ME, O'Leary T. Self-healing codes: How stable neural populations can track continually reconfiguring neural representations. Proc Natl Acad Sci U S A 2022; 119:e2106692119. [PMID: 35145024 PMCID: PMC8851551 DOI: 10.1073/pnas.2106692119] [Citation(s) in RCA: 19] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/10/2021] [Accepted: 12/29/2021] [Indexed: 12/19/2022] Open
Abstract
As an adaptive system, the brain must retain a faithful representation of the world while continuously integrating new information. Recent experiments have measured population activity in cortical and hippocampal circuits over many days and found that patterns of neural activity associated with fixed behavioral variables and percepts change dramatically over time. Such "representational drift" raises the question of how malleable population codes can interact coherently with stable long-term representations that are found in other circuits and with relatively rigid topographic mappings of peripheral sensory and motor signals. We explore how known plasticity mechanisms can allow single neurons to reliably read out an evolving population code without external error feedback. We find that interactions between Hebbian learning and single-cell homeostasis can exploit redundancy in a distributed population code to compensate for gradual changes in tuning. Recurrent feedback of partially stabilized readouts could allow a pool of readout cells to further correct inconsistencies introduced by representational drift. This shows how relatively simple, known mechanisms can stabilize neural tuning in the short term and provides a plausible explanation for how plastic neural codes remain integrated with consolidated, long-term representations.
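The compensation mechanism can be caricatured with Oja's rule (Hebbian learning with a multiplicative homeostatic term), used here as a stand-in for the article's readout model rather than its actual equations: a plastic readout keeps re-aligning to a slowly drifting population code without any external error feedback. Drift rate, noise level, and learning rate below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, eta = 50, 0.02

d = rng.normal(size=N); d /= np.linalg.norm(d)   # drifting tuning direction
w = rng.normal(size=N); w /= np.linalg.norm(w)   # plastic readout weights
w0 = w.copy()                                    # frozen readout, for comparison

for _ in range(5000):
    # representational drift: the code's direction performs a slow random walk
    d += 0.001 * rng.normal(size=N)
    d /= np.linalg.norm(d)
    x = rng.normal()                        # latent behavioral variable
    r = x * d + 0.1 * rng.normal(size=N)    # redundant population activity
    y = w @ r                               # readout
    # Hebbian term (y * r) with Oja-style multiplicative normalization (- y^2 w)
    w += eta * y * (r - y * w)
```

After training, the plastic readout remains aligned with the drifted code while a frozen readout does not, mirroring the self-healing behavior described above.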
Affiliation(s)
- Michael E Rule
- Engineering Department, University of Cambridge, Cambridge CB2 1PZ, United Kingdom
- Timothy O'Leary
- Engineering Department, University of Cambridge, Cambridge CB2 1PZ, United Kingdom
35
Aghili Yajadda MM, Robinson PA, Henderson JA. Generalized neural field theory of cortical plasticity illustrated by an application to the linear phase of ocular dominance column formation in primary visual cortex. BIOLOGICAL CYBERNETICS 2022; 116:33-52. [PMID: 34773503 DOI: 10.1007/s00422-021-00901-w] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/09/2021] [Accepted: 09/30/2021] [Indexed: 06/13/2023]
Abstract
Physiologically based neural field theory (NFT) is extended to encompass cortical plasticity dynamics. An illustrative application is provided which treats the evolution of the connectivity of left- and right-eye visual stimuli to neuronal populations in the primary visual cortex (V1), and the initial, linear phase of formation of approximately one-dimensional (1D) ocular dominance columns (ODCs) that sets their transverse spatial scale. This links V1 activity, structure, and physiology within a single theory that already accounts for a range of other brain activity and connectivity phenomena, thereby enabling ODC formation and many other phenomena to be interrelated and cortical parameters to be constrained across multiple domains. The results accord with experimental ODC widths for realistic cortical parameters and are based directly on a unified description of the neuronal populations involved, their connection strengths, and the neuronal activity they support. Other key results include simple analytic approximations for ODC widths and the parameters of maximum growth rate, constraints on cortical excitatory and inhibitory gains, elucidation of the roles of specific poles of the V1 response function, and the fact that ODCs are not formed when input stimuli are fully correlated between eyes. This work provides a basis for further generalization of NFT to model other plasticity phenomena, thereby linking them to the range of multiscale phenomena accounted for by NFT.
Affiliation(s)
- M M Aghili Yajadda
- School of Physics, University of Sydney, Sydney, NSW, 2006, Australia
- Center for Integrative Brain Function, University of Sydney, Sydney, NSW, 2006, Australia
- P A Robinson
- School of Physics, University of Sydney, Sydney, NSW, 2006, Australia
- Center for Integrative Brain Function, University of Sydney, Sydney, NSW, 2006, Australia
- J A Henderson
- School of Physics, University of Sydney, Sydney, NSW, 2006, Australia
- Center for Integrative Brain Function, University of Sydney, Sydney, NSW, 2006, Australia
36
Gallinaro JV, Gašparović N, Rotter S. Homeostatic control of synaptic rewiring in recurrent networks induces the formation of stable memory engrams. PLoS Comput Biol 2022; 18:e1009836. [PMID: 35143489 PMCID: PMC8865699 DOI: 10.1371/journal.pcbi.1009836] [Citation(s) in RCA: 13] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/16/2021] [Revised: 02/23/2022] [Accepted: 01/14/2022] [Indexed: 12/04/2022] Open
Abstract
Brain networks store new memories using functional and structural synaptic plasticity. Memory formation is generally attributed to Hebbian plasticity, while homeostatic plasticity is thought to have an ancillary role in stabilizing network dynamics. Here we report that homeostatic plasticity alone can also lead to the formation of stable memories. We analyze this phenomenon using a new theory of network remodeling, combined with numerical simulations of recurrent spiking neural networks that exhibit structural plasticity based on firing rate homeostasis. These networks are able to store repeatedly presented patterns and recall them upon the presentation of incomplete cues. Storage is fast, governed by the homeostatic drift. In contrast, forgetting is slow, driven by a diffusion process. Joint stimulation of neurons induces the growth of associative connections between them, leading to the formation of memory engrams. These memories are stored in a distributed fashion throughout the connectivity matrix, and individual synaptic connections have only a small influence. Although memory-specific connections are increased in number, the total number of inputs and outputs of a neuron undergoes only small changes during stimulation. We find that homeostatic structural plasticity induces a specific type of "silent memories", different from conventional attractor states.
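The engram-formation logic can be sketched in a drastically simplified rate-based toy model (a caricature of the mechanism, not the authors' spiking model; all constants below are invented): neurons below their firing-rate set-point grow free axonal and dendritic elements, free elements pair at random to form synapses, and overactive neurons prune. Jointly stimulating a group and then releasing it leaves that group preferentially interconnected.

```python
import numpy as np

rng = np.random.default_rng(1)
N, TARGET = 40, 5.0
ens = np.arange(10)                # jointly stimulated ensemble
W = np.zeros((N, N))               # W[j, i]: synapse count from neuron i to j
ax = np.zeros(N)                   # free axonal elements
dn = np.zeros(N)                   # free dendritic elements

def step(ext):
    global W, ax, dn
    r = ext + 0.5 * W.sum(axis=1)          # toy linear rate model
    drive = 0.2 * (TARGET - r)
    ax = np.maximum(ax + drive, 0.0)       # below target: grow free elements
    dn = np.maximum(dn + drive, 0.0)
    hot = r > TARGET + 0.5                 # above target: prune synapses
    W[hot, :] *= 0.8
    W[:, hot] *= 0.8
    n_new = int(min(ax.sum(), dn.sum()))   # free elements pair up at random
    if n_new > 0:
        pa, pd = ax / ax.sum(), dn / dn.sum()
        pre = rng.choice(N, n_new, p=pa)
        post = rng.choice(N, n_new, p=pd)
        np.add.at(W, (post, pre), 1.0)
        ax = np.maximum(ax - n_new * pa, 0.0)
        dn = np.maximum(dn - n_new * pd, 0.0)

ext = np.full(N, TARGET - 2.0)     # mild deprivation drives initial random wiring
for _ in range(300):
    step(ext)
stim = ext.copy()
stim[ens] += 4.0                   # joint stimulation: ensemble prunes its synapses
for _ in range(150):
    step(stim)
for _ in range(300):               # stimulus off: deprived ensemble regrows together
    step(ext)
```

Because the released ensemble holds most of the free elements, random pairing wires it preferentially to itself, yielding an engram-like block of elevated within-group connectivity.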
Affiliation(s)
- Júlia V. Gallinaro
- Bernstein Center Freiburg & Faculty of Biology, University of Freiburg, Freiburg im Breisgau, Germany
- Bioengineering Department, Imperial College London, London, United Kingdom
- Nebojša Gašparović
- Bernstein Center Freiburg & Faculty of Biology, University of Freiburg, Freiburg im Breisgau, Germany
- Stefan Rotter
- Bernstein Center Freiburg & Faculty of Biology, University of Freiburg, Freiburg im Breisgau, Germany
37
Strettoi E, Di Marco B, Orsini N, Napoli D. Retinal Plasticity. Int J Mol Sci 2022; 23:ijms23031138. [PMID: 35163059 PMCID: PMC8835074 DOI: 10.3390/ijms23031138] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/23/2021] [Revised: 01/17/2022] [Accepted: 01/18/2022] [Indexed: 12/28/2022] Open
Abstract
Brain plasticity is a well-established concept designating the ability of central nervous system (CNS) neurons to rearrange as a result of learning, when adapting to changeable environmental conditions or else while reacting to injurious factors. As a part of the CNS, the retina has been repeatedly probed for its possible ability to respond plastically to a variably altered environment or to pathological insults. However, numerous studies support the conclusion that the retina, outside the developmental stage, is endowed with only limited plasticity, exhibiting, instead, a remarkable ability to maintain a stable architectural and functional organization. Reviewed here are representative examples of hippocampal and cortical paradigms of plasticity, and of structural rearrangements in retinal organization and circuitry following altered developmental conditions or the occurrence of genetic diseases leading to neuronal degeneration. The variable rate of plastic changes found in mammalian retinal neurons under different circumstances is discussed, focusing on structural plasticity. The likely adaptive value of maintaining a low level of plasticity in an organ subserving a sensory modality that is dominant for the human species and that requires elevated fidelity is then considered.
Affiliation(s)
- Enrica Strettoi
- CNR Neuroscience Institute, 56124 Pisa, Italy; (B.D.M.); (N.O.); (D.N.)
- Correspondence: ; Tel.: +39-0503153213
- Beatrice Di Marco
- CNR Neuroscience Institute, 56124 Pisa, Italy; (B.D.M.); (N.O.); (D.N.)
- Regional Doctorate School in Neuroscience, Universities of Florence, Pisa and Siena, 50134 Florence, Italy
- Noemi Orsini
- CNR Neuroscience Institute, 56124 Pisa, Italy; (B.D.M.); (N.O.); (D.N.)
- Regional Doctorate School in Neuroscience, Universities of Florence, Pisa and Siena, 50134 Florence, Italy
- Debora Napoli
- CNR Neuroscience Institute, 56124 Pisa, Italy; (B.D.M.); (N.O.); (D.N.)
- Regional Doctorate School in Neuroscience, Universities of Florence, Pisa and Siena, 50134 Florence, Italy
38
Jordan JT, Tong Y, Pytte CL. Transection of the ventral hippocampal commissure impairs spatial reference but not contextual or spatial working memory. Learn Mem 2022; 29:29-37. [PMID: 34911801 PMCID: PMC8686591 DOI: 10.1101/lm.053483.121] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/11/2021] [Accepted: 11/09/2021] [Indexed: 01/03/2023]
Abstract
Plasticity is a neural phenomenon in which experience induces long-lasting changes to neuronal circuits and is at the center of most neurobiological theories of learning and memory. However, too much plasticity is maladaptive and must be balanced with substrate stability. Area CA3 of the hippocampus provides such a balance via hemispheric lateralization, with the left hemisphere dominant in providing plasticity and the right specialized for stability. Left and right CA3 project bilaterally to CA1; however, it is not known whether this downstream merging of lateralized plasticity and stability is functional. We hypothesized that interhemispheric convergence of input from these pathways is essential for integrating spatial memory stored in the left CA3 with navigational working memory facilitated by the right CA3. To test this, we severed interhemispheric connections between the left and right hippocampi in mice and assessed learning and memory. Despite damage to this major hippocampal fiber tract, hippocampus-dependent navigational working memory and short- and long-term memory were both spared. However, tasks that required the integration of information retrieved from memory with ongoing navigational working memory and navigation were impaired. We propose that one function of interhemispheric communication in the mouse hippocampus is to integrate lateralized processing of plastic and stable circuits to facilitate memory-guided spatial navigation.
Affiliation(s)
- Jake T. Jordan
- Department of Biology, The Graduate Center, City University of New York (CUNY), New York, New York 11016, USA
- CUNY Neuroscience Collaborative, The Graduate Center, City University of New York, New York, New York 11016, USA
- Yi Tong
- Department of Psychology, Queens College, City University of New York, Flushing, New York 11367, USA
- Carolyn L. Pytte
- Department of Biology, The Graduate Center, City University of New York (CUNY), New York, New York 11016, USA
- CUNY Neuroscience Collaborative, The Graduate Center, City University of New York, New York, New York 11016, USA
- Department of Psychology, Queens College, City University of New York, Flushing, New York 11367, USA
- Department of Psychology, The Graduate Center, City University of New York, New York, New York 11016, USA
39
Larisch R, Gönner L, Teichmann M, Hamker FH. Sensory coding and contrast invariance emerge from the control of plastic inhibition over emergent selectivity. PLoS Comput Biol 2021; 17:e1009566. [PMID: 34843455 PMCID: PMC8629393 DOI: 10.1371/journal.pcbi.1009566] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/20/2021] [Accepted: 10/15/2021] [Indexed: 11/18/2022] Open
Abstract
Visual stimuli are represented by a highly efficient code in the primary visual cortex, but the development of this code is still unclear. Two distinct factors control coding efficiency: representational efficiency, which is determined by neuronal tuning diversity, and metabolic efficiency, which is influenced by neuronal gain. How these determinants of coding efficiency are shaped during development, supported by excitatory and inhibitory plasticity, is only partially understood. We investigate a fully plastic spiking network of the primary visual cortex, building on phenomenological plasticity rules. Our results suggest that inhibitory plasticity is key to the emergence of tuning diversity and accurate input encoding. We show that inhibitory feedback (random and specific) increases the metabolic efficiency by implementing a gain control mechanism. Interestingly, this leads to the spontaneous emergence of contrast-invariant tuning curves. Our findings highlight that (1) interneuron plasticity is key to the development of tuning diversity and (2) that efficient sensory representations are an emergent property of the resulting network. Synaptic plasticity is crucial for the development of efficient input representation in the different sensory cortices, such as the primary visual cortex. Efficient visual representation is determined by two factors: representational efficiency, i.e. how many different input features can be represented, and metabolic efficiency, i.e. how many spikes are required to represent a specific feature. Previous research has pointed out the importance of plasticity at excitatory synapses to achieve high representational efficiency and feedback inhibition as a gain control mechanism for controlling metabolic efficiency. However, it is only partially understood how the influence of inhibitory plasticity on excitatory plasticity can lead to an efficient representation.
Using a spiking neural network, we show that plasticity at feed-forward and feedback inhibitory synapses is necessary for the emergence of well-distributed neuronal selectivity to improve representational efficiency. Further, the emergent balance between excitatory and inhibitory currents improves the metabolic efficiency, and leads to contrast-invariant tuning as an inherent network property. Extending previous work, our simulation results highlight the importance of plasticity at inhibitory synapses.
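The contrast-invariance result has a textbook counterpart in divisive gain control, sketched below as a stand-alone illustration (this is standard divisive normalization, not the paper's plastic spiking network): dividing the feed-forward drive by a contrast-dependent factor rescales a tuning curve without changing its shape.

```python
import numpy as np

theta = np.linspace(-np.pi / 2, np.pi / 2, 9)
tuning = np.exp(-theta**2 / 0.4)          # feed-forward orientation tuning

def response(contrast, sigma=0.2):
    drive = contrast * tuning
    return drive / (sigma + contrast)     # divisive (inhibitory) gain control

low, high = response(0.1), response(1.0)  # responses at low and high contrast
```

Normalizing each curve by its peak shows that the two contrasts produce the same tuning shape, only scaled in amplitude.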
Affiliation(s)
- René Larisch
- Department of Computer Science, Artificial Intelligence, TU Chemnitz, Chemnitz, Germany
- * E-mail: (RL); (FHH)
- Lorenz Gönner
- Department of Computer Science, Artificial Intelligence, TU Chemnitz, Chemnitz, Germany
- Faculty of Psychology, Lifespan Developmental Neuroscience, TU Dresden, Dresden, Germany
- Michael Teichmann
- Department of Computer Science, Artificial Intelligence, TU Chemnitz, Chemnitz, Germany
- Fred H. Hamker
- Department of Computer Science, Artificial Intelligence, TU Chemnitz, Chemnitz, Germany
- Bernstein Center Computational Neuroscience, Berlin, Germany
- * E-mail: (RL); (FHH)
40
Shen Y, Wang J, Navlakha S. A Correspondence Between Normalization Strategies in Artificial and Biological Neural Networks. Neural Comput 2021; 33:3179-3203. [PMID: 34474484 PMCID: PMC8662716 DOI: 10.1162/neco_a_01439] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/15/2021] [Accepted: 06/14/2021] [Indexed: 12/24/2022]
Abstract
A fundamental challenge at the interface of machine learning and neuroscience is to uncover computational principles that are shared between artificial and biological neural networks. In deep learning, normalization methods such as batch normalization, weight normalization, and their many variants help to stabilize hidden unit activity and accelerate network training, and these methods have been called one of the most important recent innovations for optimizing deep networks. In the brain, homeostatic plasticity represents a set of mechanisms that also stabilize and normalize network activity to lie within certain ranges, and these mechanisms are critical for maintaining normal brain function. In this article, we discuss parallels between artificial and biological normalization methods at four spatial scales: normalization of a single neuron's activity, normalization of synaptic weights of a neuron, normalization of a layer of neurons, and normalization of a network of neurons. We argue that both types of methods are functionally equivalent (that is, both push activation patterns of hidden units toward a homeostatic state where all neurons are equally used) and that such representations can improve coding capacity, discrimination, and regularization. As a proof of concept, we develop an algorithm, inspired by a neural normalization technique called synaptic scaling, and show that this algorithm performs competitively against existing normalization methods on several data sets. Overall, we hope this bidirectional connection will inspire neuroscientists and machine learners in three ways: to uncover new normalization algorithms based on established neurobiological principles; to help quantify the trade-offs of different homeostatic plasticity mechanisms used in the brain; and to offer insights about how stability may not hinder, but may actually promote, plasticity.
Affiliation(s)
- Yang Shen
- Cold Spring Harbor Laboratory, Simons Center for Quantitative Biology, Cold Spring Harbor, NY 11724, U.S.A.
- Julia Wang
- Cold Spring Harbor Laboratory, Simons Center for Quantitative Biology, Cold Spring Harbor, NY 11724, U.S.A.
- Saket Navlakha
- Cold Spring Harbor Laboratory, Simons Center for Quantitative Biology, Cold Spring Harbor, NY 11724, U.S.A.
41
Voitiuk K, Geng J, Keefe MG, Parks DF, Sanso SE, Hawthorne N, Freeman DB, Currie R, Mostajo-Radji MA, Pollen AA, Nowakowski TJ, Salama SR, Teodorescu M, Haussler D. Light-weight electrophysiology hardware and software platform for cloud-based neural recording experiments. J Neural Eng 2021; 18:10.1088/1741-2552/ac310a. [PMID: 34666315 PMCID: PMC8667733 DOI: 10.1088/1741-2552/ac310a] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/18/2021] [Accepted: 10/19/2021] [Indexed: 11/12/2022]
Abstract
Objective. Neural activity represents a functional readout of neurons that is increasingly important to monitor in a wide range of experiments. Extracellular recordings have emerged as a powerful technique for measuring neural activity because these methods do not lead to the destruction or degradation of the cells being measured. Current approaches to electrophysiology have a low throughput of experiments due to manual supervision and expensive equipment. This bottleneck limits broader inferences that can be achieved with numerous long-term recorded samples. Approach. We developed Piphys, an inexpensive open source neurophysiological recording platform that consists of both hardware and software. It is easily accessed and controlled via a standard web interface through Internet of Things (IoT) protocols. Main results. We used a Raspberry Pi as the primary processing device along with an Intan bioamplifier. We designed a hardware expansion circuit board and software to enable voltage sampling and user interaction. This standalone system was validated with primary human neurons, showing reliability in collecting neural activity in near real-time. Significance. The hardware modules and cloud software allow for remote control of neural recording experiments as well as horizontal scalability, enabling long-term observations of development, organization, and neural activity at scale.
Affiliation(s)
- Kateryna Voitiuk
- Department of Biomolecular Engineering, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
- Jinghui Geng
- Department of Electrical and Computer Engineering, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
- Matthew G Keefe
- Department of Anatomy, University of California San Francisco, San Francisco, CA 94143, United States of America
- David F Parks
- Department of Biomolecular Engineering, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
- Sebastian E Sanso
- UC Santa Cruz Genomics Institute, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
- Nico Hawthorne
- Department of Electrical and Computer Engineering, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
- Daniel B Freeman
- Universal Audio Inc., Scotts Valley, CA 95066, United States of America
- Rob Currie
- UC Santa Cruz Genomics Institute, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
- Mohammed A Mostajo-Radji
- The Eli and Edythe Broad Center of Regeneration Medicine and Stem Cell Research, University of California San Francisco, San Francisco, CA 94143, United States of America
- UC Santa Cruz Genomics Institute, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
- Department of Neurology, University of California San Francisco, San Francisco, CA 94143, United States of America
- Alex A Pollen
- The Eli and Edythe Broad Center of Regeneration Medicine and Stem Cell Research, University of California San Francisco, San Francisco, CA 94143, United States of America
- Department of Neurology, University of California San Francisco, San Francisco, CA 94143, United States of America
- Tomasz J Nowakowski
- Department of Anatomy, University of California San Francisco, San Francisco, CA 94143, United States of America
- The Eli and Edythe Broad Center of Regeneration Medicine and Stem Cell Research, University of California San Francisco, San Francisco, CA 94143, United States of America
- Sofie R Salama
- Department of Biomolecular Engineering, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
- Howard Hughes Medical Institute, University of California Santa Cruz, Santa Cruz, CA 95064, United States of America
- UC Santa Cruz Genomics Institute, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
- Mircea Teodorescu
- Department of Electrical and Computer Engineering, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
- UC Santa Cruz Genomics Institute, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
- David Haussler
- Department of Biomolecular Engineering, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
- Howard Hughes Medical Institute, University of California Santa Cruz, Santa Cruz, CA 95064, United States of America
- UC Santa Cruz Genomics Institute, University of California Santa Cruz, Santa Cruz, CA 95060, United States of America
42
Vercruysse F, Naud R, Sprekeler H. Self-organization of a doubly asynchronous irregular network state for spikes and bursts. PLoS Comput Biol 2021; 17:e1009478. [PMID: 34748532 PMCID: PMC8575278 DOI: 10.1371/journal.pcbi.1009478] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/27/2021] [Accepted: 09/24/2021] [Indexed: 11/21/2022] Open
Abstract
Cortical pyramidal cells (PCs) have a specialized dendritic mechanism for the generation of bursts, suggesting that these events play a special role in cortical information processing. In vivo, bursts occur at a low but consistent rate. Theory suggests that this network state increases the amount of information bursts convey. However, because burst activity relies on a threshold mechanism, it is rather sensitive to dendritic input levels. In spiking network models, network states in which bursts occur rarely are therefore typically not robust, but require fine-tuning. Here, we show that this issue can be solved by a homeostatic inhibitory plasticity rule in dendrite-targeting interneurons that is consistent with experimental data. The suggested learning rule can be combined with other forms of inhibitory plasticity to self-organize a network state in which both spikes and bursts occur asynchronously and irregularly at low rate. Finally, we show that this network state creates the network conditions for a recently suggested multiplexed code and thereby indeed increases the amount of information encoded in bursts.
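The logic of such a rule can be sketched in a rate-based toy model (illustrative only; the paper's networks are spiking, and the sigmoid, threshold, and learning rate below are invented): a dendrite-targeting inhibitory weight potentiates whenever burst probability exceeds a low target, so the rare-burst regime self-organizes instead of requiring fine-tuning.

```python
import numpy as np

rng = np.random.default_rng(0)
TARGET = 0.05                 # desired (low) burst probability
w_inh, eta = 0.0, 0.02        # dendrite-targeting inhibitory weight, learning rate

def burst_prob(exc, w):
    # bursts require dendritic drive (excitation minus inhibition) to cross threshold
    return 1.0 / (1.0 + np.exp(-4.0 * (exc - w - 1.0)))

for _ in range(5000):
    exc = rng.normal(1.0, 0.3)                     # fluctuating dendritic excitation
    p = burst_prob(exc, w_inh)
    # homeostatic rule: inhibition grows when bursting exceeds its target
    w_inh = max(w_inh + eta * (p - TARGET), 0.0)
```

At the fixed point the average burst probability sits near the target regardless of the initial excitation level, which is the robustness property the abstract emphasizes.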
Affiliation(s)
- Filip Vercruysse
- Department for Electrical Engineering and Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Richard Naud
- Department of Physics, University of Ottawa, Ottawa, Canada
- uOttawa Brain Mind Institute, Center for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Canada
- Henning Sprekeler
- Department for Electrical Engineering and Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
43
Chipman PH, Fung CCA, Pazo Fernandez A, Sawant A, Tedoldi A, Kawai A, Ghimire Gautam S, Kurosawa M, Abe M, Sakimura K, Fukai T, Goda Y. Astrocyte GluN2C NMDA receptors control basal synaptic strengths of hippocampal CA1 pyramidal neurons in the stratum radiatum. eLife 2021; 10:70818. [PMID: 34693906 PMCID: PMC8594917 DOI: 10.7554/elife.70818] [Citation(s) in RCA: 18] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2021] [Accepted: 10/22/2021] [Indexed: 12/12/2022] Open
Abstract
Experience-dependent plasticity is a key feature of brain synapses for which neuronal N-Methyl-D-Aspartate receptors (NMDARs) play a major role, from developmental circuit refinement to learning and memory. Astrocytes also express NMDARs, although their exact function has remained controversial. Here, we identify, in the mouse hippocampus, a circuit function for GluN2C NMDARs, a subtype highly expressed in astrocytes, in layer-specific tuning of synaptic strengths in CA1 pyramidal neurons. Interfering with astrocyte NMDAR or GluN2C NMDAR activity reduces the range of presynaptic strength distribution specifically in the stratum radiatum inputs without an appreciable change in the mean presynaptic strength. Mathematical modeling shows that narrowing of the width of presynaptic release probability distribution compromises the expression of long-term synaptic plasticity. Our findings suggest a novel feedback signaling system that uses astrocyte GluN2C NMDARs to adjust the basal synaptic weight distribution of Schaffer collateral inputs, which in turn impacts computations performed by the CA1 pyramidal neuron.
Affiliation(s)
- Chi Chung Alan Fung
- Neural Coding and Brain Computing Unit, Okinawa Institute of Science and Technology Graduate University, Onna-son, Japan
- Angelo Tedoldi
- RIKEN Center for Brain Science, Wako-shi, Saitama, Japan
- Atsushi Kawai
- RIKEN Center for Brain Science, Wako-shi, Saitama, Japan
- Manabu Abe
- Department of Animal Model Development, Brain Research Institute, Niigata University, Niigata, Japan
- Kenji Sakimura
- Department of Animal Model Development, Brain Research Institute, Niigata University, Niigata, Japan
- Tomoki Fukai
- Neural Coding and Brain Computing Unit, Okinawa Institute of Science and Technology Graduate University, Onna-son, Japan
- Yukiko Goda
- RIKEN Center for Brain Science, Wako-shi, Saitama, Japan
44
Lu H, Gallinaro JV, Normann C, Rotter S, Yalcin I. Time Course of Homeostatic Structural Plasticity in Response to Optogenetic Stimulation in Mouse Anterior Cingulate Cortex. Cereb Cortex 2021; 32:1574-1592. [PMID: 34607362 DOI: 10.1093/cercor/bhab281] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/23/2021] [Revised: 07/09/2021] [Accepted: 07/12/2021] [Indexed: 11/13/2022] Open
Abstract
Plasticity is the mechanistic basis of development, aging, learning, and memory, both in healthy and pathological brains. Structural plasticity is rarely accounted for in computational network models due to a lack of insight into the underlying neuronal mechanisms and processes. Little is known about how the rewiring of networks is dynamically regulated. To inform such models, we characterized the time course of neural activity, the expression of synaptic proteins, and neural morphology employing an in vivo optogenetic mouse model. We stimulated pyramidal neurons in the anterior cingulate cortex of mice and harvested their brains at 1.5 h, 24 h, and 48 h after stimulation. Stimulus-induced cortical hyperactivity persisted up to 1.5 h and decayed to baseline after 24 h, as indicated by c-Fos expression. The synaptic proteins VGLUT1 and PSD-95, in contrast, were upregulated at 24 h and downregulated at 48 h. Spine density and spine head volume were also increased at 24 h and decreased at 48 h. This specific sequence of events reflects a continuous joint evolution of activity and connectivity that is characteristic of the model of homeostatic structural plasticity. Our computer simulations thus corroborate the observed empirical evidence from our animal experiments.
Affiliation(s)
- Han Lu
- Bernstein Center Freiburg and Faculty of Biology, University of Freiburg, Freiburg 79104, Germany
- Centre National de la Recherche Scientifique, Université de Strasbourg, Institut des Neurosciences Cellulaires et Intégratives UPR3212, Strasbourg 67000, France
- Department of Neuroanatomy, Institute of Anatomy and Cell Biology, Faculty of Medicine, University of Freiburg, Freiburg 79104, Germany
- Júlia V Gallinaro
- Bernstein Center Freiburg and Faculty of Biology, University of Freiburg, Freiburg 79104, Germany
- Bioengineering Department, Imperial College London, London SW7 2AZ, United Kingdom
- Claus Normann
- Department of Psychiatry and Psychotherapy, Medical Center, Faculty of Medicine, University of Freiburg, Freiburg 79104, Germany
- Center for Basics in Neuromodulation, Faculty of Medicine, University of Freiburg, Freiburg 79104, Germany
- Stefan Rotter
- Bernstein Center Freiburg and Faculty of Biology, University of Freiburg, Freiburg 79104, Germany
- Ipek Yalcin
- Centre National de la Recherche Scientifique, Université de Strasbourg, Institut des Neurosciences Cellulaires et Intégratives UPR3212, Strasbourg 67000, France
- Department of Psychiatry and Neuroscience, Université Laval, Québec QC G1V 0A6, Canada
Collapse
|
45
|
Walker JR, Detloff MR. Plasticity in Cervical Motor Circuits following Spinal Cord Injury and Rehabilitation. BIOLOGY 2021; 10:biology10100976. [PMID: 34681075 PMCID: PMC8533179 DOI: 10.3390/biology10100976] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 08/24/2021] [Revised: 09/17/2021] [Accepted: 09/22/2021] [Indexed: 11/16/2022]
Abstract
Simple Summary
Spinal cord injury results in a decreased quality of life and impacts hundreds of thousands of people in the US alone. This review discusses the underlying cellular mechanisms of injury and the concurrent therapeutic hurdles that impede recovery. It then describes the phenomenon of neural plasticity, the nervous system's ability to change. The primary focus of the review is the impact of cervical spinal cord injury on control of the upper limbs. The neural plasticity that occurs without intervention is discussed, including new connections growing around the injury site and the involvement of compensatory movements. Rehabilitation-driven neural plasticity is shown to be able to guide connections toward more normal function. Various novel stimulation and recording technologies are outlined for their role in further improving rehabilitative outcomes and gains in independence. Finally, the importance of sensory input, an often-overlooked aspect of motor control, is shown in driving neural plasticity. Overall, this review seeks to delineate the historical and contemporary research into neural plasticity following injury and rehabilitation to guide future studies.
Abstract
Neuroplasticity is a robust mechanism by which the central nervous system attempts to adapt to a structural or chemical disruption of functional connections between neurons. Mechanical damage from spinal cord injury is potentiated by neuroinflammation and can cause aberrant changes in neural circuitry known as maladaptive plasticity. Together, these alterations greatly diminish function and quality of life. This review discusses contemporary efforts to harness neuroplasticity through rehabilitation and neuromodulation to restore function, with a focus on motor recovery following cervical spinal cord injury.
Background information on the general mechanisms of plasticity and long-term potentiation of the nervous system, best studied in the learning and memory fields, is reviewed. Spontaneous plasticity of the nervous system, both maladaptive and during natural recovery following spinal cord injury, is outlined to provide a baseline from which rehabilitation builds. Previous research has focused on the impact of descending motor commands in driving spinal plasticity. This review, however, focuses on the influence of physical therapy, primary afferent input, and interneuron modulation in driving plasticity within the spinal cord. Finally, future directions into previously untargeted primary afferent populations are presented.
Collapse
|
46
|
Amorim FE, Chapot RL, Moulin TC, Lee JLC, Amaral OB. Memory destabilization during reconsolidation: a consequence of homeostatic plasticity? Learn Mem 2021; 28:371-389. [PMID: 34526382 DOI: 10.1101/lm.053418.121] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/05/2021] [Accepted: 07/14/2021] [Indexed: 11/24/2022]
Abstract
Remembering is not a static process: When retrieved, a memory can be destabilized and become prone to modifications. This phenomenon has been demonstrated in a number of brain regions, but the neuronal mechanisms that rule memory destabilization and its boundary conditions remain elusive. Using two distinct computational models that combine Hebbian plasticity and synaptic downscaling, we show that homeostatic plasticity can function as a destabilization mechanism, accounting for behavioral results of protein synthesis inhibition upon reactivation with different re-exposure times. Furthermore, by performing systematic reviews, we identify a series of overlapping molecular mechanisms between memory destabilization and synaptic downscaling, although direct experimental links between both phenomena remain scarce. In light of these results, we propose a theoretical framework where memory destabilization can emerge as an epiphenomenon of homeostatic adaptations prompted by memory retrieval.
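The mechanism proposed here, homeostatic downscaling acting as a destabilizer when a memory is reactivated, can be caricatured in a few lines. This is a toy sketch under stated assumptions (the function name `update`, the multiplicative downscaling term, and all parameter values are hypothetical, not either of the paper's two models): Hebbian coactivity potentiates synapses, while downscaling multiplicatively pulls the neuron's total input weight back toward a fixed target, so retrieval without reinforcement produces a net weakening.

```python
import numpy as np

def update(w, pre, post, eta=0.05, target=1.0, beta=0.1):
    """One plasticity step (illustrative): a Hebbian term driven by
    pre/post coactivity, followed by homeostatic downscaling that
    shrinks all weights when their sum exceeds a target."""
    w = w + eta * pre * post                   # Hebbian potentiation
    w = w * (1.0 - beta * (w.sum() - target))  # multiplicative downscaling
    return np.clip(w, 0.0, None)

# A potentiated ensemble (total weight above target) is reactivated
# without reinforcing input: downscaling alone acts, weakening the trace.
w0 = np.array([0.6, 0.6])
w_react = update(w0, pre=0.0, post=0.0)
```

Under this caricature, brief reactivation engages only the homeostatic term, which is the sense in which destabilization could emerge as an epiphenomenon of downscaling.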
Collapse
Affiliation(s)
- Felippe E Amorim
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro 21941-902, Brazil
| | - Renata L Chapot
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro 21941-902, Brazil
| | - Thiago C Moulin
- Functional Pharmacology Unit, Department of Neuroscience, Uppsala University, Uppsala 751 24, Sweden
| | - Jonathan L C Lee
- University of Birmingham, School of Psychology, Edgbaston, Birmingham B15 2TT, United Kingdom
| | - Olavo B Amaral
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro 21941-902, Brazil
| |
Collapse
|
47
|
Gozel O, Gerstner W. A functional model of adult dentate gyrus neurogenesis. eLife 2021; 10:e66463. [PMID: 34137370 PMCID: PMC8260225 DOI: 10.7554/elife.66463] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/12/2021] [Accepted: 06/16/2021] [Indexed: 12/27/2022] Open
Abstract
In adult dentate gyrus neurogenesis, the link between maturation of newborn neurons and their function, such as behavioral pattern separation, has remained puzzling. By analyzing a theoretical model, we show that the switch from excitation to inhibition of the GABAergic input onto maturing newborn cells is crucial for their proper functional integration. When the GABAergic input is excitatory, cooperativity drives the growth of synapses such that newborn cells become sensitive to stimuli similar to those that activate mature cells. When GABAergic input switches to inhibitory, competition pushes the configuration of synapses onto newborn cells toward stimuli that are different from previously stored ones. This enables the maturing newborn cells to code for concepts that are novel, yet similar to familiar ones. Our theory of newborn cell maturation explains both how adult-born dentate granule cells integrate into the preexisting network and why they promote separation of similar but not distinct patterns.
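The excitation-to-inhibition switch described above can be sketched in miniature. Everything here is an illustrative assumption (the names `newborn_drive` and `hebbian_step`, the additive GABA term, the rate threshold), not the paper's model: while GABAergic input is depolarizing, the mature network helps drive the newborn cell above threshold so Hebbian growth aligns its weights with familiar stimuli (cooperativity); after the switch, the same input is subtractive and suppresses that alignment (competition).

```python
import numpy as np

def newborn_drive(w_ff, stimulus, mature_rate, gaba_sign):
    """Total drive to a maturing newborn cell (toy sketch): feedforward
    input plus GABAergic input from the mature network, which is
    depolarizing (+1) early and hyperpolarizing (-1) after the switch."""
    return w_ff @ stimulus + gaba_sign * mature_rate

def hebbian_step(w_ff, stimulus, rate, eta=0.1):
    # Rate-based Hebbian update, gated by suprathreshold activity.
    return w_ff + eta * max(rate, 0.0) * stimulus

# Familiar stimulus strongly drives mature cells (mature_rate = 1).
w0 = np.zeros(2)
stim = np.array([1.0, 0.0])
early = hebbian_step(w0, stim, newborn_drive(w0, stim, 1.0, gaba_sign=+1))
late  = hebbian_step(w0, stim, newborn_drive(w0, stim, 1.0, gaba_sign=-1))
```

In the early regime the newborn cell potentiates toward the familiar stimulus; in the late regime familiar stimuli no longer recruit it, leaving its synapses available for novel, similar patterns.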
Collapse
Affiliation(s)
- Olivia Gozel
- School of Life Sciences and School of Computer and Communication Sciences, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland.,Departments of Neurobiology and Statistics, University of Chicago, Chicago, United States.,Grossman Center for Quantitative Biology and Human Behavior, University of Chicago, Chicago, United States
| | - Wulfram Gerstner
- School of Life Sciences and School of Computer and Communication Sciences, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
| |
Collapse
|
48
|
Wu CH, Ramos R, Katz DB, Turrigiano GG. Homeostatic synaptic scaling establishes the specificity of an associative memory. Curr Biol 2021; 31:2274-2285.e5. [PMID: 33798429 PMCID: PMC8187282 DOI: 10.1016/j.cub.2021.03.024] [Citation(s) in RCA: 29] [Impact Index Per Article: 7.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/19/2021] [Revised: 03/01/2021] [Accepted: 03/08/2021] [Indexed: 12/21/2022]
Abstract
Correlation-based (Hebbian) forms of synaptic plasticity are crucial for the initial encoding of associative memories but likely insufficient to enable the stable storage of multiple specific memories within neural circuits. Theoretical studies have suggested that homeostatic synaptic normalization rules provide an essential countervailing force that can stabilize and expand memory storage capacity. Although such homeostatic mechanisms have been identified and studied for decades, experimental evidence that they play an important role in associative memory is lacking. Here, we show that synaptic scaling, a widely studied form of homeostatic synaptic plasticity that globally renormalizes synaptic strengths, is dispensable for initial associative memory formation but crucial for the establishment of memory specificity. We used conditioned taste aversion (CTA) learning, a form of associative learning that relies on Hebbian mechanisms within gustatory cortex (GC), to show that animals conditioned to avoid saccharin initially generalized this aversion to other novel tastants. Specificity of the aversion to saccharin emerged slowly over a time course of many hours and was associated with synaptic scaling down of excitatory synapses onto conditioning-active neuronal ensembles within gustatory cortex. Blocking synaptic scaling down in the gustatory cortex enhanced the persistence of synaptic strength increases induced by conditioning and prolonged the duration of memory generalization. Taken together, these findings demonstrate that synaptic scaling is crucial for sculpting the specificity of an associative memory and suggest that the relative strengths of Hebbian and homeostatic plasticity can modulate the balance between stable memory formation and memory generalization.
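Synaptic scaling, as studied here, is usually formalized as a multiplicative renormalization that preserves the relative strengths Hebbian learning has imprinted. The sketch below is a minimal illustration under assumed names and parameters (`scale_down`, the linear dependence on the rate error, `tau`), not the biological mechanism's actual dynamics: when a neuron's firing rate exceeds its set point, every excitatory input weight is shrunk by the same factor.

```python
import numpy as np

def scale_down(weights, rate, target_rate, tau=10.0):
    """Multiplicative synaptic scaling (illustrative): inputs to a neuron
    firing above its homeostatic set point are all renormalized by one
    common factor, so weight ratios (the stored pattern) are preserved."""
    factor = 1.0 - (rate - target_rate) / (target_rate * tau)
    return weights * factor

# A conditioning-active neuron firing at twice its set point:
w0 = np.array([2.0, 1.0])          # potentiated vs. baseline input
w1 = scale_down(w0, rate=10.0, target_rate=5.0, tau=10.0)
```

The preserved ratio is the key property: scaling down can sharpen an over-generalized trace without erasing the Hebbian pattern itself, consistent with the slow emergence of specificity reported above.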
Collapse
Affiliation(s)
- Chi-Hong Wu
- Department of Biology, Brandeis University, Waltham, MA 02453, USA
| | - Raul Ramos
- Department of Biology, Brandeis University, Waltham, MA 02453, USA
| | - Donald B Katz
- Department of Psychology, Brandeis University, Waltham, MA 02453, USA
| | | |
Collapse
|
49
|
Altered Heterosynaptic Plasticity Impairs Visual Discrimination Learning in Adenosine A1 Receptor Knock-Out Mice. J Neurosci 2021; 41:4631-4640. [PMID: 33849950 DOI: 10.1523/jneurosci.3073-20.2021] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/07/2020] [Revised: 03/04/2021] [Accepted: 04/03/2021] [Indexed: 12/20/2022] Open
Abstract
Theoretical and modeling studies demonstrate that heterosynaptic plasticity (changes at synapses inactive during induction) facilitates fine-grained discriminative learning in Hebbian-type systems and helps to achieve a robust ability for repetitive learning. A dearth of tools for selective manipulation has hindered experimental analysis of the proposed role of heterosynaptic plasticity in behavior. Here we circumvent this obstacle by testing specific predictions about the behavioral consequences of the impairment of heterosynaptic plasticity by experimental manipulations of adenosine A1 receptors (A1Rs). Our prior work demonstrated that the blockade of adenosine A1 receptors impairs heterosynaptic plasticity in brain slices and, when implemented in computer models, selectively impairs repetitive learning on sequential tasks. Based on this work, we predicted that A1R knock-out (KO) mice would express (1) impairment of heterosynaptic plasticity and (2) behavioral deficits in learning on sequential tasks. Using electrophysiological experiments in slices and behavioral testing of animals of both sexes, we show that, compared with wild-type controls, A1R KO mice have impaired synaptic plasticity in visual cortex neurons, coupled with significant deficits in visual discrimination learning. Deficits in A1R knockouts were seen specifically during relearning, becoming progressively more apparent with learning on sequential visual discrimination tasks of increasing complexity. These behavioral results confirm our model predictions and provide the first experimental evidence for a proposed role of heterosynaptic plasticity in organism-level learning. Moreover, these results identify heterosynaptic plasticity as a new potential target for interventions that may help to enhance new learning on a background of existing memories.
SIGNIFICANCE STATEMENT Understanding how interacting forms of synaptic plasticity mediate learning is fundamental for neuroscience.
Theory and modeling revealed that, in addition to Hebbian-type associative plasticity, heterosynaptic changes at synapses that were not active during induction are necessary for stable system operation and fine-grained discrimination learning. However, lacking tools for selective manipulation prevented behavioral analysis of heterosynaptic plasticity. Here we circumvent this barrier: from our prior experimental and computational work we predict differential behavioral consequences of the impairment of Hebbian-type versus heterosynaptic plasticity. We show that, in adenosine A1 receptor knock-out mice, impaired synaptic plasticity in visual cortex neurons is coupled with specific deficits in learning sequential, increasingly complex visual discrimination tasks. This provides the first evidence linking heterosynaptic plasticity to organism-level learning.
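A common way to model the combination described here is a homosynaptic update at the induced inputs plus a heterosynaptic drift at the inactive ones. The sketch below is a toy version under assumed names and parameters (`plasticity_step`, the drift toward a mean weight `w_mean`), not the specific rule from the authors' models: inactive strong synapses weaken and inactive weak synapses strengthen, which is the normalizing effect the theory relies on.

```python
import numpy as np

def plasticity_step(w, active, eta=0.2, het=0.05, w_mean=0.5):
    """Homosynaptic LTP at active inputs plus heterosynaptic
    normalization at inactive ones (illustrative): inactive synapses
    drift toward a mean weight, counteracting runaway dynamics."""
    w = w.copy()
    w[active] += eta                            # homosynaptic potentiation
    w[~active] += het * (w_mean - w[~active])   # heterosynaptic drift
    return w

# One induction episode: input 0 is active; inputs 1 (strong) and 2 (weak)
# are inactive and experience only the heterosynaptic term.
w0 = np.array([0.5, 0.9, 0.1])
active = np.array([True, False, False])
w1 = plasticity_step(w0, active)
```

Removing the heterosynaptic term from such a rule leaves the homosynaptic changes of earlier episodes untouched, which is how the models predict selective deficits in repetitive, sequential learning.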
Collapse
|
50
|
Akil AE, Rosenbaum R, Josić K. Balanced networks under spike-time dependent plasticity. PLoS Comput Biol 2021; 17:e1008958. [PMID: 33979336 PMCID: PMC8143429 DOI: 10.1371/journal.pcbi.1008958] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/25/2020] [Revised: 05/24/2021] [Accepted: 04/12/2021] [Indexed: 11/28/2022] Open
Abstract
The dynamics of local cortical networks are irregular, but correlated. Dynamic excitatory-inhibitory balance is a plausible mechanism that generates such irregular activity, but it remains unclear how balance is achieved and maintained in plastic neural networks. In particular, it is not fully understood how plasticity-induced changes in the network affect balance, and in turn, how correlated, balanced activity impacts learning. How do the dynamics of balanced networks change under different plasticity rules? How does correlated spiking activity in recurrent networks change the evolution of weights, their eventual magnitude, and structure across the network? To address these questions, we develop a theory of spike-timing-dependent plasticity in balanced networks. We show that balance can be attained and maintained under plasticity-induced weight changes. We find that correlations in the input mildly affect the evolution of synaptic weights. Under certain plasticity rules, we find an emergence of correlations between firing rates and synaptic weights. Under these rules, synaptic weights converge to a stable manifold in weight space, with their final configuration dependent on the initial state of the network. Lastly, we show that our framework can also describe the dynamics of plastic balanced networks when subsets of neurons receive targeted optogenetic input.
Animals are able to learn complex tasks through changes in individual synapses between cells. Such changes lead to the coevolution of neural activity patterns and the structure of neural connectivity, but the consequences of these interactions are not fully understood. We consider plasticity in model neural networks which achieve an average balance between the excitatory and inhibitory synaptic inputs to different cells, and display cortical-like, irregular activity.
We extend the theory of balanced networks to account for synaptic plasticity and show which rules can maintain balance, and which will drive the network into a different state. This theory of plasticity can provide insights into the relationship between stimuli, network dynamics, and synaptic circuitry.
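The spike-timing-dependent plasticity that drives the weight evolution analyzed here is typically the standard pair-based kernel. The sketch below is that textbook form, shown for illustration with assumed amplitudes and time constants rather than the particular rules studied in the paper: pre-before-post spike pairs potentiate and post-before-pre pairs depress, each with an exponential time window.

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP kernel (standard textbook form): weight change
    for a spike pair with timing difference dt = t_post - t_pre (ms).
    Causal pairs (dt > 0) potentiate; anti-causal pairs depress."""
    return np.where(dt > 0,
                    a_plus * np.exp(-dt / tau),
                    -a_minus * np.exp(dt / tau))

# Kernel evaluated for anti-causal, tightly causal, and loosely causal pairs.
dws = stdp_dw(np.array([-20.0, 5.0, 50.0]))
```

With `a_minus` slightly larger than `a_plus`, uncorrelated pre/post spiking depresses weights on average, one common condition for such a rule to remain stable in a recurrent balanced network.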
Collapse
Affiliation(s)
- Alan Eric Akil
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
| | - Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana, United States of America
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana, United States of America
| | - Krešimir Josić
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
| |
Collapse
|