1. Eckmann S, Young EJ, Gjorgjieva J. Synapse-type-specific competitive Hebbian learning forms functional recurrent networks. Proc Natl Acad Sci U S A 2024; 121:e2305326121. PMID: 38870059; PMCID: PMC11194505; DOI: 10.1073/pnas.2305326121.
Abstract
Cortical networks exhibit complex stimulus-response patterns that are based on specific recurrent interactions between neurons. For example, the balance between excitatory and inhibitory currents has been identified as a central component of cortical computations. However, it remains unclear how the required synaptic connectivity can emerge in developing circuits where synapses between excitatory and inhibitory neurons are simultaneously plastic. Using theory and modeling, we propose that a wide range of cortical response properties can arise from a single plasticity paradigm that acts simultaneously at all excitatory and inhibitory connections: Hebbian learning that is stabilized by the synapse-type-specific competition for a limited supply of synaptic resources. In plastic recurrent circuits, this competition enables the formation and decorrelation of inhibition-balanced receptive fields. Networks develop an assembly structure with stronger synaptic connections between similarly tuned excitatory and inhibitory neurons and exhibit response normalization and orientation-specific center-surround suppression, reflecting the stimulus statistics during training. These results demonstrate how neurons can self-organize into functional networks and suggest an essential role for synapse-type-specific competitive learning in the development of cortical circuits.
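The paradigm summarized above pairs plain Hebbian growth with competition for a fixed pool of synaptic resources within each synapse type. A minimal sketch of that idea follows; the network sizes, rates, learning rate, and resource budget are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

n_pre, n_post = 20, 5
W_exc = rng.uniform(0.0, 1.0, (n_post, n_pre))  # excitatory afferent weights
W_inh = rng.uniform(0.0, 1.0, (n_post, n_pre))  # inhibitory afferent weights
BUDGET = 10.0  # fixed synaptic resource per neuron and synapse type (assumed)
ETA = 0.05     # learning rate (assumed)

def hebbian_competitive_step(W, pre, post):
    """Hebbian growth, then synapse-type-specific competition: each
    postsynaptic neuron rescales its weights of this type so that their
    sum stays at a fixed resource budget."""
    W = np.clip(W + ETA * np.outer(post, pre), 0.0, None)  # co-activity strengthens
    return W * (BUDGET / W.sum(axis=1, keepdims=True))     # competitive normalization

pre = rng.uniform(0.0, 1.0, n_pre)    # presynaptic rates
post = rng.uniform(0.0, 1.0, n_post)  # postsynaptic rates
W_exc = hebbian_competitive_step(W_exc, pre, post)
W_inh = hebbian_competitive_step(W_inh, pre, post)
```

Because normalization acts per synapse type, excitatory and inhibitory inputs compete only among themselves, which is what lets correlated excitation and inhibition grow together.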
Affiliation(s)
- Samuel Eckmann
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt am Main 60438, Germany
- Computational and Biological Learning Lab, Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, United Kingdom
- Edward James Young
- Computational and Biological Learning Lab, Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, United Kingdom
- Julijana Gjorgjieva
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt am Main 60438, Germany
- School of Life Sciences, Technical University of Munich, Freising 85354, Germany
2. Yang X, La Camera G. Co-existence of synaptic plasticity and metastable dynamics in a spiking model of cortical circuits. bioRxiv [Preprint] 2024:2023.12.07.570692. PMID: 38106233; PMCID: PMC10723399; DOI: 10.1101/2023.12.07.570692.
Abstract
Evidence for metastable dynamics and its role in brain function is emerging at a fast pace and is changing our understanding of neural coding by putting an emphasis on hidden states of transient activity. Clustered networks of spiking neurons have enhanced synaptic connections among groups of neurons forming structures called cell assemblies; such networks are capable of producing metastable dynamics that is in agreement with many experimental results. However, it is unclear how a clustered network structure producing metastable dynamics may emerge from a fully local plasticity rule, i.e., a plasticity rule where each synapse has only access to the activity of the neurons it connects (as opposed to the activity of other neurons or other synapses). Here, we propose a local plasticity rule producing ongoing metastable dynamics in a deterministic, recurrent network of spiking neurons. The metastable dynamics co-exists with ongoing plasticity and is the consequence of a self-tuning mechanism that keeps the synaptic weights close to the instability line where memories are spontaneously reactivated. In turn, the synaptic structure is stable to ongoing dynamics and random perturbations, yet it remains sufficiently plastic to remap sensory representations to encode new sets of stimuli. Both the plasticity rule and the metastable dynamics scale well with network size, with synaptic stability increasing with the number of neurons. Overall, our results show that it is possible to generate metastable dynamics over meaningful hidden states using a simple but biologically plausible plasticity rule which co-exists with ongoing neural dynamics.
Affiliation(s)
- Xiaoyu Yang
- Graduate Program in Physics and Astronomy, Stony Brook University
- Department of Neurobiology & Behavior, Stony Brook University
- Center for Neural Circuit Dynamics, Stony Brook University
- Giancarlo La Camera
- Department of Neurobiology & Behavior, Stony Brook University
- Center for Neural Circuit Dynamics, Stony Brook University
3. Sun SED, Levenstein D, Li B, Mandelberg N, Chenouard N, Suutari BS, Sanchez S, Tian G, Rinzel J, Buzsáki G, Tsien RW. Synaptic homeostasis transiently leverages Hebbian mechanisms for a multiphasic response to inactivity. Cell Rep 2024; 43:113839. PMID: 38507409; DOI: 10.1016/j.celrep.2024.113839.
Abstract
Homeostatic regulation of synapses is vital for nervous system function and key to understanding a range of neurological conditions. Synaptic homeostasis is proposed to operate over hours to counteract the destabilizing influence of long-term potentiation (LTP) and long-term depression (LTD). The prevailing view holds that synaptic scaling is a slow first-order process that regulates postsynaptic glutamate receptors and fundamentally differs from LTP or LTD. Surprisingly, we find that the dynamics of scaling induced by neuronal inactivity are not exponential or monotonic, and the mechanism requires calcineurin and CaMKII, molecules dominant in LTD and LTP. Our quantitative model of these enzymes reconstructs the unexpected dynamics of homeostatic scaling and reveals how synapses can efficiently safeguard future capacity for synaptic plasticity. This mechanism of synaptic adaptation supports a broader set of homeostatic changes, including action potential autoregulation, and invites further inquiry into how such a mechanism varies in health and disease.
Affiliation(s)
- Simón E D Sun
- Center for Neural Science, New York University, New York, NY 10003, USA; Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA; Cold Spring Harbor Laboratory, Cold Spring Harbor, NY 11724, USA
- Daniel Levenstein
- Center for Neural Science, New York University, New York, NY 10003, USA; Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA; Montreal Neurological Institute, Department of Neurology and Neurosurgery, McGill University, 3810 University Street, Montreal, QC, Canada
- Boxing Li
- Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA; Neuroscience Program, Guangdong Provincial Key Laboratory of Brain Function and Disease, Zhongshan School of Medicine and the Fifth Affiliated Hospital, Sun Yat-sen University, Guangzhou 510810, China
- Nataniel Mandelberg
- Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA
- Nicolas Chenouard
- Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA; Sorbonne Université, INSERM U1127, UMR CNRS 7225, Institut du Cerveau (ICM), 47 bld de l'hôpital, 75013 Paris, France
- Benjamin S Suutari
- Center for Neural Science, New York University, New York, NY 10003, USA; Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA
- Sandrine Sanchez
- Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA
- Guoling Tian
- Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA
- John Rinzel
- Center for Neural Science, New York University, New York, NY 10003, USA
- György Buzsáki
- Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA
- Richard W Tsien
- Center for Neural Science, New York University, New York, NY 10003, USA; Department of Neuroscience and Physiology, Neuroscience Institute, New York University Langone Health, New York, NY 10016, USA.
4. Pereira-Obilinovic U, Hou H, Svoboda K, Wang XJ. Brain mechanism of foraging: Reward-dependent synaptic plasticity versus neural integration of values. Proc Natl Acad Sci U S A 2024; 121:e2318521121. PMID: 38551832; PMCID: PMC10998608; DOI: 10.1073/pnas.2318521121.
Abstract
During foraging behavior, action values are persistently encoded in neural activity and updated depending on the history of choice outcomes. What is the neural mechanism for action value maintenance and updating? Here, we explore two contrasting network models: synaptic learning of action value versus neural integration. We show that both models can reproduce extant experimental data, but they yield distinct predictions about the underlying biological neural circuits. In particular, the neural integrator model but not the synaptic model requires that reward signals are mediated by neural pools selective for action alternatives and their projections are aligned with linear attractor axes in the valuation system. We demonstrate experimentally observable neural dynamical signatures and feasible perturbations to differentiate the two contrasting scenarios, suggesting that the synaptic model is a more robust candidate mechanism. Overall, this work provides a modeling framework to guide future experimental research on probabilistic foraging.
Affiliation(s)
- Ulises Pereira-Obilinovic
- Center for Neural Science, New York University, New York, NY 10003
- Allen Institute for Neural Dynamics, Seattle, WA 98109
- Han Hou
- Allen Institute for Neural Dynamics, Seattle, WA 98109
- Karel Svoboda
- Allen Institute for Neural Dynamics, Seattle, WA 98109
- Xiao-Jing Wang
- Center for Neural Science, New York University, New York, NY 10003
5. Bhasin BJ, Raymond JL, Goldman MS. Synaptic weight dynamics underlying systems consolidation of a memory. bioRxiv [Preprint] 2024:2024.03.20.586036. PMID: 38585936; PMCID: PMC10996481; DOI: 10.1101/2024.03.20.586036.
Abstract
Systems consolidation is a common feature of learning and memory systems, in which a long-term memory initially stored in one brain region becomes persistently stored in another region. We studied the dynamics of systems consolidation in simple circuit architectures modeling core features of many memory systems: an early- and late-learning brain region and two sites of plasticity. We show that the synaptic dynamics of the circuit during consolidation of an analog memory can be understood as a temporal integration process, by which transient changes in activity driven by plasticity in the early-learning area are accumulated into persistent synaptic changes at the late-learning site. This simple principle leads to two constraints on the circuit operation for consolidation to be implemented successfully. First, the plasticity rule at the late-learning site must stably support a continuum of possible outputs for a given input. We show that this is readily achieved by heterosynaptic but not standard Hebbian rules, that it naturally leads to a speed-accuracy tradeoff in systems consolidation, and that it provides a concrete circuit instantiation for how systems consolidation solves the stability-plasticity dilemma. Second, to turn off the consolidation process and prevent erroneous changes at the late-learning site, neural activity in the early-learning area must be reset to its baseline activity. We propose two biologically plausible implementations for this reset that suggest novel roles for core elements of the cerebellar circuit.
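The temporal-integration principle in this abstract can be illustrated with a toy two-site model. Everything below is a simplification introduced here, not the paper's circuit: the early site is assumed to learn instantaneously whatever part of an analog memory the late site does not yet express, and the late site slowly integrates the early site's transient signal.

```python
M_TARGET = 0.8  # analog memory to be consolidated (assumed value)
EPS = 0.05      # slow integration rate at the late-learning site (assumed)

late = 0.0
outputs, lates = [], []
for _ in range(200):
    # Fast plasticity: the early site takes up the residual memory.
    early = M_TARGET - late
    # Total behavioral output is the sum of the two sites.
    outputs.append(early + late)
    # The late site integrates the transient early-site activity.
    late += EPS * early
    lates.append(late)
```

The memory migrates from the early to the late site while behavior stays intact throughout: the output equals the target at every step, the late-site weight converges to the target, and the early-site contribution decays back toward baseline (the "reset" the authors argue is needed to switch consolidation off).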
Affiliation(s)
- Brandon J Bhasin
- Department of Bioengineering, Stanford University, Stanford, CA 94305
- Center for Neuroscience, University of California, Davis, CA 95616
- Jennifer L Raymond
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305
- Mark S Goldman
- Center for Neuroscience, University of California, Davis, CA 95616
- Departments of Neurobiology, Physiology, and Behavior, and Ophthalmology and Vision Science, University of California, Davis, CA 95616
6. Chintaluri C, Vogels TP. Metabolically regulated spiking could serve neuronal energy homeostasis and protect from reactive oxygen species. Proc Natl Acad Sci U S A 2023; 120:e2306525120. PMID: 37988463; PMCID: PMC10691349; DOI: 10.1073/pnas.2306525120.
Abstract
So-called spontaneous activity is a central hallmark of most nervous systems. Such non-causal firing is contrary to the tenet of spikes as a means of communication, and its purpose remains unclear. We propose that self-initiated firing can serve as a release valve to protect neurons from the toxic conditions arising in mitochondria from lower-than-baseline energy consumption. To demonstrate the viability of our hypothesis, we built a set of models that incorporate recent experimental results indicating homeostatic control of the metabolic products adenosine triphosphate (ATP), adenosine diphosphate (ADP), and reactive oxygen species (ROS) by changes in firing. We explore the relationship of metabolic cost of spiking with its effect on the temporal patterning of spikes and reproduce experimentally observed changes in intrinsic firing in the fruitfly dorsal fan-shaped body neuron in a model with ROS-modulated potassium channels. We also show that metabolic spiking homeostasis can produce indefinitely sustained avalanche dynamics in cortical circuits. Our theory can account for key features of neuronal activity observed in many studies ranging from ion channel function all the way to resting state dynamics. We finish with a set of experimental predictions that would confirm an integrated, crucial role for metabolically regulated spiking and firmly link metabolic homeostasis and neuronal function.
Affiliation(s)
- Chaitanya Chintaluri
- Institute of Science and Technology Austria, Klosterneuburg A-3400, Austria
- Centre for Neural Circuits and Behaviour, Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford OX1 3SR, United Kingdom
- Tim P. Vogels
- Institute of Science and Technology Austria, Klosterneuburg A-3400, Austria
7. Andrei AR, Akil AE, Kharas N, Rosenbaum R, Josić K, Dragoi V. Rapid compensatory plasticity revealed by dynamic correlated activity in monkeys in vivo. Nat Neurosci 2023; 26:1960-1969. PMID: 37828225; DOI: 10.1038/s41593-023-01446-w.
Abstract
To produce adaptive behavior, neural networks must balance between plasticity and stability. Computational work has demonstrated that network stability requires plasticity mechanisms to be counterbalanced by rapid compensatory processes. However, such processes have yet to be experimentally observed. Here we demonstrate that repeated optogenetic activation of excitatory neurons in monkey visual cortex (area V1) induces a population-wide dynamic reduction in the strength of neuronal interactions over the timescale of minutes during the awake state, but not during rest. This new form of rapid plasticity was observed only in the correlation structure, with firing rates remaining stable across trials. A computational network model operating in the balanced regime confirmed experimental findings and revealed that inhibitory plasticity is responsible for the decrease in correlated activity in response to repeated light stimulation. These results provide the first experimental evidence for rapid homeostatic plasticity that primarily operates during wakefulness, which stabilizes neuronal interactions during strong network co-activation.
Affiliation(s)
- Ariana R Andrei
- Department of Neurobiology and Anatomy, University of Texas, Houston, TX, USA.
- Alan E Akil
- Departments of Mathematics, Biology and Biochemistry, University of Houston, Houston, TX, USA
| | - Natasha Kharas
- Department of Neurobiology and Anatomy, University of Texas, Houston, TX, USA
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
- Krešimir Josić
- Departments of Mathematics, Biology and Biochemistry, University of Houston, Houston, TX, USA
- Valentin Dragoi
- Department of Neurobiology and Anatomy, University of Texas, Houston, TX, USA.
- Department of Electrical and Computer Engineering, Rice University, Houston, TX, USA.
8. Marom S, Marder E. A biophysical perspective on the resilience of neuronal excitability across timescales. Nat Rev Neurosci 2023; 24:640-652. PMID: 37620600; DOI: 10.1038/s41583-023-00730-9.
Abstract
Neuronal membrane excitability must be resilient to perturbations that can take place over timescales from milliseconds to months (or even years in long-lived animals). A great deal of attention has been paid to classes of homeostatic mechanisms that contribute to long-term maintenance of neuronal excitability through processes that alter a key structural parameter: the number of ion channel proteins present at the neuronal membrane. However, less attention has been paid to the self-regulating 'automatic' mechanisms that contribute to neuronal resilience by virtue of the kinetic properties of ion channels themselves. Here, we propose that these two sets of mechanisms are complementary instantiations of feedback control, together enabling resilience on a wide range of temporal scales. We further point to several methodological and conceptual challenges entailed in studying these processes - both of which involve enmeshed feedback control loops - and consider the consequences of these mechanisms of resilience.
Affiliation(s)
- Shimon Marom
- Faculty of Medicine, Technion - Israel Institute of Technology, Haifa, Israel
- Eve Marder
- Biology Department, Brandeis University, Waltham, MA, USA.
- Volen National Center for Complex Systems, Brandeis University, Waltham, MA, USA.
9. O'Donnell C. Nonlinear slow-timescale mechanisms in synaptic plasticity. Curr Opin Neurobiol 2023; 82:102778. PMID: 37657186; DOI: 10.1016/j.conb.2023.102778.
Abstract
Learning and memory rely on synapses changing their strengths in response to neural activity. However, there is a substantial gap between the timescales of neural electrical dynamics (1-100 ms) and organism behaviour during learning (seconds-minutes). What mechanisms bridge this timescale gap? What are the implications for theories of brain learning? Here I first cover experimental evidence for slow-timescale factors in plasticity induction. Then I review possible underlying cellular and synaptic mechanisms, and insights from recent computational models that incorporate such slow-timescale variables. I conclude that future progress in understanding brain learning across timescales will require both experimental and computational modelling studies that map out the nonlinearities implemented by both fast and slow plasticity mechanisms at synapses, and crucially, their joint interactions.
Affiliation(s)
- Cian O'Donnell
- School of Computing, Engineering, and Intelligent Systems, Magee Campus, Ulster University, Derry/Londonderry, UK; School of Computer Science, Electrical and Electronic Engineering, and Engineering Maths, University of Bristol, Bristol, UK.
10. Mitchell R, Shaverdian S, Dacke M, Webb B. A model of cue integration as vector summation in the insect brain. Proc Biol Sci 2023; 290:20230767. PMID: 37357865; PMCID: PMC10291719; DOI: 10.1098/rspb.2023.0767.
Abstract
Ball-rolling dung beetles are known to integrate multiple cues in order to facilitate their straight-line orientation behaviour. Recent work has suggested that orientation cues are integrated according to a vector sum, that is, compass cues are represented by vectors and summed to give a combined orientation estimate. Further, cue weight (vector magnitude) appears to be set according to cue reliability. This is consistent with the popular Bayesian view of cue integration: cues are integrated to reduce or minimize an agent's uncertainty about the external world. Integration of orientation cues is believed to occur at the input to the insect central complex. Here, we demonstrate that a model of the head direction circuit of the central complex, including plasticity in input synapses, can act as a substrate for cue integration as vector summation. Further, we show that cue influence is not necessarily driven by cue reliability. Finally, we present a dung beetle behavioural experiment which, in combination with simulation, strongly suggests that these beetles do not weight cues according to reliability. We suggest an alternative strategy whereby cues are weighted according to relative contrast, which can also explain previous results.
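The vector-sum scheme is easy to make concrete: each compass cue becomes a 2-D vector whose direction is the cue's heading and whose magnitude is its weight, and the combined estimate is the direction of the sum. The headings and weights below are invented for illustration:

```python
import numpy as np

def cue_vector(heading_deg, weight):
    """A compass cue as a 2-D vector: direction = cue heading,
    magnitude = cue weight (reliability, or relative contrast)."""
    th = np.deg2rad(heading_deg)
    return weight * np.array([np.cos(th), np.sin(th)])

def integrate(cues):
    """Combined heading estimate = direction of the vector sum."""
    total = sum(cue_vector(h, w) for h, w in cues)
    return np.rad2deg(np.arctan2(total[1], total[0])) % 360.0

mid = integrate([(0.0, 1.0), (60.0, 1.0)])     # equal weights: estimate falls midway (~30 deg)
pulled = integrate([(0.0, 3.0), (60.0, 1.0)])  # heavier cue pulls the estimate toward 0 deg
```

Whether the weights should track reliability (the Bayesian view) or relative contrast (the authors' alternative) changes only how `weight` is set, not the summation itself.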
Affiliation(s)
- Robert Mitchell
- Institute for Perception, Action, and Behaviour, School of Informatics, The University of Edinburgh, Edinburgh EH8 9AB, UK
- Shahrzad Shaverdian
- Lund Vision Group, Department of Biology, Lund University, Lund SE-223 62, Sweden
- Marie Dacke
- Lund Vision Group, Department of Biology, Lund University, Lund SE-223 62, Sweden
- Barbara Webb
- Institute for Perception, Action, and Behaviour, School of Informatics, The University of Edinburgh, Edinburgh EH8 9AB, UK
11. Wilmes KA, Clopath C. Dendrites help mitigate the plasticity-stability dilemma. Sci Rep 2023; 13:6543. PMID: 37085642; PMCID: PMC10121616; DOI: 10.1038/s41598-023-32410-0.
Abstract
With Hebbian learning 'who fires together wires together', well-known problems arise. Hebbian plasticity can cause unstable network dynamics and overwrite stored memories. Because the known homeostatic plasticity mechanisms tend to be too slow to combat unstable dynamics, it has been proposed that plasticity must be highly gated and synaptic strengths limited. While solving the issue of stability, gating and limiting plasticity does not solve the stability-plasticity dilemma. We propose that dendrites enable both stable network dynamics and considerable synaptic changes, as they allow the gating of plasticity in a compartment-specific manner. We investigate how gating plasticity influences network stability in plastic balanced spiking networks of neurons with dendrites. We compare how different ways to gate plasticity, namely via modulating excitability, learning rate, and inhibition increase stability. We investigate how dendritic versus perisomatic gating allows for different amounts of weight changes in stable networks. We suggest that the compartmentalisation of pyramidal cells enables dendritic synaptic changes while maintaining stability. We show that the coupling between dendrite and soma is critical for the plasticity-stability trade-off. Finally, we show that spatially restricted plasticity additionally improves stability.
Affiliation(s)
- Katharina A Wilmes
- Imperial College London, London, United Kingdom.
- University of Bern, Bern, Switzerland.
12. Chen S, Yang Q, Lim S. Efficient inference of synaptic plasticity rule with Gaussian process regression. iScience 2023; 26:106182. PMID: 36879810; PMCID: PMC9985048; DOI: 10.1016/j.isci.2023.106182.
Abstract
Finding the form of synaptic plasticity is critical to understanding its functions underlying learning and memory. We investigated an efficient method to infer synaptic plasticity rules in various experimental settings. We considered biologically plausible models fitting a wide range of in-vitro studies and examined the recovery of their firing-rate dependence from sparse and noisy data. Among the methods assuming low-rankness or smoothness of plasticity rules, Gaussian process regression (GPR), a nonparametric Bayesian approach, performs the best. Under the conditions measuring changes in synaptic weights directly or measuring changes in neural activities as indirect observables of synaptic plasticity, which leads to different inference problems, GPR performs well. Also, GPR could simultaneously recover multiple plasticity rules and robustly perform under various plasticity rules and noise levels. Such flexibility and efficiency, particularly at the low sampling regime, make GPR suitable for recent experimental developments and inferring a broader class of plasticity models.
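As a rough illustration of the GPR approach (not the authors' code; the ground-truth rule, kernel length scale, and noise level are assumptions introduced here), a GP posterior mean can recover a smooth rate-dependent plasticity rule from a dozen noisy samples:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ground-truth rule: weight change as a function of the
# postsynaptic firing rate (depression at low rates, potentiation at high).
def true_rule(r):
    return np.tanh(r - 2.0) - 0.3 * np.tanh(r - 0.5)

# Sparse, noisy measurements, as from an in vitro plasticity experiment
r_obs = np.linspace(0.0, 5.0, 12)
dw_obs = true_rule(r_obs) + 0.05 * rng.normal(size=r_obs.size)

def rbf(a, b, ell=0.8):
    """Squared-exponential kernel encoding the smoothness assumption."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

# GP posterior mean on a dense grid of rates (noise variance 0.05**2)
r_grid = np.linspace(0.0, 5.0, 100)
K = rbf(r_obs, r_obs) + 0.05**2 * np.eye(r_obs.size)
rule_est = rbf(r_grid, r_obs) @ np.linalg.solve(K, dw_obs)

max_err = np.max(np.abs(rule_est - true_rule(r_grid)))
```

The nonparametric prior is what makes this work in the low-sampling regime the abstract highlights: nothing about the functional form of the rule is assumed beyond smoothness.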
Affiliation(s)
- Shirui Chen
- Department of Applied Mathematics, University of Washington, Lewis Hall 201, Box 353925, Seattle, WA 98195-3925, USA; Neural Science, New York University Shanghai, 1555 Century Avenue, Shanghai, 200122, China
- Qixin Yang
- The Edmond and Lily Safra Center for Brain Sciences, The Hebrew University, The Suzanne and Charles Goodman Brain Sciences Building, Edmond J. Safra Campus, Jerusalem, 9190401, Israel; Neural Science, New York University Shanghai, 1555 Century Avenue, Shanghai, 200122, China
- Sukbin Lim
- Neural Science, New York University Shanghai, 1555 Century Avenue, Shanghai, 200122, China; NYU-ECNU Institute of Brain and Cognitive Science at NYU Shanghai, 3663 Zhongshan Road North, Shanghai, 200062, China
13. Damiani F, Cornuti S, Tognini P. The gut-brain connection: Exploring the influence of the gut microbiota on neuroplasticity and neurodevelopmental disorders. Neuropharmacology 2023; 231:109491. PMID: 36924923; DOI: 10.1016/j.neuropharm.2023.109491.
Abstract
Neuroplasticity refers to the ability of brain circuits to reorganize and change the properties of the network, resulting in alterations in brain function and behavior. It is traditionally believed that neuroplasticity is influenced by external stimuli, learning, and experience. Intriguingly, there is new evidence suggesting that endogenous signals from the body's periphery may play a role. The gut microbiota, a diverse community of microorganisms living in harmony with their host, may be able to influence plasticity through its modulation of the gut-brain axis. Interestingly, the maturation of the gut microbiota coincides with critical periods of neurodevelopment, during which neural circuits are highly plastic and potentially vulnerable. As such, dysbiosis (an imbalance in the gut microbiota composition) during early life may contribute to the disruption of normal developmental trajectories, leading to neurodevelopmental disorders. This review aims to examine the ways in which the gut microbiota can affect neuroplasticity. It will also discuss recent research linking gastrointestinal issues and bacterial dysbiosis to various neurodevelopmental disorders and their potential impact on neurological outcomes.
Affiliation(s)
- Sara Cornuti
- Laboratory of Biology, Scuola Normale Superiore, Pisa, Italy
- Paola Tognini
- Laboratory of Biology, Scuola Normale Superiore, Pisa, Italy; Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy.
14. Issa NP, Nunn KC, Wu S, Haider HA, Tao JX. Putative roles for homeostatic plasticity in epileptogenesis. Epilepsia 2023; 64:539-552. PMID: 36617338; PMCID: PMC10015501; DOI: 10.1111/epi.17500.
Abstract
Homeostatic plasticity allows neural circuits to maintain an average activity level while preserving the ability to learn new associations and efficiently transmit information. This dynamic process usually protects the brain from excessive activity, like seizures. However, in certain contexts, homeostatic plasticity might produce seizures, either in response to an acute provocation or more chronically as a driver of epileptogenesis. Here, we review three seizure conditions in which homeostatic plasticity likely plays an important role: acute drug withdrawal seizures, posttraumatic or disconnection epilepsy, and cyclic seizures. Identifying the homeostatic mechanisms active at different stages of development and in different circuits could allow better targeting of therapies, including determining when neuromodulation might be most effective, proposing ways to prevent epileptogenesis, and determining how to disrupt the cycle of recurring seizure clusters.
Affiliation(s)
- Naoum P. Issa
- Comprehensive Epilepsy Center, Department of Neurology, 5841 S. Maryland Ave., MC 2030, University of Chicago, Chicago, IL 60637
- Shasha Wu
- Comprehensive Epilepsy Center, Department of Neurology, 5841 S. Maryland Ave., MC 2030, University of Chicago, Chicago, IL 60637
- Hiba A. Haider
- Comprehensive Epilepsy Center, Department of Neurology, 5841 S. Maryland Ave., MC 2030, University of Chicago, Chicago, IL 60637
- James X. Tao
- Comprehensive Epilepsy Center, Department of Neurology, 5841 S. Maryland Ave., MC 2030, University of Chicago, Chicago, IL 60637

15
Synaptic plasticity in Schizophrenia pathophysiology. IBRO Neurosci Rep 2023. [DOI: 10.1016/j.ibneur.2023.01.008]
16
Lea-Carnall CA, Tanner LI, Montemurro MA. Noise-modulated multistable synapses in a Wilson-Cowan-based model of plasticity. Front Comput Neurosci 2023; 17:1017075. [PMID: 36817317 PMCID: PMC9931909 DOI: 10.3389/fncom.2023.1017075]
Abstract
Frequency-dependent plasticity refers to changes in synaptic strength in response to different stimulation frequencies. Resonance is known to be an important factor in such frequency dependence; however, the role of neural noise in the process remains elusive. Because the brain is an inherently noisy system, understanding the effects of noise may prove beneficial in shaping therapeutic interventions based on non-invasive brain stimulation protocols. The Wilson-Cowan (WC) model is a well-established description of the average dynamics of neural populations and has been shown to exhibit bistability in the presence of noise. However, the important question of how the different stable regimes in the WC model can affect synaptic plasticity when cortical populations interact has not yet been addressed. We therefore investigated plasticity dynamics in a WC-based model of interacting neural populations coupled with activity-dependent synapses, in which periodic stimulation was applied in the presence of noise of controlled intensity. The results indicate that synaptic strength can be optimized within a narrow range of noise variance. In particular, there is a regime of noise intensity in which synaptic strength exhibits three stable states. Regulating noise intensity affects the probability that the system settles into one of these states, thereby controlling plasticity. These results suggest that noise is a highly influential factor in determining the outcome of stimulation-induced plasticity.
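The interaction of bistability and noise described above can be illustrated with a minimal sketch (not the authors' model: a single self-excitatory Wilson-Cowan-type population with assumed parameters, integrated by the Euler-Maruyama method):

```python
import numpy as np

def simulate_wc(noise_sd, n_steps=20000, dt=1e-3, seed=0):
    """Euler-Maruyama simulation of a noisy Wilson-Cowan-type
    population: tau * dE/dt = -E + F(w*E + I) + noise. With the
    parameters below the deterministic system is bistable (stable
    states near E ~ 0.02 and E ~ 0.98); noise lets trajectories
    hop between them."""
    rng = np.random.default_rng(seed)
    F = lambda x: 1.0 / (1.0 + np.exp(-(x - 4.0)))  # sigmoid gain
    tau, w, I = 0.01, 8.0, 0.0   # illustrative parameters (assumed)
    E = np.empty(n_steps)
    E[0] = 0.1
    for t in range(1, n_steps):
        drift = (-E[t - 1] + F(w * E[t - 1] + I)) / tau
        E[t] = E[t - 1] + dt * drift \
               + noise_sd * np.sqrt(dt) * rng.standard_normal()
    return E
```

Without noise the population settles into whichever stable state its initial condition selects; increasing `noise_sd` controls the probability of switching between the low- and high-activity regimes.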
Affiliation(s)
- Caroline A Lea-Carnall
- School of Health Sciences, Manchester Academic Health Science Centre, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, United Kingdom
- Lisabel I Tanner
- School of Health Sciences, Manchester Academic Health Science Centre, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, United Kingdom
- Marcelo A Montemurro
- School of Mathematics and Statistics, Faculty of Science, Technology, Engineering and Mathematics, The Open University, Milton Keynes, United Kingdom

17
Kim HH, Lee SH, Ho WK, Eom K. Dopamine Receptor Supports the Potentiation of Intrinsic Excitability and Synaptic LTD in Temporoammonic-CA1 Synapse. Exp Neurobiol 2022; 31:361-375. [PMID: 36631845 PMCID: PMC9841748 DOI: 10.5607/en22028]
Abstract
Dopaminergic projection to the hippocampus from the ventral tegmental area or locus ceruleus has been considered to play an essential role in the acquisition of novel information. Hence, the dopaminergic modulation of synaptic plasticity in the hippocampus has been widely studied. We examined how the D1 and D2 receptors influenced the mGluR5-mediated synaptic plasticity of the temporoammonic-CA1 synapses and showed that the dopaminergic modulation of the temporoammonic-CA1 synapses was expressed in various ways. Our findings suggest that the dopaminergic system in the hippocampal CA1 region regulates the long-term synaptic plasticity and processing of the novel information.
Affiliation(s)
- Hye-Hyun Kim
- Department of Physiology, Seoul National University College of Medicine, Seoul 03080, Korea; Neuroscience Research Center, Seoul National University College of Medicine, Seoul 03080, Korea
- Suk-Ho Lee
- Department of Physiology, Seoul National University College of Medicine, Seoul 03080, Korea; Neuroscience Research Center, Seoul National University College of Medicine, Seoul 03080, Korea
- Won-Kyung Ho
- Department of Physiology, Seoul National University College of Medicine, Seoul 03080, Korea; Neuroscience Research Center, Seoul National University College of Medicine, Seoul 03080, Korea
- Kisang Eom
- Department of Physiology, School of Medicine, Keimyung University, Daegu 42601, Korea. To whom correspondence should be addressed.

18
Miehl C, Gjorgjieva J. Stability and learning in excitatory synapses by nonlinear inhibitory plasticity. PLoS Comput Biol 2022; 18:e1010682. [PMID: 36459503 PMCID: PMC9718420 DOI: 10.1371/journal.pcbi.1010682]
Abstract
Synaptic changes are hypothesized to underlie learning and memory formation in the brain. But without additional homeostatic mechanisms, Hebbian synaptic plasticity of excitatory synapses on its own is unstable, leading to either unlimited growth of synaptic strengths or silencing of neuronal activity. To control excitatory synaptic strengths, we propose a novel form of synaptic plasticity at inhibitory synapses. Using computational modeling, we suggest two key features of inhibitory plasticity: dominance of inhibition over excitation, and a nonlinear dependence on the firing rate of postsynaptic excitatory neurons whereby inhibitory synaptic strengths change with the same sign (potentiate or depress) as excitatory synaptic strengths. We demonstrate that the stable synaptic strengths realized by this inhibitory plasticity model affect excitatory/inhibitory weight ratios in agreement with experimental results. Applying a disinhibitory signal can gate plasticity and lead to the generation of receptive fields and strong bidirectional connectivity in a recurrent network. Hence, a novel form of nonlinear inhibitory plasticity can simultaneously stabilize excitatory synaptic strengths and enable learning upon disinhibition.
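The stabilization logic can be caricatured in a few lines (an assumed functional form for illustration, not the paper's exact equations): both excitatory and inhibitory weights potentiate above a postsynaptic rate threshold `rho` and depress below it, but inhibition learns faster, so the postsynaptic rate converges to `rho` instead of running away or silencing.

```python
import numpy as np

def simulate(n_steps=20000, dt=0.1):
    """Rate-based caricature: Hebbian excitatory plasticity with an
    LTP/LTD threshold is unstable on its own (runaway above rho,
    silencing below); an inhibitory rule whose changes have the same
    sign but a larger learning rate pins the rate at rho."""
    xE, xI = 1.0, 1.0            # fixed presynaptic rates
    wE, wI = 6.0, 0.5            # plastic weights (start above threshold)
    rho = 5.0                    # postsynaptic threshold rate
    etaE, etaI = 1e-3, 5e-3      # inhibition dominates: etaI > etaE
    for _ in range(n_steps):
        r = max(wE * xE - wI * xI, 0.0)      # postsynaptic rate
        dwE = etaE * xE * r * (r - rho)      # sign flips at r = rho
        dwI = etaI * xI * r * (r - rho)      # same sign, larger gain
        wE, wI = wE + dt * dwE, max(wI + dt * dwI, 0.0)
    return wE, wI, max(wE * xE - wI * xI, 0.0)
```

With `etaI > etaE`, the net rate dynamics dr/dt = (etaE - etaI)·r·(r - rho) make r = rho a stable fixed point, at which both weight updates vanish, so the excitatory weights also stop growing.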
Affiliation(s)
- Christoph Miehl
- Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- School of Life Sciences, Technical University of Munich, Freising, Germany
- Julijana Gjorgjieva
- Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- School of Life Sciences, Technical University of Munich, Freising, Germany

19
Schumm SN, Gabrieli D, Meaney DF. Plasticity impairment alters community structure but permits successful pattern separation in a hippocampal network model. Front Cell Neurosci 2022; 16:977769. [PMID: 36505514 PMCID: PMC9729278 DOI: 10.3389/fncel.2022.977769]
Abstract
Patients who suffer from traumatic brain injury (TBI) often complain of learning and memory problems. Their symptoms are principally mediated by the hippocampus and the ability to adapt to stimuli, also known as neural plasticity. Therefore, one plausible injury mechanism is plasticity impairment, which currently lacks comprehensive investigation across TBI research. For these studies, we used a computational network model of the hippocampus that includes the dentate gyrus, CA3, and CA1 with neuron-scale resolution. We simulated mild injury through weakened spike-timing-dependent plasticity (STDP), which modulates synaptic weights according to causal spike timing. In preliminary work, we found functional deficits consisting of decreased firing rates and broadband power in areas CA3 and CA1 after STDP impairment. To address structural changes in these studies, we applied modularity analysis to evaluate how STDP impairment modifies community structure in the hippocampal network. We also studied the emergent function of network-based learning and found that impaired networks could acquire conditioned responses after training, but the magnitude of the response was significantly lower. Furthermore, we examined pattern separation, a prerequisite of learning, by entraining two overlapping patterns. Contrary to our initial hypothesis, impaired networks did not exhibit deficits in pattern separation with either population- or rate-based coding. Collectively, these results demonstrate how a mechanism of injury that operates at the synapse regulates circuit function.
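The weakened STDP used as the injury mechanism can be sketched as a standard pair-based rule scaled by an impairment factor (the factor and all parameters below are illustrative, not the study's values):

```python
import numpy as np

def stdp_dw(dt_spike, a_plus=0.01, a_minus=0.012, tau=20.0, impairment=1.0):
    """Pair-based STDP: potentiation when the presynaptic spike leads
    the postsynaptic spike (dt_spike > 0, in ms), depression otherwise,
    both decaying exponentially with the spike-time lag. `impairment`
    < 1 uniformly scales learning down, a simple stand-in for the
    weakened STDP modeling mild injury."""
    if dt_spike > 0:   # pre before post -> LTP
        return impairment * a_plus * np.exp(-dt_spike / tau)
    else:              # post before pre -> LTD
        return -impairment * a_minus * np.exp(dt_spike / tau)
```

Setting `impairment` between 0 and 1 shrinks every weight update without changing the rule's shape, which is one simple way to model a graded plasticity deficit.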
Affiliation(s)
- Samantha N. Schumm
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, PA, United States
- David Gabrieli
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, PA, United States
- David F. Meaney
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, PA, United States
- Department of Neurosurgery, Penn Center for Brain Injury and Repair, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, United States

20
Leisman G. On the Application of Developmental Cognitive Neuroscience in Educational Environments. Brain Sci 2022; 12:1501. [PMID: 36358427 PMCID: PMC9688360 DOI: 10.3390/brainsci12111501]
Abstract
The paper overviews components of neurologic processing efficiency in order to develop innovative methodologies and thinking for school-based applications, and to ground changes in educational leadership in sound findings from the cognitive neurosciences applied to schools and learners. Systems science can allow us to better manage classroom-based learning and instruction on the basis of relatively easily evaluated efficiencies or inefficiencies and optimization, instead of simply examining achievement. "Medicalizing" the learning process with concepts such as "learning disability", or employing grading methods such as pass-fail, does little to aid in understanding the processes that learners employ to acquire, integrate, remember, and apply information learned. The paper concludes with an overview of, and references to, tools that allow a better focus on nervous-system-based strategic approaches to classroom learning.
Affiliation(s)
- Gerry Leisman
- Movement and Cognition Laboratory, Department of Physical Therapy, University of Haifa, Haifa 3498838, Israel
- Department of Neurology, Universidad de Ciencias Médicas de la Habana, Havana 11300, Cuba

21
Zhang K, Liao P, Wen J, Hu Z. Synaptic plasticity in schizophrenia pathophysiology. IBRO Neurosci Rep 2022; 13:478-487. [PMID: 36590092 PMCID: PMC9795311 DOI: 10.1016/j.ibneur.2022.10.008]
Abstract
Schizophrenia is a severe neuropsychiatric syndrome with psychotic behavioral abnormalities and marked cognitive deficits. It is widely accepted that genetic and environmental factors contribute to the onset of schizophrenia. However, the etiology and pathology of the disease remain largely unexplored. Recently, synaptopathology and dysregulated synaptic plasticity and function have emerged as intriguing and prominent biological mechanisms of schizophrenia pathogenesis. Synaptic plasticity is the ability of neurons to change the strength of their connections in response to internal or external stimuli; it is essential for brain development and function, learning and memory, and the vast majority of behavioral responses relevant to psychiatric diseases, including schizophrenia. Here, we review the molecular and cellular mechanisms of the multiple forms of synaptic plasticity, and how schizophrenia risk factors, including disease-susceptibility genes and environmental alterations, regulate synaptic plasticity and animal behavior. Recent genome-wide association studies have identified hundreds of risk gene variants associated with schizophrenia; further clarifying the role of these genes in synaptic transmission and plasticity will advance our understanding of schizophrenia pathology, as well as the molecular mechanisms of synaptic plasticity.
Affiliation(s)
- Kexuan Zhang
- Hunan Key Laboratory of Molecular Precision Medicine, Department of Critical Care Medicine, Xiangya Hospital, Central South University, Changsha 410008, Hunan, PR China; Center for Medical Genetics, School of Life Sciences, Central South University, Changsha 410008, Hunan, PR China
- Panlin Liao
- Hunan Key Laboratory of Molecular Precision Medicine, Department of Critical Care Medicine, Xiangya Hospital, Central South University, Changsha 410008, Hunan, PR China; National Clinical Research Center for Geriatric Diseases, Xiangya Hospital, Central South University, Changsha 410008, Hunan, PR China
- Jin Wen
- Hunan Key Laboratory of Molecular Precision Medicine, Department of Critical Care Medicine, Xiangya Hospital, Central South University, Changsha 410008, Hunan, PR China; National Clinical Research Center for Geriatric Diseases, Xiangya Hospital, Central South University, Changsha 410008, Hunan, PR China
- Zhonghua Hu
- Hunan Key Laboratory of Molecular Precision Medicine, Department of Critical Care Medicine, Xiangya Hospital, Central South University, Changsha 410008, Hunan, PR China; Center for Medical Genetics, School of Life Sciences, Central South University, Changsha 410008, Hunan, PR China; National Clinical Research Center for Geriatric Diseases, Xiangya Hospital, Central South University, Changsha 410008, Hunan, PR China; Hunan Provincial Clinical Research Center for Critical Care Medicine, Xiangya Hospital, Central South University, Changsha 410008, Hunan, PR China; Hunan Key Laboratory of Animal Models for Human Diseases, School of Life Sciences, Central South University, Changsha 410008, Hunan, PR China; Key Laboratory of Hunan Province in Neurodegenerative Disorders, Central South University, Changsha 410008, Hunan, PR China. Correspondence to: Institute of Molecular Precision Medicine and Hunan Key Laboratory of Molecular Precision Medicine, Xiangya Hospital, Central South University, 87 Xiangya Rd, Changsha, Hunan, PR China.

22
Wang MB, Halassa MM. Thalamocortical contribution to flexible learning in neural systems. Netw Neurosci 2022; 6:980-997. [PMID: 36875011 PMCID: PMC9976647 DOI: 10.1162/netn_a_00235]
Abstract
Animal brains evolved to optimize behavior in dynamic environments, flexibly selecting actions that maximize future rewards in different contexts. A large body of experimental work indicates that such optimization changes the wiring of neural circuits, appropriately mapping environmental input onto behavioral outputs. A major unsolved scientific question is how optimal wiring adjustments, which must target the connections responsible for rewards, can be accomplished when the relation of sensory inputs, actions taken, and environmental context to rewards is ambiguous. This credit assignment problem can be categorized into context-independent structural credit assignment and context-dependent continual learning. In this perspective, we survey prior approaches to these two problems and advance the notion that the brain's specialized neural architectures provide efficient solutions. Within this framework, the thalamus, with its cortical and basal ganglia interactions, serves as a systems-level solution to credit assignment. Specifically, we propose that thalamocortical interaction is the locus of meta-learning, where the thalamus provides cortical control functions that parametrize the cortical activity association space. By selecting among these control functions, the basal ganglia hierarchically guide thalamocortical plasticity across two timescales to enable meta-learning. The faster timescale establishes contextual associations to enable behavioral flexibility, while the slower one enables generalization to new contexts.
Affiliation(s)
- Mien Brabeeba Wang
- Department of Brain and Cognitive Science, Massachusetts Institute of Technology, Cambridge, MA, USA
- Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, MA, USA
- Michael M. Halassa
- Department of Brain and Cognitive Science, Massachusetts Institute of Technology, Cambridge, MA, USA

23
Multilevel development of cognitive abilities in an artificial neural network. Proc Natl Acad Sci U S A 2022; 119:e2201304119. [PMID: 36122214 PMCID: PMC9522351 DOI: 10.1073/pnas.2201304119]
Abstract
Several neuronal mechanisms have been proposed to account for the formation of cognitive abilities through postnatal interactions with the physical and sociocultural environment. Here, we introduce a three-level computational model of information processing and acquisition of cognitive abilities. We propose minimal architectural requirements to build these levels and examine how their parameters affect performance and the relationships between levels. The first, sensorimotor level handles local nonconscious processing, here during a visual classification task. The second, cognitive level globally integrates the information from multiple local processors via long-ranged connections and synthesizes it in a global, but still nonconscious, manner. The third and cognitively highest level handles the information globally and consciously. It is based on the global neuronal workspace (GNW) theory and is referred to as the conscious level. We use the trace and delay conditioning tasks to challenge the second and third levels, respectively. Results first highlight the necessity of epigenesis through the selection and stabilization of synapses at both local and global scales to allow the network to solve the first two tasks. At the global scale, dopamine appears necessary to properly provide credit assignment despite the temporal delay between perception and reward. At the third level, the presence of interneurons becomes necessary to maintain a self-sustained representation within the GNW in the absence of sensory input. Finally, while balanced spontaneous intrinsic activity facilitates epigenesis at both local and global scales, a balanced excitatory/inhibitory ratio increases performance. We discuss the plausibility of the model in both neurodevelopmental and artificial intelligence terms.
24
Synaptic balancing: A biologically plausible local learning rule that provably increases neural network noise robustness without sacrificing task performance. PLoS Comput Biol 2022; 18:e1010418. [PMID: 36121844 PMCID: PMC9522011 DOI: 10.1371/journal.pcbi.1010418]
Abstract
We introduce a novel, biologically plausible local learning rule that provably increases the robustness of neural dynamics to noise in nonlinear recurrent neural networks with homogeneous nonlinearities. Our learning rule achieves higher noise robustness without sacrificing performance on the task and without requiring any knowledge of the particular task. The plasticity dynamics, an integrable dynamical system operating on the weights of the network, maintains a multiplicity of conserved quantities, most notably the network's entire temporal map of input to output trajectories. The outcome of our learning rule is a synaptic balancing between the incoming and outgoing synapses of every neuron. This synaptic balancing rule is consistent with many known aspects of experimentally observed heterosynaptic plasticity, and moreover makes new experimentally testable predictions relating plasticity at the incoming and outgoing synapses of individual neurons. Overall, this work provides a novel, practical local learning rule that exactly preserves overall network function and, in doing so, provides new conceptual bridges between the disparate worlds of the neurobiology of heterosynaptic plasticity, the engineering of regularized noise-robust networks, and the mathematics of integrable Lax dynamical systems. The precise pattern of synaptic connections in a neural network entirely determines the network's function. Learning rules are programs for modifying synapses to allow the network to attain some functional goal. By carefully tweaking synaptic connectivity, neural networks are able to learn new tasks, form new memories, and stabilize neural activity. Due to the complexity of network structure, there are many different synaptic weight patterns that can in principle perform the same computation. These different solutions, however, may be more or less desirable in other ways, such as their vulnerability to unreliable components in the network.
In this paper, we present a synaptic learning rule which locates robust solutions among many synaptic weight configurations that equivalently perform the same task. Our biologically plausible rule relies on information which is entirely locally available to each neuron and possesses a number of desirable mathematical properties. Our analysis reveals that networks are most robust when the aggregate incoming and outgoing synapses are balanced in every neuron. Intriguingly, our learning rule makes predictions which resemble widely experimentally observed patterns of compensatory heterosynaptic plasticity in cortical dendrites.
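A sketch of such a balancing rule (our illustrative construction, not the paper's exact dynamics): each neuron compares the aggregate squared strength of its incoming and outgoing synapses, quantities that are locally available to it, and rescales the two groups in opposite directions. Because each rescaling is a diagonal similarity transformation of the weight matrix, function-relevant quantities such as its eigenvalues are exactly preserved.

```python
import numpy as np

def imbalance(W):
    """Total mismatch between incoming and outgoing synaptic strength."""
    p_in = np.sum(W ** 2, axis=1)    # row i: incoming synapses of neuron i
    p_out = np.sum(W ** 2, axis=0)   # column i: outgoing synapses
    return np.sum((p_in - p_out) ** 2)

def balancing_step(W, eta=0.02):
    """One pass of the local rule: neuron i scales its incoming weights
    by exp(a) and its outgoing weights by exp(-a), where a depends only
    on i's own incoming/outgoing strengths. Each update is a similarity
    transform, so the eigenvalues of W are unchanged."""
    W = W.copy()
    for i in range(W.shape[0]):
        a = eta * (np.sum(W[:, i] ** 2) - np.sum(W[i, :] ** 2))
        W[i, :] *= np.exp(a)         # strengthen/weaken inputs
        W[:, i] *= np.exp(-a)        # and oppositely scale outputs
    return W
```

Iterating `balancing_step` drives the per-neuron imbalance toward zero while leaving the spectrum of the weight matrix untouched, mirroring the paper's claim that the rule relocates the network within a family of functionally equivalent weight configurations.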
25
Developmental depression-to-facilitation shift controls excitation-inhibition balance. Commun Biol 2022; 5:873. [PMID: 36008708 PMCID: PMC9411206 DOI: 10.1038/s42003-022-03801-2]
Abstract
Changes in the short-term dynamics of excitatory synapses over development have been observed throughout cortex, but their purpose and consequences remain unclear. Here, we propose that developmental changes in synaptic dynamics buffer the effect of slow inhibitory long-term plasticity, allowing for continuously stable neural activity. Using computational modelling we demonstrate that early in development excitatory short-term depression quickly stabilises neural activity, even in the face of strong, unbalanced excitation. We introduce a model of the commonly observed developmental shift from depression to facilitation and show that neural activity remains stable throughout development, while inhibitory synaptic plasticity slowly balances excitation, consistent with experimental observations. Our model predicts changes in the input responses from phasic to phasic-and-tonic and more precise spike timings. We also observe a gradual emergence of short-lasting memory traces governed by the development of short-term plasticity. We conclude that the developmental depression-to-facilitation shift may control the excitation-inhibition balance throughout development, with important functional consequences.
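The depression-to-facilitation shift can be illustrated with the standard Tsodyks-Markram short-term plasticity model (a common formalism; the parameters below are illustrative, not the paper's): a high baseline release probability `U` yields depressing synapses, as in early development, while a low `U` yields facilitating ones.

```python
import numpy as np

def tm_efficacies(spike_times, U, tau_d=200.0, tau_f=600.0):
    """Tsodyks-Markram model: x = available resources (recover with
    time constant tau_d), u = utilization (decays back to U with
    tau_f). Returns the synaptic efficacy u*x released at each spike
    (times in ms)."""
    x, u = 1.0, 0.0
    last_t = None
    eff = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_d)  # resources recover
            u = U + (u - U) * np.exp(-dt / tau_f)      # facilitation decays
        u += U * (1.0 - u)      # utilization jumps at spike arrival
        eff.append(u * x)       # released efficacy
        x -= u * x              # deplete resources
        last_t = t
    return np.array(eff)
```

Sweeping `U` downward over simulated development moves the same synapse from a depressing to a facilitating response to the identical spike train.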
26
Wang B, Aljadeff J. Multiplicative Shot-Noise: A New Route to Stability of Plastic Networks. Phys Rev Lett 2022; 129:068101. [PMID: 36018633 DOI: 10.1103/physrevlett.129.068101]
Abstract
Fluctuations of synaptic weights, among many other physical, biological, and ecological quantities, are driven by coincident events of two "parent" processes. We propose a multiplicative shot-noise model that can capture the behaviors of a broad range of such natural phenomena, and analytically derive an approximation that accurately predicts its statistics. We apply our results to study the effects of a multiplicative synaptic plasticity rule that was recently extracted from measurements in physiological conditions. Using mean-field theory analysis and network simulations, we investigate how this rule shapes the connectivity and dynamics of recurrent spiking neural networks. The multiplicative plasticity rule is shown to support efficient learning of input stimuli, and it gives a stable, unimodal synaptic-weight distribution with a large fraction of strong synapses. The strong synapses remain stable over long times but do not "run away." Our results suggest that the multiplicative shot-noise offers a new route to understand the tradeoff between flexibility and stability in neural circuits and other dynamic networks.
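A toy version of such a process (an assumed minimal form, not the paper's derivation): a weight takes multiplicative up-jumps at coincident events of two Poisson "parent" trains and otherwise relaxes toward a baseline, producing a stationary weight distribution rather than runaway growth.

```python
import numpy as np

def simulate_weight(n_steps=100000, dt=1e-3, seed=0):
    """Multiplicative shot-noise sketch: jumps proportional to the
    current weight occur at coincidences of two independent Poisson
    processes, balanced by slow relaxation toward a baseline w0."""
    rng = np.random.default_rng(seed)
    rate = 20.0          # rate of each parent process (Hz)
    c = 0.2              # relative jump size at a coincidence
    w0, tau = 0.5, 10.0  # baseline and relaxation time constant (s)
    w = 1.0
    ws = np.empty(n_steps)
    for t in range(n_steps):
        coincident = (rng.random() < rate * dt) and (rng.random() < rate * dt)
        if coincident:
            w *= 1.0 + c                 # multiplicative "shot"
        w += dt * (w0 - w) / tau         # homeostatic relaxation
        ws[t] = w
    return ws
```

Because the jumps scale with the current weight while the relaxation pulls harder on large weights, strong weights fluctuate around a finite mean instead of running away, the qualitative behavior the abstract describes.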
Affiliation(s)
- Bin Wang
- Department of Physics, University of California San Diego, La Jolla, California 92093, USA
- Johnatan Aljadeff
- Department of Neurobiology, University of California San Diego, La Jolla, California 92093, USA

27
Gu J, Lim S. Unsupervised learning for robust working memory. PLoS Comput Biol 2022; 18:e1009083. [PMID: 35500033 PMCID: PMC9098088 DOI: 10.1371/journal.pcbi.1009083]
Abstract
Working memory is a core component of critical cognitive functions such as planning and decision-making. Persistent activity that lasts long after the stimulus offset has been considered a neural substrate for working memory. Attractor dynamics based on network interactions can successfully reproduce such persistent activity. However, it requires fine-tuning of network connectivity, in particular to form the continuous attractors that have been suggested for encoding continuous signals in working memory. Here, we investigate whether a specific form of synaptic plasticity rules can mitigate such tuning problems in two representative working memory models, namely, rate-coded and location-coded persistent activity. We consider two prominent types of plasticity rules: differential plasticity, correcting rapid activity changes, and homeostatic plasticity, regularizing the long-term average of activity, both of which have been proposed to fine-tune the weights in an unsupervised manner. Consistent with the findings of previous work, differential plasticity alone was enough to recover graded persistent activity after perturbations in the connectivity. For the location-coded memory, differential plasticity could also recover persistent activity. However, its pattern can be irregular for different stimulus locations under slow learning speeds or large perturbations in the connectivity. On the other hand, homeostatic plasticity shows a robust recovery of smooth spatial patterns under particular types of synaptic perturbations, such as perturbations in incoming synapses onto the entire or local populations. However, homeostatic plasticity was not effective against perturbations in outgoing synapses from local populations. Instead, combining it with differential plasticity recovers location-coded persistent activity for a broader range of perturbations, suggesting compensation between the two plasticity rules.
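The logic of differential plasticity can be sketched in the simplest possible case (a single linear rate unit, our illustrative reduction, not the paper's network models): persistent activity requires the recurrent weight w = 1, and after a perturbation a rule of the assumed form dw ∝ -(dr/dt)·r opposes the rapid activity drift and retunes w.

```python
import numpy as np

def recover_persistence(w0=0.9, n_steps=5000, dt=1e-3, tau=0.1, eta=5.0):
    """Linear rate unit tau * dr/dt = (w - 1) * r: activity persists
    only when w = 1. Differential plasticity dw = -eta * (dr/dt) * r
    counteracts the decay (or growth) of activity and drives w back
    to the tuned integrator point."""
    r, w = 1.0, w0   # activity left by a stimulus; perturbed weight
    for _ in range(n_steps):
        drdt = (w - 1.0) * r / tau
        w += dt * (-eta * drdt * r)   # potentiate while activity decays
        r += dt * drdt
    return r, w
```

When learning is fast relative to the activity decay, the weight reaches w = 1 before the memory trace is lost, so the unit again sustains (slightly reduced) persistent activity.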
Affiliation(s)
- Jintao Gu
- Neural Science, New York University Shanghai, Shanghai, China
- Sukbin Lim
- Neural Science, New York University Shanghai, Shanghai, China
- NYU-ECNU Institute of Brain and Cognitive Science at NYU Shanghai, Shanghai, China

28
Gonzalez KC, Losonczy A, Negrean A. Dendritic Excitability and Synaptic Plasticity In Vitro and In Vivo. Neuroscience 2022; 489:165-175. [PMID: 34998890 PMCID: PMC9392867 DOI: 10.1016/j.neuroscience.2021.12.039]
Abstract
Much of our understanding of dendritic and synaptic physiology comes from in vitro experimentation, where the afforded mechanical stability and the convenience of applying drugs have allowed patch-clamp-based recording techniques to investigate ion channel distributions and gating kinetics, and to uncover rules of dendritic integration and synaptic plasticity. However, with current efforts to study these questions in vivo, there is a great need to translate existing knowledge between in vitro and in vivo experimental conditions. In this review, we identify discrepancies between the in vitro and in vivo ionic composition of extracellular media and discuss how changes in ionic composition alter dendritic excitability and plasticity induction. Here, we argue that under physiological in vivo ionic conditions, dendrites are expected to be more excitable and the threshold for synaptic plasticity induction to be lower. Consequently, the plasticity rules described in vitro vary significantly from those implemented in vivo.
Collapse
Affiliation(s)
- Kevin C Gonzalez
- Department of Neuroscience, Columbia University, New York, NY, USA; Mortimer B. Zuckerman Mind Brain Behavior Institute, New York, NY, USA.
| | - Attila Losonczy
- Department of Neuroscience, Columbia University, New York, NY, USA; Mortimer B. Zuckerman Mind Brain Behavior Institute, New York, NY, USA; Kavli Institute for Brain Science, New York, NY, USA.
| | - Adrian Negrean
- Department of Neuroscience, Columbia University, New York, NY, USA; Mortimer B. Zuckerman Mind Brain Behavior Institute, New York, NY, USA.
| |
Collapse
|
29
|
Wesselink DB, Sanders ZB, Edmondson LR, Dempsey-Jones H, Kieliba P, Kikkert S, Themistocleous AC, Emir U, Diedrichsen J, Saal HP, Makin TR. Malleability of the cortical hand map following a finger nerve block. SCIENCE ADVANCES 2022; 8:eabk2393. [PMID: 35452294 PMCID: PMC9032959 DOI: 10.1126/sciadv.abk2393] [Citation(s) in RCA: 11] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/10/2023]
Abstract
Electrophysiological studies in monkeys show that finger amputation triggers local remapping within the deprived primary somatosensory cortex (S1). Human neuroimaging research, however, shows persistent S1 representation of the missing hand's fingers, even decades after amputation. Here, we explore whether this apparent contradiction stems from underestimating the distributed peripheral and central representation of fingers in the hand map. Using pharmacological single-finger nerve block and 7-tesla neuroimaging, we first replicated previous accounts (electrophysiological and other) of local S1 remapping. Local blocking also triggered activity changes to nonblocked fingers across the entire hand area. Using methods exploiting interfinger representational overlap, however, we also show that the blocked finger representation remained persistent despite input loss. Computational modeling suggests that both local stability and global reorganization are driven by distributed processing underlying the topographic map, combined with homeostatic mechanisms. Our findings reveal complex interfinger representational features that play a key role in brain (re)organization, beyond (re)mapping.
Collapse
Affiliation(s)
- Daan B. Wesselink
- Institute of Cognitive Neuroscience, University College London, London, UK
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, UK
- Department of Neurobiology, Harvard Medical School, Boston, MA, USA
- Corresponding author.
| | - Zeena-Britt Sanders
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, UK
| | - Laura R. Edmondson
- Active Touch Laboratory, Department of Psychology, The University of Sheffield, Sheffield, UK
| | - Harriet Dempsey-Jones
- Institute of Cognitive Neuroscience, University College London, London, UK
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, UK
- School of Psychology, University of Queensland, Brisbane, Australia
| | - Paulina Kieliba
- Institute of Cognitive Neuroscience, University College London, London, UK
| | - Sanne Kikkert
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, UK
| | - Andreas C. Themistocleous
- Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK
- Brain Function Research Group, University of the Witwatersrand, Johannesburg, South Africa
| | - Uzay Emir
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, UK
| | - Jörn Diedrichsen
- Brain and Mind Institute, University of Western Ontario, London, Canada
| | - Hannes P. Saal
- Active Touch Laboratory, Department of Psychology, The University of Sheffield, Sheffield, UK
| | - Tamar R. Makin
- Institute of Cognitive Neuroscience, University College London, London, UK
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, UK
- Wellcome Centre for Human Neuroimaging, University College London, London, UK
| |
Collapse
|
30
|
d'Aquin S, Szonyi A, Mahn M, Krabbe S, Gründemann J, Lüthi A. Compartmentalized dendritic plasticity during associative learning. Science 2022; 376:eabf7052. [PMID: 35420958 DOI: 10.1126/science.abf7052] [Citation(s) in RCA: 9] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/03/2023]
Abstract
Experience-dependent changes in behavior are mediated by long-term functional modifications in brain circuits. Activity-dependent plasticity of synaptic input is a major underlying cellular process. Although we have a detailed understanding of synaptic and dendritic plasticity in vitro, little is known about the functional and plastic properties of active dendrites in behaving animals. Using deep brain two-photon Ca2+ imaging, we investigated how sensory responses in amygdala principal neurons develop upon classical fear conditioning, a form of associative learning. Fear conditioning induced differential plasticity in dendrites and somas regulated by compartment-specific inhibition. Our results indicate that learning-induced plasticity can be uncoupled between soma and dendrites, reflecting distinct synaptic and microcircuit-level mechanisms that increase the computational capacity of amygdala circuits.
Collapse
Affiliation(s)
- Simon d'Aquin
- Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland; University of Basel, Basel, Switzerland
| | - Andras Szonyi
- Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland; Laboratory of Cellular Neurophysiology, Institute of Experimental Medicine, Budapest, Hungary
| | - Mathias Mahn
- Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland
| | - Sabine Krabbe
- Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland
| | - Jan Gründemann
- Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland; Department of Biomedicine, University of Basel, Basel, Switzerland
| | - Andreas Lüthi
- Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland; University of Basel, Basel, Switzerland
| |
Collapse
|
31
|
Chen H, Xie L, Wang Y, Zhang H. Postsynaptic Potential Energy as Determinant of Synaptic Plasticity. Front Comput Neurosci 2022; 16:804604. [PMID: 35250524 PMCID: PMC8891168 DOI: 10.3389/fncom.2022.804604] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2021] [Accepted: 01/13/2022] [Indexed: 02/06/2023] Open
Abstract
Metabolic energy can be used as a unifying principle to control neuronal activity. However, whether and how metabolic energy alone can determine the outcome of synaptic plasticity remains unclear. This study proposes a computational model of synaptic plasticity that is completely determined by energy. A simple quantitative relationship between synaptic plasticity and postsynaptic potential energy is established: synaptic weight is directly proportional to the difference between the baseline potential energy and the suprathreshold potential energy and is constrained by the maximum energy supply. Results show that the energy constraint improves the performance of synaptic plasticity and avoids setting a hard boundary on synaptic weights. With the same set of model parameters, our model can reproduce several classical experiments in homo- and heterosynaptic plasticity. The proposed model can explain how Hebbian and homeostatic plasticity interact at the cellular level. Homeostatic synaptic plasticity coexists at different time scales: operating on a long time scale, it is caused by heterosynaptic plasticity, while on the same time scale as Hebbian synaptic plasticity, it is caused by the constraint on energy supply.
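The stated relationship can be written down directly. The sketch below is one hedged reading of the rule; the function name `energy_weight` and its constants are assumptions for illustration, not the paper's code.

```python
def energy_weight(e_baseline, e_supra, k=1.0, e_max=10.0):
    """Weight proportional to the difference between baseline and
    suprathreshold postsynaptic potential energy, capped by the
    maximum energy supply e_max (all units arbitrary)."""
    delta = e_baseline - e_supra
    delta = min(delta, e_max)   # the energy supply constraint bounds potentiation
    return k * delta
```

Note that no hard lower bound on the weight is imposed; only the supply cap limits growth, matching the abstract's claim that the energy constraint replaces hard weight boundaries.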
Collapse
Affiliation(s)
- Huanwen Chen
- School of Automation, Central South University, Changsha, China
- *Correspondence: Huanwen Chen
| | - Lijuan Xie
- Institute of Physiology and Psychology, School of Marxism, Changsha University of Science and Technology, Changsha, China
| | - Yijun Wang
- School of Automation, Central South University, Changsha, China
| | - Hang Zhang
- School of Automation, Central South University, Changsha, China
| |
Collapse
|
32
|
Schumm SN, Gabrieli D, Meaney DF. Plasticity impairment exposes CA3 vulnerability in a hippocampal network model of mild traumatic brain injury. Hippocampus 2022; 32:231-250. [PMID: 34978378 DOI: 10.1002/hipo.23402] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2020] [Revised: 11/08/2021] [Accepted: 11/18/2021] [Indexed: 11/10/2022]
Abstract
Proper function of the hippocampus is critical for executing cognitive tasks such as learning and memory. Traumatic brain injury (TBI) and other neurological disorders are commonly associated with cognitive deficits and hippocampal dysfunction. Although there are many existing models of individual subregions of the hippocampus, few models attempt to integrate the primary areas into one system. In this work, we developed a computational model of the hippocampus, including the dentate gyrus, CA3, and CA1. The subregions are represented as an interconnected neuronal network, incorporating well-characterized ex vivo slice electrophysiology into the functional neuron models and well-documented anatomical connections into the network structure. In addition, since plasticity is foundational to the role of the hippocampus in learning and memory as well as necessary for studying adaptation to injury, we implemented spike-timing-dependent plasticity among the synaptic connections. Our model mimics key features of hippocampal activity, including signal frequencies in the theta and gamma bands and phase-amplitude coupling in area CA1. We also studied the effects of spike-timing-dependent plasticity impairment, a potential consequence of TBI, in our model and found that impairment decreases broadband power in CA3 and CA1 and reduces phase coherence between these two subregions, yet phase-amplitude coupling in CA1 remains intact. Altogether, our work demonstrates characteristic hippocampal activity with a scaled network model of spiking neurons and reveals the sensitive balance of plasticity mechanisms in the circuit through one manifestation of mild traumatic injury.
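The spike-timing-dependent plasticity implemented among the synaptic connections is commonly formalized as a pair-based exponential window; the sketch below is illustrative, with amplitudes and time constants that are assumptions rather than the paper's values.

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window: potentiation when the presynaptic spike
    precedes the postsynaptic one (dt = t_post - t_pre > 0, in ms),
    depression otherwise. Parameters are illustrative."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    return -a_minus * math.exp(dt / tau_minus)
```

Impairing such a rule (e.g., shrinking `a_plus`) is one simple way a model can probe the kind of plasticity deficit the study associates with mild traumatic brain injury.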
Collapse
Affiliation(s)
- Samantha N Schumm
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, Pennsylvania, USA
| | - David Gabrieli
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, Pennsylvania, USA
| | - David F Meaney
- Department of Bioengineering, School of Engineering and Applied Sciences, University of Pennsylvania, Philadelphia, Pennsylvania, USA; Department of Neurosurgery, Penn Center for Brain Injury and Repair, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
| |
Collapse
|
33
|
Madadi Asl M, Vahabie AH, Valizadeh A, Tass PA. Spike-Timing-Dependent Plasticity Mediated by Dopamine and its Role in Parkinson's Disease Pathophysiology. FRONTIERS IN NETWORK PHYSIOLOGY 2022; 2:817524. [PMID: 36926058 PMCID: PMC10013044 DOI: 10.3389/fnetp.2022.817524] [Citation(s) in RCA: 14] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/18/2021] [Accepted: 02/08/2022] [Indexed: 01/05/2023]
Abstract
Parkinson's disease (PD) is a multi-systemic neurodegenerative brain disorder. Motor symptoms of PD are linked to the significant dopamine (DA) loss in substantia nigra pars compacta (SNc) followed by basal ganglia (BG) circuit dysfunction. Increasing experimental and computational evidence indicates that (synaptic) plasticity plays a key role in the emergence of PD-related pathological changes following DA loss. Spike-timing-dependent plasticity (STDP) mediated by DA provides a mechanistic model for synaptic plasticity to modify synaptic connections within the BG according to the neuronal activity. To shed light on how DA-mediated STDP can shape neuronal activity and synaptic connectivity in the PD condition, we reviewed experimental and computational findings addressing the modulatory effect of DA on STDP as well as other plasticity mechanisms and discussed their potential role in PD pathophysiology and related network dynamics and connectivity. In particular, reshaping of STDP profiles together with other plasticity-mediated processes following DA loss may abnormally modify synaptic connections in competing pathways of the BG. The cascade of plasticity-induced maladaptive or compensatory changes can impair the excitation-inhibition balance towards the BG output nuclei, leading to the emergence of pathological activity-connectivity patterns in PD. Pre-clinical, clinical as well as computational studies reviewed here provide an understanding of the impact of synaptic plasticity and other plasticity mechanisms on PD pathophysiology, especially PD-related network activity and connectivity, after DA loss. This review may provide further insights into the abnormal structure-function relationship within the BG contributing to the emergence of pathological states in PD. 
Specifically, this review is intended to provide detailed information for the development of computational network models for PD, serving as testbeds for the development and optimization of invasive and non-invasive brain stimulation techniques. Computationally derived hypotheses may accelerate the development of therapeutic stimulation techniques and potentially reduce the number of related animal experiments.
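Dopamine-mediated STDP of the kind reviewed here is commonly formalized as a three-factor rule: Hebbian pre/post coincidences charge an eligibility trace, and the dopamine signal gates whether that trace is written into the weight. A minimal sketch under that assumption (all parameters hypothetical):

```python
import math

def da_stdp(coincidences, dopamine, eta=0.1, tau_e=50.0, dt=1.0):
    """Three-factor rule sketch: per-step Hebbian coincidences charge an
    eligibility trace e, which decays with time constant tau_e (ms); the
    dopamine value at each step converts the trace into a weight change."""
    w, e = 0.0, 0.0
    decay = math.exp(-dt / tau_e)
    for coinc, da in zip(coincidences, dopamine):
        e = e * decay + coinc       # Hebbian event charges the trace
        w += eta * da * e           # dopamine gates the actual update
    return w
```

Under dopamine loss (dopamine near zero), no weight change accumulates even when coincidences occur, which is the basic intuition behind the reshaped STDP profiles discussed above.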
Collapse
Affiliation(s)
- Mojtaba Madadi Asl
- Department of Physics, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan, Iran
| | - Abdol-Hossein Vahabie
- School of Electrical and Computer Engineering, College of Engineering, University of Tehran, Tehran, Iran; Department of Psychology, Faculty of Psychology and Education, University of Tehran, Tehran, Iran
| | - Alireza Valizadeh
- Department of Physics, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan, Iran
| | - Peter A Tass
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, CA, United States
| |
Collapse
|
34
|
Virtuoso A, Colangelo AM, Maggio N, Fennig U, Weinberg N, Papa M, De Luca C. The Spatiotemporal Coupling: Regional Energy Failure and Aberrant Proteins in Neurodegenerative Diseases. Int J Mol Sci 2021; 22:11304. [PMID: 34768733 PMCID: PMC8583302 DOI: 10.3390/ijms222111304] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2021] [Revised: 10/15/2021] [Accepted: 10/17/2021] [Indexed: 12/14/2022] Open
Abstract
The spatial and temporal coordination of its elements is a pivotal characteristic of any system, and the central nervous system (CNS) is no exception. Glial elements and the vascular interface have more recently been brought into consideration, together with the extracellular matrix and the immune system. However, knowledge of the configuration of single elements is not sufficient to predict physiological or pathological long-lasting changes. Ionic currents, complex molecular cascades, genomic rearrangement, and regional energy demand can differ even between neighboring cells of the same phenotype, and their differential expression could explain the region-specific progression of the most studied neurodegenerative diseases. Here we review the main nodes and edges of the system, which could be studied to develop a comprehensive knowledge of CNS plasticity from the neurovascular unit to the synaptic cleft. The future goal is to redefine the modeling of synaptic plasticity and achieve a better understanding of neurological diseases, pointing out the cellular, subcellular, and molecular components that couple in specific neuroanatomical and functional regions.
Collapse
Affiliation(s)
- Assunta Virtuoso
- Laboratory of Neuronal Networks, Department of Mental and Physical Health and Preventive Medicine, University of Campania "Luigi Vanvitelli", 80138 Naples, Italy; (A.V.); (C.D.L.)
| | - Anna Maria Colangelo
- SYSBIO Centre of Systems Biology ISBE-IT, University of Milano-Bicocca, 20126 Milan, Italy;
- Laboratory of Neuroscience “R. Levi-Montalcini”, Department of Biotechnology and Biosciences, University of Milano-Bicocca, 20126 Milan, Italy
| | - Nicola Maggio
- Department of Neurology and Neurosurgery, Sackler Faculty of Medicine, Sagol School of Neuroscience, Tel Aviv University, Tel Aviv 6997801, Israel; (N.M.); (U.F.); (N.W.)
- Department of Neurology, The Chaim Sheba Medical Center at Tel HaShomer, Ramat Gan 52662, Israel
| | - Uri Fennig
- Department of Neurology and Neurosurgery, Sackler Faculty of Medicine, Sagol School of Neuroscience, Tel Aviv University, Tel Aviv 6997801, Israel; (N.M.); (U.F.); (N.W.)
- Department of Neurology, The Chaim Sheba Medical Center at Tel HaShomer, Ramat Gan 52662, Israel
| | - Nitai Weinberg
- Department of Neurology and Neurosurgery, Sackler Faculty of Medicine, Sagol School of Neuroscience, Tel Aviv University, Tel Aviv 6997801, Israel; (N.M.); (U.F.); (N.W.)
- Department of Neurology, The Chaim Sheba Medical Center at Tel HaShomer, Ramat Gan 52662, Israel
| | - Michele Papa
- Laboratory of Neuronal Networks, Department of Mental and Physical Health and Preventive Medicine, University of Campania "Luigi Vanvitelli", 80138 Naples, Italy; (A.V.); (C.D.L.)
- SYSBIO Centre of Systems Biology ISBE-IT, University of Milano-Bicocca, 20126 Milan, Italy;
| | - Ciro De Luca
- Laboratory of Neuronal Networks, Department of Mental and Physical Health and Preventive Medicine, University of Campania "Luigi Vanvitelli", 80138 Naples, Italy; (A.V.); (C.D.L.)
| |
Collapse
|
35
|
Schulz A, Miehl C, Berry MJ, Gjorgjieva J. The generation of cortical novelty responses through inhibitory plasticity. eLife 2021; 10:e65309. [PMID: 34647889 PMCID: PMC8516419 DOI: 10.7554/elife.65309] [Citation(s) in RCA: 16] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/30/2020] [Accepted: 09/22/2021] [Indexed: 12/17/2022] Open
Abstract
Animals depend on fast and reliable detection of novel stimuli in their environment. Neurons in multiple sensory areas respond more strongly to novel in comparison to familiar stimuli. Yet, it remains unclear which circuit, cellular, and synaptic mechanisms underlie those responses. Here, we show that spike-timing-dependent plasticity of inhibitory-to-excitatory synapses generates novelty responses in a recurrent spiking network model. Inhibitory plasticity increases the inhibition onto excitatory neurons tuned to familiar stimuli, while inhibition for novel stimuli remains low, leading to a network novelty response. The generation of novelty responses does not depend on the periodicity but rather on the distribution of presented stimuli. By including tuning of inhibitory neurons, the network further captures stimulus-specific adaptation. Finally, we suggest that disinhibition can control the amplification of novelty responses. Therefore, inhibitory plasticity provides a flexible, biologically plausible mechanism to detect the novelty of bottom-up stimuli, enabling us to make experimentally testable predictions.
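The inhibitory-to-excitatory STDP at the heart of this mechanism is often modeled with a symmetric rule (in the spirit of Vogels et al., 2011) in which near-coincident pre/post spiking strengthens inhibition while lone presynaptic spikes weaken it toward a target rate. The following is a hedged sketch of such a rule, not necessarily the exact rule of the paper, with illustrative parameters:

```python
import math

def inhibitory_stdp(pre_spikes, post_spikes, eta=0.01, rho0=0.1,
                    tau=20.0, dt=1.0):
    """Symmetric inhibitory STDP sketch: coincident pre/post activity
    strengthens the inhibitory weight, while presynaptic spikes alone
    depress it by a fixed amount (rho0 acts like a target rate). Thus
    inhibition grows onto strongly driven, familiar-tuned neurons and
    stays low for novel stimuli."""
    w, x_pre, x_post = 0.0, 0.0, 0.0
    decay = math.exp(-dt / tau)
    for s_pre, s_post in zip(pre_spikes, post_spikes):
        x_pre = x_pre * decay + s_pre       # presynaptic spike trace
        x_post = x_post * decay + s_post    # postsynaptic spike trace
        w += eta * (s_pre * (x_post - rho0) + s_post * x_pre)
    return w
```

Repeated (familiar) co-activation therefore accumulates inhibition, while a stimulus whose excitatory targets have not been co-active stays under-inhibited, producing the network novelty response.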
Collapse
Affiliation(s)
- Auguste Schulz
- Max Planck Institute for Brain Research, Frankfurt, Germany
- Technical University of Munich, Department of Electrical and Computer Engineering, Munich, Germany
| | - Christoph Miehl
- Max Planck Institute for Brain Research, Frankfurt, Germany
- Technical University of Munich, School of Life Sciences, Freising, Germany
| | - Michael J Berry
- Princeton University, Princeton Neuroscience Institute, Princeton, United States
| | - Julijana Gjorgjieva
- Max Planck Institute for Brain Research, Frankfurt, Germany
- Technical University of Munich, School of Life Sciences, Freising, Germany
| |
Collapse
|
36
|
Edwards G, Berestova A, Battelli L. Behavioral gain following isolation of attention. Sci Rep 2021; 11:19329. [PMID: 34588526 PMCID: PMC8481494 DOI: 10.1038/s41598-021-98670-w] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/11/2021] [Accepted: 09/08/2021] [Indexed: 11/10/2022] Open
Abstract
Stable sensory perception is achieved through balanced excitatory-inhibitory interactions of lateralized sensory processing. In real-world experience, sensory processing is rarely equal across lateralized processing regions, resulting in continuous rebalancing. Using lateralized attention as a case study, we predicted that rebalancing lateralized processing after a prolonged imbalance of spatial attention could cause a gain in attention in the opposite direction. In neurotypical human adults, we isolated covert attention to one visual field with a 30-min attention-demanding task and found an increase in attention in the opposite visual field after the manipulation. We suggest that this gain in lateralized attention in the previously unattended visual field is due to an overshoot during attention rebalancing. The offline post-manipulation effect is suggestive of long-term potentiation affecting behavior. Our finding of a visual-field-specific attention increase could be critical for the development of clinical rehabilitation for patients with a unilateral lesion and lateralized attention deficits. This proof-of-concept study initiates the examination of overshoot following the release of imbalance in other lateralized control and sensory domains, important for our basic understanding of lateralized processing.
Collapse
Affiliation(s)
- Grace Edwards
- Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy; Department of Psychology, Harvard University, Cambridge, MA, 02138, USA.
| | - Anna Berestova
- Lesley University, 29 Everett St, Cambridge, MA, 02138, USA
| | - Lorella Battelli
- Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy; Department of Psychology, Harvard University, Cambridge, MA, 02138, USA; Berenson-Allen Center for Noninvasive Brain Stimulation and Department of Neurology, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, 02215, USA
| |
Collapse
|
37
|
Teichmann M, Larisch R, Hamker FH. Performance of biologically grounded models of the early visual system on standard object recognition tasks. Neural Netw 2021; 144:210-228. [PMID: 34507042 DOI: 10.1016/j.neunet.2021.08.009] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/30/2021] [Revised: 07/05/2021] [Accepted: 08/04/2021] [Indexed: 11/29/2022]
Abstract
Computational neuroscience models of vision and neural network models for object recognition are often framed by different research agendas. Computational neuroscience mainly aims at replicating experimental data, while (artificial) neural networks target high performance on classification tasks. However, we propose that models of vision should be validated on object recognition tasks: at some point, the mechanisms of realistic neuro-computational models of the visual cortex have to prove themselves in object recognition as well. To foster this idea, we report the recognition accuracy of two different neuro-computational models of the visual cortex on several object recognition datasets. The models were trained using unsupervised Hebbian learning rules on natural scene inputs, so that receptive fields comparable to their biological counterparts emerge. We assume that the emerged receptive fields form a general codebook of features, which should be applicable to a variety of visual scenes. We report performances on datasets with different levels of difficulty, ranging from the simple MNIST to the more complex CIFAR-10 or ETH-80. We found that both networks show good results on simple digit recognition, comparable with previously published biologically plausible models. We also observed that for naturalistic datasets our deeper-layer neurons provide a better recognition codebook. Since recognition results for biologically grounded models are not yet available for most datasets, our results provide a broad basis of performance values for comparing methodologically similar models.
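A classic example of an unsupervised Hebbian rule whose weights converge to structured features without any hard weight bound is Oja's rule. The sketch below is illustrative (it is not one of the models benchmarked in the paper): it recovers the dominant direction of synthetic two-dimensional "patches", the simplest analogue of a receptive field emerging from input statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

def oja_learn(patches, eta=0.01):
    """Oja's normalized Hebbian rule: the Hebb term eta*y*x is balanced
    by an activity-scaled decay eta*y^2*w, so the weight vector converges
    to the leading principal component of the inputs at unit norm."""
    w = rng.normal(size=patches.shape[1])
    w /= np.linalg.norm(w)
    for x in patches:
        y = w @ x                      # postsynaptic activity
        w += eta * y * (x - y * w)     # Hebb term minus normalizing decay
    return w

# Synthetic inputs with dominant variance along one diagonal direction.
direction = np.array([1.0, 1.0]) / np.sqrt(2.0)
patches = rng.normal(size=(5000, 2)) * 0.1 \
    + np.outer(rng.normal(size=5000), direction)
w = oja_learn(patches)
```

The learned weight vector aligns (up to sign) with the dominant input direction, mirroring how, at scale, such rules extract oriented features from natural scenes.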
Collapse
Affiliation(s)
- Michael Teichmann
- Chemnitz University of Technology, Str. der Nationen, 62, 09111, Chemnitz, Germany.
| | - René Larisch
- Chemnitz University of Technology, Str. der Nationen, 62, 09111, Chemnitz, Germany.
| | - Fred H Hamker
- Chemnitz University of Technology, Str. der Nationen, 62, 09111, Chemnitz, Germany.
| |
Collapse
|
38
|
Sinha M, Narayanan R. Active Dendrites and Local Field Potentials: Biophysical Mechanisms and Computational Explorations. Neuroscience 2021; 489:111-142. [PMID: 34506834 PMCID: PMC7612676 DOI: 10.1016/j.neuroscience.2021.08.035] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/27/2021] [Revised: 08/30/2021] [Accepted: 08/31/2021] [Indexed: 10/27/2022]
Abstract
Neurons and glial cells are endowed with membranes that express a rich repertoire of ion channels, transporters, and receptors. The constant flux of ions across the neuronal and glial membranes results in voltage fluctuations that can be recorded from the extracellular matrix. The high-frequency components of this voltage signal contain information about spiking activity, reflecting the output of the neurons surrounding the recording location. The low-frequency components of the signal, referred to as the local field potential (LFP), have traditionally been thought to provide information about the synaptic inputs that impinge on the large dendritic trees of various neurons. In this review, we discuss recent computational and experimental studies pointing to a critical role of several active dendritic mechanisms that can influence the genesis and the location-dependent spectro-temporal dynamics of LFPs, spanning different brain regions. We strongly emphasize the need to account for several fast and slow dendritic events and associated active mechanisms - including gradients in their expression profiles, inter- and intra-cellular spatio-temporal interactions spanning neurons and glia, heterogeneities and degeneracy across scales, neuromodulatory influences, and activity-dependent plasticity - towards gaining important insights about the origins of the LFP under different behavioral states in health and disease. We provide simple but essential guidelines on how to model LFPs while taking these dendritic mechanisms into account, with detailed methodology for incorporating the various heterogeneities and electrophysiological properties of neurons and synapses in LFP studies.
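A common starting point for such LFP models is the volume-conductor point-source approximation (standard biophysics, not specific to this review): the extracellular potential is the sum of transmembrane point currents scaled by inverse distance and the extracellular conductivity, V = Σ_i I_i / (4·π·σ·r_i). A minimal sketch:

```python
import numpy as np

def lfp_point_sources(currents, positions, electrode, sigma=0.3):
    """Extracellular potential at an electrode position from point
    transmembrane currents in an infinite homogeneous medium:
    V = sum_i I_i / (4*pi*sigma*r_i). sigma is the extracellular
    conductivity in S/m (0.3 is a commonly used value)."""
    currents = np.asarray(currents, dtype=float)
    positions = np.asarray(positions, dtype=float)
    r = np.linalg.norm(positions - np.asarray(electrode, dtype=float), axis=1)
    return np.sum(currents / (4.0 * np.pi * sigma * r))
```

Because current is conserved, a source/sink pair (as generated by a synaptic input plus its dendritic return current) produces a dipole-like potential that cancels at equidistant points, which is why dendritic geometry matters so much for the recorded LFP.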
Collapse
Affiliation(s)
- Manisha Sinha
- Cellular Neurophysiology Laboratory, Molecular Biophysics Unit, Indian Institute of Science, Bangalore, Karnataka 560012, India
| | - Rishikesh Narayanan
- Cellular Neurophysiology Laboratory, Molecular Biophysics Unit, Indian Institute of Science, Bangalore, Karnataka 560012, India.
| |
Collapse
|
39
|
Niemeyer N, Schleimer JH, Schreiber S. Biophysical models of intrinsic homeostasis: Firing rates and beyond. Curr Opin Neurobiol 2021; 70:81-88. [PMID: 34454303 DOI: 10.1016/j.conb.2021.07.011] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/01/2021] [Revised: 06/14/2021] [Accepted: 07/14/2021] [Indexed: 12/01/2022]
Abstract
In view of ever-changing conditions both in the external world and in intrinsic brain states, maintaining the robustness of computations poses a challenge, adequate solutions to which we are only beginning to understand. At the level of cell-intrinsic properties, biophysical models of neurons permit one to identify relevant physiological substrates that can serve as regulators of neuronal excitability and to test how feedback loops can stabilize crucial variables such as long-term calcium levels and firing rates. Mathematical theory has also revealed a rich set of complementary computational properties arising from distinct cellular dynamics and even shaping processing at the network level. Here, we provide an overview over recently explored homeostatic mechanisms derived from biophysical models and hypothesize how multiple dynamical characteristics of cells, including their intrinsic neuronal excitability classes, can be stably controlled.
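The feedback loops discussed above can be sketched as a slow controller that adjusts an excitability parameter until an activity (calcium) sensor reaches its target. In this hedged sketch, `rate_of` is a hypothetical stand-in for a full biophysical neuron model, and all constants are illustrative.

```python
def homeostatic_conductance(g, rate_of, ca_target=0.5, tau_g=200.0,
                            dt=1.0, steps=5000):
    """Intrinsic homeostasis sketch: a slow negative-feedback loop
    adjusts a conductance g until the activity proxy returned by
    rate_of(g) (standing in for a long-term calcium signal) sits at
    its target value."""
    for _ in range(steps):
        ca = rate_of(g)                      # sensor tracks activity level
        g += dt / tau_g * (ca_target - ca)   # negative feedback on excitability
        g = max(g, 0.0)                      # conductances stay non-negative
    return g

# Toy monotonic activity curve: firing increases with the conductance.
rate = lambda g: g / (g + 1.0)
g_final = homeostatic_conductance(g=0.1, rate_of=rate)
```

With the toy curve above, the loop settles where g/(g+1) equals the target of 0.5, i.e., near g = 1; the same scheme generalizes to several conductances regulated from one calcium readout, which is where the degeneracy discussed in such models arises.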
Collapse
Affiliation(s)
- Nelson Niemeyer
- Institute for Theoretical Biology, Humboldt-Universität zu Berlin, 10115, Berlin, Germany; Einstein Center for Neurosciences Berlin, Charitéplatz 1, 10117, Berlin, Germany; Bernstein Center for Computational Neuroscience, 10115, Berlin, Germany
| | - Jan-Hendrik Schleimer
- Institute for Theoretical Biology, Humboldt-Universität zu Berlin, 10115, Berlin, Germany; Bernstein Center for Computational Neuroscience, 10115, Berlin, Germany
| | - Susanne Schreiber
- Institute for Theoretical Biology, Humboldt-Universität zu Berlin, 10115, Berlin, Germany; Einstein Center for Neurosciences Berlin, Charitéplatz 1, 10117, Berlin, Germany; Bernstein Center for Computational Neuroscience, 10115, Berlin, Germany.
| |
Collapse
|
41
|
Neuromodulated Dopamine Plastic Networks for Heterogeneous Transfer Learning with Hebbian Principle. Symmetry (Basel) 2021. [DOI: 10.3390/sym13081344] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/27/2023] Open
Abstract
Plastic modifications in synaptic connectivity arise primarily from changes triggered by neuromodulated dopamine signals. These activities are controlled by neuromodulation, which is itself under the control of the brain; the brain's self-modifying abilities play an essential role in learning and adaptation. Artificial neural networks with neuromodulated plasticity have been used to implement transfer learning in the image classification domain, in particular for image detection, image segmentation, and the transfer of learning parameters, with significant results. This paper proposes a novel approach to enhance transfer learning accuracy between a heterogeneous source and target, using neuromodulation of the Hebbian learning principle, called NDHTL (Neuromodulated Dopamine Hebbian Transfer Learning). Neuromodulation of plasticity offers a powerful technique for training neural networks that implement asymmetric backpropagation using Hebbian principles in transfer-learning-motivated CNNs (convolutional neural networks). In biologically motivated concomitant learning, where connected brain cells activate together, the synaptic connection strength between network neurons is enhanced. In the NDHTL algorithm, the change in plasticity between the neurons of a CNN layer is directly managed by the value of the dopamine signal. The discriminative nature of transfer learning fits well with this technique: the learned model's connection weights must adapt to unseen target datasets with the least cost and effort. Using distinctive learning principles such as dopamine-modulated Hebbian learning for asymmetric gradient weight updates in transfer learning is a novel approach. The paper presents the NDHTL algorithm as synaptic plasticity controlled by dopamine signals in transfer learning to classify images using source-target datasets.
The standard transfer learning using gradient backpropagation is a symmetric framework. Experimental results using CIFAR-10 and CIFAR-100 datasets show that the proposed NDHTL algorithm can enhance transfer learning efficiency compared to existing methods.
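As a toy illustration of the gating idea described above, a Hebbian update can be scaled by a scalar dopamine signal. This sketch is illustrative only; the function name, rate-based activities, and the exact multiplicative form are assumptions, not the paper's NDHTL rule:

```python
# Illustrative dopamine-gated Hebbian update (assumed form, not NDHTL itself):
# dw[i][j] = eta * dopamine * pre[i] * post[j]

def hebbian_dopamine_update(w, pre, post, dopamine, eta=0.01):
    """Return weights updated by a Hebbian term scaled by the dopamine signal."""
    return [[w[i][j] + eta * dopamine * pre[i] * post[j]
             for j in range(len(post))]
            for i in range(len(pre))]

w0 = [[0.0, 0.0], [0.0, 0.0]]
pre, post = [1.0, 0.5], [0.2, 0.8]

w_gated = hebbian_dopamine_update(w0, pre, post, dopamine=1.0)   # plastic
w_frozen = hebbian_dopamine_update(w0, pre, post, dopamine=0.0)  # unchanged
```

With dopamine at zero the weights are left untouched, which is the sense in which the dopamine signal controls the degree of plasticity between layer neurons.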
Collapse
|
42
|
Sarazin MXB, Victor J, Medernach D, Naudé J, Delord B. Online Learning and Memory of Neural Trajectory Replays for Prefrontal Persistent and Dynamic Representations in the Irregular Asynchronous State. Front Neural Circuits 2021; 15:648538. [PMID: 34305535 PMCID: PMC8298038 DOI: 10.3389/fncir.2021.648538] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/31/2020] [Accepted: 05/31/2021] [Indexed: 11/13/2022] Open
Abstract
In the prefrontal cortex (PFC), higher-order cognitive functions and adaptive flexible behaviors rely on continuous dynamical sequences of spiking activity that constitute neural trajectories in the state space of activity. Neural trajectories subserve diverse representations, from explicit mappings in physical spaces to generalized mappings in the task space, and up to complex abstract transformations such as working memory, decision-making and behavioral planning. Computational models have separately assessed learning and replay of neural trajectories, often using unrealistic learning rules or decoupling simulations for learning from replay. Hence, the question remains open of how neural trajectories are learned, memorized and replayed online, with permanently acting biological plasticity rules. The asynchronous irregular regime characterizing cortical dynamics in awake conditions is a major source of disorder that may jeopardize plasticity and replay of locally ordered activity. Here, we show that a recurrent model of local PFC circuitry endowed with realistic synaptic spike-timing-dependent plasticity and scaling processes can learn, memorize and replay large-size neural trajectories online under asynchronous irregular dynamics, at regular or fast (sped-up) timescales. Presented trajectories are quickly learned (within seconds) as synaptic engrams in the network, and the model is able to chunk overlapping trajectories presented separately. These trajectory engrams last long-term (dozens of hours) and trajectory replays can still be triggered over an hour later. In turn, we show the conditions under which trajectory engrams and replays preserve asynchronous irregular dynamics in the network.
Functionally, spiking activity during trajectory replays at the regular timescale accounts for dynamical coding with temporal tuning in individual neurons, persistent activity at the population level, and large levels of variability consistent with observed cognition-related PFC dynamics. Together, these results offer a consistent theoretical framework accounting for how neural trajectories can be learned, memorized and replayed in PFC network circuits to subserve flexible dynamic representations and adaptive behaviors.
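The synaptic scaling component mentioned above can be caricatured as a multiplicative homeostatic rule. The form below is a generic textbook-style illustration, not the model's actual implementation:

```python
def synaptic_scaling(weights, rate, target_rate, eta=0.001):
    """Multiplicatively scale a neuron's incoming weights toward a target
    firing rate: weights grow when the neuron is too quiet and shrink when
    it is too active (generic illustration, parameters assumed)."""
    factor = 1.0 + eta * (target_rate - rate)
    return [w * factor for w in weights]

w = [0.5, 1.0, 2.0]
w_up = synaptic_scaling(w, rate=2.0, target_rate=5.0)    # too quiet: scale up
w_down = synaptic_scaling(w, rate=8.0, target_rate=5.0)  # too active: scale down
```

Because the same factor multiplies every incoming weight, relative weight structure (the engram) is preserved while overall drive is regulated.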
Collapse
Affiliation(s)
- Matthieu X B Sarazin
- Institut des Systèmes Intelligents et de Robotique, CNRS, Inserm, Sorbonne Université, Paris, France
| | - Julie Victor
- CEA Paris-Saclay, CNRS, NeuroSpin, Saclay, France
| | - David Medernach
- Institut des Systèmes Intelligents et de Robotique, CNRS, Inserm, Sorbonne Université, Paris, France
| | - Jérémie Naudé
- Neuroscience Paris Seine - Institut de biologie Paris Seine, CNRS, Inserm, Sorbonne Université, Paris, France
| | - Bruno Delord
- Institut des Systèmes Intelligents et de Robotique, CNRS, Inserm, Sorbonne Université, Paris, France
| |
Collapse
|
43
|
Fernandez-Leon JA, Acosta G. A heuristic perspective on non-variational free energy modulation at the sleep-like edge. Biosystems 2021; 208:104466. [PMID: 34246689 DOI: 10.1016/j.biosystems.2021.104466] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2020] [Revised: 06/03/2021] [Accepted: 06/21/2021] [Indexed: 11/16/2022]
Abstract
BACKGROUND The variational Free Energy Principle (FEP) establishes that a neural system minimizes a free energy function of its internal state through environmental sensing, entailing beliefs about hidden states in its environment. PROBLEM Because sensations are drastically reduced during sleep, it is still unclear how a self-organizing neural network can modulate free energy during sleep transitions. GOAL To address this issue, we study how a network's state-dependent changes in energy, entropy and free energy connect with changes at the synaptic level in the absence of sensing during a sleep-like transition. APPROACH We use simulations of a physically plausible, environmentally isolated neuronal network that self-organizes after induction of a thalamic input to show that the reduction of non-variational free energy depends sensitively upon thalamic input at a slow, rhythmic Poisson (delta) frequency due to spike-timing-dependent plasticity. METHODS We define a non-variational free energy in terms of the relative difference between the energy and entropy of the network from the initial distribution (prior to activity-dependent plasticity) to the nonequilibrium steady-state distribution (after plasticity). We repeated the analysis under different levels of thalamic drive, defined by the number of cortical neurons in receipt of thalamic input. RESULTS Entraining slow activity with thalamic input induces a transition from a gamma (awake-like) to a delta (sleep-like) mode of activity, which can be characterized through a modulation of the network's energy and entropy (non-variational free energy) in the ensuing dynamics. The self-organizing response to low and high thalamic drive also showed characteristic differences in frequency content due to spike-timing-dependent plasticity. CONCLUSIONS The modulation of this non-variational free energy in a self-organizing network appears to be an organizational network principle.
This could open a window to new empirically testable hypotheses about state changes in a neural network.
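The quantities involved can be illustrated over a discrete distribution of network states. This is a generic sketch of entropy and a free-energy-like F = ⟨E⟩ − S; the paper's non-variational free energy is specifically defined as a relative difference between the initial and nonequilibrium steady-state distributions, which this toy code does not reproduce:

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

def free_energy(p, energies):
    """Generic free-energy-like quantity F = <E> - S over discrete states
    (illustrative definition, assumed for this sketch)."""
    mean_e = sum(pi * ei for pi, ei in zip(p, energies))
    return mean_e - entropy(p)

uniform = [0.25] * 4          # high-entropy (awake-like) toy distribution
peaked = [0.85, 0.05, 0.05, 0.05]  # low-entropy (sleep-like) toy distribution
e = [1.0, 1.0, 1.0, 1.0]
# With equal state energies, the lower-entropy distribution has higher F.
```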
Collapse
Affiliation(s)
- Jose A Fernandez-Leon
- Neurology, Harvard Medical School, Brigham and Women's Hospital, Boston, MA, 02115, USA; Neuroscience, Baylor College of Medicine, Houston, TX, 77030, USA.
| | - Gerardo Acosta
- INTELYMEC-CIFICEN (UNCPBA-CICPBA-CONICET), Olavarría, B7400JWI, Argentina
| |
Collapse
|
44
|
Ness N, Schultz SR. A computational grid-to-place-cell transformation model indicates a synaptic driver of place cell impairment in early-stage Alzheimer's Disease. PLoS Comput Biol 2021; 17:e1009115. [PMID: 34133417 PMCID: PMC8238223 DOI: 10.1371/journal.pcbi.1009115] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/09/2020] [Revised: 06/28/2021] [Accepted: 05/26/2021] [Indexed: 12/23/2022] Open
Abstract
Alzheimer's Disease (AD) is characterized by progressive neurodegeneration and cognitive impairment. Synaptic dysfunction is an established early symptom, which correlates strongly with cognitive decline, and is hypothesised to mediate the diverse neuronal network abnormalities observed in AD. However, how synaptic dysfunction contributes to network pathology and cognitive impairment in AD remains elusive. Here, we present a grid-cell-to-place-cell transformation model of long-term CA1 place cell dynamics to interrogate the effect of synaptic loss on network function and environmental representation. Synapse loss modelled after experimental observations in the APP/PS1 mouse model was found to induce firing rate alterations and place cell abnormalities that have previously been observed in AD mouse models, including enlarged place fields and lower across-session stability of place fields. Our results support the hypothesis that synaptic dysfunction underlies cognitive deficits, and demonstrate how impaired environmental representation may arise in the early stages of AD. We further propose that dysfunction of excitatory and inhibitory inputs to CA1 pyramidal cells may cause distinct impairments in place cell function, namely reduced stability and place map resolution.
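A toy one-dimensional grid-to-place transformation, where a place response is a thresholded weighted sum of periodic grid inputs, makes the effect of synapse loss concrete. All names, tuning curves, and parameters here are illustrative, not the model used in the paper:

```python
import math

def grid_cell(x, spacing, phase):
    """Toy 1-D periodic 'grid' response at position x (rectified cosine)."""
    return max(0.0, math.cos(2.0 * math.pi * (x - phase) / spacing))

def place_cell(x, grids, weights, threshold=1.0):
    """Place response as a thresholded weighted sum of grid inputs; zeroing
    weights (synapse loss) weakens and broadens the resulting field."""
    drive = sum(w * grid_cell(x, s, p) for w, (s, p) in zip(weights, grids))
    return max(0.0, drive - threshold)

grids = [(1.0, 0.0), (1.5, 0.0), (2.0, 0.0)]  # (spacing, phase), illustrative
healthy = [1.0, 1.0, 1.0]
lesioned = [1.0, 0.0, 1.0]  # one grid-to-place synapse lost
```

At x = 0 all three grids peak together, so the healthy response is maximal; removing one input lowers the peak, a toy analog of the firing-rate alterations the model interrogates.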
Collapse
Affiliation(s)
- Natalie Ness
- Centre for Neurotechnology and Department of Bioengineering, Imperial College London, London, United Kingdom
| | - Simon R. Schultz
- Centre for Neurotechnology and Department of Bioengineering, Imperial College London, London, United Kingdom
| |
Collapse
|
45
|
Akil AE, Rosenbaum R, Josić K. Balanced networks under spike-time dependent plasticity. PLoS Comput Biol 2021; 17:e1008958. [PMID: 33979336 PMCID: PMC8143429 DOI: 10.1371/journal.pcbi.1008958] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/25/2020] [Revised: 05/24/2021] [Accepted: 04/12/2021] [Indexed: 11/28/2022] Open
Abstract
The dynamics of local cortical networks are irregular, but correlated. Dynamic excitatory–inhibitory balance is a plausible mechanism that generates such irregular activity, but it remains unclear how balance is achieved and maintained in plastic neural networks. In particular, it is not fully understood how plasticity-induced changes in the network affect balance, and in turn, how correlated, balanced activity impacts learning. How do the dynamics of balanced networks change under different plasticity rules? How does correlated spiking activity in recurrent networks change the evolution of weights, their eventual magnitude, and structure across the network? To address these questions, we develop a theory of spike-timing-dependent plasticity in balanced networks. We show that balance can be attained and maintained under plasticity-induced weight changes. We find that correlations in the input mildly affect the evolution of synaptic weights. Under certain plasticity rules, we find an emergence of correlations between firing rates and synaptic weights. Under these rules, synaptic weights converge to a stable manifold in weight space with their final configuration dependent on the initial state of the network. Lastly, we show that our framework can also describe the dynamics of plastic balanced networks when subsets of neurons receive targeted optogenetic input. Animals are able to learn complex tasks through changes in individual synapses between cells. Such changes lead to the coevolution of neural activity patterns and the structure of neural connectivity, but the consequences of these interactions are not fully understood. We consider plasticity in model neural networks which achieve an average balance between the excitatory and inhibitory synaptic inputs to different cells, and display cortical-like, irregular activity.
We extend the theory of balanced networks to account for synaptic plasticity and show which rules can maintain balance, and which will drive the network into a different state. This theory of plasticity can provide insights into the relationship between stimuli, network dynamics, and synaptic circuitry.
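A minimal pair-based STDP rule of the kind analyzed in such theories can be written as an exponential window over spike-time differences. The parameter values and the slight depression bias are common illustrative choices, not the paper's specific rules:

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change for spike-time difference
    dt = t_post - t_pre (ms): potentiation when pre leads post,
    depression otherwise. Parameters are illustrative; a slight
    depression bias (a_minus > a_plus) is a common stability choice."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    return -a_minus * math.exp(dt / tau_minus)
```

Summing this window over the correlated spike trains of a balanced network is what drives the weight evolution the theory describes.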
Collapse
Affiliation(s)
- Alan Eric Akil
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
| | - Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana, United States of America
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana, United States of America
| | - Krešimir Josić
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
| |
Collapse
|
46
|
Alzubaidi L, Zhang J, Humaidi AJ, Al-Dujaili A, Duan Y, Al-Shamma O, Santamaría J, Fadhel MA, Al-Amidie M, Farhan L. Review of deep learning: concepts, CNN architectures, challenges, applications, future directions. JOURNAL OF BIG DATA 2021; 8:53. [PMID: 33816053 PMCID: PMC8010506 DOI: 10.1186/s40537-021-00444-8] [Citation(s) in RCA: 663] [Impact Index Per Article: 221.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/21/2021] [Accepted: 03/22/2021] [Indexed: 05/04/2023]
Abstract
In the last few years, the deep learning (DL) computing paradigm has been deemed the gold standard in the machine learning (ML) community. Moreover, it has gradually become the most widely used computational approach in the field of ML, achieving outstanding results on several complex cognitive tasks, matching or even beating human performance. One of the benefits of DL is its ability to learn from massive amounts of data. The DL field has grown rapidly in the last few years and has been used successfully to address a wide range of traditional applications. More importantly, DL has outperformed well-known ML techniques in many domains, e.g., cybersecurity, natural language processing, bioinformatics, robotics and control, and medical information processing, among many others. Although several works have reviewed the state of the art in DL, each tackles only one aspect of the field, leading to an overall lack of a complete picture. Therefore, this contribution takes a more holistic approach in order to provide a more suitable starting point from which to develop a full understanding of DL. Specifically, this review attempts to provide a more comprehensive survey of the most important aspects of DL, including the enhancements recently added to the field. In particular, this paper outlines the importance of DL and presents the types of DL techniques and networks. It then presents convolutional neural networks (CNNs), the most utilized DL network type, and describes the development of CNN architectures together with their main features, e.g., starting with the AlexNet network and closing with the High-Resolution network (HR.Net). Finally, we present the challenges and suggested solutions to help researchers understand the existing research gaps, followed by a list of the major DL applications.
Computational tools including FPGAs, GPUs, and CPUs are summarized along with a description of their influence on DL. The paper ends with the evolution matrix, benchmark datasets, and a summary and conclusion.
Collapse
Affiliation(s)
- Laith Alzubaidi
- School of Computer Science, Queensland University of Technology, Brisbane, QLD 4000 Australia
- AlNidhal Campus, University of Information Technology & Communications, Baghdad, 10001 Iraq
| | - Jinglan Zhang
- School of Computer Science, Queensland University of Technology, Brisbane, QLD 4000 Australia
| | - Amjad J. Humaidi
- Control and Systems Engineering Department, University of Technology, Baghdad, 10001 Iraq
| | - Ayad Al-Dujaili
- Electrical Engineering Technical College, Middle Technical University, Baghdad, 10001 Iraq
| | - Ye Duan
- Faculty of Electrical Engineering & Computer Science, University of Missouri, Columbia, MO 65211 USA
| | - Omran Al-Shamma
- AlNidhal Campus, University of Information Technology & Communications, Baghdad, 10001 Iraq
| | - J. Santamaría
- Department of Computer Science, University of Jaén, 23071 Jaén, Spain
| | - Mohammed A. Fadhel
- College of Computer Science and Information Technology, University of Sumer, Thi Qar, 64005 Iraq
| | - Muthana Al-Amidie
- Faculty of Electrical Engineering & Computer Science, University of Missouri, Columbia, MO 65211 USA
| | - Laith Farhan
- School of Engineering, Manchester Metropolitan University, Manchester, M1 5GD UK
| |
Collapse
|
47
|
Wosniack ME, Kirchner JH, Chao LY, Zabouri N, Lohmann C, Gjorgjieva J. Adaptation of spontaneous activity in the developing visual cortex. eLife 2021; 10:61619. [PMID: 33722342 PMCID: PMC7963484 DOI: 10.7554/elife.61619] [Citation(s) in RCA: 16] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/30/2020] [Accepted: 02/03/2021] [Indexed: 12/11/2022] Open
Abstract
Spontaneous activity drives the establishment of appropriate connectivity in different circuits during brain development. In the mouse primary visual cortex, two distinct patterns of spontaneous activity occur before vision onset: local low-synchronicity events originating in the retina and global high-synchronicity events originating in the cortex. We sought to determine the contribution of these activity patterns to jointly organize network connectivity through different activity-dependent plasticity rules. We postulated that local events shape cortical input selectivity and topography, while global events homeostatically regulate connection strength. However, to generate robust selectivity, we found that global events should adapt their amplitude to the history of preceding cortical activation. We confirmed this prediction by analyzing in vivo spontaneous cortical activity. The predicted adaptation leads to the sparsification of spontaneous activity on a slower timescale during development, demonstrating the remarkable capacity of the developing sensory cortex to acquire sensitivity to visual inputs after eye-opening.
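The predicted adaptation, global events whose amplitude shrinks with recent cortical activation, can be caricatured as follows. The linear form and the gain parameter are assumptions for illustration, not the rule fitted in the paper:

```python
def adapted_amplitude(base, history, gain=0.5):
    """Scale a global event's amplitude down by the average of recent
    cortical activity (illustrative linear adaptation rule)."""
    recent = sum(history) / len(history) if history else 0.0
    return max(0.0, base * (1.0 - gain * recent))

quiet = adapted_amplitude(1.0, [0.1, 0.0, 0.2])  # little recent activity
busy = adapted_amplitude(1.0, [0.9, 1.0, 0.8])   # strong recent activity
```

A history-dependent reduction of this kind is what sparsifies spontaneous activity over development in the model.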
Collapse
Affiliation(s)
- Marina E Wosniack
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany; School of Life Sciences Weihenstephan, Technical University of Munich, Freising, Germany
| | - Jan H Kirchner
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany; School of Life Sciences Weihenstephan, Technical University of Munich, Freising, Germany
| | - Ling-Ya Chao
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
| | - Nawal Zabouri
- Netherlands Institute for Neuroscience, Amsterdam, Netherlands
| | - Christian Lohmann
- Netherlands Institute for Neuroscience, Amsterdam, Netherlands; Center for Neurogenomics and Cognitive Research, Vrije Universiteit, Amsterdam, Netherlands
| | - Julijana Gjorgjieva
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany; School of Life Sciences Weihenstephan, Technical University of Munich, Freising, Germany
| |
Collapse
|
48
|
Sohn H, Meirhaeghe N, Rajalingham R, Jazayeri M. A Network Perspective on Sensorimotor Learning. Trends Neurosci 2021; 44:170-181. [PMID: 33349476 PMCID: PMC9744184 DOI: 10.1016/j.tins.2020.11.007] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/01/2020] [Revised: 09/11/2020] [Accepted: 11/20/2020] [Indexed: 12/15/2022]
Abstract
What happens in the brain when we learn? Ever since the foundational work of Cajal, the field has made numerous discoveries as to how experience could change the structure and function of individual synapses. However, more recent advances have highlighted the need for understanding learning in terms of complex interactions between populations of neurons and synapses. How should one think about learning at such a macroscopic level? Here, we develop a conceptual framework to bridge the gap between the different scales at which learning operates, from synapses to neurons to behavior. Using this framework, we explore the principles that guide sensorimotor learning across these scales, and set the stage for future experimental and theoretical work in the field.
Collapse
Affiliation(s)
| | - Nicolas Meirhaeghe
- Harvard-MIT Division of Health Sciences & Technology, Massachusetts Institute of Technology
| | | | - Mehrdad Jazayeri
- McGovern Institute for Brain Research; Department of Brain & Cognitive Sciences, Massachusetts Institute of Technology
| |
Collapse
|
49
|
Men K, Chen X, Yang B, Zhu J, Yi J, Wang S, Li Y, Dai J. Automatic segmentation of three clinical target volumes in radiotherapy using lifelong learning. Radiother Oncol 2021; 157:1-7. [PMID: 33418008 DOI: 10.1016/j.radonc.2020.12.034] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2020] [Revised: 12/21/2020] [Accepted: 12/22/2020] [Indexed: 10/22/2022]
Abstract
BACKGROUND AND PURPOSE Convolutional neural networks (CNNs) achieve human-level performance in automatic segmentation. An important challenge that CNNs face in segmentation is catastrophic forgetting: they lose performance on previously learned tasks when trained on a new task. In this study, we propose a lifelong learning method to learn multiple segmentation tasks continuously without forgetting previous tasks. MATERIALS AND METHODS The cohort covered three tumor types: 800 patients had nasopharyngeal cancer (NPC), 800 had breast cancer, and 800 had rectal cancer. The tasks were segmentation of the clinical target volume (CTV) of these three cancers. The proposed lifelong learning network adopted a dilation adapter to learn the three segmentation tasks one by one. Only the newly added dilation adapter (seven layers) was fine-tuned for each incoming new task, whereas all previously learned layers were frozen. RESULTS Compared with single-task, multi-task, or transfer learning, the proposed lifelong learning achieved better or comparable segmentation accuracy, with a DSC of 0.86 for NPC, 0.89 for breast cancer, and 0.87 for rectal cancer. Lifelong learning avoided forgetting in sequential learning and yielded good performance with less training data. Furthermore, it was more efficient than single-task or transfer learning, reducing the number of parameters, model size, and training time by ~58.8%, ~55.6%, and ~25.0%, respectively. CONCLUSION The proposed method preserved the knowledge of previous tasks while learning a new one using a dilation adapter. It could yield comparable performance with much less training data, fewer model parameters, and less training time.
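The freezing-plus-adapter scheme can be sketched with trainable flags. The Layer stand-in and function names here are hypothetical; only the idea of freezing learned layers and appending a seven-layer trainable adapter per task follows the abstract:

```python
class Layer:
    """Stand-in for a network layer with a trainable flag (hypothetical)."""
    def __init__(self, name, trainable=True):
        self.name = name
        self.trainable = trainable

def add_task_adapter(model, task, n_adapter_layers=7):
    """Freeze every previously learned layer, then append a small trainable
    adapter (seven layers, as in the abstract) for the incoming task."""
    for layer in model:
        layer.trainable = False
    adapter = [Layer(f"{task}_adapter_{i}") for i in range(n_adapter_layers)]
    return model + adapter

backbone = [Layer("conv1"), Layer("conv2"), Layer("conv3")]
model = add_task_adapter(backbone, "npc")     # task 1: backbone frozen
model = add_task_adapter(model, "breast")     # task 2: npc adapter frozen too
```

Because only the adapter receives gradients for each new task, previously learned tasks cannot be overwritten, which is how forgetting is avoided.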
Collapse
Affiliation(s)
- Kuo Men
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
| | - Xinyuan Chen
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
| | - Bining Yang
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
| | - Ji Zhu
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
| | - Junlin Yi
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
| | - Shulian Wang
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
| | - Yexiong Li
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
| | - Jianrong Dai
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China.
| |
Collapse
|
50
|
Liang H. Evaluation of fitness state of sports training based on self-organizing neural network. Neural Comput Appl 2021. [DOI: 10.1007/s00521-020-05551-w] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
|