1
Bhasin BJ, Raymond JL, Goldman MS. Synaptic weight dynamics underlying memory consolidation: Implications for learning rules, circuit organization, and circuit function. Proc Natl Acad Sci U S A 2024; 121:e2406010121. PMID: 39365821; PMCID: PMC11474072; DOI: 10.1073/pnas.2406010121. Received 2024-04-26; accepted 2024-08-12. Open access.
Abstract
Systems consolidation is a common feature of learning and memory systems, in which a long-term memory initially stored in one brain region becomes persistently stored in another region. We studied the dynamics of systems consolidation in simple circuit architectures with two sites of plasticity, one in an early-learning and one in a late-learning brain area. We show that the synaptic dynamics of the circuit during consolidation of an analog memory can be understood as a temporal integration process, by which transient changes in activity driven by plasticity in the early-learning area are accumulated into persistent synaptic changes at the late-learning site. This simple principle naturally leads to a speed-accuracy tradeoff in systems consolidation and provides insight into how the circuit mitigates the stability-plasticity dilemma of storing new memories while preserving core features of older ones. Furthermore, it imposes two constraints on the circuit. First, the plasticity rule at the late-learning site must stably support a continuum of possible outputs for a given input. We show that this is readily achieved by heterosynaptic but not standard Hebbian rules. Second, to turn off the consolidation process and prevent erroneous changes at the late-learning site, neural activity in the early-learning area must be reset to its baseline activity. We provide two biologically plausible implementations for this reset that propose functional roles in stabilizing consolidation for core elements of the cerebellar circuit.
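The temporal-integration principle described in this abstract can be sketched in a few lines: a fast weight at the early-learning site tracks the error signal, while a slow weight at the late-learning site integrates the transient early-site trace, which decays as the memory is transferred. All parameter values and the specific update equations below are illustrative assumptions, not taken from the paper.

```python
# Sketch: systems consolidation as temporal integration (illustrative
# parameters; not the paper's equations). A fast early-site weight
# tracks the error; the slow late-site weight integrates the transient
# early-site trace, which decays as the memory is transferred.

def consolidate(target, steps=5000, lr_early=0.05, lr_late=0.001):
    w_early, w_late = 0.0, 0.0
    for _ in range(steps):
        error = target - (w_early + w_late)  # combined circuit output
        w_early += lr_early * error          # fast error-driven plasticity
        w_late += lr_late * w_early          # late site integrates early trace
        w_early -= lr_late * w_early         # early trace slowly decays
    return w_early, w_late
```

Making `lr_late` smaller lets the late weight settle closer to the target at the cost of slower consolidation, illustrating the speed-accuracy tradeoff the authors describe.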
Affiliation(s)
- Brandon J. Bhasin
- Department of Bioengineering, Stanford University, Stanford, CA 94305
- Center for Neuroscience, University of California, Davis, CA 95616
- Jennifer L. Raymond
- Department of Neurobiology, Stanford University School of Medicine, Stanford, CA 94305
- Mark S. Goldman
- Center for Neuroscience, University of California, Davis, CA 95616
- Department of Neurobiology, Physiology, and Behavior, University of California, Davis, CA 95616
- Department of Ophthalmology and Vision Science, University of California, Davis, CA 95616
2
Jiang J, Foyard E, van Rossum MCW. Reinforcement learning when your life depends on it: A neuro-economic theory of learning. PLoS Comput Biol 2024; 20:e1012554. PMID: 39466882; PMCID: PMC11542834; DOI: 10.1371/journal.pcbi.1012554. Received 2024-04-27; revised 2024-11-07; accepted 2024-10-14. Open access.
Abstract
Synaptic plasticity enables animals to adapt to their environment, but memory formation can require a substantial amount of metabolic energy, potentially impairing survival. Hence, a neuro-economic dilemma arises as to whether learning is a profitable investment, and the brain must judiciously regulate learning. Indeed, experiments have shown that during starvation, Drosophila suppress the formation of energy-intensive aversive memories. Here we include energy considerations in a reinforcement learning framework. Simulated flies learned to avoid noxious stimuli through synaptic plasticity in either the energy-expensive long-term memory (LTM) pathway or the decaying anesthesia-resistant memory (ARM) pathway. The flies' objective is to maximize their lifespan, which is calculated with a hazard function. We find that strategies that switch between the LTM and ARM pathways, based on energy reserve and reward prediction error, prolong lifespan. Our study highlights the significance of energy regulation of memory pathways and dopaminergic control for adaptive learning and survival. It might also benefit engineering applications of reinforcement learning under resource constraints.
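A toy sketch of the energy-gated pathway choice studied here. The costs, threshold, and decay rate below are invented for illustration; the paper's model additionally switches on reward prediction errors and evaluates lifespan with a hazard function.

```python
# Toy sketch of energy-gated memory-pathway choice (costs, threshold,
# and decay rate are invented; the paper's model also uses reward
# prediction errors and a hazard-function lifespan objective).

LTM_COST, ARM_COST = 5.0, 1.0   # energy per consolidation event
ARM_DECAY = 0.95                # ARM memories fade; LTM persists

def learn_episode(energy, shocks=20, threshold=10.0):
    ltm, arm = 0.0, 0.0         # aversive-memory strength per pathway
    for _ in range(shocks):
        if energy > threshold:  # rich reserves: invest in durable LTM
            ltm += 1.0
            energy -= LTM_COST
        else:                   # starving: fall back on cheap ARM
            arm += 1.0
            energy -= ARM_COST
        arm *= ARM_DECAY        # ARM decays between events
    return ltm, arm, energy
```

A well-fed fly routes most learning through LTM until its reserve falls to the threshold, whereas a starved fly relies entirely on the cheap, decaying ARM pathway.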
Affiliation(s)
- Jiamu Jiang
- School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
- Emilie Foyard
- School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
- Mark C. W. van Rossum
- School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
- School of Psychology, University of Nottingham, Nottingham, United Kingdom
3
van Rossum MCW, Pache A. Competitive plasticity to reduce the energetic costs of learning. PLoS Comput Biol 2024; 20:e1012553. PMID: 39466853; PMCID: PMC11542811; DOI: 10.1371/journal.pcbi.1012553. Received 2024-07-08; revised 2024-11-07; accepted 2024-10-11. Open access.
Abstract
The brain is constrained not only by the energy needed to fuel computation, but also by the energy needed to form memories. Experiments have shown that learning simple conditioning tasks, which might require only a few synaptic updates, already carries a significant metabolic cost. Yet learning a task like MNIST to 95% accuracy appears to require at least 10^8 synaptic updates. The brain has therefore likely evolved to learn using as little energy as possible. We explored the energy required for learning in feedforward neural networks. Based on a parsimonious energy model, we propose two plasticity-restricting algorithms that save energy: 1) only modify synapses with large updates, and 2) restrict plasticity to subsets of synapses that form a path through the network. In biology, networks are often much larger than the task requires, yet vanilla backpropagation prescribes updating all synapses. In this case in particular, large savings can be achieved at the cost of only a slightly longer learning time. Competitively restricting plasticity thus helps to save the metabolic energy associated with synaptic plasticity. The results might lead to a better understanding of biological plasticity and a better match between artificial and biological learning. Moreover, the algorithms might benefit hardware, because electronic memory storage is also energetically costly.
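The first algorithm, "only modify synapses with large updates", can be illustrated on a toy linear task. The threshold, the task, and the energy measure below are our assumptions, not the paper's exact setup.

```python
# Sketch of "only modify synapses with large updates" on a toy linear
# task (threshold, task, and energy measure are our assumptions).
# Sub-threshold gradient updates are skipped; metabolic cost is the
# summed magnitude of the weight changes actually applied.

def train(threshold, steps=200, lr=0.1):
    data = [((1.0, 0.0), 2.0), ((0.0, 1.0), -1.0), ((1.0, 1.0), 1.0)]
    w = [0.0, 0.0]              # learns y = 2*x1 - 1*x2
    energy = 0.0
    for _ in range(steps):
        for x, y in data:
            err = y - (w[0] * x[0] + w[1] * x[1])
            for i in range(2):
                dw = lr * err * x[i]
                if abs(dw) >= threshold:  # skip small, costly writes
                    w[i] += dw
                    energy += abs(dw)     # cost proportional to |dw|
    return w, energy

w_full, e_full = train(0.0)    # vanilla: update every synapse
w_thr, e_thr = train(0.01)     # restricted plasticity
```

The thresholded run stalls once all proposed updates fall below 0.01, trading a small residual error for a lower total plasticity cost.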
Affiliation(s)
- Mark C. W. van Rossum
- School of Psychology, University of Nottingham, Nottingham, United Kingdom
- School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
- Aaron Pache
- School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
4
Bhasin BJ, Raymond JL, Goldman MS. Synaptic weight dynamics underlying memory consolidation: implications for learning rules, circuit organization, and circuit function. bioRxiv 2024:2024.03.20.586036 [preprint]. PMID: 38585936; PMCID: PMC10996481; DOI: 10.1101/2024.03.20.586036.
Abstract
Systems consolidation is a common feature of learning and memory systems, in which a long-term memory initially stored in one brain region becomes persistently stored in another region. We studied the dynamics of systems consolidation in simple circuit architectures with two sites of plasticity, one in an early-learning and one in a late-learning brain area. We show that the synaptic dynamics of the circuit during consolidation of an analog memory can be understood as a temporal integration process, by which transient changes in activity driven by plasticity in the early-learning area are accumulated into persistent synaptic changes at the late-learning site. This simple principle naturally leads to a speed-accuracy tradeoff in systems consolidation and provides insight into how the circuit mitigates the stability-plasticity dilemma of storing new memories while preserving core features of older ones. Furthermore, it imposes two constraints on the circuit. First, the plasticity rule at the late-learning site must stably support a continuum of possible outputs for a given input. We show that this is readily achieved by heterosynaptic but not standard Hebbian rules. Second, to turn off the consolidation process and prevent erroneous changes at the late-learning site, neural activity in the early-learning area must be reset to its baseline activity. We propose two biologically plausible implementations for this reset that suggest novel roles for core elements of the cerebellar circuit.
Significance Statement
How are memories transformed over time? We propose a simple organizing principle for how long-term memories are moved from an initial to a final site of storage. We show that successful transfer occurs when the late site of memory storage is endowed with synaptic plasticity rules that stably accumulate changes in activity occurring at the early site of memory storage.
We instantiate this principle in a simple computational model that is representative of brain circuits underlying a variety of behaviors. The model suggests how a neural circuit can store new memories while preserving core features of older ones, and suggests novel roles for core elements of the cerebellar circuit.
5
Rae CD, Baur JA, Borges K, Dienel G, Díaz-García CM, Douglass SR, Drew K, Duarte JMN, Duran J, Kann O, Kristian T, Lee-Liu D, Lindquist BE, McNay EC, Robinson MB, Rothman DL, Rowlands BD, Ryan TA, Scafidi J, Scafidi S, Shuttleworth CW, Swanson RA, Uruk G, Vardjan N, Zorec R, McKenna MC. Brain energy metabolism: A roadmap for future research. J Neurochem 2024; 168:910-954. PMID: 38183680; PMCID: PMC11102343; DOI: 10.1111/jnc.16032. Received 2023-05-27; revised 2023-11-29; accepted 2023-12-05.
Abstract
Although we have learned much about how the brain fuels its functions over the last decades, there remains much still to discover in an organ that is so complex. This article lays out major gaps in our knowledge of interrelationships between brain metabolism and brain function, including biochemical, cellular, and subcellular aspects of functional metabolism and its imaging in adult brain, as well as during development, aging, and disease. The focus is on unknowns in metabolism of major brain substrates and associated transporters, the roles of insulin and of lipid droplets, the emerging role of metabolism in microglia, mysteries about the major brain cofactor and signaling molecule NAD+, as well as unsolved problems underlying brain metabolism in pathologies such as traumatic brain injury, epilepsy, and metabolic downregulation during hibernation. It describes our current level of understanding of these facets of brain energy metabolism as well as a roadmap for future research.
Affiliation(s)
- Caroline D. Rae
- School of Psychology, The University of New South Wales, NSW 2052 & Neuroscience Research Australia, Randwick, New South Wales, Australia
- Joseph A. Baur
- Department of Physiology and Institute for Diabetes, Obesity and Metabolism, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Karin Borges
- School of Biomedical Sciences, Faculty of Medicine, The University of Queensland, St Lucia, QLD, Australia
- Gerald Dienel
- Department of Neurology, University of Arkansas for Medical Sciences, Little Rock, Arkansas, USA
- Department of Cell Biology and Physiology, University of New Mexico School of Medicine, Albuquerque, New Mexico, USA
- Carlos Manlio Díaz-García
- Department of Biochemistry and Molecular Biology, Center for Geroscience and Healthy Brain Aging, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma, USA
- Kelly Drew
- Center for Transformative Research in Metabolism, Institute of Arctic Biology, University of Alaska Fairbanks, Fairbanks, Alaska, USA
- João M. N. Duarte
- Department of Experimental Medical Science, Faculty of Medicine, Lund University, Lund, & Wallenberg Centre for Molecular Medicine, Lund University, Lund, Sweden
- Jordi Duran
- Institut Químic de Sarrià (IQS), Universitat Ramon Llull (URL), Barcelona, Spain
- Institute for Bioengineering of Catalonia (IBEC), The Barcelona Institute of Science and Technology, Barcelona, Spain
- Oliver Kann
- Institute of Physiology and Pathophysiology, University of Heidelberg, D-69120; Interdisciplinary Center for Neurosciences (IZN), University of Heidelberg, Heidelberg, Germany
- Tibor Kristian
- Veterans Affairs Maryland Health Center System, Baltimore, Maryland, USA
- Department of Anesthesiology and the Center for Shock, Trauma, and Anesthesiology Research (S.T.A.R.), University of Maryland School of Medicine, Baltimore, Maryland, USA
- Dasfne Lee-Liu
- Facultad de Medicina y Ciencia, Universidad San Sebastián, Santiago, Región Metropolitana, Chile
- Britta E. Lindquist
- Department of Neurology, Division of Neurocritical Care, Gladstone Institute of Neurological Disease, University of California at San Francisco, San Francisco, California, USA
- Ewan C. McNay
- Behavioral Neuroscience, University at Albany, Albany, New York, USA
- Michael B. Robinson
- Departments of Pediatrics and System Pharmacology & Translational Therapeutics, Children's Hospital of Philadelphia, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Douglas L. Rothman
- Magnetic Resonance Research Center and Departments of Radiology and Biomedical Engineering, Yale University, New Haven, Connecticut, USA
- Benjamin D. Rowlands
- School of Chemistry, Faculty of Science, The University of Sydney, Sydney, New South Wales, Australia
- Timothy A. Ryan
- Department of Biochemistry, Weill Cornell Medicine, New York, New York, USA
- Joseph Scafidi
- Department of Neurology, Kennedy Krieger Institute, Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
- Susanna Scafidi
- Anesthesiology & Critical Care Medicine, Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
- C. William Shuttleworth
- Department of Neurosciences, University of New Mexico School of Medicine Albuquerque, Albuquerque, New Mexico, USA
- Raymond A. Swanson
- Department of Neurology, University of California, San Francisco, and San Francisco Veterans Affairs Medical Center, San Francisco, California, USA
- Gökhan Uruk
- Department of Neurology, University of California, San Francisco, and San Francisco Veterans Affairs Medical Center, San Francisco, California, USA
- Nina Vardjan
- Laboratory of Cell Engineering, Celica Biomedical, Ljubljana, Slovenia
- Laboratory of Neuroendocrinology—Molecular Cell Physiology, Institute of Pathophysiology, Faculty of Medicine, University of Ljubljana, Ljubljana, Slovenia
- Robert Zorec
- Laboratory of Cell Engineering, Celica Biomedical, Ljubljana, Slovenia
- Laboratory of Neuroendocrinology—Molecular Cell Physiology, Institute of Pathophysiology, Faculty of Medicine, University of Ljubljana, Ljubljana, Slovenia
- Mary C. McKenna
- Department of Pediatrics and Program in Neuroscience, University of Maryland School of Medicine, Baltimore, Maryland, USA
6
Karbowski J, Urban P. Cooperativity, Information Gain, and Energy Cost During Early LTP in Dendritic Spines. Neural Comput 2024; 36:271-311. PMID: 38101326; DOI: 10.1162/neco_a_01632. Received 2023-02-04; accepted 2023-10-04.
Abstract
We investigate the mutual relationship between information and energy during the early phase of LTP induction and maintenance in a large-scale system of mutually coupled dendritic spines with discrete internal states and probabilistic dynamics, within the framework of nonequilibrium stochastic thermodynamics. To analyze this computationally intractable stochastic multidimensional system, we introduce a pair approximation, which allows us to reduce the spine dynamics to a lower-dimensional, manageable system of closed equations. We found that the rates of information gain and energy use attain their maximal values during the initial period of LTP (i.e., during stimulation) and afterward recover to their low baseline values, as opposed to the memory trace, which lasts much longer. This suggests that the learning phase is much more energy demanding than the memory phase. We show that positive correlations between neighboring spines increase both the duration of the memory trace and the energy cost during LTP, but the memory time per invested energy increases dramatically for very strong positive synaptic cooperativity, suggesting a beneficial role of synaptic clustering for memory duration. In contrast, information gain after LTP is largest for negative correlations, and the energy efficiency of that information generally declines with increasing synaptic cooperativity. We also find that dendritic spines can use sparse representations for encoding long-term information, as both the energetic and structural efficiencies of retained information and its lifetime exhibit maxima for low fractions of stimulated synapses during LTP. Moreover, such efficiencies drop significantly as the number of spines increases. In general, our stochastic thermodynamics approach provides a unifying framework for studying, from first principles, information encoding and its energy cost during learning and memory in stochastic systems of interacting synapses.
Affiliation(s)
- Jan Karbowski
- Institute of Applied Mathematics and Mechanics, University of Warsaw, Warsaw 02-097, Poland
- Paulina Urban
- College of Inter-Faculty Individual Studies in Mathematics and Natural Sciences and Laboratory of Functional and Structural Genomics, Centre of New Technologies, University of Warsaw, Warsaw 02-097, Poland
- Laboratory of Databases and Business Analytics, National Information Processing Institute, National Research Institute, Warsaw 00-608, Poland
7
Astrocyte strategies in the energy-efficient brain. Essays Biochem 2023; 67:3-16. PMID: 36350053; DOI: 10.1042/ebc20220077. Received 2022-08-29; revised 2022-10-11; accepted 2022-10-13.
Abstract
Astrocytes generate ATP through glycolysis and mitochondrial respiration, using glucose, lactate, fatty acids, amino acids, and ketone bodies as metabolic fuels. Astrocytic mitochondria also participate in neuronal redox homeostasis and neurotransmitter recycling. In this essay, we aim to integrate the multifaceted evidence about astrocyte bioenergetics at the cellular and systems levels, with a focus on mitochondrial oxidation. At the cellular level, the use of fatty acid β-oxidation and the existence of molecular switches for the selection of metabolic mode and fuels are examined. At the systems level, we discuss energy audits of astrocytes and how astrocytic Ca2+ signaling might contribute to the higher performance and lower energy consumption of the brain as compared to engineered circuits. We finish by examining the neural-circuit dysregulation and behavioral impairment associated with alterations of astrocytic mitochondria. We conclude that astrocytes may contribute to brain energy efficiency by coupling energy, redox, and computational homeostasis in neural circuits.
8
Ernst E. The AI trilemma: Saving the planet without ruining our jobs. Front Artif Intell 2022; 5:886561. PMID: 36337142; PMCID: PMC9626962; DOI: 10.3389/frai.2022.886561. Received 2022-02-28; accepted 2022-08-12. Open access.
Abstract
Digitalization and artificial intelligence increasingly affect the world of work. The rising risk of massive job losses has sparked technological fears. Limited income and productivity gains, concentrated among a few tech companies, are fueling inequalities. In addition, the increasing ecological footprint of digital technologies has become the focus of much discussion. Technological progress thus creates a trilemma of rising inequality, low productivity growth, and high ecological costs. How can this trilemma be resolved? Which digital applications should be promoted specifically? And what should policymakers do to address this trilemma? This contribution shows that policymakers should create suitable conditions to fully exploit the potential of network applications (transport, information exchange, supply, provisioning) in order to reap maximum societal benefits that can be widely shared. This requires shifting incentives away from current uses toward those that can, at least partially, address the trilemma. The contribution analyses the scope and limits of current policy instruments in this regard and discusses alternative approaches that are more aligned with the properties of the emerging technological paradigm underlying the digital economy. In particular, it discusses the possibility of institutional innovations required to address the socio-economic challenges resulting from the technological innovations brought about by artificial intelligence.
Affiliation(s)
- Ekkehard Ernst
- International Labour Organization, Department of Research, Geneva, Switzerland
9
How can caching explain automaticity? Psychon Bull Rev 2022; 30:407-420. PMID: 36224462; DOI: 10.3758/s13423-022-02191-0. Accepted 2022-09-15.
Abstract
Automaticity is still ill-understood, and its relation to habit formation and skill acquisition is highly debated. Recently, the principle of caching has been advanced as a potentially promising avenue for studying automaticity. It is roughly understood as a means of storing direct input-output associations in a manner that supports instant lookup. We raise various concerns that should be addressed before the theoretical progress afforded by this principle can be evaluated. Is caching merely a metaphor borrowed from computer caching, or is it a computational model that can be used to derive testable predictions? How do the short-term and long-term effects of automaticity relate to the distinction between working memory and long-term memory? Does caching apply to stimulus-response associations, as already suggested by Logan's instance theory, or to algorithms too? How much practice is required for caching, and how does caching depend on the type of task? What is the relation between control processes and caching as these pertain to the possible suppression of automatic processes? Dealing with these questions will arguably also advance our understanding of automaticity.
10
Small, correlated changes in synaptic connectivity may facilitate rapid motor learning. Nat Commun 2022; 13:5163. PMID: 36056006; PMCID: PMC9440011; DOI: 10.1038/s41467-022-32646-w. Received 2021-10-28; accepted 2022-08-08. Open access.
Abstract
Animals rapidly adapt their movements to external perturbations, a process paralleled by changes in neural activity in the motor cortex. Experimental studies suggest that these changes originate from altered inputs (Hinput) rather than from changes in local connectivity (Hlocal), as neural covariance is largely preserved during adaptation. Since measuring synaptic changes in vivo remains very challenging, we used a modular recurrent neural network to qualitatively test this interpretation. As expected, Hinput resulted in small activity changes and largely preserved covariance. Surprisingly, given the presumed dependence of stable covariance on preserved circuit connectivity, Hlocal led to only slightly larger changes in activity and covariance, still within the range of experimental recordings. This similarity arises because Hlocal requires only small, correlated connectivity changes for successful adaptation. Simulations of tasks that impose increasingly larger behavioural changes revealed a growing difference between Hinput and Hlocal, which could be exploited when designing future experiments.
11
Albesa-González A, Froc M, Williamson O, van Rossum MCW. Weight dependence in BCM leads to adjustable synaptic competition. J Comput Neurosci 2022; 50:431-444. PMID: 35764852; PMCID: PMC9666303; DOI: 10.1007/s10827-022-00824-w. Received 2021-11-23; revised 2022-05-15; accepted 2022-06-08.
Abstract
Models of synaptic plasticity have been used to better understand neural development as well as learning and memory. One prominent classic model is the Bienenstock-Cooper-Munro (BCM) model that has been particularly successful in explaining plasticity of the visual cortex. Here, in an effort to include more biophysical detail in the BCM model, we incorporate 1) feedforward inhibition, and 2) the experimental observation that large synapses are relatively harder to potentiate than weak ones, while synaptic depression is proportional to the synaptic strength. These modifications change the outcome of unsupervised plasticity under the BCM model. The amount of feed-forward inhibition adds a parameter to BCM that turns out to determine the strength of competition. In the limit of strong inhibition the learning outcome is identical to standard BCM and the neuron becomes selective to one stimulus only (winner-take-all). For smaller values of inhibition, competition is weaker and the receptive fields are less selective. However, both BCM variants can yield realistic receptive fields.
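The weight dependence described above can be read as follows (the precise functional form here is an assumption for illustration, not the paper's equations): potentiation is attenuated for synapses near a soft maximum, while depression scales with the current weight.

```python
# Illustrative weight-dependent BCM update (functional form assumed,
# not the paper's equations): potentiation is attenuated for synapses
# near a soft maximum; depression is proportional to the weight.

def bcm_step(w, x, y, theta, lr=0.01, w_max=2.0):
    phi = y * (y - theta)          # classic BCM modification factor
    if phi > 0:                    # potentiation: harder for large w
        dw = lr * phi * x * (1.0 - w / w_max)
    else:                          # depression: proportional to w
        dw = lr * phi * x * (w / w_max)
    return w + dw
```

As in standard BCM, the sliding threshold `theta` would track the running average of the squared postsynaptic rate; the weight-dependent factors soften competition without hard weight bounds.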
Affiliation(s)
- Albert Albesa-González
- School of Psychology and School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, UK
- Maxime Froc
- School of Psychology and School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, UK
- Oliver Williamson
- School of Psychology and School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, UK
- Mark C. W. van Rossum
- School of Psychology and School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, UK
12
Hong SZ, Mesik L, Grossman CD, Cohen JY, Lee B, Severin D, Lee HK, Hell JW, Kirkwood A. Norepinephrine potentiates and serotonin depresses visual cortical responses by transforming eligibility traces. Nat Commun 2022; 13:3202. PMID: 35680879; PMCID: PMC9184610; DOI: 10.1038/s41467-022-30827-1. Received 2021-06-22; accepted 2022-05-19. Open access.
Abstract
Reinforcement allows organisms to learn which stimuli predict subsequent biological relevance. Hebbian mechanisms of synaptic plasticity are insufficient to account for reinforced learning, because neuromodulators signaling biological relevance are delayed with respect to the neural activity associated with the stimulus. A theoretical solution is the concept of eligibility traces (eTraces): silent synaptic processes elicited by activity which, upon arrival of a neuromodulator, are converted into a lasting change in synaptic strength. Previously, we demonstrated in visual cortical slices the Hebbian induction of eTraces and their conversion into LTP and LTD by the retroactive action of norepinephrine and serotonin. Here we show in vivo in mouse V1 that the induction of eTraces and their conversion to LTP/D by norepinephrine and serotonin, respectively, potentiates and depresses visual responses. We also show that the integrity of this process is crucial for ocular dominance plasticity, a canonical model of experience-dependent plasticity.
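The eTrace scheme can be sketched as a decaying silent trace converted into a weight change by a delayed neuromodulator pulse. Time constants and the learning rate are illustrative; the same scheme with a negative conversion factor would model serotonergic LTD instead of noradrenergic LTP.

```python
# Sketch of eligibility-trace conversion (illustrative parameters).
# Hebbian pairing at t = 0 leaves a silent, decaying trace; a delayed
# neuromodulator pulse converts the remaining trace into LTP.

def run_trial(delay, tau=20.0, lr=0.5):
    w, trace = 1.0, 0.0
    for t in range(100):
        if t == 0:
            trace += 1.0               # pre/post pairing sets the eTrace
        trace *= 1.0 - 1.0 / tau       # silent trace decays
        if t == delay:
            w += lr * trace            # neuromodulator converts trace to LTP
    return w
```

Because the trace decays, longer pairing-to-neuromodulator delays yield smaller lasting weight changes, capturing the retroactive but time-limited action of the modulator.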
Affiliation(s)
- Su Z Hong
- Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, 21218, USA
- Lukas Mesik
- Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, 21218, USA
- Cooper D Grossman
- Department of Neuroscience, Johns Hopkins University, Baltimore, MD, 21205, USA
- Jeremiah Y Cohen
- Department of Neuroscience, Johns Hopkins University, Baltimore, MD, 21205, USA
- Boram Lee
- Department of Pharmacology, University of California at Davis, Davis, CA, 95616, USA
- Daniel Severin
- Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, 21218, USA
- Hey-Kyoung Lee
- Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, 21218, USA
- Department of Neuroscience, Johns Hopkins University, Baltimore, MD, 21205, USA
- Johannes W Hell
- Department of Pharmacology, University of California at Davis, Davis, CA, 95616, USA
- Alfredo Kirkwood
- Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, 21218, USA
- Department of Neuroscience, Johns Hopkins University, Baltimore, MD, 21205, USA
13
Learning induces coordinated neuronal plasticity of metabolic demands and functional brain networks. Commun Biol 2022; 5:428. PMID: 35534605; PMCID: PMC9085889; DOI: 10.1038/s42003-022-03362-4. Received 2021-11-26; accepted 2022-04-12. Open access.
Abstract
The neurobiological basis of learning is reflected in adaptations of brain structure, network organization and energy metabolism. However, it is still unknown how different neuroplastic mechanisms act together and if cognitive advancements relate to general or task-specific changes. Therefore, we tested how hierarchical network interactions contribute to improvements in the performance of a visuo-spatial processing task by employing simultaneous PET/MR neuroimaging before and after a 4-week learning period. We combined functional PET and metabolic connectivity mapping (MCM) to infer directional interactions across brain regions. Learning altered the top-down regulation of the salience network onto the occipital cortex, with increases in MCM at resting-state and decreases during task execution. Accordingly, a higher divergence between resting-state and task-specific effects was associated with better cognitive performance, indicating that these adaptations are complementary and both required for successful visuo-spatial skill learning. Simulations further showed that changes at resting-state were dependent on glucose metabolism, whereas those during task performance were driven by functional connectivity between salience and visual networks. Referring to previous work, we suggest that learning establishes a metabolically expensive skill engram at rest, whose retrieval serves for efficient task execution by minimizing prediction errors between neuronal representations of brain regions on different hierarchical levels. Brain network analyses reveal coupled changes between functional connectivity and metabolic demands that relate to cognitive performance improvements induced by learning a challenging visuo-spatial task for four weeks.
|
14
|
Chen H, Xie L, Wang Y, Zhang H. Postsynaptic Potential Energy as Determinant of Synaptic Plasticity. Front Comput Neurosci 2022; 16:804604. [PMID: 35250524 PMCID: PMC8891168 DOI: 10.3389/fncom.2022.804604] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2021] [Accepted: 01/13/2022] [Indexed: 02/06/2023] Open
Abstract
Metabolic energy can be used as a unifying principle to control neuronal activity. However, whether and how metabolic energy alone can determine the outcome of synaptic plasticity remains unclear. This study proposes a computational model of synaptic plasticity that is completely determined by energy. A simple quantitative relationship between synaptic plasticity and postsynaptic potential energy is established: synaptic weight is directly proportional to the difference between the baseline potential energy and the suprathreshold potential energy, and is constrained by the maximum energy supply. Results show that the energy constraint improves the performance of synaptic plasticity and avoids setting a hard boundary on synaptic weights. With the same set of model parameters, the model reproduces several classical experiments in homo- and heterosynaptic plasticity and explains the interaction between Hebbian and homeostatic plasticity at the cellular level. Forms of homeostatic synaptic plasticity operating at different time scales coexist: homeostatic plasticity on a long time scale arises from heterosynaptic plasticity, whereas homeostatic plasticity on the same time scale as Hebbian plasticity arises from the constraint on energy supply.
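The proportionality and supply constraint described above can be illustrated with a toy update rule; the functional form, sign convention, and parameter values below are assumptions for illustration, not the authors' model:

```python
import numpy as np

def weight_update(e_baseline, e_supra, e_max, k=0.1):
    """Toy energy-based plasticity rule (sketch).

    The weight change is proportional to the difference between
    suprathreshold and baseline postsynaptic potential energy, and the
    maximum energy supply e_max bounds the magnitude of any single
    update, replacing a hard bound on the weights themselves.
    """
    delta = k * (e_supra - e_baseline)      # energy difference drives plasticity
    return float(np.clip(delta, -e_max, e_max))  # supply caps the change

print(weight_update(e_baseline=1.0, e_supra=3.0, e_max=0.5))   # within supply
print(weight_update(e_baseline=1.0, e_supra=30.0, e_max=0.5))  # supply-limited
```

The second call shows the point of the constraint: however large the energy difference, the realized change saturates at what the metabolic supply can fund.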
Affiliation(s)
- Huanwen Chen
- School of Automation, Central South University, Changsha, China
- *Correspondence: Huanwen Chen
- Lijuan Xie
- Institute of Physiology and Psychology, School of Marxism, Changsha University of Science and Technology, Changsha, China
- Yijun Wang
- School of Automation, Central South University, Changsha, China
- Hang Zhang
- School of Automation, Central South University, Changsha, China
|
15
|
Hernandez-Diaz S, Ghimire S, Sanchez-Mirasierra I, Montecinos-Oliva C, Swerts J, Kuenen S, Verstreken P, Soukup SF. Endophilin-B regulates autophagy during synapse development and neurodegeneration. Neurobiol Dis 2021; 163:105595. [PMID: 34933093 DOI: 10.1016/j.nbd.2021.105595] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/17/2021] [Revised: 12/13/2021] [Accepted: 12/17/2021] [Indexed: 01/18/2023] Open
Abstract
Synapses are critical for neuronal communication and brain function. To maintain neuronal homeostasis, synapses rely on autophagy. Autophagic alterations cause neurodegeneration, and synaptic dysfunction is a feature of neurodegenerative diseases. In Parkinson's disease (PD), where the loss of synapses precedes the loss of dopaminergic neurons, various PD-causative proteins are involved in the regulation of autophagy. So far, only a few factors regulating autophagy at the synapse have been identified, and the molecular mechanisms underlying synaptic autophagy are only partially understood. Here, we describe Endophilin-B (EndoB) as a novel player in the regulation of synaptic autophagy in health and disease. We demonstrate that EndoB is required for autophagosome biogenesis at the synapse, and that loss of EndoB blocks the autophagy induction promoted by the PD mutation LRRK2 G2019S. We show that EndoB is required to prevent neuronal loss. Moreover, loss of EndoB in the Drosophila visual system increases the number of synaptic contacts between photoreceptor terminals and their post-synaptic partners. These data confirm the role of autophagy in synaptic contact formation and neuronal survival.
Affiliation(s)
- Saurav Ghimire
- Univ. Bordeaux, CNRS, IMN, UMR 5293, F-33000 Bordeaux, France
- Jef Swerts
- VIB Center for the Biology of Disease, Belgium; KU Leuven, Department for Human Genetics, Leuven Institute for Neurodegenerative Disease (LIND), 3000 Leuven, Belgium
- Sabine Kuenen
- VIB Center for the Biology of Disease, Belgium; KU Leuven, Department for Human Genetics, Leuven Institute for Neurodegenerative Disease (LIND), 3000 Leuven, Belgium
- Patrik Verstreken
- VIB Center for the Biology of Disease, Belgium; KU Leuven, Department for Human Genetics, Leuven Institute for Neurodegenerative Disease (LIND), 3000 Leuven, Belgium
|
16
|
Abstract
Energy constraints are a fundamental limitation of the brain, which is physically embedded in a restricted space. The collective dynamics of neurons through connections enable the brain to achieve rich functionality, but building connections and maintaining activity come at a high cost. The effects of reducing these costs can be found in the characteristic structures of the brain network. Nevertheless, the mechanism by which energy constraints affect the organization and formation of the neuronal network in the brain is unclear. Here, it is shown that a simple model based on cost minimization can reproduce structures characteristic of the brain network. With reference to the behavior of neurons in real brains, the cost function was introduced in an activity-dependent form correlating the activity cost and the wiring cost as a simple ratio. Cost reduction of this ratio resulted in strengthening connections, especially at highly activated nodes, and induced the formation of large clusters. Regarding these network features, statistical similarity was confirmed by comparison to connectome datasets from various real brains. The findings indicate that these networks share an efficient structure maintained with low costs, both for activity and for wiring. These results imply the crucial role of energy constraints in regulating the network activity and structure of the brain.
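The activity-dependent cost ratio described above can be caricatured in a few lines. The specific ratio chosen here (total wiring cost over the activity carried by the links) and all parameters are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy network: random node activities and symmetric wiring weights.
n = 8
activity = rng.uniform(0.1, 1.0, n)
pair_act = np.outer(activity, activity)   # activity product for each link
w = rng.uniform(0.1, 0.5, (n, n))
w = (w + w.T) / 2                          # undirected connections
np.fill_diagonal(w, 0.0)

def cost(w):
    # Assumed cost ratio: total wiring cost / activity supported by links.
    return w.sum() / (w * pair_act).sum()

# Reduce the ratio by gradient descent on the weights, kept in [0, 1].
for _ in range(500):
    W, A = w.sum(), (w * pair_act).sum()
    grad = (A - W * pair_act) / A**2       # d(cost)/d(w_ij)
    w = np.clip(w - 0.1 * grad, 0.0, 1.0)
    np.fill_diagonal(w, 0.0)

# Links between the most active nodes strengthen while links between the
# least active nodes are pruned, echoing the activity-driven cluster
# formation the abstract describes.
hi = np.argsort(activity)[-2:]
lo = np.argsort(activity)[:2]
print(w[hi[0], hi[1]], w[lo[0], lo[1]])
```

Minimizing the ratio favors spending wiring where it buys the most activity, which is the qualitative mechanism behind the strengthened hubs and large clusters reported above.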
|
17
|
Bianco R, Harrison PMC, Hu M, Bolger C, Picken S, Pearce MT, Chait M. Long-term implicit memory for sequential auditory patterns in humans. eLife 2020; 9:e56073. [PMID: 32420868 PMCID: PMC7338054 DOI: 10.7554/elife.56073] [Citation(s) in RCA: 16] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/16/2020] [Accepted: 05/18/2020] [Indexed: 11/17/2022] Open
Abstract
Memory, on multiple timescales, is critical to our ability to discover the structure of our surroundings and to interact efficiently with the environment. We combined behavioural manipulation and modelling to investigate the dynamics of memory formation for rarely reoccurring acoustic patterns. In a series of experiments, participants detected the emergence of regularly repeating patterns within rapid tone-pip sequences. Unbeknownst to them, a few patterns reoccurred every ~3 min. All sequences consisted of the same 20 frequencies and were distinguishable only by the order of the tone-pips. Despite this, reoccurring patterns were associated with a rapidly growing detection-time advantage over novel patterns. This effect was implicit, robust to interference, and persisted for 7 weeks. The results implicate an interplay between short-term (a few seconds) and long-term (over many minutes) integration in memory formation and demonstrate the remarkable sensitivity of the human auditory system to sporadically reoccurring structure within the acoustic environment.
Affiliation(s)
- Roberta Bianco
- UCL Ear Institute, University College London, London, United Kingdom
- Peter MC Harrison
- School of Electronic Engineering and Computer Science, Queen Mary University of London, London, United Kingdom
- Mingyue Hu
- UCL Ear Institute, University College London, London, United Kingdom
- Cora Bolger
- UCL Ear Institute, University College London, London, United Kingdom
- Samantha Picken
- UCL Ear Institute, University College London, London, United Kingdom
- Marcus T Pearce
- School of Electronic Engineering and Computer Science, Queen Mary University of London, London, United Kingdom
- Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
- Maria Chait
- UCL Ear Institute, University College London, London, United Kingdom
|