1
Gerevich Z, Kovács R, Liotta A, Hasam-Henderson LA, Weh L, Wallach I, Berndt N. Metabolic implications of axonal demyelination and its consequences for synchronized network activity: An in silico and in vitro study. J Cereb Blood Flow Metab 2023; 43:1571-1587. [PMID: 37125487; PMCID: PMC10414014; DOI: 10.1177/0271678X231170746]
Abstract
Myelination enhances the conduction velocity of action potentials (APs) and increases energy efficiency. Thick myelin sheaths are typically found on long-distance axonal connections or on fast-spiking interneurons, which are critical for synchronizing neuronal networks during gamma-band oscillations. Loss of the myelin sheath is associated with multiple alterations in axonal architecture that impair AP propagation. While numerous studies have addressed the effects of demyelination on conduction velocity, the metabolic effects and the consequences for network synchronization have not been investigated. Here we present a unifying computational model of the electrophysiology and metabolism of the myelinated axon. The model suggests that demyelination not only decreases AP speed but also that AP propagation in demyelinated axons requires compensatory processes, such as an increase in mitochondrial mass and a switch from saltatory to continuous propagation, to rescue axon functionality at the cost of reduced propagation speed and increased energy expenditure. These predictions were confirmed in a culture model of demyelination, in which pharmacologically induced loss of myelin was associated with increased oxygen consumption rates, a significant broadening of the bandwidth, and a decrease in the power of gamma oscillations.
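The energetic penalty of switching from saltatory to continuous conduction can be made concrete with a back-of-envelope estimate of the ATP needed to pump out the Na+ that enters during one action potential. The sketch below is not the authors' model; the membrane parameters, the 1% nodal-membrane fraction, and the Na+ load factor are illustrative assumptions only.

```python
# Back-of-envelope sketch (not the authors' model): ATP cost of one action
# potential (AP) in a myelinated vs. a demyelinated axon segment, estimated
# from the Na+ charge needed to depolarize the membrane capacitance.
# All parameter values are illustrative assumptions.

ELEMENTARY_CHARGE = 1.602e-19   # coulombs per monovalent ion
NA_PER_ATP = 3                  # the Na+/K+-ATPase extrudes 3 Na+ per ATP

def atp_per_ap(c_per_area, area_cm2, delta_v=0.1, load_factor=4.0):
    """ATP needed to pump out the Na+ that enters during one AP.

    c_per_area  -- membrane capacitance (F/cm^2)
    area_cm2    -- excitable membrane area engaged by the AP
    delta_v     -- AP amplitude (V)
    load_factor -- excess Na+ entry beyond the capacitive minimum
    """
    charge = c_per_area * area_cm2 * delta_v * load_factor  # coulombs
    na_ions = charge / ELEMENTARY_CHARGE
    return na_ions / NA_PER_ATP

# 1 cm of axon, 1 um diameter: lateral area ~ pi * 1e-4 cm * 1 cm.
area = 3.14e-4  # cm^2

# Myelinated: only ~1% of the length is exposed nodal membrane (saltatory).
myelinated = atp_per_ap(1e-6, 0.01 * area)
# Demyelinated: the entire membrane is exposed (continuous conduction).
demyelinated = atp_per_ap(1e-6, area)

print(f"ATP per AP, myelinated:   {myelinated:.2e}")
print(f"ATP per AP, demyelinated: {demyelinated:.2e} (~{demyelinated/myelinated:.0f}x)")
```

Under these assumptions the demyelinated segment pays roughly the ratio of exposed membrane areas, about two orders of magnitude more ATP per propagated spike.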
Affiliation(s)
- Zoltan Gerevich
- Institute of Neurophysiology, Charité – Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Richard Kovács
- Institute of Neurophysiology, Charité – Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Agustin Liotta
- Institute of Neurophysiology, Charité – Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Department of Anesthesiology and Intensive Care, Charité – Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Institute of Computer-assisted Cardiovascular Medicine, Deutsches Herzzentrum der Charité (DHZC), Berlin, Germany
- Luisa A Hasam-Henderson
- Institute of Neurophysiology, Charité – Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Ludwig Weh
- Institute of Biochemistry, Charité – Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Iwona Wallach
- Institute of Computer-assisted Cardiovascular Medicine, Deutsches Herzzentrum der Charité (DHZC), Berlin, Germany
- Charité – Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Nikolaus Berndt
- Institute of Computer-assisted Cardiovascular Medicine, Deutsches Herzzentrum der Charité (DHZC), Berlin, Germany
- Charité – Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
2
Sawicki J, Berner R, Loos SAM, Anvari M, Bader R, Barfuss W, Botta N, Brede N, Franović I, Gauthier DJ, Goldt S, Hajizadeh A, Hövel P, Karin O, Lorenz-Spreen P, Miehl C, Mölter J, Olmi S, Schöll E, Seif A, Tass PA, Volpe G, Yanchuk S, Kurths J. Perspectives on adaptive dynamical systems. Chaos 2023; 33:071501. [PMID: 37486668; DOI: 10.1063/5.0147231]
Abstract
Adaptivity is a dynamical feature that is omnipresent in nature, socio-economics, and technology. For example, adaptive couplings appear in various real-world systems, such as power grids and social and neural networks, and they form the backbone of closed-loop control strategies and machine learning algorithms. In this article, we provide an interdisciplinary perspective on adaptive systems. We reflect on the notion and terminology of adaptivity in different disciplines and discuss the role adaptivity plays in various fields. We highlight common open challenges and offer perspectives on future research directions, with the aim of inspiring interdisciplinary approaches.
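One concrete instance of such adaptive couplings, assumed here purely for illustration, is an adaptive Kuramoto network of the kind widely studied in this literature, in which the coupling matrix slowly co-evolves with the oscillator phases:

```python
# Minimal sketch of an adaptive dynamical system (illustrative, not from the
# paper): a Kuramoto network whose coupling weights slowly co-evolve with the
# phase differences they transmit.
import numpy as np

rng = np.random.default_rng(0)
N, dt, steps = 20, 0.01, 50_000
eps, beta = 0.01, 0.1                      # adaptation rate, plasticity offset

theta = rng.uniform(0, 2 * np.pi, N)       # oscillator phases
omega = rng.normal(0.0, 0.5, N)            # natural frequencies
kappa = rng.uniform(-1, 1, (N, N))         # adaptive coupling matrix

for _ in range(steps):                     # forward Euler integration
    d = theta[:, None] - theta[None, :]    # phase differences theta_i - theta_j
    dtheta = omega - (kappa * np.sin(d)).sum(axis=1) / N
    dkappa = -eps * (kappa + np.sin(d + beta))   # slow weight adaptation
    theta += dt * dtheta
    kappa += dt * dkappa

# Global phase coherence: 1 = fully synchronized, ~0 = incoherent.
order = abs(np.exp(1j * theta).mean())
print(f"phase coherence after adaptation: {order:.2f}")
```

Depending on the plasticity offset beta, rules of this form can drive the network toward synchronized, clustered, or frequency-locked states, which is what makes the adaptive case richer than static coupling.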
Affiliation(s)
- Jakub Sawicki
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Akademie Basel, Fachhochschule Nordwestschweiz FHNW, Leonhardsstrasse 6, 4009 Basel, Switzerland
- Rico Berner
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Sarah A M Loos
- DAMTP, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA, United Kingdom
- Mehrnaz Anvari
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Fraunhofer Institute for Algorithms and Scientific Computing, Schloss Birlinghoven, 53757 Sankt-Augustin, Germany
- Rolf Bader
- Institute of Systematic Musicology, University of Hamburg, Hamburg, Germany
- Wolfram Barfuss
- Transdisciplinary Research Area: Sustainable Futures, University of Bonn, 53113 Bonn, Germany
- Center for Development Research (ZEF), University of Bonn, 53113 Bonn, Germany
- Nicola Botta
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Department of Computer Science and Engineering, Chalmers University of Technology, 412 96 Göteborg, Sweden
- Nuria Brede
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Department of Computer Science, University of Potsdam, An der Bahn 2, 14476 Potsdam, Germany
- Igor Franović
- Scientific Computing Laboratory, Center for the Study of Complex Systems, Institute of Physics Belgrade, University of Belgrade, Pregrevica 118, 11080 Belgrade, Serbia
- Daniel J Gauthier
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Sebastian Goldt
- Department of Physics, International School of Advanced Studies (SISSA), Trieste, Italy
- Aida Hajizadeh
- Research Group Comparative Neuroscience, Leibniz Institute for Neurobiology, Magdeburg, Germany
- Philipp Hövel
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Omer Karin
- Department of Mathematics, Imperial College London, London SW7 2AZ, United Kingdom
- Philipp Lorenz-Spreen
- Center for Adaptive Rationality, Max Planck Institute for Human Development, Lentzeallee 94, 14195 Berlin, Germany
- Christoph Miehl
- Akademie Basel, Fachhochschule Nordwestschweiz FHNW, Leonhardsstrasse 6, 4009 Basel, Switzerland
- Jan Mölter
- Department of Mathematics, School of Computation, Information and Technology, Technical University of Munich, Boltzmannstraße 3, 85748 Garching bei München, Germany
- Simona Olmi
- Akademie Basel, Fachhochschule Nordwestschweiz FHNW, Leonhardsstrasse 6, 4009 Basel, Switzerland
- Eckehard Schöll
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Akademie Basel, Fachhochschule Nordwestschweiz FHNW, Leonhardsstrasse 6, 4009 Basel, Switzerland
- Alireza Seif
- Pritzker School of Molecular Engineering, The University of Chicago, Chicago, Illinois 60637, USA
- Peter A Tass
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, California 94304, USA
- Giovanni Volpe
- Department of Physics, University of Gothenburg, Gothenburg, Sweden
- Serhiy Yanchuk
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Jürgen Kurths
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
3
Jacob M, Ford J, Deacon T. Cognition is entangled with metabolism: relevance for resting-state EEG-fMRI. Front Hum Neurosci 2023; 17:976036. [PMID: 37113322; PMCID: PMC10126302; DOI: 10.3389/fnhum.2023.976036]
Abstract
The brain is a living organ with distinct metabolic constraints. However, these constraints are typically considered secondary to, or merely supportive of, the information processing performed by neurons. The default operational definition of neural information processing is that (1) it is ultimately encoded as a change in individual neuronal firing rates that correlates with the presentation of a peripheral stimulus, motor action, or cognitive task. Two additional assumptions accompany this default interpretation: (2) that the incessant background firing activity against which changes in activity are measured plays no role in assigning significance to the extrinsically evoked change in neural firing, and (3) that the metabolic energy that sustains this background activity, and that correlates with differences in neuronal firing rate, is merely a response to an evoked change in neuronal activity. These assumptions underlie the design, implementation, and interpretation of neuroimaging studies, particularly fMRI, which relies on changes in blood oxygenation as an indirect measure of neural activity. In this article we reconsider all three assumptions in light of recent evidence. We suggest that by combining EEG with fMRI, new experimental work can reconcile emerging controversies in neurovascular coupling and clarify the significance of ongoing, background activity during resting-state paradigms. A new conceptual framework for neuroimaging paradigms is developed to investigate how ongoing neural activity is "entangled" with metabolism. That is, in addition to being recruited to support locally evoked neuronal activity (the traditional hemodynamic response), changes in metabolic support may be independently "invoked" by non-local brain regions, yielding flexible neurovascular coupling dynamics that inform the cognitive context. This framework demonstrates how multimodal neuroimaging is necessary to probe the neurometabolic foundations of cognition, with implications for the study of neuropsychiatric disorders.
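The "traditional hemodynamic response" referred to above is usually operationalized as a linear convolution of neural activity with a canonical hemodynamic response function (HRF). The sketch below illustrates that default forward model under common double-gamma assumptions; the parameter values are the usual illustrative defaults, not taken from this article.

```python
# Sketch of the "traditional hemodynamic response" assumption: BOLD predicted
# by convolving a neural time course with a canonical double-gamma HRF.
# Parameter values are common illustrative defaults, not from this article.
import numpy as np
from scipy.stats import gamma

dt = 0.1
t = np.arange(0, 60, dt)                   # seconds

# Canonical double-gamma HRF: positive peak near 5 s, undershoot near 15 s.
hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6
hrf /= hrf.max()

# Toy neural time course: background activity plus two brief evoked responses.
neural = 0.1 * np.ones_like(t)
neural[(t >= 10) & (t < 11)] += 1.0
neural[(t >= 35) & (t < 36)] += 1.0

bold = np.convolve(neural, hrf)[: len(t)] * dt   # linear forward model
print(f"predicted BOLD peak at t = {t[np.argmax(bold)]:.1f} s (stimulus at 10 s)")
```

The article's point is precisely that this evoked, locally coupled model may be incomplete: the background term here is treated as a passive offset, whereas metabolic support may also be "invoked" non-locally.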
Affiliation(s)
- Michael Jacob
- Mental Health Service, San Francisco VA Healthcare System, San Francisco, CA, United States
- Department of Psychiatry and Behavioral Sciences, Weill Institute for Neurosciences, University of California, San Francisco, San Francisco, CA, United States
- Judith Ford
- Mental Health Service, San Francisco VA Healthcare System, San Francisco, CA, United States
- Department of Psychiatry and Behavioral Sciences, Weill Institute for Neurosciences, University of California, San Francisco, San Francisco, CA, United States
- Terrence Deacon
- Department of Anthropology, University of California, Berkeley, Berkeley, CA, United States
4
Desroches M, Rinzel J, Rodrigues S. Classification of bursting patterns: A tale of two ducks. PLoS Comput Biol 2022; 18:e1009752. [DOI: 10.1371/journal.pcbi.1009752]
Abstract
Bursting is one of the fundamental rhythms that excitable cells can generate either in response to incoming stimuli or intrinsically. It has been a topic of intense research in computational biology for several decades. The classification of bursting oscillations in excitable systems has been the subject of active research since the early 1980s and is still ongoing. As a by-product, it establishes analytical and numerical foundations for studying complex temporal behaviors in multiple timescale models of cellular activity. In this review, we first present the seminal works of Rinzel and Izhikevich in classifying bursting patterns of excitable systems. We recall a complementary mathematical classification approach by Bertram and colleagues, and then by Golubitsky and colleagues, which, together with the Rinzel-Izhikevich proposals, provide the state-of-the-art foundations for these classifications. Beyond classical approaches, we review a recent bursting example that falls outside the previous classification systems. Generalizing this example leads us to propose an extended classification, which requires the analysis of both fast and slow subsystems of an underlying slow-fast model and allows the dissection of a larger class of bursters. Namely, we provide a general framework for bursting systems with both subthreshold and superthreshold oscillations. A new class of bursters with at least 2 slow variables is then added, which we denote folded-node bursters, to convey the idea that the bursts are initiated or annihilated via a folded-node singularity. Key to this mechanism are so-called canard or duck orbits, organizing the underpinning excitability structure. We describe the 2 main families of folded-node bursters, depending upon the phase (active/spiking or silent/nonspiking) of the bursting cycle during which folded-node dynamics occurs. We classify both families and give examples of minimal systems displaying these novel bursting patterns. Finally, we provide a biophysical example by reinterpreting a generic conductance-based episodic burster as a folded-node burster, showing that the associated framework can explain its subthreshold oscillations over a larger parameter region than the fast subsystem approach.

Bursting is ubiquitous in cellular excitable rhythms and comes in a plethora of patterns, both experimentally recorded and reproduced through models. As these different patterns may reflect different coding or information properties, it is crucial to develop modeling frameworks that can both capture them and explain their characteristics. In this review, we propose a comprehensive account of the main bursting classification systems that have been developed over the past 40 years, together with recent developments allowing us to extend these classifications. Based upon bifurcation theory and heavily reliant on timescale separation, these schemes take full advantage of the fast-subsystem analysis obtained when slow variables are frozen and treated as bifurcation parameters. We complement this classical view by showing that a nontrivial slow subsystem may also encode key information for classifying bursting rhythms, due to the presence of so-called folded-node singularities. We provide minimal idealized models as well as one generic conductance-based example displaying bursting oscillations that require our extended classification in order to be fully characterized. We also highlight examples of biological data that could usefully be revisited through the lens of this extended classification and could lead to new models of complex cellular activity.
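For readers who want a concrete slow-fast burster to experiment with, the sketch below simulates the Hindmarsh-Rose model, a textbook example of the classical fast-subsystem setting described above. It has only one slow variable, so it illustrates the Rinzel-Izhikevich case rather than the folded-node extension (which requires at least two slow variables); the parameters are standard textbook values, not taken from the review.

```python
# Minimal slow-fast burster for experimentation (standard textbook
# parameters, not from the review): the Hindmarsh-Rose model. Its fast
# subsystem (x, y) is swept through bifurcations by one slow variable z,
# producing classical square-wave bursting. Folded-node bursters need at
# least two slow variables, so this only illustrates the classical case.
import numpy as np

def hindmarsh_rose(state, I=2.0, r=0.005, s=4.0, x_rest=-1.6):
    x, y, z = state                        # x: membrane potential (fast)
    dx = y + 3 * x**2 - x**3 - z + I       # fast subsystem
    dy = 1 - 5 * x**2 - y
    dz = r * (s * (x - x_rest) - z)        # slow adaptation variable
    return np.array([dx, dy, dz])

dt, steps = 0.01, 200_000                  # forward Euler, small step
state = np.array([-1.5, 0.0, 2.0])
spikes, prev_x = 0, state[0]
for _ in range(steps):
    state = state + dt * hindmarsh_rose(state)
    if prev_x < 0.5 <= state[0]:           # crude upward threshold crossing
        spikes += 1
    prev_x = state[0]

print(f"{spikes} spikes, grouped into bursts by the slow variable z")
```

Freezing z and sweeping it as a bifurcation parameter of the (x, y) subsystem reproduces exactly the fast-subsystem dissection on which the classical classifications are built.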
Affiliation(s)
- Mathieu Desroches
- MathNeuro Team, Inria Sophia Antipolis Méditerranée Research Centre, Sophia Antipolis, France
- MCEN Team, Basque Centre for Applied Mathematics (BCAM), Bilbao, Bizkaia, Spain
- * E-mail: (MD); (SR)
- John Rinzel
- Center for Neural Science, New York University, New York, New York, United States of America
- Courant Institute for Mathematical Sciences, New York University, New York, New York, United States of America
- Serafim Rodrigues
- MCEN Team, Basque Centre for Applied Mathematics (BCAM), Bilbao, Bizkaia, Spain
- Ikerbasque, The Basque Science Foundation, Bilbao, Bizkaia, Spain
- * E-mail: (MD); (SR)
5
Artificial neurovascular network (ANVN) to study the accuracy vs. efficiency trade-off in an energy dependent neural network. Sci Rep 2021; 11:13808. [PMID: 34226588; PMCID: PMC8257640; DOI: 10.1038/s41598-021-92661-7]
Abstract
Artificial feedforward neural networks perform a wide variety of classification and function approximation tasks with high accuracy. Unlike their artificial counterparts, biological neural networks require a supply of adequate energy, delivered to single neurons by a network of cerebral microvessels. Since energy is a limited resource, a natural question is whether the cerebrovascular network is capable of ensuring maximum performance of the neural network while consuming minimum energy, and whether the cerebrovascular network should also be trained, along with the neural network, to achieve such an optimum. To answer these questions in a simplified modeling setting, we constructed an Artificial Neurovascular Network (ANVN) comprising a multilayered perceptron (MLP) connected to a vascular tree structure. The root node of the vascular tree is connected to an energy source, and its terminal nodes supply energy to the hidden neurons of the MLP. The energy delivered by the terminal vascular nodes determines the biases of the hidden neurons. The "weights" on the branches of the vascular tree represent the energy distribution from parent to child nodes; they are updated by a kind of "backpropagation" of the energy demand error generated by the hidden neurons. We observed that higher performance was achieved at lower energy levels when the vascular network was trained along with the neural network, indicating that the vascular network must be trained to ensure efficient neural performance. Below a certain network size, the energetic dynamics of the network in the per capita energy consumption vs. classification accuracy space approached a fixed-point attractor for various initial conditions. Once the number of hidden neurons increased beyond a threshold, the fixed point appeared to vanish, giving way to a line of attractors. The model also showed that under a limited resource, the energy consumption of neurons is strongly correlated with their individual contribution to the network's performance.
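The energy-routing scheme can be sketched as follows, reconstructed loosely from the abstract rather than from the authors' code: a binary "vascular" tree splits a unit energy budget among leaves, and each branch's split fraction is nudged by a demand error aggregated from the leaves toward the root. The tree depth, the sigmoid squashing, and the update rule are assumptions made for illustration.

```python
# Toy reconstruction of the energy-routing idea (assumptions, not the
# authors' code): a binary "vascular" tree splits a unit energy budget among
# leaf nodes; branch split fractions are nudged by a demand error aggregated
# from the leaves toward the root.
import numpy as np

rng = np.random.default_rng(1)
depth = 3                                   # 2**depth leaves feed hidden units
weights = [rng.normal(size=2**d) for d in range(depth)]   # raw split logits

def deliver(total=1.0):
    """Propagate energy from the root down to the 2**depth leaves."""
    energy = np.array([total])
    for w in weights:
        p = 1.0 / (1.0 + np.exp(-w))        # squash to a (0, 1) left share
        children = np.empty(2 * energy.size)
        children[0::2] = energy * p         # left children of each node
        children[1::2] = energy * (1 - p)   # right children of each node
        energy = children
    return energy

demand = rng.uniform(0.05, 0.3, 2**depth)   # hypothetical per-neuron demand
demand /= demand.sum()                      # the tree can only split energy

for _ in range(500):
    err = demand - deliver()                # positive where a leaf is underfed
    for level in reversed(range(depth)):    # "backpropagate" the demand error
        left, right = err[0::2], err[1::2]
        weights[level] += 0.5 * (left - right)  # shift supply toward deficits
        err = left + right                  # aggregate error one level up

print("demand:", np.round(demand, 3))
print("supply:", np.round(deliver(), 3))
```

In the paper the leaf energies additionally set the hidden-neuron biases and the demand signal comes from the MLP's training, so this sketch captures only the routing-and-adaptation skeleton of the idea.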