1. Francés R, Rabah Y, Preat T, Plaçais PY. Diverting glial glycolytic flux towards neurons is a memory-relevant role of Drosophila CRH-like signalling. Nat Commun 2024; 15:10467. PMID: 39622834; PMCID: PMC11612226; DOI: 10.1038/s41467-024-54778-x.
Abstract
An essential role of glial cells is to meet the large and fluctuating energy needs of neurons. Metabolic adaptation is integral to the acute stress response, suggesting that glial cells could be major, yet overlooked, targets of stress hormones. Here we show that the Dh44 neuropeptide, the Drosophila homologue of mammalian corticotropin-releasing hormone (CRH), acts as an experience-dependent metabolic switch for glycolytic output in glia. Dh44 released by dopamine neurons limits glial fatty acid synthesis and the build-up of lipid stores. Although basally active, this hormonal axis is acutely stimulated following learning of a danger-predictive cue. This results in transient suppression of glial anabolic use of pyruvate, sparing it for memory-relevant energy supply to neurons. Diverting pyruvate destination may dampen the need to upregulate glial glycolysis in response to increased neuronal demand. Although beneficial for the energy efficiency of memory formation, this mechanism reveals an ongoing competition between neuronal fuelling and glial anabolism.
Affiliation(s)
- Raquel Francés
- Energy & Memory, Brain Plasticity (UMR 8249), CNRS, ESPCI Paris, PSL Research University, Paris, France
- Yasmine Rabah
- Energy & Memory, Brain Plasticity (UMR 8249), CNRS, ESPCI Paris, PSL Research University, Paris, France
- Thomas Preat
- Energy & Memory, Brain Plasticity (UMR 8249), CNRS, ESPCI Paris, PSL Research University, Paris, France
- Pierre-Yves Plaçais
- Energy & Memory, Brain Plasticity (UMR 8249), CNRS, ESPCI Paris, PSL Research University, Paris, France
2. Shichkova P, Coggan JS, Markram H, Keller D. Brain Metabolism in Health and Neurodegeneration: The Interplay Among Neurons and Astrocytes. Cells 2024; 13:1714. PMID: 39451233; PMCID: PMC11506225; DOI: 10.3390/cells13201714.
Abstract
The regulation of energy in the brain has garnered substantial attention in recent years due to its significant implications in various disorders and aging. The brain's energy metabolism is a dynamic and tightly regulated network that balances energy demand and supply by engaging complementary molecular pathways. The crosstalk among these pathways enables the system to switch its preferred fuel source based on substrate availability, activity levels, and cell state-related factors such as redox balance. Brain energy production relies on multi-cellular cooperation and is continuously supplied by fuel from the blood due to limited internal energy stores. Astrocytes, which interface with neurons and blood vessels, play a crucial role in coordinating the brain's metabolic activity, and their dysfunction can have detrimental effects on brain health. This review characterizes the major energy substrates (glucose, lactate, glycogen, ketones and lipids) in astrocyte metabolism and their role in brain health, focusing on recent developments in the field.
Affiliation(s)
- Polina Shichkova
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
- Jay S. Coggan
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
- Henry Markram
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
- Laboratory of Neural Microcircuitry, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne, Switzerland
- Daniel Keller
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
3. Malkin J, O'Donnell C, Houghton CJ, Aitchison L. Signatures of Bayesian inference emerge from energy-efficient synapses. eLife 2024; 12:RP92595. PMID: 39106188; PMCID: PMC11302983; DOI: 10.7554/elife.92595.
Abstract
Biological synaptic transmission is unreliable, and this unreliability likely degrades neural circuit performance. While there are biophysical mechanisms that can increase reliability, for instance by increasing vesicle release probability, these mechanisms cost energy. We examined four such mechanisms along with the associated scaling of the energetic costs. We then embedded these energetic costs for reliability in artificial neural networks (ANNs) with trainable stochastic synapses, and trained these networks on standard image classification tasks. The resulting networks revealed a tradeoff between circuit performance and the energetic cost of synaptic reliability. Additionally, the optimised networks exhibited two testable predictions consistent with pre-existing experimental data. Specifically, synapses with lower variability tended to have (1) higher input firing rates and (2) lower learning rates. Surprisingly, these predictions also arise when synapse statistics are inferred through Bayesian inference. Indeed, we were able to find a formal, theoretical link between the performance-reliability cost tradeoff and Bayesian inference. This connection suggests two incompatible possibilities: evolution may have chanced upon a scheme for implementing Bayesian inference by optimising energy efficiency, or alternatively, energy-efficient synapses may display signatures of Bayesian inference without actually using Bayes to reason about uncertainty.
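The core tradeoff here, that synaptic reliability costs energy, can be sketched numerically. The toy model below is an illustration of the general idea, not the authors' network: resampled Gaussian weight noise stands in for unreliable vesicle release, and the cost function making low-variability synapses expensive is an assumption for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_output(x, w_mean, w_std, n_trials=10_000):
    """Propagate input through synapses whose weights are resampled on
    every trial, mimicking trial-to-trial release variability."""
    w = w_mean + w_std * rng.standard_normal((n_trials,) + w_mean.shape)
    return np.einsum('i,tij->tj', x, w)

def reliability_cost(w_mean, w_std, alpha=1.0):
    """Assumed energy cost: lowering a synapse's variability relative to
    its mean strength is taken to be metabolically expensive."""
    return alpha * np.sum(np.abs(w_mean) / (w_std + 1e-6))

x = np.ones(2)
w_mean = np.ones((2, 1))

out_reliable = stochastic_output(x, w_mean, np.full((2, 1), 0.05))
out_noisy = stochastic_output(x, w_mean, np.full((2, 1), 1.0))

# Noisy synapses degrade output precision but cost less energy.
var_gap = out_noisy.var() - out_reliable.var()
cost_gap = (reliability_cost(w_mean, np.full((2, 1), 0.05))
            - reliability_cost(w_mean, np.full((2, 1), 1.0)))
```

Training a network then amounts to balancing the task loss against the summed reliability cost, which is the tradeoff the study explores.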
Affiliation(s)
- James Malkin
- Faculty of Engineering, University of Bristol, Bristol, United Kingdom
- Cian O'Donnell
- Faculty of Engineering, University of Bristol, Bristol, United Kingdom
- Intelligent Systems Research Centre, School of Computing, Engineering, and Intelligent Systems, Ulster University, Derry/Londonderry, United Kingdom
- Conor J Houghton
- Faculty of Engineering, University of Bristol, Bristol, United Kingdom
4. Karbowski J, Urban P. Information encoded in volumes and areas of dendritic spines is nearly maximal across mammalian brains. Sci Rep 2023; 13:22207. PMID: 38097675; PMCID: PMC10721930; DOI: 10.1038/s41598-023-49321-9.
Abstract
Many experiments suggest that long-term information associated with neuronal memory resides collectively in dendritic spines. However, spines can have a limited size due to metabolic and neuroanatomical constraints, which should effectively limit the amount of information encoded in excitatory synapses. This study investigates how much information can be stored in the population of dendritic spine sizes, and whether that storage is optimal in any sense. It is shown here, using empirical data for several mammalian brains across different regions and physiological conditions, that dendritic spines nearly maximize the entropy contained in their volumes and surface areas for a given mean size in cortical and hippocampal regions. Although both short- and heavy-tailed fitting distributions approach [Formula: see text] of maximal entropy in the majority of cases, the best maximization is obtained primarily for the short-tailed gamma distribution. We find that most empirical ratios of standard deviation to mean for spine volumes and areas are in the range [Formula: see text], which is close to the theoretical optimal ratios coming from entropy maximization for gamma and lognormal distributions. On average, the highest entropy is contained in spine length ([Formula: see text] bits per spine) and the lowest in spine volume and area ([Formula: see text] bits), although the latter two are closer to optimality. In contrast, we find that entropy density (entropy per spine size) is always suboptimal. Our results suggest that spine sizes are almost as random as possible given the constraint on their size, and moreover that the general principle of entropy maximization is applicable and potentially useful for information and memory storage in the population of cortical and hippocampal excitatory synapses, and for predicting their morphological properties.
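The entropy-maximization argument can be checked in a few lines for the gamma family: among gamma distributions with a fixed mean, differential entropy peaks at shape k = 1 (the exponential case), whose coefficient of variation is 1, the upper end of the empirical standard-deviation-to-mean range discussed above. The mean value below is an arbitrary placeholder, not a figure from the paper.

```python
import numpy as np
from scipy.special import digamma, gammaln

mean_size = 0.1  # fixed mean spine volume, arbitrary units (an assumption)

def gamma_entropy(k, mean):
    """Differential entropy (nats) of a gamma distribution with shape k,
    with the scale chosen so that the mean is held fixed."""
    theta = mean / k
    return k + np.log(theta) + gammaln(k) + (1 - k) * digamma(k)

ks = np.linspace(0.2, 5.0, 2000)
k_best = ks[np.argmax(gamma_entropy(ks, mean_size))]
cv_best = 1 / np.sqrt(k_best)  # CV of a gamma distribution is 1/sqrt(k)
```

The grid search recovers the analytic optimum: the entropy derivative 1 - 1/k + (1 - k)ψ'(k) vanishes at k = 1, consistent with the exponential being the maximum-entropy distribution on [0, ∞) for a fixed mean.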
Affiliation(s)
- Jan Karbowski
- Institute of Applied Mathematics and Mechanics, University of Warsaw, Warsaw, Poland
- Paulina Urban
- Laboratory of Functional and Structural Genomics, Centre of New Technologies, University of Warsaw, Warsaw, Poland
- College of Inter-Faculty Individual Studies in Mathematics and Natural Sciences, University of Warsaw, Warsaw, Poland
- Laboratory of Databases and Business Analytics, National Information Processing Institute, National Research Institute, Warsaw, Poland
5. Padamsey Z, Rochefort NL. Paying the brain's energy bill. Curr Opin Neurobiol 2023; 78:102668. PMID: 36571958; DOI: 10.1016/j.conb.2022.102668.
Abstract
How have animals managed to maintain metabolically expensive brains given the volatile and fleeting availability of calories in the natural world? Here we review studies in support of three strategies that involve: 1) a reallocation of energy from peripheral tissues and functions to cover the costs of the brain, 2) an implementation of energy-efficient neural coding, enabling the brain to operate at reduced energy costs, and 3) efficient use of costly neural resources during food scarcity. Collectively, these studies reveal a heterogeneous set of energy-saving mechanisms that make energy-costly brains fit for survival.
Affiliation(s)
- Zahid Padamsey
- Centre for Discovery Brain Sciences, School of Biomedical Sciences, University of Edinburgh, EH8 9XD, Edinburgh, United Kingdom
- Nathalie L Rochefort
- Centre for Discovery Brain Sciences, School of Biomedical Sciences, University of Edinburgh, EH8 9XD, Edinburgh, United Kingdom; Simons Initiative for the Developing Brain, University of Edinburgh, EH8 9XD, Edinburgh, United Kingdom
6. Frankle L. Entropy, Amnesia, and Abnormal Déjà Experiences. Front Psychol 2022; 13:794683. PMID: 35967717; PMCID: PMC9364811; DOI: 10.3389/fpsyg.2022.794683.
Abstract
Previous research has contrasted fleeting erroneous experiences of familiarity with equally convincing, and often more stubborn, erroneous experiences of remembering. While a subset of the former category may present as nonpathological "déjà vu," the latter, termed "déjà vécu," can characterize a delusion-like confabulatory phenomenon first described in elderly dementia patients. Leading explanations for this experience include the dual-process view, in which erroneous familiarity and erroneous recollection are elicited by inappropriate activation of the parahippocampal cortex and the hippocampus, respectively, and the more popular encoding-as-retrieval explanation, in which normal memory encoding processes are falsely flagged and interpreted as memory retrieval. This paper presents a novel understanding of this recollective confabulation that builds on the encoding-as-retrieval hypothesis but more adequately accounts for the co-occurrence of persistent déjà vécu with both perceptual novelty and memory impairment, the latter of which occurs not only in progressive dementia but also in transient epileptic amnesia (TEA) and psychosis. It makes use of the growing interdisciplinary understanding of the fluidity of time and posits that the functioning of memory and the perception of novelty, long known to influence the subjective experience of time, may have a more fundamental effect on the flow of time.
7. Zhang Y, Lai S, Wu W, Wang Y, Zhao H, He J, Zhu Y, Chen G, Qi Z, Chen P, Lv S, Song Z, Hu Y, Miao H, Yan S, Luo Y, Ran H, Huang X, Lu X, Zhong S, Jia Y. Associations between executive function impairment and biochemical abnormalities in depressed adolescents with non-suicidal self-injury. J Affect Disord 2022; 298:492-499. PMID: 34737017; DOI: 10.1016/j.jad.2021.10.132.
Abstract
BACKGROUND Proton magnetic resonance spectroscopy (1H-MRS) has been used to detect biochemical metabolism changes underlying executive dysfunction in major depressive disorder (MDD). However, information on non-suicidal self-injury (NSSI) among adolescents with MDD remains scarce. The present study aimed to examine alterations in executive function and biochemical metabolism, and to elucidate their associations, in depressed adolescents with NSSI. METHODS A total of 86 adolescents with MDD (40 with NSSI, 46 without NSSI) and 28 healthy controls were recruited. Executive function was assessed with the Digit Symbol Test (DST), the Wisconsin Card Sorting Test (WCST), the Trail Making Test, part B (TMT-B), and a Verbal Fluency (VF) task. Bilateral metabolite levels of the prefrontal cortex (PFC), anterior cingulate cortex (ACC), lenticular nucleus (LN) of the basal ganglia, and thalamus were obtained by 1H-MRS at 3.0 T, and the ratios of N-acetyl aspartate (NAA) and choline-containing compounds (Cho) to creatine (Cr) were determined. Finally, association analyses were conducted to investigate their relationships. RESULTS Depressed adolescents with NSSI showed significantly lower VF scores than those without NSSI and healthy controls. The NSSI group also showed significantly higher NAA/Cr and lower Cho/Cr ratios in the right thalamus than the MDD-without-NSSI group and healthy controls, and a lower NAA/Cr ratio in the right LN than the MDD-without-NSSI group. In the NSSI group, NAA/Cr ratios in the left thalamus correlated positively with TMT-B completion time, and Cho/Cr ratios in the left ACC correlated positively with VF scores. CONCLUSIONS Depressed adolescents with NSSI may have executive dysfunction and abnormal NAA and Cho metabolism in the thalamus. The NAA/Cr ratio in the right LN may distinguish depressed adolescents with NSSI from those without. Further, the executive dysfunction may be associated with abnormal NAA metabolism in the left thalamus and ACC.
Affiliation(s)
- Yiliang Zhang
- Department of Psychiatry, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
- Shunkai Lai
- Department of Psychiatry, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
- Weige Wu
- Department of Psychiatry, First Affiliated Hospital of Jinan University, Guangzhou 510630, China; Department of Child and Adolescent Psychology, Xiamen Xianyue Hospital, Fujian 361012, China
- Ying Wang
- Medical Imaging Center, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
- Hui Zhao
- Department of Psychiatry, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
- Jiali He
- Department of Psychiatry, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
- Yunxia Zhu
- Department of Psychiatry, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
- Guangmao Chen
- Medical Imaging Center, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
- Zhangzhang Qi
- Medical Imaging Center, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
- Pan Chen
- Medical Imaging Center, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
- Sihui Lv
- School of Management, Jinan University, Guangzhou 510316, China
- Zijin Song
- School of Management, Jinan University, Guangzhou 510316, China
- Yilei Hu
- School of Management, Jinan University, Guangzhou 510316, China
- Haofei Miao
- School of Management, Jinan University, Guangzhou 510316, China
- Shuya Yan
- School of Management, Jinan University, Guangzhou 510316, China
- Yange Luo
- School of Management, Jinan University, Guangzhou 510316, China
- Hanglin Ran
- School of Management, Jinan University, Guangzhou 510316, China
- Xiaosi Huang
- Department of Psychiatry, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
- Xiaodan Lu
- Department of Psychiatry, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
- Shuming Zhong
- Department of Psychiatry, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
- Yanbin Jia
- Department of Psychiatry, First Affiliated Hospital of Jinan University, Guangzhou 510630, China
8. Emergence and fragmentation of the alpha-band driven by neuronal network dynamics. PLoS Comput Biol 2021; 17:e1009639. PMID: 34871305; PMCID: PMC8675921; DOI: 10.1371/journal.pcbi.1009639.
Abstract
Rhythmic neuronal network activity underlies brain oscillations. To investigate how connected neuronal networks contribute to the emergence of the α-band and to the regulation of Up and Down states, we study a model based on synaptic short-term depression-facilitation with afterhyperpolarization (AHP). We found that the α-band is generated by the network behavior near the attractor of the Up state. Coupling inhibitory and excitatory networks by reciprocal connections leads to the emergence of a stable α-band during the Up states, as reflected in the spectrogram. To better characterize the emergence and stability of thalamocortical oscillations containing α and δ rhythms during anesthesia, we model the interaction of two excitatory networks with one inhibitory network, showing that this minimal topology underlies the generation of a persistent α-band in the neuronal voltage characterized by dominant Up over Down states. Finally, we show that the α-band emerges when external inputs are suppressed, while fragmentation occurs at low synaptic noise or with increasing inhibitory inputs. We conclude that α-oscillations could result from the synaptic dynamics of interacting excitatory neuronal networks with and without AHP, a principle that could apply to other rhythms. Brain oscillations recorded from electroencephalograms characterize behaviors such as sleep, wakefulness, evoked brain responses, coma, and anesthesia. The rhythms underlying these oscillations are associated, at the neuronal population level, with fluctuations of the membrane potential between Up (depolarized) and Down (hyperpolarized) states. During anesthesia with propofol, a dominant α-band (8-12 Hz) can emerge or disappear, but the underlying mechanism remains unclear. Using modeling, we report that the α-band appears during Up states in neuronal populations driven by short-term synaptic plasticity and synaptic noise. Moreover, we show that three connected neuronal networks representing the thalamocortical loop reproduce the dynamics of the α-band, which emerges once excitatory stimulation stops but can disappear as inhibitory input increases. In short, short-term plasticity in well-connected neuronal networks can explain the emergence and fragmentation of the α-band.
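As a minimal illustration of the kind of spectral analysis used to identify an α-band (not the authors' network model), one can synthesize a noisy 10 Hz "population voltage" and confirm that its power spectrum peaks inside the 8-12 Hz band; all signal parameters are assumptions for the sketch.

```python
import numpy as np
from scipy.signal import welch

fs = 1000.0                    # sampling rate (Hz), an assumption
t = np.arange(0, 10, 1 / fs)   # 10 s of synthetic "population voltage"
rng = np.random.default_rng(1)

# A 10 Hz (alpha-band) oscillation riding on synaptic-like noise stands
# in for the Up-state dynamics of the network model.
v = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

f, pxx = welch(v, fs=fs, nperseg=2048)
f_peak = f[np.argmax(pxx)]     # dominant frequency of the trace
```

In the paper's setting the same analysis would be run on sliding windows (a spectrogram) to track when the α-band appears and fragments.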
9. Schug S, Benzing F, Steger A. Presynaptic stochasticity improves energy efficiency and helps alleviate the stability-plasticity dilemma. eLife 2021; 10:e69884. PMID: 34661525; PMCID: PMC8716105; DOI: 10.7554/elife.69884.
Abstract
When an action potential arrives at a synapse, there is a large probability that no neurotransmitter is released. Surprisingly, simple computational models suggest that these synaptic failures enable information processing at lower metabolic costs. However, these models only consider information transmission at single synapses, ignoring the rest of the neural network and its overall computational goal. Here, we investigate how synaptic failures affect the energy efficiency of models of entire neural networks that solve a goal-driven task. We find that presynaptic stochasticity and plasticity improve energy efficiency, and show that the network allocates most energy to a sparse subset of important synapses. We demonstrate that stabilising these synapses helps alleviate the stability-plasticity dilemma, thus connecting a presynaptic notion of importance to a computational role in lifelong learning. Overall, our findings present a set of hypotheses for how presynaptic plasticity and stochasticity contribute to sparsity, energy efficiency, and improved trade-offs in the stability-plasticity dilemma.
Affiliation(s)
- Simon Schug
- Institute of Neuroinformatics, University of Zurich & ETH Zurich, Zurich, Switzerland
10. Optimising the energetic cost of the glutamatergic synapse. Neuropharmacology 2021; 197:108727. PMID: 34314736; DOI: 10.1016/j.neuropharm.2021.108727.
Abstract
As for electronic computation, neural information processing is energetically expensive. This is because information is coded in the brain as membrane voltage changes, which are generated largely by passive ion movements down electrochemical gradients, and these ion movements later need to be reversed by active ATP-dependent ion pumping. This article will review how much of the energetic cost of the brain reflects the activity of glutamatergic synapses, consider the relative amount of energy used pre- and postsynaptically, outline how evolution has energetically optimised synapse function by adjusting the presynaptic release probability and the postsynaptic number of glutamate receptors, and speculate on how energy use by synapses may be sensed and adjusted.
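A toy calculation in the spirit of this argument (the numbers and the cost model are assumptions for illustration, not values from the article): treat the postsynaptic side as a Gaussian channel whose signal-to-noise ratio grows with receptor number while maintenance cost also grows, and find the receptor count that maximizes bits per unit energy. The optimum lands at a finite, intermediate value rather than "as many receptors as possible".

```python
import numpy as np

# Postsynaptic side as a Gaussian channel: SNR grows with the number of
# glutamate receptors n, but so does the cost of maintaining them.
# All constants below are assumptions for illustration.
n = np.arange(1, 2001)
snr_per_receptor = 0.05
e_fixed, e_per_receptor = 50.0, 1.0   # energy in arbitrary ATP-like units

bits = 0.5 * np.log2(1 + snr_per_receptor * n)   # Shannon capacity per use
energy = e_fixed + e_per_receptor * n
n_star = n[np.argmax(bits / energy)]             # receptor count maximizing bits/energy
```

Because capacity grows only logarithmically while cost grows linearly, bits-per-energy rises, peaks, and falls, which is the qualitative logic behind energetically optimised receptor numbers.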
11. Artificial neurovascular network (ANVN) to study the accuracy vs. efficiency trade-off in an energy dependent neural network. Sci Rep 2021; 11:13808. PMID: 34226588; PMCID: PMC8257640; DOI: 10.1038/s41598-021-92661-7.
Abstract
Artificial feedforward neural networks perform a wide variety of classification and function-approximation tasks with high accuracy. Unlike their artificial counterparts, biological neural networks require adequate energy delivered to single neurons by a network of cerebral microvessels. Since energy is a limited resource, a natural question is whether the cerebrovascular network can ensure maximum performance of the neural network while consuming minimum energy, and whether the cerebrovascular network should also be trained, along with the neural network, to achieve such an optimum. To answer these questions in a simplified modeling setting, we constructed an Artificial Neurovascular Network (ANVN) comprising a multilayered perceptron (MLP) connected to a vascular tree structure. The root node of the vascular tree is connected to an energy source, and its terminal nodes supply energy to the hidden neurons of the MLP; the energy delivered by the terminal vascular nodes determines the biases of the hidden neurons. The "weights" on the branches of the vascular tree describe the energy distribution from parent to child nodes and are updated by a kind of "backpropagation" of the energy demand error generated by the hidden neurons. We observed that higher performance was achieved at lower energy levels when the vascular network was trained along with the neural network, indicating that the vascular network needs to be trained to ensure efficient neural performance. Below a certain network size, the energetic dynamics of the network in the per-capita energy consumption vs. classification accuracy space approaches a fixed-point attractor for various initial conditions; once the number of hidden neurons increases beyond a threshold, the fixed point appears to vanish, giving way to a line of attractors. The model also showed that when resources are limited, the energy consumption of neurons is strongly correlated with their individual contribution to the network's performance.
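The idea of "backpropagating" an energy demand error through a vascular tree can be sketched with a minimal stand-in (a 3-node binary tree feeding 4 neurons; the demand vector, learning rate, and update rule are all assumptions, not the paper's equations): branch fractions are nudged until delivered energy matches demand.

```python
import numpy as np

# A 3-node binary "vascular" tree splits a unit energy budget among 4
# hidden neurons. Branch fractions are nudged by each neuron's energy
# demand error (demand minus delivery).
demand = np.array([0.4, 0.1, 0.3, 0.2])   # assumed per-neuron energy demand
split = np.full(3, 0.5)                    # fraction routed to each left child

def deliver(split):
    a, b, c = split
    return np.array([a * b, a * (1 - b), (1 - a) * c, (1 - a) * (1 - c)])

for _ in range(300):
    err = demand - deliver(split)
    # crude "backpropagation" of the demand error up the tree:
    # each branch node sees the summed error of its subtree
    split += 0.25 * np.array([err[0] + err[1] - err[2] - err[3],
                              err[0] - err[1],
                              err[2] - err[3]])
    split = np.clip(split, 0.01, 0.99)

max_err = np.max(np.abs(deliver(split) - demand))
```

After training, the tree routes 40/10/30/20% of the budget to the four neurons, so delivery tracks demand, which is the precondition for the demand-performance correlation reported above.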
12. Mijatovic G, Antonacci Y, Loncar-Turukalo T, Minati L, Faes L. An Information-Theoretic Framework to Measure the Dynamic Interaction Between Neural Spike Trains. IEEE Trans Biomed Eng 2021; 68:3471-3481. PMID: 33872139; DOI: 10.1109/tbme.2021.3073833.
Abstract
OBJECTIVE While understanding the interaction patterns among simultaneous recordings of spike trains from multiple neuronal units is a key topic in neuroscience, existing methods either do not consider the inherent point-process nature of spike trains or rely on parametric assumptions. This work presents an information-theoretic framework for the model-free, continuous-time estimation of both undirected (symmetric) and directed (Granger-causal) interactions between spike trains. METHODS The framework computes the mutual information rate (MIR) and the transfer entropy rate (TER) for two point processes X and Y, showing that the MIR between X and Y can be decomposed as the sum of the TERs along the directions X → Y and Y → X. We present theoretical expressions and introduce strategies to estimate the two measures efficiently through nearest-neighbor statistics. RESULTS Using simulations of independent and coupled point processes, we show that MIR and TER accurately assess interactions even for weakly coupled and short realizations, and demonstrate the superiority of continuous-time estimation over the standard discrete-time approach. We also apply MIR and TER to real-world data, specifically recordings from in-vitro preparations of spontaneously growing cultures of cortical neurons, and demonstrate their ability to describe how functional networks between recording units emerge over the course of the cultures' maturation. CONCLUSION AND SIGNIFICANCE The proposed framework provides principled measures for assessing undirected and directed spike-train interactions with more efficiency and flexibility than previous discrete-time or parametric approaches, opening new perspectives for the analysis of point-process data in neuroscience and many other fields.
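Although the paper's estimators operate in continuous time on point processes, the directionality that transfer entropy captures can be sanity-checked with a simple discrete-time plug-in estimate on binary sequences (a generic illustration, not the paper's nearest-neighbor method), where Y copies X's previous state through a noisy channel:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)
n = 200_000

# Y copies X's previous state through a 10%-flip noisy channel, so
# information should flow X -> Y but not Y -> X.
x = rng.integers(0, 2, n)
y = np.zeros(n, dtype=int)
y[1:] = x[:-1] ^ (rng.random(n - 1) < 0.1)

def joint_entropy(*cols):
    """Plug-in joint entropy (bits) of the tuple-valued sequence."""
    counts = np.array(list(Counter(zip(*cols)).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def transfer_entropy(src, dst):
    """Discrete-time TE src -> dst with history length 1:
    I(dst_t ; src_{t-1} | dst_{t-1})."""
    d_t, d_p, s_p = dst[1:], dst[:-1], src[:-1]
    return (joint_entropy(d_t, d_p) + joint_entropy(s_p, d_p)
            - joint_entropy(d_t, d_p, s_p) - joint_entropy(d_p))

te_xy = transfer_entropy(x, y)   # close to 1 - H2(0.1), about 0.53 bits
te_yx = transfer_entropy(y, x)   # close to 0
```

The estimator recovers the asymmetry; the paper's contribution is doing this in continuous time for point processes, where binning is no longer needed and MIR = TER(X → Y) + TER(Y → X) holds exactly.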
13. Shorten DP, Spinney RE, Lizier JT. Estimating Transfer Entropy in Continuous Time Between Neural Spike Trains or Other Event-Based Data. PLoS Comput Biol 2021; 17:e1008054. PMID: 33872296; PMCID: PMC8084348; DOI: 10.1371/journal.pcbi.1008054.
Abstract
Transfer entropy (TE) is a widely used measure of directed information flows in a number of domains, including neuroscience. Many real-world time series for which we are interested in information flows come in the form of (near) instantaneous events occurring over time; examples include the spiking of biological neurons, trades on stock markets, and posts to social media, amongst myriad other systems involving events in continuous time throughout the natural and social sciences. However, the current approach to TE estimation on such event-based data, discretising the time series into time bins, has severe limitations: it is not consistent, has high bias, converges slowly, and cannot simultaneously capture relationships that occur with very fine time precision as well as those that occur over long time intervals. Building on recent work that derived a theoretical framework for TE in continuous time, we present an estimation framework for TE on event-based data and develop a k-nearest-neighbours estimator within this framework. This estimator is provably consistent, has favourable bias properties, and converges orders of magnitude more quickly than the current state of the art in discrete-time estimation on synthetic examples. We demonstrate failures of the traditionally used source-time-shift method for null surrogate generation. To overcome these failures, we develop a local permutation scheme for generating surrogate time series conforming to the appropriate null hypothesis, in order to test for the statistical significance of the TE and, as such, for the conditional independence between the history of one point process and the updates of another. Our approach is shown to be capable of correctly rejecting or accepting the null hypothesis of conditional independence even in the presence of strong pairwise time-directed correlations. This capacity to accurately test for conditional independence is further demonstrated on models of a spiking neural circuit inspired by the pyloric circuit of the crustacean stomatogastric ganglion, succeeding where previous related estimators have failed.
Affiliation(s)
- David P. Shorten
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering, The University of Sydney, Sydney, Australia
- Richard E. Spinney
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering, The University of Sydney, Sydney, Australia
- School of Physics and EMBL Australia Node Single Molecule Science, School of Medical Sciences, The University of New South Wales, Sydney, Australia
- Joseph T. Lizier
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering, The University of Sydney, Sydney, Australia
Collapse
|
14
|
Information Processing in the Brain as Optimal Entropy Transport: A Theoretical Approach. Entropy 2020; 22:e22111231. [PMID: 33287001 PMCID: PMC7712441 DOI: 10.3390/e22111231] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 09/09/2020] [Revised: 10/14/2020] [Accepted: 10/15/2020] [Indexed: 01/20/2023]
Abstract
We consider brain activity from an information-theoretic perspective, analysing information processing in the brain as the optimal transport of Shannon entropy within the Monge–Kantorovich framework. It is proposed that some of these processes satisfy an optimal-transport condition on informational entropy. This optimality condition allows us to derive an equation of Monge–Ampère type for the information flow, whose linearization accounts for the branching structure of neurons. On this basis, we discuss a version of Murray’s law in this context.
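The Monge–Kantorovich problem invoked above has a particularly transparent one-dimensional special case, where the optimal transport plan simply matches sorted samples. The toy sketch below illustrates that special case; it is not taken from the paper, which works with the continuum Monge–Ampère formulation:

```python
import numpy as np

def wasserstein_1d(x, y, p=1):
    """p-Wasserstein (Monge-Kantorovich) distance between two equal-size
    1-D empirical distributions. In one dimension the optimal plan is the
    monotone matching of sorted samples, giving a closed form."""
    xs = np.sort(np.asarray(x, dtype=float))
    ys = np.sort(np.asarray(y, dtype=float))
    return float(np.mean(np.abs(xs - ys) ** p) ** (1 / p))
```

Shifting a distribution by a constant c gives distance c, since every sorted sample moves by exactly c under the monotone map.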
Collapse
|
15
|
Local Design Principles at Hippocampal Synapses Revealed by an Energy-Information Trade-Off. eNeuro 2020; 7:ENEURO.0521-19.2020. [PMID: 32847867 PMCID: PMC7540928 DOI: 10.1523/eneuro.0521-19.2020] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/11/2019] [Revised: 03/16/2020] [Accepted: 03/17/2020] [Indexed: 12/01/2022] Open
Abstract
Synapses across different brain regions display distinct structure-function relationships. We investigated the interplay of fundamental design constraints that shape the transmission properties of the excitatory CA3-CA1 pyramidal cell connection, a prototypic synapse for studying the mechanisms of learning in the mammalian hippocampus. This small synapse is characterized by probabilistic release of transmitter, which is markedly facilitated in response to naturally occurring trains of action potentials. Based on a physiologically motivated computational model of the rat CA3 presynaptic terminal, we show how unreliability and short-term dynamics of vesicular release work together to regulate the trade-off between information transfer and energy use. We propose that individual CA3-CA1 synapses are designed to operate near the maximum possible capacity of information transmission in an efficient manner. Experimental measurements reveal a wide range of vesicular release probabilities at hippocampal synapses, which may be a necessary consequence of long-term plasticity and homeostatic mechanisms that manifest as presynaptic modifications of the release probability. We show that the timescales and magnitude of short-term plasticity (STP) render synaptic information transfer nearly independent of differences in release probability. Thus, individual synapses transmit optimally while maintaining a heterogeneous distribution of presynaptic strengths indicative of synaptically encoded memory representations. Our results support the view that organizing principles that are evident on higher scales of neural organization percolate down to the design of an individual synapse.
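The information side of this trade-off can be caricatured with a far simpler model than the paper's detailed CA3 terminal simulation: treat a synapse as a Z-channel in which a presynaptic spike (probability s) triggers vesicle release with probability p, and count the information carried per expected vesicle. The channel model and names here are illustrative assumptions, not the authors' model:

```python
import numpy as np

def h2(q):
    """Binary entropy in bits."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

def bits_per_vesicle(s, p):
    """Information per expected vesicle for a Z-channel synapse:
    a spike occurs with probability s and triggers release with
    probability p; no spontaneous release."""
    r = s * p                   # marginal probability of a release
    info = h2(r) - s * h2(p)    # I(spike; release) for the Z-channel
    return info / r
```

At s = 0.5, a perfectly reliable synapse (p = 1) transmits h2(0.5)/0.5 = 2 bits per vesicle; scanning p numerically shows how unreliability reshapes this quantity, while the paper's central point is that short-term plasticity makes the transfer nearly independent of the baseline release probability.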
Collapse
|
16
|
Barta T, Kostal L. The effect of inhibition on rate code efficiency indicators. PLoS Comput Biol 2019; 15:e1007545. [PMID: 31790384 PMCID: PMC6907877 DOI: 10.1371/journal.pcbi.1007545] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2019] [Revised: 12/12/2019] [Accepted: 11/12/2019] [Indexed: 11/30/2022] Open
Abstract
In this paper we investigate the rate-coding capabilities of neurons whose input signals are alterations of a base state of balanced inhibitory and excitatory synaptic currents. We consider different regimes of the excitation-inhibition relationship and an established conductance-based leaky integrator model with adaptive threshold, with parameter sets recreating biologically relevant spiking regimes. We find that, for a given mean post-synaptic firing rate, an increased ratio of inhibition to excitation counter-intuitively leads to a higher signal-to-noise ratio (SNR). On the other hand, the inhibitory input significantly reduces the dynamic coding range of the neuron. We quantify the joint effect of SNR and dynamic coding range by computing the metabolic efficiency: the maximal amount of information per ATP molecule expended (in bits/ATP). Moreover, by calculating the metabolic efficiency we are able to predict the shapes of post-synaptic firing-rate histograms that may be tested on experimental data. Likewise, optimal stimulus input distributions are predicted; however, we show that the optimum can essentially be reached with a broad range of input distributions. Finally, we examine which parameters of the neuronal model used are the most important for metabolically efficient information transfer.
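The bits/ATP quantity central to this abstract can be illustrated on a minimal discrete channel: choose the input distribution that maximizes mutual information divided by expected metabolic cost, rather than mutual information alone. The channel matrix and costs below are invented for illustration; the paper derives these quantities from a conductance-based neuron model:

```python
import numpy as np

def mutual_info_bits(px, P):
    """Mutual information (bits) of input distribution px over channel P[x, y]."""
    pxy = px[:, None] * P
    py = pxy.sum(axis=0)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px[:, None] * py)[mask])))

# Two input symbols: stay silent (cheap) or spike (costly); readout is noisy.
P = np.array([[0.95, 0.05],    # silent -> observed silent / observed spike
              [0.10, 0.90]])   # spike  -> observed silent / observed spike
cost = np.array([0.1, 1.0])    # ATP-like cost per input symbol (arbitrary units)

qs = np.linspace(0.01, 0.99, 99)   # candidate spiking probabilities
effs = np.array([mutual_info_bits(np.array([1 - q, q]), P)
                 / (np.array([1 - q, q]) @ cost) for q in qs])
q_star = qs[int(np.argmax(effs))]  # metabolically efficient firing probability
```

In this toy the efficiency-maximizing q_star lies well below the capacity-achieving input, echoing the abstract's theme that metabolic efficiency, not raw information rate, shapes the predicted firing-rate distributions.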
Collapse
Affiliation(s)
- Tomas Barta
- Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic
- Charles University, First Medical Faculty, Prague, Czech Republic
- Institute of Ecology and Environmental Sciences, INRA, Versailles, France
| | - Lubomir Kostal
- Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic
| |
Collapse
|