1. Sihn D, Kwon OS, Kim SP. Robust and efficient representations of dynamic stimuli in hierarchical neural networks via temporal smoothing. Front Comput Neurosci 2023; 17:1164595. PMID: 37398935; PMCID: PMC10307978; DOI: 10.3389/fncom.2023.1164595.
Abstract
Introduction: Efficient coding that minimizes the informational redundancy of neural representations is a widely accepted neural coding principle. Despite this benefit, maximizing efficiency in neural coding can make neural representations vulnerable to random noise. One way to achieve robustness against random noise is to smooth neural responses. However, it is not clear whether the smoothness of neural responses can maintain robust neural representations when dynamic stimuli are processed through a hierarchical brain structure, in which not only random noise but also systematic error due to temporal lag can be induced.
Methods: In the present study, we showed that smoothness via spatio-temporally efficient coding can achieve both efficiency and robustness by effectively dealing with noise and neural delay in the visual hierarchy when processing dynamic visual stimuli.
Results: The simulation results demonstrated that a hierarchical neural network whose bidirectional synaptic connections were learned through spatio-temporally efficient coding with natural scenes could elicit neural responses to moving visual bars similar to those to static bars with identical position and orientation, indicating robust neural responses against erroneous neural information. This implies that spatio-temporally efficient coding locally preserves the structure of visual environments in the neural responses of hierarchical structures.
Discussion: The present results suggest the importance of a balance between efficiency and robustness in neural coding for the visual processing of dynamic stimuli across hierarchical brain structures.
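The smoothing mechanism at the heart of this abstract can be illustrated with a minimal sketch (not the authors' hierarchical model): an exponential moving average applied to a noisy unit response. The smoothing factor `alpha`, the stimulus, and the noise level are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy response of a single model unit to a slowly varying stimulus.
t = np.arange(500)
stimulus = np.sin(2 * np.pi * t / 200.0)             # dynamic input
response = stimulus + rng.normal(0.0, 0.5, t.size)   # noisy neural response

# Temporal smoothing as leaky integration (exponential moving average).
alpha = 0.1  # smoothing factor (assumed); smaller = smoother but more lag
smoothed = np.empty_like(response)
smoothed[0] = response[0]
for i in range(1, t.size):
    smoothed[i] = alpha * response[i] + (1 - alpha) * smoothed[i - 1]

# Smoothing cuts the random-noise error relative to the true stimulus,
# at the price of a systematic temporal lag -- the trade-off studied here.
err_raw = float(np.mean((response - stimulus) ** 2))
err_smooth = float(np.mean((smoothed - stimulus) ** 2))
```

The smoothed trace tracks the stimulus with lower mean squared error than the raw response, but lags it; the paper's question is how a hierarchy can keep the first benefit while compensating for the second.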
2. Hiratani N, Latham PE. Developmental and evolutionary constraints on olfactory circuit selection. Proc Natl Acad Sci U S A 2022; 119:e2100600119. PMID: 35263217; PMCID: PMC8931209; DOI: 10.1073/pnas.2100600119.
Abstract
Significance: In this work, we explore the hypothesis that biological neural networks optimize their architecture, through evolution, for learning. We study early olfactory circuits of mammals and insects, which have relatively similar structure but a huge diversity in size. We approximate these circuits as three-layer networks and estimate, analytically, the scaling of the optimal hidden-layer size with input-layer size. We find that both longevity and information in the genome constrain the hidden-layer size, so a range of allometric scalings is possible. However, the experimentally observed allometric scalings in mammals and insects are consistent with biologically plausible values. This analysis should pave the way for a deeper understanding of both biological and artificial networks.
Affiliation(s)
- Naoki Hiratani
- Gatsby Computational Neuroscience Unit, University College London, London W1T 4JG, United Kingdom
- Peter E. Latham
- Gatsby Computational Neuroscience Unit, University College London, London W1T 4JG, United Kingdom
3. Lee DG, Daunizeau J. Trading mental effort for confidence in the metacognitive control of value-based decision-making. eLife 2021; 10:e63282. PMID: 33900198; PMCID: PMC8128438; DOI: 10.7554/eLife.63282.
Abstract
Why do we sometimes opt for actions or items that we do not value the most? Under current neurocomputational theories, such preference reversals are typically interpreted in terms of errors that arise from the unreliable signaling of value to brain decision systems. But an alternative explanation is that people may change their mind because they are reassessing the value of alternative options while pondering the decision. So, why do we carefully ponder some decisions, but not others? In this work, we derive a computational model of the metacognitive control of decisions (MCD). In brief, we assume that fast and automatic processes first provide initial (and largely uncertain) representations of options' values, yielding prior estimates of decision difficulty. These uncertain value representations are then refined by deploying cognitive (e.g., attentional, mnesic) resources, the allocation of which is controlled by an effort-confidence tradeoff. Importantly, the anticipated benefit of allocating resources varies from decision to decision according to the prior estimate of decision difficulty. The ensuing MCD model predicts response time, subjective feeling of effort, choice confidence, changes of mind, as well as choice-induced preference change and certainty gain. We test these predictions systematically, using a dedicated behavioral paradigm. Our results provide a quantitative link between mental effort, choice confidence, and preference reversals, which could inform interpretations of related neuroimaging findings.
Affiliation(s)
- Douglas G Lee
- Sorbonne University, Paris, France
- Paris Brain Institute (ICM), Paris, France
- Institute of Cognitive Sciences and Technologies, National Research Council of Italy, Rome, Italy
- Jean Daunizeau
- Paris Brain Institute (ICM), Paris, France
- Translational Neuromodeling Unit (TNU), ETH Zurich, Switzerland
4. Metabolic tuning of inhibition regulates hippocampal neurogenesis in the adult brain. Proc Natl Acad Sci U S A 2020; 117:25818-25829. PMID: 32973092; DOI: 10.1073/pnas.2006138117.
Abstract
Hippocampus-engaged behaviors stimulate neurogenesis in the adult dentate gyrus by largely unknown means. To explore the underlying mechanisms, we used tetrode recording to analyze neuronal activity in the dentate gyrus of freely moving adult mice during hippocampus-engaged contextual exploration. We found that exploration induced an overall sustained increase in inhibitory neuron activity that was concomitant with decreased excitatory neuron activity. A mathematical model based on energy homeostasis in the dentate gyrus showed that enhanced inhibition and decreased excitation resulted in a similar increase in neurogenesis to that observed experimentally. To mechanistically investigate this sustained inhibitory regulation, we performed metabolomic and lipidomic profiling of the hippocampus during exploration. We found sustainably increased signaling of sphingosine-1-phosphate, a bioactive metabolite, during exploration. Furthermore, we found that sphingosine-1-phosphate signaling through its receptor 2 increased interneuron activity and thus mediated exploration-induced neurogenesis. Taken together, our findings point to a behavior-metabolism circuit pathway through which experience regulates adult hippocampal neurogenesis.
5. Yi G, Fan Y, Wang J. Metabolic Cost of Dendritic Ca2+ Action Potentials in Layer 5 Pyramidal Neurons. Front Neurosci 2019; 13:1221. PMID: 31780891; PMCID: PMC6861219; DOI: 10.3389/fnins.2019.01221.
Abstract
Pyramidal neurons consume most signaling-related energy to generate action potentials (APs) and perform synaptic integration. The dendritic Ca2+ spike is an important integration mechanism for coupling inputs from different cortical layers. Our objective was to quantify the metabolic energy associated with the generation of Ca2+ APs in the dendrites. We used morphology-based computational models to simulate the dendritic Ca2+ spikes in layer 5 pyramidal neurons. We calculated the energy cost by converting Ca2+ influx into the number of ATP molecules required to restore and maintain the homeostasis of intracellular Ca2+ concentrations. We quantified the effects of synaptic inputs, dendritic voltage, back-propagating Na+ spikes, and Ca2+ inactivation on Ca2+ spike cost. We showed that far more ATP molecules were required for reversing Ca2+ influx in the dendrites than for Na+ ion pumping in the soma during a Ca2+ AP. Increasing synaptic input increased the rate of dendritic depolarization and the underlying Ca2+ influx, resulting in higher ATP consumption. Depolarizing dendritic voltage resulted in the inactivation of Ca2+ channels and reduced the ATP cost, while dendritic hyperpolarization increased the spike cost by de-inactivating Ca2+ channels. A back-propagating Na+ AP initiated in the soma increased Ca2+ spike cost in the apical dendrite when it coincided with a synaptic input within a time window of several milliseconds. Increasing the Ca2+ inactivation rate reduced Ca2+ spike cost, while slowing Ca2+ inactivation increased the spike cost. The results revealed that the energy demand of a Ca2+ AP was dynamically dependent on the state of dendritic activity. These findings are important for predicting the energy budget for signaling in pyramidal cells, interpreting functional imaging data, and designing energy-efficient neuromorphic devices.
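The conversion step described here — turning a Ca2+ charge influx into an ATP count — can be sketched back-of-the-envelope style. The stoichiometry (one ATP per Ca2+ extruded by the plasma-membrane Ca2+ pump) and the example current and duration are illustrative assumptions, not the paper's exact numbers.

```python
# Convert the Ca2+ charge entering during a dendritic Ca2+ spike into an
# ATP estimate. Assumptions (illustrative): the pump hydrolyzes 1 ATP per
# Ca2+ ion extruded; each Ca2+ ion carries 2 elementary charges.
E_CHARGE = 1.602e-19   # elementary charge, coulombs
ATP_PER_CA = 1         # pump stoichiometry (assumed)

def atp_cost_of_ca_spike(ca_current_nA, duration_ms):
    """ATP needed to restore the Ca2+ that entered during the spike."""
    charge_C = ca_current_nA * 1e-9 * duration_ms * 1e-3   # Q = I * t
    n_ca_ions = charge_C / (2 * E_CHARGE)                  # 2e per Ca2+ ion
    return n_ca_ions * ATP_PER_CA

# Example: a 2 nA Ca2+ current lasting 30 ms (placeholder magnitudes)
# costs roughly 1.9e8 ATP molecules to reverse.
n_atp = atp_cost_of_ca_spike(2.0, 30.0)
```

The cost scales linearly with both current and duration, which is why the abstract's manipulations of depolarization rate and inactivation translate directly into ATP differences.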
Affiliation(s)
- Guosheng Yi
- School of Electrical and Information Engineering, Tianjin University, Tianjin, China
- Yaqin Fan
- School of Electrical and Information Engineering, Tianjin University, Tianjin, China
- Jiang Wang
- School of Electrical and Information Engineering, Tianjin University, Tianjin, China
6. White O, Babič J, Trenado C, Johannsen L, Goswami N. The Promise of Stochastic Resonance in Falls Prevention. Front Physiol 2019; 9:1865. PMID: 30745883; PMCID: PMC6360177; DOI: 10.3389/fphys.2018.01865.
Abstract
Multisensory integration is essential for the maintenance of motor and cognitive abilities, thereby ensuring normal function and personal autonomy. Balance control is challenged during senescence or in motor disorders, leading to potential falls. Increased uncertainty in sensory signals is caused by a number of factors including noise, defined as a random and persistent disturbance that reduces the clarity of information. Counter-intuitively, noise can be beneficial in some conditions. Stochastic resonance is a mechanism whereby a particular level of noise actually enhances the response of non-linear systems to weak sensory signals. Here we review the effects of stochastic resonance on sensory modalities and systems directly involved in balance control. We highlight its potential for improving sensorimotor performance as well as cognitive and autonomic functions. These promising results demonstrate that stochastic resonance represents a flexible and non-invasive technique that can be applied to different modalities simultaneously. Finally, we point out its benefits in a variety of scenarios, including ambulant elderly people, skilled movements, sports, and patients with sensorimotor or autonomic dysfunctions.
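The stochastic-resonance effect reviewed here can be reproduced in a few lines with a textbook toy model (not any system from the review): a subthreshold sinusoid fed through a hard threshold detector. The signal amplitude, threshold, and noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def signal_output_corr(noise_sd, n=50000):
    """Correlation between a subthreshold signal and the output of a hard
    threshold detector -- a crude stand-in for a sensory neuron."""
    t = np.arange(n)
    signal = 0.5 * np.sin(2 * np.pi * t / 100.0)   # always below threshold
    noisy = signal + rng.normal(0.0, noise_sd, n)
    out = (noisy > 1.0).astype(float)              # threshold at 1.0
    if out.std() == 0.0:                           # no crossings at all
        return 0.0
    return float(np.corrcoef(signal, out)[0, 1])

# Too little noise: no threshold crossings. Too much: crossings unrelated
# to the signal. An intermediate level transmits the signal best -- the
# stochastic-resonance signature.
corrs = [signal_output_corr(sd) for sd in (0.05, 0.3, 5.0)]
```

Sweeping `noise_sd` more finely traces the familiar inverted-U performance curve that motivates applying calibrated noise in falls-prevention settings.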
Affiliation(s)
- Olivier White
- INSERM UMR1093-CAPS, Université Bourgogne Franche-Comté, UFR des Sciences du Sport, Dijon, France
- Acquired Brain Injury Rehabilitation, Faculty of Medicine and Health Sciences, School of Health Sciences, University of East Anglia, Norwich Research Park, Norwich, United Kingdom
- Jan Babič
- Laboratory for Neuromechanics and Biorobotics, Jožef Stefan Institute, Ljubljana, Slovenia
- Carlos Trenado
- Leibniz Research Centre for Working Environment and Human Factors TU Dortmund (IfADO), Institute of Clinical Neuroscience and Medical Psychology, University Hospital Düsseldorf, Düsseldorf, Germany
- Leif Johannsen
- Acquired Brain Injury Rehabilitation, Faculty of Medicine and Health Sciences, School of Health Sciences, University of East Anglia, Norwich Research Park, Norwich, United Kingdom
- Nandu Goswami
- Otto Loewi Research Center for Vascular Biology, Immunology and Inflammation, Medical University of Graz, Graz, Austria
7. Yuan Y, Huo H, Zhao P, Liu J, Liu J, Xing F, Fang T. Constraints of Metabolic Energy on the Number of Synaptic Connections of Neurons and the Density of Neuronal Networks. Front Comput Neurosci 2018; 12:91. PMID: 30524259; PMCID: PMC6256250; DOI: 10.3389/fncom.2018.00091.
Abstract
Neuronal networks in the brain are the structural basis of human cognitive function, and the plasticity of neuronal networks is thought to be the principal neural mechanism underlying learning and memory. Dominated by the Hebbian theory, researchers have devoted extensive effort to studying the changes in synaptic connections between neurons. However, understanding the network topology of all synaptic connections has been neglected over the past decades. Furthermore, a growing number of studies indicate that synaptic activities are tightly coupled with metabolic energy, and that metabolic energy is a unifying principle governing neuronal activities. Therefore, the network topology of all synaptic connections may also be governed by metabolic energy. Here, by implementing a computational model, we investigate the general synaptic organization rules for neurons and neuronal networks from the perspective of energy metabolism. We find that to maintain the energy balance of individual neurons in the proposed model, the number of synaptic connections is inversely proportional to the average of the synaptic weights. This strategy may be adopted by neurons to ensure that their ability to transmit signals matches their own energy metabolism. In addition, we find that the density of neuronal networks is also an important factor in the energy balance of neuronal networks. An abnormal increase or decrease in the network density could lead to failure of energy metabolism in the neuronal network. These rules may change our view of neuronal networks in the brain and can guide the design of neuronal network models.
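The reported inverse proportionality can be illustrated with a toy budget argument (not the paper's model): if a neuron has a fixed energy budget and synaptic cost scales with the product of connection count and mean weight, the sustainable count falls as 1/weight. The budget and cost constants below are arbitrary assumptions.

```python
# Toy illustration of the reported rule: with a fixed per-neuron energy
# budget E and a synaptic cost proportional to (connection count N) x
# (mean synaptic weight w), energy balance gives
#   E = c * N * w   =>   N = E / (c * w)  ,  i.e.  N is proportional to 1/w.
E_BUDGET = 1000.0     # energy units per neuron (assumed)
COST_PER_UNIT = 0.5   # energy per unit of count x weight (assumed)

def max_connections(w_mean):
    """Largest connection count a neuron can sustain at mean weight w_mean."""
    return E_BUDGET / (COST_PER_UNIT * w_mean)

# Doubling the average synaptic weight halves the sustainable synapse count.
n_at_w1 = max_connections(1.0)
n_at_w2 = max_connections(2.0)
```

This is only the balance condition in isolation; the paper's contribution is showing that such a constraint emerges from, and shapes, network-level dynamics.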
Affiliation(s)
- Ye Yuan
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China
- Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China
- Hong Huo
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China
- Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China
- Peng Zhao
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China
- Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China
- Jian Liu
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China
- Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China
- Jiaxing Liu
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China
- Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China
- Fu Xing
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China
- Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China
- Tao Fang
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China
- Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China
8. Yu L, Shen Z, Wang C, Yu Y. Efficient Coding and Energy Efficiency Are Promoted by Balanced Excitatory and Inhibitory Synaptic Currents in Neuronal Network. Front Cell Neurosci 2018; 12:123. PMID: 29773979; PMCID: PMC5943499; DOI: 10.3389/fncel.2018.00123.
Abstract
Selective pressure may drive neural systems to process as much information as possible at the lowest energy cost. Recent experimental evidence revealed that the ratio between synaptic excitation and inhibition (E/I) in local cortex is generally maintained at a certain value, which may influence the efficiency of energy consumption and information transmission in neural networks. To understand this issue more deeply, we constructed a typical recurrent Hodgkin-Huxley network model and studied the general principles that govern the relationship among the E/I synaptic current ratio, the energy cost, and the total amount of information transmission. We observed that there exists an optimal E/I synaptic current ratio in the network at which information transmission reaches its maximum with relatively low energy cost. The coding energy efficiency, defined as the mutual information divided by the energy cost, achieved its maximum with balanced synaptic currents. Although background noise degrades information transmission and imposes an additional energy cost, we found an optimal noise intensity that yields the largest information transmission and energy efficiency at this optimal E/I synaptic transmission ratio. The maximization of energy efficiency also requires that a certain portion of the energy cost be associated with spontaneous spiking and synaptic activities. We further proved this finding with an analytical solution based on the response function of bistable neurons, and demonstrated that optimal net synaptic currents are capable of maximizing both the mutual information and the energy efficiency. These results reveal that the development of E/I synaptic current balance could lead a cortical network to operate at a highly efficient information transmission rate with relatively low energy cost. The generality of the neuronal models and the recurrent network configuration used here suggests that the existence of an optimal E/I ratio for highly efficient energy use and information maximization is a potential principle of cortical circuit networks.
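The efficiency measure used in this abstract — mutual information divided by energy cost — can be made concrete with a minimal binary-channel sketch (a drastic simplification of the paper's Hodgkin-Huxley network; spike cost and probabilities below are assumed placeholders).

```python
import numpy as np

def binary_entropy(p):
    """Entropy (bits) of a Bernoulli variable with success probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def mutual_information(p_hit, p_false, p_s1=0.5):
    """MI (bits) between a binary stimulus and a binary spike response."""
    p_spike = p_s1 * p_hit + (1 - p_s1) * p_false
    return (binary_entropy(p_spike) - p_s1 * binary_entropy(p_hit)
            - (1 - p_s1) * binary_entropy(p_false))

def energy_efficiency(p_hit, p_false, cost_per_spike=1.0):
    """Coding efficiency in the sense used above: mutual information
    divided by energy cost (here, expected spike cost; units assumed)."""
    energy = cost_per_spike * 0.5 * (p_hit + p_false)
    return mutual_information(p_hit, p_false) / energy

# A response that separates the two stimuli well is far more efficient
# than a weakly selective one with the same expected spike count.
eff_sharp = energy_efficiency(0.9, 0.1)   # strong stimulus separation
eff_weak = energy_efficiency(0.6, 0.4)    # weak stimulus separation
```

In the paper, the spike probabilities are not free parameters but emerge from the E/I current ratio and background noise; this sketch only fixes the quantity being maximized.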
Affiliation(s)
- Lianchun Yu
- Institute of Theoretical Physics, Lanzhou University, Lanzhou, China
- The School of Nationalities' Educators, Qinghai Normal University, Xining, China
- Zhou Shen
- Cuiying Honors College, Lanzhou University, Lanzhou, China
- Chen Wang
- Department of Physical Science and Technology, Lanzhou University, Lanzhou, China
- Yuguo Yu
- State Key Laboratory of Medical Neurobiology, School of Life Science and Human Phenome Institute, Institutes of Brain Science, Center for Computational Systems Biology, Fudan University, Shanghai, China
9. Manos T, Zeitler M, Tass PA. Short-Term Dosage Regimen for Stimulation-Induced Long-Lasting Desynchronization. Front Physiol 2018; 9:376. PMID: 29706900; PMCID: PMC5906576; DOI: 10.3389/fphys.2018.00376.
Abstract
In this paper, we computationally generate hypotheses for dose-finding studies in the context of desynchronizing neuromodulation techniques. Abnormally strong neuronal synchronization is a hallmark of several brain disorders. Coordinated Reset (CR) stimulation is a spatio-temporally patterned stimulation technique that specifically aims at disrupting abnormal neuronal synchrony. In networks with spike-timing-dependent plasticity, CR stimulation may ultimately cause an anti-kindling, i.e., an unlearning of abnormal synaptic connectivity and neuronal synchrony. This long-lasting desynchronization was theoretically predicted and verified in several pre-clinical and clinical studies. We have shown that CR stimulation with rapidly varying sequences (RVS) robustly induces an anti-kindling at low intensities, e.g., if the CR stimulation frequency (i.e., the stimulus pattern repetition rate) is in the range of the frequency of the neuronal oscillation. In contrast, CR stimulation with slowly varying sequences (SVS) turned out to induce an anti-kindling more strongly, but less robustly with respect to variations of the CR stimulation frequency. Motivated by clinical constraints and inspired by the spacing principle of learning theory, in this computational study we propose a short-term dosage regimen that enables a robust anti-kindling effect of both RVS and SVS CR stimulation, even for those parameter values where RVS and SVS CR stimulation previously turned out to be ineffective. Intriguingly, for the vast majority of parameter values tested, spaced multishot CR stimulation with demand-controlled variation of stimulation frequency and intensity caused a robust and pronounced anti-kindling. In contrast, spaced CR stimulation with fixed stimulation parameters, as well as singleshot CR stimulation of equal integral duration, failed to improve the stimulation outcome. In the model network under consideration, our short-term dosage regimen robustly induces long-lasting desynchronization with a comparably short stimulation duration and low integral stimulation duration. Currently, clinical proof of concept is available for deep brain CR stimulation for Parkinson's therapy and acoustic CR stimulation for tinnitus therapy. Promising first-in-human data are available for vibrotactile CR stimulation for Parkinson's treatment. For the clinical development of these treatments it is mandatory to perform dose-finding studies to reveal optimal stimulation parameters and dosage regimens. Our findings can be straightforwardly tested in human dose-finding studies.
Affiliation(s)
- Thanos Manos
- Institute of Neuroscience and Medicine (INM-7), Research Centre Jülich, Jülich, Germany
- Institute of Systems Neuroscience, Medical Faculty, Heinrich Heine University Düsseldorf, Düsseldorf, Germany
- Magteld Zeitler
- Institute of Neuroscience and Medicine (INM-7), Research Centre Jülich, Jülich, Germany
- Peter A. Tass
- Department of Neurosurgery, Stanford University, Stanford, CA, United States
10. Zhou S, Yu Y. Synaptic E-I Balance Underlies Efficient Neural Coding. Front Neurosci 2018; 12:46. PMID: 29456491; PMCID: PMC5801300; DOI: 10.3389/fnins.2018.00046.
Abstract
Both theoretical and experimental evidence indicate that synaptic excitation and inhibition in the cerebral cortex are well-balanced during the resting state and sensory processing. Here, we briefly summarize the evidence for how neural circuits are adjusted to achieve this balance. Then, we discuss how such excitatory and inhibitory balance shapes stimulus representation and information propagation, two basic functions of neural coding. We also point out the benefit of adopting such a balance during neural coding. We conclude that excitatory and inhibitory balance may be a fundamental mechanism underlying efficient coding.
Affiliation(s)
- Shanglin Zhou
- State Key Laboratory of Medical Neurobiology, School of Life Science and the Collaborative Innovation Center for Brain Science, Institutes of Brain Science, Center for Computational Systems Biology, Fudan University, Shanghai, China
- Yuguo Yu
- State Key Laboratory of Medical Neurobiology, School of Life Science and the Collaborative Innovation Center for Brain Science, Institutes of Brain Science, Center for Computational Systems Biology, Fudan University, Shanghai, China
11. Liu Y, Yue Y, Yu Y, Liu L, Yu L. Effects of channel blocking on information transmission and energy efficiency in squid giant axons. J Comput Neurosci 2018; 44:219-231. PMID: 29327161; DOI: 10.1007/s10827-017-0676-2.
Abstract
Action potentials are the information carriers of neural systems. The generation of action potentials involves the cooperative opening and closing of sodium and potassium channels. This process is metabolically expensive because the ions flowing through open channels need to be restored to maintain concentration gradients of these ions. Toxins like tetraethylammonium can block working ion channels, thus affecting the function and energy cost of neurons. In this paper, by computer simulation of the Hodgkin-Huxley neuron model, we studied the effects of channel blocking with toxins on the information transmission and energy efficiency in squid giant axons. We found that gradually blocking sodium channels will sequentially maximize the information transmission and energy efficiency of the axons, whereas moderate blocking of potassium channels will have little impact on the information transmission and will decrease the energy efficiency. Heavy blocking of potassium channels will cause self-sustained oscillation of membrane potentials. Simultaneously blocking sodium and potassium channels with the same ratio increases both information transmission and energy efficiency. Our results are in line with previous studies suggesting that information processing capacity and energy efficiency can be maximized by regulating the number of active ion channels, and this indicates a viable avenue for future experimentation.
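The channel-blocking manipulation described here can be sketched with the standard Hodgkin-Huxley equations, scaling the maximal conductances down by the blocked fraction. This is a bare single-compartment sketch, not the authors' simulation protocol; the stimulus current, durations, and block fraction are illustrative assumptions.

```python
import numpy as np

def hh_run(block_na=0.0, block_k=0.0, i_stim=15.0, t_ms=50.0, dt=0.01):
    """Standard Hodgkin-Huxley point neuron with a fraction of Na+ or K+
    channels blocked (maximal conductances scaled down). Returns
    (spike_count, na_charge); na_charge indexes the Na+ load the Na+/K+
    pump must later restore at one ATP per three Na+ ions."""
    gna = 120.0 * (1.0 - block_na)   # mS/cm^2, reduced by blocked fraction
    gk = 36.0 * (1.0 - block_k)
    gl, ena, ek, el, cm = 0.3, 50.0, -77.0, -54.387, 1.0
    v, m, h, n = -65.0, 0.053, 0.596, 0.317   # resting state
    spikes, above, na_charge = 0, False, 0.0
    for _ in range(int(t_ms / dt)):
        am = 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
        bm = 4.0 * np.exp(-(v + 65.0) / 18.0)
        ah = 0.07 * np.exp(-(v + 65.0) / 20.0)
        bh = 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
        an = 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
        bn = 0.125 * np.exp(-(v + 65.0) / 80.0)
        ina = gna * m ** 3 * h * (v - ena)
        ik = gk * n ** 4 * (v - ek)
        il = gl * (v - el)
        v += dt * (i_stim - ina - ik - il) / cm   # forward Euler step
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        n += dt * (an * (1.0 - n) - bn * n)
        na_charge += dt * max(-ina, 0.0)          # accumulate inward Na+ charge
        if v > 0.0 and not above:                 # upward crossing of 0 mV
            spikes += 1
        above = v > 0.0
    return spikes, na_charge

spikes_ctrl, q_ctrl = hh_run()
spikes_blk, q_blk = hh_run(block_na=0.2)   # 20% of Na+ channels blocked
# Moderate Na+ block preserves spiking while lowering the Na+ entry per
# spike, in the spirit of the efficiency gains reported above.
```

Sweeping `block_na` and `block_k` over a grid, and adding an information measure on the spike trains, would reproduce the kind of efficiency curves the paper reports.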
Affiliation(s)
- Yujiang Liu
- Institute of Theoretical Physics, Lanzhou University, Lanzhou, 730000, China
- Yuan Yue
- Institute of Theoretical Physics, Lanzhou University, Lanzhou, 730000, China
- College of Electrical Engineering, Northwest University for Nationalities, Lanzhou, 730070, China
- Yuguo Yu
- School of Life Science and the Collaborative Innovation Center for Brain Science, Center for Computational Systems Biology, Fudan University, Shanghai, 200433, China
- Liwei Liu
- College of Electrical Engineering, Northwest University for Nationalities, Lanzhou, 730070, China
- Lianchun Yu
- Institute of Theoretical Physics, Lanzhou University, Lanzhou, 730000, China
12. Yi G, Wang J, Wei X, Deng B. Dendritic Properties Control Energy Efficiency of Action Potentials in Cortical Pyramidal Cells. Front Cell Neurosci 2017; 11:265. PMID: 28919852; PMCID: PMC5585200; DOI: 10.3389/fncel.2017.00265.
Abstract
Neural computation is performed by transforming input signals into sequences of action potentials (APs), which is metabolically expensive and limited by the energy available to the brain. The metabolic efficiency of a single AP has important consequences for the computational power of the cell, and is determined by its biophysical properties and morphology. Here we adopt biophysically based two-compartment models to investigate how dendrites affect the energy efficiency of APs in cortical pyramidal neurons. We measure the Na+ entry during the spike and examine how efficiently it is used for generating AP depolarization. We show that increasing the proportion of dendritic area or the coupling conductance between the two compartments decreases the Na+ entry efficiency of the somatic AP. Activating an inward Ca2+ current in dendrites results in a dendritic spike, which increases AP efficiency. Activating a Ca2+-activated outward K+ current in dendrites, however, decreases Na+ entry efficiency. We demonstrate that active and passive dendrites take effect by altering the overlap between the Na+ influx and the internal current flowing from soma to dendrite. We explain a fundamental link between dendritic properties and AP efficiency, which is essential for interpreting how neural computation consumes metabolic energy and how biophysics and morphology contribute to such consumption.
Affiliation(s)
- Guosheng Yi
- School of Electrical and Information Engineering, Tianjin University, Tianjin, China
- Jiang Wang
- School of Electrical and Information Engineering, Tianjin University, Tianjin, China
- Xile Wei
- School of Electrical and Information Engineering, Tianjin University, Tianjin, China
- Bin Deng
- School of Electrical and Information Engineering, Tianjin University, Tianjin, China
13. Yu L, Yu Y. Energy-efficient neural information processing in individual neurons and neuronal networks. J Neurosci Res 2017; 95:2253-2266. PMID: 28833444; DOI: 10.1002/jnr.24131.
Abstract
Brains are composed of networks of an enormous number of neurons interconnected with synapses. Neural information is carried by the electrical signals within neurons and the chemical signals among neurons. Generating these electrical and chemical signals is metabolically expensive. The fundamental issue raised here is whether brains have evolved efficient ways of developing an energy-efficient neural code from the molecular level to the circuit level. Here, we summarize the factors and biophysical mechanisms that could contribute to an energy-efficient neural code for processing input signals. The factors range from ion channel kinetics, body temperature, axonal propagation of action potentials, low-probability release of synaptic neurotransmitters, optimal input and noise, the size of neurons and neuronal clusters, excitation/inhibition balance, coding strategy, and cortical wiring, to the organization of functional connectivity. Both experimental and computational evidence suggests that neural systems may use these factors to maximize the efficiency of energy consumption in processing neural signals. Studies indicate that efficient energy utilization may be universal in neuronal systems as an evolutionary consequence of the pressure of limited energy. As a result, neuronal connections may be wired in a highly economical manner to lower energy and space costs. Individual neurons within a network may encode independent stimulus components, allowing a minimal number of neurons to represent whole stimulus characteristics efficiently. This basic principle may fundamentally change our view of how billions of neurons organize themselves into complex circuits to operate and generate the most powerful intelligent cognition in nature.
Affiliation(s)
- Lianchun Yu
- Institute of Theoretical Physics, Key Laboratory for Magnetism and Magnetic Materials of the Ministry of Education, Lanzhou University, Lanzhou, China
- Yuguo Yu
- School of Life Science and the State Key Laboratory of Medical Neurobiology, Institutes of Brain Science and the Collaborative Innovation Center for Brain Science, Fudan University, Shanghai, China
|
14
|
Guo D, Perc M, Zhang Y, Xu P, Yao D. Frequency-difference-dependent stochastic resonance in neural systems. Phys Rev E 2017; 96:022415. [PMID: 28950589 DOI: 10.1103/physreve.96.022415] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2017] [Indexed: 06/07/2023]
Abstract
Biological neurons receive multiple noisy oscillatory signals, and their dynamical response to the superposition of these signals is of fundamental importance for information processing in the brain. Here we study the response of neural systems to a weak envelope modulation signal formed by the superposition of two periodic signals with different frequencies. We show that stochastic resonance occurs at the beat frequency in neural systems at the single-neuron as well as the population level. The performance of this frequency-difference-dependent stochastic resonance is influenced by both the beat frequency and the two forcing frequencies. Compared to a single neuron, a population of neurons is more efficient in detecting the information carried by the weak envelope modulation signal at the beat frequency. Furthermore, an appropriate fine-tuning of the excitation-inhibition balance can further optimize the response of a neural ensemble to the superimposed signal. Our results thus introduce and provide insights into the generation and modulation mechanism of the frequency-difference-dependent stochastic resonance in neural systems.
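The beat-frequency mechanism in this abstract can be illustrated with a minimal sketch (not the paper's model): a subthreshold input built from two superposed sines crosses a fixed threshold only with the help of added noise, and the crossings cluster where the slow beat envelope is high. The frequencies, threshold, and noise levels below are illustrative choices, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000.0                       # sampling rate (Hz)
t = np.arange(0, 10.0, 1.0 / fs)  # 10 s of input
f1, f2 = 10.0, 11.0               # forcing frequencies; beat frequency = |f2 - f1| = 1 Hz
signal = 0.5 * (np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t))
envelope = np.abs(np.cos(np.pi * (f2 - f1) * t))  # slow envelope of the summed sines

def rate_envelope_corr(noise_sd, threshold=1.2, bin_s=0.1):
    """Correlate binned upward threshold-crossing counts with the beat envelope."""
    x = signal + rng.normal(0.0, noise_sd, t.size)
    up = np.concatenate(([False], (x[1:] >= threshold) & (x[:-1] < threshold)))
    samples_per_bin = int(bin_s * fs)
    n_bins = t.size // samples_per_bin
    counts = up[: n_bins * samples_per_bin].reshape(n_bins, -1).sum(axis=1)
    env = envelope[: n_bins * samples_per_bin].reshape(n_bins, -1).mean(axis=1)
    return np.corrcoef(counts, env)[0, 1]

# The deterministic input (peak amplitude 1.0) never reaches the 1.2 threshold;
# added noise lets crossings occur, preferentially where the envelope is high.
for sd in (0.3, 0.6, 1.5):
    print(f"noise sd={sd}: rate-envelope correlation = {rate_envelope_corr(sd):.2f}")
```

At moderate noise the crossing rate tracks the 1 Hz envelope (a positive rate-envelope correlation), while at very large noise crossings occur nearly uniformly, which is the qualitative signature of stochastic resonance at the beat frequency.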
Affiliation(s)
- Daqing Guo
- The Clinical Hospital of Chengdu Brain Science Institute, MOE Key Lab for Neuroinformation, University of Electronic Science and Technology of China, Chengdu 610054, People's Republic of China
- Matjaž Perc
- Faculty of Natural Sciences and Mathematics, University of Maribor, Koroška cesta 160, SI-2000 Maribor, Slovenia
- CAMTP-Center for Applied Mathematics and Theoretical Physics, University of Maribor, Mladinska 3, SI-2000 Maribor, Slovenia
- Yangsong Zhang
- The Clinical Hospital of Chengdu Brain Science Institute, MOE Key Lab for Neuroinformation, University of Electronic Science and Technology of China, Chengdu 610054, People's Republic of China
- Peng Xu
- The Clinical Hospital of Chengdu Brain Science Institute, MOE Key Lab for Neuroinformation, University of Electronic Science and Technology of China, Chengdu 610054, People's Republic of China
- Dezhong Yao
- The Clinical Hospital of Chengdu Brain Science Institute, MOE Key Lab for Neuroinformation, University of Electronic Science and Technology of China, Chengdu 610054, People's Republic of China
|
15
|
Abstract
For many organisms, the number of sensory neurons is largely determined during development, before strong environmental cues are present. This is despite the fact that environments can fluctuate drastically both from generation to generation and within an organism's lifetime. How can organisms get by with a hard-coded number of sensory neurons? We approach this question using rate-distortion theory. A combination of simulation and theory suggests that when environments are large, the rate-distortion function (a proxy for material costs, timing delays, and energy requirements) depends only on coarse-grained environmental statistics that are expected to change on evolutionary, rather than ontogenetic, time scales.
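The rate-distortion function referred to in this abstract can be computed numerically with the standard Blahut-Arimoto algorithm. The sketch below is an illustrative toy case (a fair binary source under Hamming distortion, not the authors' model), chosen because the analytic answer R(D) = 1 - H2(D) is available as a check:

```python
import numpy as np

def blahut_arimoto(p_x, dist, beta, n_iter=500):
    """Return (rate in bits, expected distortion) on the R(D) curve at slope parameter beta."""
    n_x, n_xhat = dist.shape
    q = np.full(n_xhat, 1.0 / n_xhat)            # reproduction marginal q(x_hat)
    for _ in range(n_iter):
        # Optimal test channel for the current marginal: p(x_hat|x) ~ q(x_hat) * exp(-beta * d)
        cond = q[None, :] * np.exp(-beta * dist)
        cond /= cond.sum(axis=1, keepdims=True)
        q = p_x @ cond                            # re-estimate the reproduction marginal
    joint = p_x[:, None] * cond
    rate = float(np.sum(joint * np.log2(cond / q[None, :])))   # mutual information I(X; X_hat)
    return rate, float(np.sum(joint * dist))

# Fair binary source with Hamming distortion, where R(D) = 1 - H2(D) for D <= 1/2.
p_x = np.array([0.5, 0.5])
dist = np.array([[0.0, 1.0], [1.0, 0.0]])
R, D = blahut_arimoto(p_x, dist, beta=2.0)
H2 = -D * np.log2(D) - (1 - D) * np.log2(1 - D)  # binary entropy of the distortion level
print(f"D = {D:.3f}, R = {R:.3f} bits, analytic 1 - H2(D) = {1 - H2:.3f}")
```

Each iteration alternates between the optimal test channel for the current reproduction marginal and the marginal induced by that channel; sweeping beta traces out the whole R(D) curve, which is the quantity the abstract's coarse-graining argument concerns.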
Affiliation(s)
- Sarah Marzen
- Department of Physics, Redwood Center for Theoretical Neuroscience, University of California at Berkeley, Berkeley, California 94720, USA
- Physics of Living Systems, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA
- Simon DeDeo
- Center for Complex Networks and Systems Research, Department of Informatics, Indiana University, 919 East 10th Street, Bloomington, Indiana 47408, USA
- Department of Social and Decision Sciences, 5000 Forbes Avenue, BP 208, Pittsburgh, Pennsylvania 15213, USA
- Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, New Mexico 87501, USA
|