1. Koren V, Malerba SB, Schwalger T, Panzeri S. Structure, dynamics, coding and optimal biophysical parameters of efficient excitatory-inhibitory spiking networks. bioRxiv [Preprint] 2024:2024.04.24.590955. [PMID: 38712237; PMCID: PMC11071478; DOI: 10.1101/2024.04.24.590955]
Abstract
The principle of efficient coding posits that sensory cortical networks are designed to encode maximal sensory information with minimal metabolic cost. Despite the major influence of efficient coding in neuroscience, it has remained unclear whether fundamental empirical properties of neural network activity can be explained solely on the basis of this normative principle. Here, we rigorously derive the structural, coding, biophysical and dynamical properties of excitatory-inhibitory recurrent networks of spiking neurons that emerge directly from imposing that the network minimize an instantaneous loss function and a time-averaged performance measure enacting efficient coding. The optimal network has biologically plausible biophysical features, including realistic integrate-and-fire spiking dynamics, spike-triggered adaptation, and a non-stimulus-specific excitatory external input that regulates metabolic cost. The efficient network has excitatory-inhibitory recurrent connectivity between neurons with similar stimulus tuning, implementing feature-specific competition similar to that recently found in visual cortex. Networks with unstructured connectivity cannot reach comparable levels of coding efficiency. The optimal biophysical parameters include a 4:1 ratio of excitatory to inhibitory neurons and a 3:1 ratio of mean inhibitory-to-inhibitory vs. excitatory-to-inhibitory connectivity, closely matching those of cortical sensory networks. The efficient network exhibits biologically plausible spiking dynamics with a tight instantaneous E-I balance, making it capable of efficiently encoding external stimuli that vary over multiple time scales. Together, these results explain how efficient coding may be implemented in cortical networks and suggest that key properties of biological neural networks may be accounted for by efficient coding.
2. Puttkammer F, Lindner B. Fluctuation-response relations for integrate-and-fire models with an absolute refractory period. Biol Cybern 2024; 118:7-19. [PMID: 38261004; PMCID: PMC11068698; DOI: 10.1007/s00422-023-00982-9]
Abstract
We study the problem of relating the spontaneous fluctuations of a stochastic integrate-and-fire (IF) model to the response of the instantaneous firing rate to time-dependent stimulation when the IF model is endowed with a non-vanishing refractory period and a finite (stereotypical) spike shape. This seemingly harmless addition to the model is shown to complicate the analysis put forward by Lindner (Phys. Rev. Lett., 2022), i.e., the incorporation of the reset into the model equation, the Rice-like averaging of the stochastic differential equation, and the application of the Furutsu-Novikov theorem. We derive a still exact (although more complicated) fluctuation-response relation (FRR) for an IF model with a refractory state and white Gaussian background noise. We also briefly discuss an approximation for the case of colored Gaussian noise and conclude with a summary and outlook on open problems.
Affiliation(s)
- Friedrich Puttkammer
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
3. Masoli S, Sanchez-Ponce D, Vrieler N, Abu-Haya K, Lerner V, Shahar T, Nedelescu H, Rizza MF, Benavides-Piccione R, DeFelipe J, Yarom Y, Munoz A, D'Angelo E. Human Purkinje cells outperform mouse Purkinje cells in dendritic complexity and computational capacity. Commun Biol 2024; 7:5. [PMID: 38168772; PMCID: PMC10761885; DOI: 10.1038/s42003-023-05689-y]
Abstract
Purkinje cells in the cerebellum are among the largest neurons in the brain and have been extensively investigated in rodents. However, their morphological and physiological properties remain poorly understood in humans. In this study, we utilized high-resolution morphological reconstructions and unique electrophysiological recordings of human Purkinje cells ex vivo to generate computational models and estimate computational capacity. An inter-species comparison showed that human Purkinje cells had fractal structures similar to, but larger than, those of mouse Purkinje cells. Consequently, given a similar spine density (2/μm), human Purkinje cells hosted approximately 7.5 times more dendritic spines than those of mice. Moreover, human Purkinje cells had a higher dendritic complexity than mouse Purkinje cells and usually emitted 2-3 main dendritic trunks instead of one. Intrinsic electroresponsiveness was similar between the two species, but model simulations revealed that the dendrites could process approximately 6.5 times more input patterns (n = 51 vs. n = 8) in human Purkinje cells than in mouse Purkinje cells. Thus, while human Purkinje cells maintained spike discharge properties similar to those of rodents during evolution, they developed more complex dendrites, enhancing computational capacity.
Affiliation(s)
- Stefano Masoli
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
- Diana Sanchez-Ponce
- Centro de Tecnología Biomédica (CTB), Universidad Politécnica de Madrid, Madrid, Spain
- Nora Vrieler
- Feinberg School of Medicine, Northwestern University, Chicago, IL, USA
- Department of Neurobiology and ELSC, Edmond J. Safra Campus, The Hebrew University of Jerusalem, Jerusalem, Israel
- Karin Abu-Haya
- Department of Neurobiology and ELSC, Edmond J. Safra Campus, The Hebrew University of Jerusalem, Jerusalem, Israel
- Vitaly Lerner
- Department of Neurobiology and ELSC, Edmond J. Safra Campus, The Hebrew University of Jerusalem, Jerusalem, Israel
- Brain and Cognitive Sciences and Center of Visual Science, University of Rochester, Rochester, NY, USA
- Tal Shahar
- Department of Neurosurgery, Shaare Zedek Medical Center, Jerusalem, Israel
- Ruth Benavides-Piccione
- Centro de Tecnología Biomédica (CTB), Universidad Politécnica de Madrid, Madrid, Spain
- Instituto Cajal (CSIC), Madrid, Spain
- Javier DeFelipe
- Centro de Tecnología Biomédica (CTB), Universidad Politécnica de Madrid, Madrid, Spain
- Instituto Cajal (CSIC), Madrid, Spain
- Yosef Yarom
- Department of Neurobiology and ELSC, Edmond J. Safra Campus, The Hebrew University of Jerusalem, Jerusalem, Israel
- Alberto Munoz
- Centro de Tecnología Biomédica (CTB), Universidad Politécnica de Madrid, Madrid, Spain
- Departamento de Biología Celular, Universidad Complutense de Madrid, Madrid, Spain
- Egidio D'Angelo
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
- Digital Neuroscience Center, IRCCS Mondino Foundation, Pavia, Italy
4. Marasco A, Spera E, De Falco V, Iuorio A, Lupascu CA, Solinas S, Migliore M. An Adaptive Generalized Leaky Integrate-and-Fire Model for Hippocampal CA1 Pyramidal Neurons and Interneurons. Bull Math Biol 2023; 85:109. [PMID: 37792146; PMCID: PMC10550887; DOI: 10.1007/s11538-023-01206-8]
Abstract
Full-scale, morphologically and biophysically realistic model networks, aiming to model multiple brain areas, provide an invaluable tool for making significant scientific advances, from in-silico experiments on cognitive functions to digital twin implementations. Due to the current technical limitations of supercomputer systems in terms of computational power and memory requirements, these networks must be implemented using (at least) simplified neurons. A class of models that achieves a reasonable compromise between accuracy and computational efficiency is given by generalized leaky integrate-and-fire models complemented by suitable initial and update conditions. However, we found that these models cannot reproduce the complex and highly variable firing dynamics exhibited by neurons in several brain regions, such as the hippocampus. In this work, we propose an adaptive generalized leaky integrate-and-fire model for hippocampal CA1 neurons and interneurons, in which the nonlinear nature of the firing dynamics is successfully reproduced by linear ordinary differential equations equipped with nonlinear and more realistic initial and update conditions after each spike event, which depend strictly on the external stimulation current. A mathematical analysis of the stability of the equilibria, together with the monotonicity properties of the analytical solution for the membrane potential, allowed us (i) to determine general constraints on model parameters, reducing the computational cost of an optimization procedure based on spike times in response to a set of constant current injections; and (ii) to identify additional constraints to quantitatively reproduce and predict the experimental traces from 85 neurons and interneurons in response to any stimulation protocol using constant and piecewise-constant current injections. Finally, this approach makes it easy to implement a procedure for creating infinite copies of neurons with mathematically controlled firing properties, statistically indistinguishable from experiments, to better reproduce the full range and variability of the firing scenarios observed in a real network.
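The general scheme this abstract describes - linear subthreshold ODEs combined with nonlinear per-spike update conditions - can be illustrated with a minimal adaptive leaky integrate-and-fire sketch. The parameters and the specific update rule below are illustrative placeholders, not the fitted CA1 values from the paper:

```python
import numpy as np

def adaptive_lif(i_ext, dt=0.1, tau_m=20.0, v_rest=-65.0, v_th=-50.0,
                 v_reset=-60.0, tau_w=100.0, b=1.0):
    """Leaky integrate-and-fire neuron with a spike-triggered adaptation
    variable w. The subthreshold dynamics are linear ODEs; nonlinearity
    enters only through the update conditions applied at each spike.
    Units (ms, mV) and parameter values are illustrative."""
    v, w = v_rest, 0.0
    spike_times = []
    for t in range(len(i_ext)):
        v += dt * (-(v - v_rest) - w + i_ext[t]) / tau_m  # membrane ODE
        w += dt * (-w / tau_w)                            # adaptation decay
        if v >= v_th:                 # update condition at a spike event
            v = v_reset               # reset the membrane potential
            w += b                    # spike-triggered adaptation jump
            spike_times.append(t * dt)
    return np.array(spike_times)

# A constant current step: the accumulating adaptation variable stretches
# successive interspike intervals (spike-frequency adaptation).
spikes = adaptive_lif(np.full(20000, 40.0))   # 2 s of input at dt = 0.1 ms
isis = np.diff(spikes)
```

Setting b = 0 recovers the classical LIF with constant interspike intervals under this stimulus; it is the per-spike update that makes each interval depend on the recent spike history.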
Affiliation(s)
- Addolorata Marasco
- Department of Mathematics and Applications, University of Naples Federico II, Via Cintia ed. 5A, 80126 Naples, Italy
- Institute of Biophysics, National Research Council, Via Ugo La Malfa 153, 90146 Palermo, Italy
- Emiliano Spera
- Institute of Biophysics, National Research Council, Via Ugo La Malfa 153, 90146 Palermo, Italy
- Vittorio De Falco
- Scuola Superiore Meridionale, Largo San Marcellino 10, 80138 Naples, Italy
- Istituto Nazionale di Fisica Nucleare di Napoli, Via Cintia ed. 6, 80126 Naples, Italy
- Annalisa Iuorio
- Faculty of Mathematics, University of Vienna, Oskar-Morgenstern-Platz 1, 1090 Vienna, Austria
- Department of Engineering, Parthenope University of Naples, Centro Direzionale - Isola C4, 80143 Naples, Italy
- Carmen Alina Lupascu
- Institute of Biophysics, National Research Council, Via Ugo La Malfa 153, 90146 Palermo, Italy
- Sergio Solinas
- Department of Biomedical Science, University of Sassari, Viale San Pietro 23, 07100 Sassari, Italy
- Michele Migliore
- Institute of Biophysics, National Research Council, Via Ugo La Malfa 153, 90146 Palermo, Italy
5. Ramlow L, Falcke M, Lindner B. An integrate-and-fire approach to Ca2+ signaling. Part I: Renewal model. Biophys J 2023; 122:713-736. [PMID: 36635961; PMCID: PMC9989887; DOI: 10.1016/j.bpj.2023.01.007]
Abstract
In computational neuroscience, integrate-and-fire models capture spike generation by a subthreshold dynamics supplemented with a simple fire-and-reset rule; they allow for a numerically efficient and analytically tractable description of stochastic single-cell as well as network dynamics. Stochastic spiking is also a prominent feature of Ca2+ signaling, which suggests adopting the integrate-and-fire approach for this fundamental biophysical process. The model introduced here consists of two components describing 1) the activity of clusters of inositol-trisphosphate receptor channels and 2) the dynamics of the global Ca2+ concentration in the cytosol. The cluster dynamics is given in terms of a cyclic Markov chain, capturing the puff, i.e., the punctuated release of Ca2+ from intracellular stores. The cytosolic Ca2+ concentration is described by an integrate-and-fire dynamics driven by the puff current. For the cyclic Markov chain we derive expressions for the statistics of the interpuff interval, the single-puff strength, and the puff current, assuming constant cytosolic Ca2+. The latter condition is often a good approximation because cytosolic Ca2+ varies much more slowly than the cluster activity does. Furthermore, because the detailed two-component model is numerically expensive to simulate and difficult to treat analytically, we develop an analytical framework to approximate the driving puff current of the stochastic cytosolic Ca2+ dynamics by a temporally uncorrelated Gaussian noise. This approximation reduces our two-component system to an integrate-and-fire model with a nonlinear drift function and multiplicative Gaussian white noise, a model that is known to generate a renewal spike train, i.e., a point process with statistically independent interspike intervals. The model allows for fast numerical simulations, permits the derivation of analytical expressions for the rate of Ca2+ spiking and the coefficient of variation of the interspike interval, and approximates the interspike interval density and the spike train power spectrum. Comparison of these statistics to experimental data is discussed.
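A toy version of the reduced model mentioned at the end of the abstract - an integrate-and-fire dynamics with a nonlinear drift and multiplicative white noise - can be simulated in a few lines. The drift and noise functions below are arbitrary placeholders, not the expressions the authors derive from the cluster statistics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder nonlinear drift f(c) and multiplicative noise amplitude g(c);
# the clipped square root keeps the noise amplitude real near c = 0.
f = lambda c: 0.5 + 2.0 * c * (1.0 - c)
g = lambda c: 0.2 * np.sqrt(max(c, 0.0) + 0.05)

dt, c, c_th = 1e-3, 0.0, 1.0
last_spike, isis = 0.0, []
noise = rng.standard_normal(600_000)
sqdt = np.sqrt(dt)
for i, xi in enumerate(noise, start=1):
    c += f(c) * dt + g(c) * sqdt * xi   # Euler-Maruyama step
    if c >= c_th:                       # Ca2+ spike: fire and reset
        isis.append(i * dt - last_spike)
        last_spike = i * dt
        c = 0.0

isis = np.array(isis)
rate, cv = 1.0 / isis.mean(), isis.std() / isis.mean()
rho1 = np.corrcoef(isis[:-1], isis[1:])[0, 1]
```

Because the white noise carries no memory and the reset erases the state, successive intervals are independent, so the estimated lag-1 serial correlation rho1 fluctuates around zero - the defining renewal property.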
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department of Humboldt University Berlin, Berlin, Germany
- Martin Falcke
- Physics Department of Humboldt University Berlin, Berlin, Germany
- Max Delbrück Center for Molecular Medicine, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department of Humboldt University Berlin, Berlin, Germany
6. D'Angelo E, Jirsa V. The quest for multiscale brain modeling. Trends Neurosci 2022; 45:777-790. [PMID: 35906100; DOI: 10.1016/j.tins.2022.06.007]
Abstract
Addressing the multiscale organization of the brain, which is fundamental to the organ's dynamic repertoire, remains challenging. In principle, it should be possible to model neurons and synapses in detail and then connect them into large neuronal assemblies to explain the relationship between microscopic phenomena, large-scale brain functions, and behavior. It is more difficult to infer neuronal functions from ensemble measurements such as those currently obtained with brain activity recordings. In this article we consider theories and strategies for combining bottom-up models, generated from principles of neuronal biophysics, with top-down models based on ensemble representations of network activity and on functional principles. It is hoped that these integrative approaches will provide effective multiscale simulations in virtual brains and neurorobots, and pave the way for future applications in medicine and information technologies.
Affiliation(s)
- Egidio D'Angelo
- Department of Brain and Behavioral Sciences, University of Pavia, and Brain Connectivity Center, Istituto di Ricovero e Cura a Carattere Scientifico (IRCCS) Mondino Foundation, Pavia, Italy.
- Viktor Jirsa
- Institut National de la Santé et de la Recherche Médicale (INSERM) Unité 1106, Centre National de la Recherche Scientifique (CNRS), and University of Aix-Marseille, Marseille, France
7. van Albada SJ, Morales-Gregorio A, Dickscheid T, Goulas A, Bakker R, Bludau S, Palm G, Hilgetag CC, Diesmann M. Bringing Anatomical Information into Neuronal Network Models. Adv Exp Med Biol 2022; 1359:201-234. [DOI: 10.1007/978-3-030-89439-9_9]
8. Conventional measures of intrinsic excitability are poor estimators of neuronal activity under realistic synaptic inputs. PLoS Comput Biol 2021; 17:e1009378. [PMID: 34529674; PMCID: PMC8478185; DOI: 10.1371/journal.pcbi.1009378]
Abstract
Activity-dependent regulation of intrinsic excitability has been shown to contribute greatly to the overall plasticity of neuronal circuits. Such neuroadaptations are commonly investigated in patch-clamp experiments using current-step stimulation, and the resulting input-output functions are analyzed to quantify alterations in intrinsic excitability. However, it is rarely addressed how such changes translate to the function of neurons when they operate under natural synaptic inputs. Still, it is reasonable to expect a strong correlation and a near-proportional relationship between static firing responses and those evoked by synaptic drive. We challenge this view by performing a high-yield electrophysiological analysis of cultured mouse hippocampal neurons using both standard protocols and simulated synaptic inputs via dynamic clamp. We find that under these conditions the neurons exhibit vastly different firing responses, with surprisingly weak correlation between static and dynamic firing intensities. These contrasting responses are regulated by two intrinsic K-currents mediated by Kv1 and Kir channels, respectively. Pharmacological manipulation of the K-currents produces differential regulation of the firing output of neurons. Static firing responses of stuttering-type neurons are greatly increased when their Kv1 channels are blocked, while the synaptic responses of the same neurons are less affected. Pharmacological blocking of Kir channels in delayed-firing-type neurons, on the other hand, has the opposite effect. Our subsequent computational model simulations confirm the findings of the electrophysiological experiments and also show that adaptive changes in the kinetic properties of such currents can even produce paradoxical regulation of the firing output.
9. Ramlow L, Lindner B. Interspike interval correlations in neuron models with adaptation and correlated noise. PLoS Comput Biol 2021; 17:e1009261. [PMID: 34449771; PMCID: PMC8428727; DOI: 10.1371/journal.pcbi.1009261]
Abstract
The generation of neural action potentials (spikes) is random but may nevertheless result in a rich statistical structure of the spike sequence. In particular, contrary to the popular renewal assumption of theoreticians, the intervals between adjacent spikes are often correlated. Experimentally, different patterns of interspike-interval correlations have been observed, and computational studies have identified spike-frequency adaptation and correlated noise as the two main mechanisms that can lead to such correlations. Analytical studies have focused on the single cases of either correlated (colored) noise or adaptation currents in combination with uncorrelated (white) noise. For low-pass-filtered noise or adaptation, the serial correlation coefficient can be approximated as a single geometric sequence of the lag between the intervals, providing an explanation for some of the experimentally observed patterns. Here we address the problem of interval correlations for a widely used class of models, multidimensional integrate-and-fire neurons subject to a combination of colored and white noise sources and a spike-triggered adaptation current. Assuming weak noise, we derive a simple formula for the serial correlation coefficient, a sum of two geometric sequences, which accounts for a large class of correlation patterns. The theory is confirmed by means of numerical simulations in a number of special cases, including the leaky, quadratic, and generalized integrate-and-fire models with colored noise and spike-frequency adaptation. Furthermore, we study the case in which the adaptation current and the colored noise share the same time scale, corresponding to a slow stochastic population of adaptation channels; we demonstrate that our theory can account for a nonmonotonic dependence of the correlation coefficient on the channels' time scale. Another application of the theory is a neuron driven by network-noise-like fluctuations (green noise). We also discuss the range of validity of our weak-noise theory and show that, by changing the relative strength of white and colored noise sources, we can change the sign of the correlation coefficient. Finally, we apply our theory to a conductance-based model, which demonstrates its broad applicability.

The elementary processing units in the central nervous system are neurons that transmit information by short electrical pulses, so-called action potentials or spikes. The generation of the action potential is a random process that can be shaped by correlated fluctuations (colored noise) and by adaptation. A consequence of these two ubiquitous features is that the successive time intervals between spikes, the interspike intervals, are not independent but correlated. As these correlations can significantly improve information transmission and weak-signal detection, it is an important task to develop analytical approaches to these statistics for well-established computational models. Here we present a theory of interval correlations for a widely used class of integrate-and-fire models endowed with an adaptation mechanism and subject to correlated fluctuations. We demonstrate which patterns of interval correlations can be expected from the interplay of colored noise, adaptation and intrinsic nonlinear dynamics.
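The central quantity here, the serial correlation coefficient rho_k = Cov(T_i, T_{i+k}) / Var(T_i) of the interspike intervals T_i, is easy to estimate from a simulation. The sketch below uses a dimensionless adapting LIF model with purely white noise (illustrative parameters, not those of the paper) and recovers the hallmark negative lag-1 correlation induced by spike-triggered adaptation:

```python
import numpy as np

rng = np.random.default_rng(1)

def serial_corr(isis, lag):
    """Serial correlation coefficient rho_k of an interspike-interval sequence."""
    return np.corrcoef(isis[:-lag], isis[lag:])[0, 1]

# Dimensionless LIF with spike-triggered adaptation a and white noise only.
dt, n_steps = 0.01, 500_000
mu, tau_a, jump, D = 3.0, 3.0, 0.3, 0.1   # drive, adaptation, noise intensity
v, a, last_spike, isis = 0.0, 0.0, 0.0, []
noise = np.sqrt(2.0 * D * dt) * rng.standard_normal(n_steps)
for i in range(n_steps):
    v += dt * (mu - v - a) + noise[i]     # membrane equation
    a += dt * (-a / tau_a)                # adaptation decays between spikes
    if v >= 1.0:                          # threshold crossing
        v = 0.0                           # reset
        a += jump                         # spike-triggered adaptation kick
        isis.append((i + 1) * dt - last_spike)
        last_spike = (i + 1) * dt

isis = np.array(isis)
rho1 = serial_corr(isis, 1)
```

A long interval gives the adaptation variable time to decay, so the next interval tends to be short (and vice versa), yielding rho_1 < 0; an additional slow colored-noise input would contribute a positive component, the interplay captured by the sum of two geometric sequences derived in the paper.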
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
10. Beniaguev D, Segev I, London M. Single cortical neurons as deep artificial neural networks. Neuron 2021; 109:2727-2739.e3. [PMID: 34380016; DOI: 10.1016/j.neuron.2021.07.002]
Abstract
Utilizing recent advances in machine learning, we introduce a systematic approach to characterizing the complexity of neurons' input/output (I/O) mappings. Deep neural networks (DNNs) were trained to faithfully replicate the I/O function of various biophysical models of cortical neurons at millisecond (spiking) resolution. A temporally convolutional DNN with five to eight layers was required to capture the I/O mapping of a realistic model of a layer 5 cortical pyramidal cell (L5PC). This DNN generalized well when presented with inputs widely outside the training distribution. When NMDA receptors were removed, a much simpler network (a fully connected neural network with one hidden layer) was sufficient to fit the model. Analysis of the DNNs' weight matrices revealed that synaptic integration in dendritic branches could be conceptualized as pattern matching against a set of spatiotemporal templates. This study provides a unified characterization of the computational complexity of single neurons and suggests that cortical networks have a unique architecture, potentially supporting their computational power.
Affiliation(s)
- David Beniaguev
- Edmond and Lily Safra Center for Brain Sciences (ELSC), The Hebrew University of Jerusalem, Jerusalem 91904, Israel.
- Idan Segev
- Edmond and Lily Safra Center for Brain Sciences (ELSC), The Hebrew University of Jerusalem, Jerusalem 91904, Israel
- Department of Neurobiology, The Hebrew University of Jerusalem, Jerusalem 91904, Israel
- Michael London
- Edmond and Lily Safra Center for Brain Sciences (ELSC), The Hebrew University of Jerusalem, Jerusalem 91904, Israel
- Department of Neurobiology, The Hebrew University of Jerusalem, Jerusalem 91904, Israel
11. Jaras I, Harada T, Orchard ME, Maldonado PE, Vergara RC. Extending the integrate-and-fire model to account for metabolic dependencies. Eur J Neurosci 2021; 54:5249-5260. [PMID: 34109698; DOI: 10.1111/ejn.15326]
Abstract
It is widely accepted that the brain, like any other physical system, is subject to physical constraints that restrict its operation. The brain's metabolic demands are particularly critical for proper neuronal function, but the impact of these constraints remains poorly understood. Detailed single-neuron models have recently begun integrating metabolic constraints, but the computational resources these models require make it challenging to explore the dynamics of extended neural networks governed by such constraints. Thus, there is a need for a simplified neuron model that incorporates metabolic activity and allows the dynamics of neural networks to be explored. This work introduces an energy-dependent leaky integrate-and-fire (EDLIF) neuron model extension that accounts for the effects of metabolic constraints on single-neuron behavior. This simple, energy-dependent model can describe the relationship between the average firing rate and adenosine triphosphate (ATP) cost, as well as replicate a neuron's behavior in a clinical setting such as amyotrophic lateral sclerosis (ALS). Additionally, the EDLIF model showed better performance in predicting real spike trains - in the sense of a spike-coincidence measure - than the classical leaky integrate-and-fire (LIF) model. The simplicity of the energy-dependent model presented here makes it computationally efficient and, thus, suitable for studying the dynamics of large neural networks.
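One way to picture the kind of coupling this abstract describes is a toy LIF with an energy pool that spikes deplete and metabolism replenishes. Everything below - the scaling of the drive by available energy, the per-spike cost, all parameter values - is a hypothetical sketch of the idea, not the published EDLIF equations:

```python
def edlif_sketch(i_ext, t_total=2000.0, dt=0.1, tau_m=20.0, v_rest=-65.0,
                 v_th=-50.0, a_max=100.0, tau_a=200.0, spike_cost=1.0):
    """Toy energy-dependent LIF: `a` is an ATP-like energy pool that each
    spike depletes by `spike_cost` and that recovers toward `a_max`; the
    effective drive is scaled by the fraction of energy available.
    Hypothetical coupling and parameters (ms/mV units)."""
    v, a = v_rest, a_max
    n_spikes, total_cost = 0, 0.0
    for _ in range(int(t_total / dt)):
        v += dt * (-(v - v_rest) + (a / a_max) * i_ext) / tau_m
        a += dt * (a_max - a) / tau_a        # metabolic replenishment
        if v >= v_th:
            v = v_rest                       # spike and reset
            a = max(a - spike_cost, 0.0)     # pay the energetic cost
            n_spikes += 1
            total_cost += spike_cost
    return 1000.0 * n_spikes / t_total, total_cost   # rate (Hz), total cost

# Stronger drive -> higher firing rate -> higher cumulative metabolic cost.
rate_lo, cost_lo = edlif_sketch(20.0)
rate_hi, cost_hi = edlif_sketch(40.0)
```

Because the pool equilibrates where consumption matches replenishment, the steady-state energy level falls, and the cumulative cost rises, monotonically with the firing rate, giving a simple rate-vs-ATP-cost relationship of the kind the model family is built to capture.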
Affiliation(s)
- Ismael Jaras
- Department of Electrical Engineering, Faculty of Mathematical and Physical Sciences, University of Chile, Santiago, Chile
- Neurosystems Laboratory, Biomedical Neuroscience Institute, Faculty of Medicine, University of Chile, Santiago, Chile
- Taiki Harada
- Tokyo Medical and Dental University, Tokyo, Japan
- Marcos E Orchard
- Department of Electrical Engineering, Faculty of Mathematical and Physical Sciences, University of Chile, Santiago, Chile
- Pedro E Maldonado
- Neurosystems Laboratory, Biomedical Neuroscience Institute, Faculty of Medicine, University of Chile, Santiago, Chile
- Rodrigo C Vergara
- Kinesiology Department, Facultad de Artes y Educación Física, Universidad Metropolitana de Ciencias de la Educación, Santiago, Chile
12. Wybo WA, Jordan J, Ellenberger B, Marti Mengual U, Nevian T, Senn W. Data-driven reduction of dendritic morphologies with preserved dendro-somatic responses. eLife 2021; 10:e60936. [PMID: 33494860; PMCID: PMC7837682; DOI: 10.7554/eLife.60936]
Abstract
Dendrites shape information flow in neurons. Yet, there is little consensus on the level of spatial complexity at which they operate. Through carefully chosen parameter fits, solvable in the least-squares sense, we obtain accurate reduced compartmental models at any level of complexity. We show that (back-propagating) action potentials, Ca2+ spikes, and N-methyl-D-aspartate spikes can all be reproduced with few compartments. We also investigate whether afferent spatial connectivity motifs admit simplification by ablating targeted branches and grouping the affected synapses onto the next proximal dendrite. We find that voltage in the remaining branches is reproduced if temporal conductance fluctuations stay below a limit that depends on the average difference in input resistance between the ablated branches and the next proximal dendrite. Furthermore, our methodology fits reduced models directly from experimental data, without requiring morphological reconstructions. We provide software that automates the simplification, eliminating a common hurdle toward including dendritic computations in network models.
Affiliation(s)
- Willem Am Wybo
- Department of Physiology, University of Bern, Bern, Switzerland
- Jakob Jordan
- Department of Physiology, University of Bern, Bern, Switzerland
- Thomas Nevian
- Department of Physiology, University of Bern, Bern, Switzerland
- Walter Senn
- Department of Physiology, University of Bern, Bern, Switzerland
13. Baker C, Zhu V, Rosenbaum R. Nonlinear stimulus representations in neural circuits with approximate excitatory-inhibitory balance. PLoS Comput Biol 2020; 16:e1008192. [PMID: 32946433; PMCID: PMC7526938; DOI: 10.1371/journal.pcbi.1008192]
Abstract
Balanced excitation and inhibition is widely observed in cortex. How does this balance shape neural computations and stimulus representations? This question is often studied using computational models of neuronal networks in a dynamically balanced state. But balanced network models predict a linear relationship between stimuli and population responses. So how do cortical circuits implement nonlinear representations and computations? We show that every balanced network architecture admits stimuli that break the balanced state, and that these breaks in balance push the network into a "semi-balanced state" characterized by excess inhibition to some neurons, but an absence of excess excitation. The semi-balanced state produces nonlinear stimulus representations and nonlinear computations, is unavoidable in networks driven by multiple stimuli, is consistent with cortical recordings, and has a direct mathematical relationship to artificial neural networks.
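In a threshold-linear mean-field picture, a semi-balanced state can be written as a complementarity condition on population rates r >= 0: net input (W r + x)_i = 0 wherever r_i > 0, and (W r + x)_i <= 0 (excess inhibition, never excess excitation) wherever r_i = 0. The two-population example below, with hypothetical weights not taken from the paper, finds such a state by iterating rectified rate dynamics:

```python
import numpy as np

# Effective (inhibition-dominated) coupling and a stimulus chosen so that
# the exactly balanced solution W r = -x would require a negative rate.
W = np.array([[-1.0, -0.8],
              [-0.7, -1.2]])
x = np.array([1.0, -0.4])

r = np.zeros(2)
for _ in range(20000):                    # rectified fixed-point iteration
    r = np.maximum(r + 0.01 * (W @ r + x), 0.0)

net = W @ r + x   # population 1: balanced (~0); population 2: net inhibition
```

Here the balanced solution -W^{-1} x has a negative entry, so the network settles at r near (1, 0): population 1 is precisely balanced while the silenced population 2 receives strictly negative net input. This ReLU-like structure is what links the semi-balanced state to artificial neural networks.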
Affiliation(s)
- Cody Baker
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
- Vicky Zhu
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, IN, USA
14
Skaar JEW, Stasik AJ, Hagen E, Ness TV, Einevoll GT. Estimation of neural network model parameters from local field potentials (LFPs). PLoS Comput Biol 2020; 16:e1007725. [PMID: 32155141 PMCID: PMC7083334 DOI: 10.1371/journal.pcbi.1007725] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/12/2019] [Revised: 03/20/2020] [Accepted: 02/12/2020] [Indexed: 11/20/2022] Open
Abstract
Most modeling in systems neuroscience has been descriptive, where neural representations such as ‘receptive fields’ have been found by statistically correlating neural activity with sensory input. In the traditional physics approach to modeling, hypotheses are represented by mechanistic models based on the underlying building blocks of the system, and candidate models are validated by comparison with experiments. Until now, validation of mechanistic cortical network models has been based on comparison with neuronal spikes, extracted from the high-frequency part of extracellular electrical potentials. In this computational study we investigated to what extent the low-frequency part of the signal, the local field potential (LFP), can be used to validate and infer properties of mechanistic cortical network models. In particular, we asked whether the LFP can be used to accurately estimate synaptic connection weights in the underlying network. We considered the thoroughly analysed Brunel network, comprising an excitatory and an inhibitory population of recurrently connected leaky integrate-and-fire (LIF) neurons. This model exhibits a high diversity of spiking network dynamics depending on the values of only three network parameters. The LFP generated by the network was computed using a hybrid scheme in which spikes computed from the point-neuron network were replayed on biophysically detailed multicompartmental neurons. We assessed how accurately the three model parameters could be estimated from power spectra of stationary ‘background’ LFP signals by applying convolutional neural nets (CNNs). All network parameters could be estimated very accurately, suggesting that LFPs indeed can be used for network model validation. Most of what we have learned about brain networks in vivo has come from the measurement of spikes (action potentials) recorded by extracellular electrodes.
The low-frequency part of these signals, the local field potential (LFP), contains unique information about how dendrites in neuronal populations integrate synaptic inputs, but has so far played a lesser role. To investigate whether the LFP can be used to validate network models, we computed LFP signals for a recurrent network model (the Brunel network) for which the ground-truth parameters are known. By application of convolutional neural nets (CNNs) we found that the synaptic weights indeed could be accurately estimated from ‘background’ LFP signals, suggesting a future key role for LFP in development of network models.
Affiliation(s)
- Jan-Eirik W. Skaar
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Espen Hagen
- Department of Physics, University of Oslo, Oslo, Norway
- Torbjørn V. Ness
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Gaute T. Einevoll
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Department of Physics, University of Oslo, Oslo, Norway
15
Abu-Hassan K, Taylor JD, Morris PG, Donati E, Bortolotto ZA, Indiveri G, Paton JFR, Nogaret A. Optimal solid state neurons. Nat Commun 2019; 10:5309. [PMID: 31796727 PMCID: PMC6890780 DOI: 10.1038/s41467-019-13177-3] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/18/2019] [Accepted: 10/14/2019] [Indexed: 11/09/2022] Open
Abstract
Bioelectronic medicine is driving the need for neuromorphic microcircuits that integrate raw nervous stimuli and respond identically to biological neurons. However, designing such circuits remains a challenge. Here we estimate the parameters of highly nonlinear conductance models and derive the ab initio equations of intracellular currents and membrane voltages embodied in analog solid-state electronics. By configuring individual ion channels of solid-state neurons with parameters estimated from large-scale assimilation of electrophysiological recordings, we successfully transfer the complete dynamics of hippocampal and respiratory neurons in silico. The solid-state neurons are found to respond nearly identically to biological neurons under stimulation by a wide range of current injection protocols. The optimization of nonlinear models demonstrates a powerful method for programming analog electronic circuits. This approach offers a route for repairing diseased biocircuits and emulating their function with biomedical implants that can adapt to biofeedback.
Affiliation(s)
- Kamal Abu-Hassan
- Department of Physics, University of Bath, Claverton Down, Bath, BA2 7AY, UK
- Joseph D Taylor
- Department of Physics, University of Bath, Claverton Down, Bath, BA2 7AY, UK
- Paul G Morris
- Department of Physics, University of Bath, Claverton Down, Bath, BA2 7AY, UK.,School of Physiology, Pharmacology and Neuroscience, University of Bristol, Bristol, BS8 1TD, UK
- Elisa Donati
- Institute of Neuroinformatics, University of Zürich and ETH Zürich, Winterthurerstrasse 190, 8057, Zürich, Switzerland
- Zuner A Bortolotto
- School of Physiology, Pharmacology and Neuroscience, University of Bristol, Bristol, BS8 1TD, UK
- Giacomo Indiveri
- Institute of Neuroinformatics, University of Zürich and ETH Zürich, Winterthurerstrasse 190, 8057, Zürich, Switzerland
- Julian F R Paton
- School of Physiology, Pharmacology and Neuroscience, University of Bristol, Bristol, BS8 1TD, UK.,Department of Physiology, Faculty of Medical and Health Sciences, University of Auckland, Grafton, Auckland, New Zealand
- Alain Nogaret
- Department of Physics, University of Bath, Claverton Down, Bath, BA2 7AY, UK.
16
Barta T, Kostal L. The effect of inhibition on rate code efficiency indicators. PLoS Comput Biol 2019; 15:e1007545. [PMID: 31790384 PMCID: PMC6907877 DOI: 10.1371/journal.pcbi.1007545] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2019] [Revised: 12/12/2019] [Accepted: 11/12/2019] [Indexed: 11/30/2022] Open
Abstract
In this paper we investigate the rate coding capabilities of neurons whose input signals are alterations of a base state of balanced inhibitory and excitatory synaptic currents. We consider different regimes of the excitation-inhibition relationship and an established conductance-based leaky integrator model with adaptive threshold, with parameter sets recreating biologically relevant spiking regimes. We find that, for a given mean post-synaptic firing rate, an increased ratio of inhibition to excitation counter-intuitively leads to a higher signal-to-noise ratio (SNR). On the other hand, the inhibitory input significantly reduces the dynamic coding range of the neuron. We quantify the joint effect of SNR and dynamic coding range by computing the metabolic efficiency: the maximal amount of information per ATP molecule expended (in bits/ATP). Moreover, by calculating the metabolic efficiency we are able to predict the shapes of the post-synaptic firing rate histograms, which may be tested on experimental data. Likewise, optimal stimulus input distributions are predicted; however, we show that the optimum can essentially be reached with a broad range of input distributions. Finally, we examine which parameters of the neuronal model are the most important for metabolically efficient information transfer.
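As a toy version of the bits-per-ATP calculation above, one can take a handful of stimuli that drive Poisson spike counts, compute the mutual information between stimulus and count, and divide by the mean energetic cost. All numbers below (rates, ATP per spike, stimulus distribution) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def poisson_pmf(ks, lam):
    """Poisson probabilities for an array of counts ks (log-space for stability)."""
    log_fact = np.concatenate(([0.0], np.cumsum(np.log(np.arange(1, ks[-1] + 1)))))
    return np.exp(ks * np.log(lam) - lam - log_fact[ks])

rates = np.array([2.0, 5.0, 10.0, 20.0])     # mean spike counts per coding window
p_x = np.full(rates.size, 1.0 / rates.size)  # uniform stimulus distribution
ks = np.arange(80)                           # truncated count support
p_y_given_x = np.array([poisson_pmf(ks, r) for r in rates])
p_y = p_x @ p_y_given_x                      # marginal count distribution

# Mutual information I(X;Y) in bits, then normalize by the mean ATP cost
I_bits = np.sum(p_x[:, None] * p_y_given_x * np.log2(p_y_given_x / p_y))
atp_per_spike = 2.4e9                        # illustrative order of magnitude
bits_per_atp = I_bits / (atp_per_spike * (p_x @ rates))
```

Sweeping the input distribution (here fixed as uniform) and maximizing bits_per_atp is the kind of optimization the paper's efficiency indicator formalizes.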
Affiliation(s)
- Tomas Barta
- Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic
- Charles University, First Medical Faculty, Prague, Czech Republic
- Institute of Ecology and Environmental Sciences, INRA, Versailles, France
- Lubomir Kostal
- Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic
17
Inferring and validating mechanistic models of neural microcircuits based on spike-train data. Nat Commun 2019; 10:4933. [PMID: 31666513 PMCID: PMC6821748 DOI: 10.1038/s41467-019-12572-0] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/25/2018] [Accepted: 09/18/2019] [Indexed: 01/11/2023] Open
Abstract
The interpretation of neuronal spike train recordings often relies on abstract statistical models that allow for principled parameter estimation and model selection but provide only limited insights into underlying microcircuits. In contrast, mechanistic models are useful to interpret microcircuit dynamics, but are rarely quantitatively matched to experimental data due to methodological challenges. Here we present analytical methods to efficiently fit spiking circuit models to single-trial spike trains. Using derived likelihood functions, we statistically infer the mean and variance of hidden inputs, neuronal adaptation properties and connectivity for coupled integrate-and-fire neurons. Comprehensive evaluations on synthetic data, validations using ground truth in-vitro and in-vivo recordings, and comparisons with existing techniques demonstrate that parameter estimation is very accurate and efficient, even for highly subsampled networks. Our methods bridge statistical, data-driven and theoretical, model-based neurosciences at the level of spiking circuits, for the purpose of a quantitative, mechanistic interpretation of recorded neuronal population activity. It is difficult to fit mechanistic, biophysically constrained circuit models to spike train data from in vivo extracellular recordings. Here the authors present analytical methods that enable efficient parameter estimation for integrate-and-fire circuit models and inference of the underlying connectivity structure in subsampled networks.
18
He Z, Li C, Chen L, Cao Z. Dynamic behaviors of the FitzHugh-Nagumo neuron model with state-dependent impulsive effects. Neural Netw 2019; 121:497-511. [PMID: 31655446 DOI: 10.1016/j.neunet.2019.09.031] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2019] [Revised: 09/14/2019] [Accepted: 09/22/2019] [Indexed: 11/19/2022]
Abstract
In the present work, in order to reproduce the spiking and bursting behavior of real neurons, a new hybrid biological neuron model is established and analyzed by combining the FitzHugh-Nagumo (FHN) neuron model, a threshold for spike initiation, and state-dependent impulsive effects (an impulse resetting process). Firstly, we construct Poincaré mappings under different conditions by means of geometric analysis, and then obtain sufficient criteria for the existence and stability of order-1 or order-2 periodic solutions of the impulsive neuron model by finding the fixed point of the Poincaré mapping and using geometric analysis techniques. Numerical simulations are given to illustrate and verify our theoretical results. Bifurcation diagrams are presented to describe the period-doubling route to chaos, which implies that the dynamic behavior of the neuron model becomes more complex due to the impulsive effects. Furthermore, the correctness and effectiveness of the proposed FitzHugh-Nagumo neuron model with state-dependent impulsive effects are verified by circuit simulation. Finally, the conclusions of the paper are summarized, and the effects of random factors on the electrophysiological activity of the neuron are discussed by numerical simulation.
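A minimal sketch of such a hybrid system (the parameters and reset rule below are illustrative choices, not the paper's): integrate the FitzHugh-Nagumo equations and apply an impulsive reset whenever the voltage variable reaches a threshold.

```python
import numpy as np

def fhn_impulsive(I=0.5, v_th=1.0, v_reset=-0.5, w_kick=0.1, dt=0.01, T=200.0):
    """FitzHugh-Nagumo with a state-dependent impulsive reset at v = v_th."""
    a, b, eps = 0.7, 0.8, 0.08
    v, w = -1.0, -0.5
    spikes = []
    for k in range(int(T / dt)):
        dv = v - v**3 / 3.0 - w + I      # fast voltage variable
        dw = eps * (v + a - b * w)       # slow recovery variable
        v += dt * dv
        w += dt * dw
        if v >= v_th:                    # impulse: reset v, kick recovery
            spikes.append(k * dt)
            v = v_reset
            w += w_kick
    return np.array(spikes)

spk = fhn_impulsive()                    # repetitive firing under constant drive
```

The reset map composed with the smooth flow is exactly the kind of Poincaré mapping whose fixed points the paper analyzes.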
Affiliation(s)
- Zhilong He
- Chongqing Key Laboratory of Nonlinear Circuits and Intelligent Information Processing, College of Electronic and Information Engineering, Southwest University, Chongqing 400715, PR China; School of Finance, Xinjiang University of Finance and Economics, Urumqi, 830012, PR China.
- Chuandong Li
- Chongqing Key Laboratory of Nonlinear Circuits and Intelligent Information Processing, College of Electronic and Information Engineering, Southwest University, Chongqing 400715, PR China.
- Ling Chen
- Chongqing Key Laboratory of Nonlinear Circuits and Intelligent Information Processing, College of Electronic and Information Engineering, Southwest University, Chongqing 400715, PR China.
- Zhengran Cao
- Chongqing Key Laboratory of Nonlinear Circuits and Intelligent Information Processing, College of Electronic and Information Engineering, Southwest University, Chongqing 400715, PR China.
19
Valadez-Godínez S, Sossa H, Santiago-Montero R. On the accuracy and computational cost of spiking neuron implementation. Neural Netw 2019; 122:196-217. [PMID: 31689679 DOI: 10.1016/j.neunet.2019.09.026] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/16/2019] [Revised: 09/12/2019] [Accepted: 09/17/2019] [Indexed: 10/25/2022]
Abstract
Since more than a decade ago, three statements about spiking neuron (SN) implementations have been widely accepted: 1) Hodgkin and Huxley (HH) model is computationally prohibitive, 2) Izhikevich (IZH) artificial neuron is as efficient as Leaky Integrate-and-Fire (LIF) model, and 3) IZH model is more efficient than HH model (Izhikevich, 2004). As suggested by Hodgkin and Huxley (1952), their model operates in two modes: by using the α's and β's rate functions directly (HH model) and by storing them into tables (HHT model) for computational cost reduction. Recently, it has been stated that: 1) HHT model (HH using tables) is not prohibitive, 2) IZH model is not efficient, and 3) both HHT and IZH models are comparable in computational cost (Skocik & Long, 2014). That controversy shows that there is no consensus concerning SN simulation capacities. Hence, in this work, we introduce a refined approach, based on the multiobjective optimization theory, describing the SN simulation capacities and ultimately choosing optimal simulation parameters. We have used normalized metrics to define the capacity levels of accuracy, computational cost, and efficiency. Normalized metrics allowed comparisons between SNs at the same level or scale. We conducted tests for balanced, lower, and upper boundary conditions under a regular spiking mode with constant and random current stimuli. We found optimal simulation parameters leading to a balance between computational cost and accuracy. Importantly, and, in general, we found that 1) HH model (without using tables) is the most accurate, computationally inexpensive, and efficient, 2) IZH model is the most expensive and inefficient, 3) both LIF and HHT models are the most inaccurate, 4) HHT model is more expensive and inaccurate than HH model due to α's and β's table discretization, and 5) HHT model is not comparable in computational cost to IZH model. 
These results refute the theory formulated over a decade ago (Izhikevich, 2004) and go deeper into the statements formulated by Skocik and Long (2014). Our statements imply that the number of dimensions or FLOPS in the SNs are theoretical but not practical indicators of the true computational cost. The metric we propose for the computational cost is more precise than FLOPS and was found to be invariant to computer architecture. Moreover, we found that the firing frequency used in previous works is a necessary but insufficient metric to evaluate the simulation accuracy. We also show that our results are consistent with the theory of numerical methods and the theory of SN discontinuity. Discontinuous SNs, such as the LIF and IZH models, introduce a considerable error every time a spike is generated. In addition, compared to a constant input current, a random input current increases the computational cost and inaccuracy. Besides, we found that the search for optimal simulation parameters is problem-specific. That is important because most previous works have sought a general and unique optimal simulation. Here, we show that such a solution may not exist because this is a multiobjective optimization problem that depends on several factors. This work sets up a renewed thesis concerning SN simulation that is useful to several related research areas, including the emergent field of Deep Spiking Neural Networks.
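For reference, the two cheapest update rules compared above can be written side by side; the parameter values below are common textbook choices (including Izhikevich's regular-spiking set), not the benchmark configurations used in the paper.

```python
def lif_spikes(I=2.0, dt=0.1, T=500.0):
    """Leaky integrate-and-fire: one linear update plus a threshold test."""
    tau, v_rest, v_th, v_reset, R = 20.0, -65.0, -50.0, -65.0, 10.0
    v, n = v_rest, 0
    for _ in range(int(T / dt)):
        v += dt / tau * (-(v - v_rest) + R * I)
        if v >= v_th:                            # discontinuous reset
            v, n = v_reset, n + 1
    return n

def izh_spikes(I=10.0, dt=0.25, T=500.0):
    """Izhikevich model: quadratic voltage update plus a recovery variable."""
    a, b, c, d = 0.02, 0.2, -65.0, 8.0           # regular-spiking parameters
    v, u, n = -65.0, -13.0, 0
    for _ in range(int(T / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                            # discontinuous reset
            v, u, n = c, u + d, n + 1
    return n
```

Both models share the discontinuous reset that the study identifies as a per-spike error source, which is why FLOPS counts alone understate their true simulation cost.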
Affiliation(s)
- Sergio Valadez-Godínez
- Laboratorio de Robótica y Mecatrónica, Centro de Investigación en Computación, Instituto Politécnico Nacional, Av. Juan de Dios Bátiz, S/N, Col. Nva. Industrial Vallejo, Ciudad de México, México, 07738, Mexico; División de Ingeniería Informática, Instituto Tecnológico Superior de Purísima del Rincón, Gto., México, 36413, Mexico; División de Ingenierías de Educación Superior, Universidad Virtual del Estado de Guanajuato, Gto., México, 36400, Mexico.
- Humberto Sossa
- Laboratorio de Robótica y Mecatrónica, Centro de Investigación en Computación, Instituto Politécnico Nacional, Av. Juan de Dios Bátiz, S/N, Col. Nva. Industrial Vallejo, Ciudad de México, México, 07738, Mexico; Tecnológico de Monterrey, Campus Guadalajara, Av. Gral. Ramón Corona 2514, Zapopan, Jal., México, 45138, Mexico.
- Raúl Santiago-Montero
- División de Estudios de Posgrado e Investigación, Instituto Tecnológico de León, Av. Tecnológico S/N, León, Gto., México, 37290, Mexico.
20
Mäki-Marttunen T, Devor A, Phillips WA, Dale AM, Andreassen OA, Einevoll GT. Computational Modeling of Genetic Contributions to Excitability and Neural Coding in Layer V Pyramidal Cells: Applications to Schizophrenia Pathology. Front Comput Neurosci 2019; 13:66. [PMID: 31616272 PMCID: PMC6775251 DOI: 10.3389/fncom.2019.00066] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/15/2019] [Accepted: 09/09/2019] [Indexed: 11/13/2022] Open
Abstract
Pyramidal cells in layer V of the neocortex are one of the most widely studied neuron types in the mammalian brain. Due to their role as integrators of feedforward and cortical feedback inputs, they are well-positioned to contribute to the symptoms and pathology in mental disorders-such as schizophrenia-that are characterized by a mismatch between the internal perception and external inputs. In this modeling study, we analyze the input/output properties of layer V pyramidal cells and their sensitivity to modeled genetic variants in schizophrenia-associated genes. We show that the excitability of layer V pyramidal cells and the way they integrate inputs in space and time are altered by many types of variants in ion-channel and Ca2+ transporter-encoding genes that have been identified as risk genes by recent genome-wide association studies. We also show that the variability in the output patterns of spiking and Ca2+ transients in layer V pyramidal cells is altered by these model variants. Importantly, we show that many of the predicted effects are robust to noise and qualitatively similar across different computational models of layer V pyramidal cells. Our modeling framework reveals several aspects of single-neuron excitability that can be linked to known schizophrenia-related phenotypes and existing hypotheses on disease mechanisms. In particular, our models predict that single-cell steady-state firing rate is positively correlated with the coding capacity of the neuron and negatively correlated with the amplitude of a prepulse-mediated adaptation and sensitivity to coincidence of stimuli in the apical dendrite and the perisomatic region of a layer V pyramidal cell. These results help to uncover the voltage-gated ion-channel and Ca2+ transporter-associated genetic underpinnings of schizophrenia phenotypes and biomarkers.
Affiliation(s)
- Anna Devor
- Department of Neurosciences, University of California San Diego, La Jolla, CA, United States.,Department of Radiology, University of California San Diego, La Jolla, CA, United States.,Martinos Center for Biomedical Imaging, Harvard Medical School, Massachusetts General Hospital, Charlestown, MA, United States
- William A Phillips
- Psychology, Faculty of Natural Sciences, University of Stirling, Stirling, United Kingdom
- Anders M Dale
- Department of Neurosciences, University of California San Diego, La Jolla, CA, United States.,Department of Radiology, University of California San Diego, La Jolla, CA, United States
- Ole A Andreassen
- NORMENT, Division of Mental Health and Addiction, Oslo University Hospital and Institute of Clinical Medicine, University of Oslo, Oslo, Norway
- Gaute T Einevoll
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway.,Department of Physics, University of Oslo, Oslo, Norway
21
Alternative classifications of neurons based on physiological properties and synaptic responses, a computational study. Sci Rep 2019; 9:13096. [PMID: 31511545 PMCID: PMC6739481 DOI: 10.1038/s41598-019-49197-8] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/01/2019] [Accepted: 08/16/2019] [Indexed: 01/25/2023] Open
Abstract
One of the central goals of today's neuroscience is to achieve the most accurate classification conceivable of neuron types in the mammalian brain. As part of this research effort, electrophysiologists commonly utilize current clamp techniques to gain a detailed characterization of the neurons' physiological properties. While this approach has been useful, it is not well understood whether neurons that share the physiological properties of a particular phenotype would also operate consistently under the action of natural synaptic inputs. We approached this problem by simulating a biophysically diverse population of model neurons based on 3 generic phenotypes. We exposed the model neurons to two types of stimulation to investigate their voltage responses under conventional current step protocols and under simulated synaptic bombardment. We extracted standard physiological parameters from the voltage responses elicited by current step stimulation, and spike arrival times descriptive of the model's firing behavior under synaptic inputs. The biophysical phenotypes could be reliably identified using classification based on the 'static' physiological properties, but not the interspike interval-based parameters. However, the model neurons associated with the biophysically different phenotypes retained cell type-specific features in the fine structure of their spike responses that allowed their accurate classification.
22
Abstract
Qualitative psychological principles are commonly utilized to influence the choices that people make. Can this goal be achieved more efficiently by using quantitative models of choice? Here, we launch an academic competition to compare the effectiveness of these two approaches.
23
Einevoll GT, Destexhe A, Diesmann M, Grün S, Jirsa V, de Kamps M, Migliore M, Ness TV, Plesser HE, Schürmann F. The Scientific Case for Brain Simulations. Neuron 2019; 102:735-744. [DOI: 10.1016/j.neuron.2019.03.027] [Citation(s) in RCA: 50] [Impact Index Per Article: 10.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/06/2018] [Revised: 02/06/2019] [Accepted: 03/18/2019] [Indexed: 01/30/2023]
24
Ocker GK, Doiron B. Training and Spontaneous Reinforcement of Neuronal Assemblies by Spike Timing Plasticity. Cereb Cortex 2019; 29:937-951. [PMID: 29415191 PMCID: PMC7963120 DOI: 10.1093/cercor/bhy001] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/07/2016] [Revised: 01/01/2018] [Accepted: 01/05/2018] [Indexed: 12/15/2022] Open
Abstract
The synaptic connectivity of cortex is plastic, with experience shaping the ongoing interactions between neurons. Theoretical studies of spike timing-dependent plasticity (STDP) have focused on either just pairs of neurons or large-scale simulations. A simple analytic account for how fast spike time correlations affect both microscopic and macroscopic network structure is lacking. We develop a low-dimensional mean field theory for STDP in recurrent networks and show the emergence of assemblies of strongly coupled neurons with shared stimulus preferences. After training, this connectivity is actively reinforced by spike train correlations during the spontaneous dynamics. Furthermore, the stimulus coding by cell assemblies is actively maintained by these internally generated spiking correlations, suggesting a new role for noise correlations in neural coding. Assembly formation has often been associated with firing rate-based plasticity schemes; our theory provides an alternative and complementary framework, where fine temporal correlations and STDP form and actively maintain learned structure in cortical networks.
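The pair-based STDP rule underlying such theories can be sketched directly (time constants and amplitudes below are generic illustrative values): summing the exponential STDP window over all spike pairs potentiates a synapse whose presynaptic spikes tend to precede the postsynaptic ones, and depresses the reverse ordering.

```python
import numpy as np

def stdp_dw(pre, post, A_plus=0.01, A_minus=0.012, tau=20.0):
    """Total weight change from all pre/post spike pairs (times in ms)."""
    dw = 0.0
    for t_pre in pre:
        for t_post in post:
            lag = t_post - t_pre
            if lag > 0:                          # pre before post -> LTP
                dw += A_plus * np.exp(-lag / tau)
            elif lag < 0:                        # post before pre -> LTD
                dw -= A_minus * np.exp(lag / tau)
    return dw

pre = np.arange(0.0, 500.0, 50.0)
dw_ltp = stdp_dw(pre, pre + 5.0)   # post lags pre by 5 ms -> net potentiation
dw_ltd = stdp_dw(pre, pre - 5.0)   # post leads pre by 5 ms -> net depression
```

It is exactly this sensitivity to fine spike-time correlations, averaged over the network, that the mean-field theory tracks when assemblies form and are maintained.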
Affiliation(s)
- Gabriel Koch Ocker
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, PA, USA
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Allen Institute for Brain Science, Seattle, WA, USA
- Brent Doiron
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA, USA
25
Interneuronal gamma oscillations in hippocampus via adaptive exponential integrate-and-fire neurons. Neurocomputing 2019. [DOI: 10.1016/j.neucom.2018.11.017] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
26
An Efficient Population Density Method for Modeling Neural Networks with Synaptic Dynamics Manifesting Finite Relaxation Time and Short-Term Plasticity. eNeuro 2019; 5:eN-MNT-0002-18. [PMID: 30662939 PMCID: PMC6336402 DOI: 10.1523/eneuro.0002-18.2018] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/01/2018] [Revised: 10/24/2018] [Accepted: 11/21/2018] [Indexed: 12/05/2022] Open
Abstract
When incorporating more realistic synaptic dynamics, the computational efficiency of population density methods (PDMs) declines sharply due to the increase in the dimension of the master equations. To avoid such a decline, we develop an efficient PDM, termed the colored-synapse PDM (csPDM), in which the dimension of the master equations does not depend on the number of synapse-associated state variables in the underlying network model. Our goal is to allow the PDM to incorporate realistic synaptic dynamics that possesses not only a finite relaxation time but also short-term plasticity (STP). The model equations of the csPDM are derived based on the diffusion approximation of synaptic dynamics and probability density function methods for Langevin equations with colored noise. Numerical examples, given by simulations of the population dynamics of uncoupled exponential integrate-and-fire (EIF) neurons, show good agreement between the results of the csPDM and Monte Carlo simulations (MCSs). Compared to the original full-dimensional PDM (fdPDM), the csPDM achieves far greater computational efficiency because of the lower dimension of the master equations. In addition, it permits the network dynamics to possess the short-term plastic characteristics inherited from plastic synapses. The novel csPDM is potentially applicable to any spiking neuron model because it makes no assumptions about the neuronal dynamics; more importantly, this is the first report of a PDM that successfully encompasses short-term facilitation/depression properties.
27
Clarke SE. Analog Signaling With the "Digital" Molecular Switch CaMKII. Front Comput Neurosci 2018; 12:92. [PMID: 30524260 PMCID: PMC6262075 DOI: 10.3389/fncom.2018.00092] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/18/2018] [Accepted: 10/31/2018] [Indexed: 11/13/2022] Open
Abstract
Molecular switches, such as the protein kinase CaMKII, play a fundamental role in cell signaling by decoding inputs into either high or low states of activity; because the high activation state can be turned on and persist after the input ceases, these switches have earned a reputation as "digital." Although this on/off, binary perspective has been valuable for understanding long timescale synaptic plasticity, accumulating experimental evidence suggests that the CaMKII switch can also control plasticity on short timescales. To investigate this idea further, a non-autonomous, nonlinear ordinary differential equation, representative of a general bistable molecular switch, is analyzed. The results suggest that switch activity in regions surrounding either the high- or low-stable states of activation could act as a reliable analog signal, whose short timescale fluctuations relative to equilibrium track instantaneous input frequency. The model makes intriguing predictions and is validated against previous work demonstrating its suitability as a minimal representation of switch dynamics; in combination with existing experimental evidence, the theory suggests a multiplexed encoding of instantaneous frequency information over short timescales, with integration of total activity over longer timescales.
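The qualitative behavior described above can be reproduced with a generic bistable switch, not the paper's specific model: a single ODE with autocatalytic (Hill-type) activation and linear decay has coexisting low and high stable states, so the steady state reached depends on the initial activity.

```python
# Generic bistable switch (illustrative, not the paper's equations):
#   da/dt = s + c * a^2 / (K^2 + a^2) - d * a
# Weak basal drive s plus sigmoidal positive feedback against linear decay
# yields two stable fixed points separated by an unstable threshold.
def run(a0, s, dt=0.01, T=200.0, c=1.0, K=0.5, d=1.0):
    a = a0
    for _ in range(int(T / dt)):            # forward-Euler relaxation
        a += dt * (s + c * a * a / (K * K + a * a) - d * a)
    return a

low = run(a0=0.05, s=0.02)    # settles near the low ("off") state
high = run(a0=0.9, s=0.02)    # settles near the high ("on") state
```

Fluctuations of a around either fixed point, rather than the on/off identity of the fixed point itself, are what carry the analog, short-timescale signal the paper proposes.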
Affiliation(s)
- Stephen E Clarke
- Department of Bioengineering, Department of Neurosurgery, Stanford University, Stanford, CA, United States
28
Senk J, Carde C, Hagen E, Kuhlen TW, Diesmann M, Weyers B. VIOLA-A Multi-Purpose and Web-Based Visualization Tool for Neuronal-Network Simulation Output. Front Neuroinform 2018; 12:75. [PMID: 30467469 PMCID: PMC6236002 DOI: 10.3389/fninf.2018.00075] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/27/2018] [Accepted: 10/10/2018] [Indexed: 11/13/2022] Open
Abstract
Neuronal network models and corresponding computer simulations are invaluable tools to aid the interpretation of the relationship between neuron properties, connectivity, and measured activity in cortical tissue. Spatiotemporal patterns of activity propagating across the cortical surface as observed experimentally can, for example, be described by neuronal network models with layered geometry and distance-dependent connectivity. In order to cover the surface area captured by today's experimental techniques and to achieve sufficient self-consistency, such models contain millions of nerve cells. The interpretation of the resulting stream of multi-modal and multi-dimensional simulation data calls for integrating interactive visualization steps into existing simulation-analysis workflows. Here, we present a set of interactive visualization concepts called views for the visual analysis of activity data in topological network models, and a corresponding reference implementation VIOLA (VIsualization Of Layer Activity). The software is a lightweight, open-source, web-based, and platform-independent application combining and adapting modern interactive visualization paradigms, such as coordinated multiple views, for massively parallel neurophysiological data. For a use-case demonstration we consider spiking activity data of a two-population, layered point-neuron network model incorporating distance-dependent connectivity subject to a spatially confined excitation originating from an external population. With the multiple coordinated views, an explorative and qualitative assessment of the spatiotemporal features of neuronal activity can be performed ahead of a detailed quantitative analysis of specific aspects of the data. Interactive multi-view analysis therefore assists existing data analysis workflows.
Furthermore, ongoing efforts including the European Human Brain Project aim at providing online user portals for integrated model development, simulation, analysis, and provenance tracking, wherein interactive visual analysis tools are one component. Browser-compatible, web-technology based solutions are therefore required. Within this scope, with VIOLA we provide a first prototype.
Affiliation(s)
- Johanna Senk
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Corto Carde
- Visual Computing Institute, RWTH Aachen University, Aachen, Germany
- JARA - High-Performance Computing, Aachen, Germany
- IMT Atlantique Bretagne-Pays de la Loire, Brest, France
- Espen Hagen
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, University of Oslo, Oslo, Norway
- Torsten W. Kuhlen
- Visual Computing Institute, RWTH Aachen University, Aachen, Germany
- JARA - High-Performance Computing, Aachen, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Benjamin Weyers
- Visual Computing Institute, RWTH Aachen University, Aachen, Germany
- JARA - High-Performance Computing, Aachen, Germany
29
Heiberg T, Kriener B, Tetzlaff T, Einevoll GT, Plesser HE. Firing-rate models for neurons with a broad repertoire of spiking behaviors. J Comput Neurosci 2018; 45:103-132. [PMID: 30146661 PMCID: PMC6208914 DOI: 10.1007/s10827-018-0693-9] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/09/2018] [Revised: 08/01/2018] [Accepted: 08/02/2018] [Indexed: 11/29/2022]
Abstract
Capturing the response behavior of spiking neuron models with rate-based models facilitates the investigation of neuronal networks using powerful methods for rate-based network dynamics. To this end, we investigate the responses of two widely used neuron model types, the Izhikevich and augmented multi-adaptive threshold (AMAT) models, to a range of inputs, from current steps to natural spike data. We find (i) that linear-nonlinear firing rate models fitted to test data can be used to describe the firing-rate responses of AMAT and Izhikevich spiking neuron models in many cases; (ii) that firing-rate responses are generally too complex to be captured by first-order low-pass filters but require bandpass filters instead; (iii) that linear-nonlinear models capture the response of AMAT models better than that of Izhikevich models; (iv) that the wide range of response types evoked by current-injection experiments collapses to a few response types when neurons are driven by stationary or sinusoidally modulated Poisson input; and (v) that AMAT and Izhikevich models show different responses to spike input despite identical responses to current injections. Together, these findings suggest that rate-based models of network dynamics may capture a wider range of neuronal response properties by incorporating second-order bandpass filters fitted to responses of spiking model neurons. These models may contribute to bringing rate-based network modeling closer to the reality of biological neuronal networks.
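Finding (ii) can be illustrated with a minimal linear-nonlinear sketch; the difference-of-exponentials bandpass kernel and every parameter value below are assumptions for illustration, not fits from the study.

```python
import numpy as np

def ln_rate(stimulus, dt=1e-3, tau_fast=0.01, tau_slow=0.05, gain=1.0):
    """Linear-nonlinear rate model: a second-order bandpass filter built as a
    difference of two exponentials, followed by a rectifying nonlinearity.
    All parameter values are illustrative."""
    t = np.arange(0, 0.3, dt)
    kernel = np.exp(-t / tau_fast) / tau_fast - np.exp(-t / tau_slow) / tau_slow
    drive = np.convolve(stimulus, kernel)[: len(stimulus)] * dt  # causal filtering
    return gain * np.maximum(drive, 0.0)  # static nonlinearity (rectification)

# Step input: a bandpass filter produces a transient overshoot that decays
# back toward zero, unlike a first-order low-pass filter, which would relax
# monotonically to a sustained plateau.
stim = np.concatenate([np.zeros(200), np.ones(800)])
rate = ln_rate(stim)
```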
Affiliation(s)
- Thomas Heiberg
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Birgit Kriener
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Institute of Basic Medical Sciences, University of Oslo, Oslo, Norway
- Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6), Jülich Research Centre, Jülich, Germany
- Institute for Advanced Simulation (IAS-6), Jülich Research Centre, Jülich, Germany
- JARA Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Gaute T Einevoll
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Department of Physics, University of Oslo, Oslo, Norway
- Hans E Plesser
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Institute of Neuroscience and Medicine (INM-6), Jülich Research Centre, Jülich, Germany
30
Imbalanced amplification: A mechanism of amplification and suppression from local imbalance of excitation and inhibition in cortical circuits. PLoS Comput Biol 2018; 14:e1006048. [PMID: 29543827 PMCID: PMC5871018 DOI: 10.1371/journal.pcbi.1006048] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/10/2017] [Revised: 03/27/2018] [Accepted: 02/22/2018] [Indexed: 01/02/2023] Open
Abstract
Understanding the relationship between external stimuli and the spiking activity of cortical populations is a central problem in neuroscience. Dense recurrent connectivity in local cortical circuits can lead to counterintuitive response properties, raising the question of whether there are simple arithmetical rules for relating circuits’ connectivity structure to their response properties. One such arithmetic is provided by the mean field theory of balanced networks, which is derived in a limit where excitatory and inhibitory synaptic currents precisely balance on average. However, balanced network theory is not applicable to some biologically relevant connectivity structures. We show that cortical circuits with such structure are susceptible to an amplification mechanism arising when excitatory-inhibitory balance is broken at the level of local subpopulations, but maintained at a global level. This amplification, which can be quantified by a linear correction to the classical mean field theory of balanced networks, explains several response properties observed in cortical recordings and provides fundamental insights into the relationship between connectivity structure and neural responses in cortical circuits. Understanding how the brain represents and processes stimuli requires a quantitative understanding of how signals propagate through networks of neurons. Developing such an understanding is made difficult by the dense interconnectivity of neurons, especially in the cerebral cortex. One approach to quantifying neural processing in the cortex is derived from observations that excitatory (positive) and inhibitory (negative) interactions between neurons tend to balance each other in many brain areas. This balance is achieved under a class of computational models called “balanced networks.” However, previous approaches to the mathematical analysis of balanced network models are not applicable under some biologically relevant connectivity structures.
We show that, under these structures, balance between excitation and inhibition is necessarily broken and the resulting imbalance causes some stimulus features to be amplified. This “imbalanced amplification” of stimuli can explain several observations from recordings in mouse somatosensory and visual cortical circuits and provides fundamental insights into the relationship between connectivity structure and neural responses in cortical circuits.
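The classical mean-field "arithmetic" referred to above can be stated in a few lines; the connectivity weights and external inputs below are illustrative assumptions, and the paper's contribution concerns precisely the linear correction needed when the analogous subpopulation-level system has no such balanced solution.

```python
import numpy as np

# Classical balanced-network mean field: in the large-network limit the mean
# excitatory (E) and inhibitory (I) rates r satisfy W r + x = 0, so that
# strong recurrent input cancels strong external input. The weights and
# inputs below are illustrative values.
W = np.array([[1.0, -2.0],   # weights onto E: from E, from I
              [1.5, -2.5]])  # weights onto I: from E, from I
x = np.array([1.0, 0.8])     # external input to E and I populations

r = -np.linalg.solve(W, x)   # balanced rates: r = -W^{-1} x
# When the analogous subpopulation-level system is singular, no such solution
# exists: balance breaks locally and imbalanced amplification results.
```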
31
Hagen E, Dahmen D, Stavrinou ML, Lindén H, Tetzlaff T, van Albada SJ, Grün S, Diesmann M, Einevoll GT. Hybrid Scheme for Modeling Local Field Potentials from Point-Neuron Networks. Cereb Cortex 2016; 26:4461-4496. [PMID: 27797828 PMCID: PMC6193674 DOI: 10.1093/cercor/bhw237] [Citation(s) in RCA: 67] [Impact Index Per Article: 8.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/13/2016] [Revised: 05/31/2016] [Accepted: 07/12/2016] [Indexed: 12/21/2022] Open
Abstract
With rapidly advancing multi-electrode recording technology, the local field potential (LFP) has again become a popular measure of neuronal activity in both research and clinical applications. Proper understanding of the LFP requires detailed mathematical modeling incorporating the anatomical and electrophysiological features of neurons near the recording electrode, as well as synaptic inputs from the entire network. Here we propose a hybrid modeling scheme combining efficient point-neuron network models with biophysical principles underlying LFP generation by real neurons. The LFP predictions rely on populations of network-equivalent multicompartment neuron models with layer-specific synaptic connectivity; the scheme can be used with an arbitrary number of point-neuron network populations and allows for a full separation of simulated network dynamics and LFPs. We apply the scheme to a full-scale cortical network model for a ∼1 mm2 patch of primary visual cortex, predict laminar LFPs for different network states, assess the relative LFP contribution from different laminar populations, and investigate effects of input correlations and neuron density on the LFP. The generic nature of the hybrid scheme and its public implementation in hybridLFPy form the basis for LFP predictions from other and larger point-neuron network models, as well as extensions of the current application with additional biological detail.
Affiliation(s)
- Espen Hagen
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, 52425 Jülich, Germany
- Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences, 1430 Ås, Norway
- David Dahmen
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, 52425 Jülich, Germany
- Maria L Stavrinou
- Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences, 1430 Ås, Norway
- Department of Psychology, University of Oslo, 0373 Oslo, Norway
- Henrik Lindén
- Department of Neuroscience and Pharmacology, University of Copenhagen, 2200 Copenhagen, Denmark
- Department of Computational Biology, School of Computer Science and Communication, Royal Institute of Technology, 100 44 Stockholm, Sweden
- Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, 52425 Jülich, Germany
- Sacha J van Albada
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, 52425 Jülich, Germany
- Sonja Grün
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, 52425 Jülich, Germany
- Theoretical Systems Neurobiology, RWTH Aachen University, 52056 Aachen, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, 52425 Jülich, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, 52074 Aachen, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, 52062 Aachen, Germany
- Gaute T Einevoll
- Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences, 1430 Ås, Norway
- Department of Physics, University of Oslo, 0316 Oslo, Norway
32
Nogaret A, Meliza CD, Margoliash D, Abarbanel HDI. Automatic Construction of Predictive Neuron Models through Large Scale Assimilation of Electrophysiological Data. Sci Rep 2016; 6:32749. [PMID: 27605157 PMCID: PMC5015021 DOI: 10.1038/srep32749] [Citation(s) in RCA: 25] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/09/2016] [Accepted: 08/08/2016] [Indexed: 01/09/2023] Open
Abstract
We report on the construction of neuron models by assimilating electrophysiological data with large-scale constrained nonlinear optimization. The method implements interior point line parameter search to determine parameters from the responses to intracellular current injections of zebra finch HVC neurons. We incorporated these parameters into a nine ionic channel conductance model to obtain completed models which we then use to predict the state of the neuron under arbitrary current stimulation. Each model was validated by successfully predicting the dynamics of the membrane potential induced by 20–50 different current protocols. The dispersion of parameters extracted from different assimilation windows was studied. Differences in constraints from current protocols, stochastic variability in neuron output, and noise behave as a residual temperature which broadens the global minimum of the objective function to an ellipsoid domain whose principal axes follow an exponentially decaying distribution. The maximum likelihood expectation of extracted parameters was found to provide an excellent approximation of the global minimum and yields highly consistent kinetics for both neurons studied. Large scale assimilation absorbs the intrinsic variability of electrophysiological data over wide assimilation windows. It builds models in an automatic manner treating all data as equal quantities and requiring minimal additional insight.
Affiliation(s)
- Alain Nogaret
- Department of Physics, University of Bath, Bath BA2 7AY, UK
- C Daniel Meliza
- Department of Psychology, University of Virginia, Charlottesville, VA 22904, USA
- Daniel Margoliash
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, IL 60637, USA
- Henry D I Abarbanel
- Department of Physics, University of California San Diego, La Jolla, CA 92093, USA
- Scripps Institution for Oceanography, Marine Physical Laboratory, La Jolla, CA 92093, USA
33
Herrmann B, Henry MJ, Johnsrude IS, Obleser J. Altered temporal dynamics of neural adaptation in the aging human auditory cortex. Neurobiol Aging 2016; 45:10-22. [DOI: 10.1016/j.neurobiolaging.2016.05.006] [Citation(s) in RCA: 42] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/04/2015] [Revised: 04/11/2016] [Accepted: 05/07/2016] [Indexed: 12/19/2022]
34
Van Geit W, Gevaert M, Chindemi G, Rössert C, Courcol JD, Muller EB, Schürmann F, Segev I, Markram H. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience. Front Neuroinform 2016; 10:17. [PMID: 27375471 PMCID: PMC4896051 DOI: 10.3389/fninf.2016.00017] [Citation(s) in RCA: 82] [Impact Index Per Article: 10.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2016] [Accepted: 05/06/2016] [Indexed: 11/13/2022] Open
Abstract
At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters are non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases.
Affiliation(s)
- Werner Van Geit
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Michael Gevaert
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Giuseppe Chindemi
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Christian Rössert
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Jean-Denis Courcol
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Eilif B Muller
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Felix Schürmann
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Idan Segev
- Department of Neurobiology, Alexander Silberman Institute of Life Sciences, The Hebrew University of Jerusalem, Jerusalem, Israel
- The Edmond and Lily Safra Centre for Brain Sciences, The Hebrew University of Jerusalem, Jerusalem, Israel
- Henry Markram
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Laboratory of Neural Microcircuitry, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
35
Rosenbaum R. A Diffusion Approximation and Numerical Methods for Adaptive Neuron Models with Stochastic Inputs. Front Comput Neurosci 2016; 10:39. [PMID: 27148036 PMCID: PMC4840919 DOI: 10.3389/fncom.2016.00039] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2015] [Accepted: 04/04/2016] [Indexed: 11/16/2022] Open
Abstract
Characterizing the spiking statistics of neurons receiving noisy synaptic input is a central problem in computational neuroscience. Monte Carlo approaches to this problem are computationally expensive and often fail to provide mechanistic insight. Thus, the field has seen the development of mathematical and numerical approaches, often relying on a Fokker-Planck formalism. These approaches force a compromise between biological realism, accuracy and computational efficiency. In this article we develop an extension of existing diffusion approximations to more accurately approximate the response of neurons with adaptation currents and noisy synaptic currents. The implementation refines existing numerical schemes for solving the associated Fokker-Planck equations to improve computational efficiency and accuracy. Computer code implementing the developed algorithms is made available to the public.
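For orientation, the kind of model such approximations target can be sketched with a brute-force Monte Carlo simulation; the adaptive integrate-and-fire form and all parameter values below are illustrative assumptions, not the paper's scheme, whose point is precisely that a Fokker-Planck treatment avoids this costly sampling.

```python
import numpy as np

def adaptive_lif_rate(mu, sigma, b, T=5.0, dt=1e-4, tau_m=0.02,
                      tau_w=0.2, v_th=1.0, v_reset=0.0, seed=0):
    """Monte Carlo estimate (Euler-Maruyama) of the firing rate of a leaky
    integrate-and-fire neuron with spike-triggered adaptation current w,
    driven by diffusion-approximated input of mean mu and intensity sigma.
    All parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    v, w, spikes = 0.0, 0.0, 0
    for _ in range(int(T / dt)):
        v += dt * (-v / tau_m + mu - w) + sigma * np.sqrt(dt) * rng.standard_normal()
        w += dt * (-w / tau_w)
        if v >= v_th:
            v = v_reset
            w += b        # spike-triggered adaptation increment
            spikes += 1
    return spikes / T

# Adaptation lowers the sustained firing rate relative to the same neuron
# without it (b = 0), given identical noise realizations.
rate_adapt = adaptive_lif_rate(mu=80.0, sigma=1.0, b=2.0)
rate_plain = adaptive_lif_rate(mu=80.0, sigma=1.0, b=0.0)
```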
Affiliation(s)
- Robert Rosenbaum
- Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
36
Mensi S, Hagens O, Gerstner W, Pozzorini C. Enhanced Sensitivity to Rapid Input Fluctuations by Nonlinear Threshold Dynamics in Neocortical Pyramidal Neurons. PLoS Comput Biol 2016; 12:e1004761. [PMID: 26907675 PMCID: PMC4764342 DOI: 10.1371/journal.pcbi.1004761] [Citation(s) in RCA: 28] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/29/2015] [Accepted: 01/19/2016] [Indexed: 11/25/2022] Open
Abstract
The way in which single neurons transform input into output spike trains has fundamental consequences for network coding. Theories and modeling studies based on standard Integrate-and-Fire models implicitly assume that, in response to increasingly strong inputs, neurons modify their coding strategy by progressively reducing their selective sensitivity to rapid input fluctuations. Combining mathematical modeling with in vitro experiments, we demonstrate that, in L5 pyramidal neurons, the firing threshold dynamics adaptively adjust the effective timescale of somatic integration in order to preserve sensitivity to rapid signals over a broad range of input statistics. For that, a new Generalized Integrate-and-Fire model featuring nonlinear firing threshold dynamics and conductance-based adaptation is introduced that outperforms state-of-the-art neuron models in predicting the spiking activity of neurons responding to a variety of in vivo-like fluctuating currents. Our model allows for efficient parameter extraction and can be analytically mapped to a Generalized Linear Model in which both the input filter—describing somatic integration—and the spike-history filter—accounting for spike-frequency adaptation—dynamically adapt to the input statistics, as experimentally observed. Overall, our results provide new insights on the computational role of different biophysical processes known to underlie adaptive coding in single neurons and support previous theoretical findings indicating that the nonlinear dynamics of the firing threshold due to Na+-channel inactivation regulate the sensitivity to rapid input fluctuations. Over the last decades, a variety of simplified spiking models have been shown to achieve a surprisingly high performance in predicting the neuronal responses to in vitro somatic current injections. 
Because of the complex adaptive behavior featured by cortical neurons, this success is however restricted to limited stimulus ranges: model parameters optimized for a specific input regime are often inappropriate to describe the response to input currents with different statistical properties. In the present study, a new spiking neuron model is introduced that captures single-neuron computation over a wide range of input statistics and explains different aspects of the neuronal dynamics within a single framework. Our results indicate that complex forms of single neuron adaptation are mediated by the nonlinear dynamics of the firing threshold and that the input-output transformation performed by cortical pyramidal neurons can be intuitively understood in terms of an enhanced Generalized Linear Model in which both the input filter and the spike-history filter adapt to the input statistics.
Affiliation(s)
- Skander Mensi
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Olivier Hagens
- Laboratory of Neural Microcircuitry (LNMC), Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Wulfram Gerstner
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Christian Pozzorini
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
37
Ocker GK, Litwin-Kumar A, Doiron B. Self-Organization of Microcircuits in Networks of Spiking Neurons with Plastic Synapses. PLoS Comput Biol 2015; 11:e1004458. [PMID: 26291697 PMCID: PMC4546203 DOI: 10.1371/journal.pcbi.1004458] [Citation(s) in RCA: 54] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2014] [Accepted: 07/19/2015] [Indexed: 11/18/2022] Open
Abstract
The synaptic connectivity of cortical networks features an overrepresentation of certain wiring motifs compared to simple random-network models. This structure is shaped, in part, by synaptic plasticity that promotes or suppresses connections between neurons depending on their joint spiking activity. Frequently, theoretical studies focus on how feedforward inputs drive plasticity to create this network structure. We study the complementary scenario of self-organized structure in a recurrent network, with spike timing-dependent plasticity driven by spontaneous dynamics. We develop a self-consistent theory for the evolution of network structure by combining fast spiking covariance with a slow evolution of synaptic weights. Through a finite-size expansion of network dynamics we obtain a low-dimensional set of nonlinear differential equations for the evolution of two-synapse connectivity motifs. With this theory in hand, we explore how the form of the plasticity rule drives the evolution of microcircuits in cortical networks. When potentiation and depression are in approximate balance, synaptic dynamics depend on weighted divergent, convergent, and chain motifs. For additive, Hebbian STDP these motif interactions create instabilities in synaptic dynamics that either promote or suppress the initial network structure. Our work provides a consistent theoretical framework for studying how spiking activity in recurrent networks interacts with synaptic plasticity to determine network structure.
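The elementary ingredient of these motif dynamics, the additive pairwise STDP rule, can be sketched directly; the amplitudes (with slight depression dominance, approximating the balanced regime discussed above) and the 20 ms time constant are illustrative assumptions.

```python
import numpy as np

def stdp_dw(pre, post, a_plus=0.010, a_minus=0.012, tau=0.02):
    """Additive pairwise STDP: each (pre, post) spike pair potentiates the
    synapse if the presynaptic spike precedes the postsynaptic one and
    depresses it otherwise, with exponentially decaying dependence on the
    pair's time lag. Amplitudes and time constant are illustrative."""
    dw = 0.0
    for tp in pre:
        for tq in post:
            lag = tq - tp
            if lag > 0:
                dw += a_plus * np.exp(-lag / tau)
            elif lag < 0:
                dw -= a_minus * np.exp(lag / tau)
    return dw

dw_ltp = stdp_dw(pre=[0.000], post=[0.005])  # pre leads post: potentiation
dw_ltd = stdp_dw(pre=[0.005], post=[0.000])  # post leads pre: depression
```

In the network theory, the expected drift of each weight is this rule averaged against the spiking covariances, which is where the divergent, convergent, and chain motifs enter.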
Affiliation(s)
- Gabriel Koch Ocker
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, Pennsylvania, United States of America
- Ashok Litwin-Kumar
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, Pennsylvania, United States of America
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Center for Theoretical Neuroscience, Columbia University, New York, New York, United States of America
- Brent Doiron
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, Pennsylvania, United States of America
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
38
Pozzorini C, Mensi S, Hagens O, Naud R, Koch C, Gerstner W. Automated High-Throughput Characterization of Single Neurons by Means of Simplified Spiking Models. PLoS Comput Biol 2015; 11:e1004275. [PMID: 26083597 PMCID: PMC4470831 DOI: 10.1371/journal.pcbi.1004275] [Citation(s) in RCA: 58] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/11/2014] [Accepted: 04/08/2015] [Indexed: 11/18/2022] Open
Abstract
Single-neuron models are useful not only for studying the emergent properties of neural circuits in large-scale simulations, but also for extracting and summarizing in a principled way the information contained in electrophysiological recordings. Here we demonstrate that, using a convex optimization procedure we previously introduced, a Generalized Integrate-and-Fire model can be accurately fitted with a limited amount of data. The model is capable of predicting both the spiking activity and the subthreshold dynamics of different cell types, and can be used for online characterization of neuronal properties. A protocol is proposed that, combined with emergent technologies for automatic patch-clamp recordings, permits automated, in vitro high-throughput characterization of single neurons.
Affiliation(s)
- Christian Pozzorini
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Skander Mensi
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Olivier Hagens
- Laboratory of Neural Microcircuitry (LNMC), Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Richard Naud
- Department of Physics, University of Ottawa, Ottawa, Canada
- Christof Koch
- Allen Institute for Brain Science, Seattle, Washington, USA
- Wulfram Gerstner
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
39
Lynch EP, Houghton CJ. Parameter estimation of neuron models using in-vitro and in-vivo electrophysiological data. Front Neuroinform 2015; 9:10. [PMID: 25941485 PMCID: PMC4403314 DOI: 10.3389/fninf.2015.00010] [Citation(s) in RCA: 21] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/14/2014] [Accepted: 03/27/2015] [Indexed: 11/30/2022] Open
Abstract
Spiking neuron models can accurately predict the response of neurons to somatically injected currents if the model parameters are carefully tuned. Predicting the response of in-vivo neurons responding to natural stimuli presents a far more challenging modeling problem. In this study, an algorithm is presented for parameter estimation of spiking neuron models. The algorithm is a hybrid evolutionary algorithm that uses a spike train metric as a fitness function. We apply it to parameter discovery in modeling two experimental data sets with spiking neurons: in-vitro current-injection responses from a regular-spiking pyramidal neuron are modeled using spiking neurons, and in-vivo extracellular auditory data are modeled using a two-stage model consisting of a stimulus filter and a spiking neuron model.
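The general recipe — evolve model parameters against a spike-train distance — can be sketched in a few lines. This toy version uses a leaky integrate-and-fire model with two free parameters and a van Rossum distance as fitness; the model, metric, and mutation operators are illustrative stand-ins, not the paper's exact choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_spikes(tau_m, v_thresh, current, dt=0.1):
    """Simulate a leaky integrate-and-fire neuron; return spike times in ms."""
    v, spikes = 0.0, []
    for i, I in enumerate(current):
        v += dt * (-v / tau_m + I)
        if v >= v_thresh:
            spikes.append(i * dt)
            v = 0.0
    return np.array(spikes)

def van_rossum_dist(s1, s2, tau=10.0, dt=0.1, t_max=200.0):
    """Van Rossum distance: L2 norm between exponentially filtered trains."""
    t = np.arange(0.0, t_max, dt)
    def filt(s):
        if len(s) == 0:
            return np.zeros_like(t)
        return sum(np.where(t >= ts, np.exp(-(t - ts) / tau), 0.0) for ts in s)
    d = filt(s1) - filt(s2)
    return np.sqrt(np.sum(d * d) * dt / tau)

# A "recorded" train generated from known parameters plays the role of data.
current = 1.2 + 0.3 * np.sin(np.arange(0.0, 200.0, 0.1) / 5.0)
target = lif_spikes(20.0, 15.0, current)

# Toy evolutionary loop: mutate a small population, keep the fittest half.
pop = [(rng.uniform(5, 40), rng.uniform(5, 30)) for _ in range(20)]
for _ in range(15):
    pop.sort(key=lambda p: van_rossum_dist(lif_spikes(*p, current), target))
    elite = pop[:10]
    pop = elite + [(max(t + rng.normal(), 1.0), max(v + rng.normal(), 1.0))
                   for t, v in elite]

best = min(pop, key=lambda p: van_rossum_dist(lif_spikes(*p, current), target))
```

The key design point the abstract highlights is the fitness function: a spike-train metric lets the search optimize spike timing directly, without requiring access to the subthreshold voltage.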
Affiliation(s)
- Eoin P Lynch
- School of Mathematics, Trinity College Dublin Dublin, Ireland ; Department of Computer Science, University of Bristol Bristol, UK
- Conor J Houghton
- Department of Computer Science, University of Bristol Bristol, UK
40
Nogaret A, O'Callaghan EL, Lataro RM, Salgado HC, Meliza CD, Duncan E, Abarbanel HDI, Paton JFR. Silicon central pattern generators for cardiac diseases. J Physiol 2015; 593:763-74. [PMID: 25433077 DOI: 10.1113/jphysiol.2014.282723] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/15/2014] [Accepted: 11/16/2014] [Indexed: 11/08/2022] Open
Abstract
Cardiac rhythm management devices provide therapies for both arrhythmias and resynchronisation but not heart failure, which affects millions of patients worldwide. This paper reviews recent advances in biophysics and mathematical engineering that provide a novel technological platform for addressing heart disease and enabling beat-to-beat adaptation of cardiac pacing in response to physiological feedback. The technology consists of silicon hardware central pattern generators (hCPGs) that may be trained to emulate accurately the dynamical response of biological central pattern generators (bCPGs). We discuss the limitations of present CPGs and appraise the advantages of analog over digital circuits for application in bioelectronic medicine. To test the system, we have focused on the cardio-respiratory oscillators in the medulla oblongata that modulate heart rate in phase with respiration to induce respiratory sinus arrhythmia (RSA). We describe here a novel, scalable hCPG comprising physiologically realistic (Hodgkin-Huxley type) neurones and synapses. Our hCPG comprises two neurones that antagonise each other to provide rhythmic motor drive to the vagus nerve to slow the heart. We show how recent advances in modelling allow the motor output to adapt to physiological feedback such as respiration. In rats, we report on the restoration of RSA using an hCPG that receives diaphragmatic electromyography input and use it to stimulate the vagus nerve at specific time points of the respiratory cycle to slow the heart rate. We have validated the adaptation of stimulation to alterations in respiratory rate. We demonstrate that the hCPG is tuneable in terms of the depth and timing of the RSA relative to respiratory phase. These pioneering studies will now permit an analysis of the physiological role of RSA as well as any potential therapeutic use in cardiac disease.
Affiliation(s)
- Alain Nogaret
- Department of Physics, University of Bath, Bath, BA2 7AY, UK
41
Naud R, Bathellier B, Gerstner W. Spike-timing prediction in cortical neurons with active dendrites. Front Comput Neurosci 2014; 8:90. [PMID: 25165443 PMCID: PMC4131408 DOI: 10.3389/fncom.2014.00090] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/10/2014] [Accepted: 07/20/2014] [Indexed: 11/13/2022] Open
Abstract
A complete single-neuron model must correctly reproduce the firing of spikes and bursts. We present a study of a simplified model of deep pyramidal cells of the cortex with active dendrites. We hypothesized that we can model the soma and its apical dendrite with only two compartments, without significant loss in the accuracy of spike-timing predictions. The model is based on experimentally measurable impulse-response functions, which transfer the effect of current injected in one compartment to current reaching the other. Each compartment was modeled with a pair of non-linear differential equations and a small number of parameters that approximate the Hodgkin-and-Huxley equations. The predictive power of this model was tested on electrophysiological experiments where noisy current was injected in both the soma and the apical dendrite simultaneously. We conclude that a simple two-compartment model can predict spike times of pyramidal cells stimulated in the soma and dendrites simultaneously. Our results support the conclusion that regenerative activity in the apical dendrite is required to properly account for the dynamics of layer 5 pyramidal cells under in-vivo-like conditions.
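The structure of such a model can be sketched as two coupled differential equations with a nonlinear dendritic term. The sketch below is a caricature under stated assumptions — passive compartments coupled by a conductance `g_c`, with a crude "regenerative" dendritic boost — not the paper's fitted impulse-response formulation:

```python
import numpy as np

def two_compartment(I_soma, I_dend, dt=0.1,
                    tau_s=15.0, tau_d=25.0, g_c=0.2,
                    v_th=20.0, dend_th=30.0, dend_boost=5.0):
    """Minimal soma+dendrite model: two leaky compartments coupled by g_c,
    with a regenerative (calcium-spike-like) extra inward current when the
    dendrite crosses dend_th.  Returns indices of somatic spikes."""
    vs = vd = 0.0
    spikes = []
    for i in range(len(I_soma)):
        regen = dend_boost if vd > dend_th else 0.0
        dvs = (-vs / tau_s + I_soma[i] + g_c * (vd - vs)) * dt
        dvd = (-vd / tau_d + I_dend[i] + regen + g_c * (vs - vd)) * dt
        vs, vd = vs + dvs, vd + dvd
        if vs >= v_th:
            spikes.append(i)
            vs = 0.0      # somatic reset
    return spikes

n = 3000  # 300 ms at dt = 0.1 ms
soma_only = two_compartment(np.full(n, 2.5), np.zeros(n))
soma_plus_dend = two_compartment(np.full(n, 2.5), np.full(n, 2.0))
```

Even this caricature reproduces the qualitative point: current injected into the dendritic compartment propagates through the coupling term and changes somatic spike timing.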
Affiliation(s)
- Richard Naud
- Department of Physics, University of Ottawa Ottawa, ON, Canada
- Brice Bathellier
- Cortical Dynamics and Multisensory Processing Team, Unit of Neuroscience Information and Complexity, CNRS UPR-3239 Gif-sur-Yvette, France
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Ecole Polytechnique Federale de Lausanne Lausanne, Switzerland
42
Meliza CD, Kostuk M, Huang H, Nogaret A, Margoliash D, Abarbanel HDI. Estimating parameters and predicting membrane voltages with conductance-based neuron models. BIOLOGICAL CYBERNETICS 2014; 108:495-516. [PMID: 24962080 DOI: 10.1007/s00422-014-0615-5] [Citation(s) in RCA: 32] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/19/2013] [Accepted: 06/09/2014] [Indexed: 05/07/2023]
Abstract
Recent results demonstrate techniques for fully quantitative, statistical inference of the dynamics of individual neurons under the Hodgkin-Huxley framework of voltage-gated conductances. Using a variational approximation, this approach has been successfully applied to simulated data from model neurons. Here, we use this method to analyze a population of real neurons recorded in a slice preparation of the zebra finch forebrain nucleus HVC. Our results demonstrate that using only 1,500 ms of voltage recorded while injecting a complex current waveform, we can estimate the values of 12 state variables and 72 parameters in a dynamical model, such that the model accurately predicts the responses of the neuron to novel injected currents. A less complex model produced consistently worse predictions, indicating that the additional currents contribute significantly to the dynamics of these neurons. Preliminary results indicate some differences in the channel complement of the models for different classes of HVC neurons, which accords with expectations from the biology. Although the model for each cell is incomplete (it represents only the somatic compartment and is likely missing classes of channels that the real neurons possess), our approach makes it possible to investigate, through modeling, the plausibility of additional classes of channels the cell might possess, thus improving the models over time. These results provide an important foundational basis for building biologically realistic network models, such as the one in HVC that contributes to the process of song production and developmental vocal learning in songbirds.
Affiliation(s)
- C Daniel Meliza
- Department of Organismal Biology and Anatomy, University of Chicago, 1027 E 57th Street, Chicago, IL, 60637, USA,
43
Fontaine B, Peña JL, Brette R. Spike-threshold adaptation predicted by membrane potential dynamics in vivo. PLoS Comput Biol 2014; 10:e1003560. [PMID: 24722397 PMCID: PMC3983065 DOI: 10.1371/journal.pcbi.1003560] [Citation(s) in RCA: 50] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/26/2013] [Accepted: 02/21/2014] [Indexed: 11/18/2022] Open
Abstract
Neurons encode information in sequences of spikes, which are triggered when their membrane potential crosses a threshold. In vivo, the spiking threshold displays large variability, suggesting that threshold dynamics have a profound influence on how the combined input of a neuron is encoded in its spiking output. Threshold variability could be explained by adaptation to the membrane potential. However, it could also be the case that most threshold variability reflects noise and processes other than threshold adaptation. Here, we investigated threshold variation in the responses of auditory neurons recorded in vivo in barn owls. We found that spike threshold is quantitatively predicted by a model in which the threshold adapts, tracking the membrane potential at a short timescale. As a result, in these neurons, slow voltage fluctuations do not contribute to spiking because they are filtered by threshold adaptation. More importantly, these neurons can only respond to input spikes arriving together on a millisecond timescale. These results demonstrate that fast adaptation to the membrane potential captures spike threshold variability in vivo. Neurons spike when their membrane potential exceeds a threshold value, but this value has been shown to be variable in the same neuron recorded in vivo. This variability could reflect noise, or deterministic processes that make the threshold vary with the membrane potential. The second alternative would have important functional consequences. Here, we show that threshold variability is a genuine feature of neurons, which reflects adaptation to the membrane potential at a short timescale, with little contribution from noise. This demonstrates that a deterministic model can predict spikes based only on the membrane potential.
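The filtering effect described above — slow depolarizations cancelled, fast deflections passed — falls out of a one-line threshold equation. A minimal sketch (generic parameter values, not the fitted barn-owl model):

```python
import numpy as np

def adaptive_threshold_spikes(v, theta0=10.0, alpha=0.8, tau_theta=5.0, dt=0.1):
    """Spike detector whose threshold tracks the membrane potential:
    d(theta)/dt = (theta0 + alpha * v - theta) / tau_theta.
    Slow depolarizations raise the threshold along with the voltage and are
    filtered out; only deflections faster than tau_theta can cross it."""
    theta, spikes = theta0, []
    for i, vi in enumerate(v):
        if vi >= theta:
            spikes.append(i)
        theta += dt * (theta0 + alpha * vi - theta) / tau_theta
    return spikes

t = np.arange(0.0, 200.0, 0.1)                    # ms
slow = 12.0 * t / 200.0                           # slow ramp, tracked by theta
fast = slow + 12.0 * (np.abs(t - 100.0) < 0.5)    # same ramp plus a ~1 ms pulse
```

Running both inputs through the detector shows the point directly: the slow ramp alone never spikes, while the brief pulse riding on it does, because the threshold cannot catch up within `tau_theta`.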
Affiliation(s)
- Bertrand Fontaine
- Dominick P. Purpura Department of Neuroscience, Albert Einstein College of Medicine, Bronx, New York, United States of America
- José Luis Peña
- Dominick P. Purpura Department of Neuroscience, Albert Einstein College of Medicine, Bronx, New York, United States of America
- Romain Brette
- Laboratoire Psychologie de la Perception, CNRS and Université Paris Descartes, Paris, France
- Département d'Etudes Cognitives, Ecole Normale Supérieure, Paris, France
- Sorbonne Universités, UPMC Univ. Paris 06, UMR_S 968, Institut de la Vision, Paris, France
- INSERM, U968, Paris, France
- CNRS, UMR_7210, Paris, France
44
Ladenbauer J, Augustin M, Obermayer K. How adaptation currents change threshold, gain, and variability of neuronal spiking. J Neurophysiol 2013; 111:939-53. [PMID: 24174646 DOI: 10.1152/jn.00586.2013] [Citation(s) in RCA: 25] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Many types of neurons exhibit spike rate adaptation, mediated by intrinsic slow K(+) currents, which effectively inhibit neuronal responses. How these adaptation currents change the relationship between in vivo like fluctuating synaptic input, spike rate output, and the spike train statistics, however, is not well understood. In this computational study we show that an adaptation current that primarily depends on the subthreshold membrane voltage changes the neuronal input-output relationship (I-O curve) subtractively, thereby increasing the response threshold, and decreases its slope (response gain) for low spike rates. A spike-dependent adaptation current alters the I-O curve divisively, thus reducing the response gain. Both types of adaptation current naturally increase the mean interspike interval (ISI), but they can affect ISI variability in opposite ways. A subthreshold current always causes an increase of variability while a spike-triggered current decreases high variability caused by fluctuation-dominated inputs and increases low variability when the average input is large. The effects on I-O curves match those caused by synaptic inhibition in networks with asynchronous irregular activity, for which we find subtractive and divisive changes caused by external and recurrent inhibition, respectively. Synaptic inhibition, however, always increases the ISI variability. We analytically derive expressions for the I-O curve and ISI variability, which demonstrate the robustness of our results. Furthermore, we show how the biophysical parameters of slow K(+) conductances contribute to the two different types of adaptation current and find that Ca(2+)-activated K(+) currents are effectively captured by a simple spike-dependent description, while muscarine-sensitive or Na(+)-activated K(+) currents show a dominant subthreshold component.
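The two adaptation mechanisms the abstract contrasts can be written as one extra variable in a leaky integrate-and-fire model: a subthreshold coupling `a` and a spike-triggered increment `b`. A toy sketch with arbitrary units (illustrative parameters, not the paper's aEIF setup or its analytical derivations):

```python
import numpy as np

def lif_adapt(I_mean, a=0.0, b=0.0, tau_w=100.0, tau_m=10.0,
              v_th=15.0, T=2000.0, dt=0.1, seed=1):
    """Leaky IF with an adaptation current w:
       tau_m dv/dt = -v + I - w,   tau_w dw/dt = a*v - w,
    and w -> w + b at each spike.  a: subthreshold coupling,
    b: spike-triggered increment.  Returns mean firing rate (Hz)."""
    rng = np.random.default_rng(seed)
    v = w = 0.0
    n_spikes = 0
    for _ in range(int(T / dt)):
        I = I_mean + 2.0 * rng.normal()        # fluctuating input
        v += dt * (-v + I - w) / tau_m
        w += dt * (a * v - w) / tau_w
        if v >= v_th:
            n_spikes += 1
            v = 0.0
            w += b
    return 1000.0 * n_spikes / T

# Either mechanism lowers the output rate; the paper's point is *how*:
# the subthreshold current (a) shifts the I-O curve, the spike-triggered
# current (b) scales its gain.
base = lif_adapt(20.0)
sub  = lif_adapt(20.0, a=1.0)
trig = lif_adapt(20.0, b=5.0)
```

Sweeping `I_mean` with each mechanism switched on separately would trace out the subtractive versus divisive I-O curve changes described above.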
Affiliation(s)
- Josef Ladenbauer
- Neural Information Processing Group, Technische Universität Berlin, Berlin, Germany; and
45
Fontaine B, Benichoux V, Joris PX, Brette R. Predicting spike timing in highly synchronous auditory neurons at different sound levels. J Neurophysiol 2013; 110:1672-88. [PMID: 23864375 PMCID: PMC4042421 DOI: 10.1152/jn.00051.2013] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/23/2013] [Accepted: 07/15/2013] [Indexed: 11/22/2022] Open
Abstract
A challenge for sensory systems is to encode natural signals that vary in amplitude by orders of magnitude. The spike trains of neurons in the auditory system must represent the fine temporal structure of sounds despite a tremendous variation in sound level in natural environments. It has been shown in vitro that the transformation from dynamic signals into precise spike trains can be accurately captured by simple integrate-and-fire models. In this work, we show that the in vivo responses of cochlear nucleus bushy cells to sounds across a wide range of levels can be precisely predicted by deterministic integrate-and-fire models with adaptive spike threshold. Our model can predict both the spike timings and the firing rate in response to novel sounds, across a large input level range. A noisy version of the model accounts for the statistical structure of spike trains, including the reliability and temporal precision of responses. Spike threshold adaptation was critical to ensure that predictions remain accurate at different levels. These results confirm that simple integrate-and-fire models provide an accurate phenomenological account of spike train statistics and emphasize the functional relevance of spike threshold adaptation.
Affiliation(s)
- Bertrand Fontaine
- Laboratoire Psychologie de la Perception, CNRS, Université Paris Descartes, Paris, France
46
Ladenbauer J, Lehnert J, Rankoohi H, Dahms T, Schöll E, Obermayer K. Adaptation controls synchrony and cluster states of coupled threshold-model neurons. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2013; 88:042713. [PMID: 24229219 DOI: 10.1103/physreve.88.042713] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/16/2013] [Revised: 08/20/2013] [Indexed: 06/02/2023]
Abstract
We analyze zero-lag and cluster synchrony of delay-coupled nonsmooth dynamical systems by extending the master stability approach, and apply this to networks of adaptive threshold-model neurons. For a homogeneous population of excitatory and inhibitory neurons we find (i) that subthreshold adaptation stabilizes or destabilizes synchrony depending on whether the recurrent synaptic excitatory or inhibitory couplings dominate, and (ii) that synchrony is always unstable for networks with balanced recurrent synaptic inputs. If couplings are not too strong, synchronization properties are similar for very different coupling topologies, i.e., random connections or spatial networks with localized connectivity. We generalize our approach for two subpopulations of neurons with nonidentical local dynamics, including bursting, for which activity-based adaptation controls the stability of cluster states, independent of a specific coupling topology.
Affiliation(s)
- Josef Ladenbauer
- Institut für Softwaretechnik und Theoretische Informatik, Technische Universität Berlin, Marchstraße 23, 10587 Berlin, Germany and Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, 10115 Berlin, Germany
47
Espinosa-Ramos JI, Vazquez RA, Cruz-Cortes N. Designing spiking neural models of neurophysiological recordings using gene expression programming. BMC Neurosci 2013. [PMCID: PMC3704823 DOI: 10.1186/1471-2202-14-s1-p74] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open
48
Temporal whitening by power-law adaptation in neocortical neurons. Nat Neurosci 2013; 16:942-8. [PMID: 23749146 DOI: 10.1038/nn.3431] [Citation(s) in RCA: 115] [Impact Index Per Article: 10.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/21/2012] [Accepted: 05/08/2013] [Indexed: 11/08/2022]
Abstract
Spike-frequency adaptation (SFA) is widespread in the CNS, but its function remains unclear. In neocortical pyramidal neurons, adaptation manifests itself by an increase in the firing threshold and by adaptation currents triggered after each spike. Combining electrophysiological recordings in mice with modeling, we found that these adaptation processes lasted for more than 20 s and decayed over multiple timescales according to a power law. The power-law decay associated with adaptation mirrored and canceled the temporal correlations of input current received in vivo at the somata of layer 2/3 somatosensory pyramidal neurons. These findings suggest that, in the cortex, SFA causes temporal decorrelation of output spikes (temporal whitening), an energy-efficient coding procedure that, at high signal-to-noise ratio, improves the information transfer.
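The whitening claim has a clean signal-processing core: power-law adaptation acts like a fractional differentiator, whose gain grows as f^beta and so cancels a 1/f^beta input spectrum. A spectral-domain sketch of that cancellation (synthetic 1/f noise and an idealized filter, not the paper's fitted neuron model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2 ** 14
beta = 1.0

# 1/f^beta noise via spectral shaping
freqs = np.fft.rfftfreq(n, d=1.0)
amp = np.zeros_like(freqs)
amp[1:] = freqs[1:] ** (-beta / 2.0)
phase = rng.uniform(0, 2 * np.pi, len(freqs))
signal = np.fft.irfft(amp * np.exp(1j * phase), n)

# Idealized power-law adaptation: a fractional differentiator of order
# beta/2, i.e. multiply the spectrum by f^(beta/2).
whitened = np.fft.irfft(np.fft.rfft(signal) * freqs ** (beta / 2.0), n)

def spectral_slope(x):
    """Log-log slope of the power spectrum (0 for white noise)."""
    p = np.abs(np.fft.rfft(x)) ** 2
    return np.polyfit(np.log(freqs[1:]), np.log(p[1:]), 1)[0]
```

The input has spectral slope near -beta; after the power-law filter the slope is near zero, i.e. the output is temporally decorrelated, which is the "temporal whitening" of the title.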
49
Abstract
Cell-to-cell variability in molecular, genetic, and physiological features is increasingly recognized as a critical feature of complex biological systems, including the brain. Although such variability has potential advantages in robustness and reliability, how and why biological circuits assemble heterogeneous cells into functional groups is poorly understood. Here, we develop analytic approaches toward answering how neuron-level variation in intrinsic biophysical properties of olfactory bulb mitral cells influences population coding of fluctuating stimuli. We capture the intrinsic diversity of recorded populations of neurons through a statistical approach based on generalized linear models. These models are flexible enough to predict the diverse responses of individual neurons yet provide a common reference frame for comparing one neuron to the next. We then use Bayesian stimulus decoding to ask how effectively different populations of mitral cells, varying in their diversity, encode a common stimulus. We show that a key advantage provided by physiological levels of intrinsic diversity is more efficient and more robust encoding of stimuli by the population as a whole. However, we find that the populations that best encode stimulus features are not simply the most heterogeneous, but those that balance diversity with the benefits of neural similarity.
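The diversity argument can be illustrated with a much simpler encoder than the paper's point-process GLMs: toy linear neurons whose kernels either differ or are identical copies. The sketch below (all names and parameters illustrative) shows why diverse filters support better stimulus reconstruction — they span more of stimulus space, while identical copies only average out noise along one dimension:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, L = 400, 8, 5            # time bins, neurons, filter length
stim = rng.normal(0, 1, T)

def responses(filters, noise=0.5):
    """Noisy linear responses: each neuron filters the stimulus."""
    X = np.array([np.convolve(stim, f, mode="same") for f in filters])
    return X + noise * rng.normal(0, 1, X.shape)

def decode_err(filters):
    """In-sample MSE of the optimal linear readout of the stimulus."""
    R = responses(filters)
    W, *_ = np.linalg.lstsq(R.T, stim, rcond=None)
    return np.mean((R.T @ W - stim) ** 2)

diverse = [rng.normal(0, 1, L) for _ in range(N)]
identical = [diverse[0]] * N

err_diverse = decode_err(diverse)
err_identical = decode_err(identical)
```

This only captures the "diversity helps" half of the abstract; the paper's stronger finding — that an intermediate, physiological level of diversity beats maximal heterogeneity — needs the Bayesian decoding framework it develops.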
50
Augustin M, Ladenbauer J, Obermayer K. How adaptation shapes spike rate oscillations in recurrent neuronal networks. Front Comput Neurosci 2013; 7:9. [PMID: 23450654 PMCID: PMC3583173 DOI: 10.3389/fncom.2013.00009] [Citation(s) in RCA: 37] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2012] [Accepted: 02/08/2013] [Indexed: 12/31/2022] Open
Abstract
Neural mass signals from in-vivo recordings often show oscillations with frequencies ranging from <1 to 100 Hz. Fast rhythmic activity in the beta and gamma range can be generated by network-based mechanisms such as recurrent synaptic excitation-inhibition loops. Slower oscillations might instead depend on neuronal adaptation currents whose timescales range from tens of milliseconds to seconds. Here we investigate how the dynamics of such adaptation currents contribute to spike rate oscillations and resonance properties in recurrent networks of excitatory and inhibitory neurons. Based on a network of sparsely coupled spiking model neurons with two types of adaptation current and conductance-based synapses with heterogeneous strengths and delays we use a mean-field approach to analyze oscillatory network activity. For constant external input, we find that spike-triggered adaptation currents provide a mechanism to generate slow oscillations over a wide range of adaptation timescales as long as recurrent synaptic excitation is sufficiently strong. Faster rhythms occur when recurrent inhibition is slower than excitation and oscillation frequency increases with the strength of inhibition. Adaptation facilitates such network-based oscillations for fast synaptic inhibition and leads to decreased frequencies. For oscillatory external input, adaptation currents amplify a narrow band of frequencies and cause phase advances for low frequencies in addition to phase delays at higher frequencies. Our results therefore identify the different key roles of neuronal adaptation dynamics for rhythmogenesis and selective signal propagation in recurrent networks.
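The slow-oscillation mechanism — strong recurrent excitation destabilized by a much slower adaptation variable — shows up already in a one-population rate caricature (a threshold-linear unit with illustrative parameters, not the paper's spiking mean-field equations):

```python
import numpy as np

def simulate(w=1.5, b=1.0, tau_r=5.0, tau_a=300.0, I=5.0,
             r_max=100.0, T=3000.0, dt=0.5):
    """Threshold-linear rate unit with recurrent gain w and a slow
    adaptation variable a that tracks the rate (tau_a >> tau_r).
    Supra-critical recurrence (w > 1) plus slow negative feedback
    produces relaxation oscillations in the population rate."""
    r = a = 0.0
    out = []
    for _ in range(int(T / dt)):
        drive = np.clip(w * r + I - a, 0.0, r_max)
        r += dt * (-r + drive) / tau_r
        a += dt * (-a + b * r) / tau_a
        out.append(r)
    return np.array(out)

osc = simulate()          # recurrence + adaptation: slow up/down oscillation
flat = simulate(b=0.0)    # no adaptation: rate saturates and stays constant
```

The oscillation period here is set by `tau_a`, not by the synaptic or membrane timescales — the same separation the abstract invokes to explain rhythms far slower than the underlying neuronal dynamics.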
Affiliation(s)
- Moritz Augustin
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin Berlin, Germany ; Bernstein Center for Computational Neuroscience Berlin Berlin, Germany