1. Puttkammer F, Lindner B. Fluctuation-response relations for integrate-and-fire models with an absolute refractory period. Biol Cybern 2024; 118:7-19. [PMID: 38261004] [PMCID: PMC11068698] [DOI: 10.1007/s00422-023-00982-9]
Abstract
We study the problem of relating the spontaneous fluctuations of a stochastic integrate-and-fire (IF) model to the response of the instantaneous firing rate to time-dependent stimulation when the IF model is endowed with a non-vanishing refractory period and a finite (stereotypical) spike shape. This seemingly harmless addition to the model is shown to complicate the analysis put forward by Lindner (Phys. Rev. Lett., 2022), i.e., the incorporation of the reset into the model equation, the Rice-like averaging of the stochastic differential equation, and the application of the Furutsu-Novikov theorem. We derive a still exact (although more complicated) fluctuation-response relation (FRR) for an IF model with a refractory state and white Gaussian background noise. We also briefly discuss an approximation for the case of colored Gaussian noise and conclude with a summary and an outlook on open problems.
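The FRR derivation itself is the paper's contribution, but the model it concerns is straightforward to simulate. The following sketch (not the authors' code; all parameter values are illustrative assumptions) integrates a leaky IF neuron with white Gaussian noise and an absolute refractory period and estimates its stationary firing rate:

```python
import numpy as np

def simulate_lif_refractory(mu=1.2, D=0.1, tau_ref=0.5, v_thr=1.0, v_res=0.0,
                            dt=1e-3, T=500.0, seed=0):
    """Euler-Maruyama simulation of dv/dt = -v + mu + sqrt(2D)*xi(t)
    with an absolute refractory period tau_ref after each spike."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    noise = np.sqrt(2.0 * D * dt) * rng.standard_normal(n)
    v, refr, n_spikes = v_res, 0.0, 0
    for i in range(n):
        if refr > 0.0:                 # voltage clamped during refractoriness
            refr -= dt
            continue
        v += (mu - v) * dt + noise[i]
        if v >= v_thr:                 # threshold crossing: spike and reset
            n_spikes += 1
            v, refr = v_res, tau_ref
    return n_spikes / T                # estimate of the stationary firing rate

rate = simulate_lif_refractory()
```

The refractory period caps the rate at 1/tau_ref; for the suprathreshold drive chosen here, the estimate falls well below that bound.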
Affiliation(s)
- Friedrich Puttkammer
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany.
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany.
2. Ranft J, Lindner B. Theory of the asynchronous state of structured rotator networks and its application to recurrent networks of excitatory and inhibitory units. Phys Rev E 2023; 107:044306. [PMID: 37198857] [DOI: 10.1103/physreve.107.044306]
Abstract
Recurrently coupled oscillators that are sufficiently heterogeneous and/or randomly coupled can show asynchronous activity in which there are no significant correlations among the units of the network. The asynchronous state can nevertheless exhibit rich temporal correlation statistics that are generally difficult to capture theoretically. For randomly coupled rotator networks, it is possible to derive differential equations that determine the autocorrelation functions of the network noise and of the single elements in the network. So far, the theory has been restricted to statistically homogeneous networks, making it difficult to apply this framework to real-world networks, which are structured with respect to the properties of the single units and their connectivity. A particularly striking case is that of neural networks, for which one has to distinguish between excitatory and inhibitory neurons, which drive their target neurons towards or away from the firing threshold. To take such network structures into account, here we extend the theory for rotator networks to the case of multiple populations. Specifically, we derive a system of differential equations that governs the self-consistent autocorrelation functions of the network fluctuations in the respective populations. We then apply this general theory to the special but important case of recurrent networks of excitatory and inhibitory units in the balanced case and compare our theory to numerical simulations. We inspect the effect of the network structure on the noise statistics by comparing our results to the case of an equivalent homogeneous network devoid of internal structure. Our results show that structured connectivity and heterogeneity of the oscillator type can either enhance or reduce the overall strength of the generated network noise and shape its temporal correlations.
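The self-consistent theory cannot be compressed into a snippet, but its central object, the autocorrelation of the recurrent "network noise" felt by a single unit, can be measured directly in a simulation. The sketch below sets up a generic randomly coupled rotator network with excitatory and inhibitory columns; the coupling form (via sin θ) and all parameters are illustrative assumptions rather than the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(1)
N, g, dt, steps = 200, 1.5, 0.01, 20000
omega = rng.normal(1.0, 0.5, N)                    # heterogeneous frequencies
sign = np.where(np.arange(N) < N // 2, 1.0, -1.0)  # excitatory / inhibitory
J = sign[None, :] * np.abs(rng.standard_normal((N, N))) * g / np.sqrt(N)
np.fill_diagonal(J, 0.0)

theta = rng.uniform(0.0, 2.0 * np.pi, N)
net_input = np.empty(steps)
for i in range(steps):
    rec = J @ np.sin(theta)          # recurrent "network noise" to each unit
    theta += (omega + rec) * dt
    net_input[i] = rec[0]            # record the input to one unit

x = net_input - net_input.mean()
acf = np.correlate(x, x, mode="full")[steps - 1:] / steps   # autocorrelation
```

In the theory, the measured autocorrelation would have to agree self-consistently with the statistics of the input that drives each rotator.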
Affiliation(s)
- Jonas Ranft
- Institut de Biologie de l'ENS, Ecole Normale Supérieure, CNRS, Inserm, Université PSL, 46 rue d'Ulm, 75005 Paris, France
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
3. Wu T, Cai Y, Zhang R, Wang Z, Tao L, Xiao ZC. Multi-band oscillations emerge from a simple spiking network. Chaos 2023; 33:043121. [PMID: 37097932] [DOI: 10.1063/5.0106884]
Abstract
In the brain, coherent neuronal activities often appear simultaneously in multiple frequency bands, e.g., as combinations of alpha (8-12 Hz), beta (12.5-30 Hz), and gamma (30-120 Hz) oscillations, among others. These rhythms are believed to underlie information processing and cognitive functions and have been subjected to intense experimental and theoretical scrutiny. Computational modeling has provided a framework for the emergence of network-level oscillatory behavior from the interaction of spiking neurons. However, due to the strong nonlinear interactions between highly recurrent spiking populations, the interplay between cortical rhythms in multiple frequency bands has rarely been investigated theoretically. Many studies invoke multiple physiological timescales (e.g., various ion channels or multiple types of inhibitory neurons) or oscillatory inputs to produce multi-band rhythms. Here, we demonstrate the emergence of multi-band oscillations in a simple network consisting of one excitatory and one inhibitory neuronal population driven by constant input. First, we construct a data-driven, Poincaré-section theory for robust numerical observations of single-frequency oscillations bifurcating into multiple bands. Then, we develop model reductions of the stochastic, nonlinear, high-dimensional neuronal network to capture the appearance of multi-band dynamics and the underlying bifurcations theoretically. Furthermore, when viewed within the reduced state space, our analysis reveals conserved geometric features of the bifurcations on low-dimensional dynamical manifolds. These results suggest a simple geometric mechanism behind the emergence of multi-band oscillations without appealing to oscillatory inputs or to multiple synaptic or neuronal timescales. Thus, our work points to unexplored regimes of stochastic competition between excitation and inhibition behind the generation of dynamic, patterned neuronal activities.
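A minimal version of such a network, one excitatory and one inhibitory leaky IF population under constant drive, fits in a few lines; the sizes, weights, and sparsity below are illustrative guesses, and locating the multi-band regime itself requires the parameter scans and reductions described in the abstract:

```python
import numpy as np

rng = np.random.default_rng(2)
Ne, Ni = 300, 100                       # excitatory and inhibitory neurons
N, p = Ne + Ni, 0.2                     # connection probability
dt, steps = 1e-4, 20000                 # 2 s of simulated time
tau, v_thr, v_res, mu = 0.02, 1.0, 0.0, 1.3
J = np.zeros((N, N))
mask = rng.random((N, N)) < p
J[mask] = 0.02                          # excitatory synapses...
J[:, Ne:][mask[:, Ne:]] = -0.08         # ...overwritten in inhibitory columns
v = rng.uniform(0.0, 1.0, N)
pop = np.empty(steps)
for i in range(steps):
    spk = v >= v_thr
    v[spk] = v_res
    v += dt / tau * (mu - v) + J @ spk  # leaky integration plus delta synapses
    pop[i] = spk.sum()

# power spectrum of the population rate: peaks here mark network rhythms
spec = np.abs(np.fft.rfft(pop - pop.mean())) ** 2
```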
Affiliation(s)
- Tianyi Wu
- School of Mathematical Sciences, Peking University, Beijing 100871, China
- Center for Bioinformatics, National Laboratory of Protein Engineering and Plant Genetic Engineering, School of Life Sciences, Peking University, Beijing 100871, China
- Courant Institute of Mathematical Sciences, New York University, New York, New York 10003, USA
- Yuhang Cai
- Department of Mathematics, University of California, Berkeley, Berkeley, California 94720, USA
- Ruilin Zhang
- Center for Bioinformatics, National Laboratory of Protein Engineering and Plant Genetic Engineering, School of Life Sciences, Peking University, Beijing 100871, China
- Yuanpei College, Peking University, Beijing 100871, China
- Zhongyi Wang
- School of Mathematical Sciences, Peking University, Beijing 100871, China
- Center for Bioinformatics, National Laboratory of Protein Engineering and Plant Genetic Engineering, School of Life Sciences, Peking University, Beijing 100871, China
- Louis Tao
- Center for Bioinformatics, National Laboratory of Protein Engineering and Plant Genetic Engineering, School of Life Sciences, Peking University, Beijing 100871, China
- Center for Quantitative Biology, Peking University, Beijing 100871, China
- Zhuo-Cheng Xiao
- Courant Institute of Mathematical Sciences, New York University, New York, New York 10003, USA
4. Ranft J, Lindner B. A self-consistent analytical theory for rotator networks under stochastic forcing: Effects of intrinsic noise and common input. Chaos 2022; 32:063131. [PMID: 35778158] [DOI: 10.1063/5.0096000]
Abstract
Despite the incredible complexity of our brains' neural networks, theoretical descriptions of neural dynamics have led to profound insights into possible network states and dynamics. It remains challenging to develop theories that apply to spiking networks and thus allow one to characterize the dynamic properties of biologically more realistic networks. Here, we build on recent work by van Meegen and Lindner, who showed that "rotator networks," while considerably simpler than real spiking networks and therefore more amenable to mathematical analysis, still allow one to capture dynamical properties of networks of spiking neurons. This framework can easily be extended to the case where individual units receive uncorrelated stochastic input, which can be interpreted as intrinsic noise. However, the assumptions of the theory no longer apply when the input received by the single rotators is strongly correlated among units. As we show, in this case the network fluctuations become significantly non-Gaussian, which calls for a reworking of the theory. Using a cumulant expansion, we develop a self-consistent analytical theory that accounts for the observed non-Gaussian statistics. Our theory provides a starting point for further studies of more general network setups and of the information-transmission properties of these networks.
Affiliation(s)
- Jonas Ranft
- Institut de Biologie de l'ENS, Ecole Normale Supérieure, CNRS, Inserm, Université PSL, 46 rue d'Ulm, 75005 Paris, France
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Department of Physics, Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
5. Optimizing the Neural Structure and Hyperparameters of Liquid State Machines Based on Evolutionary Membrane Algorithm. Mathematics 2022. [DOI: 10.3390/math10111844]
Abstract
As one of the important fields of artificial intelligence, brain-like computing attempts to give machines a higher level of intelligence by studying and simulating the cognitive principles of the human brain. A spiking neural network (SNN) is one of the research directions of brain-like computing, characterized by greater biological plausibility and stronger computational power than traditional neural networks. A liquid state machine (LSM) is a neural computing model with a recurrent network structure based on an SNN. In this paper, a learning algorithm based on an evolutionary membrane algorithm is proposed to optimize the neural structure and hyperparameters of an LSM. First, the objects of the proposed algorithm are encoded according to the neural structure and hyperparameters of the LSM. Second, the reaction rules of the proposed algorithm are employed to discover the best neural structure and hyperparameters of the LSM. Third, a membrane structure in which the skin membrane contains several elementary membranes is used to speed up the search of the proposed algorithm. In the simulation experiments, the algorithm's effectiveness is verified on the MNIST and KTH datasets. On MNIST, the best test results of the proposed algorithm with 500, 1000 and 2000 spiking neurons are 86.8%, 90.6% and 90.8%, respectively. On KTH, the best test results with 500, 1000 and 2000 spiking neurons are 82.9%, 85.3% and 86.3%, respectively. The simulation results show that the proposed algorithm is more competitive than the other algorithms in the experiments.
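Training an actual LSM is beyond a short example, so the sketch below replaces the LSM evaluation with a hypothetical stand-in fitness function; what it illustrates is the membrane-algorithm loop described above: several elementary membranes evolve candidate hyperparameter vectors via mutation-based reaction rules, and the skin membrane collects the best candidate. All names and numbers are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def fitness(h):
    """Hypothetical stand-in for LSM test accuracy: peaks at a hidden
    'optimal' hyperparameter vector (e.g. density, learning rate, scale)."""
    target = np.array([0.3, 0.05, 1.5])
    return -np.sum((h - target) ** 2)

n_membranes, pop_size, dims, gens = 4, 10, 3, 100
# each elementary membrane evolves its own population of candidates
membranes = [rng.uniform(0.0, 2.0, (pop_size, dims)) for _ in range(n_membranes)]
for _ in range(gens):
    for m in membranes:
        # reaction rule: mutate every candidate, keep the better of parent/child
        child = m + rng.normal(0.0, 0.1, m.shape)
        improved = np.array([fitness(c) > fitness(p) for c, p in zip(child, m)])
        m[improved] = child[improved]

# the skin membrane collects the best candidate of all elementary membranes
best = max(np.vstack(membranes), key=fitness)
```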
6. Knoll G, Lindner B. Information transmission in recurrent networks: Consequences of network noise for synchronous and asynchronous signal encoding. Phys Rev E 2022; 105:044411. [PMID: 35590546] [DOI: 10.1103/physreve.105.044411]
Abstract
Information about natural time-dependent stimuli encoded by the sensory periphery or communication between cortical networks may span a large frequency range or be localized to a smaller frequency band. Biological systems have been shown to multiplex such disparate broadband and narrow-band signals and then discriminate them in later populations by employing either an integration (low-pass) or coincidence detection (bandpass) encoding strategy. Analytical expressions have been developed for both encoding methods in feedforward populations of uncoupled neurons and confirm that the integration of a population's output low-pass filters the information, whereas synchronous output encodes less information overall and retains signal information in a selected frequency band. The present study extends the theory to recurrent networks and shows that recurrence may sharpen the synchronous bandpass filter. The frequency of the pass band is significantly influenced by the synaptic strengths, especially for inhibition-dominated networks. Synchronous information transfer is also increased when network models take into account heterogeneity that arises from the stochastic distribution of the synaptic weights.
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
7. Kullmann R, Knoll G, Bernardi D, Lindner B. Critical current for giant Fano factor in neural models with bistable firing dynamics and implications for signal transmission. Phys Rev E 2022; 105:014416. [PMID: 35193262] [DOI: 10.1103/physreve.105.014416]
Abstract
Bistability in the firing rate is a prominent feature of different types of neurons as well as of neural networks. We show that for a constant input below a critical value, such bistability can lead to a giant spike-count diffusion. We study the transmission of a periodic signal and demonstrate that close to the critical bias current, the signal-to-noise ratio undergoes a sharp increase, an effect that can be traced back to the giant diffusion and large Fano factor.
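The giant spike-count diffusion can be illustrated without the full neural model: a Bernoulli spike generator whose rate switches slowly between two metastable values already yields a Fano factor far above the Poisson value of one. The rates below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
r_lo, r_hi = 1.0, 20.0         # firing rates of the two metastable states
k_sw = 0.05                    # slow switching rate between the states
dt, T, trials = 1e-3, 50.0, 400
n_steps = int(T / dt)
counts = np.empty(trials)
for i in range(trials):
    jumps = rng.random(n_steps) < k_sw * dt       # rare state switches
    state = np.cumsum(jumps) % 2                  # telegraph process 0/1
    rate = np.where(state == 1, r_hi, r_lo)
    counts[i] = (rng.random(n_steps) < rate * dt).sum()  # Bernoulli spiking

fano = counts.var() / counts.mean()               # Fano factor of the counts
```

For a plain Poisson process the Fano factor is one; the slow bistable rate modulation inflates it by orders of magnitude, which is the regime in which the paper locates the sharp change in signal transmission.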
Affiliation(s)
- Richard Kullmann
- Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
- Davide Bernardi
- Center for Translational Neurophysiology of Speech and Communication, Fondazione Istituto Italiano di Tecnologia, via Fossato di Mortara 19, 44121 Ferrara, Italy
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
8. The Mean Field Approach for Populations of Spiking Neurons. Adv Exp Med Biol 2022; 1359:125-157. [DOI: 10.1007/978-3-030-89439-9_6]
Abstract
Mean field theory is a device to analyze the collective behavior of a dynamical system comprising many interacting particles. The theory allows one to reduce the behavior of the system to the properties of a handful of parameters. In neural circuits, these parameters are typically the firing rates of distinct, homogeneous subgroups of neurons. Knowledge of the firing rates under conditions of interest can reveal essential information on both the dynamics of neural circuits and the way they can subserve brain function. The goal of this chapter is to provide an elementary introduction to the mean field approach for populations of spiking neurons. We introduce the general idea in networks of binary neurons, starting from the most basic results and then generalizing to more relevant situations. This allows us to derive the mean field equations in a simplified setting. We then derive the mean field equations for populations of integrate-and-fire neurons. An effort is made to derive the main equations of the theory using only elementary methods from calculus and probability theory. The chapter ends with a discussion of the assumptions of the theory and some of the consequences of violating those assumptions. This discussion includes an introduction to balanced and metastable networks and a brief catalogue of successful applications of the mean field approach to the study of neural circuits.
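The core of the approach is a self-consistency condition of the form r = φ(J r + I): the population rate must equal the rate implied by the mean input that the population generates for itself. A minimal sketch, with a sigmoidal gain standing in (as an assumption) for the actual transfer function of binary or integrate-and-fire neurons:

```python
import math

def transfer(mu, theta=1.0, beta=4.0):
    """Sigmoidal gain, an assumed stand-in for the population transfer function."""
    return 1.0 / (1.0 + math.exp(-beta * (mu - theta)))

def mean_field_rate(J=0.8, I_ext=0.6, r0=0.5, iters=200):
    """Iterate the self-consistency r = transfer(J*r + I_ext) to a fixed point."""
    r = r0
    for _ in range(iters):
        r = transfer(J * r + I_ext)
    return r

r_star = mean_field_rate()
```

For these parameters the map is a contraction, so the fixed-point iteration converges; in general, mean field equations can have multiple solutions (e.g. in bistable or metastable networks).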
9. Ramlow L, Lindner B. Interspike interval correlations in neuron models with adaptation and correlated noise. PLoS Comput Biol 2021; 17:e1009261. [PMID: 34449771] [PMCID: PMC8428727] [DOI: 10.1371/journal.pcbi.1009261]
Abstract
The generation of neural action potentials (spikes) is random but nevertheless may result in a rich statistical structure of the spike sequence. In particular, contrary to the popular renewal assumption of theoreticians, the intervals between adjacent spikes are often correlated. Experimentally, different patterns of interspike-interval correlations have been observed and computational studies have identified spike-frequency adaptation and correlated noise as the two main mechanisms that can lead to such correlations. Analytical studies have focused on the single cases of either correlated (colored) noise or adaptation currents in combination with uncorrelated (white) noise. For low-pass filtered noise or adaptation, the serial correlation coefficient can be approximated as a single geometric sequence of the lag between the intervals, providing an explanation for some of the experimentally observed patterns. Here we address the problem of interval correlations for a widely used class of models, multidimensional integrate-and-fire neurons subject to a combination of colored and white noise sources and a spike-triggered adaptation current. Assuming weak noise, we derive a simple formula for the serial correlation coefficient, a sum of two geometric sequences, which accounts for a large class of correlation patterns. The theory is confirmed by means of numerical simulations in a number of special cases including the leaky, quadratic, and generalized integrate-and-fire models with colored noise and spike-frequency adaptation. Furthermore we study the case in which the adaptation current and the colored noise share the same time scale, corresponding to a slow stochastic population of adaptation channels; we demonstrate that our theory can account for a nonmonotonic dependence of the correlation coefficient on the channel’s time scale. Another application of the theory is a neuron driven by network-noise-like fluctuations (green noise). 
We also discuss the range of validity of our weak-noise theory and show that, by changing the relative strength of the white and colored noise sources, we can change the sign of the correlation coefficient. Finally, we apply our theory to a conductance-based model, which demonstrates its broad applicability.

The elementary processing units in the central nervous system are neurons that transmit information by short electrical pulses, so-called action potentials or spikes. The generation of the action potential is a random process that can be shaped by correlated fluctuations (colored noise) and by adaptation. A consequence of these two ubiquitous features is that the successive time intervals between spikes, the interspike intervals, are not independent but correlated. As these correlations can significantly improve information transmission and weak-signal detection, it is an important task to develop analytical approaches to these statistics for well-established computational models. Here we present a theory of interval correlations for a widely used class of integrate-and-fire models endowed with an adaptation mechanism and subject to correlated fluctuations. We demonstrate which patterns of interval correlations can be expected from the interplay of colored noise, adaptation and intrinsic nonlinear dynamics.
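The statistic at the heart of the paper, the serial correlation coefficient (SCC) of interspike intervals, is easy to estimate from a simulated spike train. The sketch below uses a perfect (non-leaky) IF neuron with white noise and a spike-triggered adaptation current, with illustrative parameters; adaptation combined with white noise is expected to yield a negative SCC at lag one:

```python
import numpy as np

rng = np.random.default_rng(5)
mu, D = 2.0, 0.2               # mean drive and white-noise intensity
delta_a, tau_a = 1.0, 2.0      # adaptation kick and time constant
dt, n_isi = 1e-3, 1000
v, a, t, t_last = 0.0, 0.0, 0.0, 0.0
isis = []
while len(isis) < n_isi:
    v += (mu - a) * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal()
    a -= a / tau_a * dt
    t += dt
    if v >= 1.0:               # threshold crossing
        v = 0.0
        a += delta_a           # spike-triggered adaptation
        isis.append(t - t_last)
        t_last = t

isis = np.asarray(isis[20:])   # discard the initial transient
x = isis - isis.mean()
scc1 = (x[:-1] * x[1:]).mean() / x.var()   # serial correlation at lag 1
```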
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
10. Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2020; 102:022407. [PMID: 32942450] [DOI: 10.1103/physreve.102.022407]
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
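The key ingredient is the Laplace transform p̂(s) of the interspike-interval density, which is explicit for simple renewal models. The sketch below uses a Poisson neuron with absolute refractoriness (dead time τ_ref, rate r afterwards; both numbers illustrative), checks the closed-form transform against direct numerical integration, and recovers the stationary rate from the mean interval:

```python
import numpy as np

r, tau_ref = 10.0, 0.2        # illustrative rate and absolute dead time

def isi_density(t):
    """ISI density of a Poisson neuron with absolute refractoriness."""
    return np.where(t >= tau_ref, r * np.exp(-r * (t - tau_ref)), 0.0)

def p_hat(s):
    """Closed-form Laplace transform of the ISI density."""
    return r * np.exp(-s * tau_ref) / (s + r)

t = np.linspace(0.0, 5.0, 200001)
h_step = t[1] - t[0]
numeric = np.sum(isi_density(t) * np.exp(-2.0 * t)) * h_step   # p_hat at s = 2
mean_isi = np.sum(t * isi_density(t)) * h_step                 # = tau_ref + 1/r
rate0 = 1.0 / mean_isi                                         # stationary rate
```

The normalization p̂(0) = 1 corresponds to the stationary mode; in theories of this type, the nontrivial eigenvalues governing the transient firing-rate dynamics are obtained from a characteristic equation involving p̂.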
Affiliation(s)
- Bastian Pietras
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice
- Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
11. Knoll G, Lindner B. Recurrence-mediated suprathreshold stochastic resonance. J Comput Neurosci 2021; 49:407-418. [PMID: 34003421] [PMCID: PMC8556192] [DOI: 10.1007/s10827-021-00788-3]
Abstract
It has previously been shown that the encoding of time-dependent signals by feedforward networks (FFNs) of processing units exhibits suprathreshold stochastic resonance (SSR), i.e., signal transmission that is optimal at a finite level of independent, individual stochasticity in the single units. In this study, a recurrent spiking network is simulated to demonstrate that SSR can also be caused by network noise in place of intrinsic noise. The level of autonomously generated fluctuations in the network can be controlled by the strength of the synapses, and hence the coding fraction (our measure of information transmission) exhibits a maximum as a function of the synaptic coupling strength. The presence of a coding peak at an optimal coupling strength is robust over a wide range of individual, network, and signal parameters, although the optimal strength and peak magnitude depend on the parameter being varied. We also perform control experiments with an FFN illustrating that the optimized coding fraction is due to the change in noise level and not to other effects entailed by changing the coupling strength. These results also indicate that the non-white (temporally correlated) network noise in general provides an extra boost to encoding performance compared to the FFN driven by intrinsic white-noise fluctuations.
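The feedforward baseline that the control experiments refer to is classic SSR: a population of identical threshold units, each corrupted by independent noise, transmits a common signal best at an intermediate noise level. A minimal sketch, using the linear correlation between signal and summed output as a crude stand-in for the coding fraction (all parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
N, T = 64, 20000
s = rng.standard_normal(T)            # common Gaussian signal; thresholds at 0

def coding_corr(sigma):
    """Correlation of the signal with the summed binary output of N units,
    each corrupted by independent noise of strength sigma."""
    noise = sigma * rng.standard_normal((N, T))
    y = (s[None, :] + noise > 0.0).sum(axis=0).astype(float)
    return np.corrcoef(s, y)[0, 1]

c_low, c_mid, c_high = coding_corr(1e-3), coding_corr(1.0), coding_corr(50.0)
```

With almost no noise, all units respond identically (a one-bit code); with intermediate noise, the summed output encodes the signal in a graded fashion, which is the SSR maximum; with huge noise, the signal is swamped.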
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
12. An Integrate-and-Fire Spiking Neural Network Model Simulating Artificially Induced Cortical Plasticity. eNeuro 2021; 8:ENEURO.0333-20.2021. [PMID: 33632810] [PMCID: PMC7986529] [DOI: 10.1523/eneuro.0333-20.2021]
Abstract
We describe an integrate-and-fire (IF) spiking neural network that incorporates spike-timing-dependent plasticity (STDP) and simulates the experimental outcomes of four different conditioning protocols that produce cortical plasticity. The original conditioning experiments were performed in freely moving non-human primates (NHPs) with an autonomous head-fixed bidirectional brain-computer interface (BCI). Three protocols involved closed-loop stimulation triggered from (1) spike activity of single cortical neurons, (2) electromyographic (EMG) activity from forearm muscles, and (3) cycles of spontaneous cortical beta activity. A fourth protocol involved open-loop delivery of pairs of stimuli at neighboring cortical sites. The IF network that replicates the experimental results consists of 360 units with simulated membrane potentials produced by synaptic inputs and triggering a spike when reaching threshold. The 240 cortical units produce either excitatory or inhibitory postsynaptic potentials (PSPs) in their target units. In addition to the experimentally observed conditioning effects, the model also allows computation of underlying network behavior not originally documented. Furthermore, the model makes predictions about outcomes from protocols not yet investigated, including spike-triggered inhibition, γ-triggered stimulation and disynaptic conditioning. The success of the simulations suggests that a simple voltage-based IF model incorporating STDP can capture the essential mechanisms mediating targeted plasticity with closed-loop stimulation.
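A central ingredient of such a model is pair-based STDP. The window function below uses illustrative amplitudes and time constant, not the paper's fitted values; it encodes the usual sign convention (pre-before-post potentiates, post-before-pre depresses), and summing it over all spike pairs of two toy trains gives the net weight change:

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=0.02):
    """Pair-based STDP window: dt = t_post - t_pre (in seconds).
    Positive dt (pre before post) potentiates, negative dt depresses."""
    return np.where(dt > 0, a_plus * np.exp(-dt / tau),
                    -a_minus * np.exp(dt / tau))

# toy spike trains in which the presynaptic neuron leads by 5 ms
pre = np.array([0.010, 0.050, 0.090])
post = pre + 0.005
dw = sum(stdp_dw(tp - tq) for tp in post for tq in pre)   # net weight change
```

Because the short pre-to-post latencies dominate the pairing, the net change is potentiation, the mechanism by which closed-loop, spike-triggered stimulation strengthens targeted connections in the model.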
13. Bernardi D, Doron G, Brecht M, Lindner B. A network model of the barrel cortex combined with a differentiator detector reproduces features of the behavioral response to single-neuron stimulation. PLoS Comput Biol 2021; 17:e1007831. [PMID: 33556070] [PMCID: PMC7895413] [DOI: 10.1371/journal.pcbi.1007831]
Abstract
The stimulation of a single neuron in the rat somatosensory cortex can elicit a behavioral response. The probability of a behavioral response does not depend appreciably on the duration or intensity of a constant stimulation, whereas the response probability increases significantly upon injection of an irregular current. Biological mechanisms that can potentially suppress a constant input signal are present in the dynamics of both neurons and synapses and seem ideal candidates to explain these experimental findings. Here, we study a large network of integrate-and-fire neurons with several salient features of neuronal populations in the rat barrel cortex. The model includes cellular spike-frequency adaptation, experimentally constrained numbers and types of chemical synapses endowed with short-term plasticity, and gap junctions. Numerical simulations of this model indicate that cellular and synaptic adaptation mechanisms alone may not suffice to account for the experimental results if the local network activity is read out by an integrator. However, a circuit that approximates a differentiator can detect the single-cell stimulation with a reliability that barely depends on the length or intensity of the stimulus, but that increases when an irregular signal is used. This finding is in accordance with the experimental results obtained for the stimulation of a regularly-spiking excitatory cell.

It is widely assumed that only a large group of neurons can encode a stimulus or control behavior. This tenet of neuroscience has been challenged by experiments in which stimulating a single cortical neuron has had a measurable effect on an animal's behavior. Recently, theoretical studies have explored how a single-neuron stimulation could be detected in a large recurrent network. However, these studies omitted essential biological mechanisms of cortical networks and cannot explain more recent experiments in the barrel cortex.
Here, to describe the stimulated brain area, we propose and study a network model endowed with many important biological features of the barrel cortex. Importantly, we also investigate different readout mechanisms, i.e. ways in which the stimulation effects can propagate to other brain areas. We show that a readout network which tracks rapid variations in the local network activity is in agreement with the experiments. Our model demonstrates a possible mechanism for how the stimulation of a single neuron translates into a signal at the population level, which is taken as a proxy of the animal’s response. Our results illustrate the power of spiking neural networks to properly describe the effects of a single neuron’s activity.
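The integrator-versus-differentiator contrast described above can be sketched with a toy population-rate trace. The baseline rate, step amplitude, noise level, and window lengths below are illustrative assumptions, not values from the authors' barrel-cortex model:

```python
import numpy as np

rng = np.random.default_rng(2)
dt = 1e-3
t = np.arange(0.0, 1.0, dt)

def population_rate(stim_on, stim_dur=0.3, step=0.5):
    """Toy local-network rate: baseline noise plus a small step while a
    hypothetical single-cell stimulation is on (all numbers illustrative)."""
    rate = 10.0 + rng.normal(0.0, 0.2, t.size)
    if stim_on:
        rate[(t > 0.2) & (t < 0.2 + stim_dur)] += step
    return rate

def integrator_readout(rate):
    # Total activity in the window: grows with stimulus duration.
    return rate.sum() * dt

def differentiator_readout(rate, w=0.05):
    # Largest change between adjacent w-long windows: sensitive to the
    # onset transient, hence largely independent of stimulus duration.
    k = int(w / dt)
    smooth = np.convolve(rate, np.ones(k) / k, mode="valid")
    return np.max(np.abs(smooth[k:] - smooth[:-k]))

r_short = population_rate(True, stim_dur=0.1)
r_long = population_rate(True, stim_dur=0.4)
r_off = population_rate(False)

print(integrator_readout(r_long) - integrator_readout(r_off))  # scales with duration
print(differentiator_readout(r_short), differentiator_readout(r_long),
      differentiator_readout(r_off))  # short stim ~ long stim, both >> no stim
```

With these toy numbers the differentiator responds to the onset transient and so gives nearly the same output for short and long stimuli, matching the duration-insensitivity reported in the abstract.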
Affiliation(s)
- Davide Bernardi: Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany; Center for Translational Neurophysiology of Speech and Communication, Fondazione Istituto Italiano di Tecnologia, Ferrara, Italy
- Guy Doron: Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Michael Brecht: Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Benjamin Lindner: Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany
14
Cofré R, Maldonado C, Cessac B. Thermodynamic Formalism in Neuronal Dynamics and Spike Train Statistics. Entropy (Basel) 2020; 22:E1330. [PMID: 33266513] [PMCID: PMC7712217] [DOI: 10.3390/e22111330] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Received: 10/09/2020] [Revised: 11/13/2020] [Accepted: 11/15/2020] [Indexed: 12/04/2022]
Abstract
The Thermodynamic Formalism provides a rigorous mathematical framework for studying quantitative and qualitative aspects of dynamical systems. At its core, there is a variational principle that corresponds, in its simplest form, to the Maximum Entropy principle. It is used as a statistical inference procedure to represent, by specific probability measures (Gibbs measures), the collective behaviour of complex systems. This framework has found applications in different domains of science. In particular, it has been fruitful and influential in neurosciences. In this article, we review how the Thermodynamic Formalism can be exploited in the field of theoretical neuroscience, as a conceptual and operational tool, in order to link the dynamics of interacting neurons and the statistics of action potentials from either experimental data or mathematical models. We comment on perspectives and open problems in theoretical neuroscience that could be addressed within this formalism.
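As a minimal, purely illustrative instance of the Maximum Entropy principle reviewed above, one can compare the empirical frequency of a joint spike pattern in a synthetic raster with the prediction of the simplest Gibbs measure, constrained only by single-neuron firing rates (i.e., an independent model). All numbers below are assumptions for the sketch, not data from the article:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic raster: 3 neurons, 5000 time bins; a shared input creates correlations.
n, T = 3, 5000
common = rng.random(T) < 0.2
spikes = (rng.random((n, T)) < 0.1) | common

# Simplest max-ent (Gibbs) model: match only the firing rates p_i,
# which yields independent Bernoulli units.
p = spikes.mean(axis=1)

def p_indep(word):
    """Probability of a binary pattern under the rate-matched independent model."""
    return float(np.prod(np.where(word, p, 1 - p)))

word = np.ones(n, dtype=bool)          # the all-neurons-active pattern
p_emp = np.all(spikes, axis=0).mean()  # empirical frequency of that pattern
print(p_emp, p_indep(word))            # shared input makes p_emp exceed the prediction
```

The mismatch between the two numbers is what motivates richer Gibbs measures (e.g., with pairwise constraints) in the inference procedures discussed in the review.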
Affiliation(s)
- Rodrigo Cofré: CIMFAV-Ingemat, Facultad de Ingeniería, Universidad de Valparaíso, Valparaíso 2340000, Chile
- Cesar Maldonado: IPICYT/División de Matemáticas Aplicadas, San Luis Potosí 78216, Mexico
- Bruno Cessac: Inria Biovision team and Neuromod Institute, Université Côte d’Azur, 06901 CEDEX Inria, France
15
Bachmann C, Tetzlaff T, Duarte R, Morrison A. Firing rate homeostasis counteracts changes in stability of recurrent neural networks caused by synapse loss in Alzheimer's disease. PLoS Comput Biol 2020; 16:e1007790. [PMID: 32841234] [PMCID: PMC7505475] [DOI: 10.1371/journal.pcbi.1007790] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Received: 03/26/2019] [Revised: 09/21/2020] [Accepted: 03/17/2020] [Indexed: 11/19/2022]
Abstract
The impairment of cognitive function in Alzheimer's disease is clearly correlated to synapse loss. However, the mechanisms underlying this correlation are only poorly understood. Here, we investigate how the loss of excitatory synapses in sparsely connected random networks of spiking excitatory and inhibitory neurons alters their dynamical characteristics. Beyond the effects on the activity statistics, we find that the loss of excitatory synapses on excitatory neurons reduces the network's sensitivity to small perturbations. This decrease in sensitivity can be considered as an indication of a reduction of computational capacity. A full recovery of the network's dynamical characteristics and sensitivity can be achieved by firing rate homeostasis, here implemented by an up-scaling of the remaining excitatory-excitatory synapses. Mean-field analysis reveals that the stability of the linearised network dynamics is, in good approximation, uniquely determined by the firing rate, and thereby explains why firing rate homeostasis preserves not only the firing rate but also the network's sensitivity to small perturbations.
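The up-scaling form of firing rate homeostasis described above has a compact matrix formulation: delete a fraction of the excitatory-excitatory synapses, then rescale the survivors so that each neuron's summed excitatory input is restored. A minimal sketch, in which connection probability, weight, and loss fraction are illustrative choices rather than the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
n, eps, j = 400, 0.1, 0.5            # neurons, connection prob, weight (toy values)
W = (rng.random((n, n)) < eps) * j   # excitatory-to-excitatory weight matrix

def prune_and_rescale(W, p_loss):
    """Synapse loss followed by homeostatic up-scaling: each row's
    (postsynaptic neuron's) total input is restored exactly, unless a
    neuron happens to lose every one of its synapses."""
    target = W.sum(axis=1)
    Wp = W * (rng.random(W.shape) >= p_loss)   # random synapse deletion
    row = Wp.sum(axis=1)
    scale = np.divide(target, row, out=np.ones_like(row), where=row > 0)
    return Wp * scale[:, None]

W2 = prune_and_rescale(W, p_loss=0.5)
print((W2 > 0).sum() / (W > 0).sum())              # about half the synapses survive
print(np.allclose(W.sum(axis=1), W2.sum(axis=1)))  # but summed inputs are preserved
```

This captures only the weight bookkeeping; the paper's mean-field result is that restoring this summed input (and hence the firing rate) also restores the linearized network's stability.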
Affiliation(s)
- Claudia Bachmann: Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Tom Tetzlaff: Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Renato Duarte: Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Abigail Morrison: Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute of Cognitive Neuroscience, Faculty of Psychology, Ruhr-University Bochum, Bochum, Germany
16
Bernardi D, Lindner B. Receiver operating characteristic curves for a simple stochastic process that carries a static signal. Phys Rev E 2020; 101:062132. [PMID: 32688497] [DOI: 10.1103/physreve.101.062132] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 04/17/2020] [Accepted: 05/14/2020] [Indexed: 11/07/2022]
Abstract
The detection of a weak signal in the presence of noise is an important problem that is often studied in terms of the receiver operating characteristic (ROC) curve, in which the probability of correct detection is plotted against the probability for a false positive. This kind of analysis is typically applied to the situation in which signal and noise are stochastic variables; the detection problem emerges, however, also often in a context in which both signal and noise are stochastic processes and the (correct or false) detection is said to take place when the process crosses a threshold in a given time window. Here we consider the problem for a combination of a static signal which has to be detected against a dynamic noise process, the well-known Ornstein-Uhlenbeck process. We give exact (but difficult to evaluate) quadrature expressions for the detection rates for false positives and correct detections, investigate systematically a simple sampling approximation suggested earlier, compare to an approximation by Stratonovich for the limit of high threshold, and briefly explore the case of multiplicative signal; all theoretical results are compared to extensive numerical simulations of the corresponding Langevin equation. Our results demonstrate that the sampling approximation provides a reasonable description of the ROC curve for this system, and it clarifies limit cases for the ROC curve.
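The detection setup above is easy to explore numerically: integrate the Ornstein-Uhlenbeck noise with an Euler-Maruyama scheme, add the static signal, and count the trials in which the sum crosses the threshold within the window; one fixed threshold then yields one point of the ROC curve. All parameter values below are arbitrary illustrations, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def detection_prob(theta, signal, n_trials=2000, T=2.0, dt=1e-3, tau=1.0, D=0.5):
    """Fraction of trials in which x(t) = signal + OU noise exceeds theta
    somewhere in [0, T]; the OU process starts in its stationary state."""
    x = rng.normal(0.0, np.sqrt(D * tau), n_trials)   # stationary variance D*tau
    detected = x + signal > theta
    for _ in range(int(T / dt)):
        x += -x / tau * dt + np.sqrt(2.0 * D * dt) * rng.normal(size=n_trials)
        detected |= x + signal > theta
    return detected.mean()

# One ROC point: hit rate (signal present) vs. false-positive rate (signal absent).
theta = 2.0
p_hit = detection_prob(theta, signal=0.8)
p_fa = detection_prob(theta, signal=0.0)
print(p_hit, p_fa)   # the hit rate lies well above the false-positive rate
```

Sweeping `theta` and replotting `p_hit` against `p_fa` traces out the full ROC curve that the paper treats with quadrature expressions and the sampling approximation.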
Affiliation(s)
- Davide Bernardi: Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, 12489 Berlin, Germany
- Benjamin Lindner: Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, 12489 Berlin, Germany
17
Ponzi A, Barton SJ, Bunner KD, Rangel-Barajas C, Zhang ES, Miller BR, Rebec GV, Kozloski J. Striatal network modeling in Huntington's Disease. PLoS Comput Biol 2020; 16:e1007648. [PMID: 32302302] [PMCID: PMC7197869] [DOI: 10.1371/journal.pcbi.1007648] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Received: 07/02/2019] [Revised: 05/04/2020] [Accepted: 01/09/2020] [Indexed: 12/26/2022]
Abstract
Medium spiny neurons (MSNs) comprise over 90% of cells in the striatum. In vivo MSNs display coherent burst firing cell assembly activity patterns, even though isolated MSNs do not burst fire intrinsically. This activity is important for the learning and execution of action sequences and is characteristically dysregulated in Huntington's Disease (HD). However, how dysregulation is caused by the various neural pathologies affecting MSNs in HD is unknown. Previous modeling work using simple cell models has shown that cell assembly activity patterns can emerge as a result of MSN inhibitory network interactions. Here, by directly estimating MSN network model parameters from single unit spiking data, we show that a network composed of much more physiologically detailed MSNs provides an excellent quantitative fit to wild type (WT) mouse spiking data, but only when network parameters are appropriate for the striatum. We find the WT MSN network is situated in a regime close to a transition from stable to strongly fluctuating network dynamics. This regime facilitates the generation of low-dimensional slowly varying coherent activity patterns and confers high sensitivity to variations in cortical driving. By re-estimating the model on HD spiking data we discover network parameter modifications are consistent across three very different types of HD mutant mouse models (YAC128, Q175, R6/2). In striking agreement with the known pathophysiology we find feedforward excitatory drive is reduced in HD compared to WT mice, while recurrent inhibition also shows phenotype dependency. We show that these modifications shift the HD MSN network to a sub-optimal regime where higher dimensional incoherent rapidly fluctuating activity predominates. Our results provide insight into a diverse range of experimental findings in HD, including cognitive and motor symptoms, and may suggest new avenues for treatment.
Affiliation(s)
- Adam Ponzi: IBM Research, Computational Biology Center, Thomas J. Watson Research Laboratories, Yorktown Heights, New York, United States of America
- Scott J. Barton: Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Kendra D. Bunner: Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Claudia Rangel-Barajas: Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Emily S. Zhang: Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Benjamin R. Miller: Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- George V. Rebec: Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- James Kozloski: IBM Research, Computational Biology Center, Thomas J. Watson Research Laboratories, Yorktown Heights, New York, United States of America
18
Optimal Interplay between Synaptic Strengths and Network Structure Enhances Activity Fluctuations and Information Propagation in Hierarchical Modular Networks. Brain Sci 2020; 10:brainsci10040228. [PMID: 32290351] [PMCID: PMC7226268] [DOI: 10.3390/brainsci10040228] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Received: 02/20/2020] [Revised: 04/03/2020] [Accepted: 04/04/2020] [Indexed: 01/21/2023]
Abstract
In network models of spiking neurons, the joint impact of network structure and synaptic parameters on activity propagation is still an open problem. Here, we use an information-theoretical approach to investigate activity propagation in spiking networks with a hierarchical modular topology. We observe that optimized pairwise information propagation emerges due to the increase of either (i) the global synaptic strength parameter or (ii) the number of modules in the network, while the network size remains constant. At the population level, information propagation of activity among adjacent modules is enhanced as the number of modules increases until a maximum value is reached and then decreases, showing that there is an optimal interplay between synaptic strength and modularity for population information flow. This is in contrast to information propagation evaluated among pairs of neurons, which attains maximum value at the maximum values of these two parameter ranges. By examining the network behavior under the increase of synaptic strength and the number of modules, we find that these increases are associated with two different effects: (i) the increase of autocorrelations among individual neurons and (ii) the increase of cross-correlations among pairs of neurons. The second effect is associated with better information propagation in the network. Our results suggest roles that link topological features and synaptic strength levels to the transmission of information in cortical networks.
19
Nobukawa S, Nishimura H, Yamanishi T. Temporal-specific complexity of spiking patterns in spontaneous activity induced by a dual complex network structure. Sci Rep 2019; 9:12749. [PMID: 31484990] [PMCID: PMC6726653] [DOI: 10.1038/s41598-019-49286-8] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Received: 11/14/2018] [Accepted: 08/22/2019] [Indexed: 11/08/2022]
Abstract
Temporal fluctuation of neural activity in the brain has an important function in optimal information processing. Spontaneous activity is a source of such fluctuation. The distribution of excitatory postsynaptic potentials (EPSPs) between cortical pyramidal neurons can follow a log-normal distribution. Recent studies have shown that networks connected by weak synapses exhibit characteristics of a random network, whereas networks connected by strong synapses have small-world characteristics of small path lengths and large cluster coefficients. To investigate the relationship between the temporal complexity of spontaneous activity and the structural duality of synaptic connections, we executed a simulation study using a leaky integrate-and-fire spiking neural network with a log-normal synaptic weight distribution for the EPSPs and a duality of synaptic connectivity that depends on synaptic weight. We conducted multiscale entropy analysis of the temporal spiking activity. Our simulation demonstrated that, when strong synaptic connections approach a small-world network, specific spiking patterns arise during irregular spatio-temporal spiking activity, and the complexity at the large temporal scale (i.e., slow frequency) is enhanced. Moreover, we confirmed through a surrogate data analysis that slow temporal dynamics reflect a deterministic process in the spiking neural networks. This modelling approach may improve the understanding of the spatio-temporal complex neural activity in the brain.
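Multiscale entropy analysis of the kind used above coarse-grains the signal by non-overlapping averaging and computes sample entropy at each scale, with the tolerance fixed from the original series (the convention of Costa et al.). A compact sketch on white noise, for which the curve is expected to fall with scale:

```python
import numpy as np

def sample_entropy(x, m=2, tol=0.2):
    """SampEn: -log of the conditional probability that templates matching
    for m points (Chebyshev distance <= tol) also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    def n_matches(mm):
        templ = np.lib.stride_tricks.sliding_window_view(x, mm)[:len(x) - m]
        d = np.abs(templ[:, None, :] - templ[None, :, :]).max(axis=2)
        return (d <= tol).sum() - len(templ)   # exclude self-matches
    return -np.log(n_matches(m + 1) / n_matches(m))

def multiscale_entropy(x, scales=(1, 2, 4, 8), m=2, r=0.2):
    tol = r * np.std(x)   # tolerance fixed from the original series
    out = []
    for s in scales:
        n = (len(x) // s) * s
        coarse = x[:n].reshape(-1, s).mean(axis=1)   # non-overlapping averages
        out.append(sample_entropy(coarse, m, tol))
    return out

rng = np.random.default_rng(4)
mse = multiscale_entropy(rng.normal(size=1000))
print(mse)   # decreases with scale: coarse-graining removes white-noise variance
```

For signals with genuine slow structure, as in the paper's small-world regime, the entropy at large scales stays high instead of decaying, which is the signature the authors exploit.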
Affiliation(s)
- Sou Nobukawa: Department of Computer Science, Chiba Institute of Technology, 2-17-1 Tsudanuma, Narashino, Chiba, 275-0016, Japan
- Haruhiko Nishimura: Graduate School of Applied Informatics, University of Hyogo, 7-1-28 Chuo-ku, Kobe, Hyogo, 650-8588, Japan
- Teruya Yamanishi: AI & IoT Center, Department of Management and Information Sciences, Fukui University of Technology, 3-6-1 Gakuen, Fukui, 910-8505, Japan
20
La Camera G, Fontanini A, Mazzucato L. Cortical computations via metastable activity. Curr Opin Neurobiol 2019; 58:37-45. [PMID: 31326722] [DOI: 10.1016/j.conb.2019.06.007] [Citation(s) in RCA: 29] [Impact Index Per Article: 5.8] [Received: 12/27/2018] [Accepted: 06/22/2019] [Indexed: 12/27/2022]
Abstract
Metastable brain dynamics are characterized by abrupt, jump-like modulations so that the neural activity in single trials appears to unfold as a sequence of discrete, quasi-stationary 'states'. Evidence that cortical neural activity unfolds as a sequence of metastable states is accumulating at a fast pace. Metastable activity occurs both in response to an external stimulus and during ongoing, self-generated activity. These spontaneous metastable states are increasingly found to subserve internal representations that are not locked to external triggers, including states of deliberation, attention and expectation. Moreover, decoding stimuli or decisions via metastable states can be carried out trial-by-trial. Focusing on metastability will allow us to shift our perspective on neural coding from traditional concepts based on trial-averaging to models based on dynamic ensemble representations. Recent theoretical work has started to characterize the mechanistic origin and potential roles of metastable representations. In this article we review recent findings on metastable activity, how it may arise in biologically realistic models, and its potential role for representing internal states as well as relevant task variables.
Affiliation(s)
- Giancarlo La Camera: Department of Neurobiology and Behavior, State University of New York at Stony Brook, Stony Brook, NY 11794, United States; Graduate Program in Neuroscience, State University of New York at Stony Brook, Stony Brook, NY 11794, United States
- Alfredo Fontanini: Department of Neurobiology and Behavior, State University of New York at Stony Brook, Stony Brook, NY 11794, United States; Graduate Program in Neuroscience, State University of New York at Stony Brook, Stony Brook, NY 11794, United States
- Luca Mazzucato: Departments of Biology and Mathematics and Institute of Neuroscience, University of Oregon, Eugene, OR 97403, United States
21
Muscinelli SP, Gerstner W, Schwalger T. How single neuron properties shape chaotic dynamics and signal transmission in random neural networks. PLoS Comput Biol 2019; 15:e1007122. [PMID: 31181063] [PMCID: PMC6586367] [DOI: 10.1371/journal.pcbi.1007122] [Citation(s) in RCA: 20] [Impact Index Per Article: 4.0] [Received: 03/11/2019] [Revised: 06/20/2019] [Accepted: 05/22/2019] [Indexed: 02/07/2023]
Abstract
While most models of randomly connected neural networks assume single-neuron models with simple dynamics, neurons in the brain exhibit complex intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of single neurons and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate neurons shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single neurons. For the case of two-dimensional rate neurons with strong adaptation, we find that the network exhibits a state of "resonant chaos", characterized by robust, narrow-band stochastic oscillations. The coherence of stochastic oscillations is maximal at the onset of chaos and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated neurons, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally-generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single neurons, such as synaptic filtering, refractoriness or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic neural networks.
Affiliation(s)
- Samuel P. Muscinelli: School of Computer and Communication Sciences and School of Life Sciences, École polytechnique fédérale de Lausanne, Station 15, CH-1015 Lausanne EPFL, Switzerland
- Wulfram Gerstner: School of Computer and Communication Sciences and School of Life Sciences, École polytechnique fédérale de Lausanne, Station 15, CH-1015 Lausanne EPFL, Switzerland
- Tilo Schwalger: Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany; Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
22
Kähne M, Rüdiger S, Kihara AH, Lindner B. Gap junctions set the speed and nucleation rate of stage I retinal waves. PLoS Comput Biol 2019; 15:e1006355. [PMID: 31034472] [PMCID: PMC6508742] [DOI: 10.1371/journal.pcbi.1006355] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Received: 07/05/2018] [Revised: 05/09/2019] [Accepted: 11/27/2018] [Indexed: 11/18/2022]
Abstract
Spontaneous waves in the developing retina are essential in the formation of the retinotopic mapping in the visual system. From experiments in rabbits, it is known that the earliest type of retinal waves (stage I) is nucleated spontaneously, propagates at a speed of 451±91 μm/sec and relies on gap junction coupling between ganglion cells. Because gap junctions (electrical synapses) have short integration times, it has been argued that they cannot set the low speed of stage I retinal waves. Here, we present a theoretical study of a two-dimensional neural network of the ganglion cell layer with gap junction coupling and intrinsic noise. We demonstrate that this model can explain observed nucleation rates as well as the comparatively slow propagation speed of the waves. From the interaction between two coupled neurons, we estimate the wave speed in the model network. Furthermore, using simulations of small networks of neurons (N≤260), we estimate the nucleation rate in the form of an Arrhenius escape rate. These results allow for informed simulations of a realistically sized network, yielding values of the gap junction coupling and the intrinsic noise level that are in a physiologically plausible range.
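The Arrhenius escape rate mentioned above arises from noise-driven escape over an effective barrier. A stripped-down, single-unit version (parameters are illustrative, not fitted to the retinal model): the mean time to reach threshold from the subthreshold fixed point grows steeply as the noise intensity shrinks.

```python
import numpy as np

rng = np.random.default_rng(6)

def mean_escape_time(drive, D, v_th=1.0, tau=1.0, dt=1e-3, n=400):
    """Mean first-passage time to v_th for dv = (drive - v)/tau dt + sqrt(2D) dW,
    starting from the subthreshold fixed point v = drive (Euler-Maruyama)."""
    v = np.full(n, drive)
    t_esc = np.zeros(n)
    alive = np.ones(n, dtype=bool)
    step = 0
    while alive.any():
        step += 1
        v[alive] += (drive - v[alive]) / tau * dt \
            + np.sqrt(2.0 * D * dt) * rng.normal(size=alive.sum())
        crossed = alive & (v >= v_th)
        t_esc[crossed] = step * dt
        alive &= ~crossed
    return t_esc.mean()

# Escape slows down drastically (roughly exponentially in 1/D) as noise weakens.
times = [mean_escape_time(0.6, D) for D in (0.2, 0.1, 0.05)]
print(times)
```

In the paper the analogous escape is collective (a patch of coupled ganglion cells must be kicked over threshold together), but the Arrhenius-type dependence on noise intensity is the same basic mechanism.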
Affiliation(s)
- Malte Kähne: Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany
- Sten Rüdiger: Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany
- Benjamin Lindner: Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
23
Braun W, Longtin A. Interspike interval correlations in networks of inhibitory integrate-and-fire neurons. Phys Rev E 2019; 99:032402. [PMID: 30999498] [DOI: 10.1103/physreve.99.032402] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Received: 12/20/2018] [Indexed: 11/07/2022]
Abstract
We study temporal correlations of interspike intervals, quantified by the network-averaged serial correlation coefficient (SCC), in networks of both current- and conductance-based purely inhibitory integrate-and-fire neurons. Numerical simulations reveal transitions to negative SCCs at intermediate values of bias current drive and network size. As bias drive and network size are increased past these values, the SCC returns to zero. The SCC is maximally negative at an intermediate value of the network oscillation strength. The dependence of the SCC on two canonical schemes for synaptic connectivity is studied, and it is shown that the results occur robustly in both schemes. For conductance-based synapses, the SCC becomes negative at the onset of both a fast and slow coherent network oscillation. We then show by means of offline simulations using prerecorded network activity that a neuron's SCC is highly sensitive to its number of presynaptic inputs. Finally, we devise a noise-reduced diffusion approximation for current-based networks that accounts for the observed temporal correlation transitions.
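Per neuron, the SCC studied above is simply the lag-k Pearson correlation between interspike intervals. A self-contained illustration on surrogate ISI sequences (the anti-correlated construction is a toy stand-in for the network effect, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(0)

def scc(isis, lag=1):
    """Serial correlation coefficient: Pearson correlation between ISIs
    separated by `lag` positions in the sequence."""
    return np.corrcoef(isis[:-lag], isis[lag:])[0, 1]

# Renewal process (independent gamma-distributed ISIs): the SCC vanishes.
renewal = rng.gamma(shape=2.0, scale=0.5, size=20000)

# Toy anti-correlated sequence: each interval "borrows" from its predecessor,
# so a long ISI tends to be followed by a short one (SCC -> -0.5 here).
eta = rng.normal(0.0, 0.2, size=20001)
anticorr = 1.0 + eta[1:] - eta[:-1]

print(scc(renewal), scc(anticorr))
```

The network-averaged SCC of the paper is this quantity computed per neuron from simulated spike trains and then averaged across the population.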
Affiliation(s)
- Wilhelm Braun: Neural Network Dynamics and Computation, Institut für Genetik, Universität Bonn, Kirschallee 1, 53115 Bonn, Germany; Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
- André Longtin: Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
24
Bernardi D, Lindner B. Detecting single-cell stimulation in a large network of integrate-and-fire neurons. Phys Rev E 2019; 99:032304. [PMID: 30999410] [DOI: 10.1103/physreve.99.032304] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Received: 10/30/2018] [Indexed: 06/09/2023]
Abstract
Several experiments have shown that the stimulation of a single neuron in the cortex can influence the local network activity and even the behavior of an animal. From the theoretical point of view, it is not clear how stimulating a single cell in a cortical network can evoke a statistically significant change in the activity of a large population. Our previous study considered a random network of integrate-and-fire neurons and proposed a way of detecting the stimulation of a single neuron in the activity of a local network: a threshold detector biased toward a specific subset of neurons. Here, we revisit this model and extend it by introducing a second network acting as a readout. In the simplest scenario, the readout consists of a collection of integrate-and-fire neurons with no recurrent connections. In this case, the ability to detect the stimulus does not improve. However, a readout network with both feed-forward and local recurrent inhibition permits detection with a very small bias, if compared to the readout scheme introduced previously. The crucial role of inhibition is to reduce global input cross correlations, the main factor limiting detectability. Finally, we show that this result is robust if recurrent excitatory connections are included or if a different kind of readout bias (in the synaptic amplitudes instead of connection probability) is used.
Affiliation(s)
- Davide Bernardi: Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Benjamin Lindner: Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
25
Beiran M, Ostojic S. Contrasting the effects of adaptation and synaptic filtering on the timescales of dynamics in recurrent networks. PLoS Comput Biol 2019; 15:e1006893. [PMID: 30897092] [PMCID: PMC6445477] [DOI: 10.1371/journal.pcbi.1006893] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Received: 12/19/2018] [Revised: 04/02/2019] [Accepted: 02/19/2019] [Indexed: 11/19/2022]
Abstract
Neural activity in awake behaving animals exhibits a vast range of timescales that can be several fold larger than the membrane time constant of individual neurons. Two types of mechanisms have been proposed to explain this conundrum. One possibility is that large timescales are generated by a network mechanism based on positive feedback, but this hypothesis requires fine-tuning of the strength or structure of the synaptic connections. A second possibility is that large timescales in the neural dynamics are inherited from large timescales of underlying biophysical processes, two prominent candidates being intrinsic adaptive ionic currents and synaptic transmission. How the timescales of adaptation or synaptic transmission influence the timescale of the network dynamics has however not been fully explored. To address this question, here we analyze large networks of randomly connected excitatory and inhibitory units with additional degrees of freedom that correspond to adaptation or synaptic filtering. We determine the fixed points of the systems, their stability to perturbations and the corresponding dynamical timescales. Furthermore, we apply dynamical mean field theory to study the temporal statistics of the activity in the fluctuating regime, and examine how the adaptation and synaptic timescales transfer from individual units to the whole population. Our overarching finding is that synaptic filtering and adaptation in single neurons have very different effects at the network level. Unexpectedly, the macroscopic network dynamics do not inherit the large timescale present in adaptive currents. In contrast, the timescales of network activity increase proportionally to the time constant of the synaptic filter. 
Altogether, our study demonstrates that the timescales of different biophysical processes have different effects on the network level, so that the slow processes within individual neurons do not necessarily induce slow activity in large recurrent neural networks.
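The paper's central contrast can be caricatured with two one-unit models driven by the same white noise: an exponential synaptic filter with time constant tau_s, and a fast unit with slow adaptation feedback. The integrated autocorrelation time of the filtered signal tracks tau_s, while the adapting unit's output remains fast even for a large adaptation time constant (all parameters below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
dt, T = 1e-3, 1000.0
xi = rng.normal(size=int(T / dt)) / np.sqrt(dt)   # unit-intensity white noise

def corr_time(x, t_max=10.0):
    """Integral of the normalized autocorrelation up to t_max (FFT-based)."""
    x = x - x.mean()
    f = np.fft.rfft(x, 2 * x.size)
    ac = np.fft.irfft(f * np.conj(f))[:x.size] / (x.var() * x.size)
    return ac[:int(t_max / dt)].sum() * dt

def synaptic_filter(tau_s):
    """White noise passed through an exponential synaptic filter."""
    s, out = 0.0, np.empty(xi.size)
    for i, z in enumerate(xi):
        s += dt / tau_s * (-s + z)
        out[i] = s
    return out

def adapting_unit(tau_a, tau_m=0.1, g=1.0):
    """Fast unit (tau_m) with slow negative adaptation feedback (tau_a)."""
    x, a, out = 0.0, 0.0, np.empty(xi.size)
    for i, z in enumerate(xi):
        x += dt / tau_m * (-x - g * a + z)
        a += dt / tau_a * (-a + x)
        out[i] = x
    return out

ct_syn = corr_time(synaptic_filter(2.0))    # close to tau_s = 2
ct_adapt = corr_time(adapting_unit(2.0))    # far below tau_a = 2
print(ct_syn, ct_adapt)
```

The single-unit caricature only motivates the effect; the paper's dynamical mean-field analysis shows the same asymmetry survives in large recurrent networks in the fluctuating regime.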
Affiliation(s)
- Manuel Beiran: Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Srdjan Ostojic: Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
26
van Meegen A, Lindner B. Self-Consistent Correlations of Randomly Coupled Rotators in the Asynchronous State. Phys Rev Lett 2018; 121:258302. [PMID: 30608814] [DOI: 10.1103/physrevlett.121.258302] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Received: 11/14/2017] [Revised: 10/09/2018] [Indexed: 06/09/2023]
Abstract
We study a network of unidirectionally coupled rotators with independent identically distributed (i.i.d.) frequencies and i.i.d. coupling coefficients. Similar to biological networks, this system can attain an asynchronous state with pronounced temporal autocorrelations of the rotators. We derive differential equations for the self-consistent autocorrelation function that can be solved analytically in limit cases. For more involved scenarios, its numerical solution is confirmed by simulations of networks with Gaussian or sparsely distributed coupling coefficients. The theory is finally generalized for pulse-coupled units and tested on a standard model of computational neuroscience, a recurrent network of sparsely coupled exponential integrate-and-fire neurons.
Collapse
Affiliation(s)
- Alexander van Meegen
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
| |
Collapse
|
27
|
Amorim MR, Pena RFO, Souza GMPR, Bonagamba LGH, Roque AC, Machado BH. Firing properties of ventral medullary respiratory neurons in sino-aortic denervated rats. Exp Physiol 2018; 104:39-49. [PMID: 30427561 DOI: 10.1113/ep087150] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2018] [Accepted: 11/12/2018] [Indexed: 11/08/2022]
Abstract
NEW FINDINGS What is the central question of this study? After sino-aortic denervation (SAD), rats present normal levels of mean arterial pressure (MAP), high MAP variability and changes in breathing. However, mechanisms involved in SAD-induced respiratory changes and their impact on the modulation of sympathetic activity remain unclear. Herein, we characterized the firing frequency of medullary respiratory neurons after SAD. What is the main finding and its importance? Sino-aortic denervation-induced prolonged inspiration was associated with a reduced interburst frequency of pre-inspiratory/inspiratory neurons and an increased long-term variability of late inspiratory neurons, but no changes were observed in the ramp-inspiratory and post-inspiratory neurons. This imbalance in the respiratory network might contribute to the modulation of sympathetic activity after SAD. ABSTRACT In previous studies, we documented that after sino-aortic denervation (SAD) in rats there are significant changes in the breathing pattern, but no significant changes in sympathetic activity and mean arterial pressure compared with sham-operated rats. However, the neural mechanisms involved in the respiratory changes after SAD and the extent to which they might contribute to the observed normal sympathetic activity and mean arterial pressure remain unclear. Here, we hypothesized that after SAD, rats present with changes in the firing frequency of the ventral medullary inspiratory and post-inspiratory neurons. To test this hypothesis, male Wistar rats underwent SAD or sham surgery and 3 days later were surgically prepared for an in situ experiment. The duration of inspiration significantly increased in SAD rats. During inspiration, the total firing frequency of ramp-inspiratory, pre-inspiratory/inspiratory and late-inspiratory neurons was not different between groups. During post-inspiration, the total firing frequency of post-inspiratory neurons was also not different between groups. 
Furthermore, the data demonstrate a reduced interburst frequency of pre-inspiratory/inspiratory neurons and an increased long-term variability of late-inspiratory neurons in SAD compared with sham-operated rats. These findings indicate that the SAD-induced prolongation of inspiration was not accompanied by alterations in the total firing frequency of the ventral medullary respiratory neurons, but it was associated with changes in the long-term variability of late-inspiratory neurons. We suggest that the timing imbalance in the respiratory network in SAD rats might contribute to the modulation of presympathetic neurons after removal of baroreceptor afferents.
Collapse
Affiliation(s)
- Mateus R Amorim
- Department of Physiology, School of Medicine of Ribeirão Preto, University of São Paulo, 14049-900, Ribeirão Preto, SP, Brazil
| | - Rodrigo F O Pena
- Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, 14040-900, Ribeirão Preto, SP, Brazil
| | - George M P R Souza
- Department of Physiology, School of Medicine of Ribeirão Preto, University of São Paulo, 14049-900, Ribeirão Preto, SP, Brazil
| | - Leni G H Bonagamba
- Department of Physiology, School of Medicine of Ribeirão Preto, University of São Paulo, 14049-900, Ribeirão Preto, SP, Brazil
| | - Antônio C Roque
- Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, 14040-900, Ribeirão Preto, SP, Brazil
| | - Benedito H Machado
- Department of Physiology, School of Medicine of Ribeirão Preto, University of São Paulo, 14049-900, Ribeirão Preto, SP, Brazil
| |
Collapse
|
28
|
Ullner E, Politi A, Torcini A. Ubiquity of collective irregular dynamics in balanced networks of spiking neurons. CHAOS (WOODBURY, N.Y.) 2018; 28:081106. [PMID: 30180628 DOI: 10.1063/1.5049902] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/26/2018] [Accepted: 08/09/2018] [Indexed: 06/08/2023]
Abstract
We revisit the dynamics of a prototypical model of balanced activity in networks of spiking neurons. A detailed investigation of the thermodynamic limit for fixed density of connections (massive coupling) shows that, when inhibition prevails, the asymptotic regime is not asynchronous but rather characterized by a self-sustained irregular, macroscopic (collective) dynamics. So long as the connectivity is massive, this regime is found in many different setups: leaky as well as quadratic integrate-and-fire neurons; large and small coupling strength; and weak and strong external currents.
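A self-contained toy version of such a setup — a sparse, inhibition-dominated network of pulse-coupled leaky integrate-and-fire neurons — looks as follows; all parameter values are assumed for illustration and are not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)
N, p = 400, 0.1                      # neurons, connection probability (toy values)
J_e, J_i = 0.2, -0.8                 # inhibition-dominated pulse amplitudes
tau, v_th, v_r, mu = 0.02, 1.0, 0.0, 1.2
dt, steps = 1e-4, 10000              # 1 s of simulated time

N_e = N // 2                         # first half excitatory, second half inhibitory
mask = rng.random((N, N)) < p
J = np.where(np.arange(N)[None, :] < N_e, J_e, J_i) * mask

v = rng.uniform(v_r, v_th, N)
pop = np.zeros(steps)                # population spike count per time bin
for t in range(steps):
    spk = v >= v_th
    pop[t] = spk.sum()
    v[spk] = v_r                     # reset
    v += dt / tau * (mu - v) + J @ spk   # leak + drive + delta-pulse recurrent input

rate = pop.sum() / (N * steps * dt)
print(f"mean rate: {rate:.1f} Hz, population-count std: {pop.std():.2f}")
```

Fluctuations of the binned population count around its mean are a simple proxy for the collective (macroscopic) dynamics the abstract describes.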
Collapse
Affiliation(s)
- Ekkehard Ullner
- Institute for Complex Systems and Mathematical Biology and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
| | - Antonio Politi
- Institute for Complex Systems and Mathematical Biology and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
| | - Alessandro Torcini
- Max Planck Institut für Physik komplexer Systeme, Nöthnitzer Str. 38, 01187 Dresden, Germany
| |
Collapse
|
29
|
Pena RFO, Vellmer S, Bernardi D, Roque AC, Lindner B. Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks. Front Comput Neurosci 2018; 12:9. [PMID: 29551968 PMCID: PMC5840464 DOI: 10.3389/fncom.2018.00009] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/29/2017] [Accepted: 02/07/2018] [Indexed: 11/13/2022] Open
Abstract
Recurrent networks of spiking neurons can be in an asynchronous state characterized by low or absent cross-correlations and spike statistics which resemble those of cortical neurons. Although spatial correlations are negligible in this state, neurons can show pronounced temporal correlations in their spike trains that can be quantified by the autocorrelation function or the spike-train power spectrum. Depending on cellular and network parameters, correlations display diverse patterns (ranging from simple refractory-period effects and stochastic oscillations to slow fluctuations) and it is generally not well-understood how these dependencies come about. Previous work has explored how the single-cell correlations in a homogeneous network (excitatory and inhibitory integrate-and-fire neurons with nearly balanced mean recurrent input) can be determined numerically from an iterative single-neuron simulation. Such a scheme is based on the fact that every neuron is driven by the network noise (i.e., the input currents from all its presynaptic partners) but also contributes to the network noise, leading to a self-consistency condition for the input and output spectra. Here we first extend this scheme to homogeneous networks with strong recurrent inhibition and a synaptic filter, in which instabilities of the previous scheme are avoided by an averaging procedure. We then extend the scheme to heterogeneous networks in which (i) different neural subpopulations (e.g., excitatory and inhibitory neurons) have different cellular or connectivity parameters; (ii) the number and strength of the input connections are random (Erdős-Rényi topology) and thus different among neurons. In all heterogeneous cases, neurons are lumped in different classes each of which is represented by a single neuron in the iterative scheme; in addition, we make a Gaussian approximation of the input current to the neuron. 
These approximations seem to be justified over a broad range of parameters as indicated by comparison with simulation results of large recurrent networks. Our method can help to elucidate how network heterogeneity shapes the asynchronous state in recurrent neural networks.
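The flavor of such a self-consistency condition can be conveyed by a drastically simplified, rate-only analogue: the paper iterates full power spectra, whereas the sketch below makes only the firing rate self-consistent, using a diffusion approximation of the recurrent Poisson-like input and assumed toy parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
tau, v_th, v_r = 0.02, 1.0, 0.0
mu_ext, K, J = 1.5, 100, -0.1        # external drive, in-degree, weight (toy values)
dt = 1e-4

def lif_rate(mu, D, M=200, T=1.0):
    """Firing rate of M white-noise-driven LIF neurons, estimated by simulation."""
    v = np.full(M, v_r)
    count = 0
    for _ in range(int(T / dt)):
        v += dt / tau * (mu - v) + np.sqrt(2.0 * D * dt) * rng.normal(size=M)
        spk = v >= v_th
        count += spk.sum()
        v[spk] = v_r
    return count / (M * T)

r = 10.0                              # initial guess for the network rate (Hz)
for _ in range(8):
    mu = mu_ext + tau * K * J * r     # mean recurrent input (Poisson statistics)
    D = 0.5 * K * J**2 * r            # diffusion coefficient of the recurrent input
    r = 0.5 * r + 0.5 * lif_rate(mu, D)   # damped fixed-point iteration
print(f"self-consistent rate: {r:.2f} Hz")
```

In the paper's scheme the quantity iterated to a fixed point is the spike-train power spectrum itself (shaping the temporal correlations of the surrogate input), not just its integral.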
Collapse
Affiliation(s)
- Rodrigo F O Pena
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
| | - Sebastian Vellmer
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany.,Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
| | - Davide Bernardi
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany.,Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
| | - Antonio C Roque
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
| | - Benjamin Lindner
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany.,Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
| |
Collapse
|
30
|
Beiran M, Kruscha A, Benda J, Lindner B. Coding of time-dependent stimuli in homogeneous and heterogeneous neural populations. J Comput Neurosci 2017; 44:189-202. [PMID: 29222729 DOI: 10.1007/s10827-017-0674-4] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2017] [Revised: 11/08/2017] [Accepted: 11/12/2017] [Indexed: 11/29/2022]
Abstract
We compare the information transmission of a time-dependent signal by two types of uncoupled neuron populations that differ in their sources of variability: (i) a homogeneous population whose units receive independent noise and (ii) a deterministic heterogeneous population, where each unit exhibits a different baseline firing rate ('disorder'). Our criterion for making both sources of variability quantitatively comparable is that the interspike-interval distributions are identical for both systems. Numerical simulations using leaky integrate-and-fire neurons reveal that a non-zero amount of noise or disorder maximizes the encoding efficiency of the homogeneous and heterogeneous system, respectively, as a particular case of suprathreshold stochastic resonance. Our findings thus illustrate that heterogeneity can benefit neuronal populations in much the same way as dynamic noise. The optimal noise/disorder depends on the system size and the properties of the stimulus such as its intensity or cutoff frequency. We find that weak stimuli are better encoded by a noiseless heterogeneous population, whereas for strong stimuli a homogeneous population outperforms an equivalent heterogeneous system up to a moderate noise level. Furthermore, we derive analytical expressions of the coherence function for the cases of very strong noise and of vanishing intrinsic noise or heterogeneity, which predict the existence of an optimal noise intensity. Our results show that, depending on the type of signal, noise as well as heterogeneity can enhance the encoding performance of neuronal populations.
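The role of noise in the homogeneous case can be tasted with a toy population of noisy LIF neurons driven by a common subthreshold sinusoid. All parameters below are assumed for illustration, and a simple signal–count correlation stands in for the coherence function used in the paper:

```python
import numpy as np

rng = np.random.default_rng(5)
N, tau, v_th, v_r = 100, 0.02, 1.0, 0.0
mu, a, f = 0.9, 0.05, 5.0            # subthreshold mean, signal amplitude, frequency (Hz)
dt, steps = 1e-4, 20000              # 2 s of simulated time
signal = a * np.sin(2.0 * np.pi * f * np.arange(steps) * dt)

def encode(D):
    """Correlation between the binned population spike count and the signal."""
    v = rng.uniform(v_r, v_th, N)
    counts = np.zeros(steps)
    for t in range(steps):
        v += dt / tau * (mu + signal[t] - v) + np.sqrt(2.0 * D * dt) * rng.normal(size=N)
        spk = v >= v_th
        counts[t] = spk.sum()
        v[spk] = v_r
    if counts.std() == 0.0:          # no spikes at all: no signal transmitted
        return 0.0, 0
    return np.corrcoef(counts, signal)[0, 1], int(counts.sum())

results = {}
for D in (1e-4, 1e-2, 1.0):
    c, n = encode(D)
    results[D] = (c, n)
    print(f"D = {D:g}: correlation = {c:.3f}, total spikes = {n}")
```

With vanishing noise the subthreshold signal is never transmitted at all; too much noise buries it — the optimum sits in between, which is the stochastic-resonance effect the abstract quantifies.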
Collapse
Affiliation(s)
- Manuel Beiran
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany.,Group for Neural Theory, Laboratoire de Neurosciences Cognitives, Département Études Cognitives, École Normale Supérieure, INSERM, PSL Research University, Paris, France.
| | - Alexandra Kruscha
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany.,Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
| | - Jan Benda
- Institute for Neurobiology, Eberhard Karls Universität, Tübingen, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany.,Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
| |
Collapse
|
31
|
Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, Shea-Brown E. From the statistics of connectivity to the statistics of spike times in neuronal networks. Curr Opin Neurobiol 2017; 46:109-119. [PMID: 28863386 DOI: 10.1016/j.conb.2017.07.011] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2017] [Revised: 07/21/2017] [Accepted: 07/27/2017] [Indexed: 10/19/2022]
Abstract
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits. The first is that local features of network connectivity can be surprisingly effective in predicting global statistics of activity across a network. The second is that, for the important case of large networks with excitatory-inhibitory balance, correlated spiking persists or vanishes depending on the spatial scales of recurrent and feedforward connectivity. We close by showing how these ideas, together with plasticity rules, can help to close the loop between network structure and activity statistics.
Collapse
Affiliation(s)
| | - Yu Hu
- Center for Brain Science, Harvard University, United States
| | - Michael A Buice
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States
| | - Brent Doiron
- Department of Mathematics, University of Pittsburgh, United States; Center for the Neural Basis of Cognition, Pittsburgh, United States
| | - Krešimir Josić
- Department of Mathematics, University of Houston, United States; Department of Biology and Biochemistry, University of Houston, United States; Department of BioSciences, Rice University, United States
| | - Robert Rosenbaum
- Department of Mathematics, University of Notre Dame, United States
| | - Eric Shea-Brown
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States; Department of Physiology and Biophysics, and University of Washington Institute for Neuroengineering, United States.
| |
Collapse
|
32
|
Bernardi D, Lindner B. Optimal Detection of a Localized Perturbation in Random Networks of Integrate-and-Fire Neurons. PHYSICAL REVIEW LETTERS 2017; 118:268301. [PMID: 28707933 DOI: 10.1103/physrevlett.118.268301] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/19/2016] [Indexed: 06/07/2023]
Abstract
Experimental and theoretical studies suggest that cortical networks are chaotic and coding relies on averages over large populations. However, there is evidence that rats can respond to the short stimulation of a single cortical cell, a theoretically unexplained fact. We study the effects of single-cell stimulation on a large recurrent network of integrate-and-fire neurons and propose a simple way to detect the perturbation. Detection rates obtained from simulations and analytical estimates are similar to experimental response rates if the readout is slightly biased towards specific neurons. Near-optimal detection is attained for a broad range of intermediate values of the mean coupling between neurons.
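Schematically, the detection problem is a threshold test on a readout spike count; under a Gaussian approximation the hit and false-alarm rates follow in closed form. All numbers below are made-up illustrations, not values from the paper:

```python
from math import erf, sqrt

# Readout spike count modeled as Gaussian; single-cell stimulation shifts its mean.
mu0, sigma = 50.0, 7.0          # baseline mean and std of the readout count (assumed)
delta = 5.0                     # mean shift caused by the stimulated cell (assumed)
theta = mu0 + 1.5 * sigma       # decision threshold: report "stimulated" if count > theta

Phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF
false_alarm = 1.0 - Phi((theta - mu0) / sigma)
hit = 1.0 - Phi((theta - mu0 - delta) / sigma)
print(f"false-alarm rate: {false_alarm:.3f}, detection rate: {hit:.3f}")
```

Biasing the readout toward neurons connected to the stimulated cell effectively increases `delta` and hence the detection rate at a fixed false-alarm rate — the mechanism the abstract invokes.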
Collapse
Affiliation(s)
- Davide Bernardi
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
| |
Collapse
|
33
|
Towards a theory of cortical columns: From spiking neurons to interacting neural populations of finite size. PLoS Comput Biol 2017; 13:e1005507. [PMID: 28422957 PMCID: PMC5415267 DOI: 10.1371/journal.pcbi.1005507] [Citation(s) in RCA: 68] [Impact Index Per Article: 9.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2016] [Revised: 05/03/2017] [Accepted: 04/07/2017] [Indexed: 11/22/2022] Open
Abstract
Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50–2000 neurons of the same type, but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics such as finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly integrate a model of a cortical microcircuit consisting of eight neuron types, which allows us to predict spontaneous population activities as well as evoked responses to thalamic input. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single-cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations.

Understanding the brain requires mathematical models on different spatial scales. On the “microscopic” level of nerve cells, neural spike trains can be well predicted by phenomenological spiking neuron models. On a coarse scale, neural activity can be modeled by phenomenological equations that summarize the total activity of many thousands of neurons.
Such population models are widely used to model neuroimaging data such as EEG, MEG or fMRI data. However, it is largely unknown how large-scale models are connected to an underlying microscale model. Linking the scales is vital for a correct description of rapid changes and fluctuations of the population activity, and is crucial for multiscale brain models. The challenge is to treat realistic spiking dynamics as well as fluctuations arising from the finite number of neurons. We obtained such a link by deriving stochastic population equations on the mesoscopic scale of 100–1000 neurons from an underlying microscopic model. These equations can be efficiently integrated and reproduce results of a microscopic simulation while achieving a high speed-up factor. We expect that our novel population theory on the mesoscopic scale will be instrumental for understanding experimental data on information processing in the brain, and ultimately link microscopic and macroscopic activity patterns.
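The core ingredient — finite-size fluctuations entering a population equation — can be sketched with binomial sampling of an escape-rate model. This is a minimal caricature with an assumed rate function and toy parameters, omitting the refractoriness and spike-history terms that the paper treats carefully:

```python
import numpy as np

rng = np.random.default_rng(4)
N, dt, steps = 400, 1e-3, 4000       # population size, time step (toy values)
tau, mu, J = 0.02, 0.5, 0.3          # input time constant, drive, self-coupling
f = lambda h: 50.0 / (1.0 + np.exp(-8.0 * (h - 1.0)))   # escape rate (assumed form)

h, A = 0.0, np.empty(steps)
for t in range(steps):
    p = 1.0 - np.exp(-f(h) * dt)     # per-neuron spike probability in this bin
    n = rng.binomial(N, p)           # finite-N sampling makes the activity stochastic
    A[t] = n / (N * dt)              # mesoscopic population activity (Hz)
    h += dt / tau * (mu - h) + J * n / N   # mean-field input driven by the activity
print(f"mean activity: {A.mean():.2f} Hz, std: {A.std():.2f} Hz")
```

As `N` grows, the binomial term concentrates and the sketch approaches a deterministic rate equation; at finite `N` the sampling noise is exactly the kind of fluctuation the mesoscopic theory keeps.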
Collapse
|
34
|
Mastrogiuseppe F, Ostojic S. Intrinsically-generated fluctuating activity in excitatory-inhibitory networks. PLoS Comput Biol 2017; 13:e1005498. [PMID: 28437436 PMCID: PMC5421821 DOI: 10.1371/journal.pcbi.1005498] [Citation(s) in RCA: 43] [Impact Index Per Article: 6.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/13/2016] [Revised: 05/08/2017] [Accepted: 04/04/2017] [Indexed: 12/05/2022] Open
Abstract
Recurrent networks of non-linear units display a variety of dynamical regimes depending on the structure of their synaptic connectivity. A particularly remarkable phenomenon is the appearance of strongly fluctuating, chaotic activity in networks of deterministic, but randomly connected rate units. How such intrinsically generated fluctuations appear in more realistic networks of spiking neurons has been a long-standing question. To ease the comparison between rate and spiking networks, recent works investigated the dynamical regimes of randomly-connected rate networks with segregated excitatory and inhibitory populations, and firing rates constrained to be positive. These works derived general dynamical mean field (DMF) equations describing the fluctuating dynamics, but solved these equations only in the case of purely inhibitory networks. Using a simplified excitatory-inhibitory architecture in which DMF equations are more easily tractable, here we show that the presence of excitation qualitatively modifies the fluctuating activity compared to purely inhibitory networks. In the presence of excitation, intrinsically generated fluctuations induce a strong increase in mean firing rates, a phenomenon that is much weaker in purely inhibitory networks. Excitation moreover induces two different fluctuating regimes: for moderate overall coupling, recurrent inhibition is sufficient to stabilize fluctuations; for strong coupling, firing rates are stabilized solely by the upper bound imposed on activity, even if inhibition is stronger than excitation. These results extend to more general network architectures, and to rate networks receiving noisy inputs mimicking spiking activity. Finally, we show that signatures of the second dynamical regime appear in networks of integrate-and-fire neurons.
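A minimal numerical illustration of intrinsically generated fluctuations in a rate network with positive, bounded rates; the architecture, the clipped-linear transfer function, and all parameter values are assumptions of this sketch, and the paper's DMF analysis is far richer:

```python
import numpy as np

rng = np.random.default_rng(3)
N, g = 400, 6.0                        # size, random-coupling strength (toy values)
J_e, J_i = 1.0, -1.2                   # mean E and I weights; first half of units is E
mean = np.where(np.arange(N) < N // 2, J_e, J_i) / N   # presynaptic (column) means
J = mean + rng.normal(0.0, g / np.sqrt(N), (N, N))

phi = lambda x: np.clip(x, 0.0, 5.0)   # positive, saturating firing-rate function
x = rng.normal(0.0, 1.0, N)
dt, steps = 0.05, 4000
traj = np.empty((steps, N))
for t in range(steps):
    x += dt * (-x + J @ phi(x))        # Euler step of the rate dynamics
    traj[t] = x

late = traj[steps // 2:]               # discard the transient
print(f"temporal std per unit (mean): {late.std(axis=0).mean():.3f}")
```

With strong random coupling the activity keeps fluctuating even though the dynamics are deterministic; the saturation of `phi` is what bounds the rates, echoing the strong-coupling regime described in the abstract.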
Collapse
Affiliation(s)
- Francesca Mastrogiuseppe
- Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure - PSL Research University, Paris, France
- Laboratoire de Physique Statistique, CNRS UMR 8550, École Normale Supérieure - PSL Research University, Paris, France
| | - Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure - PSL Research University, Paris, France
| |
Collapse
|
35
|
Engelken R, Farkhooi F, Hansel D, van Vreeswijk C, Wolf F. A reanalysis of "Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons". F1000Res 2016; 5:2043. [PMID: 27746905 PMCID: PMC5040152 DOI: 10.12688/f1000research.9144.1] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Accepted: 08/08/2016] [Indexed: 11/20/2022] Open
Abstract
Neuronal activity in the central nervous system varies strongly in time and across neuronal populations. It is a longstanding proposal that such fluctuations generically arise from chaotic network dynamics. Various theoretical studies predict that the rich dynamics of rate models operating in the chaotic regime can subserve circuit computation and learning. Neurons in the brain, however, communicate via spikes and it is a theoretical challenge to obtain similar rate fluctuations in networks of spiking neuron models. A recent study investigated spiking balanced networks of leaky integrate-and-fire (LIF) neurons and compared their dynamics to a matched rate network with identical topology, where single-unit input-output functions were chosen from isolated LIF neurons receiving Gaussian white noise input. A mathematical analogy between the chaotic instability in networks of rate units and the spiking network dynamics was proposed. Here we revisit the behavior of the spiking LIF networks and these matched rate networks. We find expected hallmarks of a chaotic instability in the rate network: For supercritical coupling strength near the transition point, the autocorrelation time diverges. For subcritical coupling strengths, we observe critical slowing down in response to small external perturbations. In the spiking network, we found, in contrast, that the timescale of the autocorrelations is insensitive to the coupling strength and that rate deviations resulting from small input perturbations rapidly decay. The decay speed even accelerates for increasing coupling strength. In conclusion, our reanalysis demonstrates fundamental differences between the behavior of pulse-coupled spiking LIF networks and rate networks with matched topology and input-output function. In particular, there is no indication of a corresponding chaotic instability in the spiking network.
Collapse
Affiliation(s)
- Rainer Engelken
- Max Planck Institute for Dynamics and Self-Organization (MPI-DS), Bernstein Center for Computational Neuroscience Göttingen, Faculty of Physics, University of Göttingen, Göttingen, Germany.,Collaborative Research Center 889, University of Göttingen, Göttingen, 37099, Germany
| | - Farzad Farkhooi
- Institut für Mathematik, Technische Universität Berlin and Bernstein Center for Computational Neuroscience, Berlin, Germany
| | - David Hansel
- Cerebral Dynamics, Learning and Memory Research Group, Center for Neurophysics, Physiology and Pathology, CNRS UMR8119, Université Paris Descartes, Paris, France
| | - Carl van Vreeswijk
- Cerebral Dynamics, Learning and Memory Research Group, Center for Neurophysics, Physiology and Pathology, CNRS UMR8119, Université Paris Descartes, Paris, France
| | - Fred Wolf
- Max Planck Institute for Dynamics and Self-Organization (MPI-DS), Bernstein Center for Computational Neuroscience Göttingen, Faculty of Physics, University of Göttingen, Göttingen, Germany.,Collaborative Research Center 889, University of Göttingen, Göttingen, 37099, Germany
| |
Collapse
|