1
Clark DG, Abbott LF, Litwin-Kumar A. Dimension of Activity in Random Neural Networks. Phys Rev Lett 2023; 131:118401. PMID: 37774280. DOI: 10.1103/physrevlett.131.118401.
Abstract
Neural networks are high-dimensional nonlinear dynamical systems that process information through the coordinated activity of many connected units. Understanding how biological and machine-learning networks function and learn requires knowledge of the structure of this coordinated activity, information contained, for example, in cross covariances between units. Self-consistent dynamical mean field theory (DMFT) has elucidated several features of random neural networks, in particular that they can generate chaotic activity; however, a calculation of cross covariances using this approach has not been provided. Here, we calculate cross covariances self-consistently via a two-site cavity DMFT. We use this theory to probe spatiotemporal features of activity coordination in a classic random-network model with independent and identically distributed (i.i.d.) couplings, showing an extensive but fractionally low effective dimension of activity and a long population-level timescale. Our formulas apply to a wide range of single-unit dynamics and generalize to non-i.i.d. couplings. As an example of the latter, we analyze the case of partially symmetric couplings.
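As an illustration of what an "extensive but fractionally low" effective dimension means in practice, the sketch below simulates the classic random rate network and computes the participation ratio of the eigenvalues of the equal-time cross-covariance matrix. The model form, gain, and simulation settings are assumptions for illustration; this is a direct simulation, not the paper's cavity-DMFT calculation.

```python
import numpy as np

rng = np.random.default_rng(1)
N, g, dt = 400, 2.0, 0.05                          # network size, coupling gain, Euler step (assumed)
J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # i.i.d. Gaussian couplings, g > 1 -> chaotic regime

x = rng.standard_normal(N)
n_burn, n_keep, thin = 4000, 12000, 20
samples = []
for step in range(n_burn + n_keep):
    x += dt * (-x + J @ np.tanh(x))                # Euler step of dx/dt = -x + J * tanh(x)
    if step >= n_burn and step % thin == 0:
        samples.append(np.tanh(x).copy())          # record unit rates after the transient

R = np.array(samples)                              # (time samples) x (units)
C = np.cov(R.T)                                    # equal-time cross-covariance matrix
lam = np.linalg.eigvalsh(C)
dim = lam.sum() ** 2 / (lam ** 2).sum()            # participation ratio of the covariance spectrum
print(f"effective dimension ~ {dim:.1f} of N = {N} units ({dim / N:.1%})")
```

The participation ratio grows linearly with N (extensive) while remaining a modest fraction of N, which is the qualitative behaviour the paper characterizes analytically.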
Affiliation(s)
- David G Clark
- Zuckerman Institute, Department of Neuroscience, Columbia University, New York, New York 10027, USA
- L F Abbott
- Zuckerman Institute, Department of Neuroscience, Columbia University, New York, New York 10027, USA
- Ashok Litwin-Kumar
- Zuckerman Institute, Department of Neuroscience, Columbia University, New York, New York 10027, USA
2
Zeraati R, Shi YL, Steinmetz NA, Gieselmann MA, Thiele A, Moore T, Levina A, Engel TA. Intrinsic timescales in the visual cortex change with selective attention and reflect spatial connectivity. Nat Commun 2023; 14:1858. PMID: 37012299. PMCID: PMC10070246. DOI: 10.1038/s41467-023-37613-7.
Abstract
Intrinsic timescales characterize dynamics of endogenous fluctuations in neural activity. Variation of intrinsic timescales across the neocortex reflects functional specialization of cortical areas, but less is known about how intrinsic timescales change during cognitive tasks. We measured intrinsic timescales of local spiking activity within columns of area V4 in male monkeys performing spatial attention tasks. The ongoing spiking activity unfolded across at least two distinct timescales, fast and slow. The slow timescale increased when monkeys attended to the receptive field's location and correlated with reaction times. By evaluating predictions of several network models, we found that spatiotemporal correlations in V4 activity were best explained by the model in which multiple timescales arise from recurrent interactions shaped by spatially arranged connectivity, and attentional modulation of timescales results from an increase in the efficacy of recurrent interactions. Our results suggest that multiple timescales may arise from the spatial connectivity in the visual cortex and flexibly change with the cognitive state due to dynamic effective interactions between neurons.
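To make the fast and slow intrinsic timescales concrete, the sketch below builds a synthetic activity trace that mixes two autocorrelation timescales and recovers them by fitting a sum of two exponentials to the empirical autocorrelation. The double-exponential form, bin size, and timescale values are assumptions for illustration and do not reproduce the paper's estimation pipeline for columnar V4 spiking data.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
dt, n = 0.005, 200_000                     # 5 ms bins, 1000 s of activity (assumed)
tau_fast, tau_slow = 0.02, 0.15            # ground-truth timescales in seconds (assumed)

def ar1(tau, n):
    """AR(1) process with autocorrelation time tau and unit variance."""
    a = np.exp(-dt / tau)
    x = np.zeros(n)
    noise = rng.standard_normal(n) * np.sqrt(1 - a ** 2)
    for t in range(1, n):
        x[t] = a * x[t - 1] + noise[t]
    return x

# Activity whose autocorrelation mixes a fast and a slow exponential decay.
activity = ar1(tau_fast, n) + 0.6 * ar1(tau_slow, n)

# Empirical autocorrelation up to 400 ms.
max_lag = int(0.4 / dt)
x = activity - activity.mean()
acf = np.array([np.dot(x[:-k], x[k:]) / np.dot(x, x) for k in range(1, max_lag)])
lags = np.arange(1, max_lag) * dt

# Fit a sum of two exponentials to read off the fast and slow timescales.
double_exp = lambda t, a, t1, b, t2: a * np.exp(-t / t1) + b * np.exp(-t / t2)
p, _ = curve_fit(double_exp, lags, acf, p0=[0.5, 0.01, 0.5, 0.1], maxfev=20_000)
print("recovered timescales (s):", sorted([p[1], p[3]]))
```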
Affiliation(s)
- Roxana Zeraati
- International Max Planck Research School for the Mechanisms of Mental Function and Dysfunction, University of Tübingen, Tübingen, Germany
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Yan-Liang Shi
- Cold Spring Harbor Laboratory, Cold Spring Harbor, NY, USA
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Marc A Gieselmann
- Biosciences Institute, Newcastle University, Newcastle upon Tyne, UK
- Alexander Thiele
- Biosciences Institute, Newcastle University, Newcastle upon Tyne, UK
- Tirin Moore
- Department of Neurobiology and Howard Hughes Medical Institute, Stanford University, Stanford, CA, USA
- Anna Levina
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany.
- Department of Computer Science, University of Tübingen, Tübingen, Germany.
- Bernstein Center for Computational Neuroscience Tübingen, Tübingen, Germany.
- Tatiana A Engel
- Cold Spring Harbor Laboratory, Cold Spring Harbor, NY, USA.
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA.
3
Winston CN, Mastrovito D, Shea-Brown E, Mihalas S. Heterogeneity in Neuronal Dynamics Is Learned by Gradient Descent for Temporal Processing Tasks. Neural Comput 2023; 35:555-592. PMID: 36827598. PMCID: PMC10044000. DOI: 10.1162/neco_a_01571.
Abstract
Individual neurons in the brain have complex intrinsic dynamics that are highly diverse. We hypothesize that the complex dynamics produced by networks of complex and heterogeneous neurons may contribute to the brain's ability to process and respond to temporally complex data. To study the role of complex and heterogeneous neuronal dynamics in network computation, we develop a rate-based neuronal model, the generalized-leaky-integrate-and-fire-rate (GLIFR) model, which is a rate equivalent of the generalized-leaky-integrate-and-fire model. The GLIFR model has multiple dynamical mechanisms, which add to the complexity of its activity while maintaining differentiability. We focus on the role of after-spike currents, currents induced or modulated by neuronal spikes, in producing rich temporal dynamics. We use machine learning techniques to learn both synaptic weights and parameters underlying intrinsic dynamics to solve temporal tasks. The GLIFR model allows the use of standard gradient descent techniques rather than surrogate gradient descent, which has been used in spiking neural networks. After establishing the ability to optimize parameters using gradient descent in single neurons, we ask how networks of GLIFR neurons learn and perform on temporally challenging tasks, such as sequential MNIST. We find that these networks learn diverse parameters, which gives rise to diversity in neuronal dynamics, as demonstrated by clustering of neuronal parameters. GLIFR networks have mixed performance when compared to vanilla recurrent neural networks, with higher performance in pixel-by-pixel MNIST but lower in line-by-line MNIST. However, they appear to be more robust to random silencing. We find that the ability to learn heterogeneity and the presence of after-spike currents contribute to these gains in performance. Our work demonstrates both the computational robustness of neuronal complexity and diversity in networks and a feasible method of training such models using exact gradients.
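A minimal sketch of why a rate-based unit with an after-spike current remains trainable by exact gradient descent: the PyTorch snippet below implements a single simplified unit with a leaky voltage-like variable, one decaying after-spike current driven by the unit's own rate, and a sigmoidal rate nonlinearity, then takes one backward pass through a toy objective. Parameter names, values, and the toy task are hypothetical; this is not the published GLIFR equations or training setup.

```python
import torch

torch.manual_seed(0)
dt = 1.0  # ms

# Learnable intrinsic parameters (hypothetical names): membrane and after-spike
# current time constants (in log space), current amplitude, and an input weight.
log_tau_m = torch.tensor(2.303, requires_grad=True)   # tau_m ~ 10 ms
log_tau_a = torch.tensor(4.0, requires_grad=True)     # tau_a ~ 55 ms
a_amp     = torch.tensor(-0.02, requires_grad=True)   # after-spike current per unit rate
w_in      = torch.tensor(1.0, requires_grad=True)

def run(inputs):
    v = torch.zeros(())          # voltage-like state
    a = torch.zeros(())          # after-spike current
    tau_m, tau_a = log_tau_m.exp(), log_tau_a.exp()
    rates = []
    for x in inputs:
        r = torch.sigmoid(v)                         # differentiable "rate" nonlinearity
        a = a + dt * (-a / tau_a + a_amp * r)        # after-spike current driven by the rate
        v = v + dt * (-v / tau_m + w_in * x + a)     # leaky integration of input plus current
        rates.append(r)
    return torch.stack(rates)

# Toy objective: hold a moderate rate during a constant input step.
inputs = torch.full((100,), 0.1)
target = torch.full((100,), 0.25)
rates = run(inputs)
loss = ((rates - target) ** 2).mean()
loss.backward()                                      # exact gradients w.r.t. intrinsic parameters
print("loss:", loss.item(), " dLoss/d(log tau_a):", log_tau_a.grad.item())
```

Because every operation is smooth, the intrinsic parameters receive exact gradients without surrogate-gradient tricks, which is the property the rate formulation is exploiting.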
Affiliation(s)
- Chloe N Winston
- Departments of Neuroscience and Computer Science, University of Washington, Seattle, WA 98195, U.S.A
- University of Washington Computational Neuroscience Center, Seattle, WA 98195, U.S.A.
- Dana Mastrovito
- Allen Institute for Brain Science, Seattle, WA 98109, U.S.A.
- Eric Shea-Brown
- University of Washington Computational Neuroscience Center, Seattle, WA 98195, U.S.A
- Allen Institute for Brain Science, Seattle, WA 98109, U.S.A
- Department of Applied Mathematics, University of Washington, Seattle, WA 98195, U.S.A.
- Stefan Mihalas
- University of Washington Computational Neuroscience Center, Seattle, WA 98195, U.S.A
- Allen Institute for Brain Science, Seattle, WA 98109, U.S.A
- Department of Applied Mathematics, University of Washington, Seattle, WA 98195, U.S.A.
4
Lundstrom BN, Richner TJ. Neural adaptation and fractional dynamics as a window to underlying neural excitability. PLoS Comput Biol 2023; 19:e1010527. PMID: 36809353. PMCID: PMC9983885. DOI: 10.1371/journal.pcbi.1010527.
Abstract
The relationship between macroscale electrophysiological recordings and the dynamics of underlying neural activity remains unclear. We have previously shown that low frequency EEG activity (<1 Hz) is decreased at the seizure onset zone (SOZ), while higher frequency activity (1-50 Hz) is increased. These changes result in power spectral densities (PSDs) with flattened slopes near the SOZ, which are assumed to be areas of increased excitability. We wanted to understand possible mechanisms underlying PSD changes in brain regions of increased excitability. We hypothesized that these observations are consistent with changes in adaptation within the neural circuit. We developed a theoretical framework and tested the effect of adaptation mechanisms, such as spike frequency adaptation and synaptic depression, on excitability and PSDs using filter-based neural mass models and conductance-based models. We compared the contribution of single timescale adaptation and multiple timescale adaptation. We found that adaptation with multiple timescales alters the PSDs. Multiple timescales of adaptation can approximate fractional dynamics, a form of calculus related to power laws, history dependence, and non-integer order derivatives. Coupled with input changes, these dynamics changed circuit responses in unexpected ways. Increased input without synaptic depression increases broadband power. However, increased input with synaptic depression may decrease power. The effects of adaptation were most pronounced for low frequency activity (<1 Hz). Increased input combined with a loss of adaptation yielded reduced low frequency activity and increased higher frequency activity, consistent with clinical EEG observations from SOZs. Spike frequency adaptation and synaptic depression, two forms of multiple timescale adaptation, affect low frequency EEG and the slope of PSDs. These neural mechanisms may underlie changes in EEG activity near the SOZ and relate to neural hyperexcitability. Neural adaptation may be evident in macroscale electrophysiological recordings and provide a window to understanding neural circuit excitability.
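A minimal sketch of how adaptation acting on several timescales reshapes the power spectral density: a noise-driven leaky unit is simulated with and without a bank of adaptation variables with log-spaced time constants, and the ratio of low-frequency to higher-frequency power is compared. The filter-bank form and all parameter values are assumptions for illustration, far simpler than the paper's neural mass and conductance-based models.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(2)
dt, n = 0.001, 100_000                      # 1 kHz sampling, 100 s (assumed)
taus = np.array([0.05, 0.25, 1.0, 5.0])     # log-spaced adaptation timescales in seconds (assumed)
k = 0.8                                     # adaptation strength (assumed)

def simulate(adapt: bool):
    v, a = 0.0, np.zeros(len(taus))
    out = np.empty(n)
    drive = rng.standard_normal(n)
    for t in range(n):
        adapt_current = k * a.sum() if adapt else 0.0
        v += dt / 0.02 * (-v + drive[t] - adapt_current)   # 20 ms membrane time constant
        a += dt * (-a / taus + v)                          # each adaptation variable tracks v
        out[t] = v
    return out

for label, flag in [("no adaptation", False), ("multi-timescale adaptation", True)]:
    f, pxx = welch(simulate(flag), fs=1 / dt, nperseg=16384)
    low = pxx[(f > 0.1) & (f < 1)].mean()
    mid = pxx[(f > 1) & (f < 50)].mean()
    print(f"{label}: power <1 Hz / power 1-50 Hz = {low / mid:.2f}")
```

The adaptation bank acts as a multi-timescale high-pass filter, so the low-frequency-to-broadband ratio drops when it is enabled, the same direction of change the paper links to excitability and flattened PSD slopes.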
Affiliation(s)
- Brian Nils Lundstrom
- Neurology Department, Mayo Clinic, Rochester, Minnesota, United States of America
- Thomas J. Richner
- Neurology Department, Mayo Clinic, Rochester, Minnesota, United States of America
5
Shao Y, Ostojic S. Relating local connectivity and global dynamics in recurrent excitatory-inhibitory networks. PLoS Comput Biol 2023; 19:e1010855. PMID: 36689488. PMCID: PMC9894562. DOI: 10.1371/journal.pcbi.1010855.
Abstract
How the connectivity of cortical networks determines the neural dynamics and the resulting computations is one of the key questions in neuroscience. Previous works have pursued two complementary approaches to quantify the structure in connectivity. One approach starts from the perspective of biological experiments where only the local statistics of connectivity motifs between small groups of neurons are accessible. Another approach is based instead on the perspective of artificial neural networks where the global connectivity matrix is known, and in particular its low-rank structure can be used to determine the resulting low-dimensional dynamics. A direct relationship between these two approaches is however currently missing. Specifically, it remains to be clarified how local connectivity statistics and the global low-rank connectivity structure are inter-related and shape the low-dimensional activity. To bridge this gap, here we develop a method for mapping local connectivity statistics onto an approximate global low-rank structure. Our method rests on approximating the global connectivity matrix using dominant eigenvectors, which we compute using perturbation theory for random matrices. We demonstrate that multi-population networks defined from local connectivity statistics for which the central limit theorem holds can be approximated by low-rank connectivity with Gaussian-mixture statistics. We specifically apply this method to excitatory-inhibitory networks with reciprocal motifs, and show that it yields reliable predictions for both the low-dimensional dynamics, and statistics of population activity. Importantly, it analytically accounts for the activity heterogeneity of individual neurons in specific realizations of local connectivity. Altogether, our approach allows us to disentangle the effects of mean connectivity and reciprocal motifs on the global recurrent feedback, and provides an intuitive picture of how local connectivity shapes global network dynamics.
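As a rough numerical check of the idea that dominant eigenvectors carry the low-rank structure set by local connectivity statistics, the sketch below builds an excitatory-inhibitory matrix as a block-mean (rank-one) part plus an i.i.d. random part and compares the outlier eigenvalue of the full matrix with the rank-one prediction. Population sizes and synaptic weights are assumed values, and reciprocal motifs are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
N, f = 400, 0.8                       # network size and excitatory fraction (assumed)
NE, NI = int(f * N), N - int(f * N)
jE, jI = 1.2, -1.6                    # mean excitatory and inhibitory weights (assumed)

# Block-mean part: every row carries the same population means, so M = u v^T
# with u = ones and v encoding the presynaptic-population weights.
u = np.ones(N)
v = np.concatenate([np.full(NE, jE / N), np.full(NI, jI / N)])
M = np.outer(u, v)

# Full connectivity = structured mean + zero-mean random part of radius g.
g = 0.5
Z = g * rng.standard_normal((N, N)) / np.sqrt(N)
J = M + Z

# The outlier eigenvalue of J is well approximated by that of the rank-one mean
# structure, v^T u = f*jE + (1-f)*jI.
lam_full = np.linalg.eigvals(J)
print("outlier eigenvalue of full matrix:", round(float(np.max(lam_full.real)), 3))
print("rank-one (mean-structure) prediction:", round(float(v @ u), 3))
```

The paper's method generalizes this picture to multi-population statistics with motifs, where the low-rank approximation has Gaussian-mixture structure rather than a single deterministic outer product.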
Affiliation(s)
- Yuxiu Shao
- Laboratoire de Neurosciences Cognitives et Computationnelles, INSERM U960, Ecole Normale Superieure—PSL Research University, Paris, France
- Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives et Computationnelles, INSERM U960, Ecole Normale Superieure—PSL Research University, Paris, France
6
Movement is governed by rotational neural dynamics in spinal motor networks. Nature 2022; 610:526-531. PMID: 36224394. DOI: 10.1038/s41586-022-05293-w.
Abstract
Although the generation of movements is a fundamental function of the nervous system, the underlying neural principles remain unclear. As flexor and extensor muscle activities alternate during rhythmic movements such as walking, it is often assumed that the responsible neural circuitry similarly exhibits alternating activity [1]. Here we present ensemble recordings of neurons in the lumbar spinal cord that indicate that, rather than alternating, the population is performing a low-dimensional 'rotation' in neural space, in which the neural activity is cycling through all phases continuously during the rhythmic behaviour. The radius of rotation correlates with the intended muscle force, and a perturbation of the low-dimensional trajectory can modify the motor behaviour. As existing models of spinal motor control do not offer an adequate explanation of rotation [1,2], we propose a theory of neural generation of movements from which this and other unresolved issues, such as speed regulation, force control and multifunctionalism, are readily explained.
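A toy version of the 'rotation in neural space' observation: units that fire rhythmically with preferred phases tiling the whole cycle trace out a rotation in the top two principal components, rather than hopping between two alternating groups. The synthetic population below is an assumption for illustration, not the recorded lumbar ensemble data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_units, n_time = 60, 500
t = np.linspace(0, 4 * 2 * np.pi, n_time)                     # four locomotor-like cycles

# Each unit fires rhythmically with its own preferred phase, tiling the whole cycle.
phases = rng.uniform(0, 2 * np.pi, n_units)
rates = np.maximum(0, np.sin(t[None, :] + phases[:, None]))   # units x time, rectified
rates += 0.05 * rng.standard_normal(rates.shape)              # measurement noise

# PCA via SVD of the mean-centred population activity.
X = rates - rates.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
proj = Vt[:2] * S[:2, None]                                   # trajectory in the PC1-PC2 plane

angle = np.unwrap(np.arctan2(proj[1], proj[0]))
print("variance in top 2 PCs:", (S[:2] ** 2).sum() / (S ** 2).sum())
print("total rotation (cycles):", abs(angle[-1] - angle[0]) / (2 * np.pi))
```

The projected trajectory sweeps through roughly one full turn per behavioural cycle, which is the signature the paper reports for spinal population activity.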
7
Wolff A, Berberian N, Golesorkhi M, Gomez-Pilar J, Zilio F, Northoff G. Intrinsic neural timescales: temporal integration and segregation. Trends Cogn Sci 2022; 26:159-173. PMID: 34991988. DOI: 10.1016/j.tics.2021.11.007.
Abstract
We are continuously bombarded by external inputs of various timescales from the environment. How does the brain process this multitude of timescales? Recent resting state studies show a hierarchy of intrinsic neural timescales (INT) with a shorter duration in unimodal regions (e.g., visual cortex and auditory cortex) and a longer duration in transmodal regions (e.g., default mode network). This unimodal-transmodal hierarchy is present across acquisition modalities [electroencephalogram (EEG)/magnetoencephalogram (MEG) and fMRI] and can be found in different species and during a variety of different task states. Together, this suggests that the hierarchy of INT is central to the temporal integration (combining successive stimuli) and segregation (separating successive stimuli) of external inputs from the environment, leading to temporal segmentation and prediction in perception and cognition.
Affiliation(s)
- Annemarie Wolff
- Mind, Brain Imaging, and Neuroethics Research Unit, Institute of Mental Health Research, The Royal Ottawa Mental Health Centre and University of Ottawa, Ottawa, Canada
- Nareg Berberian
- Mind, Brain Imaging, and Neuroethics Research Unit, Institute of Mental Health Research, The Royal Ottawa Mental Health Centre and University of Ottawa, Ottawa, Canada
- Mehrshad Golesorkhi
- Mind, Brain Imaging, and Neuroethics Research Unit, Institute of Mental Health Research, The Royal Ottawa Mental Health Centre and University of Ottawa, Ottawa, Canada
- Javier Gomez-Pilar
- Biomedical Engineering Group, University of Valladolid, Paseo de Belén, 15, 47011 Valladolid, Spain
- Centro de Investigación Biomédica en Red en Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), Madrid, Spain
- Federico Zilio
- Department of Philosophy, Sociology, Education, and Applied Psychology, University of Padova, Padua, Italy
- Georg Northoff
- Mind, Brain Imaging, and Neuroethics Research Unit, Institute of Mental Health Research, The Royal Ottawa Mental Health Centre and University of Ottawa, Ottawa, Canada
- Centre for Cognition and Brain Disorders, Hangzhou Normal University, Hangzhou, China
- Mental Health Centre, Zhejiang University School of Medicine, Hangzhou, Zhejiang, China
8
Krishnamurthy K, Can T, Schwab DJ. Theory of Gating in Recurrent Neural Networks. Phys Rev X 2022; 12:011011. PMID: 36545030. PMCID: PMC9762509. DOI: 10.1103/physrevx.12.011011.
Abstract
Recurrent neural networks (RNNs) are powerful dynamical models, widely used in machine learning (ML) and neuroscience. Prior theoretical work has focused on RNNs with additive interactions. However, gating, i.e., multiplicative interactions, is ubiquitous in real neurons and is also the central feature of the best-performing RNNs in ML. Here, we show that gating offers flexible control of two salient features of the collective dynamics: (i) timescales and (ii) dimensionality. The gate controlling timescales leads to a novel marginally stable state, where the network functions as a flexible integrator. Unlike previous approaches, gating permits this important function without parameter fine-tuning or special symmetries. Gates also provide a flexible, context-dependent mechanism to reset the memory trace, thus complementing the memory function. The gate modulating the dimensionality can induce a novel, discontinuous chaotic transition, where inputs push a stable system to strong chaotic activity, in contrast to the typically stabilizing effect of inputs. At this transition, unlike additive RNNs, the proliferation of critical points (topological complexity) is decoupled from the appearance of chaotic dynamics (dynamical complexity). The rich dynamics are summarized in phase diagrams, thus providing a map for principled parameter initialization choices to ML practitioners.
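The timescale gate can be caricatured in a few lines: when an update gate multiplying the state change is near one, the network relaxes at its bare rate; when it is near zero, the effective timescale stretches by the inverse of the gate value. The discrete-time equations and parameters below are simplifying assumptions, not the authors' continuous-time gated model or its mean-field analysis.

```python
import numpy as np

rng = np.random.default_rng(4)
N, g, dt = 200, 1.5, 0.1
J  = g * rng.standard_normal((N, N)) / np.sqrt(N)    # recurrent couplings
Wz = rng.standard_normal((N, N)) / np.sqrt(N)        # couplings driving the update gate

def run(gate_bias, steps=3000):
    x = rng.standard_normal(N)
    traj = np.empty(steps)
    for t in range(steps):
        z = 1.0 / (1.0 + np.exp(-(Wz @ np.tanh(x) + gate_bias)))   # update gate in (0, 1)
        x = x + dt * z * (-x + J @ np.tanh(x))                     # gated leaky dynamics
        traj[t] = np.tanh(x[0])
    return traj

def ac_time(sig):
    """Lag (in model time units) at which the autocorrelation first drops below 1/e."""
    s = sig - sig.mean()
    ac = np.correlate(s, s, mode="full")[len(s) - 1:]
    ac /= ac[0]
    below = np.where(ac < 1 / np.e)[0]
    return (below[0] if below.size else len(ac)) * dt

for bias in (4.0, -4.0):   # gate mostly open vs mostly closed
    print(f"gate bias {bias:+.0f}: unit autocorrelation time ~ {ac_time(run(bias)):.1f} time units")
```

With the gate mostly closed, the same coupling matrix produces far slower fluctuations, which is the flexible-timescale behaviour the paper analyzes.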
Affiliation(s)
- Kamesh Krishnamurthy
- Joseph Henry Laboratories of Physics and PNI, Princeton University, Princeton, New Jersey 08544, USA
- Tankut Can
- Institute for Advanced Study, Princeton, New Jersey 08540, USA
- David J. Schwab
- Initiative for Theoretical Sciences, Graduate Center, CUNY, New York, New York 10016, USA
9
Golesorkhi M, Gomez-Pilar J, Zilio F, Berberian N, Wolff A, Yagoub MCE, Northoff G. The brain and its time: intrinsic neural timescales are key for input processing. Commun Biol 2021; 4:970. PMID: 34400800. PMCID: PMC8368044. DOI: 10.1038/s42003-021-02483-6.
Abstract
We process and integrate multiple timescales into one meaningful whole. Recent evidence suggests that the brain displays a complex multiscale temporal organization. Different regions exhibit different timescales as described by the concept of intrinsic neural timescales (INT); however, their function and neural mechanisms remain unclear. We review recent literature on INT and propose that they are key for input processing. Specifically, they are shared across different species, i.e., input sharing. This suggests a role of INT in encoding inputs through matching the inputs' stochastics with the ongoing temporal statistics of the brain's neural activity, i.e., input encoding. Following simulation and empirical data, we point out input integration versus segregation and input sampling as key temporal mechanisms of input processing. This deeply grounds the brain within its environmental and evolutionary context. It carries major implications for understanding mental features and psychiatric disorders, as well as for extending beyond the brain by integrating timescales into artificial intelligence.
Affiliation(s)
- Mehrshad Golesorkhi
- School of Electrical Engineering and Computer Science, University of Ottawa, Ottawa, Canada
- Mind, Brain Imaging and Neuroethics Research Unit, Institute of Mental Health, Royal Ottawa Mental Health Centre and University of Ottawa, Ottawa, Canada
- Javier Gomez-Pilar
- Biomedical Engineering Group, University of Valladolid, Valladolid, Spain
- Centro de Investigación Biomédica en Red en Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), Madrid, Spain
- Federico Zilio
- Department of Philosophy, Sociology, Education and Applied Psychology, University of Padova, Padua, Italy
- Nareg Berberian
- Mind, Brain Imaging and Neuroethics Research Unit, Institute of Mental Health, Royal Ottawa Mental Health Centre and University of Ottawa, Ottawa, Canada
- Annemarie Wolff
- Mind, Brain Imaging and Neuroethics Research Unit, Institute of Mental Health, Royal Ottawa Mental Health Centre and University of Ottawa, Ottawa, Canada
- Mustapha C. E. Yagoub
- School of Electrical Engineering and Computer Science, University of Ottawa, Ottawa, Canada
- Georg Northoff
- Mind, Brain Imaging and Neuroethics Research Unit, Institute of Mental Health, Royal Ottawa Mental Health Centre and University of Ottawa, Ottawa, Canada
- Centre for Cognition and Brain Disorders, Hangzhou Normal University, Hangzhou, China
- Mental Health Centre, Zhejiang University School of Medicine, Hangzhou, Zhejiang, China
10
Bondanelli G, Deneux T, Bathellier B, Ostojic S. Network dynamics underlying OFF responses in the auditory cortex. eLife 2021; 10:e53151. PMID: 33759763. PMCID: PMC8057817. DOI: 10.7554/elife.53151.
Abstract
Across sensory systems, complex spatio-temporal patterns of neural activity arise following the onset (ON) and offset (OFF) of stimuli. While ON responses have been widely studied, the mechanisms generating OFF responses in cortical areas have so far not been fully elucidated. We examine here the hypothesis that OFF responses are single-cell signatures of recurrent interactions at the network level. To test this hypothesis, we performed population analyses of two-photon calcium recordings in the auditory cortex of awake mice listening to auditory stimuli, and compared them to linear single-cell and network models. While the single-cell model explained some prominent features of the data, it could not capture the structure across stimuli and trials. In contrast, the network model accounted for the low-dimensional organization of population responses and their global structure across stimuli, where distinct stimuli activated mostly orthogonal dimensions in the neural state-space.
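The network-level explanation relies on transient amplification in a linear recurrent system: starting from the state reached at stimulus offset, a non-normal connectivity can transiently grow the population activity before it decays. The sketch below contrasts a strongly non-normal (feedforward-like) coupling with no coupling at all; the connectivity and offset state are illustrative assumptions, not quantities fitted to the calcium-imaging data.

```python
import numpy as np

N, dt, steps = 100, 0.01, 1000
rng = np.random.default_rng(5)

# Non-normal connectivity: a strong feedforward link from one activity pattern to another.
u = rng.standard_normal(N)
u /= np.linalg.norm(u)
v = rng.standard_normal(N)
v -= (v @ u) * u                   # make v orthogonal to u
v /= np.linalg.norm(v)
W = 8.0 * np.outer(v, u)           # maps pattern u onto pattern v with large gain

def off_response(W):
    r = u.copy()                   # population state at stimulus offset (assumed to lie along u)
    norms = np.empty(steps)
    for t in range(steps):
        r = r + dt * (-r + W @ r)  # linear network dynamics dr/dt = -r + W r
        norms[t] = np.linalg.norm(r)
    return norms

print("peak ||r|| with non-normal coupling:", off_response(W).max())
print("peak ||r|| with no coupling        :", off_response(np.zeros((N, N))).max())
```

Without coupling the offset state simply decays, whereas the feedforward-like coupling produces a transient OFF-like excursion along a dimension orthogonal to the initial state, mirroring the orthogonal stimulus dimensions described in the abstract.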
Affiliation(s)
- Giulio Bondanelli
- Laboratoire de Neurosciences Cognitives et Computationnelles, Département d’études cognitives, ENS, PSL University, INSERM, Paris, France
- Neural Computation Laboratory, Center for Human Technologies, Istituto Italiano di Tecnologia (IIT), Genoa, Italy
- Thomas Deneux
- Département de Neurosciences Intégratives et Computationnelles (ICN), Institut des Neurosciences Paris-Saclay (NeuroPSI), UMR 9197 CNRS, Université Paris Sud, Gif-sur-Yvette, France
- Brice Bathellier
- Département de Neurosciences Intégratives et Computationnelles (ICN), Institut des Neurosciences Paris-Saclay (NeuroPSI), UMR 9197 CNRS, Université Paris Sud, Gif-sur-Yvette, France
- Institut Pasteur, INSERM, Institut de l’Audition, Paris, France
- Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives et Computationnelles, Département d’études cognitives, ENS, PSL University, INSERM, Paris, France
11
Biophysically grounded mean-field models of neural populations under electrical stimulation. PLoS Comput Biol 2020; 16:e1007822. PMID: 32324734. PMCID: PMC7200022. DOI: 10.1371/journal.pcbi.1007822.
Abstract
Electrical stimulation of neural systems is a key tool for understanding neural dynamics and ultimately for developing clinical treatments. Many applications of electrical stimulation affect large populations of neurons. However, computational models of large networks of spiking neurons are inherently hard to simulate and analyze. We evaluate a reduced mean-field model of excitatory and inhibitory adaptive exponential integrate-and-fire (AdEx) neurons which can be used to efficiently study the effects of electrical stimulation on large neural populations. The rich dynamical properties of this basic cortical model are described in detail and validated using large network simulations. Bifurcation diagrams reflecting the network's state reveal asynchronous up- and down-states, bistable regimes, and oscillatory regions corresponding to fast excitation-inhibition and slow excitation-adaptation feedback loops. The biophysical parameters of the AdEx neuron can be coupled to an electric field with realistic field strengths which then can be propagated up to the population description. We show how on the edge of bifurcation, direct electrical inputs cause network state transitions, such as turning on and off oscillations of the population rate. Oscillatory input can frequency-entrain and phase-lock endogenous oscillations. Relatively weak electric field strengths on the order of 1 V/m are able to produce these effects, indicating that field effects are strongly amplified in the network. The effects of time-varying external stimulation are well-predicted by the mean-field model, further underpinning the utility of low-dimensional neural mass models.
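At the single-cell level, coupling an electric field to the model amounts to adding a small field-dependent current to the adaptive exponential integrate-and-fire (AdEx) equations. The sketch below integrates one AdEx neuron with textbook parameter values and a weak 4 Hz current standing in for a 1 V/m field; the field-to-current factor is a made-up placeholder, and the paper's actual contribution, the population mean-field reduction and its bifurcation structure, is not reproduced here. The weakness of the single-cell effect is part of why the reported network amplification matters.

```python
import numpy as np

# Standard AdEx parameters (Brette & Gerstner 2005), in SI units.
C, gL, EL = 281e-12, 30e-9, -70.6e-3        # capacitance, leak conductance, leak reversal
VT, DT = -50.4e-3, 2e-3                     # threshold and slope factor
tau_w, a, b = 144e-3, 4e-9, 80.5e-12        # adaptation time constant, subthreshold and spike coupling
V_reset, V_cut = -70.6e-3, 0.0

dt, T = 0.1e-3, 2.0
n = int(T / dt)
t = np.arange(n) * dt

I_base = 0.65e-9                                     # constant drive slightly above rheobase (assumed)
field = 1.0 * np.sin(2 * np.pi * 4 * t)              # 1 V/m field oscillating at 4 Hz
I_field = 10e-12 * field                             # hypothetical field-to-current factor (10 pA per V/m)

V, w = EL, 0.0
spikes = []
for i in range(n):
    dV = (-gL * (V - EL) + gL * DT * np.exp((V - VT) / DT) - w + I_base + I_field[i]) / C
    dw = (a * (V - EL) - w) / tau_w
    V += dt * dV
    w += dt * dw
    if V >= V_cut:                                   # spike: reset voltage, increment adaptation
        spikes.append(t[i])
        V = V_reset
        w += b

spikes = np.array(spikes)
print(f"{spikes.size} spikes in {T:.0f} s")
if spikes.size:
    print("mean field value at spike times:", round(float(np.interp(spikes, t, field).mean()), 2))
```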
12
Muscinelli SP, Gerstner W, Schwalger T. How single neuron properties shape chaotic dynamics and signal transmission in random neural networks. PLoS Comput Biol 2019; 15:e1007122. PMID: 31181063. PMCID: PMC6586367. DOI: 10.1371/journal.pcbi.1007122.
Abstract
While most models of randomly connected neural networks assume single-neuron models with simple dynamics, neurons in the brain exhibit complex intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of single neurons and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate neurons shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single neurons. For the case of two-dimensional rate neurons with strong adaptation, we find that the network exhibits a state of "resonant chaos", characterized by robust, narrow-band stochastic oscillations. The coherence of stochastic oscillations is maximal at the onset of chaos and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated neurons, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally-generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single neurons, such as synaptic filtering, refractoriness or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic neural networks.
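A direct simulation gives a rough feel for "resonant chaos": randomly coupled rate units with a slow adaptation variable fluctuate chaotically, and the power spectrum of a single unit can be compared against the frequency at which the isolated unit's linear frequency response is largest. All parameter values are assumptions for illustration, and this brute-force simulation stands in for, rather than reproduces, the paper's dynamical mean-field theory.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(6)
N, g = 200, 2.5
tau_x, tau_a, beta = 1.0, 10.0, 4.0        # membrane and adaptation timescales, adaptation strength
J = g * rng.standard_normal((N, N)) / np.sqrt(N)

dt, steps, burn = 0.05, 30_000, 5_000
x, adapt = rng.standard_normal(N), np.zeros(N)
rec = []
for step in range(steps):
    inp = J @ np.tanh(x)
    x += dt / tau_x * (-x - beta * adapt + inp)    # rate variable with adaptation feedback
    adapt += dt / tau_a * (-adapt + x)             # slow adaptation tracking x
    if step >= burn:
        rec.append(x[0])

f, pxx = welch(np.array(rec), fs=1 / dt, nperseg=4096)
print("peak frequency of network unit activity:", f[np.argmax(pxx[1:]) + 1])

# Linear frequency response of an isolated unit: chi(w) = 1 / (1 + i w tau_x + beta / (1 + i w tau_a)).
freqs = np.linspace(0.01, 1.0, 500)
w = 2 * np.pi * freqs
chi = 1.0 / (1 + 1j * w * tau_x + beta / (1 + 1j * w * tau_a))
print("peak of isolated-unit frequency response:", freqs[np.argmax(np.abs(chi))])
```

Comparing the two printed frequencies illustrates the paper's point that the network's stochastic oscillation sits near a resonance already present in the single-unit response function.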
Affiliation(s)
- Samuel P. Muscinelli
- School of Computer and Communication Sciences and School of Life Sciences, École polytechnique fédérale de Lausanne, Station 15, CH-1015 Lausanne EPFL, Switzerland
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, École polytechnique fédérale de Lausanne, Station 15, CH-1015 Lausanne EPFL, Switzerland
- Tilo Schwalger
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
- Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany