1
Song D, Chung DW, Ermentrout GB. Mean-field analysis of synaptic alterations underlying deficient cortical gamma oscillations in schizophrenia. Research Square 2024:rs.3.rs-3938805. PMID: 38410475; PMCID: PMC10896366; DOI: 10.21203/rs.3.rs-3938805/v1.
Abstract
Deficient gamma oscillations in the prefrontal cortex (PFC) of individuals with schizophrenia (SZ) are proposed to arise from alterations in the excitatory drive to fast-spiking interneurons (E→I) and in the inhibitory drive from these interneurons to excitatory neurons (I→E). Consistent with this idea, prior postmortem studies showed lower levels of molecular and structural markers for the strength of E→I and I→E synapses, as well as greater variability in E→I synaptic strength, in the PFC of SZ subjects. Moreover, simulating these alterations in a network of quadratic integrate-and-fire (QIF) neurons revealed a synergistic effect of their interactions on reducing gamma power. In this study, we aimed to investigate the dynamical nature of this synergistic interaction at the macroscopic level by deriving a mean-field description of the QIF model network, which consists of all-to-all connected excitatory neurons and fast-spiking interneurons. Through a series of numerical simulations and bifurcation analyses, our mean-field model showed that the macroscopic dynamics of gamma oscillations are synergistically disrupted by the interactions among lower strength of E→I and I→E synapses and greater variability in E→I synaptic strength. Furthermore, two-dimensional bifurcation analyses showed that this synergistic interaction is primarily driven by the shift in the Hopf bifurcation due to lower E→I synaptic strength. Together, these simulations predict the dynamical mechanisms by which multiple synaptic alterations interact to robustly reduce PFC gamma power in SZ, and highlight the utility of mean-field models for studying macroscopic neural dynamics and their alterations in the illness.
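The exact mean-field reduction of a QIF network referred to here is commonly written as a pair of equations per population for the firing rate r and mean membrane potential v (the Montbrió-Pazó-Roxin form). As a structural illustration only — the coupling strengths, excitability parameters, and initial conditions below are invented for this sketch and are not those of the paper — a two-population excitatory-inhibitory version can be integrated directly:

```python
import math

def simulate_ei(T=50.0, dt=1e-3):
    """Two coupled QIF mean-field populations (excitatory E, inhibitory I),
    each described by a firing rate r and a mean membrane potential v.
    All parameter values are illustrative, not taken from the paper."""
    delta = 0.5                  # half-width of the Lorentzian excitability distribution
    eta_e, eta_i = 1.0, -1.0     # mean excitabilities of E and I populations
    j_ei, j_ie = 8.0, 8.0        # E->I and I->E synaptic strengths
    r_e, v_e = 0.1, -0.5
    r_i, v_i = 0.1, -0.5
    trace = []
    for _ in range(int(round(T / dt))):
        # r-equation: widened by heterogeneity (delta), driven by r*v
        dre = delta / math.pi + 2.0 * r_e * v_e
        # v-equation: QIF quadratic term, drive, synaptic input, -(pi*r)^2
        dve = v_e ** 2 + eta_e - j_ie * r_i - (math.pi * r_e) ** 2
        dri = delta / math.pi + 2.0 * r_i * v_i
        dvi = v_i ** 2 + eta_i + j_ei * r_e - (math.pi * r_i) ** 2
        r_e += dt * dre; v_e += dt * dve
        r_i += dt * dri; v_i += dt * dvi
        trace.append((r_e, r_i))
    return trace
```

Bifurcation analyses of the kind described in the abstract would then track how fixed points and limit cycles of these four equations change as the synaptic strengths and the heterogeneity width are varied.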
Affiliation(s)
- Deying Song
- Joint Program in Neural Computation and Machine Learning, Neuroscience Institute, and Machine Learning Department, Carnegie Mellon University, Pittsburgh, PA 15213, USA
- Center for the Neural Basis of Cognition, University of Pittsburgh, Pittsburgh, PA 15213, USA
- Daniel W. Chung
- Translational Neuroscience Program, Department of Psychiatry, University of Pittsburgh, Pittsburgh, PA 15213, USA
- G. Bard Ermentrout
- Center for the Neural Basis of Cognition, University of Pittsburgh, Pittsburgh, PA 15213, USA
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA 15213, USA
2
Duchet B, Bick C, Byrne Á. Mean-Field Approximations With Adaptive Coupling for Networks With Spike-Timing-Dependent Plasticity. Neural Comput 2023;35:1481-1528. PMID: 37437202; PMCID: PMC10422128; DOI: 10.1162/neco_a_01601.
Abstract
Understanding the effects of spike-timing-dependent plasticity (STDP) is key to elucidating how neural networks change over long timescales and to designing interventions aimed at modulating such networks in neurological disorders. However, progress is restricted by the significant computational cost of simulating neural network models with STDP and by the lack of low-dimensional descriptions that could provide analytical insight. Phase-difference-dependent plasticity (PDDP) rules, which prescribe synaptic changes based on the phase differences of neuron pairs rather than on differences in spike timing, approximate STDP in phase oscillator networks. Here we construct mean-field approximations for phase oscillator networks with STDP to describe part of the phase space of this very high-dimensional system. We first show that single-harmonic PDDP rules can approximate a simple form of symmetric STDP, while multiharmonic rules are required to accurately approximate causal STDP. We then derive exact expressions for the evolution of the average PDDP coupling weight in terms of network synchrony. For adaptive networks of Kuramoto oscillators that form clusters, we formulate a family of low-dimensional descriptions based on the mean-field dynamics of each cluster and the average coupling weights between and within clusters. Finally, we show that such a two-cluster mean-field model can be fitted to synthetic data to provide a low-dimensional approximation of a full adaptive network with symmetric STDP. Our framework represents a step toward a low-dimensional description of adaptive networks with STDP and could, for example, inform the development of new therapies aimed at maximizing the long-lasting effects of brain stimulation.
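A single-harmonic, Hebbian-style PDDP rule can be sketched directly on a small Kuramoto network. The specific rule below, dk_ij/dt = ε(cos(θ_j − θ_i) − k_ij), and all parameter values are illustrative assumptions for the sketch, not the multiharmonic rules analyzed and fitted in the paper:

```python
import math, random

def simulate_adaptive_kuramoto(n=10, steps=5000, dt=0.01, eps=0.2, seed=0):
    """Kuramoto oscillators with slowly adapting pairwise couplings under a
    single-harmonic, Hebbian-style PDDP rule (illustrative parameters):
        dtheta_i/dt = omega_i + (1/n) * sum_j k_ij * sin(theta_j - theta_i)
        dk_ij/dt    = eps * (cos(theta_j - theta_i) - k_ij)
    Couplings between near-in-phase pairs grow; antiphase pairs weaken."""
    rng = random.Random(seed)
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    omega = [rng.gauss(0.0, 0.5) for _ in range(n)]
    k = [[0.0] * n for _ in range(n)]
    for _ in range(steps):
        dtheta = [omega[i] + sum(k[i][j] * math.sin(theta[j] - theta[i])
                                 for j in range(n)) / n
                  for i in range(n)]
        dk = [[eps * (math.cos(theta[j] - theta[i]) - k[i][j])
               for j in range(n)] for i in range(n)]
        for i in range(n):      # update phases and weights with pre-step values
            theta[i] += dt * dtheta[i]
            for j in range(n):
                k[i][j] += dt * dk[i][j]
    # Kuramoto order parameter |<exp(i*theta)>| measures final synchrony
    cr = sum(math.cos(t) for t in theta) / n
    ci = sum(math.sin(t) for t in theta) / n
    return k, math.hypot(cr, ci)
```

The mean-field approximations in the paper replace the n² weights by the average coupling within and between clusters, closing the system with the order parameter of each cluster.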
Affiliation(s)
- Benoit Duchet
- Nuffield Department of Clinical Neuroscience, University of Oxford, Oxford OX3 9DU, U.K.
- MRC Brain Network Dynamics Unit, University of Oxford, Oxford OX1 3TH, U.K.
- Christian Bick
- Department of Mathematics, Vrije Universiteit Amsterdam, Amsterdam 1081 HV, the Netherlands
- Amsterdam Neuroscience-Systems and Network Neuroscience, Amsterdam 1081 HV, the Netherlands
- Mathematical Institute, University of Oxford, Oxford OX2 6GG, U.K.
- Áine Byrne
- School of Mathematics and Statistics, University College Dublin, Dublin D04 V1W8, Ireland
3
Sawicki J, Berner R, Loos SAM, Anvari M, Bader R, Barfuss W, Botta N, Brede N, Franović I, Gauthier DJ, Goldt S, Hajizadeh A, Hövel P, Karin O, Lorenz-Spreen P, Miehl C, Mölter J, Olmi S, Schöll E, Seif A, Tass PA, Volpe G, Yanchuk S, Kurths J. Perspectives on adaptive dynamical systems. Chaos 2023;33:071501. PMID: 37486668; DOI: 10.1063/5.0147231.
Abstract
Adaptivity is a dynamical feature that is omnipresent in nature, socio-economics, and technology. For example, adaptive couplings appear in various real-world systems, such as power grids and social and neural networks, and they form the backbone of closed-loop control strategies and machine learning algorithms. In this article, we provide an interdisciplinary perspective on adaptive systems. We reflect on the notion and terminology of adaptivity in different disciplines and discuss the role adaptivity plays in various fields. We highlight common open challenges and offer perspectives on future research directions, with the aim of inspiring interdisciplinary approaches.
Affiliation(s)
- Jakub Sawicki
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Akademie Basel, Fachhochschule Nordwestschweiz FHNW, Leonhardsstrasse 6, 4009 Basel, Switzerland
- Rico Berner
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Sarah A M Loos
- DAMTP, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA, United Kingdom
- Mehrnaz Anvari
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Fraunhofer Institute for Algorithms and Scientific Computing, Schloss Birlinghoven, 53757 Sankt-Augustin, Germany
- Rolf Bader
- Institute of Systematic Musicology, University of Hamburg, Hamburg, Germany
- Wolfram Barfuss
- Transdisciplinary Research Area: Sustainable Futures, University of Bonn, 53113 Bonn, Germany
- Center for Development Research (ZEF), University of Bonn, 53113 Bonn, Germany
- Nicola Botta
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Department of Computer Science and Engineering, Chalmers University of Technology, 412 96 Göteborg, Sweden
- Nuria Brede
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Department of Computer Science, University of Potsdam, An der Bahn 2, 14476 Potsdam, Germany
- Igor Franović
- Scientific Computing Laboratory, Center for the Study of Complex Systems, Institute of Physics Belgrade, University of Belgrade, Pregrevica 118, 11080 Belgrade, Serbia
- Daniel J Gauthier
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Sebastian Goldt
- Department of Physics, International School of Advanced Studies (SISSA), Trieste, Italy
- Aida Hajizadeh
- Research Group Comparative Neuroscience, Leibniz Institute for Neurobiology, Magdeburg, Germany
- Philipp Hövel
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Omer Karin
- Department of Mathematics, Imperial College London, London SW7 2AZ, United Kingdom
- Philipp Lorenz-Spreen
- Center for Adaptive Rationality, Max Planck Institute for Human Development, Lentzeallee 94, 14195 Berlin, Germany
- Christoph Miehl
- Akademie Basel, Fachhochschule Nordwestschweiz FHNW, Leonhardsstrasse 6, 4009 Basel, Switzerland
- Jan Mölter
- Department of Mathematics, School of Computation, Information and Technology, Technical University of Munich, Boltzmannstraße 3, 85748 Garching bei München, Germany
- Simona Olmi
- Akademie Basel, Fachhochschule Nordwestschweiz FHNW, Leonhardsstrasse 6, 4009 Basel, Switzerland
- Eckehard Schöll
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Akademie Basel, Fachhochschule Nordwestschweiz FHNW, Leonhardsstrasse 6, 4009 Basel, Switzerland
- Alireza Seif
- Pritzker School of Molecular Engineering, The University of Chicago, Chicago, Illinois 60637, USA
- Peter A Tass
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, California 94304, USA
- Giovanni Volpe
- Department of Physics, University of Gothenburg, Gothenburg, Sweden
- Serhiy Yanchuk
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Jürgen Kurths
- Potsdam Institute for Climate Impact Research, Telegrafenberg, 14473 Potsdam, Germany
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
4
Ferrara A, Angulo-Garcia D, Torcini A, Olmi S. Population spiking and bursting in next-generation neural masses with spike-frequency adaptation. Phys Rev E 2023;107:024311. PMID: 36932567; DOI: 10.1103/physreve.107.024311.
Abstract
Spike-frequency adaptation (SFA) is a fundamental neuronal mechanism that accounts for the fatigue caused by spike emission and the consequent reduction in firing activity. We have studied the effect of this adaptation mechanism on the macroscopic dynamics of excitatory and inhibitory networks of quadratic integrate-and-fire (QIF) neurons coupled via exponentially decaying post-synaptic potentials. In particular, we have studied the population activities by employing an exact mean-field reduction, which gives rise to next-generation neural mass models. This low-dimensional reduction allows for the derivation of bifurcation diagrams and the identification of the possible macroscopic regimes emerging both in a single neural mass and in two identically coupled neural masses. In single populations, SFA favors the emergence of population bursts in excitatory networks, while it hinders tonic population spiking in inhibitory ones. The symmetric coupling of two neural masses, in the absence of adaptation, leads to the emergence of macroscopic solutions with broken symmetry: chimera-like solutions in the inhibitory case and antiphase population spikes in the excitatory one. The addition of SFA leads to new collective dynamical regimes exhibiting cross-frequency coupling (CFC) between the fast synaptic timescale and the slow adaptation one, ranging from antiphase slow-fast nested oscillations to symmetric and asymmetric bursting phenomena. The analysis of these CFC rhythms in the θ-γ range reveals that a reduction of SFA leads to an increase in the θ frequency together with a decrease in the γ frequency. This is analogous to what has been reported experimentally for the hippocampus and the olfactory cortex of rodents under cholinergic modulation, which is known to reduce SFA.
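Structurally, attaching SFA to a next-generation neural mass amounts to adding one slow variable that is driven by the population rate and fed back negatively into the mean-voltage equation. The adaptation form and every parameter value below are illustrative assumptions for a minimal sketch, not the model or the regimes analyzed in the paper:

```python
import math

def simulate_sfa(T=100.0, dt=1e-3, delta=0.3, eta=-1.0, j=3.0,
                 beta=2.0, tau_a=10.0):
    """Single QIF mean-field population (rate r, mean voltage v) with a slow
    spike-frequency-adaptation variable `a`: `a` is driven by the rate on a
    slow timescale tau_a and subtracted from the voltage drive.
    All parameters are illustrative."""
    r, v, a = 0.05, -1.0, 0.0
    trace = []
    for _ in range(int(round(T / dt))):
        dr = delta / math.pi + 2.0 * r * v
        dv = v * v + eta + j * r - a - (math.pi * r) ** 2
        da = (beta * r - a) / tau_a           # slow adaptation dynamics
        r += dt * dr
        v += dt * dv
        a += dt * da
        trace.append((r, a))
    return trace
```

In other parameter regimes the separation between the fast (r, v) and slow (a) timescales is what permits the population bursting and slow-fast nested oscillations described in the abstract.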
Affiliation(s)
- Alberto Ferrara
- Sorbonne Université, INSERM, CNRS, Institut de la Vision, 75012 Paris, France
- David Angulo-Garcia
- Departamento de Matemáticas y Estadística, Universidad Nacional de Colombia (UNAL), Cra 27 No. 64-60, 170003 Manizales, Colombia
- Alessandro Torcini
- Laboratoire de Physique Théorique et Modélisation, UMR 8089, CY Cergy Paris Université, CNRS, 95302 Cergy-Pontoise, France
- CNR, Consiglio Nazionale delle Ricerche, Istituto dei Sistemi Complessi, via Madonna del Piano 10, 50019 Sesto Fiorentino, Italy
- INFN, Sezione di Firenze, via Sansone 1, 50019 Sesto Fiorentino, Italy
- Simona Olmi
- CNR, Consiglio Nazionale delle Ricerche, Istituto dei Sistemi Complessi, via Madonna del Piano 10, 50019 Sesto Fiorentino, Italy
- INFN, Sezione di Firenze, via Sansone 1, 50019 Sesto Fiorentino, Italy
5
Gast R, Solla SA, Kennedy A. Macroscopic dynamics of neural networks with heterogeneous spiking thresholds. Phys Rev E 2023;107:024306. PMID: 36932598; DOI: 10.1103/physreve.107.024306.
Abstract
Mean-field theory links the physiological properties of individual neurons to the emergent dynamics of neural population activity. These models provide an essential tool for studying brain function at different scales; however, to be applicable to neural populations at large scale, they need to account for differences between distinct neuron types. The Izhikevich single-neuron model can reproduce a broad range of neuron types and spiking patterns, making it an optimal candidate for a mean-field theoretic treatment of brain dynamics in heterogeneous networks. Here we derive the mean-field equations for networks of all-to-all coupled Izhikevich neurons with heterogeneous spiking thresholds. Using methods from bifurcation theory, we examine the conditions under which the mean-field theory accurately predicts the dynamics of the Izhikevich neuron network. To this end, we focus on three important features of the Izhikevich model that are subject to simplifying assumptions here: (i) spike-frequency adaptation, (ii) the spike reset conditions, and (iii) the distribution of single-cell spike thresholds across neurons. Our results indicate that, while the mean-field model is not an exact description of the Izhikevich network dynamics, it faithfully captures its different dynamic regimes and phase transitions. We thus present a mean-field model that can represent different neuron types and spiking dynamics. The model comprises biophysical state variables and parameters, incorporates realistic spike-resetting conditions, and accounts for heterogeneity in neural spiking thresholds. These features allow for broad applicability of the model as well as direct comparison to experimental data.
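For reference, the single-cell model underlying this mean-field treatment is Izhikevich's two-variable neuron with a discontinuous spike reset. A minimal simulation with the widely used regular-spiking parameter set (the input value and durations below are illustrative, not from the paper):

```python
def izhikevich(i_ext=10.0, T=1000.0, dt=0.25, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Izhikevich's two-variable neuron, Euler-integrated, with the model's
    discontinuous reset. Default a, b, c, d are the standard regular-spiking
    parameters; i_ext is a constant input current. Returns spike times."""
    v, u = c, b * c            # membrane potential and recovery variable
    spikes = []
    for step in range(int(T / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_ext)
        u += dt * a * (b * v - u)
        if v >= 30.0:          # spike detected: apply the reset map
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes
```

The mean-field reduction in the paper replaces a large heterogeneous network of such units by low-dimensional equations for the population rate and mean voltage; the reset rule and the threshold distribution are precisely the features (ii) and (iii) that require simplifying assumptions.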
Affiliation(s)
- Richard Gast
- Department of Neuroscience, Feinberg School of Medicine, Northwestern University, Chicago, Illinois 60611, USA
- Sara A Solla
- Department of Neuroscience, Feinberg School of Medicine, Northwestern University, Chicago, Illinois 60611, USA
- Ann Kennedy
- Department of Neuroscience, Feinberg School of Medicine, Northwestern University, Chicago, Illinois 60611, USA
6
Pietras B, Schmutz V, Schwalger T. Mesoscopic description of hippocampal replay and metastability in spiking neural networks with short-term plasticity. PLoS Comput Biol 2022;18:e1010809. PMID: 36548392; PMCID: PMC9822116; DOI: 10.1371/journal.pcbi.1010809.
Abstract
Bottom-up models of functionally relevant patterns of neural activity provide an explicit link between neuronal dynamics and computation. A prime example of such functional activity patterns is hippocampal replay: propagating bursts of place-cell activity that are critical for memory consolidation. The sudden and repeated occurrence of these burst states during ongoing neural activity suggests metastable neural circuit dynamics. As metastability has been attributed to noise and/or slow fatigue mechanisms, we propose a concise mesoscopic model that accounts for both. Crucially, our model is bottom-up: it is analytically derived from the dynamics of finite-size networks of Linear-Nonlinear Poisson neurons with short-term synaptic depression. As such, noise is explicitly linked to stochastic spiking and network size, and fatigue is explicitly linked to synaptic dynamics. To derive the mesoscopic model, we first consider a homogeneous spiking neural network and follow the temporal coarse-graining approach of Gillespie to obtain a "chemical Langevin equation", which can be naturally interpreted as a stochastic neural mass model. The Langevin equation is computationally inexpensive to simulate and enables a thorough study of metastable dynamics in classical setups (population spikes and up-down-state dynamics) by means of phase-plane analysis. An extension of the Langevin equation for small network sizes is also presented. The stochastic neural mass model constitutes the basic component of our mesoscopic model for replay. We show that the mesoscopic model faithfully captures the statistical structure of individual replayed trajectories in microscopic simulations and in previously reported experimental data. Moreover, compared to the deterministic Romani-Tsodyks model of place-cell dynamics, it exhibits a higher level of variability in the order, direction, and timing of replayed trajectories, which seems biologically more plausible and could be functionally desirable. This variability is the product of a new dynamical regime in which metastability emerges from a complex interplay between finite-size fluctuations and local fatigue.
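The flavor of a Langevin-style stochastic neural mass can be sketched as a rate equation whose population activity carries Gaussian finite-size fluctuations with variance proportional to rate/N, coupled to a synaptic-depression resource. The transfer function and every parameter below are invented for this illustration; this is a structural sketch, not the model derived in the paper:

```python
import math, random

def stochastic_neural_mass(n=400, T=20.0, dt=1e-3, seed=1):
    """Rate model for a finite population of n neurons with short-term
    synaptic depression. The population activity is the mean rate plus
    Gaussian finite-size noise scaling as 1/sqrt(n); all parameters and
    the sigmoidal transfer function are illustrative assumptions."""
    rng = random.Random(seed)
    h, x = 0.0, 1.0              # input potential, available synaptic resources
    tau, tau_d, u, w = 0.02, 0.5, 0.3, 0.2
    rate_max = 50.0
    trace = []
    for _ in range(int(round(T / dt))):
        rate = rate_max / (1.0 + math.exp(-(h - 1.0)))  # assumed transfer function
        # finite-size population activity: fluctuations vanish as n -> infinity
        act = max(0.0, rate + math.sqrt(rate / (n * dt)) * rng.gauss(0.0, 1.0))
        h += dt * (-h + w * u * x * act) / tau
        x += dt * ((1.0 - x) / tau_d - u * x * act)     # depression dynamics
        trace.append((h, x))
    return trace
```

Because the noise amplitude is tied to the rate and the population size, shrinking n makes spontaneous transitions between quiescent and active states more frequent, which is the mechanism behind the metastability discussed in the abstract.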
Affiliation(s)
- Bastian Pietras
- Institute for Mathematics, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
- Valentin Schmutz
- Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Tilo Schwalger
- Institute for Mathematics, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
7
Vanag VK. Plasticity in networks of active chemical cells with pulse coupling. Chaos 2022;32:123108. PMID: 36587337; DOI: 10.1063/5.0110190.
Abstract
A method for controlling the coupling strength is proposed for pulse-coupled active chemical micro-cells. The method is consistent with Hebb's rules. The effect of various system parameters on this "spike-timing-dependent plasticity" is studied. In addition to networks of two and three coupled active cells, the effect of this "plasticity" on the dynamical modes of a network of four pulse-coupled chemical micro-cells unidirectionally coupled in a circle is studied. It is shown that the proposed adjustment of the coupling strengths leads to spontaneous switching between network eigenmodes.
Affiliation(s)
- Vladimir K Vanag
- Centre for Nonlinear Chemistry, Immanuel Kant Baltic Federal University, 14 A. Nevskogo St., Kaliningrad 236041, Russia
8
Tiddia G, Golosio B, Fanti V, Paolucci PS. Simulations of working memory spiking networks driven by short-term plasticity. Front Integr Neurosci 2022;16:972055. PMID: 36262372; PMCID: PMC9574057; DOI: 10.3389/fnint.2022.972055.
Abstract
Working memory (WM) is a cognitive mechanism that enables the temporary holding and manipulation of information in the human brain. It is chiefly characterized by neuronal activity in which neuron populations maintain enhanced spiking after being triggered by a short external cue. In this study, we implement, using the NEST simulator, a spiking neural network model in which WM activity is sustained by a mechanism of short-term synaptic facilitation related to presynaptic calcium kinetics. The model, built from leaky integrate-and-fire neurons with exponential postsynaptic currents, autonomously exhibits a regime in which memory information can be stored in synaptic form as a result of synaptic facilitation, with spiking activity serving to maintain the facilitation. The network can hold multiple memories simultaneously through alternating synchronous activity that preserves the synaptic facilitation within the neuron populations holding the memory information. These results confirm that a WM mechanism can be sustained by synaptic facilitation.
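The calcium-related facilitation mechanism is typically written in Tsodyks-Markram form: a utilization variable u that rises with presynaptic activity and decays slowly, alongside a faster-recovering resource x. A rate-based sketch with invented parameters (not the NEST network of the paper) shows the synaptic memory trace outliving the cue:

```python
def facilitation_memory(T=3.0, dt=1e-3, tau_f=1.5, tau_d=0.2, U=0.2):
    """Tsodyks-Markram-style short-term plasticity driven by a population
    rate r(t): utilization u (facilitation, slow decay tau_f) and resources
    x (depression, fast recovery tau_d). A brief cue elevates u, which then
    stores the memory synaptically well after spiking stops.
    All parameter values are illustrative."""
    u, x = U, 1.0
    us = []
    for step in range(int(round(T / dt))):
        t = step * dt
        r = 40.0 if 0.5 <= t < 0.7 else 0.0   # population rate: brief cue only
        u += dt * ((U - u) / tau_f + U * (1.0 - u) * r)
        x += dt * ((1.0 - x) / tau_d - u * x * r)
        us.append(u)
    return us
```

Because tau_f is much longer than tau_d, u remains elevated for more than a second after the cue ends, which is the synaptic (activity-silent) memory trace the abstract describes; reactivating the population while u is still high yields an amplified response.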
Affiliation(s)
- Gianmarco Tiddia
- Department of Physics, University of Cagliari, Monserrato, Italy
- Istituto Nazionale di Fisica Nucleare (INFN), Sezione di Cagliari, Monserrato, Italy
- Bruno Golosio
- Department of Physics, University of Cagliari, Monserrato, Italy
- Istituto Nazionale di Fisica Nucleare (INFN), Sezione di Cagliari, Monserrato, Italy
- Viviana Fanti
- Department of Physics, University of Cagliari, Monserrato, Italy
- Istituto Nazionale di Fisica Nucleare (INFN), Sezione di Cagliari, Monserrato, Italy
9
Exact mean-field models for spiking neural networks with adaptation. J Comput Neurosci 2022;50:445-469. PMID: 35834100; DOI: 10.1007/s10827-022-00825-9.
Abstract
Networks of spiking neurons with adaptation have been shown to reproduce a wide range of neural activities, including the emergent population bursting and spike synchrony that underpin brain disorders and normal function. Exact mean-field models derived from spiking neural networks are extremely valuable, as such models can be used to determine how individual neurons and the network they reside within interact to produce macroscopic network behaviours. In this paper, we derive and analyze a set of exact mean-field equations for a neural network with spike-frequency adaptation. Specifically, our model is a network of Izhikevich neurons, where each neuron is modeled by a two-dimensional system consisting of a quadratic integrate-and-fire equation plus an equation that implements spike-frequency adaptation. Previous work deriving a mean-field model for this type of network relied on the assumption of sufficiently slow dynamics of the adaptation variable. However, this approximation fails to establish an exact correspondence between the macroscopic description and the underlying neural network, especially when the adaptation time constant is not large. The challenge lies in achieving a closed set of mean-field equations that includes the mean-field dynamics of the adaptation variable. We address this problem by combining a Lorentzian ansatz with a moment-closure approach to arrive at a mean-field system in the thermodynamic limit. The resulting macroscopic description qualitatively and quantitatively captures the collective dynamics of the neural network, including transitions between states where the individual neurons exhibit asynchronous tonic firing and synchronous bursting. We extend the approach to a network of two populations of neurons and discuss the accuracy and efficacy of our mean-field approximations by examining all assumptions imposed during the derivation. Numerical bifurcation analysis of our mean-field models reveals bifurcations not previously observed in the models, including a novel mechanism for the emergence of bursting in the network. We anticipate that our results will provide a tractable and reliable tool to investigate the underlying mechanisms of brain function and dysfunction from the perspective of computational neuroscience.