1
Spaeth A, Haussler D, Teodorescu M. Model-agnostic neural mean field with a data-driven transfer function. Neuromorphic Computing and Engineering 2024; 4:034013. PMID: 39310743; PMCID: PMC11413991; DOI: 10.1088/2634-4386/ad787f.
Abstract
As one of the most complex systems known to science, modeling brain behavior and function is both fascinating and extremely difficult. Empirical data is increasingly available from ex vivo human brain organoids and surgical samples, as well as in vivo animal models, so the problem of modeling the behavior of large-scale neuronal systems is more relevant than ever. The statistical physics concept of a mean-field model offers a tractable way to bridge the gap between single-neuron and population-level descriptions of neuronal activity, by modeling the behavior of a single representative neuron and extending this to the population. However, existing neural mean-field methods typically either take the limit of small interaction sizes, or are applicable only to the specific neuron models for which they were derived. This paper derives a mean-field model by fitting a transfer function called Refractory SoftPlus, which is simple yet applicable to a broad variety of neuron types. The transfer function is fitted numerically to simulated spike time data, and is entirely agnostic to the underlying neuronal dynamics. The resulting mean-field model predicts the response of a network of randomly connected neurons to a time-varying external stimulus with a high degree of accuracy. Furthermore, it enables an accurate approximate bifurcation analysis as a function of the level of recurrent input. This model does not assume large presynaptic rates or small postsynaptic potential size, allowing mean-field models to be developed even for populations with large interaction terms.
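The abstract does not spell out the Refractory SoftPlus transfer function itself; the sketch below shows one plausible form, a SoftPlus input-rate curve saturated by an absolute refractory period. The functional form, gain, threshold, and refractory period here are illustrative assumptions, not the paper's exact definition.

```python
import numpy as np

def softplus(x):
    # numerically stable softplus: log(1 + exp(x))
    return np.logaddexp(0.0, x)

def refractory_softplus(mu, gain=1.0, thresh=0.0, t_ref=2e-3):
    # Hypothetical Refractory-SoftPlus-style rate curve (an assumption, not
    # the paper's exact formula): a SoftPlus free rate saturated by an
    # absolute refractory period t_ref, so the rate never exceeds 1 / t_ref.
    r_free = gain * softplus(mu - thresh)   # unsaturated rate (Hz)
    return r_free / (1.0 + t_ref * r_free)

mu = np.linspace(-5.0, 50.0, 111)           # mean synaptic drive
rates = refractory_softplus(mu)
```

The curve is monotone and smooth at threshold, and it saturates at 1/t_ref (here 500 Hz), which is what lets a fitted transfer function avoid the hard ceiling imposed by sigmoidal fits.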
Affiliation(s)
- Alex Spaeth
- Electrical and Computer Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- David Haussler
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Biomolecular Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Mircea Teodorescu
- Electrical and Computer Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States of America
- Biomolecular Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States of America
2
Nandi MK, Valla M, di Volo M. Bursting gamma oscillations in neural mass models. Front Comput Neurosci 2024; 18:1422159. PMID: 39281982; PMCID: PMC11392745; DOI: 10.3389/fncom.2024.1422159.
Abstract
Gamma oscillations (30-120 Hz) in the brain are not periodic cycles; they typically appear in short time windows, often called oscillatory bursts. While the origin of this bursting phenomenon is still unclear, some recent studies attribute it to external or endogenous noise in neural networks. We demonstrate that an exact neural mass model of excitatory and inhibitory quadratic integrate-and-fire spiking neurons theoretically predicts the emergence of a distinct regime of intrinsic bursting gamma (IBG) oscillations without any noise source, a phenomenon due to collective chaos. This regime is indeed observed in direct simulations of spiking neurons and is characterized by highly irregular spiking activity. Compared with noise-induced bursting oscillations, IBG oscillations show stronger phase-amplitude coupling to slower theta oscillations, indicating an increased capacity for information transfer between brain regions. We demonstrate that this phenomenon is present in both globally coupled and sparse networks of spiking neurons. These results suggest deterministic collective chaos as a candidate mechanism for the origin of gamma bursts.
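The exact mean-field models in this line of work descend from the Montbrió-Pazó-Roxin reduction of quadratic integrate-and-fire populations. A minimal single-population integration of those firing-rate equations is sketched below; the parameters are illustrative, not the paper's excitatory-inhibitory network values.

```python
import numpy as np

# Montbrió-Pazó-Roxin exact mean-field of a QIF population:
#   tau*dr/dt = Delta/(pi*tau) + 2*r*v
#   tau*dv/dt = v**2 + eta + J*tau*r + I - (pi*tau*r)**2
tau, delta, eta, J, I = 1.0, 1.0, -5.0, 15.0, 0.0   # illustrative values
r, v = 0.1, -2.0                                    # rate and mean voltage
dt = 1e-3
for _ in range(20000):                              # 20 time constants
    dr = (delta / (np.pi * tau) + 2.0 * r * v) / tau
    dv = (v**2 + eta + J * tau * r + I - (np.pi * tau * r)**2) / tau
    r, v = r + dt * dr, v + dt * dv
```

With these parameters the population relaxes to a stationary firing rate; the bursting-gamma regime described in the abstract arises only once an inhibitory population is coupled in and the collective dynamics become chaotic.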
Affiliation(s)
- Manoj Kumar Nandi
- Université Claude Bernard Lyon 1, Lyon, Rhône-Alpes, France
- INSERM U1208 Institut Cellule Souche et Cerveau, Bron, France
- Michele Valla
- Université Claude Bernard Lyon 1, Lyon, Rhône-Alpes, France
- INSERM U1208 Institut Cellule Souche et Cerveau, Bron, France
- Matteo di Volo
- Université Claude Bernard Lyon 1, Lyon, Rhône-Alpes, France
- INSERM U1208 Institut Cellule Souche et Cerveau, Bron, France
3
Tesler F, Lorenzi RM, Ponzi A, Casellato C, Palesi F, Gandolfi D, Gandini Wheeler-Kingshott CAM, Mapelli J, D'Angelo E, Migliore M, Destexhe A. Multiscale modeling of neuronal dynamics in hippocampus CA1. Front Comput Neurosci 2024; 18:1432593. PMID: 39165754; PMCID: PMC11333306; DOI: 10.3389/fncom.2024.1432593.
Abstract
The development of biologically realistic models of brain microcircuits and regions is currently a highly relevant topic in computational neuroscience. One of the main challenges for such models is the passage between scales, going from the microscale (cellular) to the mesoscale (microcircuit) and macroscale (region or whole-brain level), while keeping the demand on computational resources constrained. In this paper we introduce a multiscale modeling framework for hippocampal CA1, a region of the brain that plays a key role in functions such as learning, memory consolidation, and navigation. Our framework spans from the single-cell level to the macroscale and makes use of a novel mean-field model of CA1, introduced in this paper, to bridge the gap between the micro and macro scales. We test and validate the model by analyzing the response of the system to the main brain rhythms observed in the hippocampus and comparing our results with those of the corresponding spiking network model of CA1. We then analyze the implementation of synaptic plasticity within our framework, a key aspect for studying the role of the hippocampus in learning and memory consolidation, and demonstrate the capability of our framework to incorporate variations at the synaptic level. Finally, we present an example of using our model to study stimulus propagation at the macroscale, and show that our framework can capture the dynamics obtained in the corresponding spiking network model of the whole CA1 area.
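The mean-field building block of such a framework can be caricatured as a two-population (excitatory-inhibitory) rate model. The sigmoidal transfer function and all parameters below are generic stand-ins for illustration, not the paper's AdEx-based CA1 transfer functions.

```python
import numpy as np

def F(x):
    # generic sigmoidal transfer function (stand-in), saturating at 50 Hz
    return 50.0 / (1.0 + np.exp(-(x - 5.0)))

# coupling weights E->E, I->E, E->I, I->I (illustrative)
w_ee, w_ei, w_ie, w_ii = 1.2, 1.0, 1.5, 0.8
tau, dt = 10.0, 0.1                  # ms
re, ri = 5.0, 5.0                    # population rates (Hz)
ext = 4.0                            # external drive to the E population
for _ in range(5000):                # 500 ms of mean-field dynamics
    dre = (-re + F(w_ee * re - w_ei * ri + ext)) / tau
    dri = (-ri + F(w_ie * re - w_ii * ri)) / tau
    re, ri = re + dt * dre, ri + dt * dri
```

A multiscale pipeline of the kind the abstract describes then tiles many such mean-field nodes over a region and couples them with the connectivity of the spiking model they summarize.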
Affiliation(s)
- Federico Tesler
- CNRS, Paris-Saclay Institute of Neuroscience (NeuroPSI), Paris-Saclay University, Gif-sur-Yvette, France
- Adam Ponzi
- Institute of Biophysics, National Research Council, Palermo, Italy
- Claudia Casellato
- Department of Brain and Behavioural Sciences, University of Pavia, Pavia, Italy
- Digital Neuroscience Centre, IRCCS Mondino Foundation, Pavia, Italy
- Fulvia Palesi
- Department of Brain and Behavioural Sciences, University of Pavia, Pavia, Italy
- Daniela Gandolfi
- Department of Engineering “Enzo Ferrari”, University of Modena and Reggio Emilia, Modena, Italy
- Claudia A. M. Gandini Wheeler-Kingshott
- Department of Brain and Behavioural Sciences, University of Pavia, Pavia, Italy
- Digital Neuroscience Centre, IRCCS Mondino Foundation, Pavia, Italy
- NMR Research Unit, Queen Square MS Centre, Department of Neuroinflammation, UCL Queen Square Institute of Neurology, Faculty of Brain Sciences, University College London, London, United Kingdom
- Jonathan Mapelli
- Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, Modena, Italy
- Egidio D'Angelo
- Department of Brain and Behavioural Sciences, University of Pavia, Pavia, Italy
- Michele Migliore
- Institute of Biophysics, National Research Council, Palermo, Italy
- Alain Destexhe
- CNRS, Paris-Saclay Institute of Neuroscience (NeuroPSI), Paris-Saclay University, Gif-sur-Yvette, France
4
Yang L, Wang Z, Wang G, Liang L, Liu M, Wang J. Brain-inspired modular echo state network for EEG-based emotion recognition. Front Neurosci 2024; 18:1305284. PMID: 38495107; PMCID: PMC10940514; DOI: 10.3389/fnins.2024.1305284.
Abstract
Previous studies have successfully applied a lightweight recurrent neural network (RNN) called the Echo State Network (ESN) to EEG-based emotion recognition. These studies use intrinsic plasticity (IP) and synaptic plasticity (SP) to tune the hidden reservoir layer of the ESN, yet they require extra training procedures and are often computationally complex. Recent neuroscientific research reveals that the brain is modular, consisting of internally dense and externally sparse subnetworks. Furthermore, it has been shown that this modular topology facilitates information-processing efficiency in both biological and artificial neural networks (ANNs). Motivated by these findings, we propose the Modular Echo State Network (M-ESN), in which the hidden layer of the ESN is directly initialized to a more efficient modular structure. In this paper, we first describe our novel implementation method, which enables us to find the optimal number of modules and the local and global connectivity. Then, the M-ESN is benchmarked on the DEAP dataset. Lastly, we explain why network modularity improves model performance. We demonstrate that modular organization leads to a more diverse distribution of node degrees, which increases network heterogeneity and subsequently improves classification accuracy. On the emotion arousal, valence, and stress/calm classification tasks, our M-ESN outperforms a regular ESN by 5.44, 5.90, and 5.42%, respectively, while the corresponding differences relative to ESNs tuned with adaptation rules are 0.77, 5.49, and 0.95%. Notably, our results are obtained using an M-ESN with a much smaller reservoir size and a simpler training process.
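The modular-reservoir idea can be sketched directly: dense random connectivity inside each module, sparse connectivity between modules, then the standard leaky ESN state update. Module count, densities, spectral radius, and leak rate below are illustrative choices, not the tuned values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_modules, size = 4, 25
N = n_modules * size
p_local, p_global = 0.3, 0.02        # dense inside modules, sparse between

mask = rng.random((N, N)) < p_global               # sparse inter-module links
for m in range(n_modules):
    s = slice(m * size, (m + 1) * size)
    mask[s, s] = rng.random((size, size)) < p_local  # dense intra-module links
W = np.where(mask, rng.normal(0.0, 1.0, (N, N)), 0.0)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius 0.9

W_in = rng.normal(0.0, 0.5, N)
leak, x = 0.3, np.zeros(N)
for t in range(200):                 # drive the reservoir with a sine input
    u = np.sin(0.1 * t)
    x = (1.0 - leak) * x + leak * np.tanh(W @ x + W_in * u)
```

A linear readout trained by ridge regression on the collected states would complete the classifier; the reported gains come purely from the modular wiring of W, since the update rule is the ordinary ESN one.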
Affiliation(s)
- Liuyi Yang
- College of Big Data and Internet, Shenzhen Technology University, Shenzhen, China
- Zhaoze Wang
- School of Engineering and Applied Science, University of Pennsylvania, Philadelphia, PA, United States
- Guoyu Wang
- Department of Automation, Tiangong University, Tianjin, China
- Lixin Liang
- College of Big Data and Internet, Shenzhen Technology University, Shenzhen, China
- Meng Liu
- College of Big Data and Internet, Shenzhen Technology University, Shenzhen, China
- Junsong Wang
- College of Big Data and Internet, Shenzhen Technology University, Shenzhen, China
5
Spaeth A, Haussler D, Teodorescu M. Model-agnostic neural mean field with the Refractory SoftPlus transfer function. bioRxiv 2024:2024.02.05.579047. PMID: 38370695; PMCID: PMC10871173; DOI: 10.1101/2024.02.05.579047.
Abstract
Due to the complexity of neuronal networks and the nonlinear dynamics of individual neurons, it is challenging to develop a systems-level model which is accurate enough to be useful yet tractable enough to apply. Mean-field models which extrapolate from single-neuron descriptions to large-scale models can be derived from the neuron's transfer function, which gives its firing rate as a function of its synaptic input. However, analytically derived transfer functions are applicable only to the neurons and noise models from which they were originally derived. In recent work, approximate transfer functions have been empirically derived by fitting a sigmoidal curve, which imposes a maximum firing rate and applies only in the diffusion limit, restricting applications. In this paper, we propose an approximate transfer function called Refractory SoftPlus, which is simple yet applicable to a broad variety of neuron types. Refractory SoftPlus activation functions allow the derivation of simple empirically approximated mean-field models using simulation results, which enables prediction of the response of a network of randomly connected neurons to a time-varying external stimulus with a high degree of accuracy. These models also support an accurate approximate bifurcation analysis as a function of the level of recurrent input. Finally, the model works without assuming large presynaptic rates or small postsynaptic potential size, allowing mean-field models to be developed even for populations with large interaction terms.
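The fitting step described here, measuring (input, rate) pairs from a simulator and then fitting the transfer function numerically, can be sketched as below. The data are synthetic, the Refractory-SoftPlus-style form is an assumed stand-in, and only a single gain parameter is fitted, by plain grid search.

```python
import numpy as np

def ref_softplus(mu, gain, t_ref=2e-3):
    # assumed Refractory-SoftPlus-style curve (an illustration, not the
    # paper's exact parameterization)
    r = gain * np.logaddexp(0.0, mu)        # stable softplus free rate
    return r / (1.0 + t_ref * r)            # refractory saturation

rng = np.random.default_rng(1)
mu = np.linspace(-2.0, 20.0, 40)
# stand-in for firing rates measured from a spiking simulation (true gain 7)
data = ref_softplus(mu, gain=7.0) + rng.normal(0.0, 0.5, mu.size)

gains = np.linspace(1.0, 15.0, 281)          # grid search over the gain
sse = [np.sum((data - ref_softplus(mu, g)) ** 2) for g in gains]
g_fit = gains[int(np.argmin(sse))]
```

Because the fit is purely numerical, the same procedure applies unchanged to rate data from any neuron model, which is the sense in which the approach is model-agnostic.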
Affiliation(s)
- Alex Spaeth
- Electrical and Computer Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States
- David Haussler
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States
- Biomolecular Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States
- Mircea Teodorescu
- Electrical and Computer Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States
- Genomics Institute, University of California, Santa Cruz, Santa Cruz, CA, United States
- Biomolecular Engineering Department, University of California, Santa Cruz, Santa Cruz, CA, United States
6
Gast R, Solla SA, Kennedy A. Neural heterogeneity controls computations in spiking neural networks. Proc Natl Acad Sci U S A 2024; 121:e2311885121. PMID: 38198531; PMCID: PMC10801870; DOI: 10.1073/pnas.2311885121.
Abstract
The brain is composed of complex networks of interacting neurons that express considerable heterogeneity in their physiology and spiking characteristics. How does this neural heterogeneity influence macroscopic neural dynamics, and how might it contribute to neural computation? In this work, we use a mean-field model to investigate computation in heterogeneous neural networks, by studying how the heterogeneity of cell spiking thresholds affects three key computational functions of a neural population: the gating, encoding, and decoding of neural signals. Our results suggest that heterogeneity serves different computational functions in different cell types. In inhibitory interneurons, varying the degree of spike threshold heterogeneity allows them to gate the propagation of neural signals in a reciprocally coupled excitatory population. Whereas homogeneous interneurons impose synchronized dynamics that narrow the dynamic repertoire of the excitatory neurons, heterogeneous interneurons act as an inhibitory offset while preserving excitatory neuron function. Spike threshold heterogeneity also controls the entrainment properties of neural networks to periodic input, thus affecting the temporal gating of synaptic inputs. Among excitatory neurons, heterogeneity increases the dimensionality of neural dynamics, improving the network's capacity to perform decoding tasks. Conversely, homogeneous networks suffer in their capacity for function generation, but excel at encoding signals via multistable dynamic regimes. Drawing from these findings, we propose intra-cell-type heterogeneity as a mechanism for sculpting the computational properties of local circuits of excitatory and inhibitory spiking neurons, permitting the same canonical microcircuit to be tuned for diverse computational tasks.
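One claim in this abstract, that threshold heterogeneity increases the dimensionality of population activity, can be illustrated in miniature: drive simple rate units through heterogeneous thresholds and compare the participation ratio of the response covariance. The unit model and all numbers are illustrative caricatures, not the paper's QIF mean-field.

```python
import numpy as np

def participation_ratio(X):
    # effective dimensionality of the response covariance of rows of X
    lam = np.clip(np.linalg.eigvalsh(np.cov(X)), 0.0, None)
    return lam.sum() ** 2 / (lam ** 2).sum()

rng = np.random.default_rng(2)
N, T = 50, 2000
s = rng.normal(0.0, 1.0, T)                  # shared input signal
theta_het = rng.normal(0.0, 1.5, (N, 1))     # heterogeneous thresholds

X_hom = np.tanh(s[None, :] - np.zeros((N, 1)))   # identical thresholds
X_het = np.tanh(s[None, :] - theta_het)          # diverse thresholds
pr_hom = participation_ratio(X_hom)
pr_het = participation_ratio(X_het)
```

Identical units respond identically, so their covariance has rank one (participation ratio 1); heterogeneous thresholds make each unit a different nonlinear function of the same input, spreading variance over more dimensions.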
Affiliation(s)
- Richard Gast
- Department of Neuroscience, Feinberg School of Medicine, Northwestern University, Chicago, IL 60611
- Aligning Science Across Parkinson’s Collaborative Research Network, Chevy Chase, MD 20815
- Sara A. Solla
- Department of Neuroscience, Feinberg School of Medicine, Northwestern University, Chicago, IL 60611
- Ann Kennedy
- Department of Neuroscience, Feinberg School of Medicine, Northwestern University, Chicago, IL 60611
- Aligning Science Across Parkinson’s Collaborative Research Network, Chevy Chase, MD 20815
7
Hutt A, Trotter D, Pariz A, Valiante TA, Lefebvre J. Diversity-induced trivialization and resilience of neural dynamics. Chaos 2024; 34:013147. PMID: 38285722; DOI: 10.1063/5.0165773.
Abstract
Heterogeneity is omnipresent across all living systems. Diversity enriches the dynamical repertoire of these systems but remains challenging to reconcile with their manifest robustness and dynamical persistence over time, a fundamental feature called resilience. To better understand the mechanism underlying resilience in neural circuits, we considered a nonlinear network model, extracting the relationship between excitability heterogeneity and resilience. To measure resilience, we quantified the number of stationary states of this network and how they are affected by various control parameters. We analyzed, both analytically and numerically, gradient and non-gradient systems modeled as nonlinear sparse neural networks evolving over long time scales. Our analysis shows that neuronal heterogeneity quenches the number of stationary states while decreasing the susceptibility to bifurcations: a phenomenon known as trivialization. Heterogeneity was found to implement a homeostatic control mechanism enhancing network resilience to changes in network size and connection probability by quenching the system's dynamic volatility.
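The resilience measure used here, the number of stationary states, can be estimated in the simplest possible way: relax a small nonlinear network from many random initial conditions and count the distinct attracting fixed points. The network size, symmetric (gradient-like) coupling, and tolerances below are illustrative choices, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 8
W = rng.normal(0.0, 2.5 / np.sqrt(N), (N, N))
W = (W + W.T) / 2.0                   # symmetric coupling: gradient-like flow

def relax(x, steps=3000, dt=0.1):
    # integrate dx/dt = -x + tanh(W x) until (near) a fixed point
    for _ in range(steps):
        x = x + dt * (-x + np.tanh(W @ x))
    return x

found = []
for _ in range(60):
    x = relax(rng.uniform(-1.0, 1.0, N))
    if not any(np.allclose(x, y, atol=1e-3) for y in found):
        found.append(x)
n_states = len(found)                 # estimated number of stationary states
```

Repeating this count while widening a distribution of per-unit excitability offsets would probe the trivialization effect the abstract describes: heterogeneity quenches the number of stationary states.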
Affiliation(s)
- Axel Hutt
- MLMS, MIMESIS, Université de Strasbourg, CNRS, Inria, ICube, 67000 Strasbourg, France
- Daniel Trotter
- Department of Physics, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Aref Pariz
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Department of Biology, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Taufik A Valiante
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Department of Electrical and Computer Engineering, Institute of Medical Science, Institute of Biomedical Engineering, Division of Neurosurgery, Department of Surgery, CRANIA (Center for Advancing Neurotechnological Innovation to Application), Max Planck-University of Toronto Center for Neural Science and Technology, University of Toronto, Toronto, Ontario M5S 3G8, Canada
- Jérémie Lefebvre
- Department of Physics, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Krembil Brain Institute, University Health Network, Toronto, Ontario M5T 0S8, Canada
- Department of Biology, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Department of Mathematics, University of Toronto, Toronto, Ontario M5S 2E4, Canada
8
Hutt A, Rich S, Valiante TA, Lefebvre J. Intrinsic neural diversity quenches the dynamic volatility of neural networks. Proc Natl Acad Sci U S A 2023; 120:e2218841120. PMID: 37399421; PMCID: PMC10334753; DOI: 10.1073/pnas.2218841120.
Abstract
Heterogeneity is the norm in biology. The brain is no different: Neuronal cell types are myriad, reflected through their cellular morphology, type, excitability, connectivity motifs, and ion channel distributions. While this biophysical diversity enriches neural systems' dynamical repertoire, it remains challenging to reconcile with the robustness and persistence of brain function over time (resilience). To better understand the relationship between excitability heterogeneity (variability in excitability within a population of neurons) and resilience, we analyzed both analytically and numerically a nonlinear sparse neural network with balanced excitatory and inhibitory connections evolving over long time scales. Homogeneous networks demonstrated increases in excitability, and strong firing rate correlations-signs of instability-in response to a slowly varying modulatory fluctuation. Excitability heterogeneity tuned network stability in a context-dependent way by restraining responses to modulatory challenges and limiting firing rate correlations, while enriching dynamics during states of low modulatory drive. Excitability heterogeneity was found to implement a homeostatic control mechanism enhancing network resilience to changes in population size, connection probability, strength and variability of synaptic weights, by quenching the volatility (i.e., its susceptibility to critical transitions) of its dynamics. Together, these results highlight the fundamental role played by cell-to-cell heterogeneity in the robustness of brain function in the face of change.
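The "restrained response" idea in this abstract can be illustrated with caricature sigmoidal units: the population-averaged activation of a diverse population responds far less steeply to a slow modulatory drive than an identical population does. The unit model and numbers are illustrative, not the paper's spiking network.

```python
import numpy as np

rng = np.random.default_rng(4)
theta = rng.normal(0.0, 2.0, 200)     # heterogeneous excitability offsets
m = np.linspace(-1.0, 1.0, 201)       # slow modulatory drive

resp_hom = np.tanh(4.0 * m)                                  # identical units
resp_het = np.tanh(4.0 * (m[:, None] - theta)).mean(axis=1)  # diverse units

slope_hom = np.gradient(resp_hom, m).max()   # susceptibility to modulation
slope_het = np.gradient(resp_het, m).max()
```

Averaging over widely spread thresholds flattens the population's input-output curve, which is one simple reading of how heterogeneity limits responses to modulatory challenges while leaving individual units intact.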
Affiliation(s)
- Axel Hutt
- Université de Strasbourg, CNRS, Inria, ICube, MLMS, MIMESIS, F-67000 Strasbourg, France
- Scott Rich
- Krembil Brain Institute, Division of Clinical and Computational Neuroscience, University Health Network, Toronto, ON M5T 0S8, Canada
- Taufik A. Valiante
- Krembil Brain Institute, Division of Clinical and Computational Neuroscience, University Health Network, Toronto, ON M5T 0S8, Canada
- Department of Electrical and Computer Engineering, University of Toronto, Toronto, ON M5S 3G8, Canada
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON M5S 3G9, Canada
- Institute of Medical Sciences, University of Toronto, Toronto, ON M5S 1A8, Canada
- Division of Neurosurgery, Department of Surgery, University of Toronto, Toronto, ON M5G 2C4, Canada
- Center for Advancing Neurotechnological Innovation to Application, University of Toronto, Toronto, ON M5G 2A2, Canada
- Max Planck-University of Toronto Center for Neural Science and Technology, University of Toronto, Toronto, ON M5S 3G8, Canada
- Jérémie Lefebvre
- Krembil Brain Institute, Division of Clinical and Computational Neuroscience, University Health Network, Toronto, ON M5T 0S8, Canada
- Department of Biology, University of Ottawa, Ottawa, ON K1N 6N5, Canada
- Department of Mathematics, University of Toronto, Toronto, ON M5S 2E4, Canada
9
Destexhe A. Noise enhancement of neural information processing. Entropy 2022; 24:1837. PMID: 36554242; PMCID: PMC9778153; DOI: 10.3390/e24121837.
Abstract
Cortical neurons in vivo function in highly fluctuating and seemingly noisy conditions, and our understanding of how information is processed in such complex states is still incomplete. In this perspective article, we first review how intense "synaptic noise" was initially measured in single neurons, and how computational models were built based on such measurements. Recent progress in recording techniques has enabled the measurement of highly complex activity in large numbers of neurons in animals and human subjects, and models have also been built to account for these complex dynamics. Here, we attempt to link these cellular and population aspects, where the complexity of network dynamics in awake cortex appears linked to the synaptic noise seen in single cells. We show that noise in single cells, noise in networks, and structural noise all act to enhance responsiveness and boost the propagation of information. We propose that such noisy states are fundamental to providing favorable conditions for information processing at large-scale levels in the brain, and may be involved in sensory perception.
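The core claim, that noise boosts responsiveness to otherwise subthreshold inputs, is the classic stochastic-resonance effect, sketched here with a bare threshold detector. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0.0, 8.0 * np.pi, 1000)
signal = 0.5 * np.sin(t)              # weak input, peak amplitude 0.5
threshold = 1.0                       # detector never fires on signal alone

def detections(noise_std):
    # count threshold crossings of the signal plus additive noise
    noisy = signal + rng.normal(0.0, noise_std, signal.size)
    return int(np.sum(noisy > threshold))

hits_silent = detections(0.0)         # no noise: no responses at all
hits_noisy = detections(0.5)          # moderate noise: the signal gets through
```

The crossings cluster around the signal's peaks, so the noise does not merely add events; it lets the detector transmit the timing of a signal it could never report on its own.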
Affiliation(s)
- Alain Destexhe
- CNRS, Paris-Saclay Institute of Neuroscience (NeuroPSI), Paris-Saclay University, 91400 Saclay, France
10
Exact mean-field models for spiking neural networks with adaptation. J Comput Neurosci 2022; 50:445-469. PMID: 35834100; DOI: 10.1007/s10827-022-00825-9.
Abstract
Networks of spiking neurons with adaptation have been shown to reproduce a wide range of neural activities, including the emergent population bursting and spike synchrony that underpin brain disorders and normal function. Exact mean-field models derived from spiking neural networks are extremely valuable, as they can be used to determine how individual neurons and the network they reside within interact to produce macroscopic network behaviours. In this paper, we derive and analyze a set of exact mean-field equations for a neural network with spike-frequency adaptation. Specifically, our model is a network of Izhikevich neurons, where each neuron is modeled by a two-dimensional system consisting of a quadratic integrate-and-fire equation plus an equation implementing spike-frequency adaptation. Previous work deriving a mean-field model for this type of network relied on the assumption of sufficiently slow dynamics of the adaptation variable. However, this approximation failed to establish an exact correspondence between the macroscopic description and the realistic neural network, especially when the adaptation time constant was not large. The challenge lies in how to achieve a closed set of mean-field equations that includes the mean-field dynamics of the adaptation variable. We address this problem by using a Lorentzian ansatz combined with a moment-closure approach to arrive at a mean-field system in the thermodynamic limit. The resulting macroscopic description is capable of qualitatively and quantitatively describing the collective dynamics of the neural network, including transitions between states where the individual neurons exhibit asynchronous tonic firing and synchronous bursting. We extend the approach to a network of two populations of neurons and discuss the accuracy and efficacy of our mean-field approximations by examining all assumptions imposed during the derivation.
Numerical bifurcation analysis of our mean-field models reveals bifurcations not previously observed, including a novel mechanism for the emergence of bursting in the network. We anticipate our results will provide a tractable and reliable tool for investigating the underlying mechanisms of brain function and dysfunction from the perspective of computational neuroscience.
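A schematic of the model class: the exact QIF mean-field augmented with a slow adaptation current w driven by the population rate. The closure used below is a deliberately simplified caricature, not the paper's Lorentzian-ansatz moment closure, and all parameters are illustrative.

```python
import numpy as np

tau, delta, eta, J = 1.0, 1.0, 5.0, 10.0   # QIF mean-field parameters
tau_w, b = 10.0, 5.0                       # slow spike-frequency adaptation
r, v, w = 0.5, 0.0, 0.0
dt, r_trace = 1e-3, []
for _ in range(40000):                     # 40 time units (4 adaptation taus)
    dr = (delta / (np.pi * tau) + 2.0 * r * v) / tau
    dv = (v**2 + eta - w + J * tau * r - (np.pi * tau * r)**2) / tau
    dw = (b * r - w) / tau_w               # rate-driven adaptation (caricature)
    r, v, w = r + dt * dr, v + dt * dv, w + dt * dw
    r_trace.append(r)
r_trace = np.array(r_trace)
```

Depending on the adaptation time constant and strength, systems of this kind either relax to tonic firing or develop slow rate oscillations, which is the mean-field signature of the tonic-firing/bursting transition the paper analyzes.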
11
Ristič D, Gosak M. Interlayer connectivity affects the coherence resonance and population activity patterns in two-layered networks of excitatory and inhibitory neurons. Front Comput Neurosci 2022; 16:885720. PMID: 35521427; PMCID: PMC9062746; DOI: 10.3389/fncom.2022.885720.
Abstract
The firing patterns of neuronal populations often exhibit emergent collective oscillations, which can display substantial regularity even though the dynamics of the individual elements are highly stochastic. One of the many phenomena studied in this context is coherence resonance, in which additional noise leads to improved regularity of spiking activity in neurons. In this work, we investigate how the coherence resonance phenomenon manifests itself in populations of excitatory and inhibitory neurons. In our simulations, we use coupled FitzHugh-Nagumo oscillators in the excitable regime and in the presence of neuronal noise. Formally, our model is based on the concept of a two-layered network, where one layer contains inhibitory neurons, the other excitatory neurons, and the interlayer connections represent heterotypic interactions. The neuronal activity is simulated in realistic coupling schemes in which neurons within each layer are connected with undirected connections, whereas neurons of different types are connected with directed interlayer connections. In this setting, we investigate how different neurophysiological determinants affect the coherence resonance. Specifically, we focus on the proportion of inhibitory neurons, the proportion of excitatory interlayer axons, and the architecture of interlayer connections between inhibitory and excitatory neurons. Our results reveal that the regularity of simulated neural activity can be increased by a stronger damping of the excitatory layer. This can be accomplished with a higher proportion of inhibitory neurons, a higher fraction of inhibitory interlayer axons, a stronger coupling between inhibitory axons, or a heterogeneous configuration of interlayer connections.
Our approach of modeling multilayered neuronal networks in combination with stochastic dynamics offers a novel perspective on how neural architecture can affect neural information processing, and suggests possible applications in designing networks of artificial neural circuits that optimize their function via noise-induced phenomena.
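The single-neuron ingredient of the two-layer model, an excitable FitzHugh-Nagumo unit driven by noise, together with the usual coherence measure, the coefficient of variation (CV) of inter-spike intervals, can be sketched as follows. Parameters are standard textbook values, not the paper's network settings.

```python
import numpy as np

eps, a, sigma = 0.05, 1.05, 0.2     # excitable regime (a > 1) plus noise
dt, steps = 2e-3, 50000
rng = np.random.default_rng(7)
x, y = -a, -a + a**3 / 3.0          # rest state of the noiseless unit
spike_times, prev = [], x
for k in range(steps):
    x += dt * (x - x**3 / 3.0 - y) / eps   # fast voltage-like variable
    y += dt * (x + a) + sigma * np.sqrt(dt) * rng.normal()  # noisy recovery
    if prev < 0.5 <= x:             # upward crossing counts as one spike
        spike_times.append(k * dt)
    prev = x
isi = np.diff(spike_times)
cv = isi.std() / isi.mean()         # lower CV = more coherent spiking
```

Sweeping sigma and plotting CV against it exposes coherence resonance as a dip at intermediate noise; in the paper, the damping supplied by the inhibitory layer plays an analogous regularity-enhancing role at the population level.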
Affiliation(s)
- David Ristič
- Faculty of Natural Sciences and Mathematics, University of Maribor, Maribor, Slovenia
- Marko Gosak
- Faculty of Natural Sciences and Mathematics, University of Maribor, Maribor, Slovenia
- Faculty of Medicine, University of Maribor, Maribor, Slovenia