1. Osborne H, Deutz L, de Kamps M. Multidimensional Dynamical Systems with Noise. Adv Exp Med Biol 2022; 1359:159-178. DOI: 10.1007/978-3-030-89439-9_7.
2. Osborne H, Lai YM, Lepperød ME, Sichau D, Deutz L, de Kamps M. MIIND: A Model-Agnostic Simulator of Neural Populations. Front Neuroinform 2021; 15:614881. PMID: 34295233; PMCID: PMC8291130; DOI: 10.3389/fninf.2021.614881.
Abstract
MIIND is a software platform for easily and efficiently simulating the behaviour of interacting populations of point neurons governed by any 1D or 2D dynamical system. The simulator is entirely agnostic to the underlying neuron model of each population and provides an intuitive method for controlling the amount of noise, which can significantly affect the overall behaviour. A network of populations can be set up quickly and easily using MIIND's XML-style simulation file format, which describes simulation parameters such as how populations interact, transmission delays, post-synaptic potentials, and what output to record. During simulation, a visual display of each population's state is provided for immediate feedback on the behaviour, and population activity can be output to a file or passed to a Python script for further processing. The Python support also means that MIIND can be integrated into other software such as The Virtual Brain. MIIND's population density technique is a geometric and visual method for describing the activity of each neuron population; it encourages a deep consideration of the dynamics of the neuron model and provides insight into how the behaviour of each population is affected by the behaviour of its neighbours in the network. For 1D neuron models, MIIND performs far better than direct simulation for large populations. For 2D models, the performance comparison is more nuanced, but the population density approach still confers certain advantages over direct simulation. MIIND can be used to build neural systems that bridge the scales between an individual neuron model and a population network. This allows researchers to maintain a plausible path back from mesoscopic to microscopic scales while minimising the complexity of managing large numbers of interconnected neurons. In this paper, we introduce the MIIND system and its usage, and provide implementation details where appropriate.
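The population density technique described in this abstract can be illustrated with a minimal sketch. This is not MIIND's API and all parameter values here are hypothetical: a density over membrane potential for perfect integrate-and-fire neurons receiving Poisson input jumps is evolved by a master equation, and the firing rate is read off as the probability mass pushed over threshold.

```python
import numpy as np

def pif_density_rate(rate_in=800.0, jump=0.05, v_th=1.0,
                     n_bins=200, dt=1e-4, t_end=1.0):
    # Discretise the state space [0, v_th) into bins; all probability
    # mass starts at the reset potential v = 0.
    dv = v_th / n_bins
    rho = np.zeros(n_bins)
    rho[0] = 1.0 / dv
    shift = int(round(jump / dv))        # one input jump, measured in bins
    rate = 0.0
    for _ in range(int(t_end / dt)):
        shifted = np.zeros_like(rho)
        shifted[shift:] = rho[:-shift]
        # mass pushed past threshold emits spikes and re-enters at reset
        fired = rho[-shift:].sum() * dv
        shifted[0] += fired / dv
        # master equation: an input arrives with probability rate_in * dt
        rho = rho + rate_in * dt * (shifted - rho)
        rate = rate_in * fired           # population firing rate in Hz
    return rate

rate = pif_density_rate()
```

For these parameters the steady-state rate approaches rate_in * jump / v_th = 40 Hz, the expected throughput of a perfect integrator.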
Affiliation(s)
- Hugh Osborne
- Institute for Artificial Intelligence and Biological Computation, School of Computing, University of Leeds, Leeds, United Kingdom
- Yi Ming Lai
- School of Medicine, University of Nottingham, Nottingham, United Kingdom
- David Sichau
- Department of Computer Science, Eidgenössische Technische Hochschule Zurich, Zurich, Switzerland
- Lukas Deutz
- Institute for Artificial Intelligence and Biological Computation, School of Computing, University of Leeds, Leeds, United Kingdom
- Marc de Kamps
- School of Computing and Leeds Institute for Data Analytics, University of Leeds, Leeds, United Kingdom
3. Shao Y, Zhang J, Tao L. Dimensional reduction of emergent spatiotemporal cortical dynamics via a maximum entropy moment closure. PLoS Comput Biol 2020; 16:e1007265. PMID: 32516336; PMCID: PMC7304648; DOI: 10.1371/journal.pcbi.1007265.
Abstract
Modern electrophysiological recordings and optical imaging techniques have revealed a diverse spectrum of spatiotemporal neural activities underlying fundamental cognitive processing. Oscillations, traveling waves and other complex population dynamical patterns are often concomitant with sensory processing, information transfer, decision making and memory consolidation. While neural population models such as neural mass, population density and kinetic theoretical models have been used to capture a wide range of the experimentally observed dynamics, a full account of how the multi-scale dynamics emerges from the detailed biophysical properties of individual neurons and the network architecture remains elusive. Here we apply a recently developed coarse-graining framework for reduced-dimensional descriptions of neuronal networks to model visual cortical dynamics. We show how, without introducing any new parameters, a sequence of models culminating in an augmented system of spatially-coupled ODEs can effectively model a wide range of the observed cortical dynamics, ranging from visual stimulus orientation dynamics to traveling waves induced by visual illusory stimuli. In addition to an efficient simulation method, this framework also offers an analytic approach to studying large-scale network dynamics. As such, the dimensional reduction naturally leads to mesoscopic variables that capture the interplay between neuronal population stochasticity and network architecture that we believe to underlie many emergent cortical phenomena.
Affiliation(s)
- Yuxiu Shao
- Center for Bioinformatics, National Laboratory of Protein Engineering and Plant Genetic Engineering, School of Life Sciences, Peking University, Beijing, China
- Jiwei Zhang
- School of Mathematics and Statistics, and Hubei Key Laboratory of Computational Science, Wuhan University, China
- Louis Tao
- Center for Bioinformatics, National Laboratory of Protein Engineering and Plant Genetic Engineering, School of Life Sciences, Peking University, Beijing, China
- Center for Quantitative Biology, Peking University, Beijing, China
4. de Kamps M, Lepperød M, Lai YM. Computational geometry for modeling neural populations: From visualization to simulation. PLoS Comput Biol 2019; 15:e1006729. PMID: 30830903; PMCID: PMC6417745; DOI: 10.1371/journal.pcbi.1006729.
Abstract
The importance of a mesoscopic description level of the brain has now been well established. Rate-based models are widely used, but have limitations. Recently, several extremely efficient population-level methods have been proposed that go beyond the characterization of a population in terms of a single variable. Here, we present a method for simulating neural populations based on two dimensional (2D) point spiking neuron models that defines the state of the population in terms of a density function over the neural state space. Our method differs in that we do not make the diffusion approximation, nor do we reduce the state space to a single dimension (1D). We do not hard code the neural model, but read in a grid describing its state space in the relevant simulation region. Novel models can be studied without even recompiling the code. The method is highly modular: variations of the deterministic neural dynamics and the stochastic process can be investigated independently. Currently, there is a trend to reduce complex high-dimensional neuron models to 2D ones, as they offer a rich dynamical repertoire that is not available in 1D, such as limit cycles. We will demonstrate that our method is ideally suited to investigate noise in such systems, replicating results obtained in the diffusion limit and generalizing them to a regime of large jumps. The joint probability density function is much more informative than 1D marginals, and we will argue that the study of 2D systems subject to noise is an important complement to the study of 1D systems.
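The joint density that the grid-based method evolves can be previewed with a direct Monte Carlo sketch. The FitzHugh-Nagumo-like drift and every parameter value below are illustrative assumptions, not taken from the paper; the input arrives as finite Poisson jumps rather than a diffusion approximation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of 2D neurons with FitzHugh-Nagumo-like drift (illustrative values)
n, dt, steps = 5000, 1e-3, 5000
v = rng.uniform(-1.0, 1.0, n)      # fast (voltage-like) variable
w = rng.uniform(-0.5, 0.5, n)      # slow recovery variable
rate_in, jump = 500.0, 0.001       # finite jumps: no diffusion approximation

for _ in range(steps):
    dv = v - v**3 / 3.0 - w
    dw = 0.08 * (v + 0.7 - 0.8 * w)
    v = v + dt * dv + jump * rng.poisson(rate_in * dt, n)
    w = w + dt * dw

# The joint density over (v, w) carries structure that the 1D marginals
# alone would hide.
H, vedges, wedges = np.histogram2d(v, w, bins=50, density=True)
```

A grid-based density method would transport probability mass over exactly this (v, w) plane instead of tracking individual neurons.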
Affiliation(s)
- Marc de Kamps
- Institute for Artificial and Biological Intelligence, University of Leeds, Leeds, West Yorkshire, United Kingdom
- Mikkel Lepperød
- Institute of Basic Medical Sciences, and Center for Integrative Neuroplasticity, University of Oslo, Oslo, Norway
- Yi Ming Lai
- Institute for Artificial and Biological Intelligence, University of Leeds, Leeds, West Yorkshire, United Kingdom
- Currently at the School of Mathematical Sciences, University of Nottingham, Nottingham, United Kingdom
5. Inferring cortical function in the mouse visual system through large-scale systems neuroscience. Proc Natl Acad Sci U S A 2017; 113:7337-44. PMID: 27382147; DOI: 10.1073/pnas.1512901113.
Abstract
The scientific mission of Project MindScope is to understand neocortex, the part of the mammalian brain that gives rise to perception, memory, intelligence, and consciousness. We seek to quantitatively evaluate the hypothesis that neocortex is a relatively homogeneous tissue, with smaller functional modules that perform a common computational function replicated across regions. We here focus on the mouse as a mammalian model organism with genetics, physiology, and behavior that can be readily studied and manipulated in the laboratory. We seek to describe the operation of cortical circuitry at the computational level by comprehensively cataloging and characterizing its cellular building blocks along with their dynamics and their cell type-specific connectivities. The project is also building large-scale experimental platforms (i.e., brain observatories) to record the activity of large populations of cortical neurons in behaving mice subject to visual stimuli. A primary goal is to understand the series of operations from visual input in the retina to behavior by observing and modeling the physical transformations of signals in the corticothalamic system. We here focus on the contribution that computer modeling and theory make to this long-term effort.
6. Lai YM, de Kamps M. Population density equations for stochastic processes with memory kernels. Phys Rev E 2017; 95:062125. PMID: 28709222; DOI: 10.1103/physreve.95.062125.
Abstract
We present a method for solving population density equations (PDEs), a mean-field technique describing homogeneous populations of uncoupled neurons, where the populations can be subject to non-Markov noise for arbitrary distributions of jump sizes. The method combines recent developments in two different disciplines that traditionally have had limited interaction: computational neuroscience and the theory of random networks. The method uses a geometric binning scheme, based on the method of characteristics, to capture the deterministic neurodynamics of the population, separating the deterministic and stochastic processes cleanly. We can independently vary the choice of the deterministic model and the model for the stochastic process, leading to a highly modular numerical solution strategy. We demonstrate this by replacing the master equation implicit in many formulations of the PDE formalism by a generalization called the generalized Montroll-Weiss equation, a recent result from random network theory, which describes a random walker subject to transitions realized by a non-Markovian process. We demonstrate the method for leaky- and quadratic-integrate-and-fire neurons subject to spike trains with Poisson and gamma-distributed interspike intervals. We are able to accurately model the jump responses of both models to both excitatory and inhibitory input under the assumption that all inputs are generated by one renewal process.
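The geometric binning idea can be sketched for leaky-integrate-and-fire neurons with plain Markovian (Poisson) input; the non-Markovian renewal machinery of the paper is omitted here, the reset is approximated by a catch-all bottom bin, and all parameter values are hypothetical. The key trick is that the bin edges lie on the characteristics of the leak flow, so one time step of deterministic dynamics is exactly a one-bin shift.

```python
import numpy as np

def lif_density_geometric(rate_in=2000.0, h=0.05, tau=0.02, v_th=1.0,
                          dt=1e-4, n_bins=400, t_end=0.5):
    a = dt / tau
    # Bin edges follow the flow of dv/dt = -v/tau: after one step dt,
    # edge k maps exactly onto edge k+1 (method of characteristics).
    edges = v_th * np.exp(-np.arange(n_bins + 1) * a)
    centers = 0.5 * (edges[:-1] + edges[1:])
    x = centers + h                          # state after one synaptic jump
    fires = x >= v_th                        # jumps that cross threshold
    k_new = np.clip(np.floor(np.log(v_th / x) / a).astype(int), 0, n_bins - 1)
    p = np.zeros(n_bins)
    p[-1] = 1.0                              # all mass starts near rest
    q = rate_in * dt                         # input probability per time step
    rates = []
    for _ in range(int(t_end / dt)):
        shifted = np.zeros_like(p)           # leak: shift mass one bin down;
        shifted[1:] = p[:-1]                 # the bottom bin collects what
        shifted[-1] += p[-1]                 # has already reached v ~ 0
        p = shifted
        moved = q * p                        # mass receiving a jump this step
        landed = np.zeros_like(p)
        np.add.at(landed, k_new[~fires], moved[~fires])
        fired_mass = moved[fires].sum()
        landed[-1] += fired_mass             # reset into the bottom bin
        p = (p - moved) + landed
        rates.append(fired_mass / dt)        # population firing rate (Hz)
    return float(np.mean(rates[len(rates) // 2:]))

rate = lif_density_geometric()               # mean-driven regime: tens of Hz
```

Swapping the Bernoulli input step for a renewal-process kernel is what the generalized Montroll-Weiss machinery of the paper makes possible; the binning itself is untouched by that change.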
Affiliation(s)
- Yi Ming Lai
- Institute for Artificial and Biological Computation, School of Computing, University of Leeds, LS2 9JT Leeds, United Kingdom
- Marc de Kamps
- Institute for Artificial and Biological Computation, School of Computing, University of Leeds, LS2 9JT Leeds, United Kingdom
7. Cain N, Iyer R, Koch C, Mihalas S. The Computational Properties of a Simplified Cortical Column Model. PLoS Comput Biol 2016; 12:e1005045. PMID: 27617444; PMCID: PMC5019422; DOI: 10.1371/journal.pcbi.1005045.
Abstract
The mammalian neocortex has a repetitious, laminar structure and performs functions integral to higher cognitive processes, including sensory perception, memory, and coordinated motor output. What computations does this circuitry subserve that link these unique structural elements to their function? Potjans and Diesmann (2014) parameterized a four-layer, two cell type (i.e. excitatory and inhibitory) model of a cortical column with homogeneous populations and cell type dependent connection probabilities. We implement a version of their model using a displacement integro-partial differential equation (DiPDE) population density model. This approach, exact in the limit of large homogeneous populations, provides a fast numerical method to solve equations describing the full probability density distribution of neuronal membrane potentials. It lends itself to quickly analyzing the mean response properties of population-scale firing rate dynamics. We use this strategy to examine the input-output relationship of the Potjans and Diesmann cortical column model to understand its computational properties. When inputs are constrained to jointly and equally target excitatory and inhibitory neurons, we find a large linear regime where the effect of a multi-layer input signal can be reduced to a linear combination of component signals. One of these, a simple subtractive operation, can act as an error signal passed between hierarchical processing stages. What computations do existing biophysically-plausible models of cortex perform on their inputs, and how do these computations relate to theories of cortical processing? We begin with a computational model of cortical tissue and seek to understand its input/output transformations. Our approach limits confirmation bias, and differs from a more constructionist approach of starting with a computational theory and then creating a model that can implement its necessary features. 
We here choose a population-level modeling technique that does not sacrifice accuracy, as it well-approximates the mean firing-rate of a population of leaky integrate-and-fire neurons. We extend this approach to simulate recurrently coupled neural populations, and characterize the computational properties of the Potjans and Diesmann cortical column model. We find that this model is capable of computing linear operations and naturally generates a subtraction operation implicated in theories of predictive coding. Although our quantitative findings are restricted to this particular model, we demonstrate that these conclusions are not highly sensitive to the model parameterization.
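The linear regime reported above can be illustrated with a toy linear firing-rate network; the coupling values below are hypothetical, not the DiPDE model or the Potjans and Diesmann parameters. In a linear system the response to a combined input is exactly the sum of the component responses, which is why input differences map to output differences (the subtraction operation).

```python
import numpy as np

def rate_response(inp, W, tau=0.01, dt=1e-4, steps=2000):
    # Linear firing-rate network: tau * dr/dt = -r + W r + inp
    r = np.zeros(W.shape[0])
    out = []
    for _ in range(steps):
        r = r + dt / tau * (-r + W @ r + inp)
        out.append(r.copy())
    return np.array(out)

# Hypothetical excitatory/inhibitory coupling (chosen to be stable)
W = np.array([[0.2, -0.4],
              [0.3, -0.2]])
a = rate_response(np.array([1.0, 0.0]), W)
b = rate_response(np.array([0.0, 1.0]), W)
ab = rate_response(np.array([1.0, 1.0]), W)
# Superposition: the response to the summed input equals the summed responses
superposition_holds = np.allclose(ab, a + b)
```

The column model in the paper is of course nonlinear; the finding is that over a large operating range it behaves approximately like this linear sketch.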
Affiliation(s)
- Nicholas Cain
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Ramakrishnan Iyer
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Christof Koch
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Stefan Mihalas
- Allen Institute for Brain Science, Seattle, Washington, United States of America
8.
9. de Kamps M, Sichau D. pMIIND: an MPI-based population density simulation framework. BMC Neurosci 2013. PMCID: PMC3704884; DOI: 10.1186/1471-2202-14-s1-p362.
10. Dumont G, Henry J. Synchronization of an Excitatory Integrate-and-Fire Neural Network. Bull Math Biol 2013; 75:629-48. DOI: 10.1007/s11538-013-9823-8.
11. Population density models of integrate-and-fire neurons with jumps: well-posedness. J Math Biol 2012; 67:453-81. DOI: 10.1007/s00285-012-0554-5.
12.
13. Farkhooi F, Muller E, Nawrot MP. Adaptation reduces variability of the neuronal population code. Phys Rev E 2011; 83:050905. PMID: 21728481; DOI: 10.1103/physreve.83.050905.
Abstract
Sequences of events in noise-driven excitable systems with slow variables often show serial correlations among their intervals of events. Here, we employ a master equation for generalized non-renewal processes to calculate the interval and count statistics of superimposed processes governed by a slow adaptation variable. For an ensemble of neurons with spike-frequency adaptation, this results in the regularization of the population activity and an enhanced postsynaptic signal decoding. We confirm our theoretical results in a population of cortical neurons recorded in vivo.
Affiliation(s)
- Farzad Farkhooi
- Neuroinformatics and Theoretical Neuroscience, Freie Universität Berlin and BCCN-Berlin, Berlin, Germany
14. Helias M, Deger M, Rotter S, Diesmann M. Finite post-synaptic potentials cause a fast neuronal response. Front Neurosci 2011; 5:19. PMID: 21427776; PMCID: PMC3047297; DOI: 10.3389/fnins.2011.00019.
Abstract
A generic property of the communication between neurons is the exchange of pulses at discrete time points, the action potentials. However, the prevalent theory of spiking neuronal networks of integrate-and-fire model neurons relies on two assumptions: the superposition of many afferent synaptic impulses is approximated by Gaussian white noise, equivalent to a vanishing magnitude of the synaptic impulses, and the transfer of time-varying signals by neurons is assessable by linearization. Going beyond both approximations, we find that in the presence of synaptic impulses the response to transient inputs differs qualitatively from previous predictions. It is instantaneous rather than exhibiting low-pass characteristics, depends non-linearly on the amplitude of the impulse, is asymmetric for excitation and inhibition, and is promoted by a characteristic level of synaptic background noise. These findings resolve contradictions between the earlier theory and experimental observations. Here we review the recent theoretical progress that enabled these insights. We explain why the membrane potential near threshold is sensitive to properties of the afferent noise and show how this shapes the neural response. A further extension of the theory to time evolution in discrete steps quantifies simulation artifacts and yields improved methods to cross check results.
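The instantaneous response to a finite impulse can be reproduced qualitatively with a direct simulation sketch; the parameters are hypothetical and this is not the authors' analytical treatment. A single extra finite-size impulse delivered to every neuron fires all the neurons sitting within one impulse of threshold in the same time bin, rather than producing a low-pass filtered rate transient.

```python
import numpy as np

rng = np.random.default_rng(1)

# LIF population driven by finite synaptic jumps (illustrative parameters)
n, tau, v_th, dt = 10000, 0.02, 1.0, 1e-4
w, rate_in = 0.02, 5000.0         # finite jump size, background input rate
v = rng.uniform(0.0, v_th, n)     # start the ensemble spread below threshold
counts = []
steps, t_imp = 3000, 2000
for t in range(steps):
    v += dt * (-v / tau) + w * rng.poisson(rate_in * dt, n)
    if t == t_imp:
        v += 0.1                  # one extra, finite synaptic impulse for all
    spiking = v >= v_th
    counts.append(int(spiking.sum()))
    v[spiking] = 0.0              # reset
baseline = float(np.mean(counts[1000:2000]))
response = counts[t_imp]          # spikes in the very bin of the impulse
```

In the diffusion limit (vanishing jump size), the density at threshold goes to zero and this same-bin response disappears, which is the contrast the paper analyses.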
Affiliation(s)
- Moritz Deger
- Bernstein Center Freiburg, Albert-Ludwig University, Freiburg, Germany
- Stefan Rotter
- Bernstein Center Freiburg, Albert-Ludwig University, Freiburg, Germany
- Computational Neuroscience, Faculty of Biology, Albert-Ludwig University, Freiburg, Germany
- Markus Diesmann
- RIKEN Brain Science Institute, Wako City, Japan
- Bernstein Center Freiburg, Albert-Ludwig University, Freiburg, Germany
- Institute for Neuroscience and Medicine (INM-6), Computational and Systems Neuroscience, Research Center Jülich, Germany
- Brain and Neural Systems Team, Computational Science Research Program, RIKEN, Wako City, Japan
15. Helias M, Deger M, Rotter S, Diesmann M. Instantaneous non-linear processing by pulse-coupled threshold units. PLoS Comput Biol 2010; 6. PMID: 20856583; PMCID: PMC2936519; DOI: 10.1371/journal.pcbi.1000929.
Abstract
Contemporary theory of spiking neuronal networks is based on the linear response of the integrate-and-fire neuron model derived in the diffusion limit. We find that for non-zero synaptic weights, the response to transient inputs differs qualitatively from this approximation. The response is instantaneous rather than exhibiting low-pass characteristics, non-linearly dependent on the input amplitude, asymmetric for excitation and inhibition, and is promoted by a characteristic level of synaptic background noise. We show that at threshold the probability density of the potential drops to zero within the range of one synaptic weight and explain how this shapes the response. The novel mechanism is exhibited on the network level and is a generic property of pulse-coupled networks of threshold units. Our work demonstrates a fast firing response of nerve cells that has remained unconsidered in network analysis because it is inaccessible to the otherwise successful linear response theory. For the sake of analytic tractability, this theory assumes infinitesimally weak synaptic coupling. However, realistic synaptic impulses cause a measurable deflection of the membrane potential. Here we quantify the effect of this pulse-coupling on the firing rate and the membrane-potential distribution. We demonstrate how the postsynaptic potentials give rise to a fast, non-linear rate transient present for excitatory, but not for inhibitory, inputs. It is particularly pronounced in the presence of a characteristic level of synaptic background noise. We show that feed-forward inhibition enhances the fast response on the network level. This enables a mode of information processing based on short-lived activity transients. Moreover, the non-linear neural response appears on a time scale that critically interacts with spike-timing dependent synaptic plasticity rules. Our results are derived for biologically realistic synaptic amplitudes, but also extend earlier work based on Gaussian white noise.
The novel theoretical framework is generically applicable to any threshold unit governed by a stochastic differential equation driven by finite jumps. Therefore, our results are relevant for a wide range of biological, physical, and technical systems.
16. de Kamps M, van der Velde F. The MIIND framework: combining population density methods, neural simulations and Wilson-Cowan dynamics into large-scale heterogeneous neural models of cognition. BMC Neurosci 2009. DOI: 10.1186/1471-2202-10-s1-p276.
17. de Kamps M, Baier V, Drever J, Dietz M, Mösenlechner L, van der Velde F. The state of MIIND. Neural Netw 2008; 21:1164-81. PMID: 18783918; DOI: 10.1016/j.neunet.2008.07.006.
Abstract
MIIND (Multiple Interacting Instantiations of Neural Dynamics) is a highly modular multi-level C++ framework that aims to shorten the development time for models in Cognitive Neuroscience (CNS). It offers reusable code modules (libraries of classes and functions) aimed at solving problems that occur repeatedly in modelling, but tries not to impose a specific modelling philosophy or methodology. At the lowest level, it offers support for the implementation of sparse networks. For example, the library SparseImplementationLib supports sparse random networks and the library LayerMappingLib can be used for sparse regular networks of filter-like operators. The library DynamicLib, which builds on top of the library SparseImplementationLib, offers a generic framework for simulating network processes. Presently, several specific network process implementations are provided in MIIND: the Wilson-Cowan and Ornstein-Uhlenbeck types, and population density techniques for leaky-integrate-and-fire neurons driven by Poisson input. A design principle of MIIND is to support detailing: the refinement of an originally simple model into a form where more biological detail is included. Another design principle is extensibility: the reuse of an existing model in a larger, more extended one. One of the main uses of MIIND so far has been the instantiation of neural models of visual attention. Recently, we have added a library for implementing biologically-inspired models of artificial vision, such as HMAX and recent successors. In the long run we hope to be able to apply suitably adapted neuronal mechanisms of attention to these artificial models.
Affiliation(s)
- Marc de Kamps
- Biosystems Group, School of Computing, University of Leeds, LS2 9JT Leeds, United Kingdom
18. Sirovich L, Omurtag A, Lubliner K. Dynamics of neural populations: stability and synchrony. Network 2006; 17:3-29. PMID: 16613792; DOI: 10.1080/09548980500421154.
Abstract
A population formulation of neuronal activity is employed to study an excitatory network of (spiking) neurons receiving external input as well as recurrent feedback. At relatively low levels of feedback, the network exhibits time-stationary asynchronous behavior. A stability analysis of this time-stationary state leads to an analytical criterion for the critical gain at which asynchronous behavior becomes unstable. At instability the dynamics can undergo a supercritical Hopf bifurcation and the population passes to a synchronous state. Under different conditions it can pass to synchrony through a subcritical Hopf bifurcation. And at high gain a network can reach a runaway state, in finite time, after which the network no longer supports bounded solutions. The introduction of time-delayed feedback leads to a rich range of phenomena. For example, for a given external input, increasing gain produces a transition from asynchrony, to synchrony, to asynchrony, and finally can lead to divergence. Time delay is also shown to strongly mollify the amplitude of synchronous oscillations. Perhaps of general importance is the result that synchronous behavior can exist only for a narrow range of time delays, a range an order of magnitude smaller than the period of oscillation.
Affiliation(s)
- Lawrence Sirovich
- Laboratory of Applied Mathematics, Mount Sinai School of Medicine, 1 Gustave L. Levy Place, New York, NY 10029, USA
19.
Abstract
Cortical activity is the product of interactions among neuronal populations. Macroscopic electrophysiological phenomena are generated by these interactions. In principle, the mechanisms of these interactions afford constraints on biologically plausible models of electrophysiological responses. In other words, the macroscopic features of cortical activity can be modelled in terms of the microscopic behaviour of neurons. An evoked response potential (ERP) is the mean electrical potential measured from an electrode on the scalp, in response to some event. The purpose of this paper is to outline a population density approach to modelling ERPs. We propose a biologically plausible model of neuronal activity that enables the estimation of physiologically meaningful parameters from electrophysiological data. The model encompasses four basic characteristics of neuronal activity and organization: (i) neurons are dynamic units, (ii) driven by stochastic forces, (iii) organized into populations with similar biophysical properties and response characteristics and (iv) multiple populations interact to form functional networks. This leads to a formulation of population dynamics in terms of the Fokker-Planck equation. The solution of this equation is the temporal evolution of a probability density over state-space, representing the distribution of an ensemble of trajectories. Each trajectory corresponds to the changing state of a neuron. Measurements can be modelled by taking expectations over this density, e.g. mean membrane potential, firing rate or energy consumption per neuron. The key motivation behind our approach is that ERPs represent an average response over many neurons. This means it is sufficient to model the probability density over neurons, because this implicitly models their average state. Although the dynamics of each neuron can be highly stochastic, the dynamics of the density is not. 
This means we can use Bayesian inference and estimation tools that have already been established for deterministic systems. The potential importance of modelling density dynamics (as opposed to more conventional neural mass models) is that it includes interactions among the moments of neuronal states (e.g. the mean depolarization may depend on the variance of synaptic currents through nonlinear mechanisms). Here, we formulate a population model, based on biologically informed model-neurons with spike-rate adaptation and synaptic dynamics. Neuronal sub-populations are coupled to form an observation model, with the aim of estimating and making inferences about coupling among sub-populations using real data. We approximate the time-dependent solution of the system using a bi-orthogonal set and first-order perturbation expansion. For didactic purposes, the model is developed first in the context of deterministic input, and then extended to include stochastic effects. The approach is demonstrated using synthetic data, where model parameters are identified using a Bayesian estimation scheme we have described previously.
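The Fokker-Planck formulation can be sketched numerically for the simplest case, an Ornstein-Uhlenbeck membrane potential, with observables obtained as expectations over the density. All parameter values are illustrative, and the paper's model-neuron details (spike-rate adaptation, synaptic dynamics) are omitted.

```python
import numpy as np

# Fokker-Planck equation for an OU membrane potential,
#   d(rho)/dt = d/dv[ (v - mu)/tau * rho ] + D * d^2(rho)/dv^2,
# solved on a grid with an explicit finite-difference scheme.
tau, mu, sigma = 0.02, 0.5, 1.0
D = sigma**2 / 2.0
v = np.linspace(0.0, 1.0, 201)
dv = v[1] - v[0]
dt = 0.2 * dv**2 / D              # keep dt below the diffusion stability limit
rho = np.ones_like(v)
rho /= rho.sum() * dv             # start from a flat, normalised density

for _ in range(10000):
    f = (v - mu) / tau * rho      # drift part of the probability flux
    new = rho.copy()
    new[1:-1] += dt * ((f[2:] - f[:-2]) / (2 * dv)
                       + D * (rho[2:] - 2 * rho[1:-1] + rho[:-2]) / dv**2)
    new[0] = new[-1] = 0.0        # far boundaries carry negligible mass
    rho = new / (new.sum() * dv)  # renormalise total probability to one

# Measurements are expectations over the density, e.g. the mean potential
mean_v = (v * rho).sum() * dv
std_v = np.sqrt(((v - mean_v)**2 * rho).sum() * dv)
# Stationary OU density is Gaussian: mean mu, std sigma * sqrt(tau / 2) = 0.1
```

Although each neuron's trajectory is highly stochastic, the density and its expectations evolve deterministically, which is what makes the estimation schemes described in the abstract applicable.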
Affiliation(s)
- L M Harrison
- The Wellcome Department of Imaging Neuroscience, Institute of Neurology, UCL, 12 Queen Square, London WC1N 3BG, UK