1. Cain N, Iyer R, Koch C, Mihalas S. The Computational Properties of a Simplified Cortical Column Model. PLoS Comput Biol 2016; 12:e1005045. [PMID: 27617444] [PMCID: PMC5019422] [DOI: 10.1371/journal.pcbi.1005045]
Abstract
The mammalian neocortex has a repetitious, laminar structure and performs functions integral to higher cognitive processes, including sensory perception, memory, and coordinated motor output. What computations does this circuitry subserve that link these unique structural elements to their function? Potjans and Diesmann (2014) parameterized a four-layer model of a cortical column with two cell types (excitatory and inhibitory), homogeneous populations, and cell-type-dependent connection probabilities. We implement a version of their model using a displacement integro-partial differential equation (DiPDE) population density model. This approach, exact in the limit of large homogeneous populations, provides a fast numerical method to solve equations describing the full probability density distribution of neuronal membrane potentials. It lends itself to quickly analyzing the mean response properties of population-scale firing-rate dynamics. We use this strategy to examine the input-output relationship of the Potjans and Diesmann cortical column model and to understand its computational properties. When inputs are constrained to jointly and equally target excitatory and inhibitory neurons, we find a large linear regime in which the effect of a multi-layer input signal can be reduced to a linear combination of component signals. One of these, a simple subtractive operation, can act as an error signal passed between hierarchical processing stages.

What computations do existing biophysically plausible models of cortex perform on their inputs, and how do these computations relate to theories of cortical processing? We begin with a computational model of cortical tissue and seek to understand its input/output transformations. Our approach limits confirmation bias and differs from the more constructionist approach of starting with a computational theory and then creating a model that can implement its necessary features. Here we choose a population-level modeling technique that does not sacrifice accuracy, as it well approximates the mean firing rate of a population of leaky integrate-and-fire neurons. We extend this approach to simulate recurrently coupled neural populations and characterize the computational properties of the Potjans and Diesmann cortical column model. We find that this model is capable of computing linear operations and naturally generates a subtraction operation implicated in theories of predictive coding. Although our quantitative findings are restricted to this particular model, we demonstrate that our conclusions are not highly sensitive to the model parameterization.
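The DiPDE approach described above is exact in the limit of large homogeneous pools of leaky integrate-and-fire neurons; the mean firing rate it computes can also be estimated directly by Monte Carlo simulation of individual neurons. A minimal sketch (all parameter values here are illustrative, not taken from the paper):

```python
import random

def lif_population_rate(n_neurons=100, t_sim=0.5, dt=1e-4,
                        tau_m=0.02, v_reset=0.0, v_thresh=1.0,
                        rate_in=1000.0, w_in=0.1, seed=0):
    """Monte Carlo estimate of the mean firing rate (Hz) of an uncoupled
    population of leaky integrate-and-fire neurons, each driven by an
    independent Poisson input train (illustrative parameter values)."""
    rng = random.Random(seed)
    v = [0.0] * n_neurons
    spikes = 0
    p_in = rate_in * dt              # input-spike probability per time step
    for _ in range(int(t_sim / dt)):
        for i in range(n_neurons):
            v[i] += dt * (-v[i] / tau_m)   # leak toward rest
            if rng.random() < p_in:
                v[i] += w_in               # synaptic jump on input spike
            if v[i] >= v_thresh:           # threshold crossing: spike and reset
                spikes += 1
                v[i] = v_reset
    return spikes / (n_neurons * t_sim)
```

The population density method instead evolves the full membrane-potential distribution, so its cost does not grow with population size; the Monte Carlo estimate above converges toward the same mean rate as `n_neurons` increases.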
Affiliation(s)
- Nicholas Cain, Allen Institute for Brain Science, Seattle, Washington, United States of America
- Ramakrishnan Iyer, Allen Institute for Brain Science, Seattle, Washington, United States of America
- Christof Koch, Allen Institute for Brain Science, Seattle, Washington, United States of America
- Stefan Mihalas, Allen Institute for Brain Science, Seattle, Washington, United States of America
2. Keles HO, Barbour RL, Omurtag A. Hemodynamic correlates of spontaneous neural activity measured by human whole-head resting state EEG+fNIRS. Neuroimage 2016; 138:76-87. [PMID: 27236081] [DOI: 10.1016/j.neuroimage.2016.05.058]
Abstract
The brains of awake, resting human subjects display spontaneously occurring neural activity patterns whose magnitude is typically many times greater than those triggered by cognitive or perceptual performance. Evoked and resting state activations affect local cerebral hemodynamic properties through processes collectively referred to as neurovascular coupling. Its investigation calls for an ability to track both the neural and vascular aspects of brain function. We used scalp electroencephalography (EEG), which provided a measure of the electrical potentials generated by cortical postsynaptic currents. Simultaneously we utilized functional near-infrared spectroscopy (fNIRS) to continuously monitor hemoglobin concentration changes in superficial cortical layers. The multi-modal signal from 18 healthy adult subjects allowed us to investigate the association of neural activity in a range of frequencies over the whole head with local changes in hemoglobin concentrations. Our results verified the delayed alpha (8-16 Hz) modulation of hemodynamics in posterior areas known from the literature. They also indicated strong beta (16-32 Hz) modulation of hemodynamics. Analysis revealed, however, that the beta modulation was likely generated by alpha-beta coupling in the EEG. Signals from the inferior electrode sites were dominated by scalp muscle related activity. Our study aimed to characterize the phenomena related to neurovascular coupling observable by practical, cost-effective, and non-invasive multi-modal techniques.
Affiliation(s)
- Hasan Onur Keles, Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA 02114, United States
- Randall L Barbour, Department of Pathology, Optical Tomography Group, State University of New York, NY 11203, United States
- Ahmet Omurtag, Department of Biomedical Engineering, University of Houston, Houston, TX 77204, United States
3. Dumont G, Henry J, Tarniceriu CO. Well-posedness of a density model for a population of theta neurons. J Math Neurosci 2014; 4:2. [PMID: 24742324] [DOI: 10.1186/2190-8567-4-2]
Abstract
Population density models describing the evolution of neural populations in a phase space are closely tied to the underlying single-neuron model, which specifies the individual trajectories of the neurons of the population and, in particular, the phase space in which the computations are made. Starting from a transformation of the quadratic integrate-and-fire single-neuron model, the so-called theta-neuron model is obtained; in this paper we introduce a corresponding population density model for it. Existence and uniqueness of a solution are proved and some numerical simulations are presented.
4. Lytton WW, Neymotin SA, Kerr CC. Multiscale modeling for clinical translation in neuropsychiatric disease. 2014; 1. [PMID: 26925364] [PMCID: PMC4766859] [DOI: 10.1186/2194-3990-1-7]
Abstract
Multiscale modeling of neuropsychiatric illness bridges scales of clinical importance: from the highest scales (presentation of behavioral signs and symptoms), through intermediate scales (clinical testing and surgical intervention), down to the molecular scale of pharmacotherapy. Modeling of brain disease is difficult compared to modeling of other organs, because dysfunction manifests at scales where measurements are rudimentary due both to inadequate access (memory and cognition) and to complexity (behavior). Nonetheless, we can begin to explore these aspects through the use of information-theoretic measures as stand-ins for meaning at the top scales. We here describe efforts across five disorders: Parkinson’s, Alzheimer’s, stroke, schizophrenia, and epilepsy. We look at the use of therapeutic brain stimulation to replace lost neural signals, a loss that produces diaschisis, defined as activity changes in other brain areas due to missing inputs. These changes may in some cases be compensatory, hence beneficial, but in many cases a primary pathology, whether itself static or dynamic, sets in motion a series of dynamic consequences that produce further pathology. The simulations presented here suggest how diaschisis can be reversed by using a neuroprosthetic signal. Despite having none of the information content of the lost physiological signal, the simplified neuroprosthetic signal can restore a diaschitic area to near-normal patterns of activity. Computer simulation thus begins to explain the remarkable success of stimulation technologies - deep brain stimulation, transcranial magnetic stimulation, ultrasound stimulation, transcranial direct current stimulation - across an extremely broad range of pathologies. 
Multiscale modeling can help us to optimize and integrate these neuroprosthetic therapies by taking into consideration effects of different stimulation protocols, combinations of stimulation with neuropharmacological therapy, and interplay of these therapeutic modalities with particular patterns of disease focality, dynamics, and prior therapies.
Affiliation(s)
- William W Lytton, Department of Physiology & Pharmacology and Neurology, SUNY Downstate Medical Center, Brooklyn, NY 11203, USA; Department of Neurology, Kings County Hospital, Brooklyn, NY 11203, USA
- Samuel A Neymotin, Department of Physiology & Pharmacology and Neurology, SUNY Downstate Medical Center, Brooklyn, NY 11203, USA
- Cliff C Kerr, Department of Physiology & Pharmacology and Neurology, SUNY Downstate Medical Center, Brooklyn, NY 11203, USA
5. Dumont G, Henry J. Synchronization of an Excitatory Integrate-and-Fire Neural Network. Bull Math Biol 2013; 75:629-48. [DOI: 10.1007/s11538-013-9823-8]
6. Bifurcations of large networks of two-dimensional integrate and fire neurons. J Comput Neurosci 2013; 35:87-108. [PMID: 23430291] [DOI: 10.1007/s10827-013-0442-z]
Abstract
Recently, a class of two-dimensional integrate and fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate and fire model, and the quartic integrate and fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045-1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher order approximations are discussed.
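For concreteness, the single-cell dynamics of one member of this model class, the Izhikevich model, can be integrated with a simple Euler scheme. A minimal sketch using the standard "regular spiking" parameter set (the injected current value is illustrative):

```python
def izhikevich_spikes(a=0.02, b=0.2, c=-65.0, d=8.0,
                      I=10.0, t_sim=500.0, dt=0.5):
    """Euler integration of the two-variable Izhikevich model;
    returns the spike times (ms) produced by a constant current I."""
    v, u = -65.0, b * (-65.0)    # membrane potential and recovery variable
    spikes, t = [], 0.0
    while t < t_sim:
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:            # spike cutoff, then the characteristic reset
            spikes.append(t)
            v, u = c, u + d
        t += dt
    return spikes
```

The mean field reduction described in the abstract replaces ensembles of such units with switching ordinary differential equations for population averages; varying parameters such as the recovery jump d moves single cells, and hence networks, between tonic and burst firing regimes.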
7. Population density models of integrate-and-fire neurons with jumps: well-posedness. J Math Biol 2012; 67:453-81. [DOI: 10.1007/s00285-012-0554-5]
8. Cáceres MJ, Carrillo JA, Perthame B. Analysis of nonlinear noisy integrate & fire neuron models: blow-up and steady states. J Math Neurosci 2011; 1:7. [PMID: 22657097] [PMCID: PMC3496469] [DOI: 10.1186/2190-8567-1-7]
Abstract
Nonlinear Noisy Leaky Integrate and Fire (NNLIF) models for neuronal networks can be written as Fokker-Planck-Kolmogorov equations on the probability density of neurons, the main parameters in the model being the connectivity of the network and the noise. We analyse several aspects of the NNLIF model: the number of steady states, a priori estimates, blow-up issues, and convergence toward equilibrium in the linear case. In particular, for excitatory networks, blow-up always occurs for initial data concentrated close to the firing potential. These results show how critical the balance between noise and excitatory/inhibitory interactions is, relative to the connectivity parameter.

AMS Subject Classification: 35K60, 82C31, 92B20.
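The Fokker-Planck equation in this setting describes the law of an underlying stochastic single-neuron process. A hedged sketch of that process, an Euler-Maruyama simulation of one noisy leaky integrate-and-fire neuron with reset (parameter values are illustrative; coupling the drift to the network's own firing rate is what makes the full model nonlinear):

```python
import math
import random

def nlif_rate(mu=1.2, sigma=0.5, v_thresh=1.0, v_reset=0.0,
              tau=1.0, t_sim=200.0, dt=1e-3, seed=1):
    """Euler-Maruyama simulation of dV = (mu - V)/tau dt + sigma dW
    with reset at threshold; returns the empirical firing rate
    (spikes per unit time) over the simulated interval."""
    rng = random.Random(seed)
    v = v_reset
    spikes = 0
    noise_scale = sigma * math.sqrt(dt)   # sqrt(dt) scaling of the Wiener increment
    for _ in range(int(t_sim / dt)):
        v += dt * (mu - v) / tau + noise_scale * rng.gauss(0.0, 1.0)
        if v >= v_thresh:                 # fire-and-reset mechanism
            spikes += 1
            v = v_reset
    return spikes / t_sim
```

The histogram of many such trajectories approximates the probability density whose evolution the NNLIF equation governs; the blow-up result above concerns the regime where excitatory feedback into the drift term is strong and the density concentrates near the firing potential.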
Affiliation(s)
- María J Cáceres, Departamento de Matemática Aplicada, Universidad de Granada, E-18071, Granada, Spain
- José A Carrillo, ICREA and Departament de Matemàtiques, Universitat Autònoma de Barcelona, E-08193, Bellaterra, Spain
- Benoît Perthame, Laboratoire Jacques-Louis Lions, UPMC, CNRS UMR 7598 and INRIA-Bang, F-75005, Paris, France; Institut Universitaire de France, 75005, Paris, France
9. Rothkegel A, Lehnertz K. Multistability, local pattern formation, and global collective firing in a small-world network of nonleaky integrate-and-fire neurons. Chaos 2009; 19:015109. [PMID: 19335013] [DOI: 10.1063/1.3087432]
Abstract
We investigate numerically the collective dynamical behavior of pulse-coupled nonleaky integrate-and-fire neurons arranged on a two-dimensional small-world network. To ensure ongoing activity, we impose a probability of spontaneous firing for each neuron. We study network dynamics evolving from different sets of initial conditions as a function of coupling strength and rewiring probability. Besides a homogeneous equilibrium state at low coupling strength, we observe different local patterns, including cyclic waves, spiral waves, and turbulent-like patterns, which, depending on network parameters, interfere with the global collective firing of the neurons. We attribute the various network dynamics to distinct regimes in the parameter space. For the same network parameters, different network dynamics can be observed depending only on the set of initial conditions. Such multistable behavior and the interplay between local pattern formation and global collective firing may be attributable to the spatiotemporal dynamics of biological networks.
10. Liu CY, Nykamp DQ. A kinetic theory approach to capturing interneuronal correlation: the feed-forward case. J Comput Neurosci 2008; 26:339-68. [PMID: 18987967] [DOI: 10.1007/s10827-008-0116-4]
Abstract
We present an approach for using kinetic theory to capture first and second order statistics of neuronal activity. We coarse grain neuronal networks into populations of neurons and calculate the population average firing rate and output cross-correlation in response to time varying correlated input. We derive coupling equations for the populations based on first and second order statistics of the network connectivity. This coupling scheme is based on the hypothesis that second order statistics of the network connectivity are sufficient to determine second order statistics of neuronal activity. We implement a kinetic theory representation of a simple feed-forward network and demonstrate that the kinetic theory model captures key aspects of the emergence and propagation of correlations in the network, as long as the correlations do not become too strong. By analyzing the correlated activity of feed-forward networks with a variety of connectivity patterns, we provide evidence supporting our hypothesis of the sufficiency of second order connectivity statistics.
Affiliation(s)
- Chin-Yueh Liu, School of Mathematics, University of Minnesota, 206 Church St., Minneapolis, MN 55455, USA
11. Dynamical relaying can yield zero time lag neuronal synchrony despite long conduction delays. Proc Natl Acad Sci U S A 2008; 105:17157-62. [PMID: 18957544] [DOI: 10.1073/pnas.0809353105]
Abstract
Multielectrode recordings have revealed zero-time-lag synchronization among remote cerebral cortical areas. However, the axonal conduction delays among such distant regions can amount to several tens of milliseconds. It is still unclear which mechanism gives rise to isochronous discharge of widely distributed neurons despite such latencies. Here, we investigate the synchronization properties of a simple network motif and find that, even in the presence of large axonal conduction delays, distant neuronal populations self-organize into lag-free oscillations. According to our results, cortico-cortical association fibers and certain cortico-thalamo-cortical loops represent ideal circuits to circumvent the phase shifts and time lags associated with conduction delays.
12. de Kamps M, Baier V, Drever J, Dietz M, Mösenlechner L, van der Velde F. The state of MIIND. Neural Netw 2008; 21:1164-81. [PMID: 18783918] [DOI: 10.1016/j.neunet.2008.07.006]
Abstract
MIIND (Multiple Interacting Instantiations of Neural Dynamics) is a highly modular multi-level C++ framework that aims to shorten the development time for models in Cognitive Neuroscience (CNS). It offers reusable code modules (libraries of classes and functions) aimed at solving problems that occur repeatedly in modelling, but tries not to impose a specific modelling philosophy or methodology. At the lowest level, it offers support for the implementation of sparse networks. For example, the library SparseImplementationLib supports sparse random networks and the library LayerMappingLib can be used for sparse regular networks of filter-like operators. The library DynamicLib, which builds on top of SparseImplementationLib, offers a generic framework for simulating network processes. Presently, several specific network process implementations are provided in MIIND: the Wilson-Cowan and Ornstein-Uhlenbeck types, and population density techniques for leaky-integrate-and-fire neurons driven by Poisson input. A design principle of MIIND is to support detailing: the refinement of an originally simple model into a form where more biological detail is included. Another design principle is extensibility: the reuse of an existing model in a larger, more extended one. One of the main uses of MIIND so far has been the instantiation of neural models of visual attention. Recently, we have added a library for implementing biologically-inspired models of artificial vision, such as HMAX and recent successors. In the long run we hope to be able to apply suitably adapted neuronal mechanisms of attention to these artificial models.
Affiliation(s)
- Marc de Kamps, Biosystems Group, School of Computing, University of Leeds, LS2 9JT Leeds, United Kingdom
13.
Abstract
A mathematical model of general character for the dynamic description of coupled neural oscillators is presented. The population approach that is employed applies equally to coupled cells and to populations of such coupled cells. The formulation includes stochasticity and preserves details of precisely firing neurons. Based on the generally accepted view of cortical wiring, this formulation is applied to the retinal ganglion cell (RGC)/lateral geniculate nucleus (LGN) relay cell system of the early mammalian visual system. The smallness of quantal voltage jumps at the retinal level permits a Fokker-Planck approximation for the RGC contribution; however, the LGN description requires the use of finite jumps, which for fast synaptic dynamics appear as finite jumps in the membrane potential. Analyses of equilibrium spiking behavior for both the deterministic and stochastic cases are presented. Green's function methods form the basis for the asymptotic and exact results that are presented. These determine the spiking ratio (i.e., the number of RGC arrivals per LGN spike), which is the reciprocal of the transfer ratio, under a wide range of circumstances. Criteria for spiking regimes, in terms of the relatively few parameters of the model, are presented. Under reasonable hypotheses, it is shown that the transfer ratio is ≤1/2 in the absence of input from other areas. Thus, the model suggests that the LGN/RGC system may be a relatively unsophisticated spike editor. In the absence of other input, the system is designed to fire an LGN spike only when two or more RGC spikes appear in a relatively short time. Transfer ratios that briefly exceed 1/2 (but are less than 1) have been recorded in the laboratory. Inclusion of brain stem input has been shown to provide a signal that elevates the transfer ratio (Ozaki & Kaplan, 2006). A model that includes this contribution is also presented.
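The "spike editor" idea, that a relay cell fires only when two or more retinal spikes arrive within a short window, caps the transfer ratio at 1/2 by construction, since each LGN spike consumes at least two RGC spikes. A toy sketch of this coincidence rule on Poisson retinal input (the rate and window values are illustrative, not taken from the paper):

```python
import random

def transfer_ratio(rate=100.0, t_sim=50.0, window=0.005, seed=2):
    """Toy spike editor: an LGN spike is emitted only when two RGC spikes
    arrive within `window` seconds of each other, and each RGC spike is
    used at most once. Returns LGN spikes / RGC spikes."""
    rng = random.Random(seed)
    # homogeneous Poisson RGC spike train of mean rate `rate` (Hz)
    rgc, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t > t_sim:
            break
        rgc.append(t)
    lgn = 0
    pending = None            # most recent unpaired RGC spike time
    for s in rgc:
        if pending is not None and s - pending <= window:
            lgn += 1          # coincidence detected: fire, consume both spikes
            pending = None
        else:
            pending = s
    return lgn / len(rgc)
```

Because every emitted spike pairs off two input spikes, the returned ratio can never exceed 1/2; in the model above, ratios exceeding 1/2 require an extra depolarizing signal such as the brain stem input mentioned in the abstract.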
Affiliation(s)
- Lawrence Sirovich, Laboratory of Applied Mathematics, Mt. Sinai School of Medicine, New York, NY 10029, U.S.A.
14. Rangamani P, Sirovich L. Survival and apoptotic pathways initiated by TNF-alpha: modeling and predictions. Biotechnol Bioeng 2007; 97:1216-29. [PMID: 17171720] [DOI: 10.1002/bit.21307]
Abstract
We present a mathematical model which includes TNF-alpha initiated survival and apoptotic cascades, as well as nuclear transcription of IkappaB. These pathways play a crucial role in deciding cell fate in response to inflammation and infection. Our model incorporates known specific protein-protein interactions as identified by experiments. Using these biochemical interactions, we develop a mathematical model of the NF-kappaB-mediated survival and caspase-mediated apoptosis pathways. Using mass action kinetics, we follow the formation of the survival and late complexes as well as the dynamics of DNA fragmentation. The effect of TNF-alpha concentration on DNA fragmentation is modeled and compares well with experiment. Nuclear transcription is also modeled phenomenologically by means of time lagged cytosolic concentrations. This results in transcription related concentrations undergoing under-damped oscillations, in qualitative and quantitative agreement with experiment. Using a tumor cell as a hypothetical model, we explore the interplay between the components of the survival and apoptotic pathways. Results are presented which make predictions on the limits of cellular oscillations in terms of time delay, initial concentration ratios and other features of the model. The model also makes clear predictions on cell viability in terms of DNA damage within the framework of TNF-alpha stimulus duration.
Affiliation(s)
- Padmini Rangamani, Laboratory of Applied Mathematics, Mount Sinai School of Medicine, One Gustave L. Levy Place, New York, NY 10029, USA
15.
Abstract
Network simulations can help identify underlying mechanisms of epileptic activity that are hard to isolate in biologic preparations. To be useful, simulations must be sufficiently realistic to make possible biologic and clinical prediction. This requirement for large networks of sufficiently detailed neurons raises challenges with regard both to computational load and to the difficulty of obtaining insights given large numbers of free parameters and large amounts of generated data. The authors have addressed these problems by simulating computationally manageable networks of moderate size, consisting of 1,000 to 3,000 neurons with multiple intrinsic and synaptic properties. Experiments on these simulations demonstrated the presence of epileptiform behavior in the form of repetitive high-intensity population events (clonic behavior) or latch-up with near maximal activity (tonic behavior). Intrinsic neuronal excitability is not always a predictor of network epileptiform activity but may paradoxically produce antiepileptic effects, depending on the settings of other parameters. Several simulations revealed the importance of random coincident inputs in shifting a network from a low-activation to a high-activation epileptiform state. Finally, a simulated anticonvulsant acting on excitability tended to preferentially decrease tonic activity.
16. Ermentrout B. Gap junctions destroy persistent states in excitatory networks. Phys Rev E Stat Nonlin Soft Matter Phys 2006; 74:031918. [PMID: 17025678] [DOI: 10.1103/physreve.74.031918]
Abstract
Gap junctions between excitatory neurons are shown to disrupt the persistent state. The asynchronous state of the network loses stability via a Hopf bifurcation and then the active state is destroyed via a homoclinic bifurcation with a stationary state. A partial differential equation (PDE) is developed to analyze the Hopf and the homoclinic bifurcations. The simplified dynamics are compared to a biophysical model where similar behavior is observed. In the low noise case, the dynamics of the PDE is shown to be very complicated and includes possible chaotic behavior. The onset of synchrony is studied by the application of averaging to obtain a simple criterion for destabilization of the asynchronous persistent state.
Affiliation(s)
- Bard Ermentrout, Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania 15260, USA