51
Hildebrand EJ, Buice MA, Chow CC. Kinetic theory of coupled oscillators. Phys Rev Lett 2007; 98:054101. [PMID: 17358861] [PMCID: PMC2561959] [DOI: 10.1103/physrevlett.98.054101]
Abstract
We present an approach for the description of fluctuations that are due to finite-system-size-induced correlations in the Kuramoto model of coupled oscillators. We construct a hierarchy for the moments of the density of oscillators that is analogous to the Bogoliubov-Born-Green-Kirkwood-Yvon hierarchy in the kinetic theory of plasmas and gases. To calculate the lowest-order system-size effect, we truncate this hierarchy at second order and solve the resulting closed equations for the two-oscillator correlation function around the incoherent state. We use this correlation function to compute the fluctuations of the order parameter, including the effect of transients, and compare this computation with numerical simulations.
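The finite-size fluctuations in question are easy to observe directly. The following minimal sketch (not the authors' code; the coupling strength, frequency distribution, and all other parameter values are illustrative assumptions) integrates a finite-N Kuramoto network below the synchronization threshold and records the order parameter r(t), whose fluctuation statistics the kinetic theory predicts. In the incoherent state the time average of r^2 should scale roughly like 1/N, which is the lowest-order system-size effect captured by truncating the hierarchy at second order.

import numpy as np

def kuramoto_order_fluctuations(N=1000, K=0.5, t_max=200.0, dt=0.01, seed=0):
    rng = np.random.default_rng(seed)
    omega = rng.standard_cauchy(N)             # natural frequencies (Lorentzian spread)
    theta = rng.uniform(0.0, 2.0 * np.pi, N)   # random initial phases
    r_trace = []
    for _ in range(int(t_max / dt)):
        z = np.mean(np.exp(1j * theta))        # complex order parameter
        r_trace.append(abs(z))
        # mean-field form of the Kuramoto coupling, forward Euler step
        theta += dt * (omega + K * abs(z) * np.sin(np.angle(z) - theta))
    r_trace = np.array(r_trace)
    return r_trace.mean(), (r_trace ** 2).mean()

print(kuramoto_order_fluctuations())   # <r^2> is expected to be of order 1/N here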
Affiliation(s)
- Eric J Hildebrand
- Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
52
Arkachar P, Wagh MD. Criticality of lateral inhibition for edge enhancement in neural systems. Neurocomputing 2007. [DOI: 10.1016/j.neucom.2006.03.017]
53
Apfaltrer F, Ly C, Tranchina D. Population density methods for stochastic neurons with realistic synaptic kinetics: firing rate dynamics and fast computational methods. Network 2006; 17:373-418. [PMID: 17162461] [DOI: 10.1080/09548980601069787]
Abstract
An outstanding problem in computational neuroscience is how to use population density function (PDF) methods to model neural networks with realistic synaptic kinetics in a computationally efficient manner. We explore an application of two-dimensional (2-D) PDF methods to simulating electrical activity in networks of excitatory integrate-and-fire neurons. We formulate a pair of coupled partial differential-integral equations describing the evolution of PDFs for neurons in non-refractory and refractory pools. The population firing rate is given by the total flux of probability across the threshold voltage. We use an operator-splitting method to reduce computation time. We report on speed and accuracy of PDF results and compare them to those from direct Monte Carlo simulations. We compute temporal frequency response functions for the transduction from the rate of postsynaptic input to population firing rate, and examine their dependence on background synaptic input rate. The behaviors in the 1-D and 2-D cases (corresponding to instantaneous and non-instantaneous synaptic kinetics, respectively) differ markedly from those for a somewhat different transduction: from injected current input to population firing rate output (Brunel et al. 2001; Fourcaud & Brunel 2002). We extend our method by adding inhibitory input, consider a 3-D to 2-D dimension reduction method, demonstrate its limitations, and suggest directions for future study.
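As a concrete reference point for what the PDF equations approximate, here is a minimal direct Monte Carlo sketch (my own illustration, not the authors' code; the normalized reversal potential, conductance jump size, and other parameters are assumptions) of an excitatory integrate-and-fire population with first-order, non-instantaneous synaptic kinetics driven by Poisson input. The quantity it estimates, the population firing rate, is what the 2-D PDF method obtains from the probability flux across threshold.

import numpy as np

def monte_carlo_rate(n_neurons=5000, rate_in=800.0, t_max=1.0, dt=1e-4, seed=1):
    rng = np.random.default_rng(seed)
    tau_m, tau_s = 20e-3, 5e-3                     # membrane and synaptic time constants (s)
    e_exc, v_th, v_reset = 14.0 / 3.0, 1.0, 0.0    # normalized excitatory reversal, threshold, reset
    a = 0.075                                      # conductance jump per input spike (relative to leak)
    v = np.zeros(n_neurons)                        # membrane potentials (rest = 0)
    g = np.zeros(n_neurons)                        # excitatory conductances (units of leak conductance)
    spikes = 0
    for _ in range(int(t_max / dt)):
        g += a * rng.poisson(rate_in * dt, n_neurons)   # Poisson synaptic arrivals
        g -= dt * g / tau_s                             # first-order (non-instantaneous) decay
        v += dt * (-v - g * (v - e_exc)) / tau_m        # leaky integrate-and-fire membrane equation
        fired = v >= v_th
        spikes += fired.sum()
        v[fired] = v_reset                              # threshold-and-reset
    return spikes / (n_neurons * t_max)                 # population firing rate (spikes/s)

print(monte_carlo_rate())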
Affiliation(s)
- Felix Apfaltrer
- Courant Institute of Mathematical Sciences, New York University, New York, NY 10012, USA
54
Ermentrout B. Gap junctions destroy persistent states in excitatory networks. Phys Rev E Stat Nonlin Soft Matter Phys 2006; 74:031918. [PMID: 17025678] [DOI: 10.1103/physreve.74.031918]
Abstract
Gap junctions between excitatory neurons are shown to disrupt the persistent state. The asynchronous state of the network loses stability via a Hopf bifurcation and then the active state is destroyed via a homoclinic bifurcation with a stationary state. A partial differential equation (PDE) is developed to analyze the Hopf and the homoclinic bifurcations. The simplified dynamics are compared to a biophysical model where similar behavior is observed. In the low noise case, the dynamics of the PDE is shown to be very complicated and includes possible chaotic behavior. The onset of synchrony is studied by the application of averaging to obtain a simple criterion for destabilization of the asynchronous persistent state.
Affiliation(s)
- Bard Ermentrout
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania 15260, USA
55
Tao L, Cai D, McLaughlin DW, Shelley MJ, Shapley R. Orientation selectivity in visual cortex by fluctuation-controlled criticality. Proc Natl Acad Sci U S A 2006; 103:12911-6. [PMID: 16905648] [PMCID: PMC1562545] [DOI: 10.1073/pnas.0605415103]
Abstract
Within a large-scale neuronal network model of macaque primary visual cortex, we examined how intrinsic dynamic fluctuations in synaptic currents modify the effect of strong recurrent excitation on orientation selectivity. Previously, we showed that, using a strong network inhibition countered by feedforward and recurrent excitation, the cortical model reproduced many observed properties of simple and complex cells. However, that network's complex cells were poorly selective for orientation, and increasing cortical self-excitation led to network instabilities and unrealistically high firing rates. Here, we show that a sparsity of connections in the network produces large, intrinsic fluctuations in the cortico-cortical conductances that can stabilize the network and that there is a critical level of fluctuations (controllable by sparsity) that allows strong cortical gain and the emergence of orientation-selective complex cells. The resultant sparse network also shows near contrast invariance in its selectivity and, in agreement with recent experiments, has extracellular tuning properties that are similar in pinwheel center and iso-orientation regions, whereas intracellular conductances show positional dependencies. Varying the strength of synaptic fluctuations by adjusting the sparsity of network connectivity, we identified a transition between dynamics with and without bistability. In a network with strong recurrent excitation, this transition is characterized by nearly hysteretic behavior and a rapid rise of network firing rates as the synaptic drive or stimulus input is increased. We discuss the connection between this transition and orientation selectivity in our model of primary visual cortex.
Affiliation(s)
- Louis Tao
- Department of Mathematical Sciences, New Jersey Institute of Technology, Newark, NJ 07102
- David Cai
- Courant Institute of Mathematical Sciences, New York University, New York, NY 10012
- David W. McLaughlin
- Courant Institute of Mathematical Sciences and Center for Neural Science, New York University, New York, NY 10012
- Michael J. Shelley
- Courant Institute of Mathematical Sciences and Center for Neural Science, New York University, New York, NY 10012
56
Rangan AV, Cai D. Fast numerical methods for simulating large-scale integrate-and-fire neuronal networks. J Comput Neurosci 2006; 22:81-100. [PMID: 16896522] [DOI: 10.1007/s10827-006-8526-7]
Abstract
We discuss numerical methods for simulating large-scale integrate-and-fire (I&F) neuronal networks. Important elements in our numerical methods are (i) a neurophysiologically inspired integrating factor which casts the solution as a numerically tractable integral equation, and allows us to obtain stable and accurate individual neuronal trajectories (i.e., voltage and conductance time-courses) even when the I&F neuronal equations are stiff, such as in strongly fluctuating, high-conductance states; (ii) an iterated process of spike-spike corrections within groups of strongly coupled neurons to account for spike-spike interactions within a single large numerical time-step; and (iii) a clustering procedure of firing events in the network to take advantage of localized architectures, such as spatial scales of strong local interactions, which are often present in large-scale computational models, for example those of the primary visual cortex. (We note that the spike-spike corrections in our methods are more involved than the correction of a single-neuron spike time via polynomial interpolation as in the modified Runge-Kutta methods commonly used in simulations of I&F neuronal networks.) Our methods can evolve networks with relatively strong local interactions in an asymptotically optimal way such that each neuron fires approximately once in [Formula: see text] operations, where N is the number of neurons in the system. We note that quantifications used in computational modeling are often statistical, since measurements in a real experiment to characterize physiological systems are typically statistical, such as firing rate, interspike interval distributions, and spike-triggered voltage distributions. We emphasize that it takes much less computational effort to resolve statistical properties of certain I&F neuronal networks than to fully resolve trajectories of each and every neuron within the system. For networks operating in realistic dynamical regimes, such as strongly fluctuating, high-conductance states, our methods are designed to achieve statistical accuracy when very large time-steps are used. Moreover, our methods can also achieve trajectory-wise accuracy when small time-steps are used.
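The integrating-factor idea can be illustrated with a single exponential update step for a conductance-based integrate-and-fire membrane equation. This is a sketch under assumptions of my own (the function name, normalized reversal potentials, and parameter values are not from the paper, whose scheme also handles time-varying conductances and the spike-spike corrections described above): treating the conductances as frozen over one step, the linear membrane equation dv/dt = -g_tot (v - v_eff) is solved exactly, so the update stays bounded no matter how large g_tot * dt is.

import numpy as np

def exp_step(v, g_leak, g_exc, g_inh, e_leak, e_exc, e_inh, dt):
    """One integrating-factor step with conductances held fixed over [t, t + dt]."""
    g_tot = g_leak + g_exc + g_inh                                      # total conductance
    v_eff = (g_leak * e_leak + g_exc * e_exc + g_inh * e_inh) / g_tot   # effective reversal potential
    # exact solution of the frozen-coefficient linear ODE; stable for any dt
    return v_eff + (v - v_eff) * np.exp(-g_tot * dt)

# usage: even in a stiff, high-conductance state with a coarse time-step,
# v relaxes toward v_eff without the overshoot a forward Euler step would produce
print(exp_step(v=0.2, g_leak=50.0, g_exc=400.0, g_inh=200.0,
               e_leak=0.0, e_exc=14.0 / 3.0, e_inh=-2.0 / 3.0, dt=1e-3))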
Affiliation(s)
- Aaditya V Rangan
- Courant Institute of Mathematical Sciences, New York University, New York, NY 10012, USA.
57
Rangan AV, Cai D. Maximum-entropy closures for kinetic theories of neuronal network dynamics. Phys Rev Lett 2006; 96:178101. [PMID: 16712338] [DOI: 10.1103/physrevlett.96.178101]
Abstract
We analyze (1 + 1)D kinetic equations for neuronal network dynamics, which are derived via an intuitive closure from a Boltzmann-like equation governing the evolution of a one-particle (i.e., one-neuron) probability density function. We demonstrate that this intuitive closure is a generalization of moment closures based on the maximum-entropy principle. By invoking maximum-entropy closures, we show how to systematically extend this kinetic theory to obtain higher-order kinetic equations and to include coupled networks of both excitatory and inhibitory neurons.
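Schematically, a maximum-entropy moment closure works as follows (a paraphrase in my own notation, not the paper's exact equations): given the first n moments of the conditional conductance density, choose the density that maximizes entropy subject to those moment constraints; the resulting exponential-family form then determines the unclosed higher moment and closes the hierarchy.

\[
\begin{aligned}
&\rho^{*} \;=\; \arg\max_{\rho}\,\Big\{-\!\int \rho(g)\,\ln\rho(g)\,dg\Big\}
\quad\text{subject to}\quad
\int g^{k}\rho(g)\,dg \;=\; M_{k},\qquad k = 0,\dots,n,\\[4pt]
&\Longrightarrow\quad
\rho^{*}(g) \;\propto\; \exp\!\Big(\sum_{k=1}^{n}\lambda_{k}\,g^{k}\Big),
\qquad
M_{n+1} \;\approx\; \int g^{\,n+1}\rho^{*}(g)\,dg ,
\end{aligned}
\]

so the higher moment needed by the hierarchy is expressed through the Lagrange multipliers \lambda_k, i.e., as a function of M_1, ..., M_n.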
Affiliation(s)
- Aaditya V Rangan
- Courant Institute of Mathematical Sciences, New York University, New York 10012, USA
58
Sirovich L, Omurtag A, Lubliner K. Dynamics of neural populations: stability and synchrony. Network 2006; 17:3-29. [PMID: 16613792] [DOI: 10.1080/09548980500421154]
Abstract
A population formulation of neuronal activity is employed to study an excitatory network of (spiking) neurons receiving external input as well as recurrent feedback. At relatively low levels of feedback, the network exhibits time-stationary asynchronous behavior. A stability analysis of this time-stationary state leads to an analytical criterion for the critical gain at which asynchronous behavior becomes unstable. At instability the dynamics can undergo a supercritical Hopf bifurcation and the population passes to a synchronous state. Under different conditions it can pass to synchrony through a subcritical Hopf bifurcation. At high gain the network can reach a runaway state in finite time, after which it no longer supports bounded solutions. The introduction of time-delayed feedback leads to a rich range of phenomena. For example, for a given external input, increasing gain produces a transition from asynchrony, to synchrony, to asynchrony, and finally can lead to divergence. Time delay is also shown to strongly mollify the amplitude of synchronous oscillations. Perhaps of general importance is the result that synchronous behavior can exist only for a narrow range of time delays, a range that is an order of magnitude smaller than the period of oscillation.
Affiliation(s)
- Lawrence Sirovich
- Laboratory of Applied Mathematics, Mount Sinai School of Medicine, 1 Gustave L. Levy Place, New York, NY 10029, USA.
59
Câteau H, Reyes AD. Relation between single neuron and population spiking statistics and effects on network activity. Phys Rev Lett 2006; 96:058101. [PMID: 16486995] [DOI: 10.1103/physrevlett.96.058101]
Abstract
To simplify theoretical analyses of neural networks, individual neurons are often modeled as Poisson processes. An implicit assumption is that even if the spiking activity of each neuron is non-Poissonian, the composite activity obtained by summing many spike trains limits to a Poisson process. Here, we show analytically and through simulations that this assumption is invalid. Moreover, we show with Fokker-Planck equations that the behavior of feedforward networks is reproduced accurately only if the tendency of neurons to fire periodically is incorporated by using colored noise whose autocorrelation has a negative component.
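The claim is easy to check numerically. The sketch below (an illustration under assumed parameters, not the authors' simulation) pools many independent gamma-renewal spike trains, which are more regular than Poisson, and computes the Fano factor of the pooled spike counts: it stays well below the Poisson value of 1, near the single-train value of roughly 1/shape for long counting windows.

import numpy as np

def pooled_fano(n_trains=200, rate=10.0, shape=4, t_max=200.0, window=1.0, seed=2):
    rng = np.random.default_rng(seed)
    edges = np.arange(0.0, t_max + window, window)
    counts = np.zeros(len(edges) - 1)
    for _ in range(n_trains):
        # gamma-renewal train with mean rate `rate`; CV^2 of the intervals is 1/shape
        isis = rng.gamma(shape, 1.0 / (shape * rate), size=int(2 * rate * t_max))
        spikes = np.cumsum(isis)
        counts += np.histogram(spikes[spikes < t_max], bins=edges)[0]
    return counts.var() / counts.mean()   # Fano factor of the pooled spike counts

print(pooled_fano())   # well below 1, so the pooled train is not Poisson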
Affiliation(s)
- Hideyuki Câteau
- Center for Neural Science, New York University, 4 Washington Place, New York, New York 10003, USA
60
Batarseh KI. Energy levels of Moiré patterns: relation to human perception. Biol Cybern 2005; 93:248-55. [PMID: 16189673] [DOI: 10.1007/s00422-005-0001-4]
Abstract
Quantitative analyses were undertaken to obtain a priori information regarding the energy levels of the random-dot display or Moiré patterns as a function of the angle of rotation θ by employing classical Newtonian mechanics. The energy profiles for these patterns were found to be similar for 10° < θ < 350°, over which the energies exhibited maxima. For θ ≤ 10° or θ ≥ 350°, the profiles were found to be dramatically different, especially for the focus pattern, where the profile exhibited a downward spike. Specifically, it was found that the minimum energy levels correspond to the angles of rotation at which the patterns are perceived by humans. These results may provide insights into the underlying mechanism responsible for the perception of these patterns and information processing in the brain, specifically in the cerebral cortex.
Affiliation(s)
- Kareem I Batarseh
- Alpha-Omega Biologicals, 8610 Larkview Lane, Fairfax Station, VA 22039, USA.
61
Cai D, Tao L, McLaughlin DW. An embedded network approach for scale-up of fluctuation-driven systems with preservation of spike information. Proc Natl Acad Sci U S A 2004; 101:14288-93. [PMID: 15381777] [PMCID: PMC521148] [DOI: 10.1073/pnas.0404062101]
Abstract
To address computational "scale-up" issues in modeling large regions of the cortex, many coarse-graining procedures have been invoked to obtain effective descriptions of neuronal network dynamics. However, because of local averaging in space and time, these methods do not contain detailed spike information and, thus, cannot be used to investigate, e.g., cortical mechanisms that are encoded through detailed spike-timing statistics. To retain high-order statistical information of spikes, we develop a hybrid theoretical framework that embeds a subnetwork of point neurons within, and fully interacting with, a coarse-grained network of dynamical background. We use a newly developed kinetic theory for the description of the coarse-grained background, in combination with a Poisson spike reconstruction procedure, to ensure that our method applies to the fluctuation-driven regime as well as to the mean-driven regime. This embedded-network approach is verified to be dynamically accurate and numerically efficient. As an example, we use this embedded representation to construct "reverse-time correlations" as spike-triggered averages in a ring model of orientation-tuning dynamics.
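One ingredient that is simple to sketch is the Poisson spike reconstruction step: given a coarse-grained, time-varying firing rate supplied by the kinetic-theory background, individual spike times are drawn to drive the embedded point-neuron subnetwork. The sketch below (my own illustration with a made-up rate profile; the paper's procedure operates inside the full coupled dynamics) generates spikes from an inhomogeneous Poisson process by thinning.

import numpy as np

def reconstruct_spikes(rate_fn, rate_max, t_max, rng):
    """Inhomogeneous Poisson spike times on [0, t_max] by thinning."""
    t, spikes = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)        # candidate time from the bounding rate
        if t >= t_max:
            return np.array(spikes)
        if rng.uniform() < rate_fn(t) / rate_max:   # accept with probability rate(t) / rate_max
            spikes.append(t)

rng = np.random.default_rng(3)
rate = lambda t: 20.0 + 15.0 * np.sin(2.0 * np.pi * 4.0 * t)   # hypothetical background rate (Hz)
print(reconstruct_spikes(rate, rate_max=35.0, t_max=1.0, rng=rng)[:10])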
Affiliation(s)
- David Cai
- Courant Institute of Mathematical Sciences and Center for Neural Science, New York University, New York, NY 10012, USA.