1. Vetter J, Macke JH, Gao R. Generating realistic neurophysiological time series with denoising diffusion probabilistic models. Patterns (N Y) 2024; 5:101047. PMID: 39568643; PMCID: PMC11573898; DOI: 10.1016/j.patter.2024.101047.
Abstract
Denoising diffusion probabilistic models (DDPMs) have recently been shown to accurately generate complicated data such as images, audio, or time series. Experimental and clinical neuroscience also stand to benefit from this progress, as the accurate generation of neurophysiological time series can enable or improve many neuroscientific applications. Here, we present a flexible DDPM-based method for modeling multichannel, densely sampled neurophysiological recordings. DDPMs can generate realistic synthetic data for a variety of datasets from different species and recording techniques. The generated data capture important statistics, such as frequency spectra and phase-amplitude coupling, as well as fine-grained features such as sharp wave ripples. Furthermore, data can be generated based on additional information such as experimental conditions. We demonstrate the flexibility of DDPMs in several applications, including brain-state classification and missing-data imputation. In summary, DDPMs can serve as accurate generative models of neurophysiological recordings and have broad utility in the probabilistic generation of synthetic recordings for neuroscientific applications.
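To make the forward (noising) half of the diffusion framework concrete, here is a minimal sketch, not the authors' model or code: it applies the standard closed-form DDPM noising step to a toy one-dimensional signal. The linear beta schedule, number of steps, and toy signal are illustrative assumptions; the reverse (denoising) network that the paper trains is not shown.
```python
# Minimal sketch of the DDPM forward (noising) process on a toy 1-D signal.
# Not the paper's model; the beta schedule and signal are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_steps = 1000
betas = np.linspace(1e-4, 0.02, n_steps)          # standard linear schedule
alpha_bar = np.cumprod(1.0 - betas)               # cumulative product of (1 - beta_t)

x0 = np.sin(2 * np.pi * 8 * np.linspace(0, 1, 256))   # toy "recording"

def q_sample(x0, t):
    """Draw x_t ~ q(x_t | x_0) in closed form."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

for t in (0, 250, 999):
    xt = q_sample(x0, t)
    print(f"t={t:4d}  corr(x0, xt) = {np.corrcoef(x0, xt)[0, 1]:+.2f}")
```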
Affiliations
- Julius Vetter: Machine Learning in Science, University of Tübingen and Tübingen AI Center, Tübingen, Germany
- Jakob H Macke: Machine Learning in Science, University of Tübingen and Tübingen AI Center, Tübingen, Germany; Max Planck Institute for Intelligent Systems, Tübingen, Germany
- Richard Gao: Machine Learning in Science, University of Tübingen and Tübingen AI Center, Tübingen, Germany
2. Huszár R, Zhang Y, Blockus H, Buzsáki G. Preconfigured dynamics in the hippocampus are guided by embryonic birthdate and rate of neurogenesis. Nat Neurosci 2022; 25:1201-1212. PMID: 35995878; PMCID: PMC10807234; DOI: 10.1038/s41593-022-01138-x.
Abstract
The incorporation of new information into the hippocampal network is likely to be constrained by its innate architecture and internally generated activity patterns. However, the origin, organization and consequences of such patterns remain poorly understood. In the present study we show that hippocampal network dynamics are affected by sequential neurogenesis. We birthdated CA1 pyramidal neurons with in utero electroporation over 4 embryonic days, encompassing the peak of hippocampal neurogenesis, and compared their functional features in freely moving adult mice. Neurons of the same birthdate displayed distinct connectivity, coactivity across brain states and assembly dynamics. Same-birthdate neurons exhibited overlapping spatial representations, which were maintained across different environments. Overall, the wiring and functional features of CA1 pyramidal neurons reflected a combination of birthdate and the rate of neurogenesis. These observations demonstrate that sequential neurogenesis during embryonic development shapes the preconfigured forms of adult network dynamics.
Affiliations
- Roman Huszár: Neuroscience Institute, New York University, New York, NY, USA; Center for Neural Science, New York University, New York, NY, USA
- Yunchang Zhang: Neuroscience Institute, New York University, New York, NY, USA; Center for Neural Science, New York University, New York, NY, USA
- Heike Blockus: Department of Neuroscience, Columbia University, New York, NY, USA; Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY, USA
- György Buzsáki: Neuroscience Institute, New York University, New York, NY, USA; Center for Neural Science, New York University, New York, NY, USA; Department of Neurology, Langone Medical Center, New York, NY, USA
3. Grzesiek A, Gąsior K, Wyłomańska A, Zimroz R. Divergence-Based Segmentation Algorithm for Heavy-Tailed Acoustic Signals with Time-Varying Characteristics. Sensors (Basel) 2021; 21:8487. PMID: 34960579; PMCID: PMC8709018; DOI: 10.3390/s21248487.
Abstract
Many real-world systems change their parameters during operation. Thus, before the data can be analyzed, the raw signal needs to be divided into parts that can be considered homogeneous segments. In this paper, we propose a segmentation procedure that can be applied to signals with time-varying characteristics. Moreover, we assume that the examined signal exhibits impulsive behavior and thus belongs to the so-called heavy-tailed class of distributions. Because of this specific behavior of the data, classical algorithms known from the literature cannot be used directly in the segmentation procedure. In the considered case, the transition between homogeneous segments is smooth and non-linear, which makes the segmentation algorithm more complex than in the classical case. We propose to apply divergence measures based on the distance between the probability density functions of the two examined distributions. The novel segmentation algorithm is applied to real acoustic signals acquired during coffee grinding. The methodology is justified experimentally and with Monte Carlo simulations of data from a model with a heavy-tailed (here, stable) distribution and time-varying parameters. Although the methodology is demonstrated for a specific case, it can be extended to any process with time-varying characteristics.
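As a rough illustration of the divergence idea (not the paper's algorithm), the sketch below compares histograms of adjacent windows of a toy heavy-tailed signal with a symmetrized Kullback-Leibler divergence and takes its maximum as a change-point estimate; the signal, window length, and choice of divergence are illustrative assumptions.
```python
# Simplified sketch: locate a regime change by comparing empirical distributions
# of adjacent windows with a symmetrized KL divergence. Toy data only.
import numpy as np

rng = np.random.default_rng(11)
n = 20000
# Heavy-tailed signal whose scale changes smoothly around the midpoint.
scale = 1.0 + 2.0 / (1.0 + np.exp(-(np.arange(n) - n / 2) / 500))
x = scale * rng.standard_t(df=3, size=n)

def sym_kl(a, b, bins):
    p, _ = np.histogram(a, bins, density=True)
    q, _ = np.histogram(b, bins, density=True)
    p, q = p + 1e-12, q + 1e-12
    w = np.diff(bins)
    return np.sum(w * (p * np.log(p / q) + q * np.log(q / p)))

win, bins = 1000, np.linspace(-15, 15, 61)
div = [sym_kl(x[i - win:i], x[i:i + win], bins) for i in range(win, n - win, 100)]
change = win + 100 * int(np.argmax(div))
print("estimated change point near sample", change)   # expected near n / 2
```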
Affiliations
- Aleksandra Grzesiek: Faculty of Pure and Applied Mathematics, Hugo Steinhaus Center, Wroclaw University of Science and Technology, Wyspianskiego 27, 50-370 Wroclaw, Poland
- Karolina Gąsior: Faculty of Pure and Applied Mathematics, Hugo Steinhaus Center, Wroclaw University of Science and Technology, Wyspianskiego 27, 50-370 Wroclaw, Poland
- Agnieszka Wyłomańska: Faculty of Pure and Applied Mathematics, Hugo Steinhaus Center, Wroclaw University of Science and Technology, Wyspianskiego 27, 50-370 Wroclaw, Poland
- Radosław Zimroz: Faculty of Geoengineering, Mining and Geology, Wroclaw University of Science and Technology, Wyspianskiego 27, 50-370 Wroclaw, Poland
4. A general method to generate artificial spike train populations matching recorded neurons. J Comput Neurosci 2020; 48:47-63. PMID: 31974719; DOI: 10.1007/s10827-020-00741-w.
Abstract
We developed a general method to generate populations of artificial spike trains (ASTs) that match the statistics of recorded neurons. The method is based on computing a Gaussian local rate function of the recorded spike trains, which results in rate templates from which ASTs are drawn as gamma distributed processes with a refractory period. Multiple instances of spike trains can be sampled from the same rate templates. Importantly, we can manipulate rate-covariances between spike trains by performing simple algorithmic transformations on the rate templates, such as filtering or amplifying specific frequency bands, and adding behavior related rate modulations. The method was examined for accuracy and limitations using surrogate data such as sine wave rate templates, and was then verified for recorded spike trains from cerebellum and cerebral cortex. We found that ASTs generated with this method can closely follow the firing rate and local as well as global spike time variance and power spectrum. The method is primarily intended to generate well-controlled spike train populations as inputs for dynamic clamp studies or biophysically realistic multicompartmental models. Such inputs are essential to study detailed properties of synaptic integration with well-controlled input patterns that mimic the in vivo situation while allowing manipulation of input rate covariances at different time scales.
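A minimal sketch of the core idea, assuming a gamma renewal process driven by a rate template via time rescaling; the gamma order, refractory period, and rate template below are illustrative choices, not the published parameters.
```python
# Sketch: draw an artificial spike train from a rate template using a gamma
# renewal process with a refractory period, via time rescaling. Illustrative only.
import numpy as np

def ast_from_rate(rate, dt, gamma_order=4.0, refractory=0.002, rng=None):
    """Draw one artificial spike train (spike times in s) from rate[t] (Hz)."""
    rng = np.random.default_rng() if rng is None else rng
    Lambda = np.cumsum(rate) * dt                 # cumulative intensity ("operational time")
    spikes, t_op = [], 0.0
    while True:
        # Gamma-distributed interval with unit mean in operational time.
        t_op += rng.gamma(shape=gamma_order, scale=1.0 / gamma_order)
        if t_op >= Lambda[-1]:
            break
        t = np.searchsorted(Lambda, t_op) * dt    # map back to real time
        if not spikes or t - spikes[-1] >= refractory:
            spikes.append(t)
    return np.array(spikes)

# Example: a 10 Hz rate template with a slow sinusoidal modulation.
dt = 0.001
t = np.arange(0, 10, dt)
rate = 10 * (1 + 0.5 * np.sin(2 * np.pi * 0.5 * t))
train = ast_from_rate(rate, dt)
print(f"{train.size} spikes, mean rate {train.size / 10:.1f} Hz")
```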
5. Okun M, Steinmetz NA, Lak A, Dervinis M, Harris KD. Distinct Structure of Cortical Population Activity on Fast and Infraslow Timescales. Cereb Cortex 2019; 29:2196-2210. PMID: 30796825; PMCID: PMC6458908; DOI: 10.1093/cercor/bhz023.
Abstract
Cortical activity is organized across multiple spatial and temporal scales. Most research on the dynamics of neuronal spiking is concerned with timescales of 1 ms-1 s, and little is known about spiking dynamics on timescales of tens of seconds and minutes. Here, we used frequency domain analyses to study the structure of individual neurons' spiking activity and its coupling to local population rate and to arousal level across the 0.01-100 Hz frequency range. In mouse medial prefrontal cortex, the spiking dynamics of individual neurons could be quantitatively captured by a combination of interspike interval and firing rate power spectrum distributions. The relative strength of coherence with local population often differed across timescales: a neuron strongly coupled to population rate on fast timescales could be weakly coupled on slow timescales, and vice versa. On slow but not fast timescales, a substantial proportion of neurons showed firing anticorrelated with the population. Infraslow firing rate changes were largely determined by arousal rather than by local factors, which could explain the timescale dependence of individual neurons' population coupling strength. These observations demonstrate how neurons simultaneously partake in fast local dynamics and slow brain-wide dynamics, extending our understanding of infraslow cortical activity beyond the mesoscale resolution of fMRI.
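As a simple companion to the frequency-domain approach described here, the sketch below estimates a firing-rate power spectrum from binned spikes with Welch's method; the bin size, duration, and infraslow modulation of the toy rate are illustrative assumptions.
```python
# Sketch: firing-rate power spectrum of binned spikes via Welch's method.
# Toy Poisson spiking with an infraslow (0.02 Hz) rate modulation; illustrative only.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(14)
dt, T = 0.01, 3600.0                                   # 10 ms bins, 1 hour
t = np.arange(0, T, dt)
rate = 5.0 * (1 + 0.5 * np.sin(2 * np.pi * 0.02 * t))  # infraslow modulation
counts = rng.poisson(rate * dt)

f, pxx = welch(counts / dt, fs=1 / dt, nperseg=2**14)
print("peak (excluding DC) at %.3f Hz" % f[1:][np.argmax(pxx[1:])])
```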
Affiliations
- Michael Okun: Centre for Systems Neuroscience and Department of Neuroscience, Psychology and Behaviour, University of Leicester, Leicester, UK; Institute of Neurology, University College London, London, UK
- Armin Lak: Institute of Neurology, University College London, London, UK
- Martynas Dervinis: Centre for Systems Neuroscience and Department of Neuroscience, Psychology and Behaviour, University of Leicester, Leicester, UK
6.
Abstract
During the past decades, the Ising distribution has attracted interest in many applied disciplines, as the maximum entropy distribution associated with any set of correlated binary ("spin") variables with observed means and covariances. However, numerically speaking, the Ising distribution is impractical, so alternative models are often preferred to handle correlated binary data. One popular alternative, especially in the life sciences, is the Cox distribution (or the closely related dichotomized Gaussian distribution and log-normal Cox point process), where the spins are generated independently conditioned on the drawing of a latent variable with a multivariate normal distribution. This article explores the conditions for a principled replacement of the Ising distribution by a Cox distribution. It shows that the Ising distribution itself can be treated as a latent variable model, and it explores when this latent variable has a quasi-normal distribution. A variational approach to this question reveals a formal link with classic mean-field methods, especially Opper and Winther's adaptive TAP approximation. This link is confirmed by weak coupling (Plefka) expansions of the different approximations and then by numerical tests. Overall, this study suggests that an Ising distribution can be replaced by a Cox distribution in practical applications, precisely when its parameters lie in the "mean-field domain."
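For concreteness, a compact Gibbs sampler for an Ising distribution over binary spins is sketched below; the random couplings and fields are illustrative placeholders, not fitted parameters, and the Cox/dichotomized-Gaussian alternative discussed in the abstract is not shown.
```python
# Sketch: Gibbs sampling from an Ising distribution over spins in {-1, +1}.
# Couplings J and fields h are random illustrative values.
import numpy as np

rng = np.random.default_rng(13)
n = 10
h = rng.normal(0, 0.5, n)                       # fields
J = rng.normal(0, 0.2, (n, n))
J = (J + J.T) / 2                               # symmetric couplings
np.fill_diagonal(J, 0.0)

s = rng.integers(0, 2, n) * 2 - 1               # initial spin configuration
samples = []
for sweep in range(5000):
    for i in range(n):
        field = h[i] + J[i] @ s
        p_up = 1.0 / (1.0 + np.exp(-2.0 * field))   # P(s_i = +1 | rest)
        s[i] = 1 if rng.random() < p_up else -1
    if sweep >= 1000:                            # discard burn-in
        samples.append(s.copy())

samples = np.array(samples)
print("mean magnetizations:", np.round(samples.mean(axis=0), 2))
```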
Affiliations
- Adrien Wohrer: Université Clermont Auvergne, CNRS, SIGMA Clermont, Institut Pascal, F-63000 Clermont-Ferrand, France
7.
Abstract
Time series generated by complex systems like financial markets and the earth's atmosphere often represent superstatistical random walks: on short time scales, the data follow a simple low-level model, but the model parameters are not constant and can fluctuate on longer time scales according to a high-level model. While the low-level model is often dictated by the type of the data, the high-level model, which describes how the parameters change, is unknown in most cases. Here we present a computationally efficient method to infer the time course of the parameter variations from time series with short-range correlations. Importantly, this method evaluates the model evidence to objectively select between competing high-level models. We apply this method to detect anomalous price movements in financial markets, characterize cancer cell invasiveness, identify historical policies relevant for working safety in coal mines, and compare different climate change scenarios to forecast global warming.
Systematic changes in stock market prices or in the migration behaviour of cancer cells may be hidden behind random fluctuations. Here, Mark et al. describe an empirical approach to identify when and how such real-world systems undergo systematic changes.
8. Törő K, Pongrácz R, Bartholy J, Váradi-T A, Marcsa B, Szilágyi B, Lovas A, Dunay G, Sótonyi P. Evaluation of meteorological and epidemiological characteristics of fatal pulmonary embolism. Int J Biometeorol 2016; 60:351-359. PMID: 26178756; DOI: 10.1007/s00484-015-1032-8.
Abstract
The objective of the present study was to identify risk factors for fatal pulmonary embolism among epidemiological factors and meteorological conditions. Information was collected from forensic autopsy records of sudden unexpected death cases in which pulmonary embolism was the exact cause of death between 2001 and 2010 in Budapest, and meteorological parameters were recorded for the same period. Gender, age, manner of death, cause of death, place of death, post-mortem pathomorphological changes and daily meteorological conditions (daily mean temperature and atmospheric pressure) were examined. We found that the number of registered pulmonary embolism cases (n = 467, 211 male) follows a power law in time regardless of the manner of death. We describe, for the first time, that the cumulative number of registered fatal pulmonary embolisms up to the nth day can be expressed as Y(n) = α·n^β, where Y(n) denotes the number of fatal pulmonary embolisms up to the nth day and α > 0 and β > 1 are model parameters. We also found a definite link between cold temperature and an increased incidence of fatal pulmonary embolism; cold temperature and changes in air pressure appear to be predisposing factors. Meteorological parameters may therefore provide additional information about the predisposing factors of thromboembolism.
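A minimal sketch of fitting the reported power law Y(n) = α·n^β by linear regression in log-log coordinates; the synthetic cumulative counts are placeholders, not the study's data.
```python
# Sketch: fit Y(n) = alpha * n**beta to cumulative counts via a log-log regression.
# Synthetic placeholder data only.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(1, 3653)                        # ~10 years of observation days
true_alpha, true_beta = 0.05, 1.2
Y = np.round(true_alpha * days**true_beta * rng.lognormal(0, 0.05, days.size))

mask = Y > 0                                     # log-log regression needs positive counts
beta, log_alpha = np.polyfit(np.log(days[mask]), np.log(Y[mask]), deg=1)
print(f"fitted Y(n) ~ {np.exp(log_alpha):.3f} * n^{beta:.3f}  (true: 0.05 * n^1.2)")
```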
Affiliations
- Klára Törő: Department of Forensic and Insurance Medicine, Semmelweis University, Budapest, Üllői st. 93, 1091, Hungary
- Rita Pongrácz: Department of Meteorology, Eötvös Loránd University, Budapest, Pázmány st. 1/a, 1117, Hungary
- Judit Bartholy: Department of Meteorology, Eötvös Loránd University, Budapest, Pázmány st. 1/a, 1117, Hungary
- Aletta Váradi-T: Department of Forensic and Insurance Medicine, Semmelweis University, Budapest, Üllői st. 93, 1091, Hungary
- Boglárka Marcsa: Department of Forensic and Insurance Medicine, Semmelweis University, Budapest, Üllői st. 93, 1091, Hungary
- Brigitta Szilágyi: Department of Geometry, Budapest University of Technology and Economics, Budapest, Egry József st. 1, 1111, Hungary
- Attila Lovas: Department of Geometry, Budapest University of Technology and Economics, Budapest, Egry József st. 1, 1111, Hungary
- György Dunay: Department of Forensic and Insurance Medicine, Semmelweis University, Budapest, Üllői st. 93, 1091, Hungary
- Péter Sótonyi: Department of Vascular Surgery, Semmelweis University, Budapest, Városmajor st. 68, 1122, Hungary
9. Bi Z, Zhou C. Spike Pattern Structure Influences Synaptic Efficacy Variability under STDP and Synaptic Homeostasis. I: Spike Generating Models on Converging Motifs. Front Comput Neurosci 2016; 10:14. PMID: 26941634; PMCID: PMC4763167; DOI: 10.3389/fncom.2016.00014.
Abstract
In neural systems, synaptic plasticity is usually driven by spike trains. Due to the inherent noise of neurons and synapses as well as the randomness of connection details, spike trains typically exhibit variability such as spatial randomness and temporal stochasticity, resulting in variability of synaptic changes under plasticity, which we call efficacy variability. How the variability of spike trains influences the efficacy variability of synapses remains unclear. In this paper, we try to understand this influence under pairwise additive spike-timing-dependent plasticity (STDP) when the mean strength of plastic synapses into a neuron is bounded (synaptic homeostasis). Specifically, we systematically study, analytically and numerically, how four statistical features of spike patterns, i.e., synchronous firing, burstiness/regularity, heterogeneity of rates and heterogeneity of cross-correlations, as well as their interactions, influence the efficacy variability in converging motifs (simple networks in which one neuron receives from many other neurons). Neurons (including the post-synaptic neuron) in a converging motif generate spikes according to statistical models with tunable parameters. In this way, we can explicitly control the statistics of the spike patterns and investigate their influence on the efficacy variability, without worrying about the feedback from synaptic changes onto the dynamics of the post-synaptic neuron. We separate efficacy variability into two parts: the drift part (DriftV) induced by the heterogeneity of change rates of different synapses, and the diffusion part (DiffV) induced by weight diffusion caused by stochasticity of spike trains. Our main findings are: (1) synchronous firing and burstiness tend to increase DiffV, (2) heterogeneity of rates induces DriftV when potentiation and depression in STDP are not balanced, and (3) heterogeneity of cross-correlations induces DriftV together with heterogeneity of rates. We anticipate that our work will be important for understanding functional processes of neuronal networks (such as memory) and neural development.
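As a reference point for the plasticity rule named here, the sketch below evaluates a generic pairwise additive STDP weight change for two spike trains; the time constant and amplitudes are textbook-style illustrative values, not the paper's parameters.
```python
# Sketch: total weight change under pairwise additive STDP, where every pre/post
# spike pair contributes an exponentially decaying LTP or LTD term. Generic values.
import numpy as np

def stdp_dw(pre, post, a_plus=0.005, a_minus=0.00525, tau=0.020):
    """Total weight change from all pre/post spike-time pairs (times in s)."""
    dt = np.subtract.outer(post, pre)             # t_post - t_pre for every pair
    dw = np.where(dt > 0,  a_plus  * np.exp(-dt / tau), 0.0)   # pre-before-post: LTP
    dw += np.where(dt < 0, -a_minus * np.exp( dt / tau), 0.0)  # post-before-pre: LTD
    return dw.sum()

rng = np.random.default_rng(10)
pre  = np.sort(rng.uniform(0, 10, 100))
post = np.sort(rng.uniform(0, 10, 100))
print("net weight change:", stdp_dw(pre, post))
```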
Affiliations
- Zedong Bi: State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, Chinese Academy of Sciences, Beijing, China; Department of Physics, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Changsong Zhou: Department of Physics, Hong Kong Baptist University, Kowloon Tong, Hong Kong; Centre for Nonlinear Studies, Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems, Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong; Beijing Computational Science Research Center, Beijing, China; Research Centre, HKBU Institute of Research and Continuing Education, Shenzhen, China
10. Yim MY, Kumar A, Aertsen A, Rotter S. Impact of correlated inputs to neurons: modeling observations from in vivo intracellular recordings. J Comput Neurosci 2014; 37:293-304. PMID: 24789376; PMCID: PMC4159600; DOI: 10.1007/s10827-014-0502-z.
Abstract
In vivo recordings in rat somatosensory cortex suggest that excitatory and inhibitory inputs are often correlated during spontaneous and sensory-evoked activity. Using a computational approach, we study how the interplay of input correlations and timing observed in experiments controls the spiking probability of single neurons. Several correlation-based mechanisms are identified, which can effectively switch a neuron on and off. In addition, we investigate the transfer of input correlation to output correlation in pairs of neurons, at the spike train and the membrane potential levels, by considering spike-driving and non-spike-driving inputs separately. In particular, we propose a plausible explanation for the in vivo finding that membrane potentials in neighboring neurons are correlated, but the spike-triggered averages of membrane potentials preceding a spike are not: Neighboring neurons possibly receive an ongoing bombardment of correlated subthreshold background inputs, and occasionally uncorrelated spike-driving inputs.
Affiliations
- Man Yi Yim: Department of Mathematics, University of Hong Kong, Pokfulam Road, Hong Kong
11. Moshitch D, Nelken I. Using Tweedie distributions for fitting spike count data. J Neurosci Methods 2014; 225:13-28. PMID: 24440773; DOI: 10.1016/j.jneumeth.2014.01.004.
Abstract
BACKGROUND: The nature of spike count distributions is of great practical concern for the analysis of neural data. These distributions often have a tendency for 'failures' and a long tail of large counts, and may show a strong dependence of the variance on the mean. Furthermore, spike count distributions often show multiplicative rather than additive effects of covariates. We analyzed the responses of neurons in primary auditory cortex to transposed stimuli as a function of interaural time differences (ITDs). In more than half of the cases, the variance of neuronal responses showed a supralinear dependence on the mean spike count.
NEW METHOD: We explored the use of the Tweedie family of distributions, in which the variance has a supralinear dependence on the mean. To quantify the effects of ITD on neuronal responses, we used generalized linear models (GLMs) and developed methods for significance testing under the Tweedie assumption.
RESULTS: We found the Tweedie distribution to be generally a better fit to the data than the Poisson distribution for over-dispersed responses.
COMPARISON WITH EXISTING METHODS: Standard analysis of variance wrongly assumes Gaussian distributions with fixed variance and additive effects, and even generalized models under Poisson assumptions may be hampered by the over-dispersion of spike counts. The use of GLMs assuming Tweedie distributions increased the reliability of tests of sensitivity to ITD in our data.
CONCLUSIONS: When the spike count variance depends strongly on the mean, the use of Tweedie distributions for analyzing the data is advised.
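A minimal sketch (assuming the statsmodels Tweedie family) of fitting spike counts with a Tweedie GLM; the design matrix, tuning-curve-shaped toy data, and var_power value are illustrative assumptions, and the paper's significance-testing procedure is not reproduced.
```python
# Sketch: Tweedie GLM fit to toy spike counts with an ITD-like covariate.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
itd = np.repeat(np.linspace(-0.5, 0.5, 9), 20)            # ITD covariate (ms), 20 trials each
rate = np.exp(1.0 + 1.5 * np.cos(2 * np.pi * itd))         # tuning-curve-like mean
counts = rng.poisson(rate)                                  # placeholder spike counts

X = sm.add_constant(np.column_stack([np.cos(2 * np.pi * itd),
                                     np.sin(2 * np.pi * itd)]))
# var_power between 1 (Poisson-like) and 2 (gamma-like) captures supralinear
# mean-variance scaling; 1.5 is an arbitrary illustrative choice.
model = sm.GLM(counts, X, family=sm.families.Tweedie(var_power=1.5))
result = model.fit()
print(result.params)
```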
Affiliations
- Dina Moshitch: Department of Neurobiology, Silberman Institute of Life Sciences, Hebrew University, Jerusalem, Israel
- Israel Nelken: Department of Neurobiology, Silberman Institute of Life Sciences, Hebrew University, Jerusalem, Israel; The Edmond and Lily Safra Center for Brain Sciences, Hebrew University, Jerusalem, Israel
12. Trousdale J, Hu Y, Shea-Brown E, Josić K. A generative spike train model with time-structured higher order correlations. Front Comput Neurosci 2013; 7:84. PMID: 23908626; PMCID: PMC3727174; DOI: 10.3389/fncom.2013.00084.
Abstract
Emerging technologies are revealing the spiking activity in ever larger neural ensembles. Frequently, this spiking is far from independent, with correlations in the spike times of different cells. Understanding how such correlations impact the dynamics and function of neural ensembles remains an important open problem. Here we describe a new, generative model for correlated spike trains that can exhibit many of the features observed in data. Extending prior work in mathematical finance, this generalized thinning and shift (GTaS) model creates marginally Poisson spike trains with diverse temporal correlation structures. We give several examples which highlight the model's flexibility and utility. For instance, we use it to examine how a neural network responds to highly structured patterns of inputs. We then show that the GTaS model is analytically tractable, and derive cumulant densities of all orders in terms of model parameters. The GTaS framework can therefore be an important tool in the experimental and theoretical exploration of neural dynamics.
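A rough sketch of the thinning-and-shift idea in its simplest form: spikes of a mother Poisson process are copied to each daughter train with some probability and given a train-specific temporal shift. The probabilities and shifts are illustrative, and the full GTaS construction with higher-order marking is not reproduced.
```python
# Sketch: simplified "thinning and shift" generation of correlated Poisson trains.
import numpy as np

rng = np.random.default_rng(5)
T, mother_rate = 500.0, 20.0                        # duration (s), mother rate (Hz)
p = np.array([0.4, 0.4, 0.2])                       # thinning probability per train
shift = np.array([0.0, 0.005, 0.010])               # per-train temporal shifts (s)

mother = np.sort(rng.uniform(0, T, rng.poisson(mother_rate * T)))
trains = []
for pi, di in zip(p, shift):
    keep = rng.random(mother.size) < pi             # independent thinning
    trains.append(np.sort((mother[keep] + di) % T))

print("rates (Hz):", [round(len(tr) / T, 2) for tr in trains])   # ~ mother_rate * p
```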
Affiliations
- James Trousdale: Department of Mathematics, University of Houston, Houston, TX, USA
13. Spectral analysis of input spike trains by spike-timing-dependent plasticity. PLoS Comput Biol 2012; 8:e1002584. PMID: 22792056; PMCID: PMC3390410; DOI: 10.1371/journal.pcbi.1002584.
Abstract
Spike-timing-dependent plasticity (STDP) has been observed in many brain areas such as sensory cortices, where it is hypothesized to structure synaptic connections between neurons. Previous studies have demonstrated how STDP can capture spiking information at short timescales using specific input configurations, such as coincident spiking, spike patterns and oscillatory spike trains. However, the corresponding computation in the case of arbitrary input signals is still unclear. This paper provides an overarching picture of the algorithm inherent to STDP, tying together many previous results for commonly used models of pairwise STDP. For a single neuron with plastic excitatory synapses, we show how STDP performs a spectral analysis on the temporal cross-correlograms between its afferent spike trains. The postsynaptic responses and STDP learning window determine kernel functions that specify how the neuron "sees" the input correlations. We thus denote this unsupervised learning scheme as 'kernel spectral component analysis' (kSCA). In particular, the whole input correlation structure must be considered since all plastic synapses compete with each other. We find that kSCA is enhanced when weight-dependent STDP induces gradual synaptic competition. For a spiking neuron with a "linear" response and pairwise STDP alone, we find that kSCA resembles principal component analysis (PCA). However, plain STDP does not isolate correlation sources in general, e.g., when they are mixed among the input spike trains. In other words, it does not perform independent component analysis (ICA). Tuning the neuron to a single correlation source can be achieved when STDP is paired with a homeostatic mechanism that reinforces the competition between synaptic inputs. Our results suggest that neuronal networks equipped with STDP can process signals encoded in the transient spiking activity at the timescales of tens of milliseconds for usual STDP.
Tuning feature extraction of sensory stimuli is an important function for synaptic plasticity models. A widely studied example is the development of orientation preference in the primary visual cortex, which can emerge using moving bars in the visual field. A crucial point is the decomposition of stimuli into basic information tokens, e.g., selecting individual bars even though they are presented in overlapping pairs (vertical and horizontal). Among classical unsupervised learning models, independent component analysis (ICA) is capable of isolating basic tokens, whereas principal component analysis (PCA) cannot. This paper focuses on spike-timing-dependent plasticity (STDP), whose functional implications for neural information processing have been intensively studied both theoretically and experimentally in the last decade. Following recent studies demonstrating that STDP can perform ICA for specific cases, we show how STDP relates to PCA or ICA, and in particular explain the conditions under which it switches between them. Here information at the neuronal level is assumed to be encoded in temporal cross-correlograms of spike trains. We find that a linear spiking neuron equipped with pairwise STDP requires additional mechanisms, such as a homeostatic regulation of its output firing, in order to separate mixed correlation sources and thus perform ICA.
14. Reimer ICG, Staude B, Ehm W, Rotter S. Modeling and analyzing higher-order correlations in non-Poissonian spike trains. J Neurosci Methods 2012; 208:18-33. PMID: 22561088; DOI: 10.1016/j.jneumeth.2012.04.015.
Abstract
Measuring pairwise and higher-order spike correlations is crucial for studying their potential impact on neuronal information processing. In order to avoid misinterpretation of results, the tools used for data analysis need to be carefully calibrated with respect to their sensitivity and robustness. This, in turn, requires surrogate data with statistical properties common to experimental spike trains. Here, we present a novel method to generate correlated non-Poissonian spike trains and study the impact of single-neuron spike statistics on the inference of higher-order correlations. Our method to mimic cooperative neuronal spike activity allows the realization of a large variety of renewal processes with controlled higher-order correlation structure. Based on surrogate data obtained by this procedure we investigate the robustness of the recently proposed method empirical de-Poissonization (Ehm et al., 2007). It assumes Poissonian spiking, which is common also for many other estimation techniques. We observe that some degree of deviation from this assumption can generally be tolerated, that the results are more reliable for small analysis bins, and that the degree of misestimation depends on the detailed spike statistics. As a consequence of these findings we finally propose a strategy to assess the reliability of results for experimental data.
Affiliations
- Imke C G Reimer: Bernstein Center Freiburg and Faculty of Biology, Albert-Ludwig University, Freiburg, Germany
15. Lyamzin DR, Garcia-Lazaro JA, Lesica NA. Analysis and modelling of variability and covariability of population spike trains across multiple time scales. Network (Bristol) 2012; 23:76-103. PMID: 22578115; DOI: 10.3109/0954898x.2012.679334.
Abstract
As multi-electrode and imaging technology begin to provide us with simultaneous recordings of large neuronal populations, new methods for modelling such data must also be developed. We present a model of responses to repeated trials of a sensory stimulus based on thresholded Gaussian processes that allows for analysis and modelling of variability and covariability of population spike trains across multiple time scales. The model framework can be used to specify the values of many different variability measures including spike timing precision across trials, coefficient of variation of the interspike interval distribution, and Fano factor of spike counts for individual neurons, as well as signal and noise correlations and correlations of spike counts across multiple neurons. Using both simulated data and data from different stages of the mammalian auditory pathway, we demonstrate the range of possible independent manipulations of different variability measures, and explore how this range depends on the sensory stimulus. The model provides a powerful framework for the study of experimental and surrogate data and for analyzing dependencies between different statistical properties of neuronal populations.
16. Vidne M, Ahmadian Y, Shlens J, Pillow JW, Kulkarni J, Litke AM, Chichilnisky EJ, Simoncelli E, Paninski L. Modeling the impact of common noise inputs on the network activity of retinal ganglion cells. J Comput Neurosci 2011; 33:97-121. PMID: 22203465; DOI: 10.1007/s10827-011-0376-2.
Abstract
Synchronized spontaneous firing among retinal ganglion cells (RGCs), on timescales faster than visual responses, has been reported in many studies. Two candidate mechanisms of synchronized firing include direct coupling and shared noisy inputs. In neighboring parasol cells of primate retina, which exhibit rapid synchronized firing that has been studied extensively, recent experimental work indicates that direct electrical or synaptic coupling is weak, but shared synaptic input in the absence of modulated stimuli is strong. However, previous modeling efforts have not accounted for this aspect of firing in the parasol cell population. Here we develop a new model that incorporates the effects of common noise, and apply it to analyze the light responses and synchronized firing of a large, densely-sampled network of over 250 simultaneously recorded parasol cells. We use a generalized linear model in which the spike rate in each cell is determined by the linear combination of the spatio-temporally filtered visual input, the temporally filtered prior spikes of that cell, and unobserved sources representing common noise. The model accurately captures the statistical structure of the spike trains and the encoding of the visual stimulus, without the direct coupling assumption present in previous modeling work. Finally, we examined the problem of decoding the visual stimulus from the spike train given the estimated parameters. The common-noise model produces Bayesian decoding performance as accurate as that of a model with direct coupling, but with significantly more robustness to spike timing perturbations.
Affiliations
- Michael Vidne: Department of Applied Physics & Applied Mathematics, Center for Theoretical Neuroscience, Columbia University, New York, NY, USA
17. Gerhard F, Haslinger R, Pipa G. Applying the multivariate time-rescaling theorem to neural population models. Neural Comput 2011; 23:1452-1483. PMID: 21395436; DOI: 10.1162/neco_a_00126.
Abstract
Statistical models of neural activity are integral to modern neuroscience. Recently interest has grown in modeling the spiking activity of populations of simultaneously recorded neurons to study the effects of correlations and functional connectivity on neural information processing. However, any statistical model must be validated by an appropriate goodness-of-fit test. Kolmogorov-Smirnov tests based on the time-rescaling theorem have proven to be useful for evaluating point-process-based statistical models of single-neuron spike trains. Here we discuss the extension of the time-rescaling theorem to the multivariate (neural population) case. We show that even in the presence of strong correlations between spike trains, models that neglect couplings between neurons can be erroneously passed by the univariate time-rescaling test. We present the multivariate version of the time-rescaling theorem and provide a practical step-by-step procedure for applying it to testing the sufficiency of neural population models. Using several simple analytically tractable models and more complex simulated and real data sets, we demonstrate that important features of the population activity can be detected only using the multivariate extension of the test.
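A minimal sketch of the univariate time-rescaling check that the multivariate test extends: interspike intervals are rescaled by the model's integrated conditional intensity and compared with a KS test. The constant-rate toy model is an illustrative assumption; the multivariate procedure follows the paper, not this code.
```python
# Sketch: univariate time-rescaling goodness-of-fit check with a KS test.
import numpy as np
from scipy.stats import kstest

def time_rescaling_ks(spike_times, intensity, dt):
    """spike_times in s; intensity[t] is the model conditional intensity (Hz)."""
    Lambda = np.cumsum(intensity) * dt                       # integrated intensity
    L_at_spikes = Lambda[(np.asarray(spike_times) / dt).astype(int)]
    taus = np.diff(L_at_spikes)                              # rescaled intervals
    z = 1.0 - np.exp(-taus)                                  # should be U(0, 1) if the model fits
    return kstest(z, "uniform")

# Example with a constant-rate model and Poisson spikes (model is correct,
# so the test should not reject).
rng = np.random.default_rng(2)
dt, T, rate = 0.001, 200.0, 5.0
spikes = np.sort(rng.uniform(0, T, rng.poisson(rate * T)))
print(time_rescaling_ks(spikes, np.full(int(T / dt), rate), dt))
```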
Affiliations
- Felipe Gerhard: Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland
18. Krumin M, Reutsky I, Shoham S. Correlation-based analysis and generation of multiple spike trains using Hawkes models with an exogenous input. Front Comput Neurosci 2010; 4:147. PMID: 21151360; PMCID: PMC2995522; DOI: 10.3389/fncom.2010.00147.
Abstract
The correlation structure of neural activity is believed to play a major role in the encoding and possibly the decoding of information in neural populations. Recently, several methods were developed for exactly controlling the correlation structure of multi-channel synthetic spike trains (Brette, 2009; Krumin and Shoham, 2009; Macke et al., 2009; Gutnisky and Josic, 2010; Tchumatchenko et al., 2010) and, in a related work, correlation-based analysis of spike trains was used for blind identification of single-neuron models (Krumin et al., 2010), for identifying compact auto-regressive models for multi-channel spike trains, and for facilitating their causal network analysis (Krumin and Shoham, 2010). However, the diversity of correlation structures that can be explained by the feed-forward, non-recurrent, generative models used in these studies is limited. Hence, methods based on such models occasionally fail when analyzing correlation structures that are observed in neural activity. Here, we extend this framework by deriving closed-form expressions for the correlation structure of a more powerful multivariate self- and mutually exciting Hawkes model class that is driven by exogenous non-negative inputs. We demonstrate that the resulting Linear–Non-linear-Hawkes (LNH) framework is capable of capturing the dynamics of spike trains with a generally richer and more biologically relevant multi-correlation structure, and can be used to accurately estimate the Hawkes kernels or the correlation structure of external inputs in both simulated and real spike trains (recorded from visually stimulated mouse retinal ganglion cells). We conclude by discussing the method's limitations and the broader significance of strengthening the links between neural spike train analysis and classical system identification.
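As background for the model class used here, the sketch below simulates a univariate self-exciting Hawkes process with an exponential kernel by Ogata-style thinning; the baseline rate, gain, and decay are illustrative values, and the exogenous-input (LNH) extension from the paper is not implemented.
```python
# Sketch: simulate a self-exciting Hawkes process (exponential kernel) by thinning.
import numpy as np

rng = np.random.default_rng(8)
mu, alpha, beta, T = 2.0, 0.5, 5.0, 100.0      # baseline (Hz), kernel gain, decay (1/s), duration

def intensity(t, events):
    past = events[events < t]
    return mu + alpha * beta * np.exp(-beta * (t - past)).sum()

events, t = np.array([]), 0.0
while t < T:
    lam_bar = intensity(t, events) + alpha * beta      # valid upper bound just after t
    t += rng.exponential(1.0 / lam_bar)
    if t < T and rng.random() < intensity(t, events) / lam_bar:
        events = np.append(events, t)

print(f"{events.size} events; mean rate ~ {events.size / T:.2f} Hz "
      f"(theory: mu / (1 - alpha) = {mu / (1 - alpha):.2f} Hz)")
```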
Affiliations
- Michael Krumin: Faculty of Biomedical Engineering, Technion - Israel Institute of Technology, Haifa, Israel
19. Gilson M, Burkitt A, van Hemmen LJ. STDP in Recurrent Neuronal Networks. Front Comput Neurosci 2010; 4. PMID: 20890448; PMCID: PMC2947928; DOI: 10.3389/fncom.2010.00023.
Abstract
Recent results about spike-timing-dependent plasticity (STDP) in recurrently connected neurons are reviewed, with a focus on the relationship between the weight dynamics and the emergence of network structure. In particular, the evolution of synaptic weights in the two cases of incoming connections for a single neuron and recurrent connections are compared and contrasted. A theoretical framework is used that is based upon Poisson neurons with a temporally inhomogeneous firing rate and the asymptotic distribution of weights generated by the learning dynamics. Different network configurations examined in recent studies are discussed and an overview of the current understanding of STDP in recurrently connected neuronal networks is presented.
20. Staude B, Rotter S, Grün S. CuBIC: cumulant based inference of higher-order correlations in massively parallel spike trains. J Comput Neurosci 2010; 29:327-350. PMID: 19862611; PMCID: PMC2940040; DOI: 10.1007/s10827-009-0195-x.
Abstract
Recent developments in electrophysiological and optical recording techniques enable the simultaneous observation of large numbers of neurons. A meaningful interpretation of the resulting multivariate data, however, presents a serious challenge. In particular, the estimation of higher-order correlations that characterize the cooperative dynamics of groups of neurons is impeded by the combinatorial explosion of the parameter space. The resulting requirements with respect to sample size and recording time have rendered the detection of coordinated neuronal groups exceedingly difficult. Here we describe a novel approach to infer higher-order correlations in massively parallel spike trains that is less susceptible to these problems. Based on the superimposed activity of all recorded neurons, the cumulant-based inference of higher-order correlations (CuBIC) presented here exploits the fact that the absence of higher-order correlations also imposes strong constraints on correlations of lower order. Thus, estimates of only a few lower-order cumulants suffice to infer higher-order correlations in the population. As a consequence, CuBIC is far more compatible with the constraints of in vivo recordings than previous approaches, as shown by a systematic analysis of its parameter dependence.
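A minimal sketch of the quantity CuBIC operates on, not the test itself: the low-order cumulants of the binned population count obtained by superimposing all trains; the bin size and the independent-Poisson placeholder data are illustrative assumptions.
```python
# Sketch: low-order cumulants of the binned, superimposed population spike count.
import numpy as np
from scipy.stats import kstat

rng = np.random.default_rng(6)
T, dt, n_neurons, rate = 100.0, 0.005, 50, 5.0
# Placeholder data: independent Poisson trains (no higher-order correlations).
counts = rng.poisson(rate * dt, size=(n_neurons, int(T / dt)))
population_count = counts.sum(axis=0)

k1, k2, k3 = (kstat(population_count, n) for n in (1, 2, 3))
print(f"cumulants of the population count: k1={k1:.3f}, k2={k2:.3f}, k3={k3:.3f}")
```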
Affiliations
- Benjamin Staude: Unit of Statistical Neuroscience, RIKEN Brain Science Institute, Wako-Shi, Japan; Bernstein Center for Computational Neuroscience Freiburg and Faculty of Biology, Albert-Ludwig University, Hansastr. 9a, 79104 Freiburg, Germany
- Stefan Rotter: Bernstein Center for Computational Neuroscience Freiburg and Faculty of Biology, Albert-Ludwig University, Hansastr. 9a, 79104 Freiburg, Germany
- Sonja Grün: Unit of Statistical Neuroscience, RIKEN Brain Science Institute, Wako-Shi, Japan; Bernstein Center for Computational Neuroscience Berlin, Humboldt-Universität zu Berlin, Berlin, Germany
21. Correlation-based approach to analysis of spiking networks. BMC Neurosci 2010. PMCID: PMC3090890; DOI: 10.1186/1471-2202-11-s1-p182.
22. Maran SK, Jaeger D. Data driven generation of Purkinje cell spike train correlations to study input output relations in deep cerebellar nuclei neurons. BMC Neurosci 2010. PMCID: PMC3090817; DOI: 10.1186/1471-2202-11-s1-p116.
23. Gutnisky DA, Josić K. Generation of Spatiotemporally Correlated Spike Trains and Local Field Potentials Using a Multivariate Autoregressive Process. J Neurophysiol 2010; 103:2912-2930. DOI: 10.1152/jn.00518.2009.
Abstract
Experimental advances allowing for the simultaneous recording of activity at multiple sites have significantly increased our understanding of the spatiotemporal patterns in neural activity. The impact of such patterns on neural coding is a fundamental question in neuroscience. The simulation of spike trains with predetermined activity patterns is therefore an important ingredient in the study of potential neural codes. Such artificially generated spike trains could also be used to manipulate cortical neurons in vitro and in vivo. Here, we propose a method to generate spike trains with given mean firing rates and cross-correlations. To capture this statistical structure we generate a point process by thresholding a stochastic process that is continuous in space and discrete in time. This stochastic process is obtained by filtering Gaussian noise through a multivariate autoregressive (AR) model. The parameters of the AR model are obtained by a nonlinear transformation of the point-process correlations to the continuous-process correlations. The proposed method is very efficient and allows for the simulation of large neural populations. It can be optimized to the structure of spatiotemporal correlations and generalized to nonstationary processes and spatiotemporal patterns of local field potentials and spike trains.
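A minimal sketch of the thresholded-process idea under simplifying assumptions: correlated Gaussian noise is filtered through an AR(1) recursion and each channel is thresholded so the binned spike probability matches a target rate; the nonlinear mapping between count and Gaussian correlations described in the paper is not reproduced.
```python
# Sketch: correlated spike trains by thresholding an AR(1)-filtered Gaussian process.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n_neurons, n_bins, dt = 3, 20000, 0.005
rates = np.array([5.0, 10.0, 20.0])                    # target rates (Hz), illustrative
C = np.array([[1.0, 0.3, 0.1],                          # latent Gaussian correlations
              [0.3, 1.0, 0.3],
              [0.1, 0.3, 1.0]])
L = np.linalg.cholesky(C)
phi = 0.8                                               # AR(1) coefficient (temporal correlation)

x = np.zeros((n_neurons, n_bins))
innov = (L @ rng.standard_normal((n_neurons, n_bins))) * np.sqrt(1 - phi**2)
for t in range(1, n_bins):
    x[:, t] = phi * x[:, t - 1] + innov[:, t]

# Per-neuron thresholds chosen so that P(x > thr) = rate * dt.
thr = norm.ppf(1.0 - rates * dt)
spikes = (x > thr[:, None]).astype(int)
print("empirical rates (Hz):", spikes.mean(axis=1) / dt)
```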
Affiliations
- Diego A. Gutnisky: Department of Neurobiology and Anatomy, University of Texas-Houston Medical School
- Krešimir Josić: Department of Mathematics, University of Houston, Houston, Texas
24. Tchumatchenko T, Geisel T, Volgushev M, Wolf F. Signatures of synchrony in pairwise count correlations. Front Comput Neurosci 2010; 4:1. PMID: 20422044; PMCID: PMC2857958; DOI: 10.3389/neuro.10.001.2010.
Abstract
Concerted neural activity can reflect specific features of sensory stimuli or behavioral tasks. Correlation coefficients and count correlations are frequently used to measure correlations between neurons, design synthetic spike trains and build population models. But are correlation coefficients always a reliable measure of input correlations? Here, we consider a stochastic model for the generation of correlated spike sequences that replicates neuronal pairwise correlations in many important respects. We investigate under which conditions the correlation coefficients reflect the degree of input synchrony and when they can be used to build population models. We find that correlation coefficients can be a poor indicator of input synchrony for some cases of input correlations. In particular, count correlations computed for large time bins can vanish despite the presence of input correlations. These findings suggest that network models and potential coding schemes of neural population activity need to incorporate the temporal properties of correlated inputs and take into consideration the regimes of firing rates and correlation strengths to ensure that their building blocks are unambiguous measures of synchrony.
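To make the count-correlation measure concrete, the sketch below computes spike-count correlation coefficients at several bin sizes for two toy spike trains that share a common Poisson source; the rates, shared fraction, and bin sizes are illustrative assumptions.
```python
# Sketch: spike-count correlation coefficient as a function of counting-window size.
import numpy as np

rng = np.random.default_rng(4)
T, rate, shared = 1000.0, 10.0, 0.3             # duration (s), rate (Hz), shared fraction

common = np.sort(rng.uniform(0, T, rng.poisson(shared * rate * T)))
def train():
    own = rng.uniform(0, T, rng.poisson((1 - shared) * rate * T))
    return np.sort(np.concatenate([common, own]))

s1, s2 = train(), train()
for bin_size in [0.005, 0.05, 0.5, 5.0]:
    edges = np.arange(0, T + bin_size, bin_size)
    c1, _ = np.histogram(s1, edges)
    c2, _ = np.histogram(s2, edges)
    rho = np.corrcoef(c1, c2)[0, 1]
    print(f"bin = {bin_size:6.3f} s  count correlation = {rho:.3f}")
```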
25. Krumin M, Shoham S. Multivariate autoregressive modeling and Granger causality analysis of multiple spike trains. Comput Intell Neurosci 2010; 2010:752428. PMID: 20454705; PMCID: PMC2862319; DOI: 10.1155/2010/752428.
Abstract
Recent years have seen the emergence of microelectrode arrays and optical methods allowing simultaneous recording of spiking activity from populations of neurons in various parts of the nervous system. The analysis of multiple neural spike train data could benefit significantly from existing methods for multivariate time-series analysis which have proven to be very powerful in the modeling and analysis of continuous neural signals like EEG signals. However, those methods have not generally been well adapted to point processes. Here, we use our recent results on correlation distortions in multivariate Linear-Nonlinear-Poisson spiking neuron models to derive generalized Yule-Walker-type equations for fitting "hidden" Multivariate Autoregressive models. We use this new framework to perform Granger causality analysis in order to extract the directed information flow pattern in networks of simulated spiking neurons. We discuss the relative merits and limitations of the new method.
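For orientation, a generic Yule-Walker estimate of a first-order multivariate autoregressive model from binned counts is sketched below (A1 = C(1)·C(0)^-1); this is the standard textbook estimator, not the paper's hidden-MVAR method, and the data are placeholders.
```python
# Sketch: standard Yule-Walker estimation of an MVAR(1) model from binned counts.
import numpy as np

rng = np.random.default_rng(7)
X = rng.poisson(1.0, size=(3, 50000)).astype(float)    # binned counts, 3 "neurons"
X -= X.mean(axis=1, keepdims=True)

C0 = X[:, :-1] @ X[:, :-1].T / (X.shape[1] - 1)         # lag-0 covariance
C1 = X[:, 1:] @ X[:, :-1].T / (X.shape[1] - 1)          # lag-1 cross-covariance
A1 = C1 @ np.linalg.inv(C0)                             # Yule-Walker estimate of the AR(1) matrix
print(np.round(A1, 3))                                   # ~ 0 for this independent toy data
```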
Affiliations
- Michael Krumin: Faculty of Biomedical Engineering, Technion - Israel Institute of Technology, 32000 Haifa, Israel
- Shy Shoham: Faculty of Biomedical Engineering, Technion - Israel Institute of Technology, 32000 Haifa, Israel
26. Onken A, Grünewälder S, Munk MHJ, Obermayer K. Analyzing short-term noise dependencies of spike-counts in macaque prefrontal cortex using copulas and the flashlight transformation. PLoS Comput Biol 2009; 5:e1000577. PMID: 19956759; PMCID: PMC2776173; DOI: 10.1371/journal.pcbi.1000577.
Abstract
Simultaneous spike-counts of neural populations are typically modeled by a Gaussian distribution. On short time scales, however, this distribution is too restrictive to describe and analyze multivariate distributions of discrete spike-counts. We present an alternative that is based on copulas and can account for arbitrary marginal distributions, including Poisson and negative binomial distributions as well as second and higher-order interactions. We describe maximum likelihood-based procedures for fitting copula-based models to spike-count data, and we derive a so-called flashlight transformation which makes it possible to move the tail dependence of an arbitrary copula into an arbitrary orthant of the multivariate probability distribution. Mixtures of copulas that combine different dependence structures and thereby model different driving processes simultaneously are also introduced. First, we apply copula-based models to populations of integrate-and-fire neurons receiving partially correlated input and show that the best fitting copulas provide information about the functional connectivity of coupled neurons which can be extracted using the flashlight transformation. We then apply the new method to data which were recorded from macaque prefrontal cortex using a multi-tetrode array. We find that copula-based distributions with negative binomial marginals provide an appropriate stochastic model for the multivariate spike-count distributions rather than the multivariate Poisson latent variables distribution and the often used multivariate normal distribution. The dependence structure of these distributions provides evidence for common inhibitory input to all recorded stimulus encoding neurons. Finally, we show that copula-based models can be successfully used to evaluate neural codes, e.g., to characterize stimulus-dependent spike-count distributions with information measures. This demonstrates that copula-based models are not only a versatile class of models for multivariate distributions of spike-counts, but that those models can be exploited to understand functional dependencies.
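A small sketch of the generative side of the copula idea: sampling correlated spike counts from a Gaussian copula with negative binomial marginals. The correlation matrix and marginal parameters are illustrative assumptions; the paper's maximum-likelihood fitting and flashlight transformation are not shown.
```python
# Sketch: correlated spike counts from a Gaussian copula with negative binomial marginals.
import numpy as np
from scipy.stats import norm, nbinom

rng = np.random.default_rng(9)
R = np.array([[1.0, 0.4],
              [0.4, 1.0]])                      # copula correlation
nb_params = [(5, 0.4), (3, 0.3)]                # (n, p) of each negative binomial marginal

z = rng.multivariate_normal(np.zeros(2), R, size=10000)
u = norm.cdf(z)                                  # correlated uniform marginals via the copula
counts = np.column_stack([nbinom.ppf(u[:, i], n, p) for i, (n, p) in enumerate(nb_params)])
print("count correlation:", np.corrcoef(counts.T)[0, 1])
```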
Affiliations
- Arno Onken: School of Electrical Engineering and Computer Science, Technische Universität Berlin, Berlin, Germany
27. Krumin M, Shimron A, Shoham S. Correlation-distortion based identification of Linear-Nonlinear-Poisson models. J Comput Neurosci 2009; 29:301-308. PMID: 19757006; DOI: 10.1007/s10827-009-0184-0.
Abstract
Linear-Nonlinear-Poisson (LNP) models are a popular and powerful tool for describing encoding (stimulus-response) transformations by single sensory as well as motor neurons. Recently, there has been rising interest in the second- and higher-order correlation structure of neural spike trains, and how it may be related to specific encoding relationships. The distortion of signal correlations as they are transformed through particular LNP models is predictable and in some cases analytically tractable and invertible. Here, we propose that LNP encoding models can potentially be identified strictly from the correlation transformations they induce, and we develop a computational method for identifying minimum-phase single-neuron temporal kernels under white and colored random Gaussian excitation. Unlike reverse-correlation or maximum-likelihood approaches, correlation-distortion based identification does not require the simultaneous observation of stimulus-response pairs, only their respective second-order statistics. Although in principle filter kernels are not necessarily minimum-phase, and only their spectral amplitude can be uniquely determined from output correlations, we show that in practice this method provides excellent estimates of kernels from a range of parametric models of neural systems. We conclude by discussing how this approach could potentially enable neural models to be estimated from a much wider variety of experimental conditions and systems, and its limitations.
Affiliations
- Michael Krumin: Faculty of Biomedical Engineering, Technion IIT, 32000 Haifa, Israel
- Avner Shimron: Faculty of Biomedical Engineering, Technion IIT, 32000 Haifa, Israel
- Shy Shoham: Faculty of Biomedical Engineering, Technion IIT, 32000 Haifa, Israel