1. Zhou S, Cai H, Chen H, Ye L. A Comparative Study of Causality Detection Methods in Root Cause Diagnosis: From Industrial Processes to Brain Networks. Sensors (Basel) 2024; 24:4908. [PMID: 39123955] [PMCID: PMC11314704] [DOI: 10.3390/s24154908]
Abstract
Abstracting causal knowledge from process measurements has been an appealing topic for decades, especially for fault root cause analysis (RCA) based on signals recorded by multiple sensors in a complex system. Although many causality detection methods have been developed and applied in different fields, some research communities may have an idiosyncratic implementation of their preferred methods, with limited accessibility to the wider community. Targeting interested experimental researchers and engineers, this paper provides a comprehensive comparison of data-based causality detection methods in root cause diagnosis across two distinct domains. We provide a possible taxonomy of those methods, followed by descriptions of the main motivations behind those concepts. Of the two cases we investigated, one is a root cause diagnosis of plant-wide oscillations in an industrial process, while the other is the localization of the epileptogenic focus in a human brain network where the connectivity pattern is transient and even more complex. Considering the differences among causality detection methods, we designed several sets of experiments so that, for each case, a total of 11 methods could be appropriately compared under a unified and reasonable evaluation framework. In each case, these methods were implemented separately and in a standard way to infer causal interactions among multiple variables and thus establish the causal network for RCA. From the cross-domain investigation, several findings are presented along with insights into them, including an interpretative pitfall that warrants caution.
Affiliation(s)
- Sun Zhou
- Department of Automation, Xiamen University, Xiamen 361102, China
- He Cai
- Department of Automation, Xiamen University, Xiamen 361102, China
- Huazhen Chen
- School of Sociology and Anthropology, Xiamen University, Xiamen 361005, China
- Lishan Ye
- Institute of Brain and Cognitive Sciences, Tsinghua University, Beijing 100084, China
2. Zeng Q, Wang J. New fluctuation theorems on Maxwell's demon. Sci Adv 2021; 7(23):eabf1807. [PMID: 34088664] [PMCID: PMC8177699] [DOI: 10.1126/sciadv.abf1807]
Abstract
With increasing interest in the control of systems at the nano- and mesoscopic scales, studies have focused on the limit of energy dissipation in an open system by refining the concept of Maxwell's demon. To uncover the underlying physical principle behind a system controlled by a demon, we prove a previously unexplored set of fluctuation theorems. These fluctuation theorems imply that there exists an intrinsic nonequilibrium state of the system, led by the nonnegative demon-induced dissipative information. A consequence of this analysis is that the bounds on both work and heat are tighter than the limits predicted by the Sagawa-Ueda theorem. We also suggest a possible experimental test of these work and heat bounds.
Affiliation(s)
- Qian Zeng
- State Key Laboratory of Electroanalytical Chemistry, Changchun Institute of Applied Chemistry, Changchun, Jilin 130022, China
- Jin Wang
- Departments of Chemistry and of Physics and Astronomy, State University of New York, Stony Brook, NY 11794-3400, USA
3. A Maximum Entropy Model of Bounded Rational Decision-Making with Prior Beliefs and Market Feedback. Entropy 2021; 23:669. [PMID: 34073330] [PMCID: PMC8227139] [DOI: 10.3390/e23060669]
Abstract
Bounded rationality is an important consideration stemming from the fact that agents often have limits on their processing abilities, making the assumption of perfect rationality inapplicable to many real tasks. We propose an information-theoretic approach to the inference of agent decisions under Smithian competition. The model explicitly captures the boundedness of agents (limited in their information-processing capacity) as the cost of information acquisition for expanding their prior beliefs. The expansion is measured as the Kullback–Leibler divergence between posterior decisions and prior beliefs. When information acquisition is free, the homo economicus agent is recovered, while in cases when information acquisition becomes costly, agents instead revert to their prior beliefs. The maximum entropy principle is used to infer least biased decisions based upon the notion of Smithian competition formalised within the Quantal Response Statistical Equilibrium framework. The incorporation of prior beliefs into such a framework allowed us to systematically explore the effects of prior beliefs on decision-making in the presence of market feedback, and, importantly, added a temporal interpretation to the framework. We verified the proposed model using Australian housing market data, showing how the incorporation of prior knowledge alters the resulting agent decisions. Specifically, it allowed for the separation of past beliefs and the utility maximisation behaviour of the agent, as well as analysis of the evolution of agent beliefs.
4. Molavipour S, Ghourchian H, Bassi G, Skoglund M. Neural Estimator of Information for Time-Series Data with Dependency. Entropy 2021; 23:641. [PMID: 34064014] [PMCID: PMC8224080] [DOI: 10.3390/e23060641]
Abstract
Novel approaches to estimating information measures using neural networks have been well received in recent years in both the information theory and machine learning communities. These neural-based estimators are shown to converge to the true values when estimating mutual information and conditional mutual information using independent samples. However, if the samples in the dataset are not independent, the consistency of these estimators requires further investigation. This is of particular interest for a more complex measure such as the directed information, which is pivotal in characterizing causality and is meaningful over time-dependent variables. The extension of the convergence proof to such cases is not trivial and demands further assumptions on the data. In this paper, we show that our neural estimator for conditional mutual information is consistent when the dataset is generated by a stationary and ergodic source. In other words, we show that our information estimator using neural networks converges asymptotically to the true value with probability one. Besides the universal function approximation property of neural networks, a core lemma in showing convergence is Birkhoff's ergodic theorem. Additionally, we use the technique to estimate directed information and demonstrate the effectiveness of our approach in simulations.
Affiliation(s)
- Sina Molavipour (corresponding author)
- School of Electrical Engineering and Computer Science (EECS), KTH Royal Institute of Technology, 100 44 Stockholm, Sweden
- Hamid Ghourchian
- School of Electrical Engineering and Computer Science (EECS), KTH Royal Institute of Technology, 100 44 Stockholm, Sweden
- Mikael Skoglund
- School of Electrical Engineering and Computer Science (EECS), KTH Royal Institute of Technology, 100 44 Stockholm, Sweden
5. Potts PP, Samuelsson P. Thermodynamic uncertainty relations including measurement and feedback. Phys Rev E 2019; 100:052137. [PMID: 31869995] [DOI: 10.1103/physreve.100.052137]
Abstract
Thermodynamic uncertainty relations quantify how the signal-to-noise ratio of a given observable is constrained by dissipation. Fluctuation relations generalize the second law of thermodynamics to stochastic processes. We show that any fluctuation relation directly implies a thermodynamic uncertainty relation, considerably increasing their range of applicability. In particular, we extend thermodynamic uncertainty relations to scenarios which include measurement and feedback. Since feedback generally breaks time-reversal invariance, the uncertainty relations involve quantities averaged over the forward and the backward experiment defined by the associated fluctuation relation. This implies that the signal-to-noise ratio of a given experiment can in principle become arbitrarily large as long as the corresponding backward experiment compensates, e.g., by being sufficiently noisy. We illustrate our results with the Szilard engine as well as work extraction by free energy reduction in a quantum dot.
Affiliation(s)
- Patrick P Potts
- Physics Department and NanoLund, Lund University, Box 118, 22100 Lund, Sweden
- Peter Samuelsson
- Physics Department and NanoLund, Lund University, Box 118, 22100 Lund, Sweden
6. Auconi A, Giansanti A, Klipp E. Information Thermodynamics for Time Series of Signal-Response Models. Entropy 2019; 21:177. [PMID: 33266893] [PMCID: PMC7514659] [DOI: 10.3390/e21020177]
Abstract
The entropy production in stochastic dynamical systems is linked to the structure of their causal representation in terms of Bayesian networks. Such a connection was formalized for bipartite (or multipartite) systems with an integral fluctuation theorem in [Phys. Rev. Lett. 111, 180603 (2013)]. Here we introduce the information thermodynamics for time series, which are non-bipartite in general, and we show that the link between irreversibility and information can only result from an incomplete causal representation. In particular, we consider a backward transfer entropy lower bound on the conditional time series irreversibility that is induced by the absence of feedback in signal-response models. We study such a relation in a linear signal-response model, providing analytical solutions, and in a nonlinear biological model of receptor-ligand systems, where the time series irreversibility measures the signaling efficiency.
Affiliation(s)
- Andrea Auconi
- Theoretische Biophysik, Humboldt-Universität zu Berlin, Invalidenstraße 42, D-10115 Berlin, Germany
- Andrea Giansanti
- Dipartimento di Fisica, Sapienza Università di Roma, 00185 Rome, Italy
- INFN, Sezione di Roma 1, 00185 Rome, Italy
- Edda Klipp (corresponding author)
- Theoretische Biophysik, Humboldt-Universität zu Berlin, Invalidenstraße 42, D-10115 Berlin, Germany
7. Harding N, Nigmatullin R, Prokopenko M. Thermodynamic efficiency of contagions: a statistical mechanical analysis of the SIS epidemic model. Interface Focus 2018; 8:20180036. [PMID: 30443333] [PMCID: PMC6227806] [DOI: 10.1098/rsfs.2018.0036]
Abstract
We present a novel approach to the study of epidemics on networks as thermodynamic phenomena, quantifying the thermodynamic efficiency of contagions, considered as distributed computational processes. Modelling SIS dynamics on a contact network statistical-mechanically, we follow the maximum entropy (MaxEnt) principle to obtain steady-state distributions and derive, under certain assumptions, relevant thermodynamic quantities both analytically and numerically. In particular, we obtain closed-form solutions for some cases, while interpreting key epidemic variables, such as the reproductive ratio of a SIS model, in a statistical mechanical setting. On the other hand, we consider configuration and free entropy, as well as the Fisher information, in the epidemiological context. This allowed us to identify criticality and distinct phases of epidemic processes. For each of the considered thermodynamic quantities, we compare the analytical solutions informed by the MaxEnt principle with the numerical estimates for SIS epidemics simulated on Watts-Strogatz random graphs.
Affiliation(s)
- Nathan Harding
- Centre for Complex Systems, Faculty of Engineering and IT, University of Sydney, Sydney, New South Wales 2006, Australia
- Ramil Nigmatullin
- Centre for Complex Systems, Faculty of Engineering and IT, University of Sydney, Sydney, New South Wales 2006, Australia
- Mikhail Prokopenko
- Centre for Complex Systems, Faculty of Engineering and IT, University of Sydney, Sydney, New South Wales 2006, Australia
- Marie Bashir Institute for Infectious Diseases and Biosecurity, University of Sydney, Westmead, New South Wales 2145, Australia
8. Kolchinsky A, Wolpert DH. Semantic information, autonomous agency and non-equilibrium statistical physics. Interface Focus 2018; 8:20180041. [PMID: 30443338] [PMCID: PMC6227811] [DOI: 10.1098/rsfs.2018.0041]
Abstract
Shannon information theory provides various measures of so-called syntactic information, which reflect the amount of statistical correlation between systems. By contrast, the concept of 'semantic information' refers to those correlations which carry significance or 'meaning' for a given system. Semantic information plays an important role in many fields, including biology, cognitive science and philosophy, and there has been a long-standing interest in formulating a broadly applicable and formal theory of semantic information. In this paper, we introduce such a theory. We define semantic information as the syntactic information that a physical system has about its environment which is causally necessary for the system to maintain its own existence. 'Causal necessity' is defined in terms of counter-factual interventions which scramble correlations between the system and its environment, while 'maintaining existence' is defined in terms of the system's ability to keep itself in a low entropy state. We also use recent results in non-equilibrium statistical physics to analyse semantic information from a thermodynamic point of view. Our framework is grounded in the intrinsic dynamics of a system coupled to an environment, and is applicable to any physical system, living or otherwise. It leads to formal definitions of several concepts that have been intuitively understood to be related to semantic information, including 'value of information', 'semantic content' and 'agency'.
Affiliation(s)
- David H. Wolpert
- Santa Fe Institute, Santa Fe, NM 87501, USA
- Massachusetts Institute of Technology, Cambridge, MA, USA
- Arizona State University, Tempe, AZ, USA
9. Potts PP, Samuelsson P. Detailed Fluctuation Relation for Arbitrary Measurement and Feedback Schemes. Phys Rev Lett 2018; 121:210603. [PMID: 30517817] [DOI: 10.1103/physrevlett.121.210603]
Abstract
Fluctuation relations are powerful equalities that hold far from equilibrium. However, the standard approach to include measurement and feedback schemes may become inapplicable in certain situations, including continuous measurements, precise measurements of continuous variables, and feedback induced irreversibility. Here we overcome these shortcomings by providing a recipe for producing detailed fluctuation relations. Based on this recipe, we derive a fluctuation relation which holds for arbitrary measurement and feedback control. The key insight is that fluctuations inferable from the measurement outcomes may be suppressed by postselection. Our detailed fluctuation relation results in a stringent and experimentally accessible inequality on the extractable work, which is saturated when the full entropy production is inferable from the data.
Affiliation(s)
- Patrick P Potts
- Physics Department and NanoLund, Lund University, Box 118, 22100 Lund, Sweden
- Peter Samuelsson
- Physics Department and NanoLund, Lund University, Box 118, 22100 Lund, Sweden
10. Nandi M, Biswas A, Banik SK, Chaudhury P. Information processing in a simple one-step cascade. Phys Rev E 2018; 98:042310. [DOI: 10.1103/physreve.98.042310]
11. Ito S. Stochastic Thermodynamic Interpretation of Information Geometry. Phys Rev Lett 2018; 121:030605. [PMID: 30085772] [DOI: 10.1103/physrevlett.121.030605]
Abstract
In recent years, the unified theory of information and thermodynamics has been intensively discussed in the context of stochastic thermodynamics. The unified theory reveals that information theory is useful for understanding the nonstationary dynamics of systems far from equilibrium. In this Letter, we have found a new link between stochastic thermodynamics and a well-known branch of information theory, namely information geometry. By applying this link, an information geometric inequality can be interpreted as a thermodynamic uncertainty relation between speed and thermodynamic cost. We have numerically applied an information geometric inequality to a thermodynamic model of a biochemical enzyme reaction.
Affiliation(s)
- Sosuke Ito
- RIES, Hokkaido University, N20 W10, Kita-ku, Sapporo, Hokkaido 001-0020, Japan
12. Crosato E, Spinney RE, Nigmatullin R, Lizier JT, Prokopenko M. Thermodynamics and computation during collective motion near criticality. Phys Rev E 2018; 97:012120. [PMID: 29448440] [DOI: 10.1103/physreve.97.012120]
Abstract
We study self-organization of collective motion as a thermodynamic phenomenon in the context of the first law of thermodynamics. It is expected that coherent ordered motion typically self-organizes in the presence of changes in the (generalized) internal energy and of (generalized) work done on, or extracted from, the system. We aim to explicitly quantify changes in these two quantities in a system of simulated self-propelled particles and contrast them with changes in the system's configuration entropy. In doing so, we adapt a thermodynamic formulation of the curvatures of the internal energy and the work, with respect to two parameters that control the particles' alignment. This allows us to systematically investigate the behavior of the system by varying the two control parameters to drive the system across a kinetic phase transition. Our results identify critical regimes and show that during the phase transition, where the configuration entropy of the system decreases, the rates of change of the work and of the internal energy also decrease, while their curvatures diverge. Importantly, the reduction of entropy achieved through expenditure of work is shown to peak at criticality. We relate this both to a thermodynamic efficiency and to the significance of the increased order with respect to a computational path. Additionally, this study provides an information-geometric interpretation of the curvature of the internal energy as the difference between two curvatures: the curvature of the free entropy, captured by the Fisher information, and the curvature of the configuration entropy.
Affiliation(s)
- Emanuele Crosato
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering and IT, The University of Sydney, Sydney, NSW 2006, Australia
- Richard E Spinney
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering and IT, The University of Sydney, Sydney, NSW 2006, Australia
- Ramil Nigmatullin
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering and IT, The University of Sydney, Sydney, NSW 2006, Australia
- Joseph T Lizier
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering and IT, The University of Sydney, Sydney, NSW 2006, Australia
- Mikhail Prokopenko
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering and IT, The University of Sydney, Sydney, NSW 2006, Australia
13. Cliff OM, Prokopenko M, Fitch R. Minimising the Kullback-Leibler Divergence for Model Selection in Distributed Nonlinear Systems. Entropy 2018; 20:51. [PMID: 33265171] [PMCID: PMC7512642] [DOI: 10.3390/e20020051]
Abstract
The Kullback-Leibler (KL) divergence is a fundamental measure of information geometry that is used in a variety of contexts in artificial intelligence. We show that, when system dynamics are given by distributed nonlinear systems, this measure can be decomposed as a function of two information-theoretic measures, transfer entropy and stochastic interaction. More specifically, these measures are applicable when selecting a candidate model for a distributed system, where individual subsystems are coupled via latent variables and observed through a filter. We represent this model as a directed acyclic graph (DAG) that characterises the unidirectional coupling between subsystems. Standard approaches to structure learning are not applicable in this framework due to the hidden variables; however, we can exploit the properties of certain dynamical systems to formulate exact methods based on differential topology. We approach the problem by using reconstruction theorems to derive an analytical expression for the KL divergence of a candidate DAG from the observed dataset. Using this result, we present a scoring function based on transfer entropy to be used as a subroutine in a structure learning algorithm. We then demonstrate its use in recovering the structure of coupled Lorenz and Rössler systems.
Affiliation(s)
- Oliver M. Cliff (corresponding author; Tel.: +61-2-9351-3040)
- Australian Centre for Field Robotics, The University of Sydney, Sydney NSW 2006, Australia
- Complex Systems Research Group, The University of Sydney, Sydney NSW 2006, Australia
- Mikhail Prokopenko
- Complex Systems Research Group, The University of Sydney, Sydney NSW 2006, Australia
- Robert Fitch
- Australian Centre for Field Robotics, The University of Sydney, Sydney NSW 2006, Australia
- Centre for Autonomous Systems, University of Technology Sydney, Ultimo NSW 2007, Australia
14. Information Theoretical Study of Cross-Talk Mediated Signal Transduction in MAPK Pathways. Entropy 2017; 19:469. [DOI: 10.3390/e19090469]
15. Spinney RE, Prokopenko M, Lizier JT. Transfer entropy in continuous time, with applications to jump and neural spiking processes. Phys Rev E 2017; 95:032319. [PMID: 28415203] [DOI: 10.1103/physreve.95.032319]
Abstract
Transfer entropy has been used to quantify the directed flow of information between source and target variables in many complex systems. While transfer entropy was originally formulated in discrete time, in this paper we provide a framework for considering transfer entropy in continuous time systems, based on Radon-Nikodym derivatives between measures of complete path realizations. To describe the information dynamics of individual path realizations, we introduce the pathwise transfer entropy, the expectation of which is the transfer entropy accumulated over a finite time interval. We demonstrate that this formalism permits an instantaneous transfer entropy rate. These properties are analogous to the behavior of physical quantities defined along paths such as work and heat. We use this approach to produce an explicit form for the transfer entropy for pure jump processes, and highlight the simplified form in the specific case of point processes (frequently used in neuroscience to model neural spike trains). Finally, we present two synthetic spiking neuron model examples to exhibit the pertinent features of our formalism, namely, that the information flow for point processes consists of discontinuous jump contributions (at spikes in the target) interrupting a continuously varying contribution (relating to waiting times between target spikes). Numerical schemes based on our formalism promise significant benefits over existing strategies based on discrete time formalisms.
Affiliation(s)
- Richard E Spinney
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering & IT, The University of Sydney, NSW 2006, Australia
- Mikhail Prokopenko
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering & IT, The University of Sydney, NSW 2006, Australia
- Joseph T Lizier
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering & IT, The University of Sydney, NSW 2006, Australia
16. Levakova M, Tamborrino M, Kostal L, Lansky P. Accuracy of rate coding: When shorter time window and higher spontaneous activity help. Phys Rev E 2017; 95:022310. [PMID: 28297875] [DOI: 10.1103/physreve.95.022310]
Abstract
It is widely accepted that neuronal firing rates contain a significant amount of information about the stimulus intensity. Nevertheless, theoretical studies on the coding accuracy inferred from exact spike counting distributions are rare. We present an analysis based on the number of observed spikes, assuming the stochastic perfect integrate-and-fire model with a change point representing the stimulus onset, for which we calculate the corresponding Fisher information to investigate the accuracy of rate coding. We analyze the effect of changing the duration of the time window and the influence of several parameters of the model, in particular the level of presynaptic spontaneous activity and the level of random fluctuation of the membrane potential, which can be interpreted as noise of the system. The results show that the Fisher information is nonmonotonic with respect to the length of the observation period. This counterintuitive result is caused by the discrete nature of the spike count. We also observe that the signal can be enhanced by noise, since the Fisher information is nonmonotonic with respect to the level of spontaneous activity and, in some cases, also with respect to the level of fluctuation of the membrane potential.
Affiliation(s)
- Marie Levakova
- Department of Computational Neuroscience, Institute of Physiology of the Czech Academy of Sciences, Videnska 1083, 14220 Prague 4, Czech Republic
- Massimiliano Tamborrino
- Institute for Stochastics, Johannes Kepler University Linz, Altenbergerstraße 69, 4040 Linz, Austria
- Lubomir Kostal
- Department of Computational Neuroscience, Institute of Physiology of the Czech Academy of Sciences, Videnska 1083, 14220 Prague 4, Czech Republic
- Petr Lansky
- Department of Computational Neuroscience, Institute of Physiology of the Czech Academy of Sciences, Videnska 1083, 14220 Prague 4, Czech Republic