1
Marvan T, Phillips WA. Cellular mechanisms of cooperative context-sensitive predictive inference. Curr Res Neurobiol 2024;6:100129. PMID: 38665363; PMCID: PMC11043869; DOI: 10.1016/j.crneur.2024.100129.
Abstract
We argue that prediction success maximization is a basic objective of cognition and cortex, that it is compatible with but distinct from prediction error minimization, that neither objective requires subtractive coding, that there is clear neurobiological evidence for the amplification of predicted signals, and that we are unconvinced by the evidence proposed in support of subtractive coding. We outline recent discoveries showing that the pyramidal cells on which our cognitive capabilities depend usually transmit information about input to their basal dendrites and amplify that transmission when input to their distal apical dendrites provides a context that agrees with the feedforward basal input, in that both are depolarizing, i.e., both are excitatory rather than inhibitory. Though these intracellular discoveries require a level of technical expertise that is beyond the current abilities of most neuroscience labs, they are not controversial and are acclaimed as groundbreaking. We note that this cellular cooperative context-sensitivity greatly enhances the cognitive capabilities of the mammalian neocortex, and that much remains to be discovered concerning its evolution, development, and pathology.
Affiliation(s)
- Tomáš Marvan
- Institute of Philosophy, Czech Academy of Sciences (CAS), Czech Republic
2
Livi L. On Multiscaling of Parkinsonian Rest Tremor Signals and Their Classification. Adv Neurobiol 2024;36:571-583. PMID: 38468054; DOI: 10.1007/978-3-031-47606-8_30.
Abstract
Self-similar stochastic processes and broad probability distributions are ubiquitous in nature and in many man-made systems. The brain is a particularly interesting example of a (natural) complex system where those features play a pivotal role. In fact, the controversial yet experimentally validated "criticality hypothesis" for the functioning of the brain implies the presence of scaling laws for correlations. Recently, we analyzed a collection of rest tremor velocity signals recorded from patients affected by Parkinson's disease, with the aim of determining, and hence exploiting, the presence of scaling laws. Our results show that multiple scaling laws are required to describe the dynamics of such signals, underscoring the complexity of the underlying generating mechanism. We then extracted numeric features using the multifractal detrended fluctuation analysis procedure, and found that such features can be effective for discriminating between classes of signals recorded in different experimental conditions. Notably, we show that the use of medication (L-DOPA) can be recognized with high accuracy.
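The multifractal detrended fluctuation analysis (MF-DFA) step mentioned in this abstract can be sketched in a few lines. The function below is our own minimal order-1 illustration (names and parameter choices are ours, not the study's feature-extraction pipeline): it estimates the generalized Hurst exponents h(q) whose spread across q signals multifractality.

```python
import numpy as np

def mfdfa_hurst(x, scales, qs):
    """Generalized Hurst exponents h(q) via MF-DFA with linear detrending."""
    profile = np.cumsum(x - np.mean(x))          # step 1: integrate the signal
    msq_per_scale = []
    for s in scales:
        n_seg = len(profile) // s
        t = np.arange(s)
        msq = np.empty(n_seg)
        for v in range(n_seg):
            # step 2: squared RMS of residuals around a local linear trend
            seg = profile[v * s:(v + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            msq[v] = np.mean((seg - trend) ** 2)
        msq_per_scale.append(msq)
    hq = []
    for q in qs:
        # step 3: q-th order fluctuation function F_q(s)
        log_fq = [0.5 * np.mean(np.log(m)) if q == 0
                  else np.log(np.mean(m ** (q / 2)) ** (1 / q))
                  for m in msq_per_scale]
        # step 4: h(q) is the slope of log F_q(s) versus log s
        hq.append(np.polyfit(np.log(scales), log_fq, 1)[0])
    return np.array(hq)
```

For monofractal white noise, h(q) stays near 0.5 for every q; a q-dependent h(q) is the kind of multifractal signature the study turns into classification features.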
Affiliation(s)
- Lorenzo Livi
- Department of Computer Science, University of Manitoba, Winnipeg, MB, Canada.
3
Wollstadt P, Rathbun DL, Usrey WM, Bastos AM, Lindner M, Priesemann V, Wibral M. Information-theoretic analyses of neural data to minimize the effect of researchers' assumptions in predictive coding studies. PLoS Comput Biol 2023;19:e1011567. PMID: 37976328; PMCID: PMC10703417; DOI: 10.1371/journal.pcbi.1011567.
Abstract
Studies investigating neural information processing often implicitly ask two questions: which processing strategy, out of several alternatives, is used, and how that strategy is implemented in neural dynamics. A prime example is the study of predictive coding. Such studies often ask whether confirmed predictions about inputs, or prediction errors between internal predictions and inputs, are passed on in a hierarchical neural system, while at the same time looking for the neural correlates of coding for errors and predictions. If we do not know exactly what a neural system predicts at any given moment, this results in a circular analysis, as has rightly been criticized. To circumvent such circular analysis, we propose to express information processing strategies (such as predictive coding) in terms of local information-theoretic quantities, such that they can be estimated directly from neural data. We demonstrate our approach by investigating two opposing accounts of predictive-coding-like processing strategies, quantifying the building blocks of predictive coding, namely predictability of inputs and transfer of information, by local active information storage and local transfer entropy. We define testable hypotheses on the relationship between both quantities, allowing us to identify which of the assumed strategies was used. We demonstrate our approach on spiking data collected from the retinogeniculate synapse of the cat (N = 16). Applying our local information dynamics framework, we show that the synapse codes for predictable rather than surprising input. To support our findings, we estimate quantities from the partial information decomposition framework, which allow us to differentiate whether the transferred information is primarily bottom-up sensory input or information transferred conditionally on the current state of the synapse. Supporting our local information-theoretic results, we find that the synapse preferentially transfers bottom-up information.
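The two local quantities named in this abstract can be written down directly for discrete (e.g. binned spiking) data. The sketch below is our own minimal plug-in version with history length k, for illustration only; the study's actual estimators use optimized embeddings and bias correction that this omits.

```python
import numpy as np
from collections import Counter

def local_ais(x, k=1):
    """Local active information storage: log2 p(x_t, past) / (p(x_t) p(past))."""
    joint, past_c, cur_c = Counter(), Counter(), Counter()
    samples = [(x[t], tuple(x[t - k:t])) for t in range(k, len(x))]
    for s in samples:
        joint[s] += 1; past_c[s[1]] += 1; cur_c[s[0]] += 1
    n = len(samples)
    return np.array([np.log2(joint[s] * n / (past_c[s[1]] * cur_c[s[0]]))
                     for s in samples])

def local_te(src, tgt, k=1):
    """Local transfer entropy src -> tgt: log2 p(y_t|y_past,x_past) / p(y_t|y_past)."""
    joint, cond, tj, tp = Counter(), Counter(), Counter(), Counter()
    samples = [(tgt[t], tuple(tgt[t - k:t]), tuple(src[t - k:t]))
               for t in range(k, len(tgt))]
    for y, yp, xp in samples:
        joint[(y, yp, xp)] += 1; cond[(yp, xp)] += 1
        tj[(y, yp)] += 1; tp[yp] += 1
    return np.array([np.log2((joint[(y, yp, xp)] / cond[(yp, xp)])
                             / (tj[(y, yp)] / tp[yp]))
                     for y, yp, xp in samples])
```

Averaging the local values recovers the ordinary (average) storage and transfer; the sign of each local value distinguishes informative (predictable) from misinformative (surprising) events, which is exactly the distinction the proposed framework exploits.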
Affiliation(s)
- Patricia Wollstadt
- MEG Unit, Brain Imaging Center, Goethe University, Frankfurt/Main, Germany
- Daniel L. Rathbun
- Center for Neuroscience, University of California, Davis, California, United States of America
- Center for Ophthalmology, University of Tübingen, Tübingen, Germany
- W. Martin Usrey
- Center for Neuroscience, University of California, Davis, California, United States of America
- Department of Neurobiology, Physiology, and Behavior, University of California, Davis, California, United States of America
- André Moraes Bastos
- Department of Psychology and Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee, United States of America
- Michael Lindner
- Campus Institute for Dynamics of Biological Networks, University of Göttingen, Göttingen, Germany
- Viola Priesemann
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Michael Wibral
- Campus Institute for Dynamics of Biological Networks, University of Göttingen, Göttingen, Germany
4
Barà C, Sparacino L, Pernice R, Antonacci Y, Porta A, Kugiumtzis D, Faes L. Comparison of discretization strategies for the model-free information-theoretic assessment of short-term physiological interactions. Chaos 2023;33:033127. PMID: 37003789; DOI: 10.1063/5.0140641.
Abstract
This work presents a comparison between different approaches for the model-free estimation of information-theoretic measures of the dynamic coupling between short realizations of random processes. The measures considered are the mutual information rate (MIR) between two random processes X and Y and the terms of its decomposition, evidencing either the individual entropy rates of X and Y and their joint entropy rate, or the transfer entropies from X to Y and from Y to X and the instantaneous information shared by X and Y. All measures are estimated through discretization of the random variables forming the processes, performed either via uniform quantization (binning approach) or rank ordering (permutation approach). The binning and permutation approaches are compared on simulations of two coupled non-identical Hénon systems and on three datasets, comprising short realizations of cardiorespiratory (CR, heart period and respiration flow), cardiovascular (CV, heart period and systolic arterial pressure), and cerebrovascular (CB, mean arterial pressure and cerebral blood flow velocity) time series measured in different physiological conditions, i.e., spontaneous vs paced breathing or supine vs upright positions. Our results show that, with careful selection of the estimation parameters (i.e., the embedding dimension and, for the binning approach, the number of quantization levels), meaningful patterns of the MIR and of its components can be obtained in the analyzed systems. On the physiological time series, we found that paced breathing at slow breathing rates induces less complex and more coupled CR dynamics, while postural stress leads to an unbalancing of CV interactions with prevalent baroreflex coupling and to less complex pressure dynamics with preserved CB interactions. These results are better highlighted by the permutation approach, thanks to its more parsimonious representation of the discretized dynamic patterns, which allows one to explore interactions with longer memory while limiting the curse of dimensionality.
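The two discretization strategies being compared can be illustrated in a few lines. The sketch below is ours, with illustrative parameter choices: uniform quantization maps amplitudes to bins, while rank ordering maps short windows to ordinal (permutation) patterns.

```python
import numpy as np
from itertools import permutations

def binning(x, n_bins):
    """Uniform quantization of amplitudes into n_bins levels."""
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    return np.digitize(x, edges[1:-1])           # symbols 0 .. n_bins-1

def ordinal(x, m):
    """Rank-order (permutation) encoding with embedding dimension m."""
    pats = {p: i for i, p in enumerate(permutations(range(m)))}
    return np.array([pats[tuple(np.argsort(x[t:t + m]))]
                     for t in range(len(x) - m + 1)])

def entropy(symbols):
    """Plug-in Shannon entropy (bits) of a symbol sequence."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()
```

With m = 3 there are only 3! = 6 ordinal symbols regardless of the signal's amplitude range; this parsimony is what lets the permutation approach reach longer memory before the curse of dimensionality sets in.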
Affiliation(s)
- Chiara Barà
- Department of Engineering, University of Palermo, 90128 Palermo, Italy
- Laura Sparacino
- Department of Engineering, University of Palermo, 90128 Palermo, Italy
- Riccardo Pernice
- Department of Engineering, University of Palermo, 90128 Palermo, Italy
- Yuri Antonacci
- Department of Engineering, University of Palermo, 90128 Palermo, Italy
- Alberto Porta
- Department of Biomedical Sciences for Health, University of Milan, 20133 Milan, Italy
- Dimitris Kugiumtzis
- Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, 54124 Thessaloniki, Greece
- Luca Faes
- Department of Engineering, University of Palermo, 90128 Palermo, Italy
5
Pinzuti E, Wollstadt P, Tüscher O, Wibral M. Information theoretic evidence for layer- and frequency-specific changes in cortical information processing under anesthesia. PLoS Comput Biol 2023;19:e1010380. PMID: 36701388; PMCID: PMC9904504; DOI: 10.1371/journal.pcbi.1010380.
Abstract
Nature relies on highly distributed computation for the processing of information in nervous systems across the entire animal kingdom. Such distributed computation can be more easily understood if decomposed into the three elementary components of information processing, i.e. storage, transfer and modification, and rigorous information theoretic measures for these components exist. However, distributed computation is often also linked to neural dynamics exhibiting distinct rhythms. Thus, it would be beneficial to associate the above components of information processing with distinct rhythmic processes where possible. Here we focus on the storage of information in neural dynamics and introduce a novel spectrally resolved measure of active information storage (AIS). Drawing on intracortical recordings of neural activity in ferrets under anesthesia, before and after loss of consciousness (LOC), we show that anesthesia-related modulation of AIS is highly specific to different frequency bands and that these frequency-specific effects differ across cortical layers and brain regions. We found that in the high/low gamma band the effects of anesthesia result in AIS modulation only in the supragranular layers, while in the alpha/beta band the strongest decrease in AIS is seen in the infragranular layers. Finally, we show that the increase of spectral power at multiple frequencies, in particular in the alpha and delta bands in frontal areas, that is often observed during LOC ('anteriorization') also impacts local information processing, but in a frequency-specific way: increases in isoflurane concentration induced a decrease in AIS at alpha frequencies, while they increased AIS in the delta frequency range (< 2 Hz). Thus, the analysis of spectrally resolved AIS provides valuable additional insights into changes in cortical information processing under anesthesia.
Affiliation(s)
- Edoardo Pinzuti
- Leibniz Institute for Resilience Research (LIR), Mainz, Germany
- MEG Unit, Brain Imaging Center, Goethe University, Frankfurt/Main, Germany
- Patricia Wollstadt
- MEG Unit, Brain Imaging Center, Goethe University, Frankfurt/Main, Germany
- Oliver Tüscher
- Leibniz Institute for Resilience Research (LIR), Mainz, Germany
- Department of Psychiatry and Psychotherapy, Johannes Gutenberg University of Mainz, Mainz, Germany
- Institute of Molecular Biology (IMB), Mainz, Germany
- Michael Wibral
- Campus Institute for Dynamics of Biological Networks, Georg August University, Göttingen, Germany
6
Kim SH, Woo J, Choi K, Choi M, Han K. Neural Information Processing and Computations of Two-Input Synapses. Neural Comput 2022;34:2102-2131. PMID: 36027799; DOI: 10.1162/neco_a_01534.
Abstract
Information processing in artificial neural networks is largely dependent on the nature of the neuron models used. While commonly used models are designed for linear integration of synaptic inputs, accumulating experimental evidence suggests that biological neurons are capable of nonlinear computations over many converging synaptic inputs via homo- and heterosynaptic mechanisms. This nonlinear neuronal computation may play an important role in complex information processing at the neural circuit level. Here we characterize the dynamics and coding properties of neuron models receiving synaptic transmissions from two hidden states. The neuronal information processing is influenced by the cooperative and competitive interactions among synapses and by the coherence of the hidden states. Furthermore, we demonstrate that neuronal information processing under two-input synaptic transmission can be mapped to the linearly non-separable XOR operation as well as to the basic AND/OR operations. In particular, mixtures of linear and nonlinear neuron models outperform networks consisting of only one type on the Fashion-MNIST test. This study provides a computational framework for assessing the information processing of neuron and synapse models that may be beneficial for the design of brain-inspired artificial intelligence algorithms and neuromorphic systems.
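The claim that a unit with a multiplicative (heterosynaptic-style) interaction term can realize the linearly non-separable XOR, while a linear-threshold unit cannot, is easy to verify directly. The toy units below are ours for illustration, not the paper's neuron models; the grid search is a demonstration rather than a proof of the classical non-separability result.

```python
import numpy as np
from itertools import product

def linear_unit(x1, x2, w1, w2, b):
    """Linear integration of two inputs followed by a threshold."""
    return int(w1 * x1 + w2 * x2 + b > 0)

def multiplicative_unit(x1, x2):
    """One multiplicative interaction term: x1 + x2 - 2*x1*x2 equals XOR."""
    return int(x1 + x2 - 2 * x1 * x2 > 0.5)

cases = [(0, 0), (0, 1), (1, 0), (1, 1)]
# the nonlinear unit reproduces the XOR truth table exactly
assert all(multiplicative_unit(a, b) == (a ^ b) for a, b in cases)

# no linear threshold unit on a dense weight grid reproduces XOR
grid = np.linspace(-2, 2, 17)
assert not any(
    all(linear_unit(a, b, w1, w2, b0) == (a ^ b) for a, b in cases)
    for w1, w2, b0 in product(grid, grid, grid))
```

This is the single-unit analogue of the paper's point: adding one nonlinear synaptic interaction changes the class of computable input-output mappings.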
Affiliation(s)
- Soon Ho Kim
- Laboratory of Computational Neurophysics, Convergence Research Center for Brain Science, Brain Science Institute, Korea Institute of Science and Technology, Seoul 02792, South Korea
- Junhyuk Woo
- Laboratory of Computational Neurophysics, Convergence Research Center for Brain Science, Brain Science Institute, Korea Institute of Science and Technology, Seoul 02792, South Korea
- Kiri Choi
- School of Computational Sciences, Korea Institute for Advanced Study, Seoul 02455, South Korea
- MooYoung Choi
- Department of Physics and Astronomy and Center for Theoretical Physics, Seoul National University, Seoul 08826, South Korea
- Kyungreem Han
- Laboratory of Computational Neurophysics, Convergence Research Center for Brain Science, Brain Science Institute, Korea Institute of Science and Technology, Seoul 02792, South Korea
7
Kay JW, Schulz JM, Phillips WA. A Comparison of Partial Information Decompositions Using Data from Real and Simulated Layer 5b Pyramidal Cells. Entropy 2022;24(8):1021. PMID: 35893001; PMCID: PMC9394329; DOI: 10.3390/e24081021.
Abstract
Partial information decomposition allows the joint mutual information between an output and a set of inputs to be divided into components that are synergistic or shared or unique to each input. We consider five different decompositions and compare their results using data from layer 5b pyramidal cells in two different studies. The first study was on the amplification of somatic action potential output by apical dendritic input and its regulation by dendritic inhibition. We find that two of the decompositions produce much larger estimates of synergy and shared information than the others, as well as large levels of unique misinformation. When within-neuron differences in the components are examined, the five methods produce more similar results for all but the shared information component, for which two methods produce a different statistical conclusion from the others. There are some differences in the expression of unique information asymmetry among the methods. It is significantly larger, on average, under dendritic inhibition. Three of the methods support a previous conclusion that apical amplification is reduced by dendritic inhibition. The second study used a detailed compartmental model to produce action potentials for many combinations of the numbers of basal and apical synaptic inputs. Decompositions of the entire data set produce similar differences to those in the first study. Two analyses of decompositions are conducted on subsets of the data. In the first, the decompositions reveal a bifurcation in unique information asymmetry. For three of the methods, this suggests that apical drive switches to basal drive as the strength of the basal input increases, while the other two show changing mixtures of information and misinformation. Decompositions produced using the second set of subsets show that all five decompositions provide support, to varying extents, for properties of cooperative context-sensitivity.
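For readers wanting to experiment with a concrete two-source decomposition, the original Williams-Beer construction (redundancy as the minimum of specific information) fits in a short function. This is our own illustrative implementation, not the code used in the study, and it is not necessarily one of the five decompositions compared there; those differ precisely in how the redundancy term below is defined.

```python
import numpy as np
from collections import defaultdict

def pid_imin(p):
    """Two-source PID with Williams-Beer redundancy.
    p maps (x1, x2, t) -> probability."""
    pt = defaultdict(float)
    px = [defaultdict(float), defaultdict(float)]
    pxt = [defaultdict(float), defaultdict(float)]
    pxx = defaultdict(float)
    for (x1, x2, t), pr in p.items():
        pt[t] += pr
        px[0][x1] += pr; px[1][x2] += pr
        pxt[0][(x1, t)] += pr; pxt[1][(x2, t)] += pr
        pxx[(x1, x2)] += pr
    def mi(i):                                   # I(T ; X_i)
        return sum(pr * np.log2(pr / (px[i][x] * pt[t]))
                   for (x, t), pr in pxt[i].items() if pr > 0)
    def i_spec(i, t):                            # specific information I(T=t ; X_i)
        return sum((pxt[i][(x, t)] / pt[t])
                   * np.log2(pxt[i][(x, t)] / (px[i][x] * pt[t]))
                   for x in list(px[i]) if pxt[i][(x, t)] > 0)
    red = sum(pt[t] * min(i_spec(0, t), i_spec(1, t)) for t in list(pt))
    mi_joint = sum(pr * np.log2(pr / (pxx[(x1, x2)] * pt[t]))
                   for (x1, x2, t), pr in p.items() if pr > 0)
    return {'red': red,
            'unq1': mi(0) - red, 'unq2': mi(1) - red,
            'syn': mi_joint - mi(0) - mi(1) + red}
```

On XOR the decomposition is purely synergistic (1 bit of synergy, zero redundancy); on AND it splits into roughly 0.311 bits of redundancy and 0.5 bits of synergy, the standard reference values for this redundancy measure.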
Affiliation(s)
- Jim W. Kay
- School of Mathematics and Statistics, University of Glasgow, Glasgow G12 8QQ, UK
- Jan M. Schulz
- Department of Biomedicine, University of Basel, 4001 Basel, Switzerland
8
Combrisson E, Allegra M, Basanisi R, Ince RAA, Giordano B, Bastin J, Brovelli A. Group-level inference of information-based measures for the analyses of cognitive brain networks from neurophysiological data. Neuroimage 2022;258:119347. PMID: 35660460; DOI: 10.1016/j.neuroimage.2022.119347.
Abstract
The reproducibility crisis in neuroimaging, particularly in the case of underpowered studies, has cast doubt on our ability to reproduce, replicate and generalize findings. In response, we have seen the emergence of suggested guidelines and principles for neuroscientists, known as Good Scientific Practice, for conducting more reliable research. Still, every study remains almost unique in its combination of analytical and statistical approaches. While this is understandable considering the diversity of designs and brain data recordings, it also represents a striking obstacle to reproducibility. Here, we propose a non-parametric permutation-based statistical framework, primarily designed for neurophysiological data, to perform group-level inferences on non-negative measures of information, encompassing metrics from information theory and machine learning as well as measures of distance. The framework supports both fixed- and random-effect models to adapt to inter-individual and inter-session variability. Using numerical simulations, we compared the accuracy of both group models in retrieving ground truth, as well as of test- and cluster-wise corrections for multiple comparisons. We then reproduced and extended existing results using both spatially uniform MEG and non-uniform intracranial neurophysiological data. We show how the framework can be used to extract stereotypical task- and behavior-related effects across a population, covering scales from the local level of brain regions, through inter-areal functional connectivity, to measures summarizing network properties. We also present an open-source Python toolbox called Frites that implements the proposed statistical pipeline using information-theoretic metrics, such as single-trial functional connectivity estimation, for the extraction of cognitive brain networks. Taken together, we believe that this framework deserves careful attention, as its robustness and flexibility could be a starting point toward the uniformization of statistical approaches.
Affiliation(s)
- Etienne Combrisson
- Institut de Neurosciences de la Timone, Aix Marseille Université, UMR 7289 CNRS, 13005, Marseille, France.
- Michele Allegra
- Institut de Neurosciences de la Timone, Aix Marseille Université, UMR 7289 CNRS, 13005, Marseille, France
- Dipartimento di Fisica e Astronomia "Galileo Galilei", Università di Padova, via Marzolo 8, 35131 Padova, Italy
- Padua Neuroscience Center, Università di Padova, via Orus 2, 35131 Padova, Italy
- Ruggero Basanisi
- Institut de Neurosciences de la Timone, Aix Marseille Université, UMR 7289 CNRS, 13005, Marseille, France
- Robin A. A. Ince
- School of Psychology and Neuroscience, University of Glasgow, Glasgow, UK
- Bruno Giordano
- Institut de Neurosciences de la Timone, Aix Marseille Université, UMR 7289 CNRS, 13005, Marseille, France
- Julien Bastin
- Univ. Grenoble Alpes, Inserm, U1216, Grenoble Institut Neurosciences, 38000 Grenoble, France
- Andrea Brovelli
- Institut de Neurosciences de la Timone, Aix Marseille Université, UMR 7289 CNRS, 13005, Marseille, France
9
Faes L, Pernice R, Mijatovic G, Antonacci Y, Krohova JC, Javorka M, Porta A. Information decomposition in the frequency domain: a new framework to study cardiovascular and cardiorespiratory oscillations. Philos Trans A Math Phys Eng Sci 2021;379:20200250. PMID: 34689619; DOI: 10.1098/rsta.2020.0250.
Abstract
While cross-spectral and information-theoretic approaches are widely used for the multivariate analysis of physiological time series, their combined utilization is far less developed in the literature. This study introduces a framework for the spectral decomposition of multivariate information measures, which provides frequency-specific quantifications of the information shared between a target and two source time series and of its expansion into amounts related to how the sources contribute to the target dynamics with unique, redundant and synergistic information. The framework is illustrated in simulations of linearly interacting stochastic processes, showing how it allows us to retrieve amounts of information shared by the processes within specific frequency bands which are otherwise not detectable by time-domain information measures, as well as coupling features which are not detectable by spectral measures. Then, it is applied to time series of heart period, systolic and diastolic arterial pressure, and respiration variability measured in healthy subjects monitored in the resting supine position and during head-up tilt. We show that the spectral measures of unique, redundant and synergistic information shared by these variability series, when integrated within specific frequency bands of physiological interest, reflect the mechanisms of short-term regulation of cardiovascular and cardiorespiratory oscillations and their alterations induced by postural stress. This article is part of the theme issue 'Advanced computation in cardiovascular physiology: new challenges and opportunities'.
Affiliation(s)
- Luca Faes
- Department of Engineering, University of Palermo, Palermo, Italy
- Riccardo Pernice
- Department of Engineering, University of Palermo, Palermo, Italy
- Gorana Mijatovic
- Faculty of Technical Sciences, University of Novi Sad, Novi Sad, Serbia
- Yuri Antonacci
- Department of Physics and Chemistry 'Emilio Segrè', University of Palermo, Palermo, Italy
- Jana Cernanova Krohova
- Department of Physiology and Biomedical Centre Martin (BioMed Martin), Jessenius Faculty of Medicine in Martin, Comenius University in Bratislava, Martin, Slovakia
- Michal Javorka
- Department of Physiology and Biomedical Centre Martin (BioMed Martin), Jessenius Faculty of Medicine in Martin, Comenius University in Bratislava, Martin, Slovakia
- Alberto Porta
- Department of Biomedical Sciences for Health, University of Milan, Milan, Italy
- Department of Cardiothoracic, Vascular Anesthesia and Intensive Care, IRCCS Policlinico San Donato, San Donato Milanese, Milan, Italy
10
Nonlinear reconfiguration of network edges, topology and information content during an artificial learning task. Brain Inform 2021;8:26. PMID: 34859330; PMCID: PMC8639979; DOI: 10.1186/s40708-021-00147-z.
Abstract
Here, we combine network neuroscience and machine learning to reveal connections between the brain's network structure and the emerging network structure of an artificial neural network. Specifically, we train a shallow, feedforward neural network to classify hand-written digits and then use a combination of systems neuroscience and information-theoretic tools to perform 'virtual brain analytics' on the resultant edge weights and activity patterns of each node. We identify three distinct phases of network reconfiguration across learning, each of which is characterized by unique topological and information-theoretic signatures. Each phase involves aligning the connections of the neural network with patterns of information contained in the input dataset or in preceding layers (as relevant). We also observe a process of low-dimensional category separation in the network as a function of learning. Our results offer a systems-level perspective on how artificial neural networks function, in terms of a multi-stage reorganization of edge weights and activity patterns that effectively exploits the information content of the input data during edge-weight training, while simultaneously enriching our understanding of the methods used by systems neuroscience.
11
Gutknecht AJ, Wibral M, Makkeh A. Bits and pieces: understanding information decomposition from part-whole relationships and formal logic. Proc Math Phys Eng Sci 2021;477:20210110. PMID: 35197799; PMCID: PMC8261229; DOI: 10.1098/rspa.2021.0110.
Abstract
Partial information decomposition (PID) seeks to decompose the multivariate mutual information that a set of source variables contains about a target variable into basic pieces, the so-called ‘atoms of information’. Each atom describes a distinct way in which the sources may contain information about the target. For instance, some information may be contained uniquely in a particular source, some information may be shared by multiple sources and some information may only become accessible synergistically if multiple sources are combined. In this paper, we show that the entire theory of PID can be derived, firstly, from considerations of part-whole relationships between information atoms and mutual information terms, and secondly, based on a hierarchy of logical constraints describing how a given information atom can be accessed. In this way, the idea of a PID is developed on the basis of two of the most elementary relationships in nature: the part-whole relationship and the relation of logical implication. This unifying perspective provides insights into pressing questions in the field such as the possibility of constructing a PID based on concepts other than redundant information in the general n-sources case. Additionally, it admits of a particularly accessible exposition of PID theory.
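In the two-source case, the part-whole relationships the authors start from reduce to the familiar consistency equations linking the four atoms (redundant, two unique, synergistic) to measurable mutual information terms; in standard notation:

```latex
\begin{align}
I(T;X_1)     &= \Pi_{\mathrm{unq}}(X_1) + \Pi_{\mathrm{red}},\\
I(T;X_2)     &= \Pi_{\mathrm{unq}}(X_2) + \Pi_{\mathrm{red}},\\
I(T;X_1,X_2) &= \Pi_{\mathrm{unq}}(X_1) + \Pi_{\mathrm{unq}}(X_2)
              + \Pi_{\mathrm{red}} + \Pi_{\mathrm{syn}}.
\end{align}
```

Fixing any one atom (usually the redundancy) determines the rest by Möbius inversion over the part-whole lattice, which is why the choice of redundancy concept, and whether other base concepts can play the same role in the general n-source case, is the crux the paper addresses.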
Affiliation(s)
- A J Gutknecht
- Campus Institute for Dynamics of Biological Networks, Georg-August University, Goettingen, Germany
- MEG Unit, Brain Imaging Center, Goethe University, Frankfurt, Germany
- M Wibral
- Campus Institute for Dynamics of Biological Networks, Georg-August University, Goettingen, Germany
- A Makkeh
- Campus Institute for Dynamics of Biological Networks, Georg-August University, Goettingen, Germany
12
Rudelt L, González Marx D, Wibral M, Priesemann V. Embedding optimization reveals long-lasting history dependence in neural spiking activity. PLoS Comput Biol 2021;17:e1008927. PMID: 34061837; PMCID: PMC8205186; DOI: 10.1371/journal.pcbi.1008927.
Abstract
Information processing can leave distinct footprints on the statistics of neural spiking. For example, efficient coding minimizes the statistical dependencies on the spiking history, while temporal integration of information may require the maintenance of information over different timescales. To investigate these footprints, we developed a novel approach to quantify history dependence within the spiking of a single neuron, using the mutual information between the entire past and current spiking. This measure captures how much past information is necessary to predict current spiking. In contrast, classical time-lagged measures of temporal dependence, like the autocorrelation, capture how long (potentially redundant) past information can still be read out. Strikingly, we find for model neurons that our method disentangles the strength and timescale of history dependence, whereas the two are mixed in classical approaches. When applying the method to experimental data, which are necessarily of limited size, a reliable estimation of mutual information is only possible for a coarse temporal binning of past spiking, a so-called past embedding. To account for the vastly different spiking statistics and potentially long history dependence of living neurons, we developed an embedding-optimization approach that varies not only the number and size of past bins but also stretches them exponentially. For extracellular spike recordings, we found that the strength and timescale of history dependence can indeed vary independently across experimental preparations: hippocampus showed strong and long history dependence, visual cortex weak and short, while in vitro the history dependence was strong but short. This work enables an information-theoretic characterization of history dependence in recorded spike trains, capturing a footprint of information processing that is beyond the reach of time-lagged measures of temporal dependence.
To facilitate the application of the method, we provide practical guidelines and a toolbox.
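The central quantity, a plug-in estimate of the mutual information between (binned) past spiking and the current spike, can be sketched on a toy bursty spike train. All parameters below are invented for illustration; the authors' toolbox additionally optimizes the embedding and corrects estimation bias.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

# Toy spike train with history dependence: firing probability is much
# higher immediately after a spike (bursting). Parameters are hypothetical.
T = 200_000
p_base, p_burst = 0.02, 0.5
spikes = np.zeros(T, dtype=int)
for t in range(1, T):
    spikes[t] = rng.random() < (p_burst if spikes[t - 1] else p_base)

def plugin_mi(past, current):
    """Plug-in estimate of I(past; current) for discrete symbols, in bits."""
    n = len(past)
    joint = Counter(zip(past, current))
    pp, pc = Counter(past), Counter(current)
    return sum(k / n * np.log2(k * n / (pp[p] * pc[c]))
               for (p, c), k in joint.items())

# Simplest possible past embedding: a single bin covering the preceding
# time step. A real analysis varies bin number, size, and stretching.
mi = plugin_mi(spikes[:-1], spikes[1:])
```

For this bursty train the estimate comes out clearly positive, whereas for an independent Bernoulli train it would be near zero (up to plug-in bias).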
Collapse
Affiliation(s)
- Lucas Rudelt
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
| | | | - Michael Wibral
- Campus Institute for Dynamics of Biological Networks, University of Göttingen, Göttingen, Germany
| | - Viola Priesemann
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
| |
Collapse
|
13
|
Spitzner FP, Dehning J, Wilting J, Hagemann A, Neto JP, Zierenberg J, Priesemann V. MR. Estimator, a toolbox to determine intrinsic timescales from subsampled spiking activity. PLoS One 2021; 16:e0249447. [PMID: 33914774 PMCID: PMC8084202 DOI: 10.1371/journal.pone.0249447] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/06/2020] [Accepted: 03/18/2021] [Indexed: 11/23/2022] Open
Abstract
Here we present our Python toolbox "MR. Estimator" to reliably estimate the intrinsic timescale from electrophysiological recordings of heavily subsampled systems. Originally intended for the analysis of time series from neuronal spiking activity, our toolbox is applicable to a wide range of systems where subsampling, the difficulty of observing the whole system in full detail, limits our capability to record. Applications range from epidemic spreading to any system that can be represented by an autoregressive process. In the context of neuroscience, the intrinsic timescale can be thought of as the duration over which any perturbation reverberates within the network; it has been used as a key observable to investigate a functional hierarchy across the primate cortex and serves as a measure of working memory. It is also a proxy for the distance to criticality and quantifies a system's dynamic working point.
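The principle behind multistep regression can be sketched in a few lines (an illustration of the idea only, not the toolbox's API; the process and noise parameters are invented): observation noise biases the single-lag regression slope, but the multistep slopes r_k still decay as m**k, so a log-linear fit across lags recovers the autoregressive coefficient and hence the intrinsic timescale.

```python
import numpy as np

rng = np.random.default_rng(1)

# AR(1) activity with true autoregressive coefficient m (hypothetical).
m_true, T = 0.95, 200_000
a = np.zeros(T)
for t in range(1, T):
    a[t] = m_true * a[t - 1] + rng.normal()

# Crude stand-in for subsampling: strong independent observation noise.
obs = a + rng.normal(0.0, 3.0, T)

def slope(x, k):
    """Slope of the linear regression of x[t + k] on x[t]."""
    return np.cov(x[:-k], x[k:])[0, 1] / np.var(x[:-k])

# The single-lag slope is biased low, but r_k remains proportional to
# m**k, so a log-linear fit recovers m and the intrinsic timescale
# tau = -1 / log(m), here in units of time steps (true tau ~ 19.5).
ks = np.arange(1, 11)
rk = np.array([slope(obs, k) for k in ks])
m_est = np.exp(np.polyfit(ks, np.log(rk), 1)[0])
tau = -1.0 / np.log(m_est)
```

The released toolbox adds, among other things, stationarity checks, multiple fit functions, and confidence intervals via bootstrapping over trials.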
Collapse
Affiliation(s)
- F. P. Spitzner
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
| | - J. Dehning
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
| | - J. Wilting
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
| | - A. Hagemann
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
| | - J. P. Neto
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
| | - J. Zierenberg
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
| | - V. Priesemann
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein-Center for Computational Neuroscience (BCCN) Göttingen, Göttingen, Germany
| |
Collapse
|
14
|
Makkeh A, Gutknecht AJ, Wibral M. Introducing a differentiable measure of pointwise shared information. Phys Rev E 2021; 103:032149. [PMID: 33862718 DOI: 10.1103/physreve.103.032149] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/30/2020] [Accepted: 02/19/2021] [Indexed: 11/07/2022]
Abstract
Partial information decomposition of the multivariate mutual information describes the distinct ways in which a set of source variables contains information about a target variable. The groundbreaking work of Williams and Beer has shown that this decomposition cannot be determined from classic information theory without making additional assumptions, and several candidate measures have been proposed, often drawing on principles from related fields such as decision theory. None of these measures is differentiable with respect to the underlying probability mass function. We here present a measure that satisfies this property, emerges solely from information-theoretic principles, and has the form of a local mutual information. We show how the measure can be understood from the perspective of exclusions of probability mass, a principle that is foundational to the original definition of mutual information by Fano. Since our measure is well defined for individual realizations of random variables it lends itself, for example, to local learning in artificial neural networks. We also show that it has a meaningful Möbius inversion on a redundancy lattice and obeys a target chain rule. We give an operational interpretation of the measure based on the decisions that an agent should take if given only the shared information.
Collapse
Affiliation(s)
- Abdullah Makkeh
- Campus Institute for Dynamics of Biological Networks, Georg-August University, Göttingen, Germany
| | - Aaron J Gutknecht
- Campus Institute for Dynamics of Biological Networks, Georg-August University, Göttingen, Germany
| | - Michael Wibral
- Campus Institute for Dynamics of Biological Networks, Georg-August University, Göttingen, Germany
| |
Collapse
|
15
|
Discovering Higher-Order Interactions Through Neural Information Decomposition. ENTROPY 2021; 23:e23010079. [PMID: 33430463 PMCID: PMC7827712 DOI: 10.3390/e23010079] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 11/03/2020] [Revised: 12/21/2020] [Accepted: 12/25/2020] [Indexed: 11/25/2022]
Abstract
If regularity in data takes the form of higher-order functions among groups of variables, models which are biased towards lower-order functions may easily mistake the data for noise. To distinguish whether this is the case, one must be able to quantify the contribution of different orders of dependence to the total information. Recent work in information theory attempts to do this through measures of multivariate mutual information (MMI) and information decomposition (ID). Despite substantial theoretical progress, practical issues related to tractability and learnability of higher-order functions are still largely unaddressed. In this work, we introduce a new approach to information decomposition—termed Neural Information Decomposition (NID)—which is both theoretically grounded, and can be efficiently estimated in practice using neural networks. We show on synthetic data that NID can learn to distinguish higher-order functions from noise, while many unsupervised probability models cannot. Additionally, we demonstrate the usefulness of this framework as a tool for exploring biological and artificial neural networks.
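The motivating problem can be reproduced with the classic XOR example (a plug-in sketch, not the NID method itself): every pairwise statistic looks like noise, yet the triple is fully determined, so a model restricted to second-order dependencies would mistake the data for noise.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)

def plugin_mi(xs, ys):
    """Plug-in mutual information (bits) between two discrete sequences."""
    n = len(xs)
    jc, xc, yc = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(k / n * np.log2(k * n / (xc[a] * yc[b]))
               for (a, b), k in jc.items())

# Third-order regularity: Z = X XOR Y with independent fair-coin inputs.
n = 100_000
x = rng.integers(0, 2, n)
y = rng.integers(0, 2, n)
z = x ^ y

# Each pairwise dependency is (near) zero ...
mi_xz = plugin_mi(x, z)
mi_yz = plugin_mi(y, z)
# ... yet the pair (X, Y) jointly determines Z completely: about 1 bit.
mi_xy_z = plugin_mi(list(zip(x, y)), z)
```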
Collapse
|
16
|
Pinzuti E, Wollstadt P, Gutknecht A, Tüscher O, Wibral M. Measuring spectrally-resolved information transfer. PLoS Comput Biol 2020; 16:e1008526. [PMID: 33370259 PMCID: PMC7793276 DOI: 10.1371/journal.pcbi.1008526] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2020] [Revised: 01/08/2021] [Accepted: 11/12/2020] [Indexed: 12/13/2022] Open
Abstract
Information transfer, measured by transfer entropy, is a key component of distributed computation. It is therefore important to understand the pattern of information transfer in order to unravel the distributed computational algorithms of a system. Since distributed computation in many natural systems is thought to rely on rhythmic processes, a frequency-resolved measure of information transfer is highly desirable. Here, we present a novel algorithm, and its efficient implementation, to identify separately frequencies sending and receiving information in a network. Our approach relies on the invertible maximum overlap discrete wavelet transform (MODWT) for the creation of surrogate data in the computation of transfer entropy and entirely avoids filtering of the original signals. The approach thereby avoids well-known problems due to phase shifts or the ineffectiveness of filtering in the information theoretic setting. We also show that measuring frequency-resolved information transfer is a partial information decomposition problem that cannot be fully resolved to date and discuss the implications of this issue. Last, we evaluate the performance of our algorithm on simulated data and apply it to human magnetoencephalography (MEG) recordings and to local field potential recordings in the ferret. In human MEG we demonstrate top-down information flow in temporal cortex from very high frequencies (above 100 Hz) to both similarly high frequencies and to frequencies around 20 Hz, i.e., a complex spectral configuration of cortical information transmission that has not been described before. In the ferret we show that the prefrontal cortex sends information at low frequencies (4-8 Hz) to early visual cortex (V1), while V1 receives the information at high frequencies (>125 Hz).
Collapse
Affiliation(s)
| | - Patricia Wollstadt
- MEG Unit, Brain Imaging Center, Goethe University, Frankfurt/Main, Germany
| | - Aaron Gutknecht
- Campus Institute for Dynamics of Biological Networks, Georg August University, Göttingen, Germany
| | - Oliver Tüscher
- Leibniz Institute for Resilience Research, Mainz, Germany
- Department of Psychiatry and Psychotherapy, Johannes Gutenberg University of Mainz, Mainz, Germany
| | - Michael Wibral
- MEG Unit, Brain Imaging Center, Goethe University, Frankfurt/Main, Germany
- Campus Institute for Dynamics of Biological Networks, Georg August University, Göttingen, Germany
| |
Collapse
|
17
|
Martínez-Cancino R, Delorme A, Wagner J, Kreutz-Delgado K, Sotero RC, Makeig S. What Can Local Transfer Entropy Tell Us about Phase-Amplitude Coupling in Electrophysiological Signals? ENTROPY 2020; 22:e22111262. [PMID: 33287030 PMCID: PMC7712258 DOI: 10.3390/e22111262] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 09/29/2020] [Revised: 11/03/2020] [Accepted: 11/04/2020] [Indexed: 12/18/2022]
Abstract
Modulation of the amplitude of high-frequency cortical field activity locked to changes in the phase of a slower brain rhythm is known as phase-amplitude coupling (PAC). The study of this phenomenon has been gaining traction in neuroscience because of several reports on its appearance in normal and pathological brain processes in humans as well as across different mammalian species. This has led to the suggestion that PAC may be an intrinsic brain process that facilitates brain inter-area communication across different spatiotemporal scales. Several methods have been proposed to measure the PAC process, but few of these enable detailed study of its time course. It appears that no studies have reported details of PAC dynamics including its possible directional delay characteristic. Here, we study and characterize the use of a novel information theoretic measure that may address this limitation: local transfer entropy. We use both simulated and actual intracranial electroencephalographic data. In both cases, we observe initial indications that local transfer entropy can be used to detect the onset and offset of modulation process periods revealed by mutual information estimated phase-amplitude coupling (MIPAC). We review our results in the context of current theories about PAC in brain electrical activity, and discuss technical issues that must be addressed to see local transfer entropy more widely applied to PAC analysis. The current work sets the foundations for further use of local transfer entropy for estimating PAC process dynamics, and extends and complements our previous work on using local mutual information to compute PAC (MIPAC).
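The key quantity here is the local (pointwise) transfer entropy, log2 of p(y_t | y_past, x_past) / p(y_t | y_past): its average over samples is the usual transfer entropy, while its individual values can be negative ("misinformative" moments), which is what makes it usable as a time-resolved signal. A minimal plug-in sketch on a toy binary coupling (not the MIPAC pipeline; parameters invented):

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(4)

# Toy directed coupling: Y copies X's previous value, flipped 20% of the time.
n = 100_000
x = rng.integers(0, 2, n)
y = np.empty(n, dtype=int)
y[0] = 0
flip = rng.random(n) < 0.2
y[1:] = np.where(flip[1:], 1 - x[:-1], x[:-1])

yt, yp, xp = y[1:], y[:-1], x[:-1]   # target now, target past, source past
c_typ = Counter(zip(yt, yp, xp))     # counts for (y_t, y_past, x_past)
c_px = Counter(zip(yp, xp))          # counts for (y_past, x_past)
c_ty = Counter(zip(yt, yp))          # counts for (y_t, y_past)
c_p = Counter(yp)                    # counts for y_past

# Local transfer entropy per sample:
#   t_i = log2 [ p(y_t | y_past, x_past) / p(y_t | y_past) ]
local = np.array([
    np.log2((c_typ[(a, b, c)] / c_px[(b, c)]) / (c_ty[(a, b)] / c_p[b]))
    for a, b, c in zip(yt, yp, xp)
])

avg_te = local.mean()   # the average of the local values is the plug-in TE
```

Samples where the source misleads the prediction (here, the 20% flipped steps) yield negative local values; the positive average still reflects the net directed influence.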
Collapse
Affiliation(s)
- Ramón Martínez-Cancino
- Swartz Center for Computational Neurosciences, Institute for Neural Computation, University of California San Diego, La Jolla, CA 92093, USA; (A.D.); (J.W.); (S.M.)
- Jacobs School of Engineering, University of California San Diego, La Jolla, CA 92093, USA;
- Correspondence:
| | - Arnaud Delorme
- Swartz Center for Computational Neurosciences, Institute for Neural Computation, University of California San Diego, La Jolla, CA 92093, USA; (A.D.); (J.W.); (S.M.)
- Centre de Recherche Cerveau et Cognition (CerCo), Université Paul Sabatier, 31059 Toulouse, France
- CNRS, UMR 5549, 31052 Toulouse, France
| | - Johanna Wagner
- Swartz Center for Computational Neurosciences, Institute for Neural Computation, University of California San Diego, La Jolla, CA 92093, USA; (A.D.); (J.W.); (S.M.)
| | - Kenneth Kreutz-Delgado
- Jacobs School of Engineering, University of California San Diego, La Jolla, CA 92093, USA;
| | - Roberto C. Sotero
- Computational Neurophysics Lab, University of Calgary, Calgary, AB T2N 4N1, Canada;
| | - Scott Makeig
- Swartz Center for Computational Neurosciences, Institute for Neural Computation, University of California San Diego, La Jolla, CA 92093, USA; (A.D.); (J.W.); (S.M.)
| |
Collapse
|
18
|
Cramer B, Stöckel D, Kreft M, Wibral M, Schemmel J, Meier K, Priesemann V. Control of criticality and computation in spiking neuromorphic networks with plasticity. Nat Commun 2020; 11:2853. [PMID: 32503982 PMCID: PMC7275091 DOI: 10.1038/s41467-020-16548-3] [Citation(s) in RCA: 36] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/21/2019] [Accepted: 04/23/2020] [Indexed: 11/08/2022] Open
Abstract
The critical state is assumed to be optimal for any computation in recurrent neural networks, because criticality maximizes a number of abstract computational properties. We challenge this assumption by evaluating the performance of a spiking recurrent neural network on a set of tasks of varying complexity at, and away from, critical network dynamics. To that end, we developed a plastic spiking network on a neuromorphic chip. We show that the distance to criticality can be easily adapted by changing the input strength, and then demonstrate a clear relation between criticality, task performance and information-theoretic fingerprint. While the information-theoretic measures all show that network capacity is maximal at criticality, only the complex tasks profit from criticality, whereas simple tasks suffer. Thereby, we challenge the general assumption that criticality would be beneficial for any task, and provide instead an understanding of how the collective network state should be tuned to task requirements.
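One way to see how input strength sets the distance to criticality: in a branching-process caricature with a homeostatic rule that clamps the firing rate, stationarity forces m = 1 - h/target, so stronger input h pushes the recurrent strength m away from the critical point m = 1. This is a sketch with invented parameters, not the neuromorphic setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

def homeostatic_branching(h, target=20.0, eta=1e-5, T=50_000):
    """Branching process A_t ~ Poisson(m * A_{t-1}) + Poisson(h), with a
    homeostatic rule nudging the recurrent strength m toward a target rate."""
    m, A = 0.5, target
    trace = []
    for _ in range(T):
        A = rng.poisson(m * A) + rng.poisson(h)
        m = min(max(m + eta * (target - A), 0.0), 0.999)
        trace.append(m)
    return float(np.mean(trace[-10_000:]))   # average over the settled tail

# With the rate clamped, stationarity requires target = m*target + h,
# i.e. m = 1 - h/target: stronger input drives the dynamics subcritical.
ms = [homeostatic_branching(h) for h in (0.5, 2.0, 10.0)]
```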
Collapse
Affiliation(s)
- Benjamin Cramer
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, 69120, Heidelberg, Germany.
| | - David Stöckel
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, 69120, Heidelberg, Germany
| | - Markus Kreft
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, 69120, Heidelberg, Germany
| | - Michael Wibral
- Campus Institute for Dynamics of Biological Networks, Georg-August University, Hermann-Rein-Straße 3, 37075, Göttingen, Germany
| | - Johannes Schemmel
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, 69120, Heidelberg, Germany
| | - Karlheinz Meier
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, 69120, Heidelberg, Germany
| | - Viola Priesemann
- Max-Planck-Institute for Dynamics and Self-Organization, Am Faßberg 17, 37077, Göttingen, Germany.
- Bernstein Center for Computational Neuroscience, Georg-August University, Am Faßberg 17, 37077, Göttingen, Germany.
- Department of Physics, Georg-August University, Friedrich-Hund-Platz 1, 37077, Göttingen, Germany.
| |
Collapse
|
19
|
A Method to Present and Analyze Ensembles of Information Sources. ENTROPY 2020; 22:e22050580. [PMID: 33286352 PMCID: PMC7517101 DOI: 10.3390/e22050580] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 04/29/2020] [Accepted: 05/18/2020] [Indexed: 01/22/2023]
Abstract
Information theory is a powerful tool for analyzing complex systems. In many areas of neuroscience, it is now possible to gather data from large ensembles of neural variables (e.g., data from many neurons, genes, or voxels). The individual variables can be analyzed with information theory to provide estimates of information shared between variables (forming a network between variables), or between neural variables and other variables (e.g., behavior or sensory stimuli). However, it can be difficult to (1) evaluate if the ensemble is significantly different from what would be expected in a purely noisy system and (2) determine if two ensembles are different. Herein, we introduce relatively simple methods to address these problems by analyzing ensembles of information sources. We demonstrate how an ensemble built of mutual information connections can be compared to null surrogate data to determine if the ensemble is significantly different from noise. Next, we show how two ensembles can be compared using a randomization process to determine if the sources in one contain more information than the other. All code necessary to carry out these analyses and demonstrations is provided.
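The first of these tests, comparing an ensemble of mutual-information values against shuffle surrogates, can be sketched as follows (toy data and a plug-in estimator; all parameters are invented and this is not the authors' released code):

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(6)

def plugin_mi(xs, ys):
    """Plug-in mutual information (bits) between two discrete sequences."""
    n = len(xs)
    jc, xc, yc = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(k / n * np.log2(k * n / (xc[a] * yc[b]))
               for (a, b), k in jc.items())

# Toy ensemble: a binary stimulus and 30 neurons, half weakly coupled.
n_samples, n_neurons = 2_000, 30
stim = rng.integers(0, 2, n_samples)
responses = []
for i in range(n_neurons):
    noise = rng.integers(0, 2, n_samples)
    r = np.where(rng.random(n_samples) < 0.3, stim, noise) if i < 15 else noise
    responses.append(r)

ensemble = np.mean([plugin_mi(stim, r) for r in responses])

# Null distribution: shuffling the stimulus destroys any real coupling
# while preserving the marginals.
null = np.array([
    np.mean([plugin_mi(rng.permutation(stim), r) for r in responses])
    for _ in range(50)
])
p_value = np.mean(null >= ensemble)
```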
Collapse
|
20
|
Abstract
Neural systems are composed of many local processors that generate an output given their many inputs as specified by a transfer function. This paper studies a transfer function that is fundamentally asymmetric and builds on multi-site intracellular recordings indicating that some neocortical pyramidal cells can function as context-sensitive two-point processors in which some inputs modulate the strength with which they transmit information about other inputs. Learning and processing at the level of the local processor can then be guided by the context of activity in the system as a whole without corrupting the message that the local processor transmits. We use a recent advance in the foundations of information theory to compare the properties of this modulatory transfer function with that of the simple arithmetic operators. This advance enables the information transmitted by processors with two distinct inputs to be decomposed into those components unique to each input, that shared between the two inputs, and that which depends on both though it is in neither, i.e., synergy. We show that contextual modulation is fundamentally asymmetric, contrasts with all four simple arithmetic operators, can take various forms, and can occur together with the anatomical asymmetry that defines pyramidal neurons in mammalian neocortex.
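The claimed asymmetry can be checked numerically. Assuming, purely for illustration, a modulatory transfer function of the form T(r, c) = r(1 + exp(rc))/2 (the exact form used in the paper may differ), the output of a Bernoulli readout of T carries far more information about the driving input r than about the modulatory context c:

```python
import numpy as np
from itertools import product

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def binary_entropy(p):
    return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# P(Y = 1 | r, c) for a Bernoulli readout of the assumed modulatory
# transfer function; r drives, c modulates, both uniform on {-1, +1}.
p_y1 = {(r, c): sigmoid(r * (1.0 + np.exp(r * c)) / 2.0)
        for r, c in product((-1, 1), repeat=2)}

def mi_about(which):
    """I(Y; R) if which == 'r', else I(Y; C), for independent uniform inputs."""
    if which == "r":
        cond = [np.mean([p_y1[(s, o)] for o in (-1, 1)]) for s in (-1, 1)]
    else:
        cond = [np.mean([p_y1[(o, s)] for o in (-1, 1)]) for s in (-1, 1)]
    return binary_entropy(np.mean(cond)) - np.mean([binary_entropy(p) for p in cond])

mi_r = mi_about("r")   # information the output carries about the driving input
mi_c = mi_about("c")   # information the output carries about the context
```

The context shifts the output statistics without dominating them, which is the asymmetry the abstract contrasts with symmetric arithmetic operators.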
Collapse
|
21
|
Optimal Interplay between Synaptic Strengths and Network Structure Enhances Activity Fluctuations and Information Propagation in Hierarchical Modular Networks. Brain Sci 2020; 10:brainsci10040228. [PMID: 32290351 PMCID: PMC7226268 DOI: 10.3390/brainsci10040228] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/20/2020] [Revised: 04/03/2020] [Accepted: 04/04/2020] [Indexed: 01/21/2023] Open
Abstract
In network models of spiking neurons, the joint impact of network structure and synaptic parameters on activity propagation is still an open problem. Here, we use an information-theoretical approach to investigate activity propagation in spiking networks with a hierarchical modular topology. We observe that optimized pairwise information propagation emerges due to the increase of either (i) the global synaptic strength parameter or (ii) the number of modules in the network, while the network size remains constant. At the population level, information propagation of activity among adjacent modules is enhanced as the number of modules increases until a maximum value is reached and then decreases, showing that there is an optimal interplay between synaptic strength and modularity for population information flow. This is in contrast to information propagation evaluated among pairs of neurons, which attains maximum value at the maximum values of these two parameter ranges. By examining the network behavior under the increase of synaptic strength and the number of modules, we find that these increases are associated with two different effects: (i) the increase of autocorrelations among individual neurons and (ii) the increase of cross-correlations among pairs of neurons. The second effect is associated with better information propagation in the network. Our results suggest roles that link topological features and synaptic strength levels to the transmission of information in cortical networks.
Collapse
|
22
|
Tehrani-Saleh A, Adami C. Can Transfer Entropy Infer Information Flow in Neuronal Circuits for Cognitive Processing? ENTROPY 2020; 22:e22040385. [PMID: 33286159 PMCID: PMC7516857 DOI: 10.3390/e22040385] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 12/04/2019] [Revised: 03/11/2020] [Accepted: 03/25/2020] [Indexed: 11/16/2022]
Abstract
How cognitive neural systems process information is largely unknown, in part because of how difficult it is to accurately follow the flow of information from sensors via neurons to actuators. Measuring the flow of information is different from measuring correlations between firing neurons, for which several measures are available, foremost among them the Shannon information, which is an undirected measure. Several information-theoretic notions of "directed information" have been used to successfully detect the flow of information in some systems, in particular in the neuroscience community. However, recent work has shown that directed information measures such as transfer entropy can sometimes inadequately estimate information flow, or even fail to identify manifest directed influences, especially if neurons contribute in a cryptographic manner to influence the effector neuron. Because it is unclear how often such cryptic influences emerge in cognitive systems, the usefulness of transfer entropy measures to reconstruct information flow is unknown. Here, we test how often cryptographic logic emerges in an evolutionary process that generates artificial neural circuits for two fundamental cognitive tasks (motion detection and sound localization). Besides counting the frequency of problematic logic gates, we also test whether transfer entropy applied to an activity time-series recorded from behaving digital brains can infer information flow, compared to a ground-truth model of direct influence constructed from connectivity and circuit logic. Our results suggest that transfer entropy will sometimes fail to infer directed information when it exists, and sometimes suggest a causal connection when there is none. However, the extent of incorrect inference strongly depends on the cognitive task considered. 
These results emphasize the importance of understanding the fundamental logic processes that contribute to information flow in cognitive processing, and quantifying their relevance in any given nervous system.
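Both outcomes can be reproduced with a plug-in transfer entropy TE(X -> Y) = I(Y_t; X_{t-1} | Y_{t-1}) on toy circuits (a sketch, not the paper's digital-brain setup): a noisy copy is detected, while an XOR of two hidden inputs, the "cryptographic" case, hides a manifest influence.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(7)

def transfer_entropy(src, tgt):
    """Plug-in TE(src -> tgt) = I(tgt_t; src_{t-1} | tgt_{t-1}), in bits."""
    a, b, c = tgt[1:], src[:-1], tgt[:-1]   # target now, source past, target past
    n = len(a)
    c_abc = Counter(zip(a, b, c))
    c_ac, c_bc, c_c = Counter(zip(a, c)), Counter(zip(b, c)), Counter(c)
    return sum(k / n * np.log2(k * c_c[z] / (c_ac[(u, z)] * c_bc[(v, z)]))
               for (u, v, z), k in c_abc.items())

n = 100_000
x = rng.integers(0, 2, n)
hidden = rng.integers(0, 2, n)

# Noisy copy: Y follows X with one-step delay -> TE detects the influence.
noise = rng.random(n) < 0.1
y_copy = np.empty(n, dtype=int)
y_copy[0] = 0
y_copy[1:] = np.where(noise[1:], 1 - x[:-1], x[:-1])

# "Cryptographic" gate: Z_t = X_{t-1} XOR hidden_{t-1}. X demonstrably
# drives Z, but TE conditioned only on Z's own past sees nothing.
z = np.empty(n, dtype=int)
z[0] = 0
z[1:] = x[:-1] ^ hidden[:-1]

te_copy = transfer_entropy(x, y_copy)
te_xor = transfer_entropy(x, z)
```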
Collapse
Affiliation(s)
- Ali Tehrani-Saleh
- Department of Computer Science and Engineering, Michigan State University, East Lansing, MI 48824, USA;
- BEACON Center for the Study of Evolution, Michigan State University, East Lansing, MI 48824, USA
| | - Christoph Adami
- BEACON Center for the Study of Evolution, Michigan State University, East Lansing, MI 48824, USA
- Department of Microbiology & Molecular Genetics, Michigan State University, East Lansing, MI 48824, USA
- Department of Physics & Astronomy, Michigan State University, East Lansing, MI 48824, USA
- Correspondence:
| |
Collapse
|
23
|
Finn C, Lizier JT. Generalised Measures of Multivariate Information Content. ENTROPY (BASEL, SWITZERLAND) 2020; 22:E216. [PMID: 33285991 PMCID: PMC7851747 DOI: 10.3390/e22020216] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 12/11/2019] [Revised: 02/05/2020] [Accepted: 02/12/2020] [Indexed: 12/12/2022]
Abstract
The entropy of a pair of random variables is commonly depicted using a Venn diagram. This representation is potentially misleading, however, since the multivariate mutual information can be negative. This paper presents new measures of multivariate information content that can be accurately depicted using Venn diagrams for any number of random variables. These measures complement the existing measures of multivariate mutual information and are constructed by considering the algebraic structure of information sharing. It is shown that the distinct ways in which a set of marginal observers can share their information with a non-observing third party corresponds to the elements of a free distributive lattice. The redundancy lattice from partial information decomposition is then independently derived by combining the algebraic structures of joint and shared information content.
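Why the Venn picture can mislead is easy to demonstrate: the interaction information I(X;Y) - I(X;Y|Z), which a three-set Venn diagram would draw as its central area, is negative for synergistic (XOR-like) variables and positive for redundant ones (a plug-in sketch on toy data):

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(8)

def plugin_mi(xs, ys):
    """Plug-in mutual information I(X; Y) in bits."""
    n = len(xs)
    jc, xc, yc = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(k / n * np.log2(k * n / (xc[a] * yc[b]))
               for (a, b), k in jc.items())

def plugin_cmi(xs, ys, zs):
    """Plug-in conditional mutual information I(X; Y | Z) in bits."""
    n = len(xs)
    c_xyz, c_xz = Counter(zip(xs, ys, zs)), Counter(zip(xs, zs))
    c_yz, c_z = Counter(zip(ys, zs)), Counter(zs)
    return sum(k / n * np.log2(k * c_z[z] / (c_xz[(a, z)] * c_yz[(b, z)]))
               for (a, b, z), k in c_xyz.items())

def co_information(xs, ys, zs):
    # The quantity a 3-set Venn diagram would assign to the central region.
    return plugin_mi(xs, ys) - plugin_cmi(xs, ys, zs)

n = 100_000
x, y = rng.integers(0, 2, n), rng.integers(0, 2, n)
synergistic = co_information(x, y, x ^ y)   # XOR target: about -1 bit
w = rng.integers(0, 2, n)
redundant = co_information(w, w, w)         # full redundancy: about +1 bit
```

A signed "area" cannot be drawn consistently, which motivates the non-negative measures constructed in the paper.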
Collapse
Affiliation(s)
- Conor Finn
- Centre for Complex Systems, The University of Sydney, Sydney NSW 2006, Australia;
- CSIRO Data61, Marsfield NSW 2122, Australia
| | - Joseph T. Lizier
- Centre for Complex Systems, The University of Sydney, Sydney NSW 2006, Australia;
| |
Collapse
|
24
|
Kolchinsky A, Corominas-Murtra B. Decomposing information into copying versus transformation. J R Soc Interface 2020; 17:20190623. [PMID: 31964273 DOI: 10.1098/rsif.2019.0623] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/21/2023] Open
Abstract
In many real-world systems, information can be transmitted in two qualitatively different ways: by copying or by transformation. Copying occurs when messages are transmitted without modification, e.g. when an offspring receives an unaltered copy of a gene from its parent. Transformation occurs when messages are modified systematically during transmission, e.g. when mutational biases occur during genetic replication. Standard information-theoretic measures do not distinguish these two modes of information transfer, although they may reflect different mechanisms and have different functional consequences. Starting from a few simple axioms, we derive a decomposition of mutual information into the information transmitted by copying versus the information transmitted by transformation. We begin with a decomposition that applies when the source and destination of the channel have the same set of messages and a notion of message identity exists. We then generalize our decomposition to other kinds of channels, which can involve different source and destination sets and broader notions of similarity. In addition, we show that copy information can be interpreted as the minimal work needed by a physical copying process, which is relevant for understanding the physics of replication. We use the proposed decomposition to explore a model of amino acid substitution rates. Our results apply to any system in which the fidelity of copying, rather than simple predictability, is of critical relevance.
Collapse
|
25
|
Kostal L, Kobayashi R. Critical size of neural population for reliable information transmission. Phys Rev E 2019; 100:050401. [PMID: 31870018 DOI: 10.1103/physreve.100.050401] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2019] [Indexed: 11/07/2022]
Abstract
It is known that the probability of decoding error has a phase transition at information rate equal to the channel capacity. The corresponding thermodynamic limit requires infinite coding dimension, hence making the actual decoding practically impossible. In this Rapid Communication we analyze finite-size effects that occur in limited neural populations. We report that the achievable rate approaches the asymptote in a remarkably nonlinear manner with the population size. Qualitatively, our findings do not seem to depend on the details of the model.
Collapse
Affiliation(s)
- Lubomir Kostal
- Institute of Physiology of the Czech Academy of Sciences, Videnska 1083, 14220 Prague 4, Czech Republic
| | - Ryota Kobayashi
- Principles of Informatics Research Division, National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo, Japan
| |
Collapse
|
26
|
Barta T, Kostal L. The effect of inhibition on rate code efficiency indicators. PLoS Comput Biol 2019; 15:e1007545. [PMID: 31790384 PMCID: PMC6907877 DOI: 10.1371/journal.pcbi.1007545] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2019] [Revised: 12/12/2019] [Accepted: 11/12/2019] [Indexed: 11/30/2022] Open
Abstract
In this paper we investigate the rate coding capabilities of neurons whose input signals are alterations of the base state of balanced inhibitory and excitatory synaptic currents. We consider different regimes of excitation-inhibition relationship and an established conductance-based leaky integrator model with adaptive threshold and parameter sets recreating biologically relevant spiking regimes. We find that, given the mean post-synaptic firing rate, counter-intuitively, an increased ratio of inhibition to excitation generally leads to a higher signal-to-noise ratio (SNR). On the other hand, the inhibitory input significantly reduces the dynamic coding range of the neuron. We quantify the joint effect of SNR and dynamic coding range by computing the metabolic efficiency, the maximal amount of information per ATP molecule expended (in bits/ATP). Moreover, by calculating the metabolic efficiency we are able to predict the shapes of the post-synaptic firing rate histograms that may be tested on experimental data. Likewise, optimal stimulus input distributions are predicted; however, we show that the optimum can essentially be reached with a broad range of input distributions. Finally, we examine which parameters of the used neuronal model are the most important for the metabolically efficient information transfer.
Collapse
Affiliation(s)
- Tomas Barta
- Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic
- Charles University, First Medical Faculty, Prague, Czech Republic
- Institute of Ecology and Environmental Sciences, INRA, Versailles, France
| | - Lubomir Kostal
- Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic
| |
Collapse
|
27
|
Boonstra TW, Faes L, Kerkman JN, Marinazzo D. Information decomposition of multichannel EMG to map functional interactions in the distributed motor system. Neuroimage 2019; 202:116093. [DOI: 10.1016/j.neuroimage.2019.116093] [Citation(s) in RCA: 16] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/25/2019] [Revised: 07/12/2019] [Accepted: 08/09/2019] [Indexed: 01/21/2023] Open
|
28
|
Li M, Han Y, Aburn MJ, Breakspear M, Poldrack RA, Shine JM, Lizier JT. Transitions in information processing dynamics at the whole-brain network level are driven by alterations in neural gain. PLoS Comput Biol 2019; 15:e1006957. [PMID: 31613882 PMCID: PMC6793849 DOI: 10.1371/journal.pcbi.1006957] [Citation(s) in RCA: 39] [Impact Index Per Article: 7.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/11/2019] [Accepted: 09/02/2019] [Indexed: 12/20/2022] Open
Abstract
A key component of the flexibility and complexity of the brain is its ability to dynamically adapt its functional network structure between integrated and segregated brain states depending on the demands of different cognitive tasks. Integrated states are prevalent when performing tasks of high complexity, such as maintaining items in working memory, consistent with models of a global workspace architecture. Recent work has suggested that the balance between integration and segregation is under the control of ascending neuromodulatory systems, such as the noradrenergic system, via changes in neural gain (in terms of the amplification and non-linearity of the stimulus-response transfer function of brain regions). In a previous large-scale nonlinear oscillator model of neuronal network dynamics, we showed that manipulating neural gain parameters led to a 'critical' transition in phase synchrony that was associated with a shift from segregated to integrated topology, thus confirming our original prediction. In this study, we advance these results by demonstrating that the gain-mediated phase transition is characterized by a shift in the underlying dynamics of neural information processing. Specifically, the dynamics of the subcritical (segregated) regime are dominated by information storage, whereas the supercritical (integrated) regime is associated with increased information transfer (measured via transfer entropy). Operating near the critical regime with respect to modulating neural gain parameters would thus appear to provide computational advantages, offering flexibility in the information processing that can be performed with only subtle changes in gain control. Our results thus link studies of whole-brain network topology and the ascending arousal system with information processing dynamics, and suggest that the ascending arousal system constrains low-dimensional modes of information processing within the brain.
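The storage/transfer distinction drawn above can be made concrete with linear-Gaussian estimators, for which active information storage and transfer entropy reduce to log-ratios of regression residual variances. The following sketch is illustrative only (two coupled autoregressive nodes, not the paper's oscillator model): raising the coupling gain c raises the transfer entropy into the target node.

```python
import numpy as np

def _resid_var(y, cols):
    # variance of y after linear regression on the given predictor columns
    X = np.column_stack([np.ones(len(y))] + list(cols))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

def ais(y):
    """Active information storage (bits), linear-Gaussian, history length 1."""
    return 0.5 * np.log2(np.var(y[1:]) / _resid_var(y[1:], [y[:-1]]))

def transfer_entropy(x, y):
    """Transfer entropy x -> y (bits), linear-Gaussian, history length 1."""
    v_self = _resid_var(y[1:], [y[:-1]])
    v_full = _resid_var(y[1:], [y[:-1], x[:-1]])
    return 0.5 * np.log2(v_self / v_full)

def simulate(c, n=20000, a=0.6, seed=0):
    # x drives y with gain c; both nodes have autoregressive memory a
    rng = np.random.default_rng(seed)
    x, y = np.zeros(n), np.zeros(n)
    for t in range(1, n):
        x[t] = a * x[t - 1] + rng.standard_normal()
        y[t] = a * y[t - 1] + c * x[t - 1] + rng.standard_normal()
    return x, y

x0, y0 = simulate(c=0.0)   # uncoupled: TE should be near zero
x1, y1 = simulate(c=0.8)   # coupled: TE should be clearly positive
```

The same residual-variance trick underlies many practical Gaussian estimators of these quantities; nonparametric estimators replace the regressions with nearest-neighbour density estimates.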
Affiliation(s)
- Mike Li
- Centre for Complex Systems, The University of Sydney, Sydney, Australia
- Brain and Mind Centre, The University of Sydney, Sydney, Australia
- Complex Systems Research Group, Faculty of Engineering, The University of Sydney, Sydney, Australia
- Yinuo Han
- Centre for Complex Systems, The University of Sydney, Sydney, Australia
- Brain and Mind Centre, The University of Sydney, Sydney, Australia
- Matthew J. Aburn
- QIMR Berghofer Medical Research Institute, Queensland, Australia
- Russell A. Poldrack
- Department of Psychology, Stanford University, Stanford, California, United States of America
- James M. Shine
- Centre for Complex Systems, The University of Sydney, Sydney, Australia
- Brain and Mind Centre, The University of Sydney, Sydney, Australia
- Joseph T. Lizier
- Centre for Complex Systems, The University of Sydney, Sydney, Australia
- Complex Systems Research Group, Faculty of Engineering, The University of Sydney, Sydney, Australia

29
Makkeh A, Chicharro D, Theis DO, Vicente R. MAXENT3D_PID: An Estimator for the Maximum-Entropy Trivariate Partial Information Decomposition. ENTROPY 2019; 21:862. [PMCID: PMC7515392 DOI: 10.3390/e21090862] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4]
Abstract
Partial information decomposition (PID) separates the contributions of sources about a target into unique, redundant, and synergistic components of information. In essence, PID answers the question of "who knows what" in a system of random variables and hence has applications to a wide spectrum of fields, ranging from the social to the biological sciences. The paper presents MaxEnt3D_PID, an algorithm that computes the PID of three sources based on a recently proposed maximum entropy measure, using convex optimization (cone programming). We describe the algorithm and the use of its associated software, and report the results of various experiments assessing its accuracy. Moreover, the paper shows that a hierarchy of bivariate and trivariate PIDs allows one to obtain the finer quantities of the trivariate partial information measure.
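MaxEnt3D_PID itself solves a cone program, but the bookkeeping of a PID is easy to show on a small case. The sketch below implements the original Williams-Beer redundancy measure I_min for two discrete sources (a simpler measure than the maximum-entropy one the paper computes) and recovers the textbook results: XOR is purely synergistic, and duplicated sources are purely redundant.

```python
import math
from collections import defaultdict

def pid_imin(p):
    """Williams-Beer PID for two sources; p maps (s1, s2, t) -> probability."""
    pt, ps1, ps2 = defaultdict(float), defaultdict(float), defaultdict(float)
    ps1t, ps2t, ps12 = defaultdict(float), defaultdict(float), defaultdict(float)
    for (s1, s2, t), v in p.items():
        pt[t] += v; ps1[s1] += v; ps2[s2] += v
        ps1t[s1, t] += v; ps2t[s2, t] += v; ps12[s1, s2] += v

    def mi(pj, pa, pb):
        # mutual information from a joint and its two marginals, in bits
        return sum(v * math.log2(v / (pa[a] * pb[b]))
                   for (a, b), v in pj.items() if v > 0)

    i1, i2 = mi(ps1t, ps1, pt), mi(ps2t, ps2, pt)
    i12 = sum(v * math.log2(v / (ps12[s1, s2] * pt[t]))
              for (s1, s2, t), v in p.items() if v > 0)

    def specific(ps, pst):
        # I_spec(T=t; S): information source S carries about each outcome t
        return {t: sum((pst[a, t] / pt[t]) * math.log2(pst[a, t] / (ps[a] * pt[t]))
                       for a in ps if pst[a, t] > 0) for t in pt}

    sp1, sp2 = specific(ps1, ps1t), specific(ps2, ps2t)
    red = sum(pt[t] * min(sp1[t], sp2[t]) for t in pt)   # I_min redundancy
    return {"redundant": red, "unique1": i1 - red,
            "unique2": i2 - red, "synergy": i12 - i1 - i2 + red}

xor = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}   # pure synergy
copy = {(a, a, a): 0.5 for a in (0, 1)}                       # pure redundancy
```

For XOR the individual mutual informations vanish while the joint information is 1 bit, so everything lands in the synergy term; for the copy distribution the full bit is redundant.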
Affiliation(s)
- Abdullah Makkeh
- Institute of Computer Science, University of Tartu, 51014 Tartu, Estonia
- Daniel Chicharro
- Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, 38068 Rovereto (TN), Italy
- Dirk Oliver Theis
- Institute of Computer Science, University of Tartu, 51014 Tartu, Estonia
- Raul Vicente
- Institute of Computer Science, University of Tartu, 51014 Tartu, Estonia

30
Daube C, Ince RAA, Gross J. Simple Acoustic Features Can Explain Phoneme-Based Predictions of Cortical Responses to Speech. Curr Biol 2019; 29:1924-1937.e9. [PMID: 31130454 PMCID: PMC6584359 DOI: 10.1016/j.cub.2019.04.067] [Citation(s) in RCA: 69] [Impact Index Per Article: 13.8]
Abstract
When we listen to speech, we have to make sense of a waveform of sound pressure. Hierarchical models of speech perception assume that, to extract semantic meaning, the signal is transformed into unknown, intermediate neuronal representations. Traditionally, studies of such intermediate representations are guided by linguistically defined concepts, such as phonemes. Here, we argue that in order to arrive at an unbiased understanding of the neuronal responses to speech, we should focus instead on representations obtained directly from the stimulus. We illustrate our view with a data-driven, information theoretic analysis of a dataset of 24 young, healthy humans who listened to a 1 h narrative while their magnetoencephalogram (MEG) was recorded. We find that two recent results, the improved performance of an encoding model in which annotated linguistic and acoustic features were combined and the decoding of phoneme subgroups from phoneme-locked responses, can be explained by an encoding model that is based entirely on acoustic features. These acoustic features capitalize on acoustic edges and outperform Gabor-filtered spectrograms, which can explicitly describe the spectrotemporal characteristics of individual phonemes. By replicating our results in publicly available electroencephalography (EEG) data, we conclude that models of brain responses based on linguistic features can serve as excellent benchmarks. However, we believe that in order to further our understanding of human cortical responses to speech, we should also explore low-level and parsimonious explanations for apparent high-level phenomena.
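At their core, encoding models of this kind are regularized time-lagged linear regressions from stimulus features to the recorded signal (a temporal response function, TRF). A minimal sketch on synthetic data rather than MEG, with arbitrary illustrative lag range and ridge penalty, might look like this:

```python
import numpy as np

def lagged_design(stim, lags):
    # each column is the stimulus shifted by one lag (zero-padded)
    n = len(stim)
    X = np.zeros((n, len(lags)))
    for j, lag in enumerate(lags):
        if lag >= 0:
            X[lag:, j] = stim[:n - lag]
        else:
            X[:lag, j] = stim[-lag:]
    return X

def fit_trf(stim, resp, lags, alpha=1.0):
    # ridge regression: w = (X'X + alpha*I)^-1 X'y
    X = lagged_design(stim, lags)
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ resp)

# synthetic check: the estimator should recover a known response kernel
rng = np.random.default_rng(1)
stim = rng.standard_normal(5000)
true_kernel = np.array([0.0, 1.0, 0.5, 0.25, 0.0])
lags = range(len(true_kernel))
resp = lagged_design(stim, lags) @ true_kernel + 0.1 * rng.standard_normal(5000)
w = fit_trf(stim, resp, lags)
```

In the comparison the paper describes, only the feature columns change (annotated phoneme indicators versus acoustic-edge features); the regression machinery is the same, which is what makes the model classes directly comparable.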
Affiliation(s)
- Christoph Daube
- Institute of Neuroscience and Psychology, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, UK
- Robin A A Ince
- Institute of Neuroscience and Psychology, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, UK
- Joachim Gross
- Institute of Neuroscience and Psychology, University of Glasgow, 62 Hillhead Street, Glasgow G12 8QB, UK
- Institute for Biomagnetism and Biosignalanalysis, University of Münster, Malmedyweg 15, 48149 Münster, Germany

31
Comparison of short-term heart rate variability indexes evaluated through electrocardiographic and continuous blood pressure monitoring. Med Biol Eng Comput 2019; 57:1247-1263. [PMID: 30730027 DOI: 10.1007/s11517-019-01957-4] [Citation(s) in RCA: 20] [Impact Index Per Article: 4.0]
Abstract
Heart rate variability (HRV) analysis represents an important tool for the characterization of complex cardiovascular control. HRV indexes are usually calculated from electrocardiographic (ECG) recordings after measuring the time duration between consecutive R peaks, and this is considered the gold standard. An alternative method consists of assessing the pulse rate variability (PRV) from signals acquired through photoplethysmography, a technique also employed for the continuous noninvasive monitoring of blood pressure. In this work, we carry out a thorough analysis and comparison of short-term variability indexes computed from HRV time series obtained from the ECG and from PRV time series obtained from continuous blood pressure (CBP) signals, in order to evaluate the reliability of using CBP-based recordings in place of standard ECG recordings. The analysis has been carried out on short time series (300 beats) of HRV and PRV in 76 subjects studied in different conditions: resting in the supine position, postural stress during 45° head-up tilt, and mental stress during the computation of an arithmetic test. Nine different indexes have been taken into account, computed in the time domain (mean, variance, root mean square of successive differences), frequency domain (low-to-high frequency power ratio LF/HF, HF spectral power, and central frequency), and information domain (entropy, conditional entropy, self-entropy). Thorough validation has been performed using comparison of the HRV and PRV distributions, robust linear regression, and Bland-Altman plots. Results demonstrate the feasibility of extracting HRV indexes from CBP-based data, showing an overall relatively good agreement of time-, frequency-, and information-domain measures. The agreement decreased during postural and mental arithmetic stress, especially with regard to the band-power ratio, conditional entropy, and self-entropy. This finding suggests caution in adopting PRV as a surrogate of HRV during stress conditions.
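The time-domain indexes mentioned in the abstract are simple statistics of the interbeat-interval series. A minimal sketch, with illustrative RR values in milliseconds rather than data from the study:

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Mean RR, SDNN (standard deviation), and RMSSD of an RR-interval series."""
    rr = np.asarray(rr_ms, dtype=float)
    d = np.diff(rr)  # successive differences between adjacent intervals
    return {
        "mean_rr": rr.mean(),               # mean interbeat interval (ms)
        "sdnn": rr.std(ddof=1),             # overall variability (ms)
        "rmssd": np.sqrt(np.mean(d ** 2)),  # beat-to-beat variability (ms)
    }

indexes = hrv_time_domain([800, 810, 790, 805, 795, 820])
```

The same RR series feeds the frequency- and information-domain indexes; only the downstream estimator changes (spectral analysis versus entropy measures).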
32
Faes L, Bari V, Ranucci M, Porta A. Multiscale Decomposition of Cardiovascular and Cardiorespiratory Information Transfer under General Anesthesia. ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY 2018; 2018:4607-4610. [PMID: 30441378 DOI: 10.1109/embc.2018.8513191] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7]
Abstract
The analysis of short-term cardiovascular and cardiorespiratory regulation during altered conscious states, such as those induced by anesthesia, requires time series analysis methods able to deal with the multivariate and multiscale nature of the observed dynamics. To meet this requirement, the present study exploits the extension to multiscale analysis of recently proposed information decomposition methods, which quantify, from short realizations, the amounts of joint, unique, redundant and synergistic information transferred within multivariate time series. These methods were applied to the spontaneous variability of heart period (HP), systolic arterial pressure (SAP) and respiration (RESP) in patients undergoing coronary artery bypass graft surgery, monitored before and after the induction of general anesthesia. We found that, after anesthesia induction, information is processed within the cardiovascular network in a scale-dependent way: at short time scales, a shift from synergistic to redundant information transferred from SAP and RESP to HP occurs, which is associated with enhanced baroreflex-mediated respiratory effects on arterial pressure; at longer time scales, the increased information transfer from SAP to HP denotes an enhancement of the baroreflex coupling related to slow cardiovascular oscillations.
33
Information Theory and Cognition: A Review. ENTROPY 2018; 20:e20090706. [PMID: 33265795 PMCID: PMC7513233 DOI: 10.3390/e20090706] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.5]
Abstract
We examine how information theory has been used to study cognition over the last seven decades. After an initial burst of activity in the 1950s, the backlash that followed stopped most work in this area. The last couple of decades have seen both a revival of interest and a more firmly grounded, experimentally justified use of information theory. We can view cognition as the process of transforming perceptions into information, where we use "information" in the colloquial sense of the word. This last clarification points to one of the problems we run into when trying to use information-theoretic principles to understand or analyze cognition: information theory is mathematical, while cognition is a subjective phenomenon. It is relatively easy to discern a subjective connection between cognition and information; it is a different matter altogether to apply the rigor of information theory to the process of cognition. In this paper, we look at the many ways in which people have tried to alleviate this problem. These approaches range from narrowing the focus to only quantifiable aspects of cognition to borrowing conceptual machinery from information theory to address issues of cognition. We describe applications of information theory across a range of cognition research, from neural coding to cognitive control and predictive coding.
34
Zubler F, Seiler A, Horvath T, Roth C, Miano S, Rummel C, Gast H, Nobili L, Schindler KA, Bassetti CL. Stroke causes a transient imbalance of interhemispheric information flow in EEG during non-REM sleep. Clin Neurophysiol 2018; 129:1418-1426. [DOI: 10.1016/j.clinph.2018.03.038] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2]

35
Towards understanding the complexity of cardiovascular oscillations: Insights from information theory. Comput Biol Med 2018; 98:48-57. [PMID: 29763765 DOI: 10.1016/j.compbiomed.2018.05.007] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.0]
Abstract
Cardiovascular complexity is a feature of healthy physiological regulation, which stems from the simultaneous activity of several cardiovascular reflexes and other non-reflex physiological mechanisms. It is manifested in the rich dynamics characterizing the spontaneous heart rate and blood pressure variability (HRV and BPV). The present study faces the challenge of disclosing the origin of short-term HRV and BPV from the statistical perspective offered by information theory. To dissect the physiological mechanisms giving rise to cardiovascular complexity in different conditions, measures of predictive information, information storage, information transfer and information modification were applied to the beat-to-beat variability of heart period (HP), systolic arterial pressure (SAP) and the respiratory volume signal recorded non-invasively in 61 healthy young subjects at supine rest and during head-up tilt (HUT) and mental arithmetics (MA). Information decomposition enabled the simultaneous assessment of several expected and newly inferred physiological phenomena, including: (i) the decreased complexity of HP during HUT and the increased complexity of SAP during MA; (ii) the suppressed cardiorespiratory information transfer, related to weakened respiratory sinus arrhythmia, under both challenges; (iii) the altered balance of the information transferred along the two arms of the cardiovascular loop during HUT, with larger baroreflex involvement and smaller feedforward mechanical effects; and (iv) an increased importance of direct respiratory effects on SAP during HUT, and on both HP and SAP during MA. We demonstrate that a decomposition of the information contained in cardiovascular oscillations can reveal subtle changes in system dynamics and improve our understanding of the complexity changes during physiological challenges.
36
Lizier JT, Bertschinger N, Jost J, Wibral M. Information Decomposition of Target Effects from Multi-Source Interactions: Perspectives on Previous, Current and Future Work. ENTROPY 2018; 20:e20040307. [PMID: 33265398 PMCID: PMC7512824 DOI: 10.3390/e20040307] [Citation(s) in RCA: 53] [Impact Index Per Article: 8.8]
Abstract
The formulation of the Partial Information Decomposition (PID) framework by Williams and Beer in 2010 attracted a significant amount of attention to the problem of defining redundant (or shared), unique and synergistic (or complementary) components of mutual information that a set of source variables provides about a target. This attention resulted in a number of measures proposed to capture these concepts, theoretical investigations into such measures, and applications to empirical data (in particular to datasets from neuroscience). In this Special Issue on “Information Decomposition of Target Effects from Multi-Source Interactions” at Entropy, we have gathered current work on such information decomposition approaches from many of the leading research groups in the field. We begin our editorial by providing the reader with a review of previous information decomposition research, including an overview of the variety of measures proposed, how they have been interpreted and applied to empirical investigations. We then introduce the articles included in the special issue one by one, providing a similar categorisation of these articles into: i. proposals of new measures; ii. theoretical investigations into properties and interpretations of such approaches, and iii. applications of these measures in empirical studies. We finish by providing an outlook on the future of the field.
Affiliation(s)
- Joseph T. Lizier
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering & IT, The University of Sydney, NSW 2006, Australia
- Correspondence: Tel. +61-2-9351-3208
- Nils Bertschinger
- Frankfurt Institute of Advanced Studies (FIAS) and Goethe University, 60438 Frankfurt am Main, Germany
- Jürgen Jost
- Max Planck Institute for Mathematics in the Sciences, Inselstraße 22, 04103 Leipzig, Germany
- Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, NM 87501, USA
- Michael Wibral
- MEG Unit, Brain Imaging Center, Goethe University, 60528 Frankfurt, Germany
- Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany

37
Brodski-Guerniero A, Naumer MJ, Moliadze V, Chan J, Althen H, Ferreira-Santos F, Lizier JT, Schlitt S, Kitzerow J, Schütz M, Langer A, Kaiser J, Freitag CM, Wibral M. Predictable information in neural signals during resting state is reduced in autism spectrum disorder. Hum Brain Mapp 2018; 39:3227-3240. [PMID: 29617056 DOI: 10.1002/hbm.24072] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0]
Abstract
The neurophysiological underpinnings of the nonsocial symptoms of autism spectrum disorder (ASD), which include sensory and perceptual atypicalities, remain poorly understood. Well-known accounts of less dominant top-down influences and more dominant bottom-up processes compete to explain these characteristics. These accounts have recently been embedded in the popular framework of predictive coding theory. To differentiate between competing accounts, we studied altered information dynamics in ASD by quantifying predictable information in neural signals. Predictable information in neural signals measures the amount of stored information that is used for the next time step of a neural process. Thus, predictable information limits the (prior) information which might be available to other brain areas, for example, to build predictions for upcoming sensory information. We studied predictable information in neural signals based on resting-state magnetoencephalography (MEG) recordings of 19 ASD patients and 19 neurotypical controls aged between 14 and 27 years. Using whole-brain beamformer source analysis, we found reduced predictable information in ASD patients across the whole brain, but in particular in posterior regions of the default mode network. In these regions, epoch-by-epoch predictable information was positively correlated with source power in the alpha and beta frequency range as well as with autocorrelation decay time. Predictable information in the precuneus and cerebellum was negatively associated with nonsocial symptom severity, indicating the relevance of the analysis of predictable information for clinical research in ASD. Our findings are compatible with the assumption that the use or precision of prior knowledge is reduced in ASD patients.
Affiliation(s)
- Marcus J Naumer
- Institute of Medical Psychology, Faculty of Medicine, Goethe University, Frankfurt am Main, Germany
- Vera Moliadze
- Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, Autism Research and Intervention Center of Excellence, University Hospital Frankfurt, Goethe University, Frankfurt am Main, Germany
- Department of Medical Psychology and Medical Sociology, Schleswig-Holstein University Hospital (UKSH), Christian-Albrechts-University, Kiel, Germany
- Jason Chan
- Institute of Medical Psychology, Faculty of Medicine, Goethe University, Frankfurt am Main, Germany
- Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, Autism Research and Intervention Center of Excellence, University Hospital Frankfurt, Goethe University, Frankfurt am Main, Germany
- School of Applied Psychology, University College Cork, Cork, Ireland
- Heike Althen
- Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, Autism Research and Intervention Center of Excellence, University Hospital Frankfurt, Goethe University, Frankfurt am Main, Germany
- Fernando Ferreira-Santos
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, University of Porto, Porto, Portugal
- Joseph T Lizier
- Complex Systems Research Group and Centre for Complex Systems, Faculty of Engineering & IT, The University of Sydney, New South Wales, 2006, Australia
- Sabine Schlitt
- Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, Autism Research and Intervention Center of Excellence, University Hospital Frankfurt, Goethe University, Frankfurt am Main, Germany
- Janina Kitzerow
- Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, Autism Research and Intervention Center of Excellence, University Hospital Frankfurt, Goethe University, Frankfurt am Main, Germany
- Magdalena Schütz
- Institute of Medical Psychology, Faculty of Medicine, Goethe University, Frankfurt am Main, Germany
- Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, Autism Research and Intervention Center of Excellence, University Hospital Frankfurt, Goethe University, Frankfurt am Main, Germany
- Anne Langer
- Institute of Medical Psychology, Faculty of Medicine, Goethe University, Frankfurt am Main, Germany
- Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, Autism Research and Intervention Center of Excellence, University Hospital Frankfurt, Goethe University, Frankfurt am Main, Germany
- Jochen Kaiser
- Institute of Medical Psychology, Faculty of Medicine, Goethe University, Frankfurt am Main, Germany
- Christine M Freitag
- Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, Autism Research and Intervention Center of Excellence, University Hospital Frankfurt, Goethe University, Frankfurt am Main, Germany
- Michael Wibral
- MEG Unit, Brain Imaging Center, Goethe University, Frankfurt am Main, Germany

38
Kostal L, D'Onofrio G. Coordinate invariance as a fundamental constraint on the form of stimulus-specific information measures. BIOLOGICAL CYBERNETICS 2018; 112:13-23. [PMID: 28856427 DOI: 10.1007/s00422-017-0729-7] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2]
Abstract
The value of Shannon's mutual information is commonly used to describe the total amount of information that the neural code transfers between the ensemble of stimuli and the ensemble of neural responses. In addition, it is often desirable to know which features of the stimulus or response are most informative. The literature offers several different decompositions of the mutual information into its stimulus or response-specific components, such as the specific surprise or the uncertainty reduction, but the number of mutually distinct measures is in fact infinite. We resolve this ambiguity by requiring the specific information measures to be invariant under invertible coordinate transformations of the stimulus and the response ensembles. We prove that the Kullback-Leibler divergence is then the only suitable measure of the specific information. On a more general level, we discuss the necessity and the fundamental aspects of the coordinate invariance as a selection principle. We believe that our results will encourage further research into invariant statistical methods for the analysis of neural coding.
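The surviving measure can be written down directly: the specific information of stimulus s is the Kullback-Leibler divergence between the conditional response distribution p(r|s) and the marginal p(r), and averaging it over stimuli recovers the mutual information. A discrete sketch, using an arbitrary illustrative joint distribution:

```python
import numpy as np

def specific_information(p_joint):
    """D_KL(p(r|s) || p(r)) for each stimulus s; rows index s, columns index r."""
    p_joint = np.asarray(p_joint, dtype=float)
    p_s = p_joint.sum(axis=1)                # marginal over stimuli
    p_r = p_joint.sum(axis=0)                # marginal over responses
    p_r_given_s = p_joint / p_s[:, None]
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p_r_given_s > 0,
                         p_r_given_s * np.log2(p_r_given_s / p_r), 0.0)
    return terms.sum(axis=1)                 # bits, one value per stimulus

# arbitrary 2-stimulus, 3-response joint distribution
p = np.array([[0.30, 0.10, 0.10],
              [0.05, 0.25, 0.20]])
i_spec = specific_information(p)
mi = float(np.dot(p.sum(axis=1), i_spec))    # stimulus-average = mutual information
```

Unlike some alternative decompositions (e.g., the uncertainty reduction), each KL term is nonnegative, and its value is unchanged by any invertible relabeling of the stimulus or response alphabets, which is the invariance property the paper singles out.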
Affiliation(s)
- Lubomir Kostal
- Institute of Physiology, Czech Academy of Sciences, Videnska 1083, 14220, Prague 4, Czech Republic
- Giuseppe D'Onofrio
- Institute of Physiology, Czech Academy of Sciences, Videnska 1083, 14220, Prague 4, Czech Republic

39
Crosato E, Jiang L, Lecheval V, Lizier JT, Wang XR, Tichit P, Theraulaz G, Prokopenko M. Informative and misinformative interactions in a school of fish. SWARM INTELLIGENCE 2018. [DOI: 10.1007/s11721-018-0157-x] [Citation(s) in RCA: 30] [Impact Index Per Article: 5.0]

40
Chicharro D, Pica G, Panzeri S. The Identity of Information: How Deterministic Dependencies Constrain Information Synergy and Redundancy. ENTROPY (BASEL, SWITZERLAND) 2018; 20:e20030169. [PMID: 33265260 PMCID: PMC7512685 DOI: 10.3390/e20030169] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2]
Abstract
Understanding how different information sources together transmit information is crucial in many domains. For example, understanding the neural code requires characterizing how different neurons contribute unique, redundant, or synergistic pieces of information about sensory or behavioral variables. Williams and Beer (2010) proposed a partial information decomposition (PID) that separates the mutual information that a set of sources contains about a set of targets into nonnegative terms interpretable as these pieces. Quantifying redundancy requires assigning an identity to different information pieces, to assess when information is common across sources. Harder et al. (2013) proposed an identity axiom that imposes necessary conditions to quantify qualitatively common information. However, Bertschinger et al. (2012) showed that, in a counterexample with deterministic target-source dependencies, the identity axiom is incompatible with ensuring PID nonnegativity. Here, we study systematically the consequences of information identity criteria that assign identity based on associations between target and source variables resulting from deterministic dependencies. We show how these criteria are related to the identity axiom and to previously proposed redundancy measures, and we characterize how they lead to negative PID terms. This constitutes a further step to more explicitly address the role of information identity in the quantification of redundancy. The implications for studying neural coding are discussed.
Affiliation(s)
- Daniel Chicharro
- Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
- Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto (TN) 38068, Italy
- Giuseppe Pica
- Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto (TN) 38068, Italy
- Stefano Panzeri
- Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto (TN) 38068, Italy

41

42

43
Partial and Entropic Information Decompositions of a Neuronal Modulatory Interaction. ENTROPY 2017. [DOI: 10.3390/e19110560] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0]

44
Faes L, Nollo G, Krohova J, Czippelova B, Turianikova Z, Javorka M. Information transfer and information modification to identify the structure of cardiovascular and cardiorespiratory networks. ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY 2017; 2017:1563-1566. [PMID: 29060179 DOI: 10.1109/embc.2017.8037135] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
To fully elucidate the complex physiological mechanisms underlying the short-term autonomic regulation of heart period (H), systolic and diastolic arterial pressure (S, D) and respiratory (R) variability, the joint dynamics of these variables need to be explored using multivariate time series analysis. This study proposes information-theoretic measures to quantify causal interactions between nodes of the cardiovascular/cardiorespiratory network and to assess the nature (synergistic or redundant) of these directed interactions. Indexes of information transfer and information modification are extracted from the H, S, D and R series measured from healthy subjects in a resting state and during postural stress. Computations are performed in the framework of multivariate linear regression, using bootstrap techniques to assess, on a single-subject basis, the statistical significance of each measure and of its transitions across conditions. We find patterns of information transfer and modification which are related to specific cardiovascular and cardiorespiratory mechanisms in resting conditions, and to their modification induced by orthostatic stress.
45
Quantifying Information Modification in Developing Neural Networks via Partial Information Decomposition. ENTROPY 2017. [DOI: 10.3390/e19090494] [Citation(s) in RCA: 32] [Impact Index Per Article: 4.6]

46
Multiscale Information Decomposition: Exact Computation for Multivariate Gaussian Processes. ENTROPY 2017. [DOI: 10.3390/e19080408] [Citation(s) in RCA: 51] [Impact Index Per Article: 7.3] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
|
47
|
Wollstadt P, Sellers KK, Rudelt L, Priesemann V, Hutt A, Fröhlich F, Wibral M. Breakdown of local information processing may underlie isoflurane anesthesia effects. PLoS Comput Biol 2017; 13:e1005511. [PMID: 28570661 PMCID: PMC5453425 DOI: 10.1371/journal.pcbi.1005511] [Citation(s) in RCA: 33] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/25/2016] [Accepted: 04/11/2017] [Indexed: 02/07/2023] Open
Abstract
The disruption of coupling between brain areas has been suggested as the mechanism underlying loss of consciousness in anesthesia. This hypothesis has been tested previously by measuring the information transfer between brain areas, and by taking reduced information transfer as a proxy for decoupling. Yet, information transfer is a function of the amount of information available in the information source—such that transfer decreases even for unchanged coupling when less source information is available. Therefore, we reconsidered past interpretations of reduced information transfer as a sign of decoupling, and asked whether impaired local information processing leads to a loss of information transfer. An important prediction of this alternative hypothesis is that changes in locally available information (signal entropy) should be at least as pronounced as changes in information transfer. We tested this prediction by recording local field potentials in two ferrets after administration of isoflurane in concentrations of 0.0%, 0.5%, and 1.0%. We found strong decreases in the source entropy under isoflurane in area V1 and the prefrontal cortex (PFC)—as predicted by our alternative hypothesis. The decrease in source entropy was stronger in PFC compared to V1. Information transfer between V1 and PFC was reduced bidirectionally, but with a stronger decrease from PFC to V1. This links the stronger decrease in information transfer to the stronger decrease in source entropy—suggesting that reduced source entropy reduces information transfer. This conclusion fits the observation that the synaptic targets of isoflurane are located in local cortical circuits rather than on the synapses formed by interareal axonal projections. Thus, changes in information transfer under isoflurane seem to be a consequence of changes in local processing more than of decoupling between brain areas. We suggest that source entropy changes must be considered whenever interpreting changes in information transfer as decoupling.

Currently we do not understand how anesthesia leads to loss of consciousness (LOC). One popular idea is that we lose consciousness when brain areas lose their ability to communicate with each other, as anesthetics might interrupt transmission along the nerve fibers coupling them. This idea has been tested by measuring the amount of information transferred between brain areas, and taking this transfer to reflect the coupling itself. Yet, information that isn't available in the source area can't be transferred to a target. Hence, the decreases in information transfer could be related to less information being available in the source, rather than to a decoupling. We tested this possibility by measuring the information available in source brain areas and found that it decreased under isoflurane anesthesia. In addition, a stronger decrease in source information led to a stronger decrease in the information transferred. Thus, the input to the connection between brain areas determined the communicated information, not the strength of the coupling (which would result in a stronger decrease in the target). We suggest that interrupted information processing within brain areas contributes importantly to LOC, and should receive more attention in attempts to understand loss of consciousness under anesthesia.
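The paper's central point, that transfer is bounded by what the source makes available, can be illustrated analytically for Gaussian signals. A toy sketch (the "awake"/"anesthetized" variances below are arbitrary illustrative numbers chosen by me, not values from the study):

```python
import numpy as np

def gauss_entropy(var):
    """Differential entropy of a Gaussian, H = 0.5*ln(2*pi*e*var), in nats."""
    return 0.5 * np.log(2 * np.pi * np.e * var)

def gauss_channel_info(var_source, var_noise):
    """Mutual information across a fixed additive-noise coupling:
    I(X; X+N) = 0.5*ln(1 + var_source/var_noise)."""
    return 0.5 * np.log(1 + var_source / var_noise)

# same coupling (same noise variance), but a lower-entropy source:
# the transmitted information drops even though the coupling is unchanged
h_awake = gauss_entropy(1.0)
h_anes = gauss_entropy(0.25)
i_awake = gauss_channel_info(1.0, 0.5)
i_anes = gauss_channel_info(0.25, 0.5)
```

Here the drop from `i_awake` to `i_anes` is driven entirely by the reduced source entropy, mirroring the interpretation the authors argue for.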
Affiliation(s)
- Patricia Wollstadt
- MEG Unit, Brain Imaging Center, Goethe University, Frankfurt/Main, Germany
- * E-mail: (PW); (VP)
| | - Kristin K. Sellers
- Department of Psychiatry, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA
- Neurobiology Curriculum, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA
| | - Lucas Rudelt
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
| | - Viola Priesemann
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, BCCN, Göttingen, Germany
- * E-mail: (PW); (VP)
| | - Axel Hutt
- Deutscher Wetterdienst, Section FE 12 - Data Assimilation, Offenbach/Main, Germany
- Department of Mathematics and Statistics, University of Reading, Reading, United Kingdom
| | - Flavio Fröhlich
- Department of Psychiatry, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA
- Neurobiology Curriculum, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA
- Department of Cell Biology and Physiology, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA
- Department of Biomedical Engineering, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA
- Neuroscience Center, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA
- Department of Neurology, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA
| | - Michael Wibral
- MEG Unit, Brain Imaging Center, Goethe University, Frankfurt/Main, Germany
| |
|
48
|
Xiong W, Faes L, Ivanov PC. Entropy measures, entropy estimators, and their performance in quantifying complex dynamics: Effects of artifacts, nonstationarity, and long-range correlations. Phys Rev E 2017; 95:062114. [PMID: 28709192 PMCID: PMC6117159 DOI: 10.1103/physreve.95.062114] [Citation(s) in RCA: 85] [Impact Index Per Article: 12.1] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/25/2017] [Indexed: 11/07/2022]
Abstract
Entropy measures are widely applied to quantify the complexity of dynamical systems in diverse fields. However, the practical application of entropy methods is challenging, due to the variety of entropy measures and estimators and the complexity of real-world time series, including nonstationarities and long-range correlations (LRC). We conduct a systematic study on the performance, bias, and limitations of three basic measures (entropy, conditional entropy, information storage) and three traditionally used estimators (linear, kernel, nearest neighbor). We investigate the dependence of entropy measures on estimator- and process-specific parameters, and we show the effects of three types of nonstationarities due to artifacts (trends, spikes, local variance change) in simulations of stochastic autoregressive processes. We also analyze the impact of LRC on the theoretical and estimated values of entropy measures. Finally, we apply entropy methods on heart rate variability data from subjects in different physiological states and clinical conditions. We find that entropy measures can only differentiate changes of specific types in cardiac dynamics and that appropriate preprocessing is vital for correct estimation and interpretation. Demonstrating the limitations of entropy methods and shedding light on how to mitigate bias and provide correct interpretations of results, this work can serve as a comprehensive reference for the application of entropy methods and the evaluation of existing studies.
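For the linear estimator applied to a Gaussian process, the three basic measures named above have simple closed forms. A sketch on simulated AR(1) data (my own illustration, not the paper's code; the AR coefficients are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

def linear_measures(x):
    """Linear (Gaussian) estimates, in nats, of the three basic measures:
    entropy H(X), conditional entropy H(X_n | X_{n-1}), and
    information storage S = H - H_cond."""
    h = 0.5 * np.log(2 * np.pi * np.e * np.var(x))
    y, past = x[1:], x[:-1]
    A = np.column_stack([np.ones_like(past), past])
    resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    h_cond = 0.5 * np.log(2 * np.pi * np.e * np.var(resid))
    return h, h_cond, h - h_cond

def ar1(phi, n=10000):
    """Simulate a stationary AR(1) process x[i] = phi*x[i-1] + e[i]."""
    e = rng.normal(size=n)
    x = np.empty(n)
    x[0] = e[0]
    for i in range(1, n):
        x[i] = phi * x[i - 1] + e[i]
    return x

# stronger autocorrelation -> more of the signal's information is stored
_, _, s_weak = linear_measures(ar1(0.2))
_, _, s_strong = linear_measures(ar1(0.8))
```

Adding an un-removed trend or spike to `x` before calling `linear_measures` inflates `np.var(x)` and biases all three estimates, which is one concrete form of the preprocessing sensitivity the study documents.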
Affiliation(s)
- Wanting Xiong
- School of Systems Science, Beijing Normal University, Beijing 100875, People’s Republic of China
- Keck Laboratory for Network Physiology, Department of Physics, Boston University, Boston, Massachusetts 02215, USA
| | - Luca Faes
- Bruno Kessler Foundation and BIOtech, University of Trento, Trento 38123, Italy
| | - Plamen Ch. Ivanov
- Keck Laboratory for Network Physiology, Department of Physics, Boston University, Boston, Massachusetts 02215, USA
- Harvard Medical School and Division of Sleep Medicine, Brigham and Women’s Hospital, Boston, Massachusetts 02115, USA
- Institute of Solid State Physics, Bulgarian Academy of Sciences, Sofia 1784, Bulgaria
| |
|
49
|
Quantifying Synergistic Information Using Intermediate Stochastic Variables. ENTROPY 2017. [DOI: 10.3390/e19020085] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
|
50
|
Information Decomposition in Multivariate Systems: Definitions, Implementation and Application to Cardiovascular Networks. ENTROPY 2016. [DOI: 10.3390/e19010005] [Citation(s) in RCA: 40] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/23/2022]
|