1
Bird AD, Cuntz H, Jedlicka P. Robust and consistent measures of pattern separation based on information theory and demonstrated in the dentate gyrus. PLoS Comput Biol 2024; 20:e1010706. [PMID: 38377108] [PMCID: PMC10906873] [DOI: 10.1371/journal.pcbi.1010706]
Abstract
Pattern separation is a valuable computational function performed by neuronal circuits, such as the dentate gyrus, in which dissimilarity between inputs is increased, reducing noise and increasing the storage capacity of downstream networks. Pattern separation is studied from both in vivo experimental and computational perspectives, and a number of different measures (such as orthogonalisation, decorrelation, or spike train distance) have been applied to quantify it. However, these measures are known to give conclusions that differ qualitatively depending on the choice of measure and the parameters used to calculate it. Here we demonstrate that arbitrarily increasing sparsity, a noticeable feature of dentate granule cell firing and one that is believed to be key to pattern separation, typically improves classical measures of pattern separation even, inappropriately, up to the point where almost all information about the inputs is lost. Standard measures therefore cannot differentiate between pattern separation and pattern destruction, and give results that may depend on arbitrary parameter choices. We propose that techniques from information theory, in particular mutual information, transfer entropy, and redundancy, should be applied to penalise the potential loss of information (often due to increased sparsity) that is neglected by existing measures. We compare five commonly used measures of pattern separation with three novel techniques based on information theory, showing that the latter can be applied in a principled way and provide a robust and reliable means of comparing the pattern separation performance of different neurons and networks. We demonstrate our new measures on detailed compartmental models of individual dentate granule cells and a dentate microcircuit, and show how structural changes associated with epilepsy affect pattern separation performance. We also demonstrate how our measures of pattern separation can predict pattern completion accuracy. Overall, our measures solve a widely acknowledged problem in assessing the pattern separation of neural circuits such as the dentate gyrus, as well as the cerebellum and mushroom body. Finally, we provide a publicly available toolbox allowing easy analysis of pattern separation in spike train ensembles.
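The abstract's central contrast, that decorrelation-style measures reward silencing while mutual information penalises it, can be sketched with a plug-in estimate on symbolic "spike patterns" (a toy illustration only; the published measures and toolbox operate on spike-train ensembles, and all names here are hypothetical):

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in mutual information (bits) over a list of (input, output) pairs."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Four equally likely input patterns (2 bits of information).
inputs = [0, 1, 2, 3] * 100

# A separator maps each input to a distinct (e.g. sparse) output code:
separated = [(x, x) for x in inputs]
# A "destroyer" silences almost everything, collapsing all inputs to one output:
destroyed = [(x, 0) for x in inputs]

print(mutual_information(separated))  # 2.0 bits preserved
print(mutual_information(destroyed))  # 0.0 bits: maximally sparse, but destroyed
```

The destroyed code would score well on sparsity-sensitive separation measures while carrying no information about the inputs, which is exactly the failure mode the information-theoretic measures are designed to penalise.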
Collapse
Affiliation(s)
- Alexander D. Bird
  - Computer-Based Modelling in the field of 3R Animal Protection, ICAR3R, Faculty of Medicine, Justus Liebig University, Giessen, Germany
  - Ernst Strüngmann Institute (ESI) for Neuroscience in cooperation with the Max Planck Society, Frankfurt-am-Main, Germany
  - Frankfurt Institute for Advanced Studies, Frankfurt-am-Main, Germany
  - Translational Neuroscience Network Giessen, Germany
- Hermann Cuntz
  - Ernst Strüngmann Institute (ESI) for Neuroscience in cooperation with the Max Planck Society, Frankfurt-am-Main, Germany
  - Frankfurt Institute for Advanced Studies, Frankfurt-am-Main, Germany
  - Translational Neuroscience Network Giessen, Germany
- Peter Jedlicka
  - Computer-Based Modelling in the field of 3R Animal Protection, ICAR3R, Faculty of Medicine, Justus Liebig University, Giessen, Germany
  - Translational Neuroscience Network Giessen, Germany
2
Griffa A, Mach M, Dedelley J, Gutierrez-Barragan D, Gozzi A, Allali G, Grandjean J, Van De Ville D, Amico E. Evidence for increased parallel information transmission in human brain networks compared to macaques and male mice. Nat Commun 2023; 14:8216. [PMID: 38081838] [PMCID: PMC10713651] [DOI: 10.1038/s41467-023-43971-z]
Abstract
Brain communication, defined as information transmission through white-matter connections, is at the foundation of the brain's computational capacities that subtend almost all aspects of behavior: from sensory perception shared across mammalian species, to complex cognitive functions in humans. How did communication strategies in macroscale brain networks adapt across evolution to accomplish increasingly complex functions? By applying a graph- and information-theory approach to assess information-related pathways in male mouse, macaque and human brains, we show a brain communication gap between selective information transmission in non-human mammals, where brain regions share information through single polysynaptic pathways, and parallel information transmission in humans, where regions share information through multiple parallel pathways. In humans, parallel transmission acts as a major connector between unimodal and transmodal systems. The layout of information-related pathways is unique to individuals across different mammalian species, pointing to the individual-level specificity of information-routing architecture. Our work provides evidence that different communication patterns are tied to the evolution of mammalian brain networks.
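The distinction drawn here between single-pathway and parallel transmission can be illustrated by counting distinct shortest paths between two nodes of a toy connectome (a sketch of the graph-theoretic idea only; the study's actual pipeline combines diffusion connectomes with information-theoretic measures):

```python
from collections import deque

def shortest_path_count(adj, src, dst):
    """Count distinct shortest paths from src to dst in an unweighted graph (BFS)."""
    dist, count = {src: 0}, {src: 1}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:                 # first visit: record distance
                dist[v] = dist[u] + 1
                count[v] = count[u]
                queue.append(v)
            elif dist[v] == dist[u] + 1:      # another equally short route
                count[v] += count[u]
    return count.get(dst, 0)

# "Selective" motif: one polysynaptic chain A-B-C.
selective = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
# "Parallel" motif: two equally short routes A-B1-C and A-B2-C.
parallel = {"A": ["B1", "B2"], "B1": ["A", "C"], "B2": ["A", "C"], "C": ["B1", "B2"]}

print(shortest_path_count(selective, "A", "C"))  # 1
print(shortest_path_count(parallel, "A", "C"))   # 2
```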
Affiliation(s)
- Alessandra Griffa
  - Leenaards Memory Center, Lausanne University Hospital and University of Lausanne, Lausanne, Switzerland
  - Medical Image Processing Laboratory, Neuro-X Institute, École Polytechnique Fédérale De Lausanne (EPFL), Geneva, Switzerland
  - Department of Radiology and Medical Informatics, University of Geneva, Geneva, Switzerland
- Mathieu Mach
  - Medical Image Processing Laboratory, Neuro-X Institute, École Polytechnique Fédérale De Lausanne (EPFL), Geneva, Switzerland
- Julien Dedelley
  - Medical Image Processing Laboratory, Neuro-X Institute, École Polytechnique Fédérale De Lausanne (EPFL), Geneva, Switzerland
- Daniel Gutierrez-Barragan
  - Functional Neuroimaging Laboratory, Center for Neuroscience and Cognitive Systems, Istituto Italiano di Tecnologia, Rovereto, Italy
- Alessandro Gozzi
  - Functional Neuroimaging Laboratory, Center for Neuroscience and Cognitive Systems, Istituto Italiano di Tecnologia, Rovereto, Italy
- Gilles Allali
  - Leenaards Memory Center, Lausanne University Hospital and University of Lausanne, Lausanne, Switzerland
- Joanes Grandjean
  - Department of Medical Imaging, Radboud University Medical Center, 6525 GA, Nijmegen, The Netherlands
  - Donders Institute for Brain, Cognition and Behaviour, Radboud University Medical Center, 6525 EN, Nijmegen, The Netherlands
- Dimitri Van De Ville
  - Medical Image Processing Laboratory, Neuro-X Institute, École Polytechnique Fédérale De Lausanne (EPFL), Geneva, Switzerland
  - Department of Radiology and Medical Informatics, University of Geneva, Geneva, Switzerland
- Enrico Amico
  - Medical Image Processing Laboratory, Neuro-X Institute, École Polytechnique Fédérale De Lausanne (EPFL), Geneva, Switzerland
  - Department of Radiology and Medical Informatics, University of Geneva, Geneva, Switzerland
3
Novitskaya Y, Dümpelmann M, Schulze-Bonhage A. Physiological and pathological neuronal connectivity in the living human brain based on intracranial EEG signals: the current state of research. Front Netw Physiol 2023; 3:1297345. [PMID: 38107334] [PMCID: PMC10723837] [DOI: 10.3389/fnetp.2023.1297345]
Abstract
Over the past decades, studies of human brain networks have received growing attention, as assessing and modelling connectivity in the brain is a topic of high impact with potential applications in understanding human brain organization under both physiological and various pathological conditions. In specific diagnostic settings, human neuronal signals can be obtained from intracranial EEG (iEEG) recordings in epilepsy patients, allowing insight into the functional organisation of the living human brain. There are two approaches to assessing brain connectivity from iEEG signals: evaluation of spontaneous neuronal oscillations during ongoing physiological and pathological brain activity, and analysis of the electrophysiological cortico-cortical neuronal responses evoked by single pulse electrical stimulation (SPES). Both methods have their own advantages and limitations. This paper outlines the available methodological approaches and provides an overview of current findings in studies of physiological and pathological human brain networks based on intracranial EEG recordings.
Affiliation(s)
- Yulia Novitskaya
  - Epilepsy Center, Department of Neurosurgery, Medical Center—University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Matthias Dümpelmann
  - Epilepsy Center, Department of Neurosurgery, Medical Center—University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
  - Department of Microsystems Engineering (IMTEK), University of Freiburg, Freiburg, Germany
- Andreas Schulze-Bonhage
  - Epilepsy Center, Department of Neurosurgery, Medical Center—University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
  - Center for Basics in NeuroModulation, Faculty of Medicine, University of Freiburg, Freiburg, Germany
4
Abbasi S, Wolff A, Çatal Y, Northoff G. Increased noise relates to abnormal excitation-inhibition balance in schizophrenia: a combined empirical and computational study. Cereb Cortex 2023; 33:10477-10491. [PMID: 37562844] [PMCID: PMC10560578] [DOI: 10.1093/cercor/bhad297]
Abstract
Electroencephalography studies link sensory processing issues in schizophrenia to an increased noise level (noise here being background spontaneous activity), as measured by the signal-to-noise ratio. The mechanism behind such increased noise, however, is unknown. We investigate whether it relates to changes in the cortical excitation-inhibition balance, which has been observed to be atypical in schizophrenia, by combining electroencephalography and computational modeling. Our electroencephalography task results, for which local field potentials can be used as a proxy, show a lower signal-to-noise ratio due to higher noise in schizophrenia. Both electroencephalography rest and task states exhibit higher levels of excitation in the functional excitation-inhibition measure (a proxy of excitation-inhibition balance). This suggests a relationship between increased noise and atypical excitation in schizophrenia, which we addressed using computational modeling. A leaky integrate-and-fire model was used to simulate the effects of varying degrees of noise on the excitation-inhibition balance, local field potential, NMDA current, and spike rate. Results show a noise-related increase in the local field potential, excitation in the excitation-inhibition balance, pyramidal NMDA current, and spike rate. Mutual information and mediation analysis were used to explore a cross-level relationship, showing that the cortical local field potential plays a key role in transferring the effect of noise to the cellular population level of NMDA.
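A minimal leaky integrate-and-fire sketch shows the qualitative effect exploited in the modeling: with subthreshold drive, injected noise alone pushes the neuron across threshold and raises its spike rate (all parameter values are illustrative assumptions, not those of the study):

```python
import random

def lif_spike_rate(noise_sd, i_drive=0.9, n_steps=50_000, dt=1e-4,
                   tau=0.02, v_thresh=1.0, v_reset=0.0, seed=0):
    """Leaky integrate-and-fire neuron, dV = dt * (-V + I + noise) / tau.

    Returns the firing rate in spikes per second of simulated time.
    """
    rng = random.Random(seed)
    v, spikes = 0.0, 0
    for _ in range(n_steps):
        v += dt * (-v + i_drive + noise_sd * rng.gauss(0.0, 1.0)) / tau
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes / (n_steps * dt)

# With subthreshold drive (i_drive < v_thresh) the neuron is silent without
# noise; adding noise produces spikes, and the rate grows with noise_sd.
print(lif_spike_rate(0.0))  # 0.0
print(lif_spike_rate(2.0))  # positive rate, driven purely by noise
```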
Affiliation(s)
- Samira Abbasi
  - University of Ottawa, Institute of Mental Health Research, Ottawa ON K1Z 7K4, Canada
  - Department of Biomedical Engineering, Hamedan University of Technology, Hamedan 65169-13733, Iran
- Annemarie Wolff
  - University of Ottawa, Institute of Mental Health Research, Ottawa ON K1Z 7K4, Canada
- Yasir Çatal
  - University of Ottawa, Institute of Mental Health Research, Ottawa ON K1Z 7K4, Canada
- Georg Northoff
  - University of Ottawa, Institute of Mental Health Research, Ottawa ON K1Z 7K4, Canada
5
Bellingrath JE. The Self-Simulational Theory of temporal extension. Neurosci Conscious 2023; 2023:niad015. [PMID: 37342236] [PMCID: PMC10279415] [DOI: 10.1093/nc/niad015]
Abstract
Subjective experience is experience in time. Unfolding in a continuous river of moments, our experience consists not only in the changing phenomenological content per se, but also in the retrodiction and prospection of the moments that immediately precede and follow it. It is in this way that William James's 'specious present' presents itself as extending between past and future. While the phenomenology of temporality always happens, in normal waking states, to someone, and the notions of self-representation and temporal experience have long been associated with each other, there has not yet been an explicit account of their relationship. In this paper, the emergence of the subjective experience of temporal extension is conceived of as arising out of a difference-relation between counterfactual and actual self-representations. After presenting the proposed relationship on both a conceptual level and a formalized, neuronally realistic level of description using information theory, convergent empirical evidence from general findings about temporal experience and inference, altered states of consciousness, and mental illness is examined. The Self-Simulational Theory of temporal extension is able to explain systematic variations in the subjectively experienced length of the temporal Now across numerous domains, and holds potentially wide implications for the neuroscience of consciousness, as well as for a deeper understanding of different forms of mental illness.
Affiliation(s)
- Jan Erik Bellingrath (corresponding author)
  - Center for Mind and Cognition, Ruhr-Universität Bochum, Universitätsstraße 150, 44801 Bochum, Nord-Rhein-Westfalen, Germany
6
Lueckel JM, Upadhyay N, Purrer V, Maurer A, Borger V, Radbruch A, Attenberger U, Wuellner U, Panda R, Boecker H. Whole-brain network transitions within the framework of ignition and transfer entropy following VIM-MRgFUS in essential tremor patients. Brain Stimul 2023; 16:879-888. [PMID: 37230462] [DOI: 10.1016/j.brs.2023.05.006]
Abstract
Magnetic resonance-guided focused ultrasound (MRgFUS) lesioning of the ventralis intermedius nucleus (VIM) has shown promise in treating drug-refractory essential tremor (ET). It remains unknown whether focal VIM lesions made by MRgFUS have broader restorative effects on information flow within the whole-brain network of ET patients. We applied an information-theoretical approach based on intrinsic ignition and the concept of transfer entropy (TE) to assess spatiotemporal dynamics after VIM-MRgFUS. Eighteen ET patients (mean age 71.44 years) underwent repeated 3T resting-state functional magnetic resonance imaging combined with Clinical Rating Scale for Tremor (CRST) assessments one day before (T0), one month after (T1), and six months after (T2) MRgFUS. We observed increased whole-brain ignition-driven mean integration (IDMI) at T1 (p < 0.05), along with trend increases at T2. Further, constraining the analysis to motor network nodes, we identified significant increases in information broadcasting (bilateral supplementary motor area (SMA) and left cerebellar lobule III) and information receiving (right precentral gyrus) at T1. Remarkably, increased information broadcasting in bilateral SMA was correlated with relative improvement of the CRST score in the treated hand. In addition, causal TE-based effective connectivity (EC) at T1 showed an increase from right SMA to left cerebellar lobule crus II and from left cerebellar lobule III to right thalamus. In conclusion, the results suggest a change in information transmission capacity in ET after MRgFUS and a shift towards a more integrated functional state with increased levels of global and directional information flow.
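Transfer entropy, the directed measure underlying the effective-connectivity analysis above, can be estimated for discrete sequences with a plug-in formula (history length 1; a sketch of the general quantity, not the authors' estimator for fMRI time series):

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y) in bits, with history length 1."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))      # (y_next, y_past, x_past)
    n = len(triples)
    c_all = Counter(triples)
    c_yx = Counter((yp, xp) for _, yp, xp in triples)
    c_nn = Counter((yn, yp) for yn, yp, _ in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    te = 0.0
    for (yn, yp, xp), c in c_all.items():
        p_cond_full = c / c_yx[(yp, xp)]            # p(y_next | y_past, x_past)
        p_cond_self = c_nn[(yn, yp)] / c_y[yp]      # p(y_next | y_past)
        te += (c / n) * math.log2(p_cond_full / p_cond_self)
    return te

rng = random.Random(1)
x = [rng.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]                                    # Y copies X with a one-step lag

print(transfer_entropy(x, y))  # close to 1 bit per step: X drives Y
print(transfer_entropy(y, x))  # close to 0: no information flows back
```

The asymmetry of the two estimates is what makes TE a directional (effective, rather than merely functional) connectivity measure.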
Affiliation(s)
- Julia M Lueckel
  - Clinical Functional Imaging Group, Department of Diagnostic and Interventional Radiology, University Hospital Bonn, Bonn, Germany
- Neeraj Upadhyay
  - Clinical Functional Imaging Group, Department of Diagnostic and Interventional Radiology, University Hospital Bonn, Bonn, Germany
- Veronika Purrer
  - German Center for Neurodegenerative Diseases, Bonn, Germany
  - Department of Neurology, University Hospital Bonn, Bonn, Germany
- Angelika Maurer
  - Clinical Functional Imaging Group, Department of Diagnostic and Interventional Radiology, University Hospital Bonn, Bonn, Germany
- Valeri Borger
  - Department of Neurosurgery, University Hospital Bonn, Bonn, Germany
- Alexander Radbruch
  - German Center for Neurodegenerative Diseases, Bonn, Germany
  - Department of Neuroradiology, University Hospital Bonn, Bonn, Germany
- Ulrike Attenberger
  - Department of Diagnostic and Interventional Radiology, University Hospital Bonn, Bonn, Germany
- Ullrich Wuellner
  - German Center for Neurodegenerative Diseases, Bonn, Germany
  - Department of Neurology, University Hospital Bonn, Bonn, Germany
- Rajanikant Panda
  - Coma Science Group, GIGA-Consciousness, University of Liège, Liège, Belgium
- Henning Boecker
  - Clinical Functional Imaging Group, Department of Diagnostic and Interventional Radiology, University Hospital Bonn, Bonn, Germany
  - German Center for Neurodegenerative Diseases, Bonn, Germany
7
Jacob M, Ford J, Deacon T. Cognition is entangled with metabolism: relevance for resting-state EEG-fMRI. Front Hum Neurosci 2023; 17:976036. [PMID: 37113322] [PMCID: PMC10126302] [DOI: 10.3389/fnhum.2023.976036]
Abstract
The brain is a living organ with distinct metabolic constraints. However, these constraints are typically considered secondary to, or supportive of, information processing, which is primarily performed by neurons. The default operational definition of neural information processing is that (1) it is ultimately encoded as a change in individual neuronal firing rate that correlates with the presentation of a peripheral stimulus, motor action, or cognitive task. Two additional assumptions accompany this default interpretation: (2) that the incessant background firing activity against which changes in activity are measured plays no role in assigning significance to the extrinsically evoked change in neural firing, and (3) that the metabolic energy that sustains this background activity, and which correlates with differences in neuronal firing rate, is merely a response to an evoked change in neuronal activity. These assumptions underlie the design, implementation, and interpretation of neuroimaging studies, particularly fMRI, which relies on changes in blood oxygen as an indirect measure of neural activity. In this article, we reconsider all three of these assumptions in light of recent evidence. We suggest that by combining EEG with fMRI, new experimental work can reconcile emerging controversies in neurovascular coupling and clarify the significance of ongoing background activity during resting-state paradigms. A new conceptual framework for neuroimaging paradigms is developed to investigate how ongoing neural activity is "entangled" with metabolism. That is, in addition to being recruited to support locally evoked neuronal activity (the traditional hemodynamic response), changes in metabolic support may be independently "invoked" by non-local brain regions, yielding flexible neurovascular coupling dynamics that inform the cognitive context. This framework demonstrates how multimodal neuroimaging is necessary to probe the neurometabolic foundations of cognition, with implications for the study of neuropsychiatric disorders.
Affiliation(s)
- Michael Jacob
  - Mental Health Service, San Francisco VA Healthcare System, San Francisco, CA, United States
  - Department of Psychiatry and Behavioral Sciences, Weill Institute for Neurosciences, University of California, San Francisco, San Francisco, CA, United States
- Judith Ford
  - Mental Health Service, San Francisco VA Healthcare System, San Francisco, CA, United States
  - Department of Psychiatry and Behavioral Sciences, Weill Institute for Neurosciences, University of California, San Francisco, San Francisco, CA, United States
- Terrence Deacon
  - Department of Anthropology, University of California, Berkeley, Berkeley, CA, United States
8
Hintze A, Adami C. Detecting Information Relays in Deep Neural Networks. Entropy (Basel) 2023; 25:401. [PMID: 36981289] [PMCID: PMC10047156] [DOI: 10.3390/e25030401]
Abstract
Deep learning of artificial neural networks (ANNs) is creating highly functional processes that are, unfortunately, nearly as hard to interpret as their biological counterparts. Identification of functional modules in natural brains plays an important role in cognitive science and neuroscience alike, and can be carried out using a wide range of technologies such as fMRI, EEG/ERP, MEG, or calcium imaging. However, we do not have such robust methods at our disposal when it comes to understanding functional modules in artificial neural networks. Ideally, understanding which parts of an artificial neural network perform what function could help us address a number of vexing problems in ANN research, such as catastrophic forgetting and overfitting. Furthermore, revealing a network's modularity could improve our trust in it by making these black boxes more transparent. Here, we introduce a new information-theoretic concept that proves useful in understanding and analyzing a network's functional modularity: the relay information I_R. The relay information measures how much information groups of neurons that participate in a particular function (modules) relay from inputs to outputs. Combined with a greedy search algorithm, relay information can be used to identify computational modules in neural networks. We also show that the functionality of modules correlates with the amount of relay information they carry.
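The idea behind relay information combined with greedy search, namely finding the smallest set of hidden units whose joint state still carries the network's input-output information, can be sketched for a toy network with one functional and one irrelevant hidden unit (the network and `subset_mi` helper are illustrative assumptions, not the paper's full definition of I_R):

```python
import math
import random
from collections import Counter

def mutual_information(pairs):
    """Plug-in mutual information (bits) over (state, output) pairs."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(p[0] for p in pairs)
    py = Counter(p[1] for p in pairs)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Toy "network": hidden unit h1 computes the output (XOR of the inputs),
# hidden unit h2 is independent noise.
rng = random.Random(0)
samples = []
for _ in range(1000):
    a, b = rng.randint(0, 1), rng.randint(0, 1)
    h1, h2 = a ^ b, rng.randint(0, 1)
    samples.append(((h1, h2), h1))                  # (hidden state, network output)

def subset_mi(indices):
    """Information a subset of hidden units relays about the output."""
    return mutual_information([(tuple(h[i] for i in indices), out)
                               for h, out in samples])

# A greedy search would keep shrinking the set while the relayed information
# stays maximal, ending at {h1}.
print(subset_mi((0, 1)))  # full hidden state: ~1 bit
print(subset_mi((0,)))    # h1 alone relays essentially all of it
print(subset_mi((1,)))    # h2 alone relays essentially nothing
```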
Affiliation(s)
- Arend Hintze
  - Department of MicroData Analytics, Dalarna University, 791 31 Falun, Sweden
  - BEACON Center for the Study of Evolution in Action, Michigan State University, East Lansing, MI 48824, USA
- Christoph Adami
  - BEACON Center for the Study of Evolution in Action, Michigan State University, East Lansing, MI 48824, USA
  - Department of Microbiology and Molecular Genetics, Michigan State University, East Lansing, MI 48824, USA
  - Program in Evolution, Ecology, and Behavior, Michigan State University, East Lansing, MI 48824, USA
9
Li Q, Steeg GV, Yu S, Malo J. Functional Connectome of the Human Brain with Total Correlation. Entropy (Basel) 2022; 24:1725. [PMID: 36554129] [PMCID: PMC9777567] [DOI: 10.3390/e24121725]
Abstract
Recent studies proposed the use of total correlation to describe functional connectivity among brain regions as a multivariate alternative to conventional pairwise measures such as correlation or mutual information. In this work, we build on this idea to infer a large-scale (whole-brain) connectivity network based on total correlation and show the possibility of using this kind of network as a biomarker of brain alterations. In particular, this work uses Correlation Explanation (CorEx) to estimate total correlation. First, we show that CorEx estimates of total correlation and clustering results are trustworthy compared to ground truth values. Second, the large-scale connectivity network inferred from extensive open fMRI datasets is consistent with existing neuroscience studies but, interestingly, can capture additional relations beyond those between pairs of regions. Finally, we show how connectivity graphs based on total correlation can also be an effective tool to aid in the discovery of brain diseases.
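Total correlation itself is straightforward to compute for discrete variables: the sum of marginal entropies minus the joint entropy, which vanishes exactly when the variables are independent (a direct plug-in sketch; CorEx is needed precisely because this direct computation does not scale to whole-brain data):

```python
import math
from collections import Counter

def entropy(values):
    """Plug-in Shannon entropy in bits."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def total_correlation(columns):
    """TC(X1,...,Xk) = sum_i H(Xi) - H(X1,...,Xk); zero iff all independent."""
    return sum(entropy(col) for col in columns) - entropy(list(zip(*columns)))

# Three perfectly coupled fair bits: every pairwise MI is 1 bit, but total
# correlation captures the full 2 bits of shared structure in one number.
x = [0, 1] * 8
print(total_correlation([x, x, x]))                        # 2.0
print(total_correlation([[0, 0, 1, 1], [0, 1, 0, 1]]))     # 0.0 (independent)
```

This is the sense in which total correlation "estimates additional relations beyond pairwise regions": the 2.0 above exceeds what any single pairwise measure between two of the three variables reports.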
Affiliation(s)
- Qiang Li
  - Image Processing Laboratory, University of Valencia, 46980 Valencia, Spain
- Greg Ver Steeg
  - Information Sciences Institute, University of Southern California, Marina del Rey, CA 90292, USA
- Shujian Yu
  - Machine Learning Group, UiT—The Arctic University of Norway, 9037 Tromsø, Norway
- Jesus Malo
  - Image Processing Laboratory, University of Valencia, 46980 Valencia, Spain
10
Végh J, Berki ÁJ. Towards Generalizing the Information Theory for Neural Communication. Entropy (Basel) 2022; 24:1086. [PMID: 36010750] [PMCID: PMC9407630] [DOI: 10.3390/e24081086]
Abstract
Neuroscience extensively uses information theory to describe neural communication, among other things to calculate the amount of information transferred in neural communication and to attempt to crack its coding. There are fierce debates on how information is represented in the brain and during transmission inside the brain. Neural information theory adopts the assumptions of electronic communication, despite experimental evidence that neural spikes carry information about non-discrete states, that their communication speed is low, and that the spikes' timing precision matters. Furthermore, in biology the communication channel is active, which enforces an additional power-bandwidth limitation on neural information transfer. The paper revises the notions needed to describe information transfer in technical and biological communication systems. It argues that biology uses Shannon's idea outside its range of validity and introduces an adequate interpretation of information. In addition, the presented time-aware approach to information theory reveals evidence for the role of processes (as opposed to states) in neural operations. The generalized information theory describes both kinds of communication, and the classic theory is a particular case of the generalized theory.
Affiliation(s)
- Ádám József Berki
  - Department of Neurology, Semmelweis University, 1085 Budapest, Hungary
  - János Szentágothai Doctoral School of Neurosciences, Semmelweis University, 1085 Budapest, Hungary
11
Weber I, Oehrn CR. NoLiTiA: An Open-Source Toolbox for Non-linear Time Series Analysis. Front Neuroinform 2022; 16:876012. [PMID: 35811996] [PMCID: PMC9263366] [DOI: 10.3389/fninf.2022.876012]
Abstract
In many scientific fields including neuroscience, climatology or physics, complex relationships can be described most parsimoniously by non-linear mechanics. Despite their relevance, many neuroscientists still apply linear estimates in order to evaluate complex interactions. This is partially due to the lack of a comprehensive compilation of non-linear methods. Available packages mostly specialize in only one aspect of non-linear time-series analysis and most often require some coding proficiency to use. Here, we introduce NoLiTiA, a free open-source MATLAB toolbox for non-linear time series analysis. In comparison to other currently available non-linear packages, NoLiTiA offers (1) an implementation of a broad range of classic and recently developed methods, (2) an implementation of newly proposed spatially and time-resolved recurrence amplitude analysis and (3) an intuitive environment accessible even to users with little coding experience, thanks to a graphical user interface and batch editor. The core methodology derives from three distinct fields of complex systems theory: dynamical systems theory, recurrence quantification analysis and information theory. Besides established methodology, including estimation of dynamic invariants such as Lyapunov exponents and entropy-based measures such as active information storage, we include recent developments for quantifying time-resolved aperiodic oscillations. In general, the toolbox will make non-linear methods accessible to the broad neuroscientific community engaged in time series processing.
Affiliation(s)
- Immo Weber
  - Department of Neurology, Philipps University of Marburg, Marburg, Germany
- Carina R. Oehrn
  - Department of Neurology, Philipps University of Marburg, Marburg, Germany
  - Center for Mind, Brain and Behavior, Philipps University of Marburg, Marburg, Germany
12
Casagrande A, Fabris F, Girometti R. Fifty years of Shannon information theory in assessing the accuracy and agreement of diagnostic tests. Med Biol Eng Comput 2022; 60:941-955. [PMID: 35195818] [PMCID: PMC8863911] [DOI: 10.1007/s11517-021-02494-9]
Abstract
Since 1948, Shannon's theoretic methods for modeling information have found a wide range of applications in areas where information plays a key role, going well beyond the original scopes for which they were conceived, namely data compression and error correction over a noisy channel. Among other uses, these methods have been applied in the broad field of medical diagnostics since the 1970s: to quantify diagnostic information, to evaluate diagnostic test performance, and as technical tools in image processing and registration. This review illustrates the main contributions in assessing the accuracy of diagnostic tests and the agreement between raters, focusing on diagnostic test performance measurements and paired agreement evaluation. This work also presents a recent unified, coherent, and, hopefully, final information-theoretical approach to deal with the flows of information involved among the patient, the diagnostic test performed to appraise the state of disease, and the raters who check the test results. The approach is assessed by considering two case studies: the first relates to evaluating extra-prostatic cancers; the second concerns the quality of rapid tests for COVID-19 detection.
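The basic quantity in such information-theoretic assessments is the mutual information between disease state and test result, computable directly from sensitivity, specificity, and prevalence (a generic sketch of the standard quantity, not the unified framework proposed in the paper):

```python
import math

def binary_entropy(p):
    """H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def diagnostic_information(sensitivity, specificity, prevalence):
    """Mutual information (bits) between disease state D and binary test result T.

    I(D; T) = H(T) - H(T | D), with H(T | D) averaged over the diseased and
    healthy subpopulations.
    """
    p_positive = prevalence * sensitivity + (1 - prevalence) * (1 - specificity)
    h_t_given_d = (prevalence * binary_entropy(sensitivity)
                   + (1 - prevalence) * binary_entropy(specificity))
    return binary_entropy(p_positive) - h_t_given_d

# A perfect test at 50% prevalence resolves the full 1 bit of uncertainty;
# a coin-flip test resolves none.
print(diagnostic_information(1.0, 1.0, 0.5))  # 1.0
print(diagnostic_information(0.5, 0.5, 0.5))  # 0.0
```

Unlike sensitivity or specificity alone, this single number depends on prevalence, which is one reason information-theoretic summaries of test performance are attractive.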
Affiliation(s)
- Alberto Casagrande
- Dipartimento di Matematica e Geoscienze, Università degli Studi di Trieste, Trieste, Italy
- Francesco Fabris
- Dipartimento di Matematica e Geoscienze, Università degli Studi di Trieste, Trieste, Italy
- Rossano Girometti
- Istituto di Radiologia, Dipartimento di Area Medica, Università degli Studi di Udine, Ospedale S. Maria della Misericordia, Udine, Italy

13
Ricci L, Perinelli A, Castelluzzo M. Estimating the variance of Shannon entropy. Phys Rev E 2021; 104:024220. PMID: 34525589; DOI: 10.1103/PhysRevE.104.024220.
Abstract
The statistical analysis of data stemming from dynamical systems, including but not limited to time series, routinely relies on the estimation of information-theoretic quantities, most notably Shannon entropy. Possibly the most widespread tool for this purpose is the so-called plug-in estimator, whose statistical properties in terms of bias and variance have been investigated since the first decade after the publication of Shannon's seminal works. In the case of an underlying multinomial distribution, the bias can be evaluated from the support and the data set size, but the variance is far more elusive. The present work investigates, in the multinomial case, the statistical properties of an estimator of a parameter that describes the variance of the plug-in estimator of Shannon entropy, and exactly determines the probability distributions that maximize that parameter. These results allow one to set upper limits on the uncertainty of entropy assessments under the hypothesis of a memoryless underlying stochastic process.
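The plug-in estimator discussed here simply substitutes empirical frequencies into the entropy formula. A minimal sketch, shown together with the classical first-order Miller-Madow bias correction of (K-1)/(2N ln 2) bits for K observed symbols and N samples (the textbook correction, not the variance estimator this paper proposes):

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy in bits."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the first-order Miller-Madow bias correction."""
    n = len(samples)
    k = len(set(samples))  # number of observed symbols
    return plugin_entropy(samples) + (k - 1) / (2 * n * math.log(2))

data = ["a", "b", "a", "b", "c", "a", "b", "c"]  # toy multinomial sample
print(plugin_entropy(data), miller_madow_entropy(data))
```

The plug-in estimate is biased downward, which is why the correction term is always positive; the variance, as the abstract notes, has no comparably simple closed form.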
Affiliation(s)
- Leonardo Ricci
- Department of Physics, University of Trento, 38123 Trento, Italy; CIMeC, Center for Mind/Brain Sciences, University of Trento, 38068 Rovereto, Italy
- Alessio Perinelli
- CIMeC, Center for Mind/Brain Sciences, University of Trento, 38068 Rovereto, Italy

14
He F, Yang Y. Nonlinear System Identification of Neural Systems from Neurophysiological Signals. Neuroscience 2021; 458:213-228. PMID: 33309967; PMCID: PMC7925423; DOI: 10.1016/j.neuroscience.2020.12.001.
Abstract
The human nervous system is one of the most complicated systems in nature. Complex nonlinear behaviours have been shown from the single-neuron level to the system level. For decades, linear connectivity analysis methods, such as correlation, coherence and Granger causality, have been extensively used to assess neural connectivity and input-output interconnections in neural systems. Recent studies indicate that these linear methods capture only a portion of neural activity and functional relationships, and therefore cannot describe neural behaviours in a precise or complete way. In this review, we highlight recent advances in nonlinear system identification of neural systems, the corresponding time- and frequency-domain analyses, and novel neural connectivity measures based on nonlinear system identification techniques. We argue that nonlinear modelling and analysis are necessary to study neuronal processing and signal transfer in neural systems quantitatively. These approaches can provide new insights to advance our understanding of the neurophysiological mechanisms underlying neural functions, and they also have the potential to produce sensitive biomarkers to facilitate the development of precision diagnostic tools for evaluating neurological disorders and the effects of targeted interventions.
Affiliation(s)
- Fei He
- Centre for Data Science, Coventry University, Coventry CV1 2JH, UK
- Yuan Yang
- Stephenson School of Biomedical Engineering, The University of Oklahoma, Tulsa, OK 74135, USA; Department of Physical Therapy and Human Movement Sciences, Feinberg School of Medicine, Northwestern University, Chicago, IL 60611, USA; Laureate Institute for Brain Research, Tulsa, OK 74136, USA

15
Young J, Neveu CL, Byrne JH, Aazhang B. Inferring functional connectivity through graphical directed information. J Neural Eng 2021; 18. PMID: 33684898; PMCID: PMC8600965; DOI: 10.1088/1741-2552/abecc6.
Abstract
Objective. Accurate inference of functional connectivity is critical for understanding brain function. Previous methods have limited ability to distinguish between direct and indirect connections because they scale poorly with dimensionality; this poor scaling reduces the number of nodes that can be included in conditioning. Our goal was to provide a technique that scales better and thereby enables minimization of indirect connections. Approach. Our major contribution is a powerful model-free framework, graphical directed information (GDI), that enables pairwise directed functional connections to be conditioned on the activity of substantially more nodes in a network, producing a more accurate graph of functional connectivity with fewer indirect connections. The key technology enabling this advancement is a recent advance in the estimation of mutual information (MI) that relies on multilayer perceptrons and an alternative representation of the Kullback-Leibler divergence definition of MI. Our second major contribution is the application of this technique to both discretely and continuously valued time series. Main results. GDI correctly inferred the circuitry of arbitrary Gaussian, nonlinear, and conductance-based networks. Furthermore, GDI inferred many of the connections of a model of a central pattern generator circuit in Aplysia, while also reducing many indirect connections. Significance. GDI is a general and model-free technique that can be used on a variety of scales and data types to provide accurate direct-connectivity graphs, and it addresses the critical issue of indirect connections in neural data analysis.
Affiliation(s)
- Joseph Young
- Department of Electrical & Computer Engineering, Rice University, 6100 Main St, Houston, Texas 77005, United States
- Curtis L Neveu
- Department of Neurobiology & Anatomy, The University of Texas Health Science Center at Houston, John P and Katherine G McGovern Medical School, 6431 Fannin Street, Houston, Texas 77030-1501, United States
- John H Byrne
- Department of Neurobiology and Anatomy, The University of Texas Health Science Center at Houston, John P and Katherine G McGovern Medical School, 6431 Fannin Street, Houston, Texas 77030-1501, United States
- Behnaam Aazhang
- Department of Electrical & Computer Engineering, Rice University, 6100 Main St, Houston, Texas 77005, United States

16
Pires DP, Modi K, Céleri LC. Bounding generalized relative entropies: Nonasymptotic quantum speed limits. Phys Rev E 2021; 103:032105. PMID: 33862799; DOI: 10.1103/PhysRevE.103.032105.
Abstract
Information theory has become an increasingly important research field for better understanding quantum mechanics. Notably, it covers both foundational and applied perspectives and offers a common technical language to study a variety of research areas. One of the key information-theoretic quantities is the relative entropy, which quantifies how difficult it is to tell apart two probability distributions, or even two quantum states. This quantity lies at the core of fields like metrology, quantum thermodynamics, quantum communication, and quantum information. Given this broad range of applications, it is desirable to understand how it changes under a quantum process. By considering a general unitary channel, we establish a bound on the generalized relative entropies (Rényi and Tsallis) between the output and the input of the channel. As an application of our bounds, we derive a family of quantum speed limits based on relative entropies. Possible connections between this family and thermodynamics, quantum coherence, asymmetry, and single-shot information theory are briefly discussed.
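For discrete distributions, the Rényi relative entropy that such bounds concern reduces to a simple sum. A minimal classical sketch (the quantum versions in the paper replace probabilities with density matrices; the distributions below are invented for illustration):

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi relative entropy D_alpha(p || q) in bits for discrete distributions."""
    if alpha == 1.0:  # alpha -> 1 recovers the Kullback-Leibler divergence
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log2(s) / (alpha - 1)

p, q = [0.7, 0.3], [0.5, 0.5]
print(renyi_divergence(p, q, 1.0))  # KL divergence
print(renyi_divergence(p, q, 2.0))  # collision divergence
print(renyi_divergence(p, p, 2.0))  # identical distributions: ~0
```

D_alpha is nondecreasing in alpha, so the collision divergence upper-bounds the KL divergence here; the Tsallis variant differs only in replacing the logarithm by a q-deformed one.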
Affiliation(s)
- Diego Paiva Pires
- International Institute of Physics and Departamento de Física Teórica e Experimental, Universidade Federal do Rio Grande do Norte, Natal, RN, 59078-970, Brazil
- Kavan Modi
- School of Physics & Astronomy, Monash University, Clayton, Victoria 3800, Australia
- Lucas Chibebe Céleri
- Department of Physical Chemistry, University of the Basque Country UPV/EHU, Apartado 644, E-48080 Bilbao, Spain
- Institute of Physics, Federal University of Goiás, 74.690-900 Goiânia, Goiás, Brazil

17
Moffett AS, Wallbridge N, Plummer C, Eckford AW. Fitness value of information with delayed phenotype switching: Optimal performance with imperfect sensing. Phys Rev E 2020; 102:052403. PMID: 33327185; DOI: 10.1103/PhysRevE.102.052403.
Abstract
The ability of organisms to accurately sense their environment and respond accordingly is critical for evolutionary success. However, exactly how sensory ability influences fitness is a topic of active research, and the necessity of a time delay between when unreliable environmental cues are sensed and when organisms can mount a response has yet to be explored at any length. Accounting for this delay in phenotype response in models of population growth, we find that a critical error probability can exist under certain environmental conditions: an organism with a sensory system whose error probability is less than this critical value can achieve the same long-term growth rate as an organism with a perfect sensing system. We also observe a tradeoff between the evolutionary value of sensory information and robustness to error, mediated by the rate at which the phenotype distribution relaxes to steady state. The existence of the critical error probability could have several important evolutionary consequences, primarily that sensory systems operating at the nonzero critical error probability may be evolutionarily optimal.
Affiliation(s)
- Alexander S Moffett
- Department of Electrical Engineering and Computer Science, York University, Toronto, Ontario M3J 1P3, Canada
- Andrew W Eckford
- Department of Electrical Engineering and Computer Science, York University, Toronto, Ontario M3J 1P3, Canada

18
Malo J. Spatio-chromatic information available from different neural layers via Gaussianization. J Math Neurosci 2020; 10:18. PMID: 33175257; PMCID: PMC7658285; DOI: 10.1186/s13408-020-00095-8.
Abstract
How much visual information about the retinal images can be extracted from the different layers of the visual pathway? The answer depends on the complexity of the visual input, the set of transforms applied to this multivariate input, and the noise of the sensors in the considered layer. Separate subsystems (e.g. opponent channels, spatial filters, nonlinearities of the texture sensors) have been suggested to be organized for optimal information transmission. However, the efficiency of these different layers has not been measured when they operate together on colorimetrically calibrated natural images, using multivariate information-theoretic units over the joint spatio-chromatic array of responses. In this work, we present a statistical tool to address this question in an appropriate (multivariate) way. Specifically, we propose an empirical estimate of the information transmitted by the system based on a recent Gaussianization technique. The total correlation measured using the proposed estimator is consistent with predictions based on the analytical Jacobian of a standard spatio-chromatic model of the retina-cortex pathway. If the noise at a given representation is proportional to the dynamic range of the response, and one assumes sensors of equivalent noise level, then transmitted information shows the following trends: (1) progressively deeper representations capture more information; (2) the information transmitted up to the cortical representation follows the probability of natural scenes over the chromatic and achromatic dimensions of the stimulus space; (3) the contribution of spatial transforms to capturing visual information is substantially greater than that of chromatic transforms; and (4) nonlinearities of the responses contribute substantially to the transmitted information, but less than the linear transforms.
Affiliation(s)
- Jesús Malo
- Image Processing Lab, Universitat de València, Catedrático Escardino, 46980 Paterna, Valencia, Spain

19
Etter G, Manseau F, Williams S. A Probabilistic Framework for Decoding Behavior From in vivo Calcium Imaging Data. Front Neural Circuits 2020; 14:19. PMID: 32499681; PMCID: PMC7243991; DOI: 10.3389/fncir.2020.00019.
Abstract
Understanding the role of neuronal activity in cognition and behavior is a key question in neuroscience. Previously, in vivo studies have typically inferred behavior from electrophysiological data using probabilistic approaches including Bayesian decoding. While providing useful information on the role of neuronal subcircuits, electrophysiological approaches are often limited in the maximum number of recorded neurons as well as in their ability to reliably identify neurons over time. This can be particularly problematic when trying to decode behaviors that rely on large neuronal assemblies or on temporal mechanisms, such as a learning task over the course of several days. Calcium imaging of genetically encoded calcium indicators has overcome these two issues. Unfortunately, because calcium transients only indirectly reflect spiking activity and calcium imaging is often performed at lower sampling frequencies, this approach suffers from uncertainty in exact spike timing and thus activity frequency, making the rate-based decoding approaches used with electrophysiological recordings difficult to apply to calcium imaging data. Here we describe a probabilistic framework that can be used to robustly infer behavior from calcium imaging recordings and relies on a simplified implementation of a naive Bayesian classifier. Our method discriminates between periods of activity and periods of inactivity to compute probability density functions (likelihood and posterior), significance and confidence intervals, as well as mutual information. We next devise a simple method to decode behavior using these probability density functions and propose metrics to quantify decoding accuracy. Finally, we show that neuronal activity can be predicted from behavior, and that the accuracy of such reconstructions can guide the understanding of relationships that may exist between behavioral states and neuronal activity.
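The decoding step described here, binarized activity with per-neuron likelihoods combined under an independence assumption, can be sketched as follows. The two-state, three-neuron example and its tuning probabilities are invented for illustration, not taken from the paper:

```python
import math

def naive_bayes_decode(active, p_active_given_state, prior):
    """Posterior over behavioral states given binarized neuronal activity.

    active: list of 0/1 flags, one per neuron (1 = calcium transient detected).
    p_active_given_state: p_active_given_state[s][i] = P(neuron i active | state s).
    prior: prior probability of each state.
    """
    log_post = []
    for s, p_prior in enumerate(prior):
        ll = math.log(p_prior)
        for i, a in enumerate(active):
            p = p_active_given_state[s][i]
            ll += math.log(p if a else 1 - p)  # independence across neurons
        log_post.append(ll)
    # Normalize in log space for numerical stability
    m = max(log_post)
    w = [math.exp(l - m) for l in log_post]
    z = sum(w)
    return [wi / z for wi in w]

# Hypothetical tuning: neurons 0 and 1 prefer state 0, neuron 2 prefers state 1
likelihoods = [[0.8, 0.7, 0.1],
               [0.1, 0.2, 0.9]]
posterior = naive_bayes_decode([1, 1, 0], likelihoods, prior=[0.5, 0.5])
print(posterior.index(max(posterior)))  # 0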
Affiliation(s)
- Guillaume Etter
- Douglas Mental Health University Institute, McGill University, Montreal, QC, Canada
- Sylvain Williams
- Douglas Mental Health University Institute, McGill University, Montreal, QC, Canada

20
Barta T, Kostal L. The effect of inhibition on rate code efficiency indicators. PLoS Comput Biol 2019; 15:e1007545. PMID: 31790384; PMCID: PMC6907877; DOI: 10.1371/journal.pcbi.1007545.
Abstract
In this paper we investigate the rate coding capabilities of neurons whose input signals are alterations of a base state of balanced inhibitory and excitatory synaptic currents. We consider different regimes of the excitation-inhibition relationship and an established conductance-based leaky integrator model with adaptive threshold and parameter sets recreating biologically relevant spiking regimes. We find that, given the mean post-synaptic firing rate, an increased ratio of inhibition to excitation counter-intuitively leads to a higher signal-to-noise ratio (SNR). On the other hand, the inhibitory input significantly reduces the dynamic coding range of the neuron. We quantify the joint effect of SNR and dynamic coding range by computing the metabolic efficiency: the maximal amount of information per ATP molecule expended (in bits/ATP). By calculating the metabolic efficiency we are moreover able to predict the shapes of the post-synaptic firing rate histograms, which may be tested on experimental data. Likewise, optimal stimulus input distributions are predicted; however, we show that the optimum can essentially be reached with a broad range of input distributions. Finally, we examine which parameters of the neuronal model used are the most important for metabolically efficient information transfer.
Affiliation(s)
- Tomas Barta
- Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic
- Charles University, First Medical Faculty, Prague, Czech Republic
- Institute of Ecology and Environmental Sciences, INRA, Versailles, France
- Lubomir Kostal
- Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic

21
Salman MS, Vergara VM, Damaraju E, Calhoun VD. Decreased Cross-Domain Mutual Information in Schizophrenia From Dynamic Connectivity States. Front Neurosci 2019; 13:873. PMID: 31507357; PMCID: PMC6714616; DOI: 10.3389/fnins.2019.00873.
Abstract
The study of dynamic functional network connectivity (dFNC) has been important to understand the healthy and diseased brain. Recent developments model groups of functionally related brain structures (defined as functional domains) as entities that can send and receive information. A domain analysis starts by detecting a finite set of connectivity patterns known as domain states within each functional domain. Dynamic functional domain connectivity (DFDC) is a novel information theoretic framework for studying the temporal sequence of the domain states and the amount of information shared among domains. In this setting, the information flow among functional domains can be compared to the flow of bits among entities in a digital network. Schizophrenia is a chronic psychiatric disorder which is associated with how the brain processes information. Here, we employed the DFDC framework to analyze a dataset containing resting-state fMRI scans from 163 healthy controls (HCs) and 151 schizophrenia patients (SZs). As in other information theory methods, this study measured domain state probabilities, entropy within each DFDC and the cross-domain mutual information (CDMI) between pairs of DFDC. Results indicate that SZs show significantly higher (transformed) entropy than HCs in subcortical (SC)-SC; default mode network (DMN)-visual (VIS) and frontoparietal (FRN)-VIS DFDCs. SZs also show lower (transformed) CDMI between SC-VIS vs. SC-sensorimotor (SM), attention (ATTN)-VIS vs. ATTN-SM and ATTN-SM vs. ATTN-ATTN DFDC pairs after correcting for multiple comparisons. These results imply that different DFDC pairs function in a more independent manner in SZs compared to HCs. Our findings present evidence of higher uncertainty and randomness in SZ brain function.
Affiliation(s)
- Mustafa S. Salman
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, United States
- Tri-Institutional Center for Translational Research in Neuroimaging and Data Science (TReNDS), Georgia State University, Georgia Institute of Technology and Emory University, Atlanta, GA, United States
- Victor M. Vergara
- Tri-Institutional Center for Translational Research in Neuroimaging and Data Science (TReNDS), Georgia State University, Georgia Institute of Technology and Emory University, Atlanta, GA, United States
- Eswar Damaraju
- Tri-Institutional Center for Translational Research in Neuroimaging and Data Science (TReNDS), Georgia State University, Georgia Institute of Technology and Emory University, Atlanta, GA, United States
- Vince D. Calhoun
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, United States
- Tri-Institutional Center for Translational Research in Neuroimaging and Data Science (TReNDS), Georgia State University, Georgia Institute of Technology and Emory University, Atlanta, GA, United States

22
Salmasi M, Stemmler M, Glasauer S, Loebel A. Synaptic Information Transmission in a Two-State Model of Short-Term Facilitation. Entropy 2019; 21:e21080756. PMID: 33267470; PMCID: PMC7515285; DOI: 10.3390/e21080756.
Abstract
Action potentials (spikes) can trigger the release of a neurotransmitter at chemical synapses between neurons. Such release is uncertain: it occurs only with a certain probability. Moreover, synaptic release can occur independently of an action potential (asynchronous release) and depends on the history of synaptic activity. We focus here on short-term synaptic facilitation, in which a sequence of action potentials can temporarily increase the release probability of the synapse. In contrast to short-term depression, information transmission in facilitating synapses has yet to be quantified. We find rigorous lower and upper bounds for the rate of information transmission in a model of synaptic facilitation. We treat the synapse as a two-state binary asymmetric channel in which the arrival of an action potential shifts the synapse to a facilitated state, while in the absence of a spike the synapse returns to its baseline state. The information bounds are functions of both the asynchronous and synchronous release parameters. If synchronous release facilitates more than asynchronous release, the mutual information rate increases. In contrast, short-term facilitation degrades information transmission when the synchronous release probability is intrinsically high. As synaptic release is energetically expensive, we exploit the information bounds to determine the energy-information trade-off in facilitating synapses. We show that, unlike the information rate, the energy-normalized information rate is robust with respect to variations in the strength of facilitation.
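The binary asymmetric channel underlying such bounds can be illustrated numerically. A minimal sketch computing the single-use mutual information for given input and crossover probabilities (the probability values are hypothetical, not the paper's release parameters, and the paper's rate bounds additionally account for the channel's state dynamics):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def binary_asymmetric_mi(p_in, p01, p10):
    """Mutual information I(X;Y) in bits over a binary asymmetric channel.

    p_in: P(X = 1); p01: P(Y = 1 | X = 0); p10: P(Y = 0 | X = 1).
    """
    p_y1 = (1 - p_in) * p01 + p_in * (1 - p10)  # marginal output probability
    # I(X;Y) = H(Y) - H(Y|X)
    return h2(p_y1) - ((1 - p_in) * h2(p01) + p_in * h2(p10))

print(binary_asymmetric_mi(0.5, 0.0, 0.0))  # noiseless channel: 1.0 bit
print(binary_asymmetric_mi(0.5, 0.1, 0.3))  # asymmetric noise reduces I(X;Y)
```

Asymmetry (p01 != p10) is the feature that makes the facilitated and baseline synaptic states behave differently, which is why the symmetric-channel formulas do not apply directly.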
Affiliation(s)
- Mehrdad Salmasi
- Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität München, 82152 Planegg-Martinsried, Germany
- Bernstein Center for Computational Neuroscience Munich, 82152 Planegg-Martinsried, Germany
- German Center for Vertigo and Balance Disorders, Ludwig-Maximilians-Universität, 81377 Munich, Germany
- Martin Stemmler
- Bernstein Center for Computational Neuroscience Munich, 82152 Planegg-Martinsried, Germany
- Department of Biology II, Ludwig-Maximilians-Universität München, 82152 Planegg-Martinsried, Germany
- Stefan Glasauer
- Bernstein Center for Computational Neuroscience Munich, 82152 Planegg-Martinsried, Germany
- German Center for Vertigo and Balance Disorders, Ludwig-Maximilians-Universität, 81377 Munich, Germany
- Computational Neuroscience, Brandenburg University of Technology Cottbus-Senftenberg, 03046 Cottbus, Germany
- Alex Loebel
- Bernstein Center for Computational Neuroscience Munich, 82152 Planegg-Martinsried, Germany
- Department of Biology II, Ludwig-Maximilians-Universität München, 82152 Planegg-Martinsried, Germany

23
Kumar S, Yoo K, Rosenberg MD, Scheinost D, Constable RT, Zhang S, Li CR, Chun MM. An information network flow approach for measuring functional connectivity and predicting behavior. Brain Behav 2019; 9:e01346. PMID: 31286688; PMCID: PMC6710195; DOI: 10.1002/brb3.1346.
Abstract
INTRODUCTION Connectome-based predictive modeling (CPM) is a recently developed machine-learning framework to predict individual differences in behavior from functional brain connectivity (FC). In these models, FC is operationalized as Pearson's correlation between brain regions' fMRI time courses. However, Pearson's correlation is limited since it captures only linear relationships. We developed a more generalized metric of FC based on information flow. This measure represents FC by abstracting the brain as a flow network of nodes that send bits of information to each other, where the bits are quantified through an information-theoretic statistic called transfer entropy. METHODS With a sample of individuals performing a sustained attention task and resting during functional magnetic resonance imaging (fMRI) (n = 25), we use the CPM framework to build machine-learning models that predict attention from FC patterns measured with information flow. Models trained on n - 1 participants' task-based patterns were applied to an unseen individual's resting-state pattern to predict task performance. For further validation, we applied our model to two independent datasets that included resting-state fMRI data and a measure of attention (Attention Network Task performance [n = 41] and stop-signal task performance [n = 72]). RESULTS Our model significantly predicted individual differences in attention task performance across three different datasets. CONCLUSIONS Information flow may be a useful complement to Pearson's correlation as a measure of FC because of its advantages for nonlinear analysis and network structure characterization.
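Transfer entropy, the statistic used here to quantify directed information flow, can be estimated for binarized time series with a simple plug-in sketch (history length 1; the toy sequences are invented, and a real fMRI pipeline would need longer histories, continuous-valued estimators, and bias correction):

```python
import math
from collections import Counter

def transfer_entropy(src, dst):
    """Plug-in transfer entropy TE(src -> dst) in bits, history length 1.

    TE = H(dst_t | dst_{t-1}) - H(dst_t | dst_{t-1}, src_{t-1}),
    estimated from empirical frequencies of the binarized time series.
    """
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))
    n = len(triples)
    c_xyz = Counter(triples)                       # (dst_t, dst_{t-1}, src_{t-1})
    c_yz = Counter((y, z) for _, y, z in triples)  # (dst_{t-1}, src_{t-1})
    c_xy = Counter((x, y) for x, y, _ in triples)  # (dst_t, dst_{t-1})
    c_y = Counter(y for _, y, _ in triples)        # (dst_{t-1},)
    te = 0.0
    for (x, y, z), c in c_xyz.items():
        p_xyz = c / n
        # contribution p(x,y,z) * log[ p(x|y,z) / p(x|y) ]
        te += p_xyz * math.log2((c / c_yz[(y, z)]) / (c_xy[(x, y)] / c_y[y]))
    return te

# dst copies src with a one-step lag, so information flows src -> dst
src = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0]
dst = [0] + src[:-1]
print(transfer_entropy(src, dst) > transfer_entropy(dst, src))  # True
```

The asymmetry of the result for a lagged copy is exactly the directedness that Pearson's correlation, being symmetric, cannot express.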
Affiliation(s)
- Sreejan Kumar
- Department of Psychology, Yale University, New Haven, Connecticut
- Kwangsun Yoo
- Department of Psychology, Yale University, New Haven, Connecticut
- Monica D. Rosenberg
- Department of Psychology, Yale University, New Haven, Connecticut
- Department of Psychology, University of Chicago, Chicago, Illinois
- Dustin Scheinost
- Department of Radiology and Biomedical Imaging, Yale School of Medicine, New Haven, Connecticut
- R. Todd Constable
- Department of Radiology and Biomedical Imaging, Yale School of Medicine, New Haven, Connecticut
- Interdepartmental Neuroscience Program, Yale University, New Haven, Connecticut
- Department of Neurosurgery, Yale School of Medicine, New Haven, Connecticut
- Sheng Zhang
- Department of Psychiatry, Yale School of Medicine, New Haven, Connecticut
- Chiang-Shan R. Li
- Interdepartmental Neuroscience Program, Yale University, New Haven, Connecticut
- Department of Psychiatry, Yale School of Medicine, New Haven, Connecticut
- Department of Neuroscience, Yale School of Medicine, New Haven, Connecticut
- Marvin M. Chun
- Department of Psychology, Yale University, New Haven, Connecticut
- Interdepartmental Neuroscience Program, Yale University, New Haven, Connecticut
- Department of Neuroscience, Yale School of Medicine, New Haven, Connecticut

24
He B, Astolfi L, Valdés-Sosa PA, Marinazzo D, Palva SO, Bénar CG, Michel CM, Koenig T. Electrophysiological Brain Connectivity: Theory and Implementation. IEEE Trans Biomed Eng 2019; 66. PMID: 31071012; PMCID: PMC6834897; DOI: 10.1109/TBME.2019.2913928.
Abstract
We review the theory and algorithms of electrophysiological brain connectivity analysis. This tutorial provides an introduction to brain functional connectivity from electrophysiological signals, including electroencephalography (EEG), magnetoencephalography (MEG), electrocorticography (ECoG), and stereoelectroencephalography (SEEG). Various connectivity estimators are discussed and their algorithms introduced. Important issues in estimating and mapping brain functional connectivity with electrophysiology are also addressed.
Affiliation(s)
- Bin He
- Department of Biomedical Engineering, Carnegie Mellon University, Pittsburgh, USA
- Laura Astolfi
- Department of Computer, Control and Management Engineering, University of Rome Sapienza, and IRCCS Fondazione Santa Lucia, Rome, Italy

25
Ray SK, Valentini G, Shah P, Haque A, Reid CR, Weber GF, Garnier S. Information Transfer During Food Choice in the Slime Mold Physarum polycephalum. Front Ecol Evol 2019. DOI: 10.3389/fevo.2019.00067.

26
Evaluating performance of neural codes in model neural communication networks. Neural Netw 2018; 109:90-102. PMID: 30408697; DOI: 10.1016/j.neunet.2018.10.008.
Abstract
Information needs to be appropriately encoded to be reliably transmitted over physical media. Similarly, neurons have their own codes to convey information in the brain. Even though it is well known that neurons exchange information using a pool of several protocols of spatio-temporal encodings, the suitability of each code and its performance as a function of network parameters and external stimuli remain among the great mysteries in neuroscience. This paper sheds light on this question by modeling small networks of chemically and electrically coupled Hindmarsh-Rose spiking neurons. We focus on a class of temporal and firing-rate codes that result from neurons' membrane potentials and phases, and quantify their performance numerically by estimating the Mutual Information Rate, i.e., the rate of information exchange. Our results suggest that the firing-rate and interspike-interval codes are more robust to additive Gaussian white noise. In a network of four interconnected neurons and in the absence of such noise, the pairs of neurons with the largest rate of information exchange under the interspike-interval and firing-rate codes are not adjacent in the network, whereas spike-timing and phase codes (temporal codes) promote large rates of information exchange between adjacent neurons. If this result extends to larger neural networks, it would suggest that small microcircuits preferably exchange information using temporal codes (spike timings and phases), whereas on the macroscopic scale, where pairs of neurons are typically not directly connected due to the brain's sparsity, firing-rate and interspike-interval codes would be the most efficient.
|
27
|
Mehta K, Kliewer J, Ihlefeld A. Quantifying Neuronal Information Flow in Response to Frequency and Intensity Changes in the Auditory Cortex. CONFERENCE RECORD. ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS 2018; 2018:1367-1371. [PMID: 31595139 PMCID: PMC6782062 DOI: 10.1109/acssc.2018.8645091] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
Studies increasingly show that behavioral relevance alters the population representation of sensory stimuli in the sensory cortices. However, the mechanisms underlying this behavior are incompletely understood. Here, we record neuronal responses in the auditory cortex while a highly trained, awake, normal-hearing gerbil listens passively to target tones of high versus low behavioral relevance. Using an information theoretic framework, we model the overall transmission chain from acoustic input stimulus to recorded cortical response as a communication channel. To quantify how much information core auditory cortex carries about high versus low relevance sound, we then compute the mutual information of the multi-unit neuronal responses. Results show that the output over the stimulus-to-response channel can be modeled as a Poisson mixture. We derive a closed-form fast approximation for the entropy of a mixture of univariate Poisson random variables. A purely rate-code based model reveals reduced information transfer for high relevance compared to low relevance tones, hinting that changes in temporal discharge pattern may encode behavioral relevance.
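The closed-form approximation derived in the paper is not reproduced here; as a point of comparison, the quantity it approximates, the entropy of a mixture of univariate Poisson random variables, can be computed by brute-force summation over the support (function names are illustrative):

```python
import math

def poisson_pmf(k, lam):
    """Poisson pmf evaluated in log-space for numerical stability."""
    return math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))

def poisson_mixture_entropy_bits(weights, lams, kmax=200):
    """Entropy (bits) of a mixture of Poisson distributions,
    by direct summation over k = 0..kmax (kmax must cover the tails)."""
    h = 0.0
    for k in range(kmax + 1):
        p = sum(w * poisson_pmf(k, lam) for w, lam in zip(weights, lams))
        if p > 0:
            h -= p * math.log2(p)
    return h

# A single component is a degenerate "mixture":
h_single = poisson_mixture_entropy_bits([1.0], [5.0])
# Mixing two well-separated rates adds up to one extra bit of entropy:
h_mix = poisson_mixture_entropy_bits([0.5, 0.5], [2.0, 20.0])
```

A fast approximation such as the one the authors derive becomes valuable when this summation must be repeated for many candidate mixtures.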
Affiliation(s)
- Ketan Mehta
- Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA 22030
| | - Jörg Kliewer
- Helen and John C. Hartmann Dept. of Electrical and Computer Engineering New Jersey Institute of Technology, Newark, NJ 07102
| | - Antje Ihlefeld
- Dept. of Biomedical Engineering, New Jersey Institute of Technology, Newark, NJ 07102
|
28
|
Timme NM, Lapish C. A Tutorial for Information Theory in Neuroscience. eNeuro 2018; 5:ENEURO.0052-18.2018. [PMID: 30211307 PMCID: PMC6131830 DOI: 10.1523/eneuro.0052-18.2018] [Citation(s) in RCA: 100] [Impact Index Per Article: 16.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/19/2018] [Revised: 04/10/2018] [Accepted: 05/30/2018] [Indexed: 11/21/2022] Open
Abstract
Understanding how neural systems integrate, encode, and compute information is central to understanding brain function. Frequently, data from neuroscience experiments are multivariate, the interactions between the variables are nonlinear, and the landscape of hypothesized or possible interactions between variables is extremely broad. Information theory is well suited to address these types of data, as it possesses multivariate analysis tools, it can be applied to many different types of data, it can capture nonlinear interactions, and it does not require assumptions about the structure of the underlying data (i.e., it is model independent). In this article, we walk through the mathematics of information theory along with common logistical problems associated with data type, data binning, data quantity requirements, bias, and significance testing. Next, we analyze models inspired by canonical neuroscience experiments to improve understanding and demonstrate the strengths of information theory analyses. To facilitate the use of information theory analyses, and an understanding of how these analyses are implemented, we also provide a free MATLAB software package that can be applied to a wide range of data from neuroscience experiments, as well as from other fields of study.
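In the spirit of the tutorial's discussion of binning and bias, a plug-in entropy estimate with the Miller-Madow bias correction (one standard correction, shown here as a sketch and not necessarily the one implemented in the authors' MATLAB package) might look like:

```python
import numpy as np

def entropy_miller_madow_bits(samples, num_symbols):
    """Plug-in entropy (bits) with the Miller-Madow bias correction:
    H_MM = H_plugin + (m - 1) / (2 N ln 2), where m is the number of
    occupied bins and N the sample size."""
    counts = np.bincount(samples, minlength=num_symbols)
    n = counts.sum()
    p = counts[counts > 0] / n
    h_plugin = float(-(p * np.log2(p)).sum())
    m = int((counts > 0).sum())                   # occupied bins
    return h_plugin + (m - 1) / (2 * n * np.log(2))

# 8 equiprobable symbols have exactly 3 bits of entropy; the corrected
# estimate from a modest sample should land close to that.
rng = np.random.default_rng(1)
x = rng.integers(0, 8, 500)
h = entropy_miller_madow_bits(x, 8)
```

The raw plug-in estimate systematically undershoots the true entropy at small sample sizes; the correction adds back the leading-order term of that bias.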
Affiliation(s)
- Nicholas M Timme
- Department of Psychology, Indiana University - Purdue University Indianapolis, 402 N. Blackford St, Indianapolis, IN 46202
| | - Christopher Lapish
- Department of Psychology, Indiana University - Purdue University Indianapolis, 402 N. Blackford St, Indianapolis, IN 46202
|
29
|
Kostal L, D'Onofrio G. Coordinate invariance as a fundamental constraint on the form of stimulus-specific information measures. BIOLOGICAL CYBERNETICS 2018; 112:13-23. [PMID: 28856427 DOI: 10.1007/s00422-017-0729-7] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/20/2017] [Accepted: 08/16/2017] [Indexed: 06/07/2023]
Abstract
The value of Shannon's mutual information is commonly used to describe the total amount of information that the neural code transfers between the ensemble of stimuli and the ensemble of neural responses. In addition, it is often desirable to know which features of the stimulus or response are most informative. The literature offers several different decompositions of the mutual information into its stimulus- or response-specific components, such as the specific surprise or the uncertainty reduction, but the number of mutually distinct measures is in fact infinite. We resolve this ambiguity by requiring the specific information measures to be invariant under invertible coordinate transformations of the stimulus and the response ensembles. We prove that the Kullback-Leibler divergence is then the only suitable measure of the specific information. On a more general level, we discuss the necessity and fundamental aspects of coordinate invariance as a selection principle. We believe that our results will encourage further research into invariant statistical methods for the analysis of neural coding.
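The decomposition in question can be illustrated numerically: for a discrete joint distribution, the stimulus-specific surprise is the Kullback-Leibler divergence between p(r|s) and p(r), and its p(s)-weighted average recovers the full mutual information. A minimal sketch (the joint distribution is invented purely for illustration):

```python
import numpy as np

# Invented joint distribution p(s, r) over 3 stimuli and 4 responses
# (entries sum to 1; rows sum to p(s)).
pjoint = np.array([[0.10, 0.05, 0.05, 0.10],
                   [0.02, 0.18, 0.08, 0.02],
                   [0.05, 0.05, 0.25, 0.05]])
ps = pjoint.sum(axis=1)                 # p(s)
pr = pjoint.sum(axis=0)                 # p(r)
pr_given_s = pjoint / ps[:, None]       # p(r|s), one row per stimulus

# Stimulus-specific surprise: i(s) = D_KL( p(r|s) || p(r) ), in bits.
i_specific = (pr_given_s * np.log2(pr_given_s / pr)).sum(axis=1)

# Its p(s)-weighted average equals the full mutual information I(S;R).
mi = float((pjoint * np.log2(pjoint / np.outer(ps, pr))).sum())
```

Each i(s) is non-negative, which is one of the properties that distinguishes the specific surprise from other decompositions such as the uncertainty reduction.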
Affiliation(s)
- Lubomir Kostal
- Institute of Physiology, Czech Academy of Sciences, Videnska 1083, 14220, Prague 4, Czech Republic.
| | - Giuseppe D'Onofrio
- Institute of Physiology, Czech Academy of Sciences, Videnska 1083, 14220, Prague 4, Czech Republic
|
30
|
Zeldenrust F, de Knecht S, Wadman WJ, Denève S, Gutkin B. Estimating the Information Extracted by a Single Spiking Neuron from a Continuous Input Time Series. Front Comput Neurosci 2017; 11:49. [PMID: 28663729 PMCID: PMC5471316 DOI: 10.3389/fncom.2017.00049] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/17/2016] [Accepted: 05/19/2017] [Indexed: 11/30/2022] Open
Abstract
Understanding the relation between (sensory) stimuli and the activity of neurons (i.e., “the neural code”) lies at the heart of understanding the computational properties of the brain. However, quantifying the information between a stimulus and a spike train has proven to be challenging. We propose a new (in vitro) method to measure how much information a single neuron transfers from the input it receives to its output spike train. The input is generated by an artificial neural network that responds to a randomly appearing and disappearing “sensory stimulus”: the hidden state. The sum of this network activity is injected as current input into the neuron under investigation. The mutual information between the hidden state on the one hand and the spike trains of the artificial network or the recorded spike train on the other hand can easily be estimated due to the binary nature of the hidden state. The characteristics of the input current, such as the time constant resulting from the (dis)appearance rate of the hidden state or the amplitude of the input current (the firing frequency of the neurons in the artificial network), can be varied independently. As an example, we apply this method to pyramidal neurons in the CA1 of mouse hippocampi and compare the recorded spike trains to the optimal response of the “Bayesian neuron” (BN). We conclude that, as in the BN, information transfer in hippocampal pyramidal cells is non-linear and amplifying: the information loss between the artificial input and the output spike train is high if the input to the neuron (the firing of the artificial network) is not very informative about the hidden state. If the input to the neuron does contain a lot of information about the hidden state, the information loss is low. Moreover, neurons increase their firing rates when the (dis)appearance rate is high, so that the (relative) amount of transferred information stays constant.
Collapse
Affiliation(s)
- Fleur Zeldenrust
- Department of Neurophysiology, Faculty of Science, Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
| | - Sicco de Knecht
- Cellular and Systems Neurobiology, Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, Netherlands
| | - Wytse J Wadman
- Cellular and Systems Neurobiology, Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, Netherlands
| | - Sophie Denève
- Group for Neural Theory, Institut National de la Santé et de la Recherche Médicale U960, Institute of Cognitive Studies, École Normale Supérieure, Paris, France
| | - Boris Gutkin
- Group for Neural Theory, Institut National de la Santé et de la Recherche Médicale U960, Institute of Cognitive Studies, École Normale Supérieure, Paris, France; Department of Psychology, Center for Cognition and Decision Making, National Research University Higher School of Economics, Moscow, Russia
|
31
|
|
32
|
Cannon J. Analytical Calculation of Mutual Information between Weakly Coupled Poisson-Spiking Neurons in Models of Dynamically Gated Communication. Neural Comput 2016; 29:118-145. [PMID: 27870617 DOI: 10.1162/neco_a_00915] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Mutual information is a commonly used measure of communication between neurons, but little theory exists describing the relationship between mutual information and the parameters of the underlying neuronal interaction. Such a theory could help us understand how specific physiological changes affect the capacity of neurons to synaptically communicate and, in particular, it could help us characterize the mechanisms by which neuronal dynamics gate the flow of information in the brain. Here we study a pair of linear-nonlinear-Poisson neurons coupled by a weak synapse. We derive an analytical expression describing the mutual information between their spike trains in terms of synapse strength, neuronal activation function, the time course of postsynaptic currents, and the time course of the background input received by the two neurons. This expression allows mutual information calculations that would otherwise be computationally intractable. We use this expression to analytically explore the interaction of excitation, information transmission, and the convexity of the activation function. Then, using this expression to quantify mutual information in simulations, we illustrate the information-gating effects of neural oscillations and oscillatory coherence, which may either increase or decrease the mutual information across the synapse depending on parameters. Finally, we show analytically that our results can quantitatively describe the selection of one information pathway over another when multiple sending neurons project weakly to a single receiving neuron.
Collapse
Affiliation(s)
- Jonathan Cannon
- Department of Biology, Brandeis University, Waltham, MA 02453, U.S.A.
|
33
|
Methodology for Simulation and Analysis of Complex Adaptive Supply Network Structure and Dynamics Using Information Theory. ENTROPY 2016. [DOI: 10.3390/e18100367] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
|
34
|
de Assis JM, Santos MO, de Assis FM. Auditory Stimuli Coding by Postsynaptic Potential and Local Field Potential Features. PLoS One 2016; 11:e0160089. [PMID: 27513950 PMCID: PMC4981406 DOI: 10.1371/journal.pone.0160089] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/17/2015] [Accepted: 07/13/2016] [Indexed: 11/19/2022] Open
Abstract
The relation between physical stimuli and neurophysiological responses, such as action potentials (spikes) and Local Field Potentials (LFPs), has recently been investigated experimentally in order to explain how neurons encode auditory information. However, none of these experiments presented analyses with postsynaptic potentials (PSPs). In the present study, we estimated information values between auditory stimuli and the amplitudes/latencies of PSPs and LFPs in anesthetized rats in vivo. To obtain these values, a new method of information estimation was used. This method produced more accurate estimates than the traditional binning method, a fact corroborated by simulated data; the traditional binning method could not reach such accuracy even when adjusted by quadratic extrapolation. We found that the information obtained from LFP amplitude variation was significantly greater than that obtained from PSP amplitude variation, consistent with the fact that the LFP reflects the action of many PSPs. The results show that the auditory cortex encodes more information about stimulus frequency in the slow oscillations of groups of neurons than in those of neurons taken separately.
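The quadratic-extrapolation adjustment mentioned here is a standard bias-reduction device; sketched under our own assumptions (not as the authors' exact procedure), it estimates the information at several sub-sample sizes, fits a quadratic in 1/N, and takes the intercept as the infinite-data estimate:

```python
import numpy as np

def plugin_mi_bits(x, y, bins):
    """Plug-in (binned) mutual information estimate in bits."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

def mi_quadratic_extrapolation(x, y, bins):
    """Estimate MI at full, half, and quarter sample size, fit a quadratic
    in 1/N, and return the intercept (the 1/N -> 0 extrapolation)."""
    n = len(x)
    inv_n, mi = [], []
    for frac in (1.0, 0.5, 0.25):
        m = int(n * frac)
        inv_n.append(1.0 / m)
        mi.append(plugin_mi_bits(x[:m], y[:m], bins))
    c = np.polyfit(inv_n, mi, 2)   # exact fit through the three points
    return float(c[-1])            # constant term = value at 1/N = 0

# For independent data the raw estimate carries a positive bias that the
# extrapolation largely removes.
rng = np.random.default_rng(2)
x = rng.integers(0, 8, 4000)
y = rng.integers(0, 8, 4000)
mi_raw = plugin_mi_bits(x, y, bins=8)
mi_ext = mi_quadratic_extrapolation(x, y, bins=8)
```

Extrapolation of this kind only removes the systematic bias; it cannot recover structure destroyed by too-coarse binning, which is presumably why the authors' estimator still outperforms it.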
Affiliation(s)
- Juliana M. de Assis
- Department of Electrical Engineering, Federal University of Campina Grande, Campina Grande, Paraíba, Brazil
| | - Mikaelle O. Santos
- Department of Electrical Engineering, Federal University of Campina Grande, Campina Grande, Paraíba, Brazil
| | - Francisco M. de Assis
- Department of Electrical Engineering, Federal University of Campina Grande, Campina Grande, Paraíba, Brazil
|
35
|
Zorick T, Smith J. Generalized Information Equilibrium Approaches to EEG Sleep Stage Discrimination. COMPUTATIONAL AND MATHEMATICAL METHODS IN MEDICINE 2016; 2016:6450126. [PMID: 27516806 PMCID: PMC4969566 DOI: 10.1155/2016/6450126] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 03/15/2016] [Revised: 05/28/2016] [Accepted: 06/19/2016] [Indexed: 11/18/2022]
Abstract
Recent advances in neuroscience have raised the hypothesis that the underlying pattern of neuronal activation which results in electroencephalography (EEG) signals is via power-law distributed neuronal avalanches, while EEG signals are nonstationary. Therefore, spectral analysis of EEG may miss many properties inherent in such signals. A complete understanding of such dynamical systems requires knowledge of the underlying nonequilibrium thermodynamics. In recent work by Fielitz and Borchardt (2011, 2014), the concept of information equilibrium (IE) in information transfer processes has successfully characterized many different systems far from thermodynamic equilibrium. We utilized a publicly available database of polysomnogram EEG data from fourteen subjects with eight different one-minute tracings of sleep stage 2 and waking and an overlapping set of eleven subjects with eight different one-minute tracings of sleep stage 3. We applied principles of IE to model EEG as a system that transfers (equilibrates) information from the time domain to scalp-recorded voltages. We find that waking consciousness is readily distinguished from sleep stages 2 and 3 by several differences in mean information transfer constants. Principles of IE applied to EEG may therefore prove to be useful in the study of changes in brain function more generally.
Affiliation(s)
- Todd Zorick
- Department of Psychiatry, Veterans Affairs Greater Los Angeles Healthcare System, Los Angeles, CA 90073, USA; Department of Psychiatry and Biobehavioral Sciences, UCLA, Los Angeles, CA, USA
|
36
|
Sims CR. Rate-distortion theory and human perception. Cognition 2016; 152:181-198. [PMID: 27107330 DOI: 10.1016/j.cognition.2016.03.020] [Citation(s) in RCA: 43] [Impact Index Per Article: 5.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/09/2015] [Revised: 03/22/2016] [Accepted: 03/25/2016] [Indexed: 11/19/2022]
Abstract
The fundamental goal of perception is to aid in the achievement of behavioral objectives. This requires extracting and communicating useful information from noisy and uncertain sensory signals. At the same time, given the complexity of sensory information and the limitations of biological information processing, it is necessary that some information must be lost or discarded in the act of perception. Under these circumstances, what constitutes an 'optimal' perceptual system? This paper describes the mathematical framework of rate-distortion theory as the optimal solution to the problem of minimizing the costs of perceptual error subject to strong constraints on the ability to communicate or transmit information. Rate-distortion theory offers a general and principled theoretical framework for developing computational-level models of human perception (Marr, 1982). Models developed in this framework are capable of producing quantitatively precise explanations for human perceptual performance, while yielding new insights regarding the nature and goals of perception. This paper demonstrates the application of rate-distortion theory to two benchmark domains where capacity limits are especially salient in human perception: discrete categorization of stimuli (also known as absolute identification) and visual working memory. A software package written for the R statistical programming language is described that aids in the development of models based on rate-distortion theory.
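The rate-distortion trade-off at the core of this framework can be computed for small discrete sources with the classical Blahut-Arimoto algorithm. The sketch below is generic (unrelated to the authors' R package; all names are ours):

```python
import numpy as np

def blahut_arimoto_rate(p_x, distortion, beta, iters=500):
    """For trade-off parameter beta, iterate toward the channel q(y|x)
    minimizing I(X;Y) + beta * E[d(x,y)]; returns (rate_bits, E[d])."""
    ny = distortion.shape[1]
    q_y = np.full(ny, 1.0 / ny)                    # output marginal
    for _ in range(iters):
        w = q_y[None, :] * np.exp(-beta * distortion)
        q_y_given_x = w / w.sum(axis=1, keepdims=True)
        q_y = p_x @ q_y_given_x                    # update marginal
    rate = (p_x[:, None] * q_y_given_x *
            np.log2(q_y_given_x / q_y[None, :])).sum()
    dist = (p_x[:, None] * q_y_given_x * distortion).sum()
    return float(rate), float(dist)

# Uniform binary source with Hamming distortion, where the curve is
# known in closed form: R(D) = 1 - H_b(D).
p_x = np.array([0.5, 0.5])
d = np.array([[0.0, 1.0],
              [1.0, 0.0]])
rate, dist = blahut_arimoto_rate(p_x, d, beta=2.0)
```

Sweeping beta traces out the whole R(D) curve, i.e., the family of optimal perceptual channels under progressively tighter capacity limits.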
Affiliation(s)
- Chris R Sims
- Department of Psychology, Drexel University, Philadelphia, PA, United States.
|
37
|
Katz ML, Viney TJ, Nikolic K. Receptive Field Vectors of Genetically-Identified Retinal Ganglion Cells Reveal Cell-Type-Dependent Visual Functions. PLoS One 2016; 11:e0147738. [PMID: 26845435 PMCID: PMC4742227 DOI: 10.1371/journal.pone.0147738] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/01/2015] [Accepted: 01/07/2016] [Indexed: 11/18/2022] Open
Abstract
Sensory stimuli are encoded by diverse kinds of neurons, but the identities of the neurons recorded in such studies are often unknown. We explored in detail the firing patterns of eight previously defined, genetically-identified retinal ganglion cell (RGC) types from a single transgenic mouse line. We first introduce a new technique for deriving receptive field vectors (RFVs), which utilises a modified form of mutual information (“Quadratic Mutual Information”). We analysed the firing patterns of RGCs during presentation of short-duration (~10 second) complex visual scenes (natural movies). We probed the high-dimensional space formed by the visual input for a much lower-dimensional subspace of RFVs that gives the most information about the response of each cell. The new technique is efficient and fast, and the derivation of novel types of RFVs formed by the natural-scene visual input was possible even with limited numbers of spikes per cell. This approach enabled us to estimate the 'visual memory' of each cell type and the corresponding receptive field area by calculating mutual information as a function of the number of frames and the radius. Finally, we made predictions of biologically relevant functions based on the RFVs of each cell type. The RGC class analysis was complemented with results for the cells’ responses to simple visual input in the form of black-and-white spot stimulation, and their classification on several key physiological metrics. Thus RFVs lead to predictions of biological roles based on limited data and facilitate the analysis of sensory-evoked spiking data from defined cell types.
Affiliation(s)
- Matthew L. Katz
- Centre for Bio-Inspired Technology, Institute of Biomedical Engineering, Department of Electrical and Electronic Engineering, The Bessemer Building, Imperial College London, London SW7 2AZ, United Kingdom
| | - Tim J. Viney
- Neural Circuit Laboratories, Friedrich Miescher Institute for Biomedical Research, 4058 Basel, Switzerland
- University of Basel, 4003 Basel, Switzerland
| | - Konstantin Nikolic
- Centre for Bio-Inspired Technology, Institute of Biomedical Engineering, Department of Electrical and Electronic Engineering, The Bessemer Building, Imperial College London, London SW7 2AZ, United Kingdom
|
38
|
Optimal decoding and information transmission in Hodgkin-Huxley neurons under metabolic cost constraints. Biosystems 2015; 136:3-10. [PMID: 26141378 DOI: 10.1016/j.biosystems.2015.06.008] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2015] [Revised: 06/03/2015] [Accepted: 06/25/2015] [Indexed: 11/23/2022]
Abstract
Information theory quantifies the ultimate limits on reliable information transfer by means of the channel capacity. However, the channel capacity is known to be an asymptotic quantity, assuming unlimited metabolic cost and computational power. We investigate a single-compartment Hodgkin-Huxley type neuronal model under the spike-rate coding scheme and address how the metabolic cost and the decoding complexity affects the optimal information transmission. We find that the sub-threshold stimulation regime, although attaining the smallest capacity, allows for the most efficient balance between the information transmission and the metabolic cost. Furthermore, we determine post-synaptic firing rate histograms that are optimal from the information-theoretic point of view, which enables the comparison of our results with experimental data.
|
39
|
Rucci M, Victor JD. The unsteady eye: an information-processing stage, not a bug. Trends Neurosci 2015; 38:195-206. [PMID: 25698649 PMCID: PMC4385455 DOI: 10.1016/j.tins.2015.01.005] [Citation(s) in RCA: 131] [Impact Index Per Article: 14.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/27/2014] [Revised: 01/20/2015] [Accepted: 01/22/2015] [Indexed: 11/25/2022]
Abstract
How is space represented in the visual system? At first glance, the answer to this fundamental question appears straightforward: spatial information is directly encoded in the locations of neurons within maps. This concept has long dominated visual neuroscience, leading to mainstream theories of how neurons encode information. However, an accumulation of evidence indicates that this purely spatial view is incomplete and that, even for static images, the representation is fundamentally spatiotemporal. The evidence for this new understanding centers on recent experimental findings concerning the functional role of fixational eye movements, the tiny movements humans and other species continually perform, even when attending to a single point. We review some of these findings and discuss their functional implications.
Affiliation(s)
- Michele Rucci
- Department of Psychological and Brain Sciences, Boston University, Boston, MA 02215, USA; Graduate Program in Neuroscience, Boston University, Boston, MA 02215, USA.
| | - Jonathan D Victor
- Brain and Mind Research Institute, Weill Cornell Medical College, New York, NY 10065, USA
|
40
|
Barigye SJ, Marrero-Ponce Y, Pérez-Giménez F, Bonchev D. Trends in information theory-based chemical structure codification. Mol Divers 2014; 18:673-86. [DOI: 10.1007/s11030-014-9517-7] [Citation(s) in RCA: 30] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/07/2013] [Accepted: 03/07/2014] [Indexed: 11/25/2022]
|
41
|
Faghihi F, Kolodziejski C, Fiala A, Wörgötter F, Tetzlaff C. An information theoretic model of information processing in the Drosophila olfactory system: the role of inhibitory neurons for system efficiency. Front Comput Neurosci 2013; 7:183. [PMID: 24391579 PMCID: PMC3868887 DOI: 10.3389/fncom.2013.00183] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/29/2013] [Accepted: 12/03/2013] [Indexed: 11/13/2022] Open
Abstract
Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted by the olfactory system, without system-relevant loss, to deeper brain areas for learning. Here we study the role of several parameters of the fly's olfactory system and the environment, and how they influence olfactory information transmission. We have designed an abstract model of the antennal lobe, the mushroom body, and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and the firing thresholds of Kenyon cells. On the other hand, we analyze the influence of inhibition on the mutual information between environment and mushroom body. Our simulations show the expected linear relation between the antennal lobe-to-mushroom body connectivity rate and the Kenyon cell firing threshold that maximizes mutual information, for both low and high odor concentrations. However, contradicting everyday experience, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentrations. But when inhibition on the mushroom body is included, mutual information remains at high levels independent of other system parameters. This finding points to a pivotal role of inhibition in fly information processing, without which system efficiency would be substantially reduced.
Affiliation(s)
- Faramarz Faghihi
- Department of Computational Neuroscience, Bernstein Center for Computational Neuroscience, III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Germany
| | - Christoph Kolodziejski
- Department of Computational Neuroscience, Bernstein Center for Computational Neuroscience, III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Germany
| | - André Fiala
- Molecular Neurobiology of Behavior, Johann-Friedrich-Blumenbach-Institute for Zoology and Anthropology, Georg-August-Universität Göttingen, Germany
| | - Florentin Wörgötter
- Department of Computational Neuroscience, Bernstein Center for Computational Neuroscience, III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Germany
| | - Christian Tetzlaff
- Department of Computational Neuroscience, Bernstein Center for Computational Neuroscience, III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Germany
|
42
|
Influence of biophysical properties on temporal filters in a sensory neuron. BMC Neurosci 2013. [PMCID: PMC3704666 DOI: 10.1186/1471-2202-14-s1-p347] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022] Open
|
43
|
Tsubo Y, Isomura Y, Fukai T. Power-law inter-spike interval distributions infer a conditional maximization of entropy in cortical neurons. PLoS Comput Biol 2012; 8:e1002461. [PMID: 22511856 PMCID: PMC3325172 DOI: 10.1371/journal.pcbi.1002461] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/23/2011] [Accepted: 02/20/2012] [Indexed: 11/18/2022] Open
Abstract
The brain is considered to use a relatively small amount of energy for its efficient information processing. Under a severe restriction on energy consumption, the maximization of mutual information (MMI), which is adequate for designing artificial processing machines, may not suit the brain. The MMI attempts to send information as accurately as possible, which usually requires a sufficient energy supply for establishing clearly discretized communication bands. Here, we derive an alternative hypothesis for the neural code from neuronal activities recorded juxtacellularly in the sensorimotor cortex of behaving rats. Our hypothesis states that in vivo cortical neurons maximize the entropy of neuronal firing under two constraints, one limiting the energy consumption (as assumed previously) and one restricting the uncertainty in output spike sequences at a given firing rate. Thus, the conditional maximization of firing-rate entropy (CMFE) solves a tradeoff between energy cost and noise in the neuronal response. In short, the CMFE sends a rich variety of information through broader communication bands (i.e., widely distributed firing rates) at the cost of accuracy. We demonstrate that the CMFE is reflected in the long-tailed, typically power-law, distributions of inter-spike intervals obtained for the majority of recorded neurons. In other words, the power-law tails are more consistent with the CMFE than with the MMI. Thus, we propose a mathematical principle by which cortical neurons may represent information about synaptic input in their output spike trains.
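Entropy maximization under a mean (energy) constraint, the kind of constrained problem invoked here, has the Gibbs form p(v) proportional to exp(-beta*v). A hedged sketch that solves for beta by bisection over a finite set of firing rates (the rates, target mean, and function names are illustrative, not the paper's model):

```python
import numpy as np

def maxent_under_mean(values, target_mean, lo=-50.0, hi=50.0):
    """Maximum-entropy distribution p(v) ~ exp(-beta * v) over a finite
    set of values, with beta chosen by bisection so that the mean of the
    distribution matches target_mean."""
    values = np.asarray(values, float)

    def mean_for(beta):
        w = np.exp(-beta * (values - values.mean()))  # shift for stability
        p = w / w.sum()
        return (p * values).sum()

    # mean_for is strictly decreasing in beta, so bisection applies.
    a, b = lo, hi
    for _ in range(200):
        mid = 0.5 * (a + b)
        if mean_for(mid) > target_mean:
            a = mid
        else:
            b = mid
    beta = 0.5 * (a + b)
    w = np.exp(-beta * (values - values.mean()))
    return w / w.sum()

rates = np.arange(0, 10)                 # candidate firing rates (spikes/s)
p = maxent_under_mean(rates, target_mean=2.0)
```

With a low energy budget (target mean well below the midpoint of the range), the solution decays with rate, which is qualitatively the long-tailed shape the CMFE argument appeals to; the paper's second constraint, on output uncertainty at a given rate, would enter as an additional Lagrange term.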
Affiliation(s)
- Yasuhiro Tsubo
- Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Saitama, Japan.
|
44
|
Crumiller M, Knight B, Yu Y, Kaplan E. Estimating the amount of information conveyed by a population of neurons. Front Neurosci 2011; 5:90. [PMID: 21811435 PMCID: PMC3139929 DOI: 10.3389/fnins.2011.00090] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/17/2011] [Accepted: 06/28/2011] [Indexed: 11/13/2022] Open
Abstract
Recent technological advances have made the simultaneous recording of the activity of many neurons common. However, estimating the amount of information conveyed by the discharge of a neural population remains a significant challenge. Here we describe our recently published analysis method that assists in such estimates. We describe the key concepts and assumptions on which the method is based, illustrate its use with data from both simulated and real neurons recorded from the lateral geniculate nucleus of a monkey, and show how it can be used to calculate redundancy and synergy among neuronal groups.
Affiliation(s)
- Marshall Crumiller
- The Fishberg Department of Neuroscience and Friedman Brain Institute, The Mount Sinai School of Medicine, New York, NY, USA
|