1. Shalev Y, Painsky A, Ben-Gal I. Neural Joint Entropy Estimation. IEEE Transactions on Neural Networks and Learning Systems 2024; 35:5488-5500. [PMID: 36155469] [DOI: 10.1109/tnnls.2022.3204919]
Abstract
Estimating the entropy of a discrete random variable is a fundamental problem in information theory and related fields, with applications in machine learning, statistics, and data compression. Over the years, a variety of estimation schemes have been suggested. However, despite significant progress, most methods still struggle when the sample is small compared to the variable's alphabet size. In this work, we introduce a practical solution to this problem, which extends the work of McAllester and Statos. The proposed scheme exploits the generalization abilities of cross-entropy estimation in deep neural networks (DNNs) to achieve improved entropy estimation accuracy. Furthermore, we introduce a family of estimators for related information-theoretic measures, such as conditional entropy and mutual information (MI). We show that these estimators are strongly consistent and demonstrate their performance in a variety of use cases. First, we consider large-alphabet entropy estimation. Then, we extend the scope to MI estimation. Next, we apply the proposed scheme to conditional MI estimation, focusing on independence-testing tasks. Finally, we study a transfer entropy (TE) estimation problem. The proposed estimators demonstrate improved performance compared to existing methods in all of these setups.
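To make the cross-entropy idea concrete, the sketch below contrasts the plug-in estimator with a held-out cross-entropy estimate on a small sample over a large alphabet. It is a minimal illustration of the underlying principle (the population cross-entropy of any model q equals H(X) + KL(p||q), so it upper-bounds the true entropy), not the authors' DNN scheme; the smoothed categorical model and all parameter choices are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: a skewed distribution over a large alphabet.
K = 1000
p = rng.dirichlet(np.full(K, 0.05))
true_H = -np.sum(p[p > 0] * np.log2(p[p > 0]))

x = rng.choice(K, size=2000, p=p)      # small sample relative to alphabet size
train, test = x[:1000], x[1000:]

# Plug-in estimate: entropy of the empirical distribution (biased downward).
q = np.bincount(x, minlength=K) / len(x)
plugin_H = -np.sum(q[q > 0] * np.log2(q[q > 0]))

# Cross-entropy estimate: fit a smoothed model on one half, score the other.
# The held-out average -log2(model) estimates H(p) + KL(p||model).
model = (np.bincount(train, minlength=K) + 1.0) / (len(train) + K)  # Laplace
ce_H = -np.mean(np.log2(model[test]))

print(f"true H = {true_H:.2f} bits, plug-in = {plugin_H:.2f}, cross-entropy = {ce_H:.2f}")
```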
2. Macedo F, Valadas R, Carrasquinha E, Oliveira MR, Pacheco A. Feature selection using Decomposed Mutual Information Maximization. Neurocomputing 2022. [DOI: 10.1016/j.neucom.2022.09.101]
3. Altmann U, Strauss B, Tschacher W. Cross-Correlation- and Entropy-Based Measures of Movement Synchrony: Non-Convergence of Measures Leads to Different Associations with Depressive Symptoms. Entropy (Basel, Switzerland) 2022; 24:1307. [PMID: 36141194] [PMCID: PMC9497848] [DOI: 10.3390/e24091307]
Abstract
BACKGROUND Several algorithms have been proposed to quantify movement synchrony, but little is known about their convergent and predictive validity. METHODS The sample included 30 persons who completed a manualized interview focusing on psychosomatic symptoms. The intensity of body motions was measured using motion-energy analysis. We computed several measures of movement synchrony based on the time series of the interviewer and the participant: mutual information, windowed cross-recurrence analysis, cross-correlation, rMEA, SUSY, SUCO, WCLC-PP, and WCLR-PP. Depressive symptoms were assessed with the Patient Health Questionnaire (PHQ-9). RESULTS According to the explorative factor analyses, all the variants of cross-correlation and all the measures of SUSY, SUCO, and rMEA-WCC led to similar synchrony measures and could be assigned to the same factor. All the mutual-information measures, rMEA-WCLC, WCLC-PP-F, WCLC-PP-R2, WCLR-PP-F, and WinCRQA-DET loaded on the second factor. Depressive symptoms correlated negatively with WCLC-PP-F and WCLR-PP-F, and positively with rMEA-WCC, SUCO-ES-CO, and MI-Z. CONCLUSION More standardization efforts are needed: different synchrony measures have little convergent validity, which can lead to contradictory conclusions about the association between depressive symptoms and movement synchrony even when the same dataset is used.
Affiliation(s)
- Uwe Altmann
- Institute of Psychosocial Medicine, Psychotherapy and Psycho-Oncology, Jena University Hospital, D-07743 Jena, Germany
- Bernhard Strauss
- Institute of Psychosocial Medicine, Psychotherapy and Psycho-Oncology, Jena University Hospital, D-07743 Jena, Germany
- Wolfgang Tschacher
- Department of Experimental Psychology, University Hospital of Psychiatry and Psychotherapy, CH-3060 Bern, Switzerland
4. Newman EL, Varley TF, Parakkattu VK, Sherrill SP, Beggs JM. Revealing the Dynamics of Neural Information Processing with Multivariate Information Decomposition. Entropy (Basel, Switzerland) 2022; 24:930. [PMID: 35885153] [PMCID: PMC9319160] [DOI: 10.3390/e24070930]
Abstract
The varied cognitive abilities and rich adaptive behaviors enabled by the animal nervous system are often described in terms of information processing. This framing raises the issue of how biological neural circuits actually process information, and some of the most fundamental outstanding questions in neuroscience center on understanding the mechanisms of neural information processing. Classical information theory has long been understood to be a natural framework within which information processing can be understood, and recent advances in the field of multivariate information theory offer new insights into the structure of computation in complex systems. In this review, we provide an introduction to the conceptual and practical issues associated with using multivariate information theory to analyze information processing in neural circuits, and discuss recent empirical work in this vein. Specifically, we provide an accessible introduction to the partial information decomposition (PID) framework. PID reveals redundant, unique, and synergistic modes by which neurons integrate information from multiple sources. We focus particularly on the synergistic mode, which quantifies the "higher-order" information carried in the patterns of multiple inputs and is not reducible to input from any single source. Recent work in a variety of model systems has revealed that synergistic dynamics are ubiquitous in neural circuitry and show reliable structure-function relationships, emerging disproportionately in neuronal rich clubs, downstream of recurrent connectivity, and in the convergence of correlated activity. We draw on the existing literature on higher-order information dynamics in neuronal networks to illustrate the insights that have been gained by taking an information decomposition perspective on neural activity. Finally, we briefly discuss future promising directions for information decomposition approaches to neuroscience, such as work on behaving animals, multi-target generalizations of PID, and time-resolved local analyses.
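As a concrete illustration of the PID quantities named above, here is a minimal sketch of the original Williams-Beer I_min decomposition for two discrete sources and one target. This is only one of several proposed redundancy measures, used here as an assumed example; the review discusses alternatives, and the XOR distribution is a toy choice that exhibits pure synergy.

```python
import numpy as np

def mutual_info(pxy):
    """I(X;Y) in bits from a joint probability table (rows: x, cols: y)."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

def pid_imin(p):
    """Williams-Beer PID of p[s1, s2, t]: (redundancy, unique1, unique2, synergy)."""
    pt = p.sum(axis=(0, 1))
    def specific(ps_given_t, ps):          # specific information I(S; T=t)
        nz = ps_given_t > 0
        return np.sum(ps_given_t[nz] * np.log2(ps_given_t[nz] / ps[nz]))
    p1t = p.sum(axis=1)                    # joint p(s1, t)
    p2t = p.sum(axis=0)                    # joint p(s2, t)
    p1, p2 = p1t.sum(axis=1), p2t.sum(axis=1)
    R = 0.0
    for t in range(p.shape[2]):
        if pt[t] == 0:
            continue
        i1 = specific(p1t[:, t] / pt[t], p1)
        i2 = specific(p2t[:, t] / pt[t], p2)
        R += pt[t] * min(i1, i2)           # redundancy: expected min specific info
    I1, I2 = mutual_info(p1t), mutual_info(p2t)
    I12 = mutual_info(p.reshape(-1, p.shape[2]))   # I(S1,S2 ; T)
    return R, I1 - R, I2 - R, I12 - I1 - I2 + R

# XOR target: the classic example of purely synergistic information.
p = np.zeros((2, 2, 2))
for s1 in (0, 1):
    for s2 in (0, 1):
        p[s1, s2, s1 ^ s2] = 0.25
print(pid_imin(p))   # approximately (0, 0, 0, 1): one bit of pure synergy
```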
Affiliation(s)
- Ehren L. Newman
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405, USA
- Thomas F. Varley
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405, USA
- Vibin K. Parakkattu
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405, USA
- John M. Beggs
- Department of Physics, Indiana University, Bloomington, IN 47405, USA
5. Ricci L, Perinelli A. Estimating Permutation Entropy Variability via Surrogate Time Series. Entropy (Basel, Switzerland) 2022; 24:853. [PMID: 35885077] [PMCID: PMC9318716] [DOI: 10.3390/e24070853]
Abstract
In the last decade, permutation entropy (PE) has become a popular tool to analyze the degree of randomness within a time series. In typical applications, changes in the dynamics of a source are inferred by observing changes of PE computed on different time series generated by that source. However, most works neglect the crucial question of the statistical significance of these changes. The main reason probably lies in the difficulty of assessing, out of a single time series, not only the PE value, but also its uncertainty. In this paper we propose a method to overcome this issue by generating surrogate time series. The analysis conducted on both synthetic and experimental time series shows the reliability of the approach, which can be promptly implemented by means of widely available numerical tools. The method is computationally affordable for a broad range of users.
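A minimal sketch of the idea, under simplifying assumptions: compute PE from ordinal patterns, then gauge its spread over surrogate series. For brevity the surrogates here are plain random shuffles, which give the PE spread under a pure-randomness null; the paper's surrogate construction is designed to preserve the observed ordinal statistics and will differ.

```python
import math
import numpy as np

def permutation_entropy(x, m=3):
    """Normalized permutation entropy of order m (embedding delay 1)."""
    windows = np.lib.stride_tricks.sliding_window_view(np.asarray(x), m)
    patterns = np.argsort(windows, axis=1)          # ordinal pattern per window
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p)) / np.log2(math.factorial(m))

rng = np.random.default_rng(1)
x = np.sin(0.3 * np.arange(2000)) + 0.5 * rng.standard_normal(2000)

pe = permutation_entropy(x)
# Spread of PE over shuffle surrogates: the simplest possible surrogate scheme.
surr = np.array([permutation_entropy(rng.permutation(x)) for _ in range(200)])
print(f"PE(x) = {pe:.3f}; shuffle null: {surr.mean():.3f} +/- {surr.std():.3f}")
```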
Affiliation(s)
- Leonardo Ricci
- Department of Physics, University of Trento, 38123 Trento, Italy
- CIMeC, Center for Mind/Brain Sciences, University of Trento, 38068 Rovereto, Italy

6. Shang J, Wang J, Sun Y, Li F, Liu JX, Zhang H. Multiscale part mutual information for quantifying nonlinear direct associations in networks. Bioinformatics 2021; 37:2920-2929. [PMID: 33730153] [DOI: 10.1093/bioinformatics/btab182]
Abstract
MOTIVATION For network-assisted analysis, which has become a popular method of data mining, network construction is a crucial task. Network construction relies on the accurate quantification of direct associations among variables. The existence of multiscale associations among variables presents several quantification challenges, especially when quantifying nonlinear direct interactions. RESULTS In this study, the multiscale part mutual information (MPMI), based on part mutual information (PMI) and nonlinear partial association (NPA), was developed for effectively quantifying nonlinear direct associations among variables in networks with multiscale associations. First, we defined the MPMI in theory and derived five of its important properties. Second, an experiment in a three-node network was carried out to numerically estimate its quantification ability under two cases of strong associations. Third, experiments of the MPMI, with comparisons against the PMI, NPA, and conditional mutual information, were performed on simulated datasets and on datasets from the DREAM challenge project. Finally, the MPMI was applied to real datasets of glioblastoma and lung adenocarcinoma to validate its effectiveness. Results showed that the MPMI is an effective alternative measure for quantifying nonlinear direct associations in networks, especially those with multiscale associations. AVAILABILITY AND IMPLEMENTATION The source code of MPMI is available online at https://github.com/CDMB-lab/MPMI. SUPPLEMENTARY INFORMATION Supplementary data are available at Bioinformatics online.
Affiliation(s)
- Junliang Shang
- School of Computer Science, Qufu Normal University, Rizhao 276826, China
- Jing Wang
- School of Computer Science, Qufu Normal University, Rizhao 276826, China
- Yan Sun
- School of Computer Science, Qufu Normal University, Rizhao 276826, China
- Feng Li
- School of Computer Science, Qufu Normal University, Rizhao 276826, China
- Jin-Xing Liu
- School of Computer Science, Qufu Normal University, Rizhao 276826, China
- Honghai Zhang
- College of Life Science, Qufu Normal University, Qufu 273165, China
7. Feutrill A, Roughan M. A Review of Shannon and Differential Entropy Rate Estimation. Entropy (Basel, Switzerland) 2021; 23:1046. [PMID: 34441186] [PMCID: PMC8392187] [DOI: 10.3390/e23081046]
Abstract
In this paper, we present a review of Shannon and differential entropy rate estimation techniques. The entropy rate, which measures the average information gain per observation of a stochastic process, is a measure of the uncertainty and complexity of the process. We discuss the estimation of the entropy rate from empirical data and review both parametric and non-parametric techniques. For parametric techniques, we consider a range of assumptions on the properties of the process, focusing in particular on Markov and Gaussian assumptions. Non-parametric estimation relies on limit theorems that relate the entropy rate to quantities computable from observations; to discuss these, we introduce some theory and the practical implementations of estimators of this type.
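For the parametric Markov case mentioned above, the entropy rate has a closed form, H = -Σ_i π_i Σ_j P_ij log P_ij, with π the stationary distribution of the transition matrix P. A minimal sketch (the transition matrix below is an arbitrary example):

```python
import numpy as np

def markov_entropy_rate(P):
    """Entropy rate (bits/step) of a regular Markov chain with transition matrix P."""
    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    pi = pi / pi.sum()
    # Per-state entropy of the next-step distribution, averaged under pi.
    nz = P > 0
    row_H = -np.sum(np.where(nz, P * np.log2(np.where(nz, P, 1.0)), 0.0), axis=1)
    return float(pi @ row_H)

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
print(f"entropy rate = {markov_entropy_rate(P):.4f} bits/step")
```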
Affiliation(s)
- Andrew Feutrill
- CSIRO/Data61, 13 Kintore Avenue, Adelaide, SA 5000, Australia
- School of Mathematical Sciences, The University of Adelaide, Adelaide, SA 5005, Australia
- ARC Centre of Excellence for Mathematical & Statistical Frontiers, The University of Melbourne, Parkville, VIC 3010, Australia
- Matthew Roughan
- School of Mathematical Sciences, The University of Adelaide, Adelaide, SA 5005, Australia
- ARC Centre of Excellence for Mathematical & Statistical Frontiers, The University of Melbourne, Parkville, VIC 3010, Australia
8. Madarro-Capó EJ, Legón-Pérez CM, Rojas O, Sosa-Gómez G. Information Theory Based Evaluation of the RC4 Stream Cipher Outputs. Entropy 2021; 23:896. [PMID: 34356437] [PMCID: PMC8306200] [DOI: 10.3390/e23070896]
Abstract
This paper presents a criterion, based on information theory, to measure the amount of average information that the output sequences of RC4 provide about its internal state. The test statistic used is the sum of the maximum plausible estimates of the entropies H(jt|zt), corresponding to the probability distributions P(jt|zt) of the sequences of random variables (jt)t∈T and (zt)t∈T, which are independent but not identically distributed, where zt are the known output values and jt is one of the unknown elements of the internal state of RC4. It is experimentally demonstrated that the test statistic allows for determining the most vulnerable RC4 outputs, and it is proposed as a vulnerability metric for each RC4 output sequence with respect to the iterative probabilistic attack.
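The central quantity is a conditional entropy estimated from paired samples. The following generic plug-in sketch is illustrative only, with synthetic data standing in for RC4 state/output pairs; it is not the authors' exact statistic.

```python
import numpy as np

def conditional_entropy(j, z):
    """Plug-in estimate of H(J | Z) in bits from paired discrete samples."""
    j, z = np.asarray(j), np.asarray(z)
    n, H = len(z), 0.0
    for zv in np.unique(z):
        sel = j[z == zv]                       # samples of J for this output value
        p = np.bincount(sel).astype(float)
        p = p[p > 0] / p.sum()
        H += (len(sel) / n) * -np.sum(p * np.log2(p))
    return H

rng = np.random.default_rng(0)
z = rng.integers(0, 256, 100_000)
j_dep = (z + rng.integers(0, 8, z.size)) % 256   # state leaks through the output
j_ind = rng.integers(0, 256, z.size)             # state independent of the output
print(conditional_entropy(j_dep, z))  # ~3 bits: the output constrains the state
print(conditional_entropy(j_ind, z))  # ~8 bits: no information leaked
```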
Affiliation(s)
- Evaristo José Madarro-Capó
- Facultad de Matemática y Computación, Instituto de Criptografía, Universidad de la Habana, Habana 10400, Cuba
- Carlos Miguel Legón-Pérez
- Facultad de Matemática y Computación, Instituto de Criptografía, Universidad de la Habana, Habana 10400, Cuba
- Omar Rojas
- Facultad de Ciencias Económicas y Empresariales, Universidad Panamericana, Álvaro del Portillo 49, Zapopan 45010, Jalisco, Mexico
- Correspondence: Tel.: +52-331-368-2200
- Guillermo Sosa-Gómez
- Facultad de Ciencias Económicas y Empresariales, Universidad Panamericana, Álvaro del Portillo 49, Zapopan 45010, Jalisco, Mexico

9.
Abstract
Quantitative genetics has evolved dramatically in the past century, and the proliferation of genetic data, in quantity as well as type, enables the characterization of complex interactions and mechanisms beyond the scope of its theoretical foundations. In this article, we argue that revisiting the framework for analysis is important and we begin to lay the foundations of an alternative formulation of quantitative genetics based on information theory. Information theory can provide sensitive and unbiased measures of statistical dependencies among variables, and it provides a natural mathematical language for an alternative view of quantitative genetics. In previous work, we examined the information content of discrete functions and applied this approach and methods to the analysis of genetic data. In this article, we present a framework built around a set of relationships that both unifies the information measures for the discrete functions and uses them to express key quantitative genetic relationships. Information theory measures of variable interdependency are used to identify significant interactions, and a general approach is described for inferring functional relationships in genotype and phenotype data. We present information-based measures of the genetic quantities: penetrance, heritability, and degrees of statistical epistasis. Our scope here includes the consideration of both two- and three-variable dependencies and independently segregating variants, which captures additive effects, genetic interactions, and two-phenotype pleiotropy. This formalism and the theoretical approach naturally apply to higher multivariable interactions and complex dependencies, and can be adapted to account for population structure, linkage, and nonrandomly segregating markers. This article thus focuses on presenting the initial groundwork for a full formulation of quantitative genetics based on information theory.
Affiliation(s)
- David J. Galas
- Pacific Northwest Research Institute, Seattle, Washington, USA
- Lisa Uechi
- Pacific Northwest Research Institute, Seattle, Washington, USA

10. Contreras Rodríguez L, Madarro-Capó EJ, Legón-Pérez CM, Rojas O, Sosa-Gómez G. Selecting an Effective Entropy Estimator for Short Sequences of Bits and Bytes with Maximum Entropy. Entropy (Basel, Switzerland) 2021; 23:561. [PMID: 33946438] [PMCID: PMC8147137] [DOI: 10.3390/e23050561]
Abstract
Entropy makes it possible to measure the uncertainty about an information source from the distribution of its output symbols. It is known that the maximum Shannon entropy of a discrete source of information is reached when its symbols follow a uniform distribution. In cryptography, such sources have great applications, since they allow the highest security standards to be reached. In this work, the most effective estimator is selected for estimating the entropy of short samples of bytes and bits with maximum entropy. For this, 18 estimators were compared, and the results of the comparisons between these estimators published in the literature are discussed. The most suitable estimator is determined experimentally, based on its bias and mean squared error on short samples of bytes and bits.
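For flavor, a sketch comparing two of the classical estimators typically included in such benchmarks, the plug-in estimator and its Miller-Madow bias correction, on short maximum-entropy byte samples. This is not the paper's full 18-estimator comparison, and the sample sizes are arbitrary choices.

```python
import numpy as np

def plugin_entropy(x, K):
    """Plug-in (maximum-likelihood) entropy estimate in bits."""
    p = np.bincount(x, minlength=K) / len(x)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def miller_madow(x, K):
    """Plug-in estimate plus the first-order Miller-Madow bias correction."""
    observed = len(np.unique(x))
    return plugin_entropy(x, K) + (observed - 1) / (2 * len(x) * np.log(2))

rng = np.random.default_rng(0)
K, n, trials = 256, 64, 5000          # short byte samples; max entropy = 8 bits
true_H = np.log2(K)
for name, est in [("plug-in", plugin_entropy), ("Miller-Madow", miller_madow)]:
    h = np.array([est(rng.integers(0, K, n), K) for _ in range(trials)])
    bias, mse = h.mean() - true_H, np.mean((h - true_H) ** 2)
    print(f"{name:12s} bias = {bias:+.3f} bits, MSE = {mse:.3f}")
```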
Affiliation(s)
- Lianet Contreras Rodríguez
- Facultad de Matemática y Computación, Instituto de Criptografía, Universidad de la Habana, Habana 10400, Cuba
- Evaristo José Madarro-Capó
- Facultad de Matemática y Computación, Instituto de Criptografía, Universidad de la Habana, Habana 10400, Cuba
- Carlos Miguel Legón-Pérez
- Facultad de Matemática y Computación, Instituto de Criptografía, Universidad de la Habana, Habana 10400, Cuba
- Omar Rojas
- Facultad de Ciencias Económicas y Empresariales, Universidad Panamericana, Álvaro del Portillo 49, Zapopan, Jalisco 45010, Mexico
- Guillermo Sosa-Gómez
- Facultad de Ciencias Económicas y Empresariales, Universidad Panamericana, Álvaro del Portillo 49, Zapopan, Jalisco 45010, Mexico

11. Cao W, Dytso A, Fauß M, Poor HV. Finite-Sample Bounds on the Accuracy of Plug-in Estimators of Fisher Information. Entropy 2021; 23:545. [PMID: 33924955] [PMCID: PMC8145518] [DOI: 10.3390/e23050545]
Abstract
Finite-sample bounds on the accuracy of Bhattacharya’s plug-in estimator for Fisher information are derived. These bounds are further improved by introducing a clipping step that allows for better control over the score function. This leads to superior upper bounds on the rates of convergence, albeit under slightly different regularity conditions. The performance bounds on both estimators are evaluated for the practically relevant case of a random variable contaminated by Gaussian noise. Moreover, using Brown’s identity, two corresponding estimators of the minimum mean-square error are proposed.
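A kernel-based sketch of the plug-in idea, I(f) = E[(f'(X)/f(X))^2] with f and f' replaced by a Gaussian kernel density estimate and its derivative. This is an assumed simplification for illustration, not Bhattacharya's exact construction or the clipped variant analyzed in the paper; bandwidth and sample size are arbitrary.

```python
import numpy as np

def fisher_info_plugin(x, h=None):
    """Plug-in estimate of I(f) = E[(f'(X)/f(X))^2] via a Gaussian KDE."""
    x = np.asarray(x)
    n = len(x)
    h = h or 1.06 * x.std() * n ** (-1 / 5)     # Silverman's rule of thumb
    d = (x[:, None] - x[None, :]) / h           # pairwise (x_i - x_j) / h
    K = np.exp(-0.5 * d ** 2) / np.sqrt(2 * np.pi)
    f = K.sum(axis=1) / (n * h)                 # density estimate at each sample
    fp = -(d * K).sum(axis=1) / (n * h ** 2)    # derivative estimate at each sample
    return np.mean((fp / f) ** 2)               # empirical mean of squared score

rng = np.random.default_rng(0)
sigma = 2.0
x = rng.normal(0.0, sigma, 2000)
# True Fisher information of N(0, sigma^2) is 1/sigma^2 = 0.25.
print(f"plug-in estimate: {fisher_info_plugin(x):.3f}")
```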
Affiliation(s)
- Wei Cao
- National Key Lab of Science and Technology on Communications, University of Electronic Science and Technology of China, Chengdu 611731, China
- Alex Dytso
- Department of Electrical and Computer Engineering, New Jersey Institute of Technology, Newark, NJ 07102, USA
- Michael Fauß
- Department of Electrical and Computer Engineering, Princeton University, Princeton, NJ 08544, USA
- H. Vincent Poor
- Department of Electrical and Computer Engineering, Princeton University, Princeton, NJ 08544, USA

12. Ricci L. Asymptotic distribution of sample Shannon entropy in the case of an underlying finite, regular Markov chain. Phys Rev E 2021; 103:022215. [PMID: 33736022] [DOI: 10.1103/physreve.103.022215]
Abstract
The inference of Shannon entropy out of sample histograms is known to be affected by systematic and random errors that depend on the finite size of the available data set. This dependence was mostly investigated in the multinomial case, in which states are visited in an independent fashion. In this paper the asymptotic behavior of the distribution of the sample Shannon entropy, also referred to as plug-in estimator, is investigated in the case of an underlying finite Markov process characterized by a regular stochastic matrix. As the size of the data set tends to infinity, the plug-in estimator is shown to become asymptotically normal, though in a way that substantially deviates from the known multinomial case. The asymptotic behavior of bias and variance of the plug-in estimator are expressed in terms of the spectrum of the stochastic matrix and of the related covariance matrix. Effects of initial conditions are also considered. By virtue of the formal similarity with Shannon entropy, the results are directly applicable to the evaluation of permutation entropy.
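A quick simulation of the effect the paper treats analytically: the plug-in entropy of a Markov chain concentrates around the true entropy, but its spread deviates from the multinomial (i.i.d.) formula. The transition matrix, chain length, and trial count below are arbitrary example choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])                    # regular stochastic matrix

def sample_chain(n):
    u = rng.random(n)
    x = np.empty(n, dtype=int)
    x[0] = 0
    for t in range(1, n):
        x[t] = int(u[t] > P[x[t - 1], 0])     # two-state chain transition
    return x

def plugin_entropy(x):
    p = np.bincount(x) / len(x)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

n, trials = 2000, 500
H = np.array([plugin_entropy(sample_chain(n)) for _ in range(trials)])

pi = np.array([0.8, 0.2])                     # stationary distribution of this P
H_true = -np.sum(pi * np.log(pi))
iid_sd = np.sqrt((np.sum(pi * np.log(pi) ** 2) - H_true ** 2) / n)
print(f"true H = {H_true:.4f} nats, sample mean = {H.mean():.4f}")
# The chain's correlations make the spread deviate from the multinomial formula.
print(f"empirical sd = {H.std():.5f} vs multinomial-formula sd = {iid_sd:.5f}")
```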
Affiliation(s)
- Leonardo Ricci
- Dipartimento di Fisica, Università di Trento, 38123 Trento, Italy

13.
Abstract
The bit independence criterion was proposed to evaluate the security of the S-boxes used in block ciphers. This paper proposes an algorithm that extends this criterion to evaluate the degree of independence between the input and output bits of stream ciphers. The effectiveness of the algorithm is experimentally confirmed in two scenarios: on random outputs independent of the input, where it detects no dependence, and on the RC4 cipher, where it detects significant dependencies related to some known weaknesses. The complexity of the algorithm is estimated based on the number of inputs l and the dimensions, n and m, of the inputs and outputs, respectively.
14. Measuring Independence between Statistical Randomness Tests by Mutual Information. Entropy 2020; 22:741. [PMID: 33286513] [PMCID: PMC7517289] [DOI: 10.3390/e22070741]
Abstract
The analysis of independence between statistical randomness tests has received considerable attention in the literature recently. Detecting dependencies between statistical randomness tests makes it possible to discriminate tests that measure similar characteristics, and thus to minimize the number of statistical randomness tests that need to be applied. In this work, a method for detecting statistical dependency by using mutual information is proposed. The main advantage of using mutual information is its ability to detect nonlinear correlations, which cannot be detected by the linear correlation coefficient used in previous work. The method analyzes the correlation between the tests of the battery of the National Institute of Standards and Technology, used as a standard in the evaluation of randomness. The results of the experiments show the existence of statistical dependencies between the tests that have not been previously detected.
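A minimal sketch of the approach: a histogram plug-in MI estimate applied to two statistics, one pair dependent but linearly uncorrelated (so the correlation coefficient misses it) and one pair independent. The "test statistics" here are synthetic stand-ins, not actual NIST battery outputs.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in MI estimate (bits) between two real-valued samples via 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

rng = np.random.default_rng(0)
t1 = rng.random(100_000)
t2 = (t1 - 0.5) ** 2                # dependent on t1, yet uncorrelated with it
t3 = rng.random(t1.size)            # independent of t1
print(f"corr(t1,t2) = {np.corrcoef(t1, t2)[0, 1]:+.3f}, MI = {mutual_information(t1, t2):.3f}")
print(f"corr(t1,t3) = {np.corrcoef(t1, t3)[0, 1]:+.3f}, MI = {mutual_information(t1, t3):.3f}")
```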
15. Cadena Muñoz E, Pedraza Martínez LF, Hernandez CA. Rényi Entropy-Based Spectrum Sensing in Mobile Cognitive Radio Networks Using Software Defined Radio. Entropy 2020; 22:626. [PMID: 33286398] [PMCID: PMC7517161] [DOI: 10.3390/e22060626]
Abstract
A very important task in Mobile Cognitive Radio Networks (MCRN) is to ensure that the system releases a given frequency when a Primary User (PU) is present, maintaining the principle of not interfering with its activity within a cognitive radio system. Afterwards, a cognitive protocol must be set in order to change to another frequency channel that is available, or to shut down the service if no free channels can be found. The system must sense the frequency spectrum constantly, typically through energy detection, which is the most commonly used method. However, this analysis takes place in the time domain, and signals cannot be easily identified due to changes in modulation, power, and distance from mobile users. The proposed system works with Gaussian Minimum Shift Keying (GMSK) and Orthogonal Frequency Division Multiplexing (OFDM) for systems ranging from the Global System for Mobile Communications (GSM) to 5G; the signals are analyzed in the frequency domain, and the Rényi entropy is used as a tool to distinguish the noise from the PU signal without prior knowledge of its features. The main contribution of this research is the use of a Software Defined Radio (SDR) system to implement an MCRN in order to measure the behavior of primary and secondary signals in both time and frequency, using GNURadio and OpenBTS as software tools to provide a phone-call service between two Secondary Users (SU). This allows experimental results to be extracted and compared with simulations and theory, using Rényi entropy to detect SU signals in GMSK and OFDM systems. It is concluded that the Rényi entropy detector has a higher performance than the conventional energy detector in Additive White Gaussian Noise (AWGN) and Rayleigh channels. The system increases the detection probability (PD) to over 96% at a Signal-to-Noise Ratio (SNR) of 10 dB, and begins detecting 5 dB below energy-sensing levels.
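A toy sketch of the detection principle: a structured PU signal concentrates spectral power, lowering the Rényi entropy of the normalized spectrum relative to noise alone. The tone-plus-AWGN example, the entropy order, and all parameters are assumptions for illustration; the paper's detector operates on GMSK/OFDM captures from the SDR testbed.

```python
import numpy as np

def renyi_entropy(p, alpha=2.0):
    """Rényi entropy of order alpha (bits) of a probability vector p."""
    p = p[p > 0]
    return np.log2(np.sum(p ** alpha)) / (1 - alpha)

def spectrum_distribution(x):
    """Normalize the power spectrum of a signal block into a distribution."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    return psd / psd.sum()

rng = np.random.default_rng(0)
n = 1024
t = np.arange(n)
noise = rng.standard_normal(n)                    # AWGN only
signal = np.cos(2 * np.pi * 0.2 * t) + noise      # PU tone + noise

H_noise = renyi_entropy(spectrum_distribution(noise))
H_sig = renyi_entropy(spectrum_distribution(signal))
# A PU signal concentrates spectral power, lowering the entropy; the sensor
# declares "PU present" when H falls below a calibrated threshold.
print(f"H(noise) = {H_noise:.2f} bits, H(signal + noise) = {H_sig:.2f} bits")
```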
Affiliation(s)
- Ernesto Cadena Muñoz
- Systems and Industrial Department, Universidad Nacional de Colombia, Bogotá 111321, Colombia
- Cesar Augusto Hernandez
- Electrical Engineering Department, Universidad Distrital Francisco José de Caldas, Bogotá 110231, Colombia

16. Gençağa D, Şengül Ayan S, Farnoudkia H, Okuyucu S. Statistical Approaches for the Analysis of Dependency Among Neurons Under Noise. Entropy (Basel, Switzerland) 2020; 22:387. [PMID: 33286161] [PMCID: PMC7516863] [DOI: 10.3390/e22040387]
Abstract
Neuronal noise is a major factor affecting the communication between coupled neurons. In this work, we propose a statistical toolset to infer the coupling between two neurons under noise. We estimate these statistical dependencies from data generated by a coupled Hodgkin-Huxley (HH) model with additive noise. To infer the coupling from observation data, we employ copulas and information-theoretic quantities, such as the mutual information (MI) and the transfer entropy (TE). Copulas and MI between two variables are symmetric quantities, whereas TE is asymmetric. We demonstrate the performance of copulas and MI as functions of different noise levels and show that they are effective in identifying the interactions due to coupling and noise. Moreover, we analyze the inference of TE values between neurons as a function of noise and conclude that TE is an effective tool for determining the direction of coupling between neurons under the effects of noise.
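A minimal plug-in sketch of TE with quantile discretization and history length 1, on a synthetic pair where x drives y; the asymmetry TE(x→y) >> TE(y→x) recovers the coupling direction. This is illustrative only: the linear driving process and binning are assumptions, not the HH data or estimators used in the paper.

```python
import numpy as np

def transfer_entropy(x, y, bins=3):
    """Plug-in transfer entropy TE(X -> Y) in bits, history length 1:
    TE = I(Y_next ; X_past | Y_past), after quantile discretization."""
    def disc(v):
        edges = np.quantile(v, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(v, edges)
    xd, yd = disc(np.asarray(x)), disc(np.asarray(y))
    yn, yp, xp = yd[1:], yd[:-1], xd[:-1]      # Y_next, Y_past, X_past
    joint = np.zeros((bins, bins, bins))
    np.add.at(joint, (yn, yp, xp), 1.0)
    joint /= joint.sum()
    p_yp = joint.sum(axis=(0, 2))              # p(y_past)
    p_ynyp = joint.sum(axis=2)                 # p(y_next, y_past)
    p_ypxp = joint.sum(axis=0)                 # p(y_past, x_past)
    nz = joint > 0
    num = joint * p_yp[None, :, None]
    den = p_ynyp[:, :, None] * p_ypxp[None, :, :]
    return float(np.sum(joint[nz] * np.log2(num[nz] / den[nz])))

rng = np.random.default_rng(0)
n = 50_000
x = rng.standard_normal(n)
e = 0.1 * rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):                          # y is driven by the past of x
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + e[t]
print(f"TE(x -> y) = {transfer_entropy(x, y):.3f} bits")   # clearly positive
print(f"TE(y -> x) = {transfer_entropy(y, x):.3f} bits")   # near zero
```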
Affiliation(s)
- Deniz Gençağa
- Department of Electrical and Electronics Engineering, Antalya Bilim University, 07190 Antalya, Turkey
- Sevgi Şengül Ayan
- Department of Industrial Engineering, Antalya Bilim University, 07190 Antalya, Turkey
- Hajar Farnoudkia
- Department of Statistics, Middle East Technical University, 06800 Ankara, Turkey
- Serdar Okuyucu
- Department of Electrical and Electronics Engineering, Antalya Bilim University, 07190 Antalya, Turkey

17. Ponce-Flores M, Frausto-Solís J, Santamaría-Bonfil G, Pérez-Ortega J, González-Barbosa JJ. Time Series Complexities and Their Relationship to Forecasting Performance. Entropy 2020; 22:89. [PMID: 33285864] [PMCID: PMC7516527] [DOI: 10.3390/e22010089]
Abstract
Entropy and its extensions, such as Spectral Entropy and Permutation Entropy, are key concepts in the characterization of the uncertainty of a signal and have been used to measure the complexity of time series. However, these measures depend on the discretization employed to study the states of the system. In this paper, we identify the relationship between entropy-based complexity measures and the expected forecasting error of four selected methods that participated in the M4 Competition (Smyl, Theta, ARIMA, and ETS); this relationship allows one to decide, in advance, which algorithm is adequate. Moreover, we present a framework extension based on the Emergence, Self-Organization, and Complexity paradigm. Experiments with both synthetic and M4 Competition time series show that the feature space induced by the complexity measures visually constrains the forecasting performance to specific regions, and that the Complexity based on Emergence and Self-Organization is maximal where the logarithm of the error metric is poorest.
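As a sketch of the Emergence/Self-Organization/Complexity measures mentioned above, the following uses one common discrete formulation (E = H/H_max, S = 1 - E, C = 4ES, as popularized by Gershenson and Fernández); the paper's exact definitions and discretization may differ, and the example series are arbitrary.

```python
import numpy as np

def esc(x, bins=16):
    """Emergence E = H/H_max, Self-Organization S = 1 - E, Complexity C = 4*E*S,
    computed from a histogram discretization of the series."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist[hist > 0] / hist.sum()
    E = -np.sum(p * np.log2(p)) / np.log2(bins)   # normalized Shannon entropy
    S = 1.0 - E
    return E, S, 4.0 * E * S

rng = np.random.default_rng(0)
series = {
    "constant":      np.ones(2000),                # fully ordered: E = 0, S = 1
    "uniform noise": rng.random(2000),             # fully random: E ~ 1, S ~ 0
    "half-and-half": np.where(rng.random(2000) < 0.5, 0.0, rng.random(2000)),
}
for name, x in series.items():
    E, S, C = esc(x)
    print(f"{name:14s} E = {E:.2f}  S = {S:.2f}  C = {C:.2f}")
```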
Affiliation(s)
- Mirna Ponce-Flores
- Graduate Program Division, Tecnológico Nacional de México/Instituto Tecnológico de Ciudad Madero, Cd. Madero 89440, Mexico
- Juan Frausto-Solís
- Graduate Program Division, Tecnológico Nacional de México/Instituto Tecnológico de Ciudad Madero, Cd. Madero 89440, Mexico
- Guillermo Santamaría-Bonfil
- Information Technologies Department, Consejo Nacional de Ciencia y Tecnología—Instituto Nacional de Electricidad y Energías Limpias, Cuernavaca 62490, Mexico
- Joaquín Pérez-Ortega
- Computing Department, Tecnológico Nacional de México/Centro Nacional de Investigación y Desarrollo Tecnológico, Cuernavaca 62490, Mexico
- Juan J. González-Barbosa
- Graduate Program Division, Tecnológico Nacional de México/Instituto Tecnológico de Ciudad Madero, Cd. Madero 89440, Mexico