Measuring Independence between Statistical Randomness Tests by Mutual Information.

Jorge Augusto Karell-Albo, Carlos Miguel Legón-Pérez, Evaristo José Madarro-Capó, Omar Rojas, Guillermo Sosa-Gómez
Published in: Entropy (Basel, Switzerland) (2020)
The analysis of independence between statistical randomness tests has received considerable attention in the literature recently. Detecting dependencies between statistical randomness tests allows one to discriminate tests that measure similar characteristics and thus minimize the number of tests that need to be applied. In this work, a method for detecting statistical dependence using mutual information is proposed. The main advantage of mutual information is its ability to detect nonlinear correlations, which cannot be detected by the linear correlation coefficient used in previous work. The method analyzes the correlation between the tests of the National Institute of Standards and Technology battery, used as a standard in the evaluation of randomness. The experimental results show the existence of statistical dependencies between the tests that had not been previously detected.
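The abstract's central point, that mutual information catches nonlinear dependence invisible to the linear correlation coefficient, can be illustrated with a minimal sketch. This is not the paper's method; it is a generic histogram-based MI estimator applied to a toy pair of variables (the function name, bin count, and data are all illustrative assumptions), where y depends on x quadratically so the Pearson coefficient is near zero while the mutual information is clearly positive.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Estimate I(X;Y) in bits with a simple 2D-histogram (binning) estimator.

    Hypothetical helper for illustration only; the paper's actual estimator
    and binning choices may differ.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()               # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)     # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)     # marginal p(y)
    nz = pxy > 0                            # avoid log(0) on empty cells
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100_000)
y = x ** 2 + 0.01 * rng.normal(size=x.size)  # nonlinear (quadratic) dependence

pearson = np.corrcoef(x, y)[0, 1]  # near zero: cov(x, x^2) = E[x^3] = 0 here
mi = mutual_information(x, y)      # clearly positive: x and y are dependent
```

A test battery analysis along these lines would apply such an estimator pairwise to the outputs of different randomness tests; a near-zero Pearson coefficient alongside a nonzero MI is exactly the situation where a dependence would previously have gone undetected.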