Reprint

Information and Divergence Measures

Edited by
August 2023
282 pages
  • ISBN 978-3-0365-8386-0 (Hardback)
  • ISBN 978-3-0365-8387-7 (PDF)

This is a Reprint of the Special Issue Information and Divergence Measures, published in the following subject areas:

  • Chemistry & Materials Science
  • Computer Science & Mathematics
  • Physical Sciences

Summary

The concept of distance is important for establishing the degree of similarity or closeness between functions, populations, or distributions. As a result, distances arise throughout inferential statistics, in problems of both estimation and hypothesis testing, as well as in modelling, with applications in regression analysis, multivariate analysis, actuarial science, portfolio optimization, survival analysis, reliability theory, and many other areas. Entropy and divergence measures are therefore a central concern for scientists, researchers, medical experts, engineers, industrial managers, computer experts, data analysts, and other professionals.

This reprint focuses on recent developments in information and divergence measures, presenting new theoretical results as well as solutions to important practical problems, together with case studies illustrating the broad applicability of these techniques and methods. The contributions highlight the diversity of topics in this scientific field.
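As a brief illustration of one measure featured among the keywords below, the Kullback–Leibler divergence between two discrete distributions can be computed in a few lines of Python. The function name and the example distributions here are purely illustrative, a minimal sketch rather than code from any contribution in this reprint.

    import math

    def kl_divergence(p, q):
        # Kullback–Leibler divergence D(P || Q) for discrete distributions.
        # p and q are probability vectors over the same support; q must be
        # positive wherever p is positive. Terms with p_i = 0 contribute 0.
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    # Illustrative distributions over three outcomes
    p = [0.5, 0.3, 0.2]
    q = [0.4, 0.4, 0.2]
    print(kl_divergence(p, q))  # positive, and 0 only when p == q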

Format
  • Hardback
License and Copyright
© 2022 by the authors; CC BY-NC-ND license
Keywords
exponential family; statistical divergence; truncated exponential family; truncated normal distributions; double index divergence test statistic; multivariate data analysis; conditional independence; cross tabulations; extremal combinatorics; graphs; Han’s inequality; information inequalities; polymatroid; rank function; set function; Shearer’s lemma; submodularity; empirical survival Jensen–Shannon divergence; Kolmogorov–Smirnov two-sample test; skew logistic distribution; bi-logistic growth; epidemic waves; COVID-19 data; Rényi’s pseudodistance; minimum Rényi’s pseudodistance estimators; restricted minimum Rényi’s pseudodistance estimators; Rao-type tests; divergence-based tests; multivariate Cauchy distribution (MCD); Kullback–Leibler divergence (KLD); multiple power series; Lauricella D-hypergeometric series; statistical K-means; academic evaluation; statistical manifold; clustering; concomitants; GOS; FGM family; Shannon entropy; Tsallis entropy; Awad entropy; residual entropy; past entropy; Fisher–Tsallis information number; Tsallis divergence; bootstrap discrepancy comparison probability (BDCP); discrepancy comparison probability (DCP); likelihood ratio test (LRT); model selection; p-value; LPI; radar waveform; passive interception systems; Kullback–Leibler divergence; joint entropy; Tsallis logarithm; Kaniadakis logarithm; weighted Tsallis divergence; weighted Kaniadakis divergence; geodesic; Fisher information; differential geometry; transversality; multivariate Gaussian; moment condition models; divergences; robustness

Related Books

  • Divergence Measures (Computer Science & Mathematics), May 2022
  • Entropy Measures for Data Analysis (Biology & Life Sciences), December 2019
  • ...