Entropy-Based Statistics and Their Applications

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: 31 January 2025 | Viewed by 3626

Special Issue Editor


Prof. Dr. Zhiyi Zhang
Guest Editor
Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223, USA
Interests: entropy; entropic statistics; Turing’s formula; diversity indices; domains of attraction on alphabets; decision trees

Special Issue Information

Dear Colleagues,

During the last few decades, research activity in modeling the properties of random systems via entropies has increased. From the early days of statistical thermodynamics, the concept of entropy has evolved into many practically useful tools. This Special Issue, under the theme of “Entropy-Based Statistics and Their Applications”, which may be more concisely termed “Entropic Statistics”, aims to collect research contributions on this topic in both theory and application. Theoretically, many fundamental questions may be considered more effectively within a holistic framework of entropic statistics. For example, what is entropy, and could there be a more general definition of entropy that more efficiently serves the objective of statistically describing the underlying random system? What types of properties are describable, and to what extent are they exclusive to entropies? What are the advantages and limitations of subscribing to a framework of entropic statistics? Discussions of these topics can add great value to this Special Issue. Reports of applied studies, based on the estimation of justified entropies with well-gauged statistical reliability, are also vital to this Special Issue.
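As a minimal illustrative sketch of the kind of entropy estimation the call refers to (not drawn from any of the contributed papers), the most common starting point is the plug-in estimator, which substitutes empirical relative frequencies into Shannon's formula; on small samples it is known to be biased downward, which is one motivation for the more careful entropic statistics discussed in this Issue:

```python
from collections import Counter
from math import log

def plugin_entropy(sample):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy in nats.

    The true entropy is H = -sum_k p_k ln p_k; each p_k is replaced by its
    empirical relative frequency c_k / n, which biases the estimate
    downward on small samples.
    """
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * log(c / n) for c in counts.values())

# A balanced sample over two letters: the true entropy is ln 2 ≈ 0.693 nats.
print(round(plugin_entropy(list("abababab")), 3))  # 0.693
```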

Prof. Dr. Zhiyi Zhang
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • entropic probability and statistics
  • entropy indices
  • diversity indices
  • decision tree classifiers
  • confidence levels of classifiers
  • machine learning
  • artificial intelligence

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (3 papers)


Editorial

Jump to: Research, Review

2 pages, 181 KiB  
Editorial
Entropy-Based Statistics and Their Applications
by Zhiyi Zhang
Entropy 2023, 25(6), 936; https://doi.org/10.3390/e25060936 - 14 Jun 2023
Cited by 1 | Viewed by 1261
Abstract
During the last few decades, research activity in modeling the properties of random systems via entropies has grown noticeably across a wide spectrum of fields [...]
(This article belongs to the Special Issue Entropy-Based Statistics and Their Applications)

Research

Jump to: Editorial, Review

19 pages, 390 KiB  
Article
Several Basic Elements of Entropic Statistics
by Zhiyi Zhang
Entropy 2023, 25(7), 1060; https://doi.org/10.3390/e25071060 - 13 Jul 2023
Cited by 1 | Viewed by 1118
Abstract
Inspired by the development in modern data science, a shift is increasingly visible in the foundation of statistical inference, away from a real space, where random variables reside, toward a nonmetrized and nonordinal alphabet, where more general random elements reside. While statistical inferences based on random variables are theoretically well supported in the rich literature of probability and statistics, inferences on alphabets, mostly by way of various entropies and their estimation, are less systematically supported in theory. Without the familiar notions of neighborhood, real or complex moments, tails, et cetera, associated with random variables, probability and statistics based on random elements on alphabets need more attention to foster a sound framework for rigorous development of entropy-based statistical exercises. In this article, several basic elements of entropic statistics are introduced and discussed, including notions of general entropies, entropic sample spaces, entropic distributions, entropic statistics, entropic multinomial distributions, entropic moments, and entropic basis, among other entropic objects. In particular, an entropic-moment-generating function is defined and it is shown to uniquely characterize the underlying distribution in entropic perspective, and, hence, all entropies. An entropic version of the Glivenko–Cantelli convergence theorem is also established.
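One concrete tool from this line of work on alphabets, and among the Guest Editor's listed interests, is Turing's formula. The sketch below (an illustration under common conventions, not code from the paper) estimates the total probability mass of alphabet letters not yet observed in a sample as N1/n, where N1 is the number of letters seen exactly once:

```python
from collections import Counter

def turing_formula(sample):
    """Turing's formula N1/n: estimates the total probability mass of
    alphabet letters not observed in the sample, where N1 is the number
    of letters seen exactly once and n is the sample size."""
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / len(sample)

# In 10 draws, "d" and "e" each appear exactly once, so the estimated
# unseen mass is 2/10 = 0.2.
print(turing_formula(list("aaabbbccde")))  # 0.2
```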

Review

Jump to: Editorial, Research

24 pages, 517 KiB  
Review
A Survey on Error Exponents in Distributed Hypothesis Testing: Connections with Information Theory, Interpretations, and Applications
by Sebastián Espinosa, Jorge F. Silva and Sandra Céspedes
Entropy 2024, 26(7), 596; https://doi.org/10.3390/e26070596 - 12 Jul 2024
Viewed by 560
Abstract
A central challenge in hypothesis testing (HT) lies in determining the optimal balance between Type I (false positive) and Type II (non-detection or false negative) error probabilities. Analyzing these errors’ exponential rate of convergence, known as error exponents, provides crucial insights into system performance. Error exponents offer a lens through which we can understand how operational restrictions, such as resource constraints and impairments in communications, affect the accuracy of distributed inference in networked systems. This survey presents a comprehensive review of key results in HT, from the foundational Stein’s Lemma to recent advancements in distributed HT, all unified through the framework of error exponents. We explore asymptotic and non-asymptotic results, highlighting their implications for designing robust and efficient networked systems, such as event detection through lossy wireless sensor monitoring networks, collective perception-based object detection in vehicular environments, and clock synchronization in distributed environments, among others. We show that understanding the role of error exponents provides a valuable tool for optimizing decision-making and improving the reliability of networked systems.
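The foundational result mentioned in the abstract, Stein's Lemma, states that with the Type I error held fixed, the best achievable Type II error probability decays exponentially at a rate given by the Kullback–Leibler divergence between the two hypothesized distributions. A minimal sketch of that exponent for two discrete distributions (an illustration of the classical result, not code from the survey):

```python
from math import log

def kl_divergence(p, q):
    """Kullback–Leibler divergence D(p || q) in nats, for discrete
    distributions given as probability lists over the same support."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# By Stein's Lemma, with n i.i.d. samples from P0 the best achievable
# Type II error decays roughly as exp(-n * D(P0 || P1)).
p0 = [0.5, 0.5]  # null hypothesis: fair coin
p1 = [0.9, 0.1]  # alternative: heavily biased coin
print(round(kl_divergence(p0, p1), 4))  # 0.5108
```

A larger divergence means the hypotheses are easier to tell apart, so fewer samples are needed for the same Type II error level.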