An Intrinsic Characterization of Shannon’s and Rényi’s Entropy
Abstract
1. Introduction
2. Results
- 1. H is the Shannon entropy,
- 2. H has the following properties:
  - (a) H is a continuous function in the topology of convergence in distribution;
  - (b) for and any permutation π;
  - (c) for all ;
  - (d) for all , ;
  - (e) if p is a nondegenerate geometric distribution;
  - (f) for all ;
  - (g) a function exists such that
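Condition 1 of Theorem 1 identifies H with the Shannon entropy. As a small illustrative sketch (using the textbook definition, not the paper's operator-based construction; function names are ours), the entropy and its permutation invariance, cf. Condition 2(b), can be checked directly:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats.

    Terms with p_i = 0 contribute 0, by the convention 0 * log(0) = 0.
    """
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Permutation invariance: reordering the probabilities
# leaves the entropy unchanged.
p = [0.5, 0.25, 0.125, 0.125]
assert abs(shannon_entropy(p) - shannon_entropy(list(reversed(p)))) < 1e-12

# The uniform distribution on n points has entropy log(n).
n = 8
u = [1.0 / n] * n
assert abs(shannon_entropy(u) - math.log(n)) < 1e-12
```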
- 1. H is the Rényi entropy,
- 2. H satisfies Conditions 2(a)–2(f) of Theorem 1. A value and a function exist such that for we have
- 1. H is the min-entropy,
- 2. H satisfies Conditions 2(a)–2(f) of Theorem 1. A function exists such that, for , we have
- 1. H is the Hartley entropy,
- 2. H satisfies Conditions 2(b)–2(d) and 2(f) of Theorem 1 and . Further, for , the following holds
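Theorems 2–4 characterize the Rényi, min-, and Hartley entropies. Their standard definitions and the familiar limit relations (Rényi entropy tends to the Shannon entropy as α → 1, to the min-entropy as α → ∞, and to the Hartley entropy as α → 0) can be verified numerically. The sketch below uses the textbook formulas, not the paper's characterizing conditions; all function names are ours:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i**alpha) / (1 - alpha),
    for alpha >= 0, alpha != 1 (natural logarithm)."""
    assert alpha >= 0 and alpha != 1
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

def shannon_entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def min_entropy(p):
    """Min-entropy: -log of the largest probability."""
    return -math.log(max(p))

def hartley_entropy(p):
    """Hartley entropy: log of the support size."""
    return math.log(sum(1 for pi in p if pi > 0))

p = [0.5, 0.3, 0.2]
# alpha -> 1 recovers the Shannon entropy,
assert abs(renyi_entropy(p, 1.000001) - shannon_entropy(p)) < 1e-5
# large alpha approaches the min-entropy,
assert abs(renyi_entropy(p, 1000) - min_entropy(p)) < 1e-3
# alpha -> 0 recovers the Hartley entropy.
assert abs(renyi_entropy(p, 1e-9) - hartley_entropy(p)) < 1e-6
```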
3. Proofs
3.1. Functional Equations
- 1. f is continuous and, for all , we have
- 2.
- 3. f is the identity.
- 1. Φ is continuous on with values in , and as . For all and , we have
- 2.
3.2. Properties of the Operator
3.3. Proof of Theorem 1
3.4. Proof of Theorem 2
3.5. Proof of Theorem 3
3.6. Proof of Theorem 4
4. Discussion
- The discrete nature of the objects;
- The hierarchical, self-similar structure of the objects, cf. Condition 2(g);
- The parity among the objects at each hierarchical level, cf. Condition 2(b).
4.1. Practical Implications
- Classification systems: forms of art (e.g., music genres), languages and their dialects.
- Biology: taxonomy, diversity, genomes.
- Administrative units: districts, corporate structure, military.
- Universe: clustering.
- Networks [20].
- Statistics: graphs and cluster analysis.
4.2. Modifications
4.3. The Geometric Distribution
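Condition 2(e) of Theorem 1 singles out the nondegenerate geometric distribution. As a small numerical check, independent of the paper's argument, the well-known closed form for the Shannon entropy of a geometric distribution with success probability p can be compared against a truncated direct sum (function names are ours):

```python
import math

def geom_entropy_closed_form(p):
    """Shannon entropy (nats) of the geometric distribution
    P(X = k) = (1 - p)**(k - 1) * p, k = 1, 2, ...:
    H = (-p*log(p) - (1 - p)*log(1 - p)) / p."""
    return (-p * math.log(p) - (1 - p) * math.log(1 - p)) / p

def geom_entropy_numeric(p, kmax=10_000):
    """Direct sum -sum_k P(k) * log(P(k)), truncated at kmax."""
    h = 0.0
    for k in range(1, kmax + 1):
        pk = (1 - p) ** (k - 1) * p
        if pk > 0:
            h -= pk * math.log(pk)
    return h

for p in (0.1, 0.5, 0.9):
    assert abs(geom_entropy_closed_form(p) - geom_entropy_numeric(p)) < 1e-9
```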
4.4. On the Proportionality Relation Characterizing the Rényi Entropy
4.5. Tsallis and Other Entropies
4.6. Shannon’s Notion of Entropy
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Khinchin, A. The concept of entropy in the theory of probability. Uspekhi Mat. Nauk 1953, 8, 3–20. [Google Scholar]
- Faddeev, D. On the concept of entropy of a finite probabilistic scheme. Uspekhi Mat. Nauk 1956, 11, 227–231. [Google Scholar]
- Rényi, A. On measures of entropy and information. In Fourth Berkeley Symposium on Mathematical Statistics and Probability: Contributions to the Theory of Statistics; University of California Press: Berkeley, CA, USA, 1961; pp. 547–562. [Google Scholar]
- Leinster, T. Entropy and Diversity—The Axiomatic Approach; Cambridge University Press: Cambridge, UK, 2021. [Google Scholar]
- Aczél, J.; Daróczy, Z. On Measures of Information and Their Characterizations; Academic Press: New York, NY, USA, 1975. [Google Scholar]
- Ebanks, B.; Sahoo, P.K.; Sander, W. Characterization of Information Measures; World Scientific: Singapore, 1998. [Google Scholar]
- Carcassi, G.; Aidala, C.; Barbour, J. Variability as a better characterization of Shannon entropy. Eur. J. Phys. 2021, 42, 045102. [Google Scholar] [CrossRef]
- Baez, J.; Fritz, T.; Leinster, T. A characterization of entropy in terms of information loss. Entropy 2011, 13, 1945–1957. [Google Scholar] [CrossRef]
- Onicescu, O. Théorie de l’information. Énergie informationnelle. C. R. Acad. Sci. Paris Sér. A-B 1966, 263, 841–842. [Google Scholar]
- Pardo, L. Order-α weighted information energy. Inf. Sci. 1986, 40, 155–164. [Google Scholar] [CrossRef]
- Hartley, R. Transmission of information. Bell Syst. Tech. J. 1928, 7, 535–563. [Google Scholar] [CrossRef]
- Jakimiuk, J.; Murawski, D.; Nayar, P.; Słobodianiuk, S. Log-concavity and discrete degrees of freedom. Discret. Math. 2024, 347, 114020. [Google Scholar] [CrossRef]
- Aczél, J.; Forte, B.; Ng, C. Why the Shannon and Hartley entropies are ’natural’. Adv. Appl. Probab. 1974, 6, 131–146. [Google Scholar] [CrossRef]
- Arunachalam, S.; Chakraborty, S.; Kouckỳ, M.; Saurabh, N.; De Wolf, R. Improved bounds on Fourier entropy and min-entropy. ACM Trans. Comput. Theory 2021, 13, 22. [Google Scholar] [CrossRef]
- Schlather, M. An algebraic generalization of the entropy and its application to statistics. arXiv 2024, arXiv:2404.05854. [Google Scholar]
- Gradshteyn, I.; Ryzhik, I. Table of Integrals, Series, and Products, 6th ed.; Academic Press: London, UK, 2000. [Google Scholar]
- Erdős, P. On the distribution function of additive functions. Ann. Math. 1946, 47, 1–20. [Google Scholar] [CrossRef]
- Rickman, J.; Barmak, K.; Chen, B.; Patrick, M. Evolving information complexity of coarsening materials microstructures. Sci. Rep. 2023, 13, 22390. [Google Scholar] [CrossRef] [PubMed]
- Du, C.; Li, X.; Liu, C.; Song, C.; Yuan, J.; Xin, Y. Combining ultrasonic guided wave and low-frequency electromagnetic technology for defect detection in high-temperature Cr–Ni alloy furnace tubes. Sci. Rep. 2023, 13, 18592. [Google Scholar] [CrossRef] [PubMed]
- Song, C.; Havlin, S.; Makse, H. Self-similarity of complex networks. Nature 2005, 433, 392–395. [Google Scholar] [CrossRef]
- Engelke, S.; Hitz, A. Graphical models for extremes. J. R. Stat. Soc. Ser. B 2020, 82, 871–932. [Google Scholar] [CrossRef]
- Abraham, R.; Delmas, J.F. An introduction to Galton–Watson trees and their local limits. arXiv 2015, arXiv:1506.05571. [Google Scholar]
- Conrad, K. Probability Distributions and Maximum Entropy. Available online: https://kconrad.math.uconn.edu/blurbs/analysis/entropypost.pdf (accessed on 14 November 2012).
- Janson, S. Simply generated trees, conditioned Galton–Watson trees, random allocations and condensation. Probab. Surv. 2012, 9, 103–268. [Google Scholar] [CrossRef]
- Wikipedia. Canonical Ensemble. Available online: https://en.wikipedia.org/wiki/Canonical_ensemble (accessed on 28 November 2012).
- Życzkowski, K. Rényi extrapolation of Shannon entropy. Open Syst. Inf. Dyn. 2003, 10, 297–310. [Google Scholar] [CrossRef]
- Havrda, J.; Charvát, F. Quantification method of classification processes. Concept of structural α-entropy. Kybernetika 1967, 3, 30–35. [Google Scholar]
- Tsallis, C. Entropy. Encyclopedia 2022, 2, 264–300. [Google Scholar] [CrossRef]
- Berg, C.; Christensen, J.P.R.; Ressel, P. Harmonic Analysis on Semigroups. Theory of Positive Definite and Related Functions; Springer: New York, NY, USA, 1984. [Google Scholar]
- Shannon, C. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
- Shannon, C. The bandwagon. IRE Trans. Inf. Theory 1956, 2, 3. [Google Scholar] [CrossRef]
Share and Cite
Schlather, M.; Ditscheid, C. An Intrinsic Characterization of Shannon’s and Rényi’s Entropy. Entropy 2024, 26, 1051. https://doi.org/10.3390/e26121051