Calibration Invariance of the MaxEnt Distribution in the Maximum Entropy Principle
Abstract
1. Introduction
- (I) Finding a distribution (the MaxEnt distribution) that maximizes the entropy under given constraints.
- (II) Plugging the distribution into the entropic functional and calculating physical quantities such as thermodynamic potentials, temperature, or response coefficients (specific heat, compressibility, etc.).
- (i) For each MaxEnt distribution, there exists a whole class of entropies and constraints leading to generally different thermodynamics.
- (ii) It is possible to establish transformation relations for the Lagrange parameters (and subsequently the thermodynamic quantities) between classes of entropies and constraints giving the same MaxEnt distribution.
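Step (I) can be made concrete with a minimal numerical sketch (a hypothetical four-level system with Shannon entropy and a mean-energy constraint; all values are illustrative, not taken from the paper): the Boltzmann distribution has strictly higher entropy than any perturbation that preserves normalization and mean energy.

```python
import math

def shannon_entropy(p):
    """Shannon entropy of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Illustrative four-level energy spectrum and inverse temperature.
E = [0.0, 1.0, 2.0, 3.0]
beta = 0.7

# MaxEnt (Boltzmann) distribution for a linear mean-energy constraint.
Z = sum(math.exp(-beta * e) for e in E)
p = [math.exp(-beta * e) / Z for e in E]
U = sum(pi * e for pi, e in zip(p, E))  # the constrained mean energy

# v is orthogonal to both constraint gradients: sum(v) = 0 and sum(v*E) = 0,
# so p + eps*v stays normalized with the same mean energy.
v = [1.0, -2.0, 1.0, 0.0]
for eps in (0.01, -0.01, 0.03):
    q = [pi + eps * vi for pi, vi in zip(p, v)]
    assert all(qi > 0 for qi in q)
    # Any admissible perturbation lowers the entropy.
    assert shannon_entropy(q) < shannon_entropy(p)
```

Within the constraint manifold, every direction away from the Boltzmann distribution decreases the entropy, which is exactly what the Lagrange-multiplier construction guarantees.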
2. Maximum Entropy Principle in Statistical Physics
3. Calibration Invariance of MaxEnt Distribution with Entropy Transformation
- Rényi entropy and Tsallis entropy: The two most famous examples of generalized entropies are the Rényi entropy $\mathcal{R}_q = \frac{1}{1-q}\ln \sum_i p_i^q$ and the Tsallis entropy $\mathcal{S}_q = \frac{1}{1-q}\left(\sum_i p_i^q - 1\right)$. Their relation can be expressed as $\mathcal{R}_q = \frac{1}{1-q}\ln\left[1+(1-q)\mathcal{S}_q\right]$, which is a monotone transformation, so both entropies are maximized by the same distribution. From this relation, one obtains the corresponding transformation of the Lagrange parameters and, consequently, of the difference between the free entropy and the multiplier $\alpha$. One can therefore see that even though Rényi and Tsallis entropy lead to the same MaxEnt distribution, their thermodynamic quantities, such as temperature or free entropy, are different. Whether a system follows Rényi or Tsallis entropy depends on additional considerations, e.g., the (non-)extensivity and (non-)intensivity of its thermodynamic quantities.
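The monotone relation between Rényi and Tsallis entropies can be verified numerically. The sketch below assumes the standard definitions $\mathcal{R}_q = \ln(\sum_i p_i^q)/(1-q)$ and $\mathcal{S}_q = (\sum_i p_i^q - 1)/(1-q)$; the random distributions are purely illustrative.

```python
import math
import random

def renyi(p, q):
    """Rényi entropy of order q (q != 1)."""
    return math.log(sum(pi**q for pi in p)) / (1.0 - q)

def tsallis(p, q):
    """Tsallis entropy of order q (q != 1)."""
    return (sum(pi**q for pi in p) - 1.0) / (1.0 - q)

random.seed(1)
q = 1.5
for _ in range(5):
    # Random normalized distribution on six states.
    w = [random.random() for _ in range(6)]
    p = [wi / sum(w) for wi in w]
    # R_q = ln[1 + (1-q) S_q] / (1-q): a monotone map between the two
    # entropies, so they share the same maximizer under identical constraints.
    lhs = renyi(p, q)
    rhs = math.log(1.0 + (1.0 - q) * tsallis(p, q)) / (1.0 - q)
    assert abs(lhs - rhs) < 1e-12
```

The map is monotone because $dR_q/dS_q = 1/\sum_i p_i^q > 0$; what differs between the two formalisms is not the distribution but the derived thermodynamic quantities.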
- Shannon entropy and entropy power: A similar example is provided by the Shannon entropy $\mathcal{S} = -\sum_i p_i \ln p_i$ and the entropy power $\Lambda = e^{\mathcal{S}}$. The relation between them is simply exponential, and since the exponential is monotone, both functionals are maximized by the same distribution, with the Lagrange parameters transforming accordingly. For the difference between the free entropy and $\alpha$, however, one obtains a different expression in each case. Therefore, we see that even though the MaxEnt distribution remains unchanged, the relation between $\alpha$ and the free entropy is different.
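A short sketch of this pair, assuming the discrete entropy power is defined as the exponential of the Shannon entropy (the distributions below are illustrative): for a uniform distribution over $n$ states, the entropy power equals $n$, and any non-uniform distribution has a smaller one.

```python
import math

def shannon(p):
    """Shannon entropy of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def entropy_power(p):
    # Assumed definition: exponential of the Shannon entropy.
    return math.exp(shannon(p))

# Uniform distribution over n states: entropy power equals n.
n = 8
uniform = [1.0 / n] * n
assert abs(entropy_power(uniform) - n) < 1e-9

# exp is strictly monotone, so a distribution maximizes S iff it
# maximizes e^S: a non-uniform distribution on 4 states stays below 4.
p = [0.5, 0.25, 0.125, 0.125]
assert entropy_power(p) < 4.0
```

The entropy power thus acts as an "effective number of states"; maximizing it is equivalent to maximizing Shannon entropy, even though the resulting thermodynamic relations differ.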
4. Calibration Invariance of MaxEnt Distribution with Constraints Transformation
- Energy shift: Under this scheme, we can assume a constant shift in the energy spectrum, $E_i \mapsto E_i + \Delta$. Let us rewrite the constraint in the form $\sum_i p_i (E_i + \Delta) = \langle E \rangle + \Delta$, which allows us to identify the constraint function as the shifted energy $f(E_i) = E_i + \Delta$. Since the constant shift cancels between the Boltzmann weights and the partition function, the MaxEnt distribution is unchanged, while the Lagrange parameters and free entropy transform accordingly.
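The shift invariance of the Boltzmann-Gibbs MaxEnt distribution can be checked directly; the energies, $\beta$, and shift $\Delta$ below are arbitrary illustrative values.

```python
import math

def boltzmann(E, beta):
    """Boltzmann-Gibbs distribution for spectrum E at inverse temperature beta."""
    w = [math.exp(-beta * e) for e in E]
    Z = sum(w)
    return [wi / Z for wi in w]

E = [0.2, 1.1, 2.7, 3.0]
beta, delta = 0.9, 5.0

p = boltzmann(E, beta)
p_shifted = boltzmann([e + delta for e in E], beta)

# exp(-beta*delta) factors out of numerator and partition function alike,
# so the MaxEnt distribution is unchanged by the constant energy shift.
assert all(abs(a - b) < 1e-12 for a, b in zip(p, p_shifted))
```

What the shift does change is the value of the partition function, and hence the free entropy, which is exactly the calibration freedom discussed in the text.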
- Latent escort means: Apart from linear means, it is possible to use generalized averaging schemes. One such example is provided by the so-called escort mean, $\langle E \rangle_q = \sum_i p_i^q E_i / \sum_j p_j^q$. One then obtains a transformation of the constraints and Lagrange parameters that corresponds to the previous example with the shift $\Delta$ equal to minus the average energy. Therefore, the latent energy mean can be understood, in terms of the MaxEnt procedure, as a shift of the energy spectrum by its average energy.
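A small sketch of the escort mean (standard definition assumed; the distribution and energies are illustrative): $q = 1$ recovers the ordinary linear mean, and a constant shift of the spectrum moves the escort mean by the same constant, consistent with the energy-shift picture above.

```python
def escort_mean(p, E, q):
    """Escort mean: <E>_q = sum(p_i^q * E_i) / sum(p_j^q)."""
    num = sum(pi**q * e for pi, e in zip(p, E))
    den = sum(pi**q for pi in p)
    return num / den

p = [0.5, 0.3, 0.2]
E = [1.0, 2.0, 3.0]

# q = 1 recovers the ordinary linear mean.
linear = sum(pi * e for pi, e in zip(p, E))
assert abs(escort_mean(p, E, 1.0) - linear) < 1e-12

# Shifting the spectrum by a constant shifts the escort mean by the
# same constant, for any order q.
d = 4.0
for q in (0.5, 2.0, 3.0):
    shifted = escort_mean(p, [e + d for e in E], q)
    assert abs(shifted - (escort_mean(p, E, q) + d)) < 1e-12
```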
5. Conclusions
Funding
Acknowledgments
Conflicts of Interest
References
- Jaynes, E.T. Information Theory and Statistical Mechanics. Phys. Rev. 1957, 106, 620.
- Jaynes, E.T. Information Theory and Statistical Mechanics. II. Phys. Rev. 1957, 108, 171.
- Burg, J.P. The relationship between maximum entropy spectra and maximum likelihood spectra. Geophysics 1972, 37, 375–376.
- Rényi, A. Selected Papers of Alfréd Rényi; Akadémiai Kiadó: Budapest, Hungary, 1976; Volume 2.
- Havrda, J.; Charvát, F. Quantification Method of Classification Processes. Concept of Structural α-Entropy. Kybernetika 1967, 3, 30–35.
- Sharma, B.D.; Mitter, J.; Mohan, M. On Measures of "Useful" Information. Inf. Control 1978, 39, 323–336.
- Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
- Frank, T.; Daffertshofer, A. Exact time-dependent solutions of the Renyi Fokker-Planck equation and the Fokker-Planck equations related to the entropies proposed by Sharma and Mittal. Physica A 2000, 285, 351–366.
- Kaniadakis, G. Statistical mechanics in the context of special relativity. Phys. Rev. E 2002, 66, 056125.
- Jizba, P.; Arimitsu, T. The world according to Rényi: Thermodynamics of multifractal systems. Ann. Phys. 2004, 312, 17–59.
- Hanel, R.; Thurner, S. A comprehensive classification of complex statistical systems and an ab-initio derivation of their entropy and distribution functions. Europhys. Lett. 2011, 93, 20006.
- Thurner, S.; Hanel, R.; Klimek, P. Introduction to the Theory of Complex Systems; Oxford University Press: Oxford, UK, 2018.
- Korbel, J.; Hanel, R.; Thurner, S. Classification of complex systems by their sample-space scaling exponents. New J. Phys. 2018, 20, 093007.
- Tempesta, P.; Jensen, H.J. Universality classes and information-theoretic measures of complexity via group entropies. Sci. Rep. 2020, 10, 1–11.
- Ilić, V.M.; Stanković, M.S. Generalized Shannon-Khinchin axioms and uniqueness theorem for pseudo-additive entropies. Physica A 2014, 411, 138–145.
- Ilić, V.M.; Scarfone, A.M.; Wada, T. Equivalence between four versions of thermostatistics based on strongly pseudoadditive entropies. Phys. Rev. E 2019, 100, 062135.
- Czachor, M. Unifying Aspects of Generalized Calculus. Entropy 2020, 22, 1180.
- Beck, C.; Schlögl, F. Thermodynamics of Chaotic Systems: An Introduction; Cambridge University Press: Cambridge, UK, 1993.
- Abe, S. Geometry of escort distributions. Phys. Rev. E 2003, 68, 031101.
- Bercher, J.-F. On escort distributions, q-gaussians and Fisher information. AIP Conf. Proc. 2011, 1305, 208.
- Czachor, M.; Naudts, J. Thermostatistics based on Kolmogorov-Nagumo averages: Unifying framework for extensive and nonextensive generalizations. Phys. Lett. A 2002, 298, 369–374.
- Scarfone, A.M.; Matsuzoe, H.; Wada, T. Consistency of the structure of Legendre transform in thermodynamics with the Kolmogorov-Nagumo average. Phys. Lett. A 2016, 380, 3022–3028.
- Bercher, J.-F. Tsallis distribution as a standard maximum entropy solution with 'tail' constraint. Phys. Lett. A 2008, 372, 5657–5659.
- Pressé, S.; Ghosh, K.; Lee, J.; Dill, K.A. Nonadditive Entropies Yield Probability Distributions with Biases not Warranted by the Data. Phys. Rev. Lett. 2013, 111, 180604.
- Oikonomou, T.; Bagci, G.B. Misusing the entropy maximization in the jungle of generalized entropies. Phys. Lett. A 2017, 381, 207–211.
- Tsallis, C.; Mendes, R.S.; Plastino, A.R. The role of constraints within generalized nonextensive statistics. Physica A 1998, 261, 534–554.
- Martínez, S.; Nicolás, F.; Pennini, F.; Plastino, A. Tsallis' entropy maximization procedure revisited. Physica A 2000, 286, 489–502.
- Plastino, A.; Plastino, A.R. On the universality of thermodynamics' Legendre transform structure. Phys. Lett. A 1997, 226, 257–263.
- Rama, S.K. Tsallis Statistics: Averages and a Physical Interpretation of the Lagrange Multiplier β. Phys. Lett. A 2000, 276, 103–108.
- Campisi, M.; Bagci, G.B. Tsallis Ensemble as an Exact Orthode. Phys. Lett. A 2007, 362, 11–15.
- Dixit, P.D.; Wagoner, J.; Weistuch, C.; Pressé, S.; Ghosh, K.; Dill, K.A. Perspective: Maximum caliber is a general variational principle for dynamical systems. J. Chem. Phys. 2018, 148, 010901.
- Lucia, U. Stationary Open Systems: A Brief Review on Contemporary Theories on Irreversibility. Physica A 2013, 392, 1051–1062.
- Palazzo, P. Hierarchical Structure of Generalized Thermodynamic and Informational Entropy. Entropy 2018, 20, 553.
- Shore, J.E.; Johnson, R.W. Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Trans. Inf. Theory 1980, 26, 26–37.
- Shore, J.E.; Johnson, R.W. Properties of cross-entropy minimization. IEEE Trans. Inf. Theory 1981, 27, 472–482.
- Uffink, J. Can the maximum entropy principle be explained as a consistency requirement? Stud. Hist. Philos. Mod. Phys. 1995, 26, 223–261.
- Tsallis, C. Conceptual Inadequacy of the Shore and Johnson Axioms for Wide Classes of Complex Systems. Entropy 2015, 17, 2853–2861.
- Pressé, S.; Ghosh, K.; Lee, J.; Dill, K.A. Reply to C. Tsallis' "Conceptual Inadequacy of the Shore and Johnson Axioms for Wide Classes of Complex Systems". Entropy 2015, 17, 5043–5046.
- Oikonomou, T.; Bagci, G.B. Rényi entropy yields artificial biases not in the data and incorrect updating due to the finite-size data. Phys. Rev. E 2019, 99, 032134.
- Jizba, P.; Korbel, J. Comment on "Rényi entropy yields artificial biases not in the data and incorrect updating due to the finite-size data". Phys. Rev. E 2019, 100, 026101.
- Oikonomou, T.; Bagci, G.B. Reply to "Comment on Rényi entropy yields artificial biases not in the data and incorrect updating due to the finite-size data". Phys. Rev. E 2019, 100, 026102.
- Jizba, P.; Korbel, J. Maximum Entropy Principle in Statistical Inference: Case for Non-Shannonian Entropies. Phys. Rev. Lett. 2019, 122, 120601.
- Jizba, P.; Korbel, J. When Shannon and Khinchin meet Shore and Johnson: Equivalence of information theory and statistical inference axiomatics. Phys. Rev. E 2020, 101, 042126.
- Plastino, A.; Plastino, A.R. Tsallis Entropy and Jaynes' Information Theory Formalism. Braz. J. Phys. 1999, 29, 50–60.
- Naudts, J. Generalized Thermostatistics; Springer: London, UK, 2011.
- Biró, T.S.; Ván, P. Zeroth law compatibility of nonadditive thermodynamics. Phys. Rev. E 2011, 83, 061147.
- Wada, T.; Scarfone, A.M. Connections between Tsallis' formalisms employing the standard linear average energy and ones employing the normalized q-average energy. Phys. Lett. A 2005, 335, 351–362.
© 2021 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Korbel, J. Calibration Invariance of the MaxEnt Distribution in the Maximum Entropy Principle. Entropy 2021, 23, 96. https://doi.org/10.3390/e23010096