Article

Internal Energy, Fundamental Thermodynamic Relation, and Gibbs’ Ensemble Theory as Emergent Laws of Statistical Counting

Department of Applied Mathematics, University of Washington, Seattle, WA 98195-3925, USA
Entropy 2024, 26(12), 1091; https://doi.org/10.3390/e26121091
Submission received: 20 November 2024 / Accepted: 11 December 2024 / Published: 13 December 2024

Abstract

Statistical counting ad infinitum is the holographic observable to a statistical dynamics with finite states under independent and identically distributed sampling of size N. Entropy provides the infinitesimal probability for an observed empirical frequency ν̂ with respect to a probability prior p when ν̂ ≠ p as N → ∞. Following Callen’s postulate and through the Legendre–Fenchel transform, without help from mechanics, we show that an internal energy u emerges; it provides a linear representation of real-valued observables with full or partial information. Gibbs’ fundamental thermodynamic relation and theory of ensembles follow mathematically. u is to ν̂ what the chemical potential μ is to the particle number N in Gibbs’ chemical thermodynamics, what β = T^{−1} is to the internal energy U in classical thermodynamics, and what the frequency ω is to the time t in Fourier analysis.

1. Introduction

It is a pleasure to be a part of this celebration for Signe Kjelstrup. She has made significant contributions to nonequilibrium thermodynamics, in both theory and applications that include electrochemistry, transport in heterogeneous media, and T. L. Hill’s small systems [1,2,3]. In this work we extend Gibbs’ and Hill’s approach to equilibrium thermodynamics [4] and show that the new logical path via the “crucial step” advocated in [5] is in fact a consequence of a limit theorem [6] in the mathematical theory of probability [7,8]. The results perfectly fit P. W. Anderson’s notion of emergent phenomenon [9].
Sometimes a mathematical transform can provide a fundamental concept beyond just being a technique for solving a problem, and through which a new representation of a natural phenomenon emerges. A case in point is the Fourier transform (FT), which leads to the theory of harmonics in musical instruments [10] and the very concept of optical spectrum. FT represents a function of time f(t) in terms of f̃(ω), where ω is introduced as a novel notion, the temporal frequency of a sinusoidal oscillatory component in time [11]. The solutions to a large class of problems in differential calculus involving t can be very efficiently expressed through FT.
We show in the present paper that the fundamental notion of internal energy, which first appeared in the theory of thermodynamics in the 19th century and was collectively developed by J. R. von Mayer, W. Rankine, R. Clausius, and W. Thomson among many others [12], is a concept that can be understood, and generalized, through statistical counting. The transformation in question is the Legendre–Fenchel transform (LFT) [13,14], a more refined mathematical formulation of the traditional Legendre transform [15].
When a simple statistical analysis is carried out on a set of data, correlated or not, it is usually supposed that they are from an identical probability distribution. One of the best understood systems that exhibit an invariant probability is the ergodic dynamical system [16]. The ergodic theory of classical Hamiltonian dynamics has been an intense research area in both physics and mathematics for more than a century [17,18]. Even when the data are from seemingly different “objects”, say different individuals within a biological species, it is understood that an ergodic mating or mutational process is behind the statistical practice; and the conclusions drawn are most meaningful in this regard. Such an ergodic stochastic dynamic perspective has transformed cell biology through the notion of phenotypic switching in recent years [19].

2. Energetic Theory of Statistical Counting

Let us consider the repeated statistical sampling ad infinitum of a system with finite state space S = {0, 1, …, n}. In the present work we shall restrict our discussion to independent and identically distributed (i.i.d.) samples. More general sampling of Markov data will be published elsewhere. The number counting ν = (ν_0, …, ν_n), with ν_0 + ⋯ + ν_n = N, and the counting frequency ν̂ = ν/N (not to be confused with the temporal frequency ω in the FT above) have a homogeneous degree 1 neg-entropy function with respect to a given probability prior p = (p_0, …, p_n) [8,20]:
$$\Phi(\nu) \;=\; \sum_{i=0}^{n} \nu_i \ln\!\left(\frac{\nu_i}{p_i \sum_{j=0}^{n} \nu_j}\right). \tag{1}$$
Appendix A provides the mathematical origin of the non-negative Φ(ν) as a result of statistical counting. In information theory, it is interpreted as the “surprise” in observing the ν under the assumption p [21,22]. It is a double-edged sword: it tells the rareness of ν (or ν̂) with respect to p, or the erroneousness of a model p with respect to an empirical ν. The Kolmogorov probability p has two rather different roles in statistical inference and in statistical physics. In the former it has been identified as only one half of the subject; the other “half of probability theory as it is needed in current applications—the principles for assigning probabilities by logical analysis of incomplete information—is not present at all in the Kolmogorov system” [20].
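As a sanity check, the defining properties of the neg-entropy in (1), non-negativity, vanishing exactly at ν̂ = p, and Eulerian degree-1 homogeneity, can be verified numerically. The prior and counts below are hypothetical; this is only a minimal sketch:

```python
import math

def neg_entropy(nu, p):
    """Degree-1 homogeneous neg-entropy Phi(nu) of Eq. (1):
    sum_i nu_i * ln( nu_i / (p_i * N) ), with N = sum_j nu_j."""
    N = sum(nu)
    return sum(v * math.log(v / (q * N)) for v, q in zip(nu, p) if v > 0)

p = [0.5, 0.3, 0.2]          # hypothetical prior on S = {0, 1, 2}
nu = [40, 35, 25]            # hypothetical counts, N = 100

phi = neg_entropy(nu, p)
# Phi is non-negative, and vanishes only when the empirical
# frequency nu/N coincides with the prior p.
assert phi > 0.0
assert abs(neg_entropy([50, 30, 20], p)) < 1e-12

# Eulerian degree-1 homogeneity: Phi(lambda * nu) = lambda * Phi(nu).
phi2 = neg_entropy([2 * v for v in nu], p)
assert abs(phi2 - 2 * phi) < 1e-9
```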
The application of modern probability to statistical physics involves the limit of sampling ad infinitum, represented by N → ∞ [6]. In this case, the probability p is for all the systems with the same state space S; it is not meant to be realistic for any particular system. It simply provides a “metric” under which each and every particular system has its own representation in terms of its complete information, ν. The Φ(ν) is introduced to further gauge the differences among systems with different ν’s on the same S; it becomes a “theory of everything.” Because of the N → ∞ limit, there are no uncertainties in ν̂; it is a definitive characterization of an i.i.d. statistical distribution with state space S.
Therefore, statistical inference is about the mathematical model of a particular system, and statistical physics is about the mathematical representation of all systems with the same S under the supposition of i.i.d. data ad infinitum. The entropy in (1) is an emergent characterization in the limit of N → ∞, with the starting point in terms of generative models [9]. It provides the relationship between ν and p in the sampling process. It is an Eulerian degree 1 homogeneous function of ν: Φ(λν) = λΦ(ν). This fits naturally with the fundamental thermodynamic postulate formulated by H. B. Callen [23]. The LFT of Φ, as a function of the normalized ν̂, then yields [13,14,24]:
$$\Psi(u) \;=\; \inf_{\hat\nu}\left\{ \sum_{i=0}^{n} \hat\nu_i u_i + \Phi(\hat\nu) \right\} \;=\; -\ln \sum_{i=0}^{n} p_i\, e^{-u_i}, \tag{2a}$$
with the corresponding optimal ν̂*(u):
$$\hat\nu^*_i \;=\; \frac{p_i\, e^{-u_i}}{\sum_{\ell=0}^{n} p_\ell\, e^{-u_\ell}}, \qquad u_i \;=\; -\frac{\partial \Phi(\hat\nu^*)}{\partial \hat\nu_i}. \tag{2b}$$
Note that the second equation in (2b) is obtained when one uses calculus to solve the infimum in (2a); this recovers the traditional Legendre transform. Normalizing ν to ν̂ induces a gauge freedom in (2): an arbitrary additive constant in the u_i. In statistical thermodynamics, the conjugate variable u_k introduced in Equation (2) has been interpreted as the internal energy of the state k, in units of k_BT [25]; ν̂ · u is then the mean internal energy of “the statistical system”.
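The variational structure of (2) is easy to probe directly. The sketch below uses hypothetical numbers and assumes the sign convention ν̂*_i ∝ p_i e^{−u_i} (low energy, high probability); it checks that the infimum in (2a) is attained at ν̂*, and that the gauge freedom shifts Ψ by the same additive constant as u:

```python
import math
import random

def Psi(u, p):
    """Free-energy-like function of Eq. (2a): -ln sum_i p_i e^{-u_i}."""
    return -math.log(sum(q * math.exp(-ui) for q, ui in zip(p, u)))

def nu_star(u, p):
    """Optimal frequency of Eq. (2b): nu_i* proportional to p_i e^{-u_i}."""
    w = [q * math.exp(-ui) for q, ui in zip(p, u)]
    Z = sum(w)
    return [wi / Z for wi in w]

def objective(nu, u, p):
    # nu.u + Phi(nu), for a normalized frequency nu
    return (sum(v * ui for v, ui in zip(nu, u))
            + sum(v * math.log(v / q) for v, q in zip(nu, p) if v > 0))

p = [0.5, 0.3, 0.2]          # hypothetical prior
u = [0.0, 1.0, -0.5]         # hypothetical internal energies, in kT units

val = objective(nu_star(u, p), u, p)
# The infimum in (2a) is attained at nu*:
assert abs(val - Psi(u, p)) < 1e-12
# and random normalized frequencies never do better.
random.seed(0)
for _ in range(100):
    r = [random.random() for _ in p]
    s = sum(r)
    assert objective([ri / s for ri in r], u, p) >= val - 1e-12
# Gauge freedom: shifting u by a constant shifts Psi by the same
# constant and leaves nu* unchanged.
assert abs(Psi([ui + 2.0 for ui in u], p) - Psi(u, p) - 2.0) < 1e-12
```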
In a real-world laboratory working on a particular system, ν tends to infinity as N → ∞, but ν̂ converges to an intrinsic property of the statistical system. In statistical inference, the assumed p, as a prior, is then expected to be replaced by the observed, real, posterior ν̂, according to conditional probability and/or Bayesian statistical logic [24,25]. This concludes the statistical investigation of the particular system with respect to the type of observations. The neg-entropy function in (1) actually provides a meta-statistical theory for all possible observed ν̂, assessing their respective infinitesimal probability (rate) with respect to the prior p (see Appendix A).

3. Maximization of Entropy Φ Under Constraint by Empirical Mean Value

The complete counting of the entire state space S in terms of empirical frequencies ν̂ is, however, only a gedankenexperiment. The significance of Gibbs’ ensemble theory lies in dealing with observations from a small set of real-valued observables g_1(i), g_2(i), …, g_J(i), where i ∈ S but J ≪ n. These g’s are random variables on the state space S. In fact, their empirical mean values are linear combinations of the ν̂_i:
$$x_j \;=\; \sum_{i=0}^{n} \hat\nu_i\, g_j(i). \tag{3}$$
To fix mathematical notation, we append g_0(i) = 1 and x_0 = 1, which represent the fact that ν̂ is always normalized, and denote by G_J the (n+1) × (J+1) matrix with elements
$$\left(G_J\right)_{ij} \;=\; \begin{cases} 1 & j = 0, \\ g_j(i) & j = 1, \ldots, J. \end{cases} \tag{4}$$
Equation (3) shows that if all the g’s are linearly independent and J = n, then one can solve the normalized ν̂ uniquely from each set of x’s: ν̂ = x G_n^{−1}. We refer to such a set of observables as holographic, with full information. In the following discussion, we shall always imagine (g_1, …, g_J) as the first J components of a holographic observable (g_0, g_1, …, g_n). When J < n, there is missing information [20,22,25].
With a set of observed values x = (x_1, …, x_J) in hand, where J < n, the maximum entropy principle (MEP) from classical thermodynamics [23] and the contraction principle from the mathematical theory of probability [8] assert that the most probable ν̂* consistent with the set of x corresponds to minimum neg-entropy:
$$\hat\nu^* \;=\; \arg\inf_{\hat\nu}\Big\{ \Phi(\hat\nu) \;\Big|\; \hat\nu\, G_J = x \Big\}. \tag{5}$$
The entire Gibbs’ ensemble theory arises in solving the mathematical problem posed in Equation (5) through LFT. See Appendix A for its origin.
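For a single observable (J = 1), the constrained problem (5) reduces to one-dimensional root finding for the conjugate force: the minimizer is the exponentially tilted distribution ν̂_i ∝ p_i e^{−g(i)y}, with y chosen so that the tilted mean matches the observed x. A sketch with hypothetical p, g, and x, assuming the e^{−u} sign convention:

```python
import math

def tilted(p, g, y):
    """Gibbs-tilted distribution nu_i(y) proportional to p_i exp(-g(i) y)."""
    w = [q * math.exp(-gi * y) for q, gi in zip(p, g)]
    Z = sum(w)
    return [wi / Z for wi in w]

def mean_g(p, g, y):
    return sum(v * gi for v, gi in zip(tilted(p, g, y), g))

def solve_mep(p, g, x, lo=-50.0, hi=50.0, tol=1e-12):
    """Find y so that the tilted mean of g equals x (J = 1 case of Eq. (5)).
    d<g>/dy = -Var(g) < 0, so the mean is decreasing in y and
    plain bisection suffices."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_g(p, g, mid) > x:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

p = [0.5, 0.3, 0.2]          # hypothetical prior on S = {0, 1, 2}
g = [0.0, 1.0, 2.0]          # hypothetical observable g(i)
x = 1.2                      # hypothetical observed mean value

y = solve_mep(p, g, x)
nu = tilted(p, g, y)
# The tilted distribution reproduces the observed mean exactly.
assert abs(sum(v * gi for v, gi in zip(nu, g)) - x) < 1e-9
```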
Entropy functions for different observables are different. First, for invertible G n , one has the entropy function for the holographic observable x = ( 1 , x 1 , , x n ) :
$$\Phi_x(x) \;\equiv\; \Phi\!\left( x\, G_n^{-1} \right). \tag{6}$$
This is simply a change in the independent variables from ν to x . Then in terms of this entropy function Φ x , (5) becomes
$$\varphi(x) \;=\; \inf_{\hat\nu}\Big\{ \Phi(\hat\nu) \;\Big|\; \hat\nu\, G_J = x \Big\} \;=\; \inf_{x_{J+1},\, \ldots,\, x_n} \Phi_x\big(1, x_1, \ldots, x_J, x_{J+1}, \ldots, x_n\big). \tag{7}$$
Intimately related to the generating function of a probability distribution, the LFT provides a powerful mathematical transform of the entropy functions Φ(ν), Φ_x(x), and φ(x) in terms of their conjugates in the energy representation. Parallel to the Ψ(u) in (2) are
$$\Psi_x(y) \;=\; \inf_{x}\left\{ \sum_{i=1}^{n} x_i y_i + \Phi_x(x) \right\}, \tag{8}$$
$$\psi(y) \;=\; \inf_{x}\left\{ \sum_{j=1}^{J} x_j y_j + \varphi(x) \right\}. \tag{9}$$
These conjugate functions are now related through a linear transformation:
$$\Psi_x(y) \;=\; \Psi\!\left( G_n\, y \right), \tag{10}$$
and projection:
$$\psi(y) \;=\; \Psi_x\big( y_1, \ldots, y_J, 0, \ldots, 0 \big) \;=\; \Psi\!\left( G_J\, y \right) \tag{11a}$$
$$\;=\; -\ln \sum_{i=0}^{n} p_i \exp\!\left( -\sum_{j=1}^{J} g_j(i)\, y_j \right). \tag{11b}$$
And finally, since φ(x) is convex, the inverse LFT yields
$$\varphi(x) \;=\; \sup_{y}\left\{ \psi(y) - \sum_{j=1}^{J} x_j y_j \right\} \;\Longleftrightarrow\; \begin{cases} \varphi \;=\; \psi(y) - y\cdot\nabla\psi(y), \\ x \;=\; \nabla\psi(y). \end{cases} \tag{12}$$
The optimization in (5) is completely “solved” in closed form, through the LFT and its inverse, as a parametric function of y given in (12).
The relation −φ = y·∇ψ(y) − ψ(y) in (12) should be recognized as a generalization of the celebrated “entropy = mean internal energy − free energy” (recall that φ is a neg-entropy), where
$$\frac{\partial \psi}{\partial y_k} \;=\; \frac{\displaystyle\sum_{i=0}^{n} g_k(i)\, p_i \exp\!\left( -\sum_{j=1}^{J} g_j(i)\, y_j \right)}{\displaystyle\sum_{i=0}^{n} p_i \exp\!\left( -\sum_{j=1}^{J} g_j(i)\, y_j \right)} \tag{13}$$
is the mean value of g k following Equation (11b), whose conjugate variable is y k . The identification of u = G J y in (11a) with the first law of thermodynamics as formulated by Gibbs seems natural.
The y_{J+1} = ⋯ = y_n = 0 in (11a) has a very clear thermodynamic interpretation: since the conjugate variables y are (up to a sign) the partial derivatives of the entropy function Φ_x with respect to x, finding the x’s with maximum entropy in Equation (7) simply means setting the corresponding y = 0, i.e., letting the entropic force be zero. For each independent observable g_j, y_j is its “custom-designed” conjugate force, and y_j dx_j contributes a term to the internal energy as the “thermodynamic work” associated with g_j. The internal energy u is a highly flexible, adaptive representation of the ν̂. When J = n, u = G_n y, and Equation (10) provides a complete “detailing” of the internal energy in terms of a set of holographic observables. MEP is for missing information [20].
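The parametric solution can be verified on a two-state example: sweeping the conjugate variable y, the pair x = ∇ψ(y), φ = ψ(y) − y·∇ψ(y) reproduces the constrained minimum of Φ computed directly. The prior and observable below are hypothetical; the signs follow the e^{−u} convention used above:

```python
import math

p = [0.4, 0.6]               # hypothetical two-state prior
g = [0.0, 1.0]               # observable g(i) = i

def psi(y):
    # psi(y) = -ln sum_i p_i exp(-g(i) y), the J = 1 case of Eq. (11b)
    return -math.log(sum(q * math.exp(-gi * y) for q, gi in zip(p, g)))

def grad_psi(y, h=1e-6):
    # central-difference derivative dpsi/dy
    return (psi(y + h) - psi(y - h)) / (2 * h)

def phi_direct(x):
    # phi(x) = Phi(nu) at the constrained minimizer nu = (1 - x, x)
    nu = [1 - x, x]
    return sum(v * math.log(v / q) for v, q in zip(nu, p) if v > 0)

# Sweep the conjugate variable y; (x, phi) trace out the rate function
# parametrically: x = dpsi/dy, phi = psi - y * dpsi/dy.
for y in [-1.0, -0.3, 0.0, 0.5, 1.5]:
    x = grad_psi(y)
    phi_par = psi(y) - y * x
    assert abs(phi_par - phi_direct(x)) < 1e-6
```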

4. Gibbs Distribution and Linear Algebraic Representation

There is a geometric picture associated with the above “thermodynamic analysis”. As we have stated, the counting frequency ad infinitum, ν̂, is a fundamental, intrinsic property of an ergodic dynamical system. The space of all possible frequency distributions ν̂, with ν̂_0 + ⋯ + ν̂_n = 1, is an n-dimensional hyperplane in the positive quadrant of R^{n+1}, known as the probability simplex M_n. For a given set of observables (g_1, …, g_J), M_n is foliated by ν̂ G_J = x with different x. On each leaf of the foliation there is a most probable ν̂*(x), located at the tangent point between the (n − J)-dimensional leaf and an (n − 1)-dimensional level set of the function Φ(ν̂). At this point, ∇_ν Φ(ν̂) = −u(ν̂) is the normal vector to the x-leaf in R^{n+1}, and ∇_x φ(x) = −y is its projection onto the J-dimensional manifold of x:
$$\hat\nu^*(x): \qquad \hat\nu^*_i \;=\; \frac{p_i}{Z(y)} \exp\!\left( -\sum_{j=1}^{J} g_j(i)\, y_j \right), \qquad x_j \;=\; \frac{1}{Z(y)} \sum_{i=0}^{n} g_j(i)\, p_i \exp\!\left( -\sum_{k=1}^{J} g_k(i)\, y_k \right), \tag{14}$$
in which
$$Z(y) \;=\; \sum_{i=0}^{n} p_i \exp\!\left( -\sum_{j=1}^{J} g_j(i)\, y_j \right).$$
All the other points on the same x-leaf are no longer relevant: they are deemed statistically impossible under the prior p and the observed x. The foliation therefore represents a partition of M_n into macro- and micro-worlds: traversing between different x-leaves are macroscopic thermodynamic processes that follow y(x). According to the logic of Bayesian statistics, one should use the most probable frequency distribution ν̂*(x) to update the prior p for the particular system with observed x. The microscopic world is still random, due to missing information, but its prior is now updated. This is Gibbs’ statistical ensemble.
With a given set of (g_1, …, g_J), M_n is collapsed onto a J-manifold in R^{n+1}, which is parametrized by x or, equivalently, y. There is no uncertainty in this “macroscopic” description. For a different set of g′’s and J′, there will be a different J′-manifold. It would be desirable to treat different g’s through transformations. We note that even though M_n is a “plane” in R^{n+1}, it is not a linear Euclidean space, since for any c ≠ 1, cν̂ ∉ M_n; neither are the x-leaves. They are affine manifolds [26]. Locating ν̂*(x) is a highly nonlinear procedure in the space of energies.
The LFT, in terms of Ψ(u), Ψ_x(y), and ψ(y), etc., enters as a powerful linear algebraic representation of the MEP procedure. The “collapse” of a holographic y to a y with missing information simply means neglecting all the extra dimensions: y_{J+1} = ⋯ = y_n = 0. This is because, due to the convexity of Φ(ν̂), there is a one-to-one relation between ν̂ and u = −∇Φ under a proper gauge fixing. And since the constraints in the MEP (5) are all linear, owing to the nature of observables being random variables, each g determines a 1-dimensional linear subspace in the space of u.

5. Generalized Clausius Inequality

A combination of Equations (9) and (12) yields a Clausius-inequality-like relation:
$$\varphi(x) + x\cdot y - \psi(y) \;\geq\; 0. \tag{15}$$
The thermodynamic equilibrium is between the observed mean value x and its conjugate “force” y. When the equality holds, there is a relation between x and y which should be identified as “the equation of state”, with ∇_x φ = −y and ∇_y ψ = x. When x ≠ ∇_y ψ(y), the difference x·y − ψ(y) can be interpreted as the nonequilibrium heat and φ again as the entropy; the inequality in (15) then becomes Clausius’ inequality.
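The inequality (15) is a direct consequence of the Legendre–Fenchel structure, and easy to probe numerically on a hypothetical two-state system: for mismatched x and y the left-hand side is strictly positive, with equality on the equation of state x = ∇_y ψ(y). A sketch:

```python
import math
import random

p = [0.4, 0.6]               # hypothetical two-state prior
g = [0.0, 1.0]               # observable g(i) = i

def psi(y):
    # J = 1 free energy: psi(y) = -ln(p0 + p1 * exp(-y))
    return -math.log(p[0] + p[1] * math.exp(-y))

def phi(x):
    # neg-entropy of the constrained minimizer nu = (1 - x, x)
    nu = [1 - x, x]
    return sum(v * math.log(v / q) for v, q in zip(nu, p) if v > 0)

# Clausius-like inequality (15): phi(x) + x*y - psi(y) >= 0 for ALL x, y.
random.seed(1)
for _ in range(200):
    xr = random.uniform(0.01, 0.99)  # hypothetical observed mean of g
    yr = random.uniform(-3.0, 3.0)   # arbitrary, possibly mismatched force
    assert phi(xr) + xr * yr - psi(yr) >= -1e-12

# Equality holds on the "equation of state" x = dpsi/dy:
y = 0.7
x = p[1] * math.exp(-y) / (p[0] + p[1] * math.exp(-y))
assert abs(phi(x) + x * y - psi(y)) < 1e-9
```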

6. Generalized Gibbs–Duhem Equation

The celebrated Gibbs–Duhem equation in classical thermodynamics is a consequence of the entropy being an Eulerian degree 1 homogeneous function. Thus, for the Φ(ν) in (1), we have
$$\Phi(\nu) = \sum_{i=0}^{n} \nu_i \frac{\partial \Phi}{\partial \nu_i}, \qquad \sum_{i=0}^{n} \nu_i \frac{\partial^2 \Phi}{\partial \nu_i \partial \nu_j} = 0, \qquad \sum_{j=0}^{n} \mathrm{d}\nu_j \sum_{i=0}^{n} \nu_i \frac{\partial^2 \Phi}{\partial \nu_i \partial \nu_j} = -\sum_{i=0}^{n} \nu_i \sum_{j=0}^{n} \frac{\partial u_i}{\partial \nu_j}\, \mathrm{d}\nu_j = 0, \quad \text{that is,} \quad \sum_{i=0}^{n} \nu_i\, \mathrm{d}u_i = 0, \tag{16}$$
in which we have used (2b). We identify (16) as a generalized Gibbs–Duhem equation.
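Equation (16) can be checked by finite differences: with u_i = −∂Φ/∂ν_i = −ln(ν̂_i/p_i), the sum Σ_i ν_i du_i vanishes along an arbitrary perturbation dν. A sketch with hypothetical numbers:

```python
import math

def u_of(nu, p):
    """Internal energy from Eq. (2b): u_i = -dPhi/dnu_i = -ln(nuhat_i / p_i)."""
    N = sum(nu)
    return [-math.log((v / N) / q) for v, q in zip(nu, p)]

p = [0.5, 0.3, 0.2]          # hypothetical prior
nu = [40.0, 35.0, 25.0]      # hypothetical (real-valued) counts
dnu = [0.3, -0.1, 0.4]       # arbitrary perturbation direction d(nu)

# Central-difference estimate of du_i along the direction dnu.
h = 1e-6
u_plus = u_of([v + h * d for v, d in zip(nu, dnu)], p)
u_minus = u_of([v - h * d for v, d in zip(nu, dnu)], p)
du = [(a - b) / (2 * h) for a, b in zip(u_plus, u_minus)]

# Generalized Gibbs-Duhem relation, Eq. (16): sum_i nu_i du_i = 0
# along any direction, as a consequence of degree-1 homogeneity.
gd = sum(v * d for v, d in zip(nu, du))
assert abs(gd) < 1e-6
```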

7. Conclusions

The mathematical theory of probability deals with a set of elementary events S , on which the probability p and random variables g’s are introduced. Applying this mathematics to the real world, each ergodic dynamical system with state space S has its own unique steady-state probability distribution which can be obtained as the ν ^ from i.i.d. sampling ad infinitum.
Our present theory is to statistical inference, which obtains particular ν̂’s, what dynamics is to kinematics in classical mechanics [27]. The entropy function in (1) arises in this context as a measure of the quantitative relationship between the assumed “hypothesis” (p) and the observed “data” (ν and ν̂), as “missing information” or “surprise” [21,22]. Motivated by the analogy to Fourier analysis, our generalized Gibbs theory suggests that the notion of thermo-energetics is a powerful mathematical transformation of the statistical description; ν̂ and u are simply two representations of the same physical reality, the former statistical and the latter thermo-energetic. u is to ν̂ what the chemical potential μ is to the particle number N in J. W. Gibbs’ chemical thermodynamics. With a fixed p, the theory of probability [8] reveals a powerful, dual energetic representation for various different systems, with the same state space S, in terms of their respective internal energy functions u [25]. This fundamental duality between counting frequency and internal energy was of course already recognized by L. Boltzmann in the 1880s, when he was developing statistical mechanics as a foundation of classical thermodynamics under the principle of equal a priori probability. The present work shows that while probability and statistics are fundamental to the foundation of thermodynamics, mechanics is not necessary. A similar conclusion was reached in the 1925 thesis of L. Szilard [28,29].
For sufficiently large N, the probability of observing a particular ν̂ is asymptotically zero except at ν̂ = p. The significance of Φ(ν̂; p) is to provide a “high-resolution magnifying glass” for the asymptotically small
$$\exp\!\big[ -N\, \Phi(\hat\nu;\, p) \big]. \tag{17}$$
This is known as the large deviations rate function in the modern theory of probability [8]. The entropy Φ(ν̂; p) is a function of both ν̂ and p, with Φ(ν̂; p) ≥ 0 and Φ(p; p) = 0. For a given p, it views each possible ν̂ from a real system as part of an entire class of systems under a common p: a meta-statistics. If one chooses the true steady-state probability π of a particular system to replace p, then Equation (17) gives the probability distribution of the uncertainties in the measurement ν̂ from N samples. The second-order Taylor expansion near π,
$$e^{-N\Phi(\hat\nu;\,\pi)} \;\approx\; \exp\!\left[ -\frac{N}{2} \sum_{i,j=0}^{n} (\hat\nu_i - \pi_i)\, \frac{\delta_{ij} - \pi_i}{\pi_i}\, (\hat\nu_j - \pi_j) \right], \tag{18}$$
is the central limit theorem for the statistics of the counting frequency ν̂, with Var[ν̂_i] = π_i(1 − π_i)/N and Cov[ν̂_i, ν̂_j] = −π_i π_j/N for i ≠ j. These are not the fluctuations within the π of the system itself: Gibbs’ theory of ensembles is about statistical measurements of a whole system, not about the individuals within.
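The stated variances and covariances can be checked by brute-force simulation of repeated N-sample counting experiments; π, N, and the number of experiments below are hypothetical choices:

```python
import random

random.seed(42)
pi = [0.5, 0.3, 0.2]         # hypothetical true steady-state probability
N = 1000                     # samples per counting experiment
M = 5000                     # number of repeated experiments

# Empirical frequencies nuhat from M repeated N-sample experiments.
freqs = []
for _ in range(M):
    counts = [0, 0, 0]
    for s in random.choices(range(3), weights=pi, k=N):
        counts[s] += 1
    freqs.append([c / N for c in counts])

mean0 = sum(f[0] for f in freqs) / M
var0 = sum((f[0] - mean0) ** 2 for f in freqs) / M
cov01 = sum((f[0] - pi[0]) * (f[1] - pi[1]) for f in freqs) / M

# Gaussian fluctuations of Eq. (18):
#   Var[nuhat_i] = pi_i (1 - pi_i) / N,  Cov[nuhat_i, nuhat_j] = -pi_i pi_j / N.
assert abs(var0 - pi[0] * (1 - pi[0]) / N) < 3e-5
assert abs(cov01 + pi[0] * pi[1] / N) < 3e-5
```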
We chose to present our theory with a finite state space S for mathematical simplicity. Formal generalization to a continuous state space is straightforward if mathematical rigor is not required. Beyond finite state spaces, it is known that modern probability and the theory of measures encounter challenges, cf. de Finetti’s treatment of infinite sets and the axiom of choice for nonempty subsets [20]. In addition to the continuous R^n [30], there are even larger Hilbert spaces of functions on R^n and/or von Neumann algebras of operators acting on a Hilbert space.

Funding

This research received no external funding.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

I thank Jin Feng, Weishi Liu, Zhang-Ju Liu, Bing Miao, Zhongmin Shen, Xiang Tang, Yong-Shi Wu, and particularly Jun Zhang, for many helpful discussions, and the support from Olga Jung Wan Endowed Professorship.

Conflicts of Interest

The author declares no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
LFT   Legendre–Fenchel transform
FT    Fourier transform

Appendix A. Statistical Counting ad Infinitum

In this appendix, we provide the mathematical reasoning for stating that “entropy provides the infinitesimal probability for an observed frequency ν̂ with respect to a probability prior p” and that “it characterizes the relationship between ν and p in a sampling process”, as well as the origin of the Legendre–Fenchel transform in entropy analysis. The counting of independent and identically distributed samples with state space S = {0, …, n} yields ν = (ν_0, …, ν_n), an (n+1)-tuple of non-negative integers. We call the set of all ν with N = ν_0 + ⋯ + ν_n a simplex for counting. The simplex for counting grows with N, which we shall identify as “time”. With a given prior probability p = (p_0, …, p_n) on S, statistical counting is a Markov process on a growing simplex, with probability:
$$P^{(N+1)}(\nu) \;=\; \sum_{k=0}^{n} p_k\, P^{(N)}\big( \nu - \delta_k \big), \tag{A1}$$
in which δ_k = (0, …, 1, …, 0) is the unit vector for the k-th component. One can easily verify that
$$P^{(N)}(\nu) \;=\; \frac{N!}{\nu_0! \cdots \nu_n!}\; p_0^{\nu_0} \cdots p_n^{\nu_n} \tag{A2}$$
is a solution to (A1).
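This claim is easy to verify numerically for a small N; the prior and counts below are hypothetical:

```python
import math

def P(N, nu, p):
    """Multinomial probability of Eq. (A2)."""
    if any(v < 0 for v in nu):
        return 0.0
    coef = math.factorial(N)
    for v in nu:
        coef //= math.factorial(v)
    prob = float(coef)
    for v, q in zip(nu, p):
        prob *= q ** v
    return prob

p = [0.5, 0.3, 0.2]          # hypothetical prior
nu = [3, 2, 1]               # hypothetical counts with N + 1 = 6

# Markov recursion (A1): P^{(N+1)}(nu) = sum_k p_k P^{(N)}(nu - delta_k).
lhs = P(6, nu, p)
rhs = sum(q * P(5, [v - (1 if i == k else 0) for i, v in enumerate(nu)], p)
          for k, q in enumerate(p))
assert abs(lhs - rhs) < 1e-12
```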
One is interested in the limit of counting ad infinitum, when all the ν_i’s are expected to tend to infinity as N → ∞. On the increasing simplex for ν, the probability P^{(N)}(ν) → 0. However, the properly normalized ν̂ = ν/N converges, and P^{(N)} as a function of ν̂ becomes sharper and sharper, concentrated around ν̂* = p. To characterize this limiting situation more precisely, one introduces the counting frequency ν̂ = ν/N. The space of the ν̂’s is then called a probability simplex M_n; Equation (A1) then becomes
$$\tilde P^{(N+1)}(\hat\nu) \;=\; \sum_{k=0}^{n} p_k\, \tilde P^{(N)}\!\left( \frac{N+1}{N}\,\hat\nu - \frac{1}{N}\,\delta_k \right).$$
Its limit is a Dirac-δ function: P̃^{(∞)} = 0 for all ν̂ ≠ p, and P̃^{(∞)} = ∞ at ν̂ = p. However, “a higher-order” infinitesimal analysis shows that [8]
$$\lim_{N\to\infty} \frac{1}{N} \ln \tilde P^{(N)}(\hat\nu) \;=\; -\sum_{i=0}^{n} \hat\nu_i \ln\frac{\hat\nu_i}{p_i} \;=\; -\Phi(\hat\nu). \tag{A3}$$
It is clear that the entropy function Φ(ν) represents the infinitesimal prior probability e^{−NΦ(ν)} on M_n. For two ν’s with different entropy values, Φ(ν′) and Φ(ν″), the ratio of their probabilities P(ν′)/P(ν″) → 0 if Φ(ν′) > Φ(ν″). This is the origin of the maximum entropy principle (MEP).
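The large-deviation limit (A3) can be seen numerically from the exact multinomial probability (A2): at a fixed ν̂ ≠ p, the quantity −(1/N) ln P^{(N)} approaches Φ(ν̂) as N grows, with an O(ln N / N) correction. A sketch with hypothetical p and ν̂:

```python
import math

p = [0.5, 0.3, 0.2]          # hypothetical prior
nuhat = [0.2, 0.5, 0.3]      # a fixed empirical frequency, != p

def log_PN(N):
    """Exact multinomial log-probability ln P^{(N)}(N * nuhat), via Eq. (A2)."""
    nu = [round(N * f) for f in nuhat]
    return (math.lgamma(N + 1)
            - sum(math.lgamma(v + 1) for v in nu)
            + sum(v * math.log(q) for v, q in zip(nu, p)))

Phi = sum(f * math.log(f / q) for f, q in zip(nuhat, p))

# -(1/N) ln P^{(N)} converges to the rate function Phi(nuhat) as N grows.
errs = [abs(-log_PN(N) / N - Phi) for N in (100, 1000, 10000)]
assert errs[0] > errs[1] > errs[2]
assert errs[2] < 2e-3
```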
To understand the limit P ( N ) ( ν ) 0 , one can also introduce the probability generating function [8]:
$$W^{(N)}(u) \;=\; \sum_{\nu} P^{(N)}(\nu)\, e^{-u\cdot\nu}, \tag{A4}$$
in which u · ν = u 0 ν 0 + + u n ν n . Then, Equation (A1) becomes
$$W^{(N+1)}(u) \;=\; \sum_{\nu} P^{(N+1)}(\nu)\, e^{-u\cdot\nu} \;=\; \sum_{k=0}^{n} p_k\, e^{-u\cdot\delta_k} \sum_{\nu} P^{(N)}(\nu - \delta_k)\, e^{-u\cdot(\nu - \delta_k)} \;=\; W^{(N)}(u)\, e^{-\Psi(u)}, \quad \text{where} \quad \Psi(u) = -\ln \sum_{k=0}^{n} p_k\, e^{-u_k}. \tag{A5}$$
The free energy function Ψ is meaningful for all finite N. This is why the partition function is valid even for small systems in Gibbs’ theory of ensembles [13]. The Legendre–Fenchel transform of Ψ(u) is precisely the right-hand side of (A3), multiplied by N:
$$\inf_{u}\Big\{ u\cdot\nu - N\,\Psi(u) \Big\} \;=\; \inf_{u} \sum_{i=0}^{n} \nu_i \left[ \ln e^{u_i} + \ln\sum_{k=0}^{n} p_k\, e^{-u_k} \right] \;=\; \inf_{u} \sum_{i=0}^{n} \nu_i \ln\!\left[ e^{u_i} \sum_{k=0}^{n} p_k\, e^{-u_k} \right] \;=\; -\sum_{i=0}^{n} \nu_i \ln\frac{\hat\nu_i}{p_i}, \tag{A6}$$
in which the optimal e^{−u_i} ∝ ν_i/p_i. The Legendre–Fenchel transform arises in the limit of N → ∞ through Laplace’s method of evaluating asymptotic integrals, or the related Darwin–Fowler method of the maximum term.
The analysis in this Appendix suggests that a proper interpretation of p = (p_0, …, p_n) in Equation (A1) is not as an intrinsic property, for example the generative model of data statistics; rather, it should be interpreted as the choice of a “gauge” in terms of which a set of counting data is represented: each particular set of data ad infinitum is represented by the energy function u, not p, and the ν̂ is gauge invariant via the Boltzmann relation ν̂_i ∝ p_i e^{−u_i}; this yields an i.i.d. generative model. Probability is not for generative models; it is for analyzing empirical measurements of random variables.

References

  1. Førland, K.S.; Førland, T.; Kjelstrup, S. Irreversible Thermodynamics: Theory and Applications; John Wiley & Sons: Chichester, UK, 1988.
  2. Kjelstrup, S.; Bedeaux, D. Non-Equilibrium Thermodynamics of Heterogeneous Systems; Series on Advances in Statistical Mechanics, Volume 16; World Scientific: Singapore, 2008.
  3. Bedeaux, D.; Kjelstrup, S.; Schnell, S.K. Nanothermodynamics: Theory and Applications; World Scientific: Singapore, 2023.
  4. Guggenheim, E.A. Modern Thermodynamics by the Methods of Willard Gibbs; Methuen & Co.: New York, NY, USA, 1933.
  5. Hill, T.L. A different approach to nanothermodynamics. Nano Lett. 2001, 1, 273–275.
  6. Khinchin, A.Y. Mathematical Foundations of Statistical Mechanics; Dover: New York, NY, USA, 1949.
  7. Touchette, H. The large deviation approach to statistical mechanics. Phys. Rep. 2009, 478, 1–69.
  8. Dembo, A.; Zeitouni, O. Large Deviations Techniques and Applications, 2nd ed.; Springer: New York, NY, USA, 1998.
  9. Anderson, P.W. More is different: Broken symmetry and the nature of the hierarchical structure of science. Science 1972, 177, 393–396.
  10. Alm, J.F.; Walker, J.S. Time-frequency analysis of musical instruments. SIAM Rev. 2002, 44, 457–476.
  11. Fourier, J.B.J. The Analytical Theory of Heat; Freeman, A., Translator; Cambridge University Press: London, UK, 1878.
  12. Truesdell, C. Rational Thermodynamics; Springer: New York, NY, USA, 1984.
  13. Lu, Z.; Qian, H. Emergence and breaking of duality symmetry in thermodynamic behavior: Repeated measurements and macroscopic limit. Phys. Rev. Lett. 2022, 128, 150603.
  14. Galteland, O.; Bering, E.; Kristiansen, K.; Bedeaux, D.; Kjelstrup, S. Legendre–Fenchel transforms capture layering transitions in porous media. Nanoscale Adv. 2022, 4, 2660–2670.
  15. Rockafellar, R.T. Convex Analysis; Princeton University Press: Princeton, NJ, USA, 1970.
  16. Qian, M.; Xie, J.S.; Zhu, S. Smooth Ergodic Theory for Endomorphisms; Lecture Notes in Mathematics, Volume 1978; Springer: Berlin, Germany, 2009.
  17. Dorfman, J.R. An Introduction to Chaos in Nonequilibrium Statistical Mechanics; Cambridge Lecture Notes in Physics; Cambridge University Press: London, UK, 1999.
  18. Mackey, M.C. The dynamic origin of increasing entropy. Rev. Mod. Phys. 1989, 61, 981–1015.
  19. Qian, H.; Ge, H. Stochastic Chemical Reaction Systems in Biology; Lecture Notes on Mathematical Modelling in the Life Sciences; Springer Nature: Cham, Switzerland, 2021.
  20. Jaynes, E.T. Probability Theory: The Logic of Science; Cambridge University Press: London, UK, 2003.
  21. Levine, R.D. Information theory approach to molecular reaction dynamics. Annu. Rev. Phys. Chem. 1978, 29, 59–92.
  22. Ben-Naim, A. A Farewell to Entropy: Statistical Thermodynamics Based on Information; World Scientific: Singapore, 2008.
  23. Callen, H.B. Thermodynamics and an Introduction to Thermostatistics, 2nd ed.; Wiley: New York, NY, USA, 1991.
  24. Commons, J.; Yang, Y.-J.; Qian, H. Duality symmetry, two entropy functions, and an eigenvalue problem in Gibbs’ theory. arXiv 2021.
  25. Qian, H. Statistical chemical thermodynamics and energetic behavior of counting: Gibbs’ theory revisited. J. Chem. Theory Comput. 2022, 18, 6421–6436.
  26. Hong, L.; Qian, H.; Thompson, L.F. Representations and divergences in the space of probability measures and stochastic thermodynamics. J. Comput. Appl. Math. 2020, 376, 112842.
  27. Goldstein, H. Classical Mechanics; Addison-Wesley: New York, NY, USA, 1951.
  28. Szilard, L. Über die Ausdehnung der phänomenologischen Thermodynamik auf die Schwankungserscheinungen. Z. Physik 1925, 32, 753–788.
  29. Mandelbrot, B. On the derivation of statistical thermodynamics from purely phenomenological principles. J. Math. Phys. 1964, 5, 164–171.
  30. Miao, B.; Qian, H.; Wu, Y.S. Emergence of Newtonian deterministic causality from stochastic motions in continuous space and time. arXiv 2024.
