The “Real” Gibbs Paradox and a Composition-Based Resolution
Abstract
1. Introduction
Jaynes, however, contends that Gibbs committed a mathematical mistake in his last writings, which is ultimately responsible for the whole confusion around extensivity in statistical mechanics:
“For 60 years, textbooks and teachers (including, regrettably, the present writer) have impressed upon students how remarkable it was that Gibbs, already in 1902, had been able to hit upon this paradox which foretold—and had its resolution only in—quantum theory with its lore about indistinguishable particles, Bose and Fermi statistics, etc.”
“In particular, Gibbs failed to point out that an “integration constant” was not an arbitrary constant, but an arbitrary function. However, this has, as we shall see, nontrivial physical consequences. What is remarkable is not that Gibbs should have failed to stress a fine mathematical point in almost the last words he wrote; but that for 80 years thereafter all textbook writers (except possibly Pauli) failed to see it.”
In the above passage, the mentioned integration constant arises from a result derived by Gibbs in Chapter IV of his book on statistical mechanics. This point will be important for what we are going to discuss in a few paragraphs.
“Finally, in Chapter XV, we consider the modification of the preceding results which is necessary when we consider systems composed of a number of entirely similar particles, or, it may be, of a number of particles of several kinds, all of each kind being entirely similar to each other, and when one of the variations to be considered is that of the numbers of the particles of the various kinds which are contained in the system.”
At this point, the reader may be reminded that the alleged “integration constant” mistake (conflating a function of the particle number with a constant) that Gibbs made, according to Jaynes, is in Chapter IV.
“The fact is not less significant that the increase of entropy due to the mixture of gases of different kinds … is independent of the nature of the gases … and of the degree of similarity between them.”
If the substances were considered to be exactly identical, however, one would find the entropy change to be exactly zero.
“But suppose that [substances] A and B are so similar that the experimenter has no physical way of distinguishing between them. Then he does not have the semi-permeable walls needed for the second process, but on the other hand the first one will look perfectly reversible to him.”
This leads to the conclusion that there should not be any change in entropy as long as the experimenter does not have any physical means (or is not interested in using any such means) to separate the two substances. Indeed, in the above quote, the second process mentioned refers to obtaining the final entropy by determining the work performed by a semi-permeable membrane letting A pass through and not B, for example.
This point was already made by Gibbs himself:
“when we say that two gas-masses of the same kind are mixed under similar circumstances there is no change of energy or entropy, we do not mean that the gases that have been mixed can be separated without change to external bodies. On the contrary, the separation of the gases is entirely impossible … because we do not recognise any difference in the substance of the two masses”,
and was likely known to Duhem as well.
- Firstly, from a mathematical standpoint, if a mixture is characterised by a probability distribution p = (p_1, …, p_m), then each time a particle of that substance is added to the system, the species i is selected with probability p_i. In a simpler case, where only two species are possible, i.e., m = 2, the problem becomes analogous to the flipping of N coins. In that case, the number of particles of a given species shall follow a binomial distribution, as studied in [21], and Np_i is the (binomial) expectation value of N_i. In practice, however, a single realisation of N coin flips is not expected to give exactly Np_i. Thus, the proposed prescription, albeit heuristically intuitive, conflates an instantiated value taken by a random variable with its expectation value; this is akin to a sort of ‘mean field’ approximation, especially given that N_i ends up in various logarithmic functions in statistical thermodynamics.
- Secondly, from a conceptual standpoint, there is a problem with the substitution of N_i by Np_i; the latter is often not an integer. Gibbs’ statistical mechanics, or its quantum extension, establishes a relationship between the dimensionality of the state spaces to be explored and the number of particles in the system. These spaces possess an integral number of dimensions, not a fractional one.
- Finally, the prescription is often applied after the Stirling approximation has been used, which requires each N_i to be large; this cannot be guaranteed for all system sizes or for all composition probabilities.
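The conflation pointed out in the first bullet can be made concrete numerically. The sketch below uses illustrative values of N and p, and the convex function x ln x as a stand-in for the logarithmic terms mentioned above; it compares the exact binomial expectation of f(N_1) with the mean-field value f(Np):

```python
import math

N, p = 100, 0.5   # illustrative: N coin flips, species 1 chosen with probability p

def binom_pmf(k, n, pr):
    # probability of obtaining k particles of species 1 out of n
    return math.comb(n, k) * pr**k * (1 - pr)**(n - k)

def f(x):
    # convex function x ln x, typical of the logarithmic terms of
    # statistical thermodynamics (extended by continuity with f(0) = 0)
    return x * math.log(x) if x > 0 else 0.0

# exact expectation of f(N1) over all binomial realisations of N1 ...
exact = sum(binom_pmf(k, N, p) * f(k) for k in range(N + 1))
# ... versus the 'mean field' prescription: f evaluated at the expectation Np
mean_field = f(N * p)

print(round(exact, 3), round(mean_field, 3))
```

By Jensen's inequality the two values cannot coincide for a strictly convex f; the gap (here roughly 0.25) is precisely what the substitution of N_i by Np_i silently discards.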
2. Materials and Methods
- 1. Discrete mixtures comprising m possible identifiable species, labelled from 1 to m, are characterised—in a definitional sense—by an ideal composition corresponding to a set of fixed probabilities p = (p_1, …, p_m), satisfying p_1 + … + p_m = 1.
- 2. Any real mixture with a finite number N of particles is but one of many realisations obtained by independently sampling each of the N particle identities from the sample space {1, …, m}, with the corresponding probabilities p_1, …, p_m.
- 3. Let N_i denote the random variable representing the number of particles of species i in a given mixture and let (N_1, …, N_m) represent the multivariate random variable characterising the empirical composition. If the mixture comprises N particles, we must have that N_1 + … + N_m = N.
- 4. We denote n = (n_1, …, n_m) the composition realisation of a given mixture, such that for any species index i, we have N_i = n_i. Given that the particle identities for the N particles are considered independent and identically distributed with probabilities p_i, it follows that the probability distribution for (N_1, …, N_m) satisfies a multinomial distribution: P(n) = N!/(n_1! … n_m!) × p_1^{n_1} … p_m^{n_m}.
- 5. The Helmholtz free energy F(n) of an N-particle system is also a random variable via the composition random variable (N_1, …, N_m). We seek the realisation-independent Helmholtz free energy of the system by averaging over all possible composition realisations: ⟨F⟩ = ∑_n P(n) F(n).
3. Results
3.1. Case of a Single Mixture
3.1.1. Finite Discrete Mixtures: m and Map p Are Fixed
- At fixed composition, i.e., at fixed m and probability map p on {1, …, m}, the composition is constant, and the mixture can essentially be conceived of as a homogeneous substance. All thermodynamic properties of the system will be identical to those of a single-component system, with only the total substance particle density (as opposed to partial densities) playing a role.
- In the large system size limit, the realisation-independent free energy becomes proportional to the total amount of matter N in the system, i.e., becomes extensive. Consequently, given the safeguarding criterion we chose, the finite discrete mixture model described above constitutes a valid composition model of general mixtures.
- By applying the defining relation for the thermodynamic entropy, S = −(∂⟨F⟩/∂T)_{V,N}, to Equation (19), we obtain Equation (20).
“[Equation (20)] applies to all gases of constant composition for which the matter is entirely determined by a single variable [N]”
3.1.2. Infinite Discrete Mixtures
3.1.3. Finite Continuous Mixtures
- Mathematical limit: The model for map p may depend on m in a manner such that, as m becomes large “enough”, p is very well approximated by the distribution function of a continuous variable, and is identical to it in the infinite-m limit. Let us investigate in more detail the typical example of a binomial model for map p with parameters m and w, i.e.,
p_i = C(m, i) w^i (1 − w)^(m−i), i ∈ {0, 1, …, m}. (25)
Consider now the map between the species label i, taking values in {0, 1, …, m}, and a new variable r = i/m, taking values in the finite interval [0, 1] of the rationals. From Equation (25) and the bijective character of the correspondence between i and r, it follows that
P(r) = C(m, mr) w^(mr) (1 − w)^(m(1−r)). (26)
In expression (26), r must be limited to take values in the set {0, 1/m, 2/m, …, 1}. However, as m grows, we really do have that P(r) ≈ φ(r)/m, where φ is the probability density function of a normally distributed continuous random variable, and where we can identify the mean μ = w and the variance σ² = w(1 − w)/m (cf. Figure A2 in Appendix B).
- Experimental considerations: Whether the composition is characterised by the preparation protocol or by a measurement technique, any experimental process comes with a finite precision. Thus, in practice, even if one were to use a particle attribute that takes continuous values and is associated with a theoretical underlying distribution to characterise the composition of a mixture, that attribute is bound to be expressed in terms of a discrete set of a finite number m of continuous intervals, effectively corresponding to the m identifiable ‘species’ of the system at the available resolution and particle number. If m were to be increased, this would just improve the “granularity” of these intervals without making the set actually continuous. If this granularity is fine enough, one may define an empirical probability density and fit it to a corresponding continuous probability density model. Convergence to a stable continuous model is expected as m (i.e., the resolution) is increased. This experimental-resolution aspect can also be justified under the assumption that the excess free energy of the mixture solely depends on a subset of moments of the polydisperse distribution [19], where these moments are related to the precision of the measurement.
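The convergence invoked in the mathematical limit above can be checked numerically. The sketch below (with the arbitrary choice w = 0.3) compares the binomial weights p_i against φ(r)/m at r = i/m, where φ is the normal density with mean w and variance w(1 − w)/m:

```python
import math

w = 0.3                       # binomial parameter (arbitrary illustrative value)
errs = []
for m in (10, 100, 1000):
    var = w * (1 - w) / m     # variance of r = i/m under the binomial model
    max_err = 0.0
    for i in range(m + 1):
        p_i = math.comb(m, i) * w**i * (1 - w)**(m - i)
        r = i / m
        phi = math.exp(-(r - w)**2 / (2 * var)) / math.sqrt(2 * math.pi * var)
        max_err = max(max_err, abs(p_i - phi / m))
    errs.append(max_err)

print([round(e, 5) for e in errs])  # discrepancy shrinks as m grows
```

The shrinking maximal discrepancy is the de Moivre–Laplace local limit at work: the discrete composition model becomes indistinguishable from its continuous Gaussian envelope as m increases.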
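The experimental-resolution argument can likewise be illustrated with a toy computation: a hypothetical continuous particle attribute (a normally distributed 'size', assumed purely for illustration) is binned at increasing resolutions m, and the empirical density near the mean stabilises towards the underlying continuous model:

```python
import math
import random

random.seed(0)

# hypothetical continuous particle attribute, e.g. a size, with an assumed
# underlying normal distribution (mu, sigma chosen for illustration only)
mu, sigma = 1.0, 0.1
samples = [random.gauss(mu, sigma) for _ in range(200_000)]

lo, hi = 0.5, 1.5
densities = []
for m in (10, 40, 160):              # m bins play the role of m 'species'
    width = (hi - lo) / m
    counts = [0] * m
    for x in samples:
        if lo <= x < hi:
            counts[min(int((x - lo) / width), m - 1)] += 1
    k = min(int((mu - lo) / width), m - 1)   # bin containing the mean size
    densities.append(counts[k] / (len(samples) * width))

true_pdf = 1 / (sigma * math.sqrt(2 * math.pi))   # continuous model at mu
print([round(d, 3) for d in densities], round(true_pdf, 3))
```

At coarse resolution the binned density underestimates the peak; refining m makes the empirical probability density converge to the stable continuous model, as argued above.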
3.2. Mixing of Two Substances
- Firstly, let us consider a situation where a box of volume V is separated into two equally sized compartments by a removable wall, with particles of substance A in, say, the left-hand-side compartment, and particles of substance B in the right-hand-side compartment.
- Secondly, the wall separating the substances is removed so as to let them intermix until equilibrium is reached.
3.2.1. Free Energy before Removing the Wall
3.2.2. Free Energy after Having Removed the Wall
3.2.3. Entropy Change
- The first term corresponds to the partitioning entropy, i.e., the entropy gained by releasing initially confined particles into double the initial volume. Note that this term is oblivious to the particle type and is, therefore, always positive, even if the substances are identical.
- The second term measures a specific, bounded, square distance between the compositions of substances A and B. We shall propose an additional interpretation for this term a bit later.
- The third term represents the entropy change owing to composition realisations, i.e., to the fact that, upon repeating experiments, one is bound to obtain sampled empirical compositions that differ from the ideal compositions expressed by the probability distributions p_A and p_B.
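A bounded square distance with exactly the advertised properties is provided by the Jensen–Shannon divergence, whose square root is the metric of Endres and Schindelin [20]. The sketch below (with made-up three-species compositions) shows that it vanishes for identical substances and saturates at ln 2 for fully distinguishable ones:

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence D(p||q), with the convention 0 ln 0 = 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    # Jensen-Shannon divergence between compositions p and q;
    # symmetric, zero iff p == q, and bounded above by ln 2
    mix = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, mix) + 0.5 * kl(q, mix)

p_A = [0.5, 0.5, 0.0]     # made-up composition of substance A
p_B = [0.0, 0.5, 0.5]     # made-up composition of substance B

print(jsd(p_A, p_A))                  # identical substances
print(jsd([1.0, 0.0], [0.0, 1.0]))    # fully distinguishable substances
print(jsd(p_A, p_B))                  # partially overlapping substances
```

The boundedness by ln 2 is what makes the mixing contribution interpolate smoothly between the identical-substance and fully-distinct limits, in line with the continuity advocated in the Introduction.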
4. Discussion
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
Appendix A
Appendix B
Appendix C
- It is possible to estimate the order of magnitude of this contribution in the large N limit. Indeed, given that the entropy of a multinomial distribution is positive, it automatically follows that
References
- Gibbs, J.W. Elementary Principles in Statistical Mechanics; Ox Bow Press: Woodbridge, CT, USA, 1981.
- Jaynes, E.T. The Gibbs paradox. In Maximum Entropy and Bayesian Methods; Smith, C., Erickson, G., Neudorfer, P., Eds.; Kluwer Academic: Norwell, MA, USA, 1992; pp. 1–22.
- Hastings, C.S. Biographical Memoir of Josiah Willard Gibbs. In Biographical Memoirs, Part of Volume VI; National Academy of Sciences: Washington, DC, USA, 1909.
- Paillusson, F. Gibbs’ paradox according to Gibbs and slightly beyond. Mol. Phys. 2018, 116, 3196.
- Gibbs, J.W. On the Equilibrium of Heterogeneous Substances; Connecticut Academy of Arts and Sciences: New Haven, CT, USA, 1876; p. 108.
- Darrigol, O. The Gibbs paradox: Early history and solutions. Entropy 2018, 20, 443.
- Kondepudi, D.; Prigogine, I. Modern Thermodynamics, 2nd ed.; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2014.
- Sator, N.; Pavloff, N. Physique Statistique, 1st ed.; Vuibert: Paris, France, 2016.
- Van Kampen, N.G. The Gibbs paradox. In Essays in Theoretical Physics: In Honour of Dirk ter Haar; Parry, W., Ed.; Pergamon: Oxford, UK, 1984.
- Cates, M.E.; Manoharan, V.N. Testing the Foundations of Classical Entropy: Colloid Experiments. Soft Matter 2015, 11, 6538.
- Bérut, A.; Arakelyan, A.; Petrosyan, A.; Ciliberto, S.; Dillenschneider, R.; Lutz, E. Experimental verification of Landauer’s principle linking information and thermodynamics. Nature 2012, 483, 187.
- Frenkel, D. Why colloidal systems can be described by statistical mechanics: Some not very original comments on the Gibbs paradox. Mol. Phys. 2014, 112, 2325.
- Paillusson, F.; Pagonabarraga, I. On the role of composition entropies in the statistical mechanics of polydisperse systems. J. Stat. Mech. 2014, 2014, P10038.
- Flory, P.J. Principles of Polymer Chemistry; Cornell University Press: Ithaca, NY, USA, 1953.
- Salacuse, J.J. Random systems of particles: An approach to polydisperse systems. J. Chem. Phys. 1984, 81, 2468.
- Sollich, P. Projected free energy for polydisperse phase equilibria. Phys. Rev. Lett. 1998, 80, 1365.
- Warren, P.B. Combinatorial entropy and the statistical mechanics of polydispersity. Phys. Rev. Lett. 1998, 80, 1369.
- Sollich, P. Predicting phase equilibria in polydisperse systems. J. Phys. Condens. Matter 2002, 14, R79–R117.
- Sollich, P.; Warren, P.B.; Cates, M.E. Moment free energies for polydisperse systems. In Advances in Chemical Physics; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2001; Volume 116, pp. 265–336.
- Endres, D.M.; Schindelin, J.E. A new metric for probability distributions. IEEE Trans. Inf. Theory 2003, 49, 1858.
- Paillusson, F. On the Logic of a Prior-Based Statistical Mechanics of Polydisperse Systems: The Case of Binary Mixtures. Entropy 2019, 21, 599.
- Cichoń, J.; Gołębiewski, Z. On Bernoulli sums and Bernstein polynomials. Discret. Math. Theor. Comput. Sci. 2012, 179–190.
- Jensen, J.L.W.V. Sur les fonctions convexes et les inégalités entre les valeurs moyennes. Acta Math. 1906, 30, 175–193.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Paillusson, F. The “Real” Gibbs Paradox and a Composition-Based Resolution. Entropy 2023, 25, 833. https://doi.org/10.3390/e25060833