Article

Gibbs’ Paradox and the Definition of Entropy

by
Robert H. Swendsen
Physics Department, Carnegie Mellon University, Pittsburgh, PA 15213, USA
Entropy 2008, 10(1), 15-18; https://doi.org/10.3390/entropy-e10010015
Submission received: 10 December 2007 / Accepted: 14 March 2008 / Published: 20 March 2008
(This article belongs to the Special Issue Gibbs Paradox and Its Resolutions)

Abstract

Gibbs’ Paradox is shown to arise from an incorrect traditional definition of the entropy that has unfortunately become entrenched in physics textbooks. Among its flaws, the traditional definition predicts a violation of the second law of thermodynamics when applied to colloids. By adopting Boltzmann’s definition of the entropy, the violation of the second law is eliminated, the properties of colloids are correctly predicted, and Gibbs’ Paradox vanishes.

1. Introduction

Gibbs’ Paradox [1,2,3] is based on a traditional definition of the entropy in statistical mechanics found in most textbooks [4,5,6]. According to this definition, the entropy of a classical system is given by the product of Boltzmann’s constant, k, with the logarithm of a volume in phase space. This is shown in many textbooks to lead to the following equation for the entropy of a classical ideal gas of distinguishable particles,
$$
S = kN\left[\frac{3}{2}\ln\!\left(\frac{E}{N}\right) + \ln V + X\right] \qquad (1)
$$
where X is a constant. Most versions of Gibbs’ Paradox, including the one I give in this section, rest on the fact that Eq. 1 is not extensive. There is another version of Gibbs’ Paradox that involves the mixing of two gases, which I will discuss in the third section of this paper.
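The non-extensivity is easy to exhibit directly. Scaling every extensive variable in Eq. 1 by a factor $\lambda$ gives

$$
S(\lambda E, \lambda V, \lambda N) = k\lambda N\left[\frac{3}{2}\ln\!\left(\frac{\lambda E}{\lambda N}\right) + \ln(\lambda V) + X\right] = \lambda S(E,V,N) + k\lambda N\ln\lambda ,
$$

so doubling the system does more than double the entropy; the extra term $k\lambda N\ln\lambda$ is the fingerprint of the missing factor of $1/N!$ and is precisely what produces an entropy decrease when a system is divided.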
Discussions of Gibbs’ Paradox tend to be fairly abstract. I would like to make my argument concrete by applying the traditional definition of the entropy to a physical system for which Gibbs’ Paradox has clear implications. Any colloid would do for this purpose [7]; glue, paint, ink, or blood could be used, but I will choose homogenized milk. The fat globules have a diameter of about one micron, which is large enough for them to obey classical mechanics. They are distinguishable particles, not only because they obey classical mechanics, but also because different particles generally contain different arrangements of atoms, different impurities, and even different numbers of atoms. The exchange of two particles produces a different microscopic state, so milk (and other colloids) should be described by the classical statistical mechanics of distinguishable particles.
To see why the traditional definition of the entropy is a problem, consider its application to a container of homogenized milk. The entropy of fat globules in the milk should be given by Eq. 1. If we insert a partition into the container so that it is divided into two parts, each with volume V/2, the entropy of each part would be
$$
S' = k\,\frac{N}{2}\left[\frac{3}{2}\ln\!\left(\frac{E/2}{N/2}\right) + \ln\!\left(\frac{V}{2}\right) + X\right] \qquad (2)
$$
The change in entropy upon inserting the partition is then
$$
\Delta S = 2S' - S = kN\left[\ln\!\left(\frac{V}{2}\right) - \ln V\right] = -kN\ln 2 \qquad (3)
$$
which, being negative, would represent a violation of the second law of thermodynamics. This is one aspect of what is known as Gibbs’ Paradox.
However, Eq. 3 does not really present us with a paradox; it presents us with a proof that the traditional definition of the entropy is untenable. Unless we are willing to give up the second law, we must abandon the definition of entropy in terms of a volume in phase space.

2. Boltzmann’s definition of the entropy

Although Boltzmann never addressed Gibbs’ Paradox directly, his approach to statistical mechanics provides a solid basis for its resolution. Boltzmann defined the entropy in terms of the probability of the macroscopic state of a composite system [8,9,10]. Although the traditional definition of the entropy is often attributed to Boltzmann, this attribution is not correct. The equation on Boltzmann’s tombstone, S = k log W, which is sometimes cited as evidence, was never written by Boltzmann and does not refer to the logarithm of a volume in phase space. The equation was first written down by Max Planck, who correctly attributed the ideas behind it to Boltzmann [11,12]. Planck also stated explicitly that the symbol “W” stands for the German word “Wahrscheinlichkeit” (which means probability) and refers to the probability of a macroscopic state.
The dependence of Boltzmann’s entropy on the number of particles requires the calculation of the probability of finding a given number of distinguishable particles in each subsystem of a composite system. This calculation requires the inclusion of the binomial coefficient, $N!/(N_1!\,N_2!)$, where $N_1$ and $N_2$ are the numbers of particles in the two subsystems and $N = N_1 + N_2$. This binomial coefficient is the origin of the factor of $1/N!$ that is missing from the traditional definition, and it leads to an expression for the entropy that is extensive [9,10]:
$$
S = kN\left[\frac{3}{2}\ln\!\left(\frac{E}{N}\right) + \ln\!\left(\frac{V}{N}\right) + X'\right], \qquad (4)
$$

where $X' = X + 1$.
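The route from Eq. 1 to Eq. 4 is a short calculation. Including the counting factor of $1/N!$ in the phase-space volume and applying Stirling’s approximation, $\ln N! \approx N\ln N - N$, gives

$$
\ln\!\left(\frac{V^N}{N!}\right) \approx N\ln V - N\ln N + N = N\ln\!\left(\frac{V}{N}\right) + N ,
$$

which replaces $\ln V$ in Eq. 1 by $\ln(V/N)$ and shifts the constant to $X' = X + 1$.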
Boltzmann’s entropy is consistent with the second law of thermodynamics [8,9,10]. It is easy to see for the example given above that the change in Boltzmann’s entropy is zero when a container of milk is partitioned.
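For readers who prefer to check this numerically, here is a minimal Python sketch (not from the original paper; the function names, the choice $X = 0$, and the sample values of E, V, and N are illustrative) that evaluates both definitions before and after the partition is inserted:

```python
import math

k = 1.380649e-23  # Boltzmann's constant (J/K)

def entropy_traditional(E, V, N, X=0.0):
    # Reconstructed Eq. 1: entropy from the phase-space volume alone (not extensive).
    return k * N * (1.5 * math.log(E / N) + math.log(V) + X)

def entropy_boltzmann(E, V, N, X=0.0):
    # Reconstructed Eq. 4: includes the 1/N! counting factor, so X' = X + 1.
    return k * N * (1.5 * math.log(E / N) + math.log(V / N) + X + 1.0)

# Illustrative values: a one-liter container with 1e12 fat globules.
E, V, N = 1.0, 1.0e-3, 1.0e12

for S in (entropy_traditional, entropy_boltzmann):
    # Insert the partition: two subsystems, each with E/2, V/2, and N/2.
    delta_S = 2.0 * S(E / 2.0, V / 2.0, N / 2.0) - S(E, V, N)
    print(f"{S.__name__}: delta_S = {delta_S:+.3e} J/K")

# Expected: delta_S = -k*N*ln(2) < 0 for the traditional definition,
# and delta_S = 0 (up to floating-point rounding) for Boltzmann's definition.
```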

3. Gibbs’ Paradox and the entropy of mixing

The discussion above has concentrated on those versions of Gibbs’ Paradox that rest on the fact that Eq. 1 is not extensive. There is another version that is concerned with the mixing of two gases whose properties are imagined to change continuously from being different to being the same. This “paradox” consists of a discomfort with the entropy of mixing going to zero discontinuously as the properties become the same. Adopting Boltzmann’s definition of entropy also resolves this version of Gibbs’ Paradox.
Since Boltzmann’s definition of the entropy involves probabilities, it must reflect our knowledge (or ignorance) of the microscopic state. This is completely appropriate, since entropy is a description of our knowledge of the properties of a thermodynamic system, rather than a property of the system itself. The dependence of entropy on available information has been most clearly illustrated by Jaynes in his delightful discussion of the properties of the “superkalic elements” Whifnium and Whoofnium, which is too well known to need repetition here [13]. For any given experiment, there either is, or is not, available information on possible differences between types of particles. Since there are only two possibilities for whether or not the particle types are known to be the same, it is neither surprising nor paradoxical that the entropy of mixing changes discontinuously between the two cases [14].
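For concreteness, the standard mixing result (sketched here under the same ideal-gas assumptions as Eq. 4) is: for two samples of $N/2$ particles, each initially occupying a volume $V/2$, removal of the partition gives

$$
\Delta S_{\mathrm{mix}} =
\begin{cases}
kN\ln 2, & \text{if the two samples are known to contain different types of particles,}\\
0, & \text{if they are known to be identical.}
\end{cases}
$$

Because the available information is binary, there is no continuum of intermediate values for $\Delta S_{\mathrm{mix}}$.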

4. Conclusions

If we reject the traditional textbook definition of the entropy in terms of a volume in phase space and adopt Boltzmann’s definition in terms of the probability of the macroscopic state of a composite system, the statistical mechanics of both classical and quantum systems acquires a solid foundation. There is no violation of the second law of thermodynamics. The entropy of colloids is correctly calculated and predicted to be extensive. And Gibbs’ Paradox vanishes.

References

  1. Gibbs, J. W. On the Equilibrium of Heterogeneous Substances. Transactions of the Connecticut Academy 1873, 3, 108–248, 343–524.
  2. Gibbs, J. W. The Collected Works of J. W. Gibbs; Yale University Press, 1948; Vol. 1.
  3. Gibbs, J. W. Elementary Principles of Statistical Mechanics; Yale University Press: New Haven, 1902; reprinted by Dover: New York, 1960; pp. 206–207.
  4. Pathria, R. K. Statistical Mechanics, 2nd ed.; Butterworth-Heinemann: Boston, 1996; pp. 22–26. See especially Eq. (1a), which agrees with Eq. 1 in this paper if the temperature is replaced by the energy as a variable.
  5. Landau, L. D.; Lifshitz, E. M. Statistical Physics; Pergamon Press: New York, 1980; pp. 24–25.
  6. Reif, F. Fundamentals of Statistical and Thermal Physics; McGraw-Hill: New York, 1965; pp. 243–245. This book and some others use a definition of the partition function that also leads to the traditional expression for the entropy.
  7. Everett, D. H. Basic Principles of Colloid Science; Royal Society of Chemistry: London, 1988.
  8. Boltzmann, L. Über die Beziehung zwischen dem zweiten Hauptsatze der mechanischen Wärmetheorie und der Wahrscheinlichkeitsrechnung respektive den Sätzen über das Wärmegleichgewicht. Wien. Ber. 1877, 76, 373–435. Reprinted in Wissenschaftliche Abhandlungen von Ludwig Boltzmann, Vol. II; Chelsea: New York, 1968; pp. 164–223.
  9. Swendsen, R. H. Statistical mechanics of colloids and Boltzmann’s definition of the entropy. Am. J. Phys. 2006, 74, 187–190.
  10. Swendsen, R. H. Statistical mechanics of classical systems of distinguishable particles. J. Stat. Phys. 2002, 107, 1143–1165.
  11. Planck, M. Über das Gesetz der Energieverteilung im Normalspektrum. Drudes Annalen 1901, 553–562. Reprinted in Ostwalds Klassiker der exakten Wissenschaften, Band 206, Die Ableitung der Strahlungsgesetze, pp. 65–74; the equation appears with an arbitrary additive constant on p. 68 of the reprinted text.
  12. Planck, M. Theorie der Wärmestrahlung; J. A. Barth: Leipzig, 1906. Translated into English by Morton Masius as The Theory of Heat Radiation; Dover: New York, 1991; p. 119.
  13. Jaynes, E. T. The Gibbs Paradox. In Maximum-Entropy and Bayesian Methods; Erickson, G., Neudorfer, P., Smith, C. R., Eds.; Kluwer: Dordrecht, 1992; pp. 1–22.
  14. Ben-Naim, A. On the So-Called Gibbs Paradox, and on the Real Paradox. Entropy 2007, 9, 133–136.
