2.1. Standard Approach to Calculate the Configurational Entropy of a One-Dimensional Ideal Gas
Let us consider a diluted, ideal gas in a (one-dimensional) box with M (imaginary) sites and N indistinguishable particles of point-like character and equal mass m (with $N \ll M$ because the gas is diluted). For simplicity, each site may be occupied by more than one particle, which is possible since the particles are point-like. Following Gibbs (after resolving the Gibbs paradox), the number of possible micro-states is [1,2]
$W = \dfrac{M^N}{N!}$
yielding
$S = k_\mathrm{B} \ln W = k_\mathrm{B} \ln \dfrac{M^N}{N!}$
The number of micro-states is divided by $N!$ because the particles are indistinguishable. Furthermore, it is noted that the configurational-space sites, of which there are M, are not further defined here.
Using only the first term (i.e., the first order approximation) of the Stirling formula, i.e., $\ln N! \approx N \ln N - N$, which is usually used in standard textbooks on statistical thermodynamics, the following description of the Boltzmann entropy is obtained
$S = k_\mathrm{B} \ln \dfrac{M^N}{N!} \approx k_\mathrm{B}\, N \left[ \ln \dfrac{M}{N} + 1 \right]$
After applying (and only after applying) the first term in the Stirling formula (shown in brackets above) the entropy is of extensive nature [
8], provided that the number of sites,
M, is chosen proportional to the system size, i.e., the number of particles,
N.
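This extensivity can be checked numerically. The following minimal Python sketch (the site-to-particle ratio M/N = 100 is an arbitrary choice for illustration; `math.lgamma` is used to evaluate ln N! without overflow) compares the per-particle entropy from the exact expression $S/k_\mathrm{B} = \ln(M^N/N!)$ with the first-order Stirling result:

```python
import math

def entropy_exact(N, M):
    # S / k_B = ln(M^N / N!), evaluated with lgamma(N + 1) = ln(N!)
    return N * math.log(M) - math.lgamma(N + 1)

def entropy_stirling(N, M):
    # first-order Stirling, ln N! ~ N ln N - N:  S / k_B = N [ln(M/N) + 1]
    return N * (math.log(M / N) + 1)

# keep the number of sites proportional to the number of particles (M = 100 N);
# the per-particle entropy then approaches the constant ln(100) + 1
for N in (1_000, 10_000, 100_000):
    M = 100 * N
    print(N, entropy_exact(N, M) / N, entropy_stirling(N, M) / N)
```

With M chosen proportional to N, the per-particle entropy converges to a constant, which is the extensivity stated in the text.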
Similarly, a model of a gas that does not allow two particles to occupy the same site yields
$W = \dfrac{M!}{N!\,(M-N)!}$
and concomitantly
$S = k_\mathrm{B} \ln \dfrac{M!}{N!\,(M-N)!} \approx k_\mathrm{B}\, N \left[ \ln \dfrac{M}{N} + 1 \right]$
Again, the first order approximation of Stirling's formula and $M - N \approx M$ for $N \ll M$ were used. Since the gas is diluted ($N \ll M$), the two models, and thus the two presented approaches to calculate the configurational entropy of an ideal gas, are essentially equivalent.
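The near-equivalence of the two counting models in the dilute limit can also be checked numerically. A short Python sketch (N = 1000 particles on M = 10^6 sites is an arbitrary dilute example) compares ln W for multiple and single site occupancy:

```python
import math

def ln_W_multiple(N, M):
    # sites may hold any number of particles: W = M^N / N!
    return N * math.log(M) - math.lgamma(N + 1)

def ln_W_single(N, M):
    # at most one particle per site: W = M! / (N! (M - N)!)
    return math.lgamma(M + 1) - math.lgamma(N + 1) - math.lgamma(M - N + 1)

# dilute gas: N << M, so double occupancy is rare and both counts nearly agree
N, M = 1_000, 1_000_000
print(ln_W_multiple(N, M), ln_W_single(N, M))
```

For this dilute choice the two values of ln W differ only by a relative amount of order N/(2M), i.e., far below a tenth of a percent.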
2.2. A Quantitative and Analytical Description of the Entropy of a One-Dimensional Gas with Particles Having a Constant Absolute Velocity
A one-dimensional gas is described here as N single-atom entities of point-like character of mass m located within a finite space of length L (in the following figures $L = 1$). A particle has a position coordinate x and velocity v, which is related to the temperature T of the system by equipartition of the kinetic energy, i.e., $\frac{1}{2} m v^2 = \frac{1}{2} k_\mathrm{B} T$ and thus $|v| = \sqrt{k_\mathrm{B} T / m}$. That is, the temperature is well defined if all particles have the same absolute velocity. If a particle collides, the collision is of elastic nature, i.e., kinetic energy and momentum are conserved at the collision; this also includes collisions with the walls of the finite space.
For this model, the ideal gas equation, $pV = N k_\mathrm{B} T$, with pressure p and volume V, is fulfilled because $p = \bar{F}/A$, with A the area of the wall perpendicular to the axis of the one-dimensional system and the time-averaged force on the wall, $\bar{F} = 2 m |v|\, N_c / \tau$. The number $N_c$ of collisions with the wall during the (long) averaging time period $\tau$ can be estimated by considering that the particle closest to the wall has to travel, on average, twice the distance $L/N$ between two collisions: $N_c = \tau |v| N / (2L)$. This yields $\bar{F} = N m v^2 / L$ and hence $pV = (\bar{F}/A)(A L) = N m v^2 = N k_\mathrm{B} T$. In other words, this very simple one-dimensional model with constant absolute velocities can be considered a one-dimensional ideal gas. Furthermore, the restriction to a single absolute velocity enables an analytical solution for the entropy, as we shall see. First, however, a simulation of a typical example of a three-particle system is calculated (see Material and Methods for details). For the position of each particle, the probability density distributions shown in
Figure 1A are obtained. The yellow particle is predominantly located in the left part of the space while it may still occupy positions in the entire box. Correspondingly, the blue particle is rather in the middle and the green particle on the right side of the one-dimensional box. In the case of four particles shown in
Figure 1B, the location distributions change slightly to accommodate the additional particle and collisions therein. Going from a three- to a four-particle system, the individual distributions get sharper along the space coordinate (see also below).
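Such position statistics can be sampled without an explicit collision simulation: equal-mass elastic collisions in one dimension merely exchange the velocities of the partners, so the interacting trajectories coincide with the *sorted* trajectories of non-interacting particles, and a single free particle bouncing with constant |v| is, at long times, uniformly distributed over the box. A Python sketch (box length 1 and the sample size are arbitrary choices):

```python
import random

random.seed(0)
N, samples = 3, 100_000
mean_pos = [0.0] * N
for _ in range(samples):
    # sorted positions of N independent, uniformly distributed particles
    # reproduce the position statistics of the interacting equal-mass gas
    xs = sorted(random.random() for _ in range(N))
    for p in range(N):
        mean_pos[p] += xs[p] / samples

print(mean_pos)  # close to [0.25, 0.50, 0.75] for N = 3
```

The sampled means sit at the quarter points of the box, consistent with the ordered distributions of the three-particle simulation.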
An analytical description of the probability density can be given by using a generalized Sinai billiard, for which the probability density of a single particle in an N-dimensional simplex is equivalent to that of
N particles in a one-dimensional box [
15,
16,
17]. The approach is illustrated for a system of two particles with equal masses in
Figure 2A.
The single particle in the two-dimensional triangle can be regarded as a one-dimensional system with two particles. One axis of the triangle corresponds to the left wall, the other axis to the right wall. If the particle comes from the top (along the vertical axis) as shown, it can be considered particle 1; hitting the hypotenuse corresponds to the collision with particle 2, and the following horizontal motion corresponds to particle 2, which eventually hits the right wall. In the example in
Figure 2A, the red particle coming from the top, hitting the hypotenuse, and going to the left therefore corresponds to a one-dimensional situation where particle 1 comes from the left wall while particle 2 does not move. After the collision, particle 1 stops moving, while particle 2 moves towards the right wall. Hence, studying the configurational entropy (or the phase space) of the multi-particle system in a one-dimensional box is equivalent to studying that of a single-particle system in an N-dimensional simplex (with all its properties). From the multi-particle-system point of view, according to Sinai [
15], the probability density function for the position of particle p = 1, …, N, numbered from left to right in a box of length $L = 1$, is then given by the beta distribution
$f_p(x) = \dfrac{x^{p-1}\,(1-x)^{N-p}}{B(p,\, N+1-p)}$ (6)
for $0 \le x \le 1$ and $f_p(x) = 0$ otherwise (i.e., outside the box), with $B$ the beta function. The beta function is given by $B(\alpha, \beta) = \Gamma(\alpha)\,\Gamma(\beta)/\Gamma(\alpha+\beta)$ in terms of the gamma function $\Gamma$. A proof of Equation (
6) is given in
Appendix A. The probability density $f_p(x)$ is exemplified in
Figure 1 for
N = 3, 4, 12, 100,000.
The mean position of particle number p counting from the left is (reintroducing the length L)
$\langle x_p \rangle = \dfrac{p}{N+1}\, L$ (7)
The mean position can also be obtained by studying a single particle in an
N-dimensional simplex, as illustrated for one particle in a two-dimensional space (corresponding to a two-particle system in a one-dimensional space) in
Figure 2A,B.
From inspection of
Figure 1, it is evident that the probability density distributions narrow with increasing number of particles
N. Analytically this can be described by the standard deviation $\sigma_p$ of the beta distribution, given by
$\sigma_p = \dfrac{\sqrt{p\,(N+1-p)}}{(N+1)\,\sqrt{N+2}}\; L$ (8)
The standard deviation is smallest for particles near the left or right wall, where $\sigma_p \approx L/N$ for large N, and largest for particles in the middle of the box, where $\sigma_p \approx L/(2\sqrt{N})$ for large N. The standard deviation thus depends on the location of particle p; actually, it is the distance to the wall that matters.
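The two limiting behaviors can be checked directly from the beta-distribution standard deviation. A small Python sketch (N = 10000 is an arbitrary large-N example, box length L = 1):

```python
import math

def sigma(p, N, L=1.0):
    # standard deviation of the beta distribution Beta(p, N + 1 - p), scaled by L
    return L * math.sqrt(p * (N + 1 - p)) / ((N + 1) * math.sqrt(N + 2))

N = 10_000
print(sigma(1, N))       # particle at the wall:    approximately L/N
print(sigma(N // 2, N))  # particle in the middle:  approximately L/(2*sqrt(N))
```

The wall particle is localized roughly N-fold more sharply relative to the box than 1/sqrt(N) scaling of the middle particle, illustrating that the distance to the wall sets the sharpness.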
In order to calculate the configurational entropy of this one-dimensional gas system and to study its extensive character, three approaches I-III are discussed in the following. In approach I, we build on the above standard deviation $\sigma_p$. We assume each particle to be located within $a$ times the standard deviation (for instance, $a = 3$ covers >99% of its location probability) and the number of micro-states to be proportional to the accessible configuration space (which requires removing the unit of L), i.e., $\Omega_\mathrm{I} \propto \prod_{p=1}^{N} 2a\sigma_p$, yielding an entropy estimation $S_\mathrm{I} = k_\mathrm{B} \sum_{p=1}^{N} \ln(2a\sigma_p) + \mathrm{const.}$ that is not extensive, i.e., $S_\mathrm{I}(\lambda N, \lambda L) \neq \lambda\, S_\mathrm{I}(N, L)$, even if we use the first order Stirling approximation. This is related to the non-local influence of the walls, which affect the standard deviation of the particle positions not only in their vicinity but over the entire system. The corresponding standard deviations of a system scaled by a factor $\lambda$ in size and number of particles are approximately $\sqrt{\lambda}$-fold larger than for the original system, $\sigma_p(\lambda N, \lambda L) \approx \sqrt{\lambda}\,\sigma_p(N, L)$, whereas they should remain constant for extensivity. That is, boundary effects decisively influence the bulk properties of the system. This is not compatible with extensivity of a macroscopic thermodynamic system, which can in general only be realized if boundary effects are negligible.
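The failure of extensivity in approach I can be demonstrated numerically. The sketch below assumes the micro-state count of approach I, a product of confinement intervals of width 2a times the beta-distribution standard deviation (a = 3 and the system sizes are arbitrary illustrative choices):

```python
import math

def S_I(N, L, a=3.0):
    # approach I (sketch): S_I / k_B = sum_p ln(2 a sigma_p), units of L dropped
    s = 0.0
    for p in range(1, N + 1):
        sigma_p = L * math.sqrt(p * (N + 1 - p)) / ((N + 1) * math.sqrt(N + 2))
        s += math.log(2 * a * sigma_p)
    return s

# scale particle number and box length together; for an extensive entropy
# S_I / N would stay constant, but here it drifts upward by about ln(lam)/2
for lam in (1, 2, 4):
    print(lam, S_I(100 * lam, 100.0 * lam) / (100 * lam))
```

The per-particle entropy grows with the scale factor, which reflects the sqrt(lambda) growth of the standard deviations stated in the text.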
In approach II, we therefore calculate the accessible configuration space of a (sub)system of length $l$ comprising $n$ particles located near the center of the original system (
Figure 1G). Such an approach is often used in theoretical calculations to get rid of issues with the wall (i.e., boundary effects). The n particles of the subsystem thus correspond to particles $p = (N-n)/2 + 1, \ldots, (N+n)/2$ of the original system. Assuming that the particles are located within $a$ times the standard deviation from their mean position and that the number of micro-states is proportional to the accessible configuration space, this yields the number of micro-states
$\Omega_\mathrm{II} \propto \prod_p 2a\sigma_p \approx \left(aL/\sqrt{N}\right)^n$
for the n-particle subsystem of length l in the center of the N-particle system of length L (and thus $l/n = L/N$). The Boltzmann entropy is then given by
$S_\mathrm{II} = k_\mathrm{B}\, n\left[\ln\dfrac{l}{n} + \mathrm{const.}\right]$ (11)
Note that $n/l$ is the particle density and thus the term within the logarithm is of intensive nature; the constant factor $\ln(a\sqrt{N})$ was moved into the const., which is possible since both N and L are, within this calculation, arbitrary parameters (it is also noted that the factor $a$ results from the selection criterion of $a$ standard deviations). In the case of calculating entropy differences the const. cancels out and is thus of no importance.
This yields an entropy description that is of extensive character for any number of particles,
n, located in the middle of a large box such that boundary effects can be neglected. This is in contrast to the standard entropy of an ideal gas, which holds only in the thermodynamic limit with $N \to \infty$ (and this only under the assumption that the first order term of the Stirling formula can be used, but see [
8,
18]). The entropy calculated here is proportional to the number of particles of interest,
n, and proportional to the logarithm of the particle density. Note that the present derivation does not differentiate between distinguishable and non-distinguishable particles.
Comparing the standard configurational entropy of the ideal gas (Equation (3)) with the here derived one-dimensional analog of Equation (11) reveals that the number of sites per particle, $M/N$, is replaced by the length per particle, $l/n$ (in order to return to the general formalism, we again write N instead of n). This difference has major consequences. As described, it makes the entropy extensive for any particle number and without assuming the Stirling formula. Furthermore, it gives the space-to-particle-number relationship a prominent role. In the case of the one-dimensional gas calculations in
Figure 1, it is evident that the particles $j \neq p$ influence the position of particle
p and its location distribution. The more particles there are, the narrower the spatial distribution of particle
p becomes. In other words, the presence of all the (other) particles
j restricts the space of any given particle
p (Equation (
9)) and determines the mean location and standard deviation (Equations (
7) and (
8)) of particle
p via the density. Accordingly, the entropy is affected by the interactions between particles.
In approach III, we follow Boltzmann’s concept by determining the probability of a system to be in a given macro state [
For the one-dimensional case this means that the probability for a particle to be in the region $x \le X$ (with $0 \le X \le L$) is calculated by the cumulative distribution function $F_p(X)$ of the beta distribution for each particle p with a mean position $\langle x_p \rangle$.
Figure 1H shows the cumulative distribution function of several particles at distributed locations. It is evident that for the particles with mean position $\langle x_p \rangle < X$, the cumulative probability $F_p(X)$ is very close to 1, whereas for all particles with $\langle x_p \rangle > X$, this probability is almost 0. Because of the shape of the positional probability distributions, it is assumed that the configuration space for each particle is $L/N$ and that the number of micro-states is proportional to the accessible configuration space. This yields $\Omega_\mathrm{III} \propto (L/N)^N$ and, without further approximations, a strictly extensive entropy
$S_\mathrm{III} = k_\mathrm{B}\, N \ln\dfrac{L}{N} + \mathrm{const.}$
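Assuming the per-particle configuration space L/N indicated by the cumulative-distribution argument, so that the micro-state count is (L/N)^N, strict extensivity follows immediately and can be confirmed in a one-line check (the system sizes below are arbitrary):

```python
import math

def S_III(N, L):
    # approach III (sketch): Omega_III ~ (L/N)^N  ->  S / k_B = N ln(L/N)
    return N * math.log(L / N)

# doubling particle number and box length together exactly doubles the entropy
print(S_III(100, 1000.0), S_III(200, 2000.0))
```

Because only the intensive ratio L/N enters the logarithm, S scales exactly linearly with N, with no Stirling approximation required.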
In summary, three approaches are presented to calculate quantitatively the configurational entropy of the one-dimensional gas with constant absolute velocity introduced at the beginning of
Section 2.2 (
Figure 1). All three are analytically solvable and valid because usually entropy differences are calculated. Approach I includes the effect of the walls and as such is not extensive, because merging two systems, each having two walls, would reduce the number of walls by two. Approaches II and III circumvent the issue with the walls and allow the study of entropy extensivity.
2.5. Absence of Ideal Gas Mixing and the Gibbs Paradox in a One-Dimensional System
The so-called Gibbs paradox describes the odd finding that by mixing two ideal gases, the Boltzmann entropy will increase only if the two gases are of distinct nature (whatever the distinctness is), and the change of entropy is thereby not dependent on the nature or degree of distinctness [
2]. To elaborate on this paradox in the one-dimensional scenario, we consider a box of length L with two particles having the same mass m, together with a corresponding box of two particles with unequal masses, $m_1 \neq m_2$. In
Figure 2A, the Sinai billiard representation of the system with equal mass is shown, while in
Figure 2B the system with two particles of distinct mass is shown by one particle moving within a right-angled triangle with side lengths $\sqrt{m_1}\,L$ and $\sqrt{m_2}\,L$. In the latter example, after a vertical motion and hitting the hypotenuse, the particle moves in both directions, $q_1$ and $q_2$. This situation, translated into the one-dimensional case, corresponds to particle 1 with mass $m_1$ coming from the left wall and colliding with the non-moving particle 2 with mass $m_2$. After the collision, particle 1 moves back to the left wall, while particle 2 moves to the right wall. The average horizontal and vertical positions of the billiard particle are $\sqrt{m_1}\,L/3$ and $2\sqrt{m_2}\,L/3$, which can either be calculated straightforwardly or by comparing
Figure 2A,B using the geometric theorem of intersecting lines. This yields in real space $\langle x_1 \rangle = L/3$ and $\langle x_2 \rangle = 2L/3$, identical to the case of the system with two particles of equal mass (
Figure 2A, Equation (7)). Similarly, the standard deviations of particles 1 and 2 are given by $\sigma_{q_1} = \sqrt{m_1}\,\sigma_1$ and $\sigma_{q_2} = \sqrt{m_2}\,\sigma_2$ in billiard coordinates, which translates to $\sigma_1 = \sigma_2 = \sqrt{2}\,L/6$ in the real space, again identical to the result for the system with two equal-mass particles (Equation (
14)). It appears obvious that extending this finding to a system of N particles with distinct masses, represented by an N-dimensional simplex with side lengths $\sqrt{m_i}\,L$ ($i = 1, \ldots, N$), will still yield Equations (7) and (8) for the mean positions and standard deviations of localization. Hence, the entropy of a mixed one-dimensional gas system is the same as that of its homogeneous analog. Translated to the mixing of two systems, which is the topic of this section: if two one-dimensional gas systems are just mixed, the entropy does not change, irrespective of whether the gas molecules are of homogeneous or heterogeneous mass.
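The mass independence of the position statistics can be made plausible numerically: the billiard coordinates are $q_i = \sqrt{m_i}\,x_i$, and the map back to particle positions has a constant Jacobian, so the uniform (ergodic) billiard measure corresponds to a uniform measure on the ordered region $0 \le x_1 \le x_2 \le L$ for any masses. A Python sketch (L = 1 and the sample size are arbitrary choices):

```python
import random

random.seed(1)
L, samples = 1.0, 100_000
s1 = s2 = 0.0
for _ in range(samples):
    # uniform point in the ordered region 0 <= x1 <= x2 <= L; this measure
    # does not depend on the particle masses at all
    a, b = random.uniform(0.0, L), random.uniform(0.0, L)
    x1, x2 = min(a, b), max(a, b)
    s1 += x1 / samples
    s2 += x2 / samples

print(s1, s2)  # close to L/3 and 2L/3, as for equal masses
```

The sampled means reproduce L/3 and 2L/3 regardless of any mass ratio, consistent with the absence of a mixing entropy in one dimension.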
In summary, in the one-dimensional gas there is not only no Gibbs paradox, but there is also no mixing entropy. The latter is due to the impossibility of mixing the positions of the gas particles. This also means that the mixing entropy is generated solely by the mixing of the positions and not by enlarging the accessible volume of each gas upon removing a wall.