Article

Combinatorics and Statistical Mechanics of Integer Partitions

Themis Matsoukas
Department of Chemical Engineering, Pennsylvania State University, State College, PA 16801, USA
Entropy 2023, 25(2), 385; https://doi.org/10.3390/e25020385
Submission received: 19 December 2022 / Revised: 13 February 2023 / Accepted: 14 February 2023 / Published: 20 February 2023
(This article belongs to the Special Issue Generalized Statistical Thermodynamics II)

Abstract

We study the set of integer partitions as a probability space that generates distributions and, in the asymptotic limit, obeys thermodynamics. We view an ordered integer partition as a configuration of cluster masses and associate it with the distribution of masses it contains. We organize the set of ordered partitions into a table that forms a microcanonical ensemble and whose columns form a set of canonical ensembles. We define a functional of the distribution (the selection functional) that establishes a probability measure on the distributions of the ensemble, study the combinatorial properties of this space, define its partition functions, and show that, in the asymptotic limit, this space obeys thermodynamics. We construct a stochastic process, which we call the exchange reaction, and use it to sample the mean distribution by Monte Carlo simulation. We demonstrate that, with an appropriate choice of the selection functional, we can obtain any distribution as the equilibrium distribution of the ensemble.

1. Introduction

The central element of statistical mechanics is the ensemble and its partition function. The microcanonical ensemble contains all microstates with fixed macroscopic energy, volume, and number of particles. The canonical ensemble is a subset of the microcanonical (the “system”), and its complement forms another canonical ensemble (the “bath”). The microcanonical and the canonical partition function are Legendre transformations of each other, and their derivatives generate all thermodynamic properties. This summarizes the thermodynamic formalism of Gibbs [1], a recipe that has proven immensely successful in physics, chemistry, and biology. At the heart of Gibbs’s method is the microstate, a primitive stochastic variable whose enumeration (multiplicity) is a key element of the method. In the microcanonical ensemble, all microstates are represented in equal numbers; in the canonical ensemble, they are represented in proportion to the Boltzmann factor e^{−E/k_B T}, as the same microstate of the system can be paired with several microstates of the bath. These multiplicities fix the probability of the microstate. We may generalize this method and apply it to other types of ensembles of “configurations” that are not physical microstates. The set of integer partitions provides a concrete example of an ensemble whose elements can be enumerated exactly. A partition of integer M into N parts is a list of N positive integers whose sum is M. This list can be viewed as M monomers combined into N clusters (polymers). The number of monomers, dimers, etc., present in the partition represents a possible distribution of clusters. The set of partitions forms an ensemble defined by the macroscopic variables M and N, which, like energy, volume, and the number of particles in statistical mechanics, act as constraints on the microstates that can be accessed by the macroscopic system. Integer partitions appear naturally in population balance problems. In discrete fragmentation [2,3,4], fragments are partitions of the mass that generates them. In closed discrete finite populations, the rearrangement of mass via aggregation and fragmentation events represents a random walk on a space of partitions [5,6,7,8,9,10,11]. If we adopt the view of partitions as a finite sample from a population of elements distributed by some extensive property (“mass”), it should come as no surprise that partitions appear in discrete stochastic processes in general, one example of which is a random walk on a discrete lattice [12].
The connection between partitions and thermodynamics may be summarized as follows. The set of partitions together with a probability measure form an ensemble. The distribution of elements in a partition is a random function whose probability is determined by the probability of the partition. In the thermodynamic limit, the most-probable distribution is overwhelmingly more probable than all others, and this behavior gives rise to thermodynamics. This connection has attracted the interest of the mathematical literature, which has focused on the definition of appropriate probability measures, the limit form of the distribution in the thermodynamic limit, and the mathematical conditions that ensure the existence of such a limit [13,14,15,16,17,18]. We take a different approach. We view the ensemble of partitions as a container of distributions—asymptotically of all distributions whose support is on the positive real axis—and the probability measure as a tool that can selectively extract any such distribution to deliver it as the most-probable distribution in the thermodynamic limit. In this sense, the space of partitions brings together two seemingly unrelated areas of mathematical physics: stochastic processes and statistical thermodynamics. The point of contact between them is the probability distribution. To formalize this connection, we construct discrete finite ensembles, establish the equivalence between the microcanonical and the canonical ensemble, and study their combinatorial properties in the discrete finite domain.
The paper is organized as follows. In Section 2, we introduce the microcanonical table as a structured arrangement of partitions and define the probability measure. In Section 3, we define and study the discrete canonical ensemble and its partition function and establish equivalence with the microcanonical ensemble. In Section 4, we construct a random walk that samples the microcanonical table with the proper microcanonical probability and demonstrate it with Monte Carlo simulations. In Section 5, we pass to the asymptotic limit and make full contact with thermodynamics. In Section 6, we show how to construct a selection functional that delivers any desired distribution as the equilibrium distribution. We discuss the results in Section 7 and, finally, summarize the conclusions in Section 8.

2. Microcanonical Ensemble of Partitions

An integer partition is a list of N integers whose sum is M. We consider ordered partitions (compositions, in the language of number theory) and represent them by the vector c = (c_1, c_2, …, c_N) with c_i > 0 and Σ_{i=1}^{N} c_i = M. We construct the microcanonical ensemble E_{M,N} as the set of all ordered partitions of integer M into N non-zero parts with M and N fixed. In our nomenclature, an ordered partition is a configuration, its elements are clusters, and their numerical values are the masses of the clusters. The number of clusters in the configuration is N, and their total mass is M. “Cluster” and “mass” are terms of the nomenclature; they are not meant to assign physical properties to partitions, but rather serve as an analogy. Mass, in particular, stands for any additive property, for example mass, energy, volume, etc. The number of configurations in E_{M,N} is [19]
\Omega_{M,N} = \binom{M-1}{N-1}. \qquad (1)
A configuration is characterized by its distribution n = (n_1, n_2, …), where n_k is the number of times integer k appears in the configuration (the number of clusters with mass k). All distributions in E_{M,N} satisfy the conditions:
\sum_i n_i = N, \qquad \sum_i i\, n_i = M, \qquad (2)
which fix the zeroth- and first-order moments. The maximum possible cluster mass in the ensemble is M − N + 1, but we let the upper limit of the summations go to infinity with the understanding that n_i = 0 for all i > M − N + 1. The conditions in Equation (2) are necessary and sufficient: all distributions of configurations in E_{M,N} satisfy these conditions, and conversely, all distributions that satisfy them represent configurations in E_{M,N}.
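For small M and N, these statements can be checked by direct enumeration. The following sketch (our own illustration, not part of the paper; the helper name compositions is arbitrary) lists all ordered partitions of M into N positive parts and confirms the count of Equation (1) for the example M = 7, N = 5 used throughout.

```python
# Enumerate ordered partitions (compositions) of M into N positive parts
# and check Omega_{M,N} = C(M-1, N-1), Eq. (1).
from itertools import combinations
from math import comb
from collections import Counter

def compositions(M, N):
    """Yield all ordered partitions of M into N positive integers."""
    # Place N-1 cut points in the M-1 gaps between M unit masses.
    for cuts in combinations(range(1, M), N - 1):
        bounds = (0,) + cuts + (M,)
        yield tuple(bounds[i + 1] - bounds[i] for i in range(N))

M, N = 7, 5
configs = list(compositions(M, N))
print(len(configs), comb(M - 1, N - 1))          # 15 15

# Group configurations by their distribution n = (n_1, n_2, ...)
groups = Counter(frozenset(Counter(c).items()) for c in configs)
print(groups)   # {1:4, 3:1} appears 5 times; {1:3, 2:2} appears 10 times
```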

2.1. Microcanonical Table

A visual representation of the microcanonical ensemble is given in Table 1 for M = 7, N = 5. We collect all ordered partitions of M into N parts to form a table (the microcanonical table). The table has N columns and \binom{M-1}{N-1} rows, the latter corresponding to the number of configurations in the ensemble. Every column contains the same list of numbers, k = 1 through M − N + 1, each of which may appear multiple times. We calculate this multiplicity as follows. The number of times element k appears in any one column is equal to the number of ways it may be combined with a list of N − 1 positive integers that contain mass M − k to produce a configuration with N clusters and mass M. The set of these lists is the microcanonical ensemble E_{M−k,N−1} and contains Ω_{M−k,N−1} ordered partitions. Therefore, the number of times integer k appears in each column of the microcanonical table is
\Omega_{M-k,N-1} = \binom{M-k-1}{N-2}, \qquad k = 1, \dots, M-N+1. \qquad (3)
The total number of elements in any column is Ω_{M,N}. Therefore, we have the identity:
\Omega_{M,N} = \sum_{k=1}^{M-N+1} \Omega_{M-k,N-1}. \qquad (4)
We also have
\frac{1}{\Omega_{M,N}} \sum_{k=1}^{M-N+1} k\, \Omega_{M-k,N-1} = \frac{M}{N}, \qquad (5)
which expresses the fact that the mean cluster in any column of the table is the same as in the entire table.
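Both identities are easy to confirm numerically; a minimal check (ours, with an arbitrary example) using Ω_{M,N} = C(M − 1, N − 1):

```python
# Numerical check of Equations (4) and (5).
from math import comb

def Omega(M, N):
    return comb(M - 1, N - 1)

M, N = 20, 6
ks = range(1, M - N + 2)                                              # k = 1, ..., M-N+1
print(sum(Omega(M - k, N - 1) for k in ks) == Omega(M, N))            # Eq. (4): True
print(sum(k * Omega(M - k, N - 1) for k in ks) / Omega(M, N), M / N)  # Eq. (5): 10/3, 10/3
```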
Returning to Table 1, the order of rows and columns is not important; permutations of their order produce the same list of configurations. The configurations in Table 1 are grouped by distribution, but in an otherwise arbitrary order. In this example, there are two distributions: n_A = (4, 0, 1), with four monomers and one trimer, represented by five configurations; and n_B = (3, 2, 0), with three monomers and two dimers, represented by ten configurations. In writing distributions in vector form, we understand that all omitted elements are zero.

2.2. Multiplicity

Configurations that are permutations of each other have the same distribution. The number of configurations with distribution n is equal to the multinomial factor of the distribution:
\mathbf{n}! = \frac{N!}{n_1!\, n_2! \cdots}. \qquad (6)
The multinomial factor represents the intrinsic multiplicity of the distribution, i.e., the number of configurations represented by the distribution. Suppose, however, that the elements of the configuration come in multiple internal variants, for example “color”, “shape”, “reactivity”, or a similar attribute. Variants increase the multiplicity of the distribution. If element i exists in w_i variants and distribution n contains n_i such elements, the multiplicity of the distribution increases by a factor w_i^{n_i}. When the variants of all elements are considered, the multiplicity of the distribution increases by the product of these factors:
W(\mathbf{n}) = \prod_i w_i^{\,n_i}. \qquad (7)
The total multiplicity is the product of W(n) with the intrinsic multiplicity n! and represents the statistical weight of the distribution in the ensemble:
\text{microcanonical weight of distribution} = \mathbf{n}!\, W(\mathbf{n}) = N! \prod_i \frac{w_i^{\,n_i}}{n_i!}. \qquad (8)
W(n) is a functional of n that biases the multiplicity of the distribution in the ensemble. With W(n) = 1, this multiplicity reduces to the intrinsic value n!, and the partition function reduces to Ω_{M,N} in Equation (1). By allowing W to vary between distributions, we effectively bias the weight of the distribution in the ensemble and gain a degree of freedom in manipulating the ensemble, which will prove useful.
It is possible to construct functionals that are more general than that in Equation (8), but the above form has several special properties that make it particularly useful. Among them is that W may be expressed explicitly in terms of the elements of the configuration. Given configuration c = (c_1, c_2, …) with distribution n, its multiplicity is
W(\mathbf{c}) = \prod_i w(c_i) = W(\mathbf{n}), \qquad (9)
where w(c_i) is the number of variants of size c_i.

2.3. Microcanonical Probability

The multiplicity of a distribution is the number of times the distribution is represented in the ensemble, counting all ordered permutations of all variants of its elements. We define the microcanonical probability of the distribution as the ratio of its multiplicity to the total multiplicity in the ensemble:
P_{\mu C}(\mathbf{n}) = \frac{\mathbf{n}!\, W(\mathbf{n})}{\Omega_{M,N}} = \frac{N!}{\Omega_{M,N}} \prod_i \frac{w_i^{\,n_i}}{n_i!}. \qquad (10)
The normalizing constant is the microcanonical partition function, equal to the sum of microcanonical weights in the ensemble:
\Omega_{M,N} = \sum_{\mathbf{n}} \mathbf{n}!\, W(\mathbf{n}). \qquad (11)
The probability of configuration c in the microcanonical table is
P_{\mu C}(\mathbf{c}) = \frac{W(\mathbf{c})}{\Omega_{M,N}} = \frac{1}{\Omega_{M,N}} \prod_i w(c_i), \qquad (12)
which follows from Equation (10) and the fact that all n! permutations with the same distribution are equally probable.
To continue to use the microcanonical table as a visualization of the ensemble, we imagine each configuration in the table to represent W(n) actual configurations; this is the number of ways to build the configuration from all available variants, and it depends on the distribution of elements in the configuration. Therefore, every row of the table is understood to represent W(n) rows, where n is the distribution of the elements in that row.
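The following sketch (our own illustration; the power-law weights w_k = k^α anticipate the example of Section 4) evaluates Equations (10)–(12) by brute force for M = 7, N = 5: the partition function of Equation (11) is obtained by summing W(c) over all configurations, since the table lists each of the n! permutations of a distribution once, and the probabilities of Equation (10) follow by grouping configurations by distribution.

```python
# Microcanonical weights and probabilities, Eqs. (8)-(12), for M = 7, N = 5.
from itertools import combinations
from collections import Counter
from math import prod

def compositions(M, N):
    for cuts in combinations(range(1, M), N - 1):
        b = (0,) + cuts + (M,)
        yield tuple(b[i + 1] - b[i] for i in range(N))

M, N, alpha = 7, 5, 1.0
w = lambda k: k ** alpha                     # cluster weights (number of variants)
W = lambda c: prod(w(ci) for ci in c)        # W(c) = prod_i w(c_i), Eq. (9)

configs = list(compositions(M, N))
Omega = sum(W(c) for c in configs)           # Eq. (11): sum_n n! W(n) = sum_c W(c)

probs = Counter()                            # Eq. (10), grouped by distribution
for c in configs:
    probs[frozenset(Counter(c).items())] += W(c) / Omega
print(dict(probs), sum(probs.values()))      # with alpha = 1: 3/11 and 8/11; total = 1
```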

2.4. Mean Distribution

One result that can be obtained very easily from the microcanonical table is the mean distribution of clusters, which we define as the frequency of element k in the ensemble. The mean frequency of clusters of size k in the ensemble is its frequency in any column of the microcanonical table. The number of times cluster size k appears in a column is equal to the number of ways it can be combined with a complementary configuration of N − 1 clusters and total mass M − k to form a configuration with N clusters and mass M. The set of such complements forms the microcanonical ensemble E_{M−k,N−1} and contains Ω_{M−k,N−1} elements. Since each cluster in the microcanonical table represents w_k variants, the mean cluster distribution is
\frac{\langle n_k \rangle}{N} = w_k\, \frac{\Omega_{M-k,N-1}}{\Omega_{M,N}} \qquad (13)
for all 1 ≤ k ≤ M − N + 1. A different derivation of the same result, in the context of the zero-range process, is given in [12].
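A brute-force check of Equation (13) (our own sketch; the values of M, N, and α are arbitrary) compares the ensemble average of n_k, computed by enumerating all configurations, with the closed form w_k Ω_{M−k,N−1}/Ω_{M,N}, where the Ω are the weighted partition functions of Equation (11):

```python
# Verify Eq. (13): <n_k>/N = w_k * Omega_{M-k,N-1} / Omega_{M,N}.
from itertools import combinations
from math import prod

def compositions(M, N):
    for cuts in combinations(range(1, M), N - 1):
        b = (0,) + cuts + (M,)
        yield tuple(b[i + 1] - b[i] for i in range(N))

alpha = 0.7
w = lambda k: k ** alpha
W = lambda c: prod(w(ci) for ci in c)
Omega = lambda M, N: sum(W(c) for c in compositions(M, N))

M, N = 9, 4
OmegaMN = Omega(M, N)
for k in range(1, M - N + 2):
    lhs = sum(c.count(k) * W(c) for c in compositions(M, N)) / (N * OmegaMN)
    rhs = w(k) * Omega(M - k, N - 1) / OmegaMN
    print(k, round(lhs, 9), round(rhs, 9))   # the two columns agree
```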

3. Canonical Ensemble

We define the canonical configuration of length N′ < N as the ordered subset of the first N′ elements of a microcanonical configuration. The set of all canonical configurations of size N′ forms the canonical ensemble C_{N′|M,N}. The notation emphasizes the fact that the canonical ensemble is defined in the context of an enclosing microcanonical ensemble; to distinguish between the two, we use primed variables for the canonical ensemble and unprimed variables for the enclosing microcanonical ensemble. The graphical construction of the canonical ensemble is illustrated in Table 2: form the microcanonical table of the enclosing ensemble and collect the first N′ columns. This produces a canonical table with N′ columns and Ω_{M,N} rows, each row representing a canonical configuration. In the illustration of Table 2, N′ = 2. The remaining columns of the microcanonical table form the complementary canonical ensemble C_{N−N′|M,N}. Since the ordering of columns in the microcanonical table is immaterial, we could pick any N′ columns in any order, but for simplicity, we will continue to work with the first N′ columns of the microcanonical table.

3.1. Canonical Probability

When a canonical configuration c′ with distribution n′ is cut from a microcanonical configuration c with distribution n, it leaves behind a complementary configuration c″ with distribution n″. The set of complements forms the complementary ensemble C_{N−N′|M,N}, and the two complements together form the enclosing microcanonical ensemble. The sum of two complementary distributions,
\mathbf{n} = \mathbf{n}' + \mathbf{n}'', \qquad (14)
is a member of the enclosing microcanonical ensemble E_{M,N}. In the language of thermodynamics, C_{N′|M,N} is the “system”, its complement is the “bath”, and the enclosing microcanonical ensemble is the universe. We obtain the probability of the canonical distribution by a combinatorial calculation. Distribution n′ and its complement n″ form a microcanonical distribution n with microcanonical probability
P(\mathbf{n}' + \mathbf{n}'') = \frac{(\mathbf{n}' + \mathbf{n}'')!\; W(\mathbf{n}')\, W(\mathbf{n}'')}{\Omega_{M,N}}. \qquad (15)
There are n′! n″! ways out of (n′ + n″)! to combine n′ with its complement; we obtain the canonical probability by summing the factor P(n′ + n″) n′! n″!/(n′ + n″)! over all complements n″. Using P(n′ + n″) = (n′ + n″)! W(n′ + n″)/Ω_{M,N} for the microcanonical probability of distribution n′ + n″, we obtain:
P(\mathbf{n}'|N'; M, N) = \sum_{\mathbf{n}''} P(\mathbf{n}' + \mathbf{n}'')\, \frac{\mathbf{n}'!\, \mathbf{n}''!}{(\mathbf{n}' + \mathbf{n}'')!} = \frac{\mathbf{n}'!\, W(\mathbf{n}')}{\Omega_{M,N}} \sum_{\mathbf{n}''} \mathbf{n}''!\, W(\mathbf{n}''). \qquad (16)
The summation on the far right is over the microcanonical multiplicities of all distributions with mass M − M′ and number of particles N − N′, and it is equal to the microcanonical partition function Ω_{M−M′,N−N′}. This leads to the following result for the canonical probability:
P(\mathbf{n}'|N'; M, N) = \mathbf{n}'!\, W(\mathbf{n}')\, \frac{\Omega_{M-M',N-N'}}{\Omega_{M,N}}. \qquad (17)
The canonical probability is proportional to the microcanonical weight of the distribution, but also depends on the partition functions of the complement and of the enclosing ensemble. To separate the effect of the enclosing ensemble, we write the canonical probability as
P(\mathbf{n}'|N'; M, N) = \mathbf{n}'!\, W(\mathbf{n}')\, \frac{\Omega_{M-M',N-N'}}{\Omega_{M,N-N'}}\; \frac{\Omega_{M,N-N'}}{\Omega_{M,N}}. \qquad (18)
Noting that Ω_{M,N−N′}/Ω_{M,N} is constant for all canonical distributions, we define the remaining portion of the right-hand side as the statistical weight of the canonical distribution:
\text{canonical weight of distribution} = \mathbf{n}'!\, W(\mathbf{n}')\, \frac{\Omega_{M-M',N-N'}}{\Omega_{M,N-N'}}. \qquad (19)
We define the canonical partition function as the sum of canonical weights:
Q_{N'|M,N} = \sum_{\mathbf{n}'} \mathbf{n}'!\, W(\mathbf{n}')\, \frac{\Omega_{M-M',N-N'}}{\Omega_{M,N-N'}}, \qquad (20)
with the summation over all canonical distributions. Applying the normalization condition on the canonical probability in Equation (18), we obtain
Q_{N'|M,N} = \frac{\Omega_{M,N}}{\Omega_{M,N-N'}}. \qquad (21)
This establishes the relationship between the canonical and the microcanonical partition functions.
Unlike the microcanonical ensemble, in which the multiplicity and the statistical weight are equal, in the canonical ensemble, the two are not equal, but proportional to each other. The canonical multiplicity is the number of times a canonical distribution appears in the enclosing microcanonical table and clearly depends not only on the distribution itself, but also on the size of the enclosing ensemble: the same canonical distribution has higher multiplicity if it is removed from a larger microcanonical ensemble. By basing the definition of the canonical partition function on its statistical weight, rather than its multiplicity, we obtain a quantity that, in the asymptotic limit, is independent of the size of the enclosing ensemble. We derive the asymptotic limit in Section 5.
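In the unbiased case W = 1, the relationship of Equation (21) can be confirmed numerically by summing the canonical weights of Equation (19) over all canonical distributions; the short check below (ours, with arbitrary M, N, N′) groups canonical configurations by their total mass M′.

```python
# Check Eq. (21): Q = Omega_{M,N} / Omega_{M,N-N'} for W = 1.
from math import comb

def Omega(M, N):
    return comb(M - 1, N - 1)

M, N, Np = 12, 5, 2
# A canonical configuration of N' clusters carries mass M' = N', ..., M-(N-N');
# there are Omega(M', N') such configurations, each with canonical weight
# Omega(M-M', N-N') / Omega(M, N-N'), Eq. (19).
Q = sum(Omega(Mp, Np) * Omega(M - Mp, N - Np)
        for Mp in range(Np, M - (N - Np) + 1)) / Omega(M, N - Np)
print(Q, Omega(M, N) / Omega(M, N - Np))     # both 6.0
```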

3.2. Mean Canonical Distribution

The mean distribution of the canonical ensemble is obtained trivially: it is the same as that of the microcanonical ensemble, because all columns of the microcanonical table contain the same distribution of clusters:
\left\langle \frac{n_k}{N'} \right\rangle_{C} = \left\langle \frac{n_k}{N} \right\rangle_{\mu C} = w_k\, \frac{\Omega_{M-k,N-1}}{\Omega_{M,N}}. \qquad (22)
The result is true for all 1 ≤ N′ ≤ N ≤ M.

4. A Random Walk in the Microcanonical Space: The Exchange Reaction

4.1. Binary Exchange Reaction and Its Graph

We have defined a microcanonical space of configurations with an associated space of distributions. To experiment numerically with this space, we need a method to sample its elements with the correct probability. In this section, we construct such a method in the form of a random walk that visits configurations according to their microcanonical probability in Equation (10). We will then use the method in Section 6 to study the asymptotic behavior of the ensemble and the close relationship between the selection functional and the most-probable distribution.
We formulate the sampling process as a binary exchange reaction, a process that emulates reactions between physical clusters. Starting with a configuration c, we select two clusters with masses i and j and exchange mass between them to create a new pair of clusters with masses i′ and j′ under the mass-conserving condition i + j = i′ + j′. The transfer produces a new configuration c′ with the same number of clusters and the same total mass, which we represent by the reaction:
(\mathbf{c}) \quad i + j \;\rightleftharpoons\; i' + j' \quad (\mathbf{c}'). \qquad (23)
The binary exchange reaction establishes a network of transitions and adds a layer of connectivity between the elements of the microcanonical table. These connections form a graph whose nodes are configurations and whose edges represent individual exchange reactions. The graph has the following properties:
1. It is bidirectional, because the reverse of the exchange reaction is also a binary exchange reaction.
2. It is connected: starting from any configuration, it is always possible to reach, through a series of exchange reactions, a configuration with one cluster of size M − N + 1 (the giant cluster) plus N − 1 monomers; the mass of the giant cluster can then be distributed to the other clusters to produce any other configuration of the ensemble. Therefore, any configuration can be reached from any other.
3. Every configuration is connected to (M − N)(N − 1) other configurations. The maximum number of units that can be transferred from a cluster with mass k is k − 1 (cluster masses cannot be zero). The total number of units available for exchange within a configuration is therefore M − N, and since each cluster may transfer mass to any of the other N − 1 clusters, the number of connections that depart from any configuration is (M − N)(N − 1).
Figure 1 shows the graph of binary exchange reactions in the microcanonical ensemble with M = 7 , N = 5 . The ensemble contains 15 configurations, each linked to 8 others via exchange reactions. The 15 configurations belong to two distributions, one with 4 monomers and 1 trimer, and one with 3 monomers and 2 dimers. A transition between two configurations corresponds to a transition between the two distributions; however, it is possible for the distribution to transition back to itself if the reaction produces a permutation of the initial configuration.

4.2. Random Walk on the Binary Exchange Reaction Graph

To sample the microcanonical table, we construct a random walk on the graph of binary exchange reactions, but in order to visit configurations with proper microcanonical probability, we must construct an appropriate transition probability. We begin by defining the equilibrium constant of the transition in Equation (23) as
K(\mathbf{c} \to \mathbf{c}') = \frac{W(\mathbf{c}')}{W(\mathbf{c})} = \frac{w_{i'}\, w_{j'}}{w_i\, w_j}, \qquad (24)
where w_{i′} and w_{j′} are the cluster weights of the products and w_i and w_j are those of the reactant clusters. According to Equation (9), the selection functional of the configuration is the product of the multiplicities of its elements, and since the product and reactant distributions differ only in the masses of the elements that participate in the reaction, the final result has the form of the familiar reaction equilibrium constant, with the activities of the chemical species replaced by the multiplicities of the elements of the configuration. We now set the transition probability for the reaction c → c′ by the Metropolis prescription:
P(\mathbf{c} \to \mathbf{c}') = \begin{cases} K(\mathbf{c} \to \mathbf{c}') & \text{if } K(\mathbf{c} \to \mathbf{c}') \le 1, \\ 1 & \text{otherwise}. \end{cases} \qquad (25)
The detailed balance condition:
P(\mathbf{c})\, P(\mathbf{c} \to \mathbf{c}') = P(\mathbf{c}')\, P(\mathbf{c}' \to \mathbf{c}) \qquad (26)
is satisfied by the microcanonical probability in Equation (12). It follows [20] that the microcanonical probability is the stationary probability of the Markov process described by the transition probabilities from Equation (25): a random walk that starts from any configuration visits in the long run every configuration according to its microcanonical probability.

4.3. Monte Carlo Sampling

The exchange reaction can be simulated easily by the Monte Carlo method. We begin with an arbitrary configuration of N ordered integers, pick two elements i and j at random, and replace them by two new numbers i′ and j′ such that i + j = i′ + j′. To implement this numerically, we draw an integer random number 1 ≤ r ≤ i + j − 1 and set i′ = r, j′ = i + j − r. If i′ = i, we reject the result and repeat with a new random number in order to avoid self transitions. However, this step is not necessary, because self transitions do not alter the stationary distribution: the probability of a self transition is 1/(z + 1), where z = (M − N)(N − 1) is the number of cross-transitions; this number is the same for all configurations and affects all distributions uniformly. This simplifies the simulation by allowing the transfer of any amount of mass between the two clusters.
With w_i = 1, the selection functional is W(n) = 1 for all n, and distributions are sampled in proportion to their multinomial factors. By choosing the selection functional appropriately, we can bias the probability towards any distribution of the ensemble. We demonstrate the effect of the selection functional for the case M = 7, N = 5. The microcanonical ensemble contains two distributions, n_A = (4, 0, 1) and n_B = (3, 2, 0), with n_A! = 5 and n_B! = 10 and with corresponding probabilities 1/3 and 2/3, respectively. We construct the selection functional using w_i = i^α. This leads to the following probabilities for the two distributions:
P(\mathbf{n}_A) = \frac{3^{\alpha}}{2^{2\alpha+1} + 3^{\alpha}}, \qquad P(\mathbf{n}_B) = \frac{2^{2\alpha+1}}{2^{2\alpha+1} + 3^{\alpha}}. \qquad (27)
Positive exponents favor distribution n_B: in the limit α → +∞, we obtain P(n_B) → 1; similarly, with α → −∞, we obtain P(n_A) → 1. We illustrate this behavior in Figure 2 using Monte Carlo sampling to track the number of times each configuration is visited. With α = 0, all configurations are visited with the same probability. With α > 0, the configurations with distribution n_B = (3, 2, 0) are visited more frequently than those with distribution n_A = (4, 0, 1). With α < 0, the bias shifts towards configurations with distribution n_A. In all cases, the configurations within the same distribution are equiprobable.
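A compact version of this simulation is sketched below (our own code, not the author's script; the step count, seed, and α = 2 are arbitrary choices). It implements the Metropolis rule of Equation (25) with the equilibrium constant of Equation (24) and compares the fraction of time spent in distribution n_A against Equation (27).

```python
# Exchange-reaction Monte Carlo for M = 7, N = 5 with cluster weights w_k = k**alpha.
import random
from collections import Counter

def run(alpha, steps=200_000, seed=1):
    random.seed(seed)
    w = lambda k: k ** alpha
    c = [3, 1, 1, 1, 1]                      # any starting configuration with M = 7, N = 5
    visits = Counter()
    for _ in range(steps):
        i, j = random.sample(range(len(c)), 2)
        s = c[i] + c[j]
        r = random.randint(1, s - 1)                      # proposed masses r and s - r
        K = (w(r) * w(s - r)) / (w(c[i]) * w(c[j]))       # Eq. (24)
        if K >= 1 or random.random() < K:                 # Metropolis, Eq. (25)
            c[i], c[j] = r, s - r
        visits[frozenset(Counter(c).items())] += 1
    return visits

alpha = 2.0
v = run(alpha)
nA = frozenset({1: 4, 3: 1}.items())                      # 4 monomers + 1 trimer
pA_theory = 3**alpha / (2**(2 * alpha + 1) + 3**alpha)    # Eq. (27)
print(v[nA] / sum(v.values()), pA_theory)                 # ~0.22 vs 9/41
```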

5. Asymptotic Limit

5.1. Microcanonical Thermodynamics

In the asymptotic limit M, N → ∞ at fixed M/N, the discrete ensemble of distributions becomes quasi-continuous in n, and Ω_{M,N} may be treated as a continuous function of M and N. We define the parameters β and q as
\beta = \log \frac{\Omega_{M+1,N}}{\Omega_{M,N}} = \left( \frac{\partial \log \Omega_{M,N}}{\partial M} \right)_N, \qquad \log q = \log \frac{\Omega_{M,N+1}}{\Omega_{M,N}} = \left( \frac{\partial \log \Omega_{M,N}}{\partial N} \right)_M. \qquad (28)
By Taylor’s expansion, we have
\log \Omega_{M-k,N-l} = \log \Omega_{M,N} - \beta k - l \log q, \qquad (29)
which is valid for k ≪ M, l ≪ N. Applying this result with l = 1 to Equation (13), we obtain the mean distribution in the form:
\frac{\langle n_k \rangle}{N} = \frac{w_k\, e^{-\beta k}}{q}. \qquad (30)
The factors β and log q are obtained from the conditions:
\sum_k \frac{w_k\, e^{-\beta k}}{q} = 1; \qquad \sum_k k\, \frac{w_k\, e^{-\beta k}}{q} = \frac{M}{N}, \qquad (31)
which express the fact that ⟨n_k⟩/N is normalized to unity and that its mean is M/N.
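In practice, β and q can be computed from Equation (31) by one-dimensional root finding. The sketch below (our own numerics; the truncation k_max and the bisection bracket are assumptions that must enclose the root) solves for β at fixed mean M/N and, for the unbiased case w_k = 1, recovers the values β = log(x̄/(x̄ − 1)) and q = x̄ − 1 implied by Equation (40).

```python
# Solve the two conditions of Eq. (31) for beta and q, given weights w_k and M/N.
import math

def solve_beta_q(w, xbar, kmax=2000, lo=1e-3, hi=5.0, tol=1e-10):
    """Return (beta, q) satisfying Eq. (31); assumes the root lies in (lo, hi)."""
    def excess_mean(beta):
        Z = sum(w(k) * math.exp(-beta * k) for k in range(1, kmax + 1))
        m = sum(k * w(k) * math.exp(-beta * k) for k in range(1, kmax + 1)) / Z
        return m - xbar
    while hi - lo > tol:                 # bisection: the mean decreases with beta
        mid = 0.5 * (lo + hi)
        if excess_mean(mid) > 0:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    q = sum(w(k) * math.exp(-beta * k) for k in range(1, kmax + 1))   # normalization
    return beta, q

beta, q = solve_beta_q(lambda k: 1.0, xbar=4.0)
print(beta, math.log(4.0 / 3.0), q)      # beta ~ 0.2877 (= log 4/3), q ~ 3.0
```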
The mean distribution is uniquely determined by the weights w_k and the mean cluster size. This further implies that β and log q, both of which are derivatives of log Ω_{M,N}, are functions of the ratio M/N, i.e., they are homogeneous in M and N with degree 0. It follows that log Ω_{M,N} is homogeneous in M and N with degree one, and then, by Euler’s theorem for homogeneous functions, we obtain
\log \Omega_{M,N} = M \beta + N \log q. \qquad (32)
A consequence of homogeneity is that the mean distribution in the asymptotic limit is overwhelmingly more probable than any other distribution. To see why, we calculate the logarithm of the probability of the mean distribution by combining Equations (10) and (30):
\log P(\bar{\mathbf{n}}) = M \beta + N \log q - \log \Omega_{M,N} = 0, \qquad (33)
whose right-hand side is zero by virtue of Equation (32). Effectively, the selection functional picks out a single distribution from the microcanonical table and renders all others invisible. The inequality log P(n) < log P(n̄), which merely states that all other distributions of the ensemble are less probable than the mean distribution, can be expressed in the equivalent form:
S(\mathbf{n}) + \log W(\mathbf{n}) \le \log \Omega_{M,N}, \qquad (34)
where S(n) is the extensive Shannon functional of distribution n:
S(\mathbf{n}) = -N \sum_i \frac{n_i}{N} \log \frac{n_i}{N}. \qquad (35)
The inequality in Equation (34) is a statement of the second law: the functional on the left-hand side is maximized by the mean distribution in Equation (30). In the special case W(n) = 1, Equation (34) states that the Shannon entropy of the mean distribution is the highest among all distributions in the ensemble.

5.2. Canonical Thermodynamics

We return to the canonical probability in Equation (17). Using Equation (32) to write log Ω_{M−M′,N−N′} = log Ω_{M,N} − βM′ − N′ log q, the canonical probability in the asymptotic limit becomes
P(\mathbf{n}') = \mathbf{n}'!\, W(\mathbf{n}')\, \frac{e^{-\beta M'}}{q^{N'}}. \qquad (36)
The canonical probability in the asymptotic limit is proportional to the microcanonical weight and the Boltzmann factor e^{−βM′}, where M′ is the total mass in the canonical configuration. We obtain the canonical partition function by returning to Equation (21) in combination with Equation (32):
Q_{N',\beta} = q^{N'}. \qquad (37)
Here, the notation Q_{N′,β} indicates that the canonical partition function does not depend on M and N individually, but only on the intensive variable β, itself a function of the intensive ratio M/N. We now recognize the parameter q, which was defined through the derivative of log Ω_{M,N}, as the canonical partition function of a single column of the microcanonical table.

6. Construction of the Selection Functional

In the asymptotic limit, the microcanonical table contains every normalized distribution f_k with mean k̄ = M/N. We now construct a selection functional that picks any such distribution f_k from this space. We begin by setting
w_k = a\, e^{b k} f_k, \qquad (38)
where a > 0 and b are arbitrary constants. It is a simple matter to confirm that this form satisfies Equation (30) with ⟨n_k⟩/N = f_k, q = a, and β = b. Since we are free to select a and b, we choose a = 1, b = 0, which gives w_k = f_k. Thus, we have a straightforward way to construct equilibrium constants for the exchange reaction so as to target any distribution as the equilibrium distribution. The only requirement is that the distribution has a finite mean.
We demonstrate the use of Equation (38) with two examples. In the first example, we consider the triangular distribution:
f_k = 0.0001 \times \begin{cases} 0 & \text{if } k < 100 \text{ or } k > 300, \\ k - 100 & \text{if } 100 \le k \le 200, \\ 300 - k & \text{if } 200 < k \le 300, \end{cases} \qquad (39)
with support in 100 ≤ k ≤ 300 and mean k̄ = 200. To simulate this distribution by the exchange reaction method, we use a list of N = 1000 clusters with total mass M = 200N, so that the mean cluster in the list is 200, as in the target distribution. We set w_k = f_k if f_k > 0 and w_k = 10^{−10} if f_k = 0 (the cluster weight appears in the denominator of the equilibrium constant and cannot be zero). The simulation begins with the mass of all clusters set to k̄ = 200. Figure 3a shows the results of the simulation after 4 × 10^5 steps and demonstrates very good agreement with the distribution for which the selection functional was constructed.
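A condensed version of this simulation is sketched below (our own code; N = 1000, M = 200N, the 10^{−10} floor, and the 4 × 10^5 steps follow the description above, while the seed and all other details are our assumptions). It sets w_k = f_k and runs the exchange reaction of Section 4 starting from a uniform configuration; the resulting histogram of cluster masses approximates the triangular target.

```python
# Target the triangular distribution of Eq. (39) with the exchange reaction.
import random
from collections import Counter

def f_triangular(k):
    if k < 100 or k > 300:
        return 0.0
    return 1e-4 * ((k - 100) if k <= 200 else (300 - k))

w = lambda k: max(f_triangular(k), 1e-10)     # cluster weight must not be zero

random.seed(0)
N, kbar = 1000, 200
c = [kbar] * N                                # start with all clusters at the mean mass
for _ in range(400_000):
    i, j = random.sample(range(N), 2)
    s = c[i] + c[j]
    r = random.randint(1, s - 1)
    K = (w(r) * w(s - r)) / (w(c[i]) * w(c[j]))           # Eq. (24)
    if K >= 1 or random.random() < K:                     # Metropolis, Eq. (25)
        c[i], c[j] = r, s - r

hist = Counter(c)
print(sum(k * n for k, n in hist.items()) / N)            # mean stays exactly at 200
print(hist[200] / N, f_triangular(200))                   # sampled frequency ~ target 0.01
```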
For the second example, we construct a bimodal distribution formed as a mixture of two Gaussian distributions in equal proportions, one centered at k_1 = 180 with variance σ_1² = 100 and the other centered at k_2 = 220 with variance σ_2² = 1000. The mean of the bimodal distribution is k̄ = 200. The simulation is again conducted with N = 1000 particles with total mass M = 200N, starting with all particles at mass k̄ = 200. Figure 3b shows that, in this case as well, the simulated distribution converges to the distribution for which the w_k values were designed.

7. Discussion

The set of integer partitions illustrates the structure of thermodynamic ensembles and the emergence of the thermodynamic limit. The table of ordered partitions represents the microcanonical ensemble; any number of columns extracted from this table forms a canonical ensemble. The equivalence between these ensembles is established by the elementary property that all columns of the table of ordered partitions contain the same list of clusters. The classical proof requires the study of fluctuations in the thermodynamic limit. Here, we have established equivalence in the discrete finite domain.
We have assigned probabilities in proportion to the microcanonical functional n!W(n). The standard mathematical treatment [14] does not distinguish between n! and W(n) individually. This is not a trivial mathematical detail. The multinomial coefficient arises because we treat permutations in the order of the parts of a partition as distinct from each other. It is only when we take the order of the parts into consideration that the microcanonical table presents a structure with well-defined combinatorial properties. The reference ensemble is defined by the condition W = 1. This renders all ordered partitions equally probable; the multiplicity of the distribution is given by the multinomial coefficient, and the most-probable distribution is exponential. This is Boltzmann’s derivation of combinatorial entropy [21] (p. 55). Thus, we make contact with a fundamental result of statistical mechanics. The uniform selection functional is the mathematical statement of the postulate of equal a priori probabilities. The most-probable distribution in this case is the distribution with the maximum multinomial coefficient, and since its logarithm is the entropy functional, we conclude that the most-probable distribution in the microcanonical table under a uniform prior is the maximum entropy distribution. The selection functional biases the probability of configurations relative to that in the reference ensemble and can be designed to deliver any distribution present in the ensemble as the most-probable distribution in the thermodynamic limit. This was shown previously within an abstract space of distributions [22]; here, we have obtained the same result in the space of distributions contained in the microcanonical table. The exchange reaction serves to highlight the connection between integer partitions and thermodynamics in a physical way. We imagine the integer partition to represent a collection of clusters and the exchange reaction to represent a reversible reaction between them. The selection functional is the chemical “activity” of the clusters and determines the equilibrium constant of the reaction. Importantly, only the selection functional appears in the equilibrium constant. This is yet another reason for treating the multinomial coefficient and the selection functional as two separate functionals: the multinomial factor accounts for the random selection of the clusters chosen to react and represents what in thermodynamics we call an “ideal” system; the selection functional expresses deviations from ideality by imposing a bias relative to a purely random system. This is most clearly expressed in Equation (30), in which the factor w_k can be viewed as a correction to the exponential distribution e^{−βk}/q due to nonidealities.
In a further departure from the standard mathematical literature, we have provided an exact treatment of discrete finite ensembles of partitions, including ensembles that are too small to be treated as continuous. The microcanonical probability is given in Equation (10) and the canonical probability in Equation (17). The mean distribution is given by Equation (22), while the most-probable distribution is defined by the condition max_n n!W(n) and satisfies the second law in Equation (34) as a strict inequality. All of these results apply to ensembles of any size. The thermodynamic limit requires n_i to be large enough (equivalently, M, N ≫ 1) that it may be treated as a continuous variable. To demonstrate the transition from the discrete/finite to the discrete/infinite and, finally, to the continuous/infinite domain, we consider the case W(n) = 1, for which the partition function is given in Equation (1). Applying Equation (13) with w_i = 1, we obtain the mean distribution in the form:
\frac{\langle n_k \rangle}{N} = \frac{\binom{M-k-1}{N-2}}{\binom{M-1}{N-1}} \;\to\; \frac{1}{\bar{x}} \left( \frac{\bar{x}-1}{\bar{x}} \right)^{k-1} \;\to\; \frac{e^{-k/\bar{x}}}{\bar{x}}, \qquad (40)
with x̄ = M/N. The first result applies to all M, N and 1 ≤ k ≤ M − N + 1; the second applies when M > N ≫ 1; the last is valid for M ≫ N ≫ 1. Under the last condition, the cluster mass may be treated as a continuous variable, and thus, we recover the exponential distribution as the most-probable distribution in the unbiased ensemble.
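The three forms can be compared directly; a small check (ours, with M/N = 100) shows that they approach one another as M and N grow:

```python
# Compare the exact, geometric, and exponential forms of Eq. (40).
from math import comb, exp

M, N = 100_000, 1_000
xbar = M / N
for k in (1, 50, 100, 300):
    exact = comb(M - k - 1, N - 2) / comb(M - 1, N - 1)
    geom  = (1 / xbar) * ((xbar - 1) / xbar) ** (k - 1)
    expo  = exp(-k / xbar) / xbar
    print(k, exact, geom, expo)      # the three columns nearly coincide
```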

8. Conclusions

In summary, we have formulated the combinatorial properties of the ensemble of integer partitions. By considering ordered partitions, we obtain an organized tabulation such that the table itself is a microcanonical ensemble and groups of its columns are canonical ensembles. The ensemble is a container of distributions and contains a discrete finite sample of all distributions with finite mean and support on the positive real axis. The natural multiplicity of a distribution in the ensemble is its multinomial coefficient, and its logarithm is the entropy of the distribution. By further biasing the probability of the distribution via the selection functional, we can select any distribution from this ensemble. In the asymptotic limit, the space of distributions obeys thermodynamics: it gives rise to a distribution that is overwhelmingly more probable than all others and whose parameters satisfy the familiar thermodynamic relationships. We may view the ensemble of partitions as a generic template for stochastic processes. The central quantity in any stochastic process is the probability distribution of a stochastic variable, which may be thought of as arising from the ensemble of partitions under a suitable selection functional. The determination of this functional is the challenge in this approach, but if the functional can be identified, we obtain a rigorous thermodynamic treatment of the process. A few examples have been given in the literature of population balances [5,10,11,23]. We suggest that this approach can be extended beyond population balances to stochastic processes in general.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The author declares no conflict of interest.

References

1. Gibbs, J.W. Elementary Principles in Statistical Mechanics; Ox Bow Press: Woodbridge, CT, USA, 1981.
2. Berestycki, J. Exchangeable Fragmentation-Coalescence Processes and their Equilibrium Measures. Electron. J. Probab. 2004, 9, 770–824.
3. Berestycki, N.; Pitman, J. Gibbs Distributions for Random Partitions Generated by a Fragmentation Process. J. Stat. Phys. 2007, 127, 381–418.
4. Matsoukas, T. Statistical Mechanics of Discrete Multicomponent Fragmentation. Condens. Matter 2020, 5, 64.
5. Durrett, R.; Granovsky, B.L.; Gueron, S. The Equilibrium Behavior of Reversible Coagulation-Fragmentation Processes. J. Theor. Probab. 1999, 12, 447–474.
6. Freiman, G.A.; Granovsky, B.L. Asymptotic formula for a partition function of reversible coagulation-fragmentation processes. Isr. J. Math. 2002, 130, 259–279.
7. Granovsky, B.L. Asymptotics of counts of small components in random structures and models of coagulation-fragmentation. arXiv 2005, arXiv:math/0511381.
8. Granovsky, B.L.; Kryvoshaev, A.V. Coagulation Processes with Gibbsian Time Evolution. J. Appl. Probab. 2012, 49, 612–626.
9. Matsoukas, T. Statistical thermodynamics of clustered populations. Phys. Rev. E 2014, 90, 022113.
10. Matsoukas, T. Statistical Thermodynamics of Irreversible Aggregation: The Sol-Gel Transition. Sci. Rep. 2015, 5, 8855.
11. Matsoukas, T. The Smoluchowski Ensemble—Statistical Mechanics of Aggregation. Entropy 2020, 22, 1181.
12. Evans, M.R.; Hanney, T. Nonequilibrium statistical mechanics of the zero-range process and related models. J. Phys. A Math. Gen. 2005, 38, R195–R240.
13. Erdös, P.; Lehner, J. The distribution of the number of summands in the partitions of a positive integer. Duke Math. J. 1941, 8, 335–345.
14. Vershik, A.M. Statistical mechanics of combinatorial partitions, and their limit shapes. Funct. Anal. Its Appl. 1996, 30, 90–105.
15. Fatkullin, I.; Slastikov, V. Limit Shapes for Gibbs Ensembles of Partitions. J. Stat. Phys. 2018, 172, 1545–1563.
16. Erlihson, M.M.; Granovsky, B.L. Limit shapes of Gibbs distributions on the set of integer partitions: The expansive case. Ann. L’Institut Henri Poincaré Probab. Stat. 2008, 44, 915–945.
17. Adams, S.; Dickson, M. Large deviations analysis for random combinatorial partitions with counter terms. J. Phys. A Math. Theor. 2022, 55, 255001.
18. Bridges, W.; Bringmann, K. Statistics for unimodal sequences. Adv. Math. 2022, 401, 108288.
19. Bóna, M. A Walk Through Combinatorics—An Introduction to Enumeration and Graph Theory, 2nd ed.; World Scientific Publishing Co.: Singapore, 2006.
20. Kelly, F.P. Reversibility and Stochastic Networks; Cambridge University Press: Cambridge, UK, 2011.
21. Boltzmann, L. Lectures on Gas Theory; Dover: New York, NY, USA, 1995.
22. Matsoukas, T. Thermodynamics Beyond Molecules: Statistical Thermodynamics of Probability Distributions. Entropy 2019, 21, 890.
23. Matsoukas, T. Stochastic Theory of Discrete Binary Fragmentation—Kinetics and Thermodynamics. Entropy 2022, 24, 229.
Figure 1. (Left) Configurations undergoing an exchange reaction represent transitions that visit the space of configurations uniformly. A random walk on this space visits each configuration the same number of times. (Right) The corresponding transitions between distributions visit each distribution n in proportion to its multinomial factor n!.
Figure 2. Exchange reactions in the ensemble M = 7, N = 5 visit the 15 configurations of the ensemble in proportion to the cluster weight w_k = k^α. Configurations 1–5 represent distribution n_A = (4, 0, 1); configurations 6–15 represent distribution n_B = (3, 2, 0) (configurations are numbered in the order they appear in Table 1). (a) α = 0: all configurations are visited with equal probability; (b) α = −4: configurations of distribution n_A are visited more frequently; (c) α = 4: configurations of distribution n_B are visited more frequently. Lines show the theoretical probability calculated as P(n)/n!, where n is the distribution of the clusters in the configuration.
Figure 3. (a) Triangular distribution; (b) bimodal distribution of two Gaussian distributions. Symbols are MC simulations of the exchange reaction with w_k = f_k, where f_k is the triangular or the bimodal distribution. In both cases, the simulation agrees very well with the corresponding distribution.
Table 1. Microcanonical ensemble with M = 7, N = 5. (a) Microcanonical table of configurations (m_i is the ith element of the configuration); (b) list of distributions in the ensemble (n_i is the number of i-mers in the configuration). The table of distributions is a more concise representation of the ensemble of configurations, with each distribution representing n!W(n) distinct configurations.
(a) Configurations
m_1  m_2  m_3  m_4  m_5
3  1  1  1  1
1  3  1  1  1
1  1  3  1  1
1  1  1  3  1
1  1  1  1  3
2  2  1  1  1
2  1  2  1  1
2  1  1  2  1
2  1  1  1  2
1  2  2  1  1
1  2  1  2  1
1  2  1  1  2
1  1  2  2  1
1  1  2  1  2
1  1  1  2  2
(b) Distributions
n_1  n_2  n_3  n_4  n!
4  0  1  0  5
3  2  0  0  10
Table 2. Canonical partitioning of the microcanonical ensemble E_{M,N} with M = 7, N = 5. The first two columns (shaded in the original table; here, to the left of the divider) form the canonical ensemble C_{N′|M,N} with N′ = 2. The remaining columns constitute the complementary ensemble C_{N−N′|M,N}.
Canonical Configurations
m′_1  m′_2  |  m″_1  m″_2  m″_3
1  1  |  3  1  1
1  1  |  1  3  1
1  1  |  1  1  3
1  1  |  2  2  1
1  1  |  2  1  2
1  1  |  1  2  2
2  1  |  2  1  1
2  1  |  1  2  1
2  1  |  1  1  2
1  2  |  2  1  1
1  2  |  1  2  1
1  2  |  1  1  2
2  2  |  1  1  1
3  1  |  1  1  1
1  3  |  1  1  1