The Nature of the Chemical Process. 1. Symmetry Evolution – Revised Information Theory, Similarity Principle and Ugly Symmetry
Received: 16 December 2000 / Accepted: 15 March 2001 / Published: 25 March 2001
Symmetry is a measure of indistinguishability. Similarity is a continuous measure of imperfect symmetry. Lewis' remark that “gain of entropy means loss of information” defines the relationship of entropy and information. Three laws of information theory have been proposed. Labeling by introducing nonsymmetry and formatting by introducing symmetry are defined. The function L of the universe (L = ln w, where w is the number of microstates; equivalently, the sum of entropy and information, L = S + I) is a constant (the first law of information theory). The entropy S of the universe tends toward a maximum (the second law of information theory). For a perfectly symmetric static structure, the information is zero and the static entropy is at its maximum (the third law of information theory). Based on the Gibbs inequality and the second law of the revised information theory, we have proved the similarity principle (a continuous higher similarity-higher entropy relation, after rejection of the Gibbs paradox) and proved the Curie-Rosen symmetry principle (a higher symmetry-higher stability relation) as a special case of the similarity principle. The principles of information minimization and potential energy minimization are compared. Entropy is the degree of symmetry and information is the degree of nonsymmetry. There are two kinds of symmetry, dynamic and static; any kind of symmetry defines an entropy, and, corresponding to the dynamic and static symmetries, there are dynamic entropy and static entropy. Entropy in thermodynamics is a special kind of dynamic entropy. Any spontaneous process evolves towards the highest possible symmetry, dynamic, static, or both. Therefore the revised information theory can be applied to characterize all kinds of structural stability and process spontaneity. Some examples in chemical physics are given.
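As a minimal numerical sketch of the first law L = S + I stated above (assuming the standard Gibbs-Shannon form S = -Σ p ln p for the dynamic entropy, which the abstract itself does not spell out), one can check that for a fixed number of microstates w the quantity L = ln w is constant, with entropy maximal and information zero for the fully indistinguishable (uniform) case:

```python
import math

def entropy_and_information(probs):
    """For a distribution over w microstates, compute the entropy
    S = -sum p_i ln p_i and the information I = L - S, where
    L = ln w is fixed by the number of microstates w."""
    w = len(probs)
    L = math.log(w)                       # L = ln w, constant for fixed w
    S = -sum(p * math.log(p) for p in probs if p > 0)
    I = L - S                             # first law: L = S + I
    return L, S, I

# Uniform distribution: maximum entropy, zero information.
L, S, I = entropy_and_information([0.25] * 4)
print(L, S, I)   # S = ln 4, I = 0

# A "labeled" (less symmetric) distribution: S < ln 4, so I > 0.
L2, S2, I2 = entropy_and_information([0.7, 0.1, 0.1, 0.1])
print(S2 < L2, I2 > 0)
```

Here the uniform case plays the role of the perfectly symmetric structure: any labeling that breaks the indistinguishability of the microstates lowers S and raises I, while their sum L stays fixed.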
Spontaneous processes of all kinds, including molecular interaction, phase separation and phase transition, symmetry breaking, the densest molecular packing, and crystallization, are all driven by information minimization or symmetry maximization. The evolution of the universe in general, and the evolution of life in particular, can be quantitatively considered as a series of symmetry breaking processes. Two empirical rules, the similarity rule and the complementarity rule, have been given a theoretical foundation. All kinds of periodicity in space and time are symmetries and contribute to stability. Symmetry is beautiful because it renders stability; however, symmetry is in principle ugly because it is associated with information loss.