1. Introduction
The topic of this paper is the application of the concept of entropy in tribology. Tribology is defined as the science and technology of interacting surfaces in relative motion, or, in other words, the study of friction, wear and lubrication. The concept of entropy often seems difficult and confusing for non-physicists. The reason is that, unlike in the case of temperature and energy, there is no direct way of measuring entropy, so our everyday intuition does not work well when we have to deal with entropy. This is apparently the main reason why entropy, being the most important quantitative measure of disorder and dissipation, has not yet become the major tool for analyzing such dissipative processes as friction and wear.
The classical thermodynamic definition of entropy was suggested by R. Clausius in the 1860s. According to that definition, every time the amount of heat dQ is transferred to a system at temperature T, the entropy of the system grows by

dS = dQ/T (1)

Thus, when heat dQ is transferred from a body at temperature T1 to a body at temperature T2, the total entropy grows by dS = (1/T2 − 1/T1)dQ. This provides a convenient mathematical formulation of the Second Law of thermodynamics, which states that heat does not spontaneously flow from a colder body to a hotter body or, in other words, that the entropy does not decrease, dS ≥ 0. A more formal thermodynamic definition involves the energy of a system U(S, V, N) as a function of several parameters, including the entropy S, the volume V, and the number of particles N. Temperature is then defined as the partial derivative of the energy of the system with respect to entropy

T = ∂U/∂S (at constant V and N) (2)

whereas the change of energy is given by

dU = TdS − PdV + μdN (3)

where P is the pressure and μ is the chemical potential [1].
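The Clausius balance for heat exchanged between two bodies can be checked numerically. The following minimal sketch (with illustrative values of temperature and heat, not taken from the text) confirms that entropy is produced whenever heat flows from hot to cold:

```python
def entropy_change(dQ, T1, T2):
    """Net entropy change when heat dQ leaves a body at T1 and enters a body at T2:
    dS = dQ/T2 - dQ/T1 = (1/T2 - 1/T1) * dQ, from the Clausius definition dS = dQ/T."""
    return (1.0 / T2 - 1.0 / T1) * dQ

# Illustrative values: 100 J flows from a hot body (400 K) to a cold one (300 K).
dS = entropy_change(100.0, 400.0, 300.0)
print(dS)        # about 0.083 J/K
print(dS >= 0)   # True: heat flowing "downhill" in temperature produces entropy
```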
Note that temperature and entropy are so-called “conjugate variables,” in the same manner as pressure and volume, or the chemical potential and the number of particles. For a mechanician, the most common example of conjugate variables is the force F and the distance x, defined in such a manner that the change of energy (or simply the work of the force F) is given by dU = Fdx. Similarly to the force-coordinate pair F and x, the temperature, pressure, and chemical potential are generalized forces, while the entropy, volume, and number of particles are the corresponding generalized coordinates. Note that the classical thermodynamic definition of entropy (as outlined in Equations 1-3) requires the internal energy U to be defined prior to the definition of entropy.
Many textbooks also define entropy as a measure of the uniformity of the distribution of energy, or as the difference between the internal energy of a system and the energy available to do useful work, known as the Helmholtz free energy A = U – TS. Every time irreversible dissipation of energy occurs, the difference between the internal energy and the Helmholtz free energy grows, so that less energy remains available for useful work.
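The relation between dissipation and available work can be illustrated with a short numerical sketch (the numbers are illustrative, not from the text): at fixed U and T, any entropy production lowers the Helmholtz free energy A = U − TS by exactly TdS.

```python
def helmholtz_free_energy(U, T, S):
    """A = U - T*S: the part of the internal energy available to do useful work."""
    return U - T * S

# Illustrative system at T = 300 K with internal energy U = 1000 J.
U, T = 1000.0, 300.0
S_before, S_after = 1.0, 1.5   # entropy (J/K) before/after an irreversible step

A_before = helmholtz_free_energy(U, T, S_before)   # 700 J
A_after = helmholtz_free_energy(U, T, S_after)     # 550 J
# The available work dropped by T * dS = 300 * 0.5 = 150 J.
print(A_before - A_after)  # 150.0
```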
A different approach to entropy was suggested by L. Boltzmann in 1877, who defined it using the concept of microstates that correspond to a given macrostate. Microstates are arrangements of energy and matter in the system which are distinguishable at the atomic or molecular level but indistinguishable at the macroscopic level. If Ω microstates correspond to a given macrostate, then the entropy of the macrostate is given by

S = k ln Ω (4)

where k is Boltzmann’s constant. The microstates have equal probabilities, and the system tends to evolve towards a more probable macrostate, i.e., the macrostate that has a larger number of microstates [2]. The entropy definition given by Equation 4 is convenient for using entropy as a measure of disorder, since it deals with a finite (and, actually, integer) number of microstates. In the most ordered ideal state, i.e., at the absolute zero temperature, there is only one microstate and the entropy is zero. In the most disordered state of a particular system (e.g., the homogeneous mixture of two substances), the number of microstates reaches its maximum and thus the entropy is at its maximum. The definition of Equation 4 can also be easily generalized to the theory of information, since the discrete microstates can be seen as bits of information required to uniquely characterize the macrostate; thus the so-called Shannon entropy serves as a measure of uncertainty in information science.
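A minimal sketch of Equation 4 and its informational analogue, using Python's standard math module (the example state counts are illustrative):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(omega):
    """Equation 4: S = k * ln(Omega) for Omega equally probable microstates."""
    return k_B * math.log(omega)

def shannon_entropy_bits(omega):
    """The same formula with k dropped and log base 2: uncertainty in bits."""
    return math.log2(omega)

print(boltzmann_entropy(1))      # 0.0 -- one microstate, perfect order, zero entropy
print(shannon_entropy_bits(8))   # 3.0 -- three yes/no questions identify one of 8 states
```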
The concept of a microstate is, however, somewhat obscure. Note that the statistical definition of entropy given by Equation 4 implies a finite number of microstates and thus a discrete spectrum of entropy, whereas the thermodynamic definition of Equations 1-3 apparently implies a continuous spectrum of entropy and an infinite number of microstates. According to Boltzmann, the microstates should be grouped together to obtain a countable set. Two states of an atom are counted as the same state if their positions, x, and momenta, p, are within δx and δp of each other. Since the values of δx and δp can be chosen arbitrarily, the entropy is defined only up to an additive constant. However, an apparent contradiction between the continuous and discrete approaches remains, so the question is often asked whether the entropy given by Equation 1 is the same quantity as the entropy given by Equation 4. The question “what is a microstate?” is not completely clarified even if we take into consideration the fact that any measurement of any parameter is conducted with a finite accuracy and, therefore, a measuring device provides a discrete rather than continuous output; thermodynamic parameters should be independent of the resolution of our measurement devices. Some authors prefer to use the concept of “quantum states” instead of microstates [3]; however, a classical (non-quantum) description should use classical concepts.
Fortunately, Equation 4 works independently of what a microstate is. The only property of microstates that is of importance is their multiplicativity: for a system consisting of two non-interacting subsystems, the total number of microstates is equal to the product of the numbers of microstates of the subsystems. This makes entropy, defined as the logarithm of the number of microstates, an additive function, that is, the entropy of a system is equal to the sum of the entropies of its subsystems. In a sense, Equation 4 serves as a definition of the microstate: the number of microstates is simply the exponent of S/k. The arbitrary constant that can be added to S corresponds to the arbitrary choice of δx and δp during the grouping of the microstates. The additive constant is usually chosen in such a manner that S = 0 at T = 0. In other words, there is only one microstate of any system at the absolute zero temperature, when no thermal motion occurs.
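The multiplicativity-additivity argument can be verified directly. A small sketch (entropy measured in units of k; the subsystem microstate counts are chosen arbitrarily):

```python
import math

def entropy_in_units_of_k(omega):
    # S/k = ln(Omega), per Equation 4 with k factored out
    return math.log(omega)

# Two non-interacting subsystems with Omega1 and Omega2 microstates:
omega1, omega2 = 10, 20            # illustrative counts
omega_total = omega1 * omega2      # microstates of the combined system multiply...
S_total = entropy_in_units_of_k(omega_total)
S_sum = entropy_in_units_of_k(omega1) + entropy_in_units_of_k(omega2)
# ...so entropies add: ln(Omega1 * Omega2) = ln(Omega1) + ln(Omega2)
print(abs(S_total - S_sum) < 1e-12)  # True
```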
While for discrete systems microstates have a well-defined meaning, for continuum systems the number of microstates is just the exponent of the entropy. Therefore, another question can be asked: does the concept of entropy provide a connecting link between discrete and continuum systems, in other words, between energy and information? For example, the standard molar entropies (i.e., entropies per mole at room temperature) of diamond and water are equal to 2.38 J/(K mol) and 189 J/(K mol), respectively, or 0.3 and 22.7 (in units of k) per molecule. These values of entropy are calculated by integrating Equation 1 over temperature from absolute zero to room temperature. Does this literally mean that every atom of diamond has, on average, exp(0.3) = 1.35 microstates and every water molecule has exp(22.7) = 7.2 billion microstates? The opposite question can be asked as well. For example, the entropy of rolling a set of N dice is S = kN ln 6, since every die has six states. Does this mean that every time the rolling occurs at temperature T, the amount of energy kTN ln 6 is dissipated? Indeed, in order to stop a rolling die, frictional damping is needed and some dissipation always occurs; otherwise it would continue to roll forever due to inertia. In practice, for a macroscopic die, a much larger amount of energy is dissipated than kT ln 6. However, in the limit of a very small die, kT ln 6 is the lower limit of the energy that must be dissipated. A student of classical thermodynamics who starts to ask these questions quickly finds that an answer is found only in quantum physics, which states that most systems have a discrete spectrum of energy.
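The per-molecule arithmetic quoted above is easy to reproduce: dividing the molar entropy by the gas constant R = N_A k gives the entropy per molecule in units of k, and exponentiating gives the formal microstate count. A sketch:

```python
import math

R = 8.314  # gas constant, J/(K mol), equal to Avogadro's number times k

# Standard molar entropies quoted in the text, J/(K mol):
S_diamond, S_water = 2.38, 189.0

s_diamond = S_diamond / R   # ~0.29 per atom, in units of k
s_water = S_water / R       # ~22.7 per molecule, in units of k

# Formal "number of microstates" per particle, Omega = exp(S/k):
print(round(math.exp(s_diamond), 2))  # 1.33, close to the 1.35 quoted in the text
print(math.exp(s_water))              # ~7e9, the "billions of microstates" per water molecule
```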
Kinetic friction is an irreversible dissipative process, during which heat is generated. For example, when the friction force F is applied to a body that passes the distance dx, the energy dQ = Fdx is dissipated into the environment, and the entropy of the environment increases by the amount dS = Fdx/T. Furthermore, friction is a complex phenomenon which involves many diverse mechanisms, such as adhesion, elastic and plastic deformation, fracture, etc. However, in most cases it does not matter (and is often not even known) which particular mechanism of dissipation dominates in a particular situation, because the mechanism does not affect the macroscale properties of friction. The remarkable property of friction is its universality: it is very difficult to completely eliminate friction. Friction represents the general tendency for irreversible energy dissipation in accordance with the Second Law of thermodynamics. It is therefore reasonable to expect that the concept of entropy can capture some general properties of systems with friction, which are present irrespective of a particular mechanism of friction and thus define the phenomenon of friction.
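The entropy flow generated by sliding friction follows directly from dS = Fdx/T: dividing by dt gives a production rate Fv/T. A minimal sketch with illustrative values (not taken from the text):

```python
def entropy_production_rate(F, v, T):
    """dS/dt = F * v / T: frictional power F*v dissipated as heat at temperature T."""
    return F * v / T

# Illustrative sliding contact: 10 N friction force, 0.1 m/s, room temperature.
rate = entropy_production_rate(10.0, 0.1, 300.0)
print(rate)  # about 0.0033 W/K delivered to the environment as entropy
```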
Wear is the degradation of a surface as a result of friction. While friction is manifested in heat transfer and dissipation, wear is manifested in mass transfer. Wear is characterized by an irreversible change of the surface and thus by an increase of entropy, so it is natural to characterize wear using the concept of entropy as well. Despite that, the concept of entropy is rarely used in tribology. One of the reasons is the above-mentioned difficulty of the concept of entropy for students of engineering, who prefer to use the more intuitively tangible energies, temperatures, and forces in their calculations.
Historically, several attempts to use the concept of entropy in tribology were made in the Soviet Union starting in the 1970s. Several groups should be mentioned. First, B. Kostetsky and L. Bershadsky [4,5] at the Institute for Materials Science in Kiev investigated the formation of the so-called self-organized “secondary structures” during friction and the regime of “structural dissipative adjustment.” According to Bershadsky, friction and wear are two sides of the same phenomenon, and they represent the tendency of energy and matter to achieve the most disordered state. However, the synergy of various mechanisms can lead to the self-organization of the secondary structures, which are “nonstoichiometric and metastable phases,” whereas “the friction force is also a reaction on the informational (entropic) excitations, analogous to the elastic properties of a polymer, which are related mostly to the change of entropy and have the magnitude of the order of the elasticity of a gas” [5]. Bershadsky also formulated entropic variational principles governing friction and wear, along with a number of important ideas on the structural dissipative nature of friction.
Many of these ideas were influenced by the theory of self-organization developed by Ilya Prigogine (a Belgian physical chemist of Russian origin), the winner of the 1977 Nobel Prize in Chemistry. Prigogine [6] used the ideas of Onsager’s non-equilibrium thermodynamics to describe the processes of self-organization [7]. Later, Prigogine wrote several popular books about the scientific and philosophical importance of self-organization [8,9]. At the same time, at the end of the 1970s, the concept of “Synergetics” was suggested by H. Haken [10] as an interdisciplinary science investigating the formation and self-organization of patterns and structures in open systems far from thermodynamic equilibrium. The word “synergetics” was apparently coined by the architect and philosopher R. B. Fuller (the same person after whom the C60 molecule was later named “fullerene”). The books by Prigogine and Haken were published in the USSR in Russian translations and became very popular in the 1980s among critically-minded Soviet intellectuals, since the “synergetic” studies claimed to suggest a general methodology for investigating physical, biological, informational, and social phenomena, which, in a sense, was (or at least was perceived in this way by some scientists as) an alternative to the official Soviet Marxist methodology. On the other hand, many more traditionally-minded theoretical physicists opposed “synergetics,” since the synergetic studies often contained a lot of rhetoric and few practical, quantitative, and verifiable results [11], and the term “pseudo-synergetics” was even coined to refer to these speculative studies.
It is important to understand this historical context of the studies of self-organization during friction. According to Bershadsky [4], “tribosystems apparently constitute the most diverse objects capable for self-organization, which possess all major features of a synergetic system (strong non-linearity, parameter distribution and delay, auto-catalysis, natural regulator, feedback and target function, etc.), as well as a number of specific features, such as the energetic and materials heterogeneity, memory, learning capability, etc. This is why many unsolved tribological problems are in fact general problems of synergetics.”
The second group that should be mentioned is that of N. Bushe and his assistant I. Gershman at the Research Institute for the Railway Transport in Moscow. To a large extent, they continued the research of the Kiev group, in particular on the theory of “tribological compatibility” of materials, the synergetic action of the electric current during current collection, and wear-resistant composite metallic coatings for heavily loaded cutting tools. Their results, which extensively employ the concept of entropy, were summarized in the monograph edited by Fox-Rabinovich and Totten [12]. In 2002, N. Bushe won the most prestigious tribology award, the Tribology Gold Medal, for his studies of tribological compatibility and other related effects.
A different approach, which also employed the concept of entropy, was suggested by Prof. Col. D. Garkunov [13] of the Air Force Academy in Russia, who claimed the discovery of the synergetic “non-deterioration effect,” also called “selective transfer.” Together with A. Polyakov, he suggested a concept of dynamically formed protective tribofilms (which they called “servovite films” or “serfing-films”). These films are formed by a chemical reaction induced by friction, and they protect against wear, leading to a dynamic equilibrium between wear and the formation of the protective film. Garkunov’s largely experimental research also received international recognition when he was awarded the 2005 Tribology Gold Medal “for his achievements in tribology, especially in the fields of selective transfer.”
Among other groups engaged in entropic studies in tribology, the scientists of the Rostov and Tomsk universities, Prof. K. Krawczyk of Poland, and Dr. L. Sosnovsky [14] from Gomel, Byelorussia should be mentioned. The latter suggested the concept of “tribofatika” (the coupling of wear and fatigue).
In the English-speaking world, an important entropic study of the thermodynamics of wear was conducted by M. Bryant et al. [15], who introduced a degradation function and formulated the Degradation-Entropy Generation theorem in an approach intended to study friction and wear in combination. They note that friction and wear, which are often treated as unrelated processes, are in fact manifestations of the same dissipative physical processes occurring at sliding interfaces [16]. Their approach is based on the classical Clausius concept of entropy.
A completely different approach is related to the theory of dynamical systems. The results of friction tests can be viewed as data series characterized by certain statistical parameters, including the entropy of distributions. Since the 1980s, it has been suggested that a very specific type of self-organization, called self-organized criticality, plays a role in diverse “avalanche-like” processes, such as the stick-slip phenomenon during dry friction. The research of F. Zypman and J. Ferrante [17,18,19,20] and others deals with this topic. A different approach based on the theory of dynamical systems was suggested by E. Kagan [21], who used the so-called Turing systems (reaction-diffusion systems) to describe the formation of spatial and temporal patterns induced by friction.
Nosonovsky and co-authors [22,23,24] suggested using entropic methods to describe self-lubrication and surface-healing (self-healing surfaces). They noted that the orderliness at the interface can increase (and, therefore, the entropy can decrease) at the expense of entropy growth either in the bulk of the body or at the microscale. They also suggested that self-organized spatial patterns (such as interface slip waves) can be studied by the methods of the theory of self-organization.
In the opinion of the author, the entropic methods are especially promising for the analysis of friction-induced self-organization, in particular self-lubrication, surface-healing, and self-cleaning. According to preliminary studies, there are two key features in these processes. The first is the hierarchical (multiscale) organization of the material, which allows healing (as observed at the macroscale) to be implemented at the expense of microscale deterioration. The second is the presence of positive and/or negative feedbacks that lead to dynamic destabilization and friction-induced vibrations. Hierarchical organization is a key feature of biological systems, which have the capacity for self-organization, self-healing, and self-lubrication, so it is not surprising that hierarchy also plays a central role in artificial materials capable of self-organization. In most cases, self-organization is induced by bringing the system out of thermodynamic equilibrium, so that the restoring force is coupled with the degradation in such a manner that healing (decreasing degradation) occurs during the return to thermodynamic equilibrium.
Practical engineers and applied scientists often ask whether there are any practical benefits to using the concept of entropy in tribology. The present paper is intended to answer this question and to demonstrate practical applications, both potential and actual, of the concept of entropy, such as establishing structure-property relationships for new materials. In this review, I will emphasize entropic approaches to friction and wear which deal with structure-property relationships, rather than speculations about entropy in friction and wear.