2.1. Background
In this section the formalism and equations are derived for the entropy and its evolution under electromagnetic driving. Changes in entropy satisfy $dS = d_eS + d_iS$, where $d_eS$ is the entropy change from generalized heat added to the system, such as heat flowing in through the system boundaries, and $d_iS$ is the entropy change due to irreversible processes such as relaxation. For a closed system, $d_eS = \delta Q/T$ and $d_iS \geq 0$ [5,25,26,27,28]. As the system is dynamically driven by applied fields, the material relaxes, and local fields form in the material that differ from the applied fields. As a consequence, a new energy configuration is formed. The origin of relaxation is this process of transforming from applied fields acting on the material to local fields acting on the material.
The dynamical variables we use are a set of operators or, classically, a set of functions of phase, $\{F_n\}$. For normalization, the identity operator is included in the set. These operators are, for example, the internal-energy density $u$ and the electromagnetic polarizations $\mathbf{P}$ and $\mathbf{M}$. The operators are functions of position and phase variables, but are not explicitly time dependent. The time dependence enters when the trace is taken, through the driving fields $\mathbf{E}(t)$ and $\mathbf{H}(t)$ in the Hamiltonian. Associated with these operators is a set of thermodynamic fields that are not operators and do not depend on phase, such as the generalized temperature and the local electric and magnetic fields. In any complex system there are, in addition to the set of $F_n$, many other uncontrolled or unobserved variables that are categorized as irrelevant variables.
A brief overview of the approach for calculating the equations of motion for the relevant variables will now be presented; for details, please refer to [6,9,19]. Later, these results will be used to derive an entropy-fluctuation relation. There are two density operators. The first is the full statistical-density operator $\rho(t)$, which satisfies the Liouville equation
$$\frac{\partial\rho(t)}{\partial t} = -iL(t)\rho(t). \qquad (1)$$
Here $L(t)$ is the time-dependent Liouville operator, $L(t)\rho = [H(t),\rho]/\hbar$, and $H(t)$ is the Hamiltonian, which is time dependent because the applied fields are time dependent. Note that we are using the $e^{-i\omega t}$ Fourier-transform time convention.
In addition to $\rho$ we define the relevant canonical-density operator $\sigma(t)$, which is developed by maximizing the information entropy subject to constraints on the expected values of operators. The entropy is defined as
$$S(t) = -k_B\,\mathrm{Tr}\,(\rho(t)\ln\sigma(t)), \qquad (2)$$
where $\mathrm{Tr}$ denotes the trace. In Eq.(2), $\sigma$ is formed from the relevant variables and is constructed by maximizing the entropy subject to constraints on the expectations of the operators $F_n$. Maximization by the common variational procedure leads to the generalized canonical density
$$\sigma(t) = \exp\Big(-\sum_n \lambda_n(t)*F_n\Big), \qquad (3)$$
where we require
$$\mathrm{Tr}\,\exp\Big(-\sum_n \lambda_n(t)*F_n\Big) = 1. \qquad (4)$$
In Eq.(4), the $\lambda_n$ are Lagrangian multipliers that correspond to local nonquantized fields, such as the temperature and electromagnetic fields. We use the notation $\lambda_n(t)*F_n \equiv \int \lambda_n(\mathbf{r},t)\,F_n(\mathbf{r})\,d^3r$. The constraints require that the expected values of the relevant variables (but not their derivatives, etc.) with respect to $\sigma$ and $\rho$ are equal at each time:
$$\mathrm{Tr}\,(\sigma(t)F_n(\mathbf{r})) = \mathrm{Tr}\,(\rho(t)F_n(\mathbf{r})) \equiv \langle F_n(\mathbf{r})\rangle. \qquad (5)$$
In our analysis the relevant variables will be the polarizations and the internal energy.
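As a concrete numerical illustration of the variational construction in Eqs.(3)-(5), the sketch below carries out the maximum-entropy procedure for the simplest discrete analog: a two-level system with a single relevant operator $F = \mathrm{diag}(+1,-1)$, for which the Lagrange multiplier can be found in closed form. The function name and setup are illustrative assumptions, not part of the formalism.

```python
# Maximum-entropy construction for a two-level system (illustrative):
# sigma = exp(-lam0 - lam*F), with Tr(sigma) = 1 and the constraint Tr(sigma F) = <F>.
import math

def maxent_two_level(target_F):
    # With F = diag(+1, -1): <F> = -tanh(lam), so lam = -atanh(<F>).
    lam = -math.atanh(target_F)
    Z = math.exp(-lam) + math.exp(lam)           # exp(lam0) = Z enforces Tr(sigma) = 1
    p = [math.exp(-lam) / Z, math.exp(lam) / Z]  # eigenvalues of sigma
    S = -sum(pi * math.log(pi) for pi in p)      # entropy in units of k_B
    return lam, p, S

lam, p, S = maxent_two_level(0.5)
assert abs((p[0] - p[1]) - 0.5) < 1e-12          # constraint Tr(sigma F) = 0.5 holds
```

The same variational logic carries over to the continuum case, with the eigenvalue sums replaced by traces over the full phase space.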
The dynamical evolution of the relevant variables, that is, the evolution from the Hamiltonian, is described by
$$\dot F_n = iL(t)F_n = \frac{i}{\hbar}\,[H(t),F_n], \qquad (6)$$
where the Hamiltonian in electromagnetic driving is, for example, $H(t) = H_0 - \int(\mathbf{P}\cdot\mathbf{E}(t) + \mu_0\mathbf{M}\cdot\mathbf{H}(t))\,d^3r$. In addition to the dynamical evolution, we will see that there are also changes in the evolution of the relevant variables due to the irrelevant information. An important identity was proven previously [18]:
$$\frac{\partial\sigma(t)}{\partial t} = -\sum_n \frac{\partial\lambda_n(t)}{\partial t}*\overline{F_n}\,\sigma(t), \qquad (7)$$
where the bar is defined for any operator $A$ as $\bar A \equiv \int_0^1 \sigma^x A\,\sigma^{-x}\,dx$. In a classical analysis $\bar A = A$. Also $\mathrm{Tr}\,(\bar A\,\sigma) = \mathrm{Tr}\,(A\sigma)$.
Robertson developed an exact equation for $\rho(t)$, containing memory, in terms of $\sigma$. When solving that equation and using Oppenheim's extended initial condition we obtain [13]
$$\rho(t) = \sigma(t) + T(t,0)\,[\rho(0)-\sigma(0)] - \int_0^t T(t,t')\,(1-P(t'))\,iL(t')\,\sigma(t')\,dt' \qquad (8)$$
for an initial condition in which $\rho(0)$ need not equal $\sigma(0)$ (note that Oppenheim and Levine [13] generalized the analysis of Robertson to include this more general initial condition). $T(t,t')$ is an evolution operator, $T(t,t) = 1$, and satisfies
$$\frac{\partial T(t,t')}{\partial t} = -(1-P(t))\,iL(t)\,T(t,t'), \qquad (9)$$
where $P$ is a nonhermitian projection-like operator defined by the functional derivative
$$P(t)A = \sum_n \frac{\delta\sigma(t)}{\delta\langle F_n\rangle}*\mathrm{Tr}\,(F_n A) \qquad (10)$$
for any operator $A$ [9]. In Robertson's pioneering work he showed that Eq.(10) is equivalent to the Kawasaki-Gunton and Grabert projection operators, and is a generalization of the Mori and Zwanzig projection operators [29]. As a consequence of this definition of the projection operator, $P(t)\dot\rho(t) = \dot\sigma(t)$. The normal maximum-entropy procedure (MAXENT) neglects the last term on the RHS of Eq.(8).
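The "projection-like" character of $P$ (idempotent but not hermitian) can be mimicked in finite dimensions. In this toy sketch, operators are modeled as vectors, the relevant set spans a subspace, and a weight matrix plays the role of $\sigma$; all names, dimensions, and values are illustrative assumptions.

```python
# Oblique (nonhermitian) projection onto the span of "relevant" vectors f_n.
import numpy as np

rng = np.random.default_rng(2)
F = rng.standard_normal((5, 2))            # two relevant "operators" in a 5-dim space
W = np.diag([1.0, 2.0, 0.5, 1.5, 3.0])     # weighting matrix standing in for sigma

# P = F (F^T W F)^{-1} F^T W : P^2 = P (projection-like) but P^T != P (nonhermitian)
P = F @ np.linalg.inv(F.T @ W @ F) @ F.T @ W

assert np.allclose(P @ P, P)               # idempotent
assert not np.allclose(P, P.T)             # not hermitian in general
```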
Consider the very special, but unrealistic, case in which every $iL(t)F_n$ can be written as a linear sum of the set,
$$iL(t)F_n = \sum_m c_{nm}F_m, \qquad (11)$$
for all $n$. If one could write this for all $n$, then the last terms on the RHS of Eq.(8) are zero and $\rho(t) = \sigma(t)$; in general, $\rho(t) \neq \sigma(t)$. Here the coefficients $c_{nm}$ do not depend on the phase variables. This is a consequence of an identity proven in [13,29]. When Eq.(11) applies for all $n$ (see Robertson [21], Eq.(A7), and Oppenheim and Levine [13]), the relaxation terms in Eq.(8) are absent. Eq.(11) would not apply for any macroscopic system and, in addition, causality, through the Kramers-Kronig condition, requires dissipation. As noted by Oppenheim and Levine and by Robertson [9,13], it is the failure of Eq.(11) for almost all physical systems that produces the relaxation term in Eq.(8) and the resultant entropy production. Due to the invariance of the trace operation under unitary transformations, it is known that the von Neumann entropy $-k_B\,\mathrm{Tr}\,(\rho\ln\rho)$, formed from the full statistical-density operator $\rho$ that satisfies the Liouville equation, is independent of time. This is very seldom realized in measurement systems, where irrelevant variables influence the system's evolution.
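The time independence of the von Neumann entropy under unitary (Liouville) evolution is easy to verify numerically; the density matrix and rotation below are illustrative choices.

```python
# Check: -Tr(rho ln rho) is invariant under rho -> U rho U^T for unitary U.
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]           # treat 0*ln(0) as 0
    return float(-np.sum(evals * np.log(evals)))

rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])               # a valid 2x2 density matrix
theta = 0.9
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
rho_t = U @ rho @ U.T                      # unitary (real orthogonal) evolution

assert abs(von_neumann_entropy(rho) - von_neumann_entropy(rho_t)) < 1e-12
```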
In the above, we have only considered closed, dynamically driven systems. In an open system, $\rho$ may not evolve unitarily and need not satisfy Eq.(1) [16].
It has been shown that the exact time evolution of the relevant variables can be expressed for a dynamically-driven system as [9,13]
$$\frac{d\langle F_n\rangle}{dt} = \mathrm{Tr}\,(\sigma(t)\,\dot F_n) + \mathrm{Tr}\,\{T(t,0)\,[\rho(0)-\sigma(0)]\,\dot F_n\} - \int_0^t \mathrm{Tr}\,\{T(t,t')\,(1-P(t'))\,iL(t')\,\sigma(t')\,\dot F_n\}\,dt'. \qquad (12)$$
Equations (5) and (12) form a closed system of equations, and the procedure for determining the Lagrange multipliers in terms of the $\langle F_n\rangle$ is to solve Eqs.(5) and (12) simultaneously. For operators that are odd under time reversal, such as the magnetic moment, the first term on the right-hand side of Eq.(12) is nonzero, whereas for operators even under time reversal, such as the dielectric polarization and the microscopic entropy, this term is zero. However, the third term in Eq.(12) is nonzero in any dissipative system. The relaxation correction term that appears in the projection-operator formalism is essential and is a source of the time dependence in the entropy rate. Although these equations are nonlinear, in many cases linear approximations have been successfully made [30]. For open systems, Eq.(12) is modified only by adding a source term [31].
When the relevant operators satisfy continuity equations, $\dot F_n = -\nabla\cdot\mathbf{J}_n$, and the normal components of the currents vanish on the bounding surfaces, we can express Eq.(12), using integration by parts, in terms of currents [18] (Eq.(13)). The mean energy of a quantum oscillator, $\langle u\rangle = \hbar\omega/(e^{\hbar\omega/k_BT}-1)$, defines the temperature, and in a high-temperature approximation this reduces to $\langle u\rangle \approx k_BT$. Approximate transport coefficients and the related fluctuation-dissipation relations for conductivity, susceptibility, noise, and other quantities follow naturally from Eq.(13) if we have time-harmonic fields and take a linear, time-invariant approximation. If we take the Laplace transform, we obtain the frequency-domain form, Eq.(14). We will use Eq.(14) in our applications to electromagnetic constitutive properties.
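The oscillator-based definition of temperature can be made concrete. The sketch below uses the Planck form $\langle u\rangle = \hbar\omega/(e^{\hbar\omega/k_BT}-1)$ (an assumption consistent with the high-temperature limit $k_BT$ quoted above) and inverts it to recover $T$ from a given mean energy; the frequency and temperature are illustrative.

```python
# Planck mean energy of a quantum oscillator and its inversion for temperature.
import math

hbar = 1.054571817e-34    # J s
kB = 1.380649e-23         # J / K

def planck_mean_energy(w, T):
    return hbar * w / math.expm1(hbar * w / (kB * T))

def temperature_from_energy(w, u):
    # invert: T = hbar*w / (kB * ln(1 + hbar*w/u))
    return hbar * w / (kB * math.log1p(hbar * w / u))

w = 2 * math.pi * 1e9     # 1 GHz oscillator (illustrative)
u = planck_mean_energy(w, 300.0)
assert abs(temperature_from_energy(w, u) - 300.0) < 1e-6
assert abs(u / (kB * 300.0) - 1) < 0.01      # high-T limit: <u> -> kB*T
```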
2.2. The Entropy in the Projection-Operator Formulation
From Eq.(2) and using Eq.(3), we define the entropy for a set of relevant variables: the electric polarization $\mathbf{P}$, the magnetic polarization $\mathbf{M}$, and the internal-energy density $u$; the identity has been included in the set of operators for normalization. The free energy is obtained from the normalization condition, Eq.(4). The Lagrangian multipliers in this example were the reciprocal generalized temperature, conjugate to $u$, and multipliers proportional to the local electric and magnetic fields, conjugate to $\mathbf{P}$ and $\mathbf{M}$.
In the following sections we will apply the theory we have developed to various problems in electromagnetism. Before doing this, we need to define the entropy density and its production rate. The entropy density $s(\mathbf{r},t)$ follows a conservation equation of the form
$$\frac{\partial s(\mathbf{r},t)}{\partial t} = -\nabla\cdot\mathbf{J}_s(\mathbf{r},t) + \Omega(\mathbf{r},t), \qquad (16)$$
where $\mathbf{J}_s$ is an entropy current, for example, due to a heat flux flowing into the system. In Eq.(16), $\Omega$ is the entropy-density production rate due to irreversible processes and relaxation. The units of $\Omega$ are entropy per second per unit volume.
The total microscopic entropy production rate, which is the integrated entropy-density production rate, originates from the dynamical evolution of the relevant variables. We define the total microscopic entropy production rate as
$$\dot s(t) \equiv k_B\sum_n \lambda_n(t)*\dot F_n. \qquad (17)$$
The expected value of the dynamical contribution to the total entropy production rate vanishes due to Eq.(7) [13]:
$$\mathrm{Tr}\,(\sigma(t)\,\dot s(t)) = k_B\sum_n \lambda_n(t)*\mathrm{Tr}\,(\sigma(t)\,\dot F_n) = 0. \qquad (18)$$
This result follows from the time-reversal invariance of the trace of $\sigma\dot s$, using the cyclic invariance of the trace, and Eq.(6). Equation (18) is a result of the microreversibility of the equations of motion of the relevant variables. For example, in the case of an isolated system with dynamical electromagnetic driving that has microscopic internal energy $u$, magnetization $\mathbf{M}$, local field $\mathbf{H}$, and generalized temperature $T$, we would have from Eq.(18): $\mathrm{Tr}\,(\sigma\dot u) = \mu_0\mathbf{H}\cdot\mathrm{Tr}\,(\sigma\dot{\mathbf{M}})$. In other words, in the dynamical contribution to the evolution of the relevant variables, all contributions to the entropy rate are taken into account, and in this sense $\dot s$ does not directly influence the total entropy evolution $dS/dt$, which includes the effects of both the relevant and nonrelevant variables in the dissipative term.
The total entropy evolution can be formed from Eq.(12) by multiplying by $k_B\lambda_n$, integrating over space (the $*$ operation), and summing over $n$:
$$\frac{dS}{dt} = k_B\sum_n \lambda_n(t)*\frac{d\langle F_n\rangle}{dt} = \mathrm{Tr}\,\{T(t,0)\,[\rho(0)-\sigma(0)]\,\dot s(t)\} + \frac{1}{k_B}\int_0^t \mathrm{Tr}\,\{T(t,t')\,(1-P(t'))\,\overline{\dot s(t')}\,\sigma(t')\,\dot s(t)\}\,dt'. \qquad (19)$$
Equation (19) is an exact expression of the second law of thermodynamics, since the Robertson-Zwanzig statistical-mechanical theory is an exact quantum-mechanical solution of the Liouville equation applied to the relevant variables, without approximation (see Robertson [18]). Note that it is time-reversal invariant, but also models dissipation. We used the relation $iL(t)\sigma(t) = -\frac{1}{k_B}\overline{\dot s(t)}\,\sigma(t)$. The last expression in Eq.(19) indicates that the entropy production rate satisfies a fluctuation-dissipation relationship in terms of the microscopic entropy production rate $\dot s$. At $t=0$, the integral term vanishes and $dS/dt$ reduces to the initial-condition term. Equation (19) will form the basis of our applications to various electromagnetic driving and measurement problems. The LHS of Eq.(19) represents the dissipation, and the last term on the RHS represents the fluctuations in terms of the microscopic entropy production rate $\dot s$. Due to incomplete information, there are contributions from the positive semi-definite relaxation terms in Eq.(19) for almost all many-body systems. For a dynamically-driven system, $\mathrm{Tr}\,(\sigma(t)\dot s(t)) = 0$, as in Eq.(18). For an open system, Eq.(19) would be modified by adding an entropy source term. To summarize, for a dynamically driven system, the expected value of the microscopic entropy production rate is zero due to the microscopic reversibility of the underlying equations of motion; however, in a complex system there are other uncontrolled variables, in addition to the relevant ones, that act to produce dissipation, irreversibility, and a net positive macroscopic entropy evolution. Since this equation is exact, systems away from equilibrium can be modeled.
Equation (19) could be used to determine Boltzmann's constant if the LHS was obtained from a measurement of the losses in a system and the trace expression was determined by measurements of the fluctuations of the entropy production. This could be actualized, in principle, by the measurement of noise in an electrical system, similar to studies that determine Boltzmann's constant from Johnson noise. It seems intuitive that Boltzmann's constant could be determined by entropy measurements, since the two have the same units.
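In that spirit, a minimal numerical sketch: synthetic Johnson noise is generated with variance set by the Nyquist relation $\langle V^2\rangle = 4k_BTRB$, and $k_B$ is then estimated back from the mean-square voltage. All parameter values are assumptions for illustration, not measured data.

```python
# Estimating Boltzmann's constant from (synthetic) Johnson-noise statistics.
import math, random

kB_true = 1.380649e-23
T, R, B = 300.0, 1.0e4, 1.0e5                 # K, ohm, Hz (illustrative)
sigma_v = math.sqrt(4 * kB_true * T * R * B)  # rms Johnson voltage

random.seed(0)
N = 200_000
v2_mean = sum(random.gauss(0.0, sigma_v) ** 2 for _ in range(N)) / N

kB_est = v2_mean / (4 * T * R * B)
assert abs(kB_est / kB_true - 1) < 0.05       # within statistical scatter
```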
For a stationary process, the RHS of Eq.(19) can be expressed through the Wiener-Khinchine theorem over a bandwidth $\Delta f$ and cast into a form that relates the net dissipative entropy production, averaged over a cycle, to entropy-production fluctuations. A special case of this equation is Johnson noise, which yields the Nyquist result $\langle V^2\rangle = 4k_BTR\,\Delta f$. The entropy-production correlation function can be measured in terms of power fluctuations. We will use this relation for applications in electromagnetism and indicate how it relates to the fluctuation-dissipation theorems for dielectric and magnetic measurements, Johnson noise, and the determination of Boltzmann's constant.
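The Wiener-Khinchine step used above has a simple discrete analog that can be checked directly: the periodogram of a sequence equals the Fourier transform of its circular autocorrelation. The sequence below is arbitrary.

```python
# Discrete (circular) Wiener-Khinchine identity: FFT(autocorrelation) = |FFT(x)|^2.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(256)

# circular autocorrelation r[k] = sum_n x[n] * x[(n+k) mod N]
r = np.array([np.dot(x, np.roll(x, -k)) for k in range(x.size)])

assert np.allclose(np.fft.fft(r).real, np.abs(np.fft.fft(x)) ** 2)
```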
Using Eq.(19) and assuming a continuity equation for the evolution, the conserved relevant variables can be written as $\dot F_n = -\nabla\cdot\mathbf{J}_n$, where the normal component of $\mathbf{J}_n$ vanishes on bounding surfaces; the rest are nonconserved variables. This yields the entropy-density balance equation, Eq.(20). For an open system, entropy fluxes may travel through the boundaries and may be modeled by the addition of an entropy source.
Equation (19) can also be used as a generator of the relaxation part of the equations of motion of the relevant variables. Taking a functional derivative of Eq.(19) with respect to $\lambda_n(\mathbf{r},t)$, and noting that the variations are arbitrary over the volume, we obtain the exact evolution equations, without the reversible contribution, that were derived by Robertson and later by Oppenheim and Levine [9,13] and others, given in Eq.(12). For example, if the driving forces are the related Lagrange multipliers for the internal energy, the electric polarization, and the magnetic polarization, we obtain the equation of motion of the polarization response, Eq.(21) (see Baker-Jarvis et al. [6,20]). The second form in Eq.(21) displays the interaction with the microscopic entropy production rate. As another illustration, if we vary the generalized force conjugate to the magnetization in Eq.(19), we obtain the equation of motion for the magnetization, Eq.(22), in which we can identify the prefactor of the reversible term as $\gamma$, the effective gyromagnetic factor. The equation of motion for the internal-energy density is Eq.(23). Equations (21) through (23) are exact, coupled, nonlinear equations that must be solved in conjunction with Eqs.(5) for the unknowns $\langle F_n\rangle$ and $\lambda_n$. In general, this is not a simple task, but in many examples it is possible to make approximations that linearize the kernel. Robertson showed how Eq.(22) reduces to the Landau-Lifshitz equation, under appropriate assumptions, when the memory kernel was suitably approximated. The electric polarization equation, Eq.(21), was linearized and solved in [30].
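For context, the Landau-Lifshitz limit mentioned above can be integrated directly. This sketch (dimensionless, with illustrative parameters) evolves $d\mathbf{M}/dt = -\gamma\,\mathbf{M}\times\mathbf{H} - (\alpha\gamma/M_s)\,\mathbf{M}\times(\mathbf{M}\times\mathbf{H})$ and checks the two qualitative features of damped precession: $|\mathbf{M}|$ is conserved while $\mathbf{M}$ relaxes toward the field.

```python
# Landau-Lifshitz precession with damping (dimensionless illustration).
import numpy as np

gamma, alpha, Ms = 1.0, 0.1, 1.0
H = np.array([0.0, 0.0, 1.0])

def ll_rhs(M):
    return (-gamma * np.cross(M, H)
            - (alpha * gamma / Ms) * np.cross(M, np.cross(M, H)))

M = np.array([1.0, 0.0, 0.0])      # start perpendicular to H
dt = 1e-3
for _ in range(60_000):            # integrate to t = 60 with a midpoint (RK2) step
    k1 = ll_rhs(M)
    M = M + dt * ll_rhs(M + 0.5 * dt * k1)

assert abs(np.linalg.norm(M) - Ms) < 1e-3   # |M| conserved by the dynamics
assert M[2] > 0.99 * Ms                     # relaxed toward H
```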