Review

Extreme Fisher Information, Non-Equilibrium Thermodynamics and Reciprocity Relations

Silvana Flego, Felipe Olivares, Angelo Plastino and Montserrat Casas

1 Instituto de Fisica (IFLP), Universidad Nacional de La Plata, C.C. 727, 1900 La Plata, Argentina
2 Departamento de Ciencias Basicas, Universidad Nacional de La Plata, Fac. de Ingenieria, 1900 La Plata, Argentina
3 Departamento de Fisica, Universidad Nacional de La Plata, Fac. Ciencias Exactas, C.C. 67, 1900 La Plata, Argentina
4 Argentine National Research Center (CCT-CONICET), Argentina
5 Departament de Fisica, Universitat de les Illes Balears and IFISC-CSIC, 07122 Palma de Mallorca, Spain
* Author to whom correspondence should be addressed.
Entropy 2011, 13(1), 184-194; https://doi.org/10.3390/e13010184
Submission received: 12 November 2010 / Revised: 10 January 2011 / Accepted: 12 January 2011 / Published: 14 January 2011
(This article belongs to the Special Issue Advances in Information Theory)

Abstract

In employing MaxEnt, a crucial role is assigned to the reciprocity relations that relate the quantifier to be extremized (Shannon's entropy S), the Lagrange multipliers that arise during the variational process, and the expectation values that constitute the a priori input information. We review here just how these ingredients relate to each other when the information quantifier S is replaced by Fisher's information measure I. The connection of these proceedings with thermodynamics constitutes our physical background.

1. Introduction

We present here a review of the connections between Fisher's information measure and the formalism of thermodynamics, whose main feature resides in its Legendre transform structure. Our starting point is the connection between information theory and statistical mechanics, which was established by Jaynes [1,2] on the basis of a constrained variational approach. This entails extremizing Shannon's information measure subject to the constraints imposed by the a priori knowledge one may possess concerning the system of interest. Jaynes has shown that the whole of statistical mechanics can be elegantly reformulated, without reference to the ensemble notion, if one chooses Boltzmann's constant as the informational unit and identifies Shannon's measure with the thermodynamic entropy. The concomitant methodology is referred to as the Maximum Entropy Principle (MaxEnt) [1,2]. Nevertheless, this methodology does not always lead to an adequate distribution function [3], a fact that has encouraged the formulation of alternative entropies or variational procedures. The latter is the case if one appeals to Fisher's information measure (FIM) I [3,4,5,6], with I replacing S. Such an approach provides a new viewpoint within the so-called Wheeler paradigm of observer-participatory physics [7]. Indeed, much effort has been focused recently upon FIM applications. The work of Frieden, Soffer, Plastino, Nikolov, Casas, Pennini, Miller, and others has shed much light upon the manifold physical applications of I (as a non-exhaustive set, see for instance [5,8,9,10,11,12,13]).

2. MaxEnt and Reciprocity Relations

As stated above, statistical mechanics and thermodynamics can be formulated on the basis of Information Theory if the density distribution $f(x)$ is obtained by MaxEnt [1,2]. MaxEnt asserts that, if your prior knowledge about the system is given by the values of M expectation values $\langle A_1\rangle,\dots,\langle A_M\rangle$, then $f(x)$ is uniquely fixed by extremizing $S(f)$ subject to the constraints given by the M conditions $\langle A_j\rangle = \int dx\, f(x)\, A_j(x)$, which entails the introduction of M Lagrange multipliers $\lambda_i$. In the process, one discovers that the information quantifier S can be identified with the equilibrium entropy of thermodynamics if our prior knowledge $\langle A_1\rangle,\dots,\langle A_M\rangle$ refers to extensive quantities. $S\big|^{\,\rm constraints}_{\,\rm extrem.}$, once determined, yields complete thermodynamical information with respect to the system of interest [1]. The classical MaxEnt probability distribution function (PDF) $f(x)$, associated to the Boltzmann-Gibbs-Shannon logarithmic entropy, is given by [1,2]
$$ f_{\rm MaxEnt} = f(x) = \exp\!\left[-\left(\Omega + \sum_{i=1}^{M}\lambda_i A_i(x)\right)\right], $$
with the normalization parameter Ω being given by [1,2]
$$ \Omega(\lambda_1,\dots,\lambda_M) = \ln \int dx\, \exp\!\left[-\sum_{i=1}^{M}\lambda_i A_i(x)\right], $$
that also verifies
$$ \frac{\partial\,\Omega(\lambda_1,\dots,\lambda_M)}{\partial\lambda_j} = -\langle A_j\rangle, \qquad (j = 1,\dots,M), $$
and
$$ S = \Omega + \sum_{i=1}^{M}\lambda_i \langle A_i\rangle. $$
Accordingly,
$$ dS = \sum_{i=1}^{M}\lambda_i\, d\langle A_i\rangle, $$
so that the Euler theorem holds [2]:
$$ \frac{\partial S}{\partial\lambda_i} = \sum_{k}\lambda_k \frac{\partial\langle A_k\rangle}{\partial\lambda_i}, $$
and, using (4), one arrives at the so-called reciprocity relations, around which this communication will revolve, starting with
$$ dS = \sum_{i=1}^{M}\lambda_i\, d\langle A_i\rangle \;\Rightarrow\; \frac{\partial S}{\partial\langle A_i\rangle} = \lambda_i, $$
$$ S = S(\langle A_1\rangle,\dots,\langle A_M\rangle), $$
applying the Legendre transform
$$ \Omega = \Omega(\lambda_1,\dots,\lambda_M) = S - \sum_{i=1}^{M}\lambda_i\langle A_i\rangle, $$
and then immediately finding that reciprocity holds, namely,
$$ \frac{\partial S}{\partial\langle A_j\rangle} = \lambda_j \quad {\rm and} \quad \frac{\partial\Omega}{\partial\lambda_j} = -\langle A_j\rangle; \qquad j = 1,\dots,M, $$
where the second set of equations, together with (2), yields the Lagrange multipliers as functions of the input information regarding expectation values [2]. Finally, let us point out that the nice expression (2) results from having a closed analytical expression for $f_{\rm MaxEnt}$. Things become more involved below, in the Fisher instance, where such an expression is not available.
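Before turning to the Fisher case, it may help to see the reciprocity relations at work numerically. The following Python sketch (ours, not part of the original presentation) takes a single constraint $A(x) = x^2$, computes $\Omega(\lambda)$ by quadrature, and verifies that $\partial\Omega/\partial\lambda = -\langle A\rangle$:

```python
# A minimal numerical check (not from the paper) of the MaxEnt
# reciprocity relations for a single constraint A(x) = x^2.
import numpy as np
from scipy.integrate import quad

def Omega(lam):
    # Omega(lambda) = ln \int dx exp(-lambda x^2), cf. Eq. (2) with M = 1
    val, _ = quad(lambda x: np.exp(-lam * x**2), -np.inf, np.inf)
    return np.log(val)

def mean_A(lam):
    # <A> = \int dx f(x) x^2, with f(x) = exp[-(Omega + lambda x^2)]
    Z = np.exp(Omega(lam))
    val, _ = quad(lambda x: x**2 * np.exp(-lam * x**2), -np.inf, np.inf)
    return val / Z

lam, h = 0.7, 1e-5
dOmega = (Omega(lam + h) - Omega(lam - h)) / (2 * h)
print(dOmega, -mean_A(lam))   # both equal -1/(2*lam): reciprocity
```

For this Gaussian case both printed numbers equal $-1/(2\lambda)$ analytically, so they should agree to quadrature accuracy.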

3. A Fisher Primer

3.1. Preliminaries

We have in mind here the formalism developed in Reference [6], adapted to a three-dimensional setting. Consider a system that is specified by a physical parameter $\boldsymbol\theta$ and let $f(\boldsymbol v,\theta|t)$ denote the normalized probability distribution function (PDF) for this parameter at the time t. Fisher's Information Measure (FIM) I reads
$$ I = \int d^3v\, f(\boldsymbol v,\theta|t) \sum_{i=1}^{3}\left[\frac{\partial}{\partial\theta_i}\ln f(\boldsymbol v,\theta|t)\right]^{2}. $$
The special case of translational families deserves a special mention. These are mono-parametric distribution families of the form
$$ f(\boldsymbol v,\theta|t) = f(\boldsymbol v - \boldsymbol\theta|t), $$
which are known up to the shift parameter $\boldsymbol\theta$. Following Mach's principle (no absolute origin), all members of the family possess identical shape, and the FIM then adopts the form
$$ I = \int d^3v\, f(\boldsymbol v|t)\sum_{i=1}^{3}\left[\frac{\partial}{\partial v_i}\ln f(\boldsymbol v|t)\right]^{2}. $$
This FIM-form exhibits a variety of mathematical properties (see, for instance, [14,15]) and constitutes the main ingredient of a powerful variational principle, as discussed below.
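As a quick sanity check of this translational-family form, the sketch below (an illustration of ours, in one dimension rather than three) evaluates the FIM integral numerically for a Gaussian of standard deviation σ, for which the exact result is $I = 1/\sigma^2$:

```python
# Sketch: numerical evaluation of the shift-invariant FIM (1-D version)
# for a Gaussian PDF; the exact value is 1/sigma^2.
import numpy as np
from scipy.integrate import quad

sigma = 1.3

def f(v):
    return np.exp(-v**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

def integrand(v, h=1e-6):
    # f * (d ln f / dv)^2, derivative taken by central differences
    dlnf = (np.log(f(v + h)) - np.log(f(v - h))) / (2 * h)
    return f(v) * dlnf**2

I, _ = quad(integrand, -12 * sigma, 12 * sigma)   # tails are negligible
print(I, 1 / sigma**2)                            # ~0.5917 in both cases
```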

3.2. Extremizing Fisher’s Information Measure

Consider a system that is specified by a set of M physical parameters $\theta_k$, measured at the time t. We can write
$$ \theta_k = \langle A_k\rangle_t, \qquad {\rm with}\quad A_k = A_k(\boldsymbol v)\quad (k = 1,\dots,M), $$
and $\theta_k$ measured at the time t.
Note that the set of $\theta_k$-values is the prior knowledge, representing empirical information measured at the fixed time t. Let the pertinent probability distribution function (PDF) be $f(\boldsymbol v|t)$. Then,
$$ \langle A_k\rangle_t = \int d^3v\, A_k(\boldsymbol v)\, f(\boldsymbol v|t), \qquad k = 1,\dots,M. $$
These mean values play the role of extensive thermodynamical variables, as explained in Reference [6]. In this context, the relevant PDF $f(\boldsymbol v|t)$ extremizes the FIM (11) subject to (i) the prior conditions (12) and, of course, (ii) the normalization condition
$$ \int d^3v\, f(\boldsymbol v|t) = 1. $$
Our Fisher-based extremization problem adopts, at a given time t, the appearance
$$ \delta\!\left[\,I - \alpha\!\int d^3v\, f(\boldsymbol v|t) - \sum_{k=1}^{M}\lambda_k\!\int d^3v\, A_k(\boldsymbol v)\, f(\boldsymbol v|t)\right] = 0, $$
where we have introduced the $(M+1)$ Lagrange multipliers. Variation leads to
$$ \sum_{i=1}^{3}\left[\frac{1}{f^{2}}\left(\frac{\partial f}{\partial v_i}\right)^{2} + \frac{\partial}{\partial v_i}\!\left(\frac{2}{f}\,\frac{\partial f}{\partial v_i}\right)\right] + \alpha + \sum_{k=1}^{M}\lambda_k A_k(\boldsymbol v) = 0. $$
To put the above equation into a more manageable form [6,16,17], we introduce the function $\psi(\boldsymbol v|t)$ via the identification $|\psi(\boldsymbol v|t)|^2 = f(\boldsymbol v|t)$, so that Equation (16) adopts a Schrödinger-like aspect,
$$ -\frac{1}{2}\,\nabla^{2}\psi - \sum_{k=1}^{M}\frac{\lambda_k(t)}{8}\, A_k\,\psi = \frac{\alpha}{8}\,\psi. $$
Then, in order to find the PDF, one has to solve the above wave equation (WE), where the Lagrange multiplier $\alpha/8$ plays the role of an energy eigenvalue E, and the sum of the $\lambda_k A_k$ terms constitutes an effective potential function
$$ U = U(\boldsymbol v,t) = -\frac{1}{8}\sum_{k=1}^{M}\lambda_k(t)\, A_k(\boldsymbol v), $$
where the Lagrange parameters $\lambda_k$ are fixed by the available prior information. Notice that the eigenenergies $\alpha/8$ automatically yield the value of the Lagrange multiplier associated with normalization [cf. Equation (2) for the Shannon instance]. Squaring the solutions ψ yields the PDF, i.e.,
$$ |\psi(\boldsymbol v,t)|^{2} = f(\boldsymbol v|t). $$
It is important to remark that
  • No specific potential has been assumed, as is appropriate for thermodynamics. Also, we note that U is a time-dependent potential function, which will allow for the description of non-equilibrium situations.
  • The specific $A_k(\boldsymbol v)$ to be used depend upon the nature of the physical application at hand. This application could be of either a classical or a quantum nature.
  • Equation (17) represents a boundary value problem, generally with multiple solutions, in contrast with the unique solution that one obtains when employing Jaynes-Shannon’s entropy in place of Fisher’s measure [18].
  • The solution leading to the lowest I-value is the equilibrium one [6]; admixtures of excited solutions yield non-equilibrium states [6]. A minimal numerical sketch of this eigenproblem is given below.
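As announced in the last item, here is a one-dimensional numerical sketch of the eigenproblem (17) (our illustration, with a hypothetical multiplier $\lambda_1 < 0$ chosen so that the effective potential confines):

```python
# A 1-D sketch of the Fisher variational problem as the eigenproblem (17):
# -(1/2) psi'' + U psi = (alpha/8) psi, with U = -(lambda_1/8) v^2 and a
# hypothetical lambda_1 < 0 so that U confines. f = psi^2 is the PDF.
import numpy as np

lam1 = -8.0                        # hypothetical Lagrange multiplier
v = np.linspace(-6.0, 6.0, 1200)
h = v[1] - v[0]
U = -(lam1 / 8.0) * v**2           # effective potential (M = 1, A_1 = v^2)

# Finite-difference Hamiltonian: tridiagonal kinetic term plus diagonal U
main = 1.0 / h**2 + U
off = -0.5 / h**2 * np.ones(v.size - 1)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

E, psi = np.linalg.eigh(H)         # eigenvalues E = alpha/8, cf. Eq. (17)
f = psi[:, 0]**2                   # ground state -> equilibrium PDF f = psi^2
f /= np.trapz(f, v)
print("alpha =", 8 * E[0], "  <v^2> =", np.trapz(v**2 * f, v))
```

Excited eigenvectors of the same matrix provide the non-equilibrium admixtures mentioned above.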

4. Illustration: The Treatment of the Ideal Gas

As a didactic example we here discuss the Fisher treatment of the ideal gas, following the considerations expounded in [23]. We look for the density distribution, in configuration space, of the (translationally invariant) ideal gas (IG) that describes non-interacting classical particles of mass m with coordinates $q = (\boldsymbol r, \boldsymbol p)$, where $m\, d\boldsymbol r/dt = \boldsymbol p$. The translational invariance is described by the translational family of distributions $F(\boldsymbol r,\boldsymbol p\,|\,\boldsymbol\theta_r,\boldsymbol\theta_p) = F(\boldsymbol r',\boldsymbol p')$, whose form does not change under the transformations $\boldsymbol r' = \boldsymbol r - \boldsymbol\theta_r$ and $\boldsymbol p' = \boldsymbol p - \boldsymbol\theta_p$. We assume that these coordinates are canonical and uncorrelated. This assumption is introduced into the Fisher information measure (FIM) (11), now in 2D dimensions (phase space). For the sake of dimensional balance we introduce in (11) two appropriate dimensional constants, namely, $c_r$ for the space coordinates and $c_p$ for the momentum coordinates [23]. The phase-space probability density $F(\boldsymbol r,\boldsymbol p)$ can obviously be factorized in the fashion $F(\boldsymbol r,\boldsymbol p) = \rho(\boldsymbol r)\,\eta(\boldsymbol p)$, and it then follows from the additivity of the information measure [5] that $I = I_r + I_p$, i.e., the FIM becomes the sum of a coordinate-FIM and a momentum one. With D the dimensionality, we have
$$ I_r = c_r\!\int d^{D}r\, \rho(\boldsymbol r)\left[\nabla_{\boldsymbol r}\ln\rho(\boldsymbol r)\right]^{2}, \qquad I_p = c_p\!\int d^{D}p\, \eta(\boldsymbol p)\left[\nabla_{\boldsymbol p}\ln\eta(\boldsymbol p)\right]^{2}. $$
In extremizing FIM we constrain [23] the normalization of ρ ( r ) and η ( p ) to the total number of particles N and to 1, respectively, i.e.,
$$ \int d^{D}r\, \rho(\boldsymbol r) = N, \qquad \int d^{D}p\, \eta(\boldsymbol p) = 1. $$
In addition, we must penalize infinite values of the particle momentum (infinite energies are unphysical) with a constraint on the variance of $\eta(\boldsymbol p)$ that forces it to be finite [23], namely,
$$ \int d^{D}p\, \eta(\boldsymbol p)\,(\boldsymbol p - \bar{\boldsymbol p})^{2} = D\,\sigma_p^{2}, $$
where $\bar{\boldsymbol p}$ is the mean value of $\boldsymbol p$. For each degree of freedom, it is known from the virial theorem that the variance is related to the temperature T as $\sigma_p^2 = m k_B T$, with $k_B$ the Boltzmann constant. Variation thus yields
$$ \delta\!\left[c_r\!\int d^{D}r\, \rho\left(\nabla_{\boldsymbol r}\ln\rho\right)^{2} + \mu\!\int d^{D}r\, \rho\right] = 0 $$
and
$$ \delta\!\left[c_p\!\int d^{D}p\, \eta\left(\nabla_{\boldsymbol p}\ln\eta\right)^{2} + \lambda\!\int d^{D}p\, \eta\,(\boldsymbol p-\bar{\boldsymbol p})^{2} + \nu\!\int d^{D}p\, \eta\right] = 0, $$
where μ, λ and ν are Lagrange multipliers. Introducing now $\rho(\boldsymbol r) = \Psi^2(\boldsymbol r)$ and varying (23) with respect to Ψ leads to a Schrödinger-like equation
$$ \left(-4\nabla_{\boldsymbol r}^{2} + \mu'\right)\Psi(\boldsymbol r) = 0, $$
where $\mu' = \mu/c_r$. To fix the boundary conditions, we first assume that the N particles are confined in a box of volume V, and then take the thermodynamic limit $N\to\infty$, $V\to\infty$ with $N/V$ finite. The equilibrium state compatible with this limit corresponds to the ground-state solution ($\mu' = 0$), which is the uniform density $\rho(\boldsymbol r) = N/V$.
Introducing $\eta(\boldsymbol p) = \Phi^2(\boldsymbol p)$ and varying (24) with respect to Φ leads to the quantum harmonic oscillator-like equation
$$ \left[-4\nabla_{\boldsymbol p}^{2} + \lambda'(\boldsymbol p-\bar{\boldsymbol p})^{2} + \nu'\right]\Phi(\boldsymbol p) = 0, $$
where $\lambda' = \lambda/c_p$ and $\nu' = \nu/c_p$. The equilibrium configuration corresponds to the ground-state solution, which is now a Gaussian distribution. Using (22) to identify $|\lambda'|^{-1/2} = \sigma_p^2$, we get the Maxwell-Boltzmann distribution, which leads to a density distribution in configuration space of the form
$$ f(\boldsymbol r,\boldsymbol p) = \frac{N}{V}\, \frac{\exp\!\left[-(\boldsymbol p-\bar{\boldsymbol p})^{2}/2\sigma_p^{2}\right]}{(2\pi\sigma_p^{2})^{D/2}}. $$
If H is the elementary volume in phase space, the total number of microstates is [24] $Z = N!\, H^{DN}\prod_{i=1}^{N} F_1(\boldsymbol r_i,\boldsymbol p_i)$, where $F_1 = F/N$ is the single-particle distribution and $N!$ counts all possible permutations of distinguishable particles. The entropy $S = -k_B \ln Z$ then takes the form
$$ S = N k_B\left\{\ln\!\left[\frac{V}{N}\left(\frac{2\pi\sigma_p^{2}}{H^{2}}\right)^{D/2}\right] + \frac{2+D}{2}\right\}, $$
where we have used the Stirling approximation for $N!$. This expression agrees, of course, with the well-known entropic expression for the IG [24], illustrating the predictive power of the FIM formulation.
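As a consistency check (ours, not in the original text): for D = 3, H equal to Planck's constant and $\sigma_p^2 = m k_B T$, the entropy just obtained is algebraically identical to the familiar Sackur-Tetrode result. The sketch below evaluates both for a mole of helium at room temperature:

```python
# Sketch: the IG entropy derived above, for D = 3, H = Planck's h and
# sigma_p^2 = m*kB*T, reproduces the standard Sackur-Tetrode expression.
import numpy as np

kB, hP = 1.380649e-23, 6.62607015e-34    # J/K, J*s
m, T = 6.6464731e-27, 300.0              # helium-4 mass (kg), temperature (K)
N, V = 6.02214076e23, 0.0224             # one mole in 22.4 litres
D = 3
sigma_p2 = m * kB * T                    # momentum variance per d.o.f.

S_fim = N * kB * (np.log((V / N) * (2 * np.pi * sigma_p2 / hP**2)**(D / 2))
                  + (2 + D) / 2)
S_st = N * kB * (np.log((V / N) * (2 * np.pi * m * kB * T / hP**2)**1.5) + 2.5)
print(S_fim, S_st)                       # identical, ~126 J/K per mole
```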

5. Connecting the SWE’s Solutions to Thermodynamics

The connection between the solutions of Equation (17) and thermodynamics has been established in References [6] and [9]. Here we will look at things in a rather different fashion. For starters, we will consider the vector $\boldsymbol v$ in that equation to be a "velocity", and we are going to extend the procedure of [6] and [9] to the three-dimensional instance by dealing with an equilibrium gas of mass density $\rho_o$. Moreover, we will focus on facets of non-equilibrium thermodynamics.
Accordingly, our velocity-space Schrödinger wave equation (SWE) reads
$$ -\frac{1}{2\rho_o}\,\nabla^{2}\psi(\boldsymbol v,t) - \sum_{k=1}^{M}\frac{\lambda_k(t)}{8}\, A_k(\boldsymbol v)\,\psi(\boldsymbol v,t) = \frac{\alpha}{8}\,\psi(\boldsymbol v,t). $$
The prior knowledge is chosen to be the temperature characterizing our equilibrium state. How? Via the equipartition theorem [19], which allows us to calculate the average value of the squared velocity in the equilibrium state, $\langle v^2\rangle_o$. Consequently, choosing $A_1(\boldsymbol v) = v^2$ and writing $\lambda_1(t) = \rho_o/(2\omega_o^2)$, $\alpha/8 = E/\omega_o^2$, the ensuing time-independent Schrödinger wave equation is given by
$$ \left[-\frac{1}{2}\nabla^{2} + \frac{\rho_o^{2}}{2\omega_o^{2}}\,v^{2}\right]\psi = \frac{\rho_o E}{\omega_o^{2}}\,\psi. $$
At this point, we split the Hamiltonian H into the unperturbed Hamiltonian $H_o$ plus a perturbation part $H'$,
$$ H = H_o + H', \qquad H\psi_n = E_n\psi_n, \qquad H_o\phi_n = E_n^{o}\phi_n. $$
$H_o$ can be identified with the harmonic oscillator Hamiltonian,
$$ \left[-\frac{1}{2}\nabla^{2} + \frac{\rho_o^{2}}{2\omega_o^{2}}\,v^{2}\right]\phi_n(\boldsymbol v,t) = \frac{\rho_o E_n^{o}}{\omega_o^{2}}\,\phi_n(\boldsymbol v,t), $$
so that the ground state solution becomes a Gaussian function,
$$ \phi_o = \left(\frac{\rho_o}{\pi\omega_o}\right)^{3/4}\exp\!\left(-\frac{\rho_o}{2\omega_o}\,v^{2}\right). $$
The excited solutions $\psi_n(\boldsymbol v,t)$ of the Fisher-based SWE can be obtained using an appropriate, standard approximation method for stationary states [6,9,21]. The expansion coefficients are computed from the $\langle A_k\rangle$ of (13), using Hermite polynomials [22]. The total number of coefficients one needs depends upon how far from equilibrium we are.
Note that the coefficients are computed at the fixed time t at which the input data $\langle A_k\rangle_t$ are collected. At equilibrium there is only one such coefficient. The premise of the constrained Fisher information approach is that its input constraints are correct, since they come from experiment. Summing up, the approach of [6] yields solutions at the fixed (but arbitrary) time t. The Schrödinger wave equation approach gives solutions valid at discrete time points t. In other words, for any other time value $t^*$ we need to input new $\langle A_k\rangle$ values, appropriate for that time, but this does not compromise the validity of the Fisher-Schrödinger approach.
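To make the expansion concrete, the following sketch (our own illustration; one-dimensional, with $\rho_o/\omega_o$ scaled to unity, and using simple overlap integrals rather than the constraint-based perturbative coefficients of [6]) expands $\psi = \sqrt{f}$ of a slightly off-equilibrium PDF in the Hermite-Gauss eigenfunctions of the unperturbed oscillator; near equilibrium only a handful of coefficients are non-negligible:

```python
# Sketch: expansion of psi = sqrt(f) in Hermite-Gauss oscillator
# eigenfunctions (1-D, unit frequency); a near-equilibrium PDF is
# dominated by the n = 0 coefficient, cf. note [22].
import numpy as np
from scipy.special import eval_hermite, factorial

v = np.linspace(-8.0, 8.0, 4001)

def phi(n, v):
    # n-th normalized oscillator eigenfunction (Hermite-Gauss)
    norm = 1.0 / np.sqrt(2.0**n * factorial(n) * np.sqrt(np.pi))
    return norm * eval_hermite(n, v) * np.exp(-v**2 / 2)

# Equilibrium-width Gaussian (the squared ground state) with a small shift
f = np.exp(-(v - 0.1)**2) / np.sqrt(np.pi)
psi = np.sqrt(f)

coeffs = [np.trapz(phi(n, v) * psi, v) for n in range(6)]
print(np.round(coeffs, 4))   # c_0 close to 1, higher c_n decay fast
```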

6. Fisher Reciprocity Relations

The reciprocity relations are an expression of the Legendre-invariant structure of thermodynamics and constitute its essential formal ingredient [20]. It is thus a crucial question to ascertain that they also hold for the Fisher treatment, as we will show below, so that we can speak of a Fisher-thermodynamics.
As stated in Section 2, standard thermodynamics makes use of the derivatives of the entropy S with respect to both the $\lambda_i$ and the $\langle A_i\rangle$ parameters (for instance, pressure and volume, respectively). In the same way, we are led to investigate the analogous properties of $\partial I/\partial\lambda_i$ and $\partial I/\partial\langle A_i\rangle$. The concomitant proceedings are not as direct as their MaxEnt counterparts, but present no serious difficulties. The derivation below is, as far as we know, original.
We start by substituting (19) into Equation (11),
$$ I = \int d^{3}v\, f \sum_{i=1}^{3}\left(\frac{\partial \ln f}{\partial v_i}\right)^{2} = \int d^{3}v\, \psi_n^{2}\sum_{i=1}^{3}\left(\frac{\partial \ln \psi_n^{2}}{\partial v_i}\right)^{2} = 4\!\int d^{3}v \sum_{i=1}^{3}\left(\frac{\partial\psi_n}{\partial v_i}\right)^{2}, $$
which, with some vector algebra and the Gauss theorem (the surface term vanishes for a normalizable $\psi_n$), can be recast as
$$ I = 4\oint_{S}\psi_n\,\nabla\psi_n\cdot\hat{\boldsymbol n}\, ds - 4\!\int \psi_n\,\nabla^{2}\psi_n\, d^{3}v = -4\!\int \psi_n\,\nabla^{2}\psi_n\, d^{3}v. $$
Then, using the SWE (17) we get
$$ I = \int \psi_n\left[\alpha + \sum_{k=1}^{M}\lambda_k A_k\right]\psi_n\, d^{3}v. $$
Finally, the prior conditions (12) and the normalization condition (13) lead to
$$ I(\langle A_1\rangle,\dots,\langle A_M\rangle) = \alpha + \sum_{k=1}^{M}\lambda_k\langle A_k\rangle, $$
the Fisher-counterpart of (4). Note that the Legendre transform of I is α,
$$ \alpha = I(\langle A_1\rangle,\dots,\langle A_M\rangle) - \sum_{k=1}^{M}\lambda_k\langle A_k\rangle = \alpha(\lambda_1,\dots,\lambda_M), $$
so that
$$ \frac{\partial\alpha}{\partial\lambda_i} = -\langle A_i\rangle. $$
Finally, according to (36)
$$ \lambda_k = \frac{\partial I}{\partial\langle A_k\rangle}, $$
and moreover,
$$ \frac{\partial I}{\partial\lambda_i} = \sum_{k=1}^{M}\lambda_k\,\frac{\partial\langle A_k\rangle}{\partial\lambda_i}, $$
which is a generalized Fisher-Euler theorem that was previously proved in [6].
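The chain of identities above is easy to check numerically. In the following sketch (ours), a one-dimensional normalized $\psi = \sqrt{f}$ is used to verify that $\int f\,(\partial\ln f/\partial v)^2\,dv = 4\int(\partial\psi/\partial v)^2\,dv = -4\int\psi\,\partial^2\psi/\partial v^2\,dv$:

```python
# Sketch: 1-D numerical check that I can be written equivalently as
# int f (ln f)'^2, 4 int (psi')^2, and -4 int psi psi'' (Gauss theorem).
import numpy as np

v = np.linspace(-10.0, 10.0, 20001)
sigma = 0.9
f = np.exp(-v**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
psi = np.sqrt(f)

dpsi = np.gradient(psi, v)
d2psi = np.gradient(dpsi, v)
dlnf = np.gradient(np.log(f), v)

I1 = np.trapz(f * dlnf**2, v)
I2 = 4 * np.trapz(dpsi**2, v)
I3 = -4 * np.trapz(psi * d2psi, v)
print(I1, I2, I3, 1 / sigma**2)   # all four ~1.2346
```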

7. Second Illustrative Simple Example

To illustrate the above assertions we discuss a simple and instructive one-dimensional example. We focus attention on Equation (17) and assume that the prior information consists of the value of the moment $\langle x^2\rangle$, which leads to a harmonic oscillator problem, i.e.,
$$ -\frac{1}{2}\,\frac{d^{2}\psi}{dx^{2}} - \frac{\lambda_2}{8}\,x^{2}\psi = \frac{\alpha}{8}\,\psi. $$
It is immediately verified that the Gaussian wave function
$$ \psi(x) = (2\pi)^{-1/4}\,\sigma^{-1/2}\exp\!\left[-(2\sigma)^{-2}x^{2}\right], $$
(with $\sigma^2$ the dispersion) is a solution of the above Schrödinger equation, with α, $\lambda_2$ and σ linked in the fashion
$$ \alpha = \frac{2}{\sigma^{2}}, \qquad \lambda_2 = -\frac{1}{\sigma^{4}}. $$
We can then evaluate the pertinent PDF as $f = \psi^2$. Thus, $I(f)$ and the $x^2$-moment turn out to be
$$ I = I(\langle x^2\rangle) = \frac{1}{\sigma^{2}}, \qquad \langle x^2\rangle = \sigma^{2}, $$
so that the so-called Cramér-Rao bound $I\sigma^2 \geq 1$ gets saturated, as it should for Gaussian distributions [4].
The reciprocity relations for the present situation are now seen to be given, as expected, by
$$ \frac{\partial\alpha}{\partial\lambda_2} = \frac{\partial\left(2\sqrt{|\lambda_2|}\right)}{\partial\lambda_2} = -\langle x^2\rangle, $$
$$ \frac{\partial I}{\partial\langle x^2\rangle} = -\frac{1}{\sigma^{4}} = \lambda_2. $$
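All of the relations in this section can be confirmed symbolically. The sketch below (ours) feeds the Gaussian above to sympy and checks the wave equation, the stated values of α and $\lambda_2$, and the saturation of the Cramér-Rao bound:

```python
# Sketch: symbolic check of the 1-D example with sympy. The Gaussian
# solves -(1/2) psi'' - (lambda_2/8) x^2 psi = (alpha/8) psi with
# alpha = 2/sigma^2, lambda_2 = -1/sigma^4, and I * <x^2> = 1.
import sympy as sp

x, sigma = sp.symbols('x sigma', positive=True)
psi = (2 * sp.pi)**sp.Rational(-1, 4) / sp.sqrt(sigma) \
      * sp.exp(-x**2 / (4 * sigma**2))
alpha = 2 / sigma**2
lam2 = -1 / sigma**4

residual = (-sp.Rational(1, 2) * sp.diff(psi, x, 2)
            - lam2 / 8 * x**2 * psi - alpha / 8 * psi)
print(sp.simplify(residual))                 # 0: psi solves the SWE

f = psi**2
I = sp.integrate(f * sp.diff(sp.log(f), x)**2, (x, -sp.oo, sp.oo))
x2 = sp.integrate(x**2 * f, (x, -sp.oo, sp.oo))
print(sp.simplify(I * x2))                   # 1: Cramer-Rao saturated
```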

8. Convexity

In order to construct a thermodynamics based upon I, it is also necessary to examine the convexity properties of I [20]. We prove below that I is a convex functional of the probability distribution p; therefore, I exhibits the desirable mixing property [20].
Let a, b be two non-negative real scalars such that $a + b = 1$, let $p_1$, $p_2$ be two normalized probability distributions, and consider
$$ \psi = \sqrt{a\,p_1} + i\sqrt{b\,p_2}, $$
so that
$$ p = |\psi|^{2} = a\,p_1 + b\,p_2, $$
is a third probability distribution whose associated Fisher Information for translation families reads
$$ I = 4\!\int d^{3}v \sum_{i=1}^{3}\left(\frac{\partial|\psi|}{\partial v_i}\right)^{2}. $$
Then, to investigate the convexity question, we must relate $I(p)$ to $a\,I(p_1) + b\,I(p_2)$. If we now set
$$ \psi(\boldsymbol v) = R(\boldsymbol v)\exp\!\left[i S(\boldsymbol v)\right], $$
with R and S two real functions, we immediately find
$$ I = 4\!\int d^{3}v \sum_{i=1}^{3}\left(\frac{\partial R}{\partial v_i}\right)^{2}. $$
Now, it is easy from (47) to see that
$$ \frac{\partial\psi}{\partial v_i} = \frac{1}{2}\left[\sqrt{\frac{a}{p_1}}\,\frac{\partial p_1}{\partial v_i} + i\sqrt{\frac{b}{p_2}}\,\frac{\partial p_2}{\partial v_i}\right], $$
so that
$$ \sum_{i=1}^{3}\left|\frac{\partial\psi}{\partial v_i}\right|^{2} = \frac{1}{4}\sum_{i=1}^{3}\left[\frac{a}{p_1}\left(\frac{\partial p_1}{\partial v_i}\right)^{2} + \frac{b}{p_2}\left(\frac{\partial p_2}{\partial v_i}\right)^{2}\right], $$
which implies
$$ 4\!\int d^{3}v \sum_{i=1}^{3}\left|\frac{\partial\psi}{\partial v_i}\right|^{2} = a\, I(p_1) + b\, I(p_2). $$
On the other hand, by using (50) we see that
$$ 4\!\int d^{3}v \sum_{i=1}^{3}\left|\frac{\partial\psi}{\partial v_i}\right|^{2} = I(p) + 4\!\int d^{3}v\, |\psi|^{2}\sum_{i=1}^{3}\left(\frac{\partial S}{\partial v_i}\right)^{2}. $$
The integral on the right-hand side of the preceding equation is clearly $\geq 0$, which allows one to assert [25] that
$$ I(a\,p_1 + b\,p_2) \leq a\, I(p_1) + b\, I(p_2); $$
i.e., Fisher information for translational families is indeed a convex functional of the probability distributions. The mixture $a\,p_1 + b\,p_2$ in Equation (56) represents the net probability, after mixing, of two distinct systems. It should be mentioned that the approach can be generalized, in the same fashion, to a mixture theorem for N systems. The inequality (56) is a special instance of Fisher's I-theorem
$$ \frac{dI}{dt} \leq 0, $$
predicted in [3] and proved in [10,11,12].
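A quick numerical illustration of inequality (56) (our sketch, in one dimension with two Gaussians of different means and widths):

```python
# Sketch: numerical check of the convexity inequality (56) in 1-D,
# I(a p1 + b p2) <= a I(p1) + b I(p2), for two Gaussian PDFs.
import numpy as np

v = np.linspace(-12.0, 12.0, 24001)

def gauss(mu, s):
    return np.exp(-(v - mu)**2 / (2 * s**2)) / np.sqrt(2 * np.pi * s**2)

def fisher(p):
    # shift-invariant FIM: int p (d ln p / dv)^2 dv
    dlnp = np.gradient(np.log(p), v)
    return np.trapz(p * dlnp**2, v)

p1, p2 = gauss(-1.0, 0.8), gauss(1.5, 1.6)
a, b = 0.3, 0.7
lhs = fisher(a * p1 + b * p2)
rhs = a * fisher(p1) + b * fisher(p2)
print(lhs, "<=", rhs, ":", lhs <= rhs)   # True: convexity holds
```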

9. Conclusions

We have reviewed here the steps necessary to prove that a Fisher-based thermodynamics exists. The question is not trivial, since we do not have at hand a closed analytical expression for the probability distribution function that extremizes the Fisher measure subject to appropriate constraints, but must obtain it via the solutions of a Schrödinger-like equation. This makes things more involved than in the Shannon instance, but more general as well, since it allows one to deal with equilibrium and off-equilibrium scenarios on an equal footing, as we have endeavored to explain here.

Acknowledgements

F. Olivares is supported by a Fellowship of the Chilean Government, CONICYT. M. Casas is funded by the Spanish Ministry of Science and Innovation (Project FIS2008-00781) and by FEDER funds (EU).

References

  1. Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620–630. [Google Scholar] [CrossRef]
  2. Katz, A. Principles of Statistical Mechanics: The Information Theory Approach; Freeman and Co.: San Francisco, CA, USA, 1967. [Google Scholar]
  3. Frieden, B.R. Fisher information, disorder, and the equilibrium distribution of physics. Phys. Rev. A 1990, 41, 4265–4276. [Google Scholar] [CrossRef] [PubMed]
  4. Frieden, B.R. Physics from Fisher Information Measure; Cambridge University Press: Cambridge, UK, 1998. [Google Scholar]
  5. Frieden, B.R.; Soffer, B.H. Lagrangians of physics and the game of Fisher-information transfer. Phys. Rev. E 1995, 52, 2274–2286. [Google Scholar] [CrossRef]
  6. Frieden, B.R.; Plastino, A.; Plastino, A.R.; Soffer, B. Fisher-based thermodynamics: Its Legendre transform and concavity properties. Phys. Rev. E 1999, 60, 48–53. [Google Scholar] [CrossRef]
  7. Wheeler, J.A. Complexity, Entropy and the Physics of Information; Zurek, W.H., Ed.; Addison Wesley: New York, NY, USA, 1991; pp. 3–28. [Google Scholar]
  8. Nikolov, B.; Frieden, B.R. Limitation on entropy increase imposed by Fisher information. Phys. Rev. E 1994, 49, 4815–4820. [Google Scholar] [CrossRef]
  9. Flego, S.P.; Frieden, B.R.; Plastino, A.; Plastino, A.R.; Soffer, B. Nonequilibrium thermodynamics and Fisher information: Sound wave propagation in a dilute gas. Phys. Rev. E 2003, 68, 016105. [Google Scholar] [CrossRef]
  10. Plastino, A.R.; Plastino, A. Symmetries of the Fokker-Planck equation and the Fisher-Frieden arrow of time. Phys. Rev. E 1996, 54, 4423–4426. [Google Scholar] [CrossRef]
  11. Plastino, A.; Plastino, A.R.; Miller, H.G. On the relationship between the Fisher-Frieden-Soffer arrow of time, and the behaviour of the Boltzmann and Kullback entropies. Phys. Lett. A 1997, 235, 129–134. [Google Scholar] [CrossRef]
  12. Plastino, A.R.; Casas, M.; Plastino, A. Fisher’s information, Kullback’s measure, and H-theorems. Phys. Lett. A 1998, 246, 498–504. [Google Scholar] [CrossRef]
  13. Olivares, F.; Pennini, F.; Plastino, A. Phase space distribution from variation of information measures. Phys. A 2010, 389, 2218–2226. [Google Scholar] [CrossRef]
  14. Zamir, R. A proof of the Fisher information inequality via a data processing argument. IEEE Trans. Inform. Theory 1998, 44, 1246–1250. [Google Scholar] [CrossRef]
  15. Huber, P.J.; Ronchetti, E.M. Robust Statistics; Wiley: New York, NY, USA, 2009. [Google Scholar]
  16. Silver, R.N. E. T. Jaynes: Physics and Probability; Grandy, W.T., Jr., Milonni, P.W., Eds.; Cambridge University Press: Cambridge, UK, 1992. [Google Scholar]
  17. Richards, P.I. Manual of Mathematical Physics; Pergamon Press: London, UK, 1959; p. 342. [Google Scholar]
  18. Jaynes, E.T. Statistical Physics; Ford, W.K., Ed.; Benjamin: New York, NY, USA, 1963. [Google Scholar]
  19. Martinez, S.; Pennini, F.; Plastino, A.; Tessone, C. On the equipartition and virial theorems. Phys. A 2002, 305, 48–51. [Google Scholar] [CrossRef]
  20. Desloge, E.A. Thermal Physics; Holt, Rinehart and Winston: New York, NY, USA, 1968. [Google Scholar]
  21. Tannor, D.J. Introduction to Quantum Mechanics: A Time-Dependent Perspective; University Science Books: South Orange, NJ, USA, 2007. [Google Scholar]
  22. It is important to remark that Hermite-Gaussian polynomials are orthogonal with respect to a Gaussian kernel, i.e., the equilibrium distribution. No other set of functions is orthogonal (and complete) with respect to a Gaussian kernel function.
  23. Hernando, A.; Vesperinas, C.; Plastino, A. Fisher-information and the thermodynamics of scale-invariant systems. Phys. A 2010, 389, 490–498. [Google Scholar] [CrossRef]
  24. Zemansky, M.W.; Dittmann, R.H. Heat and Thermodynamics; McGraw-Hill: London, UK, 1981. [Google Scholar]
  25. Boekee, D.E. A Generalization of the Fisher Information Measure; Delft University Press: Delft, The Netherlands, 1977. [Google Scholar]
