Article

Tsallis Relative Entropy and Anomalous Diffusion

by Janett Prehl 1, Christopher Essex 2 and Karl Heinz Hoffmann 1,*

1 Institute of Physics, Chemnitz University of Technology, D-09107 Chemnitz, Germany
2 Department of Applied Mathematics, University of Western Ontario, Middlesex College, London, ON N6A 5B7, Canada
* Author to whom correspondence should be addressed.
Entropy 2012, 14(4), 701-716; https://doi.org/10.3390/e14040701
Submission received: 1 March 2012 / Revised: 19 March 2012 / Accepted: 30 March 2012 / Published: 10 April 2012
(This article belongs to the Special Issue Tsallis Entropy)

Abstract: In this paper we utilize the Tsallis relative entropy, a generalization of the Kullback–Leibler entropy in the framework of non-extensive thermodynamics, to analyze the properties of anomalous diffusion processes. Anomalous (super-)diffusive behavior can be described by fractional diffusion equations, in which the second-order space derivative is extended to fractional order α ∈ (1, 2). These equations represent a bridging regime: for α = 2 one obtains the diffusion equation, while for α = 1 the (half) wave equation is recovered. The fractional diffusion equations are solved by so-called stable distributions, which exhibit heavy tails and skewness. In contrast to the Shannon or Tsallis entropies of these distributions, the Kullback–Leibler and Tsallis relative entropies, taken relative to the pure diffusion case, induce a natural ordering of the stable distributions consistent with the ordering implied by the pure diffusion and wave limits.
PACS Classification:
05.70.-a; 05.70.Ln; 05.40.Fb; 05.40.Jc

1. Introduction

Anomalous diffusion processes differ from regular diffusion in that the dispersion of particles proceeds faster (superdiffusion) or slower (subdiffusion) than in the regular case. Physical examples of anomalous diffusion processes are observed, for instance, in porous media [1,2], in biological tissues [3,4,5], and in chemical systems [6]; other examples are superdiffusive processes like target search (for instance the foraging of albatrosses) [7,8,9] or turbulent diffusion [10,11]. In order to describe anomalous diffusive processes analytically, a number of different evolution equations based on fractional derivatives [12,13,14,15,16,17,18,19,20] have been developed to describe the spreading of the probability of finding a particle at a certain distance from the origin of the diffusive process. While some descriptions use non-linear dependencies on the probability density function (PDF) (see for instance E. K. Lenzi [21,22]), we here focus on the space-fractional diffusion equation
$$ \frac{\partial}{\partial t} P(x,t) = D\, \frac{\partial^{\alpha}}{\partial x^{\alpha}} P(x,t) \qquad (1) $$

where the PDF P(x, t) is defined on −∞ < x < ∞ and 0 ≤ t < ∞, and D ≥ 0 is the diffusion constant.
Here we focus on the space-fractional diffusion equation not as a modeling tool for a highly interesting class of superdiffusion processes with remarkable features [23,24], but as a bridge linking the usually unrelated regular diffusion equation (a paradigm for fully irreversible processes) to the wave equation (a paradigm for completely reversible processes). To this end the parameter α must vary between 1 (the (half) wave case) and 2 (the diffusion case).
This bridge thus not only provides a continuous mapping between diffusion and waves but also a continuously ordered sequence of equations between them. This in turn implies a continuous sequence of PDFs running from the Gaussian density to the delta function limit of the wave equation. That sequence provides an inherent ordering by which we can compare any two members of the bridging PDF family and say which is “closer” to pure diffusion than the other; in that sense we can say which PDF is “closer” to a Gaussian. This ordering, as mentioned, is the most rudimentary but essential property of the notion of a distance, without specifying any metrical property. We will refer to it as the “bridge ordering”.
We have been exploring this specific bridging domain between irreversible and reversible processes and found that it has some surprising properties with broader implications. One such property is known as the entropy production paradox [23,24,25,26]: contrary to the ordering implied by the fractional exponent α (the bridge ordering), the entropy production does not decrease as the reversible end of the bridging regime is approached. The same behavior was observed for a different one-parameter path defining a continuous sequence of fractional differential equations connecting the diffusion equation to the wave equation, based on time-fractional diffusion equations [25,26]. Over several papers [23,24,25,26] it became increasingly clear that both distinct families exhibit remarkably comparable behavior from a “thermodynamic” viewpoint. In particular, entropies (classical, Tsallis, and Rényi) and their rates exhibit common features, suggesting that the bridge ordering of the fractional diffusion equations is not compatible with the ordering induced by the increase or decrease of these entropies.
We thus turned to the problem of ordering on a more fundamental basis. Ordering is not an inherent property of functions or vectors. One can of course impose an ordering, but generally there is no absolute ranking or ordering for functions or finite-dimensional vectors. One can alter any imposed ordering simply by altering how the functions or vectors are mapped to ℝ. With one scheme the ordering can call for a certain pair of functions to be ranked in one manner, while with another scheme the ranking can as easily be reversed. The reversal property can be illustrated simply with two row vectors a = (2.1, 0) and b = (2, 1). In terms of the ∞-norm, 2.1 = ‖a‖_∞ > ‖b‖_∞ = 2, while √5 = ‖b‖₂ > ‖a‖₂ = 2.1. The ordering is reversed without altering the vectors in any way, simply by changing the norm. Generally, if some exterior criterion imposes a particular mapping scheme, then the ordering may only make sense in terms of that particular scheme.
Ordering may also be used as a rudimentary form of distance, serving as an alternative to discussing “near” and “far”: due to the ordering, things between an object and a common reference are “closer” to the reference, independent of any metrical structure. Of course metric structure, like ordering, is not inherent between functions or vectors either. But ordering will often serve to discuss distance in a broader context, as often happens in physical applications. For example, in the framework of thermodynamical applications one may speak of physical systems being near or far from equilibrium, or of one process being more “irreversible” than another. Obviously such vague statements need specific frameworks to avoid unanticipated reversals arising from changes in context.
In this paper we will analyze the ordering implied by the Kullback–Leibler entropy, or relative entropy, and its extension in the non-extensive framework, the Tsallis relative entropy. In practice we compute the relative entropy and its Tsallis generalization as functions of the bridging parameter α. If there are no maxima or minima in the resulting graphs over the bridging interval, then the ordering is consistent between the relative-entropy picture and the bridging picture. Tsallis relative entropies come into play because the Kullback–Leibler entropy can be divergent when comparing Gaussian PDFs to the fat-tailed distributions emerging in the bridging regime. This problem does not arise for Tsallis relative entropies. Moreover, the broader class of relative entropies lends robustness to the basic notion of entropy. And insofar as the various relative entropies are physically consistent, one should expect them to have significance for the physically meaningful questions of irreversibility versus reversibility.
The strategy of this paper is to specify the equations of the bridging regime and the resulting PDFs. We limit the discussion to the space-fractional case for simplicity and brevity. The relevant properties of the family of PDFs, noting the ordering with respect to the parameter α, are presented. We define the relative entropies and the Tsallis generalization, and then set up the ordering in terms of the relative entropy using the irreversible (Gaussian) case as the reference PDF. This is the only sensible choice, as at the wave limit the family of bridging PDFs goes to a delta function. The cases where the relative entropy diverges and converges are outlined. Finally, the graphs comparing the orderings are presented.

2. Introduction to Relative Entropies

What is now known as the Kullback entropy [27], Kullback–Leibler entropy, relative entropy, or information loss is a measure for comparing two probability distributions given on the same domain. While an information-theoretic concept, it is also intimately connected to properties of thermodynamic systems [28]. The relative entropy is typically used in reliability analysis [29], which is important in robust dynamic pricing problems [30] and keystroke dynamics [31], to compare dynamical systems [32] and Markov models [33], and to measure their complexity, called thermodynamic depth [34,35]. Furthermore, the relative entropy is an important measure in quantum information theory [36,37], quantum mechanics [38,39], computer graphics [40], and ecology [41].
The Kullback–Leibler entropy K ( P , P 0 ) is here defined as [27,28,42]
$$ K(P, P_0) \equiv \int_{-\infty}^{\infty} P(x) \ln\frac{P(x)}{P_0(x)}\, dx = \int_{-\infty}^{\infty} P(x) \ln P(x)\, dx - \int_{-\infty}^{\infty} P(x) \ln P_0(x)\, dx = -S(P) + \tilde{K}(P, P_0) \qquad (2) $$

where the first term is the negative Shannon entropy, −S(P), and K̃(P, P₀) = −∫ P(x) ln P₀(x) dx denotes the cross entropy. K(P, P₀) is a (positive) measure of the information gain possible if a probability distribution P(x) is described by an encoding optimal for P(x) rather than for a reference distribution P₀(x).
In 1998 C. Tsallis introduced a generalization of the Kullback–Leibler entropy in the framework of non-extensive thermodynamics [42,43,44], the Tsallis relative entropy or q-relative entropy, which is given as
$$ T_q(P, P_0) \equiv \int_{-\infty}^{\infty} P(x)\, \frac{\left[P(x)/P_0(x)\right]^{q-1} - 1}{q-1}\, dx = \frac{1}{q-1}\left[\int_{-\infty}^{\infty} P(x) \left(\frac{P(x)}{P_0(x)}\right)^{\!q-1} dx - 1\right] = \frac{1}{q-1}\left[\int_{-\infty}^{\infty} P(x)^{q}\, P_0(x)^{1-q}\, dx - 1\right] \qquad (3) $$
For uniform P₀(x) the Tsallis relative entropy reduces to the (negative) Tsallis entropy S_q^T(P); for details see [24].
Reference [45] indicates that both relative entropies, the Kullback–Leibler and the Tsallis relative entropy, are useful for finding approximate time-dependent solutions of fractional diffusion (or Fokker–Planck) equations. In the limit q → 1 the Tsallis relative entropy becomes the Kullback–Leibler entropy. Both relative entropies are not symmetric, i.e., K(P, P₀) ≠ K(P₀, P) and T_q(P, P₀) ≠ T_q(P₀, P).
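For illustration, definitions (2) and (3) take the following form for discrete distributions. This is a minimal numerical sketch, not part of the analysis below; the distributions and values are purely illustrative, and it also shows the q → 1 limit and the asymmetry:

```python
# Minimal sketch: discrete versions of the Kullback-Leibler entropy (2) and
# the Tsallis relative entropy (3); illustrative values only.
import numpy as np

def kullback_leibler(p, p0):
    """K(P, P0) = sum_x P(x) ln(P(x)/P0(x))."""
    p, p0 = np.asarray(p, float), np.asarray(p0, float)
    mask = p > 0                      # terms with P(x) = 0 contribute nothing
    return np.sum(p[mask] * np.log(p[mask] / p0[mask]))

def tsallis_relative(p, p0, q):
    """T_q(P, P0) = (sum_x P(x)^q P0(x)^(1-q) - 1) / (q - 1), q != 1."""
    p, p0 = np.asarray(p, float), np.asarray(p0, float)
    return (np.sum(p**q * p0**(1.0 - q)) - 1.0) / (q - 1.0)

p  = np.array([0.5, 0.3, 0.2])
p0 = np.array([0.4, 0.4, 0.2])
print(kullback_leibler(p, p0))           # ~0.0253
print(tsallis_relative(p, p0, 0.999))    # approaches the same value as q -> 1
print(kullback_leibler(p0, p))           # differs: the measure is asymmetric
```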

3. Space-Fractional Diffusion Equation and Stable Distributions

In the one-dimensional space-fractional diffusion equation (1) the fractional differential operator ∂^α/∂x^α P(x) is defined via the Fourier transformation F{P} = ∫_{−∞}^{∞} P(x) e^{ikx} dx as

$$ \frac{\partial^{\alpha}}{\partial x^{\alpha}} P(x) = F^{-1}\left\{ (ik)^{\alpha}\, F\{P(x)\} \right\} \qquad (4) $$
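Definition (4) suggests a direct spectral evaluation. The following is a sketch under stated assumptions: a periodic grid, and NumPy's FFT, which uses the e^{−ikx} sign convention, opposite to the transform F above, so the multiplier (ik)^α turns into (−ik)^α; the helper name frac_dx is ours:

```python
# Sketch: spectral application of the fractional operator (4) on a periodic
# grid. NumPy's FFT convention is e^{-ikx}, opposite to F above, so the
# multiplier (ik)^alpha above becomes (-ik)^alpha here.
import numpy as np

def frac_dx(P, dx, alpha):
    k = 2.0 * np.pi * np.fft.fftfreq(len(P), d=dx)  # angular wavenumbers
    mult = (-1j * k) ** alpha                       # principal branch
    return np.fft.ifft(mult * np.fft.fft(P)).real

# Consistency check: at alpha = 2 the operator reduces to d^2/dx^2.
x = np.linspace(-20.0, 20.0, 4096, endpoint=False)
P = np.exp(-x**2)
d2_spectral = frac_dx(P, x[1] - x[0], 2.0)
d2_exact = (4.0 * x**2 - 2.0) * np.exp(-x**2)
print(np.max(np.abs(d2_spectral - d2_exact)))       # tiny (spectral accuracy)
```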
Choosing the initial distribution to be the δ-function
$$ P(x, t=0) = \delta(x) \qquad (5) $$
we can determine the solution of the space-fractional diffusion equation (1) in terms of a stable distribution S(x | α, β, γ, δ_n; n), using the definition given in [46], and represent it as

$$ P_{\alpha}(x,t) = S\big(x \mid \alpha,\, 1,\, (D_{\alpha} t)^{1/\alpha},\, 0;\, 1\big) \qquad (6) $$

where D_α = −D cos(απ/2) and the parameters are chosen appropriately as

$$ \beta = 1 \qquad (7) $$
$$ \gamma = \left(-D t \cos\frac{\alpha\pi}{2}\right)^{1/\alpha} = (D_{\alpha} t)^{1/\alpha} \equiv \gamma_{\alpha} \qquad (8) $$
$$ \delta_1 = 0 \qquad (9) $$
$$ n = 1 \qquad (10) $$
We stress that (6) is a solution to the fractional diffusion equation (1) only for α ∈ (1, 2]. In the fully asymmetric case of interest here, β = 1, we find in the limit α → 1 that the stable distribution has the scale parameter (dispersion indicator) γ = (−Dt cos(απ/2))^{1/α} → 0 and the mode (i.e., the maximum of the distribution) x̂_α → −Dt. This represents a δ-function moving in time, centered at −Dt, which is the one-sided solution of the wave equation with the initial distribution (5).
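The solution (6) is straightforward to evaluate numerically; below is a minimal sketch, assuming SciPy's levy_stable implementation of stable laws (its default parametrization corresponds to the n = 1 form used here):

```python
# Sketch: evaluate P_alpha(x,t) of Eq. (6), assuming SciPy's levy_stable
# (its default "S1" parametrization matches the n = 1 form used here).
import numpy as np
from scipy.stats import levy_stable

def p_alpha(x, t, alpha, D=1.0):
    """S(x | alpha, 1, (D_alpha t)^(1/alpha), 0; 1) with
    D_alpha = -D cos(alpha*pi/2) > 0 for alpha in (1, 2]."""
    gamma_a = (-D * np.cos(alpha * np.pi / 2.0) * t) ** (1.0 / alpha)
    return levy_stable.pdf(x, alpha, 1.0, loc=0.0, scale=gamma_a)

x = np.array([-2.0, 0.0, 2.0])
print(p_alpha(x, 1.0, 1.5))   # skewed, with a heavy power-law right tail
print(p_alpha(x, 1.0, 2.0))   # Gaussian limit: exp(-x**2/4)/sqrt(4*pi)
```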
The stable distribution S(x | α, β, γ, δ_n; n) is defined on the whole real axis with α ∈ (0, 2], β ∈ [−1, 1], γ ≥ 0, and δ_n ∈ ℝ [46]. Its definition is based on the characteristic function, i.e., on a Fourier transformation. In the literature different definitions exist for the same choice of the parameters; these are indexed by n [46], and here we will use only those with n = 0 and n = 1. For α ∈ (1, 2] the characteristic function is given in the parametrization n = 0 by
$$ F\{S(x \mid \alpha, \beta, \gamma, \delta_0; 0)\} = \exp\left( -\gamma^{\alpha} |k|^{\alpha} \left[ 1 + i\beta \tan\frac{\alpha\pi}{2}\, \mathrm{sign}(k) \left( |\gamma k|^{1-\alpha} - 1 \right) \right] + i \delta_0 k \right) \qquad (11) $$
while for n = 1 the characteristic function is
$$ F\{S(x \mid \alpha, \beta, \gamma, \delta_1; 1)\} = \exp\left( -\gamma^{\alpha} |k|^{\alpha} \left[ 1 - i\beta \tan\frac{\alpha\pi}{2}\, \mathrm{sign}(k) \right] + i \delta_1 k \right) \qquad (12) $$
In these definitions the characteristic exponent α (or index of stability) and the skewness parameter β determine the form of the distribution. The scale parameter γ is a measure of the dispersion of the distribution, and the location parameter δ_n is related to the positions of the mean and the mode.
In the parametrizations n = 0 and n = 1 the parameters α, β, and γ are the same, but the location parameter δ_n differs. In order to distinguish between the two parametrizations we supplement the location parameter δ with the index n. Then, for S(x | α, β, γ, δ₀; 0) = S(x | α, β, γ, δ₁; 1), we have the relation

$$ \delta_1 = \delta_0 - \beta \gamma \tan\frac{\alpha\pi}{2} \qquad (13) $$

which allows one to shift between the two parametrizations.
Furthermore, the following useful mathematical properties of stable distributions are known [24,46]:
• The stable distribution rescales as
  $$ S(x \mid \alpha, \beta, \gamma, \delta_n; n) = \frac{1}{\gamma}\, S\!\left( \frac{x - \delta_n}{\gamma} \,\Big|\, \alpha, \beta, 1, 0; n \right) \qquad (14) $$
• For n = 0, 1 stable distributions satisfy the reflection property [47]
  $$ S(x \mid \alpha, \beta, 1, 0; n) = S(-x \mid \alpha, -\beta, 1, 0; n) \qquad (15) $$
• The position of the mean μ of the distribution is given by
  $$ \mu = \delta_1 = \delta_0 - \beta \gamma \tan\frac{\pi\alpha}{2} \qquad (16) $$
• Stable distributions are unimodal. The mode x̂_α depends on the parametrization and on the location parameter. For S(x | α, β, γ, δ₀; 0) = S(x | α, β, γ, δ₁; 1) it is given by
  $$ \hat{x}_{\alpha} = \gamma\, m(\alpha, \beta) + \delta_0 = \gamma\, m(\alpha, \beta) + \delta_1 + \beta \gamma \tan\frac{\pi\alpha}{2} \qquad (17) $$
  in the range α ∈ (1, 2], where m(α, β) is a function which, for the case of interest below (β = 1), stays bounded between 0 and 1 and can be determined numerically.
• For α ∈ (0, 2) the stable distributions do not have a finite second moment, i.e., no finite variance.
From (16) and (17) we find that in the n = 1 parametrization the location parameter indicates the mean of the distribution, whereas in the n = 0 parametrization it indicates where the mode, and thus the bulk of the probability, is located. Note that for γ → 0 the location parameter gives the mode.
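These properties are easy to verify numerically; a small sketch (under the same SciPy assumption as above) checks the reflection property (15) and the location shift (13):

```python
# Sketch: numerical check of the reflection property (15) and the shift (13),
# assuming SciPy's levy_stable in its default n = 1 parametrization.
import numpy as np
from scipy.stats import levy_stable

alpha, beta, gamma = 1.5, 0.5, 1.0
x = np.linspace(-3.0, 3.0, 7)

# Reflection (15): S(x|alpha,beta,1,0;1) = S(-x|alpha,-beta,1,0;1)
print(np.allclose(levy_stable.pdf(x, alpha, beta),
                  levy_stable.pdf(-x, alpha, -beta)))     # True

# Shift (13): delta_1 = delta_0 - beta*gamma*tan(alpha*pi/2);
# by (16) this is also the mean mu (here tan(3*pi/4) = -1, so delta_1 = 0.5).
delta0 = 0.0
delta1 = delta0 - beta * gamma * np.tan(alpha * np.pi / 2.0)
print(delta1)
```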
Although the inverse Fourier transforms of (11) and (12) are in general not known in closed form, it is possible to give the asymptotic tail behavior [46]. In general, for α ∈ (0, 2) and β ∈ (−1, 1), the left and right tails follow asymptotic power laws:
$$ S(x \mid \alpha, \beta, \gamma, \delta_0; 0) \sim \alpha \gamma^{\alpha} c_{\alpha} (1 + \beta)\, x^{-(\alpha+1)}, \quad x \to \infty, \; \alpha \in (0,2), \; \beta \in (-1, 1] \qquad (18) $$
$$ S(x \mid \alpha, \beta, \gamma, \delta_0; 0) \sim \alpha \gamma^{\alpha} c_{\alpha} (1 - \beta)\, (-x)^{-(\alpha+1)}, \quad x \to -\infty, \; \alpha \in (0,2), \; \beta \in [-1, 1) \qquad (19) $$
with c_α = sin(απ/2) Γ(α)/π, Γ(α) being the Gamma function. In the fully asymmetric cases the remaining tail behaves differently: the left tail for β = 1 and the right tail for β = −1. Here we give the tail behavior for the case β = 1, as this case represents the solution of the space-fractional diffusion equation (1):
$$ S(x \mid \alpha, 1, 1, 0; 1) \sim C_1\, (-x)^{\frac{2-\alpha}{2\alpha-2}} \exp\left( -C_2\, (-x)^{\frac{\alpha}{\alpha-1}} \right), \quad x \to -\infty, \; \alpha \in (1, 2) \qquad (20) $$
with the constants
$$ C_1 = \frac{1}{\sqrt{2\pi\,|1-\alpha|}} \left( \frac{\alpha}{|\cos(\alpha\pi/2)|} \right)^{\frac{1}{2-2\alpha}} \quad \text{and} \quad C_2 = \frac{|1-\alpha|}{\alpha} \left( \frac{\alpha}{|\cos(\alpha\pi/2)|} \right)^{\frac{1}{1-\alpha}} $$
which are both positive. The β = −1 case can be obtained via the reflection property (15).
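As a numerical illustration of the right-tail power law (18), consider the following sketch (under the same SciPy assumption as above; the location shift between the n = 0 and n = 1 parametrizations does not affect the leading tail behavior):

```python
# Sketch: the stable PDF approaches the power law (18) in the right tail.
import numpy as np
from scipy.stats import levy_stable
from scipy.special import gamma as Gamma

alpha = 1.5
c_alpha = np.sin(alpha * np.pi / 2.0) * Gamma(alpha) / np.pi
x = np.array([10.0, 30.0, 100.0])

exact = levy_stable.pdf(x, alpha, 1.0)                     # S(x|alpha,1,1,0;1)
power_law = alpha * c_alpha * 2.0 * x ** (-(alpha + 1.0))  # (18), beta = gamma = 1
print(exact / power_law)   # ratios tend to 1 as x grows
```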

4. Reference Distribution

The aim of this paper is to analyze the Kullback–Leibler entropy and its Tsallis generalization as a means to establish an ordering of the solutions P_α, and in particular to study its compatibility with the bridge ordering expressed by α. As a reference we will use the solution of the α = 2 regular diffusion case, for which the PDF is the well-known Gaussian. For α = 2 the space-fractional diffusion equation (1) reduces to the normal diffusion equation; correspondingly the stable distribution S(x | α, 1, (D_α t)^{1/α}, 0; 1) becomes the (Gaussian) normal distribution
$$ P_D(x,t) = S\big(x \mid 2,\, 1,\, \sqrt{Dt},\, 0;\, 1\big) = N(0, 2Dt) = \frac{1}{2\sqrt{\pi D t}}\, \exp\left(-\frac{x^2}{4Dt}\right) \qquad (21) $$

where N(μ, σ²) is the normal distribution with mean μ and variance σ²; here μ = 0 and σ² = 2Dt, so the standard deviation is σ = √(2Dt) [46].
In Figure 1 a comparison plot is given, in which the stable distribution S(x | α, 1, (D_α t)^{1/α}, 0; 1) is depicted over x for α = 2, 1.5, 1.2; here t = 1 and D = 1 are chosen.

Figure 1. The solution P_α(x, t) for t = 1 and D = 1, shown for different values of α = 2.0, 1.5, and 1.2. Note that for α = 2 the notation P_D(x, t) is used in the following.

5. Kullback–Leibler Entropy

In this section we will use the Kullback–Leibler entropy as a comparison measure for the solutions of the space-fractional diffusion equation for different α. We thus aim to calculate K ( P α , P D ) and K ( P D , P α ) .
While for standard distributions falling off fast enough as x approaches ±∞ this poses no problem, we are here concerned with heavy-tailed distributions, for which we already know that higher moments do not exist. Thus we first have to address the question of whether the Kullback–Leibler entropy exists at all. We discuss the convergence question using a generic Gaussian P_G = N(μ, σ²) and a generic S(x | α, 1, γ, 0; 1) with as yet unspecified γ. At the end we will replace γ by γ_α.
As our aim is to establish an ordering of the distributions obtained for different α, we are interested in K(P_α, P_G) as well as K(P_G, P_α). We start with K(P_α, P_G) and find
$$ K(P_{\alpha}, P_G) = \int_{-\infty}^{\infty} P_{\alpha}(x,t) \ln\frac{P_{\alpha}(x,t)}{P_G(x,t)}\, dx = \int_{-\infty}^{\infty} P_{\alpha}(x,t) \ln P_{\alpha}(x,t)\, dx - \int_{-\infty}^{\infty} P_{\alpha}(x,t) \ln P_G(x,t)\, dx \qquad (22) $$
The first integral of Equation (22) converges, as shown in [24], whereas we will show below that the second integral does not converge, so that K(P_α, P_G) does not exist. We mention here that escort distributions provide an interesting route to the analysis of heavy-tailed distributions in the context of the Kullback–Leibler and q-relative entropies; for details see [48]. Their implications for the ordering of the distributions and its comparison to the bridge ordering are beyond the scope of this paper, but they certainly provide an interesting starting point for further research.
For this and the following analysis we divide the integrals into three parts: the left tail, the middle part, and the right tail. Due to the known asymptotic tail behavior (18)–(20), the left- and right-tail integrals can be analyzed analytically, whereas the middle part is always finite and has to be determined numerically. We split the integration domain for an arbitrary f(x, t) as follows:
$$ \int_{-\infty}^{\infty} f(x,t)\, dx = \int_{-\infty}^{x_-} f(x,t)\, dx + \int_{x_-}^{x_+} f(x,t)\, dx + \int_{x_+}^{\infty} f(x,t)\, dx \qquad (23) $$
where x₋ < 0 and x₊ > 0 are chosen such that, for the numerical treatment, the stable distributions can be well approximated by their asymptotic tail behavior. Accordingly, we have
$$ K(P_{\alpha}, P_G) = K^{-}(P_{\alpha}, P_G) + K^{\pm}(P_{\alpha}, P_G) + K^{+}(P_{\alpha}, P_G) \qquad (24) $$
where the superscripts −, ±, and + denote the left-tail, middle, and right-tail contributions, respectively. We start with K^+(P_α, P_G). Taking the logarithm of P_G(x, t) we find
$$ \ln P_G(x,t) = -\ln\left(\sqrt{2\pi}\,\sigma\right) - \frac{(x-\mu)^2}{2\sigma^2} \qquad (25) $$
which leads to
$$ K^{+}(P_{\alpha}, P_G) = \int_{x_+}^{\infty} P_{\alpha} \ln P_{\alpha}\, dx + \int_{x_+}^{\infty} \alpha \gamma^{\alpha} c_{\alpha}\, x^{-(\alpha+1)} \left[ \ln\!\left(\sqrt{2\pi}\,\sigma\right) + \frac{(x-\mu)^2}{2\sigma^2} \right] dx \qquad (26) $$
for the right-tail integral, where the first integral converges [24]. The range of α here is α ∈ (1, 2); therefore the exponent of x in the dominant term of the second integrand in (26) is −(α + 1) + 2 = 1 − α > −1, and thus the integral does not converge. This is compatible with the non-existence of the second moment of the stable distributions discussed here. As a result we cannot use K(P_α, P_G), and thus K(P_α, P_D), as a basis for defining a distance measure on the distributions in the transition regime between reversible and irreversible processes.
As the Kullback–Leibler entropy is an asymmetric measure, the non-existence of K(P_α, P_G) does not imply the non-existence of K(P_G, P_α), which we now analyze:
$$ K(P_G, P_{\alpha}) = \int_{-\infty}^{\infty} P_G(x,t) \ln\frac{P_G(x,t)}{P_{\alpha}(x,t)}\, dx = \int_{-\infty}^{\infty} P_G(x,t) \ln P_G(x,t)\, dx - \int_{-\infty}^{\infty} P_G(x,t) \ln P_{\alpha}(x,t)\, dx \qquad (27) $$
The first integral yields
$$ \int_{-\infty}^{\infty} P_G(x,t) \ln P_G(x,t)\, dx = -\frac{1}{2}\left( 1 + \ln\left(2\pi\sigma^2\right) \right) \qquad (28) $$
which is finite.
For the further discussion below we recall that
$$ \int_{x_+}^{\infty} x^{a}\, e^{-c x^{d}}\, dx < \infty \quad \text{and} \quad \int_{x_+}^{\infty} \ln(x)\, e^{-c x^{d}}\, dx < \infty \qquad (29) $$
for any given c > 0, d > 0, and arbitrary but fixed a. The logarithm ln P_α(x, t) in the second integral of (27) is obtained from the asymptotic tail behavior. For the right tail we get
$$ \ln P_{\alpha}(x_+, t) = \ln(2\alpha c_{\alpha}) + \ln \gamma^{\alpha} - (\alpha + 1) \ln x \qquad (30) $$
which, multiplied by P_G and integrated, converges according to (29). For the left tail we have
$$ \ln P_{\alpha}(x_-, t) = \ln C_1 + \frac{2-\alpha}{2\alpha-2} \ln\left(\frac{-x}{\gamma}\right) - C_2 \left(\frac{-x}{\gamma}\right)^{\frac{\alpha}{\alpha-1}} \qquad (31) $$
Multiplied by P_G, one finds from (29) that the integrals converge for α ∈ (1, 2).
As a result, K(P_G, P_α) and thus K(P_D, P_α) can provide a new basis for comparison between the probability distributions for different α. In Figure 2 we show K(P_D, P_α) as a function of α for time t = 1. The monotonic dependence indicates that K(P_D, P_α) provides the same ordering of the diffusion processes in the bridging regime as the entropy production rate, albeit in reversed order. The K(P_D, P_α)-ordering is fully compatible with the ordering obtained by using the Shannon entropy and the Tsallis entropy as a means of comparison, when appropriate times are chosen for the different processes such that the internal quickness is separated out [24]. If one does not separate out the internal quickness, then the Shannon entropy S(P_α) as well as the Tsallis entropy S_q^T(P_α), see [24], provide an ordering not compatible with the bridge ordering. The corresponding curves for t = 1 are also shown in Figure 2.
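The curve shown in Figure 2 can be reproduced by direct quadrature. The following minimal sketch (our own, under the same SciPy assumption as above) simply truncates the integral at ±8σ, where the Gaussian weight renders the integrand negligible, instead of using the tail split (23):

```python
# Sketch: K(P_D, P_alpha) at t = D = 1 by adaptive quadrature.
import numpy as np
from scipy.integrate import quad
from scipy.stats import levy_stable

def kl_pd_palpha(alpha, t=1.0, D=1.0):
    gamma_a = (-D * np.cos(alpha * np.pi / 2.0) * t) ** (1.0 / alpha)
    sigma = np.sqrt(2.0 * D * t)
    def integrand(x):
        pd = np.exp(-x**2 / (4.0 * D * t)) / (2.0 * np.sqrt(np.pi * D * t))
        pa = levy_stable.pdf(x, alpha, 1.0, loc=0.0, scale=gamma_a)
        pa = max(pa, 1e-300)   # guard the logarithm in the extreme left tail
        return pd * np.log(pd / pa)
    val, _ = quad(integrand, -8.0 * sigma, 8.0 * sigma, limit=200)
    return val

for a in (1.2, 1.5, 1.8):
    print(a, kl_pd_palpha(a))   # decreases monotonically toward 0 as alpha -> 2
```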
Figure 2. The Kullback–Leibler entropy K(P_D, P_α), the Tsallis entropy S_{q=0.7}^T(P_α), and the Shannon entropy S(P_α), plotted over α at t = 1. The Kullback–Leibler entropy shows a monotonically decreasing behavior for increasing α, whereas the Tsallis and the Shannon entropy exhibit a maximum. Thus K(P_D, P_α) is an appropriate ordering measure for the bridging regime even when other measure candidates are not monotonic.

6. Tsallis Relative Entropy

Now we turn to the Tsallis relative entropy. In the following we analyze for which values of the non-extensivity parameter q the integrals converge. For this analysis we again assume a generic Gaussian and a generic stable distribution. Afterwards we present the numerical results.
We start with T q ( P α , P G ) , for which we have
$$ T_q(P_{\alpha}, P_G) = \frac{1}{q-1}\left[ \int_{-\infty}^{\infty} P_{\alpha}^{q}(x,t)\, P_G^{1-q}(x,t)\, dx - 1 \right] = \frac{1}{q-1}\left[ \left(\sqrt{2\pi}\,\sigma\right)^{q-1} \int_{-\infty}^{\infty} S(x \mid \alpha, 1, \gamma, 0; 1)^{q}\, \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right)^{\!1-q} dx - 1 \right] = \frac{1}{q-1}\left[ \frac{\left(\sqrt{2\pi}\,\sigma\right)^{q-1}}{\gamma^{q}} \int_{-\infty}^{\infty} S\!\left( \frac{x}{\gamma} \,\Big|\, \alpha, 1, 1, 0; 1 \right)^{\!q} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right)^{\!1-q} dx - 1 \right] \qquad (32) $$
In analogy to the analysis of the Kullback–Leibler entropy, (23) and (24), we split the Tsallis relative entropy integral into three parts: left tail, middle part, and right tail. Due to the asymptotic tail behavior of the stable distribution given in (18) and (20), we can investigate the left and right tails analytically. For the Tsallis relative entropy we thus have
$$ T_q(P_{\alpha}, P_G) = \frac{1}{q-1}\left[ T_q^{-}(P_{\alpha}, P_G) + T_q^{\pm}(P_{\alpha}, P_G) + T_q^{+}(P_{\alpha}, P_G) - 1 \right] \qquad (33) $$
The left tail integral yields
$$ T_q^{-}(P_{\alpha}, P_G) = \frac{\left(\sqrt{2\pi}\,\sigma\right)^{q-1}}{\gamma^{q}} \int_{-\infty}^{x_-} \left[ C_1 \left(\frac{-x}{\gamma}\right)^{\frac{2-\alpha}{2\alpha-2}} \exp\left( -C_2 \left(\frac{-x}{\gamma}\right)^{\frac{\alpha}{\alpha-1}} \right) \right]^{q} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right)^{\!1-q} dx = \frac{\left(\sqrt{2\pi}\,\sigma\right)^{q-1}}{\gamma^{q}}\, C_1^{\,q} \int_{-\infty}^{x_-} \left(\frac{-x}{\gamma}\right)^{\frac{q(2-\alpha)}{2\alpha-2}} \exp\left( -q C_2 \left(\frac{-x}{\gamma}\right)^{\frac{\alpha}{\alpha-1}} - (1-q)\frac{(x-\mu)^2}{2\sigma^2} \right) dx \qquad (34) $$
First, we note that the constant C₂ increases monotonically from 0 to 1/4 for α ∈ (1, 2). Also, the exponent α/(α − 1) lies within the range (2, ∞) for α ∈ (1, 2). Thus the first term in the exponent of the exponential dominates the second one, and the integral converges if the prefactor −qC₂ of that term is negative, whereas for q = 0 the second term takes over. This sets q ≥ 0 as a first requirement.
For the right-tail integral T_q^+(P_α, P_G) we obtain
$$ T_q^{+}(P_{\alpha}, P_G) = \left(\sqrt{2\pi}\,\sigma\right)^{q-1} \int_{x_+}^{\infty} \left( 2\alpha\gamma^{\alpha} c_{\alpha}\, x^{-(\alpha+1)} \right)^{q} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right)^{\!1-q} dx = \left(\sqrt{2\pi}\,\sigma\right)^{q-1} \left( 2\alpha\gamma^{\alpha} c_{\alpha} \right)^{q} \int_{x_+}^{\infty} x^{-q(\alpha+1)} \exp\left( -(1-q)\frac{(x-\mu)^2}{2\sigma^2} \right) dx \qquad (35) $$
which converges for 1 − q > 0, i.e., q < 1. This sets the second requirement on q. Overall, the Tsallis relative entropy T_q(P_α, P_G), and thus T_q(P_α, P_D), exists for q ∈ [0, 1).
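This existence range can be probed numerically. A corresponding sketch for T_q(P_α, P_D) (same assumptions as the previous sketches; the Gaussian factor P_D^{1−q} ensures convergence of the quadrature for q ∈ [0, 1), exactly as derived above):

```python
# Sketch: T_q(P_alpha, P_D) of Eq. (3) by direct quadrature (t = D = 1).
import numpy as np
from scipy.integrate import quad
from scipy.stats import levy_stable

def tq_palpha_pd(alpha, q, t=1.0, D=1.0):
    gamma_a = (-D * np.cos(alpha * np.pi / 2.0) * t) ** (1.0 / alpha)
    def integrand(x):
        pa = levy_stable.pdf(x, alpha, 1.0, loc=0.0, scale=gamma_a)
        pd = np.exp(-x**2 / (4.0 * D * t)) / (2.0 * np.sqrt(np.pi * D * t))
        # the factor pd**(1-q) decays like a Gaussian, so the integral
        # converges for q < 1, as derived above
        return pa**q * pd**(1.0 - q)
    val, _ = quad(integrand, -np.inf, np.inf, limit=400)
    return (val - 1.0) / (q - 1.0)

for a in (1.2, 1.5, 1.8):
    print(a, tq_palpha_pd(a, 0.6))   # positive, shrinking toward 0 as alpha -> 2
```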
In Figure 3 the dependence of T_q(P_α, P_D) on α is shown. Again one finds an ordering compatible with the original ordering established by α, the bridge ordering. This holds independently of the q value used, as long as q ∈ [0, 1). We note that the larger q is, the easier it is to separate two processes with different α. Also, as expected, for α → 2 the measure goes to zero, independent of q ∈ [0, 1).
Figure 3. The results for the Tsallis relative entropy T_q(P_α, P_D), shown over α for four different values of q (q = 0.1, 0.3, 0.6, and 0.9). As expected, T_q(P_α, P_D) goes to zero as α approaches 2, independent of q.
While a complete discussion of the time development of the Tsallis relative entropy is beyond the scope of this paper, we show in Figure 4 how T_q(P_α, P_D) (for q = 0.6) behaves for different times. One sees that the monotonic behavior is preserved. An interesting question for future research is whether one can formulate an extension of the H-theorem for this measure.
Figure 4. The Tsallis relative entropy T_{0.6}(P_α, P_D) for different times t, plotted over α. One can observe that with increasing time the monotonically decreasing behavior is preserved.
Finally, we analyze for which values of q the Tsallis relative entropy T_q(P_G, P_α) exists. It is given as
$$ T_q(P_G, P_{\alpha}) = \frac{1}{q-1}\left[ \int_{-\infty}^{\infty} \left( \frac{1}{\sqrt{2\pi}\,\sigma} \right)^{q} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right)^{\!q} S(x \mid \alpha, 1, \gamma, 0; 1)^{1-q}\, dx - 1 \right] = \frac{1}{q-1}\left[ \frac{\gamma^{\,q-1}}{\left(\sqrt{2\pi}\,\sigma\right)^{q}} \int_{-\infty}^{\infty} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right)^{\!q} S\!\left( \frac{x}{\gamma} \,\Big|\, \alpha, 1, 1, 0; 1 \right)^{\!1-q} dx - 1 \right] \qquad (36) $$
Again we have to investigate the tail integrals based on the asymptotic tail behaviors given in (18) and (20). We split the integral into three parts as explained in (33), and in the case of (36) we obtain the left-tail integral T_q^-(P_G, P_α) as
$$ T_q^{-}(P_G, P_{\alpha}) = \frac{\gamma^{\,q-1}}{\left(\sqrt{2\pi}\,\sigma\right)^{q}} \int_{-\infty}^{x_-} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right)^{\!q} \left[ C_1 \left(\frac{-x}{\gamma}\right)^{\frac{2-\alpha}{2\alpha-2}} \exp\left( -C_2 \left(\frac{-x}{\gamma}\right)^{\frac{\alpha}{\alpha-1}} \right) \right]^{1-q} dx = \frac{\gamma^{\,q-1}}{\left(\sqrt{2\pi}\,\sigma\right)^{q}}\, C_1^{\,1-q} \int_{-\infty}^{x_-} \left(\frac{-x}{\gamma}\right)^{\frac{(1-q)(2-\alpha)}{2\alpha-2}} \exp\left( -q\,\frac{(x-\mu)^2}{2\sigma^2} - (1-q)\, C_2 \left(\frac{-x}{\gamma}\right)^{\frac{\alpha}{\alpha-1}} \right) dx \qquad (37) $$
In order to find the requirement on q we proceed in analogy to the discussion of (34). Here the second term in the exponent of the exponential in the integrand dominates the first term for large enough |x|. Thus, for convergence, its prefactor must not be negative, i.e., (1 − q)C₂ > 0; or it is zero, in which case the first term takes over. The requirement is thus q < 1, as for q = 1 the Tsallis relative entropy is not defined.
For the right-tail integral T_q^+(P_G, P_α) we get
$$ T_q^{+}(P_G, P_{\alpha}) = \int_{x_+}^{\infty} \left( \frac{1}{\sqrt{2\pi}\,\sigma} \right)^{q} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right)^{\!q} \left( 2\alpha\gamma^{\alpha} c_{\alpha}\, x^{-(\alpha+1)} \right)^{1-q} dx = \left( \frac{1}{\sqrt{2\pi}\,\sigma} \right)^{q} \left( 2\alpha\gamma^{\alpha} c_{\alpha} \right)^{1-q} \int_{x_+}^{\infty} x^{-(1-q)(\alpha+1)} \exp\left( -q\,\frac{(x-\mu)^2}{2\sigma^2} \right) dx \qquad (38) $$
which converges for all α ∈ (1, 2) for q ≥ 0. Therefore q has to be in the range [0, 1) in order to ensure that the Tsallis relative entropy T_q(P_G, P_α) has a finite value.
In Figure 5 we show the dependence of T q ( P D , P α ) on α. The ordering of the superdiffusion processes in the bridging regime induced by T q ( P D , P α ) is again in full agreement with the bridge ordering by α.
Figure 5. The Tsallis relative entropy T_q(P_D, P_α), depicted over α for different values of q. We find that for α → 2, T_q(P_D, P_α) goes down to zero, independent of q. Note that for q = 1 the corresponding Kullback–Leibler entropy is shown.
As both relative entropies, the Kullback–Leibler entropy K(P_D, P_α) as well as the Tsallis relative entropy T_q(P_D, P_α), preserve the ordering of the processes in the superdiffusion regime induced by α, it is interesting to study their relative behavior. In Figure 6 a log-log plot of K(P_D, P_α) versus T_q(P_D, P_α) is shown for different q and identical P_α and P_D. The graph shows a clear monotonic relationship, independent of the q value used. As T_q(P_D, P_α) approaches K(P_D, P_α) in the limit q → 1, the nearly straight line for q = 0.95 is expected. Interestingly, the curves for the other q values also show this feature for small values of the relative entropies.
Figure 6. Log-log plot of the Kullback–Leibler entropy K(P_D, P_α) over the Tsallis relative entropy T_q(P_D, P_α) for different q values. A clear monotonic relationship is observed, independent of q.

7. Summary and Discussion

The bridge ordering by α of the space-fractional diffusion equations and of the corresponding PDFs P_α(x, t) had previously shown a structure not compatible with an ordering based on the Shannon, Tsallis, and Rényi entropies and their respective rates. While we were able to resolve this behavior, known as the entropy production paradox, by an in-depth analysis of the intricate coupling of internal quickness to the form change of the PDFs, we found here that the Kullback (Kullback–Leibler) entropy and its generalization, the Tsallis relative entropy, naturally establish an ordering on the P_α(x, t) consistent with the bridge ordering. We did this by calculating the Kullback entropy and the Tsallis relative entropy with respect to the solution of the regular diffusion equation, i.e., the α = 2 case of the space-fractional diffusion equation.
We found that the Kullback–Leibler entropy of a P_α with respect to a general Gaussian distribution does not exist, as the integrals diverge. The reason for this divergence is rooted in the heavy tails of the stable distributions: these heavy tails lead to the non-existence of higher moments, and since the logarithm of a Gaussian is proportional to the second moment, the divergence of the Kullback–Leibler entropy is not surprising.
The Tsallis relative entropy, a non-extensive extension of the Kullback–Leibler entropy depending on a non-extensivity parameter q ∈ ℝ\{1}, does exist, however, and thus proves to be a highly useful generalization of the Kullback–Leibler entropy. In particular we showed that the Tsallis relative entropy can be determined for q-values in the range q ∈ [0, 1). Within that range, the Tsallis relative entropy of P_α with respect to P_D, as well as that of P_D with respect to P_α, shows for all q an ordering of the P_α consistent with the bridge ordering induced by α.

Acknowledgments

We thank J. P. Nolan for making chapter 3 of [46] available to us.

References

1. Havlin, S.; Ben-Avraham, D. Diffusion in disordered media. Adv. Phys. 1987, 36, 695–798.
2. Schirmacher, W.; Prem, M.; Suck, J.B.; Heidemann, A. Anomalous diffusion of hydrogen in amorphous metals. Europhys. Lett. 1990, 13, 523–529.
3. Weiss, M.; Elsner, M.; Kartberg, F.; Nilsson, T. Anomalous subdiffusion is a measure for cytoplasmic crowding in living cells. Biophys. J. 2004, 87, 3518–3524.
4. Banks, D.S.; Fradin, C. Anomalous diffusion of proteins due to molecular crowding. Biophys. J. 2005, 89, 2960–2971.
5. Köpf, M.; Corinth, C.; Haferkamp, O.; Nonnenmacher, T.F. Anomalous diffusion of water in biological tissues. Biophys. J. 1996, 70, 2950–2958.
6. Yuste, S.B.; Lindenberg, K. Subdiffusion-limited reactions. Chem. Phys. 2002, 284, 169–180.
7. Bénichou, O.; Coppey, M.; Moreau, M.; Suet, P.H.; Voituriez, R. Optimal search strategies for hidden targets. Phys. Rev. Lett. 2005, 94, 198101.
8. Bénichou, O.; Loverdo, C.; Moreau, M.; Voituriez, R. Two-dimensional intermittent search processes: An alternative to Lévy flight strategies. Phys. Rev. E 2006, 74, 020102.
9. Shlesinger, M.F. Mathematical physics: Search research. Nature 2006, 443, 281–282.
10. Solomon, T.H.; Weeks, E.R.; Swinney, H.L. Observation of anomalous diffusion and Lévy flights in a two-dimensional rotating flow. Phys. Rev. Lett. 1993, 71, 3975–3978.
11. Hansen, A.E.; Jullien, M.C.; Paret, J.; Tabeling, P. Dispersion in freely decaying and forced 2D turbulence. In Anomalous Diffusion: From Basics to Applications; Pekalski, A., Kutner, R., Eds.; Lecture Notes in Physics 519; Springer-Verlag: Berlin/Heidelberg, Germany, 1999.
12. Schneider, W.R.; Wyss, W. Fractional diffusion and wave equations. J. Math. Phys. 1989, 30, 134–144.
13. Giona, M.; Roman, H.E. Fractional diffusion equation for transport phenomena in random media. Physica A 1992, 185, 87–97.
14. Metzler, R.; Glöckle, W.G.; Nonnenmacher, T.F. Fractional model equation for anomalous diffusion. Physica A 1994, 211, 13–24.
15. Metzler, R.; Klafter, J. The random walk's guide to anomalous diffusion: A fractional dynamics approach. Phys. Rep. 2000, 339, 1–77.
16. Hilfer, R. Applications of Fractional Calculus in Physics; World Scientific Publishing: Singapore, 2000.
17. del Castillo-Negrete, D. Fractional diffusion models of anomalous transport. In Anomalous Transport: Foundations and Applications; Klages, R., Radons, G., Sokolov, I.M., Eds.; Wiley-VCH: Weinheim, Germany, 2008; Chapter 6, pp. 163–212.
18. Schulzky, C.; Essex, C.; Davison, M.; Franz, A.; Hoffmann, K.H. The similarity group and anomalous diffusion equations. J. Phys. A: Math. Gen. 2000, 33, 5501–5511.
19. Fischer, A.; Seeger, S.; Hoffmann, K.H.; Essex, C.; Davison, M. Modeling anomalous superdiffusion. J. Phys. A: Math. Gen. 2007, 40, 11441–11452.
20. Hoffmann, K.H.; Prehl, J. Anomalous transport in disordered fractals. In Anomalous Transport: Foundations and Applications, 1st ed.; Klages, R., Radons, G., Sokolov, I.M., Eds.; Wiley-VCH: Weinheim, Germany, 2008.
21. Malacarne, L.C.; Mendes, R.S.; Pedron, I.T.; Lenzi, E.K. Nonlinear equation for anomalous diffusion: Unified power-law and stretched exponential exact solution. Phys. Rev. E 2001, 63, 030101(R).
22. Pedron, I.T.; Mendes, R.S.; Buratta, T.J.; Malacarne, L.C.; Lenzi, E.K. Logarithmic diffusion and porous media equations: A unified description. Phys. Rev. E 2005, 72, 031106.
23. Li, X.; Essex, C.; Davison, M.; Hoffmann, K.H.; Schulzky, C. Fractional diffusion, irreversibility and entropy. J. Non-Equilib. Thermodyn. 2003, 28, 279–291.
24. Prehl, J.; Essex, C.; Hoffmann, K.H. The superdiffusion entropy production paradox in the space-fractional case for extended entropies. Physica A 2010, 389, 214–224.
25. Hoffmann, K.H.; Essex, C.; Schulzky, C. Fractional diffusion and entropy production. J. Non-Equilib. Thermodyn. 1998, 23, 166–175.
26. Essex, C.; Schulzky, C.; Franz, A.; Hoffmann, K.H. Tsallis and Rényi entropies in fractional diffusion and entropy production. Physica A 2000, 284, 299–308.
27. Kullback, S.; Leibler, R.A. On information and sufficiency. Ann. Math. Stat. 1951, 22, 79–86.
28. Schlögl, F. Probability and Heat; Vieweg: Braunschweig/Wiesbaden, Germany, 1989.
29. Ebrahimi, N. Testing exponentiality of the residual life, based on dynamic Kullback–Leibler information. IEEE Trans. Reliab. 1998, 47, 197–201.
30. Lim, A.E.B.; Shanthikumar, J.G. Relative entropy, exponential utility, and robust dynamic pricing. Oper. Res. 2007, 55, 198–214.
31. Kekre, H.B.; Bharadi, V.A.; Shaktia, P.; Shah, V.; Ambardekar, A.A. Keystroke dynamic analysis using relative entropy & timing sequence Euclidean distance. In Proceedings of the International Conference and Workshop on Emerging Trends in Technology (ICWET 2011), Mumbai, India, 25–26 February 2011; pp. 220–223.
32. Yu, S.; Mehta, P.G. The Kullback–Leibler rate metric for comparing dynamical systems. In Proceedings of the Joint 48th IEEE Conference on Decision and Control and 28th Chinese Control Conference, Shanghai, China, 16–18 December 2009; pp. 8363–8368.
33. Do, M.N. Fast approximation of Kullback–Leibler distance for dependence trees and hidden Markov models. IEEE Signal Process. Lett. 2003, 10, 115–118.
34. Lloyd, S.; Pagels, H. Complexity as thermodynamic depth. Ann. Phys. 1988, 188, 186–213.
35. Crutchfield, J.P.; Shalizi, C.R. Thermodynamic depth of causal states: Objective complexity via minimal representations. Phys. Rev. E 1999, 59, 275–283.
36. Vedral, V. The role of relative entropy in quantum information theory. Rev. Mod. Phys. 2002, 74, 197–234.
37. Georgiou, T.T.; Lindquist, A. Kullback–Leibler approximation of spectral density functions. IEEE Trans. Inform. Theor. 2003, 49, 2910–2917.
38. Braunstein, S.L. Geometry of quantum inference. Phys. Lett. A 1996, 219, 169–174.
39. Modi, K.; Paterek, T.; Son, W.; Vedral, V.; Williamson, M. Unified view of quantum and classical correlations. Phys. Rev. Lett. 2010, 104, 080501.
40. Do, M.N.; Vetterli, M. Wavelet-based texture retrieval using generalized Gaussian density and Kullback–Leibler distance. IEEE Trans. Image Process. 2002, 11, 146–158.
41. Burnham, K.P.; Anderson, D.R. Kullback–Leibler information as a basis for strong inference in ecological studies. Wildl. Res. 2001, 28, 111–119.
42. Tsallis, C. Generalized entropy-based criterion for consistent testing. Phys. Rev. E 1998, 58, 1442–1445.
43. Borland, L.; Plastino, A.R.; Tsallis, C. Information gain within nonextensive thermostatistics. J. Math. Phys. 1998, 39, 6490–6501; Erratum in J. Math. Phys. 1999, 40, 2196.
44. Furuichi, S.; Yanagi, K.; Kuriyama, K. Fundamental properties of Tsallis relative entropy. J. Math. Phys. 2004, 45, 4868–4877.
45. Plastino, A.R.; Miller, H.G.; Plastino, A. Minimum Kullback entropy approach to the Fokker–Planck equation. Phys. Rev. E 1997, 56, 3927–3934.
46. Nolan, J.P. Stable Distributions—Models for Heavy Tailed Data; Birkhäuser: Boston, MA, USA, 2012. Unpublished material; Chapter 1 available online: http://academic2.american.edu/~jpnolan (accessed on 30 March 2012); Chapter 3: personal communication.
47. Samorodnitsky, G.; Taqqu, M.S. Stable Non-Gaussian Random Processes; Chapman & Hall: New York, NY, USA, 1994.
48. Tsallis, C.; Plastino, A.R.; Alvarez-Estrada, R.F. Escort mean values and the characterization of power-law-decaying probability densities. J. Math. Phys. 2009, 50, 043303.
