
Tsallis Entropy

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (31 July 2011) | Viewed by 109237

Special Issue Editor


Guest Editor
Space Research & Technology Group, Institute for Astronomy, Astrophysics, Space Applications and Remote Sensing, National Observatory of Athens, GR-15236 Penteli, Greece
Interests: cellular automata; complexity; acceleration, transport and diffusion processes in dynamical systems

Special Issue Information

Dear Colleagues,

The aim of statistical mechanics is to establish a direct link between the mechanical laws and classical thermodynamics. The uncertainty of an open system's state can be quantified by the Boltzmann-Gibbs entropy, the best-known uncertainty measure in statistical mechanics.

One of the crucial properties of the Boltzmann-Gibbs entropy in the context of classical thermodynamics is extensivity, namely proportionality with the number of elements of the system. The Boltzmann-Gibbs entropy satisfies this prescription if the subsystems are statistically (quasi-) independent, or typically if the correlations within the system are essentially local. In such cases the energy of the system is typically extensive and the entropy is additive. In general, however, the situation is not of this type and correlations may be far from negligible at all scales. Tsallis [1988, 1998] introduced an entropic expression characterized by an index q which leads to a nonextensive statistics. Tsallis entropy, Sq, is the basis of the so-called nonextensive statistical mechanics, which generalizes the Boltzmann-Gibbs theory. Tsallis statistics has found applications in a wide range of phenomena in diverse disciplines such as physics, chemistry, biology, medicine, economics, and geophysics. For this special issue of Entropy we solicit contributions that apply Tsallis entropy in various scientific fields.
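For reference, the entropic expression in question can be sketched numerically. This is a minimal illustration (function name and the choice k = 1 for Boltzmann's constant are mine): for a discrete distribution {p_i}, S_q = (1 − Σ_i p_i^q)/(q − 1), which recovers the Boltzmann-Gibbs/Shannon form −Σ_i p_i ln p_i as q → 1.

```python
import numpy as np

def tsallis_entropy(p, q):
    """S_q = (1 - sum_i p_i**q) / (q - 1), with Boltzmann's constant set to 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                 # zero-probability states contribute nothing
    if np.isclose(q, 1.0):
        return float(-np.sum(p * np.log(p)))     # q -> 1: Boltzmann-Gibbs/Shannon limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

p = [0.5, 0.25, 0.25]
print(tsallis_entropy(p, q=1.0))   # Shannon entropy of p
print(tsallis_entropy(p, q=2.0))   # S_2 = 1 - sum p_i**2
```

Values of q far from 1 weight the probabilities differently: q > 1 emphasizes likely states, q < 1 emphasizes rare ones.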

Dr. Anastasios Anastasiadis
Guest Editor

Keywords

  • Tsallis entropy
  • complex system dynamics
  • non-extensive statistical mechanics

Published Papers (12 papers)


Editorial


Editorial
Special Issue: Tsallis Entropy
by Anastasios Anastasiadis
Entropy 2012, 14(2), 174-176; https://doi.org/10.3390/e14020174 - 03 Feb 2012
Cited by 25 | Viewed by 7259
Abstract
One of the crucial properties of the Boltzmann-Gibbs entropy in the context of classical thermodynamics is extensivity, namely proportionality with the number of elements of the system. The Boltzmann-Gibbs entropy satisfies this prescription if the subsystems are statistically (quasi-) independent, or typically if the correlations within the system are essentially local. In such cases the energy of the system is typically extensive and the entropy is additive. In general, however, the situation is not of this type and correlations may be far from negligible at all scales. Tsallis in 1988 introduced an entropic expression characterized by an index q which leads to a non-extensive statistics. Tsallis entropy, Sq, is the basis of the so-called non-extensive statistical mechanics, which generalizes the Boltzmann-Gibbs theory. Tsallis statistics have found applications in a wide range of phenomena in diverse disciplines such as physics, chemistry, biology, medicine, economics, and geophysics. The focus of this special issue of Entropy was to solicit contributions that apply Tsallis entropy in various scientific fields.
(This article belongs to the Special Issue Tsallis Entropy)

Research


Article
Tsallis Relative Entropy and Anomalous Diffusion
by Janett Prehl, Christopher Essex and Karl Heinz Hoffmann
Entropy 2012, 14(4), 701-716; https://doi.org/10.3390/e14040701 - 10 Apr 2012
Cited by 40 | Viewed by 8106
Abstract
In this paper we utilize the Tsallis relative entropy, a generalization of the Kullback–Leibler entropy in the framework of non-extensive thermodynamics, to analyze the properties of anomalous diffusion processes. Anomalous (super-) diffusive behavior can be described by fractional diffusion equations, where the second-order space derivative is extended to fractional order α ∈ (1, 2). They represent a bridging regime: for α = 2 one obtains the diffusion equation, and for α = 1 the (half) wave equation is given. These fractional diffusion equations are solved by so-called stable distributions, which exhibit heavy tails and skewness. In contrast to the Shannon or Tsallis entropy of these distributions, the Kullback and Tsallis relative entropy, relative to the pure diffusion case, induce a natural ordering of the stable distributions consistent with the ordering implied by the pure diffusion and wave limits.
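As a concrete handle on the measure used above, here is a small sketch (function name mine; the definition is the standard Tsallis 1998 generalization of the Kullback–Leibler divergence, D_q(P‖R) = Σ_i p_i[(p_i/r_i)^(q−1) − 1]/(q − 1), which this abstract builds on):

```python
import numpy as np

def tsallis_relative_entropy(p, r, q):
    """D_q(P||R) = sum_i p_i * ((p_i/r_i)**(q-1) - 1) / (q - 1).
    As q -> 1 this reduces to the Kullback-Leibler divergence."""
    p, r = np.asarray(p, dtype=float), np.asarray(r, dtype=float)
    keep = p > 0                       # terms with p_i = 0 contribute nothing
    p, r = p[keep], r[keep]
    if np.isclose(q, 1.0):
        return float(np.sum(p * np.log(p / r)))
    return float(np.sum(p * ((p / r) ** (q - 1) - 1.0)) / (q - 1.0))

p = [0.7, 0.2, 0.1]
r = [1/3, 1/3, 1/3]
print(tsallis_relative_entropy(p, r, q=1.5))   # divergence from a uniform reference
```

In the paper the reference distribution is the pure-diffusion (α = 2) solution; the uniform reference here is only for illustration.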

Article
Classes of N-Dimensional Nonlinear Fokker-Planck Equations Associated to Tsallis Entropy
by Mauricio S. Ribeiro, Fernando D. Nobre and Evaldo M. F. Curado
Entropy 2011, 13(11), 1928-1944; https://doi.org/10.3390/e13111928 - 01 Nov 2011
Cited by 41 | Viewed by 6545
Abstract
Several previous results valid for one-dimensional nonlinear Fokker-Planck equations are generalized to N dimensions. A general nonlinear N-dimensional Fokker-Planck equation is derived directly from a master equation, by considering nonlinearities in the transition rates. Using nonlinear Fokker-Planck equations, the H-theorem is proved; for that, an important relation involving these equations and general entropic forms is introduced. It is shown that, due to this relation, classes of nonlinear N-dimensional Fokker-Planck equations are connected to a single entropic form. Particular emphasis is given to the class of equations associated to Tsallis entropy, for both the standard and generalized definitions of the internal energy.
Article
Quantifying Dynamical Complexity of Magnetic Storms and Solar Flares via Nonextensive Tsallis Entropy
by Georgios Balasis, Ioannis A. Daglis, Constantinos Papadimitriou, Anastasios Anastasiadis, Ingmar Sandberg and Konstantinos Eftaxias
Entropy 2011, 13(10), 1865-1881; https://doi.org/10.3390/e13101865 - 14 Oct 2011
Cited by 29 | Viewed by 6933
Abstract
Over the last couple of decades nonextensive Tsallis entropy has shown remarkable applicability in describing nonequilibrium physical systems with large variability and multifractal structure. Herein, we review recent results from the application of Tsallis statistical mechanics to the detection of dynamical changes related to the occurrence of magnetic storms. We extend our review to describe attempts to approach the dynamics of magnetic storms and solar flares by means of universality through Tsallis statistics. We also include a discussion of possible implications for space weather forecasting efforts arising from the verification of Tsallis entropy in the complex system of the magnetosphere.
Article
Tsallis Entropy for Geometry Simplification
by Pascual Castelló, Carlos González, Miguel Chover, Mateu Sbert and Miquel Feixas
Entropy 2011, 13(10), 1805-1828; https://doi.org/10.3390/e13101805 - 29 Sep 2011
Cited by 3 | Viewed by 7089
Abstract
This paper presents a study and comparison of the use of different information-theoretic measures for polygonal mesh simplification. Generalized measures from information theory, such as the Havrda–Charvát–Tsallis entropy and mutual information, have been applied. These measures have been used in the error metric of a surface simplification algorithm. We demonstrate that these measures are useful for simplifying three-dimensional polygonal meshes. We have also compared these metrics with the error metrics used in a geometry-based method and in an image-driven method. Quantitative results of the comparison are presented using the root-mean-square error (RMSE).
Article
Projective Power Entropy and Maximum Tsallis Entropy Distributions
by Shinto Eguchi, Osamu Komori and Shogo Kato
Entropy 2011, 13(10), 1746-1764; https://doi.org/10.3390/e13101746 - 26 Sep 2011
Cited by 21 | Viewed by 8734
Abstract
We discuss a one-parameter family of generalized cross entropies between two distributions indexed by the power index, called the projective power entropy. The cross entropy essentially reduces to the Tsallis entropy if the two distributions are taken to be equal. Statistical and probabilistic properties associated with the projective power entropy are extensively investigated, including the characterization problem of which conditions uniquely determine the projective power entropy up to the power index. A close relation of the entropy with the Lebesgue space Lp and its dual Lq is explored, in which the escort distribution is associated with an interesting property. When we consider maximum Tsallis entropy distributions under constraints on the mean vector and variance matrix, the model becomes a multivariate q-Gaussian model with elliptical contours, including the Gaussian and t-distribution models. We discuss statistical estimation by minimization of the empirical loss associated with the projective power entropy. It is shown that the minimum loss estimators for the mean vector and variance matrix under the maximum entropy model are the sample mean vector and the sample variance matrix. The escort distribution of the maximum entropy distribution plays the key role in the derivation.
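The maximum-Tsallis-entropy model mentioned here can be sampled directly: for 1 < q < 3, a standard univariate q-Gaussian coincides (up to scale) with a Student-t distribution with ν = (3 − q)/(q − 1) degrees of freedom. This is the well-known q-Gaussian/t correspondence, not a detail taken from this abstract; the function name is mine.

```python
import numpy as np

def sample_q_gaussian(q, size, seed=None):
    """Draw from a standard univariate q-Gaussian with 1 < q < 3 via the
    Student-t correspondence nu = (3 - q)/(q - 1); tails grow heavier as q -> 3."""
    rng = np.random.default_rng(seed)
    nu = (3.0 - q) / (q - 1.0)
    return rng.standard_t(nu, size=size)

x = sample_q_gaussian(q=1.5, size=10_000, seed=0)
print(x.mean())   # near 0; as q -> 1, nu -> inf and the law tends to a Gaussian
```

For q = 1.5 this gives ν = 3, i.e., a t-distribution with finite variance but heavy tails, matching the elliptical-contour picture in the abstract.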
Article
Tsallis Mutual Information for Document Classification
by Màrius Vila, Anton Bardera, Miquel Feixas and Mateu Sbert
Entropy 2011, 13(9), 1694-1707; https://doi.org/10.3390/e13091694 - 14 Sep 2011
Cited by 24 | Viewed by 9401
Abstract
Mutual information is one of the most widely used measures for evaluating image similarity. In this paper, we investigate the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned documents. These three generalizations derive from the Kullback–Leibler distance, the difference between entropy and conditional entropy, and the Jensen–Tsallis divergence, respectively. In addition, the ratio between these measures and the Tsallis joint entropy is analyzed. The performance of all these measures is studied for different entropic indexes in the context of document classification and registration.
Article
A Risk Profile for Information Fusion Algorithms
by Kenric P. Nelson, Brian J. Scannell and Herbert Landau
Entropy 2011, 13(8), 1518-1532; https://doi.org/10.3390/e13081518 - 17 Aug 2011
Cited by 10 | Viewed by 7864
Abstract
E.T. Jaynes, originator of the maximum entropy interpretation of statistical mechanics, emphasized that there is an inevitable trade-off between the conflicting requirements of robustness and accuracy for any inferencing algorithm. This is because robustness requires discarding information in order to reduce the sensitivity to outliers. The principle of nonlinear statistical coupling, which is an interpretation of the Tsallis entropy generalization, can be used to quantify this trade-off. The coupled-surprisal, -ln_κ(p) ≡ -(p^κ - 1)/κ, is a generalization of the Shannon surprisal, or logarithmic scoring rule, given a forecast p of a true event by an inferencing algorithm. The coupling parameter κ = 1 - q, where q is the Tsallis entropy index, is the degree of nonlinear coupling between statistical states. Positive (negative) values of nonlinear coupling decrease (increase) the surprisal information metric and thereby bias the risk in favor of decisive (robust) algorithms relative to the Shannon surprisal (κ = 0). We show that translating the average coupled-surprisal to an effective probability is equivalent to using the generalized mean of the true event probabilities as a scoring rule. The metric is used to assess the robustness, accuracy, and decisiveness of a fusion algorithm. We use a two-parameter fusion algorithm to combine input probabilities from N sources. The generalized mean parameter α varies the degree of smoothing, and raising to a power N^β with β between 0 and 1 provides a model of correlation.
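The coupled-surprisal defined in the abstract is a one-liner; this sketch (function name mine) shows the decisive/robust bias it describes:

```python
import math

def coupled_surprisal(p, kappa):
    """-ln_kappa(p) = -(p**kappa - 1)/kappa, with kappa = 1 - q;
    kappa = 0 recovers the Shannon surprisal -ln(p)."""
    if kappa == 0.0:
        return -math.log(p)
    return -(p ** kappa - 1.0) / kappa

p = 0.5                                   # forecast probability of the true event
for kappa in (-0.5, 0.0, 0.5):
    print(kappa, coupled_surprisal(p, kappa))
# kappa > 0 lowers the surprisal (favors decisive algorithms);
# kappa < 0 raises it (favors robust ones), as the abstract states.
```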
Article
Tsallis-Based Nonextensive Analysis of the Southern California Seismicity
by Luciano Telesca
Entropy 2011, 13(7), 1267-1280; https://doi.org/10.3390/e13071267 - 11 Jul 2011
Cited by 61 | Viewed by 7088
Abstract
Nonextensive statistics has become a very useful tool for describing the complexity of dynamic systems. Recently, analysis of the magnitude distribution of earthquakes has been increasingly carried out in the context of nonextensivity. In the present paper, a nonextensive analysis of the southern California earthquake catalog was performed. The results show that the nonextensivity parameter q lies in the same range as obtained for other seismic areas, thus suggesting a sort of universal character in the nonextensive interpretation of seismicity.
Article
On the Thermodynamics of Classical Micro-Canonical Systems
by Maarten Baeten and Jan Naudts
Entropy 2011, 13(6), 1186-1199; https://doi.org/10.3390/e13061186 - 21 Jun 2011
Cited by 12 | Viewed by 7329
Abstract
We give two arguments why the thermodynamic entropy of non-extensive systems involves Rényi's entropy function rather than that of Tsallis. The first argument is that the temperature of the configurational subsystem of a mono-atomic gas is equal to that of the kinetic subsystem. The second argument is that the instability of the pendulum, which occurs for energies close to the rotation threshold, is correctly reproduced.
Article
Optimal Multi-Level Thresholding Based on Maximum Tsallis Entropy via an Artificial Bee Colony Approach
by Yudong Zhang and Lenan Wu
Entropy 2011, 13(4), 841-859; https://doi.org/10.3390/e13040841 - 13 Apr 2011
Cited by 223 | Viewed by 17130
Abstract
This paper proposes a global multi-level thresholding method for image segmentation. As a criterion, the traditional method uses the Shannon entropy, which originates in information theory and treats the gray-level image histogram as a probability distribution; we instead apply the Tsallis entropy as a generalized entropy formalism. For the optimization we used the artificial bee colony approach, since an exhaustive search would be too time-consuming. The experiments demonstrate that: (1) Tsallis entropy thresholding is superior to traditional maximum entropy thresholding, maximum between-class variance thresholding, and minimum cross entropy thresholding; (2) the artificial bee colony is faster than either the genetic algorithm or particle swarm optimization. Therefore, our approach is both effective and rapid.
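To make the thresholding criterion concrete, here is a bi-level sketch that scores a candidate threshold by the pseudo-additive Tsallis objective S_q(A) + S_q(B) + (1 − q)·S_q(A)·S_q(B) over the two histogram classes and searches exhaustively. The paper's contribution is replacing this search with an artificial bee colony for multiple thresholds; the function name, histogram, and q value below are illustrative only.

```python
import numpy as np

def tsallis_threshold(hist, q=0.8):
    """Pick the single threshold t maximizing S_q(A) + S_q(B) + (1-q)*S_q(A)*S_q(B),
    where A = bins below t and B = bins from t up, each renormalized."""
    p = np.asarray(hist, dtype=float)
    p = p / p.sum()

    def s_q(pk):
        pk = pk[pk > 0]
        pk = pk / pk.sum()            # renormalize the class distribution
        return (1.0 - np.sum(pk ** q)) / (q - 1.0)

    best_t, best_val = None, -np.inf
    for t in range(1, len(p)):
        if p[:t].sum() == 0 or p[t:].sum() == 0:
            continue                  # skip degenerate (empty-class) splits
        val = s_q(p[:t]) + s_q(p[t:]) + (1 - q) * s_q(p[:t]) * s_q(p[t:])
        if val > best_val:
            best_t, best_val = t, val
    return best_t

hist = [0, 5, 20, 5, 0, 0, 0, 0, 0, 0, 0, 5, 20, 5, 0, 0]   # toy bimodal histogram
print(tsallis_threshold(hist))        # a cut in the valley between the two modes
```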

Review


Review
The Nonadditive Entropy Sq and Its Applications in Physics and Elsewhere: Some Remarks
by Constantino Tsallis
Entropy 2011, 13(10), 1765-1804; https://doi.org/10.3390/e13101765 - 28 Sep 2011
Cited by 147 | Viewed by 11691
Abstract
The nonadditive entropy Sq was introduced in 1988 as a generalization of Boltzmann–Gibbs (BG) statistical mechanics. The aim was to cover a (possibly wide) class of systems, among the very many which violate hypotheses such as ergodicity, under which the BG theory is expected to be valid. It is now known that Sq has a wide applicability, more specifically even outside Hamiltonian systems and their thermodynamical approach. In the present paper we review and comment on some relevant aspects of this entropy, namely (i) additivity versus extensivity; (ii) probability distributions that constitute attractors in the sense of central limit theorems; (iii) the analysis of paradigmatic low-dimensional nonlinear dynamical systems near the edge of chaos; and (iv) the analysis of paradigmatic long-range-interacting many-body classical Hamiltonian systems. Finally, we exhibit recent as well as typical predictions, verifications and applications of these concepts in natural, artificial, and social systems, as shown through theoretical, experimental, observational and computational results.
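The additivity-versus-extensivity distinction in this review can be checked numerically: for two statistically independent subsystems A and B, Sq obeys the pseudo-additive rule Sq(A+B) = Sq(A) + Sq(B) + (1 − q)·Sq(A)·Sq(B), so it is additive only at q = 1. A small verification (names mine; k = 1):

```python
import numpy as np

def s_q(p, q):
    """Discrete Tsallis entropy S_q = (1 - sum p_i**q)/(q - 1)."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

q = 2.0
a = np.array([0.6, 0.4])
b = np.array([0.3, 0.5, 0.2])
joint = np.outer(a, b).ravel()          # independent subsystems: p_ij = a_i * b_j

lhs = s_q(joint, q)
rhs = s_q(a, q) + s_q(b, q) + (1 - q) * s_q(a, q) * s_q(b, q)
print(lhs, rhs)                         # equal: pseudo-additive, not additive
```

The cross term (1 − q)·Sq(A)·Sq(B) vanishes at q = 1, recovering the additivity of the Boltzmann-Gibbs entropy.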
