Special Issue "Tsallis Entropy"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (31 July 2011)

Special Issue Editor

Guest Editor
Dr. Anastasios Anastasiadis

Space Research & Technology Group, Institute for Astronomy, Astrophysics, Space Applications and Remote Sensing, National Observatory of Athens, GR-15236 Penteli, Greece
Fax: +30 210 6138343
Interests: cellular automata; complexity; acceleration, transport and diffusion processes in dynamical systems

Special Issue Information

Dear Colleagues,

The aim of statistical mechanics is to establish a direct link between the mechanical laws and classical thermodynamics. The uncertainty of the state of an open system can be quantified by the Boltzmann-Gibbs entropy, which is the most widely known uncertainty measure in statistical mechanics.

One of the crucial properties of the Boltzmann-Gibbs entropy in the context of classical thermodynamics is extensivity, namely proportionality with the number of elements of the system. The Boltzmann-Gibbs entropy satisfies this prescription if the subsystems are statistically (quasi-) independent, or typically if the correlations within the system are essentially local. In such cases the energy of the system is typically extensive and the entropy is additive. In general, however, the situation is not of this type and correlations may be far from negligible at all scales. Tsallis [1988, 1998] introduced an entropic expression characterized by an index q, which leads to nonextensive statistics. Tsallis entropy, Sq, is the basis of the so-called nonextensive statistical mechanics, which generalizes the Boltzmann-Gibbs theory. Tsallis statistics has found applications in a wide range of phenomena in diverse disciplines such as physics, chemistry, biology, medicine, economics, geophysics, etc. For this special issue of Entropy we solicit contributions that apply Tsallis entropy in various scientific fields.
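For reference, the entropic form in question, for a discrete probability distribution {p_i} and with k the Boltzmann constant, is

    S_q = k \, \frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i ,

so the Boltzmann-Gibbs entropy is recovered in the limit q → 1.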

Dr. Anastasios Anastasiadis
Guest Editor

Keywords

  • Tsallis entropy
  • complex system dynamics
  • non-extensive statistical mechanics

Published Papers (12 papers)

Editorial

Open Access Editorial: Special Issue: Tsallis Entropy
Entropy 2012, 14(2), 174-176; doi:10.3390/e14020174
Received: 2 February 2012 / Accepted: 2 February 2012 / Published: 3 February 2012
Cited by 4
Abstract
One of the crucial properties of the Boltzmann-Gibbs entropy in the context of classical thermodynamics is extensivity, namely proportionality with the number of elements of the system. The Boltzmann-Gibbs entropy satisfies this prescription if the subsystems are statistically (quasi-) independent, or typically if the correlations within the system are essentially local. In such cases the energy of the system is typically extensive and the entropy is additive. In general, however, the situation is not of this type and correlations may be far from negligible at all scales. Tsallis in 1988 introduced an entropic expression characterized by an index q, which leads to non-extensive statistics. Tsallis entropy, Sq, is the basis of the so-called non-extensive statistical mechanics, which generalizes the Boltzmann-Gibbs theory. Tsallis statistics have found applications in a wide range of phenomena in diverse disciplines such as physics, chemistry, biology, medicine, economics, geophysics, etc. The focus of this special issue of Entropy was to solicit contributions that apply Tsallis entropy in various scientific fields.

Research

Open Access Article: Tsallis Relative Entropy and Anomalous Diffusion
Entropy 2012, 14(4), 701-716; doi:10.3390/e14040701
Received: 1 March 2012 / Revised: 19 March 2012 / Accepted: 30 March 2012 / Published: 10 April 2012
Cited by 18
Abstract
In this paper we utilize the Tsallis relative entropy, a generalization of the Kullback–Leibler entropy in the framework of non-extensive thermodynamics, to analyze the properties of anomalous diffusion processes. Anomalous (super-) diffusive behavior can be described by fractional diffusion equations, where the second-order space derivative is extended to fractional order α ∈ (1, 2). They represent a bridging regime: for α = 2 one obtains the diffusion equation, and for α = 1 the (half) wave equation. These fractional diffusion equations are solved by so-called stable distributions, which exhibit heavy tails and skewness. In contrast to the Shannon or Tsallis entropy of these distributions, the Kullback–Leibler and Tsallis relative entropies, taken relative to the pure diffusion case, induce a natural ordering of the stable distributions consistent with the ordering implied by the pure diffusion and wave limits.
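For orientation, in one common convention (equivalent parameterizations exist in the literature), the Tsallis relative entropy of a density p with respect to a reference density r reads

    D_q(p \,\|\, r) = \frac{1}{q-1}\left( \int p(x)^{q}\, r(x)^{1-q}\, dx \; - \; 1 \right), \qquad \lim_{q \to 1} D_q(p \,\|\, r) = \int p(x) \ln\frac{p(x)}{r(x)}\, dx ,

so the Kullback–Leibler divergence is recovered as q → 1.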
Open Access Article: Classes of N-Dimensional Nonlinear Fokker-Planck Equations Associated to Tsallis Entropy
Entropy 2011, 13(11), 1928-1944; doi:10.3390/e13111928
Received: 4 October 2011 / Accepted: 21 October 2011 / Published: 1 November 2011
Cited by 20
Abstract
Several previous results valid for one-dimensional nonlinear Fokker-Planck equations are generalized to N dimensions. A general nonlinear N-dimensional Fokker-Planck equation is derived directly from a master equation by considering nonlinearities in the transition rates. Using nonlinear Fokker-Planck equations, the H-theorem is proved; for that, an important relation involving these equations and general entropic forms is introduced. It is shown that, due to this relation, classes of nonlinear N-dimensional Fokker-Planck equations are connected to a single entropic form. Particular emphasis is given to the class of equations associated to Tsallis entropy, for both the standard and the generalized definitions of the internal energy.
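As a point of reference (a standard one-dimensional prototype, not the paper's N-dimensional derivation; sign and exponent conventions vary in the literature), the nonlinear Fokker-Planck equation conventionally associated with Sq can be written, for a drift A(x) and diffusion constant D, as

    \frac{\partial P(x,t)}{\partial t} = -\frac{\partial}{\partial x}\bigl[A(x)\,P(x,t)\bigr] + D\,\frac{\partial^{2}}{\partial x^{2}}\bigl[P(x,t)^{\,2-q}\bigr] ,

where the exponent 2 - q ties the nonlinearity of the diffusion term to the entropic index q and the stationary solutions are of q-exponential (maximum-Sq) form.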
Open Access Article: Quantifying Dynamical Complexity of Magnetic Storms and Solar Flares via Nonextensive Tsallis Entropy
Entropy 2011, 13(10), 1865-1881; doi:10.3390/e13101865
Received: 1 September 2011 / Revised: 28 September 2011 / Accepted: 30 September 2011 / Published: 14 October 2011
Cited by 11
Abstract
Over the last couple of decades nonextensive Tsallis entropy has shown remarkable applicability in describing nonequilibrium physical systems with large variability and multifractal structure. Herein, we review recent results from the application of Tsallis statistical mechanics to the detection of dynamical changes related to the occurrence of magnetic storms. We extend our review to describe attempts to approach the dynamics of magnetic storms and solar flares by means of universality through Tsallis statistics. We also include a discussion of possible implications for space weather forecasting efforts arising from the verification of Tsallis entropy in the complex system of the magnetosphere.
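As a rough, generic illustration of this kind of analysis (not the paper's actual procedure: the function names, window length, binning, and value of q below are arbitrary choices), one can track the Tsallis entropy of a signal's amplitude distribution over consecutive windows and look for changes:

    import numpy as np

    def tsallis_entropy(p, q):
        """Tsallis entropy S_q of a discrete distribution p (k = 1)."""
        p = p[p > 0]
        if np.isclose(q, 1.0):
            return -np.sum(p * np.log(p))  # Boltzmann-Gibbs limit
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    def windowed_tsallis_entropy(x, window=256, n_bins=8, q=1.8):
        """S_q of the amplitude histogram in consecutive, non-overlapping windows."""
        values = []
        for start in range(0, len(x) - window + 1, window):
            hist, _ = np.histogram(x[start:start + window], bins=n_bins)
            values.append(tsallis_entropy(hist / hist.sum(), q))
        return np.array(values)

    # Toy signal whose variability changes halfway through
    x = np.concatenate([np.random.normal(0, 1, 2048), np.random.normal(0, 4, 2048)])
    print(windowed_tsallis_entropy(x))

A drop or jump in the windowed entropy then flags a change in the underlying dynamics.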
Open Access Article: Tsallis Entropy for Geometry Simplification
Entropy 2011, 13(10), 1805-1828; doi:10.3390/e13101805
Received: 1 August 2011 / Revised: 20 September 2011 / Accepted: 27 September 2011 / Published: 29 September 2011
Cited by 1
Abstract
This paper presents a study and a comparison of the use of different information-theoretic measures for polygonal mesh simplification. Generalized measures from Information Theory, such as the Havrda–Charvát–Tsallis entropy and mutual information, have been applied. These measures have been used in the error metric of a surface simplification algorithm. We demonstrate that these measures are useful for simplifying three-dimensional polygonal meshes. We have also compared these metrics with the error metrics used in a geometry-based method and in an image-driven method. Quantitative results of the comparison are presented using the root-mean-square error (RMSE).
Open Access Article: Projective Power Entropy and Maximum Tsallis Entropy Distributions
Entropy 2011, 13(10), 1746-1764; doi:10.3390/e13101746
Received: 26 July 2011 / Revised: 20 September 2011 / Accepted: 20 September 2011 / Published: 26 September 2011
Cited by 10
Abstract
We discuss a one-parameter family of generalized cross entropies between two distributions with a power index, called the projective power entropy. The cross entropy essentially reduces to the Tsallis entropy if the two distributions are taken to be equal. Statistical and probabilistic properties associated with the projective power entropy are extensively investigated, including a characterization problem of which conditions uniquely determine the projective power entropy up to the power index. A close relation of the entropy with the Lebesgue space Lp and the dual Lq is explored, in which the escort distribution is associated with an interesting property. When we consider maximum Tsallis entropy distributions under the constraints of the mean vector and variance matrix, the model becomes a multivariate q-Gaussian model with elliptical contours, including the Gaussian and t-distribution models. We discuss statistical estimation by minimization of the empirical loss associated with the projective power entropy. It is shown that the minimum loss estimators for the mean vector and variance matrix under the maximum entropy model are the sample mean vector and the sample variance matrix. The escort distribution of the maximum entropy distribution plays the key role in the derivation.
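For concreteness, the maximum Tsallis entropy distribution under mean and variance constraints referred to here is the q-Gaussian, which in one dimension takes the form (up to normalization, with a suitable β > 0)

    p(x) \;\propto\; \exp_q\!\bigl(-\beta (x-\mu)^2\bigr) = \bigl[\,1 - (1-q)\,\beta\,(x-\mu)^{2}\,\bigr]_{+}^{\frac{1}{1-q}} ,

which reduces to the Gaussian as q → 1 and is a rescaled Student t-distribution for 1 < q < 3.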
Open Access Article: Tsallis Mutual Information for Document Classification
Entropy 2011, 13(9), 1694-1707; doi:10.3390/e13091694
Received: 1 August 2011 / Revised: 5 September 2011 / Accepted: 8 September 2011 / Published: 14 September 2011
Cited by 9
Abstract
Mutual information is one of the most widely used measures for evaluating image similarity. In this paper, we investigate the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned documents. These three generalizations derive from the Kullback–Leibler distance, the difference between entropy and conditional entropy, and the Jensen–Tsallis divergence, respectively. In addition, the ratio between these measures and the Tsallis joint entropy is analyzed. The performance of all these measures is studied for different entropic indexes in the context of document classification and registration.

Open Access Article: A Risk Profile for Information Fusion Algorithms
Entropy 2011, 13(8), 1518-1532; doi:10.3390/e13081518
Received: 17 May 2011 / Revised: 4 August 2011 / Accepted: 11 August 2011 / Published: 17 August 2011
Cited by 5
Abstract
E.T. Jaynes, originator of the maximum entropy interpretation of statistical mechanics, emphasized that there is an inevitable trade-off between the conflicting requirements of robustness and accuracy for any inferencing algorithm. This is because robustness requires discarding information in order to reduce the sensitivity to outliers. The principle of nonlinear statistical coupling, which is an interpretation of the Tsallis entropy generalization, can be used to quantify this trade-off. The coupled surprisal, -ln_κ(p) ≡ -(p^κ - 1)/κ, is a generalization of the Shannon surprisal, or the logarithmic scoring rule, given a forecast p of a true event by an inferencing algorithm. The coupling parameter κ = 1 - q, where q is the Tsallis entropy index, is the degree of nonlinear coupling between statistical states. Positive (negative) values of nonlinear coupling decrease (increase) the surprisal information metric and thereby bias the risk in favor of decisive (robust) algorithms relative to the Shannon surprisal (κ = 0). We show that translating the average coupled surprisal to an effective probability is equivalent to using the generalized mean of the true event probabilities as a scoring rule. The metric is used to assess the robustness, accuracy, and decisiveness of a fusion algorithm. We use a two-parameter fusion algorithm to combine input probabilities from N sources. The generalized mean parameter α varies the degree of smoothing, and raising to a power N^β, with β between 0 and 1, provides a model of correlation.
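Written out with the abstract's own definition, the coupled surprisal and its Shannon limit are

    -\ln_{\kappa}(p) \;\equiv\; -\frac{p^{\kappa} - 1}{\kappa}, \qquad \kappa = 1 - q, \qquad \lim_{\kappa \to 0}\bigl[-\ln_{\kappa}(p)\bigr] = -\ln p ,

so the ordinary logarithmic scoring rule is recovered at κ = 0.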
Open Access Article: Tsallis-Based Nonextensive Analysis of the Southern California Seismicity
Entropy 2011, 13(7), 1267-1280; doi:10.3390/e13071267
Received: 6 June 2011 / Revised: 21 June 2011 / Accepted: 6 July 2011 / Published: 11 July 2011
Cited by 21
Abstract
Nonextensive statistics has become a very useful tool for describing the complexity of dynamic systems. Recently, analysis of the magnitude distribution of earthquakes has been increasingly used in the context of nonextensivity. In the present paper, a nonextensive analysis of the southern California earthquake catalog was performed. The results show that the nonextensivity parameter q lies in the same range as obtained for other seismic areas, thus suggesting a sort of universal character in the nonextensive interpretation of seismicity.
Open Access Article: On the Thermodynamics of Classical Micro-Canonical Systems
Entropy 2011, 13(6), 1186-1199; doi:10.3390/e13061186
Received: 18 May 2011 / Revised: 15 June 2011 / Accepted: 16 June 2011 / Published: 21 June 2011
Cited by 6
Abstract
We give two arguments why the thermodynamic entropy of non-extensive systems involves Rényi's entropy function rather than that of Tsallis. The first argument is that the temperature of the configurational subsystem of a mono-atomic gas is equal to that of the kinetic subsystem. The second argument is that the instability of the pendulum, which occurs for energies close to the rotation threshold, is correctly reproduced.
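For readers weighing the two candidates, the discrete Rényi and Tsallis entropies and the monotone relation between them are (with k = 1)

    R_q = \frac{\ln \sum_i p_i^{\,q}}{1-q}, \qquad S_q = \frac{1 - \sum_i p_i^{\,q}}{q-1}, \qquad R_q = \frac{\ln\bigl[1 + (1-q)\,S_q\bigr]}{1-q} ,

so the two functionals are in one-to-one correspondence for fixed q, even though they lead to different thermodynamic formalisms.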
Open Access Article: Optimal Multi-Level Thresholding Based on Maximum Tsallis Entropy via an Artificial Bee Colony Approach
Entropy 2011, 13(4), 841-859; doi:10.3390/e13040841
Received: 2 March 2011 / Revised: 17 March 2011 / Accepted: 29 March 2011 / Published: 13 April 2011
Cited by 69
Abstract
This paper proposes a global multi-level thresholding method for image segmentation. As a criterion, the traditional method uses the Shannon entropy, which originated from information theory, treating the gray-level image histogram as a probability distribution, while we applied the Tsallis entropy as a general information-theoretic entropy formalism. For the algorithm, we used the artificial bee colony approach, since execution of an exhaustive algorithm would be too time-consuming. The experiments demonstrate that: (1) the Tsallis entropy is superior to traditional maximum-entropy thresholding, maximum between-class variance thresholding, and minimum cross-entropy thresholding; (2) the artificial bee colony is more rapid than either the genetic algorithm or particle swarm optimization. Therefore, our approach is effective and rapid.
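To make the criterion concrete, here is a minimal single-threshold sketch based on the commonly used pseudo-additive Tsallis thresholding objective, found by exhaustive search (the paper itself optimizes multiple thresholds with an artificial bee colony, which is not reproduced here; the function name and the value of q are our own illustrative choices):

    import numpy as np

    def tsallis_threshold(image, q=0.8, n_bins=256):
        """Single Tsallis-entropy threshold for a gray-level image (exhaustive search)."""
        hist, _ = np.histogram(image, bins=n_bins, range=(0, n_bins))
        p = hist / hist.sum()

        def s_q(class_probs):
            class_probs = class_probs[class_probs > 0]
            if class_probs.size == 0:
                return 0.0
            class_probs = class_probs / class_probs.sum()  # renormalize within the class
            return (1.0 - np.sum(class_probs ** q)) / (q - 1.0)

        best_t, best_score = 1, -np.inf
        for t in range(1, n_bins):
            sa, sb = s_q(p[:t]), s_q(p[t:])
            score = sa + sb + (1.0 - q) * sa * sb  # pseudo-additive combination
            if score > best_score:
                best_t, best_score = t, score
        return best_t

    # Toy bimodal "image": two gray-level populations
    img = np.concatenate([np.random.normal(60, 10, 5000),
                          np.random.normal(180, 15, 5000)]).clip(0, 255)
    print(tsallis_threshold(img))

A multi-level version optimizes several thresholds jointly, which is where the metaheuristic search becomes worthwhile.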

Review

Open Access Review: The Nonadditive Entropy Sq and Its Applications in Physics and Elsewhere: Some Remarks
Entropy 2011, 13(10), 1765-1804; doi:10.3390/e13101765
Received: 15 August 2011 / Revised: 11 September 2011 / Accepted: 19 September 2011 / Published: 28 September 2011
Cited by 43
Abstract
The nonadditive entropy Sq was introduced in 1988, focusing on a generalization of Boltzmann–Gibbs (BG) statistical mechanics. The aim was to cover a (possibly wide) class of systems among the very many which violate hypotheses such as ergodicity, under which the BG theory is expected to be valid. It is now known that Sq has a wide applicability; more specifically, it applies even outside Hamiltonian systems and their thermodynamical approach. In the present paper we review and comment on some relevant aspects of this entropy, namely (i) additivity versus extensivity; (ii) probability distributions that constitute attractors in the sense of central limit theorems; (iii) the analysis of paradigmatic low-dimensional nonlinear dynamical systems near the edge of chaos; and (iv) the analysis of paradigmatic long-range-interacting many-body classical Hamiltonian systems. Finally, we exhibit recent as well as typical predictions, verifications and applications of these concepts in natural, artificial, and social systems, as shown through theoretical, experimental, observational and computational results.
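The distinction in point (i) hinges on the composition rule of Sq for two probabilistically independent systems A and B (with k = 1):

    S_q(A+B) = S_q(A) + S_q(B) + (1-q)\, S_q(A)\, S_q(B) ,

so Sq is additive only for q = 1; for suitably correlated systems, however, a value q ≠ 1 can make Sq extensive, i.e., proportional to the number of elements.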
