
Distance in Information and Statistical Physics Volume 2

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 June 2013) | Viewed by 91825

Special Issue Editor

Takuya Yamano
Department of Mathematics and Physics, Faculty of Science, Kanagawa University, 3-27-1 Rokkakubashi, Yokohama 221-8686, Kanagawa, Japan
Interests: Fisher information; non-extensivity; information theory; nonlinear Fokker–Planck equations; nonlinear Schrödinger equations; complexity measure; irreversibility; tumor growth; temperature-dependent energy levels in statistical physics

Special Issue Information

Dear Colleagues,

The notion of distance plays a pivotal role in information sciences and statistical physics. For example, relative entropy aids our understanding of the asymptotic behavior of systems and quantifies how distinguishable two distributions are. It is no exaggeration to say that much effort revolves around clarifying the information structure pertaining to distance measures (entropies). This Special Issue provides a forum to present and discuss recent progress on the topics listed in the keywords below.

Takuya Yamano
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • relative entropy
  • Kullback-Leibler divergence
  • typicality
  • quantum thermodynamics
  • nonequilibrium entropy
  • fluctuation
  • 2nd law of thermodynamics
  • information geometry
  • Fisher information
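
For orientation (an editorial note, not part of the call), the central quantity behind several of the keywords above is the relative entropy (Kullback–Leibler divergence) between two distributions P and Q:

    D(P \| Q) = \sum_x P(x) \ln \frac{P(x)}{Q(x)} \;\ge\; 0,

with equality if and only if P = Q. It is asymmetric and does not obey the triangle inequality, which is why it is called a divergence rather than a metric; Fisher information arises as its local second-order expansion within a parametric family.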


Published Papers (11 papers)


Research

Article
Local Softening of Information Geometric Indicators of Chaos in Statistical Modeling in the Presence of Quantum-Like Considerations
by Adom Giffin, Sean A. Ali and Carlo Cafaro
Entropy 2013, 15(11), 4622-4633; https://doi.org/10.3390/e15114622 - 28 Oct 2013
Cited by 8 | Viewed by 4315
Abstract
In a previous paper (C. Cafaro et al., 2012), we compared an uncorrelated 3D Gaussian statistical model to an uncorrelated 2D Gaussian statistical model obtained from the former model by introducing a constraint that resembles the quantum mechanical canonical minimum uncertainty relation. Analysis was completed by way of the information geometry and the entropic dynamics of each system. This analysis revealed that the chaoticity of the 2D Gaussian statistical model, quantified by means of the Information Geometric Entropy (IGE), is softened or weakened with respect to the chaoticity of the 3D Gaussian statistical model, due to the accessibility of more information. In this companion work, we further constrain the system in the context of a correlation constraint among the system’s micro-variables and show that the chaoticity is further weakened, but only locally. Finally, the physicality of the constraints is briefly discussed, particularly in the context of quantum entanglement.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
Article
Correlation Distance and Bounds for Mutual Information
by Michael J. W. Hall
Entropy 2013, 15(9), 3698-3713; https://doi.org/10.3390/e15093698 - 06 Sep 2013
Cited by 7 | Viewed by 6285
Abstract
The correlation distance quantifies the statistical independence of two classical or quantum systems, via the distance from their joint state to the product of the marginal states. Tight lower bounds are given for the mutual information between pairs of two-valued classical variables and quantum qubits, in terms of the corresponding classical and quantum correlation distances. These bounds are stronger than the Pinsker inequality (and refinements thereof) for relative entropy. The classical lower bound may be used to quantify properties of statistical models that violate Bell inequalities. Partially entangled qubits can have lower mutual information than can any two-valued classical variables having the same correlation distance. The qubit correlation distance also provides a direct entanglement criterion, related to the spin covariance matrix. Connections of results with classically-correlated quantum states are briefly discussed.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
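
A rough companion sketch (editorial, not from the paper): it computes a total-variation correlation distance between a joint distribution of two binary variables and the product of its marginals, the corresponding mutual information, and the standard Pinsker lower bound I ≥ 2C² in nats. The joint distribution is illustrative, and the paper's tighter, refined bounds are not reproduced.

```python
import numpy as np

# Illustrative joint distribution of two binary variables X, Y (rows: x, columns: y);
# the numbers are not taken from the paper.
p_xy = np.array([[0.40, 0.10],
                 [0.10, 0.40]])

p_x = p_xy.sum(axis=1, keepdims=True)    # marginal of X
p_y = p_xy.sum(axis=0, keepdims=True)    # marginal of Y
p_prod = p_x * p_y                       # product of marginals

# Correlation distance taken here as the total-variation distance
# between the joint state and the product of the marginals.
corr_dist = 0.5 * np.abs(p_xy - p_prod).sum()

# Mutual information in nats: KL(joint || product of marginals).
mask = p_xy > 0
mutual_info = (p_xy[mask] * np.log(p_xy[mask] / p_prod[mask])).sum()

print(f"correlation distance   C   = {corr_dist:.4f}")
print(f"mutual information     I   = {mutual_info:.4f} nats")
print(f"Pinsker lower bound  2*C^2 = {2 * corr_dist**2:.4f}")
```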

Article
Information Geometry of Complex Hamiltonians and Exceptional Points
by Dorje C. Brody and Eva-Maria Graefe
Entropy 2013, 15(9), 3361-3378; https://doi.org/10.3390/e15093361 - 23 Aug 2013
Cited by 31 | Viewed by 7075
Abstract
Information geometry provides a tool to systematically investigate the parameter sensitivity of the state of a system. If a physical system is described by a linear combination of eigenstates of a complex (that is, non-Hermitian) Hamiltonian, then there can be phase transitions where dynamical properties of the system change abruptly. In the vicinities of the transition points, the state of the system becomes highly sensitive to the changes of the parameters in the Hamiltonian. The parameter sensitivity can then be measured in terms of the Fisher-Rao metric and the associated curvature of the parameter-space manifold. A general scheme for the geometric study of parameter-space manifolds of eigenstates of complex Hamiltonians is outlined here, leading to generic expressions for the metric.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
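
For orientation (editorial note): the Fisher–Rao metric on a manifold of probability densities p(x|θ) parametrized by θ has the standard form

    g_{ab}(\theta) = \int p(x|\theta)\, \frac{\partial \ln p(x|\theta)}{\partial \theta^a}\, \frac{\partial \ln p(x|\theta)}{\partial \theta^b}\, dx ,

and the associated curvature measures how sensitively the state responds to parameter changes. The article extends this construction to states built from eigenstates of complex (non-Hermitian) Hamiltonians; the formula above is only the familiar real-parameter starting point.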
Article
Time Evolution of Relative Entropies for Anomalous Diffusion
by Janett Prehl, Frank Boldt, Christopher Essex and Karl Heinz Hoffmann
Entropy 2013, 15(8), 2989-3006; https://doi.org/10.3390/e15082989 - 26 Jul 2013
Cited by 13 | Viewed by 6176
Abstract
The entropy production paradox for anomalous diffusion processes describes a phenomenon where one-parameter families of dynamical equations, falling between the diffusion and wave equations, have entropy production rates (Shannon, Tsallis or Renyi) that increase toward the wave equation limit unexpectedly. Moreover, also surprisingly, the entropy does not order the bridging regime between diffusion and waves at all. However, it has been found that relative entropies, with an appropriately chosen reference distribution, do. Relative entropies, thus, provide a physically sensible way of setting which process is “nearer” to pure diffusion than another, placing pure wave propagation, desirably, “furthest” from pure diffusion. We examine here the time behavior of the relative entropies under the evolution dynamics of the underlying one-parameter family of dynamical equations based on space-fractional derivatives.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
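
A toy stand-in (editorial, not the paper's setting): the snippet below tracks the closed-form relative entropy between a Gaussian point-source solution of the ordinary diffusion equation and a fixed Gaussian reference distribution, to illustrate what the time behavior of a relative entropy under evolution dynamics means in the simplest case. The space-fractional family studied in the paper is not implemented, and the diffusion constant and reference time are arbitrary.

```python
import numpy as np

def gaussian_kl(mu1, var1, mu2, var2):
    """KL( N(mu1, var1) || N(mu2, var2) ) in nats, 1D closed form."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

D, t_ref = 1.0, 1.0                 # illustrative diffusion constant and reference time
reference_var = 2.0 * D * t_ref     # fixed Gaussian reference distribution
for t in (0.25, 0.5, 1.0, 2.0, 4.0):
    var_t = 2.0 * D * t             # variance of the point-source solution at time t
    kl = gaussian_kl(0.0, var_t, 0.0, reference_var)
    print(f"t = {t:4.2f}   KL(solution || reference) = {kl:.4f} nats")
```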

Article
Pushing for the Extreme: Estimation of Poisson Distribution from Low Count Unreplicated Data—How Close Can We Get?
by Peter Tiňo
Entropy 2013, 15(4), 1202-1220; https://doi.org/10.3390/e15041202 - 08 Apr 2013
Cited by 1 | Viewed by 5585
Abstract
Studies of learning algorithms typically concentrate on situations where a potentially ever-growing training sample is available. Yet, there can be situations (e.g., detection of differentially expressed genes on unreplicated data or estimation of time delay in non-stationary gravitationally lensed photon streams) where only extremely small samples can be used in order to perform an inference. On unreplicated data, the inference has to be performed on the smallest sample possible—a sample of size 1. We study whether anything useful can be learnt in such extreme situations by concentrating on a Bayesian approach that can account for possible prior information on expected counts. We perform a detailed information theoretic study of such Bayesian estimation and quantify the effect of Bayesian averaging on its first two moments. Finally, to analyze potential benefits of the Bayesian approach, we also consider Maximum Likelihood (ML) estimation as a baseline approach. We show both theoretically and empirically that the Bayesian model averaging can be potentially beneficial.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
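
A minimal conjugate-prior sketch (editorial; the paper's estimators and information-theoretic analysis are not reproduced): with a Gamma(a, b) prior on the Poisson rate, a single observed count n gives a Gamma(a + n, b + 1) posterior in closed form, so a Bayesian point estimate can be compared directly against the maximum-likelihood estimate n. The hyperparameters a and b below are illustrative.

```python
def poisson_estimates(n, a=2.0, b=1.0):
    """Estimate a Poisson rate from a single count n.

    ML uses the count itself; the Bayesian estimate is the posterior mean
    under a conjugate Gamma(a, b) prior (shape a, rate b).
    """
    ml = float(n)                     # ML estimate from a sample of size 1
    post_mean = (a + n) / (b + 1.0)   # mean of the Gamma(a + n, b + 1) posterior
    return ml, post_mean

for n in range(4):                    # low counts, the regime the paper focuses on
    ml, bayes = poisson_estimates(n)
    print(f"count n = {n}:  ML = {ml:.1f}   Bayes posterior mean = {bayes:.2f}")
```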

Article
Kullback–Leibler Divergence Measure for Multivariate Skew-Normal Distributions
by Javier E. Contreras-Reyes and Reinaldo B. Arellano-Valle
Entropy 2012, 14(9), 1606-1626; https://doi.org/10.3390/e14091606 - 04 Sep 2012
Cited by 62 | Viewed by 12281
Abstract
The aim of this work is to provide the tools to compute the well-known Kullback–Leibler divergence measure for the flexible family of multivariate skew-normal distributions. In particular, we use the Jeffreys divergence measure to compare the multivariate normal distribution with the skew-multivariate normal distribution, showing that this is equivalent to comparing univariate versions of these distributions. Finally, we apply our results to a seismological catalogue data set related to the 2010 Maule earthquake. Specifically, we compare the distributions of the local magnitudes of the regions formed by the aftershocks.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
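
For a concrete starting point (editorial sketch): the closed-form Kullback–Leibler divergence between two multivariate normal distributions, and its symmetrized Jeffreys-type sum, are shown below. The paper's actual contribution, the corresponding tools for the multivariate skew-normal family, is not implemented here, and the parameter values are illustrative.

```python
import numpy as np

def kl_mvn(mu0, S0, mu1, S1):
    """KL( N(mu0, S0) || N(mu1, S1) ) for multivariate normals, in nats."""
    d = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

mu0, S0 = np.zeros(2), np.eye(2)
mu1, S1 = np.array([1.0, 0.0]), np.array([[1.5, 0.3], [0.3, 1.0]])

print(kl_mvn(mu0, S0, mu1, S1))                              # asymmetric KL divergence
print(kl_mvn(mu0, S0, mu1, S1) + kl_mvn(mu1, S1, mu0, S0))   # symmetrized (Jeffreys) divergence
```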

Article
Geometry of q-Exponential Family of Probability Distributions
by Shun-ichi Amari and Atsumi Ohara
Entropy 2011, 13(6), 1170-1185; https://doi.org/10.3390/e13061170 - 14 Jun 2011
Cited by 65 | Viewed by 11124
Abstract
The Gibbs distribution of statistical physics is an exponential family of probability distributions, which has a mathematical basis of duality in the form of the Legendre transformation. Recent studies of complex systems have found lots of distributions obeying the power law rather than the standard Gibbs type distributions. The Tsallis q-entropy is a typical example capturing such phenomena. We treat the q-Gibbs distribution or the q-exponential family by generalizing the exponential function to the q-family of power functions, which is useful for studying various complex or non-standard physical phenomena. We give a new mathematical structure to the q-exponential family different from those previously given. It has a dually flat geometrical structure derived from the Legendre transformation and the conformal geometry is useful for understanding it. The q-version of the maximum entropy theorem is naturally induced from the q-Pythagorean theorem. We also show that the maximizer of the q-escort distribution is a Bayesian MAP (Maximum A posteriori Probability) estimator.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
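
A reminder sketch of the deformed functions behind the q-exponential family (editorial; these are the standard Tsallis definitions, not code from the paper). The q-logarithm and q-exponential below reduce to the ordinary logarithm and exponential as q → 1 and are mutual inverses on their domains; the values of q are illustrative.

```python
import numpy as np

def q_log(x, q):
    """Tsallis q-logarithm; reduces to the natural log as q -> 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_exp(x, q):
    """Tsallis q-exponential; inverse of q_log where 1 + (1 - q) x > 0."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0.0, base ** (1.0 / (1.0 - q)), 0.0)

x = np.linspace(0.1, 3.0, 4)
for q in (0.5, 1.0, 1.5):
    print(q, q_exp(q_log(x, q), q))   # recovers x up to floating-point error
```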

Article
Distances in Probability Space and the Statistical Complexity Setup
by Andres M. Kowalski, Maria Teresa Martín, Angelo Plastino, Osvaldo A. Rosso and Montserrat Casas
Entropy 2011, 13(6), 1055-1075; https://doi.org/10.3390/e13061055 - 03 Jun 2011
Cited by 39 | Viewed by 11248
Abstract
Statistical complexity measures (SCM) are the composition of two ingredients: (i) entropies and (ii) distances in probability-space. In consequence, SCMs provide a simultaneous quantification of the randomness and the correlational structures present in the system under study. We address in this review important topics underlying the SCM structure, viz., (a) a good choice of probability metric space and (b) how to assess the best distance-choice, which in this context is called a “disequilibrium” and is denoted with the letter Q. Q, indeed the crucial SCM ingredient, is cast in terms of an associated distance D. Since our input data consists of time series, we also discuss the best way of extracting from the time series a probability distribution P. As an illustration, we show just how these issues affect the description of the classical limit of quantum mechanics.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
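
A hedged sketch of the "entropy times disequilibrium" composition described in the abstract (editorial): the disequilibrium Q is taken here as the Jensen–Shannon divergence between the distribution and the uniform one, paired with a normalized Shannon entropy. Normalization constants and the choice of distance vary across the SCM literature and are exactly what the review discusses, so this is only one concrete instance.

```python
import numpy as np

def shannon(p):
    """Shannon entropy in nats, ignoring zero-probability entries."""
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def statistical_complexity(p):
    """SCM-style product: normalized Shannon entropy times a Jensen-Shannon
    disequilibrium Q measured against the uniform distribution (one possible choice)."""
    n = len(p)
    u = np.full(n, 1.0 / n)
    h_norm = shannon(p) / np.log(n)
    m = 0.5 * (p + u)
    q = shannon(m) - 0.5 * shannon(p) - 0.5 * shannon(u)   # Jensen-Shannon divergence
    return h_norm * q

print(statistical_complexity(np.array([1.0, 0.0, 0.0, 0.0])))    # fully ordered: C = 0
print(statistical_complexity(np.full(4, 0.25)))                   # uniform: C = 0 (up to rounding)
print(statistical_complexity(np.array([0.5, 0.3, 0.15, 0.05])))   # intermediate: C > 0
```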

Article
Parametric Bayesian Estimation of Differential Entropy and Relative Entropy
by Maya Gupta and Santosh Srivastava
Entropy 2010, 12(4), 818-843; https://doi.org/10.3390/e12040818 - 09 Apr 2010
Cited by 25 | Viewed by 9776
Abstract
Given iid samples drawn from a distribution with known parametric form, we propose the minimization of expected Bregman divergence to form Bayesian estimates of differential entropy and relative entropy, and derive such estimators for the uniform, Gaussian, Wishart, and inverse Wishart distributions. Additionally, formulas are given for a log gamma Bregman divergence and the differential entropy and relative entropy for the Wishart and inverse Wishart. The results, as always with Bayesian estimates, depend on the accuracy of the prior parameters, but example simulations show that the performance can be substantially improved compared to maximum likelihood or state-of-the-art nonparametric estimators.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
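
To make the estimation target concrete (editorial sketch, not the paper's estimators): the differential entropy of a multivariate Gaussian has the closed form 0.5 ln((2πe)^d det Σ), and a naive plug-in estimate substitutes the sample covariance from a small sample. The paper's Bayesian estimators, obtained by minimizing an expected Bregman divergence, are designed to improve on such baselines and are not reproduced here; the covariance and sample size below are illustrative.

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of N(mu, cov) in nats: 0.5 * ln((2*pi*e)^d * det(cov))."""
    d = cov.shape[0]
    return 0.5 * (d * np.log(2.0 * np.pi * np.e) + np.log(np.linalg.det(cov)))

rng = np.random.default_rng(0)
true_cov = np.array([[1.0, 0.4],
                     [0.4, 2.0]])
x = rng.multivariate_normal(np.zeros(2), true_cov, size=20)   # deliberately small sample
plug_in = gaussian_entropy(np.cov(x, rowvar=False))           # sample-covariance plug-in

print(f"true differential entropy  {gaussian_entropy(true_cov):.3f} nats")
print(f"plug-in estimate           {plug_in:.3f} nats")
```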

Article
Entropy and Divergence Associated with Power Function and the Statistical Application
by Shinto Eguchi and Shogo Kato
Entropy 2010, 12(2), 262-274; https://doi.org/10.3390/e12020262 - 25 Feb 2010
Cited by 32 | Viewed by 8392
Abstract
In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, the entropy corresponds to the maximum likelihood method, in which Kullback-Leibler divergence connects Boltzmann-Shannon entropy and the expected log-likelihood function. Maximum likelihood estimation has been favored for its optimal performance, which is, however, known to break down easily in the presence of a small degree of model uncertainty. To deal with this problem, a new statistical method, closely related to Tsallis entropy, is proposed and shown to be robust against outliers, and we discuss a local learning property associated with the method.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
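
For orientation only (editorial note): one widely used power-type divergence in robust statistics is the density power divergence

    d_\beta(g, f) = \int \left[ f^{1+\beta}(x) - \Bigl(1 + \tfrac{1}{\beta}\Bigr) g(x)\, f^{\beta}(x) + \tfrac{1}{\beta}\, g^{1+\beta}(x) \right] dx, \qquad \beta > 0,

which recovers the Kullback-Leibler divergence in the limit β → 0 and downweights outliers for β > 0. The paper develops its own entropy and divergence associated with a power function, closely related to Tsallis entropy, so this formula should be read as a neighboring construction rather than as the paper's definition.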
Article
Transport of Heat and Charge in Electromagnetic Metrology Based on Nonequilibrium Statistical Mechanics
by James Baker-Jarvis and Jack Surek
Entropy 2009, 11(4), 748-765; https://doi.org/10.3390/e11040748 - 03 Nov 2009
Cited by 1 | Viewed by 8745
Abstract
Current research is probing transport on ever smaller scales. Modeling of the electromagnetic interaction with nanoparticles or small collections of dipoles and its associated energy transport and nonequilibrium characteristics requires a detailed understanding of transport properties. The goal of this paper is to use a nonequilibrium statistical-mechanical method to obtain exact time-correlation functions, fluctuation-dissipation theorems (FD), heat and charge transport, and associated transport expressions under electromagnetic driving. We extend the time-symmetric Robertson statistical-mechanical theory to study the exact time evolution of relevant variables and entropy rate in the electromagnetic interaction with materials. In this exact statistical-mechanical theory, a generalized canonical density is used to define an entropy in terms of a set of relevant variables and associated Lagrange multipliers. The entropy production rate is then defined through the relevant variables. The influence of the nonrelevant variables enters the equations through the projection-like operator and thereby influences the entropy. We present applications to the response functions for the electrical and thermal conductivity, specific heat, generalized temperature, Boltzmann’s constant, and noise. The analysis can be performed either classically or quantum-mechanically, and there are only a few modifications in transferring between the approaches. As an application we study the energy, generalized temperature, and charge transport equations that are valid in nonequilibrium and relate them to heat flow and temperature relations in equilibrium states.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)