
Distance in Information and Statistical Physics III

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Statistical Physics".

Deadline for manuscript submissions: closed (25 June 2022) | Viewed by 13004

Special Issue Editor


Dr. Takuya Yamano
Guest Editor
Department of Mathematics and Physics, Faculty of Science, Kanagawa University, 3-27-1 Rokkakubashi, Yokohama 221-8686, Kanagawa, Japan
Interests: Fisher information; non-extensivity; information theory; nonlinear Fokker–Planck equations; nonlinear Schrödinger equations; complexity measures; irreversibility; tumor growth; temperature-dependent energy levels in statistical physics

Special Issue Information

Dear Colleagues,

Distance measures are fundamental tools in science, especially in the information sciences and in statistical physics. One often needs to quantify how closely a nonequilibrium state approaches an equilibrium one by using a divergence measure between the two states; the relative entropy, for instance, helps us understand the asymptotic behavior of such systems and quantifies the remaining difference. In information theory, much effort has been devoted to clarifying the information structures underlying various distance measures (entropies and divergences). To reflect growing interest and recent insights in these areas, we invite researchers to contribute to this renewed edition (please see the previous edition at https://www.mdpi.com/si/entropy/distance-info-stat-physics). We use the term “distance”, but it should be understood in a broad sense: geometry, divergence, discrimination, degree of irreversibility, the arrow of time, and related notions. This Special Issue should provide a forum to present and discuss recent progress on the topics listed in the keywords below and in related areas. A minimal numerical illustration of such a divergence computation is sketched after this letter.

Dr. Takuya Yamano
Guest Editor
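As a minimal, hedged illustration of the divergence-based monitoring described in the letter above, the Python sketch below computes the Kullback–Leibler divergence (relative entropy) D(p‖q) = Σ p ln(p/q) between a toy nonequilibrium distribution relaxing toward a uniform equilibrium distribution; the distributions and the exponential mixing rate are hypothetical and serve only to show the divergence decreasing toward zero.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p_i = 0 contribute zero
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical relaxation: a nonequilibrium state mixes toward equilibrium q_eq.
q_eq = np.array([0.25, 0.25, 0.25, 0.25])   # equilibrium (uniform) state
p0   = np.array([0.70, 0.20, 0.05, 0.05])   # initial nonequilibrium state

for t in range(0, 21, 5):
    lam = np.exp(-0.3 * t)                  # toy exponential mixing rate
    p_t = lam * p0 + (1.0 - lam) * q_eq
    print(f"t={t:2d}  D(p_t || q_eq) = {kl_divergence(p_t, q_eq):.4f}")
```

The printed divergence decreases monotonically toward zero, mirroring the approach to equilibrium that the relative entropy is used to quantify.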

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • relative entropies
  • Kullback–Leibler divergence
  • Jensen–Shannon divergence
  • f-divergence
  • Fisher information
  • information geometry
  • Bregman divergence
  • nonequilibrium processes
  • state discrimination in quantum thermodynamics
  • nonequilibrium statistical mechanics
  • fluctuation theorems
  • second law of thermodynamics
  • irreversibility


Published Papers (6 papers)


Editorial


2 pages, 156 KiB  
Editorial
Distance in Information and Statistical Physics III
by Takuya Yamano
Entropy 2023, 25(1), 110; https://doi.org/10.3390/e25010110 - 05 Jan 2023
Viewed by 920
Abstract
This Special Issue is a subsequent edition of a previous collection that focused on the notion of distance in two major fields: Distance in Information and Statistical Physics Volume 2 [...]
(This article belongs to the Special Issue Distance in Information and Statistical Physics III)

Research


15 pages, 1172 KiB  
Article
Information–Theoretic Aspects of Location Parameter Estimation under Skew–Normal Settings
by Javier E. Contreras-Reyes
Entropy 2022, 24(3), 399; https://doi.org/10.3390/e24030399 - 13 Mar 2022
Cited by 5 | Viewed by 1541
Abstract
In several applications, the assumption of normality is violated in data exhibiting some degree of skewness, and this skewness affects the estimation of the mean. The class of skew–normal distributions is considered, given its flexibility for modeling data with an asymmetry parameter. In this paper, we consider two estimation methods for the location parameter (μ) in the skew–normal setting, where the coefficient of variation and the skewness parameter are known: the least squares estimator (LSE) and the best unbiased estimator (BUE). The properties of the BUE (which dominates the LSE) are explored using classic theorems of information theory, which provide a way to measure the uncertainty of location parameter estimates. Specifically, inequalities based on convexity yield lower and upper bounds for the differential entropy and the Fisher information. Simulations illustrate the behavior of these bounds.
(This article belongs to the Special Issue Distance in Information and Statistical Physics III)
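As a rough numerical companion to this abstract, the hedged sketch below evaluates the differential entropy h(f) = −∫ f ln f dx and the location Fisher information I(f) = ∫ (f′)²/f dx of a skew–normal density using scipy.stats.skewnorm; the shape, location, and scale values are arbitrary illustrations, and the sketch does not reproduce the paper's analytic bounds.

```python
import numpy as np
from scipy.stats import skewnorm
from scipy.integrate import quad

a, loc, scale = 3.0, 0.0, 1.0  # hypothetical skewness, location, and scale

pdf = lambda x: skewnorm.pdf(x, a, loc=loc, scale=scale)

def entropy_integrand(x):
    f = pdf(x)
    return -f * np.log(f) if f > 0 else 0.0

def fisher_integrand(x, h=1e-5):
    # central-difference approximation of the density derivative f'(x)
    f = pdf(x)
    df = (pdf(x + h) - pdf(x - h)) / (2 * h)
    return df**2 / f if f > 0 else 0.0

h_f, _ = quad(entropy_integrand, -10, 10)
i_f, _ = quad(fisher_integrand, -10, 10)
print(f"differential entropy   h(f) ≈ {h_f:.4f}")
print(f"location Fisher info   I(f) ≈ {i_f:.4f}")
```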

21 pages, 614 KiB  
Article
Generalizations of Talagrand Inequality for Sinkhorn Distance Using Entropy Power Inequality
by Shuchan Wang, Photios A. Stavrou and Mikael Skoglund
Entropy 2022, 24(2), 306; https://doi.org/10.3390/e24020306 - 21 Feb 2022
Cited by 2 | Viewed by 3017
Abstract
A distance that compares the difference between two probability distributions plays a fundamental role in statistics and machine learning. Optimal transport (OT) theory provides a theoretical framework for studying such distances. Recent advances in OT theory include a generalization of classical OT with an extra entropic constraint or regularization, called entropic OT. Despite its computational convenience, entropic OT still lacks sufficient theoretical support. In this paper, we show that the quadratic cost in entropic OT can be upper-bounded using entropy power inequality (EPI)-type bounds. First, we prove an HWI-type inequality by making use of the infinitesimal displacement convexity of the OT map. Second, we derive two Talagrand-type inequalities using the saturation of the EPI, which corresponds to a numerical term in our expressions. These two new inequalities are shown to generalize two previous results obtained by Bolley et al. and Bai et al. Using the new Talagrand-type inequalities, we also show that the geometry observed by the Sinkhorn distance is smoothed in the sense of measure concentration. Finally, we corroborate our results with various simulation studies.
(This article belongs to the Special Issue Distance in Information and Statistical Physics III)
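For readers unfamiliar with entropic OT, the sketch below runs the standard Sinkhorn fixed-point iteration for the entropy-regularized transport problem between two small discrete distributions; the cost matrix, regularization strength, and iteration count are arbitrary illustrative choices and are unrelated to the bounds derived in the paper.

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.1, n_iter=500):
    """Entropy-regularized OT between discrete distributions mu and nu.

    Returns the transport plan P and the (unregularized) transport cost <P, C>.
    """
    K = np.exp(-C / eps)          # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(n_iter):       # alternating Sinkhorn scalings
        v = nu / (K.T @ u)
        u = mu / (K @ v)
    P = u[:, None] * K * v[None, :]
    return P, float(np.sum(P * C))

# Hypothetical example: two distributions on a 1-D grid with quadratic cost.
x = np.linspace(0.0, 1.0, 5)
mu = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
nu = np.array([0.3, 0.3, 0.2, 0.1, 0.1])
C = (x[:, None] - x[None, :]) ** 2          # squared-distance cost matrix

P, cost = sinkhorn(mu, nu, C, eps=0.05)
print("transport plan row sums:", P.sum(axis=1).round(3))   # should match mu
print("approximate quadratic transport cost:", round(cost, 5))
```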

12 pages, 282 KiB  
Article
Inequalities for Jensen–Sharma–Mittal and Jeffreys–Sharma–Mittal Type f–Divergences
by Paweł A. Kluza
Entropy 2021, 23(12), 1688; https://doi.org/10.3390/e23121688 - 16 Dec 2021
Cited by 2 | Viewed by 1789
Abstract
In this paper, we introduce new divergences, called the Jensen–Sharma–Mittal and Jeffreys–Sharma–Mittal divergences, defined in relation to convex functions. Theorems providing lower and upper bounds for the two newly introduced divergences are given. The obtained results imply new inequalities for the corresponding known divergences. Examples showing that these divergences generalize the Rényi, Tsallis, and Kullback–Leibler types of divergences are provided to illustrate a few of their applications.
(This article belongs to the Special Issue Distance in Information and Statistical Physics III)
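Since the abstract notes that the new divergences generalize the Rényi, Tsallis, and Kullback–Leibler divergences, the brief sketch below evaluates those three classical divergences for discrete distributions and checks that the Rényi and Tsallis divergences approach the Kullback–Leibler divergence as their order tends to 1; the example distributions are arbitrary, and the Sharma–Mittal-type divergences introduced in the paper are not reproduced here.

```python
import numpy as np

def kl(p, q):
    return float(np.sum(p * np.log(p / q)))

def renyi(p, q, alpha):
    # Renyi divergence of order alpha (alpha != 1)
    return float(np.log(np.sum(p**alpha * q**(1 - alpha))) / (alpha - 1))

def tsallis(p, q, order):
    # Tsallis relative entropy of the given order (order != 1)
    return float((np.sum(p**order * q**(1 - order)) - 1) / (order - 1))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

print("KL             :", round(kl(p, q), 6))
for order in (0.5, 0.9, 0.999):
    print(f"Renyi  ({order}) :", round(renyi(p, q, order), 6))
    print(f"Tsallis({order}) :", round(tsallis(p, q, order), 6))
```
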
22 pages, 1497 KiB  
Article
Fast Approximations of the Jeffreys Divergence between Univariate Gaussian Mixtures via Mixture Conversions to Exponential-Polynomial Distributions
by Frank Nielsen
Entropy 2021, 23(11), 1417; https://doi.org/10.3390/e23111417 - 28 Oct 2021
Cited by 7 | Viewed by 3167
Abstract
The Jeffreys divergence is a renowned arithmetic symmetrization of the oriented Kullback–Leibler divergence that is broadly used in the information sciences. Since the Jeffreys divergence between Gaussian mixture models is not available in closed form, various techniques, each with its advantages and disadvantages, have been proposed in the literature to estimate, approximate, or lower- and upper-bound this divergence. In this paper, we propose a simple yet fast heuristic to approximate the Jeffreys divergence between two univariate Gaussian mixtures with an arbitrary number of components. Our heuristic relies on converting the mixtures into pairs of dually parameterized probability densities belonging to an exponential-polynomial family. To measure, with a closed-form formula, the goodness of fit between a Gaussian mixture and an exponential-polynomial density approximating it, we generalize the Hyvärinen divergence to α-Hyvärinen divergences. In particular, the 2-Hyvärinen divergence allows us to perform model selection by choosing the order of the exponential-polynomial densities used to approximate the mixtures. We experimentally demonstrate that our heuristic improves over the computational time of stochastic Monte Carlo estimation by several orders of magnitude while approximating the Jeffreys divergence reasonably well, especially when the mixtures have a very small number of modes.
(This article belongs to the Special Issue Distance in Information and Statistical Physics III)
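The sketch below implements only the baseline mentioned in the abstract, a plain stochastic Monte Carlo estimate of the Jeffreys divergence J(p, q) = KL(p‖q) + KL(q‖p) between two univariate Gaussian mixtures, not the exponential-polynomial heuristic proposed in the paper; the mixture parameters and sample size are illustrative placeholders.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def gmm_pdf(x, weights, means, stds):
    x = np.atleast_1d(x)
    return np.sum([w * norm.pdf(x, m, s) for w, m, s in zip(weights, means, stds)], axis=0)

def gmm_sample(n, weights, means, stds):
    comp = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(np.array(means)[comp], np.array(stds)[comp])

def mc_kl(p_params, q_params, n=100_000):
    """Monte Carlo estimate of KL(p||q) using samples drawn from p."""
    x = gmm_sample(n, *p_params)
    return float(np.mean(np.log(gmm_pdf(x, *p_params) / gmm_pdf(x, *q_params))))

# Two hypothetical univariate Gaussian mixtures: (weights, means, stds).
p_params = ([0.4, 0.6], [-1.0, 2.0], [0.5, 1.0])
q_params = ([0.7, 0.3], [0.0, 3.0], [1.0, 0.7])

jeffreys = mc_kl(p_params, q_params) + mc_kl(q_params, p_params)
print(f"Monte Carlo Jeffreys divergence ≈ {jeffreys:.4f}")
```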

11 pages, 490 KiB  
Article
Fisher Information of Free-Electron Landau States
by Takuya Yamano
Entropy 2021, 23(3), 268; https://doi.org/10.3390/e23030268 - 25 Feb 2021
Cited by 2 | Viewed by 1499
Abstract
An electron in a constant magnetic field has discrete energy levels, known as the Landau levels. One can obtain the corresponding radial wavefunctions of free-electron Landau states in cylindrical polar coordinates. However, this system has not yet been explored from an information-theoretic viewpoint. Here, we focus on the Fisher information associated with these Landau states, which are specified by two quantum numbers. Fisher information provides a useful measure of the electronic structure in quantum systems, such as hydrogen-like atoms and systems in certain model potentials. By numerically evaluating the generalized Laguerre polynomials in the radial densities, we find that the Fisher information increases linearly with the principal quantum number that specifies the energy levels, but decreases monotonically with the azimuthal quantum number m. We also present the relative Fisher information of the Landau states with respect to the reference density with m=0, which is proportional to the principal quantum number, and compare it with the case where the lowest Landau level state is taken as the reference.
(This article belongs to the Special Issue Distance in Information and Statistical Physics III)
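As a generic illustration of the quantity evaluated in this paper (not the Landau radial densities themselves), the sketch below computes the translation Fisher information I(ρ) = ∫ (ρ′)²/ρ dx of a one-dimensional density by simple numerical quadrature and checks it against the known value 1/σ² for a Gaussian; the density and grid choices are hypothetical.

```python
import numpy as np

def fisher_information(x, rho):
    """Translation Fisher information I = integral of (rho')^2 / rho dx."""
    drho = np.gradient(rho, x)                       # finite-difference derivative
    integrand = np.where(rho > 0, drho**2 / np.maximum(rho, 1e-300), 0.0)
    return float(np.sum(integrand) * (x[1] - x[0]))  # rectangle-rule quadrature

# Sanity check: a Gaussian density with standard deviation sigma has I = 1/sigma^2.
sigma = 2.0
x = np.linspace(-12.0, 12.0, 4001)
rho = np.exp(-x**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))

print(f"numerical  I        ≈ {fisher_information(x, rho):.6f}")
print(f"analytical 1/sigma^2 = {1.0 / sigma**2:.6f}")
```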
