
Special Issue "Information Geometry II"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory".

Deadline for manuscript submissions: closed (30 April 2017)

Special Issue Editor

Guest Editor
Prof. Dr. Geert Verdoolaege

1 Research Unit Nuclear Fusion, Department of Applied Physics, Ghent University, Sint-Pietersnieuwstraat 41, B-9000 Ghent, Belgium
2 Laboratory for Plasma Physics, Royal Military Academy (ERM/KMS), Renaissancelaan 30 Avenue de la Renaissance, B-1000 Brussels, Belgium
Website | E-Mail
Phone: +32-9-264.95.91
Interests: probability theory; Bayesian inference; machine learning; information geometry; differential geometry; nuclear fusion; plasma physics; plasma turbulence; continuum mechanics; statistical mechanics

Special Issue Information

Dear Colleagues,

The mathematical field of Information Geometry originated from C.R. Rao's observation in 1945 that the Fisher information can be used to define a Riemannian metric on spaces of probability distributions. This led to a geometrical description of probability theory and statistics, allowing the study of the invariant properties of statistical manifolds. Through the later work of S.-I. Amari and others, it was realized that the differential-geometric structure of a statistical manifold can be derived from divergence functions, yielding a Riemannian metric and a pair of dually coupled affine connections.
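Rao's construction can be stated compactly. For a parametric family of densities p(x; θ), the Fisher information defines the metric tensor, and the same metric is recovered from the second-order expansion of a divergence function such as the Kullback–Leibler divergence (standard identities, restated here for convenience):

```latex
% Fisher information metric on a statistical manifold with coordinates \theta
g_{ij}(\theta) = \mathbb{E}_{p(x;\theta)}\!\left[
    \frac{\partial \log p(x;\theta)}{\partial \theta^{i}}\,
    \frac{\partial \log p(x;\theta)}{\partial \theta^{j}}
\right].

% The same metric arises from a divergence function, e.g. the
% Kullback--Leibler divergence, via its second-order expansion:
D_{\mathrm{KL}}\bigl(p_{\theta}\,\|\,p_{\theta+d\theta}\bigr)
  = \tfrac{1}{2}\, g_{ij}(\theta)\, d\theta^{i}\, d\theta^{j}
    + O\bigl(\lVert d\theta\rVert^{3}\bigr).
```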

Since then, Information Geometry has become a truly interdisciplinary field with applications in various domains. It enables a deeper understanding of the methods of statistical inference and machine learning, while providing a powerful framework for deriving new algorithms. As such, Information Geometry has many applications in optimization (e.g., on matrix manifolds), signal and image processing, computer vision, neural networks and other subfields of the information sciences. Furthermore, the methods of Information Geometry have been applied to a wide variety of topics in physics, mathematical finance, biology and the neurosciences. In physics, there are many links with fields that have a natural probabilistic interpretation, including (non-extensive) statistical mechanics and quantum mechanics.

For this Special Issue we welcome submissions related to the foundations and applications of Information Geometry. We envisage contributions that aim at clarifying the connection of Information Geometry with both the information sciences and the physical sciences, so as to demonstrate the profound impact of the field in these disciplines. In addition, we hope to receive original papers illustrating the wide variety of applications of the methods of Information Geometry.

Prof. Dr. Geert Verdoolaege
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information geometry
  • statistical manifold
  • probability theory
  • machine learning
  • optimization
  • signal processing
  • image processing
  • statistical mechanics
  • quantum mechanics

Published Papers (3 papers)


Research

Open Access Article: Information Submanifold Based on SPD Matrices and Its Applications to Sensor Networks
Entropy 2017, 19(3), 131; doi:10.3390/e19030131
Received: 30 December 2016 / Revised: 1 March 2017 / Accepted: 16 March 2017 / Published: 17 March 2017
Abstract
In this paper, firstly, the manifold PD(n) consisting of all n×n symmetric positive-definite matrices is introduced on the basis of matrix information geometry; secondly, the geometric structures of an information submanifold of PD(n) are presented, including the metric, geodesics and geodesic distance; thirdly, the information resolution of sensor networks is presented for three classical measurement models based on the information submanifold; finally, bearing-only tracking by a single sensor is treated via the Fisher information matrix. The preliminary analysis results in this paper indicate that the information submanifold offers a consistent and more comprehensive means to understand and solve sensor-network problems of target resolution and tracking, which are not easily handled by some conventional analysis methods.
(This article belongs to the Special Issue Information Geometry II)
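The geodesic distance on PD(n) referred to in the abstract above can be illustrated with a short sketch. This is not code from the paper; it implements the standard affine-invariant geodesic distance on the SPD manifold, d(A, B) = ||log(A^(-1/2) B A^(-1/2))||_F, with NumPy/SciPy, and the example matrices are made up for demonstration:

```python
import numpy as np
from scipy.linalg import fractional_matrix_power, logm

def spd_geodesic_distance(A, B):
    """Affine-invariant geodesic distance between SPD matrices:
    d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F."""
    A_inv_sqrt = fractional_matrix_power(A, -0.5)
    M = A_inv_sqrt @ B @ A_inv_sqrt
    return np.linalg.norm(logm(M), ord="fro")

# The distance is invariant under congruence: d(G A G^T, G B G^T) = d(A, B)
# for any invertible G -- the key property behind "matrix information geometry".
A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.2], [0.2, 3.0]])
G = np.array([[1.0, 1.0], [0.0, 2.0]])  # arbitrary invertible matrix
d1 = spd_geodesic_distance(A, B)
d2 = spd_geodesic_distance(G @ A @ G.T, G @ B @ G.T)
print(abs(d1 - d2) < 1e-6)  # True: affine invariance holds up to round-off
```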

Open Access Article: On Hölder Projective Divergences
Entropy 2017, 19(3), 122; doi:10.3390/e19030122
Received: 20 January 2017 / Revised: 8 March 2017 / Accepted: 10 March 2017 / Published: 16 March 2017
Abstract
We describe a framework to build distances by measuring the tightness of inequalities and introduce the notion of proper statistical divergences and improper pseudo-divergences. We then consider the Hölder ordinary and reverse inequalities and present two novel classes of Hölder divergences and pseudo-divergences that both encapsulate the special case of the Cauchy–Schwarz divergence. We report closed-form formulas for those statistical dissimilarities when considering distributions belonging to the same exponential family provided that the natural parameter space is a cone (e.g., multivariate Gaussians) or affine (e.g., categorical distributions). Those new classes of Hölder distances are invariant to rescaling and thus do not require distributions to be normalized. Finally, we show how to compute statistical Hölder centroids with respect to those divergences and carry out center-based clustering toy experiments on a set of Gaussian distributions which demonstrate empirically that symmetrized Hölder divergences outperform the symmetric Cauchy–Schwarz divergence.
(This article belongs to the Special Issue Information Geometry II)
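The rescaling invariance claimed in the abstract above is easy to see in the special case both Hölder families encapsulate. The following sketch (not code from the paper; the discrete arrays are illustrative) computes the Cauchy–Schwarz divergence for positive vectors, where projectivity follows directly from the definition:

```python
import numpy as np

def cauchy_schwarz_divergence(p, q):
    """D_CS(p, q) = -log( <p, q> / (||p||_2 ||q||_2) ).
    Non-negative by the Cauchy-Schwarz inequality, zero iff p and q
    are proportional -- hence invariant to positive rescaling."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.log(np.dot(p, q) / (np.linalg.norm(p) * np.linalg.norm(q)))

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.25, 0.25, 0.5])
print(cauchy_schwarz_divergence(p, q) >= 0)                 # True
print(np.isclose(cauchy_schwarz_divergence(p, 3 * p), 0.0)) # True: no normalization needed
```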

Open Access Article: Information Geometric Approach to Recursive Update in Nonlinear Filtering
Entropy 2017, 19(2), 54; doi:10.3390/e19020054
Received: 25 November 2016 / Revised: 14 January 2017 / Accepted: 20 January 2017 / Published: 26 January 2017
Abstract
The measurement update stage in nonlinear filtering is considered from the viewpoint of information geometry: the filtered state is treated as an optimization estimate in parameter space that corresponds to an iteration on the statistical manifold, and a recursive method is proposed on this basis. The method is derived from natural gradient descent on the statistical manifold constructed by the posterior probability density function (PDF) of the state conditional on the measurement. The derivation proceeds in geometric terms and gives a geometric interpretation of the iterative update. Moreover, the proposed method can be seen as an extension of the Kalman filter and its variants: a single step of the method is identical to the extended Kalman filter (EKF) in the nonlinear case and to the traditional Kalman filter in the linear case. Benefiting from the natural gradient descent used in the update stage, the proposed method outperforms existing methods, as shown in numerical experiments.
(This article belongs to the Special Issue Information Geometry II)
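The flavor of such a natural-gradient measurement update can be sketched on a hypothetical scalar toy model (the function names, model, and numbers below are illustrative assumptions, not the paper's algorithm): natural-gradient ascent on the log posterior of a Gaussian prior with a nonlinear measurement, whose first iteration from the prior mean coincides with the EKF update.

```python
import numpy as np

def natural_gradient_update(theta0, P, y, R, h, dh, n_iter=10):
    """Measurement update by natural-gradient ascent on the log posterior of
    a scalar state: prior theta ~ N(theta0, P), measurement y = h(theta) + N(0, R).
    One iteration from theta0 reproduces the EKF update; iterating refines
    the estimate when h is strongly nonlinear."""
    theta = theta0
    for _ in range(n_iter):
        grad = -(theta - theta0) / P + dh(theta) * (y - h(theta)) / R  # d/dtheta log posterior
        fisher = 1.0 / P + dh(theta) ** 2 / R                          # Fisher information
        theta = theta + grad / fisher                                   # natural-gradient step
    return theta

# Toy nonlinear measurement h(theta) = theta^3, true state 1.2, small noise.
h, dh = (lambda t: t**3), (lambda t: 3 * t**2)
est = natural_gradient_update(theta0=1.0, P=0.5, y=1.2**3, R=0.01, h=h, dh=dh)
print(abs(est - 1.2) < 0.05)  # True: the iteration converges near the true state
```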

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18
Editorial Board