Special Issue "Information Geometry II"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory".

Deadline for manuscript submissions: closed (30 April 2017)

Special Issue Editor

Guest Editor
Prof. Dr. Geert Verdoolaege

1 Research Unit Nuclear Fusion, Department of Applied Physics, Ghent University, Sint-Pietersnieuwstraat 41, B-9000 Ghent, Belgium
2 Laboratory for Plasma Physics, Royal Military Academy (ERM/KMS), Renaissancelaan 30 Avenue de la Renaissance, B-1000 Brussels, Belgium
Phone: +32-9-264.95.91
Interests: probability theory; Bayesian inference; machine learning; information geometry; differential geometry; nuclear fusion; plasma physics; plasma turbulence; continuum mechanics; statistical mechanics

Special Issue Information

Dear Colleagues,

The mathematical field of Information Geometry originated from the observation by C.R. Rao in 1945 that the Fisher information can be used to define a Riemannian metric on spaces of probability distributions. This led to a geometrical description of probability theory and statistics, allowing the study of the invariant properties of statistical manifolds. Through the work of S.-I. Amari and others, it was later realized that the differential-geometric structure of a statistical manifold can be derived from divergence functions, yielding a Riemannian metric and a pair of dually coupled affine connections.
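
As a concrete illustration of Rao's observation, the Fisher information of a one-parameter model already acts as a one-dimensional Riemannian metric. The sketch below (an illustrative example, not part of the special issue) computes it for the Bernoulli model both from the closed form 1/(p(1-p)) and as the expectation of the squared score, and checks that the two agree:

```python
def bernoulli_fisher_analytic(p):
    # Closed form: I(p) = 1 / (p * (1 - p)) for the Bernoulli model
    return 1.0 / (p * (1.0 - p))

def bernoulli_fisher_expectation(p):
    # Definition: I(p) = E[(d/dp log f(x; p))^2], expectation over x in {0, 1}
    score = lambda x: x / p - (1 - x) / (1.0 - p)
    return p * score(1) ** 2 + (1.0 - p) * score(0) ** 2

print(bernoulli_fisher_analytic(0.3))     # both ways give the same value
print(bernoulli_fisher_expectation(0.3))
```

For multi-parameter families, the same expectation of products of score components yields the full Fisher information matrix, which is exactly the metric tensor Rao proposed.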

Since then, Information Geometry has become a truly interdisciplinary field with applications in various domains. It enables a deeper understanding of the methods of statistical inference and machine learning, while providing a powerful framework for deriving new algorithms. As such, Information Geometry has many applications in optimization (e.g., on matrix manifolds), signal and image processing, computer vision, neural networks and other subfields of the information sciences. Furthermore, the methods of Information Geometry have been applied to a wide variety of topics in physics, mathematical finance, biology and the neurosciences. In physics, there are many links with fields that have a natural probabilistic interpretation, including (non-extensive) statistical mechanics and quantum mechanics.

For this Special Issue we welcome submissions related to the foundations and applications of Information Geometry. We envisage contributions that aim at clarifying the connection of Information Geometry with both the information sciences and the physical sciences, so as to demonstrate the profound impact of the field in these disciplines. In addition, we hope to receive original papers illustrating the wide variety of applications of the methods of Information Geometry.

Prof. Dr. Geert Verdoolaege
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information geometry
  • statistical manifold
  • probability theory
  • machine learning
  • optimization
  • signal processing
  • image processing
  • statistical mechanics
  • quantum mechanics

Published Papers (18 papers)


Research

Open Access Article: Detection of Causal Relations in Time Series Affected by Noise in Tokamaks Using Geodesic Distance on Gaussian Manifolds
Entropy 2017, 19(10), 569; doi:10.3390/e19100569
Received: 20 July 2017 / Revised: 20 October 2017 / Accepted: 20 October 2017 / Published: 24 October 2017
Abstract: Modern experiments in magnetic confinement nuclear fusion can produce gigabytes of data, mainly in the form of time series. The acquired signals, composing massive databases, are typically affected by significant levels of noise. The interpretation of the time series can therefore become quite involved, particularly when tenuous causal relations have to be investigated. In recent years, synchronization experiments, aimed at controlling potentially dangerous instabilities, have become a subject of intensive research. Their interpretation requires quite delicate causality analysis. In this paper, the approach of Information Geometry is applied to the problem of assessing the effectiveness of synchronization experiments on JET (Joint European Torus). In particular, the use of the geodesic distance on Gaussian manifolds is shown to improve the results of advanced techniques such as recurrence plots and complex networks when the noise level is not negligible. In cases affected by particularly high levels of noise, which compromise the traditional treatments, the geodesic distance on Gaussian manifolds still yields quite encouraging results. In addition to consolidating previously uncertain conclusions, the proposed approach has been demonstrated to permit the successful analysis of signals from discharges that were otherwise unusable, thereby salvaging the interpretation of those experiments.
(This article belongs to the Special Issue Information Geometry II)
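
The geodesic distance on Gaussian manifolds used in this paper has a well-known closed form for univariate normal distributions, since the Gaussian manifold equipped with the Fisher metric is a hyperbolic space. A minimal sketch for the univariate case (the function name is illustrative, not taken from the paper):

```python
import math

def rao_distance_gaussian(mu1, sigma1, mu2, sigma2):
    # Fisher-Rao geodesic distance between N(mu1, sigma1^2) and
    # N(mu2, sigma2^2); the manifold is a hyperbolic half-plane,
    # which gives this closed-form expression.
    num = (mu1 - mu2) ** 2 / 2.0 + (sigma1 - sigma2) ** 2
    den = (mu1 - mu2) ** 2 / 2.0 + (sigma1 + sigma2) ** 2
    delta = math.sqrt(num / den)
    return 2.0 * math.sqrt(2.0) * math.atanh(delta)
```

For equal means the formula reduces to sqrt(2) * |ln(sigma1/sigma2)|, the familiar distance between pure scale models, and for equal variances it behaves locally like |mu1 - mu2| / sigma.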
Open Access Article (Feature Paper): On the Limiting Behaviour of the Fundamental Geodesics of Information Geometry
Entropy 2017, 19(10), 524; doi:10.3390/e19100524
Received: 13 July 2017 / Revised: 13 September 2017 / Accepted: 28 September 2017 / Published: 30 September 2017
Abstract: The Information Geometry of extended exponential families has received much recent attention in a variety of important applications, notably categorical data analysis, graphical modelling and, more specifically, log-linear modelling. The essential geometry here comes from the closure of an exponential family in a high-dimensional simplex. In parallel, there has been a great deal of interest in the purely Fisher Riemannian structure of (extended) exponential families, especially in the Markov chain Monte Carlo literature. These parallel developments raise challenges, addressed here, at a variety of levels: both theoretical and practical and, relatedly, conceptual and methodological. Central to this endeavour, this paper makes explicit the underlying geometry of these two areas via an analysis of the limiting behaviour of the fundamental geodesics of Information Geometry, namely Amari's (+1)- and (0)-geodesics. Overall, a substantially more complete account of the Information Geometry of extended exponential families is provided than has hitherto been the case. We illustrate the importance and benefits of this formulation through applications.
(This article belongs to the Special Issue Information Geometry II)
Open Access Article: Connecting Information Geometry and Geometric Mechanics
Entropy 2017, 19(10), 518; doi:10.3390/e19100518
Received: 13 July 2017 / Revised: 10 September 2017 / Accepted: 20 September 2017 / Published: 27 September 2017
Abstract: The divergence function in information geometry and the discrete Lagrangian in discrete geometric mechanics each induce a differential-geometric structure on the product manifold Q×Q. We aim to investigate the relationship between these two objects, and the fundamental role that duality, in the form of Legendre transforms, plays in both fields. By establishing an analogy between these two approaches, we show how a fruitful cross-fertilization of techniques may arise from switching between formulations based on the cotangent bundle T*Q (as in geometric mechanics) and the tangent bundle TQ (as in information geometry). In particular, we establish, through variational error analysis, that the divergence function agrees with the exact discrete Lagrangian up to third order if and only if Q is a Hessian manifold.
(This article belongs to the Special Issue Information Geometry II)
Open Access Article: Intrinsic Losses Based on Information Geometry and Their Applications
Entropy 2017, 19(8), 405; doi:10.3390/e19080405
Received: 30 April 2017 / Revised: 21 July 2017 / Accepted: 3 August 2017 / Published: 6 August 2017
Abstract: One main interest of information geometry is to study the properties of statistical models that do not depend on the coordinate system or model parametrization; thus, it may serve as an analytic tool for intrinsic inference in statistics. In this paper, within the framework of Riemannian geometry and dual geometry, we revisit two commonly used intrinsic losses, given respectively by the squared Rao distance and the symmetrized Kullback–Leibler divergence (or Jeffreys divergence). For an exponential family endowed with the Fisher metric and α-connections, the two loss functions are uniformly described as the energy difference along an α-geodesic path, for some α ∈ {−1, 0, 1}. Subsequently, the two intrinsic losses are utilized to develop Bayesian analyses of covariance matrix estimation and range-spread target detection. We provide an intrinsically unbiased covariance estimator, which is verified to be asymptotically efficient in terms of the intrinsic mean square error. The decision rules deduced by the intrinsic Bayesian criterion provide a geometrical justification for the constant false alarm rate detector based on the generalized likelihood ratio principle.
(This article belongs to the Special Issue Information Geometry II)
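
The symmetrized Kullback–Leibler (Jeffreys) divergence mentioned in the abstract has a simple closed form for univariate Gaussians; the following sketch (an illustrative example, not code from the paper) makes the construction concrete:

```python
import math

def kl_gaussian(mu0, s0, mu1, s1):
    # Closed-form KL divergence KL(N(mu0, s0^2) || N(mu1, s1^2))
    return (math.log(s1 / s0)
            + (s0 ** 2 + (mu0 - mu1) ** 2) / (2.0 * s1 ** 2)
            - 0.5)

def jeffreys_gaussian(mu0, s0, mu1, s1):
    # Jeffreys divergence: the symmetrized KL divergence
    return kl_gaussian(mu0, s0, mu1, s1) + kl_gaussian(mu1, s1, mu0, s0)
```

For equal variances the Jeffreys divergence reduces to (mu0 - mu1)^2 / s^2, i.e. the squared Rao distance up to a constant factor, which is the kind of agreement the paper studies along α-geodesic paths.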
Open Access Article: Extracting Knowledge from the Geometric Shape of Social Network Data Using Topological Data Analysis
Entropy 2017, 19(7), 360; doi:10.3390/e19070360
Received: 13 May 2017 / Revised: 10 July 2017 / Accepted: 14 July 2017 / Published: 14 July 2017
Abstract: Topological data analysis is a novel approach to extracting meaningful information from high-dimensional data and is robust to noise. It is based on topology, which aims to study the geometric shape of data. In order to apply topological data analysis, an algorithm called mapper is adopted. The output from mapper is a simplicial complex that represents a set of connected clusters of data points. In this paper, we explore the feasibility of topological data analysis for mining social network data by addressing the problem of image popularity. We randomly crawl images from Instagram and analyze the effects of social context and image content on an image's popularity using mapper. Mapper clusters the images using each feature, and the ratio of popular images in each cluster is computed to determine the clusters with a high or low likelihood of popularity. Then, the popularity of images is predicted to evaluate the accuracy of topological data analysis. This approach is further compared with traditional clustering algorithms, including k-means and hierarchical clustering, in terms of accuracy, and the results show that topological data analysis outperforms the others. Moreover, topological data analysis provides meaningful information based on the connectivity between the clusters.
(This article belongs to the Special Issue Information Geometry II)
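
The mapper algorithm described in the abstract can be sketched in a few lines for a one-dimensional lens: cover the lens range with overlapping intervals, cluster each preimage, and connect clusters that share points. The toy sketch below makes strong simplifying assumptions (scalar data, gap-based clustering); real mapper implementations use arbitrary lenses and proper clustering algorithms:

```python
def mapper_1d(points, lens, n_intervals=4, overlap=0.25, gap=1.0):
    """Minimal 1-D mapper sketch: overlapping interval cover of the lens
    range, gap-based clustering within each preimage, and edges between
    clusters sharing points (the 1-skeleton of the simplicial complex)."""
    values = [lens(p) for p in points]
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_intervals
    nodes = []  # each node is a frozenset of point indices
    for i in range(n_intervals):
        a = lo + i * width - overlap * width
        b = lo + (i + 1) * width + overlap * width
        idx = sorted((j for j, v in enumerate(values) if a <= v <= b),
                     key=lambda j: points[j])
        cluster = []
        for j in idx:
            # start a new cluster whenever the gap to the previous
            # point exceeds the threshold
            if cluster and points[j] - points[cluster[-1]] > gap:
                nodes.append(frozenset(cluster))
                cluster = []
            cluster.append(j)
        if cluster:
            nodes.append(frozenset(cluster))
    edges = {(u, v) for u in range(len(nodes))
             for v in range(u + 1, len(nodes)) if nodes[u] & nodes[v]}
    return nodes, edges
```

The overlap between intervals is what creates shared points, and hence the edges that encode the connectivity information the paper exploits.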
Open Access Article: α-Connections and a Symmetric Cubic Form on a Riemannian Manifold
Entropy 2017, 19(7), 344; doi:10.3390/e19070344
Received: 9 May 2017 / Revised: 6 July 2017 / Accepted: 6 July 2017 / Published: 10 July 2017
Abstract: In this paper, we study the construction of α-conformally equivalent statistical manifolds for a given symmetric cubic form on a Riemannian manifold. In particular, we describe a method to obtain α-conformally equivalent connections from the relation between tensors and the symmetric cubic form.
(This article belongs to the Special Issue Information Geometry II)
Open Access Article: Comparing Information-Theoretic Measures of Complexity in Boltzmann Machines
Entropy 2017, 19(7), 310; doi:10.3390/e19070310
Received: 30 April 2017 / Revised: 19 June 2017 / Accepted: 23 June 2017 / Published: 3 July 2017
Cited by 2
Abstract: In the past three decades, many theoretical measures of complexity have been proposed to help understand complex systems. In this work, for the first time, we place these measures on a level playing field, to explore the qualitative similarities and differences between them, and their shortcomings. Specifically, using the Boltzmann machine architecture (a fully connected recurrent neural network) with uniformly distributed weights as our model of study, we numerically measure how complexity changes as a function of network dynamics and network parameters. We apply an extension of one such information-theoretic measure of complexity to understand incremental Hebbian learning in Hopfield networks, a fully recurrent architecture model of autoassociative memory. In the course of Hebbian learning, the total information flow reflects a natural upward trend in complexity as the network attempts to learn more and more patterns.
(This article belongs to the Special Issue Information Geometry II)
Open Access Article: Regularizing Neural Networks via Retaining Confident Connections
Entropy 2017, 19(7), 313; doi:10.3390/e19070313
Received: 30 April 2017 / Revised: 7 June 2017 / Accepted: 23 June 2017 / Published: 30 June 2017
Abstract: Regularization of neural networks can alleviate overfitting in the training phase. Current regularization methods, such as Dropout and DropConnect, randomly drop neural nodes or connections based on a uniform prior. Such a data-independent strategy does not take into consideration the quality of individual units or connections. In this paper, we aim to develop a data-dependent approach to regularizing neural networks within the framework of Information Geometry. A measure for the quality of connections, namely confidence, is proposed. Specifically, the confidence of a connection is derived from its contribution to the Fisher information distance. The network is adjusted by retaining the confident connections and discarding the less confident ones. The adjusted network, named ConfNet, carries the majority of variations in the sample data. The relationships among confidence estimation, Maximum Likelihood Estimation and classical model selection criteria (such as the Akaike information criterion) are investigated and discussed theoretically. Furthermore, a Stochastic ConfNet is designed by adding a self-adaptive probabilistic sampling strategy. The proposed data-dependent regularization methods achieve promising experimental results on three data collections: MNIST, CIFAR-10 and CIFAR-100.
(This article belongs to the Special Issue Information Geometry II)
Open Access Article: Conjugate Representations and Characterizing Escort Expectations in Information Geometry
Entropy 2017, 19(7), 309; doi:10.3390/e19070309
Received: 29 May 2017 / Revised: 22 June 2017 / Accepted: 27 June 2017 / Published: 28 June 2017
Abstract: Based on the maximum entropy (MaxEnt) principle for a generalized entropy functional and the conjugate representations introduced by Zhang, we reformulate the method of information geometry. For a set of conjugate representations, the associated escort expectation is naturally introduced and characterized by the generalized score function, which has zero escort expectation. Furthermore, we show that the escort expectation induces a conformal divergence.
(This article belongs to the Special Issue Information Geometry II)
Open Access Article: Optimal Nonlinear Estimation in Statistical Manifolds with Application to Sensor Network Localization
Entropy 2017, 19(7), 308; doi:10.3390/e19070308
Received: 30 April 2017 / Revised: 23 June 2017 / Accepted: 24 June 2017 / Published: 28 June 2017
Abstract: Information geometry enables a deeper understanding of the methods of statistical inference. In this paper, the problem of nonlinear parameter estimation is considered from a geometric viewpoint using natural gradient descent on statistical manifolds. It is demonstrated that nonlinear estimation for curved exponential families can simply be viewed as a deterministic optimization problem with respect to the structure of a statistical manifold. In this way, information geometry offers an elegant geometric interpretation of the solution to the estimator, as well as of the convergence of gradient-based methods. The theory is illustrated via the analysis of a distributed mote network localization problem in which Radio Interferometric Positioning System (RIPS) measurements are used for free mote location estimation. The analysis results demonstrate the computational advantages of the presented methodology.
(This article belongs to the Special Issue Information Geometry II)
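
Natural gradient descent, as used above, preconditions the ordinary gradient by the inverse Fisher information, so that updates follow the steepest direction on the statistical manifold rather than in raw parameter space. A minimal sketch for the simplest possible case (estimating a Gaussian mean with known variance; the setup and names are illustrative, not from the paper):

```python
def natural_gradient_mean(data, sigma, mu0=0.0, lr=1.0, steps=5):
    # Natural-gradient ascent on the Gaussian log-likelihood in mu,
    # with sigma known. The Fisher information is 1/sigma^2, so the
    # natural gradient rescales the ordinary gradient by sigma^2:
    # the update becomes mu <- mu + lr * (mean(data) - mu).
    xbar = sum(data) / len(data)
    mu = mu0
    for _ in range(steps):
        grad = (xbar - mu) / sigma ** 2        # ordinary gradient
        mu = mu + lr * sigma ** 2 * grad       # preconditioned by F^{-1}
    return mu
```

Because the Fisher information here is 1/sigma^2, the natural-gradient update with unit step lands on the maximum-likelihood estimate in a single iteration, regardless of sigma; ordinary gradient descent would instead converge at a rate that depends on the parametrization.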
Open Access Article: The Mehler-Fock Transform in Signal Processing
Entropy 2017, 19(6), 289; doi:10.3390/e19060289
Received: 25 April 2017 / Revised: 13 June 2017 / Accepted: 15 June 2017 / Published: 20 June 2017
Abstract: Many signals can be described as functions on the unit disk (ball). In the framework of group representations, it is well known how to construct Hilbert spaces containing these functions that have the groups SU(1,N) as their symmetry groups. One illustration of this construction is three-dimensional color spaces in which chroma properties are described by points on the unit disk. A combination of principal component analysis and the Perron-Frobenius theorem can be used to show that perspective projections map positive signals (i.e., functions with positive values) to a product of the positive half-axis and the unit ball. The representation theory (harmonic analysis) of the group SU(1,1) leads to an integral transform, the Mehler-Fock transform (MFT), that decomposes functions depending only on the radial coordinate into combinations of associated Legendre functions. This transformation is applied to kernel density estimators of probability distributions on the unit disk. It is shown that the transform separates the influence of the kernel and the measured data. The application of the transform is illustrated by studying the statistical distribution of RGB vectors obtained from a common set of object points under different illuminants.
(This article belongs to the Special Issue Information Geometry II)
Open Access Article: Inconsistency of Template Estimation by Minimizing of the Variance/Pre-Variance in the Quotient Space
Entropy 2017, 19(6), 288; doi:10.3390/e19060288
Received: 27 April 2017 / Revised: 7 June 2017 / Accepted: 17 June 2017 / Published: 20 June 2017
Abstract: We tackle the problem of template estimation when data have been randomly deformed under a group action in the presence of noise. In order to estimate the template, one often minimizes the variance once the influence of the transformations has been removed (computation of the Fréchet mean in the quotient space). The consistency bias is defined as the distance (possibly zero) between the orbit of the template and the orbit of an element which minimizes the variance. In the first part, we restrict ourselves to isometric group actions, in which case the Hilbertian distance is invariant under the group action. We establish an asymptotic behavior of the consistency bias which is linear with respect to the noise level. As a result, the inconsistency is unavoidable as soon as the noise is large enough. In practice, template estimation with a finite sample is often done with an algorithm called "max-max". In the second part, again for isometric actions of a finite group, we show the convergence of this algorithm to an empirical Karcher mean. Our numerical experiments show that the bias observed in practice cannot be attributed to the small sample size or to a convergence problem, but is indeed due to the previously studied inconsistency. In the third part, we present some insights into the case of a distance that is not invariant under the group action. We will see that the inconsistency still holds as soon as the noise level is large enough. Moreover, we prove the inconsistency even when a regularization term is added.
(This article belongs to the Special Issue Information Geometry II)
Open Access Article: Information Geometry of Non-Equilibrium Processes in a Bistable System with a Cubic Damping
Entropy 2017, 19(6), 268; doi:10.3390/e19060268
Received: 29 April 2017 / Revised: 2 June 2017 / Accepted: 6 June 2017 / Published: 11 June 2017
Cited by 1
Abstract: A probabilistic description is essential for understanding the dynamics of stochastic systems far from equilibrium, given the uncertainty inherent in such systems. To compare different Probability Density Functions (PDFs), it is extremely useful to quantify the difference among them by assigning an appropriate metric to probability such that the distance increases with the difference between two PDFs. This metric structure then provides a key link between stochastic systems and information geometry. For a non-equilibrium process, we define an infinitesimal distance at any time by comparing two PDFs at times infinitesimally apart, and sum these distances in time. The total distance along the trajectory of the system quantifies the total number of different states that the system undergoes in time and is called the information length. Using this concept, we investigate the information geometry of non-equilibrium processes involved in disorder-order transitions between the critical and subcritical states in a bistable system. Specifically, we compute time-dependent PDFs, the information length, the rate of change of the information length, the entropy change and the Fisher information in disorder-to-order and order-to-disorder transitions, and discuss the similarities and disparities between the two transitions. In particular, we show that the total information length in the order-to-disorder transition is much larger than that in the disorder-to-order transition, and elucidate the link to the drastically different evolution of entropy in the two transitions. We also compare these results with those for the transition between the subcritical and supercritical states and discuss implications for fitness.
(This article belongs to the Special Issue Information Geometry II)
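
For reference, the information length described in the abstract is commonly written as the time integral of the square root of the instantaneous Fisher information with respect to time (notation here follows common usage rather than the paper's exact symbols):

```latex
\mathcal{L}(t) \;=\; \int_0^{t}
\sqrt{\;\int \frac{1}{p(x,t')}
\left(\frac{\partial p(x,t')}{\partial t'}\right)^{2} \, dx \;}\; dt'
```

The inner integral is the infinitesimal statistical distance between the PDFs at two infinitesimally separated times, and the outer integral sums these distances along the trajectory, which is exactly the construction sketched in the abstract.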
Open Access Article: Projection to Mixture Families and Rate-Distortion Bounds with Power Distortion Measures
Entropy 2017, 19(6), 262; doi:10.3390/e19060262
Received: 4 May 2017 / Revised: 22 May 2017 / Accepted: 5 June 2017 / Published: 7 June 2017
Abstract: The explicit form of the rate-distortion function has rarely been obtained, except for a few cases where the Shannon lower bound coincides with the rate-distortion function for the entire range of positive rates. From an information-geometrical point of view, the evaluation of the rate-distortion function is achieved by a projection onto the mixture family defined by the distortion measure. In this paper, we consider the β-th power distortion measure and prove that the β-generalized Gaussian distribution is the only source that makes the Shannon lower bound tight at the minimum distortion level at zero rate. We demonstrate that the tightness of the Shannon lower bound for β = 1 (Laplacian source) and β = 2 (Gaussian source) yields upper bounds on the rate-distortion function of power distortion measures with a different power. These bounds evaluate from above the projection of the source distribution onto the mixture family of the generalized Gaussian models. Applying similar arguments to ε-insensitive distortion measures, we consider the tightness of the Shannon lower bound and derive an upper bound on the distortion-rate function which is accurate at low rates.
(This article belongs to the Special Issue Information Geometry II)
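
For the familiar squared-error case (β = 2), the Shannon lower bound referred to in the abstract reads, for a source X with differential entropy h(X):

```latex
R(D) \;\ge\; R_{\mathrm{SLB}}(D) \;=\; h(X) - \tfrac{1}{2}\log\!\left(2\pi e D\right)
```

Equality holds for all distortion levels up to the source variance exactly when X is Gaussian, which is the β = 2 instance of the generalized-Gaussian characterization established in the paper.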
Open Access Article: Information Submanifold Based on SPD Matrices and Its Applications to Sensor Networks
Entropy 2017, 19(3), 131; doi:10.3390/e19030131
Received: 30 December 2016 / Revised: 1 March 2017 / Accepted: 16 March 2017 / Published: 17 March 2017
Abstract: In this paper, firstly, the manifold PD(n) consisting of all n×n symmetric positive-definite matrices is introduced based on matrix information geometry; secondly, the geometrical structures of the information submanifold of PD(n) are presented, including the metric, geodesics and geodesic distance; thirdly, the information resolution of sensor networks is presented via three classical measurement models based on the information submanifold; finally, bearing-only tracking by a single sensor is introduced via the Fisher information matrix. The preliminary analysis results presented in this paper indicate that the information submanifold offers a consistent and more comprehensive means to understand and solve sensor network problems for target resolution and tracking, which are not easily handled by some conventional analysis methods.
(This article belongs to the Special Issue Information Geometry II)
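
The geodesic distance on PD(n) under the affine-invariant metric has the closed form d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F, which can be computed from the generalized eigenvalues of the pair (B, A). A brief sketch (the helper name is hypothetical, not code from the paper):

```python
import numpy as np

def spd_geodesic_distance(A, B):
    # Affine-invariant geodesic distance on PD(n):
    # d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F
    # The eigenvalues of A^{-1} B are real and positive for SPD A, B,
    # so the distance is sqrt(sum of squared logs of those eigenvalues).
    eigvals = np.linalg.eigvals(np.linalg.solve(A, B))
    return float(np.sqrt(np.sum(np.log(eigvals.real) ** 2)))
```

This distance is invariant under congruence transformations A -> G A G^T, which is the property that makes it natural for covariance-like quantities such as Fisher information matrices in sensor networks.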
Open Access Article: Witnessing Multipartite Entanglement by Detecting Asymmetry
Entropy 2017, 19(3), 124; doi:10.3390/e19030124
Received: 4 February 2017 / Revised: 8 March 2017 / Accepted: 12 March 2017 / Published: 16 March 2017
Cited by 4 | PDF Full-text (817 KB) | HTML Full-text | XML Full-text
Abstract
The characterization of quantum coherence in the context of quantum information theory, and its interplay with quantum correlations, is currently the subject of intense study. Coherence in a Hamiltonian eigenbasis yields asymmetry, the ability of a quantum system to break a dynamical symmetry generated by the Hamiltonian. Here, we propose an experimental strategy to witness multipartite entanglement in many-body systems by evaluating the asymmetry with respect to an additive Hamiltonian. We test our scheme by simulating asymmetry and entanglement detection in a three-qubit Greenberger–Horne–Zeilinger (GHZ) diagonal state. Full article
(This article belongs to the Special Issue Information Geometry II)
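A closely related, standard criterion (not the paper's specific asymmetry witness) uses the quantum Fisher information: for a pure state, F = 4 Var(H), and F > N witnesses entanglement of N qubits with respect to an additive Hamiltonian. A sketch for the pure three-qubit GHZ state, assuming H = ½ Σᵢ σz⁽ⁱ⁾:

```python
import numpy as np

N = 3
dim = 2 ** N

# |GHZ> = (|000> + |111>) / sqrt(2) in the computational basis.
psi = np.zeros(dim)
psi[0] = psi[-1] = 1.0 / np.sqrt(2.0)

# Additive Hamiltonian H = (1/2) * sum_i sigma_z^(i): diagonal, with
# eigenvalue (N - 2 * popcount(k)) / 2 on the basis state |k>.
diag = np.array([(N - 2 * bin(k).count('1')) / 2.0 for k in range(dim)])

mean = psi @ (diag * psi)
var = psi @ (diag ** 2 * psi) - mean ** 2

# For a pure state, the quantum Fisher information is 4 * Var(H).
# F > N certifies entanglement (separable states obey F <= N).
F = 4.0 * var
print(F)  # -> 9.0, which exceeds the separable bound N = 3
```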

Open AccessArticle On Hölder Projective Divergences
Entropy 2017, 19(3), 122; doi:10.3390/e19030122
Received: 20 January 2017 / Revised: 8 March 2017 / Accepted: 10 March 2017 / Published: 16 March 2017
PDF Full-text (6948 KB) | HTML Full-text | XML Full-text
Abstract
We describe a framework to build distances by measuring the tightness of inequalities, and introduce the notion of proper statistical divergences and improper pseudo-divergences. We then consider the Hölder ordinary and reverse inequalities and present two novel classes of Hölder divergences and pseudo-divergences that both encapsulate the special case of the Cauchy–Schwarz divergence. We report closed-form formulas for those statistical dissimilarities when the distributions belong to the same exponential family, provided that the natural parameter space is a cone (e.g., multivariate Gaussians) or affine (e.g., categorical distributions). Those new classes of Hölder distances are invariant to rescaling and thus do not require distributions to be normalized. Finally, we show how to compute statistical Hölder centroids with respect to those divergences and carry out center-based clustering toy experiments on a set of Gaussian distributions, which demonstrate empirically that symmetrized Hölder divergences outperform the symmetric Cauchy–Schwarz divergence. Full article
(This article belongs to the Special Issue Information Geometry II)
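The Cauchy–Schwarz divergence recovered as a special case is D_CS(p, q) = −log(⟨p, q⟩ / √(⟨p, p⟩⟨q, q⟩)); because it is projective, rescaling either density leaves it unchanged. A minimal quadrature sketch for univariate Gaussians (illustrative only; the paper gives closed forms for exponential families):

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def cauchy_schwarz_divergence(mu1, s1, mu2, s2, grid):
    """D_CS(p, q) = -log( <p,q> / sqrt(<p,p> <q,q>) ), evaluated by a
    simple Riemann sum on a uniform grid. The ratio is invariant to
    rescaling of p or q, so normalization is optional."""
    dx = grid[1] - grid[0]
    p = gaussian_pdf(grid, mu1, s1)
    q = gaussian_pdf(grid, mu2, s2)
    ip = np.sum(p * q) * dx
    pp = np.sum(p * p) * dx
    qq = np.sum(q * q) * dx
    return -np.log(ip / np.sqrt(pp * qq))

x = np.linspace(-20.0, 20.0, 200001)
print(cauchy_schwarz_divergence(0.0, 1.0, 0.0, 1.0, x))  # 0 for identical inputs
print(cauchy_schwarz_divergence(0.0, 1.0, 3.0, 1.0, x))  # (mu1-mu2)^2/4 = 2.25 here
```

For equal variances the closed form is D_CS = (μ₁ − μ₂)² / (4σ²), which the quadrature reproduces.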

Open AccessArticle Information Geometric Approach to Recursive Update in Nonlinear Filtering
Entropy 2017, 19(2), 54; doi:10.3390/e19020054
Received: 25 November 2016 / Revised: 14 January 2017 / Accepted: 20 January 2017 / Published: 26 January 2017
PDF Full-text (931 KB) | HTML Full-text | XML Full-text
Abstract
The measurement-update stage in nonlinear filtering is considered from the viewpoint of information geometry: the filtered state is treated as an optimal estimate in parameter space, corresponding to an iteration on the statistical manifold, and a recursive method is proposed on this basis. The method is derived from natural gradient descent on the statistical manifold constructed from the posterior probability density function (PDF) of the state conditioned on the measurement. The derivation proceeds in geometric terms and yields a geometric interpretation of the iterative update. Moreover, the proposed method can be seen as an extension of the Kalman filter and its variants: a single step of the method is identical to the extended Kalman filter (EKF) in the nonlinear case, and to the traditional Kalman filter in the linear case. Owing to the natural gradient descent used in the update stage, the proposed method performs better than existing methods, as shown in the numerical experiments. Full article
(This article belongs to the Special Issue Information Geometry II)
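For reference, the linear case that a single step of such a method reduces to is the standard Kalman measurement update. A minimal sketch of that classical update (the helper name is illustrative, not from the paper):

```python
import numpy as np

def kalman_measurement_update(x, P, z, H, R):
    """Standard linear Kalman measurement update.

    x, P : prior mean and covariance; z : measurement;
    H    : measurement matrix; R : measurement-noise covariance.
    """
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_post = x + K @ (z - H @ x)        # corrected state estimate
    P_post = (np.eye(len(x)) - K @ H) @ P
    return x_post, P_post

x = np.array([0.0, 0.0])
P = np.eye(2)
H = np.array([[1.0, 0.0]])              # observe the first component only
R = np.array([[1.0]])
z = np.array([2.0])
x_post, P_post = kalman_measurement_update(x, P, z, H, R)
print(x_post)  # [1. 0.]: halfway between prior mean 0 and measurement 2
```

With equal prior and measurement variances the update lands midway between prior and measurement, and only the observed component's uncertainty shrinks.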
