Search Results (13)

Search Parameters:
Keywords = geometry–information duality

28 pages, 1237 KB  
Article
Counting Cosmic Cycles: Past Big Crunches, Future Recurrence Limits, and the Age of the Quantum Memory Matrix Universe
by Florian Neukart, Eike Marx and Valerii Vinokur
Entropy 2025, 27(10), 1043; https://doi.org/10.3390/e27101043 - 7 Oct 2025
Viewed by 578
Abstract
We present a quantitative theory of contraction and expansion cycles within the Quantum Memory Matrix (QMM) cosmology. In this framework, spacetime consists of finite-capacity Hilbert cells that store quantum information. Each non-singular bounce adds a fixed increment of imprint entropy, defined as the cumulative quantum information written irreversibly into the matrix and distinct from coarse-grained thermodynamic entropy, thereby providing an intrinsic, monotonic cycle counter. By calibrating the geometry–information duality, inferring today’s cumulative imprint from CMB, BAO, chronometer, and large-scale-structure constraints, and integrating the modified Friedmann equations with imprint back-reaction, we find that the Universe has already completed N_past = 3.6 ± 0.4 cycles. The finite Hilbert capacity enforces an absolute ceiling: propagating the holographic write rate and accounting for instability channels implies only N_future = 7.8 ± 1.6 additional cycles before saturation halts further bounces. Integrating Kodama-vector proper time across all completed cycles yields a total cumulative age t_QMM = 62.0 ± 2.5 Gyr, compared to the 13.8 ± 0.2 Gyr of the current expansion usually described by ΛCDM. The framework makes concrete, testable predictions: an enhanced faint-end UV luminosity function at z ≳ 12 observable with JWST, a stochastic gravitational-wave background with f^(2/3) scaling in the LISA band from primordial black-hole mergers, and a nanohertz background with strain slope α = −2/3 accessible to pulsar-timing arrays. These signatures provide near-term opportunities to confirm, refine, or falsify the cyclical QMM chronology.

19 pages, 310 KB  
Article
The Gauge Equation in Statistical Manifolds: An Approach through Spectral Sequences
by Michel Nguiffo Boyom and Stephane Puechmorel
Mathematics 2024, 12(8), 1177; https://doi.org/10.3390/math12081177 - 14 Apr 2024
Viewed by 1458
Abstract
The gauge equation is a generalization of the conjugacy relation for the Koszul connection to bundle morphisms that are not isomorphisms. The existence of a nontrivial solution to this equation, especially when duality is imposed upon the related connections, provides important information about the geometry of the manifolds under consideration. In this article, we use the gauge equation to introduce spectral sequences that are further specialized to Hessian structures.
(This article belongs to the Special Issue Advances in Differential Geometry and Its Applications)

16 pages, 656 KB  
Article
Divergences Induced by the Cumulant and Partition Functions of Exponential Families and Their Deformations Induced by Comparative Convexity
by Frank Nielsen
Entropy 2024, 26(3), 193; https://doi.org/10.3390/e26030193 - 23 Feb 2024
Cited by 2 | Viewed by 2350
Abstract
Exponential families are statistical models which are the workhorses in statistics, information theory, and machine learning, among others. An exponential family can either be normalized subtractively by its cumulant or free energy function, or equivalently normalized divisively by its partition function. Both the cumulant and partition functions are strictly convex and smooth functions inducing corresponding pairs of Bregman and Jensen divergences. It is well known that skewed Bhattacharyya distances between the probability densities of an exponential family amount to skewed Jensen divergences induced by the cumulant function between their corresponding natural parameters, and that in limit cases the sided Kullback–Leibler divergences amount to reverse-sided Bregman divergences. In this work, we first show that the α-divergences between non-normalized densities of an exponential family amount to scaled α-skewed Jensen divergences induced by the partition function. We then show how comparative convexity with respect to a pair of quasi-arithmetical means allows both convex functions and their arguments to be deformed, thereby defining dually flat spaces with corresponding divergences when ordinary convexity is preserved.
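The cumulant-to-Bregman correspondence described in this abstract can be checked numerically for the Bernoulli family. The sketch below uses the standard identity KL(p_θ1 : p_θ2) = B_F(θ2 : θ1), where F is the cumulant (log-partition) function; it is a minimal illustration, not code from the paper:

```python
import math

def F(theta):
    """Cumulant (log-partition) function of the Bernoulli family."""
    return math.log1p(math.exp(theta))

def dF(theta):
    """Gradient of F: the mean parameter (sigmoid of theta)."""
    return 1.0 / (1.0 + math.exp(-theta))

def bregman(t2, t1):
    """Bregman divergence B_F(t2 : t1) induced by the cumulant F."""
    return F(t2) - F(t1) - (t2 - t1) * dF(t1)

def kl_bernoulli(p, q):
    """Kullback-Leibler divergence between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

p, q = 0.3, 0.7
theta_p = math.log(p / (1 - p))   # natural parameter of Bernoulli(p)
theta_q = math.log(q / (1 - q))

# KL(p || q) equals the Bregman divergence with swapped natural parameters
assert abs(kl_bernoulli(p, q) - bregman(theta_q, theta_p)) < 1e-12
```

The argument swap is the "reverse-sided" behavior mentioned in the abstract: the forward KL divergence corresponds to the Bregman divergence with its arguments reversed.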

21 pages, 575 KB  
Article
The Properties of Alpha Risk Parity Portfolios
by Jérôme Gava and Julien Turc
Entropy 2022, 24(11), 1631; https://doi.org/10.3390/e24111631 - 10 Nov 2022
Cited by 1 | Viewed by 6835
Abstract
Risk parity is an approach to investing that aims to balance risk evenly across assets within a given universe. The aim of this study is to unify the most commonly used approaches to risk parity within a single framework. Links between these approaches have been identified in the published literature. A key point in risk parity is being able to identify and control the contribution of each asset to the risk of the portfolio. With alpha risk parity, risk contributions are given by a closed-form formula. There is a form of antisymmetry—or self-duality—in alpha risk portfolios that lie between risk budgeting and minimum-risk portfolios. Techniques from information geometry play a key role in establishing these properties.
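The notion of controlling each asset's risk contribution can be illustrated in the standard (non-alpha) equal-risk-contribution setting. The sketch below solves for such a portfolio via the convex log-barrier formulation (a common approach in the risk-parity literature, not the paper's alpha construction); the covariance numbers are hypothetical:

```python
import numpy as np

# Hypothetical 3-asset covariance matrix (illustrative, not from the paper)
Sigma = np.array([[0.040, 0.006, 0.000],
                  [0.006, 0.090, 0.010],
                  [0.000, 0.010, 0.160]])

# Log-barrier formulation: the minimizer of  1/2 w'Σw - Σ log w_i
# satisfies (Σw)_i = 1/w_i, i.e. w_i (Σw)_i is equal across assets.
w = 1.0 / np.sqrt(np.diag(Sigma))          # start at inverse-volatility weights
for _ in range(50):                        # Newton's method on the convex objective
    grad = Sigma @ w - 1.0 / w
    hess = Sigma + np.diag(1.0 / w**2)
    w = w - np.linalg.solve(hess, grad)

w = w / w.sum()                            # rescale to a fully invested portfolio
rc = w * (Sigma @ w)                       # unnormalized risk contributions
assert np.allclose(rc / rc.sum(), 1/3, atol=1e-8)
```

At the optimum each asset contributes exactly one third of portfolio variance; rescaling the weights preserves the ratios of the contributions.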
(This article belongs to the Special Issue Entropy-Based Methods for Finance and Risk Management)

17 pages, 322 KB  
Article
Symplectic Polar Duality, Quantum Blobs, and Generalized Gaussians
by Maurice de Gosson and Charlyne de Gosson
Symmetry 2022, 14(9), 1890; https://doi.org/10.3390/sym14091890 - 9 Sep 2022
Cited by 3 | Viewed by 1996
Abstract
We apply the notion of polar duality from convex geometry to the study of quantum covariance ellipsoids in symplectic phase space. We consider in particular the case of “quantum blobs” introduced in previous work; quantum blobs are the smallest symplectic invariant regions of the phase space compatible with the uncertainty principle in its strong Robertson–Schrödinger form. We show that these phase space units can be characterized by a simple condition of reflexivity using polar duality, thus improving previous results. We apply these geometric constructions to the characterization of pure Gaussian states in terms of partial information on the covariance ellipsoid, which allows us to formulate statements related to symplectic tomography.
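As a small numerical illustration of the uncertainty constraint that quantum blobs saturate (a sketch with made-up numbers, not the paper's construction): for a single mode, the Robertson–Schrödinger condition on a covariance matrix reduces to det Σ ≥ (ħ/2)², with equality exactly for pure Gaussian states.

```python
import numpy as np

hbar = 1.0
# Hypothetical covariance matrix of a single-mode Gaussian state
# (units with hbar = 1; entries are illustrative, not data from the paper)
cov = (hbar / 2) * np.array([[1.2, 0.3],
                             [0.3, 1.1]])

# Single-mode Robertson-Schrodinger condition: det(cov) >= (hbar/2)^2.
# The symplectic eigenvalue of a one-mode covariance matrix is sqrt(det(cov)).
nu = np.sqrt(np.linalg.det(cov))
assert nu >= hbar / 2    # the state is physical; nu == hbar/2 would be pure
```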
(This article belongs to the Topic Quantum Information and Quantum Computing)
26 pages, 410 KB  
Review
λ-Deformation: A Canonical Framework for Statistical Manifolds of Constant Curvature
by Jun Zhang and Ting-Kam Leonard Wong
Entropy 2022, 24(2), 193; https://doi.org/10.3390/e24020193 - 27 Jan 2022
Cited by 5 | Viewed by 3134
Abstract
This paper systematically presents the λ-deformation as the canonical framework of deformation to the dually flat (Hessian) geometry, which has been well established in information geometry. We show that, based on deforming the Legendre duality, all objects in the Hessian case have their correspondence in the λ-deformed case: λ-convexity, λ-conjugation, λ-biorthogonality, λ-logarithmic divergence, λ-exponential and λ-mixture families, etc. In particular, λ-deformation unifies Tsallis and Rényi deformations by relating them to two manifestations of an identical λ-exponential family, under subtractive or divisive probability normalization, respectively. Unlike the different Hessian geometries of the exponential and mixture families, the λ-exponential family, in turn, coincides with the λ-mixture family after a change of random variables. The resulting statistical manifolds, while still carrying a dualistic structure, replace the Hessian metric and a pair of dually flat conjugate affine connections with a conformal Hessian metric and a pair of projectively flat connections carrying constant (nonzero) curvature. Thus, λ-deformation is a canonical framework in generalizing the well-known dually flat Hessian structure of information geometry.
(This article belongs to the Special Issue Review Papers for Entropy)

31 pages, 2293 KB  
Article
Sentience and the Origins of Consciousness: From Cartesian Duality to Markovian Monism
by Karl J. Friston, Wanja Wiese and J. Allan Hobson
Entropy 2020, 22(5), 516; https://doi.org/10.3390/e22050516 - 30 Apr 2020
Cited by 161 | Viewed by 29914
Abstract
This essay addresses Cartesian duality and how its implicit dialectic might be repaired using physics and information theory. Our agenda is to describe a key distinction in the physical sciences that may provide a foundation for the distinction between mind and matter, and between sentient and intentional systems. From this perspective, it becomes tenable to talk about the physics of sentience and ‘forces’ that underwrite our beliefs (in the sense of probability distributions represented by our internal states), which may ground our mental states and consciousness. We will refer to this view as Markovian monism, which entails two claims: (1) fundamentally, there is only one type of thing and only one type of irreducible property (hence monism). (2) All systems possessing a Markov blanket have properties that are relevant for understanding the mind and consciousness: if such systems have mental properties, then they have them partly by virtue of possessing a Markov blanket (hence Markovian). Markovian monism rests upon the information geometry of random dynamical systems. In brief, the information geometry induced in any system—whose internal states can be distinguished from external states—must acquire a dual aspect. This dual aspect concerns the (intrinsic) information geometry of the probabilistic evolution of internal states and a separate (extrinsic) information geometry of probabilistic beliefs about external states that are parameterised by internal states. We call these intrinsic (i.e., mechanical, or state-based) and extrinsic (i.e., Markovian, or belief-based) information geometries, respectively. Although these mathematical notions may sound complicated, they are fairly straightforward to handle, and may offer a means through which to frame the origins of consciousness.
(This article belongs to the Special Issue Models of Consciousness)

14 pages, 1592 KB  
Article
Information Geometric Duality of ϕ-Deformed Exponential Families
by Jan Korbel, Rudolf Hanel and Stefan Thurner
Entropy 2019, 21(2), 112; https://doi.org/10.3390/e21020112 - 24 Jan 2019
Cited by 10 | Viewed by 4694
Abstract
In the world of generalized entropies—which, for example, play a role in physical systems with sub- and super-exponential phase space growth per degree of freedom—there are two ways for implementing constraints in the maximum entropy principle: linear and escort constraints. Both appear naturally in different contexts. Linear constraints appear, e.g., in physical systems, when additional information about the system is available through higher moments. Escort distributions appear naturally in the context of multifractals and information geometry. It was shown recently that there exists a fundamental duality that relates both approaches on the basis of the corresponding deformed logarithms (deformed-log duality). Here, we show that there exists another duality that arises in the context of information geometry, relating the Fisher information of ϕ-deformed exponential families that correspond to linear constraints (as studied by J. Naudts) to those that are based on escort constraints (as studied by S.-I. Amari). We explicitly demonstrate this information geometric duality for the case of (c, d)-entropy, which covers all situations that are compatible with the first three Shannon–Khinchin axioms and that include Shannon, Tsallis, and Anteneodo–Plastino entropy, and many more as special cases. Finally, we discuss the relation between the deformed-log duality and the information geometric duality and mention that the escort distributions arising in these two dualities are generally different and only coincide for the case of the Tsallis deformation.
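The two ingredients named above, escort distributions and a deformed entropy, are easy to exhibit for the Tsallis case (the special case in which the abstract says the two dualities coincide). A minimal sketch of both objects, not code from the paper:

```python
import math

def tsallis_entropy(p, q):
    """S_q = (1 - sum p_i^q) / (q - 1); recovers Shannon entropy as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi**q for pi in p)) / (q - 1.0)

def escort(p, q):
    """Escort distribution of order q: reweight p_i by p_i^q and renormalize."""
    z = sum(pi**q for pi in p)
    return [pi**q / z for pi in p]

p = [0.6, 0.3, 0.1]

# q -> 1 recovers the Shannon entropy and leaves the distribution unchanged
shannon = -sum(pi * math.log(pi) for pi in p)
assert abs(tsallis_entropy(p, 1.0001) - shannon) < 1e-3
assert all(abs(a - b) < 1e-12 for a, b in zip(escort(p, 1.0), p))
```

For q ≠ 1 the escort distribution genuinely differs from p (e.g. `escort(p, 2.0)` overweights the most probable outcome), which is what makes the choice between linear and escort constraints consequential.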
(This article belongs to the Special Issue Information Theory in Complex Systems)

31 pages, 377 KB  
Article
Connecting Information Geometry and Geometric Mechanics
by Melvin Leok and Jun Zhang
Entropy 2017, 19(10), 518; https://doi.org/10.3390/e19100518 - 27 Sep 2017
Cited by 15 | Viewed by 6473
Abstract
The divergence function in information geometry and the discrete Lagrangian in discrete geometric mechanics each induce a differential geometric structure on the product manifold Q × Q. We aim to investigate the relationship between these two objects, and the fundamental role that duality, in the form of Legendre transforms, plays in both fields. By establishing an analogy between these two approaches, we will show how a fruitful cross-fertilization of techniques may arise from switching formulations based on the cotangent bundle T*Q (as in geometric mechanics) and the tangent bundle TQ (as in information geometry). In particular, we establish, through variational error analysis, that the divergence function agrees with the exact discrete Lagrangian up to third order if and only if Q is a Hessian manifold.
(This article belongs to the Special Issue Information Geometry II)
14 pages, 628 KB  
Article
Expansion of the Kullback-Leibler Divergence, and a New Class of Information Metrics
by David J. Galas, Gregory Dewey, James Kunert-Graf and Nikita A. Sakhanenko
Axioms 2017, 6(2), 8; https://doi.org/10.3390/axioms6020008 - 1 Apr 2017
Cited by 21 | Viewed by 7497
Abstract
Inferring and comparing complex, multivariable probability density functions is fundamental to problems in several fields, including probabilistic learning, network theory, and data analysis. Classification and prediction are the two faces of this class of problem. This study takes an approach that simplifies many aspects of these problems by presenting a structured series expansion of the Kullback-Leibler divergence—a function central to information theory—and by devising a distance metric based on this divergence. Using the Möbius inversion duality between multivariable entropies and multivariable interaction information, we express the divergence as an additive series in the number of interacting variables, which provides a restricted and simplified set of distributions to use as approximations and with which to model data. Truncations of this series yield approximations based on the number of interacting variables. The first few terms of the expansion-truncation are illustrated and shown to lead naturally to familiar approximations, including the well-known Kirkwood superposition approximation. Truncation can also induce a simple relation between the multi-information and the interaction information. A measure of distance between distributions, based on the Kullback-Leibler divergence, is then described and shown to be a true metric if properly restricted. The expansion is shown to generate a hierarchy of metrics and connects this work to information geometry formalisms. An example of the application of these metrics to a graph comparison problem is given that shows that the formalism can be applied to a wide range of network problems and provides a general approach for systematic approximations in numbers of interactions or connections, as well as a related quantitative metric.
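The Möbius-inversion expansion itself needs the interaction-information machinery of the paper, but the object it starts from is plain: the Kullback-Leibler divergence, which is nonnegative yet asymmetric, and hence not a metric on its own. That asymmetry is the defect the paper's restricted metric construction repairs. A minimal sketch of the base object, not the paper's expansion:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.4, 0.1]
q = [0.2, 0.3, 0.5]

# Nonnegative, zero only at p == q, but asymmetric: not a metric as-is
assert kl(p, q) > 0 and kl(q, p) > 0
assert abs(kl(p, q) - kl(q, p)) > 1e-6     # D(p||q) != D(q||p)
assert abs(kl(p, p)) < 1e-12
```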
(This article belongs to the Special Issue Entropy and Information Theory)

19 pages, 440 KB  
Article
A Novel Approach to Canonical Divergences within Information Geometry
by Nihat Ay and Shun-ichi Amari
Entropy 2015, 17(12), 8111-8129; https://doi.org/10.3390/e17127866 - 9 Dec 2015
Cited by 44 | Viewed by 9345
Abstract
A divergence function on a manifold M defines a Riemannian metric g and dually coupled affine connections ∇ and ∇* on M. When M is dually flat, that is, flat with respect to ∇ and ∇*, a canonical divergence is known, which is uniquely determined from (M, g, ∇, ∇*). We propose a natural definition of a canonical divergence for a general, not necessarily flat, M by using the geodesic integration of the inverse exponential map. The new definition of a canonical divergence reduces to the known canonical divergence in the case of dual flatness. Finally, we show that the integrability of the inverse exponential map implies the geodesic projection property.

21 pages, 286 KB  
Article
Duality of Maximum Entropy and Minimum Divergence
by Shinto Eguchi, Osamu Komori and Atsumi Ohara
Entropy 2014, 16(7), 3552-3572; https://doi.org/10.3390/e16073552 - 26 Jun 2014
Cited by 16 | Viewed by 7382
Abstract
We discuss a special class of generalized divergence measures by the use of generator functions. Any divergence measure in the class is separated into the difference between cross and diagonal entropy. The diagonal entropy measure in the class is associated with a model of maximum entropy distributions; the divergence measure leads to statistical estimation via minimization for an arbitrarily given statistical model. The dualistic relationship between the maximum entropy model and the minimum divergence estimation is explored in the framework of information geometry. The model of maximum entropy distributions is characterized as totally geodesic with respect to the linear connection associated with the divergence. A natural extension of the classical theory of the maximum likelihood method under the maximum entropy model, in terms of the Boltzmann-Gibbs-Shannon entropy, is given. We discuss the duality in detail for the Tsallis entropy as a typical example.
(This article belongs to the Special Issue Maximum Entropy and Its Application)
35 pages, 346 KB  
Article
Nonparametric Information Geometry: From Divergence Function to Referential-Representational Biduality on Statistical Manifolds
by Jun Zhang
Entropy 2013, 15(12), 5384-5418; https://doi.org/10.3390/e15125384 - 4 Dec 2013
Cited by 35 | Viewed by 7541
Abstract
Divergence functions are the non-symmetric “distance” on the manifold, Mθ, of parametric probability density functions over a measure space, (X, μ). Classical information geometry prescribes, on Mθ: (i) a Riemannian metric given by the Fisher information; (ii) a pair of dual connections (giving rise to the family of α-connections) that preserve the metric under parallel transport by their joint actions; and (iii) a family of divergence functions (α-divergence) defined on Mθ × Mθ, which induce the metric and the dual connections. Here, we construct an extension of this differential geometric structure from Mθ (that of parametric probability density functions) to the manifold, M, of non-parametric functions on X, removing the positivity and normalization constraints. The generalized Fisher information and α-connections on M are induced by an α-parameterized family of divergence functions, reflecting the fundamental convex inequality associated with any smooth and strictly convex function. The infinite-dimensional manifold, M, has zero curvature for all these α-connections; hence, the generally non-zero curvature of Mθ can be interpreted as arising from an embedding of Mθ into M. Furthermore, when a parametric model (after a monotonic scaling) forms an affine submanifold, its natural and expectation parameters form biorthogonal coordinates, and such a submanifold is dually flat for α = ±1, generalizing the results of Amari’s α-embedding. The present analysis illuminates two different types of duality in information geometry, one concerning the referential status of a point (measurable function) expressed in the divergence function (“referential duality”) and the other concerning its representation under an arbitrary monotone scaling (“representational duality”).