Article

On Almost Norden Statistical Manifolds

1 Department of Mathematics, Faculty of Science, Arak University, Arak 38156-8-8349, Iran
2 Department of Mathematics, Faculty of Mathematics and Computer Science, University of Bucharest, 010014 Bucharest, Romania
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Entropy 2022, 24(6), 758; https://doi.org/10.3390/e24060758
Submission received: 19 April 2022 / Revised: 5 May 2022 / Accepted: 25 May 2022 / Published: 27 May 2022

Abstract

We consider a statistical connection ∇ on an almost complex manifold with (pseudo-)Riemannian metric, in particular the Norden metric. We investigate almost Norden (statistical) manifolds under the condition that the almost complex structure J is ∇-recurrent. We provide one example of a complex statistical connection.

1. Introduction

Recently, the study of spaces of probability measures has been receiving increasing attention. Information geometry is a tool to investigate such spaces (in the finite-dimensional setting). As a combination of differential geometry and statistics, information geometry plays an important role in science; image processing, physics, computer science, and machine learning are some of its application areas (see [1,2,3,4]). From one point of view, it is a realm that makes it possible to describe statistical objects as geometric ones by capturing their geometric properties.
For an open subset $\Theta$ of $\mathbb{R}^n$ and a sample space $\Omega$ with parameter $\theta = (\theta^1, \dots, \theta^n)$, we call the set of probability density functions
$$S = \left\{ p(x;\theta) \,:\, \int_\Omega p(x;\theta)\, dx = 1,\ p(x;\theta) > 0,\ \theta \in \Theta \subset \mathbb{R}^n \right\}$$
a statistical model. For a statistical model $S$, the semi-definite Fisher information matrix $g(\theta) = [g_{ij}(\theta)]$ is defined as
$$g_{ij}(\theta) := \int_\Omega \partial_i \ell_\theta\, \partial_j \ell_\theta\, p(x;\theta)\, dx = E_p[\partial_i \ell_\theta\, \partial_j \ell_\theta], \qquad (1)$$
where $\ell_\theta = \ell(x;\theta) := \log p(x;\theta)$, $\partial_i := \partial/\partial\theta^i$, and $E_p[f]$ is the expectation of $f$ with respect to $p(x;\theta)$. Equipped with such an information matrix, the space $S$ becomes a statistical manifold.
Fisher was the first to introduce the relation (1), as a mathematical notion of information, in 1920 (see [5]). It is shown that if $g$ is positive-definite and all of its components converge to real numbers, then $(S,g)$ is a Riemannian manifold and $g$ is called a Fisher metric on $S$, with components
$$g_{ij}(\theta) = \int_\Omega \partial_i p(x;\theta)\, \partial_j \ell_\theta\, dx = \int_\Omega \frac{1}{p(x;\theta)}\, \partial_i p(x;\theta)\, \partial_j p(x;\theta)\, dx.$$
Rao was the first to study the above metric in 1945 (see [6]).
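As a concrete illustration (a sketch added here, not taken from the paper), relation (1) can be evaluated directly for the Bernoulli family, whose sample space is finite so the integral becomes a sum; the classical closed form is $g(\theta) = 1/(\theta(1-\theta))$.

```python
# Illustrative sketch (not from the paper): Fisher information of the
# Bernoulli family p(x; theta) = theta^x (1 - theta)^(1 - x), x in {0, 1}.
# For a finite sample space the defining integral is a sum over {0, 1};
# the closed form is g(theta) = 1 / (theta * (1 - theta)).
import math

def bernoulli_fisher(theta, eps=1e-6):
    def logp(x, t):
        return x * math.log(t) + (1 - x) * math.log(1 - t)
    g = 0.0
    for x in (0, 1):
        # score = d(log p)/d(theta), approximated by a central difference
        score = (logp(x, theta + eps) - logp(x, theta - eps)) / (2 * eps)
        g += score**2 * math.exp(logp(x, theta))  # E_p[(d log p)^2]
    return g

print(bernoulli_fisher(0.3), 1 / (0.3 * 0.7))  # both ≈ 4.7619
```

The same finite-difference recipe applies to any parametric family with a tractable log-density; only the sum over the sample space changes.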
By a statistical manifold we mean a triple $(M, g, \nabla)$, where the manifold $M$ is equipped with a statistical structure $(g, \nabla)$ consisting of a (pseudo) Riemannian metric $g$ and a linear connection ∇ on $M$ such that the covariant derivative $\nabla g$ is symmetric.
Recently, statistical manifolds have attracted the attention of many mathematicians (see for instance [7,8,9,10]). A fundamental role in characterizing statistical manifolds is played by two geometric quantities, called dual connections, which describe the derivation with respect to vector fields and are interrelated in a duality relation involving the Fisher metric. The study of dual elements and the relations between them constitute the main direction of development in the study of statistical manifolds [11].
Hermitian manifolds as well as Norden manifolds have been studied from various points of view. Here we refer to [9,12,13,14]. Since on Norden manifolds, there exists a pair of Norden metrics, one can consider dual (conjugate) connections with respect to each of these metrical tensors and their relations to dual connections relative to the almost complex structure [14]. Therefore, the study of statistical structures on these manifolds is of great importance.
The main purpose of this paper is to study almost Norden manifolds with statistical connections. The paper is organized as follows. In Section 2, we recall some basic concepts about statistical geometry. In Section 3, the main focus is on almost Norden manifolds with statistical connections. Furthermore, we obtain some results about the skewness tensor K. In Section 4, we concentrate mainly on almost complex structures J which are ∇-recurrent. This condition lets us study some kinds of almost complex statistical manifolds. In the last section, we construct one complex statistical connection.

2. Preliminaries

Let M be a smooth manifold with a (pseudo) Riemannian metric g and ∇ be a symmetric linear connection.
The triple $(M, g, \nabla)$ is called a statistical manifold [15] if $\nabla g$ is symmetric, i.e., $\nabla g = C$, where $C$ is a symmetric tensor of type $(0,3)$; namely, $g$ satisfies the Codazzi equation
$$(\nabla_X g)(Y,Z) = (\nabla_Y g)(X,Z) = (\nabla_Z g)(Y,X) = C(X,Y,Z), \qquad X,Y,Z \in \Gamma(TM).$$
In this case, ∇ is called a statistical connection and the pair $(\nabla, g)$ a statistical structure on $M$ (see [15]). When $C = 0$, ∇ is the unique Levi-Civita connection $\nabla^{(0)}$. The dual connection $\nabla^*$ of a linear connection ∇ is defined by
$$X g(Y,Z) = g(\nabla_X Y, Z) + g(Y, \nabla^*_X Z), \qquad X,Y,Z \in \Gamma(TM).$$
Now we define the skewness operator $K$ of type $(1,2)$ on $M$ as follows:
$$K(X,Y) = \nabla_X Y - \nabla^*_X Y, \qquad X,Y \in \Gamma(TM). \qquad (4)$$
It is easy to see that K satisfies the following.
(i)
$K(X,Y) = K(Y,X)$,
(ii)
$g(K(X,Y), Z) = g(Y, K(X,Z))$,
(iii)
$C(X,Y,Z) = g(K(X,Y), Z)$, $\quad X,Y,Z \in \Gamma(TM)$.
It is known that if $(M, g, \nabla)$ is a statistical manifold, then $(M, g, \nabla^*)$ is a statistical manifold as well [15]. Moreover, we have
$$\nabla^{(0)} = \frac{1}{2}(\nabla + \nabla^*). \qquad (5)$$
Remarkably, from (4) and (5) we have
$$\nabla_X Y = \nabla^{(0)}_X Y + \frac{1}{2} K(X,Y).$$
In affine differential geometry, the dual connections are called conjugate connections (see [16,17]).
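The decomposition above can be checked pointwise by linear algebra. The following is a minimal sketch (the metric and tensor are arbitrary illustrative choices, not from the paper): on $\mathbb{R}^2$ with a constant metric $g$, the Levi-Civita connection $\nabla^{(0)}$ has vanishing Christoffel symbols, so $\nabla = \nabla^{(0)} + K/2$ and $\nabla^* = \nabla^{(0)} - K/2$, and the duality identity reduces to the symmetry of $C(X,Y,Z) = g(K(X,Y),Z)$ in its last two arguments.

```python
# Illustrative sketch (arbitrary data, not from the paper): with a constant
# metric g on R^2 the Levi-Civita connection nabla0 has zero Christoffel
# symbols, so nabla = nabla0 + K/2 and nabla* = nabla0 - K/2.  Duality,
# X g(Y,Z) = g(nabla_X Y, Z) + g(Y, nabla*_X Z), then reduces to the
# symmetry of C(X,Y,Z) = g(K(X,Y), Z) in its last two arguments.
import numpy as np

rng = np.random.default_rng(0)
g = np.array([[2.0, 0.3], [0.3, 1.0]])      # constant metric on R^2

# Totally symmetric (0,3)-tensor C, then raise an index to get K:
# K(e_i, e_j)^l = g^{lm} C_{ijm}  (the skewness operator of the text).
T = rng.normal(size=(2, 2, 2))
C = sum(T.transpose(p) for p in
        [(0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0)]) / 6
K = np.einsum("lm,ijm->lij", np.linalg.inv(g), C)

# g(K(e_i, e_j), e_m) recovers C ...
GK = np.einsum("lm,lij->ijm", g, K)
assert np.allclose(GK, C)
# ... and for coordinate fields X=e_i, Y=e_j, Z=e_k with constant g the
# duality residual is g(K(X,Y),Z)/2 - g(Y,K(X,Z))/2, which must vanish:
residual = np.abs(GK / 2 - np.swapaxes(GK, 1, 2) / 2).max()
print(residual < 1e-12)  # True
```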

3. Statistical Connections on Almost Norden Manifolds

Fei and Zhang proved an important result ([9], Theorem 2.13) about the Klein transformation group of conjugate connections.
We will give some related properties of almost Norden statistical manifolds.
Let $M$ be a $2n$-dimensional differentiable manifold, $J$ an almost complex structure and $g$ a pseudo-Riemannian metric compatible with $J$, i.e.,
$$J^2 X = -X, \qquad g(JX, JY) = -g(X,Y). \qquad (6)$$
The pair $(M,J)$ is said to be an almost complex manifold and the triple $(M,J,g)$ is called an almost Norden manifold (also an almost complex manifold with Norden metric).
From Equation (6) one obtains $g(JX,Y) = g(X,JY)$; it follows that the tensor $\tilde g$ defined by
$$\tilde g(X,Y) = g(X,JY)$$
is symmetric. The tensor $\tilde g$ is known as the associated (twin) metric of $g$. It is also a Norden metric, namely it satisfies
$$\tilde g(JX,JY) = -\tilde g(X,Y).$$
It is worth noting that the pseudo-Riemannian metrics g and g ˜ are necessarily of neutral signature ( n , n ) .
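A minimal concrete model (an illustrative assumption, not taken from the paper) is the standard complex structure on $\mathbb{R}^4$ together with a block metric; the claims above — the Norden condition, the symmetry of the twin tensor, and the neutral signature — can then be verified numerically.

```python
# Illustrative sketch (assumptions: the standard complex structure on R^4
# and a simple block metric).  We verify J^2 = -Id, the Norden condition
# g(JX, JY) = -g(X, Y), the symmetry of the twin tensor g~(X,Y) = g(X,JY),
# and the neutral signature (2, 2).
import numpy as np

n = 2
Z = np.zeros((n, n))
I = np.eye(n)
J = np.block([[Z, -I], [I, Z]])          # J^2 = -Id on R^4

A = np.array([[1.0, 0.2], [0.2, 2.0]])   # any symmetric positive block
g = np.block([[A, Z], [Z, -A]])          # then g(J., J.) = -g(., .)

assert np.allclose(J @ J, -np.eye(2 * n))
assert np.allclose(J.T @ g @ J, -g)      # the Norden condition (6)
gt = g @ J                               # g~(X,Y) = g(X,JY) = X^T (gJ) Y
assert np.allclose(gt, gt.T)             # twin metric is symmetric
eigs = np.linalg.eigvalsh(g)
print((eigs > 0).sum(), (eigs < 0).sum())  # neutral signature: prints 2 2
```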
Proposition 1.
Ref. [18] Let $(M,J,g)$ be an almost Norden manifold and ∇ the Levi-Civita connection of $g$. Then
$$g((\nabla_X J)Z, Y) = g((\nabla_X J)Y, Z). \qquad (8)$$
An almost Norden manifold ( M , J , g ) with a statistical connection ∇ is called an almost Norden statistical manifold.
Proposition 2.
Let $(M,J,g)$ be an almost Norden manifold and ∇ the Levi-Civita connection of $g$. If $(\nabla, \tilde g)$ is a statistical structure, then
$$\tilde g(J(\nabla_X J)Z, Y) = \tilde g(J(\nabla_Y J)Z, X). \qquad (9)$$
Proof. 
For any $X,Y,Z \in \Gamma(TM)$, we have
$$(\nabla_X \tilde g)(Y,Z) = X\tilde g(Y,Z) - \tilde g(\nabla_X Y, Z) - \tilde g(Y, \nabla_X Z) = Xg(Y,JZ) - g(\nabla_X Y, JZ) - g(Y, J(\nabla_X Z)) = g(\nabla_X(JZ), Y) - g(Y, J(\nabla_X Z)) = g((\nabla_X J)Z, Y) = -\tilde g(J(\nabla_X J)Z, Y).$$
In the same way,
$$(\nabla_Y \tilde g)(X,Z) = -\tilde g(J(\nabla_Y J)Z, X).$$
Based on the assumption that $(M, \tilde g, \nabla)$ is a statistical manifold, we obtain the equality.   □
Now, we consider an almost Norden statistical manifold $(M, J, \tilde g, \nabla)$ such that ∇ is the Levi-Civita connection of $g$. Due to the relations (8) and (9), we can define the $(0,3)$-symmetric tensor
$$C(X,Y,Z) = \tilde g(J(\nabla_X J)Y, Z).$$
Based on the assumption that $(M, \tilde g, \nabla)$ is a statistical manifold, it follows that $\nabla_X Y = \tilde\nabla^{(0)}_X Y + \frac{1}{2} K(X,Y)$, where $\tilde\nabla^{(0)}$ is the Levi-Civita connection with respect to $\tilde g$. By the relation $(\nabla_X \tilde g)(Y,Z) = (\nabla_X g)(JY,Z) + g((\nabla_X J)Y, Z)$ and the above assumption that ∇ is the Levi-Civita connection of $g$, we have $(\nabla_X J)Y = (\nabla_Y J)X$. Therefore we can state the following theorem.
Theorem 1.
Let $(M, J, \tilde g, \nabla)$ be an almost Norden statistical manifold and ∇ the Levi-Civita connection of $g$. If the $(1,2)$-symmetric tensor $K(X,Y) = J((\nabla_X J)Y)$, then
$$\nabla_X Y - \tilde\nabla^{(0)}_X Y = \frac{1}{2} J((\nabla_X J)Y).$$
In what follows, we study some properties of the operator $K$ on almost Norden statistical manifolds.
Proposition 3. 
Let $(M, J, g, \nabla)$ be an almost Norden statistical manifold. Then
(i)
$(\nabla_X J)Y - (\nabla^*_X J)Y = K(X, JY) - J(K(X,Y)), \quad X,Y \in \Gamma(TM)$.
(ii)
If $(\nabla_X J)Y = (\nabla_Y J)X$, then $(\nabla^*_X J)Y = (\nabla^*_Y J)X$ if and only if $K(X,JY) = K(Y,JX)$, $\quad X,Y \in \Gamma(TM)$.
Indeed,
$$(\nabla_X J)Y - (\nabla^*_X J)Y = \nabla_X JY - J(\nabla_X Y) - \nabla^*_X JY + J(\nabla^*_X Y) = K(X,JY) - J(K(X,Y)).$$
Proposition 4. 
Let $(M, J, \tilde g, \nabla)$ be an almost Norden statistical manifold and let ∇ be the Levi-Civita connection of $g$. Then
(i)
$(\nabla_X J)Y = -J(K(X,Y)), \quad X,Y \in \Gamma(TM)$,
(ii)
$(\nabla^*_X J)Y = -K(X,JY), \quad X,Y \in \Gamma(TM)$.
Proof. 
(i)
Because $(\nabla, \tilde g)$ is a statistical structure, one has
$$X\tilde g(Y,Z) = \tilde g(\nabla_X Y, Z) + \tilde g(Y, \nabla^*_X Z),$$
i.e.,
$$Xg(Y,JZ) = g(\nabla_X Y, JZ) + g(Y, J(\nabla^*_X Z)).$$
Since ∇ is the Levi-Civita connection of $g$, it follows that
$$g(Y, \nabla_X(JZ)) = g(Y, J(\nabla^*_X Z)),$$
so
$$(\nabla_X J)Y = \nabla_X JY - J(\nabla_X Y) = J(\nabla^*_X Y) - J(\nabla_X Y) = -J(K(X,Y)).$$
(ii)
Since
$$(\nabla_X J)Y - (\nabla^*_X J)Y = K(X,JY) - J(K(X,Y)),$$
we obtain $(\nabla^*_X J)Y = -K(X,JY)$.   □
Proposition 5. 
Let $(M, J, g, \nabla)$ be an almost Norden statistical manifold and ∇ the Levi-Civita connection of $\tilde g$. Then
(i)
$(\nabla_X J)Y = K(X,JY), \quad X,Y \in \Gamma(TM)$,
(ii)
$(\nabla^*_X J)Y = J(K(X,Y)), \quad X,Y \in \Gamma(TM)$.
Proof. 
(i) Let $X,Y \in \Gamma(TM)$. We have
$$K(X,JY) = J((\nabla_X J)JY) = -J(\nabla_X Y) + \nabla_X JY = (\nabla_X J)Y.$$
Similarly for statement (ii).   □
Corollary 1.
Let $(M, J, g, \nabla)$ be an almost Norden statistical manifold satisfying $\nabla J = 0$. Then $K(JX,Y) = K(JY,X)$, $X,Y \in \Gamma(TM)$.
Corollary 2.
Let $(M, J, \tilde g, \nabla)$ be an almost Norden statistical manifold and let $\nabla J = 0$; then $K(JX,Y) = K(JY,X)$, $X,Y \in \Gamma(TM)$.

4. ∇-Recurrent Almost Complex Structures J

In this section, we focus on connections ∇ satisfying $(\nabla_X J)Y = \tau(X)JY$ for any $X,Y \in \Gamma(TM)$ and some 1-form $\tau$; namely, the almost complex structure $J$ is ∇-recurrent. The notion of ∇-recurrence with respect to the almost complex structure $J$ was used in [19]. Under this condition on the linear connection ∇, we study several kinds of almost complex manifolds.
Proposition 6.
Let $(M, J, g, \nabla)$ be an almost Norden statistical manifold with ∇-recurrent almost complex structure $J$. Then $K(JX,Y) = K(JY,X)$, $X,Y \in \Gamma(TM)$.
Proof. 
Since $(M, g, \nabla)$ is a statistical manifold,
$$(\nabla_X g)(Y, JZ) = (\nabla_Y g)(X, JZ), \qquad X,Y,Z \in \Gamma(TM).$$
We have
$$(\nabla_X g)(Y,JZ) = Xg(Y,JZ) - g(\nabla_X Y, JZ) - g(Y, \nabla_X JZ) = Xg(JY,Z) - g(\nabla_X Y, JZ) - g(Y, (\nabla_X J)Z) - g(Y, J(\nabla_X Z)) = g(\nabla^*_X JY, Z) - g(\nabla_X Y, JZ) - g(Y, (\nabla_X J)Z).$$
Because $(\nabla_X J)Y = \tau(X)JY$, we obtain
$$g((\nabla_X J)Y, Z) = g(\tau(X)JY, Z) = g(\tau(X)JZ, Y) = g((\nabla_X J)Z, Y).$$
Thus
$$(\nabla_X g)(Y,JZ) = g(\nabla^*_X JY, Z) - g(J(\nabla_X Y), Z) - g(Z, (\nabla_X J)Y). \qquad (10)$$
In the same way,
$$(\nabla_Y g)(X,JZ) = g(\nabla^*_Y JX, Z) - g(J(\nabla_Y X), Z) - g(Z, (\nabla_Y J)X). \qquad (11)$$
Subtracting (11) from (10), we have
$$(\nabla_X g)(Y,JZ) - (\nabla_Y g)(X,JZ) = g(\nabla^*_X JY, Z) - g(\nabla^*_Y JX, Z) - g(J[X,Y], Z) - g((\nabla_X J)Y, Z) + g((\nabla_Y J)X, Z) = 0.$$
Since g is non-degenerate, it follows that
$$\nabla^*_X JY - \nabla^*_Y JX - J[X,Y] - (\nabla_X J)Y + (\nabla_Y J)X = 0;$$
then
$$(\nabla^*_X J)Y + J(\nabla^*_X Y) - (\nabla^*_Y J)X - J(\nabla^*_Y X) - J[X,Y] - (\nabla_X J)Y + (\nabla_Y J)X = 0.$$
Because $\nabla^*$ is symmetric,
$$(\nabla^*_X J)Y - (\nabla_X J)Y - (\nabla^*_Y J)X + (\nabla_Y J)X = 0.$$
Finally, from Proposition 3,
$$J(K(X,Y)) - K(X,JY) - J(K(Y,X)) + K(Y,JX) = 0,$$
hence $K(JX,Y) = K(JY,X)$.   □
We will consider an extension of the notion of a statistical structure. Let $M$ be a smooth manifold, $h$ a tensor of type $(0,2)$ and ∇ a linear connection. The triple $(M, h, \nabla)$ is called a quasi-statistical manifold if $d^\nabla h = 0$ [20], where $d^\nabla h$ is defined by
$$(d^\nabla h)(X,Y,Z) := (\nabla_X h)(Y,Z) - (\nabla_Y h)(X,Z) + h(T^\nabla(X,Y), Z).$$
If h is a pseudo-Riemannian metric, we call ( M , h , ) a statistical manifold admitting torsion [21].
Semi-symmetric linear connections on a differentiable manifold were introduced by Friedmann and Schouten [22]. Let $(M,g)$ be a (pseudo) Riemannian manifold. A linear connection ∇ on $M$ is said to be semi-symmetric if its torsion $T^\nabla$ is given by
$$T^\nabla(X,Y) = \pi(Y)X - \pi(X)Y, \qquad (12)$$
for $X,Y \in \Gamma(TM)$, where $\pi$ is a 1-form associated with a vector field $P$, i.e., $\pi(X) = g(X,P)$. In [23], Tao and Zhang studied linear connections associated with an arbitrary invertible operator on $TM$ as well as with an arbitrary 1-form.
Proposition 7.
Let $(M, J, \tilde g)$ be an almost Norden manifold. Consider a metrical structure $(g, \nabla)$ (so that $\nabla g = 0$) such that $J$ is ∇-recurrent. Then $(M, \tilde g, \nabla)$ is a statistical manifold admitting torsion if and only if ∇ is semi-symmetric.
Proof. 
Using the relation (12), $\nabla g = 0$ and $(\nabla_X J)Y = \tau(X)JY$, we have
$$(\nabla_X \tilde g)(Y,Z) - (\nabla_Y \tilde g)(X,Z) = Xg(Y,JZ) - g(\nabla_X Y, JZ) - g(JY, \nabla_X Z) - Yg(X,JZ) + g(\nabla_Y X, JZ) + g(JX, \nabla_Y Z) = g(Y, \nabla_X JZ) - g(JY, \nabla_X Z) - g(X, \nabla_Y JZ) + g(JX, \nabla_Y Z) = g(Y, (\nabla_X J)Z) - g(X, (\nabla_Y J)Z) = g(Y, \tau(X)JZ) - g(X, \tau(Y)JZ) = -g(\tau(Y)X - \tau(X)Y, JZ).$$
Hence the condition $d^\nabla \tilde g = 0$, i.e., $(\nabla_X \tilde g)(Y,Z) - (\nabla_Y \tilde g)(X,Z) = -\tilde g(T^\nabla(X,Y), Z)$, holds if and only if $T^\nabla(X,Y) = \tau(Y)X - \tau(X)Y$, that is, if and only if ∇ is semi-symmetric.   □
Proposition 8. 
Let $(M, J, g)$ be an almost Norden manifold, ∇ the Levi-Civita connection of $g$ and $(\nabla, \nabla^*, \tilde g)$ a dualistic structure. If $J$ is ∇-recurrent, then for any $X,Y,Z \in \Gamma(TM)$,
(i)
$\nabla^*_X Y = \nabla_X Y + \tau(X)Y$,
(ii)
$(\nabla^*_X g)(Y,Z) = -2g(\tau(X)Y, Z)$,
(iii)
$(\nabla^*_X g)(Y,Z) - (\nabla^*_X \tilde g)(Y,Z) = -(\nabla_X \tilde g)(Y,Z)$.
Proof. 
(i)
Let $X,Y,Z \in \Gamma(TM)$. Then
$$X\tilde g(Y,Z) = \tilde g(\nabla_X Y, Z) + \tilde g(Y, \nabla^*_X Z),$$
$$Xg(JY,Z) = g(J(\nabla_X Y), Z) + g(JY, \nabla^*_X Z),$$
while, since ∇ is the Levi-Civita connection of $g$ and $J$ is ∇-recurrent,
$$Xg(JY,Z) = g((\nabla_X J)Y, Z) + g(J(\nabla_X Y), Z) + g(JY, \nabla_X Z),$$
so that
$$g(JY, \nabla_X Z) = -g(\tau(X)JZ, Y) + g(JY, \nabla^*_X Z).$$
Thus
$$\nabla^*_X Z = \nabla_X Z + \tau(X)Z.$$
(ii)
By covariant differentiation of $g$ with respect to $\nabla^*$, we have
$$(\nabla^*_X g)(Y,Z) = (\nabla_X g)(Y,Z) - g(\tau(X)Y, Z) - g(Y, \tau(X)Z) = -2g(\tau(X)Y, Z). \qquad (13)$$
(iii)
The covariant derivative of $\tilde g$ with respect to $\nabla^*$ is
$$(\nabla^*_X \tilde g)(Y,Z) = (\nabla_X \tilde g)(Y,Z) - 2g(\tau(X)Y, Z). \qquad (14)$$
Subtracting (14) from (13),
$$(\nabla^*_X g)(Y,Z) - (\nabla^*_X \tilde g)(Y,Z) = -(\nabla_X \tilde g)(Y,Z). \qquad □$$
For any vector fields $X, Y$ on $M$, denote $\nabla^J_X Y = -J\nabla_X(JY)$.
Before stating the next proposition, we recall the concept of projectively equivalent connections. Two linear connections ∇ and P on a differentiable manifold M are called projectively equivalent if there exists a 1-form τ such that
$$\nabla^P_X Y = \nabla_X Y + \tau(X)Y + \tau(Y)X, \qquad X,Y \in \Gamma(TM).$$
Proposition 9.
Let $(M,J)$ be an almost complex manifold and let ∇ and $\nabla^P$ be projectively equivalent linear connections on $M$. If $(\nabla_X J)Y = \tau(X)JY$, then
$$\nabla^P_X Y = \nabla^J_X Y - J(\nabla_Y J)X, \qquad X,Y \in \Gamma(TM), \qquad (15)$$
$$R^P(X,Y)Z = R(X,Y)Z, \qquad X,Y,Z \in \Gamma(TM),$$
where $R^P$ is the curvature tensor of $\nabla^P$.
Proof. 
Based on $(\nabla_X J)Y = \tau(X)JY$, we can write
$$\nabla^P_X Y = \nabla_X Y + \tau(X)Y + \tau(Y)X = \nabla_X Y - J(\nabla_X J)Y - J(\nabla_Y J)X = \nabla^J_X Y - J(\nabla_Y J)X.$$
From (15) and using the skew-symmetry of the curvature, for any $X,Y \in \Gamma(TM)$ we have
$$\nabla^P_X Y = \nabla^J_X Y + \nabla^J_Y X - \nabla_Y X;$$
thus we obtain
$$R^P(X,Y)Z = R^J(X,Y)Z + R^J(Y,X)Z - R(Y,X)Z,$$
$$R^P(X,Y)Z = R(X,Y)Z. \qquad □$$

5. Example

We construct a complex statistical connection using a 1-form $\rho$ given by $\rho(X) = \tilde g(X, \xi)$, where $\tilde g$ is the Norden metric and $\xi$ is the vector field dual to $\rho$.
Since
$$\rho(JX) = (\rho \circ J)(X) = \tilde g(JX, \xi) = g(JX, J\xi) = -g(X, \xi),$$
and denoting
$$K(X,Y) = \rho(JX)\rho(JY)\,J\xi,$$
we obtain the following:
$$\tilde g(K(X,Y), Z) = \rho(JX)\rho(JY)\,g(J\xi, JZ) = (-g(X,\xi))(-g(Y,\xi))(-g(Z,\xi)) = -g(Y, \rho(JX)\rho(JZ)\,\xi) = \tilde g(Y, \rho(JX)\rho(JZ)\,J\xi) = \tilde g(K(X,Z), Y).$$
Then the connection ∇ defined by
$$\nabla_X Y = \nabla^{\tilde g}_X Y + \frac{1}{2}\rho(JX)\rho(JY)\,J\xi \qquad (17)$$
is a statistical connection, where $\nabla^{\tilde g}$ is the Levi-Civita connection of $\tilde g$. Considering the statistical connection (17), we now investigate some of its properties.
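The total symmetry that makes (17) a statistical connection can be tested numerically. In the sketch below (the concrete $J$, $g$ and $\xi$ are illustrative assumptions, not data from the paper), $C(X,Y,Z) = \tilde g(K(X,Y),Z)$ turns out to equal $\rho(JX)\rho(JY)\rho(JZ)$, which is manifestly symmetric in all three arguments.

```python
# Illustrative sketch (the concrete J, g and xi are assumptions, not data
# from the paper): for rho(X) = g~(X, xi) and K(X,Y) = rho(JX) rho(JY) J(xi),
# the tensor C(X,Y,Z) = g~(K(X,Y), Z) is totally symmetric -- the
# Codazzi-type property behind the statistical connection (17).
import numpy as np
from itertools import permutations

Z2, I2 = np.zeros((2, 2)), np.eye(2)
J = np.block([[Z2, -I2], [I2, Z2]])          # standard complex structure
A = np.array([[1.0, 0.2], [0.2, 2.0]])
g = np.block([[A, Z2], [Z2, -A]])            # a Norden metric on R^4
gt = g @ J                                   # twin metric g~(X,Y) = g(X,JY)

xi = np.array([0.5, -1.0, 0.3, 2.0])
rho = lambda X: X @ gt @ xi                  # rho(X) = g~(X, xi)
K = lambda X, Y: rho(J @ X) * rho(J @ Y) * (J @ xi)
C = lambda X, Y, Z: K(X, Y) @ gt @ Z         # C(X,Y,Z) = g~(K(X,Y), Z)

rng = np.random.default_rng(1)
X, Y, Zv = rng.normal(size=(3, 4))
vals = [C(a, b, c) for a, b, c in permutations((X, Y, Zv))]
print(np.allclose(vals, vals[0]))  # True: C is totally symmetric
```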
Let $(M, J, g)$ be an almost Norden manifold with Levi-Civita connection ∇ such that $\nabla J = 0$. The triple $(M, J, g)$ is then called a Kähler-Norden manifold.
Proposition 10.
Let $(M, J, \tilde g)$ be a Kähler-Norden statistical manifold with the statistical connection (17). If $\rho(Y)J\xi = \rho(JY)\xi$ for any $Y \in \Gamma(TM)$, then $\nabla J = 0$.
Proof. 
Since
$$(\nabla_X J)Y = \nabla_X JY - J(\nabla_X Y) = (\nabla^{\tilde g}_X J)Y - \frac{1}{2}\rho(JX)\rho(Y)\,J\xi + \frac{1}{2}\rho(JX)\rho(JY)\,\xi,$$
due to the assumption that $M$ is Kähler-Norden, i.e., $\nabla^{\tilde g}J = 0$, and to the condition $\rho(Y)J\xi = \rho(JY)\xi$, the proof is complete.   □
Corollary 3.
Let $(M, J, \tilde g)$ be a Kähler-Norden statistical manifold with the statistical connection (17). If $\rho(Y)J\xi = \rho(JY)\xi$ for any $Y \in \Gamma(TM)$, then
$$K(JY,X) = K(JX,Y), \qquad X,Y \in \Gamma(TM).$$
This follows from Proposition 10 and Corollary 2.
Proposition 11.
Let $(M, J, \tilde g)$ be a Kähler-Norden statistical manifold with the statistical connection (17). Then, for any $X,Y \in \Gamma(TM)$,
$$K(JY,X) = K(JX,Y) \iff (\nabla_X J)Y = (\nabla_Y J)X.$$
Proof. 
From
$$(\nabla_X J)Y = (\nabla^{\tilde g}_X J)Y - \frac{1}{2}\rho(JX)\rho(Y)\,J\xi + \frac{1}{2}\rho(JX)\rho(JY)\,\xi$$
and
$$(\nabla_Y J)X = (\nabla^{\tilde g}_Y J)X - \frac{1}{2}\rho(JY)\rho(X)\,J\xi + \frac{1}{2}\rho(JY)\rho(JX)\,\xi,$$
together with $\nabla^{\tilde g}J = 0$, we have
$$K(JY,X) - K(JX,Y) = 2\big[(\nabla_X J)Y - (\nabla_Y J)X\big],$$
whence the equivalence follows.   □
Corollary 4. 
Let ( M , J , g ˜ ) be a Kähler-Norden statistical manifold with the statistical connection (17). Then the following conditions are equivalent.
(i)
$K(JY,X) = K(JX,Y), \quad X,Y \in \Gamma(TM)$,
(ii)
$(\nabla_X J)Y = (\nabla_Y J)X, \quad X,Y \in \Gamma(TM)$,
(iii)
$(\nabla^*_X J)Y = (\nabla^*_Y J)X, \quad X,Y \in \Gamma(TM)$.
Proposition 11 and Corollary 4 imply the following.
Proposition 12. 
Let ( M , J , g ˜ ) be an almost Norden statistical manifold with the statistical connection (17). Then any two of the following conditions imply the third one.
(i)
$K(JY,X) = K(JX,Y), \quad X,Y \in \Gamma(TM)$,
(ii)
$(\nabla_X J)Y = (\nabla_Y J)X, \quad X,Y \in \Gamma(TM)$,
(iii)
$(\nabla^{\tilde g}_X J)Y = (\nabla^{\tilde g}_Y J)X, \quad X,Y \in \Gamma(TM)$.

Author Contributions

Conceptualization, L.S.; methodology, L.S.; validation, E.P. and I.M.; formal analysis, E.P.; investigation, L.S. and E.P.; resources, I.M.; writing—original draft preparation, L.S.; writing—review and editing, L.S., E.P. and I.M.; visualization, I.M.; supervision, E.P. and I.M.; project administration, E.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Amari, S. Information geometry of the EM and em algorithms for neural networks. Neural Netw. 1995, 8, 1379–1408. [Google Scholar] [CrossRef]
  2. Belkin, M.; Niyogi, P.; Sindhwani, V. Manifold regularization: A geometric framework for learning from labeled and unlabeled examples. J. Mach. Learn. Res. 2006, 7, 2399–2434. [Google Scholar]
  3. Caticha, A. The information geometry of space-time. Proceedings 2019, 33, 3015. [Google Scholar] [CrossRef] [Green Version]
  4. Sun, K.; Marchand-Maillet, S. An information geometry of statistical manifold learning. In Proceedings of the 31st International Conference on Machine Learning (ICML-14), Beijing, China, 21–26 June 2014; pp. 1–9. [Google Scholar]
  5. Fisher, R.A. On the mathematical foundations of theoretical statistics. Philos. Trans. R. Soc. Lond. 1992, 222, 309–368. [Google Scholar]
  6. Rao, C.R. Information and accuracy attainable in estimation of statistical parameters. Bull. Cal. Math. Soc. 1945, 37, 81–91. [Google Scholar]
  7. Balan, V.; Peyghan, E.; Sharahi, E. Statistical structures on the tangent bundle of a statistical manifold with Sasaki metric. Hacet. J. Math. Stat. 2020, 49, 120–135. [Google Scholar] [CrossRef]
  8. Peyghan, E.; Seifipour, D.; Gezer, A. Statistical structures on tangent bundles and Lie groups. Hacet. J. Math. Stat. 2021, 50, 1140–1154. [Google Scholar]
  9. Fei, T.; Zhang, J. Interaction of Codazzi couplings with (para-) Kähler geometry. Results Math. 2017, 72, 2037–2056. [Google Scholar] [CrossRef]
  10. Furuhata, H.; Hasegawa, I.; Okuyama, Y.; Sato, K.; Shahid, M.H. Sasakian statistical manifolds. J. Geom. Phys. 2017, 117, 179–186. [Google Scholar] [CrossRef]
  11. Călin, O.; Udrişte, C. Geometric Modeling in Probability and Statistics; Springer International Publishing: Cham, Switzerland, 2014. [Google Scholar]
  12. Grigorian, S.; Zhang, J. (Para-) holomorphic and conjugate connections on (para-) Hermitian and (para-) Kähler manifolds. Results Math. 2019, 74, 150. [Google Scholar] [CrossRef]
  13. Gezer, A.; Cakicioglu, H. Notes concerning Codazzi pairs on almost anti-Hermitian manifolds. arXiv 2019, arXiv:1911.06140. [Google Scholar]
  14. Teofilova, M. Conjugate connections and statistical structures on almost Norden manifolds. arXiv 2018, arXiv:1812.04512. [Google Scholar]
  15. Amari, S. Differential-Geometrical Methods in Statistics; Springer: Berlin/Heidelberg, Germany, 1985. [Google Scholar]
  16. Dillen, F.; Nomizu, K.; Vrancken, L. Conjugate connections and Radon’s theorem in affine differential geometry. Monatshefte Math. 1990, 109, 221–235. [Google Scholar] [CrossRef]
  17. Nomizu, K.; Sasaki, T. Affine Differential Geometry. Geometry of Affine Immersions; Cambridge University Press: Cambridge, UK, 1994. [Google Scholar]
  18. Iscan, M.; Salimov, A.A. On Kähler-Norden manifolds. Proc. Indian Acad. Sci. (Math. Sci.) 2009, 119, 71–80. [Google Scholar] [CrossRef]
  19. Blaga, A.M.; Crasmareanu, M. The geometry of complex conjugate connections. Hacet. J. Math. Stat. 2012, 41, 119–126. [Google Scholar]
  20. Matsuzoe, H. Quasi-statistical manifolds and geometry of affine distributions. In Pure and Applied Differential Geometry; Van der Veken, J., Van de Woestyne, I., Verstraelen, L., Vrancken, L., Eds.; In Memory of Franki Dillen; Shaker Verlag: Aachen, Germany, 2013; pp. 208–214. [Google Scholar]
  21. Kurose, T. Statistical Manifolds Admitting Torsion, Geometry and Something; Fukuoka University: Fukuoka-shi, Japan, 2007. (In Japanese) [Google Scholar]
  22. Friedmann, A.; Schouten, J.A. Über die Geometrie der halbsymmetrischen Übertragungen. Math. Z. 1924, 21, 211–223. [Google Scholar] [CrossRef]
  23. Tao, J.; Zhang, J. Transformations and coupling relations for affine connections. Diff. Geom. Appl. 2016, 49, 111–130. [Google Scholar] [CrossRef]

Share and Cite

MDPI and ACS Style

Samereh, L.; Peyghan, E.; Mihai, I. On Almost Norden Statistical Manifolds. Entropy 2022, 24, 758. https://doi.org/10.3390/e24060758


