Article

On Nearly Sasakian and Nearly Kähler Statistical Manifolds

Siraj Uddin, Esmaeil Peyghan, Leila Nourmohammadifar and Rawan Bossly
1 Department of Mathematics, Faculty of Science, King Abdulaziz University, Jeddah 21589, Saudi Arabia
2 Department of Mathematics, Faculty of Science, Arak University, Arak 38156-8-8349, Iran
3 Department of Mathematics, College of Science, Jazan University, Jazan 82817, Saudi Arabia
* Author to whom correspondence should be addressed.
Mathematics 2023, 11(12), 2644; https://doi.org/10.3390/math11122644
Submission received: 17 May 2023 / Revised: 5 June 2023 / Accepted: 8 June 2023 / Published: 9 June 2023
(This article belongs to the Special Issue Submanifolds in Metric Manifolds)

Abstract: In this paper, we introduce the notions of nearly Sasakian and nearly Kähler statistical structures, together with a non-trivial example. We give conditions for a real hypersurface in a nearly Kähler statistical manifold to admit a nearly Sasakian statistical structure. We also study invariant and anti-invariant statistical submanifolds of nearly Sasakian statistical manifolds. Finally, we give some conditions under which such a submanifold of a nearly Sasakian statistical manifold is itself a nearly Sasakian statistical manifold.

1. Introduction

Information geometry, now a well-established branch of geometry, is a tool for studying spaces of probability measures. This interdisciplinary field, combining differential geometry and statistics, plays a notable role in various sciences. For instance, a manifold learning theory in a hypothesis space consisting of models is developed in [1]. The semi-Riemannian metric of this hypothesis space, which is uniquely derived, relies on the information geometry of the probability distributions. In [2], Amari also presented the geometrical and statistical ideas used to investigate neural networks, including hidden units or unobservable variables. For more applications of this geometry in other sciences, see [3,4].
Suppose that $\zeta$ is an open subset of $\mathbb{R}^n$ and $\chi$ is a sample space, with parameters $\xi = (\xi^1, \ldots, \xi^n) \in \zeta$. A statistical model $S$ is the set of probability density functions defined by
\[
S = \Big\{\, p(y;\xi) \;:\; \int_\chi p(y;\xi)\, dy = 1, \;\; p(y;\xi) > 0, \;\; \xi \in \zeta \subseteq \mathbb{R}^n \,\Big\}.
\]
The Fisher information matrix $g(\xi) = [g_{ls}(\xi)]$ on $S$ is given by
\[
g_{ls}(\xi) := \int_\chi \partial_l \ell_\xi \, \partial_s \ell_\xi \; p(y;\xi)\, dy = E_p\big[\partial_l \ell_\xi \, \partial_s \ell_\xi\big],
\]
where $E_p[\cdot]$ denotes the expectation with respect to $p(y;\xi)$, $\ell_\xi = \ell(y;\xi) := \log p(y;\xi)$ and $\partial_l := \frac{\partial}{\partial \xi^l}$. The space $S$, together with the information matrix, is a statistical manifold.
In the 1920s, Fisher was the first to propose (1) as a mathematical measure of information (see [5]). If all components of $g$ converge to real numbers and $g$ is positive-definite, then $(S, g)$ is a Riemannian manifold, and $g$ is called a Fisher metric on $S$. Using $g$, an affine connection $\nabla$ with respect to $p(y;\xi)$ is described by
\[
\Gamma_{ls,k} = g(\nabla_{\partial_l} \partial_s, \partial_k) := E_p\big[(\partial_l \partial_s \ell_\xi)\, \partial_k \ell_\xi\big].
\]
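As a standard illustration (a classical computation, not one of this paper's examples), consider the univariate normal family $p(y;\xi) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-(y-\mu)^2/2\sigma^2}$ with $\xi = (\xi^1, \xi^2) = (\mu, \sigma)$. Here $\ell_\xi = -\frac{(y-\mu)^2}{2\sigma^2} - \log\sigma - \frac{1}{2}\log 2\pi$, so
\[
\partial_\mu \ell_\xi = \frac{y-\mu}{\sigma^2}, \qquad \partial_\sigma \ell_\xi = \frac{(y-\mu)^2}{\sigma^3} - \frac{1}{\sigma},
\]
and the moments of the normal distribution give
\[
g_{11} = E_p\big[(\partial_\mu \ell_\xi)^2\big] = \frac{1}{\sigma^2}, \qquad g_{22} = E_p\big[(\partial_\sigma \ell_\xi)^2\big] = \frac{2}{\sigma^2}, \qquad g_{12} = 0,
\]
so this two-parameter model carries (up to a constant factor) the metric of the hyperbolic upper half-plane.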
Nearly Kähler structures on Riemannian manifolds were introduced by Gray [6] to describe a special class of almost Hermitian structures in every even dimension. As an odd-dimensional counterpart of nearly Kähler manifolds, nearly Sasakian manifolds were introduced by Blair, Showers and Yano in [7]. They showed that a normal nearly Sasakian structure is Sasakian and that a hypersurface of a nearly Kähler manifold is nearly Sasakian if and only if it is quasi-umbilical with respect to the (almost) contact form. In particular, $S^5$, properly embedded in $S^6$, inherits a nearly Sasakian structure which is not Sasakian.
A statistical manifold can be regarded as an extension of a Riemannian manifold in which the metric compatibility condition is relaxed to a more general one. Guided by this viewpoint, we construct a suitable nearly Sasakian structure on statistical structures and define a nearly Sasakian statistical manifold.
The purpose of this paper is to introduce nearly Sasakian and nearly Kähler structures on statistical manifolds and to exhibit the relation between the two notions. To this end, the notions and properties of statistical manifolds are collected in Section 2. In Section 3, we describe a nearly Sasakian structure on statistical manifolds and present some of its properties. In Section 4, we investigate nearly Kähler structures on statistical manifolds. In this context, the conditions needed for a real hypersurface in a nearly Kähler statistical manifold to admit a nearly Sasakian statistical structure are provided. Section 5 is devoted to studying (anti-)invariant statistical submanifolds of nearly Sasakian statistical manifolds. Some conditions under which an invariant submanifold of a nearly Sasakian statistical manifold is itself a nearly Sasakian statistical manifold are given at the end.

2. Preliminaries

For an $n$-dimensional manifold $N$, consider $(U, x^i)$, $i = 1, \ldots, n$, a local chart around a point $x \in U$. With respect to the coordinates $(x^i)$ on $N$, the local vector fields $\frac{\partial}{\partial x^i}\big|_p$ form a frame of $T_pN$.
An affine connection $\nabla$ is called a Codazzi connection if it satisfies the Codazzi equation
\[
(\nabla_{X_1} g)(X_2, X_3) = (\nabla_{X_2} g)(X_1, X_3) \ \big(= (\nabla_{X_3} g)(X_1, X_2)\big),
\]
for any $X_1, X_2, X_3 \in \Gamma(TN)$, where
\[
(\nabla_{X_1} g)(X_2, X_3) = X_1 g(X_2, X_3) - g(\nabla_{X_1} X_2, X_3) - g(X_2, \nabla_{X_1} X_3).
\]
The triplet $(N, g, \nabla)$ is called a statistical manifold if the Codazzi connection $\nabla$ is a statistical connection, i.e., a torsion-free Codazzi connection. Moreover, the affine connection $\nabla^*$, the (dual) conjugate connection of $\nabla$ with respect to $g$, is determined by
\[
X_1 g(X_2, X_3) = g(\nabla_{X_1} X_2, X_3) + g(X_2, \nabla^*_{X_1} X_3).
\]
Writing $\nabla^g$ for the Levi–Civita connection of $g$ on $N$, one can see that $\nabla^g = \frac{1}{2}(\nabla + \nabla^*)$ and
\[
\nabla^* g = -\nabla g .
\]
Thus, $(N, g, \nabla^*)$ also forms a statistical manifold. In particular, the torsion-free Codazzi connection $\nabla$ reduces to the Levi–Civita connection $\nabla^g$ if $\nabla g = 0$.
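For completeness, here is the short computation behind this claim (a sketch, using only the duality relation above). Applying the duality relation twice, once as written and once with the roles of the last two arguments exchanged, and adding the results gives
\[
2 X_1 g(X_2, X_3) = g\big((\nabla + \nabla^*)_{X_1} X_2, X_3\big) + g\big(X_2, (\nabla + \nabla^*)_{X_1} X_3\big),
\]
that is, $\nabla^g g = 0$; since $\nabla^g$ is an average of two torsion-free connections, it is torsion-free, and hence it is the Levi–Civita connection of $g$.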
A $(1,2)$-tensor field $K$ on a statistical manifold $(N, g, \nabla)$ is described by
\[
K_{X_1} X_2 = \nabla_{X_1} X_2 - \nabla^g_{X_1} X_2 ;
\]
from (2) and (3), we have
\[
K = \nabla - \nabla^g = \tfrac{1}{2}(\nabla - \nabla^*).
\]
Hence, it follows that $K$ satisfies
\[
K_{X_1} X_2 = K_{X_2} X_1, \qquad g(K_{X_3} X_2, X_1) = g(X_2, K_{X_3} X_1).
\]
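These two symmetries can be checked directly (a short verification, not spelled out in the text). Since $\nabla$ and $\nabla^g$ are both torsion-free,
\[
K_{X_1} X_2 - K_{X_2} X_1 = \big(\nabla_{X_1} X_2 - \nabla_{X_2} X_1\big) - \big(\nabla^g_{X_1} X_2 - \nabla^g_{X_2} X_1\big) = [X_1, X_2] - [X_1, X_2] = 0,
\]
and combining the duality relation with the definition of $\nabla g$ gives $g(X_2, K_{X_1} X_3) = -\frac{1}{2}(\nabla_{X_1} g)(X_2, X_3)$, which is totally symmetric in its three arguments by the Codazzi equation; the self-adjointness of $K$ follows.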
The curvature tensor $R^\nabla$ of a torsion-free linear connection $\nabla$ is described by
\[
R^\nabla(X_1, X_2) = \nabla_{X_1} \nabla_{X_2} - \nabla_{X_2} \nabla_{X_1} - \nabla_{[X_1, X_2]},
\]
for any $X_1, X_2 \in \Gamma(TN)$. On a statistical structure $(\nabla, g)$, we write $R$ for the curvature tensor of $\nabla$ and, similarly, $R^*$ for the curvature tensor of $\nabla^*$. It is obvious that
\[
R(X_1, X_2) = -R(X_2, X_1), \qquad R^*(X_1, X_2) = -R^*(X_2, X_1).
\]
Moreover, setting $R(X_1, X_2, X_3, X_4) = g(R(X_1, X_2)X_3, X_4)$, we can see that
\[
R(X_1, X_2, X_3, X_4) = -R^*(X_1, X_2, X_4, X_3),
\]
\[
R(X_1, X_2)X_3 + R(X_2, X_3)X_1 + R(X_3, X_1)X_2 = 0,
\]
\[
R^*(X_1, X_2)X_3 + R^*(X_2, X_3)X_1 + R^*(X_3, X_1)X_2 = 0.
\]
The statistical curvature tensor field $\mathcal{S}$ of the statistical structure $(\nabla, g)$ is given by
\[
\mathcal{S}(X_1, X_2)X_3 = \tfrac{1}{2}\{R(X_1, X_2)X_3 + R^*(X_1, X_2)X_3\}.
\]
Using the definition of $R$, it follows that
\[
\mathcal{S}(X_1, X_2, X_3, X_4) = -\mathcal{S}(X_2, X_1, X_3, X_4) = -\mathcal{S}(X_1, X_2, X_4, X_3) = \mathcal{S}(X_3, X_4, X_1, X_2),
\]
where $\mathcal{S}(X_1, X_2, X_3, X_4) = g(\mathcal{S}(X_1, X_2)X_3, X_4)$.
On a statistical manifold $(N, g, \nabla)$, the Lie derivative of the metric tensor $g$ along $v$, for any $X_1, X_2, v \in \Gamma(TN)$, is given by
\[
(\pounds_v g)(X_1, X_2) = g(\nabla^g_{X_1} v, X_2) + g(X_1, \nabla^g_{X_2} v) = g(\nabla_{X_1} v, X_2) - g(K_{X_1} v, X_2) + g(X_1, \nabla_{X_2} v) - g(X_1, K_{X_2} v).
\]
The vector field $v$ is said to be a Killing vector field, or an infinitesimal isometry, if $\pounds_v g = 0$. In that case, using the above equation and (8), it follows that
\[
g(\nabla_{X_1} v, X_2) + g(X_1, \nabla_{X_2} v) = 2 g(K_{X_1} v, X_2).
\]
Similarly, (7) implies
\[
g(\nabla^*_{X_1} v, X_2) + g(X_1, \nabla^*_{X_2} v) = -2 g(K_{X_1} v, X_2).
\]
The curvature tensor $R^g$ of a Riemannian manifold $(N, g)$ admitting a Killing vector field $v$ satisfies
\[
R^g(X_1, v)X_2 = \nabla^g_{X_1} \nabla^g_{X_2} v - \nabla^g_{\nabla^g_{X_1} X_2} v,
\]
for any $X_1, X_2, v \in \Gamma(TN)$ [8].

3. Nearly Sasakian Statistical Manifolds

An almost contact manifold is a $(2n+1)$-dimensional differentiable manifold $N$ equipped with an almost contact structure $(F, v, u)$, where $F$ is a tensor field of type $(1,1)$, $v$ a vector field and $u$ a 1-form, such that
\[
F^2 = -I + u \otimes v, \qquad Fv = 0, \qquad u(v) = 1.
\]
Additionally, $N$ is called an almost contact metric manifold if it admits a pseudo-Riemannian metric $g$ such that
\[
g(FX_1, FX_2) = g(X_1, X_2) - u(X_1)u(X_2), \qquad X_1, X_2 \in \Gamma(TN).
\]
Moreover, as in the almost contact case, (19) yields $u = g(\cdot, v)$ and $g(\cdot, F\cdot) = -g(F\cdot, \cdot)$.
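Two further standard identities follow at once (a short check, recorded here for convenience). Applying $F$ to $F^2 X_1 = -X_1 + u(X_1)v$ and comparing with $F^2(FX_1)$ gives
\[
-FX_1 = -FX_1 + u(FX_1)v \;\Longrightarrow\; u \circ F = 0 ,
\]
and replacing $X_2$ by $FX_2$ in (19) then yields the skew-symmetry of $F$ stated above:
\[
g(FX_1, F^2 X_2) = g(X_1, FX_2) \;\Longrightarrow\; g(FX_1, X_2) = -g(X_1, FX_2).
\]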
Theorem 1.
Let $(N, g, \nabla)$ be a statistical manifold with an almost contact metric structure $(F, v, u, g)$ such that the vector field $v$ is Killing. Then the statistical curvature tensor field $\mathcal{S}$ satisfies
\[
2\mathcal{S}(X_1, v)X_2 = \nabla_{X_1} \nabla_{X_2} v - \nabla_{\nabla_{X_1} X_2} v + \nabla^*_{X_1} \nabla^*_{X_2} v - \nabla^*_{\nabla^*_{X_1} X_2} v,
\]
for any $X_1, X_2 \in \Gamma(TN)$.
Proof. 
According to (10), (12) and (14), we can write
R * ( X 2 , X 3 , X 1 , v ) = R * ( X 3 , X 1 , X 2 , v ) R * ( X 1 , X 2 , X 3 , v ) = R ( X 3 , X 1 , v , X 2 ) + R ( X 1 , X 2 , v , X 3 ) = R ( X 1 , X 3 , v , X 2 ) R ( X 2 , X 1 , v , X 3 ) .
Applying (9) to the above equation, we find
R * ( X 2 , X 3 , X 1 , v ) = g ( X 1 X 3 v + X 3 X 1 v + [ X 1 , X 3 ] v , X 2 ) + g ( X 2 X 1 v + X 1 X 2 v + [ X 2 , X 1 ] v , X 3 ) .
Since v is Killing, by differentiating
g ( X 2 v , X 3 ) + g ( X 2 , X 3 v ) = 2 g ( K X 2 v , X 3 ) ,
with respect to X 1 , we obtain
2 X 1 g ( K X 3 X 2 v ) = ( X 1 g ) ( X 3 v , X 2 ) + g ( X 1 X 3 v , X 2 ) + g ( X 3 v , X 1 X 2 ) + ( X 1 g ) ( X 2 v , X 3 ) + g ( X 1 X 2 v , X 3 ) + g ( X 2 v , X 1 X 3 ) .
Setting the last equation in (20), it follows that
R * ( X 2 , X 3 , X 1 , v ) = 2 g ( X 1 X 2 v , X 3 ) 2 g ( X 1 X 2 v , X 3 ) + 2 ( X 1 g ) ( X 3 v , X 2 ) + 2 g ( K X 3 v , X 1 X 2 ) 2 X 1 g ( K X 3 X 2 , v ) 2 g ( K X 1 v , [ X 3 , X 2 ] ) + 2 X 3 g ( K X 1 X 2 , v ) + 2 g ( K X 2 v , [ X 1 , X 3 ] ) 2 X 2 g ( K X 1 X 3 , v ) + 2 g ( K X 3 v , X 2 X 1 ) + R ( X 2 , X 3 , v , X 1 ) .
As ( X 1 g ) ( X 3 v , X 2 ) = 2 g ( K X 1 X 3 v , X 2 ) , and using (12) in the above equation, we can obtain
R ( X 2 , X 3 , v , X 1 ) = g ( X 1 X 2 v , X 3 ) + g ( X 1 X 2 v , X 3 ) + 2 g ( K X 1 X 2 , X 3 v ) g ( K X 3 v , X 1 X 2 ) g ( K X 2 v , [ X 1 , X 3 ] ) + X 1 g ( K X 3 X 2 , v ) + g ( K X 1 v , [ X 3 , X 2 ] ) X 3 g ( K X 1 X 2 , v ) + X 2 g ( K X 1 X 3 , v ) g ( K X 3 v , X 2 X 1 ) .
Similarly, we find
R * ( X 2 , X 3 , v , X 1 ) = g ( X 1 * X 2 * v , X 3 ) + g ( X 1 * X 2 * v , X 3 ) 2 g ( K X 1 X 2 , X 3 * v ) + g ( K X 3 v , X 1 * X 2 ) + g ( K X 2 v , [ X 1 , X 3 ] ) X 1 g ( K X 3 X 2 , v ) g ( K X 1 v , [ X 3 , X 2 ] ) + X 3 g ( K X 1 X 2 , v ) X 2 g ( K X 1 X 3 , v ) + g ( K X 3 v , X 2 * X 1 ) .
Adding the previous relations and using (7) and (15), we obtain the following assertion. □
A nearly Sasakian manifold is an almost contact metric manifold $(N, F, v, u, g)$ satisfying
\[
(\nabla^g_{X_1} F)X_2 + (\nabla^g_{X_2} F)X_1 = 2g(X_1, X_2)v - u(X_1)X_2 - u(X_2)X_1,
\]
for any $X_1, X_2 \in \Gamma(TN)$ [7]. In such manifolds, the vector field $v$ is Killing. Moreover, a tensor field $h$ of type $(1,1)$ is determined by
\[
\nabla^g_{X_1} v = -FX_1 + hX_1 .
\]
The last equation immediately shows that $h$ is skew-symmetric and
\[
hF = -Fh, \qquad hv = 0, \qquad u \circ h = 0,
\]
and
\[
\nabla^g_v h = \nabla^g_v F = -Fh = \tfrac{1}{3}\pounds_v F .
\]
Moreover, Olszak proved the following formulas in [9]:
\[
R^g(FX_1, X_2, X_3, X_4) + R^g(X_1, FX_2, X_3, X_4) + R^g(X_1, X_2, FX_3, X_4) + R^g(X_1, X_2, X_3, FX_4) = 0,
\]
\[
R^g(FX_1, FX_2, FX_3, FX_4) = R^g(X_1, X_2, X_3, X_4) - u(X_1) R^g(v, X_2, X_3, X_4) + u(X_2) R^g(v, X_1, X_3, X_4),
\]
\[
R^g(v, X_1)X_2 = g(X_1 - h^2 X_1, X_2)v - u(X_2)(X_1 - h^2 X_1),
\]
\[
R^g(FX_1, FX_2)v = 0,
\]
for any $X_1, X_2, X_3, X_4 \in \Gamma(TN)$.
Lemma 1.
For a manifold $N$ with a statistical structure $(\nabla, g)$ and an almost contact metric structure $(F, v, u, g)$, the following holds:
\[
\nabla_{X_1} FX_2 - F\nabla^*_{X_1} X_2 + \nabla_{X_2} FX_1 - F\nabla^*_{X_2} X_1 = (\nabla^g_{X_1} F)X_2 + (\nabla^g_{X_2} F)X_1 + K_{X_1} FX_2 + K_{X_2} FX_1 + 2FK_{X_1} X_2,
\]
for any $X_1, X_2 \in \Gamma(TN)$.
Proof. 
(6) and (7) imply
\begin{align*}
\nabla_{X_1} FX_2 - F\nabla^*_{X_1} X_2 + \nabla_{X_2} FX_1 - F\nabla^*_{X_2} X_1
&= \nabla^g_{X_1} FX_2 + K_{X_1} FX_2 - F\nabla^g_{X_1} X_2 + FK_{X_1} X_2 \\
&\quad + \nabla^g_{X_2} FX_1 + K_{X_2} FX_1 - F\nabla^g_{X_2} X_1 + FK_{X_2} X_1 \\
&= (\nabla^g_{X_1} F)X_2 + (\nabla^g_{X_2} F)X_1 + K_{X_1} FX_2 + K_{X_2} FX_1 + 2FK_{X_1} X_2 .
\end{align*}
Hence, the proof is complete. □
Definition 1.
A nearly Sasakian statistical structure on $N$ is a quintuple $(\nabla, g, F, v, u)$ consisting of a statistical structure $(\nabla, g)$ and a nearly Sasakian structure $(g, F, v, u)$ satisfying
\[
K_{X_1} FX_2 + K_{X_2} FX_1 = -2FK_{X_1} X_2,
\]
for any $X_1, X_2 \in \Gamma(TN)$.
A nearly Sasakian statistical manifold is a manifold that admits a nearly Sasakian statistical structure.
Remark 1.
The tuple $(N, \nabla^*, g, F, v, u)$ is also a nearly Sasakian statistical manifold whenever $(N, \nabla, g, F, v, u)$ is one. In this case, from Lemma 1 and Definition 1, we have
\[
\nabla^*_{X_1} FX_2 - F\nabla_{X_1} X_2 + \nabla^*_{X_2} FX_1 - F\nabla_{X_2} X_1 = (\nabla^g_{X_1} F)X_2 + (\nabla^g_{X_2} F)X_1,
\]
for any $X_1, X_2 \in \Gamma(TN)$.
Theorem 2.
Let $(N, \nabla, g)$ be a statistical manifold and $(g, F, v)$ an almost contact metric structure on $N$. Then $(\nabla, g, F, v)$ is a nearly Sasakian statistical structure on $N$ if and only if the following formulas hold:
\[
\nabla_{X_1} FX_2 - F\nabla^*_{X_1} X_2 + \nabla_{X_2} FX_1 - F\nabla^*_{X_2} X_1 = 2g(X_1, X_2)v - u(X_1)X_2 - u(X_2)X_1,
\]
\[
\nabla^*_{X_1} FX_2 - F\nabla_{X_1} X_2 + \nabla^*_{X_2} FX_1 - F\nabla_{X_2} X_1 = 2g(X_1, X_2)v - u(X_1)X_2 - u(X_2)X_1,
\]
for any $X_1, X_2 \in \Gamma(TN)$.
Proof. 
Let $(N, \nabla, g, F, v)$ be a nearly Sasakian statistical manifold. Applying (21), Lemma 1 and Definition 1, we obtain (28); (29) then follows from Remark 1. Conversely, using (7) and subtracting (29) from (28), we obtain (27). □
Example 1.
Let us consider the three-dimensional unit sphere $S^3$ in the complex two-dimensional space $\mathbb{C}^2$. As $S^3$ is diffeomorphic to the Lie group $SU(2)$, let $\{e_1, e_2, e_3\}$ be the basis of the Lie algebra $\mathfrak{su}(2)$ of $SU(2)$ given by
\[
e_1 = \frac{\sqrt{2}}{2} \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}, \qquad
e_2 = \frac{\sqrt{2}}{2} \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}, \qquad
e_3 = \frac{1}{2} \begin{pmatrix} 0 & i \\ i & 0 \end{pmatrix}.
\]
Therefore, the Lie bracket is described by
\[
[e_1, e_2] = 2e_3, \qquad [e_2, e_3] = e_1, \qquad [e_1, e_3] = -e_2 .
\]
The Riemannian metric $g$ on $S^3$ is defined by
\[
g(e_1, e_2) = g(e_1, e_3) = g(e_2, e_3) = 0, \qquad g(e_1, e_1) = g(e_2, e_2) = g(e_3, e_3) = 1.
\]
Assume that $v = e_3$ and $u$ is the 1-form described by $u(X_1) = g(X_1, v)$ for any $X_1 \in \Gamma(TS^3)$. Considering $F$ as the $(1,1)$-tensor field determined by $F(e_1) = e_2$, $F(e_2) = -e_1$ and $F(v) = 0$, the above equations imply that $(S^3, F, v, u, g)$ is an almost contact metric manifold. Using Koszul's formula, it follows that $\nabla^g_{e_i} e_j = 0$, $i, j = 1, 2, 3$, except
\[
\nabla^g_{e_1} e_2 = v = -\nabla^g_{e_2} e_1, \qquad \nabla^g_{e_1} v = -e_2, \qquad \nabla^g_{e_2} v = e_1 .
\]
According to the above equations, we can see that
\[
(\nabla^g_{e_i} F)e_j + (\nabla^g_{e_j} F)e_i = 0 = 2g(e_i, e_j)v - u(e_i)e_j - u(e_j)e_i, \qquad i, j = 1, 2, 3,
\]
except
\begin{align*}
(\nabla^g_{e_1} F)e_1 + (\nabla^g_{e_1} F)e_1 &= 2v = 2g(e_1, e_1)v - u(e_1)e_1 - u(e_1)e_1, \\
(\nabla^g_{e_1} F)v + (\nabla^g_{v} F)e_1 &= -e_1 = 2g(e_1, v)v - u(e_1)v - u(v)e_1, \\
(\nabla^g_{e_2} F)e_2 + (\nabla^g_{e_2} F)e_2 &= 2v = 2g(e_2, e_2)v - u(e_2)e_2 - u(e_2)e_2, \\
(\nabla^g_{e_2} F)e_3 + (\nabla^g_{e_3} F)e_2 &= -e_2 = 2g(e_2, e_3)v - u(e_2)e_3 - u(e_3)e_2,
\end{align*}
which shows that $(g, F, v, u)$ is a nearly Sasakian structure on $S^3$. By setting
\[
K(e_1, e_1) = e_1, \qquad K(e_1, e_2) = K(e_2, e_1) = -e_2, \qquad K(e_2, e_2) = -e_1,
\]
with all other components zero, one sees that $K$ satisfies (8). From (6), it follows that
\[
\nabla_{e_1} e_1 = e_1, \quad \nabla_{e_1} e_2 = e_3 - e_2, \quad \nabla_{e_1} e_3 = -e_2, \quad \nabla_{e_2} e_1 = -e_2 - e_3, \quad \nabla_{e_2} e_2 = -e_1, \quad \nabla_{e_2} e_3 = e_1 .
\]
Therefore, we can obtain $(\nabla_{e_i} g)(e_j, e_k) = 0$, $i, j, k = 1, 2, 3$, except
\[
(\nabla_{e_1} g)(e_1, e_1) = -2, \qquad (\nabla_{e_1} g)(e_2, e_2) = (\nabla_{e_2} g)(e_1, e_2) = (\nabla_{e_2} g)(e_2, e_1) = 2 .
\]
Hence, $(\nabla, g)$ is a statistical structure on $S^3$. Moreover, the equations
\begin{align*}
K_{e_1} F(e_1) + K_{e_1} F(e_1) &= -2e_2 = -2FK_{e_1} e_1, \\
K_{e_1} F(e_2) + K_{e_2} F(e_1) &= -2e_1 = -2FK_{e_1} e_2, \\
K_{e_2} F(e_2) + K_{e_2} F(e_2) &= 2e_2 = -2FK_{e_2} e_2,
\end{align*}
hold. Therefore, $(S^3, \nabla, g, F, v, u)$ is a nearly Sasakian statistical manifold.
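As a quick sanity check (ours, not part of the original example), the cubic form $C(X, Y, Z) = g(K_X Y, Z)$ of the above $K$ must be totally symmetric for (8) to hold. Indeed, its only non-zero components are
\[
C(e_1, e_1, e_1) = 1, \qquad C(e_1, e_2, e_2) = C(e_2, e_1, e_2) = C(e_2, e_2, e_1) = -1,
\]
which are symmetric under all permutations of the arguments, as required.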
Proposition 1.
For a nearly Sasakian statistical manifold $(N, \nabla, g, F, v, u)$, the following conditions hold:
\[
(i)\; FK_v v = 0, \qquad (ii)\; FK_{FX_1} v = 0, \qquad (iii)\; K_v X_1 = u(X_1) K_v v,
\]
\[
(iv)\; \nabla_{X_1} v = \nabla^g_{X_1} v + u(X_1) K_v v, \qquad (v)\; \nabla^*_{X_1} v = \nabla^g_{X_1} v - u(X_1) K_v v,
\]
for any $X_1 \in \Gamma(TN)$.
Proof. 
Setting $X_1 = X_2 = v$ in (27) yields (i). For $X_2 = v$ in (27), we have
\[
K_{FX_1} v = -2FK_{X_1} v .
\]
Putting $FX_1$ in place of $X_1$ in the last equation and using (18), we can obtain
\[
K_{X_1} v = u(X_1) K_v v + 2FK_{FX_1} v .
\]
Applying $F$ yields
\[
FK_{X_1} v = -2K_{FX_1} v + 2u(K_{FX_1} v)v .
\]
(30) and the last equation imply that
\[
3K_{FX_1} v = 4u(K_{FX_1} v)v ,
\]
which gives us $FK_{FX_1} v = 0$, so (ii) holds. This and (31) yield (iii). From (6), (7) and (iii), we have (iv) and (v). □
Corollary 1.
A nearly Sasakian statistical manifold satisfies the following
\[
u(X_2) K_{X_1} K_v v = u(X_1) K_{X_2} K_v v = u(K_{X_1} X_2) K_v v ,
\]
for any $X_1, X_2 \in \Gamma(TN)$.
Proof. 
(6) and (30) imply
\[
F^2\big(\nabla_{X_1} v - \nabla^g_{X_1} v\big) = 0 ,
\]
which gives us
\[
\nabla_{X_1} v = \nabla^g_{X_1} v + g(\nabla_{X_1} v, v)v .
\]
Similarly,
\[
\nabla^*_{X_1} v = \nabla^g_{X_1} v + g(\nabla^*_{X_1} v, v)v .
\]
Then, subtracting the above two equations yields
\[
K_{X_1} v = g(\nabla_{X_1} v, v)v ,
\]
which gives us $K_v v = g(\nabla_v v, v)v$. Thus, we obtain
\[
u(X_2) K_{X_1} K_v v = u(X_2) g(\nabla_v v, v) K_{X_1} v = u(X_1) u(X_2) g(\nabla_v v, v) K_v v = u(X_1) K_{X_2} K_v v .
\]
Moreover, (iii) implies
\[
u(K_{X_1} X_2) K_v v = g(K_{X_1} X_2, v) K_v v = g(K_{X_1} v, X_2) K_v v = u(X_1) u(X_2) g(\nabla_v v, v) K_v v .
\]
Therefore, the assertion follows. □
Corollary 2.
In a nearly Sasakian statistical manifold $N$, let $X_1 \in \Gamma(TN)$ with $X_1 \perp v$. Then,
1. $K_{X_1} v = 0$;
2. $\nabla_{X_1} v = \nabla^*_{X_1} v = \nabla^g_{X_1} v$.
Proposition 2.
On a nearly Sasakian statistical manifold, the following holds
\[
g(\nabla_{X_1} v, X_2) + g(\nabla_{X_2} v, X_1) = 2u(X_1)u(X_2)\, g(K_v v, v),
\]
for any $X_1, X_2 \in \Gamma(TN)$.
Proof. 
Since $v$ is a Killing vector field on a nearly Sasakian manifold (see [7]), we have
\[
g(\nabla^g_{X_1} v, X_2) + g(\nabla^g_{X_2} v, X_1) = 0 .
\]
Setting (6) in the above equation, we have the following assertion. □
Lemma 2.
Let $(N, \nabla, g, F, v)$ be a nearly Sasakian statistical manifold. Then, the statistical curvature tensor field satisfies
\[
\mathcal{S}(v, X_1)X_2 = g(X_1 - h^2 X_1, X_2)v - u(X_2)(X_1 - h^2 X_1),
\]
for any $X_1, X_2 \in \Gamma(TN)$.
Proof. 
According to (6), (7) and Theorem 1, we can write
X 1 X 2 v X 1 X 2 v = X 1 X 2 g v + X 1 ( u ( X 2 ) K v v ) X 1 X 2 g v u ( X 1 X 2 ) K v v = K X 1 X 2 g v + X 1 g X 2 g v + ( X 1 u ) X 2 K v v + u ( X 2 ) ( K X 1 K v v + X 1 g K v v ) X 1 g X 2 g v K X 1 X 2 g v .
Applying (17) in the above equation, we have
X 1 X 2 v X 1 X 2 v = R g ( X 1 , v ) X 2 + K X 1 X 2 g v + ( X 1 u ) X 2 K v v + u ( X 2 ) ( K X 1 K v v + X 1 g K v v ) K X 1 X 2 g v .
We can similarly conclude that
X 1 * X 2 * v X 1 * X 2 * v = R g ( X 1 , v ) X 2 K X 1 X 2 g v ( X 1 * u ) X 2 K v v + u ( X 2 ) ( K X 1 K v v X 1 g K v v ) + K X 1 X 2 g v .
The above two equations imply
X 1 X 2 v X 1 X 2 v + X 1 * X 2 * v X 1 * X 2 * v = 2 R g ( X 1 , v ) X 2 2 u ( K X 1 X 2 ) K v v + 2 u ( X 2 ) K X 1 K v v ,
from this and Theorem 1, we have
𝒮 ( X 1 , v ) X 2 = R g ( X 1 , v ) X 2 u ( K X 1 X 2 ) K v v + u ( X 2 ) K X 1 K v v .
Thus, the assertion follows from (25), (32) and Corollary 1. □
Corollary 3.
On a nearly Sasakian statistical manifold N, the following holds
\[
\mathcal{S}(X_1, X_2)v = -g(X_1 - h^2 X_1, X_2)v + u(X_2)(X_1 - h^2 X_1) + g(X_2 - h^2 X_2, X_1)v - u(X_1)(X_2 - h^2 X_2),
\]
\[
\mathcal{S}(FX_1, FX_2)v = 0,
\]
for any $X_1, X_2 \in \Gamma(TN)$.
Proof. 
We have
\[
\mathcal{S}(X_1, X_2)v = -\mathcal{S}(v, X_1)X_2 - \mathcal{S}(X_2, v)X_1 .
\]
Applying Lemma 2 to the last equation, (33) follows. To prove (34), replace $X_1$ and $X_2$ by $FX_1$ and $FX_2$ in (33); using $u \circ F = 0$ and the skew-symmetry of $h$, we obtain
\[
\mathcal{S}(FX_1, FX_2)v = -g(FX_1 - h^2 FX_1, FX_2)v + g(FX_2 - h^2 FX_2, FX_1)v = 0 . \qquad \square
\]
Proposition 3.
The statistical curvature tensor field S of a nearly Sasakian statistical manifold N satisfies the following
\[
\mathcal{S}(FX_1, X_2, X_3, X_4) + \mathcal{S}(X_1, FX_2, X_3, X_4) + \mathcal{S}(X_1, X_2, FX_3, X_4) + \mathcal{S}(X_1, X_2, X_3, FX_4) = 0,
\]
\[
\mathcal{S}(FX_1, FX_2, FX_3, FX_4) = \mathcal{S}(X_1, X_2, X_3, X_4) + u(X_2) R^g(v, X_1, X_3, X_4) - u(X_1) R^g(v, X_2, X_3, X_4),
\]
for any $X_1, X_2, X_3, X_4 \in \Gamma(TN)$.
Proof. 
Applying (7) in (15), it follows that
\[
\mathcal{S}(X_1, X_2)X_3 = R^g(X_1, X_2)X_3 + [K_{X_1}, K_{X_2}]X_3 .
\]
Thus, using (23) and (37), we can write
𝒮 ( F X 1 , X 2 , X 3 , X 4 ) + 𝒮 ( X 1 , F X 2 , X 3 , X 4 ) + 𝒮 ( X 1 , X 2 , F X 3 , X 4 ) + 𝒮 ( X 1 , X 2 , X 3 , F X 4 ) = g ( K F X 1 K X 2 X 3 K X 2 K F X 1 X 3 + K X 1 K F X 2 X 3 K F X 2 K X 1 X 3 + K X 1 K X 2 F X 3 K X 2 K X 1 F X 3 , X 4 ) + g ( K X 1 K X 2 X 3 K X 2 K X 1 X 3 , F X 4 ) .
On the other hand, (27) implies
g ( K X 1 F X 2 + K X 2 F X 1 , X 3 ) = 2 g ( K X 1 X 2 , F X 3 ) ,
which gives us
g ( K F X 1 K X 2 X 3 K X 2 K F X 1 X 3 + K X 1 K F X 2 X 3 K F X 2 K X 1 X 3 + K X 1 K X 2 F X 3 K X 2 K X 1 F X 3 , X 4 ) + g ( K X 1 K X 2 X 3 K X 2 K X 1 X 3 , F X 4 ) = 2 g ( K X 2 X 3 , F K X 1 X 4 ) 2 g ( K X 1 X 3 , F K X 2 X 4 ) + 2 g ( F K X 2 X 3 , K X 1 X 4 ) 2 g ( F K X 1 X 3 , K X 2 X 4 ) = 0 .
Using the above equation in (38), we obtain (35). Considering X 1 = F X 1 in (35) and using (18), it follows that
𝒮 ( X 1 , X 2 , X 3 , X 4 ) + u ( X 1 ) 𝒮 ( v , X 2 , X 3 , X 4 ) + 𝒮 ( F X 1 , F X 2 , X 3 , X 4 ) + 𝒮 ( F X 1 , X 2 , F X 3 , X 4 ) + 𝒮 ( F X 1 , X 2 , X 3 , F X 4 ) = 0 .
Similarly, setting X 2 = F X 2 , X 3 = F X 3 and X 4 = F X 4 , respectively, we have
𝒮 ( F X 1 , F X 2 , X 3 , X 4 ) 𝒮 ( X 1 , X 2 , X 3 , X 4 ) + u ( X 2 ) 𝒮 ( X 1 , v , X 3 , X 4 ) + 𝒮 ( X 1 , F X 2 , F X 3 , X 4 ) + 𝒮 ( X 1 , F X 2 , X 3 , F X 4 ) = 0 ,
𝒮 ( F X 1 , X 2 , F X 3 , X 4 ) + 𝒮 ( X 1 , F X 2 , F X 3 , X 4 ) 𝒮 ( X 1 , X 2 , X 3 , X 4 ) + u ( X 3 ) 𝒮 ( X 1 , X 2 , v , X 4 ) + 𝒮 ( X 1 , X 2 , F X 3 , F X 4 ) = 0 ,
and
𝒮 ( F X 1 , X 2 , X 3 , F X 4 ) + 𝒮 ( X 1 , F X 2 , X 3 , F X 4 ) + 𝒮 ( X 1 , X 2 , F X 3 , F X 4 ) 𝒮 ( X 1 , X 2 , X 3 , X 4 ) + u ( X 4 ) 𝒮 ( X 1 , X 2 , X 3 , v ) = 0 .
By adding (39) and (40), and subtracting the expression obtained from (41) and (42), we obtain
2 𝒮 ( F X 1 , F X 2 , X 3 , X 4 ) 2 𝒮 ( X 1 , X 2 , F X 3 , F X 4 ) + u ( X 1 ) 𝒮 ( v , X 2 , X 3 , X 4 ) + u ( X 2 ) 𝒮 ( X 1 , v , X 3 , X 4 ) u ( X 3 ) 𝒮 ( X 1 , X 2 , v , X 4 ) u ( X 4 ) 𝒮 ( X 1 , X 2 , X 3 , v ) = 0 .
Replacing X 1 and X 2 by F X 1 and F X 2 , we can rewrite the last equation as
2 𝒮 ( F 2 X 1 , F 2 X 2 , X 3 , X 4 ) 2 𝒮 ( F X 1 , F X 2 , F X 3 , F X 4 ) u ( X 3 ) 𝒮 ( F X 1 , F X 2 , v , X 4 ) u ( X 4 ) 𝒮 ( F X 1 , F X 2 , X 3 , v ) = 0 .
Applying (34) in the above equation, we obtain
𝒮 ( F 2 X 1 , F 2 X 2 , X 3 , X 4 ) = 𝒮 ( F X 1 , F X 2 , F X 3 , F X 4 ) .
On the other hand, using (18), it can be seen that
𝒮 ( F 2 X 1 , F 2 X 2 , X 3 , X 4 ) = 𝒮 ( X 1 , X 2 , X 3 , X 4 ) u ( X 2 ) 𝒮 ( X 1 , v , X 3 , X 4 ) u ( X 1 ) 𝒮 ( v , X 2 , X 3 , X 4 ) .
According to Corollary 1 and (32), we have
R g ( v , X 1 , X 3 , X 4 ) = R g ( X 3 , X 4 , v , X 1 ) = 𝒮 ( X 3 , X 4 , v , X 1 ) = 𝒮 ( v , X 1 , X 3 , X 4 ) .
The above three equations imply (36). □
Corollary 4.
The tensor field K in a nearly Sasakian statistical manifold, N, satisfies the relation
F [ K F X 2 , K F X 1 ] F = [ K X 1 , K X 2 ] ,
for any $X_1, X_2 \in \Gamma(TN)$.
Proof. 
Using (24) and (37), we obtain
𝒮 ( F X 1 , F X 2 , F X 3 , F X 4 ) 𝒮 ( X 1 , X 2 , X 3 , X 4 ) u ( X 2 ) R g ( v , X 1 , X 3 , X 4 ) + u ( X 1 ) R g ( v , X 2 , X 3 , X 4 ) = g ( K F X 1 K F X 2 F X 3 K F X 2 K F X 1 F X 3 , F X 4 ) g ( K X 1 K X 2 X 3 K X 2 K X 1 X 3 , X 4 ) = g ( F [ K F X 2 , K F X 1 ] F X 3 [ K X 1 , K X 2 ] X 3 , X 4 ) .
Comparing this with relation (36) yields the following assertion. □
A statistical manifold is called conjugate symmetric if the curvature tensors of the connections $\nabla$ and $\nabla^*$ are equal, i.e.,
\[
R(X_1, X_2)X_3 = R^*(X_1, X_2)X_3,
\]
for all $X_1, X_2, X_3 \in \Gamma(TN)$.
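In this case the statistical curvature tensor carries no extra information (an immediate observation): from the definition of $\mathcal{S}$,
\[
\mathcal{S}(X_1, X_2)X_3 = \tfrac{1}{2}\{R(X_1, X_2)X_3 + R^*(X_1, X_2)X_3\} = R(X_1, X_2)X_3 ,
\]
which is why the identities of Proposition 3 translate directly into statements about $R$ in the following corollary.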
Corollary 5.
Let $(N, \nabla, g, F, v)$ be a conjugate symmetric nearly Sasakian statistical manifold. Then, the following holds:
\[
R(FX_1, FX_2, FX_3, FX_4) - R(X_1, X_2, X_3, X_4) = u(X_2) R(X_3, X_4, v, X_1) - u(X_1) R(X_3, X_4, v, X_2),
\]
\[
R(X_1, X_2)v = R^g(X_1, X_2)v, \qquad R(FX_1, FX_2)v = 0,
\]
for any $X_1, X_2, X_3, X_4 \in \Gamma(TN)$.

4. Hypersurfaces in Nearly Kähler Statistical Manifolds

Let $\tilde N$ be a smooth manifold. A pair $(\tilde g, J)$ is said to be an almost Hermitian structure on $\tilde N$ if
\[
J^2 = -\mathrm{Id}, \qquad \tilde g(JX_1, JX_2) = \tilde g(X_1, X_2),
\]
for any $X_1, X_2 \in \Gamma(T\tilde N)$. Let $\tilde\nabla^g$ denote the Riemannian connection of $\tilde g$. Then $J$ is a Killing tensor if and only if
\[
(\tilde\nabla^g_{X_1} J)X_2 + (\tilde\nabla^g_{X_2} J)X_1 = 0 .
\]
In this case, the pair $(\tilde g, J)$ is called a nearly Kähler structure, and if $J$ is integrable, the structure is Kählerian [7].
Lemma 3.
Let $(\tilde\nabla, \tilde g)$ be a statistical structure and $(\tilde g, J)$ a nearly Kähler structure on $\tilde N$. We have the following formula:
\[
\tilde\nabla_{X_1} JX_2 - J\tilde\nabla^*_{X_1} X_2 + \tilde\nabla_{X_2} JX_1 - J\tilde\nabla^*_{X_2} X_1 = \tilde K_{X_1} JX_2 + \tilde K_{X_2} JX_1 + 2J\tilde K_{X_1} X_2,
\]
for any $X_1, X_2 \in \Gamma(T\tilde N)$, where $\tilde K$ is given as in (8) for $(\tilde\nabla, \tilde g)$.
Remark 2.
The tuple $(\tilde N, \tilde\nabla^*, \tilde g, J)$ is also a nearly Kähler statistical manifold whenever $(\tilde N, \tilde\nabla, \tilde g, J)$ is one. In this case, from the above lemma, we have
\[
\tilde\nabla^*_{X_1} JX_2 - J\tilde\nabla_{X_1} X_2 + \tilde\nabla^*_{X_2} JX_1 - J\tilde\nabla_{X_2} X_1 = -\big(\tilde K_{X_1} JX_2 + \tilde K_{X_2} JX_1 + 2J\tilde K_{X_1} X_2\big),
\]
for any $X_1, X_2 \in \Gamma(T\tilde N)$.
Definition 2.
A nearly Kähler statistical structure on $\tilde N$ is a triple $(\tilde\nabla, \tilde g, J)$, where $(\tilde\nabla, \tilde g)$ is a statistical structure, $(\tilde g, J)$ is a nearly Kähler structure on $\tilde N$ and the following equality is satisfied:
\[
\tilde K_{X_1} JX_2 + \tilde K_{X_2} JX_1 = -2J\tilde K_{X_1} X_2,
\]
for any $X_1, X_2 \in \Gamma(T\tilde N)$.
Let $N$ be a hypersurface of a statistical manifold $(\tilde N, \tilde g, \tilde\nabla, \tilde\nabla^*)$. Writing $\mathbf{n}$ for a unit normal vector field and $g$ for the induced metric on $N$, the following relations hold:
\[
\tilde\nabla_{X_1} X_2 = \nabla_{X_1} X_2 + h(X_1, X_2)\mathbf{n}, \qquad \tilde\nabla_{X_1} \mathbf{n} = -AX_1 + \tau(X_1)\mathbf{n},
\]
\[
\tilde\nabla^*_{X_1} X_2 = \nabla^*_{X_1} X_2 + h^*(X_1, X_2)\mathbf{n}, \qquad \tilde\nabla^*_{X_1} \mathbf{n} = -A^*X_1 + \tau^*(X_1)\mathbf{n},
\]
for any $X_1, X_2 \in \Gamma(TN)$. It follows that
\[
g(AX_1, X_2) = h^*(X_1, X_2), \qquad g(A^*X_1, X_2) = h(X_1, X_2), \qquad \tau(X_1) + \tau^*(X_1) = 0 .
\]
Furthermore, the second fundamental form $h^g$ is related to the Levi–Civita connections $\tilde\nabla^g$ and $\nabla^g$ by
\[
\tilde\nabla^g_{X_1} X_2 = \nabla^g_{X_1} X_2 + h^g(X_1, X_2)\mathbf{n}, \qquad \tilde\nabla^g_{X_1} \mathbf{n} = -A^g X_1,
\]
where $g(A^g X_1, X_2) = h^g(X_1, X_2)$.
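The first of these dualities can be seen in one line (a sketch of the standard argument). Differentiating $g(X_2, \mathbf{n}) = 0$ along $X_1$ with the pair of conjugate connections gives
\[
0 = X_1\, \tilde g(X_2, \mathbf{n}) = \tilde g(\tilde\nabla_{X_1} X_2, \mathbf{n}) + \tilde g(X_2, \tilde\nabla^*_{X_1} \mathbf{n}) = h(X_1, X_2) - g(A^*X_1, X_2),
\]
since the $\tau^*$-term vanishes against $g(X_2, \mathbf{n}) = 0$; exchanging the roles of $\tilde\nabla$ and $\tilde\nabla^*$ yields $g(AX_1, X_2) = h^*(X_1, X_2)$.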
Remark 3.
Let $(\tilde N, \tilde g, J)$ be a nearly Kähler manifold and $N$ a hypersurface with a unit normal vector field $\mathbf{n}$. Let $g$ be the induced metric on $N$, and consider $v$, $u$ and $F$ as a vector field, a 1-form and a tensor of type $(1,1)$ on $N$, respectively, such that
\[
v = -J\mathbf{n},
\]
\[
JX_1 = FX_1 + u(X_1)\mathbf{n},
\]
for any $X_1 \in \Gamma(TN)$. Then, $(g, F, v)$ is an almost contact metric structure on $N$ [7].
Lemma 4.
Let $(\tilde N, \tilde\nabla, \tilde g, J)$ be a nearly Kähler statistical manifold. If $(N, g, F, v)$ is a hypersurface with the induced almost contact metric structure as in Remark 3, and $(\nabla, g)$ is the induced statistical structure on $N$ as in (43) and (44), then the following holds:
(i)
F A v = 0 ,
(ii)
g ( A X 1 , v ) = u ( A v ) u ( X 1 ) ,
(iii)
A X 1 = v F X 1 F v * X 1 F X 1 * v + u ( X 1 ) A v ,
(iv)
τ ( X 1 ) = g ( X 1 * v , v ) g ( X 1 , v v ) u ( X 1 ) τ ( v ) ,
(v)
\[
\nabla_{X_1} FX_2 - F\nabla^*_{X_1} X_2 + \nabla_{X_2} FX_1 - F\nabla^*_{X_2} X_1 = -2g(AX_1, X_2)v + u(X_2)AX_1 + u(X_1)AX_2,
\]
(vi)
g ( X 1 v , X 2 ) + g ( X 2 v , X 1 ) = g ( F A * X 1 , X 2 ) + g ( F A * X 2 , X 1 ) u ( X 1 ) τ ( X 2 ) u ( X 2 ) τ ( X 1 ) ,
for any $X_1, X_2 \in \Gamma(TN)$. For the induced statistical structure $(\nabla^*, g)$ on $N$, we have
(i)*
F A * v = 0 ,
(ii)*
g ( A * X 1 , v ) = u ( A * v ) u ( X 1 ) ,
(iii)*
A * X 1 = v * F X 1 F v X 1 F X 1 v + u ( X 1 ) A * v ,
(iv)*
τ * ( X 1 ) = g ( X 1 v , v ) g ( X 1 , v * v ) u ( X 1 ) τ * ( v ) ,
(v)*
\[
\nabla^*_{X_1} FX_2 - F\nabla_{X_1} X_2 + \nabla^*_{X_2} FX_1 - F\nabla_{X_2} X_1 = -2g(A^*X_1, X_2)v + u(X_2)A^*X_1 + u(X_1)A^*X_2 .
\]
(vi)*
g ( X 1 * v , X 2 ) + g ( X 2 * v , X 1 ) = g ( F A X 1 , X 2 ) + g ( F A X 2 , X 1 ) u ( X 1 ) τ * ( X 2 ) u ( X 2 ) τ * ( X 1 ) .
Proof. 
According to Definition 2 and (46), we can write
0 = ˜ X 1 J v ˜ X 1 n = J ˜ X 1 * v ˜ v J X 1 + J ˜ v * X 1 ˜ X 1 n .
Applying (43), (44) and (47) in the above equation, we have
0 = J ( X 1 * v + g ( A X 1 , v ) n ) ˜ v ( F X 1 + u ( X 1 ) n ) + J ( v * X 1 + g ( A v , X 1 ) n ) + A X 1 τ ( X 1 ) n = F ( X 1 * v ) g ( A X 1 , v ) v v F ( X 1 ) + u ( X 1 ) A v + F ( v * X 1 ) g ( A v , X 1 ) v + A X 1 + { u ( X 1 * v ) g ( A * v , F X 1 ) v ( u ( X 1 ) ) u ( X 1 ) τ ( v ) + u ( v * X 1 ) τ ( X 1 ) } n .
The vanishing tangential part yields
A X 1 = v F X 1 F v * X 1 F X 1 * v + 2 g ( A X 1 , v ) v u ( X 1 ) A v .
Setting X 1 = v in the above equation, it follows that
A v = u ( A v ) v ,
hence, F A v = 0 and implies (i), from which (ii) follows because 0 = g ( F A v , F X 1 ) = g ( A v , X 1 ) u ( A v ) u ( X 1 ) . From (49) and (50) we have (iii). Vanishing vertical part in (48), and using ( i ) * and
v ( u ( X 1 ) ) = g ( v * X 1 , v ) + g ( X 1 , v v ) ,
we obtain (iv). As
˜ X 1 J X 2 J ˜ X 1 * X 2 + ˜ X 2 J X 1 J ˜ X 2 * X 1 = 0 ;
thus, (43), (44), (46) and (47) imply
X 1 F X 2 u ( X 2 ) A X 1 F ( X 1 * X 2 ) + g ( A X 1 , X 2 ) v + X 2 F X 1 u ( X 1 ) A X 2 F ( X 2 * X 1 ) + g ( A X 2 , X 1 ) v + { g ( A * X 1 , F X 2 ) + g ( X 1 v , X 2 ) + u ( X 2 ) τ ( X 1 ) + g ( A * X 2 , F X 1 ) + g ( X 1 , X 2 v ) + u ( X 1 ) τ ( X 2 ) } n = 0 .
From the above equation, (v) and (vi) follow. In a similar fashion, we have ( i ) * ( v i ) * . □
Theorem 3.
Let $(\tilde N, \tilde\nabla, \tilde g, J)$ be a nearly Kähler statistical manifold and $(N, \nabla, g, F, v)$ an almost contact metric statistical hypersurface in $\tilde N$ given by (43), (44), (46) and (47). Then, $(N, \nabla, g, F, v)$ is a nearly Sasakian statistical manifold if and only if
\[
AX_1 = -X_1 + u(X_1)(Av + v),
\]
\[
A^*X_1 = -X_1 + u(X_1)(A^*v + v),
\]
for any $X_1 \in \Gamma(TN)$.
Proof. 
Let $(\nabla, g, F, v)$ be a nearly Sasakian statistical structure on $N$. According to Definition 1 and Theorem 2, we have
\[
\nabla_{X_1} FX_2 - F\nabla^*_{X_1} X_2 + \nabla_{X_2} FX_1 - F\nabla^*_{X_2} X_1 = 2g(X_1, X_2)v - u(X_1)X_2 - u(X_2)X_1,
\]
which, on setting $X_2 = v$, gives us
\[
\nabla_v FX_1 - F\nabla^*_v X_1 - F\nabla^*_{X_1} v = u(X_1)v - X_1 .
\]
Placing the last equation in part (iii) of Lemma 4, we obtain (51). Similarly, we can prove (52). Conversely, let the shape operators satisfy (51). Part (v) of Lemma 4 yields
\begin{align*}
\nabla_{X_1} FX_2 - F\nabla^*_{X_1} X_2 + \nabla_{X_2} FX_1 - F\nabla^*_{X_2} X_1
&= -2g\big({-X_1} + u(X_1)(Av + v), X_2\big)v \\
&\quad + u(X_2)\big({-X_1} + u(X_1)(Av + v)\big) + u(X_1)\big({-X_2} + u(X_2)(Av + v)\big) \\
&= 2g(X_1, X_2)v - u(X_1)X_2 - u(X_2)X_1 .
\end{align*}
In the same way, (v) * and (52) imply
\[
\nabla^*_{X_1} FX_2 - F\nabla_{X_1} X_2 + \nabla^*_{X_2} FX_1 - F\nabla_{X_2} X_1 = 2g(X_1, X_2)v - u(X_1)X_2 - u(X_2)X_1 .
\]
According to the above equations and Theorem 2, the proof is completed. □

5. Submanifolds of Nearly Sasakian Statistical Manifolds

Let $N$ be an $n$-dimensional submanifold of an almost contact metric statistical manifold $(\tilde N, \tilde\nabla, g, \tilde F, \tilde v, \tilde u)$. We denote the induced metric on $N$ by $g$. For all $U_1 \in \Gamma(TN)$ and $\zeta \in \Gamma(T^\perp N)$, we put $\tilde F U_1 = FU_1 + \bar F U_1$ and $\tilde F \zeta = F\zeta + \bar F \zeta$, where $FU_1, F\zeta \in \Gamma(TN)$ and $\bar F U_1, \bar F \zeta \in \Gamma(T^\perp N)$. If $\tilde F(T_pN) \subseteq T_pN$ for any $p \in N$, then $N$ is called $\tilde F$-invariant; if $\tilde F(T_pN) \subseteq T_p^\perp N$ for any $p \in N$, it is called $\tilde F$-anti-invariant.
Proposition 4
([10]). Any $\tilde F$-invariant submanifold $N$ embedded in an almost contact metric manifold $(\tilde N, \tilde\nabla, g, \tilde F, \tilde v, \tilde u)$ in such a way that the vector field $\tilde v$ is always tangent to $N$ induces an almost contact metric structure $(g, F, v, u)$.
For any U 1 , U 2 Γ ( T N ) , the corresponding Gauss formulas are given by
\[
\tilde\nabla_{U_1} U_2 = \nabla_{U_1} U_2 + h(U_1, U_2), \qquad \tilde\nabla^*_{U_1} U_2 = \nabla^*_{U_1} U_2 + h^*(U_1, U_2).
\]
It is known that $(\nabla, g)$ and $(\nabla^*, g)$ are statistical structures on $N$, and that $h$ and $h^*$ are symmetric and bilinear. The mean curvature vector field with respect to $\tilde\nabla$ is described by
\[
H = \frac{1}{n}\, \mathrm{trace}(h).
\]
The submanifold $N$ is $\tilde\nabla$-totally umbilical if $h(U_1, U_2) = g(U_1, U_2)H$ for all $U_1, U_2 \in \Gamma(TN)$; it is called $\tilde\nabla$-autoparallel if $h(U_1, U_2) = 0$ for any $U_1, U_2 \in \Gamma(TN)$, and dual-autoparallel if it is both $\tilde\nabla$- and $\tilde\nabla^*$-autoparallel, i.e., $h(U_1, U_2) = h^*(U_1, U_2) = 0$ for any $U_1, U_2 \in \Gamma(TN)$. If $h^g(U_1, U_2) = 0$ for any $U_1, U_2 \in \Gamma(TN)$, the submanifold $N$ is called totally geodesic. Moreover, $N$ is called $\tilde\nabla$-minimal ($\tilde\nabla^*$-minimal) if $H = 0$ ($H^* = 0$).
For any $U_1 \in \Gamma(TN)$ and $\zeta \in \Gamma(T^\perp N)$, the Weingarten formulas are
\[
\tilde\nabla_{U_1} \zeta = -A_\zeta U_1 + D_{U_1} \zeta, \qquad \tilde\nabla^*_{U_1} \zeta = -A^*_\zeta U_1 + D^*_{U_1} \zeta,
\]
where $D$ and $D^*$ are the normal connections on $\Gamma(T^\perp N)$, and the tensor fields $h$, $h^*$, $A$ and $A^*$ satisfy
\[
g(A_\zeta U_1, U_2) = g(h^*(U_1, U_2), \zeta), \qquad g(A^*_\zeta U_1, U_2) = g(h(U_1, U_2), \zeta).
\]
The Levi–Civita connections $\nabla^g$ and $\tilde\nabla^g$ are related through the second fundamental form $h^g$ by
\[
\tilde\nabla^g_{U_1} U_2 = \nabla^g_{U_1} U_2 + h^g(U_1, U_2), \qquad \tilde\nabla^g_{U_1} \zeta = -A^g_\zeta U_1 + D^g_{U_1} \zeta,
\]
where $g(A^g_\zeta U_1, U_2) = g(h^g(U_1, U_2), \zeta)$.
On a statistical submanifold $(N, \nabla, g)$ of a statistical manifold $(\tilde N, \tilde\nabla, g)$, for any tangent vector fields $U_1, U_2 \in \Gamma(TN)$, we consider the difference tensor $K$ on $N$ given by
\[
2K_{U_1} U_2 = \nabla_{U_1} U_2 - \nabla^*_{U_1} U_2 .
\]
From (7), (53) and the above equation, it follows that
\[
2\tilde K_{U_1} U_2 = 2K_{U_1} U_2 + h(U_1, U_2) - h^*(U_1, U_2).
\]
More precisely, for the tangential and normal parts, we have
\[
(\tilde K_{U_1} U_2)^\top = K_{U_1} U_2, \qquad (\tilde K_{U_1} U_2)^\perp = \tfrac{1}{2}\big(h(U_1, U_2) - h^*(U_1, U_2)\big),
\]
respectively. Similarly, for $U_1 \in \Gamma(TN)$ and $\zeta \in \Gamma(T^\perp N)$, we have
\[
\tilde K_{U_1} \zeta = (\tilde K_{U_1} \zeta)^\top + (\tilde K_{U_1} \zeta)^\perp,
\]
where
\[
(\tilde K_{U_1} \zeta)^\top = \tfrac{1}{2}\big(A^*_\zeta U_1 - A_\zeta U_1\big), \qquad (\tilde K_{U_1} \zeta)^\perp = \tfrac{1}{2}\big(D_{U_1} \zeta - D^*_{U_1} \zeta\big).
\]
Now, suppose that $(N, g)$ is a submanifold of a nearly Sasakian statistical manifold $(\tilde N, \tilde\nabla, g, \tilde F, \tilde v)$. Since the tensor field $\tilde h$ of type $(1,1)$ on $\tilde N$ is described by $\tilde\nabla^g \tilde v = -\tilde F + \tilde h$, we can set $\tilde h U_1 = hU_1 + \bar h U_1$ and $\tilde h \zeta = h\zeta + \bar h \zeta$, where $hU_1, h\zeta \in \Gamma(TN)$ and $\bar h U_1, \bar h \zeta \in \Gamma(T^\perp N)$, for any $U_1 \in \Gamma(TN)$ and $\zeta \in \Gamma(T^\perp N)$. Furthermore, if $\tilde h(T_pN) \subseteq T_pN$, then $N$ is called $\tilde h$-invariant, and if $\tilde h(T_pN) \subseteq T_p^\perp N$, it is called $\tilde h$-anti-invariant.
Proposition 5.
Let $N$ be a submanifold of a nearly Sasakian statistical manifold $(\tilde N, \tilde\nabla, g, \tilde F, \tilde v, \tilde u)$, where the vector field $\tilde v$ is normal to $N$. Then,
\[
g(\tilde F U_1, U_2) = -g(U_1, \tilde h U_2), \qquad U_1, U_2 \in \Gamma(TN).
\]
Moreover,
(i) $N$ is $\tilde h$-anti-invariant if and only if $N$ is $\tilde F$-anti-invariant.
(ii) If $\tilde h = 0$, then $N$ is $\tilde F$-anti-invariant.
(iii) If $N$ is $\tilde h$-invariant and $\tilde F$-invariant, then $hU_1 = FU_1$ for any $U_1 \in \Gamma(TN)$.
Proof. 
Using (22) and Proposition 1, for any $U_1, U_2 \in \Gamma(TN)$ we can write
\[
g(-\tilde F U_1 + \tilde h U_1, U_2) = g(\tilde\nabla^g_{U_1} \tilde v, U_2) = g(\tilde\nabla_{U_1} \tilde v, U_2).
\]
(54) and the above equation imply
\[
g(-\tilde F U_1 + \tilde h U_1, U_2) = g(-A_{\tilde v} U_1 + D_{U_1} \tilde v, U_2) = -g(A_{\tilde v} U_1, U_2) = -g(\tilde v, h^*(U_1, U_2)).
\]
As $h^*$ is symmetric while the operators $\tilde h$ and $\tilde F$ are skew-symmetric, the above equation yields
\[
g(-\tilde F U_1 + \tilde h U_1, U_2) = g(-\tilde F U_2 + \tilde h U_2, U_1) = -g(-\tilde F U_1 + \tilde h U_1, U_2).
\]
Hence, $g(\tilde F U_1, U_2) = g(\tilde h U_1, U_2) = -g(U_1, \tilde h U_2)$, which gives (58). If $N$ is an $\tilde h$-anti-invariant submanifold, we have $g(U_1, \tilde h U_2) = 0$; thus, (i) follows from (58). Similarly, we have (ii) and (iii). □
Lemma 5.
Let $(N, \nabla, g)$ be an $\tilde F$-anti-invariant statistical submanifold of a nearly Sasakian statistical manifold $(\tilde N, \tilde\nabla, g, \tilde F, \tilde v, \tilde u)$ such that the structure $(F, v, u)$ on $N$ is given by Proposition 4.
(i) If v ˜ is tangent to N, then
U 1 v = u ( U 1 ) K v v = U 1 * v , h ( U 1 , v ) = F ¯ U 1 + h ¯ U 1 = h * ( U 1 , v ) , U 1 Γ ( T N ) .
(ii) If v ˜ is normal to N, then
A v ˜ = 0 = A v ˜ * , D U 1 v ˜ = F ¯ U 1 + h ¯ U 1 = D U 1 * v ˜ , U 1 Γ ( T N ) .
Proof. 
Applying (22), (53) and Proposition 1, and using K ˜ v v = K v v = g ( v v , v ) v , we have
F ¯ U 1 + h ¯ U 1 + u ( U 1 ) K v v = ˜ U 1 g v + u ( U 1 ) K v v = ˜ U 1 v = U 1 v + h ( U 1 , v ) .
Thus, the normal part is h ( U 1 , v ) = F ¯ U 1 + h ¯ U 1 and the tangential part is U 1 v = u ( U 1 ) K v v . Similarly, we can obtain their dual parts. Hence, (i) holds. If v ˜ is normal to N, from (22) and (54), it follows that
F ¯ U 1 + h ¯ U 1 = ˜ U 1 g v ˜ = ˜ U 1 v ˜ = A v ˜ U 1 + D U 1 v ˜ .
Considering the normal and tangential components of the last equation, we obtain (ii). Since ˜ U 1 v = ˜ U 1 g v = ˜ U 1 * v , we have the dual part of the assertion. □
Lemma 6.
Let $(N, \nabla, g)$ be an $\tilde F$-invariant and $\tilde h$-invariant statistical submanifold of a nearly Sasakian statistical manifold $(\tilde N, \tilde\nabla, g, \tilde F, \tilde v, \tilde u)$. Then, for any $U_1 \in \Gamma(TN)$, if
(i) $\tilde v$ is tangent to $N$, then
\[
\nabla_{U_1} v = -FU_1 + hU_1 + u(U_1)K_v v, \qquad \nabla^*_{U_1} v = -FU_1 + hU_1 - u(U_1)K_v v, \qquad h(U_1, v) = 0 = h^*(U_1, v).
\]
(ii) $\tilde v$ is normal to $N$, then
\[
A_{\tilde v} U_1 = FU_1 - hU_1 = A^*_{\tilde v} U_1, \qquad D_{U_1} \tilde v = 0 = D^*_{U_1} \tilde v .
\]
Proof. 
The relations are proved using the method applied to the proof of Lemma 5. □
Theorem 4.
On a nearly Sasakian statistical manifold $(\tilde N, \tilde\nabla, g, \tilde F, \tilde v, \tilde u)$, if $N$ is an $\tilde F$-anti-invariant, $\tilde\nabla$-totally umbilical statistical submanifold of $\tilde N$ and $\tilde v$ is tangent to $N$, then $N$ is $\tilde\nabla$-minimal in $\tilde N$.
Proof. 
According to Lemma 5, $h(v, v) = 0$. As $N$ is a $\tilde\nabla$-totally umbilical submanifold, it follows that
\[
0 = h(v, v) = g(v, v)H = H,
\]
which gives us the assertion. □
Theorem 5.
Let $N$ be an $\tilde F$-invariant submanifold of a nearly Sasakian statistical manifold $(\tilde N, \tilde\nabla, g, \tilde F, \tilde v, \tilde u)$, where the vector field $\tilde v$ is tangent to $N$. If
h g ( U 1 , F U 2 ) = F ˜ h g ( U 1 , U 2 ) ,
h ( U 1 , F U 2 ) h * ( U 1 , F U 2 ) = F ˜ h * ( U 1 , U 2 ) F ˜ h ( U 1 , U 2 ) ,
for all $U_1, U_2 \in \Gamma(TN)$, then $(\nabla, g, F, v, u)$ forms a nearly Sasakian statistical structure on $N$.
Proof. 
According to Proposition 4, $N$ inherits the almost contact metric structure $(g, F, v, u)$. Furthermore, (53) shows that $(\nabla, g)$ is a statistical structure on $N$. By applying (55), we can write
\[
\tilde\nabla^g_{U_1} \tilde F U_2 = \nabla^g_{U_1} FU_2 + h^g(U_1, FU_2) = (\nabla^g_{U_1} F)U_2 + F\nabla^g_{U_1} U_2 + h^g(U_1, FU_2).
\]
As $h^g$ is symmetric, (59) gives $h^g(FU_1, U_2) = h^g(U_1, FU_2)$. Hence, the above equation implies
\[
\tilde\nabla^g_{U_1} \tilde F U_2 + \tilde\nabla^g_{U_2} \tilde F U_1 = (\nabla^g_{U_1} F)U_2 + (\nabla^g_{U_2} F)U_1 + F\nabla^g_{U_1} U_2 + F\nabla^g_{U_2} U_1 + 2h^g(U_1, FU_2).
\]
On the other hand, since $\tilde N$ has a nearly Sasakian structure, we have
\begin{align*}
\tilde\nabla^g_{U_1} \tilde F U_2 + \tilde\nabla^g_{U_2} \tilde F U_1
&= (\tilde\nabla^g_{U_1} \tilde F)U_2 + (\tilde\nabla^g_{U_2} \tilde F)U_1 + \tilde F(\tilde\nabla^g_{U_1} U_2 + \tilde\nabla^g_{U_2} U_1) \\
&= (\tilde\nabla^g_{U_1} \tilde F)U_2 + (\tilde\nabla^g_{U_2} \tilde F)U_1 + \tilde F\big(\nabla^g_{U_1} U_2 + \nabla^g_{U_2} U_1 + 2h^g(U_1, U_2)\big) \\
&= 2g(U_1, U_2)v - u(U_1)U_2 - u(U_2)U_1 + F\nabla^g_{U_1} U_2 + F\nabla^g_{U_2} U_1 + 2\tilde F h^g(U_1, U_2).
\end{align*}
(59) and the above two equations yield
\[
(\nabla^g_{U_1} F)U_2 + (\nabla^g_{U_2} F)U_1 = 2g(U_1, U_2)v - u(U_1)U_2 - u(U_2)U_1 .
\]
Thus, $(N, g, F, v, u)$ is a nearly Sasakian manifold. For the nearly Sasakian statistical manifold $\tilde N$, using (27), we have
\[
\tilde K_{U_1} FU_2 + \tilde K_{U_2} FU_1 = -2\tilde F \tilde K_{U_1} U_2,
\]
for any U 1 , U 2 Γ ( T N ) . Applying (57) in the last equation, it follows
\[
K_{U_1} FU_2 + \tfrac{1}{2}\big(h(U_1, FU_2) - h^*(U_1, FU_2)\big) + K_{U_2} FU_1 + \tfrac{1}{2}\big(h(U_2, FU_1) - h^*(U_2, FU_1)\big) = -2FK_{U_1} U_2 + \tilde F h^*(U_1, U_2) - \tilde F h(U_1, U_2).
\]
From the above equation and (60), we obtain
\[
K_{U_1} FU_2 + K_{U_2} FU_1 = -2FK_{U_1} U_2 .
\]
Therefore, $(N, \nabla, g, F, v, u)$ is a nearly Sasakian statistical manifold. Hence, the proof is complete. □
Proposition 6.
Let $N$ be an $\tilde F$-invariant and $\tilde h$-invariant statistical submanifold of a nearly Sasakian statistical manifold $(\tilde N, \tilde\nabla, g, \tilde F, \tilde v, \tilde u)$ such that $\tilde v$ is tangent to $N$. Then,
\[
(\tilde\nabla_{U_1} h)(U_2, v) = (\tilde\nabla^*_{U_1} h)(U_2, v) = (\tilde\nabla^g_{U_1} h)(U_2, v) = h(U_2, FU_1 - hU_1),
\]
and
\[
(\tilde\nabla_{U_1} h^*)(U_2, v) = (\tilde\nabla^*_{U_1} h^*)(U_2, v) = (\tilde\nabla^g_{U_1} h^*)(U_2, v) = h^*(U_2, FU_1 - hU_1),
\]
for any $U_1, U_2 \in \Gamma(TN)$.
Proof. 
We have
\[
(\tilde\nabla_{U_1} h)(U_2, v) = \tilde\nabla_{U_1} h(U_2, v) - h(\nabla_{U_1} U_2, v) - h(U_2, \nabla_{U_1} v),
\]
for any $U_1, U_2 \in \Gamma(TN)$. According to Proposition 1, part (i) of Lemma 6 and the above equation, we have
\[
(\tilde\nabla_{U_1} h)(U_2, v) = -h(U_2, \nabla_{U_1} v) = -h\big(U_2, -FU_1 + hU_1 + u(U_1)K_v v\big) = h(U_2, FU_1 - hU_1).
\]
Similarly, other parts are obtained. □
Corollary 6.
Let $N$ be an $\tilde F$-invariant and $\tilde h$-invariant statistical submanifold of a nearly Sasakian statistical manifold $(\tilde N, \tilde\nabla, g, \tilde F, \tilde v, \tilde u)$. If $\tilde v$ is tangent to $N$, then the following conditions are equivalent:
(i) h and h * are parallel with respect to the connection ˜ ;
(ii) N is dual-autoparallel.

Author Contributions

Writing—original draft, S.U., E.P. and L.N.; Writing—review and editing, R.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research work was funded by Institutional Fund Projects under grant no. (IFPIP: 1184-130-1443). The authors gratefully acknowledge the technical and financial support provided by the Ministry of Education and King Abdulaziz University, DSR, Jeddah, Saudi Arabia.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Sun, K.; Marchand-Maillet, S. An information geometry of statistical manifold learning. In Proceedings of the 31st International Conference on Machine Learning (ICML-14), Beijing, China, 21–26 June 2014; pp. 1–9.
2. Amari, S. Information geometry of the EM and em algorithms for neural networks. Neural Netw. 1995, 8, 1379–1408.
3. Belkin, M.; Niyogi, P.; Sindhwani, V. Manifold regularization: A geometric framework for learning from labeled and unlabeled examples. J. Mach. Learn. Res. 2006, 7, 2399–2434.
4. Caticha, A. Geometry from information geometry. arXiv 2015, arXiv:1512.09076v1.
5. Fisher, R.A. On the mathematical foundations of theoretical statistics. Philos. Trans. R. Soc. Lond. 1922, 222, 309–368.
6. Gray, A. Nearly Kähler manifolds. J. Differ. Geom. 1970, 4, 283–309.
7. Blair, D.E.; Showers, D.K.; Yano, K. Nearly Sasakian structures. Kodai Math. Semin. Rep. 1976, 27, 175–180.
8. Blair, D.E. Riemannian Geometry of Contact and Symplectic Manifolds; Birkhäuser: Basel, Switzerland, 2002.
9. Olszak, Z. Nearly Sasakian manifolds. Tensor 1979, 33, 277–286.
10. Yano, K.; Ishihara, S. Invariant submanifolds of almost contact manifolds. Kōdai Math. Semin. Rep. 1969, 21, 350–364.