Article

Generalizing Uncertainty Through Dynamic Development and Analysis of Residual Cumulative Generalized Fractional Extropy with Applications in Human Health

by Mohamed Said Mohamed 1 and Hanan H. Sakr 2,*
1 Department of Mathematics, Faculty of Education, Ain Shams University, Cairo 11341, Egypt
2 Department of Management Information Systems, College of Business Administration in Hawtat Bani Tamim, Prince Sattam Bin Abdulaziz University, Saudi Arabia
* Author to whom correspondence should be addressed.
Fractal Fract. 2025, 9(6), 388; https://doi.org/10.3390/fractalfract9060388
Submission received: 27 April 2025 / Revised: 4 June 2025 / Accepted: 11 June 2025 / Published: 17 June 2025

Abstract

The complementary dual of entropy has received significant attention in the literature. Due to the emergence of many generalizations and extensions of entropy, the need to generalize the complementary dual of uncertainty arose. This article develops the residual cumulative generalized fractional extropy as a generalization of the residual cumulative complementary dual of entropy. Many properties, including convergence, transformation, bounds, recurrence relations, and connections with other measures, are discussed. Moreover, the proposed measure’s order statistics and stochastic order are examined. Furthermore, the dynamic design of the measure, its properties, and its characterization are considered. Finally, nonparametric estimation via empirical residual cumulative generalized fractional extropy with an application to blood transfusion is performed.

1. Introduction

In contemporary probability theory, assessing distributional uncertainty has emerged as a prominent research area. Consider an absolutely continuous non-negative random variable Y governed by probability density function (pdf) g. To quantify informational uncertainty, Shannon’s foundational entropy measure [1] is formulated as
$\Phi(Y) = -\int_0^{\infty} g(y)\ln g(y)\,dy,$
where ln denotes the natural logarithm, with the convention $0\ln 0 := 0$. Subsequent developments introduced alternative uncertainty quantifiers through survival function transformations. Rao et al. [2] pioneered the residual cumulative entropy, defined for a random variable Y with cumulative distribution function (cdf) $G(y)$ and survival function $\bar G(y) = 1 - G(y)$ as follows:
$R\Phi(Y) = -\int_0^{\infty}\bar G(y)\ln\bar G(y)\,dy.$
The uncertainty measure based on the survival function has been widely generalized and extended in the literature. One of these measures is the residual cumulative Tsallis entropy, which has been presented through two models. The first was presented by Sati and Gupta [3] as follows
$RT_{n\eta}(Y) = \frac{1}{\eta-1}\left[1-\int_0^{\infty}\bar G^{\eta}(y)\,dy\right].$
The second was introduced by Rajesh and Sunoj [4] as follows:
$RT^*_{n\eta}(Y) = \frac{1}{\eta-1}\left[\int_0^{\infty}\bar G(y)\,dy-\int_0^{\infty}\bar G^{\eta}(y)\,dy\right] = \frac{1}{\eta-1}\left[\mu-\int_0^{\infty}\bar G^{\eta}(y)\,dy\right],$
where, in both measures, $1\neq\eta>0$ and $\mu = E(Y) = \int_0^{\infty}\bar G(y)\,dy$.
The evolution of fractional calculus has inspired generalized entropy formulations, which have found diverse applications across various scientific fields. The discrete form of the fractional version of Shannon entropy, or fractional entropy, was described by Ubriaco [5] as follows:
$F\Phi_{\nu^*}(Y^*) = \sum_{l=1}^{M}p_l^*\left[-\ln p_l^*\right]^{\nu^*},\quad 0\le\nu^*\le1,$
where $Y^*$ is regarded as a discrete random variable with probability vector $(p_1^*,\ldots,p_M^*)$ and support $S^*$ of cardinality M. This measure was shown to satisfy both the Lesche and the thermodynamic stability criteria. Xiong et al. [6] systematically investigated the characteristics of the fractional residual cumulative entropy, including the following:
  • Boundary constraints.
  • Stochastic dominance relationships.
  • Empirical estimation techniques.
  • Transformation invariance properties.
  • Functional interdependencies.
Their model specification appears as follows:
$RF\Phi_{\nu^*}(Y) = \int_0^{\infty}\bar G(y)\left[-\ln\bar G(y)\right]^{\nu^*}dy,\quad 0\le\nu^*\le1.$
Xiong et al. [6] applied the fractional residual cumulative entropy to financial data on Dow Jones Industrial Average price returns. They found that, in contrast to the fractional residual cumulative entropy, the traditional residual cumulative entropy (i.e., the case $\nu^*=1$) is unable to disclose as much information about the financial system. From this viewpoint, the fractional residual cumulative entropy is preferable to the traditional residual cumulative entropy: by suitably choosing the parameter $\nu^*$, it retrieves more of the intrinsic information held by the underlying system and can therefore better identify its dynamics.
Di Crescenzo et al. [7] subsequently generalized this framework through scaling normalization:
$RFG\Phi_{\nu}(Y) = \kappa(\nu)\int_0^{\infty}\bar G(y)\left[-\ln\bar G(y)\right]^{\nu}dy,\quad \nu\ge0,$
where $\kappa(\nu) = \frac{1}{\Gamma(\nu+1)}$, $\nu\ge0$. Notably, Psarrakos and Navarro [8] established connections to integer-order cases where $\nu = n\in\mathbb{N}$, with extensions discussed in [9]. Concurrently, Mohamed and Almuqrin [10] developed a density-based fractional entropy known as the fractional generalized entropy:
$FG\Phi_{\nu}(Y) = \kappa(\nu)\int_0^{\infty}g(y)\left[-\ln g(y)\right]^{\nu}dy.$
Addressing axiomatic completeness in information theory, Lad et al. [11] identified extropy as Shannon entropy’s dual counterpart. For non-negative Y, this complementary measure is defined by
$x\Phi(Y) = -\frac{1}{2}\int_0^{\infty}g^{2}(y)\,dy.$
The authors demonstrated extropy’s operational characteristics, including extremal distributions and statistical applications. This formulation arises as a second-order approximation of the complete dual functional:
$x\Phi(Y) = -\int_0^{\infty}\left[1-g(y)\right]\ln\left[1-g(y)\right]dy.$
Recent extensions by Jahanshahi et al. [12] introduced survival-based residual extropy:
$Rx\Phi(Y) = -\frac{1}{2}\int_0^{\infty}\bar G^{2}(y)\,dy.$
The notion of dynamic residual extropy was introduced by Abdul Sathar and Nair [13]. The dynamic residual extropy of a random lifespan Y is simply the extropy of the random variable $[Y-v\mid Y\ge v]$, and it is defined as
$Rx\Phi(Y;v) = -\frac{1}{2}\,\frac{\int_v^{\infty}\bar G^{2}(y)\,dy}{\bar G^{2}(v)}.$
They explored a nonparametric approach to estimate the dynamic residual extropy using real data on metabolite levels within the metabolic network of Escherichia coli.
Another possibility is to examine the discrete form of Tsallis extropy, the discrete Tsallis extropy, which was thoroughly examined by Balakrishnan et al. [14] and applied to pattern recognition. Moreover, in the continuous case, the continuous Tsallis extropy was presented by Mohamed et al. [15], with an application to financial data from the pharmaceutical market. In statistical thermodynamics and quantum theory, the Shannon entropy and extropy measures are complementary uncertainty quantities used to examine the intricate structure of a physical or chemical system. In physics, chemistry, and materials research, these information measures are employed to examine the atomic organization of a particular system. For additional investigations of extropy and its extensions, one may also consult Raqab and Qiu [16], Yang et al. [17], Noughabi and Jarrahiferiz [18], Mohammadi et al. [19], Hashempour et al. [20], and Chakraborty and Pradhan [21].

Work Motivation

The extropy measure has been widely studied and applied in various fields due to its fundamental role in information theory as the complementary dual of entropy. Given the continuous development of entropy-based measures and their numerous extensions and generalizations, it is natural to explore a corresponding generalization of extropy that aligns with these advancements. In this work, we aim to introduce a generalized extropy measure that serves as the complementary dual of an extended entropy function. Our motivation stems from the need to examine whether this generalization preserves the key characteristics of the traditional extropy measure while offering broader applicability in diverse contexts. Specifically, we seek to assess its theoretical properties, interpretability, and potential applications in reliability analysis, uncertainty quantification, and probabilistic modeling. Furthermore, an important aspect of this study is to investigate whether the proposed measure provides deeper insights into information asymmetry and characterization, thereby enriching the existing literature on information measures. Through this research, we aim to answer the fundamental question: To what extent does the generalized extropy measure retain the desirable properties of its classical counterpart, and in what ways does it enhance our understanding of uncertainty and information representation?
The goal of this paper is to introduce the concept of residual cumulative generalized fractional extropy as an extension of the residual cumulative complementary dual measure of uncertainty. The rest of this article is structured as follows: Section 2 introduces the concept of residual cumulative generalized fractional extropy and examines some of its distinctive features. Section 3 discusses order statistics and the stochastic ordering of residual cumulative generalized fractional extropy. Section 4 establishes dynamic residual cumulative generalized fractional extropy by deriving some bounds for it based on the mean residual life, hazard function, and characterization results. Finally, in Section 5, we provide a nonparametric estimator and study its consistency and asymptotic properties. The proposed method is also illustrated using simulated and real data sets.

2. Approximation of Residual Cumulative Generalized Fractional Extropy

In this section, we will discuss the approximation of the residual cumulative generalized fractional extropy. Building upon the concept of fractional generalized entropy introduced in Equation (7), we define the fractional generalized extropy as follows:
$FGx\Phi_{\nu}(Y) = \kappa(\nu)\int_0^{\infty}\left(1-g(y)\right)\left[-\ln\left(1-g(y)\right)\right]^{\nu}dy,$
where $\nu\ge0$ and $FGx\Phi_{\nu}(Y)$ is assumed finite. In analogy with the approximation of the extropy measure in Lad et al. [11], for a non-negative continuous random variable Y and small $g(y)$, the Maclaurin series of $-\ln\left(1-g(y)\right)$ may be used to expand the logarithmic term. Letting
$\xi = -\ln\left(1-g(y)\right) = g(y) + \frac{g(y)^{2}}{2} + O\!\left(g(y)^{3}\right) = g(y)\left[1 + \frac{g(y)}{2} + O\!\left(g(y)^{2}\right)\right].$
Then, we have
$\xi^{\nu} = \left[-\ln\left(1-g(y)\right)\right]^{\nu} = g(y)^{\nu}\left[1+\frac{g(y)}{2}+O\!\left(g(y)^{2}\right)\right]^{\nu}.$
The binomial series for $(1+u)^{\nu}$, valid for small u, applies here since $g(y)$ is small:
$(1+u)^{\nu} = 1 + \nu u + \frac{\nu(\nu-1)}{2}u^{2} + O(u^{3}).$
Therefore, we can see the following:
$\left[1+\frac{g(y)}{2}+O\!\left(g(y)^{2}\right)\right]^{\nu} = 1 + \frac{\nu\, g(y)}{2} + O\!\left(g(y)^{2}\right).$
Thus, we have
$\xi^{\nu} = g(y)^{\nu}\left[1+\frac{\nu\, g(y)}{2}+O\!\left(g(y)^{2}\right)\right].$
Simplifying, we obtain
$\xi^{\nu} = g(y)^{\nu} + \frac{\nu\, g(y)^{\nu+1}}{2} + O\!\left(g(y)^{\nu+2}\right).$
Multiplying each term in the preceding equation by $(1-g(y))$:
$(1-g(y))\,\xi^{\nu} = (1-g(y))\left[g(y)^{\nu} + \frac{\nu\, g(y)^{\nu+1}}{2} + O\!\left(g(y)^{\nu+2}\right)\right].$
Distributing $(1-g(y))$, we get
$(1-g(y))\,\xi^{\nu} = g(y)^{\nu} - g(y)^{\nu+1} + \frac{\nu\, g(y)^{\nu+1}}{2} - \frac{\nu\, g(y)^{\nu+2}}{2} + O\!\left(g(y)^{\nu+2}\right) = g(y)^{\nu} + \left(\frac{\nu}{2}-1\right)g(y)^{\nu+1} + O\!\left(g(y)^{\nu+2}\right).$
The following approximation may therefore be made when $g(y)$ is small:
$(1-g(y))\left[-\ln\left(1-g(y)\right)\right]^{\nu} \approx g(y)^{\nu}\left[1 + \left(\frac{\nu}{2}-1\right)g(y)\right].$
As a result, the continuous generalized fractional extropy measure for $\nu\ge0$ can be approximated by
$FGx\Phi_{\nu}(Y) = \kappa(\nu)\int_0^{\infty}\left(1-g(y)\right)\left[-\ln\left(1-g(y)\right)\right]^{\nu}dy$
$\approx \kappa(\nu)\int_0^{\infty}g(y)^{\nu}\left[1+\left(\frac{\nu}{2}-1\right)g(y)\right]dy$
$\approx \kappa(\nu)\int_0^{\infty}g(y)^{\nu}\,dy.$
The next example examines the performance of the original generalized fractional extropy alongside its two proposed approximations.
Example 1.
Consider a random variable Y with an exponential distribution with rate parameter γ, written $Y\sim exp(\gamma)$. Its probability density function is $g(y) = \gamma e^{-\gamma y}$ for $y>0$ and $0<\gamma\le1$. For the specific case $\gamma=1$, the differential generalized fractional extropy can be computed using (12), while its approximations are obtained from (13) and (14) as follows:
$Approx1_{\nu}(Y) = \kappa(\nu)\,\frac{2+\nu^{2}}{2\nu(1+\nu)},$
$Approx2_{\nu}(Y) = \frac{\kappa(\nu)}{\nu}.$
A visual comparison in Figure 1 shows the original generalized fractional extropy alongside its approximations, A p p r o x 1 ν ( Y ) and A p p r o x 2 ν ( Y ) . The graph indicates that all three curves closely align across the entire range, confirming the reliability of the approximation techniques.
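This comparison can also be checked numerically. The following sketch (a Python illustration assuming NumPy and SciPy are available; the function names are our own) evaluates the exact integral in (12) and the two approximations (13) and (14) for the exp(1) case:

import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as gamma_fn

def kappa(nu):
    # kappa(nu) = 1 / Gamma(nu + 1)
    return 1.0 / gamma_fn(nu + 1.0)

def exact_fgx(nu, rate=1.0):
    # kappa(nu) * int_0^inf (1 - g(y)) * (-ln(1 - g(y)))**nu dy, with g(y) = rate * exp(-rate * y)
    def integrand(y):
        g = rate * np.exp(-rate * y)
        if g >= 1.0:
            return 0.0  # the integrand vanishes in the limit g -> 1 (at y = 0 when rate = 1)
        return (1.0 - g) * (-np.log(1.0 - g)) ** nu
    value, _ = quad(integrand, 0.0, np.inf, limit=200)
    return kappa(nu) * value

def approx1(nu):
    # Example 1: kappa(nu) * (2 + nu**2) / (2 * nu * (1 + nu)) for rate = 1
    return kappa(nu) * (2.0 + nu ** 2) / (2.0 * nu * (1.0 + nu))

def approx2(nu):
    # Example 1: kappa(nu) / nu for rate = 1
    return kappa(nu) / nu

for nu in (0.5, 1.0, 1.5, 2.0, 3.0):
    print(nu, exact_fgx(nu), approx1(nu), approx2(nu))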
The following definition of the residual cumulative generalized fractional extropy can be obtained by replacing $g(y)$ with $\bar G(y)$ in (14).
Definition 1.
Let Y be a non-negative continuous random variable with cdf $G(y)$. Then, the residual cumulative generalized fractional extropy is defined as
$RFGx\Phi_{\nu}(Y) = \kappa(\nu)\int_0^{\infty}\bar G^{\nu}(y)\,dy, \quad \nu\ge0.$
If $\nu=2$, then $RFGx\Phi_{2}(Y) = \frac{1}{2}\int_0^{\infty}\bar G^{2}(y)\,dy$, which is actually the positive value of the residual cumulative extropy given in (10) by Jahanshahi et al. [12].
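As a quick illustration of Definition 1 (a worked example consistent with the exponential and uniform values used later in Section 5), the measure has simple closed forms for the exponential and standard uniform distributions:
$RFGx\Phi_{\nu}(Y) = \kappa(\nu)\int_0^{\infty}e^{-\nu\gamma y}\,dy = \frac{\kappa(\nu)}{\nu\gamma}\ \ \text{for } Y\sim exp(\gamma), \qquad RFGx\Phi_{\nu}(Y) = \kappa(\nu)\int_0^{1}(1-y)^{\nu}\,dy = \frac{\kappa(\nu)}{\nu+1}\ \ \text{for } Y\sim U[0,1].$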
Remark 1.
The sign of the extropy function must be mentioned, since the original, unadjusted form in (9) yields a positive result. Let Y be a random variable with an $exp(\gamma)$ distribution. As the following basic example shows, the exact continuous dual measure is then
$x\Phi(Y) = -\int_0^{+\infty}\left(1-g(y)\right)\ln\left(1-g(y)\right)dy = -\int_0^{\infty}\left(1-\gamma e^{-\gamma y}\right)\ln\left(1-\gamma e^{-\gamma y}\right)dy.$
Then $x\Phi(Y) = 0.644934$ at $\gamma=1$. Meanwhile, the approximate continuous extropy provided in (8) by Lad et al. [11] is $x\Phi(Y) = -0.25$. Therefore, the dual entropy, or generalized fractional extropy, should take a positive value in a suitable approximation.
Table 1 lists the residual cumulative generalized fractional extropy for a few widely used distributions.
In the subsequent analysis, we will display the residual cumulative generalized fractional extropy in a few selected distributions to have a better understanding of its properties.
The graphs of $RFGx\Phi_{\nu}(Y)$ for the finite range distribution with $\beta=0.5$ and for the power distribution are shown in Figure 2. For the finite range distribution, $RFGx\Phi_{\nu}(Y)$ is decreasing, whereas for the power distribution it is increasing.
The graphs of $RFGx\Phi_{\nu}(Y)$ for the Rayleigh and Pareto distributions are shown in Figure 3. For the Rayleigh distribution, $RFGx\Phi_{\nu}(Y)$ is increasing, whereas for the Pareto distribution it is decreasing.
The following theorem gives the sufficient condition for the residual cumulative generalized fractional extropy to be finite.
Theorem 1.
Let Y be a non-negative random variable. If, for some $\varsigma > \frac{1}{\nu}$, $\nu>0$, it holds that $E(Y^{\varsigma}) < +\infty$, then $RFGx\Phi_{\nu}(Y) < +\infty$.
Proof. 
Equation (17) allows us to write
$\int_0^{\infty}\bar G^{\nu}(y)\,dy = \int_0^{1}\bar G^{\nu}(y)\,dy + \int_1^{\infty}\bar G^{\nu}(y)\,dy,$
and note that $\int_0^{1}\bar G^{\nu}(y)\,dy \le 1$, $\nu\ge0$, so that
$\int_0^{\infty}\bar G^{\nu}(y)\,dy \le 1 + \int_1^{\infty}\bar G^{\nu}(y)\,dy.$
From Markov's inequality, for $y\ge1$ we have $\bar G(y) \le \frac{E(Y^{\varsigma})}{y^{\varsigma}}$; thus,
$\int_1^{\infty}\bar G^{\nu}(y)\,dy \le E^{\nu}(Y^{\varsigma})\int_1^{\infty}\frac{dy}{y^{\nu\varsigma}}.$
The integral $\int_1^{+\infty}\frac{dy}{y^{\nu\varsigma}}$ is finite since $\nu\varsigma>1$. Therefore, the result follows. □
We address the impact of affine transformations on the residual cumulative generalized fractional extropy in the following assertion.
Proposition 1.
Let Z be a random variable that is not negative. Then, we have
1. 
If $Z = \theta_1 Y + \theta_2$, $\theta_1>0$, and $\theta_2\ge0$, then $RFGx\Phi_{\nu}(Z) = \theta_1\, RFGx\Phi_{\nu}(Y)$.
2. 
For all $\theta\ge1$ ($0<\theta\le1$), we can say that
$RFGx\Phi_{\nu}(\theta Y) = \theta\, RFGx\Phi_{\nu}(Y) \ge (\le)\; RFGx\Phi_{\nu}(Y).$
Proof. 
This is the outcome of (17) and of observing that $\bar G_{\theta_1 Y+\theta_2}(y) = \bar G_{Y}\!\left(\frac{y-\theta_2}{\theta_1}\right)$, $y\ge0$. □
Theorem 2
(Limit operator convergence). Assume that $\{\mathbf Y_m\}$ is a sequence of M-dimensional random vectors converging in distribution to a random vector $\mathbf Y$. If all the $\mathbf Y_m$ are bounded in $L^{\varsigma}$ for some $\varsigma > \frac{M}{\nu}$, $\nu>0$, then $\lim_{m\to+\infty} RFGx\Phi_{\nu}(\mathbf Y_m) = RFGx\Phi_{\nu}(\mathbf Y)$.
Proof. 
Convergence in distribution of $\mathbf Y_m$ to $\mathbf Y$ implies that, for all $\mathbf y\in\mathbb{R}_+^{M}$,
$\lim_{m\to+\infty}\bar G^{\nu}_{|\mathbf Y_m|}(\mathbf y) = \bar G^{\nu}_{|\mathbf Y|}(\mathbf y).$
Next, observe that the Hölder inequality and Equation (19) of Rao et al. [2] yield
$\bar G^{\nu}_{|\mathbf Y_m|}(\mathbf y) \le \prod_{j=1}^{M}\bar G^{\nu/M}_{|Y_{mj}|}(y_j) \le \prod_{j=1}^{M}\left[\mathbf 1_{[0,1]}(y_j) + \frac{1}{y_j^{\varsigma}}\,\mathbf 1_{[1,\infty)}(y_j)\,E|Y_{mj}|^{\varsigma}\right]^{\nu/M}.$
Consequently, $\bar G^{\nu}_{|\mathbf Y_m|}(\mathbf y)$ is bounded by an integrable function whenever $\frac{\nu\varsigma}{M}>1$. The proof is thus completed by the dominated convergence theorem. □
We will now concentrate on upper and lower bounds for the residual cumulative generalized fractional extropy. In contrast to the extropy (see Qiu et al. [22]), we show in the following theorem that $RFGx\Phi_{\nu}$, $\nu\ge1$, of the sum of two independent random variables admits an upper bound in terms of the measures of the individual variables.
Theorem 3.
Suppose that $Y_1$ and $Y_2$ are independent non-negative random variables with survival functions $\bar G_1$ and $\bar G_2$, respectively. Then, for $\nu\ge1$,
$RFGx\Phi_{\nu}(Y_1+Y_2) \le \min\left\{RFGx\Phi_{\nu}(Y_1) + \kappa(\nu)E(Y_2),\; RFGx\Phi_{\nu}(Y_2) + \kappa(\nu)E(Y_1)\right\}.$
Proof. 
Given that $Y_1$ and $Y_2$ are independent random variables, $\int_0^{\infty}\bar G_1(y-t)\,dG_2(t)$ represents the survival function of $Y_1+Y_2$. Because $\nu\ge1$, the function $\phi(v)=v^{\nu}$ is convex for $v\ge0$ (its second derivative is $\nu(\nu-1)v^{\nu-2}\ge0$). Therefore, for every fixed y, treating
$A(y) = \int_0^{\infty}\bar G_1(y-t)\,dG_2(t),$
as an expectation (since d G 2 ( t ) is a probability measure), Jensen’s inequality gives the following:
$A(y)^{\nu} \le \int_0^{\infty}\bar G_1^{\nu}(y-t)\,dG_2(t).$
Thus, when integrated with respect to y we obtain the following:
$RFGx\Phi_{\nu}(Y_1+Y_2) = \frac{1}{\Gamma(\nu+1)}\int_0^{\infty}\left[\int_0^{\infty}\bar G_1(y-t)\,dG_2(t)\right]^{\nu}dy \le \frac{1}{\Gamma(\nu+1)}\int_0^{\infty}\int_0^{\infty}\bar G_1^{\nu}(y-t)\,dG_2(t)\,dy.$
Next, we write the following:
$\frac{1}{\Gamma(\nu+1)}\int_0^{\infty}\int_0^{\infty}\bar G_1^{\nu}(y-t)\,dG_2(t)\,dy = \frac{1}{\Gamma(\nu+1)}\int_0^{\infty}dG_2(t)\int_0^{\infty}\bar G_1^{\nu}(y-t)\,dy.$
This interchange is justified by Fubini’s theorem since the integrand is non-negative. For fixed t, consider the inner integral
$I(t) = \int_0^{\infty}\bar G_1^{\nu}(y-t)\,dy.$
Make the substitution $u=y-t$ (so that $dy=du$). As y goes from 0 to $\infty$, the new variable u ranges from $-t$ to $\infty$:
$I(t) = \int_{-t}^{\infty}\bar G_1^{\nu}(u)\,du.$
Because $Y_1$ is non-negative, its survival function satisfies $\bar G_1(u)=1$ for $u<0$ and equals the usual survival probability for $u\ge0$.
Thus, splitting the integral,
$I(t) = \int_{-t}^{0}1\,du + \int_{0}^{\infty}\bar G_1^{\nu}(u)\,du = t + \int_{0}^{\infty}\bar G_1^{\nu}(u)\,du.$
Changing the dummy variable u back to y in the second term (since it is a dummy integration variable) yields the following:
$I(t) = t + \int_0^{\infty}\bar G_1^{\nu}(y)\,dy.$
Substituting (20) into (19), and then using (18), we obtain, for $\nu\ge1$,
$RFGx\Phi_{\nu}(Y_1+Y_2) \le \frac{E(Y_2)}{\Gamma(\nu+1)} + \frac{1}{\Gamma(\nu+1)}\int_0^{\infty}\bar G_1^{\nu}(y)\,dy = \kappa(\nu)E(Y_2) + RFGx\Phi_{\nu}(Y_1).$
In the same manner, we obtain $RFGx\Phi_{\nu}(Y_1+Y_2) \le \frac{E(Y_1)}{\Gamma(\nu+1)} + \frac{1}{\Gamma(\nu+1)}\int_0^{\infty}\bar G_2^{\nu}(y)\,dy$, $\nu\ge1$, which proves the theorem. □
Remark 2.
By noting that $\bar G^{\nu}(y) \le (\ge)\;\bar G(y)$ for $\nu\ge1$ ($0\le\nu\le1$), we can say that
$RFGx\Phi_{\nu}(Y) \le (\ge)\;\kappa(\nu)\,E(Y).$
The following theorem establishes a recurrence relation for the residual cumulative generalized fractional extropy, which relies on the distorted mean residual life.
Theorem 4.
Let Y be a non-negative absolutely continuous random variable with residual cumulative generalized fractional extropy $RFGx\Phi_{\nu}(Y)$. Then, the following relation holds:
$RFGx\Phi_{\nu}(Y) = \frac{1}{\nu}\,RFGx\Phi_{\nu-1}(Y) - \kappa(\nu)\,E\!\left[\bar G^{\nu-1}(Y)\,m_{G,\nu-1}(Y)\right],$
where $\nu>0$, and the distorted mean residual life is defined as
$m_{G,\nu}(v) = \frac{\int_v^{\infty}\bar G^{\nu}(y)\,dy}{\bar G^{\nu}(v)}.$
Proof. 
Starting from (17) and applying Fubini’s theorem, we derive
$RFGx\Phi_{\nu}(Y) = \kappa(\nu)\int_0^{\infty}\bar G^{\nu}(y)\,dy = \kappa(\nu)\int_0^{\infty}\bar G^{\nu-1}(y)\int_y^{\infty}g(v)\,dv\,dy = \kappa(\nu)\int_0^{\infty}g(v)\int_0^{v}\bar G^{\nu-1}(y)\,dy\,dv.$
Additionally, we observe that
$\int_0^{v}\bar G^{\nu-1}(y)\,dy = \int_0^{\infty}\bar G^{\nu-1}(y)\,dy - \int_v^{\infty}\bar G^{\nu-1}(y)\,dy.$
By substituting (24) into (23), and noting that $\kappa(\nu)\int_0^{\infty}\bar G^{\nu-1}(y)\,dy = \frac{\kappa(\nu)}{\kappa(\nu-1)}\,RFGx\Phi_{\nu-1}(Y) = \frac{1}{\nu}\,RFGx\Phi_{\nu-1}(Y)$, we arrive at (21). □
Remark 3.
The distorted stop-loss transform $DT_{G,\nu}(v)$ for a random variable Y is given by
$DT_{G,\nu}(v) = \int_v^{\infty}\bar G^{\nu}(y)\,dy.$
Then, the residual cumulative generalized fractional extropy can be expressed as
$RFGx\Phi_{\nu}(Y) = \frac{1}{\nu}\,RFGx\Phi_{\nu-1}(Y) - \kappa(\nu)\,E\!\left[DT_{G,\nu-1}(Y)\right].$
Next, we will discuss the connection between the residual cumulative generalized fractional extropy and the two models of the residual cumulative Tsallis entropy presented in (3) and (4).
Remark 4.
From (3) and (4), with $1\neq\eta>0$, we can express the residual cumulative generalized fractional extropy as follows:
$RFGx\Phi_{\eta}(Y) = \kappa(\eta)\left[1-(\eta-1)\,RT_{n\eta}(Y)\right],$
$RFGx\Phi_{\eta}(Y) = \kappa(\eta)\left[\mu-(\eta-1)\,RT^*_{n\eta}(Y)\right] = \kappa(\eta)\left[\mu-(\eta-1)\,E\!\left(m_G(Y)\,\bar G^{\eta-1}(Y)\right)\right],$
where
$m_G(v) = E(Y-v\mid Y>v) = \frac{\int_v^{\infty}\bar G(y)\,dy}{\bar G(v)}$
is the mean residual life function.

3. Features on Order Statistics and Stochastic Order

Let $Y_1, Y_2, \ldots, Y_s$ be s independent and identically distributed non-negative random variables with survival function $\bar G$. If $Y_{(j)}$ denotes the $j$th order statistic of this sample of size s, then $Y_{(1)} = \min\{Y_1, Y_2, \ldots, Y_s\}$, with survival function $\bar G_{(1)}$, describes the lifetime of a series system, and $Y_{(s)} = \max\{Y_1, Y_2, \ldots, Y_s\}$, with survival function $\bar G_{(s)}$, describes the lifetime of a parallel system. The following proposition provides bounds for the residual cumulative generalized fractional extropy of parallel systems, utilizing the mean lifetime of their components.
Proposition 2.
For $\nu\ge1$,
$RFGx\Phi_{\nu}(Y_{(s)}) \le s^{\nu}\,RFGx\Phi_{\nu}(Y) \le s^{\nu}\kappa(\nu)\,E(Y).$
Proof. 
From (17) and Remark 2, with $\nu\ge1$, we can utilize Bernoulli's inequality to show that
$RFGx\Phi_{\nu}(Y_{(s)}) = \kappa(\nu)\int_0^{\infty}\left[1-\left(1-\bar G(y)\right)^{s}\right]^{\nu}dy \le \kappa(\nu)\int_0^{\infty}\left[1-\left(1-s\bar G(y)\right)\right]^{\nu}dy = s^{\nu}\,RFGx\Phi_{\nu}(Y) \le s^{\nu}\kappa(\nu)\,E(Y).$
Example 2.
Let Y follow the standard uniform distribution on the interval $[0,1]$, so that $E(Y)=\frac{1}{2}$. Then, we obtain the following:
1. 
$RFGx\Phi_{\nu}(Y) = \frac{\kappa(\nu)}{1+\nu}.$
2. 
$RFGx\Phi_{\nu}(Y_{(s)}) = \frac{\Gamma\!\left(\frac{s+1}{s}\right)}{\Gamma\!\left(\frac{s+1}{s}+\nu\right)}.$
3. 
$D_s(\nu) = s^{\nu}\kappa(\nu)\,E(Y) = \frac{1}{2}\,s^{\nu}\kappa(\nu),$
where $\nu, s\ge1$. Noting that $1\le\frac{s+1}{s}\le2$ and $\Gamma\!\left(\frac{s+1}{s}\right)\le1$, the outcomes confirm the last proposition (see also Figure 4).
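The closed forms in Example 2 make the chain of inequalities in Proposition 2 easy to verify numerically; the following short sketch (standard-library Python, with ν = 1.1 as in Figure 4) checks it for a few sample sizes:

from math import gamma

nu = 1.1

def rfgx_uniform(nu):
    # RFGxPhi_nu(Y) = kappa(nu) / (1 + nu) for the standard uniform distribution
    return 1.0 / (gamma(nu + 1.0) * (1.0 + nu))

def rfgx_parallel(nu, s):
    # RFGxPhi_nu(Y_(s)) = Gamma((s+1)/s) / Gamma((s+1)/s + nu)
    return gamma((s + 1.0) / s) / gamma((s + 1.0) / s + nu)

def d_s(nu, s):
    # D_s(nu) = s**nu * kappa(nu) * E(Y), with E(Y) = 1/2
    return 0.5 * (s ** nu) / gamma(nu + 1.0)

for s in (1, 2, 5, 10, 20):
    left = rfgx_parallel(nu, s)
    middle = (s ** nu) * rfgx_uniform(nu)
    right = d_s(nu, s)
    print(s, left, middle, right, left <= middle <= right)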
We now present findings related to residual cumulative generalized fractional extropy ordering for random variables. The subsequent definitions are essential, where Y 1 and Y 2 represent random variables with cdfs G 1 and G 2 , pdfs g 1 and g 2 , and survival functions G ¯ 1 and G ¯ 2 , respectively.
Definition 2
(Shaked and Shanthikumar [23]). The random variable Y 1 is considered less than Y 2 under the following criteria:
1. 
Likelihood ratio order ($Y_1\le_{lro}Y_2$): occurs when $\frac{g_1(y)}{g_2(y)}$ monotonically decreases in y.
2. 
Hazard rate order ($Y_1\le_{hro}Y_2$): holds if $H_{\lambda G_1}(y) \ge H_{\lambda G_2}(y)$ for all y, where
$H_{\lambda G}(y) = \frac{g(y)}{\bar G(y)}$
defines the hazard rate function.
3. 
Usual stochastic order ($Y_1\le_{sto}Y_2$): valid when $\bar G_1(y) \le \bar G_2(y)$ for all y.
4. 
Dispersive order ($Y_1\le_{dispo}Y_2$): established if $G_2^{-1}(G_1(y)) - y$ is increasing in $y\ge0$.
5. 
Increasing concave/convex order ($Y_1\le_{icvo}Y_2$ or $\le_{icxo}$): satisfied when $E[D^*(Y_1)] \le E[D^*(Y_2)]$ for all increasing concave (convex) functions $D^*$ with finite expectations.
The hierarchical relationships among these stochastic orders are summarized below (see [23]):
$Y_1\le_{lro}Y_2 \;\Longrightarrow\; Y_1\le_{hro}Y_2 \;\Longrightarrow\; Y_1\le_{sto}Y_2 \;\Longrightarrow\; Y_1\le_{icxo}Y_2,$
$Y_1\le_{dispo}Y_2 \;\Longrightarrow\; Y_1\le_{sto}Y_2\;(\ge_{sto})\ \text{ if }\ l^*_{Y_1}=l^*_{Y_2}>-\infty\ \ (u^*_{Y_1}=u^*_{Y_2}<\infty),$
where $l^*_{Y_1}, u^*_{Y_1}, l^*_{Y_2}, u^*_{Y_2}$ denote the support limits of $Y_1$ and $Y_2$. The subsequent theorem links residual cumulative generalized fractional extropy to the increasing concave order.
Theorem 5.
For non-negative absolutely continuous random variables $Y_1$ and $Y_2$ with survival functions $\bar G_1$ and $\bar G_2$, respectively, if $Y_1\le_{icvo}Y_2$, then $RFGx\Phi_{\nu}(Y_1) \le RFGx\Phi_{\nu}(Y_2)$, $\nu\ge1$.
Proof. 
Since $\int_0^{v}\bar G(y)\,dy$ is an increasing concave function of v, the result follows directly from (24) and the definition of the increasing concave order. □
Remark 5.
The increasing concave order corresponds to second-order stochastic dominance in economics, which is relevant for evaluating returns rather than losses. When $Y_2\le_{icvo}Y_1$, risk-averse agents prefer $Y_1$ over $Y_2$ (Rothschild and Stiglitz [24]). Notably, $Y_1\le_{icvo}Y_2$ is equivalent to $-Y_2\le_{icxo}-Y_1$. The increasing convex order, termed the stop-loss order in actuarial contexts, holds if and only if $DT_{G_1,1}(v) \le DT_{G_2,1}(v)$ for all v, i.e., when $\nu=1$.
The next theorem establishes a connection between residual cumulative generalized fractional extropy and the usual stochastic order.
Theorem 6.
For non-negative absolutely continuous $Y_1$ and $Y_2$ with survival functions $\bar G_1$ and $\bar G_2$, respectively, $Y_1\le_{sto}Y_2$ implies $RFGx\Phi_{\nu}(Y_1) \le RFGx\Phi_{\nu}(Y_2)$, $\nu\ge1$.
Example 3.
Consider $Y_1$ and $Y_2$ with cdfs $G_1(y) = \frac{y}{\beta_1}$ ($0\le y\le\beta_1$) and $G_2(y) = \frac{y}{\beta_2}$ ($0\le y\le\beta_2$), respectively. For $\beta_2\ge\beta_1$, $Y_1\le_{sto}Y_2$ holds. Applying Table 1, we compute
$RFGx\Phi_{\nu}(Y_1) = \frac{\beta_1\,\kappa(\nu)}{\nu+1} \le \frac{\beta_2\,\kappa(\nu)}{\nu+1} = RFGx\Phi_{\nu}(Y_2).$
Remark 6.
$RFGx\Phi_{\nu}(Y)$, $\nu\ge1$, may be readily calculated in closed form for certain distribution families, such as the Pareto and the exponential, allowing the ordering to be determined directly. For other distributions, the ordering may be determined through the dispersive, likelihood ratio, and usual stochastic orderings by using Theorem 6 (ordered parametric families). For instance, if Y has a gamma distribution with shape parameter $\alpha^*$, it is simple to demonstrate that $Y_{\alpha_1^*}\le_{lro}Y_{\alpha_2^*}$ for $\alpha_1^*<\alpha_2^*$. Consequently, $Y_{\alpha_1^*}\le_{sto}Y_{\alpha_2^*}$, and hence $RFGx\Phi_{\nu}(Y_{\alpha_1^*}) \le RFGx\Phi_{\nu}(Y_{\alpha_2^*})$. If Y has a Weibull distribution with shape parameter α, then $Y_{\alpha_1}\le_{dispo}Y_{\alpha_2}$ for $\alpha_1<\alpha_2$, and therefore $Y_{\alpha_1}\le_{sto}Y_{\alpha_2}$, so that $RFGx\Phi_{\nu}(Y_{\alpha_1}) \le RFGx\Phi_{\nu}(Y_{\alpha_2})$.
The following corollary of Theorem 6 can be applied to record values and order statistics. David and Nagaraja [25] and Arnold et al. [26] provide a thorough explanation of certain record values and order statistics.
Corollary 1.
1. 
Assume that $Y_{(k)}$ and $Z_{(k)}$ denote the $k$th order statistics from samples $Y_1, Y_2, \ldots, Y_s$ and $Z_1, Z_2, \ldots, Z_s$, respectively, both of size s. If $Y\le_{sto}Z$, then $Y_{(k)}\le_{sto}Z_{(k)}$, which implies $RFGx\Phi_{\nu}(Y_{(k)}) \le RFGx\Phi_{\nu}(Z_{(k)})$ for all $k=1,2,\ldots,s$, $\nu\ge1$.
2. 
Assume that $U_s$ and $V_s$ correspond to the $s$th record values of the sequences $\{Y_n, n\ge1\}$ and $\{Z_n, n\ge1\}$, respectively. When $Y\le_{sto}Z$, it follows that $U_s\le_{sto}V_s$, and consequently $RFGx\Phi_{\nu}(U_s) \le RFGx\Phi_{\nu}(V_s)$ holds for every $s\ge1$, $\nu\ge1$.
In the following theorem, we demonstrate that the residual cumulative generalized fractional extropy can be a superadditive functional.
Theorem 7.
Consider two independent, non-negative random variables $Y_1$ and $Y_2$ with right support endpoints $u^*_{Y_1} = u^*_{Y_2} < \infty$. If the densities of $Y_1$ and $Y_2$ are log-concave, then, for $\nu\ge1$, we get
(i) 
$RFGx\Phi_{\nu}(Y_1+Y_2) \ge \min\{RFGx\Phi_{\nu}(Y_1),\, RFGx\Phi_{\nu}(Y_2)\}.$
(ii) 
$RFGx\Phi_{\nu}(Y_1+Y_2) \ge RFGx\Phi_{\nu}(Y_1) + RFGx\Phi_{\nu}(Y_2).$
Proof. 
Let the density of $Y_1$ be log-concave. Theorem 3.B.7 of Shaked and Shanthikumar [23] states that $Y_1\le_{dispo}Y_1+Y_2$ for every random variable $Y_2$ independent of $Y_1$. We then have $Y_1\le_{sto}Y_1+Y_2$, considering $u^*_{Y_1}=u^*_{Y_2}<\infty$. Therefore, $RFGx\Phi_{\nu}(Y_1+Y_2) \ge RFGx\Phi_{\nu}(Y_1)$ is implied by Theorem 6. When $Y_2$ has a log-concave density, the same argument gives $RFGx\Phi_{\nu}(Y_1+Y_2) \ge RFGx\Phi_{\nu}(Y_2)$. This completes part (i) of the proof. Furthermore, part (ii) follows by noting that the residual cumulative generalized fractional extropy of a random variable is never negative. □

4. Dynamic Residual Cumulative Generalized Fractional Extropy

The length of a research period is a key variable of interest in many domains, including business, economics, survival analysis, and reliability. In many applications, the residual lifespan information is essential. The information measurements in these circumstances are dynamic since they depend on time. The dynamic version of the residual cumulative generalized fractional extropy is defined, and its key characteristics are examined in the following.
Definition 3.
Let Y be a non-negative continuous random variable with cdf $G(y)$. Then, in the same manner as in Definition 1, the dynamic residual cumulative generalized fractional extropy can be defined as
$RFGx\Phi_{\nu}(Y;v) = \kappa(\nu)\,\frac{\int_v^{\infty}\bar G^{\nu}(y)\,dy}{\bar G^{\nu}(v)}, \quad \nu\ge0.$
If $\nu=2$, then $RFGx\Phi_{2}(Y;v) = \frac{1}{2}\,\frac{\int_v^{\infty}\bar G^{2}(y)\,dy}{\bar G^{2}(v)}$, which is actually the positive value of the dynamic residual extropy given in (11) by Abdul Sathar and Nair [13]. In fact, the dynamic residual cumulative generalized fractional extropy of a random lifespan Y is the residual cumulative generalized fractional extropy of the random variable $[Y-v\mid Y\ge v]$. It is simple to demonstrate that, for any $v\ge0$, $RFGx\Phi_{\nu}(Y;v)$ has every property of $RFGx\Phi_{\nu}(Y)$. Clearly, $RFGx\Phi_{\nu}(Y;0) = RFGx\Phi_{\nu}(Y)$, and $RFGx\Phi_{\nu}(Y;v)$ is always positive.
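As a simple illustration (a worked case consistent with part 1 of Theorem 10 below), for $Y\sim exp(\gamma)$ the dynamic measure does not depend on v, reflecting the memoryless property:
$RFGx\Phi_{\nu}(Y;v) = \kappa(\nu)\,\frac{\int_v^{\infty}e^{-\nu\gamma y}\,dy}{e^{-\nu\gamma v}} = \frac{\kappa(\nu)}{\nu\gamma} = RFGx\Phi_{\nu}(Y), \quad v\ge0.$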
In the following, we will discuss some properties of the dynamic residual cumulative generalized fractional extropy. The following proposition is an extension of Proposition 1.
Proposition 3.
Let Z be a non-negative random variable. If $Z=\theta_1 Y+\theta_2$, $\theta_1>0$ and $\theta_2\ge0$, then $RFGx\Phi_{\nu}(Z;v) = \theta_1\,RFGx\Phi_{\nu}\!\left(Y;\frac{v-\theta_2}{\theta_1}\right)$, $v\ge\theta_2$.
Remark 7.
By noting that $\bar G^{\nu}(y) \le (\ge)\;\bar G(y)$ for $\nu\ge1$ ($0\le\nu\le1$), we can say that
$RFGx\Phi_{\nu}(Y;v) \le (\ge)\;\kappa(\nu)\,m_G(v),$
where $m_G(v)$ is defined in (29).
Definition 4.
If R F G x Φ ν ( Y ; v ) is an increasing (decreasing) function of v, then the distribution function G in dynamic residual cumulative generalized fractional extropy is said to be increasing (decreasing).
Theorem 8.
The distribution function G is increasing (decreasing) in dynamic residual cumulative generalized fractional extropy if and only if, for every $v\ge0$,
$RFGx\Phi_{\nu}(Y;v) \ge (\le)\;\frac{\kappa(\nu)}{\nu\,H_G(v)},$
where
$H_G(v) = \frac{g(v)}{\bar G(v)},$
is the hazard rate function.
Proof. 
From (30), we can write
$\frac{\bar G^{\nu}(v)\,RFGx\Phi_{\nu}(Y;v)}{\kappa(\nu)} = \int_v^{\infty}\bar G^{\nu}(y)\,dy.$
With respect to v, we can differentiate (32) and obtain
$\frac{1}{\kappa(\nu)}\left[\frac{d\,RFGx\Phi_{\nu}(Y;v)}{dv}\,\bar G^{\nu}(v) - \nu\,RFGx\Phi_{\nu}(Y;v)\,\bar G^{\nu-1}(v)\,g(v)\right] = -\bar G^{\nu}(v).$
Or, equivalently,
$\frac{d\,RFGx\Phi_{\nu}(Y;v)}{dv} = -\kappa(\nu) + \nu\,H_G(v)\,RFGx\Phi_{\nu}(Y;v),$
and the result follows. □
In the sense that it defines the underlying distribution in a unique way, the following theorem describes dynamic residual cumulative generalized fractional extropy.
Theorem 9.
Let $Y_1$ and $Y_2$ be two non-negative absolutely continuous random variables with survival functions $\bar G_1$ and $\bar G_2$ and hazard functions $H_{G_1}(v)$ and $H_{G_2}(v)$, respectively, and let $RFGx\Phi_{\nu}(Y_1;v)$ and $RFGx\Phi_{\nu}(Y_2;v)$ denote the corresponding dynamic residual cumulative generalized fractional extropies. If $RFGx\Phi_{\nu}(Y_1;v) = RFGx\Phi_{\nu}(Y_2;v)$ for every $v\ge0$, then $\bar G_1(v) = \bar G_2(v)$.
Proof. 
Using Equation (33) and differentiating each side of $RFGx\Phi_{\nu}(Y_1;v) = RFGx\Phi_{\nu}(Y_2;v)$ with respect to v, we get
$-\kappa(\nu) + \nu\,H_{G_1}(v)\,RFGx\Phi_{\nu}(Y_1;v) = -\kappa(\nu) + \nu\,H_{G_2}(v)\,RFGx\Phi_{\nu}(Y_2;v).$
This immediately implies that $H_{G_1}(v) = H_{G_2}(v)$, or, in other words, that $\bar G_1(v) = \bar G_2(v)$. □
Next, we will discuss the connection between the dynamic residual cumulative generalized fractional extropy and the two models of the dynamic residual cumulative Tsallis entropy.
Remark 8.
With $1\neq\eta>0$, we can express the dynamic residual cumulative generalized fractional extropy as follows:
$RFGx\Phi_{\eta}(Y;v) = \kappa(\eta)\left[1-(\eta-1)\,RT_{n\eta}(Y;v)\right],$
$RFGx\Phi_{\eta}(Y;v) = \kappa(\eta)\left[m_G(v)-(\eta-1)\,RT^*_{n\eta}(Y;v)\right],$
where
$RT_{n\eta}(Y;v) = \frac{1}{\eta-1}\left[1-\int_v^{\infty}\frac{\bar G^{\eta}(y)}{\bar G^{\eta}(v)}\,dy\right],$
$RT^*_{n\eta}(Y;v) = \frac{1}{\eta-1}\int_v^{\infty}\left[\frac{\bar G(y)}{\bar G(v)}-\frac{\bar G^{\eta}(y)}{\bar G^{\eta}(v)}\right]dy.$
The following theorem characterizes distributions through the relationship between the mean residual life $m_G(v)$ and the dynamic residual cumulative generalized fractional extropy.
Theorem 10.
Consider Y as a non-negative continuous random variable characterized by its survival function $\bar G(y)$, mean residual life $m_G(v)$, and dynamic residual cumulative generalized fractional extropy of order η, denoted $RFGx\Phi_{\eta}(Y;v)$. If the relationship
$RFGx\Phi_{\eta}(Y;v) = A^*\,\kappa(\eta)\,m_G(v), \quad \eta>0,\ \eta\neq1,$
holds, then the distribution of Y can be determined as follows:
1.
Y follows an exponential distribution if and only if $A^* = 1/\eta$;
2.
Y follows a Pareto distribution if and only if $A^* < 1/\eta$;
3.
Y follows a finite range distribution if and only if $A^* > 1/\eta$.
Proof. 
1.
When the random variable Y follows an $exp(\gamma)$ distribution, its pdf, survival function, and mean residual life are, respectively,
$g(v) = \gamma e^{-\gamma v},\ \gamma>0,\ v>0, \qquad \bar G(v) = e^{-\gamma v}, \qquad m_G(v) = \frac{1}{\gamma}.$
Substituting these into (30), we obtain the following:
$RFGx\Phi_{\eta}(Y;v) = A^*\,\kappa(\eta)\,m_G(v),$
where A * = 1 / η and m G ( v ) = 1 / γ .
2.
When the random variable Y follows a Pareto distribution, its pdf, survival function, and mean residual life are, respectively,
$g(v) = \left(1+\frac{v}{\alpha}\right)^{-\alpha-1},\ \alpha>1,\ v>0, \qquad \bar G(v) = \left(1+\frac{v}{\alpha}\right)^{-\alpha}, \qquad m_G(v) = \frac{\alpha+v}{\alpha-1}.$
Using (30), we derive the following:
$RFGx\Phi_{\eta}(Y;v) = A^*\,\kappa(\eta)\,m_G(v),$
where $A^* = \frac{\alpha-1}{\eta\alpha-1} < \frac{1}{\eta}$ if $\eta>1$, and $m_G(v) = \frac{\alpha+v}{\alpha-1}$.
3.
When the random variable Y follows a finite range distribution, its pdf, survival function, and mean residual life are, respectively,
$g(v) = \beta(1-v)^{\beta-1},\ \beta>0,\ 0<v<1, \qquad \bar G(v) = (1-v)^{\beta}, \qquad m_G(v) = \frac{1-v}{\beta+1}.$
Applying (30), we find the following:
$RFGx\Phi_{\eta}(Y;v) = A^*\,\kappa(\eta)\,m_G(v),$
where $A^* = \frac{\beta+1}{\eta\beta+1} > \frac{1}{\eta}$ if $\eta>1$, and $m_G(v) = \frac{1-v}{\beta+1}$.
For the opposite direction, assuming that (36) is valid, we use (30) to derive the following:
$A^*\,m_G(v) = \frac{\int_v^{\infty}\bar G^{\eta}(y)\,dy}{\bar G^{\eta}(v)}.$
Differentiating (43) with respect to v, we obtain the following:
$A^*\,m_G'(v) = -1 + \eta\,H_G(v)\,\frac{\int_v^{\infty}\bar G^{\eta}(y)\,dy}{\bar G^{\eta}(v)} = -1 + \eta\,H_G(v)\,A^*\,m_G(v),$
where H G ( v ) represents the hazard rate of Y. Utilizing the relationship between the hazard rate and the mean residual life,
$H_G(v)\,m_G(v) = 1 + m_G'(v).$
We arrive at the following:
$m_G'(v) = \frac{A^*\eta-1}{A^*(1-\eta)}.$
Integrating (44) with respect to v over the interval $(0,y)$, we find the following:
$m_G(y) = \frac{A^*\eta-1}{A^*(1-\eta)}\,y + m_G(0).$
Equation (45) indicates that the mean residual life function $m_G(y)$ of the continuous random variable Y is linear, which holds if and only if the underlying distribution is exponential ($A^* = 1/\eta$), Pareto ($A^* < 1/\eta$), or finite range ($A^* > 1/\eta$); see Hall and Wellner [27]. This concludes the proof. □
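For completeness, the computation behind case 2 of the preceding proof can be written out explicitly; with the Pareto survival function above and $\eta\alpha>1$,
$RFGx\Phi_{\eta}(Y;v) = \kappa(\eta)\,\frac{\int_v^{\infty}\left(1+\frac{y}{\alpha}\right)^{-\eta\alpha}dy}{\left(1+\frac{v}{\alpha}\right)^{-\eta\alpha}} = \kappa(\eta)\,\frac{\alpha}{\eta\alpha-1}\left(1+\frac{v}{\alpha}\right) = \frac{\alpha-1}{\eta\alpha-1}\,\kappa(\eta)\,\frac{\alpha+v}{\alpha-1} = A^*\,\kappa(\eta)\,m_G(v).$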

5. Nonparametric Estimation via Empirical Residual Cumulative Generalized Fractional Extropy

In this section, we estimate the residual cumulative generalized fractional extropy nonparametrically using the empirical cdf. Assume that $Y_1, Y_2, \ldots, Y_s$ are independent, non-negative, absolutely continuous, and identically distributed random variables with cdf $G(y)$. Following (17), the empirical residual cumulative generalized fractional extropy is defined as
$RFGx\Phi_{\nu}(\widehat G_s) = \kappa(\nu)\int_0^{\infty}\left(1-\widehat G_s(y)\right)^{\nu}dy,$
where $\nu\ge0$ and $\widehat G_s(y)$ is the empirical cdf of the sample. With the indicator function $\mathbf 1_A$ and the order statistics $Y_{(1)}\le Y_{(2)}\le\cdots\le Y_{(s)}$, the empirical cdf can be presented as
$\widehat G_s(y) = \sum_{l=1}^{s-1}\frac{l}{s}\,\mathbf 1_{[Y_{(l)},\,Y_{(l+1)})}(y) + \mathbf 1_{[Y_{(s)},\,\infty)}(y),\quad y\in\mathbb{R}.$
Then, Equation (47) may be constructed as
$RFGx\Phi_{\nu}(\widehat G_s) = \kappa(\nu)\sum_{l=1}^{s-1}\int_{Y_{(l)}}^{Y_{(l+1)}}\left(1-\widehat G_s(y)\right)^{\nu}dy = \kappa(\nu)\sum_{l=1}^{s-1}\left(Y_{(l+1)}-Y_{(l)}\right)\left(1-\frac{l}{s}\right)^{\nu} = \kappa(\nu)\sum_{l=1}^{s-1}\Lambda_l\left(1-\frac{l}{s}\right)^{\nu},$
where $\Lambda_l = Y_{(l+1)}-Y_{(l)}$, $l=1,2,\ldots,s-1$, are the sample spacings. Instead of directly summing over the order statistics, the estimator integrates over the empirical cdf, which reduces variability compared with discrete estimators. This smoothing helps mitigate the discontinuity problem seen in traditional empirical estimators.
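The estimator in (48) is straightforward to implement. The following sketch (a minimal NumPy version; the function name and the simulated sample are our own illustrative choices) computes it from a sample and compares it with the theoretical value $\kappa(\nu)/(\gamma\nu)$ for an exp(0.5) sample:

import numpy as np
from math import gamma

def empirical_rfgx(sample, nu):
    # Empirical residual cumulative generalized fractional extropy, Eq. (48):
    # kappa(nu) * sum_{l=1}^{s-1} Lambda_l * (1 - l/s)**nu, with Lambda_l = Y_(l+1) - Y_(l)
    y = np.sort(np.asarray(sample, dtype=float))
    s = y.size
    spacings = np.diff(y)
    weights = (1.0 - np.arange(1, s) / s) ** nu
    return np.sum(spacings * weights) / gamma(nu + 1.0)

rng = np.random.default_rng(1)
sample = rng.exponential(scale=2.0, size=100)   # exp(gamma) sample with rate gamma = 0.5
for nu in (0.5, 1.0, 2.0, 3.0):
    theoretical = 1.0 / (gamma(nu + 1.0) * 0.5 * nu)   # kappa(nu) / (gamma * nu)
    print(nu, empirical_rfgx(sample, nu), theoretical)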
Theorem 11.
The sample estimate $RFGx\Phi_{\nu}(\widehat G_s)$ is a consistent estimator of $RFGx\Phi_{\nu}(G)$.
Proof. 
By applying the Glivenko–Cantelli theorem, as referenced in Howard [28], it follows that
$\sup_y\left|\widehat G_s(y) - G(y)\right| \xrightarrow{a.s.} 0 \quad \text{as } s\to\infty,$
meaning that the convergence is almost sure. Consequently, it can be directly concluded that
$RFGx\Phi_{\nu}(\widehat G_s) \xrightarrow{a.s.} RFGx\Phi_{\nu}(G).$
This establishes the proof. □
Based on the computation of the empirical residual cumulative generalized fractional extropy shown in Equation (48), Figure 5 presents estimates of the residual cumulative generalized fractional extropy for random variables drawn from the standard uniform distribution on $[0,1]$ and from the $exp(\gamma)$ distribution with parameter $\gamma=\frac{1}{2}$. The theoretical values are calculated from Table 1. For the uniform distribution, the empirical residual cumulative generalized fractional extropy is almost identical to the true value for varying ν. For the exponential distribution, the estimate deviates from the theoretical value as ν becomes smaller. As the sample size s increases, the deviation decreases, as shown in Table 2 by the root mean squared error ($RM^*$) between the estimated and theoretical values, namely,
$RM^* = \sqrt{\frac{1}{k}\sum_{i=1}^{k}\left[RFGx\Phi_{\nu}^{(i)}(\widehat G_s) - RFGx\Phi_{\nu}(G)\right]^{2}},$
where $RFGx\Phi_{\nu}^{(i)}(\widehat G_s)$ denotes the estimate obtained from the $i$th replication.
As s increases, $RM^*$ declines, suggesting that as s grows to infinity, the departure of the empirical residual cumulative generalized fractional extropy from the true value reduces to zero. In other words, the residual cumulative generalized fractional extropy of the sample data asymptotically approaches the theoretical value.
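The RM* computation in (49) can likewise be reproduced with a short Monte-Carlo sketch (self-contained; the number of replications k, the sample sizes, and the seed are arbitrary illustrative choices):

import numpy as np
from math import gamma, sqrt

def empirical_rfgx(sample, nu):
    # Eq. (48) evaluated from the sorted sample spacings
    y = np.sort(np.asarray(sample, dtype=float))
    s = y.size
    return np.sum(np.diff(y) * (1.0 - np.arange(1, s) / s) ** nu) / gamma(nu + 1.0)

def rm_star(nu, s, k=1000, rate=0.5, seed=0):
    # Root mean squared error between k replicated estimates and the exp(rate) true value
    rng = np.random.default_rng(seed)
    true_value = 1.0 / (gamma(nu + 1.0) * rate * nu)
    estimates = [empirical_rfgx(rng.exponential(1.0 / rate, s), nu) for _ in range(k)]
    return sqrt(np.mean((np.array(estimates) - true_value) ** 2))

for s in (10, 30, 100):
    print(s, rm_star(nu=1.5, s=s))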

5.1. An Empirical Residual Cumulative Generalized Fractional Extropy Central Limit Theorem

We examine the empirical residual cumulative generalized fractional extropy for both uniformly and exponentially distributed random variables in the examples that follow.
Example 4.
Consider a random sample $Y_1, Y_2, \ldots, Y_s$ drawn from a uniform distribution on the interval $[0,1]$. The sample spacing $\Lambda_l$ follows a beta distribution $\mathrm{Beta}(1,s)$, which implies
$E(\Lambda_l) = \frac{1}{s+1} \quad\text{and}\quad \mathrm{Var}(\Lambda_l) = \frac{s}{(s+1)^{2}(s+2)},$
as shown in [29]. Using Equation (48), the mean and variance of the empirical residual cumulative generalized fractional extropy can be derived as follows:
$E\!\left[RFGx\Phi_{\nu}(\widehat G_s)\right] = \frac{\kappa(\nu)}{s+1}\sum_{l=1}^{s-1}\left(1-\frac{l}{s}\right)^{\nu},$
$\mathrm{Var}\!\left[RFGx\Phi_{\nu}(\widehat G_s)\right] = \frac{s\,\kappa^{2}(\nu)}{(s+1)^{2}(s+2)}\sum_{l=1}^{s-1}\left(1-\frac{l}{s}\right)^{2\nu}.$
From Table 1, it is evident that
$\lim_{s\to\infty}E\!\left[RFGx\Phi_{\nu}(\widehat G_s)\right] = \frac{\kappa(\nu)}{\nu+1}, \qquad \lim_{s\to\infty}\mathrm{Var}\!\left[RFGx\Phi_{\nu}(\widehat G_s)\right] = 0.$
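The limits in the preceding display follow from a Riemann-sum argument (a one-line justification added here for clarity): since $\frac{1}{s+1}\sum_{l=1}^{s-1}\left(1-\frac{l}{s}\right)^{\nu}$ is a Riemann sum,
$\lim_{s\to\infty}\frac{\kappa(\nu)}{s+1}\sum_{l=1}^{s-1}\left(1-\frac{l}{s}\right)^{\nu} = \kappa(\nu)\int_0^{1}(1-u)^{\nu}\,du = \frac{\kappa(\nu)}{\nu+1},$
while the variance expression is of order $1/s$ and therefore vanishes as $s\to\infty$.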
Example 5.
Let $Y_1, Y_2, \ldots, Y_s$ be a random sample drawn from an $exp(\gamma)$ distribution with parameter γ. Here, $\Lambda_l$ follows an $exp(\gamma(s-l))$ distribution with parameter $\gamma(s-l)$, as referenced in [29]. From Equation (48), we obtain the following:
$E\!\left[RFGx\Phi_{\nu}(\widehat G_s)\right] = \frac{\kappa(\nu)}{s\gamma}\sum_{l=1}^{s-1}\left(1-\frac{l}{s}\right)^{\nu-1},$
$\mathrm{Var}\!\left[RFGx\Phi_{\nu}(\widehat G_s)\right] = \frac{\kappa^{2}(\nu)}{s^{2}\gamma^{2}}\sum_{l=1}^{s-1}\left(1-\frac{l}{s}\right)^{2\nu-2}.$
Table 1 makes it evident that
$\lim_{s\to\infty}E\!\left[RFGx\Phi_{\nu}(\widehat G_s)\right] = \frac{\kappa(\nu)}{\gamma\nu}, \qquad \lim_{s\to\infty}\mathrm{Var}\!\left[RFGx\Phi_{\nu}(\widehat G_s)\right] = 0.$
The mean and variance of the empirical residual cumulative generalized fractional extropy for the cases in Examples 4 and 5 are tabulated in Table 3. The sample size s is set at 10, 20, 30, 50, 70, and 100, while the order ν is set at 0.5, 0.9, 1.0, 1.5, 2, 2.5, and 3. From (51), as $s\to\infty$ with ν fixed, the estimated means tend to $RFGx\Phi_{0.5}(\widehat G_s) = 0.752253$, $RFGx\Phi_{0.9}(\widehat G_s) = 0.547239$, $RFGx\Phi_{1.0}(\widehat G_s) = 0.5$, $RFGx\Phi_{1.5}(\widehat G_s) = 0.300901$, $RFGx\Phi_{2}(\widehat G_s) = 0.166667$, $RFGx\Phi_{2.5}(\widehat G_s) = 0.0859717$, and $RFGx\Phi_{3}(\widehat G_s) = 0.0416667$. From (54), with $\gamma=0.5$, the estimated means as $s\to\infty$ with ν fixed tend to $RFGx\Phi_{0.5}(\widehat G_s) = 4.51352$, $RFGx\Phi_{0.9}(\widehat G_s) = 2.31056$, $RFGx\Phi_{1.0}(\widehat G_s) = 2$, $RFGx\Phi_{1.5}(\widehat G_s) = 1.003$, $RFGx\Phi_{2}(\widehat G_s) = 0.5$, $RFGx\Phi_{2.5}(\widehat G_s) = 0.240721$, and $RFGx\Phi_{3}(\widehat G_s) = 0.111111$. As can be seen, once s grows sufficiently large, the variance decreases to zero and the mean of the empirical residual cumulative generalized fractional extropy approaches the true value.
For random samples from the exponential distribution, a central limit theorem for the empirical residual cumulative generalized fractional extropy may be obtained from Example 5.
Theorem 12.
Consider a random sample $Y_1, Y_2, \ldots, Y_s$ drawn from an $exp(\gamma)$ distribution with parameter γ. Then, as $s\to\infty$, the standardized form
$\frac{RFGx\Phi_{\nu}(\widehat G_s) - E\!\left[RFGx\Phi_{\nu}(\widehat G_s)\right]}{\sqrt{\mathrm{Var}\!\left[RFGx\Phi_{\nu}(\widehat G_s)\right]}}$
approaches a standard normal distribution in the limit.
Proof. 
Referring to Equation (48), the empirical residual cumulative generalized fractional extropy can be written as the sum of the independent random variables $Z_l = \kappa(\nu)\,\Lambda_l\left(1-\frac{l}{s}\right)^{\nu}$, $l=1,\ldots,s-1$, each exponentially distributed with expected value
$E[Z_l] = \frac{\kappa(\nu)}{s\gamma}\left(1-\frac{l}{s}\right)^{\nu-1}.$
For an exponentially distributed variable $Z_l$, it is known that [30]
$E\left|Z_l - E(Z_l)\right|^{3} = 2e^{-1}(6-e)\left[E(Z_l)\right]^{3}.$
Thus, we obtain
$\sum_{l=1}^{s-1}E\left|Z_l-E(Z_l)\right|^{2} = \frac{\kappa^{2}(\nu)}{s^{2}\gamma^{2}}\sum_{l=1}^{s-1}\left(1-\frac{l}{s}\right)^{2\nu-2} = \frac{\kappa(\nu)}{s\gamma}\left[\frac{\kappa(\nu)}{s\gamma}\sum_{l=1}^{s-1}\left(1-\frac{l}{s}\right)^{(2\nu-1)-1}\right] \approx \frac{\kappa(\nu)\,\kappa(2\nu-1)}{s\gamma^{2}\,(2\nu-1)},$
and similarly,
$\sum_{l=1}^{s-1}E\left|Z_l-E(Z_l)\right|^{3} = \frac{2(6-e)}{e}\,\frac{\kappa^{3}(\nu)}{s^{3}\gamma^{3}}\sum_{l=1}^{s-1}\left(1-\frac{l}{s}\right)^{3\nu-3} = \frac{2(6-e)}{e}\,\frac{\kappa^{2}(\nu)}{s^{2}\gamma^{2}}\left[\frac{\kappa(\nu)}{s\gamma}\sum_{l=1}^{s-1}\left(1-\frac{l}{s}\right)^{(3\nu-2)-1}\right] \approx \frac{2(6-e)\,\kappa^{2}(\nu)\,\kappa(3\nu-2)}{e\,s^{2}\gamma^{3}\,(3\nu-2)}$
for $\nu\ge0$ and sufficiently large s. Consequently, Lyapunov's condition for the central limit theorem holds (see, e.g., [31]), as follows:
$\frac{\left[\sum_{l=1}^{s-1}E\left|Z_l-E(Z_l)\right|^{3}\right]^{1/3}}{\left[\sum_{l=1}^{s-1}E\left|Z_l-E(Z_l)\right|^{2}\right]^{1/2}} \approx \frac{\left[2(6-e)\,\kappa(3\nu-2)\right]^{1/3}\,(2\nu-1)^{1/2}\,\kappa^{1/6}(\nu)}{e^{1/3}\,(3\nu-2)^{1/3}\,\kappa^{1/2}(2\nu-1)}\; s^{-1/6} \longrightarrow 0 \quad \text{as } s\to\infty.$
This concludes the proof. □

5.2. Analysis of Real Data

In this subsection, we apply the obtained estimator of the residual cumulative generalized fractional extropy to real data sets.
Example 6.
This dataset comes from the donor database of the blood transfusion service center in Hsin-Chu City, Taiwan. Roughly every three months, the center collects blood donations by sending a plasma transfusion support bus to a specific university in Hsin-Chu City. Several references have investigated it; for instance, see Yeh et al. [32]. The dataset is based on a random selection of 748 donors from the donation database and contains the following variables:
1. 
Recency: months since the last donation.
2. 
Monetary: total blood donated.
3. 
Frequency: total number of donations.
4. 
Time: months since the first donation.
5. 
A binary variable indicating whether the donor donated blood in March 2007 was included in these 748 donor records; one means the donor donated blood, while zero means they did not.
In our study, we use the residual cumulative generalized fractional extropy and its estimator for the variable “Time” to carry out the calculations. To fit these data to an $exp(\gamma)$ distribution, we use maximum likelihood estimation for the parameter γ, which returns $\hat\gamma = 0.02916975$. Moreover, Figure 6 shows the fit between the histogram of the data and the exponential curve (left panel) and between the theoretical and empirical cdfs (right panel), which illustrates how well the exponential distribution fits the data. Under the $exp(0.02916975)$ distribution, Figure 7 shows the theoretical and empirical residual cumulative generalized fractional extropy of the “Time” blood transfusion data for different values of ν.
Example 7.
Originally reported by Chhikara and Folks [33] and later cited by Balakrishnan et al. [34], this dataset shows the operational repair times (in hours) for an airborne telecommunications transmitter. The following values were noted:
0.2, 0.3, 0.5, 0.5, 0.5, 0.5, 0.6, 0.6, 0.7, 0.7, 0.7, 0.8, 0.8, 1.0, 1.0, 1.0, 1.0, 1.1, 1.3, 1.5, 1.5, 1.5, 1.5, 2.0, 2.0, 2.2, 2.5, 2.7, 3.0, 3.0, 3.3, 3.3, 4.0, 4.0, 4.5, 4.7, 5.0, 5.4, 5.4, 7.0, 7.5, 8.8, 9.0, 10.3, 22.0, 24.5.
These data were used by Jahanshahi et al. [12] to estimate the traditional extropy given by Equation (8). They fit the data to an $exp(\gamma)$ distribution with parameter estimate $\hat\gamma = 0.2773$ and conclude that, as the sample size increases, the empirical estimator becomes closer to the theoretical value. In our work, we use these data to estimate the empirical residual cumulative generalized fractional extropy under the $exp(0.2773)$ distribution. The theoretical and empirical residual cumulative generalized fractional extropy of these data for different values of ν are displayed in Figure 8.
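To make this concrete, the following sketch (Python, reusing the estimator form of Eq. (48); the rounding and the grid of ν values are arbitrary illustrative choices) evaluates the empirical measure on the listed repair times together with the theoretical value $\kappa(\nu)/(\gamma\nu)$ under the fitted exp(0.2773) model:

import numpy as np
from math import gamma

repair_times = [0.2, 0.3, 0.5, 0.5, 0.5, 0.5, 0.6, 0.6, 0.7, 0.7, 0.7, 0.8, 0.8,
                1.0, 1.0, 1.0, 1.0, 1.1, 1.3, 1.5, 1.5, 1.5, 1.5, 2.0, 2.0, 2.2,
                2.5, 2.7, 3.0, 3.0, 3.3, 3.3, 4.0, 4.0, 4.5, 4.7, 5.0, 5.4, 5.4,
                7.0, 7.5, 8.8, 9.0, 10.3, 22.0, 24.5]

def empirical_rfgx(sample, nu):
    # Empirical residual cumulative generalized fractional extropy, Eq. (48)
    y = np.sort(np.asarray(sample, dtype=float))
    s = y.size
    return np.sum(np.diff(y) * (1.0 - np.arange(1, s) / s) ** nu) / gamma(nu + 1.0)

rate = 0.2773   # exponential rate fitted in Example 7
for nu in (0.5, 1.0, 1.5, 2.0, 3.0):
    theoretical = 1.0 / (gamma(nu + 1.0) * rate * nu)
    print(nu, round(empirical_rfgx(repair_times, nu), 4), round(theoretical, 4))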
The proposed estimator of the residual cumulative generalized fractional extropy has demonstrated good performance in approximating the theoretical measure across two real datasets. As shown in Figure 7 and Figure 8, the estimator remains close to the theoretical values for all considered values of the fractional parameter ν . Additionally, increasing ν generally improves the agreement between the empirical and theoretical extropy, especially in the “Time” blood transfusion data (Figure 7), where the curves clearly converge. In the operational repair times data (Figure 8), although the convergence is less pronounced, the estimator still maintains a close fit to the theoretical values. These findings suggest that while higher values of ν may improve estimation accuracy, the extent of this effect can depend on the characteristics of the underlying data.
Example 8.
This example illustrates the performance of our proposed estimators in a practical case, previously described in Example 7. The parametric family considered here is the inverse Gaussian (InG) distribution. Chhikara and Folks [35] fitted the InG distribution and concluded that the fit is satisfactory based on the observed value of the Kolmogorov–Smirnov statistic (test statistic = 0.07245267). The same result was obtained by the independence-characterization test of Mudholkar et al. [36] (test statistic = 0.2026783) and by the Anderson–Darling test (test statistic = 0.2392647). Lastly, the widely accepted InG distribution was used by Lee et al. [37].
Assume that the random variable Y follows an inverse Gaussian distribution with parameters $r_1$ and $r_2$, denoted $InG(r_1, r_2)$. Define the transformation $Z = 1/\sqrt{Y}$. The corresponding pdf of Z is then given by
$g(z) = \frac{2}{\sqrt{2\pi}\,h_2}\exp\!\left[-\frac{\left(z-\frac{h_1}{z}\right)^{2}}{2h_2^{2}}\right], \quad z\ge0,$
where the parameters are $h_1 = 1/r_1$ and $h_2^{2} = 1/r_2$. Based on Equation (17), the residual cumulative generalized fractional extropy for Z can be computed as follows:
$RFGx\Phi_{\nu}(Z) = \kappa(\nu)\int_0^{\infty}\bar G^{\nu}(z)\,dz = \kappa(\nu)\,\frac{1}{\sqrt{\nu}}\left(\frac{2}{\sqrt{2\pi}\,h_2}\right)^{\nu-1}\int_0^{\infty}\frac{2}{\sqrt{2\pi}\,\frac{h_2}{\sqrt{\nu}}}\exp\!\left[-\frac{\left(z-\frac{h_1}{z}\right)^{2}}{2\left(\frac{h_2}{\sqrt{\nu}}\right)^{2}}\right]dz = \kappa(\nu)\,\frac{1}{\sqrt{\nu}}\left(\frac{2}{\sqrt{2\pi}\,h_2}\right)^{\nu-1}.$
Additionally, according to Mudholkar and Tian [38], the uniformly minimum variance unbiased estimator (UMVUE) of $h_2^{2}$ is given by the following:
$u^{2}_{UMVUE} = \frac{\sum_{i=1}^{n}z_i^{2}}{n-1} - \frac{n^{2}}{(n-1)\left(\sum_{i=1}^{n}\frac{1}{z_i^{2}}\right)}.$
As an illustration, for ν = 3 , the value of the residual cumulative generalized fractional extropy becomes
$RFGx\Phi_{\nu}(Z) = \kappa(\nu)\,\frac{1}{\sqrt{\nu}}\left(\frac{2}{\sqrt{2\pi}\,u_{UMVUE}}\right)^{\nu-1} = 0.00165021.$
Using the transformation $Z = 1/\sqrt{Y}$, the sample-based estimators of the residual cumulative generalized fractional extropy are denoted $RFGx\Phi_{\nu}(\widehat G_s)$, as defined in Equation (48). The plot in Figure 9 presents the absolute bias of these estimators. The observed trend indicates that increasing ν leads to smaller absolute biases.

6. Conclusions

Based on the Maclaurin expansion, we have presented the residual cumulative generalized fractional extropy as a generalization of the residual cumulative extropy. The survival function $\bar G$ serves as the foundation for this measure. Examples for some well-known distributions have been presented. Many properties have been discussed, such as a sufficient condition for finiteness, affine transformations, weak convergence, upper and lower bounds, a recurrence relation, and the connection of the residual cumulative generalized fractional extropy with the residual cumulative Tsallis entropy. Furthermore, order statistics and stochastic ordering have been considered for the proposed measure. Moreover, some bounds depending on the mean residual life and the hazard rate have been derived for the dynamic version of the residual cumulative generalized fractional extropy, and it has been shown that the dynamic version uniquely determines the distribution. Additionally, this model has been expressed in terms of the dynamic residual cumulative Tsallis entropy. We have characterized several well-known lifespan distributions, including the exponential, Pareto, and finite-range distributions, which are essential to reliability modeling, based on the suggested dynamic version. Finally, the empirical residual cumulative generalized fractional extropy has been used for nonparametric estimation. In addition, consistency and a central limit theorem have been discussed, with examples based on the uniform and exponential distributions, for the proposed estimator. Across both simulated and real datasets, the suggested estimator of the residual cumulative generalized fractional extropy has shown strong performance in approaching the theoretical measure. In future work, we can extend this model to further data sets; see, for example, Zheng et al. [39]. Moreover, we aim to further explore the stability and robustness of the proposed estimator in more complex and multimodal real-world datasets.

Author Contributions

Conceptualization, M.S.M. and H.H.S.; Formal analysis, M.S.M. and H.H.S.; Investigation, M.S.M. and H.H.S.; Methodology, M.S.M. and H.H.S.; Software, M.S.M. and H.H.S.; Visualization, M.S.M. and H.H.S.; Resources, M.S.M. and H.H.S.; Writing—original draft, M.S.M. and H.H.S.; Validation, M.S.M. and H.H.S.; Writing—review & editing, M.S.M. and H.H.S. All authors contributed equally to the conception and design of the research project. All authors have read and agreed to the published version of the manuscript.

Funding

This study is supported via funding from Prince Sattam bin Abdulaziz University project number (PSAU/2025/R/1446).

Data Availability Statement

The article contains the datasets created and/or examined during the current investigation.

Acknowledgments

This study is supported via funding from Prince Sattam bin Abdulaziz University project number (PSAU/2025/R/1446).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Shannon, C. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
  2. Rao, M.; Chen, Y.; Vemuri, B.; Wang, F. Cumulative residual entropy: A new measure of information. IEEE Trans. Inf. Theory 2004, 50, 1220–1228. [Google Scholar] [CrossRef]
  3. Sati, M.M.; Gupta, N. Some Characterization Results on Dynamic Cumulative Residual Tsallis Entropy. J. Probab. Stat. 2015, 2015, 694203. [Google Scholar] [CrossRef]
  4. Rajesh, G.; Sunoj, S.M. Some properties of cumulative Tsallis entropy of order α. Stat. Pap. 2019, 60, 933–943. [Google Scholar] [CrossRef]
  5. Ubriaco, M.R. Entropies based on fractional calculus. Phys. Lett. A 2009, 373, 2516–2519. [Google Scholar] [CrossRef]
  6. Xiong, H.; Shang, P.J.; Zhang, Y.L. Fractional cumulative residual entropy. Commun. Nonlinear Sci. 2019, 78, 104879. [Google Scholar] [CrossRef]
  7. Di Crescenzo, A.; Kayal, S.; Meoli, A. Fractional generalized cumulative entropy and its dynamic version. Commun. Nonlinear Sci. 2021, 102, 105899. [Google Scholar] [CrossRef]
  8. Psarrakos, G.; Navarro, J. Generalized cumulative residual entropy and record values. Metrika 2013, 76, 623–640. [Google Scholar] [CrossRef]
  9. Psarrakos, G.; Toomaj, A. On the generalized cumulative residual entropy with applications in actuarial science. J. Comput. Appl. Math. 2017, 309, 186–199. [Google Scholar] [CrossRef]
  10. Mohamed, M.S.; Almuqrin, M.A. Properties of fractional generalized entropy in ordered variables and symmetry testing. Aims Math. 2025, 10, 1116–1141. [Google Scholar] [CrossRef]
  11. Lad, F.; Sanfilippo, G.; Agro, G. Extropy: Complementary dual of entropy. Stat. Sci. 2015, 30, 40–58. [Google Scholar] [CrossRef]
Figure 1. Plots of FGxΦ_ν(Y), Approx1_ν(Y), and Approx2_ν(Y) for the exp(1) distribution.
Figure 2. Plots of RFGxΦ_ν(Y), for different values of ν, for the finite range distribution with β = 0.5 (left) and the power distribution (right).
Figure 3. Plots of RFGxΦ_ν(Y), for different values of ν, for the Rayleigh (left) and Pareto (right) distributions.
Figure 4. Plots of D_s(ν), s ν RFGxΦ_ν(Y), and RFGxΦ_ν(Y_(s)) for the standard uniform distribution on [0, 1], with ν = 1.1.
Figure 5. The empirical residual cumulative generalized fractional extropy of random variables following the uniform distribution on [0, 1] (left) and the exp(0.5) distribution (right).
Figure 6. Histogram (left) and theoretical and empirical cdf (right) of the "Time" blood transfusion data.
Figure 7. Theoretical and empirical residual cumulative generalized fractional extropy of the "Time" blood transfusion data, with the exp(0.02916975) distribution.
Figure 8. Theoretical and empirical residual cumulative generalized fractional extropy of the operational repair times (in hours) for an airborne telecommunications transmitter, with the exp(0.2773) distribution.
Figure 9. Absolute biases of the residual cumulative generalized fractional extropy estimates according to Example 8.
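Figures 6–8 rest on fitting an exponential model to the data and then comparing theoretical and empirical quantities. The Python sketch below illustrates only that fitting step, under the assumption that the exponential rate is estimated by maximum likelihood (the reciprocal of the sample mean) and that the fitted cdf is compared with the empirical cdf in the spirit of Figure 6; the `times` array is hypothetical placeholder data, not the blood transfusion or repair-time datasets.

```python
# Sketch: maximum-likelihood exponential fit and a simple cdf comparison,
# in the spirit of Figure 6. `times` is hypothetical placeholder data.
import numpy as np

times = np.sort(np.array([12.0, 35.0, 4.0, 50.0, 23.0, 16.0, 74.0, 8.0]))

rate_hat = 1.0 / times.mean()                      # MLE of the exponential rate
ecdf = np.arange(1, len(times) + 1) / len(times)   # empirical cdf at the order statistics
fitted_cdf = 1.0 - np.exp(-rate_hat * times)       # fitted exponential cdf at the same points

max_gap = np.max(np.abs(ecdf - fitted_cdf))        # Kolmogorov-Smirnov-type distance
print(rate_hat, max_gap)
```

Assuming the same kind of fit, the rates quoted in the captions (0.02916975 and 0.2773) would be obtained analogously from the respective datasets; the paper's exact fitting procedure should be taken from the text.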
Table 1. Residual cumulative generalized fractional extropy for a few widely used distributions.
Distribution | G̅(y) | RFGxΦ_ν(Y) = κ(ν) ∫₀^∞ G̅^ν(y) dy
Weibull | e^{−(yβ)^α}, y ≥ 0, α, β > 0 | κ(ν) (1/(β^α ν))^{1/α} Γ(1 + 1/α)
Rayleigh | e^{−y²/(2β²)}, y ≥ 0, β > 0 | κ(ν) β √(π/(2ν))
Exponential | e^{−γy}, y ≥ 0, γ > 0 | κ(ν)/(γν)
Pareto | y^{−α}, y ≥ 1, α > 0 | κ(ν)/(αν − 1), αν > 1
Uniform | 1 − y/β, 0 ≤ y ≤ β, β > 0 | β κ(ν)/(ν + 1)
Power | 1 − y^α, 0 ≤ y ≤ 1, α > 0 | κ(ν) Γ(1 + 1/α) Γ(ν + 1)/Γ(1 + 1/α + ν)
Finite range | (1 − αy)^β, 0 ≤ y ≤ 1/α, α, β > 0 | κ(ν)/(α(βν + 1))
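Each entry in the last column of Table 1 is κ(ν) times the closed form of ∫₀^∞ G̅^ν(y) dy. As a numerical sanity check, the following Python sketch compares that integral, computed by quadrature, with the tabulated expression for the exponential and Rayleigh rows; since κ(ν) is a common multiplicative factor defined earlier in the paper, it is left out and only the integrals are compared. The parameter values are illustrative.

```python
# Sketch: numerically check the integral factor behind two rows of Table 1.
# kappa(nu) multiplies every entry, so only int_0^inf Gbar(y)^nu dy is compared.
import numpy as np
from scipy.integrate import quad

nu = 1.7                 # illustrative fractional order
rate, beta = 0.5, 2.0    # illustrative exponential rate and Rayleigh scale

# Exponential(rate): survival exp(-rate*y); closed form 1/(rate*nu).
num_exp, _ = quad(lambda y: np.exp(-rate * y) ** nu, 0, np.inf)
print(num_exp, 1.0 / (rate * nu))

# Rayleigh(beta): survival exp(-y^2/(2*beta^2)); closed form beta*sqrt(pi/(2*nu)).
num_ray, _ = quad(lambda y: np.exp(-y**2 / (2 * beta**2)) ** nu, 0, np.inf)
print(num_ray, beta * np.sqrt(np.pi / (2 * nu)))
```

Both printed pairs should agree to quadrature accuracy; the remaining rows of Table 1 can be checked the same way.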
Table 2. The RM* of the residual cumulative generalized fractional extropy of random variables from the uniform distribution on [0, 1] and the exp(0.5) distribution, based on 10,000 repetitions.
Uniform distribution
s | ν = 0.5 | ν = 0.9 | ν = 1 | ν = 1.5 | ν = 2 | ν = 2.5 | ν = 3
10 | 0.161776 | 0.132931 | 0.125248 | 0.0864609 | 0.0534129 | 0.0300807 | 0.0156762
20 | 0.0944264 | 0.0835524 | 0.0797054 | 0.0573684 | 0.0363235 | 0.0208116 | 0.0109902
30 | 0.0698461 | 0.064405 | 0.0618245 | 0.0454066 | 0.0290754 | 0.0167828 | 0.00891063
50 | 0.0487952 | 0.0472531 | 0.0456565 | 0.0342064 | 0.022138 | 0.0128691 | 0.00686888
70 | 0.0384165 | 0.038214 | 0.0370454 | 0.0280366 | 0.0182468 | 0.010648 | 0.00569992
100 | 0.0305881 | 0.0314006 | 0.0305596 | 0.0233949 | 0.0153097 | 0.00896257 | 0.00480783
Exponential distribution
s | ν = 0.5 | ν = 0.9 | ν = 1 | ν = 1.5 | ν = 2 | ν = 2.5 | ν = 3
10 | 3.88754 | 1.86078 | 1.59086 | 0.763005 | 0.370764 | 0.175984 | 0.0806626
20 | 3.823 | 1.81117 | 1.54469 | 0.732381 | 0.352205 | 0.165627 | 0.0752857
30 | 3.80383 | 1.79729 | 1.53188 | 0.724078 | 0.34717 | 0.162779 | 0.0737777
50 | 3.7891 | 1.7862 | 1.52157 | 0.71717 | 0.34287 | 0.160293 | 0.0724369
70 | 3.78101 | 1.77959 | 1.51535 | 0.712906 | 0.340223 | 0.158782 | 0.071634
100 | 3.77488 | 1.77461 | 1.51066 | 0.709616 | 0.338122 | 0.157549 | 0.0709622
Table 3. The mean and variance (in parentheses) of the residual cumulative generalized fractional extropy of random variables from the uniform distribution on [0, 1] and the exp(0.5) distribution.
Uniform distribution
s | ν = 0.5 | ν = 0.9 | ν = 1 | ν = 1.5 | ν = 2 | ν = 2.5 | ν = 3
10 | 0.688886 (0.0349704) | 0.494707 (0.0221006) | 0.45 (0.0196281) | 0.264168 (0.0104911) | 0.1425 (0.00527996) | 0.0715543 (0.00250388) | 0.03375 (0.00112305)
20 | 0.721538 (0.0220977) | 0.521085 (0.0142542) | 0.475 (0.0127293) | 0.282319 (0.00699761) | 0.154375 (0.00362468) | 0.078606 (0.0017704) | 0.0376042 (0.00081843)
30 | 0.732071 (0.0159614) | 0.529832 (0.0103657) | 0.483333 (0.00927311) | 0.288464 (0.00514312) | 0.158426 (0.00268817) | 0.0810264 (0.00132502) | 0.0389352 (0.000618232)
50 | 0.740324 (0.0102199) | 0.53681 (0.00667283) | 0.49 (0.00597773) | 0.293415 (0.00333851) | 0.1617 (0.00175721) | 0.0829878 (0.000872285) | 0.0400167 (0.000409905)
70 | 0.743802 (0.00750798) | 0.539795 (0.00491342) | 0.492857 (0.00440416) | 0.295547 (0.00246691) | 0.163112 (0.00130228) | 0.0838352 (0.000648377) | 0.0404847 (0.000305596)
100 | 0.746381 (0.00536806) | 0.542031 (0.00351905) | 0.495 (0.00315569) | 0.297149 (0.00177146) | 0.164175 (0.000937208) | 0.0844735 (0.000467642) | 0.0408375 (0.000220898)
Exponential distribution
s | ν = 0.5 | ν = 0.9 | ν = 1 | ν = 1.5 | ν = 2 | ν = 2.5 | ν = 3
10 | 3.35756 (1.44078) | 2.04854 (0.46855) | 1.8 (0.36) | 0.918515 (0.101859) | 0.45 (0.0285) | 0.211335 (0.00733386) | 0.095 (0.00170367)
20 | 3.71993 (0.903424) | 2.17393 (0.250413) | 1.9 (0.19) | 0.962051 (0.053759) | 0.475 (0.0154375) | 0.225855 (0.00408568) | 0.102917 (0.000976851)
30 | 3.8741 (0.672551) | 2.21715 (0.170814) | 1.93333 (0.128889) | 0.976095 (0.0364681) | 0.483333 (0.0105617) | 0.230771 (0.0028202) | 0.105617 (0.000680384)
50 | 4.02483 (0.456248) | 2.25267 (0.104467) | 1.96 (0.0784) | 0.987099 (0.0221827) | 0.49 (0.006468) | 0.234732 (0.00173912) | 0.1078 (0.000422519)
70 | 4.10347 (0.350581) | 2.26831 (0.0752642) | 1.97143 (0.0563265) | 0.991736 (0.0159371) | 0.492857 (0.00466035) | 0.236437 (0.00125676) | 0.108741 (0.00030623)
100 | 4.17266 (0.263682) | 2.28029 (0.0530409) | 1.98 (0.0396) | 0.995175 (0.0112045) | 0.495 (0.0032835) | 0.237719 (0.000887397) | 0.10945 (0.000216704)
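Tables 2 and 3 summarize the sampling behavior of the empirical residual cumulative generalized fractional extropy over repeated samples of size s. The Python sketch below shows how such an experiment can be organized, assuming a plug-in estimator that replaces G̅ by the empirical survival function, which reduces to κ(ν) Σ_{i=1}^{s−1} (1 − i/s)^ν (y_(i+1) − y_(i)) over the ordered sample; the constant κ(ν) is only a placeholder (the value 1/Γ(ν + 1) used here is a working assumption), and the paper's exact estimator, together with the definition of RM*, should be taken from the text.

```python
# Sketch of a Monte Carlo experiment in the spirit of Tables 2-3 (assumptions noted above).
# Assumed estimator: kappa(nu) * sum_{i=1}^{s-1} (1 - i/s)^nu * (y_(i+1) - y_(i)).
import numpy as np
from math import gamma

rng = np.random.default_rng(1)

def kappa(nu):
    # Placeholder normalizing constant -- substitute the paper's kappa(nu).
    return 1.0 / gamma(nu + 1)

def empirical_rfgx(sample, nu):
    # Empirical survival (1 - i/s) at the i-th order statistic, raised to the
    # power nu and weighted by the sample spacings.
    y = np.sort(sample)
    s = len(y)
    i = np.arange(1, s)                 # i = 1, ..., s-1
    weights = (1.0 - i / s) ** nu
    return kappa(nu) * np.sum(weights * np.diff(y))

def simulate(draw, s, nu, reps=10_000):
    # Mean and variance of the estimator over repeated samples, mirroring Table 3's layout.
    est = np.array([empirical_rfgx(draw(s), nu) for _ in range(reps)])
    return est.mean(), est.var()

# Illustrative run: exp(0.5) samples (scale 2), s = 100, nu = 1.
mean_, var_ = simulate(lambda s: rng.exponential(scale=2.0, size=s), s=100, nu=1.0)
print(mean_, var_)
```

The uniform panels correspond to swapping in `rng.uniform(size=s)` as the sample generator.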