1. Introduction
In physics, entropy has long served as a crucial and very useful measure. Clausius [1] introduced it in 1850 to quantify the heat transferred during the course of a reversible process at a specific temperature T. In statistical mechanics, however, entropy plays a slightly deeper role: it is typically viewed as the level of uncertainty in the state that a physical system can attain, or as a link between the microscopic and macroscopic descriptions, since it counts the number of microstates an atom or molecule can occupy while satisfying a given macroscopic configuration. Entropy's application is not restricted to statistical analysis, because it is directly related to the second law of thermodynamics and therefore bears on all other areas of physics. Entropy in statistical mechanics is measured using the Boltzmann–Gibbs measure, which is given by

H = -\sum_{i=1}^{N} p_i \log p_i, \quad (1)
in which $p_i$, $i = 1, \dots, N$, stands for a discrete probability distribution on a random quantum state with N microstates. For $p_i = 1/N$, the measure H takes its greatest value (i.e., $H = \log N$), which holds when the state of the system is in equilibrium. The Boltzmann–Gibbs entropy has the additivity property: for two systems A and B which are non-interacting and adequately separated from each other, with accessible microstates $N_A$ and $N_B$, respectively, one has $H_{A+B} = H_A + H_B$. It is worth mentioning that H also has the extensivity property (i.e., the entropy of the composite system $A + B$ grows in proportion to the number of its constituents). It should be noted that the entropy in Equation (1) is dimensionless, so temperature in this discussion has the dimension of energy.
The application of entropy in mechanics, thermodynamics, and fatigue life modeling has been presented recently. For example, in the Basaran [2] theory, entropy generation is incorporated into Newton's universal laws of motion. Lee et al. [3] used the thermodynamic state index (TSI), determined via cumulative entropy generation, to predict lifetimes. On the basis of unified mechanics theory, Lee et al. [4] introduced a fatigue life model that anticipates the fatigue life of metals under very high cycling using an entropy generation mechanism. Lee and Basaran [5] analyzed models based on irreversible entropy as a metric with an empirical evolution function for empirical models developed in the context of Newtonian mechanics. Another fatigue model using entropy generation was proposed by Temfack and Basaran [6].
Several generalizations of the Boltzmann–Gibbs entropy have been brought forward to push the formulation of statistical mechanics to new limits. Several of these approaches have been prompted by the desire to maintain the thermodynamic limit while deforming the entropy structure of Equation (1) without free parameters (see, for example, the work of Obregón [7], Fuentes et al. [8], and Fuentes and Obregón [9]). Others have studied relaxations of the measure in Equation (1) obtained by introducing free parameters (see Kaniadakis [10] and Sharma and Mittal [11]). In what follows, we concentrate on the Renyi entropy (RE), which was first discussed in relation to coding and information theory by Renyi [12] as one of the first endeavors to extend the Shannon entropy (Shannon [13]). The RE is defined as

H_\alpha = \frac{1}{1-\alpha} \log \sum_{i=1}^{N} p_i^{\alpha}, \quad (2)
where $\alpha > 0$, $\alpha \neq 1$, is a parameter that deforms the entropy structure. For example, in the limit $\alpha \to 1$, the RE reduces to the Boltzmann–Gibbs case. The logarithmic structure allows the RE to retain the property of additivity no matter what value the free parameter takes, although the extensivity property is no longer retained for any $\alpha \neq 1$.
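As a numerical illustration of Equation (2) (a minimal sketch of our own; the function name is not from the literature), the following Python snippet computes the discrete RE, recovers the Boltzmann–Gibbs case in the limit $\alpha \to 1$, and checks both additivity and the maximum value $\log N$:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy of a discrete distribution p, Equation (2); alpha > 0."""
    if alpha == 1.0:
        # The limit alpha -> 1 recovers the Boltzmann-Gibbs (Shannon) measure.
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

# Additivity for independent subsystems A and B: H(A + B) = H(A) + H(B).
pA = [0.2, 0.8]
pB = [0.5, 0.3, 0.2]
pAB = [a * b for a in pA for b in pB]          # joint distribution
assert abs(renyi_entropy(pAB, 2.0)
           - renyi_entropy(pA, 2.0) - renyi_entropy(pB, 2.0)) < 1e-12

# The equilibrium (uniform) distribution attains the greatest value log N.
N = 4
assert abs(renyi_entropy([1.0 / N] * N, 0.5) - math.log(N)) < 1e-12
```

The additivity check relies on the joint distribution of two independent systems factorizing, exactly as in the discussion following Equation (1).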
Because of its properties, the RE has gained considerable attention in information theory from both the quantum and classical perspectives (see, for example, the work of Campbell [14], Csiszar [15], and Goold et al. [16]). The RE is a powerful measure for quantifying quantum entanglement and the corresponding strong correlations found in quantum fields (Cui et al. [17]). For instance, in multipartite systems, the special case $\alpha = 2$ was discovered to be a measure of information describing the Gaussian states of harmonic quantum oscillators, resulting in strong subadditivity inequalities that require generalized measures of mutual information (Adesso et al. [18]), the computation of which can be traced through path regularization schemes (Srdinšek [19]). Generalizations of conditional quantum information and the topological entanglement entropy are two more uses of the RE in quantum information (Berta et al. [20]). The RE has also been proposed as a tool to describe phase changes between self-organized states in dynamical problems, both in complex systems (Beck [21] and Bashkirov [22]) and in fuzzy systems (Eslami-Giski et al. [23]), connected to the occurrence of quantum fuzzy trajectories (Fuentes [24]). The viability of this entropy measure has also been acknowledged in other disciplines, such as molecular imaging for therapeutic applications (Hughes [25]), mathematical physics (Franchini et al. [26]), and biostatistics (Chavanis [27]). The concept of the RE can also be defined for a continuous distribution function (CDF) with some obvious modifications.
In the context of statistics and probability, quantifying the uncertainty in a system's lifetime is critical for engineers performing survival analysis. They concur that systems with higher dependability and longer lifetimes are better systems and that system reliability declines as uncertainty rises (see, for example, the work of Ebrahimi and Pellerey [
28]). Let X be a non-negative random variable (RV) with a probability density function (PDF) f. The RE of the order $\alpha$, for $\alpha > 0$ and $\alpha \neq 1$, is given by

H_\alpha(X) = \frac{1}{1-\alpha} \log \int_0^\infty f^{\alpha}(x)\, dx, \quad (3)

in which “log” represents the natural logarithm. Specifically, the Shannon differential entropy [13] can be calculated as $H(X) = \lim_{\alpha \to 1} H_\alpha(X) = -\int_0^\infty f(x) \log f(x)\, dx$.
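Equation (3) can be checked numerically. The sketch below (our own illustration; the quadrature limits and grid are arbitrary choices) approximates the integral with the trapezoidal rule and compares it against the closed form for an exponential PDF with rate $\lambda$, for which $H_\alpha(X) = -\log\lambda + \log\alpha/(\alpha - 1)$:

```python
import math

def renyi_entropy_pdf(f, alpha, a=0.0, b=50.0, n=200000):
    """Approximate Equation (3), H_alpha(X) = log(int_0^inf f^alpha dx) / (1 - alpha),
    by the trapezoidal rule on [a, b]; the tail beyond b is neglected."""
    h = (b - a) / n
    ys = [f(a + k * h) ** alpha for k in range(n + 1)]
    integral = h * (0.5 * ys[0] + sum(ys[1:-1]) + 0.5 * ys[-1])
    return math.log(integral) / (1.0 - alpha)

lam, alpha = 2.0, 2.0
f = lambda x: lam * math.exp(-lam * x)                 # exponential PDF
closed_form = -math.log(lam) + math.log(alpha) / (alpha - 1.0)
assert abs(renyi_entropy_pdf(f, alpha) - closed_form) < 1e-4
```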
It is worth pointing out that the Shannon differential entropy is utilized to measure the uniformity of a PDF; in this case, the maximum entropy distribution is the uniform one. Hence, greater values of $H_\alpha(X)$ indicate more uncertainty generated by the PDF f and, as a result, a reduced ability to predict the next outcome of the RV X. If X denotes the lifetime of a new system, then $H_\alpha(X)$ measures the uncertainty of the new system. In some cases, agents know something regarding the age of the system. For example, one may know that the system is alive at time t and be interested in the uncertainty of its residual lifetime (RL), i.e., $X_t = [X - t \mid X > t]$. Then, $H_\alpha(X)$ will not be useful in such situations. Accordingly, the residual RE is

H_\alpha(X; t) = \frac{1}{1-\alpha} \log \int_t^\infty \left( \frac{f(x)}{\bar F(t)} \right)^{\alpha} dx = \frac{1}{1-\alpha} \log \frac{1}{\bar F^{\alpha}(t)} \int_{F(t)}^{1} f^{\alpha-1}(F^{-1}(u))\, du, \quad (4)

where $f(x)/\bar F(t)$, $x > t$, is the PDF of $X_t$, $\bar F = 1 - F$ is the survival function (SF) of X, and $F^{-1}$ is the quantile function of F. Various properties, generalizations, and applications of $H_\alpha(X; t)$ were investigated by Asadi et al. [29], Gupta and Nanda [30], Nanda and Paul [31], and the references therein.
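As a numerical check of Equation (4) (our own sketch; exponential lifetimes and a truncated trapezoidal quadrature are illustrative assumptions), note that the exponential distribution is memoryless, so $X_t$ has the same law as X and the residual RE should not vary with the age t:

```python
import math

def residual_renyi(f, sf, t, alpha, upper=60.0, n=200000):
    """Approximate Equation (4): H_alpha(X; t) = log(int_t^inf (f(x)/sf(t))^alpha dx)
    / (1 - alpha), by the trapezoidal rule on [t, upper]."""
    h = (upper - t) / n
    ys = [(f(t + k * h) / sf(t)) ** alpha for k in range(n + 1)]
    integral = h * (0.5 * ys[0] + sum(ys[1:-1]) + 0.5 * ys[-1])
    return math.log(integral) / (1.0 - alpha)

lam, alpha = 1.0, 0.5
f = lambda x: lam * math.exp(-lam * x)                 # exponential PDF
sf = lambda x: math.exp(-lam * x)                      # exponential SF
vals = [residual_renyi(f, sf, t, alpha) for t in (0.0, 1.0, 3.0)]
assert max(vals) - min(vals) < 1e-6                    # no dependence on the age t
```

For non-exponential lifetimes the values would differ across t, which is precisely the aging behavior studied in the remainder of the paper.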
We recall that a number of standards are available for assessing the aging process of a lifetime unit. To this aim, the residual lifetime (RL) is, in turn, an illustrative measure of aging behavior. In fact, when the distribution of the RL is not affected by the age of a system, the system is said to exhibit a no-aging property. Thus, the behavior of the RL is dominant when one studies the aging phenomenon in a component or, more generally, in a system. In this paper, the RE of the RL of a system is evaluated. Namely, a formula for the RE of the RL is presented for the case in which all components of the system are working at time t. The expressions are based on system signatures, lifetime distribution functions, and the beta distribution. The concept of a system signature applies when the lifetimes of the components of a coherent system are independent and identically distributed (IID). Ebrahimi [32] initiated the concept of the dynamic Shannon entropy and derived several of its properties. Toomaj and Doostparast [33] studied the classic Shannon entropy and its properties for mixed systems, and further findings on this measure have also been obtained (see also [34]). Recently, Toomaj et al. [35] investigated some results concerning the information aspects of working systems in use by applying a dynamic signature. In this paper, we continue this line of research and investigate the RE properties of working used systems using system signatures. In fact, we generalize the results of the aforementioned papers.
The results of this work are organized as follows. In Section 2, we provide an expression for the RE of a coherent system under the assumption that all components have survived to a certain point in time. In Section 3, the residual RE is ordered based on some ordering properties of system signatures without being computed directly. Section 4 presents some useful limits and bounds for the new measure. Some remarks that may be useful in future studies are given in Section 5.
In the remaining parts of the paper, the notations “$\le_{st}$”, “$\le_{hr}$”, “$\le_{lr}$”, and “$\le_{disp}$” are used to signify the usual stochastic order, the hazard rate order, the likelihood ratio order, and the dispersive order, respectively. For further aspects and properties of these orders, the reader is referred to Shaked and Shanthikumar [36].
2. Renyi Entropy of the Residual Lifetime
In this section, we use the concept of the system signature to find an expression for the RE of the RL of a system with a coherent structure of arbitrary form, in the sense that we know that the components contained in the system are all in operation at a time t. The signature of a coherent system with n components is an n-dimensional vector $\mathbf{s} = (s_1, \dots, s_n)$ whose jth element is $s_j = P(T = X_{j:n})$, $j = 1, \dots, n$, in which T represents the lifetime of the underlying coherent system and $X_{1:n} \le X_{2:n} \le \cdots \le X_{n:n}$ represent the order statistics of the n IID component lifetimes $X_1, \dots, X_n$ with a common distribution F. For more details, see, for example, the work of Samaniego [37]. Let us consider a coherent structure with IID component lifetimes $X_1, \dots, X_n$ and a known signature vector $\mathbf{s} = (s_1, \dots, s_n)$. If

T_t = [T - t \mid X_{1:n} > t], \quad t \ge 0,

represents the RL of the system provided that, at time t, all components of the system are functioning, then from the results of Khaledi and Shaked [38], the SF of $T_t$ is derivable as follows:

P(T_t > x) = \sum_{i=1}^{n} s_i\, P(T_t^i > x), \quad x \ge 0, \quad (5)
where $T_t^i = [X_{i:n} - t \mid X_{1:n} > t]$ denotes the RL of an i-out-of-n system, provided that the components are all operating at the time t. The SF and PDF of $T_t^i$ are given by

P(T_t^i > x) = \sum_{k=0}^{i-1} \binom{n}{k} [F_t(x)]^{k} [\bar F_t(x)]^{n-k}, \quad (6)

and

f_{T_t^i}(x) = \frac{\Gamma(n+1)}{\Gamma(i)\,\Gamma(n-i+1)}\, [F_t(x)]^{i-1} [\bar F_t(x)]^{n-i} f_t(x), \quad (7)

respectively, where $F_t(x) = [F(x+t) - F(t)]/\bar F(t)$, $\bar F_t(x) = \bar F(x+t)/\bar F(t)$, $f_t(x) = f(x+t)/\bar F(t)$, and $\Gamma(\cdot)$ is the complete gamma function. It follows that

f_{T_t}(x) = \sum_{i=1}^{n} s_i\, f_{T_t^i}(x), \quad x \ge 0. \quad (8)
In what follows, we will concentrate on the study of the RE of the RV $T_t$, which measures the degree of uncertainty induced by the PDF of $T_t$ with respect to the predictability of the RL of the system in terms of the RE. The probability integral transformation $V = F_t(T_t)$ plays a crucial role in our goal. It is clear that $V_i = F_t(T_t^i)$ follows a beta distribution having the parameters i and $n - i + 1$ with the PDF

g_i(v) = \frac{v^{i-1} (1 - v)^{n-i}}{B(i, n-i+1)}, \quad 0 < v < 1, \quad (9)

so that the PDF of V is

g_V(v) = \sum_{i=1}^{n} s_i\, g_i(v), \quad 0 < v < 1. \quad (10)

Next, we give a statement on the RE of $T_t$ by applying the earlier transformations:
Theorem 1. The RE of $T_t$ can be derived as

H_\alpha(T_t) = \frac{1}{1-\alpha} \log \int_0^1 g_V^{\alpha}(v)\, \phi_t^{\alpha-1}(v)\, dv, \quad (11)

where $\phi_t(v) = f(F^{-1}(F(t) + v \bar F(t)))/\bar F(t)$ for all $0 < v < 1$.
Proof. By using the change of variable $v = F_t(x)$ in Equations (4) and (8), we obtain

H_\alpha(T_t) = \frac{1}{1-\alpha} \log \int_0^\infty \left( \sum_{i=1}^{n} s_i\, g_i(F_t(x))\, f_t(x) \right)^{\alpha} dx = \frac{1}{1-\alpha} \log \int_0^1 g_V^{\alpha}(v)\, \phi_t^{\alpha-1}(v)\, dv.

In the last equality, $g_V$ is the PDF of V, which signifies the lifetime of the system when the component lifetimes are IID uniform on (0, 1). □
According to Equation (11), if $\mathbf{s} = (0, \dots, 0, 1, 0, \dots, 0)$, with the 1 in the ith place (standing for the signature of an i-out-of-n system), then

H_\alpha(T_t^i) = \frac{1}{1-\alpha} \log \int_0^1 g_i^{\alpha}(v)\, \phi_t^{\alpha-1}(v)\, dv, \quad (12)

in which $g_i$ is the PDF of the beta distribution with the parameters i and $n - i + 1$, such that $B(a, b) = \Gamma(a)\Gamma(b)/\Gamma(a + b)$ stands for the beta function. In the special case for $t = 0$, Equation (12) coincides with the results of Abbasnejad and Arghami [39].
The next theorem follows directly from Theorem 1, concerning the aging properties of the components. We recall that X has an increasing (decreasing) failure rate (IFR (DFR)) if $\bar F(x + t)/\bar F(x)$ is decreasing (increasing) in x for all $t \ge 0$:
Theorem 2. If X is IFR (DFR), then $H_\alpha(T_t)$ is decreasing (increasing) in t for all $\alpha > 0$.
Proof. We just prove this for when X is IFR, while the proof for the DFR case is similar. It is plain to observe that

\frac{f(F^{-1}(F(t) + v \bar F(t)))}{\bar F(t)} = (1 - v)\, \lambda(F^{-1}(F(t) + v \bar F(t))), \quad (13)

where $\lambda(x) = f(x)/\bar F(x)$ denotes the hazard rate function of X. This implies that Equation (11) can be rewritten as

H_\alpha(T_t) = \frac{1}{1-\alpha} \log \int_0^1 g_V^{\alpha}(v)\, (1 - v)^{\alpha-1}\, \lambda^{\alpha-1}(F^{-1}(F(t) + v \bar F(t)))\, dv \quad (14)

for all $t \ge 0$. Instead, one can find that $F^{-1}(F(t) + v \bar F(t))$ is increasing in t for all $0 < v < 1$. If $\alpha > 1$, then $\frac{1}{1-\alpha} < 0$. Therefore, when F is IFR, then for all $0 < v < 1$ and $t_1 \le t_2$ we have

\lambda^{\alpha-1}(F^{-1}(F(t_1) + v \bar F(t_1))) \le \lambda^{\alpha-1}(F^{-1}(F(t_2) + v \bar F(t_2))).

Using (14), we get $H_\alpha(T_{t_1}) \ge H_\alpha(T_{t_2})$ for all $t_1 \le t_2$. The case $0 < \alpha < 1$ is handled similarly, since then both the direction of the inequality between the integrands and the sign of $\frac{1}{1-\alpha}$ reverse. This implies that $H_\alpha(T_t)$ is decreasing in t for all $\alpha > 0$, and this completes the proof. □
The next example illustrates the results of Theorems 1 and 2:
Example 1. Consider a bridge system with the system signature $\mathbf{s} = (0, 1/5, 3/5, 1/5, 0)$. The exact value of $H_\alpha(T_t)$ can be calculated using the relation in Equation (11), given the distributions of the component lifetimes. For this purpose, let us assume the following lifetime distributions:
- (i)
Let X follow the uniform distribution on $(0, \theta)$. From Equation (11), we immediately obtain

H_\alpha(T_t) = \frac{1}{1-\alpha} \log \int_0^1 g_V^{\alpha}(v)\, dv + \log(\theta - t), \quad 0 < t < \theta. \quad (15)

We see that $H_\alpha(T_t)$ is decreasing in t. We note that the uniform distribution has the IFR property, and therefore $H_\alpha(T_t)$ decreases as t increases, as we expected based on Theorem 2.
- (ii)
Think about a Pareto type II distribution with the SF

\bar F(x) = (1 + x)^{-\theta}, \quad x > 0,\ \theta > 0. \quad (16)

It is not hard to see that

H_\alpha(T_t) = \frac{1}{1-\alpha} \log \int_0^1 g_V^{\alpha}(v)\, (1 - v)^{\frac{(\theta+1)(\alpha-1)}{\theta}}\, dv + \log\left( \frac{1 + t}{\theta} \right).

It is obvious that the RE of $T_t$ is increasing in terms of t. Thus, the uncertainty of the conditional lifetime increases as t increases. We recall that this distribution has the DFR property.
- (iii)
Let us suppose that X has a Weibull distribution with the shape parameter k and with the SF

\bar F(x) = e^{-x^{k}}, \quad x > 0,\ k > 0. \quad (17)

Through some manipulation, we obtain

H_\alpha(T_t) = \frac{1}{1-\alpha} \log \int_0^1 g_V^{\alpha}(v) \left[ k (1 - v) \left( t^{k} - \log(1 - v) \right)^{\frac{k-1}{k}} \right]^{\alpha-1} dv.

It is not a facile task to obtain a closed-form statement for the above relation, and therefore we computed it numerically. In Figure 1, we plot the entropy of $T_t$ in terms of the time t for several values of $\alpha$ and a shape parameter $k < 1$, for which the Weibull distribution has the DFR property. As expected from Theorem 2, it is evident that $H_\alpha(T_t)$ increases in t. In Figure 2, we plot the entropy of $T_t$ with respect to the time t for several values of $\alpha$ and a shape parameter $k > 1$, for which the distribution has the IFR property. As expected from Theorem 2, it is evident that $H_\alpha(T_t)$ decreases in t.
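Part (iii) can be reproduced numerically. The sketch below (our own cross-check; the three-component system with signature (0, 1, 0), i.e., lifetime $X_{2:3}$, and the grid size are illustrative assumptions) evaluates Equation (11) for Weibull components with k = 0.5 and confirms the growth of the entropy in t predicted by Theorem 2 for DFR components:

```python
import math

k, alpha = 0.5, 2.0                      # shape k < 1: the Weibull law is DFR

f    = lambda x: k * x ** (k - 1.0) * math.exp(-x ** k)   # Weibull PDF, Eq. (17)
F    = lambda x: 1.0 - math.exp(-x ** k)                  # Weibull CDF
Finv = lambda u: (-math.log(1.0 - u)) ** (1.0 / k)        # quantile function

def h_alpha_residual(t, n_grid=20000):
    """Equation (11) for the system with signature (0, 1, 0) (lifetime X_{2:3}),
    for which g_V(v) = 6 v (1 - v); midpoint rule on (0, 1)."""
    sf_t = 1.0 - F(t)
    h = 1.0 / n_grid
    total = 0.0
    for j in range(n_grid):
        v = (j + 0.5) * h
        g_V = 6.0 * v * (1.0 - v)
        phi = f(Finv(F(t) + v * sf_t)) / sf_t
        total += g_V ** alpha * phi ** (alpha - 1.0) * h
    return math.log(total) / (1.0 - alpha)

# Theorem 2: with DFR components, the residual entropy increases with the age t.
assert h_alpha_residual(0.5) < h_alpha_residual(1.0) < h_alpha_residual(2.0)
```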
Below, we compare the Renyi entropy of a coherent system lifetime with that of its residual lifetime.
Theorem 3. Consider a coherent system with IID IFR (DFR) component lifetimes. Then, $H_\alpha(T_t) \le (\ge)\ H_\alpha(T)$ for all $t \ge 0$.
Proof. We prove the theorem for the case when X is IFR, where the proof for the DFR property is similar. Since X is IFR, Theorem 3.B.25 of Shaked and Shanthikumar [36] implies that $X_t \le_{disp} X$; that is, we have

f_t(F_t^{-1}(v)) \ge f(F^{-1}(v)), \quad 0 < v < 1, \quad (18)

and we recall that $f_t(F_t^{-1}(v)) = \phi_t(v)$. If $\alpha > 1$, then we have

\int_0^1 g_V^{\alpha}(v)\, \phi_t^{\alpha-1}(v)\, dv \ge \int_0^1 g_V^{\alpha}(v)\, f^{\alpha-1}(F^{-1}(v))\, dv.

Thus, from Equations (11) and (18), we obtain $H_\alpha(T_t) \le H_\alpha(T)$, since $\frac{1}{1-\alpha} < 0$; the case $0 < \alpha < 1$ follows in the same way, as both the integral inequality and the sign of $\frac{1}{1-\alpha}$ reverse. Therefore, the proof is completed. □
We remark that Theorem 3 reveals that, when the component lifetimes are IFR (DFR), the RE of a working coherent system whose components are all alive at time t is less (greater) than the RE of the new system. The next theorem provides a lower bound for the residual RE in terms of the RE of the new system:
Theorem 4. If X is DFR, then a lower bound for $H_\alpha(T_t)$ is given as follows:

H_\alpha(T_t) \ge H_\alpha(T) + \log \bar F(t)

for all $t \ge 0$.
Proof. Since X is DFR, it is NWU (i.e., $\bar F(x + t) \ge \bar F(x)\, \bar F(t)$ for all $x, t \ge 0$). This implies that $X_t \ge_{st} X$ for all $t \ge 0$. On the other hand, if X is DFR, then the PDF f is decreasing, which implies that

f(F^{-1}(F(t) + v \bar F(t))) \le f(F^{-1}(v))

for all $0 < v < 1$, since $F(t) + v \bar F(t) \ge v$. From Equation (11), one can conclude that

H_\alpha(T_t) \ge \frac{1}{1-\alpha} \log \int_0^1 g_V^{\alpha}(v) \left( \frac{f(F^{-1}(v))}{\bar F(t)} \right)^{\alpha-1} dv = H_\alpha(T) + \log \bar F(t)

for all $t \ge 0$, and this completes the proof. □
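The lower bound of Theorem 4 (in the reconstructed form $H_\alpha(T_t) \ge H_\alpha(T) + \log \bar F(t)$) can be probed numerically. In the sketch below (our own illustration; the signature (0, 1, 0) and DFR Weibull components with k = 0.5 are assumptions), $H_\alpha(T)$ is Equation (11) evaluated at t = 0:

```python
import math

k, alpha, t0 = 0.5, 2.0, 1.0             # Weibull shape k < 1: DFR components

f    = lambda x: k * x ** (k - 1.0) * math.exp(-x ** k)   # Weibull PDF
F    = lambda x: 1.0 - math.exp(-x ** k)                  # Weibull CDF
Finv = lambda u: (-math.log(1.0 - u)) ** (1.0 / k)        # quantile function

def h_alpha(t, n_grid=40000):
    """Equation (11) for the system with signature (0, 1, 0), for which
    g_V(v) = 6 v (1 - v); midpoint rule on (0, 1)."""
    sf_t = 1.0 - F(t)
    h = 1.0 / n_grid
    total = 0.0
    for j in range(n_grid):
        v = (j + 0.5) * h
        g_V = 6.0 * v * (1.0 - v)
        phi = f(Finv(F(t) + v * sf_t)) / sf_t
        total += g_V ** alpha * phi ** (alpha - 1.0) * h
    return math.log(total) / (1.0 - alpha)

# Theorem 4 (reconstructed bound): H_alpha(T_t) >= H_alpha(T) + log Fbar(t).
assert h_alpha(t0) >= h_alpha(0.0) + math.log(1.0 - F(t0))
```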
3. Renyi Entropy Comparison
In this section, we are concerned with the partial ordering (that is, an ordering with the reflexive, transitive, and antisymmetric properties) of the conditionally represented lifetimes of two coherent systems on the basis of their uncertainties. We report a number of results for ordering two coherent systems on the basis of different existing orderings among their component lifetimes and the associated signature vectors. In the next result, we analyze the entropies of the residual lifetimes of two coherent systems with the same structure:
Theorem 5. Let $T_t^1$ and $T_t^2$ denote the RLs of two coherent systems with matching signatures and n IID component lifetimes $X_1, \dots, X_n$ and $Y_1, \dots, Y_n$ from CDFs F and G, respectively. If $X \le_{hr} Y$, and X or Y is IFR, then $H_\alpha(T_t^1) \le H_\alpha(T_t^2)$ for all $t \ge 0$.
Proof. As a result of the relation in Equation (11), it is sufficient to demonstrate that $X_t \le_{disp} Y_t$ for all $t \ge 0$. Due to the ordering relation $X \le_{hr} Y$ and the assumption that X or Y is IFR, the proof of Theorem 5 of Ebrahimi and Kirmani [40] shows that $X_t \le_{disp} Y_t$, and this concludes the proof. □
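Theorem 5 can also be checked numerically. In the sketch below (our own illustration, not the paper's Example 2: the common signature (0, 1, 0) and the two Weibull laws are assumptions, with $Y = 2X$ so that $X \le_{hr} Y$ and both laws are IFR), the two residual REs are computed from Equation (11):

```python
import math

alpha = 2.0

# X: Weibull with shape 2 and scale 1; Y = 2 X (shape 2, scale 2). Their hazard
# rates are 2 x and x / 2, so X <=_hr Y, and both distributions are IFR.
fX    = lambda x: 2.0 * x * math.exp(-x * x)
FX    = lambda x: 1.0 - math.exp(-x * x)
FXinv = lambda u: math.sqrt(-math.log(1.0 - u))
fY    = lambda x: 0.5 * x * math.exp(-x * x / 4.0)
FY    = lambda x: 1.0 - math.exp(-x * x / 4.0)
FYinv = lambda u: 2.0 * math.sqrt(-math.log(1.0 - u))

def h_alpha(t, f, F, Finv, n_grid=20000):
    """Equation (11) with the common signature (0, 1, 0), g_V(v) = 6 v (1 - v);
    midpoint rule on (0, 1)."""
    sf_t = 1.0 - F(t)
    h = 1.0 / n_grid
    total = 0.0
    for j in range(n_grid):
        v = (j + 0.5) * h
        g_V = 6.0 * v * (1.0 - v)
        phi = f(Finv(F(t) + v * sf_t)) / sf_t
        total += g_V ** alpha * phi ** (alpha - 1.0) * h
    return math.log(total) / (1.0 - alpha)

# Theorem 5: H_alpha(T_t^X) <= H_alpha(T_t^Y) for every age t.
for t in (0.0, 0.5, 1.0):
    assert h_alpha(t, fX, FX, FXinv) <= h_alpha(t, fY, FY, FYinv)
```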
Example 2. Let us consider two coherent systems with residual lifetimes $T_t^1$ and $T_t^2$ and a common signature. Suppose that the component lifetimes X and Y of the two systems follow Weibull distributions of the form of Equation (17) such that $X \le_{hr} Y$; moreover, X and Y are both IFR. Therefore, Theorem 5 yields $H_\alpha(T_t^1) \le H_\alpha(T_t^2)$ for all $t \ge 0$. The plots of the Renyi entropies of these systems are displayed in Figure 3. In the next result, we analyze the residual REs of two coherent systems having matching component lifetimes and distinct structures:
Theorem 6. Let $T_t^1$ and $T_t^2$ signify the RLs of two coherent systems with signature vectors $\mathbf{s}^1$ and $\mathbf{s}^2$, respectively. Suppose that the systems' components are IID according to the CDF F, and also let $\mathbf{s}^1 \le_{lr} \mathbf{s}^2$. Then, we have the following:
- (i)
If $f(F^{-1}(u))$ increases in u for all $0 < u < 1$, then $H_\alpha(T_t^1) \ge H_\alpha(T_t^2)$ for all $t \ge 0$.
- (ii)
If $f(F^{-1}(u))$ decreases in u for all $0 < u < 1$, then $H_\alpha(T_t^1) \le H_\alpha(T_t^2)$ for all $t \ge 0$.
Proof. (i) First, we note that Equation (11) can be reformulated as follows:

H_\alpha(T_t^j) = \frac{1}{1-\alpha} \log E\left[ g_{V_j}^{\alpha-1}(W_j)\, \phi_t^{\alpha-1}(W_j) \right], \quad j = 1, 2, \quad (19)

where $W_j$ has the PDF $g_{V_j}(w) = \sum_{i=1}^{n} s_i^j g_i(w)$, $0 < w < 1$. The assumption $\mathbf{s}^1 \le_{lr} \mathbf{s}^2$ implies $V_1 \le_{lr} V_2$, and this gives that $g_{V_2}(u)/g_{V_1}(u)$ increases in u for all $0 < u < 1$, and hence $V_1 \le_{st} V_2$. When $\alpha > 1$, the function $\phi_t^{\alpha-1}(u)$ increases in u, and we obtain

E\left[ g_{V_1}^{\alpha-1}(W_1)\, \phi_t^{\alpha-1}(W_1) \right] \le E\left[ g_{V_2}^{\alpha-1}(W_2)\, \phi_t^{\alpha-1}(W_2) \right], \quad (20)

where the inequality in Equation (20) is obtained by noting that the condition $V_1 \le_{st} V_2$ implies $E[\psi(V_1)] \le (\ge)\ E[\psi(V_2)]$ for all increasing (decreasing) functions $\psi$. Therefore, the relation in Equation (19) gives

\frac{1}{1-\alpha} \log E\left[ g_{V_1}^{\alpha-1}(W_1)\, \phi_t^{\alpha-1}(W_1) \right] \ge \frac{1}{1-\alpha} \log E\left[ g_{V_2}^{\alpha-1}(W_2)\, \phi_t^{\alpha-1}(W_2) \right],

or equivalently, $H_\alpha(T_t^1) \ge H_\alpha(T_t^2)$ for all $t \ge 0$; the case $0 < \alpha < 1$ is analogous. Part (ii) can be obtained in a similar way. □
The next example gives an application of Theorem 6:
Example 3. Let $\mathbf{s}^1$ and $\mathbf{s}^2$ be the signatures of two coherent systems of common order n, with residual lifetimes $T_t^1$ and $T_t^2$. Let us consider a Pareto type II distribution with the SF given in Equation (16); its PDF is decreasing, so $f(F^{-1}(u))$ decreases in u. After some calculation, one can find the ratio $g_{V_2}(u)/g_{V_1}(u)$. Clearly, this function is increasing in u for all $0 < u < 1$ (i.e., $V_1 \le_{lr} V_2$). Hence, due to Theorem 6, it holds that $H_\alpha(T_t^1) \le H_\alpha(T_t^2)$ for all $t \ge 0$.