#### *3.1. Shannon and Shannon-Related Entropies*

**Shannon entropy** was introduced by Shannon in 1948 [22]; it has multiple applications and can be defined as:

$$H^S(X) = -E[\log f(X)]\tag{12}$$
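As a quick illustration (ours, not from [22]), the following Python sketch evaluates (12) numerically for an exponential density with rate *λ*, for which the closed form is 1 − log *λ*; the density choice and the helper name `shannon_entropy` are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import quad

def shannon_entropy(pdf, lower, upper):
    """Differential Shannon entropy H^S(X) = -E[log f(X)] by numerical integration."""
    value, _ = quad(lambda x: -pdf(x) * np.log(pdf(x)), lower, upper)
    return value

lam = 2.0                                # rate of the exponential example
f = lambda x: lam * np.exp(-lam * x)     # pdf of Exp(lam)

print(shannon_entropy(f, 0, np.inf))     # numerical estimate of (12)
print(1 - np.log(lam))                   # closed form 1 - log(lam) for comparison
```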

Information measures for concomitants derived from the FGM family have been studied by Tahmasebi and Behboodian: in [23] for concomitants of order statistics and in [24] for concomitants of GOS. Using (7), they proved that the Shannon entropy of *Y*[*r*], the concomitant of the r-th generalized order statistic, is:

$$H^S(Y_{[r]}^*) = \mathcal{W}(r, a, n, m, k) + H^S(Y)(1 - aC_r^*) - 2aC_r^*\phi_f(u),\tag{13}$$

where

$$\mathcal{W}(r,a,n,m,k) = \frac{1}{4aC_r^*} \left\{ (1 - aC_r^*)^2 \log(1 - aC_r^*) - (1 + aC_r^*)^2 \log(1 + aC_r^*) \right\} + \frac{1}{2},\tag{14}$$

and

$$\phi_f(u) = \int_0^1 u \log f_Y(F_Y^{-1}(u))\, du. \tag{15}$$
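The following sketch (ours, not from [23,24]) checks (13)–(15) numerically for a standard exponential *Y*-marginal. It assumes a particular parameterization of the concomitant density of the r-th GOS from the FGM family (spelled out in the code comments); if (7) uses the opposite sign convention for the coefficient denoted C_r^*, its sign should be flipped below.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative GOS/FGM parameters
a, n, m, k, r = 0.5, 6, 0.0, 1.0, 2

# Assumed parameterization: gamma_i = k + (n - i)(m + 1),
# C_r^* = 1 - 2 * prod(gamma_i / (gamma_i + 1)), and concomitant density
# g(y) = f_Y(y) * [1 + a * C_r^* * (2 F_Y(y) - 1)].
gammas = np.array([k + (n - i) * (m + 1) for i in range(1, r + 1)])
C = 1 - 2 * np.prod(gammas / (gammas + 1))

fY = lambda y: np.exp(-y)                 # standard exponential marginal for Y
FY = lambda y: 1 - np.exp(-y)
g = lambda y: fY(y) * (1 + a * C * (2 * FY(y) - 1))

# Left-hand side of (13): direct numerical entropy of the concomitant
H_direct, _ = quad(lambda y: -g(y) * np.log(g(y)), 0, np.inf)

# Right-hand side of (13), assembled from (14) and (15)
HY, _ = quad(lambda y: -fY(y) * np.log(fY(y)), 0, np.inf)               # H^S(Y)
phi_f, _ = quad(lambda u: u * np.log(fY(-np.log(1 - u))), 0, 1)         # (15)
W = (1 / (4 * a * C)) * ((1 - a * C) ** 2 * np.log(1 - a * C)
                         - (1 + a * C) ** 2 * np.log(1 + a * C)) + 0.5  # (14)
H_formula = W + HY * (1 - a * C) - 2 * a * C * phi_f                    # (13)

print(H_direct, H_formula)    # the two values should agree
```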

**Remark 3.** *The properties of* (13) *are analyzed in [23] in the particular case m* = 0 *and k* = 1*, i.e., when the GOS reduce to the usual order statistics; in this case,* (13) *becomes the Shannon entropy of the concomitant of the r-th order statistic:*

$$H^S(Y_{[r]}) = \mathcal{W}(r, a, n, 0, 1) + H^S(Y) \left(1 + a \frac{n - 2r + 1}{n + 1} \right) + 2a \frac{n - 2r + 1}{n + 1} \phi_f(u) \tag{16}$$

*In [24], the Shannon entropy of the concomitant of the r-th record value is also given:*

$$H^S(R_{[r]}) = \mathcal{W}(r, a, n, -1, 1) + H^S(Y)(1 + a(2^{1-r} - 1)) + 2a(2^{1-r} - 1)\phi_f(u). \tag{17}$$

*In the case of progressive Type-II censored order statistics with an equi-balanced censoring scheme, the Shannon entropy of the concomitant of the r-th order statistic from the FGM family is* (13) *with m* = *R, the removal number.*
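As a consistency check (ours), with the parameterization of C_r^* assumed in the sketch above, the two special cases follow from γ_i = n − i + 1 (order statistics) and γ_i = 1 (record values):

$$C_r^*\Big|_{m=0,\,k=1} = 1 - \frac{2(n-r+1)}{n+1} = -\frac{n-2r+1}{n+1}, \qquad C_r^*\Big|_{m=-1,\,k=1} = 1 - 2^{1-r} = -(2^{1-r}-1),$$

so the coefficients of the entropy of Y and of the integral term in (13) turn into those appearing in (16) and (17).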

In 1987, Awad [25] noticed that Shannon entropy, in the continuous case, is not preserved under linear transformations, and proposed the following entropy, also known in the literature as sup-entropy:

$$H^{SA}(X) = -E\left[\log\frac{f(X)}{\delta}\right] \tag{18}$$

where *δ* = sup{*f*(*x*) | *x* ∈ ℝ}. We will call this entropy **Shannon–Awad entropy**.
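A small numerical illustration (ours): for an exponential density with rate *λ*, *δ* = sup *f* = *λ*, and (18) evaluates to 1 for every *λ*, so, unlike the Shannon entropy (12), the value is unchanged by rescaling.

```python
import numpy as np
from scipy.integrate import quad

def awad_entropy(pdf, delta, lower, upper):
    """Shannon-Awad (sup-)entropy H^SA(X) = -E[log(f(X)/delta)], delta = sup f."""
    value, _ = quad(lambda x: -pdf(x) * np.log(pdf(x) / delta), lower, upper)
    return value

for lam in (0.5, 1.0, 4.0):
    f = lambda x, lam=lam: lam * np.exp(-lam * x)   # pdf of Exp(lam), sup f = lam
    print(lam, awad_entropy(f, lam, 0, np.inf))     # prints 1.0 for every rate
```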

**Residual and past Shannon entropies** were defined in the context of reliability, where they measure the amount of information contained in the residual life or the past life of a unit. In the following, the random variable *X*, with pdf *f*, cdf *F*, and survival function *F̄*, is considered positive and represents the lifetime of a unit.

Residual entropy was introduced, and its properties analyzed, in the works of Ebrahimi [26] and Ebrahimi and Pellerey [27]. It is based on the idea of measuring the expected uncertainty contained in the conditional density of *X* − *t* given *X* > *t* [27]:

$$H^S(X;t) = -E\left[\log\frac{f(X)}{\overline{F}(t)} \Big| X > t\right].\tag{19}$$

In terms of failure rate, the residual entropy can be written as:

$$H^S(X;t) = 1 - E[\log \lambda_F(X) \mid X > t],$$

where *λF*(·) = *f*(·)/*F̄*(·) is the failure rate function.
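For intuition, a short sketch (ours): for the memoryless exponential distribution, the residual entropy (19) does not depend on *t* and stays equal to the full entropy 1 − log *λ*; the snippet computes it both from (19) and from the failure-rate representation, where the failure rate is the constant *λ*.

```python
import numpy as np
from scipy.integrate import quad

lam, t = 2.0, 1.5
f = lambda x: lam * np.exp(-lam * x)      # pdf of Exp(lam)
S = lambda x: np.exp(-lam * x)            # survival function

# Residual entropy via (19): -E[log(f(X)/S(t)) | X > t]
H_res, _ = quad(lambda x: -(f(x) / S(t)) * np.log(f(x) / S(t)), t, np.inf)

# Failure-rate form: 1 - E[log lambda_F(X) | X > t], with lambda_F = f / S
E_log_rate, _ = quad(lambda x: (f(x) / S(t)) * np.log(f(x) / S(x)), t, np.inf)
H_fr = 1 - E_log_rate

print(H_res, H_fr, 1 - np.log(lam))       # all three values coincide
```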

Similar to the definition of residual entropy, Di Crescenzo and Longobardi [28] introduced past entropy as its dual. Past entropy measures the uncertainty about the past life of a failed unit:

$$\overline{H}^S(X;t) = -E\left[\log \frac{f(X)}{F(t)} \Big| X < t\right].\tag{20}$$

In terms of reversed failure rate, past entropy can be written as:

$$\overline{H}^S(X;t) = 1 - E\left[\log \tau_F(X) \mid X < t\right],$$

where *τF*(·) = *f*(·)/*F*(·) is the reversed failure rate function.
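Analogously, a short check (ours): for *X* uniform on (0, 1), the past life given *X* < *t* is uniform on (0, *t*), so (20) should return log *t*; the reversed-failure-rate form gives the same value.

```python
import numpy as np
from scipy.integrate import quad

t = 0.4                                    # inspection time, X ~ Uniform(0, 1)
f = lambda x: 1.0                          # pdf on (0, 1)
F = lambda x: x                            # cdf on (0, 1)

# Past entropy via (20): -E[log(f(X)/F(t)) | X < t]
H_past, _ = quad(lambda x: -(f(x) / F(t)) * np.log(f(x) / F(t)), 0, t)

# Reversed-failure-rate form: 1 - E[log tau_F(X) | X < t], with tau_F = f / F
E_log_rrate, _ = quad(lambda x: (f(x) / F(t)) * np.log(f(x) / F(x)), 0, t)
H_rfr = 1 - E_log_rrate

print(H_past, H_rfr, np.log(t))            # all three values coincide
```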

Residual and past entropies for concomitants of GOS from the FGM family were determined by Mohie El-Din et al. in [29]. They also considered concomitants of other types of GOS, but the form of the entropies is similar. The residual entropy of the concomitant of the r-th GOS from the FGM family is [29]:

$$\begin{split} H^S(Y_{[r]}^*;t) = \log \overline{G}_{[r]}(t) - \frac{1}{\overline{G}_{[r]}(t)} \Big\{ (1 - aC_r^*) \big[\overline{F}_Y(t)(\log \overline{F}_Y(t) - H^S(Y;t))\big] + {} \\ {} + 2aC_r^*\phi_f(t) + K_1(r,t,a,n,m,k) \Big\}, \end{split} \tag{21}$$

where

$$\begin{split} K_1(r,t,a,n,m,k) = \frac{1}{2aC_r^*} \Big\{ -\frac{1}{4}\big[(1+aC_r^*)^2 - (1+aC_r^*(2F_Y(t)-1))^2\big] + {} \\ {} + \frac{1}{2}\big[(1+aC_r^*)^2 \log(1+aC_r^*) - (1+aC_r^*(2F_Y(t)-1))^2 \log(1+aC_r^*(2F_Y(t)-1))\big] \Big\}, \end{split} \tag{22}$$

and

$$\phi_f(t) = \int_t^{\infty} F_Y(y) f_Y(y) \log f_Y(y) dy. \tag{23}$$

We notice that for *t* = 0, the residual entropy (21) becomes the entropy (13).
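The sketch below (ours, under the same assumed parameterization of C_r^* and of the concomitant density as in the earlier sketch) evaluates (21)–(23) for the exponential example and compares the result with a direct numerical computation of the residual entropy of the concomitant; setting t = 0 reproduces (13), as noted above.

```python
import numpy as np
from scipy.integrate import quad

a, n, m, k, r = 0.5, 6, 0.0, 1.0, 2
gammas = np.array([k + (n - i) * (m + 1) for i in range(1, r + 1)])
C = 1 - 2 * np.prod(gammas / (gammas + 1))      # assumed C_r^* parameterization

fY = lambda y: np.exp(-y)                       # standard exponential marginal
FY = lambda y: 1 - np.exp(-y)
SY = lambda y: np.exp(-y)                       # survival function of Y

g = lambda y: fY(y) * (1 + a * C * (2 * FY(y) - 1))   # assumed concomitant density
Gbar = lambda t: quad(g, t, np.inf)[0]                # survival function of Y_[r]

def residual_entropy_direct(t):
    """Residual entropy of the concomitant by direct integration of (19)."""
    gb = Gbar(t)
    return quad(lambda y: -(g(y) / gb) * np.log(g(y) / gb), t, np.inf)[0]

def residual_entropy_formula(t):
    """Residual entropy of the concomitant assembled from (21)-(23)."""
    gb = Gbar(t)
    HYt = quad(lambda y: -(fY(y) / SY(t)) * np.log(fY(y) / SY(t)), t, np.inf)[0]  # H^S(Y;t)
    phi = quad(lambda y: FY(y) * fY(y) * np.log(fY(y)), t, np.inf)[0]             # (23)
    v = 1 + a * C * (2 * FY(t) - 1)
    K1 = (1 / (2 * a * C)) * (-0.25 * ((1 + a * C) ** 2 - v ** 2)
          + 0.5 * ((1 + a * C) ** 2 * np.log(1 + a * C) - v ** 2 * np.log(v)))    # (22)
    return np.log(gb) - (1 / gb) * ((1 - a * C) * SY(t) * (np.log(SY(t)) - HYt)
                                    + 2 * a * C * phi + K1)

for t in (0.0, 0.7):
    print(t, residual_entropy_direct(t), residual_entropy_formula(t))   # columns agree
```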

**Remark 4.** *For m* = 0 *and k* = 1*, we obtain the residual Shannon entropy of the concomitant of the r-th order statistic:*

$$\begin{split} H^S(Y_{[r]};t) = \log \overline{G}_{[r]}(t) - \frac{1}{\overline{G}_{[r]}(t)} \Big\{ \Big(1 + a \frac{n - 2r + 1}{n + 1} \Big) \big[\overline{F}_Y(t)(\log \overline{F}_Y(t) - H^S(Y;t))\big] - {} \\ {} - 2a \frac{n - 2r + 1}{n + 1} \phi_f(t) + K_1(r, t, a, n, 0, 1) \Big\}. \end{split} \tag{24}$$

*For m* = −1 *and k* = 1*, we obtain the residual Shannon entropy of the concomitant of the r-th record value:*

$$\begin{split} H^S(R_{[r]};t) = \log \overline{G}_{[r]}(t) - \frac{1}{\overline{G}_{[r]}(t)} \Big\{ (1 + a(2^{1-r}-1)) \big[\overline{F}_Y(t)(\log \overline{F}_Y(t) - H^S(Y;t))\big] - {} \\ {} - 2a(2^{1-r}-1)\phi_f(t) + K_1(r,t,a,n,-1,1) \Big\}. \end{split} \tag{25}$$

*In the case of progressive Type-II censored order statistics with an equi-balanced censoring scheme, the residual Shannon entropy of the concomitant of the r-th order statistic from the FGM family is* (21) *with m* = *R, the removal number.*

In a similar way, the past entropy for the concomitant of the r-th GOS from the FGM family is defined as [29]:

$$\begin{split} \overline{H}^S(Y_{[r]}^*;t) = \log G_{[r]}(t) - \frac{1}{G_{[r]}(t)} \Big\{ (1 - aC_r^*) \big[F_Y(t)(\log F_Y(t) - \overline{H}^S(Y;t))\big] + {} \\ {} + 2aC_r^*\overline{\phi}_f(t) + K_2(r,t,a,n,m,k) \Big\}, \end{split} \tag{26}$$

where

$$\begin{split} K_2(r,t,a,n,m,k) = \frac{1}{2aC_r^*} \Big\{ -\frac{1}{4}\big[(1 + aC_r^*(2F_Y(t) - 1))^2 - (1 - aC_r^*)^2\big] + {} \\ {} + \frac{1}{2}\big[(1 + aC_r^*(2F_Y(t) - 1))^2 \log(1 + aC_r^*(2F_Y(t) - 1)) - (1 - aC_r^*)^2 \log(1 - aC_r^*)\big] \Big\}, \end{split} \tag{27}$$

and

$$\overline{\phi}_f(t) = \int_0^t F_Y(y) f_Y(y) \log f_Y(y) dy. \tag{28}$$

We notice that for *t* → ∞, the past entropy (26) becomes the entropy (13).
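A short verification of this limit (ours): as t → ∞, we have F_Y(t) → 1, G_[r](t) → 1, the past entropy of Y tends to the full entropy of Y, and (28) tends to (15), while (27) gives

$$\lim_{t\to\infty} K_2(r,t,a,n,m,k) = \frac{1}{2aC_r^*}\Big\{\frac{1}{2}\big[(1+aC_r^*)^2\log(1+aC_r^*) - (1-aC_r^*)^2\log(1-aC_r^*)\big] - aC_r^*\Big\} = -\mathcal{W}(r,a,n,m,k),$$

so the right-hand side of (26) tends to the right-hand side of (13).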

#### *3.2. Tsallis and Tsallis-Related Entropies*

Tsallis entropy was first introduced and used in the context of cybernetics theory by Havrda and Charvát [30], but it became widely known after Tsallis defined it in 1988, in the context of thermodynamics, as a generalization of Boltzmann–Gibbs statistics [31]. Being the starting point of the field of non-extensive statistics, Tsallis entropy is a non-additive generalization of Shannon entropy, and for a continuous random variable *X* with density function *f*, it can be defined as:

$$H^T(X) = \frac{1}{q-1} \left\{ 1 - \int_{-\infty}^{+\infty} [f(x)]^q dx \right\}, \quad q > 0, \ q \neq 1. \tag{29}$$

When *q* → 1, Tsallis entropy reduces to Shannon entropy. Tsallis entropy has, in turn, various generalizations; see, for example, [32].
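A brief numerical illustration (ours): for an exponential density with rate *λ*, the integral in (29) has a simple closed form, and the resulting Tsallis entropy tends to the Shannon value 1 − log *λ* as *q* → 1.

```python
import numpy as np
from scipy.integrate import quad

def tsallis_entropy(pdf, q, lower, upper):
    """Tsallis entropy (29) by numerical integration."""
    integral, _ = quad(lambda x: pdf(x) ** q, lower, upper)
    return (1 - integral) / (q - 1)

lam = 2.0
f = lambda x: lam * np.exp(-lam * x)              # pdf of Exp(lam)

for q in (0.5, 0.999, 2.0):
    closed_form = (1 - lam ** (q - 1) / q) / (q - 1)
    print(q, tsallis_entropy(f, q, 0, np.inf), closed_form)

print(1 - np.log(lam))    # Shannon limit, approached as q -> 1
```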

Another important element in non-extensive statistics is the *q*-logarithm function:

$$\log_q x = \frac{x^{1-q} - 1}{1-q}, \quad x > 0, \ q \neq 1,\tag{30}$$

and Tsallis entropy can be obtained using this function in two ways:

$$H^T(X) = E\left[\log_q \frac{1}{f(X)}\right] = \frac{1}{q-1} E\left[1 - [f(X)]^{q-1}\right].$$
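Indeed, a one-line check of this identity using (30) (ours):

$$E\left[\log_q \frac{1}{f(X)}\right] = E\left[\frac{[f(X)]^{q-1}-1}{1-q}\right] = \frac{1}{q-1} E\left[1-[f(X)]^{q-1}\right] = \frac{1}{q-1}\left\{1-\int_{-\infty}^{+\infty}[f(x)]^q dx\right\},$$

where the last equality uses the fact that the expectation of [f(X)]^{q−1} equals the integral of [f(x)]^q.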

Tsallis entropy has applications in many fields, from statistical mechanics and thermodynamics, to image processing and reliability, sometimes being more suited to measuring uncertainty than classical Shannon entropy [33,34].

In [35], Tsallis entropy was computed and its properties obtained for record values and their concomitants when the bivariate distribution is in the FGM family.

Similar to the Shannon case, residual and past variants of Tsallis entropy can be considered in the context of reliability. In [36], Nanda and Paul introduced the **residual Tsallis entropy** as the 'first kind residual entropy of order *β*'; in our notation, *β* is *q*:

$$H^T(X; t) = \frac{1}{q - 1} \left\{ 1 - \int_t^\infty \left[ \frac{f(x)}{\overline{F}(t)} \right]^q dx \right\} = \frac{1}{q - 1} \left\{ 1 - \frac{1}{[\overline{F}(t)]^q} \int_t^\infty [f(x)]^q dx \right\}.\tag{31}$$
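As in the Shannon case, a quick sketch (ours): for the memoryless exponential distribution, (31) does not depend on *t*, since the factor depending on *t* cancels between the integral and the normalizing power of the survival function.

```python
import numpy as np
from scipy.integrate import quad

lam, q = 2.0, 1.7
f = lambda x: lam * np.exp(-lam * x)       # pdf of Exp(lam)
S = lambda x: np.exp(-lam * x)             # survival function

def residual_tsallis(t):
    """Residual Tsallis entropy (31) by numerical integration."""
    integral, _ = quad(lambda x: f(x) ** q, t, np.inf)
    return (1 - integral / S(t) ** q) / (q - 1)

for t in (0.0, 1.0, 3.0):
    print(t, residual_tsallis(t))          # constant in t for the exponential
```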

In addition to entropy-type information measures, there are two other types of information measures that can be associated with probability distributions: Fisher measures and divergence measures [37].
