*4.4. Tsallis Divergence*

We consider **Tsallis divergence** for two densities, *f*<sub>1</sub> and *f*<sub>2</sub>, as it is defined in [44]:

$$\operatorname{TD}(Z_1, Z_2) = -\int_{-\infty}^{+\infty} f_1(z) \log_q \left( \frac{f_2(z)}{f_1(z)} \right) dz, \tag{75}$$

which, since log*q*(*x*) = (*x*<sup>1−*q*</sup> − 1)/(1 − *q*), can also be expressed as:

$$\text{TD}(Z_1, Z_2) = \frac{1}{q-1} E_{f_1} \left[ \left( \frac{f_1(z)}{f_2(z)} \right)^{q-1} - 1 \right]. \tag{76}$$

We note that this is the divergence considered in [36], with *φ*(*x*) = log*q*(1/*x*); the divergence analyzed in [56], with *k* = 1 − *q*; and the Tsallis relative entropy from [57].

When *q* → 1, Tsallis entropy reduces to Shannon entropy and, likewise, Tsallis divergence reduces to the Kullback–Leibler divergence (33). The next theorem generalizes the results from [23] by computing the Tsallis divergence for two concomitants of GOS from the FGM family.
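This limit can be illustrated numerically. The sketch below is not part of the paper: the exponential densities and their rates are illustrative choices, and the expectation form of the Tsallis divergence is evaluated by quadrature, using the 1/(*q* − 1) normalization under which the *q* → 1 limit is the Kullback–Leibler divergence.

```python
# Numerical illustration (not from the paper): the Tsallis divergence of two
# exponential densities tends to their Kullback-Leibler divergence as q -> 1.
import numpy as np
from scipy.integrate import quad

def tsallis_div(f1, f2, q, lo, hi):
    """Tsallis divergence with the 1/(q-1) normalization:
    (1/(q-1)) * (E_{f1}[(f1/f2)^(q-1)] - 1)."""
    e, _ = quad(lambda z: f1(z) * (f1(z) / f2(z)) ** (q - 1), lo, hi)
    return (e - 1.0) / (q - 1.0)

# Exponential densities with rates 1 and 2 (an illustrative choice).
f1 = lambda z: np.exp(-z)
f2 = lambda z: 2.0 * np.exp(-2.0 * z)
kl = np.log(1.0 / 2.0) + 2.0 / 1.0 - 1.0  # closed-form KL(f1 || f2)

for q in (1.5, 1.1, 1.01, 1.001):
    print(q, tsallis_div(f1, f2, q, 0.0, 60.0))  # values approach kl
```

As *q* decreases toward 1, the printed values approach KL(*f*<sub>1</sub>‖*f*<sub>2</sub>) = log(*λ*<sub>1</sub>/*λ*<sub>2</sub>) + *λ*<sub>2</sub>/*λ*<sub>1</sub> − 1 ≈ 0.3069.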

**Theorem 7.** *Let Y*[*r*] *and Y*[*s*] *be the r-th and s-th concomitants of the GOS from the FGM family, with densities g*[*r*] *and g*[*s*]*. Then, the Tsallis divergence of g*[*s*] *from g*[*r*] *has the following form:*

$$TD(Y_{[r]}, Y_{[s]}) = \frac{1}{q-1} [D_1 - D_2 - 1],\tag{77}$$

*where*

$$\begin{split} D_{1} &= \frac{1}{2C_{r}^{*}\alpha(q+1)} \left( \frac{C_{r}^{*}}{C_{r}^{*} - C_{s}^{*}} \right)^{q-1} (1 + C_{r}^{*}\alpha)^{q+1} \, {}_{2}F_{1}\left( q-1, q+1; q+2; -\frac{C_{s}^{*} (1 + C_{r}^{*}\alpha)}{C_{r}^{*} - C_{s}^{*}} \right), \\ D_{2} &= \frac{1}{2C_{r}^{*}\alpha(q+1)} \left( \frac{C_{r}^{*}}{C_{r}^{*} - C_{s}^{*}} \right)^{q-1} (1 - C_{r}^{*}\alpha)^{q+1} \, {}_{2}F_{1}\left( q-1, q+1; q+2; -\frac{C_{s}^{*} (1 - C_{r}^{*}\alpha)}{C_{r}^{*} - C_{s}^{*}} \right), \end{split}$$

*with* <sub>2</sub>*F*<sub>1</sub> *being the Gauss hypergeometric function.*

**Proof.** In (76), we replace *f*<sub>1</sub> and *f*<sub>2</sub> with the densities of the concomitants:

$$g_{[r]}(y) = f_Y(y)[1 + C_r^{*} \alpha(2F_Y(y) - 1)],$$

$$g_{[s]}(y) = f_Y(y)[1 + C_s^{*} \alpha(2F_Y(y) - 1)],$$

and we compute the expectation:

$$E_{g_{[r]}}\left[\left(\frac{g_{[r]}(Y)}{g_{[s]}(Y)}\right)^{q-1}\right] = \int_{-\infty}^{+\infty} g_{[r]}(y) \left(\frac{g_{[r]}(y)}{g_{[s]}(y)}\right)^{q-1} dy \tag{78}$$

$$= \int_{-\infty}^{+\infty} f_Y(y) [1 + C_r^{*} \alpha(2F_Y(y) - 1)] \left(\frac{f_Y(y)[1 + C_r^{*} \alpha(2F_Y(y) - 1)]}{f_Y(y)[1 + C_s^{*} \alpha(2F_Y(y) - 1)]}\right)^{q-1} dy.$$

First, we make the transformation:

$$F_Y(y) = u, \quad y = F_Y^{-1}(u), \quad f_Y(y)\, dy = du,$$

and we obtain

$$E_{g_{[r]}}\left[\left(\frac{g_{[r]}(Y)}{g_{[s]}(Y)}\right)^{q-1}\right] = \int_0^1 \frac{[1+C_r^{*} \alpha(2u-1)]^q}{[1+C_s^{*} \alpha(2u-1)]^{q-1}}\, du. \tag{79}$$

Then, we make the transformation:

$$2u - 1 = v, \; u = (v+1)/2, \; 2du = dv,$$

and we obtain:

$$E_{g_{[r]}}\left[\left(\frac{g_{[r]}(Y)}{g_{[s]}(Y)}\right)^{q-1}\right] = \frac{1}{2}\int_{-1}^{1} \frac{(1+C_r^{*} \alpha v)^q}{(1+C_s^{*} \alpha v)^{q-1}}\, dv. \tag{80}$$

We use the general formula:

$$\frac{1}{2} \int_{-1}^{1} \frac{(1+ax)^{A}}{(1+bx)^{B}}\, dx = \frac{1}{2a(A+1)} \left(\frac{a}{a-b}\right)^{B} \left[ (1+a)^{A+1}\, {}_{2}F_{1}\left( B, A+1; A+2; -\frac{b(1+a)}{a-b} \right) - (1-a)^{A+1}\, {}_{2}F_{1}\left( B, A+1; A+2; -\frac{b(1-a)}{a-b} \right) \right], \tag{81}$$

where <sub>2</sub>*F*<sub>1</sub> is the Gauss hypergeometric function; the formula follows by substituting *t* = 1 + *ax* and integrating term by term. Applying it with *A* = *q*, *B* = *q* − 1, *a* = *C<sub>r</sub>*<sup>\*</sup>*α*, and *b* = *C<sub>s</sub>*<sup>\*</sup>*α*, it results in:

$$E_{g_{[r]}}\left[\left(\frac{g_{[r]}(Y)}{g_{[s]}(Y)}\right)^{q-1}\right] = D_1 - D_2.$$

Replacing this expectation in (76) yields (77). □
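The closed-form evaluation of such integrals via the hypergeometric function can be sanity-checked numerically. The sketch below (illustrative parameter values, not from the paper) compares direct quadrature of the generic integral with parameters *a*, *b*, *A*, *B* against an equivalent expression in terms of the Gauss hypergeometric function, obtained from the substitution *t* = 1 + *ax*.

```python
# Numerical check (illustrative, not from the paper): direct quadrature of
# (1/2) * int_{-1}^{1} (1+ax)^A / (1+bx)^B dx against a closed form in 2F1.
import numpy as np
from scipy.integrate import quad
from scipy.special import hyp2f1

def lhs(a, b, A, B):
    """Direct quadrature of the integral."""
    val, _ = quad(lambda x: (1 + a * x) ** A / (1 + b * x) ** B, -1.0, 1.0)
    return 0.5 * val

def rhs(a, b, A, B):
    """Closed form via t = 1 + ax and the Euler-type integral for 2F1."""
    pref = (a / (a - b)) ** B / (2.0 * a * (A + 1.0))
    t1 = (1 + a) ** (A + 1) * hyp2f1(B, A + 1, A + 2, -b * (1 + a) / (a - b))
    t2 = (1 - a) ** (A + 1) * hyp2f1(B, A + 1, A + 2, -b * (1 - a) / (a - b))
    return pref * (t1 - t2)

# A = q and B = q - 1; a, b play the roles of C_r* alpha and C_s* alpha.
for a, b, q in [(0.5, 0.3, 1.7), (0.4, -0.2, 2.5)]:
    print(lhs(a, b, q, q - 1) - rhs(a, b, q, q - 1))  # differences ~ 0
```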

#### **5. Conclusions**

This paper focuses on information measures related to Shannon entropy, Tsallis entropy, Fisher information, and divergences for the concomitants of GOS from the FGM family. We review the literature on these information measures and generalize existing results. The study of concomitants, the pairs of order statistics in a sample from a bivariate distribution ordered by one variate, could have applications in reliability, for example, in the analysis of the lifetime uncertainty of complex systems. For this reason, we also discuss residual and past versions of the entropies. Considering generalized order statistics (GOS) increases the complexity of the computations, but it yields a general form of the computed measures that can be applied to the concomitants of various order statistics.

**Author Contributions:** All authors contributed equally to the paper. All authors have read and agreed to the published version of the manuscript.

**Funding:** This research received no external funding.

**Institutional Review Board Statement:** Not applicable.

**Data Availability Statement:** Not applicable.

**Acknowledgments:** The authors are grateful to the referees for their comments and suggestions which improved the presentation of this paper.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **References**


**Andres Dajles \*,†, Joseph Cavanaugh †**

**\*** Correspondence: andres-dajles@uiowa.edu

† These authors contributed equally to this work.

**Abstract:** When choosing between two candidate models, classical hypothesis testing presents two main limitations: first, the models being tested have to be nested, and second, one of the candidate models must subsume the structure of the true data-generating model. Discrepancy measures have been used as an alternative method to select models without the need to rely upon the aforementioned assumptions. In this paper, we utilize a bootstrap approximation of the Kullback–Leibler discrepancy (BD) to estimate the probability that the fitted null model is closer to the underlying generating model than the fitted alternative model. We propose correcting for the bias of the BD estimator either by adding a bootstrap-based correction or by adding the number of parameters in the candidate model. We exemplify the effect of these corrections on the estimator of the discrepancy probability and explore their behavior in different model comparison settings.

**Keywords:** bootstrap discrepancy comparison probability (BDCP); discrepancy comparison probability (DCP); likelihood ratio test (LRT); model selection; *p*-value
