## *3.3. Fisher Information Number*

Fisher information measures the amount of information that a sample provides about an unknown parameter and therefore quantifies the uncertainty about an unknown characteristic of a population. If the parameter is a location parameter, then the Fisher information is shift-invariant and has the form:

$$I\_f = \int\_{-\infty}^{+\infty} \left(\frac{\partial}{\partial x} \log f(x)\right)^2 f(x)\, dx = E \left[ \left(\frac{\partial}{\partial x} \log f(X)\right)^2 \right]. \tag{32}$$

Shift-invariant Fisher information, also called the Fisher Information Number (FIN), was studied in [38]. It has applications in statistical physics, where it is also known as extreme physical information [39], and it is used in analyzing the evolution of dynamical systems.
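As a purely illustrative sketch (not part of the original text), the FIN in (32) can be approximated numerically; the helper name `fisher_information_number` and the choice of a normal density are our own assumptions, used only because the normal case has the known value 1/*σ*<sup>2</sup>.

```python
import numpy as np
from scipy import integrate, stats

# Hedged illustration: numerical evaluation of the FIN in Eq. (32),
# I_f = E[(d/dx log f(X))^2], for a density f of a location family.
# The helper name and the normal density are assumptions of this sketch.
def fisher_information_number(pdf, score):
    integrand = lambda x: score(x) ** 2 * pdf(x)
    value, _ = integrate.quad(integrand, -np.inf, np.inf)
    return value

sigma = 2.0
f = stats.norm(loc=0.0, scale=sigma)
score = lambda x: -x / sigma**2        # d/dx log f(x) for N(0, sigma^2)

print(fisher_information_number(f.pdf, score))   # ~0.25, i.e. 1/sigma^2
```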

For the concomitants of GOS from the FGM family, the Fisher information number was determined in [40].

## *3.4. Divergence Measures*

Divergences are useful tools when a measure of the difference between two probability distributions is needed and therefore they have applications in various fields, from inference for Markov chains [41–43] to machine learning [44,45]. One of the best-known divergences is the Kullback–Leibler divergence [46,47], which for two continuous random variables *Z*<sub>1</sub>, with probability density *f*<sub>1</sub>, and *Z*<sub>2</sub>, with probability density *f*<sub>2</sub>, is:

$$\text{KLD}(Z\_1, Z\_2) = \int\_{-\infty}^{+\infty} f\_1(z) \log \left( \frac{f\_1(z)}{f\_2(z)} \right) dz. \tag{33}$$
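As a minimal numerical sketch (not part of the original text), the integral in (33) can be checked for two normal densities against the well-known Gaussian closed form; the helper name `kld`, the chosen densities, and the finite integration bounds are assumptions made only for this illustration.

```python
import numpy as np
from scipy import integrate, stats

# Hedged sketch: numerical evaluation of KLD(Z1, Z2) from Eq. (33).
# The normal densities and the closed-form check are illustrative choices.
def kld(f1, f2, lower, upper):
    integrand = lambda z: f1(z) * np.log(f1(z) / f2(z))
    value, _ = integrate.quad(integrand, lower, upper)
    return value

Z1 = stats.norm(loc=0.0, scale=1.0)   # f1
Z2 = stats.norm(loc=1.0, scale=2.0)   # f2

# Finite bounds where both densities are numerically negligible,
# chosen to avoid 0*log(0) underflow issues far in the tails.
numeric = kld(Z1.pdf, Z2.pdf, -15.0, 15.0)

# Closed form for KL(N(m1, s1^2) || N(m2, s2^2))
closed = np.log(2.0 / 1.0) + (1.0 + (0.0 - 1.0) ** 2) / (2 * 2.0 ** 2) - 0.5
print(numeric, closed)                # both ~0.4431
```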

The Kullback–Leibler divergence for the concomitants of GOS from the FGM family was computed in [24]; the result is distribution-free.

One of the generalizations of the Kullback–Leibler divergence is the Tsallis divergence, which extends the Kullback–Leibler divergence in the same way in which Tsallis entropy extends Shannon entropy. There is a very rich literature on the Tsallis divergence, or Tsallis relative entropy, for discrete distributions; see, for example, [48–50]. The Tsallis divergence for continuous distributions appears less frequently in the literature, being studied mainly in the machine-learning context [44,45]. The Tsallis divergence for the concomitants is determined in the next section of this paper.
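For orientation only (this display is not part of the original text), a form of the Tsallis divergence of order *q* that is standard in the literature for continuous distributions, and that recovers the Kullback–Leibler divergence (33) in the limit *q* → 1, is

$$D\_q(Z\_1, Z\_2) = \frac{1}{q-1} \left( \int\_{-\infty}^{+\infty} f\_1(z)^q\, f\_2(z)^{1-q}\, dz - 1 \right), \quad q > 0, \; q \neq 1;$$

the precise definition used for the concomitants is the one adopted in the next section.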

#### **4. Information Measures for the Concomitants from the FGM Family: New Results**

In this section, we provide some generalizations of the existing results on information measures for the concomitants of GOS from the FGM family that were reviewed in the previous section. We are interested in Awad-type extensions of the entropies, in the residual and past Tsallis entropies, in a Tsallis-type extension of the FIN, and in the Tsallis divergence.
