**Appendix A. Specificity as a Measure of Information Content**

This appendix recaps specificity as a measure of the information content of a possibility distribution. Specificity has been mathematically defined by ZADEH [44], DUBOIS et al. [49], and MAURIS et al. [78] as a relative quantity between two information items (*π*<sub>1</sub> is more specific than *π*<sub>2</sub> if ∀*x* ∈ *X* : *π*<sub>1</sub>(*x*) < *π*<sub>2</sub>(*x*)). Absolute measures of specificity have been formalised by YAGER [51–53] as well as HIGASHI and KLIR [86,87].

A specificity measure *spec*(*π*) ∈ [0, 1] must satisfy four conditions:

The measure of possibilistic specificity is a counterpart of Shannon's probabilistic entropy [45,86].

A measure of specificity for a real-valued, continuous frame of discernment is given by YAGER [51–53]:

$$\text{spec}(\pi) = a_{\max} - \frac{1}{x_b - x_a} \cdot \int_0^{a_{\max}} \left( \max_{x \in A_a} x - \min_{x \in A_a} x \right) \mathrm{d}a \tag{A1}$$

with *a*<sub>max</sub> = max<sub>*x* ∈ *X*</sub> *π*(*x*), *A*<sub>*a*</sub> = {*x* ∈ *X* : *π*(*x*) ≥ *a*} being the *a*-cut of *π*, and *x*<sub>a</sub> and *x*<sub>b</sub> being the borders of *X* (*X* = [*x*<sub>a</sub>, *x*<sub>b</sub>]). For (A1), it is proven by YAGER [51–53] that the measure satisfies the four requirements for specificity measures. The integral in (A1) over the widths of the *a*-cuts is equivalent to the area under *π* [50]. Therefore, (A1) is equal to

$$\begin{split} \text{spec}(\pi) &= a_{\max} - \frac{1}{x_b - x_a} \cdot \int_{x_a}^{x_b} \pi(x) \, \mathrm{d}x \\ &= \max_{x \in X} \pi(x) - \frac{1}{x_b - x_a} \cdot \int_{x_a}^{x_b} \pi(x) \, \mathrm{d}x. \end{split} \tag{A2}$$
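As a numeric illustration of (A2), the following sketch (an illustrative Python script; the names `specificity` and `tri` are hypothetical and not part of the original text) approximates *spec*(*π*) for a triangular possibility distribution on *X* = [0, 10]:

```python
import numpy as np

def specificity(pi, x_a, x_b, n=100_001):
    """Approximate spec(pi) per (A2): the maximum of pi over X minus
    its mean value over [x_a, x_b], via a trapezoidal quadrature."""
    x = np.linspace(x_a, x_b, n)
    p = pi(x)
    # trapezoidal approximation of the integral of pi over X
    area = np.sum((p[:-1] + p[1:]) / 2 * np.diff(x))
    return p.max() - area / (x_b - x_a)

# Triangular possibility distribution: core {5}, support [3, 7]
tri = lambda x: np.clip(1 - np.abs(x - 5) / 2, 0.0, 1.0)

s = specificity(tri, 0.0, 10.0)
# Analytically: a_max = 1, area = (1/2) * 4 * 1 = 2, so spec = 1 - 2/10 = 0.8
```

Consistent with the specificity axioms, a narrower distribution yields a value closer to 1, while the totally ignorant distribution *π*(*x*) ≡ 1 yields *spec*(*π*) = 1 − (*x*<sub>b</sub> − *x*<sub>a</sub>)/(*x*<sub>b</sub> − *x*<sub>a</sub>) = 0.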
