*3.1. Possibilistic Pooling Fusion*

Conjunctive and disjunctive fusion is most commonly performed using triangular norms (*t-norms*) and their counterparts, triangular conorms (*s-norms*)—both stemming from fuzzy set theory. Triangular norms and conorms are functions t, s : [0, 1] × [0, 1] → [0, 1] that satisfy the properties of commutativity, associativity, and monotonicity [56]. For t-norms, 1 is the identity element, i.e., t(*π*, 1) = *π*. For s-norms, 0 is the identity element, i.e., s(*π*, 0) = *π*. Examples of t-norms are the minimum and the product operator. An example of an s-norm is the maximum operator. Although t-norms and s-norms are defined as binary functions, they can be directly applied to multiple possibility distributions because of their commutativity and associativity.
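To make this concrete, a minimal Python sketch (illustrative only; `fuse`, `t_min`, `t_prod`, and `s_max` are names chosen here, not from the cited works) applies such norms pointwise to discrete possibility distributions, represented as lists of possibility degrees over a finite domain:

```python
# Illustrative sketch: pointwise application of t-norms / s-norms to
# discrete possibility distributions (lists of degrees over a finite domain X).
from functools import reduce

def t_min(a, b):   # minimum t-norm
    return min(a, b)

def t_prod(a, b):  # product t-norm
    return a * b

def s_max(a, b):   # maximum s-norm (dual of the minimum t-norm)
    return max(a, b)

def fuse(norm, *dists):
    """Apply a binary norm pointwise; associativity and commutativity
    make the order of the n distributions irrelevant."""
    return [reduce(norm, degrees) for degrees in zip(*dists)]

pi1 = [0.0, 0.5, 1.0, 0.5]
pi2 = [0.2, 1.0, 0.8, 0.0]

print(fuse(t_min, pi1, pi2))  # [0.0, 0.5, 0.8, 0.0]
print(fuse(s_max, pi1, pi2))  # [0.2, 1.0, 1.0, 0.5]
print(fuse(t_min, pi1, [1.0] * 4) == pi1)  # True: 1 is the identity of a t-norm
```

Because the norms are associative and commutative, `fuse` extends directly to any number of distributions.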

In conjunctive mode, it is presumed that sources agree at least partially about the possibility of alternatives, that is, their information items are at least partially consistent. Consistency within a group of information items **I** is defined as [4]

$$\mathbf{h}(\mathbf{I}) = \mathbf{h}(\pi\_1, \pi\_2, \dots, \pi\_n) \;= \max\_{\mathbf{x} \in \mathcal{X}} \left( \mathop{\mathbf{t}}\_{i \in \{1, \dots, n\}} (\pi\_i(\mathbf{x})) \right). \tag{2}$$

Partially agreeing sources are characterised by items with h(**I**) > 0—that is, their possibility distributions have overlapping support. Fully agreeing sources have items with h(**I**) = 1, i.e., their possibility distributions have overlapping cores. Conjunctive fusion of fully consistent information items is then achieved by directly applying a t-norm [48]:
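The consistency degree (2) is simply the height of the pointwise t-norm of all distributions. A minimal sketch with the minimum t-norm (illustrative; the function name is chosen here for exposition):

```python
# Sketch of Eq. (2) with the minimum t-norm; distributions are lists
# of possibility degrees over a finite domain X.

def consistency(*dists):
    """h(I): the height (maximum) of the pointwise minimum of all distributions."""
    return max(min(degrees) for degrees in zip(*dists))

pi1 = [0.0, 0.6, 1.0, 0.3]   # core at the third alternative
pi2 = [0.4, 1.0, 0.7, 0.0]   # core at the second alternative

print(consistency(pi1, pi2))                    # 0.7 -> partially consistent (h > 0)
print(consistency(pi1, [1.0, 0.0, 0.0, 0.0]))   # 0.0 -> fully inconsistent
```

Here the supports of `pi1` and `pi2` overlap (h > 0) but their cores do not (h < 1), so the sources agree only partially.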

$$\pi^{(\text{fu})}(\mathbf{x}) \, = \mathop{\mathbf{t}}\_{i \in \{1, \dots, n\}} (\pi\_i(\mathbf{x})). \tag{3}$$

As t-norms satisfy the strong zero preservation principle, i.e., *t*(*π*, 0) = 0, conjunctive fusion excludes all alternatives that at least one information source deems impossible. By eliminating alternatives, conjunctive fusion yields the most specific outcome. If information items are only partially consistent, however, fusion based on t-norms results in subnormal possibility distributions. Renormalising the resulting possibility distribution leads to

$$\pi^{(\text{fu})}(\mathbf{x}) \;= \; \frac{\mathop{\mathbf{t}}\_{i \in \{1, \ldots, n\}} (\pi\_i(\mathbf{x}))}{\mathbf{h}(\mathbf{I})}, \tag{4}$$

which is only defined if the sources do not disagree completely, i.e., if their information items are not fully inconsistent (h(**I**) = 0) [48].
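The conjunctive fusion (3) and its renormalised form (4) can be sketched as follows (illustrative Python with the minimum t-norm; the function name is an assumption of this sketch):

```python
# Sketch of Eqs. (3) and (4): min-based conjunctive fusion with
# optional renormalisation by the consistency degree h(I).

def conjunctive_fusion(*dists, normalise=True):
    fused = [min(degrees) for degrees in zip(*dists)]  # Eq. (3), minimum t-norm
    h = max(fused)                                     # h(I), Eq. (2)
    if not normalise:
        return fused
    if h == 0:
        raise ValueError("fully inconsistent items: Eq. (4) is undefined")
    return [v / h for v in fused]                      # Eq. (4)

pi1 = [0.0, 0.6, 1.0, 0.3]
pi2 = [0.4, 1.0, 0.7, 0.0]

print(conjunctive_fusion(pi1, pi2, normalise=False))  # [0.0, 0.6, 0.7, 0.0], subnormal
print(conjunctive_fusion(pi1, pi2))                   # renormalised: height is 1 again
```

Since `pi1` and `pi2` are only partially consistent (h = 0.7), the raw conjunction is subnormal and the division by h restores a normalised distribution.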

The disjunctive fusion is appropriate if information items are completely inconsistent, i.e., sources disagree, at least one of them is wrong in its assessment, and it is not known which one. The disjunctive fusion is given by applying an s-norm:

$$\pi^{(\text{fu})}(\mathbf{x}) \, = \mathop{\mathbf{s}}\_{i \in \{1, \dots, n\}} (\pi\_i(\mathbf{x})), \tag{5}$$

keeping all available information. In general, purely disjunctive fusion is not desirable because it yields minimally specific outcomes, but it is necessary when sources disagree.
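A minimal sketch of (5) with the maximum s-norm (illustrative; the function name is chosen here):

```python
# Sketch of Eq. (5): max-based disjunctive fusion keeps every
# alternative that any source considers possible.

def disjunctive_fusion(*dists):
    return [max(degrees) for degrees in zip(*dists)]

pi1 = [1.0, 0.5, 0.0, 0.0]   # source 1 favours the first alternative
pi2 = [0.0, 0.0, 0.4, 1.0]   # source 2 favours the last alternative

# The supports do not overlap (h = 0), so conjunctive fusion would be
# undefined; disjunctive fusion retains both candidate regions.
print(disjunctive_fusion(pi1, pi2))  # [1.0, 0.5, 0.4, 1.0]
```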

Trade-off fusion modes combine conjunctive and disjunctive fusion depending on what is known (or assumed) about the reliability of sources. Prominent fusion rules can be found in the paper of Dubois and Prade [4]. For this paper, the most important of these are fusion based on the most consistent subsets, quantified fusion, and adaptive fusion.

One prominent way to aggregate information in a two-step process is to search for *maximal consistent subsets* (MCS) [20,57]. These nonconflicting MCS are fused conjunctively prior to disjunctive fusion of intermediate results. Dubois et al. [58] proposed an algorithm that finds MCS with linear complexity. In this algorithm, all subsets of **I** with a consistency greater than or equal to *α* ∈ [0, 1] are clustered. Let **I**<sup>MCS</sup> ⊆ **I** denote MCS subsets; then MCS fusion is formalised for a possibilistic setting as [6]:

$$\pi^{(\text{fu})}(\mathbf{x}) = \max\_{\mathbf{I}^{\text{MCS}} \subseteq \mathbf{I}} \left( \mathop{\mathbf{t}}\_{I\_i \in \mathbf{I}^{\text{MCS}}} (\pi\_i(\mathbf{x})) \right). \tag{6}$$

Later advancements in MCS fusion were proposed in multiple works [59–61].
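A brute-force sketch of (6) may help to illustrate the two steps (conjunction within each MCS, disjunction across them). Note that this exhaustive enumeration is for exposition only; Dubois et al. [58] give a linear-time algorithm, and all names below are assumptions of this sketch:

```python
# Brute-force sketch of Eq. (6): min-fusion inside each maximal subset
# with consistency >= alpha, then max-fusion across the results.
from itertools import combinations

def consistency(dists):
    return max(min(degrees) for degrees in zip(*dists))

def mcs_fusion(dists, alpha=1.0):
    n = len(dists)
    consistent = [set(c)
                  for r in range(1, n + 1)
                  for c in combinations(range(n), r)
                  if consistency([dists[i] for i in c]) >= alpha]
    # Keep only subsets that are maximal, i.e., not contained in a larger consistent one.
    maximal = [s for s in consistent if not any(s < t for t in consistent)]
    parts = [[min(dists[i][x] for i in s) for x in range(len(dists[0]))]
             for s in maximal]
    return [max(col) for col in zip(*parts)]

pi1 = [1.0, 0.6, 0.0, 0.0]
pi2 = [1.0, 0.8, 0.2, 0.0]
pi3 = [0.0, 0.0, 1.0, 0.7]

# {pi1, pi2} is fully consistent; pi3 conflicts with both, so it forms its own MCS.
print(mcs_fusion([pi1, pi2, pi3]))  # [1.0, 0.6, 1.0, 0.7]
```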

Quantified fusion [62,63] is a similar two-step fusion process, which assumes that the number of reliable sources *j* is known. The quantified rule then takes all subsets of information items **I**<sup>∗</sup> ⊆ **I** with cardinality *j* and fuses these conjunctively in the first step. All intermediate results are then fused disjunctively:

$$\pi^{(\text{fu})}(\mathbf{x}) = \max\_{\substack{\mathbf{I}^{\ast} \subseteq \mathbf{I} \\ |\mathbf{I}^{\ast}| = j}} \left( \min\_{I\_i \in \mathbf{I}^{\ast}} (\pi\_i(\mathbf{x})) \right). \tag{7}$$
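A minimal sketch of (7) (illustrative; the function name and the assumption that exactly *j* = 2 of 3 sources are reliable are choices of this example):

```python
# Sketch of Eq. (7): min-fusion of every subset of j sources,
# then max-fusion over the intermediate results.
from itertools import combinations

def quantified_fusion(dists, j):
    parts = [[min(dists[i][x] for i in c) for x in range(len(dists[0]))]
             for c in combinations(range(len(dists)), j)]
    return [max(col) for col in zip(*parts)]

pi1 = [1.0, 0.3, 0.0]
pi2 = [0.8, 1.0, 0.0]
pi3 = [0.0, 0.2, 1.0]

# Assume exactly 2 of the 3 sources are reliable.
print(quantified_fusion([pi1, pi2, pi3], 2))  # [0.8, 0.3, 0.0]
```

With *j* = *n*, the rule reduces to pure conjunction; with *j* = 1, to pure disjunction.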

Adaptive fusion aims at progressing gradually from conjunctive to disjunctive behaviour as conflict increases. A simple adaptive fusion rule is

$$\pi^{(\text{fu})}(\mathbf{x}) = \max\left(\frac{\min\_{i \in \{1, \ldots, n\}} (\pi\_i(\mathbf{x}))}{\mathbf{h}(\mathbf{I})}, \min\left(\max\_{i \in \{1, \ldots, n\}} (\pi\_i(\mathbf{x})), 1 - \mathbf{h}(\mathbf{I})\right)\right). \tag{8}$$

It fuses all sources disjunctively (assuming one source is right) and discounts the result by (1 − h). In parallel, it fuses all sources conjunctively (assuming all sources are right) and combines both intermediate results. This rule does not cover situations in which more than one but fewer than all sources are reliable. If many sources are fused, it is likely that h → 0, resulting in uninformative outcomes [4]. *Dubois' adaptive fusion rule* [4,48] builds upon the quantified fusion rule (7) and the adaptive fusion rule (8), assuming that a minimum and a maximum number of reliable sources are known. Both numbers are derived from the consistency of the information items **I**. The cardinality of the largest fully consistent subset gives the minimum number *j*<sup>−</sup> = max(|**I**<sup>∗</sup>| | **I**<sup>∗</sup> ⊆ **I**, h(**I**<sup>∗</sup>) = 1); the largest partially consistent subset provides the maximum number *j*<sup>+</sup> = max(|**I**<sup>∗</sup>| | **I**<sup>∗</sup> ⊆ **I**, h(**I**<sup>∗</sup>) > 0). The adaptive fusion is then

$$\pi^{(\text{fu})}(\mathbf{x}) = \max\left(\frac{\pi\_+^{(\text{fu})}(\mathbf{x})}{\mathbf{h}(\mathbf{I})}, \min\left(\pi\_-^{(\text{fu})}(\mathbf{x}), 1 - \mathbf{h}\_+\right)\right) \tag{9}$$

in which *π*<sup>(fu)</sup><sub>+</sub>(*x*) and *π*<sup>(fu)</sup><sub>−</sub>(*x*) are obtained by quantified fusion (7) (with *j*<sup>−</sup> and *j*<sup>+</sup>, respectively) and h<sub>+</sub> = max<sub>**I**<sup>∗</sup>⊆**I**, |**I**<sup>∗</sup>|=*j*<sup>+</sup></sub>(h(**I**<sup>∗</sup>)). In this way, completely disagreeing sources with fully inconsistent items (h = 0) are disregarded. However, small changes in the input possibility distributions may lead to significant changes in the fusion result [64].
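Deriving the ingredients of this rule can be sketched by brute force (illustrative only; all names are assumptions of this sketch, and subset enumeration replaces any efficient algorithm):

```python
# Sketch: derive j-, j+ and h+ from subset consistencies, then compute the
# quantified fusions pi_plus (with j-) and pi_minus (with j+) used in Eq. (9).
from itertools import combinations

def consistency(dists):
    return max(min(degrees) for degrees in zip(*dists))

def quantified(dists, j):
    parts = [[min(dists[i][x] for i in c) for x in range(len(dists[0]))]
             for c in combinations(range(len(dists)), j)]
    return [max(col) for col in zip(*parts)]

def adaptive_parameters(dists):
    n = len(dists)
    # Best consistency achievable with a subset of each cardinality j.
    best_h = {j: max(consistency([dists[i] for i in c])
                     for c in combinations(range(n), j))
              for j in range(1, n + 1)}
    j_minus = max(j for j, h in best_h.items() if h == 1.0)  # largest fully consistent subset
    j_plus = max(j for j, h in best_h.items() if h > 0.0)    # largest partially consistent subset
    return j_minus, j_plus, best_h[j_plus]                   # h+ = best consistency at j+

pi1 = [1.0, 0.4, 0.0, 0.0]
pi2 = [1.0, 1.0, 0.1, 0.0]
pi3 = [0.0, 0.3, 1.0, 0.2]

j_minus, j_plus, h_plus = adaptive_parameters([pi1, pi2, pi3])
print(j_minus, j_plus, h_plus)                     # 2 3 0.3
pi_plus = quantified([pi1, pi2, pi3], j_minus)     # optimistic part of Eq. (9)
pi_minus = quantified([pi1, pi2, pi3], j_plus)     # cautious part of Eq. (9)
```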

Oussalah et al. [64] proposed changes to (9) improving the behaviour in the case of outliers and with regard to robustness against small changes. For their *progressive fusion* rule, they introduced a distance measure with which the disjunctive part *π*<sup>(fu)</sup><sub>−</sub>(*x*) is adapted. Let *x*<sub>0</sub>, *x*<sub>1</sub> ∈ *X* be the smallest and largest elements of the consensus set; then

$$d(\mathbf{x}) = \begin{cases} \max(|\mathbf{x} - \mathbf{x}\_0|, |\mathbf{x} - \mathbf{x}\_1|) & \text{if } \mathbf{x} < \mathbf{x}\_0 \text{ or } \mathbf{x} > \mathbf{x}\_1 \\ 0 & \text{otherwise} \end{cases}$$

measures the distance from point *x* to the consensus set. Let *α*(*x*) = min(*d*(*x*)/*d*<sub>0</sub>, 1) be a weighting factor. The threshold *d*<sub>0</sub> is the maximum distance up to which outliers are considered. Then, *π*<sup>(fu)</sup><sub>−</sub>(*x*) in (9) is replaced by

$$\pi\_{-}^{(\text{fu})}(\mathbf{x}) = \alpha(\mathbf{x}) \cdot \pi\_{+}^{(\text{fu})}(\mathbf{x}) + (1 - \alpha(\mathbf{x})) \cdot \max\_{i \in \{1, \dots, n\}} (\pi\_{i}(\mathbf{x})). \tag{10}$$

Unlike the corresponding term in (9), (10) takes the completely disjunctive fusion of all information items into account. The degree to which it considers disjunction relies on *d*(*x*). The further *x* is from the consensus set, the more consideration is given to inconsistent items.
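The distance weighting and the blend in (10) can be sketched on a numeric domain (illustrative; the consensus bounds *x*<sub>0</sub>, *x*<sub>1</sub>, the threshold *d*<sub>0</sub>, and all names are assumptions of this example):

```python
# Sketch of d(x), alpha(x) and Eq. (10): a distance-weighted blend of
# pi_plus and the fully disjunctive fusion of all sources.

def progressive_pi_minus(xs, dists, pi_plus, x0, x1, d0):
    out = []
    for k, x in enumerate(xs):
        # d(x): zero inside the consensus set [x0, x1], else distance as defined above
        d = max(abs(x - x0), abs(x - x1)) if (x < x0 or x > x1) else 0.0
        alpha = min(d / d0, 1.0)                    # weighting factor alpha(x)
        disj = max(dist[k] for dist in dists)       # fully disjunctive fusion at x
        out.append(alpha * pi_plus[k] + (1 - alpha) * disj)  # Eq. (10)
    return out

xs = [0, 1, 2, 3, 4, 5]                 # numeric domain X
pi1 = [0.0, 0.4, 1.0, 0.8, 0.2, 0.0]
pi2 = [0.7, 0.3, 0.9, 1.0, 0.1, 0.0]
pi_plus = [0.0, 0.2, 1.0, 0.9, 0.0, 0.0]

# Consensus set assumed to be [2, 3]; outlier threshold d0 = 4.
res = progressive_pi_minus(xs, [pi1, pi2], pi_plus, x0=2, x1=3, d0=4)
print(res)
```

Inside the consensus set, α = 0 and the blend reduces to the plain disjunction; outside, the weight shifts towards *π*<sup>(fu)</sup><sub>+</sub> as *d*(*x*) grows.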
