*3.2. Possibilistic Estimation Fusion*

Whereas pooling fusion aims at discarding alternatives, estimation fusion assumes that none of the sources is completely wrong and attempts to find a fusion result that is compatible with all information items [4]. Nonetheless, more specific or precise outcomes are still preferable. Estimation fusion has received less attention in the scientific community than pooling fusion (the higher citation count of Dubois's paper [4] compared with Yager's paper [65] reflects this difference in attention). The following discussion therefore takes a deeper look into the algebraic properties of estimation fusion.

Estimation fusion is based on Zadeh's extension principle, which allows functions to be applied to fuzzy sets [66]. Let *Y* and *Z* be frames of discernment and F : *Y* → *Z*. Let *A* be a fuzzy set defined on *Y* and *B* a fuzzy set defined on *Z*. Then, F maps the fuzzy membership function *μA*(*y*) with *y* ∈ *Y* to *μB*(*z*) with *z* ∈ *Z* by *μB*(*z*) = *μA*(F−1(*z*)) = *μA*(*y*) with *z* = F(*y*). If F is not injective, i.e., multiple *y* are mapped onto the same *z*, then

$$\mu\_B(z) := \max\_{y \in Y \colon \mathcal{F}(y) = z} \mu\_A(y).$$
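As a minimal illustration, the extension principle can be sketched on discrete frames (the function names below are illustrative and not taken from [66]):

```python
def extend(mu_A, F):
    """Map a fuzzy membership function mu_A (dict y -> degree) through F.

    A non-injective F is handled by taking the maximum over the preimage,
    i.e., mu_B(z) = max_{y : F(y) = z} mu_A(y).
    """
    mu_B = {}
    for y, degree in mu_A.items():
        z = F(y)
        mu_B[z] = max(mu_B.get(z, 0.0), degree)
    return mu_B

# Example: F(y) = y^2 is non-injective on {-1, 0, 1}.
mu_A = {-1: 0.4, 0: 1.0, 1: 0.7}
mu_B = extend(mu_A, lambda y: y * y)
# mu_B[1] = max(0.4, 0.7) = 0.7
```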

In multi-source estimation fusion, the input possibility distributions are first pooled by a fusion function, referred to here as G. The result is then mapped by the multi-parameter function F(*x*1, *x*2, ..., *xn*) with *xi* ∈ *Xi*, *i* ∈ {1, ..., *n*}, onto a new frame of discernment *X*, i.e.,

$$\pi^{(\text{fu})}(\mathbf{x}) = \max\_{\substack{\mathbf{x}\_i \in \mathcal{X}\_i \colon \text{F}(\mathbf{x}\_1, \dots, \mathbf{x}\_n) = \mathbf{x}}} \text{G}\big(\pi\_1(\mathbf{x}\_1), \pi\_2(\mathbf{x}\_2), \dots, \pi\_n(\mathbf{x}\_n)\big), \tag{11}$$

for which the notation

$$\pi^{(\text{fu})}(\mathbf{x}) = \left\{ \frac{\text{G}(\pi\_1(\mathbf{x}\_1), \pi\_2(\mathbf{x}\_2), \dots, \pi\_n(\mathbf{x}\_n))}{\text{F}(\mathbf{x}\_1, \mathbf{x}\_2, \dots, \mathbf{x}\_n)} \right\} \tag{12}$$

is used in the following. The fusion rule in (11) takes the maximum of G(*π*1(*x*1), ..., *πn*(*xn*)) over every *n*-tuple (*x*1 ∈ *X*1, ..., *xn* ∈ *Xn*) that satisfies F(*x*1, ..., *xn*) = *x*.
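A minimal discrete sketch of the fusion rule (11)/(12), assuming finite frames stored as Python dictionaries (all names are illustrative):

```python
from itertools import product

def estimation_fuse(pis, F, G):
    """Estimation fusion (11)/(12) on discrete frames.

    pis : list of dicts, pis[i] maps x_i -> pi_i(x_i).
    F   : maps an n-tuple (x_1, ..., x_n) onto the new frame X.
    G   : pooling function applied to the tuple of possibility degrees.
    """
    pi_fu = {}
    for tup in product(*pis):  # all n-tuples (x_1, ..., x_n)
        degrees = tuple(pis[i][x] for i, x in enumerate(tup))
        x = F(tup)
        pi_fu[x] = max(pi_fu.get(x, 0.0), G(degrees))
    return pi_fu

# Two sources, F = arithmetic mean, G = min.
pis = [{1: 1.0, 2: 0.5}, {2: 1.0, 3: 0.5}]
fused = estimation_fuse(pis, lambda t: sum(t) / len(t), min)
# fused == {1.5: 1.0, 2.0: 0.5, 2.5: 0.5}
```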

Yager [65] proposed an estimation fusion rule in which G is the minimum operator and F is defined to be an averaging operator.

**Definition 3** (Averaging Operator)**.** *An operator that satisfies the three properties of commutativity, monotonicity, and idempotency is referred to as a* mean *or* averaging operator *[4]. Such an averaging operator* avg(·) *lies between* min(·) *and* max(·)*, i.e.,* min(·) ≤ avg(·) ≤ max(·)*.*
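These three properties can be spot-checked numerically; the following sketch tests a binary operator against commutativity, idempotency, monotonicity, and the resulting min/max bounds (a heuristic check on random samples, not a proof):

```python
import random

def is_averaging(op, trials=1000):
    """Spot-check the averaging-operator properties of a binary operator."""
    for _ in range(trials):
        a, b = random.random(), random.random()
        if abs(op(a, b) - op(b, a)) > 1e-12:        # commutativity
            return False
        if abs(op(a, a) - a) > 1e-12:               # idempotency
            return False
        c = a + random.random() * (1 - a)           # some c >= a
        if op(c, b) < op(a, b) - 1e-12:             # monotonicity
            return False
        if not (min(a, b) - 1e-12 <= op(a, b) <= max(a, b) + 1e-12):
            return False
    return True

mean = lambda a, b: (a + b) / 2
# mean, min, and max all pass; the product does not, because it fails
# idempotency (e.g., 0.5 * 0.5 = 0.25 != 0.5).
```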

Yager's estimation fusion rule [65] is then:

$$\pi^{(\text{fu})}(\mathbf{x}) = \left\{ \frac{\min\_{i \in \{1, \dots, n\}} (\pi\_i(\mathbf{x}\_i))}{\text{F}(\mathbf{x}\_1, \mathbf{x}\_2, \dots, \mathbf{x}\_n)} \right\}. \tag{13}$$

The application of the minimum operator results in maximally specific possibility distributions, which are placed on an averaged frame of discernment. The disadvantages of estimation fusion are that (i) it requires a frame of discernment on which it is sensible to apply averaging operators and that (ii) estimation fusion may lead to fusion results that have been deemed impossible by all sources, i.e., the results do not satisfy the zero preservation principle [4]. Regarding the first disadvantage, it is often assumed that *X* ⊆ R [65], which is also assumed for the remainder of this section.
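The zero-preservation violation in (ii) can be demonstrated with a small sketch of Yager's rule (13) on discrete frames (names illustrative): two sources that both exclude a value may still produce a fused distribution that fully supports it.

```python
from itertools import product

def yager_fuse(pis, F):
    """Yager's estimation fusion (13): G = min, F an averaging operator."""
    pi_fu = {}
    for tup in product(*pis):
        degrees = (pis[i][x] for i, x in enumerate(tup))
        x = F(tup)
        pi_fu[x] = max(pi_fu.get(x, 0.0), min(degrees))
    return pi_fu

# Both sources consider x = 2 impossible (possibility 0, i.e., absent).
pi1 = {1: 1.0}
pi2 = {3: 1.0}
fused = yager_fuse([pi1, pi2], lambda t: sum(t) / len(t))
# fused == {2.0: 1.0}: the fused result fully supports a value that every
# source excluded, violating the zero preservation principle.
```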

If G is also an averaging operator, then a noteworthy interaction between estimation fusion and the frame of discernment takes place, which is relevant for practical implementations.

**Proposition 1.** *If* G *is an averaging operator other than the minimum operator and X* ⊆ R*, then fusion with (12) is influenced by the borders of X. More formally,* $\min\_{\pi^{(\text{fu})}(x) > 0} x$ *depends on* $\min\_{x \in X} x$*, and* $\max\_{\pi^{(\text{fu})}(x) > 0} x$ *on* $\max\_{x \in X} x$*.*

**Proof.** Let $x\_a = \min\_{x \in X} x$ and $x\_b = \max\_{x \in X} x$, i.e., $X = [x\_a, x\_b]$. Let $x^{\ast} = \min\_i \min\_{x\_i \in X\_i \colon \pi\_i(x\_i) > 0} x\_i$, i.e., $x^{\ast}$ is the smallest element for which at least one $\pi\_i > 0$, and let $i^{\ast} = \arg\min\_i \min\_{x\_i \in X\_i \colon \pi\_i(x\_i) > 0} x\_i$. If $\text{G} \neq \min$, then, for at least one permutation of the $n$-tuple $(x\_a, x\_a, \dots, x^{\ast}, \dots, x\_a, x\_a)$,

$$\text{G}\Big(\pi\_1(x\_a), \dots, \underbrace{\pi\_{i^{\ast}}(x^{\ast})}\_{> 0}, \dots, \pi\_n(x\_a)\Big) > 0.$$

This $n$-tuple defines the minimum boundary of $\pi^{(\text{fu})}$, i.e., $\min\_{\pi^{(\text{fu})}(x) > 0} x = \text{F}(x\_a, \dots, x^{\ast}, \dots, x\_a)$. The same holds for the maximum boundary of $\pi^{(\text{fu})}$, only with $x^{\ast} = \max\_i \max\_{x\_i \in X\_i \colon \pi\_i(x\_i) > 0} x\_i$, $i^{\ast} = \arg\max\_i \max\_{x\_i \in X\_i \colon \pi\_i(x\_i) > 0} x\_i$, and $\max\_{\pi^{(\text{fu})}(x) > 0} x = \text{F}(x\_b, \dots, x^{\ast}, \dots, x\_b)$.

An example of the effects of Proposition 1 is illustrated in Figure 2.

**Figure 2.** An example of the interaction between estimation fusion (12) and *X* as discussed in Proposition 1. A frame of discernment *X* = [0, 10] and three possibility distributions are given. Each possibility distribution claims complete knowledge; *π*1(*x* = 3) = 1, *π*2(*x* = 5) = 1, and *π*3(*x* = 7) = 1. The plots show fusion results (dashed red) in which F is the arithmetic mean and G is (**a**) the minimum, (**b**) the maximum, and (**c**) the arithmetic mean operator.
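The setting of Figure 2 can be reproduced with a small discrete sketch of rule (12) (names illustrative); it shows that under G = max the support boundaries stretch toward the borders of *X* = [0, 10], as stated in Proposition 1:

```python
from itertools import product

def fuse(pis, F, G):
    """Estimation fusion (12) on discrete frames (dicts x -> degree)."""
    pi_fu = {}
    for tup in product(*pis):
        degrees = tuple(pis[i][x] for i, x in enumerate(tup))
        x = F(tup)
        pi_fu[x] = max(pi_fu.get(x, 0.0), G(degrees))
    return pi_fu

X = range(11)  # frame of discernment [0, 10], discretized
# Each source claims complete knowledge of a single value (3, 5, or 7).
pis = [{x: (1.0 if x == v else 0.0) for x in X} for v in (3, 5, 7)]
mean = lambda t: sum(t) / len(t)
support = lambda pi: sorted(x for x, d in pi.items() if d > 0)

s_min = support(fuse(pis, mean, min))  # only the tuple (3, 5, 7) survives
s_max = support(fuse(pis, mean, max))
# G = min: support is {5.0}; G = max: the support reaches from
# mean(0, 0, 3) = 1.0 up to mean(7, 10, 10) = 9.0, i.e., it depends on
# the borders of X.
```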

**Corollary 1.** *If X is additionally unbounded and* F *is an averaging operator other than the minimum or maximum operator, then (12) results in an unbounded π*(fu)*. If X is only half-bounded, then π*(fu) *is also half-bounded.*

**Proof.** From Proposition 1 it follows directly that, if $\text{F} \neq \max$, then $\lim\_{x\_a \to -\infty} \min\_{\pi^{(\text{fu})}(x) > 0} x = \text{F}(x\_a, \dots, x^{\ast}, \dots, x\_a) = -\infty$. If $\text{F} \neq \min$, then $\lim\_{x\_b \to \infty} \max\_{\pi^{(\text{fu})}(x) > 0} x = \text{F}(x\_b, \dots, x^{\ast}, \dots, x\_b) = \infty$.

Consequently, if G is an averaging operator other than the minimum operator, then it is reasonable to apply estimation fusion only to bounded *X*. Otherwise, (12) leads to fusion results whose support extends to infinity, even for very precise input possibility distributions.
