In this section, we first describe the details of our experiment based on the proposed method and then analyze the results in depth.
5.1. A Numerical Example
We consider a project-bidding example in which 10 competing companies participate, each evaluated on six attributes. The example is drawn from real-world experience. Owing to space limitations, and in line with other research articles in this field, we analyze only this example in depth to illustrate the effectiveness of our approach; the method presented in this article is equally applicable to many other problems.
Suppose that there is a set of 10 candidates. Based on the available attributes, decision-makers evaluate the alternatives according to six criteria: safety coefficient, quoted price, construction efficiency, industry reputation, service quality, and engineering experience. The attribute values of all alternatives are listed in Table 4.
The importance of each attribute is collectively represented as an attribute weight set, and the adjustment coefficient is set to a fixed value.
According to our method, the decision process is obtained as follows.
Step 1: The fuzzy dominance probability matrix can be obtained based on Definition 6 and Equations (9) and (10).
Step 2: According to Equation (13), the weight relative probability matrix can be calculated.
Step 3: According to Equation (15), the loss functions of each alternative (i = 1, 2, …, 10) can be computed; they are displayed in Table 5. The loss function differs across objects, and once each loss function is formed, the corresponding comprehensive thresholds can be computed.
Step 4: According to Equation (14), the comprehensive thresholds of each alternative can be obtained; they are displayed in Table 6.
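For readers unfamiliar with how thresholds arise from loss functions, the classical three-way decision construction derives a threshold pair (α, β) from the six loss values; the comprehensive thresholds of Equation (14) follow the same spirit, although their exact form may differ. A minimal sketch, assuming the standard loss notation λ_PP, λ_BP, λ_NP, λ_PN, λ_BN, λ_NN (these symbols are not taken from this paper):

```python
def thresholds(l_pp, l_bp, l_np, l_pn, l_bn, l_nn):
    """Classical three-way decision thresholds from a loss function.

    l_xy = loss of taking action x (P = accept, B = defer, N = reject)
    when the object's true state is y (P = positive, N = negative).
    The usual ordering l_pp <= l_bp < l_np and l_nn <= l_bn < l_pn
    guarantees alpha > beta, i.e. a non-empty boundary region.
    """
    alpha = (l_pn - l_bn) / ((l_pn - l_bn) + (l_bp - l_pp))
    beta = (l_bn - l_nn) / ((l_bn - l_nn) + (l_np - l_bp))
    return alpha, beta

# Example with symmetric hypothetical losses: alpha = 2/3, beta = 1/3.
a, b = thresholds(0, 2, 6, 6, 2, 0)
```

The ordering condition in the docstring is what makes deferral (the boundary region) a genuinely available action; when it fails, the model degenerates to a two-way decision.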
The conditional probability is considered to vary between 0.30 and 0.45 in steps of 0.01. According to the decision rules, the output of the TWD can be interpreted as follows.
When the conditional probability lies in [0.30, 0.33], all objects fall into the negative region; when it lies in [0.34, 0.35], all objects except one remain in the negative region; as it increases further through 0.38 to 0.43, the objects leave the negative region for the boundary region and then the positive region one after another; and when it lies in [0.44, 0.45], all objects fall into the positive region.
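The sweep above can be reproduced mechanically once each alternative's comprehensive thresholds are known. A minimal sketch of the standard three-way decision rule, using hypothetical per-object threshold pairs rather than the actual values of Table 6:

```python
def classify(pr, alpha, beta):
    """Standard three-way decision rule: accept (POS), defer (BND), reject (NEG)."""
    if pr >= alpha:
        return "POS"
    if pr <= beta:
        return "NEG"
    return "BND"

# Hypothetical comprehensive thresholds (alpha_i, beta_i) for four objects;
# the paper's Table 6 lists the real ones for all 10 alternatives.
thresholds = {"x1": (0.36, 0.34), "x2": (0.40, 0.37),
              "x3": (0.42, 0.39), "x4": (0.44, 0.41)}

for pr in [0.30, 0.35, 0.40, 0.45]:
    regions = {x: classify(pr, a, b) for x, (a, b) in thresholds.items()}
    print(pr, regions)
```

Because each alternative carries its own threshold pair, raising the conditional probability moves objects out of the negative region at different points, which is exactly why the region memberships shift step by step across the sweep.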
The distribution of objects under different conditional probabilities is depicted in Figure 3, where three colors represent the regions of the objects. As the conditional probability changes, the region of an object changes as well. For example, one object lies in the negative region when the conditional probability is between 0.30 and 0.41, in the boundary region when it is between 0.41 and 0.43, and in the positive region when it is between 0.43 and 0.45. Figure 3 also shows that all objects are in the positive region when the conditional probability is between 0.43 and 0.45, and that all objects are in the negative region when it is between 0.30 and 0.34.
According to the above division of all candidates, we obtain the dominance relation positive region ≻ boundary region ≻ negative region. Unlike the methods in [22,32], our approach solves MADM problems efficiently by ranking only the alternatives within the divided regions to obtain the final result. When the conditional probability is set to 0.4, all candidates are divided into three regions.
Step 5: According to the number of candidates needed, we rank the objects of each region based on Definition 9. In this example, there are three possible cases for the number of candidates needed.
(1) If the number of candidates needed is 3, all candidates in the positive region must be ranked and the top 3 of them selected. According to Definition 8, we transform the relative weight probability matrix of the positive region into a distance matrix; then, according to Equation (18), the sorted result of the positive region is obtained and the top three candidates can be chosen.
(2) If the number of candidates needed is 6, all candidates in the positive region are selected, and all candidates in the boundary region must additionally be ranked so that the top 2 of them can be selected. According to Definition 8, we transform the relative weight probability matrix of the boundary region into a distance matrix; then, according to Equation (18), the sorted result of the boundary region is obtained and the top two candidates in it can be chosen.
(3) If the number of candidates needed is 9, all candidates in the positive and boundary regions are selected, and all candidates in the negative region must be ranked so that the top 2 of them can be selected. According to Definition 8, we transform the relative weight probability matrix of the negative region into a distance matrix; then, according to Equation (18), the sorted result of the negative region is obtained and the top two candidates in it can be chosen.
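The three cases share one pattern: regions are consumed in the dominance order POS ≻ BND ≻ NEG, and only the region that straddles the cutoff needs to be sorted. A minimal sketch of that selection logic, with a hypothetical score-based ranking standing in for the Definition 8 / Equation (18) distance-matrix procedure:

```python
def select_top_k(regions, scores, k):
    """Pick k candidates, consuming regions in dominance order and
    sorting only the region that straddles the cutoff.

    regions: list of candidate lists, ordered POS, BND, NEG.
    scores:  dict mapping candidate -> ranking score (higher is better);
             a stand-in for the distance-matrix ranking of Definition 8.
    """
    chosen = []
    for region in regions:
        if len(chosen) + len(region) <= k:
            chosen.extend(region)          # whole region fits: no sorting needed
        else:
            need = k - len(chosen)
            ranked = sorted(region, key=lambda c: scores[c], reverse=True)
            chosen.extend(ranked[:need])   # sort only this region
            break
    return chosen

# Hypothetical regions and scores (not the paper's actual data).
pos, bnd, neg = ["x1", "x2", "x3", "x4"], ["x5", "x6", "x7"], ["x8", "x9", "x10"]
scores = {f"x{i}": 10 - i for i in range(1, 11)}  # x1 best, x10 worst
print(select_top_k([pos, bnd, neg], scores, 6))
```

Whole regions that fit under the cutoff are accepted without any pairwise comparison, which is precisely where the efficiency gain over ranking all alternatives comes from.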
Compared with existing MADM models, our approach combines decision-making with ranking and avoids sorting alternatives that cannot be selected, which improves efficiency.
5.3. Sensitivity Analysis of the Adjustment Coefficient and the Conditional Probability
The adjustment coefficient was set to a fixed number in the above experiment. We now let it vary and discuss the impact of its value on the decision-making results. In the following, the conditional probability is set to 0.36 and the adjustment coefficient increases from 0.40 to 0.50 with step size 0.01. When the adjustment coefficient lies in [0.40, 0.43], all objects are included in the boundary region; from 0.44 onward, objects leave the boundary region step by step for the positive and negative regions, and the membership of every object changes accordingly at each value up to 0.50. The above analysis shows that the distribution of objects differs under different adjustment coefficients. To show this more vividly, we use Figure 6 to describe the number of objects in the different regions.
According to Figure 6, the number of objects in NEG is never smaller than that in POS for any value of the adjustment coefficient. As the adjustment coefficient increases, the size of BND decreases while the sizes of POS and NEG increase.
Figure 7 shows the effect of the adjustment coefficient on the maximum number of objects to be ranked. Since our experimental data contain 10 objects, the maximum number of objects is 10 under a full ranking, whereas within a specific region (POS, BND, or NEG) the maximum number of objects is at most the total number of objects (10).
According to Figure 8, the computational complexity can be reduced significantly by ranking only a part of all the candidates, especially when the number of candidates is large. The dotted lines in Figure 8 have the same meaning as those in Figure 7. Assuming that the number of global objects is 10, we generate a set of 20 random numbers in the range 0 to 10 to represent the number of candidates needed. In the decision-making process, the number of candidates needed is at most the number of all objects, which further illustrates how the division improves decision-making efficiency.
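The efficiency claim can be made concrete by counting comparisons: a full ranking of n alternatives costs on the order of n log n, while ranking only the region that straddles the cutoff, of size m ≤ n, costs on the order of m log m, since whole regions below the cutoff are accepted without sorting. A small illustration under hypothetical region sizes (not the paper's data):

```python
import math

def full_ranking_cost(n):
    """Approximate comparison cost of sorting all n alternatives: n * log2(n)."""
    return n * math.log2(n) if n > 1 else 0

def region_ranking_cost(region_sizes, k):
    """Cost when regions are consumed in dominance order (POS, BND, NEG)
    and only the region straddling the cutoff k is sorted."""
    taken = 0
    for m in region_sizes:
        if taken + m <= k:
            taken += m            # whole region accepted without sorting
        else:
            return m * math.log2(m) if m > 1 else 0
    return 0                      # k covers all objects: nothing to sort

# Hypothetical division of 1000 alternatives into POS/BND/NEG of sizes 400/300/300,
# with 600 candidates needed: only the 300-object boundary region is sorted.
n, sizes, k = 1000, [400, 300, 300], 600
print(full_ranking_cost(n), region_ranking_cost(sizes, k))
```

The gap between the two costs widens as n grows, which matches the observation that partial ranking pays off most when the number of candidates is large.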
In Figure 9, we discuss the object distribution as the conditional probability varies from 0.30 to 0.40 under a fixed adjustment coefficient. Specifically, when the conditional probability changes from 0.30 to 0.32, all candidates are classified into the negative region. When it changes from 0.33 to 0.36, the candidates are classified into only the negative and boundary regions. When it changes from 0.37 to 0.40, the candidates are distributed over all three regions.