Article

Minkowski Weighted Score Functions of Intuitionistic Fuzzy Values

1
Department of Applied Mathematics, School of Science, Xi’an University of Posts and Telecommunications, Xi’an 710121, China
2
Shaanxi Key Laboratory of Network Data Analysis and Intelligent Processing, Xi’an University of Posts and Telecommunications, Xi’an 710121, China
3
BORDA Research Unit and Multidisciplinary Institute of Enterprise (IME), University of Salamanca, E37007 Salamanca, Spain
*
Author to whom correspondence should be addressed.
Mathematics 2020, 8(7), 1143; https://doi.org/10.3390/math8071143
Submission received: 2 June 2020 / Revised: 6 July 2020 / Accepted: 9 July 2020 / Published: 13 July 2020
(This article belongs to the Special Issue Fuzzy Sets and Soft Computing)

Abstract:
In multiple attribute decision-making in an intuitionistic fuzzy environment, the decision information is sometimes given by intuitionistic fuzzy soft sets. In order to address intuitionistic fuzzy decision-making problems in a more efficient way, many scholars have produced increasingly better procedures for ranking intuitionistic fuzzy values. In this study, we further investigate the problem of ranking intuitionistic fuzzy values from a geometric point of view, and we produce related applications to decision-making. We present Minkowski score functions of intuitionistic fuzzy values, which are natural generalizations of the expectation score function and other useful score functions in the literature. The rationale for Minkowski score functions lies in the geometric intuition that a better score should be assigned to an intuitionistic fuzzy value farther from the negative ideal intuitionistic fuzzy value. To capture the subjective attitude of decision makers, we further propose the Minkowski weighted score function that incorporates an attitudinal parameter. The Minkowski score function is a special case corresponding to a neutral attitude. Some fundamental properties of Minkowski (weighted) score functions are examined in detail. With the aid of the Minkowski weighted score function and the maximizing deviation method, we design a new algorithm for solving decision-making problems based on intuitionistic fuzzy soft sets. Moreover, two numerical examples regarding risk investment and supplier selection are employed to conduct comparative analyses and to demonstrate the feasibility of the approach proposed in this article.

1. Introduction

Multiple attribute decision making (MADM) refers to the general process of ranking a collection of alternatives or choosing the best alternative(s) from them, by taking into account the evaluations of all the alternatives against several attributes. The assessment information carried by these attributes characterizes the performance of the alternatives from several standpoints. In recent years, researchers from different countries and diverse fields have strained to develop various methods, models, and algorithms to support MADM [1]. Many in-depth studies have been devoted to the exploration of MADM and its applications. Altogether, they have provided effective solutions to MADM problems emerging from a wide range of different fields [2,3].
Fuzzy set (FS) theory is a powerful mathematical model proposed by Zadeh [4] in order to deal with uncertainty from the perspective of gradual belongingness. Since then, the idea of fuzzy sets has been playing an important role in the broad area of soft computing. It has achieved the purpose of easy processing, robustness, and low cost in the treatment of approximate knowledge, uncertainty, inaccuracy, and partial truth [5]. In order to further improve the applicability of fuzzy sets, in 1983, Atanassov [6] put forward the idea of intuitionistic fuzzy sets (IFSs). They are characterized by the stipulation of dissociated membership and non-membership degrees of the alternatives. Xu and Yager [7] coined the ordered pairs composed of the membership and non-membership grades in an intuitionistic fuzzy set as intuitionistic fuzzy values (IFVs). Despite their simplicity as an extension of Zadeh’s fuzzy sets, Atanassov’s intuitionistic fuzzy sets are of great significance to describe the uncertainty caused by human cognitive limitations or indecision [8].
Later on in 1999, soft set (SS) theory [9] gave rise to a wide mathematical framework for approaching uncertain, vaguely outlined objects that, however, possess defined characteristics. The basic principle of soft sets is dependent on the idea of parameterization, and it has broken the limits of membership functions. Soft set theory and its extensions show that real-life concepts can be understood from their characteristic aspects, and each separate feature provides an approximate portrayal of the entire entity [10]. Ali et al. [11,12,13] deeply explored algebraic and logical aspects of soft set theory in order to formulate its theoretical basis. Feng and Li [14] established relationships amongst five distinct kinds of soft subsets. They also investigated free soft algebras associated with soft product operations. Jun et al. studied the use of soft sets in the investigation of various algebraic structures such as ordered semigroups [15] and BCK/BCI-algebras [16,17]. Inspired by the notion of soft sets, Maji et al. [18] also presented the concept of a fuzzy soft set, which combines the idea of a fuzzy membership and soft set theory. Both soft sets and fuzzy soft sets were further generalized to hesitant fuzzy soft sets [19,20], N-soft sets [21,22], hesitant N-soft sets [23], and fuzzy N-soft sets [24]. From a different position, Maji et al. [25] introduced the concept of intuitionistic fuzzy soft sets (IFSSs). This hybrid structure yields an effective framework to describe and analyze more general MADM problems [26,27]. This model and its applications in decision-making were further expanded by Agarwal et al. [28], who devised generalized intuitionistic fuzzy soft sets. Feng et al. [29] clarified and improved the structure of this model. To be precise, they formulated it as a blend of intuitionistic fuzzy soft sets over the set of alternatives plus intuitionistic fuzzy sets on the set of attributes. Peng et al. [30] combined soft sets with Pythagorean fuzzy sets and presented the notion of Pythagorean fuzzy soft sets. Both fuzzy soft sets and intuitionistic fuzzy soft sets are special cases of Pythagorean fuzzy soft sets. Athira et al. [31] defined some entropy measures for Pythagorean fuzzy soft sets to compute the degree of fuzziness of the sets, which means that the larger the entropy, the lesser the vagueness. In addition, other different extension structures of soft sets were proposed and used to solve decision problems. For example, Dey et al. [32] presented the idea of a hesitant multi-fuzzy set. They joined the characteristics of a hesitant multi-fuzzy set with the parametrization of the soft sets to construct the hesitant multi-fuzzy soft sets. Likewise, intuitionistic multi-fuzzy sets [33] take advantage of both intuitionistic and multi-fuzzy features. Wei [34] proposed a scheme based on intuitionistic fuzzy soft sets to determine the weight of each attribute in two cases: incomplete knowledge of attribute information and complete ignorance of attribute information. Liu et al. [35] recently produced centroid transformations of intuitionistic fuzzy values that were built on intuitionistic fuzzy aggregation operators.
As the most elementary components in intuitionistic fuzzy MADM, IFVs quantify the performance of alternatives in an intuitionistic fuzzy setting. From a more general point of view, IFVs can be regarded as the intuitionistic fuzzy counterpart of real numbers in the unit interval [ 0 , 1 ] . Both play comparable roles in intuitionistic fuzzy and classical mathematical modeling, respectively. It is well known that the usual order of the real numbers endows the unit interval [ 0 , 1 ] with the structure of a chain (i.e., they produce a totally ordered set). As a result, such real numbers can be compared or ranked in an unequivocal way. Nonetheless, the ranking problem becomes much more complicated when we consider IFVs. The main reason is that the set L * of all IFVs only forms a complete lattice rather than a chain under the usual order of IFVs (see Equation (1) below). In response to this, various ways have been proposed to compare intuitionistic fuzzy values. They fall into two broad categories. The first kind suggests ranking IFVs by the score function [36], the accuracy function [37], Xu and Yager’s rule [7], and the dominance degree [38]. Their common feature is that they all rely on the membership function or non-membership function. Recently, Garg and Arora [39] presented an approach to solve decision-making problems by utilizing the TOPSIS method based on correlation measures under an intuitionistic fuzzy set environment, and their approach also belongs to this category. The other group tends to rank IFVs using geometric representations, such as Szmidt and Kacprzyk’s measure R [40], Guo’s measure Z [41], as well as Zhang and Xu’s measure l [42]. Distance measures of intuitionistic fuzzy sets have been well explored in the literature [43]. They can be used to quantify the difference between the information carried by intuitionistic fuzzy sets. Xing et al. [44] proposed a method in which IFVs are represented by points in a Euclidean space and ranked by their Euclidean distances to the most favorable point. This ranking method has made a great breakthrough in the satisfaction of desirable properties, such as weak admissibility, robustness, and determinacy, which had been basically neglected in previous geometric representation methods. In recent years, with the increasing complexity of socio-economic systems and related decision-making problems, this high performance ranking method for IFVs has become very practical.
However, it should be pointed out that some existing score functions based on the Hamming or Euclidean distance fail to differentiate those distinct IFVs that have the same expectation score or Euclidean score (called the ideal positive degree in [44]). To overcome this difficulty, the current study aims to introduce the Minkowski score function that enables us to prioritize all IFVs in L * by adjusting parameters if necessary. We further extend the Minkowski score function with the inclusion of the attitudinal parameter of the decision makers. Altogether, they produce a more general concept, called the Minkowski weighted score function of IFVs. We prove that this new score function naturally reduces to the Minkowski score function when the attitudinal parameter equals 0.5 (i.e., a decision maker’s neutral attitude). Note also that the Minkowski (weighted) score function builds on the Minkowski distance between the negative ideal IFV and the given IFV in L * . Thus, our score functions simplify the formulas and calculation with regard to the ideal positive degree in [44].
Figure 1 illustrates the workflow and framework of this study. Its first aim is to extend and simplify the ideal positive degree proposed by Xing et al. [44], which results in the concept of the Minkowski score function. Next, this function is further generalized to the Minkowski weighted score function thanks to the introduction of the attitudinal parameter of decision makers. Finally, an algorithm is designed for solving MADM problems based on intuitionistic fuzzy soft sets. This algorithm mainly builds on the Minkowski weighted score function, the maximizing deviation method for the determination of attribute weights, and the simply weighted intuitionistic fuzzy averaging (SWIFA) operator.
The rest of this study consists of the following sections. Section 2 recaps some fundamental concepts concerning (intuitionistic) fuzzy sets and intuitionistic fuzzy soft sets. Section 3 presents some known order relations for ranking IFVs. In Section 4, we define a new score function based on the Minkowski distance between IFVs and the negative ideal IFV. Then, some relevant properties are established and discussed. Section 5 defines our extended methodology that takes into account decision makers’ biases. Section 6 introduces the maximizing deviation method for the determination of the vector of weights and an algorithm that leans on this method is proposed to solve the intuitionistic fuzzy MADM. In Section 7, the decisions in two situations (namely, venture capital and supplier selection problems) are compared with those of lexicographic orders, Wei’s method [34], Xia’s method [45], Song’s method [46], and TOPSIS (a shorthand for Technique for Order Preference by Similarity to Ideal Solution) [47] . This comparative analysis validates the feasibility of the new approach. Finally, Section 8 gives the conclusion and future research directions.

2. Preliminaries

Throughout this paper, Q denotes a non-empty set of alternatives. This set is often known as the universe of discourse.
We proceed to recall fundamental concepts from the theories of (intuitionistic) fuzzy sets and intuitionistic fuzzy soft sets. These notions and related tools will be needed in subsequent analyses.
A fuzzy set ϑ in Q (sometimes also a fuzzy subset of Q) is characterized by its membership function, namely ϑ : Q → [0, 1]. Then, each x ∈ Q is associated with its membership degree ϑ(x) to this fuzzy set. It captures the degree of belongingness of x to ϑ. Henceforth, F(Q) denotes the collection of all fuzzy subsets of Q.
Fuzzy subsethood and equality are quite natural. We declare ϑ ⊆ ϱ when ϑ(x) ≤ ϱ(x) for each x ∈ Q. As expected, ϑ = ϱ holds when both ϑ ⊆ ϱ and ϱ ⊆ ϑ are true. In terms of the original formulation by Zadeh [4], fuzzy versions of the set union and intersection, as well as a fuzzy complement operation can be respectively given by the expressions: for each ϑ, ϱ ∈ F(Q) and x ∈ Q,
  • (ϑ ∪ ϱ)(x) = max{ϑ(x), ϱ(x)},
  • (ϑ ∩ ϱ)(x) = min{ϑ(x), ϱ(x)}, and
  • ϑ^c(x) = 1 − ϑ(x).
Intuitionistic fuzzy sets extend fuzzy sets in the following manner:
Definition 1
([6]). An intuitionistic fuzzy set on Q is defined by:
η = { (x, t_η(x), f_η(x)) | x ∈ Q }.
The mappings t_η : Q → [0, 1] and f_η : Q → [0, 1] respectively provide a membership grade t_η(x) and a non-membership grade f_η(x) for x ∈ Q to η. Moreover, it is required that 0 ≤ t_η(x) + f_η(x) ≤ 1 holds true for all x ∈ Q.
Unlike the case of fuzzy sets, a degree of hesitancy (or indeterminacy) of x to η can be defined. It consists of the difference π_η(x) = 1 − (t_η(x) + f_η(x)), and when it is universally equal to zero, η can be identified with a fuzzy set. In the following, the collection of all intuitionistic fuzzy sets on Q will be denoted by IFS(Q).
The set-theoretical operations defined for fuzzy sets can be extended to Definition 1. Thus, in that realm, we define union, intersection, and subsethood as follows: when A, B ∈ IFS(Q),
  • A ∪ B = { (x, max{t_A(x), t_B(x)}, min{f_A(x), f_B(x)}) | x ∈ Q };
  • A ∩ B = { (x, min{t_A(x), t_B(x)}, max{f_A(x), f_B(x)}) | x ∈ Q };
  • A ⊆ B ⟺ for all x ∈ Q, both t_A(x) ≤ t_B(x) and f_A(x) ≥ f_B(x) are true.
It is just natural to declare A = B exactly when both A ⊆ B and B ⊆ A hold true. The complement of η is defined as:
η^c = { (x, f_η(x), t_η(x)) | x ∈ Q }.
In addition to their relationship with fuzzy sets, intuitionistic fuzzy sets can be regarded as L-fuzzy sets with respect to (L*, ≤_{L*}), the complete lattice with:
L* = { (a₁, b₁) ∈ [0, 1]² | a₁ + b₁ ≤ 1 },
and lattice order ≤_{L*} given by:
(a₁, b₁) ≤_{L*} (a₂, b₂) ⟺ (a₁ ≤ a₂) ∧ (b₁ ≥ b₂)
for all (a₁, b₁), (a₂, b₂) ∈ L* [48,49]. Each ordered pair α = (a₁, a₂) ∈ L* is an intuitionistic fuzzy value (IFV) [7]. The complement of the IFV α = (t₁, f₁) is defined as ᾱ = (f₁, t₁). Based on this viewpoint, the intuitionistic fuzzy set:
η = { (x, t_η(x), f_η(x)) | x ∈ Q }
can be viewed as the L-fuzzy set η : Q → L* with η(x) = (t_η(x), f_η(x)) for all x ∈ Q.
Soft set theory builds on different grounds. Now, we need E_Q (a parameter space, often denoted by E when the set of alternatives is common knowledge). These are all the parameters that approximately represent the objects in Q. The pair (Q, E) is also known as a soft universe. Put differently, E comprises the collection of parameters that produce an appropriate joint model of the problem under inspection.
Definition 2
([9]). A soft set over Q is a pair S = (F, A). It is assumed that A ⊆ E and F : A → P(Q). F is a mapping known as the approximate function of S = (F, A).
The natural combination of fuzzy sets with soft sets produces the following notion:
Definition 3
([18]). A fuzzy soft set over Q is a pair S = (F, A). It is assumed that A ⊆ E and F : A → F(Q). Now, F is a mapping called the approximate function of S = (F, A).
Henceforth, the collection of all fuzzy soft sets over Q whose sets of attributes are from E will be represented by FS E ( Q ) .
Refinements of Definition 3 with different characteristics exist. Valuation fuzzy soft sets refine fuzzy soft sets. They were introduced in [50] with the purpose of favoring decisions in cases like the valuation of assets. In addition, a very natural blend of intuitionistic fuzzy sets with soft sets generalizes Definition 3 as follows:
Definition 4
([25]). An intuitionistic fuzzy soft set (IFSS) over Q is a pair I = (F̃, A). It is assumed that A ⊆ E and F̃ : A → IFS(Q). Now, F̃ is a mapping called the approximate function of I = (F̃, A).
Hereinafter, IFSS E ( Q ) represents the collection of all IFSSs over Q whose attributes belong to E.
Two increasingly general concepts of complements for IFSSs are applicable:
Definition 5.
The complement of  I = ( F ˜ , A ) IFSS E ( Q ) is the IFSS I c = ( G ˜ , A ) such that G ˜ satisfies G ˜ ( a ) = ( F ˜ ( a ) ) c when a A .
Definition 6.
The generalized B-complement of I = (F̃, A) ∈ IFSS_E(Q), where B ⊆ A, is the IFSS I_B^c = (G̃, A) such that for all a ∈ A,
G̃(a) = (F̃(a))^c if a ∈ B, and G̃(a) = F̃(a) otherwise.
Observe that I_∅^c = I and I_A^c = I^c. In other words, the generalized B-complement I_B^c coincides with I itself when B is the empty set, and it reduces to the complement I^c if B = A.
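To make Definitions 5 and 6 concrete, the following minimal Python sketch (our own illustration, with an assumed dictionary representation of an IFSS rather than anything prescribed by the paper) shows how the generalized B-complement flips only the values of the attributes collected in B:

```python
# A minimal sketch: an IFSS is modelled as a dict {attribute: {alternative: (t, f)}},
# and the generalized B-complement of Definition 6 flips (t, f) only for the
# attributes collected in B.
def generalized_complement(ifss, B):
    return {
        a: {u: ((f, t) if a in B else (t, f)) for u, (t, f) in approx.items()}
        for a, approx in ifss.items()
    }

# B = set() returns the IFSS unchanged; B = set(ifss) gives the full complement I^c.
```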
Finally, the scalar product of a fuzzy set and an intuitionistic fuzzy soft set is defined as follows:
Definition 7.
Let I = (F̃, A) be an intuitionistic fuzzy soft set over Q and μ be a fuzzy set in A. The scalar product of μ and I is the IFSS Ω̃ = μ I = (G̃, A), with its approximate function given by:
G̃(e)(u) = ( 1 − (1 − t_{F̃(e)}(u))^{μ(e)}, (f_{F̃(e)}(u))^{μ(e)} ),
where e ∈ A, u ∈ Q, and F̃(e)(u) = ( t_{F̃(e)}(u), f_{F̃(e)}(u) ) ∈ L*.

3. Order Relations for Ranking IFVs

In the following, let us recall some well-known concepts about binary relations and orders.
Definition 8.
When A and B are sets, any R ⊆ A × B is a binary relation between A and B.
The fact (a, b) ∈ R is usually denoted by a R b. Then, a is R-related to b.
The domain of R is formed by the elements x ∈ A that satisfy x R y for at least one y ∈ B. Its range is formed by the elements y ∈ B that satisfy x R y for at least one x ∈ A.
If R ⊆ A × A, then R is called a binary relation on A. In this context, several properties may apply to R.
Definition 9.
When R is a binary relation on X, we say that R is:
  • reflexive when (x, x) ∈ R for each x ∈ X.
  • irreflexive when (x, x) ∈ R is false, for each x ∈ X.
  • symmetric when (x, y) ∈ R implies (y, x) ∈ R, for each x, y ∈ X.
  • asymmetric when (x, y) ∈ R implies (y, x) ∉ R, for each x, y ∈ X.
  • antisymmetric when (x, y) ∈ R and (y, x) ∈ R imply x = y, for each x, y ∈ X.
  • transitive when (x, y) ∈ R and (y, z) ∈ R imply (x, z) ∈ R, for each x, y, z ∈ X.
  • complete when x ≠ y implies (x, y) ∈ R or (y, x) ∈ R, for each x, y ∈ X.
  • total or strongly complete, when either (x, y) ∈ R or (y, x) ∈ R holds true, for each x, y ∈ X.
Preorders are reflexive and transitive binary relations. Complete preorders are called weak orders. Antisymmetric weak orders are called linear or total orders, and antisymmetric preorders are called partial orders. A set A endowed with a partial order ⪯ is usually called a poset, and we denote it by ( A , ) . A poset ( A , ) is called a chain if ⪯ is a total order.
Definition 10.
Let (A, ⪯_A), (B, ⪯_B) be posets. A mapping f : A → B is called an order homomorphism when for all x, y ∈ A,
x ⪯_A y implies f(x) ⪯_B f(y).
An order homomorphism f : A → B is an order isomorphism when f is a bijection, and its inverse f⁻¹ : B → A is also an order homomorphism.
Definition 11.
Let (P₁, ⪯₁) and (P₂, ⪯₂) be two posets. The lexicographic composition of ⪯₁ and ⪯₂ is the lexicographic order ⪯ on P₁ × P₂ defined by:
(a₁, b₁) ⪯ (a₂, b₂) if and only if either (a₁ ≺₁ a₂) or (a₁ = a₂ ∧ b₁ ⪯₂ b₂)
for all (a₁, b₁), (a₂, b₂) ∈ P₁ × P₂.
In the above definition, the notation a₁ ≺₁ a₂ means a₁ ⪯₁ a₂ and a₁ ≠ a₂. The lexicographic order ⪯ is a partial order on P₁ × P₂.
Xu and Yager [7] put forward a remarkable method for ranking IFVs. It uses some concepts that we proceed to recall:
Definition 12
([36]). The application s : L* → [−1, 1] defined by s(A) = s_A = t_A − f_A for all A = (t_A, f_A) ∈ L* is called the score function of IFVs.
Definition 13
([37]). The application h : L* → [0, 1] defined by h(A) = h_A = t_A + f_A for all A = (t_A, f_A) ∈ L* is called the accuracy function of IFVs.
Moreover, we refer to the mapping π : L* → [0, 1] as the hesitancy function of IFVs, which is defined as π(A) = π_A = 1 − h_A for all A = (t_A, f_A) ∈ L*.
IFVs can be compared according to the following rules:
Definition 14
([7]). When A = ( t A , f A ) and B = ( t B , f B ) are two IFVs:
  • if s A < s B , we say that A is smaller than B; we express this fact as A < B ;
  • if s_A = s_B, then three options appear:
    (1)
    when h A = h B , we say that A is equivalent to B; we express this fact as A = B ;
    (2)
    when h A < h B , we say that A is smaller than B; we express this fact as A < B ;
    (3)
    when h A > h B , we say that A is greater than B; we express this fact as A > B .
Definition 14 can be simply stated as a binary relation ⪯_(s,h) on the set of IFVs, namely:
A ⪯_(s,h) B ⟺ (s_A < s_B) ∨ (s_A = s_B ∧ h_A ≤ h_B), for all A, B ∈ L*.
Henceforth, ⪯_(s,h) is referred to as the Xu–Yager lexicographic order of IFVs. It is a linear order on L*. Feng et al. [51] presented several alternative lexicographic orders:
Definition 15
([51]). The binary relation ⪯_(s,t) on L* is given by:
A ⪯_(s,t) B ⟺ (s_A < s_B) ∨ (s_A = s_B ∧ t_A ≤ t_B)
for each A = (t_A, f_A), B = (t_B, f_B) ∈ L*.
It is easy to see that ⪯_(s,t) is equivalent to the Xu–Yager lexicographic order:
Proposition 1
([51]). When A = (t_A, f_A), B = (t_B, f_B) ∈ L*, it must be the case that:
A ⪯_(s,h) B ⟺ A ⪯_(s,t) B.
Definition 16
([51]). The binary relation ⪯_(s,f) on L* is given by:
A ⪯_(s,f) B ⟺ (s_A < s_B) ∨ (s_A = s_B ∧ f_A ≥ f_B)
for each A = (t_A, f_A), B = (t_B, f_B) ∈ L*.
Definition 17
([51]). The binary relation ⪯_(t,f) on L* is given by:
A ⪯_(t,f) B ⟺ (t_A < t_B) ∨ (t_A = t_B ∧ f_A ≥ f_B)
for each A = (t_A, f_A), B = (t_B, f_B) ∈ L*.
Definition 18
([51]). The binary relation ⪯_(f,t) on L* is given by:
A ⪯_(f,t) B ⟺ (f_A > f_B) ∨ (f_A = f_B ∧ t_A ≤ t_B)
for each A = (t_A, f_A), B = (t_B, f_B) ∈ L*.
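For readers who wish to experiment with these orders, the following short Python sketch (an illustration of ours, assuming an IFV is stored as a pair (t, f); the tie-breaking directions follow Definitions 14–18 as reconstructed above) encodes each lexicographic order as a sort key:

```python
# Each lexicographic order can be encoded as a sort key; tuples compare
# lexicographically in Python, so sorting by the key ranks IFVs from worst
# to best under the corresponding order.
def key_sh(v):  # Xu-Yager order (s, h): score first, accuracy as tie-break (same ranking as (s, t))
    t, f = v
    return (t - f, t + f)

def key_sf(v):  # (s, f): score first, smaller non-membership preferred on ties
    t, f = v
    return (t - f, -f)

def key_tf(v):  # (t, f): membership first, smaller non-membership preferred on ties
    t, f = v
    return (t, -f)

def key_ft(v):  # (f, t): smaller non-membership first, membership as tie-break
    t, f = v
    return (-f, t)

ifvs = [(0.5, 0.3), (0.6, 0.4), (0.5, 0.5)]
print(sorted(ifvs, key=key_sh))   # ascending (worst to best) under the Xu-Yager order
```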

4. Minkowski Score Functions

This section introduces and investigates a novel technical construction. We need some preliminaries.
Definition 19
([29]). The expectation score function of IFVs is the application δ : L* → [0, 1] defined by the expression:
δ(A) = δ_A = (t_A − f_A + 1) / 2
for all A = (t_A, f_A) ∈ L*.
The next proposition recalls some of the basic properties of δ :
Proposition 2
([29]). The expectation score function δ : L * [ 0 , 1 ] satisfies:
(1)
δ ( 0 , 1 ) = 0 ;
(2)
δ ( 1 , 0 ) = 1 ;
(3)
δ ( t A , f A ) is increasing with respect to t A ; and
(4)
δ(t_A, f_A) is decreasing with respect to f_A.
Definition 20
([35]). Let A_i = (t_{A_i}, f_{A_i}) ∈ L* (i = 1, …, n). The simply weighted intuitionistic fuzzy averaging (SWIFA) operator of dimension n is the application Ξ_w^s : (L*)ⁿ → L* defined by the expression:
Ξ_w^s(A₁, A₂, …, A_n) = ( ∑_{i=1}^{n} w_i t_{A_i}, ∑_{i=1}^{n} w_i f_{A_i} ),
where w = (w₁, w₂, …, w_n)ᵀ is a weight vector with the usual restriction that w_i ∈ [0, 1] (i = 1, …, n) and ∑_{i=1}^{n} w_i = 1.
In particular, the SWIFA operator with w = (1/n, 1/n, …, 1/n)ᵀ is denoted by Ξ^s, and it is referred to as the simple intuitionistic fuzzy averaging (SIFA) operator, i.e.,
Ξ^s(A₁, A₂, …, A_n) = ( (1/n) ∑_{i=1}^{n} t_{A_i}, (1/n) ∑_{i=1}^{n} f_{A_i} ).
The aforementioned elegant, but useful aggregation operator was introduced by Xu and Yager [52] in their analysis of intuitionistic fuzzy preference relations.
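As an illustration only (the helper names are ours, not the paper's), the SWIFA and SIFA operators can be sketched in a few lines of Python:

```python
# A brief sketch of the SWIFA operator of Definition 20: the weighted arithmetic
# means of the membership and of the non-membership degrees are taken separately.
def swifa(ifvs, weights):
    t = sum(w * a[0] for w, a in zip(weights, ifvs))
    f = sum(w * a[1] for w, a in zip(weights, ifvs))
    return (t, f)

def sifa(ifvs):  # SIFA: equal weights 1/n
    n = len(ifvs)
    return swifa(ifvs, [1.0 / n] * n)

# e.g. sifa([(0.7, 0.2), (0.5, 0.4)]) ~ (0.6, 0.3)
```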
Let A = ( t A , f A ) and B = ( t B , f B ) be two IFVs. Then, the normalized Minkowski distance between them is defined as follows:
D_p(A, B) = ( ( |t_A − t_B|^p + |f_A − f_B|^p ) / 2 )^{1/p},
where p ≥ 1.
If p = 1 , the Minkowski distance is reduced to the normalized Hamming distance between two IFVs given by:
D₁(A, B) = ( |t_A − t_B| + |f_A − f_B| ) / 2.
If p = 2 , the Minkowski distance is reduced to the normalized Euclidean distance between two IFVs given by:
D₂(A, B) = ( ( (t_A − t_B)² + (f_A − f_B)² ) / 2 )^{1/2}.
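A minimal sketch of these normalized distances, assuming IFVs are stored as (t, f) pairs:

```python
# Normalized Minkowski distance between two IFVs; p = 1 and p = 2 recover the
# normalized Hamming and Euclidean distances, respectively.
def minkowski_distance(a, b, p=2):
    (ta, fa), (tb, fb) = a, b
    return ((abs(ta - tb) ** p + abs(fa - fb) ** p) / 2) ** (1 / p)
```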
Definition 21
([44]). Let A = (t_A, f_A) ∈ L*. The ideal positive degree of A is defined as:
P(A) = 1 − D₂(A, (1, 0)) = 1 − ( ( (1 − t_A)² + f_A² ) / 2 )^{1/2}.
The ideal positive degree is a useful measure for evaluating the preference grade of a given IFV. From a geometric point of view, it is easy to observe that the closer an IFV A = (t_A, f_A) is to the positive ideal IFV α* = (1, 0), the better the IFV A is among all the IFVs in L*, and so, a greater ideal positive degree P(A) should be assigned to it. As pointed out by Xing et al. [44],
P(α*) = 1 − D₂(α*, (1, 0)) = 1.
This shows that the positive ideal IFV α* = (1, 0) is the most preferable IFV in L*, and its ideal positive degree reaches the maximum value.
It is worth noting that the ideal positive degree based on Euclidean distances might fail to differentiate IFVs, since different IFVs may have the same ideal positive degree. The next example illustrates this possibility:
Example 1.
Consider the IFVs:
A = (t_A, f_A) = (0.5, 0) and B = (t_B, f_B) = (0.6, 0.3).
Using Equation (9), we have:
P(A) = 1 − ( ( (1 − 0.5)² + 0² ) / 2 )^{1/2} = 1 − (0.25/2)^{1/2}
and:
P(B) = 1 − ( ( (1 − 0.6)² + 0.3² ) / 2 )^{1/2} = 1 − (0.25/2)^{1/2} = P(A).
Although A and B are different, we cannot tell A apart from B if we only rely on their ideal positive degrees.
Dually, a simpler and more general measure of IFVs arises in the following way, if we directly consider the Minkowski distance between the negative ideal IFV α_* = (0, 1) and any given IFV in L*.
Definition 22.
Let p ≥ 1 and α_* = (0, 1). The Minkowski score function of IFVs is a mapping s_p : L* → [0, 1] defined as:
s_p(A) = D_p(A, α_*) = ( ( t_A^p + (1 − f_A)^p ) / 2 )^{1/p}
for all A = (t_A, f_A) ∈ L*.
If p = 1 , we have:
s₁(A) = (t_A − f_A + 1) / 2 = δ_A.
Thus, the Minkowski score function s p ( A ) is reduced to the expectation score function δ ( A ) when p = 1 . In other words, the expectation score of an IFV A coincides with the normalized Hamming distance between A and ( 0 , 1 ) L * .
If p = 2 , then Equation (10) is reduced to the following form:
s₂(A) = ( ( t_A² + (1 − f_A)² ) / 2 )^{1/2},
which is called the Euclidean score function of IFVs.
As shown below, the Minkowski score function effectively solves the limitation of the ideal positive degree posed by Example 1.
Example 2.
In continuation of Example 1, the application of Equation (11) produces:
s₂(A) = ( ( 0.5² + (1 − 0)² ) / 2 )^{1/2} = 0.7906
and:
s₂(B) = ( ( 0.6² + (1 − 0.3)² ) / 2 )^{1/2} = 0.6519
therefore s 2 ( A ) = 0.7906 > 0.6519 = s 2 ( B ) . Thus, we conclude that A = ( 0.5 , 0 ) is superior to B = ( 0.6 , 0.3 ) if we compare them by the Euclidean score function.
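The computations of Examples 1 and 2 can be checked with the following small sketch (illustrative code, not part of the original study):

```python
# A = (0.5, 0) and B = (0.6, 0.3) share the same ideal positive degree P,
# yet their Euclidean scores s_2 differ.
def ideal_positive_degree(v):
    t, f = v
    return 1 - (((1 - t) ** 2 + f ** 2) / 2) ** 0.5

def minkowski_score(v, p=2):
    t, f = v
    return ((t ** p + (1 - f) ** p) / 2) ** (1 / p)

A, B = (0.5, 0.0), (0.6, 0.3)
print(ideal_positive_degree(A), ideal_positive_degree(B))  # equal values
print(minkowski_score(A), minkowski_score(B))              # ~0.7906 > ~0.6519
```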
It also might happen that two IFVs cannot be distinguished since they have the same Minkowski score. However, we can easily address this issue by adjusting the parameter p in the Minkowski score function. This can be illustrated by an example as follows.
Example 3.
Let us consider two IFVs:
A = ( t A , f A ) = ( 0.12 , 0.22 )
and:
B = ( t B , f B ) = ( 0.15 , 0.25 ) .
Since:
δ A = s 1 ( A ) = 0.45 = δ B = s 1 ( B ) ,
we cannot distinguish A from B with the expectation score function of IFVs. Nevertheless, we can tell them apart if we resort to a Minkowski score function with p > 1 . For instance, by Equation (11), we have:
s₂(A) = ( ( 0.12² + (1 − 0.22)² ) / 2 )^{1/2} = 0.5580
and:
s₂(B) = ( ( 0.15² + (1 − 0.25)² ) / 2 )^{1/2} = 0.5408.
Since s₂(A) > s₂(B), we can deduce that A = (0.12, 0.22) is superior to B = (0.15, 0.25) according to the Euclidean score function of IFVs. Similarly, by setting p = 9 in Equation (10), we have:
s₉(A) = ( ( 0.12⁹ + (1 − 0.22)⁹ ) / 2 )^{1/9} = 0.7222
and:
s₉(B) = ( ( 0.15⁹ + (1 − 0.25)⁹ ) / 2 )^{1/9} = 0.6944.
Since s 9 ( A ) > s 9 ( B ) , we also deduce that A is superior to B based on the Minkowski score function with p = 9 .
The following result indicates that the Euclidean score function s 2 ( A ) and the ideal positive degree P ( A ) can be seen as a pair of dual concepts:
Proposition 3.
Let s_p : L* → [0, 1] be the Minkowski score function and A = (t_A, f_A) ∈ L*. Then:
s₂(A) + P(Ā) = 1,
where Ā = (f_A, t_A) is the complement of A.
Proof. 
By Equation (11), we have
s₂(A) = ( ( t_A² + (1 − f_A)² ) / 2 )^{1/2},
and we can also get the following result according to Definition 21 that:
P(Ā) = P(f_A, t_A) = 1 − ( ( (1 − f_A)² + t_A² ) / 2 )^{1/2} = 1 − s₂(A),
where Ā = (f_A, t_A) is the complement of A. Then, it is easy to deduce that:
s₂(A) + P(Ā) = 1,
which completes the proof. □
It is interesting to observe that the expectation and Euclidean score functions are not logically equivalent for the purpose of comparing IFVs. This can be illustrated by an example as follows.
Example 4.
We consider the IFVs:
A = ( t A , f A ) = ( 0.4 , 0.3 )
and:
B = ( t B , f B ) = ( 0.2 , 0.2 ) .
Using Equation (3), we have:
δ_A = s₁(A) = (0.4 − 0.3 + 1) / 2 = 0.55
and:
δ_B = s₁(B) = (0.2 − 0.2 + 1) / 2 = 0.5.
It is clear that s 1 ( A ) = δ A > δ B = s 1 ( B ) . Thus, we deduce that A = ( 0.4 , 0.3 ) is superior to B = ( 0.2 , 0.2 ) if we compare them according to the expectation score function.
On the other hand, by Equation (11), we have:
s₂(A) = ( ( 0.4² + (1 − 0.3)² ) / 2 )^{1/2} = 0.5701
and:
s₂(B) = ( ( 0.2² + (1 − 0.2)² ) / 2 )^{1/2} = 0.5831.
Obviously, s 2 ( A ) < s 2 ( B ) . Hence, we deduce that A = ( 0.4 , 0.3 ) is inferior to B = ( 0.2 , 0.2 ) if we compare them according to the Euclidean score function.
It is worth noting that the Minkowski score function satisfies some reasonable properties as shown below:
Proposition 4.
Let s_p : L* → [0, 1] be the Minkowski score function. Then:
s_p(A) = 0 ⟺ A = (0, 1)
for all A = (t_A, f_A) ∈ L*.
Proof. 
By Definition 22 and Equation (10), we have p ≥ 1 and:
s_p(A) = D_p(A, (0, 1)) = ( ( t_A^p + (1 − f_A)^p ) / 2 )^{1/p}.
If s_p(A) = 0, then t_A^p + (1 − f_A)^p = 0. Note also that t_A ≥ 0 and f_A ≤ 1. Thus, t_A = 0 and f_A = 1. That is, A = (0, 1). Conversely, assume that A = (0, 1). Then, it is easy to see that s_p(A) = D_p(A, A) = 0. □
Proposition 5.
Let s_p : L* → [0, 1] be the Minkowski score function. Then:
s_p(A) = 1 ⟺ A = (1, 0)
for all A = (t_A, f_A) ∈ L*.
Proof. 
The proof is similar to that of Proposition 4 and thus omitted. □
The following result shows that the Minkowski score function s p is an order homomorphism from the lattice ( L * , L * ) to the lattice ( [ 0 , 1 ] , ) .
Proposition 6.
Let s_p : L* → [0, 1] be the Minkowski score function. Then:
A ≤_{L*} B ⟹ s_p(A) ≤ s_p(B)
for all A = (t_A, f_A) and B = (t_B, f_B) in L*.
Proof. 
Let A = (t_A, f_A) and B = (t_B, f_B) be two IFVs with A ≤_{L*} B. By Equation (1), we have t_A ≤ t_B and f_A ≥ f_B. Thus, for every p ≥ 1, it follows that t_A^p ≤ t_B^p and (1 − f_A)^p ≤ (1 − f_B)^p. This implies that:
s_p(A) = ( ( t_A^p + (1 − f_A)^p ) / 2 )^{1/p} ≤ ( ( t_B^p + (1 − f_B)^p ) / 2 )^{1/p} = s_p(B),
which completes the proof. □
It is worth noting that the converse of the implication in Proposition 6 does not hold as shown below:
Example 5.
We consider the IFVs:
A = ( t A , f A ) = ( 0.3 , 0.6 )
and:
B = ( t B , f B ) = ( 0.1 , 0.2 ) .
Let p = 2 . By Equation (11), we have:
s₂(A) = ( ( 0.3² + (1 − 0.6)² ) / 2 )^{1/2} = 0.3536
and:
s₂(B) = ( ( 0.1² + (1 − 0.2)² ) / 2 )^{1/2} = 0.5701.
Thus, it is clear that s_p(A) ≤ s_p(B). Nevertheless, it should be noted that A ≤_{L*} B does not hold since t_A > t_B.
Proposition 7.
Let s_p : L* → [0, 1] be the Minkowski score function. Then, 0 ≤ s_p(A) ≤ 1 for all A = (t_A, f_A) ∈ L*.
Proof. 
Let A = (t_A, f_A) ∈ L*. By the definition of L*, it is easy to see that:
(0, 1) ≤_{L*} (t_A, f_A) ≤_{L*} (1, 0).
From Proposition 6, it follows that:
s_p(0, 1) ≤ s_p(A) ≤ s_p(1, 0).
Note also that s_p(0, 1) = 0 and s_p(1, 0) = 1. Thus, we have 0 ≤ s_p(A) ≤ 1. □
Proposition 8.
Let s p : L * [ 0 , 1 ] be the Minkowski score function and A = ( t A , f A ) L * with h A = 1 . Then, s p ( A ) = t A .
Proof. 
Let A = (t_A, f_A) ∈ L* with h_A = t_A + f_A = 1. Thus, we have f_A = 1 − t_A. From Definition 22 and Equation (10), it follows that:
s_p(A) = ( ( t_A^p + (1 − f_A)^p ) / 2 )^{1/p} = ( t_A^p )^{1/p} = t_A,
completing the proof. □
Proposition 9.
The Minkowski score function s p is surjective.
Proof. 
For each a₀ ∈ [0, 1], let us take the IFV A₀ = (a₀, 1 − a₀) in L*. From Proposition 8, it follows that s_p(A₀) = a₀. This shows that the Minkowski score function s_p is a surjection. □
Proposition 10.
Let s p : L * [ 0 , 1 ] be the Minkowski score function and A = ( t A , f A ) L * . Then, s p ( t A , f A ) is non-decreasing with respect to t A .
Proof. 
By Definition 22 and Equation (10), we have p ≥ 1 and:
s_p(A) = s_p(t_A, f_A) = ( ( t_A^p + (1 − f_A)^p ) / 2 )^{1/p}.
By calculation, we get the partial derivative function of s_p with respect to t_A, which is as follows:
∂s_p/∂t_A = ( t_A^{p−1} / 2 ) · ( ( t_A^p + (1 − f_A)^p ) / 2 )^{(1 − p)/p} ≥ 0.
Hence, we conclude that s p ( t A , f A ) is non-decreasing with respect to t A . □
Proposition 11.
Let s p : L * [ 0 , 1 ] be the Minkowski score function and A = ( t A , f A ) L * . Then, s p ( t A , f A ) is non-increasing with respect to f A .
Proof. 
The proof is similar to that of Proposition 10 and thus omitted. □

5. Minkowski Weighted Score Functions

This section investigates the traits of an improved version of Minkowski’s score functions. We show that it is possible to preserve a good deal of its attractive properties while gaining generality.
Definition 23.
Let p ≥ 1 and α ∈ [0, 1] be the attitudinal parameter. The Minkowski weighted score function of IFVs is a mapping s_p^α : L* → [0, 1] defined as:
s_p^α(A) = ( (1 − α) t_A^p + α (1 − f_A)^p )^{1/p}
for all A = (t_A, f_A) ∈ L*.
If p = 1 , then Equation (12) is reduced to the following form:
s₁^α(A) = (1 − α) t_A + α (1 − f_A),
which is called the Hamming weighted score function of IFVs.
If p = 2 , then Equation (12) is reduced to the following form:
s₂^α(A) = ( (1 − α) t_A² + α (1 − f_A)² )^{1/2},
which is called the Euclidean weighted score function of IFVs.
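A compact sketch of the Minkowski weighted score function follows (illustrative only; the α = 0, 0.5, 1 special cases anticipate Proposition 12 below):

```python
# Minkowski weighted score of Definition 23: alpha = 0, 0.5 and 1 recover
# t_A, the Minkowski score s_p(A), and 1 - f_A, respectively.
def minkowski_weighted_score(v, p=2, alpha=0.5):
    t, f = v
    return ((1 - alpha) * t ** p + alpha * (1 - f) ** p) ** (1 / p)

A = (0.4, 0.3)
print(minkowski_weighted_score(A, 2, 0.0))   # ~0.4, the most pessimistic attitude (t_A)
print(minkowski_weighted_score(A, 2, 0.5))   # ~0.5701, the Euclidean score s_2(A) of Example 4
print(minkowski_weighted_score(A, 2, 1.0))   # ~0.7, the most optimistic attitude (1 - f_A)
```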
Proposition 12.
Let s_p^α : L* → [0, 1] be the Minkowski weighted score function and A = (t_A, f_A) ∈ L*. Then, we have:
(1)
s_p^0(A) = s_p^0(t_A, f_A) = t_A;
(2)
s_p^{0.5}(A) = s_p(A);
(3)
s_p^1(A) = s_p^1(t_A, f_A) = 1 − f_A.
Proof. 
By Definition 23 and Equation (12), we have:
s_p^α(A) = ( (1 − α) t_A^p + α (1 − f_A)^p )^{1/p},
where p ≥ 1 and α ∈ [0, 1].
If α = 0, then (1 − α) t_A^p = t_A^p and α (1 − f_A)^p = 0. It follows that:
s_p^0(A) = ( t_A^p + 0 )^{1/p} = t_A.
If α = 0.5, then (1 − α) t_A^p = (1/2) t_A^p and α (1 − f_A)^p = (1/2) (1 − f_A)^p. Thus, according to Definition 22 and Equation (10), we have:
s_p^{0.5}(A) = ( (1/2) t_A^p + (1/2) (1 − f_A)^p )^{1/p} = s_p(A).
Similarly, if α = 1, then (1 − α) t_A^p = 0 and α (1 − f_A)^p = (1 − f_A)^p. Thus, it is easy to deduce that
s_p^1(A) = ( 0 + (1 − f_A)^p )^{1/p} = 1 − f_A.
This completes the proof. □
Now, let us investigate whether the Minkowski weighted score function is bounded on the attitudinal parameter α .
Proposition 13.
Let s p α : L * [ 0 , 1 ] be the Minkowski weighted score function. Then:
t_A = s_p^0(A) ≤ s_p^α(A) ≤ s_p^1(A) = 1 − f_A
for all A = ( t A , f A ) L * .
Proof. 
By Definition 23 and Equation (12), we have p ≥ 1, α ∈ [0, 1] and:
s_p^α(A) = ( (1 − α) t_A^p + α (1 − f_A)^p )^{1/p}.
By calculation, we can get the partial derivative function of s_p^α with respect to α, which is as follows:
∂s_p^α/∂α = (1/p) · ( (1 − α) t_A^p + α (1 − f_A)^p )^{(1 − p)/p} · ( (1 − f_A)^p − t_A^p ),
where 1/p > 0 and ( (1 − α) t_A^p + α (1 − f_A)^p )^{(1 − p)/p} ≥ 0. Note also that (1 − f_A)^p − t_A^p ≥ 0 since t_A ≤ 1 − f_A. Thus, it follows that ∂s_p^α/∂α ≥ 0. Hence, we conclude that s_p^α(t_A, f_A) is non-decreasing with respect to α, so s_p^α(A) has the least value and the largest value when α = 0 and α = 1, respectively. According to Proposition 12, we can get that:
t_A = s_p^0(A) ≤ s_p^α(A) ≤ s_p^1(A) = 1 − f_A,
which completes the proof. □
The above proof also establishes the following proposition.
Proposition 14.
Let s_p^{α₁} and s_p^{α₂} be two Minkowski weighted score functions with α₁ < α₂. Then, s_p^{α₁}(A) ≤ s_p^{α₂}(A) for all A = (t_A, f_A) ∈ L*.
Proof. 
Note that s_p^α(t_A, f_A) is non-decreasing with respect to α as shown in the proof of Proposition 13. This directly implies that s_p^{α₁}(A) ≤ s_p^{α₂}(A) if α₁ < α₂, which completes the proof. □
Remark 1.
In view of Propositions 12–14, it is worth noting that the attitudinal parameter α [ 0 , 1 ] can be used to capture decision maker’s general attitude toward the information expressed in terms of IFVs. Given an IFV A = ( t A , f A ) , its Minkowski weighted score s p α ( A ) will not decrease when the attitudinal parameter α increases. The least Minkowski weighted score function s p 0 ( A ) = t A is obtained when α = 0 , which corresponds to the most pessimistic attitude. The largest Minkowski weighted score function s p 1 ( A ) = 1 f A is obtained when α = 1 , which corresponds to the most optimistic attitude. In the middle lies the usual Minkowski score function s p ( A ) obtained by taking α = 0.5 , which corresponds to a neutral attitude.
It is interesting to see that we might be able to express the newly defined Minkowski weighted score function in terms of the previously known functions.
Proposition 15.
Let A = (t_A, f_A) ∈ L*. Then, s₁^α(A) = t_A + α π_A.
Proof. 
Let A = (t_A, f_A) ∈ L*. By definition, π_A = 1 − t_A − f_A. According to Equation (13), we have:
s₁^α(A) = (1 − α) t_A + α (1 − f_A) = t_A + α (1 − t_A − f_A) = t_A + α π_A.
This completes the proof. □
Proposition 16.
Let A = (t_A, f_A) ∈ L*. Then, s₂^α(A) = ( t_A² + α π_A (1 + s_A) )^{1/2}.
Proof. 
By the definition of π_A and s_A, we have π_A = 1 − t_A − f_A and s_A = t_A − f_A. Then, according to Equation (14), we have:
s₂^α(A) = ( (1 − α) t_A² + α (1 − f_A)² )^{1/2} = ( t_A² + α ( (1 − f_A)² − t_A² ) )^{1/2} = ( t_A² + α (1 − f_A − t_A)(1 − f_A + t_A) )^{1/2} = ( t_A² + α π_A (1 + s_A) )^{1/2},
which completes the proof. □
It can be seen that the Minkowski weighted score function satisfies some reasonable properties as shown below.
Proposition 17.
Let s_p^α : L* → [0, 1] be the Minkowski weighted score function. Then, 0 ≤ s_p^α(A) ≤ 1 for all A = (t_A, f_A) ∈ L*.
Proof. 
Let A = (t_A, f_A) ∈ L*. By Proposition 13, we have:
t_A ≤ s_p^α(A) ≤ 1 − f_A.
According to the definition of IFVs, it is easy to see that the least value of t_A is zero, and the largest value of 1 − f_A is one. Hence, we conclude that 0 ≤ s_p^α(A) ≤ 1 for all A = (t_A, f_A) ∈ L*. □
Proposition 18.
Let s p α : L * [ 0 , 1 ] be the Minkowski weighted score function and A = ( t A , f A ) L * with h A = 1 . Then, s p α ( A ) = t A .
Proof. 
Let A = (t_A, f_A) ∈ L* with h_A = t_A + f_A = 1. Thus, we have f_A = 1 − t_A. From Definition 23 and Equation (12), it follows that:
s_p^α(A) = ( (1 − α) t_A^p + α (1 − f_A)^p )^{1/p} = ( t_A^p )^{1/p} = t_A,
completing the proof. □
Proposition 19.
The Minkowski weighted score function s p α is surjective.
Proof. 
For each a₀ ∈ [0, 1], let us take the IFV A₀ = (a₀, 1 − a₀) in L*. From Proposition 18, it follows that s_p^α(A₀) = a₀. This shows that the Minkowski weighted score function s_p^α is a surjection. □
Proposition 20.
Let A = ( t A , f A ) L * . The Minkowski weighted score function s p α ( t A , f A ) is non-decreasing with respect to t A .
Proof. 
By Definition 23 and Equation (12), we have p ≥ 1, 0 ≤ α ≤ 1, and:
s_p^α(A) = ( (1 − α) t_A^p + α (1 − f_A)^p )^{1/p}.
By calculation, we get the partial derivative function of s_p^α with respect to t_A, which is as follows:
∂s_p^α/∂t_A = (1 − α) t_A^{p−1} · ( (1 − α) t_A^p + α (1 − f_A)^p )^{(1 − p)/p} ≥ 0.
Hence, we conclude that s p α ( t A , f A ) is non-decreasing with respect to t A . □
Proposition 21.
Let A = ( t A , f A ) L * . The Minkowski weighted score function s p α ( t A , f A ) is non-increasing with respect to f A .
Proof. 
The proof is similar to that of Proposition 20 and thus omitted. □

6. A New Intuitionistic Fuzzy MADM Method

In this section, we propose a new approach to intuitionistic fuzzy MADM based on the SWIFA operator and the Minkowski weighted score function.
The components of our model are as follows.
We denote a general set of options by U = {p₁, p₂, …, p_m}. The collection of relevant characteristics is captured by A = {e₁, e₂, …, e_n}, and we split it into the disjoint subsets A⁺ and A⁻ that respectively comprise benefit and cost attributes. A committee of experts evaluates each alternative with respect to the attributes in A. Their evaluations produce an IFSS Ω̃ = (ω̃, A), and the evaluation of option p_i with respect to the attribute e_j is the following IFV:
ω̃(e_j)(p_i) = (t_ij, f_ij),
for every possible i = 1, …, m and j = 1, …, n.

6.1. Wei’s Method to Determine Weight Vector of Attributes

In [34], Wei put forward a procedure for solving intuitionistic fuzzy MADM problems with partially known or completely unknown information about attribute weights. We proceed to describe the fundamental constituents of their approach.
Definition 24
([53]). The normalized Hamming distance between a = (t_a, f_a), b = (t_b, f_b) ∈ L* is:
d(a, b) = (1/2) ( |t_a − t_b| + |f_a − f_b| ).
Wei [34] proposed a maximizing deviation method that determines the weight vector w = (w₁, w₂, …, w_n)ᵀ. A weight vector w is selected that maximizes the total weighted deviation value among all options and with respect to all attributes. This item is defined by:
D(w) = ∑_{j=1}^{n} ∑_{i=1}^{m} ∑_{k=1}^{m} d( ω̃(e_j)(p_i), ω̃(e_j)(p_k) ) w_j.
To achieve this goal, the following single-objective programming model is generated:
(M1): max D(w) = ∑_{j=1}^{n} ∑_{i=1}^{m} ∑_{k=1}^{m} d( ω̃(e_j)(p_i), ω̃(e_j)(p_k) ) w_j, s.t. w ∈ H, ∑_{j=1}^{n} w_j = 1, w_j ≥ 0, j = 1, 2, …, n,
where:
d( ω̃(e_j)(p_i), ω̃(e_j)(p_k) ) = (1/2) ( |t_ij − t_kj| + |f_ij − f_kj| )
and H captures the weight information that is known.
The solution of problem (M1) gives a vector w that can act as the weight vector for solving the MADM.
Further, in the case of complete lack of information about attribute weights, the following alternative programming model is at hand:
(M2): max D(w) = ∑_{j=1}^{n} ∑_{i=1}^{m} ∑_{k=1}^{m} d( ω̃(e_j)(p_i), ω̃(e_j)(p_k) ) w_j, s.t. ∑_{j=1}^{n} w_j² = 1, w_j ≥ 0, j = 1, 2, …, n.
The following Lagrange function is helpful to solve this model:
L(w, λ) = (1/2) ∑_{j=1}^{n} ∑_{i=1}^{m} ∑_{k=1}^{m} w_j ( |t_ij − t_kj| + |f_ij − f_kj| ) + (λ/4) ( ∑_{j=1}^{n} w_j² − 1 ).
Upon normalization of the attribute weights that are obtained, we can produce:
w_j = ∑_{i=1}^{m} ∑_{k=1}^{m} ( |t_ij − t_kj| + |f_ij − f_kj| ) / ∑_{j=1}^{n} ∑_{i=1}^{m} ∑_{k=1}^{m} ( |t_ij − t_kj| + |f_ij − f_kj| )
for j = 1 , 2 , , n .
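Assuming the decision data are stored as a list of rows of (t, f) pairs (a layout of our own choosing, not prescribed by the paper), the normalized weights of the completely unknown case can be sketched as follows:

```python
# Maximizing-deviation weights for completely unknown weight information
# (the normalized formula above); matrix[i][j] holds the IFV (t_ij, f_ij)
# of alternative i under attribute j.
def deviation_weights(matrix):
    n = len(matrix[0])
    dev = [
        sum(abs(row_i[j][0] - row_k[j][0]) + abs(row_i[j][1] - row_k[j][1])
            for row_i in matrix for row_k in matrix)
        for j in range(n)
    ]
    total = sum(dev)
    return [d / total for d in dev]
```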

6.2. A New Algorithm

With the combined use of the maximizing deviation method, the SWIFA operator, and the Minkowski weighted score function, a new Algorithm 1 for intuitionistic fuzzy MADM can be developed.
Algorithm 1 The algorithm for intuitionistic fuzzy MADM.
 1:
Construct an IFSS Ω ˜ = ( ω ˜ , A ) over U = { p 1 , p 2 , , p m } based on the collected data associated with the concerned decision-making problem.
 2:
Divide the set A = { e 1 , e 2 , , e n } into two disjoint subsets A + and A , which consist of benefit and cost attributes, respectively.
 3:
Calculate the generalized A -complement G = Ω ˜ A c = ( G ˜ , A ) of the IFSS Ω ˜ according to Definition 6. For convenience, the approximate function of G is denoted by:
G ˜ ( e j ) ( p i ) = ( t i j , f i j ) ,
where i = 1 , 2 , , m and j = 1 , 2 , , n .
 4:
Determine the weight vector w = ( w 1 , w 2 , , w n ) T in two different cases as follows:
     • Case 1: When the information about the attribute weights is completely unknown, we solve the following single-objective programming model:
max D(w) = (1/2) ∑_{j=1}^{n} ∑_{i=1}^{m} ∑_{k=1}^{m} ( |t_ij − t_kj| + |f_ij − f_kj| ) w_j, s.t. ∑_{j=1}^{n} w_j² = 1, w_j ≥ 0, j = 1, 2, …, n,
and we can get the weight vector by normalizing the obtained optimal solution.
     • Case 2: When the information about the attribute weights is partially known, the weight vector can be derived as the solution of the following single-objective programming model:
max D(w) = (1/2) ∑_{j=1}^{n} ∑_{i=1}^{m} ∑_{k=1}^{m} ( |t_ij − t_kj| + |f_ij − f_kj| ) w_j, s.t. w ∈ H, ∑_{j=1}^{n} w_j = 1, w_j ≥ 0, j = 1, 2, …, n,
where H represents the known weight information.
 5:
Calculate the overall intuitionistic fuzzy preference value (OIFPV) Z G ( p i ) of the alternative p i according to Definition 20, i.e.,
Z_G(p_i) = Ξ_w^s( G̃(e₁)(p_i), G̃(e₂)(p_i), …, G̃(e_n)(p_i) ) = ( ∑_{j=1}^{n} w_j t_ij, ∑_{j=1}^{n} w_j f_ij )
for i = 1 , 2 , , m .
 6:
Specify p ≥ 1 and the attitudinal parameter α ∈ [0, 1].
 7:
Calculate the Minkowski weighted score s p α Z G ( p i ) of the OIFPV Z G ( p i ) according to Definition 23, i.e.,
s_p^α( Z_G(p_i) ) = ( (1 − α) (t_{Z_G(p_i)})^p + α (1 − f_{Z_G(p_i)})^p )^{1/p}
for i = 1 , 2 , , m .
 8:
Rank p i ( i = 1 , 2 , , m ) according to the descending order of the Minkowski weighted scores s p α Z G ( p i ) .
 9:
The optimal decision is to select the object(s) with the greatest Minkowski weighted score.
For clarity, the decision-making process of this method is demonstrated in Figure 2.
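The following end-to-end sketch ties the steps of Algorithm 1 together under the same assumed data layout as the earlier snippets; it is an illustration of the workflow, not the authors' implementation, and it covers only Case 1 of Step 4 (completely unknown weight information):

```python
# Sketch of Algorithm 1: `ifss` maps attributes to {alternative: (t, f)},
# `cost` lists the cost attributes (A^-), the weights come from the maximizing
# deviation method, and the names are ours.
def rank_alternatives(ifss, cost, p=2, alpha=0.5):
    attrs = list(ifss)
    alts = list(next(iter(ifss.values())))
    # Step 3: generalized A^- complement (flip the IFVs of cost attributes).
    g = {a: {u: ((f, t) if a in cost else (t, f)) for u, (t, f) in ifss[a].items()}
         for a in attrs}
    # Step 4 (Case 1): maximizing deviation coefficients, then normalization.
    dev = [sum(abs(g[a][u][0] - g[a][v][0]) + abs(g[a][u][1] - g[a][v][1])
               for u in alts for v in alts) for a in attrs]
    w = [d / sum(dev) for d in dev]
    # Steps 5-8: OIFPVs via the SWIFA operator, then Minkowski weighted scores.
    scores = {}
    for u in alts:
        t = sum(wj * g[a][u][0] for wj, a in zip(w, attrs))
        f = sum(wj * g[a][u][1] for wj, a in zip(w, attrs))
        scores[u] = ((1 - alpha) * t ** p + alpha * (1 - f) ** p) ** (1 / p)
    return sorted(alts, key=scores.get, reverse=True)  # best alternative first
```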

7. Numerical Illustration

The purpose of this section is twofold. We illustrate the application of our methodology in two inspiring case-studies. We compare the results that we obtain with those of other benchmark methodologies in the same context.

7.1. An Investment Problem

In this subsection, we revisit a benchmark problem regarding venture capital investment, originally raised by Herrera and Herrera-Viedma [54]. Wei first investigated this problem under an intuitionistic fuzzy setting in [34]. Chen and Tu [55] further examined this problem to demonstrate the discriminative capability of some dual bipolar measures of IFVs. Note also that Feng et al. [51] explored the same problem to illustrate some lexicographic orders of IFVs. In what follows, this problem is to be used for illustrating Algorithm 1 in the case where the information about the attribute weights is completely unknown.
Suppose that an investment bank plans to make a venture capital investment in the most suitable company among five alternatives:
  • A 1 is a car company;
  • A 2 is a food company;
  • A 3 is a computer company;
  • A 4 is an arms company;
  • A 5 is a television company.
The set consisting of these companies is denoted by U, and the decision is made based on four criteria in the set C = { c 1 , c 2 , c 3 , c 4 } , where:
  • c 1 represents the risk analysis;
  • c 2 represents the growth analysis;
  • c 3 represents the social-political impact analysis;
  • c 4 represents the environmental impact analysis.
Now, let us solve the above risk investment problem using Algorithm 1 proposed in Section 6. The step-wise description is presented below:
Step 1. Based on the evaluation results adopted from [34], we construct an IFSS I = ( F ˜ , C ) over U = { A 1 , A 2 , , A 5 } as shown in Table 1.
Step 2. It is easy to see that C⁺ = C and C⁻ = ∅ since c₁, c₂, c₃, and c₄ are all benefit attributes.
Step 3. By Definition 6, the generalized C⁻-complement of I is:
G = (G̃, C) = I_∅^c = I,
and its approximate function is denoted by:
G̃(c_j)(A_i) = (t_ij, f_ij),
where i = 1, 2, …, 5 and j = 1, 2, …, 4.
Step 4. In this case, the information about the attribute weights is completely unknown. Accordingly, a single-objective programming model can be established as follows:
Max D(W) = 3.0 w₁ + 2.2 w₂ + 4.8 w₃ + 4.4 w₄, s.t. w₁² + w₂² + w₃² + w₄² = 1, w_j ≥ 0 (j = 1, 2, 3, 4).
By solving this model, we can get the following optimal solution:
W 1 * = ( 0.4000 , 0.2934 , 0.6401 , 0.5867 ) T .
After normalization, the obtained weight vector is:
W 1 = ( w 1 , w 2 , w 3 , w 4 ) T = ( 0.2083 , 0.1528 , 0.3333 , 0.3056 ) T .
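The reported figures can be reproduced with a few lines (sketch only): the optimal solution of the model is the coefficient vector rescaled to unit Euclidean norm, and W₁ is its sum-to-one normalization:

```python
# Check of Step 4: coefficients of D(W), their Euclidean-norm scaling (W_1*)
# and their sum-to-one normalization (W_1).
c = [3.0, 2.2, 4.8, 4.4]
norm = sum(x * x for x in c) ** 0.5
W1_star = [x / norm for x in c]   # ~ (0.4000, 0.2934, 0.6401, 0.5867)
W1 = [x / sum(c) for x in c]      # ~ (0.2083, 0.1528, 0.3333, 0.3056)
```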
Step 5. Using the weight vector W 1 , the OIFPV Z G ( A 1 ) is calculated as follows:
Z_G(A₁) = ∑_{j=1}^{4} w_j G̃(c_j)(A₁) = (0.3569, 0.5431).
The OIFPVs of all alternatives A i ( 1 i 5 ) can be found in Table 2.
Step 6. Assume that the decision maker specifies p = 2 and the attitudinal parameter α = 0.7 , respectively.
Step 7. Based on the selected values, we calculate the Euclidean weighted scores s 2 0.7 Z G ( A i ) of the OIFPVs according to Equation (14):
s₂^{0.7}( Z_G(A_i) ) = ( (1 − 0.7) (t_{Z_G(A_i)})² + 0.7 (1 − f_{Z_G(A_i)})² )^{1/2}
for i = 1 , 2 , , 5 . For instance, the Euclidean weighted score s 2 0.7 Z G ( A 1 ) can be obtained as:
s₂^{0.7}( Z_G(A₁) ) = ( 0.3 (t_{Z_G(A₁)})² + 0.7 (1 − f_{Z_G(A₁)})² )^{1/2} = 0.4294.
The other results can be found in Table 2.
Step 8. According to the descending order of the Euclidean weighted score s 2 0.7 Z G ( A i ) , the ranking of A i ( 1 i 5 ) can be obtained as follows:
A₅ ≻ A₂ ≻ A₃ ≻ A₄ ≻ A₁.
Step 9. The optimal decision is to invest in the company A 5 since it has the greatest Euclidean weighted score.
In order to verify the effectiveness of the proposed method and its consistency with existing literature, now we make a comparative check against several lexicographic orders introduced in [51] and four other representative methods [34,45,46]. We apply them to the same illustrative example that we have studied above.
According to Definition 15 and the classical scores s ( Z G ( A i ) ) ( 1 i 5 ) shown in Table 2, we can obtain the following ranking:
A₅ ≻ A₂ ≻ A₃ ≻ A₄ ≻ A₁.
Table 3 summarizes all the ranking results obtained by Algorithm 1, Wei’s method [34], Xia and Xu’s method [45], Song et al.’s method [46], and the lexicographic orders (s, t), (s, f), (f, t), and (t, f). By comparison, it can be seen that the ranking obtained by Algorithm 1 is only slightly different from the result of the lexicographic order (t, f). Although the proposed method, Wei’s method [34], Xia and Xu’s method [45], Song et al.’s method [46], and the lexicographic orders (s, t), (s, f), and (f, t) can produce the same ranking of alternatives, the rationales of these approaches are quite different.
Remark 2.
Let us briefly recall some essential facts concerning the approaches that we use for contrast. Wei’s method [34] selects the weight vector w, which maximizes the total weighted deviation value among all options and with respect to all attributes. Xia and Xu’s method [45] determines the optimal weights of attributes based on entropy and cross entropy. According to Xia and Xu’s idea, an attribute with smaller entropy and larger cross entropy should be assigned a larger weight. The design of Song et al.’s similarity measure [46] for IFSs involves Jousselme’s distance measure and the cosine similarity measure between basic probability assignments (BPAs). Their similarity measure can avoid the counter-intuitive outputs by a single evaluation of the similarity measure, and it grants more rationality to the ranking results.
Feng et al. [51] pointed out that lexicographic orders ( s , t ) , ( s , f ) , ( f , t ) , and ( t , f ) are not logically equivalent. Therefore, we might observe that they produce different outputs. In our case, notice that the ranking obtained under ( t , f ) is different from the other three rankings. It is also worth noting that the rankings derived from ( s , t ) and ( s , f ) are identical, despite the fact that the scores Z G ( A i ) ( 1 i 5 ) are all different.
This comparison analysis shows that our new approach provides an effective and consistent tool for solving multiple attribute decision-making problems under an intuitionistic fuzzy environment.

7.2. A Supplier Selection Problem

In this subsection, we illustrate Algorithm 1 in the case where the information about the attribute weights is partly known. We revisit a multi-attribute intuitionistic fuzzy group decision-making problem studied in Boran et al. [47]. This reference analyzed a supplier selection problem by means of the TOPSIS method with intuitionistic fuzzy sets. We resort to a modified version of the case study in [47] where the number of alternatives is artificially increased to ten, for the purpose of better understanding Algorithm 1.
Following [47], an automotive company intends to select the most appropriate supplier for one of the key components in its manufacturing process. The selection will be made among ten alternatives in U = {s₁, s₂, …, s₁₀}, which are evaluated based on four criteria. The set consisting of these criteria is denoted by C = {e₁, e₂, e₃, e₄}, where:
  • e 1 stands for product quality;
  • e 2 stands for relationship closeness;
  • e 3 stands for delivery performance;
  • e 4 stands for product price.
Now, let us solve the above supplier selection problem using Algorithm 1. The step-wise description is given below:
Step 1. Based on the evaluation results adopted from [47], we construct an IFSS I = ( F ˜ , C ) over U = { s 1 , s 2 , , s 10 } as shown in Table 4. It should be noted that the evaluations for the alternatives s 1 , s 2 , , s 5 are inherited from [47], while the evaluations of the new alternatives s 6 , s 7 , , s 10 are the simple intuitionistic fuzzy average (SIFA) values of the original values for s 1 , s 2 , , s 5 , taken by pairs.
For instance, the SIFA of the evaluations of the alternatives s 1 and s 2 is used as the evaluation result of the artificial alternative s 6 = s ( 1 , 2 ) . More specifically, by Equation (5), we get:
F̃(e₁)( s_(1,2) ) = Ξ^s( F̃(e₁)(s₁), F̃(e₁)(s₂) ) = (0.6620, 0.2360).
All other evaluation results can be obtained in a similar fashion. For convenience, we simply write s 6 = s ( 1 , 2 ) , s 7 = s ( 2 , 3 ) , s 8 = s ( 3 , 4 ) , s 9 = s ( 4 , 5 ) , and s 10 = s ( 1 , 5 ) .
Step 2. Divide the set C into two disjoint subsets C⁺ and C⁻. It is easy to see that C⁺ = {e₁, e₂, e₃} and C⁻ = {e₄}, since e₁, e₂, and e₃ are benefit attributes, while e₄ is a cost attribute.
Step 3. Calculate the generalized C⁻-complement G = (G̃, C) = I_{C⁻}^c of the IFSS I according to Definition 6, and the results are shown in Table 5. For convenience, the approximate function of G is denoted by:
G̃(e_j)(s_i) = (t_ij, f_ij),
where i = 1, 2, …, 10 and j = 1, 2, …, 4.
Step 4. In this case, the information about the attribute weights is partly known. Accordingly, a single-objective programming model can be established as follows:
Max D(W) = 8.325 w₁ + 10.179 w₂ + 4.455 w₃ + 7.736 w₄, s.t. 0.25 ≤ w₁ ≤ 0.30, 0.26 ≤ w₂ ≤ 0.31, 0.23 ≤ w₃ ≤ 0.28, 0.08 ≤ w₄ ≤ 0.14, w₁ + w₂ + w₃ + w₄ = 1, w_j ≥ 0 (j = 1, 2, 3, 4).
By solving this model, we can get the following normalized optimal weight vector:
W 2 = ( w 1 , w 2 , w 3 , w 4 ) T = ( 0.30 , 0.31 , 0.25 , 0.14 ) T .
Step 5. Using the weight vector W 2 , the OIFPVs of all alternatives s i ( 1 ≤ i ≤ 10 ) can be calculated; the results are presented in Table 6.
Step 6. Assume that the decision maker specifies p = 3 and the attitudinal parameter α = 0.5 . As mentioned in Remark 1, the Minkowski weighted score function reduces to the Minkowski score function when α = 0.5 , which represents the decision maker’s neutral attitude.
Step 7. Based on the selected values, we calculate the Minkowski weighted scores $s_3^{0.5}(Z_G(s_i))$ of the OIFPVs according to Equation (10):
\[
s_3^{0.5}\big(Z_G(s_i)\big) = s_3\big(Z_G(s_i)\big) = \left( \frac{t_{Z_G(s_i)}^{3} + \big(1 - f_{Z_G(s_i)}\big)^{3}}{2} \right)^{\frac{1}{3}}
\]
for $i = 1, 2, \ldots, 10$. For instance, the Minkowski weighted score $s_3^{0.5}(Z_G(s_1))$ is:
\[
s_3^{0.5}\big(Z_G(s_1)\big) = \left( \frac{t_{Z_G(s_1)}^{3} + \big(1 - f_{Z_G(s_1)}\big)^{3}}{2} \right)^{\frac{1}{3}} = \left( \frac{0.6355^{3} + (1 - 0.2628)^{3}}{2} \right)^{\frac{1}{3}} = 0.6901.
\]
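A short computational sketch of this step follows. Since Equation (10) is not reproduced here, the sketch assumes that the Minkowski weighted score of an IFV $x = (t_x, f_x)$ takes the weighted-power-mean form $s_p^{\alpha}(x) = \big( (1-\alpha)\, t_x^{\,p} + \alpha\, (1-f_x)^{p} \big)^{1/p}$, which reduces to the Minkowski score above when α = 0.5 and agrees with the values reported in Table 6.

```python
def minkowski_weighted_score(t, f, p=3, alpha=0.5):
    """Minkowski weighted score of an IFV (t, f).
    Assumed form: ((1 - alpha) * t**p + alpha * (1 - f)**p) ** (1 / p),
    which reduces to the (unweighted) Minkowski score when alpha = 0.5."""
    return ((1 - alpha) * t ** p + alpha * (1 - f) ** p) ** (1 / p)

# OIFPV Z_G(s1) from Table 6
t1, f1 = 0.6355, 0.2628
print(round(minkowski_weighted_score(t1, f1, p=3, alpha=0.5), 4))  # 0.6901
```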
To make a thorough comparison, we also consider two other cases in which α = 0.2 and α = 0.8 , representing the decision maker’s negative and positive attitudes, respectively. Table 6 gives all the OIFPVs and their Minkowski weighted scores with p = 3 and various choices of the attitudinal parameter α .
Step 8. The ranking of $s_i$ ($i = 1, 2, \ldots, 10$) based on the descending order of the Minkowski weighted scores $s_3^{0.5}(Z_G(s_i))$ is obtained as follows:
\[
s_3 \succ s_8 \succ s_1 \succ s_7 \succ s_6 \succ s_4 \succ s_{10} \succ s_2 \succ s_9 \succ s_5.
\]
Step 9. The optimal decision is to select the supplier s 3 since it has the greatest Minkowski weighted score.
The analysis above presumes a neutral attitude, i.e., α = 0.5 . In order to visualize the influence of the attitudinal parameter on the final ranking, let us now see what conclusions can be drawn when we keep p = 3 and rank the alternatives with two other choices of the attitudinal parameter α . Table 6 gives the Minkowski weighted scores s 3 0.2 ( Z G ( s i ) ) and s 3 0.8 ( Z G ( s i ) ) of the OIFPVs Z G ( s i ) ( 1 ≤ i ≤ 10 ) , corresponding to a negative attitude ( α = 0.2 ) and a positive attitude ( α = 0.8 ), respectively. It is interesting to observe that the resulting rankings are almost identical to the ranking under a neutral attitude. This indicates that Algorithm 1 is, to some extent, robust with respect to the choice of the attitudinal parameter.
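The robustness check described above can be automated. The sketch below recomputes the scores of the ten OIFPVs of Table 6 for α ∈ {0.2, 0.5, 0.8}, reusing the weighted-power-mean form assumed in the previous sketch, and prints the corresponding rankings.

```python
def mw_score(t, f, p=3, alpha=0.5):
    # Same assumed weighted-power-mean form as in the previous sketch
    return ((1 - alpha) * t ** p + alpha * (1 - f) ** p) ** (1 / p)

# OIFPVs Z_G(s_i) from Table 6
oifpv = {
    "s1": (0.6355, 0.2628), "s2": (0.5723, 0.3260), "s3": (0.7067, 0.2167),
    "s4": (0.5877, 0.3106), "s5": (0.5312, 0.3683), "s6": (0.6039, 0.2944),
    "s7": (0.6395, 0.2714), "s8": (0.6472, 0.2637), "s9": (0.5595, 0.3394),
    "s10": (0.5833, 0.3155),
}

for alpha in (0.2, 0.5, 0.8):
    scores = {s: mw_score(t, f, p=3, alpha=alpha) for s, (t, f) in oifpv.items()}
    print(alpha, " > ".join(sorted(scores, key=scores.get, reverse=True)))
# alpha = 0.2 places s7 just above s1; alpha = 0.5 and 0.8 reproduce the ranking of Step 8
```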
In addition, Boran et al. [47] introduced the TOPSIS method based on intuitionistic fuzzy sets, which can be used for ranking IFVs as well. To further verify the rationality of the proposed methods, we compare the rankings that arise, under the weight vector W 2 obtained by Algorithm 1, from Algorithm 1, from the intuitionistic fuzzy TOPSIS method, and from various lexicographic orders. As before, the rankings under the lexicographic orders are obtained from the classical scores s ( Z G ( s i ) ) ( i = 1 , 2 , … , 10 ) listed in Table 6.
The attribute weight vector in [47] is expressed as IFVs based on the opinions of experts, which differs from the form of W 2 . Therefore, the construction of the aggregated weighted intuitionistic fuzzy decision matrix in Step 4 of the intuitionistic fuzzy TOPSIS method in [47] must be adapted. Assume that μ is the fuzzy set in C such that μ ( e j ) = w j ( 1 ≤ j ≤ 4 ). Then, the weighted IFSS can easily be obtained by calculating the scalar product R = μ G = ( R ˜ , C ) of the IFSS G and the fuzzy set μ . For instance, by Definition 7 and Equation (2), we have:
\[
\widetilde{R}(e_1)(s_1) = \Big( 1 - \big(1 - t_{\widetilde{G}(e_1)(s_1)}\big)^{\mu(e_1)},\ f_{\widetilde{G}(e_1)(s_1)}^{\ \mu(e_1)} \Big) = \Big( 1 - \big(1 - t_{\widetilde{G}(e_1)(s_1)}\big)^{w_1},\ f_{\widetilde{G}(e_1)(s_1)}^{\ w_1} \Big) = \Big( 1 - (1 - 0.7280)^{0.30},\ 0.1700^{0.30} \Big) = (0.3233,\ 0.5876).
\]
The weighted IFSS that arises is shown in Table 7.
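A sketch of the weighting step is given below. In line with the computation above, it assumes that a weight w ∈ [0, 1] acts on an IFV (t, f) as (1 − (1 − t)^w, f^w); the helper name weight_ifv is ours.

```python
def weight_ifv(ifv, w):
    """Scalar product of a weight w in [0, 1] with an IFV (t, f),
    assumed form: (1 - (1 - t)**w, f**w)."""
    t, f = ifv
    return (1 - (1 - t) ** w, f ** w)

# G~(e1)(s1) from Table 5 weighted by w1 = 0.30
print(tuple(round(x, 4) for x in weight_ifv((0.7280, 0.1700), 0.30)))
# approximately (0.3233, 0.5877); Table 7 reports (0.3233, 0.5876) up to rounding
```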
By performing the rest of the steps of the intuitionistic fuzzy TOPSIS method in [47], we obtain Table 8, which consists of the Euclidean distance S i + ( i = 1 , 2 , … , 10 ) between each object and the positive ideal solution, the Euclidean distance S i − ( i = 1 , 2 , … , 10 ) between each object and the negative ideal solution, and the relative closeness coefficient C i * ( i = 1 , 2 , … , 10 ) . According to the intuitionistic fuzzy TOPSIS method, the ranking results based on the descending order of the relative closeness coefficient C i * are as shown in Table 9.
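For completeness, the final step of the comparison can also be scripted. The separation measures are computed as in [47]; the sketch below simply takes the values of Table 8, forms the standard relative closeness coefficient C i * = S i − / ( S i + + S i − ) , and ranks the suppliers in descending order of C i * .

```python
# Separation measures (S+, S-) from Table 8
sep = {
    "s1": (0.0892, 0.0964), "s2": (0.1342, 0.0619), "s3": (0.0564, 0.1628),
    "s4": (0.1238, 0.0635), "s5": (0.1651, 0.0533), "s6": (0.1093, 0.0694),
    "s7": (0.0848, 0.0949), "s8": (0.0800, 0.0983), "s9": (0.1444, 0.0519),
    "s10": (0.1272, 0.0593),
}

closeness = {s: s_minus / (s_plus + s_minus) for s, (s_plus, s_minus) in sep.items()}
print(" > ".join(sorted(closeness, key=closeness.get, reverse=True)))
# s3 > s8 > s7 > s1 > s6 > s4 > s10 > s2 > s9 > s5, as in the first row of Table 9
```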
Table 9 summarizes all the ranking results obtained by Algorithm 1 (with three different attitudinal parameters α ), the intuitionistic fuzzy TOPSIS method [47], and the lexicographic orders ( s , t ) , ( s , f ) , ( f , t ) , and ( t , f ) . By comparison, it can be seen that the ranking obtained by Algorithm 1 with the pessimistic attitude differs only slightly from those obtained with either the neutral or the optimistic attitude. Algorithm 1 with either the neutral or the optimistic attitude and the lexicographic orders ( s , t ) and ( s , f ) produce the same results. The ranking given by Algorithm 1 with the pessimistic attitude coincides with the results of the lexicographic order ( t , f ) and the intuitionistic fuzzy TOPSIS method. Whatever the choice, the best supplier is s 3 , and the worst supplier is s 5 . Thus, this numerical example concerning supplier selection further illustrates that the method proposed in this study is feasible for solving multiple attribute decision-making problems in real-life environments.
Remark 3.
Let us briefly highlight several key points regarding the above discussion. Note first that the set of alternatives is expanded from the original set in [47], consisting of five alternatives, to a new set containing ten alternatives. All the evaluation results of these alternatives are shown in Table 4. The evaluation results of the alternatives s i ( i = 1 , 2 , … , 5 ) are inherited directly from [47]. In contrast, the alternatives s i ( i = 6 , 7 , … , 10 ) in Table 4 are artificial ones, and their evaluation results are obtained by computing the SIFA values of the IFVs associated with s 1 , s 2 , … , s 5 in a pairwise manner. Note also that the artificial alternatives s 6 , s 7 , … , s 10 provide an additional way to verify the rationality of Algorithm 1. In fact, we can see from Table 9 that each artificial alternative lies between the two original alternatives that are used to produce it. For instance, s 6 = s ( 1 , 2 ) lies between s 1 and s 2 in all the ranking results. This is due to the fact that the evaluation results of s 6 are obtained by simply averaging the corresponding results of s 1 and s 2 . Finally, the ranking results in Table 9 with α = 0.2 , α = 0.5 , and α = 0.8 show that the selection of different attitudinal parameters may affect the final ranking results. However, to a certain extent, Algorithm 1 is robust with respect to the choice of the attitudinal parameter as well.

8. Conclusions

Based on the geometric intuition that a better score should be assigned to an intuitionistic fuzzy value farther from the negative ideal intuitionistic fuzzy value α * = ( 0 , 1 ) , Minkowski score functions of intuitionistic fuzzy values were proposed for comparing intuitionistic fuzzy values in the complete lattice L * . By taking into account an attitudinal parameter, we also presented the concept of Minkowski weighted score functions, which enabled us to describe the decision makers’ subjective attitudes as well. We investigated some basic properties of Minkowski (weighted) score functions and developed a new algorithm for solving multiple attribute decision-making problems based on intuitionistic fuzzy soft sets. Based on two numerical examples concerning risk investment and supplier selection, a brief comparative analysis was made between the newly proposed algorithm and existing alternatives in the literature. It was shown that our method was feasible for solving intuitionistic fuzzy soft set based decision-making problems in real-world scenarios.
Our fundamental tool (Minkowski weighted score function) can be used in the future for the efficient aggregation of infinite chains of intuitionistic fuzzy sets [56]. This line of research concerns decisions in a temporal setting, possibly with an infinite horizon. The breadth of the theory of the aggregation of infinite streams of numbers was discussed in [57] and the references therein.

Author Contributions

Conceptualization, F.F. and J.C.R.A.; formal analysis, F.F., Y.Z., and J.C.R.A.; writing, original draft, F.F. and Y.Z.; writing, review and editing, F.F., Y.Z., J.C.R.A., and Q.W. All authors read and agreed to the published version of the manuscript.

Funding

This work was partially supported by the National Natural Science Foundation of China (Grant Nos. 51875457 and 11301415), the Natural Science Basic Research Plan in Shaanxi Province of China (Grant No. 2018JM1054), and the Special Funds Project for Key Disciplines Construction of Shaanxi Universities.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Ishizaka, A.; Nemery, P. Multi-Criteria Decision Analysis: Methods and Software; Wiley: Chichester, UK, 2013.
  2. Pamučar, D.S.; Savin, L.M. Multiple-criteria model for optimal off-road vehicle selection for passenger transportation: BWM-COPRAS model. Vojnoteh. Glas. 2020, 68, 28–64.
  3. Xu, Z.S. Uncertain Multi-Attribute Decision Making; Tsinghua University Press: Beijing, China, 2004.
  4. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353.
  5. Rahman, S. On cuts of Atanassov’s intuitionistic fuzzy sets with respect to fuzzy connectives. Inf. Sci. 2016, 340, 262–278.
  6. Atanassov, K.T. Intuitionistic fuzzy sets. In Intuitionistic Fuzzy Sets; Physica: Heidelberg, Germany, 1986; Volume 20, pp. 87–96.
  7. Xu, Z.S.; Yager, R.R. Some geometric aggregation operators based on intuitionistic fuzzy sets. Int. J. Gen. Syst. 2006, 35, 417–433.
  8. Yager, R.R. Multicriteria decision making with ordinal/linguistic intuitionistic fuzzy sets for mobile apps. IEEE Trans. Fuzzy Syst. 2016, 24, 590–599.
  9. Molodtsov, D.A. Soft set theory-first results. Comput. Math. Appl. 1999, 37, 19–31.
  10. Feng, F.; Cho, J.; Pedrycz, W.; Fujita, H.; Herawan, T. Soft set based association rule mining. Knowl. Based Syst. 2016, 111, 268–282.
  11. Ali, M.I.; Feng, F.; Liu, X.Y.; Min, W.K.; Shabir, M. On some new operations in soft set theory. Comput. Math. Appl. 2009, 57, 1547–1553.
  12. Ali, M.I.; Shabir, M.; Naz, M. Algebraic structures of soft sets associated with new operations. Comput. Math. Appl. 2011, 61, 2647–2654.
  13. Ali, M.I.; Shabir, M. Logic connectives for soft sets and fuzzy soft sets. IEEE Trans. Fuzzy Syst. 2014, 22, 1431–1442.
  14. Feng, F.; Li, Y.M. Soft subsets and soft product operations. Inf. Sci. 2013, 232, 44–57.
  15. Jun, Y.B.; Lee, K.J.; Khan, A. Soft ordered semigroups. Math. Log. Q. 2010, 56, 42–50.
  16. Jun, Y.B.; Park, C.H. Applications of soft sets in ideal theory of BCK/BCI-algebras. Inf. Sci. 2008, 178, 2466–2475.
  17. Jun, Y.B. Soft BCK/BCI-algebras. Comput. Math. Appl. 2008, 56, 1408–1413.
  18. Maji, P.K.; Biswas, R.; Roy, A.R. Fuzzy soft sets. J. Fuzzy Math. 2001, 9, 589–602.
  19. Das, S.; Kar, S. The hesitant fuzzy soft set and its application in decision-making. In Facets of Uncertainties and Applications. Springer Proceedings in Mathematics & Statistics; Chakraborty, M.K., Skowron, A., Maiti, M., Kar, S., Eds.; Springer: New Delhi, India, 2015; Volume 125.
  20. Das, S.; Malakar, D.; Kar, S.; Pal, T. Correlation measure of hesitant fuzzy soft sets and their application in decision making. Neural Comput. Appl. 2019, 31, 1023–1039.
  21. Fatimah, F.; Rosadi, D.; Hakim, R.B.F.; Alcantud, J.C.R. N-soft sets and their decision making algorithms. Soft Comput. 2018, 22, 3829–3842.
  22. Alcantud, J.C.R.; Feng, F.; Yager, R.R. An N-soft set approach to rough sets. IEEE Trans. Fuzzy Syst. 2019.
  23. Akram, M.; Adeel, A.; Alcantud, J.C.R. Group decision-making methods based on hesitant N-soft sets. Expert Syst. Appl. 2019, 115, 95–105.
  24. Akram, M.; Adeel, A.; Alcantud, J.C.R. Fuzzy N-soft sets: A novel model with applications. J. Intell. Fuzzy Syst. 2018, 35, 4757–4771.
  25. Maji, P.K.; Biswas, R.; Roy, A.R. Intuitionistic fuzzy soft sets. J. Fuzzy Math. 2001, 9, 677–692.
  26. Ali, M.I.; Feng, F.; Mahmood, T.; Mahmood, I.; Faizan, H. A graphical method for ranking Atanassov’s intuitionistic fuzzy values using the uncertainty index and entropy. Int. J. Intell. Syst. 2019, 34, 2692–2712.
  27. Feng, F.; Xu, Z.; Fujita, H.; Liang, M. Enhancing PROMETHEE method with intuitionistic fuzzy soft sets. Int. J. Intell. Syst. 2020, 35, 1071–1104.
  28. Agarwal, M.; Biswas, K.K.; Hanmandlu, M. Generalized intuitionistic fuzzy soft sets with applications in decision-making. Appl. Soft Comput. 2013, 13, 3552–3566.
  29. Feng, F.; Fujita, H.; Ali, M.I.; Yager, R.R.; Liu, X. Another view on generalized intuitionistic fuzzy soft sets and related multiattribute decision making methods. IEEE Trans. Fuzzy Syst. 2019, 27, 474–488.
  30. Peng, X.D.; Yang, Y.; Song, J.P. Pythagorean fuzzy soft set and its application. Comput. Eng. 2015, 41, 224–229.
  31. Athira, T.M.; John, S.J.; Garg, H. A novel entropy measure of Pythagorean fuzzy soft sets. AIMS Math. 2019, 5, 1050–1061.
  32. Dey, A.; Senapati, T.; Pal, M.; Chen, G.Y. A novel approach to hesitant multi-fuzzy soft set based decision-making. AIMS Math. 2020, 5, 1985–2008.
  33. Das, S.; Kar, M.B.; Kar, S. Group multi-criteria decision making using intuitionistic multi-fuzzy sets. J. Uncertain. Anal. Appl. 2013, 1, 10.
  34. Wei, G.W. Maximizing deviation method for multiple attribute decision making in intuitionistic fuzzy setting. Knowl. Based Syst. 2008, 21, 833–836.
  35. Liu, X.; Kim, H.S.; Feng, F.; Alcantud, J.C.R. Centroid transformations of intuitionistic fuzzy values based on aggregation operators. Mathematics 2018, 6, 215.
  36. Chen, S.M.; Tan, J.M. Handling multicriteria fuzzy decision-making problems based on vague set theory. Fuzzy Sets Syst. 1994, 67, 163–172.
  37. Hong, D.H.; Choi, C.H. Multi-criteria fuzzy decision-making problems based on vague set theory. Fuzzy Sets Syst. 2000, 114, 103–113.
  38. Yu, X.; Xu, Z.; Liu, S.; Chen, Q. On ranking of intuitionistic fuzzy values based on dominance relations. Int. J. Uncertain. Fuzziness Knowl. Based Syst. 2014, 22, 315–335.
  39. Garg, H.; Arora, R. TOPSIS method based on correlation coefficient for solving decision-making problems with intuitionistic fuzzy soft set information. AIMS Math. 2020, 5, 2944–2966.
  40. Szmidt, E.; Kacprzyk, J. Amount of information and its reliability in the ranking of Atanassov’s intuitionistic fuzzy alternatives. In Recent Advances in Decision Making; Series Studies in Computational Intelligence; Springer: Berlin/Heidelberg, Germany, 2009; pp. 7–19.
  41. Guo, K. Amount of information and attitudinal-based method for ranking Atanassov’s intuitionistic fuzzy values. IEEE Trans. Fuzzy Syst. 2014, 22, 177–188.
  42. Zhang, X.; Xu, Z. A new method for ranking intuitionistic fuzzy values and its application in multi-attribute decision making. Fuzzy Optim. Decis. Mak. 2012, 11, 135–146.
  43. Wang, W.; Xin, X. Distance measure between intuitionistic fuzzy sets. Pattern Recognit. Lett. 2005, 26, 2063–2069.
  44. Xing, Z.; Xiong, W.; Liu, H. A Euclidean approach for ranking intuitionistic fuzzy values. IEEE Trans. Fuzzy Syst. 2018, 26, 353–365.
  45. Xia, M.M.; Xu, Z.S. Entropy/cross entropy-based group decision making under intuitionistic fuzzy environment. Inf. Fusion 2012, 13, 31–47.
  46. Song, Y.F.; Wang, X.D.; Lei, L.; Quan, W.; Huang, W.L. An evidential view of similarity measure for Atanassov’s intuitionistic fuzzy sets. J. Intell. Fuzzy Syst. 2016, 31, 1653–1668.
  47. Boran, F.E.; Genc, S.; Kurt, M.; Akay, D. A multi-criteria intuitionistic fuzzy group decision making for supplier selection with TOPSIS method. Expert Syst. Appl. 2009, 36, 11363–11368.
  48. Wang, G.J.; He, Y.Y. Intuitionistic fuzzy sets and L-fuzzy sets. Fuzzy Sets Syst. 2000, 10, 271–274.
  49. Deschrijver, G.; Kerre, E.E. On the relationship between some extensions of fuzzy set theory. Fuzzy Sets Syst. 2003, 133, 227–235.
  50. Alcantud, J.C.R.; Cruz-Rambaud, S.; Muñoz Torrecillas, M.J. Valuation fuzzy soft sets: A flexible fuzzy soft set based decision making procedure for the valuation of assets. Symmetry 2017, 9, 253.
  51. Feng, F.; Liang, M.; Fujita, H.; Yager, R.R.; Liu, X. Lexicographic orders of intuitionistic fuzzy values and their relationships. Mathematics 2019, 7, 166.
  52. Xu, Z.S.; Yager, R.R. Intuitionistic and interval-valued intuitionistic fuzzy preference relations and their measures of similarity for the evaluation of agreement within a group. Fuzzy Optim. Decis. Mak. 2009, 8, 123–139.
  53. Xu, Z.S. Models for multiple attribute decision-making with intuitionistic fuzzy information. Int. J. Uncertain. Fuzziness Knowl. Based Syst. 2007, 15, 285–297.
  54. Herrera, F.; Herrera-Viedma, E. Linguistic decision analysis: Steps for solving decision problems under linguistic information. Fuzzy Sets Syst. 2000, 115, 67–82.
  55. Chen, L.H.; Tu, C.C. Dual bipolar measures of Atanassov’s intuitionistic fuzzy sets. IEEE Trans. Fuzzy Syst. 2014, 22, 966–982.
  56. Alcantud, J.C.R.; Khameneh, A.Z.; Kilicman, A. Aggregation of infinite chains of intuitionistic fuzzy sets and their application to choices with temporal intuitionistic fuzzy information. Inf. Sci. 2020, 514, 106–117.
  57. Alcantud, J.C.R.; García-Sanz, M.D. Evaluations of infinite utility streams: Pareto efficient and egalitarian axiomatics. Metroeconomica 2013, 64, 432–447.
Figure 1. Workflow and framework of this study. IFSS, intuitionistic fuzzy soft set; MADM, multiple attribute decision making; SWIFA, simply weighted intuitionistic fuzzy averaging.
Figure 2. Flowchart of the proposed algorithm for intuitionistic fuzzy MADM.
Table 1. Intuitionistic fuzzy soft set I = ( F ˜ , C ) .
U c 1 c 2 c 3 c 4
A 1 ( 0.5 , 0.4 ) ( 0.6 , 0.3 ) ( 0.3 , 0.6 ) ( 0.2 , 0.7 )
A 2 ( 0.7 , 0.3 ) ( 0.7 , 0.2 ) ( 0.7 , 0.2 ) ( 0.4 , 0.5 )
A 3 ( 0.6 , 0.4 ) ( 0.5 , 0.4 ) ( 0.5 , 0.3 ) ( 0.6 , 0.3 )
A 4 ( 0.8 , 0.1 ) ( 0.6 , 0.3 ) ( 0.3 , 0.4 ) ( 0.2 , 0.6 )
A 5 ( 0.6 , 0.2 ) ( 0.4 , 0.3 ) ( 0.7 , 0.1 ) ( 0.5 , 0.3 )
Table 2. OIFPVs and their corresponding scores.
U    OIFPV: Z G ( A i )    s 2 0.7 ( Z G ( A i ) )    s ( Z G ( A i ) )
A 1    ( 0.3569 , 0.5431 )    0.4294    −0.1862
A 2    ( 0.6083 , 0.3125 )    0.6647    0.2958
A 3    ( 0.5514 , 0.3361 )    0.6323    0.2153
A 4    ( 0.4194 , 0.3834 )    0.5647    0.0360
A 5    ( 0.5722 , 0.2125 )    0.7296    0.3597
Table 3. Ranking results given by different methods.
Methods    Ranking of the Companies
Algorithm 1 ( p = 2 , α = 0.7 )    A 5 ≻ A 2 ≻ A 3 ≻ A 4 ≻ A 1
SWIFA and ( s , t )    A 5 ≻ A 2 ≻ A 3 ≻ A 4 ≻ A 1
SWIFA and ( s , f )    A 5 ≻ A 2 ≻ A 3 ≻ A 4 ≻ A 1
SWIFA and ( f , t )    A 5 ≻ A 2 ≻ A 3 ≻ A 4 ≻ A 1
SWIFA and ( t , f )    A 2 ≻ A 5 ≻ A 3 ≻ A 4 ≻ A 1
Wei’s method [34]    A 5 ≻ A 2 ≻ A 3 ≻ A 4 ≻ A 1
Xia and Xu’s method [45]    A 5 ≻ A 2 ≻ A 3 ≻ A 4 ≻ A 1
Song et al.’s method [46]    A 5 ≻ A 2 ≻ A 3 ≻ A 4 ≻ A 1
Table 4. Intuitionistic fuzzy soft set I = ( F ˜ , C ) .
U e 1 e 2 e 3 e 4
s 1 ( 0.7280 , 0.1700 ) ( 0.6260 , 0.2720 ) ( 0.7800 , 0.1180 ) ( 0.7000 , 0.2000 )
s 2 ( 0.5960 , 0.3020 ) ( 0.6050 , 0.2920 ) ( 0.6440 , 0.2560 ) ( 0.5780 , 0.3210 )
s 3 ( 0.8490 , 0.1000 ) ( 0.7800 , 0.1180 ) ( 0.7690 , 0.1700 ) ( 0.7690 , 0.1280 )
s 4 ( 0.6630 , 0.2360 ) ( 0.5380 , 0.3610 ) ( 0.7460 , 0.1510 ) ( 0.6440 , 0.2540 )
s 5 ( 0.5620 , 0.3370 ) ( 0.4620 , 0.4380 ) ( 0.6680 , 0.2310 ) ( 0.5260 , 0.3740 )
s 6 ( 0.6620 , 0.2360 ) ( 0.6155 , 0.2820 ) ( 0.7120 , 0.1870 ) ( 0.6390 , 0.2605 )
s 7 ( 0.7225 , 0.2010 ) ( 0.6925 , 0.2050 ) ( 0.7065 , 0.2130 ) ( 0.6735 , 0.2245 )
s 8 ( 0.7560 , 0.1680 ) ( 0.6590 , 0.2395 ) ( 0.7575 , 0.1605 ) ( 0.7065 , 0.1910 )
s 9 ( 0.6125 , 0.2865 ) ( 0.5000 , 0.3995 ) ( 0.7070 , 0.1910 ) ( 0.5850 , 0.3140 )
s 10 ( 0.6450 , 0.2535 ) ( 0.5440 , 0.3550 ) ( 0.7240 , 0.1745 ) ( 0.6130 , 0.2870 )
Table 5. Intuitionistic fuzzy soft set G = ( G ˜ , C ) = I C c .
U e 1 e 2 e 3 e 4
s 1 ( 0.7280 , 0.1700 ) ( 0.6260 , 0.2720 ) ( 0.7800 , 0.1180 ) ( 0.2000 , 0.7000 )
s 2 ( 0.5960 , 0.3020 ) ( 0.6050 , 0.2920 ) ( 0.6440 , 0.2560 ) ( 0.3210 , 0.5780 )
s 3 ( 0.8490 , 0.1000 ) ( 0.7800 , 0.1180 ) ( 0.7690 , 0.1700 ) ( 0.1280 , 0.7690 )
s 4 ( 0.6630 , 0.2360 ) ( 0.5380 , 0.3610 ) ( 0.7460 , 0.1510 ) ( 0.2540 , 0.6440 )
s 5 ( 0.5620 , 0.3370 ) ( 0.4620 , 0.4380 ) ( 0.6680 , 0.2310 ) ( 0.3740 , 0.5260 )
s 6 ( 0.6620 , 0.2360 ) ( 0.6155 , 0.2820 ) ( 0.7120 , 0.1870 ) ( 0.2605 , 0.6390 )
s 7 ( 0.7225 , 0.2010 ) ( 0.6925 , 0.2050 ) ( 0.7065 , 0.2130 ) ( 0.2245 , 0.6735 )
s 8 ( 0.7560 , 0.1680 ) ( 0.6590 , 0.2395 ) ( 0.7575 , 0.1605 ) ( 0.1910 , 0.7065 )
s 9 ( 0.6125 , 0.2865 ) ( 0.5000 , 0.3995 ) ( 0.7070 , 0.1910 ) ( 0.3140 , 0.5850 )
s 10 ( 0.6450 , 0.2535 ) ( 0.5440 , 0.3550 ) ( 0.7240 , 0.1745 ) ( 0.2870 , 0.6130 )
Table 6. OIFPVs and their scores with p = 3 and α = 0.5 , 0.2 , 0.8 .
U    Z G ( s i )    s 3 0.5 ( Z G ( s i ) )    s 3 0.2 ( Z G ( s i ) )    s 3 0.8 ( Z G ( s i ) )    s ( Z G ( s i ) )
s 1    ( 0.6355 , 0.2628 )    0.6901    0.6584    0.7191    0.3726
s 2    ( 0.5723 , 0.3260 )    0.6272    0.5955    0.6561    0.2462
s 3    ( 0.7067 , 0.2167 )    0.7469    0.7233    0.7691    0.4899
s 4    ( 0.5877 , 0.3106 )    0.6426    0.6109    0.6714    0.2771
s 5    ( 0.5312 , 0.3683 )    0.5858    0.5543    0.6142    0.1629
s 6    ( 0.6039 , 0.2944 )    0.6586    0.6269    0.6876    0.3094
s 7    ( 0.6395 , 0.2714 )    0.6869    0.6593    0.7125    0.3681
s 8    ( 0.6472 , 0.2637 )    0.6946    0.6670    0.7202    0.3835
s 9    ( 0.5595 , 0.3394 )    0.6142    0.5826    0.6428    0.2200
s 10    ( 0.5833 , 0.3155 )    0.6379    0.6063    0.6666    0.2678
Table 7. Weighted IFSS R = μ G = ( R ˜ , C ) .
U e 1 e 2 e 3 e 4
s 1 ( 0.3233 , 0.5876 ) ( 0.2627 , 0.6679 ) ( 0.3151 , 0.5860 ) ( 0.1551 , 0.7982 )
s 2 ( 0.2380 , 0.6982 ) ( 0.2502 , 0.6827 ) ( 0.2275 , 0.7113 ) ( 0.1137 , 0.8529 )
s 3 ( 0.4328 , 0.5011 ) ( 0.3746 , 0.5155 ) ( 0.3067 , 0.6421 ) ( 0.1854 , 0.7499 )
s 4 ( 0.2784 , 0.6484 ) ( 0.2128 , 0.7291 ) ( 0.2900 , 0.6233 ) ( 0.1346 , 0.8254 )
s 5 ( 0.2193 , 0.7215 ) ( 0.1748 , 0.7742 ) ( 0.2409 , 0.6932 ) ( 0.0992 , 0.8713 )
s 6 ( 0.2777 , 0.6484 ) ( 0.2564 , 0.6754 ) ( 0.2674 , 0.6575 ) ( 0.1329 , 0.8283 )
s 7 ( 0.3192 , 0.6179 ) ( 0.3062 , 0.6118 ) ( 0.2639 , 0.6793 ) ( 0.1450 , 0.8112 )
s 8 ( 0.3450 , 0.5855 ) ( 0.2836 , 0.6420 ) ( 0.2982 , 0.6329 ) ( 0.1577 , 0.7931 )
s 9 ( 0.2475 , 0.6872 ) ( 0.1933 , 0.7524 ) ( 0.2642 , 0.6610 ) ( 0.1158 , 0.8502 )
s 10 ( 0.2670 , 0.6625 ) ( 0.2160 , 0.7253 ) ( 0.2751 , 0.6463 ) ( 0.1244 , 0.8396 )
Table 8. Separation measures and the relative closeness coefficients.
U    S i +    S i −    C i *
s 1    0.0892    0.0964    0.5195
s 2    0.1342    0.0619    0.3158
s 3    0.0564    0.1628    0.7428
s 4    0.1238    0.0635    0.3389
s 5    0.1651    0.0533    0.2440
s 6    0.1093    0.0694    0.3884
s 7    0.0848    0.0949    0.5282
s 8    0.0800    0.0983    0.5515
s 9    0.1444    0.0519    0.2645
s 10    0.1272    0.0593    0.3178
Table 9. Ranking results based on different methods.
Methods    Ranking of the Suppliers
Boran et al.’s method [47]    s 3 ≻ s 8 ≻ s 7 ≻ s 1 ≻ s 6 ≻ s 4 ≻ s 10 ≻ s 2 ≻ s 9 ≻ s 5
Algorithm 1 ( p = 3 , α = 0.2 )    s 3 ≻ s 8 ≻ s 7 ≻ s 1 ≻ s 6 ≻ s 4 ≻ s 10 ≻ s 2 ≻ s 9 ≻ s 5
Algorithm 1 ( p = 3 , α = 0.5 )    s 3 ≻ s 8 ≻ s 1 ≻ s 7 ≻ s 6 ≻ s 4 ≻ s 10 ≻ s 2 ≻ s 9 ≻ s 5
Algorithm 1 ( p = 3 , α = 0.8 )    s 3 ≻ s 8 ≻ s 1 ≻ s 7 ≻ s 6 ≻ s 4 ≻ s 10 ≻ s 2 ≻ s 9 ≻ s 5
SWIFA and ( s , t )    s 3 ≻ s 8 ≻ s 1 ≻ s 7 ≻ s 6 ≻ s 4 ≻ s 10 ≻ s 2 ≻ s 9 ≻ s 5
SWIFA and ( s , f )    s 3 ≻ s 8 ≻ s 1 ≻ s 7 ≻ s 6 ≻ s 4 ≻ s 10 ≻ s 2 ≻ s 9 ≻ s 5
SWIFA and ( f , t )    s 3 ≻ s 1 ≻ s 8 ≻ s 7 ≻ s 6 ≻ s 4 ≻ s 10 ≻ s 2 ≻ s 9 ≻ s 5
SWIFA and ( t , f )    s 3 ≻ s 8 ≻ s 7 ≻ s 1 ≻ s 6 ≻ s 4 ≻ s 10 ≻ s 2 ≻ s 9 ≻ s 5
