Article

Distance and Similarity Measures for Spherical Fuzzy Sets and Their Applications in Selecting Mega Projects

1
KMUTT Fixed Point Research Laboratory, SCL 802 Fixed Point Laboratory and Department of Mathematics, Faculty of Science, King Mongkut’s University of Technology Thonburi (KMUTT), 126 Pracha-Uthit Road, Bang Mod, Thrung Khru, Bangkok 10140, Thailand
2
Center of Excellence in Theoretical and Computational Science (TaCS-CoE), Science Laboratory Building, Faculty of Science, King Mongkut’s University of Technology Thonburi (KMUTT), 126 Pracha-Uthit Road, Bang Mod, Thrung Khru, Bangkok 10140, Thailand
3
Department of Medical Research, China Medical University Hospital, China Medical University, Taichung 40402, Taiwan
4
Department of Mathematics, College of Science and Arts, King Abdulaziz University, P.O. Box 344, Rabigh 21911, Saudi Arabia
5
Program in Applied Statistics, Department of Mathematics and Computer Science, Faculty of Science and Technology, Rajamangala University of Technology Thanyaburi (RMUTT), Thanyaburi, Pathumthani 12110, Thailand
*
Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
Mathematics 2020, 8(4), 519; https://doi.org/10.3390/math8040519
Submission received: 4 March 2020 / Revised: 24 March 2020 / Accepted: 25 March 2020 / Published: 3 April 2020

Abstract

A new condition on the positive, neutral, and negative membership functions provides a successful extension of both the picture fuzzy set and the Pythagorean fuzzy set, called the spherical fuzzy set (SFS). This condition enlarges the domain of the positive, neutral, and negative membership functions. Keeping in mind the importance of similarity measures and their applications in data mining, medical diagnosis, decision making, and pattern recognition, several similarity measures have been proposed in the literature. Some of them, however, do not satisfy the axioms of similarity and produce counter-intuitive cases. In this paper, we propose set-theoretic similarity and distance measures. We provide counterexamples for similarity measures already proposed in the literature and show how the proposed method applies to pattern recognition problems. Finally, we give an application of the proposed similarity measure to selecting mega projects in developing countries.

1. Introduction

The fuzzy set (FS) is defined by a membership function. Zadeh [1] introduced fuzzy set theory, which models uncertainty effectively. Fuzzy set theory focuses on only one aspect of information: containment, or belongingness. Atanassov defined the intuitionistic fuzzy set (IFS) [2], which generalizes the FS and also models uncertainty effectively. An IFS is defined by membership and non-membership functions; because it also considers a non-membership function, the IFS is more effective than the FS in practical applications. The membership functions of the interval-valued fuzzy set (IvFS) and the interval-valued intuitionistic fuzzy set (IvIFS) are described by intervals instead of single values [3]; in IvFSs and IvIFSs, experts give their preferences in the form of intervals. Due to the intensive quantity and variety of uncertainties, these approaches are not sufficient to cover all aspects [4]. Molodtsov's soft set theory models uncertainty from a parametric point of view [4]. Nowadays, many authors define hybrid models of soft sets with FSs, IFSs, IvFSs, and IvIFSs [5,6,7,8,9].
The picture fuzzy set (PFS), defined by Cuong, is another generalization of FSs and IFSs [10]. It generalizes them in the sense that membership, neutral, and non-membership functions are used to define a PFS. In a PFS, the preferences of the experts are described precisely because all aspects of the assessment of information are covered: yes, abstain, no, and refusal. The sum of the three membership functions of a PFS must be less than or equal to one; this condition restricts the domain of the experts' preferences. A hybrid model of the PFS and the soft set was obtained by Yang [11]. Khan et al. [12] defined the generalized picture fuzzy soft set and applied it to decision-making problems. For further studies on decision making, we refer to [13,14,15,16,17,18,19].
Yager [20,21] defined the Pythagorean fuzzy set (PyFS), a successful extension of the intuitionistic fuzzy set, by imposing a new condition on the positive membership $\xi$ and negative membership $\nu$, namely $0 \le \xi^2 + \nu^2 \le 1$. This condition expands the domain of the membership functions: for example, if $\xi = 0.7$ and $\nu = 0.5$, then the pair cannot be handled by an intuitionistic fuzzy set because $0.7 + 0.5 = 1.2 > 1$, but $0.7^2 + 0.5^2 = 0.49 + 0.25 = 0.74 \le 1$, so the PyFS applies successfully. The concept of the Pythagorean fuzzy number and a detailed mathematical treatment of PyFSs were presented by Zhang [22]. To solve multi-criteria group decision-making problems with PyFSs, Peng defined division and subtraction operations for PyFSs and developed a Pythagorean fuzzy superiority and inferiority ranking method [23]. Reformat and Yager applied PyFSs to a collaborative-based recommender system [24]. In [25], Peng defined several distance, similarity, entropy, and inclusion measures for PyFSs and studied the relations between them.
Ashraf [26,27,28] defined the spherical fuzzy set (SFS), a successful extension of the picture fuzzy set and the PyFS, by imposing a new condition on the positive membership $\xi$, neutral membership $\eta$, and negative membership $\nu$, namely $0 \le \xi^2 + \eta^2 + \nu^2 \le 1$. This condition expands the domain of the membership functions: for example, if $\xi = 0.7$, $\eta = 0.5$, and $\nu = 0.5$, then the triple cannot be handled by a picture fuzzy set because $0.7 + 0.5 + 0.5 = 1.7 > 1$, but $0.7^2 + 0.5^2 + 0.5^2 = 0.49 + 0.25 + 0.25 = 0.99 \le 1$, so the SFS applies successfully. In [29], Rafiq proposed similarity measures based on cosine and cotangent functions for SFSs and applied them to pattern recognition. In [30], a multi-attribute group decision-making problem was solved using symmetric-sum-based aggregation operators for spherical fuzzy sets.
Keeping in mind the importance of similarity measures and their applications in data mining, medical diagnosis, decision making, and pattern recognition, many authors have worked on this topic. A wide range of similarity measures for fuzzy sets and intuitionistic fuzzy sets is presented in the literature [31,32,33,34].
The SFS generalizes the PFS in the sense that the domains of the membership, neutral, and non-membership functions are greater in SFSs, i.e., the experts can give their judgments more freely. In SFSs, the preferences of the experts are described precisely because all aspects of the assessment of information are covered: yes, abstain, no, and refusal. The SFS also generalizes the PyFS because it contains an extra degree of preference: the neutral degree, or neutral membership function.
The aim of this paper is to define new similarity measures for SFSs and to discuss the selection of mega projects in developing countries. It is important for developing countries to prioritize upcoming mega projects that have less effect on their economy and environment, have lower maintenance costs and long-term benefits, affect fewer people, and generate high revenue. Megaprojects are normally characterized by vast complexity (especially in organizational terms), a large investment commitment, and a long-lasting impact on the economy, the environment, and society.
Some of the similarity measures already proposed for SFSs have problems, which are pointed out in Section 4. To improve the idea of the similarity measure, we propose set-theoretic similarity and distance measures. The proposed similarity measure is then applied to pattern recognition, and the selection of mega projects in developing countries is carried out with it.
The remainder of the paper is organized as follows. The introduction and preliminaries are presented in Section 1 and Section 2. In Section 3, we propose the set-theoretic similarity measures for SFSs. In Section 4, we provide counterexamples for already proposed similarity measures. To support the proposed similarity measure, a numerical example of selecting mega projects in developing countries is presented in Section 5. A comparison analysis and the conclusions are presented in Section 6 and Section 7.

2. Preliminaries

In this section, we provide the basic definitions of the FS, IFS, PFS, and SFS, and discuss the similarity measures already proposed for SFSs.
The fuzzy set, defined by Zadeh [1], handles uncertainty effectively from the viewpoint of gradualness.
Definition 1.
[1] A membership function $\xi_{\hat{A}}: \hat{Y} \to [0,1]$ defines the fuzzy set $\hat{A}$ over $\hat{Y}$, where $\xi_{\hat{A}}(y)$ specifies the membership of an element $y \in \hat{Y}$ in the fuzzy set $\hat{A}$.
In [10], Cuong defined the PFS, which is an extension of the fuzzy set and is applicable to many real-life problems. The picture fuzzy set is obtained by adding an extra membership function, the degree of neutral membership, to the IFS. Information about situations with answers of the type yes, abstain, no, and refusal can be modeled easily using a picture fuzzy set. Voting is a good example, because it involves exactly these four types of answers.
Definition 2.
[10] A PFS $\hat{A}$ over the universe $\hat{Y}$ is defined as
$$\hat{A} = \{ (y, \xi_{\hat{A}}(y), \eta_{\hat{A}}(y), \nu_{\hat{A}}(y)) \mid y \in \hat{Y} \},$$
where $\xi_{\hat{A}}: \hat{Y} \to [0,1]$, $\eta_{\hat{A}}: \hat{Y} \to [0,1]$, and $\nu_{\hat{A}}: \hat{Y} \to [0,1]$ are the degrees of positive, neutral, and negative membership, respectively, such that $0 \le \xi_{\hat{A}}(y) + \eta_{\hat{A}}(y) + \nu_{\hat{A}}(y) \le 1$.
Definition 3.
[26] An SFS $\hat{A}$ over the universe $\hat{Y}$ is defined as
$$\hat{A} = \{ (y, \xi_{\hat{A}}(y), \eta_{\hat{A}}(y), \nu_{\hat{A}}(y)) \mid y \in \hat{Y} \},$$
where $\xi_{\hat{A}}: \hat{Y} \to [0,1]$, $\eta_{\hat{A}}: \hat{Y} \to [0,1]$, and $\nu_{\hat{A}}: \hat{Y} \to [0,1]$ are the degrees of positive, neutral, and negative membership, respectively. Furthermore, it is required that $0 \le \xi_{\hat{A}}^2(y) + \eta_{\hat{A}}^2(y) + \nu_{\hat{A}}^2(y) \le 1$. Then, for $y \in \hat{Y}$, $\pi_{\hat{A}}(y) = \sqrt{1 - (\xi_{\hat{A}}^2(y) + \eta_{\hat{A}}^2(y) + \nu_{\hat{A}}^2(y))}$ is called the degree of refusal membership of $y$ in $\hat{A}$. The triple $(\xi_{\hat{A}}(y), \eta_{\hat{A}}(y), \nu_{\hat{A}}(y))$ is said to be a spherical fuzzy value (SFV) or spherical fuzzy number (SFN), and each SFV can be denoted by $q = (\xi_q, \eta_q, \nu_q)$, where $\xi_q, \eta_q, \nu_q \in [0,1]$ with the condition that $0 \le \xi_q^2 + \eta_q^2 + \nu_q^2 \le 1$. Therefore, information about situations of the type yes, abstain, no, and refusal can be modeled more easily with an SFS than with a PFS.
It is easy to observe that the SFS is an extension of the PFS. For example, if $\xi = 0.6$, $\eta = 0.5$, and $\nu = 0.4$, then $0.6 + 0.5 + 0.4 = 1.5 > 1$. However, $0.6^2 + 0.5^2 + 0.4^2 = 0.36 + 0.25 + 0.16 = 0.77 < 1$; hence, the SFS expands the domain of the membership functions.
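To make the two constraints concrete, the following minimal Python sketch (the helper names `is_valid_pfs`, `is_valid_sfs`, and `refusal_degree` are ours, not from the paper) checks a membership triple against the PFS and SFS conditions and computes the refusal degree of Definition 3.

```python
from math import sqrt

def is_valid_pfs(xi: float, eta: float, nu: float) -> bool:
    """Picture fuzzy condition: the three degrees sum to at most 1."""
    return 0.0 <= xi <= 1.0 and 0.0 <= eta <= 1.0 and 0.0 <= nu <= 1.0 \
        and xi + eta + nu <= 1.0

def is_valid_sfs(xi: float, eta: float, nu: float) -> bool:
    """Spherical fuzzy condition: the squared degrees sum to at most 1."""
    return 0.0 <= xi <= 1.0 and 0.0 <= eta <= 1.0 and 0.0 <= nu <= 1.0 \
        and xi**2 + eta**2 + nu**2 <= 1.0

def refusal_degree(xi: float, eta: float, nu: float) -> float:
    """Degree of refusal membership pi = sqrt(1 - xi^2 - eta^2 - nu^2)."""
    return sqrt(1.0 - (xi**2 + eta**2 + nu**2))

# The triple (0.6, 0.5, 0.4) violates the PFS condition but satisfies the SFS one.
print(is_valid_pfs(0.6, 0.5, 0.4))              # False: 0.6 + 0.5 + 0.4 = 1.5 > 1
print(is_valid_sfs(0.6, 0.5, 0.4))              # True:  0.36 + 0.25 + 0.16 = 0.77 <= 1
print(round(refusal_degree(0.6, 0.5, 0.4), 4))  # ~0.4796
```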
In [29], Rafiq defined some similarity measures for SFSs based on cosine and cotangent functions.
Definition 4.
[29] For two SFSs $\hat{A}$ and $\hat{B}$ in $\hat{Y}$, a cosine similarity measure between $\hat{A}$ and $\hat{B}$ is defined as follows:
$$S_{c1}(\hat{A},\hat{B}) = \frac{1}{m}\sum_{j=1}^{m} \frac{\xi_{\hat{A}}^{2}(y_j)\,\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\,\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\,\nu_{\hat{B}}^{2}(y_j)}{\sqrt{\xi_{\hat{A}}^{4}(y_j) + \eta_{\hat{A}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)}\,\sqrt{\xi_{\hat{B}}^{4}(y_j) + \eta_{\hat{B}}^{4}(y_j) + \nu_{\hat{B}}^{4}(y_j)}}.$$
Definition 5.
[29] For two SFSs $\hat{A}$ and $\hat{B}$ in $\hat{Y}$, similarity measures using the cosine function between $\hat{A}$ and $\hat{B}$ are defined as follows:
$$S_{c2}(\hat{A},\hat{B}) = \frac{1}{m}\sum_{j=1}^{m} \cos\Big\{\frac{\pi}{2}\Big[\,|\xi_{\hat{A}}^{2}(y_j)-\xi_{\hat{B}}^{2}(y_j)| \vee |\eta_{\hat{A}}^{2}(y_j)-\eta_{\hat{B}}^{2}(y_j)| \vee |\nu_{\hat{A}}^{2}(y_j)-\nu_{\hat{B}}^{2}(y_j)|\,\Big]\Big\},$$
$$S_{c3}(\hat{A},\hat{B}) = \frac{1}{m}\sum_{j=1}^{m} \cos\Big\{\frac{\pi}{4}\Big[\,|\xi_{\hat{A}}^{2}(y_j)-\xi_{\hat{B}}^{2}(y_j)| + |\eta_{\hat{A}}^{2}(y_j)-\eta_{\hat{B}}^{2}(y_j)| + |\nu_{\hat{A}}^{2}(y_j)-\nu_{\hat{B}}^{2}(y_j)|\,\Big]\Big\},$$
where $\vee$ denotes the maximum operation.
Definition 6.
[29] For two SFSs $\hat{A}$ and $\hat{B}$ in $\hat{Y}$, cotangent similarity measures between $\hat{A}$ and $\hat{B}$ are defined as follows:
$$S_{c4}(\hat{A},\hat{B}) = \frac{1}{m}\sum_{j=1}^{m} \cot\Big\{\frac{\pi}{4} + \frac{\pi}{4}\Big[\,|\xi_{\hat{A}}^{2}(y_j)-\xi_{\hat{B}}^{2}(y_j)| \vee |\eta_{\hat{A}}^{2}(y_j)-\eta_{\hat{B}}^{2}(y_j)| \vee |\nu_{\hat{A}}^{2}(y_j)-\nu_{\hat{B}}^{2}(y_j)|\,\Big]\Big\},$$
$$S_{c5}(\hat{A},\hat{B}) = \frac{1}{m}\sum_{j=1}^{m} \cot\Big\{\frac{\pi}{4} + \frac{\pi}{8}\Big[\,|\xi_{\hat{A}}^{2}(y_j)-\xi_{\hat{B}}^{2}(y_j)| + |\eta_{\hat{A}}^{2}(y_j)-\eta_{\hat{B}}^{2}(y_j)| + |\nu_{\hat{A}}^{2}(y_j)-\nu_{\hat{B}}^{2}(y_j)|\,\Big]\Big\},$$
where $\vee$ denotes the maximum operation.
Definition 7.
[29] For two SFSs $\hat{A}$ and $\hat{B}$ in $\hat{Y}$, cosine similarity measures using the degree of refusal membership between $\hat{A}$ and $\hat{B}$ are defined as follows:
$$S_{c6}(\hat{A},\hat{B}) = \frac{1}{m}\sum_{j=1}^{m} \cos\Big\{\frac{\pi}{2}\Big[\,|\xi_{\hat{A}}^{2}(y_j)-\xi_{\hat{B}}^{2}(y_j)| \vee |\eta_{\hat{A}}^{2}(y_j)-\eta_{\hat{B}}^{2}(y_j)| \vee |\nu_{\hat{A}}^{2}(y_j)-\nu_{\hat{B}}^{2}(y_j)| \vee |\pi_{\hat{A}}(y_j)-\pi_{\hat{B}}(y_j)|\,\Big]\Big\},$$
$$S_{c7}(\hat{A},\hat{B}) = \frac{1}{m}\sum_{j=1}^{m} \cos\Big\{\frac{\pi}{4}\Big[\,|\xi_{\hat{A}}^{2}(y_j)-\xi_{\hat{B}}^{2}(y_j)| + |\eta_{\hat{A}}^{2}(y_j)-\eta_{\hat{B}}^{2}(y_j)| + |\nu_{\hat{A}}^{2}(y_j)-\nu_{\hat{B}}^{2}(y_j)| + |\pi_{\hat{A}}(y_j)-\pi_{\hat{B}}(y_j)|\,\Big]\Big\},$$
where $\vee$ denotes the maximum operation.
Definition 8.
[29] For two SFSs $\hat{A}$ and $\hat{B}$ in $\hat{Y}$, cotangent similarity measures using the degree of refusal membership between $\hat{A}$ and $\hat{B}$ are defined as follows:
$$S_{c8}(\hat{A},\hat{B}) = \frac{1}{m}\sum_{j=1}^{m} \cot\Big\{\frac{\pi}{4} + \frac{\pi}{4}\Big[\,|\xi_{\hat{A}}^{2}(y_j)-\xi_{\hat{B}}^{2}(y_j)| \vee |\eta_{\hat{A}}^{2}(y_j)-\eta_{\hat{B}}^{2}(y_j)| \vee |\nu_{\hat{A}}^{2}(y_j)-\nu_{\hat{B}}^{2}(y_j)| \vee |\pi_{\hat{A}}(y_j)-\pi_{\hat{B}}(y_j)|\,\Big]\Big\},$$
$$S_{c9}(\hat{A},\hat{B}) = \frac{1}{m}\sum_{j=1}^{m} \cot\Big\{\frac{\pi}{4} + \frac{\pi}{8}\Big[\,|\xi_{\hat{A}}^{2}(y_j)-\xi_{\hat{B}}^{2}(y_j)| + |\eta_{\hat{A}}^{2}(y_j)-\eta_{\hat{B}}^{2}(y_j)| + |\nu_{\hat{A}}^{2}(y_j)-\nu_{\hat{B}}^{2}(y_j)| + |\pi_{\hat{A}}(y_j)-\pi_{\hat{B}}(y_j)|\,\Big]\Big\},$$
where $\vee$ denotes the maximum operation.
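For readers who wish to experiment with these measures, the following Python sketch implements $S_{c1}$, $S_{c2}$, and $S_{c3}$ as reconstructed above. The function names and the representation of an SFS as a list of $(\xi, \eta, \nu)$ triples over $\hat{Y}$ are our own conventions, not taken from [29].

```python
from math import cos, pi, sqrt

# An SFS over Y = {y_1, ..., y_m} is represented as a list of (xi, eta, nu) triples.
SFS = list[tuple[float, float, float]]

def s_c1(A: SFS, B: SFS) -> float:
    """Cosine similarity measure S_c1 (Definition 4)."""
    total = 0.0
    for (xa, ea, na), (xb, eb, nb) in zip(A, B):
        num = xa**2 * xb**2 + ea**2 * eb**2 + na**2 * nb**2
        den = sqrt(xa**4 + ea**4 + na**4) * sqrt(xb**4 + eb**4 + nb**4)
        total += num / den
    return total / len(A)

def s_c2(A: SFS, B: SFS) -> float:
    """Cosine-function similarity measure S_c2 (Definition 5), using the maximum."""
    total = 0.0
    for (xa, ea, na), (xb, eb, nb) in zip(A, B):
        d = max(abs(xa**2 - xb**2), abs(ea**2 - eb**2), abs(na**2 - nb**2))
        total += cos(pi / 2 * d)
    return total / len(A)

def s_c3(A: SFS, B: SFS) -> float:
    """Cosine-function similarity measure S_c3 (Definition 5), using the sum."""
    total = 0.0
    for (xa, ea, na), (xb, eb, nb) in zip(A, B):
        d = abs(xa**2 - xb**2) + abs(ea**2 - eb**2) + abs(na**2 - nb**2)
        total += cos(pi / 4 * d)
    return total / len(A)
```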

3. New Similarity Measures for SFSs

In this section, we define new similarity and distance measures for SFSs and prove their properties.
Definition 9.
A distance measure between SFSs $\hat{A}$ and $\hat{B}$ is a mapping $\hat{D}: \mathrm{SFS} \times \mathrm{SFS} \to [0,1]$ that satisfies the following properties:
(D1) $0 \le \hat{D}(\hat{A},\hat{B}) \le 1$;
(D2) $\hat{D}(\hat{A},\hat{B}) = 0$ if and only if $\hat{A} = \hat{B}$;
(D3) $\hat{D}(\hat{A},\hat{B}) = \hat{D}(\hat{B},\hat{A})$;
(D4) if $\hat{A} \subseteq \hat{B} \subseteq \hat{C}$, then $\hat{D}(\hat{A},\hat{C}) \ge \hat{D}(\hat{A},\hat{B})$ and $\hat{D}(\hat{A},\hat{C}) \ge \hat{D}(\hat{B},\hat{C})$.
Definition 10.
A similarity measure between SFSs $\hat{A}$ and $\hat{B}$ is a mapping $\hat{S}: \mathrm{SFS} \times \mathrm{SFS} \to [0,1]$ that satisfies the following properties:
(S1) $0 \le \hat{S}(\hat{A},\hat{B}) \le 1$;
(S2) $\hat{S}(\hat{A},\hat{B}) = 1$ if and only if $\hat{A} = \hat{B}$;
(S3) $\hat{S}(\hat{A},\hat{B}) = \hat{S}(\hat{B},\hat{A})$;
(S4) if $\hat{A} \subseteq \hat{B} \subseteq \hat{C}$, then $\hat{S}(\hat{A},\hat{C}) \le \hat{S}(\hat{A},\hat{B})$ and $\hat{S}(\hat{A},\hat{C}) \le \hat{S}(\hat{B},\hat{C})$.
Definition 11.
For two SFSs $\hat{A}$ and $\hat{B}$ in $\hat{Y}$, a new similarity measure between $\hat{A}$ and $\hat{B}$ is defined as follows:
$$S_s(\hat{A},\hat{B}) = \frac{\sum_{j=1}^{m}\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\{\xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)\} + \{\eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)\} + \{\nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)\}\big]}.$$
Example 1.
Let $\hat{Y} = \{y_1, y_2, y_3, y_4, y_5\}$ be the universal set. We consider two SFSs $\hat{A}$ and $\hat{B}$ in $\hat{Y}$, given as follows:
$$\hat{A} = \{(0.7,0.1,0.2)/y_1,\ (0.7,0.2,0.2)/y_2,\ (0.2,0.1,0.7)/y_3,\ (0.9,0.1,0.2)/y_4,\ (0.2,0.1,0.6)/y_5\},$$
$$\hat{B} = \{(0.3,0.2,0.4)/y_1,\ (0.5,0.2,0.1)/y_2,\ (0.1,0.1,0.7)/y_3,\ (0.4,0.1,0.3)/y_4,\ (0.1,0.1,0.7)/y_5\}.$$
Then
$$S_s(\hat{A},\hat{B}) = \frac{0.0509 + 0.1245 + 0.2406 + 0.1333 + 0.1769}{0.2673 + 0.2433 + 0.2418 + 0.6643 + 0.2418} = \frac{0.7262}{1.6585} = 0.437866.$$
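A minimal Python sketch of $S_s$ (Definition 11) is given below, reusing the list-of-triples representation from the sketch at the end of Section 2; the function name `s_s` is ours. Run on the data of Example 1, it reproduces $0.7262/1.6585 \approx 0.437866$.

```python
def s_s(A, B):
    """Set-theoretic similarity measure S_s (Definition 11)."""
    num = 0.0
    den = 0.0
    for (xa, ea, na), (xb, eb, nb) in zip(A, B):
        num += xa**2 * xb**2 + ea**2 * eb**2 + na**2 * nb**2
        den += max(xa**4, xb**4) + max(ea**4, eb**4) + max(na**4, nb**4)
    return num / den

# SFSs A and B from Example 1.
A = [(0.7, 0.1, 0.2), (0.7, 0.2, 0.2), (0.2, 0.1, 0.7), (0.9, 0.1, 0.2), (0.2, 0.1, 0.6)]
B = [(0.3, 0.2, 0.4), (0.5, 0.2, 0.1), (0.1, 0.1, 0.7), (0.4, 0.1, 0.3), (0.1, 0.1, 0.7)]
print(round(s_s(A, B), 6))  # 0.437866 (= 0.7262 / 1.6585)
```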
Theorem 1.
$S_s(\hat{A},\hat{B})$ is a similarity measure between two SFSs $\hat{A}$ and $\hat{B}$ in $\hat{Y}$.
Proof. 
To prove that $S_s$ is a similarity measure, we verify the four conditions of Definition 10.
(S1). For all $y_j$, $1 \le j \le m$, we have $\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) \le \xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)$, $\eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) \le \eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)$, and $\nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j) \le \nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)$, since for non-negative numbers $x$ and $y$ we always have $xy \le x^2 \vee y^2$. Therefore, for each $y_j$,
$$\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j) \le \{\xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)\} + \{\eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)\} + \{\nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)\}.$$
Summing over all $y_j$, $1 \le j \le m$, gives
$$\sum_{j=1}^{m}\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)\big] \le \sum_{j=1}^{m}\big[\{\xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)\} + \{\eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)\} + \{\nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)\}\big],$$
and hence $0 \le S_s(\hat{A},\hat{B}) \le 1$.
(S2). Suppose $S_s(\hat{A},\hat{B}) = 1$. We have to prove $\hat{A} = \hat{B}$. By the definition of $S_s$,
$$\frac{\sum_{j=1}^{m}\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\{\xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)\} + \{\eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)\} + \{\nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)\}\big]} = 1,$$
that is,
$$\sum_{j=1}^{m}\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)\big] = \sum_{j=1}^{m}\big[\{\xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)\} + \{\eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)\} + \{\nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)\}\big].$$
We claim that, for every $y_j$, $\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) = \xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)$, $\eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) = \eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)$, and $\nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j) = \nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)$.
Suppose, on the contrary, that $\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) \ne \xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)$ for some $y_j$. Since $\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) \le \xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)$, there exists $r > 0$ such that $\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + r = \xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)$.
Similarly, there exist $s, t \ge 0$ such that $\eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + s = \eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)$ and $\nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j) + t = \nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)$.
By the hypothesis, it follows that $r + s + t = 0$, which implies $r = -(s + t) \le 0$, contradicting $r > 0$. Therefore, $\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) = \xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)$, $\eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) = \eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)$, and $\nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j) = \nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)$, which implies $\xi_{\hat{A}}^{2}(y_j) = \xi_{\hat{B}}^{2}(y_j)$, $\eta_{\hat{A}}^{2}(y_j) = \eta_{\hat{B}}^{2}(y_j)$, and $\nu_{\hat{A}}^{2}(y_j) = \nu_{\hat{B}}^{2}(y_j)$. Hence $\hat{A} = \hat{B}$.
The converse follows trivially from Definition 11.
(S3). $S_s(\hat{A},\hat{B}) = S_s(\hat{B},\hat{A})$ holds trivially, since the roles of $\hat{A}$ and $\hat{B}$ in Definition 11 are symmetric.
(S4). Let $\hat{A}$, $\hat{B}$, and $\hat{C}$ be three SFSs in $\hat{Y}$. The similarity measures between $\hat{A},\hat{B}$ and between $\hat{A},\hat{C}$ are
$$S_s(\hat{A},\hat{B}) = \frac{\sum_{j=1}^{m}\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\{\xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)\} + \{\eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)\} + \{\nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)\}\big]},$$
$$S_s(\hat{A},\hat{C}) = \frac{\sum_{j=1}^{m}\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{C}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{C}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{C}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\{\xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{C}}^{4}(y_j)\} + \{\eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{C}}^{4}(y_j)\} + \{\nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{C}}^{4}(y_j)\}\big]}.$$
Suppose $\hat{A} \subseteq \hat{B} \subseteq \hat{C}$. Then, for all $y_j \in \hat{Y}$, $\xi_{\hat{A}}^{2}(y_j) \le \xi_{\hat{B}}^{2}(y_j) \le \xi_{\hat{C}}^{2}(y_j)$, $\eta_{\hat{A}}^{2}(y_j) \le \eta_{\hat{B}}^{2}(y_j) \le \eta_{\hat{C}}^{2}(y_j)$, and $\nu_{\hat{A}}^{2}(y_j) \ge \nu_{\hat{B}}^{2}(y_j) \ge \nu_{\hat{C}}^{2}(y_j)$. This implies $\xi_{\hat{A}}^{4}(y_j) \le \xi_{\hat{B}}^{4}(y_j) \le \xi_{\hat{C}}^{4}(y_j)$, $\eta_{\hat{A}}^{4}(y_j) \le \eta_{\hat{B}}^{4}(y_j) \le \eta_{\hat{C}}^{4}(y_j)$, and $\nu_{\hat{A}}^{4}(y_j) \ge \nu_{\hat{B}}^{4}(y_j) \ge \nu_{\hat{C}}^{4}(y_j)$. Then we have
$$S_s(\hat{A},\hat{B}) = \frac{\sum_{j=1}^{m}\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\xi_{\hat{B}}^{4}(y_j) + \eta_{\hat{B}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)\big]},$$
$$S_s(\hat{A},\hat{C}) = \frac{\sum_{j=1}^{m}\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{C}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{C}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{C}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\xi_{\hat{C}}^{4}(y_j) + \eta_{\hat{C}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)\big]}.$$
We claim that, for all $y_j \in \hat{Y}$,
$$\frac{\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j)}{\xi_{\hat{B}}^{4}(y_j) + \eta_{\hat{B}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)} \ge \frac{\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{C}}^{2}(y_j)}{\xi_{\hat{C}}^{4}(y_j) + \eta_{\hat{C}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)},$$
because $\eta_{\hat{B}}^{4}(y_j) \le \eta_{\hat{C}}^{4}(y_j)$ and $\frac{1}{\xi_{\hat{B}}^{2}(y_j)} \ge \frac{1}{\xi_{\hat{C}}^{2}(y_j)}$. Similarly, we have
$$\frac{\eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j)}{\xi_{\hat{B}}^{4}(y_j) + \eta_{\hat{B}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)} \ge \frac{\eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{C}}^{2}(y_j)}{\xi_{\hat{C}}^{4}(y_j) + \eta_{\hat{C}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)},$$
$$\frac{\nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)}{\xi_{\hat{B}}^{4}(y_j) + \eta_{\hat{B}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)} \ge \frac{\nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{C}}^{2}(y_j)}{\xi_{\hat{C}}^{4}(y_j) + \eta_{\hat{C}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)}.$$
Adding the three preceding inequalities gives, for each $y_j$,
$$\frac{\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)}{\xi_{\hat{B}}^{4}(y_j) + \eta_{\hat{B}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)} \ge \frac{\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{C}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{C}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{C}}^{2}(y_j)}{\xi_{\hat{C}}^{4}(y_j) + \eta_{\hat{C}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)},$$
and summing over $j$ yields
$$\frac{\sum_{j=1}^{m}\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\xi_{\hat{B}}^{4}(y_j) + \eta_{\hat{B}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)\big]} \ge \frac{\sum_{j=1}^{m}\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{C}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{C}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{C}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\xi_{\hat{C}}^{4}(y_j) + \eta_{\hat{C}}^{4}(y_j) + \nu_{\hat{A}}^{4}(y_j)\big]}.$$
Hence $S_s(\hat{A},\hat{C}) \le S_s(\hat{A},\hat{B})$. Similarly, one can prove $S_s(\hat{A},\hat{C}) \le S_s(\hat{B},\hat{C})$.
From (S1)–(S4), we conclude that $S_s(\hat{A},\hat{B})$ is a similarity measure between the SFSs $\hat{A}$ and $\hat{B}$. □
Definition 12.
Two SFSs $\hat{A}$ and $\hat{B}$ are called $\alpha$-similar, denoted $\hat{A} \approx_{\alpha} \hat{B}$, if and only if $S_s(\hat{A},\hat{B}) \ge \alpha$ for $\alpha \in (0,1)$.
Corollary 1.
The relation $\approx_{\alpha}$ is reflexive and symmetric.
Proof. 
Reflexivity and symmetry follow from Theorem 1. □
The following example shows that the relation $\approx_{\alpha}$ is not transitive.
Example 2.
Let $\hat{Y} = \{y_1, y_2, y_3, y_4, y_5\}$ be the universal set and let $\alpha = 0.5$. We consider three SFSs $\hat{A}$, $\hat{B}$, and $\hat{C}$ in $\hat{Y}$, given as follows:
$$\hat{A} = \{(0.8,0.1,0.0)/y_1,\ (0.6,0.2,0.1)/y_2,\ (0.1,0.1,0.8)/y_3,\ (0.6,0.1,0.2)/y_4,\ (0.2,0.1,0.6)/y_5\},$$
$$\hat{B} = \{(0.6,0.0,0.4)/y_1,\ (0.6,0.2,0.5)/y_2,\ (0.4,0.1,0.7)/y_3,\ (0.8,0.1,0.3)/y_4,\ (0.6,0.1,0.7)/y_5\},$$
$$\hat{C} = \{(0.5,0.3,0.2)/y_1,\ (0.3,0.1,0.5)/y_2,\ (0.3,0.3,0.4)/y_3,\ (0.7,0.1,0.2)/y_4,\ (0.6,0.1,0.1)/y_5\}.$$
Then $S_s(\hat{A},\hat{B}) = 0.59636 > 0.5$ and $S_s(\hat{B},\hat{C}) = 0.519811 > 0.5$, but $S_s(\hat{A},\hat{C}) = 0.322488 < 0.5$. Hence the relation $\approx_{\alpha}$ is not transitive.
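Using the `s_s` helper from the sketch after Example 1, the non-transitivity in Example 2 can be checked directly (variable names are ours):

```python
A2 = [(0.8, 0.1, 0.0), (0.6, 0.2, 0.1), (0.1, 0.1, 0.8), (0.6, 0.1, 0.2), (0.2, 0.1, 0.6)]
B2 = [(0.6, 0.0, 0.4), (0.6, 0.2, 0.5), (0.4, 0.1, 0.7), (0.8, 0.1, 0.3), (0.6, 0.1, 0.7)]
C2 = [(0.5, 0.3, 0.2), (0.3, 0.1, 0.5), (0.3, 0.3, 0.4), (0.7, 0.1, 0.2), (0.6, 0.1, 0.1)]

print(round(s_s(A2, B2), 6))  # ~0.596360 > 0.5, so A2 and B2 are 0.5-similar
print(round(s_s(B2, C2), 6))  # ~0.519811 > 0.5, so B2 and C2 are 0.5-similar
print(round(s_s(A2, C2), 6))  # ~0.322488 < 0.5, so the 0.5-similarity relation is not transitive
```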
Sometimes the alternatives under observation are not of equal importance; therefore, we assign weights to the alternatives to signify their importance and define a weighted similarity measure between SFSs.
Definition 13.
For two SFSs $\hat{A}$ and $\hat{B}$ in $\hat{Y}$, a new weighted similarity measure between $\hat{A}$ and $\hat{B}$ is defined as follows:
$$S_{\omega s}(\hat{A},\hat{B}) = \frac{\sum_{j=1}^{m}\omega_j\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\{\xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)\} + \{\eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)\} + \{\nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)\}\big]}\Big/\Big(\sum_{j=1}^{m}\omega_j\Big),$$
where $\omega_j \in [0,1]$, $1 \le j \le m$, are the weights of the alternatives, not all zero. If $\sum_{j=1}^{m}\omega_j = 1$, then
$$S_{\omega s}(\hat{A},\hat{B}) = \frac{\sum_{j=1}^{m}\omega_j\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\{\xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)\} + \{\eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)\} + \{\nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)\}\big]}.$$
Example 3.
In Example 1, let $0.8$, $0.5$, $0.6$, $0.7$, and $0.6$ be the weights of $y_1$, $y_2$, $y_3$, $y_4$, and $y_5$, respectively. Then
$$S_{\omega s}(\hat{A},\hat{B}) = \frac{0.44678}{1.6585} = 0.269388.$$
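The sketch below follows the computation actually carried out in Example 3 (weights applied in the numerator, the unweighted denominator, and no division by the weight sum); whether to additionally divide by $\sum_j \omega_j$, as in the first formula of Definition 13 for unnormalized weights, is left as an option. The function name `s_ws` and the `normalize` flag are ours. It reuses the sets `A` and `B` from the Example 1 sketch.

```python
def s_ws(A, B, w, normalize=False):
    """Weighted set-theoretic similarity, following the computation in Example 3.

    With normalize=True the result is additionally divided by sum(w), matching
    the first formula of Definition 13 for weights that do not sum to one.
    """
    num = 0.0
    den = 0.0
    for (xa, ea, na), (xb, eb, nb), wj in zip(A, B, w):
        num += wj * (xa**2 * xb**2 + ea**2 * eb**2 + na**2 * nb**2)
        den += max(xa**4, xb**4) + max(ea**4, eb**4) + max(na**4, nb**4)
    result = num / den
    return result / sum(w) if normalize else result

w = [0.8, 0.5, 0.6, 0.7, 0.6]
print(round(s_ws(A, B, w), 6))  # 0.269388 (= 0.44678 / 1.6585), as in Example 3
```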
Theorem 2.
$S_{\omega s}(\hat{A},\hat{B})$ is a similarity measure between two SFSs $\hat{A}$ and $\hat{B}$ in $\hat{Y}$.
Proof. 
The proof is similar to the proof of Theorem 1. □
On the basis of the new similarity measure $S_s$, we define distance measures for SFSs.
Definition 14.
For two SFSs $\hat{A}$ and $\hat{B}$ in $\hat{Y}$, a new distance measure between $\hat{A}$ and $\hat{B}$ is defined as follows:
$$D_s(\hat{A},\hat{B}) = 1 - \frac{\sum_{j=1}^{m}\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\{\xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)\} + \{\eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)\} + \{\nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)\}\big]}.$$
Definition 15.
For two SFSs $\hat{A}$ and $\hat{B}$ in $\hat{Y}$, a new weighted distance measure between $\hat{A}$ and $\hat{B}$ is defined as follows:
$$D_{\omega s}(\hat{A},\hat{B}) = 1 - \frac{\sum_{j=1}^{m}\omega_j\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\{\xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)\} + \{\eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)\} + \{\nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)\}\big]}\Big/\Big(\sum_{j=1}^{m}\omega_j\Big),$$
where $\omega_j \in [0,1]$, $1 \le j \le m$, are the weights of the alternatives, not all zero. If $\sum_{j=1}^{m}\omega_j = 1$, then
$$D_{\omega s}(\hat{A},\hat{B}) = 1 - \frac{\sum_{j=1}^{m}\omega_j\big[\xi_{\hat{A}}^{2}(y_j)\cdot\xi_{\hat{B}}^{2}(y_j) + \eta_{\hat{A}}^{2}(y_j)\cdot\eta_{\hat{B}}^{2}(y_j) + \nu_{\hat{A}}^{2}(y_j)\cdot\nu_{\hat{B}}^{2}(y_j)\big]}{\sum_{j=1}^{m}\big[\{\xi_{\hat{A}}^{4}(y_j)\vee\xi_{\hat{B}}^{4}(y_j)\} + \{\eta_{\hat{A}}^{4}(y_j)\vee\eta_{\hat{B}}^{4}(y_j)\} + \{\nu_{\hat{A}}^{4}(y_j)\vee\nu_{\hat{B}}^{4}(y_j)\}\big]}.$$
Theorem 3.
$D_s$ and $D_{\omega s}$ are distance measures between SFSs.
Proof. 
The proof is similar to the proof of Theorem 1. □

4. Applications in Pattern Recognition and Counterexamples

In this section, we provide counterexamples, in the setting of pattern recognition, for similarity measures already proposed in the literature. We show that the already proposed measures cannot classify the unknown pattern, whereas the set-theoretic similarity measure can, which demonstrates that the proposed similarity measure is applicable to pattern recognition problems.
Example 4.
In this example, we show that the second condition of Definition 10, (S2), is not satisfied by the cosine similarity measure $S_{c1}$ (Definition 4): if $\hat{A} = \{(a,a,a)/y_j \mid y_j \in \hat{Y},\ 1 \le j \le m\}$ and $\hat{B} = \{(b,b,b)/y_j \mid y_j \in \hat{Y},\ 1 \le j \le m\}$ are two SFSs in $\hat{Y}$ with $0 \le a, b \le 1$, $0 \le a^2 + a^2 + a^2 \le 1$, $0 \le b^2 + b^2 + b^2 \le 1$, and $a \ne b$, then $\hat{A} \ne \hat{B}$, yet $S_{c1}(\hat{A},\hat{B}) = 1$.
For example, let $\hat{Y} = \{y_1, y_2, y_3\}$ and consider the SFSs in $\hat{Y}$
$$\hat{A} = \{(0.50,0.50,0.50)/y_1,\ (0.30,0.30,0.30)/y_2,\ (0.40,0.40,0.40)/y_3\} \quad\text{and}\quad \hat{B} = \{(0.41,0.41,0.41)/y_1,\ (0.27,0.27,0.27)/y_2,\ (0.33,0.33,0.33)/y_3\}.$$
Clearly $\hat{A} \ne \hat{B}$, but $S_{c1}(\hat{A},\hat{B}) = 1$. Hence $S_{c1}$ is not effective in such cases and is not reliable for measuring the similarity between SFSs. When we compute the similarity with $S_s$, however, we obtain $S_s(\hat{A},\hat{B}) = 0.686175$.
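The counterexample can be verified numerically with the `s_c1` and `s_s` sketches given earlier (variable names are ours):

```python
# Example 4 data: both sets have equal degrees within each triple, but differ from each other.
A4 = [(0.50, 0.50, 0.50), (0.30, 0.30, 0.30), (0.40, 0.40, 0.40)]
B4 = [(0.41, 0.41, 0.41), (0.27, 0.27, 0.27), (0.33, 0.33, 0.33)]

print(round(s_c1(A4, B4), 6))  # 1.0 although A4 != B4, violating axiom (S2)
print(round(s_s(A4, B4), 6))   # ~0.686, the 0.686175 reported in Example 4
```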
Example 5.
Let $Q_1$ and $Q_2$ be two known patterns with class labels $Z_1$ and $Z_2$, respectively. The patterns are represented by SFSs in $\hat{Y} = \{y_1, y_2, y_3\}$ as follows:
$$Q_1 = \{(0.7,0.5,0.3)/y_1,\ (0.4,0.3,0.5)/y_2,\ (0.6,0.4,0.3)/y_3\}, \qquad Q_2 = \{(0.5,0.7,0.3)/y_1,\ (0.3,0.4,0.5)/y_2,\ (0.4,0.6,0.3)/y_3\}.$$
The unknown pattern $P$ is given as
$$P = \{(0.5,0.4,0.4)/y_1,\ (0.4,0.6,0.5)/y_2,\ (0.6,0.7,0.3)/y_3\}.$$
Our aim is to determine the class to which the unknown pattern $P$ belongs. However, the cosine similarity measure $S_{c3}$ (Definition 5) gives the same value for both patterns, $S_{c3}(P,Q_1) = S_{c3}(P,Q_2) = 0.965086$, and the cotangent similarity measure $S_{c5}$ (Definition 6) likewise gives $S_{c5}(P,Q_1) = S_{c5}(P,Q_2) = 0.767857$. Hence, in this case, the class of the unknown pattern $P$ cannot be decided using $S_{c3}$ or $S_{c5}$. When we compute the similarity with $S_s$, however, we obtain $S_s(P,Q_1) = 0.555916$ and $S_s(P,Q_2) = 0.575836$. Since $S_s(P,Q_2) > S_s(P,Q_1)$, the unknown pattern $P$ belongs to class $Z_2$.
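The classification rule used in Examples 5–8 (assign the unknown pattern to the class of the known pattern with the highest $S_s$ value) can be sketched as follows; the `classify` helper is our own, and it reuses `s_s` from the earlier sketch.

```python
def classify(P, patterns):
    """Assign P to the class whose known pattern has the highest S_s similarity."""
    return max(patterns, key=lambda label: s_s(P, patterns[label]))

# Example 5 data.
Q1 = [(0.7, 0.5, 0.3), (0.4, 0.3, 0.5), (0.6, 0.4, 0.3)]
Q2 = [(0.5, 0.7, 0.3), (0.3, 0.4, 0.5), (0.4, 0.6, 0.3)]
P  = [(0.5, 0.4, 0.4), (0.4, 0.6, 0.5), (0.6, 0.7, 0.3)]

print(classify(P, {"Z1": Q1, "Z2": Q2}))  # 'Z2': S_s(P, Q2) ~ 0.576 > S_s(P, Q1) ~ 0.556
```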
Example 6.
Let $Q_1$ and $Q_2$ be two known patterns with class labels $Z_1$ and $Z_2$, respectively. The patterns are represented by SFSs in $\hat{Y} = \{y_1, y_2, y_3\}$ as follows:
$$Q_1 = \{(0.5,0.7,0.3)/y_1,\ (0.3,0.8,0.4)/y_2,\ (0.6,0.3,0.1)/y_3\}, \qquad Q_2 = \{(0.6,0.7,0.3)/y_1,\ (0.8,0.4,0.4)/y_2,\ (0.6,0.4,0.2)/y_3\}.$$
The unknown pattern $P$ is given as
$$P = \{(0.6,0.4,0.3)/y_1,\ (0.5,0.5,0.4)/y_2,\ (0.7,0.4,0.2)/y_3\}.$$
Our aim is to determine the class to which the unknown pattern $P$ belongs. However, the cosine similarity measure $S_{c2}$ (Definition 5) gives the same value for both patterns, $S_{c2}(P,Q_1) = S_{c2}(P,Q_2) = 0.888668$, and the cotangent similarity measure $S_{c4}$ (Definition 6) likewise gives $S_{c4}(P,Q_1) = S_{c4}(P,Q_2) = 0.638144$. Hence, in this case, the class of the unknown pattern $P$ cannot be decided using $S_{c2}$ or $S_{c4}$. When we compute the similarity with $S_s$, however, we obtain $S_s(P,Q_1) = 0.50385$ and $S_s(P,Q_2) = 0.564666$. Since $S_s(P,Q_2) > S_s(P,Q_1)$, the unknown pattern $P$ belongs to class $Z_2$.
Example 7.
Let $Q_1$ and $Q_2$ be two known patterns with class labels $Z_1$ and $Z_2$, respectively. The patterns are represented by SFSs in $\hat{Y} = \{y_1, y_2, y_3\}$ as follows:
$$Q_1 = \{(0.5,0.7,0.3)/y_1,\ (0.3,0.8,0.4)/y_2,\ (0.5,0.6,0.4)/y_3\}, \qquad Q_2 = \{(0.6,0.7,0.3)/y_1,\ (0.8,0.4,0.4)/y_2,\ (0.6,0.5,0.4)/y_3\}.$$
The unknown pattern $P$ is given as
$$P = \{(0.6,0.4,0.3)/y_1,\ (0.5,0.5,0.4)/y_2,\ (0.6,0.5,0.2)/y_3\}.$$
Our aim is to determine the class to which the unknown pattern $P$ belongs. However, the cosine similarity measure $S_{c6}$ (Definition 7) gives the same value for both patterns, $S_{c6}(P,Q_1) = S_{c6}(P,Q_2) = 0.889689$, and the cotangent similarity measure $S_{c8}$ (Definition 8) likewise gives $S_{c8}(P,Q_1) = S_{c8}(P,Q_2) = 0.642526$. Hence, in this case, the class of the unknown pattern $P$ cannot be decided using $S_{c6}$ or $S_{c8}$. When we compute the similarity with $S_s$, however, we obtain $S_s(P,Q_1) = 0.492114$ and $S_s(P,Q_2) = 0.58562$. Since $S_s(P,Q_2) > S_s(P,Q_1)$, the unknown pattern $P$ belongs to class $Z_2$.
Example 8.
Let $Q_1$ and $Q_2$ be two known patterns with class labels $Z_1$ and $Z_2$, respectively. The patterns are represented by SFSs in $\hat{Y} = \{y_1, y_2, y_3\}$ as follows:
$$Q_1 = \{(0.5,0.7,0.3)/y_1,\ (0.3,0.8,0.4)/y_2,\ (0.5,0.6,0.4)/y_3\}, \qquad Q_2 = \{(0.6,0.7,0.3)/y_1,\ (0.8,0.4,0.4)/y_2,\ (0.6,0.5,0.4)/y_3\}.$$
The unknown pattern $P$ is given as
$$P = \{(0.6,0.4,0.3)/y_1,\ (0.5,0.5,0.4)/y_2,\ (0.5,0.5,0.2)/y_3\}.$$
Our aim is to determine the class to which the unknown pattern $P$ belongs. However, the cosine similarity measure $S_{c7}$ (Definition 7) gives the same value for both patterns, $S_{c7}(P,Q_1) = S_{c7}(P,Q_2) = 0.874075$, and the cotangent similarity measure $S_{c9}$ (Definition 8) likewise gives $S_{c9}(P,Q_1) = S_{c9}(P,Q_2) = 0.597149$. Hence, in this case, the class of the unknown pattern $P$ cannot be decided using $S_{c7}$ or $S_{c9}$. When we compute the similarity with $S_s$, however, we obtain $S_s(P,Q_1) = 0.497164$ and $S_s(P,Q_2) = 0.549396$. Since $S_s(P,Q_2) > S_s(P,Q_1)$, the unknown pattern $P$ belongs to class $Z_2$.

5. Selection of Mega Projects in Developing Countries

Megaprojects are characterized by vast complexity (especially in organizational terms), a large investment commitment, and a long-lasting impact on the economy, the environment, and society. It is therefore important for developing countries to choose the best method for selecting mega projects, because such projects affect the lives of millions of people, take a long time to develop and build, and involve multiple public and private stakeholders. As we have seen, the previously proposed similarity measures exhibit counter-intuitive behavior in Examples 4–8. Therefore, the selection of mega projects for developing countries is carried out with the proposed similarity measure.
It is important for developing countries to prioritize upcoming mega projects that have less effect on their economy and environment, have lower maintenance costs and long-term benefits, affect fewer people, and generate high revenue. For example, suppose a country starts a mega project financed by a loan from the World Bank; the country has to think carefully before spending the money, because the loan must be repaid later. Suppose the government is considering five projects: one million house construction, dam construction, an orange metro train, investment in industry, and the power sector. This set is represented as $U$, with elements $E_i$, $1 \le i \le 5$, that is,
$$U = \{\text{1 million house construction},\ \text{dam construction},\ \text{orange metro train},\ \text{invest in industry},\ \text{power sector}\}.$$
To select a project on a priority basis, experts from different fields selected some parameters to assess the importance of the projects: long-term benefits, time, impact, revenue generated, cost, and short-term benefits. We represent these criteria as a set $W$ with elements $e_j$, $1 \le j \le 6$, that is,
$$W = \{\text{long term benefits},\ \text{time},\ \text{impact},\ \text{revenue generated},\ \text{cost},\ \text{short term benefits}\}.$$
We apply the proposed technique to select upcoming mega projects on a priority basis, which is a classical multi-attribute decision-making problem. The weight vector for the attributes $e_j$, $j \in \{1, 2, \ldots, 6\}$, is $\hat{\omega} = (0.12, 0.25, 0.09, 0.16, 0.20, 0.18)^T$. All the data, collected as spherical fuzzy information, are summarized in Table 1: for each mega project $E_i$, $i \in \{1, 2, \ldots, 5\}$, the experts express their evaluation as SFVs corresponding to each attribute (criterion).
To apply the proposed method, we first compute the ideal alternative (mega project) $E^+$ from the given data as $\xi_j^+ = \max_i \{\xi_{ij}\}$, $\eta_j^+ = \min_i \{\eta_{ij}\}$, and $\nu_j^+ = \min_i \{\nu_{ij}\}$, $1 \le j \le 6$, $1 \le i \le 5$. Then the similarity measure between each alternative and the ideal alternative is calculated; a higher similarity value means the alternative is closer to the ideal one. In this case, the ideal alternative is
$$E^+ = \{(0.91, 0.03, 0.02),\ (0.89, 0.08, 0.03),\ (0.42, 0.35, 0.05),\ (0.73, 0.15, 0.02),\ (0.52, 0.31, 0.05),\ (0.91, 0.03, 0.05)\}.$$
The similarity measures $S_s$ between each alternative $E_i$ and the ideal alternative $E^+$ are then calculated. The details are presented in Table 2, and the ranking of the alternatives (mega projects) is
$$E_5 \succ E_4 \succ E_1 \succ E_2 \succ E_3.$$
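The selection step can be sketched in Python by reusing the `s_ws` helper from Section 3. Judging from the reported values, the entries of Table 2 appear to correspond to the weighted measure with the attribute weights applied in the numerator (as computed in Example 3); under that reading, the sketch below reproduces the reported ranking. The dictionary keys and variable names are our own.

```python
# Table 1 data: for each project E_i, the list of SFVs over the six criteria e_1..e_6.
projects = {
    "E1": [(0.53, 0.33, 0.09), (0.89, 0.08, 0.03), (0.42, 0.35, 0.18),
           (0.08, 0.89, 0.02), (0.33, 0.51, 0.12), (0.17, 0.53, 0.13)],
    "E2": [(0.73, 0.12, 0.08), (0.13, 0.64, 0.21), (0.03, 0.82, 0.13),
           (0.73, 0.15, 0.08), (0.52, 0.31, 0.16), (0.51, 0.24, 0.21)],
    "E3": [(0.91, 0.03, 0.02), (0.07, 0.09, 0.05), (0.04, 0.85, 0.10),
           (0.68, 0.26, 0.06), (0.15, 0.76, 0.07), (0.31, 0.39, 0.25)],
    "E4": [(0.85, 0.09, 0.05), (0.74, 0.16, 0.10), (0.02, 0.89, 0.05),
           (0.08, 0.84, 0.06), (0.16, 0.71, 0.05), (0.81, 0.15, 0.09)],
    "E5": [(0.90, 0.05, 0.02), (0.68, 0.08, 0.21), (0.05, 0.87, 0.06),
           (0.13, 0.75, 0.09), (0.15, 0.73, 0.08), (0.91, 0.03, 0.05)],
}
weights = [0.12, 0.25, 0.09, 0.16, 0.20, 0.18]

# Ideal alternative: max positive degree, min neutral and negative degrees per criterion.
ideal = [(max(p[j][0] for p in projects.values()),
          min(p[j][1] for p in projects.values()),
          min(p[j][2] for p in projects.values())) for j in range(6)]

scores = {name: s_ws(sfs, ideal, weights) for name, sfs in projects.items()}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ideal[0])  # (0.91, 0.03, 0.02), the first entry of E^+
print(ranking)   # expected: ['E5', 'E4', 'E1', 'E2', 'E3'], the ranking reported above
```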
The comparison between the already proposed similarity measures and proposed similarity measure is presented in Table 2.

6. Comparison Analysis

A comparison between the newly proposed similarity measure and the similarity measures already proposed for SFSs is conducted to illustrate the superiority of the new measure.
We saw in Example 4 that the second condition of Definition 10, (S2), is not satisfied by the cosine similarity measure $S_{c1}$, i.e., $S_{c1}(\hat{A},\hat{B}) = 1$ even though $\hat{A} \ne \hat{B}$; furthermore, we gave a general criterion for when this condition fails for $S_{c1}$. In Example 5, we saw that $S_{c3}$ and $S_{c5}$ cannot distinguish the class of the unknown pattern from the known patterns. In Example 6, $S_{c2}$ and $S_{c4}$ fail in the same way, as do $S_{c6}$ and $S_{c8}$ in Example 7 and $S_{c7}$ and $S_{c9}$ in Example 8.
However, in all of Examples 4–8, the new similarity measure $S_s$ classifies the unknown pattern and is therefore successfully applicable to pattern recognition problems. In Section 5, $S_s$ was applied successfully to selecting mega projects for developing countries.
From Table 3, we see that, for several special cases, the already proposed similarity measures are not suitable for classifying the unknown pattern, whereas $S_s$ applies successfully. For cases 1 and 2, the similarity measures $S_{c3}$, $S_{c5}$, $S_{c7}$, and $S_{c9}$ produce counter-intuitive results; the similarity measures $S_{c2}$, $S_{c4}$, $S_{c6}$, and $S_{c8}$ produce counter-intuitive results for cases 3 and 4. The second axiom of a similarity measure is not satisfied by $S_{c1}$ (Definition 4) in case 5: as seen in Example 4, if the membership, neutral, and non-membership degrees within each of two sets are equal to one another but differ between the two sets, then $S_{c1}$ returns 1, which is inconsistent with the definition of a similarity measure.

7. Conclusions

In this paper, we have defined new similarity measures for SFSs, called set-theoretic similarity measures. We defined a set-theoretic similarity measure, a weighted set-theoretic similarity measure, and set-theoretic and weighted set-theoretic distance measures, and provided proofs of their properties. We discussed special cases (Examples 4–8) in which the already proposed similarity measures fail to classify the unknown pattern, while the proposed similarity measure is applied successfully to pattern recognition problems. Furthermore, $S_s$ was applied successfully to selecting mega projects for developing countries.
In future work, we will apply the set-theoretic similarity measure to data mining, medical diagnosis, decision making, complex group decision making, linguistic summarization, risk analysis, pattern recognition, color image retrieval, histogram comparison, and image processing.

Author Contributions

All authors contributed equally in this research paper. All authors have read and agreed to the published version of the manuscript.

Funding

Petchra Pra Jom Klao Ph.D. Research Scholarship from King Mongkut’s University of Technology Thonburi (KMUTT) and Theoretical and Computational Science (TaCS) Center. Moreover, Poom Kumam was supported by the Thailand Research Fund and the King Mongkut’s University of Technology Thonburi under the TRF Research Scholar Grant No. RSA6080047. Moreover, this research work was financially supported by the Rajamangala University of Technology Thanyaburi (RMUTTT) (Grant No. NSF62D0604).

Acknowledgments

This project was supported by Center of Excellence in Theoretical and Computational Science (TaCS-CoE), KMUTT. The first author was supported by the “Petchra Pra Jom Klao Ph.D. Research Scholarship from King Mongkut’s University of Technology Thonburi”.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
IFS Intuitionistic Fuzzy Set
PFS Picture Fuzzy Set
SFS Spherical Fuzzy Set

References

  1. Zadeh, L.A. Fuzzy sets. Inf. Contr. 1965, 8, 338–353. [Google Scholar] [CrossRef] [Green Version]
  2. Atanassov, K.T. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96. [Google Scholar] [CrossRef]
  3. Atanassov, K.T. Interval valued intuitionistic fuzzy sets. Fuzzy Sets Syst. 1989, 31, 343–349. [Google Scholar] [CrossRef]
  4. Molodtsov, D. Soft set theory—First results. Comput. Math. Appl. 1999, 37, 19–31. [CrossRef] [Green Version]
  5. Maji, P.K.; Biswas, R.; Roy, A.R. Fuzzy soft sets. J. Fuzzy Math. 2001, 9, 589–602. [Google Scholar]
  6. Maji, P.K.; Biswas, R.; Roy, A.R. Intuitionistic fuzzy soft sets. J. Fuzzy Math. 2001, 9, 677–692. [Google Scholar]
  7. Ali, M.I. A note on soft sets, rough soft sets and fuzzy soft sets. Appl. Soft Comput. 2011, 11, 3329–3332. [Google Scholar]
  8. Yang, X.B.; Lin, T.Y.; Yang, J.Y.; Li, Y.; Yu, D.Y. Combination of interval-valued fuzzy set and soft set. Comput. Math. Appl. 2009, 58, 521–527. [Google Scholar] [CrossRef] [Green Version]
  9. Jiang, Y.; Tang, Y.; Chen, Q.; Liu, H.; Tang, J. Interval-valued intuitionistic fuzzy soft sets and their properties. Comput. Math. Appl. 2010, 60, 906–918. [Google Scholar] [CrossRef] [Green Version]
  10. Cuong, B.C. Picture fuzzy sets. J. Comput. Sci. Cybern. 2014, 30, 409–420. [Google Scholar]
  11. Yang, Y.; Liang, C.; Ji, S.; Liu, T. Adjustable soft discernibility matrix based on picture fuzzy soft sets and its application in decision making. J. Int. Fuzzy Syst. 2015, 29, 1711–1722. [Google Scholar] [CrossRef]
  12. Khan, M.J.; Kumam, P.; Ashraf, S.; Kumam, W. Generalized Picture Fuzzy Soft Sets and Their Application in Decision Support Systems. Symmetry 2019, 11, 415. [Google Scholar] [CrossRef] [Green Version]
  13. Khan, M.J.; Kumam, P.; Liu, P.; Kumam, W.; Ashraf, S. A Novel Approach to Generalized Intuitionistic Fuzzy Soft Sets and Its Application in Decision Support System. Mathematics 2019, 7, 742. [Google Scholar] [CrossRef] [Green Version]
  14. Khan, M.J.; Kumam, P.; Liu, P.; Kumam, W.; Rehman, H. An adjustable weighted soft discernibility matrix based on generalized picture fuzzy soft set and its applications in decision making. J. Int. Fuzzy Syst. 2020, 38, 2103–2118. [Google Scholar] [CrossRef]
  15. Khan, M.J.; Kumam, P.; Liu, P.; Kumam, W. Another view on generalized interval valued intuitionistic fuzzy soft set and its applications in decision support system. J. Int. Fuzzy Syst. 2019, 1–16. [Google Scholar] [CrossRef]
  16. Khan, M.J.; Phiangsungnoen, S.; Rehman, H.; Kumam, W. Applications of Generalized Picture Fuzzy Soft Set in Concept Selection. Thai J. Math. 2020, 18, 296–314. [Google Scholar]
  17. Hayat, K.; Ali, M.I.; Cao, B.Y.; Karaaslan, F.; Yang, X.P. Another View of Aggregation Operators on Group-Based Generalized Intuitionistic Fuzzy Soft Sets: Multi-Attribute Decision Making Methods. Symmetry 2018, 10, 753. [Google Scholar] [CrossRef] [Green Version]
  18. Liu, F.; Aiwu, G.; Lukovac, V.; Vukić, M. A multicriteria model for the selection of the transport service provider: A single valued neutrosophic dematel multicriteria model. Decis. Mak. Appl. Manag. Eng. 2018, 1, 121–130. [Google Scholar] [CrossRef]
  19. Si, A.; Das, S.; Kar, S. An approach to rank picture fuzzy numbers for decision making problems. Decis. Mak. Appl. Manag. Eng. 2019, 2, 54–64. [Google Scholar] [CrossRef]
  20. Yager, R.R. Pythagorean fuzzy subsets. In Proceedings of the 2013 Joint IFSA World Congress and NAFIPS Annual Meeting (IFSA/NAFIPS), Edmonton, AB, Canada, 24–28 June 2013; pp. 57–61. [Google Scholar]
  21. Yager, R.R. Pythagorean membership grades in multicriteria decision making. IEEE Trans. Fuzzy Syst. 2014, 22, 958–965. [Google Scholar] [CrossRef]
  22. Zhang, X.L.; Xu, Z.S. Extension of TOPSIS to multiple criteria decision making with Pythagorean fuzzy sets. Int. J. Intell. Syst. 2014, 29, 1061–1078. [Google Scholar] [CrossRef]
  23. Peng, X.; Yang, Y. Some results for Pythagorean fuzzy sets. Int. J. Intell. Syst. 2015, 30, 1133–1160. [Google Scholar] [CrossRef]
  24. Reformat, M.; Yager, R.R. Suggesting recommendations using Pythagorean fuzzy sets illustrated using Netflix movie data. In Proceedings of the International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems, Montpellier, France, 15–19 July 2014; Laurent, A., Strauss, O., Bouchon-Meunier, B., Yager, R.R., Eds.; Springer: Cham, Switzerland, 2014; Volume 442, pp. 546–556. [Google Scholar]
  25. Peng, X.; Yuan, H.; Yang, Y. Pythagorean Fuzzy Information Measures and Their Applications. Int. J. Intell. Syst. 2017, 32, 991–1029. [Google Scholar] [CrossRef]
  26. Ashraf, S.; Abdullah, S.; Mahmood, T.; Ghani, F.; Mahmood, T. Spherical fuzzy sets and their applications in multi-attribute decision making problems. J. Intell. Fuzzy Syst. 2019, 36, 2829–2844. [Google Scholar] [CrossRef]
  27. Ashraf, S.; Abdullah, S.; Mahmood, T. Spherical fuzzy Dombi aggregation operators and their application in group decision making problems. J. Ambient Intell. Hum. Comput. 2019, 1–19. [Google Scholar] [CrossRef]
  28. Ashraf, S.; Abdullah, S.; Aslam, M.; Qiyas, M.; Kutbi, M.A. Spherical fuzzy sets and its representation of spherical fuzzy t-norms and t-conorms. J. Intell. Fuzzy Syst. 2019, 36, 6089–6102. [Google Scholar] [CrossRef]
  29. Rafiq, M.; Ashraf, S.; Abdullah, S.; Mahmood, T.; Muhammad, S. The cosine similarity measures of spherical fuzzy sets and their applications in decision making. J. Intell. Fuzzy Syst. 2019. [Google Scholar] [CrossRef]
  30. Ashraf, S.; Abdullah, S.; Aslam, M. Symmetric sum based aggregation operators for spherical fuzzy information: Application in multi-attribute group decision making problem. J. Intell. Fuzzy Syst. 2020. [Google Scholar] [CrossRef]
  31. Ngan, R.T.; Son, L.H.; Cuong, B.C.; Ali, M. H-max distance measure of intuitionistic fuzzy sets indecision making. Appl. Soft Comput. 2018, 69, 393–425. [Google Scholar] [CrossRef]
  32. Garg, H.; Kumar, K. An advanced study on the similarity measures of intuitionistic fuzzy sets based on the set pair analysis theory and their application in decision making. Soft Comput. 2018. [Google Scholar] [CrossRef]
  33. Jiang, Q.; Jin, X.; Lee, S.; Yao, S. A new similarity/distance measure between intuitionistic fuzzy sets based on the transformed isosceles triangles and its applications to pattern recognition. Expert Syst. Appl. 2019, 116, 439–453. [Google Scholar] [CrossRef]
  34. Stanujkic, D.; Karabasevic, D. An extension of the WASPAS method for decision-making problems with intuitionistic fuzzy numbers: A case of website evaluation. Oper. Res. Eng. Sci. Theory Appl. 2018, 1, 29–39. [Google Scholar] [CrossRef]
Table 1. Data Table.
      | E_1 | E_2 | E_3 | E_4 | E_5
e_1 | (0.53, 0.33, 0.09) | (0.73, 0.12, 0.08) | (0.91, 0.03, 0.02) | (0.85, 0.09, 0.05) | (0.90, 0.05, 0.02)
e_2 | (0.89, 0.08, 0.03) | (0.13, 0.64, 0.21) | (0.07, 0.09, 0.05) | (0.74, 0.16, 0.10) | (0.68, 0.08, 0.21)
e_3 | (0.42, 0.35, 0.18) | (0.03, 0.82, 0.13) | (0.04, 0.85, 0.10) | (0.02, 0.89, 0.05) | (0.05, 0.87, 0.06)
e_4 | (0.08, 0.89, 0.02) | (0.73, 0.15, 0.08) | (0.68, 0.26, 0.06) | (0.08, 0.84, 0.06) | (0.13, 0.75, 0.09)
e_5 | (0.33, 0.51, 0.12) | (0.52, 0.31, 0.16) | (0.15, 0.76, 0.07) | (0.16, 0.71, 0.05) | (0.15, 0.73, 0.08)
e_6 | (0.17, 0.53, 0.13) | (0.51, 0.24, 0.21) | (0.31, 0.39, 0.25) | (0.81, 0.15, 0.09) | (0.91, 0.03, 0.05)
Table 2. Similarity Measures.
Similarity Measure | S(E+, E_1) | S(E+, E_2) | S(E+, E_3) | S(E+, E_4) | S(E+, E_5) | Ranking
S_s  | 0.0650871 | 0.054608 | 0.048139 | 0.079880 | 0.089495 | E_5 ≻ E_4 ≻ E_1 ≻ E_2 ≻ E_3
S_c1 | 0.102027 | 0.119422 | 0.103822 | 0.114281 | 0.114483 | E_2 ≻ E_5 ≻ E_4 ≻ E_3 ≻ E_1
S_c2 | 0.120687 | 0.120848 | 0.105494 | 0.13416 | 0.137768 | E_5 ≻ E_4 ≻ E_2 ≻ E_1 ≻ E_3
S_c3 | 0.139783 | 0.141826 | 0.142814 | 0.146582 | 0.148426 | E_5 ≻ E_4 ≻ E_3 ≻ E_2 ≻ E_1
S_c4 | 0.098745 | 0.094479 | 0.076557 | 0.095334 | 0.103948 | E_5 ≻ E_1 ≻ E_4 ≻ E_2 ≻ E_3
S_c5 | 0.112116 | 0.113426 | 0.104831 | 0.112798 | 0.117358 | E_5 ≻ E_2 ≻ E_4 ≻ E_1 ≻ E_3
S_c6 | 0.120687 | 0.120848 | 0.105494 | 0.13416 | 0.137768 | E_5 ≻ E_4 ≻ E_2 ≻ E_1 ≻ E_3
S_c7 | 0.120523 | 0.120588 | 0.105265 | 0.134042 | 0.137374 | E_5 ≻ E_4 ≻ E_2 ≻ E_1 ≻ E_3
S_c8 | 0.098745 | 0.094479 | 0.0765566 | 0.0953338 | 0.103948 | E_5 ≻ E_1 ≻ E_4 ≻ E_2 ≻ E_3
S_c9 | 0.098252 | 0.0942871 | 0.0763846 | 0.0952625 | 0.103626 | E_5 ≻ E_1 ≻ E_4 ≻ E_2 ≻ E_3
Table 3. Comparison Table.
Similarity Measure | Case 1 | Case 2 | Case 3 | Case 4 | Case 5
A | (0.5, 0.4, 0.6) | (0.5, 0.4, 0.6) | (0.6, 0.4, 0.3) | (0.6, 0.4, 0.3) | (0.5, 0.5, 0.5)
B | (0.7, 0.5, 0.3) | (0.5, 0.7, 0.3) | (0.5, 0.7, 0.4) | (0.6, 0.7, 0.3) | (0.4, 0.4, 0.4)
S_c1 | 0.749397 | 0.666344 | 0.789612 | 0.870023 | 1.00
S_c2 | 0.303801 | 0.289544 | 0.289544 | 0.289544 | 0.330008
S_c3 | 0.297002 | 0.297002 | 0.306948 | 0.322200 | 0.325867
S_c4 | 0.215231 | 0.193615 | 0.193615 | 0.193615 | 0.289252
S_c5 | 0.204267 | 0.204267 | 0.220838 | 0.256470 | 0.269206
S_c6 | 0.303801 | 0.289544 | 0.289544 | 0.289544 | 0.303801
S_c7 | 0.289544 | 0.289544 | 0.269672 | 0.289544 | 0.303801
S_c8 | 0.215231 | 0.193615 | 0.193615 | 0.193615 | 0.215231
S_c9 | 0.193615 | 0.193615 | 0.169842 | 0.193615 | 0.215231
S_s | 0.450949 | 0.400972 | 0.462434 | 0.571996 | 0.640000
