Article

Probabilistic Single-Valued (Interval) Neutrosophic Hesitant Fuzzy Set and Its Application in Multi-Attribute Decision Making

1 College of Information Engineering, Shanghai Maritime University, Shanghai 201306, China
2 Department of Mathematics, School of Arts and Sciences, Shaanxi University of Science & Technology, Xi’an 710021, China
3 Department of Mathematics, College of Arts and Sciences, Shanghai Maritime University, Shanghai 201306, China
4 College of Science, Nanjing University of Science and Technology, Nanjing 210000, China
* Author to whom correspondence should be addressed.
Symmetry 2018, 10(9), 419; https://doi.org/10.3390/sym10090419
Submission received: 21 August 2018 / Revised: 11 September 2018 / Accepted: 14 September 2018 / Published: 19 September 2018

Abstract: Many practical problems involve uncertainty and randomness at the same time. To describe aleatory uncertainty and imprecision in a neutrosophic environment and to prevent the loss of information, the concept of the probabilistic single-valued (interval) neutrosophic hesitant fuzzy set is introduced. By definition, the probabilistic single-valued neutrosophic hesitant fuzzy set (PSVNHFS) is a special case of the probabilistic interval neutrosophic hesitant fuzzy set (PINHFS), so PSVNHFSs satisfy all the properties of PINHFSs. An example is given to illustrate that PINHFS is more general than PSVNHFS, and PINHFS is therefore taken as the main research object. The basic operational relations of PINHFSs are studied, and a comparison method for probabilistic interval neutrosophic hesitant fuzzy numbers (PINHFNs) is proposed. Then, the probabilistic interval neutrosophic hesitant fuzzy weighted averaging (PINHFWA) and probabilistic interval neutrosophic hesitant fuzzy weighted geometric (PINHFWG) operators are presented and some of their basic properties are investigated. Next, based on the PINHFWA and PINHFWG operators, a decision-making method under a probabilistic interval neutrosophic hesitant fuzzy environment is established. Finally, we apply this method to an investment-selection problem, which demonstrates the validity and applicability of the new approach.

1. Introduction

In real life, uncertainty is ubiquitous, arising in areas such as expert systems, information fusion, intelligent computation and medical diagnosis. When decision problems need to be solved, building mathematical models of this uncertainty plays an important role; in particular, uncertainty must be taken into account when dealing with big-data problems. To describe the uncertainty of such problems, Zadeh [1] presented fuzzy set theory. Subsequently, many new types of fuzzy sets have been developed, including the intuitionistic fuzzy set [2], hesitant fuzzy set (HFS) [3], dual hesitant fuzzy set (DHFS) [4], interval-valued intuitionistic fuzzy set (IVIFS) [5,6], necessary and possible hesitant fuzzy sets [7] and dual hesitant fuzzy probability [8]. Fuzzy set theory is a useful tool for handling uncertain information [9] and has also been applied to algebraic systems [10,11,12,13].
At the same time, statistical uncertainty needs to be considered in practical applications, and the probabilistic method alone is not always effective for epistemic uncertainty [14]. These problems have led researchers to combine fuzzy set theory with probability theory into new fuzzy concepts, for example: (1) using probability theory as a method of knowledge representation [15,16,17,18]; (2) attaching probability values when processing fuzzy decision-making problems [19,20,21]; (3) generating fuzzy values through the combination of stochastic simulation and nonlinear programming [22,23]. Hao et al. [24] give a detailed summary. In probabilistic fuzzy circumstances, probabilistic data are easily lost; thus, under fuzzy linguistic environments [25,26,27], Pang et al. [28] established a new type of probabilistic fuzzy linguistic term set and successfully resolved these issues. In some practical problems, both ambiguity and probability must be fully considered. In 2016, Xu and Zhou [29] produced the hesitant probabilistic fuzzy set (HPFS). Then, Hao et al. [24] studied a new probabilistic dual hesitant fuzzy set (PDHFS) and applied it to uncertain risk-evaluation problems.
In [30], Smarandache introduced the neutrosophic set (NS) as a new type of fuzzy set. An NS $A$ includes three independent members: the truth membership $T_A(x)$, the indeterminacy membership $I_A(x)$ and the falsity membership $F_A(x)$, each taking values in $[0,1]$. NS theory has been widely used in algebraic systems [31,32,33,34,35,36]. Later, new types of NS were introduced, such as the single-valued NS (SVNS) [37] and the interval NS (INS) [38]. Ye applied SVNS theory to different types of decision-making (DM) problems [39,40,41] and, in [42], presented the simplified neutrosophic set (SNS). Xu and Xia applied HFS theory to practical problems [43,44,45,46], and a group DM method in a hesitant fuzzy environment was introduced by Xu et al. [47]. However, some types of problems are difficult to solve with HFSs; thus, Zhu [4] introduced DHFS theory, and Ye [48] established a correlation coefficient of DHFSs. Still, DHFS theory cannot completely express the doubts of decision makers. Subsequently, the single-valued neutrosophic hesitant fuzzy set (SVNHFS) was established by Ye [49], and the interval neutrosophic hesitant fuzzy set (INHFS) was introduced by Liu [50]. Recently, neutrosophic fuzzy set theory has been widely researched and applied [51,52,53,54,55].
Aleatory uncertainty also needs to be considered in probabilistic neutrosophic hesitant fuzzy environments. Recently, fuzzy random variables have been used to describe probabilistic information under uncertainty. However, in the NS theories above, probabilities are not considered. Thus, if a neutrosophic multi-attribute decision-making (MADM) problem under probabilistic surroundings needs to be solved, the probabilities, as a part of the fuzzy system, will be lost; until now, this problem has not been given an effective solution. Peng et al. [56] proposed the probability multi-valued neutrosophic set (PMVNS), which successfully solves multi-criteria group decision-making problems without loss of information. Building on fuzzy sets, HFS, PDHFS, NS and INHFS, we offer the notions of the probabilistic single-valued neutrosophic hesitant fuzzy set (PSVNHFS) and the probabilistic interval neutrosophic hesitant fuzzy set (PINHFS). The concept of PINHFS is used to solve MADM problems under probabilistic interval neutrosophic hesitant fuzzy circumstances. By comparison, we find that PINHFS has a wider range of application than PSVNHFS and is closer to real life, so we focus on the interval case.
The rest of the paper is organized as follows: Section 2 briefly reviews some basic definitions. In Section 3, the concepts of PSVNHFS and PINHFS are introduced, PINHFS is taken as the main research object, and a comparison method for probabilistic interval neutrosophic hesitant fuzzy numbers (PINHFNs) is proposed. In Section 4, the basic operational laws of PINHFNs are investigated. The probabilistic interval neutrosophic hesitant fuzzy weighted averaging (PINHFWA) and probabilistic interval neutrosophic hesitant fuzzy weighted geometric (PINHFWG) operators are established, and some of their basic properties are studied, in Section 5. In Section 6, a MADM method based on the PINHFWA and PINHFWG operators is proposed, and Section 7 gives an illustrative example of the method. To show that PINHFS is more general than PSVNHFS, Section 8 treats PSVNHFS as a special case of PINHFS, introduces the probabilistic single-valued neutrosophic hesitant fuzzy weighted averaging (PSVNHFWA) and probabilistic single-valued neutrosophic hesitant fuzzy weighted geometric (PSVNHFWG) operators, and gives a numerical example. Finally, we summarize the conclusions and further research work.

2. Preliminaries

Let us review some fundamental definitions of HFS, SVNHFS and INHFS in this section.
Definition 1.
([3]) Let X be a non-empty finite set; an HFS A on X is defined in terms of a function $h_A(x)$ that, when applied to X, returns a finite subset of [0, 1]. We can express an HFS as:
$$A=\{\langle x,h_A(x)\rangle\mid x\in X\},$$
where $h_A(x)$ is a set of some different values in [0, 1], representing the possible membership degrees of the element $x\in X$ to $A$. We call $h_A(x)$ a hesitant fuzzy element (HFE), denoted by $h=\{\lambda\mid\lambda\in h\}$.
Definition 2.
([49]) Let X be a fixed set; an SVNHFS on X is defined as:
$$N=\{\langle x,\tilde t(x),\tilde i(x),\tilde f(x)\rangle\mid x\in X\},$$
in which $\tilde t(x)$, $\tilde i(x)$ and $\tilde f(x)$ are three sets of some values in [0, 1], denoting the possible truth-membership hesitant degrees, indeterminacy-membership hesitant degrees and falsity-membership hesitant degrees of the element $x\in X$ to the set $N$, respectively, with the conditions $0\le\delta,\gamma,\eta\le 1$ and $0\le\delta^++\gamma^++\eta^+\le 3$, where $\delta\in\tilde t(x)$, $\gamma\in\tilde i(x)$, $\eta\in\tilde f(x)$, $\delta^+\in\tilde t^+(x)=\bigcup_{\delta\in\tilde t(x)}\max\{\delta\}$, $\gamma^+\in\tilde i^+(x)=\bigcup_{\gamma\in\tilde i(x)}\max\{\gamma\}$ and $\eta^+\in\tilde f^+(x)=\bigcup_{\eta\in\tilde f(x)}\max\{\eta\}$ for $x\in X$.
Definition 3.
([50]) Let X be a non-empty finite set; an interval neutrosophic hesitant fuzzy set (INHFS) on X is represented by:
$$A=\{\langle x,T_A(x),I_A(x),F_A(x)\rangle\mid x\in X\},$$
where $T_A(x)=\{\tilde\alpha\mid\tilde\alpha\in T_A(x)\}$, $I_A(x)=\{\tilde\beta\mid\tilde\beta\in I_A(x)\}$ and $F_A(x)=\{\tilde\gamma\mid\tilde\gamma\in F_A(x)\}$ are three sets of interval values in the real unit interval [0, 1], denoting the possible truth-membership hesitant degrees, indeterminacy-membership hesitant degrees and falsity-membership hesitant degrees of the element $x\in X$ to the set $A$, which satisfy the conditions $\tilde\alpha=[\alpha^L,\alpha^U]\subseteq[0,1]$, $\tilde\beta=[\beta^L,\beta^U]\subseteq[0,1]$, $\tilde\gamma=[\gamma^L,\gamma^U]\subseteq[0,1]$ and $0\le\sup\tilde\alpha^++\sup\tilde\beta^++\sup\tilde\gamma^+\le 3$, where $\tilde\alpha^+=\bigcup_{\tilde\alpha\in T_A(x)}\max\{\tilde\alpha\}$, $\tilde\beta^+=\bigcup_{\tilde\beta\in I_A(x)}\max\{\tilde\beta\}$ and $\tilde\gamma^+=\bigcup_{\tilde\gamma\in F_A(x)}\max\{\tilde\gamma\}$ for $x\in X$.

3. The Probabilistic Single-Valued (Interval) Neutrosophic Hesitant Fuzzy Set

In this section, the concepts of PSVNHFS and PINHFS are introduced. Since PINHFS is more general than PSVNHFS, the situation of PINHFS is mainly discussed.
Definition 4.
Let X be a fixed set. A probabilistic single-valued neutrosophic hesitant fuzzy set (PSVNHFS) on X is defined by the following mathematical symbol:
$$N_P=\{\langle x,\tilde t(x)|P^{\tilde t}(x),\tilde i(x)|P^{\tilde i}(x),\tilde f(x)|P^{\tilde f}(x)\rangle\mid x\in X\}.$$
The components $\tilde t(x)|P^{\tilde t}(x)$, $\tilde i(x)|P^{\tilde i}(x)$ and $\tilde f(x)|P^{\tilde f}(x)$ are three sets of possible elements, where $\tilde t(x)$, $\tilde i(x)$ and $\tilde f(x)$ represent the possible truth-membership hesitant degrees, indeterminacy-membership hesitant degrees and falsity-membership hesitant degrees of $x$ to the set $X$, respectively, and $P^{\tilde t}(x)$, $P^{\tilde i}(x)$ and $P^{\tilde f}(x)$ are the corresponding probabilistic information for these three types of degrees, with:
$$0\le\alpha,\beta,\gamma\le 1,\quad 0\le\alpha^++\beta^++\gamma^+\le 3;\quad P_a^{\tilde t}\in[0,1],\ P_b^{\tilde i}\in[0,1],\ P_c^{\tilde f}\in[0,1];\quad \sum_{a=1}^{\#\tilde t}P_a^{\tilde t}=1,\ \sum_{b=1}^{\#\tilde i}P_b^{\tilde i}=1,\ \sum_{c=1}^{\#\tilde f}P_c^{\tilde f}=1,$$
where $\alpha\in\tilde t(x)$, $\beta\in\tilde i(x)$, $\gamma\in\tilde f(x)$, $\alpha^+\in\tilde t^+(x)=\bigcup_{\alpha\in\tilde t(x)}\max\{\alpha\}$, $\beta^+\in\tilde i^+(x)=\bigcup_{\beta\in\tilde i(x)}\max\{\beta\}$, $\gamma^+\in\tilde f^+(x)=\bigcup_{\gamma\in\tilde f(x)}\max\{\gamma\}$, and $P_a^{\tilde t}\in P^{\tilde t}$, $P_b^{\tilde i}\in P^{\tilde i}$, $P_c^{\tilde f}\in P^{\tilde f}$. The symbols $\#\tilde t$, $\#\tilde i$ and $\#\tilde f$ are the total numbers of elements in the components $\tilde t(x)|P^{\tilde t}(x)$, $\tilde i(x)|P^{\tilde i}(x)$ and $\tilde f(x)|P^{\tilde f}(x)$, respectively.
For convenience, we call $n_{\tilde p}=\{\tilde t(x)|P^{\tilde t}(x),\tilde i(x)|P^{\tilde i}(x),\tilde f(x)|P^{\tilde f}(x)\}$ a probabilistic single-valued neutrosophic hesitant fuzzy number (PSVNHFN), denoted by the symbol $n_{\tilde p}=\{\tilde t|P^{\tilde t},\tilde i|P^{\tilde i},\tilde f|P^{\tilde f}\}$.
Next, a numerical example about investment options is used to explain the PSVNHFS.
Example 1.
From four investment alternatives $A_h$ ($h=1,2,3,4$), an investment company must select exactly one investment option. The company wants an effective evaluation so as to choose the best investment opportunity; thus, the decision maker uses PSVNHFS theory. According to the practical situation, there are three main attributes: (1) $C_1$ is the risk of the investment; (2) $C_2$ is the future outlook; (3) $C_3$ is the environment index. The data on these four options are represented by PSVNHFSs, as illustrated in Table 1, Table 2, Table 3 and Table 4. Each table is called a probabilistic single-valued neutrosophic hesitant fuzzy decision matrix (PSVNHFDM).
In the real world, the three types of hesitant degrees often have to be expressed as interval values rather than crisp numbers; such intervals better capture the choices people face in real life, but the PSVNHFS is not an effective tool for this situation. Thus, we need a new method, in which the PSVNHFS appears as a special case of the probabilistic interval neutrosophic hesitant fuzzy environment. Therefore, the probabilistic interval neutrosophic hesitant fuzzy set (PINHFS) is proposed and studied. Its advantages are that the PSVNHFS can be studied within a wider framework and that the scope of application is broader and closer to real life. Hence, we give the concept of PINHFS below and, in the rest of this paper, take PINHFS as the object of research.
Definition 5.
Let X be a fixed set; a probabilistic interval neutrosophic hesitant fuzzy set (PINHFS) on X is defined by the following mathematical symbol:
$$N=\{\langle x,T(x)|P^T(x),I(x)|P^I(x),F(x)|P^F(x)\rangle\mid x\in X\}.$$
The components $T(x)|P^T(x)$, $I(x)|P^I(x)$ and $F(x)|P^F(x)$ are three sets of possible elements, where $T(x)$, $I(x)$ and $F(x)$ are three sets of interval values in the real unit interval [0, 1], denoting the possible truth-membership hesitant degrees, indeterminacy-membership hesitant degrees and falsity-membership hesitant degrees of the element $x\in X$ to the set $N$, respectively, and $P^T(x)$, $P^I(x)$ and $P^F(x)$ are the corresponding probabilistic information for these three types of degrees, with:
$$\tilde\alpha=[\alpha^L,\alpha^U]\subseteq[0,1],\ \tilde\beta=[\beta^L,\beta^U]\subseteq[0,1],\ \tilde\gamma=[\gamma^L,\gamma^U]\subseteq[0,1];\quad 0\le\sup\tilde\alpha^++\sup\tilde\beta^++\sup\tilde\gamma^+\le 3;\quad P_a^T\in[0,1],\ P_b^I\in[0,1],\ P_c^F\in[0,1];\quad \sum_{a=1}^{\#T}P_a^T=1,\ \sum_{b=1}^{\#I}P_b^I=1,\ \sum_{c=1}^{\#F}P_c^F=1,$$
where $\tilde\alpha\in T(x)$, $\tilde\beta\in I(x)$ and $\tilde\gamma\in F(x)$, $\tilde\alpha^+=\bigcup_{\tilde\alpha\in T(x)}\max\{\tilde\alpha\}$, $\tilde\beta^+=\bigcup_{\tilde\beta\in I(x)}\max\{\tilde\beta\}$, $\tilde\gamma^+=\bigcup_{\tilde\gamma\in F(x)}\max\{\tilde\gamma\}$, and $P_a^T\in P^T$, $P_b^I\in P^I$, $P_c^F\in P^F$. The symbols $\#T$, $\#I$ and $\#F$ are the total numbers of elements in the components $T(x)|P^T(x)$, $I(x)|P^I(x)$ and $F(x)|P^F(x)$, respectively.
For convenience, we call $n=\{T(x)|P^T(x),I(x)|P^I(x),F(x)|P^F(x)\}$ a probabilistic interval neutrosophic hesitant fuzzy number (PINHFN), denoted by the symbol $n=\{T|P^T,I|P^I,F|P^F\}$.
If $\alpha^L=\alpha^U$, $\beta^L=\beta^U$ and $\gamma^L=\gamma^U$, the PINHFS reduces to a PSVNHFS.
Therefore, PINHFS is more general than PSVNHFS, and PSVNHFSs satisfy all the properties of PINHFSs. Thus, this paper mainly studies PINHFS.
Definition 6.
For a PINHFN $n$, where $a=1,2,\ldots,\#T$, $b=1,2,\ldots,\#I$ and $c=1,2,\ldots,\#F$, the score function $s(n)$ is defined as:
$$s(n)=\frac{\sum_{a=1}^{\#T}(\alpha_a^L+\alpha_a^U)P_a^T+\sum_{b=1}^{\#I}\bigl(2-(\beta_b^L+\beta_b^U)\bigr)P_b^I+\sum_{c=1}^{\#F}\bigl(2-(\gamma_c^L+\gamma_c^U)\bigr)P_c^F}{6},$$
where $\#T$, $\#I$ and $\#F$ are the total numbers of elements in the components $T(x)|P^T(x)$, $I(x)|P^I(x)$ and $F(x)|P^F(x)$, respectively.
Definition 7.
For a PINHFN $n$, where $a=1,2,\ldots,\#T$, $b=1,2,\ldots,\#I$ and $c=1,2,\ldots,\#F$, the deviation function $d(n)$ is defined as:
$$d(n)=\frac{\sum_{a=1}^{\#T}\bigl(\alpha_a^L+\alpha_a^U-2s(n)\bigr)^2 P_a^T+\sum_{b=1}^{\#I}\bigl(2-\beta_b^L-\beta_b^U-2s(n)\bigr)^2 P_b^I+\sum_{c=1}^{\#F}\bigl(2-\gamma_c^L-\gamma_c^U-2s(n)\bigr)^2 P_c^F}{4},$$
where $\#T$, $\#I$ and $\#F$ are the total numbers of elements in the components $T(x)|P^T(x)$, $I(x)|P^I(x)$ and $F(x)|P^F(x)$, respectively.
Definition 8.
Let $n_1$ and $n_2$ be two PINHFNs; the comparison method for $n_1$ and $n_2$ is as follows (a computational sketch follows this definition):
(1) If $s(n_1)>s(n_2)$, then $n_1>n_2$;
(2) If $s(n_1)=s(n_2)$ and $d(n_1)>d(n_2)$, then $n_1>n_2$;
(3) If $s(n_1)=s(n_2)$ and $d(n_1)=d(n_2)$, then $n_1=n_2$.
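The short Python sketch below illustrates how the score function of Definition 6, the deviation function of Definition 7 and the comparison rules of Definition 8 can be computed for a PINHFN stored as three lists of (interval, probability) pairs. The data layout and function names are our own illustrative choices, not code from the paper.

```python
# A PINHFN is stored as three lists of ([lower, upper], probability) pairs:
# truth T|P^T, indeterminacy I|P^I and falsity F|P^F (Definition 5).

def score(n):
    """Score function s(n) of Definition 6."""
    T, I, F = n
    t = sum((lo + up) * p for (lo, up), p in T)
    i = sum((2 - (lo + up)) * p for (lo, up), p in I)
    f = sum((2 - (lo + up)) * p for (lo, up), p in F)
    return (t + i + f) / 6

def deviation(n):
    """Deviation function d(n) of Definition 7."""
    s = score(n)
    T, I, F = n
    t = sum((lo + up - 2 * s) ** 2 * p for (lo, up), p in T)
    i = sum((2 - lo - up - 2 * s) ** 2 * p for (lo, up), p in I)
    f = sum((2 - lo - up - 2 * s) ** 2 * p for (lo, up), p in F)
    return (t + i + f) / 4

def compare(n1, n2):
    """Comparison of Definition 8: 1 if n1 > n2, -1 if n1 < n2, 0 if equal."""
    s1, s2 = score(n1), score(n2)
    if s1 != s2:
        return 1 if s1 > s2 else -1
    d1, d2 = deviation(n1), deviation(n2)
    if d1 != d2:
        return 1 if d1 > d2 else -1
    return 0

# Example: the aggregated PINHFN n2 from Section 7 (Step 1) gives s(n2) = 0.7731.
n2 = ([([0.6, 0.7], 1.0)],
      [([0.1, 0.1682], 1.0)],
      [([0.1189, 0.2213], 0.2), ([0.1516, 0.2551], 0.8)])
print(round(score(n2), 4))  # 0.7731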

4. Some Basic Operations of PINHFNs

Definition 9.
Let $n_1=\{T_1|P^{T_1},I_1|P^{I_1},F_1|P^{F_1}\}$ and $n_2=\{T_2|P^{T_2},I_2|P^{I_2},F_2|P^{F_2}\}$ be two PINHFNs; then:
(1) $(n_1)^c=\bigcup_{\tilde\alpha_1\in T_1,\tilde\beta_1\in I_1,\tilde\gamma_1\in F_1}\bigl\{\{\tilde\gamma_1\mid P_1^{F_1}\},\{[1-\beta_1^U,1-\beta_1^L]\mid P_1^{I_1}\},\{\tilde\alpha_1\mid P_1^{T_1}\}\bigr\}$;
(2) $n_1\cup n_2=\bigcup_{\tilde\alpha_1\in T_1,\tilde\beta_1\in I_1,\tilde\gamma_1\in F_1,\ \tilde\eta_2\in T_2,\tilde\theta_2\in I_2,\tilde\mu_2\in F_2}\bigl\{\{\tilde\alpha_1\cup\tilde\eta_2\mid P_1^{T_1}P_2^{T_2}\},\{\tilde\beta_1\cap\tilde\theta_2\mid P_1^{I_1}P_2^{I_2}\},\{\tilde\gamma_1\cap\tilde\mu_2\mid P_1^{F_1}P_2^{F_2}\}\bigr\}$;
(3) $n_1\cap n_2=\bigcup_{\tilde\alpha_1\in T_1,\tilde\beta_1\in I_1,\tilde\gamma_1\in F_1,\ \tilde\eta_2\in T_2,\tilde\theta_2\in I_2,\tilde\mu_2\in F_2}\bigl\{\{\tilde\alpha_1\cap\tilde\eta_2\mid P_1^{T_1}P_2^{T_2}\},\{\tilde\beta_1\cup\tilde\theta_2\mid P_1^{I_1}P_2^{I_2}\},\{\tilde\gamma_1\cup\tilde\mu_2\mid P_1^{F_1}P_2^{F_2}\}\bigr\}$;
(4) $(n_1)^\lambda=\bigcup_{\tilde\alpha_1\in T_1,\tilde\beta_1\in I_1,\tilde\gamma_1\in F_1}\bigl\{\{[(\alpha_1^L)^\lambda,(\alpha_1^U)^\lambda]\mid P_1^{T_1}\},\{[1-(1-\beta_1^L)^\lambda,1-(1-\beta_1^U)^\lambda]\mid P_1^{I_1}\},\{[1-(1-\gamma_1^L)^\lambda,1-(1-\gamma_1^U)^\lambda]\mid P_1^{F_1}\}\bigr\}$;
(5) $\lambda(n_1)=\bigcup_{\tilde\alpha_1\in T_1,\tilde\beta_1\in I_1,\tilde\gamma_1\in F_1}\bigl\{\{[1-(1-\alpha_1^L)^\lambda,1-(1-\alpha_1^U)^\lambda]\mid P_1^{T_1}\},\{[(\beta_1^L)^\lambda,(\beta_1^U)^\lambda]\mid P_1^{I_1}\},\{[(\gamma_1^L)^\lambda,(\gamma_1^U)^\lambda]\mid P_1^{F_1}\}\bigr\}$;
(6) $n_1\oplus n_2=\bigcup_{\tilde\alpha_1\in T_1,\tilde\beta_1\in I_1,\tilde\gamma_1\in F_1,\ \tilde\eta_2\in T_2,\tilde\theta_2\in I_2,\tilde\mu_2\in F_2}\bigl\{\{[\alpha_1^L+\eta_2^L-\alpha_1^L\eta_2^L,\ \alpha_1^U+\eta_2^U-\alpha_1^U\eta_2^U]\mid P_1^{T_1}P_2^{T_2}\},\{[\beta_1^L\theta_2^L,\beta_1^U\theta_2^U]\mid P_1^{I_1}P_2^{I_2}\},\{[\gamma_1^L\mu_2^L,\gamma_1^U\mu_2^U]\mid P_1^{F_1}P_2^{F_2}\}\bigr\}$;
(7) $n_1\otimes n_2=\bigcup_{\tilde\alpha_1\in T_1,\tilde\beta_1\in I_1,\tilde\gamma_1\in F_1,\ \tilde\eta_2\in T_2,\tilde\theta_2\in I_2,\tilde\mu_2\in F_2}\bigl\{\{[\alpha_1^L\eta_2^L,\alpha_1^U\eta_2^U]\mid P_1^{T_1}P_2^{T_2}\},\{[\beta_1^L+\theta_2^L-\beta_1^L\theta_2^L,\ \beta_1^U+\theta_2^U-\beta_1^U\theta_2^U]\mid P_1^{I_1}P_2^{I_2}\},\{[\gamma_1^L+\mu_2^L-\gamma_1^L\mu_2^L,\ \gamma_1^U+\mu_2^U-\gamma_1^U\mu_2^U]\mid P_1^{F_1}P_2^{F_2}\}\bigr\}$;
where $P_1^{T_1}$, $P_1^{I_1}$ and $P_1^{F_1}$ are the hesitant probabilities of $\tilde\alpha_1\in T_1$, $\tilde\beta_1\in I_1$ and $\tilde\gamma_1\in F_1$, respectively, and $P_2^{T_2}$, $P_2^{I_2}$ and $P_2^{F_2}$ are the corresponding hesitant probabilities of $\tilde\eta_2\in T_2$, $\tilde\theta_2\in I_2$ and $\tilde\mu_2\in F_2$.
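As an illustration of how the algebraic operations of Definition 9 act element by element on the hesitant components, the sketch below implements the sum $n_1\oplus n_2$ from item (6), reusing the (interval, probability) representation introduced after Definition 8. It is a minimal reading of that one item, not code from the paper; the product $n_1\otimes n_2$ of item (7) is obtained by swapping the two combination rules.

```python
def pinhfn_add(n1, n2):
    """Sum n1 (+) n2 of two PINHFNs (Definition 9, item (6)).

    Truth intervals combine by the probabilistic sum a + b - a*b,
    indeterminacy and falsity intervals combine by the product,
    and the attached probabilities multiply.
    """
    T1, I1, F1 = n1
    T2, I2, F2 = n2
    T = [([a_lo + b_lo - a_lo * b_lo, a_up + b_up - a_up * b_up], p * q)
         for (a_lo, a_up), p in T1 for (b_lo, b_up), q in T2]
    I = [([a_lo * b_lo, a_up * b_up], p * q)
         for (a_lo, a_up), p in I1 for (b_lo, b_up), q in I2]
    F = [([a_lo * b_lo, a_up * b_up], p * q)
         for (a_lo, a_up), p in F1 for (b_lo, b_up), q in F2]
    return (T, I, F)
```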
Theorem 1.
Let $n_1$ and $n_2$ be two PINHFNs; then $(n_1)^c$, $n_1\cup n_2$, $n_1\cap n_2$, $(n_1)^\lambda$, $\lambda(n_1)$, $n_1\oplus n_2$ and $n_1\otimes n_2$ are also PINHFNs.
Proof. 
By Definitions 5 and 9, the result is easily proven. ☐
Theorem 2.
Let $n_1=\{T_1|P^{T_1},I_1|P^{I_1},F_1|P^{F_1}\}$, $n_2=\{T_2|P^{T_2},I_2|P^{I_2},F_2|P^{F_2}\}$ and $n_3=\{T_3|P^{T_3},I_3|P^{I_3},F_3|P^{F_3}\}$ be three PINHFNs and $\lambda,\lambda_1,\lambda_2\ge 0$; then:
(1) $n_1\oplus n_2=n_2\oplus n_1$; $n_1\otimes n_2=n_2\otimes n_1$;
(2) $(n_1\oplus n_2)\oplus n_3=n_1\oplus(n_2\oplus n_3)$; $(n_1\otimes n_2)\otimes n_3=n_1\otimes(n_2\otimes n_3)$;
(3) $\lambda(n_1\oplus n_2)=\lambda(n_1)\oplus\lambda(n_2)$;
(4) $(n_1\otimes n_2)^\lambda=(n_1)^\lambda\otimes(n_2)^\lambda$;
(5) $(n_1)^{\lambda_1+\lambda_2}=(n_1)^{\lambda_1}\otimes(n_1)^{\lambda_2}$; $(\lambda_1+\lambda_2)n_1=\lambda_1(n_1)\oplus\lambda_2(n_1)$.
Proof. 
Let $P_1^{T_1}$, $P_1^{I_1}$ and $P_1^{F_1}$ be the probabilities of $\tilde\alpha_1\in T_1$, $\tilde\beta_1\in I_1$ and $\tilde\gamma_1\in F_1$, respectively; let $P_2^{T_2}$, $P_2^{I_2}$ and $P_2^{F_2}$ be the corresponding probabilities of $\tilde\eta_2\in T_2$, $\tilde\theta_2\in I_2$ and $\tilde\mu_2\in F_2$; and let $P_3^{T_3}$, $P_3^{I_3}$ and $P_3^{F_3}$ be the corresponding probabilities of $\tilde\xi_3\in T_3$, $\tilde\sigma_3\in I_3$ and $\tilde\phi_3\in F_3$. Then:
(1) By Definition 9, we can see that (1) is true.
(2)
$$(n_1\oplus n_2)\oplus n_3=\bigcup\Bigl\{\bigl\{[\,\alpha_1^L+(\eta_2^L+\xi_3^L-\eta_2^L\xi_3^L)-\alpha_1^L(\eta_2^L+\xi_3^L-\eta_2^L\xi_3^L),\ \alpha_1^U+(\eta_2^U+\xi_3^U-\eta_2^U\xi_3^U)-\alpha_1^U(\eta_2^U+\xi_3^U-\eta_2^U\xi_3^U)\,]\mid P_1^{T_1}(P_2^{T_2}P_3^{T_3})\bigr\},\ \bigl\{[\,\beta_1^L(\theta_2^L\sigma_3^L),\ \beta_1^U(\theta_2^U\sigma_3^U)\,]\mid P_1^{I_1}(P_2^{I_2}P_3^{I_3})\bigr\},\ \bigl\{[\,\gamma_1^L(\mu_2^L\phi_3^L),\ \gamma_1^U(\mu_2^U\phi_3^U)\,]\mid P_1^{F_1}(P_2^{F_2}P_3^{F_3})\bigr\}\Bigr\}=n_1\oplus(n_2\oplus n_3).$$
Similarly, we can obtain $(n_1\otimes n_2)\otimes n_3=n_1\otimes(n_2\otimes n_3)$.
(3)
$$\lambda(n_1\oplus n_2)=\bigcup\Bigl\{\bigl\{[\,1-(1-(\alpha_1^L+\eta_2^L-\alpha_1^L\eta_2^L))^\lambda,\ 1-(1-(\alpha_1^U+\eta_2^U-\alpha_1^U\eta_2^U))^\lambda\,]\mid P_1^{T_1}P_2^{T_2}\bigr\},\ \bigl\{[\,(\beta_1^L)^\lambda(\theta_2^L)^\lambda,\ (\beta_1^U)^\lambda(\theta_2^U)^\lambda\,]\mid P_1^{I_1}P_2^{I_2}\bigr\},\ \bigl\{[\,(\gamma_1^L)^\lambda(\mu_2^L)^\lambda,\ (\gamma_1^U)^\lambda(\mu_2^U)^\lambda\,]\mid P_1^{F_1}P_2^{F_2}\bigr\}\Bigr\}$$
$$=\bigcup\Bigl\{\bigl\{[\,1-(1-\alpha_1^L)^\lambda,\ 1-(1-\alpha_1^U)^\lambda\,]\mid P_1^{T_1}\bigr\},\bigl\{[\,(\beta_1^L)^\lambda,(\beta_1^U)^\lambda\,]\mid P_1^{I_1}\bigr\},\bigl\{[\,(\gamma_1^L)^\lambda,(\gamma_1^U)^\lambda\,]\mid P_1^{F_1}\bigr\}\Bigr\}\oplus\bigcup\Bigl\{\bigl\{[\,1-(1-\eta_2^L)^\lambda,\ 1-(1-\eta_2^U)^\lambda\,]\mid P_2^{T_2}\bigr\},\bigl\{[\,(\theta_2^L)^\lambda,(\theta_2^U)^\lambda\,]\mid P_2^{I_2}\bigr\},\bigl\{[\,(\mu_2^L)^\lambda,(\mu_2^U)^\lambda\,]\mid P_2^{F_2}\bigr\}\Bigr\}=\lambda(n_1)\oplus\lambda(n_2).$$
(4)
$$(n_1\otimes n_2)^\lambda=\bigcup\Bigl\{\bigl\{[\,(\alpha_1^L\eta_2^L)^\lambda,(\alpha_1^U\eta_2^U)^\lambda\,]\mid P_1^{T_1}P_2^{T_2}\bigr\},\ \bigl\{[\,1-(1-(\beta_1^L+\theta_2^L-\beta_1^L\theta_2^L))^\lambda,\ 1-(1-(\beta_1^U+\theta_2^U-\beta_1^U\theta_2^U))^\lambda\,]\mid P_1^{I_1}P_2^{I_2}\bigr\},\ \bigl\{[\,1-(1-(\gamma_1^L+\mu_2^L-\gamma_1^L\mu_2^L))^\lambda,\ 1-(1-(\gamma_1^U+\mu_2^U-\gamma_1^U\mu_2^U))^\lambda\,]\mid P_1^{F_1}P_2^{F_2}\bigr\}\Bigr\}$$
$$=\bigcup\Bigl\{\bigl\{[\,(\alpha_1^L)^\lambda,(\alpha_1^U)^\lambda\,]\mid P_1^{T_1}\bigr\},\bigl\{[\,1-(1-\beta_1^L)^\lambda,1-(1-\beta_1^U)^\lambda\,]\mid P_1^{I_1}\bigr\},\bigl\{[\,1-(1-\gamma_1^L)^\lambda,1-(1-\gamma_1^U)^\lambda\,]\mid P_1^{F_1}\bigr\}\Bigr\}\otimes\bigcup\Bigl\{\bigl\{[\,(\eta_2^L)^\lambda,(\eta_2^U)^\lambda\,]\mid P_2^{T_2}\bigr\},\bigl\{[\,1-(1-\theta_2^L)^\lambda,1-(1-\theta_2^U)^\lambda\,]\mid P_2^{I_2}\bigr\},\bigl\{[\,1-(1-\mu_2^L)^\lambda,1-(1-\mu_2^U)^\lambda\,]\mid P_2^{F_2}\bigr\}\Bigr\}=(n_1)^\lambda\otimes(n_2)^\lambda.$$
(5)
$$(n_1)^{\lambda_1+\lambda_2}=\bigcup\Bigl\{\bigl\{[\,(\alpha_1^L)^{\lambda_1+\lambda_2},(\alpha_1^U)^{\lambda_1+\lambda_2}\,]\mid P_1^{T_1}\bigr\},\bigl\{[\,1-(1-\beta_1^L)^{\lambda_1+\lambda_2},1-(1-\beta_1^U)^{\lambda_1+\lambda_2}\,]\mid P_1^{I_1}\bigr\},\bigl\{[\,1-(1-\gamma_1^L)^{\lambda_1+\lambda_2},1-(1-\gamma_1^U)^{\lambda_1+\lambda_2}\,]\mid P_1^{F_1}\bigr\}\Bigr\}$$
$$=\bigcup\Bigl\{\bigl\{[\,(\alpha_1^L)^{\lambda_1},(\alpha_1^U)^{\lambda_1}\,]\mid P_1^{T_1}\bigr\},\bigl\{[\,1-(1-\beta_1^L)^{\lambda_1},1-(1-\beta_1^U)^{\lambda_1}\,]\mid P_1^{I_1}\bigr\},\bigl\{[\,1-(1-\gamma_1^L)^{\lambda_1},1-(1-\gamma_1^U)^{\lambda_1}\,]\mid P_1^{F_1}\bigr\}\Bigr\}\otimes\bigcup\Bigl\{\bigl\{[\,(\alpha_1^L)^{\lambda_2},(\alpha_1^U)^{\lambda_2}\,]\mid P_1^{T_1}\bigr\},\bigl\{[\,1-(1-\beta_1^L)^{\lambda_2},1-(1-\beta_1^U)^{\lambda_2}\,]\mid P_1^{I_1}\bigr\},\bigl\{[\,1-(1-\gamma_1^L)^{\lambda_2},1-(1-\gamma_1^U)^{\lambda_2}\,]\mid P_1^{F_1}\bigr\}\Bigr\}=(n_1)^{\lambda_1}\otimes(n_1)^{\lambda_2}.$$
Similarly, we have $(\lambda_1+\lambda_2)n_1=\lambda_1(n_1)\oplus\lambda_2(n_1)$. ☐
Theorem 3.
Let $n_1$ and $n_2$ be two PINHFNs and $\lambda\ge 0$; then:
(1) $((n_1)^c)^\lambda=(\lambda(n_1))^c$;
(2) $\lambda((n_1)^c)=((n_1)^\lambda)^c$;
(3) $(n_1)^c\oplus(n_2)^c=(n_1\otimes n_2)^c$;
(4) $(n_1)^c\otimes(n_2)^c=(n_1\oplus n_2)^c$.
Proof. 
Let $P_1^{T_1}$, $P_1^{I_1}$ and $P_1^{F_1}$ be the hesitant probabilities of $\tilde\alpha_1\in T_1$, $\tilde\beta_1\in I_1$ and $\tilde\gamma_1\in F_1$, respectively, and let $P_2^{T_2}$, $P_2^{I_2}$ and $P_2^{F_2}$ be the corresponding hesitant probabilities of $\tilde\eta_2\in T_2$, $\tilde\theta_2\in I_2$ and $\tilde\mu_2\in F_2$. Then:
(1)
$$((n_1)^c)^\lambda=\Bigl(\bigcup\bigl\{\{[\gamma_1^L,\gamma_1^U]\mid P_1^{F_1}\},\{[1-\beta_1^U,1-\beta_1^L]\mid P_1^{I_1}\},\{[\alpha_1^L,\alpha_1^U]\mid P_1^{T_1}\}\bigr\}\Bigr)^\lambda=\bigcup\bigl\{\{[(\gamma_1^L)^\lambda,(\gamma_1^U)^\lambda]\mid P_1^{F_1}\},\{[1-(\beta_1^U)^\lambda,1-(\beta_1^L)^\lambda]\mid P_1^{I_1}\},\{[1-(1-\alpha_1^L)^\lambda,1-(1-\alpha_1^U)^\lambda]\mid P_1^{T_1}\}\bigr\}=\Bigl(\lambda\bigl(\bigcup\{\{[\alpha_1^L,\alpha_1^U]\mid P_1^{T_1}\},\{[\beta_1^L,\beta_1^U]\mid P_1^{I_1}\},\{[\gamma_1^L,\gamma_1^U]\mid P_1^{F_1}\}\}\bigr)\Bigr)^c=(\lambda(n_1))^c.$$
(2)
$$\lambda((n_1)^c)=\lambda\Bigl(\bigcup\bigl\{\{[\gamma_1^L,\gamma_1^U]\mid P_1^{F_1}\},\{[1-\beta_1^U,1-\beta_1^L]\mid P_1^{I_1}\},\{[\alpha_1^L,\alpha_1^U]\mid P_1^{T_1}\}\bigr\}\Bigr)=\bigcup\bigl\{\{[1-(1-\gamma_1^L)^\lambda,1-(1-\gamma_1^U)^\lambda]\mid P_1^{F_1}\},\{[(1-\beta_1^U)^\lambda,(1-\beta_1^L)^\lambda]\mid P_1^{I_1}\},\{[(\alpha_1^L)^\lambda,(\alpha_1^U)^\lambda]\mid P_1^{T_1}\}\bigr\}$$
$$=\Bigl(\bigcup\bigl\{\{[(\alpha_1^L)^\lambda,(\alpha_1^U)^\lambda]\mid P_1^{T_1}\},\{[1-(1-\beta_1^L)^\lambda,1-(1-\beta_1^U)^\lambda]\mid P_1^{I_1}\},\{[1-(1-\gamma_1^L)^\lambda,1-(1-\gamma_1^U)^\lambda]\mid P_1^{F_1}\}\bigr\}\Bigr)^c=((n_1)^\lambda)^c.$$
(3)
$$(n_1)^c\oplus(n_2)^c=\bigcup\bigl\{\{[\gamma_1^L,\gamma_1^U]\mid P_1^{F_1}\},\{[1-\beta_1^U,1-\beta_1^L]\mid P_1^{I_1}\},\{[\alpha_1^L,\alpha_1^U]\mid P_1^{T_1}\}\bigr\}\oplus\bigcup\bigl\{\{[\mu_2^L,\mu_2^U]\mid P_2^{F_2}\},\{[1-\theta_2^U,1-\theta_2^L]\mid P_2^{I_2}\},\{[\eta_2^L,\eta_2^U]\mid P_2^{T_2}\}\bigr\}$$
$$=\bigcup\bigl\{\{[\gamma_1^L+\mu_2^L-\gamma_1^L\mu_2^L,\ \gamma_1^U+\mu_2^U-\gamma_1^U\mu_2^U]\mid P_1^{F_1}P_2^{F_2}\},\{[(1-\beta_1^U)(1-\theta_2^U),\ (1-\beta_1^L)(1-\theta_2^L)]\mid P_1^{I_1}P_2^{I_2}\},\{[\alpha_1^L\eta_2^L,\ \alpha_1^U\eta_2^U]\mid P_1^{T_1}P_2^{T_2}\}\bigr\}$$
$$=\Bigl(\bigcup\bigl\{\{[\alpha_1^L\eta_2^L,\alpha_1^U\eta_2^U]\mid P_1^{T_1}P_2^{T_2}\},\{[\beta_1^L+\theta_2^L-\beta_1^L\theta_2^L,\ \beta_1^U+\theta_2^U-\beta_1^U\theta_2^U]\mid P_1^{I_1}P_2^{I_2}\},\{[\gamma_1^L+\mu_2^L-\gamma_1^L\mu_2^L,\ \gamma_1^U+\mu_2^U-\gamma_1^U\mu_2^U]\mid P_1^{F_1}P_2^{F_2}\}\bigr\}\Bigr)^c=(n_1\otimes n_2)^c.$$
(4)
$$(n_1)^c\otimes(n_2)^c=\bigcup\bigl\{\{[\gamma_1^L,\gamma_1^U]\mid P_1^{F_1}\},\{[1-\beta_1^U,1-\beta_1^L]\mid P_1^{I_1}\},\{[\alpha_1^L,\alpha_1^U]\mid P_1^{T_1}\}\bigr\}\otimes\bigcup\bigl\{\{[\mu_2^L,\mu_2^U]\mid P_2^{F_2}\},\{[1-\theta_2^U,1-\theta_2^L]\mid P_2^{I_2}\},\{[\eta_2^L,\eta_2^U]\mid P_2^{T_2}\}\bigr\}$$
$$=\bigcup\bigl\{\{[\gamma_1^L\mu_2^L,\ \gamma_1^U\mu_2^U]\mid P_1^{F_1}P_2^{F_2}\},\{[1-\beta_1^U\theta_2^U,\ 1-\beta_1^L\theta_2^L]\mid P_1^{I_1}P_2^{I_2}\},\{[\alpha_1^L+\eta_2^L-\alpha_1^L\eta_2^L,\ \alpha_1^U+\eta_2^U-\alpha_1^U\eta_2^U]\mid P_1^{T_1}P_2^{T_2}\}\bigr\}$$
$$=\Bigl(\bigcup\bigl\{\{[\alpha_1^L+\eta_2^L-\alpha_1^L\eta_2^L,\ \alpha_1^U+\eta_2^U-\alpha_1^U\eta_2^U]\mid P_1^{T_1}P_2^{T_2}\},\{[\beta_1^L\theta_2^L,\ \beta_1^U\theta_2^U]\mid P_1^{I_1}P_2^{I_2}\},\{[\gamma_1^L\mu_2^L,\ \gamma_1^U\mu_2^U]\mid P_1^{F_1}P_2^{F_2}\}\bigr\}\Bigr)^c=(n_1\oplus n_2)^c.$$
 ☐
The PSVNHFS also satisfies the above properties; the proofs are omitted.

5. The Basic Aggregation Operators for PINHFSs

Definition 10.
Let $n_j$ ($j=1,2,\ldots,X$) be a non-empty collection of PINHFNs; then the probabilistic interval neutrosophic hesitant fuzzy weighted averaging (PINHFWA) operator is defined as:
$$\mathrm{PINHFWA}(n_1,n_2,\ldots,n_X)=\bigoplus_{j=1}^{X}w_j(n_j)=\bigcup\Bigl\{\bigl\{[\,1-\prod_{j=1}^{X}(1-\alpha_j^L)^{w_j},\ 1-\prod_{j=1}^{X}(1-\alpha_j^U)^{w_j}\,]\mid\prod_{j=1}^{X}P_j^{T_j}\bigr\},\ \bigl\{[\,\prod_{j=1}^{X}(\beta_j^L)^{w_j},\ \prod_{j=1}^{X}(\beta_j^U)^{w_j}\,]\mid\prod_{j=1}^{X}P_j^{I_j}\bigr\},\ \bigl\{[\,\prod_{j=1}^{X}(\gamma_j^L)^{w_j},\ \prod_{j=1}^{X}(\gamma_j^U)^{w_j}\,]\mid\prod_{j=1}^{X}P_j^{F_j}\bigr\}\Bigr\},$$
where $[\alpha_j^L,\alpha_j^U]=\tilde\alpha_j\in T_j$, $[\beta_j^L,\beta_j^U]=\tilde\beta_j\in I_j$, $[\gamma_j^L,\gamma_j^U]=\tilde\gamma_j\in F_j$, and $P_j^{T_j}$, $P_j^{I_j}$ and $P_j^{F_j}$ are the corresponding hesitant probabilities of $\tilde\alpha_j$, $\tilde\beta_j$ and $\tilde\gamma_j$. For $j=1,2,\ldots,X$, $w_j$ is the weight of $n_j$ with $\sum_{j=1}^{X}w_j=1$. If all weights equal $\frac{1}{X}$, then the PINHFWA operator reduces to the probabilistic interval neutrosophic hesitant fuzzy averaging (PINHFA) operator:
$$\mathrm{PINHFA}(n_1,n_2,\ldots,n_X)=\bigoplus_{j=1}^{X}\tfrac{1}{X}(n_j)=\bigcup\Bigl\{\bigl\{[\,1-\prod_{j=1}^{X}(1-\alpha_j^L)^{1/X},\ 1-\prod_{j=1}^{X}(1-\alpha_j^U)^{1/X}\,]\mid\prod_{j=1}^{X}P_j^{T_j}\bigr\},\ \bigl\{[\,\prod_{j=1}^{X}(\beta_j^L)^{1/X},\ \prod_{j=1}^{X}(\beta_j^U)^{1/X}\,]\mid\prod_{j=1}^{X}P_j^{I_j}\bigr\},\ \bigl\{[\,\prod_{j=1}^{X}(\gamma_j^L)^{1/X},\ \prod_{j=1}^{X}(\gamma_j^U)^{1/X}\,]\mid\prod_{j=1}^{X}P_j^{F_j}\bigr\}\Bigr\}.$$
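To make the aggregation concrete, the sketch below implements the PINHFWA operator of Definition 10 directly from its closed product form, using the illustrative (interval, probability) representation from the earlier sketches. The helper names and data layout are assumptions of ours, not part of the paper.

```python
from functools import reduce
from itertools import product

def _enumerate(components, combine):
    """Cross product of the hesitant elements of each PINHFN component:
    one aggregated (interval, probability) pair per combination, with the
    probabilities multiplied, as in Definitions 10 and 11."""
    out = []
    for picks in product(*components):
        lo = combine([iv[0] for iv, _ in picks])
        up = combine([iv[1] for iv, _ in picks])
        prob = reduce(lambda x, y: x * y, (p for _, p in picks), 1.0)
        out.append(([lo, up], prob))
    return out

def pinhfwa(pinhfns, weights):
    """PINHFWA operator (Definition 10); weights must sum to 1."""
    def avg(vals):      # 1 - prod (1 - a)^w, for the truth bounds
        return 1 - reduce(lambda x, y: x * y,
                          ((1 - a) ** w for a, w in zip(vals, weights)), 1.0)
    def geo(vals):      # prod a^w, for the indeterminacy and falsity bounds
        return reduce(lambda x, y: x * y,
                      (a ** w for a, w in zip(vals, weights)), 1.0)
    Ts, Is, Fs = ([n[i] for n in pinhfns] for i in range(3))
    return (_enumerate(Ts, avg), _enumerate(Is, geo), _enumerate(Fs, geo))
```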
Theorem 4.
(Monotonicity) Let $n_j=\{\{\tilde\alpha_j|P_j^{T_j}\},\{\tilde\beta_j|P_j^{I_j}\},\{\tilde\gamma_j|P_j^{F_j}\}\}$ and $m_j=\{\{\tilde\eta_j|P_j^{T_j*}\},\{\tilde\theta_j|P_j^{I_j*}\},\{\tilde\mu_j|P_j^{F_j*}\}\}$ ($j=1,2,\ldots,X$) be two collections of PINHFNs, and let $w_j$ be weights with $\sum_{j=1}^{X}w_j=1$. If $P_j^{T_j}=P_j^{T_j*}$, $P_j^{I_j}=P_j^{I_j*}$, $P_j^{F_j}=P_j^{F_j*}$ and $\alpha_j^L\le\eta_j^L$, $\alpha_j^U\le\eta_j^U$, $\beta_j^L\ge\theta_j^L$, $\beta_j^U\ge\theta_j^U$, $\gamma_j^L\ge\mu_j^L$, $\gamma_j^U\ge\mu_j^U$, then:
$$\mathrm{PINHFWA}(n_1,n_2,\ldots,n_X)\le\mathrm{PINHFWA}(m_1,m_2,\ldots,m_X).$$
Proof. 
Since $\alpha_j^L\le\eta_j^L$, $\alpha_j^U\le\eta_j^U$, $\beta_j^L\ge\theta_j^L$, $\beta_j^U\ge\theta_j^U$, $\gamma_j^L\ge\mu_j^L$ and $\gamma_j^U\ge\mu_j^U$ for all $j$, we have:
$$1-(1-\alpha_j^L)^{w_j}\le 1-(1-\eta_j^L)^{w_j},\quad 1-(1-\alpha_j^U)^{w_j}\le 1-(1-\eta_j^U)^{w_j};\qquad (\beta_j^L)^{w_j}\ge(\theta_j^L)^{w_j},\quad (\beta_j^U)^{w_j}\ge(\theta_j^U)^{w_j};\qquad (\gamma_j^L)^{w_j}\ge(\mu_j^L)^{w_j},\quad (\gamma_j^U)^{w_j}\ge(\mu_j^U)^{w_j}.$$
Since, in addition, $P_j^{T_j}=P_j^{T_j*}$, $P_j^{I_j}=P_j^{I_j*}$ and $P_j^{F_j}=P_j^{F_j*}$, we can obtain:
$$\bigl(1-(1-\alpha_j^L)^{w_j}\bigr)P_j^{T_j}\le\bigl(1-(1-\eta_j^L)^{w_j}\bigr)P_j^{T_j*},\quad (\beta_j^L)^{w_j}P_j^{I_j}\ge(\theta_j^L)^{w_j}P_j^{I_j*},\quad (\gamma_j^L)^{w_j}P_j^{F_j}\ge(\mu_j^L)^{w_j}P_j^{F_j*},$$
$$\bigl(1-(1-\alpha_j^U)^{w_j}\bigr)P_j^{T_j}\le\bigl(1-(1-\eta_j^U)^{w_j}\bigr)P_j^{T_j*},\quad (\beta_j^U)^{w_j}P_j^{I_j}\ge(\theta_j^U)^{w_j}P_j^{I_j*},\quad (\gamma_j^U)^{w_j}P_j^{F_j}\ge(\mu_j^U)^{w_j}P_j^{F_j*}.$$
Then, by the score function (Definition 6) and Definition 8, we have $\mathrm{PINHFWA}(n_1,n_2,\ldots,n_X)\le\mathrm{PINHFWA}(m_1,m_2,\ldots,m_X)$. ☐
Theorem 5.
(Boundedness) Let $n_j=\{\{\tilde\alpha_j|P_j^{T_j}\},\{\tilde\beta_j|P_j^{I_j}\},\{\tilde\gamma_j|P_j^{F_j}\}\}$ ($j=1,2,\ldots,X$) be PINHFNs, where $\tilde\alpha_j\in T_j$, $\tilde\beta_j\in I_j$, $\tilde\gamma_j\in F_j$, and $P_j^{T_j}$, $P_j^{I_j}$ and $P_j^{F_j}$ are the hesitant probabilities of $\tilde\alpha_j$, $\tilde\beta_j$ and $\tilde\gamma_j$, respectively; $w_j$ ($j=1,2,\ldots,X$) are weights with $\sum_{j=1}^{X}w_j=1$. If:
$$N^-=\bigl\{\{[\min_j\{\alpha_j^L\},\min_j\{\alpha_j^U\}]\mid\min_j\{P_j^{T_j}\}\},\{[\max_j\{\beta_j^L\},\max_j\{\beta_j^U\}]\mid\max_j\{P_j^{I_j}\}\},\{[\max_j\{\gamma_j^L\},\max_j\{\gamma_j^U\}]\mid\max_j\{P_j^{F_j}\}\}\bigr\},$$
$$N^+=\bigl\{\{[\max_j\{\alpha_j^L\},\max_j\{\alpha_j^U\}]\mid\max_j\{P_j^{T_j}\}\},\{[\min_j\{\beta_j^L\},\min_j\{\beta_j^U\}]\mid\min_j\{P_j^{I_j}\}\},\{[\min_j\{\gamma_j^L\},\min_j\{\gamma_j^U\}]\mid\min_j\{P_j^{F_j}\}\}\bigr\},$$
then:
$$\mathrm{PINHFWA}(N^-)\le\mathrm{PINHFWA}(n_1,n_2,\ldots,n_X)\le\mathrm{PINHFWA}(N^+).$$
Proof. 
For all PINHFNs $n_j$, we have:
$$\min_j\{\alpha_j^L\}\le\alpha_j^L\le\max_j\{\alpha_j^L\},\quad \min_j\{\alpha_j^U\}\le\alpha_j^U\le\max_j\{\alpha_j^U\};\quad \min_j\{\beta_j^L\}\le\beta_j^L\le\max_j\{\beta_j^L\},\quad \min_j\{\beta_j^U\}\le\beta_j^U\le\max_j\{\beta_j^U\};\quad \min_j\{\gamma_j^L\}\le\gamma_j^L\le\max_j\{\gamma_j^L\},\quad \min_j\{\gamma_j^U\}\le\gamma_j^U\le\max_j\{\gamma_j^U\};\quad \min_j\{P_j^{T_j}\}\le P_j^{T_j}\le\max_j\{P_j^{T_j}\},\quad \min_j\{P_j^{I_j}\}\le P_j^{I_j}\le\max_j\{P_j^{I_j}\},\quad \min_j\{P_j^{F_j}\}\le P_j^{F_j}\le\max_j\{P_j^{F_j}\}.$$
Thus,
$$1-\prod_{j=1}^{X}(1-\alpha_j^L)^{w_j}\ge 1-\prod_{j=1}^{X}\bigl(1-\min_j\{\alpha_j^L\}\bigr)^{w_j}=\min_j\{\alpha_j^L\},\qquad 1-\prod_{j=1}^{X}(1-\alpha_j^U)^{w_j}\ge 1-\prod_{j=1}^{X}\bigl(1-\min_j\{\alpha_j^U\}\bigr)^{w_j}=\min_j\{\alpha_j^U\},$$
$$\prod_{j=1}^{X}(\beta_j^L)^{w_j}\le\prod_{j=1}^{X}\bigl(\max_j\{\beta_j^L\}\bigr)^{w_j}=\max_j\{\beta_j^L\},\quad \prod_{j=1}^{X}(\beta_j^U)^{w_j}\le\max_j\{\beta_j^U\},\quad \prod_{j=1}^{X}(\gamma_j^L)^{w_j}\le\max_j\{\gamma_j^L\},\quad \prod_{j=1}^{X}(\gamma_j^U)^{w_j}\le\max_j\{\gamma_j^U\}.$$
Next, by Definition 10, we have:
$$\mathrm{PINHFWA}(N^-)=\bigl\{\{[\min_j\{\alpha_j^L\},\min_j\{\alpha_j^U\}]\mid\min_j\{P_j^{T_j}\}\},\{[\max_j\{\beta_j^L\},\max_j\{\beta_j^U\}]\mid\max_j\{P_j^{I_j}\}\},\{[\max_j\{\gamma_j^L\},\max_j\{\gamma_j^U\}]\mid\max_j\{P_j^{F_j}\}\}\bigr\}.$$
By the score function (Definition 6) and Definition 8, we obtain $\mathrm{PINHFWA}(N^-)\le\mathrm{PINHFWA}(n_1,n_2,\ldots,n_X)$. Similarly, we have $\mathrm{PINHFWA}(n_1,n_2,\ldots,n_X)\le\mathrm{PINHFWA}(N^+)$. ☐
Theorem 6.
(Idempotency) If $n_j=\{\{[\alpha^L,\alpha^U]|P_1\},\{[\beta^L,\beta^U]|P_2\},\{[\gamma^L,\gamma^U]|P_3\}\}$ for all $j=1,2,\ldots,X$, and $w_j$ is the weight of $n_j$ with $\sum_{j=1}^{X}w_j=1$, then:
$$\mathrm{PINHFWA}(n_1,n_2,\ldots,n_X)=\{\{[\alpha^L,\alpha^U]|P_1\},\{[\beta^L,\beta^U]|P_2\},\{[\gamma^L,\gamma^U]|P_3\}\}.$$
Proof. 
Since $n_j=\{\{[\alpha^L,\alpha^U]|P_1\},\{[\beta^L,\beta^U]|P_2\},\{[\gamma^L,\gamma^U]|P_3\}\}$ for all $j$, we have:
$$1-\prod_{j=1}^{X}(1-\alpha^L)^{w_j}=1-(1-\alpha^L)^{\sum_j w_j}=\alpha^L,\qquad 1-\prod_{j=1}^{X}(1-\alpha^U)^{w_j}=\alpha^U;$$
$$\prod_{j=1}^{X}(\beta^L)^{w_j}=\beta^L,\quad \prod_{j=1}^{X}(\beta^U)^{w_j}=\beta^U,\quad \prod_{j=1}^{X}(\gamma^L)^{w_j}=\gamma^L,\quad \prod_{j=1}^{X}(\gamma^U)^{w_j}=\gamma^U,\quad \prod_{j=1}^{X}(P_1)^{w_j}=P_1,\quad \prod_{j=1}^{X}(P_2)^{w_j}=P_2,\quad \prod_{j=1}^{X}(P_3)^{w_j}=P_3.$$
It is easy to get:
$$\mathrm{PINHFWA}(n_1,n_2,\ldots,n_X)=\{\{[\alpha^L,\alpha^U]|P_1\},\{[\beta^L,\beta^U]|P_2\},\{[\gamma^L,\gamma^U]|P_3\}\}.$$
 ☐
Theorem 7.
(Commutativity) If $A=\{n_1,n_2,\ldots,n_X\}$ is a collection of PINHFNs and $B=\{m_1,m_2,\ldots,m_X\}$ is a permutation of $A$, then:
$$\mathrm{PINHFWA}(n_1,n_2,\ldots,n_X)=\mathrm{PINHFWA}(m_1,m_2,\ldots,m_X).$$
Proof. 
By Definition 10, it is easy to prove it. ☐
Definition 11.
Let $n_j$ ($j=1,2,\ldots,X$) be a non-empty collection of PINHFNs; the probabilistic interval neutrosophic hesitant fuzzy weighted geometric (PINHFWG) operator is defined as:
$$\mathrm{PINHFWG}(n_1,n_2,\ldots,n_X)=\bigotimes_{j=1}^{X}(n_j)^{w_j}=\bigcup\Bigl\{\bigl\{[\,\prod_{j=1}^{X}(\alpha_j^L)^{w_j},\ \prod_{j=1}^{X}(\alpha_j^U)^{w_j}\,]\mid\prod_{j=1}^{X}P_j^{T_j}\bigr\},\ \bigl\{[\,1-\prod_{j=1}^{X}(1-\beta_j^L)^{w_j},\ 1-\prod_{j=1}^{X}(1-\beta_j^U)^{w_j}\,]\mid\prod_{j=1}^{X}P_j^{I_j}\bigr\},\ \bigl\{[\,1-\prod_{j=1}^{X}(1-\gamma_j^L)^{w_j},\ 1-\prod_{j=1}^{X}(1-\gamma_j^U)^{w_j}\,]\mid\prod_{j=1}^{X}P_j^{F_j}\bigr\}\Bigr\},$$
where $[\alpha_j^L,\alpha_j^U]=\tilde\alpha_j\in T_j$, $[\beta_j^L,\beta_j^U]=\tilde\beta_j\in I_j$, $[\gamma_j^L,\gamma_j^U]=\tilde\gamma_j\in F_j$, and $P_j^{T_j}$, $P_j^{I_j}$ and $P_j^{F_j}$ are the corresponding hesitant probabilities of $\tilde\alpha_j$, $\tilde\beta_j$ and $\tilde\gamma_j$. For $j=1,2,\ldots,X$, $w_j$ is the weight of $n_j$ with $\sum_{j=1}^{X}w_j=1$. If all weights equal $\frac{1}{X}$, then the PINHFWG operator reduces to the probabilistic interval neutrosophic hesitant fuzzy geometric (PINHFG) operator:
$$\mathrm{PINHFG}(n_1,n_2,\ldots,n_X)=\bigotimes_{j=1}^{X}(n_j)^{1/X}=\bigcup\Bigl\{\bigl\{[\,\prod_{j=1}^{X}(\alpha_j^L)^{1/X},\ \prod_{j=1}^{X}(\alpha_j^U)^{1/X}\,]\mid\prod_{j=1}^{X}P_j^{T_j}\bigr\},\ \bigl\{[\,1-\prod_{j=1}^{X}(1-\beta_j^L)^{1/X},\ 1-\prod_{j=1}^{X}(1-\beta_j^U)^{1/X}\,]\mid\prod_{j=1}^{X}P_j^{I_j}\bigr\},\ \bigl\{[\,1-\prod_{j=1}^{X}(1-\gamma_j^L)^{1/X},\ 1-\prod_{j=1}^{X}(1-\gamma_j^U)^{1/X}\,]\mid\prod_{j=1}^{X}P_j^{F_j}\bigr\}\Bigr\}.$$
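Under the same illustrative data representation, the PINHFWG operator differs from the PINHFWA sketch only in which components use the product rule and which use the complement-product rule. The minimal variant below reuses the `_enumerate` helper and the `reduce` import from the PINHFWA sketch and is, again, only our reading of Definition 11.

```python
def pinhfwg(pinhfns, weights):
    """PINHFWG operator (Definition 11): the geometric counterpart of pinhfwa,
    with the two combination rules swapped between the components."""
    def geo(vals):      # prod a^w, now for the truth bounds
        return reduce(lambda x, y: x * y,
                      (a ** w for a, w in zip(vals, weights)), 1.0)
    def co_geo(vals):   # 1 - prod (1 - a)^w, for indeterminacy and falsity
        return 1 - reduce(lambda x, y: x * y,
                          ((1 - a) ** w for a, w in zip(vals, weights)), 1.0)
    Ts, Is, Fs = ([n[i] for n in pinhfns] for i in range(3))
    return (_enumerate(Ts, geo), _enumerate(Is, co_geo), _enumerate(Fs, co_geo))
```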
Theorem 8.
(Monotonicity) Let $n_j=\{\{\tilde\alpha_j|P_j^{T_j}\},\{\tilde\beta_j|P_j^{I_j}\},\{\tilde\gamma_j|P_j^{F_j}\}\}$ and $m_j=\{\{\tilde\eta_j|P_j^{T_j*}\},\{\tilde\theta_j|P_j^{I_j*}\},\{\tilde\mu_j|P_j^{F_j*}\}\}$ ($j=1,2,\ldots,X$) be two collections of PINHFNs, and let $w_j$ be weights with $\sum_{j=1}^{X}w_j=1$. If $P_j^{T_j}=P_j^{T_j*}$, $P_j^{I_j}=P_j^{I_j*}$, $P_j^{F_j}=P_j^{F_j*}$ and $\alpha_j^L\le\eta_j^L$, $\alpha_j^U\le\eta_j^U$, $\beta_j^L\ge\theta_j^L$, $\beta_j^U\ge\theta_j^U$, $\gamma_j^L\ge\mu_j^L$, $\gamma_j^U\ge\mu_j^U$, then:
$$\mathrm{PINHFWG}(n_1,n_2,\ldots,n_X)\le\mathrm{PINHFWG}(m_1,m_2,\ldots,m_X).$$
Proof. 
This is similar to Theorem 4. ☐
Theorem 9.
(Boundedness) Let $n_j=\{\{\tilde\alpha_j|P_j^{T_j}\},\{\tilde\beta_j|P_j^{I_j}\},\{\tilde\gamma_j|P_j^{F_j}\}\}$ ($j=1,2,\ldots,X$) be PINHFNs, where $\tilde\alpha_j\in T_j$, $\tilde\beta_j\in I_j$, $\tilde\gamma_j\in F_j$, and $P_j^{T_j}$, $P_j^{I_j}$ and $P_j^{F_j}$ are the hesitant probabilities of $\tilde\alpha_j$, $\tilde\beta_j$ and $\tilde\gamma_j$, respectively; $w_j$ ($j=1,2,\ldots,X$) are weights with $\sum_{j=1}^{X}w_j=1$. If:
$$P^-=\bigl\{\{[\min_j\{\alpha_j^L\},\min_j\{\alpha_j^U\}]\mid\min_j\{P_j^{T_j}\}\},\{[\max_j\{\beta_j^L\},\max_j\{\beta_j^U\}]\mid\max_j\{P_j^{I_j}\}\},\{[\max_j\{\gamma_j^L\},\max_j\{\gamma_j^U\}]\mid\max_j\{P_j^{F_j}\}\}\bigr\},$$
$$P^+=\bigl\{\{[\max_j\{\alpha_j^L\},\max_j\{\alpha_j^U\}]\mid\max_j\{P_j^{T_j}\}\},\{[\min_j\{\beta_j^L\},\min_j\{\beta_j^U\}]\mid\min_j\{P_j^{I_j}\}\},\{[\min_j\{\gamma_j^L\},\min_j\{\gamma_j^U\}]\mid\min_j\{P_j^{F_j}\}\}\bigr\},$$
then:
$$\mathrm{PINHFWG}(P^-)\le\mathrm{PINHFWG}(n_1,n_2,\ldots,n_X)\le\mathrm{PINHFWG}(P^+).$$
Proof. 
This is similar to Theorem 5. ☐
Theorem 10.
(Idempotency) If $n_j=\{\{[\alpha^L,\alpha^U]|P_1\},\{[\beta^L,\beta^U]|P_2\},\{[\gamma^L,\gamma^U]|P_3\}\}$ for all $j=1,2,\ldots,X$, and $w_j$ is the weight of $n_j$ with $\sum_{j=1}^{X}w_j=1$, then:
$$\mathrm{PINHFWG}(n_1,n_2,\ldots,n_X)=\{\{[\alpha^L,\alpha^U]|P_1\},\{[\beta^L,\beta^U]|P_2\},\{[\gamma^L,\gamma^U]|P_3\}\}.$$
Proof. 
This is similar to Theorem 6. ☐
Theorem 11.
(Commutativity) If $A=\{n_1,n_2,\ldots,n_X\}$ is a collection of PINHFNs and $B=\{m_1,m_2,\ldots,m_X\}$ is a permutation of $A$, then:
$$\mathrm{PINHFWG}(n_1,n_2,\ldots,n_X)=\mathrm{PINHFWG}(m_1,m_2,\ldots,m_X).$$
Proof. 
This follows from Definition 11. ☐
Lemma 1.
([3]) Let $x_i\ge 0$, $w_i\ge 0$ ($i=1,2,\ldots,n$) and $\sum_{i=1}^{n}w_i=1$; then:
$$\prod_{i=1}^{n}(x_i)^{w_i}\le\sum_{i=1}^{n}w_i x_i.$$
Theorem 12.
If $n_j=\{\{\tilde\alpha_j|P_j^{T_j}\},\{\tilde\beta_j|P_j^{I_j}\},\{\tilde\gamma_j|P_j^{F_j}\}\}$ ($j=1,2,\ldots,X$) is a collection of PINHFNs and $w_j$ is the weight of $n_j$ with $w_j\ge 0$ and $\sum_{j=1}^{X}w_j=1$, then:
$$\mathrm{PINHFWG}(n_1,n_2,\ldots,n_X)\le\mathrm{PINHFWA}(n_1,n_2,\ldots,n_X),\qquad \mathrm{PINHFG}(n_1,n_2,\ldots,n_X)\le\mathrm{PINHFA}(n_1,n_2,\ldots,n_X).$$
Proof. 
Since $\tilde\alpha_j=[\alpha_j^L,\alpha_j^U]$, $\tilde\beta_j=[\beta_j^L,\beta_j^U]$, $\tilde\gamma_j=[\gamma_j^L,\gamma_j^U]$ with $\alpha_j^L,\alpha_j^U\in[0,1]$, by Lemma 1 we have:
$$\prod_{j=1}^{X}(\alpha_j^L)^{w_j}\le\sum_{j=1}^{X}w_j\alpha_j^L=1-\sum_{j=1}^{X}w_j(1-\alpha_j^L)\le 1-\prod_{j=1}^{X}(1-\alpha_j^L)^{w_j},\qquad \prod_{j=1}^{X}(\alpha_j^U)^{w_j}\le\sum_{j=1}^{X}w_j\alpha_j^U=1-\sum_{j=1}^{X}w_j(1-\alpha_j^U)\le 1-\prod_{j=1}^{X}(1-\alpha_j^U)^{w_j}.$$
Thus, we can obtain:
$$\prod_{j=1}^{X}(\alpha_j^L)^{w_j}\cdot\prod_{j=1}^{X}P_j^{T_j}\le\Bigl(1-\prod_{j=1}^{X}(1-\alpha_j^L)^{w_j}\Bigr)\prod_{j=1}^{X}P_j^{T_j},\qquad \prod_{j=1}^{X}(\alpha_j^U)^{w_j}\cdot\prod_{j=1}^{X}P_j^{T_j}\le\Bigl(1-\prod_{j=1}^{X}(1-\alpha_j^U)^{w_j}\Bigr)\prod_{j=1}^{X}P_j^{T_j}.$$
Similarly, we can also get:
$$\prod_{j=1}^{X}(\beta_j^L)^{w_j}\cdot\prod_{j=1}^{X}P_j^{I_j}\le\Bigl(1-\prod_{j=1}^{X}(1-\beta_j^L)^{w_j}\Bigr)\prod_{j=1}^{X}P_j^{I_j},\qquad \prod_{j=1}^{X}(\beta_j^U)^{w_j}\cdot\prod_{j=1}^{X}P_j^{I_j}\le\Bigl(1-\prod_{j=1}^{X}(1-\beta_j^U)^{w_j}\Bigr)\prod_{j=1}^{X}P_j^{I_j},$$
$$\prod_{j=1}^{X}(\gamma_j^L)^{w_j}\cdot\prod_{j=1}^{X}P_j^{F_j}\le\Bigl(1-\prod_{j=1}^{X}(1-\gamma_j^L)^{w_j}\Bigr)\prod_{j=1}^{X}P_j^{F_j},\qquad \prod_{j=1}^{X}(\gamma_j^U)^{w_j}\cdot\prod_{j=1}^{X}P_j^{F_j}\le\Bigl(1-\prod_{j=1}^{X}(1-\gamma_j^U)^{w_j}\Bigr)\prod_{j=1}^{X}P_j^{F_j}.$$
Next, by the score function (Definition 6), we know:
$$\mathrm{PINHFWG}(n_1,n_2,\ldots,n_X)\le\mathrm{PINHFWA}(n_1,n_2,\ldots,n_X).$$
Following the same steps, the inequality $\mathrm{PINHFG}(n_1,n_2,\ldots,n_X)\le\mathrm{PINHFA}(n_1,n_2,\ldots,n_X)$ also holds. ☐

6. MADM Based on the PINHFWA and PINHFWG Operators

In this section, the PINHFWA and PINHFWG operators are used to solve MADM problems with probabilistic interval neutrosophic hesitant fuzzy circumstances.
Let $A=\{A_1,A_2,\ldots,A_M\}$ be a collection of options and $C=\{C_1,C_2,\ldots,C_N\}$ a set of attributes. The assessment of $A_h$ ($h=1,2,\ldots,M$) with respect to the attribute $C_k$ ($k=1,2,\ldots,N$) is represented by the PINHFN $n_{hk}=\{T_{hk}|P^{T_{hk}},I_{hk}|P^{I_{hk}},F_{hk}|P^{F_{hk}}\}$, from which we construct a probabilistic interval neutrosophic hesitant fuzzy decision matrix (PINHFDM) $D=(n_{hk})_{M\times N}$ ($h=1,2,\ldots,M$; $k=1,2,\ldots,N$). The weight vector of $C$ is $w=(w_1,w_2,\ldots,w_N)$. The following steps select an optimal option (a compact sketch of the procedure is given after the list):
  • Step 1. Use the PINHFWA or PINHFWG operator to aggregate the N PINHFNs of each alternative $A_h$, $h=1,2,\ldots,M$.
  • Step 2. Calculate the score values of the aggregated PINHFNs; if two scores $s(n)$ are equal, compare the deviation values.
  • Step 3. Rank the alternatives and select the optimal option $A_h$.
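As a compact illustration of Steps 1 to 3, the sketch below chains the earlier pinhfwa (or pinhfwg), score and deviation helpers into a ranking routine. The decision matrix is assumed to be a list of rows, one per alternative, each row holding the N PINHFNs of that alternative in attribute order; this layout is our own illustrative choice.

```python
def rank_alternatives(decision_matrix, weights, aggregate=None):
    """MADM procedure of Section 6: aggregate each row, then rank by score,
    breaking ties by deviation (Definition 8). Returns alternative indices,
    best first."""
    aggregate = aggregate or pinhfwa          # Step 1: PINHFWA by default
    collective = [aggregate(row, weights) for row in decision_matrix]
    keyed = [(score(n), deviation(n), h) for h, n in enumerate(collective)]
    # Steps 2 and 3: sort by score, then deviation, both descending.
    keyed.sort(key=lambda t: (t[0], t[1]), reverse=True)
    return [h for _, _, h in keyed]
```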

7. Illustrative Example

The background of the numerical case comes from Example 1, so it is not described again in detail. The weight vector of C is $w=(0.35,0.25,0.4)$. Four PINHFDMs are established, as illustrated in Table 5, Table 6, Table 7 and Table 8.
  • Step 1. Select the PINHFWA operator to aggregate all PINHFNs of n h k ( h = 1 , 2 , 3 , 4 ; k = 1 , 2 , 3 ) to obtain the collective PINHFN n h ( h = 1 , 2 , 3 , 4 ) for the alternative A h ( h = 1 , 2 , 3 , 4 ) .
    n 1 = P I N H F W A ( n 11 , n 12 , n 13 ) = { { [ 0.2895 , 0.3903 ] | 0.05 , [ 0.3212 , 0.4234 ] | 0.05 , [ 0.3268 , 0.3903 ] | 0.05 , [ 0.3568 , 0.4234 ] | 0.05 , [ 0.3268 , 0.4280 ] | 0.4 , [ 0.3568 , 0.4590 ] | 0.4 } , { [ 0.1189 , 0.2213 ] | 1 } , { [ 0.3366 , 0.407 ] | 0.49 , [ 0.368 , 0.4378 ] | 0.21 , [ 0.3366 , 0.4373 ] | 0.21 , [ 0.368 , 0.4704 ] | 0.09 } ; n 2 = P I N H F W A ( n 21 , n 22 , n 23 ) = { { [ 0.6 , 0.7 ] | 1 } , { [ 0.1 , 0.1682 ] | 1 } , { [ 0.1189 , 0.2213 ] | 0.2 , [ 0.1516 , 0.2551 ] | 0.8 } } ; n 3 = P I N H F W A ( n 31 , n 32 , n 33 ) = { { [ 0.4375 , 0.5390 ] | 0.3 , [ 0.5 , 0.6 ] | 0.7 } , { [ 0.1516 , 0.2821 ] | 0.4 , [ 0.2 , 0.3318 ] | 0.6 } , { [ 0.2213 , 0.3224 ] | 1 } } ; n 4 = P I N H F W A ( n 41 , n 42 , n 43 ) = { { 0.5476 , 0.6807 ] | 1 } , { [ 0 , 0.1552 ] | 1 } , { [ 0.1189 , 0.2 ] | 0.2 , [ 0.1845 , 0.2352 ] | 0.8 } } .
  • Step 2. By the score function in Definition 6, compute the score values of all PINHFNs $n_h$ ($h=1,2,3,4$):
    $s(n_1)=0.6104$, $s(n_2)=0.7731$, $s(n_3)=0.6711$, $s(n_4)=0.7789$.
  • Step 3. Rank the PINHFNs by Definition 8; we have:
    A 4 > A 2 > A 3 > A 1 .
    Thus, we know that A 4 is the best choice.
    Next, we will make use of the PINHFWG operator to solve the MADM problem.
  • Step 1’. Aggregate PINHFNs n h k ( h = 1 , 2 , 3 , 4 ; k = 1 , 2 , 3 ) by taking advantage of the PINHFWG operator to get the collective PINHFN n h for A h .
    n 1 = P I N H F W G ( n 11 , n 12 , n 13 ) = { { [ 0.2741 , 0.377 ] | 0.05 , [ 0.2898 , 0.3946 ] | 0.05 , [ 0.3031 , 0.377 ] | 0.05 , [ 0.3205 , 0.3946 ] | 0.05 , [ 0.3031 , 0.4076 ] | 0.4 , [ 0.3205 , 0.4266 ] | 0.4 } { [ 0.1261 , 0.2263 ] | 1 } , { [ 0.3419 , 0.4203 ] | 0.49 , [ 0.3881 , 0.4689 ] | 0.21 , [ 0.3419 , 0.4422 ] | 0.21 , [ 0.3881 , 0.4898 ] | 0.09 } } ; n 2 = P I N H F W G ( n 21 , n 22 , n 23 ) = { { [ 0.6 , 0.7 ] | 1 } , { [ 0.1 , 0.1761 ] | 1 } , { [ 0.1261 , 0.2263 ] | 0.2 , [ 0.1614 , 0.2616 ] | 0.8 } } ; n 3 = P I N H F W G ( n 31 , n 32 , n 33 ) = { { [ 0.4181 , 0.5206 ] | 0.3 , [ 0.5 , 0.6 ] | 0.7 } , { [ 0.1614 , 0.3004 ] | 0.4 , [ 0.2000 , 0.3368 ] | 0.6 } , { [ 0.2263 , 0.3265 ] | 1 } } ; n 4 = P I N H F W G ( n 41 , n 42 , n 43 ) = { { [ 0.4799 , 0.6411 ] | 1 } , { [ 0.0854 , 0.1861 ] | 1 } , { [ 0.1261 , 0.2000 ] | 0.2 , [ 0.2097 , 0.2416 ] | 0.8 } } .
  • Step 2’. By Definition 6, we have (a spot check is sketched after Step 3’):
    $s(n_1)=0.595$, $s(n_2)=0.7692$, $s(n_3)=0.6653$, $s(n_4)=0.7372$.
  • Step 3’. Rank A h ( h = 1 , 2 , 3 , 4 ) on the basis of Step 2’,
    A 2 > A 4 > A 3 > A 1 .
    Thus, A 2 is the best choice.
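As a spot check on Step 2’, the score of the aggregated PINHFN $n_3$ reported in Step 1’ can be recomputed with the score helper sketched after Definition 8; under that illustrative representation, the call below reproduces $s(n_3)\approx 0.6653$.

```python
# Aggregated n3 from Step 1' (PINHFWG) of Section 7, re-typed by hand:
n3_wg = ([([0.4181, 0.5206], 0.3), ([0.5, 0.6], 0.7)],
         [([0.1614, 0.3004], 0.4), ([0.2000, 0.3368], 0.6)],
         [([0.2263, 0.3265], 1.0)])
print(round(score(n3_wg), 4))  # expected: 0.6653
```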

8. The Basic Aggregation Operator for PSVNHFS

In this section, we construct the PSVNHFWA operator and the PSVNHFWG operator and give the corresponding comparison method for PSVNHFNs.
Definition 12.
Let $n_{\tilde p_j}$ ($j=1,2,\ldots,X$) be a non-empty collection of PSVNHFNs; then the PSVNHFWA operator is defined as:
$$\mathrm{PSVNHFWA}(n_{\tilde p_1},n_{\tilde p_2},\ldots,n_{\tilde p_X})=\bigoplus_{j=1}^{X}w_j(n_{\tilde p_j})=\bigcup\Bigl\{\bigl\{\bigl(1-\prod_{j=1}^{X}(1-\alpha_j)^{w_j}\bigr)\mid\prod_{j=1}^{X}P^{\tilde t_j}\bigr\},\ \bigl\{\prod_{j=1}^{X}\beta_j^{w_j}\mid\prod_{j=1}^{X}P^{\tilde i_j}\bigr\},\ \bigl\{\prod_{j=1}^{X}\gamma_j^{w_j}\mid\prod_{j=1}^{X}P^{\tilde f_j}\bigr\}\Bigr\},$$
where $\alpha_j\in\tilde t_j$, $\beta_j\in\tilde i_j$, $\gamma_j\in\tilde f_j$; for $j=1,2,\ldots,X$, $w_j$ is the weight of $n_{\tilde p_j}$ with $\sum_{j=1}^{X}w_j=1$.
Definition 13.
Let $n_{\tilde p_j}$ ($j=1,2,\ldots,X$) be a non-empty collection of PSVNHFNs; then the PSVNHFWG operator is defined as:
$$\mathrm{PSVNHFWG}(n_{\tilde p_1},n_{\tilde p_2},\ldots,n_{\tilde p_X})=\bigotimes_{j=1}^{X}(n_{\tilde p_j})^{w_j}=\bigcup\Bigl\{\bigl\{\prod_{j=1}^{X}(\alpha_j)^{w_j}\mid\prod_{j=1}^{X}P^{\tilde t_j}\bigr\},\ \bigl\{\bigl(1-\prod_{j=1}^{X}(1-\beta_j)^{w_j}\bigr)\mid\prod_{j=1}^{X}P^{\tilde i_j}\bigr\},\ \bigl\{\bigl(1-\prod_{j=1}^{X}(1-\gamma_j)^{w_j}\bigr)\mid\prod_{j=1}^{X}P^{\tilde f_j}\bigr\}\Bigr\},$$
where $\alpha_j\in\tilde t_j$, $\beta_j\in\tilde i_j$, $\gamma_j\in\tilde f_j$; for $j=1,2,\ldots,X$, $w_j$ is the weight of $n_{\tilde p_j}$ with $\sum_{j=1}^{X}w_j=1$.
Since the PSVNHFN is a special case of the PINHFN, the score function $s(n_{\tilde p})$, the deviation function $d(n_{\tilde p})$ and the ranking method are given by Definition 6, Definition 7 and Definition 8, respectively. To solve a MADM problem in a probabilistic single-valued neutrosophic hesitant fuzzy circumstance, the algorithm of Section 6 can be used in the same way (see the embedding sketched below). The application again uses Example 1.
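The special-case relationship can also be exercised computationally: a PSVNHFN can be embedded into the earlier illustrative PINHFN representation by turning every crisp value into a degenerate interval, after which the interval sketches apply unchanged. The helper and the commented usage below are assumptions of ours, not part of the paper.

```python
def as_pinhfn(psvnhfn):
    """Embed a PSVNHFN (three lists of (value, probability) pairs) into a
    PINHFN by mapping every crisp value v to the degenerate interval [v, v]
    (the special case alpha^L = alpha^U of Definition 5)."""
    return tuple([([v, v], p) for v, p in comp] for comp in psvnhfn)

# With this embedding, a single-valued row of Example 1 could be run through
# the interval machinery, e.g. (hypothetical names):
# row = [as_pinhfn(n) for n in psvnhfn_row]
# collective = pinhfwa(row, [0.35, 0.25, 0.4])
# print(score(collective))
```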
  • Step 1. Select the PSVNHFWA operator to aggregate all PSVNHFNs of ( n p ˜ ) h k ( h = 1 , 2 , 3 , 4 ; k = 1 , 2 , 3 ) to obtain the PSVNHFN n p ˜ h ( h = 1 , 2 , 3 , 4 ) for the option A h ( h = 1 , 2 , 3 , 4 ) .
    n p ˜ 1 = { { 0.3212 | 0.01 , 0.3568 | 0.015 , 0.3966 | 0.025 , 0.3580 | 0.01 , 0.3917 | 0.015 , 0.4293 | 0.025 , 0.3565 | 0.09 , 0.3903 | 0.1350 , 0.4280 | 0.2250 , 0.3914 | 0.09 , 0.4234 | 0.1350 , 0.4590 | 0.2250 } , { 0.1189 | 0.06 , 0.1569 | 0.14 , 0.1316 | 0.24 , 0.1737 | 0.56 } , { 0.368 | 0.048 , 0.407 | 0.032 , 0.3955 | 0.072 , 0.4373 | 0.048 , 0.3959 | 0.192 , 0.4378 | 0.128 , 0.4254 | 0.288 , 0.4704 | 0.192 } } n p ˜ 2 = { { 0.6 | 0.006 , 0.6435 | 0.014 , 0.6383 | 0.054 , 0.6776 | 0.126 , 0.6278 | 0.024 , 0.6682 | 0.056 , 0.6634 | 0.216 , 0.7 | 0.504 } , { 0.1 | 0.24 , 0.132 | 0.16 , 0.1275 | 0.36 , 0.1682 | 0.24 } , { 0.1677 | 0.35 , 0.2213 | 0.15 , 0.1933 | 0.35 , 0.2551 | 0.15 } } ; n p ˜ 3 = { { 0.5271 | 0.3 , 0.5675 | 0.2 , 0.5627 | 0.3 , 0.6 | 0.2 } , { 0.2138 | 1 } , { 0.2797 | 0.2 , 0.3224 | 0.8 } } ; n p ˜ 4 = { { 0.5476 | 0.216 , 0.6045 | 0.024 , 0.579 | 0.144 , 0.632 | 0.016 , 0.6074 | 0.324 , 0.6569 | 0.036 , 0.6347 | 0.216 , 0.6807 | 0.024 } , { 0.132 | 1 } , { 0.1189 | 0.01 , 0.1569 | 0.08 , 0.1846 | 0.01 , 0.1516 | 0.09 , 0.2 | 0.72 , 0.2352 | 0.09 } } .
  • Step 2. By the score function in Definition 6, compute the score values of all $n_{\tilde p_h}$ ($h=1,2,3,4$):
    s ( n p ˜ 1 ) = 0.6108 , s ( n p ˜ 2 ) = 0.7839 , s ( n p ˜ 3 ) = 0.6776 , s ( n p ˜ 4 ) = 0.7579 .
  • Step 3. Rank the PSVNHFNs by Definition 8; we have:
    A 2 > A 4 > A 3 > A 1 .
    Thus, we know that A 2 is the best choice.
    Next, we will make use of the PSVNHFWG operator to solve Example 1.
  • Step 1’. Aggregate PSVNHFNs n p ˜ h k ( h = 1 , 2 , 3 , 4 ; k = 1 , 2 , 3 ) by taking advantage of the PSVNHFWG operator to get the n p ˜ h for A h .
    n p ˜ 1 = { { 0.2898 | 0.01 , 0.3409 | 0.09 , 0.3033 | 0.01 , 0.3568 | 0.09 , 0.3205 | 0.015 , 0.377 | 0.135 , 0.3355 | 0.015 , 0.3946 | 0.135 , 0.3466 | 0.025 , 0.4076 | 0.225 , 0.3627 | 0.025 , 0.4266 | 0.225 } , { 0.1261 | 0.06 , 0.1663 | 0.14 , 0.1548 | 0.24 , 0.1937 | 0.56 } , { 0.3881 | 0.048 , 0.4404 | 0.192 , 0.4113 | 0.072 , 0.4615 | 0.288 , 0.4203 | 0.032 , 0.4698 | 0.128 , 0.4422 | 0.048 , 0.4898 | 0.192 } } , n p ˜ 2 = { { 0.6 | 0.006 , 0.6382 | 0.014 , 0.6236 | 0.024 , 0.6632 | 0.056 , 0.6333 | 0.054 , 0.6735 | 0.126 , 0.6581 | 0.216 , 0.7 | 0.504 } , { 0.1 | 0.24 , 0.1414 | 0.16 , 0.1363 | 0.36 , 0.1761 | 0.24 } , { 0.1889 | 0.35 , 0.2263 | 0.15 , 0.226 | 0.35 , 0.2616 | 0.15 } } . n p ˜ 3 = { { 0.5233 | 0.3 , 0.5629 | 0.2 , 0.5578 | 0.3 , 0.6 | 0.2 } , { 0.2666 | 1 } , { 0.2942 | 0.2 , 0.3265 | 0.8 } } . n p ˜ 4 = { { 0.4799 | 0.216 , 0.5887 | 0.024 , 0.4988 | 0.144 , 0.6119 | 0.016 , 0.5029 | 0.324 , 0.6169 | 0.036 , 0.5226 | 0.216 , 0.6411 | 0.024 } , { 0.1414 | 1 } , { 0.1261 | 0.01 , 0.1663 | 0.08 , 0.2097 | 0.01 , 0.1614 | 0.09 , 0.2 | 0.72 , 0.2416 | 0.09 } } .
  • Step 2’. By the score function in Definition 6, we have:
    s ( n p ˜ 1 ) = 0.5507 , s ( n p ˜ 2 ) = 0.7741 , s ( n p ˜ 3 ) = 0.6568 , s ( n p ˜ 4 ) = 0.7248 .
  • Step 3’. Rank A h ( h = 1 , 2 , 3 , 4 ) by Definition 8,
    A 2 > A 4 > A 3 > A 1 .
    Thus, A 2 is the best choice.
In order to demonstrate the effectiveness of our approaches, a comparison with other methods was carried out; the results are shown in Table 9 and Table 10.
In [49], Ye introduced the single-valued neutrosophic hesitant fuzzy weighted averaging (SVNHFWA) and single-valued neutrosophic hesitant fuzzy weighted geometric (SVNHFWG) operators and applied them in the single-valued neutrosophic hesitant fuzzy circumstance. In [50], Liu proposed the generalized weighted aggregation (GWA) operator and established a MADM method under the interval neutrosophic hesitant fuzzy circumstance. However, probability is not considered in [49,50]. The ranking results are presented in Table 9 and Table 10. According to Table 9, $A_2$ is always the best choice and $A_1$ is always the worst option. According to Table 10, the best option is $A_4$ from the group's major point of view, whereas the best selection is $A_2$ from the individual major point of view; $A_1$ is always the worst choice. Apparently, the SVNHFS, INHFS and PSVNHFS are special cases of PINHFS; thus, PINHFS is wider in scope than the other methods.

9. Conclusions

In this paper, as a generalization of fuzzy set theory, the new concepts of PSVNHFS and PINHFS are proposed based on the neutrosophic hesitant fuzzy set and the INS. The score function and the deviation function are defined, and a comparison method is proposed. PSVNHFS is a special case of PINHFS, so PINHFS has a wider range of applications; therefore, this paper mainly discusses the interval situation. Some basic operational laws of PINHFNs are introduced and investigated. Next, the PINHFWA and PINHFWG operators are presented, and some of their properties are studied; PSVNHFSs also satisfy the properties mentioned above. The optimal alternative can be determined by utilizing the PINHFWA (PINHFWG) operator. Finally, a numerical example was given, which shows that the new approach is more flexible and suitable for practical issues; a further example explains that PINHFS is more general than PSVNHFS. In the future, other aggregation operators of PINHFNs can be researched, and more practical applications in other areas, such as medical diagnosis, can be addressed.

Author Contributions

All authors have contributed equally to this paper.

Funding

This research was funded by the National Natural Science Foundation of China (Grant Nos 61573240, 61473239).

Acknowledgments

Thanks to the reviewers for their valuable comments.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zadeh, L.A. Fuzzy sets. Inf. Control. 1965, 8, 338–353. [Google Scholar] [CrossRef] [Green Version]
  2. Atanassov, K.T. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96. [Google Scholar] [CrossRef]
  3. Torra, V. Hesitant fuzzy sets. Int. J. Intell. Syst. 2010, 25, 529–539. [Google Scholar] [CrossRef]
  4. Zhu, B.; Xu, Z.S.; Xia, M.M. Dual hesitant fuzzy sets. J. Appl. Math. 2012, 2012, 879629. [Google Scholar] [CrossRef]
  5. Atanassov, K.T.; Gargov, G. Interval-valued intuitionistic fuzzy sets. Fuzzy Sets Syst. 1989, 3, 343–349. [Google Scholar] [CrossRef]
  6. Atanassov, K.T. Operators over interval-valued intuitionistic fuzzy sets. Fuzzy Sets Syst. 1994, 64, 159–174. [Google Scholar] [CrossRef]
  7. Alcantud, J.C.R.; Giarlotta, A. Necessary and Possible Hesitant Fuzzy Sets: A Novel Model for Group Decision Making. Inf. Fusion 2018, 46, 63–76. [Google Scholar] [CrossRef]
  8. Chen, J.; Huang, X. Dual hesitant fuzzy probability. Symmetry 2017, 9, 52. [Google Scholar] [CrossRef]
  9. Beg, I.; Rashid, T. Group decision making using intuitionistic hesitant fuzzy sets. Int. J. Fuzzy Logic Intell. Syst. 2014, 14, 181–187. [Google Scholar] [CrossRef]
  10. Li, L.Q.; Jin, Q.; Hu, K.; Zhao, F.F. The axiomatic characterizations on L-fuzzy covering-based approximation operators. Int. J. Gener. Syst. 2017, 46, 332–353. [Google Scholar] [CrossRef]
  11. Zhang, X.H. Fuzzy anti-grouped filters and fuzzy normal filters in pseudo-BCI algebras. J. Intell. Fuzzy Syst. 2017, 33, 1767–1774. [Google Scholar] [CrossRef]
  12. Zhang, X.H.; Park, C.; Wu, S.P. Soft set theoretical approach to pseudo-BCI algebras. J. Intell. Fuzzy Syst. 2018, 34, 559–568. [Google Scholar] [CrossRef]
  13. Shao, S.T.; Zhang, X.H.; Bo, C.X.; Park, C. Multi-granulation rough filters and rough fuzzy filters in Pseudo-BCI algebras. J. Intell. Fuzzy Syst. 2018, 34, 4377–4386. [Google Scholar] [CrossRef]
  14. Skalna, I.; Rebiasz, B.; Gawel, B.; Basiura, B.; Duda, J.; Opila, J.; Pelech-Pilichowski, T. Advances in Fuzzy Decision Making; Springer: Heidelberg, Germany, 2015. [Google Scholar]
  15. Sevastjanov, P.; Dymova, L. Generalised operations on hesitant fuzzy values in the framework of Dempster- Shafer theory. Inf. Sci. 2015, 311, 39–58. [Google Scholar] [CrossRef]
  16. Yager, R.R.; Alajlan, N. Dempster-Shafer belief structures for decision making under uncertainty. Knowl.-Based Syst. 2015, 80, 58–66. [Google Scholar] [CrossRef]
  17. Dymova, L.; Sevastjanov, P. The operations on intuitionistic fuzzy values in the framework of Dempster- Shafer theory. Knowl.-Based Syst. 2012, 35, 132–143. [Google Scholar] [CrossRef]
  18. Yen, J. Generalizing the dempster-shafer theory to fuzzy-sets. IEEE Trans. Syst. Man Cybern. 1990, 20, 559–570. [Google Scholar] [CrossRef]
  19. Merigo, J.M. Fuzzy decision making with immediate probabilities. Comput. Ind. Eng. 2010, 58, 651–657. [Google Scholar] [CrossRef]
  20. Wei, G.W.; Merigó, J.M. Methods for strategic decision-making problems with immediate probabilities in intuitionistic fuzzy setting. Sci. Iran. 2012, 19, 1936–1946. [Google Scholar] [CrossRef]
  21. Jiang, F.J.; Ma, Q.G. Multi-attribute group decision making under probabilistic hesitant fuzzy environment with application to evaluate the transformation efficiency. Appl. Intell. 2018, 48, 953–965. [Google Scholar] [CrossRef]
  22. Baudrit, C.; Dubois, D.; Guyonnet, D. Joint propagation and exploitation of probabilistic and possibilistic information in risk assessment. IEEE Trans. Fuzzy Syst. 2006, 14, 593–608. [Google Scholar] [CrossRef]
  23. Li, D.Y.; Liu, C.Y.; Gan, W.Y. A new cognitive model: Cloud model. Int. J. Intell. Syst. 2009, 24, 357–375. [Google Scholar] [CrossRef]
  24. Hao, Z.N.; Xu, Z.S.; Zhao, H.; Su, Z. Probabilistic dual hesitant fuzzy set and its application in risk evaluation. Knowl.-Based Syst. 2017, 127, 16–28. [Google Scholar] [CrossRef]
  25. Liu, D.; Liu, Y.; Chen, X. The new similarity measure and distance measure of a hesitant fuzzy linguistic term set based on a linguistic scale function. Symmetry 2018, 10, 367. [Google Scholar] [CrossRef]
  26. Zhu, J.; Li, Y. Hesitant fuzzy linguistic aggregation operators based on the hamacher t-norm and t-conorm. Symmetry 2018, 10, 189. [Google Scholar] [CrossRef]
  27. Cui, W.; Ye, J. Multiple-attribute decision-making method using similarity measures of hesitant linguistic neutrosophic numbers regarding least common multiple cardinality. Symmetry 2018, 10, 330. [Google Scholar] [CrossRef]
  28. Pang, Q.; Wang, H.; Xu, Z.S. Probabilistic linguistic term sets in multi-attribute group decision making. Inf. Sci. 2016, 369, 128–143. [Google Scholar] [CrossRef]
  29. Xu, Z.S.; Zhou, W. Consensus building with a group of decision makers under the hesitant probabilistic fuzzy environment. Fuzzy Optim. Decis. Mak. 2017, 16, 481–503. [Google Scholar] [CrossRef]
  30. Smarandache, F. A unifying field in logics: Neutrosophic logic. Multi. Valued Logic. 1999, 8, 489–503. [Google Scholar]
  31. Zhang, X.H.; Bo, C.X.; Smarandache, F.; Park, C. New operations of totally dependent-neutrosophic sets and totally dependent-neutrosophic soft sets. Symmetry 2018, 10, 187. [Google Scholar] [CrossRef]
  32. Zhang, X.H.; Smarandache, F.; Liang, X.L. Neutrosophic duplet semi-group and cancellable neutrosophic triplet groups. Symmetry 2017, 9, 275. [Google Scholar] [CrossRef]
  33. Song, S.Z.; Smarandache, F.; Jun, Y.B. Neutrosophic commutative N -ideals in BCK-algebras. Information 2017, 8, 130. [Google Scholar] [CrossRef]
  34. Shao, S.T.; Zhang, X.H.; Bo, C.X.; Smarandache, F. Neutrosophic hesitant fuzzy subalgebras and filters in pseudo-BCI algebras. Symmetry 2018, 10, 174. [Google Scholar] [CrossRef]
  35. Zhang, X.H.; Bo, C.X.; Smarandache, F.; Dai, J.H. New inclusion relation of neutrosophic sets with applications and related lattice structure. Int. J. Mach. Learn. Cybern. 2018, 9, 1753–1763. [Google Scholar] [CrossRef]
  36. Zhang, X.H.; Hu, Q.Q.; Smarandache, F. On neutrosophic triplet groups: Basic properties, NT-subgroups and some notes. Symmetry 2018, 10, 289. [Google Scholar] [CrossRef]
  37. Wang, H.; Madiraju, P. Interval-neutrosophic sets. J. Mech. 2004, 1, 274–277. [Google Scholar]
  38. Wang, H.; Smarandache, F.; Sunderraman, R. Single-valued neutrosophic sets. Rev. Air Force Acad. 2013, 17, 10–13. [Google Scholar]
  39. Ye, J. Multicriteria decision-making method using the correlation coefficient under single-valued neutrosophic environment. Int. J. Gen. Syst. 2013, 42, 386–394. [Google Scholar] [CrossRef]
  40. Ye, J. Another form of correlation coefficient between single valued neutrosophic sets and its multiple attribute decision-making method. Neutrosophic Sets Syst. 2013, 1, 8–12. [Google Scholar]
  41. Ye, J. Single valued neutrosophic cross-entropy for multicriteria decision making problems. Appl. Math. Model. 2014, 38, 1170–1175. [Google Scholar] [CrossRef]
  42. Ye, J. A multicriteria decision-making method using aggregation operators for simplified neutrosophic sets. J. Intell. Fuzzy Syst. 2014, 26, 2459–2466. [Google Scholar]
  43. Xia, M.M.; Xu, Z.S. Hesitant fuzzy information aggregation in decision making. Int. J. Approx. Reason. 2011, 52, 395–407. [Google Scholar] [CrossRef] [Green Version]
  44. Xu, Z.S.; Xia, M.M. Distance and similarity measures for hesitant fuzzy sets. Inf. Sci. 2011, 181, 2128–2138. [Google Scholar] [CrossRef]
  45. Xu, Z.S.; Xia, M.M. On distance and correlation measures of hesitant fuzzy information. Int. J. Intell. Syst. 2011, 26, 410–425. [Google Scholar] [CrossRef]
  46. Xu, Z.S.; Xia, M.M. Hesitant fuzzy entropy and cross-entropy and their use in multiattribute decision-making. Int. J. Intell. Syst. 2012, 27, 799–822. [Google Scholar] [CrossRef]
  47. Xu, Z.S.; Xia, M.M.; Chen, N. Some hesitant fuzzy aggregation operators with their application in group decision making. Group Decis. Negot. 2013, 22, 259–279. [Google Scholar]
  48. Ye, J. Correlation coefficient of dual hesitant fuzzy sets and its application to multiple attribute decision making. Appl. Math. Model. 2014, 38, 659–666. [Google Scholar] [CrossRef]
  49. Ye, J. Multiple-attribute decision-making method under a single-valued neutrosophic hesitant fuzzy environment. J. Intell. Syst. 2014, 24, 23–36. [Google Scholar] [CrossRef]
  50. Liu, P.; Shi, L. The generalized hybrid weighted average operator based on interval neutrosophic hesitant set and its application to multiple attribute decision making. Neural Comput. Appl. 2015, 26, 457–471. [Google Scholar] [CrossRef]
  51. Sahin, R.; Liu, P. Correlation coefficient of single-valued neutrosophic hesitant fuzzy sets and its applications in decision making. Neural Comput. Appl. 2017, 28, 1387–1395. [Google Scholar] [CrossRef]
  52. Li, X.; Zhang, X.H. Single-valued neutrosophic hesitant fuzzy choquet aggregation operators for multi-attribute decision making. Symmetry 2018, 10, 50. [Google Scholar] [CrossRef]
  53. Ye, J. Multiple-attribute decision-making method using similarity measures of single-valued neutrosophic hesitant fuzzy sets based on least common multiple cardinality. J. Intell. Fuzzy Syst. 2018, 34, 4203–4211. [Google Scholar] [CrossRef]
  54. Liu, P.; Zhang, L. An extended multiple criteria decision making method based on neutrosophic hesitant fuzzy information. J. Intell. Fuzzy Syst. 2017, 32, 4403–4413. [Google Scholar] [CrossRef]
  55. Liu, P.; Zhang, X.H. Some maclaurin symmetric mean operators for single-valued trapezoidal neutrosophic numbers and their applications to group decision making. Int. J. Fuzzy Syst. 2018, 20, 45–61. [Google Scholar] [CrossRef]
  56. Peng, H.G.; Zhang, H.Y.; Wang, J.Q. Probability multi-valued neutrosophic sets and its application in multi-criteria group decision-making problems. Neural Comput. Appl. 2017, 20, 563–583. [Google Scholar] [CrossRef]
Table 1. A probabilistic single-valued neutrosophic hesitant fuzzy decision matrix (PSVNHFDM) D 1 with respect to A 1 .
Table 1. A probabilistic single-valued neutrosophic hesitant fuzzy decision matrix (PSVNHFDM) D 1 with respect to A 1 .
AttributesInvestment Selection A 1
C 1 { { 0.3 | 0.2 , 0.4 | 0.3 , 0.5 | 0.5 } , { 0.1 | 1 } , { 0.3 | 0.6 , 0.4 | 0.4 } }  
  C 2 { { 0.5 | 0.5 , 0.6 | 0.5 } , { 0.2 | 0.2 , 0.3 | 0.8 } , { 0.3 | 0.4 , 0.4 | 0.6 } }  
  C 3 { { 0.2 | 0.1 , 0.3 | 0.9 } , { 0.1 | 0.3 , 0.2 | 0.7 } , { 0.5 | 0.2 , 0.6 | 0.8 } }  
Table 2. PSVNHFDM D 2 with respect to A 2 .
Table 2. PSVNHFDM D 2 with respect to A 2 .
AttributesInvestment Selection A 2
  C 1 { { 0.6 | 0.1 , 0.7 | 0.9 } , { 0.1 | 0.4 , 0.2 | 0.6 } , { 0.2 | 0.5 , 0.3 | 0.5 } }  
  C 2 { { 0.6 | 0.2 , 0.7 | 0.8 } , { 0.1 | 1 } , { 0.3 | 1 } }  
  C 3 { { 0.6 | 0.3 , 0.7 | 0.7 } , { 0.1 | 0.6 , 0.2 | 0.4 } , { 0.1 | 0.7 , 0.2 | 0.3 } }  
Table 3. PSVNHFDM D 3 with respect to A 3 .
Table 3. PSVNHFDM D 3 with respect to A 3 .
AttributesInvestment Selection A 3
C 1 { { 0.5 | 0.5 , 0.6 | 0.5 } , { 0.4 | 1 } , { 0.2 | 0.2 , 0.3 | 0.8 } }
C 2 { { 0.6 | 1 } , { 0.3 | 1 } , { 0.4 | 1 } }
C 3 { { 0.5 | 0.6 , 0.6 | 0.4 } , { 0.1 | 1 } , { 0.3 | 1 } }
Table 4. PSVNHFDM D 4 with respect to A 4 .
Table 4. PSVNHFDM D 4 with respect to A 4 .
AttributesInvestment Selection A 4
C 1 { { 0.7 | 0.4 , 0.8 | 0.6 } , { 0.1 | 1 } , { 0.1 | 0.1 , 0.2 | 0.9 } }
C 2 { { 0.6 | 0.6 , 0.7 | 0.4 } , { 0.1 | 1 } , { 0.2 | 1 } }
C 3 { { 0.3 | 0.9 , 0.5 | 0.1 } , { 0.2 | 1 } , { 0.1 | 0.1 , 0.2 | 0.8 , 0.3 | 0.1 } }
Table 5. A probabilistic interval neutrosophic hesitant fuzzy decision matrix (PINHFDM) D 1 with respect to A 1 .
Table 5. A probabilistic interval neutrosophic hesitant fuzzy decision matrix (PINHFDM) D 1 with respect to A 1 .
AttributesInvestment Selection A 1
  C 1 { { [ 0.3 , 0.4 ] | 0.1 , [ 0.4 , 0.4 ] | 0.1 , [ 0.4 , 0.5 ] | 0.8 } , { [ 0.1 , 0.2 ] | 1 } , { 0.3 , 0.4 ] | 1 } }  
  C 2 { { [ 0.4 , 0.5 ] | 0.5 , [ 0.5 , 0.6 ] ] | 0.5 } , { [ 0.2 , 0.3 ] | 1 } , { [ 0.3 , 0.3 ] | 0.7 , [ 0.3 , 0.4 ] | 0.3 } }  
  C 3 { { [ 0.2 , 0.3 ] | 1 } , { [ 0.1 , 0.2 ] | 1 } , { [ 0.4 , 0.5 ] | 0.7 , [ 0.5 , 0.6 ] | 0.3 } }  
Table 6. PINHFDM D 2 with respect to A 2 .
Table 6. PINHFDM D 2 with respect to A 2 .
AttributesInvestment Selection A 2
C 1 { { [ 0.6 , 0.7 ] | 1 } , { [ 0.1 , 0.2 ] | 1 } , { [ 0.1 , 0.2 ] | 0.2 , [ 0.2 , 0.3 ] | 0.8 } }
C 2 { { [ 0.6 , 0.7 ] | 1 } , { [ 0.1 , 0.1 ] | 1 } , { [ 0.2 , 0.3 ] | 1 } }
C 3 { { [ 0.6 , 0.7 ] | 1 } , { [ 0.1 , 0.2 ] | 1 } , { [ 0.1 , 0.2 ] | 1 } }
Table 7. PINHFDM D 3 with respect to A 3 .
Table 7. PINHFDM D 3 with respect to A 3 .
AttributesInvestment Selection A 3
C 1 { { [ 0.3 , 0.4 ] | 0.3 , [ 0.5 , 0.6 ] | 0.7 } , { [ 0.2 , 0.4 ] | 1 } , { [ 0.2 , 0.3 ] | 1 } }
C 2 { { [ 0.5 , 0.6 ] | 1 } , { [ 0.2 , 0.3 ] | 1 } , { [ 0.3 , 0.4 ] | 1 } }
C 3 { { [ 0.5 , 0.6 ] | 1 } , { [ 0.1 , 0.2 ] | 0.4 , [ 0.2 , 0.3 ] | 0.6 } , { [ 0.2 , 0.3 ] | 1 } }
Table 8. PINHFDM D 4 with respect to A 4 .
Table 8. PINHFDM D 4 with respect to A 4 .
AttributesInvestment Selection A 4
C 1 { { [ 0.7 , 0.8 ] | 1 } , { [ 0 , 0.1 ] | 1 } , { [ 0.1 , 0.2 ] | 1 } }
C 2 { { [ 0.6 , 0.7 ] | 1 } , { [ 0 , 0.1 ] | 1 } , { [ 0.2 , 0.2 ] | 1 } }
C 3 { { [ 0.3 , 0.5 ] | 1 } , { [ 0.2 , 0.3 ] | 1 } , { [ 0.1 , 0.2 ] | 0.2 , [ 0.3 , 0.3 ] | 0.8 } }
Table 9. Comparison of the results obtained by different methods under the single-valued neutrosophic hesitant fuzzy circumstance.
Table 9. Comparison of the results obtained by different methods under the single-valued neutrosophic hesitant fuzzy circumstance.
MethodSort of ResultsBest AlternativeWorst Alternative
   S V N H F W A o p e r a t o r [49] A 4 > A 2 > A 3 > A 1 A 3 A 4   
   S V N H F W G o p e r a t o r [49] A 2 > A 4 > A 3 > A 1 A 2 A 1   
   P S V N H F W A o p e r a t o r A 2 > A 4 > A 3 > A 1 A 2 A 1   
   P S V N H F W G o p e r a t o r A 2 > A 4 > A 3 > A 1 A 2 A 1   
Table 10. Comparison of the results obtained by different methods under the interval neutrosophic hesitant fuzzy circumstance.
Table 10. Comparison of the results obtained by different methods under the interval neutrosophic hesitant fuzzy circumstance.
MethodSort of ResultsBest AlternativeWorst Alternative
   G W A o p e r a t o r ( 1 λ 39 ) [50] A 3 > A 1 > A 2 > A 4 A 3 A 4   
   P I N H F W A o p e r a t o r A 4 > A 2 > A 3 > A 1 A 4 A 1   
   P I N H F W G o p e r a t o r A 2 > A 4 > A 3 > A 1 A 2 A 1   

Share and Cite

MDPI and ACS Style

Shao, S.; Zhang, X.; Li, Y.; Bo, C. Probabilistic Single-Valued (Interval) Neutrosophic Hesitant Fuzzy Set and Its Application in Multi-Attribute Decision Making. Symmetry 2018, 10, 419. https://doi.org/10.3390/sym10090419

AMA Style

Shao S, Zhang X, Li Y, Bo C. Probabilistic Single-Valued (Interval) Neutrosophic Hesitant Fuzzy Set and Its Application in Multi-Attribute Decision Making. Symmetry. 2018; 10(9):419. https://doi.org/10.3390/sym10090419

Chicago/Turabian Style

Shao, Songtao, Xiaohong Zhang, Yu Li, and Chunxin Bo. 2018. "Probabilistic Single-Valued (Interval) Neutrosophic Hesitant Fuzzy Set and Its Application in Multi-Attribute Decision Making" Symmetry 10, no. 9: 419. https://doi.org/10.3390/sym10090419

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers. See further details here.

Article Metrics

Back to TopTop