Article

On Covering-Based Rough Intuitionistic Fuzzy Sets

by R. Mareay, Ibrahim Noaman, Radwan Abu-Gdairi and M. Badr
1 Department of Mathematics, Faculty of Science, Kafrelsheikh University, Kafr El-Sheikh 33516, Egypt
2 Department of Mathematics, Faculty of Science and Arts in Al-Mandaq, Al Baha University, Al Bahah P.O. Box 1988, Saudi Arabia
3 Department of Mathematics, Faculty of Science, Zarqa University, Zarqa 13133, Jordan
4 Department of Mathematics, Faculty of Science, New Valley University, El-Kharja 72511, Egypt
* Author to whom correspondence should be addressed.
Mathematics 2022, 10(21), 4079; https://doi.org/10.3390/math10214079
Submission received: 15 September 2022 / Revised: 25 October 2022 / Accepted: 26 October 2022 / Published: 2 November 2022

Abstract

Intuitionistic fuzzy sets (IFSs) and covering-based rough sets are important tools for dealing with uncertain and inexact problems. We consider the neighborhood of an element to be more realistic than any other cluster in classification and approximation processes. Therefore, we introduce intuitionistic fuzzy sets on the space of covering-based rough sets by using the concept of the neighborhood. Three models of covering-based intuitionistic fuzzy approximation spaces are defined by means of neighborhoods. In the first and second models, we approximate an IFS by a rough set based on a single covering (C), defining the membership and non-membership degrees through the neighborhood. In the third model, we approximate an IFS by a rough set based on a family of coverings (C_i), again defining the membership and non-membership degrees through the neighborhood. We employ the notion of the neighborhood to establish the definitions and properties of these models. Finally, we give an illustrative example of the new covering rough IF approximation structure.

1. Introduction

The equivalence relation is the main tool for classification algorithms in Pawlak's rough set theory [1]. Pawlak's model is suitable for discrete data but not for continuous data, for which the cost of computation becomes too high. Therefore, many extensions of Pawlak's model have been studied, such as probabilistic rough sets [2], fuzzy rough set models [3], tolerance relations [4], similarity rough sets [5], decision-theoretic rough sets [6] and rough sets based on covering [7]. One of the most important extensions of rough set models is the family of neighborhood rough set systems [8]. These systems can deal with both discrete and continuous data by using the θ-neighborhood relation. Neighborhood rough set models have many applications in feature selection [9,10,11,12].
Zhan-Ao et al. [13] combined covering-based rough sets and IFSs using the concept of minimal description. In our approach, we instead use the neighborhood concept in the approximation of an IFS defined on covering-based rough sets. Many scholars, such as Goguen [14], Gau et al. [15] and Nada [16], extended fuzzy set concepts in various directions after Zadeh [17] introduced them in 1965. Since then, fuzzy set concepts have been used in topology, analysis and other branches of mathematics. The definition of IFSs, presented by Atanassov [18], is one of the most important generalizations of fuzzy sets, and it has attracted attention because of the importance of its applications. IFS theories, such as those in [19,20], are convenient and are applied to decision making in various fields, including programming, health [11,21,22] and decision-making problems [23,24,25]. Dubois and Prade extended the fuzzy lower approximation (Lapp) and fuzzy upper approximation (Uapp) in Pawlak's approximation structure to obtain rough fuzzy sets [26]. IFSs are a generalization of classical fuzzy sets, and covering-based rough sets are a generalization of classical rough sets. The combination of covering-based rough sets and IFSs has gained the attention of many researchers, and many studies have been conducted using single-granulation methods [27,28,29,30,31,32]. Pawlak's rough sets based on single granulation have also been generalized to multi-granulation rough sets (MGRSs) [33]. In this paper, we introduce the combination of covering-based rough sets and IFSs by using the concept of the neighborhood of each element of the universe. We believe that the neighborhood of an element is more realistic than its minimal description, and hence better suited to classification processes and decision making. We treat this combination for both single granulation and multi-granulation, and we introduce the basic concepts related to IFSs, fuzzy sets, rough IFSs and the covapp structure.

2. Preliminaries

Definition 1
([17]). Assume that X_1 is a non-empty set. A fuzzy set H is characterized by a membership function μ_H from X_1 to I = [0, 1], and the fuzzy set H is written in full as H = {(x_1, μ_H(x_1)) : x_1 ∈ X_1}. In what follows, we assume that U, the universe of discourse, is finite.
Definition 2
([18]). An IFS on a universe X_1 is an object of the form H = {(x_1, μ_H(x_1), ν_H(x_1)) : x_1 ∈ X_1}, where μ_H(x_1) ∈ [0, 1] is called the "degree of membership of x_1 in H" and ν_H(x_1) ∈ [0, 1] is called the "degree of non-membership of x_1 in H", with μ_H(x_1) + ν_H(x_1) ≤ 1 for every x_1 ∈ X_1. The quantity π_H(x_1) = 1 − (μ_H(x_1) + ν_H(x_1)) is called the hesitation part. The next example illustrates a collection of plant families together with the presence or absence of yellow cards: H = {(Gourds, 0.6, 0.2), (Citrus, 0.1, 0.6), (Legumes, 0.7, 0.25), (Gramineous, 0.9, 0.1)}.
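To make the notion concrete, the following short Python sketch (an editorial illustration rather than part of the formal development; the dictionary representation and the names make_ifs and hesitation are ours) stores an IFS as a map from elements to (membership, non-membership) pairs, checks the constraint μ + ν ≤ 1, and computes the hesitation part for the plant example above.

def make_ifs(pairs):
    """pairs: dict mapping element -> (mu, nu); validates 0 <= mu, nu and mu + nu <= 1."""
    for x, (mu, nu) in pairs.items():
        if not (0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0 and mu + nu <= 1.0 + 1e-9):
            raise ValueError(f"invalid IFS degrees for {x}: ({mu}, {nu})")
    return pairs

def hesitation(ifs, x):
    # pi_H(x) = 1 - (mu_H(x) + nu_H(x))
    mu, nu = ifs[x]
    return 1.0 - (mu + nu)

H = make_ifs({"Gourds": (0.6, 0.2), "Citrus": (0.1, 0.6),
              "Legumes": (0.7, 0.25), "Gramineous": (0.9, 0.1)})
print(hesitation(H, "Legumes"))  # prints 0.05 (up to floating-point rounding)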
Definition 3
([18]). If H, K are two IFSs on X_1, then H ⊆ K iff, ∀ x_1 ∈ X_1, μ_H(x_1) ≤ μ_K(x_1) and ν_H(x_1) ≥ ν_K(x_1).
Definition 4
([18]). An IFS H is equal to an IFS K iff, ∀ x_1 ∈ X_1, μ_H(x_1) = μ_K(x_1) and ν_H(x_1) = ν_K(x_1).
Definition 5
([23]). If H, K are two IFSs on X_1, then H ∩ K = {(x_1, min(μ_H(x_1), μ_K(x_1)), max(ν_H(x_1), ν_K(x_1))) : x_1 ∈ X_1}.
Definition 6
([23]). If H, K are two IFSs on X_1, then H ∪ K = {(x_1, max(μ_H(x_1), μ_K(x_1)), min(ν_H(x_1), ν_K(x_1))) : x_1 ∈ X_1}.
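Under the same assumed dictionary representation, Definitions 3, 5 and 6 translate directly into pointwise comparisons and min/max combinations; the sketch below (function names are ours) is one way to write them. Both arguments are assumed to be defined on the same universe X_1.

def ifs_subset(H, K):
    # H is contained in K iff mu_H <= mu_K and nu_H >= nu_K at every element (Definition 3)
    return all(H[x][0] <= K[x][0] and H[x][1] >= K[x][1] for x in H)

def ifs_intersection(H, K):
    # pointwise (min of memberships, max of non-memberships) (Definition 5)
    return {x: (min(H[x][0], K[x][0]), max(H[x][1], K[x][1])) for x in H}

def ifs_union(H, K):
    # pointwise (max of memberships, min of non-memberships) (Definition 6)
    return {x: (max(H[x][0], K[x][0]), min(H[x][1], K[x][1])) for x in H}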
Definition 7
([34,35]). Let U be a universe and C be a family of subsets of U such that no subset in C is empty and ∪C = U. Then, C is called a covering of U. Clearly, a partition of U is also a covering of U; hence, the notion of a covering extends the notion of a partition.
Definition 8
([34,35]). If U is a non-empty set and C is a covering of U, then the ordered pair (U, C) is called a covering approximation (covapp) structure.
Definition 9
([34,35]). If (U, C) is a covapp structure and x_1 ∈ U, then the set family Md(x_1) = {L ∈ C : x_1 ∈ L ∧ (∀ S ∈ C, (x_1 ∈ S ∧ S ⊆ L) ⟹ L = S)} is called the minimal description of x_1.
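The minimal description can be computed by a simple filter over the covering: keep the blocks containing x_1 that have no proper sub-block in C also containing x_1. The Python sketch below (an editorial illustration with an assumed list-of-frozensets representation) shows this.

def minimal_description(C, x):
    # Md(x): blocks of C containing x with no proper sub-block in C that also contains x
    containing = [L for L in C if x in L]
    return [L for L in containing if not any(S < L for S in containing)]  # S < L means S is a proper subset of L

C = [frozenset({1, 2}), frozenset({1, 2, 3}), frozenset({3, 4})]
print(minimal_description(C, 1))  # [frozenset({1, 2})]
print(minimal_description(C, 3))  # [frozenset({1, 2, 3}), frozenset({3, 4})]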
Definition 10
([36]). If U is a non-empty set, R is an equivalence relation on U and A is an IFS in U with membership function μ_A and non-membership function ν_A, then the Lapp R_1(A) and the Uapp R_2(A) of the IFS A are IFSs of the quotient set U/R with:
(i) Membership functions defined as
μ_{R_1(A)}(x) = inf{μ_A(y) : y ∈ [x]_R},     μ_{R_2(A)}(x) = sup{μ_A(y) : y ∈ [x]_R}
(ii) Non-membership functions defined as
ν_{R_1(A)}(x) = sup{ν_A(y) : y ∈ [x]_R},     ν_{R_2(A)}(x) = inf{ν_A(y) : y ∈ [x]_R}
The rough IFS of A is the pair R(A) = (R_1(A), R_2(A)).
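A computational reading of Definition 10, assuming the equivalence relation R is given by its partition U/R (Example 1 below instead starts from an IF relation, so this sketch covers only the crisp-equivalence case), takes the infimum of memberships and supremum of non-memberships over each class for the Lapp and the reverse for the Uapp.

def rough_ifs(partition, A):
    # partition: list of equivalence classes; A: dict element -> (mu, nu)
    lower, upper = {}, {}
    for block in partition:
        mus = [A[y][0] for y in block]
        nus = [A[y][1] for y in block]
        for x in block:
            lower[x] = (min(mus), max(nus))   # (inf of mu, sup of nu) over the class of x
            upper[x] = (max(mus), min(nus))   # (sup of mu, inf of nu) over the class of x
    return lower, upper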
Example 1.
Let A be an IFS, A = {(y_1, 0.8, 0.1), (y_2, 0.7, 0.2), (y_3, 0.6, 0.1), (y_4, 0.9, 0.1), (y_5, 0.8, 0.2)}, and let the IF relation R be as given in Table 1.
R_1(A) = {(y_1, 0.3, 0.2), (y_2, 0.1, 0.4), (y_3, 0.0, 0.6), (y_4, 0.3, 0.5), (y_5, 0.0, 0.5)}
R_2(A) = {(y_1, 0.9, 0.0), (y_2, 0.8, 0.0), (y_3, 0.7, 0.1), (y_4, 0.7, 0.1), (y_5, 0.9, 0.0)}
Definition 11
([37]). If (U, C) is a covapp structure and x_1 ∈ U, then
∪{K ∈ C : x_1 ∈ K} is called the indiscernible neighborhood of x_1 and is denoted the friends of x_1.
∩{K ∈ C : x_1 ∈ K} is called the neighborhood of x_1 and is denoted N_C(x_1). We omit the subscript C when there is no confusion.
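Computationally, the neighborhood is the intersection of the covering blocks containing the point and the friends are their union, as in the sketch below (an editorial illustration; the set-of-strings representation is an assumption, and the covering of Example 2 is used as test data).

def neighborhood(C, x):
    # intersection of all blocks of the covering C that contain x (at least one exists, since C covers U)
    return set.intersection(*[set(K) for K in C if x in K])

def friends(C, x):
    # union of all blocks of the covering C that contain x
    return set.union(*[set(K) for K in C if x in K])

C = [{"ua", "ub", "ud"}, {"uc", "ud", "ue", "ug"}, {"uf", "ug", "uh"}]
print(neighborhood(C, "ud"))  # {'ud'}
print(friends(C, "ud"))       # {'ua', 'ub', 'uc', 'ud', 'ue', 'ug'}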
The remaining sections of this article are arranged as follows: in Section 3, we define the first type of covering-based rough IFS and its related properties; in Section 4, we introduce the second type of covering rough IFS based on the neighborhood; and in Section 5, we present the third type, multi-granulation covering rough IFSs.

3. Covering Rough IFSs of the First Type

Definition 12.
If (U, C) is a covapp structure and X_1 ∈ IFS(U), then C_*(X_1) = {(x_1, α, β) : x_1 ∈ U} is called the covering rough intuitionistic Lapp of X_1, where
α = ∧{μ_{X_1}(y) : y ∈ N(x_1)},     β = ∨{ν_{X_1}(y) : y ∈ N(x_1)},
and C^*(X_1) = {(x_1, Ψ, ω) : x_1 ∈ U} is called the covering rough intuitionistic Uapp of X_1, where
Ψ = ∨{μ_{X_1}(y) : y ∈ N(x_1)},     ω = ∧{ν_{X_1}(y) : y ∈ N(x_1)}.
We call this model type-I covering rough IFSs.
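The type-I approximations can be computed directly from the neighborhoods: the sketch below (an editorial illustration; type1_approximations is our own name) takes the minimum membership and maximum non-membership over N(x_1) for the Lapp, and the maximum membership and minimum non-membership for the Uapp, using the data of Example 2 as a check.

def type1_approximations(C, X):
    # C: covering as a list of sets; X: dict element -> (mu, nu)
    lower, upper = {}, {}
    for x in X:
        N = set.intersection(*[set(K) for K in C if x in K])  # neighborhood of x (Definition 11)
        mus = [X[y][0] for y in N]
        nus = [X[y][1] for y in N]
        lower[x] = (min(mus), max(nus))   # (alpha, beta)
        upper[x] = (max(mus), min(nus))   # (Psi, omega)
    return lower, upper

C = [{"ua", "ub", "ud"}, {"uc", "ud", "ue", "ug"}, {"uf", "ug", "uh"}]
X1 = {"ua": (0.7, 0.2), "ub": (0.3, 0.6), "uc": (0.0, 0.9), "ud": (0.9, 0.1),
      "ue": (0.1, 0.9), "uf": (0.0, 0.8), "ug": (0.2, 0.7), "uh": (0.9, 0.1)}
lower, upper = type1_approximations(C, X1)
print(lower["ua"], upper["ua"])  # (0.3, 0.6) (0.9, 0.1)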
Example 2.
Consider U = {u_a, u_b, u_c, u_d, u_e, u_f, u_g, u_h}, C = {long, average, short} = {{u_a, u_b, u_d}, {u_c, u_d, u_e, u_g}, {u_f, u_g, u_h}}, and X_1 = {(u_a, 0.7, 0.2), (u_b, 0.3, 0.6), (u_c, 0.0, 0.9), (u_d, 0.9, 0.1), (u_e, 0.1, 0.9), (u_f, 0.0, 0.8), (u_g, 0.2, 0.7), (u_h, 0.9, 0.1)}. We compute the Lapp and Uapp of X_1 based on the model presented in Definitions 11 and 12.
N(u_a) = N(u_b) = {u_a, u_b, u_d}, N(u_c) = N(u_e) = {u_c, u_d, u_e, u_g}, N(u_d) = {u_d}, N(u_f) = N(u_h) = {u_f, u_g, u_h}, N(u_g) = {u_g}. Then, C_*(X_1) = {(u_a, 0.3, 0.6), (u_b, 0.3, 0.6), (u_c, 0.0, 0.9), (u_d, 0.9, 0.1), (u_e, 0.0, 0.9), (u_f, 0.0, 0.8), (u_g, 0.2, 0.7), (u_h, 0.0, 0.8)}, and C^*(X_1) = {(u_a, 0.9, 0.1), (u_b, 0.9, 0.1), (u_c, 0.9, 0.1), (u_d, 0.9, 0.1), (u_e, 0.9, 0.1), (u_f, 0.9, 0.1), (u_g, 0.2, 0.7), (u_h, 0.9, 0.1)}.
Proposition 1.
If (U, C) is a covapp structure, then for all X_1, Y ∈ IFS(U) the following properties hold:
(1) C_*(U) = C^*(U) = U     (2) C_*(Φ) = C^*(Φ) = Φ
(3) C_*(X_1) ⊆ X_1 ⊆ C^*(X_1)     (4) If X_1 ⊆ Y, then C_*(X_1) ⊆ C_*(Y) and C^*(X_1) ⊆ C^*(Y)
Proof. 
From Definition 12, it is clear. □

4. Covering Rough IFSs of the Second Type

Definition 13.
Let U be a non-empty set and C be a covering of U. For X_1 ∈ IFS(U) and x_1 ∈ U, the covering rough IF membership and non-membership degrees of x_1 depending on the neighborhood, written μ̃_{X_1}(x_1) and ν̃_{X_1}(x_1), are defined as
μ̃_{X_1}(x_1) = (Σ_{y ∈ N(x_1)} μ_{X_1}(y)) / |N(x_1)|,     ν̃_{X_1}(x_1) = (Σ_{y ∈ N(x_1)} ν_{X_1}(y)) / |N(x_1)|.
The covering rough IF Lapp of X_1 is defined as
CN_*(X_1) = {(x_1, μ_{CN_*}(x_1), ν_{CN_*}(x_1)) : x_1 ∈ U},
and the covering rough IF Uapp of X_1 is defined as
CN^*(X_1) = {(x_1, μ_{CN^*}(x_1), ν_{CN^*}(x_1)) : x_1 ∈ U},
where μ_{CN_*}(x_1) = min(μ_{X_1}(x_1), μ̃_{X_1}(x_1)), ν_{CN_*}(x_1) = max(ν_{X_1}(x_1), ν̃_{X_1}(x_1)),
μ_{CN^*}(x_1) = max(μ_{X_1}(x_1), μ̃_{X_1}(x_1)) and ν_{CN^*}(x_1) = min(ν_{X_1}(x_1), ν̃_{X_1}(x_1)).
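In computational terms, the type-II model first averages the membership and non-membership degrees over the neighborhood and then combines the averaged degrees with the pointwise ones by min/max, as in the sketch below (an editorial illustration; mu_avg and nu_avg play the role of the averaged degrees μ̃ and ν̃, and the representation follows the earlier sketches).

def type2_approximations(C, X):
    # C: covering as a list of sets; X: dict element -> (mu, nu)
    lower, upper = {}, {}
    for x in X:
        N = set.intersection(*[set(K) for K in C if x in K])   # neighborhood of x
        mu_avg = sum(X[y][0] for y in N) / len(N)               # averaged membership over N(x)
        nu_avg = sum(X[y][1] for y in N) / len(N)               # averaged non-membership over N(x)
        mu, nu = X[x]
        lower[x] = (min(mu, mu_avg), max(nu, nu_avg))
        upper[x] = (max(mu, mu_avg), min(nu, nu_avg))
    return lower, upper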
The following example illustrates this definition.
Example 3.
Let (U, C) be a covapp structure, U = {u_a, u_b, u_c, u_d, u_e, u_f, u_g, u_h}, C = {{u_a, u_b}, {u_c, u_d, u_e}, {u_d, u_e, u_g}, {u_f, u_g, u_h}}
and X_1 ∈ IFS(U), such that X_1 = {(u_a, 0.2, 0.3), (u_b, 0.7, 0.2), (u_c, 0.5, 0.2), (u_d, 0.3, 0.6), (u_e, 0.2, 0.7), (u_f, 0.0, 0.6), (u_g, 0.5, 0.4), (u_h, 0.1, 0.5)}.
Accordingly, we compute the Lapp and Uapp of X_1 based on the model presented in Definitions 11 and 13.
N(u_a) = N(u_b) = {u_a, u_b}, N(u_c) = {u_c, u_d, u_e}, N(u_d) = N(u_e) = {u_d, u_e}, N(u_f) = N(u_h) = {u_f, u_g, u_h}, N(u_g) = {u_g}. Then, μ̃_{X_1}(u_a) = μ̃_{X_1}(u_b) = 0.45, μ̃_{X_1}(u_c) = 0.33, μ̃_{X_1}(u_d) = μ̃_{X_1}(u_e) = 0.25, μ̃_{X_1}(u_f) = μ̃_{X_1}(u_h) = 0.2, μ̃_{X_1}(u_g) = 0.5, and ν̃_{X_1}(u_a) = ν̃_{X_1}(u_b) = 0.25, ν̃_{X_1}(u_c) = 0.5, ν̃_{X_1}(u_d) = ν̃_{X_1}(u_e) = 0.65, ν̃_{X_1}(u_f) = ν̃_{X_1}(u_h) = 0.5, ν̃_{X_1}(u_g) = 0.4. From Definition 13, we have μ_{CN_*}(u_a) = min(0.2, 0.45) = 0.2, μ_{CN_*}(u_b) = min(0.7, 0.45) = 0.45, μ_{CN_*}(u_c) = min(0.5, 0.33) = 0.33, μ_{CN_*}(u_d) = min(0.3, 0.25) = 0.25, μ_{CN_*}(u_e) = min(0.2, 0.25) = 0.2, μ_{CN_*}(u_f) = min(0.0, 0.2) = 0.0, μ_{CN_*}(u_g) = min(0.5, 0.5) = 0.5, μ_{CN_*}(u_h) = min(0.1, 0.2) = 0.1, and ν_{CN_*}(u_a) = max(0.3, 0.25) = 0.3, ν_{CN_*}(u_b) = max(0.2, 0.25) = 0.25, ν_{CN_*}(u_c) = max(0.2, 0.5) = 0.5, ν_{CN_*}(u_d) = max(0.6, 0.65) = 0.65, ν_{CN_*}(u_e) = max(0.7, 0.65) = 0.7, ν_{CN_*}(u_f) = max(0.6, 0.5) = 0.6, ν_{CN_*}(u_g) = max(0.4, 0.4) = 0.4, ν_{CN_*}(u_h) = max(0.5, 0.5) = 0.5. We obtain CN_*(X_1) = {(u_a, 0.2, 0.3), (u_b, 0.45, 0.25), (u_c, 0.33, 0.5), (u_d, 0.25, 0.65), (u_e, 0.2, 0.7), (u_f, 0.0, 0.6), (u_g, 0.5, 0.4), (u_h, 0.1, 0.5)}. Likewise, from Definition 13, we obtain CN^*(X_1) = {(u_a, 0.45, 0.25), (u_b, 0.7, 0.2), (u_c, 0.5, 0.2), (u_d, 0.3, 0.6), (u_e, 0.25, 0.65), (u_f, 0.2, 0.5), (u_g, 0.5, 0.4), (u_h, 0.2, 0.5)}.
Proposition 2.
Let (U, C) be a covapp structure. Then, for all X_1, Y ∈ IFS(U), the operators CN_* and CN^* satisfy the following properties:
(1) CN_*(U) = CN^*(U) = U     (2) CN_*(Φ) = CN^*(Φ) = Φ
(3) CN_*(X_1) ⊆ X_1 ⊆ CN^*(X_1)     (4) If X_1 ⊆ Y, then CN_*(X_1) ⊆ CN_*(Y) and CN^*(X_1) ⊆ CN^*(Y)
Proof. 
(1) If we take X_1 = U, then the membership degree is 1 and the non-membership degree is 0 for each x_1 ∈ U; that is, μ_U(x_1) = 1 and ν_U(x_1) = 0. From Definition 13, μ̃_U(x_1) = 1 and ν̃_U(x_1) = 0; therefore, μ_{CN_*}(x_1) = 1, ν_{CN_*}(x_1) = 0 and μ_{CN^*}(x_1) = 1, ν_{CN^*}(x_1) = 0. Hence, CN_*(U) = CN^*(U) = U.
(2) Similar to (1).
(3) From Definition 13, for every x_1 ∈ U and every IFS X_1, either μ_{X_1}(x_1) ≤ μ̃_{X_1}(x_1) or μ_{X_1}(x_1) ≥ μ̃_{X_1}(x_1), and either ν_{X_1}(x_1) ≤ ν̃_{X_1}(x_1) or ν_{X_1}(x_1) ≥ ν̃_{X_1}(x_1). Then, there are four cases:
(i) μ_{X_1}(x_1) ≤ μ̃_{X_1}(x_1), ν_{X_1}(x_1) ≤ ν̃_{X_1}(x_1)     (ii) μ_{X_1}(x_1) ≥ μ̃_{X_1}(x_1), ν_{X_1}(x_1) ≥ ν̃_{X_1}(x_1)
(iii) μ_{X_1}(x_1) ≤ μ̃_{X_1}(x_1), ν_{X_1}(x_1) ≥ ν̃_{X_1}(x_1)     (iv) μ_{X_1}(x_1) ≥ μ̃_{X_1}(x_1), ν_{X_1}(x_1) ≤ ν̃_{X_1}(x_1)
(i) If μ_{X_1}(x_1) ≤ μ̃_{X_1}(x_1) and ν_{X_1}(x_1) ≤ ν̃_{X_1}(x_1), then μ_{CN_*}(x_1) = μ_{X_1}(x_1), μ_{CN^*}(x_1) = μ̃_{X_1}(x_1), ν_{CN_*}(x_1) = ν̃_{X_1}(x_1) and ν_{CN^*}(x_1) = ν_{X_1}(x_1). Therefore, μ_{CN_*}(x_1) ≤ μ_{X_1}(x_1) ≤ μ_{CN^*}(x_1) and ν_{CN_*}(x_1) ≥ ν_{X_1}(x_1) ≥ ν_{CN^*}(x_1).
(ii) If μ_{X_1}(x_1) ≥ μ̃_{X_1}(x_1) and ν_{X_1}(x_1) ≥ ν̃_{X_1}(x_1), then μ_{CN_*}(x_1) = μ̃_{X_1}(x_1), μ_{CN^*}(x_1) = μ_{X_1}(x_1), ν_{CN_*}(x_1) = ν_{X_1}(x_1) and ν_{CN^*}(x_1) = ν̃_{X_1}(x_1). Therefore, μ_{CN_*}(x_1) ≤ μ_{X_1}(x_1) ≤ μ_{CN^*}(x_1) and ν_{CN_*}(x_1) ≥ ν_{X_1}(x_1) ≥ ν_{CN^*}(x_1).
The proofs of (iii) and (iv) are similar; hence, CN_*(X_1) ⊆ X_1 ⊆ CN^*(X_1).
(4) If X_1 ⊆ Y, then for every x_1 ∈ U we have μ_{X_1}(x_1) ≤ μ_Y(x_1), ν_{X_1}(x_1) ≥ ν_Y(x_1), μ̃_{X_1}(x_1) ≤ μ̃_Y(x_1) and ν̃_{X_1}(x_1) ≥ ν̃_Y(x_1). Therefore, there are four possible relationships between the membership and non-membership degrees of the Lapp and Uapp of X_1:
(i) μ_{CN_*(X_1)}(x_1) = μ_{X_1}(x_1), μ_{CN^*(X_1)}(x_1) = μ̃_{X_1}(x_1) and ν_{CN_*(X_1)}(x_1) = ν̃_{X_1}(x_1), ν_{CN^*(X_1)}(x_1) = ν_{X_1}(x_1);
(ii) μ_{CN_*(X_1)}(x_1) = μ_{X_1}(x_1), μ_{CN^*(X_1)}(x_1) = μ̃_{X_1}(x_1) and ν_{CN_*(X_1)}(x_1) = ν_{X_1}(x_1), ν_{CN^*(X_1)}(x_1) = ν̃_{X_1}(x_1);
(iii) μ_{CN_*(X_1)}(x_1) = μ̃_{X_1}(x_1), μ_{CN^*(X_1)}(x_1) = μ_{X_1}(x_1) and ν_{CN_*(X_1)}(x_1) = ν̃_{X_1}(x_1), ν_{CN^*(X_1)}(x_1) = ν_{X_1}(x_1);
(iv) μ_{CN_*(X_1)}(x_1) = μ̃_{X_1}(x_1), μ_{CN^*(X_1)}(x_1) = μ_{X_1}(x_1) and ν_{CN_*(X_1)}(x_1) = ν_{X_1}(x_1), ν_{CN^*(X_1)}(x_1) = ν̃_{X_1}(x_1).
For the proof of (i), there are four cases for Y:
Case 1: μ_{CN_*(Y)}(x_1) = μ_Y(x_1), μ_{CN^*(Y)}(x_1) = μ̃_Y(x_1) and ν_{CN_*(Y)}(x_1) = ν̃_Y(x_1), ν_{CN^*(Y)}(x_1) = ν_Y(x_1). Then, μ_{CN_*(X_1)}(x_1) ≤ μ_{CN_*(Y)}(x_1), μ_{CN^*(X_1)}(x_1) ≤ μ_{CN^*(Y)}(x_1), and ν_{CN_*(X_1)}(x_1) ≥ ν_{CN_*(Y)}(x_1), ν_{CN^*(X_1)}(x_1) ≥ ν_{CN^*(Y)}(x_1).
Case 2: μ_{CN_*(Y)}(x_1) = μ̃_Y(x_1), μ_{CN^*(Y)}(x_1) = μ_Y(x_1) and ν_{CN_*(Y)}(x_1) = ν̃_Y(x_1), ν_{CN^*(Y)}(x_1) = ν_Y(x_1). Since μ_{X_1}(x_1) ≤ μ̃_{X_1}(x_1) ≤ μ̃_Y(x_1) and μ̃_{X_1}(x_1) ≤ μ̃_Y(x_1) ≤ μ_Y(x_1), we have μ_{X_1}(x_1) ≤ μ̃_Y(x_1) and μ̃_{X_1}(x_1) ≤ μ_Y(x_1); hence, μ_{CN_*(X_1)}(x_1) ≤ μ_{CN_*(Y)}(x_1), μ_{CN^*(X_1)}(x_1) ≤ μ_{CN^*(Y)}(x_1), and ν_{CN_*(X_1)}(x_1) ≥ ν_{CN_*(Y)}(x_1), ν_{CN^*(X_1)}(x_1) ≥ ν_{CN^*(Y)}(x_1).
Case 3: μ_{CN_*(Y)}(x_1) = μ_Y(x_1), μ_{CN^*(Y)}(x_1) = μ̃_Y(x_1) and ν_{CN_*(Y)}(x_1) = ν_Y(x_1), ν_{CN^*(Y)}(x_1) = ν̃_Y(x_1). Since ν̃_{X_1}(x_1) ≥ ν_{X_1}(x_1) ≥ ν_Y(x_1) and ν_{X_1}(x_1) ≥ ν_Y(x_1) ≥ ν̃_Y(x_1), we have ν̃_{X_1}(x_1) ≥ ν_Y(x_1) and ν_{X_1}(x_1) ≥ ν̃_Y(x_1). Then, μ_{CN_*(X_1)}(x_1) ≤ μ_{CN_*(Y)}(x_1), μ_{CN^*(X_1)}(x_1) ≤ μ_{CN^*(Y)}(x_1), and ν_{CN_*(X_1)}(x_1) ≥ ν_{CN_*(Y)}(x_1), ν_{CN^*(X_1)}(x_1) ≥ ν_{CN^*(Y)}(x_1).
Case 4: μ_{CN_*(Y)}(x_1) = μ̃_Y(x_1), μ_{CN^*(Y)}(x_1) = μ_Y(x_1) and ν_{CN_*(Y)}(x_1) = ν_Y(x_1), ν_{CN^*(Y)}(x_1) = ν̃_Y(x_1). Combining the arguments of Cases 2 and 3, we obtain μ_{CN_*(X_1)}(x_1) ≤ μ_{CN_*(Y)}(x_1), μ_{CN^*(X_1)}(x_1) ≤ μ_{CN^*(Y)}(x_1), and ν_{CN_*(X_1)}(x_1) ≥ ν_{CN_*(Y)}(x_1), ν_{CN^*(X_1)}(x_1) ≥ ν_{CN^*(Y)}(x_1).
The proofs of (ii), (iii) and (iv) are similar. Then, CN_*(X_1) ⊆ CN_*(Y) and CN^*(X_1) ⊆ CN^*(Y). □

5. Multi-Granulation Covering Rough IFSs of the Third Type

In this section, we introduce a novel technique for multi-granulation covering rough IFSs (MGCRIFSs for short).
Definition 14.
Let (U, Ĉ) be a covapp structure, where Ĉ = {C_i : 1 ≤ i ≤ n} is a family of coverings of U, each C_i is a covering of U and |Ĉ| = n. For x_1 ∈ U, the neighborhood is N̂(x_1) = ∩{K ∈ ∪_{i=1}^{n} C_i : x_1 ∈ K}.
Definition 15.
Let (U, Ĉ) be a covapp structure, where Ĉ = {C_i : 1 ≤ i ≤ n} is a family of coverings of U and |Ĉ| = n, and let x_1 ∈ U. Then, the multi-granulation IF covering rough membership and non-membership degrees of x_1 depending on the neighborhood are defined, respectively, as
μ̂_{X_1}(x_1) = (Σ_{y ∈ N̂(x_1)} μ_{X_1}(y)) / |N̂(x_1)|,     ν̂_{X_1}(x_1) = (Σ_{y ∈ N̂(x_1)} ν_{X_1}(y)) / |N̂(x_1)|.
The Lapp is CN̂_*(X_1) = {(x_1, γ, δ) : x_1 ∈ U}, where γ = min(μ_{X_1}(x_1), μ̂_{X_1}(x_1)) and δ = max(ν_{X_1}(x_1), ν̂_{X_1}(x_1)),
and the Uapp is CN̂^*(X_1) = {(x_1, θ, λ) : x_1 ∈ U}, where θ = max(μ_{X_1}(x_1), μ̂_{X_1}(x_1)) and λ = min(ν_{X_1}(x_1), ν̂_{X_1}(x_1)).
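The multi-granulation model differs from the type-II model only in the neighborhood: the intersection now ranges over the blocks of every covering in the family. The sketch below (an editorial illustration; the list-of-coverings representation is an assumption, and the names mu_hat and nu_hat stand for the hatted degrees above) computes the type-III approximations.

def type3_approximations(coverings, X):
    # coverings: list of coverings, each a list of sets; X: dict element -> (mu, nu)
    lower, upper = {}, {}
    for x in X:
        blocks = [set(K) for C_i in coverings for K in C_i if x in K]
        N = set.intersection(*blocks)                 # multi-granulation neighborhood of x
        mu_hat = sum(X[y][0] for y in N) / len(N)     # averaged membership over the neighborhood
        nu_hat = sum(X[y][1] for y in N) / len(N)     # averaged non-membership over the neighborhood
        mu, nu = X[x]
        lower[x] = (min(mu, mu_hat), max(nu, nu_hat))   # (gamma, delta)
        upper[x] = (max(mu, mu_hat), min(nu, nu_hat))   # (theta, lambda)
    return lower, upper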
The following example illustrates the above definitions.
Example 4.
Let (U, Ĉ) be a covapp structure, U = {u_a, u_b, u_c, u_d, u_e, u_f, u_g, u_h} and Ĉ = {C_1, C_2, C_3}, where C_1 = {{u_a, u_c, u_d}, {u_a, u_g}, {u_b, u_e, u_f, u_h}}, C_2 = {{u_a, u_b, u_c, u_f}, {u_a, u_d}, {u_e, u_g, u_h}} and C_3 = {{u_a, u_c, u_e, u_f}, {u_b, u_d}, {u_e, u_g, u_h}}. The IFS is X = {(u_a, 0.2, 0.3), (u_b, 0.7, 0.2), (u_c, 0.5, 0.2), (u_d, 0.3, 0.6), (u_e, 0.2, 0.7), (u_f, 0.0, 0.6), (u_g, 0.5, 0.4), (u_h, 0.1, 0.5)}. We calculate the Lapp and Uapp of X using Definitions 14 and 15:
N ^ ( u a ) = { u a } , N ^ ( u b ) = { u b } , N ^ ( u c ) = { u a , u c } , N ^ ( u d ) = { u d } , N ^ ( u e ) = { u e } , N ^ ( u f ) = { u f } , N ^ ( u g ) = { u g } , N ^ ( u h ) = { u e , u h } .
We can calculate the membership degrees as follows
μ̂_X(u_a) = 0.2, μ̂_X(u_b) = 0.7, μ̂_X(u_c) = 0.35, μ̂_X(u_d) = 0.3, μ̂_X(u_e) = 0.2, μ̂_X(u_f) = 0.0, μ̂_X(u_g) = 0.5, μ̂_X(u_h) = 0.15.
Additionally, we calculate the non-membership degrees as follows:
ν̂_X(u_a) = 0.3, ν̂_X(u_b) = 0.2, ν̂_X(u_c) = 0.25, ν̂_X(u_d) = 0.6, ν̂_X(u_e) = 0.7, ν̂_X(u_f) = 0.6, ν̂_X(u_g) = 0.4, ν̂_X(u_h) = 0.6.
The L a p p is
CN̂_*(X) = {(u_a, 0.2, 0.3), (u_b, 0.7, 0.2), (u_c, 0.35, 0.25), (u_d, 0.3, 0.6), (u_e, 0.2, 0.7), (u_f, 0.0, 0.6), (u_g, 0.5, 0.4), (u_h, 0.1, 0.6)}.
Additionally, the U a p p is
CN̂^*(X) = {(u_a, 0.2, 0.3), (u_b, 0.7, 0.2), (u_c, 0.5, 0.2), (u_d, 0.3, 0.6), (u_e, 0.2, 0.7), (u_f, 0.0, 0.6), (u_g, 0.5, 0.4), (u_h, 0.15, 0.5)}.
Proposition 3.
Assume that (U, Ĉ) is a covapp structure, where Ĉ is a family of coverings of U. The following properties are satisfied:
(1) CN̂_*(U) = CN̂^*(U) = U     (2) CN̂_*(Φ) = CN̂^*(Φ) = Φ
(3) CN̂_*(X_1) ⊆ X_1 ⊆ CN̂^*(X_1)     (4) If X_1 ⊆ Y_1, then CN̂_*(X_1) ⊆ CN̂_*(Y_1) and CN̂^*(X_1) ⊆ CN̂^*(Y_1)
Proof. 
(1) Consider U as an IFS. Then, μ_U(x_1) = 1 and ν_U(x_1) = 0 for every x_1 ∈ U, and by Definition 15, μ̂_U(x_1) = 1 and ν̂_U(x_1) = 0. Then, γ = θ = 1 and δ = λ = 0. Therefore, CN̂_*(U) = CN̂^*(U) = U.
(2) Similar to (1)
(3) From Definition 15, either μ_{X_1}(x_1) ≤ μ̂_{X_1}(x_1) or μ_{X_1}(x_1) ≥ μ̂_{X_1}(x_1), and either ν_{X_1}(x_1) ≤ ν̂_{X_1}(x_1) or ν_{X_1}(x_1) ≥ ν̂_{X_1}(x_1). Therefore, there are four cases:
(i) μ_{X_1}(x_1) ≤ μ̂_{X_1}(x_1), ν_{X_1}(x_1) ≤ ν̂_{X_1}(x_1) (ii) μ_{X_1}(x_1) ≥ μ̂_{X_1}(x_1), ν_{X_1}(x_1) ≥ ν̂_{X_1}(x_1)
(iii) μ_{X_1}(x_1) ≤ μ̂_{X_1}(x_1), ν_{X_1}(x_1) ≥ ν̂_{X_1}(x_1) (iv) μ_{X_1}(x_1) ≥ μ̂_{X_1}(x_1), ν_{X_1}(x_1) ≤ ν̂_{X_1}(x_1)
We prove (i) and (ii); the proofs of (iii) and (iv) are similar.
(i) Let μ_{X_1}(x_1) ≤ μ̂_{X_1}(x_1) and ν_{X_1}(x_1) ≤ ν̂_{X_1}(x_1). Then, γ = μ_{X_1}(x_1), θ = μ̂_{X_1}(x_1), δ = ν̂_{X_1}(x_1) and λ = ν_{X_1}(x_1); hence, γ ≤ μ_{X_1}(x_1) ≤ θ and δ ≥ ν_{X_1}(x_1) ≥ λ.
(ii) Let μ_{X_1}(x_1) ≥ μ̂_{X_1}(x_1) and ν_{X_1}(x_1) ≥ ν̂_{X_1}(x_1). Then, γ = μ̂_{X_1}(x_1), θ = μ_{X_1}(x_1), δ = ν_{X_1}(x_1) and λ = ν̂_{X_1}(x_1); hence, γ ≤ μ_{X_1}(x_1) ≤ θ and δ ≥ ν_{X_1}(x_1) ≥ λ. From the four cases, we obtain CN̂_*(X_1) ⊆ X_1 ⊆ CN̂^*(X_1).
(4) The proof is similar to (3). □

6. Conclusions

Covering-based rough set theory was combined with IFSs for dealing with uncertainty and decision-making processes. We approximated IFSs on the covering rough set space via the neighborhood of each element of the universe. Three models of the approximation of IFSs were introduced by means of the neighborhood concept, and examples were used to verify and explain the properties of these approximation structures. We believe these models will be useful in the decision-making process. In future work, we will use the concept of the core of the neighborhood of an element to generate a new covering rough IF approximation structure, and we will apply IFSs in new applications (DNA mutation–repair mutation).

Author Contributions

Conceptualization, R.M., I.N., R.A.-G. and M.B.; data curation, R.M., I.N., R.A.-G. and M.B.; formal analysis, R.M., I.N., R.A.-G. and M.B.; software, R.A.-G.; supervision, I.N.; validation, R.M. and M.B.; visualization, R.M., I.N., R.A.-G. and M.B.; writing—review and editing, R.M., I.N., R.A.-G. and M.B.; investigation, R.M., I.N., R.A.-G. and M.B.; methodology, R.M., I.N., R.A.-G. and M.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Pawlak, Z. Rough sets. Int. J. Comput. Inf. Sci. 1982, 11, 341–356.
2. Yao, Y. Probabilistic rough set approximations. Int. J. Approx. Reason. 2005, 49, 255–271.
3. Hu, Q.; Yu, D.; Xie, Z.; Liu, J. Fuzzy probabilistic approximation spaces and their information measures. IEEE Trans. Fuzzy Syst. 2006, 14, 191–201.
4. Skowron, A.; Stepaniuk, J. Tolerance approximation spaces. Fundam. Inf. 1996, 27, 245–253.
5. Slowinski, R.; Vanderpooten, D. A generalized definition of rough approximations based on similarity. IEEE Trans. Knowl. Data Eng. 2000, 12, 331–336.
6. Yao, Y.; Zhao, Y. Attribute reduction in decision-theoretic rough set models. Inf. Sci. 2008, 178, 3356–3373.
7. Yao, Y.; Yao, B. Covering based rough set approximations. Inf. Sci. 2012, 200, 91–107.
8. Al-shami, T.M.; Ciucci, D. Subset neighborhood rough sets. Knowl.-Based Syst. 2022, 237.
9. Hu, Q.; Yu, D.; Liu, J.; Wu, C. Neighborhood rough set based heterogeneous feature subset selection. Inf. Sci. 2008, 178, 3577–3594.
10. Yong, L.; Wenliang, H.; Yunliang, J.; Zhiyong, Z. Quick attribute reduct algorithm for neighborhood rough set model. Inf. Sci. 2014, 271, 65–81.
11. Zhang, H.; Shu, L.; Liao, S. Intuitionistic fuzzy soft rough set and its application in decision making. In Abstract and Applied Analysis; Hindawi: London, UK, 2014; Volume 2014.
12. Al-shami, T.M. An improvement of rough sets' accuracy measure using containment neighborhoods with a medical application. Inf. Sci. 2021, 569, 110–124.
13. Zhan-Ao, X.; Xiao-Meng, S.; Tian-Yu, X.; Xian-Wei, X.; Yi-lin, Y. Multi-granulation covering rough intuitionistic fuzzy sets. J. Intell. Fuzzy Syst. 2017, 32, 899–911.
14. Goguen, J.A. L-fuzzy sets. J. Math. Anal. Appl. 1967, 18, 145–174.
15. Gau, W.L.; Buehrer, D.J. Vague sets. IEEE Trans. Syst. Man Cybern. 1993, 23, 610–614.
16. Nada, S.M.S. Fuzzy rough sets. Fuzzy Sets Syst. 1999, 45, 157–160.
17. Zadeh, L. Fuzzy sets. Inf. Control 1965, 11, 338–353.
18. Atanassov, K. Intuitionistic fuzzy sets. Int. J. Bioautomation 2016, 20, 1.
19. Bustince, H.; Burillo, P. Vague sets are intuitionistic fuzzy sets. Fuzzy Sets Syst. 1996, 79, 403–405.
20. Biswas, R. Fuzzy sets and intuitionistic fuzzy sets. NIFS 1997, 3, 3–11.
21. Dutta, P.; Goala, S. Fuzzy decision making in medical diagnosis using an advanced distance measure on intuitionistic fuzzy sets. Open Cybern. Syst. J. 2018, 12, 136–149.
22. Atanassov, K. Intuitionistic fuzzy modal topological structure. Mathematics 2022, 10, 3313.
23. Gohain, B.; Chutia, R.; Dutta, A.P. Distance measure on intuitionistic fuzzy sets and its application in decision-making, pattern recognition, and clustering problems. Int. J. Intell. Syst. 2022, 37, 2458–2501.
24. Muthukumar, P.; Gangadharan, S.S.K. Ordered intuitionistic fuzzy soft sets and its application in decision making problem. Int. J. Fuzzy Syst. Appl. 2018, 7, 76–97.
25. Abu-Gdairi, R.; Noaman, I. Generating fuzzy sets and fuzzy relations based on information. WSEAS Trans. Math. 2021, 20, 178–185.
26. Dubois, D.; Prade, H. Rough fuzzy sets and fuzzy rough sets. Int. J. Gen. Syst. 1990, 17, 191–208.
27. Sun, W.M.B.; Liu, Q. An approach to decision making depending on intuitionistic fuzzy rough sets over two universes. J. Oper. Res. Soc. 2013, 64, 1079–1089.
28. Tang, K.S.J.G.; Zhu, W. A new type of covering-based rough fuzzy set model. Control Decis. 2012, 27, 1652–1662.
29. Hu, G.Y.W.J.; Zhang, Q.H. Covering based generalized rough fuzzy set model. J. Softw. 2010, 21, 968–977.
30. Sun, J.; Wang, Y.P.; Chen, M.W. Interval-valued intuitionistic fuzzy rough sets depending on coverings. Comput. Eng. Appl. 2013, 49, 155–156.
31. Wei, F.F.X.L.; Miao, D.Q.; Xia, F.C. Research on a covering rough fuzzy set model. J. Comput. Dev. 2006, 43, 1719–1723.
32. Zhang, Z.; Bai, Y.; Tian, J. Intuitionistic fuzzy rough sets depending on intuitionistic fuzzy coverings. Control Decis. 2010, 25, 1369–1373.
33. Liang, J.Y.; Qian, Y.H.; Wei, W. Pessimistic rough decision. Second. Int. Rough Sets Theory 2010, 25, 440–449.
34. Zhu, W.; Wang, F.Y. Reduction and axiomization of covering generalized rough sets. Inf. Sci. 2003, 15, 217–230.
35. Bonikowski, Z.; Bryniarski, E.; Wybraniec-Skardowska, U. Extensions and intentions in the rough set theory. Inf. Sci. 1998, 107, 149–167.
36. Rizvi, S.; Naqvi, H.J.; Nadeem, D. Rough intuitionistic fuzzy sets. JCIS 2002, 6, 101–104.
37. Zhu, W.; Wang, F.Y. On three types of covering-based rough sets. IEEE Trans. Knowl. Data Eng. 2007, 19, 1131–1144.
Table 1. IF relation R.

R        y_1          y_2          y_3          y_4          y_5
R(y_1)   (0.9, 0.0)   (0.7, 0.1)   (0.6, 0.2)   (0.5, 0.1)   (0.3, 0.2)
R(y_2)   (0.8, 0.1)   (0.4, 0.4)   (0.8, 0.1)   (0.7, 0.1)   (0.1, 0.0)
R(y_3)   (0.7, 0.2)   (0.3, 0.1)   (0.0, 0.6)   (0.2, 0.2)   (0.6, 0.2)
R(y_4)   (0.6, 0.1)   (0.5, 0.5)   (0.4, 0.4)   (0.7, 0.2)   (0.3, 0.4)
R(y_5)   (0.9, 0.0)   (0.0, 0.1)   (0.1, 0.1)   (0.5, 0.5)   (0.3, 0.3)
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

