Article

gH-Symmetrically Derivative of Interval-Valued Functions and Applications in Interval-Valued Optimization

1 College of Science, Hohai University, Nanjing 210098, China
2 School of Mathematics and Statistics, Hubei Normal University, Huangshi 435002, China
* Author to whom correspondence should be addressed.
Symmetry 2019, 11(10), 1203; https://doi.org/10.3390/sym11101203
Submission received: 2 September 2019 / Revised: 23 September 2019 / Accepted: 24 September 2019 / Published: 25 September 2019
(This article belongs to the Special Issue Advance in Nonlinear Analysis and Optimization)

Abstract

In this paper, we present the gH-symmetrical derivative of interval-valued functions and its properties. As an application, we apply this new derivative to investigate the Karush–Kuhn–Tucker (KKT) conditions of interval-valued optimization problems. Some examples are worked out to illustrate the obtained results.

1. Introduction

In modern times, optimization problems with uncertainty have received considerable attention and are of great value in the economic and control fields (e.g., [1,2,3,4]). From this point of view, Ishibuchi and Tanaka [5] introduced interval-valued optimization as an attempt to handle problems with imprecise parameters. Since then, a collection of papers by Chanas, Kuchta, Bitran and others (e.g., [6,7,8]) has offered many different approaches to this subject. For more profound results and applications, see [9,10,11,12,13,14,15]. In addition, the importance of derivatives in nonlinear interval-valued optimization problems cannot be ignored. Toward this end, Wu [16,17,18] discussed interval-valued nonlinear programming problems and applied the H-derivative to interval-valued Karush–Kuhn–Tucker (KKT) optimization problems. Later, Chalco-Cano et al. [19] employed gH-differentiability to study interval-valued KKT optimality conditions. For details of the above-mentioned derivatives, we refer the interested reader to [20,21].
Motivated by Wu [17] and Chalco-Cano [19], we introduce the gH-symmetrical derivative, which is more general than the gH-derivative. Based on this derivative and its properties, we give KKT optimality conditions for interval-valued optimization problems.
The paper is organized as follows. In Section 2, we recall some preliminaries. In Section 3, we put forward some concepts and theorems of the gH-symmetrical derivative. In Section 4, new KKT-type optimality conditions are derived and some interesting examples are given. Finally, Section 5 contains some conclusions.

2. Preliminaries

Firstly, let $\mathbb{R}$ denote the space of real numbers and $\mathbb{Q}$ denote the set of rational numbers. We denote the set of real intervals by
$$\mathcal{I} = \{ c = [\underline{c}, \overline{c}] \mid \underline{c}, \overline{c} \in \mathbb{R} \text{ and } \underline{c} \le \overline{c} \}.$$
The Hausdorff–Pompeiu distance between intervals $[\underline{c}, \overline{c}]$ and $[\underline{d}, \overline{d}] \in \mathcal{I}$ is defined by
$$D([\underline{c}, \overline{c}], [\underline{d}, \overline{d}]) = \max\{ |\underline{c} - \underline{d}|,\ |\overline{c} - \overline{d}| \}.$$
$(\mathcal{I}, D)$ is a complete metric space. The relation $\preceq_{LU}$ on $\mathcal{I}$ is determined by
$$[\underline{c}, \overline{c}] \preceq_{LU} [\underline{d}, \overline{d}] \iff \underline{c} \le \underline{d},\ \overline{c} \le \overline{d}.$$
Definition 1
([21]). The gH-difference of $c, d \in \mathcal{I}$ is defined by
$$c \ominus_g d = e \iff \text{(a) } c = d + e; \ \text{or (b) } d = c + (-1)e.$$
The gH-difference of two intervals always exists and equals
$$c \ominus_g d = [\min\{\underline{c} - \underline{d}, \overline{c} - \overline{d}\},\ \max\{\underline{c} - \underline{d}, \overline{c} - \overline{d}\}].$$
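The case analysis in Definition 1 and the endpoint formula above are easy to mirror numerically. The following minimal sketch (illustrative code, not from the paper; intervals are modeled as `(lo, hi)` tuples) computes the gH-difference from the endpoint formula and checks it against both cases of the definition.

```python
def gh_diff(c, d):
    """gH-difference via [min{c_lo - d_lo, c_hi - d_hi}, max{c_lo - d_lo, c_hi - d_hi}]."""
    lo, hi = c[0] - d[0], c[1] - d[1]
    return (min(lo, hi), max(lo, hi))

def interval_add(c, d):
    # Minkowski sum of two intervals
    return (c[0] + d[0], c[1] + d[1])

def neg(c):
    # (-1) * c flips the endpoints
    return (-c[1], -c[0])

# l(c) >= l(d): case (a) of Definition 1 applies, c = d + e
c, d = (1.0, 4.0), (0.0, 2.0)
e = gh_diff(c, d)
assert interval_add(d, e) == c

# l(c) < l(d): case (b) applies, d = c + (-1) e
c2, d2 = (0.0, 1.0), (-2.0, 3.0)
e2 = gh_diff(c2, d2)
assert interval_add(c2, neg(e2)) == d2
```

Note that, unlike the Minkowski difference, the gH-difference of an interval with itself is always the degenerate interval $[0, 0]$.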
Proposition 1
([22]). We recall some properties of intervals $c$, $d$ and $e$.
(1) Let the length of an interval $c$ be $l(c) = \overline{c} - \underline{c}$. Then
$$c \ominus_g d = \begin{cases} [\underline{c} - \underline{d},\ \overline{c} - \overline{d}], & \text{if } l(c) \ge l(d); \\ [\overline{c} - \overline{d},\ \underline{c} - \underline{d}], & \text{if } l(c) < l(d). \end{cases}$$
(2) If $(l(c) - l(d))(l(d) - l(e)) \ge 0$, then
$$c \ominus_g e = (c \ominus_g d) + (d \ominus_g e).$$
Let $f : (a, b) \to \mathcal{I}$ be an interval-valued function with $f(t) = [\underline{f}(t), \overline{f}(t)]$, where $\underline{f}(t) \le \overline{f}(t)$ for all $t \in (a, b)$. The functions $\underline{f}$ and $\overline{f}$ are called the endpoint functions of $f$. In [21], Stefanini and Bede introduced the gH-derivative as follows.
Definition 2
([21]). Let $f : (a, b) \to \mathcal{I}$. Then $f$ is gH-differentiable at $t_0 \in (a, b)$ if there exists $f'(t_0) \in \mathcal{I}$ such that
$$\lim_{h \to 0} \frac{f(t_0 + h) \ominus_g f(t_0)}{h} = f'(t_0).$$
For more basic notations with interval analysis, see [21,22,23,24].
Definition 3
([25]). Let $f : (a, b) \to \mathbb{R}$. Then $f$ is symmetrically differentiable at $t_0 \in (a, b)$ if there exists $A \in \mathbb{R}$ such that
$$\lim_{h \to 0} \frac{f(t_0 + h) - f(t_0 - h)}{2h} = A.$$

3. Main Results

Now, we introduce the gH-symmetrical derivative and some corresponding properties.
Definition 4.
Let $f : (a, b) \to \mathcal{I}$. Then $f$ is symmetrically continuous at $t_0 \in (a, b)$ if
$$\lim_{h \to 0} f(t_0 + h) \ominus_g f(t_0 - h) = 0.$$
Definition 5.
Let $f : (a, b) \to \mathcal{I}$. Then $f$ is gH-symmetrically differentiable at $t_0$ if there exists $f^s(t_0) \in \mathcal{I}$ such that
$$\lim_{h \to 0} \frac{f(t_0 + h) \ominus_g f(t_0 - h)}{2h} = f^s(t_0).$$
For convenience, let $D_{\mathcal{I}}((a, b), \mathcal{I})$ and $SD_{\mathcal{I}}((a, b), \mathcal{I})$ denote the collections of gH-differentiable and gH-symmetrically differentiable interval-valued functions on $(a, b)$, respectively.
Lemma 1.
Let $c, d, e \in \mathcal{I}$. If $(l(c) - l(d))(l(d) - l(e)) < 0$, then we have
$$c \ominus_g e = (c \ominus_g d) \ominus_g (-1)(d \ominus_g e).$$
Proof. 
If $l(c) - l(d) < 0$ and $l(d) - l(e) > 0$, by (2) we have
$$\begin{aligned}
(c \ominus_g d) \ominus_g (-1)(d \ominus_g e) &= [\overline{c} - \overline{d},\ \underline{c} - \underline{d}] \ominus_g (-1)[\underline{d} - \underline{e},\ \overline{d} - \overline{e}] \\
&= [\overline{c} - \overline{d},\ \underline{c} - \underline{d}] \ominus_g [\overline{e} - \overline{d},\ \underline{e} - \underline{d}] \\
&= [\min\{\overline{c} - \overline{e}, \underline{c} - \underline{e}\},\ \max\{\overline{c} - \overline{e}, \underline{c} - \underline{e}\}] \\
&= c \ominus_g e.
\end{aligned}$$
If $l(c) - l(d) > 0$ and $l(d) - l(e) < 0$, the proof is similar. □
The following Theorem 1 shows the relation between $D_{\mathcal{I}}((a, b), \mathcal{I})$ and $SD_{\mathcal{I}}((a, b), \mathcal{I})$.
Theorem 1.
Let f : ( a , b ) I be an interval-valued function. If f is gH-differentiable at t ( a , b ) then f is gH-symmetrically differentiable at t. However, the converse is not true.
Proof. 
Fix $t \in (a, b)$ and assume $f'(t)$ exists. Put
$$K = \big( l(f(t+h)) - l(f(t)) \big)\big( l(f(t)) - l(f(t-h)) \big).$$
Applying Proposition 1 and Lemma 1, we obtain
$$f(t+h) \ominus_g f(t-h) = \begin{cases} (f(t+h) \ominus_g f(t)) + (f(t) \ominus_g f(t-h)), & \text{if } K \ge 0; \\ (f(t+h) \ominus_g f(t)) \ominus_g (-1)(f(t) \ominus_g f(t-h)), & \text{if } K < 0. \end{cases}$$
Hence,
a. If $K \ge 0$, by (5) we have
$$\lim_{h \to 0} \frac{f(t+h) \ominus_g f(t-h)}{2h} = \lim_{h \to 0} \frac{(f(t+h) \ominus_g f(t)) + (f(t) \ominus_g f(t-h))}{2h} = f'(t).$$
According to Definition 5, $f^s(t)$ exists and
$$f^s(t) = f'(t).$$
b. If $K < 0$, by (5) we have
$$\lim_{h \to 0} \frac{f(t+h) \ominus_g f(t-h)}{2h} = \lim_{h \to 0} \frac{(f(t+h) \ominus_g f(t)) \ominus_g (-1)(f(t) \ominus_g f(t-h))}{2h} = \frac{f'(t)}{2} \ominus_g (-1)\frac{f'(t)}{2}.$$
Thus, $f^s(t)$ exists and
$$f^s(t) = \frac{f'(t)}{2} \ominus_g (-1)\frac{f'(t)}{2}.$$
Therefore, f is gH-symmetrically differentiable in view of (6) and (7).
Conversely, we give a counterexample as follows. Let
$$f_1(t) = \begin{cases} [\,|2t|,\ |3t|\,], & \text{if } t \ne 1; \\ 2, & \text{if } t = 1. \end{cases}$$
Since
$$\lim_{h \to 0} \frac{[\,|2(1+h)|,\ |3(1+h)|\,] \ominus_g [\,|2(1-h)|,\ |3(1-h)|\,]}{2h} = [2, 3],$$
$f_1$ is gH-symmetrically differentiable at $t = 1$. However,
$$\lim_{h \to 0} \frac{[\,|2(1+h)|,\ |3(1+h)|\,] \ominus_g [2, 2]}{h}$$
does not exist, so $f_1$ is not gH-differentiable at $t = 1$. □
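The behavior of the counterexample $f_1$ can be checked numerically. The sketch below (illustrative code, not from the paper; intervals as `(lo, hi)` tuples) shows that the symmetric difference quotient at $t = 1$ stays at $[2, 3]$ while the upper endpoint of the one-sided gH quotient behaves like $1/h + 3$ and diverges.

```python
def gh_diff(c, d):
    # gH-difference via the endpoint min/max formula
    lo, hi = c[0] - d[0], c[1] - d[1]
    return (min(lo, hi), max(lo, hi))

def divide(c, k):
    # interval divided by a nonzero scalar; endpoints flip when k < 0
    a, b = c[0] / k, c[1] / k
    return (min(a, b), max(a, b))

def f1(t):
    # the counterexample: [|2t|, |3t|] for t != 1, the singleton {2} at t = 1
    return (2.0, 2.0) if t == 1 else (abs(2 * t), abs(3 * t))

for h in (1e-2, 1e-4, 1e-6):
    sym = divide(gh_diff(f1(1 + h), f1(1 - h)), 2 * h)   # stays near (2, 3)
    one = divide(gh_diff(f1(1 + h), f1(1)), h)           # upper endpoint ~ 1/h + 3
```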
Remark 1.
Theorem 1 shows that the gH-symmetrical derivative is more general than the gH-derivative. Moreover, $f'(t)$ and $f^s(t)$ are not necessarily equal, as seen from (6) and (7). For example, consider the interval-valued function $f_2(t) = [-|t|, |t|]$. We have
$$f_2'(0) = \lim_{h \to 0} \frac{f_2(h) \ominus_g f_2(0)}{h} = \lim_{h \to 0} \frac{[-|h|, |h|] \ominus_g [0, 0]}{h} = [-1, 1].$$
However,
$$f_2^s(0) = \lim_{h \to 0} \frac{f_2(h) \ominus_g f_2(-h)}{2h} = \lim_{h \to 0} \frac{[-|h|, |h|] \ominus_g [-|h|, |h|]}{2h} = 0,$$
which implies $f_2'(0) \ne f_2^s(0)$.
Theorem 2.
Let $f : (a, b) \to \mathcal{I}$. Then $f$ is gH-symmetrically differentiable at $t_0 \in (a, b)$ iff $\underline{f}$ and $\overline{f}$ are symmetrically differentiable at $t_0$. Moreover,
$$f^s(t_0) = [\min\{\underline{f}^s(t_0), \overline{f}^s(t_0)\},\ \max\{\underline{f}^s(t_0), \overline{f}^s(t_0)\}].$$
Proof. 
Suppose $f$ is gH-symmetrically differentiable at $t_0$; then $f^s(t_0) = [\underline{g}(t_0), \overline{g}(t_0)]$ exists. According to Definition 5 and (1),
$$\underline{g}(t_0) = \lim_{h \to 0} \min\left\{ \frac{\underline{f}(t_0+h) - \underline{f}(t_0-h)}{2h},\ \frac{\overline{f}(t_0+h) - \overline{f}(t_0-h)}{2h} \right\}, \qquad
\overline{g}(t_0) = \lim_{h \to 0} \max\left\{ \frac{\underline{f}(t_0+h) - \underline{f}(t_0-h)}{2h},\ \frac{\overline{f}(t_0+h) - \overline{f}(t_0-h)}{2h} \right\}$$
exist. Then $\underline{f}^s(t_0)$ and $\overline{f}^s(t_0)$ must exist and (8) holds.
Conversely, suppose $\underline{f}$ and $\overline{f}$ are symmetrically differentiable at $t_0$.
If $\overline{f}^s(t_0) \ge \underline{f}^s(t_0)$, then
$$[\underline{f}^s(t_0),\ \overline{f}^s(t_0)] = \left[ \lim_{h \to 0} \frac{\underline{f}(t_0+h) - \underline{f}(t_0-h)}{2h},\ \lim_{h \to 0} \frac{\overline{f}(t_0+h) - \overline{f}(t_0-h)}{2h} \right] = \lim_{h \to 0} \frac{f(t_0+h) \ominus_g f(t_0-h)}{2h} = f^s(t_0).$$
So $f$ is gH-symmetrically differentiable. Similarly, if $\overline{f}^s(t_0) \le \underline{f}^s(t_0)$, then $f^s(t_0) = [\overline{f}^s(t_0), \underline{f}^s(t_0)]$. □
Next, we study the gH-symmetrical derivative of $f : M \subseteq \mathbb{R}^n \to \mathcal{I}$, where $M$ is an open set.
Definition 6.
Let $f : M \to \mathcal{I}$ and $t^0 = (t_1^0, t_2^0, \ldots, t_n^0) \in M$. If there exist $A_1, A_2, \ldots, A_n \in \mathcal{I}$ such that
$$\lim_{h \to 0} \frac{D\left( f(t^0 + h) \ominus_g f(t^0 - h),\ 2\sum_{i=1}^n h_i A_i \right)}{\sum_{i=1}^n |h_i|} = 0,$$
where $h = (h_1, \ldots, h_n)$, then we call $f$ gH-symmetrically differentiable at $t^0$, and we call $(A_1, A_2, \ldots, A_n)$ (denoted $\nabla_g^s f(t^0) = (A_1, A_2, \ldots, A_n)$) the symmetric gradient of $f$ at $t^0$.
Theorem 3.
The function $f : M \to \mathcal{I}$ is gH-symmetrically differentiable iff $\underline{f}$ and $\overline{f}$ are symmetrically differentiable.
Proof. 
The proof is similar to that of Theorem 2, so we omit it. □
Definition 7.
Let $f : M \to \mathcal{I}$ and $t^0 \in M$. If the interval-valued function $\varphi(t_i) = f(t_1^0, \ldots, t_{i-1}^0, t_i, t_{i+1}^0, \ldots, t_n^0)$ is gH-symmetrically differentiable at $t_i^0$, then $f$ has the $i$th partial gH-symmetrical derivative $\left( \frac{\partial^s f}{\partial t_i} \right)_g (t^0)$ at $t^0$, i.e.,
$$\left( \frac{\partial^s f}{\partial t_i} \right)_g (t^0) = \varphi^s(t_i^0).$$
The following theorem illustrates the relation between symmetric gradients and partial gH-symmetrical derivatives.
Theorem 4.
Let $f : M \to \mathcal{I}$ and $t^0 = (t_1^0, t_2^0, \ldots, t_n^0) \in M$. If $f$ is gH-symmetrically differentiable at $t^0$, then $\left( \frac{\partial^s f}{\partial t_i} \right)_g (t^0)$ exists and $\left( \frac{\partial^s f}{\partial t_i} \right)_g (t^0) = A_i$ $(i = 1, 2, \ldots, n)$, where $(A_1, A_2, \ldots, A_n) = \nabla_g^s f(t^0)$.
Proof. 
By Definition 6, substituting $h_j = 0$ $(j \ne i)$ and letting $h_i \to 0$, it follows at once that $A_i = \left( \frac{\partial^s f}{\partial t_i} \right)_g (t^0)$. □
Example 1.
Let
$$f(t_1, t_2) = \begin{cases} [\,|2t_1| + t_2^2,\ |3t_1| + t_2^2\,], & \text{if } (t_1, t_2) \ne (0, 0); \\ [\,t_2^2,\ t_2^2 + 1\,], & \text{if } (t_1, t_2) = (0, 0). \end{cases}$$
We have
$$\left( \frac{\partial^s f}{\partial t_1} \right)_g (0, 0) = \lim_{h \to 0} \frac{f(h, 0) \ominus_g f(-h, 0)}{2h} = \lim_{h \to 0} \frac{[\,|2h|, |3h|\,] \ominus_g [\,|2h|, |3h|\,]}{2h} = 0$$
and
$$\left( \frac{\partial^s f}{\partial t_2} \right)_g (0, 0) = \lim_{h \to 0} \frac{f(0, h) \ominus_g f(0, -h)}{2h} = \lim_{h \to 0} \frac{[\,h^2, h^2\,] \ominus_g [\,h^2, h^2\,]}{2h} = 0.$$
Therefore, the symmetric gradient of $f$ at the point $(0, 0)$ is $\nabla_g^s f(0, 0) = (0, 0)$.
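Both partial symmetric difference quotients of Example 1 can be evaluated directly; in fact they vanish identically, not merely in the limit, because the function is even in each variable separately at the origin. A minimal sketch (illustrative code, not from the paper):

```python
def gh_diff(c, d):
    # gH-difference of (lo, hi) intervals
    lo, hi = c[0] - d[0], c[1] - d[1]
    return (min(lo, hi), max(lo, hi))

def f(t1, t2):
    # the interval-valued function of Example 1
    if (t1, t2) == (0.0, 0.0):
        return (t2 * t2, t2 * t2 + 1.0)
    return (abs(2 * t1) + t2 * t2, abs(3 * t1) + t2 * t2)

def partial_sym_quotient(i, h):
    """i-th partial gH-symmetric difference quotient of f at (0, 0)."""
    plus = f(h, 0.0) if i == 1 else f(0.0, h)
    minus = f(-h, 0.0) if i == 1 else f(0.0, -h)
    lo, hi = gh_diff(plus, minus)
    return (lo / (2 * h), hi / (2 * h))

h = 1e-5
q1 = partial_sym_quotient(1, h)   # (0.0, 0.0): f(h, 0) == f(-h, 0)
q2 = partial_sym_quotient(2, h)   # (0.0, 0.0): f(0, h) == f(0, -h)
```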
Remark 2.
The gradient of $f$ in [19] is more restrictive than the symmetric gradient. For instance, the partial gH-derivative $\left( \frac{\partial f}{\partial t_1} \right)_g (0, 0)$ does not exist in Example 1, so the gradient at $(0, 0)$ cannot be obtained using the gH-derivative.

4. Mathematical Programming Applications

Now, we discuss the following interval-valued optimization problem (IVOP1):
$$\min f(t) \quad \text{subject to} \quad g_i(t) \le 0, \quad i = 1, \ldots, m,$$
where $g_1, g_2, \ldots, g_m : M \subseteq \mathbb{R}^n \to \mathbb{R}$ are symmetrically differentiable and convex on $M$, $M$ is an open and convex set, and $f : M \to \mathcal{I}$ is LU-convex (see [19], Definition 8). We study the LU-solution (see [17], Definition 5.1) of problem (IVOP1).
Theorem 5.
Suppose $f : M \to \mathcal{I}$ is LU-convex and gH-symmetrically differentiable at $t^*$. If there exist (Lagrange) multipliers $0 < \lambda_1, \lambda_2 \in \mathbb{R}$ and $0 \le \mu_i \in \mathbb{R}$, $i = 1, \ldots, m$, such that
(1) $\lambda_1 \nabla^s \underline{f}(t^*) + \lambda_2 \nabla^s \overline{f}(t^*) + \sum_{i=1}^m \mu_i \nabla^s g_i(t^*) = 0$;
(2) $\sum_{i=1}^m \mu_i g_i(t^*) = 0$, where $\mu = (\mu_1, \ldots, \mu_m)^T$.
Then t * is an optimal LU-solution of problem (IVOP1).
Proof. 
We define $f_l(t) = \lambda_1 \underline{f}(t) + \lambda_2 \overline{f}(t)$. Since $f$ is LU-convex and gH-symmetrically differentiable at $t^*$, $f_l$ is convex and symmetrically differentiable at $t^*$, and
$$\nabla^s f_l(t^*) = \lambda_1 \nabla^s \underline{f}(t^*) + \lambda_2 \nabla^s \overline{f}(t^*).$$
Then we have the following conditions:
(1) $\nabla^s f_l(t^*) + \sum_{i=1}^m \mu_i \nabla^s g_i(t^*) = 0$;
(2) $\sum_{i=1}^m \mu_i g_i(t^*) = 0$, where $\mu = (\mu_1, \ldots, \mu_m)^T$.
Based on Theorem 3.1 of [26], $t^*$ is an optimal solution of the real-valued objective function $f_l$ subject to the same constraints as problem (IVOP1), i.e.,
$$f_l(t^*) \le f_l(\bar{t})$$
for any $\bar{t}\ (\ne t^*) \in M$.
Next, we argue by contradiction. Assume $t^*$ is not an LU-solution of (IVOP1); then there exists $\bar{t} \in M$ such that $f(\bar{t}) \prec_{LU} f(t^*)$, i.e.,
$$\begin{aligned}
&\underline{f}(\bar{t}) < \underline{f}(t^*) \ \text{and} \ \overline{f}(\bar{t}) \le \overline{f}(t^*); \quad \text{or} \\
&\underline{f}(\bar{t}) \le \underline{f}(t^*) \ \text{and} \ \overline{f}(\bar{t}) < \overline{f}(t^*); \quad \text{or} \\
&\underline{f}(\bar{t}) < \underline{f}(t^*) \ \text{and} \ \overline{f}(\bar{t}) < \overline{f}(t^*).
\end{aligned}$$
Since $\lambda_1, \lambda_2 > 0$, in each case we obtain $f_l(\bar{t}) < f_l(t^*)$, which is a contradiction. This completes the proof. □
Example 2.
Consider the objective function
$$f(t) = \begin{cases} [\,3t^2 + t - 6,\ 2t^2 + 2t\,], & \text{if } t \in (-1, 0); \\ [\,2t - 6,\ 2t\,], & \text{if } t \in [0, 1), \end{cases}$$
and the optimization problem (IVOP2)
$$\min f(t) \quad \text{subject to} \quad -t \le 0, \quad t - 1 \le 0.$$
We have
$$\underline{f}(t) = \begin{cases} 3t^2 + t - 6, & \text{if } t \in (-1, 0); \\ 2t - 6, & \text{if } t \in [0, 1), \end{cases} \qquad
\overline{f}(t) = \begin{cases} 2t^2 + 2t, & \text{if } t \in (-1, 0); \\ 2t, & \text{if } t \in [0, 1). \end{cases}$$
Both $\underline{f}$ and $\overline{f}$ are convex and symmetrically differentiable. Furthermore, conditions (1) and (2) of Theorem 5 are satisfied at $t^* = 0$ when $\lambda_1 = \frac{1}{3}\mu_1$, $\lambda_2 = \frac{1}{4}\mu_1$ and $\mu_2 = 0$. Hence, $t^* = 0$ is an LU-solution of (IVOP2).
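The multipliers in Example 2 can be verified numerically: the symmetric derivatives of the endpoint functions at $t^* = 0$ are $\nabla^s \underline{f}(0) = \frac{3}{2}$ and $\nabla^s \overline{f}(0) = 2$, and the stationarity condition (1) of Theorem 5 then balances exactly. A sketch (illustrative code, not from the paper; the constraints are taken in the form $g_1(t) = -t$, $g_2(t) = t - 1$):

```python
def f_lower(t):
    # lower endpoint function of Example 2
    return 3 * t * t + t - 6 if -1 < t < 0 else 2 * t - 6

def f_upper(t):
    # upper endpoint function of Example 2
    return 2 * t * t + 2 * t if -1 < t < 0 else 2 * t

def sym_deriv(g, t0, h=1e-6):
    # symmetric difference quotient of a real-valued function
    return (g(t0 + h) - g(t0 - h)) / (2 * h)

dl = sym_deriv(f_lower, 0.0)      # approaches 3/2 as h -> 0
du = sym_deriv(f_upper, 0.0)      # approaches 2 as h -> 0

# gradients of g1(t) = -t and g2(t) = t - 1 are -1 and 1
mu1, mu2 = 1.0, 0.0
lam1, lam2 = mu1 / 3, mu1 / 4
stationarity = lam1 * dl + lam2 * du + mu1 * (-1.0) + mu2 * 1.0
assert abs(stationarity) < 1e-5   # condition (1) of Theorem 5 at t* = 0
```

Note that $\underline{f}$ is not (ordinarily) differentiable at $0$, since its one-sided derivatives are $1$ and $2$; only the symmetric derivative $\frac{3}{2}$ exists.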
Remark 3.
Note that Theorem 4 in [19] cannot be applied to problem (IVOP2), since $\underline{f}$ is not differentiable at 0. Hence, Theorem 5 generalizes Theorem 4 in [19].
Applying Theorem 5 we have the following result.
Corollary 1.
Under the same assumptions as Theorem 5, let $k$ be any integer with $1 < k < m$. If there exist (Lagrange) multipliers $0 \le \mu_i \in \mathbb{R}$, $i = 1, \ldots, m$, such that
(1) $\nabla^s \underline{f}(t^*) + \sum_{i=1}^k \mu_i \nabla^s g_i(t^*) = 0$;
(2) $\nabla^s \overline{f}(t^*) + \sum_{i=k+1}^m \mu_i \nabla^s g_i(t^*) = 0$;
(3) $\sum_{i=1}^k \mu_i g_i(t^*) = 0 = \sum_{i=k+1}^m \mu_i g_i(t^*)$, where $\mu = (\mu_1, \ldots, \mu_m)^T$.
Then t * is an optimal LU-solution of problem (IVOP1).
Proof. 
Choose any $\lambda_1, \lambda_2 > 0$ and let $\nu_i = \lambda_1 \mu_i$ $(i = 1, \ldots, k)$ and $\omega_i = \lambda_2 \mu_i$ $(i = k+1, \ldots, m)$. Multiplying condition (1) by $\lambda_1$ and condition (2) by $\lambda_2$ and adding, the conditions of this corollary can be written as
(1) $\lambda_1 \nabla^s \underline{f}(t^*) + \lambda_2 \nabla^s \overline{f}(t^*) + \sum_{i=1}^k \nu_i \nabla^s g_i(t^*) + \sum_{i=k+1}^m \omega_i \nabla^s g_i(t^*) = 0$;
(2) $\sum_{i=1}^k \mu_i g_i(t^*) = 0 = \sum_{i=k+1}^m \mu_i g_i(t^*)$.
Then from Theorem 5 the result follows. □
As shown in Example 1, the symmetric gradient is more general than the gradient of $f$ based on the gH-derivative. We therefore derive new KKT conditions for (IVOP1) using the symmetric gradient of an interval-valued function given in Definition 6.
Theorem 6.
Under the same assumptions as Theorem 5, suppose there exist multipliers $0 \le \mu_i \in \mathbb{R}$, $i = 1, \ldots, m$, such that the following KKT conditions hold:
(1) $\nabla_g^s f(t^*) + \sum_{i=1}^m \mu_i \nabla^s g_i(t^*) = 0$;
(2) $\sum_{i=1}^m \mu_i g_i(t^*) = 0$, where $\mu = (\mu_1, \ldots, \mu_m)^T$.
Then t * is an optimal LU-solution of problem (IVOP1).
Proof. 
By Theorem 3, the equation $\nabla_g^s f(t^*) + \sum_{i=1}^m \mu_i \nabla^s g_i(t^*) = 0$ can be interpreted as
$$\nabla^s \underline{f}(t^*) + \sum_{i=1}^m \mu_i \nabla^s g_i(t^*) = 0 = \nabla^s \overline{f}(t^*) + \sum_{i=1}^m \mu_i \nabla^s g_i(t^*),$$
which implies
$$\nabla^s \underline{f}(t^*) + \nabla^s \overline{f}(t^*) + \sum_{i=1}^m \nu_i \nabla^s g_i(t^*) = 0,$$
where $\nu_i = 2\mu_i$ $(i = 1, \ldots, m)$. Thus all conditions of Theorem 5 are met (with $\lambda_1 = \lambda_2 = 1$), which completes the proof. □
Example 3.
Suppose
$$f(t) = \begin{cases} \left[\,\tfrac{1}{3}t - 1,\ t\,\right], & \text{if } t > 0; \\[2pt] \left[\,-\tfrac{1}{3}t - 1,\ -t\,\right], & \text{if } t \le 0, \end{cases}$$
and consider the programming problem (IVOP3)
$$\min f(t) \quad \text{subject to} \quad t - 2 \le 0, \quad t - 1 < 3.$$
We can observe that $\nabla_g^s f(0) = 0$. Conditions (1) and (2) of Theorem 6 are satisfied for $\mu_1 = \mu_2 = 0$. Hence, 0 is an optimal LU-solution of (IVOP3).
Remark 4.
It is worth noting that Theorem 9 of [19] cannot solve problem (IVOP3), since $f$ is not gH-differentiable at 0. So Theorem 6 is more general than Theorem 9 in [19].
Remark 5.
Comparing Example 2 with Example 3, it is easy to see that Theorem 5 is more general than Theorem 6. Nonetheless, Theorem 6 can be very effective for obtaining solutions of (IVOP1) in some cases.

5. Conclusions and Further Research

We defined the gH-symmetrical derivative of interval-valued functions, which is more general than the gH-derivative. In addition, we generalized some results of Wu [17] and Chalco-Cano [19] by establishing sufficient optimality conditions for optimization problems involving gH-symmetrically differentiable objective functions. The symmetric gradient of interval-valued functions is more general and more robust for optimization problems. However, equality constraints were not considered in this paper; they could be handled with a methodology similar to the one proposed here. Moreover, the constraint functions in this paper are still real-valued. In future research, we may extend the constraint functions to interval-valued functions, and we may study the symmetric integral and further properties of interval-valued functions.

Author Contributions

All authors contributed equally to the writing of this paper. All authors read and approved the final manuscript.

Funding

This work is financially supported by the Fundamental Research Funds for the Central Universities (2017B19714, 2017B07414, 2019B44914), Natural Science Foundation of Jiangsu Province (BK20180500) and the National Key Research and Development Program of China (2018YFC1508106).

Acknowledgments

The authors thank the anonymous referees for their constructive comments and suggestions which helped to improve the presentation.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Bao, T.Q.; Mordukhovich, B.S. Set-valued optimization in welfare economics. In Advances in Mathematical Economics; Springer: Tokyo, Japan, 2010; Volume 13, pp. 113–153. [Google Scholar]
  2. Chung, B.D.; Yao, T.; Xie, C.; Thorsen, A. Robust Optimization Model for a Dynamic Network Design Problem Under Demand Uncertainty. Netw. Spat. Econ. 2010, 11, 371–389. [Google Scholar] [CrossRef] [Green Version]
  3. Ostrovsky, G.M.; Volin, Y.M.; Golovashkin, D.V. Optimization problem of complex system under uncertainty. Comput. Chem. Eng. 1998, 22, 1007–1015. [Google Scholar] [CrossRef]
  4. Mahanipour, A.; Nezamabadi-Pour, H. GSP: An automatic programming technique with gravitational search algorithm. Appl. Intell. 2019, 49, 1502–1516. [Google Scholar] [CrossRef]
  5. Ishibuchi, H.; Tanaka, H. Multiobjective programming in optimization of the interval objective function. Eur. J. Oper. Res. 1990, 48, 219–225. [Google Scholar] [CrossRef]
  6. Chanas, S.; Kuchta, D. Multiobjective programming in optimization of interval objective functions—A generalized approach. Eur. J. Oper. Res. 1996, 94, 594–598. [Google Scholar] [CrossRef]
  7. Bitran, G.R. Linear multiple objective problems with interval coefficients. Manag. Sci. 1980, 26, 694–706. [Google Scholar] [CrossRef]
  8. Ida, M. Multiple objective linear programming with interval coefficients and its all efficient solutions. In Proceedings of the 35th IEEE Conference on Decision and Control, Kobe, Japan, 13 December 1996; Volume 2, pp. 1247–1249. [Google Scholar]
  9. Singh, D.; Dar, B.A.; Kim, D.S. KKT optimality conditions in interval valued multiobjective programming with generalized differentiable functions. Eur. J. Oper. Res. 2016, 254, 29–39. [Google Scholar] [CrossRef]
  10. Debnath, I.P.; Gupta, S.K. Necessary and Sufficient Optimality Conditions for Fractional Interval-Valued Optimization Problems. In Decision Science in Action; Springer: Singapore, 2019; pp. 155–173. [Google Scholar]
  11. Ghosh, D.; Singh, A.; Shukla, K.K.; Manchanda, K. Extended Karush-Kuhn-Tucker condition for constrained interval optimization problems and its application in support vector machines. Inform. Sci. 2019, 504, 276–292. [Google Scholar] [CrossRef]
  12. Tung, L.T. Karush-Kuhn-Tucker optimality conditions and duality for convex semi-infinite programming with multiple interval-valued objective functions. J. Appl. Math. Comput. 2019, 1–25. [Google Scholar] [CrossRef]
  13. Stefanini, L.; Arana-Jiménez, M. Karush-Kuhn-Tucker conditions for interval and fuzzy optimization in several variables under total and directional generalized differentiability. Fuzzy Sets Syst. 2019, 362, 1–34. [Google Scholar] [CrossRef]
  14. Chen, X.H.; Li, Z.H. On optimality conditions and duality for non-differentiable interval-valued programming problems with the generalized (F, ρ)-convexity. J. Ind. Manag. Optim. 2018, 14, 895–912. [Google Scholar] [CrossRef]
  15. Roman, R.C.; Precup, R.E.; David, R.C. Second order intelligent proportional-integral fuzzy control of twin rotor aerodynamic systems. Procedia Comput. Sci. 2018, 139, 372–380. [Google Scholar] [CrossRef]
  16. Wu, H.C. On interval-valued nonlinear programming problems. J. Math. Anal. Appl. 2008, 338, 299–316. [Google Scholar] [CrossRef] [Green Version]
  17. Wu, H.C. The Karush-Kuhn-Tucker optimality conditions in an optimization problem with interval-valued objective function. Eur. J. Oper. Res. 2007, 176, 46–59. [Google Scholar] [CrossRef]
  18. Wu, H.C. The optimality conditions for optimization problems with convex constraints and multiple fuzzy-valued objective functions. Fuzzy Optim. Decis. Mak. 2009, 8, 295–321. [Google Scholar] [CrossRef]
  19. Chalco-Cano, Y.; Lodwick, W.A.; Rufian-Lizana, A. Optimality conditions of type KKT for optimization problem with interval-valued objective function via generalized derivative. Fuzzy Optim. Decis. Mak. 2013, 12, 305–322. [Google Scholar] [CrossRef]
  20. Hukuhara, M. Integration des applications mesurables dont la valeur est un compact convexe. Funkcial. Ekvac. 1967, 10, 205–223. [Google Scholar]
  21. Stefanini, L.; Bede, B. Generalized Hukuhara differentiability of interval-valued functions and interval differential equations. Nonlinear Anal. 2009, 71, 1311–1328. [Google Scholar] [CrossRef] [Green Version]
  22. Tao, J.; Zhang, Z.H. Properties of interval vector-valued arithmetic based on gH-difference. Math. Comput. 2015, 4, 7–12. [Google Scholar]
  23. Stefanini, L. A generalization of Hukuhara difference and division for interval and fuzzy arithmetic. Fuzzy Sets Syst. 2010, 161, 1564–1584. [Google Scholar] [CrossRef]
  24. Markov, S. Calculus for interval functions of a real variable. Computing 1979, 22, 325–337. [Google Scholar] [CrossRef]
  25. Thomson, B.S. Symmetric Properties of Real Functions; Dekker: New York, NY, USA, 1994. [Google Scholar]
  26. Minch, R.A. Applications of symmetric derivatives in mathematical programming. Math. Program. 1971, 1, 307–320. [Google Scholar] [CrossRef]

Share and Cite

MDPI and ACS Style

Guo, Y.; Ye, G.; Zhao, D.; Liu, W. gH-Symmetrically Derivative of Interval-Valued Functions and Applications in Interval-Valued Optimization. Symmetry 2019, 11, 1203. https://doi.org/10.3390/sym11101203


