Article

A New Differential Mutation Based Adaptive Harmony Search Algorithm for Global Optimization

1 School of Science, Beijing University of Posts and Telecommunications, Beijing 100876, China
2 Hunan Key Laboratory for Computation and Simulation in Science and Engineering, Xiangtan University, Xiangtan 411105, China
3 School of Statistics, University of International Business and Economics, Beijing 100029, China
4 School of Mathematics and Computational Science, Xiangtan University, Xiangtan 411105, China
* Authors to whom correspondence should be addressed.
Appl. Sci. 2020, 10(8), 2916; https://doi.org/10.3390/app10082916
Submission received: 22 February 2020 / Revised: 27 March 2020 / Accepted: 17 April 2020 / Published: 23 April 2020

Abstract: The canonical harmony search (HS) algorithm generates a new solution using random adjustment. However, the beneficial effects of the harmony memory are not well exploited. In order to make full use of the harmony memory when generating new solutions, this paper proposes a new adaptive harmony search algorithm (aHSDE) with differential mutation, periodic learning and a linear population size reduction strategy for global optimization. Differential mutation is used for pitch adjustment, which provides a promising directional guide for adjusting the bandwidth. To balance the diversity and convergence of the harmony memory, a strategy that linearly reduces the harmony memory size over the iterations is proposed. Meanwhile, periodic learning is used to adaptively modify the pitch adjusting rate and the scaling factor to improve the adaptability of the algorithm. The effects and the cooperation of the proposed strategies and the key parameters are analyzed in detail. Experimental comparisons with well-known HS variants and several state-of-the-art evolutionary algorithms on the CEC 2014 benchmark indicate that the aHSDE has very competitive performance.


1. Introduction

The Harmony Search (HS) algorithm, proposed by Geem et al. [1] in 2001, is an Evolutionary Algorithm (EA) that takes inspiration from the music improvisation process. It is an emerging population-based metaheuristic optimization algorithm which simulates the improvisation behavior of musicians who repeatedly adjust their instruments, eventually reaching a harmonious state. In HS, the harmony of musical instrument tones is regarded as a solution vector of the optimization problem, and the evaluation of a musical harmony corresponds to the objective function value.
There are four main control parameters in the canonical HS algorithm [1]: the harmony memory size (HMS), harmony memory considering rate (HMCR), pitch adjusting rate (PAR) and bandwidth (bw). However, it is well known that the optimal setting of these parameters [2] depends on the problem. Therefore, when HS is applied to real-world problems, the control parameters must be tuned to obtain the desired results. As a result, HS has attracted increasing attention and a variety of HS variants have been proposed.
In order to improve its efficiency or to overcome some shortcomings, the original HS operators have been adapted and/or new operators have been introduced. Mahdavi et al. [3] proposed an improved harmony search algorithm (IHS) in which PAR increases linearly while bw decreases exponentially as the number of iterations grows. Pan et al. [4] proposed a self-adaptive global-best harmony search (SGHS), which employs a new improvisation scheme and a parameter adjustment strategy based on a learning period. Combining the harmony search algorithm with particle swarm optimization (PSO) [5], Valian et al. [6] presented an intelligent global harmony search algorithm (IGHS) with excellent performance compared with its competitors. To enhance search efficiency and effectiveness, a self-adaptive global-best harmony search algorithm [7] was developed; it takes full advantage of the valuable information hidden in the harmony memory to devise a high-performance search strategy, and it integrates a self-adaptive mechanism to obtain a parameter-setting-free technique [8]. Ouyang et al. [9] proposed an improved harmony search algorithm with three key features: adaptive global pitch adjustment, opposition-based learning and a competition selection mechanism. Inspired by simulated annealing's acceptance of inferior solutions, a hybrid harmony search algorithm (HSA) [10] was proposed that accepts inferior harmonies with a probability determined by a temperature parameter. Zhu et al. [11] proposed an improved differential-based harmony search algorithm with a linear dynamic domain, built on two main innovative strategies. Focusing on the historical development of algorithm structures, Zhang and Geem [12] reviewed various modified and hybrid HS methods, covering the adaptation of the original operators, parameter adaptation, hybrid methods, multi-objective optimization and constraint handling.
A natural question arises: why does HS work on such a variety of problems from science and engineering [13]? For discrete problems, the unique stochastic derivative [14] gives the probabilistic inclination to select certain discrete points based on the multiple vectors stored in the harmony memory (HM). Although HS is easy to implement and has a simple structure [13], it has proved competitive with more complex optimization algorithms and has been applied to many practical problems [12,15,16,17]. HS has been successfully used in a wide range of applications [18,19,20,21,22,23,24,25], which has attracted substantial research attention aimed at further improving its performance. Combining HS and local search, a novel sensor localization approach was proposed by Manjarres et al. [26]. Minimizing energy consumption and maximizing network lifetime in wireless sensor networks (WSNs) with the HS algorithm has been closely studied [27,28,29,30]. Degertekin [31] applied the harmony search algorithm to the sizing optimization of truss structures. Compared with the genetic algorithm on the max-cut problem [32], the harmony search algorithm has the advantage of generating new vectors after considering all of the existing vectors and parameters. Boryczka and Szwarc [33] proposed a harmony search algorithm with an additional improvement of harmony memory for asymmetric traveling salesman problems, which eliminates an imperfection revealed in previous research. Seyedhosseini et al. [34] studied the portfolio optimization problem using a mean-semivariance approach based on harmony search and an artificial bee colony. HSA [35] was used for reservoir engineering assisted history matching problems of different complexity: two material balance history matches of different scales and one reservoir history match.
However, HS and its variants usually have the following drawbacks, which are also our research motivation.
(1)
For the pitch adjustment operator of HS, a larger bandwidth makes it easier to jump out of a local optimum, while a smaller bandwidth favors fine-grained search around promising solutions. Therefore, a fixed step size is not an ideal choice.
(2)
It is difficult to find the optimal solution with a constant pitch adjusting probability; an adaptive adjustment method is required.
(3)
The parameter HMS has an important influence on the performance of the algorithm, and adaptively sizing HMS can enhance its performance.
Therefore, an adaptive harmony search algorithm with differential evolution mutation, periodic learning and linear population size reduction (aHSDE) is proposed. The main contributions of this paper are as follows.
(1)
The pitch adjustment strategy is implemented with differential mutation; the pitch adjusting rate PAR and the scaling factor F are adjusted with a periodic learning strategy; and a linear population size reduction strategy is adopted as the HMS changing scheme.
(2)
The cooperation and effects of several strategies are analyzed step by step.
The organization of this paper is as follows. Section 2 reviews the canonical HS and several improved variants. In Section 3, the composite strategies and algorithm aHSDE are proposed. In Section 4, the effects and the cooperation of the proposed strategies and parameters analysis are presented. The comprehensive performance comparison with other HS variants and other state-of-the-art EAs are presented in Section 5. Finally, Section 6 concludes the paper.

2. Harmony Search and Several Variants

2.1. Harmony Search Algorithm

The harmony search algorithm is a population-based metaheuristic optimization algorithm [1] inspired by the improvisation process of music. The improvisation process is modeled as an iterative optimization method in which musicians improvise with their instruments to produce a better harmony [36]. The basic steps are described in detail in Algorithm 1.
Algorithm 1: General Framework of the Harmony Search (HS).
1: // Initialize the problem and algorithm parameters //
    f(x): objective function
    HMS: harmony memory size
    HMCR: harmony memory considering rate
    PAR: pitch adjusting rate
    bw: bandwidth
    DIM: dimension of the decision variable
    MAX_NFE: maximum number of function evaluations
    L_i, U_i: the lower and upper bounds of the i-th component of the decision vector
2: // Initialize the harmony memory (x^1, x^2, ..., x^HMS) //
    x_i^j = L_i + rand(0,1) · (U_i − L_i)
3: // Improvise a new harmony x^new = (x_1^new, x_2^new, ..., x_DIM^new) //
    for each i ∈ [1, DIM] do
        if (rand(0,1) < HMCR) then
            x_i^new = x_i^a, where a ∈ {1, 2, ..., HMS}
            if (rand(0,1) < PAR) then
                x_i^new = x_i^new + rand(−1,1) × bw
            endif
        else
            x_i^new = L_i + rand(0,1) × (U_i − L_i)
        endif
    endfor
4: // Update the harmony memory //
    if f(x^new) < f(x^worst) = max_{j=1,...,HMS} f(x^j), then x^worst = x^new
5: // Check the stopping criterion //
    If the termination condition is met, stop and output the best individual. Otherwise, repeat from Step 3.
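For readers who want to experiment with the canonical HS, the following is a minimal Python sketch of Algorithm 1; the parameter defaults are the canonical settings listed later in Algorithm 5, while the function and variable names are ours.

```python
import numpy as np

def harmony_search(f, dim, lower, upper, hms=5, hmcr=0.9, par=0.3,
                   bw=0.01, max_nfe=10000, seed=0):
    """Minimal sketch of the canonical HS (Algorithm 1)."""
    rng = np.random.default_rng(seed)
    # Step 2: initialize the harmony memory and evaluate it.
    hm = lower + rng.random((hms, dim)) * (upper - lower)
    fit = np.array([f(x) for x in hm])
    nfe = hms
    while nfe < max_nfe:
        # Step 3: improvise a new harmony component by component.
        new = np.empty(dim)
        for i in range(dim):
            if rng.random() < hmcr:
                new[i] = hm[rng.integers(hms), i]          # memory consideration
                if rng.random() < par:
                    new[i] += rng.uniform(-1.0, 1.0) * bw  # pitch adjustment
            else:
                new[i] = lower[i] + rng.random() * (upper[i] - lower[i])
        # Step 4: replace the worst harmony if the new one is better.
        f_new = f(new)
        nfe += 1
        worst = int(np.argmax(fit))
        if f_new < fit[worst]:
            hm[worst], fit[worst] = new, f_new
    # Step 5 is the loop condition above; return the best harmony found.
    best = int(np.argmin(fit))
    return hm[best], float(fit[best])

# Usage: minimize the 10-dimensional sphere function on [-100, 100]^10.
best_x, best_f = harmony_search(lambda x: float(np.sum(x ** 2)), dim=10,
                                lower=np.full(10, -100.0),
                                upper=np.full(10, 100.0))
```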

2.2. The Improved Harmony Search Algorithm (IHS)

In the canonical harmony search algorithm, the parameters PAR and bw are constant. Mahdavi et al. [3] proposed an improved harmony search algorithm, called IHS, which mainly introduced the dynamic change of PAR and bw using the following equations:

PAR(NFE) = PAR_min + ((PAR_max − PAR_min) / MAX_NFE) × NFE    (1)

bw(NFE) = bw_max · exp( (ln(bw_min / bw_max) / MAX_NFE) × NFE )    (2)

where PAR_max is the maximum adjusting rate, PAR_min is the minimum adjusting rate, bw_max is the maximum bandwidth and bw_min is the minimum bandwidth. MAX_NFE is the maximum number of function evaluations and NFE is the current number of function evaluations.
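As a quick illustration, the two schedules can be transcribed directly from Equations (1) and (2); this is a small sketch with default values taken from the IHS settings listed in Algorithm 5, not code from the IHS authors.

```python
import math

def ihs_par(nfe, max_nfe, par_min=0.01, par_max=0.99):
    # Equation (1): PAR grows linearly with the number of evaluations.
    return par_min + (par_max - par_min) / max_nfe * nfe

def ihs_bw(nfe, max_nfe, bw_min=1e-4, bw_max=10.0):
    # Equation (2): bw decays exponentially from bw_max down to bw_min.
    return bw_max * math.exp(math.log(bw_min / bw_max) / max_nfe * nfe)
```

At nfe = 0 the schedules return par_min and bw_max; at nfe = max_nfe they return par_max and bw_min exactly.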

2.3. A Self-Adaptive Global-Best Harmony Search (SGHS)

The SGHS [4] employs a new improvisation scheme and an adaptive parameter tuning method. In the modified pitch adjustment rule, x_i^new is assigned the corresponding decision variable x_i^best from the best harmony vector. In addition, the concept of a learning period is introduced: the parameters HMCR and PAR are dynamically adapted to a suitable range by recording the historical values that produced successful harmonies entering the harmony memory. Furthermore, bw is dynamically updated using the following equation:

bw(NFE) = bw_max − ((bw_max − bw_min) / MAX_NFE) × 2·NFE,  if NFE < MAX_NFE / 2
bw(NFE) = bw_min,                                          if NFE ≥ MAX_NFE / 2    (3)

where bw_max and bw_min are the maximum and minimum values of the bandwidth (bw), respectively.
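The piecewise schedule of Equation (3) translates directly into code; a small sketch, with default values taken from the SGHS settings listed in Algorithm 5:

```python
def sghs_bw(nfe, max_nfe, bw_min=5e-4, bw_max=10.0):
    # Equation (3): linear decrease during the first half of the run,
    # after which bw is held constant at bw_min.
    if nfe < max_nfe / 2:
        return bw_max - (bw_max - bw_min) / max_nfe * 2 * nfe
    return bw_min
```

The schedule is continuous: at nfe = max_nfe / 2 the linear branch reaches exactly bw_min.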

2.4. An Intelligent Global Harmony Search Algorithm (IGHS)

Valian et al. [6] modified the improvisation step by imitating one dimension of the best harmony in the harmony memory to generate the new harmony, and proposed the IGHS algorithm. The main steps are shown in Algorithm 2.
Algorithm 2: Main framework of the intelligent global harmony search algorithm (IGHS).
for each i ∈ [1, DIM] do
    if (rand(0,1) < HMCR) then
        if (rand(0,1) < PAR) then
            x_i^new = x_k^best, where k ∈ {1, 2, ..., DIM}
        else
            x_i^R = 2 × x_i^best − x_i^worst
            if x_i^R < L_i then
                x_i^R = L_i
            elseif x_i^R > U_i then
                x_i^R = U_i
            endif
            x_i^new = x_i^worst + rand(0,1) × (x_i^R − x_i^worst)
        endif
    else
        x_i^new = L_i + rand(0,1) × (U_i − L_i)
    endif
endfor

3. Adaptive Harmony Search with Differential Evolution

When musicians compose music, they take full advantage of their own knowledge and experience when choosing the direction of improvement. As the composition is continuously refined, the pool of useful experience shrinks and composition accelerates. Inspired by this conception, it is desirable to make full use of the information in the harmony memory and to dynamically adjust the harmony memory size. Thus, the differential evolution mutation is adopted in the modified algorithm to provide effective guidance when generating new solutions. A linear harmony memory size reduction strategy is also introduced to accelerate convergence. Meanwhile, in order to strengthen the general suitability for various problems and reduce the dependence on the parameters, parameter self-adaptation based on the concept of a learning period is applied to the modified algorithm.
This paper therefore presents an adaptive harmony search algorithm (aHSDE) with differential evolution mutation, periodic learning and a linear population size reduction strategy, which together aim to balance the global exploration and local exploitation abilities of the harmony search algorithm.

3.1. Differential Evolution

As a stochastic population-based optimization algorithm, differential evolution (DE) is similar to other evolutionary algorithms [37]. The basic idea of DE is as follows: a set of initial individuals is generated randomly in the search space, each individual representing a solution; a new individual is then generated by three operations applied in sequence: mutation, crossover and selection. The core idea of DE is to add the differential vectors between several individual pairs to a base vector, which controls the magnitude and direction of exploration in the promising neighborhood [38].
This paper uses the DE/best/2 mutation, defined as follows:

x_i^new = x_i^best + F · [(x_i^r1 − x_i^r2) + (x_i^r3 − x_i^r4)]    (4)

where r1, r2, r3 and r4 are mutually different individual indexes chosen at random. The parameter F is a scale factor controlling the mutation step size. The scaled differential vectors over the possible individual pairs adapt to the current neighborhood landscape, and thus provide promising mutation directions with an adjustable step size and a balance between local and global search [39].
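A compact sketch of the DE/best/2 mutant of Equation (4), operating on a harmony memory stored as a NumPy matrix with one row per harmony; the helper name and layout are ours.

```python
import numpy as np

def de_best_2(hm, fit, F, rng):
    """Build a DE/best/2 mutant (Equation (4)) from harmony memory hm."""
    best = hm[np.argmin(fit)]
    # Four mutually different row indexes, drawn without replacement.
    r1, r2, r3, r4 = rng.choice(len(hm), size=4, replace=False)
    return best + F * ((hm[r1] - hm[r2]) + (hm[r3] - hm[r4]))
```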

3.2. Linear Population Size Reduction

In the early search stage of HS, the algorithm tends to explore the search space with the assistance of good population diversity, from which it can construct fine-tuning directions during the iterative process. In the later stage, the population usually focuses on neighborhood search, so exploitation should attract most of the computing resources. Inspired by the improved Success-History based parameter Adaptation for Differential Evolution (SHADE) [40], a monotonically decreasing population size strategy with respect to the number of function evaluations is utilized. It is given as follows:
HMS = round( HMS_max − ((HMS_max − HMS_min) / MAX_NFE) × NFE )    (5)

where HMS_max and HMS_min are the maximum and minimum values of the harmony memory size (HMS), respectively; MAX_NFE is the maximum number of function evaluations and NFE is the current number of function evaluations.
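In code, the schedule of Equation (5) and the accompanying memory truncation can be sketched as follows (the sorting-and-dropping step matches Step 3(1) of Algorithm 4 below); the helper names are ours.

```python
import numpy as np

def target_hms(nfe, max_nfe, hms_max, hms_min=5):
    # Equation (5): HMS shrinks linearly from hms_max down to hms_min.
    return round(hms_max - (hms_max - hms_min) / max_nfe * nfe)

def shrink_memory(hm, fit, new_size):
    # When HMS decreases, keep only the new_size best harmonies.
    keep = np.argsort(fit)[:new_size]
    return hm[keep], fit[keep]
```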

3.3. Differential Mutation in the Pitch Adjustment Operator

The canonical harmony search algorithm performs pitch adjustment with a constant distance bandwidth, which cannot adapt to the search landscape at different stages for different problems. Clearly, a proper bandwidth is important for the harmony search algorithm. In this paper, we present a general framework for the pitch adjustment operator based on the differential mutation (DE/best/2) [41], which provides a more effective search direction than a constant bandwidth. It is given in Equation (6):

x_i^new = x_i^best + F · [(x_i^r1 − x_i^r2) + (x_i^r3 − x_i^r4)] + rand(−1,1) × bw    (6)

where i is the component index from {1, 2, 3, ..., DIM}; r1, r2, r3, r4 are mutually exclusive harmony indexes selected randomly from {1, 2, ..., HMS}; rand(−1,1) is a uniformly distributed random number between −1 and 1; DIM is the dimension of the decision variables; x^new is the newly generated harmony vector and x^best is the current best harmony vector in the harmony memory.
If the new solution falls outside the bounds [L_i, U_i], it is repaired as follows:

x_i^new = L_i, if x_i^new < L_i;   x_i^new = U_i, if x_i^new > U_i    (7)

where U_i and L_i denote the upper and lower bounds of the i-th component of the decision vector.
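Equations (6) and (7) together give the new pitch adjustment with bound repair. A vectorized sketch follows; it applies the operator to all components at once, whereas Algorithm 4 applies it per component, and the helper name is ours.

```python
import numpy as np

def pitch_adjust(hm, fit, F, bw, lower, upper, rng):
    """DE-based pitch adjustment (Equation (6)) with bound repair (Equation (7))."""
    best = hm[np.argmin(fit)]
    r1, r2, r3, r4 = rng.choice(len(hm), size=4, replace=False)
    new = (best + F * ((hm[r1] - hm[r2]) + (hm[r3] - hm[r4]))
           + rng.uniform(-1.0, 1.0, size=best.shape) * bw)
    # Equation (7): clamp out-of-bound components to the violated bound.
    return np.clip(new, lower, upper)
```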

3.4. Self-Adaptive PAR and F

Inspired by the learning period concept of SGHS [4], which adaptively tunes the parameters HMCR and PAR, this paper employs a new modified scheme for PAR and F.
Here, the parameters PAR and F are dynamically adapted to a suitable range by recording the historical values associated with generated harmonies entering the harmony memory: during the evolution, the values of PAR and F of each generated harmony that replaces the worst member of the harmony memory are recorded.
First, the means of PAR (PARm) and F (Fm) are both initialized to 0.5. Second, the parameters PAR and F are generated from a normal distribution. During the generations, the values of PAR and F are recorded whenever the generated harmony successfully replaces the worst member of the harmony memory. After each learning period (LP), the parameters PARm and Fm are recalculated with the weighted Lehmer mean formulas [42]. The weighted Lehmer mean mean_wl(S) is computed with the deterministic Equations (8)–(10); the fitness difference Δf_k is used to weight the parameter adaptation.
mean_wl(S) = ( Σ_{k=1}^{|S|} w_k · S_k^2 ) / ( Σ_{k=1}^{|S|} w_k · S_k )    (8)

w_k = Δf_k / ( Σ_{l=1}^{|S|} Δf_l )    (9)

Δf_k = | f(x^new) − f(x^worst) |    (10)

where S is either S_PAR or S_F, and mean_wl(S) gives the new value of PARm or Fm; x^new is the newly generated solution in the current generation; x^worst is the worst solution in the harmony memory; f(·) denotes the fitness function.
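Equations (8)–(10) amount to a few lines of code; a minimal sketch, assuming s holds the recorded successful PAR (or F) values and delta_f the matching fitness improvements:

```python
def weighted_lehmer_mean(s, delta_f):
    """Weighted Lehmer mean of Equations (8)-(10)."""
    total = sum(delta_f)
    w = [d / total for d in delta_f]                     # Equation (9)
    return (sum(wk * sk ** 2 for wk, sk in zip(w, s))
            / sum(wk * sk for wk, sk in zip(w, s)))      # Equation (8)
```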
The parameters PARm and Fm are updated within the framework of Algorithm 3, where the generation counter lp is initialized to 1. The difference between the weighted Lehmer mean and the arithmetic mean is as follows: with the arithmetic mean, all recorded successful parameter values of PAR or F receive the same weight, whereas with the weighted Lehmer mean of Equations (8)–(10) they receive self-adaptive weights based on their fitness improvements. The weighted Lehmer mean is therefore likely to outperform the arithmetic mean statistically; a detailed analysis is omitted here due to length restrictions and can be found in reference [42].
Algorithm 3: Parameter updating of the means of PAR (PARm) and F (Fm).
if lp > LP then
    PARm = mean_wl(S_PAR)
    Fm = mean_wl(S_F)
    lp = 1
else
    lp = lp + 1
endif
In general, the values of PAR and F are regenerated with the following equation:

PAR = normrnd(PARm, 0.1),   F = normrnd(Fm, 0.1)    (11)

If PAR is larger than 1, it is truncated to 1; if PAR is less than or equal to 0, it is assigned 0.001. The same rule is applied to F.
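Sampling and truncation can be sketched as below; normrnd in Equation (11) corresponds to drawing from a normal distribution, here via NumPy, and the helper name is ours.

```python
import numpy as np

def regenerate_par_f(par_m, f_m, rng):
    """Equation (11) plus the truncation rule described above."""
    def truncate(v):
        if v > 1.0:
            return 1.0      # values above 1 are truncated to 1
        if v <= 0.0:
            return 0.001    # non-positive values are assigned 0.001
        return v
    return truncate(rng.normal(par_m, 0.1)), truncate(rng.normal(f_m, 0.1))
```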

3.5. aHSDE Algorithm Framework

The aim of this paper is to provide beneficial strategies that improve the performance of HS from the improvisational perspective. Algorithm 4 shows the procedure of the aHSDE.
Algorithm 4: Framework of the new adaptive harmony search algorithm (aHSDE).
1: // Initialize the problem and parameters //
    f(x): objective function
    HMS_max: the maximum value of the harmony memory size
    HMS_min: the minimum value of the harmony memory size
    HMCR: harmony memory considering rate
    PARm: the mean of the pitch adjusting rate
    Fm: the mean of the scaling factor
    bw: bandwidth
    LP: learning period
    MAX_NFE: maximum number of function evaluations
    L, U: the lower and upper bounds of the decision vector
2: // Initialize the harmony memory (x^1, x^2, ..., x^HMS) //
    x_i^j = L_i + rand(0,1) · (U_i − L_i), j = 1, 2, ..., HMS; i = 1, 2, ..., DIM
3: // Improvise a new harmony x^new = (x_1^new, x_2^new, ..., x_DIM^new) //
(1) Update HMS with Equation (5). If HMS decreases, the solutions in HM are sorted according to their fitness values and the worst one is deleted.
(2) Generate a new solution as follows:

for each i ∈ [1, DIM] do
    if (rand(0,1) < HMCR) then
        x_i^new = x_i^best
        if (rand(0,1) < PAR) then
            x_i^new = x_i^best + F · [(x_i^r1 − x_i^r2) + (x_i^r3 − x_i^r4)] + rand(−1,1) × bw
        endif
    else
        x_i^new = L_i + rand(0,1) × (U_i − L_i)
    endif
endfor
4: // Update the harmony memory //
    if f(x^new) < f(x^worst) = max_{j=1,...,HMS} f(x^j), then x^worst = x^new.
    Record the generated PAR and F and the fitness difference Δf_k.
5: // Check the stopping criterion //
    If the termination condition is met, stop and output the best individual.
    Otherwise, repeat from Step 3.
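To make the interplay of the strategies concrete, the following is a compact, self-contained Python sketch of Algorithm 4. It is our reading of the procedure, not the authors' code: the helper names are ours, the four difference indexes are drawn once per improvisation, and the PAR/F truncation is implemented as a simple clip to [0.001, 1].

```python
import numpy as np

def ahsde(f, dim, lower, upper, hmcr=0.99, bw=0.01, lp=100,
          hms_min=5, max_nfe=None, seed=0):
    """Self-contained sketch of the aHSDE (Algorithm 4)."""
    rng = np.random.default_rng(seed)
    max_nfe = max_nfe or dim * 10000
    hms_max = 18 * dim
    hm = lower + rng.random((hms_max, dim)) * (upper - lower)
    fit = np.array([f(x) for x in hm])
    nfe = hms_max
    par_m, f_m = 0.5, 0.5           # initial means of PAR and F
    s_par, s_f, s_df = [], [], []   # successful values and fitness gains
    count = 0                       # learning period counter (lp in Algorithm 3)
    while nfe < max_nfe:
        # Step 3(1): linear HMS reduction (Equation (5)); drop the worst harmonies.
        hms = round(hms_max - (hms_max - hms_min) / max_nfe * nfe)
        if hms < len(hm):
            keep = np.argsort(fit)[:hms]
            hm, fit = hm[keep], fit[keep]
        # Periodic learning (Algorithm 3): refresh PARm/Fm every LP improvisations.
        count += 1
        if count > lp:
            if s_df:
                w = np.array(s_df) / sum(s_df)                      # Eqs. (9)-(10)
                par_m = (w @ np.square(s_par)) / (w @ np.array(s_par))  # Eq. (8)
                f_m = (w @ np.square(s_f)) / (w @ np.array(s_f))
            s_par, s_f, s_df, count = [], [], [], 1
        par = float(np.clip(rng.normal(par_m, 0.1), 0.001, 1.0))   # Eq. (11)
        F = float(np.clip(rng.normal(f_m, 0.1), 0.001, 1.0))
        # Step 3(2): improvise a new harmony.
        best = hm[np.argmin(fit)]
        r1, r2, r3, r4 = rng.choice(len(hm), size=4, replace=False)
        new = np.empty(dim)
        for i in range(dim):
            if rng.random() < hmcr:
                new[i] = best[i]
                if rng.random() < par:
                    new[i] = (best[i]
                              + F * ((hm[r1, i] - hm[r2, i])
                                     + (hm[r3, i] - hm[r4, i]))
                              + rng.uniform(-1.0, 1.0) * bw)        # Eq. (6)
            else:
                new[i] = lower[i] + rng.random() * (upper[i] - lower[i])
        new = np.clip(new, lower, upper)                            # Eq. (7)
        # Step 4: update the memory and record successful parameters.
        f_new = f(new)
        nfe += 1
        worst = int(np.argmax(fit))
        if f_new < fit[worst]:
            s_par.append(par); s_f.append(F)
            s_df.append(abs(f_new - fit[worst]))
            hm[worst], fit[worst] = new, f_new
    return hm[np.argmin(fit)], float(fit.min())
```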

4. Experimental Comparison and Analysis

In this section, the proposed strategies and parameter adaptation schemes are first explained and analyzed empirically. Subsequently, the proposed aHSDE is compared with the classical HS and several state-of-the-art HS variants, namely IHS [3], SGHS [4] and IGHS [6]. It is also compared with other state-of-the-art evolutionary algorithms (non-harmony ones): Adaptive Particle Swarm Optimization (APSO) [43] and the Evolution Strategy with Covariance Matrix Adaptation (CMA-ES) [44].

4.1. Parameters and Benchmark Functions

This section evaluates the performance of the aHSDE on the CEC2014 benchmark suite [45], compared with the original HS, IHS, SGHS and IGHS. The CEC2014 benchmark suite consists of 30 test functions: three unimodal functions, 13 multimodal functions, six hybrid functions and eight composition functions. In particular, the hybrid functions (f17–f22) are very similar to real-world problems, such as transportation networks [46], circuit theory [47], image processing [48], capacitated arc routing problems [49] and flexible job-shop scheduling problems [50]. The search range of each function is [−100, 100]^DIM, where DIM is the dimension of the problem. The experiments are conducted in 10, 50 and 100 dimensions, and the maximum number of function evaluations is DIM × 10000. Each function is run 30 times, and the average results of these runs are recorded. When the difference between the best found solution and the optimal solution is below 1 × 10^−8, the error is recorded as 0.
For the compared HS variants, the parameter settings are taken from the respective original publications; they are summarized in Algorithm 5.
Algorithm 5: Parameter settings.
HS [1]: HMS = 5, HMCR = 0.9, PAR = 0.3, bw = 0.01
IHS [3]: HMS = 5, HMCR = 0.9, PAR_min = 0.01, PAR_max = 0.99, bw_min = 0.0001, bw_max = (x^U − x^L)/20
SGHS [4]: HMS = 5, HMCRm = 0.98, PARm = 0.9, LP = 100, bw_min = 0.0005, bw_max = (x^U − x^L)/10
IGHS [6]: HMS = 5, HMCR = 0.995, PAR = 0.4
aHSDE: HMS_min = 5, HMS_max = 18 × DIM, HMCR = 0.99, LP = 100, bw = 0.01

4.2. How HMS Changes

In order to analyze the impact of the harmony memory size HMS on the aHSDE, four functions, f1, f10, f21 and f28, are chosen from the four categories, respectively. In the experiments of Section 4.2, Section 4.3, Section 4.4 and Section 4.5, the dimension of the four functions is 30 and the statistical results are obtained from 30 independent runs. The minimum value of HMS is five, and the maximum value HMS_max is the largest integer no greater than rate × DIM, which ties it to the problem dimension. The rate varies from 0.5 to 25 in steps of 0.5. The best result in each case is recorded and shown in Figure 1.
As can be seen from Figure 1, the fitness of the four functions decreases roughly exponentially as the initial value of HMS grows from 0.5 × DIM to 25 × DIM. This indicates that the initial value of HMS has a great impact on the performance of the aHSDE. When the initial value is small, the fitness decreases rapidly; once the initial value reaches about 15 × DIM, the decreasing trend is no longer clear. Therefore, unless otherwise stated, the initial value of HMS is set to 18 × DIM in the following sections.

4.3. Effect of Differential Evolution Based Mutation

In this paper, the mutation strategy DE/best/2 is used to adjust the bandwidth and explore the landscape of the corresponding search sub-stages. The term ((x_i^r1 − x_i^r2) + (x_i^r3 − x_i^r4)), regarded as an Experience Operator (EO), indicates the possible maximum searching neighborhood over the generations; it adapts automatically as the population changes from diverse to converging. The same four functions, f1, f10, f21 and f28, are chosen to illustrate the behavior of the aHSDE over the generations. The changing trends of the EOs are shown in Figure 2.
It can be seen from Figure 2 that the EO gradually converges as the generations increase. In the early iterations, the search region of the algorithm is relatively large and the EO is correspondingly large, which strengthens global exploration. With the gradual shrinking of HMS, the algorithm conducts a progressively finer search, improving local exploitation in the later generations. The differential mutation strategy used for pitch adjustment can also provide a promising mechanism for escaping landscape valleys. Therefore, the aHSDE improves over the original HS and several HS variants, exploiting a valley with a small step size.

4.4. How PAR and F Change

In this paper, the learning period concept is adopted for the adjustment of PAR and F, which are recomputed with the weighted Lehmer mean. This aims to reduce the dependence on the parameters and enlarge the application scope of the algorithm. The values of PAR and F are recorded over 30 independent runs on the same four functions as in the previous sections. Figure 3 and Figure 4 illustrate the changing trends of the adaptive adjustment strategy for PAR and F.
As observed in Figure 3, PAR mostly varies between 0.7 and 0.95 in the early generations, which is rather large, and then settles around 0.1 in the later generations. This is as it should be: the aHSDE, as a population-based optimization algorithm, needs wide neighborhood-based global exploration with high probability in the early stage. Afterwards, the probability of pitch adjustment becomes smaller and smaller, improving the fine-tuning search and the convergence of the algorithm in the later generations. This adaptive modification strategy for PAR is thus inherently consistent with how the exploring step size of population-based optimization algorithms should vary, making it possible to keep a good balance between local exploitation and global exploration.
The overall trend of the scaling factor F on the four functions is opposite to that of PAR. F is small at the beginning of the iteration, at about 0.3, and becomes large in the later iterations, at around 0.9. It is worth noting that the initial value of F is 0.5, so it can be roughly inferred that the algorithm is not sensitive to the initial value of F. A possible reason why the four functions share similar trends is that the difference vectors guiding the improving direction in the DE operation are relatively large in the early generations, so a relatively small scaling factor F suits the search demand. On the contrary, in the later generations most solutions approximate the optimal solution and the difference vectors are relatively small, so a large scaling factor F is required. In addition, although the overall trend is similar for each function, the adaptive adjustment behavior of F still depends on the problem being solved: the curves showing how F changes for the four functions exhibit different variation patterns.
In this paper, the weighted Lehmer mean is used to adaptively tune PAR and the scaling factor F. It is a versatile and efficient automatic parameter tuner and has been highly successful in tuning search and optimization algorithms [42].

4.5. Combined Adaptability Consideration for PAR and F

In order to assess the effects of the parameters PAR and F, the same four functions are used to analyze the performance of the aHSDE under different parameter settings. The statistical results over 30 runs are shown in Table 1 and Tables S1–S3; the data represent the statistical results of multiple runs for different (PAR, F) combinations.
Table 1 shows that function f1 achieves its best result with the parameter pair (PAR, F) = (0.9, 0.4). Table S1 shows that f10 achieves its best result at (0.1, 0.6), Table S2 shows that f21 achieves its best result at (0.9, 0.3), and Table S3 shows that f28 achieves its best result at (0.7, 0.6). At the same time, it is easy to see that the performance of the algorithm varies greatly across parameter pairs. For example, the result varies from 1.35 × 10^7 with (PAR, F) = (0.8, 0.9) to 3.53 × 10^3 with (PAR, F) = (0.9, 0.4) for function f1, and from 8.66 × 10^5 with (PAR, F) = (0.1, 0.3) to 8.89 × 10 with (PAR, F) = (0.9, 0.3) for function f21. This tells us that different functions have different sensitivities to the parameters PAR and F.
The comparison of different (PAR, F) pairs shows that the performance of the algorithm is sensitive to the parameter pair (PAR, F) on different problems, and it demonstrates that some parameter adaptation scheme is necessary for problem solving. With the adaptive strategy, the aHSDE can obtain the best (PAR, F) pair and converge to the best solution, reducing the algorithm's dependence on the parameters.
Thus, Table 1 and Tables S1–S3 fully demonstrate the effect of the adaptive strategy. In conclusion, the aHSDE is highly successful with PAR and F tuned through the learning period and the weighted Lehmer mean method.

5. Experimental Comparison with HS Variants and Well-Known EAs

5.1. aHSDE vs. HS Variants

The experimental results of the five algorithms (HS, IHS, SGHS, IGHS and the aHSDE) are reported and compared in Tables S4–S6 for dimensions 10, 50 and 100, respectively. The items "Best", "Mean" and "SD" represent the best result, the average result and the standard deviation of the final results collected over 30 independent runs of each algorithm on each function. As before, the fitness error is set to zero if it is less than 1 × 10^−8.
It can be seen from Tables S4–S6 (which are provided as Supplementary data for space and readability reasons) that the aHSDE is significantly competitive with the canonical HS algorithm and several state-of-the-art HS variants. These data are the statistical results of 30 independent runs on the CEC 2014 benchmark for the 10-, 50- and 100-dimension cases. In Tables S4–S6, the aHSDE always performs best among its competitors on the unimodal functions f1–f3. Secondly, the performance advantage of the aHSDE over its competitors increases with the dimension on the multimodal functions f4–f16. As a concrete illustration of the performance difference, consider the mean results on Function 8: with dimension 10, three of the four compared variants obtain the true optimal solution. With dimension 50, the mean result of the aHSDE is 7.41 × 10^−8, whereas the best mean among the other algorithms is 2.52 × 10^−2. With dimension 100, the mean result of the aHSDE is 2.90 × 10^−6, whereas the best mean among the other algorithms is 1.74 × 10^0. This example indicates that the performance advantage of the aHSDE over its competitors becomes more and more obvious as the dimension increases.
Moreover, the performance of the aHSDE is also better than that of the other four HS variants on all the hybrid functions f17–f22, except for Function 19, where it is slightly worse at dimension 100. On the composition functions f23–f30, Tables S4–S6 indicate that the advantage of the aHSDE is not as obvious as on the previous groups: it performs only slightly better than its competitors. However, the overall statistics in Table 2 show that the aHSDE still wins the most cases on the composition functions f23–f30 for all dimensions. These statistical comparisons and analyses indicate that the improvement strategies of the aHSDE have a significant impact on performance and on its global exploration and local exploitation abilities.

5.2. Overall Statistical Comparison among HS Variants

Table 2 presents the overall statistical comparison between the aHSDE and its competitors based on the Wilcoxon rank-sum test with significance level α = 0.05, for each dimension, on all the benchmark functions. The symbols "+", "−" and "~" mean that the aHSDE performs significantly better than, significantly worse than, or not significantly different from the corresponding competitor. Overall, the performance of the aHSDE is quite competitive with the four HS variants on the CEC2014 benchmark.
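For reference, the per-function labels in Table 2 can be produced with SciPy's rank-sum test. A plausible sketch follows; the paper does not state how the significant side is decided, so the comparison of mean errors below is our assumption.

```python
from scipy.stats import ranksums
import numpy as np

def label(errors_ahsde, errors_rival, alpha=0.05):
    """Return '+', '-' or '~' from two samples of 30 final errors."""
    _, p = ranksums(errors_ahsde, errors_rival)
    if p >= alpha:
        return '~'   # no significant difference
    # Significant: the sign follows which algorithm has the lower mean error.
    return '+' if np.mean(errors_ahsde) < np.mean(errors_rival) else '-'
```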
The following facts can be observed from Table 2. For the three unimodal functions, the aHSDE outperforms all its competitors at 10, 50 and 100 dimensions. For the thirteen multimodal functions, the aHSDE performs a little better than HS at 10 dimensions and much better as the dimension increases, against all competitors and on almost all functions. For the six hybrid functions, the aHSDE clearly outperforms HS, IHS, SGHS and IGHS in all cases. These results illustrate that the aHSDE has a clear advantage over the state-of-the-art HS variants when solving varied optimization problems, including those without exploitable structure. For the eight composition functions, the aHSDE significantly outperforms HS, IHS, SGHS and IGHS at 10, 50 and 100 dimensions, except that it performs comparably to SGHS in the 50-dimensional case and to IGHS in the 10-dimensional case. The advantages are more obvious on higher-dimensional functions. As a whole, the aHSDE performs much better than the canonical HS algorithm and the HS variants on the 30 benchmark functions in all dimensional cases.

5.3. Comparison with Other Well-Known EAs

In this subsection, the proposed aHSDE algorithm is compared with other state-of-the-art evolutionary algorithms (non-harmony ones): Adaptive Particle Swarm Optimization (APSO) [43] and the Evolution Strategy with Covariance Matrix Adaptation (CMA-ES) [44]. The experimental results (mean best and standard deviation of multiple runs) of the different algorithms are all collected with a maximum of DIM × 10000 function evaluations and are summarized in Table 3. The best "mean" result for each function is highlighted in bold.
As observed in Table 3, APSO, CMA-ES and the aHSDE perform best on 7, 4 and 19 of the 30 benchmark functions, respectively. It should further be noted that APSO outperforms the aHSDE on eight problems and CMA-ES outperforms the aHSDE on six functions of this IEEE CEC 2014 benchmark suite. Generally speaking, therefore, the aHSDE significantly outperforms APSO and CMA-ES on most of the benchmark functions. However, it should be especially noted that APSO outperforms CMA-ES and the aHSDE on the composition functions, which indicates that APSO is promising for composition, or highly complex, problems. Comparatively speaking, the aHSDE has better overall performance across multiple types of problems.

6. Conclusions

Based on the analysis of HS and of the knowledge and experience of musicians, a new adaptive harmony search algorithm (aHSDE) for global optimization is proposed in this paper. It enhances the performance of HS with a differential mutation for the pitch adjustment, a mechanism that decreases HMS linearly, and parameter adaptation of PAR and F. First, the mutual influence and cooperation of the three strategies and the key parameters of the aHSDE are analyzed and verified in detail. Then, the performance of the aHSDE is comprehensively evaluated on the IEEE CEC 2014 benchmarks with 10, 50 and 100 dimensions. The experimental results indicate that the aHSDE outperforms the canonical HS algorithm and three advanced HS variants: IHS, SGHS and IGHS. Furthermore, other state-of-the-art metaheuristic algorithms, namely APSO and CMA-ES, are also used as competitors to evaluate the aHSDE.

Supplementary Materials

The following are available online at https://www.mdpi.com/2076-3417/10/8/2916/s1. Table S1: Fitness of f10 for different parameters (PAR, F); Table S2: Fitness of f21 for different parameters (PAR, F); Table S3: Fitness of f28 for different parameters (PAR, F); Table S4: Performance comparison among five harmony search algorithms for f1–f30 (DIM = 10); Table S5: Performance comparison among five harmony search algorithms for f1–f30 (DIM = 50); Table S6: Performance comparison among five harmony search algorithms for f1–f30 (DIM = 100).

Author Contributions

Conceptualization, X.Z.; methodology, X.Z., R.L.; investigation, J.H.; data curation, R.L.; writing—original draft preparation, Z.L.; writing—review and editing, J.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Beijing Natural Science Foundation, grant number 1202020, and the National Natural Science Foundation of China, grant numbers 61973042 and 71772060. The APC was funded by Xinchao Zhao.

Acknowledgments

We also express our sincere thanks to the Swarm Intelligence Research Team of BeiYou University.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

1. Geem, Z.W.; Kim, J.H.; Loganathan, G. A New Heuristic Optimization Algorithm: Harmony Search. Simulation 2001, 76, 60–68.
2. Geem, Z.W. Optimal cost design of water distribution networks using harmony search. Eng. Optim. 2006, 38, 259–277.
3. Mahdavi, M.; Fesanghary, M.; Damangir, E. An improved harmony search algorithm for solving optimization problems. Appl. Math. Comput. 2007, 188, 1567–1579.
4. Pan, Q.-K.; Suganthan, P.N.; Tasgetiren, M.F.; Liang, J. A self-adaptive global best harmony search algorithm for continuous optimization problems. Appl. Math. Comput. 2010, 216, 830–848.
5. Zhao, F.; Liu, Y.; Zhang, C.; Wang, J. A self-adaptive harmony PSO search algorithm and its performance analysis. Expert Syst. Appl. 2015, 42, 7436–7455.
6. Valian, E.; Tavakoli, S.; Mohanna, S. An intelligent global harmony search approach to continuous optimization problems. Appl. Math. Comput. 2014, 232, 670–684.
7. Luo, K.; Ma, J.; Zhao, Q. Enhanced self-adaptive global-best harmony search without any extra statistic and external archive. Inf. Sci. 2019, 482, 228–247.
8. Geem, Z.W.; Sim, K.-B. Parameter-setting-free harmony search algorithm. Appl. Math. Comput. 2010, 217, 3881–3889.
9. Ouyang, H.-B.; Gao, L.-Q.; Li, S.; Kong, X.-Y.; Wang, Q.; Zou, D.-X. Improved Harmony Search Algorithm: LHS. Appl. Soft Comput. 2017, 53, 133–167.
10. Assad, A.; Deep, K. A Hybrid Harmony search and Simulated Annealing algorithm for continuous optimization. Inf. Sci. 2018, 450, 246–266.
11. Zhu, Q.; Tang, X.; Li, Y.; Yeboah, M.O. An improved differential-based harmony search algorithm with linear dynamic domain. Knowl.-Based Syst. 2020, 187, 104809.
12. Zhang, T.; Geem, Z.W. Review of harmony search with respect to algorithm structure. Swarm Evol. Comput. 2019, 48, 31–43.
13. Saka, M.; Hasançebi, O.; Geem, Z.W. Metaheuristics in structural optimization and discussions on harmony search algorithm. Swarm Evol. Comput. 2016, 28, 88–97.
14. Geem, Z.W. Novel derivative of harmony search algorithm for discrete design variables. Appl. Math. Comput. 2008, 199, 223–230.
15. Couckuyt, I.; Deschrijver, D.; Dhaene, T. Fast calculation of multiobjective probability of improvement and expected improvement criteria for Pareto optimization. J. Glob. Optim. 2013, 60, 575–594.
16. Manjarrés, D.; Landa-Torres, I.; Gil-Lopez, S.; Del Ser, J.; Bilbao, M.; Salcedo-Sanz, S.; Geem, Z.W. A survey on applications of the harmony search algorithm. Eng. Appl. Artif. Intell. 2013, 26, 1818–1831.
17. Ertenlice, O.; Kalayci, C.B. A survey of swarm intelligence for portfolio optimization: Algorithms and applications. Swarm Evol. Comput. 2018, 39, 36–52.
18. Geem, Z.W.; Kim, J.; Loganathan, G. Harmony Search Optimization: Application to Pipe Network Design. Int. J. Model. Simul. 2002, 22, 125–133.
19. Zhao, X.; Liu, Z.; Hao, J.; Li, R.; Zuo, X. Semi-self-adaptive harmony search algorithm. Nat. Comput. 2017, 16, 619–636.
20. Yi, J.; Gao, L.; Li, X.; Shoemaker, C.A.; Lu, C. An on-line variable-fidelity surrogate-assisted harmony search algorithm with multi-level screening strategy for expensive engineering design optimization. Knowl.-Based Syst. 2019, 170, 1–19.
21. Vasebi, A.; Fesanghary, M.; Bathaee, S. Combined heat and power economic dispatch by harmony search algorithm. Int. J. Electr. Power Energy Syst. 2007, 29, 713–719.
22. Geem, Z.W. Optimal Scheduling of Multiple Dam System Using Harmony Search Algorithm. In Proceedings of the International Work-Conference on Artificial Neural Networks, San Sebastián, Spain, 20–22 June 2007; LNCS 4507; pp. 316–323.
23. Geem, Z.W. Harmony search optimization to the pump-included water distribution network design. Civ. Eng. Environ. Syst. 2009, 26, 211–221.
24. Geem, Z.W. Particle-swarm harmony search for water network design. Eng. Optim. 2009, 41, 297–311.
25. Lin, Q.; Chen, J. A novel micro-population immune multiobjective optimization algorithm. Comput. Oper. Res. 2013, 40, 1590–1601.
26. Manjarres, D.; Del Ser, J.; Gil-Lopez, S.; Vecchio, M.; Landa-Torres, I.; Lopez-Valcarce, R. A novel heuristic approach for distance- and connectivity-based multi-hop node localization in wireless sensor networks. Soft Comput. 2013, 17, 17–28.
27. Landa-Torres, I.; Gil-Lopez, S.; Del Ser, J.; Salcedo-Sanz, S.; Manjarres, D.; Portilla-Figueras, J.A. Efficient citywide planning of open WiFi access networks using novel grouping harmony search heuristics. Eng. Appl. Artif. Intell. 2013, 26, 1124–1130.
28. Peng, Z.-R.; Yin, H.; Dong, H.-T.; Li, H.; Pan, A. A Harmony Search Based Low-Delay and Low-Energy Wireless Sensor Network. Int. J. Future Gener. Commun. Netw. 2015, 8, 21–32.
29. Mohsen, A. A Robust Harmony Search Algorithm Based Markov Model for Node Deployment in Hybrid Wireless Sensor Networks. Int. J. Geomate 2016, 11.
30. Nikravan, M. Combining Harmony Search and Learning Automata for Topology Control in Wireless Sensor Networks. Int. J. Wirel. Mob. Netw. 2012, 4, 87–98.
31. Degertekin, S.O. Improved harmony search algorithms for sizing optimization of truss structures. Comput. Struct. 2012, 92, 229–241.
32. Kim, Y.-H.; Yoon, Y.; Geem, Z.W. A comparison study of harmony search and genetic algorithm for the max-cut problem. Swarm Evol. Comput. 2019, 44, 130–135.
33. Boryczka, U.; Szwarc, K. The Harmony Search algorithm with additional improvement of harmony memory for Asymmetric Traveling Salesman Problem. Expert Syst. Appl. 2019, 122, 43–53.
34. Seyedhosseini, S.M.; Esfahani, M.J.; Ghaffari, M. A novel hybrid algorithm based on a harmony search and artificial bee colony for solving a portfolio optimization problem using a mean-semi variance approach. J. Cent. South Univ. 2016, 23, 181–188.
35. Shams, M.; El-Banbi, A.; Sayyouh, H. Harmony search optimization applied to reservoir engineering assisted history matching. Pet. Explor. Dev. 2020, 47, 154–160.
36. Lee, K.S.; Geem, Z.W. A new structural optimization method based on the harmony search algorithm. Comput. Struct. 2004, 82, 781–798.
37. Price, K.; Storn, R.; Lampinen, J. Differential Evolution: A Practical Approach to Global Optimization; Springer Science & Business Media: Berlin, Germany, 2006.
38. Park, S.-Y.; Lee, J.-J. Stochastic Opposition-Based Learning Using a Beta Distribution in Differential Evolution. IEEE Trans. Cybern. 2015, 46, 2184–2194.
39. Qin, A.; Forbes, F. Harmony search with differential mutation based pitch adjustment. In Proceedings of the 13th Annual Conference on Genetic and Evolutionary Computation (GECCO '11), Dublin, Ireland, 12–16 July 2011; pp. 545–552.
40. Tanabe, R.; Fukunaga, A.S. Improving the search performance of SHADE using linear population size reduction. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, China, 6–11 July 2014; pp. 1658–1665.
41. Pant, M.; Zaheer, H.; Garcia-Hernandez, L.; Abraham, A. Differential Evolution: A review of more than two decades of research. Eng. Appl. Artif. Intell. 2020, 90, 103479.
42. Peng, F.; Tang, K.; Chen, G.; Yao, X. Multi-start JADE with knowledge transfer for numerical optimization. In Proceedings of the 2009 IEEE Congress on Evolutionary Computation, Trondheim, Norway, 18–21 May 2009; pp. 1889–1895.
43. Zhan, Z.-H.; Zhang, J.; Li, Y.; Chung, H.S.-H. Adaptive particle swarm optimization. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 2009, 39, 1362–1381.
44. Hansen, N.; Müller, S.D.; Koumoutsakos, P. Reducing the Time Complexity of the Derandomized Evolution Strategy with Covariance Matrix Adaptation (CMA-ES). Evol. Comput. 2003, 11, 1–18.
45. Liang, J.J.; Qu, B.Y.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for the CEC 2014 Special Session and Competition on Single Objective Real-Parameter Numerical Optimization; Technical Report; Computational Intelligence Laboratory, Zhengzhou University: Zhengzhou, China; Nanyang Technological University: Singapore, 2013.
46. Xie, F.; Butt, M.M.; Li, Z. A feasible flow-based iterative algorithm for the two-level hierarchical time minimization transportation problem. Comput. Oper. Res. 2017, 86, 124–139.
47. Singh, K.; Jain, A.; Mittal, A.; Yadav, V.; Singh, A.A.; Jain, A.K.; Gupta, M. Optimum transistor sizing of CMOS logic circuits using logical effort theory and evolutionary algorithms. Integration 2018, 60, 25–38.
48. Subashini, M.M.; Sahoo, S.K.; Sunil, V.; Easwaran, S. A non-invasive methodology for the grade identification of astrocytoma using image processing and artificial intelligence techniques. Expert Syst. Appl. 2016, 43, 186–196.
49. Shang, R.H.; Dai, K.Y.; Jiao, L.C.; Stolkin, R. Improved memetic algorithm based on route distance grouping for Multi-objective Large Scale Capacitated Arc Routing Problems. IEEE Trans. Cybern. 2016, 46, 1000–1013.
50. Li, J.-Q.; Pan, Q.-K.; Tasgetiren, M.F. A discrete artificial bee colony algorithm for the multi-objective flexible job-shop scheduling problem with maintenance activities. Appl. Math. Model. 2014, 38, 1111–1132.
Figure 1. Impact of Harmony Memory Size (HMS) with 30 runs on the aHSDE (f1, f10, f21, f28).
Figure 2. Change of Experience Operator on aHSDE (f1, f10, f21, f28).
Figure 3. How the pitch adjusting rate (PAR) changes on f1, f10, f21, f28.
Figure 4. How the scaling factor F changes on f1, f10, f21, f28.
Table 1. Fitness of f1 for different parameters (PAR, F). Rows give PAR, columns give F.

| PAR \ F | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 |
|---|---|---|---|---|---|---|---|---|---|
| 0.1 | 1.28 × 10^7 | 1.35 × 10^7 | 1.17 × 10^7 | 7.01 × 10^6 | 8.09 × 10^6 | 2.08 × 10^6 | 1.10 × 10^6 | 1.02 × 10^6 | 1.17 × 10^6 |
| 0.2 | 7.20 × 10^6 | 7.11 × 10^6 | 5.56 × 10^6 | 1.27 × 10^6 | 8.57 × 10^5 | 7.69 × 10^5 | 7.01 × 10^5 | 6.36 × 10^5 | 7.72 × 10^5 |
| 0.3 | 8.54 × 10^6 | 6.17 × 10^6 | 2.11 × 10^6 | 5.26 × 10^5 | 4.86 × 10^5 | 5.88 × 10^5 | 6.94 × 10^5 | 8.08 × 10^5 | 1.05 × 10^6 |
| 0.4 | 5.22 × 10^6 | 4.00 × 10^6 | 7.65 × 10^5 | 3.86 × 10^5 | 3.81 × 10^5 | 6.83 × 10^5 | 1.06 × 10^6 | 1.48 × 10^6 | 1.63 × 10^6 |
| 0.5 | 5.69 × 10^6 | 2.16 × 10^6 | 3.99 × 10^5 | 2.53 × 10^5 | 4.68 × 10^5 | 1.02 × 10^6 | 1.65 × 10^6 | 2.20 × 10^6 | 2.74 × 10^6 |
| 0.6 | 3.37 × 10^6 | 7.32 × 10^5 | 1.80 × 10^5 | 3.31 × 10^5 | 7.86 × 10^5 | 1.43 × 10^6 | 2.47 × 10^6 | 4.55 × 10^6 | 4.94 × 10^6 |
| 0.7 | 2.87 × 10^6 | 4.47 × 10^5 | 1.14 × 10^5 | 3.51 × 10^5 | 1.03 × 10^6 | 2.52 × 10^6 | 5.63 × 10^6 | 9.59 × 10^6 | 1.52 × 10^7 |
| 0.8 | 2.92 × 10^6 | 2.97 × 10^5 | 5.14 × 10^4 | 1.19 × 10^5 | 5.89 × 10^5 | 2.20 × 10^6 | 6.67 × 10^6 | 2.61 × 10^7 | 3.10 × 10^7 |
| 0.9 | 2.75 × 10^6 | 2.10 × 10^5 | 1.12 × 10^4 | 3.53 × 10^3 | 5.00 × 10^4 | 5.17 × 10^5 | 2.29 × 10^6 | 1.03 × 10^7 | 2.14 × 10^7 |
Table 2. Overall statistical comparison among the aHSDE and HS, improved harmony search (IHS), self-adaptive global-best harmony search (SGHS) and IGHS on CEC2014. Each cell gives the counts for DIM = 10/50/100.

| Groups | aHSDE vs. | HS | IHS | SGHS | IGHS |
|---|---|---|---|---|---|
| 3 Unimodal Functions | + | 3/3/3 | 3/2/3 | 3/2/3 | 3/3/3 |
| | − | 0/0/0 | 0/0/0 | 0/0/0 | 0/0/0 |
| | ~ | 0/0/0 | 0/1/0 | 0/1/0 | 0/0/0 |
| 13 Simple Multimodal Functions | + | 6/11/12 | 5/11/13 | 8/11/10 | 9/13/13 |
| | − | 4/1/1 | 6/1/0 | 1/0/0 | 0/0/0 |
| | ~ | 3/1/0 | 2/1/0 | 4/2/3 | 4/0/0 |
| 6 Hybrid Functions | + | 4/6/6 | 4/6/5 | 5/6/5 | 6/6/6 |
| | − | 0/0/0 | 1/0/0 | 0/0/0 | 0/0/0 |
| | ~ | 2/0/0 | 1/0/1 | 1/0/1 | 0/0/0 |
| 8 Composition Functions | + | 4/6/6 | 4/5/6 | 6/4/6 | 4/6/7 |
| | − | 2/1/2 | 1/2/2 | 1/3/2 | 3/0/0 |
| | ~ | 2/1/0 | 3/1/0 | 1/1/0 | 1/2/1 |
| 30 All Functions | + | 17/26/27 | 16/24/27 | 22/23/24 | 22/28/29 |
| | − | 6/2/3 | 8/3/2 | 2/3/2 | 3/0/0 |
| | ~ | 7/2/0 | 6/3/1 | 6/4/4 | 5/2/1 |
Table 3. Comparison on mean best and standard deviation of multiple runs of Adaptive Particle Swarm Optimization (APSO), Evolution Strategy with Covariance Matrix Adaptation (CMA-ES) and the aHSDE based on IEEE CEC 2014 benchmarks. Entries are mean ± standard deviation.

| Function | APSO | CMA-ES | aHSDE |
|---|---|---|---|
| f1 | 2.69 × 10^9 ± 3.28 × 10^8 | 9.42 × 10^4 ± 7.88 × 10^4 | 2.06 × 10^5 ± 9.12 × 10^4 |
| f2 | 1.02 × 10^11 ± 2.29 × 10^9 | 2.55 × 10 ± 3.85 × 10^9 | 5.77 × 10^3 ± 6.41 × 10^3 |
| f3 | 1.19 × 10^6 ± 1.25 × 10^6 | 1.45 × 10^4 ± 5.66 × 10^3 | 1.25 × 10^0 ± 1.61 × 10^0 |
| f4 | 2.49 × 10^4 ± 1.54 × 10^3 | 2.00 × 10 ± 2.63 × 10^−5 | 4.43 × 10 ± 3.69 × 10 |
| f5 | 2.13 × 10 ± 5.61 × 10^−2 | 2.08 × 10 ± 6.69 × 10^−2 | 2.01 × 10 ± 4.08 × 10^−2 |
| f6 | 4.80 × 10 ± 1.79 × 10^0 | 4.09 × 10^3 ± 2.13 × 10^0 | 2.02 × 10 ± 3.77 × 10^0 |
| f7 | 1.06 × 10^3 ± 3.85 × 10 | 2.31 × 10^2 ± 2.83 × 10 | 2.14 × 10^−3 ± 4.28 × 10^−3 |
| f8 | 5.03 × 10^2 ± 3.02 × 10 | 2.83 × 10^2 ± 2.21 × 10 | 7.41 × 10^−8 ± 1.94 × 10^−8 |
| f9 | 4.78 × 10^2 ± 6.30 × 10^0 | 3.28 × 10^2 ± 7.65 × 10 | 7.89 × 10 ± 1.80 × 10 |
| f10 | 9.30 × 10^3 ± 5.68 × 10^2 | 2.61 × 10^2 ± 1.06 × 10^2 | 1.94 × 10^−1 ± 4.50 × 10^−2 |
| f11 | 9.24 × 10^3 ± 4.86 × 10^2 | 1.69 × 10^2 ± 1.98 × 10^2 | 4.71 × 10^3 ± 5.61 × 10^2 |
| f12 | 5.91 × 10^0 ± 1.32 × 10^0 | 3.03 × 10^−1 ± 2.18 × 10^0 | 9.57 × 10^−2 ± 4.16 × 10^−2 |
| f13 | 1.03 × 10 ± 7.53 × 10^−1 | 5.51 × 10^0 ± 3.07 × 10^−1 | 3.31 × 10^−1 ± 6.32 × 10^−2 |
| f14 | 3.95 × 10^2 ± 2.22 × 10 | 7.53 × 10 ± 8.08 × 10^0 | 3.30 × 10^−1 ± 1.12 × 10^−1 |
| f15 | 1.05 × 10^6 ± 0.00 × 10^0 | 1.02 × 10^4 ± 3.24 × 10^4 | 8.09 × 10^0 ± 2.32 × 10^0 |
| f16 | 1.42 × 10 ± 2.37 × 10^−1 | 1.38 × 10 ± 5.31 × 10^−1 | 1.78 × 10 ± 1.00 × 10^0 |
| f17 | 2.86 × 10^8 ± 1.28 × 10^8 | 5.49 × 10^3 ± 3.62 × 10^3 | 2.66 × 10^3 ± 1.83 × 10^3 |
| f18 | 8.75 × 10^9 ± 3.11 × 10^9 | 1.52 × 10^9 ± 3.93 × 10^8 | 8.88 × 10 ± 3.11 × 10 |
| f19 | 8.45 × 10^2 ± 1.15 × 10^2 | 2.98 × 10^2 ± 4.25 × 10 | 1.16 × 10 ± 1.42 × 10^0 |
| f20 | 1.59 × 10^7 ± 1.37 × 10^7 | 4.61 × 10^3 ± 3.88 × 10^3 | 4.07 × 10 ± 9.82 × 10^0 |
| f21 | 1.33 × 10^8 ± 7.50 × 10^7 | 6.86 × 10^3 ± 2.76 × 10^3 | 8.35 × 10^2 ± 2.36 × 10^2 |
| f22 | 1.31 × 10^4 ± 9.38 × 10^3 | 1.61 × 10^3 ± 2.92 × 10^2 | 8.30 × 10^2 ± 2.96 × 10^2 |
| f23 | 2.00 × 10^2 ± 0.00 × 10^0 | 5.79 × 10^2 ± 4.94 × 10 | 3.44 × 10^2 ± 0.00 × 10^0 |
| f24 | 2.00 × 10^2 ± 0.00 × 10^0 | 2.12 × 10^2 ± 7.49 × 10^0 | 2.69 × 10^2 ± 6.50 × 10^0 |
| f25 | 2.00 × 10^2 ± 0.00 × 10^0 | 2.12 × 10^2 ± 2.97 × 10^0 | 2.07 × 10^2 ± 2.04 × 10^0 |
| f26 | 1.86 × 10^2 ± 2.68 × 10 | 1.25 × 10^−2 ± 5.51 × 10^−1 | 1.00 × 10^2 ± 6.01 × 10^−2 |
| f27 | 2.00 × 10^2 ± 0.00 × 10^0 | 1.07 × 10^3 ± 2.30 × 10^2 | 8.76 × 10^2 ± 1.26 × 10^2 |
| f28 | 2.00 × 10^2 ± 0.00 × 10^0 | 2.79 × 10^3 ± 5.92 × 10^2 | 1.28 × 10^3 ± 8.88 × 10 |
| f29 | 2.00 × 10^2 ± 0.00 × 10^0 | 3.52 × 10^4 ± 5.34 × 10^3 | 2.35 × 10^7 ± 1.69 × 10^7 |
| f30 | 2.00 × 10^2 ± 0.00 × 10^0 | 6.48 × 10^5 ± 1.31 × 10^5 | 8.93 × 10^3 ± 6.76 × 10^2 |
