Article

A Novel Integrated Heuristic Optimizer Using a Water Cycle Algorithm and Gravitational Search Algorithm for Optimization Problems

1
School of Science, Xi’an Polytechnic University, Xi’an 710048, China
2
School of Computer Science, Xi’an Polytechnic University, Xi’an 710048, China
*
Author to whom correspondence should be addressed.
Mathematics 2023, 11(8), 1880; https://doi.org/10.3390/math11081880
Submission received: 14 March 2023 / Revised: 10 April 2023 / Accepted: 12 April 2023 / Published: 15 April 2023
(This article belongs to the Special Issue AI Algorithm Design and Application)

Abstract

:
This paper presents a novel composite heuristic algorithm for global optimization that organically integrates the merits of the water cycle algorithm (WCA) and the gravitational search algorithm (GSA). To effectively reinforce the exploration and exploitation of the algorithm and reasonably balance them, a modified WCA is first put forward to strengthen its search performance by introducing the concept of the basin, in which the positions of the solutions are also taken into account when assigning the sea, the rivers and their streams, and the number of guiding solutions is adaptively reduced during the search process. Furthermore, the enhanced WCA is adaptively combined with the gravitational search to generate new solutions based on their historical performance within a certain stage. Moreover, a binomial crossover operation is incorporated after the water cycle search or the gravitational search to further improve the search capability of the algorithm. Finally, the performance of the proposed algorithm is evaluated by comparison with six excellent meta-heuristic algorithms on the IEEE CEC2014 test suite, and the numerical results indicate that the proposed algorithm is very competitive.

1. Introduction

With the development of science and technology, global optimization problems arise in almost all applications and have attracted great interest from researchers [1,2,3,4,5]. To deal with these real problems satisfactorily, many meta-heuristic algorithms have been presented, inspired by natural phenomena, animal behaviours, human activities and physical principles, such as particle swarm optimization (PSO) [6], the genetic algorithm (GA) [7], differential evolution (DE) [8], ant colony optimization [9], the fireworks algorithm (FA) [10], the joint operations algorithm [11], the squirrel search algorithm [12], the gaining–sharing knowledge-based algorithm [13], and so on. More nature-inspired approaches can be found in the literature [14]. These approaches have proven to be accurate, reasonably fast and robust optimizers, and have been applied in many fields, including pattern recognition [15], feature selection [16], image processing [17], multitask optimization [18], multimodal optimization [19], and data clustering [20]. However, they still do not guarantee convergence to the global optimum, especially for complicated problems [21,22]. Hence, it is necessary to develop more promising variants of them.
Since the performance of each heuristic optimization algorithm heavily depends on its balance between exploration and exploitation of the search space, many variants of the classical algorithms and hybrid algorithms have been developed by incorporating new mechanisms or combining with other approaches [23,24,25,26,27,28,29,30,31,32,33,34,35,36]. For example, Yan and Tian [25] presented an enhanced adaptive variant of the classical DE algorithm by making full use of the information of the current individual, its best neighbour and a randomly selected neighbour to predict the promising region around it. Li et al. [26] proposed a novel integrated DE algorithm by dividing the population into leader and adjoint sub-populations based on a leader–adjoint model and, respectively, adopting two mutation strategies for them. For the particle swarm optimization algorithm, to integrate the advantages of the update strategies from different PSO variants, Xu et al. [27] developed a strategy learning framework to learn an optimal combination of strategies. For the gravitational search algorithm, Sun et al. [23] improved its search performance by learning from a unique neighbourhood and the historically global best agent to avoid premature convergence and high runtime consumption. Moreover, by integrating the benefits of the genetic algorithm and particle swarm optimization, Yu et al. [24] developed a hybrid GAPSO algorithm and applied it to the integration of process planning and scheduling. To further improve the search ability of the fireworks optimization method, Zheng et al. [29] incorporated some differential evolution operators into its framework and proposed a hybrid fireworks optimization method. By properly introducing the search operations of differential evolution into the cultural algorithm and the particle swarm optimization algorithm, Awad et al. [30] and Zuo et al. [28], respectively, put forward novel hybrid algorithms. Furthermore, by incorporating a local pattern search method into a water cycle algorithm to enhance its convergence speed, Taib et al. [34] presented a hybrid water cycle algorithm for data clustering, while by introducing a reinforcement learning technique into the framework of the particle swarm optimization algorithm, Li et al. [36] proposed an enhanced PSO variant for global optimization problems, where a neighbourhood differential mutation operator is also employed to increase the diversity of the algorithm. Thereby, designing new strategies and combining different approaches or algorithms remains an effective and popular way to enhance the performance of an algorithm.
In the last decade, the gravitational search algorithm (GSA) [37] and the water cycle algorithm (WCA) [38] were proposed to solve optimization problems according to Newton's laws of gravity and motion and the water cycle phenomenon, respectively. In particular, in the GSA the particles in the population are considered as objects, their performance is evaluated as their mass, and they attract each other through gravitational force; in the WCA, the population is divided into streams, rivers and the sea according to their fitness values, and the rivers and the sea are regarded as leaders that guide other points towards better positions, thus avoiding searches in unpromising regions. Since their inception, they have attracted more and more attention from researchers and have been applied in different fields [39,40,41,42,43,44,45,46,47,48,49,50,51,52,53]. For instance, with respect to the water cycle algorithm, to strengthen its search performance, Sadollah et al. [39] further developed the concept of evaporation and incorporated evaporation procedures into the original WCA, while Heidari et al. [40] adopted a chaotic mapping to set its parameters. By considering and simulating the real-world percolation process, Qiao et al. [44] designed a percolation operator and then developed an improved WCA for clustering analysis. Meanwhile, by adding a solution mapping technique to the WCA, Osaba et al. [45] put forward a discrete WCA variant to solve the symmetric and asymmetric travelling salesman problems; moreover, by introducing two different surrogate modes, Chen et al. [52] proposed a novel water cycle algorithm for high-dimensional expensive optimization. Furthermore, in order to alleviate the low precision and slow convergence of the WCA, Ye et al. [53] presented an enhanced version of the WCA based on quadratic interpolation, where a new non-linear adjustment strategy is designed for the distance control parameter, mutation operations are performed probabilistically, and the quadratic interpolation operator is introduced to improve the local exploitation capability of the algorithm. More detailed works on the improvements and applications of the WCA can be found in [48]. On the other hand, for the gravitational search algorithm, by introducing chaotic mappings to control the gravitational constant, Mirjalili and Gandomi [41] presented a chaotic GSA version, whereas Lei et al. [43] proposed a hierarchical gravitational search algorithm by adaptively adjusting the gravitational constant and introducing a population topology. Meanwhile, to alleviate the stagnation of the GSA, Oshi [50] put forward an enhanced GSA variant by incorporating chaos-embedded opposition-based learning into the basic GSA and introducing a sine–cosine-based chaotic mapping to set the gravitational constant. More detailed descriptions of recent GSA variants can be found in [49]. Noticeably, even though the WCA and GSA variants have been proven more effective than their original versions, the GSA versions still easily fall into local optima or become very costly when the particles are guided only by superior agents or by too many agents, respectively, while the WCA variants have a poorer exploration ability, since the processes of the streams flowing to their corresponding rivers and subsequently to the sea are similar to the mutation in differential evolution [54]. Therefore, it is necessary to develop an improved version of them to effectively balance exploration and exploitation.
Based on the above discussions, this paper presents a novel hybrid heuristic algorithm (HMWCA) that combines the water cycle algorithm and the gravitational search algorithm. To effectively enhance the search capability of the algorithm and obtain a better balance between exploration and exploitation, a modified WCA was first developed by designing a crude niching technique, using it to assign the sea, rivers and their corresponding streams in the WCA, and dynamically adjusting the total number of rivers and the sea in an adaptive manner. Furthermore, the resulting water cycle search and gravitational search were organically integrated to explore or exploit the search space by making full use of their historical performance within certain iterations. Moreover, in order to further improve the search effectiveness of the algorithm, the binomial crossover operation was incorporated after the gravitational search or modified water cycle search to adjust the transmission of the search information. The HMWCA can therefore not only effectively balance exploration and exploitation, but also save computing resources during the search process. Finally, numerical experiments were carried out to evaluate the performance of the HMWCA by comparison with six excellent meta-heuristic algorithms on 30 CEC2014 benchmark functions [55] with different dimensions, and the experimental results showed that the proposed algorithm is very competitive. Furthermore, the HMWCA was also applied to spread-spectrum radar polyphase code design (SSRPPCD) [56].
For clarity, compared with the existing works, the main contributions and novelties of this paper are listed below.
(1)
A crude niching technique was designed and adopted to define the sea, rivers and their corresponding streams in the WCA, and a non-linear setting was introduced to adjust the total number of rivers and the sea during the search process. Thereby, a modified WCA is presented, which could possess a better balance between exploration and exploitation.
(2)
To further strengthen the adaptivity and robustness of the algorithm at various search stages and for various problems, the resulting water cycle search and gravitational search were further integrated according to their historical performance within certain iterations.
(3)
The binomial crossover operation was additionally introduced in the proposed algorithm when the gravitational search or modified water cycle search were executed. This might further promote control over the transmission of the search information.
Therefore, the proposed algorithm could have a more promising search performance.
The remainder of this paper is organized as follows. Section 2 gives brief descriptions of the classical gravitational search algorithm and water cycle algorithm. The proposed algorithm is presented in Section 3. Section 4 provides and discusses the experimental results on benchmark functions. Finally, conclusions are drawn in Section 5.

2. The Classical GSA and WCA

In this section, we briefly describe the classical gravitational search algorithm and water cycle algorithm.

2.1. Gravitational Search Algorithm

According to the law of gravity and mass interaction, Rashedi et al. [37] proposed a heuristic optimization algorithm, the GSA, for optimization problems. In the GSA, agents are considered as objects and their performance is measured by their masses. All objects attract each other through gravitational force, which leads to a global movement towards objects with heavier masses. For a minimization problem $\min\{f(X)\mid X=(x_1,x_2,\ldots,x_D)\in\Omega\}$, where $\Omega$ is the search space and $D$ is its dimension, the concrete steps of the GSA can be described as follows.
Consider a system with $N_{pop}$ objects, where $X_i=(x_i^1,x_i^2,\ldots,x_i^D)$ $(i=1,2,\ldots,N_{pop})$ is the position of the $i$th object and $x_i^d$ is its $d$-th dimension. The gravitational force from agent $j$ acting on agent $i$ at a specific time $t$ is defined as:
$$F_{ij}^d(t)=G(t)\,\frac{M_{pi}(t)\times M_{aj}(t)}{R_{ij}(t)+\varepsilon}\left[x_j^d(t)-x_i^d(t)\right],\tag{1}$$
where $M_{aj}(t)$ is the active gravitational mass related to agent $j$, $M_{pi}(t)$ is the passive gravitational mass related to agent $i$, $\varepsilon>0$ is a small constant, $R_{ij}(t)$ is the Euclidean distance between agents $i$ and $j$, and $G(t)$ is the gravitational constant at time $t$.
The total force acting on agent $i$ in the $d$-th dimension is a randomly weighted sum of the $d$-th components of the forces exerted by the other agents:
$$F_i^d(t)=\sum_{j=1,\,j\neq i}^{N_{pop}} rand_j\,F_{ij}^d(t),\tag{2}$$
where $rand_j$ is a random number in the interval $[0,1]$. Then, the acceleration $a_i^d(t)$ is given by
$$a_i^d(t)=\frac{F_i^d(t)}{M_{ii}(t)},\tag{3}$$
where $M_{ii}(t)$ is the mass of object $i$. The velocity and position of agent $i$ are, respectively, calculated by
$$v_i^d(t+1)=rand\times v_i^d(t)+a_i^d(t)\tag{4}$$
and
$$x_i^d(t+1)=x_i^d(t)+v_i^d(t+1),\tag{5}$$
where $d=1,2,\ldots,D$, and $rand\in[0,1]$ is a uniformly distributed random number.
Moreover, the masses of the objects are calculated by the following equations:
$$M_{aj}=M_{pi}=M_{ii}=M_i,\quad m_i(t)=\frac{fit_i(t)-worst(t)}{best(t)-worst(t)},\quad M_i(t)=\frac{m_i(t)}{\sum_{j=1}^{N_{pop}} m_j(t)},\tag{6}$$
where $i=1,2,\ldots,N_{pop}$, $fit_i(t)$ represents the fitness value of the $i$th object at time $t$, and $best(t)$ and $worst(t)$ are defined by:
$$best(t)=\min_{i\in\{1,\ldots,N_{pop}\}} fit_i(t),\quad worst(t)=\max_{i\in\{1,\ldots,N_{pop}\}} fit_i(t).\tag{7}$$
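To make the GSA update rules above concrete, the following minimal NumPy sketch performs one GSA iteration for a minimization problem. The function and variable names are illustrative, not from the paper, and every agent attracts every other agent, as in the classical formulation.

```python
import numpy as np

def gsa_step(X, fitness, V, G, eps=1e-10):
    """One GSA iteration for a minimization problem (illustrative sketch).

    X: (N, D) positions, fitness: (N,) objective values, V: (N, D) velocities,
    G: gravitational constant at the current iteration."""
    N, D = X.shape
    best, worst = fitness.min(), fitness.max()
    # Masses: better (smaller) fitness -> heavier normalized mass.
    m = (fitness - worst) / (best - worst + eps)
    M = m / m.sum()

    A = np.zeros((N, D))
    for i in range(N):
        F = np.zeros(D)
        for j in range(N):
            if j == i:
                continue
            diff = X[j] - X[i]
            R = np.linalg.norm(diff)
            # Randomly weighted pairwise force component.
            F += np.random.rand() * G * (M[i] * M[j]) / (R + eps) * diff
        A[i] = F / (M[i] + eps)  # acceleration = total force / inertial mass

    V = np.random.rand(N, D) * V + A   # velocity update
    X = X + V                          # position update
    return X, V
```

The double loop makes the O(N²) cost of the all-pairs attraction explicit; this is the cost the proposed algorithm later reduces by restricting attraction to a few superior objects.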

2.2. Water Cycle Algorithm

The water cycle algorithm (WCA) is a meta-heuristic optimizer inspired by the real-world water cycle and by how rivers and streams flow downhill towards the sea [38]. As in Section 2.1, a minimization problem $\min\{f(X)\mid X=(x_1,x_2,\ldots,x_D)\in\Omega\}$ is considered, where $\Omega$ is the search space and $D$ is its dimension. The main contents of the WCA can be described as follows.
In the WCA, after the initialization of the population, the best individual is chosen as the sea, and a number of the next-best individuals are chosen as rivers. Thus, the population can be written as
$$\mathrm{Total\;population}=\begin{bmatrix} Sea\\ River_1\\ River_2\\ \vdots\\ Stream_{N_{sr}+1}\\ Stream_{N_{sr}+2}\\ \vdots\\ Stream_{N_{pop}} \end{bmatrix}=\begin{bmatrix} x_1^1 & x_1^2 & \cdots & x_1^D\\ x_2^1 & x_2^2 & \cdots & x_2^D\\ \vdots & \vdots & \ddots & \vdots\\ x_{N_{pop}}^1 & x_{N_{pop}}^2 & \cdots & x_{N_{pop}}^D \end{bmatrix},\tag{8}$$
where $N_{pop}$ is the size of the population, while the total number $N_{sr}$ of rivers and the sea and the number of streams are computed by
$$N_{sr}=\mathrm{Number\ of\ Rivers}+\underbrace{1}_{sea},\tag{9}$$
and
$$N_{stream}=N_{pop}-N_{sr},\tag{10}$$
respectively. Moreover, the cost of a raindrop is calculated as follows:
$$Cost_i=f(X_i),\quad i=1,2,\ldots,N_{pop},\tag{11}$$
where $X_i=(x_i^1,x_i^2,\ldots,x_i^D)$. Then the corresponding flow intensities of the rivers and the sea are calculated by
$$C_n=Cost_n-Cost_{N_{sr}+1},\quad n=1,2,\ldots,N_{sr},\tag{12}$$
$$NS_n=\mathrm{round}\left(\left|\frac{C_n}{\sum_{i=1}^{N_{sr}}C_i}\right|\times N_{stream}\right),\quad n=1,2,\ldots,N_{sr},\tag{13}$$
where $NS_n$ is the number of streams flowing to a specific river or to the sea. During the search process, the streams flowing to the sea, the streams flowing to the rivers, and the rivers flowing to the sea are, respectively, updated by the following equations:
$$X_{stream}(t+1)=X_{stream}(t)+rand\times C\times\big(X_{sea}(t)-X_{stream}(t)\big),\tag{14}$$
$$X_{stream}(t+1)=X_{stream}(t)+rand\times C\times\big(X_{river}(t)-X_{stream}(t)\big),\tag{15}$$
$$X_{river}(t+1)=X_{river}(t)+rand\times C\times\big(X_{sea}(t)-X_{river}(t)\big),\tag{16}$$
where $t$ represents the current iteration number, and $1<C<2$ is a constant. In particular, when a new position is better than its corresponding river or sea, their positions are exchanged.
Furthermore, as one of the most important procedures in the WCA, evaporation is used to prevent premature convergence; its detailed procedure is as follows.
(a) If $\|X_{sea}-X_{river,i}\|<d_{max}$, $i=1,2,\ldots,N_{sr}-1$, then the raining process is executed for the corresponding streams by
$$X_{stream}^{new}=LB+rand\times(UB-LB),\tag{17}$$
where $LB$ and $UB$ are the lower and upper bounds of the search space, respectively.
(b) If the streams flowing to the sea satisfy the above evaporation condition, then let
$$X_{stream}^{new}=X_{sea}+\sqrt{\mu}\times randn(1,D),\tag{18}$$
where the coefficient $\mu$ represents a variance that defines the range of the search region around the best solution, $randn$ is a normally distributed random number, and $d_{max}$ is a small number close to zero that controls the search intensity. To intensify the search near the sea, $d_{max}$ adaptively decreases as follows:
$$d_{max}(t+1)=d_{max}(t)\times\left(1-\frac{1}{T}\right),\tag{19}$$
where $T$ is the maximum number of iterations.
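The movement and raining rules above can be sketched as follows. The function names are illustrative, and the $\sqrt{\mu}$ factor in the near-sea restart follows the variance interpretation given in the text.

```python
import numpy as np

def wca_updates(X_stream, X_river, X_sea, C=2.0):
    """Illustrative WCA movement rules: streams flow toward the sea or their
    river, and rivers flow toward the sea."""
    new_stream_to_sea   = X_stream + np.random.rand() * C * (X_sea   - X_stream)
    new_stream_to_river = X_stream + np.random.rand() * C * (X_river - X_stream)
    new_river           = X_river  + np.random.rand() * C * (X_sea   - X_river)
    return new_stream_to_sea, new_stream_to_river, new_river

def raining(X_sea, LB, UB, D, near_sea=False, mu=0.1):
    """Raining process after evaporation: restart a stream, either uniformly
    in the search box or normally around the sea (variance mu)."""
    if near_sea:
        return X_sea + np.sqrt(mu) * np.random.randn(D)
    return LB + np.random.rand(D) * (UB - LB)
```

Because $1<C<2$, a stream can overshoot its leader, which is what gives the WCA some local exploration around the rivers and the sea.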
Overall, even though the GSA and WCA perform promisingly on benchmark and real-world optimization problems [37,38], the WCA easily becomes trapped in local optima, since the streams and rivers always flow towards a few super individuals, while the GSA requires a large amount of computing resources, since each object is attracted by the sum of the forces exerted by all other objects. Thereby, it is necessary to develop a more promising optimizer.

3. Proposed Algorithm

As mentioned above, the GSA has a strong global search ability, while the WCA performs better in local searches. Consequently, we develop a novel hybrid algorithm (HMWCA) to balance exploration and exploitation by making full use of their advantages.

3.1. Modified WCA

Although the WCA has been broadly researched and successfully applied to various real-world optimization problems [33,34,35,36,38,39,40,44,45,46,47,48,52,53], it still easily falls into premature convergence, especially on complicated functions. One reason might be that the numbers of rivers, streams and the sea in the original WCA are fixed for the whole search process once they are formed at the start of the algorithm; such a fixed structure cannot adapt to changing search states. Moreover, only the fitness values of the population are taken into account when forming the sea, rivers and their streams, which might lead to premature convergence, since superior individuals are in general crowded together. To overcome these shortcomings, a modified WCA (MWCA) is proposed that simultaneously considers both the fitness values and the positions of the population when assigning the sea, rivers and streams, and adjusts the total number of rivers and the sea in an adaptive manner.
As pointed out in [57], niching techniques can divide the whole population into smaller niches according to both fitness value and distance, and maintain the diversity of the population during the search process. A special niching method, named clustering for speciation [57], was therefore considered and modified here to form the sea, rivers and their corresponding streams in the WCA. Accordingly, the flow intensity of the rivers and the sea is regarded as the cluster size in clustering for speciation.
To enhance the global search capability, the flow intensity for the rivers and the sea was set as follows:
$$NS_n=\mathrm{floor}\left(\frac{N_{stream}}{N_{sr}}\right),\quad n=2,3,\ldots,N_{sr},\tag{20}$$
and
$$NS_1=N_{stream}-\sum_{i=2}^{N_{sr}}NS_i,\tag{21}$$
where $\mathrm{floor}(A)$ returns the nearest integer less than or equal to $A$. For clarity, the detailed procedure of the modified niching approach for generating the sea, rivers and streams is shown in Algorithm 1.
Algorithm 1 Modified niching approach.
1: Input: The population $P$, the fitness values of $P$, and the total number of rivers and the sea $N_{sr}$.
2: Calculate the flow intensity $NS_n$ $(n=1,2,\ldots,N_{sr})$ for each river and the sea by Equations (20) and (21);
3: for $i \leftarrow 1$ to $N_{sr}$ do
4:    Find the best individual $P_{best}$ in $P$ as a new seed (sea or river) based on its fitness value, and delete it from $P$;
5:    Randomly select three indexes $d_1$, $d_2$ and $d_3$ among $\{1,2,\ldots,D\}$, and compute the Euclidean distance $dis_j$ $(j=1,2,\ldots,|P|)$ between each individual $X_j$ in $P$ and $P_{best}$ on the chosen three dimensions only, where $|A|$ denotes the size of set $A$;
6:    Sort the individuals in $P$ in ascending order of their Euclidean distances;
7:    Let the first $NS_i$ individuals be the streams flowing to the chosen sea or river $P_{best}$, and delete them from $P$;
8: end for
9: Output: The sea, rivers and their corresponding streams.
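A possible NumPy rendering of Algorithm 1 is sketched below; the array-based representation and helper names are assumptions, not the authors' code. It returns one index list per basin, with the seed (sea or river) first.

```python
import numpy as np

def modified_niching(P, fitness, Nsr):
    """Sketch of Algorithm 1: split population P into Nsr basins.

    P: (Npop, D) array; fitness: (Npop,) values (minimization).
    Returns a list of index lists, one per basin, seed first."""
    Npop, D = P.shape
    Nstream = Npop - Nsr
    # Flow intensities (Eqs. (20)-(21)): near-equal split, remainder to the sea.
    NS = [Nstream // Nsr] * Nsr
    NS[0] = Nstream - sum(NS[1:])

    remaining = list(np.argsort(fitness))  # ascending fitness: best first
    basins = []
    for i in range(Nsr):
        seed = remaining.pop(0)  # best remaining individual becomes sea/river
        dims = np.random.choice(D, size=min(3, D), replace=False)
        # Euclidean distance to the seed on three random dimensions only.
        dist = [np.linalg.norm(P[j, dims] - P[seed, dims]) for j in remaining]
        order = np.argsort(dist)[:NS[i]]
        members = [remaining[k] for k in order]
        basins.append([seed] + members)
        remaining = [r for r in remaining if r not in members]
    return basins
```

Measuring distance on only three random dimensions keeps the assignment cheap in high dimensions while still grouping streams with a nearby leader.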
Moreover, to enhance exploitation at the later search stage, the total number of rivers and the sea $N_{sr}$ is adjusted adaptively as follows:
$$N_{sr}=\max\left(\mathrm{round}\left(N_{sr}^{initial}\times\left(1-\frac{t}{T}\right)^{2}\right),\,1\right),\tag{22}$$
where $\mathrm{round}(A)$ returns the nearest integer to $A$, and $t$ and $T$ are the current iteration number and the maximum number of iterations, respectively. Obviously, $N_{sr}$ decreases gradually as the number of iterations increases. The initial total number of the sea and rivers, $N_{sr}^{initial}$, therefore plays an important role in the performance of the algorithm: too low a value increases the chance of becoming trapped in local optima, while too large a value might reduce the algorithm's ability to efficiently exploit each promising region. Thus, we let $N_{sr}^{initial}=5$; its suitability is shown in Section 4.1. More specifically, whenever the value of $N_{sr}$ changes, the sea, rivers and streams are reassigned by Algorithm 1.
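For illustration, the adaptive schedule of Equation (22), with the paper's default $N_{sr}^{initial}=5$, can be written as:

```python
def nsr_schedule(t, T, nsr_initial=5):
    """Adaptive number of rivers plus the sea (Eq. (22)):
    quadratic decay from nsr_initial down to 1."""
    return max(round(nsr_initial * (1 - t / T) ** 2), 1)
```

With $T=100$ the schedule starts at 5 and reaches 1 well before the final iteration, matching the intended shift from exploration (many basins) to exploitation (a single basin around the sea).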
By integrating the above changes into the classical WCA, a variant of the WCA (MWCA) was developed; its framework is described in Algorithm 2. Differing from the classical WCA [38], both the fitness values and the positions of individuals are considered when forming the sea, rivers and their streams (see Steps 4–7 in Algorithm 1), and the total number of rivers and the sea is dynamically reduced during the search process. Therefore, the WCA variant helps both to keep the diversity of the population and to enhance the convergence of the algorithm. Moreover, it is worth noting that the modified niching method is closer to a real river system, where each stream flows to its nearest river. Thereby, for convenience, a subpopulation consisting of the sea or a river and its corresponding streams is called a basin in the following.
Algorithm 2 The framework of the MWCA.
1: Input: The population size $N_{pop}$, the initial total number of rivers and the sea $N_{sr}^{initial}$, the predefined parameter $d_{max}$ and the maximum number of iterations $T$.
2: Initialize the population $P$ by randomly generating $N_{pop}$ individuals; calculate the fitness values (costs) of the population by Equation (11);
3: Set the current iteration $t=0$; let $N_{sr}=N_{sr}^{initial}$;
4: Calculate the flow intensity $NS_n$ $(n=1,2,\ldots,N_{sr})$ for each river and the sea by Equations (20) and (21);
5: Determine the sea, rivers and their corresponding streams by Algorithm 1;
6: while $t \le T$ do
7:    for $i \leftarrow 1$ to $N_{sr}$ do
8:       for $j \leftarrow 2$ to $NS_i$ do
9:          if $i==1$ then
10:             Update the position of each stream $X_j$ by Equation (14);
11:          else
12:             Update the position of each stream $X_j$ by Equation (15);
13:          end if
14:       end for
15:    end for
16:    for $i \leftarrow 1$ to $N_{sr}-1$ do
17:       Update the position of each river $X_{river,i}$ by Equation (16);
18:    end for
19:    Execute the evaporation operation as in the original WCA;
20:    Update $d_{max}$ by Equation (19) and the total number of rivers and the sea $N_{sr}$ by Equation (22);
21:    if $N_{sr}$ has changed then
22:       Determine the sea, rivers and their corresponding streams by Algorithm 1;
23:    end if
24:    $t=t+1$;
25: end while
26: Output: The sea (global optimum).

3.2. The Hybridization of MWCA and GSA

As described in the last subsection, the MWCA can maintain the diversity of the population and has a strong search capability. However, the sea, rivers and their streams are only updated by Equations (14)–(16), and these update rules are similar to the mutation in DE, which easily leads to premature convergence [40]. To overcome this shortcoming, a hybrid approach was designed that properly integrates the stronger exploration of the gravitational search to further enhance the search ability of the algorithm. The concrete procedures of this hybrid method are described in the following.
In order to effectively balance exploration and exploitation, the water cycle search and the gravitational search are combined and operated alternately to update the population, based on their historical performance within a fixed window of $T_S$ iterations. In detail, for each basin, the performance of the current operation is evaluated by its number of successes $Num_{suc}$ over the $T_S$ iterations. Meanwhile, similar to [58], if $Num_{suc}\ge 0.3T_S$, meaning that this operation performs well, it continues to be executed for this basin in the next stage; otherwise, the other operation is conducted. Clearly, $T_S$ is a key factor in the performance of the hybrid approach: a large $T_S$ cannot track the search stage in time, while a small $T_S$ cannot precisely measure the performance. Thus, we let $T_S=15$, shown to be a suitable choice by the experiments in Section 4.1.
Moreover, to save computing costs, unlike the traditional gravitational search algorithm [37], each object is only attracted by the $N_{best}$ superior objects measured by their fitness values. Furthermore, to effectively balance exploration and exploitation and to accelerate convergence at the later search stage, we let
$$N_{best}=\mathrm{round}\left(NS_i\times\left(1+e^{20\times\left(\frac{t}{T}-0.4\right)}\right)^{-1}\right)+1,\quad i=1,2,\ldots,N_{sr}.\tag{23}$$
Clearly, $N_{best}$ decreases gradually as the iterations proceed. Each object is thus attracted by almost all objects in the early search stage and by only a few superior ones in the later stage. This setting ensures both the exploration and the convergence of the algorithm at the different search stages.
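The attraction-size schedule of Equation (23) and the success-based operator switching described above can be sketched as follows; the function names and the operator encoding are illustrative.

```python
import math

def n_best(NS_i, t, T):
    """Number of attracting superior objects (Eq. (23)): sigmoidal decay
    from roughly NS_i + 1 early in the run down to 1 at the end."""
    return round(NS_i * (1 + math.exp(20 * (t / T - 0.4))) ** -1) + 1

def next_operator(current_op, num_suc, TS=15):
    """Success-based switching: keep the current operator (1 = water cycle
    search, 2 = gravitational search) if it improved the basin's best
    solution at least 0.3*TS times within the last TS iterations."""
    return current_op if num_suc >= 0.3 * TS else (2 if current_op == 1 else 1)
```

The steep factor 20 and the inflection point at $t/T=0.4$ make the transition from "attracted by almost all" to "attracted by a few" happen fairly abruptly around 40% of the budget.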
Noticeably, compared with the original WCA and GSA [37,38], the proposed hybrid approach can not only effectively balance exploration and exploitation, but also decrease the cost of computation.

3.3. The Crossover Operation

Considering that the update rules of the streams and rivers in the WCA are similar to the mutation in differential evolution, and that the mutation operator always works together with the crossover operation in DE, a crossover operation was added in the proposed method after the water cycle search or gravitational search.
To enhance the performance of the algorithm, the binomial crossover [54] was employed, and an adaptive method [59] was introduced to set the corresponding crossover rate. In particular, for a solution (river or stream) $X_i(t)$ at iteration $t$, when a new position $X_{tempt,i}(t)$ has been generated by the water cycle search or gravitational search, its new position $X_i(t+1)$ is created by
$$x_i^d(t+1)=\begin{cases} x_{tempt,i}^d(t), & \text{if } rand\le Cr_i(t)\ \text{or}\ d=randn(i),\\ x_i^d(t), & \text{otherwise,} \end{cases}\tag{24}$$
where $d=1,2,\ldots,D$, $Cr_i(t)\in[0,1]$ is the crossover rate, $x_{tempt,i}^d(t)$ denotes the $d$-th dimension of $X_{tempt,i}(t)$, and $randn(i)$ returns a random integer within $[1,D]$, which ensures that $X_i(t+1)$ receives at least one component from $X_{tempt,i}(t)$. Moreover, the value of $Cr_i(t)$ is obtained by
$$Cr_i(t)=randn(Cr_{mean}(t),0.1),\tag{25}$$
where $randn(Cr_{mean}(t),0.1)$ is a normal distribution with standard deviation $0.1$ and mean
$$Cr_{mean}(t)=\begin{cases} c\cdot Cr_{mean}(t-1)+(1-c)\cdot mean_{WA}(S_{Cr}(t-1)), & \text{if } S_{Cr}(t-1)\neq\emptyset,\\ Cr_{mean}(t-1), & \text{otherwise.} \end{cases}\tag{26}$$
Herein, $mean_{WA}(\cdot)$ is the weighted arithmetic mean [59], and $S_{Cr}(t-1)$ denotes the set of all successful $Cr$ values at iteration $t-1$. Furthermore, similar to [59], $c$ is set to $0.1$, and $Cr_{mean}$ is initialized to $0.5$.
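A compact sketch of this crossover machinery is given below. The plain arithmetic mean is a simplification of the weighted mean $mean_{WA}$ of [59], and all names are illustrative.

```python
import numpy as np

def binomial_crossover(x_old, x_temp, Cr):
    """Binomial crossover (Eq. (24)): take each dimension from the trial
    position with probability Cr; one dimension is always inherited."""
    D = x_old.size
    mask = np.random.rand(D) <= Cr
    mask[np.random.randint(D)] = True  # guarantee at least one trial component
    return np.where(mask, x_temp, x_old)

def update_cr_mean(cr_mean, successful_crs, c=0.1):
    """Adaptive mean crossover rate (Eq. (26)) in the spirit of [59];
    the weighted arithmetic mean is simplified to a plain mean here."""
    if successful_crs:
        return c * cr_mean + (1 - c) * float(np.mean(successful_crs))
    return cr_mean
```

Per-solution rates $Cr_i(t)$ would then be sampled from a normal distribution around the adapted mean, as in Equation (25).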
As described above, the binomial crossover operation and an adaptive parameter setting are incorporated into the proposed algorithm to update the information of all solutions during the search process. Thereby, this operation can further adjust the search ability of the algorithm and adaptively control the transmission of the search information.
Overall, by integrating the above-presented approaches, the framework of the proposed algorithm is described in Algorithm 3.
From Algorithm 3, we see that the population is first divided into $N_{sr}$ basins by the modified niching method. Next, the objects in each basin are updated by the hybrid approach and the binomial crossover, and the evaporation condition is checked. Finally, $N_{sr}$ is updated and the condition for forming new basins is checked; if it is satisfied, the population is divided again. Obviously, differing from the classical WCA [38], the sea, rivers and streams are formed by the modified niching method to enhance the diversity of the population, and the population is updated by the hybrid approach and the binomial crossover operation to balance exploration and exploitation. Therefore, the HMWCA has a promising ability to escape from local optima and find the global optimum.
Algorithm 3 The framework of the HMWCA.
1: Input: the population size N_pop, the initial total number of rivers and the sea N_sr^initial, the predefined parameter d_max, the maximum number of iterations T, and the iteration span TS.
2: Initialize the population P by randomly generating N_pop individuals; calculate the fitness values (costs) of the population by Equation (11); initialize the crossover parameter Cr_mean to 0.5;
3: Set the current iteration t = 0; let N_sr = N_sr^initial;
4: Calculate the intensity of flow NS_n (n = 1, 2, …, N_sr) for each river and the sea by Equations (20) and (21);
5: Determine the sea, rivers and their corresponding streams by Algorithm 1;
6: Set the success counter Num_suc,i for each basin B_i (i = 1, 2, …, N_sr) to 0;
7: Set the indicator ind_i of the adopted updating strategy for each basin B_i (i = 1, 2, …, N_sr) to 1;
8: while t ≤ T do
9:   for i ← 1 to N_sr do
10:     if ind_i == 1 then
11:       Generate the crossover rate for each stream in the current basin B_i by Equations (25) and (26);
12:       Update the positions of the streams in B_i as in Algorithm 2 and Equation (24);
13:     end if
14:     if ind_i == 2 then
15:       Compute N_best by Equation (23);
16:       Generate the crossover rate for each stream in B_i by Equations (25) and (26);
17:       Update the positions of the streams in B_i by Equations (5) and (24);
18:     end if
19:     if the best individual in B_i is updated then
20:       Num_suc,i = Num_suc,i + 1;
21:     end if
22:     Update the crossover parameter as in [59];
23:   end for
24:   if mod(t, TS) == 0 then
25:     for i ← 1 to N_sr do
26:       if Num_suc,i < 0.3·TS then
27:         ind_i = mod(ind_i, 2) + 1; ▹ switch to the other updating strategy
28:       end if
29:       Num_suc,i = 0;
30:     end for
31:   end if
32:   Execute the evaporation operation as in the original WCA;
33:   Update d_max by Equation (19) and the total number of rivers and the sea N_sr by Equation (22);
34:   if N_sr has changed then
35:     Determine the sea, rivers and their corresponding streams by Algorithm 1;
36:   end if
37:   t = t + 1;
38: end while
39: Output: the sea (the global optimum).
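The strategy-alternation rule (Algorithm 3, steps 24–31) can be sketched as follows. This is a hypothetical helper, assuming the indicator update is meant to swap strategy 1 (water cycle search) and strategy 2 (gravitational search) whenever a basin improved its best solution in fewer than 30% of the last TS iterations.

```python
def switch_strategies(indicators, successes, TS, threshold=0.3):
    """For each basin, toggle the updating strategy (1 <-> 2) if its best
    solution improved fewer than threshold*TS times over the last TS
    iterations, then reset the success counters."""
    for i in range(len(indicators)):
        if successes[i] < threshold * TS:
            indicators[i] = (indicators[i] % 2) + 1  # 1 -> 2, 2 -> 1
        successes[i] = 0
    return indicators, successes
```

Called once every TS iterations, this keeps a basin on its current strategy as long as it keeps improving and switches it otherwise.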

3.4. Complexity Analysis

In this subsection, the complexity of the HMWCA is further analysed and presented. Obviously, the main differences between the HMWCA and the original WCA [38] are the modified niching method, the hybridization of the water cycle search and gravitational search, and the binomial crossover operation. According to [38], the complexity of the original WCA is O ( N p o p · D · T + log 2 N p o p ) .
For the modified niching method, the population needs to be sorted by fitness, and the distances between the sea, rivers and the population need to be calculated. Moreover, note that the modified niching method is performed only N_sr^initial times during the whole search process, so the complexities of these two steps are O(N_pop · log2 N_pop · N_sr^initial) and O((N_sr^initial + 1)/2 · N_pop · D), respectively. Therefore, the complexity of the modified niching method is O(N_pop · (D · (N_sr^initial + 1)/2 + log2 N_pop · N_sr^initial)). Meanwhile, with respect to the hybridization of the water cycle search and gravitational search, according to [37,38], the complexities of the water cycle search and the gravitational search are O(N_pop · D · T) and O(N_pop^2 · D · T), respectively, and the complexity of the alternation criterion between them is O(N_pop · T). Thus, the complexity of the hybridization is O(N_pop · T · (D · N_pop + 1)). Moreover, for the binomial crossover operation, according to [60], its complexity is O(N_pop · D · T).
Overall, the total complexity of the HMWCA is O ( N p o p · ( D · ( N s r i n i t i a l + 1 ) / 2 + log 2 N p o p · N s r i n i t i a l + T · ( D · N p o p + 1 + D ) ) ) , which can be simplified as O ( N p o p 2 · D · T ) . Note that the gravitational search is operated in alternation during the search process in the HMWCA, and the complexity of the original GSA is O ( N p o p 2 · D · T ) . Therefore, the HMWCA has acceptable complexity.

4. Numerical Experiments

In this section, the performance of the proposed HMWCA is evaluated on 30 well-known benchmark functions f1–f30 from the CEC2014 suite with both D = 30 and D = 50 [55]. Meanwhile, the sensitivity of the parameters in the HMWCA is discussed on eight typical functions: the unimodal functions f1 and f3, the simple multimodal functions f4 and f9, the hybrid functions f17 and f21, and the composition functions f27 and f29. Moreover, four variants of the HMWCA are designed and compared to show the effects of the proposed components. Furthermore, a comparison of the HMWCA with six well-known optimization algorithms, including the original WCA [38], two competitive variants of the WCA (the ER_WCA [39] and CWCA [40]), the original GSA [37] and two other well-known heuristic algorithms (CLPSO [22] and HSOGA [61]), was conducted to assess the efficiency of the HMWCA.
In these experiments, the maximum number of function evaluations (FES_max) was set to 60,000 for all problems with both D = 30 and D = 50, and each problem was independently run 25 times. Moreover, for fair comparisons, the population size N_pop was set to 50 in the HMWCA, the same as in the original WCA and its two variants, the ER_WCA and CWCA.
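The reported statistics can be reproduced from the per-run error values |f(x_best) − f(x*)|. A minimal sketch (the function name is illustrative, and the sample standard deviation is assumed):

```python
import math

def summarize_runs(errors):
    """Mean Error and Std Dev over the independent runs; assumes at least
    two runs and uses the sample (n - 1) standard deviation."""
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / (n - 1)
    return mean, math.sqrt(var)
```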

4.1. The Sensitivities of Parameters N s r i n i t i a l and T S

In this subsection, the sensitivity of the parameters in the HMWCA is discussed through a series of tuning experiments. In particular, to obtain reasonable values of N_sr^initial and TS, the HMWCA was run on f1, f3, f4, f9, f17, f21, f27 and f29, varying N_sr^initial from 4 to 7 in steps of 1 and TS from 5 to 20 in steps of 5. Table 1 reports the experimental results, where "Mean Error" and "Std Dev" denote the average and standard deviation of the obtained error function values over 25 runs, respectively, and the best results are marked in bold for each function (the same as below).
From Table 1, we see that f1, f3, f4, f17 and f21 obtain the best values when TS = 15 and N_sr^0 = 5; f9 obtains the best values when N_sr^0 = 10 and TS = 10 or 15; f27 obtains the best values when TS = 10 and N_sr^0 = 8; and f29 obtains the best values when TS = 15 and N_sr^0 = 8. Moreover, it can also be seen that no best values are obtained when TS = 5 or 20, and only two when TS = 10, while the HMWCA obtains 4, 2 and 2 best values when N_sr^0 = 5, 8 and 10, respectively. Hence, the value of TS should be neither too small nor too large, and TS = 15 and N_sr^0 = 5 are reasonable choices for most functions. Therefore, we set TS = 15 and N_sr^0 = 5 in this paper.

4.2. The Effectiveness of the Proposed Components in the HMWCA

To clearly show the effects of the modified niching method, the hybridization of the water cycle search and gravitational search, and the binomial crossover operation on the performance of the HMWCA, we first designed four variants of the HMWCA as follows.
(1)
HMWCA1: the method in the original WCA [38] was used to form the sea, rivers and streams instead of the modified niching method in the HMWCA.
(2)
HMWCA2: the population was only updated by the water cycle search in the HMWCA during the whole search process.
(3)
HMWCA3: after the water cycle search or gravitational search at each generation, the binomial crossover operation is not further used in the HMWCA.
(4)
HMWCA4: the total number of rivers and the sea remains unchanged in the HMWCA during the search process. In particular, similar to the original WCA [38] and its variants [39,40], N_sr was set to 4 in the HMWCA.
Then, the HMWCA was compared with the above four variants on all problems from the CEC2014 suite with D = 30. The experimental results are listed in Table 2, and the rank of each algorithm on each problem was used to measure performance.
From Table 2, we see that for the unimodal functions f1–f3, the HMWCA obtains the best results for f1 while the HMWCA2 obtains the best results for f2 and f3, potentially because the gravitational search degrades the local search ability. For the simple multimodal functions f4–f16, the HMWCA obtains the best results, except that the HMWCA2 achieves the best results for f8 and the HMWCA1 for f14; in particular, the HMWCA ties for the best results on f5 with the HMWCA1, HMWCA2 and HMWCA3. For the hybrid functions f17–f22, the HMWCA has the best results, except that the HMWCA2 achieves the best results for f19 and f22. For the composition functions f23–f30, the HMWCA obtains the best values for f23, f26 and f30, the HMWCA2 for f25, f28 and f29, and the HMWCA3 for f24 and f27. Furthermore, the HMWCA1, HMWCA2, HMWCA3, HMWCA4 and HMWCA obtain total ranks of 75, 67, 107, 141 and 44 and average ranks of 2.5, 2.23, 3.57, 4.7 and 1.47 over all problems, respectively. Therefore, the modified niching method, the hybridization of the water cycle search and gravitational search, and the binomial crossover operation all help to enhance the search performance of the HMWCA.

4.3. Comparisons and Discussion

In this subsection, to evaluate the performance of the HMWCA, six well-known optimization algorithms, including the original WCA [38], two competitive variants of the WCA (the ER_WCA [39] and CWCA [40]), the original GSA [37] and two other well-known heuristic algorithms (CLPSO [22] and HSOGA [61]), were compared with the HMWCA on the CEC2014 problems with both D = 30 and D = 50. Among the compared algorithms, the ER_WCA [39] is an improved version of the WCA [38] that balances exploration and exploitation by defining a new evaporation rate for different rivers and streams. The CWCA [40] is a well-known variant of the WCA that incorporates chaotic patterns into the stochastic processes of the WCA. CLPSO [22] is an improved PSO that uses the personal historical best information of all particles to update particle velocities, and the HSOGA [61] is a hybrid self-adaptive orthogonal genetic algorithm with a self-adaptive orthogonal crossover operator.
In our experiments, to ensure fair comparisons, FES_max was set to 60,000 for all the above algorithms, and the population size of the HMWCA was set to 50, consistent with the traditional WCA, ER_WCA and CWCA. The other parameters involved are listed in Table 3. The parameters of the other three algorithms, namely the GSA, CLPSO and HSOGA, are the same as in their original papers.
Furthermore, the average (Mean Error) and standard deviation (Std Dev) of the function errors over 25 independent runs were recorded to measure performance, and Wilcoxon's rank sum tests (p < 0.05) together with the algorithm ranks were conducted on the experimental results to draw statistically sound conclusions. Table 4 and Table 5 report the experimental results for all problems with D = 30 and D = 50, respectively, where the best results are marked in bold for each function, and "+", "−" and "≈" denote that the HMWCA performed better than, worse than, or equivalently to the corresponding algorithm, respectively.
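The pairwise comparison can be sketched with the large-sample normal approximation of the Wilcoxon rank-sum test. The self-contained version below uses average ranks for ties but omits the tie correction to the variance, so a library routine (or an exact test for small samples) is preferable for the actual analysis.

```python
import math

def rank_sum_test(a, b):
    """Two-sided Wilcoxon rank-sum test (normal approximation).
    Returns (z, p): the standardized rank sum of sample a and the
    two-sided p-value."""
    n_a, n_b = len(a), len(b)
    pooled = sorted([(v, 0) for v in a] + [(v, 1) for v in b])
    ranks = [0.0] * (n_a + n_b)
    i = 0
    while i < len(pooled):                 # average ranks over tied values
        j = i
        while j < len(pooled) and pooled[j][0] == pooled[i][0]:
            j += 1
        for k in range(i, j):
            ranks[k] = (i + 1 + j) / 2.0   # mean of positions i+1 .. j
        i = j
    w = sum(r for r, (_, tag) in zip(ranks, pooled) if tag == 0)
    mu = n_a * (n_a + n_b + 1) / 2.0
    sigma = math.sqrt(n_a * n_b * (n_a + n_b + 1) / 12.0)
    z = (w - mu) / sigma
    # two-sided p-value via the standard normal CDF (math.erf)
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p
```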
From Table 4, we can see that for the unimodal functions f1–f3, the HMWCA obtains the best results for all functions except f1, where the ER_WCA is best; even there, the HMWCA ranks third and is significantly better than the CWCA, GSA, CLPSO and HSOGA. For the simple multimodal functions f4–f16, the HMWCA obtains the best results for most functions, including f4, f5, f7, f9–f11 and f14, while CLPSO attains the best results for f6, f8, f15 and f16, the GSA achieves the best results for f5, f12 and f13, and the CWCA has the best results for f15. For the hybrid functions f17–f22, the HMWCA obtains the best results for f18, f19 and f21, the ER_WCA for f17 and f20, and CLPSO for f22. For the composition functions f23–f30, the HMWCA obtains the best results for f26 and f28–f30, the GSA for f23–f25, and CLPSO for f27. In summary, according to the Wilcoxon's rank sum tests reported in Table 4, the HMWCA is better than the WCA, CWCA, ER_WCA, GSA, CLPSO and HSOGA on 29, 25, 27, 24, 23 and 29 test functions, respectively, equivalent on 0, 2, 0, 1, 0 and 0 test functions, respectively, and worse on 1, 3, 3, 5, 7 and 1 test functions, respectively. Moreover, the WCA, CWCA, ER_WCA, GSA, CLPSO, HSOGA and HMWCA obtain average ranks of 4.07, 3.97, 4.37, 4.77, 3.77, 5.27 and 1.7 over all functions, respectively.
For clarity, the evolution curves of the compared algorithms on ten functions, namely f1, f2, f3, f4, f9, f17, f19, f21, f28 and f30, are plotted in Figure 1 and Figure 2, and their results over 25 independent runs are compared in Figure 3 and Figure 4. From Figure 1, Figure 2, Figure 3 and Figure 4, we can see that the HMWCA has a more promising performance than the others. In detail, for the unimodal functions f1 and f2, the HMWCA has a better exploration ability than the WCA and ER_WCA, though worse than the GSA. Meanwhile, for the complex functions, the HMWCA performs better on most functions, especially f9, f19, f28 and f30. Therefore, the HMWCA delivers effective performance.
From Table 5, we can see that for the unimodal functions f1–f3, the HMWCA is significantly better than the CWCA, GSA, CLPSO and HSOGA, while the ER_WCA achieves the best results for all three functions; this might be because the gravitational search reduces the exploitation ability of the HMWCA. For the simple multimodal functions f4–f16, the HMWCA obtains the best results for f9, f11 and f14, the ER_WCA for f4 and f7, the CWCA, GSA and WCA for f6, f12 and f15, respectively, and CLPSO for f8, f10 and f13; in particular, the HMWCA and GSA tie for the best results on f5, and the CWCA and CLPSO on f16. For the hybrid functions f17–f22, the HMWCA obtains the best results for f18, f19 and f22, while the ER_WCA has the best results for f17, f20 and f21. For the composition functions f23–f30, the HMWCA obtains the best results for f28 and f30, the GSA for f23 and f29, the HSOGA for f24–f26, and the CWCA for f27. According to the statistical results of the Wilcoxon's rank sum tests reported in Table 5, the HMWCA is better than the WCA, CWCA, ER_WCA, GSA, CLPSO and HSOGA on 22, 24, 21, 24, 22 and 26 test functions, respectively, equivalent on 0, 0, 0, 1, 0 and 0 test functions, respectively, and worse on 8, 6, 9, 5, 8 and 4 test functions, respectively. Furthermore, in terms of average rank, the WCA, CWCA, ER_WCA, GSA, CLPSO, HSOGA and HMWCA obtain 3.7, 3.83, 3.33, 4.77, 4.03, 5.73 and 2.33 over all functions, respectively. Therefore, the HMWCA is the more promising optimizer.

4.4. Algorithm Efficiency

In this subsection, a comparison of the HMWCA with the WCA and GSA is conducted to show the running efficiency of the HMWCA on six typical functions: f1, f3, f4, f9, f17 and f21. The experiment was conducted in MATLAB 2015a on a personal computer (Pentium(R) Dual-Core CPU at 3.06 GHz with 4 GB of memory), and the average running time over 25 independent runs was recorded to evaluate efficiency. The maximum number of function evaluations FES_max was used as the termination condition and set to 60,000 for all algorithms. Table 6 reports the numerical results, where the time is the CPU time consumed when FES_max is reached.
From Table 6, we can see that the HMWCA is faster than the GSA and slower than the WCA. This is expected: the water cycle search and gravitational search are operated in alternation during the search process according to their performances, so the HMWCA costs more time than the WCA but less than the GSA. Thus, the HMWCA has an acceptable running cost.

4.5. Application

Herein, the HMWCA is further applied to solve a practical problem, the spread-spectrum radar polyphase code design (SSRPPCD) [56]. This problem aims to select an appropriate waveform and can be modelled as the following min–max non-linear non-convex optimization problem:

min_{x ∈ X} f(x) = max{ φ_1(x), …, φ_{2m}(x) },

where X = { (x_1, …, x_n) ∈ R^n | 0 ≤ x_j ≤ 2π for j = 1, …, n }, m = 2n − 1, φ_{m+i}(x) = −φ_i(x) for i = 1, …, m, and

φ_{2i−1}(x) = Σ_{j=i}^{n} cos( Σ_{k=|2i−j−1|+1}^{j} x_k ),  i = 1, …, n,
φ_{2i}(x) = 0.5 + Σ_{j=i+1}^{n} cos( Σ_{k=|2i−j|+1}^{j} x_k ),  i = 1, …, n − 1.
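For reference, the objective can be implemented directly. The sketch below assumes the standard CEC2011 formulation of the problem; x is 0-indexed and the function name is illustrative.

```python
import math

def ssrppcd(x):
    """SSRPPCD objective f(x) = max{phi_1(x), ..., phi_2m(x)} with
    m = 2n - 1 and phi_{m+i} = -phi_i.  The slice x[lo:j] realises the
    1-indexed inner sum over k = lo + 1, ..., j."""
    n = len(x)
    phis = []
    for i in range(1, n + 1):          # phi_{2i-1}, i = 1, ..., n
        phis.append(sum(math.cos(sum(x[abs(2 * i - j - 1):j]))
                        for j in range(i, n + 1)))
    for i in range(1, n):              # phi_{2i}, i = 1, ..., n-1
        phis.append(0.5 + sum(math.cos(sum(x[abs(2 * i - j):j]))
                              for j in range(i + 1, n + 1)))
    phis += [-p for p in phis]         # phi_{m+i} = -phi_i
    return max(phis)
```

At x = 0 every cosine equals 1, so f(0) = n, which gives a quick sanity check.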
Clearly, the objective function is piecewise smooth and has numerous local optima. Table 7 lists the numerical results of the HMWCA, WCA, CWCA and ER_WCA over 25 runs with n = 30 and FES_max = 60,000, where the best algorithm is marked in bold. From Table 7, one can see that the HMWCA achieves the best "Best", "Worst" and "Average" values, while it performs worse than the CWCA and ER_WCA in terms of the standard deviation. Therefore, the HMWCA is more promising for this problem.

5. Conclusions

To integrate the advantages of the WCA and GSA, this paper proposed the HMWCA, which hybridises a modified WCA with the GSA for global optimization. To effectively balance exploitation and exploration, a modified niching technique was first designed to assign the sea, rivers and their corresponding streams in the WCA, and the total number of rivers and the sea was adjusted adaptively during the search process. Moreover, the gravitational search and water cycle search were combined and operated in alternation to explore or exploit the search space according to their performances within certain iterations. Furthermore, to improve the search capability, the binomial crossover operation was employed after the water cycle search or gravitational search, with its parameter set by an adaptive method. Finally, the HMWCA was compared with six well-known optimization algorithms on the 30 IEEE CEC2014 benchmark functions with different dimensions and applied to the spread-spectrum radar polyphase code design (SSRPPCD). The experimental results show that the proposed algorithm is very competitive.
Further research should focus on applying the HMWCA to constrained and real-world optimization problems to further test its performance. Further work should also design other improved optimization approaches by hybridising the WCA with other heuristic algorithms.

Author Contributions

Conceptualization, M.T. and J.L.; methodology, M.T.; software, M.T.; validation, J.L., W.Y. and J.Z.; formal analysis, M.T.; investigation, M.T.; resources, W.Y.; data curation, J.Z.; writing—original draft preparation, M.T.; writing—review and editing, W.Y. and J.Z.; visualization, M.T.; supervision, M.T.; project administration, J.L. and J.Z.; funding acquisition, J.L. and J.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Natural Science Basic Research Program of the Shaanxi Province of China No. 2022JQ-624, the Fund of Science and Technology Department of Shaanxi Province No. 2023-JC-QN-0097, and the Scientific Research Startup Foundation of Xi’an Polytechnic University (BS202052).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets generated and analysed during the current study are available from the corresponding author on reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Sha, D.Y.; Hsu, C.Y. A new particle swarm optimization for the open shop scheduling problem. Comput. Oper. Res. 2008, 35, 3243–3261.
2. Zhang, B.; Pan, Q.K.; Meng, L.L.; Lu, C.; Mou, J.H.; Li, J.Q. An automatic multi-objective evolutionary algorithm for the hybrid flowshop scheduling problem with consistent sublots. Knowl.-Based Syst. 2022, 238, 107819.
3. Das, R.; Akay, B.; Singla, R.K.; Singh, K. Application of artificial bee colony algorithm for inverse modelling of a solar collector. Inverse Probl. Sci. Eng. 2017, 25, 887–908.
4. Omran, M.G.; Engelbrecht, A.P.; Salman, A.A. Differential evolution methods for unsupervised image classification. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2005), Edinburgh, UK, 2–4 September 2005.
5. Zhang, Z.; Han, Y. Discrete sparrow search algorithm for symmetric traveling salesman problem. Appl. Soft Comput. 2022, 118, 108469.
6. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN'95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995.
7. Holland, J.H. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence; University of Michigan Press: Ann Arbor, MI, USA, 1975.
8. Storn, R. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359.
9. Colorni, A. Distributed optimization by ant colonies. In Proceedings of the First European Conference on Artificial Life, Paris, France, 11–13 December 1991.
10. Tan, Y.; Zhu, Y. Fireworks algorithm for optimization. In Proceedings of the First International Conference on Swarm Intelligence (ICSI 2010), Beijing, China, 12–15 June 2010.
11. Sun, G.; Zhao, R.; Lan, Y. Joint operations algorithm for large-scale global optimization. Appl. Soft Comput. 2016, 38, 1025–1039.
12. Jain, M.; Singh, V.; Rani, A. A novel nature-inspired algorithm for optimization: Squirrel search algorithm. Swarm Evol. Comput. 2018, 44, 148–175.
13. Mohamed, A.W.; Hadi, A.A.; Mohamed, A.K. Gaining-sharing knowledge based algorithm for solving optimization problems. Int. J. Mach. Learn. Cybern. 2020, 11, 1501–1529.
14. Ma, Z.Q.; Wu, G.H.; Suganthan, P.N.; Song, A.J.; Luo, Q.Z. Performance assessment and exhaustive listing of 500+ nature-inspired metaheuristic algorithms. Swarm Evol. Comput. 2023, 77, 101248.
15. Ilonen, J.; Kamarainen, J.K.; Lampinen, J. Differential evolution training algorithm for feed-forward neural networks. Neural Process. Lett. 2003, 17, 93–105.
16. Bello, R.; Gomez, Y.; Nowe, A.; Garcia, M.M. Two-step particle swarm optimization to solve the feature selection problem. In Proceedings of the International Conference on Intelligent Systems Design and Applications, Rio de Janeiro, Brazil, 20–24 October 2007.
17. Cuevas, E.; Zaldivar, D.; Perez-Cisneros, M. A novel multi-threshold segmentation approach based on differential evolution optimization. Expert Syst. Appl. 2010, 37, 5265–5271.
18. Li, J.Y.; Zhan, Z.H.; Tan, K.C.; Zhang, J. A meta-knowledge transfer-based differential evolution for multitask optimization. IEEE Trans. Evol. Comput. 2022, 26, 719–734.
19. Liao, Z.W.; Mi, X.Y.; Pang, Q.S.; Sun, Y. History archive assisted niching differential evolution with variable neighborhood for multimodal optimization. Swarm Evol. Comput. 2023, 76, 101206.
20. Chen, J.X.; Gong, Y.J.; Chen, W.N.; Li, M.T.; Zhang, J. Elastic differential evolution for automatic data clustering. IEEE Trans. Cybern. 2021, 51, 4134–4147.
21. Hrstka, O.; Kučerová, A. Improvement of real coded genetic algorithm based on differential operators preventing premature convergence. Adv. Eng. Softw. 2004, 35, 237–246.
22. Liang, J.J.; Qin, A.K.; Suganthan, P.N.; Baskar, S. Comprehensive learning particle swarm optimizer for global optimization of multimodal functions. IEEE Trans. Evol. Comput. 2006, 10, 281–295.
23. Sun, G.; Zhang, A.; Wang, Z.; Yao, Y.; Ma, J.; Couples, G.D. Locally informed gravitational search algorithm. Knowl.-Based Syst. 2016, 104, 134–144.
24. Yu, M.; Zhang, Y.; Chen, K.; Zhang, D. Integration of process planning and scheduling using a hybrid GA/PSO algorithm. Int. J. Adv. Manuf. Technol. 2015, 78, 583–592.
25. Yan, X.Q.; Tian, M.N. Differential evolution with two-level adaptive mechanism for numerical optimization. Knowl.-Based Syst. 2022, 241, 108209.
26. Li, Y.Z.; Wang, S.H.; Yang, H.Y.; Chen, H.; Yang, B. Enhancing differential evolution algorithm using leader-adjoint populations. Inf. Sci. 2023, 622, 235–268.
27. Xu, H.Q.; Gu, S.; Fan, Y.C.; Li, X.S.; Zhao, Y.F.; Zhao, J.; Wang, J.J. A strategy learning framework for particle swarm optimization algorithm. Inf. Sci. 2023, 619, 126–152.
28. Zuo, X.; Li, X. A DE and PSO based hybrid algorithm for dynamic optimization problems. Soft Comput. 2014, 18, 1405–1424.
29. Zheng, Y.J.; Xu, X.L.; Ling, H.F.; Chen, S.Y. A hybrid fireworks optimization method with differential evolution operators. Neurocomputing 2015, 148, 75–82.
30. Awad, N.H.; Ali, M.Z.; Suganthan, P.N.; Reynolds, R.G. CADE: A hybridization of cultural algorithm and differential evolution for numerical optimization. Inf. Sci. 2017, 378, 215–241.
31. Lynn, N.; Suganthan, P.N. Ensemble particle swarm optimizer. Appl. Soft Comput. 2017, 55, 533–548.
32. Shehadeh, H.A. A hybrid sperm swarm optimization and gravitational search algorithm (HSSOGSA) for global optimization. Neural Comput. Appl. 2021, 33, 11739–11752.
33. Chen, H. Hierarchical learning water cycle algorithm. Appl. Soft Comput. 2020, 86, 105935.
34. Taib, H.; Bahreininejad, A. Data clustering using hybrid water cycle algorithm and a local pattern search method. Adv. Eng. Softw. 2021, 153, 102961.
35. Veeramani, C.; Senthil, S. An improved evaporation rate-water cycle algorithm based genetic algorithm for solving generalized ratio problems. RAIRO-Oper. Res. 2020, 55, S461–S480.
36. Li, W.; Liang, P.; Sun, B.; Sun, Y.; Huang, Y. Reinforcement learning-based particle swarm optimization with neighborhood differential mutation strategy. Swarm Evol. Comput. 2023, 78, 101274.
37. Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248.
38. Eskandar, H.; Sadollah, A.; Bahreininejad, A.; Hamdi, M. Water cycle algorithm—A novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput. Struct. 2012, 110, 151–166.
39. Sadollah, A.; Eskandar, H.; Bahreininejad, A.; Kim, J.H. Water cycle algorithm with evaporation rate for solving constrained and unconstrained optimization problems. Appl. Soft Comput. 2015, 30, 58–71.
40. Heidari, A.A.; Abbaspour, R.A.; Jordehi, A.R. An efficient chaotic water cycle algorithm for optimization tasks. Neural Comput. Appl. 2017, 28, 57–85.
41. Mirjalili, S.; Gandomi, A.H. Chaotic gravitational constants for the gravitational search algorithm. Appl. Soft Comput. 2017, 53, 407–419.
42. Wang, H. A hierarchical gravitational search algorithm with an effective gravitational constant. Swarm Evol. Comput. 2019, 46, 118–139.
43. Lei, Z.; Gao, S.; Gupta, S.; Cheng, J.; Yang, G. An aggregative learning gravitational search algorithm with self-adaptive gravitational constants. Expert Syst. Appl. 2020, 152, 113396.
44. Qiao, Y. A simple water cycle algorithm with percolation operator for clustering analysis. Soft Comput. 2019, 23, 4081–4095.
45. Osaba, E.; Del Ser, J.; Sadollah, A.; Nekane, B.M. A discrete water cycle algorithm for solving the symmetric and asymmetric traveling salesman problem. Appl. Soft Comput. 2018, 71, 277–290.
46. Kudkelwar, S. An application of evaporation-rate-based water cycle algorithm for coordination of over-current relays in microgrid. Sadhana Acad. Proc. Eng. Sci. 2020, 45, 237.
47. Wang, J.; Zhang, H.; Luo, H. Research on the construction of stock portfolios based on multiobjective water cycle algorithm and KMV algorithm. Appl. Soft Comput. 2022, 115, 108186.
48. Nasir, M.; Sadollah, A.; Choi, Y.H.; Kim, J.H. A comprehensive review on water cycle algorithm and its applications. Neural Comput. Appl. 2020, 32, 17433–17488.
49. Mittal, H.; Tripathi, A.; Pandey, A.C.; Pal, R. Gravitational search algorithm: A comprehensive analysis of recent variants. Multimed. Tools Appl. 2021, 80, 7581–7608.
50. Joshi, S.K. Chaos embedded opposition based learning for gravitational search algorithm. Appl. Intell. 2023, 53, 5567–5586.
51. Wang, Y.R.; Gao, S.C.; Yu, Y.; Cai, Z.H.; Wang, Z.Q. A gravitational search algorithm with hierarchy and distributed framework. Knowl.-Based Syst. 2021, 218, 106877.
52. Chen, C.H.; Wang, X.J.; Dong, H.C.; Wang, P. Surrogate-assisted hierarchical learning water cycle algorithm for high-dimensional expensive optimization. Swarm Evol. Comput. 2022, 75, 101169.
53. Ye, J.; Xie, L.; Wang, H. A water cycle algorithm based on quadratic interpolation for high-dimensional global optimization problems. Appl. Intell. 2023, 53, 2825–2849.
54. Price, K.V.; Storn, R.M.; Lampinen, J.A. Differential Evolution: A Practical Approach to Global Optimization; Springer: Berlin/Heidelberg, Germany, 2005.
55. Liang, J.J.; Qu, B.Y.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for the CEC 2014 Special Session and Competition on Single Objective Real-Parameter Numerical Optimization; Technical Report; Computational Intelligence Laboratory, Zhengzhou University: Zhengzhou, China; Nanyang Technological University: Singapore, 2013.
56. Das, S.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for CEC 2011 Competition on Testing Evolutionary Algorithms on Real World Optimization Problems; Jadavpur University: Kolkata, India; Nanyang Technological University: Singapore, 2011.
57. Sheng, W.; Swift, S.; Zhang, L.; Liu, X. A weighted sum validity function for clustering with a hybrid niching genetic algorithm. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 2005, 35, 1156–1167.
58. Halder, U.; Das, S.; Maity, D. A cluster-based differential evolution algorithm with external archive for optimization in dynamic environments. IEEE Trans. Cybern. 2013, 43, 881–897.
59. Zhang, J.Q.; Sanderson, A.C. JADE: Adaptive differential evolution with optional external archive. IEEE Trans. Evol. Comput. 2009, 13, 945–958.
60. Epitropakis, M.G.; Plagianakos, V.P.; Vrahatis, M.N. Balancing the exploration and exploitation capabilities of the differential evolution algorithm. In Proceedings of the 2008 IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence), Hong Kong, China, 1–6 June 2008.
61. Jiang, Z.Y.; Cai, Z.X.; Wang, Y. Hybrid self-adaptive orthogonal genetic algorithm for solving global optimization problems. J. Softw. 2010, 21, 1296–1307.
Figure 1. The evolution curves of the seven studied algorithms for functions (a) f 1 , (b) f 2 , (c) f 3 , (d) f 4 , (e) f 9 and (f) f 17 with D = 30 .
Figure 2. The evolution curves of the seven studied algorithms for functions (a) f 19 , (b) f 21 , (c) f 28 , (d) f 30 with D = 30 .
Figure 3. The comparisons of the results of the seven studied algorithms for the functions (a) f 1 , (b) f 2 , (c) f 3 , (d) f 4 , (e) f 9 and (f) f 17 with D = 30 .
Figure 4. The comparisons of the results of the seven studied algorithms for the functions (a) f 19 , (b) f 21 , (c) f 28 , (d) f 30 with D = 30 .
Table 1. Experimental results of the HMWCA with different values for TS and N_sr0.

| TS | N_sr0 | Item | f1 | f3 | f4 | f9 | f17 | f21 | f27 | f29 |
|---|---|---|---|---|---|---|---|---|---|---|
| 5 | 5 | Mean Error | 4.97×10^6 | 4.04×10^3 | 1.01×10^2 | 1.39×10^2 | 4.91×10^5 | 2.71×10^5 | 7.06×10^2 | 2.17×10^2 |
| | | Std Dev | 3.31×10^6 | 7.79×10^3 | 4.80×10^1 | 3.28×10^1 | 3.19×10^5 | 3.31×10^5 | 3.25×10^2 | 2.84×10^0 |
| | 8 | Mean Error | 4.78×10^6 | 2.89×10^3 | 7.59×10^1 | 1.22×10^2 | 4.73×10^5 | 2.07×10^5 | 7.54×10^2 | 2.22×10^2 |
| | | Std Dev | 2.78×10^6 | 3.24×10^3 | 3.23×10^1 | 2.74×10^1 | 4.15×10^5 | 1.35×10^5 | 3.09×10^2 | 2.09×10^1 |
| | 10 | Mean Error | 7.97×10^6 | 2.00×10^3 | 6.49×10^1 | 1.33×10^2 | 8.00×10^5 | 2.92×10^5 | 8.78×10^2 | 2.44×10^2 |
| | | Std Dev | 7.17×10^6 | 2.34×10^3 | 3.75×10^1 | 4.48×10^1 | 3.46×10^5 | 1.87×10^5 | 1.76×10^2 | 5.89×10^1 |
| 10 | 5 | Mean Error | 6.45×10^6 | 1.05×10^3 | 6.25×10^1 | 1.35×10^2 | 4.72×10^5 | 2.27×10^5 | 8.21×10^2 | 2.34×10^2 |
| | | Std Dev | 3.67×10^6 | 2.52×10^3 | 2.48×10^1 | 1.99×10^1 | 3.34×10^5 | 2.16×10^5 | 2.27×10^2 | 3.80×10^1 |
| | 8 | Mean Error | 5.01×10^6 | 1.27×10^3 | 6.95×10^1 | 1.36×10^2 | 4.01×10^5 | 1.53×10^5 | 6.21×10^2 | 2.16×10^2 |
| | | Std Dev | 2.85×10^6 | 2.85×10^3 | 3.37×10^1 | 3.03×10^1 | 2.84×10^5 | 1.01×10^5 | 2.83×10^2 | 4.54×10^0 |
| | 10 | Mean Error | 4.46×10^6 | 2.39×10^3 | 7.63×10^1 | 1.20×10^2 | 4.54×10^5 | 1.06×10^5 | 8.03×10^2 | 2.29×10^2 |
| | | Std Dev | 2.59×10^6 | 1.84×10^3 | 5.03×10^1 | 1.62×10^1 | 3.30×10^5 | 4.84×10^4 | 2.76×10^2 | 2.93×10^1 |
| 15 | 5 | Mean Error | 2.90×10^6 | 7.61×10^2 | 6.11×10^1 | 1.31×10^2 | 2.75×10^5 | 1.00×10^5 | 7.58×10^2 | 2.29×10^2 |
| | | Std Dev | 1.26×10^6 | 1.17×10^3 | 2.12×10^1 | 1.79×10^1 | 1.75×10^5 | 6.50×10^4 | 3.14×10^2 | 2.90×10^1 |
| | 8 | Mean Error | 4.31×10^6 | 1.29×10^3 | 7.65×10^1 | 1.37×10^2 | 5.74×10^5 | 1.38×10^5 | 7.97×10^2 | 2.15×10^2 |
| | | Std Dev | 1.79×10^6 | 1.34×10^3 | 3.07×10^1 | 3.05×10^1 | 5.52×10^5 | 7.06×10^4 | 2.82×10^2 | 4.34×10^0 |
| | 10 | Mean Error | 4.63×10^6 | 2.18×10^3 | 7.55×10^1 | 1.20×10^2 | 4.33×10^5 | 2.03×10^5 | 7.65×10^2 | 2.20×10^2 |
| | | Std Dev | 2.59×10^6 | 2.00×10^3 | 3.97×10^1 | 2.72×10^1 | 2.48×10^5 | 9.98×10^4 | 3.14×10^2 | 1.62×10^1 |
| 20 | 5 | Mean Error | 3.72×10^6 | 1.13×10^3 | 8.22×10^1 | 1.41×10^2 | 4.80×10^5 | 1.89×10^5 | 6.77×10^2 | 2.21×10^2 |
| | | Std Dev | 2.60×10^6 | 1.75×10^3 | 3.07×10^1 | 2.31×10^1 | 3.53×10^5 | 1.10×10^5 | 2.89×10^2 | 2.04×10^1 |
| | 8 | Mean Error | 6.91×10^6 | 2.25×10^3 | 8.13×10^1 | 1.41×10^2 | 2.87×10^5 | 2.91×10^5 | 8.21×10^2 | 2.24×10^2 |
| | | Std Dev | 3.05×10^6 | 3.72×10^3 | 3.63×10^1 | 2.46×10^1 | 1.34×10^5 | 2.62×10^5 | 3.00×10^2 | 2.92×10^1 |
| | 10 | Mean Error | 4.29×10^6 | 1.36×10^3 | 7.70×10^1 | 1.27×10^2 | 4.46×10^5 | 1.25×10^5 | 7.47×10^2 | 2.16×10^2 |
| | | Std Dev | 2.52×10^6 | 1.17×10^3 | 3.43×10^1 | 1.93×10^1 | 3.96×10^5 | 7.27×10^4 | 3.00×10^2 | 2.97×10^0 |
Table 2. Experimental results of the HMWCA and its four variants on the CEC2014 functions with D = 30 (Mean Error ± Std Dev, rank in parentheses).

| Function | HMWCA1 | HMWCA2 | HMWCA3 | HMWCA4 | HMWCA |
|---|---|---|---|---|---|
| f1 | 6.34×10^6 ± 3.14×10^6 (2) | 7.37×10^6 ± 5.28×10^6 (3) | 3.04×10^7 ± 1.07×10^7 (5) | 1.03×10^7 ± 2.53×10^6 (4) | 2.90×10^6 ± 1.26×10^6 (1) |
| f2 | 1.47×10^4 ± 1.52×10^4 (3) | 1.54×10^3 ± 4.94×10^3 (1) | 2.36×10^7 ± 2.68×10^7 (5) | 8.03×10^5 ± 8.77×10^5 (4) | 5.80×10^3 ± 4.72×10^3 (2) |
| f3 | 2.71×10^3 ± 3.26×10^3 (3) | 5.08×10^2 ± 8.21×10^2 (1) | 2.89×10^4 ± 7.60×10^3 (5) | 1.86×10^4 ± 8.72×10^3 (4) | 7.61×10^2 ± 1.17×10^3 (2) |
| f4 | 1.20×10^2 ± 4.56×10^1 (3) | 8.12×10^1 ± 4.56×10^1 (2) | 1.91×10^2 ± 4.55×10^1 (4) | 2.14×10^2 ± 3.35×10^1 (5) | 6.11×10^1 ± 2.12×10^1 (1) |
| f5 | 2.00×10^1 ± 4.48×10^-4 (1) | 2.00×10^1 ± 1.86×10^-5 (1) | 2.00×10^1 ± 3.35×10^-3 (1) | 2.01×10^1 ± 6.87×10^-2 (5) | 2.00×10^1 ± 2.12×10^-1 (1) |
| f6 | 2.64×10^1 ± 3.60×10^0 (3) | 2.61×10^1 ± 4.05×10^0 (2) | 3.04×10^1 ± 3.32×10^0 (4) | 5.27×10^1 ± 5.90×10^0 (5) | 2.42×10^1 ± 2.93×10^0 (1) |
| f7 | 8.62×10^-2 ± 1.05×10^-1 (3) | 3.52×10^-2 ± 3.86×10^-2 (2) | 1.29×10^0 ± 3.79×10^-1 (5) | 6.43×10^-1 ± 2.87×10^-1 (4) | 1.10×10^-2 ± 1.18×10^-2 (1) |
| f8 | 1.10×10^2 ± 1.48×10^1 (2) | 8.26×10^1 ± 1.88×10^1 (1) | 1.33×10^2 ± 2.95×10^1 (4) | 2.30×10^2 ± 2.23×10^1 (5) | 1.10×10^2 ± 2.94×10^1 (2) |
| f9 | 1.41×10^2 ± 3.01×10^1 (2) | 1.46×10^2 ± 4.55×10^1 (3) | 1.47×10^2 ± 2.66×10^1 (4) | 2.77×10^2 ± 4.49×10^1 (5) | 1.31×10^2 ± 1.79×10^1 (1) |
| f10 | 2.64×10^3 ± 4.46×10^2 (3) | 2.12×10^3 ± 4.83×10^2 (2) | 3.39×10^3 ± 6.39×10^2 (4) | 4.67×10^3 ± 1.05×10^3 (5) | 1.82×10^3 ± 4.59×10^2 (1) |
| f11 | 3.70×10^3 ± 6.28×10^2 (2) | 3.95×10^3 ± 7.07×10^2 (3) | 3.95×10^3 ± 6.05×10^2 (3) | 6.77×10^3 ± 1.14×10^3 (5) | 3.56×10^3 ± 5.39×10^2 (1) |
| f12 | 6.24×10^-1 ± 2.83×10^-1 (2) | 1.03×10^0 ± 4.80×10^-1 (3) | 1.32×10^0 ± 6.11×10^-1 (4) | 1.65×10^0 ± 6.49×10^-1 (5) | 6.21×10^-1 ± 2.09×10^-1 (1) |
| f13 | 4.99×10^-1 ± 1.40×10^-1 (3) | 4.80×10^-1 ± 1.13×10^-1 (2) | 5.39×10^-1 ± 1.26×10^-1 (4) | 5.77×10^-1 ± 1.04×10^-1 (5) | 4.20×10^-1 ± 6.11×10^-2 (1) |
| f14 | 2.85×10^-1 ± 5.61×10^-2 (1) | 3.33×10^-1 ± 1.07×10^-1 (5) | 3.04×10^-1 ± 4.05×10^-2 (2) | 3.17×10^-1 ± 4.21×10^-2 (4) | 3.05×10^-1 ± 5.20×10^-2 (3) |
| f15 | 5.12×10^1 ± 2.21×10^1 (3) | 3.44×10^1 ± 1.13×10^1 (2) | 5.15×10^1 ± 1.73×10^1 (4) | 1.36×10^2 ± 2.19×10^1 (5) | 3.11×10^1 ± 9.66×10^0 (1) |
| f16 | 1.24×10^1 ± 6.40×10^-1 (2) | 1.28×10^1 ± 5.10×10^-1 (4) | 1.27×10^1 ± 4.50×10^-1 (3) | 2.21×10^1 ± 2.92×10^-1 (5) | 1.21×10^1 ± 5.98×10^-1 (1) |
| f17 | 5.13×10^5 ± 3.13×10^5 (2) | 7.28×10^5 ± 6.17×10^5 (3) | 1.14×10^6 ± 9.40×10^5 (4) | 1.99×10^6 ± 1.23×10^6 (5) | 2.75×10^5 ± 1.75×10^5 (1) |
| f18 | 1.08×10^3 ± 1.33×10^3 (2) | 4.32×10^3 ± 7.55×10^3 (4) | 1.70×10^3 ± 2.14×10^3 (3) | 4.75×10^3 ± 4.96×10^3 (5) | 5.19×10^2 ± 3.36×10^2 (1) |
| f19 | 1.71×10^1 ± 3.84×10^0 (2) | 1.56×10^1 ± 3.27×10^0 (1) | 2.51×10^1 ± 3.54×10^0 (4) | 5.86×10^1 ± 3.28×10^1 (5) | 1.79×10^1 ± 3.05×10^0 (3) |
| f20 | 5.51×10^3 ± 3.94×10^3 (3) | 5.37×10^3 ± 6.11×10^3 (2) | 6.59×10^3 ± 5.28×10^3 (4) | 2.72×10^4 ± 1.09×10^4 (5) | 3.80×10^3 ± 2.14×10^3 (1) |
| f21 | 1.90×10^5 ± 2.03×10^5 (2) | 2.48×10^5 ± 1.69×10^5 (3) | 3.45×10^5 ± 2.61×10^5 (4) | 1.31×10^6 ± 7.29×10^5 (5) | 1.00×10^5 ± 6.50×10^4 (1) |
| f22 | 7.39×10^2 ± 1.76×10^2 (3) | 6.36×10^2 ± 1.84×10^2 (1) | 8.24×10^2 ± 2.30×10^2 (4) | 1.44×10^3 ± 3.84×10^2 (5) | 6.37×10^2 ± 1.27×10^2 (2) |
| f23 | 3.14×10^2 ± 4.68×10^-2 (1) | 3.14×10^2 ± 2.00×10^-2 (1) | 3.26×10^2 ± 8.14×10^0 (4) | 3.37×10^2 ± 2.96×10^1 (5) | 3.14×10^2 ± 5.50×10^-3 (1) |
| f24 | 2.31×10^2 ± 3.31×10^0 (2) | 2.44×10^2 ± 6.21×10^0 (4) | 2.29×10^2 ± 2.83×10^0 (1) | 2.80×10^2 ± 1.14×10^1 (5) | 2.31×10^2 ± 6.54×10^0 (2) |
| f25 | 2.12×10^2 ± 2.75×10^0 (3) | 2.06×10^2 ± 8.45×10^0 (1) | 2.11×10^2 ± 3.58×10^0 (2) | 2.30×10^2 ± 7.12×10^0 (5) | 2.12×10^2 ± 1.61×10^0 (3) |
| f26 | 1.01×10^2 ± 1.15×10^-1 (2) | 1.01×10^2 ± 1.62×10^-1 (2) | 1.01×10^2 ± 7.34×10^-1 (2) | 1.60×10^2 ± 5.15×10^1 (5) | 1.00×10^2 ± 1.18×10^-1 (1) |
| f27 | 8.07×10^2 ± 3.40×10^2 (3) | 8.33×10^2 ± 3.08×10^2 (4) | 6.44×10^2 ± 3.39×10^2 (1) | 1.68×10^3 ± 1.11×10^2 (5) | 7.58×10^2 ± 3.14×10^2 (2) |
| f28 | 5.80×10^2 ± 1.83×10^2 (4) | 4.87×10^2 ± 9.00×10^1 (1) | 1.52×10^3 ± 7.50×10^2 (5) | 5.44×10^2 ± 1.14×10^2 (3) | 5.43×10^2 ± 1.18×10^2 (2) |
| f29 | 3.02×10^2 ± 2.74×10^2 (5) | 2.19×10^2 ± 2.01×10^1 (1) | 2.51×10^3 ± 3.53×10^3 (3) | 2.63×10^2 ± 4.03×10^1 (4) | 2.29×10^2 ± 2.90×10^1 (2) |
| f30 | 1.04×10^3 ± 5.44×10^2 (3) | 9.79×10^2 ± 3.11×10^2 (2) | 1.55×10^4 ± 2.83×10^4 (5) | 1.63×10^3 ± 6.12×10^2 (4) | 9.38×10^2 ± 7.82×10^2 (1) |
| Sum Rank | 75 | 67 | 107 | 141 | 44 |
| Average Rank | 2.5 | 2.23 | 3.57 | 4.7 | 1.47 |
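The rank, Sum Rank, and Average Rank entries of the table above can be reproduced mechanically from the mean errors. The following Python sketch (with illustrative data only, not the paper's full table; function and variable names are ours) ranks the algorithms per function in competition style, where tied means share the lowest rank, as in the f5 row:

```python
def competition_ranks(values):
    """1-based ranks; equal values share the minimal (competition) rank."""
    ordered = sorted(values)
    return [ordered.index(v) + 1 for v in values]

def summarize(mean_errors):
    """mean_errors: {function: [err_alg1, ..., err_algK]} -> (sum ranks, average ranks)."""
    n_alg = len(next(iter(mean_errors.values())))
    sums = [0] * n_alg
    for errs in mean_errors.values():
        for i, r in enumerate(competition_ranks(errs)):
            sums[i] += r
    n_fun = len(mean_errors)
    return sums, [s / n_fun for s in sums]

# Two illustrative rows with five algorithms (HMWCA1..HMWCA4, HMWCA):
errors = {
    "f1": [6.34e6, 7.37e6, 3.04e7, 1.03e7, 2.90e6],  # ranks [2, 3, 5, 4, 1]
    "f5": [20.0, 20.0, 20.0, 20.1, 20.0],            # ranks [1, 1, 1, 5, 1]
}
sums, avgs = summarize(errors)  # ([3, 4, 6, 9, 2], [1.5, 2.0, 3.0, 4.5, 1.0])
```

Averaging the per-function ranks over all 30 benchmark functions yields the Average Rank row directly.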
Table 3. Parameter settings.

| Algorithm | Parameter Settings |
|---|---|
| GSA | N_pop = 50, G_0 = 100, α = 20 |
| HSOGA | N_pop = 200, Q_0 = D − 1, Q = 2, S = 5, P_c = 0.6, P_m = 0.1, δ_0 = 0.05 |
| CLPSO | N_pop = 30, c_1 = c_2 = 1.494, w_max = 0.9, w_min = 0.4, m = 5 |
| WCA | N_pop = 50, N_sr = 4, d_max = 0.0001 |
| CWCA | N_pop = 50, N_sr = 4, d_max = 0.0001, C_i = 1, C_f = 2, μ_i = 0.1, μ_f = 0.8 |
| ER_WCA | N_pop = 50, N_sr = 4, d_max = 0.0001 |
| HMWCA | N_pop = 50, N_sr_initial = 5, d_max = 0.0001, G_0 = 100, α = 20, TS = 15 |
Table 4. Experimental results of the HMWCA and six well-known algorithms on the CEC2014 functions with D = 30 (Mean Error ± Std Dev, rank in parentheses). "+", "−", and "≈" denote that the HMWCA performs significantly better than, worse than, or comparably to the corresponding algorithm.

| Function | WCA | CWCA | ER_WCA | GSA | CLPSO | HSOGA | HMWCA |
|---|---|---|---|---|---|---|---|
| f1 | 2.82×10^6 ± 1.05×10^6 (2) − | 6.00×10^6 ± 4.88×10^6 (4) + | 1.32×10^6 ± 6.07×10^5 (1) − | 9.10×10^7 ± 2.37×10^7 (5) + | 1.04×10^8 ± 3.12×10^7 (6) + | 1.91×10^8 ± 2.49×10^7 (7) + | 2.90×10^6 ± 1.26×10^6 (3) |
| f2 | 1.07×10^4 ± 1.01×10^4 (2) + | 2.93×10^4 ± 2.12×10^4 (4) + | 1.36×10^4 ± 1.04×10^4 (3) + | 1.12×10^7 ± 3.17×10^7 (5) + | 2.93×10^7 ± 7.69×10^6 (6) + | 5.68×10^9 ± 1.00×10^9 (7) + | 5.80×10^3 ± 4.72×10^3 (1) |
| f3 | 3.99×10^3 ± 2.56×10^3 (3) + | 2.02×10^4 ± 1.85×10^4 (4) + | 8.53×10^2 ± 9.62×10^2 (2) + | 8.71×10^4 ± 7.63×10^3 (7) + | 6.46×10^3 ± 2.94×10^3 (5) + | 2.30×10^4 ± 5.09×10^3 (6) + | 7.61×10^2 ± 1.17×10^3 (1) |
| f4 | 1.03×10^2 ± 3.22×10^1 (3) + | 1.36×10^2 ± 6.46×10^1 (4) + | 7.99×10^1 ± 2.71×10^1 (2) + | 3.92×10^2 ± 1.07×10^2 (6) + | 2.30×10^2 ± 2.80×10^1 (5) + | 5.02×10^2 ± 6.48×10^1 (7) + | 6.11×10^1 ± 2.12×10^1 (1) |
| f5 | 2.01×10^1 ± 8.27×10^-2 (3) + | 2.01×10^1 ± 6.84×10^-2 (3) + | 2.01×10^1 ± 9.23×10^-2 (3) + | 2.00×10^1 ± 5.71×10^-4 (1) ≈ | 2.07×10^1 ± 5.05×10^-2 (6) + | 2.10×10^1 ± 5.92×10^-2 (7) + | 2.00×10^1 ± 2.73×10^-4 (1) |
| f6 | 3.10×10^1 ± 3.45×10^0 (6) + | 2.52×10^1 ± 3.28×10^0 (4) + | 3.03×10^1 ± 3.55×10^0 (5) + | 2.47×10^1 ± 2.11×10^0 (3) + | 2.30×10^1 ± 1.61×10^0 (1) − | 3.17×10^1 ± 1.86×10^0 (7) + | 2.42×10^1 ± 2.93×10^0 (2) |
| f7 | 2.19×10^-2 ± 1.90×10^-2 (3) + | 9.12×10^-2 ± 4.97×10^-2 (4) + | 1.13×10^-2 ± 1.30×10^-2 (2) + | 1.33×10^0 ± 1.69×10^0 (6) + | 1.19×10^0 ± 6.56×10^-2 (5) + | 4.27×10^1 ± 6.12×10^0 (7) + | 1.10×10^-2 ± 1.18×10^-2 (1) |
| f8 | 1.51×10^2 ± 2.99×10^1 (6) + | 7.45×10^1 ± 2.68×10^1 (2) − | 1.73×10^2 ± 4.28×10^1 (7) + | 1.41×10^2 ± 1.22×10^1 (5) + | 1.80×10^1 ± 3.23×10^0 (1) − | 1.36×10^2 ± 1.04×10^1 (4) + | 1.10×10^2 ± 2.94×10^1 (3) |
| f9 | 1.82×10^2 ± 4.10×10^1 (4) + | 2.39×10^2 ± 5.35×10^1 (7) + | 1.82×10^2 ± 4.93×10^1 (4) + | 1.65×10^2 ± 2.11×10^1 (3) + | 1.43×10^2 ± 1.22×10^1 (2) + | 2.28×10^2 ± 1.88×10^1 (6) + | 1.31×10^2 ± 1.79×10^1 (1) |
| f10 | 4.00×10^3 ± 8.84×10^2 (7) + | 1.84×10^3 ± 6.25×10^2 (2) + | 3.64×10^3 ± 6.58×10^2 (4) + | 3.85×10^3 ± 4.76×10^2 (6) + | 3.71×10^3 ± 1.02×10^2 (5) + | 1.89×10^3 ± 1.85×10^2 (3) + | 1.82×10^3 ± 4.59×10^2 (1) |
| f11 | 4.23×10^3 ± 1.02×10^3 (3) + | 3.72×10^3 ± 6.81×10^2 (2) + | 4.55×10^3 ± 7.33×10^2 (6) + | 4.48×10^3 ± 6.12×10^2 (5) + | 4.44×10^3 ± 2.79×10^2 (4) + | 7.22×10^3 ± 3.29×10^2 (7) + | 3.56×10^3 ± 5.39×10^2 (1) |
| f12 | 1.40×10^0 ± 5.34×10^-1 (5) + | 3.60×10^-1 ± 1.08×10^-1 (2) − | 1.51×10^0 ± 5.67×10^-1 (6) + | 5.59×10^-3 ± 4.62×10^-3 (1) − | 9.08×10^-1 ± 1.52×10^-1 (4) + | 2.61×10^0 ± 4.56×10^-1 (7) + | 6.21×10^-1 ± 2.09×10^-1 (3) |
| f13 | 5.41×10^-1 ± 1.19×10^-1 (4) + | 7.67×10^-1 ± 1.31×10^-1 (6) + | 5.62×10^-1 ± 1.56×10^-1 (5) + | 3.66×10^-1 ± 6.61×10^-2 (1) − | 3.89×10^-1 ± 4.75×10^-2 (2) − | 9.30×10^-1 ± 1.77×10^-1 (7) + | 4.20×10^-1 ± 6.11×10^-2 (3) |
| f14 | 3.46×10^-1 ± 1.32×10^-1 (3) + | 9.83×10^-1 ± 3.64×10^-1 (6) + | 4.69×10^-1 ± 2.33×10^-1 (4) + | 5.43×10^-1 ± 1.50×10^0 (5) + | 3.43×10^-1 ± 3.30×10^-2 (2) + | 1.43×10^1 ± 3.63×10^0 (7) + | 3.05×10^-1 ± 5.20×10^-2 (1) |
| f15 | 3.24×10^1 ± 1.19×10^1 (4) + | 2.65×10^1 ± 1.41×10^1 (1) − | 4.88×10^1 ± 3.34×10^1 (6) + | 3.28×10^1 ± 1.37×10^1 (5) + | 2.65×10^1 ± 2.92×10^0 (1) − | 7.75×10^2 ± 4.76×10^2 (7) + | 3.11×10^1 ± 9.66×10^0 (3) |
| f16 | 1.27×10^1 ± 4.62×10^-1 (4) + | 1.21×10^1 ± 5.50×10^-1 (2) ≈ | 1.29×10^1 ± 4.58×10^-1 (5) + | 1.37×10^1 ± 2.02×10^-1 (7) + | 1.19×10^1 ± 3.18×10^-1 (1) − | 1.31×10^1 ± 2.63×10^-1 (6) + | 1.21×10^1 ± 5.98×10^-1 (2) |
| f17 | 2.91×10^5 ± 1.75×10^5 (3) + | 6.40×10^5 ± 5.92×10^5 (4) + | 9.12×10^4 ± 5.37×10^4 (1) − | 7.50×10^6 ± 2.83×10^6 (7) + | 6.60×10^6 ± 3.85×10^6 (6) + | 2.24×10^6 ± 8.12×10^5 (5) + | 2.75×10^5 ± 1.75×10^5 (2) |
| f18 | 5.26×10^3 ± 6.44×10^3 (4) + | 1.33×10^4 ± 1.04×10^4 (5) + | 5.01×10^3 ± 6.27×10^3 (3) + | 5.24×10^2 ± 4.55×10^2 (2) + | 5.31×10^5 ± 4.55×10^5 (6) + | 2.04×10^7 ± 7.96×10^6 (7) + | 5.19×10^2 ± 3.36×10^2 (1) |
| f19 | 3.99×10^1 ± 4.12×10^1 (4) + | 4.75×10^1 ± 4.52×10^1 (5) + | 2.38×10^1 ± 2.31×10^-5 (2) + | 1.32×10^2 ± 3.63×10^1 (7) + | 2.78×10^1 ± 1.04×10^1 (3) + | 5.15×10^1 ± 1.17×10^1 (6) + | 1.79×10^1 ± 3.05×10^0 (1) |
| f20 | 4.46×10^3 ± 3.42×10^3 (3) + | 2.57×10^4 ± 1.66×10^4 (6) + | 1.04×10^3 ± 7.08×10^2 (1) − | 2.48×10^5 ± 1.21×10^5 (7) + | 1.11×10^4 ± 4.72×10^3 (4) + | 1.64×10^4 ± 5.58×10^3 (5) + | 3.80×10^3 ± 2.14×10^3 (2) |
| f21 | 1.03×10^5 ± 6.59×10^4 (2) + | 2.34×10^5 ± 1.62×10^5 (3) + | 4.19×10^5 ± 1.66×10^4 (4) + | 2.74×10^6 ± 1.35×10^6 (7) + | 8.31×10^5 ± 4.29×10^5 (6) + | 5.99×10^5 ± 1.89×10^5 (5) + | 1.00×10^5 ± 6.50×10^4 (1) |
| f22 | 6.43×10^2 ± 2.50×10^2 (4) + | 6.81×10^2 ± 2.46×10^2 (6) + | 6.66×10^2 ± 1.98×10^2 (5) + | 1.17×10^3 ± 3.02×10^2 (7) + | 4.01×10^2 ± 1.07×10^2 (1) − | 6.38×10^2 ± 1.32×10^2 (3) + | 6.37×10^2 ± 1.27×10^2 (2) |
| f23 | 3.15×10^2 ± 1.70×10^2 (3) + | 3.16×10^2 ± 1.67×10^0 (5) + | 3.15×10^2 ± 9.38×10^-4 (3) + | 2.69×10^2 ± 7.39×10^1 (1) − | 3.20×10^2 ± 1.59×10^0 (6) + | 3.45×10^2 ± 2.54×10^1 (7) + | 3.14×10^2 ± 5.50×10^-3 (2) |
| f24 | 2.43×10^2 ± 1.02×10^1 (5) + | 2.45×10^2 ± 6.35×10^0 (7) + | 2.43×10^2 ± 1.07×10^1 (5) + | 2.03×10^2 ± 6.21×10^0 (1) − | 2.35×10^2 ± 3.03×10^0 (4) + | 2.09×10^2 ± 3.45×10^0 (2) − | 2.31×10^2 ± 6.54×10^0 (3) |
| f25 | 2.21×10^2 ± 1.02×10^1 (6) + | 2.12×10^2 ± 7.77×10^0 (3) ≈ | 2.24×10^2 ± 9.17×10^0 (7) + | 2.02×10^2 ± 3.38×10^0 (1) − | 2.17×10^2 ± 2.58×10^0 (5) + | 2.04×10^2 ± 2.09×10^0 (2) − | 2.12×10^2 ± 1.61×10^0 (3) |
| f26 | 1.13×10^2 ± 3.30×10^1 (6) + | 1.01×10^2 ± 2.15×10^-1 (2) + | 1.09×10^2 ± 2.75×10^1 (5) + | 1.98×10^2 ± 1.19×10^1 (7) + | 1.01×10^2 ± 7.98×10^-1 (2) + | 1.01×10^2 ± 3.53×10^-1 (2) + | 1.00×10^2 ± 1.18×10^-1 (1) |
| f27 | 9.20×10^2 ± 3.69×10^2 (3) + | 9.36×10^2 ± 2.75×10^2 (4) + | 1.04×10^3 ± 3.34×10^2 (5) + | 1.81×10^3 ± 4.19×10^2 (7) + | 4.94×10^2 ± 4.79×10^1 (1) − | 1.07×10^3 ± 3.53×10^1 (6) + | 7.58×10^2 ± 3.14×10^2 (2) |
| f28 | 1.89×10^3 ± 5.19×10^2 (6) + | 1.39×10^3 ± 4.32×10^2 (3) + | 1.82×10^3 ± 6.15×10^2 (5) + | 2.52×10^3 ± 8.18×10^2 (7) + | 1.51×10^3 ± 2.96×10^2 (4) + | 8.46×10^2 ± 1.14×10^2 (2) + | 5.43×10^2 ± 1.18×10^2 (1) |
| f29 | 5.44×10^6 ± 9.48×10^6 (6) + | 7.23×10^5 ± 2.49×10^6 (5) + | 8.48×10^6 ± 8.97×10^6 (7) + | 1.44×10^5 ± 7.19×10^5 (4) + | 9.69×10^4 ± 5.71×10^4 (3) + | 8.30×10^2 ± 2.40×10^2 (2) + | 2.29×10^2 ± 2.90×10^1 (1) |
| f30 | 1.82×10^4 ± 1.75×10^4 (5) + | 6.71×10^3 ± 9.54×10^3 (4) + | 6.51×10^3 ± 3.24×10^3 (3) + | 2.28×10^5 ± 1.09×10^5 (7) + | 4.04×10^4 ± 1.46×10^4 (6) + | 2.19×10^3 ± 4.75×10^2 (2) + | 9.38×10^2 ± 7.82×10^2 (1) |
| Average Rank | 4.07 | 3.97 | 4.37 | 4.77 | 3.77 | 5.27 | 1.7 |
| + | 29 | 25 | 27 | 24 | 23 | 29 | — |
| − | 1 | 3 | 3 | 5 | 7 | 1 | — |
| ≈ | 0 | 2 | 0 | 1 | 0 | 0 | — |
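The "+/−/≈" rows above count, per competitor, the functions on which the HMWCA is significantly better, worse, or statistically similar. A stdlib-only Python sketch of such a tally using the normal approximation of the Wilcoxon rank-sum test follows; the exact test variant and significance threshold used in the paper are assumptions here, and `compare` treats lower error as better:

```python
import math

def ranksum_z(a, b):
    """z statistic of the Wilcoxon rank-sum test; ties get average ranks."""
    n1, n2 = len(a), len(b)
    pooled = sorted([(v, 0) for v in a] + [(v, 1) for v in b])
    ranks = [0.0] * (n1 + n2)
    i = 0
    while i < n1 + n2:
        j = i
        while j < n1 + n2 and pooled[j][0] == pooled[i][0]:
            j += 1
        for k in range(i, j):
            ranks[k] = (i + 1 + j) / 2.0  # average 1-based rank of the tied block
        i = j
    w = sum(r for r, (_, src) in zip(ranks, pooled) if src == 0)  # rank sum of sample a
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return (w - mu) / sigma

def compare(hmwca_runs, other_runs, z_crit=1.96):
    """Return '+', '−', or '≈' from the HMWCA's point of view (minimization)."""
    z = ranksum_z(hmwca_runs, other_runs)
    if abs(z) < z_crit:
        return "≈"
    return "+" if z < 0 else "−"  # lower errors for HMWCA count as a win
```

Summing the returned symbols over the 30 functions for each competitor reproduces the structure of the +/−/≈ rows.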
Table 5. Experimental results of the HMWCA and six well-known algorithms on the CEC2014 functions with D = 50 (Mean Error ± Std Dev, rank in parentheses). "+", "−", and "≈" denote that the HMWCA performs significantly better than, worse than, or comparably to the corresponding algorithm.

| Function | WCA | CWCA | ER_WCA | GSA | CLPSO | HSOGA | HMWCA |
|---|---|---|---|---|---|---|---|
| f1 | 7.78×10^6 ± 1.84×10^6 (3) + | 2.38×10^7 ± 1.18×10^7 (4) + | 5.50×10^6 ± 1.41×10^6 (1) − | 3.93×10^8 ± 3.05×10^8 (6) + | 2.37×10^8 ± 6.61×10^7 (5) + | 7.62×10^8 ± 1.24×10^8 (7) + | 6.76×10^6 ± 2.17×10^6 (2) |
| f2 | 1.31×10^4 ± 8.55×10^3 (2) − | 1.16×10^9 ± 2.61×10^9 (4) + | 8.17×10^3 ± 9.44×10^3 (1) − | 2.04×10^10 ± 3.40×10^9 (6) + | 1.25×10^9 ± 2.83×10^8 (4) + | 5.15×10^10 ± 5.66×10^9 (7) + | 3.93×10^5 ± 2.58×10^5 (3) |
| f3 | 1.07×10^4 ± 4.72×10^3 (2) − | 1.05×10^5 ± 3.77×10^4 (4) + | 5.07×10^3 ± 3.33×10^3 (1) − | 1.56×10^5 ± 9.58×10^3 (6) + | 6.01×10^4 ± 1.15×10^4 (4) + | 1.67×10^5 ± 1.73×10^4 (7) + | 1.50×10^4 ± 6.75×10^3 (3) |
| f4 | 1.40×10^2 ± 5.19×10^1 (2) − | 4.81×10^2 ± 2.94×10^2 (4) + | 1.04×10^2 ± 4.73×10^1 (1) − | 2.91×10^3 ± 6.40×10^2 (6) + | 7.83×10^2 ± 8.79×10^1 (5) + | 7.37×10^3 ± 1.28×10^3 (7) + | 2.03×10^2 ± 3.04×10^1 (3) |
| f5 | 2.01×10^1 ± 1.35×10^-1 (3) + | 2.03×10^1 ± 9.38×10^-2 (5) + | 2.01×10^1 ± 9.13×10^-2 (3) + | 2.00×10^1 ± 1.06×10^-4 (1) ≈ | 2.08×10^1 ± 3.21×10^-2 (6) + | 2.12×10^1 ± 5.27×10^-2 (7) + | 2.00×10^1 ± 3.58×10^-3 (1) |
| f6 | 5.91×10^1 ± 4.80×10^0 (6) + | 4.91×10^1 ± 4.63×10^0 (1) − | 5.84×10^1 ± 7.92×10^0 (5) + | 5.30×10^1 ± 2.74×10^0 (4) + | 5.14×10^1 ± 2.27×10^0 (2) − | 6.71×10^1 ± 1.86×10^0 (7) + | 5.16×10^1 ± 3.18×10^0 (3) |
| f7 | 4.30×10^-2 ± 2.06×10^-2 (2) − | 1.01×10^-1 ± 2.74×10^-1 (4) + | 8.53×10^-3 ± 7.12×10^-3 (1) − | 2.02×10^2 ± 3.14×10^1 (6) + | 1.09×10^1 ± 2.70×10^0 (5) + | 5.09×10^2 ± 6.57×10^1 (7) + | 4.64×10^-2 ± 1.93×10^-2 (3) |
| f8 | 3.00×10^2 ± 3.96×10^1 (5) + | 2.21×10^2 ± 4.77×10^1 (3) + | 3.03×10^2 ± 6.53×10^1 (6) + | 2.77×10^2 ± 1.80×10^1 (4) + | 9.14×10^1 ± 8.54×10^0 (1) − | 3.97×10^2 ± 2.59×10^1 (7) + | 1.93×10^2 ± 2.86×10^1 (2) |
| f9 | 4.01×10^2 ± 7.32×10^1 (5) + | 5.14×10^2 ± 7.88×10^1 (6) + | 3.94×10^2 ± 5.73×10^1 (4) + | 3.55×10^2 ± 2.82×10^1 (2) + | 3.76×10^2 ± 2.54×10^1 (3) + | 5.54×10^2 ± 3.21×10^1 (7) + | 2.58×10^2 ± 6.46×10^1 (1) |
| f10 | 7.29×10^3 ± 8.50×10^2 (5) + | 4.26×10^3 ± 1.10×10^3 (2) − | 6.93×10^3 ± 8.71×10^2 (4) + | 7.52×10^3 ± 7.19×10^2 (7) + | 2.49×10^3 ± 2.68×10^2 (1) − | 7.41×10^3 ± 5.42×10^2 (6) + | 4.72×10^3 ± 8.04×10^2 (3) |
| f11 | 8.08×10^3 ± 1.40×10^3 (4) + | 7.19×10^3 ± 6.86×10^2 (2) + | 7.80×10^3 ± 1.03×10^3 (3) + | 8.17×10^3 ± 8.54×10^2 (5) + | 9.29×10^3 ± 4.63×10^2 (6) + | 1.39×10^4 ± 4.12×10^2 (7) + | 6.88×10^3 ± 5.96×10^2 (1) |
| f12 | 2.16×10^0 ± 6.70×10^-1 (6) + | 6.29×10^-1 ± 1.85×10^-1 (2) − | 2.13×10^0 ± 8.20×10^-1 (5) + | 1.28×10^-2 ± 6.79×10^-3 (1) − | 1.18×10^0 ± 1.34×10^-1 (3) − | 4.07×10^0 ± 3.90×10^-1 (7) + | 1.24×10^0 ± 2.90×10^-1 (4) |
| f13 | 6.25×10^-1 ± 8.50×10^-2 (3) + | 8.01×10^-1 ± 1.40×10^-1 (5) + | 6.31×10^-1 ± 1.50×10^-1 (4) + | 2.87×10^0 ± 5.98×10^-1 (6) + | 5.36×10^-1 ± 7.86×10^-2 (1) − | 4.98×10^0 ± 3.24×10^-1 (7) + | 5.81×10^-1 ± 5.33×10^-2 (2) |
| f14 | 4.15×10^-1 ± 1.92×10^-1 (3) + | 2.62×10^0 ± 5.25×10^0 (5) + | 3.84×10^-1 ± 1.32×10^-1 (2) + | 4.33×10^1 ± 7.20×10^0 (6) + | 4.88×10^-1 ± 8.13×10^-2 (4) + | 1.18×10^2 ± 1.33×10^1 (7) + | 3.30×10^-1 ± 3.79×10^-2 (1) |
| f15 | 8.77×10^1 ± 2.33×10^1 (1) − | 1.05×10^5 ± 3.36×10^5 (6) + | 1.26×10^2 ± 4.38×10^1 (3) + | 9.76×10^3 ± 6.07×10^3 (5) + | 7.27×10^2 ± 3.30×10^2 (4) + | 2.16×10^5 ± 7.80×10^4 (7) + | 1.07×10^2 ± 1.84×10^1 (2) |
| f16 | 2.24×10^1 ± 6.51×10^-1 (4) + | 2.14×10^1 ± 5.88×10^-1 (1) − | 2.24×10^1 ± 5.65×10^-1 (4) + | 2.26×10^1 ± 3.45×10^-1 (6) + | 2.14×10^1 ± 3.12×10^-1 (1) − | 2.30×10^1 ± 1.55×10^-1 (7) + | 2.16×10^1 ± 9.07×10^-1 (3) |
| f17 | 7.90×10^5 ± 2.82×10^5 (2) − | 4.06×10^6 ± 2.50×10^6 (4) + | 3.34×10^5 ± 1.55×10^5 (1) − | 2.69×10^7 ± 1.07×10^7 (5) + | 4.03×10^7 ± 1.36×10^7 (6) + | 7.84×10^7 ± 1.69×10^7 (7) + | 1.73×10^6 ± 5.87×10^5 (3) |
| f18 | 2.77×10^3 ± 1.92×10^3 (2) + | 5.49×10^3 ± 2.22×10^3 (4) + | 2.92×10^3 ± 2.07×10^3 (3) + | 3.16×10^8 ± 6.70×10^8 (6) + | 9.25×10^6 ± 4.50×10^6 (5) + | 8.58×10^8 ± 1.70×10^8 (7) + | 2.13×10^3 ± 1.15×10^3 (1) |
| f19 | 6.39×10^1 ± 3.00×10^1 (2) + | 6.91×10^1 ± 3.31×10^1 (3) + | 7.20×10^1 ± 2.59×10^1 (4) + | 1.72×10^2 ± 3.22×10^1 (6) + | 9.43×10^1 ± 1.20×10^1 (5) + | 3.85×10^2 ± 5.60×10^1 (7) + | 5.22×10^1 ± 3.23×10^1 (1) |
| f20 | 6.07×10^3 ± 3.89×10^3 (2) − | 7.20×10^4 ± 3.24×10^4 (6) + | 2.53×10^3 ± 1.67×10^3 (1) − | 1.99×10^5 ± 8.45×10^4 (7) + | 4.90×10^4 ± 1.00×10^4 (4) + | 7.17×10^4 ± 2.48×10^4 (5) + | 1.81×10^4 ± 1.34×10^4 (3) |
| f21 | 4.71×10^5 ± 2.64×10^5 (2) − | 1.83×10^6 ± 1.16×10^6 (4) + | 2.03×10^5 ± 1.16×10^5 (1) − | 4.58×10^6 ± 1.08×10^6 (5) + | 1.21×10^7 ± 4.48×10^6 (6) + | 1.65×10^7 ± 6.04×10^6 (7) + | 9.98×10^5 ± 5.53×10^5 (3) |
| f22 | 1.50×10^3 ± 3.82×10^2 (3) + | 1.51×10^3 ± 3.31×10^2 (4) + | 1.55×10^3 ± 3.86×10^2 (5) + | 2.15×10^3 ± 3.95×10^2 (6) + | 1.48×10^3 ± 2.17×10^2 (2) + | 2.25×10^3 ± 1.88×10^2 (7) + | 1.13×10^3 ± 2.89×10^2 (1) |
| f23 | 3.45×10^2 ± 1.28×10^0 (5) + | 3.59×10^2 ± 1.39×10^1 (6) + | 3.44×10^2 ± 2.69×10^-2 (4) + | 2.55×10^2 ± 1.52×10^2 (1) − | 3.92×10^2 ± 1.08×10^1 (7) + | 3.31×10^2 ± 7.84×10^-1 (2) − | 3.37×10^2 ± 1.70×10^-1 (3) |
| f24 | 3.00×10^2 ± 9.52×10^0 (6) + | 2.97×10^2 ± 1.63×10^1 (5) + | 3.08×10^2 ± 1.16×10^1 (7) + | 2.40×10^2 ± 1.66×10^1 (2) − | 2.93×10^2 ± 1.91×10^0 (4) + | 2.17×10^2 ± 6.62×10^0 (1) − | 2.84×10^2 ± 1.13×10^1 (3) |
| f25 | 2.45×10^2 ± 1.03×10^1 (6) + | 2.24×10^2 ± 9.48×10^0 (3) − | 2.44×10^2 ± 1.11×10^1 (5) + | 2.04×10^2 ± 5.55×10^0 (2) − | 2.54×10^2 ± 7.41×10^0 (7) + | 2.03×10^2 ± 1.69×10^0 (1) − | 2.27×10^2 ± 1.25×10^1 (4) |
| f26 | 1.75×10^2 ± 7.59×10^1 (5) + | 1.25×10^2 ± 6.78×10^1 (7) + | 2.32×10^2 ± 1.01×10^2 (3) − | 2.00×10^2 ± 1.00×10^-1 (6) + | 1.24×10^2 ± 3.78×10^1 (2) − | 1.07×10^2 ± 5.61×10^1 (1) − | 1.50×10^2 ± 5.27×10^1 (4) |
| f27 | 1.87×10^3 ± 1.26×10^2 (5) + | 1.62×10^3 ± 1.07×10^2 (1) − | 1.85×10^3 ± 1.28×10^2 (4) + | 3.25×10^3 ± 4.99×10^2 (7) + | 1.63×10^3 ± 1.80×10^2 (2) − | 2.00×10^3 ± 4.42×10^1 (6) + | 1.67×10^3 ± 1.62×10^2 (3) |
| f28 | 3.64×10^3 ± 1.03×10^3 (4) + | 2.81×10^3 ± 6.16×10^2 (3) + | 3.65×10^3 ± 9.26×10^2 (5) + | 5.93×10^3 ± 1.41×10^3 (7) + | 4.24×10^3 ± 8.98×10^2 (6) + | 8.26×10^2 ± 1.17×10^3 (2) + | 4.74×10^2 ± 4.71×10^1 (1) |
| f29 | 1.23×10^8 ± 1.06×10^8 (7) + | 9.22×10^6 ± 1.88×10^7 (5) + | 1.08×10^8 ± 7.85×10^7 (6) + | 2.02×10^2 ± 3.72×10^1 (1) − | 5.93×10^6 ± 2.02×10^6 (4) + | 5.19×10^4 ± 2.06×10^4 (3) + | 2.37×10^2 ± 1.73×10^1 (2) |
| f30 | 4.31×10^4 ± 5.51×10^4 (4) + | 1.65×10^4 ± 3.71×10^3 (2) + | 1.88×10^4 ± 4.31×10^3 (3) + | 3.17×10^6 ± 3.72×10^6 (7) + | 2.41×10^5 ± 8.66×10^4 (6) + | 7.38×10^4 ± 7.47×10^4 (5) + | 1.91×10^3 ± 4.21×10^2 (1) |
| Average Rank | 3.7 | 3.83 | 3.33 | 4.77 | 4.03 | 5.73 | 2.33 |
| + | 22 | 24 | 21 | 24 | 22 | 26 | — |
| − | 8 | 6 | 9 | 5 | 8 | 4 | — |
| ≈ | 0 | 0 | 0 | 1 | 0 | 0 | — |
Table 6. The average CPU time of the HMWCA, WCA and GSA.

| Algorithm | f1 | f3 | f4 | f9 | f17 | f21 |
|---|---|---|---|---|---|---|
| HMWCA | 54.22 s | 55.87 s | 53.23 s | 54.98 s | 59.72 s | 56.70 s |
| WCA | 30.70 s | 24.16 s | 31.03 s | 25.37 s | 29.44 s | 28.16 s |
| GSA | 116.60 s | 115.87 s | 115.40 s | 116.58 s | 115.99 s | 116.55 s |
Table 7. Numerical results of the HMWCA on the SSRPPCD.

| Algorithm | Best (Result) | Worst (Result) | Average Value | Standard Deviation |
|---|---|---|---|---|
| WCA | 3.62×10^0 | 2.57×10^0 | 3.09×10^0 | 4.00×10^-1 |
| CWCA | 3.35×10^0 | 2.46×10^0 | 3.02×10^0 | 3.14×10^-1 |
| ER_WCA | 3.44×10^0 | 2.68×10^0 | 2.93×10^0 | 2.34×10^-1 |
| HMWCA | 3.27×10^0 | 2.29×10^0 | 2.77×10^0 | 3.16×10^-1 |
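The best, worst, average, and standard-deviation statistics reported in Table 7 are gathered over a number of independent runs. A minimal Python sketch of that bookkeeping is shown below; `optimize` is a hypothetical placeholder for one run of any of the compared algorithms (here a toy stand-in objective), and we take the minimum as "best", as in minimization:

```python
import math
import random

def run_statistics(optimize, n_runs=30, seed=0):
    """Collect (best, worst, mean, sample std dev) over independent runs."""
    rng = random.Random(seed)
    results = [optimize(rng) for _ in range(n_runs)]
    mean = sum(results) / n_runs
    std = math.sqrt(sum((x - mean) ** 2 for x in results) / (n_runs - 1))
    return min(results), max(results), mean, std

# Toy stand-in for a single optimization run: a "result" around 3.0
best, worst, avg, std = run_statistics(lambda rng: 3.0 + 0.3 * rng.uniform(-1, 1))
```

For an objective where larger results are better (Table 7 lists the larger value under Best), the roles of `min` and `max` would simply be swapped.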
Tian, M.; Liu, J.; Yue, W.; Zhou, J. A Novel Integrated Heuristic Optimizer Using a Water Cycle Algorithm and Gravitational Search Algorithm for Optimization Problems. Mathematics 2023, 11, 1880. https://doi.org/10.3390/math11081880
