Article

Harris Hawks Optimization with Multi-Strategy Search and Application

1 School of Automation and Information Engineering, Xi’an University of Technology, Xi’an 710048, China
2 Shaanxi Key Laboratory of Complex System Control and Intelligent Information Processing, Xi’an University of Technology, Xi’an 710048, China
3 School of Electronic & Electrical Engineering, Baoji University of Arts and Sciences, Baoji 721016, China
* Author to whom correspondence should be addressed.
Symmetry 2021, 13(12), 2364; https://doi.org/10.3390/sym13122364
Submission received: 5 November 2021 / Revised: 26 November 2021 / Accepted: 30 November 2021 / Published: 8 December 2021
(This article belongs to the Topic Applied Metaheuristic Computing)

Abstract

In the basic Harris Hawks optimization (HHO) algorithm, the probability of choosing between the different search methods is symmetric, about 0.5 on the interval (0, 1). The optimal solution from the previous iteration affects the current solution, the linear search for prey yields a single search result, and the overall number of updates of the optimal position is low. These factors limit the HHO algorithm: it falls easily into local optima and converges inefficiently. Inspired by the prey-hunting behavior of Harris's hawks, a multi-strategy search Harris Hawks optimization algorithm is proposed, and a least squares support vector machine (LSSVM) optimized by the proposed algorithm is used to model the reactive power output of a synchronous condenser. Firstly, we select the best of seven commonly used chaotic mapping population initialization methods, the Gauss chaotic mapping, to improve accuracy. Secondly, an optimal neighborhood perturbation mechanism is introduced to avoid premature convergence. Simultaneously, adaptive weight and variable spiral search strategies are designed to simulate the prey-hunting behavior of the Harris hawk, improving the convergence speed and the global search ability of the improved algorithm. Numerical experiments are conducted on the 23 classical test functions and the CEC2017 test function set. The results show that the proposed algorithm outperforms the basic HHO algorithm and other intelligent optimization algorithms in terms of convergence speed, solution accuracy, and robustness, and that the synchronous condenser reactive power output model established by the LSSVM optimized with the improved algorithm has good accuracy and generalization ability.

1. Introduction

Along with the significant increase in the processing power of computer hardware and software, a large number of excellent meta-heuristics have been created in the intelligent computing field [1,2,3,4,5]. Meta-heuristics are a large class of algorithms developed in contrast to exact optimization algorithms and heuristics. Exact optimization algorithms are dedicated to finding the optimal solution of a problem, but they are often difficult to apply because of the intractability of the problem [6,7]. Heuristic algorithms are customized to a problem through intuition, experience, and problem-specific information, but are often difficult to generalize because of this specialization. Compared with these two classes, meta-heuristic algorithms are more general and do not require deep adaptation to the problem; although they do not guarantee optimal solutions, they can generally obtain near-optimal solutions under acceptable space and time constraints, even though the degree of deviation from the optimum is difficult to estimate [8,9,10,11,12].
The main optimization strategies of meta-heuristic algorithms can be summarized as follows: (1) diversification, exploring a wide range to approach the global optimal solution; and (2) intensification, exploiting a local range to obtain a solution as close to the optimum as possible [13]. The main difference among meta-heuristic algorithms is how they strike a balance between the two. Almost all meta-heuristic algorithms share the following characteristics: (1) they are inspired by phenomena in nature, such as physics, biology, and biological behavior; (2) they use stochastic strategies; (3) they do not use gradient information of the objective function; (4) they have several parameters that need to be adapted to the problem; and (5) they support parallel and autonomous exploration. Meta-heuristic algorithms have been widely used in many aspects of social production and life, and many related research papers are published every year in production scheduling [14,15], engineering computing [16,17], management decision-making [18,19], machine learning (ML) [20,21], system control [22], and many other disciplines.
P-meta-heuristics (population-based meta-heuristics) are categorized into four main groups [23,24]: (1) algorithms that simulate physical processes; (2) evolutionary algorithms; (3) swarm intelligence algorithms; and (4) algorithms that simulate human behavior [25,26,27]. Algorithms simulating physical processes include simulated annealing (SA) [28], the gravitational search algorithm, which simulates Earth's gravity [29], the artificial chemical reaction optimization algorithm [30], heat transfer search, which simulates the heat transfer process in thermodynamics [31], Gases Brownian motion optimization, which simulates the phenomenon of Brownian motion in physics [32], and Henry gas solubility optimization, which simulates the Henry gas solubility process [33]. Among evolutionary algorithms, the American professor Holland proposed the genetic algorithm (GA) [34] in 1975, based on Darwinian evolutionary theory and the survival-of-the-fittest mechanism in nature; other examples are evolution strategies [35], differential evolution [36], genetic programming [37], and the Biogeography-Based Optimizer [38]. Swarm intelligence algorithms include the Artificial Bee Colony (ABC) algorithm [39], based on honey bee foraging mechanisms, the Firefly Algorithm, based on the flickering behavior of fireflies [40], the Beetle Antennae Search algorithm, based on the foraging principle of longhorn beetles [41], the Grey Wolf Optimization (GWO) algorithm, inspired by the hierarchy and predatory behavior of grey wolf packs [42], and the Virus Colony Search algorithm [43], based on the proliferation and infection strategies viruses use to survive and reproduce through host cells. Algorithms simulating human behavior include Tabu Search [44], Socio Evolution and Learning Optimization [45], Teaching-Learning-Based Optimization [46], and the Imperialist Competitive Algorithm [47].
Harris Hawks Optimization (HHO) [24] is a swarm intelligence optimization algorithm proposed by Heidari et al. in 2019 that simulates the prey-hunting process of Harris's hawks in nature. It was inspired by the three phases of the hawks' predatory behavior: search, search-exploitation conversion, and exploitation. The algorithm has a simple principle, few parameters, and good global search capability, and has therefore been applied in image segmentation [48], neural network training [49], motor control [50], and other fields. However, like other swarm intelligence optimization algorithms, HHO converges slowly, has low optimization accuracy, and easily falls into local optima when solving complex optimization problems, and several improved variants have been proposed. For example, the authors of [51] used an information exchange mechanism to enhance population diversity and thereby improve convergence speed, yielding the HHO with information exchange (IEHHO) algorithm; its limitation lies in how its parameters should be set. Zhang et al. [52] introduced an exponentially decreasing update of the energy factor, so that relatively higher values of the escaping energy improve the exploration and exploitation capability. Elgamal et al. [53] made two improvements: (1) they applied chaotic mapping in the initialization phase of HHO; and (2) they used the SA algorithm on the current best solution to improve HHO's exploitation. Shiming Song et al. [54] applied Gaussian mutation and the dimension-decision strategy of the cuckoo search method to HHO to increase its performance: the cuckoo search mechanism improved the convergence speed of the search agents and ensured sufficient excavation of solutions in the search area, while the Gaussian mutation strategy increased accuracy and helped the algorithm jump out of local optima.
However, according to the no-free-lunch theorem [55], no single meta-heuristic algorithm can perform best on all problems. The original HHO method cannot fully balance the exploration and exploitation phases, which results in insufficient global search capability and slow convergence. To alleviate these adverse effects, we propose an improved algorithm called chaotic multi-strategy search HHO (CSHHO), which introduces chaotic mapping and a global search strategy to solve single-objective optimization problems efficiently.
Here, the initialization phase of HHO is replaced by chaotic mapping, which distributes the initial population evenly between the upper and lower bounds to enhance population diversity while enabling the population to approach the prey location faster, accelerating the convergence of the algorithm. Adaptive weights are added to the position update formula in the exploration phase of HHO to dynamically adjust the influence of the global optimal solution. In the update phase of HHO, the optimal neighborhood perturbation strategy is introduced to prevent the algorithm from falling into local optimal solutions and to counter premature convergence.
To verify the superior performance of the CSHHO algorithm, this work first tests the effect of common chaotic mappings on the HHO algorithm's performance. The selected chaotic mappings are Sinusoidal, Tent, Kent, Cubic, Logistic, Gauss, and Circle, and the experimental results will show that Gauss chaotic mapping improves the accuracy of the HHO algorithm to the greatest extent. Second, the HHO algorithm based on Gauss chaotic mapping with multi-strategy search is tested. It is then compared with other classic and state-of-the-art algorithms on the 23 classical test functions and the 30 IEEE CEC2017 competition functions, and the significant superiority of the proposed paradigm over the other algorithms is verified by the Friedman test and the Bonferroni–Holm corrected Wilcoxon signed-rank test. Finally, CSHHO is applied to model the reactive power output of a synchronous condenser based on LSSVM. The complete results will show that the proposed optimizer is more effective than the other models in the experiment.
The remainder of this paper is organized as follows: Section 2 introduces the basic theory and structure of the original HHO algorithm. Section 3 introduces the chaotic operator and the global search strategies and integrates them into the original optimizer. Section 4 conducts a full range of experiments on the proposed method, presents the experimental results, and discusses the method in light of them. Section 5 applies the proposed method to the LSSVM-based synchronous condenser reactive power output problem. Finally, Section 6 summarizes the study and proposes ideas for future research.

2. Harris Hawks Optimization Algorithm

The HHO algorithm is a swarm intelligence optimization algorithm that is widely used for solving optimization problems. Its main idea derives from the cooperative behavior and chasing strategy of Harris's hawks when catching prey in nature [24]. During prey capture, the HHO algorithm is divided into two phases according to the physical energy E of the escaping prey: the exploration and exploitation phases, as shown in Figure 1. During the exploration phase, Harris's hawks randomly select a perching location from which to observe and monitor their prey:
$$X_i(t+1)=\begin{cases}X_{rand}(t)-r_1\left|X_{rand}(t)-2r_2X_i(t)\right|, & q\ge 0.5\\ \left(X_{rabbit}(t)-X_m(t)\right)-r_3\left[lb+r_4\left(ub-lb\right)\right], & q<0.5\end{cases}$$
where $X_{rabbit}(t)$ and $X_{rand}(t)$ denote the position of the prey and of a randomly selected individual at iteration t, respectively; q and $r_1$–$r_4$ are random numbers in (0, 1); and the average individual position is:
$$X_m(t)=\frac{1}{N}\sum_{i=1}^{N}X_i(t)$$
As the physical capacity of the prey decreases, the exploration phase changes to the exploitation phase; the prey's escaping energy factor E is:
$$E=2E_0\left(1-\frac{t}{T}\right)$$
where $E_0\in(-1,1)$ is the initial energy of the prey and T is the maximum number of iterations.
In the exploitation phase, the Harris's hawk launches a surprise attack on the target prey found during exploration, and the prey tries to escape when it senses danger. Let the randomly generated escape probability of the prey be r: when r < 0.5 the prey escapes successfully, and when r ≥ 0.5 it does not. According to the magnitudes of r and |E|, four different position update strategies are used in the exploitation phase (see Table 1).
According to these position update conditions, the position of each Harris hawk is updated continuously and its fitness is calculated from its position; if the fitness threshold is reached, the algorithm terminates. Otherwise it continues executing until the maximum number of iterations is reached and returns the optimal solution found (see Figure 2).
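To make the search mechanics concrete, the following Python sketch implements the exploration update of Equations (1) and (2) and the escaping-energy schedule of Equation (3). It is a minimal illustration only; names such as hawks, rabbit, and hho_exploration_step are ours, not those of the reference implementation.

import numpy as np

def hho_exploration_step(hawks, rabbit, lb, ub, rng):
    # One exploration-phase update, Equation (1); hawks is an (N, D) array.
    n, _ = hawks.shape
    x_mean = hawks.mean(axis=0)                  # Equation (2): average individual position
    new_hawks = np.empty_like(hawks)
    for i in range(n):
        q, r1, r2, r3, r4 = rng.random(5)
        if q >= 0.5:                             # perch relative to a randomly chosen hawk
            x_rand = hawks[rng.integers(n)]
            new_hawks[i] = x_rand - r1 * np.abs(x_rand - 2.0 * r2 * hawks[i])
        else:                                    # perch relative to the rabbit and the mean
            new_hawks[i] = (rabbit - x_mean) - r3 * (lb + r4 * (ub - lb))
    return np.clip(new_hawks, lb, ub)

def escaping_energy(t, t_max, rng):
    e0 = 2.0 * rng.random() - 1.0                # initial energy E0 in (-1, 1)
    return 2.0 * e0 * (1.0 - t / t_max)          # Equation (3): decays linearly with t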

3. HHO Algorithm Based on Multi-Search Strategy

3.1. Reasons for Improving the Basic HHO Algorithm

Harris's hawks generally gather high in trees to hunt for prey. While hunting, they often hover in a spiral to capture prey; when approaching prey, they rush toward it at speed until the distance to the prey is small, then slow down and adjust their body posture to increase the probability of a catch [56,57,58]. This mechanism is important for the HHO algorithm. The exploration phase of the basic HHO algorithm uses Equations (1)–(3), where the optimal solution from the previous iteration affects the current solution and can cause the algorithm to fall into a local optimum, and the linear search for prey leads to a single search result. Across all iterations, the optimal position is updated only when the algorithm finds a solution better than the current one, so the overall number of updates of the optimal position is low, which reduces the efficiency of the search. In reality, when a Harris's hawk chases its prey, it hovers and descends in a spiral to catch it adaptively, showing better agility when hunting.
Here, the optimal neighborhood disturbance strategy was introduced to enhance the convergence speed of the algorithm and avoid premature convergence. The adaptive weight and variable spiral position update strategies were introduced to enhance the global search capability by simulating the predation process of Harris's hawks in nature. To make the initial solutions generated in the population initialization phase cover the solution space as completely as possible, we selected the best of seven commonly used chaotic mapping population initialization methods and used it as the population initialization method of the improved algorithm. Together, these four methods improve the global search capability of the HHO algorithm and increase the speed at which the Harris hawks find the optimal solution.

3.2. Chaotic Mapping

Chaos is a deterministic, random-like process found in non-periodic, non-convergent, and bounded nonlinear dynamic systems. Mathematically, chaos is the randomness of a simple deterministic dynamic system, and a chaotic system can be regarded as a source of randomness. Although chaos appears random and unpredictable, it also possesses regularity [59].
Population initialization is an important part of the algorithm, and its result directly affects the convergence speed and solution quality [60,61]. For example, a uniform distribution covers the solution space more completely than a random distribution, making it easier to obtain good initial solutions. The classical HHO algorithm uses random population initialization, which cannot guarantee coverage of the whole solution space. A chaotic sequence has ergodicity, randomness, and regularity within a certain range; compared with random search, it traverses the search space more thoroughly with higher probability, which enables the algorithm to escape local optima and maintain population diversity. Based on the above analysis, to obtain good initial solution positions and speed up the convergence of the population, seven common chaotic mappings (Sinusoidal, Tent, Kent, Cubic, Logistic, Gauss, and Circle) were selected [62,63,64,65,66,67,68,69] and used to initialize the population of the HHO algorithm. The results were analyzed, and the optimal mapping for the HHO algorithm was selected as the population initialization method of the improved algorithm. The mathematical formulas of the seven chaotic mappings were as follows:
(1) Sinusoidal chaotic mapping:
$$x_{k+1}=Px_k^2\sin(\pi x_k)$$
where P was the control parameter; with $P=2.3$ and $x_0=0.7$, Equation (14) was written as
$$x_{k+1}=\sin(\pi x_k)$$
(2) Tent chaotic mapping
$$x_{k+1}=\begin{cases}2x_k, & x_k<0.5\\ 2\left(1-x_k\right), & x_k\ge 0.5\end{cases}$$
(3) Kent chaotic mapping
$$x_{k+1}=\begin{cases}x_k/\mu, & 0<x_k<\mu\\ \left(1-x_k\right)/\left(1-\mu\right), & \mu\le x_k<1\end{cases}$$
The control parameter $\mu\in(0,1)$; at $\mu=0.5$ the system degenerates into a short-period state, so $\mu=0.5$ was not used here. When using this chaotic mapping, the initial value $x_0$ must also differ from the system parameter $\mu$, otherwise the system evolves into a periodic one. Here, we took $\mu=0.4$.
(4) Cubic chaotic mapping
The standard Cubic chaotic mapping function was expressed as
$$x_{k+1}=bx_k^3-cx_k$$
where b and c were the influence factors of the chaotic mapping; the range of the Cubic chaotic mapping differed for different values of b and c. When $c=3$, the sequence generated by the Cubic mapping was chaotic: for b = 1, $x_n\in(-2,2)$, and for b = 4, $x_n\in(-1,1)$. Here, we took b = 4 and c = 3.
(5) Logistic chaotic mapping
$$x_{k+1}=Px_k\left(1-x_k\right)$$
when P = 4, the sequence generated by the Logistic chaotic mapping lay in (0, 1).
(6) Gauss chaotic mapping
$$x_{k+1}=\begin{cases}0, & x_k=0\\ \dfrac{1}{x_k}\bmod 1, & \text{otherwise}\end{cases},\qquad \frac{1}{x_k}\bmod 1=\frac{1}{x_k}-\left\lfloor\frac{1}{x_k}\right\rfloor$$
(7) Circle chaotic mapping
$$x_{k+1}=\operatorname{mod}\left(x_k+0.2-\frac{0.5}{2\pi}\sin\left(2\pi x_k\right),\ 1\right)$$
The three steps to initialize the population of the HHO using the seven chaotic mappings were:
  • Step 1: Randomly generate M Harris hawks in D-dimensional space, i.e., $Y=(y_1,y_2,y_3,\ldots,y_M)$, $y_i\in(-1,1)$, $i=1,2,\ldots,M$.
  • Step 2: Iterate the chaotic map over each dimension of each Harris hawk M times, resulting in M chaotic Harris hawks.
  • Step 3: After all Harris hawk iterations were completed, map the chaotic individuals into the solution space using Equation (22):
$$x_i^d=lb_d+\left(1+y_i^d\right)\times\frac{ub_d-lb_d}{2}$$
where $ub_d$ was the upper bound of the exploration space and $lb_d$ the lower bound; the d-th dimensional coordinate of the i-th chaotic Harris hawk, generated using Equations (14)–(21), was $y_i^d$, and the coordinate of the i-th Harris hawk in the d-th dimension of the exploration space, generated using Equation (22), was $x_i^d$.
Here, we first propose HHO algorithms based on the seven chaotic initialization strategies, the chaotic initialization Harris hawks optimization (CIHHO) algorithms. The implementation of CIHHO is basically the same as that of HHO, except that the initialization generates M individual Harris hawks using Equations (14)–(21) in Step 2 and then maps the positions of these M Harris hawks into the population search space using Equation (22).
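As an illustration of Steps 1–3, the following Python sketch initializes a population with the Gauss chaotic map and the space mapping of Equation (22). The warm-up length and the rescaling of the map's [0, 1) output to (−1, 1) are our assumptions, made so that Equation (22) spans the whole search box.

import numpy as np

def gauss_map(y):
    # Gauss chaotic map: y_{k+1} = (1/y_k) mod 1, with y_k = 0 mapped to 0.
    out = np.zeros_like(y)
    nz = y != 0.0
    out[nz] = (1.0 / y[nz]) % 1.0
    return out

def chaotic_init(n_hawks, dim, lb, ub, n_warmup=100, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    y = rng.random((n_hawks, dim))     # Step 1: random chaotic seeds for every dimension
    for _ in range(n_warmup):          # Step 2: iterate the chaotic map
        y = gauss_map(y)
    y = 2.0 * y - 1.0                  # assumed rescaling of [0, 1) to (-1, 1), since Equation (22) expects y in (-1, 1)
    return lb + (1.0 + y) * (ub - lb) / 2.0   # Step 3, Equation (22)

pop = chaotic_init(n_hawks=30, dim=30, lb=-100.0, ub=100.0)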

3.3. Adaptive Weight

Inspired by the Harris's hawk hunting strategy, we added to the position update an adaptive weight that changes with the number of iterations. Early in the exploration phase of HHO, the influence of the optimal Harris's hawk position on the current individual's position adjustment is weakened to improve the global search ability of the algorithm. As the number of iterations increases, the influence of the optimal position gradually grows, so that the other hawks converge quickly to it, improving the convergence speed of the whole algorithm. Based on how the number of updates varies in the HHO algorithm, an adaptive weight depending on the iteration number t was chosen as follows:
$$w(t)=0.2\cos\left(\frac{\pi}{2}\left(1-\frac{t}{t_{\max}}\right)\right)$$
The adaptive weight w(t) varies nonlinearly from 0 to 0.2 because its cosine argument sweeps $[0,\frac{\pi}{2}]$: at the beginning of the exploration phase the weight is small but changes relatively quickly, while at the end of the exploration phase its value is larger but changes more slowly, which fully guarantees the convergence of the algorithm. The improved HHO position update formula is:
$$X(t+1)=\begin{cases}w(t)X_{rand}(t)-r_1\left|X_{rand}(t)-2r_2X(t)\right|, & q\ge 0.5\\ w(t)\left(X_{rabbit}(t)-X_m(t)\right)-r_3\left(lb+r_4\left(ub-lb\right)\right), & q<0.5\end{cases}$$
With the adaptive weights, the position update dynamically adjusts the weight as the iterations proceed, so that the randomly selected position $X_{rand}(t)$ and the term $X_{rabbit}(t)-X_m(t)$ guide the individual Harris hawks differently at different stages. As the number of iterations increases, the population moves closer to the optimal position, and the larger weights speed up this movement, accelerating the convergence of the algorithm.
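A minimal Python sketch of Equations (23) and (24) follows; it assumes the same uniform random parameters r1–r4 and q as the basic HHO, and the function names are illustrative.

import numpy as np

def adaptive_weight(t, t_max):
    # Equation (23): grows nonlinearly from 0 at t = 0 to 0.2 at t = t_max.
    return 0.2 * np.cos((np.pi / 2.0) * (1.0 - t / t_max))

def weighted_exploration(x_i, x_rand, rabbit, x_mean, lb, ub, t, t_max, rng):
    # Equation (24): exploration update scaled by the adaptive weight w(t).
    w = adaptive_weight(t, t_max)
    q, r1, r2, r3, r4 = rng.random(5)
    if q >= 0.5:
        return w * x_rand - r1 * np.abs(x_rand - 2.0 * r2 * x_i)
    return w * (rabbit - x_mean) - r3 * (lb + r4 * (ub - lb))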

3.4. Variable Spiral Position Update

In the search phase of the HHO algorithm, the Harris's hawk randomly searches for prey using two equally likely strategies, one based on the target's location and one based on its own location. In nature, however, Harris's hawks generally hover in a spiral to search for prey. To simulate this real search process, we introduced a variable spiral position update strategy into the search phase of the HHO algorithm, so that the Harris's hawk adjusts the distance of each position update along a spiral between the target position and its own position (see Figure 3).
In the exploration phase of the HHO algorithm, Equation (1), a parameter b was introduced to control the shape of the spiral. If b were set to a constant, the spiral movement of the Harris's hawk while searching for prey would be too uniform: it would approach the target along the same fixed spiral every time, which could trap the algorithm in a local optimum and weaken its global exploration ability. To address this, we introduced the idea of a variable spiral search, enabling the Harris's hawk to develop more diverse search paths for its position update: the parameter b was designed as a variable that changes with the number of iterations, dynamically adjusting the shape of the spiral during exploration to increase the hawk's ability to explore unknown areas and thus improve the global search capability of the algorithm. After combining the adaptive weights, the new spiral position update was created (see Equation (25)).
The b parameter was designed from the mathematical model of a spiral: by introducing the iteration number into the original spiral model, the spiral shape is dynamically adjusted, shrinking from large to small as the number of iterations increases. Early in the exploration phase of the HHO algorithm, the Harris's hawk searches for the target with a large spiral shape, exploring the global optimum as widely as possible to improve the global search capability; later in the exploration phase, it searches with a small spiral shape to improve the search accuracy.
$$X(t+1)=\begin{cases}w(t)X_{rand}(t)-b\left|X_{rand}(t)-2r_2X(t)\right|, & q\ge 0.5\\ w(t)\left(X_{rabbit}(t)-X_m(t)\right)-b\left(lb+r_4\left(ub-lb\right)\right), & q<0.5\end{cases},\qquad b=e^{5\left(\pi\left(1-\frac{t}{t_{\max}}\right)\right)}$$
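The following sketch shows the variable spiral update of Equation (25). The exact exponent of b is reconstructed from the text (b is large early in the run and shrinks as t grows), so treat the expression as an assumption rather than the authors' verified formula.

import numpy as np

def spiral_exploration(x_i, x_rand, rabbit, x_mean, lb, ub, t, t_max, rng):
    b = np.exp(5.0 * np.pi * (1.0 - t / t_max))          # spiral shape: large early, decaying to 1 at t = t_max
    w = 0.2 * np.cos((np.pi / 2.0) * (1.0 - t / t_max))  # adaptive weight of Equation (23)
    q, r2, r4 = rng.random(3)
    if q >= 0.5:
        return w * x_rand - b * np.abs(x_rand - 2.0 * r2 * x_i)
    return w * (rabbit - x_mean) - b * (lb + r4 * (ub - lb))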

3.5. Optimal Neighborhood Disturbance

When updating its position, a Harris's hawk generally takes the current optimal position as the target of the iteration. Over the whole run, the optimal position is updated only when a better position is found, so the total number of updates is small, which makes the search inefficient. In this regard, the optimal neighborhood disturbance strategy was introduced to randomly search the neighborhood of the optimal position for a better global value, which not only improves the convergence speed of the algorithm but also avoids premature convergence. A random disturbance is applied to the optimal position to increase its search of the nearby space, and the neighborhood disturbance formula was:
$$\tilde{X}(t)=\begin{cases}X^*(t)+0.5\,h\,X^*(t), & g<0.5\\ X^*(t), & g\ge 0.5\end{cases}$$
where h and g were random numbers uniformly generated in [0, 1], and $\tilde{X}(t)$ was the newly generated position. A greedy strategy was used to decide whether to keep the generated neighborhood position:
$$X^*(t)=\begin{cases}\tilde{X}(t), & f(\tilde{X}(t))<f(X^*(t))\\ X^*(t), & f(\tilde{X}(t))\ge f(X^*(t))\end{cases}$$
where f(x) was the fitness value of position x. If the generated position was better than the original, it replaced the original as the global optimum; otherwise, the optimal position remained unchanged.
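A compact Python sketch of the disturbance and greedy-selection rules above, for a minimization objective f; the helper name is ours.

import numpy as np

def neighborhood_disturbance(x_best, f, rng):
    # Randomly perturb the current optimum, then keep the perturbed
    # point only if it improves the fitness (greedy acceptance).
    h, g = rng.random(2)
    x_new = x_best + 0.5 * h * x_best if g < 0.5 else x_best.copy()
    return x_new if f(x_new) < f(x_best) else x_best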

3.6. Computational Complexity

The computational complexity of the population initialization in the classical HHO algorithm is O(N), and that of its update mechanism is O(T × N) + O(T × N × D), so the overall complexity of the classical HHO algorithm is O(N × (T + TD + 1)), where T is the maximum number of iterations and D the dimension of the specific problem. The population initialization of the CSHHO algorithm costs O(N × D), since each dimension is iterated through the chaotic map, while its update mechanism has the same complexity as that of the classical HHO algorithm, giving the CSHHO algorithm an overall complexity of O(N × (T + TD + D)).

3.7. Algorithm Procedure

Algorithm 1 shows the procedure of the CSHHO optimization algorithm: See Algorithm 1.
Algorithm 1: CSHHO algorithm
Input: The population size N, maximum number of iterations T.
Output: The location of rabbit and its fitness value.
Initialize the population with the selected chaotic map (Gauss in CSHHO):
Generate the initial chaotic vectors from the chaotic variables $y_i^k\in[0,1]$, k = 1, 2, …, M (M is the initial population dimension), using Equations (14)–(21);
Map them into the corresponding solution space through Equation (22);
While the stopping condition is not met do
Calculate the fitness values of hawks;
   Set X r a b b i t as best location of rabbit;
   For each $X_i$ do
      Update E using Equation (3) and w(t) using Equation (23);
      if $|E|\ge 1$ then
         Update the vector $X_i$ using Equations (25) and (2); randomly generate the parameters $r_2$, $r_4$, q;
         $b=e^{5\left(\pi\left(1-\frac{t}{t_{\max}}\right)\right)}$;
         if $q\ge 0.5$ then
            $X_i(t+1)=w(t)X_{rand}(t)-b\left|X_{rand}(t)-2r_2X_i(t)\right|$;
         end
         if q < 0.5 then
            $X_i(t+1)=w(t)\left(X_{rabbit}(t)-X_m(t)\right)-b\left(lb+r_4(ub-lb)\right)$;
         end
      end
      if |E| < 1 then
         if r ≥ 0.5 and |E| ≥ 0.5 then
            Update the vector $X_i$ using Equations (4)–(6);
         else if r ≥ 0.5 and |E| < 0.5 then
            Update the vector $X_i$ using Equation (7);
         else if r < 0.5 and |E| ≥ 0.5 then
            Update the vector $X_i$ using Equations (8)–(11);
         else if r < 0.5 and |E| < 0.5 then
            Update the vector $X_i$ using Equations (12) and (13);
         end
      end
   end
   Optimal neighborhood disturbance using Equations (27) and (28):
   $\tilde{X}(t)=\begin{cases}X^*(t)+0.5hX^*(t), & g<0.5\\ X^*(t), & g\ge 0.5\end{cases}$;
   $X^*(t)=\begin{cases}\tilde{X}(t), & f(\tilde{X}(t))<f(X^*(t))\\ X^*(t), & f(\tilde{X}(t))\ge f(X^*(t))\end{cases}$;
end
Return X r a b b i t ;

4. Experiments and Discussion

In this section, to test and verify the performance of our proposed optimizer, CSHHO, several categories of experiments were designed. Given the randomness of the HHO algorithm, a sufficiently large and accepted set of test functions was used to ensure that the superior results of the CSHHO algorithm did not happen by accident. Therefore, two different benchmark suites were used: the 23 classical well-known benchmark functions [70,71] and the standard IEEE CEC 2017 suite [72]. The experiments were as follows:
Experiment 1: First, the seven chaotic mappings were used separately as the HHO population initialization method and tested. Second, the seven data sets were analyzed, and the optimal chaotic mapping was selected as the population initialization method of the improved algorithm.
Experiment 2: First, on the basis of Experiment 1, a combination test of the adaptive weight mechanism, the variable spiral position update, and the optimal neighborhood disturbance mechanism was executed. Second, we compared the CSHHO algorithm with other recently proposed meta-heuristic algorithms, namely HHO [24], WOA [38], SCA [73], and Chicken Swarm Optimization (CSO) [74]. Third, we compared the CSHHO algorithm with developed advanced variants: HHO with dimension decision logic and Gaussian mutation (GCHHO) [54], the Hybrid PSO Algorithm with Adaptive Step Search (DEPSOASS) [75], the Gravitational search algorithm with a linearly decreasing gravitational constant (Improved GSA) [76], and the Dynamic Generalized Opposition-based Learning Fruit Fly Algorithm (DGOBLFOA) [77]. Fourth, based on the third step, IEEE CEC 2017 was used to perform an accuracy scalability test with dimensions D = 50 and D = 100.
To ensure the fairness of the experiments, all algorithms were evaluated with the same parameters: all population sizes N were set to 30, the dimension D was set to 30, and each algorithm was run 50 times independently on each test instance. The function error value was computed as $\log\left(F(x)-F(x^*)\right)$, where F(x) was the mean value found over all runs and $F(x^*)$ the optimal value recorded for the 23 benchmark functions. The average error (Mean) and standard deviation (Std) of the function error values were the two performance metrics used to evaluate each algorithm over all runs. The experimental environment was: CPU Intel(R) Xeon(R) E5-2680 v3 (2.50 GHz), RAM 16.00 GB, MATLAB R2019b.

4.1. Benchmark Functions Verification

All experiments were performed using the classical 23 test functions [70,71] to test the performance of each algorithm in terms of convergence speed and search accuracy. These benchmark functions were divided into two categories: unimodal (UM) and multi-modal (MM). F1–F7 were the UM functions, which had a unique global optimum and were used to test the exploitation performance of optimization algorithms. F8–F23 were the MM functions, used to test the exploration performance of the optimization algorithms and their local optimum (LO) avoidance potential. As the complexity of the test functions increased, the tested algorithms were more likely to fall into local optima, and all the test functions together evaluated the performance of the tested algorithms in various aspects. The convergence curves and test values of the corresponding test functions are given. Appendix A shows the classical 23 test functions.
IEEE CEC 2017 functions were also used in Experiment 2 to evaluate the scalability of CSHHO, other meta-heuristic algorithms, and developed HHO advanced variants. IEEE CEC 2017 Benchmark functions were classified into four categories, consisting of three UM functions (F1–F3), seven MM functions (F4–F10), 10 hybrid functions (F11–F20), and 10 composite functions (F21–F30). To evaluate the scalability of each algorithm more comprehensively, the dimensions of Benchmark functions were set to D = 50, D = 100, and Table 2 records the corresponding accuracy values. It also shows the function formulas for IEEE CEC 2017.
In addition, to compare the performance of the algorithms, the mean values of all algorithms in the simulation experiments were ranked from lowest to highest. The lower the rank, the better the algorithm performed relative to the others; conversely, the higher the rank, the worse. The Wilcoxon signed-rank test [78] was used to detect significant performance differences among the algorithms, with the p-values corrected by the Bonferroni–Holm procedure [79]; moreover, the Friedman test [80] was used to rank the overall superiority of all the algorithms.
We used the Friedman test values to rank all the compared algorithms; identical values received averaged ranks. Here, the Friedman test was performed on the classical 23 test functions, and the test values were recorded in the average ranking value (ARV) column.
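For reference, this protocol can be reproduced along the following lines in Python with SciPy: pairwise Wilcoxon signed-rank tests against CSHHO, a Bonferroni–Holm correction of the p-values, and a Friedman test over all algorithms. The data below are random placeholders rather than the paper's results, and the simple Holm loop omits the usual monotonicity enforcement.

import numpy as np
from scipy.stats import wilcoxon, friedmanchisquare

rng = np.random.default_rng(0)
results = {name: rng.random(23) for name in ["CSHHO", "HHO", "WOA", "SCA"]}  # placeholder scores

# Pairwise Wilcoxon signed-rank tests against CSHHO.
p_raw = {name: wilcoxon(results["CSHHO"], results[name]).pvalue
         for name in results if name != "CSHHO"}

# Bonferroni-Holm: sort p-values ascending and scale the k-th smallest by (m - k).
m = len(p_raw)
p_holm = {}
for k, (name, p) in enumerate(sorted(p_raw.items(), key=lambda kv: kv[1])):
    p_holm[name] = min(1.0, (m - k) * p)

stat, p_friedman = friedmanchisquare(*results.values())  # overall ranking test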

4.2. Efficiency Analysis of the Improvement Strategy

First, in the population initialization phase, we selected the Gauss mapping, which had the greatest impact on the accuracy of the HHO algorithm among seven commonly used chaotic mappings, as the population initialization method of CSHHO. Second, a global optimization strategy consisting of three components, the adaptive weight strategy, the variable spiral update strategy, and the optimal neighborhood disturbance strategy, was used to improve the HHO algorithm. To verify the performance gains from these two improvements, the following algorithms were used for comparison:
  • HHO without any modification, i.e., basic HHO;
  • WOA without any modification, i.e., basic WOA;
  • SCA without any modification, i.e., basic SCA;
  • CSO without any modification, i.e., basic CSO;
  • HHO with dimension decision logic and Gaussian mutation (GCHHO);
  • Hybrid PSO Algorithm with Adaptive Step Search (DEPSOASS);
  • Gravitational search algorithm with linearly decreasing gravitational constant (Improved GSA);
  • Dynamic Generalized Opposition-based Learning Fruit Fly Algorithm (DGOBLFOA).

4.2.1. Influence of Seven Common Chaotic Mappings on HHO Algorithm

To select the most effective of the seven well-known chaotic mapping methods, the one that yields the best initial solution positions and speeds up the convergence of the Harris hawk population, the Sinusoidal, Tent, Kent, Cubic, Logistic, Gauss, and Circle chaotic mappings were each used to initialize the population of the HHO algorithm, forming Sinusoidal-HHO, Tent-HHO, Kent-HHO, Cubic-HHO, Logistic-HHO, Gauss-HHO, and Circle-HHO, and the accuracy of these seven algorithms was compared.
Table 3 presents the results of the seven algorithms for 23 classical test functions. The results included Best, Worst, Mean, Rank, and Std for each algorithm run 50 times independently.
Table 4 shows the Bonferroni–Holm corrected probability values p obtained from the Wilcoxon signed-rank test for the seven chaotic-mapping HHO algorithms. The symbols “+\=\−” give the number of functions on which an algorithm was better than, similar to, or worse than Gauss-HHO. The ARV column in Table 5 gives the Friedman test values for the seven chaotic-mapping HHO algorithms.
Table 3 shows the results, with the better experimental values in bold. Analyzing them, under the UM functions (F1–F7), Sinusoidal chaotic mapping achieved the optimal results on F3, F5, and F7, Circle chaotic mapping achieved the optimal results on F1 and F4, and Sinusoidal chaotic mapping influenced the HHO algorithm the most, followed by the Gauss and Circle mappings. Under the MM functions (F8–F23), Gauss chaotic mapping influenced the HHO algorithm the most, while the Circle, Sinusoidal, Tent, and Kent mappings obtained the best results on F21, F15, F20, F13, and F23, respectively. Comparing the seven chaotic mappings across the 23 test functions, Gauss chaotic mapping obtained the most optimal solutions.
Table 4 shows the Bonferroni–Holm corrected p-values of the Wilcoxon signed-rank test at the 5% confidence level; “+\=\−” indicates whether Gauss-HHO was worse than, equivalent to, or better than Circle-HHO, Sinusoidal-HHO, Tent-HHO, Kent-HHO, Cubic-HHO, and Logistic-HHO. Analyzing the corrected p-values and the “+\=\−” counts in each row of the table, Gauss-HHO obtained the better results across the 23 test functions. The seven chaotic-mapping HHO algorithms were then evaluated comprehensively with the Friedman test in Table 5: compared with the other six chaotic population initialization methods, Gauss-HHO obtained the best average ranking. This indicates that, for the HHO optimization algorithm, Gauss chaotic mapping not only retains the randomness, ergodicity, and initial value sensitivity of a chaotic map, but also yields better optimization accuracy when used to initialize the HHO population.

4.2.2. Comparison with Conventional Techniques

The Gauss mapping was used for population initialization, and the adaptive weight, variable spiral position update, and optimal neighborhood disturbance mechanisms were then introduced to form the CSHHO algorithm. To verify the effectiveness of CSHHO against swarm intelligence optimization algorithms that have emerged in recent years, this subsection compares it with recently published meta-heuristics, including HHO [24], WOA [38], SCA [73], and CSO [74], reporting the average precision (Mean) and stability (Std) of each algorithm. The performance of CSHHO was tested against the other optimization algorithms using a nonparametric test, the Bonferroni–Holm corrected Wilcoxon signed-rank test. Finally, the Friedman test was used to calculate the ARV values of all participating algorithms and rank them. As in Experiment 1, this experiment was based on the 23 classical test functions (see Appendix A). The experimental details were consistent with the description at the beginning of this section, and Table 6 shows the detailed results. Additionally, Table 7 gives the corrected Wilcoxon signed-rank test at the 5% confidence level; the “+\=\−” value is the number of CSHHO results that were worse than, similar to, or better than the comparison algorithm over 50 runs of each test function. The Friedman test results are reported in Table 8.
The optimal results of the tested algorithms under each benchmark function are marked in bold. Analyzing the data in Table 6, CSHHO had a strong optimization capability compared with the traditional optimization algorithms. Under the MM functions (F8–F23), CSHHO obtained the best optimization results on benchmarks F9–F13, F15–F19, and F21–F23: it explored the optimal regions of these benchmarks and outperformed the other compared algorithms in search performance, tying for first place on F9 and F11. This showed that CSHHO had strong exploration ability and LO avoidance potential.
Table 7 was analyzed to determine whether there was a significant difference between the other algorithms and CSHHO. The “+\=\−” column indicates the number of results that are worse than, similar to, or better than CSHHO for each of HHO, WOA, SCA, and CSO over 50 runs of each test function. CSHHO obtained better results than HHO on 21 test functions, better than WOA on 22, better than SCA on 23, and better than CSO on 22. The Bonferroni–Holm corrected Wilcoxon signed-rank test results at the 5% confidence level were then analyzed: if the p-value was greater than 0.05, the algorithm was considered equivalent to CSHHO; otherwise, the difference was considered significant. In most cases the corrected p-values were below 0.05, indicating that the CSHHO algorithm was significantly different from the other compared algorithms. In Table 8, the Friedman test value of CSHHO was 2.57, lower than those of the traditional optimization algorithms, so the CSHHO algorithm performed better than the other meta-heuristics.
Figure 4 shows the convergence curves of CSHHO and the traditional optimization algorithms HHO, WOA, SCA, and CSO on the 23 benchmark functions, covering the UM functions (F1–F7) and the MM functions (F8–F23). The function error value was defined as $\log(F(x)-F(x^*))$, where F(x) was the mean value found over all iterations and $F(x^*)$ the optimal value recorded for the 23 benchmark functions. Under the UM functions (F1–F7), the CSHHO algorithm converged faster and with higher accuracy than the other algorithms, indicating improved exploitation performance; under the MM functions (F8–F23), CSHHO likewise converged faster and more accurately, and from F8–F23 it did not become trapped in a local optimum region it could not escape. CSHHO converged faster on F9, F11, and F16, as well as on F12, F13, F17, F19, and F21–F23, although on some of these its convergence curve was smoother and its initial iterations slower as the algorithm searched more thoroughly. On F10 and F15, the CSHHO algorithm not only explored the dominant region with faster convergence speed but also led the remaining algorithms in search accuracy. Therefore, the CSHHO algorithm benefits from the Gauss chaotic mapping that enhances population initialization, and from the adaptive weight, variable spiral position update, and optimal neighborhood disturbance mechanisms that enhance its exploration and exploitation abilities: CSHHO is less likely to stagnate in the current search region and is better able to jump out of local optima.

4.2.3. Comparison with HHO Variants

To verify the effectiveness of the CSHHO algorithm against recent HHO variants, this subsection compares CSHHO with the recently published advanced variants GCHHO, DEPSOASS, Improved GSA, and DGOBLFOA, observing the mean accuracy (Mean) and stability (Std) of each algorithm. Table 9 presents the results. Nonparametric tests, the Wilcoxon signed-rank test and the Friedman test, were then used to assess the performance differences between CSHHO and these variants. The configuration of the experiments is the same as in Section 4.2.2; “+\=\−” gives the number of CSHHO results over 50 runs of each test function that are worse than, similar to, or better than the comparison algorithm. Table 10 presents the Bonferroni–Holm corrected Wilcoxon signed-rank test results, and Table 11 presents the Friedman test results.
Analyzing the data in Table 9, CSHHO has clear advantages over the advanced HHO variants. Under the UM functions (F1–F7), CSHHO outperforms the other algorithms on F1–F4, F6, and F7, indicating that its global best-finding capability is further enhanced relative to the advanced variants. Under the MM functions (F8–F23), CSHHO outperforms the remaining algorithms on F9, F10, F11, F16, F17, F18, F19, and F21, indicating that it can explore multi-modal landscapes deeply and effectively enough to prevent the algorithm from entering a local optimum. In summary, CSHHO has good exploitation and exploration abilities and avoids local optima.
Table 10 shows the results of the Wilcoxon signed-rank test, corrected at the 5% confidence level, for CSHHO versus GCHHO, DEPSOASS, Improved GSA, and DGOBLFOA. The p-values indicate whether the numerical results of the compared algorithms differ significantly from CSHHO: if the p-value is greater than 0.05, the results are considered equivalent to CSHHO; otherwise, they are considered significantly different. Analyzing the Bonferroni–Holm corrected p-values in Table 10, across the 23 classical test functions only GCHHO is not significantly different from CSHHO, on the F9–F11 and F16–F18 test functions; on the remaining test functions the differences are significant. Moreover, analyzing the Friedman test results in Table 11, the value of CSHHO is 1.70, lower than the others, indicating that CSHHO has an optimization advantage over the above algorithms.
Figure 5 shows the convergence curves of CSHHO and the variant optimization algorithms GCHHO, DEPSOASS, Improved GSA, and DGOBLFOA on the 23 benchmark functions, covering the UM functions (F1–F7) and the MM functions (F8–F23). The function error value is defined as $\log(F(x)-F(x^*))$, where F(x) is the mean value found over all iterations and $F(x^*)$ the optimal value recorded for the 23 benchmark functions. Under the UM functions (F1–F7), the CSHHO algorithm improves the convergence accuracy on F1–F4, F6, and F7 compared with the other algorithms, and its convergence speed is better on F1–F7. Under the MM functions (F8–F23), the CSHHO algorithm does not become trapped in a local optimum it cannot escape; on F9, F10, F11, F16, F17, F18, F19, and F21, CSHHO explores the dominant region well and leads the other algorithms in search accuracy. On F9, F10, F11, F12, F13, F21, F22, and F23, CSHHO has smooth convergence curves and faster convergence speed. On F16–F20, although the convergence speed of the CSHHO algorithm is slow, the convergence curve does not fluctuate strongly, indicating that the CSHHO algorithm has good search ability and does not fall into a local optimum it cannot jump out of. In summary, the CSHHO algorithm's performance is further improved compared with the GCHHO algorithm and the other advanced variants, against which CSHHO is effective.

4.2.4. Scalability Test on CSHHO

Dimensional data are an important basis for analyzing how the number of factors to be optimized influences an algorithm, and the purpose of the scalability test is to further verify the overall performance and stability of the optimization model. The experimental subjects in this section are CSHHO and HHO; 29 CEC2017 functions [72], in 50 and 100 dimensions, respectively, are used for the scalability experiments. The experimental parameters and environment are consistent with the previous experiments except for the dimensionality settings, and Table 12 shows the experimental results as Mean and Std.
The best numerical results between CSHHO and HHO are set in bold, and equivalent results are both bolded. Under the UM functions (F1–F2), the numerical results of CSHHO are overall better than those of HHO, and CSHHO maintains its advantage as the number of dimensions increases. Under the MM functions (F3–F9), CSHHO generally performs better than HHO in both 50 and 100 dimensions, the exceptions being F6 and F7 in 50 dimensions and F4 in 100 dimensions. Among the hybrid functions (F10–F19), CSHHO performs better on the rest of the test set except F11 and F12, where it performs worse than HHO. Among the composition functions (F20–F29), it still performs well: its accuracy on F21, F22, and F26 is higher than HHO's, its results on F22, F23, F24, and F26 match HHO's, and its results on F23 (50 dimensions) and F26 (100 dimensions) are lower than HHO's. In general, compared with HHO, CSHHO better balances the exploration and exploitation process as the number of dimensions increases.

5. Engineering Application

In this chapter, the proposed CSHHO is applied to model the reactive power output of a synchronous condenser. Because of the large number of UHV DC transmission projects in the power system, to ensure DC power consumption and peaking demand, the scale of conventional units on the receiving AC grid is reduced, the dynamic reactive power support capability of the system is weakened, and the voltage stability margin is reduced [81]. This requires dynamic reactive power compensation devices with instantaneous reactive power support characteristics in case of system failure, and synchronous condensers whose reactive power output characteristics meet the dynamic reactive power compensation requirements of the grid [82]. Modeling the reactive power support capability of a synchronous condenser is therefore of great theoretical significance and practical value for the reactive power control of converter stations in high-voltage DC transmission systems with synchronous condensers.
The existing methods for modeling the reactive power output of a synchronous condenser are mathematical analytical model calculation and fitting of experimental results [83,84,85,86]; both require large computational effort and have low accuracy, and few papers have studied the application of LSSVM to this modeling problem. The advantages of the least squares support vector machine (LSSVM) are that it rarely falls into local minima and has high generalization ability [87]. Researchers have used various intelligent optimization algorithms to find the optimal kernel function and regularization parameters, including the GA [88], Particle Swarm Optimization (PSO) [89], the Free Search algorithm (FS) [90], Ant Colony Optimization (ACO) [91], the ABC algorithm [92], the GWO algorithm [93], and the Backtracking Search Optimization Algorithm (BSA) [94]. However, traditional swarm optimization algorithms are prone to falling into local optima and to low convergence accuracy during the search. According to the results above, CSHHO not only reduces the probability of falling into a local optimum and improves convergence accuracy, but also retains the advantages of the basic Harris Hawks optimization algorithm: (1) the steadiness of the searching cores; (2) the fruitfulness of the initial iterations; and (3) the progressive selection scheme [5].
Based on the numerical performance and global search capability of CSHHO, this paper proposes a CSHHO-LSSVM reactive power modeling method: CSHHO finds the optimal values of the penalty parameter, kernel function parameter, and loss function parameter of the LSSVM to build the CSHHO-LSSVM model of the synchronous condenser's reactive power output.

5.1. Principle of LSSVM

Support Vector Machine (SVM) is an ML method based on statistical learning theory. With the kernel function at its core, it implicitly maps the data from the original space to a high-dimensional feature space and then finds a linear relationship in that feature space [87].
LSSVM is a regression algorithm that extends the basic SVM. Compared with the SVM algorithm, LSSVM requires fewer parameters and is more stable. LSSVM simplifies the complex constraints, which makes the improved SVM more capable of handling data. Moreover, by using the error sum of squares as the loss function, LSSVM enhances regression prediction performance and improves prediction accuracy while reducing the complexity, and hence the processing time, of the algorithm and providing more flexibility. LSSVM uses a nonlinear model on the basis of SVM:
$$f(x)=\left\langle\omega,\phi(x)\right\rangle+b$$
The input data were $(x_i,y_i)$, $i=1,\ldots,l$, where $x_i\in R^d$ denoted the input elements, d the dimension, $y_i\in R$ the expected output value, and l the total number of inputs; $\phi(x)$ denoted the mapping function. The LSSVM optimization objective was:
$$\min\ \frac{1}{2}\|\omega\|^2+\frac{1}{2}\gamma\sum_{i=1}^{l}e_i^2\qquad \text{s.t.}\quad \omega^T\varphi(x_i)+b+e_i=y_i,\ i=1,\ldots,l$$
where $e_i$ denoted the error, whose magnitude determined the prediction accuracy; $e\in R^{l\times1}$ denoted the error vector, and $\gamma$ denoted the regularization parameter, which weighted the error term. Adding the Lagrangian multipliers $\lambda\in R^{l\times1}$ to Equation (29), Equation (30) was expressed as:
$$\min J=\frac{1}{2}\|\omega\|^2+\frac{1}{2}\gamma\sum_{i=1}^{l}e_i^2-\sum_{i=1}^{l}\lambda_i\left(\omega^T\phi(x_i)+b+e_i-y_i\right)$$
From the KKT condition, we obtained:
$$\begin{cases}\dfrac{\partial J}{\partial\omega}=0\ \Rightarrow\ \omega=\displaystyle\sum_{i=1}^{l}\lambda_i\varphi(x_i)\\ \dfrac{\partial J}{\partial b}=0\ \Rightarrow\ \displaystyle\sum_{i=1}^{l}\lambda_i=0\\ \dfrac{\partial J}{\partial e_i}=0\ \Rightarrow\ \lambda_i=\gamma e_i,\ i=1,2,\ldots,l\\ \dfrac{\partial J}{\partial\lambda_i}=0\ \Rightarrow\ \omega^T\varphi(x_i)+b+e_i-y_i=0,\ i=1,2,\ldots,l\end{cases}$$
By eliminating the slack variables e i and weight vectors ω , the optimization problem was linearized:
$$\begin{bmatrix}0 & Q^T\\ Q & K+\frac{1}{\gamma}I\end{bmatrix}\begin{bmatrix}b\\ A\end{bmatrix}=\begin{bmatrix}0\\ Y\end{bmatrix}$$
where $A=[\lambda_1,\lambda_2,\ldots,\lambda_N]^T$, $Q=[1,1,\ldots,1]^T$ was an $l\times1$ column vector, and $Y=[y_1,y_2,\ldots,y_N]^T$. According to the Mercer condition, K denoted the kernel matrix, $K(x_i,x_j)=\varphi(x_i)^T\varphi(x_j)$, $i,j=1,2,\ldots,N$. The Radial Basis Function kernel was chosen for the model:
$$k(x_i,x_j)=\exp\left(-\frac{\|x_i-x_j\|^2}{2\sigma^2}\right),\qquad \sigma>0$$
Therefore, the nonlinear prediction model was expressed by Equation (34):
$$y=\sum_{i=1}^{l}\lambda_iK(x_i,x)+b$$
When predicting with the least squares support vector regression model, the penalty factor and the radial basis kernel function parameter were the two parameters to be solved for.
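The training procedure therefore reduces to a single linear solve. The following Python sketch, with illustrative names and toy data, builds the RBF kernel matrix, solves the linear system above for b and the multipliers λ, and predicts with the model of Equation (34).

import numpy as np

def rbf_kernel(Xa, Xb, sigma):
    # k(x_i, x_j) = exp(-||x_i - x_j||^2 / (2 sigma^2))
    d2 = ((Xa[:, None, :] - Xb[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    # Solve [[0, 1^T], [1, K + I/gamma]] [b; lambda] = [0; y].
    n = len(X)
    K = rbf_kernel(X, X, sigma)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), K + np.eye(n) / gamma]])
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                       # bias b, multipliers lambda

def lssvm_predict(X_train, b, lam, X_new, sigma):
    # y(x) = sum_i lambda_i K(x_i, x) + b
    return rbf_kernel(X_new, X_train, sigma) @ lam + b

X = np.linspace(0.0, 1.0, 20)[:, None]           # toy regression data
y = np.sin(2.0 * np.pi * X).ravel()
b, lam = lssvm_fit(X, y, gamma=10.0, sigma=0.2)
y_hat = lssvm_predict(X, b, lam, X, sigma=0.2)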

5.2. Simulation and Verification

The reactive power regulation results of a synchronous condenser based on PSCAD/EMTDC simulation software were used as training samples and test samples. Table 11 shows the main parameters. The data with serial numbers 9, 14, 26 and 35 in Table 11 were taken as the test samples, and the rest were the training samples.
First, the data were preprocessed, and CSHHO was used to find the optimal penalty, kernel function, and loss function parameters (γ, σ, S) with which the LSSVM was trained; the test samples were then fed to the trained LSSVM, and the regression-fitted prediction model was output. The algorithm flow is shown in Figure 6. Figure 7 compares the outputs of the LSSVM model and the CSHHO-LSSVM model against the test samples, together with the errors of the two models. The output regression of the CSHHO-LSSVM model fit better and had higher accuracy.
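As a hedged sketch of the parameter search, the fitness handed to CSHHO (or any optimizer) can be the validation error of the LSSVM as a function of the two RBF-LSSVM parameters (γ, σ), reusing the lssvm_fit and lssvm_predict helpers from the sketch in Section 5.1; the train/validation split and the cshho call are illustrative assumptions, not the authors' exact pipeline.

import numpy as np

def lssvm_fitness(params, X_train, y_train, X_val, y_val):
    # Mean squared prediction error on held-out samples, minimized by the optimizer.
    gamma, sigma = params
    b, lam = lssvm_fit(X_train, y_train, gamma=gamma, sigma=sigma)
    y_hat = lssvm_predict(X_train, b, lam, X_val, sigma=sigma)
    return np.mean((y_hat - y_val) ** 2)

# best_params = cshho(lambda p: lssvm_fitness(p, X_tr, y_tr, X_va, y_va),
#                     lb=np.array([1e-2, 1e-2]), ub=np.array([1e3, 1e2]))  # hypothetical call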
To verify the generalization ability of the CSHHO-LSSVM model, it was evaluated by absolute deviation, as shown in Table 13. The table shows that the absolute deviation of the CSHHO-LSSVM model ranged from 0.0123 to 0.989, indicating that the accuracy of the CSHHO-LSSVM model was high.
From the reactive power and system voltage simulation results of the synchronous condenser in Table 14, Table 15 and Table 16, we can see that the maximum absolute error of the reactive power result of the CSHHO-LSSVM model was 0.989 Mvar, and the maximum absolute error of the system voltage result was 0.0415 kV; both were smaller than those of the LSSVM model, indicating that the CSHHO-LSSVM model had higher accuracy and better regression fitting performance.

6. Conclusions

Here, we analyzed the shortcomings of the basic HHO algorithm and applied chaotic mapping population initialization, adaptive weighting, variable spiral position update and optimal neighborhood disturbance mechanisms to the classical HHO algorithm. The Gauss chaotic mapping population initialization increased the coverage of the solution space by the initial solutions, the adaptive weighting mechanism sped up the movement of Harris hawk populations toward the optimal solution, and the variable spiral position update enhanced the search ability of Harris hawk populations. The optimal neighborhood disturbance mechanism increased the algorithm's global search capability and avoided premature convergence. To verify the performance of the four strategies, the experiments were separated into two groups.
First, seven commonly used chaotic mappings were selected for the population initialization of the HHO algorithm: Sinusoidal, Tent, Kent, Cubic, Logistic, Gauss, and Circle. The HHO algorithm's performance after population initialization with each of these seven mappings was evaluated; in terms of solution accuracy, initialization with the Gauss mapping was significantly better than initialization with the other mappings. Second, based on the results of the first set of experiments, the Gauss mapping was used for population initialization, and the adaptive weights, variable spiral position update, and optimal neighborhood disturbance mechanisms were introduced into the initialized algorithm. Next, CSHHO was compared with classical algorithms, including WOA, SCA and CSO, and advanced algorithms, including GCHHO, DEPSOASS, Improved GSA and DGOBLFOA, on the 23 classical test functions, and the means and standard deviations of all algorithms were analyzed. Subsequently, each algorithm's performance was evaluated comprehensively using Friedman's test and the Bonferroni–Holm corrected Wilcoxon signed-rank test at the 5% significance level, where the numerical analysis concluded that CSHHO outperformed the other algorithms. In detail, in the population initialization phase, Gauss chaotic mapping had the best results on the F2, F6, F12, F17, and F23 test functions and, compared with the remaining six chaotic mappings, obtained the most optimal solutions. The CSHHO algorithm outperformed HHO on 17 of the 23 classical benchmark functions, WOA on 21, SCA on 23, CSO on 22, and GCHHO on 9. Meanwhile, in the statistical experiments with the advanced and classical meta-heuristics, CSHHO obtained ARVs of 1.70 and 2.57, respectively, which were lower than the values obtained by the other meta-heuristics in the same groups of experiments. Furthermore, dimensional scalability tests were conducted for CSHHO on the IEEE CEC 2017 test set, including 50 and 100 dimensions, and the results showed that the improved optimizer effectively handled high-dimensional data with good stability.
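For readers who wish to reproduce this style of analysis, the following is an illustrative sketch of the reporting pipeline described above: a Friedman test across algorithms, followed by pairwise Wilcoxon signed-rank tests against CSHHO with a simplified Bonferroni–Holm step-down correction at α = 0.05. The `scores` dictionary (algorithm name mapped to per-function results) is a hypothetical input, not the paper's data.

```python
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

def compare(scores, baseline="CSHHO", alpha=0.05):
    stat, p = friedmanchisquare(*scores.values())
    print(f"Friedman test: statistic={stat:.3f}, p={p:.3g}")
    others = [k for k in scores if k != baseline]
    raw = [wilcoxon(scores[baseline], scores[k]).pvalue for k in others]
    order = np.argsort(raw)             # Holm: test smallest p-values first
    for rank, idx in enumerate(order):
        # Simplified step-down adjustment; a full Holm procedure also
        # enforces monotonicity of the adjusted p-values.
        adj = min(raw[idx] * (len(raw) - rank), 1.0)
        verdict = "significant" if adj < alpha else "not significant"
        print(f"{baseline} vs {others[idx]}: adjusted p={adj:.3g} ({verdict})")
```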
Here, the CSHHO algorithm was also applied to the engineering problem of modeling the reactive power output of the synchronous condenser. In view of the heavy computation and low accuracy of traditional reactive power output modeling methods for the synchronous condenser, CSHHO-LSSVM was used to model the reactive power output, exploiting the advantages of LSSVM, which does not easily fall into local minima and has strong generalization ability, and of CSHHO, which has high search accuracy and strong global search ability. The excitation current and excitation voltage of the synchronous condenser were used as the inputs of the LSSVM model, and the reactive power and system voltage were used as its outputs. CSHHO was used to find the optimal values of the penalty parameter, kernel function parameter, and loss function parameter of the LSSVM. The experiments showed that the CSHHO-LSSVM model had better accuracy and better regression fitting performance than LSSVM.
In future work, we will try to improve the convergence speed and search accuracy of the algorithm and balance its exploration and exploitation phases to obtain better search performance. Additionally, the next step will be to investigate how CSHHO can be used to solve multi-objective optimization problems. In addition, CSHHO can also be used for evolutionary ML, such as extreme learning machines and parameter tuning of convolutional neural networks. Other problems include grid scheduling and 3D multi-objective tracking.

Author Contributions

Methodology, C.W.; resources, S.J.; funding acquisition, S.J., R.G., Y.L. and Q.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the National Natural Science Foundation of China (No. 61871318), the Shaanxi Provincial Key Research and Development Project (No. 2019GY-099) and the Open Project of the Shaanxi Key Laboratory of Complex System Control and Intelligent Information Processing (No. 2020CP10).

Institutional Review Board Statement

The study did not involve humans or animals.

Informed Consent Statement

The study did not involve humans or animals.

Data Availability Statement

All information on data generation is given in detail in the related sections of the paper.

Acknowledgments

The authors thank the referees for their detailed and constructive criticism of the original manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. The classical 23 test functions.
Function | Dimensions | Range | $f_{\min}$
unimodal benchmark functions
$f_{1}(x)=\sum_{i=1}^{n} x_i^{2}$ | 30, 100, 500, 1000 | [−100, 100] | 0
$f_{2}(x)=\sum_{i=1}^{n}|x_i|+\prod_{i=1}^{n}|x_i|$ | 30, 100, 500, 1000 | [−10, 10] | 0
$f_{3}(x)=\sum_{i=1}^{n}\left(\sum_{j=1}^{i} x_j\right)^{2}$ | 30, 100, 500, 1000 | [−100, 100] | 0
$f_{4}(x)=\max_{i}\{|x_i|,\ 1\le i\le n\}$ | 30, 100, 500, 1000 | [−100, 100] | 0
$f_{5}(x)=\sum_{i=1}^{n-1}\left[100\left(x_{i+1}-x_i^{2}\right)^{2}+\left(x_i-1\right)^{2}\right]$ | 30, 100, 500, 1000 | [−30, 30] | 0
$f_{6}(x)=\sum_{i=1}^{n}\left(\left[x_i+0.5\right]\right)^{2}$ | 30, 100, 500, 1000 | [−100, 100] | 0
$f_{7}(x)=\sum_{i=1}^{n} i x_i^{4}+\mathrm{random}[0,1)$ | 30, 100, 500, 1000 | [−1.28, 1.28] | 0
multimodal benchmark functions
$f_{8}(x)=\sum_{i=1}^{n}-x_i\sin\left(\sqrt{|x_i|}\right)$ | 30, 100, 500, 1000 | [−500, 500] | −418.9829 × n
$f_{9}(x)=\sum_{i=1}^{n}\left[x_i^{2}-10\cos(2\pi x_i)+10\right]$ | 30, 100, 500, 1000 | [−5.12, 5.12] | 0
$f_{10}(x)=-20\exp\left(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^{2}}\right)-\exp\left(\frac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right)+20+e$ | 30, 100, 500, 1000 | [−32, 32] | 0
$f_{11}(x)=\frac{1}{4000}\sum_{i=1}^{n} x_i^{2}-\prod_{i=1}^{n}\cos\left(\frac{x_i}{\sqrt{i}}\right)+1$ | 30, 100, 500, 1000 | [−600, 600] | 0
$f_{12}(x)=\frac{\pi}{n}\left\{10\sin^{2}(\pi y_1)+\sum_{i=1}^{n-1}(y_i-1)^{2}\left[1+10\sin^{2}(\pi y_{i+1})\right]+(y_n-1)^{2}\right\}+\sum_{i=1}^{n} u(x_i,10,100,4)$, where $y_i=1+\frac{x_i+1}{4}$ and $u(x_i,a,k,m)=\begin{cases}k(x_i-a)^{m}, & x_i>a\\ 0, & -a<x_i<a\\ k(-x_i-a)^{m}, & x_i<-a\end{cases}$ | 30, 100, 500, 1000 | [−50, 50] | 0
$f_{13}(x)=0.1\left\{\sin^{2}(3\pi x_1)+\sum_{i=1}^{n}(x_i-1)^{2}\left[1+\sin^{2}(3\pi x_{i+1})\right]+(x_n-1)^{2}\left[1+\sin^{2}(2\pi x_n)\right]\right\}+\sum_{i=1}^{n} u(x_i,5,100,4)$ | 30, 100, 500, 1000 | [−50, 50] | 0
fixed-dimension multimodal benchmark functions
$f_{14}(x)=\left(\frac{1}{500}+\sum_{j=1}^{25}\frac{1}{j+\sum_{i=1}^{2}(x_i-a_{ij})^{6}}\right)^{-1}$ | 2 | [−65, 65] | 1
$f_{15}(x)=\sum_{i=1}^{11}\left[a_i-\frac{x_1\left(b_i^{2}+b_i x_2\right)}{b_i^{2}+b_i x_3+x_4}\right]^{2}$ | 4 | [−5, 5] | 0.00030
$f_{16}(x)=4x_1^{2}-2.1x_1^{4}+\frac{1}{3}x_1^{6}+x_1 x_2-4x_2^{2}+4x_2^{4}$ | 2 | [−5, 5] | −1.0316
$f_{17}(x)=\left(x_2-\frac{5.1}{4\pi^{2}}x_1^{2}+\frac{5}{\pi}x_1-6\right)^{2}+10\left(1-\frac{1}{8\pi}\right)\cos x_1+10$ | 2 | [−5, 5] | 0.398
$f_{18}(x)=\left[1+(x_1+x_2+1)^{2}\left(19-14x_1+3x_1^{2}-14x_2+6x_1x_2+3x_2^{2}\right)\right]\times\left[30+(2x_1-3x_2)^{2}\left(18-32x_1+12x_1^{2}+48x_2-36x_1x_2+27x_2^{2}\right)\right]$ | 2 | [−2, 2] | 3
$f_{19}(x)=-\sum_{i=1}^{4} c_i\exp\left(-\sum_{j=1}^{3} a_{ij}\left(x_j-p_{ij}\right)^{2}\right)$ | 3 | [1, 3] | −3.86
$f_{20}(x)=-\sum_{i=1}^{4} c_i\exp\left(-\sum_{j=1}^{6} a_{ij}\left(x_j-p_{ij}\right)^{2}\right)$ | 6 | [0, 1] | −3.32
$f_{21}(x)=-\sum_{i=1}^{5}\left[(X-a_i)(X-a_i)^{T}+c_i\right]^{-1}$ | 4 | [0, 10] | −10.1532
$f_{22}(x)=-\sum_{i=1}^{7}\left[(X-a_i)(X-a_i)^{T}+c_i\right]^{-1}$ | 4 | [0, 10] | −10.4028
$f_{23}(x)=-\sum_{i=1}^{10}\left[(X-a_i)(X-a_i)^{T}+c_i\right]^{-1}$ | 4 | [0, 10] | −10.5363
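As a convenience for re-implementation, a few of the benchmarks in Table A1 are transcribed below; this is a sketch following the formulas above, with the stated minima of 0 at x = 0.

```python
import numpy as np

def f1_sphere(x):        # f1: unimodal, f_min = 0
    x = np.asarray(x, dtype=float)
    return np.sum(x ** 2)

def f9_rastrigin(x):     # f9: multimodal, f_min = 0
    x = np.asarray(x, dtype=float)
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def f10_ackley(x):       # f10: multimodal, f_min = 0
    x = np.asarray(x, dtype=float)
    n = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n) + 20.0 + np.e)
```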

References

  1. Kundu, T.; Garg, H. A Hybrid ITLHHO Algorithm for Numerical and Engineering Optimization Problems. Int. J. Intell. Syst. 2021, 36, 1–81. [Google Scholar] [CrossRef]
  2. Garg, H. A Hybrid GSA-GA Algorithm for Constrained Optimization Problems. Inf. Sci. 2019, 478, 499–523. [Google Scholar] [CrossRef]
  3. Garg, H. A hybrid PSO-GA algorithm for constrained optimization problems. Appl. Math. Comput. 2016, 274, 292–305. [Google Scholar] [CrossRef]
  4. Wu, Y. A Survey on Population-Based Meta-Heuristic Algorithms for Motion Planning of Aircraft. Swarm Evol. Comput. 2021, 62, 100844. [Google Scholar] [CrossRef]
  5. Alabool, H.M.; Alarabiat, D.; Abualigah, L.; Heidari, A.A. Harris hawks optimization: A comprehensive review of recent variants and applications. Neural Comput. Appl. 2021, 33, 8939–8980. [Google Scholar] [CrossRef]
  6. Vasant, P. Handbook of Research on Artificial Intelligence Techniques and Algorithms; IGI Global: Hershey, PA, USA, 2015. [Google Scholar] [CrossRef]
  7. Simon, D. Evolutionary Optimization Algorithms: Biologically-Inspired and Population-Based Approaches to Computer Intelligence; John Wiley & Sons Inc.: Hoboken, NJ, USA, 2013; ISBN 978-0-470-93741-9. [Google Scholar]
  8. Dréo, J. (Ed.) Metaheuristics for Hard Optimization: Methods and Case Studies; Springer: Berlin, Germany, 2006; ISBN 978-3-540-23022-9. [Google Scholar]
  9. Salcedo-Sanz, S. Modern Meta-Heuristics Based on Nonlinear Physics Processes: A Review of Models and Design Procedures. Phys. Rep. 2016, 655, 1–70. [Google Scholar] [CrossRef]
  10. Adeyanju, O.M.; Canha, L.N. Decentralized Multi-Area Multi-Agent Economic Dispatch Model Using Select Meta-Heuristic Optimization Algorithms. Electr. Power Syst. Res. 2021, 195, 107128. [Google Scholar] [CrossRef]
  11. Zhu, M.; Chu, S.-C.; Yang, Q.; Li, W.; Pan, J.-S. Compact Sine Cosine Algorithm with Multigroup and Multistrategy for Dispatching System of Public Transit Vehicles. J. Adv. Transp. 2021, 2021, 1–16. [Google Scholar] [CrossRef]
  12. Fu, X.; Fortino, G.; Li, W.; Pace, P.; Yang, Y. WSNs-Assisted Opportunistic Network for Low-Latency Message Forwarding in Sparse Settings. Future Gener. Comput. Syst. 2019, 91, 223–237. [Google Scholar] [CrossRef]
  13. Dhiman, G. SSC: A Hybrid Nature-Inspired Meta-Heuristic Optimization Algorithm for Engineering Applications. Knowl.-Based Syst. 2021, 222, 106926. [Google Scholar] [CrossRef]
  14. Han, Y.; Gu, X. Improved Multipopulation Discrete Differential Evolution Algorithm for the Scheduling of Multipurpose Batch Plants. Ind. Eng. Chem. Res. 2021, 60, 5530–5547. [Google Scholar] [CrossRef]
  15. Loukil, T.; Teghem, J.; Tuyttens, D. Solving multi-objective production scheduling problems using meta-heuristics. Eur. J. Oper. Res. 2005, 161, 42–61. [Google Scholar] [CrossRef]
  16. Li, Q.; Chen, H.; Huang, H.; Zhao, X.; Cai, Z.; Tong, C.; Liu, W.; Tian, X. An Enhanced Grey Wolf Optimization Based Feature Selection Wrapped Kernel Extreme Learning Machine for Medical Diagnosis. Comput. Math. Methods Med. 2017, 1–15. [Google Scholar] [CrossRef]
  17. Li, Y.; Liu, J.; Tang, Z.; Lei, B. Deep Spatial-Temporal Feature Fusion From Adaptive Dynamic Functional Connectivity for MCI Identification. IEEE Trans. Med. Imaging. 2020, 39, 2818–2830. [Google Scholar] [CrossRef] [PubMed]
  18. Corazza, M.; di Tollo, G.; Fasano, G.; Pesenti, R. A Novel Hybrid PSO-Based Metaheuristic for Costly Portfolio Selection Problems. Ann. Oper. Res. 2021, 304, 109–137. [Google Scholar] [CrossRef]
  19. Gaspero, L.D.; Tollo, G.D.; Roli, A.; Schaerf, A. Hybrid Metaheuristics for Constrained Portfolio Selection Problems. Quant. Financ. 2011, 11, 1473–1487. [Google Scholar] [CrossRef]
  20. Shen, L.; Chen, H.; Yu, Z.; Kang, W.; Zhang, B.; Li, H.; Yang, B.; Liu, D. Evolving Support Vector Machines Using Fruit Fly Optimization for Medical Data Classification. Knowl.-Based Syst. 2016, 96, 61–75. [Google Scholar] [CrossRef]
  21. Wang, M.; Chen, H.; Yang, B.; Zhao, X.; Hu, L.; Cai, Z.; Huang, H.; Tong, C. Toward an Optimal Kernel Extreme Learning Machine Using a Chaotic Moth-Flame Optimization Strategy with Applications in Medical Diagnoses. Neurocomputing 2017, 267, 69–84. [Google Scholar] [CrossRef]
  22. Song, J.; Zheng, W.X.; Niu, Y. Self-Triggered Sliding Mode Control for Networked PMSM Speed Regulation System: A PSO-Optimized Super-Twisting Algorithm. IEEE Trans. Ind. Electron. 2021, 69, 763–773. [Google Scholar] [CrossRef]
  23. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  24. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gen. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  25. Dong, W.; Zhou, M. A Supervised Learning and Control Method to Improve Particle Swarm Optimization Algorithms. IEEE Trans. Syst. Man Cybern. Syst. 2017, 47, 1135–1148. [Google Scholar] [CrossRef]
  26. Mareda, T.; Gaudard, L.; Romerio, F. A Parametric Genetic Algorithm Approach to Assess Complementary Options of Large Scale Windsolar Coupling. IEEE/CAA J. Autom. Sin. 2017, 4, 260–272. [Google Scholar] [CrossRef] [Green Version]
  27. Jian, Z.; Liu, S.; Zhou, M. Modified cuckoo search algorithm to solve economic power dispatch optimization problems. IEEE/CAA J. Autom. Sin. 2018, 5, 794–806. [Google Scholar]
  28. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by Simulated Annealing. Science. 1983, 220, 671–680. [Google Scholar] [CrossRef]
  29. Xing, B.; Gao, W.-J. Gravitational Search Algorithm. In Innovative Computational Intelligence: A Rough Guide to 134 Clever Algorithms; Intelligent Systems Reference Library; Springer International Publishing: Cham, Switzerland, 2014; Volume 62, pp. 355–364. ISBN 978-3-319-03403-4. [Google Scholar]
  30. Alatas, B. ACROA: Artificial chemical reaction optimization algorithm for global optimization. Expert Syst. Appl. 2011, 38, 13170–13180. [Google Scholar] [CrossRef]
  31. Patel, V.K.; Savsani, V.J. Heat transfer search (HTS): A novel optimization algorithm. Inf. Sci. 2015, 324, 217–246. [Google Scholar] [CrossRef]
  32. Abdechiri, M.; Meybodi, M.R.; Bahrami, H. Gases brownian motion optimization: An algorithm for optimization (GBMO). Appl. Soft Comput. 2013, 13, 2932–2946. [Google Scholar] [CrossRef]
  33. Hashim, F.A.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W.; Mirjalili, S. Henry gas solubility optimization: A novel physics-based algorithm. Future Gener. Comput. Syst. 2019, 101, 646–667. [Google Scholar] [CrossRef]
  34. Holland, J.H. Genetic Algorithms. Sci. Am. 1992, 267, 66–72. [Google Scholar] [CrossRef]
  35. Schneider, B.; Ranft, U. Simulationsmethoden in der Medizin und Biologie; Springer: Berlin/Heidelberg, Germany, 1978. [Google Scholar]
  36. Yang, X.-S. Differential evolution. In Nature-Inspired Optimization Algorithms. Algorithms; Elsevier: London, UK, 2021; Volume 6, pp. 101–109. [Google Scholar]
  37. Storn, R.; Price, K. Differential evolution–A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  38. Simon, D. Biogeography-Based Optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713. [Google Scholar] [CrossRef] [Green Version]
  39. Karaboga, D.; Akay, B. A comparative study of Artificial Bee Colony algorithm. Appl. Math. Comput. 2009, 214, 108–132. [Google Scholar] [CrossRef]
  40. Yang, X.S. Firefly algorithm, stochastic test functions and design optimization. Int. J. Bio-Inspired Comput. 2010, 2, 78. [Google Scholar] [CrossRef]
  41. Jiang, X.; Li, S. BAS: Beetle antennae search algorithm for optimization problems. Int. J. Robot. Control 2018, 1, 1. [Google Scholar] [CrossRef]
  42. Okwu, M.O. (Ed.) Grey Wolf Optimizer, Metaheuristic Optimization, Nature-Inspired Algorithms Swarm and Computational Intelligence, Theory and Applications; Springer International Publishing: Cham, Switzerland, 2021. [Google Scholar] [CrossRef]
  43. Li, M.D.; Zhao, H.; Weng, X.W.; Han, T. A novel nature-inspired algorithm for optimization: Virus colony search. Adv. Eng. Softw. 2016, 92, 65–88. [Google Scholar] [CrossRef]
  44. Glover, F. Tabu search—Part I. ORSA J. Comput. 1989, 1, 190–206. [Google Scholar] [CrossRef] [Green Version]
  45. Kumar, M.; Kulkarni, A.J.; Satapathy, S.C. Socio evolution & learning optimization algorithm: A socio-inspired optimization methodology. Future Gener. Comput. Syst. 2018, 81, 252–272. [Google Scholar] [CrossRef]
  46. Rao, R.V.; Savsani, V.J.; Vakharia, D.P. Teaching–learning-based optimization: An optimization method for continuous non-linear large scale problems. Inf. Sci. 2012, 183, 1–15. [Google Scholar] [CrossRef]
  47. Hosseini, S.; Al Khaled, A. A Survey on the Imperialist Competitive Algorithm Metaheuristic: Implementation in Engineering Domain and Directions for Future Research. Appl. Soft. Comput. 2014, 24, 1078–1094. [Google Scholar] [CrossRef]
  48. Jia, H.; Peng, X.; Kang, L.; Li, Y.; Jiang, Z.; Sun, K. Pulse coupled neural network based on Harris hawks optimization algorithm for image segmentation. Multimed Tools Appl. 2020, 79, 28369–28392. [Google Scholar] [CrossRef]
  49. Fan, C.; Zhou, Y.; Tang, Z. Neighborhood centroid opposite-based learning harris hawks optimization for training neural networks. Evol. Intell. 2020, 14, 1847–1867. [Google Scholar] [CrossRef]
  50. Saravanan, G.; Ibrahim, A.M.; Kumar, D.S.; Vanitha, U.; Chandrika, V.S. Iot Based Speed Control of BLDC Motor with Harris Hawks Optimization Controller. Int. J. Grid Distrib. Comput. 2020, 13, 1902–1919. [Google Scholar]
  51. Qu, C.; He, W.; Peng, X.; Peng, X. Harris hawks optimization with information exchange. Appl. Math. Model. 2020, 84, 52–75. [Google Scholar] [CrossRef]
  52. Devarapalli, R.; Bhattacharyya, B. Application of modified harris hawks Optimization in power system oscillations damping controller design. In Proceedings of the 2019 8th International Conference on Power Systems (ICPS), Jaipur, India, 20–22 December 2019. [Google Scholar]
  53. Elgamal, Z.M.; Yasin, N.B.M.; Tubishat, M.; Alswaitti, M.; Mirjalili, S. An improved harris hawks optimization algorithm with simulated annealing for feature selection in the Medical Field. IEEE Access. 2020, 8, 186638–186652. [Google Scholar] [CrossRef]
  54. Song, S.; Wang, P.; Heidari, A.A.; Wang, M.; Zhao, X.; Chen, H.; He, W.; Xu, S. Dimension decided harris hawks optimization with gaussian mutation: Balance analysis and diversity patterns. Knowl.-Based Syst. 2021, 215, 106425. [Google Scholar] [CrossRef]
  55. Wolpert, D.H.; Macready, W.G. No Free Lunch Theorems for Optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–83. [Google Scholar] [CrossRef] [Green Version]
  56. Bednarz, J.C. Cooperative Hunting Harris’ Hawks (Parabuteo unicinctus). Science 1988, 239, 1525–1527. [Google Scholar] [CrossRef] [PubMed]
  57. Lefebvre, L.; Whittle, P.; Lascaris, E.; Finkelstein, A. Feeding innovations and forebrain size in birds. Anim. Behav. 1997, 53, 549–560. [Google Scholar] [CrossRef] [Green Version]
  58. Sol, D.; Duncan, R.P.; Blackburn, T.M.; Cassey, P.; Lefebvre, L. Big brains, Enhanced Cognition, and Response of birds to Novel environments. Proc. Natl. Acad. Sci. USA 2005, 102, 5460–5465. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  59. Kazimipour, B.; Li, X.; Qin, A.K. A review of population initialization techniques for evolutionary algorithms. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, China, 6–11 July 2014; pp. 2585–2592. [Google Scholar] [CrossRef]
  60. Migallón, H.; Jimeno-Morenilla, A.; Rico, H.; Sánchez-Romero, J.L.; Belazi, A. Multi-Level Parallel Chaotic Jaya Optimization Algorithms for Solving Constrained Engineering Design Problems. J. Supercomput. 2021, 77, 12280–12319. [Google Scholar] [CrossRef]
  61. Alatas, B.; Akin, E. Multi-Objective Rule Mining Using a Chaotic Particle Swarm Optimization Algorithm. Knowl.-Based Syst. 2009, 22, 455–460. [Google Scholar] [CrossRef]
  62. Arora, S.; Anand, P. Chaotic Grasshopper Optimization Algorithm for Global Optimization. Neural Comput. Appl. 2019, 31, 4385–4405. [Google Scholar] [CrossRef]
  63. Wang, G.-G.; Guo, L.; Gandomi, A.H.; Hao, G.-S.; Wang, H. Chaotic Krill Herd Algorithm. Inf. Sci. 2014, 274, 17–34. [Google Scholar] [CrossRef]
  64. Mitić, M.; Vuković, N.; Petrović, M.; Miljković, Z. Chaotic Fruit Fly Optimization Algorithm. Knowl.-Based Syst. 2015, 89, 446–458. [Google Scholar] [CrossRef]
  65. Kumar, Y.; Singh, P.K. A Chaotic Teaching Learning Based Optimization Algorithm for Clustering Problems. Appl. Intell. 2019, 49, 1036–1062. [Google Scholar] [CrossRef]
  66. Pierezan, J.; dos Santos Coelho, L.; Mariani, V.C.; de Vasconcelos Segundo, E.H.; Prayogo, D. Chaotic Coyote Algorithm Applied to Truss Optimization Problems. Comput. Struct. 2021, 242, 106353. [Google Scholar] [CrossRef]
  67. Sayed, G.I.; Tharwat, A.; Hassanien, A.E. Chaotic Dragonfly Algorithm: An Improved Metaheuristic Algorithm for Feature Selection. Appl. Intell. 2019, 49, 188–205. [Google Scholar] [CrossRef]
  68. Chen, K.; Zhou, F.; Liu, A. Chaotic Dynamic Weight Particle Swarm Optimization for Numerical Function Optimization. Knowl.-Based Syst. 2018, 139, 23–40. [Google Scholar] [CrossRef]
  69. Anand, P.; Arora, S. A Novel Chaotic Selfish Herd Optimizer for Global Optimization and Feature Selection. Artif. Intell. Rev. 2020, 53, 1441–1486. [Google Scholar] [CrossRef]
  70. Yao, X.; Liu, Y.; Lin, G. Evolutionary Programming Made Faster. IEEE Trans. Evol. Comput. 1999, 3, 82–102. [Google Scholar] [CrossRef] [Green Version]
  71. Digalakis, J.G.; Margaritis, K.G. On benchmarking functions for genetic algorithms. Int. J. Comput. Math. 2001, 77, 481–506. [Google Scholar] [CrossRef]
  72. Chen, H.; Zhang, Q.; Luo, J. An enhanced Bacterial Foraging Optimization and its application for training kernel extreme learning machine. Appl. Soft Comput. 2020, 86, 105884. [Google Scholar] [CrossRef]
  73. Mirjalili, S. SCA: A Sine Cosine Algorithm for Solving Optimization Problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  74. Meng, X.; Liu, Y.; Gao, X.; Zhang, H. A new bio-inspired algorithm: Chicken swarm optimization. In Advances in Swarm Intelligence; Tan, Y., Shi, Y., Coello, C.A.C., Eds.; Springer International Publishing: Cham, Switzerland, 2014; pp. 86–94. [Google Scholar] [CrossRef]
  75. Zhang, J.; Chen, J.; Che, L. Hybrid PSO Algorithm with Adaptive Step Search in Noisy and Noise-Free Environments. In Proceedings of the 2020 IEEE Congress on Evolutionary Computation (CEC), Glasgow, UK, 19–24 July 2020; pp. 1–8. [Google Scholar]
  76. Jordehi, A.R. Gravitational Search Algorithm with Linearly Decreasing Gravitational Constant for Parameter Estimation of Photovoltaic Cells. In Proceedings of the 2017 IEEE Congress on Evolutionary Computation (CEC), Donostia, Spain, 5–8 June 2017; pp. 37–42. [Google Scholar]
  77. Feng, X.; Liu, A.; Sun, W.; Yue, X.; Liu, B. A Dynamic Generalized Opposition-Based Learning Fruit Fly Algorithm for Function Optimization. In Proceedings of the 2018 IEEE Congress on Evolutionary Computation (CEC), Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–7. [Google Scholar]
  78. Demsar, J. Statistical comparisons of classifiers over multiple data sets. J. Mach. Learn. Res. 2006, 7, 1–30. [Google Scholar]
  79. García, S.; Fernández, A.; Luengo, J.; Herrera, F. Advanced nonparametric tests for multiple comparisons in the design of experiments in computational intelligence and data mining: Experimental analysis of power. Inf. Sci. 2010, 180, 2044–2064. [Google Scholar] [CrossRef]
  80. Groppe, D.M.; Urbach, T.P.; Kutas, M. Mass univariate analysis of event-related brain potentials/fields I: A critical tutorial review. Psychophysiology 2011, 48, 1711–1725. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  81. Deliparaschos, K.M.; Nenedakis, F.I.; Tzafestas, S.G. Design and implementation of a fast digital fuzzy logic controller using FPGA Technology. J. Intell. Robot. Syst. 2006, 45, 77–96. [Google Scholar] [CrossRef]
  82. Sapkota, B.; Vittal, V. Dynamic VAr planning in a large power system using trajectory sensitivities. IEEE Trans. Power Syst. 2010, 25, 461–469. [Google Scholar] [CrossRef]
  83. Huang, H.; Xu, Z.; Lin, X. Improving performance of Multi-infeed HVDC systems using grid dynamic segmentation technique based on fault current limiters. IEEE Trans. Power Syst. 2012, 27, 1664–1672. [Google Scholar] [CrossRef]
  84. Yong, T. A discussion about standard parameter models of synchronous machine. Power Syst. Technol. 2007, 12, 47. [Google Scholar]
  85. Grigsby, L.L. (Ed.) Power system Stability and Control, 3rd ed.; Taylor & Francis: Boca Raton, FL, USA, 2012. [Google Scholar]
  86. Tian, Z. Backtracking search optimization algorithm-based least square support vector machine and its applications. Eng. Appl. Artif. Intell. 2020, 94, 103801. [Google Scholar] [CrossRef]
  87. Suykens, J.A.K.; Vandewalle, J. Least squares support vector machine classifiers. Neural Process. Lett. 1999, 9, 293–300. [Google Scholar] [CrossRef]
  88. Zendehboudi, A. Implementation of GA-LSSVM modelling approach for estimating the performance of solid desiccant wheels. Energy Convers. Manag. 2016, 127, 245–255. [Google Scholar] [CrossRef]
  89. Chamkalani, A.; Zendehboudi, S.; Bahadori, A.; Kharrat, R.; Chamkalani, R.; James, L.; Chatzis, I. Integration of LSSVM technique with PSO to determine asphaltene deposition. J. Pet. Sci. Eng. 2014, 124, 243–253. [Google Scholar] [CrossRef]
  90. Zhongda, T.; Shujiang, L.; Yanhong, W.; Yi, S. A prediction method based on wavelet transform and multiple models fusion for chaotic time series. Chaos Solitons Fractals 2017, 98, 158–172. [Google Scholar] [CrossRef]
  91. Liu, W.; Zhang, X. Research on the supply chain risk assessment based on the improved LSSVM algorithm. Int. J. U-Serv. Sci. Technol. 2016, 9, 297–306. [Google Scholar] [CrossRef]
  92. Jain, S.; Bajaj, V.; Kumar, A. Efficient algorithm for classification of electrocardiogram beats based on artificial bee colony-based least-squares support vector machines classifier. Electron. Lett. 2016, 52, 1198–1200. [Google Scholar] [CrossRef]
  93. Yang, A.; Li, W.; Yang, X. Short-term electricity load forecasting based on feature selection and Least Squares Support Vector Machines. Knowl.-Based Syst. 2019, 163, 159–173. [Google Scholar] [CrossRef]
  94. Adankon, M.M.; Cheriet, M. Support vector machine. In Encyclopedia of Biometrics; Springer: Boston, MA, USA, 2014; Volume 3, pp. 1–9. [Google Scholar] [CrossRef]
Figure 1. Description of each stage of HHO.
Figure 2. Flow chart of HHO algorithm.
Figure 3. Spiral position update.
Figure 4. Convergence curves of CSHHO and the classic meta-heuristic algorithms.
Figure 5. Convergence curves of CSHHO and the advanced meta-heuristic algorithms.
Figure 6. Flow of the CSHHO-optimized LSSVM model of synchronous condenser reactive power output.
Figure 7. Output and error comparison diagram of LSSVM model and CSHHO-LSSVM model.
Table 1. Exploitation phase of HHO algorithm.
Soft besiege ($r \ge 0.5$ and $|E| \ge 0.5$):
$X_i(t+1) = \Delta X_i(t) - E(t)\left| J X_{\text{rabbit}}(t) - X_i(t) \right|$ (4)
$\Delta X_i(t) = X_{\text{rabbit}}(t) - X_i(t)$ (5)
$J = 2(1 - r_5)$ (6)
Hard besiege ($r \ge 0.5$ and $|E| < 0.5$):
$X_i(t+1) = X_{\text{rabbit}}(t) - E(t)\left| \Delta X_i(t) \right|$ (7)
Soft besiege with progressive rapid dives ($r < 0.5$ and $|E| \ge 0.5$):
$X_i(t+1) = \begin{cases} Y, & f(Y) < f(X_i(t)) \\ Z, & f(Z) < f(X_i(t)) \end{cases}$ (8)
$Y = X_{\text{rabbit}}(t) - E(t)\left| J X_{\text{rabbit}}(t) - X_i(t) \right|$ (9)
$Z = Y + S \times LF(D)$ (10)
$LF(x) = 0.01 \times \dfrac{u}{|v|^{1/\beta}} \times \left( \dfrac{\Gamma(1+\beta) \sin(\pi\beta/2)}{\Gamma\left(\frac{1+\beta}{2}\right) \beta \, 2^{(\beta-1)/2}} \right)^{1/\beta}$ (11)
Hard besiege with progressive rapid dives ($r < 0.5$ and $|E| < 0.5$):
$X_i(t+1) = \begin{cases} Y, & f(Y) < f(X_i(t)) \\ Z, & f(Z) < f(X_i(t)) \end{cases}$ (12)
$Y = X_{\text{rabbit}}(t) - E(t)\left| J X_{\text{rabbit}}(t) - X_m(t) \right|$ (13)
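The Levy flight term LF(D) in Equations (10) and (11) drives the rapid-dive steps. A common implementation, given here as a sketch with the usual choice β = 1.5 and not necessarily the paper's exact code, is:

```python
import numpy as np
from math import gamma, sin, pi

def levy_flight(D, beta=1.5, rng=None):
    # Step sizes following Eq. (11): u ~ N(0, sigma^2), v ~ N(0, 1).
    if rng is None:
        rng = np.random.default_rng()
    sigma = ((gamma(1.0 + beta) * sin(pi * beta / 2.0))
             / (gamma((1.0 + beta) / 2.0) * beta
                * 2.0 ** ((beta - 1.0) / 2.0))) ** (1.0 / beta)
    u = rng.normal(0.0, sigma, D)
    v = rng.normal(0.0, 1.0, D)
    return 0.01 * u / np.abs(v) ** (1.0 / beta)
```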
Table 2. The information of IEEE CEC2017.
ID | Description | Type | Dimension | Range | Optimum
F1 | Shifted and Rotated Bent Cigar Function | Unimodal | 30, 50, 100 | [−100, 100] | 100
F2 | Shifted and Rotated Zakharov Function | Unimodal | 30, 50, 100 | [−100, 100] | 300
F3 | Shifted and Rotated Rosenbrock's Function | Multimodal | 30, 50, 100 | [−100, 100] | 400
F4 | Shifted and Rotated Rastrigin's Function | Multimodal | 30, 50, 100 | [−100, 100] | 500
F5 | Shifted and Rotated Expanded Scaffer's F6 Function | Multimodal | 30, 50, 100 | [−100, 100] | 600
F6 | Shifted and Rotated Lunacek Bi-Rastrigin Function | Multimodal | 30, 50, 100 | [−100, 100] | 700
F7 | Shifted and Rotated Non-Continuous Rastrigin's Function | Multimodal | 30, 50, 100 | [−100, 100] | 800
F8 | Shifted and Rotated Lévy Function | Multimodal | 30, 50, 100 | [−100, 100] | 900
F9 | Shifted and Rotated Schwefel's Function | Multimodal | 30, 50, 100 | [−100, 100] | 1000
F10 | Hybrid Function 1 (N = 3) | Hybrid | 30, 50, 100 | [−100, 100] | 1100
F11 | Hybrid Function 2 (N = 3) | Hybrid | 30, 50, 100 | [−100, 100] | 1200
F12 | Hybrid Function 3 (N = 3) | Hybrid | 30, 50, 100 | [−100, 100] | 1300
F13 | Hybrid Function 4 (N = 4) | Hybrid | 30, 50, 100 | [−100, 100] | 1400
F14 | Hybrid Function 5 (N = 4) | Hybrid | 30, 50, 100 | [−100, 100] | 1500
F15 | Hybrid Function 6 (N = 4) | Hybrid | 30, 50, 100 | [−100, 100] | 1600
F16 | Hybrid Function 7 (N = 5) | Hybrid | 30, 50, 100 | [−100, 100] | 1700
F17 | Hybrid Function 8 (N = 5) | Hybrid | 30, 50, 100 | [−100, 100] | 1800
F18 | Hybrid Function 9 (N = 5) | Hybrid | 30, 50, 100 | [−100, 100] | 1900
F19 | Hybrid Function 10 (N = 6) | Hybrid | 30, 50, 100 | [−100, 100] | 2000
F20 | Composition Function 1 (N = 3) | Composition | 30, 50, 100 | [−100, 100] | 2100
F21 | Composition Function 2 (N = 3) | Composition | 30, 50, 100 | [−100, 100] | 2200
F22 | Composition Function 3 (N = 4) | Composition | 30, 50, 100 | [−100, 100] | 2300
F23 | Composition Function 4 (N = 4) | Composition | 30, 50, 100 | [−100, 100] | 2400
F24 | Composition Function 5 (N = 5) | Composition | 30, 50, 100 | [−100, 100] | 2500
F25 | Composition Function 6 (N = 5) | Composition | 30, 50, 100 | [−100, 100] | 2600
F26 | Composition Function 7 (N = 6) | Composition | 30, 50, 100 | [−100, 100] | 2700
F27 | Composition Function 8 (N = 6) | Composition | 30, 50, 100 | [−100, 100] | 2800
F28 | Composition Function 9 (N = 3) | Composition | 30, 50, 100 | [−100, 100] | 2900
F29 | Composition Function 10 (N = 3) | Composition | 30, 50, 100 | [−100, 100] | 3000
Table 3. Results of a comparison with seven chaotic mappings on HHO.
Benchmark | Circle | Sinusoidal | Tent | Kent | Cubic | Logistic | Gauss
F1Mean4.37 × 101102.73 × 10−967.91 × 10−1013.68 × 10−998.75 × 10−1007.28 × 10−991.4 × 10−109
Std3.05 × 10−1091.09 × 10−953.91 × 10−1001.82 × 10−984.06 × 10−995.13 × 10−989.69 × 10−109
Rank1735462
Best\Worst2.15 × 10−108\6.44 × 10−1354.36 × 10−95\3.01 × 10−1152.21 × 10−99\1.1 × 10−1199.39 × 10−98\3.69 × 10−1162.11 × 10−98\1.4 × 10−1163.63 × 10−97\1.96 × 10−1166.85 × 10−108\1.18 × 10−137
F2Mean6.4 × 10−593.88 × 10−533.45 × 10−537.39 × 10−538.12 × 10−538.59 × 10−538.05× 1061
Std3.12 × 10−581.74 × 10−521.5 × 10−524.9 × 10−525.03 × 10−524.9 × 10−522.37× 10−60
Rank2435671
Best\Worst1.97 × 10−57\5.03 × 10−771.20 × 10−51\6.33 × 10−608.88 × 10−52\1.72 × 10−613.47 × 10−51\1.86 × 10−643.56 × 10−51\2.76 × 10−683.4391 × 10−51\3.91 × 10−621.0834× 10−59\3.80 × 10−68
F3Mean2.90 × 10−862.51 × 10−865.22 × 10−825.28 × 10−783.08 × 10−839.34 × 10−867.29 × 10−78
Std2.01 × 10−859.38 × 10−863.13 × 10−813.73 × 10−772.15 × 10−823.43 × 10−855.15 × 10−77
Rank2156437
Best\Worst1.42 × 10−84\8.08 × 10−1043.51 × 10−85\9.55 × 10−1042.18 × 10−80\1.01 × 10−1082.63 × 10−76\5.02 × 10−1011.52 × 10−81\2.2 × 10−1071.90 × 10−84\5.74 × 10−1053.64 × 10−76\1.58 × 10−107
F4Mean8.48 × 10−561.23 × 10−507.91 × 10−518.72 × 10−513.65 × 10−509.39 × 10−509.3 × 10−53
Std4.24 × 10−554.28 × 10−505.22 × 10−503.6 × 10−502.4 × 10−496.51 × 10−496.49 × 10−52
Rank1534672
Best\Worst2.64 × 10−54\9.93 × 10−691.91 × 10−49\1.3 × 10−553.69 × 10−49\2.77 × 10−591.97 × 10−49\4.06 × 10−581.70 × 10−48\2.16 × 10−604.60 × 10−48\4.39 × 10−584.59 × 10−51\1.13 × 10−65
F5Mean8.55 × 10−24.17 × 10−25.42 × 10−24.65 × 10−27.29 × 10−24.85 × 10−28.33 × 10−2
Std5.63 × 10−14.90 × 10−39.52 × 10−37.56 × 10−39.62 × 10−38.98 × 10−35.64 × 10−1
Rank7142536
Best\Worst3.9896\4.3262 × 10−50.021867\3.3302 × 10−60.048302\8.4348 × 10−70.0425\8.3416 × 10−70.043385\3.4475 × 10−50.051141\6.6967 × 10−63.9896\6.6344 × 10−6
F6Mean3.34 × 10−57.87 × 10−55.47 × 10−54.72 × 10−56.85 × 10−54.45 × 10−52.48× 10−5
Std6.12 × 10−51.10 × 10−49.45 × 10−55.54 × 10−58.89 × 10−566.00 × 10−53.13× 10−5
Rank2754631
Best\Worst3.42 × 104\1.6931 × 10−84.27 × 104\7.8651 × 10−83.71 × 104\2.4303 × 10−93.16 × 104\3.9886 × 10−84.33 × 104\1.2627 × 10−82.77 × 104\2.6747 × 10−81.48× 104\1.1473× 10−10
F7Mean1.02 × 10−47.60 × 10−58.41 × 10−59.33 × 10−57.99 × 10−57.62 × 10−59.17 × 10−5
Std9.77 × 10−54.66 × 10−51.02 × 10−49.06 × 10−58.17 × 10−57.35 × 10−58.83 × 10−5
Rank7146325
Best\Worst4.09 × 10−4\1.85 × 10−61.92 × 10−4\1.62 × 10−65.44 × 10−4\1.26 × 10−63.88 × 10−4\4.19 × 10−65.07 × 10−4\1.77 × 10−64.39 × 10−4\8.40 × 10−74.36 × 10−4\1.72 × 10−6
F8Mean−1.26 × 104−1.26 × 104−1.26 × 104−1.26 × 104−1.25 × 104−1.26 × 104−1.26 × 104
Std52.10.2320.4530.5080.3780.22630.7
Rank1111711
Best\Worst−1.23 × 104\−1.25 × 104−1.25 × 104\−1.25 × 104−1.25 × 104\−1.25 × 104−1.25 × 104\−1.25 × 104−9.90 × 103\−1.25 × 104−1.25 × 104\−1.25 × 104−1.23× 104\−1.25 × 104
F9Mean00000\000
Std0000000
Rank1111111
Best\Worst0\00\00\00\00\00\00\0
F10Mean8.88 × 10−168.88 × 10−168.88 × 10−168.88 × 10−168.88 × 10−168.88 × 10−168.88× 10−16
Std1.99 × 10−311.99 × 10−311.99 × 10−311.99 × 10−311.99 × 10−311.99 × 10−311.99× 10−31
Rank1111111
Best\Worst8.88 × 10−16\8.88 × 10−168.88 × 10−16\8.88 × 10−168.88 × 10−16\8.88 × 10−168.88 × 10−16\8.88 × 10−168.88 × 10−16\8.88 × 10−168.88 × 10−16\8.88 × 10−168.88× 10−16\8.88 × 10−16
F11Mean0000000
Std0000000
Rank1111111
Best\Worst0\00\00\00\00\00\00\0
F12Mean1.31 × 10−62.79 × 10−64.21 × 10−62.75 × 10−64.02 × 10−64.18 × 10−61.18× 10−6
Std1.49 × 10−63.23 × 10−68.73 × 10−63.36 × 10−65.88 × 10−65.65 × 10−61.52× 10−6
Rank2473561
Best\Worst7.06 × 10−6\5.43 × 10−91.46 × 10−5\2.19 × 10−85.17 × 10−5\1.09 × 10−81.75 × 10−5\2.96 × 10−93.25 × 10−5\2.30 × 10−92.80 × 10−5\5.59 × 10−96.68× 10−6\5.16 × 10−10
F13Mean2.66 × 10−43.31 × 10−43.91 × 10−52.56 × 10−54.15 × 10−52.68 × 10−54.62 × 10−4
Std1.55 × 10−32.07 × 10−35.02 × 10−54.26 × 10−55.82 × 10−543.68 × 10−52.17 × 10−3
Rank5631427
Best\Worst0.010987\1.4769 × 10−70.014637\1.0906 × 10−70.00023743\4.4639 × 10−80.00026746\5.4412 × 10−70.00024897\2.9199 × 10−110.00016986\8.1121 × 10−80.010987\5.0129 × 10−8
F14Mean1.209.97 × 10−19.98 × 10−11.221.021.121.57
Std9.76 × 10−15.61 × 10−165.61 × 10−169.82 × 10−11.41 × 10−17.10 × 10−11.49
Rank5126347
Best\Worst5.93\9.98 × 10−19.98 × 10−1\9.98 × 10−19.98 × 10−1\9.98 × 10−15.93\9.98 × 10−11.99\9.98 × 10−15.93\9.98 × 10−15.93\9.98 × 10−1
F15Mean3.24 × 10−43.19 × 10−43.22 × 10−43.27 × 10−43.72 × 10−43.28 × 10−43.25 × 10−4
Std1.45 × 10−51.03 × 10−51.70 × 10−51.93 × 10−52.24 × 10−41.79 × 10−51.23 × 10−5
Rank3125764
Best\Worst3.82 × 10−4\3.08 × 10−43.48 × 10−4\3.08 × 10−43.72 × 10−4\3.08 × 10−43.92 × 10−4\3.08 × 10−41.63 × 10−3\3.08 × 10−43.86 × 10−4\3.08 × 10−43.95 × 10−4\3.08 × 10−4
F16Mean−1.03−1.03−1.03−1.03−1.03−1.03−1.03
Std1.35 × 10−151.35 × 10−151.35 × 10−151.35 × 10−151.35 × 10−151.35 × 10−151.35× 10−15
Rank1111111
Best\Worst−1.03\−1.03−1.03\−1.031−1.03\−1.03−1.03\−1.03−1.03\−1.03−1.03\−1.03−1.03\−1.03
F17Mean3.98 × 10−13.98 × 10−13.98 × 10−13.98 × 10−13.98 × 10−13.98 × 10−13.9× 10−9
Std7.97 × 10−84.58 × 10−82.57 × 10−71.65 × 10−73.38 × 10−77.58 × 10−84.17× 10−8
Rank2222221
Best\Worst3.97 × 10−1\3.97 × 10−13.97 × 10−1\3.97 × 10−13.97 × 10−1\3.97 × 10−13.97 × 10−1\3.97 × 10−13.97 × 10−1\0.3.97 × 10−13.97 × 10−1\0.3.97 × 10−13.97× 10−1\0.3.97 × 10−1
F18Mean3333333
Std1.41 × 10−81.31 × 10−71.98 × 10−85.48 × 10−81.41 × 10−87.00 × 10−84.63× 10−8
Rank1111111
Best\Worst3\33\33\33\33\33\33\3
F19Mean−3.86−3.86−3.86−3.86−3.86−3.86−3.86
Std2.09 × 10−32.52 × 10−32.35 × 10−32.43 × 10−31.61 × 10−33.02 × 10−31.32× 10−3
Rank1111111
Best\Worst−3.85\−3.86−3.85\−3.86−3.85\−3.86−3.85\−3.86−3.86\−3.86−3.85\−3.86−3.86\−3.86
F20Mean−3.12−3.12−3.14−3.12−3.13−3.11−3.12
Std1.10 × 10−11.12 × 10−11.10 × 10−11.10 × 10−11.10 × 10−11.12 × 10−19.95 × 10−2
Rank3313273
Best\Worst−2.88\−3.31−2.80\−3.30−2.89\−3.30−2.86\−3.30−2.90\−3.30−2.82\−3.30−2.81\−3.27
F21Mean−8.77−7.05−7.04−8.75−5.36−7.03−8.45
Std2.232.472.462.211.212.452.36
Rank1452763
Best\Worst−5.05\−1.01 × 101−5.05\−1.01 × 101−5.05\−1.01 × 101−5.05\−1.01 × 101−5.04\−1.01 × 101−5.05\−1.01 × 101−5.05\−1.01 × 101
F22Mean−8.44−8.62−7.06−8.95−5.49−7.04−9.06
Std2.542.452.552.321.382.531.26
Rank4352761
Best\Worst−5.08\−1.04 × 101−5.07\−1.04 × 101−5.08\−1.04 × 101−5.09\−1.04 × 101−5.08\−1.04 × 101−5.08\−1.04 × 101−5.09\−1.04× 101
F23Mean−9.27−8.61−6.84−9.54−5.38−7.34−8.92
Std2.232.532.521.951.332.632.4
Rank2461753
Best\Worst−5.12\−1.05 × 101−5.123\−1.05 × 101−5.12\−1.05 × 101−5.12\−1.05 × 101−2.41\−1.05 × 101−5.13\−1.05 × 101−5.12\−1.05 × 101
Table 4. The Bonferroni–Holm corrected P-values of Wilcoxon’s signed-rank test.
Benchmark | Gauss (baseline, N/A) | Circle | Sinusoidal | Tent | Kent | Cubic | Logistic — each comparison cell gives the corrected p-value followed by the +\=\− counts.
F1N/A2.12 × 10128\0\221.043 × 10−750\0\05.40 × 10−748\0\21.04 × 10−750\0\01.08 × 10−749\0\16.77 × 10−748\0\2
F2N/A3.05 × 10126\0\241.043 × 10−750\0\01.20 × 10−749\0\11.08 × 10−749\0\11.20 × 10−748\0\21.03 × 10−750\0\0
F3N/A3.23 × 10125\0\253.20 × 10124\0\262.27 × 10129\0\212.01 × 10032\0\186.39 × 10031\0\198.66 × 10028\0\22
F4N/A1.64 × 10123\0\271.02 × 10−547\0\32.74 × 10−546\0\43.24 × 10−748\0\23.71 × 10−546\0\43.61 × 10−748\0\2
F5N/A2.02 × 10130\0\202.79 × 10127\0\232.79 × 10127\0\232.85 × 10126\0\246.12 × 10031\0\192.59 × 10128\0\22
F6N/A2.66 × 10123\0\274.67 × 10−236\0\148.66 × 10033\0\171.74 × 10032\0\182.25 × 10−237\0\136.18 × 10032\0\18
F7N/A3.34 × 10126\0\243.13 × 10128\0\223.22 × 10123\0\272.72 × 10123\0\273.26 × 10123\0\273.12 × 10124\0\26
F8N/A1.64 × 10120\0\303.26 × 10124\1\252.03 × 10121\0\293.27 × 10124\0\263.25 × 10128\1\212.71 × 10120\1\29
F9N/A2.49 × 1010\50\02.10 × 1010\50\01.70 × 1010\50\01.30 × 1010\50\09.00 × 1000\50\05.00 × 1000\50\0
F10N/A2.40 × 1010\50\02.00 × 1010\50\01.60 × 1010\50\01.20 × 1010\50\08.00 × 1000\50\04.00 × 1000\50\0
F11N/A2.30 × 1010\50\01.90 × 1010\50\01.50 × 1010\50\01.10 × 1010\50\07.00 × 1000\50\03.00 × 1000\50\0
F12N/A3.02 × 10131\0\193.32 × 10−133\0\175.13 × 10−136\0\149.84 × 10−131\0\191.26 × 10−237\0\136.20 × 10−236\0\14
F13N/A2.84 × 10122\0\282.91 × 10126\0\248.84 × 10029\0\213.15 × 10123\0\272.71 × 10127\0\233.29 × 10122\0\28
F14N/A1.26 × 1012\39\94.30 × 10−10\41\94.30 × 10−10\41\91.32 × 1013\39\81.23 × 1001\40\94.88 × 1001\40\9
F15N/A2.49 × 10120\0\305.13 × 10−114\0\361.31 × 10118\0\322.79 × 10121\0\291.99 × 10129\0\212.27 × 10127\0\23
F16N/A2.20 × 1010\50\01.80 × 1010\50\01.40 × 1010\50\01.00 × 1010\50\06.00 × 1000\50\02.00 × 1000\50\0
F17N/A3.23 × 10110\28\122.68 × 10111\28\112.20 × 10112\27\119.10 × 10020\21\99.18 × 10022\17\113.25 × 1018\31\11
F18N/A3.22 × 1010\48\23.28 × 1015\42\33.06 × 1012\45\33.32 × 1015\42\33.20 × 1010\48\22.76 × 1012\45\3
F19N/A3.28 × 10125\0\251.32 × 10130\0\203.23 × 10127\0\233.26 × 10125\0\252.92 × 10125\0\253.24 × 10122\0\28
F20N/A3.34 × 10125\0\253.00 × 10123\0\272.12 × 10120\0\303.00 × 10123\0\273.24 × 10124\0\263.27 × 10124\0\26
F21N/A2.03 × 10121\0\292.15 × 10032\0\182.87 × 10−137\0\132.85 × 10128\0\226.425 × 10−540\0\108.45 × 10−238\0\12
F22N/A2.20 × 10129\0\211.89 × 10130\0\204.39 × 10−237\0\132.79 × 10128\0\223.068 × 10−747\0\31.22 × 10−239\0\11
F23N/A2.75 × 10123\0\273.05 × 10128\0\222.31 × 10−131\0\193.26 × 10124\0\269.225 × 10−640\0\101.24 × 10030\0\20
Table 5. Average ranks obtained by each method in the Friedman test.
Chaotic Mapping | Average Ranking
Gauss | 3.50
Circle | 3.74
Sinusoidal | 3.89
Tent | 3.85
Kent | 4.85
Cubic | 4.57
Logistic | 3.61
Table 6. Results of a comparison with classic meta-heuristic algorithms.
Benchmark | CSHHO | HHO | WOA | SCA | CSO
F1Mean1.96 × 10−1131.24 × 10−886.15 × 10−742.92 × 10−132.82 × 10−19
Std1.38 × 10−1128.75 × 10−882.79 × 10−731.33 × 10−127.98 × 10−19
Rank12354
Best\Worst1.44 × 10−138\9.78 × 10−1125.93 × 10−111\6.19 × 10−871.16 × 10−87\1.85 × 10−722.12 × 10−20\8.05 × 10−124.81 × 10−26\4.63 × 10−18
F2Mean1.66 × 10−591 × 10−498.97 × 10−511.04 × 10−101.05 × 10−18
Std1.06 × 10−584.8 × 10−494.65 × 10−502.43 × 10−102.76 × 10−18
Rank13254
Best\Worst7.47 × 10−58\8.54 × 10−701.86 × 10−74\1.28 × 10−734.54 × 104\1.25 × 1042.19 × 10−4\1.31 × 10−36.13 × 103\3.20 × 103
F3Mean4.96 × 10−881.86 × 10−744.54 × 1040.0002196.13 × 103
Std2.81 × 10−871.28 × 10−731.25 × 1041.31 × 10−33.2 × 103
Rank12534
Best\Worst5.85 × 10−108\1.96 × 10−864.26 × 10−99\9.05 × 10−731.81 × 104\7.29 × 1041.75 × 10−12\9.26 × 10−33.04 × 102\1.65 × 104
F4Mean1.05 × 10−557.88 × 10−4953.47.27 × 10−521.6
Std6.13 × 10−554.62 × 10−482.82 × 1011.38 × 10−41.28 × 101
Rank12534
Best\Worst1.83 × 10−68\4.25 × 10−542.8 × 10−58\3.25 × 10−479.79 × 10−1\8.84 × 1012.56 × 10−7\6.9 × 10−40.0748\4.11 × 101
F5Mean8.57 × 10−31.02 × 10−22.79 × 1011.11 × 1014.77 × 101
Std2.23 × 10−21.43 × 10−24.98 × 10−12.78 × 1019.63 × 101
Rank12435
Best\Worst1.29 × 10−8\0.1321.25 × 10−5\0.072326.9\28.86.48\20427.1\583
F6Mean6.73 × 10−71.59 × 10−40.3740.3773.73
Std1.85 × 10−62.76 × 10−42.44 × 10−11.42 × 10−14.16 × 10−1
Rank12345
Best\Worst4.40 × 10−14\8.8 × 10−64.85 × 10−9\0.001810.0422\1.250.117\0.6273.05\4.99
F7Mean5.98 × 10−52.02 × 10−43.67 × 10−31.84 × 10−39.11 × 10−2
Std7.15 × 10−53.26 × 10−43.79 × 10−32.02 × 10−31.82 × 10−1
Rank12435
Best\Worst2.03 × 10−7\4.65 × 10−43.13 × 10−7\2.10 × 10−39.46 × 10−6\1.75 × 10−26.53 × 10−5\1.02 × 10−22.66 × 10−3\1.07
F8Mean−1.24 × 104−1.26 × 104−1.04 × 104−2.24 × 103−7.12 × 103
Std2.71 × 1029.50 × 1011.67 × 1031.52 × 1026.62 × 102
Rank21354
Best\Worst−1.26 × 104\−1.15 × 104−1.26 × 104\−1.19 × 104−1.26 × 104\−7.76 × 103−2.61 × 103\−1.81 × 103−8.89 × 103\−5.91 × 103
F9Mean003.41 × 10−150.5720.725
Std001.78 × 10−142.033.34
Rank11345
Best\Worst0\00\00\1.14 × 10−130\1.12 × 1010\2.11 × 101
F10Mean8.88 × 10−168.88 × 10−164.3 × 10−155.46 × 10−89.39 × 10−11
Std1.99 × 10−311.99 × 10−312.68 × 10−151.41 × 10−72.5 × 10−10
Rank11354
Best\Worst8.88 × 10−16\8.88 × 10−168.88 × 10−16\8.88 × 10−168.88 × 10−16\7.99 × 10−156.45 × 10−11\7.1 × 10−72.82 × 10−13\1.66 × 10−9
F11Mean001.97 × 10−29.43 × 10−21.22 × 10−2
Std007.55 × 10−21.56 × 10−14.49 × 10−2
Rank11453
Best\Worst0\00\00\4.34 × 10−10\7.65 × 10−10\2.47 × 10−1
F12Mean5.26 × 10−79.97 × 10−62.69 × 10−27.80 × 10−2229
Std1.2 × 10−61.36 × 10−50.042.80 × 10−21.1 × 103
Rank12345
Best\Worst1.68 × 10−11\5 × 10−63.07 × 10−8\7.06 × 10−54.58 × 10−3\2.64 × 10−13.41 × 10−2\1.63 × 10−11.57 × 10−1\7.34 × 103
F13Mean1.64 × 10−51.26 × 10−45.05 × 10−12.64 × 10−15.48
Std2.86 × 10−52.49 × 10−42.57 × 10−18.66 × 10−22.07 × 101
Rank12435
Best\Worst6.14 × 10−10\1.23 × 10−43.21 × 10−7\1.31 × 10−39.14 × 10−2\1.198.80 × 10−2\5.17 × 10−11.38\1.47 × 102
F14Mean1.391.373.061.631.34
Std1.230.7963.150.9351.45
Rank32541
Best\Worst9.98 × 101\5.939.98 × 101\5.930.9.98 × 101\10.80.9.98 × 101\2.980.9.98 × 101\10.8
F15Mean3.40 × 10−43.60 × 10−47.04 × 10−49.44 × 10−47.90 × 10−4
Std3.39 × 10−51.45 × 10−44.57 × 10−43.72 × 10−42.78 × 10−4
Rank12354
Best\Worst3.14 × 10−4\5.55 × 10−43.08 × 10−4\1.34 × 10−33.10 × 10−4\2.25 × 10−33.63 × 10−4\1.59 × 10−33.19 × 10−4\1.62 × 10−3
F16Mean−1.03−1.03−1.03−1.03−1.03
Std1.35 × 10−151.98 × 10−82.4 × 10−82.16 × 10−52.6 × 10−6
Rank12354
Best\Worst−1.03\−1.03−1.03\−1.03−1.03\−1.03−1.03\−1.03−1.03\−1.03
F17Mean3.98 × 1010.3.98 × 1010.398 × 1010.399 × 1010.398 × 101
Std1.88 × 10−82.62 × 10−52.15 × 10−57.21 × 10−41.86 × 10−5
Rank11111
Best\Worst3.98 × 10−1\3.98 × 10−13.98 × 10−1\3.98 × 10−13.98 × 10−1\3.98 × 10−13.98 × 10−1\0.4013.98 × 10−1\3.98 × 10−1
F18Mean3.003.003.003.003.00
Std1.15 × 10−71.01 × 10−61.62 × 10−45.88 × 10−52.09 × 10−4
Rank11111
Best\Worst3.00\3.003.00\3.003.00\3.003.00\3.003.00\3.00
F19Mean−3.86−3.86−3.85−3.85−3.86
Std1.83 × 10−33.29 × 10−31.71 × 10−22.18 × 10−31.01 × 10−2
Rank11111
Best\Worst−3.86\−3.86−3.86\−3.85−3.86\−3.75−3.86\−3.85−3.86\−3.8
F20Mean−3.13−3.07−3.26−3.01−3.24
Std1.05 × 1011.41 × 10−18.66 × 10−21.27 × 10−17.31 × 10−2
Rank34152
Best\Worst−3.31\−2.74−3.30\−2.73−3.32\−3.04−3.19\−2.59−3.32\−3.02
F21Mean−9.65−5.35−8.24−3.02−7.64
Std1.381.182.541.833.02
Rank14253
Best\Worst−10.2\−5.02−10.1\−5.04−10.2\−2.63−5.82\−0.497−10.2\−2.55
F22Mean−9.45−5.46−7.83−3.59−8.19
Std1.941.29\.971.792.98
Rank14352
Best\Worst−10.4\−5.03−10.1\−5.03−10.4\−2.76−8.23\−0.906−10.4\−2.74
F23Mean−9.97−5.04−7.59−4.1−8.05
Std1.461.023.291.593.28
Rank14352
Best\Worst−10.5\−5.09−9.96\−1.65−10.5\−1.67−9.42\−0.94510.5\−2.37
Table 7. The Bonferroni-Holm corrected p-values of Wilcoxon’s signed-rank test with classic meta-heuristic algorithms.
Benchmark | CSHHO (baseline, N/A) | HHO | WOA | SCA | CSO — each comparison cell gives the corrected p-value followed by the +\=\− counts.
F1N/A6.80 × 1080\0\506.50 × 10−80\0\505.89 × 10−80\0\504.84 × 10−80\0\50
F2N/A6.73 × 10−80\0\506.42 × 10−80\0\505.82 × 10−80\0\504.76 × 10−80\0\50
F3N/A1.38 × 10−65\0\456.35 × 10−80\0\505.74 × 10−80\0\504.69 × 10−80\0\50
F4N/A6.65 × 10−80\0\506.27 × 10−80\0\505.67 × 10−80\0\504.61 × 10−80\0\50
F5N/A5.24 × 10−118\0\326.20 × 10−80\0\505.59 × 10−80\0\504.53 × 10−80\0\50
F6N/A6.57 × 10−80\0\506.12 × 10−80\0\505.52 × 10−80\0\504.46 × 10−80\0\50
F7N/A4.11 × 10−411\0\394.27 × 10−81\0\494.18 × 10−81\0\494.38 × 10−80\0\50
F8N/A1.22 × 10−335\0\151.77 × 10−68\0\425.44 × 10−80\0\504.31 × 10−80\0\50
F9N/A3.00 × 1000\50\04.00 × 1000\48\26.92 × 10−70\8\428.13 × 10−10\45\5
F10N/A3.00 × 1000\50\03.17 × 10−60\15\355.37 × 10−80\0\504.23 × 10−80\0\50
F11N/A2.00 × 1000\50\08.13 × 10−10\45\51.10 × 10−70\3\473.17 × 10−30\36\14
F12N/A6.92 × 10−75\0\456.05 × 10−80\0\505.29 × 10−80\0\504.16 × 10−80\0\50
F13N/A1.44 × 10−311\0\395.97 × 10−80\0\505.21 × 10−80\0\504.08 × 10−80\0\50
F14N/A3.93 × 1006\30\141.15 × 10−26\16\284.60 × 10−36\2\423.93 × 1005\38\7
F15N/A3.57 × 10026\0\241.81 × 10−75\0\455.14 × 10−80\0\504.18 × 10−81\0\49
F16N/A4.00 × 1000\48\22.25 × 1000\47\36.94 × 10−80\0\501.15 × 10−20\38\12
F17N/A3.58 × 10−51\17\322.72 × 10−61\11\385.06 × 10−80\0\503.09 × 10−23\32\15
F18N/A2.96 × 10−12\36\126.87 × 10−80\0\506.94 × 10−80\0\505.00 × 10−12\41\7
F19N/A3.02 × 10−213\0\371.16 × 10−410\0\405.33 × 10−81\0\496.77 × 10−121\0\29
F20N/A2.96 × 10−119\0\311.04 × 10−541\0\92.34 × 10−413\0\375.59 × 10−538\0\12
F21N/A2.23 × 10−73\0\471.10 × 10022\0\284.99 × 10−80\0\502.49 × 10−410\0\40
F22N/A2.59 × 10−77\0\431.54 × 10025\0\255.33 × 10−81\0\499.79 × 10−216\0\34
F23N/A5.43 × 10−82\0\481.02 × 10−118\0\324.91 × 10−80\0\504.73 × 10−414\0\36
Table 8. Average rankings obtained by classic meta-heuristic in the Friedman test, and the best result is shown in boldface.
Algorithm | Average Ranking
CSHHO | 2.57
HHO | 3.67
WOA | 4.74
SCA | 5.96
CSO | 5.00
Table 9. Results of a comparison with the advanced meta-heuristic algorithms.
Benchmark | CSHHO | GCHHO | DEPSOASS | Improved GSA | DGOBLFOA
F1Mean1.96 × 101133.91 × 10−971.45 × 10−32.84 × 1012.88 × 10−8
Std1.38 × 10−1122.55 × 10−961.45 × 10−32.84 × 1012.88 × 10−8
Rank12453
Best\Worst9.78 × 10−112\1.44 × 10−1381.80 × 10−95\1.25 × 10−1193.22 × 10−3\3.86 × 10−44.68 × 101\1.51 × 1011.42 × 10−6\0.00 × 100
F2Mean1.66 × 10−591.62 × 10−501.88 × 10−12.81 × 1012.54 × 10−1
Std1.06 × 10−589.45 × 10−501.88 × 10−12.81 × 1012.54 × 10−1
Rank12354
Best\Worst7.47 × 10−58\8.54 × 10−706.63 × 10−49\3.57 × 10−602.84 × 10−1\1.19 × 10−13.46 × 101\1.44 × 1011.03 × 100\8.72 × 10−4
F3Mean4.96 × 10−881.46 × 10−655.37 × 10−13.32 × 1021.18 × 10−2
Std2.81 × 10−871.01 × 10−645.37 × 10−13.32 × 1021.18 × 10−2
Rank12453
Best\Worst1.96 × 10−86\5.85 × 10−1087.16 × 10−64\1.16 × 10−911.44 × 100\9.11 × 10−25.38 × 102\1.39 × 1024.66 × 10−1\1.45 × 10−24
F4Mean1.05 × 10−551.82 × 10−483.49 × 10−22.31 × 1001.74 × 10−4
Std6.13 × 10−559.06 × 10−483.49 × 10−22.31 × 1001.74 × 10−4
Rank12453
Best\Worst4.25 × 10−54\1.83 × 10−685.94 × 10−47\2.66 × 10−585.40 × 10−2\1.75 × 10−22.83 × 100\1.44 × 1008.57 × 10−3\1.29 × 10−20
F5Mean8.57 × 10−37.14 × 10−42.76 × 1011.19 × 1042.87 × 101
Std2.23 × 10−27.02 × 10−42.76 × 1011.19 × 1042.87 × 101
Rank21354
Best\Worst1.32 × 10−1\1.29 × 10−83.28 × 10−3\2.96 × 10−52.95 × 101\2.59 × 1012.60 × 104\1.29 × 1033.20 × 101\2.79 × 101
F6Mean6.73 × 10−71.20 × 10−61.30 × 10−32.83 × 1015.35 × 100
Std1.85 × 10−61.61 × 10−61.30 × 10−32.83 × 1015.35 × 100
Rank12354
Best\Worst8.80 × 10−6\4.40 × 10−141.02 × 10−5\1.47 × 10−73.27 × 10−3\3.91 × 10−44.15 × 101\1.43 × 1016.03 × 100\4.32 × 100
F7Mean5.98 × 10−51.94 × 10−41.26 × 10−11.04 × 1026.10 × 10−1
Std7.15 × 10−51.92 × 10−41.26 × 10−11.04 × 1026.10 × 10−1
Rank12354
Best\Worst4.65 × 10−4\2.03 × 10−77.84 × 10−4\5.01 × 10−62.21 × 10−1\7.33 × 10−21.52 × 102\3.44 × 1001.44 × 100\1.01 × 10−1
F8Mean−1.24 × 104−1.25 × 104−2.78 × 103−2.70 × 103−3.70 × 102
Std2.71 × 1021.67 × 102−2.78 × 103−2.70 × 1031.69 × 102
Rank21345
Best\Worst−1.15 × 104\−1.26 × 104−1.18 × 104\−1.26 × 104−1.79 × 103\−3.79 × 103−1.74 × 103\−3.42 × 103−1.79 × 103\−3.46 × 103
F9Mean0.000.003.72 × 1012.54 × 1028.95 × 100
Std0.000.003.72 × 1012.54 × 1028.95 × 100
Rank11453
Best\Worst0.00\0.000.00\0.006.02 × 101\2.00 × 1012.91 × 102\1.96 × 1022.81 × 101\2.18 × 10−3
F10Mean8.88 × 10−168.88 × 10−163.06 × 10−25.11 × 1004.23 × 10−2
Std1.99 × 10−311.99 × 10−313.06 × 10−25.11 × 1004.23 × 10−2
Rank11354
Best\Worst8.88 × 10−16\8.88 × 10−168.88 × 10−16\8.88 × 10−164.31 × 10−2\1.67 × 10−26.09 × 100\3.64 × 1006.08 × 10−1\8.88 × 10−16
F11Mean0.00 × 1000.00 × 1005.90 × 10−47.85 × 10−11.09 × 10−9
Std0.00 × 1000.00 × 1005.90 × 10−47.85 × 10−11.09 × 10−9
Rank11453
Best\Worst0.00\0.000.00\0.001.97 × 10−2\2.27 × 10−59.59 × 10−1\4.58 × 10−15.46 × 10−8\0.00 × 100
F12Mean5.26 × 10−78.05 × 10−89.89 × 10−61.29 × 1006.98 × 10−1
Std1.20 × 10−69.54 × 10−89.89 × 10−61.29 × 1006.98 × 10−1
Rank21354
Best\Worst5.00 × 10−6\1.68 × 10−115.02 × 10−7\5.77 × 10−92.58 × 10−5\4.76 × 10−62.32 × 100\6.10 × 10−11.01 × 100\3.67 × 10−1
F13Mean1.64 × 10−51.26 × 10−61.69 × 10−36.34 × 1002.44 × 100
Std2.86 × 10−51.48 × 10−61.69 × 10−36.34 × 1002.44 × 100
Rank21354
Best\Worst1.23 × 10−4\6.14 × 10−106.48 × 10−6\4.00 × 10−81.12 × 10−2\6.80 × 10−59.67 × 100\2.90 × 1002.88 × 100\1.60 × 100
F14Mean1.39 × 1009.98 × 10−11.52 × 1007.07 × 1004.34 × 100
Std1.23 × 1005.61 × 10−161.52 × 1007.07 × 1004.34 × 100
Rank21354
Best\Worst5.93 × 100\9.98 × 10−19.98 × 10−1\9.98 × 10−13.02 × 100\9.98 × 10−11.55 × 101\1.01 × 1001.18 × 101\9.98 × 10−1
F15Mean3.40 × 10−43.17 × 10−49.62 × 10−41.39 × 10−32.61 × 10−2
Std3.39 × 10−54.50 × 10−59.62 × 10−41.39 × 10−32.61 × 10−2
Rank21345
Best\Worst5.55 × 10−4\3.14 × 10−45.41 × 10−4\3.07 × 10−44.55 × 10−3\5.50 × 10−43.18 × 10−3\1.03 × 10−36.88 × 10−2\1.81 × 10−3
F16Mean−1.03 × 100−1.03 × 100−1.03 × 100−1.02 × 100−1.03 × 100
Std1.35 × 10−151.35 × 10−15−1.03 × 100−1.02 × 100−1.03 × 100
Rank11151
Best\Worst−1.03 × 100\−1.03 × 100−1.03 × 100\−1.03 × 100−1.03 × 100\−1.03 × 100−9.22 × 10−1\−1.03 × 100−1.03 × 100\−1.03 × 100
F17Mean3.98 × 10−13.98 × 10−13.98 × 10−14.08 × 10−15.30 × 10−1
Std1.88 × 10−81.12 × 10−163.98 × 10−14.08 × 10−15.30 × 10−1
Rank11145
Best\Worst3.98 × 10−13\98 × 10−13.98 × 10−1\3.98 × 10−13.98 × 10−1\3.98 × 10−14.35 × 10−1\3.98 × 10−11.70 × 100\4.00 × 10−1
F18Mean3.00 × 1003.00 × 1003.00 × 1004.91 × 1003.50 × 100
Std1.15 × 10−70.00 × 1003.00 × 1004.91 × 1003.50 × 100
Rank11154
Best\Worst3.00 × 100\3.00 × 1003.00 × 100\3.00 × 1003.00 × 100\3.00 × 1001.25 × 101\3.01 × 1006.51 × 100\3.01 × 100
F19Mean−3.86 × 100−3.86 × 100−3.86 × 100−3.45 × 100−3.50 × 100
Std1.83 × 10−32.24 × 10−15−3.86 × 100−3.45 × 100−3.50 × 100
Rank11154
Best\Worst−3.86 × 100\−3.86 × 100−3.86 × 100\−3.86 × 100−3.86 × 100\−3.86 × 100−2.90 × 100\−3.86 × 100−2.69 × 100\−3.85 × 100
F20Mean−3.13 × 100−3.25 × 100−3.31 × 100−1.69 × 100−2.09 × 100
Std1.05 × 10−15.88 × 10−2−3.31 × 100−1.69 × 100−2.09 × 100
Rank32154
Best\Worst−2.74 × 100\−3.31 × 100−3.20 × 100\−3.32 × 100−3.18 × 100\−3.32 × 100−6.72 × 10−1\−2.55 × 100−1.11 × 100\−3.10 × 100
F21Mean−9.65 × 100−6.07 × 100−6.85 × 100−4.05 × 100−4.49 × 100
Std1.38 × 1002.06 × 100−6.85 × 100−4.05 × 100−4.49 × 100
Rank13254
Best\Worst−5.02 × 100\−1.02 × 101−5.06 × 100\−1.02 × 101−2.68 × 100\−1.02 × 101−1.99 × 100\−7.93 × 100−2.67 × 100\−7.66 × 100
F22Mean−9.45 × 100−5.94 × 100−1.01 × 101−3.55 × 100−4.08 × 100
Std1.94 × 1001.97 × 100−1.01 × 101−3.55 × 100−4.08 × 100
Rank23154
Best\Worst−5.03 × 100\−1.04 × 101−5.09 × 100\−1.04 × 101−6.48 × 100\−1.04 × 101−1.93 × 100\−7.27 × 100−1.67 × 100\−8.98 × 100
F23Mean−9.97 × 100−5.67 × 100−1.03 × 101−4.34 × 100−4.48 × 100
Std1.46 × 1001.64 × 100−1.03 × 101−4.34 × 100−4.48 × 100
Rank23154
Best\Worst−5.09 × 100\−1.05 × 101−5.13 × 100\−1.05 × 101−5.42 × 100\−1.05 × 101−2.61 × 100\−7.03 × 100−2.43 × 100\−9.12 × 100
Table 10. The Bonferroni-Holm corrected p-values of Wilcoxon’s signed-rank test with the advanced meta-heuristic algorithms.
Benchmark | CSHHO (baseline, N/A) | GCHHO | DEPSOASS | Improved GSA | DGOBLFOA — each comparison cell gives the corrected p-value followed by the +\=\− counts.
F1N/A3.07 × 10−81\0\496.57 × 10−80\0\505.06 × 10−80\0\502.99 × 10−82\0\48
F2N/A6.80 × 10−80\0\506.50 × 10−80\0\504.99 × 10−80\0\504.01 × 10−80\0\50
F3N/A6.73 × 10−80\0\506.42 × 10−80\0\504.91 × 10−80\0\503.93 × 10−80\0\50
F4N/A6.65 × 10−80\0\506.35 × 10−80\0\504.84 × 10−80\0\503.85 × 10−80\0\50
F5N/A3.46 × 10−125\0\256.27 × 10−80\0\504.76 × 10−80\0\503.78 × 10−80\0\50
F6N/A4.84 × 10−39\0\416.20 × 10−80\0\504.69 × 10−80\0\503.70 × 10−80\0\50
F7N/A4.72 × 10−412\0\386.12 × 10−80\0\504.61 × 10−80\0\503.63 × 10−80\0\50
F8N/A3.70 × 10−135\0\156.05 × 10−80\0\504.53 × 10−80\0\503.55 × 10−850\0\0
F9N/A4.00 × 1000\50\05.97 × 10−80\0\504.46 × 10−80\0\503.48 × 10−80\0\50
F10N/A4.00 × 1000\50\05.89 × 10−80\0\504.38 × 10−80\0\504.71 × 10−80\2\48
F11N/A3.00 × 1000\50\05.82 × 10−80\0\504.31 × 10−80\0\508.59 × 10−20\42\8
F12N/A8.60 × 10−126\0\245.74 × 10−80\0\504.23 × 10−80\0\503.40 × 10−80\0\50
F13N/A7.25 × 10−436\0\145.67 × 10−80\0\504.16 × 10−80\0\503.33 × 10−80\0\50
F14N/A3.13 × 10−16\44\01.02 × 10−72\0\481.20 × 10−35\1\441.39 × 10−71\0\49
F15N/A3.99 × 10−648\0\25.59 × 10−80\0\504.08 × 10−80\0\503.25 × 10−80\0\50
F16N/A2.00 × 1000\50\05.52 × 10−80\0\506.75 × 10−80\0\503.17 × 10−40\26\24
F17N/A8.75 × 10−14\46\05.44 × 10−80\0\504.71 × 10−80\2\483.17 × 10−80\0\50
F18N/A8.75 × 10−14\46\05.37 × 10−80\0\506.88 × 10−80\0\503.10 × 10−80\0\50
F19N/A6.88 × 10−850\0\05.29 × 10−80\0\503.04 × 10−743\0\73.02 × 10−80\0\50
F20N/A1.10 × 10−645\0\55.21 × 10−80\0\502.99 × 10−849\0\13.08 × 10−81\0\49
F21N/A1.83 × 10−513\0\373.08 × 10−82\0\481.89 × 10−319\0\313.26 × 10−82\0\48
F22N/A4.72 × 10−413\0\375.14 × 10−80\0\501.55 × 10−344\0\62.95 × 10−80\0\50
F23N/A2.67 × 10−79\0\413.07 × 10−81\0\492.46 × 10−547\0\32.97 × 10−81\0\49
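The statistics in Table 10 are straightforward to reproduce. The sketch below is my own illustration, not the authors' code: it assumes a hypothetical `results` dictionary mapping each algorithm name to its 50 per-run best fitness values on one benchmark, runs Wilcoxon's signed-rank test against CSHHO, and applies the Bonferroni–Holm step-down correction. The corrected values above 1 in the table (e.g., 4.00 × 10^0 on F9) suggest the scaled p-values are reported without clipping at 1, so the sketch leaves them uncapped.

```python
# Sketch only: Wilcoxon signed-rank tests vs. CSHHO with Bonferroni-Holm
# correction. `results` (hypothetical) maps algorithm name -> array of 50
# best-fitness values on a single benchmark function.
import numpy as np
from scipy.stats import wilcoxon

def holm_corrected_pvalues(results, baseline="CSHHO"):
    rivals = [name for name in results if name != baseline]
    raw = []
    for name in rivals:
        diff = np.asarray(results[baseline]) - np.asarray(results[name])
        if np.all(diff == 0):
            raw.append(1.0)  # wilcoxon() rejects all-zero differences (pure ties)
        else:
            raw.append(wilcoxon(results[baseline], results[name]).pvalue)
    # Holm step-down: sort p ascending, scale the i-th smallest by (m - i),
    # and enforce monotonicity with a running maximum.
    m = len(raw)
    order = np.argsort(raw)
    corrected = np.empty(m)
    running = 0.0
    for i, idx in enumerate(order):
        running = max(running, (m - i) * raw[idx])
        corrected[idx] = running  # left uncapped, matching Table 10
    return dict(zip(rivals, corrected))
```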
Table 11. Average rankings obtained by the advanced meta-heuristics in the Friedman test; the best result is shown in boldface.
| Algorithm | Average Ranking |
|---|---|
| CSHHO | **1.70** |
| GCHHO | 1.83 |
| DEPSOASS | 2.76 |
| GSA | 4.87 |
| FOA | 3.85 |
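The average rankings in Table 11 follow from ranking the five algorithms on each benchmark and averaging the ranks, as in the Friedman test. A minimal sketch, assuming a hypothetical `means` array of shape (n_benchmarks, n_algorithms) holding mean errors (lower is better):

```python
# Sketch of the Friedman average ranking: rank algorithms per benchmark
# (rank 1 = best, ties get averaged ranks), then average over benchmarks.
import numpy as np
from scipy.stats import rankdata

def friedman_average_ranks(means, names):
    ranks = np.apply_along_axis(rankdata, 1, np.asarray(means))
    return dict(zip(names, ranks.mean(axis=0)))

# Example call for 23 benchmarks x 5 algorithms:
# friedman_average_ranks(means, ["CSHHO", "GCHHO", "DEPSOASS", "GSA", "FOA"])
```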
Table 12. Experimental results of scalability tests in different dimensions.
| Benchmark | Metric | CSHHO (D = 50) | HHO (D = 50) | CSHHO (D = 100) | HHO (D = 100) |
|---|---|---|---|---|---|
| F1 | Mean\Std | 3.49 × 10^7\1.19 × 10^8 | 3.30 × 10^8\9.91 × 10^7 | 1.08 × 10^10\2.06 × 10^9 | 1.10 × 10^10\2.09 × 10^9 |
| F2 | Mean\Std | 9.88 × 10^4\1.66 × 10^4 | 1.03 × 10^5\1.38 × 10^4 | 2.78 × 10^5\1.73 × 10^4 | 2.90 × 10^5\1.42 × 10^4 |
| F3 | Mean\Std | 8.65 × 10^2\1.16 × 10^2 | 9.00 × 10^2\1.52 × 10^2 | 2.89 × 10^2\4.96 × 10^2 | 3.01 × 10^3\5.84 × 10^2 |
| F4 | Mean\Std | 8.95 × 10^2\2.82 × 10^1 | 8.96 × 10^2\3.47 × 10^1 | 1.56 × 10^3\4.55 × 10^1 | 1.56 × 10^3\5.35 × 10^1 |
| F5 | Mean\Std | 6.71 × 10^2\4.36 × 10^0 | 6.74 × 10^2\4.16 × 10^0 | 6.85 × 10^2\2.82 × 10^0 | 6.85 × 10^2\3.25 × 10^0 |
| F6 | Mean\Std | 1.84 × 10^3\9.07 × 10^1 | 1.84 × 10^3\7.88 × 10^1 | 3.71 × 10^3\1.41 × 10^2 | 3.69 × 10^3\1.46 × 10^2 |
| F7 | Mean\Std | 1.20 × 10^3\3.34 × 10^1 | 1.20 × 10^3\3.11 × 10^1 | 2.01 × 10^3\7.14 × 10^1 | 2.01 × 10^3\6.02 × 10^1 |
| F8 | Mean\Std | 2.52 × 10^4\2.86 × 10^3 | 2.54 × 10^4\3.63 × 10^3 | 5.24 × 10^4\5.53 × 10^3 | 5.72 × 10^4\5.52 × 10^3 |
| F9 | Mean\Std | 9.41 × 10^3\9.83 × 10^2 | 9.49 × 10^3\8.76 × 10^2 | 2.21 × 10^4\1.46 × 10^3 | 2.28 × 10^4\2.12 × 10^3 |
| F10 | Mean\Std | 1.65 × 10^3\1.47 × 10^2 | 1.66 × 10^3\1.21 × 10^2 | 5.03 × 10^4\1.59 × 10^4 | 5.06 × 10^4\1.06 × 10^4 |
| F11 | Mean\Std | 2.17 × 10^8\1.28 × 10^8 | 1.88 × 10^8\1.05 × 10^8 | 1.58 × 10^9\5.29 × 10^8 | 1.68 × 10^9\5.46 × 10^8 |
| F12 | Mean\Std | 3.82 × 10^6\1.44 × 10^6 | 5.20 × 10^6\4.28 × 10^6 | 1.75 × 10^7\7.23 × 10^6 | 1.62 × 10^7\6.97 × 10^6 |
| F13 | Mean\Std | 1.54 × 10^6\1.01 × 10^6 | 2.07 × 10^6\2.53 × 10^6 | 4.81 × 10^6\1.82 × 10^6 | 5.41 × 10^6\1.64 × 10^6 |
| F14 | Mean\Std | 6.00 × 10^5\2.65 × 10^5 | 6.35 × 10^5\2.81 × 10^5 | 3.36 × 10^6\9.38 × 10^5 | 4.35 × 10^6\4.49 × 10^6 |
| F15 | Mean\Std | 4.38 × 10^3\6.23 × 10^2 | 4.52 × 10^3\6.15 × 10^2 | 8.55 × 10^3\1.00 × 10^3 | 8.56 × 10^3\8.52 × 10^2 |
| F16 | Mean\Std | 3.78 × 10^3\3.76 × 10^2 | 3.90 × 10^3\4.41 × 10^2 | 6.73 × 10^3\7.03 × 10^2 | 6.84 × 10^3\7.62 × 10^2 |
| F17 | Mean\Std | 5.39 × 10^5\4.79 × 10^6 | 4.63 × 10^6\4.40 × 10^6 | 5.75 × 10^6\2.56 × 10^6 | 6.62 × 10^6\3.52 × 10^6 |
| F18 | Mean\Std | 5.82 × 10^5\6.11 × 10^5 | 1.12 × 10^6\7.19 × 10^5 | 1.20 × 10^7\4.90 × 10^6 | 1.55 × 10^7\6.97 × 10^6 |
| F19 | Mean\Std | 3.63 × 10^2\3.07 × 10^2 | 3.39 × 10^3\3.20 × 10^2 | 6.05 × 10^3\5.46 × 10^2 | 6.06 × 10^3\4.89 × 10^2 |
| F20 | Mean\Std | 2.46 × 10^3\8.32 × 10^1 | 2.85 × 10^3\6.82 × 10^1 | 4.17 × 10^3\1.82 × 10^2 | 4.13 × 10^3\1.86 × 10^2 |
| F21 | Mean\Std | 1.03 × 10^4\1.09 × 10^3 | 1.14 × 10^4\1.04 × 10^3 | 2.56 × 10^4\1.24 × 10^3 | 2.56 × 10^4\1.48 × 10^3 |
| F22 | Mean\Std | 3.74 × 10^3\2.31 × 10^2 | 3.74 × 10^3\1.77 × 10^2 | 5.35 × 10^3\3.78 × 10^2 | 5.35 × 10^3\2.91 × 10^2 |
| F23 | Mean\Std | 4.20 × 10^3\2.15 × 10^2 | 4.18 × 10^3\1.76 × 10^2 | 7.25 × 10^3\5.84 × 10^2 | 7.25 × 10^3\4.62 × 10^2 |
| F24 | Mean\Std | 3.28 × 10^3\6.13 × 10^1 | 3.30 × 10^3\7.30 × 10^1 | 4.73 × 10^3\2.76 × 10^2 | 4.73 × 10^3\2.60 × 10^2 |
| F25 | Mean\Std | 1.08 × 10^4\1.79 × 10^3 | 1.10 × 10^4\1.63 × 10^3 | 2.85 × 10^4\2.46 × 10^3 | 2.86 × 10^4\4.26 × 10^3 |
| F26 | Mean\Std | 4.44 × 10^3\5.43 × 10^2 | 4.44 × 10^3\3.71 × 10^2 | 5.70 × 10^3\1.25 × 10^3 | 5.43 × 10^3\6.12 × 10^2 |
| F27 | Mean\Std | 3.82 × 10^3\1.43 × 10^2 | 3.89 × 10^3\1.77 × 10^2 | 5.79 × 10^3\4.35 × 10^2 | 5.86 × 10^3\4.88 × 10^2 |
| F28 | Mean\Std | 6.09 × 10^3\7.55 × 10^2 | 6.38 × 10^3\7.33 × 10^2 | 1.10 × 10^4\1.01 × 10^3 | 1.11 × 10^4\9.60 × 10^2 |
| F29 | Mean\Std | 5.23 × 10^7\1.82 × 10^7 | 5.25 × 10^7\1.55 × 10^7 | 1.52 × 10^8\6.75 × 10^7 | 1.54 × 10^8\7.05 × 10^7 |
Table 13. Training samples and test samples.
The first two columns are the input sample; the last two columns are the output sample.

| Excitation Current/A | Exciting Voltage/V | Reactive Power/Mvar | System Voltage/kV |
|---|---|---|---|
| 25.488 | 29.267 | 305.0 | 228.4 |
| 23.364 | 26.103 | 261.5 | 228.2 |
| 21.594 | 24.521 | 232.5 | 228.5 |
| 20.154 | 22.939 | 212.5 | 228.5 |
| 19.6824 | 22.148 | 197.5 | 228.6 |
| 18.8328 | 21.159 | 186.4 | 228.6 |
| 18.408 | 20.408 | 177.4 | 228.7 |
| 17.9124 | 19.9728 | 170.2 | 228.75 |
| 17.7 | 19.775 | 163.8 | 228.8 |
| 17.346 | 19.3795 | 158.5 | 228.8 |
| 16.992 | 18.984 | 154.2 | 228.8 |
| 16.7088 | 18.7863 | 150.5 | 228.8 |
| 16.638 | 18.5727 | 147.0 | 228.78 |
| 16.461 | 18.3987 | 144.2 | 228.85 |
| 16.1424 | 17.9715 | 137.4 | 228.84 |
| 15.6822 | 17.402 | 131.1 | 228.9 |
| 15.222 | 17.0065 | 125.0 | 228.9 |
| 14.868 | 16.611 | 118.35 | 229 |
| 14.2308 | 15.9782 | 108.75 | 229 |
| 12.9564 | 14.5346 | 88.5 | 229.1 |
| 12.39 | 13.8425 | 78.3 | 229.1 |
| 11.682 | 12.953 | 67.8 | 229.2 |
| 10.974 | 12.458 | 57.5 | 229.3 |
| 10.3368 | 11.4695 | 47.5 | 229.3 |
| 9.912 | 11.074 | 37.0 | 229.4 |
| 8.9916 | 10.283 | 27.5 | 229.4 |
| 8.4252 | 9.1756 | 17.2 | 229.48 |
| 7.8588 | 8.8592 | 7.5 | 229.5 |
| 7.2924 | 8.3055 | 1.75 | 229.57 |
| 6.9738 | 7.91 | −2.5 | 229.6 |
| 6.372 | 6.9213 | −11.7 | 229.6 |
| 5.8056 | 6.7235 | −21.0 | 229.62 |
| 5.664 | 6.4467 | −30.5 | 229.7 |
| 4.956 | 5.1415 | −39.6 | 229.71 |
| 4.248 | 4.5483 | −48.5 | 229.8 |
| 3.54 | 4.351 | −57.5 | 229.875 |
| 2.832 | 3.164 | −57.5 | 229.875 |
| 2.124 | 2.4521 | −66.5 | 229.9 |
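To make the modelling step concrete: an LS-SVM regressor maps the two inputs in Table 13 (excitation current, exciting voltage) to each output quantity. Below is a minimal sketch of LS-SVM regression in its standard dual linear-system form with an RBF kernel; it is an illustration under stated assumptions, not the authors' implementation, and the `gamma` (regularization) and `sigma` (kernel width) defaults are placeholders for exactly the hyperparameters the CSHHO search would tune.

```python
# Minimal LS-SVM regression sketch (dual linear system with RBF kernel).
# `gamma` and `sigma` are placeholder hyperparameters; in the paper's setup
# CSHHO would search for their best values on the training samples.
import numpy as np

def rbf_kernel(A, B, sigma):
    # Pairwise squared distances -> Gaussian kernel matrix
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    # Solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]
    return lambda Xq: rbf_kernel(np.asarray(Xq), X, sigma) @ alpha + b

# Usage with Table 13: X = [[25.488, 29.267], ...], y = reactive power column.
```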
Table 14. CSHHO-LSSVM model generalization ability verification.
| Sample No. | Quantity | Prediction Value | Sample Value | Absolute Error | Relative Error/% |
|---|---|---|---|---|---|
| 9 | Reactive Power/Mvar | 163.0515 | 163.8 | −0.7485 | −0.4570 |
| 9 | System Voltage/kV | 228.7877 | 228.8 | −0.0123 | −0.0054 |
| 14 | Reactive Power/Mvar | 144.5989 | 144.2 | 0.3989 | 0.2766 |
| 14 | System Voltage/kV | 228.8882 | 228.85 | 0.0382 | 0.0167 |
| 26 | Reactive Power/Mvar | 28.1322 | 27.6 | 0.5322 | 1.9282 |
| 26 | System Voltage/kV | 229.4014 | 229.45 | −0.0486 | −0.0212 |
| 35 | Reactive Power/Mvar | −40.5890 | −39.6 | −0.989 | 2.4975 |
| 35 | System Voltage/kV | 229.7915 | 229.75 | 0.0415 | 0.0181 |
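For clarity on the error columns used in Table 14 and in Tables 15 and 16 below (this reading is mine; the source does not state the formulas): the absolute error is the prediction minus the sample value, and the relative error is that difference divided by the sample value, expressed in percent.

```python
# Error metrics as read from Tables 14-16 (assumed definitions).
def absolute_error(pred, sample):
    return pred - sample

def relative_error_pct(pred, sample):
    return (pred - sample) / sample * 100.0

# Check against sample 9, reactive power:
# absolute_error(163.0515, 163.8)      -> -0.7485
# relative_error_pct(163.0515, 163.8)  -> -0.4570 (%)
```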
Table 15. Reactive power simulation results comparison.
| Model | Sample No. | Prediction Value | Sample Value | Absolute Error | Relative Error/% |
|---|---|---|---|---|---|
| LSSVM | 9 | 162.9631 | 163.8 | −0.8369 | −0.5128 |
| LSSVM | 35 | −40.8531 | −39.6 | −1.2531 | 3.1649 |
| CSHHO-LSSVM | 9 | 163.0515 | 163.8 | −0.7485 | −0.4570 |
| CSHHO-LSSVM | 35 | −40.5890 | −39.6 | −0.989 | 2.4975 |
Table 16. System voltage simulation results comparison.
| Model | Sample No. | Prediction Value | Sample Value | Absolute Error | Relative Error/% |
|---|---|---|---|---|---|
| LSSVM | 9 | 228.6717 | 228.8 | −0.1283 | −0.0561 |
| LSSVM | 35 | 229.8859 | 229.75 | 0.1359 | 0.0591 |
| CSHHO-LSSVM | 9 | 228.7877 | 228.8 | −0.0123 | −0.0054 |
| CSHHO-LSSVM | 35 | 229.7915 | 229.75 | 0.0415 | 0.0181 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
