Article

An Improved Wild Horse Optimizer for Solving Optimization Problems

1 School of Information Engineering, Sanming University, Sanming 365004, China
2 Department of Computer and Information Science, Linköping University, 581 83 Linköping, Sweden
3 Faculty of Science, Fayoum University, Faiyum 63514, Egypt
4 Faculty of Computer Sciences and Informatics, Amman Arab University, Amman 11953, Jordan
5 School of Computer Science, Universiti Sains Malaysia, Gelugor 11800, Malaysia
6 School of Education and Music, Sanming University, Sanming 365004, China
* Authors to whom correspondence should be addressed.
Mathematics 2022, 10(8), 1311; https://doi.org/10.3390/math10081311
Submission received: 30 March 2022 / Revised: 10 April 2022 / Accepted: 13 April 2022 / Published: 14 April 2022
(This article belongs to the Special Issue Optimisation Algorithms and Their Applications)

Abstract
Wild horse optimizer (WHO) is a recently proposed metaheuristic algorithm that simulates the social behavior of wild horses in nature. Although WHO shows competitive performance compared to some algorithms, it suffers from low exploitation capability and stagnation in local optima. This paper presents an improved wild horse optimizer (IWHO), which incorporates three improvements to enhance its optimizing capability. The main innovation of this paper is to put forward the random running strategy (RRS) and the competition for waterhole mechanism (CWHM). The random running strategy is employed to balance exploration and exploitation, and the competition for waterhole mechanism is proposed to boost exploitation behavior. Moreover, the dynamic inertia weight strategy (DIWS) is utilized to optimize the global solution. The proposed IWHO is evaluated using twenty-three classical benchmark functions, ten CEC 2021 test functions, and five real-world optimization problems. High-dimensional cases (D = 200, 500, 1000) are also tested. Compared with nine well-known algorithms, the experimental results on the test functions demonstrate that IWHO is highly competitive in terms of convergence speed, precision, accuracy, and stability. Further, the practical capability of the proposed method is verified by the results on the engineering design problems.

1. Introduction

Optimization is a research field that aims to maximize or minimize specific objective functions [1,2,3]. Most real-world engineering problems found in nature are classified as NP-hard optimization problems [4,5]. Optimization exists in nearly every field, such as economics [6], text clustering [7], pattern recognition [8], chemistry [9], engineering design [10], feature selection [11,12], face detection and recognition [13], and information technology [14]. Classical and traditional mathematical approaches cannot solve these problems accurately or adequately. Recently, many researchers have tried to solve them using a class of approximation algorithms known as metaheuristics. In the last two decades, an enormous amount of work on metaheuristic algorithms has been done. Metaheuristic algorithms have received much attention and have become popular due to their flexibility, simplicity, and ability to avoid local optima. They simulate natural phenomena or physical laws and can generally be categorized into four classes: (1) swarm intelligence algorithms, (2) evolution algorithms, (3) physics and chemistry algorithms, and (4) human-based algorithms.
Swarm intelligence algorithms contain algorithms which simulate collective or social behavior of specific creatures. Creatures in the real world can interact with each other to achieve intelligent behavior. Examples of such algorithms include particle swarm optimization (PSO) [15], ant colony optimization (ACO) [16], cuckoo search (CS) [17], grasshopper optimization algorithm (GOA) [18], virus colony search (VCS) [19], dolphin echolocation (DE) algorithm [20], Harris Hawk optimization [21], whale optimization algorithm (WOA) [22], ant lion optimization (ALO) [23], crow search algorithm (CSA) [24], salp swarm algorithm (SSA) [25], moth–flame optimization (MFO) [26], snake optimizer (SO) [27], coot bird [28], and squirrel search algorithm (SSA) [29].
Evolution algorithms are population-based and stochastic algorithms which maintain biological evolution processes such as selection, mutation, elimination, and migration [12,13]. This class contains genetic algorithm (GA) [30], evolutionary programming (EP) [31], differential evolution (DE) [32], bacterial foraging optimization (BFO) [33], and memetic algorithm (MA) [34].
Physics and chemistry algorithms mimic chemical laws or physical phenomena in the universe. Examples of such algorithms are simulated annealing (SA) [35], gravitational search algorithm (GSA) [36], electromagnetism mechanism (EM) algorithm [37], ion motion algorithm (IMA) [38], lightning search algorithm (LSA) [39], and vortex search algorithm (VSA) [40].
Human-based algorithms: Humans are the most intelligent creatures in the universe, as they can easily find the best way to fix their problems and issues. Human-based algorithms simulate human activities, both physical and nonphysical (such as thinking). Examples of this class include teaching–learning-based optimization (TLBO) [41], imperialist competitive algorithm (ICA) [42], and social-based algorithm (SBA) [43].
Although several metaheuristic techniques have been proposed, the no free lunch (NFL) theorem [44] encourages researchers to keep developing new ones. The NFL theorem states that no algorithm works efficiently on all optimization problems: an algorithm that solves a specific type of problem efficiently may fail on other problem classes. Therefore, many researchers devote themselves to proposing new algorithms or improving existing methods. For instance, Ning improved the whale optimization algorithm using Gaussian mutation [45]. Nautiyal and Tubishat enhanced the basic salp swarm algorithm by applying mutation schemes and opposition-based learning, respectively [46,47]. Moreover, Pelusi proposed a hybrid phase between exploration and exploitation using a fitness-dependent weight factor to strengthen the balance between exploration and exploitation in moth–flame optimization [48]. Recently, a new optimization algorithm called wild horse optimizer (WHO) [49] was proposed by Naruei and Keynia. WHO simulates the behavior of horses, in which foals leave their group and join another before becoming adults, which prevents mating between siblings. However, WHO may fall into local optima or converge slowly on high-dimensional and complex problems.
In this paper, an improved version of WHO, called “IWHO”, is proposed to enhance the original algorithm’s effectiveness and performance. IWHO improves the original algorithm using three different operators: random running strategy (RRS), dynamic inertia weight strategy (DIWS), and competition for waterhole mechanism (CWHM). In order to evaluate the performance of the proposed IWHO, we compared it with the classical WHO algorithm and eight other algorithms: grey wolf optimizer (GWO) [50], moth–flame optimization (MFO) [51], salp swarm algorithm (SSA) [52], whale optimization algorithm (WOA) [53], particle swarm optimizer (PSO) [15], hybridizing sine–cosine algorithm with harmony search (HSCAHS) [54], dynamic sine–cosine algorithm (DSCA) [55], and modified ant lion optimizer (MALO) [56] on thirty-three benchmark functions. Moreover, five different constrained engineering problems were used: welded beam design, tension/compression spring design, three-bar truss design, car crashworthiness design, and speed reducer design. The main contributions of this paper are:
  • The random running strategy is proposed to balance the exploration and exploitation phases.
  • The dynamic inertia weight strategy is applied to the waterhole to achieve the optimal global solution.
  • The competition for waterhole mechanism is introduced to stallion position calculations to boost exploitation behavior.
This paper is organized as follows: Section 2 describes WHO and other used operators, whereas Section 3 describes the improved algorithm. Section 4 and Section 5 show the experimental study and discussion using benchmark functions and five real-world engineering problems, whereas Section 6 concludes the paper.

2. Wild Horse Optimizer

Generally, horses can be divided into two classes based on their social organization (territorial and non-territorial). They live in groups of different ages, such as offspring, stallions, and mares (see Figure 1). Both stallions and mares live together and interact with each other in grazing. Foals leave their groups after they grow up and join other groups to establish their own families. This behavior prevents mating between stallions and siblings.
The wild horse optimization (WHO) algorithm is a metaheuristic swarm-based algorithm inspired by the social behavior of horses, such as grazing, domination, leadership hierarchy, and mating.
The WHO algorithm consists of five different steps, described below:

2.1. Creating Initial Populations, Horse Groups, Determining Leaders

If there are N individuals and G groups, then the number of non-leaders (mares and foals) is N − G, and the number of leaders is G. The proportion of stallions is defined as PS = G/N. Figure 1 shows how leaders are determined from the initial generation to create the various groups.

2.2. Grazing Behavior

As stated previously, most of a foal’s life is spent grazing near its group. To simulate the grazing phase, we assume that the stallion is located at the center of the grazing area. The following formula enables the other group members to move around it:
$$\bar{X}_{G,j}^{i} = 2Z\cos(2\pi RZ)\times\left(\mathrm{Stallion}_{G,j} - X_{G,j}^{i}\right) + \mathrm{Stallion}_{G,j}\tag{1}$$
where $X_{G,j}^{i}$ and $\mathrm{Stallion}_{G,j}$ are the positions of the $i$th group member and of the stallion in the $j$th group, respectively, $\bar{X}_{G,j}^{i}$ is the member's updated position, $R$ is a random number between −2 and 2, and $Z$ is an adaptive parameter computed by Equation (2):
$$P = \vec{R}_{1} < TDR,\qquad IDX = (P == 0),\qquad Z = R_{2} \odot IDX + \vec{R}_{3} \odot (\sim IDX)\tag{2}$$
where $P$ is a vector of 0s and 1s whose dimension equals the dimension of the problem, $\vec{R}_{1}$ and $\vec{R}_{3}$ are random vectors with entries between 0 and 1, $R_{2}$ is a random number between 0 and 1, and $TDR$ is a linearly decreasing parameter computed by Equation (3).
$$TDR = 1 - \frac{t}{T}\tag{3}$$
where t and T are the current and maximum iterations, respectively.
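The grazing update of Equations (1)–(3) can be sketched in Python (the paper's experiments use MATLAB; the function names here are illustrative, and the interpretation of which random draws are scalar versus per-dimension follows the definitions above):

```python
import numpy as np

rng = np.random.default_rng(0)

def tdr(t, T):
    # Eq. (3): parameter decreasing linearly from 1 to 0 over the run
    return 1.0 - t / T

def adaptive_z(dim, t, T):
    # Eq. (2): P is a 0/1 vector; where P == 0, use the scalar R2,
    # elsewhere use the per-dimension vector R3
    P = rng.random(dim) < tdr(t, T)
    idx = (P == 0)
    return rng.random() * idx + rng.random(dim) * (~idx)

def graze(foal, stallion, t, T):
    # Eq. (1): the foal circles the stallion at the grazing-area center
    Z = adaptive_z(foal.size, t, T)
    R = rng.uniform(-2.0, 2.0)
    return 2 * Z * np.cos(2 * np.pi * R * Z) * (stallion - foal) + stallion
```

The cosine term lets a member oscillate on both sides of the stallion, while Z shrinks the step as TDR decays.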

2.3. Horse Mating Behavior

As stated previously, one of the unique behaviors of horses compared to other animals is separating foals from their original groups prior to their reaching puberty and mating. To be able to simulate the behavior of mating between horses, the following formula is used:
$$X_{G,k}^{p} = \mathrm{Crossover}\left(X_{G,i}^{q},\ X_{G,j}^{z}\right),\quad i \ne j \ne k,\quad q = z = \mathrm{end},\quad \mathrm{Crossover} = \mathrm{Mean}\tag{4}$$
where $X_{G,k}^{p}$ is the position of horse $p$ in group $k$, which is formed from the positions of horse $q$ in group $i$ and horse $z$ in group $j$. In the basic WHO, the probability of crossover is set to a constant named PC.
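With Crossover = Mean, Equation (4) reduces to a component-wise average of the two parents; a minimal sketch (hypothetical function name):

```python
import numpy as np

def mating_crossover(parent_q, parent_z):
    # Eq. (4) with Crossover = Mean: the new horse's position is the
    # component-wise average of two parents from different groups
    return (np.asarray(parent_q) + np.asarray(parent_z)) / 2.0
```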

2.4. Group Leadership

Group leaders (stallions) lead their group members to a suitable area (a waterhole). The leaders also compete for the waterhole, and the dominant group gets to use the waterhole first. The following formula is used to simulate this behavior:
$$\overline{\mathrm{Stallion}}_{G,j} = \begin{cases} 2Z\cos(2\pi RZ)\times\left(WH - \mathrm{Stallion}_{G,j}\right) + WH, & \text{if } rand > 0.5\\ 2Z\cos(2\pi RZ)\times\left(WH - \mathrm{Stallion}_{G,j}\right) - WH, & \text{if } rand \le 0.5 \end{cases}\tag{5}$$
where $\overline{\mathrm{Stallion}}_{G,j}$ and $\mathrm{Stallion}_{G,j}$ are the candidate position and the current leader position in the $j$th group, respectively, and $WH$ is the position of the waterhole.

2.5. Exchange and Selection of Leaders

At first, leaders are selected randomly. After that, leaders are selected based on their fitness values. To simulate the exchange between leader positions and other individuals, the following formula is used:
$$\mathrm{Stallion}_{G,j} = \begin{cases} X_{G,j}^{i}, & \text{if } f\left(X_{G,j}^{i}\right) < f\left(\mathrm{Stallion}_{G,j}\right)\\ \mathrm{Stallion}_{G,j}, & \text{if } f\left(X_{G,j}^{i}\right) \ge f\left(\mathrm{Stallion}_{G,j}\right) \end{cases}\tag{6}$$
where $f\left(X_{G,j}^{i}\right)$ and $f\left(\mathrm{Stallion}_{G,j}\right)$ are the fitness values of the foal and the stallion, respectively.
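Equation (6) is a greedy selection for a minimization problem; a minimal sketch (illustrative helper name):

```python
def exchange_leader(foal, stallion, fitness):
    # Eq. (6): the foal becomes the new stallion only if its fitness
    # (to be minimized) is strictly better; otherwise the leader stays
    return foal if fitness(foal) < fitness(stallion) else stallion
```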

3. Improved Wild Horse Optimizer

In this section, the IWHO is introduced in detail. To enhance the optimizing capability of the basic algorithm, three methods are added. Firstly, since wild horses in nature tend to chase and run, a random running strategy is applied to the foals and stallions. Secondly, a dynamic inertia weight strategy, adopted from DSCA [55] and beneficial for balancing exploration and exploitation, is introduced to the waterhole. Finally, the competition for waterhole mechanism, inspired by the artificial gorilla troops optimizer (GTO) [57], is proposed for stallions and further improves the solution quality.

3.1. Random Running Strategy (RRS)

In nature, wild horses are very fond of running or chasing each other. Inspired by this, the random running strategy is proposed for both foals and stallions. The position-updating formula is presented as follows:
$$X_{G,j}^{i}\ \text{or}\ \overline{\mathrm{Stallion}}_{G,j} = lb + (ub - lb)\times rand\tag{7}$$
where $lb$ and $ub$ are the lower and upper boundaries of the search space, respectively.
When conducting the RRS, search agents may appear anywhere in the search space. Thus, this method can help search agents jump out of the local optima. Note that to balance exploration and exploitation, the probability of random running (PRR) is set to a small value of 0.1.
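Equation (7) simply re-draws the agent uniformly inside the bounds; a minimal sketch (function name is illustrative):

```python
import random

P_RR = 0.1  # probability of random running, as set in the paper

def random_running(lb, ub, dim, rng=random):
    # Eq. (7): reinitialize the agent uniformly in [lb, ub] per
    # dimension, letting it escape a local optimum
    return [lb + (ub - lb) * rng.random() for _ in range(dim)]
```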

3.2. Dynamic Inertia Weight Strategy (DIWS)

The dynamic weight strategy has been applied in many studies [55,58,59] and helps algorithms find the optimal global solution. Thus, to help the stallions find a better waterhole, a dynamic inertia weight is added to the waterhole in the first formula of Equation (5). The weight and the modified formula are calculated as follows:
$$w = \begin{cases} w_{\min} + (w_{\max} - w_{\min})\times\dfrac{f(t)_{i} - f(t)_{\min}}{f(t)_{avg} - f(t)_{\min}}, & \text{if } f(t)_{i} \le f(t)_{avg}\\[4pt] w_{\max}, & \text{if } f(t)_{i} > f(t)_{avg} \end{cases}\tag{8}$$
$$\overline{\mathrm{Stallion}}_{G,j} = 2Z\cos(2\pi RZ)\times\left(WH - \mathrm{Stallion}_{G,j}\right) + w\times WH\tag{9}$$
where $w_{\min}$ and $w_{\max}$ are the lower and upper boundary values of the weight, respectively, $f(t)_{i}$ is the fitness value of the current stallion at the $t$th iteration, $f(t)_{avg}$ is the average fitness value of all stallions, and $f(t)_{\min}$ is the minimum fitness value of the population.
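The piecewise weight rule above can be sketched as follows (the guard for the degenerate case where all stallions have equal fitness is an assumption, since the paper does not specify it):

```python
def inertia_weight(f_i, f_avg, f_min, w_min=0.01, w_max=0.99):
    # Eq. (8): stallions with at-most-average fitness get a weight that
    # grows with their distance from the best fitness; the rest get w_max
    if f_i <= f_avg:
        if f_avg == f_min:
            return w_min  # assumed fallback when all fitness values tie
        return w_min + (w_max - w_min) * (f_i - f_min) / (f_avg - f_min)
    return w_max
```

A stallion at the population best thus receives the smallest weight, keeping the waterhole's pull gentle on already-good leaders.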

3.3. Competition for Waterhole Mechanism (CWHM)

In nature, a waterhole may be found by two stallions simultaneously, which leads to rivalry between them over the waterhole. Thus, to further improve the solution quality, the competition for waterhole mechanism, which is similar to the competition for adult females in GTO [57], is proposed to replace the second formula of Equation (5). The position-updating formula is as follows:
$$\overline{\mathrm{Stallion}}_{G,j} = WH - Z\times\left(\mathrm{Stallion}_{G,j}\times Q_{1} - \mathrm{Stallion}_{G,j}\times Q_{2}\right)\tag{10}$$
where $Q_{1}$ and $Q_{2}$ are random numbers between −1 and 1.
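A sketch of Equation (10) in Python (Q1 and Q2 are drawn as scalars per update here; a per-dimension draw would be an equally plausible reading, as the paper does not specify):

```python
import random

def waterhole_competition(wh, stallion, Z, rng=random):
    # Eq. (10): Q1, Q2 ~ U(-1, 1) model two stallions jostling
    # around the waterhole WH, applied per dimension
    q1 = rng.uniform(-1.0, 1.0)
    q2 = rng.uniform(-1.0, 1.0)
    return [w - z * (s * q1 - s * q2) for w, s, z in zip(wh, stallion, Z)]
```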

3.4. Improved Wild Horse Optimizer

By combining the DIWS and CWHM, the following formula is used to replace Equation (5) in WHO:
$$\overline{\mathrm{Stallion}}_{G,j} = \begin{cases} 2Z\cos(2\pi RZ)\times\left(WH - \mathrm{Stallion}_{G,j}\right) + w\times WH, & \text{if } R_{3} > 0.5\\ WH - Z\times\left(\mathrm{Stallion}_{G,j}\times Q_{1} - \mathrm{Stallion}_{G,j}\times Q_{2}\right), & \text{if } R_{3} \le 0.5 \end{cases}\tag{11}$$
In IWHO, foals and stallions have a more flexible method to update their positions. The RRS helps search agents achieve a better balance between exploration and exploitation. At the same time, DIWS and CWHM enable the method to obtain high-quality solutions and improve the convergence speed. The pseudocode of IWHO is shown in Algorithm 1, and the flowchart of the proposed IWHO is shown in Figure 2.
Algorithm 1 Pseudocode of IWHO
Start IWHO.
1.  Input IWHO parameters: PC = 0.13, PS = 0.2, wmin = 0.01, wmax = 0.99, PRR = 0.1.
2.  Set population size (N) and the maximum number of iterations (T).
3.  Initialize the population of horses at random.
4.  Create foal groups and select stallions.
5.  While (tT)
6.  Calculate TDR using Equation (3).
7.  Calculate Z using Equation (2).
8.  For the number of stallions
9.    For the number of foals
10.   If rand > PC
11.     Update the position of the foal using Equation (1).
12.   Else if rand > PRR
13.     Update the position of the foal using Equation (4).
14.   Else
15.     Update the position of the foal using Equation (7).
16.   End
17.   End for
18.   If rand > PRR
19.   If rand > 0.5
20.     Generate the candidate position of stallion using Equation (10).
21.   Else
22.     Generate the candidate position of stallion using Equation (9).
23.   End if
24.   Else
25.   Generate the candidate position of stallion using Equation (7).
26.    End if
27.   If the candidate position of the stallion is better
28.    Replace the position of the stallion using the candidate position.
29.   End if
30.    End for
31.   Exchange foals and stallions position using Equation (6).
32.   t = t + 1
33. End While
34. Output the best solution obtained by IWHO.
End IWHO.

3.5. Analysis of Algorithm Computational Complexity

The computational complexity of IWHO is related to the population size (N), the maximum number of iterations (T), and the problem dimension (D). In the basic WHO, the computational complexity of initializing the population is O(N × D). The computational complexity of updating the positions of foals and stallions is O(N × D × T). In addition, considering the worst-case scenario in which all foals conduct the grazing behavior, the computational complexity of calculating Z is O(N × D × T). The computational complexity of the exchange and selection of leaders is O(Nfoal × T). Thus, the overall computational complexity of WHO is O(2 × N × D × T + N × D + Nfoal × T). In IWHO, it should be noted that the proposed RRS and CWHM do not increase the computational complexity.
Similarly, considering the worst-case scenario in which all stallions conduct the DIWS, the computational complexity of calculating the weight is O(Nstallion × T). Therefore, the overall computational complexity of IWHO is O(2 × N × D × T + N × D + N × T), which is only slightly higher than that of the basic algorithm.

4. Experimental Study

To demonstrate the performance and superiority of IWHO, a variety of experiments were carried out. IWHO was tested on 33 mathematical functions taken from two well-known benchmark suites: CEC 2005 [60] and CEC 2021 [61]. Its performance was compared to that of the classical WHO and eight other algorithms. The parameter settings of these experiments are discussed in the following subsection.

4.1. Optimization of Functions and Parameter Settings

Thirty-three mathematical functions of different types (unimodal, multimodal, fixed-dimension multimodal functions) were implemented from two different benchmark datasets: CEC2005 and CEC2021. Table 1 shows each mathematical function used to carry out these experiments, including its type, range of the search space, and its global optimal value.
IWHO has been compared to WHO and eight other algorithms: grey wolf optimizer (GWO) [50], moth–flame optimization (MFO) [51], salp swarm algorithm (SSA) [52], whale optimization algorithm (WOA) [53], particle swarm optimizer (PSO) [15], hybrid sine–cosine algorithm with harmony search (HSCAHS) [54], dynamic sine–cosine algorithm (DSCA) [55], and modified ant lion optimizer (MALO) [56]. The parameter settings of each algorithm are shown in Table 2. These comparative algorithms have shown considerable optimizing capability on benchmark functions; however, for some optimization problems, they cannot obtain satisfactory solutions. Hence, the advantages of the improved algorithm can be revealed through extensive comparison with these algorithms.
The experiments were carried out in Windows 10 with 16 GB RAM and an Intel (R) i5-9500 CPU. All simulations were executed using MATLAB R2016a. Moreover, the maximum number of iterations was set at 500, and the number of individuals was set to 30. In contrast, the number of dimensions was different according to each function, as shown in Table 1.

4.2. Analysis of Performance for the CEC 2005 Test Suite

The results of IWHO and the comparative algorithms are listed in Table 3. The table shows that the suggested approach achieves the best results on almost all unimodal functions (F1–F7) in all tested dimensions (30, 200, 500, and 1000). IWHO also ranked first on almost all multimodal functions (F8–F13) and on almost all fixed-dimension multimodal functions (F14–F23). Compared to the basic WHO and the other algorithms, the improved algorithm has better exploration and exploitation capability as a result of the three improvements.
Moreover, a non-parametric test known as the Wilcoxon signed-rank test was used to provide statistical evidence of the performance of the suggested approach. Table 4 shows the Wilcoxon results of IWHO vs. the other algorithms at the 5% significance level [62]. From this table, we can see that IWHO is better than the other metaheuristic algorithms in most cases.
Furthermore, Figure 3 shows the convergence curves of some functions (F1–F3, F5, F6, F8, F10, F12, F13) with 30 dimensions. It is clear from this figure that IWHO has powerful convergence. Further, Figure 4, Figure 5 and Figure 6 show the convergence curves of the same functions (F1–F3, F5, F6, F8, F10, F12, F13) with dimensions of 200, 500, and 1000, respectively. Figure 7 shows the convergence curves for some fixed-dimension multimodal functions (F14, F15, F17–F23). It is obvious that the suggested algorithm has good convergence even in these multimodal functions.

4.3. Statistical Analysis of Performance for the CEC 2021 Test Suite

In this subsection, the performance of IWHO is tested using ten mathematical functions from IEEE CEC 2021, and it is compared using the same comparative metaheuristics algorithms.
The results of CEC2021 test functions are given in Table 5 in terms of mean and standard deviation. Table 5 also shows the results of Friedman ranking test [63]. It can be seen that the proposed IWHO is ranked first in CEC_02 and CEC_06 and the second-best in the other two functions (CEC_05 and CEC_08). Overall, it is also ranked first.
Figure 8 shows the box plots of the introduced algorithm and the compared ones on nine functions (CEC_01–CEC_09). It can be seen that the improved algorithm has better stability on these functions. Further, Figure 9 shows the exploration and exploitation phases. The percentages of the two phases are calculated based on the dispersion of all search individuals, which exhibits the flexibility of the improved algorithm during the optimization process.

5. Real-World Applications

In this section, several real-world engineering design problems with constraints are utilized to assess IWHO’s significance and effectiveness. IWHO has been evaluated using five engineering design problems: welded beam, tension/compression spring, three-bar truss, car crashworthiness, and speed reducer design problems.

5.1. Welded Beam Design Problem

The first engineering problem used to evaluate IWHO is the welded beam design, a common and well-known problem [64], as shown in Figure 10. The goal is to select the optimal variables to minimize the total manufacturing cost of the welded beam, with four design variables:
  • Welded thickness (h)
  • Bar length (l)
  • Bar height (t)
  • Bar thickness (b)
The mathematical equations of this problem are shown below:
Consider:
$$x = [x_{1},\ x_{2},\ x_{3},\ x_{4}] = [h,\ l,\ t,\ b]$$
Minimize:
$$f(x) = 1.10471x_{1}^{2}x_{2} + 0.04811x_{3}x_{4}\left(14.0 + x_{2}\right)$$
Subject to:
$$\begin{aligned} g_{1}(x) &= \tau(x) - \tau_{\max} \le 0\\ g_{2}(x) &= \sigma(x) - \sigma_{\max} \le 0\\ g_{3}(x) &= \delta(x) - \delta_{\max} \le 0\\ g_{4}(x) &= x_{1} - x_{4} \le 0\\ g_{5}(x) &= P - P_{C}(x) \le 0\\ g_{6}(x) &= 0.125 - x_{1} \le 0\\ g_{7}(x) &= 1.10471x_{1}^{2} + 0.04811x_{3}x_{4}\left(14.0 + x_{2}\right) - 5.0 \le 0 \end{aligned}$$
where:
$$\begin{gathered} \tau(x) = \sqrt{(\tau')^{2} + 2\tau'\tau''\frac{x_{2}}{2R} + (\tau'')^{2}}\\ \tau' = \frac{P}{\sqrt{2}\,x_{1}x_{2}},\qquad \tau'' = \frac{MR}{J},\qquad M = P\left(L + \frac{x_{2}}{2}\right)\\ R = \sqrt{\frac{x_{2}^{2}}{4} + \left(\frac{x_{1} + x_{3}}{2}\right)^{2}},\qquad J = 2\left\{\sqrt{2}\,x_{1}x_{2}\left[\frac{x_{2}^{2}}{4} + \left(\frac{x_{1} + x_{3}}{2}\right)^{2}\right]\right\}\\ \sigma(x) = \frac{6PL}{x_{3}^{2}x_{4}},\qquad \delta(x) = \frac{6PL^{3}}{Ex_{3}^{3}x_{4}}\\ P_{C}(x) = \frac{4.013E\sqrt{x_{3}^{2}x_{4}^{6}/36}}{L^{2}}\left(1 - \frac{x_{3}}{2L}\sqrt{\frac{E}{4G}}\right) \end{gathered}$$
where the problem constants are:
$$P = 6000\ \mathrm{lb},\quad L = 14\ \mathrm{in},\quad E = 30\times10^{6}\ \mathrm{psi},\quad G = 12\times10^{6}\ \mathrm{psi},\quad \tau_{\max} = 13{,}600\ \mathrm{psi},\quad \sigma_{\max} = 30{,}000\ \mathrm{psi},\quad \delta_{\max} = 0.25\ \mathrm{in}$$
IWHO has been compared to the original WHO [49], GWO [50], MFO [51], WOA [53], SCA [65], ALO [66], DA [67], and LSHADE [68]. From Table 6, it is clear that IWHO achieves the best results compared to the other algorithms: [h, l, t, b] = [0.2057, 3.2530, 9.0366, 0.2057] with a fitness value of 1.6952.
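As a quick consistency check, the welded beam objective can be evaluated at the optimum reported above; the helper name is hypothetical:

```python
def welded_beam_cost(h, l, t, b):
    # Fabrication cost of the welded beam (the objective f(x) above)
    return 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)
```

Plugging in [0.2057, 3.2530, 9.0366, 0.2057] reproduces the reported fitness of 1.6952 to within the rounding of the four-decimal variables.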

5.2. Tension/Compression Spring Design Problem

The second constrained problem is the tension/compression spring problem [69], the goal of which is to select the optimal parameters to minimize the weight of the spring, as shown in Figure 11. This problem has three variables:
  • Diameter of wire (d)
  • Coil diameter (D)
  • Active coils number (N)
The mathematical description of this problem is given below:
Consider:
$$x = [x_{1},\ x_{2},\ x_{3}] = [d,\ D,\ N]$$
Minimize:
$$f(x) = \left(x_{3} + 2\right)x_{2}x_{1}^{2}$$
Subject to:
$$\begin{aligned} g_{1}(x) &= 1 - \frac{x_{2}^{3}x_{3}}{71785x_{1}^{4}} \le 0\\ g_{2}(x) &= \frac{4x_{2}^{2} - x_{1}x_{2}}{12566\left(x_{2}x_{1}^{3} - x_{1}^{4}\right)} + \frac{1}{5108x_{1}^{2}} - 1 \le 0\\ g_{3}(x) &= 1 - \frac{140.45x_{1}}{x_{2}^{2}x_{3}} \le 0\\ g_{4}(x) &= \frac{x_{1} + x_{2}}{1.5} - 1 \le 0 \end{aligned}$$
Range of variables:
$$0.05 \le x_{1} \le 2.00,\qquad 0.25 \le x_{2} \le 1.30,\qquad 2.00 \le x_{3} \le 15.00$$
Table 7 shows the results of the proposed algorithm compared to AO [70], HHO [71], SSA [52], WOA [53], GWO [50], PSO [15], MVO [72], GA [30], and HS [73]. As shown in Table 7, IWHO clearly ranked first, with [d, D, N] = [0.0517, 0.4155, 7.1564] and a fitness value of 0.0102.
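The spring objective is simple enough to verify directly at the reported optimum (helper name is illustrative):

```python
def spring_weight(d, D, N):
    # Weight of the tension/compression spring (the objective f(x) above)
    return (N + 2) * D * d**2
```

Evaluating at [0.0517, 0.4155, 7.1564] reproduces the reported fitness of 0.0102 to within rounding.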

5.3. Three-Bar Truss Design Problem

This engineering problem is a non-linear optimization problem introduced by Nowacki to minimize the weight of a bar structure [74]. The three-bar truss has two variables, as shown in Figure 12. The mathematical description of this problem is given below:
Consider:
$$x = [x_{1},\ x_{2}] = [A_{1},\ A_{2}]$$
Minimize:
$$f(x) = \left(2\sqrt{2}\,x_{1} + x_{2}\right)\times l$$
Subject to:
$$\begin{aligned} g_{1}(x) &= \frac{\sqrt{2}\,x_{1} + x_{2}}{\sqrt{2}\,x_{1}^{2} + 2x_{1}x_{2}}P - \sigma \le 0\\ g_{2}(x) &= \frac{x_{2}}{\sqrt{2}\,x_{1}^{2} + 2x_{1}x_{2}}P - \sigma \le 0\\ g_{3}(x) &= \frac{1}{\sqrt{2}\,x_{2} + x_{1}}P - \sigma \le 0 \end{aligned}$$
where:
$$l = 100\ \mathrm{cm},\qquad P = 2\ \mathrm{kN/cm^{2}},\qquad \sigma = 2\ \mathrm{kN/cm^{2}}$$
Range of variables:
$$0 \le x_{1},\ x_{2} \le 1$$
The results of this problem are given in Table 8, comparing IWHO with basic WHO [49], HHO [71], SSA [52], AOA [75], MVO [72], MFO [51], and GOA [18]. IWHO achieves the best solution [x1, x2] = [0.7884, 0.4081] and its objective value is 263.8523.
Table 8. Optimization results for the three-bar truss design problem.

| Algorithm | x1 | x2 | Optimum Weight |
|-----------|----|----|----------------|
| IWHO | 0.7884 | 0.4081 | 263.8523 |
| WHO [49] | 0.7980 | 0.3816 | 263.9181 |
| HHO [71] | 0.788662816 | 0.408283133832900 | 263.8958434 |
| SSA [52] | 0.78866541 | 0.408275784 | 263.89584 |
| AOA [75] | 0.79369 | 0.39426 | 263.9154 |
| MVO [72] | 0.78860276 | 0.408453070000000 | 263.8958499 |
| MFO [51] | 0.788244771 | 0.409466905784741 | 263.8959797 |
| GOA [18] | 0.788897555578973 | 0.407619570115153 | 263.895881496069 |
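The truss objective can likewise be checked against the table (helper name is illustrative); the higher-precision HHO entry is used because the four-decimal IWHO variables only reproduce the weight to rounding error:

```python
import math

def truss_weight(x1, x2, l=100.0):
    # Weight of the three-bar truss (the objective f(x) above), l = 100 cm
    return (2.0 * math.sqrt(2.0) * x1 + x2) * l
```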

5.4. Car Crashworthiness Design Problem

The car crashworthiness problem was first introduced by Gu et al. [76]. This engineering problem has 10 constraints and 11 design variables. The mathematical description and 3D model diagram of this problem can be seen in [77].
The results of this problem are given in Table 9, where IWHO is compared to GWO [50], GOA [18], PSO [15], MFO [51], SSA [52], DSCA [55], and MALO [56]. From this table, we can conclude that IWHO achieves the best results, [x1–x11] = [0.5000, 1.1186, 0.5000, 1.2982, 0.5000, 1.5000, 0.5000, 0.3450, 0.3450, −19.1594, 0.0020], with a fitness value of 22.8427.

5.5. Speed Reducer Design Problem

The last constrained engineering problem is the speed reducer design problem [75], as shown in Figure 13, which aims to find the minimum weight of the speed reducer. This problem has seven variables. The mathematical formulation of this problem is shown below:
Minimize:
$$f(x) = 0.7854x_{1}x_{2}^{2}\left(3.3333x_{3}^{2} + 14.9334x_{3} - 43.0934\right) - 1.508x_{1}\left(x_{6}^{2} + x_{7}^{2}\right) + 7.4777\left(x_{6}^{3} + x_{7}^{3}\right) + 0.7854\left(x_{4}x_{6}^{2} + x_{5}x_{7}^{2}\right)$$
Subject to:
$$\begin{aligned} g_{1}(x) &= \frac{27}{x_{1}x_{2}^{2}x_{3}} - 1 \le 0\\ g_{2}(x) &= \frac{397.5}{x_{1}x_{2}^{2}x_{3}^{2}} - 1 \le 0\\ g_{3}(x) &= \frac{1.93x_{4}^{3}}{x_{2}x_{3}x_{6}^{4}} - 1 \le 0\\ g_{4}(x) &= \frac{1.93x_{5}^{3}}{x_{2}x_{3}x_{7}^{4}} - 1 \le 0\\ g_{5}(x) &= \frac{\sqrt{\left(745x_{4}/(x_{2}x_{3})\right)^{2} + 16.9\times10^{6}}}{110.0x_{6}^{3}} - 1 \le 0\\ g_{6}(x) &= \frac{\sqrt{\left(745x_{5}/(x_{2}x_{3})\right)^{2} + 157.5\times10^{6}}}{85.0x_{7}^{3}} - 1 \le 0\\ g_{7}(x) &= \frac{x_{2}x_{3}}{40} - 1 \le 0\\ g_{8}(x) &= \frac{5x_{2}}{x_{1}} - 1 \le 0\\ g_{9}(x) &= \frac{x_{1}}{12x_{2}} - 1 \le 0\\ g_{10}(x) &= \frac{1.5x_{6} + 1.9}{x_{4}} - 1 \le 0\\ g_{11}(x) &= \frac{1.1x_{7} + 1.9}{x_{5}} - 1 \le 0 \end{aligned}$$
Range of variables:
$$2.6 \le x_{1} \le 3.6,\quad 0.7 \le x_{2} \le 0.8,\quad 17 \le x_{3} \le 28,\quad 7.3 \le x_{4} \le 8.3,\quad 7.8 \le x_{5} \le 8.3,\quad 2.9 \le x_{6} \le 3.9,\quad 5.0 \le x_{7} \le 5.5$$
IWHO is compared to AO [70], PSO [15], AOA [75], MFO [51], GA [30], SCA [65], HS [73], and FA [78]. As listed in Table 10, the proposed algorithm clearly achieves the best results: [x1x7] = [3.4976, 0.7, 17, 7.3, 7.8, 3.3501, 5.2855] with fitness value 2995.43736.
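A sketch of the objective, assuming the standard formulation of the speed reducer weight (including the 0.7854(x4·x6² + x5·x7²) shaft term); with the rounded variables reported in the text, the value agrees with the reported fitness of 2995.43736 to within about one unit of rounding error:

```python
def speed_reducer_weight(x1, x2, x3, x4, x5, x6, x7):
    # Standard speed reducer objective: total weight of the gear train
    return (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
            - 1.508 * x1 * (x6**2 + x7**2)
            + 7.4777 * (x6**3 + x7**3)
            + 0.7854 * (x4 * x6**2 + x5 * x7**2))
```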

6. Conclusions

In this paper, an improved wild horse optimizer (IWHO) is proposed for solving global optimization problems. The basic WHO is enhanced by integrating three improvements: random running strategy (RRS), dynamic inertia weight strategy (DIWS), and competition for waterhole mechanism (CWHM), which were inspired by the behavior of wild horses. RRS is utilized to enhance the global search capability and balance exploration and exploitation, while DIWS and CWHM are applied to increase the quality of the optimal solution. To evaluate the performance of the proposed IWHO, twenty-three classical benchmark functions with various types and dimensions, ten CEC 2021 test functions, and five real-world optimization problems are employed for testing. Meanwhile, nine well-known algorithms are used for comparison: basic wild horse optimizer (WHO), grey wolf optimizer (GWO), moth–flame optimization (MFO), salp swarm algorithm (SSA), whale optimization algorithm (WOA), particle swarm optimizer (PSO), hybridizing sine–cosine algorithm with harmony search (HSCAHS), dynamic sine–cosine algorithm (DSCA), and modified ant lion optimizer (MALO). The numerical and statistical results indicate that IWHO outperforms other methods for most test functions, showing that the improvements applied to WHO are successful. In addition, the results of engineering design problems verify the applicability of the proposed IWHO.
In the future, IWHO may be applied to high-dimensional, real-world problems, such as multilayer perceptron training for solving classification tasks. Additionally, the IWHO can be hybridized with other metaheuristic algorithms to further improve its optimization capability and solve more optimization problems, such as feature selection, multilevel thresholding image segmentation, and parameter optimization.

Author Contributions

Conceptualization, R.Z.; methodology, R.Z. and H.-M.J.; software, S.W.; validation, D.W.; writing—original draft preparation, A.G.H. and R.Z.; writing—review and editing, L.A. and H.-M.J.; funding acquisition, R.Z. and H.-M.J. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by Sanming University, which introduces high-level talent to start scientific research; funding support project (21YG01, 20YG14), Sanming University National Natural Science Foundation Breeding Project (PYT2103, PYT2105), Fujian Natural Science Foundation Project (2021J011128), Guiding Science and Technology Projects in Sanming City (2021-S-8), Educational Research Projects of Young and Middle-Aged Teachers in Fujian Province (JAT200618), Scientific Research and Development Fund of Sanming University (B202009), and by Open Research Fund Program of Fujian Provincial Key Laboratory of Agriculture Internet of Things Application (ZD2101).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Acknowledgments

The authors would like to thank the Fujian Key Lab of Agriculture IoT Application and the IoT Application Engineering Research Center of Fujian Province Colleges and Universities for their support, as well as the anonymous reviewers and the editor for their careful reviews and constructive suggestions, which helped improve the quality of this paper.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

  1. Hassanien, A.E.; Emary, E. Swarm Intelligence: Principles, Advances, and Applications; CRC Press: Boca Raton, FL, USA, 2018; pp. 93–119.
  2. Hussien, A.G.; Hassanien, A.E.; Houssein, E.H.; Amin, M.; Azar, A.T. New binary whale optimization algorithm for discrete optimization problems. Eng. Optimiz. 2020, 52, 945–959.
  3. Yang, X.S. Engineering Optimization: An Introduction with Metaheuristic Applications; John Wiley & Sons: Hoboken, NJ, USA, 2010.
  4. Jamil, M.; Yang, X.S. A literature survey of benchmark functions for global optimization problems. Int. J. Math. Model. Numer. Optim. 2013, 4, 150–194.
  5. Fathi, H.; AlSalman, H.; Gumaei, A.; Manhrawy, I.I.M.; Hussien, A.G.; El-Kafrawy, P. An Efficient Cancer Classification Model Using Microarray and High-Dimensional Data. Comput. Intel. Neurosc. 2021, 2021, 7231126.
  6. Mousavi-Avval, S.H.; Rafiee, S.; Sharifi, M.; Hosseinpour, S.; Notarnicola, B.; Tassielli, G.; Renzulli, P.A. Application of multi-objective genetic algorithms for optimization of energy, economics and environmental life cycle assessment in oilseed production. J. Clean. Prod. 2017, 140, 804–815.
  7. Abualigah, L.; Gandomi, A.H.; Elaziz, M.A.; Hussien, A.G.; Khasawneh, A.M.; Alshinwan, M.; Houssein, E.H. Nature-inspired optimization algorithms for text document clustering—A comprehensive analysis. Algorithms 2020, 13, 345.
  8. Shamir, J.; Rosen, J.; Mahlab, U.; Caulfield, H.J. Optimization methods for pattern recognition. Int. Soc. Opt. Eng. 1992, 40, 2–24.
  9. Houssein, E.H.; Amin, M.; Hassanien, A.G.; Houssein, A.E. Swarming behaviour of salps algorithm for predicting chemical compound activities. In Proceedings of the 8th IEEE International Conference on Intelligent Computing and Information Systems (ICICIS), Cairo, Egypt, 5–7 December 2017; pp. 315–320.
  10. Chou, J.S.; Pham, A.D. Nature-inspired metaheuristic optimization in least squares support vector regression for obtaining bridge scour information. Inform. Sci. 2017, 399, 64–80.
  11. Hussien, A.G.; Hassanien, A.E.; Houssein, E.H.; Bhattacharyya, S.; Amin, M. S-shaped Binary Whale Optimization Algorithm for Feature Selection. In Recent Trends in Signal and Image Processing; Springer: Singapore, 2019; Volume 727, pp. 79–87.
  12. Hussien, A.G.; Houssein, E.H.; Hassanien, A.E. A binary whale optimization algorithm with hyperbolic tangent fitness function for feature selection. In Proceedings of the 8th IEEE International Conference on Intelligent Computing and Information Systems (ICICIS), Cairo, Egypt, 5–7 December 2017; pp. 166–172.
  13. Besnassi, M.; Neggaz, N.; Benyettou, A. Face detection based on evolutionary Haar filter. Pattern Anal. Appl. 2020, 23, 309–330.
  14. Wang, Z.Y.; Xing, H.L.; Li, T.R.; Yang, Y.; Qu, R.; Pan, Y. A modified ant colony optimization algorithm for network coding resource minimization. IEEE T. Evolut. Comput. 2016, 20, 325–342.
  15. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; pp. 1942–1948.
  16. Dorigo, M.; Gambardella, L.M. Ant colony system: A cooperative learning approach to the traveling salesman problem. IEEE T. Evolut. Comput. 1997, 1, 53–66.
  17. Yang, X.S.; Deb, S. Cuckoo search via Lévy flights. In Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC), Coimbatore, India, 9–11 December 2009; pp. 210–214.
  18. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper optimisation algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47.
  19. Li, M.D.; Zhao, H.; Weng, X.W.; Han, T. A novel nature-inspired algorithm for optimization: Virus colony search. Adv. Eng. Softw. 2016, 92, 65–88.
  20. Kaveh, A.; Farhoudi, N. A new optimization method: Dolphin echolocation. Adv. Eng. Softw. 2013, 59, 53–70.
  21. Hussien, A.G.; Amin, M. A self-adaptive Harris Hawks optimization algorithm with opposition-based learning and chaotic local search strategy for global optimization and feature selection. Int. J. Mach. Learn. Cyb. 2022, 13, 309–336.
  22. Hussien, A.G.; Oliva, D.; Houssein, E.H.; Juan, A.A.; Yu, X. Binary whale optimization algorithm for dimensionality reduction. Mathematics 2020, 8, 1821.
  23. Assiri, A.S.; Hussien, A.G.; Amin, M. Ant Lion Optimization: Variants, hybrids, and applications. IEEE Access 2020, 8, 77746–77764.
  24. Hussien, A.G.; Amin, M.; Wang, M.; Liang, G.; Alsanad, A.; Gumaei, A.; Chen, H. Crow Search Algorithm: Theory, Recent Advances, and Applications. IEEE Access 2020, 8, 173548–173565.
  25. Hussien, A.G. An enhanced opposition-based Salp Swarm Algorithm for global optimization and engineering problems. J. Amb. Intel. Hum. Comp. 2021, 13, 129–150.
  26. Hussien, A.G.; Amin, M.; Abd El Aziz, M. A comprehensive review of moth-flame optimisation: Variants, hybrids, and applications. J. Exp. Theor. Artif. Intell. 2020, 32, 705–725.
  27. Hashim, F.A.; Hussien, A.G. Snake Optimizer: A novel meta-heuristic optimization Algorithm. Knowl.-Based Syst. 2022, 242, 108320.
  28. Mostafa, R.R.; Hussien, A.G.; Khan, M.A.; Kadry, S.; Hashim, F. Enhanced COOT Optimization Algorithm for Dimensionality Reduction. In Proceedings of the Fifth International Conference of Women in Data Science at Prince Sultan University (WiDS PSU), Riyadh, Saudi Arabia, 28–29 March 2022.
  29. Jain, M.; Singh, V.; Rani, A. A novel nature-inspired algorithm for optimization: Squirrel search algorithm. Swarm Evol. Comput. 2019, 44, 148–175.
  30. Holland, J.H. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence; MIT Press: Cambridge, MA, USA, 1992.
  31. Juste, K.; Kita, H.; Tanaka, E.; Hasegawa, J. An evolutionary programming solution to the unit commitment problem. IEEE T. Power Syst. 1999, 14, 1452–1459.
  32. Rocca, P.; Oliveri, G.; Massa, A. Differential evolution as applied to electromagnetics. IEEE Antennas Propag. Mag. 2011, 53, 38–49.
  33. Passino, K.M. Biomimicry of bacterial foraging for distributed optimization and control. IEEE Contr. Syst. Mag. 2002, 22, 52–67.
  34. Moscato, P.; Mendes, A.; Berretta, R. Benchmarking a memetic algorithm for ordering microarray data. Biosystems 2007, 88, 56–75.
  35. Kirkpatrick, S.; Gelatt, C.D., Jr.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680.
  36. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inform. Sci. 2009, 179, 2232–2248.
  37. BİRBİL, Ş.İ.; Fang, S.C. An electromagnetism-like mechanism for global optimization. J. Glob. Optim. 2003, 25, 263–282.
  38. Javidy, B.; Hatamlou, A.; Mirjalili, S. Ions motion algorithm for solving optimization problems. Appl. Soft Comput. 2015, 32, 72–79.
  39. Abualigah, L.; Abd Elaziz, M.; Hussien, A.G.; Alsalibi, B.; Jalali, S.M.J.; Gandomi, A.H. Lightning search algorithm: A comprehensive survey. Appl. Intell. 2021, 51, 2353–2376.
  40. Doğan, B.; Ölmez, T. A new metaheuristic for numerical function optimization: Vortex search algorithm. Inform. Sci. 2015, 293, 125–145.
  41. Rao, R.V.; Savsani, V.J.; Vakharia, D. Teaching–learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput. Aided Des. 2011, 43, 303–315.
  42. Atashpaz-Gargari, E.; Lucas, C. Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic competition. In Proceedings of the IEEE Congress on Evolutionary Computation, Singapore, 25–28 September 2007; pp. 4661–4667.
  43. Ramezani, F.; Lotfi, S. Social-based algorithm (SBA). Appl. Soft Comput. 2013, 13, 2837–2856.
  44. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE T. Evolut. Comput. 1997, 1, 67–82.
  45. Ning, G.Y.; Cao, D.Q. Improved Whale Optimization Algorithm for Solving Constrained Optimization Problems. Discret. Dyn. Nat. Soc. 2021, 2021, 8832251.
  46. Nautiyal, B.; Prakash, R.; Vimal, V.; Liang, G.; Chen, H. Improved Salp Swarm Algorithm with Mutation Schemes for Solving Global Optimization and Engineering Problems. Eng. Comput. 2021, 1–23.
  47. Tubishat, M.; Idris, N.; Shuib, L.; Abushariah, M.A.M.; Mirjalili, S. Improved Salp Swarm Algorithm Based on Opposition Based Learning and Novel Local Search Algorithm for Feature Selection. Expert Syst. Appl. 2020, 145, 113122.
  48. Pelusi, D.; Mascella, R.; Tallini, L.; Nayak, J.; Naik, B.; Deng, Y. An Improved Moth-Flame Optimization Algorithm with Hybrid Search Phase. Knowl. Based Syst. 2020, 191, 105277.
  49. Naruei, I.; Keynia, F. Wild horse optimizer: A new meta-heuristic algorithm for solving engineering optimization problems. Eng. Comput. 2021.
  50. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
  51. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl. Based Syst. 2015, 89, 228–249.
  52. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191.
  53. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
  54. Singh, N.; Kaur, J. Hybridizing sine-cosine algorithm with harmony search strategy for optimization design problems. Soft Comput. 2021, 25, 11053–11075.
  55. Li, Y.; Zhao, Y.; Liu, J. Dynamic sine cosine algorithm for large-scale global optimization problems. Expert Syst. Appl. 2021, 177, 114950.
  56. Wang, S.; Sun, K.; Zhang, W.; Jia, H. Multilevel thresholding using a modified ant lion optimizer with opposition-based learning for color image segmentation. Math. Biosci. Eng. 2021, 18, 3092–3143.
  57. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. Artificial gorilla troops optimizer: A new nature-inspired metaheuristic algorithm for global optimization problems. Int. J. Intell. Syst. 2021, 36, 5887–5958.
  58. Chen, H.L.; Yang, C.J.; Heidari, A.A.; Zhao, X.H. An efficient double adaptive random spare reinforced whale optimization algorithm. Expert Syst. Appl. 2020, 154, 113018.
  59. Dong, H.; Xu, Y.L.; Li, X.P.; Yang, Z.L.; Zou, C.H. An improved antlion optimizer with dynamic random walk and dynamic opposite learning. Knowl. Based Syst. 2021, 216, 106752.
  60. Suganthan, P.N.; Hansen, N.; Liang, J.J.; Deb, K.; Chen, Y.P.; Auger, A.; Tiwari, S. Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization; Technical Report, Nanyang Technological University, Singapore and KanGAL; Kanpur Genetic Algorithms Lab.: Kanpur, India, 2005.
  61. Zheng, R.; Jia, H.M.; Abualigah, L.; Wang, S.; Wu, D. An improved remora optimization algorithm with autonomous foraging mechanism for global optimization problems. Math. Biosci. Eng. 2022, 19, 3994–4037.
  62. García, S.; Fernández, A.; Luengo, J.; Herrera, F. Advanced nonparametric tests for multiple comparisons in the design of experiments in computational intelligence and data mining: Experimental analysis of power. Inf. Sci. 2010, 180, 2044–2064.
  63. Theodorsson-Norheim, E. Friedman and Quade tests: BASIC computer program to perform nonparametric two-way analysis of variance and multiple comparisons on ranks of several related samples. Comput. Biol. Med. 1987, 17, 85–99.
  64. Coello, C.A.C. Use of a self-adaptive penalty approach for engineering optimization problems. Comput. Ind. 2000, 41, 113–127.
  65. Mirjalili, S. SCA: A sine cosine algorithm for solving optimization problems. Knowl. Based Syst. 2016, 96, 120–133.
  66. Mirjalili, S. The ant lion optimizer. Adv. Eng. Softw. 2015, 83, 80–98.
  67. Mirjalili, S. Dragonfly algorithm: A new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems. Neural Comput. Appl. 2016, 27, 1053–1073.
  68. Tanabe, R.; Fukunaga, A.S. Improving the search performance of SHADE using linear population size reduction. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC), Beijing, China, 6–11 July 2014; pp. 1658–1665.
  69. Baykasoğlu, A.; Akpinar, Ş. Weighted superposition attraction (WSA): A swarm intelligence algorithm for optimization problems–part 2: Constrained optimization. Appl. Soft Comput. 2015, 37, 396–415.
  70. Abualigah, L.; Yousri, D.; Abd Elaziz, M.; Ewees, A.A.; Al-Qaness, M.A.; Gandomi, A.H. Aquila optimizer: A novel meta-heuristic optimization algorithm. Comput. Ind. Eng. 2021, 157, 107250.
  71. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872.
  72. Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-verse optimizer: A nature-inspired algorithm for global optimization. Neural Comput. Appl. 2016, 27, 495–513.
  73. Geem, Z.W.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68.
  74. Ray, T.; Saini, P. Engineering design optimization using a swarm with an intelligent information sharing among individuals. Eng. Optimiz. 2001, 33, 735–748.
  75. Abualigah, L.; Diabat, A.; Mirjalili, S.; Abd Elaziz, M.; Gandomi, A.H. The arithmetic optimization algorithm. Comput. Method Appl. M. 2021, 376, 113609.
  76. Gu, L.; Yang, R.; Tho, C.H.; Makowskit, M.; Faruquet, O.; Li, Y.L. Optimisation and robustness for crashworthiness of side impact. Int. J. Vehicle Des. 2001, 26, 348–360.
  77. Yildiz, B.S.; Pholdee, N.; Bureerat, S.; Yildiz, A.R.; Sait, S.M. Enhanced grasshopper optimization algorithm using elite opposition-based learning for solving real-world engineering problems. Eng. Comput. 2021.
  78. Yang, X.S.; He, X. Firefly algorithm: Recent advances and applications. Int. J. Swarm Intell. R. 2013, 1, 36–50.
Figure 1. Formation of groups from the original population.
Figure 2. Flowchart of IWHO.
Figure 3. Convergence curves for the optimization algorithms on test functions (F1, F2, F3, F5, F6, F8, F10, F12, F13) with D = 30.
Figure 4. Convergence curves for the optimization algorithms on test functions (F1, F2, F3, F5, F6, F8, F10, F12, F13) with D = 200.
Figure 5. Convergence curves for the optimization algorithms on test functions (F1, F2, F3, F5, F6, F8, F10, F12, F13) with D = 500.
Figure 6. Convergence curves for the optimization algorithms on test functions (F1, F2, F3, F5, F6, F8, F10, F12, F13) with D = 1000.
Figure 7. Convergence curves for the optimization algorithms on fixed-dimension test functions (F14, F15, F17–F23).
Figure 8. Box plots of CEC2021 test functions (CEC_01–CEC_09).
Figure 9. Exploration and exploitation phases for IWHO on CEC2021 test functions (CEC_01–CEC_09).
Figure 10. The welded beam design problem: 3D model diagram (left) and structural parameters (right).
Figure 11. Tension/compression spring design problem (3D model diagram and structural parameters).
Figure 12. Three-bar truss design problem (3D model diagram and structural parameters).
Figure 13. The speed reducer design problem: 3D model diagram (left) and structural parameters (right).
Table 1. Feature properties of the test functions (D indicates the dimension).

Function Type | Function | D | Range | Theoretical Optimization Value
Unimodal test functions | F1 | 30/100/500/1000 | [−100, 100] | 0
 | F2 | 30/100/500/1000 | [−10, 10] | 0
 | F3 | 30/100/500/1000 | [−100, 100] | 0
 | F4 | 30/100/500/1000 | [−100, 100] | 0
 | F5 | 30/100/500/1000 | [−30, 30] | 0
 | F6 | 30/100/500/1000 | [−100, 100] | 0
 | F7 | 30/100/500/1000 | [−1.28, 1.28] | 0
Multimodal test functions | F8 | 30/100/500/1000 | [−500, 500] | −418.9829 × D
 | F9 | 30/100/500/1000 | [−5.12, 5.12] | 0
 | F10 | 30/100/500/1000 | [−32, 32] | 0
 | F11 | 30/100/500/1000 | [−600, 600] | 0
 | F12 | 30/100/500/1000 | [−50, 50] | 0
 | F13 | 30/100/500/1000 | [−50, 50] | 0
Fixed-dimension multimodal test functions | F14 | 2 | [−65, 65] | 0.998004
 | F15 | 4 | [−5, 5] | 0.0003075
 | F16 | 2 | [−5, 5] | −1.03163
 | F17 | 2 | [−5, 5] | 0.398
 | F18 | 2 | [−2, 2] | 3
 | F19 | 3 | [−1, 2] | −3.8628
 | F20 | 6 | [0, 1] | −3.3220
 | F21 | 4 | [0, 10] | −10.1532
 | F22 | 4 | [0, 10] | −10.4028
 | F23 | 4 | [0, 10] | −10.5363
CEC2021 unimodal test functions | CEC_01 | 20 | [−100, 100] | 100
CEC2021 basic test functions | CEC_02 | 20 | [−100, 100] | 1100
 | CEC_03 | 20 | [−100, 100] | 700
 | CEC_04 | 20 | [−100, 100] | 1900
CEC2021 hybrid test functions | CEC_05 | 20 | [−100, 100] | 1700
 | CEC_06 | 20 | [−100, 100] | 1600
 | CEC_07 | 20 | [−100, 100] | 2100
CEC2021 composition test functions | CEC_08 | 20 | [−100, 100] | 2200
 | CEC_09 | 20 | [−100, 100] | 2400
 | CEC_10 | 20 | [−100, 100] | 2500
Table 2. Parameter values for the IWHO and other comparative optimization algorithms.

Algorithm | Parameters
IWHO | PS = 0.2; PC = 0.13; PRR = 0.1; w ∈ [0.01, 0.99]
WHO [49] | PS = 0.2; PC = 0.13
GWO [50] | a = [2, 0]
MFO [51] | b = 1; t = [−1, 1]; a ∈ [−1, −2]
SSA [52] | c1 ∈ [0, 1]; c2 ∈ [0, 1]
WOA [53] | a1 = [2, 0]; a2 = [−2, −1]; b = 1
PSO [15] | c1 = 2; c2 = 2; W ∈ [0.2, 0.9]; vMax = 6
HSCAHS [54] | a = 2; bandwidth = 0.02
DSCA [55] | w ∈ [0.1, 0.9]; σ = 0.1; aend = 0; astart = 2
MALO [56] | Switch probability = 0.5
Table 3. Comparison of optimization results between IWHO and other optimization algorithms on classical test functions (F1–F23).

Function | D | Metric | IWHO | WHO | GWO | MFO | SSA | WOA | PSO | HSCAHS | DSCA | MALO
F130Mean03.05 × 10−432.09 × 10−271.69 × 1031.97 × 10−78.09 × 10−742.09 × 10−42.79 × 10−503.88 × 10−1061.28 × 10−3
Std01.57 × 10−423.99 × 10−273.83 × 1033.38 × 10−72.79 × 10−733.23 × 10−41.41 × 10−492.11 × 10−1051.17 × 10−3
200Mean04.57 × 10−329.50 × 10−82.88 × 1051.69 × 1045.20 × 10−703.38 × 1023.46 × 10−341.34 × 10−913.95 × 104
Std01.81 × 10−314.45 × 10−82.81 × 1042.15 × 1032.84 × 10−694.88 × 1018.54 × 10−347.35 × 10−919.15 × 103
500Mean02.96 × 10−281.63 × 10−31.16 × 1069.46 × 1043.14 × 10−715.85 × 1034.56 × 10−291.90 × 10−782.17 × 105
Std01.62 × 10−275.53 × 10−43.51 × 1046.80 × 1038.49 × 10−714.89 × 1028.10 × 10−291.04 × 10−773.75 × 104
1000Mean06.50 × 10−292.53 × 10−12.73 × 1062.37 × 1056.48 × 10−704.10 × 1042.05 × 10−254.69 × 10−735.80 × 105
Std02.29 × 10−286.74 × 10−25.33 × 1041.27 × 1043.21 × 10−692.39 × 1034.20 × 10−252.57 × 10−728.66 × 104
F230Mean02.22 × 10−259.06 × 10−172.65 × 1012.371.56 × 10−517.364.04 × 10−271.22 × 10−605.29 × 101
Std07.39 × 10−256.90 × 10−172.03 × 1011.734.94 × 10−511.05 × 1019.46 × 10−276.65 × 10−605.31 × 101
200Mean06.92 × 1013.14 × 10−57.55 × 1021.55 × 1021.18 × 10−494.76 × 1021.19 × 10−181.05 × 10−513.76 × 1073
Std01.80 × 1029.14 × 10−64.70 × 1011.26 × 1013.91 × 10−496.08 × 1011.99 × 10−183.96 × 10−512.06 × 1074
500Mean06.01 × 1021.09 × 10−24.30 × 101165.38 × 1021.00 × 10−485.74 × 10583.84 × 10−167.18 × 10−502.31 × 10244
Std06.18 × 1022.39 × 10−32.23 × 101172.39 × 1014.88 × 10−483.14 × 10596.21 × 10−163.93 × 10−49Inf
1000Mean01.43 × 1038.06 × 10−1Inf1.20 × 1031.30 × 10−491.40 × 1035.11 × 10−15InfInf
Std01.37 × 1039.03 × 10−1NaN3.61 × 1013.29 × 10−497.08 × 1011.09 × 10−14NaNNaN
F330Mean03.56 × 10−266.23 × 10−62.28 × 1041.74 × 1034.35 × 1047.36 × 1011.08 × 10−482.25 × 10−574.56 × 103
Std01.82 × 10−251.02 × 10−51.31 × 1041.30 × 1031.17 × 1043.10 × 1014.06 × 10−481.23 × 10−561.79 × 103
200Mean01.76 × 10−122.14 × 1048.63 × 1052.32 × 1055.05 × 1068.90 × 1041.73 × 10−312.86 × 10−453.42 × 105
Std04.91 × 10−121.11 × 1041.56 × 1051.31 × 1051.36 × 1061.88 × 1045.48 × 10−311.54 × 10−449.60 × 104
500Mean01.06 × 10−83.26 × 1054.86 × 1061.01 × 1062.91 × 1075.94 × 1055.34 × 10−261.35 × 10−432.09 × 106
Std05.30 × 10−81.03 × 1059.34 × 1055.23 × 1059.83 × 1061.18 × 1059.19 × 10−266.83 × 10−436.22 × 105
1000Mean02.36 × 10−51.52 × 1061.83 × 1074.24 × 1061.18 × 1082.47 × 1065.52 × 10−231.46 × 10−357.32 × 106
Std01.28 × 10−42.93 × 1054.06 × 1061.96 × 1064.57 × 1076.38 × 1051.12 × 10−227.85 × 10−352.30 × 106
F430Mean1.98 × 10−3201.97 × 10−177.10 × 10−76.92 × 1011.05 × 1015.04 × 1011.049.05 × 10−265.55 × 10−481.68 × 101
Std04.78 × 10−176.47 × 10−79.082.712.89 × 1012.44 × 10−12.18 × 10−253.03 × 10−473.47
200Mean02.10 × 10−112.37 × 1019.72 × 1013.49 × 1018.49 × 1011.96 × 1012.86 × 10−161.39 × 10−414.09 × 101
Std07.03 × 10−116.677.39 × 10−13.491.40 × 1011.614.71 × 10−165.77 × 10−414.46
500Mean07.02 × 10−106.47 × 1019.89 × 1014.00 × 1017.72 × 1012.81 × 1011.16 × 10−113.39 × 10−334.95 × 101
Std01.24 × 10−94.894.11 × 10−13.332.80 × 1011.371.59 × 10−111.73 × 10−324.89
1000Mean02.25 × 10−97.89 × 1019.95 × 1014.55 × 1018.43 × 1013.30 × 1012.50 × 10−71.97 × 10−285.30 × 101
Std04.00 × 10−93.181.95 × 10−13.351.85 × 1011.904.41 × 10−71.08 × 10−274.17
F530Mean4.373.58 × 1012.70 × 1012.68 × 1062.19 × 1022.79 × 1013.13 × 1032.88 × 1012.86 × 1019.77 × 10−1
Std9.712.35 × 1016.78 × 10−11.46 × 1073.37 × 1024.26 × 10−11.64 × 1049.01 × 10−23.04 × 10−15.22
200Mean4.34 × 1012.88 × 1021.98 × 1021.05 × 1094.11 × 1061.98 × 1026.61 × 1051.99 × 1021.99 × 1025.60 × 101
Std7.82 × 1013.26 × 1024.76 × 10−11.13 × 1081.14 × 1061.89 × 10−11.52 × 1054.40 × 10−26.75 × 10−24.75 × 101
500Mean1.27 × 1024.99 × 1024.98 × 1025.12 × 1093.86 × 1074.96 × 1023.00 × 1074.99 × 1024.99 × 1022.31 × 102
Std1.90 × 1022.183.18 × 10−12.17 × 1088.07 × 1064.62 × 10−13.48 × 1065.15 × 10−31.87 × 10−21.93 × 102
1000Mean1.22 × 1029.98 × 1021.07 × 1031.24 × 10101.18 × 1089.94 × 1022.82 × 1089.99 × 1029.99 × 1024.36 × 102
Std2.47 × 1029.67 × 10−13.20 × 1013.12 × 1081.13 × 1077.27 × 10−12.87 × 1073.67 × 10−21.69 × 10−23.94 × 102
F630Mean5.09 × 10−59.53 × 10−38.12 × 10−11.00 × 1031.52 × 10−74.21 × 10−12.27 × 10−46.675.574.69 × 10−4
Std4.64 × 10−52.92 × 10−24.15 × 10−13.04 × 1031.80 × 10−72.56 × 10−13.22 × 10−41.76 × 10−13.09 × 10−13.21 × 10−4
200Mean2.193.58 × 1012.91 × 1012.88 × 1051.69 × 1041.07 × 1013.24 × 1024.92 × 1014.81 × 1011.56
Std3.263.22 × 1011.582.26 × 1041.84 × 1032.784.57 × 1012.13 × 10−13.38 × 10−16.98 × 10−1
500Mean1.07 × 1011.10 × 1029.06 × 1011.18 × 1069.48 × 1042.84 × 1015.92 × 1031.24 × 1021.23 × 1026.01
Std1.34 × 1013.18 × 1011.892.78 × 1045.33 × 1038.785.16 × 1022.13 × 10−13.46 × 10−12.62
1000Mean1.20 × 1012.29 × 1022.02 × 1022.74 × 1062.36 × 1056.85 × 1014.06 × 1042.49 × 1022.48 × 1021.24 × 101
Std2.28 × 1012.01 × 1012.533.49 × 1041.03 × 1041.80 × 1011.94 × 1032.60 × 10−14.41 × 10−14.84
F730Mean2.23 × 10−41.23 × 10−31.85 × 10−32.291.77 × 10−14.07 × 10−33.161.30 × 10−41.86 × 10−39.01 × 10−5
Std2.05 × 10−48.00 × 10−41.19 × 10−32.967.74 × 10−25.54 × 10−34.201.92 × 10−41.88 × 10−37.28 × 10−5
200Mean4.58 × 10−41.62 × 10−31.70 × 10−22.99 × 1031.76 × 1016.24 × 10−32.92 × 1031.08 × 10−42.34 × 10−37.14 × 10−5
Std4.58 × 10−42.22 × 10−35.45 × 10−34.19 × 1023.537.00 × 10−34.63 × 1029.81 × 10−52.07 × 10−35.22 × 10−5
500Mean5.06 × 10−41.75 × 10−35.38 × 10−23.86 × 1042.72 × 1023.39 × 10−34.39 × 1041.13 × 10−43.77 × 10−31.03 × 10−4
Std6.25 × 10−41.56 × 10−31.48 × 10−21.74 × 1034.13 × 1013.30 × 10−35.70 × 1038.65 × 10−53.35 × 10−31.10 × 10−4
1000Mean4.28 × 10−43.08 × 10−31.38 × 10−11.98 × 1051.78 × 1035.79 × 10−32.44 × 1051.14 × 10−43.20 × 10−39.34 × 10−5
Std4.57 × 10−42.65 × 10−32.96 × 10−27.27 × 1031.74 × 1025.81 × 10−36.04 × 1031.43 × 10−43.32 × 10−38.66 × 10−5
F830Mean−1.26 × 104−8.80 × 103−5.98 × 103−8.46 × 103−7.46 × 103−9.80 × 103−4.77 × 103−2.44 × 103−4.46 × 103−1.21 × 104
Std6.64 × 10−45.09 × 1026.98 × 1021.19 × 1037.32 × 1021.57 × 1031.30 × 1033.14 × 1023.59 × 1021.08 × 103
200Mean−8.38 × 104−3.48 × 104−2.96 × 104−3.70 × 104−3.46 × 104−6.87 × 104−1.31 × 104−6.34 × 103−1.18 × 104−7.52 × 104
Std5.243.54 × 1034.82 × 1032.68 × 1033.14 × 1031.29 × 1044.80 × 1039.93 × 1028.64 × 1029.10 × 103
500Mean−2.09 × 105−6.03 × 104−5.72 × 104−6.21 × 104−5.99 × 104−1.77 × 105−2.40 × 104−1.05 × 104−1.91 × 104−1.90 × 105
Std3.87 × 1017.66 × 1034.22 × 1034.14 × 1034.63 × 1032.92 × 1048.68 × 1031.56 × 1032.27 × 1032.43 × 104
1000Mean−4.19 × 105−8.36 × 104−8.80 × 104−8.87 × 104−8.66 × 104−3.48 × 105−3.44 × 104−1.52 × 104−2.83 × 104−3.76 × 105
Std2.01 × 1029.06 × 1031.43 × 1046.25 × 1036.65 × 1035.54 × 1041.30 × 1042.13 × 1033.81 × 1034.60 × 104
F930Mean04.55 × 10−103.401.66 × 1025.25 × 10101.01 × 102008.33 × 101
Std01.74 × 10−94.193.80 × 1011.57 × 10103.03 × 101002.80 × 101
200Mean002.21 × 1012.22 × 1038.20 × 1027.58 × 10−152.03 × 103001.03 × 103
Std009.221.11 × 1028.14 × 1014.15 × 10−141.45 × 102008.92 × 101
500Mean007.71 × 1016.98 × 1033.15 × 10306.31 × 103003.74 × 103
Std002.84 × 1011.37 × 1021.38 × 10201.92 × 102001.88 × 102
1000Mean002.05 × 1021.55 × 1047.63 × 10301.42 × 104008.78 × 103
Std004.77 × 1012.12 × 1021.44 × 10203.30 × 102002.93 × 102
F1030Mean8.88 × 10−162.07 × 10−151.03 × 10−131.78 × 1012.744.44 × 10−154.07 × 10−18.88 × 10−168.88 × 10−164.17
Std01.70 × 10−152.27 × 10−144.327.62 × 10−11.87 × 10−156.82 × 10−1002.11
200Mean8.88 × 10−162.66 × 10−152.40 × 10−52.00 × 1011.31 × 1014.44 × 10−156.578.88 × 10−168.88 × 10−161.49 × 101
Std05.26 × 10−154.81 × 10−67.54 × 10−26.63 × 10−12.47 × 10−153.42 × 10−1008.88 × 10−1
500Mean8.88 × 10−161.48 × 10−151.89 × 10−32.03 × 1011.42 × 1015.15 × 10−151.20 × 1011.24 × 10−158.88 × 10−161.60 × 101
Std01.35 × 10−154.06 × 10−41.17 × 10−12.52 × 10−12.86 × 10−153.13 × 10−11.08 × 10−1507.80 × 10−1
1000Mean8.88 × 10−161.95 × 10−151.76 × 10−22.03 × 1011.46 × 1014.56 × 10−151.60 × 1016.10 × 10−158.88 × 10−161.60 × 101
Std01.90 × 10−152.26 × 10−32.24 × 10−11.45 × 10−12.55 × 10−153.82 × 10−18.64 × 10−1501.28
F1130Mean002.56 × 10−33.10 × 1012.07 × 10−28.30 × 10−39.87 × 10−3005.55 × 10−2
Std005.97 × 10−34.92 × 1011.48 × 10−23.16 × 10−21.05 × 10−2002.88 × 10−2
200Mean005.28 × 10−32.56 × 1031.57 × 10201.48003.74 × 102
Std001.37 × 10−21.65 × 1022.56 × 10106.05 × 10−1009.11 × 101
500Mean002.22 × 10−21.03 × 1048.55 × 1023.70 × 10−188.09 × 101001.96 × 103
Std003.95 × 10−23.03 × 1025.06 × 1012.03 × 10−179.46003.25 × 102
1000Mean003.75 × 10−22.45 × 1042.13 × 1033.70 × 10−182.77 × 102005.35 × 103
| Function | D | Metric | IWHO | WHO | GWO | MFO | SSA | WOA | PSO | HSCAHS | DSCA | MALO |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
|  |  | Std | 0 | 0 | 5.93 × 10^−2 | 4.32 × 10^2 | 1.03 × 10^2 | 2.03 × 10^−17 | 2.08 × 10^1 | 0 | 0 | 9.55 × 10^2 |
| F12 | 30 | Mean | 6.24 × 10^−7 | 7.45 × 10^−3 | 3.91 × 10^−2 | 8.53 × 10^6 | 7.37 | 2.42 × 10^−2 | 6.91 × 10^−3 | 1.05 | 7.71 × 10^−1 | 1.67 × 10^−5 |
|  |  | Std | 1.16 × 10^−6 | 2.63 × 10^−2 | 1.88 × 10^−2 | 4.67 × 10^7 | 3.37 | 2.20 × 10^−2 | 2.63 × 10^−2 | 4.52 × 10^−2 | 7.15 × 10^−2 | 1.67 × 10^−5 |
|  | 200 | Mean | 9.49 × 10^−4 | 3.80 × 10^−1 | 5.38 × 10^−1 | 2.17 × 10^9 | 4.93 × 10^3 | 7.15 × 10^−2 | 3.94 × 10^1 | 1.19 | 1.14 | 4.62 × 10^−4 |
|  |  | Std | 1.56 × 10^−3 | 1.70 × 10^−1 | 4.58 × 10^−2 | 2.91 × 10^8 | 7.29 × 10^3 | 3.02 × 10^−2 | 1.98 × 10^1 | 2.79 × 10^−2 | 3.13 × 10^−2 | 3.36 × 10^−4 |
|  | 500 | Mean | 1.44 × 10^−3 | 7.45 × 10^−1 | 7.46 × 10^−1 | 1.19 × 10^10 | 1.46 × 10^6 | 1.06 × 10^−1 | 2.38 × 10^5 | 1.19 | 1.17 | 9.62 × 10^−4 |
|  |  | Std | 2.08 × 10^−3 | 2.14 × 10^−1 | 4.10 × 10^−2 | 6.36 × 10^8 | 9.21 × 10^5 | 5.63 × 10^−2 | 9.88 × 10^4 | 9.22 × 10^−3 | 8.84 × 10^−3 | 6.70 × 10^−4 |
|  | 1000 | Mean | 1.00 × 10^−3 | 9.73 × 10^−1 | 1.20 | 3.06 × 10^10 | 1.11 × 10^7 | 9.41 × 10^−2 | 9.37 × 10^6 | 1.18 | 1.17 | 4.64 × 10^−4 |
|  |  | Std | 2.08 × 10^−3 | 1.86 × 10^−1 | 2.17 × 10^−1 | 1.13 × 10^9 | 2.60 × 10^6 | 4.92 × 10^−2 | 2.32 × 10^6 | 3.00 × 10^−3 | 5.39 × 10^−3 | 4.04 × 10^−4 |
| F13 | 30 | Mean | 4.35 × 10^−5 | 1.14 × 10^−1 | 5.87 × 10^−1 | 1.37 × 10^7 | 1.31 × 10^1 | 4.99 × 10^−1 | 4.46 × 10^−3 | 2.85 | 2.81 | 6.16 × 10^−4 |
|  |  | Std | 1.38 × 10^−4 | 2.74 × 10^−1 | 2.68 × 10^−1 | 7.49 × 10^7 | 1.48 × 10^1 | 2.17 × 10^−1 | 8.92 × 10^−3 | 3.32 × 10^−2 | 4.64 × 10^−2 | 2.06 × 10^−3 |
|  | 200 | Mean | 1.27 × 10^−1 | 1.79 × 10^1 | 1.69 × 10^1 | 4.49 × 10^9 | 1.33 × 10^6 | 6.22 | 5.70 × 10^3 | 1.99 × 10^1 | 1.99 × 10^1 | 9.64 × 10^−2 |
|  |  | Std | 3.60 × 10^−1 | 3.27 | 6.35 × 10^−1 | 4.98 × 10^8 | 7.44 × 10^5 | 2.08 | 3.07 × 10^3 | 3.27 × 10^−2 | 5.31 × 10^−2 | 6.79 × 10^−2 |
|  | 500 | Mean | 2.08 × 10^−1 | 5.62 × 10^1 | 5.05 × 10^1 | 2.23 × 10^10 | 3.59 × 10^7 | 1.91 × 10^1 | 4.14 × 10^6 | 4.99 × 10^1 | 5.00 × 10^1 | 4.49 × 10^−1 |
|  |  | Std | 2.87 × 10^−1 | 1.15 × 10^1 | 1.29 | 1.21 × 10^9 | 9.80 × 10^6 | 5.58 | 1.17 × 10^6 | 2.52 × 10^−2 | 3.91 × 10^−2 | 2.67 × 10^−1 |
|  | 1000 | Mean | 7.12 × 10^−1 | 1.15 × 10^2 | 1.25 × 10^2 | 5.54 × 10^10 | 1.54 × 10^8 | 3.96 × 10^1 | 8.03 × 10^7 | 9.99 × 10^1 | 1.00 × 10^2 | 1.05 |
|  |  | Std | 1.02 | 1.00 × 10^1 | 1.40 × 10^1 | 1.68 × 10^9 | 2.46 × 10^7 | 1.18 × 10^1 | 1.17 × 10^7 | 2.82 × 10^−2 | 4.65 × 10^−2 | 6.62 × 10^−1 |
| F14 | 2 | Mean | 9.98 × 10^−1 | 2.21 | 4.07 | 2.35 | 1.16 | 2.93 | 3.00 | 2.84 | 4.64 × 10^−2 | 1.63 |
|  |  | Std | 1.30 × 10^−16 | 2.76 | 4.00 | 2.03 | 5.87 × 10^−1 | 3.07 | 2.54 | 4.52 × 10^−1 | 7.05 × 10^−1 | 9.20 × 10^−1 |
| F15 | 4 | Mean | 4.48 × 10^−4 | 1.39 × 10^−3 | 2.48 × 10^−3 | 2.78 × 10^−3 | 3.48 × 10^−3 | 7.32 × 10^−4 | 4.94 × 10^−3 | 2.40 × 10^−3 | 1.31 × 10^−3 | 2.15 × 10^−3 |
|  |  | Std | 2.56 × 10^−4 | 3.61 × 10^−3 | 6.07 × 10^−3 | 5.14 × 10^−3 | 6.74 × 10^−3 | 5.21 × 10^−4 | 7.86 × 10^−3 | 1.24 × 10^−3 | 3.22 × 10^−4 | 4.97 × 10^−3 |
| F16 | 2 | Mean | −1.03 | −1.03 | −1.03 | −1.03 | −1.03 | −1.03 | −1.03 | −1.02 | −1.03 | −1.03 |
|  |  | Std | 6.05 × 10^−16 | 4.97 × 10^−16 | 1.95 × 10^−8 | 6.78 × 10^−16 | 2.74 × 10^−14 | 9.52 × 10^−10 | 6.25 × 10^−16 | 5.56 × 10^−3 | 1.83 × 10^−4 | 1.28 × 10^−13 |
| F17 | 2 | Mean | 3.98 × 10^−1 | 3.98 × 10^−1 | 3.98 × 10^−1 | 3.98 × 10^−1 | 3.98 × 10^−1 | 3.98 × 10^−1 | 3.98 × 10^−1 | 6.76 × 10^−1 | 4.00 × 10^−1 | 3.98 × 10^−1 |
|  |  | Std | 0 | 0 | 1.68 × 10^−6 | 0 | 9.46 × 10^−15 | 2.08 × 10^−5 | 0 | 2.05 × 10^−1 | 2.95 × 10^−3 | 2.20 × 10^−14 |
| F18 | 2 | Mean | 3.00 | 3.00 | 3.00 | 3.00 | 3.00 | 3.00 | 3.00 | 3.01 | 3.01 | 3.00 |
|  |  | Std | 1.82 × 10^−15 | 6.85 × 10^−16 | 3.02 × 10^−5 | 1.70 × 10^−15 | 2.22 × 10^−13 | 5.43 × 10^−5 | 1.33 × 10^−15 | 1.68 × 10^−2 | 8.68 × 10^−3 | 7.64 × 10^−13 |
| F19 | 3 | Mean | −3.86 | −3.84 | −3.86 | −3.86 | −3.86 | −3.86 | −3.86 | −3.43 | −3.83 | −3.86 |
|  |  | Std | 2.52 × 10^−15 | 1.41 × 10^−1 | 3.59 × 10^−3 | 2.71 × 10^−15 | 1.38 × 10^−12 | 7.94 × 10^−3 | 2.65 × 10^−15 | 3.26 × 10^−1 | 2.73 × 10^−2 | 6.35 × 10^−13 |
| F20 | 6 | Mean | −3.26 | −3.25 | −3.27 | −3.24 | −3.23 | −3.20 | −3.22 | −1.78 | −3.01 | −3.25 |
|  |  | Std | 6.05 × 10^−2 | 6.00 × 10^−2 | 7.38 × 10^−2 | 6.40 × 10^−2 | 6.34 × 10^−2 | 1.18 × 10^−1 | 1.07 × 10^−1 | 6.08 × 10^−1 | 9.02 × 10^−2 | 5.86 × 10^−2 |
| F21 | 4 | Mean | −1.02 × 10^1 | −7.89 | −9.32 | −6.14 | −7.06 | −8.28 | −6.71 | −6.54 × 10^−1 | −4.08 | −6.86 |
|  |  | Std | 5.49 × 10^−15 | 3.12 | 2.21 | 3.45 | 3.46 | 2.73 | 3.19 | 2.50 × 10^−1 | 9.62 × 10^−1 | 2.59 |
| F22 | 4 | Mean | −1.04 × 10^1 | −8.51 | −1.02 × 10^1 | −7.36 | −9.11 | −8.18 | −8.54 | −7.37 × 10^−1 | −4.48 | −6.85 |
|  |  | Std | 8.73 × 10^−16 | 3.22 | 9.63 × 10^−1 | 3.57 | 2.69 | 2.99 | 2.96 | 2.99 × 10^−1 | 2.34 × 10^−1 | 3.48 |
| F23 | 4 | Mean | −1.05 × 10^1 | −7.33 | −1.01 × 10^1 | −7.95 | −8.15 | −7.31 | −8.57 | −1.01 | −4.45 | −6.78 |
|  |  | Std | 2.06 × 10^−15 | 3.75 | 1.75 | 3.53 | 3.26 | 3.39 | 3.11 | 3.03 × 10^−1 | 1.08 | 3.64 |
Table 4. The Wilcoxon signed-rank test results between IWHO and other algorithms on classical test functions (F1–F23).
| Function | D | IWHO vs. WHO | IWHO vs. GWO | IWHO vs. MFO | IWHO vs. SSA | IWHO vs. WOA | IWHO vs. PSO | IWHO vs. HSCAHS | IWHO vs. DSCA | IWHO vs. MALO |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| F1 | 30 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 200 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 500 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 1000 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
| F2 | 30 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 200 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 500 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 1000 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
| F3 | 30 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 200 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 500 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 1000 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
| F4 | 30 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 200 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 500 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 1000 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
| F5 | 30 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.22 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 4.21 × 10^−1 |
|  | 200 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.37 × 10^−2 |
|  | 500 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.37 × 10^−2 |
|  | 1000 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.83 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.69 × 10^−1 |
| F6 | 30 | 1.83 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 3.05 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 200 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.22 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 5.54 × 10^−2 |
|  | 500 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.16 × 10^−3 |
|  | 1000 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 2.52 × 10^−1 |
| F7 | 30 | 1.22 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.22 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 4.27 × 10^−4 | 6.10 × 10^−5 |
|  | 200 | 3.05 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 2.62 × 10^−3 | 1.22 × 10^−4 | 4.79 × 10^−2 |
|  | 500 | 8.54 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 8.54 × 10^−4 | 6.10 × 10^−5 | 6.71 × 10^−3 | 1.22 × 10^−4 | 1.53 × 10^−3 |
|  | 1000 | 2.01 × 10^−3 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 2.01 × 10^−3 | 6.10 × 10^−5 | 1.51 × 10^−2 | 1.22 × 10^−4 | 1.25 × 10^−2 |
| F8 | 30 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.83 × 10^−4 |
|  | 200 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 500 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
|  | 1000 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
| F9 | 30 | 1.00 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.00 | 6.10 × 10^−5 | 1.00 | 1.00 | 6.10 × 10^−5 |
|  | 200 | 1.00 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.00 | 6.10 × 10^−5 | 1.00 | 1.00 | 6.10 × 10^−5 |
|  | 500 | 1.00 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.00 | 6.10 × 10^−5 | 1.00 | 1.00 | 6.10 × 10^−5 |
|  | 1000 | 1.00 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.00 | 6.10 × 10^−5 | 1.00 | 1.00 | 6.10 × 10^−5 |
| F10 | 30 | 1.25 × 10^−1 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 9.77 × 10^−4 | 6.10 × 10^−5 | 1.00 | 1.00 | 6.10 × 10^−5 |
|  | 200 | 1.25 × 10^−1 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 3.91 × 10^−3 | 6.10 × 10^−5 | 1.00 | 1.00 | 6.10 × 10^−5 |
|  | 500 | 6.25 × 10^−2 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 7.81 × 10^−3 | 6.10 × 10^−5 | 1.00 | 1.00 | 6.10 × 10^−5 |
|  | 1000 | 6.25 × 10^−2 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.95 × 10^−3 | 6.10 × 10^−5 | 4.88 × 10^−4 | 1.00 | 6.10 × 10^−5 |
| F11 | 30 | 1.00 | 1.25 × 10^−1 | 6.10 × 10^−5 | 6.10 × 10^−5 | 5.00 × 10^−1 | 6.10 × 10^−5 | 1.00 | 1.00 | 6.10 × 10^−5 |
|  | 200 | 1.00 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.00 | 6.10 × 10^−5 | 1.00 | 1.00 | 6.10 × 10^−5 |
|  | 500 | 1.00 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.00 | 6.10 × 10^−5 | 1.00 | 1.00 | 6.10 × 10^−5 |
|  | 1000 | 1.00 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.00 | 6.10 × 10^−5 | 1.00 | 1.00 | 6.10 × 10^−5 |
| F12 | 30 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.81 × 10^−2 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.83 × 10^−4 |
|  | 200 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 7.62 × 10^−1 |
|  | 500 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 8.04 × 10^−1 |
|  | 1000 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 3.03 × 10^−1 |
| F13 | 30 | 3.36 × 10^−3 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 5.37 × 10^−3 | 6.10 × 10^−5 | 6.10 × 10^−5 | 8.33 × 10^−2 |
|  | 200 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 2.52 × 10^−1 |
|  | 500 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 4.27 × 10^−3 |
|  | 1000 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 8.90 × 10^−1 |
| F14 | 2 | 7.81 × 10^−3 | 6.10 × 10^−5 | 1.95 × 10^−3 | 1.95 × 10^−3 | 6.10 × 10^−5 | 9.77 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
| F15 | 4 | 8.36 × 10^−3 | 5.54 × 10^−2 | 3.66 × 10^−4 | 1.51 × 10^−2 | 2.29 × 10^−1 | 1.22 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 4.27 × 10^−3 |
| F16 | 2 | 1.00 | 6.10 × 10^−5 | 1.00 | 1.22 × 10^−4 | 6.10 × 10^−5 | 1.00 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
| F17 | 2 | 1.00 | 6.10 × 10^−5 | 1.00 | 4.88 × 10^−4 | 6.10 × 10^−5 | 1.00 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.22 × 10^−4 |
| F18 | 2 | 1.95 × 10^−3 | 6.10 × 10^−5 | 7.54 × 10^−1 | 6.10 × 10^−5 | 6.10 × 10^−5 | 3.07 × 10^−1 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
| F19 | 3 | 1.00 | 6.10 × 10^−5 | 1.00 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.00 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
| F20 | 6 | 7.66 × 10^−1 | 2.56 × 10^−2 | 1.56 × 10^−2 | 8.54 × 10^−4 | 2.52 × 10^−1 | 6.27 × 10^−1 | 6.10 × 10^−5 | 6.10 × 10^−5 | 2.01 × 10^−3 |
| F21 | 4 | 3.13 × 10^−2 | 6.10 × 10^−5 | 3.13 × 10^−2 | 6.10 × 10^−5 | 6.10 × 10^−5 | 7.81 × 10^−3 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
| F22 | 4 | 6.25 × 10^−2 | 6.10 × 10^−5 | 6.25 × 10^−2 | 6.10 × 10^−5 | 6.10 × 10^−5 | 3.91 × 10^−3 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
| F23 | 4 | 3.13 × 10^−2 | 6.10 × 10^−5 | 1.95 × 10^−3 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.56 × 10^−2 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 |
| Overall (+/=/−) |  | 45/17/0 | 59/3/0 | 57/5/0 | 62/0/0 | 52/10/0 | 57/5/0 | 47/11/4 | 50/12/0 | 45/12/5 |
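The floor value 6.10 × 10^−5 that dominates Table 4 is not arbitrary: for an exact two-sided Wilcoxon signed-rank test on n paired samples with no ties or zeros, the smallest attainable p-value is 2/2^n, and 6.10 × 10^−5 = 2/2^15, which is consistent with 15 paired runs per comparison. Below is a minimal pure-Python sketch of the exact test (the function name is illustrative, and distinct nonzero differences are assumed for simplicity):

```python
def exact_wilcoxon_p(diffs):
    """Two-sided exact Wilcoxon signed-rank p-value for paired differences.

    Assumes all differences are nonzero with distinct magnitudes,
    as required by the exact null distribution used here.
    """
    n = len(diffs)
    # rank the absolute differences from smallest (rank 1) to largest (rank n)
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0] * n
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    # observed sum of ranks of the positive differences
    w_plus = sum(ranks[i] for i in range(n) if diffs[i] > 0)
    # null distribution of W+ by dynamic programming over the 2^n sign patterns
    counts = [0] * (n * (n + 1) // 2 + 1)
    counts[0] = 1
    for r in range(1, n + 1):
        new = counts[:]
        for w in range(len(counts) - 1, r - 1, -1):
            new[w] += counts[w - r]
        counts = new
    total = 2 ** n
    lo = sum(counts[: w_plus + 1]) / total   # P(W+ <= observed)
    hi = sum(counts[w_plus:]) / total        # P(W+ >= observed)
    return min(1.0, 2 * min(lo, hi))

# with 15 differences all of the same sign, the test bottoms out at 2/2^15
p_floor = exact_wilcoxon_p([-(i + 1) * 0.1 for i in range(15)])  # ≈ 6.10 × 10^−5
```

With one algorithm better on every one of 15 runs, the statistic W+ is 0 and the two-sided p-value is exactly 2/32,768 ≈ 6.10 × 10^−5, matching the table's floor.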
Table 5. Comparison of optimization results on CEC2021 test functions between IWHO and other optimization algorithms (CEC_01−CEC_10).
| Function | Metric | IWHO | WHO | GWO | MFO | SSA | WOA | PSO | HSCAHS | DSCA | MALO |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CEC_01 | Mean | 2.64 × 10^3 | 4.54 × 10^3 | 8.07 × 10^8 | 2.13 × 10^9 | 2.41 × 10^3 | 1.44 × 10^9 | 1.58 × 10^3 | 3.25 × 10^10 | 1.40 × 10^10 | 1.70 × 10^3 |
|  | Std | 3.02 × 10^3 | 4.17 × 10^3 | 6.80 × 10^8 | 1.62 × 10^9 | 2.82 × 10^3 | 1.03 × 10^9 | 1.94 × 10^3 | 2.42 × 10^9 | 2.58 × 10^9 | 1.99 × 10^3 |
|  | Ranking | 4 | 5 | 6 | 8 | 3 | 7 | 1 | 10 | 9 | 2 |
| CEC_02 | Mean | 2.60 × 10^3 | 2.76 × 10^3 | 2.74 × 10^3 | 3.28 × 10^3 | 3.52 × 10^3 | 4.34 × 10^3 | 3.38 × 10^3 | 6.29 × 10^3 | 5.76 × 10^3 | 3.36 × 10^3 |
|  | Std | 4.49 × 10^2 | 4.21 × 10^2 | 7.65 × 10^2 | 7.86 × 10^2 | 5.76 × 10^2 | 6.19 × 10^2 | 3.65 × 10^2 | 2.98 × 10^2 | 2.33 × 10^2 | 6.28 × 10^2 |
|  | Ranking | 1 | 3 | 2 | 4 | 7 | 8 | 6 | 10 | 9 | 5 |
| CEC_03 | Mean | 7.93 × 10^2 | 7.92 × 10^2 | 7.94 × 10^2 | 8.37 × 10^2 | 8.21 × 10^2 | 9.68 × 10^2 | 7.85 × 10^2 | 1.06 × 10^3 | 9.86 × 10^2 | 8.61 × 10^2 |
|  | Std | 2.45 × 10^1 | 2.89 × 10^1 | 2.66 × 10^1 | 7.11 × 10^1 | 4.42 × 10^1 | 4.63 × 10^1 | 1.86 × 10^1 | 1.78 × 10^1 | 1.82 × 10^1 | 5.06 × 10^1 |
|  | Ranking | 3 | 2 | 4 | 6 | 5 | 8 | 1 | 10 | 9 | 7 |
| CEC_04 | Mean | 1.91 × 10^3 | 1.91 × 10^3 | 2.02 × 10^3 | 1.05 × 10^4 | 1.91 × 10^3 | 2.80 × 10^3 | 1.90 × 10^3 | 4.20 × 10^5 | 8.47 × 10^4 | 1.91 × 10^3 |
|  | Std | 3.44 | 4.80 | 2.92 × 10^2 | 1.22 × 10^4 | 2.24 | 1.81 × 10^3 | 1.60 | 2.13 × 10^5 | 4.36 × 10^4 | 3.19 |
|  | Ranking | 3.5 | 3.5 | 6 | 8 | 3.5 | 7 | 1 | 10 | 9 | 3.5 |
| CEC_05 | Mean | 2.26 × 10^5 | 2.76 × 10^5 | 8.21 × 10^5 | 2.87 × 10^6 | 6.26 × 10^5 | 2.43 × 10^6 | 1.92 × 10^5 | 6.72 × 10^6 | 4.43 × 10^6 | 2.92 × 10^5 |
|  | Std | 1.52 × 10^5 | 2.00 × 10^5 | 6.45 × 10^5 | 4.73 × 10^6 | 4.90 × 10^5 | 1.20 × 10^6 | 1.35 × 10^5 | 2.02 × 10^6 | 2.22 × 10^6 | 2.27 × 10^5 |
|  | Ranking | 2 | 3 | 6 | 8 | 5 | 7 | 1 | 10 | 9 | 4 |
| CEC_06 | Mean | 1.86 × 10^3 | 1.99 × 10^3 | 2.01 × 10^3 | 2.04 × 10^3 | 2.26 × 10^3 | 2.65 × 10^3 | 2.19 × 10^3 | 3.82 × 10^3 | 2.75 × 10^3 | 2.41 × 10^3 |
|  | Std | 1.53 × 10^2 | 1.93 × 10^2 | 1.69 × 10^2 | 2.10 × 10^2 | 2.30 × 10^2 | 3.11 × 10^2 | 2.35 × 10^2 | 2.69 × 10^2 | 2.10 × 10^2 | 2.74 × 10^2 |
|  | Ranking | 1 | 2 | 3 | 4 | 6 | 8 | 5 | 10 | 9 | 7 |
| CEC_07 | Mean | 1.43 × 10^5 | 9.90 × 10^4 | 3.30 × 10^5 | 1.07 × 10^6 | 2.39 × 10^5 | 1.34 × 10^6 | 1.07 × 10^5 | 2.78 × 10^6 | 2.23 × 10^6 | 1.30 × 10^5 |
|  | Std | 1.15 × 10^5 | 1.10 × 10^5 | 4.35 × 10^5 | 1.59 × 10^6 | 2.27 × 10^5 | 1.00 × 10^6 | 9.74 × 10^4 | 1.54 × 10^6 | 1.16 × 10^6 | 1.18 × 10^5 |
|  | Ranking | 4 | 1 | 6 | 7 | 5 | 8 | 2 | 10 | 9 | 3 |
| CEC_08 | Mean | 2.36 × 10^3 | 2.80 × 10^3 | 3.02 × 10^3 | 4.07 × 10^3 | 2.93 × 10^3 | 4.21 × 10^3 | 3.64 × 10^3 | 5.91 × 10^3 | 6.04 × 10^3 | 2.30 × 10^3 |
|  | Std | 3.38 × 10^2 | 1.03 × 10^3 | 9.10 × 10^2 | 1.32 × 10^3 | 1.31 × 10^3 | 1.92 × 10^3 | 1.49 × 10^3 | 4.45 × 10^2 | 8.07 × 10^2 | 1.15 × 10^1 |
|  | Ranking | 2 | 3 | 5 | 7 | 4 | 8 | 6 | 9 | 10 | 1 |
| CEC_09 | Mean | 2.88 × 10^3 | 2.87 × 10^3 | 2.87 × 10^3 | 2.89 × 10^3 | 2.86 × 10^3 | 3.04 × 10^3 | 3.06 × 10^3 | 3.47 × 10^3 | 3.02 × 10^3 | 2.90 × 10^3 |
|  | Std | 3.78 × 10^1 | 3.22 × 10^1 | 3.95 × 10^1 | 3.40 × 10^1 | 2.14 × 10^1 | 7.19 × 10^1 | 8.00 × 10^1 | 9.40 × 10^1 | 1.99 × 10^1 | 2.36 × 10^1 |
|  | Ranking | 4 | 2.5 | 2.5 | 5 | 1 | 8 | 9 | 10 | 7 | 6 |
| CEC_10 | Mean | 2.96 × 10^3 | 2.97 × 10^3 | 3.00 × 10^3 | 3.04 × 10^3 | 2.95 × 10^3 | 3.13 × 10^3 | 2.95 × 10^3 | 5.29 × 10^3 | 3.84 × 10^3 | 2.97 × 10^3 |
|  | Std | 3.77 × 10^1 | 3.22 × 10^1 | 3.95 × 10^1 | 1.56 × 10^2 | 3.29 × 10^1 | 7.21 × 10^1 | 2.97 × 10^1 | 2.94 × 10^2 | 2.42 × 10^2 | 2.64 × 10^1 |
|  | Ranking | 3 | 4.5 | 6 | 7 | 1.5 | 8 | 1.5 | 10 | 9 | 4.5 |
| Overall Ranking |  | 1 | 2 | 6 | 7 | 4 | 8 | 3 | 10 | 9 | 5 |
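The "Overall Ranking" row of Table 5 can be reproduced by averaging each algorithm's per-function ranks and ordering the averages. A short sketch, with the ranking rows transcribed from Table 5 (ties such as 3.5 are shared ranks):

```python
# per-function rankings transcribed from Table 5 (CEC_01 .. CEC_10)
ranks = {
    "IWHO":   [4, 1, 3, 3.5, 2, 1, 4, 2, 4, 3],
    "WHO":    [5, 3, 2, 3.5, 3, 2, 1, 3, 2.5, 4.5],
    "GWO":    [6, 2, 4, 6, 6, 3, 6, 5, 2.5, 6],
    "MFO":    [8, 4, 6, 8, 8, 4, 7, 7, 5, 7],
    "SSA":    [3, 7, 5, 3.5, 5, 6, 5, 4, 1, 1.5],
    "WOA":    [7, 8, 8, 7, 7, 8, 8, 8, 8, 8],
    "PSO":    [1, 6, 1, 1, 1, 5, 2, 6, 9, 1.5],
    "HSCAHS": [10, 10, 10, 10, 10, 10, 10, 9, 10, 10],
    "DSCA":   [9, 9, 9, 9, 9, 9, 9, 10, 7, 9],
    "MALO":   [2, 5, 7, 3.5, 4, 7, 3, 1, 6, 4.5],
}
# average rank per algorithm; a smaller average means a better overall position
avg = {name: sum(r) / len(r) for name, r in ranks.items()}
overall = sorted(avg, key=avg.get)  # best-to-worst order
```

Sorting the averages gives IWHO first (average rank 2.75, just ahead of WHO at 2.95) and HSCAHS last, matching the Overall Ranking row 1, 2, 6, 7, 4, 8, 3, 10, 9, 5.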
Table 6. Optimization results for the welded beam design problem.
| Algorithm | h | l | t | b | Minimum Cost |
| --- | --- | --- | --- | --- | --- |
| IWHO | 0.2057 | 3.2530 | 9.0366 | 0.2057 | 1.6952 |
| WHO [49] | 0.2058 | 3.2514 | 9.0370 | 0.2058 | 1.6958 |
| GWO [50] | 0.202369 | 3.544214 | 9.048210 | 0.205723 | 1.728712 |
| MFO [51] | 0.302546 | 2.662619 | 7.262456 | 0.318524 | 1.726186 |
| WOA [53] | 0.328517 | 2.465267 | 7.046271 | 0.426580 | 1.878023 |
| SCA [65] | 0.141883 | 5.225674 | 10.00000 | 0.211640 | 1.858033 |
| ALO [66] | 0.125130 | 9.784118 | 6.522153 | 0.394937 | 1.785364 |
| DA [67] | 0.206905 | 3.314990 | 9.397212 | 0.206905 | 1.749248 |
| LSHADE [68] | 0.20573 | 3.4705 | 9.0366 | 0.20573 | 1.724852 |
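The reported costs in Table 6 can be sanity-checked against the standard welded beam cost function from the benchmark literature (a sketch; variable order h, l, t, b follows the table header):

```python
def welded_beam_cost(h, l, t, b):
    """Fabrication cost of the welded beam design (standard formulation):
    weld material cost plus bar material cost."""
    return 1.10471 * h ** 2 * l + 0.04811 * t * b * (14.0 + l)

# LSHADE's variables reproduce the well-known best cost of about 1.724852
cost_lshade = welded_beam_cost(0.20573, 3.4705, 9.0366, 0.20573)
```

Plugging in the LSHADE row recovers 1.724852 (to rounding), and the IWHO row similarly recovers 1.6952, so the tabulated variables and costs are mutually consistent.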
Table 7. Optimization results for the tension/compression spring design problem.
| Algorithm | d | D | N | Minimum Weight |
| --- | --- | --- | --- | --- |
| IWHO | 0.0517 | 0.4155 | 7.1564 | 0.0102 |
| AO [70] | 0.0502439 | 0.35262 | 10.5425 | 0.011165 |
| HHO [71] | 0.051796393 | 0.359305355 | 11.138859 | 0.012665443 |
| SSA [52] | 0.051207 | 0.345215 | 12.004032 | 0.0126763 |
| WOA [53] | 0.051207 | 0.345215 | 12.004032 | 0.0126763 |
| GWO [50] | 0.05169 | 0.356737 | 11.28885 | 0.012666 |
| PSO [15] | 0.051728 | 0.357644 | 11.244543 | 0.0126747 |
| MVO [72] | 0.05251 | 0.37602 | 10.33513 | 0.012790 |
| GA [30] | 0.051480 | 0.351661 | 11.632201 | 0.01270478 |
| HS [73] | 0.051154 | 0.349871 | 12.076432 | 0.0126706 |
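Table 7's weights likewise follow from the standard tension/compression spring objective, the wire volume (N + 2)·D·d², as this sketch illustrates:

```python
def spring_weight(d, D, N):
    """Weight of the tension/compression spring (standard formulation):
    d = wire diameter, D = mean coil diameter, N = number of active coils."""
    return (N + 2.0) * D * d ** 2

# SSA's variables reproduce the tabulated weight 0.0126763
weight_ssa = spring_weight(0.051207, 0.345215, 12.004032)
```

Evaluating the SSA row gives 0.0126763 and the IWHO row gives 0.0102 (to rounding), consistent with the table.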
Table 9. Optimization results in car crashworthiness design problem.
| Variable | IWHO | GWO [50] | GOA [18] | PSO [15] | MFO [51] | SSA [52] | DSCA [55] | MALO [56] |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| x1 | 0.5000 | 0.5000 | 0.50000 | 0.5000 | 0.5000 | 0.5000 | 0.5000 | 0.5000 |
| x2 | 1.1186 | 1.0268 | 1.11670 | 1.1327 | 1.0946 | 1.1221 | 0.9765 | 1.2023 |
| x3 | 0.5000 | 0.5929 | 0.50000 | 0.5000 | 0.5000 | 0.5062 | 1.3381 | 0.5000 |
| x4 | 1.2982 | 1.3772 | 1.30208 | 1.2781 | 1.3461 | 1.3170 | 0.5000 | 1.2833 |
| x5 | 0.5000 | 0.5000 | 0.50000 | 0.5000 | 0.5000 | 0.5000 | 0.5214 | 0.5000 |
| x6 | 1.5000 | 1.5000 | 1.50000 | 1.5000 | 1.5000 | 1.5000 | 1.5000 | 1.1728 |
| x7 | 0.5000 | 0.5000 | 0.50000 | 0.5000 | 0.5000 | 0.5000 | 0.6263 | 0.5000 |
| x8 | 0.3450 | 0.3450 | 0.34500 | 0.3450 | 0.3450 | 0.3136 | 0.1920 | 0.3347 |
| x9 | 0.3450 | 0.3450 | 0.19200 | 0.3450 | 0.3450 | 0.2115 | 0.3450 | 0.3096 |
| x10 | −19.1594 | −30.0000 | −19.54935 | −16.6449 | −23.4599 | −21.9139 | −30.0000 | −5.3953 |
| x11 | 0.0020 | 0.1523 | −0.00431 | 0.1076 | 0.0014 | 2.2405 | 7.8909 | 7.3999 |
| Optimal value | 22.8427 | 23.1976 | 22.84474 | 22.8556 | 22.8744 | 22.9855 | 24.9261 | 23.3406 |
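The optimal values in Table 9 are consistent with the widely used side-impact (car crashworthiness) weight objective, in which only x1 through x5 and x7 carry weight coefficients; a sketch under that assumption:

```python
def car_weight(x):
    """Vehicle weight for the car crashworthiness (side-impact) design.

    Standard linear weight model; x is the design vector and only
    x1..x5 and x7 contribute (x6 and x8..x11 enter the constraints only).
    """
    x1, x2, x3, x4, x5, _x6, x7 = x[:7]
    return (1.98 + 4.90 * x1 + 6.67 * x2 + 6.98 * x3
            + 4.01 * x4 + 1.78 * x5 + 2.73 * x7)

# IWHO's column reproduces the tabulated optimum of about 22.8427
w_iwho = car_weight([0.5000, 1.1186, 0.5000, 1.2982, 0.5000, 1.5000, 0.5000])
```

Evaluating the IWHO column gives 22.8427 and the GWO column gives 23.1976 (to rounding), matching the "Optimal value" row.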
Table 10. Optimization results for the speed reducer design problem.
| Algorithm | x1 | x2 | x3 | x4 | x5 | x6 | x7 | Optimum Weight |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| IWHO | 3.4976 | 0.7 | 17 | 7.3 | 7.8 | 3.3501 | 5.2855 | 2995.43736 |
| AO [70] | 3.5021 | 0.7 | 17 | 7.3099 | 7.7476 | 3.3641 | 5.2994 | 3007.7328 |
| PSO [15] | 3.5001 | 0.7 | 17.0002 | 7.5177 | 7.7832 | 3.3508 | 5.2867 | 3145.922 |
| AOA [75] | 3.50384 | 0.7 | 17 | 7.3 | 7.72933 | 3.35649 | 5.2867 | 2997.9157 |
| MFO [51] | 3.49745 | 0.7 | 17 | 7.82775 | 7.71245 | 3.35178 | 5.28635 | 2998.9408 |
| GA [29] | 3.51025 | 0.7 | 17 | 8.35 | 7.8 | 3.36220 | 5.28772 | 3067.561 |
| SCA [65] | 3.50875 | 0.7 | 17 | 7.3 | 7.8 | 3.46102 | 5.28921 | 3030.563 |
| HS [73] | 3.52012 | 0.7 | 17 | 8.37 | 7.8 | 3.36697 | 5.28871 | 3029.002 |
| FA [78] | 3.50749 | 0.7001 | 17 | 7.71967 | 8.08085 | 3.35151 | 5.28705 | 3010.13749 |
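The weights in Table 10 follow from the standard speed reducer objective function used throughout the benchmark literature; the following sketch evaluates it for a table row:

```python
def speed_reducer_weight(x1, x2, x3, x4, x5, x6, x7):
    """Weight of the speed reducer design (standard formulation):
    x1 = face width, x2 = tooth module, x3 = number of pinion teeth,
    x4/x5 = shaft lengths, x6/x7 = shaft diameters."""
    return (0.7854 * x1 * x2 ** 2 * (3.3333 * x3 ** 2 + 14.9334 * x3 - 43.0934)
            - 1.508 * x1 * (x6 ** 2 + x7 ** 2)
            + 7.4777 * (x6 ** 3 + x7 ** 3)
            + 0.7854 * (x4 * x6 ** 2 + x5 * x7 ** 2))

# AOA's variables reproduce the tabulated weight of about 2997.9157
w_aoa = speed_reducer_weight(3.50384, 0.7, 17, 7.3, 7.72933, 3.35649, 5.2867)
```

Evaluating the AOA row recovers 2997.9157 (to rounding), confirming that the tabulated variables and weights are consistent.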
Zheng, R.; Hussien, A.G.; Jia, H.-M.; Abualigah, L.; Wang, S.; Wu, D. An Improved Wild Horse Optimizer for Solving Optimization Problems. Mathematics 2022, 10, 1311. https://doi.org/10.3390/math10081311