Article

An Improved Whale Optimization Algorithm Based on Different Searching Paths and Perceptual Disturbance

1 School of Electronic and Information Engineering, University of Science and Technology Liaoning, Anshan 114044, China
2 Fujian Institute of Research on the Structure of Matter, Fuzhou 350002, China
3 National Financial Security and System Equipment Engineering Research Center, University of Science and Technology Liaoning, Anshan 114044, China
* Authors to whom correspondence should be addressed.
Symmetry 2018, 10(6), 210; https://doi.org/10.3390/sym10060210
Submission received: 15 April 2018 / Revised: 1 June 2018 / Accepted: 6 June 2018 / Published: 11 June 2018

Abstract

The whale optimization algorithm (WOA) is a swarm intelligence optimization algorithm inspired by the hunting behavior of humpback whales. WOA shares many similarities with other swarm intelligence algorithms (PSO, GWO, etc.), and its unique search mechanism gives it a strong global search capability while also taking local search into account. In this work, considering the deficiency of WOA's local search mechanism and drawing on the optimization methods of other swarm intelligence algorithms, a perceptual disturbance mechanism is introduced so that the agents perform a more detailed search near local extreme points. At the same time, because WOA uses a logarithmic spiral curve, an agent cannot fully search all of the space within its search range, so even with the disturbance mechanism the algorithm may still fall into a local optimum. Therefore, the equal-pitch Archimedes spiral curve is chosen to replace the classic logarithmic spiral curve. To fully verify the effect of the search path on the performance of the algorithm, several other spiral curves are also chosen for experimental comparison. On 23 benchmark test functions, the simulation results show that the WOA with perceptual disturbance (PDWOA) significantly outperforms the standard WOA. Then, based on the PDWOA, the effect of the search path on the performance of the algorithm is verified; the simulation results show that the equal-pitch Archimedes spiral curve is the best search path.

1. Introduction

Optimization problems require finding the optimal solution of an objective function, usually by iteration. In general, the search target, which can be described by continuous, discrete, linear, nonlinear, concave, and convex functions, is a function belonging to the class of optimal objective functions. To solve function optimization problems, heuristic algorithms inspired by natural processes and/or events have attracted much attention. Swarm intelligence optimization algorithms, which simulate the evolution of biological populations, are stochastic search algorithms that can solve complex global optimization problems through cooperation and competition among individuals. Swarm intelligence optimization algorithms include the ant colony optimization (ACO) algorithm [1,2,3], genetic algorithm (GA) [4,5], particle swarm optimization (PSO) algorithm [6,7,8], artificial bee colony (ABC) algorithm [9], grey wolf optimizer (GWO) [10], harmony search (HS) algorithm [11,12], etc.
The whale optimization algorithm (WOA) is a biologically inspired heuristic algorithm presented by Seyedali Mirjalili and Andrew Lewis in 2016. WOA is a swarm intelligence optimization algorithm inspired by the unique hunting method of humpback whales. Because of its unique optimization mechanism, WOA has good global search capability, and the algorithm has therefore been widely adopted by the engineering community. As a new type of bionic algorithm with good global search performance, it has attracted many scholars and has been applied to a variety of engineering problems, including the optimization of neural network parameters, allocation, and scheduling. On allocation and scheduling, the literature [13,14,15,16] uses WOA to optimize the energy management system (EMS) of the combined economic emission dispatch (CEED) problem for microgrid systems, and MATLAB is used to compare it with GM, ACO and PSO; the results show that WOA obtains better results with fewer iterations. The literature [17] applies WOA to the unit commitment problem of power generation operation scheduling; the experimental results show that the convergence speed of WOA is obviously faster than that of PSO, GA, etc. In order to reduce the wire loss of distribution networks, the literature [18] uses WOA to optimize the placement of capacitors in the grid; the simulation results show that WOA is more effective in reducing operating costs and maintaining a better voltage distribution. Similarly, the literature [19] applies WOA to the water resources scheduling problem, and the research results show that WOA has a faster convergence speed.
In terms of parameter optimization, the literature [20] uses WOA to optimize two parameters of a least-squares support vector machine to establish a WOA-LSSVM (WOA-least-squares support vector machine) model for predicting carbon dioxide emissions. Similarly, a chaos WOA (CWOA) is proposed to optimize the parameters of an Elman neural network in the literature [21], so as to establish a soft-sensor model and predict process variables. The literature [22] compares WOA with other methods for optimizing neural network parameters quantitatively and qualitatively; simulation results show that the proposed trainer outperforms current algorithms on most data sets in terms of local optimum avoidance and convergence speed. The literature [23] uses WOA to optimize the parameters of a multilayer perceptron model and uses five standard data sets to verify the validity of the modified model; compared with GWO and PSO, WOA shows high convergence and classification rates and is able to avoid local minima.
In terms of variable prediction, the literature [24] proposes a multi-objective whale optimization algorithm (MOWOA) model to predict the wind speed in power systems and thus provide a reference for power dispatching. The literature [25] combines the reverse adaptive whale optimization algorithm (AWOA) and the fast learning network (FLN) to propose an integrated modeling method (AWOA-FLN) and establishes a heat-rate prediction model for a 600-MW supercritical steam turbine group. Simulation results show that the AWOA-FLN model has higher prediction accuracy and stronger generalization ability than the improved PSO algorithm and the differential evolution algorithm, and it can more accurately predict the heat rate of the turbine. In image processing, in order to find the best image segmentation threshold, the literature [26] combines WOA and moth-flame optimization to avoid spending too much time determining optimal thresholds in the case of multilevel thresholding. The simulation results show that the proposed algorithm performs better than other swarm intelligence algorithms. The literature [27] proposes an MRI liver segmentation method based on the whale optimization algorithm (WOA); the proposed method is tested on a set of 70 magnetic resonance (MRI) images, and the experimental results show that the method has high overall accuracy. In addition, considering WOA optimization and other applications, the literature [28] introduces nondominated sorting to obtain the optimal solutions of WOA, named the nondominated sorting WOA (NSWOA). The simulation results show that the method performs much better than the multiobjective collision-body optimizer (MOCBO), multi-objective particle swarm optimizer (MOPSO), nondominated sorting genetic algorithm II (NSGA-II) and the multiobjective symbiotic organism search (MOSOS). The literature [29] proposes a chaos WOA (CWOA) that calculates and automatically adjusts the internal parameters of the optimization algorithm through chaotic mapping; this idea is essentially different from that of the literature [21]. The experimental results show that CWOA can effectively optimize the parameters of photovoltaic cells and their components. The literature [30] proposes a WOA-based feature selection approach, which is applied to find the best subset of features that maximizes classification accuracy while retaining the minimum number of features.
The essence of a swarm intelligence algorithm is to find the global optimum as reliably as possible with the help of various mechanisms. The improvement ideas proposed by many scholars therefore aim to make the search mechanism of the swarm intelligence algorithm as complete as possible, so that the search agents traverse the entire search space as far as possible.
In this paper, we analyze the search mechanism of the WOA and find that it has a defect that cannot be removed simply by adjusting the moving step length. The original WOA uses a logarithmic spiral curve as the search path of the agent, but the pitch of the logarithmic spiral curve is not fixed. Because the large pitch causes large agent movements in the early stage, some areas cannot be searched, and as a result the global optimum may be missed. For this problem, this paper proposes a WOA with perceptual disturbance [31,32,33], so that the agent is perturbed near local extreme points and better optimal values can be obtained. The work closest to the idea of this paper is the literature [34]: in order to improve the local search ability of PSO, the literature [34] defines and analyzes the regions containing the most particles and presents a PSO algorithm with an intermediate disturbance searching strategy (IDPSO), which enhances the global search ability of the particles and increases their convergence rate. Introducing a perturbation mechanism into a swarm intelligence algorithm can effectively improve the performance of the algorithm. The difference between this paper and other methods is that the step length is set so that the search agent's moving step becomes smaller and smaller, allowing the agent to search more and more carefully, while the definition of a minimum step size prevents the step from shrinking indefinitely, thus ensuring the algorithm's convergence speed and its ability to jump out of local extreme points. At the same time, to address the defect of the search mechanism, this paper examines seven other spiral curves in place of the original logarithmic spiral curve in order to obtain an optimal search path. Finally, 23 benchmark test functions are selected to verify the effectiveness of the perceptual perturbation and of the other spiral curves.
The structure of this paper is as follows: the first section is the introduction. The second section introduces the WOA and analyzes the defects of the algorithm. The third section introduces the idea and method of improving WOA. The fourth section presents the simulations, and the fifth section gives a summary.

2. Whale Optimization Algorithm

2.1. Inspiration

The whale optimization algorithm (WOA) is inspired by the unique hunting method of the humpback whale, which is called the bubble-net feeding method [35,36,37,38,39]. The humpback whale can perceive the distance between itself and the prey and surround the prey. It has been observed that the humpback whale moves upward along a spiral path from a depth of about 15 m while spitting out a number of bubbles of different sizes. The last bubble and the first bubble rise to the surface at the same time, forming a cylindrical or tubular bubble net. Like a huge spider web, this net tightly surrounds the prey and drives it toward the center. The humpback whale then swims almost vertically upward with its mouth open inside the bubble circle and swallows the prey in the net. According to the above description, the hunting behavior of the humpback whale can be divided into three steps: encircling prey, the spiral bubble-net feeding maneuver and searching for prey.

2.2. Search for Prey (Exploration Phase)

In order to establish the mathematical model of the humpback whale's searching path, each humpback whale is treated as a search agent. The algorithm is divided into three steps: searching for prey, encircling prey and the spiral bubble-net attacking method (exploitation phase), and a mathematical model is established for each of these three behaviors. In the modeling of the bubble-net attacking method, two mechanisms are proposed: the shrinking encircling mechanism and the spiral updating position method. Searching for prey can be regarded as the exploration stage, whose main purpose is to find a better solution; the bubble-net attacking method can also be called the exploitation phase, whose main purpose is to make fuller use of this better solution.
In the exploration phase (searching for prey), in order to make the search more extensive, the search agents are pushed away from each other by the positive and negative components of a random vector A. At the same time, the position of a randomly selected search agent, rather than the current best search agent, is used as the reference for updating the position of the search agent. The mathematical model is described as follows:
D = | C \cdot X_{rand} - X |  (1)
X(t+1) = X_{rand} - A \cdot D  (2)
where X_{rand} is a random position vector selected from the current population. Figure 1 shows possible positions of a solution when |A| > 1. When |A| > 1, the search agent moves with reference to a randomly selected agent, and when |A| < 1, the current best agent is selected as the reference for updating the position of the search agent.
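The exploration move of Equations (1) and (2) can be written compactly. The following Python sketch is our own illustration (the authors' code is in MATLAB); the function name explore_step is ours, and the coefficient vectors A and C are assumed to be computed from Equations (5) and (6) given in Section 2.3.

```python
import numpy as np

def explore_step(x, x_rand, A, C):
    """One exploration move of a single search agent (Equations (1)-(2)).

    x      : current position vector of the agent (1-D NumPy array)
    x_rand : position vector of a randomly chosen agent from the population
    A, C   : coefficient vectors, computed from Equations (5)-(6) in Section 2.3
    """
    D = np.abs(C * x_rand - x)   # Equation (1)
    return x_rand - A * D        # Equation (2)
```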

2.3. Encircling Prey

At the initial stage, the optimal location in the search space is unknown when the prey is being surrounded. In this algorithm, the best candidate solution is regarded as the target prey or the best target. After the best search agent has been defined, the other agents try to update their locations toward the best agent. The mathematical model of this behavior can be described as follows:
D = | C \cdot X^{*}(t) - X(t) |  (3)
X(t+1) = X^{*}(t) - A \cdot D  (4)
where t is the current iteration number, A and C are coefficient vectors, X^{*} is the best position vector found so far, and X is the position vector of a search agent. In each iteration, if a better solution appears, X^{*} is replaced with it.
The expressions for A and C are as follows:
A = 2a \cdot r - a  (5)
C = 2r  (6)
The value of a is linearly decreased from 2 to 0 over the course of the iterations (covering both the exploration and exploitation phases), and r is a random vector in the interval [0, 1]. Figure 2 illustrates the theoretical basis of Equation (4) in the two-dimensional plane. The search agent (X, Y) can update its location based on the location of the current best agent (X^{*}, Y^{*}). It can be seen from this that if the choice of (X^{*}, Y^{*}) is not good, the algorithm can easily fall into a local optimum. It can also be seen from Figure 2 that any position near the best agent can be reached by changing the values of A and C.
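For concreteness, the encircling update of Equations (3)-(6) can be sketched in Python as follows; the helper names are ours and the sketch is an illustration rather than the authors' MATLAB code.

```python
import numpy as np

def coefficients(a, dim):
    """Coefficient vectors of Equations (5) and (6): A = 2*a*r - a, C = 2*r."""
    A = 2.0 * a * np.random.rand(dim) - a   # components of A lie in [-a, a]
    C = 2.0 * np.random.rand(dim)
    return A, C

def encircle_step(x, x_best, a):
    """Encircling-prey move toward the current best agent (Equations (3)-(4))."""
    A, C = coefficients(a, x.size)
    D = np.abs(C * x_best - x)   # Equation (3)
    return x_best - A * D        # Equation (4)
```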

2.4. Bubble-Net Attacking Method (Exploitation Phase)

The third stage is the bubble-net attacking method. In order to realize the model in this stage, two ideas are introduced.
1. Shrinking encircling mechanism
By reducing the value of a in Equation (5), the value of A is limited to [−a, a]. When the random values of A lie in the interval [−1, 1], the new position of a search agent can be defined anywhere between its original position and the position of the current best agent. The theoretical illustration of the shrinking encircling mechanism is shown in Figure 3. It can be seen that any position between (X, Y) and (X^{*}, Y^{*}) in the two-dimensional plane can be reached by adjusting the value of A within [0, 1].
2. Spiral updating position method
A sketch map of the spiral updating position method is shown in Figure 4, which is the path of the search agent proposed by the original WOA. The method first calculates the distance between the whale at (X, Y) and the prey at (X^{*}, Y^{*}). The equation imitating the humpback whale's spiral movement is described as follows.
X(t+1) = D' \cdot e^{bl} \cos(2\pi l) + X^{*}(t)  (7)
where D' = | X^{*}(t) - X(t) | represents the distance between the i-th whale and the prey (the best solution obtained so far), b is a constant defining the shape of the logarithmic spiral, and l is a random number in the interval [−1, 1].
It should be mentioned that humpback whales swim around the prey along a gradually shrinking circle and a spiral-shaped path at the same time. To facilitate the modeling, it is assumed that each whale chooses, with a probability of fifty percent, either the shrinking encircling path or the spiral model to update its location. The mathematical model is described as follows.
X(t+1) = \begin{cases} X^{*}(t) - A \cdot D, & p < 0.5 \\ D' \cdot e^{bl} \cos(2\pi l) + X^{*}(t), & p \ge 0.5 \end{cases}  (8)
where p is a random number in [0, 1].
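Putting the two branches of Equation (8) together, one update of the standard WOA can be sketched in Python as follows. This is an illustrative reading of Equations (1)-(8): the function name is ours, b defaults to 1 only as an example, and the |A| test is applied to the vector norm as a simplification.

```python
import numpy as np

def woa_position_update(x, x_best, x_rand, a, b=1.0):
    """One position update of the standard WOA (Equation (8)).

    With probability 0.5 the shrinking-encircling branch is taken,
    using a random agent as reference when |A| >= 1 (exploration) and
    the best agent otherwise (exploitation); with probability 0.5 the
    logarithmic-spiral branch of Equation (7) is taken.
    """
    if np.random.rand() < 0.5:
        A = 2.0 * a * np.random.rand(x.size) - a          # Equation (5)
        C = 2.0 * np.random.rand(x.size)                  # Equation (6)
        ref = x_rand if np.linalg.norm(A) >= 1.0 else x_best
        return ref - A * np.abs(C * ref - x)              # Equations (2)/(4)
    l = np.random.uniform(-1.0, 1.0)                      # random number in [-1, 1]
    D_prime = np.abs(x_best - x)                          # distance to the best agent
    return D_prime * np.exp(b * l) * np.cos(2.0 * np.pi * l) + x_best   # Equation (7)
```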

2.5. Idea of Improving Whale Optimization Algorithm

Based on research into the original WOA, two improvement ideas are put forward.
(1) It can be seen from the whales' searching path in Figure 4 and Equation (7) that the logarithmic spiral curve is adopted as the searching path, but the pitch of the logarithmic spiral is not constant. In other words, when the search agent carries out the search along this path, it can be seen from the two shaded sections in Figure 2 that any location around the best agent can be reached by changing the values of A and C; however, where the pitch of the spiral curve is too large, some locations cannot be searched, which reduces the ergodicity of the algorithm.
(2) In order to make the algorithm search thoroughly near the locations of the search agents, after each iteration of the WOA a set of more advantageous searched positions X^{*}(t) is obtained. Instead of passing X^{*}(t) directly into the next iteration, a disturbance is applied to it so as to search the neighborhood of X^{*}(t). The next iteration then generates a new best search agent.

3. Complex Path-Perceptual Disturbance WOA

3.1. Selection of Mathematical Model of Searching Path

Based on the logarithmic spiral model proposed by the original WOA, seven additional kinds of spiral curves are put forward as candidate mathematical models of the searching path [40].

3.1.1. Logarithmic Spiral Curve (Lo)

The logarithmic spiral curve is also called the equiangular spiral curve. The mathematical model of the logarithmic spiral searching path is described in Equation (9) and its two-dimensional image is shown in Figure 5.
\begin{cases} x = a e^{l} \cos(2\pi l) \\ y = a e^{l} \sin(2\pi l) \end{cases}  (9)
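For comparison of the candidate paths, each parametric curve can be sampled over l in [−1, 1]. A minimal Python sketch for the logarithmic spiral of Equation (9) is given below; the parameter a = 1 is only an illustrative choice.

```python
import numpy as np

def logarithmic_spiral(a=1.0, n_points=400):
    """Sample the logarithmic spiral of Equation (9) for l in [-1, 1]."""
    l = np.linspace(-1.0, 1.0, n_points)
    x = a * np.exp(l) * np.cos(2.0 * np.pi * l)
    y = a * np.exp(l) * np.sin(2.0 * np.pi * l)
    return x, y
```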

3.1.2. Archimedes Spiral Curve (Ar)

The Archimedes spiral curve is the trail generated by a point moving away from a fixed point at a constant speed while rotating around that fixed point at a fixed angular velocity. The equal-pitch Archimedes spiral means that the pitch of the spiral curve is a constant, as shown in Figure 6. The mathematical model of the Archimedes spiral searching path is described in Equation (10) and its two-dimensional image is shown in Figure 6.
\begin{cases} x = (a + bl) \cos(2\pi l) \\ y = (a + bl) \sin(2\pi l) \end{cases}  (10)
As put forward above, a gradually changing pitch of the spiral curve affects the convergence performance of the algorithm. Therefore, we first select the Archimedes spiral curve, whose pitch is constant. Since its pitch is equidistant, we adjust the parameters of the function so that the algorithm achieves optimal performance.
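One natural way to use the equal-pitch path is to substitute the Archimedes radius (a + bl) of Equation (10) for the logarithmic factor e^{bl} in the spiral update of Equation (7). The Python sketch below reflects this reading; it is not necessarily the authors' exact implementation (their MATLAB code is linked in the Supplementary Materials), and the path parameters a_path and b_path are illustrative.

```python
import numpy as np

def archimedes_spiral_update(x, x_best, a_path=0.5, b_path=1.0):
    """Spiral position update using the equal-pitch Archimedes radius of
    Equation (10) in place of the logarithmic factor of Equation (7)."""
    l = np.random.uniform(-1.0, 1.0)
    D_prime = np.abs(x_best - x)
    return D_prime * (a_path + b_path * l) * np.cos(2.0 * np.pi * l) + x_best
```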

3.1.3. Rose Spiral Curve (Ro)

Assume a fixed-length segment AB = 2a whose two endpoints slide on two mutually perpendicular straight lines. A perpendicular line OM is drawn from the intersection O of the two straight lines to the line AB; the trajectory of the foot M is called the Rose spiral curve. The mathematical model of the four-leaved Rose spiral searching path is described in Equation (11) and its two-dimensional image is shown in Figure 7.
\begin{cases} x = a \cos(n\pi l) \sin(l) \\ y = a \sin(n\pi l) \cos(l) \end{cases}  (11)
Because the path of this function is relatively simple, and taking the existence of the disturbance into account, the parameters of the function can be adjusted so that the spacing between the search paths is as small as possible and the optimal value may be obtained with as few iterations as possible. For this reason we chose this spiral curve.

3.1.4. Epitrochoid-I (Ep-I)

A movable circle rolls along the outside of a fixed circle to which it is externally tangent, where the radius of the fixed circle is a and the radius of the movable circle is b. During this rolling, the trajectory formed by a fixed point P on the movable circle is called Epitrochoid-I. The mathematical model of the Epitrochoid-I searching path is described in Equation (12) and its two-dimensional image is shown in Figure 8.
\begin{cases} x = (a + b) \cos(n\pi l) - c \cos\left( \left( \frac{a}{b} + 1 \right) n\pi l \right) \\ y = (a + b) \sin(n\pi l) + c \sin\left( \left( \frac{a}{b} + 1 \right) n\pi l \right) \end{cases}  (12)

3.1.5. Hypotrochoid (Hy)

A movable small circle rolls along the inside of a fixed large circle to which it is internally tangent. During this rolling, the trajectory formed by a fixed point P on the small circle is called the Hypotrochoid. The curve changes with the radii of the two circles. The mathematical model of the Hypotrochoid searching path is described in Equation (13) and its two-dimensional image is shown in Figure 9.
\begin{cases} x = (a + b) \cos(n\pi l) - b \cos\left( \left( \frac{a}{b} + 1 \right) n\pi l \right) \\ y = (a + b) \sin(n\pi l) - b \sin\left( \left( \frac{a}{b} + 1 \right) n\pi l \right) \end{cases}  (13)

3.1.6. Epitrochoid-II (Ep-II)

The formation principle of Epitrochoid-II is the same as that of Epitrochoid-I; only the difference between the radii of the big circle and the small circle produces a different shape. The mathematical model of the Epitrochoid-II searching path is described in Equation (14) and its two-dimensional image is shown in Figure 10.
\begin{cases} x = (a - b) \cos(n\pi l) + c \cos\left( \left( \frac{a}{b} - 1 \right) n\pi l \right) \\ y = (a - b) \sin(n\pi l) - c \sin\left( \left( \frac{a}{b} - 1 \right) n\pi l \right) \end{cases}  (14)
The idea behind selecting the three spiral curves Ep-I, Ep-II and Hy is that, when multiple agents are searching around the current optimal position, each agent performs a full search around that position according to its own search path.

3.1.7. Fermat Spiral Curve (Fe)

The Fermat spiral curve is a kind of equiangular spiral curve. It is obtained by joining the starting points of two logarithmic spiral curves whose rotation directions are opposite. The mathematical model of the Fermat spiral searching path is described in Equation (15) and its two-dimensional image is shown in Figure 11.
\begin{cases} x = l \cos\left( \left( \frac{a}{b} - 1 \right) n\pi l \right) \\ y = l \sin\left( \left( \frac{a}{b} - 1 \right) n\pi l \right) \end{cases}  (15)

3.1.8. Lituus Spiral Curve (Li)

The Lituus spiral curve is parameterized by, and formed from the interaction of, three helical curves. The mathematical model of the Lituus spiral searching path is described in Equation (16) and its two-dimensional image is shown in Figure 12.
\begin{cases} x = (a + bl) \cos\left( \left( \frac{a}{b} - 1 \right) n\pi l \right) \\ y = (a + bl) \sin\left( \left( \frac{a}{b} - 1 \right) n\pi l \right) \end{cases}  (16)
For Li and Fe, the search path corresponding to these functions is completed by assigning positive and negative branches to two search agents; by combining two agents searching at symmetrical positions, a larger range can be searched in less time.
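To make the candidate paths easy to compare, they can all be exposed through a single sampling function. The Python sketch below covers the Archimedes, Rose and Hypotrochoid curves of Equations (10), (11) and (13) (the logarithmic spiral was sketched after Equation (9), and the remaining curves follow the same pattern); the default parameter values a, b and n are illustrative, not the tuned values used in the experiments.

```python
import numpy as np

def path_point(name, l, a=1.0, b=0.2, n=3):
    """Return the (x, y) point of a candidate search path at parameter l."""
    if name == "Ar":                      # equal-pitch Archimedes spiral, Equation (10)
        r = a + b * l
        return r * np.cos(2 * np.pi * l), r * np.sin(2 * np.pi * l)
    if name == "Ro":                      # rose curve, Equation (11)
        return a * np.cos(n * np.pi * l) * np.sin(l), a * np.sin(n * np.pi * l) * np.cos(l)
    if name == "Hy":                      # hypotrochoid, Equation (13)
        phase = (a / b + 1) * n * np.pi * l
        return ((a + b) * np.cos(n * np.pi * l) - b * np.cos(phase),
                (a + b) * np.sin(n * np.pi * l) - b * np.sin(phase))
    raise ValueError(f"unknown path: {name}")

# Example: trace the hypotrochoid path over l in [-1, 1].
points = [path_point("Hy", l) for l in np.linspace(-1.0, 1.0, 200)]
```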

3.2. Introduction of Disturbance Factor

In order to make the WOA search thoroughly near the location of the search agent, after each iteration a set of more advantageous search positions X^{*}(t) is obtained. Instead of going directly to the next iteration, a disturbance factor is applied to search the neighborhood of X^{*}(t), and the next iteration then generates a new best search agent. In order to remove the randomness and blindness of the original perturbation method, the range of the disturbance is limited when constructing the perturbation factor, which ensures the accuracy of the local search. At the same time, a perceptual coefficient is introduced so that the search agents swim toward the targets within the disturbance range as far as possible. In this way, the search agents constantly change their positions within the disturbance range, and the current optimum is replaced whenever a better value is found. The introduced disturbance factor is described as follows.
\varepsilon = \frac{x_d - x}{\| x_d - x \|} \cdot Step \cdot Rand()  (17)
After the disturbance, the position of the search agent is updated by the following equation.
x = x + u \cdot Rand() \otimes \varepsilon  (18)
where u is the coefficient that defines the perturbation distance, Rand() is a random number in (−1, 1), Step is the step size by which the search agent moves during the disturbance, x_d is the position at time d, x is the current best position, and ⊗ denotes point-to-point (element-wise) multiplication.
In Equation (17), the sign of the direction term (x_d − x)/‖x_d − x‖ is chosen according to the nature of the fitness function: one sign is used when the fitness function is to be maximized and the opposite sign when it is to be minimized. When the perturbation no longer changes the result, the disturbed solution is allowed to enter the next main loop.
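A minimal Python sketch of this perceptual disturbance is given below. It is an interpretation of Equations (17) and (18): the function name and the sign argument are ours, the random factor is treated as a scalar in Equation (17) and as an element-wise vector in Equation (18), and the authors' MATLAB implementation may differ in these details.

```python
import numpy as np

def perceptual_disturbance(x_best, x_d, step, u=1.0, sign=-1.0):
    """Perturb the current best position (Equations (17)-(18)).

    x_best : current best position (1-D NumPy array)
    x_d    : position recorded at time d
    step   : current moving step size (Equation (19) in Section 3.3)
    u      : coefficient defining the perturbation distance
    sign   : +1 or -1, flipping the direction term for maximization
             versus minimization problems
    """
    diff = x_d - x_best
    norm = np.linalg.norm(diff)
    direction = sign * diff / norm if norm > 0 else np.zeros_like(x_best)
    eps = direction * step * np.random.uniform(-1.0, 1.0)                  # Equation (17)
    return x_best + u * np.random.uniform(-1.0, 1.0, x_best.shape) * eps   # Equation (18)
```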

3.3. Improved WOA with Perceptual Disturbances and Complex Paths

The algorithm should find a better position as fast as possible in the initial period of the disturbance, while in the later stages of the perturbation the search agent should perform a more thorough search near the target to improve the search accuracy. To this end, the moving step size of the search agents is redefined as:
Step = Step_{min} + (Step_{max} - Step_{min}) \times \frac{N - n}{N}, \quad \text{s.t. if } Step > Step_{max} \text{ then } Step = Step_{max}  (19)
where Step_{min} is the minimum value of the moving step, Step_{max} is the maximum value of the moving step, N is the maximum number of iterations and n is the current iteration number.
It can be seen from the above equation that Step takes its maximum value at the beginning of the iterations and its minimum value at the end. The procedure of the complex path-perceptual disturbance WOA (CP-PDWOA) is described as follows.
Step 1: Initialization. Randomly generate N search agents and initialize their locations.
Step 2: Realization of the searching path. In this paper, the moving path of the search agents is improved, but the shrinking idea of the searching path is kept. Randomly generate a number p ∈ (0, 1). When p < 0.5, the shrinking encircling mechanism is executed so as to narrow the search radius; when p ≥ 0.5, the spiral updating position method is executed. These two branches are not applied in a fixed order but are chosen at random.
Step 3: Update locations. The value of |A| in Equation (5) is the criterion for how the search agent updates its next location. When |A| ≥ 1, the search agent randomly selects another agent as the reference for its next move, which ensures the ergodicity of the algorithm. When |A| < 1, the search agent chooses the current optimal agent as the reference for its next move, which ensures the convergence of the algorithm.
Step 4: Perceptual perturbation. The best search agent of each iteration is disturbed. The position of the search agent is then evaluated after the perturbation, and the result is compared with the search agent obtained from the previous disturbance in order to keep the better position. After a number of disturbances, a good set of search agents is obtained and the best position X_{best} is selected.
Step 5: Determine whether to terminate the iteration. If the fitness value f(X_{best}) reaches the termination condition, X_{best} is the optimal solution. If it is not reached, the solution group obtained in Step 4 is returned to Step 2 for the next iteration.
It should be explained that Step 2 and Step 3 are executed at the same time when the algorithm runs; they are decomposed into two steps here only to facilitate the description (a compact sketch of the whole loop is given below).
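The following Python sketch shows how Steps 1-5 could be assembled into a main loop, including the step-size schedule of Equation (19). It is an illustration under our own naming and several simplifications (A and C are treated as scalars, the logarithmic path of Equation (7) is kept as the default spiral branch, and Step 4 uses a plain bounded random perturbation rather than the full perceptual term of Equations (17)-(18)); it is not the authors' MATLAB implementation.

```python
import numpy as np

def cp_pdwoa(objective, dim, bounds, n_agents=30, max_iter=500,
             step_min=1e-3, step_max=1.0, n_disturb=5, b=1.0):
    """Compact sketch of the CP-PDWOA main loop (Steps 1-5), minimization."""
    low, high = bounds
    X = np.random.uniform(low, high, size=(n_agents, dim))           # Step 1: initialization
    best = min(X, key=objective).copy()
    best_val = objective(best)

    for t in range(max_iter):
        a = 2.0 - 2.0 * t / max_iter                                  # a decreases linearly from 2 to 0
        for i in range(n_agents):
            if np.random.rand() < 0.5:                                # Steps 2-3: shrinking encircling
                A = 2.0 * a * np.random.rand() - a                    # Equation (5), scalar for brevity
                C = 2.0 * np.random.rand()                            # Equation (6)
                ref = X[np.random.randint(n_agents)] if abs(A) >= 1.0 else best
                X[i] = ref - A * np.abs(C * ref - X[i])
            else:                                                     # spiral update (path is swappable)
                l = np.random.uniform(-1.0, 1.0)
                X[i] = np.abs(best - X[i]) * np.exp(b * l) * np.cos(2 * np.pi * l) + best
        X = np.clip(X, low, high)

        # Step 4: disturbance of the current best agent (simplified form)
        step = step_min + (step_max - step_min) * (max_iter - t) / max_iter   # Equation (19)
        for _ in range(n_disturb):
            cand = np.clip(best + step * np.random.uniform(-1.0, 1.0, dim), low, high)
            if objective(cand) < best_val:
                best, best_val = cand, objective(cand)

        for i in range(n_agents):                                     # keep the best agent found so far
            f = objective(X[i])
            if f < best_val:
                best, best_val = X[i].copy(), f

    return best, best_val                                             # Step 5: fixed iteration budget here
```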

4. Simulation and Results Analysis

4.1. Selection of Testing Functions

In order to test the performance of the improved WOA, 23 benchmark functions are selected in this paper [41,42]. The expressions of the functions are shown in detail in Table 1.
Three-dimensional images of some of the functions are shown in Figure 13a–i.

4.2. Simulation Results and Analysis

This paper tests the computational power of the WOA on 23 classical benchmark functions. The number of search agents is 30 and the maximum number of iterations is 500. Each function is run 30 times and the average is taken to plot the convergence curve. The benchmark functions can be divided into four groups: unimodal, multimodal, fixed-dimension multimodal and composite functions. Functions F1–F7 are typical unimodal functions, since they have only one global optimum; these functions allow the exploitation capability of the investigated meta-heuristic algorithms to be evaluated. Functions F8–F23 are multimodal functions. Unlike unimodal functions, multimodal functions include many local optima whose number increases exponentially with the problem size (number of design variables). Therefore, this kind of test problem is very useful for evaluating the exploration capability of an optimization algorithm.
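As a concrete illustration of how these test problems are coded, two of the functions from Table 1 are written out below in Python (the authors' own experiments use MATLAB): F1, the unimodal sphere function, and F10, the multimodal Ackley function.

```python
import numpy as np

def f1_sphere(x):
    """F1 in Table 1: unimodal sphere function, global minimum 0 at x = 0."""
    return float(np.sum(x ** 2))

def f10_ackley(x):
    """F10 in Table 1: multimodal Ackley function, global minimum 0 at x = 0."""
    n = x.size
    return float(-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
                 - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n) + 20.0 + np.e)

# Example, using the cp_pdwoa sketch from Section 3.3:
# best, val = cp_pdwoa(f10_ackley, dim=30, bounds=(-32, 32))
```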
In this paper, the simulations were completed in the MATLAB 2010b environment on a Windows 7 computer. The related MATLAB code of CP-PDWOA is available online at https://github.com/sunweizhen01/CP-PDWOA.git. Firstly, in order to verify the effectiveness of the perturbation mechanism, the convergence curves of the original WOA are compared with those obtained after introducing the disturbance mechanism while keeping the logarithmic spiral curve. Then, the search path of the WOA with the disturbance mechanism is replaced by the spiral curves listed in Section 3.1 so as to find the search path with the best optimization performance. Several functions from the above-mentioned classes are chosen as representatives, and their convergence curves are shown in Figure 14 and Figure 15. The convergence performances of all the methods are listed in Table 2. The convergence curves of the representative functions shown in Figure 14a–i verify the validity of the disturbance.
It can be seen from Figure 14 that the convergence of the improved WOA with the introduced perturbation is better than that of the standard WOA, which proves that the perturbation mechanism is effective and gives the algorithm good convergence speed and optimization precision. Next, simulations are carried out to verify which search path has the best search performance; therefore, the convergence curves generated by the improved WOA with the proposed paths are compared.
In Figure 14, LSC represents the search path used by the original WOA, namely the logarithmic spiral curve. Therefore, LSC denotes the original WOA and Disturbance-LSC denotes the PDWOA obtained after introducing the perceptual perturbation. As can be seen from Figure 14, the WOA with the introduced perturbation has significantly better convergence speed and accuracy than the original WOA: when the two methods converge to the same optimal value, Disturbance-LSC reaches it in fewer iterations, and in general Disturbance-LSC converges to a better optimal value than LSC. There is a special case among these examples, F8. Analyzing this function, we calculated its minimum value within its range and found that both the WOA and the PDWOA report minimum values far smaller than this value. The minimum values found by other algorithms, such as GSA, PSO, etc., are also much smaller than this value, as calculated in the literature [7]. So far, we have not found the specific reason for this problem.
From the above analysis and Figure 14, we can see that the PDWOA proposed in this paper performs better than the WOA. In the literature [7], WOA has already been compared with many algorithms to establish its performance. On this basis, we then study the replacement of the PDWOA search path. In order to ensure the validity of the proposed method, the remaining seven spiral curves are introduced.
In order to distinguish these results from the logarithmic spiral curve results of the perturbation-verification part, we compare the effect of the different search paths on the convergence performance of the algorithm and denote the logarithmic spiral curve by Lo. Numerically, the convergence curve labeled Lo is exactly the same as the convergence curve of Disturbance-LSC.
It can be seen from the convergence curves of the benchmark functions and the comparison of the data in Table 2 that, for the same number of iterations, the convergence performance of the WOA using the Archimedes spiral curve as the search path is superior to that of the logarithmic spiral curve used in the original algorithm and of the other spiral curves. In particular, for the functions F5, F6, F10 and F13, the Archimedes spiral curve obviously improves the convergence speed and accuracy of the algorithm compared with the other seven spiral curves (search paths). Especially for function F5, which should converge to 0, the original WOA can only converge to 27.86, and the convergence of several other swarm intelligence algorithms is also not very satisfactory [7]; however, when the search path is changed to the Archimedes spiral curve, the convergence result is clearly close to zero. On the other hand, as seen from the three-dimensional images of functions F8, F9 and F14, these three functions are very complex multi-peak functions with many local minima. In CP-PDWOA, the added perturbation mechanism enables the algorithm to effectively avoid local minima and search the extreme points, and the convergence speed of the algorithm is obviously improved. This shows that the improvement proposed in this paper enhances the ability of the algorithm to avoid local minima. From the convergence curves of the functions and the data in Table 2, it can be seen that the improved whale optimization algorithm has excellent search performance.
As can be seen from Figure 15f, when the average value of function F12 is searched, the convergence curves corresponding to Lo and Hy almost coincide, with only a slight difference in the early stage. This is not surprising: the convergence results of all the models do not differ greatly on the other functions either, and from the paths of the two models it can be seen that similar convergence curves are obtained by adjusting the parameters of the functions. The main reason is that these two models fall into the same local extreme point, so the same convergence curve is obtained; this is also related to the initialization of the agents, because in order to ensure the feasibility of the simulation we chose pseudo-random numbers as the initial positions of the agents.
It is concluded that the introduction of the perceptual perturbation not only makes the searching of the agents more purposeful, but also makes the locations of the search agents more diverse and prevents the algorithm from falling into local minima. At the same time, the moving step size of the search agent is adjusted across the simulation phases so that the algorithm performs a wide-ranging search in the early stage and a more refined search in the later stage. This ensures the accuracy of the algorithm's searching process and also gives the algorithm a faster convergence rate.

5. Conclusions

In this paper, we have drawn on other scholars' experience in improving swarm intelligence algorithms and improved the performance of WOA by introducing a disturbance factor. The WOA search mechanism was then analyzed, and the equal-pitch Archimedes spiral curve was adopted as the search path of the agent. The simulation results prove that the performance of the equal-pitch Archimedes spiral curve is superior to that of the other types of spiral curve. When collating the simulations and the algorithm, we found that if different search paths are randomly assigned to different agents within the same iteration, the resulting convergence performance is even better; correspondingly, however, this makes the adjustment of the algorithm parameters more difficult. In summary, the proposed complex path-perceptual disturbance WOA (CP-PDWOA) algorithm has stronger search performance.

Supplementary Materials

The related MATLAB code of CP-PDWOA is available online at https://github.com/sunweizhen01/CP-PDWOA.git.

Author Contributions

Conceptualization, W.-z.S. and J.-s.W.; Methodology, W.-z.S.; Software, W.-z.S.; Validation, W.-z.S. and J.-s.W.; Formal Analysis, X.W.; Investigation, W.-z.S.; Resources, W.-z.S.; Data Curation, W.-z.S.; Writing-Original Draft Preparation, W.-z.S.; Writing-Review & Editing, X.W.; Visualization, W.-z.S.; Supervision, J.-s.W.; Project Administration, J.-s.W.; Funding Acquisition, J.-s.W.

Funding

This research was funded by the Project by National Natural Science Foundation of China grant number [21576127], the Basic Scientific Research Project of Institution of Higher Learning of Liaoning Province grant number [2017FWDF10], and the CAS Pioneer Hundred Talents Program (Type C) grant number [2017-122].

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Chandra Mohan, B.; Baskaran, R. A survey: Ant Colony Optimization based recent research and implementation on several engineering domain. Expert Syst. Appl. 2012, 39, 4618–4627. [Google Scholar] [CrossRef]
  2. Yu, Y.; Li, Y.; Li, J. Nonparametric modeling of magnetorheological elastomer base isolator based on artificial neural network optimized by ant colony algorithm. J. Intell. Mater. Syst. Struct. 2015, 26, 1789–1798. [Google Scholar] [CrossRef]
  3. Precup, R.E.; Sabau, M.C.; Petriu, E.M. Nature-inspired optimal tuning of input membership functions of Takagi-Sugeno-Kang fuzzy models for anti-lock braking systems. Appl. Soft Comput. 2015, 27, 575–589. [Google Scholar] [CrossRef]
  4. Vallada, E.; Ruiz, R. A genetic algorithm for the unrelated parallel machine scheduling problem with sequence dependent setup times. Eur. J. Oper. Res. 2011, 211, 612–622. [Google Scholar] [CrossRef] [Green Version]
  5. Zăvoianu, A.-C.; Bramerdorfer, G.; Lughofer, E.; Silber, S.; Amrhein, W.; Klement, E.P. Hybridization of multi-objective evolutionary algorithms and artificial neural networks for optimizing the performance of electrical drives. Eng. Appl. Artif. Intell. 2013, 26, 1781–1794. [Google Scholar] [CrossRef]
  6. Kennedy, J. Particle swarm optimization. In Encyclopedia of Machine Learning; Springer: New York, NY, USA, 2010; pp. 760–766. [Google Scholar]
  7. Yu, Y.; Li, Y.; Li, J. Parameter identification of a novel strain stiffening model for magnetorheological elastomer base isolator utilizing enhanced particle swarm optimization. J. Intell. Mater. Syst. Struct. 2015, 26, 2446–2462. [Google Scholar] [CrossRef]
  8. Yu, Y.; Li, Y.; Li, J. Forecasting hysteresis behaviours of magnetorheological elastomer base isolator utilizing a hybrid model based on support vector regression and improved particle swarm optimization. Smart Mater. Struct. 2015, 24, 035025. [Google Scholar] [CrossRef]
  9. Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Glob. Optim. 2007, 39, 459–471. [Google Scholar] [CrossRef]
  10. Precup, R.E.; David, R.C.; Petriu, E.M. Grey Wolf Optimizer Algorithm-Based Tuning of Fuzzy Control Systems with Reduced Parametric Sensitivity. IEEE Trans. Ind. Electron. 2016, 64, 527–534. [Google Scholar] [CrossRef]
  11. Saadat, J.; Moallem, P.; Koofigar, H. Training echo state neural network using harmony search algorithm. Int. J. Artif. Intell. 2017, 15, 163–179. [Google Scholar]
  12. Vrkalovic, S.; Teban, T.A.; Borlea, L.D. Stable Takagi-Sugeno fuzzy control designed by optimization. Int. J. Artif. Intell. 2017, 15, 17–29. [Google Scholar]
  13. Reddy, P.D.P.; Reddy, V.C.V.; Manohar, T.G. Whale optimization algorithm for optimal sizing of renewable resources for loss reduction in distribution systems. Renew. Wind Water Solar 2017, 4, 3. [Google Scholar] [CrossRef]
  14. Trivedi, I.N.; Bhoye, M.; Bhesdadiya, R.H.; Jangir, P.; Jangir, N.; Kumar, A. An emission constraint environment dispatch problem solution with microgrid using Whale Optimization Algorithm. In Proceedings of the Power Systems Conference, Bhubaneswar, India, 19–21 December 2016. [Google Scholar]
  15. Rosyadi, A.; Penangsang, O.; Soeprijanto, A. Optimal filter placement and sizing in radial distribution system using whale optimization algorithm. In Proceedings of the International Seminar on Intelligent Technology and ITS Applications, Surabaya, Indonesia, 28–29 August 2017; pp. 87–92. [Google Scholar]
  16. Buch, H.; Jangir, P.; Jangir, N.; Ladumor, D.; Bhesdadiya, R.H. Optimal Placement and Coordination of Static VAR Compensator with Distributed Generation using Whale Optimization Algorithm. In Proceedings of the IEEE 1st International Conference on Power Electronics, Intelligent Control and Energy Systems (ICPEICES), Delhi, India, 4–6 July 2016. [Google Scholar]
  17. Ladumor, D.P.; Trivedi, I.N.; Jangir, P.; Kumar, A. A Whale Optimization Algorithm approach for Unit Commitment Problem Solution. In Proceedings of the Conference Advancement in Electrical & Power Electronics Engineering (AEPEE 2016), Morbi, India, 14–17 December 2016. [Google Scholar]
  18. Prakash, D.B.; Lakshminarayana, C. Optimal siting of capacitors in radial distribution network using Whale Optimization Algorithm. Alex. Eng. J. 2016, 56, 499–509. [Google Scholar] [CrossRef]
  19. Yan, Z.; Sha, J.; Liu, B.; Tian, W.; Lu, J. An Ameliorative Whale Optimization Algorithm for Multi-Objective Optimal Allocation of Water Resources in Handan, China. Water 2018, 10, 87. [Google Scholar] [CrossRef]
  20. Medani, K.B.O.; Sayah, S.; Bekrar, A. Whale optimization algorithm based optimal reactive power dispatch: A case study of the Algerian power system. Electr. Power Syst. Res. 2017. [Google Scholar] [CrossRef]
  21. Sun, W.Z.; Wang, J.S. Elman Neural network Soft-sensor Model of Conversion Velocity in Polymerization Process Optimized by Chaos Whale Optimization Algorithm. IEEE Access 2017, 5, 13062–13076. [Google Scholar] [CrossRef]
  22. Aljarah, I.; Faris, H.; Mirjalili, S. Optimizing connection weights in neural networks using the whale optimization algorithm. Soft Comput. 2016, 22, 1–15. [Google Scholar] [CrossRef]
  23. Bhesdadiya, R.; Jangir, P.; Jangir, N.; Trivedi, I.N.; Ladumor, D. Training Multi-Layer Perceptron in Neural Network using Whale Optimization Algorithm. Indian J. Sci. Technol. 2016, 9, 28–36. [Google Scholar]
  24. Wang, J.; Du, P.; Niu, T.; Yang, W. A novel hybrid system based on a new proposed algorithm—Multi-Objective Whale Optimization Algorithm for wind speed forecasting. Appl. Energy 2017, 208, 344–360. [Google Scholar] [CrossRef]
  25. Niu, P.F.; Wu, Z.L.; Ma, Y.P.; Shi, C.J.; Li, J.B. Prediction of steam turbine heat consumption rate based on whale optimization algorithm. CIESC J. 2017, 68, 1049–1057. [Google Scholar]
  26. Aziz, M.A.E.; Ewees, A.A.; Hassanien, A.E. Whale Optimization Algorithm and Moth-Flame Optimization for multilevel thresholding image segmentation. Expert Syst. Appl. 2017, 83, 242–256. [Google Scholar] [CrossRef]
  27. Mostafa, A.; Hassanien, A.E.; Houseni, M.; Hefny, H. Liver segmentation in MRI images based on whale optimization algorithm. Multimed. Tools Appl. 2017, 76, 24931–24954. [Google Scholar] [CrossRef]
  28. Jangir, P.; Jangir, N. Non-Dominated Sorting Whale Optimization Algorithm (NSWOA): A Multi-Objective Optimization Algorithm for Solving Engineering Design Problems. Glob. J. Res. Eng. 2017, 17, 15–42. [Google Scholar]
  29. Oliva, D.; Aziz, M.A.E.; Hassanien, A.E. Parameter estimation of photovoltaic cells using an improved chaotic whale optimization algorithm. Appl. Energy 2017, 200, 141–154. [Google Scholar]
  30. Sharawi, M.; Zawbaa, H.M.; Emary, E. Feature selection approach based on whale optimization algorithm. In Proceedings of the Ninth International Conference on Advanced Computational Intelligence, Doha, Qatar, 4–6 February 2017; pp. 163–168. [Google Scholar]
  31. Zhao, G.; Zhou, Y.; Ouyang, Z.; Wang, Y. A Novel Disturbance Parameters PSO Algorithm for Functions Optimization. Adv. Inf. Sci. Serv. Sci. 2012, 4, 51–57. [Google Scholar]
  32. Jing, Y.E. The Diversity Disturbance PSO Algorithm to Solve TSP Problem. J. Langfang Teach. Coll. 2010, 5, 2. [Google Scholar]
  33. Li, D.; Deng, N. An electoral quantum-behaved PSO with simulated annealing and gaussian disturbance for permutation flow shop scheduling. J. Inf. Comput. Sci. 2012, 9, 2941–2949. [Google Scholar]
  34. Gao, H.; Kwong, S.; Yang, J.; Cao, J. Particle swarm optimization based on intermediate disturbance strategy algorithm and its application in multi-threshold image segmentation. Inf. Sci. 2013, 250, 82–112. [Google Scholar] [CrossRef]
  35. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  36. Mafarja, M.M.; Mirjalili, S. Hybrid Whale Optimization Algorithm with simulated annealing for feature selection. Neurocomputing 2017, 260, 302–312. [Google Scholar] [CrossRef]
  37. Ling, Y.; Zhou, Y.; Luo, Q. Lévy Flight Trajectory-Based Whale Optimization Algorithm for Global Optimization. IEEE Access 2017, 5, 6168–6186. [Google Scholar] [CrossRef]
  38. Abdel-Basset, M.; El-Shahat, D.; El-Henawy, I.; Sangaiah, A.K.; Ahmed, S.H. A Novel Whale Optimization Algorithm for Cryptanalysis in Merkle-Hellman Cryptosystem. Mob. Netw. Appl. 2018, 5, 1–11. [Google Scholar] [CrossRef]
  39. Kaveh, A.; Ghazaan, M.I. Enhanced Whale Optimization Algorithm for Sizing Optimization of Skeletal Structures. Mech. Based Des. Struct. Mach. 2017, 45, 345–362. [Google Scholar] [CrossRef]
  40. Spiral Curve. Van Nostrand’s Scientific Encyclopedia; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2005. [Google Scholar]
  41. Ren, Y.; Wu, Y. An efficient algorithm for high-dimensional function optimization. Soft Comput. 2013, 17, 995–1004. [Google Scholar] [CrossRef]
  42. Yuan, Z.; de Oca, M.A.M.; Birattari, M.; Stützle, T. Continuous optimization algorithms for tuning real and integer parameters of swarm intelligence algorithms. Swarm Intell. 2012, 6, 49–75. [Google Scholar] [CrossRef]
Figure 1. Theoretical basis of two-dimensional plane.
Figure 2. Theoretical basis of two-dimensional plane.
Figure 3. Theoretical basis of shrinking encircling mechanism.
Figure 4. Sketch map of spiral renewal position method.
Figure 5. Two-dimensional curve of Logarithmic spiral.
Figure 6. Two-dimensional curve of equal pitch Archimedes spiral.
Figure 7. Two-dimensional curve of a Rose spiral with four leaves.
Figure 8. Two-dimensional curve of Epicycloid-I.
Figure 9. Two-dimensional curve of Hypotrochoid.
Figure 10. Two-dimensional curve of Epicycloid-II.
Figure 11. Two-dimensional curve of Fermat spiral.
Figure 12. Two-dimensional curve of Lituus spiral.
Figure 13. Three-dimensional images of some optimization functions.
Figure 14. Comparison of the convergence curves of the partial functions to verify disturbances.
Figure 15. Comparison of the convergence curves of the partial functions.
Table 1. Testing functions.

Function | Function Expression | Range | Fmin
F1 | f_1(x) = \sum_{i=1}^{n} x_i^2 | [−100, 100] | 0
F2 | f_2(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i| | [−10, 10] | 0
F3 | f_3(x) = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2 | [−100, 100] | 0
F4 | f_4(x) = \max_i \{ |x_i|, 1 \le i \le n \} | [−100, 100] | 0
F5 | f_5(x) = \sum_{i=1}^{n-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right] | [−30, 30] | 0
F6 | f_6(x) = \sum_{i=1}^{n} \left( \lfloor x_i + 0.5 \rfloor \right)^2 | [−100, 100] | 0
F7 | f_7(x) = \sum_{i=1}^{n} i x_i^4 + \mathrm{random}[0, 1) | [−1.28, 1.28] | 0
F8 | f_8(x) = \sum_{i=1}^{n} -x_i \sin\left( \sqrt{|x_i|} \right) | [−500, 500] | −418.9 × 5
F9 | f_9(x) = \sum_{i=1}^{n} \left[ x_i^2 - 10 \cos(2\pi x_i) + 10 \right] | [−5.12, 5.12] | 0
F10 | f_{10}(x) = -20 \exp\left( -0.2 \sqrt{\frac{1}{n} \sum_{i=1}^{n} x_i^2} \right) - \exp\left( \frac{1}{n} \sum_{i=1}^{n} \cos(2\pi x_i) \right) + 20 + e | [−32, 32] | 0
F11 | f_{11}(x) = \frac{1}{4000} \sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\left( \frac{x_i}{\sqrt{i}} \right) + 1 | [−600, 600] | 0
F12 | f_{12}(x) = \frac{\pi}{n} \left\{ 10 \sin(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 \left[ 1 + 10 \sin^2(\pi y_{i+1}) \right] + (y_n - 1)^2 \right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4), where y_i = 1 + \frac{x_i + 1}{4}, u(x_i, a, k, m) = \begin{cases} k (x_i - a)^m & x_i > a \\ 0 & -a \le x_i \le a \\ k (-x_i - a)^m & x_i < -a \end{cases} | [−50, 50] | 0
F13 | f_{13}(x) = 0.1 \left\{ \sin^2(3\pi x_1) + \sum_{i=1}^{n} (x_i - 1)^2 \left[ 1 + \sin^2(3\pi x_i + 1) \right] + (x_n - 1)^2 \left[ 1 + \sin^2(2\pi x_n) \right] \right\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4) | [−50, 50] | 0
F14 | f_{14}(x) = \left( \frac{1}{500} + \sum_{j=1}^{25} \frac{1}{j + \sum_{i=1}^{2} (x_i - a_{ij})^6} \right)^{-1} | [−65, 65] | 1
F15 | f_{15}(x) = \sum_{i=1}^{11} \left[ a_i - \frac{x_1 (b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4} \right]^2 | [−5, 5] | 0.0003
F16 | f_{16}(x) = 4 x_1^2 - 2.1 x_1^4 + \frac{1}{3} x_1^6 + x_1 x_2 - 4 x_2^2 + 4 x_2^4 | [−5, 5] | −1.0316
F17 | f_{17}(x) = \left( x_2 - \frac{5.1}{4\pi^2} x_1^2 + \frac{5}{\pi} x_1 - 6 \right)^2 + 10 \left( 1 - \frac{1}{8\pi} \right) \cos x_1 + 10 | [−5, 5] | 0.398
F18 | f_{18}(x) = \left[ 1 + (x_1 + x_2 + 1)^2 (19 - 14 x_1 + 3 x_1^2 - 14 x_2 + 6 x_1 x_2 + 3 x_2^2) \right] \times \left[ 30 + (2 x_1 - 3 x_2)^2 (18 - 32 x_1 + 12 x_1^2 + 48 x_2 - 36 x_1 x_2 + 27 x_2^2) \right] | [−2, 2] | 3
F19 | f_{19}(x) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{3} a_{ij} (x_j - p_{ij})^2 \right) | [1, 3] | −3.80
F20 | f_{20}(x) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{6} a_{ij} (x_j - p_{ij})^2 \right) | [0, 1] | −3.32
F21 | f_{21}(x) = -\sum_{i=1}^{5} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1} | [0, 10] | −10.1513
F22 | f_{22}(x) = -\sum_{i=1}^{7} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1} | [0, 10] | −10.4028
F23 | f_{23}(x) = -\sum_{i=1}^{10} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1} | [0, 10] | −10.5363
Table 2. Comparison of optimization results obtained for the unimodal, multimodal and fixed-dimension multimodal benchmark functions. For each search path, the average (AVE) followed by the standard deviation (STD) over the 30 runs is reported.

Function | Lo (AVE, STD) | Ar (AVE, STD) | Ro (AVE, STD) | Hy (AVE, STD) | Ep-I (AVE, STD) | Ep-II (AVE, STD) | Fe (AVE, STD) | Li (AVE, STD)
F11.41 × 10−304.91 × 10−303.64 × 10−1064992.41162.065 × 10−476816.39591.001 × 10−488014.52871.031 × 10−866343.72627.242 × 10−715094.08433.096 × 10−605251.47993.638 × 10−497360.7530
F21.06 × 10−212.39 × 10−211.26 × 10−10510125014.91 × 10−65425300408138.33 × 10−43194298129333.22 × 10−52974981867293.22 × 10−5254637507792.66 × 10−334937715280624.66 × 10−3614453630723
F321,533.0615,903.3417,308.0961,432.7649,342.1222,646.5537,706.4622,712.24855,440.03919,449.5750,199.0410,850.3169,415.3225,663.1530,810.0336,337.14
F40.0725810.397470.024010.7508.141735.14897.832719.8330.76145.79551.143915.9421.032415.5683.808224.589
F527.865580.7636260.4452196696428.7891746345027.77919673010.6371976111.6512091628.5982586195.436226382
F63.1162660.5324290.017528622.9840.00012085866.5990.0101635806.4490.011667645.3300.007386855.5530.018206264.78760.0545704673.773
F70.0014250.0011490.0004566.06420.006178.950370.00195310.19040.0035210.17420.003177.34950.005647.18750.0001239.778
F8−5080.76695.7968−12,569.062882.164−8103.505784.832−7574.253385.518−7747.811421.263−9937.307625.755−8163.108395.22−11,050.7071125.405
F900049.058075.368069.589069.735087.804065.9588060.5413
F107.40439.897578.88 × 10−162.94494.44 × 10−152.99654.44 × 10−153.54818.88 × 10−163.32398.88 × 10−163.01697.99 × 10−162.799401.39 × 10−133.05938
F110.000280.001589.9767 × 10−645.548050.3721.5259 × 10−1062.5257.414 × 10−1054.5371.1872 × 10−1355.45450.0007657166.13653.4649 × 10−550.2502
F120.339670.214860.00103435935460.0335483357770.5537240748790.0631464838370.1670350413770.0273397175310.037051384616
F131.889010.266080.0052579150151.6642732684620.1307893153330.6859966988540.2089966754671.0066688000441.384978754257
F142.111972.498590.998803.55382.98215.29410.998014.30430.998022.02500.99800.819302.98218.19282.98210.3263
F150.000570.000320.000300.000550.000310.010030.000330.015610.000320.001950.000330.005500.000710.003630.000370.01032
F16−1.03164.2 × 10−7−1.03164.2 × 10−7-1.03164.2 × 10−7−1.03164.2 × 10−7−1.03164.2 × 10−7−1.03164.2 × 10−7−1.03164.2 × 10−7−1.03164.2 × 10−7
F170.397912.7 × 10−50.08172.7 × 10−50.397912.7 × 10−50.397912.7 × 10−50.397912.7 × 10−50.397912.7 × 10−50.397912.7 × 10−50.397912.7 × 10−5
F1834.22 × 10−150.30987.65 × 10−1836.51 × 10−1534.36 × 10−1533.12 × 10−1532.13 × 10−1535.63 × 10−1534.22 × 10−15
F19−3.856160.002706−3.86210.0495−3.86250.1135−3.86270.0110−3.86220.00087−3.85990.0030−3.84860.0024−3.86120.0165
F20−2.981050.376653−3.321650.1012356−3.312560.118835−3.319360.191023−3.318440.750346−3.041780.055936−2.835320.078376−3.321260.066325
F21−7.049183.629551−10.1520.8923−5.0550.5159−5.0540.4053−2.6270.1699−5.05460.3044−5.055830.5427−9.6291.25043
F22−8.18178 3.829202−10.40231.2051−3.72420.19741−10.40201.4238−10.40142.3170−2.76580.2601−5.08760.5905−10.40242.6105
F23−9.34238 2.414737−10.53611.1602−5.11851.3741−5.12411.1746−3.83510.3774−3.83540.4215−5.07401.16607−5.11661.2442

Citation: Sun, W.-z.; Wang, J.-s.; Wei, X. An Improved Whale Optimization Algorithm Based on Different Searching Paths and Perceptual Disturbance. Symmetry 2018, 10, 210. https://doi.org/10.3390/sym10060210