Article

Adaptive Mutation Dynamic Search Fireworks Algorithm

School of Computer, Shenyang Aerospace University, Shenyang 110136, China
* Author to whom correspondence should be addressed.
Algorithms 2017, 10(2), 48; https://doi.org/10.3390/a10020048
Submission received: 23 February 2017 / Revised: 20 April 2017 / Accepted: 25 April 2017 / Published: 28 April 2017

Abstract

The Dynamic Search Fireworks Algorithm (dynFWA) is an effective algorithm for solving optimization problems. However, dynFWA tends to fall into local optima prematurely and converges slowly. To address these problems, an adaptive mutation dynamic search fireworks algorithm (AMdynFWA) is introduced in this paper. The proposed algorithm applies either the Gaussian mutation or the Levy mutation to the core firework (CF), chosen according to a mutation probability. Our simulation compares the proposed algorithm with FWA-based algorithms and other swarm intelligence algorithms. The results show that the proposed algorithm achieves better overall performance on the standard test functions.

1. Introduction

The Fireworks Algorithm (FWA) [1] is a swarm intelligence algorithm developed in recent years, inspired by the natural phenomenon of fireworks exploding into sparks, and it can solve some optimization problems effectively. Compared with other intelligent algorithms such as particle swarm optimization and genetic algorithms, the FWA adopts a new type of explosive search mechanism, calculating the explosion amplitude and the number of explosion sparks through an interaction mechanism between the fireworks.
However, researchers quickly found that the traditional FWA has some disadvantages in solving optimization problems, chiefly slow convergence speed and low accuracy, and many improved algorithms have therefore been proposed. So far, research on the FWA has concentrated on improving its operators. One of the most important improvements is the enhanced fireworks algorithm (EFWA) [2], in which the operators of the conventional FWA were thoroughly analyzed and revised. Based on the EFWA, an adaptive fireworks algorithm (AFWA) [3] was proposed; it was the first attempt to control the explosion amplitude without preset parameters by detecting the results of the search process. In [4], a dynamic search fireworks algorithm (dynFWA) was proposed, which divides the fireworks into a core firework and non-core fireworks according to their fitness values and adaptively adjusts the explosion amplitude of the core firework. Based on an analysis of each operator of the fireworks algorithm, an improved fireworks algorithm (IFWA) [5] was proposed. Since the FWA was proposed, it has been applied in many areas [6], including digital filter design [7], nonnegative matrix factorization [8], spam detection [9], image identification [10], mass minimization of trusses with dynamic constraints [11], clustering [12], and power loss minimization and voltage profile enhancement [13].
The aforementioned FWA variants improve the performance of FWA to some extent. However, inhibiting premature convergence and improving solution accuracy remain challenging issues that require further research on dynFWA.
In this paper, an adaptive mutation dynamic search fireworks algorithm (AMdynFWA) is presented. In AMdynFWA, the core firework chooses either Gaussian mutation or Levy mutation based on the mutation probability. When it chooses the Gaussian mutation, the local search ability of the algorithm will be enhanced, and by choosing Levy mutation, the ability of the algorithm to jump out of local optimization will be enhanced.
The paper is organized as follows. In Section 2, the dynamic search fireworks algorithm is introduced. The AMdynFWA is presented in Section 3. The simulation experiments and analysis of the results are given in detail in Section 4. Finally, the conclusion is summarized in Section 5.

2. Dynamic Search Fireworks Algorithm

The AMdynFWA is based on the dynFWA because dynFWA is simple and works stably. In this section, we briefly introduce the framework and the operators of the dynFWA for further discussion.
Without loss of generality, consider the following minimization problem:

min f(x)   (1)

The objective is to find an optimal x with a minimal evaluation (fitness) value.
In dynFWA, there are two important components: the explosion operator (the sparks generated by the explosion) and the selection strategy.

2.1. Explosion Operator

Each firework explodes and generates a certain number of explosion sparks within a certain range (the explosion amplitude). The number of explosion sparks (Equation (2)) is calculated from the quality (fitness) of each firework.
For each firework X_i, the number of its explosion sparks is calculated as follows:

S_i = m × (y_max − f(X_i) + ε) / (Σ_{i=1..N} (y_max − f(X_i)) + ε)   (2)

where y_max = max(f(X_i)), m is a constant that controls the total number of explosion sparks, and ε is the machine epsilon, used to avoid division by zero.
To prevent good fireworks from producing too many explosion sparks while poor fireworks produce too few, S_i is bounded as follows:

S_i = round(a × m),   if S_i < a × m
S_i = round(b × m),   if S_i > b × m
S_i = round(S_i),     otherwise   (3)

where a and b are fixed constant parameters that confine the range of the population size.
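The spark-number computation in Equations (2) and (3) can be sketched as follows; this is an illustrative Python rendering, with the values of m, a, and b chosen for the example rather than taken from the paper:

```python
import numpy as np

def spark_numbers(fitness, m=150, a=0.04, b=0.8):
    """Number of explosion sparks per firework (Eqs. (2)-(3)).

    fitness: array of f(X_i) values (minimization).
    m, a, b are illustrative values, not the paper's exact settings.
    """
    eps = np.finfo(float).eps
    y_max = fitness.max()
    # Eq. (2): better (lower) fitness -> larger share of the spark budget
    s = m * (y_max - fitness + eps) / ((y_max - fitness).sum() + eps)
    # Eq. (3): clamp so good fireworks do not monopolise the budget
    # and poor fireworks still receive a few sparks
    s = np.where(s < a * m, round(a * m),
                 np.where(s > b * m, round(b * m), np.round(s)))
    return s.astype(int)
```

With fitness values [1, 2, 3], m = 10, a = 0.1, and b = 0.6, the best firework is clamped down to 6 sparks and the worst is lifted up to 1.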
In dynFWA, fireworks are divided into two types: the core firework (CF) and the non-core fireworks. The CF is the firework with the best fitness, as given by Equation (4):

X_CF = arg min f(x_i)   (4)
The explosion amplitudes of the non-core fireworks and of the core firework are calculated differently. The non-core fireworks' explosion amplitudes (except for the CF) are calculated just as in the previous versions of FWA:

A_i = A × (f(X_i) − y_min + ε) / (Σ_{i=1..N} (f(X_i) − y_min) + ε)   (5)

where y_min = min f(X_i), A is a constant that controls the explosion amplitude, and ε is the machine epsilon, used to avoid division by zero.
However, for the CF, its explosion amplitude is adjusted according to the search result in the previous generation:

A_CF(t) = A_CF(1),            if t = 1
A_CF(t) = C_r × A_CF(t − 1),  if f(X_CF(t)) = f(X_CF(t − 1))
A_CF(t) = C_a × A_CF(t − 1),  if f(X_CF(t)) < f(X_CF(t − 1))   (6)

where A_CF(t) is the explosion amplitude of the CF in generation t, C_r < 1 is the reduction coefficient, and C_a > 1 is the amplification coefficient. In the first generation, the CF is the best among all the randomly initialized fireworks, and its amplitude is preset to a constant, usually the diameter of the search space.
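The amplitude update in Equation (6) amounts to a few lines. The sketch below assumes the reduction and amplification coefficients C_r = 0.9 and C_a = 1.2 commonly used for dynFWA; treat them as illustrative rather than the paper's verified settings:

```python
def update_cf_amplitude(A_prev, f_cf_now, f_cf_prev, Cr=0.9, Ca=1.2):
    """Dynamic explosion amplitude of the core firework (Eq. (6)).

    Cr (reduction, < 1) and Ca (amplification, > 1) are illustrative
    values in the spirit of dynFWA, not necessarily the paper's.
    """
    if f_cf_now < f_cf_prev:   # CF improved: enlarge the search radius
        return Ca * A_prev
    return Cr * A_prev         # no improvement: shrink and refine locally
```

Amplification on improvement lets the CF keep moving quickly toward better regions, while reduction on stagnation narrows the search for local refinement.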
Algorithm 1 describes the process of the explosion operator in dynFWA.
Algorithm 1. Generating Explosion Sparks
 Calculate the number of explosion sparks S_i
 Calculate the explosion amplitude A_i of each non-core firework
 Calculate the explosion amplitude A_CF of the core firework
 Set z = rand(1, d)
 For k = 1:d do
  If dimension k is among the z selected dimensions then
   If X_j is the core firework then
    X_j^k = X_j^k + rand(0, A_CF)
   Else
    X_j^k = X_j^k + rand(0, A_i)
   End if
   If X_j^k is out of bounds then
    X_j^k = X_min^k + |X_j^k| % (X_max^k − X_min^k)
   End if
  End if
 End for
where the operator % refers to the modulo operation, and X_min^k and X_max^k refer to the lower and upper bounds of the search space in dimension k.
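Algorithm 1 can be sketched in Python as below. This is an illustrative re-implementation (not the authors' code): rand(0, A) is read as a uniform offset in (0, A), and out-of-bounds components are mapped back with the modulo rule:

```python
import random

def explosion_spark(x, amplitude, lower, upper):
    """Generate one explosion spark from firework x (Algorithm 1).

    Perturbs a random subset of dimensions by a uniform offset in
    (0, amplitude); out-of-bounds components are remapped with the
    modulo rule X_min^k + |X^k| % (X_max^k - X_min^k).
    """
    d = len(x)
    spark = list(x)
    z = random.randint(1, d)                 # how many dimensions to perturb
    for k in random.sample(range(d), z):     # the z selected dimensions
        spark[k] += random.uniform(0, amplitude)
        if not (lower[k] <= spark[k] <= upper[k]):
            spark[k] = lower[k] + abs(spark[k]) % (upper[k] - lower[k])
    return spark
```

The same routine serves both firework types; only the amplitude argument differs (A_CF for the core firework, A_i otherwise).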

2.2. Selection Strategy

In dynFWA, a selection method referred to as the elitism-random selection method is applied. In this selection process, the best individual of the candidate set is selected first; the remaining individuals are then selected randomly.
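A minimal sketch of this selection step, assuming a list-based population (illustrative, not the authors' code):

```python
import random

def elitism_random_selection(population, fitness, m):
    """Elitism-random selection: keep the best individual, then fill
    the remaining m - 1 slots uniformly at random from the rest."""
    best = min(range(len(population)), key=lambda i: fitness[i])
    rest = [i for i in range(len(population)) if i != best]
    chosen = [best] + random.sample(rest, m - 1)
    return [population[i] for i in chosen]
```

Random filling keeps the selection cheap (no fitness-proportional distance computations as in the original FWA) while elitism guarantees the best solution survives.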

3. Adaptive Mutation Dynamic Search Fireworks Algorithm

The mutation operation is an important step in swarm intelligence algorithms, and different mutation schemes have different search characteristics. Zhou pointed out that the Gaussian mutation has a strong local exploitation ability [14]. Fei illustrated that the Levy mutation not only improves the global optimization ability of the algorithm, but also helps the algorithm jump out of local optima and keeps the diversity of the population [15]. Thus, combining the Gaussian mutation with the Levy mutation is an effective way to improve the exploitation and exploration of dynFWA.
For the core firework, in each iteration, one of two mutation schemes is applied, chosen according to a probability p. The new mutation strategy is defined as:
X_CF' = X_CF + X_CF ⊗ Gaussian(),  if E < p
X_CF' = X_CF + X_CF ⊗ Levy(),      otherwise   (7)

where p is a probability parameter, X_CF is the core firework in the current population, and ⊗ denotes element-wise multiplication. Gaussian() is a random number drawn from the normal distribution with mean mu = 0 and standard deviation sigma = 1, and Levy() is a random number drawn from the Levy distribution, calculated with the parameter β = 1.5 [16]. The value of E varies dynamically with the evolution of the population; with reference to the annealing function of the simulated annealing algorithm, E is designed to decay exponentially:

E = e^(−(2t/T_max)^2)   (8)

where t is the current number of function evaluations and T_max is the maximum number of function evaluations.
To sum up, another type of sparks, the mutation sparks, are generated based on an adaptive mutation process (Algorithm 2). This algorithm is performed Nm times, each time with the core firework XCF (Nm is a constant to control the number of mutation sparks).
Algorithm 2. Generating Mutation Sparks
 Set the value of the mutation probability p
 Find the core firework X_CF in the current population
 Calculate the value of E by Equation (8)
 Set z = rand(1, d)
 For k = 1:d do
  If dimension k is among the z selected dimensions then
   Produce mutation spark X_CF' by Equation (7)
   If X_CF' is out of bounds then
    X_CF' = X_min + rand × (X_max − X_min)
   End if
  End if
 End for
where d is the number of dimensions, and X_min and X_max are the lower and upper bounds of the search space.
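Equations (7) and (8) can be sketched as follows. The Levy generator uses Mantegna's algorithm with β = 1.5; the decaying form of E (negative exponent, so E falls from 1 towards e^−4) is our reading of Equation (8), which makes early iterations favour heavy-tailed Levy jumps (exploration) and later ones the Gaussian perturbation (exploitation):

```python
import math
import numpy as np

def levy(beta=1.5, size=None):
    """Levy-distributed random numbers via Mantegna's algorithm
    (beta = 1.5, as in the paper)."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta
                  * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma_u, size)
    v = np.random.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)

def mutation_spark(x_cf, t, t_max, p=0.3):
    """Adaptive mutation of the core firework (Eqs. (7)-(8), sketch)."""
    E = math.exp(-(2 * t / t_max) ** 2)          # Eq. (8), our reading
    x = np.asarray(x_cf, dtype=float)
    if E < p:
        # Gaussian mutation: small, stable perturbation (exploitation)
        return x + x * np.random.standard_normal(x.shape)
    # Levy mutation: occasional large jumps (exploration)
    return x + x * levy(size=x.shape)
```

Note the element-wise products x * Gaussian() and x * levy() mirror the ⊗ in Equation (7).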
As Figure 1 shows, the Levy mutation has a stronger perturbation effect than the Gaussian mutation. The occasional large values of the Levy mutation can effectively help the algorithm jump out of a local optimum and keep the diversity of the population. In contrast, the Gaussian mutation is more stable, which improves the local search ability.
The flowchart of the adaptive mutation dynamic search fireworks algorithm (AMdynFWA) is shown in Figure 2.
Algorithm 3 demonstrates the complete version of the AMdynFWA.
Algorithm 3. Pseudo-Code of AMdynFWA
 Randomly initialize m fireworks
 Assess their fitness
 Repeat
  Obtain A_i (except for A_CF)
  Obtain A_CF by Equation (6)
  Obtain S_i
  Produce explosion sparks
  Produce mutation sparks
  Assess all sparks' fitness
  Retain the best spark as a firework
  Select the other m − 1 fireworks randomly
 Until the termination condition is satisfied
 Return the best fitness and its firework location

4. Simulation Results and Analysis

4.1. Simulation Settings

Similar to dynFWA, the number of fireworks in AMdynFWA is set to five, the number of mutation sparks is also set to five, and the maximum number of sparks in each generation is set to 150.
In the experiment, each algorithm is run 51 times on each function, and the final results after 300,000 function evaluations are presented. To verify the performance of the proposed algorithm, we use the CEC2013 test set [16], comprising 28 different types of test functions, which are listed in Table 1. All test function dimensions are set to 30 (d = 30).
Finally, we use the Matlab R2014a software on a PC with a 3.2 GHz CPU (Intel Core i5-3470), 4 GB RAM, and Windows 7 (64 bit).

4.2. Simulation Results and Analysis

4.2.1. Study on the Mutation Probability p

In AMdynFWA, the mutation probability p is introduced to control the probability of selecting the Gaussian and Levy mutations. To investigate the effects of the parameter, we compare the performance of AMdynFWA with different values of p. In this experiment, p is set to 0.1, 0.3, 0.5, 0.7, and 0.9, respectively.
Table 2 gives the computational results of AMdynFWA with different values of p, where 'Mean' is the mean best fitness value. It can be seen that p = 0.5 is suitable for the unimodal problems f1–f5. For f6–f20, p = 0.3 performs better than the other settings. When p is set to 0.1 or 0.9, the algorithm obtains better performance on f21–f28.
The above results demonstrate that the parameter p is problem-dependent: different problems may require different values of p. In this paper, taking the average ranking into account, p = 0.3 is regarded as the most suitable value.

4.2.2. Comparison of AMdynFWA with FWA-Based Algorithms

To assess the performance of AMdynFWA, it is compared with the enhanced fireworks algorithm (EFWA), the adaptive fireworks algorithm (AFWA), and the dynamic search fireworks algorithm (dynFWA). The EFWA, AFWA, and dynFWA parameters are set in accordance with [2], [3], and [4], respectively.
The probability p used in AMdynFWA is set to 0.3. For each test problem, each algorithm runs 51 times with all function dimensions set to 30; the mean errors and the total number of rank-1 results are reported in Table 3.
The results from Table 3 indicate that the total number of rank 1 of AMdynFWA (23) is the best of the four algorithms.
Figure 3 shows a comparison of the average run-time cost in the 28 functions for AFWA, EFWA, dynFWA, and AMdynFWA.
The results from Figure 3 indicate that the average run-time cost of EFWA is the highest among the four algorithms. The time cost of AFWA is the lowest, and the run-time cost of AMdynFWA is almost the same as that of AFWA and lower than that of dynFWA. Taking the results from Table 3 into account, AMdynFWA performs significantly better than the other three algorithms.
To evaluate whether the AMdynFWA results differed significantly from those of EFWA, AFWA, and dynFWA, the mean results of AMdynFWA during the iterations for each test function were compared with those of the other three algorithms. The T test [17], which is safe and robust, was utilized at the 5% level to detect significant differences between these pairwise samples for each test function.
The ttest2 function in Matlab R2014a was used to run the T test, with results shown in Table 4. The null hypothesis is that the results of EFWA, AFWA, and dynFWA are drawn from distributions with the same mean as those of AMdynFWA. To avoid an increase in type I errors, we correct the p-values using Holm's method: the p-values of the three hypotheses under test are ordered from smallest to largest, giving three T tests, and the significance threshold 0.05 is replaced by 0.0167, 0.025, and 0.05 for the smallest, middle, and largest p-value, respectively. The corrected thresholds were then compared with the calculated p-values.
In Table 4, the p-value is the result of the T test. The '+' indicates rejection of the null hypothesis at the 5% significance level, and the '-' indicates acceptance of the null hypothesis.
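The Holm correction described above can be sketched as a step-down procedure; with three hypotheses and α = 0.05 it reproduces the thresholds 0.0167, 0.025, and 0.05 (illustrative code, not the authors' Matlab script):

```python
def holm_reject(p_values, alpha=0.05):
    """Holm step-down procedure: compare the i-th smallest p-value
    against alpha / (m - i) and stop at the first non-rejection."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (m - rank):   # thresholds a/3, a/2, a for m = 3
            reject[i] = True
        else:
            break                               # step-down: stop on first failure
    return reject
```

Stopping at the first non-rejection is what controls the family-wise error rate without being as conservative as a plain Bonferroni correction.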
Table 5 indicates that AMdynFWA shows a large improvement over EFWA on most functions. On the unimodal functions, however, the advantage of AMdynFWA over AFWA and dynFWA is not significant. On the basic multimodal functions and the composition functions, AMdynFWA also shows a large improvement over AFWA and dynFWA.
Figure 4 shows the mean fitness searching curves of the 28 functions for EFWA, AFWA, dynFWA, and AMdynFWA.

4.2.3. Comparison of AMdynFWA with Other Swarm Intelligence Algorithms

In order to measure the relative performance of the AMdynFWA, a comparison among the AMdynFWA and the other swarm intelligence algorithms is conducted on the CEC2013 single objective benchmark suite. The algorithms compared here are described as follows.
(1) Artificial bee colony (ABC) [18]: a powerful swarm intelligence algorithm.
(2) Standard particle swarm optimization (SPSO2011) [19]: the most recent standard version of the famous swarm intelligence algorithm PSO.
(3) Differential evolution (DE) [20]: one of the best evolutionary algorithms for optimization.
(4) Covariance matrix adaptation evolution strategy (CMA-ES) [21]: a well-developed evolutionary algorithm.
The above four algorithms use their default settings. The comparison results of ABC, DE, CMA-ES, SPSO2011, and AMdynFWA are presented in Table 6, where 'Mean error' is the mean error of the best fitness value. ABC beats the other algorithms on 12 functions (some differences are not significant), which is the most, but it performs poorly on the remaining functions. CMA-ES performs extremely well on unimodal functions, but suffers from premature convergence on some complex functions. As Table 7 shows, AMdynFWA ranks in the top three on 22 of the 28 functions, more often than any other algorithm except DE, and in terms of average ranking, AMdynFWA performs the best among the five algorithms on this benchmark suite owing to its stability. DE and ABC take second and third place, respectively. The performances of CMA-ES and SPSO2011 are comparable.

5. Conclusions

AMdynFWA was developed by applying two mutation methods to dynFWA. It selects the Gaussian mutation or the Levy mutation according to the mutation probability. We applied the CEC2013 standard functions to examine the proposed algorithm and compare AMdynFWA with ABC, DE, SPSO2011, CMA-ES, AFWA, EFWA, and dynFWA. The results indicate that AMdynFWA performs significantly better than the other seven algorithms in terms of solution accuracy and stability.
The study on the mutation probability p demonstrates that there is no constant p for all the test problems, while p = 0.3 is regarded as the relatively suitable value for the current test suite. A dynamic p may be a good choice. This will be investigated in future work.

Acknowledgments

The authors are thankful to the anonymous reviewers for their valuable comments to improve the technical content and the presentation of the paper. This paper is supported by the Liaoning Provincial Department of Education Science Foundation (Grant No. L2013064), AVIC Technology Innovation Fund (basic research) (Grant No. 2013S60109R), and the Research Project of Education Department of Liaoning Province (Grant No. L201630).

Author Contributions

Xi-Guang Li participated in the draft writing. Shou-Fei Han participated in the concept, design, and performed the experiments and commented on the manuscript. Liang Zhao, Chang-Qing Gong, and Xiao-Jing Liu participated in the data collection, and analyzed the data.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Tan, Y.; Zhu, Y. Fireworks Algorithm for Optimization. In Advances in Swarm Intelligence, Proceedings of the 2010 International Conference in Swarm Intelligence, Beijing, China, 12–15 June 2010; Springer: Berlin/Heidelberg, Germany, 2010. [Google Scholar]
  2. Zheng, S.; Janecek, A.; Tan, Y. Enhanced fireworks algorithm. In Proceedings of the 2013 IEEE Congress on Evolutionary Computation, Cancun, Mexico, 20–23 June 2013; pp. 2069–2077. [Google Scholar]
  3. Zheng, S.; Li, J.; Tan, Y. Adaptive fireworks algorithm. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation, Beijing, China, 6–11 July 2014; pp. 3214–3221. [Google Scholar]
  4. Zheng, S.; Tan, Y. Dynamic search in fireworks algorithm. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation, Beijing, China, 6–11 July 2014; pp. 3222–3229. [Google Scholar]
  5. Li, X.-G.; Han, S.-F.; Gong, C.-Q. Analysis and Improvement of Fireworks Algorithm. Algorithms 2017, 10, 26. [Google Scholar] [CrossRef]
  6. Tan, Y. Fireworks Algorithm Introduction, 1st ed.; Science Press: Beijing, China, 2015; pp. 13–136. (In Chinese) [Google Scholar]
  7. Gao, H.Y.; Diao, M. Cultural firework algorithm and its application for digital filters design. Int. J. Model. Identif. Control 2011, 4, 324–331. [Google Scholar] [CrossRef]
  8. Andreas, J.; Tan, Y. Using population based algorithms for initializing nonnegative matrix factorization. In Advances in Swarm Intelligence, Proceedings of the 2010 International Conference in Swarm Intelligence, Chongqing, China, 12–15 June 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 307–316. [Google Scholar]
  9. Wen, R.; Mi, G.Y.; Tan, Y. Parameter optimization of local-concentration model for spam detection by using fireworks algorithm. In Proceedings of the 4th International Conference on Swarm Intelligence, Harbin, China, 12–15 June 2013; pp. 439–450. [Google Scholar]
  10. Zheng, S.; Tan, Y. A unified distance measure scheme for orientation coding in identification. In Proceedings of the 2013 IEEE Congress on Information Science and Technology, Yangzhou, China, 23–25 March 2013; pp. 979–985. [Google Scholar]
  11. Pholdee, N.; Bureerat, S. Comparative performance of meta-heuristic algorithms for mass minimisation of trusses with dynamic constraints. Adv. Eng. Softw. 2014, 75, 1–13. [Google Scholar] [CrossRef]
  12. Yang, X.; Tan, Y. Sample index based encoding for clustering using evolutionary computation. In Advances in Swarm Intelligence, Proceedings of the 2014 International Conference on Swarm Intelligence, Hefei, China, 17–20 October 2014; Springer: Berlin/Heidelberg, Germany, 2014; pp. 489–498. [Google Scholar]
  13. Mohamed Imran, A.; Kowsalya, M. A new power system reconfiguration scheme for power loss minimization and voltage profile enhancement using fireworks algorithm. Int. J. Electr. Power Energy Syst. 2014, 62, 312–322. [Google Scholar] [CrossRef]
  14. Zhou, F.J.; Wang, X.J.; Zhang, M. Evolutionary Programming Using Mutations Based on the t Probability Distribution. Acta Electron. Sin. 2008, 36, 121–123. [Google Scholar]
  15. Fei, T.; Zhang, L.Y.; Chen, L. Improved Artificial Fish Swarm Algorithm Mixing Levy Mutation and Chaotic Mutation. Comput. Eng. 2016, 42, 146–158. [Google Scholar]
  16. Liang, J.; Qu, B.; Suganthan, P.; Hernandez-Diaz, A.G. Problem Definitions and Evaluation Criteria for the CEC 2013 Special Session on Real-Parameter Optimization; Technical Report 201212; Zhengzhou University: Zhengzhou, China, January 2013. [Google Scholar]
  17. Teng, S.Z.; Feng, J.H. Mathematical Statistics, 4th ed.; Dalian University of Technology Press: Dalian, China, 2005; pp. 34–35. (In Chinese) [Google Scholar]
  18. Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Glob. Optim. 2007, 39, 459–471. [Google Scholar] [CrossRef]
  19. Zambrano-Bigiarini, M.; Clerc, M.; Rojas, R. Standard particle swarm optimization 2011 at CEC2013: A baseline for future PSO improvements. In Proceedings of the 2013 IEEE Congress on Evolutionary Computation, Cancun, Mexico, 20–23 June 2013; pp. 2337–2344. [Google Scholar]
  20. Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  21. Hansen, N.; Ostermeier, A. Adapting arbitrary normal mutation distributions in evolution strategies: The covariance matrix adaptation. In Proceedings of the 1996 IEEE International Conference on Evolutionary Computation, Nagoya, Japan, 20–22 May 1996; pp. 312–317. [Google Scholar]
Figure 1. The values produced by the Levy mutation and the Gaussian mutation.
Figure 2. The flowchart of AMdynFWA.
Figure 3. The run-time costs of EFWA, AFWA, dynFWA, and AMdynFWA.
Figure 4. The searching curves of EFWA, AFWA, dynFWA, and AMdynFWA; panels (a)–(B) show functions f1–f28, respectively.
Table 1. CEC2013 test set.

Function Type | No. | Function Name | Optimal Value
Unimodal Functions | 1 | Sphere function | −1400
 | 2 | Rotated high conditioned elliptic function | −1300
 | 3 | Rotated bent cigar function | −1200
 | 4 | Rotated discus function | −1100
 | 5 | Different powers function | −1000
Basic Multimodal Functions | 6 | Rotated Rosenbrock's function | −900
 | 7 | Rotated Schaffers F7 function | −800
 | 8 | Rotated Ackley's function | −700
 | 9 | Rotated Weierstrass function | −600
 | 10 | Rotated Griewank's function | −500
 | 11 | Rastrigin's function | −400
 | 12 | Rotated Rastrigin's function | −300
 | 13 | Non-continuous rotated Rastrigin's function | −200
 | 14 | Schwefel's function | −100
 | 15 | Rotated Schwefel's function | 100
 | 16 | Rotated Katsuura function | 200
 | 17 | Lunacek Bi_Rastrigin function | 300
 | 18 | Rotated Lunacek Bi_Rastrigin function | 400
 | 19 | Expanded Griewank's plus Rosenbrock's function | 500
 | 20 | Expanded Schaffer's F6 function | 600
Composition Functions | 21 | Composition function 1 (N = 5) | 700
 | 22 | Composition function 2 (N = 3) | 800
 | 23 | Composition function 3 (N = 3) | 900
 | 24 | Composition function 4 (N = 3) | 1000
 | 25 | Composition function 5 (N = 3) | 1100
 | 26 | Composition function 6 (N = 5) | 1200
 | 27 | Composition function 7 (N = 5) | 1300
 | 28 | Composition function 8 (N = 5) | 1400
Table 2. Mean best fitness values and average rankings achieved by AMdynFWA with different p.

Functions | p = 0.1 | p = 0.3 | p = 0.5 | p = 0.7 | p = 0.9
f1 | −1400 | −1400 | −1400 | −1400 | −1400
f2 | 3.76 × 10^5 | 3.84 × 10^5 | 4.56 × 10^5 | 3.96 × 10^5 | 4.13 × 10^5
f3 | 1.01 × 10^8 | 8.32 × 10^7 | 5.56 × 10^7 | 7.16 × 10^7 | 6.69 × 10^7
f4 | −1099.9872 | −1099.98 | −1099.988 | −1099.9870 | −1099.984
f5 | −1000 | −1000 | −1000 | −1000 | −1000
f6 | −870.38 | −876.05 | −875.5 | −875.29 | −874.71
f7 | −713.59 | −711.45 | −713.69 | −712.66 | −702.99
f8 | −679.069 | −679.057 | −679.052 | −679.063 | −679.067
f9 | −578.503 | −577.189 | −577.75 | −577.436 | −576.518
f10 | −499.976 | −499.968 | −499.968 | −499.972 | −499.974
f11 | −305.44 | −302.436 | −307.02 | −311.215 | −309.596
f12 | −164.688 | −174.843 | −163.865 | −173.722 | −154.561
f13 | −31.9988 | −36.4318 | −35.7453 | −30.6652 | −32.3421
f14 | 2616.647 | 2543.716 | 2676.641 | 2586.064 | 2704.535
f15 | 3664.113 | 3974.245 | 3888.197 | 3946.214 | 3723.16
f16 | 200.3942 | 200.3496 | 200.3884 | 200.3441 | 300.3698
f17 | 437.5601 | 425.8707 | 426.4633 | 424.32 | 428.1304
f18 | 583.18 | 577.8134 | 578.672 | 576.0805 | 573.5208
f19 | 506.931 | 506.5545 | 506.6363 | 507.0156 | 506.3289
f20 | 613.1458 | 613.154 | 613.113 | 613.594 | 613.423
f21 | 1047.089 | 1051.01 | 1016.475 | 1035.483 | 1049.556
f22 | 3871.804 | 3928.667 | 4109.614 | 4059.632 | 4032.769
f23 | 5402.42 | 5574.529 | 5524.135 | 5597.751 | 5338.983
f24 | 1264.25 | 1265.845 | 1265.61 | 1268.231 | 1264.214
f25 | 1390.105 | 1387.764 | 1387.808 | 1390.035 | 1391.654
f26 | 1408.752 | 1412.901 | 1424.752 | 1414.98 | 1412.238
f27 | 2203.579 | 2187.724 | 2192.054 | 2191.372 | 2181.232
f28 | 1812.154 | 1762.647 | 1707.262 | 1771.612 | 1830.575
Average ranking | 2.93 | 2.82 | 2.86 | 3.07 | 2.93
Table 3. Mean errors and total number of rank-1 results achieved by EFWA, AFWA, dynFWA, and AMdynFWA.

Functions | EFWA | AFWA | dynFWA | AMdynFWA
f1 | 7.82 × 10^−2 | 0 | 0 | 0
f2 | 5.43 × 10^5 | 8.93 × 10^5 | 7.87 × 10^5 | 3.84 × 10^5
f3 | 1.26 × 10^8 | 1.26 × 10^8 | 1.57 × 10^8 | 8.32 × 10^7
f4 | 1.09 | 11.5 | 12.8 | 2.02 × 10^−2
f5 | 7.9 × 10^−2 | 6.04 × 10^−4 | 5.42 × 10^−4 | 1.86 × 10^−4
f6 | 34.9 | 29.9 | 31.5 | 23.9
f7 | 1.33 × 10^2 | 9.19 × 10^1 | 1.03 × 10^2 | 8.85 × 10^1
f8 | 2.10 × 10^1 | 2.09 × 10^1 | 2.09 × 10^1 | 2.09 × 10^1
f9 | 3.19 × 10^1 | 2.48 × 10^1 | 2.56 × 10^1 | 2.28 × 10^1
f10 | 8.29 × 10^−1 | 4.73 × 10^−2 | 4.20 × 10^−2 | 3.18 × 10^−2
f11 | 4.22 × 10^2 | 1.05 × 10^2 | 1.07 × 10^2 | 9.75 × 10^1
f12 | 6.33 × 10^2 | 1.52 × 10^2 | 1.56 × 10^2 | 1.25 × 10^2
f13 | 4.51 × 10^2 | 2.36 × 10^2 | 2.44 × 10^2 | 1.63 × 10^2
f14 | 4.16 × 10^3 | 2.97 × 10^3 | 2.95 × 10^3 | 2.64 × 10^3
f15 | 4.13 × 10^3 | 3.81 × 10^3 | 3.71 × 10^3 | 3.87 × 10^3
f16 | 5.92 × 10^−1 | 4.97 × 10^−1 | 4.77 × 10^−1 | 3.4 × 10^−1
f17 | 3.10 × 10^2 | 1.45 × 10^2 | 1.48 × 10^2 | 1.25 × 10^2
f18 | 1.75 × 10^2 | 1.75 × 10^2 | 1.89 × 10^2 | 1.77 × 10^2
f19 | 12.3 | 6.92 | 6.87 | 6.55
f20 | 14.6 | 13 | 13 | 13
f21 | 3.24 × 10^2 | 3.16 × 10^2 | 2.92 × 10^2 | 3.51 × 10^2
f22 | 5.75 × 10^3 | 3.45 × 10^3 | 3.41 × 10^3 | 3.12 × 10^3
f23 | 5.74 × 10^3 | 4.70 × 10^3 | 4.55 × 10^3 | 4.67 × 10^3
f24 | 3.37 × 10^2 | 2.70 × 10^2 | 2.72 × 10^2 | 2.65 × 10^2
f25 | 3.56 × 10^2 | 2.99 × 10^2 | 2.97 × 10^2 | 2.87 × 10^2
f26 | 3.21 × 10^2 | 2.73 × 10^2 | 2.62 × 10^2 | 2.12 × 10^2
f27 | 1.28 × 10^3 | 9.72 × 10^2 | 9.92 × 10^2 | 8.87 × 10^2
f28 | 4.34 × 10^2 | 4.37 × 10^2 | 3.40 × 10^2 | 3.62 × 10^2
Total number of rank 1 | 1 | 4 | 7 | 23
Table 4. T test results (p-values) of AMdynFWA compared with EFWA, AFWA, and dynFWA; significance is shown in parentheses.

Functions | EFWA | AFWA | dynFWA
f1 | 0 (+) | NaN (-) | NaN (-)
f2 | 1.5080 × 10^−32 (+) | 5.1525 × 10^−50 (+) | 2.6725 × 10^−49 (+)
f3 | 0.8004 (-) | 0.4302 (-) | 0.0778 (-)
f4 | 1.5546 × 10^−136 (+) | 1.8922 × 10^−246 (+) | 8.8572 × 10^−235 (+)
f5 | 0 (+) | NaN (-) | NaN (-)
f6 | 1.5957 × 10^−14 (+) | 0.7108 (-) | 0.0139 (+)
f7 | 1.8067 × 10^−36 (+) | 0.5665 (-) | 0.0084 (+)
f8 | 0.1562 (-) | 0.0137 (+) | 9.2522 × 10^−6 (+)
f9 | 7.0132 × 10^−27 (+) | 0.0278 (+) | 6.6090 × 10^−8 (+)
f10 | 2.7171 × 10^−134 (+) | 7.3507 × 10^−6 (+) | 0.0364 (+)
f11 | 2.2083 × 10^−100 (+) | 3.0290 × 10^−10 (+) | 0.0437 (+)
f12 | 1.7319 × 10^−101 (+) | 1.3158 × 10^−11 (+) | 1.8212 × 10^−7 (+)
f13 | 2.3914 × 10^−89 (+) | 4.1645 × 10^−36 (+) | 8.6284 × 10^−37 (+)
f14 | 0.0424 (+) | 0.0117 (+) | 4.4964 × 10^−5 (+)
f15 | 1.1749 × 10^−6 (+) | 0.9976 (-) | 0.6064 (-)
f16 | 2.2725 × 10^−17 (+) | 8.9230 × 10^−12 (+) | 2.3427 × 10^−13 (+)
f17 | 1.5713 × 10^−81 (+) | 7.3257 × 10^−10 (+) | 1.0099 × 10^−6 (+)
f18 | 0.8510 (-) | 0.2430 (-) | 0.1204 (-)
f19 | 3.6331 × 10^−25 (+) | 5.3309 × 10^−6 (+) | 0.0086 (+)
f20 | 3.5246 × 10^−14 (+) | 0.2830 (-) | 0.4615 (-)
f21 | 2.2455 × 10^−6 (+) | 0.0120 (+) | 0.0028 (+)
f22 | 3.2719 × 10^−46 (+) | 0.0634 (-) | 0.0344 (-)
f23 | 2.1191 × 10^−33 (+) | 0.1225 (-) | 0.4819 (-)
f24 | 8.9612 × 10^−69 (+) | 9.0342 × 10^−5 (+) | 6.0855 × 10^−4 (+)
f25 | 1.2812 × 10^−59 (+) | 1.0745 × 10^−6 (+) | 1.6123 × 10^−8 (+)
f26 | 4.6864 × 10^−39 (+) | 2.5440 × 10^−16 (+) | 1.1739 × 10^−11 (+)
f27 | 2.3540 × 10^−46 (+) | 4.8488 × 10^−6 (+) | 2.1456 × 10^−7 (+)
f28 | 6.4307 × 10^−92 (+) | 0.4414 (-) | 0.0831 (-)
Table 5. Total number of significant differences of AMdynFWA compared with EFWA, AFWA, and dynFWA.

Function Type | EFWA | AFWA | dynFWA
Unimodal Functions (f1–f5) | 4 | 2 | 2
Basic Multimodal Functions (f6–f20) | 13 | 10 | 12
Composition Functions (f21–f28) | 8 | 5 | 5
Total | 25 | 17 | 19
Table 6. Mean errors and rankings achieved by ABC, DE, CMA-ES, SPSO2011, and AMdynFWA.

Functions | | ABC | DE | CMA-ES | SPSO2011 | AMdynFWA
f1 | Mean error | 0 | 1.89 × 10^−3 | 0 | 0 | 0
 | Rank | 1 | 2 | 1 | 1 | 1
f2 | Mean error | 6.20 × 10^6 | 5.52 × 10^4 | 0 | 3.38 × 10^5 | 3.84 × 10^5
 | Rank | 5 | 2 | 1 | 3 | 4
f3 | Mean error | 5.74 × 10^8 | 2.16 × 10^6 | 1.41 × 10^1 | 2.88 × 10^8 | 8.32 × 10^7
 | Rank | 5 | 2 | 1 | 4 | 3
f4 | Mean error | 8.75 × 10^4 | 1.32 × 10^−1 | 0 | 3.86 × 10^4 | 2.02 × 10^−2
 | Rank | 5 | 3 | 1 | 4 | 2
f5 | Mean error | 0 | 2.48 × 10^−3 | 0 | 5.42 × 10^−4 | 1.86 × 10^−4
 | Rank | 1 | 4 | 1 | 3 | 2
f6 | Mean error | 1.46 × 10^1 | 7.82 | 7.82 × 10^−2 | 3.79 × 10^1 | 2.39 × 10^1
 | Rank | 3 | 2 | 1 | 5 | 4
f7 | Mean error | 1.25 × 10^2 | 4.89 × 10^1 | 1.91 × 10^1 | 8.79 × 10^1 | 8.85 × 10^1
 | Rank | 5 | 2 | 1 | 3 | 4
f8 | Mean error | 2.09 × 10^1 | 2.09 × 10^1 | 2.14 × 10^1 | 2.09 × 10^1 | 2.09 × 10^1
 | Rank | 1 | 1 | 2 | 1 | 1
f9 | Mean error | 3.01 × 10^1 | 1.59 × 10^1 | 4.81 × 10^1 | 2.88 × 10^1 | 2.28 × 10^1
 | Rank | 4 | 1 | 5 | 3 | 2
f10 | Mean error | 2.27 × 10^−1 | 3.42 × 10^−2 | 1.78 × 10^−2 | 3.40 × 10^−1 | 3.18 × 10^−2
 | Rank | 4 | 3 | 1 | 5 | 2
f11 | Mean error | 0 | 7.88 × 10^1 | 4.00 × 10^2 | 1.05 × 10^2 | 9.75 × 10^1
 | Rank | 1 | 2 | 5 | 4 | 3
f12 | Mean error | 3.19 × 10^2 | 8.14 × 10^1 | 9.42 × 10^2 | 1.04 × 10^2 | 1.25 × 10^2
 | Rank | 4 | 1 | 5 | 2 | 3
f13 | Mean error | 3.29 × 10^2 | 1.61 × 10^2 | 1.08 × 10^3 | 1.94 × 10^2 | 1.63 × 10^2
 | Rank | 4 | 1 | 5 | 3 | 2
f14 | Mean error | 3.58 × 10^−1 | 2.38 × 10^3 | 4.94 × 10^3 | 3.99 × 10^3 | 2.64 × 10^3
 | Rank | 1 | 2 | 5 | 4 | 3
f15 | Mean error | 3.88 × 10^3 | 5.19 × 10^3 | 5.02 × 10^3 | 3.81 × 10^3 | 3.87 × 10^3
 | Rank | 3 | 5 | 4 | 1 | 2
f16 | Mean error | 1.07 | 1.97 | 5.42 × 10^−2 | 1.31 | 3.4 × 10^−1
 | Rank | 3 | 5 | 1 | 4 | 2
f17 | Mean error | 3.04 × 10^1 | 9.29 × 10^1 | 7.44 × 10^2 | 1.16 × 10^2 | 1.25 × 10^2
 | Rank | 1 | 2 | 5 | 3 | 4
f18 | Mean error | 3.04 × 10^2 | 2.34 × 10^2 | 5.17 × 10^2 | 1.21 × 10^2 | 1.77 × 10^2
 | Rank | 4 | 3 | 5 | 1 | 2
f19 | Mean error | 2.62 × 10^−1 | 4.51 | 3.54 | 9.51 | 6.55
 | Rank | 1 | 3 | 2 | 5 | 4
f20 | Mean error | 1.44 × 10^1 | 1.43 × 10^1 | 1.49 × 10^1 | 1.35 × 10^1 | 1.30 × 10^1
 | Rank | 4 | 3 | 5 | 2 | 1
f21 | Mean error | 1.65 × 10^2 | 3.20 × 10^2 | 3.44 × 10^2 | 3.09 × 10^2 | 3.51 × 10^2
 | Rank | 1 | 3 | 4 | 2 | 5
f22 | Mean error | 2.41 × 10^1 | 1.72 × 10^3 | 7.97 × 10^3 | 4.30 × 10^3 | 3.12 × 10^3
 | Rank | 1 | 2 | 5 | 4 | 3
f23 | Mean error | 4.95 × 10^3 | 5.28 × 10^3 | 6.95 × 10^3 | 4.83 × 10^3 | 4.67 × 10^3
 | Rank | 3 | 4 | 5 | 2 | 1
f24 | Mean error | 2.90 × 10^2 | 2.47 × 10^2 | 6.62 × 10^2 | 2.67 × 10^2 | 2.65 × 10^2
 | Rank | 4 | 1 | 5 | 3 | 2
f25 | Mean error | 3.06 × 10^2 | 2.89 × 10^2 | 4.41 × 10^2 | 2.99 × 10^2 | 2.87 × 10^2
 | Rank | 4 | 2 | 5 | 3 | 1
f26 | Mean error | 2.01 × 10^2 | 2.52 × 10^2 | 3.29 × 10^2 | 2.86 × 10^2 | 2.12 × 10^2
 | Rank | 1 | 3 | 5 | 4 | 2
f27 | Mean error | 4.16 × 10^2 | 7.64 × 10^2 | 5.39 × 10^2 | 1.00 × 10^3 | 8.87 × 10^2
 | Rank | 1 | 4 | 2 | 5 | 3
f28 | Mean error | 2.58 × 10^2 | 4.02 × 10^2 | 4.78 × 10^3 | 4.01 × 10^2 | 3.62 × 10^2
 | Rank | 1 | 4 | 5 | 3 | 2
Table 7. Statistics of rank (SR) and average rankings (AR).

SR/AR | ABC | DE | CMA-ES | SPSO2011 | AMdynFWA
Total number of rank 1 | 12 | 5 | 9 | 4 | 5
Total number of rank 2 | 0 | 10 | 3 | 4 | 11
Total number of rank 3 | 4 | 7 | 0 | 9 | 6
Total number of rank 4 | 8 | 4 | 2 | 7 | 5
Total number of rank 5 | 4 | 2 | 14 | 4 | 1
Sum of ranks | 76 | 72 | 93 | 87 | 70
Average ranking | 2.71 | 2.57 | 3.32 | 3.11 | 2.5
