**2. Related Work**

The CS algorithm finds good solutions by continually replacing poor cuckoos in the population with new, potentially better ones, and it has been applied successfully in diverse fields. Recently, many CS variants have been developed to improve its performance. These variants can be broadly divided into four categories: (1) parameter control [70]; (2) novel learning schemes [76]; (3) hybrid methods with other algorithms [74]; and (4) local search operators [77].

Because the control parameters strongly influence performance, much meaningful work has addressed the parameter settings of the CS algorithm. Initially, step-size control was investigated. For instance, to remedy the basic CS algorithm's limited accuracy and slow convergence in the later period, Ma et al. [78] proposed a self-adaptive step-size cuckoo search algorithm (ASCS), which adjusts the step size according to the distance between a cuckoo's nest and the optimal nest, thereby accelerating convergence and improving calculation accuracy. To balance exploration and exploitation, Li and Yin [79] introduced two mutation rules, based on random and best individuals among the entire population, and combined them using a linearly decreasing probability; a self-adaptive parameter strategy then adjusts the two newly added parameters according to their relative success counts in the previous iterations, which enhances population diversity. Experimental results show that the method outperforms twelve algorithms from the literature. Yang et al. [80] defined two factors, a speed factor and an aggregation factor, and used them to regulate the step size and the discovery probability. Experimental results show that the CS variant with improved step size and discovery probability is strongly competitive on numerical optimization problems.
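The Lévy-flight update and the distance-based step adaptation just described can be sketched as follows. This is a minimal illustration under stated assumptions: the proportional schedule, the constant 0.5, and all function names are illustrative choices, not the published ASCS settings of [78].

```python
import math
import numpy as np

def levy_step(dim, beta=1.5, rng=None):
    # Mantegna's algorithm: draw a heavy-tailed step with Levy exponent beta.
    rng = rng or np.random.default_rng()
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def adaptive_cs(f, lo, hi, dim=2, n_nests=15, iters=200, pa=0.25, seed=0):
    # Cuckoo search where each nest's step size is proportional to its
    # distance from the current best nest, so nests near the optimum take
    # small refining steps (an illustrative version of the ASCS idea).
    rng = np.random.default_rng(seed)
    nests = rng.uniform(lo, hi, (n_nests, dim))
    fit = np.array([f(x) for x in nests])
    for _ in range(iters):
        best = nests[fit.argmin()]
        for i in range(n_nests):
            # Step scale shrinks as the nest approaches the best nest.
            alpha = 0.5 * np.linalg.norm(nests[i] - best) / math.sqrt(dim)
            trial = np.clip(nests[i] + alpha * levy_step(dim, rng=rng), lo, hi)
            ft = f(trial)
            if ft < fit[i]:  # greedy replacement of the worse nest
                nests[i], fit[i] = trial, ft
        # A fraction pa of the worst nests is abandoned and rebuilt at random
        # (discovery of cuckoo eggs by host birds).
        n_drop = max(1, int(pa * n_nests))
        worst = np.argsort(fit)[-n_drop:]
        nests[worst] = rng.uniform(lo, hi, (n_drop, dim))
        fit[worst] = [f(x) for x in nests[worst]]
    return nests[fit.argmin()], fit.min()
```

Note that the best nest has zero distance to itself and therefore never moves, which gives implicit elitism; the other nests contract toward it with occasional large Lévy jumps preserving exploration.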

Li et al. [65] proposed an enhanced CS algorithm, dynamic CS with Taguchi opposition-based search and dynamic evaluation. The Taguchi search strategy provides randomized generalized learning based on opposing relationships to enhance the exploration ability of the algorithm, while the dynamic evaluation strategy reduces the number of function evaluations and accelerates convergence. Statistical comparisons of experimental results showed that the proposed algorithm achieves an appropriate trade-off between exploration and exploitation. Li et al. [81] proposed a CS extension based on self-adaptive knowledge learning, which introduces a learning model combining individual historical knowledge and population knowledge into the CS algorithm; individuals continually adjust their positions according to historical knowledge and communicate with one another during optimization. Statistical comparisons of the experimental results showed that the proposed algorithm is a competitive new algorithm. Hojjat et al. [75] presented a new CS algorithm, called snap-drift cuckoo search (SDCS), which first employs a learning strategy and then applies improved search operators. The snap-drift learning strategy provides an online trade-off between local and global search via two modes: snap mode increases global search to prevent the algorithm from being trapped in local minima, while drift mode reinforces local search to raise the convergence rate. Statistical comparisons of experimental results showed that SDCS is superior to modified CS algorithms in terms of convergence speed and robustness.
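The snap/drift alternation can be pictured as a mode switch driven by recent progress. The sketch below is only an illustration of that idea: the function name, stagnation window, and step scales are hypothetical assumptions, not the actual SDCS rules of [75].

```python
def snap_drift_scale(best_history, window=5, snap=1.0, drift=0.05):
    # Hypothetical mode switch inspired by snap-drift learning:
    # if the best objective value has not improved over the last `window`
    # records, "snap" to a large step scale for global search; otherwise
    # "drift" with a small scale to refine the current basin.
    if len(best_history) > window and best_history[-1] >= best_history[-1 - window]:
        return snap   # stagnation detected -> explore globally
    return drift      # still improving -> exploit locally
```

In an outer optimization loop, the returned scale would multiply the search step each generation, so the algorithm alternates between the two modes online rather than committing to a fixed schedule.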

Based on random and best individuals among the entire population, Cheng et al. [82] proposed an ensemble CS variant in which three different cuckoo search algorithms coexist throughout the search process and compete to produce better offspring for numerical optimization; an external archive is introduced to further maintain population diversity. Statistical comparisons of experimental results showed that the improved CS variant is superior to modified CS algorithms in terms of convergence speed and robustness. Wen et al. [83] proposed a new hybrid algorithm based on the grey wolf optimizer and cuckoo search (GWOCS), developed to extract the parameters of different PV cell models from experimental data under different operating conditions. Zhang et al. [84] proposed an ensemble CS variant that divides the population into two subgroups and applies CS and DE to the two subgroups independently; through this division the subgroups exchange useful information, so the two algorithms exploit each other's advantages to offset their shortcomings, balancing solution quality against computational cost. Zhang et al. [85] devised a hybridization of CS and the covariance matrix adaptation evolution strategy (CMA-ES) to improve performance on different optimization problems; computational results demonstrate that the proposed algorithm outperforms its competitors. Tang et al. [86] introduced the Gaussian, Cauchy, Lévy, and uniform distributions, improving the performance of the cuckoo search algorithm through pairwise combination; simulation results show that the hybrid of the Cauchy and Lévy distributions makes the CS algorithm perform best.
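The two-subgroup division can be sketched as below. This is an assumption-laden illustration of the general idea in [84], not the published algorithm: the concrete operators (a Cauchy-tailed walk standing in for the Lévy flight, DE/rand/1/bin), the parameter values, and the best-for-worst exchange rule are all illustrative choices.

```python
import numpy as np

def hybrid_cs_de(f, lo, hi, dim=2, pop_size=20, iters=150, seed=0):
    # Two-subgroup hybrid sketch: the first half of the population evolves by
    # CS-style heavy-tailed walks toward the global best, the second half by
    # DE/rand/1/bin, and each generation the subgroups share their best members.
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(x) for x in pop])
    half = pop_size // 2
    for _ in range(iters):
        best = pop[fit.argmin()]
        # --- CS subgroup: heavy-tailed walk biased toward the global best ---
        for i in range(half):
            step = rng.standard_cauchy(dim) * 0.01 * (hi - lo)  # heavy-tailed proxy for a Levy step
            trial = np.clip(pop[i] + step + 0.5 * (best - pop[i]), lo, hi)
            ft = f(trial)
            if ft < fit[i]:
                pop[i], fit[i] = trial, ft
        # --- DE subgroup: DE/rand/1/bin with greedy selection ---
        for i in range(half, pop_size):
            idx = [j for j in range(half, pop_size) if j != i]
            a, b, c = rng.choice(idx, 3, replace=False)
            mutant = pop[a] + 0.5 * (pop[b] - pop[c])
            cross = rng.random(dim) < 0.9
            cross[rng.integers(dim)] = True  # guarantee at least one mutant gene
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            ft = f(trial)
            if ft < fit[i]:
                pop[i], fit[i] = trial, ft
        # --- information exchange: each subgroup's best replaces the other's worst ---
        cs_best = int(np.argmin(fit[:half]))
        de_best = half + int(np.argmin(fit[half:]))
        cb, cf = pop[cs_best].copy(), fit[cs_best]
        db, df = pop[de_best].copy(), fit[de_best]
        w_cs = int(np.argmax(fit[:half]))
        w_de = half + int(np.argmax(fit[half:]))
        pop[w_cs], fit[w_cs] = db, df
        pop[w_de], fit[w_de] = cb, cf
    return pop[fit.argmin()], fit.min()
```

The exchange step is what couples the two searches: DE's fine-grained recombination feeds refined solutions to the CS half, while CS's heavy-tailed jumps inject diversity into the DE half.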

With respect to applications, CS has been applied extensively in many domains, such as neural networks [87], image processing [88], nonlinear systems [89,90], network structural optimization [91], agriculture optimization [92], engineering optimization [93], and scheduling [94]. These applications indicate that the CS algorithm is an effective and efficient optimizer for solving real-world problems.
