Article

MFO-SFR: An Enhanced Moth-Flame Optimization Algorithm Using an Effective Stagnation Finding and Replacing Strategy

by Mohammad H. Nadimi-Shahraki 1,2,*, Hoda Zamani 1,2, Ali Fatahi 1,2 and Seyedali Mirjalili 3,4,*
1 Faculty of Computer Engineering, Najafabad Branch, Islamic Azad University, Najafabad 8514143131, Iran
2 Big Data Research Center, Najafabad Branch, Islamic Azad University, Najafabad 8514143131, Iran
3 Centre for Artificial Intelligence Research and Optimisation, Torrens University Australia, Brisbane 4006, Australia
4 Yonsei Frontier Lab, Yonsei University, Seoul 03722, Republic of Korea
* Authors to whom correspondence should be addressed.
Mathematics 2023, 11(4), 862; https://doi.org/10.3390/math11040862
Submission received: 2 December 2022 / Revised: 22 January 2023 / Accepted: 3 February 2023 / Published: 8 February 2023

Abstract

Moth-flame optimization (MFO) is a prominent problem solver with a simple structure that is widely used to solve different optimization problems. However, MFO and its variants inherently suffer from poor population diversity, leading to premature convergence to local optima and losses in the quality of its solutions. To overcome these limitations, an enhanced moth-flame optimization algorithm named MFO-SFR was developed to solve global optimization problems. The MFO-SFR algorithm introduces an effective stagnation finding and replacing (SFR) strategy to effectively maintain population diversity throughout the optimization process. The SFR strategy can find stagnant solutions using a distance-based technique and replaces them with a selected solution from the archive constructed from the previous solutions. The effectiveness of the proposed MFO-SFR algorithm was extensively assessed in 30 and 50 dimensions using the CEC 2018 benchmark functions, which simulated unimodal, multimodal, hybrid, and composition problems. Then, the obtained results were compared with two sets of competitors. In the first comparative set, the MFO algorithm and its well-known variants, specifically LMFO, WCMFO, CMFO, ODSFMFO, SMFO, and WMFO, were considered. Five state-of-the-art metaheuristic algorithms, including PSO, KH, GWO, CSA, and HOA, were considered in the second comparative set. The results were then statistically analyzed through the Friedman test. Ultimately, the capacity of the proposed algorithm to solve mechanical engineering problems was evaluated with two problems from the latest CEC 2020 test-suite. The experimental results and statistical analysis confirmed that the proposed MFO-SFR algorithm was superior to the MFO variants and state-of-the-art metaheuristic algorithms for solving complex global optimization problems, with 91.38% effectiveness.

1. Introduction

Global optimization problems are complex and characterized by various properties; for instance, they can be non-linear, non-separable, symmetric, asymmetrical, smooth with narrow ridges, unimodal, and multimodal, and can involve non-differentiable functions and high dimensionality [1,2]. These properties create challenges for existing optimization algorithms, and finding the global optimum is one of the long-standing goals in this area of study. To overcome such challenges, a series of metaheuristic algorithms have been introduced using various innovative approaches. Metaheuristic algorithms have exhibited impressive performance in exploring the problem space and approximating the promising regions in reasonable timeframes. They have been widely improved upon and adapted to solve optimization problems in diverse fields such as computer science [3,4], engineering [5,6], and medicine [7,8,9]. Metaheuristic algorithms can be classified into two groups: single-solution-based and population-based algorithms [10,11]. Single-solution-based metaheuristic algorithms are more oriented towards exploitation searches and manipulate a single solution during the optimization process, which increases their potential to become stuck in local optima [12]. To address this challenge, population-based metaheuristic algorithms were developed to be more exploration-oriented and to share information in order to promote significant diversification in the search space [13,14]. Based on the source of inspiration, these algorithms can be classified as evolutionary-based, physics-based, human-based, and swarm intelligence-based algorithms [15,16].
Evolutionary-based algorithms involve a heuristic approach inspired by the biological evolution of species, such as animals, insects, and plants in nature [17,18]. Some prominent optimizers in this group are genetic algorithms [19], differential evolution [20], and the evolution strategy [21]. Physics-based algorithms are defined based on the main concepts of mathematics and physics, such as quantum physics [22,23,24], gravity [25,26], and optics [27], with the aim of performing a meaningful search in the problem space. Human-based algorithms simulate various human activities in order to generate innovative solutions in solving optimization problems. The imperialist competitive algorithm [28], the harmony search algorithm [29], teaching learning-based optimization [30], brain storm optimization (BSO) [31], the soccer league competition algorithm [32], the volleyball premier league algorithm [33], poor and rich optimization (PRO) [34], and past present future (PPF) [35] are some of the state-of-the-art optimizers in this group. Swarm intelligence-based optimization algorithms originated from the collective and self-organized behavior of unsophisticated agents such as insects, terrestrial animals, fish, and birds [36,37]. Ant colony optimization [38] and particle swarm optimization [39] were the most successful swarm intelligence-based optimization algorithms proposed in the 1990s. From the 21st century onwards, some new algorithms have been put forward in this group, such as artificial bee colony (ABC) [40], cuckoo search (CS) [41], the whale optimization algorithm (WOA) [42], elephant herding optimization (EHO) [43], moth-flame optimization (MFO) [44], the horse herd optimization algorithm (HOA) [45], the quantum-based avian navigation optimizer algorithm (QANA) [46], the African vultures optimization algorithm [47], farmland fertility [48], dwarf mongoose optimization (DMO) [49], the starling murmuration optimizer (SMO) [50], and the artificial gorilla troops optimizer [51].
Most population-based metaheuristic algorithms lack mechanisms to maintain population diversity, which leads to an imbalance between search strategies and premature convergence problems. Hence, many effective mechanisms have been proposed to alleviate the weaknesses of these algorithms [52,53]. The artificial bee colony algorithm (ABC) is a prominent population-based metaheuristic algorithm that suffers from poor local search performance. To address this, Zhu et al. [54] proposed the Gbest-guided ABC (GABC) algorithm to incorporate information on the global best solution into the search strategy in order to improve the algorithm's exploitation ability. Other algorithms that have achieved significant performance improvements in terms of their local search ability are the quick artificial bee colony (qABC), best-so-far ABC [55], and grey artificial bee colony (GABC) algorithms [56]. Nadimi-Shahraki et al. [57] introduced a diversity-maintained multi-trial vector-differential evolution algorithm to increase population diversity and reduce the risk of premature convergence during the evolutionary process.
The moth-flame optimization (MFO) algorithm was inspired by the navigation behavior of moths toward a light source in nature and is used to solve global optimization problems. The MFO algorithm benefits from having a straightforward structure and a small number of control parameters, which increases its versatility. However, the MFO algorithm suffers from problems related to low population diversity [58], which leads it to become stuck in unpromising regions and to achieve low-quality solutions. Many MFO variants have been developed by introducing and hybridizing different search strategies and operators to overcome such challenges. Kaur et al. [59] proposed an enhanced moth flame optimization (E-MFO) method to solve global optimization problems. The E-MFO algorithm applied a Cauchy distribution function and the influence of the best flame parameter to enhance its exploration and exploitation capabilities, respectively. Moreover, an adaptive step size and division of iterations were proposed to balance search strategies. Li et al. [60] presented the Lévy-flight moth-flame optimization (LMFO) algorithm to prevent premature convergence into local optima and enable a trade-off between the algorithm’s exploration and exploitation abilities during the search process. Khalilpourazari et al. [61] introduced the WCMFO algorithm, which is a hybridized form of two algorithms, the water cycle and moth-flame optimization algorithms, to increase the exploitation ability of MFO and the exploration ability of the water cycle algorithm. To cope with the weaknesses of MFO, Hongwei et al. [62] proposed chaos-enhanced moth-flame optimization (CMFO) using ten chaotic maps. The chaotic maps are applied in population initialization, boundary handling, and the tuning of the distance parameter. Other variants of MFO are sine-cosine moth-flame optimization (SMFO) [63], combining MFO with Gaussian, Cauchy, and Lévy mutations (LGCMFO) [64], the enhancement of the local search mechanism based on shuffled frog leaping and a death mechanism with MFO (ODSFMFO) [65], and the chaotic local search and Gaussian mutation-enhanced MFO (CLSGMFO) approach [66].
Although the mentioned MFO variants have attained effective modifications in performance, they may still suffer from poor population diversity, which leads to premature convergence to local optima and a decrease in the quality of the algorithms’ solutions when tackling complex optimization problems. Moreover, due to the approximate nature of metaheuristic algorithms, there is always an opportunity for improvement in their search strategies. Therefore, this study was devoted to proposing an enhanced moth-flame optimization algorithm named MFO-SFR with the aim of solving global optimization problems. The proposed MFO-SFR algorithm is equipped with an effective stagnation finding and replacing (SFR) strategy to establish diversity throughout the search process and overcome the drawbacks of previous MFO approaches. Moreover, the boundary handling of the MFO algorithm is rectified by generating new random solutions in the range of the problem space. Overall, the main contributions of this study can be summarized as follows.
  • We propose the MFO-SFR algorithm, boosting the performance and enriching the diversity of the canonical MFO;
  • We introduce an effective stagnation finding and replacing (SFR) strategy to boost the performance of the search process; and
  • We introduce an archive to incorporate the representative and the global best flames throughout the search process in order to enrich the diversity.
The performance of the proposed MFO-SFR algorithm was assessed with the CEC 2018 test functions [67] in 30 and 50 dimensions. Then, the MFO-SFR algorithm was compared with two sets of MFO variants and well-known optimizers. In the first set of contender algorithms, the canonical MFO [44] and its variants, namely Lévy-flight moth-flame optimization (LMFO) [60], an efficient hybrid algorithm based on the water cycle and moth-flame algorithms (WCMFO) [61], chaos-enhanced moth-flame optimization (CMFO) [62], death mechanism-based moth-flame optimization (ODSFMFO) [65], the synthesis of the moth-flame optimizer with sine cosine mechanisms (SMFO) [63], and the hybrid of whale and moth-flame optimization (WMFO) [68], were selected. In the second set, particle swarm optimization (PSO) [39], krill herd (KH) [69], grey wolf optimization (GWO) [70], the crow search algorithm (CSA) [71], and the horse herd optimization algorithm (HOA) [45] were considered. Furthermore, the results obtained using the proposed and contender algorithms were statistically analyzed using the Friedman test. Ultimately, two well-known mechanical engineering problems from the CEC 2020 test suite [72] were considered to assess the applicability of MFO-SFR in solving real-world optimization problems. The experimental results indicated that the proposed MFO-SFR algorithm boosted the performance of the canonical MFO by using an effective stagnation finding and replacing (SFR) strategy and an archive construction mechanism. Moreover, the statistical analysis revealed that the performance of the proposed MFO-SFR algorithm was superior to that of the contender algorithms.
The structure of the paper is as follows. Section 2 contains a review of related literature. Section 3 presents the MFO algorithm. In Section 4, the proposed MFO-SFR algorithm is explained in detail. Section 5 thoroughly evaluates MFO-SFR's performance on the CEC 2018 benchmark test functions. Section 6 evaluates the applicability of the proposed MFO-SFR using two real-world mechanical engineering problems from the latest CEC 2020 test suite. Finally, Section 7 summarizes the results and outlines possible future directions of research.

2. Related Works

MFO variants used to solve different optimization problems are reviewed in this section.
Li et al. [60] boosted the performance of the canonical MFO by using the Lévy-flight strategy. Nadimi-Shahraki et al. [73] proved that the canonical MFO suffers from premature convergence, low population diversity, and an imbalance between search strategies in solving global optimization problems. Therefore, they proposed an improved moth-flame optimization (I-MFO) algorithm to cope with the abovementioned deficiencies. The I-MFO algorithm is equipped with the adapted wandering-around search strategy to maintain population diversity and escape from local optima. The chaos-enhanced MFO (CMFO) [62] algorithm was proposed to improve the performance of the MFO algorithm by incorporating chaos maps into population initialization, boundary handling, and parameter tuning. Pelusi et al. [74] proposed the improved moth-flame optimization (IMFO) algorithm using a hybrid phase, a dynamic crossover mechanism, and a fitness-dependent weight factor. The hybrid phase achieved a good trade-off between the exploration and exploitation phases, the dynamic crossover mechanism enhanced the population diversity, and the fitness-dependent weight factor improved the exploitation phase.
Xu et al. [64] proposed a series of MFO variants by combining the standard MFO algorithm with Gaussian mutation, Cauchy mutation, and Lévy mutation. Gaussian mutation was employed to improve its neighborhood-informed capability, Cauchy mutation was used to enhance its global exploration ability, and the Lévy mutation was employed to increase the randomness in the search process. Li et al. [65] proposed the ODSFMFO algorithm, which consists of an improved flame generation mechanism based on opposition-based learning and the differential evolution algorithm, an enhanced local search mechanism based on the shuffled frog leaping algorithm and a death mechanism. This algorithm maintained the quality of the population through opposition-based learning, population diversity using the differential evolution algorithm, the global search ability through the use of the shuffled frog leaping algorithm, and provided an escape from local optima via the use of the death mechanism. Nadimi-Shahraki et al. [75] proposed a migration-based moth–flame optimization (M-MFO) algorithm with a random migration operator, a guided migration operator, and a guiding archive to alleviate the low population diversity and poor exploration ability of MFO.
Ma et al. [76] developed an improved moth-flame optimization algorithm to prevent premature convergence to local minima. This algorithm uses the inertia weight of diversity feedback control to strike a balance between search strategies and maintain population diversity. Moreover, the mutation probability was added to improve the optimization performance. To enhance the diversity in the position of flames and the search strategy used for moths, Zhao et al. [77] developed an improved MFO (IMFO) algorithm. In this algorithm, the flames are generated through orthogonal opposition-based learning, and their positions are updated using a linear search and a mutation operator. Sapre et al. [78] introduced an opposition-based moth flame optimization method with Cauchy mutation and evolutionary boundary constraint handling (OMFO) to bypass the local optima and accelerate the convergence speed towards promising areas. Sahoo et al. [79] proposed a modified dynamic-opposite-learning-based MFO algorithm named m-MFO, using a modified dynamic-opposite learning strategy to enrich the performance of MFO in solving optimization problems. Other MFO variants include the double-evolutionary learning MFO algorithm (DELMFO) [80], the improved moth-flame optimization algorithm (IMFO) [81], the hybrid MFO and hill climbing (MFOHC) method [82], an enhanced MFO algorithm integrated with orthogonal learning and the Broyden–Fletcher–Goldfarb–Shanno (BFGSOLMFO) method [83], and quantum-behaved simulated annealing algorithm-based moth-flame optimization (QSMFO) [84].
Due to the simple structure of MFO and its low number of control parameters, it has great potential to solve real-world applications. However, the canonical MFO critically suffers from local optimum trapping and premature convergence during the optimization process, which results in low-quality solutions [85,86,87]. Therefore, many improved and hybrid variants have been developed to overcome these challenges. Sayed et al. [88] presented the SA-MFO algorithm, a hybrid of the MFO approach and the simulated annealing (SA) algorithm, to escape from local optima using SA and accelerate the search process using MFO. Many researchers have applied the MFO algorithm to solve the optimal power flow (OPF) problem [89,90,91]. An effective hybridization of the whale optimization algorithm and a modified moth-flame optimization algorithm named WMFO [68] was proposed to solve diverse scales of the OPF problem. Sahoo et al. [92] proposed a hybrid MFO and butterfly optimization algorithm (h-MFOBOA) to overcome shortcomings such as a slow convergence speed and poor exploitation ability in both optimizers. Sattar Khan et al. [93] adapted the MFO algorithm for an integrated power plant system containing stochastic wind. MFO has been applied to the solution of problems related to fuel cells in a renewable active distribution network [94], the identification of parameters for photovoltaic modules [95], and fuel consumption in variable-cycle engines [96], with promising results.

3. Moth-Flame Optimization (MFO) Algorithm

Nocturnal moths use celestial light sources to navigate over long distances accurately. They fly in a straight line with a constant angle toward the Moon or stars, and this behavior is called transverse orientation. However, when a moth flies toward a nearby artificial light, it mistakes it for the Moon or a star. Therefore, the moth continually changes its flight angle to keep going in a straight line toward the light, resulting in a spiral motion around the artificial light. In 2015, this behavior was mathematically modeled in the moth-flame optimization algorithm [44] developed by Mirjalili to solve global optimization problems, as described in detail below.
In this approach to solving the optimization problem, the positions of moths evolve over a predefined number of iterations. In the first iteration, moths are randomly distributed in the problem space using Equation (1), where $X_i^d$ denotes the dth dimension of the ith moth position and the parameters $Ub_d$ and $Lb_d$ are the upper and lower boundaries of the dth dimension, respectively.
$X_i^d = rand_{i,d} \times (Ub_d - Lb_d) + Lb_d, \quad 1 \le d \le D$  (1)
For the rest of the iterations, their new positions are updated based on the position of the flame. Therefore, the flame number (R) is computed using Equation (2), where the parameters N and MaxIterations denote the number of moths and the maximum number of iterations, respectively. Then, the positions of the flames are determined based on the stepwise procedure denoted in Table 1.
$R = round\left(N - t \times \frac{N - 1}{MaxIterations}\right)$  (2)
Ultimately, given the flame number (R), each moth updates its position using one of the two cases denoted in Equation (3), where $X_i(t+1)$ is the new position of the ith moth, $D_i(t)$ is computed using Equation (4), b is a constant, k is calculated using Equations (5) and (6), and $F_i(t)$ denotes the ith flame.
$X_i(t+1) = \begin{cases} D_i(t) \times e^{bk} \times \cos(2\pi k) + F_i(t), & i \le R \\ D_i(t) \times e^{bk} \times \cos(2\pi k) + F_R(t), & i > R \end{cases}$  (3)
$D_i(t) = |F_i(t) - X_i(t)|$  (4)
$k = (a - 1) \times rand(0, 1) + 1$  (5)
$a = -1 + t \times \frac{-1}{MaxIterations}$  (6)
In the second case (when i > R), the parameter $D_i(t)$ is computed using Equation (7), where $F_R(t)$ is the current position of the Rth flame.
$D_i(t) = |F_R(t) - X_i(t)|$  (7)
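For readers who prefer code, the update rules of Equations (1)-(7) can be summarized in the following minimal Python sketch. It is illustrative only: the function and variable names are ours, the flame-construction step of Table 1 is reduced to sorting the best positions found so far, and boundary handling is simplified to clipping.

import numpy as np

def mfo_step(moths, flames, t, max_iter, lb, ub, b=1.0):
    """One iteration of the canonical MFO position update (Equations (2)-(7))."""
    N, D = moths.shape
    # Equation (2): the number of flames decreases linearly over the iterations.
    R = round(N - t * (N - 1) / max_iter)
    # Equation (6): the convergence constant a decreases linearly from -1 to -2.
    a = -1 + t * (-1 / max_iter)
    new_moths = np.empty_like(moths)
    for i in range(N):
        # Equation (5): k drawn in [a, 1] (per dimension in this sketch).
        k = (a - 1) * np.random.rand(D) + 1
        # Equations (4) and (7): distance to the paired flame (i <= R) or to flame R.
        flame = flames[i] if i < R else flames[R - 1]
        dist = np.abs(flame - moths[i])
        # Equation (3): logarithmic spiral flight around the selected flame.
        new_moths[i] = dist * np.exp(b * k) * np.cos(2 * np.pi * k) + flame
    return np.clip(new_moths, lb, ub)

# Equation (1): random initialization within [lb, ub], plus one example step.
lb, ub, N, D = -100.0, 100.0, 30, 10
moths = np.random.rand(N, D) * (ub - lb) + lb
fitness = lambda x: np.sum(x**2)                       # example objective (sphere)
flames = moths[np.argsort([fitness(m) for m in moths])]
moths = mfo_step(moths, flames, t=1, max_iter=100, lb=lb, ub=ub)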

4. The Proposed MFO-SFR Algorithm

According to the literature, the canonical MFO lacks an efficient operator to maintain population diversity. The search process may be biased by the best solutions obtained in each iteration [97]. This deficiency leads to premature convergence into unpromising regions, local optimum stagnation, and a decrease in the solution quality when solving complex problems. Hence, in this study, we were motivated to propose an enhanced moth-flame optimization algorithm named MFO-SFR to effectively maintain population diversity and mitigate the deficiencies mentioned above by introducing an effective stagnation finding and replacing (SFR) strategy.
Stagnation finding and replacing (SFR) strategy: Suppose that the matrix X(t) = {X_1^D(t), ..., X_i^D(t), ..., X_N^D(t)} denotes a moth population in the current iteration t in a D-dimensional search space. Each vector X_i^D(t) denotes the position of the ith moth in the problem space. The matrix X(t) is initialized for the first iteration using a uniform random distribution. For the rest of the iterations (when t ≥ 2), the new positions of the moths are determined using Equation (8), where $D_i^{\alpha}(t)$ and $D_i^{\beta}(t)$ are the main elements of the SFR strategy, computed using Equations (9) and (10), respectively. The constant b expresses the shape of the logarithmic spiral, and τ is a random number in the interval [−1, 1]. $F_j(t)$ and $F_R(t)$ are the positions of the jth flame and the Rth flame, where the parameter R is computed using Equation (2). In Equation (9), the vector $M_i(t)$ is determined using Definition 1. To find the stagnant solutions, the mean distance $\varphi_i$ is calculated using Equation (11), where $X_{iq}$ is the qth dimension of the ith moth and $F_{jq}$ is the qth dimension of the jth flame. The index j is determined by Equation (12), which sorts the results obtained from Equation (11) in descending order to obtain the indexes and then applies them as flame indexes in Equation (10).
$X_i(t+1) = \begin{cases} D_i^{\alpha}(t) \times e^{b\tau} \times \cos(2\pi\tau) + F_j(t), & \text{if } i \le R(t) \\ D_i^{\beta}(t) \times e^{b\tau} \times \cos(2\pi\tau) + F_R(t), & \text{else} \end{cases}$  (8)
$D_i^{\alpha}(t) = |F_j(t) - M_i(t)|$  (9)
$D_i^{\beta}(t) = \begin{cases} |F_j(t) - X_i(t)|, & \varphi_i > 0 \\ \text{a random position selected from the archive } Arc, & \varphi_i = 0 \end{cases}$  (10)
$\{\varphi_1, \ldots, \varphi_i, \ldots, \varphi_N\}, \quad \varphi_i = \frac{1}{D} \times \sum_{q=1}^{D} |F_{jq}(t) - X_{iq}(t)|$  (11)
$\{\varphi_1, \ldots, \varphi_j, \ldots, \varphi_N\} = Sort(\{\varphi_1, \ldots, \varphi_i, \ldots, \varphi_N\})$  (12)
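The following sketch illustrates one plausible reading of Equations (9)-(12). The names are ours; memory_M stands for the memory of Definition 1, archive is a Python list of stored positions, and the φi = 0 branch of Equation (10) is interpreted as measuring the distance to a randomly selected archive entry in place of the stagnant moth.

import numpy as np

def sfr_distances(moths, flames):
    """Equations (11)-(12): mean per-dimension distances and their descending sort."""
    N = moths.shape[0]
    # phi_i = (1/D) * sum_q |F_q - X_iq|, computed here against the i-th flame.
    phi = np.mean(np.abs(flames[:N] - moths), axis=1)      # Equation (11)
    flame_index = np.argsort(-phi)                         # Equation (12), descending order
    return phi, flame_index

def sfr_distance_term(i, moths, flames, memory_M, archive, phi, flame_index, R):
    """Equations (9)-(10): distance term fed into the spiral update of Equation (8)."""
    j = flame_index[i]
    if i < R:                                              # Equation (9), i <= R (1-indexed)
        return np.abs(flames[j] - memory_M[i])
    if phi[i] > 0 or not archive:                          # Equation (10), non-stagnant moth
        return np.abs(flames[j] - moths[i])
    # Stagnant moth (phi_i = 0): use a randomly selected archive position instead of X_i.
    replacement = archive[np.random.randint(len(archive))]
    return np.abs(flames[j] - replacement)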
Definition 1.
(Archive construction): The main idea behind archive construction is to enrich the population diversity by preserving the generated representative flame and to boost the convergence of solutions toward promising areas by preserving the best solutions in each iteration. To construct the archive Arc, consider the matrix M = {M_1, ..., M_i, ..., M_κ} as the memory of the Arc with a predefined maximum size κ. Each M_i = [m_i1, m_i2, ..., m_iD] denotes a position vector of this memory, which is generated using Algorithm 1. First, dualPop and dualFit are created based on the flame construction process described in Table 1. Then, the representative flame (RF), defined as the average of the flames' positions, is computed using Equation (13), where C is the total number of considered flames and $F_{id}$ denotes the dth dimension of the ith flame. Finally, the global best flame and the RF position are archived as two new entries in the memory M. When these new entries are inserted and the memory is full, they replace two randomly selected existing entries.
$RF_d(t) = \frac{1}{C} \sum_{i=1}^{C} F_{id}(t)$  (13)
In addition, MFO-SFR checks the feasibility of the new moth positions and returns any that violate the problem space boundaries by generating random positions within the range of the problem space.
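A minimal sketch of this boundary-handling rule follows (the helper name correct_violations is ours; canonical MFO implementations typically clamp violating coordinates to the nearest boundary rather than redrawing them):

import numpy as np

def correct_violations(position, lb, ub):
    """Redraw any out-of-range coordinate uniformly within [lb, ub]."""
    position = np.asarray(position, dtype=float).copy()
    violated = (position < lb) | (position > ub)
    position[violated] = np.random.uniform(lb, ub, size=violated.sum())
    return position

print(correct_violations([120.0, -3.5, -250.0], lb=-100.0, ub=100.0))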
Algorithm 1. The pseudocode of the archive construction process.
Input: C: Number of considered flames, and κ: the maximum size of the archive Arc.
Output: Returns the archive Arc.
1.begin
2.dualPop and dualFit are created based on flame construction defined in Table 1.
3.FitBest = Ascending order of the vector dualFit and selecting the best N values.
4.PopBest = The corresponding positions of vector FitBest.
5.Computing RF using Equation (13) for C number of considered flames.
6.If the current memory size < κ−1.
7. Inserting RF and the global best flame into the Arc.
8.else
9. Replacing RF and the global best flame with two existing memory entries.
10.end if
11.end
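Assuming a minimization problem and a doubled moth/flame population stored as NumPy arrays, a compact Python rendering of Algorithm 1 and Equation (13) might look as follows (the function name and the example values of C and κ are ours):

import numpy as np

def update_archive(archive, dual_pop, dual_fit, n_best, C, kappa):
    """Sketch of Algorithm 1: store the representative flame (RF) and the global best flame."""
    # Lines 3-4: keep the n_best fittest positions of the doubled population (minimization).
    order = np.argsort(dual_fit)[:n_best]
    pop_best = dual_pop[order]
    # Equation (13): representative flame = average position of the C best flames.
    rf = pop_best[:C].mean(axis=0)
    global_best = pop_best[0]
    for entry in (rf, global_best):
        if len(archive) < kappa - 1:                  # lines 6-7: memory not yet full
            archive.append(entry.copy())
        else:                                         # line 9: overwrite a random entry
            archive[np.random.randint(len(archive))] = entry.copy()
    return archive

# Example: doubled population of 2N = 20 positions in 5 dimensions.
dual_pop = np.random.uniform(-100, 100, size=(20, 5))
dual_fit = np.sum(dual_pop**2, axis=1)
archive = update_archive([], dual_pop, dual_fit, n_best=10, C=5, kappa=30)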

Complexity Analysis

Regarding the pseudocode of MFO-SFR shown in Algorithm 2, the MFO-SFR algorithm consists of six distinct phases: initialization, flame construction, archive construction, movement, correcting the violated positions, and updating the positions. In the initialization phase, N moths are randomly distributed in a D-dimensional search space with an O(ND) computational complexity. In the flame construction phase, flames are constructed with a computational complexity of O(N²), considering the worst case of the quicksort algorithm. The computational complexity of the archive construction phase using Algorithm 1 is O(N² + ND), because lines 2−4 have a complexity of O(N²) with respect to the original paper's definition of MFO, and Equation (13) has O(ND) in the worst case. The cost of the movement phase is O(ND), using either Equations (8) and (9) when i ≤ R or Equations (8) and (10) when i > R. Then, the feasibility of the new positions is checked to correct the violated positions with a computational complexity of O(ND). Finally, the updating phase is performed with O(ND) computational complexity. Therefore, considering T iterations, the computational complexity of MFO-SFR is O(ND + N² + T(2N² + 4ND)), or O(TN² + TND). In the same fashion, the space complexity is O(N + ND + κ), considering that the memory is reusable and the size of the memory is κ. Thus, the space complexity of MFO-SFR is O(ND + κ).
Algorithm 2. The pseudocode of the proposed MFO-SFR algorithm.
Input: N: Number of moths, MaxIterations: Maximum iterations, and D: Dimension size.
Output: Returns the position of the global best flame and its fitness value.
1.Begin
2.Initiating matrix X (t) using a uniform random distribution in the D-dimensional search space.
3.Computing the fitness value of X (t) and storing them in vector OX (t).
4.Constructing the flame fitness value OF by ascending order of the vector OX (t).
5.Constructing the flame positions F based on their obtained vector OF.
6.While tMaxIterations
7. Updating F and OF by the best N moths from F and current X.
8. Computing the flame number R using Equation (2).
9. Archiving using Algorithm 1.
10. For i = 1: N
11.  If iR
12.   Computing the distance between flame Fi (t) and Mi (t) using Equation (9).
13.   Updating the position of Xi (t) using Equation (8).
14.  else
15.   Computing the distance using Equation (10).
16.   Updating the position of Xi (t) using Equation (8).
17.  End if
18.  Checking the feasibility and correcting the new position.
19.  Computing the fitness value of the new position.
20. End for
21. Updating the global best flame.
22.End while

5. Evaluation of the Proposed MFO-SFR Algorithm

In this section, we present our evaluation of the performance of the proposed MFO-SFR algorithm in solving global optimization problems from the CEC 2018 benchmark test suite [67]. This test suite is suitable for evaluating the proposed algorithm in terms of its local optimum avoidance ability and the diversity of solutions, as it consists of 29 test functions with different characteristics, including unimodal, multimodal, hybrid, and composition functions, evaluated in various dimensions (D), specifically 30 and 50 dimensions. Moreover, in this section, we also present two separate sets of experiments conducted to extensively assess and compare the performance of the proposed MFO-SFR algorithm with several well-known optimization methods. The proposed algorithm was compared to the original MFO and its variants in the first set, and then, in the second experimental set, it was compared to other prominent and recent optimizers. In both experiment sets, all comparative algorithms' control parameter values were adjusted to match those in their original articles, as presented in Table 2. All of the algorithms were executed 20 times on a laptop with an Intel Core i7-10750H CPU (2.60 GHz), 24 GB of memory, and MATLAB R2022a with a maximum of (D × 10^4)/N iterations, where D represents the dimension size of the problem and N is the population size, which was set to 100 in this study.
To investigate the impact of the archive introduced in Equation (10), a numerical pretest was performed on the canonical MFO algorithm using the CEC 2018 benchmark test suite in 30 dimensions. In this pretest, the percentage of situations in which the parameter φ_i was equal to zero was computed and reported in Table A1 of Appendix A. The results show that for some test functions, especially hybrid and composition ones, the percentage of stagnant solutions was high enough to affect the quality of the generated solutions.

5.1. Comparing the Proposed MFO-SFR Algorithm with MFO Variants

In this set of experiments, we compared the proposed MFO-SFR algorithm with moth-flame optimization (MFO) [44] and its variants, including LMFO [60], WCMFO [61], CMFO [62], ODSFMFO [65], SMFO [63], and WMFO [68]. Table 3 compares the results of the proposed MFO-SFR algorithm with those of MFO and its variants in solving the CEC 2018 test functions with 30 dimensions. The results acquired from the unimodal test functions F1 and F3 demonstrated that MFO-SFR had an acceptable exploitation potential compared to the other algorithms. The results from multimodal test functions F4–F10 indicated that the proposed algorithm was able to efficiently search the problem space and find the unvisited areas by maintaining its population diversity throughout the optimization process. The overall results of the hybrid and composition functions F11–F30 confirmed that MFO-SFR avoided local optimum solutions by striking a balance between exploration and exploitation abilities. Moreover, the final rows of Table 3 and Table 4 reveal that according to the Friedman test [98], the proposed MFO-SFR algorithm ranked first among the algorithms, including MFO and the other investigated variants.
Table 4 presents the average and minimum fitness values obtained from the proposed MFO-SFR algorithms, MFO, and its six variants in solving the CEC 2018 benchmark test functions with 50 dimensions. Overall, the results showed that the proposed MFO-SFR algorithm provided competitive results for most test functions, and it ranked first according to the Friedman test results, which are reported in the final row of the table. Additionally, an exploratory data analysis is depicted in Figure 1 to show the ranking of algorithms for each function. Overall, it can be seen that the proposed MFO-SFR algorithm surrounds the center of the radar chart for most test functions in 30 and 50 dimensions. For instance, for F1, the proposed MFO-SFR algorithm was ranked first in 30 dimensions and third in 50 dimensions, whereas WMFO and the canonical MFO algorithms were ranked second and seventh for 30 and 50 dimensions, respectively. For F12, it can be seen that MFO-SFR was ranked second, MFO was ranked seventh, and WMFO was ranked first for 30 dimensions, and these three algorithms were ranked second, seventh, and first, respectively, for 50 dimensions. For F27, MFO-SFR and WMFO were ranked first and sixth in both 30 and 50 dimensions, whereas the canonical MFO algorithm was ranked fourth in 30 dimensions and fifth in 50 dimensions.
The convergence comparison of the proposed MFO-SFR algorithm and the other studied algorithms is shown in Figure 2. For F1 in 30 dimensions, it can be seen that although MFO-SFR exhibited prolonged convergence, it provided the best solution compared to the other algorithms. In 50 dimensions, however, it ranked second after WMFO. For multimodal functions F5 and F7, the convergence trend of MFO-SFR continued up to the final iterations, whereas most of the competitors were flattened in local optimum zones. As evidence of the adequate balance between exploration and exploitation, for hybrid functions F10 and F16, MFO-SFR exhibited sharp movements in the first half of the iterations and relatively modest fluctuations in the second half. Ultimately, for composition test functions F21, F26, and F30, MFO-SFR exhibited a gradual trend toward the optimum solutions after beginning its convergence with a sharply descending slope. This behavior indicates the capacity of MFO-SFR to bypass the local optimum and avoid premature convergence.

5.2. Comparing the Proposed MFO-SFR Algorithm with Other Well-Known Optimization Algorithms

In the second set of experiments, we compared the performance of the proposed MFO-SFR algorithm with well-known representative metaheuristic algorithms presented in the literature, including particle swarm optimization (PSO) [39], krill herd (KH) [69], grey wolf optimization (GWO) [70], the crow search algorithm (CSA) [71], and the horse herd optimization algorithm (HOA) [45]. The algorithms' source codes were gathered from publicly available resources, and their parameter values were the same ones considered in the original papers, as reported in Table 2. Table 5 and Table 6 compare the average and minimum fitness values produced by the proposed MFO-SFR algorithm and the other algorithms for 30 and 50 dimensions. The results of the test functions F1 and F3–F10 for both numbers of dimensions demonstrated that MFO-SFR exhibited impressive exploitation and exploration capabilities and generated better solutions while dealing with unimodal and multimodal tests. The results of test functions F11–F30 demonstrated that MFO-SFR avoided local optimum trapping and balanced the trade-off between exploration and exploitation abilities. Furthermore, the final two rows present the results of the Friedman test for each algorithm, in which MFO-SFR ranked first among the comparative algorithms for both 30 and 50 dimensions.
The exploratory data analysis shown in Figure 3 was conducted to investigate the ranking of algorithms for each test function. Overall, it can be noted that the proposed MFO-SFR algorithm was ranked first among the other compared algorithms for all test functions, except for F10 in 30 dimensions. For 50 dimensions, it is notable that MFO-SFR was ranked first for all test functions except for F4, F10, F22, and F28.
As shown in Figure 4, we analyzed MFO-SFR's convergence behavior and compared it with that of the other algorithms. Overall, it can be seen that the proposed MFO-SFR algorithm was able to converge toward more accurate solutions by avoiding local optimum solutions and striking a balance between its search abilities. It is also notable that the proposed MFO-SFR algorithm maintained its solution accuracy as the number of dimensions increased, which demonstrates the scalability of the proposed algorithm.

5.3. Population Diversity Analysis

Maintaining population diversity is essential in metaheuristic algorithms since low diversity among search agents may cause the algorithm to become stuck in local optimum areas. In this experiment, the population diversity of MFO-SFR and five representative comparative algorithms was investigated on several CEC 2018 benchmark test functions in 30 and 50 dimensions. The population diversity curves presented in Figure 5 were calculated by measuring the moment of inertia (I_c) [99], which denotes the spread of the individuals around their centroid and is determined by Equation (14), where the centroid c_i for i = 1, 2, ..., D is calculated using Equation (15). Comparing the population diversity curves with the convergence curves plotted in Figure 2 and Figure 4, it can be noted that the proposed MFO-SFR algorithm effectively maintained diversification among solutions until a near-optimal solution was reached. This behavior occurred mainly because of the introduced SFR strategy, which identified stagnant solutions using a distance-based technique and replaced them with a solution selected from the archive constructed from the previous solutions. The introduced archive was able to maintain not only the diversification of solutions, by preserving the generated representative flame, but also the convergence of solutions toward promising areas, by preserving the best solutions in each iteration.
$I_c = \sum_{i=1}^{D} \sum_{j=1}^{N} (M_{ij} - c_i)^2$  (14)
$c_i = \frac{1}{N} \sum_{j=1}^{N} M_{ij}$  (15)
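A short sketch of this diversity measure follows, assuming the population is stored as an N × D matrix so that the centroid in Equation (15) is taken over individuals for each dimension:

import numpy as np

def moment_of_inertia(population):
    """Equations (14)-(15): sum of squared deviations from the population centroid."""
    centroid = population.mean(axis=0)                    # Equation (15): c_i per dimension
    return float(np.sum((population - centroid) ** 2))    # Equation (14): I_c

# Example: diversity of a random 100 x 30 population within [-100, 100]^30.
population = np.random.uniform(-100, 100, size=(100, 30))
print(moment_of_inertia(population))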

5.4. The Overall Effectiveness of MFO-SFR

The overall effectiveness (OE) achieved by the proposed MFO-SFR algorithm in solving test functions with 30 and 50 dimensions was computed using Equation (16) and the results are reported in Table 7 and Table 8. OEi indicates the overall effectiveness of the i-th algorithm, Li is the total number of test functions that the i-th algorithm lost, and TF is the total number of test functions. Table 7 compares the OE achieved by the proposed MFO-SFR with the other MFO variants, showing that MFO-SFR attained the highest OE value, equal to 74.14%. Moreover, Table 8 shows that MFO-SFR achieved a higher OE value of 91.38% compared to other well-known optimization algorithms.
$OE_i(\%) = \frac{TF - L_i}{TF}$  (16)
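As a worked example of Equation (16): assuming TF = 58 (29 CEC 2018 functions evaluated in both 30 and 50 dimensions), the reported values correspond to losing 5 and 15 of the 58 cases, respectively; these loss counts are inferred from the percentages rather than taken directly from the tables.

def overall_effectiveness(total_functions, losses):
    """Equation (16), expressed as a percentage (multiplied by 100)."""
    return (total_functions - losses) / total_functions * 100

print(round(overall_effectiveness(58, 5), 2))    # 91.38, the value reported for MFO-SFR in Table 8
print(round(overall_effectiveness(58, 15), 2))   # 74.14, the value reported in Table 7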

6. Applicability of MFO-SFR to Solving Mechanical Engineering Problems

There is a growing interest in using optimization algorithms in mechanical and engineering systems to improve performance, cost, and product lifespan [50,100]. Therefore, in this section, we assessed the applicability of MFO-SFR using two challenging real-world optimization problems from the most recent CEC 2020 test suite [72]. The constraints of the problems were handled using a death penalty function. The maximum number of iterations for MFO-SFR and the variants of MFO was (D × 10^4)/N, where D is the number of decision variables and N is the number of search agents, which was set to 20.

6.1. Welded Beam Design (WBD) Problem

The WBD problem [101], stated in Equation (17), is a well-known constrained engineering optimization problem. The primary goal of this task, as indicated in Figure 6, is to minimize the total fabrication cost of a welded beam by determining the best design parameters for the clamped bar length (l), weld thickness (h), bar thickness (b), and bar height (t). The results tabulated in Table 9 indicate that the proposed MFO-SFR exhibited superior performance compared with the other algorithms.
Consider $\vec{x} = [x_1, x_2, x_3, x_4] = [h, l, t, b]$,  (17)
Minimize $f(\vec{x}) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2)$,
Subject to
$g_1(\vec{x}) = \tau(\vec{x}) - \tau_{max} \le 0$,
$g_2(\vec{x}) = \sigma(\vec{x}) - \sigma_{max} \le 0$,
$g_3(\vec{x}) = x_1 - x_4 \le 0$,
$g_4(\vec{x}) = 1.10471 x_1^2 + 0.04811 x_3 x_4 (14.0 + x_2) - 5.0 \le 0$,
$g_5(\vec{x}) = 0.125 - x_1 \le 0$,
$g_6(\vec{x}) = \delta(\vec{x}) - \delta_{max} \le 0$,
$g_7(\vec{x}) = P - P_c(\vec{x}) \le 0$,
with variable ranges $0.1 \le x_i \le 2$ for $i = 1, 4$ and $0.1 \le x_i \le 10$ for $i = 2, 3$,
where
$\tau(\vec{x}) = \sqrt{(\tau')^2 + 2\tau'\tau''\frac{x_2}{2R} + (\tau'')^2}$, $\tau' = \frac{P}{\sqrt{2} x_1 x_2}$, $\tau'' = \frac{MR}{J}$, $M = P\left(L + \frac{x_2}{2}\right)$,
$R = \sqrt{\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2}$, $J = 2\left\{\sqrt{2} x_1 x_2 \left[\frac{x_2^2}{12} + \left(\frac{x_1 + x_3}{2}\right)^2\right]\right\}$, $\sigma(\vec{x}) = \frac{6PL}{x_4 x_3^2}$,
$\delta(\vec{x}) = \frac{6PL^3}{E x_3^2 x_4}$, $P_c(\vec{x}) = \frac{4.013 E \sqrt{x_3^2 x_4^6 / 36}}{L^2}\left(1 - \frac{x_3}{2L}\sqrt{\frac{E}{4G}}\right)$, $P = 6000\ \mathrm{lb}$, $L = 14\ \mathrm{in.}$,
$E = 30 \times 10^6\ \mathrm{psi}$, $G = 12 \times 10^6\ \mathrm{psi}$, $\tau_{max} = 13{,}600\ \mathrm{psi}$, $\sigma_{max} = 30{,}000\ \mathrm{psi}$, $\delta_{max} = 0.25\ \mathrm{in.}$
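To illustrate how Equation (17) can be evaluated together with the death-penalty constraint handling mentioned above, a Python sketch follows. The function names and the penalty value 1e20 are ours, and the candidate design at the end is an arbitrary feasible point, not an optimized one.

import numpy as np

P, L, E, G = 6000.0, 14.0, 30e6, 12e6
TAU_MAX, SIGMA_MAX, DELTA_MAX = 13600.0, 30000.0, 0.25

def wbd_cost(x):
    h, l, t, b = x
    return 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)

def wbd_constraints(x):
    h, l, t, b = x
    tau_p = P / (np.sqrt(2) * h * l)
    M = P * (L + l / 2)
    R = np.sqrt(l**2 / 4 + ((h + t) / 2) ** 2)
    J = 2 * np.sqrt(2) * h * l * (l**2 / 12 + ((h + t) / 2) ** 2)
    tau_pp = M * R / J
    tau = np.sqrt(tau_p**2 + 2 * tau_p * tau_pp * l / (2 * R) + tau_pp**2)
    sigma = 6 * P * L / (b * t**2)
    delta = 6 * P * L**3 / (E * t**2 * b)
    p_c = 4.013 * E * np.sqrt(t**2 * b**6 / 36) / L**2 * (1 - t / (2 * L) * np.sqrt(E / (4 * G)))
    return [tau - TAU_MAX, sigma - SIGMA_MAX, h - b,
            1.10471 * h**2 + 0.04811 * t * b * (14.0 + l) - 5.0,
            0.125 - h, delta - DELTA_MAX, P - p_c]

def wbd_fitness(x):
    """Death-penalty handling: any infeasible design receives a very large cost."""
    return wbd_cost(x) if all(g <= 0 for g in wbd_constraints(x)) else 1e20

# Example: an arbitrary feasible (not optimal) design [h, l, t, b].
print(wbd_fitness(np.array([0.25, 3.5, 9.5, 0.25])))   # about 2.24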

6.2. The Four-Stage Gearbox Problem

The design of a four-stage gearbox [102] was the second engineering design optimization problem examined in this study. To reduce the weight of the gearbox, the mathematical model specified in Equation (18) was used, together with 86 non-linear constraints and 22 discrete decision variables. According to the results reported in Table 10, the MFO-SFR algorithm outperformed the other algorithms in terms of the quality of its solution.
Minimize:
$F(\bar{x}) = \frac{\pi}{1000} \sum_{i=1}^{4} \frac{b_i c_i^2 \left(N_{pi}^2 + N_{gi}^2\right)}{(N_{pi} + N_{gi})^2}, \quad i = 1, 2, 3, 4$  (18)
Subject to: the 86 non-linear constraints $g_1(\bar{x}) \le 0, \ldots, g_{86}(\bar{x}) \le 0$ of the four-stage gearbox model, covering gear-tooth bending strength ($g_1$–$g_4$), contact (pitting) stress ($g_5$–$g_8$), contact ratio ($g_9$–$g_{12}$), minimum pinion and gear sizes ($g_{13}$–$g_{20}$), gearbox housing dimensions and pinion/gear positioning ($g_{21}$–$g_{52}$), gear-width and module compatibility ($g_{53}$–$g_{84}$), and output speed ($g_{85}$–$g_{86}$); the complete constraint expressions are given in [72,102].
where
$\bar{x} = \{N_{p1}, N_{g1}, \ldots, N_{p4}, N_{g4}, b_1, \ldots, b_4, x_{p1}, x_{g1}, \ldots, x_{g4}, y_{p1}, y_{g1}, \ldots, y_{g4}\}$, $c_i = \sqrt{(y_{gi} - y_{pi})^2 + (x_{gi} - x_{pi})^2}$, $K_0 = 1.5$, $d_{min} = 25$, $J_R = 0.2$, the angle parameter $\phi = 120°$ used in $g_1$–$g_{12}$, $W = 55.9$, $K_M = 1.6$, $CR_{min} = 1.4$, $L_{max} = 127$, $C_p = 464$, $\sigma_H = 3290$, $\omega_{max} = 255$, $\omega_1 = 5000$, $\sigma_N = 2090$, $\omega_{min} = 245$,
with bounds:
$b_i \in \{3.175, 12.7, 8.255, 5.715\}$, $y_{p1}, x_{p1}, y_{gi}, x_{gi} \in \{12.7, 38.1, 25.4, 50.8, 76.2, 63.5, 88.9, 114.3, 101.6\}$, $7 \le N_{gi}, N_{pi} \le 76$ (integer).

7. Conclusions and Future Works

MFO is a prominent metaheuristic algorithm inspired by the night-time navigation behavior of moths around light sources. A large part of MFO's popularity in recent years has been attributed to its straightforward construction. However, due to its rapid loss of population diversity and inadequate exploration ability, the MFO algorithm often encounters local optimum entrapment and premature convergence. In this study, an enhanced moth-flame optimization (MFO-SFR) algorithm was proposed to tackle these weaknesses. MFO-SFR introduces a stagnation finding and replacing (SFR) strategy to effectively maintain population diversity by finding stagnant solutions using a distance-based technique and replacing them with a solution selected from the archive constructed on the basis of previous solutions.
The performance of the proposed MFO-SFR algorithm was evaluated on global optimization problems using the CEC 2018 benchmark test suite in two different sets of experiments. In the first set of experiments, the performance of MFO-SFR was benchmarked by conducting the CEC 2018 benchmark functions with 30 and 50 dimensions. The obtained results were compared to those obtained using MFO and its six recent variants, including Lévy-flight moth-flame optimization (LMFO), an efficient hybrid algorithm based on the water cycle and moth-flame (WCMFO), chaos-enhanced moth-flame optimization (CMFO), death mechanism-based moth-flame optimization (ODSFMFO), the synthesis of the moth-flame optimizer with sine cosine mechanisms (SMFO), and the hybrid of whale and moth-flame optimization (WMFO). In the second set of experiments, the results obtained using MFO-SFR were compared with the results of five well-known swarm intelligence algorithms, including particle swarm optimization (PSO), krill herd (KH), the grey wolf optimizer (GWO), the crow search algorithm (CSA), and the horse herd optimization algorithm (HOA) in 30 and 50 dimensions. Furthermore, the results of the two sets of experiments were statistically analyzed and ranked based on their average fitness values. To further analyze the performance of the proposed algorithms, convergence and population diversity results were plotted and compared with those of the other studied algorithms. The plotted curves showed that MFO-SFR could avoid premature convergence and local optimum solutions by maintaining its population diversity throughout the optimization process. To verify the viability of MFO-SFR in solving real-world optimization problems, two well-known mechanical engineering problems from the CEC 2020 dataset were considered. For future studies, solving the problem of improving the exploitation ability of MFO-SFR without degrading its exploration ability is a worthwhile direction of research. Furthermore, the SFR strategy could be considered as a reference in solving the issue of low population diversity for those metaheuristic algorithms that suffer from this problem. Moreover, alternative methods to construct an archive, such as history-based methods, as used in SHADE [103], can be investigated in future studies.

Author Contributions

Conceptualization, M.H.N.-S., A.F. and H.Z.; methodology, M.H.N.-S., A.F. and H.Z.; software, M.H.N.-S., A.F. and H.Z.; validation, M.H.N.-S., H.Z., and S.M.; formal analysis, M.H.N.-S., H.Z., A.F. and S.M.; investigation, M.H.N.-S., A.F. and H.Z.; resources, M.H.N.-S., H.Z. and S.M.; data curation, M.H.N.-S., A.F. and H.Z.; writing, M.H.N.-S., H.Z. and A.F.; original draft preparation, M.H.N.-S., A.F. and H.Z.; writing—review and editing, M.H.N.-S., H.Z., A.F. and S.M.; visualization, M.H.N.-S., A.F. and H.Z.; supervision, M.H.N.-S., H.Z. and S.M.; project administration, M.H.N.-S. and S.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data and code used in the research may be obtained from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1 provides the results of the pretest conducted on the canonical MFO in 30 dimensions to investigate the average and maximum percentages of situations when φ_i was equal to 0.
Table A1. The analysis of situations where φ_i was equal to zero with D = 30.
#F    Max Percentage    Average Percentage    #F     Max Percentage    Average Percentage    #F     Max Percentage    Average Percentage
F1    2.03              0.40                  F12    26.05             3.32                  F22    22.13             3.51
F3    0.00              0.00                  F13    1.16              0.16                  F23    0.12              0.01
F4    0.90              0.13                  F14    6.85              0.34                  F24    3.41              0.42
F5    0.60              0.16                  F15    2.12              0.21                  F25    0.27              0.03
F6    0.00              0.00                  F16    0.10              0.01                  F26    2.53              0.65
F7    4.25              0.34                  F17    0.02              0.00                  F27    3.40              0.28
F8    14.97             0.84                  F18    0.37              0.02                  F28    11.15             0.64
F9    0.01              0.00                  F19    2.00              0.13                  F29    28.22             1.46
F10   12.05             1.22                  F20    5.24              0.81                  F30    9.30              0.47
F11   1.25              0.31                  F21    0.01              0.00

References

  1. Zabinsky, Z.B. Stochastic methods for practical global optimization. J. Glob. Optim. 1998, 13, 433–444. [Google Scholar] [CrossRef]
  2. Pardalos, P.M.; Romeijn, H.E.; Tuy, H. Recent developments and trends in global optimization. J. Comput. Appl. Math. 2000, 124, 209–228. [Google Scholar] [CrossRef]
  3. Hosseinzadeh, M.; Masdari, M.; Rahmani, A.M.; Mohammadi, M.; Aldalwie, A.H.M.; Majeed, M.K.; Karim, S.H.T. Improved butterfly optimization algorithm for data placement and scheduling in edge computing environments. J. Grid Comput. 2021, 19, 1–27. [Google Scholar] [CrossRef]
  4. Hassan, B.A.; Rashid, T.A.; Mirjalili, S. Formal context reduction in deriving concept hierarchies from corpora using adaptive evolutionary clustering algorithm star. Complex Intell. Syst. 2021, 7, 2383–2398. [Google Scholar] [CrossRef]
  5. Hassan, B.A. CSCF: A chaotic sine cosine firefly algorithm for practical application problems. Neural Comput. Appl. 2021, 33, 7011–7030. [Google Scholar] [CrossRef]
  6. Yi, H.; Duan, Q.; Liao, T.W. Three improved hybrid metaheuristic algorithms for engineering design optimization. Appl. Soft Comput. 2013, 13, 2433–2444. [Google Scholar] [CrossRef]
  7. Nadimi-Shahraki, M.H.; Asghari Varzaneh, Z.; Zamani, H.; Mirjalili, S. Binary Starling Murmuration Optimizer Algorithm to Select Effective Features from Medical Data. Appl. Sci. 2022, 13, 564. [Google Scholar] [CrossRef]
  8. Piri, J.; Mohapatra, P.; Acharya, B.; Gharehchopogh, F.S.; Gerogiannis, V.C.; Kanavos, A.; Manika, S. Feature Selection Using Artificial Gorilla Troop Optimization for Biomedical Data: A Case Analysis with COVID-19 Data. Mathematics 2022, 10, 2742. [Google Scholar] [CrossRef]
  9. Nadimi-Shahraki, M.H.; Fatahi, A.; Zamani, H.; Mirjalili, S. Binary Approaches of Quantum-Based Avian Navigation Optimizer to Select Effective Features from High-Dimensional Medical Data. Mathematics 2022, 10, 2770. [Google Scholar] [CrossRef]
  10. Talbi, E.-G. Metaheuristics: From Design to Implementation; John Wiley & Sons: Hoboken, NJ, USA, 2009. [Google Scholar]
  11. Siddiqi, U.F.; Shiraishi, Y.; Dahb, M.; Sait, S.M. A memory efficient stochastic evolution based algorithm for the multi-objective shortest path problem. Appl. Soft Comput. 2014, 14, 653–662. [Google Scholar] [CrossRef]
  12. Kavoosi, M.; Dulebenets, M.A.; Abioye, O.; Pasha, J.; Theophilus, O.; Wang, H.; Kampmann, R.; Mikijeljević, M. Berth scheduling at marine container terminals: A universal island-based metaheuristic approach. Marit. Bus. Rev. 2019, 5, 30–66. [Google Scholar] [CrossRef]
  13. Dokeroglu, T.; Sevinc, E.; Kucukyilmaz, T.; Cosar, A. A survey on new generation metaheuristic algorithms. Comput. Ind. Eng. 2019, 137, 106040. [Google Scholar] [CrossRef]
  14. Agushaka, J.O.; Ezugwu, A.E. Initialisation Approaches for Population-Based Metaheuristic Algorithms: A Comprehensive Review. Appl. Sci. 2022, 12, 896. [Google Scholar] [CrossRef]
  15. Singh, A.; Kumar, A. Applications of nature-inspired meta-heuristic algorithms: A survey. Int. J. Adv. Intell. Paradig. 2021, 20, 388–417. [Google Scholar] [CrossRef]
  16. Fister Jr, I.; Yang, X.-S.; Fister, I.; Brest, J.; Fister, D. A brief review of nature-inspired algorithms for optimization. arXiv 2013, arXiv:1307.4186. [Google Scholar]
  17. Dehghani, M.; Mardaneh, M.; Malik, O.P.; NouraeiPour, S.M. DTO: Donkey theorem optimization. In Proceedings of the 2019 27th Iranian Conference on Electrical Engineering (ICEE), Yazd, Iran, 30 April–2 May 2019; pp. 1855–1859. [Google Scholar]
  18. Fard, E.S.; Monfaredi, K.; Nadimi, M.H. An Area-Optimized Chip of Ant Colony Algorithm Design in Hardware Platform Using the Address-Based Method. Int. J. Electr. Comput. Eng. 2014, 4, 989–998. [Google Scholar] [CrossRef]
  19. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
  20. Storn, R.; Price, K. Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  21. Beyer, H.-G.; Schwefel, H.-P. Evolution strategies–a comprehensive introduction. Nat. Comput. 2002, 1, 3–52. [Google Scholar] [CrossRef]
  22. Jiao, L.; Li, Y.; Gong, M.; Zhang, X. Quantum-inspired immune clonal algorithm for global optimization. IEEE Trans. Syst. Man Cybern. Part B 2008, 38, 1234–1253. [Google Scholar] [CrossRef]
  23. Lu, T.-C.; Juang, J.-C. Quantum-inspired space search algorithm (QSSA) for global numerical optimization. Appl. Math. Comput. 2011, 218, 2516–2532. [Google Scholar] [CrossRef]
  24. Arpaia, P.; Maisto, D.; Manna, C. A Quantum-inspired Evolutionary Algorithm with a competitive variation operator for Multiple-Fault Diagnosis. Appl. Soft Comput. 2011, 11, 4655–4666. [Google Scholar] [CrossRef]
  25. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  26. Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. 2013, 222, 175–184. [Google Scholar] [CrossRef]
  27. Kashan, A.H. A new metaheuristic for optimization: Optics inspired optimization (OIO). Comput. Oper. Res. 2015, 55, 99–125. [Google Scholar] [CrossRef]
  28. Atashpaz-Gargari, E.; Lucas, C. Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic competition. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation, Singapore, 25–28 September 2007; pp. 4661–4667. [Google Scholar]
  29. Geem, Z.W.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68. [Google Scholar] [CrossRef]
  30. Rao, R.V.; Savsani, V.J.; Vakharia, D. Teaching–learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput. -Aided Des. 2011, 43, 303–315. [Google Scholar] [CrossRef]
  31. Shi, Y. Brain storm optimization algorithm. In Proceedings of the International Conference in Swarm Intelligence, Chongqing, China, 12–15 June 2011; pp. 303–309. [Google Scholar]
  32. Moosavian, N.; Roodsari, B.K. Soccer league competition algorithm: A novel meta-heuristic algorithm for optimal design of water distribution networks. Swarm Evol. Comput. 2014, 17, 14–24. [Google Scholar] [CrossRef]
  33. Moghdani, R.; Salimifard, K. Volleyball premier league algorithm. Appl. Soft Comput. 2018, 64, 161–185. [Google Scholar] [CrossRef]
  34. Moosavi, S.H.S.; Bardsiri, V.K. Poor and rich optimization algorithm: A new human-based and multi populations algorithm. Eng. Appl. Artif. Intell. 2019, 86, 165–181. [Google Scholar] [CrossRef]
  35. Naik, A.; Satapathy, S.C. Past present future: A new human-based algorithm for stochastic optimization. Soft Comput. 2021, 25, 12915–12976. [Google Scholar] [CrossRef]
36. Chakraborty, A.; Kar, A.K. Swarm intelligence: A review of algorithms. Nat.-Inspired Comput. Optim. 2017, 10, 475–494. [Google Scholar]
  37. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. CCSA: Conscious neighborhood-based crow search algorithm for solving global optimization problems. Appl. Soft Comput. 2019, 85, 105583. [Google Scholar] [CrossRef]
  38. Dorigo, M.; Birattari, M.; Stutzle, T. Ant colony optimization. IEEE Comput. Intell. Mag. 2006, 1, 28–39. [Google Scholar] [CrossRef]
  39. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948. [Google Scholar]
  40. Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Glob. Optim. 2007, 39, 459–471. [Google Scholar] [CrossRef]
  41. Gandomi, A.H.; Yang, X.-S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35. [Google Scholar] [CrossRef]
  42. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  43. Wang, G.-G.; Deb, S.; Coelho, L.d.S. Elephant herding optimization. In Proceedings of the 2015 3rd International Symposium on Computational and Business Intelligence (ISCBI), Bali, Indonesia, 7–9 December 2015; pp. 1–5. [Google Scholar]
44. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
45. MiarNaeimi, F.; Azizyan, G.; Rashki, M. Horse herd optimization algorithm: A nature-inspired algorithm for high-dimensional optimization problems. Knowl.-Based Syst. 2021, 213, 106711. [Google Scholar] [CrossRef]
46. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. QANA: Quantum-based avian navigation optimizer algorithm. Eng. Appl. Artif. Intell. 2021, 104, 104314. [Google Scholar] [CrossRef]
  47. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. African vultures optimization algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. Comput. Ind. Eng. 2021, 158, 107408. [Google Scholar] [CrossRef]
  48. Shayanfar, H.; Gharehchopogh, F.S. Farmland fertility: A new metaheuristic algorithm for solving continuous optimization problems. Appl. Soft Comput. 2018, 71, 728–746. [Google Scholar] [CrossRef]
49. Agushaka, J.O.; Ezugwu, A.E.; Abualigah, L. Dwarf mongoose optimization algorithm. Comput. Methods Appl. Mech. Eng. 2022, 391, 114570. [Google Scholar] [CrossRef]
  50. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. Starling murmuration optimizer: A novel bio-inspired algorithm for global and engineering optimization. Comput. Methods Appl. Mech. Eng. 2022, 392, 114616. [Google Scholar] [CrossRef]
  51. Abdollahzadeh, B.; Soleimanian Gharehchopogh, F.; Mirjalili, S. Artificial gorilla troops optimizer: A new nature-inspired metaheuristic algorithm for global optimization problems. Int. J. Intell. Syst. 2021, 36, 5887–5958. [Google Scholar] [CrossRef]
  52. Pandey, H.M.; Chaudhary, A.; Mehrotra, D. A comparative review of approaches to prevent premature convergence in GA. Appl. Soft Comput. 2014, 24, 1047–1077. [Google Scholar] [CrossRef]
  53. Chaitanya, K.; Somayajulu, D.; Krishna, P.R. Memory-based approaches for eliminating premature convergence in particle swarm optimization. Appl. Intell. 2021, 51, 4575–4608. [Google Scholar] [CrossRef]
  54. Zhu, G.; Kwong, S. Gbest-guided artificial bee colony algorithm for numerical function optimization. Appl. Math. Comput. 2010, 217, 3166–3173. [Google Scholar] [CrossRef]
  55. Banharnsakun, A.; Achalakul, T.; Sirinaovakul, B. The best-so-far selection in artificial bee colony algorithm. Appl. Soft Comput. 2011, 11, 2888–2901. [Google Scholar] [CrossRef]
  56. Xiang, W.-L.; Li, Y.-Z.; Meng, X.-L.; Zhang, C.-M.; An, M.-Q. A grey artificial bee colony algorithm. Appl. Soft Comput. 2017, 60, 1–17. [Google Scholar] [CrossRef]
  57. Nadimi-Shahraki, M.H.; Zamani, H. DMDE: Diversity-maintained multi-trial vector differential evolution algorithm for non-decomposition large-scale global optimization. Expert Syst. Appl. 2022, 198, 116895. [Google Scholar] [CrossRef]
  58. Wang, F.; Liao, X.; Fang, N.; Jiang, Z. Optimal Scheduling of Regional Combined Heat and Power System Based on Improved MFO Algorithm. Energies 2022, 15, 3410. [Google Scholar] [CrossRef]
  59. Kaur, K.; Singh, U.; Salgotra, R. An enhanced moth flame optimization. Neural Comput. Appl. 2020, 32, 2315–2349. [Google Scholar] [CrossRef]
  60. Li, Z.; Zhou, Y.; Zhang, S.; Song, J. Lévy-flight moth-flame algorithm for function optimization and engineering design problems. Math. Probl. Eng. 2016, 2016, 1–22. [Google Scholar] [CrossRef]
  61. Khalilpourazari, S.; Khalilpourazary, S. An efficient hybrid algorithm based on Water Cycle and Moth-Flame Optimization algorithms for solving numerical and constrained engineering optimization problems. Soft Comput. 2019, 23, 1699–1722. [Google Scholar] [CrossRef]
  62. Hongwei, L.; Jianyong, L.; Liang, C.; Jingbo, B.; Yangyang, S.; Kai, L. Chaos-enhanced moth-flame optimization algorithm for global optimization. J. Syst. Eng. Electron. 2019, 30, 1144–1159. [Google Scholar]
  63. Chen, C.; Wang, X.; Yu, H.; Wang, M.; Chen, H. Dealing with multi-modality using synthesis of Moth-flame optimizer with sine cosine mechanisms. Math. Comput. Simul. 2021, 188, 291–318. [Google Scholar] [CrossRef]
  64. Xu, Y.; Chen, H.; Luo, J.; Zhang, Q.; Jiao, S.; Zhang, X. Enhanced Moth-flame optimizer with mutation strategy for global optimization. Inf. Sci. 2019, 492, 181–203. [Google Scholar] [CrossRef]
  65. Li, Z.; Zeng, J.; Chen, Y.; Ma, G.; Liu, G. Death mechanism-based moth–flame optimization with improved flame generation mechanism for global optimization tasks. Expert Syst. Appl. 2021, 183, 115436. [Google Scholar] [CrossRef]
  66. Xu, Y.; Chen, H.; Heidari, A.A.; Luo, J.; Zhang, Q.; Zhao, X.; Li, C. An efficient chaotic mutative moth-flame-inspired optimizer for global optimization tasks. Expert Syst. Appl. 2019, 129, 135–155. [Google Scholar] [CrossRef]
  67. Awad, N.; Ali, M.; Liang, J.; Qu, B.; Suganthan, P. Problem Definitions and Evaluation Criteria for the CEC 2017 Special Session and Competition on Single Objective Bound Constrained Real-Parameter Numerical Optimization; Technical Report; Nanyang Technological University: Singapore, 2016. [Google Scholar]
  68. Nadimi-Shahraki, M.H.; Fatahi, A.; Zamani, H.; Mirjalili, S.; Oliva, D. Hybridizing of Whale and Moth-Flame Optimization Algorithms to Solve Diverse Scales of Optimal Power Flow Problem. Electronics 2022, 11, 831. [Google Scholar] [CrossRef]
69. Gandomi, A.H.; Alavi, A.H. Krill herd: A new bio-inspired optimization algorithm. Commun. Nonlinear Sci. Numer. Simul. 2012, 17, 4831–4845. [Google Scholar] [CrossRef]
70. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  71. Askarzadeh, A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Comput. Struct. 2016, 169, 1–12. [Google Scholar] [CrossRef]
  72. Kumar, A.; Wu, G.; Ali, M.Z.; Mallipeddi, R.; Suganthan, P.N.; Das, S. A test-suite of non-convex constrained optimization problems from the real-world and some baseline results. Swarm Evol. Comput. 2020, 56, 100693. [Google Scholar] [CrossRef]
  73. Nadimi-Shahraki, M.H.; Fatahi, A.; Zamani, H.; Mirjalili, S.; Abualigah, L. An improved moth-flame optimization algorithm with adaptation mechanism to solve numerical and mechanical engineering problems. Entropy 2021, 23, 1637. [Google Scholar] [CrossRef] [PubMed]
74. Pelusi, D.; Mascella, R.; Tallini, L.; Nayak, J.; Naik, B.; Deng, Y. An Improved Moth-Flame Optimization algorithm with hybrid search phase. Knowl.-Based Syst. 2020, 191, 105277. [Google Scholar] [CrossRef]
  75. Nadimi-Shahraki, M.H.; Fatahi, A.; Zamani, H.; Mirjalili, S.; Abualigah, L.; Abd Elaziz, M. Migration-based moth-flame optimization algorithm. Processes 2021, 9, 2276. [Google Scholar] [CrossRef]
  76. Ma, L.; Wang, C.; Xie, N.-g.; Shi, M.; Ye, Y.; Wang, L. Moth-flame optimization algorithm based on diversity and mutation strategy. Appl. Intell. 2021, 51, 5836–5872. [Google Scholar] [CrossRef]
  77. Zhao, X.; Fang, Y.; Liu, L.; Li, J.; Xu, M. An improved moth-flame optimization algorithm with orthogonal opposition-based learning and modified position updating mechanism of moths for global optimization problems. Appl. Intell. 2020, 50, 4434–4458. [Google Scholar] [CrossRef]
  78. Sapre, S.; Mini, S. Opposition-based moth flame optimization with Cauchy mutation and evolutionary boundary constraint handling for global optimization. Soft Comput. 2019, 23, 6023–6041. [Google Scholar] [CrossRef]
  79. Sahoo, S.K.; Saha, A.K.; Nama, S.; Masdari, M. An improved moth flame optimization algorithm based on modified dynamic opposite learning strategy. Artif. Intell. Rev. 2022, 1–59. [Google Scholar] [CrossRef]
  80. Li, C.; Niu, Z.; Song, Z.; Li, B.; Fan, J.; Liu, P.X. A double evolutionary learning moth-flame optimization for real-parameter global optimization problems. IEEE Access 2018, 6, 76700–76727. [Google Scholar] [CrossRef]
  81. Li, Y.; Zhu, X.; Liu, J. An improved moth-flame optimization algorithm for engineering problems. Symmetry 2020, 12, 1234. [Google Scholar] [CrossRef]
  82. Shehab, M.; Alshawabkah, H.; Abualigah, L.; AL-Madi, N. Enhanced a hybrid moth-flame optimization algorithm using new selection schemes. Eng. Comput. 2021, 37, 2931–2956. [Google Scholar] [CrossRef]
  83. Zhang, H.; Li, R.; Cai, Z.; Gu, Z.; Heidari, A.A.; Wang, M.; Chen, H.; Chen, M. Advanced orthogonal moth flame optimization with Broyden–Fletcher–Goldfarb–Shanno algorithm: Framework and real-world problems. Expert Syst. Appl. 2020, 159, 113617. [Google Scholar] [CrossRef]
  84. Yu, C.; Heidari, A.A.; Chen, H. A quantum-behaved simulated annealing algorithm-based moth-flame optimization method. Appl. Math. Model. 2020, 87, 1–19. [Google Scholar] [CrossRef]
  85. Alzaqebah, M.; Alrefai, N.; Ahmed, E.A.; Jawarneh, S.; Alsmadi, M.K. Neighborhood search methods with moth optimization algorithm as a wrapper method for feature selection problems. Int. J. Electr. Comput. Eng. 2020, 10, 3672. [Google Scholar] [CrossRef]
  86. Xu, L.; Li, Y.; Li, K.; Beng, G.H.; Jiang, Z.; Wang, C.; Liu, N. Enhanced moth-flame optimization based on cultural learning and Gaussian mutation. J. Bionic Eng. 2018, 15, 751–763. [Google Scholar] [CrossRef]
  87. Helmi, A.; Alenany, A. An enhanced Moth-flame optimization algorithm for permutation-based problems. Evol. Intell. 2020, 13, 741–764. [Google Scholar] [CrossRef]
  88. Sayed, G.I.; Hassanien, A.E. A hybrid SA-MFO algorithm for function optimization and engineering design problems. Complex Intell. Syst. 2018, 4, 195–212. [Google Scholar] [CrossRef]
  89. Buch, H.; Trivedi, I.N.; Jangir, P. Moth flame optimization to solve optimal power flow with non-parametric statistical evaluation validation. Cogent Eng. 2017, 4, 1286731. [Google Scholar] [CrossRef]
  90. Trivedi, I.N.; Jangir, P.; Parmar, S.A.; Jangir, N. Optimal power flow with voltage stability improvement and loss reduction in power system using Moth-Flame Optimizer. Neural Comput. Appl. 2018, 30, 1889–1904. [Google Scholar] [CrossRef]
  91. Jangir, P.; Jangir, N. Optimal power flow using a hybrid particle Swarm optimizer with moth flame optimizer. Glob. J. Res. Eng. 2017, 17, 15–32. [Google Scholar]
  92. Sahoo, S.K.; Saha, A.K. A hybrid moth flame optimization algorithm for global optimization. J. Bionic Eng. 2022, 19, 1522–1543. [Google Scholar] [CrossRef]
  93. Khan, B.S.; Raja, M.A.Z.; Qamar, A.; Chaudhary, N.I. Design of moth flame optimization heuristics for integrated power plant system containing stochastic wind. Appl. Soft Comput. 2021, 104, 107193. [Google Scholar] [CrossRef]
  94. Singh, P.; Bishnoi, S. Modified moth-Flame optimization for strategic integration of fuel cell in renewable active distribution network. Electr. Power Syst. Res. 2021, 197, 107323. [Google Scholar] [CrossRef]
  95. Zhang, H.; Heidari, A.A.; Wang, M.; Zhang, L.; Chen, H.; Li, C. Orthogonal Nelder-Mead moth flame method for parameters identification of photovoltaic modules. Energy Convers. Manag. 2020, 211, 112764. [Google Scholar] [CrossRef]
  96. Cui, Z.; Li, C.; Huang, J.; Wu, Y.; Zhang, L. An improved moth flame optimization algorithm for minimizing specific fuel consumption of variable cycle engine. IEEE Access 2020, 8, 142725–142735. [Google Scholar] [CrossRef]
  97. Khurma, R.A.; Aljarah, I.; Sharieh, A. A simultaneous moth flame optimizer feature selection approach based on levy flight and selection operators for medical diagnosis. Arab. J. Sci. Eng. 2021, 46, 8415–8440. [Google Scholar] [CrossRef]
  98. Derrac, J.; García, S.; Molina, D.; Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 2011, 1, 3–18. [Google Scholar] [CrossRef]
  99. Morrison, R.W. Designing Evolutionary Algorithms for Dynamic Environments; Springer: Berlin/Heidelberg, Germany, 2004; Volume 178. [Google Scholar]
  100. Altabeeb, A.M.; Mohsen, A.M.; Abualigah, L.; Ghallab, A. Solving capacitated vehicle routing problem using cooperative firefly algorithm. Appl. Soft Comput. 2021, 108, 107403. [Google Scholar] [CrossRef]
  101. Ragsdell, K.; Phillips, D. Optimal design of a class of welded structures using geometric programming. Eng. Ind. 1976, 98, 1021–1025. [Google Scholar] [CrossRef]
  102. Yokota, T.; Taguchi, T.; Gen, M. A solution method for optimal weight design problem of the gear using genetic algorithms. Comput. Ind. Eng. 1998, 35, 523–526. [Google Scholar] [CrossRef]
  103. Tanabe, R.; Fukunaga, A. Success-history based parameter adaptation for differential evolution. In Proceedings of the 2013 IEEE Congress on Evolutionary Computation, Cancun, Mexico, 20–23 June 2013; pp. 71–78. [Google Scholar]
Figure 1. Exploratory data analysis of MFO-SFR, MFO, and its variants on CEC 2018 with 30 and 50 dimensions.
Figure 2. Convergence comparison of MFO-SFR, MFO, and its variants on CEC 2018 with D = 30 and 50.
Figure 3. Exploratory data analysis of MFO-SFR and other well-known MFO algorithms on CEC 2018 with 30 and 50 dimensions.
Figure 4. Comparison of the convergence behavior of MFO-SFR and well-known algorithms for CEC 2018 test functions with 30 and 50 dimensions.
Figure 5. Population diversity of MFO-SFR and comparative algorithms on CEC 2018 test functions.
Figure 6. Schematic of the welded beam design problem [101].
Table 1. Flame construction procedure.
Input: X: the positions of the moths; Fit: the fitness values of the moths; F: the positions of the flames; OF: the fitness values of the flames.
Flame construction in the first iteration (t = 1):
1. Sort the vector Fit in ascending order and extract the sorted indices {j1, j2, …, jN}.
2. Construct the flame matrix F(t) = {F1 ← Xj1, F2 ← Xj2, …, FN ← XjN}.
Flame construction in the remaining iterations (t > 1):
1. Construct the matrix dualPop by combining the matrices F(t) and X(t − 1).
2. Construct the vector dualFit by combining the vectors OF(t) and Fit(t − 1).
3. Sort the vector dualFit in ascending order and extract the sorted indices {j1, j2, …, j2N}.
4. Construct the flame matrix F(t) from the N best members of dualPop, i.e., F(t) = {F1 ← dualPopj1, F2 ← dualPopj2, …, FN ← dualPopjN}.
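To make the procedure in Table 1 concrete, the following is a minimal sketch of the flame-construction step, assuming NumPy arrays, a minimization objective, and illustrative names (e.g., construct_flames) that are not taken from the authors' implementation.

```python
import numpy as np

def construct_flames(X, fit, F=None, OF=None):
    """Sketch of the Table 1 flame-construction step (minimization assumed).

    X, fit : current moth positions (N x D) and their fitness values (N,)
    F, OF  : previous flame positions and fitness values, or None at t = 1
    Returns the updated flame matrix and flame fitness values.
    """
    if F is None:  # first iteration (t = 1): flames are the sorted moths
        order = np.argsort(fit)
        return X[order], fit[order]
    # later iterations (t > 1): keep the N best of previous flames and moths
    dual_pop = np.vstack((F, X))
    dual_fit = np.concatenate((OF, fit))
    order = np.argsort(dual_fit)[: X.shape[0]]
    return dual_pop[order], dual_fit[order]

# toy usage with a sphere function as a stand-in objective
rng = np.random.default_rng(0)
X = rng.uniform(-100, 100, size=(5, 3))
fit = np.sum(X**2, axis=1)
F, OF = construct_flames(X, fit)            # t = 1
X2 = rng.uniform(-100, 100, size=(5, 3))
fit2 = np.sum(X2**2, axis=1)
F, OF = construct_flames(X2, fit2, F, OF)   # t > 1
```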
Table 2. Parameter values for the optimization algorithms.
MFO: b = 1, a decreased linearly from −1 to −2.
LMFO: β = 1.5; µ and v are normal distributions; Γ is the gamma function.
WCMFO: the number of rivers and seas = 4.
CMFO: b = 1, a decreased linearly from −1 to −2, chaotic map = Singer.
ODSFMFO: m = 6, pc = 0.5, γ = 5, α = 1, l = 10, b = 1, β = 1.5.
SMFO: r4 = a random number in the interval (0, 1).
WMFO: α decreased linearly from 2 to 0, b = 1.
PSO: c1 = c2 = 2, vmax = 6, w = 0.9.
KH: Vf = 0.02, Dmax = 0.005, Nmax = 0.01, Sr = 0.
GWO: the parameter a decreased linearly from 2 to 0.
CSA: AP = 0.1, fl = 2.
HOA: w = 1, δD = 0.02, δI = 0.02, gδ = 1.5, hβ = 0.9, hγ = 0.5, sβ = 0.2, sγ = 0.1, iγ = 0.3, dα = 0.5, dβ = 0.2, dγ = 0.1, rδ = 0.1, rγ = 0.05.
MFO-SFR: b = 1, a decreased linearly from −1 to −2, κ = round(D2 × (log N)), C = N/5.
Table 3. Comparison of MFO-SFR with MFO variants for CEC 2018 test functions with D = 30.
F.MetricsMFOLMFOWCMFOCMFOODSFMFOSMFOWMFOMFO-SFR
F1Avg6.278 × 1092.544 × 1071.317 × 1041.078 × 1086.016 × 1063.119 × 10103.822 × 1031.791 × 103
Min1.027 × 1091.899 × 1071.924 × 1033.760 × 1069.949 × 1051.734 × 10101.013 × 1021.017 × 102
F3Avg9.453 × 1043.473 × 1031.541 × 1035.059 × 1043.050 × 1048.300 × 1043.909 × 1021.312 × 104
Min1.203 × 1041.499 × 1033.111 × 1022.945 × 1041.631 × 1047.186 × 1043.007 × 1027.513 × 103
F4Avg8.558 × 1024.919 × 1024.846 × 1026.960 × 1025.356 × 1025.612 × 1034.810 × 1024.914 × 102
Min4.991 × 1024.742 × 1024.009 × 1025.139 × 1024.985 × 1022.322 × 1034.249 × 1024.700 × 102
F5Avg6.740 × 1026.300 × 1026.721 × 1026.073 × 1025.506 × 1028.725 × 1026.739 × 1025.227 × 102
Min6.114 × 1025.816 × 1026.126 × 1025.736 × 1025.270 × 1028.105 × 1026.234 × 1025.109 × 102
F6Avg6.260 × 1026.030 × 1026.236 × 1026.189 × 1026.037 × 1026.814 × 1026.366 × 1026.000 × 102
Min6.113 × 1026.018 × 1026.137 × 1026.086 × 1026.010 × 1026.571 × 1026.143 × 1026.000 × 102
F7Avg1.007 × 1038.716 × 1029.050 × 1029.430 × 1028.099 × 1021.359 × 1031.056 × 1037.669 × 102
Min8.538 × 1028.311 × 1028.045 × 1028.684 × 1027.824 × 1021.198 × 1039.248 × 1027.460 × 102
F8Avg9.895 × 1029.375 × 1029.839 × 1029.097 × 1028.528 × 1021.093 × 1039.539 × 1028.209 × 102
Min9.126 × 1028.978 × 1029.344 × 1028.645 × 1028.343 × 1021.052 × 1038.547 × 1028.090 × 102
F9Avg6.219 × 1039.256 × 1028.623 × 1032.331 × 1031.118 × 1039.431 × 1034.543 × 1039.038 × 102
Min3.323 × 1039.074 × 1025.118 × 1031.476 × 1039.647 × 1027.359 × 1031.675 × 1039.005 × 102
F10Avg5.259 × 1034.240 × 1034.848 × 1035.005 × 1034.332 × 1038.272 × 1035.192 × 1034.062 × 103
Min4.231 × 1033.205 × 1034.003 × 1034.204 × 1033.570 × 1037.449 × 1033.759 × 1032.461 × 103
F11Avg3.967 × 1031.314 × 1031.363 × 1031.985 × 1031.284 × 1035.799 × 1031.248 × 1031.143 × 103
Min1.370 × 1031.180 × 1031.252 × 1031.206 × 1031.204 × 1032.547 × 1031.170 × 1031.107 × 103
F12Avg9.043 × 1075.251 × 1061.416 × 1062.113 × 1072.157 × 1064.342 × 1091.014 × 1051.508 × 105
Min7.305 × 1041.578 × 1063.718 × 1047.171 × 1052.328 × 1052.607 × 1096.932 × 1032.035 × 104
F13Avg4.593 × 1064.072 × 1059.457 × 1049.006 × 1031.184 × 1047.405 × 1086.660 × 1036.405 × 103
Min1.003 × 1041.634 × 1051.150 × 1042.446 × 1031.596 × 1031.145 × 1081.400 × 1031.690 × 103
F14Avg6.942 × 1042.500 × 1041.872 × 1043.941 × 1045.651 × 1041.715 × 1061.406 × 1048.200 × 103
Min5.450 × 1032.821 × 1034.075 × 1036.379 × 1034.686 × 1037.879 × 1043.027 × 1032.021 × 103
F15Avg3.090 × 1048.218 × 1043.207 × 1045.756 × 1035.070 × 1034.161 × 1071.157 × 1045.614 × 103
Min5.117 × 1034.614 × 1042.547 × 1031.707 × 1031.703 × 1031.868 × 1061.609 × 1031.515 × 103
F16Avg2.956 × 1032.564 × 1032.867 × 1032.709 × 1032.366 × 1034.223 × 1032.662 × 1031.855 × 103
Min2.398 × 1032.101 × 1032.267 × 1032.241 × 1031.965 × 1033.565 × 1032.068 × 1031.617 × 103
F17Avg2.349 × 1032.192 × 1032.315 × 1032.056 × 1031.985 × 1032.788 × 1032.234 × 1031.745 × 103
Min1.975 × 1031.925 × 1031.942 × 1031.818 × 1031.764 × 1032.359 × 1031.958 × 1031.727 × 103
F18Avg2.830 × 1062.674 × 1051.804 × 1057.780 × 1058.975 × 1055.330 × 1078.188 × 1041.493 × 105
Min7.725 × 1043.452 × 1044.774 × 1047.998 × 1049.364 × 1042.825 × 1066.883 × 1034.305 × 104
F19Avg4.261 × 1067.040 × 1043.083 × 1042.505 × 1047.822 × 1037.588 × 1071.560 × 1046.534 × 103
Min1.293 × 1043.487 × 1042.168 × 1033.280 × 1031.968 × 1035.192 × 1062.310 × 1031.910 × 103
F20Avg2.537 × 1032.398 × 1032.528 × 1032.402 × 1032.287 × 1032.837 × 1032.690 × 1032.091 × 103
Min2.215 × 1032.117 × 1032.103 × 1032.185 × 1032.053 × 1032.454 × 1032.294 × 1032.004 × 103
F21Avg2.472 × 1032.439 × 1032.485 × 1032.384 × 1032.351 × 1032.630 × 1032.462 × 1032.321 × 103
Min2.420 × 1032.378 × 1032.430 × 1032.338 × 1032.331 × 1032.363 × 1032.389 × 1032.312 × 103
F22Avg6.353 × 1034.878 × 1036.611 × 1032.380 × 1032.319 × 1038.681 × 1035.292 × 1032.300 × 103
Min3.223 × 1032.325 × 1035.330 × 1032.319 × 1032.305 × 1035.677 × 1032.300 × 1032.300 × 103
F23Avg2.811 × 1032.754 × 1032.796 × 1032.797 × 1032.722 × 1033.273 × 1032.861 × 1032.671 × 103
Min2.740 × 1032.724 × 1032.749 × 1032.734 × 1032.697 × 1033.027 × 1032.763 × 1032.654 × 103
F24Avg2.979 × 1032.924 × 1032.972 × 1032.948 × 1032.872 × 1033.482 × 1033.003 × 1032.844 × 103
Min2.926 × 1032.888 × 1032.927 × 1032.887 × 1032.848 × 1033.217 × 1032.912 × 1032.828 × 103
F25Avg3.181 × 1032.888 × 1032.894 × 1033.011 × 1032.925 × 1033.972 × 1032.900 × 1032.887 × 103
Min2.895 × 1032.885 × 1032.884 × 1032.935 × 1032.890 × 1033.467 × 1032.884 × 1032.887 × 103
F26Avg5.650 × 1034.854 × 1035.538 × 1034.465 × 1034.415 × 1039.093 × 1035.841 × 1033.903 × 103
Min4.921 × 1034.504 × 1035.074 × 1033.113 × 1032.876 × 1035.057 × 1034.741 × 1033.739 × 103
F27Avg3.233 × 1033.223 × 1033.229 × 1033.285 × 1033.244 × 1033.754 × 1033.276 × 1033.219 × 103
Min3.206 × 1033.194 × 1033.204 × 1033.238 × 1033.218 × 1033.538 × 1033.220 × 1033.208 × 103
F28Avg3.756 × 1033.270 × 1033.192 × 1033.376 × 1033.294 × 1035.462 × 1033.199 × 1033.216 × 103
Min3.263 × 1033.211 × 1033.100 × 1033.265 × 1033.271 × 1034.419 × 1033.122 × 1033.196 × 103
F29Avg4.014 × 1033.764 × 1033.949 × 1034.040 × 1033.691 × 1035.639 × 1034.068 × 1033.414 × 103
Min3.499 × 1033.410 × 1033.574 × 1033.629 × 1033.475 × 1034.728 × 1033.545 × 1033.323 × 103
F30Avg2.524 × 1051.426 × 1053.318 × 1047.742 × 1051.803 × 1042.326 × 1081.079 × 1047.835 × 103
Min7.219 × 1036.606 × 1041.642 × 1045.609 × 1047.769 × 1032.468 × 1075.674 × 1036.362 × 103
Average rank: MFO 6.06, LMFO 3.95, WCMFO 4.51, CMFO 4.57, ODSFMFO 3.28, SMFO 7.91, WMFO 4.18, MFO-SFR 1.55.
Total rank: MFO 7, LMFO 3, WCMFO 5, CMFO 6, ODSFMFO 2, SMFO 8, WMFO 4, MFO-SFR 1.
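The "Average rank" and "Total rank" rows summarize per-function rankings of the algorithms before the Friedman test is applied. A minimal sketch of this ranking step is shown below, assuming ranks are assigned per test function from the mean errors (lower is better, ties averaged); the small array holds placeholder values rather than entries from Table 3.

```python
import numpy as np
from scipy.stats import rankdata

# rows = test functions, columns = algorithms (placeholder mean errors)
mean_errors = np.array([
    [6.3e9, 2.5e7, 1.3e4, 1.8e3],
    [9.5e4, 3.5e3, 1.5e3, 1.3e4],
    [8.6e2, 4.9e2, 4.8e2, 4.9e2],
])

# rank the algorithms on each function (1 = best), then average over functions
per_function_ranks = np.apply_along_axis(rankdata, 1, mean_errors)
average_rank = per_function_ranks.mean(axis=0)
print(average_rank)                          # lower average rank = better placement
print(average_rank.argsort().argsort() + 1)  # ordinal (total) rank per algorithm
```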
Table 4. Comparison of MFO-SFR with MFO variants for CEC 2018 test functions with D = 50.
F.MetricsMFOLMFOWCMFOCMFOODSFMFOSMFOWMFOMFO-SFR
F1Avg3.036 × 10101.091 × 1086.099 × 1041.323 × 1092.624 × 1087.209 × 10104.264 × 1033.463 × 104
Min7.064 × 1038.074 × 1078.054 × 1021.287 × 1083.470 × 1075.140 × 10101.054 × 1029.385 × 103
F3Avg1.540 × 1053.139 × 1041.413 × 1041.004 × 1059.325 × 1041.770 × 1059.948 × 1025.495 × 104
Min1.176 × 1041.960 × 1042.418 × 1037.381 × 1046.538 × 1041.457 × 1053.217 × 1024.223 × 104
F4Avg4.178 × 1035.786 × 1025.431 × 1021.182 × 1037.401 × 1021.835 × 1045.385 × 1025.867 × 102
Min1.187 × 1035.296 × 1024.286 × 1025.419 × 1026.608 × 1021.005 × 1044.961 × 1025.196 × 102
F5Avg9.086 × 1028.080 × 1029.240 × 1028.065 × 1026.269 × 1021.125 × 1038.608 × 1025.624 × 102
Min7.996 × 1027.226 × 1027.743 × 1026.742 × 1025.862 × 1021.032 × 1037.209 × 1025.318 × 102
F6Avg6.455 × 1026.078 × 1026.395 × 1026.356 × 1026.075 × 1026.888 × 1026.513 × 1026.001 × 102
Min6.270 × 1026.035 × 1026.165 × 1026.239 × 1026.041 × 1026.780 × 1026.291 × 1026.000 × 102
F7Avg1.728 × 1031.081 × 1031.139 × 1031.207 × 1039.855 × 1021.937 × 1031.461 × 1038.682 × 102
Min1.119 × 1031.011 × 1031.023 × 1031.031 × 1038.795 × 1021.769 × 1031.204 × 1038.099 × 102
F8Avg1.217 × 1031.107 × 1031.213 × 1031.055 × 1039.213 × 1021.406 × 1031.131 × 1038.610 × 102
Min1.050 × 1031.021 × 1031.096 × 1039.983 × 1028.625 × 1021.315 × 1031.021 × 1038.318 × 102
F9Avg1.651 × 1041.737 × 1032.097 × 1046.212 × 1031.717 × 1033.051 × 1041.190 × 1049.243 × 102
Min8.748 × 1039.529 × 1021.190 × 1043.607 × 1031.299 × 1031.925 × 1045.498 × 1039.066 × 102
F10Avg8.426 × 1037.527 × 1037.974 × 1037.980 × 1037.490 × 1031.387 × 1047.755 × 1036.534 × 103
Min6.288 × 1036.340 × 1036.303 × 1036.040 × 1035.766 × 1031.198 × 1046.397 × 1035.135 × 103
F11Avg5.571 × 1031.594 × 1031.469 × 1032.114 × 1031.865 × 1031.495 × 1041.299 × 1031.259 × 103
Min1.574 × 1031.439 × 1031.291 × 1031.417 × 1031.394 × 1038.985 × 1031.200 × 1031.146 × 103
F12Avg2.581 × 1094.574 × 1076.833 × 1061.705 × 1081.999 × 1073.109 × 10105.722 × 1051.873 × 106
Min6.409 × 1072.731 × 1071.384 × 1061.878 × 1067.129 × 1061.600 × 10101.341 × 1051.078 × 106
F13Avg2.561 × 1082.672 × 1069.255 × 1041.340 × 1051.729 × 1041.251 × 10108.898 × 1035.362 × 103
Min1.454 × 1051.614 × 1063.011 × 1045.781 × 1039.648 × 1031.435 × 1092.611 × 1031.749 × 103
F14Avg9.567 × 1051.375 × 1056.855 × 1041.073 × 1054.132 × 1052.622 × 1073.670 × 1044.016 × 104
Min1.246 × 1043.771 × 1042.161 × 1049.772 × 1037.234 × 1048.185 × 1051.148 × 1041.202 × 104
F15Avg1.078 × 1075.308 × 1056.616 × 1048.967 × 1036.016 × 1031.341 × 1097.075 × 1032.964 × 103
Min4.298 × 1043.335 × 1051.422 × 1041.884 × 1032.543 × 1031.237 × 1081.943 × 1031.534 × 103
F16Avg4.104 × 1033.570 × 1033.769 × 1033.335 × 1032.915 × 1036.808 × 1033.575 × 1032.614 × 103
Min3.133 × 1032.836 × 1032.788 × 1032.616 × 1032.404 × 1035.302 × 1032.509 × 1032.148 × 103
F17Avg3.846 × 1033.218 × 1033.787 × 1033.151 × 1032.690 × 1034.919 × 1033.478 × 1032.474 × 103
Min3.034 × 1032.568 × 1033.044 × 1032.615 × 1032.084 × 1033.399 × 1032.827 × 1032.018 × 103
F18Avg4.168 × 1061.053 × 1063.688 × 1052.670 × 1061.816 × 1065.905 × 1071.937 × 1051.247 × 106
Min1.543 × 1052.843 × 1051.381 × 1052.302 × 1051.304 × 1055.306 × 1063.375 × 1041.067 × 105
F19Avg2.346 × 1062.754 × 1052.368 × 1046.213 × 1041.682 × 1049.590 × 1081.541 × 1041.276 × 104
Min5.030 × 1031.841 × 1052.700 × 1035.247 × 1032.057 × 1032.437 × 1072.172 × 1032.447 × 103
F20Avg3.529 × 1032.999 × 1033.311 × 1033.108 × 1032.830 × 1033.923 × 1033.317 × 1032.485 × 103
Min3.116 × 1032.429 × 1032.655 × 1032.572 × 1032.495 × 1033.475 × 1032.534 × 1032.081 × 103
F21Avg2.682 × 1032.604 × 1032.720 × 1032.503 × 1032.408 × 1033.071 × 1032.635 × 1032.360 × 103
Min2.575 × 1032.528 × 1032.590 × 1032.445 × 1032.379 × 1032.938 × 1032.518 × 1032.335 × 103
F22Avg1.028 × 1049.106 × 1039.780 × 1037.972 × 1035.227 × 1031.605 × 1049.538 × 1038.339 × 103
Min8.688 × 1037.734 × 1038.346 × 1032.497 × 1032.436 × 1031.474 × 1048.203 × 1036.317 × 103
F23Avg3.133 × 1033.009 × 1033.095 × 1033.137 × 1032.888 × 1033.969 × 1033.183 × 1032.793 × 103
Min3.013 × 1032.945 × 1032.974 × 1032.979 × 1032.822 × 1033.594 × 1033.056 × 1032.761 × 103
F24Avg3.197 × 1033.135 × 1033.224 × 1033.217 × 1033.030 × 1034.292 × 1033.274 × 1032.970 × 103
Min3.098 × 1033.071 × 1033.101 × 1033.098 × 1032.978 × 1033.875 × 1033.095 × 1032.931 × 103
F25Avg5.123 × 1033.062 × 1033.048 × 1033.889 × 1033.242 × 1031.069 × 1043.061 × 1033.070 × 103
Min3.031 × 1032.994 × 1032.964 × 1033.242 × 1033.176 × 1037.290 × 1033.021 × 1032.985 × 103
F26Avg8.137 × 1036.855 × 1038.205 × 1038.456 × 1035.513 × 1031.577 × 1048.342 × 1034.406 × 103
Min6.910 × 1036.234 × 1037.239 × 1035.759 × 1034.905 × 1031.445 × 1042.900 × 1034.051 × 103
F27Avg3.538 × 1033.403 × 1033.489 × 1034.237 × 1033.525 × 1035.612 × 1033.759 × 1033.307 × 103
Min3.407 × 1033.297 × 1033.361 × 1033.897 × 1033.448 × 1034.453 × 1033.460 × 1033.266 × 103
F28Avg7.554 × 1033.555 × 1033.296 × 1034.481 × 1033.749 × 1039.637 × 1033.300 × 1033.378 × 103
Min4.720 × 1033.268 × 1033.259 × 1033.882 × 1033.472 × 1038.008 × 1033.259 × 1033.310 × 103
F29Avg5.133 × 1034.380 × 1034.681 × 1035.199 × 1034.191 × 1031.608 × 1044.870 × 1033.545 × 103
Min4.271 × 1033.944 × 1033.587 × 1034.366 × 1033.748 × 1038.290 × 1034.292 × 1033.289 × 103
F30Avg2.924 × 1075.428 × 1062.810 × 1062.564 × 1071.768 × 1062.271 × 1091.204 × 1061.144 × 106
Min2.389 × 1063.442 × 1061.262 × 1068.984 × 1069.999 × 1052.782 × 1086.441 × 1059.567 × 105
Average rank: MFO 6.29, LMFO 3.82, WCMFO 4.25, CMFO 4.78, ODSFMFO 3.34, SMFO 7.93, WMFO 3.79, MFO-SFR 1.79.
Total rank: MFO 7, LMFO 3, WCMFO 5, CMFO 6, ODSFMFO 2, SMFO 8, WMFO 4, MFO-SFR 1.
Table 5. Comparison of MFO-SFR with well-known algorithms for CEC 2018 test functions with D = 30.
F.MetricsPSOKHGWOCSAHOAMFO-SFR
F1Avg5.907 × 10101.963 × 1041.145 × 1093.246 × 10103.003 × 1091.791 × 103
Min3.270 × 10107.354 × 1031.583 × 1082.130 × 10102.203 × 1091.017 × 102
F3Avg1.308 × 1054.403 × 1042.987 × 1049.600 × 1043.075 × 1041.312 × 104
Min1.039 × 1052.051 × 1041.476 × 1045.473 × 1042.032 × 1047.513 × 103
F4Avg1.183 × 1044.965 × 1025.369 × 1025.726 × 1031.047 × 1034.914 × 102
Min7.302 × 1034.041 × 1024.933 × 1023.185 × 1038.889 × 1024.700 × 102
F5Avg9.635 × 1026.454 × 1025.950 × 1028.509 × 1027.938 × 1025.227 × 102
Min8.845 × 1026.115 × 1025.511 × 1028.017 × 1027.612 × 1025.109 × 102
F6Avg6.921 × 1026.418 × 1026.047 × 1026.683 × 1026.605 × 1026.000 × 102
Min6.828 × 1026.303 × 1026.009 × 1026.554 × 1026.496 × 1026.000 × 102
F7Avg2.511 × 1038.400 × 1028.512 × 1021.730 × 1031.029 × 1037.669 × 102
Min2.201 × 1037.960 × 1027.932 × 1021.552 × 1039.979 × 1027.460 × 102
F8Avg1.220 × 1039.054 × 1028.709 × 1021.134 × 1031.061 × 1038.209 × 102
Min1.163 × 1038.647 × 1028.450 × 1021.105 × 1031.041 × 1038.090 × 102
F9Avg1.735 × 1043.138 × 1031.360 × 1031.040 × 1044.292 × 1039.038 × 102
Min1.271 × 1042.368 × 1039.830 × 1027.223 × 1032.668 × 1039.005 × 102
F10Avg8.218 × 1034.797 × 1033.874 × 1038.279 × 1038.340 × 1034.062 × 103
Min7.661 × 1033.165 × 1033.030 × 1037.738 × 1037.745 × 1032.461 × 103
F11Avg1.018 × 1041.711 × 1031.408 × 1034.700 × 1031.797 × 1031.143 × 103
Min7.488 × 1031.304 × 1031.236 × 1033.395 × 1031.699 × 1031.107 × 103
F12Avg6.824 × 1092.220 × 1063.441 × 1072.979 × 1093.763 × 1081.508 × 105
Min3.870 × 1097.468 × 1052.122 × 1061.516 × 1092.821 × 1082.035 × 104
F13Avg3.156 × 1093.457 × 1041.505 × 1069.478 × 1081.051 × 1086.405 × 103
Min5.760 × 1081.430 × 1044.674 × 1045.211 × 1083.296 × 1071.690 × 103
F14Avg7.227 × 1052.910 × 1051.926 × 1054.342 × 1051.340 × 1058.200 × 103
Min1.041 × 1051.873 × 1042.446 × 1041.482 × 1055.216 × 1042.021 × 103
F15Avg2.025 × 1081.788 × 1041.956 × 1057.286 × 1073.143 × 1075.614 × 103
Min1.064 × 1079.433 × 1031.435 × 1042.378 × 1078.141 × 1061.515 × 103
F16Avg4.452 × 1032.884 × 1032.385 × 1033.989 × 1033.786 × 1031.855 × 103
Min3.827 × 1032.377 × 1031.949 × 1033.147 × 1033.419 × 1031.617 × 103
F17Avg3.298 × 1032.277 × 1031.943 × 1032.628 × 1032.361 × 1031.745 × 103
Min2.755 × 1031.804 × 1031.778 × 1032.256 × 1032.102 × 1031.727 × 103
F18Avg6.473 × 1064.258 × 1058.049 × 1057.890 × 1061.212 × 1061.493 × 105
Min6.593 × 1054.192 × 1046.880 × 1042.022 × 1064.281 × 1054.305 × 104
F19Avg2.509 × 1089.593 × 1047.374 × 1051.358 × 1084.426 × 1076.534 × 103
Min3.341 × 1071.081 × 1043.279 × 1036.263 × 1071.774 × 1071.910 × 103
F20Avg2.847 × 1032.624 × 1032.362 × 1032.759 × 1032.681 × 1032.091 × 103
Min2.574 × 1032.303 × 1032.146 × 1032.476 × 1032.487 × 1032.004 × 103
F21Avg2.707 × 1032.416 × 1032.379 × 1032.625 × 1032.575 × 1032.321 × 103
Min2.615 × 1032.359 × 1032.351 × 1032.587 × 1032.539 × 1032.312 × 103
F22Avg8.759 × 1033.018 × 1034.411 × 1036.831 × 1034.513 × 1032.300 × 103
Min6.900 × 1032.300 × 1032.406 × 1035.558 × 1032.705 × 1032.300 × 103
F23Avg3.239 × 1032.880 × 1032.729 × 1033.143 × 1033.133 × 1032.671 × 103
Min3.101 × 1032.807 × 1032.678 × 1033.061 × 1033.059 × 1032.654 × 103
F24Avg3.539 × 1033.107 × 1032.890 × 1033.319 × 1033.188 × 1032.844 × 103
Min3.253 × 1032.994 × 1032.849 × 1033.206 × 1033.122 × 1032.828 × 103
F25Avg7.655 × 1032.911 × 1032.958 × 1034.890 × 1033.137 × 1032.887 × 103
Min5.843 × 1032.887 × 1032.916 × 1034.344 × 1033.069 × 1032.887 × 103
F26Avg8.709 × 1035.651 × 1034.483 × 1038.661 × 1034.738 × 1033.903 × 103
Min6.500 × 1032.800 × 1033.473 × 1037.772 × 1033.736 × 1033.739 × 103
F27Avg3.827 × 1033.400 × 1033.230 × 1033.690 × 1033.720 × 1033.219 × 103
Min3.591 × 1033.283 × 1033.212 × 1033.537 × 1033.616 × 1033.208 × 103
F28Avg6.851 × 1033.228 × 1033.356 × 1035.474 × 1033.519 × 1033.216 × 103
Min5.611 × 1033.198 × 1033.283 × 1034.541 × 1033.468 × 1033.196 × 103
F29Avg5.426 × 1034.194 × 1033.642 × 1035.253 × 1034.726 × 1033.414 × 103
Min4.907 × 1033.858 × 1033.439 × 1034.952 × 1034.454 × 1033.323 × 103
F30Avg2.946 × 1081.043 × 1062.842 × 1061.084 × 1082.610 × 1077.835 × 103
Min8.913 × 1078.260 × 1045.249 × 1054.274 × 1071.221 × 1076.362 × 103
Average rank: PSO 5.84, KH 2.68, GWO 2.45, CSA 5.00, HOA 3.93, MFO-SFR 1.09.
Total rank: PSO 6, KH 3, GWO 2, CSA 5, HOA 4, MFO-SFR 1.
Table 6. Comparison of MFO-SFR with well-known algorithms for CEC 2018 test functions with D = 50.
F.MetricsPSOKHGWOCSAHOAMFO-SFR
F1Avg1.526 × 10111.703 × 1054.506 × 1099.609 × 10101.149 × 10103.463 × 104
Min8.451 × 10101.932 × 1047.183 × 1088.123 × 10108.346 × 1099.385 × 103
F3Avg2.549 × 1051.192 × 1057.730 × 1042.070 × 1058.230 × 1045.495 × 104
Min1.957 × 1057.643 × 1044.662 × 1041.774 × 1056.990 × 1044.223 × 104
F4Avg3.307 × 1045.428 × 1028.017 × 1021.712 × 1042.589 × 1035.867 × 102
Min1.811 × 1044.765 × 1026.224 × 1021.286 × 1042.058 × 1035.196 × 102
F5Avg1.367 × 1037.662 × 1026.775 × 1021.211 × 1031.047 × 1035.624 × 102
Min1.256 × 1037.090 × 1026.316 × 1021.145 × 1031.011 × 1035.318 × 102
F6Avg7.114 × 1026.499 × 1026.112 × 1026.862 × 1026.745 × 1026.001 × 102
Min6.991 × 1026.393 × 1026.075 × 1026.776 × 1026.645 × 1026.000 × 102
F7Avg4.585 × 1031.052 × 1039.899 × 1023.206 × 1031.338 × 1038.682 × 102
Min4.142 × 1039.353 × 1029.057 × 1022.735 × 1031.289 × 1038.099 × 102
F8Avg1.687 × 1031.059 × 1039.862 × 1021.499 × 1031.354 × 1038.610 × 102
Min1.575 × 1031.033 × 1039.423 × 1021.423 × 1031.283 × 1038.318 × 102
F9Avg5.170 × 1049.873 × 1034.844 × 1033.622 × 1042.153 × 1049.243 × 102
Min3.909 × 1047.955 × 1032.552 × 1033.021 × 1041.416 × 1049.066 × 102
F10Avg1.441 × 1047.948 × 1035.919 × 1031.429 × 1041.412 × 1046.534 × 103
Min1.356 × 1046.701 × 1034.473 × 1031.342 × 1041.324 × 1045.135 × 103
F11Avg2.610 × 1044.813 × 1032.766 × 1031.573 × 1043.782 × 1031.259 × 103
Min1.947 × 1043.034 × 1031.652 × 1031.157 × 1043.239 × 1031.146 × 103
F12Avg4.300 × 10101.287 × 1073.406 × 1082.210 × 10102.417 × 1091.873 × 106
Min2.464 × 10104.289 × 1063.715 × 1071.421 × 10101.705 × 1091.078 × 106
F13Avg1.683 × 10105.864 × 1041.026 × 1086.132 × 1095.696 × 1085.362 × 103
Min5.672 × 1092.099 × 1048.150 × 1043.709 × 1094.310 × 1081.749 × 103
F14Avg6.142 × 1065.701 × 1053.453 × 1054.140 × 1068.316 × 1054.016 × 104
Min1.659 × 1061.615 × 1052.928 × 1041.829 × 1062.222 × 1051.202 × 104
F15Avg5.029 × 1091.956 × 1044.078 × 1061.190 × 1092.294 × 1082.964 × 103
Min2.307 × 1099.499 × 1032.722 × 1044.163 × 1089.908 × 1071.534 × 103
F16Avg7.306 × 1033.250 × 1032.896 × 1036.252 × 1035.184 × 1032.614 × 103
Min6.699 × 1032.463 × 1032.326 × 1035.731 × 1034.749 × 1032.148 × 103
F17Avg1.334 × 1043.359 × 1032.661 × 1035.190 × 1033.888 × 1032.474 × 103
Min5.938 × 1032.849 × 1032.264 × 1034.547 × 1033.183 × 1032.018 × 103
F18Avg3.800 × 1072.330 × 1063.051 × 1063.471 × 1078.478 × 1061.247 × 106
Min1.430 × 1071.131 × 1063.848 × 1051.224 × 1074.500 × 1061.067 × 105
F19Avg2.060 × 1091.719 × 1051.231 × 1065.231 × 1087.924 × 1071.276 × 104
Min6.897 × 1082.586 × 1049.261 × 1032.060 × 1082.979 × 1072.447 × 103
F20Avg4.010 × 1033.276 × 1032.743 × 1033.903 × 1033.675 × 1032.485 × 103
Min3.712 × 1032.764 × 1032.380 × 1033.680 × 1033.261 × 1032.081 × 103
F21Avg3.164 × 1032.556 × 1032.473 × 1032.994 × 1032.850 × 1032.360 × 103
Min3.046 × 1032.455 × 1032.422 × 1032.896 × 1032.767 × 1032.335 × 103
F22Avg1.609 × 1041.049 × 1048.211 × 1031.603 × 1041.527 × 1048.339 × 103
Min1.492 × 1049.115 × 1036.990 × 1031.493 × 1044.474 × 1036.317 × 103
F23Avg4.077 × 1033.379 × 1032.916 × 1033.777 × 1033.749 × 1032.793 × 103
Min3.763 × 1033.052 × 1032.826 × 1033.595 × 1033.533 × 1032.761 × 103
F24Avg4.174 × 1033.643 × 1033.109 × 1033.983 × 1033.744 × 1032.970 × 103
Min3.943 × 1033.406 × 1032.991 × 1033.738 × 1033.597 × 1032.931 × 103
F25Avg2.556 × 1043.092 × 1033.355 × 1031.580 × 1044.331 × 1033.070 × 103
Min1.693 × 1043.052 × 1033.146 × 1031.384 × 1043.997 × 1032.985 × 103
F26Avg1.697 × 1049.335 × 1035.804 × 1031.506 × 1045.800 × 1034.406 × 103
Min1.280 × 1043.159 × 1034.993 × 1031.326 × 1045.170 × 1034.051 × 103
F27Avg5.456 × 1034.354 × 1033.516 × 1035.185 × 1035.049 × 1033.307 × 103
Min4.582 × 1033.985 × 1033.402 × 1034.636 × 1034.657 × 1033.266 × 103
F28Avg1.242 × 1043.346 × 1033.927 × 1031.039 × 1044.740 × 1033.378 × 103
Min9.606 × 1033.311 × 1033.545 × 1038.991 × 1034.469 × 1033.310 × 103
F29Avg1.018 × 1045.360 × 1034.202 × 1038.981 × 1036.514 × 1033.545 × 103
Min8.653 × 1034.196 × 1033.826 × 1037.451 × 1036.113 × 1033.289 × 103
F30Avg2.508 × 1094.241 × 1077.032 × 1071.287 × 1093.337 × 1081.144 × 106
Min1.281 × 1091.429 × 1073.629 × 1076.453 × 1082.374 × 1089.567 × 105
Average rank: PSO 5.93, KH 2.70, GWO 2.29, CSA 4.98, HOA 3.92, MFO-SFR 1.18.
Total rank: PSO 6, KH 3, GWO 2, CSA 5, HOA 4, MFO-SFR 1.
Table 7. The overall effectiveness of MFO-SFR and MFO variants.
Algorithms | 30 (W|T|L) | 50 (W|T|L) | Total (W|T|L) | OE
MFO | 0|0|29 | 0|0|29 | 0|0|58 | 0%
LMFO | 0|0|29 | 0|0|29 | 0|0|58 | 0%
WCMFO | 1|0|28 | 2|0|27 | 3|0|55 | 5.17%
CMFO | 0|0|29 | 0|0|29 | 0|0|58 | 0%
ODSFMFO | 1|0|28 | 1|0|28 | 2|0|56 | 3.45%
SMFO | 0|0|29 | 0|0|29 | 0|0|58 | 0%
WMFO | 4|0|25 | 6|0|23 | 10|0|48 | 17.24%
MFO-SFR | 23|0|6 | 20|0|9 | 43|0|15 | 74.14%
Table 8. The overall effectiveness of MFO-SFR and contender algorithms.
Algorithms | 30 (W|T|L) | 50 (W|T|L) | Total (W|T|L) | OE
PSO | 0|0|29 | 0|0|29 | 0|0|58 | 0%
KH | 0|0|29 | 2|0|27 | 2|0|56 | 3.45%
GWO | 1|0|28 | 2|0|27 | 3|0|55 | 5.17%
CSA | 0|0|29 | 0|0|29 | 0|0|58 | 0%
HOA | 0|0|29 | 0|0|29 | 0|0|58 | 0%
MFO-SFR | 28|0|1 | 25|0|4 | 53|0|5 | 91.38%
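As a quick sanity check on the OE column, the sketch below assumes that OE is computed as the total number of wins divided by the total number of test cases (29 functions × 2 dimensions = 58); this assumption reproduces the tabulated percentages. The helper name compute_oe is illustrative.

```python
def compute_oe(wins_30, wins_50, cases_per_dim=29):
    """Overall effectiveness as total wins over total test cases (assumed definition)."""
    total_cases = 2 * cases_per_dim
    return 100.0 * (wins_30 + wins_50) / total_cases

# Reproduces the tabulated values:
print(round(compute_oe(23, 20), 2))  # MFO-SFR in Table 7 -> 74.14
print(round(compute_oe(28, 25), 2))  # MFO-SFR in Table 8 -> 91.38
```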
Table 9. Comparison of the results obtained for the welded beam design problem.
Algorithms | h | l | t | b | Optimum cost
MFO | 0.20576 | 3.47017 | 9.03582 | 0.20577 | 1.72499
LMFO | 0.20739 | 3.43588 | 9.25131 | 0.20807 | 1.77799
WCMFO | 0.20573 | 3.47035 | 9.03670 | 0.20573 | 1.72489
CMFO | 0.18276 | 4.49027 | 8.98308 | 0.20819 | 1.82934
ODSFMFO | 0.20569 | 3.47128 | 9.03662 | 0.20573 | 1.72490
SMFO | 0.20836 | 3.46324 | 8.99144 | 0.20853 | 1.74135
WMFO | 0.20722 | 3.45341 | 8.99834 | 0.20748 | 1.73151
MFO-SFR | 0.20573 | 3.47056 | 9.03662 | 0.20573 | 1.72486
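To relate the decision variables in Table 9 to the reported costs, the sketch below evaluates the commonly used welded beam fabrication-cost objective, f(h, l, t, b) = 1.10471·h²·l + 0.04811·t·b·(14 + l), at the MFO-SFR solution; constraint handling is omitted, and the formulation is the standard benchmark definition rather than code taken from the paper.

```python
def welded_beam_cost(h, l, t, b):
    """Fabrication cost of the welded beam design (standard benchmark formulation)."""
    return 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)

# MFO-SFR solution reported in Table 9
print(round(welded_beam_cost(0.20573, 3.47056, 9.03662, 0.20573), 5))  # ≈ 1.72486
```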
Table 10. Comparison of the results obtained for the four-stage gearbox problem.
Variables | MFO | LMFO | WCMFO | CMFO | ODSFMFO | SMFO | WMFO | MFO-SFR
x1 | 21.9311 | 26.0390 | 13.2709 | 14.4470 | 19.9522 | 7.4896 | 11.6081 | 19.7537
x2 | 56.2686 | 59.6014 | 38.0428 | 35.9087 | 42.7879 | 36.8603 | 36.5289 | 51.3053
x3 | 6.5100 | 23.4280 | 47.0038 | 13.7051 | 16.7363 | 9.6905 | 32.5497 | 17.3633
x4 | 21.2349 | 62.1487 | 64.7345 | 27.4891 | 34.3685 | 35.0066 | 48.2554 | 35.7343
x5 | 17.8415 | 41.7704 | 6.9925 | 19.5000 | 15.1662 | 15.4976 | 18.6551 | 17.4063
x6 | 37.9201 | 50.8023 | 16.3730 | 32.6744 | 27.1178 | 36.7195 | 35.1674 | 30.5308
x7 | 23.1300 | 18.5328 | 16.3790 | 13.5190 | 22.4386 | 37.1973 | 15.6177 | 21.5071
x8 | 27.5979 | 49.6525 | 33.9110 | 31.5029 | 56.4417 | 15.5455 | 38.4569 | 44.0394
x9 | 0.5100 | 0.9809 | 0.7443 | 1.3021 | 0.9727 | 2.0702 | 1.4277 | 0.8917
x10 | 3.3914 | 0.5964 | 0.8894 | 2.4919 | 0.8552 | 0.7589 | 1.3920 | 1.2137
x11 | 0.5100 | 0.7173 | 4.2285 | 1.4999 | 1.2069 | 0.8063 | 0.9781 | 0.6197
x12 | 0.5100 | 0.5100 | 1.2452 | 1.4996 | 0.9356 | 1.3444 | 1.0893 | 1.1821
x13 | 7.8792 | 5.4735 | 6.3799 | 3.4649 | 2.3679 | 0.9250 | 1.6757 | 4.1070
x14 | 5.9533 | 4.8956 | 5.5138 | 6.4560 | 5.1076 | 4.7899 | 5.7668 | 6.8698
x15 | 5.3993 | 4.9521 | 5.8834 | 5.1835 | 5.0748 | 0.5867 | 5.6259 | 2.6797
x16 | 6.9070 | 4.8722 | 2.7391 | 5.1577 | 4.1023 | 4.6470 | 5.0989 | 2.2119
x17 | 6.7122 | 4.6078 | 5.4174 | 6.4936 | 5.1441 | 4.0477 | 4.6631 | 4.2234
x18 | 7.5214 | 7.5663 | 8.3727 | 6.4751 | 5.1403 | 3.3638 | 3.5762 | 1.5645
x19 | 4.7338 | 3.7869 | 3.5741 | 4.4972 | 3.9364 | 3.6366 | 3.7593 | 3.7526
x20 | 5.3254 | 4.3966 | 3.3184 | 3.4597 | 4.8508 | 4.6303 | 3.5222 | 5.4510
x21 | 4.8923 | 3.3528 | 5.7764 | 4.4278 | 2.8171 | 2.9203 | 2.6439 | 4.0416
x22 | 5.6086 | 4.0724 | 4.8985 | 4.4763 | 2.7781 | 3.9267 | 5.6805 | 5.2131
Optimum weight | 7.2632 × 10^1 | 6.6228 × 10^1 | 2.1631 × 10^12 | 5.2157 × 10^1 | 3.6565 × 10^1 | 3.4380 × 10^17 | 2.6870 × 10^14 | 3.6555 × 10^1
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
