1. Introduction
Optimization problems arise widely in scientific and engineering fields. Over time, scholars have developed many methods to deal with them. These methods include the Gradient Descent Optimizer [
1,
2], Line Search Algorithm [
3,
4], and Trust Region Algorithm [
5,
6], among others. However, as problems become increasingly complex, traditional methods face challenges when confronted with optimization problems that involve intricate constraints and complicated calculation processes. To meet such requirements, researchers have introduced metaheuristic algorithms, which have a simple structure, strong global search capability, robustness, and independence from gradient information. Metaheuristic algorithms have indeed emerged as potent tools for addressing complex optimization problems across diverse fields. These algorithms outperform traditional optimization methods owing to their efficacy in dealing with intricate constraints, non-linear relationships, and high-dimensional search spaces [
7].
In recent times, a large number of metaheuristic algorithms (MAs) have been proposed, encompassing both classic metaheuristic algorithms and their improved variants. Classic MAs can be classified into seven categories, namely Biology-based (BioA), Math-based (MaA), Physics-based (PhyA), Evolutionary-based (EvoA), Human-social-based (HuSoA), Plant-based (PlA), and Music-based (MuA) algorithms [
8]. The classification of MAs is shown in
Table 1.
BioA algorithms draw inspiration from animal group activities in nature. Examples of BioA algorithms include PSO [
9], GWO [
10] and FOX [
11]. MaA algorithms are based on principles and laws in mathematics, such as SCA [
12] and CIOA [
13]. PhyA algorithms are inspired by physical phenomena in nature, such as the SA [
14] and GSA [
15]. EvoA algorithms are inspired by biological evolutionary mechanisms such as natural selection and genetics; some classical EvoA algorithms include GA [
16] and DE [
17]. HuSoA algorithms are generally derived from human social phenomena and activities; some classical HuSoA algorithms include TLBO [
18] and TCCO [
19]. PlA algorithms are based on intelligent behavior in plants, including IWO [
20] and PA [
21]. MuA algorithms are inspired by music-related concepts and principles, such as MS [
22] and HS [
23].
Currently, in the research of metaheuristic algorithms, addressing the issue of local optima while exploring the problem space remains an important research area. A more effective direction in metaheuristic algorithm research is to enhance the internal structure of existing algorithms to tackle various complex optimization problems [
24]. In recent years, researchers have proposed various improved variants based on MA to solve complex optimization problems.
Shaukat proposed a modified genetic algorithm (MGA) for optimizing the multi-objective core overloading pattern. In comparison to the classical GA, MGA effectively preserves chromosomes with the best fitness, resulting in a more efficient search for the optimal fuel loading pattern [
25]. Lodewijks conducted a comparison of the optimization performance of three state-of-the-art Particle Swarm Optimization (PSO) algorithms for solving the optimization problem of an Airport Baggage Handling Transportation System (BHTS) [
26]. The experimental results showed that all three variants of PSO were capable of finding effective and efficient solutions. Among them, the Self-Regulation PSO (SRPSO) algorithm, which exhibited the lowest CPU running time, was selected and adopted. Romeh proposed the Hybrid Vulture Cooperative Multi-Robot Exploration (HVCME) algorithm and applied it to optimize the construction of limited maps in multi-robot exploration [
27]. Compared to four other similar algorithms, HVCME demonstrated more effective optimization of limited map construction in unknown indoor environments.
In addition to these algorithms, the Squirrel Search Algorithm [
28] (SSA) has emerged as a BioA algorithm. It is inspired by the foraging strategy of flying squirrels, which use parachute-like membranes to glide between trees in search of food. The SSA employs a combination of random search and local search mechanisms, enabling it to effectively search the solution space and converge towards optimal or near-optimal solutions. SSA has demonstrated strong competitiveness when compared to well-known MAs such as PSO, Artificial Bee Colony (ABC), and others. Since its proposal, SSA has been applied to various complex optimization problems such as Production Scheduling [
29,
30,
31], Image Analysis [
32,
33], and Biomedicine [
34,
35]. The versatility and effectiveness of SSA make it a valuable tool for solving optimization problems in diverse domains.
At the same time, like any other metaheuristic algorithm, SSA may not be suitable for every optimization problem, so researchers may choose to improve it to address specific problem characteristics or requirements. SSA also faces challenges such as an imbalance between exploration and exploitation, falling into local optima, and low convergence accuracy. To address these issues, numerous scholars have worked to improve the performance of SSA. These efforts can be divided into three main categories based on the improvement strategy.
Firstly, the adaptive parameter mechanism refers to improving one or more constants in the algorithm to make it adaptive in the position updating process. Zheng added the adaptive strategy of predator existence probability and the optimal selection strategy to enhance the exploitation capabilities of SSA [
36]. Wen used the roulette strategy to update the squirrels’ position on normal trees, which could increase the population diversity and avoid falling into the local optimum [
37]. Secondly, the search strategy update mechanism refers to improving one or more position update strategies in the algorithm. Wang used the jump search method to improve the winter search strategy and the progressive search method to improve the Levy flight stage, which enable SSA to maintain population diversity in these two stages [
38]. Karaboga used the cloud model to replace the random function of uniform distribution to update the new position of squirrels, which improved the convergence accuracy of SSA [
39]. Thirdly, hybridization with other algorithms refers to combining the SSA update strategy with mechanisms from other algorithms. Sakthivel combined the Pareto dominance principle with SSA to preserve the distribution diversity of Pareto optimal solutions during the algorithm's evolution process [
40]. Liu used the crossover operator and mutation operator to enhance the squirrel position update stage, which increased the population diversity of SSA and improved the convergence of SSA [
41].
The purpose of this paper is to develop a fuzzy squirrel search algorithm based on a wide-area search mechanism (FSSSA). There are three improvement strategies in FSSSA:
To accelerate the convergence speed, the adaptive weight optimized by the fuzzy inference system (FIS) is added to change the step size in three position update strategies.
The sine cosine mutation strategy (SCM) enhances the exploration ability and avoids falling into local optima by improving the sliding constant in the position update stage.
The wide-area search mechanism (WAS) is used to improve the location update stage of some elite individuals (the first type of location update stage). It balances exploration and exploitation and improves the convergence accuracy of the algorithm.
The main contributions of this paper can be summarized as follows:
On the basis of SSA, this paper proposes an improved squirrel search algorithm (FSSSA).
In the FSSSA, the exploration ability is enhanced by using the FIS and the SCM strategy. The exploitation ability is enhanced by using the WAS strategy.
The FSSSA is tested on 24 benchmark functions and four engineering problems. Compared with the other algorithms, FSSSA shows preferable performance.
The rest of this paper is structured as follows. The second section introduces the principle of the basic SSA. The third section introduces the three strategies used to improve the SSA and analyzes the computational complexity of the FSSSA. The fourth and fifth sections demonstrate the validity of FSSSA on benchmark functions and four engineering optimization problems. The sixth section summarizes the research content of this paper and outlines future work.
3. The Proposed Algorithm (FSSSA)
While SSA does possess strong global search capabilities, the fixed search range and direction in each iteration can result in slow convergence. Additionally, the random search strategy employed by the elite individuals in SSA for exploring the global range can lead to weaker exploitation ability and lower convergence accuracy. Moreover, when dealing with high-dimensional complex optimization problems, SSA may encounter challenges related to falling into local optima. The exploration-exploitation balance becomes crucial in such scenarios to avoid being trapped in suboptimal solutions.
Aiming at the above limitations, this section proposes three strategies, the FIS, the SCM, and the WAS. By employing the FIS, the output of inertia weights can be obtained to adjust the step size variation during iterations, thereby accelerating the convergence speed of the algorithm. By utilizing the SCM, the sliding constant can be adjusted to enable the search range and search direction to vary with iterations, thereby enhancing the exploratory capability of the algorithm. Through the use of WAS, the search mechanism of elite individuals can be improved, thereby enhancing their exploitative capability in the vicinity of the optimal solution.
3.1. Introduced Fuzzy Inference System
This paper uses the FIS to output the inertia weight. It is added into the three position updating stages and makes the search step size change with the number of iterations, improving the convergence speed of the algorithm.
Under the framework of an adaptive network, the FIS [
43] can combine fuzzy inference and control the input of the system. The FIS is divided into five parts: Fuzzification Interface, Fuzzy Database, Fuzzy Rule Base, Fuzzy Reasoning, and Defuzzification Interface. The FIS is shown in
Figure 1.
FIS has been used in many fields. Kumar [
44] used the FIS to provide localization results for transnational faults of circuits. Yu [
45] used the FIS to optimize the neural network. These studies demonstrate that the FIS has preferable performance and can be used to optimize metaheuristic algorithms.
Liu used FIS to optimize the PSO according to the model error to make its parameters dynamically self-adaptive [
46]. Amador–Angulo used Type-2 FIS to optimize the bee colony optimization and make its parameters dynamically self-adaptive [
47]. Many researchers have applied the FIS to improve algorithms and achieved remarkable results.
The FIS designed in this paper, named FSSSA-Type-1, belongs to the category of Type-1 FIS. When dealing with optimization problems, metaheuristic algorithms often encounter uncertainties, such as the form of the objective function and the complexity of the constraint conditions. Type-1 FIS can effectively handle these uncertainties by utilizing fuzzy sets and fuzzy rules for modeling and inference. This approach enhances the adaptability of the algorithm to complex and ambiguous problems. The fuzzy rules defined in this system utilize triangular membership functions. The system diagram of the FSSSA is illustrated in
Figure 2. The output surface of the fuzzy rules is shown in
Figure 3.
The inputs of the system are iteration progress (
) and iteration stall time (
). Where iteration progress
is the ratio of the current iteration to the number of maximum iterations.
is the current iteration progress and
is the maximum number of iterations. The
represents the time or number of iterations elapsed when the algorithm is unable to significantly improve the quality of the solution. Its calculation method is depicted in Equation (11).
where
represents the best historical fitness of the
iteration.
represents the optimal historical fitness of the
iteration. The single output of the system is inertia weight
.
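A minimal sketch of this bookkeeping, assuming stall time is incremented whenever the best historical fitness fails to improve and reset otherwise (the improvement tolerance below is an illustrative assumption, as Equation (11) is not reproduced here):

```python
def update_stall_time(stall_time, best_prev, best_curr, tol=1e-12):
    """Increment the iteration stall time when the best historical fitness
    has not improved by more than `tol`; reset it otherwise.
    `tol` is an illustrative tolerance, not a value from the paper."""
    if best_prev - best_curr > tol:   # minimization: lower fitness is better
        return 0                      # improvement found, reset stall counter
    return stall_time + 1             # no significant improvement
```

A counter maintained this way grows exactly while the search stagnates, which is the quantity the FIS uses to raise the inertia weight.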
By observing the output surface in
Figure 3, a clear understanding of the distribution of fuzzy outputs obtained by the FSSSA-Type-1 can be achieved. In this process, the inertia weight (
) gradually increases with the iteration stall time, thereby enhancing the algorithm's exploration capability in the later stages of iteration. Consequently, this increases the likelihood of escaping local optima and further improves the algorithm's performance.
The membership functions of the input and output are shown in
Figure 4.
The position update strategy of SSA is adjusted by the inertia weight, which increases the exploitation ability and accelerates the convergence speed of the algorithm.
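As a rough illustration of such a system, a minimal Type-1 Mamdani-style sketch can show how triangular membership functions map the two inputs to the inertia weight. The membership ranges, rule base, and output centroids below are illustrative assumptions, not the exact FSSSA-Type-1 design:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fis_inertia_weight(progress, stall):
    """Minimal Type-1 Mamdani-style sketch: two inputs (iteration progress
    in [0, 1] and normalized stall time in [0, 1]), one output (inertia
    weight). Ranges, rules, and centroids are illustrative assumptions."""
    # fuzzify the inputs into low/high memberships
    p_low, p_high = tri(progress, -1.0, 0.0, 1.0), tri(progress, 0.0, 1.0, 2.0)
    s_low, s_high = tri(stall, -1.0, 0.0, 1.0), tri(stall, 0.0, 1.0, 2.0)
    # rule base (min for AND), each firing strength paired with an output centroid
    rules = [
        (min(p_low, s_low), 0.4),    # early search, no stall -> small weight
        (min(p_low, s_high), 0.7),   # early stall -> larger weight
        (min(p_high, s_low), 0.5),
        (min(p_high, s_high), 0.9),  # late stall -> largest weight (escape)
    ]
    # weighted-average defuzzification
    num = sum(f * c for f, c in rules)
    den = sum(f for f, _ in rules)
    return num / den if den else 0.5
```

With this rule base the output weight grows as stall time grows, mirroring the behavior described for the output surface in Figure 3.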
3.2. Introduced Sine Cosine Mutation
In SSA, each search individual follows only a fixed direction, which leads to slow convergence and may cause the algorithm to fall into a local optimum. The sine and cosine variables in the Sine Cosine Algorithm (SCA) [
12] change randomly with iteration. The sine and cosine variables were introduced to improve the gliding constant
, so that the search range and direction change with iteration.
The investigation stage of the SCA is shown in Equation (12).
where
is the position of the current solution in the
dimension at the
iteration and
is the position of the best point (best solution) in the
dimension. The roles of
, and
define, respectively, the moving direction of the next position, the distance moved to the next position, a random weight, and a random decision parameter.
and
is a predetermined constant, where in SCA, its value is set to 3. The
and
are random numbers in [0, 2]. The
is a random number in [0, 1].
According to Equation (12), the direction and the distance of SCA are random. The random combinatorial property of the SCA was used to improve the
in Equations (3)–(5). The improved
is no longer a fixed value; it dynamically adapts and changes with each iteration. This method is mathematically formulated as follows.
where
and
are random numbers in [0, 1].
randomly changes the search step of SSA.
By improving the sliding constant, the search direction and range of SSA change accordingly, which increases the convergence speed of SSA and enhances its ability to jump out of local optima.
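As a rough illustration of the idea (Equation (13) itself is not reproduced here), the gliding constant can be perturbed with sine and cosine terms in the style of the SCA. The base value 1.9 (the gliding constant of the classical SSA) and the exact combination below are assumptions:

```python
import math
import random

def sine_cosine_gliding_constant(base_gc=1.9):
    """Sketch of the sine cosine mutation of the gliding constant.
    Following the SCA pattern, the constant is perturbed by sin/cos of a
    random angle so the search range and direction vary between iterations.
    The form and the base value 1.9 are illustrative assumptions."""
    r1 = random.uniform(0, 1)            # random decision in [0, 1]
    r2 = random.uniform(0, 2 * math.pi)  # random angle
    if r1 < 0.5:
        return base_gc * abs(math.sin(r2))  # range varies with sin
    return base_gc * abs(math.cos(r2))      # ... or with cos
```

Because the multiplier is drawn anew each iteration, the gliding step is no longer fixed, which is the behavior the SCM strategy targets.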
3.3. Introduced Wide-Area Search Mechanism
In SSA, elite individuals randomly jump within the global search range, and the exploitation ability of the algorithm is weak, resulting in low convergence accuracy. This section used the WAS to improve the search strategy of elite individuals.
Many metaheuristic algorithms also add a WAS to improve the algorithm’s exploitation ability. Simulated Annealing (SA) [
14] added the WAS in the status update phase. It reduced the probability of accepting the new value and increased the exploitation ability. Improved Evolution Algorithm (IEA) [
48] used the WAS to improve the evolution operator of Differential Evolution (DE). It increased the population diversity of DE.
The WAS is added to the location update strategy of producers in the Sparrow search algorithm [
49] by Equation (14).
where
represents the value of the
position of the sparrow in the
dimension during the
iteration. The
denotes the maximum number of iterations. Both
and
are random numbers within the range [0, 1]. The ST represents the safety threshold. The
is a random number generated according to a normal distribution. The
is a
matrix (where
signifies the problem’s dimension), with each element being 1.
When , it indicates the absence of predators in the vicinity. In this case, producers (sparrows) narrow their search range with each iteration in a randomized manner, restricting their movement to the neighborhood of the current optimal solution.
Inspired by the Sparrow search algorithm, the position of some elite individuals (
——
) is updated as Equation (15).
where both
and
are random numbers within the range [0, 1]. The
is an improved sliding variable according to the Formula (13). The
represents the probability of the presence of predators. When
, it signifies the absence of predators in the forest, prompting squirrels on oak trees to engage in the wide-area search mechanism to locate the pecan tree. When
, it indicates the presence of predators in the forest, and squirrels update their positions randomly to evade the predators.
By incorporating non-deterministic elements based on the acquired information and results during the search process, the efficiency and performance of the search conducted by elite individuals are improved. This approach enables more refined searches within the neighborhood of the optimal value, utilizing the updated positions of elite individuals. It aims to thoroughly explore the entire neighborhood as much as possible, thereby enhancing the exploitation capability and convergence accuracy of SSA.
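The branch between wide-area refinement and random relocation can be sketched as follows. The exponential contraction is borrowed from the sparrow search producer update of Equation (14); the decay form, bounds, and parameter names are illustrative assumptions rather than Equation (15) itself:

```python
import math
import random

def was_elite_update(x, x_best, t, t_max, pdp=0.1, lower=-10.0, upper=10.0):
    """Sketch of the wide-area search update for one elite squirrel.
    With probability 1 - pdp (no predator), the squirrel contracts toward
    the neighborhood of the current best, tightening as iterations progress
    (exp decay in the style of the sparrow search producer update).
    Otherwise, it relocates randomly to evade the predator.
    All constants here are illustrative assumptions."""
    d = len(x)
    if random.random() >= pdp:  # no predator: refine around the best solution
        shrink = math.exp(-t / (random.uniform(1e-9, 1.0) * t_max))
        return [xb + shrink * (xi - xb) for xi, xb in zip(x, x_best)]
    # predator present: random relocation within the search bounds
    return [random.uniform(lower, upper) for _ in range(d)]
```

Each refinement step lands between the squirrel's current position and the best position, so repeated applications concentrate the elite search in the optimal neighborhood.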
3.4. SSA with Mixed Strategy
Firstly, the adaptive weight optimized by the FIS is added to change the step size in the three position update strategies, which improves exploitation and accelerates the convergence speed of the SSA. Secondly, the SCM is introduced to enhance the sliding constant in the position update stage; the search direction and step are adjusted during iteration, which enhances the exploration ability and avoids falling into local optima. Finally, the WAS mechanism is introduced to improve the elite individuals, making them search within the neighborhood of the optimal value. It balances exploration and exploitation and improves the convergence accuracy of the algorithm.
The three-position update strategies of the improved SSA are shown in Equations (16)–(18).
To show the optimization idea of FSSSA more clearly, the pseudocode Algorithm 2 of FSSSA is shown, and the flow chart of FSSSA is shown in
Figure 5.
Algorithm 2 Pseudocode for FSSSA
Begin:
  Input the optimization problem information.
  Set control parameters: population (), scaling factor (), and predators ().
  Generate random locations for flying squirrels using Equation (1).
  Evaluate the fitness of each flying squirrel's location.
  Sort flying squirrel locations by fitness value.
  The best value is defined as the squirrel on the pecan tree, the next three best values are the squirrels on the oak trees, and the remaining values are the squirrels on the normal trees.
  while (the stopping criterion is not satisfied) do
    Calculate the iterative stall time () using Equation (11).
    Calculate the inertia weight () from the fuzzy inference system.
    Calculate the gliding constant () using Equation (13).
    for t = 1 to
      Update flying squirrel locations that are on oak trees and moving towards pecan trees using Equation (16).
      Update flying squirrel locations that are on normal trees and moving towards oak trees using Equation (17).
      Evaluate the fitness of each flying squirrel's location and reorder the squirrels.
      Update flying squirrel locations that are on normal trees and moving towards pecan trees using Equation (18).
      Evaluate the fitness of each flying squirrel's location and reorder the squirrels.
    end
    Calculate the seasonal constant () using Equation (6).
    Update the minimum value of the seasonal constant () using Equation (7).
    if (the seasonal monitoring condition is satisfied)
      Randomly relocate flying squirrels on normal trees using Equation (8).
      Evaluate the fitness of each flying squirrel's location.
    end
    Reorder the squirrels. The best value is defined as the squirrel on the pecan tree, the next three best values are the squirrels on the oak trees, and the remaining values are the squirrels on the normal trees.
  end
  The location of the squirrel on the pecan tree is the final optimal solution.
End
3.5. Computational Complexity
The complexity of SSA is composed of population initialization, fitness evaluation, and the three location update strategies. Notably, the proposed FSSSA adds the inertia weight designed in the FIS, the improved adaptive parameters of the SCM, and the improved elite-individual location update strategy of the WAS. The coefficients involved are the population size N and the maximum number of iterations T. The complexity of population initialization is O(N), and the complexity of fitness evaluation is O(N·T). The complexity of the FIS strategy is O(T), the complexity of the SCM strategy is O(T), and the complexity of the WAS strategy together with the other two location update strategies is O(N·T). Therefore, the complexity of the FSSSA is O(N) + O(N·T) + O(T) + O(N·T) = O(N·T). The FSSSA thus maintains the same computational order as the classical SSA and other classical MAs, such as PSO and DE.
4. Experimental Studies on Function Optimization Problems
In this section, the effectiveness of FSSSA will be demonstrated clearly and intuitively through classical benchmark optimization problems. Firstly, the parameter tuning of FSSSA will be introduced, analyzing the impact of parameter selection on FSSSA. Secondly, a comparative analysis will be conducted between SSA and the proposed FSSSA on 24 benchmark functions. This analysis will include convergence curves, convergence accuracy, balance, and diversity, aiming to provide a comprehensive evaluation of the optimization capabilities of FSSSA. By assessing these aspects, the effectiveness of the improvement strategies can be validated. The Wilcoxon rank-sum test will be employed to evaluate the significance of the difference between SSA and FSSSA. Lastly, experimental comparisons will be conducted between FSSSA and other metaheuristic algorithms, including classic MAs and improved variants, to further evaluate the optimization performance and applicability of FSSSA.
4.1. Benchmark Functions and Parameter Setting
All experiments in this paper are carried out on an AMD Ryzen 5 5600H with Radeon Graphics (3.30 GHz), 16 GB RAM, and MATLAB R2020b. To reduce the randomness of the experiments, each experimental result is independently repeated 30 times and averaged.
4.1.1. Parameter Setting
The algorithms used in the experimental tests are the White Shark Optimizer (WSO) [
50], Runge Kutta Optimization (RUN) algorithm [
51], Weighted Mean of Vectors (INFO) algorithm [
52], PSO [
9], Group Teaching Optimization Algorithm with Information Sharing (ISGTOA) [
53], Seagull Optimization Algorithm (SOA) [
54], Grey Wolf optimizer (GWO) [
10] and Ensemble Sinusoidal Differential Covariance Matrix Adaptation with Euclidean Neighborhood (LSHADE-cnEpSin, one of the winners of CEC 2017 competition) [
55]. The parameters of the algorithms above are consistent with those in their original papers. Specific settings are shown in
Table 2.
4.1.2. Benchmark Functions
This paper conducts experiments on 24 classic benchmark test functions [
56,
57]. According to the character of functions, they can be described as unimodal, multimodal, separable, and non-separable functions. The function set in this paper is shown in
Table 3, including 11 unimodal and 13 multimodal functions, 12 separable and 12 non-separable functions, among which F12 and F17-F19 are four shifted functions. The shifted function refers to the operation of shifting the graph of a function in space, wherein the function’s image is horizontally or vertically shifted along the coordinate axes. In these functions, the best position is moved or rotated to other locations primarily to avoid situations where certain algorithms would copy one parameter to another parameter to generate neighboring solutions.
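As a hedged illustration of the shifting operation, assuming the common form f_shifted(x) = f(x − o) and using the sphere function as a stand-in (the actual shift vectors of F12 and F17–F19 are not given here):

```python
def sphere(x):
    """Classic sphere benchmark: minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def make_shifted(f, shift):
    """Sketch of a shifted benchmark: the optimum of f is moved to `shift`,
    i.e. f_shifted(x) = f(x - shift), so algorithms cannot exploit a
    symmetric optimum at the origin. The shift vector is illustrative."""
    def shifted(x):
        return f([xi - si for xi, si in zip(x, shift)])
    return shifted
```

For example, shifting the sphere by (1, −2) moves its minimizer from the origin to that point while leaving the landscape shape unchanged.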
The character of functions is detailed in the Character column of
Table 3. U, M, S, and N are used to indicate unimodal, multimodal, separable, and non-separable functions [
58].
It should be noted that as the dimension increases, the search space and the corresponding difficulty increase; high-dimensional problems are more challenging to solve than low-dimensional ones [
59]. Therefore, dimensions 30 and 50 are used for experimental tests in this paper.
4.2. FSSSA Parameter Tuning
The performance of the algorithm is influenced by various factors, including population size, number of iterations, and parameters. In the FSSSA, the scaling factor
plays a crucial role in balancing the exploration and exploitation phases [
28]. Therefore, selecting an appropriate
value is vital for achieving optimal performance with FSSSA. Based on relevant literature and previous experiments, a value
in the range of 16 to 37 can achieve the desired accuracy level without compromising algorithm stability. In order to provide a clearer understanding of the impact of
on FSSSA’s performance across benchmark functions, manual adjustments were made using functions F8 and F9 to observe experimental results more effectively.
Table 4 records the average and standard deviation of parameter settings as
over 30 runs with 20,000 function evaluations (FEs), as well as the convergence curve shown in
Figure 6 and the box plot illustrated in
Figure 7.
By examining
Figure 6 and
Table 4, it becomes evident that when
, FSSSA achieves the lowest average and standard deviation on F8 and F9. This indicates that with
, FSSSA exhibits the best optimization performance and the highest stability across the benchmark functions.
From the nature of the box plot [
60], it can be seen in F9 that the
has the smallest interquartile range in the box plot, while the
and the
have wide interquartile ranges. Consequently, the optimal solution is more stable when
. In F8, while
,
,
, and
have a similar quartile range of box plots, the median of
is smaller than the others. This proves that the optimal solution of
has the most advantages.
In this paper, the value of the scaling factor selected is 18, which improves the accuracy and stability of the FSSSA.
The specific settings for the other parameters of FSSSA have been detailed in
Table 2. Most of these parameters are fixed and have minimal impact on the algorithm’s performance across benchmark functions. Although this study did not employ a parameter tuning mechanism, such as CRS-Tuning [
61], F-Race [
62], or REVAC [
63], a systematic process of experimental adjustment combined with a deep understanding of algorithm characteristics, as well as drawing from previous literature on SSA research and empirical outcomes, led to the identification of a well-considered parameter configuration. This approach aimed to attain optimal algorithm performance for specific problems, ensuring the rigor and replicability of the experiments conducted in this study.
4.3. Compared with SSA
In this section, convergence accuracy under the same iteration, evaluation times under the same accuracy, balance and diversity analysis, and the nonparametric statistical tests experiments are used to verify the effectiveness of the FSSSA.
Table 5 shows the four variants of SSA formed by combining the strategies. FIS denotes the fuzzy inference system strategy, SCM denotes the sine cosine mutation strategy, and WAS denotes the wide-area search mechanism strategy. A 0 means that the strategy is not used, and 1 means that it is used.
4.3.1. Convergence Accuracy under Fixed Number of Iterations
This section evaluates the convergence accuracy of SSA and improved SSA under fixed 20,000 FEs on the benchmark function in
Table 3. The parameter settings of the algorithms are shown in
Table 2. The benchmark functions use 30 dimensions. The experimental results are the mean value and standard deviation over 30 independent runs. The ordinate is the base-10 logarithm of the fitness value, and the best results for each test function are shown in bold in the table.
Figure 8 shows the convergence curves of SSA, FSSA, SSSA, and FSSSA in partial functions.
Table 6 records the mean value, standard deviation, and optimal value of each algorithm run 30 times.
In
Table 6, the symbols “+”, “−”, and “=” indicate, respectively, that the average convergence accuracy of FSSSA over 30 runs is better than, worse than, or equal to that of the comparison algorithm. Average Value Rank (AVR) represents the average ranking of each algorithm over the 24 functions, and “Rank” indicates the final rank. The average ranking shows that the AVR of FSSSA is 1.041, the lowest among all algorithms, indicating that FSSSA has the best optimization performance. The “+/−/=” data indicate that FSSSA outperforms SSA on 23 benchmark functions, FSSA on 16, and SSSA on 10.
Combining
Figure 8 and
Table 6, except for F6, FSSA, SSSA, and FSSSA outperform SSA on the remaining 23 benchmark functions, indicating that the three variants of SSA proposed in this paper achieve significant improvements. SSSA demonstrates higher convergence accuracy than FSSA on 14 functions. Most of these are unimodal functions or separable multimodal functions, whose local optima are relatively numerous and close together; this places high demands on the exploitation ability of the algorithms. The WAS strategy can effectively improve the exploitation of the SSA, so it performs better on such functions. The convergence accuracy of FSSA on F8 and F9 is better than that of SSSA. The slope of the F8 function is small, and the local optima of F9 are far apart; these kinds of functions demand greater exploration ability. The FSSA, which contains the SCM and FIS strategies, outperforms the other algorithms on such functions, demonstrating that the SCM and FIS strategies can improve the exploration of the algorithm.
In addition, the SSSA can find the optimal value on most functions; therefore, the algorithm improved by the WAS strategy has better convergence accuracy. The standard deviation of the FSSA is small, so the algorithm improved by the FIS and SCM strategies has a more stable optimization effect. The FSSSA notably outperforms the other algorithms on the 24 functions. Therefore, the FSSSA, which combines the three strategies, can adapt to more types of functions, and its optimization performance is more stable.
4.3.2. The Number of Functional Evaluations with Fixed Target Accuracy
In order to intuitively demonstrate the effectiveness of the improvement strategy, this study tested the number of FEs required for SSA, FSSA, SSSA, and FSSSA on 24 benchmark functions under a fixed accuracy. The fixed accuracy was set to 1.00 × 10
−100, and the dimensions of the benchmark functions were set to 50. The experimental results were compared and analyzed based on the average, maximum, and minimum values obtained from 30 repetitions. To facilitate comparison and observation, the experimental data were rounded to integers. The stopping criteria for this experiment were set to reach either the maximum FEs or achieve the target accuracy. The maximum number of FEs was set to 20,000. The experimental data are presented in
Table 7.
As shown in
Table 7, SSA struggled to achieve the target accuracy within 20,000 evaluations. FSSA achieved the target accuracy on 8 benchmark functions, while SSSA and FSSSA achieved the target accuracy on 21 benchmark functions. Except for F13 and F23, FSSSA consistently achieved the target accuracy with the fewest FEs. Although FSSSA performed relatively weaker compared to SSSA on F13 and F23, overall, FSSSA demonstrated strong stability.
In conclusion, among the 24 benchmark functions under a fixed accuracy, FSSSA exhibited the best optimization performance. Thus, FSSSA effectively improved the performance of SSA.
4.3.3. Nonparametric Statistical Tests with SSA
In this section, non-parametric statistical tests are used to examine the performance differences among the four algorithms listed in
Table 5 [
64]. This study adopts the Wilcoxon signed-rank test at a 5% significance level for statistical analysis. The
p-values computed from the Wilcoxon signed-rank test are adjusted using the Bonferroni–Holm correction [
65]. The computed and adjusted
p-values are presented in
Table 8. In the Wilcoxon signed-rank test, if the corrected
p-value is less than 0.05, the improved algorithm shows a significant difference compared to SSA. If the corrected
p-value is greater than 0.05, the improved algorithm is not significantly different from SSA.
Based on the results in
Table 8, it can be observed that the proposed FSSA, SSSA, and FSSSA algorithms exhibit significant differences when compared to the original SSA algorithm.
4.3.4. Balance and Diversity Analysis
Striking a balance between exploration and exploitation is one of the key factors in designing new algorithms or enhancing existing ones. Exploration involves traversing the entire search space to discover promising regions, known as global search capability. The exploitation phase involves refining the search by utilizing the promising regions already discovered to find the optimal solution, known as local search capability. When an appropriate equilibrium is achieved between exploration and exploitation, algorithms tend to exhibit favorable convergence behavior [
66,
67].
In this study, the population diversity measurement method proposed by Hussain et al. [
68] was adopted to assess the algorithm’s balancing capability. This method assesses the algorithm’s balancing capability by calculating the average variation of distances within the population across different dimensions. If the average value decreases gradually during iterations, it is considered as the exploitation phase. Conversely, if the average value increases gradually, it is considered as the exploration phase. If the dimension diversity decreases while the average value remains unchanged, it indicates that the algorithm has converged.
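A minimal sketch of this measure is given below, assuming the common median-distance form of the diversity and the usual percentage definitions of the exploration and exploitation phases; the exact formulation in [68] may differ in detail, and the function names are illustrative:

```python
import numpy as np

def diversity(pop):
    """Population diversity in the style of Hussain et al.:
    Div_j = mean_i |median(x^j) - x_ij|, averaged over dimensions j."""
    med = np.median(pop, axis=0)                   # per-dimension median
    return np.mean(np.abs(pop - med), axis=0).mean()

def xpl_xpt(div_history):
    """Exploration / exploitation percentages from a diversity trace."""
    div = np.asarray(div_history, dtype=float)
    div_max = div.max()
    xpl = 100.0 * div / div_max                    # exploration %
    xpt = 100.0 * np.abs(div - div_max) / div_max  # exploitation %
    return xpl, xpt
```

Recording `diversity(pop)` at every iteration and plotting the two percentage curves yields balance diagrams of the kind shown in Figure 9.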
Despite its simplicity and intuitiveness, the diversity measurement method is limited to evaluating the entire population and cannot directly express the exploration or exploitation status of individual solutions within the population [
69]. In this study, this measurement method was employed to evaluate the balancing and diversity performance of FSSSA and SSA across 24 test functions, providing clear and substantial evidence in support of the effectiveness of FSSSA’s improvement strategy.
However, in practical problem-solving scenarios, different problems may require adjusting the trade-off between exploration and exploitation according to specific circumstances. Therefore, in practical applications, it may be necessary to employ more sophisticated methods and metrics to determine when to prioritize exploration or exploitation, in order to better optimize the algorithm’s performance and discover superior solutions [
69].
The balance and diversity of FSSSA and SSA are tested on 24 test functions, as shown in
Figure 9.
Figure 9a is the balance analysis diagram of FSSSA,
Figure 9b is the balance analysis diagram of SSA, and
Figure 9c is the diversity analysis diagram of FSSSA and SSA. In
Figure 9a,b, the
x-axis is the number of iterations, and the
y-axis is the percentage. There are two curves in
Figure 9a,b, the red curve is exploitation and the blue curve is exploration, representing the proportion of each stage at a given iteration. In
Figure 9c, the x-axis is the number of iterations, and the
y-axis is the population diversity. The red and blue curves represent the diversity of FSSSA and SSA, respectively.
It can be seen from the balance analysis diagrams of FSSSA and SSA that, except on F22 and F23, SSA's exploration proportion exceeds its exploitation proportion; its local search ability is therefore weak, resulting in low convergence accuracy. In the search process of FSSSA, by contrast, the exploitation stage dominates the exploration stage, which gives FSSSA excellent local search ability. Except on F23, the proportion of the exploration stage also increases steadily in the late iterations, preventing FSSSA from falling into local optima.
In addition, the diversity analysis diagrams show that the population diversity of SSA is higher, owing to its many random position-update stages and strong global search ability. Although the population diversity of FSSSA is lower, it trends upward in the late stage on most functions, which preserves the population diversity of FSSSA in the late-stage search.
The balance and diversity analysis of SSA and FSSSA shows that the proposed FSSSA can effectively balance the exploration and exploitation of the algorithm and performs well.
4.4. FSSSA with Advanced Metaheuristic Algorithms
To further verify the optimization performance of the FSSSA, eight metaheuristic algorithms are selected for experimental testing on the benchmark function in
Table 3. FSSSA is compared with the White Shark Optimizer, Runge Kutta Optimization algorithm (RUN), Weighted Mean of Vectors algorithm (INFO), Equilibrium Optimizer, Group Teaching Optimization Algorithm with Information Sharing (ISGTOA), Seagull Optimization Algorithm (SOA), Grey Wolf Optimizer (GWO), and Ensemble Sinusoidal Differential Covariance Matrix Adaptation with Euclidean Neighborhood (LSHADE-cnEpSin, one of the winners of the CEC 2017 competition). The population size of all algorithms is set to 50, except for LSHADE-cnEpSin, whose population size changes during the update process. The number of FEs is 20,000. The other parameters are set as shown in
Table 2, and the dimension of the benchmark function is 50.
4.4.1. Comparative Analysis of Convergence Accuracy
The experimental data are shown in
Table 9, and the comparison of convergence curves of 50 dimensions is shown in
Figure 10.
From the convergence curve in
Figure 10, it can be observed that FSSSA outperforms the other algorithms on all functions except F9. On F9, although FSSSA's optimization performance is weaker than that of RUN and ISGTOA, FSSSA still converges faster than the other eight algorithms across the 24 functions.
In
Table 9, the symbols “+”, “−”, and “=”, respectively, indicate that the average convergence accuracy of FSSSA over 30 runs (mean) is better than, worse than, or equal to that of the comparison algorithm. AVR represents the average ranking value of each algorithm across the 24 functions, and “Rank” indicates the final rank. Considering the data comparison in
Table 9, it can be deduced that FSSSA is capable of finding the theoretical optimal values in 18 benchmark functions. Apart from F9, FSSSA’s convergence accuracy is higher than other algorithms in 23 benchmark functions. Regarding F9, RUN achieves the highest convergence accuracy, while FSSSA exhibits the fastest convergence speed.
The data in the “+/−/=” row indicate that the average convergence accuracy of FSSSA over 30 runs (mean) outperforms SOA and LSHADE-cnEpSin on 23 benchmark functions, RUN and INFO on 14, PSO on 24, ISGTOA on 21, and GWO on 22. FSSSA's AVR is 1.083, ranking first among all algorithms.
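The AVR and "Rank" columns follow the usual per-function ranking scheme. The sketch below computes them with SciPy's `rankdata` on a synthetic mean-accuracy table (the real input would be the Table 9 means):

```python
import numpy as np
from scipy.stats import rankdata

# Hypothetical mean-accuracy table: rows = 24 functions, cols = 9 algorithms
rng = np.random.default_rng(1)
means = rng.random((24, 9))

# Rank algorithms per function (1 = best, i.e. smallest mean error);
# ties receive the average rank, as in the Friedman procedure.
ranks = np.apply_along_axis(rankdata, 1, means)
avr = ranks.mean(axis=0)          # AVR: average rank per algorithm
final_rank = rankdata(avr)        # "Rank" column: ordering of the AVR values
```

The algorithm with the smallest AVR receives final rank 1, matching FSSSA's AVR of 1.083 and first place in Table 9.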
In conclusion, the FSSSA notably outperforms the other eight metaheuristic algorithms.
4.4.2. Nonparametric Statistical Tests with Other Algorithms
In order to verify the effectiveness of the experiment in the previous section, the nonparametric Wilcoxon signed-rank test and the Friedman test [
64] were used to compare FSSSA with the eight algorithms. In the Wilcoxon signed-rank test, the
p-values are adjusted using the Bonferroni–Holm correction [
65]. The computed and corrected
p-values are shown in
Table 10. If the corrected
p-value is less than 0.05, FSSSA is significantly different from the corresponding algorithm. If the corrected
p-value is greater than 0.05, FSSSA is not significantly different from that algorithm.
From
Table 10, it is evident that FSSSA exhibits significant differences when compared with the six metaheuristic algorithms. In comparison with the RUN and INFO algorithms, the corrected
p-values are greater than 0.05. Although not statistically significant, FSSSA’s optimization performance is superior to both the RUN and INFO algorithms in 14 functions.
The Friedman test allows for multiple comparisons among several algorithms by calculating ranks based on observed results. The experimental results are shown in
Table 11. In the Friedman test, to assess the significance of the differences between FSSSA and the other algorithms, we calculate the Critical Difference (CD) using the Bonferroni–Dunn test [
70]. The Bonferroni–Dunn test is well suited to comparing one particular algorithm with the remaining k − 1 algorithms. If the average Rank Difference (RD) between FSSSA and another algorithm is greater than the CD, FSSSA statistically outperforms that algorithm; if it is less than the CD, there is no statistically significant difference between them.
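Under the Demšar-style formulation of this post-hoc test, the CD follows from a Bonferroni-adjusted normal quantile; the sketch below assumes that formulation (q_α taken as the two-sided z value at level α/(k − 1)), which may differ in detail from the exact calculation used in this paper:

```python
import math
from scipy.stats import norm

def critical_difference(k, n, alpha=0.05):
    """Bonferroni–Dunn critical difference for comparing one control
    algorithm against the remaining k - 1 algorithms over n problems."""
    q = norm.ppf(1.0 - alpha / (2.0 * (k - 1)))   # Bonferroni-adjusted z
    return q * math.sqrt(k * (k + 1) / (6.0 * n))

# Setting in this section: k = 9 algorithms, n = 24 benchmark functions
cd = critical_difference(9, 24)
```

Any algorithm whose average Friedman rank exceeds FSSSA's by more than `cd` is statistically outperformed by FSSSA.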
From
Table 11, it is evident that FSSSA shows significant differences compared to six of the eight algorithms, with no significant difference relative to RUN and INFO. However, FSSSA ranks first in average rank. Considering the average ranks and rank values of each algorithm, the overall analysis indicates that FSSSA's optimization capability surpasses that of the other eight algorithms.
It can be observed that in the comparative experiments with SSA, FSSSA demonstrates excellent performance in terms of convergence speed, convergence accuracy, balance, diversity, and non-parametric statistical tests, thereby confirming the effectiveness of the improvement strategies. In the comparative experiments with eight other metaheuristic algorithms, FSSSA achieves the first rank in terms of convergence speed, convergence accuracy, and non-parametric statistical tests, thus demonstrating the outstanding optimization performance of FSSSA.
5. Application to Engineering Optimization Problems
Engineering problems are common challenges and demands in practical applications, often characterized by diversity and complexity, spanning across various fields and contexts. Selecting engineering problems as test cases allows for a better assessment of the algorithm’s practicality and adaptability, providing valuable solutions for real-world applications. At the same time, engineering problems are also widely used to verify the performance of optimization algorithms [
12,
28]. Therefore, this study chose four engineering design problems [
71], namely Speed Reducer (SR), Cantilever Beam (CB), Optimal Design of the I-shaped Beam (ODIB), and Piston Lever (PL). These problems originate from different application domains, and each problem has distinct design requirements and optimization objectives. Specific optimization problems are covered in each section.
The FSSSA algorithm was compared with the classic SSA and Biogeography-based Optimization (BBO) [
72]. To ensure the fairness of the experiment, the population size was set to 50, the number of FEs to 20,000, and the other parameters were kept consistent with the original papers. Each algorithm was run independently 30 times, and the averaged results were recorded to reduce the randomness of the experiment.
5.1. Speed Reducer (SR)
The SR is an essential part of the gearbox in the mechanical system, as shown in
Figure 11 [
71], and is widely used [
73]. This optimization problem has 11 constraints and 7 variables. The mathematical expression to describe this problem is shown in Equation (19).
Variable range:
where x1 (b in
Figure 11) represents the face width, x2 (m in
Figure 11) the module of the teeth, x3 (z in
Figure 11) the number of teeth in the pinion, x4 (l1 in
Figure 11) the length of the first shaft between bearings, x5 (l2 in
Figure 11) the length of the second shaft between bearings, x6 (d1 in
Figure 11) the diameter of the first shaft, and x7 (d2 in
Figure 11) the diameter of the second shaft.
The experimental results are shown in
Table 12, and the convergence curve is shown in Figure 15a. The results show that the FSSSA algorithm is superior to other algorithms in terms of final accuracy.
5.2. Cantilever Beam (CB)
The CB is a weight optimization problem of a cantilever beam with a square cross-section, which is generally an example of structural engineering design. The structure diagram is shown in
Figure 12 [
71], and the mathematical expression to describe this kind of problem is shown in Equation (20).
Variable range:
where x1 to x5, the decision variables, respectively represent the width (or height) of the five hollow square blocks of constant thickness, and the thickness t remains unchanged.
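Since Equation (20) is not reproduced here, the sketch below uses the cantilever-beam formulation commonly stated in the metaheuristics literature (weight 0.0624·Σxᵢ with a single nonlinear constraint) together with a simple static penalty; the coefficients are assumptions drawn from that standard statement, not taken from this paper:

```python
import numpy as np

def cantilever_beam(x):
    """Cantilever-beam weight objective with a static penalty.

    x holds the five section widths; the constraint g(x) <= 0 is the
    standard deflection limit from the widely cited formulation.
    """
    x = np.asarray(x, dtype=float)
    weight = 0.0624 * x.sum()
    g = 61/x[0]**3 + 27/x[1]**3 + 19/x[2]**3 + 7/x[3]**3 + 1/x[4]**3 - 1
    penalty = 1e6 * max(0.0, g) ** 2      # penalize constraint violation
    return weight + penalty
```

Feasible designs return the plain weight; infeasible ones are pushed away from the optimum by the penalty term, so any of the compared algorithms can minimize this function directly.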
The experimental results are shown in
Table 13, and the convergence curve is shown in Figure 15b. The results show that the FSSSA algorithm is superior to other algorithms in terms of final accuracy.
5.3. Optimal Design of the I-Shaped Beam
The ODIB is a vertical disturbance optimization problem of the I-beam. The primary purpose is to minimize the vertical deflection of the I-beam under the constraints of cross-sectional area and stress under preset loads. The structure diagram is shown in
Figure 13 [
71], and the mathematical expression to describe this problem is shown in Equation (21).
Variable range:
where x1 represents the flange width (b in
Figure 13), x2 the section height (h in
Figure 13), x3 the web thickness (tw in
Figure 13), and x4 the flange thickness (tf in
Figure 13); f is the vertical deflection of the I-beam, and L and E are the beam length and elastic modulus, which are 5200 and 523.104, respectively.
The experimental results are shown in
Table 14, and the convergence curve is shown in Figure 15c. The results show that the FSSSA algorithm outperforms other algorithms regarding final accuracy.
5.4. Piston Lever (PL)
The PL is an optimization problem for positioning several piston components, with the primary goal of minimizing the oil volume required when lifting the piston lever from 0° to 45°. The structure diagram is shown in
Figure 14 [
71], and the mathematical expression to describe this kind of problem is shown in Equation (22).
Variable range:
where x1, x2, x3, and x4, respectively, represent the positions of the four optimized piston components, corresponding to H, B, D, and X, respectively, in
Figure 14.
The experimental results are shown in
Table 15, and the convergence curve is shown in
Figure 15d. The results show that the FSSSA algorithm is superior to other algorithms in terms of final accuracy.
In summary, FSSSA performs well in the optimization of four engineering problems, except for some stability issues observed in the CB problem. When compared to SSA and BBO, FSSSA has shown the ability to find superior design solutions, significantly improving system performance and efficiency while satisfying the given constraints.
Overall, the performance advantages of FSSSA make it a reliable choice for solving engineering optimization problems. Its overall effectiveness and potential for applications make FSSSA a valuable tool for seeking better design solutions.
6. Conclusions
This paper proposes an improved squirrel search algorithm (FSSSA) to address the slow convergence speed and unbalanced search stages of SSA. A fuzzy inference system, sine cosine mutation, and a wide-area search mechanism are used to enhance the performance of SSA. The effectiveness of FSSSA is demonstrated by comparing it with SSA in four evaluation-index experiments on 24 benchmark functions. In the convergence accuracy and evaluation-count tests, FSSSA performs excellently in terms of convergence speed. According to the balance and diversity analysis, FSSSA better balances exploration and exploitation and improves convergence accuracy. The convergence accuracy results and non-parametric statistical experiments show that FSSSA maintains the top rank among the compared algorithms. In addition, FSSSA is applied to four kinds of engineering problems, and the experimental results show that FSSSA is more competitive in dealing with real, complex problems.
This study will be helpful for further research on SSA. To meet the requirements of complex problems, future work can simplify the steps of FSSSA and reduce the running time and computational complexity of the algorithm. In addition, FSSSA can be extended to multi-objective optimization and feature selection problems.