Article

A Modified Sand Cat Swarm Optimization Algorithm Based on Multi-Strategy Fusion and Its Application in Engineering Problems

1 College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832000, China
2 Engineering Research Center for Production Mechanization of Oasis Characteristic Cash Crop, Ministry of Education, Shihezi 832000, China
3 Key Laboratory of Northwest Agricultural Equipment, Ministry of Agriculture and Rural Affairs, Shihezi 832000, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Mathematics 2024, 12(14), 2153; https://doi.org/10.3390/math12142153
Submission received: 10 June 2024 / Revised: 5 July 2024 / Accepted: 5 July 2024 / Published: 9 July 2024

Abstract

Addressing the issues of the sand cat swarm optimization algorithm (SCSO), such as its weak global search ability and tendency to fall into local optima, this paper proposes an improved strategy called the multi-strategy integrated sand cat swarm optimization algorithm (MSCSO). The MSCSO algorithm improves upon the SCSO in several ways. Firstly, it employs the good point set strategy instead of a random strategy for population initialization, effectively enhancing the uniformity and diversity of the population distribution. Secondly, a nonlinear adjustment strategy is introduced to dynamically adjust the search range of the sand cats during the exploration and exploitation phases, significantly increasing the likelihood of finding more high-quality solutions. Lastly, the algorithm integrates the early warning mechanism of the sparrow search algorithm, enabling the sand cats to escape from their original positions and rapidly move towards the optimal solution, thus avoiding local optima. Using 29 benchmark functions of 30, 50, and 100 dimensions from CEC 2017 as experimental subjects, this paper further evaluates the MSCSO algorithm through Wilcoxon rank-sum tests and Friedman’s test, verifying its solid global search ability and convergence performance. In practical engineering problems such as reducer and welded beam design, MSCSO also demonstrates superior performance compared to five other intelligent algorithms, showing a remarkable ability to approach the optimal solutions for these engineering problems.

1. Introduction

Optimization problems refer to finding the optimal solution from all possible scenarios under certain conditions, aiming to maximize or minimize performance indicators. These problems are primarily applied in practical engineering applications [1], image processing [2], path planning [3], and other fields.
Engineering optimization problems aim to solve real-life problems with constraints so that the parameters of engineering systems reach their optimal states. As the engineering field has developed, numerous nonlinear and high-dimensional problems have emerged, making optimization problems increasingly complex and correspondingly more computationally expensive. Traditional optimization algorithms were widely developed and applied to various optimization problems in the early stages. However, they have certain limitations regarding problem form and scale: they face high computational costs and slow convergence on complex or large-scale problems, making it difficult to find the optimal solution within a reasonable time frame.
Metaheuristic algorithms have demonstrated remarkable robustness in addressing the shortcomings of traditional optimization algorithms when dealing with complex problems. These complex issues encompass multi-objective, single-objective, discrete, combinatorial, high-dimensional, or dynamic optimization, among others [4]. Metaheuristic algorithms are general-purpose optimization algorithms characterized by their global perspective when searching the solution space [5]. They are primarily divided into four categories: evolutionary-based algorithms, swarm-based algorithms, algorithms inspired by physics, mathematics, and chemistry, and algorithms based on human and social behaviors [6]. This article primarily focuses on swarm-based algorithms, i.e., swarm intelligence optimization algorithms, which mimic the behavior of natural swarms to achieve information exchange and cooperation, ultimately identifying the optimal solution to optimization problems. From early algorithms such as particle swarm optimization (PSO) [7], ant colony optimization (ACO) [8], and genetic algorithm (GA) [9] to later developments including the sparrow search algorithm (SSA) [10], dung beetle optimizer (DBO) [11], pelican optimization algorithm (POA) [12], Harris hawks optimization (HHO) [13], and subtraction-average-based optimization (SABO) [14], swarm intelligence optimization algorithms have gradually evolved into an optimization method with few design parameters and simple operation.
The sand cat swarm optimization (SCSO) algorithm is a novel swarm intelligence optimization algorithm, proposed by Seyyedabbasi in 2022 [15], inspired by the behavior of sand cats in nature. The SCSO algorithm is divided into an exploration phase and an exploitation phase, with the balance between the two phases adjusted by the parameter R. Due to its robust and powerful optimization capabilities, SCSO has been widely applied in various fields. For instance, it has been used to predict the scale of short-term rock burst damage [16] and design feedback controllers for nonlinear systems [17]. Nevertheless, the SCSO algorithm also has some drawbacks, such as slow convergence speed, poor global search ability during the exploration phase, and a tendency to fall into local optima.
Various experts and scholars have proposed improvements to address these shortcomings of the SCSO algorithm. Ref. [18] combined dynamic pinhole imaging and the golden sine algorithm to enhance the optimization performance of the sand cat population, improving global search ability through dynamic pinhole imaging and enhancing exploitation ability and convergence through the golden sine algorithm. Ref. [19] utilized a target encoding mechanism, an elitism strategy, and an adaptive T-distribution to solve dynamic path-planning problems, improving path-planning efficiency and introducing the ability to escape local optima. Ref. [20] combined SCSO with reinforcement learning techniques to improve the algorithm’s ability to find optimal solutions in a given search space. Ref. [21] incorporated chaotic mapping to enhance global search ability and to reduce issues such as slow transitions between phases and premature or delayed convergence. Ref. [22] defined a new coefficient affecting the exploration and exploitation phases and introduced a mathematical model to address the issues of slow convergence and late exploitation in SCSO. Although the above work has significantly improved the SCSO algorithm’s performance, some areas still need further improvement, as shown in Table 1.
In summary, although the literature has made specific improvements to the SCSO algorithm’s global search ability, convergence performance, and ability to escape local optima, according to the no free lunch theorem [23], existing optimization strategies remain insufficient for the ever-changing and increasingly complex real-world problems. Therefore, it is necessary to continue improving the SCSO algorithm.
To further optimize the SCSO algorithm’s performance and expand its applications in engineering problems, this paper proposes a multi-strategy fusion sand cat swarm optimization (MSCSO) algorithm.
  • During the initial population phase, the good point set strategy is adopted to ensure the population is evenly distributed within the search space, enhancing population diversity.
  • A nonlinear adjustment strategy is introduced in the exploration phase to improve the sand cat’s sensitivity range, dynamically adjusting the balance conversion parameter between the exploration and exploitation stages. This expands the search range of the sand cats, enhances the algorithm’s global search ability, and reduces the risk of falling into local optima.
  • In the exploitation phase, the sparrow search algorithm’s warning mechanism is fused to update the sand cats’ positions, enabling them to respond quickly when approaching optimality, avoid falling into local optima, and accelerate the algorithm’s convergence speed.

2. SCSO: Sand Cat Swarm Optimization Algorithm

2.1. Population Initialization

Sand cats are randomly generated in the search area, with the position of the individual as indicated in Equation (1).
$x = lb + a \times (ub - lb)$ (1)
where $ub$ and $lb$ represent the upper and lower limits of the search area, $a$ is a random number between 0 and 1, and $d$ is the dimensionality of the problem. The upper and lower bounds of the search area are as shown in Equation (2).
$ub = [ub_1, ub_2, \ldots, ub_d], \quad lb = [lb_1, lb_2, \ldots, lb_d]$ (2)
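As a concrete illustration, the following is a minimal Python sketch of this random initialization (the function name is ours, not from the paper):

```python
import numpy as np

def init_population_random(n, d, lb, ub):
    """Random initialization per Equation (1): x = lb + a * (ub - lb).

    lb and ub are length-d arrays of lower/upper bounds (Equation (2));
    a is drawn uniformly from (0, 1), independently per coordinate.
    """
    a = np.random.rand(n, d)
    return lb + a * (ub - lb)
```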

2.2. Search for Prey (Exploration Phase)

In the exploration stage, the sand cat’s search and attack behaviors are balanced dynamically by the value of R, as shown in Equation (4). When |R| > 1, the sand cat performs search behavior: it relies on its low-frequency hearing range to locate prey, effectively exploring potential best prey locations through the position update of Equation (6).
$r_g = S_M - S_M \times \dfrac{t}{T_{max}}$ (3)
$R = 2 \times r_g \times rand(0,1) - r_g$ (4)
$r = r_g \times rand(0,1)$ (5)
where the value of $r_g$ drops linearly from 2 to 0, $S_M$ is the auditory characteristic with a value of 2, $t$ is the current iteration number, $T_{max}$ is the maximum number of iterations, $rand(0,1)$ is a random number between 0 and 1, and $r$ is the sensitivity range [15].
$x = r \times (x_{bc} - rand(0,1) \times x_c)$ (6)
where $x_{bc}$ represents the current best position and $x_c$ the current position.

2.3. Attacking Prey (Exploitation Phase)

When |R| ≤ 1, the sand cat starts to attack, gradually narrowing the search area to improve search accuracy as it approaches the prey. At this stage, the sand cat selects a random angle θ between 0° and 360° in the search area and uses $x_{bc}$ and $x_c$ to generate a random position, making the sand cat’s attack behavior more flexible and changeable and increasing the probability of finding the global optimal solution, as Equation (8) indicates.
$x_{rnd} = rand(0,1) \times (x_{bc} - x_c)$ (7)
$x = x_b - r \times x_{rnd} \times \cos(\theta)$ (8)
where $x_b$ is the optimal position and $x_{rnd}$ is the random position.
The pseudocode of SCSO is presented in Algorithm 1.
Algorithm 1 SCSO
1: Initialize the population
2: Calculate the fitness function based on the objective function
3: Use Equations (3)–(5) to initialize the parameters r g , R, and r
4: while (t < T) do
5:  for Every individual do
6:   Take a random angle θ between 0 and 360°
7:     if |R| ≤ 1
8:    Use Equation (8) to update the position
9:    else
10:     Use Equation (6) to update the location
11:     end
12: end
13: end
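To make Algorithm 1 concrete, the following is a compact Python sketch of the SCSO loop under our reading of Equations (3)–(8); it is an illustrative reconstruction, not the authors’ reference implementation:

```python
import numpy as np

def scso(obj, n, d, lb, ub, t_max, sm=2.0):
    """Minimal SCSO sketch following Algorithm 1 and Equations (3)-(8)."""
    x = lb + np.random.rand(n, d) * (ub - lb)                # Equation (1)
    fit = np.apply_along_axis(obj, 1, x)
    best = x[fit.argmin()].copy()                            # best position x_bc
    best_fit = fit.min()

    for t in range(t_max):
        rg = sm - sm * t / t_max                             # Equation (3)
        for i in range(n):
            R = 2 * rg * np.random.rand() - rg               # Equation (4)
            r = rg * np.random.rand()                        # Equation (5)
            theta = np.random.uniform(0, 2 * np.pi)          # random angle
            if abs(R) <= 1:                                  # attack (exploitation)
                x_rnd = np.random.rand(d) * (best - x[i])            # Equation (7)
                x[i] = best - r * x_rnd * np.cos(theta)              # Equation (8)
            else:                                            # search (exploration)
                x[i] = r * (best - np.random.rand(d) * x[i])         # Equation (6)
            x[i] = np.clip(x[i], lb, ub)
            fit[i] = obj(x[i])
            if fit[i] < best_fit:
                best, best_fit = x[i].copy(), fit[i]
    return best, best_fit
```

For instance, `scso(lambda v: float(np.sum(v**2)), 30, 10, -100*np.ones(10), 100*np.ones(10), 500)` minimizes the sphere function with the population size and iteration budget used in Section 4.1.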

3. MSCSO: Multi-Strategy Fusion Sand Cat Swarm Optimization Algorithm

3.1. The Good Point Set Strategy

Based on Equation (1), the SCSO algorithm initializes the population with a random strategy, which scatters the population randomly within the search space; this can lead to an uneven distribution and an increased likelihood of falling into local optima. Therefore, this paper introduces the good point set strategy, proposed by Hua Luogeng in 1978, which exploits the distribution patterns of point sets to improve search and solution efficiency [24]. Ref. [25] employed the good point set strategy to distribute the population uniformly within the search space, effectively enhancing population diversity and reducing the probability of falling into local optima. Given these advantages, this paper adopts the good point set strategy for population initialization. The steps are as follows:
Step 1: For the z-dimensional Euclidean space, take the smallest prime number p satisfying Equation (9).
$z \le \dfrac{p - 3}{2}$ (9)
Step 2: Generate a good point within the z-dimensional space, as Equation (10) depicts.
$r_w = 2 \times \cos\left(\dfrac{2 \times w \times \pi}{p}\right), \quad 1 \le w \le z$ (10)
Step 3: Let Gz be the unit cube in the z-dimensional Euclidean space, generating a good point set with a sample size of n, as Equation (11) depicts.
$F_n = \left\{\left(\{r_1^{(n)} \times k\}, \{r_2^{(n)} \times k\}, \ldots, \{r_z^{(n)} \times k\}\right), \ 1 \le k \le n\right\}$ (11)
where $\{\cdot\}$ denotes the fractional part.
Step 4: Replace Equation (1) with Equation (12) for population initialization.
$x = lb + F_n \times (ub - lb)$ (12)
In a two-dimensional plane, 300 points were generated using the good point set strategy and the random strategy, as shown in Figure 1 and Figure 2. From the distribution of these 300 points, it can be observed that the good point set strategy yields a more uniform coverage of the entire search space, enhancing the diversity of the SCSO algorithm. This validates the effectiveness of the good point set strategy for initializing the SCSO population.
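A minimal Python sketch of this initialization (Equations (9)–(12)); the helper names are ours, and the fractional-part construction follows the standard good point set definition:

```python
import numpy as np

def _smallest_prime_at_least(m):
    """Smallest prime p >= m (trial division; fine for small m)."""
    p = max(2, m)
    while any(p % q == 0 for q in range(2, int(p**0.5) + 1)):
        p += 1
    return p

def init_population_good_points(n, z, lb, ub):
    """Good point set initialization per Equations (9)-(12).

    z is the problem dimension; p is the smallest prime with (p-3)/2 >= z.
    """
    p = _smallest_prime_at_least(2 * z + 3)               # Equation (9)
    w = np.arange(1, z + 1)
    r = 2 * np.cos(2 * np.pi * w / p)                     # Equation (10)
    k = np.arange(1, n + 1).reshape(-1, 1)
    Fn = np.mod(r * k, 1.0)                               # Equation (11), fractional parts
    return lb + Fn * (ub - lb)                            # Equation (12)
```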

3.2. Nonlinear Adjustment Strategy

The SCSO algorithm prioritizes breadth during the exploration phase and emphasizes depth and speed during the exploitation phase. As indicated in Equation (3), the sand cat’s sensitivity parameter $r_g$ decreases linearly from 2 to 0. This linear decrease does not fully exploit the global search capability of Equation (3), so the algorithm tends to get trapped in local optima. Consequently, a nonlinear adjustment strategy is adopted to improve Equation (3). Studies in [26,27] have validated that moving from a linear to a nonlinear adjustment strategy enables an algorithm to maintain both global search capability and local exploitation ability throughout the entire iteration process, thus fulfilling the algorithm’s requirements across iterations. In this paper, an improved nonlinear formula replaces Equation (3) to modify the auditory range of the sand cat. The steps are outlined below, and the modified formula is presented as Equation (14).
Step 1: Design parameter f, as shown in Equation (13)
$f = \sin\left(\dfrac{\pi}{4} \times \dfrac{t}{T_{max}}\right)$ (13)
Step 2: Introduce f into the original Equation (3); the improved Equation is shown as Equation (14).
$r_g = S_M - S_M \times \dfrac{t}{T_{max}} \times f$ (14)
where $S_M$ is the auditory characteristic with a value of 2, $t$ is the current iteration number, and $T_{max}$ is the maximum number of iterations.
As shown in Figure 3 and Figure 4, a comparative analysis was conducted on the search ranges produced by the linear and nonlinear adjustment strategies. The red line represents the process where $r_g$ changes linearly from 2 to 0, while the black line depicts the improved range. From the figures, it is evident that the exploitation phase of SCSO overly emphasizes local attacks and neglects global search. The improved formula enables extensive exploration during the exploration phase and supports both local attacks and global exploration during the exploitation phase. It can be concluded that Equation (14) significantly enhances the overall performance of the algorithm and reduces the risk of falling into local optima, verifying the global search capability of the nonlinear adjustment strategy during the exploitation phase.
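The two schedules are easy to compare numerically; the following sketch (function names ours) evaluates Equations (3), (13), and (14) side by side:

```python
import numpy as np

def rg_linear(t, t_max, sm=2.0):
    return sm - sm * t / t_max                    # Equation (3)

def rg_nonlinear(t, t_max, sm=2.0):
    f = np.sin(np.pi / 4 * t / t_max)             # Equation (13)
    return sm - sm * (t / t_max) * f              # Equation (14)

# The nonlinear schedule decays more slowly, keeping a wider sensitivity
# range (and hence some global search) alive into the exploitation phase.
t = np.linspace(0, 500, 6)
print(rg_linear(t, 500))      # 2.0 -> 0.0 linearly
print(rg_nonlinear(t, 500))   # stays above ~0.58 even at t = t_max
```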

3.3. Integrate the Warning Mechanism of the Sparrow Algorithm

In the later iterations of the SCSO algorithm, the convergence speed may be relatively slow when dealing with some complex issues, requiring a longer computational time to find the optimal solution. The warning mechanism in the SSA can rapidly guide sparrows toward the optimal position based on their current locations [10], thereby enhancing the algorithm’s convergence during the iteration process and reducing computational time. Therefore, inspired by the sparrow search algorithm, this paper introduces the warning mechanism of the sparrow algorithm into the SCSO algorithm. The detailed position update Equation is shown in Equation (15).
$$x_b^{t+1} = \begin{cases} x_b^t + \beta \times \left|x_c^t - x_b^t\right|, & \text{if } f_i > f_g \\ x_c^t + K \times \dfrac{\left|x_c^t - x_{worst}^t\right|}{f_i - f_w + \varepsilon}, & \text{if } f_i \le f_g \end{cases} \quad (15)$$
where $x_b$ is the current global optimal position, $x_{worst}^t$ is the worst position at the $t$th iteration, $\beta$ is the step-size control parameter, $K \in [-1, 1]$ is a random number, $f_i$ is the current fitness value, $f_g$ is the current global optimal fitness value, $f_w$ is the minimum fitness value of the current population, and $\varepsilon$ is a small constant [10]. The steps for updating the position are as follows.
Step 1: In the sand cat population, 30% of the individuals are randomly selected to serve as warning sand cats.
Step 2: When $f_i > f_g$, the sand cat is safe around $x_b$ and can move randomly within this range to avoid falling into a local optimum; its movement is given by $x_b^t + \beta \times |x_c^t - x_b^t|$.
Step 3: When $f_i \le f_g$, the sand cat realizes it is in an unfavorable position and needs to move closer to other sand cats to find a better position; it updates its position according to $x_c^t + K \times \left(|x_c^t - x_{worst}^t| / (f_i - f_w + \varepsilon)\right)$. The sand cats’ position updates are illustrated in Figure 5.
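A hedged sketch of the warning update of Equation (15); defaulting β to a standard-normal draw follows the usual SSA convention and is our assumption:

```python
import numpy as np

def warning_update(x_c, x_b, x_worst, f_i, f_g, f_w, beta=None, eps=1e-8):
    """Sparrow-style warning update per Equation (15) (illustrative sketch).

    x_c, x_b, x_worst are position vectors; f_i, f_g, f_w are the fitness
    values defined under Equation (15).
    """
    if beta is None:
        beta = np.random.randn()                 # step-size control parameter (assumed N(0,1))
    if f_i > f_g:                                # safe near the best: roam around x_b
        return x_b + beta * np.abs(x_c - x_b)
    K = np.random.uniform(-1, 1)                 # unfavorable position: move toward others
    return x_c + K * (np.abs(x_c - x_worst) / (f_i - f_w + eps))
```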

3.4. MSCSO Algorithm Flowchart and Pseudocode

Figure 6 shows the flow chart of the MSCSO algorithm after SCSO is integrated with the good point set strategy, the nonlinear adjustment strategy, and the early warning mechanism of the sparrow search algorithm. The MSCSO pseudocode is presented in Algorithm 2.
Algorithm 2 MSCSO
Parameters: the current fitness value $f_i$ and the current global optimal fitness value $f_g$
1: Initialize the population using Equation (12)
2: Calculate the location and fitness values for each individual
3: Use Equations (13) and (14) to calculate $r_g$, and Equations (4) and (5) to initialize the parameters R and r
4: while (t < T) do
5: for Every individual do
6:   Take a random angle θ between 0 and 360°
7:    if |R| ≤ 1
8:    Use Equation (8) to update the position and fitness values of the sand cat
9:    else
10:      Use Equation (6) to update the location and fitness values of the sand cat
11:  Update the current best fitness value and position
12:    end
13:  Randomly select 30% of the population size
14:  for i = 1 to number of warning sand cats do
15:    if  f i  >  f g
16:    Use $x_b^t + \beta \times |x_c^t - x_b^t|$ to update the position
17:    else
18:    Use $x_c^t + K \times \left(|x_c^t - x_{worst}^t| / (f_i - f_w + \varepsilon)\right)$ to update the position
19:    end
20:  end
21:  Compare the position updated by the warning mechanism with the position from line 11, and accept the update only if the new position is superior to the original one
22:  end for
23:  t = t + 1
24:  end while
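Putting the three strategies together, one MSCSO generation could be sketched as follows, reusing the rg_nonlinear and warning_update helpers sketched above; the 30% warner fraction follows Step 1 of Section 3.3 and the greedy acceptance mirrors line 21 of Algorithm 2:

```python
import numpy as np

def mscso_iteration(x, fit, obj, best, best_fit, t, t_max, lb, ub, sm=2.0):
    """One MSCSO generation (illustrative sketch, not the reference code)."""
    n, d = x.shape
    rg = rg_nonlinear(t, t_max, sm)                      # Equation (14)
    for i in range(n):
        R = 2 * rg * np.random.rand() - rg               # Equation (4)
        r = rg * np.random.rand()                        # Equation (5)
        theta = np.random.uniform(0, 2 * np.pi)
        if abs(R) <= 1:                                  # Equations (7) and (8)
            x_rnd = np.random.rand(d) * (best - x[i])
            x[i] = best - r * x_rnd * np.cos(theta)
        else:                                            # Equation (6)
            x[i] = r * (best - np.random.rand(d) * x[i])
        x[i] = np.clip(x[i], lb, ub)
        fit[i] = obj(x[i])
    # Warning mechanism: 30% of the sand cats re-check their positions
    worst = x[fit.argmax()]
    f_g, f_w = min(best_fit, fit.min()), fit.min()
    for i in np.random.choice(n, size=int(0.3 * n), replace=False):
        cand = np.clip(warning_update(x[i], best, worst, fit[i], f_g, f_w), lb, ub)
        if obj(cand) < fit[i]:                           # greedy acceptance (line 21)
            x[i], fit[i] = cand, obj(cand)
    if fit.min() < best_fit:
        best, best_fit = x[fit.argmin()].copy(), fit.min()
    return x, fit, best, best_fit
```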

4. Experimental Results and Discussion

4.1. Experimental Design

In this experiment, the population size is set to 30 and the maximum number of iterations to 500. Each test function was run independently 30 times to obtain more reliable experimental results. The evaluation criteria are the mean value and standard deviation of the results: the mean value represents the convergence accuracy of the algorithm, where a smaller value indicates higher accuracy, while the standard deviation reflects the algorithm’s stability, with a smaller standard deviation indicating better stability [28].
The CEC 2017 benchmark was adopted to evaluate the optimization algorithms’ global search ability and convergence performance. The CEC 2017 test functions incorporate complex and diverse optimization problems drawn from real-life scenarios, mathematical models, and artificially constructed benchmark problems, and they cover many optimization types, from simple local searches to complex global optimization. As the dimensionality of the functions increases, the difficulty of solving them also escalates, posing greater challenges to the performance of optimization algorithms. Therefore, this paper employs the 30-dimensional, 50-dimensional, and 100-dimensional functions from the CEC 2017 benchmark to verify the comprehensive performance of the optimization algorithm. Among the 29 test functions used from CEC 2017, F1 and F2 are unimodal functions used to test optimization capability and convergence speed; F3–F10 are simple multimodal functions with multiple local optima designed to evaluate global search ability; F11–F20 are hybrid functions that combine unimodal and multimodal characteristics, demanding higher performance from the algorithm; and F21–F30 are composition functions that reflect the comprehensive performance of the algorithm. F2 is excluded from the analysis. Additionally, five intelligent optimization algorithms, namely DBO, POA, HHO, SABO, and SCSO, are utilized for comparative analysis with MSCSO. The specific parameters of these six intelligent algorithms are detailed in Table 2.

4.2. Results and Analysis

4.2.1. Analysis and Comparison of Experimental Results

As seen in Table 3, the MSCSO algorithm leads in terms of mean and standard deviation for the unimodal function F1. Among the simple multimodal functions F3–F10, MSCSO achieves the best mean and standard deviation for functions F3 and F4. The standard deviation of MSCSO for functions F7 and F10 is slightly higher than that of DBO and POA but lower than the original algorithm’s, and it maintains a leading position for the remaining functions. In the simple multimodal functions, the mean performance of MSCSO is middling, ranking fifth and fourth on F5 and F6 and third on F7, F8, F9, and F10. For the hybrid functions F11–F20, the MSCSO algorithm outperforms the other five in standard deviation for F11, F12, F13, F15, F16, and F19. The standard deviation of MSCSO for F14, F17, F18, and F20 is slightly higher than POA’s but better than the original algorithm’s, and its mean values rank fourth on F17 and second on F18. Among the composition functions F21–F30, MSCSO achieves the optimal mean and standard deviation for F23, F25, F27, F28, and F30. Although the mean value of MSCSO ranks fifth on F21, sixth on F22, second on F24, last on F26, and third on F29, its standard deviation remains leading among all composition functions.
As seen in Table A1, in the 50-dimensional experiments the MSCSO algorithm outperforms the other five algorithms in mean and standard deviation for the unimodal function F1. Among the multimodal functions F3–F10, both evaluation metrics reach the optimal values for F4. The standard deviation is slightly higher than the DBO algorithm’s for F7 but excellent for the remaining functions, while the mean values for F5–F10 remain middling. In the hybrid functions F11–F20, the MSCSO algorithm demonstrates strong optimization performance, ranking first in mean and standard deviation for F11–F16 and F18–F19. The standard deviation of MSCSO is higher than POA’s only for F20 but lower than the original algorithm’s, and the mean values are middling only for F17 and F20. In the composition functions F21–F30, the MSCSO algorithm’s standard deviation also performs outstandingly, consistently ranking first. The mean values are middling for F21, F22, F24, and F26, but MSCSO finds the best solution for the remaining functions.
As seen in Table A2, in the 100-dimensional experiments the MSCSO algorithm exhibits excellent performance in handling complex problems. For the unimodal function F1, both evaluation metrics remain optimal. Among the multimodal functions F3–F10, the optimization performance of the MSCSO algorithm is similar to that in the 50-dimensional experiments. In the hybrid functions F11–F20, the mean value of the MSCSO algorithm ranks only fourth on F17 and F20, and its standard deviation is higher than POA’s only on F20; the mean and standard deviation exhibit the best performance for the remaining functions. In the composition functions F21–F30, the mean value of the MSCSO algorithm ranks poorly only on F21 and F26, and the evaluation metrics maintain a leading position for the remaining functions.
In summary, the MSCSO algorithm performs averagely in handling simple multimodal problems. Still, it exhibits excellent performance in dealing with unimodal, mixed, and composite function problems, especially when handling complex composite and high-dimensional functions.

4.2.2. Convergence Plot Analysis Discussion

In this paper, convergence curves of six intelligent optimization algorithms on the CEC2017 benchmark in 30-dimensional, 50-dimensional, and 100-dimensional spaces are introduced to demonstrate the performance of optimization algorithms.
As seen in Figure A1, DBO and POA surpass the MSCSO algorithm in the later iterations of F6 and F14; even so, MSCSO’s performance is significantly improved compared to the SCSO algorithm. In F3 and F22, MSCSO’s convergence is not evident in the early to mid-iterations but accelerates in the later iterations, surpassing the other five algorithms and ranking first. This fully demonstrates that integrating the early warning mechanism accelerates the algorithm’s convergence. In functions such as F1, F4, F5, F8, F9, F10, F12, F13, F19–F23, F26, F28, and F30, MSCSO exhibits faster convergence speeds. This acceleration indicates that MSCSO leverages the good point set strategy and the nonlinear adjustment strategy to enhance its global search capability, enabling it to locate optimal solutions more rapidly and improve its convergence performance.
As seen in Figure A2, the MSCSO algorithm fails to find the optimal solution only in F7, where it is surpassed by DBO. Compared with the other algorithms it exhibits strong convergence and optimization performance, and it converges faster than the original algorithm.
As seen in Figure A3, the MSCSO algorithm fails to find the optimal solution only in F20, but it can quickly find the optimal solution in most functions. Moreover, it can conduct a broader global search in the early iterations and escape from local optima in the later iterations, demonstrating that the MSCSO algorithm has certain advantages in handling complex and high-dimensional functions.
In summary, while the MSCSO algorithm shows some improvement in addressing premature convergence and stagnation, this improvement is not particularly evident in the 30-dimensional convergence curves. However, as the dimensionality increases, the algorithm’s convergence performance improves significantly, demonstrating that the MSCSO algorithm has clear advantages in handling complex, high-dimensional problems and can escape from local optima.

4.2.3. Boxplots Analysis

This paper utilizes boxplots to compare MSCSO with the five other algorithms, making the optimization performance of MSCSO more visually intuitive. The median marks the middle of the sample results from 30 independent runs of each algorithm, the upper and lower quartiles indicate the degree of fluctuation in the sample data, and outliers suggest instability in the data. Figure A4, Figure A5 and Figure A6 show the sample data results of the six intelligent algorithms on the 30-dimensional, 50-dimensional, and 100-dimensional test functions.
In Figure A4, MSCSO exhibits the lowest average value on most functions, verifying its powerful optimization capability. For test functions F6-F9, F16-F17, F20, F24, F26, and F29, the upper and lower quartiles of MSCSO’s boxplots are evenly distributed, indicating low data dispersion. Although there are outliers in some functions, the overall performance remains stable.
In Figure A5 and Figure A6, the box plots of MSCSO in 50-dimensional and 100-dimensional spaces show significant improvements in their upper and lower quartile distributions compared to the 30-dimensional boxplots. The distribution range becomes shorter and more uniform, indicating that as the dimension increases, the dispersion of the algorithm decreases. Additionally, as the dimension increases, the number of functions with the lowest average value for MSCSO also rises, demonstrating that MSCSO is more reliable than the other five algorithms.

4.3. Non-Parametric Test

4.3.1. Wilcoxon Rank-Sum Test

This paper introduces the Wilcoxon rank-sum test to verify whether there are significant differences in performance between MSCSO and the other five algorithms. Each algorithm was run independently 30 times, and the results were used to calculate p-values. The null hypothesis is rejected when the p-value is less than 0.05, indicating a significant performance difference between the two algorithms; when the p-value is greater than 0.05, the difference between the search results of the two algorithms is not statistically significant [29].
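For reference, the test can be reproduced with scipy.stats.ranksums; the data below are synthetic placeholders, not the paper’s results:

```python
import numpy as np
from scipy.stats import ranksums

# 30 independent run results for two algorithms on one test function
rng = np.random.default_rng(0)
mscso_runs = rng.normal(100.0, 5.0, 30)   # placeholder samples
scso_runs = rng.normal(110.0, 5.0, 30)

stat, p = ranksums(mscso_runs, scso_runs)
if p < 0.05:
    print(f"p = {p:.2e}: significant difference between the two algorithms")
else:
    print(f"p = {p:.2e}: no statistically significant difference")
```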
From Table 4, it can be observed that 5.5% of the comparisons show performance comparable to the MSCSO algorithm. In functions F6 and F7, MSCSO’s performance is comparable to DBO’s; in functions F17, F18, and F22, it is comparable to POA’s; and in functions F10 and F20, the p-values for SCSO are greater than 0.05, indicating no significant difference. All remaining comparisons show significant differences.
Table A3 indicates that only 2% of the comparisons show performance comparable to MSCSO, while 98% show significant differences from MSCSO. Specifically, in function F7, MSCSO’s performance is comparable to DBO’s, and in functions F3 and F20, the p-values for POA are greater than 0.05, indicating no significant difference.
In Table A4, 98.6% of the p-values are less than 0.05, indicating that MSCSO exhibits significant differences from the other five algorithms.
In summary, as the dimensionality of the test functions increases, the differences between MSCSO and DBO, POA, HHO, SABO, and SCSO become increasingly pronounced.

4.3.2. Friedman Test

In this paper, the Friedman test is used to provide an overall ranking of the algorithms’ experimental results, which aids in comprehensively evaluating their overall performance. In this ranking, a smaller average rank indicates better performance. Figure 7 displays the average Friedman ranking of the six intelligent optimization algorithms: DBO, POA, HHO, SABO, SCSO, and MSCSO. As evident from the figure, MSCSO consistently ranks first across the 30-dimensional, 50-dimensional, and 100-dimensional test functions of CEC 2017, validating the effectiveness of the improved algorithm and demonstrating its excellent comprehensive performance.
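The ranking can be reproduced with scipy.stats.friedmanchisquare; again, the data below are synthetic placeholders:

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Rows: test functions; columns: mean results of six algorithms
# (synthetic 29-function example; column order DBO, POA, HHO, SABO, SCSO, MSCSO)
rng = np.random.default_rng(1)
results = rng.random((29, 6)) + np.array([0.3, 0.25, 0.2, 0.35, 0.3, 0.0])

stat, p = friedmanchisquare(*results.T)          # one sample per algorithm
ranks = np.argsort(np.argsort(results, axis=1), axis=1).mean(axis=0) + 1
print(f"Friedman chi-square = {stat:.2f}, p = {p:.2e}")
print("average ranks (lower is better):", np.round(ranks, 2))
```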

5. Engineering Applications

In this section, two specific engineering examples, the reducer design problem and the welded beam design problem, are used to verify the performance of the MSCSO algorithm [30].

5.1. Reducer Design Problems

The reducer design problem is a classic constrained engineering optimization problem, and its core goal is to minimize the weight of the reducer while satisfying a series of complex constraints. As can be seen from Figure 8, this problem involves seven structural parameters: face width (W1), tooth module (W2), number of pinion teeth (W3), length of the first shaft between bearings (W4), length of the second shaft between bearings (W5), diameter of the first shaft (W6), and diameter of the second shaft (W7). The weight of the reducer is minimized by optimizing the combination of these seven parameters.
W1–W7 are the seven variables of the reducer design problem, and the objective function is set as Equation (16) indicates.
$$f(\bar{w}) = 0.7854 \times w_1 w_2^2 \times \left(3.3333\,w_3^2 + 14.9334\,w_3 - 43.0934\right) - 1.508 \times w_1 \times \left(w_6^2 + w_7^2\right) + 7.4777 \times \left(w_6^3 + w_7^3\right) + 0.7854 \times \left(w_4 w_6^2 + w_5 w_7^2\right) \quad (16)$$
Restrictions:
$$\begin{aligned}
f_1(\bar{w}) &= \frac{27}{w_1 w_2^2 w_3} - 1 \le 0, \qquad
f_2(\bar{w}) = \frac{397.5}{w_1 w_2^2 w_3^2} - 1 \le 0, \\
f_3(\bar{w}) &= \frac{1.93\,w_4^3}{w_2 w_6^4 w_3} - 1 \le 0, \qquad
f_4(\bar{w}) = \frac{1.93\,w_5^3}{w_2 w_7^4 w_3} - 1 \le 0, \\
f_5(\bar{w}) &= \frac{\left[\left(745\,w_4/(w_2 w_3)\right)^2 + 16.9 \times 10^6\right]^{1/2}}{110\,w_6^3} - 1 \le 0, \\
f_6(\bar{w}) &= \frac{\left[\left(745\,w_5/(w_2 w_3)\right)^2 + 157.5 \times 10^6\right]^{1/2}}{85\,w_7^3} - 1 \le 0, \\
f_7(\bar{w}) &= \frac{w_2 w_3}{40} - 1 \le 0, \qquad
f_8(\bar{w}) = \frac{5 w_2}{w_1} - 1 \le 0, \qquad
f_9(\bar{w}) = \frac{w_1}{12 w_2} - 1 \le 0, \\
f_{10}(\bar{w}) &= \frac{1.5\,w_6 + 1.9}{w_4} - 1 \le 0, \qquad
f_{11}(\bar{w}) = \frac{1.1\,w_7 + 1.9}{w_5} - 1 \le 0
\end{aligned}$$
The range of the variables is as follows:
$2.6 \le w_1 \le 3.6, \quad 0.7 \le w_2 \le 0.8, \quad 17 \le w_3 \le 28, \quad 7.3 \le w_4 \le 8.3, \quad 7.3 \le w_5 \le 8.3, \quad 2.9 \le w_6 \le 3.9, \quad 5.0 \le w_7 \le 5.5$
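For verification, the objective and constraints above can be coded directly; evaluating the Table 5 solution reproduces a weight of about 2994.4 and satisfies all constraints (function names are ours):

```python
import numpy as np

def reducer_weight(w):
    """Objective of the speed reducer problem, Equation (16)."""
    w1, w2, w3, w4, w5, w6, w7 = w
    return (0.7854 * w1 * w2**2 * (3.3333 * w3**2 + 14.9334 * w3 - 43.0934)
            - 1.508 * w1 * (w6**2 + w7**2)
            + 7.4777 * (w6**3 + w7**3)
            + 0.7854 * (w4 * w6**2 + w5 * w7**2))

def reducer_constraints(w):
    """Constraint values f1..f11 (feasible when all are <= 0)."""
    w1, w2, w3, w4, w5, w6, w7 = w
    return np.array([
        27 / (w1 * w2**2 * w3) - 1,
        397.5 / (w1 * w2**2 * w3**2) - 1,
        1.93 * w4**3 / (w2 * w6**4 * w3) - 1,
        1.93 * w5**3 / (w2 * w7**4 * w3) - 1,
        np.sqrt((745 * w4 / (w2 * w3))**2 + 16.9e6) / (110 * w6**3) - 1,
        np.sqrt((745 * w5 / (w2 * w3))**2 + 157.5e6) / (85 * w7**3) - 1,
        w2 * w3 / 40 - 1,
        5 * w2 / w1 - 1,
        w1 / (12 * w2) - 1,
        (1.5 * w6 + 1.9) / w4 - 1,
        (1.1 * w7 + 1.9) / w5 - 1,
    ])

w_star = np.array([3.5, 0.7, 17, 7.3, 7.71532, 3.35054, 5.28665])
print(reducer_weight(w_star))                      # ~2994.4, the Table 5 solution
print((reducer_constraints(w_star) <= 1e-6).all())  # True: solution is feasible
```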
MSCSO is compared with the DBO, POA, HHO, SABO, and SCSO algorithms to verify its performance on the reducer design problem. From the data in Table 5, the MSCSO algorithm comes closest to the optimal value of the reducer design problem, with a minimum weight of 2994.4245, and the values of its seven variables are 3.5, 0.7, 17, 7.3, 7.71532, 3.35054, and 5.28665.
As seen in Figure 9, MSCSO can quickly find the global optimal solution compared to other algorithms, demonstrating its applicability in the reducer design problem.

5.2. Welded Beam Design Problems

The welded beam design problem is a typical strongly constrained nonlinear programming problem, with its core objective being to reduce manufacturing costs by optimizing the design scheme. Figure 10 illustrates the structural parameters of the welded beam, where the optimal combination of joint thickness (h), tightening bar length (l), bar height (t), and bar thickness (b) is sought to minimize manufacturing costs, under the premise of satisfying various physical and engineering constraints.
The parameters h, l, t, and b are represented by the variables w1, w2, w3, and w4, and the objective function is set as Equation (17) indicates.
$$f(\bar{w}) = 1.10471 \times w_1^2 w_2 + 0.04811 \times w_3 w_4 \left(14.0 + w_2\right) \quad (17)$$
Restrictions:
$$\begin{aligned}
f_1(\bar{w}) &= \tau(\bar{w}) - \tau_{max} \le 0, \qquad
f_2(\bar{w}) = \sigma(\bar{w}) - \sigma_{max} \le 0, \qquad
f_3(\bar{w}) = \delta(\bar{w}) - \delta_{max} \le 0, \\
f_4(\bar{w}) &= w_1 - w_4 \le 0, \qquad
f_5(\bar{w}) = P - p_c(\bar{w}) \le 0, \qquad
f_6(\bar{w}) = 0.125 - w_1 \le 0, \\
f_7(\bar{w}) &= 1.10471\,w_1^2 w_2 + 0.04811\,w_3 w_4 \left(14.0 + w_2\right) - 5.0 \le 0
\end{aligned}$$
The range of variables is as follows:
$0.1 \le w_1 \le 2, \quad 0.1 \le w_2 \le 10, \quad 0.1 \le w_3 \le 10, \quad 0.1 \le w_4 \le 2$
where
$$\begin{aligned}
\tau(\bar{w}) &= \sqrt{(\tau')^2 + 2\tau'\tau''\frac{w_2}{2R} + (\tau'')^2}, \qquad
\tau' = \frac{P}{\sqrt{2}\,w_1 w_2}, \qquad
\tau'' = \frac{MR}{J}, \qquad
M = P\left(L + \frac{w_2}{2}\right), \\
R &= \sqrt{\frac{w_2^2}{4} + \left(\frac{w_1 + w_3}{2}\right)^2}, \qquad
J = 2\left\{\sqrt{2}\,w_1 w_2\left[\frac{w_2^2}{4} + \left(\frac{w_1 + w_3}{2}\right)^2\right]\right\}, \\
\sigma(\bar{w}) &= \frac{6PL}{w_4 w_3^2}, \qquad
\delta(\bar{w}) = \frac{6PL^3}{E w_3^2 w_4}, \qquad
p_c(\bar{w}) = \frac{4.013 E \sqrt{w_3^2 w_4^6 / 36}}{L^2}\left(1 - \frac{w_3}{2L}\sqrt{\frac{E}{4G}}\right)
\end{aligned}$$
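The formulation can likewise be coded directly. The constants P, L, E, G and the limits τmax, σmax, δmax below are the standard values for this benchmark and are our assumption, since the paper does not list them; evaluating the Table 6 solution reproduces a cost of about 1.6702:

```python
import numpy as np

# Standard constants for this benchmark (assumed; not stated in the paper)
P, L = 6000.0, 14.0                     # load and beam length
E, G = 30e6, 12e6                       # Young's and shear moduli
TAU_MAX, SIGMA_MAX, DELTA_MAX = 13600.0, 30000.0, 0.25

def welded_beam_cost(w):
    """Objective of the welded beam problem, Equation (17)."""
    w1, w2, w3, w4 = w
    return 1.10471 * w1**2 * w2 + 0.04811 * w3 * w4 * (14.0 + w2)

def welded_beam_constraints(w):
    """Constraint values f1..f7 (feasible when all are <= 0)."""
    w1, w2, w3, w4 = w
    tau_p = P / (np.sqrt(2) * w1 * w2)                       # tau'
    M = P * (L + w2 / 2)
    R = np.sqrt(w2**2 / 4 + ((w1 + w3) / 2)**2)
    J = 2 * (np.sqrt(2) * w1 * w2 * (w2**2 / 4 + ((w1 + w3) / 2)**2))
    tau_pp = M * R / J                                       # tau''
    tau = np.sqrt(tau_p**2 + 2 * tau_p * tau_pp * w2 / (2 * R) + tau_pp**2)
    sigma = 6 * P * L / (w4 * w3**2)
    delta = 6 * P * L**3 / (E * w3**2 * w4)
    p_c = (4.013 * E * np.sqrt(w3**2 * w4**6 / 36) / L**2
           * (1 - w3 / (2 * L) * np.sqrt(E / (4 * G))))
    return np.array([tau - TAU_MAX, sigma - SIGMA_MAX, delta - DELTA_MAX,
                     w1 - w4, P - p_c, 0.125 - w1,
                     welded_beam_cost(w) - 5.0])

w_star = np.array([0.19883, 3.3374, 9.1920, 0.19883])
print(welded_beam_cost(w_star))          # ~1.6702, the Table 6 solution
```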
As seen from the data in Table 6, MSCSO shows superior performance compared with the DBO, POA, HHO, SABO, and SCSO algorithms in solving the welded beam design problem. Specifically, the optimal solutions of the four variables of the MSCSO algorithm are 0.19883, 3.3374, 9.1920, and 0.19883, respectively. This combination of the four variables accurately approximates the optimal solution, with a minimum cost of 1.6702.
As seen in Figure 11, MSCSO can quickly locate the global optimal solution in the early stages of iteration for the welded beam design and converge effectively. The presence of multiple inflection points on the MSCSO curve indicates that MSCSO can escape from local optima.

6. Conclusions

This paper proposed a multi-strategy fusion algorithm named MSCSO to improve the optimization performance of SCSO. The effectiveness of MSCSO was verified through simulation experiments using 29 benchmark test functions from CEC 2017. Through detailed analysis and comparison of the experimental data, the following enhancements in MSCSO’s performance are observed.
  • Precision and convergence speed in unimodal functions: MSCSO demonstrated significantly higher precision and faster convergence speed in unimodal functions than the other algorithms, indicating its superiority in efficiently navigating and exploiting simple search landscapes.
  • Robust optimization performance in hybrid and composition functions: MSCSO exhibited formidable optimization capabilities in hybrid and composition functions. It maintained a leading position in precision and convergence in most functions, underscoring its adaptability to diverse and complex optimization challenges. As the problem dimensionality increases, MSCSO’s performance became even more pronounced, demonstrating its prowess in tackling intricate and high-dimensional problems.
  • Superiority confirmed by Wilcoxon rank-sum test: The Wilcoxon rank-sum test statistically validated that MSCSO outperforms other algorithms, providing robust evidence of its enhanced performance.
  • Applicability in engineering problems: In practical engineering applications such as gearbox reducer and welded beam design problems, MSCSO produced solutions closer to the optimal solutions than five other algorithms. It highlighted MSCSO’s effectiveness in real-world scenarios and underscored its broad applicability in solving engineering optimization tasks.

Author Contributions

All authors contributed to the conception and design of the study. Material preparation, data collection and analysis were carried out by H.P. and X.Z. The first draft of the manuscript was written by H.P., X.Z., H.M., Y.L., J.Q. and Z.K. All authors have commented on previous versions of the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the Xinjiang Agricultural Machinery R and D, Manufacturing, Promotion and Application Integration Project (Autonomous Region Agricultural and Rural Mechanization Development Center YTHSD2022-19-3).

Data Availability Statement

The data used in the current study are available from the corresponding author on reasonable request.

Acknowledgments

The authors would like to thank the Engineering Research Center for Production Mechanization of Oasis Special Economic Crop, Ministry of Education for providing experimental materials.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. CEC2017 50D test results.
DBO | POA | HHO | SABO | SCSO | MSCSO
F1Mean330239808810390489898148465345487998812018778375893196284819
Std3899252845498231180665512734336421195829383004224458961728185.7
F3Mean66786.3511822023.4148218437.3461716561.5700813980.7703917413.1313
Std268106.3887117684.1181167032.023182813.6126133707.505899157.0174
F4Mean524.82496422726.350905480.10930443027.9114091656.42195153.2610363
Std1409.9576187812.6840221805.6355678077.4406034571.652584567.295001
F5Mean84.9567136140.2275823831.1238238343.5212181136.85337843.369063
Std960.315895926.2623231922.57444921103.391257948.8158352838.558676
F6Mean11.320405226.2371675324.06575749612.35532667.1909012817.57760074
Std668.8507287672.129517679.9782754686.2532994677.4816775656.270736
F7Mean135.321570790.4180247269.834961179.64051781110.3563116131.381464
Std1412.101161764.556761899.3819161632.2916131665.2950171464.57607
F8Mean83.0129984239.8249519638.744447943.1013623151.1866191950.2141428
Std1321.3156031255.0668281244.0885641426.2667041290.5160191161.60697
F9Mean7710.7474871529.0368552603.7186593922.3441623210.2506772219.92498
Std25806.0192218620.3852532122.6923831402.2501621158.1082912664.2894
F10Mean2272.307432630.9933193950.8373144368.70320541119.8043331573.49553
Std11510.694429181.62784410355.8356415130.5910810190.868368705.85798
F11Mean1322.8909372470.236447526.05972842745.1672832255.91850982.9280624
Std4033.5755536728.0499672988.03384311606.627847861.1056791379.4339
F12Mean538853153.47237740245567912624.7796095320122088293176744695.91
Std830186029.614648371591954810132.713876884946383395012510351644.2
F13Mean114150217.4449494712941254837.952425906523190374660235670.1713
Std92502949.45470735161733200051.092580833263147487711456441.5596
F14Mean3974759.282607675.38373910591.1935374681.8082703986.653146418.047
Std3918959.364649914.59064748048.3596304012.6752681790.879310747.588
F15Mean23480267.91175874736.5955411.5079296542849.3424171520.714866.0908
Std9825049.54288883732.081495111.31238992978.7189176839.528855.0623
F16Mean628.3633522692.8244518665.4755948562.5161157569.3277895532.561722
Std4989.4215034372.0867474990.4625756001.4004794672.4124373592.24632
F17Mean429.73778408.269056485.9001181365.0381958479.430001426.899651
Std4413.3803063774.0153273828.2989614618.7582343919.0095623492.97037
F18Mean14964943.254418438.2386842181.00428288404.3424711295.341272072.92
Std15796585.344262892.24210795307.2340890071.6823837109.21564630.47
F19Mean13933122.64362744484.73677685.84950424696.53222217039.814766.5931
Std9116184.623149962044.93069100.45950048950.3870718954.7137005.8523
F20Mean410.2578286213.5439526308.4450941168.3324002294.2973357345.601964
Std3710.285143063.8366873668.5208854212.2713483565.6748773335.5459
F21Mean83.1409421961.95986229101.634997263.5962396973.2354484166.0088371
Std2876.185412827.9351962958.6502852934.3554522794.5942132648.16974
F22Mean2015.9227421074.838903823.22839512343.895269985.71606271602.4466
Std12502.8776411437.6093112417.4003516309.1302812327.3349110992.8653
F23Mean132.4932719147.379751206.4727144180.1562827123.2249366114.317005
Std3523.6334573590.8069853990.4931093943.8721933384.5206333182.09634
F24Mean158.7447433109.0159033255.7003902158.4743871104.2052828106.811743
Std3695.1704023733.6721814419.949573913.2515213537.0078243320.05097
F25Mean1569.2404041094.525409162.7050111287.475791663.437857327.3972524
Std3861.6119516999.0334043816.3191947075.9420695232.6779923114.66829
F26Mean1118.2327351865.6003141342.11311842.53722361614.137883234.80923
Std11100.8808212768.0597411536.6316413553.1363111748.540795729.92126
F27Mean308.0431984291.1735528683.8444814492.0369643207.7881354169.730378
Std4010.0329614339.7225635155.2848144642.8277834274.5499973611.67258
F28Mean2254.867558755.4207857441.8933227825.6899025570.213497387.1245666
Std6131.0618486815.8754294758.0248157617.2829935661.3314363377.80703
F29Mean1002.694976621.40945411096.5222171507.201481822.3748579485.91776
Std6231.207157022.5565727345.7880089940.7015736901.5556294970.4202
F30Mean47489390.7242380028.851564123.09603825274.3112349961.31336896.06
Std60009001.81346537086.2128720404.57343346492208730742431763.66
Table A2. CEC2017 100D test results.
DBO | POA | HHO | SABO | SCSO | MSCSO
F1Mean6443978535014679470611712306424212786225157135741415986191835519
Std889469076921.44787 × 1011480588634721.56287 × 10111.06645 × 10111.8053 × 1010
F3Mean327312.907231671.50162122549.388314531.9572417342.9247330730.8664
Std730924.6607305881.7838372984.6991348932.4274320105.4844298781.525
F4Mean15071.018876198.7753651998.3898536240.4469454582.4078111073.78027
Std16471.5668625602.815929838.83543331928.3039815787.592561733.29016
F5Mean255.994228664.4129300846.42333274113.617776460.1925420350.4321657
Std1737.1513571606.0747381673.7520521930.3529061632.9769581371.77767
F6Mean13.175708873.7488547113.8474190045.7912034394.6521308333.18444647
Std682.8830424681.3642251690.6116046705.539359686.9988035664.088596
F7Mean230.535451680.11767378131.0653554183.3490029204.1486638207.995493
Std2978.2779463470.6507983776.6869543405.0518463424.5839532882.10396
F8Mean203.169019969.7568833467.3496511897.7594149776.22128493120.07328
Std2215.080762086.0335862132.869362360.1882362085.9773321762.19817
F9Mean6031.9298113543.2338644754.0547934463.1805295964.9016281649.51376
Std78882.2091441704.0048168787.8943978575.2905944974.5170726104.6671
F10Mean5235.2053381140.9928051997.673166731.05271761614.7623982533.84918
Std27335.0408221101.1941924224.2007832359.407322747.0466317293.8057
F11Mean68644.951518627.2631838471.0773242124.3269523270.274056659.60843
Std223448.526488617.44402144597.4059205655.462194315.7348928493.7276
F12Mean22685808641636592864926760654311816445490213817550434441613219
Std727763303768930637601116195285495487255259836736964033255071073
F13Mean192201584.95459325140196797367.54378193430322277681830025.803
Std264566938.215150913501340976541.68883083561566864684265185.9529
F14Mean11153472.543516377.1252237802.1839886433.7275827627.5481513169.62
Std18865348.027659394.6169054348.95623213173.8610840847.182950801.58
F15Mean109477038280000022510706452.681617740396185945361224423.7711
Std91000114.11541441236318070615.522194201582212945078344422.7137
F16Mean1437.5273931401.7780211127.1431675.0722611129.010461687.019308
Std9830.45777711418.3700210048.4480914272.7656510546.59166353.67028
F17Mean1446.02326660266.756642015.37960987948.1510755169.84024647.352039
Std9522.42314844338.512658111.73275258510.7633830618.697145891.41214
F18Mean13836695.433976344.6514860168.72914620039.895884800.6313282615.36
Std24279018.439314116.2610333720.1129683396.1211376427.434933958.16
F19Mean104306523.7245126497736239372.081655661585157026519191064.6918
Std139026655.4433680411840238023.72111376240105824680036491.2231
F20Mean873.1403828485.1281919391.7970499288.1319208574.8553324492.113677
Std7283.6948655452.8462146232.5212047911.4339156246.7518695591.37673
F21Mean194.6733628151.272736240.8060467174.772987158.0561619150.778371
Std4034.4751663900.8829464336.4745134609.401683732.2583143344.92746
F22Mean4982.552861240.2219941250.0682681073.8743861304.7521492262.16708
Std29603.273824731.0980827107.1468134789.0581726263.4043221557.4056
F23Mean228.9527889228.6376536463.797077382.3292313187.8860342170.356698
Std4774.385154964.9231555861.9480175563.1402434534.2217363861.13035
F24Mean434.7971439293.7675921601.9109496544.0433336292.4855238254.89335
Std6174.0674846366.8754058242.8269527426.19785655.7947574576.10926
F25Mean4971.6251622146.970151469.62900821718.7719091580.137642280.365032
Std8528.15253313375.004756799.20561715072.289339867.1561624069.11079
F26Mean2716.2971162893.5581692197.6969982627.7382462775.0871587306.39958
Std26555.379536793.6480531943.8138239199.6609433818.4193919838.3825
F27Mean517.8607132624.43691591066.926374907.8012227519.8501265142.25572
Std4796.6778456170.7445996821.5166797226.3795135855.3038433782.70941
F28Mean6867.5241672361.314563803.44995352055.7695711887.503296420.03276
Std18557.44218477.302519319.44545819527.9879413378.178024602.70205
F29Mean2818.55828110708.197591042.9899658864.1064043914.703263617.239425
Std12278.126822161.3402812642.2340824025.0756515093.169397535.46986
F30Mean109769797.44948234972294385332.635709755232380766605930694.966
Std241833483.513085971393669153209.389788856024603212747981372.554
Table A3. Fifty-dimensional Wilcoxon rank-sum test of CEC2017 test function.
DBO | POA | HHO | SABO | SCSO
F1 | 3.6 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F3 | 8.9 × 10−11 < 0.05 | 3.2 × 10−01 > 0.05 | 1.1 × 10−05 < 0.05 | 2.8 × 10−06 < 0.05 | 3.2 × 10−02 < 0.05
F4 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F5 | 2.2 × 10−09 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F6 | 7.5 × 10−07 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F7 | 6.5 × 10−01 > 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 4.6 × 10−10 < 0.05 | 1.4 × 10−08 < 0.05
F8 | 9.7 × 10−10 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F9 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F10 | 7.3 × 10−10 < 0.05 | 5.1 × 10−07 < 0.05 | 4.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 1.5 × 10−09 < 0.05
F11 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F12 | 2.3 × 10−10 < 0.05 | 3.0 × 10−11 < 0.05 | 6.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F13 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F14 | 4.6 × 10−08 < 0.05 | 1.6 × 10−05 < 0.05 | 1.1 × 10−09 < 0.05 | 3.0 × 10−11 < 0.05 | 7.3 × 10−10 < 0.05
F15 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F16 | 8.9 × 10−11 < 0.05 | 3.3 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F17 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.3 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 1.2 × 10−10 < 0.05
F18 | 1.8 × 10−09 < 0.05 | 1.8 × 10−06 < 0.05 | 7.5 × 10−07 < 0.05 | 3.0 × 10−11 < 0.05 | 1.4 × 10−07 < 0.05
F19 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F20 | 7.7 × 10−09 < 0.05 | 3.4 × 10−01 > 0.05 | 5.3 × 10−03 < 0.05 | 3.0 × 10−11 < 0.05 | 5.5 × 10−03 < 0.05
F21 | 4.5 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 1.6 × 10−10 < 0.05
F22 | 2.0 × 10−08 < 0.05 | 5.1 × 10−07 < 0.05 | 1.7 × 10−10 < 0.05 | 3.0 × 10−11 < 0.05 | 4.6 × 10−08 < 0.05
F23 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 1.2 × 10−10 < 0.05
F24 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 4.5 × 10−11 < 0.05
F25 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F26 | 6.9 × 10−03 < 0.05 | 3.0 × 10−11 < 0.05 | 3.6 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 1.4 × 10−10 < 0.05
F27 | 1.9 × 10−10 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F28 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F29 | 3.6 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F30 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
Table A4. One-hundred-dimensional Wilcoxon rank-sum test of CEC2017 test function.
DBO | POA | HHO | SABO | SCSO
F1 | 1.8 × 10−09 < 0.05 | 1.8 × 10−06 < 0.05 | 7.5 × 10−07 < 0.05 | 3.0 × 10−11 < 0.05 | 1.4 × 10−07 < 0.05
F3 | 8.9 × 10−11 < 0.05 | 4.9 × 10−01 > 0.05 | 1.1 × 10−05 < 0.05 | 2.8 × 10−06 < 0.05 | 3.2 × 10−02 < 0.05
F4 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F5 | 2.2 × 10−09 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F6 | 7.5 × 10−07 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F7 | 7.5 × 10−07 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 4.6 × 10−10 < 0.05 | 1.4 × 10−08 < 0.05
F8 | 9.7 × 10−10 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F9 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F10 | 7.3 × 10−10 < 0.05 | 5.1 × 10−07 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 1.5 × 10−09 < 0.05
F11 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F12 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F13 | 2.3 × 10−10 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F14 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F15 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F16 | 8.9 × 10−11 < 0.05 | 3.3 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F17 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 1.2 × 10−10 < 0.05
F18 | 1.8 × 10−09 < 0.05 | 1.8 × 10−06 < 0.05 | 7.5 × 10−07 < 0.05 | 3.0 × 10−11 < 0.05 | 1.4 × 10−07 < 0.05
F19 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F20 | 7.7 × 10−09 < 0.05 | 2.3 × 10−01 > 0.05 | 5.3 × 10−03 < 0.05 | 3.0 × 10−11 < 0.05 | 5.5 × 10−03 < 0.05
F21 | 4.5 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 1.6 × 10−10 < 0.05
F22 | 2.0 × 10−08 < 0.05 | 5.1 × 10−07 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 4.6 × 10−08 < 0.05
F23 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 1.2 × 10−10 < 0.05
F24 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F25 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F26 | 6.9 × 10−03 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 1.4 × 10−10 < 0.05
F27 | 1.9 × 10−10 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F28 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F29 | 7.6 × 10−05 < 0.05 | 1.3 × 10−07 < 0.05 | 5.4 × 10−09 < 0.05 | 3.0 × 10−11 < 0.05 | 1.7 × 10−07 < 0.05
F30 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
Figure A1. Thirty-dimensional convergence curves of six optimization algorithms.
Figure A2. Fifty-dimensional convergence curves of six optimization algorithms.
Figure A3. One-hundred-dimensional convergence curves of six optimization algorithms.
Figure A4. Results of 30-dimensional box plots.
Figure A5. Results of 50-dimensional box plots.
Figure A6. Results of 100-dimensional box plots.

References

  1. Rezk, H.; Olabi, A.G.; Wilberforce, T.; Sayed, E.T. Metaheuristic optimization algorithms for real-world electrical and civil engineering application: A Review. Results Eng. 2024, 23, 102437. [Google Scholar] [CrossRef]
  2. Jia, H. Design of fruit fly optimization algorithm based on Gaussian distribution and its application to image processing. Syst. Soft Comput. 2024, 6, 200090. [Google Scholar] [CrossRef]
  3. Aishwaryaprajna; Kirubarajan, T.; Tharmarasa, R.; Rowe, J.E. UAV path planning in presence of occlusions as noisy combinatorial multi-objective optimisation. Int. J. Bio-Inspired Comput. 2023, 21, 209–217. [Google Scholar] [CrossRef]
  4. Wang, K.; Guo, M.; Dai, C.; Li, Z.; Wu, C.; Li, J. An effective metaheuristic technology of people duality psychological tendency and feedback mechanism-based Inherited Optimization Algorithm for solving engineering applications. Expert Syst. Appl. 2024, 244, 122732. [Google Scholar] [CrossRef]
  5. Lan, F.; Castellani, M.; Zheng, S.; Wang, Y. The SVD-enhanced bee algorithm, a novel procedure for point cloud registration. Swarm Evol. Comput. 2024, 88, 101590. [Google Scholar] [CrossRef]
  6. Zhou, S.; Shi, Y.; Wang, D.; Xu, X.; Xu, M.; Deng, Y. Election Optimizer Algorithm: A New Meta-Heuristic Optimization Algorithm for Solving Industrial Engineering Design Problems. Mathematics 2024, 12, 1513. [Google Scholar] [CrossRef]
  7. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the International Conference on Neural Networks 1995, Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948. [Google Scholar]
  8. Dorigo, M. Positive Feedback as a Search Strategy; Technical Report 91-016; Dipartimento di Elettronica, Politecnico di Milano: Milan, Italy, 1991. [Google Scholar]
  9. Holland, J. Genetic algorithms. Sci. Am. 1992, 267, 66–72. [Google Scholar] [CrossRef]
  10. Xue, J.; Shen, B. A novel swarm intelligence optimization approach: Sparrow search algorithm. Syst. Sci. Control. Eng. 2020, 8, 22–34. [Google Scholar] [CrossRef]
  11. Xue, J.; Shen, B. Dung beetle optimizer: A new meta-heuristic algorithm for global optimization. J. Supercomput. 2023, 79, 7305–7336. [Google Scholar] [CrossRef]
  12. Trojovský, P.; Dehghani, M. Pelican optimization algorithm: A novel nature-inspired algorithm for engineering applications. Sensors 2022, 22, 855. [Google Scholar] [CrossRef]
  13. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris Hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  14. Trojovský, P.; Dehghani, M. Subtraction-average-based optimizer: A new swarm-inspired metaheuristic algorithm for solving optimization problems. Biomimetics 2023, 8, 149. [Google Scholar] [CrossRef] [PubMed]
  15. Seyyedabbasi, A.; Kiani, F. Sand Cat swarm optimization: A nature-inspired algorithm to solve global optimization problems. Eng. Comput. 2023, 39, 2627–2651. [Google Scholar] [CrossRef]
  16. Qiu, Y.; Zhou, J. Short-term rockburst damage assessment in burst-prone mines: An explainable XGBOOST hybrid model with SCSO algorithm. Rock Mech. Rock Eng. 2023, 56, 8745–8770. [Google Scholar] [CrossRef]
  17. Aghaei, V.T.; SeyyedAbbasi, A.; Rasheed, J.; Abu-Mahfouz, A.M. Sand cat swarm optimization-based feedback controller design for nonlinear systems. Heliyon 2023, 9, e13885. [Google Scholar] [CrossRef] [PubMed]
  18. Adegboye, O.R.; Feda, A.K.; Ojekemi, O.R.; Agyekum, E.B.; Khan, B.; Kamel, S. DGS-SCSO: Enhancing Sand Cat Swarm Optimization with Dynamic Pinhole Imaging and Golden Sine Algorithm for improved numerical optimization performance. Sci. Rep. 2024, 14, 1491. [Google Scholar] [CrossRef] [PubMed]
  19. Niu, Y.; Yan, X.; Wang, Y.; Niu, Y. An improved sand cat swarm optimization for moving target search by UAV. Expert Syst. Appl. 2024, 238, 122189. [Google Scholar] [CrossRef]
  20. Seyyedabbasi, A. A reinforcement learning-based metaheuristic algorithm for solving global optimization problems. Adv. Eng. Softw. 2023, 178, 103411. [Google Scholar] [CrossRef]
  21. Kiani, F.; Nematzadeh, S.; Anka, F.A.; Findikli, M.A. Chaotic sand cat swarm optimization. Mathematics 2023, 11, 2340. [Google Scholar] [CrossRef]
  22. Kiani, F.; Anka, F.A.; Erenel, F. PSCSO: Enhanced sand cat swarm optimization inspired by the political system to solve complex problems. Adv. Eng. Softw. 2023, 178, 103423. [Google Scholar] [CrossRef]
  23. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  24. Luogeng, H. Applications of Number Theory to Modern Analysis; Science Press: Beijing, China, 1978. [Google Scholar]
  25. Zhu, F.; Li, G.; Tang, H.; Li, Y.; Lv, X.; Wang, X. Dung beetle optimization algorithm based on quantum computing and multi-strategy fusion for solving engineering problems. Expert Syst. Appl. 2024, 236, 121219. [Google Scholar] [CrossRef]
  26. Qiu, Y.; Yang, X.; Chen, S. An improved gray wolf optimization algorithm solving to functional optimization and engineering design problems. Sci. Rep. 2024, 14, 14190. [Google Scholar] [CrossRef] [PubMed]
  27. Gao, P.; Ding, H.; Xu, R. Whale optimization algorithm based on skew tent chaotic map and nonlinear strategy. Acad. J. Comput. Inf. Sci. 2021, 4, 91–97. [Google Scholar] [CrossRef]
  28. Li, Q.; Shi, H.; Zhao, W.; Ma, C. Enhanced Dung Beetle Optimization Algorithm for Practical Engineering Optimization. Mathematics 2024, 12, 1084. [Google Scholar] [CrossRef]
  29. Derrac, J.; García, S.; Molina, D.; Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 2021, 1, 3–18. [Google Scholar] [CrossRef]
  30. Bayzidi, H.; Talatahari, S.; Saraee, M.; Lamarche, C.P. Social network search for solving engineering optimization prob lems. Comput. Intell. Neurosci. 2021, 2021, 8548639. [Google Scholar] [CrossRef]
Figure 1. The good point set strategy.
Figure 2. The random strategy.
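Figures 1 and 2 contrast the two initialization schemes. As a reference point, below is a minimal sketch of good point set initialization, assuming the common number-theoretic construction from Hua's method [24]: a generating vector r_j = 2cos(2πj/p), where p is the smallest prime satisfying p ≥ 2D + 3, and the i-th individual's j-th component is the fractional part of i·r_j mapped onto the search bounds. All names below are illustrative rather than the paper's code.

    import numpy as np

    def smallest_prime_at_least(n):
        """Return the smallest prime >= n (trial division suffices at these sizes)."""
        def is_prime(k):
            return k >= 2 and all(k % d for d in range(2, int(k ** 0.5) + 1))
        while not is_prime(n):
            n += 1
        return n

    def good_point_set(pop_size, dim, lb, ub):
        """Good point set initialization: frac(i * r_j), with r_j = 2*cos(2*pi*j / p)."""
        p = smallest_prime_at_least(2 * dim + 3)
        r = 2.0 * np.cos(2.0 * np.pi * np.arange(1, dim + 1) / p)   # generating vector
        i = np.arange(1, pop_size + 1).reshape(-1, 1)
        points = np.mod(i * r, 1.0)                                 # fractional parts in [0, 1)
        return lb + points * (ub - lb)                              # map onto the bounds

    good_pop = good_point_set(pop_size=30, dim=10, lb=-100.0, ub=100.0)   # cf. Figure 1
    rand_pop = -100.0 + np.random.rand(30, 10) * 200.0                    # cf. Figure 2

Plotting the two populations side by side reproduces the uniform spread of Figure 1 against the clustering of Figure 2.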
Figure 3. The global search scope of Equation (3).
Figure 4. The global search scope of Equation (14).
Figure 5. Movement diagram of the sand cats under the warning mechanism fused from the sparrow search algorithm.
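Figure 5 depicts the sand cats abandoning their current positions under the early warning mechanism adopted from the sparrow search algorithm [10]. A minimal sketch, assuming the standard SSA vigilance update: an individual whose fitness is worse than the global best jumps toward the best position with a normally distributed step, while the individual already at the best position moves away from the worst one. The exact coupling to the sand cat positions follows the paper's equations; the names here are illustrative.

    import numpy as np

    def warning_update(X, fitness, eps=1e-50):
        """SSA-style early-warning move applied to an (N, D) population (minimization)."""
        best_idx, worst_idx = np.argmin(fitness), np.argmax(fitness)
        X_best, X_worst = X[best_idx], X[worst_idx]
        f_best, f_worst = fitness[best_idx], fitness[worst_idx]
        X_new = X.copy()
        for i in range(X.shape[0]):
            if fitness[i] > f_best:               # aware of danger: jump toward the best
                beta = np.random.randn()          # step-size control, N(0, 1)
                X_new[i] = X_best + beta * np.abs(X[i] - X_best)
            else:                                 # at the best position: move away from the worst
                K = np.random.uniform(-1.0, 1.0)
                X_new[i] = X[i] + K * np.abs(X[i] - X_worst) / (fitness[i] - f_worst + eps)
        return X_new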
Figure 6. MSCSO flow chart.
Figure 7. Friedman average rank.
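Figure 7 aggregates performance as Friedman average ranks: the six algorithms are ranked on every test function (rank 1 for the best mean error) and the ranks are averaged over all functions. A minimal sketch with scipy, on illustrative numbers rather than the paper's data:

    import numpy as np
    from scipy.stats import friedmanchisquare, rankdata

    # Rows: test functions; columns: DBO, POA, HHO, SABO, SCSO, MSCSO (illustrative values).
    results = np.array([
        [1.2e3, 3.4e3, 2.1e3, 5.6e3, 2.9e3, 9.0e2],
        [4.1e1, 3.9e1, 4.4e1, 5.2e1, 4.0e1, 3.7e1],
        [7.7e5, 9.1e5, 8.2e5, 9.9e5, 8.8e5, 6.5e5],
        [3.1e2, 2.8e2, 3.5e2, 4.0e2, 3.0e2, 2.6e2],
    ])
    ranks = np.apply_along_axis(rankdata, 1, results)   # rank the algorithms per function
    print("mean ranks:", ranks.mean(axis=0))            # lower is better
    stat, p = friedmanchisquare(*results.T)             # omnibus test across the six algorithms
    print(f"Friedman chi-square = {stat:.3f}, p = {p:.4f}")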
Figure 8. Reducer design.
Figure 9. Convergence curves of six algorithms in reducer design.
Figure 10. Welded beam design.
Figure 11. Convergence curves of six algorithms in welded beam design.
Table 1. Improved SCSO variants and their research gaps.
Improved SCSO | Research Gap
DGS-SCSO [18] | Long time consumption
ISCSO [19] | Cannot be applied to multi-objective coordinated search
RLSCSO [20] | Premature convergence on specific problems
CSCSO [21] | Not suitable for multi-objective and multi-dimensional problems
Table 2. Algorithm parameter settings.
Algorithm | Basic Parameters
DBO | P-Percent = 0.2
POA | I = round(1 + rand(1, 1))
HHO | E0 ∈ [−1, 1], E1 ∈ [0, 2]
SABO | I = round(1 + rand + rand)
SCSO | SM = 2, P = [1, 360]
MSCSO | SM = 2, P = [1, 360]
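The SM and P entries in Table 2 map directly onto the SCSO update rules [15]: the general sensitivity range rG decays from SM = 2 to 0 over the iterations, R = 2·rG·rand − rG switches each cat between exploration (|R| > 1) and exploitation (|R| ≤ 1), and P = [1, 360] is the range of the attack angle in degrees. Below is a minimal sketch of one population update; the nonlinear rG option only stands in for the paper's Equation (14) and is an assumed form, not the exact formula.

    import numpy as np

    def scso_step(X, best, t, T, SM=2.0, nonlinear=False):
        """One SCSO position update (sketch). X: (N, D) positions; best: (D,) best solution."""
        rG = SM * (1.0 - t / T)                     # Equation (3): linear decay from SM to 0
        if nonlinear:
            rG = SM * (1.0 - (t / T) ** 2)          # placeholder for Equation (14); assumed form
        X_new = np.empty_like(X)
        for i in range(X.shape[0]):
            R = 2.0 * rG * np.random.rand() - rG    # exploration/exploitation switch
            r = rG * np.random.rand()               # this cat's own sensitivity
            if abs(R) > 1:                          # exploration: search relative to a random candidate
                cand = X[np.random.randint(X.shape[0])]
                X_new[i] = r * (cand - np.random.rand() * X[i])
            else:                                   # exploitation: circle and attack the prey
                theta = np.deg2rad(np.random.randint(1, 361))   # angle drawn from P = [1, 360]
                X_new[i] = best - r * np.abs(np.random.rand() * best - X[i]) * np.cos(theta)
        return X_new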
Table 3. CEC2017 30D test results (columns, left to right: DBO, POA, HHO, SABO, SCSO, MSCSO).
F1Mean181061557.95660538295255329045.4461403168141871573396480.351978
Std226987212.716150983243421644778.71442567465291402517645760.1867
F3Mean17515.689878641.9378497230.2293329120.61548713161.36025891.522648
Std92767.2121342864.8711258714.4347261490.6212257139.927923032.27802
F4Mean139.38060641180.492654100.06837081538.350635971.76277417.53120497
Std702.71529132508.661121730.57320482822.3918121302.43027499.858241
F5Mean55.8872214936.7103345234.9028207742.8609980546.575617349.52457744
Std754.5220169771.7684659769.3682924821.7540847765.931483696.6663608
F6Mean14.229654236.4200056936.81316334214.046551168.554957479.141826164
Std649.3417836661.4178179667.1667533665.3981712662.780493641.1724601
F7Mean102.491861379.5927992567.4198270346.32864246102.70389968.75953635
Std1043.0005881265.8426421289.1646911137.5365451151.547091049.194281
F8Mean65.3034596428.5244885125.8393464739.9166550235.890403928.88077642
Std1028.0001561000.597586982.31297531089.7488561015.94441964.2559725
F9Mean2480.28238751.47593721110.6365052021.9135391090.507321064.581016
Std6502.3881625694.5406178781.7932427867.6021496263.037944356.192112
F10Mean845.120596674.125884955.1853536310.1298527695.423901707.9633178
Std5844.3664995179.7931256074.4984078875.3437566308.356865578.120938
F11Mean383.3717531984.7520981131.06977911666.2394611444.1623965.31309442
Std1747.3224182276.1020841572.2861934958.0049623077.696631279.354045
F12Mean98614979.78153255683770205784.97701884840391983407716338.4703
Std51956094.49136467526172935842.39862414803278116157799837.9558
F13Mean16430067.8767106598.521020004.509422127043.356869122532716.10554
Std6687687.64824812467.871341056.187210921857.116622361736573.76204
F14Mean397818.357651345.756781006339.5331078547.96573708.14949306.0385
Std396369.855131504.890971214146.751266375.926507603.59362704.4259
F15Mean9618753.61927415.0249569501.597663290687.00513835517.626416.15645
Std2151740.61240267.45615126213.04822787500.824827029.2716700.49309
F16Mean425.8795674411.7597504538.3952084369.0523296394.224764286.8438364
Std3385.8438133156.7224453671.9303274133.5417833355.589792908.555732
F17Mean276.5561996206.9603863327.376406272.2066037253.446717282.808772
Std2721.0862182274.4900882642.9620912952.613352411.410412349.256087
F18Mean5019785.4215328.45914829644.6247611276.173100455.93217570.0692
Std4500202.134230891.66094833387.0295627992.8522951381.6278289.2344
F19Mean2881341.6971213028.7981170113.0433997783.17412579710.719503.72515
Std1687089.2631425064.2151552859.143910735.1626626031.2412942.19377
F20Mean213.6571874131.0908482223.428119150.6634982208.000522249.5415292
Std2793.250762528.4050692820.8334383084.7848162663.158462561.539277
F21Mean37.664132630.982302974.4215332530.7502798345.060089548.3281029
Std2547.4396682548.4028412607.4817152600.0651472531.55952476.341655
F22Mean2294.0551981506.8509561337.270747574.82877052056.714692509.271215
Std4708.0585495269.1364917169.6158834278.3991464749.340693880.26186
F23Mean77.5419952591.00184094133.330650777.4092551163.580537859.34920959
Std3035.5599583042.5581053285.1931483164.5051772941.824752861.767827
F24Mean78.454142386.76109528144.880604870.4349531355.777438659.05235058
Std3177.1842873205.6545883484.855493274.9964693085.017513017.376545
F25Mean241.7992819204.255337838.69880114187.0182296145.27916715.66963627
Std3023.5651773278.422613009.4366773424.1329393160.507292892.210674
F26Mean926.4768522932.37019571027.042403470.43459261124.232862061.212617
Std6706.1268627536.1277078216.3991138348.5415156674.180734821.825556
F27Mean79.0457745882.70004022144.3232787124.487646777.081621122.1711531
Std3327.0253253376.4180613556.3861943506.9111463401.594013251.414278
F28Mean805.8031815405.938520166.17406326465.5476469248.15082917.60517628
Std3810.4412444016.3776613461.712244425.9802043709.254123222.277593
F29Mean279.9945813313.0157891416.3759905632.9971989240.354284371.348704
Std4458.8238374701.7273844959.423175791.3088454507.632974035.466042
F30Mean3286368.5048636188.0628439425.61947899885.5111321242.9121760.9925
Std2209335.56711889940.3110203011.3441653700.0416812631.449818.54126
Table 4. Thirty-dimensional Wilcoxon rank-sum test of the CEC2017 test functions.
Function | DBO | POA | HHO | SABO | SCSO
F1 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F3 | 3.0 × 10−11 < 0.05 | 4.2 × 10−10 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F4 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F5 | 1.1 × 10−04 < 0.05 | 5.0 × 10−08 < 0.05 | 5.0 × 10−06 < 0.05 | 8.9 × 10−11 < 0.05 | 3.3 × 10−06 < 0.05
F6 | 5.0 × 10−01 > 0.05 | 4.1 × 10−07 < 0.05 | 3.1 × 10−10 < 0.05 | 7.1 × 10−05 < 0.05 | 9.8 × 10−08 < 0.05
F7 | 5.6 × 10−01 > 0.05 | 7.3 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 1.2 × 10−07 < 0.05 | 2.3 × 10−06 < 0.05
F8 | 4.0 × 10−08 < 0.05 | 1.1 × 10−06 < 0.05 | 3.1 × 10−03 < 0.05 | 4.0 × 10−11 < 0.05 | 2.1 × 10−07 < 0.05
F9 | 1.4 × 10−07 < 0.05 | 1.4 × 10−07 < 0.05 | 3.0 × 10−11 < 0.05 | 1.4 × 10−10 < 0.05 | 1.1 × 10−08 < 0.05
F10 | 7.9 × 10−03 < 0.05 | 4.8 × 10−03 < 0.05 | 2.0 × 10−01 > 0.05 | 3.0 × 10−11 < 0.05 | 9.3 × 10−02 > 0.05
F11 | 7.3 × 10−11 < 0.05 | 4.9 × 10−11 < 0.05 | 8.1 × 10−10 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F12 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F13 | 4.1 × 10−10 < 0.05 | 2.1 × 10−10 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 2.9 × 10−09 < 0.05
F14 | 2.0 × 10−06 < 0.05 | 7.2 × 10−03 < 0.05 | 1.1 × 10−09 < 0.05 | 3.0 × 10−11 < 0.05 | 1.4 × 10−07 < 0.05
F15 | 1.5 × 10−08 < 0.05 | 7.0 × 10−08 < 0.05 | 8.1 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 1.2 × 10−10 < 0.05
F16 | 2.1 × 10−05 < 0.05 | 3.6 × 10−03 < 0.05 | 4.8 × 10−07 < 0.05 | 3.0 × 10−11 < 0.05 | 1.3 × 10−05 < 0.05
F17 | 8.6 × 10−05 < 0.05 | 4.7 × 10−01 > 0.05 | 2.2 × 10−04 < 0.05 | 1.0 × 10−10 < 0.05 | 1.3 × 10−05 < 0.05
F18 | 4.3 × 10−08 < 0.05 | 1.8 × 10−01 > 0.05 | 2.5 × 10−07 < 0.05 | 2.0 × 10−08 < 0.05 | 3.8 × 10−06 < 0.05
F19 | 6.1 × 10−10 < 0.05 | 5.4 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F20 | 4.2 × 10−02 < 0.05 | 4.8 × 10−03 < 0.05 | 3.5 × 10−03 < 0.05 | 6.7 × 10−10 < 0.05 | 3.0 × 10−01 > 0.05
F21 | 5.9 × 10−05 < 0.05 | 5.9 × 10−05 < 0.05 | 8.3 × 10−08 < 0.05 | 2.2 × 10−09 < 0.05 | 9.5 × 10−04 < 0.05
F22 | 1.2 × 10−03 < 0.05 | 5.7 × 10−02 > 0.05 | 1.4 × 10−05 < 0.05 | 1.4 × 10−01 > 0.05 | 3.1 × 10−02 < 0.05
F23 | 9.8 × 10−08 < 0.05 | 4.1 × 10−09 < 0.05 | 3.3 × 10−11 < 0.05 | 4.5 × 10−11 < 0.05 | 4.0 × 10−05 < 0.05
F24 | 9.2 × 10−09 < 0.05 | 2.8 × 10−10 < 0.05 | 3.0 × 10−11 < 0.05 | 3.6 × 10−11 < 0.05 | 1.1 × 10−03 < 0.05
F25 | 2.6 × 10−09 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F26 | 1.2 × 10−09 < 0.05 | 7.7 × 10−09 < 0.05 | 8.9 × 10−11 < 0.05 | 4.0 × 10−11 < 0.05 | 1.7 × 10−07 < 0.05
F27 | 1.6 × 10−08 < 0.05 | 4.5 × 10−09 < 0.05 | 1.2 × 10−11 < 0.05 | 4.9 × 10−11 < 0.05 | 1.4 × 10−09 < 0.05
F28 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
F29 | 7.6 × 10−05 < 0.05 | 1.3 × 10−07 < 0.05 | 5.4 × 10−09 < 0.05 | 3.0 × 10−11 < 0.05 | 1.7 × 10−07 < 0.05
F30 | 1.6 × 10−09 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05 | 3.0 × 10−11 < 0.05
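Each cell of Table 4 is a Wilcoxon rank-sum p-value comparing MSCSO's independent runs against one competitor on one function, judged at the α = 0.05 level [29]. A minimal sketch of how one cell is produced, with scipy on illustrative data:

    import numpy as np
    from scipy.stats import ranksums

    rng = np.random.default_rng(0)
    mscso_runs = rng.normal(100.0, 5.0, size=30)   # final errors of 30 MSCSO runs (illustrative)
    scso_runs = rng.normal(110.0, 8.0, size=30)    # final errors of 30 competitor runs
    stat, p = ranksums(mscso_runs, scso_runs)
    print(f"p = {p:.1e}", "< 0.05" if p < 0.05 else "> 0.05")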
Table 5. Application of six intelligent algorithms in reducer design.
Algorithm | Optimal Values for Variables (w1, w2, w3, w4, w5, w6, w7) | Min f(w)
DBO | (3.5, 0.7, 17, 8.3, 8.3, 3.35253, 5.5) | 3158.6698
POA | (3.5, 0.7, 17, 8.3, 7.71541, 3.38461, 5.28665) | 3012.1955
HHO | (3.52428, 0.7, 17, 8.0053, 8.0053, 3.527, 5.28675) | 3064.6047
SABO | (3.54507, 0.7, 26.252, 8.26088, 8.1548, 3.9, 5.29138) | 5208.4211
SCSO | (3.50177, 0.700005, 17.0048, 7.35336, 8.16168, 3.35139, 5.28758) | 3007.0386
MSCSO | (3.5, 0.7, 17, 7.3, 7.71532, 3.35054, 5.28665) | 2994.4245
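For reference, the reducer objective can be written out directly, assuming the standard seven-variable speed-reducer weight formulation used in the constrained-benchmark literature [30], with the constraints omitted for brevity. Evaluating it at MSCSO's variables from Table 5 gives approximately 2994.5, consistent with the reported minimum.

    def reducer_weight(w):
        """Standard speed-reducer weight objective (constraints omitted)."""
        w1, w2, w3, w4, w5, w6, w7 = w
        return (0.7854 * w1 * w2**2 * (3.3333 * w3**2 + 14.9334 * w3 - 43.0934)
                - 1.508 * w1 * (w6**2 + w7**2)
                + 7.4777 * (w6**3 + w7**3)
                + 0.7854 * (w4 * w6**2 + w5 * w7**2))

    print(reducer_weight((3.5, 0.7, 17, 7.3, 7.71532, 3.35054, 5.28665)))   # ~2994.5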
Table 6. Application of six intelligent algorithms in welded beam design.
Algorithm | Optimal Values for Variables (w1, w2, w3, w4) | Min f(w)
DBO | (0.17595, 3.8169, 9.2549, 0.20014) | 1.7189
POA | (0.19893, 3.3362, 9.1899, 0.19893) | 1.6718
HHO | (0.20080, 3.5380, 9.0906, 0.20330) | 2.6522
SABO | (0.59404, 1.5484, 5.0911, 0.70551) | 1.9847
SCSO | (0.18774, 3.5610, 9.1921, 0.19883) | 1.6724
MSCSO | (0.19883, 3.3374, 9.1920, 0.19883) | 1.6702
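Likewise, assuming the standard welded-beam fabrication-cost objective [30], again with the constraints omitted, evaluating at MSCSO's variables from Table 6 reproduces the reported minimum of about 1.6702.

    def welded_beam_cost(w):
        """Standard welded-beam fabrication-cost objective (constraints omitted)."""
        w1, w2, w3, w4 = w
        return 1.10471 * w1**2 * w2 + 0.04811 * w3 * w4 * (14.0 + w2)

    print(welded_beam_cost((0.19883, 3.3374, 9.1920, 0.19883)))   # ~1.6702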
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
