Article

A Hybrid Nonlinear Greater Cane Rat Algorithm with Sine–Cosine Algorithm for Global Optimization and Constrained Engineering Applications

1 School of Electrical and Photoelectronic Engineering, West Anhui University, Lu’an 237012, China
2 School of Electronics and Information Engineering, West Anhui University, Lu’an 237012, China
* Author to whom correspondence should be addressed.
Biomimetics 2025, 10(9), 629; https://doi.org/10.3390/biomimetics10090629
Submission received: 16 August 2025 / Revised: 13 September 2025 / Accepted: 14 September 2025 / Published: 17 September 2025
(This article belongs to the Section Biological Optimisation and Management)

Abstract

The greater cane rat algorithm (GCRA) is a swarm intelligence algorithm inspired by the discerning and intelligent foraging behavior of greater cane rats, which alternates between the mating (rainy) season and the non-mating (dry) season. However, the basic GCRA exhibits serious drawbacks: high parameter sensitivity, insufficient solution accuracy, high computational complexity, susceptibility to local optima and overfitting, poor dynamic adaptability, and a severe curse of dimensionality. In this paper, a hybrid nonlinear greater cane rat algorithm with the sine–cosine algorithm, named SCGCRA, is proposed for resolving benchmark functions and constrained engineering designs; the objective is to balance exploration and exploitation to identify the globally optimal precise solution. The SCGCRA utilizes the periodic oscillatory fluctuation characteristics of the sine–cosine algorithm and the dynamic regulation and decision-making of a nonlinear control strategy to improve search efficiency and flexibility, enhance convergence speed and solution accuracy, increase population diversity and quality, avoid premature convergence and search stagnation, remedy the disequilibrium between exploration and exploitation, achieve synergistic complementarity, reduce parameter sensitivity, and realize repeated expansion and contraction of the search range. Twenty-three benchmark functions and six real-world engineering designs are utilized to verify the reliability and practicality of the SCGCRA. The experimental results demonstrate that the SCGCRA exhibits clear superiority and adaptability, achieving a faster convergence speed, higher solution accuracy, and stronger stability and robustness.

1. Introduction

Global optimization involves transforming complex problems with high dimensionality, nonlinearity, and multiple constraints into explicit mathematical models with objective functions and constraints, within a defined variable domain or under specific constraints. It is utilized to regulate quantitative indicators; attain the global extremum solution; acquire objective and repeatable results; eschew the subjectivity of empirical decision-making; and materialize optimal system performance, lowest cost, and highest efficiency. Traditional optimization methods, such as gradient descent, Newton’s method, the Lagrange multiplier method, and the simplex method, exhibit notable drawbacks: strong dependency on function properties, weak handling of high dimensions, sensitive constraint-handling conditions, poor anti-interference and robustness, and narrow practical scenarios. In contrast, metaheuristic algorithms (MHAs) exhibit remarkable advantages, including strong universality and usability, robustness and adaptability, excellent parallelism and distributed computing, extensive practicality and scalability, preferable detection efficiency and solution accuracy, and low computational complexity. MHAs can be broadly categorized into four main types according to the source of inspiration to address large-scale, nonlinear, multimodal optimization problems and identify global extremum solutions.
(1)
Swarm intelligence algorithms (SIAs)
SIAs, inspired by the collective behavior of organisms in nature, are distributed computing methods that imitate the local rules of numerous simple individuals to achieve interaction and collaboration, facilitating information sharing and adaptive adjustment, from which collective intelligent behavior emerges, thereby efficiently acquiring the global extremum solution. SIAs exhibit local perception and global emergence, a balance of positive and negative feedback, a combination of randomness and determinism, and division-of-labor and collaboration mechanisms. SIAs exhibit specific competitive characteristics of substantial decentralization, self-organization, robustness, scalability, simplicity and ease of implementation, excellent local interaction and global emergence, preferable distribution and parallelism, and reliable population diversity and global convergence, without requiring gradient information, such as Chinese pangolin optimization (CPO) [1], black-winged kite algorithm (BKA) [2], elk herd optimization (EHO) [3], puma optimization (PO) [4], horned lizard optimization algorithm (HLOA) [5], and greater cane rat algorithm (GCRA) [6].
(2)
Evolutionary algorithms (EAs)
EAs inspired by biological evolution theory are stochastic computational methods that imitate natural selection, genetic variation, and population reproduction. They utilize population iteration, fitness-driven, and random search mechanisms to approach global extremum solutions gradually. EAs are independent of the mathematical properties of continuity and differentiability of the problems, which are suitable for addressing nonlinear, non-convex, multimodal, high-dimensional, and multi-constraint problems. EAs exhibit specific competitive characteristics of intense exploration and exploitation, flexible multi-objective optimization and parallel computing, eschewing dimensional disaster and gradient information, strong flexibility, parallelism, scalability and adaptability, such as wave search algorithm (WSA) [7], snow avalanches algorithm (SAA) [8], liver cancer algorithm (LCA) [9], coronavirus mask protection algorithm (CMPA) [10], gooseneck barnacle optimization (GBO) [11], and orchard algorithm (OA) [12].
(3)
Physics/Chemistry/Mathematics-inspired algorithms
Physics/Chemistry/Mathematics-inspired algorithms based on natural phenomena or mathematical principles are randomized computational methods, which imitate the energy conservation, gravitational interaction, and thermodynamic processes of physical systems; the molecular interactions, energy conversion, and chemical reaction equilibria of chemical systems; and the mathematical transformations, probability distributions, and differential evolution of mathematical systems to maintain population diversity and multi-objective dynamic equilibrium; promote information compensation, coordination, and collaborative optimization of dual populations; explore high-quality search regions; and exploit global extremum solutions. These algorithms exhibit specific competitive characteristics of strong multi-objective adaptability, parallelism and scalability, low parameter sensitivity, rigorous logicality and convergence, optimal complementary advantages, preferable universality and energy orientation, and superior solution quality and efficiency, such as artemisinin optimization (AO) [13], Newton–Raphson-based optimization (NRBO) [14], exponential distribution optimizer (EDO) [15], Young’s double-slit experiment (YDSE) [16], Lévy arithmetic algorithm (LAA) [17], and triangulation topology aggregation optimizer (TTAO) [18].
(4)
Human-inspired algorithms (HBAs)
HBAs are inspired by social activities, human decision-making, cognitive patterns, environmental adaptation, and collaborative logic, which abstract the empirical rules, group interactions, adaptive strategies, or search logic accumulated by humans in addressing practical problems into computable mathematical models, and construct efficient search mechanisms. HBAs utilize individual experience accumulation, group information sharing, goal-oriented trial and error, or adaptive adjustment to efficiently explore the high-quality detection scope, extract global extremum solutions, achieve multi-agent collaboration, and avoid blind search and premature convergence. HBAs exhibit specific competitive characteristics of intense collaboration, adaptability, robustness, parallelism, interpretability and self-organization, straightforward interpretation and implementation, low dependency on mathematical properties, low computational complexity, intuitive and easily tunable parameters, strong multi-objective optimization and dynamic adaptability, and equilibrium between global exploration and local exploitation, such as educational competition optimization (ECO) [19], information acquisition optimizer (IAO) [20], human evolutionary optimization algorithm (HEOA) [21], memory backtracking strategy (MBS) [22], guided learning strategy (GLS) [23], and thinking innovation strategy (TIS) [24].
The greater cane rat algorithm (GCRA) is motivated by the dispersed foraging behavior of greater cane rats during the non-mating season and their concentrated foraging behavior during the mating season, which facilitates an efficient switch between global exploration and local exploitation, expands the search range, and updates the population’s position to obtain potential optimal solutions [6]. The basic GCRA exhibits serious drawbacks of high parameter sensitivity, insufficient solution accuracy, high computational complexity, susceptibility to local optima and overfitting, poor dynamic adaptability, and a severe curse of dimensionality. The no-free-lunch (NFL) theorem explicitly states that there is no universal or absolutely superior algorithm that applies to all complex problems, which not only reveals the conditional dependence of algorithms and the adaptability of difficulties but also prompts us to construct a hybrid nonlinear greater cane rat algorithm with the sine–cosine algorithm, called SCGCRA, to resolve benchmark functions and constrained engineering designs. The core purpose is to integrate the powerful global exploration ability of the GCRA and the excellent local exploitation ability of the SCA, and to balance the two adaptively through a nonlinear control strategy, which can comprehensively improve the optimization performance of the SCGCRA in handling complex function optimization and practical engineering problems, achieve a dynamic balance between exploration and exploitation, enhance the adaptability and robustness of the algorithm, and provide efficient, reliable, and practical solutions for complex optimization problems. For function optimization, the SCGCRA employs mechanism complementarity, dynamic tuning, and information sharing to overcome local optima, accelerate convergence, and enhance solution accuracy.
For engineering examples, the SCGCRA has certain superiority and practicality in handling complex constraints, adapting to dynamic environments, and solving high-dimensional problems. The SCGCRA not only employs global coarse exploration to enable greater cane rats to move and forage among scattered shelters within the territory, leave trail marks leading to food sources, and explore potential solutions, but also utilizes local refined exploitation to allow isolated males to concentrate on meticulous foraging in food-rich areas and enhance solution accuracy.
The main contributions of the SCGCRA are summarized as follows: (1) The hybrid nonlinear greater cane rat algorithm with sine–cosine algorithm (SCGCRA) is proposed to resolve the global optimization and constrained engineering applications. (2) The periodic oscillatory fluctuation characteristics of the sine–cosine algorithm and the dynamic regulation and decision-making of nonlinear control strategy improve search efficiency and flexibility, enhance convergence speed and solution accuracy, increase population diversity and quality, avoid premature convergence and search stagnation, remedy the disequilibrium between exploration and exploitation, achieve synergistic complementarity and reduce sensitivity, and realize repeated expansion and contraction. (3) The SCGCRA is compared with numerous advanced algorithms that contain recently published, highly cited, and highly performing algorithms, such as CPO, BKA, EHO, PO, WSA, HLOA, ECO, IAO, AO, HEOA, NRBO, and GCRA. (4) The SCGCRA is tested against twenty-three benchmark functions and six real-world engineering designs by performing simulation experiments and analyzing the results. (5) The evaluation metrics and overall performance of the SCGCRA outperform those of other algorithms. The SCGCRA exhibits substantial superiority and adaptability in achieving a dynamic balance between exploration and exploitation, leveraging a diversity mechanism to create a synergistic effect of complementary strengths, comprehensively enhancing convergence speed, solution accuracy, stability, and robustness.
The following sections constitute the article: Section 2 emphasizes the greater cane rat algorithm (GCRA). Section 3 elucidates the nonlinear greater cane rat algorithm with the sine–cosine algorithm (SCGCRA). Section 4 elaborates on the simulation test and result analysis for tackling benchmark functions. Section 5 portrays the SCGCRA for tackling engineering designs. Section 6 encapsulates the conclusions and future research.

2. Greater Cane Rat Algorithm (GCRA)

The GCRA is based on the intelligent foraging behavior and social collaboration mechanism of the greater cane rats (GCRs), simulating their territory exploration, path tracking, and reproduction strategies to construct an efficient framework for global exploration and local exploitation in complex optimization problems. GCRs are highly nocturnal animals that primarily inhabit areas such as swamps, riverbanks, and cultivated land, with sugarcane and grass as their main diet. The GCRA simulates the dispersed foraging behavior of GCRs during the non-mating (dry) season and the aggregated breeding behavior of GCRs during the mating (rainy) season to balance global exploration (searching unknown areas) and local exploitation (optimizing known solutions). Figure 1 portrays the natural foraging habitat of GCRs. GCRs live near the water source in the bottom shaded areas; the white regions and paths depict the trails through the vine-like grasses, which are features of previously known food sources.

2.1. Population Initialization

The matrix of a randomly initialized population is calculated as follows:
$$X = \begin{bmatrix} x_{1,1} & x_{1,2} & \cdots & x_{1,d-1} & x_{1,d} \\ x_{2,1} & x_{2,2} & \cdots & x_{2,d-1} & x_{2,d} \\ \vdots & \vdots & x_{i,j} & \vdots & \vdots \\ x_{n,1} & x_{n,2} & \cdots & x_{n,d-1} & x_{n,d} \end{bmatrix} \tag{1}$$
where $X$ denotes the GCR population, $x_{i,j}$ denotes the $i$th location in the $j$th dimension, $n$ denotes the population scale, and $d$ denotes the problem dimension. $x_{i,j}$ is calculated as follows:
$$x_{i,j} = rand \times (UB_j - LB_j) + LB_j \tag{2}$$
where $rand \in [0, 1]$, and $UB$ and $LB$ denote the upper and lower boundaries.
The dominant male GCR $x_{k,j}$ is identified as the fittest individual, which can guide the group towards known food sources or shelters, realign the locations, and avoid blind search. $\rho = 0.5$ serves as a variable that determines whether it is the rainy season, which is used to dynamically switch between exploration and exploitation.
$$x_{i,j}^{new} = 0.7 \times \frac{x_{i,j} + x_{k,j}}{2} \tag{3}$$
where $x_{i,j}^{new}$ denotes the latest location and $x_{i,j}$ denotes the current location.
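Equations (1)–(3) can be sketched in a few lines of NumPy; the population size, bounds, and sphere objective below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def initialize_population(n, d, lb, ub, rng):
    """Equation (2): uniform random positions within [lb, ub] per dimension."""
    return rng.random((n, d)) * (ub - lb) + lb

def realign_to_dominant(X, x_k):
    """Equation (3): pull every rat toward the dominant male x_k."""
    return 0.7 * (X + x_k) / 2.0

rng = np.random.default_rng(0)
lb, ub = np.full(5, -10.0), np.full(5, 10.0)
X = initialize_population(8, 5, lb, ub, rng)
fitness = np.sum(X**2, axis=1)        # illustrative sphere objective
x_k = X[np.argmin(fitness)]           # dominant male = fittest individual
X = realign_to_dominant(X, x_k)
```

Because Equation (3) averages each rat with the dominant male and scales by 0.7, the realigned population contracts toward $x_k$ while staying inside the original bounds.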

2.2. Exploration

GCRs construct hideout or shallow burrow shelters scattered around the territory in swamps, riverbanks, and cultivated land, which achieves dispersed migration for foraging and leaves trail marks through territorial sheltering and route tracking. Figure 2 portrays the exploration action of GCRs while looking for sources. The most suitable location of the dominant male GCR is considered the food source route, while the remaining GCRs follow and adjust their locations to explore different areas, expand the solution regions, avoid search stagnation, and obtain multiple potential solutions. The location is calculated as follows:
$$x_{i,j}^{new} = x_{i,j} + C \times (x_{k,j} - r \times x_{i,j}) \tag{4}$$
$$x_i = \begin{cases} x_{i,j} + C \times (x_{i,j} - \alpha \times x_{k,j}), & F_i^{new} < F_i \\ x_{i,j} + C \times (x_{m,j} - \beta \times x_{k,j}), & \text{otherwise} \end{cases} \tag{5}$$
where $x_i$ denotes the latest location of the $i$th GCR, $x_{i,j}^{new}$ denotes the location in the $j$th dimension, $x_{i,j}$ denotes the GCR’s current location, $x_{k,j}$ denotes the dominant male GCR’s location, $F_{x_k}$ denotes the fitness of $x_{k,j}$, $F_{x_i}$ denotes the current fitness, $C \in [0, 1]$ imitates the dispersed food sources and shelters, $r$ imitates the impact of a diverse food source and intensifies exploitation, $\alpha$ imitates a decreasing food source and forces GCRs to explore the latest food sources and shelters, and $\beta$ prompts GCRs to relocate to other abundant food sources within the breeding areas. $r$, $\alpha$, and $\beta$ are calculated as follows:
$$r = F_{x_k} - t \times \left( \frac{F_{x_k}}{T} \right) \tag{6}$$
$$\alpha = 2 \times r \times rand - r \tag{7}$$
$$\beta = 2 \times r \times \mu - r \tag{8}$$
where $t$ denotes the current iteration and $T$ denotes the maximum iteration.
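The exploration update (Equations (4)–(8)) for a single rat can be sketched as follows; the sphere objective used to evaluate $F_i^{new}$ versus $F_i$ is an illustrative assumption:

```python
import numpy as np

def exploration_step(x_i, x_k, x_m, F_k, t, T, rng):
    """One exploration move for a single rat (Equations (4)-(8)).

    x_i: current position, x_k: dominant male, x_m: random female,
    F_k: fitness of the dominant male, t/T: current/max iteration.
    """
    C = rng.random()                    # dispersed food sources, C in [0, 1]
    r = F_k - t * (F_k / T)             # Equation (6): shrinking food-source effect
    alpha = 2 * r * rng.random() - r    # Equation (7)
    mu = rng.integers(1, 5)             # offspring count, mu in [1, 4]
    beta = 2 * r * mu - r               # Equation (8)

    x_new = x_i + C * (x_k - r * x_i)   # Equation (4): candidate position
    # Equation (5): branch on candidate fitness (illustrative sphere objective)
    if np.sum(x_new**2) < np.sum(x_i**2):
        return x_i + C * (x_i - alpha * x_k)
    return x_i + C * (x_m - beta * x_k)
```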

2.3. Exploitation

Rampant breeding male GCRs leave the groups during the rainy season to forage deeply within areas with abundant food sources. Figure 3 portrays the exploitation action of GCRs during the mating season. The GCRA utilizes the random selection of female GCRs to imitate the focused exploitation of high-quality areas through reproductive behavior, strengthen the local meticulous search, and enhance solution quality. The location is calculated as follows:
$$x_{i,j}^{new} = x_{i,j} + C \times (x_{k,j} - \mu \times x_{m,j}) \tag{9}$$
where $x_{m,j}$ denotes the female GCR’s location and $\mu \in [1, 4]$ imitates the number of offspring produced by each female GCR.
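A minimal sketch of the exploitation update (Equation (9)), with NumPy as an assumed dependency:

```python
import numpy as np

def exploitation_step(x_i, x_k, x_m, rng):
    """Equation (9): focused search near abundant food during the mating season.

    x_k: dominant male, x_m: randomly selected female;
    mu in [1, 4] imitates the number of offspring per female.
    """
    C = rng.random()
    mu = rng.integers(1, 5)
    return x_i + C * (x_k - mu * x_m)
```

Since $\mu \geq 1$, the term $x_{k,j} - \mu \times x_{m,j}$ drives the rat past the female’s position relative to the dominant male, producing small local probes around the high-quality region.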
Algorithm 1 portrays the pseudocode of the GCRA.
Algorithm 1 GCRA
Step 1. Initialize the GCR population $X_i$ $(i = 1, 2, \ldots, n)$
Step 2. Estimate the fitness of GCRs, renovate the global best solution ($G_{best}$)
            Sift the fittest GCR as the dominant male $x_k$
            Renovate the remaining GCRs stemming from $x_k$ via Equation (3)
Step 3. while  t < T do
                  for all GCRs
                            Renovate ρ , r , α , β , C , μ
                            if  r a n d < ρ
                              Exploration
                              Renovate GCRs positions via Equation (4)
                          else
                              Exploitation
                              Renovate GCRs positions via Equation (9)
                           end if
                    end for
                    Affirm whether any solution has overflowed the search interval and revise it
                    Estimate the fitness of GCRs stemming from a renewed location
                     Renovate GCRs positions via Equation (5)
                     Renovate G b e s t and sift a renewed dominant male x k
                     t = t + 1
            end while
            Return  G b e s t

3. Nonlinear Greater Cane Rat Algorithm with Sine–Cosine Algorithm (SCGCRA)

The SCGCRA integrates the intelligent discerning and foraging behavior of GCRA, the mathematical periodic oscillatory fluctuation characteristic of the sine–cosine algorithm, and the adaptive adjustment mechanism in remedying the disequilibrium between global exploration and local exploitation, enhancing convergence efficiency and solution accuracy, highlighting robustness and applicability, avoiding premature convergence and dimensional disaster, preventing redundant search, and achieving superior solution quality.

3.1. Nonlinear GCRA

The downward slope of the parameter reduction is altered according to the algorithm structure. The nonlinear control strategy exhibits a strong anti-disturbance ability and nonlinear processing characteristics, ensuring solution accuracy and stability, enhancing overall search efficiency and localized fine-tuning, strengthening adaptability and operability, and facilitating dynamic regulation and solution quality [25]. The locations are calculated as follows:
$$W = 2e^{-\left(\frac{8t}{T}\right)^2} \tag{10}$$
$$x_{i,j}^{new} = W \times x_{i,j} + C \times (x_{k,j} - r \times x_{i,j}) \tag{11}$$
$$x_i = \begin{cases} W \times x_{i,j} + C \times (x_{i,j} - \alpha \times x_{k,j}), & F_i^{new} < F_i \\ W \times x_{i,j} + C \times (x_{m,j} - \beta \times x_{k,j}), & \text{otherwise} \end{cases} \tag{12}$$
$$x_{i,j}^{new} = W \times x_{i,j} + C \times (x_{k,j} - \mu \times x_{m,j}) \tag{13}$$
where $W$ denotes the nonlinear control weight, $t$ denotes the current iteration, and $T$ denotes the maximum iteration.
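Assuming the nonlinear weight takes the reconstructed form $W = 2e^{-(8t/T)^2}$, its decay from full-strength exploration toward fine exploitation can be checked directly:

```python
import math

def nonlinear_weight(t, T):
    """Equation (10): W = 2*exp(-(8*t/T)**2), decaying nonlinearly over iterations."""
    return 2.0 * math.exp(-(8.0 * t / T) ** 2)

T = 1000
print(nonlinear_weight(0, T))   # 2.0: full-strength weight at the start
print(nonlinear_weight(T, T))   # vanishingly small: fine local search at the end
```

The Gaussian-shaped profile keeps $W$ near 2 in early iterations (wide moves) and collapses it rapidly past roughly $t = T/4$, which is the repeated expansion-then-contraction behavior the paper attributes to the nonlinear control strategy.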

3.2. Sine–Cosine Algorithm (SCA)

The SCA is derived from the periodic fluctuation and range restriction of sine and cosine functions, which imitates the dynamic oscillatory behavior of trigonometric functions to guide the search agents in efficiently searching within a multi-dimensional solution space and approximating the global optimal solution [26]. The SCA offers remarkable advantages, including concise structure and parameters, easy equilibrium and implementation, structural flexibility and stability, efficient astringency and solution quality, strong robustness and versatility, adaptive switching between exploration and exploitation, low computational overhead, and abundant population multiplicity. The location is calculated as follows:
$$X_i^{t+1} = \begin{cases} X_i^t + r_1 \times \sin(r_2) \times \left| r_3 \times P_i^t - X_i^t \right|, & r_4 < 0.5 \\ X_i^t + r_1 \times \cos(r_2) \times \left| r_3 \times P_i^t - X_i^t \right|, & r_4 \geq 0.5 \end{cases} \tag{14}$$
where $X_i^t$ denotes the current location, $X_i^{t+1}$ denotes the latest location, $P_i^t$ denotes the fittest location, $r_2 \in [0, 2\pi]$, $r_3 \in [-2, 2]$, $r_4 \in [0, 1]$, and $|\cdot|$ denotes the absolute value. $r_1$ is calculated as follows:
$$r_1 = a - t \times \left(\frac{a}{T}\right) \tag{15}$$
where $a = 2$.
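One SCA iteration (Equations (14) and (15)) can be sketched vectorized over the whole population; NumPy is an assumed dependency:

```python
import numpy as np

def sca_step(X, P, t, T, rng, a=2.0):
    """One SCA update (Equation (14)) oscillating each agent around the best P."""
    n, d = X.shape
    r1 = a - t * (a / T)                      # Equation (15): decays from a to 0
    r2 = rng.uniform(0.0, 2.0 * np.pi, (n, d))
    r3 = rng.uniform(-2.0, 2.0, (n, d))
    r4 = rng.random((n, d))
    osc = np.where(r4 < 0.5, np.sin(r2), np.cos(r2))
    return X + r1 * osc * np.abs(r3 * P - X)
```

Because $r_1$ shrinks linearly to zero, early iterations oscillate widely around $P$ (exploration) while late iterations barely move (exploitation); at $t = T$ the update leaves the population unchanged.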

3.3. SCGCRA

The SCGCRA utilizes the group migration mechanism led by the dominant male GCRs and the randomly dispersed food source simulation mechanism to provide clear global search directions, avoid direction dispersion and redundant detection caused by SCA’s mathematical periodic oscillatory fluctuation characteristics, ensure complete coverage of the solution space, dynamically expand the search scope, enhance population diversity, and avoid premature convergence. The SCGCRA utilizes the local aggregation mechanism of the female GCRs and narrow amplitude oscillations of the SCA to exploit a small range of high-quality solution areas, efficiently locate the optimal fitness scope, achieve weak fine adjustment, remedy local roughness, reduce ineffective iterations, accelerate the convergence speed, and enhance solution accuracy. The nonlinear control strategy can dynamically adjust the dispersion to avoid search stagnation and improve robustness and usability. The SCGCRA utilizes global guidance, local oscillation, and dynamic control to achieve a fast convergence speed, high solution accuracy, and strong anti-interference and adaptability.
In the exploration phase of the SCGCRA, the location is calculated as follows:
$$x_{i,j}^{new\prime} = \begin{cases} x_{i,j}^{new} + r_1 \times \sin(r_2) \times \left| r_3 \times x_{k,j} - x_{i,j}^{new} \right|, & r_4 < 0.5 \\ x_{i,j}^{new} + r_1 \times \cos(r_2) \times \left| r_3 \times x_{k,j} - x_{i,j}^{new} \right|, & r_4 \geq 0.5 \end{cases} \tag{16}$$
$$x_i^{\prime} = \begin{cases} x_i + r_1 \times \sin(r_2) \times \left| r_3 \times x_{k,j} - x_i \right|, & r_4 < 0.5,\ F_i^{new} < F_i \\ x_i + r_1 \times \cos(r_2) \times \left| r_3 \times x_{k,j} - x_i \right|, & r_4 \geq 0.5,\ F_i^{new} < F_i \\ x_i + r_1 \times \sin(r_2) \times \left| r_3 \times x_{k,j} - x_i \right|, & r_4 < 0.5,\ F_i^{new} \geq F_i \\ x_i + r_1 \times \cos(r_2) \times \left| r_3 \times x_{k,j} - x_i \right|, & r_4 \geq 0.5,\ F_i^{new} \geq F_i \end{cases} \tag{17}$$
In the exploitation phase of the SCGCRA, the location is calculated as follows:
$$x_{i,j}^{new\prime} = \begin{cases} x_{i,j}^{new} + r_1 \times \sin(r_2) \times \left| r_3 \times x_{k,j} - x_{i,j}^{new} \right|, & r_4 < 0.5 \\ x_{i,j}^{new} + r_1 \times \cos(r_2) \times \left| r_3 \times x_{k,j} - x_{i,j}^{new} \right|, & r_4 \geq 0.5 \end{cases} \tag{18}$$
where $x_{i,j}^{new}$ denotes the location in the $j$th dimension produced by Equation (11) in exploration or Equation (13) in exploitation, $x_{i,j}^{new\prime}$ denotes the corresponding refined latest location, $x_i$ denotes the location of the $i$th GCR produced by Equation (12), $x_i^{\prime}$ denotes its refined latest location, $x_{k,j}$ denotes the dominant male GCR’s location, $r_2 \in [0, 2\pi]$, $r_3 \in [-2, 2]$, $r_4 \in [0, 1]$, and $r_1$ linearly decreases from 2 to 0.
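One SCGCRA exploration move, chaining the nonlinear GCRA step with the SCA refinement, can be sketched as follows; the simplified coefficient `r` is an illustrative stand-in for Equation (6), and NumPy is an assumed dependency:

```python
import numpy as np

def hybrid_exploration(x_i, x_k, t, T, rng):
    """Nonlinear GCRA step (Eq. (11)) followed by SCA refinement (Eq. (16))."""
    W = 2.0 * np.exp(-(8.0 * t / T) ** 2)    # nonlinear control weight (Eq. (10))
    C = rng.random()
    r = 1.0 - t / T                          # illustrative stand-in for Eq. (6)
    x_new = W * x_i + C * (x_k - r * x_i)    # Equation (11)

    r1 = 2.0 - t * (2.0 / T)                 # Equation (15)
    r2 = rng.uniform(0.0, 2.0 * np.pi)
    r3 = rng.uniform(-2.0, 2.0)
    osc = np.sin(r2) if rng.random() < 0.5 else np.cos(r2)
    return x_new + r1 * osc * np.abs(r3 * x_k - x_new)   # Equation (16)
```

The design intent is visible in the two stages: the GCRA step supplies a directed coarse move toward the dominant male, and the sine–cosine term superimposes a bounded oscillation whose amplitude $r_1$ shrinks over iterations.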
The SCGCRA combines three existing techniques: the GCRA, the SCA, and a nonlinear control strategy. The motivation for this specific combination is summarized as follows: (1) Core problem-driven: limitations of a single algorithm and the necessity of a hybrid design. The inherent characteristic of the GCRA is to simulate the foraging and escape behavior of GCRs in terms of food-source attraction and breeding-season grouping, achieving global detection and an extensive search of the solution space. However, the GCRA lacks a dynamic adjustment mechanism and a mathematically driven, refined exploitation strategy, relying instead on individual experience accumulation and strong randomness and dispersion, which limits its solution accuracy. The inherent characteristic of the SCA is based on the mathematical periodic oscillation of sine and cosine functions, which can systematically perform fine local exploitation near the current optimal solution; however, its amplitude coefficient is usually linearly attenuated, and the late-stage oscillation amplitude is too small to break through a local extremum. The SCGCRA achieves a balance between the breadth of global exploration and the accuracy of local exploitation by combining the biological behavior of the GCRA with the mathematical periodicity of the SCA, which yields strong robustness and adaptability to high-dimensional, multimodal, and dynamically constrained problems. (2) Complementarity: a collaborative mechanism between biological behavior and mathematical models. The GCRA employs an adaptive grouping strategy and a constraint attraction mechanism to balance global exploration and local exploitation dynamically, guiding the population to move towards the feasible domain naturally. The SCA utilizes periodic oscillation coverage and dynamic amplitude attenuation to frequently switch directions, thereby enhancing the ability to escape local optima, and it benefits from nonlinear control optimization.
The SCGCRA possesses strong global exploration and local exploitation capabilities, enabling hybrid collaboration, identifying potential high-quality areas, and enhancing solution accuracy. (3) Nonlinear control strategy: dynamic regulation and performance enhancement. The nonlinear control strategy adjusts SCGCRA parameters in real time through feedback mechanisms, achieving optimal matching between solution quality and parameter adjustment. It avoids premature convergence or excessive oscillation caused by fixed parameters, ensuring the stability and robustness of the search process.
Algorithm 2 portrays the pseudocode of the SCGCRA. Figure 4 portrays the flowchart of SCGCRA.
Algorithm 2 SCGCRA
Step 1. Initialize the GCR population $X_i$ $(i = 1, 2, \ldots, n)$
Step 2. Estimate the fitness of GCRs, renovate the global best solution ($G_{best}$)
            Sift the fittest GCR as the dominant male $x_k$
            Renovate the remaining GCRs stemming from $x_k$ via Equation (3)
Step 3. while  t < T do
                  for all GCRs
                           Renovate ρ , r , α , β , C , μ
                           if  r a n d < ρ
                               Exploration
                               The nonlinear control strategy is introduced into exploration of GCRA
                               Combine SCA with GCRA to enhance the global exploration efficiency
                               Renovate GCRs positions via Equations (11) and (16)
                           else
                               Exploitation
                               The nonlinear control strategy is introduced into exploitation of GCRA
                               Combine SCA with GCRA to enhance the local exploitation accuracy
                               Renovate GCRs positions via Equations (13) and (18)
                           end if
                     end for
                     Affirm whether any solution has overflowed the search interval and revise it
                     Estimate the fitness of GCRs stemming from a renewed location
                     The nonlinear control strategy is introduced into GCRA
                     Combine SCA with GCRA to enhance the exploration and exploitation
                     Renovate GCRs positions via Equations (12) and (17)
                     Renovate G b e s t and sift a renewed dominant male x k
                      t = t + 1
               end while
               Return  G b e s t
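The overall loop of Algorithm 2 can be sketched as an illustrative skeleton; the sphere objective, bounds, greedy acceptance, and simplified coupling of Equations (11), (13), (16), and (18) are assumptions, not the authors’ reference implementation:

```python
import numpy as np

def sphere(x):
    """Illustrative objective: f(x) = sum(x_i^2), global minimum 0 at the origin."""
    return np.sum(x**2)

def scgcra(obj, n=30, d=5, lb=-10.0, ub=10.0, T=200, seed=0):
    """Illustrative SCGCRA skeleton following the structure of Algorithm 2."""
    rng = np.random.default_rng(seed)
    X = rng.random((n, d)) * (ub - lb) + lb      # Equation (2)
    fit = np.array([obj(x) for x in X])
    k = int(np.argmin(fit))
    gbest, gbest_fit = X[k].copy(), fit[k]
    rho = 0.5                                    # rainy-season switch
    for t in range(T):
        W = 2.0 * np.exp(-(8.0 * t / T) ** 2)    # nonlinear weight (Eq. (10))
        r1 = 2.0 - t * (2.0 / T)                 # SCA amplitude (Eq. (15))
        r = gbest_fit - t * (gbest_fit / T)      # Equation (6)
        for i in range(n):
            C = rng.random()
            mu = rng.integers(1, 5)
            m = rng.integers(n)                  # random mate index
            if rng.random() < rho:               # exploration (Eq. (11))
                cand = W * X[i] + C * (gbest - r * X[i])
            else:                                # exploitation (Eq. (13))
                cand = W * X[i] + C * (gbest - mu * X[m])
            r2 = rng.uniform(0.0, 2.0 * np.pi)   # SCA refinement (Eqs. (16)/(18))
            osc = np.sin(r2) if rng.random() < 0.5 else np.cos(r2)
            cand = cand + r1 * osc * np.abs(rng.uniform(-2.0, 2.0) * gbest - cand)
            cand = np.clip(cand, lb, ub)         # revise overflowed solutions
            f = obj(cand)
            if f < fit[i]:                       # greedy acceptance (assumption)
                X[i], fit[i] = cand, f
        k = int(np.argmin(fit))
        if fit[k] < gbest_fit:
            gbest, gbest_fit = X[k].copy(), fit[k]
    return gbest, gbest_fit

best, best_fit = scgcra(sphere)
```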

4. Simulation Test and Result Analysis for Tackling Benchmark Functions

4.1. Experimental Disposition

The experimental disposition stipulated a 64-bit Windows 11 OS, a 12th Gen Intel(R) Core(TM) i9-12900HX 2.30 GHz CPU, 4 TB storage, an independent 16 GB graphics card, and 16 GB RAM. All comparison approaches were implemented in MATLAB R2022b.

4.2. Benchmark Functions

The SCGCRA employed unimodal functions ($f_1$–$f_7$), multimodal functions ($f_8$–$f_{12}$), and fixed-dimension multimodal functions ($f_{13}$–$f_{23}$) to validate its reliability and practicality. Table 1 outlines the benchmark functions.

4.3. Parameter Settings

To validate the practicality and applicability, the SCGCRA is compared with the CPO, BKA, EHO, PO, WSA, HLOA, ECO, IAO, AO, HEOA, NRBO, and GCRA. The parameter selection and sensitivity analysis are summarized as follows: (1) Inheritance principle: The control parameters of the SCGCRA are directly derived from the original parameters of the GCRA and SCA, and strictly inherit the widely validated recommended or default values in the original papers. These parameters have been extensively studied and proven to possess broad applicability, reliable representativeness, and strong robustness, so the necessity of repetitive experimental verification is relatively low. Modifying these highly standardized parameters would undermine the mathematical foundation and convergence guarantees of the SCGCRA. (2) Principle of cybernetics: The nonlinear control strategy smoothly transitions and dynamically adjusts the equivalent effects based on state variables, such as the iteration count and population diversity, utilizing adaptive compensation mechanisms that are insensitive to the initial absolute values. The nonlinear control strategy can automatically compensate for performance losses caused by parameter deviations, ensure robustness to parameter changes, theoretically guarantee the algorithm’s stability, and guide the search in a favorable direction. (3) Normalization principle: The few critical parameters that need to be set are nondimensionalized (such as values between 0 and 1, or parameters correlated with the number of iterations), which dramatically reduces the coupling between the selected parameters and the specific problem scale, making one set of parameters applicable to a class of problems. (4) Principle of structural superiority: The SCGCRA realizes the complementary advantages of the GCRA’s global exploration ability and the SCA’s local exploitation ability. The dynamic scheduling of the nonlinear control strategy makes the SCGCRA insensitive to subtle changes in parameters.
The performance improvement primarily comes from structural innovation rather than fine-tuning of parameters. The algorithm structure itself ensures good performance within a reasonable range of parameters.
CPO: invariable values $Q = 100$, $D_c = 0.6$, $\beta = 1.5$.
BKA: stochastic values $rand \in [0, 1]$, $r \in [0, 1]$, Cauchy mutation $C(0, 1)$; invariable values $p = 0.9$, $\delta = 1$, $\mu = 0$.
EHO: stochastic values $\alpha \in [0, 1]$, $\beta \in [0, 2]$, $\gamma \in [0, 2]$.
PO: invariable values $PF_1 = 0.5$, $PF_2 = 0.5$, $PF_3 = 0.3$, $U = 0.2$; stochastic values $L \in [0.7, 0.9]$, $\alpha \in [1, 2]$.
WSA: invariable values $a_0 = 0.3$, $c = 1.6$.
HLOA: hue circle angle $h \in [0, 2\pi]$; binary value $\sigma \in \{0, 1\}$; invariable values $= 2$, $v_0 = 1$, $\alpha = \pi/2$, $\varepsilon = 1 \times 10^{-6}$, $g = 0.009807$; stochastic values $Light \in [0, 0.4046661]$, $Dark \in [0.5440510, 1]$, $walk \in [-1, 1]$.
ECO: stochastic values $R_1 \in [0, 1]$, $R_2 \in [0, 1]$; invariable values $\gamma = 1.5$, $H = 0.5$, $E = 1$.
IAO: stochastic values $\vartheta \in [0, 1]$, $rand \in [0, 1]$, $v \in [0, 1]$, $\beta \in [0, 1]$, $\gamma \in [0, 1]$, $\delta \in [0, 1]$, $\varepsilon \in [0, 1]$, $\zeta \in [0, 1]$, $\kappa \in [0, 1]$, $w \in [0, 1]$.
AO: stochastic values $R \in [0, 1]$, $r_1 \in [0, 1]$, $d \in [0.1, 0.6]$.
HEOA: stochastic values $rand \in (0, 1)$, $R \in [0, 1]$; invariable values $\gamma = 1.5$, $A = 0.6$.
NRBO: stochastic values $rand \in (0, 1)$, $\delta \in [-1, 1]$, $a \in (0, 1)$, $b \in (0, 1)$, $r_1 \in (0, 1)$, $r_2 \in (0, 1)$, $\theta_1 \in (-1, 1)$, $\theta_2 \in (-0.5, 0.5)$, $\Delta \in (0, 1)$; invariable value $DF = 0.6$; binary value $\beta \in \{0, 1\}$.
GCRA: stochastic values $rand \in (0, 1)$, $C \in [0, 1]$, $\mu \in [1, 4]$; invariable value $\rho = 0.5$.
SCGCRA: stochastic values $rand \in (0, 1)$, $C \in [0, 1]$, $\mu \in [1, 4]$, $r_2 \in [0, 2\pi]$, $r_3 \in [-2, 2]$, $r_4 \in [0, 1]$; invariable values $\rho = 0.5$, $a = 2$.
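The SCA components listed above enter the position update through the classic sine–cosine rule $X' = X + r_1 \sin(r_2)\,|r_3 P - X|$ (or the cosine branch when $r_4 \ge 0.5$). The sketch below illustrates one such update step; the quadratic decay schedule for $r_1$ is an illustrative assumption, not the paper's exact nonlinear control law.

```python
import math
import random

def sca_step(x, dest, t, T, a=2.0):
    """One SCA-style update applied to a single solution vector.

    x    : current position (list of floats)
    dest : destination (best-so-far) position P
    t, T : current iteration and maximum number of iterations
    a    : amplitude constant (a = 2 in the parameter list above)
    """
    # Nonlinear decay of the amplitude r1 from a down to 0; this
    # quadratic schedule is an assumption for illustration only.
    r1 = a * (1.0 - (t / T) ** 2)
    new_x = []
    for xi, di in zip(x, dest):
        r2 = random.uniform(0.0, 2.0 * math.pi)   # angular component
        r3 = random.uniform(-2.0, 2.0)            # destination weight
        r4 = random.random()                      # sine/cosine switch
        if r4 < 0.5:
            xi = xi + r1 * math.sin(r2) * abs(r3 * di - xi)
        else:
            xi = xi + r1 * math.cos(r2) * abs(r3 * di - xi)
        new_x.append(xi)
    return new_x
```

Because $r_1 \to 0$ as $t \to T$, the perturbation amplitude vanishes at the final iteration, which is what produces the repeated expansion and contraction around the destination point.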

4.4. Simulation Test and Result Analysis

To objectively and comprehensively evaluate the convergence characteristics and solution accuracy of each algorithm, the population size, maximum number of iterations, and number of independent runs were kept consistent at 50, 1000, and 30, respectively. Table 2 outlines the contrastive results on the benchmark functions.
The SCGCRA was employed to resolve the benchmark functions; the core objective was to break through the limitations of the original GCRA and the bottleneck of local optima, enhance the adaptability and reliability on strongly constrained functions, promote detection efficiency and solution quality, strengthen stability and robustness, diminish parameter sensitivity and result fluctuations, and identify the global optimum or a high-quality approximate solution. The optimal value (Best), worst value (Worst), mean value (Mean), and standard deviation (Std) systematically validate the core characteristics and reflect reliability and applicability from different dimensions. The optimal value is the objective fitness value corresponding to the fittest solution over multiple independent runs, which verifies global detectability and convergence accuracy. When the optimal value approaches the theoretical global optimum, the algorithm has a strong potential to excavate high-quality solutions and can effectively break out of local optima. A rapid decrease and stabilization of the optimal value during the iteration process indicate that the algorithm has a fast convergence speed and high solution accuracy. The worst value is the objective fitness value corresponding to the worst solution over multiple independent runs, which estimates robustness and adaptability to extreme scenarios. The smaller the disparity between the worst value and the optimal value, the lower the sensitivity of the algorithm to the initial population, parameter settings, a complex solution domain, or randomness, and the stronger the robustness. The modified strategy significantly surpasses the original algorithm in terms of the worst value: it effectively reduces the number of extremely poor solutions during the search process, thereby avoiding infeasible solutions and extreme local convergence.
The mean value is the arithmetic mean of the objective fitness values of all solutions over multiple independent runs, which reveals the overall average performance, search efficiency, and general applicability. It assesses the stability of the algorithm and the uniformity of the distribution of the solution set. The extent to which the mean value approaches the theoretical optimal solution evaluates the overall search efficiency of the algorithm, designates the correctness of the overall search direction, and avoids search stagnation due to local optima. The standard deviation is the degree of dispersion of the objective fitness values around the mean value, which measures the stability and distribution uniformity of the algorithm. A smaller standard deviation indicates that the algorithm exhibits less fluctuation in the convergence results, higher stability, and stronger repeatability. The standard deviation also extends to the degree of discrepancy in the objective space, which is used to evaluate the impact of the modified approach on solution diversity. For unimodal functions, the solution space comprises a unique global extremum point without local extremum points. The monotonicity can swiftly narrow the search scope, diminish ineffective exploration, and clarify the direction of gradient descent. The core requirement was to rapidly and precisely approximate the optimal solution, which validates the algorithm's local mining accuracy, convergence efficiency, and adaptability to simple solution spaces. For $f_1$, $f_2$, $f_3$, and $f_4$, the optimal values, worst values, mean values, and standard deviations of the IAO, GCRA, and SCGCRA remained consistent and exhibited optimal extreme solutions.
The quantitative metrics, detection efficiency, and solution accuracy of the SCGCRA were superior to those of the CPO, BKA, EHO, PO, WSA, HLOA, ECO, AO, HEOA, and NRBO, and the SCGCRA utilized the aggregation effect of dominant male GCRs and the decentralized search of the population to overlap the global exploration scope, locate potential optimal areas, strengthen local exploitation, and enhance solution accuracy. For $f_5$, $f_6$, and $f_7$, the SCGCRA not only achieved a small improvement in quantitative metrics, detection efficiency, and solution accuracy but also significantly outperformed the other algorithms. The SCGCRA utilized the periodic fluctuations of sine and cosine functions to furnish multi-directional perturbation paths, actualize high-precision local convergence, avert search oscillation, narrow the search solution space, and ensure population diversity. For multimodal functions $f_8$–$f_{12}$, the solution space comprised multiple local extremum points, some of which were close in solution quality to the global extremum point. The core requirement was to avoid search stagnation, surmount local traps, and identify the global optimum, which validates the algorithm's global search capability and the maintenance and activation of population diversity. For $f_8$ and $f_{10}$, the optimal values, worst values, mean values, and standard deviations of the CPO, BKA, PO, WSA, HLOA, ECO, IAO, NRBO, GCRA, and SCGCRA remained consistent and exhibited optimal extreme solutions, which were superior to those of the EHO and AO. The SCGCRA integrated the nonlinear control strategy of the GCRA and the periodic oscillatory fluctuation of the SCA to guide dominant individuals, adjust dynamic step sizes, and enhance adaptability and complementarity. For $f_9$, $f_{11}$, and $f_{12}$, the quantitative metrics, detection efficiency, and solution accuracy of the SCGCRA were better than those of the CPO, BKA, EHO, PO, WSA, HLOA, ECO, IAO, AO, HEOA, NRBO, and GCRA.
The SCGCRA prioritized strong reliability and practicality in scanning the solution scope, facilitating extensive global exploration, enhancing group collaboration and guidance, discovering precise optimal solutions, and avoiding premature convergence. For fixed-dimension multimodal functions $f_{13}$–$f_{23}$, the solution space retained multiple local extremum points of multimodal functions and a unique global extremum point of unimodal functions, adopted fixed dimensionality to control complexity, and eliminated the interference of dimensionality changes. The core requirement was to resist performance degradation caused by increased dimensionality, which validates the algorithm's global stability, robustness, and consistency, the decoupling of high-dimensional variables and focusing of the search direction, adaptability to the curse of dimensionality, and equilibrium ability between detection and exploitation. For $f_{13}$, $f_{14}$, $f_{15}$, $f_{16}$, $f_{17}$, $f_{21}$, $f_{22}$, and $f_{23}$, the optimal values, worst values, and mean values of the SCGCRA remained consistent and exhibited optimal extreme solutions; the quantitative metrics, detection efficiency, and solution accuracy of the SCGCRA were superior to those of the CPO, BKA, EHO, PO, WSA, HLOA, ECO, IAO, AO, HEOA, NRBO, and GCRA. The SCGCRA exhibited strong stability and robustness in effectively guiding and regulating the initial population distribution, reducing dependence on initial conditions, steadily searching towards the optimal solution, and diminishing the algorithm's variability. For $f_{18}$, $f_{19}$, and $f_{20}$, the SCGCRA exhibited strong superiority and operability in terms of the quantitative metrics, detection efficiency, and solution accuracy.
The SCGCRA utilized the random fluctuation and periodicity of SCA, the group collaboration and guidance mechanism of GCRA, and dynamic adjustment and search mechanism of the nonlinear control strategy to explore the solution scope extensively and intensively identify potential high-quality regions; meticulously exploit the extremum solutions; effectively avoid blind large-scale search; strictly avert search stagnation and slow convergence; and preferably balance exploration and exploitation to approximate the optimal solution.
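The four summary statistics used throughout this section (Best, Worst, Mean, Std) are computed per function from the final objective values of the 30 independent runs. A minimal sketch of that tabulation follows; the sample standard deviation is assumed, since the text does not state which form is used.

```python
import statistics

def run_statistics(final_fitness):
    """Summarize the final objective values of repeated independent
    runs into the four metrics reported in Table 2.

    final_fitness : list of best objective values, one per run.
    """
    return {
        "Best": min(final_fitness),                    # fittest run
        "Worst": max(final_fitness),                   # worst run
        "Mean": statistics.fmean(final_fitness),       # average performance
        "Std": statistics.stdev(final_fitness),        # sample std (assumed)
    }
```

For a minimization problem, a Best near the theoretical optimum indicates convergence accuracy, a small Worst-to-Best gap indicates robustness, and a small Std indicates repeatability, exactly as argued above.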

4.5. Convergence Analysis

Figure 5 portrays the convergence curves of the SCGCRA and comparative algorithms for addressing the benchmark functions. The convergence curves can be used to determine whether an algorithm can efficiently approximate the optimal solution and to measure solution accuracy by observing steep descent, plateau periods, and oscillatory segments. They also allow convergence efficiency to be quantified by observing the number of iterations required to reach a specified stable value. The optimal value and mean value jointly reveal the search efficiency and convergence accuracy of the algorithm. If the disparity between the optimal value and the average value is significant, it indicates that the algorithm has prematurely fallen into a local optimum. If the discrepancy between the optimal value and the average value is slight and continues to decrease, it indicates that the algorithm exhibits a strong equilibrium between global exploration and local exploitation. For unimodal functions, the numerous quantitative metrics, detection efficiency, and solution accuracy of the SCGCRA were superior to those of the CPO, BKA, EHO, PO, WSA, HLOA, ECO, IAO, AO, HEOA, NRBO, and GCRA. The SCGCRA utilized the territorial foraging behavior and dominant individual guidance mechanism of the GCRA to quickly locate potential optimal areas, avoid aimless search, cover the solution space, remedy the disequilibrium between exploration and exploitation, and enhance robustness and reliability. For multimodal functions, the SCGCRA exhibited remarkable advantages and superiority in terms of numerous quantitative metrics, detection efficiency, and solution accuracy compared to the other comparative algorithms. The SCGCRA utilized the mathematical periodic oscillatory fluctuation of the SCA to enhance local exploitation, quantify solution accuracy, avoid dimensional disaster, achieve synergistic complementarity, and reduce sensitivity, thereby increasing population diversity and flexibility.
For fixed-dimension multimodal functions $f_{13}$–$f_{23}$, the multitudinous optimal values, mean values, detection efficiency, and solution accuracy of the SCGCRA outperformed those of the comparative algorithms. The SCGCRA employed a nonlinear control strategy to exhibit strong anti-disturbance ability and stability, thereby enhancing adaptability, operability, and practicality; facilitating repeated expansion and contraction; and improving dynamic regulation and solution quality. To summarize, the SCGCRA integrated the periodic oscillatory fluctuation characteristics of the SCA and the dynamic regulation and decision-making of a nonlinear control strategy to provide clear global search directions, avoid direction dispersion and redundant detection, ensure complete coverage of the solution space, enhance population diversity, and achieve good detection efficiency and solution accuracy.
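A convergence curve of the kind shown in Figure 5 plots the best objective value found so far at each iteration. A minimal sketch of how such a trace is recorded, as the monotone non-increasing envelope of the per-iteration bests:

```python
def best_so_far(history):
    """Convert per-iteration best objective values into the monotone
    non-increasing best-so-far trace plotted in a convergence curve.

    history : list of the best objective value observed at each iteration.
    """
    trace, best = [], float("inf")
    for v in history:
        best = min(best, v)   # keep the running minimum (minimization)
        trace.append(best)
    return trace
```

Steep early descent of this trace corresponds to fast convergence, while a long flat plateau before the final value signals search stagnation in a local optimum.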

4.6. Boxplot Analysis

Figure 6 portrays boxplots of the SCGCRA and comparative algorithms for addressing the benchmark functions. The boxplots can quantify the stability and reliability of solutions, reflect the changes in solution diversity over time, and describe the dispersion of individual solutions within the population. The standard deviation intuitively demonstrates the sensitivity of the algorithm to the initial population and to parameter perturbations. A continuous decrease in the standard deviation indicates that the algorithm maintains good convergence, forcing the population towards the global extremum solution. If the standard deviation jumps abruptly or oscillates, the algorithm may have fallen into a local optimum or may require parameter adjustment. The standard deviation and worst value jointly reveal the stability and robustness of the algorithm. If the standard deviation is slight and the worst value closely approaches the optimal value, the algorithm is insensitive to randomness and adaptable to initial conditions. The standard deviation has strong stability and expansibility in assessing the uniformity of the distribution of the solution set (such as the dispersity of solutions on the Pareto front in multi-objective optimization) and in verifying the profound impact of improvement strategies on solution diversity. For unimodal functions $f_1$–$f_7$, the multitudinous standard deviation and dispersion of the SCGCRA were superior to those of the other comparative algorithms. The SCGCRA exhibited strong adaptability and operability in overcoming the drawbacks of high parameter sensitivity, insufficient solution accuracy, high computational complexity, susceptibility to local optima and overfitting, poor dynamic adaptability, and the severe curse of dimensionality. For multimodal functions $f_8$–$f_{12}$, compared with the other comparative algorithms, the SCGCRA exhibited remarkable advantages and superiority in terms of the multitudinous standard deviation and dispersion.
The SCGCRA utilized the intelligent foraging behavior and social collaboration mechanism of GCRs to simulate territory exploration, path tracking, and reproduction strategies. The SCGCRA employed a nonlinear control strategy to ensure solution accuracy and stability, facilitate dynamic regulation, and improve solution quality. For fixed-dimension multimodal functions $f_{13}$–$f_{23}$, the multitudinous standard deviation and dispersion of the SCGCRA outperformed those of the comparative algorithms. The SCGCRA exhibited strong reliability and practicality, facilitating mating during the rainy season and non-mating during the dry season. This enabled the expansion of the search range, updated population positions, and enhanced population diversity, ultimately identifying the globally optimal precise solution. To summarize, the SCGCRA not only exhibited substantial superiority and adaptability in attaining a superior standard deviation and dispersion and enhancing stability and robustness, but also exhibited strong practicality and reliability in balancing global coarse exploration and local refined exploitation, overlapping the global exploration scope, locating potential optimal areas, strengthening local exploitation, and enhancing solution accuracy.

4.7. Wilcoxon Rank-Sum Test

The Wilcoxon rank-sum test is a non-parametric statistical approach for paired data, which quantifies whether the overall discrepancy between the SCGCRA and the other algorithms is statistically significant without relying on assumptions about the data distribution [27]. $p < 0.05$ indicates a statistically significant difference, $p \ge 0.05$ indicates a non-significant difference, and N/A designates "not applicable". Table 3 portrays the contrastive results of the p-value Wilcoxon rank-sum test on the benchmark functions. The SCGCRA exhibits strong stability and reliability in acquiring genuine and effective data rather than accidental data.
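A self-contained sketch of the two-sided rank-sum test used here, via the large-sample normal approximation (adequate for 30 runs per algorithm). Ties receive average ranks; no tie correction is applied in this simplified version.

```python
import math

def rank_sum_p(sample_a, sample_b):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation.

    sample_a, sample_b : final objective values from two algorithms.
    """
    combined = sorted((v, i) for i, v in enumerate(sample_a + sample_b))
    ranks = [0.0] * len(combined)
    k = 0
    while k < len(combined):
        j = k
        while j + 1 < len(combined) and combined[j + 1][0] == combined[k][0]:
            j += 1
        avg = (k + j) / 2 + 1          # average rank for a tied group
        for m in range(k, j + 1):
            ranks[combined[m][1]] = avg
        k = j + 1
    n1, n2 = len(sample_a), len(sample_b)
    w = sum(ranks[:n1])                # rank sum of the first sample
    mean = n1 * (n1 + n2 + 1) / 2      # null mean of the rank sum
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mean) / sd
    return math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
```

In practice `scipy.stats.ranksums` provides the same statistic; the hand-rolled version is shown only to make the ranking and normal approximation explicit.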

5. SCGCRA for Tackling Engineering Designs

To validate adaptability and practicality, the SCGCRA was utilized to tackle the constrained real-world engineering designs: three-bar truss design [28], piston lever design [29], gear train design [30], car side impact design [31], multiple-disk clutch brake design [32], and rolling element bearing design [33].

5.1. Three-Bar Truss Design

The dominant motivation was to weaken the cumulative weight, which incorporated two quantitative metrics: the cross-sectional areas $A_1$ and $A_2$. Figure 7 portrays the sketch map of the three-bar truss design.
Consider
$x = [x_1 \; x_2] = [A_1 \; A_2]$
Minimize
$f(x) = (2\sqrt{2}x_1 + x_2) \times l$
Subject to
$g_1(x) = \frac{\sqrt{2}x_1 + x_2}{\sqrt{2}x_1^2 + 2x_1x_2}P - \sigma \le 0$
$g_2(x) = \frac{x_2}{\sqrt{2}x_1^2 + 2x_1x_2}P - \sigma \le 0$
$g_3(x) = \frac{1}{\sqrt{2}x_2 + x_1}P - \sigma \le 0$
$l = 100\ \text{cm}, \quad P = 2\ \text{kN/cm}^2, \quad \sigma = 2\ \text{kN/cm}^2$
Variable range
$0 \le x_1, x_2 \le 1$
Table 4 outlines the contrastive results of the three-bar truss design. The SCGCRA used the periodic oscillations of the sine and cosine functions to achieve large-scale and multi-directional coverage of the solution space without relying on neighborhood continuity. This can directly cross distant subregions in high-dimensional space, avoiding the traversal blind spots caused by a single GCRA dominated by local neighborhood search. The SCGCRA employed a large step size for strong fluctuation and a small step size for weak contraction to lock onto the high-quality regions, approximate the computational extreme solution, weaken redundant searches, and accelerate convergence speed. The global extremum solution was materialized by the SCGCRA at quantitative metrics: 0.78645 and 0.41813, with the optimum weight of 263.8543.
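The objective and constraints above can be evaluated directly. The sketch below returns the weight and the three constraint values (feasible when each $g_i \le 0$); it is checked against the best-known solution widely reported in the literature ($x_1 \approx 0.7887$, $x_2 \approx 0.4082$, $f \approx 263.896$), which is used here only as a reference point.

```python
import math

def three_bar_truss(x, l=100.0, P=2.0, sigma=2.0):
    """Objective and constraints of the three-bar truss design.

    Returns (weight, [g1, g2, g3]); each g must be <= 0 for feasibility.
    """
    x1, x2 = x
    f = (2 * math.sqrt(2) * x1 + x2) * l
    denom = math.sqrt(2) * x1 ** 2 + 2 * x1 * x2
    g1 = (math.sqrt(2) * x1 + x2) / denom * P - sigma   # stress in bar 1
    g2 = x2 / denom * P - sigma                         # stress in bar 2
    g3 = 1 / (math.sqrt(2) * x2 + x1) * P - sigma       # stress in bar 3
    return f, [g1, g2, g3]
```

A penalty-based optimizer would add a large multiple of each positive $g_i$ to the weight before comparing candidate solutions.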

5.2. Piston Lever Design

The dominant motivation was to weaken (attenuate) the oil volume by ascertaining the components as the piston lever is elevated from $0°$ to $45°$, which incorporated four quantitative metrics: $H$, $B$, $X$, and $D$. Figure 8 portrays the sketch map of the piston lever design.
Consider
$x = [x_1 \; x_2 \; x_3 \; x_4] = [H \; B \; D \; X]$
Minimize
$f(x) = \frac{1}{4}\pi x_3^2 (L_2 - L_1)$
Subject to
$g_1(x) = QL\cos\theta - RF \le 0$
$g_2(x) = Q(L - x_4) - M_{\max} \le 0$
$g_3(x) = \frac{6}{5}(L_2 - L_1) - L_1 \le 0$
$g_4(x) = \frac{x_3}{2} - x_2 \le 0$
$R = \frac{\left| -x_4(x_4\sin\theta + x_1) + x_1(x_2 - x_4\cos\theta) \right|}{\sqrt{(x_4 - x_2)^2 + x_1^2}}$
$F = \frac{\pi P x_3^2}{4}$
$L_1 = \sqrt{(x_4 - x_2)^2 + x_1^2}$
$L_2 = \sqrt{(x_4\sin\theta + x_1)^2 + (x_2 - x_4\cos\theta)^2}$
$\theta = 45°, \quad Q = 10000\ \text{lbs}, \quad L = 240\ \text{in}, \quad M_{\max} = 1.8 \times 10^6\ \text{lbs·in}, \quad P = 1500\ \text{psi}$
Variable range
$0.05 \le x_1, x_2, x_4 \le 500, \quad 0.05 \le x_3 \le 120$
Table 5 outlines the contrastive results of the piston lever design. The SCGCRA simulated the regional search and information sharing behavior of GCRs, which adopts individual position interaction and group optimal guidance to systematically cover the multimodal space of nonlinear problems, fully detect globally, accurately mine locally, and balance optimization efficiency and stability. The SCGCRA utilized the periodic oscillatory fluctuations of the SCA to overcome the exploration limitations of a population converging towards dominant individuals, ensure the population focuses on nuanced exploration in high-quality areas, and prevent the step size from rigidly oscillating near the extremal solution. The global extremum solution was materialized by the SCGCRA at quantitative metrics: 0.05, 0.125364154, 120, and 4.12410157, with the optimum weight of 7.794.

5.3. Gear Train Design

The dominant motivation was to weaken (attenuate) the gear ratio's optimum cost, which incorporates four quantitative metrics: the tooth counts of the gear train $n_A$, $n_B$, $n_C$, and $n_D$. Figure 9 portrays the sketch map of the gear train design.
Consider
$x = [x_1 \; x_2 \; x_3 \; x_4] = [n_A \; n_B \; n_C \; n_D]$
Minimize
$f(x) = \left( \frac{1}{6.931} - \frac{x_3 x_2}{x_1 x_4} \right)^2$
Variable range
$12 \le x_i \le 60, \quad i = 1, 2, \ldots, 4$
Table 6 outlines the contrastive results of the gear train design. The SCGCRA utilized the sine direction adjustment and cosine step size control to achieve periodic small-scale fluctuations within the potential optimal region locked by GCRA, dynamically decreased the step size with iteration, and compensated for the shortcomings of slow convergence speed and low solution accuracy of a single GCRA in the later stage. The SCGCRA exhibited strong adaptability and practicality, facilitating the exploration of fluctuations and targeted exploitation, thereby covering the solution space, triggering population diversity, avoiding chaotic search, and enhancing resistance to the curse of dimensionality. The SCGCRA materialized the global extremum solution at quantitative metrics: 50, 22, 19, and 52, with the optimum cost of 3.25   ×   10 18 .
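The gear train objective above is a simple squared ratio error over integer tooth counts. A minimal sketch follows, checked against the widely reported near-optimal integer solution $(49, 16, 19, 43)$ from the literature, used here only as a reference point.

```python
def gear_ratio_error(x):
    """Squared deviation of the gear ratio n_C*n_B / (n_A*n_D) from
    the target 1/6.931; x = (nA, nB, nC, nD), integer teeth counts
    in [12, 60]."""
    nA, nB, nC, nD = x
    return (1 / 6.931 - (nC * nB) / (nA * nD)) ** 2
```

Because the variables are integers, candidate positions produced by a continuous optimizer are typically rounded before evaluation, which is why several distinct tooth combinations can reach near-zero cost.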

5.4. Car Side Impact Design

The dominant motivation was to weaken (attenuate) the cumulative weight, which incorporates 11 quantitative metrics: the thicknesses of the B-pillar inner ($x_1$), B-pillar reinforcement ($x_2$), floor side inner ($x_3$), cross members ($x_4$), door beam ($x_5$), door beltline reinforcement ($x_6$), and roof rail ($x_7$); the materials of the B-pillar inner ($x_8$) and floor side inner ($x_9$); the barrier height ($x_{10}$); and the hitting position ($x_{11}$). Figure 10 portrays the sketch map of the car side impact design.
Consider
$x = [x_1 \; x_2 \; x_3 \; x_4 \; x_5 \; x_6 \; x_7 \; x_8 \; x_9 \; x_{10} \; x_{11}]$
Minimize
$f(x) = 1.98 + 4.90x_1 + 6.67x_2 + 6.98x_3 + 4.01x_4 + 1.78x_5 + 2.73x_7$
Subject to
$g_1(x) = 1.16 - 0.3717x_2x_4 - 0.00931x_2x_{10} - 0.484x_3x_9 + 0.01343x_6x_{10} \le 1$
$g_2(x) = 0.261 - 0.0159x_1x_2 - 0.188x_1x_8 - 0.019x_2x_7 + 0.0144x_3x_5 + 0.0008757x_5x_{10} + 0.080405x_6x_9 + 0.00139x_8x_{11} + 0.00001575x_{10}x_{11} \le 0.32$
$g_3(x) = 0.214 + 0.00817x_5 - 0.131x_1x_8 - 0.0704x_1x_9 + 0.03099x_2x_6 - 0.018x_2x_7 + 0.0208x_3x_8 + 0.121x_3x_9 - 0.00364x_5x_6 + 0.0007715x_5x_{10} - 0.000535x_6x_{10} + 0.00121x_8x_{11} \le 0.32$
$g_4(x) = 0.074 - 0.061x_2 - 0.163x_3x_8 + 0.001232x_3x_{10} - 0.166x_7x_9 + 0.227x_2^2 \le 0.32$
$g_5(x) = 28.98 + 3.818x_3 - 4.2x_1x_2 + 0.0207x_5x_{10} + 6.63x_6x_9 - 7.7x_7x_8 + 0.32x_9x_{10} \le 32$
$g_6(x) = 33.86 + 2.95x_3 + 0.1792x_{10} - 5.057x_1x_2 - 11.0x_2x_8 - 0.0215x_5x_{10} - 9.98x_7x_8 + 22.0x_8x_9 \le 32$
$g_7(x) = 46.36 - 9.9x_2 - 12.9x_1x_8 + 0.1107x_3x_{10} \le 32$
$g_8(x) = 4.72 - 0.5x_4 - 0.19x_2x_3 - 0.0122x_4x_{10} + 0.009325x_6x_{10} + 0.000191x_{11}^2 \le 4$
$g_9(x) = 10.58 - 0.674x_1x_2 - 1.95x_2x_8 + 0.02054x_3x_{10} - 0.0198x_4x_{10} + 0.028x_6x_{10} \le 9.9$
$g_{10}(x) = 16.45 - 0.489x_3x_7 - 0.843x_5x_6 + 0.0432x_9x_{10} - 0.0556x_9x_{11} - 0.000786x_{11}^2 \le 15.7$
Variable range
$0.5 \le x_1, \ldots, x_7 \le 1.5, \quad x_8, x_9 \in \{0.192, 0.345\}, \quad -30 \le x_{10}, x_{11} \le 30$
Table 7 outlines the contrastive results of the car side impact design. The GCRA did not require complex dimensional decoupling to update positions, which used relative position adjustment between individual and population optimal solutions to achieve optimization, naturally handling multi-parameter coupling relationships. The nonlinear control strategy provided dynamic feedback regulation to suppress fluctuations and quickly switch between the GCRA and SCA to enhance exploration or exploitation when the SCGCRA deviated from the optimal region. The SCGCRA possessed strong scalability and adaptability, which reduced the sensitivity of the control parameters, promoted population diffusion, avoided premature convergence, focused on the optimal region, and eschewed search disorder. The global extremum solution was materialized by the SCGCRA at quantitative metrics: 0.5, 1.11643, 0.5, 1.30208, 0.5, 1.5, 0.5, 0.345, 0.192, −19.54935, and −0.00431, with the optimum weight of 22.84294.
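The weight objective above involves only seven of the eleven variables. A minimal sketch of its evaluation follows (the remaining variables enter only the safety constraints $g_1$–$g_{10}$); at the SCGCRA solution listed above it reproduces the optimum weight reported in Table 7.

```python
def car_side_impact_weight(x):
    """Structural weight objective of the car side impact design.

    x : (x1, ..., x11); x6, x8-x11 do not appear in the weight,
    only in the safety constraints g1-g10.
    """
    x1, x2, x3, x4, x5, x6, x7 = x[:7]
    return (1.98 + 4.90 * x1 + 6.67 * x2 + 6.98 * x3
            + 4.01 * x4 + 1.78 * x5 + 2.73 * x7)
```

A full evaluator would append the ten constraint responses and penalize any violation before ranking candidates, exactly as in the three-bar truss case.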

5.5. Multiple-Disk Clutch Brake Design

The dominant motivation was to weaken (attenuate) the cumulative weight, which incorporates five quantitative metrics: the thickness of the disks ($t$), inner radius ($r_i$), outer radius ($r_o$), actuating force ($F$), and number of friction surfaces ($Z$). Figure 11 portrays the sketch map of the multiple-disk clutch brake.
Consider
$x = [x_1 \; x_2 \; x_3 \; x_4 \; x_5] = [r_i \; r_o \; t \; F \; Z]$
Minimize
$f(x) = \pi t \rho (r_o^2 - r_i^2)(Z + 1)$
Subject to
$g_1(x) = r_o - r_i - \Delta r \ge 0$
$g_2(x) = l_{\max} - (Z + 1)(t + \delta) \ge 0$
$g_3(x) = p_{\max} - p_{rz} \ge 0$
$g_4(x) = p_{\max} v_{sr\,\max} - p_{rz} v_{sr} \ge 0$
$g_5(x) = v_{sr\,\max} - v_{sr} \ge 0$
$g_6(x) = T_{\max} - T \ge 0$
$g_7(x) = M_h - s M_s \ge 0$
$g_8(x) = T \ge 0$
$M_h = \frac{2}{3}\mu F Z \frac{r_o^3 - r_i^3}{r_o^2 - r_i^2}$
$p_{rz} = \frac{F}{\pi (r_o^2 - r_i^2)}$
$v_{sr} = \frac{2\pi n (r_o^3 - r_i^3)}{90 (r_o^2 - r_i^2)}$
$T = \frac{I_z \pi n}{30 (M_h + M_f)}$
$\Delta r = 20\ \text{mm}, \quad I_z = 55\ \text{kg·mm}^2, \quad p_{\max} = 1\ \text{MPa}, \quad F_{\max} = 1000\ \text{N}$
$T_{\max} = 15\ \text{s}, \quad \mu = 0.5, \quad s = 1.5, \quad M_s = 40\ \text{Nm}$
$M_f = 3\ \text{Nm}, \quad n = 250\ \text{rpm}$
$v_{sr\,\max} = 10\ \text{m/s}, \quad l_{\max} = 30\ \text{mm}, \quad r_{i\,\min} = 60$
$r_{i\,\max} = 80, \quad r_{o\,\min} = 90$
$r_{o\,\max} = 110, \quad t_{\min} = 1.5, \quad t_{\max} = 3, \quad F_{\min} = 600$
$F_{\max} = 1000, \quad Z_{\min} = 2, \quad Z_{\max} = 9$
Table 8 outlines the contrastive results of the multiple-disk clutch brake design. The SCA possesses a strong fine-tuning ability, ensuring the precise matching of control parameters with nonlinear system characteristics, thereby meeting the dual requirements of fast response and minimal system overshoot. The SCGCRA achieved closed-loop coordination between a nonlinear control strategy and GCRA optimization, which received control parameter feedback, thereby enhancing the robustness of the control system. The SCGCRA utilized the global coarse exploration and the local refined exploitation to promote search efficiency and flexibility, enhance convergence speed and solution accuracy, avoid search stagnation and dimensional disaster, strengthen adaptability and operability, and facilitate dynamic regulation and solution quality. The SCGCRA materialized the global extremum solution at quantitative metrics: 70, 90, 1, 600, and 2, with the optimum weight of 0.235247.

5.6. Rolling Element Bearing Design

The dominant motivation was to maximize the dynamic load-bearing capacity and optimize its cost, which incorporates 10 quantitative metrics: the pitch diameter ($D_m$), ball diameter ($D_b$), number of balls ($Z$), inner ($f_i$) and outer ($f_o$) raceway curvature coefficients, $K_{D\min}$, $K_{D\max}$, $\varepsilon$, $e$, and $\zeta$. Figure 12 portrays the sketch map of the rolling element bearing design.
Consider
$x = [x_1 \; x_2 \; x_3 \; x_4 \; x_5 \; x_6 \; x_7 \; x_8 \; x_9 \; x_{10}] = [D_m \; D_b \; Z \; f_i \; f_o \; K_{D\min} \; K_{D\max} \; \varepsilon \; e \; \zeta]$
Maximize
$C_d = \begin{cases} f_c Z^{2/3} D_b^{1.8}, & \text{if } D_b \le 25.4\ \text{mm} \\ 3.647 f_c Z^{2/3} D_b^{1.4}, & \text{if } D_b > 25.4\ \text{mm} \end{cases}$
Subject to
$g_1(x) = \frac{\phi_0}{2\sin^{-1}(D_b/D_m)} - Z + 1 \ge 0$
$g_2(x) = 2D_b - K_{D\min}(D - d) \ge 0$
$g_3(x) = K_{D\max}(D - d) - 2D_b \ge 0$
$g_4(x) = \zeta B_\omega - D_b \ge 0$
$g_5(x) = D_m - 0.5(D + d) \ge 0$
$g_6(x) = (0.5 + e)(D + d) - D_m \ge 0$
$g_7(x) = 0.5(D - D_m - D_b) - \varepsilon D_b \ge 0$
$g_8(x) = f_i \ge 0.515$
$g_9(x) = f_o \ge 0.515$
$f_c = 37.91\left[1 + \left\{1.04\left(\frac{1 - r}{1 + r}\right)^{1.72}\left(\frac{f_i(2f_o - 1)}{f_o(2f_i - 1)}\right)^{0.41}\right\}^{10/3}\right]^{-0.3} \frac{r^{0.3}(1 - r)^{1.39}}{(1 + r)^{1/3}}\left(\frac{2f_i}{2f_i - 1}\right)^{0.41}$
$x = \left(\frac{D - d}{2} - 3\frac{T}{4}\right)^2 + \left(\frac{D}{2} - \frac{T}{4} - D_b\right)^2 - \left(\frac{d}{2} + \frac{T}{4}\right)^2$
$y = 2\left(\frac{D - d}{2} - 3\frac{T}{4}\right)\left(\frac{D}{2} - \frac{T}{4} - D_b\right)$
$\phi_o = 2\pi - 2\cos^{-1}\left(\frac{x}{y}\right)$
$r = \frac{D_b}{D_m}, \quad f_i = \frac{r_i}{D_b}, \quad f_o = \frac{r_o}{D_b}, \quad T = D - d - 2D_b$
$D = 160, \quad d = 90, \quad B_\omega = 30, \quad r_i = r_o = 11.033$
Variable range
$0.5(D + d) \le D_m \le 0.6(D + d)$
$0.15(D - d) \le D_b \le 0.45(D - d)$
$4 \le Z \le 50, \quad 0.515 \le f_i, f_o \le 0.6$
$0.4 \le K_{D\min} \le 0.5, \quad 0.6 \le K_{D\max} \le 0.7$
$0.3 \le \varepsilon \le 0.4, \quad 0.02 \le e \le 0.1, \quad 0.6 \le \zeta \le 0.85$
Table 9 outlines the contrastive results of the rolling element bearing design. The GCRA and SCA both contain control parameters that need to be set. The introduction of a nonlinear control strategy compensated for the performance loss caused by suboptimal sub-algorithm parameters through its adaptive scheduling mechanism. The nonlinear control strategy dynamically adjusted the dominance of GCRA and SCA based on real-time feedback signals such as iteration progress, population diversity, or convergence speed. The SCGCRA utilized the SCA’s mathematical periodic oscillatory fluctuation characteristics to overlap the global exploration scope, locate potential optimal areas, and enhance population diversity. The SCGCRA employed a nonlinear control strategy to adjust dispersion, minimize ineffective exploration, and improve adaptability and complementarity. The global extremum solution was materialized by the SCGCRA at the following quantitative metrics: 126.2339, 20.1947, 10.5139, 0.5524, 0.5428, 0.4072, 0.6565, 0.3254, 0.0681, and 0.6142, with an optimum cost of 90,020.39.

6. Conclusions and Future Works

This paper presents the SCGCRA to address the twenty-three benchmark functions and six constrained real-world engineering designs; the dominant motivation is to balance global coarse exploration and local refined exploitation to identify the superior quantitative metrics, detection efficiency, and solution accuracy, and to exhibit strong practicality and adaptability to strengthen solution quality and discover the global extremum or a high-quality approximate solution. To address the GCRA's serious drawbacks of high parameter sensitivity, insufficient solution accuracy, high computational complexity, susceptibility to local optima and overfitting, poor dynamic adaptability, and the severe curse of dimensionality, the SCGCRA combines the periodic oscillatory fluctuation characteristics of the SCA and the anti-disturbance ability and nonlinear processing characteristics of the nonlinear control strategy to realize repeated expansion and contraction, facilitate dynamic regulation and population diversity, avert dimensional disaster and search oscillation, achieve synergistic complementarity and reduce sensitivity, and quantify the solution accuracy and search efficiency. The SCGCRA possesses strong flexibility and operability, enabling multi-directional perturbation paths that achieve high-precision local convergence, narrow the search solution space, expand the global exploration scope, locate potential optimal areas, strengthen local exploitation, and enhance solution accuracy. The SCGCRA is compared with the CPO, BKA, EHO, PO, WSA, HLOA, ECO, IAO, AO, HEOA, NRBO, and GCRA. The experimental results demonstrate that the SCGCRA exhibits substantial superiority and reliability in remedying the disequilibrium between exploration and exploitation, thereby accelerating convergence speed, enhancing solution accuracy, and attaining the global extremum solution.
In future research, we will utilize the SCGCRA to resolve the numerical experiments of CEC2017, CEC2019, CEC2020, CEC2021, and CEC2022, which will further verify the practicality and reliability of the proposed algorithm. We will utilize the maximum number of fitness evaluations to compare the performance of different algorithms fairly. We will leverage the Anhui Provincial Engineering Research Center for Intelligent Equipment for Under-forest Crops to showcase the core aspects: deep technological integration and innovation, breakthroughs in equipment for specialty crops, and the upgrading of green and intelligent equipment. We will combine the distributed computing capabilities of the SCGCRA with the decentralized characteristics of under-forest crops to achieve thoughtful and efficient detection and bionic precision harvesting equipment, thereby reducing redundant searches and enhancing detection efficiency. We will utilize the dynamic task allocation capability and the optimization of tillage depth and frequency provided by the SCGCRA to develop lightweight and modular equipment that is detachable and easy to transport, thereby reducing soil damage. We will utilize the energy management strategy of the SCGCRA to develop low-power, long-endurance harvesting robots, reduce carbon emissions, achieve variable fertilization and precise irrigation, reduce resource waste, and promote the coordinated progress of agriculture and ecological protection.

Author Contributions

Conceptualization, J.Z.; methodology, J.Z. and T.Z.; software, J.Z. and A.J.; validation, J.Z., A.J. and T.Z.; formal analysis, J.Z. and A.J.; investigation, A.J. and T.Z.; resources, A.J. and T.Z.; data curation, J.Z.; writing—original draft preparation, J.Z. and A.J.; writing—review and editing, A.J. and T.Z.; visualization, J.Z. and A.J.; supervision, A.J. and T.Z.; project administration, A.J. and T.Z.; funding acquisition, J.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by the Natural Science Key Research Project of Anhui Educational Committee under Grant No. 2024AH051989, Start-up Fund for Distinguished Scholars of West Anhui University under Grant No. WGKQ2022052, Horizontal topic: Research on path planning technology of smart agriculture and forestry harvesting robots based on evolutionary algorithms under Grant No. 0045024064, School-level Quality Engineering (Teaching and Research Project) under Grant No. wxxy2023079, and School-level Quality Engineering (School-enterprise Cooperation Development Curriculum Resource Construction) under Grant No. wxxy2022101. The authors would like to thank the editor and anonymous reviewers for their helpful comments and suggestions.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available upon request from the corresponding author. The MATLAB code developed for this study is available from the corresponding author upon reasonable request.

Acknowledgments

The authors thank everyone who contributed to this article, as well as the editors and reviewers for their useful comments and suggestions, which improved its quality.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Guo, Z.; Liu, G.; Jiang, F. Chinese Pangolin Optimizer: A Novel Bio-Inspired Metaheuristic for Solving Optimization Problems. J. Supercomput. 2025, 81, 517. [Google Scholar] [CrossRef]
  2. Wang, J.; Wang, W.; Hu, X.; Qiu, L.; Zang, H. Black-Winged Kite Algorithm: A Nature-Inspired Meta-Heuristic for Solving Benchmark Functions and Engineering Problems. Artif. Intell. Rev. 2024, 57, 98. [Google Scholar] [CrossRef]
  3. Al-Betar, M.A.; Awadallah, M.A.; Braik, M.S.; Makhadmeh, S.; Doush, I.A. Elk Herd Optimizer: A Novel Nature-Inspired Metaheuristic Algorithm. Artif. Intell. Rev. 2024, 57, 48. [Google Scholar] [CrossRef]
  4. Abdollahzadeh, B.; Khodadadi, N.; Barshandeh, S.; Trojovskỳ, P.; Gharehchopogh, F.S.; El-kenawy, E.-S.M.; Abualigah, L.; Mirjalili, S. Puma Optimizer (PO): A Novel Metaheuristic Optimization Algorithm and Its Application in Machine Learning. Clust. Comput. 2024, 27, 5235–5283. [Google Scholar] [CrossRef]
  5. Peraza-Vázquez, H.; Peña-Delgado, A.; Merino-Treviño, M.; Morales-Cepeda, A.B.; Sinha, N. A Novel Metaheuristic Inspired by Horned Lizard Defense Tactics. Artif. Intell. Rev. 2024, 57, 59. [Google Scholar] [CrossRef]
  6. Agushaka, J.O.; Ezugwu, A.E.; Saha, A.K.; Pal, J.; Abualigah, L.; Mirjalili, S. Greater Cane Rat Algorithm (GCRA): A Nature-Inspired Metaheuristic for Optimization Problems. Heliyon 2024, 10, e31629. [Google Scholar] [CrossRef]
  7. Zhang, H.; San, H.; Sun, H.; Ding, L.; Wu, X. A Novel Optimization Method: Wave Search Algorithm. J. Supercomput. 2024, 80, 16824–16859. [Google Scholar] [CrossRef]
  8. Golalipour, K.; Nowdeh, S.A.; Akbari, E.; Hamidi, S.S.; Ghasemi, D.; Abdelaziz, A.Y.; Kotb, H.; Yousef, A. Snow Avalanches Algorithm (SAA): A New Optimization Algorithm for Engineering Applications. Alex. Eng. J. 2023, 83, 257–285. [Google Scholar] [CrossRef]
  9. Houssein, E.H.; Oliva, D.; Samee, N.A.; Mahmoud, N.F.; Emam, M.M. Liver Cancer Algorithm: A Novel Bio-Inspired Optimizer. Comput. Biol. Med. 2023, 165, 107389. [Google Scholar] [CrossRef]
  10. Yuan, Y.; Shen, Q.; Wang, S.; Ren, J.; Yang, D.; Yang, Q.; Fan, J.; Mu, X. Coronavirus Mask Protection Algorithm: A New Bio-Inspired Optimization Algorithm and Its Applications. J. Bionic Eng. 2023, 20, 1747–1765. [Google Scholar] [CrossRef]
  11. Ahmed, M.; Sulaiman, M.H.; Mohamad, A.J.; Rahman, M. Gooseneck Barnacle Optimization Algorithm: A Novel Nature Inspired Optimization Theory and Application. Math. Comput. Simul. 2024, 218, 248–265. [Google Scholar] [CrossRef]
  12. Kaveh, M.; Mesgari, M.S.; Saeidian, B. Orchard Algorithm (OA): A New Meta-Heuristic Algorithm for Solving Discrete and Continuous Optimization Problems. Math. Comput. Simul. 2023, 208, 95–135. [Google Scholar] [CrossRef]
  13. Yuan, C.; Zhao, D.; Heidari, A.A.; Liu, L.; Chen, Y.; Wu, Z.; Chen, H. Artemisinin Optimization Based on Malaria Therapy: Algorithm and Applications to Medical Image Segmentation. Displays 2024, 84, 102740. [Google Scholar] [CrossRef]
  14. Sowmya, R.; Premkumar, M.; Jangir, P. Newton-Raphson-Based Optimizer: A New Population-Based Metaheuristic Algorithm for Continuous Optimization Problems. Eng. Appl. Artif. Intell. 2024, 128, 107532. [Google Scholar] [CrossRef]
  15. Abdel-Basset, M.; El-Shahat, D.; Jameel, M.; Abouhawwash, M. Exponential Distribution Optimizer (EDO): A Novel Math-Inspired Algorithm for Global Optimization and Engineering Problems. Artif. Intell. Rev. 2023, 56, 9329–9400. [Google Scholar] [CrossRef]
  16. Abdel-Basset, M.; El-Shahat, D.; Jameel, M.; Abouhawwash, M. Young’s Double-Slit Experiment Optimizer: A Novel Metaheuristic Optimization Algorithm for Global and Constraint Optimization Problems. Comput. Methods Appl. Mech. Eng. 2023, 403, 115652. [Google Scholar] [CrossRef]
  17. Barua, S.; Merabet, A. Lévy Arithmetic Algorithm: An Enhanced Metaheuristic Algorithm and Its Application to Engineering Optimization. Expert Syst. Appl. 2024, 241, 122335. [Google Scholar] [CrossRef]
  18. Zhao, S.; Zhang, T.; Cai, L.; Yang, R. Triangulation Topology Aggregation Optimizer: A Novel Mathematics-Based Meta-Heuristic Algorithm for Continuous Optimization and Engineering Applications. Expert Syst. Appl. 2024, 238, 121744. [Google Scholar] [CrossRef]
  19. Lian, J.; Zhu, T.; Ma, L.; Wu, X.; Heidari, A.A.; Chen, Y.; Chen, H.; Hui, G. The Educational Competition Optimizer. Int. J. Syst. Sci. 2024, 55, 3185–3222. [Google Scholar] [CrossRef]
  20. Wu, X.; Li, S.; Jiang, X.; Zhou, Y. Information Acquisition Optimizer: A New Efficient Algorithm for Solving Numerical and Constrained Engineering Optimization Problems. J. Supercomput. 2024, 80, 25736–25791. [Google Scholar] [CrossRef]
  21. Lian, J.; Hui, G. Human Evolutionary Optimization Algorithm. Expert Syst. Appl. 2024, 241, 122638. [Google Scholar] [CrossRef]
  22. Jia, H.; Lu, C.; Xing, Z. Memory Backtracking Strategy: An Evolutionary Updating Mechanism for Meta-Heuristic Algorithms. Swarm Evol. Comput. 2024, 84, 101456. [Google Scholar] [CrossRef]
  23. Jia, H.; Lu, C. Guided Learning Strategy: A Novel Update Mechanism for Metaheuristic Algorithms Design and Improvement. Knowl.-Based Syst. 2024, 286, 111402. [Google Scholar] [CrossRef]
  24. Jia, H.; Zhou, X.; Zhang, J. Thinking Innovation Strategy (TIS): A Novel Mechanism for Metaheuristic Algorithm Design and Evolutionary Update. Appl. Soft Comput. 2025, 175, 113071. [Google Scholar] [CrossRef]
  25. Dehkordi, A.A.; Sadiq, A.S.; Mirjalili, S.; Ghafoor, K.Z. Nonlinear-Based Chaotic Harris Hawks Optimizer: Algorithm and Internet of Vehicles Application. Appl. Soft Comput. 2021, 109, 107574. [Google Scholar]
  26. Mirjalili, S. SCA: A Sine Cosine Algorithm for Solving Optimization Problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar]
  27. Zhang, J.; Liu, W.; Zhang, G.; Zhang, T. Quantum Encoding Whale Optimization Algorithm for Global Optimization and Adaptive Infinite Impulse Response System Identification. Artif. Intell. Rev. 2025, 58, 158. [Google Scholar] [CrossRef]
  28. Zhong, C.; Li, G.; Meng, Z.; Li, H.; Yildiz, A.R.; Mirjalili, S. Starfish Optimization Algorithm (SFOA): A Bio-Inspired Metaheuristic Algorithm for Global Optimization Compared with 100 Optimizers. Neural Comput. Appl. 2025, 37, 3641–3683. [Google Scholar]
  29. Mohammadzadeh, A.; Mirjalili, S. Eel and Grouper Optimizer: A Nature-Inspired Optimization Algorithm. Clust. Comput. 2024, 27, 12745–12786. [Google Scholar] [CrossRef]
  30. Han, M.; Du, Z.; Yuen, K.F.; Zhu, H.; Li, Y.; Yuan, Q. Walrus Optimizer: A Novel Nature-Inspired Metaheuristic Algorithm. Expert Syst. Appl. 2024, 239, 122413. [Google Scholar] [CrossRef]
  31. Jia, H.; Li, Y.; Wu, D.; Rao, H.; Wen, C.; Abualigah, L. Multi-Strategy Remora Optimization Algorithm for Solving Multi-Extremum Problems. J. Comput. Des. Eng. 2023, 10, 1315–1349. [Google Scholar] [CrossRef]
  32. Dhiman, G. SSC: A Hybrid Nature-Inspired Meta-Heuristic Optimization Algorithm for Engineering Applications. Knowl.-Based Syst. 2021, 222, 106926. [Google Scholar] [CrossRef]
  33. Abualigah, L.; Abd Elaziz, M.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A Nature-Inspired Meta-Heuristic Optimizer. Expert Syst. Appl. 2022, 191, 116158. [Google Scholar] [CrossRef]
  34. Hashim, F.A.; Mostafa, R.R.; Hussien, A.G.; Mirjalili, S.; Sallam, K.M. Fick’s Law Algorithm: A Physical Law-Based Algorithm for Numerical Optimization. Knowl.-Based Syst. 2023, 260, 110146. [Google Scholar] [CrossRef]
  35. Abdel-Basset, M.; Mohamed, R.; Azeem, S.A.A.; Jameel, M.; Abouhawwash, M. Kepler Optimization Algorithm: A New Metaheuristic Algorithm Inspired by Kepler’s Laws of Planetary Motion. Knowl.-Based Syst. 2023, 268, 110454. [Google Scholar] [CrossRef]
  36. Abdel-Basset, M.; Mohamed, R.; Jameel, M.; Abouhawwash, M. Nutcracker Optimizer: A Novel Nature-Inspired Metaheuristic Algorithm for Global Optimization and Engineering Design Problems. Knowl.-Based Syst. 2023, 262, 110248. [Google Scholar] [CrossRef]
  37. Bai, J.; Li, Y.; Zheng, M.; Khatir, S.; Benaissa, B.; Abualigah, L.; Wahab, M.A. A Sinh Cosh Optimizer. Knowl.-Based Syst. 2023, 282, 111081. [Google Scholar] [CrossRef]
  38. Wang, W.; Tian, W.; Xu, D.; Zang, H. Arctic Puffin Optimization: A Bio-Inspired Metaheuristic Algorithm for Solving Engineering Design Optimization. Adv. Eng. Softw. 2024, 195, 103694. [Google Scholar] [CrossRef]
  39. Bai, J.; Nguyen-Xuan, H.; Atroshchenko, E.; Kosec, G.; Wang, L.; Wahab, M.A. Blood-Sucking Leech Optimizer. Adv. Eng. Softw. 2024, 195, 103696. [Google Scholar] [CrossRef]
  40. Bouaouda, A.; Hashim, F.A.; Sayouti, Y.; Hussien, A.G. Pied Kingfisher Optimizer: A New Bio-Inspired Algorithm for Solving Numerical Optimization and Industrial Engineering Problems. Neural Comput. Appl. 2024, 36, 15455–15513. [Google Scholar] [CrossRef]
  41. Kim, P.; Lee, J. An Integrated Method of Particle Swarm Optimization and Differential Evolution. J. Mech. Sci. Technol. 2009, 23, 426–434. [Google Scholar] [CrossRef]
  42. Rechenberg, I. Evolutionsstrategien. In Proceedings of the Simulationsmethoden in der Medizin und Biologie: Workshop, Hannover, Germany, 29 September–1 October 1977; Springer: Berlin/Heidelberg, Germany, 1978; pp. 83–114. [Google Scholar]
  43. Bayzidi, H.; Talatahari, S.; Saraee, M.; Lamarche, C.-P. Social Network Search for Solving Engineering Optimization Problems. Comput. Intell. Neurosci. 2021, 2021, 8548639. [Google Scholar] [CrossRef]
  44. Seyyedabbasi, A.; Kiani, F. Sand Cat Swarm Optimization: A Nature-Inspired Algorithm to Solve Global Optimization Problems. Eng. Comput. 2023, 39, 2627–2651. [Google Scholar] [CrossRef]
  45. Azizi, M.; Talatahari, S.; Giaralis, A. Optimization of Engineering Design Problems Using Atomic Orbital Search Algorithm. IEEE Access 2021, 9, 102497–102519. [Google Scholar] [CrossRef]
  46. Sadeeq, H.T.; Abdulazeez, A.M. Giant Trevally Optimizer (GTO): A Novel Metaheuristic Algorithm for Global Optimization and Challenging Engineering Problems. IEEE Access 2022, 10, 121615–121640. [Google Scholar] [CrossRef]
  47. Ezugwu, A.E.; Agushaka, J.O.; Abualigah, L.; Mirjalili, S.; Gandomi, A.H. Prairie Dog Optimization Algorithm. Neural Comput. Appl. 2022, 34, 20017–20065. [Google Scholar] [CrossRef]
  48. Das, A.K.; Pratihar, D.K. Bonobo Optimizer (BO): An Intelligent Heuristic with Self-Adjusting Parameters over Continuous Spaces and Its Applications to Engineering Problems. Appl. Intell. 2022, 52, 2942–2974. [Google Scholar] [CrossRef]
  49. Singh, H.; Singh, B.; Kaur, M. An Improved Elephant Herding Optimization for Global Optimization Problems. Eng. Comput. 2022, 38, 3489–3521. [Google Scholar] [CrossRef]
  50. Shen, Y.; Zhang, C.; Gharehchopogh, F.S.; Mirjalili, S. An Improved Whale Optimization Algorithm Based on Multi-Population Evolution for Global Optimization and Engineering Design Problems. Expert Syst. Appl. 2023, 215, 119269. [Google Scholar] [CrossRef]
  51. Wang, L.; Cao, Q.; Zhang, Z.; Mirjalili, S.; Zhao, W. Artificial Rabbits Optimization: A New Bio-Inspired Meta-Heuristic Algorithm for Solving Engineering Optimization Problems. Eng. Appl. Artif. Intell. 2022, 114, 105082. [Google Scholar] [CrossRef]
  52. Yadav, D. Blood Coagulation Algorithm: A Novel Bio-Inspired Meta-Heuristic Algorithm for Global Optimization. Mathematics 2021, 9, 3011. [Google Scholar] [CrossRef]
  53. Tarkhaneh, O.; Alipour, N.; Chapnevis, A.; Shen, H. Golden Tortoise Beetle Optimizer: A Novel Nature-Inspired Meta-Heuristic Algorithm for Engineering Problems. arXiv 2021, arXiv:210401521. [Google Scholar]
  54. Wu, H.; Zhang, X.; Song, L.; Zhang, Y.; Gu, L.; Zhao, X. Wild Geese Migration Optimization Algorithm: A New Meta-Heuristic Algorithm for Solving Inverse Kinematics of Robot. Comput. Intell. Neurosci. 2022, 2022, 5191758. [Google Scholar] [CrossRef]
  55. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. Starling Murmuration Optimizer: A Novel Bio-Inspired Algorithm for Global and Engineering Optimization. Comput. Methods Appl. Mech. Eng. 2022, 392, 114616. [Google Scholar] [CrossRef]
  56. Lang, Y.; Gao, Y. Dream Optimization Algorithm (DOA): A Novel Metaheuristic Optimization Algorithm Inspired by Human Dreams and Its Applications to Real-World Engineering Problems. Comput. Methods Appl. Mech. Eng. 2025, 436, 117718. [Google Scholar] [CrossRef]
  57. Luan, T.M.; Khatir, S.; Tran, M.T.; De Baets, B.; Cuong-Le, T. Exponential-Trigonometric Optimization Algorithm for Solving Complicated Engineering Problems. Comput. Methods Appl. Mech. Eng. 2024, 432, 117411. [Google Scholar] [CrossRef]
  58. Rao, R.V.; Savsani, V.J.; Vakharia, D.P. Teaching–Learning-Based Optimization: A Novel Method for Constrained Mechanical Design Optimization Problems. Comput.-Aided Des. 2011, 43, 303–315. [Google Scholar] [CrossRef]
  59. Bhesdadiya, R.; Trivedi, I.N.; Jangir, P.; Jangir, N. Moth-Flame Optimizer Method for Solving Constrained Engineering Optimization Problems. In Advances in Computer and Computational Sciences: Proceedings of ICCCCS 2016, Volume 2; Springer: Berlin/Heidelberg, Germany, 2017; pp. 61–68. [Google Scholar]
  60. Deb, K.; Srinivasan, A. Innovization: Discovery of Innovative Design Principles through Multiobjective Evolutionary Optimization. In Multiobjective Problem Solving from Nature: From Concepts to Applications; Springer: Berlin/Heidelberg, Germany, 2007; pp. 243–262. [Google Scholar]
  61. Sayed, G.I.; Darwish, A.; Hassanien, A.E. A New Chaotic Multi-Verse Optimization Algorithm for Solving Engineering Optimization Problems. J. Exp. Theor. Artif. Intell. 2018, 30, 293–317. [Google Scholar] [CrossRef]
  62. Eskandar, H.; Sadollah, A.; Bahreininejad, A.; Hamdi, M. Water Cycle Algorithm–A Novel Metaheuristic Optimization Method for Solving Constrained Engineering Optimization Problems. Comput. Struct. 2012, 110, 151–166. [Google Scholar] [CrossRef]
  63. Singh, N.; Kaur, J. Hybridizing Sine–Cosine Algorithm with Harmony Search Strategy for Optimization Design Problems. Soft Comput. 2021, 25, 11053–11075. [Google Scholar] [CrossRef]
  64. Azizyan, G.; Miarnaeimi, F.; Rashki, M.; Shabakhty, N. Flying Squirrel Optimizer (FSO): A Novel SI-Based Optimization Algorithm for Engineering Problems. Iran. J. Optim. 2019, 11, 177–205. [Google Scholar]
  65. Yildiz, B.S.; Pholdee, N.; Bureerat, S.; Yildiz, A.R.; Sait, S.M. Enhanced Grasshopper Optimization Algorithm Using Elite Opposition-Based Learning for Solving Real-World Engineering Problems. Eng. Comput. 2022, 38, 4207–4219. [Google Scholar] [CrossRef]
  66. Zhao, W.; Wang, L.; Zhang, Z. Artificial Ecosystem-Based Optimization: A Novel Nature-Inspired Meta-Heuristic Algorithm. Neural Comput. Appl. 2020, 32, 9383–9425. [Google Scholar] [CrossRef]
  67. Zhao, W.; Wang, L.; Mirjalili, S. Artificial Hummingbird Algorithm: A New Bio-Inspired Optimizer with Its Engineering Applications. Comput. Methods Appl. Mech. Eng. 2022, 388, 114194. [Google Scholar] [CrossRef]
  68. Askari, Q.; Saeed, M.; Younas, I. Heap-Based Optimizer Inspired by Corporate Rank Hierarchy for Global Optimization. Expert Syst. Appl. 2020, 161, 113702. [Google Scholar] [CrossRef]
  69. Yang, Y.; Chen, H.; Heidari, A.A.; Gandomi, A.H. Hunger Games Search: Visions, Conception, Implementation, Deep Analysis, Perspectives, and towards Performance Shifts. Expert Syst. Appl. 2021, 177, 114864. [Google Scholar] [CrossRef]
  70. Naruei, I.; Keynia, F.; Sabbagh Molahosseini, A. Hunter–Prey Optimization: Algorithm and Applications. Soft Comput. 2022, 26, 1279–1314. [Google Scholar] [CrossRef]
  71. Zhao, W.; Zhang, Z.; Wang, L. Manta Ray Foraging Optimization: An Effective Bio-Inspired Optimizer for Engineering Applications. Eng. Appl. Artif. Intell. 2020, 87, 103300. [Google Scholar] [CrossRef]
  72. Dhiman, G.; Garg, M.; Nagar, A.; Kumar, V.; Dehghani, M. A Novel Algorithm for Global Optimization: Rat Swarm Optimizer. J. Ambient. Intell. Humaniz. Comput. 2021, 12, 8457–8482. [Google Scholar] [CrossRef]
  73. Emami, H. Anti-Coronavirus Optimization Algorithm. Soft Comput. 2022, 26, 4991–5023. [Google Scholar] [CrossRef]
  74. Braik, M.; Ryalat, M.H.; Al-Zoubi, H. A Novel Meta-Heuristic Algorithm for Solving Numerical Optimization Problems: Ali Baba and the Forty Thieves. Neural Comput. Appl. 2022, 34, 409–455. [Google Scholar] [CrossRef]
  75. Ahmadianfar, I.; Heidari, A.A.; Gandomi, A.H.; Chu, X.; Chen, H. RUN beyond the Metaphor: An Efficient Optimization Algorithm Based on Runge Kutta Method. Expert Syst. Appl. 2021, 181, 115079. [Google Scholar] [CrossRef]
  76. Azizi, M.; Aickelin, U.; Khorshidi, H.A.; Baghalzadeh Shishehgarkhaneh, M. Energy Valley Optimizer: A Novel Metaheuristic Algorithm for Global and Engineering Optimization. Sci. Rep. 2023, 13, 226. [Google Scholar] [CrossRef]
  77. Emami, H. Stock Exchange Trading Optimization Algorithm: A Human-Inspired Method for Global Optimization. J. Supercomput. 2022, 78, 2125–2174. [Google Scholar] [CrossRef] [PubMed]
Figure 1. The natural foraging habitat of GCRs.
Figure 2. Exploration action of GCRs while looking for sources.
Figure 3. Exploitation action of GCRs during mating season.
Figure 4. Flowchart of SCGCRA.
Figure 5. Convergence curves of the SCGCRA and comparative algorithms for addressing the benchmark functions.
Figure 6. Boxplots of the SCGCRA and comparative algorithms for addressing the benchmark functions.
Figure 7. Sketch map of the three-bar truss design.
Figure 8. Sketch map of the piston lever design.
Figure 9. Sketch map of the gear train design.
Figure 10. Sketch map of the car side impact design.
Figure 11. Sketch map of the multiple-disk clutch brake design.
Figure 12. Sketch map of the rolling element bearing design.
Table 1. Benchmark functions.
| Benchmark Test Function | Dim | Range | f_min |
| $f_1(x)=\sum_{i=1}^{n} x_i^2$ | 30 | [−100, 100] | 0 |
| $f_2(x)=\sum_{i=1}^{n} \lvert x_i \rvert + \prod_{i=1}^{n} \lvert x_i \rvert$ | 30 | [−10, 10] | 0 |
| $f_3(x)=\sum_{i=1}^{n} \bigl( \sum_{j=1}^{i} x_j \bigr)^2$ | 30 | [−100, 100] | 0 |
| $f_4(x)=\max_i \{ \lvert x_i \rvert,\ 1 \le i \le n \}$ | 30 | [−100, 100] | 0 |
| $f_5(x)=\sum_{i=1}^{n-1} \bigl[ 100(x_{i+1}-x_i^2)^2+(x_i-1)^2 \bigr]$ | 30 | [−30, 30] | 0 |
| $f_6(x)=\sum_{i=1}^{n} ([x_i+0.5])^2$ | 30 | [−100, 100] | 0 |
| $f_7(x)=\sum_{i=1}^{n} i x_i^4 + \mathrm{random}[0,1)$ | 30 | [−1.28, 1.28] | 0 |
| $f_8(x)=\sum_{i=1}^{n} \bigl[ x_i^2-10\cos(2\pi x_i)+10 \bigr]$ | 30 | [−5.12, 5.12] | 0 |
| $f_9(x)=-20\exp\Bigl(-0.2\sqrt{\tfrac{1}{n}\sum_{i=1}^{n} x_i^2}\Bigr)-\exp\Bigl(\tfrac{1}{n}\sum_{i=1}^{n}\cos 2\pi x_i\Bigr)+20+e$ | 30 | [−32, 32] | 0 |
| $f_{10}(x)=\tfrac{1}{4000}\sum_{i=1}^{n} x_i^2-\prod_{i=1}^{n}\cos\tfrac{x_i}{\sqrt{i}}+1$ | 30 | [−600, 600] | 0 |
| $f_{11}(x)=\tfrac{\pi}{n}\Bigl\{10\sin^2(\pi y_1)+\sum_{i=1}^{n-1}(y_i-1)^2\bigl[1+10\sin^2(\pi y_{i+1})\bigr]+(y_n-1)^2\Bigr\}+\sum_{i=1}^{n} u(x_i,10,100,4)$, where $y_i=1+\tfrac{x_i+1}{4}$ and $u(x_i,a,k,m)=\begin{cases} k(x_i-a)^m, & x_i>a \\ 0, & -a\le x_i\le a \\ k(-x_i-a)^m, & x_i<-a \end{cases}$ | 30 | [−50, 50] | 0 |
| $f_{12}(x)=0.1\Bigl\{\sin^2(3\pi x_1)+\sum_{i=1}^{n-1}(x_i-1)^2\bigl[1+\sin^2(3\pi x_{i+1})\bigr]+(x_n-1)^2\bigl[1+\sin^2(2\pi x_n)\bigr]\Bigr\}+\sum_{i=1}^{n} u(x_i,5,100,4)$ | 30 | [−50, 50] | 0 |
| $f_{13}(x)=\Bigl(\tfrac{1}{500}+\sum_{j=1}^{25}\tfrac{1}{j+\sum_{i=1}^{2}(x_i-a_{ij})^6}\Bigr)^{-1}$ | 2 | [−65, 65] | 0.998 |
| $f_{14}(x)=4x_1^2-2.1x_1^4+\tfrac{1}{3}x_1^6+x_1x_2-4x_2^2+4x_2^4$ | 2 | [−5, 5] | −1.0316 |
| $f_{15}(x)=-\dfrac{1+\cos\bigl(12\sqrt{x_1^2+x_2^2}\bigr)}{0.5(x_1^2+x_2^2)+2}$ | 2 | [−5.12, 5.12] | −1 |
| $f_{16}(x)=\bigl[1+(x_1+x_2+1)^2(19-14x_1+3x_1^2-14x_2+6x_1x_2+3x_2^2)\bigr] \times \bigl[30+(2x_1-3x_2)^2(18-32x_1+12x_1^2+48x_2-36x_1x_2+27x_2^2)\bigr]$ | 2 | [−2, 2] | 3 |
| $f_{17}(x)=-\sum_{i=1}^{4} c_i\exp\Bigl(-\sum_{j=1}^{6} a_{ij}(x_j-p_{ij})^2\Bigr)$ | 6 | [0, 1] | −3.32 |
| $f_{18}(x)=-\sum_{i=1}^{5}\bigl[(x-a_i)(x-a_i)^T+c_i\bigr]^{-1}$ | 4 | [0, 10] | −10.1532 |
| $f_{19}(x)=-\sum_{i=1}^{7}\bigl[(x-a_i)(x-a_i)^T+c_i\bigr]^{-1}$ | 4 | [0, 10] | −10.4029 |
| $f_{20}(x)=-\sum_{i=1}^{10}\bigl[(x-a_i)(x-a_i)^T+c_i\bigr]^{-1}$ | 4 | [0, 10] | −10.5364 |
| $f_{21}(x)=-\cos(x_1)\cos(x_2)\exp\bigl(-(x_1-\pi)^2-(x_2-\pi)^2\bigr)$ | 2 | $[-2\pi, 2\pi]$ | −1 |
| $f_{22}(x)=-0.5+\dfrac{\sin^2\sqrt{x_1^2+x_2^2}-0.5}{\bigl(1+0.001(x_1^2+x_2^2)\bigr)^2}$ | 2 | [−100, 100] | −1 |
| $f_{23}(x)=\sum_{i=1}^{n} \lvert x_i\sin(x_i)+0.1x_i \rvert$ | 10 | [−10, 10] | 0 |
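Several of the table's entries translate directly into code. As a quick sanity check on the listed optima, the following minimal Python sketch implements f1 (sphere), f8 (Rastrigin), and f9 (Ackley), each with its global minimum of 0 at the origin:

```python
import math

def sphere(x):
    # f1: sum of squares; unimodal, global minimum 0 at the origin
    return sum(xi ** 2 for xi in x)

def rastrigin(x):
    # f8: oscillatory term creates a lattice of local minima; minimum 0 at origin
    return sum(xi ** 2 - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

def ackley(x):
    # f9: nearly flat outer region with a narrow central funnel; minimum 0 at origin
    n = len(x)
    s1 = sum(xi ** 2 for xi in x) / n
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x) / n
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e
```

The sphere function is unimodal, while Rastrigin and Ackley add oscillatory terms that produce many local minima; this contrast is what makes premature convergence observable in the comparative results below.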
Table 2. Contrastive results of benchmark functions.
Function | Result | CPO | BKA | EHO | PO | WSA | HLOA | ECO | IAO | AO | HEOA | NRBO | GCRA | SCGCRA
f 1 Best 3.9   ×   10 191 7.4   ×   10 221 7.22   ×   10 22 000 8.3   ×   10 177 0 5.86   ×   10 51 3.5   ×   10 188 000
Worst 1.2   ×   10 135 8.2   ×   10 181 1.03   ×   10 18 000 2.5   ×   10 104 0 1.26   ×   10 36 3.1   ×   10 154 000
Mean 3.9   ×   10 137 2.7   ×   10 182 1.05   ×   10 19 000 8.4   ×   10 106 0 7.51   ×   10 38 1.0   ×   10 155 000
Std 2.1   ×   10 136 0 2.50   ×   10 19 000 4.6   ×   10 105 0 2.62   ×   10 37 5.7   ×   10 155 000
f 2 Best 1.40   ×   10 99 6.5   ×   10 109 2.33   ×   10 14 9.5   ×   10 274 7.4   ×   10 157 1.3   ×   10 270 1.14   ×   10 89 0 1.41   ×   10 36 1.84   ×   10 93 000
Worst 1.48   ×   10 70 4.71   ×   10 94 5.06   ×   10 11 2.2   ×   10 265 5.2   ×   10 151 1.4   ×   10 251 1.41   ×   10 52 0 8.28   ×   10 26 4.30   ×   10 69 1.6   ×   10 306 00
Mean 4.92   ×   10 72 1.60   ×   10 95 2.41   ×   10 12 1.5   ×   10 266 2.4   ×   10 152 5.9   ×   10 253 4.75   ×   10 54 0 7.50   ×   10 27 1.44   ×   10 70 8.9   ×   10 308 00
Std 2.69   ×   10 71 8.95   ×   10 95 9.19   ×   10 12 0 9.6   ×   10 152 0 2.57   ×   10 53 0 1.82   ×   10 26 7.85   ×   10 70 000
f 3 Best 9.5   ×   10 186 4.7   ×   10 219 1.719617000 5.1   ×   10 165 0 3.60   ×   10 17 1.6   ×   10 216 000
Worst 1.9   ×   10 142 5.6   ×   10 158 23.041440 4.0   ×   10 307 0 4.0   ×   10 115 00.000685 4.9   ×   10 205 000
Mean 7.7   ×   10 144 1.9   ×   10 159 9.040724000 1.3   ×   10 116 0 5.04   ×   10 05 1.6   ×   10 206 000
Std 3.5   ×   10 143 1.0   ×   10 158 5.778427000 7.4   ×   10 116 00.0001390000
f 4 Best 1.60   ×   10 94 5.3   ×   10 109 4.708022 5.9   ×   10 277 4.1   ×   10 164 1.2   ×   10 281 1.68   ×   10 82 0 3.46   ×   10 07 1.19   ×   10 58 000
Worst 5.40   ×   10 73 9.62   ×   10 94 14.89710 4.3   ×   10 269 1.3   ×   10 152 4.2   ×   10 250 2.93   ×   10 58 0 7.22   ×   10 05 1.74   ×   10 48 5.9   ×   10 296 00
Mean 1.94   ×   10 74 3.21   ×   10 95 9.530409 1.8   ×   10 270 4.4   ×   10 154 1.6   ×   10 251 1.05   ×   10 59 0 2.44   ×   10 05 7.96   ×   10 50 2.1   ×   10 297 00
Std 9.86   ×   10 74 1.76   ×   10 94 2.7509750 2.4   ×   10 153 0 5.35   ×   10 59 0 1.87   ×   10 05 3.31   ×   10 49 000
f 5 Best28.2264123.971812.096586 8.62   ×   10 05 16.70165 3.36   ×   10 06 25.5555428.6480026.331540.12820826.62669 1.03   ×   10 08 4.20   ×   10 12
Worst28.9710128.91524152.725925.7134120.9388828.7057026.4656528.7716626.803141.64326228.819000.000118 6.95   ×   10 06
Mean28.8006526.1945852.4091723.7912918.8973720.0946525.9456928.7237526.609470.58947027.49856 9.89   ×   10 06 8.75   ×   10 07
Std0.1461141.37780939.414124.5247371.22687413.351280.2583480.0269580.1263090.3812450.603713 2.40   ×   10 05 1.59   ×   10 06
f 6 Best6.032364 3.93   ×   10 05 8.63   ×   10 23 1.93   ×   10 10 0 3.77   ×   10 06 1.57   ×   10 10 0.170839 2.35   ×   10 10 0.0018021.564431 2.03   ×   10 10 4.13   ×   10 13
Worst6.7509205.838999 7.70   ×   10 19 7.70   ×   10 09 00.000167 4.97   ×   10 07 1.827244 5.58   ×   10 05 1.4835723.069441 2.93   ×   10 07 7.77   ×   10 09
Mean6.4938470.712540 6.86   ×   10 20 1.73   ×   10 09 0 6.09   ×   10 05 6.50   ×   10 08 0.946642 5.18   ×   10 06 0.2573512.262051 5.62   ×   10 08 1.04   ×   10 09
Std0.1839351.678794 1.56   ×   10 19 1.93   ×   10 09 0 4.49   ×   10 05 1.11   ×   10 07 0.445780 1.11   ×   10 05 0.4210810.364347 7.19   ×   10 08 1.70   ×   10 09
f 7 Best 5.29   ×   10 07 1.85   ×   10 06 0.016694 1.70   ×   10 06 6.76   ×   10 07 1.46   ×   10 06 2.74   ×   10 06 8.06   ×   10 07 0.000187 1.59   ×   10 06 3.42   ×   10 06 8.72   ×   10 07 1.48   ×   10 08
Worst0.0003060.0003120.1544080.0001990.0001140.0003960.000232 6.07   ×   10 05 0.0028120.0002430.0003790.000111 2.02   ×   10 06
Mean 6.87   ×   10 05 8.94   ×   10 05 0.064251 6.74   ×   10 05 3.95   ×   10 05 0.000124 8.52   ×   10 05 2.29   ×   10 05 0.001062 4.96   ×   10 05 0.000112 4.45   ×   10 05 5.64   ×   10 07
Std 6.91   ×   10 05 8.54   ×   10 05 0.034555 5.00   ×   10 05 3.17   ×   10 05 0.000104 6.18   ×   10 05 1.66   ×   10 05 0.000603 4.66   ×   10 05 9.89   ×   10 05 3.15   ×   10 05 5.11   ×   10 07
f 8 Best0016.9143000000029.40100000
Worst0079.5964800000038.57384000
Mean0027.8256600000030.84521000
Std0010.910090000002.421543000
f 9 Best 4.44   ×   10 16 4.44   ×   10 16 5.87   ×   10 12 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16 4.00   ×   10 15 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16
Worst 4.44   ×   10 16 4.44   ×   10 16 3.681357 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16 1.47   ×   10 14 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16
Mean 4.44   ×   10 16 4.44   ×   10 16 1.863179 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16 8.85   ×   10 15 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16 4.44   ×   10 16
Std000.78379500000 3.02   ×   10 15 0000
f 10 Best0000000000000
Worst000.157370000000.0757270.082115000
Mean000.021437000000.0037560.022793000
Std000.038474000000.0144090.026793000
f 11 Best0.811592 2.76   ×   10 06 2.38   ×   10 22 9.83   ×   10 12 1.77   ×   10 30 1.11   ×   10 08 5.86   ×   10 11 0.018092 3.48   ×   10 13 1.81   ×   10 05 0.088837 2.72   ×   10 13 4.06   ×   10 13
Worst1.4023940.2924360.936925 1.19   ×   10 09 0.2073170.207367 7.60   ×   10 08 0.231948 9.63   ×   10 07 0.9608630.350026 8.59   ×   10 09 3.05   ×   10 09
Mean0.9993410.0161930.214786 1.37   ×   10 10 0.0241870.006916 4.73   ×   10 09 0.085527 6.23   ×   10 08 0.1996700.183719 1.34   ×   10 09 2.05   ×   10 10
Std0.1563320.0528930.326651 2.31   ×   10 10 0.0648970.037859 1.38   ×   10 08 0.056073 1.79   ×   10 07 0.2754050.062374 2.23   ×   10 09 5.56   ×   10 10
f 12 Best2.7037480.206470 5.47   ×   10 21 1.62   ×   10 11 3.96   ×   10 31 4.56   ×   10 06 1.07   ×   10 09 0.074259 2.41   ×   10 12 0.0001351.157680 3.42   ×   10 10 4.92   ×   10 13
Worst2.8989332.9967173.608452 1.13   ×   10 08 0.0308270.1085590.8014242.9750890.5869730.0041592.791573 2.30   ×   10 07 1.27   ×   10 08
Mean2.8033251.4683600.620813 1.39   ×   10 09 0.0027950.0098250.0307242.1675490.1110980.0014341.980314 3.94   ×   10 08 2.74   ×   10 09
Std0.0423380.7540961.358982 2.24   ×   10 09 0.0077030.0256630.1456851.2579030.1332870.0012290.430654 6.12   ×   10 08 4.13   ×   10 09
f 13 Best1.0001940.9980040.9980040.9980040.9980040.9980040.9980040.9980040.9980040.9980040.9980040.9980040.998004
Worst2.9821051.99203110.763180.9980041.99203112.670512.9821050.99800410.7631812.670512.9821050.9980040.998004
Mean2.0299391.0311382.3119010.9980041.0311384.8886821.1302770.9980041.5220787.7345341.4609610.9980040.998004
Std0.7560350.1814842.32384800.1814843.8495280.50338301.8285915.0351980.853527 1.78   ×   10 12 1.33   ×   10 10
f 14 Best−1.03160−1.03163−1.03163−1.03163−1.03163−1.03163−1.03163−1.03163−1.03163−1.03163−1.03163−1.03161−1.03163
Worst−1.02996−1.03163−1.03163−1.03163−1.03163−0.21546−1.03163−1.03163−1.03141−1.03125−1.03163−0.83077−1.03163
| Function | Metric | CPO | BKA | EHO | PO | WSA | HLOA | ECO | IAO | AO | HEOA | NRBO | GCRA | SCGCRA |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| f14 | Mean | −1.03127 | −1.03163 | −1.03163 | −1.03163 | −1.03163 | −1.00442 | −1.03163 | −1.03163 | −1.03161 | −1.03156 | −1.03163 | −0.99581 | −1.03163 |
| f14 | Std | 0.000390 | 5.90×10^−16 | 6.78×10^−16 | 6.78×10^−16 | 6.52×10^−16 | 0.149011 | 3.40×10^−15 | 6.78×10^−16 | 5.39×10^−05 | 8.67×10^−05 | 5.83×10^−16 | 0.044594 | 6.58×10^−16 |
| f15 | Best | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 |
| f15 | Worst | −1 | −1 | −0.93625 | −0.93625 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 |
| f15 | Mean | −1 | −1 | −0.98725 | −0.99787 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 |
| f15 | Std | 0 | 0 | 0.025938 | 0.011640 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4.42×10^−12 | 0 |
| f16 | Best | 3 | 3 | 3 | 3 | 3 | 3 | 3 | 3 | 3 | 3.000011 | 3 | 3.129500 | 3 |
| f16 | Worst | 3.000200 | 3 | 3 | 3 | 3 | 30 | 3 | 3 | 3.044026 | 3.029225 | 3 | 32.684543 | 3 |
| f16 | Mean | 3.000027 | 3 | 3 | 3 | 3 | 3.9 | 3 | 3 | 3.003641 | 3.003765 | 3 | 18.459673 | 3 |
| f16 | Std | 4.37×10^−05 | 7.14×10^−16 | 1.75×10^−15 | 1.10×10^−15 | 9.72×10^−16 | 4.929503 | 1.10×10^−15 | 1.82×10^−15 | 0.008607 | 0.006171 | 3.77×10^−15 | 12.04699 | 1.66×10^−15 |
| f17 | Best | −3.15823 | −3.32200 | −3.32200 | −3.32200 | −3.32200 | −3.32200 | −3.32200 | −3.32200 | −3.32200 | −3.30996 | −3.32200 | −2.60843 | −3.32200 |
| f17 | Worst | −2.45031 | −3.15869 | −3.20310 | −3.20310 | −3.20310 | −3.08394 | −3.20310 | −3.32200 | −3.20166 | −3.06329 | −3.13725 | −1.16984 | −3.32200 |
| f17 | Mean | −2.94344 | −3.29920 | −3.29029 | −3.29029 | −3.25066 | −3.25655 | −3.25462 | −3.32200 | −3.27040 | −3.19238 | −3.25994 | −1.82424 | −3.32200 |
| f17 | Std | 0.144663 | 0.052318 | 0.053475 | 0.053475 | 0.059241 | 0.075305 | 0.059923 | 1.36×10^−15 | 0.060001 | 0.072875 | 0.067223 | 0.391285 | 9.58×10^−15 |
| f18 | Best | −4.06428 | −10.1532 | −10.1532 | −10.1532 | −10.1532 | −10.1532 | −10.1532 | −10.1532 | −10.1532 | −10.1520 | −10.1532 | −10.1532 | −10.1532 |
| f18 | Worst | −0.85099 | −10.1532 | −2.63047 | −10.1532 | −5.05520 | −2.63047 | −2.63047 | −5.05520 | −2.62952 | −7.07049 | −5.38758 | −10.1494 | −10.1521 |
| f18 | Mean | −1.58082 | −10.1532 | −8.31182 | −10.1532 | −9.64340 | −9.65092 | −9.90244 | −7.43427 | −5.85794 | −9.16726 | −9.98192 | −10.1528 | −10.1531 |
| f18 | Std | 0.975053 | 5.71×10^−15 | 2.938369 | 6.51×10^−15 | 1.555546 | 1.908370 | 1.373456 | 2.586809 | 2.813123 | 0.870257 | 0.869491 | 0.000723 | 0.000198 |
| f19 | Best | −3.87932 | −10.4029 | −10.4029 | −10.4029 | −10.4029 | −10.4029 | −10.4029 | −10.4029 | −10.4029 | −10.4005 | −10.4029 | −10.4028 | −10.4028 |
| f19 | Worst | −0.84842 | −3.72430 | −2.75193 | −2.76590 | −5.08767 | −2.76590 | −1.83759 | −5.08767 | −2.74761 | −6.51320 | −4.89939 | −10.3973 | −10.4020 |
| f19 | Mean | −2.01752 | −10.1794 | −9.41615 | −10.1484 | −9.33989 | −8.71556 | −9.16305 | −7.56813 | −6.89784 | −9.30859 | −9.95802 | −10.4024 | −10.4027 |
| f19 | Std | 0.919006 | 1.219188 | 2.563526 | 1.394327 | 2.162454 | 3.116658 | 2.835141 | 2.697054 | 3.610540 | 1.122435 | 1.405488 | 0.001098 | 0.000190 |
| f20 | Best | −4.18195 | −10.5364 | −10.5364 | −10.5364 | −10.5364 | −10.5364 | −10.5364 | −10.5364 | −10.5364 | −10.5339 | −10.5364 | −10.5363 | −10.5363 |
| f20 | Worst | −0.93680 | −10.5364 | −2.87114 | −3.83543 | −5.12848 | −2.42173 | −2.42173 | −5.12848 | −2.42144 | −8.11593 | −7.26135 | −10.5341 | −10.5359 |
| f20 | Mean | −2.66391 | −10.5364 | −9.83417 | −10.0897 | −9.81535 | −8.46676 | −8.74623 | −6.75086 | −6.70094 | −9.84772 | −10.2548 | −10.5360 | −10.5362 |
| f20 | Std | 0.816296 | 2.03×10^−15 | 2.147720 | 1.700094 | 1.869769 | 3.504272 | 3.332191 | 2.520590 | 3.740067 | 0.780452 | 0.749936 | 0.000594 | 9.39×10^−05 |
| f21 | Best | −0.99988 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 |
| f21 | Worst | −0.93514 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −0.99901 | −0.99996 | −1 | −1 | −1 |
| f21 | Mean | −0.98330 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −0.99992 | −0.99999 | −1 | −1 | −1 |
| f21 | Std | 0.014053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000223 | 9.32×10^−06 | 0 | 8.37×10^−08 | 2.55×10^−07 |
| f22 | Best | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 | −1 |
| f22 | Worst | −1 | −1 | −0.99028 | −0.99028 | −1 | −1 | −1 | −1 | −1 | −0.99028 | −1 | −1 | −1 |
| f22 | Mean | −1 | −1 | −0.99255 | −0.99870 | −1 | −1 | −1 | −1 | −1 | −0.99636 | −1 | −1 | −1 |
| f22 | Std | 0 | 0 | 0.004180 | 0.003359 | 0 | 0 | 0 | 0 | 0 | 0.004718 | 0 | 4.41×10^−10 | 0 |
| f23 | Best | 6.4×10^−140 | 1.1×10^−110 | 5.52×10^−97 | 2.7×10^−275 | 5.5×10^−181 | 0 | 1.39×10^−80 | 0 | 1.17×10^−87 | 8.05×10^−97 | 0 | 4.08×10^−14 | 0 |
| f23 | Worst | 2.4×10^−111 | 9.07×10^−87 | 1.44×10^−15 | 4.2×10^−263 | 7.43×10^−05 | 1.6×10^−250 | 1.33×10^−61 | 0 | 9.34×10^−51 | 0.378388 | 0 | 1.29×10^−05 | 0 |
| f23 | Mean | 8.1×10^−113 | 3.02×10^−88 | 1.31×10^−16 | 3.6×10^−264 | 6.41×10^−06 | 5.3×10^−252 | 8.56×10^−63 | 0 | 3.19×10^−52 | 0.030604 | 0 | 6.22×10^−07 | 0 |
| f23 | Std | 4.4×10^−112 | 1.66×10^−87 | 3.12×10^−16 | 0 | 1.58×10^−05 | 0 | 3.21×10^−62 | 0 | 1.70×10^−51 | 0.085938 | 0 | 2.4×10^−06 | 0 |
Table 3. Contrastive results of the p-value Wilcoxon rank-sum test on the benchmark functions.

| Function | SCGCRA vs. CPO | SCGCRA vs. BKA | SCGCRA vs. EHO | SCGCRA vs. PO | SCGCRA vs. WSA | SCGCRA vs. HLOA | SCGCRA vs. ECO | SCGCRA vs. IAO | SCGCRA vs. AO | SCGCRA vs. HEOA | SCGCRA vs. NRBO | SCGCRA vs. GCRA |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| f1 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | N/A | N/A | N/A | 1.21×10^−12 | N/A | 1.21×10^−12 | 1.21×10^−12 | N/A | N/A |
| f2 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | N/A | 1.21×10^−12 | 1.21×10^−12 | 4.19×10^−02 | N/A |
| f3 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | N/A | 3.33×10^−02 | N/A | 1.21×10^−12 | N/A | 1.21×10^−12 | 1.21×10^−12 | N/A | N/A |
| f4 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | N/A | 1.21×10^−12 | 1.21×10^−12 | 4.57×10^−12 | N/A |
| f5 | 3.02×10^−11 | 3.02×10^−11 | 3.02×10^−11 | 3.02×10^−11 | 3.02×10^−11 | 4.08×10^−11 | 3.02×10^−11 | 3.02×10^−11 | 3.02×10^−11 | 3.02×10^−11 | 3.02×10^−11 | 1.68×10^−03 |
| f6 | 3.02×10^−11 | 3.02×10^−11 | 3.02×10^−11 | 4.85×10^−03 | 1.21×10^−12 | 3.02×10^−11 | 6.52×10^−09 | 3.02×10^−11 | 2.37×10^−10 | 3.02×10^−11 | 3.02×10^−11 | 1.10×10^−08 |
| f7 | 1.33×10^−10 | 3.34×10^−11 | 3.02×10^−11 | 3.69×10^−11 | 8.99×10^−11 | 3.69×10^−11 | 3.02×10^−11 | 6.07×10^−11 | 3.02×10^−11 | 3.69×10^−11 | 3.02×10^−11 | 4.98×10^−11 |
| f8 | N/A | N/A | 1.20×10^−12 | N/A | N/A | N/A | N/A | N/A | N/A | 1.21×10^−12 | N/A | N/A |
| f9 | N/A | N/A | 1.21×10^−12 | N/A | N/A | N/A | N/A | N/A | 2.01×10^−13 | N/A | N/A | N/A |
| f10 | N/A | N/A | 1.92×10^−09 | N/A | N/A | N/A | N/A | N/A | 8.15×10^−02 | 1.27×10^−05 | N/A | N/A |
| f11 | 3.02×10^−11 | 3.02×10^−11 | 6.10×10^−01 | 4.64×10^−02 | 1.11×10^−06 | 3.02×10^−11 | 1.07×10^−07 | 3.02×10^−11 | 3.83×10^−05 | 3.02×10^−11 | 3.02×10^−11 | 1.00×10^−03 |
| f12 | 3.02×10^−11 | 3.02×10^−11 | N/A | 5.39×10^−01 | 1.11×10^−06 | 3.02×10^−11 | 8.89×10^−10 | 3.02×10^−11 | 1.20×10^−08 | 3.02×10^−11 | 3.02×10^−11 | 7.60×10^−07 |
| f13 | 3.02×10^−11 | 1.44×10^−10 | 7.25×10^−02 | 1.21×10^−12 | 4.56×10^−11 | 9.48×10^−06 | 4.72×10^−09 | 1.21×10^−12 | 1.05×10^−02 | 1.61×10^−10 | 3.75×10^−04 | 1.03×10^−06 |
| f14 | 3.15×10^−12 | 3.91×10^−03 | 8.14×10^−02 | 8.14×10^−02 | 6.99×10^−01 | 7.48×10^−08 | 2.65×10^−02 | 8.14×10^−02 | 3.87×10^−11 | 3.15×10^−12 | 1.83×10^−03 | 3.15×10^−12 |
| f15 | N/A | N/A | 1.09×10^−02 | 3.33×10^−02 | N/A | N/A | N/A | N/A | N/A | N/A | N/A | 2.21×10^−06 |
| f16 | 2.19×10^−11 | 1.52×10^−05 | 2.49×10^−08 | 6.96×10^−05 | 1.06×10^−03 | 2.17×10^−11 | 1.29×10^−05 | 1.80×10^−02 | 1.26×10^−10 | 2.19×10^−11 | 5.55×10^−06 | 2.19×10^−11 |
| f17 | 1.25×10^−11 | 1.25×10^−11 | 8.66×10^−01 | 8.62×10^−01 | 3.28×10^−07 | 1.25×10^−11 | 9.51×10^−11 | 6.56×10^−04 | 5.91×10^−08 | 1.25×10^−11 | 1.25×10^−11 | 1.25×10^−11 |
| f18 | 3.02×10^−11 | 6.39×10^−12 | 6.66×10^−03 | 1.41×10^−11 | 8.38×10^−08 | 1.32×10^−04 | 3.79×10^−10 | 6.59×10^−01 | 4.87×10^−05 | 3.02×10^−11 | 3.04×10^−06 | 3.03×10^−03 |
| f19 | 3.02×10^−11 | 4.37×10^−09 | 8.43×10^−07 | 1.94×10^−10 | 5.55×10^−05 | 1.17×10^−04 | 8.33×10^−06 | 6.60×10^−01 | 6.95×10^−01 | 3.02×10^−11 | 5.13×10^−03 | 2.64×10^−02 |
| f20 | 3.02×10^−11 | 2.08×10^−11 | 3.43×10^−08 | 5.12×10^−09 | 7.79×10^−07 | 3.26×10^−02 | 3.69×10^−04 | 7.50×10^−03 | 7.72×10^−02 | 3.02×10^−11 | 2.50×10^−02 | 1.85×10^−02 |
| f21 | 3.02×10^−11 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 8.34×10^−03 | 3.50×10^−09 | 1.21×10^−12 | 4.03×10^−02 |
| f22 | N/A | N/A | 1.77×10^−09 | 4.17×10^−02 | N/A | N/A | N/A | N/A | N/A | 4.95×10^−06 | N/A | 8.86×10^−07 |
| f23 | 1.21×10^−12 | 1.21×10^−12 | 1.20×10^−12 | 1.21×10^−12 | 1.21×10^−12 | 4.57×10^−12 | 1.21×10^−12 | N/A | 1.21×10^−12 | 1.21×10^−12 | N/A | 1.21×10^−12 |
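The p-values in Table 3 come from pairwise Wilcoxon rank-sum tests over the independent runs of each algorithm, with N/A marking pairs whose result sets are identical so the statistic is undefined. A dependency-free sketch of the computation using the normal approximation (the exact distribution, as in `scipy.stats.ranksums`, differs only slightly at these sample sizes); the function name and the two synthetic 30-run samples below are illustrative assumptions, not data from the paper:

```python
import math

def ranksum_p(x, y):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation.
    Returns None (reported as N/A) when every observation in both samples
    is the same constant, where the test is undefined."""
    if len(set(x) | set(y)) == 1:
        return None
    combined = sorted(x + y)
    # assign average ranks to ties
    ranks = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        ranks[combined[i]] = (i + 1 + j) / 2.0  # mean of ranks i+1 .. j
        i = j
    n1, n2 = len(x), len(y)
    w = sum(ranks[v] for v in x)                # rank sum of sample 1
    mu = n1 * (n1 + n2 + 1) / 2.0               # mean of W under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2.0))   # two-sided tail probability

# two hypothetical 30-run samples: one near the optimum, one clearly worse
runs_a = [i * 1e-6 for i in range(30)]
runs_b = [1.0 + i * 1e-6 for i in range(30)]
p = ranksum_p(runs_a, runs_b)
```

With 30 runs per algorithm and fully separated samples, this approximation gives a p-value on the order of 10^−11, which is why that magnitude recurs throughout Table 3.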
Table 4. Contrastive results of the three-bar truss design.

| Algorithm | A1 | A2 | Optimum Weight |
| --- | --- | --- | --- |
| LSHADE [34] | 0.785249 | 0.410335 | 263.8915 |
| SCA [34] | 0.788649 | 0.408235 | 263.8715 |
| WOA [34] | 0.78860276 | 0.408453070 | 263.8958 |
| TEO [34] | 0.7886618 | 0.4082831 | 263.8958 |
| HGSO [34] | 0.778254 | 0.440528 | 264.1762 |
| HGS [34] | 0.7884562 | 0.40886831 | 263.8959 |
| KOA [35] | 0.788675 | 0.408248 | 263.895843 |
| COA [35] | 0.788057 | 0.410073 | 263.903379 |
| RUN [35] | 0.788793 | 0.407916 | 263.895854 |
| SMA [35] | 0.788541 | 0.408627 | 263.895857 |
| DO [35] | 0.788643 | 0.408339 | 263.895844 |
| POA [35] | 0.788675 | 0.408248 | 263.895843 |
| NOA [36] | 0.78868 | 0.40825 | 263.89584338 |
| GBO [36] | 0.78868 | 0.40825 | 263.89584338 |
| BKA [2] | 0.788675 | 0.408248 | 263.895843 |
| SHO [2] | 0.788898 | 0.40762 | 263.895881 |
| TTAO [18] | 0.788688 | 0.408213 | 263.8958431 |
| SCHO [37] | 0.7886642 | 0.40827926 | 263.8958476 |
| APO [38] | 0.7887 | 0.4082 | 263.89584338 |
| BSLO [39] | 0.78867930 | 0.40823651 | 263.8958434 |
| FOX [39] | 0.78870269 | 0.4081704 | 263.8958523 |
| ARSCA [1] | 0.7887 | 0.4081 | 263.8958 |
| CPO [1] | 0.7885 | 0.4088 | 263.8959 |
| PKO [40] | 0.7886870838 | 0.4082144942 | 263.8958435 |
| SFOA [28] | 0.78868 | 0.40825 | 263.89584 |
| SCGCRA | 0.78645 | 0.41813 | 263.8543 |
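For orientation, the three-bar truss objective in its standard formulation (assumed here; the table reports only solutions) is W = (2√2·A1 + A2)·l with bar length l = 100 cm. A minimal sketch evaluating the solution reported for KOA/POA/BKA, which reproduces the ≈263.896 weight cluster of Table 4:

```python
import math

def truss_weight(a1, a2, length=100.0):
    # standard three-bar truss objective: (2*sqrt(2)*A1 + A2) * l, l = 100 cm
    return (2.0 * math.sqrt(2.0) * a1 + a2) * length

w = truss_weight(0.788675, 0.408248)  # KOA/POA/BKA solution from Table 4
```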
Table 5. Contrastive results of the piston lever design.

| Algorithm | H | B | X | D | Optimum Weight |
| --- | --- | --- | --- | --- | --- |
| PSO [41] | 133.3 | 2.44 | 117.14 | 4.75 | 122 |
| DE [41] | 129.4 | 2.43 | 119.8 | 4.75 | 159 |
| GA [41] | 250 | 3.96 | 60.03 | 5.91 | 161 |
| HPSO [41] | 135.5 | 2.48 | 116.62 | 4.75 | 162 |
| CS [42] | 0.050 | 2.043 | 120 | 4.085 | 8.427 |
| SNS [43] | 0.050 | 2.042 | 120 | 4.083 | 8.412698349 |
| SCSO [44] | 0.050 | 2.040 | 119.99 | 4.083 | 8.40901438899551 |
| CSO [44] | 0.050 | 2.399 | 85.68 | 4.0804 | 13.7094866557362 |
| GWO [44] | 0.060 | 2.0390 | 120 | 4.083 | 8.40908765909047 |
| WAO [44] | 0.099 | 2.057 | 118.4 | 4.112 | 9.05943208079399 |
| SSA [44] | 0.050 | 2.073 | 116.32 | 4.145 | 8.80243253777633 |
| GSA [44] | 497.49500 | 60.04 | 12.21 | 5 | 168.094363238712 |
| BWO [44] | 12.364 | 12.801 | 172.02 | 3.074 | 95.9980864948937 |
| AOS [45] | 0.05 | 2.042112482 | 119.951727 | 4.084004492 | 8.419142742 |
| GTO [46] | 0.05 | 2.052859 | 119.6392 | 4.089713 | 8.41270 |
| MFO [46] | 0.05 | 2.041514 | 120 | 4.083365 | 8.412698 |
| WOA [46] | 0.051874 | 2.045915 | 119.9579 | 4.085849 | 8.449975 |
| ISA [47] | N/A | N/A | N/A | N/A | 8.4 |
| CGO [47] | N/A | N/A | N/A | N/A | 8.41281381 |
| MGA [47] | N/A | N/A | N/A | N/A | 8.41340665 |
| TTAO [18] | 0.05 | 2.041514 | 4.083027 | 120 | 8.412698323 |
| EGO [29] | 1.979653079 | 3.652740666 | 426.379188 | 2.031507236 | 8.41269886 |
| MVO [29] | 0.05 | 2.046900355 | 119.92924 | 4.095582502 | 8.57509432 |
| ALO [29] | 0.05 | 2.051360067 | 118.821159 | 4.102693186 | 8.53445096 |
| CS-EO [29] | 0.05 | 2.041514 | 120 | 4.083027 | 8.412698 |
| SCGCRA | 0.05 | 0.125364154 | 120 | 4.12410157 | 7.794 |
Table 6. Contrastive results of the gear train design.

| Algorithm | nA | nB | nC | nD | Optimum Cost |
| --- | --- | --- | --- | --- | --- |
| BO [48] | 43 | 19 | 16 | 49 | 2.700857×10^−12 |
| KOA [35] | 44 | 20 | 16 | 50 | 2.700857×10^−12 |
| FLA [35] | 44 | 16 | 20 | 49 | 2.700857×10^−12 |
| COA [35] | 23 | 14 | 12 | 48 | 9.92158×10^−10 |
| RUN [35] | 44 | 17 | 19 | 49 | 2.700857×10^−12 |
| SMA [35] | 52 | 30 | 13 | 53 | 2.307816×10^−11 |
| DO [35] | 49 | 16 | 19 | 44 | 2.700857×10^−12 |
| POA [35] | 44 | 17 | 19 | 49 | 2.700857×10^−12 |
| PDO [47] | 48 | 17 | 22 | 54 | 2.70×10^−12 |
| DMOA [47] | 49 | 19 | 16 | 43 | 2.70×10^−12 |
| AOA [47] | 49 | 19 | 19 | 54 | 2.70×10^−12 |
| CPSOGSA [47] | 55 | 16 | 16 | 43 | 2.31×10^−11 |
| SSA [47] | 49 | 19 | 19 | 49 | 2.70×10^−12 |
| SCA [47] | 49 | 19 | 34 | 49 | 2.700857×10^−12 |
| IEHO [49] | 19 | 16 | 43 | 49 | 2.70085×10^−12 |
| MEWOA [50] | 49 | 16 | 19 | 43 | 2.7099×10^−12 |
| ARO [51] | 49 | 19 | 16 | 43 | 2.7009×10^−12 |
| BCA [52] | 43 | 16 | 19 | 49 | 2.7009×10^−12 |
| BWO [53] | 50 | 18 | 17 | 46 | 7.5421×10^−17 |
| GMO [54] | 43 | 19 | 16 | 49 | 2.700857×10^−12 |
| GBO [18] | 53 | 13 | 20 | 34 | 2.3078×10^−11 |
| TTAO [18] | 43 | 16 | 19 | 49 | 2.70×10^−12 |
| WO [30] | 43 | 16 | 19 | 43 | 2.700857×10^−12 |
| GCRA [6] | 55 | 16 | 16 | 43 | 2.70×10^−12 |
| GOA [6] | 49 | 19 | 16 | 43 | 2.70×10^−12 |
| SCGCRA | 50 | 22 | 19 | 52 | 3.25×10^−18 |
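The gear train objective is usually the squared deviation of the obtained transmission ratio from the target 1/6.931. With the pairing (nB·nC)/(nA·nD) — an assumption, since the table does not state which variable drives which gear — the recurring (43, 19, 16, 49) solution reproduces the 2.700857×10^−12 cost seen throughout Table 6:

```python
def gear_cost(na, nb, nc, nd):
    # squared deviation of the train ratio (nb*nc)/(na*nd) from 1/6.931
    return (1.0 / 6.931 - (nb * nc) / (na * nd)) ** 2

c = gear_cost(43, 19, 16, 49)  # the recurring BO/GMO solution in Table 6
```

Note the objective is symmetric in swapping nA with nD (and nB with nC), which is why permutations such as (49, 19, 16, 43) report the same cost.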
Table 7. Contrastive results of the car side impact design.

| Algorithm | x1 | x2 | x3 | x4 | x5 | x6 | x7 | x8 | x9 | x10 | x11 | Optimum Weight |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ACO [55] | 0.5 | 1.12004 | 0.5 | 1.29627 | 0.5 | 1.5 | 0.5 | 0.345 | 0.192 | −18.905 | −0.0008 | 22.84371 |
| KH [55] | 0.5 | 1.14747 | 0.5 | 1.26118 | 0.5 | 1.5 | 0.5 | 0.345 | 0.345 | −13.998 | −0.8984 | 22.88596 |
| HHO [55] | 0.5 | 1.15627 | 0.5 | 1.27133 | 0.5 | 1.4777 | 0.5 | 0.345 | 0.192 | −14.592 | −2.4898 | 22.98537 |
| BOA [55] | 0.8246 | 1.03224 | 0.54007 | 1.35639 | 0.6377 | 1.26889 | 0.5854 | 0.192 | 0.345 | −5.7333 | 0.4352 | 25.06573 |
| HGSO [55] | 0.5 | 1.22375 | 0.5 | 1.27111 | 0.5 | 1.31085 | 0.5 | 0.345 | 0.345 | −4.3235 | 2.93676 | 23.43457 |
| LIACO [55] | 0.5 | 1.11593 | 0.5 | 1.30293 | 0.5 | 1.5 | 0.5 | 0.192 | 0.345 | −19.64 | −0.000003 | 22.84299 |
| SMO [55] | 0.5 | 1.11634 | 0.5 | 1.30224 | 0.5 | 1.5 | 0.5 | 0.345 | 0.345 | −19.566 | 0.000001 | 22.84298 |
| DOA [56] | 0.5081 | 1.2021 | 0.5318 | 1.3052 | 0.5719 | 1.4954 | 0.5557 | 0.303 | 0.2585 | −24.8171 | 3.4047 | 23.9682 |
| DCS [56] | 0.5772 | 1.2586 | 0.5195 | 1.2002 | 0.5463 | 1.258 | 0.5073 | 0.278 | 0.2669 | 2.0888 | 5.4035 | 23.9995 |
| COA [56] | 0.5 | 1.2791 | 0.5 | 1.2739 | 1.2828 | 0.5 | 0.5 | 0.2954 | 0.192 | 3.5571 | 9.0792 | 25.2083 |
| MSA [56] | 0.5151 | 1.2684 | 0.5545 | 1.3737 | 0.5261 | 1.3484 | 0.7156 | 0.2869 | 0.2167 | −7.2394 | 11.7869 | 25.2334 |
| HLOA [56] | 0.5 | 1.0669 | 0.8016 | 1.0704 | 0.504 | 1.4873 | 0.5 | 0.192 | 0.192 | −29.9786 | 3.2119 | 23.6956 |
| AROA [56] | 0.5 | 1.5 | 0.5 | 1.2928 | 0.5 | 0.5 | 0.5 | 0.192 | 0.3195 | 8.8265 | 23.0874 | 25.3642 |
| EGO [29] | 0.5 | 1.1107 | 0.5 | 1.312 | 0.5001 | 1.5 | 0.50001 | 0.98732 | 0.04604 | −20.57 | 0.18084 | 22.84570 |
| MVO [29] | 0.5 | 1.1352 | 0.5012 | 1.27318 | 0.5003 | 1.5 | 0.50403 | 0.53489 | 0.23089 | −16.1449 | 0.99051 | 22.88565 |
| ETO [57] | 0.50282 | 1.2414 | 0.51604 | 1.2201 | 0.60334 | 1.3878 | 0.5 | 0.74832 | 0.06747 | 2.2526 | −7.2818 | 23.2574 |
| SCHO [57] | 0.5 | 1.10286 | 0.87088 | 0.88643 | 0.52609 | 1.49992 | 0.5 | 0.03508 | 0.19439 | −30 | −0.5913 | 23.7209 |
| AOA [57] | 0.5 | 1.2279 | 0.5 | 1.4332 | 0.5 | 1.5 | 0.5 | 0.61018 | 0.21619 | 0.00126 | −0.0765 | 24.1125 |
| HGS [57] | 0.5 | 1.10612 | 1.11044 | 0.5 | 0.5 | 1.5 | 0.5 | 4.4×10^−09 | 0.00000 | −30 | −6.0×10^−09 | 23.8188 |
| GJO [57] | 0.5 | 1.20309 | 0.50327 | 1.28778 | 0.51053 | 1.5 | 0.5 | 0.00000 | 9.5×10^−05 | −22.115 | −0.0536 | 23.4052 |
| ROA [31] | 1.098334901 | 0.957459058 | 1.112521155 | 1.043356648 | 0.730817433 | 1.009550656 | 0.51561597 | 0.345 | 0.345 | 0.053235933 | 0.042350889 | 28.40584747 |
| SCSO [31] | 0.502366774 | 1.23533939 | 0.5 | 1.223008761 | 0.515267967 | 1.39187245 | 0.50003369 | 0.340647775 | 0.211950171 | 1.374158706 | −7.77399175 | 23.35787723 |
| SHO [31] | 1.5 | 1.267885192 | 1.5 | 0.768783364 | 1.11811662 | 0.74785158 | 0.56089667 | 0.345 | 0.345 | 2.050521688 | 3.263049114 | 34.86111849 |
| SOA [31] | 0.500139239 | 1.254868587 | 0.5 | 1.205871077 | 0.739233716 | 0.772309974 | 0.5 | 0.316999014 | 0.30308334 | 0.749660043 | 2.039711514 | 23.8070425 |
| SFOA [28] | 0.5 | 1.234 | 0.5 | 1.187 | 0.875 | 0.892 | 0.4 | 0.345 | 0.192 | 1.5 | 0.572 | 23.5616 |
| SCGCRA | 0.5 | 1.11643 | 0.5 | 1.30208 | 0.5 | 1.5 | 0.5 | 0.345 | 0.192 | −19.54935 | −0.00431 | 22.84294 |
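In the commonly used formulation of the car side impact problem (assumed here; the table reports only solutions), the weight objective depends on x1–x5 and x7, while the remaining variables enter through the deflection and velocity constraints. A sketch that reproduces the weights of Table 7 from the reported variables:

```python
def car_weight(x):
    # common weight objective: only x1..x5 and x7 carry mass terms
    x1, x2, x3, x4, x5, _x6, x7 = x
    return (1.98 + 4.90 * x1 + 6.67 * x2 + 6.98 * x3
            + 4.01 * x4 + 1.78 * x5 + 2.73 * x7)

# first seven variables of the SCGCRA and ACO rows in Table 7
w_scgcra = car_weight([0.5, 1.11643, 0.5, 1.30208, 0.5, 1.5, 0.5])
w_aco = car_weight([0.5, 1.12004, 0.5, 1.29627, 0.5, 1.5, 0.5])
```

Evaluating the SCGCRA and ACO rows gives ≈22.8429 and ≈22.8437, matching the tabulated weights to the digits shown.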
Table 8. Contrastive results of the multiple-disk clutch brake design.

| Algorithm | ri | r0 | t | F | Z | Optimum Weight |
| --- | --- | --- | --- | --- | --- | --- |
| TLBO [58] | 70 | 90 | 1 | 810 | 3 | 0.313657 |
| MFO [59] | 70 | 90 | 1 | 910 | 3 | 0.313656 |
| NSGA-II [60] | 70 | 90 | 1.5 | 1000 | 3 | 0.470400 |
| MVO [61] | 70 | 90 | 1 | 910 | 3 | 0.313656 |
| CMVO [61] | 70 | 90 | 1 | 910 | 3 | 0.313656 |
| WCA [62] | 70 | 90 | 1 | 910 | 3 | 0.313656 |
| APSO [63] | 76 | 96 | 1 | 840 | 3 | 0.337181 |
| IAPSO [63] | 70 | 90 | 1 | 900 | 3 | 0.31365661 |
| DAPSO-GA [63] | 70 | 90 | 1 | 1000 | 3 | 0.31365661 |
| FSO [64] | 70 | 90 | 1 | 870 | 3 | 0.31365661053 |
| GOA [65] | 71 | 92 | 1 | 835 | 3 | 0.3355146 |
| EOBL-GOA [65] | 70 | 90 | 1 | 984 | 3 | 0.31365661053 |
| ABC [66] | 70 | 90 | 1 | 790 | 3 | 0.313657 |
| PSO [66] | 70 | 90 | 1 | 860 | 3 | 0.3136566 |
| CS [66] | 70 | 90 | 1 | 810 | 3 | 0.3136566 |
| GSA [66] | 72 | 92 | 2 | 815 | 3 | 0.3175771 |
| AEO [66] | 70 | 90 | 1 | 810 | 3 | 0.3136566 |
| AHA [67] | 70 | 90 | 1 | 840 | 3 | 0.3136566 |
| HBO [68] | 70 | 90 | 1 | 1000 | 3 | 0.3136566 |
| HGS [69] | 70 | 90 | 1 | 1000 | 3 | 0.313657 |
| I-ABC [70] | 70 | 90 | 1 | 900 | 3 | 0.313766 |
| MRFO [71] | 70 | 90 | 1 | 835 | 3 | 0.3136566 |
| GA [71] | 72 | 92 | 1 | 918 | 3 | 0.321498 |
| DE [71] | 71 | 92 | 1 | 835 | 3 | 0.3355146 |
| RSO [32] | 70 | 90 | 1 | 810 | 3 | 0.313657 |
| SCGCRA | 70 | 90 | 1 | 600 | 2 | 0.235247 |
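The weights in Table 8 follow from the standard mass objective of the clutch brake, π·(r0² − ri²)·t·(Z + 1)·ρ with material density ρ = 7.8×10^−6 kg/mm³ (an assumption here; note the actuating force F does not appear in the objective, only in the constraints). A sketch showing how the two weight levels in the table arise:

```python
import math

def clutch_weight(ri, ro, t, z, rho=7.8e-6):
    # mass of the multiple-disk clutch brake: pi*(ro^2 - ri^2)*t*(Z+1)*rho
    return math.pi * (ro ** 2 - ri ** 2) * t * (z + 1) * rho

w_ref = clutch_weight(70, 90, 1, 3)  # the widely reported 0.3136566 design
w_new = clutch_weight(70, 90, 1, 2)  # SCGCRA's Z = 2 design in Table 8
```

The lower SCGCRA weight comes entirely from dropping one friction surface (Z = 2 instead of 3); whether that design remains feasible under the torque and temperature constraints is not verifiable from the table alone.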
Table 9. Contrastive results of the rolling element bearing design.

| Algorithm | Dm | Db | Z | fi | fo | KDmin | KDmax | ε | e | ζ | Optimum Cost |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| HHO [33] | 125 | 21 | 11.09207 | 0.515 | 0.515 | 0.4 | 0.6 | 0.3 | 0.050474 | 0.6 | 83,011.88 |
| RSA [33] | 125.1722 | 21.29734 | 10.88521 | 0.515253 | 0.517764 | 0.41245 | 0.632338 | 0.301911 | 0.024395 | 0.6024 | 83,486.64 |
| RSO [72] | 125 | 21.41769 | 10.94027 | 0.515 | 0.515 | 0.4 | 0.7 | 0.3 | 0.02 | 0.6 | 85,069.021 |
| EPO [63] | 125 | 21.4189 | 10.94113 | 0.515 | 0.515 | 0.4 | 0.7 | 0.3 | 0.02 | 0.6 | 85,067.983 |
| ESA [63] | 125 | 21.4175 | 10.94109 | 0.51 | 0.515 | 0.4 | 0.7 | 0.3 | 0.02 | 0.6 | 85,070.085 |
| HSCAHS [63] | 125 | 10.5 | 4 | 0.515 | 0.515 | 0.4 | 0.6 | 0.3 | 0.02 | 0.6 | 85,539.192 |
| SSA [63] | 125 | 20.77562 | 11.01247 | 0.515 | 0.515 | 0.5 | 0.61397 | 0.3 | 0.05004 | 0.61001 | 82,773.982 |
| PSOGSA [73] | 125.008533 | 21.112638 | 11.062267 | 0.515 | 0.5195993 | 0.40487643 | 0.6032501 | 0.3 | 0.1 | 0.70371278 | 83,650.9164 |
| WOA [73] | 125.100734 | 21.4233 | 10.95119 | 0.515 | 0.515 | 0.4 | 0.7 | 0.314216 | 0.02 | 0.6 | 85,265.167 |
| HGSA [73] | 125.708006 | 21.4233005 | 10.999978 | 0.515 | 0.515 | 0.5 | 0.7 | 0.300304 | 0.0271098 | 0.6 | 85,532.7227 |
| ACVO [73] | 125.70959 | 21.4232997 | 11.000104 | 0.515 | 0.515 | 0.48352698 | 0.61821897 | 0.3002753 | 0.02 | 0.6478817 | 85,533.4103 |
| AFT [74] | 125 | 21.418 | 11.356 | 0.515 | 0.515 | 0.4 | 0.68 | 0.3 | 0.02 | 0.622 | 85,206.641 |
| AHA [67] | 125.718411 | 21.42535 | 10.527979 | 0.515 | 0.515155 | 0.470216 | 0.640818 | 0.300012 | 0.095122 | 0.682241 | 85,547.49822 |
| HBO [68] | 125.7227184 | 21.42331 | 11 | 0.515 | 0.515 | 0.438476 | 0.699998 | 0.3 | 0.047532 | 0.601081 | 85,533.18 |
| HPO [70] | 125 | 21.875 | 10.777 | 0.515 | 0.515 | 0.4 | 0.7 | 0.3 | 0.029 | 0.6 | 83,918.4925 |
| MRFO [71] | 125.7190556 | 21.4255902 | 11 | 0.515 | 0.515 | 0.4050856 | 0.6905778 | 0.3 | 0.0536602 | 0.6925802 | 85,549.239 |
| CS [71] | 125.442787 | 21.205159 | 11 | 0.515 | 0.5416852 | 0.5 | 0.7 | 0.3 | 0.0975781 | 0.6015492 | 83,988.259 |
| RUN [75] | 125.2142 | 21.59796 | 11.4024 | 0.515 | 0.515 | 0.40059 | 0.61467 | 0.3053 | 0.02 | 0.63665 | 83,680.47 |
| ARO [51] | 125.7189 | 21 | 10.5403 | 0.515 | 0.515 | 0.4459 | 0.672132 | 0.3 | 0.0825 | 0.6317 | 85,548.5106 |
| MGA [76] | 125.718 | 21.8745119 | 10.7770658 | 0.51500082 | 0.51500299 | 0.405908353 | 0.65558802 | 0.30000415 | 0.07754492 | 0.6 | 83,912.87983 |
| CGO [76] | 125 | 21.875 | 10.777009 | 0.515 | 0.515 | 0.4 | 0.64620052 | 0.3 | 0.050152445 | 0.6 | 83,918.49253 |
| EVO [76] | 125.7190556 | 21.4255902 | 10.6955328 | 0.515 | 0.515 | 0.463182936 | 0.6999265 | 0.3 | 0.063431519 | 0.604213108 | 81,859.7415974 |
| SELO [77] | 126.3521 | 21.0299 | 11 | 0.515 | 0.515 | 0.4 | 0.6011 | 0.3 | 0.1 | 0.6004 | 83,805.29 |
| LFD [77] | 126.3999 | 21 | 11 | 0.515 | 0.5251 | 0.5 | 0.6 | 0.3 | 0.1 | 0.6 | 83,670.78 |
| SETO [77] | 125.7227 | 21.4233 | 11 | 0.515 | 0.515 | 0.4 | 0.7 | 0.3 | 0.1 | 0.6 | 85,539.19 |
| SCGCRA | 126.2339 | 20.1947 | 10.5139 | 0.5524 | 0.5428 | 0.4072 | 0.6565 | 0.3254 | 0.0681 | 0.6142 | 90,020.39 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Zhang, J.; Jin, A.; Zhang, T. A Hybrid Nonlinear Greater Cane Rat Algorithm with Sine–Cosine Algorithm for Global Optimization and Constrained Engineering Applications. Biomimetics 2025, 10, 629. https://doi.org/10.3390/biomimetics10090629
