Article

An Enhanced Symmetric Sand Cat Swarm Optimization with Multiple Strategies for Adaptive Infinite Impulse Response System Identification

School of Electrical and Photoelectronic Engineering, West Anhui University, Lu’an 237012, China
*
Author to whom correspondence should be addressed.
Symmetry 2024, 16(10), 1255; https://doi.org/10.3390/sym16101255
Submission received: 19 July 2024 / Revised: 12 September 2024 / Accepted: 19 September 2024 / Published: 24 September 2024
(This article belongs to the Section Engineering and Materials)

Abstract

An infinite impulse response (IIR) system may comprise a multimodal error surface, and accurately discovering the appropriate filter parameters for system modeling remains complicated. Swarm intelligence algorithms estimate the IIR filter’s parameters by exploring parameter domains and exploiting acceptable filter sets. This paper presents an enhanced symmetric sand cat swarm optimization with multiple strategies (MSSCSO) to achieve adaptive IIR system identification. The principal objective is to recognize the most appropriate regulating coefficients and to minimize the mean square error (MSE) between an unidentified system’s output and the IIR filter’s output. The MSSCSO with symmetric cooperative swarms integrates the ranking-based mutation operator, the elite opposition-based learning strategy, and the simplex method to capture supplementary advantages, disrupt regional extreme solutions, and identify the finest potential solutions. The MSSCSO not only balances extensive exploration and exploitation to refrain from precocious convergence and foster computational efficiency; it also maintains robustness and reliability to facilitate demographic variability and elevate estimation precision. The experimental results manifest that the practicality and feasibility of the MSSCSO are superior to those of other methods in terms of convergence speed, calculation precision, detection efficiency, regulating coefficients, and MSE fitness value.

1. Introduction

The IIR filter leverages simplified scheme orders, fewer adjustable coefficients, and augmented spectrum allocation to accomplish high measurement precision and efficient convergence performance; it is frequently utilized in monitoring systems, wireless communication, signal manipulation, geographical exploration, noise mitigation, and software-driven equalization. The error cost of the flexible IIR system identification is characterized by nonlinearity and non-differentiability. Numerous optimization techniques have recently been executed to construct the IIR filter system. The conventional design techniques utilize the Butterworth or Chebyshev Type I or Type II to design the IIR filter, which employs the bilinear transformation approach to convert the analog filter (s-domain) to the digital filter (z-domain). However, this technique has severe demerits in terms of the frequency-distorting effect, quantization errors, and nonlinear phase errors [1,2,3].
Later, gradient-based techniques such as Newton’s method, the quasi-Newton method, the simplex technique, least mean square (LMS), linear programming, and least squares and its enhanced variants were used to resolve the IIR filter and minimize the error fitness value [4,5,6]. Most adaptive IIR filters are associated with nonlinear and multimodal fitness errors. These techniques are reliable and stable for the parameter estimation of a linear IIR system, provided the objective function between an unidentified system’s output and the IIR filter’s output is unimodal. However, these techniques have limitations: inefficient initial population quality, sluggish convergence acceleration, restricted evaluation precision, easy precocious convergence, slow detection efficiency, and combinatorial explosion. Meanwhile, these techniques exhibit some inherent demerits: the necessity of a continuous and differentiable fitness function, a tendency to recognize the regional extreme solution or retrace a sub-optimal solution, an inability to explore the enormous search domain, the constraint of the piecewise linear cost approximation (linear programming), and starting points becoming exceedingly sensitive as the solution variables increase [7,8]. They are uncertain and unstable when designing the higher-order systems of the IIR filter because the poles may relocate to the exterior of the unit circle.
Some swarm intelligence algorithms are applied to resolve the nonlinear, non-differentiable, multimodal, non-convex, and non-quadratic IIR system identification to overcome the above demerits. The swarm intelligence algorithms imitate the macro-intelligence characteristics and learning methods of groups of animals in nature, utilizing cooperation and information exchange to attain the most affordable and achievable solution. These algorithms exhibit solid parallel processing performance, global exploration ability, search efficiency, and strong adaptability to handle super-complex, large-scale, high-dimensional problems [9]. These algorithms establish single-objective and multi-objective mathematical models according to the different assessment environments and research objectives, which utilize different constraints and evaluation indicators to validate sustainability and feasibility and establish the most satisfactory value. These algorithms converge to the finest potential solution with faster detection efficiency and inhibit precocious convergence with superior escape ability. They ensure formidable discovery and extraction to recognize the most appropriate regulating coefficients and minimize the MSE.
The basic swarm intelligence algorithms are applied to resolve the IIR system identification and recognize the most appropriate regulating coefficients. Mahata et al. pursued a manta ray foraging approach to resolve the IIR system identification. The pole relating and initialization schedule ensured stability and reliability, which utilized chain, cyclone, and somersault scavenging to attain an attractive scavenging tactic and exemplary extraction efficiency to validate coherence and convergence. Still, it was sensitive to the selection of the fitness function, and its robustness, resolution speed, and computational efficiency required improvement [10]. Izci et al. conducted a whale optimization approach to resolve IIR system identification, and this approach utilized circumnavigating prey, bubble-net assaulting, and pursuit for prey to stabilize localized exploitation and broad exploration to recognize the expansive accurate solution and the most appropriate regulation variables. The exploration and exploitation analysis of the proposed approach needed further research and discussion [11]. Singh et al. established a teaching-learning procedure to resolve the IIR system identification, and this approach utilized the average difference to maintain diversity and robustness, advance the extraction operation, and disrupt regional extreme solutions to identify the most satisfactory solution. The regulating coefficients, mean square error, convergence accuracy, and detection efficiency of the proposed approach needed to be further enhanced in addressing the higher-order IIR system identification [12]. Wang et al. explained an artificial fish swarm approach to resolve the IIR system identification, and this approach exhibited strong search ability and optimization efficiency to attain a lower ameliorated system error and greater estimated precision.
However, the approach’s detection accuracy and convergence efficiency were associated with the appropriate selection of the control parameters, and its computational complexity was huge [13]. Ekinci et al. explored the atom search optimization to resolve IIR filter identification; this approach filtrated the finest potential solution, fostered selection probability, quickened the convergence procedure, and bolstered localized exploitation, so that it maintained instructive sustainability and adaptability to prohibit search standstill and determine the finest global solution. The exploration and exploitation, diversity analysis, and hyperparameter tuning of the proposed algorithm needed further analysis and research [14].
However, the original algorithms may exhibit the demerits of slow convergence rate, low evaluation precision, and easy precocious convergence. To enhance convergence accuracy and detection efficiency, swarm intelligence algorithms with ensemble strategies are applied to resolve the identification of IIR systems. Zhang et al. devised a self-adaptation hybridized cross-mutation slime mold approach to resolve the IIR system identification; the Cauchy mutation operator augmented the mutation ability and evolutionary directions, the crossover rate equilibrium with differential vector information compensated for an individual-fitness relationship to expand the detection process, and the self-adaptation resume hybrid opposition learning elevated population diversity and alleviated search stagnation. This approach promoted and maintained instructive competitiveness and effectiveness to bolster resolution accuracy, but it remained a single-objective approach with certain limitations in solving multi-objective issues [15]. Su et al. exploited the multi-objective particle swarm optimization to resolve the IIR system identification, and the approach incorporated the Pareto frontier efficiency and global detection ability to balance passband and stopband filter orders and qualitative concerns. The proposed approach significantly reduced the passband oscillation and maximized stopband suppression to ensure the predetermined indicators of the interrupted frequency and the IIR system order, which exhibited substantial sustainability and dependability in accomplishing the minimal fitness and facilitating regulation variables [16]. Rizk-Allah et al. developed sophisticated artificial rabbit optimization to solve the IIR filter identification.
The adaptive local search mechanism reduced accuracy loss, enhanced local exploitation accuracy, and improved the optimization process, and the experience-based perturbed learning strategy promoted detection efficiency, increased population diversity, and avoided search stagnation. The sustainability and adaptability of the proposed approach were superior to those of other approaches in terms of the MSE fitness value, convergence accuracy, regulating coefficients, and evaluation efficiency [17]. Zhang et al. implemented an upgraded GJO to resolve the IIR filter identification, aiming to achieve the most appropriate regulating coefficients and minimize the mean square error. The elite opposition-based learning advanced demographic variety, promoted discovery efficiency, extended the detection domain, and avoided anticipation stagnation. The simplex approach accelerated the search velocity, enhanced the extraction ability, fostered the estimation precision, and advanced the exploitation depth. The proposed approach advanced the extraction operation and expedited evaluation efficiency to feature more favorable regulation variables and a more formidable convergence solution [18]. Niu et al. utilized an altered artificial ecosystem approach with a dynamical antagonistic learning tactic and the nonlinear adaptive weight coefficient to resolve IIR filter identification, and this approach facilitated population variety, fostered the extraction procedure, expedited evaluation efficiency, refrained from precocious convergence, and bolstered broad exploration to maintain the finest potential solution [19].
The hybrid swarm intelligence algorithms are applied to bolster the exploration and exploitation and minimize the mean square error to resolve the IIR system identification. Kumar et al. incorporated a particle swarm optimization with a grasshopper optimization method to resolve IIR system identification; this approach exhibited remarkable intuitiveness and simplicity in identifying intelligent collaborative detection, accelerating exploitation efficiency, and facilitating solution accuracy, achieving a compensated estimation error. However, the proposed approach had limitations in the investigation research, parameter tuning process, computational complexity, and exploration and exploitation analysis [20]. Kaur et al. merged the chimp optimization approach with the cuckoo search approach to resolve IIR system identification; this approach combined exploration and exploitation to display sustainability and superiority in acquiring the most favorable evaluation solution and adjustment factors. However, the proposed approach’s convergence speed, estimation precision, and detection efficiency mainly depended on the appropriate selection of the regulating coefficients [21]. Ekinci et al. explored a cooperative technique with a gazelle optimization technique and simulated annealing method to resolve the IIR system identification; this approach not only achieved complementary advantages to avoid anticipation stagnation and enhance the detection efficiency, but also integrated prospecting and extracting to attain extreme anticipation efficiency and identification precision.
Further research is needed to verify the stability and robustness of the control parameter adjustment and explore the adaptive technique of the proposed approach [22]. Ekinci et al. established a pattern search ameliorated arithmetic optimization technique to resolve the IIR system identification, and this approach utilized exploration and exploitation to fortify trustworthiness and dependability in accomplishing the superior regulation indicators and the highest-quality solution. The specific control parameters are associated with evaluation efficiency and estimation precision [23]. Liang et al. juxtaposed functional inequality-constrained optimization with a genetic algorithm to resolve IIR filter identification; this approach advanced initial population quality, escalated recognition depth, disrupted regional extreme solution, advanced extraction operation, and elevated estimation precision to improve convergence productivity and accomplished a more accurate solution [24].
Table 1 summarizes the brief literature review of existing methods for IIR system identification.
The SCSO is motivated by the distinguishable properties of sand cats and mimics the surveillance and neutralization of prey to recognize the low-frequency noise, the prey location, and the most appropriate solution [25]. The basic SCSO exhibits deficiencies, such as inefficient initial population quality, sluggish convergence acceleration, and restricted evaluation precision. Therefore, an enhanced symmetric SCSO with multiple strategies (MSSCSO) is presented to accomplish the benchmark functions and adaptive IIR filter identification. The no-free-lunch (NFL) theorem stipulates that no swarm intelligence algorithm can resolve every optimization-related difficulty, which motivates us to establish an enhanced symmetric SCSO. The main contributions are summarized as follows: (1) The ranking-based mutation operator, elite opposition-based learning strategy, and simplex method are introduced into the basic SCSO to accelerate convergence efficiency and advance estimation accuracy. (2) The ranking-based mutation operator filtrates the finest potential solution, fosters selection probability, quickens the convergence procedure, and bolsters localized exploitation. (3) The elite opposition-based learning strategy facilitates demographic variability, upgrades the searched area, expedites evaluation efficiency, refrains from precocious convergence, and bolsters broad exploration. (4) The simplex method advances initial population quality, escalates recognition depth, disrupts regional extreme solutions, advances the extraction operation, and elevates estimation precision. (5) The MSSCSO is evaluated on the CEC 2022 test suite and three sets of IIR filter identification problems and compared with recently published approaches. (6) The MSSCSO exhibits strong stability and robustness, balancing exploration and exploitation to determine a swifter convergence velocity, greater exploration accuracy, lower standard deviation, and more accurate fitness.
The subsequent sections constitute this article. Section 2 reviews SCSO. Section 3 extends MSSCSO. Section 4 summarizes the simulation test and result analysis for benchmark functions. Section 5 conveys MSSCSO-based adaptive IIR system identification. Section 6 discusses the exploration and exploitation analysis and the diversity analysis. Section 7 outlines the conclusion and future research.

2. SCSO

The SCSO is motivated by the sand cat’s extraordinary features, imitating the surveillance and neutralization of prey to recognize expansive, accurate solutions by detecting low-frequency noise and the prey’s location. Figure 1 articulates the sand cat’s distinguishable properties: living, searching, and hunting.

2.1. Initial Population

The initialization matrix is constructed as follows:
$$\mathrm{SandCat}_i = \begin{bmatrix} x_{1,1} & x_{1,2} & \cdots & x_{1,d} \\ x_{2,1} & x_{2,2} & \cdots & x_{2,d} \\ \vdots & \vdots & \ddots & \vdots \\ x_{n,1} & x_{n,2} & \cdots & x_{n,d} \end{bmatrix}$$
where $x_{i,j}$ signifies the jth dimension of the ith sand cat.
The calculation matrix of fitness is constructed as follows:
$$\mathrm{fitness} = \begin{bmatrix} f(x_{1,1}, x_{1,2}, \ldots, x_{1,d}) \\ f(x_{2,1}, x_{2,2}, \ldots, x_{2,d}) \\ \vdots \\ f(x_{n,1}, x_{n,2}, \ldots, x_{n,d}) \end{bmatrix}$$
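The two matrices above can be sketched in a few lines of Python (a hypothetical illustration; the names `init_population`, `evaluate`, and the sphere objective are ours, not from the paper):

```python
import random

def init_population(n, d, lb, ub, seed=0):
    """Build the n-by-d sand cat matrix with uniform positions in [lb, ub]."""
    rng = random.Random(seed)
    return [[lb + (ub - lb) * rng.random() for _ in range(d)] for _ in range(n)]

def evaluate(pop, f):
    """Fitness column vector: one objective value per sand cat."""
    return [f(x) for x in pop]

# Sphere function as a stand-in objective for demonstration.
sphere = lambda x: sum(v * v for v in x)
pop = init_population(n=5, d=3, lb=-10.0, ub=10.0)
fits = evaluate(pop, sphere)
```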

2.2. Searching for Prey (Exploration)

The sand cat typically exploits low-frequency noise detection as an essential component of identifying and acquiring prey. The sensitivity $r_G$ decreases linearly from 2 to 0, which encourages the sand cat to gradually approach and capture prey without scaring it away or forfeiting it. The $r_G$ is constructed as follows:
$$r_G = S_M - \frac{S_M \times iter_c}{iter_{\max}}$$
where $S_M = 2$ signifies the auditory traits, $iter_c$ signifies the current iteration, and $iter_{\max}$ signifies the maximum iteration.
The R and r are constructed as follows:
$$R = 2 \times r_G \times rand(0,1) - r_G$$
$$r = r_G \times rand(0,1)$$
The position vector $Pos$ is constructed as follows:
$$Pos(t+1) = r \cdot (Pos_{bc}(t) - rand(0,1) \cdot Pos_c(t))$$
where $Pos_{bc}$ signifies the optimal position and $Pos_c$ signifies the current position.
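The search move defined by the formulas above can be condensed into a short Python sketch (the helper name `exploration_step` is ours; the linear decay of $r_G$ and the blend of the best and current positions follow the equations in this subsection):

```python
import random

def exploration_step(pos_c, pos_bc, iter_c, iter_max, s_m=2.0, rng=random):
    """One SCSO searching move: r_G decays linearly from s_m to 0, and
    r = r_G * rand(0,1) scales a step toward a blend of the best position
    pos_bc and the current position pos_c."""
    r_g = s_m - (s_m * iter_c) / iter_max
    r = r_g * rng.random()
    return [r * (b - rng.random() * c) for b, c in zip(pos_bc, pos_c)]

# At the final iteration r_G = 0, so the step collapses to the zero vector.
final = exploration_step([1.0, 2.0], [3.0, 4.0], iter_c=10, iter_max=10)
```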

2.3. Attacking on Prey (Exploitation)

Each sand cat migrates in an alternative circumferential direction of the exploitation region if the sensitivity region is circle-shaped. Figure 2 articulates the position refreshing procedure of the SCSO. The SCSO deploys Roulette Wheel Selection to identify an arbitrary angle θ to investigate the harvesting position. This quickens the convergence procedure, expedites evaluation efficiency, refrains from precocious convergence, and elevates estimation precision. The positions are constructed as follows:
$$Pos_{rnd} = |rand(0,1) \cdot Pos_b(t) - Pos_c(t)|$$
$$Pos(t+1) = Pos_b(t) - r \cdot Pos_{rnd} \cdot \cos(\theta)$$
where $Pos_{rnd}$ signifies the separation between the current position $Pos_c$ and the ideal position $Pos_b$, and $\theta \in [0^\circ, 360^\circ]$ signifies a random angle, so $\cos\theta \in [-1, 1]$.
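The attack move can likewise be sketched in Python (the name `exploitation_step` is ours; the gap term and the cosine-scaled step follow the two equations above):

```python
import math
import random

def exploitation_step(pos_c, pos_b, r, theta_deg, rng=random):
    """One SCSO attacking move: Pos_rnd = |rand(0,1)*Pos_b - Pos_c| measures
    the gap to the best position, and the new position steps away from Pos_b
    along the roulette-wheel angle theta."""
    theta = math.radians(theta_deg)
    return [b - r * abs(rng.random() * b - c) * math.cos(theta)
            for b, c in zip(pos_b, pos_c)]

# With theta = 90 degrees, cos(theta) is ~0, so the cat lands on the best position.
new = exploitation_step([0.5, 0.5], [1.0, 1.0], r=1.0, theta_deg=90.0)
```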

2.4. Exploration and Exploitation

The SCSO seamlessly switches between localized exploitation and broad exploration to expedite evaluation efficiency and attain the finest potential solution. When $|R| \le 1$, the SCSO instructs the sand cat to attack prey; it utilizes Roulette Wheel Selection to exploit and refrain from precocious convergence. When $|R| > 1$, the SCSO instructs the sand cat to determine a new potential solution, which operates a prey-seeking procedure to explore and pursue prey. Figure 3 articulates attacking prey (exploitation) versus searching for prey (exploration). The positions are constructed as follows:
$$X(t+1) = \begin{cases} Pos_b(t) - r \cdot Pos_{rnd} \cdot \cos(\theta), & |R| \le 1 \ \text{(exploitation)} \\ r \cdot (Pos_{bc}(t) - rand(0,1) \cdot Pos_c(t)), & |R| > 1 \ \text{(exploration)} \end{cases}$$
Algorithm 1 emphasizes the pseudocode of SCSO.
Algorithm 1 SCSO
Begin
Step 1. Randomly initialize the population,  r , r G and R
Step 2. Ascertain the sand cat’s fitness and identify the finest prey P o s b c
Step 3. while ( t < T ) do
     for each sand cat do
      Accomplish a randomized angle via Roulette Wheel Selection (0° ≤ θ ≤ 360°)
     if (|R| ≤ 1) then
      Renew position via Equation (8)
     else
      Renew position via Equation (6)
     end if
    end for
     t = t + 1
    end while
    Restore the finest prey P o s b c
End
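Algorithm 1 can be sketched end to end in Python (a minimal illustration, not the authors' code: the sphere objective, seeding, and boundary clipping are our own assumptions):

```python
import math
import random

def scso(f, n, d, lb, ub, iter_max, s_m=2.0, seed=1):
    """Minimal SCSO sketch following Algorithm 1: r_G decays linearly;
    |R| <= 1 triggers the attacking move (Eq. 8), |R| > 1 the searching
    move (Eq. 6). Positions are clipped to [lb, ub]."""
    rng = random.Random(seed)
    pop = [[lb + (ub - lb) * rng.random() for _ in range(d)] for _ in range(n)]
    best = min(pop, key=f)[:]
    for t in range(iter_max):
        r_g = s_m - (s_m * t) / iter_max
        for i, cat in enumerate(pop):
            r = r_g * rng.random()
            big_r = 2.0 * r_g * rng.random() - r_g
            theta = math.radians(rng.uniform(0.0, 360.0))
            if abs(big_r) <= 1:   # attacking prey (exploitation)
                new = [b - r * abs(rng.random() * b - c) * math.cos(theta)
                       for b, c in zip(best, cat)]
            else:                 # searching for prey (exploration)
                new = [r * (b - rng.random() * c) for b, c in zip(best, cat)]
            pop[i] = [min(max(v, lb), ub) for v in new]
            if f(pop[i]) < f(best):
                best = pop[i][:]
    return best, f(best)

best, val = scso(lambda x: sum(v * v for v in x), n=20, d=5,
                 lb=-10.0, ub=10.0, iter_max=200)
```

On the sphere function the decaying sensitivity drives the swarm toward the origin, so the returned fitness shrinks steadily as the iterations proceed.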

3. MSSCSO

To advance initial population quality, convergence acceleration, evaluation precision, and solution efficiency, the expanded MSSCSO maintains instructive sustainability and adaptability to realize supplementary advantages and refrain from precocious convergence, and it integrates localized exploitation and broad exploration to expedite evaluation efficiency and elevate estimation precision.

3.1. Ranking-Based Mutation Operator

This strategy filtrates the finest potential solution, fosters selection probability, quickens the convergence procedure, and bolsters localized exploitation. The most desirable prey is assigned the finest search agent, and the fitness is sorted from best to poorest [26,27,28]. The ranking of the ith sand cat is constructed as follows:
$$R_i = N_p - i, \quad i = 1, 2, \ldots, N_p$$
where $N_p$ signifies the population size. The selection probability $p_i$ is constructed as follows:
$$p_i = \frac{R_i}{N_p}, \quad i = 1, 2, \ldots, N_p$$
Algorithm 2 emphasizes the ranking-based mutation operator of “DE/rand/1”. Sand cats with the highest ranking have a higher probability of being selected as the base vector or terminal vector, and the intention is to transmit the population’s genetic material to its descendants. The ranking-based mutation operator refrains from employing population sorting to ascertain the selection probability of the starting vector, because the diversity of the associated discovery procedure could dramatically diminish and facilitate greedy convergence if two differential variation vectors were extracted from the more prestigious vectors.
Algorithm 2 Ranking-based mutation operator of “DE/rand/1”
Begin
Sort the population and distribute the selection probability p_i
Randomly select r1 ∈ [1, Np] {base vector index}
while rand > p_r1 or r1 == i
Randomly select r1 ∈ [1, Np]
end
Randomly select r2 ∈ [1, Np] {terminal vector index}
while rand > p_r2 or r2 == r1 or r2 == i
Randomly select r2 ∈ [1, Np]
end
Randomly select r3 ∈ [1, Np] {starting vector index}
while r3 == r2 or r3 == r1 or r3 == i
Randomly select r3 ∈ [1, Np]
end
End
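The index selection of Algorithm 2 can be sketched as follows (a 0-indexed Python adaptation; the function names are ours, and the rank-based acceptance test mirrors the `rand > p_r` loops):

```python
import random

def ranking_probabilities(n_p):
    """p_i = R_i / N_p with R_i = N_p - i, for a population sorted
    best-first (0-indexed: the best individual gets the largest p)."""
    return [(n_p - 1 - i) / n_p for i in range(n_p)]

def pick_index(i, n_p, probs, taken, rank_based, rng):
    """Draw an index distinct from i and earlier picks; when rank_based,
    keep a candidate r only with probability probs[r]."""
    while True:
        r = rng.randrange(n_p)
        if r == i or r in taken:
            continue
        if not rank_based or rng.random() <= probs[r]:
            return r

def rand1_indices(i, n_p, rng):
    """Base (r1) and terminal (r2) vectors are rank-based picks; the
    starting vector (r3) is chosen uniformly, as in Algorithm 2."""
    probs = ranking_probabilities(n_p)
    r1 = pick_index(i, n_p, probs, set(), True, rng)
    r2 = pick_index(i, n_p, probs, {r1}, True, rng)
    r3 = pick_index(i, n_p, probs, {r1, r2}, False, rng)
    return r1, r2, r3

r1, r2, r3 = rand1_indices(2, 10, random.Random(0))
```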

3.2. Elite Opposition-Based Learning Strategy

This strategy facilitates demographic variability, upgrades the searched area, expedites evaluation efficiency, refrains from precocious convergence, and bolsters broad exploration. This approach specifies the most advantageous solution to serve as the search agent and executes the exploratory procedure by assessing both the acceptable and the opposite solutions. A sand cat with the finest fitness is considered an elite solution $x_e = (x_{e,1}, x_{e,2}, \ldots, x_{e,D})$ [29,30,31]. For the satisfactory solution $x_i = (x_{i,1}, x_{i,2}, \ldots, x_{i,D})$, the opposite solution $x'_i = (x'_{i,1}, x'_{i,2}, \ldots, x'_{i,D})$ is constructed as follows:
$$x'_{i,j} = k \cdot (da_j + db_j) - x_{e,j}, \quad i = 1, 2, \ldots, n; \ j = 1, 2, \ldots, D$$
where $n$ signifies the population size, $D$ signifies the dimension, $k \in (0,1)$ is a random value, and $da_j$ and $db_j$ signify the dynamic boundaries, which are constructed as follows:
$$da_j = \min(x_{i,j}), \quad db_j = \max(x_{i,j})$$
The dynamic boundary may alter the opposite solution’s discovery region and preserve the most outstanding solution. The individual $x'_{i,j}$ is reset as follows:
$$x'_{i,j} = rand(da_j, db_j), \quad \text{if } x'_{i,j} < da_j \ \text{or} \ x'_{i,j} > db_j$$
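The strategy can be sketched in Python, following the section's equations literally (opposites are built from the elite $x_e$ inside the dynamic bounds, and out-of-bound coordinates are resampled; the name `elite_opposition` is ours):

```python
import random

def elite_opposition(pop, elite, rng=random):
    """Elite opposition-based learning sketch: for dynamic bounds
    da_j = min_i x_{i,j}, db_j = max_i x_{i,j}, each opposite coordinate
    is k*(da_j + db_j) - x_{e,j}; coordinates falling outside [da_j, db_j]
    are resampled uniformly inside the bounds."""
    d = len(elite)
    da = [min(x[j] for x in pop) for j in range(d)]
    db = [max(x[j] for x in pop) for j in range(d)]
    opposites = []
    for _ in pop:
        k = rng.random()
        xo = [k * (da[j] + db[j]) - elite[j] for j in range(d)]
        opposites.append([v if da[j] <= v <= db[j]
                          else rng.uniform(da[j], db[j])
                          for j, v in enumerate(xo)])
    return opposites

pop = [[1.0, 4.0], [3.0, 2.0], [2.0, 6.0]]
opp = elite_opposition(pop, elite=[1.0, 4.0])
```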

3.3. Simplex Method

This strategy advances initial population quality, escalates recognition depth, disrupts regional extreme solutions, advances extraction operation, and elevates estimation precision, which identifies the optimality of a solution by replacing the original solution with a feasible solution [32,33,34]. Figure 4 articulates the simplex method schematically.
  • Step 1. Ascertain the fitness, filter out the adequate solution $X_g$, the inadequate solution $X_b$, and the renewed solution $X_s$, and ascertain the fitnesses $f(X_g)$, $f(X_b)$, and $f(X_s)$.
  • Step 2. Ascertain the midpoint $X_c$:
    $$X_c = \frac{X_g + X_b}{2}$$
  • Step 3. Ascertain the reflection point $X_r$ and the fitness $f(X_r)$:
    $$X_r = X_c + \alpha (X_c - X_s)$$
    where $\alpha = 1$ signifies the reflection coefficient.
  • Step 4. If $f(X_r) < f(X_g)$, the extension operation is computed as follows:
    $$X_e = X_c + \gamma (X_r - X_c)$$
    where $\gamma = 2$ signifies the extension factor. If $f(X_e) < f(X_g)$, the extension point $X_e$ replaces $X_s$; otherwise, $X_r$ replaces $X_s$.
  • Step 5. If $f(X_r) > f(X_s)$, the compression operation is constructed as follows:
    $$X_t = X_c + \beta (X_s - X_c)$$
    where $\beta = 0.5$ signifies the compression factor. If $f(X_t) < f(X_s)$, the compression point $X_t$ replaces $X_s$; otherwise, $X_r$ replaces $X_s$.
  • Step 6. If $f(X_g) < f(X_r) < f(X_s)$, the contraction point $X_w$ is computed as follows:
    $$X_w = X_c - \beta (X_s - X_c)$$
    where $\beta = 0.5$ signifies the contraction factor. If $f(X_w) < f(X_s)$, the contraction point $X_w$ replaces $X_s$; otherwise, $X_r$ replaces $X_s$.
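Steps 1-6 can be condensed into one update of the renewed solution $X_s$ (a minimal sketch with the section's default coefficients; the name `simplex_update` is ours):

```python
def simplex_update(x_g, x_b, x_s, f, alpha=1.0, gamma=2.0, beta=0.5):
    """One simplex refresh of the renewed point x_s: reflect through the
    midpoint of x_g and x_b, then expand, compress, or contract depending
    on how f(x_r) compares with f(x_g) and f(x_s)."""
    x_c = [(g + b) / 2.0 for g, b in zip(x_g, x_b)]          # Step 2
    x_r = [c + alpha * (c - s) for c, s in zip(x_c, x_s)]    # Step 3
    if f(x_r) < f(x_g):                                      # Step 4: expand
        x_e = [c + gamma * (r - c) for c, r in zip(x_c, x_r)]
        return x_e if f(x_e) < f(x_g) else x_r
    if f(x_r) > f(x_s):                                      # Step 5: compress
        x_t = [c + beta * (s - c) for c, s in zip(x_c, x_s)]
        return x_t if f(x_t) < f(x_s) else x_r
    # Step 6: f(x_g) < f(x_r) < f(x_s) -> contract
    x_w = [c - beta * (s - c) for c, s in zip(x_c, x_s)]
    return x_w if f(x_w) < f(x_s) else x_r

f = lambda x: sum(v * v for v in x)
new = simplex_update([0.1, 0.1], [0.2, 0.2], [2.0, 2.0], f)
```

For the sphere objective above the reflection is neither better than $X_g$ nor worse than $X_s$, so Step 6 contracts toward the midpoint and returns roughly $(-0.775, -0.775)$.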
Symmetry is an indispensable characteristic in methodologies and applications, and it is essential in multiple distinctive disciplines. Symmetry in algorithms frequently emphasizes patterns within data, streamlines complicated difficulties, and advances methodology efficiency. In this paper, we derive motivation from symmetric cooperative swarms. The entire sand cat swarm is separated into superior and inferior individuals, encouraging each individual to carry out multiple responsibilities and occupations and facilitating the investigation and utilization of the solution region.
Algorithm 3 emphasizes the pseudocode of MSSCSO.
Algorithm 3 MSSCSO
Begin
Step 1. Randomly initialize the population, r , r G and R
Step 2. Ascertain the sand cat’s fitness and identify the finest prey P o s b c
Step 3. while ( t < T ) do
     for each sand cat do
     Sort the population and distribute the selection probability P i
     /*ranking-based mutation stage*/
     Randomly select r1 ∈ [1, Np] {base vector index}
     while rand > p_r1 or r1 == i
     Randomly select r1 ∈ [1, Np]
     end
     Randomly select r2 ∈ [1, Np] {terminal vector index}
     while rand > p_r2 or r2 == r1 or r2 == i
     Randomly select r2 ∈ [1, Np]
     end
     Randomly select r3 ∈ [1, Np] {starting vector index}
     while r3 == r2 or r3 == r1 or r3 == i
     Randomly select r3 ∈ [1, Np]
     end
     /*end of ranking-based mutation stage*/
       Accomplish a randomized angle via Roulette Wheel Selection (0° ≤ θ ≤ 360°)
     if (|R| ≤ 1) then
         Renew position via Equation (8)
     else
         Renew position via Equation (6)
     end if
     Apply the elite opposition-based learning strategy and the simplex method to the exploration and exploitation phases of SCSO.
     end for
      t = t + 1
    end while
    Restore the finest prey P o s b c
End

4. Simulation Test and Result Analysis for Benchmark Functions

4.1. Experimental Setup

All experiments are executed on a Windows 10 PC with an Intel Core i7-8750H 2.2 GHz CPU, a GTX 1060 GPU, and 8 GB RAM. All approaches are written and run in MATLAB R2018b.

4.2. CEC 2022 Benchmark Functions

To assess the practicality and feasibility, the MSSCSO is used to resolve the function optimization. Table 2 summarizes the description of the CEC 2022 test suite [30,35].

4.3. Variables Setting

The MSSCSO is employed to resolve the CEC 2022, which is contrasted with bottlenose dolphin optimization (BDO) [36], Chornobyl disaster optimization (CDO) [37], golden jackal optimization (GJO) [38], Kepler optimization algorithm (KOA) [39], liver cancer algorithm (LCA) [40], spider wasp optimization (SWO) [41], rat swarm optimization (RSO) [42], hybrid leader-based optimization (HLBO) [43], waterwheel plant algorithm (WWPA) [44], sinh cosh optimization (SCHO) [45], and sand cat swarm optimization (SCSO) [25]. Each approach’s regulated variables are conventional speculative values extracted from the source papers.
BDO: accelerate factor $a_f = 3.5$, randomized factor $S_r = 0.8$, probability factor $p = 0.5$, positive constant $\theta_{\min} = 0.4$.
CDO: gamma speed $s_\gamma \in [1, 300000]$, beta speed $s_\beta \in [1, 270000]$, alpha speed $s_\alpha \in [1, 16000]$, randomized value $r \in (0, 1)$.
GJO: randomized value $rand \in [0, 1]$, randomized value $r \in [0, 1]$, positive constant $c_1 = 1.5$, randomized value $u \in (0, 1)$, randomized value $v \in (0, 1)$, positive constant $\beta = 1.5$.
KOA: positive constant $T = 3$, positive constant $\gamma = 15$, positive constant $\mu_0 = 0.1$.
LCA: randomized value $rd \in [0, 1]$, tumor $width \in [0, 1]$, tumor $depth \in [0, 1]$, tumor $length \in [0, 1]$, positive constant $f = 1$, radius $r \in [0, 1]$.
SWO: tradeoff rate $TR = 0.3$, crossover rate $CR = 0.2$, randomized value $l \in [-2, 1]$, randomized value $p \in [0, 1]$.
RSO: randomized value $R \in [1, 5]$, randomized value $C \in [0, 2]$.
HLBO: randomized value $r \in [0, 1]$, integer value $I \in \{1, 2\}$, positive constant $R = 0.2$.
WWPA: randomized value $r_1 \in [0, 2]$, randomized value $r_2 \in [0, 1]$, randomized value $r_3 \in [0, 2]$, exponential value $K \in [0, 1]$, randomized value $F \in [-5, 5]$, randomized value $C \in [-5, 5]$.
SCHO: positive constant $ct = 3.6$, randomized value $r \in [0, 1]$, sensitive factor $u = 0.388$, sensitive factor $m = 0.45$, tiny positive value $\varepsilon = 0.003$.
SCSO: sensitivity variable $r_G \in [0, 2]$, phase control variable $R \in [-2r_G, 2r_G]$.
MSSCSO: sensitivity variable $r_G \in [0, 2]$, phase control variable $R \in [-2r_G, 2r_G]$, randomized value $k \in (0, 1)$, reflectivity $\alpha = 1$, expansion coefficient $\gamma = 1.5$, compression coefficient $\beta = 0.5$, contraction coefficient $\beta = 0.2$, scaling factor $F = 0.7$.

4.4. Sensitivity Analysis

This section discusses the sensitivity of MSSCSO to hyperparameter tuning by varying the scaling factor $F$ while keeping the other parameters fixed.
Table 3 summarizes the sensitivity results of MSSCSO for the different scaling factors $F$. For each approach, the population size equals 50, the highest iteration equals 1000, and the number of independent runs equals 30. Best, Std, Mean, Median, and Worst signify the idealized value, standard deviation, average value, median value, and catastrophic value. To investigate the impact of the scaling factor $F$ on the convergence efficiency and estimation precision, the MSSCSO is carried out for several values of the scaling factor $F$ (0.1, 0.3, 0.5, 0.7, 0.9, 0.11, and 0.13) while keeping the other parameters unchanged. The MSSCSO is relatively sensitive to the scaling factor $F$: the difference between the results of different scaling factors is significant. When the scaling factor $F$ is set to 0.7, MSSCSO achieves excellent reliability and robustness to avoid anticipation stagnation and incorporates exploration and exploitation to achieve a swifter convergent velocity, greater explored accuracy, and more accurate fitness.

4.5. Simulation Test and Result Analysis

The MSSCSO is utilized to resolve the CEC 2022 test suite, which contains unimodal, essential, hybrid, and composite functions. The ranking is based on the catastrophic value.
Table 4 summarizes the simulation results of different approaches for the CEC 2022. To alleviate deficiencies of SCSO, such as inefficient initial population quality, sluggish convergence acceleration, and restricted evaluation precision, the ranking-based mutation operator, elite opposition-based learning strategy, and simplex method are incorporated into SCSO. For F1, F2, F3, F4, F5, and F10, the idealized values, standard deviations, average values, median values, and catastrophic values of the MSSCSO have been substantially improved; the MSSCSO exhibits instructive sustainability and adaptability to refrain from precocious convergence and recognize the finest potential solution. The idealized values, standard deviations, average values, median values, and catastrophic values of MSSCSO are superior to those of BDO, CDO, GJO, KOA, LCA, SWO, RSO, HLBO, WWPA, SCHO, and SCSO. The MSSCSO utilizes the ranking-based mutation operator to filtrate the finest potential solution, foster selection probability, quicken the convergence procedure, and bolster localized exploitation. The MSSCSO utilizes broad discovery and localized extraction to enhance convergence acceleration and obtain better assessment metrics. The MSSCSO exhibits the smallest standard deviations and the best ranking, and it has strong stability and reliability. For F6 and F8, the standard deviations of the MSSCSO are worse than those of the SCSO. Still, the idealized values, standard deviations, average, median, and catastrophic values of the MSSCSO are more accurate than those of the BDO, KOA, LCA, SWO, RSO, HLBO, WWPA, and SCHO. The MSSCSO utilizes the elite opposition-based learning strategy to facilitate demographic variability, upgrade the searched area, expedite evaluation efficiency, refrain from precocious convergence, and bolster broad exploration.
For F7, F9, F11, and F12, MSSCSO improves the best, worst, mean, median, and standard-deviation values along with the convergence efficiency, surpassing BDO, CDO, LCA, SWO, RSO, HLBO, SCHO, and SCSO. The simplex method improves the initial population quality, deepens the search, escapes local optima, strengthens the exploitation phase, and raises estimation precision. The MSSCSO with symmetric cooperative swarms both preserves stability and adaptability to avoid search stagnation and integrates exploration with exploitation to strengthen convergence efficiency and reach the most satisfactory solutions.
Figure 5 shows the convergence trajectories of each approach. A convergence curve is a straightforward way to visualize solution precision and convergence efficiency: an approach that settles quickly after a modest number of iterations demonstrates both higher evaluation precision and faster convergence. MSSCSO converges quickly in the early iterations and continues to re-explore reliably later, mitigating stagnation and approaching the global optimum. For the unimodal, basic, hybrid, and composite functions, the overall convergence efficiency and estimation precision of MSSCSO are superior to those of BDO, CDO, GJO, KOA, LCA, SWO, RSO, HLBO, WWPA, SCHO, and SCSO. MSSCSO combines a simple architecture with high optimization efficiency, good parallelism, and robust, stable behavior, and it avoids combinatorial explosion and premature convergence. Figure 6 shows the ANOVA tests of each approach; the standard deviation is a direct way to visualize each approach's stability and reliability. The standard deviations of MSSCSO are superior to those of BDO, CDO, GJO, KOA, LCA, SWO, RSO, HLBO, WWPA, SCHO, and SCSO, confirming that MSSCSO escapes search stagnation and identifies the most desirable solutions.
MSSCSO balances exploitation and exploration to attain faster convergence, higher estimation precision, better regulating variables, lower standard deviations, and more accurate fitness values.
Table 5 summarizes the execution times of the different approaches on the CEC 2022 test functions: BDO 189.2856 s, CDO 51.8795 s, GJO 67.0120 s, KOA 76.1346 s, LCA 48.2005 s, SWO 43.2389 s, RSO 38.7741 s, HLBO 103.4885 s, WWPA 65.2331 s, SCHO 96.4285 s, SCSO 52.2834 s, and MSSCSO 79.8849 s. The execution time of the MSSCSO is better than those of the BDO, HLBO, and SCHO, but worse than those of the CDO, GJO, KOA, LCA, SWO, RSO, WWPA, and SCSO; the ranking-based mutation operator, elite opposition-based learning strategy, and simplex method introduced into the SCSO increase the execution time needed to identify the most satisfactory solution.
Table 6 summarizes the p-values of the Wilcoxon rank-sum test on the benchmark functions. The Wilcoxon rank-sum test is administered to ascertain whether MSSCSO differs significantly from the other approaches [46,47,48]: p < 0.05 indicates a significant difference, while p ≥ 0.05 indicates no significant difference. The experimental results demonstrate a significant difference between MSSCSO and the other approaches, so the improvements carry an appropriate degree of statistical credibility and did not arise by chance.
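As an illustration of this decision rule, the sketch below computes a two-sided rank-sum p-value with the standard normal approximation and applies the p < 0.05 threshold; the two 30-run samples (`msscso_runs`, `scso_runs`) are synthetic stand-ins, not the data behind Table 6.

```python
import math
import random

def ranksum_p(sample_a, sample_b):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation
    (adequate for 30-run comparisons; tie correction is omitted)."""
    n_a, n_b = len(sample_a), len(sample_b)
    pooled = sorted([(v, 0) for v in sample_a] + [(v, 1) for v in sample_b])
    # Rank sum of sample_a over the pooled, ascending-sorted values.
    rank_a = sum(i + 1 for i, (_, tag) in enumerate(pooled) if tag == 0)
    mean = n_a * (n_a + n_b + 1) / 2.0
    sd = math.sqrt(n_a * n_b * (n_a + n_b + 1) / 12.0)
    z = (rank_a - mean) / sd
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability

random.seed(0)
msscso_runs = [random.gauss(1e-3, 1e-4) for _ in range(30)]  # synthetic fitness values
scso_runs = [random.gauss(5e-3, 1e-3) for _ in range(30)]    # synthetic fitness values
p = ranksum_p(msscso_runs, scso_runs)
significant = p < 0.05  # p < 0.05: statistically distinguishable discrepancy
```

With clearly separated samples such as these, the test reports a very small p-value, matching the "not established by accident" reading of Table 6.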

5. MSSCSO-Based Adaptive IIR System Identification

5.1. Adaptive IIR System Identification

The MSSCSO is implemented to resolve the IIR system identification problem; the principal objective is to recognize the most accurate regulating coefficients, minimize the MSE between the unidentified system's output and the IIR filter's output, and thereby match the unknown system's transfer function. The adaptive IIR system exhibits two drawbacks: (1) it can become unstable when the denominator coefficients are chosen inappropriately, which is overcome by a proper choice of search space; and (2) it cannot provide an exactly linear phase response. The ranking-based mutation operator, elite opposition-based learning strategy, and simplex method are added to the SCSO to overcome these drawbacks and to improve convergence speed and calculation accuracy. The MSSCSO maintains stability and adaptability to avoid search stagnation, and it combines global exploration with local exploitation to reach the most satisfactory solution. Because the MSE is minimized subject to the stability of the IIR filter, the minimum of the MSE is regarded as the global optimum. System identification builds a mathematical representation by comparing the unidentified system's output with the IIR filter's output for the same input. Figure 7 depicts the adaptive IIR system identification via MSSCSO.
The input x(n) and output y(n) of the IIR filter are related by the difference equation:
\sum_{l=0}^{L} a_l \, y(n-l) = \sum_{k=0}^{K} b_k \, x(n-k)
where L signifies the feedback (denominator) order, K signifies the feedforward (numerator) order, a_l signifies the pole coefficients, and b_k signifies the zero coefficients. With the normalization a_0 = 1, the transfer function H(z) is constructed as follows:
H(z) = \frac{\sum_{k=0}^{K} b_k z^{-k}}{1 + \sum_{l=1}^{L} a_l z^{-l}}
The discrepancy between the unidentified system's output y_0(n) and the IIR filter's output y(n) is e(n) = y_0(n) - y(n). The MSE is constructed as follows:
\mathrm{MSE} = J(w) = \frac{1}{N} \sum_{n=1}^{N} e^{2}(n)
where N signifies the number of input samples and w signifies the coefficient vector, w = (a_1, a_2, \ldots, a_L, b_0, b_1, \ldots, b_K)^{T}.
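The difference equation and the MSE fitness translate directly into a short sketch (illustrative only, not the authors' implementation); the impulse input and the reuse of the case 1 plant coefficients are assumptions for the sanity check:

```python
def iir_output(b, a, x):
    """Direct-form IIR response with a_0 = 1:
    y(n) = sum_k b_k x(n-k) - sum_{l>=1} a_l y(n-l)."""
    y = []
    for n in range(len(x)):
        acc = sum(bk * x[n - k] for k, bk in enumerate(b) if n - k >= 0)
        acc -= sum(al * y[n - l] for l, al in enumerate(a, start=1) if n - l >= 0)
        y.append(acc)
    return y

def mse(y_ref, y_model):
    """J(w) = (1/N) * sum over n of e^2(n), with e(n) = y0(n) - y(n)."""
    return sum((r - m) ** 2 for r, m in zip(y_ref, y_model)) / len(y_ref)

# Sanity check: a candidate filter carrying the plant's own coefficients
# (numerator 0.05 - 0.4 z^-1, denominator 1 - 1.1314 z^-1 + 0.25 z^-2)
# reproduces the plant output exactly, so the MSE fitness reaches zero.
x = [1.0] + [0.0] * 15                                # impulse excitation (assumed)
y0 = iir_output([0.05, -0.4], [-1.1314, 0.25], x)     # "unknown" system output
y = iir_output([0.05, -0.4], [-1.1314, 0.25], x)      # candidate filter output
assert mse(y0, y) == 0.0
```

Any swarm candidate w is scored by running it through `iir_output` and comparing against the unknown system's output via `mse`.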
The simple steps of the proposed model are summarized as follows:
(1)
Give the input variable x ( n ) and output variable y ( n ) of the adaptive IIR system identification.
(2)
Establish the mathematical relation between the IIR filter's input and output: \sum_{l=0}^{L} a_l \, y(n-l) = \sum_{k=0}^{K} b_k \, x(n-k).
(3)
Introduce noise disturbance v ( n ) and an adaptive optimization algorithm (MSSCSO).
(4)
Utilize the unknown IIR system and the adaptive IIR model to construct the transfer function of the IIR filter, H(z) = \frac{\sum_{k=0}^{K} b_k z^{-k}}{1 + \sum_{l=1}^{L} a_l z^{-l}}.
(5)
Establish the fitness evaluation function as the mean square error (MSE) between the unidentified system's output and the IIR filter's output: \mathrm{MSE} = J(w) = \frac{1}{N} \sum_{n=1}^{N} e^{2}(n).

5.2. MSSCSO-Based Adaptive IIR System Identification

Algorithm 4 emphasizes the MSSCSO-based adaptive IIR system identification. Figure 8 articulates the flowchart of MSSCSO for adaptive IIR system identification.
Algorithm 4 MSSCSO-based adaptive IIR system identification
Begin
Step 1. Randomly initialize the population and the parameters r, r_G, and R
Step 2. Ascertain the error fitness via Equation (22) and identify the finest prey Pos_bc
Step 3. while (t < T) do
      for each sand cat do
      Sort the population and assign the selection probabilities p_i
      /*ranking-based mutation stage*/
      Randomly select r1 ∈ [1, N_p] {base vector index}
      while rand > p_{r1} or r1 == i
      Randomly select r1 ∈ [1, N_p]
      end
      Randomly select r2 ∈ [1, N_p] {terminal vector index}
      while rand > p_{r2} or r2 == r1 or r2 == i
      Randomly select r2 ∈ [1, N_p]
      end
      Randomly select r3 ∈ [1, N_p] {starting vector index}
      while r3 == r2 or r3 == r1 or r3 == i
      Randomly select r3 ∈ [1, N_p]
      end
      /*end of ranking-based mutation stage*/
      Obtain a random angle θ via Roulette Wheel Selection (0° ≤ θ ≤ 360°)
      if (|R| ≤ 1) then
          Renew the position via Equation (8)
      else
          Renew the position via Equation (6)
      end if
      Apply the elite opposition-based learning strategy and the simplex method within the exploration and exploitation phases of SCSO
      Ascertain the error fitness via Equation (22)
      end for
      t = t + 1
   end while
   Return the finest prey Pos_bc
End
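The three index-selection loops in the ranking-based mutation stage can be sketched as follows; the rank-proportional acceptance probability p_i = rank_i / N_p (best-ranked vector most likely to be kept) is an assumed illustrative form, and the function and variable names are hypothetical:

```python
import random

def mutation_indices(fitness, rng):
    """Pick r1 (base), r2 (terminal), r3 (starting) as in Algorithm 4.
    The acceptance probability p_i = rank_i / N_p (best -> 1.0) is an
    illustrative assumption, not the paper's exact formula."""
    n_p = len(fitness)
    order = sorted(range(n_p), key=lambda i: fitness[i])       # ascending error: best first
    rank = {idx: n_p - pos for pos, idx in enumerate(order)}   # best index -> rank N_p
    p = {i: rank[i] / n_p for i in range(n_p)}

    def pick(excluded, rank_biased):
        while True:
            r = rng.randrange(n_p)
            if r in excluded:
                continue                     # indices must be mutually distinct
            if rank_biased and rng.random() > p[r]:
                continue                     # rejected: retry, favoring better ranks
            return r

    i = rng.randrange(n_p)                   # current target vector
    r1 = pick({i}, True)                     # base vector index (rank-biased)
    r2 = pick({i, r1}, True)                 # terminal vector index (rank-biased)
    r3 = pick({i, r1, r2}, False)            # starting vector index (uniform)
    return r1, r2, r3

rng = random.Random(1)
fitness = [0.9, 0.1, 0.5, 0.3, 0.7]          # hypothetical error-fitness values
r1, r2, r3 = mutation_indices(fitness, rng)
```

The rejection loops mirror the pseudocode: a candidate index is re-drawn whenever it coincides with an already-chosen index or fails the rank-based acceptance test.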

5.3. Computational Complexity of MSSCSO

Time complexity: This measures the computational time and resource consumption needed to resolve the optimization problem; big-O notation is used, and the complexity depends mainly on the population size, the maximum number of iterations, and the problem dimension. The ranking-based mutation operator, elite opposition-based learning strategy, and simplex method are added to SCSO to promote exploration and exploitation. The MSSCSO contains four main parts: population initialization, searching for prey (exploration) and attacking prey (exploitation), updating the sand cats' locations, and the halting judgment. Let the population size be N, the maximum number of iterations T, and the problem dimension D. Population initialization costs O(N × D); searching for prey (exploration) costs O(T × N × D); attacking prey (exploitation) costs O(T × N × D); updating the sand cats' locations, including the ranking-based mutation operator, elite opposition-based learning strategy, and simplex method, costs O(T × N × D); the halting judgment costs O(1). The total time complexity of MSSCSO is therefore O(N × D + T × N × D); removing the lower-order term, the overall time complexity of the MSSCSO is O(T × N × D).
Space complexity: This is measured by an algorithm's additional storage. With population size N and problem dimension D, the MSSCSO with symmetric cooperative swarms preserves its stability and adaptability without higher-order storage overhead, so the overall space complexity of the MSSCSO is O(N × D). The space efficiency of the MSSCSO is effective and stable.

5.4. Variables Setting

The MSSCSO is implemented to design the IIR filter and is contrasted with the coati optimization algorithm (COA) [49], the GOOSE algorithm (GOOSE) [50], the horned lizard optimization algorithm (HLOA) [51], the Chernobyl disaster optimizer (CDO) [37], the liver cancer algorithm (LCA) [40], the sinh cosh optimizer (SCHO) [45], the spider wasp optimizer (SWO) [41], and sand cat swarm optimization (SCSO) [25]. Each approach's regulating variables are the conventional values taken from the source papers.
COA: randomized value r ∈ [0, 1]; integer value I ∈ {1, 2}.
GOOSE: weight of stone randi ∈ [5, 25]; randomized value rnd ∈ [0, 1]; randomized value pro ∈ [0, 1]; randomized value coe ∈ [0, 1].
HLOA: hue angle h ∈ (0, 2π); positive constant = 2; randomized value Light_{1,2} ∈ [0, 0.4046661]; randomized value Dark_{1,2} ∈ [0.544051, 1]; randomized value walk ∈ [−1, 1]; randomized value σ ∈ [0, 1].
CDO: gamma speed s_γ ∈ [1, 300000]; beta speed s_β ∈ [1, 270000]; alpha speed s_α ∈ [1, 16000]; randomized value r ∈ (0, 1).
LCA: randomized value rd ∈ [0, 1]; tumor width ∈ [0, 1]; tumor depth ∈ [0, 1]; tumor length ∈ [0, 1]; positive constant f = 1; radius r ∈ [0, 1].
SCHO: positive constant ct = 3.6; randomized value r ∈ [0, 1]; sensitive factor u = 0.388; sensitive factor m = 0.45; tiny positive value ε = 0.003.
SWO: tradeoff rate TR = 0.3; crossover rate CR = 0.2; randomized value l ∈ [−2, 1]; randomized value p ∈ [0, 1].
SCSO: sensitivity variable r_G ∈ [0, 2]; phase control variable R ∈ [−2r_G, 2r_G].
MSSCSO: sensitivity variable r_G ∈ [0, 2]; phase control variable R ∈ [−2r_G, 2r_G]; randomized value k ∈ (0, 1); reflection coefficient α = 1; expansion coefficient γ = 1.5; compression coefficient β = 0.5; contraction coefficient β = 0.2; scaling factor F = 0.7.

5.5. Simulation Test and Result Analysis

The MSSCSO maintains stability and adaptability to avoid search stagnation, integrating exploration and exploitation to strengthen search efficiency and reach the most satisfactory solution.
For case 1, each approach uses a first-order IIR filter to identify a second-order plant; the unidentified plant H_P(z) and the IIR filter H_M(z) are constructed as follows:
H_P(z) = \frac{0.05 - 0.4 z^{-1}}{1 - 1.1314 z^{-1} + 0.25 z^{-2}}
H_M(z) = \frac{b}{1 - a z^{-1}}
For case 2, each approach uses a second-order IIR filter to identify a second-order plant; the unidentified system H_P(z) and the IIR filter H_M(z) are constructed as follows:
H_P(z) = \frac{1}{1 - 1.4 z^{-1} + 0.49 z^{-2}}
H_M(z) = \frac{b}{1 + a_1 z^{-1} + a_2 z^{-2}}
For case 3, each approach uses a higher-order IIR filter to identify a higher-order plant; the unidentified system H_P(z) and the IIR filter H_M(z) are constructed as follows:
H_P(z) = \frac{1 - 0.4 z^{-2} - 0.65 z^{-4} + 0.26 z^{-6}}{1 - 0.77 z^{-2} - 0.8498 z^{-4} + 0.6486 z^{-6}}
H_M(z) = \frac{b_0 + b_1 z^{-1} + b_2 z^{-2} + b_3 z^{-3} + b_4 z^{-4}}{1 + a_1 z^{-1} + a_2 z^{-2} + a_3 z^{-3} + a_4 z^{-4}}
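As a plausibility check on case 2 (a sketch under an assumed impulse excitation, not the paper's test signal), a minimal direct-form filter confirms that the true coefficients b = 1, a_1 = −1.4, a_2 = 0.49 drive the MSE to zero, which is why a second-order model can identify the second-order plant exactly:

```python
def filter_out(num, den, x):
    """Minimal direct-form IIR filter; den[0] is assumed to be 1."""
    y = []
    for n in range(len(x)):
        acc = sum(c * x[n - k] for k, c in enumerate(num) if n - k >= 0)
        acc -= sum(c * y[n - l] for l, c in enumerate(den[1:], start=1) if n - l >= 0)
        y.append(acc)
    return y

x = [1.0] + [0.0] * 31                           # assumed impulse excitation
plant = filter_out([1.0], [1.0, -1.4, 0.49], x)  # H_P(z) for case 2
model = filter_out([1.0], [1.0, -1.4, 0.49], x)  # H_M(z) with the true weights
case2_mse = sum((p - m) ** 2 for p, m in zip(plant, model)) / len(x)
```

In case 1, by contrast, the first-order model H_M(z) has too few coefficients to cancel the second-order denominator, so the attainable MSE stays strictly positive and the error surface becomes multimodal.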
The MSSCSO is executed to solve the adaptive IIR system identification problem; the principal objective is to establish the most appropriate regulating variables and the minimum achievable solution, diminishing the estimation error between the unidentified system's output and the IIR filter's output. Table 7, Table 10, and Table 13 summarize each approach's experimental results (MSE) for cases 1, 2, and 3, respectively; Table 8, Table 11, and Table 14 summarize the finest evaluation variables, and Table 9, Table 12, and Table 15 the average evaluation variables, for the corresponding cases. For each approach, the population size equals 30, the maximum number of iterations equals 500, and the number of independent runs equals 30. Best, Worst, Mean, and Std signify the best value, worst value, average value, and standard deviation, which together confirm the stability and effectiveness of MSSCSO; the ranking is established from the standard deviation. To alleviate the deficiencies of poor initial population quality, slow convergence, and limited solution precision, the expanded MSSCSO is implemented to design the IIR filter. For case 1, the assessment metrics of the MSSCSO are more accurate than those of the COA, GOOSE, HLOA, CDO, LCA, SCHO, SWO, and SCSO; the MSSCSO combines vigorous exploration and exploitation to avoid sluggish searching and reach the most satisfactory solution.
The finest and average evaluation variables of the MSSCSO are superior to those of the COA, GOOSE, HLOA, CDO, LCA, SCHO, SWO, and SCSO, and the MSSCSO shows excellent consistency in locating the most effective evaluation variables. For case 2, the evaluation precision and solution efficiency of the MSSCSO advance substantially over the SCSO, and both the assessment metrics and the finest and average evaluation variables of the MSSCSO surpass those of the COA, GOOSE, HLOA, CDO, LCA, SCHO, SWO, and SCSO. The MSSCSO balances exploration and exploitation to attain faster convergence, higher estimation precision, better evaluation variables, lower standard deviations, and more accurate fitness values. For case 3, the Best, Worst, and Mean of the MSSCSO are more precise than those of the COA, GOOSE, HLOA, CDO, LCA, SCHO, SWO, and SCSO. The Std and ranking of the MSSCSO are superior to those of the GOOSE, HLOA, SCHO, SWO, and SCSO, but worse than those of the COA, LCA, and CDO, so the MSSCSO exhibits relative, though not absolute, stability. Overall, the experimental results indicate that MSSCSO integrates exploration and exploitation to strengthen convergence efficiency and reach the most satisfactory solutions.
Figures 9, 11, and 13 show the convergence trajectories of each approach for cases 1, 2, and 3, and Figures 10, 12, and 14 the corresponding ANOVA tests. The MSSCSO benefits from the scaling factor F and the distinguishing behaviors of the sand cats, which let it converge toward the optimal solution faster than the comparison approaches and subsequently refine the evaluation accuracy. The overall convergence velocity and estimation precision of the MSSCSO for cases 1, 2, and 3 are superior to those of the COA, GOOSE, HLOA, CDO, LCA, SCHO, SWO, and SCSO; the MSSCSO maintains an equilibrium between local exploitation and global exploration to limit search stagnation and strengthen convergence, while fostering the selection probability and population diversity to elevate estimation precision and evaluation efficiency. For cases 1 and 2, the standard deviations of the MSSCSO are superior to those of the COA, GOOSE, HLOA, CDO, LCA, SCHO, SWO, and SCSO, and the MSSCSO shows noteworthy stability in recognizing the best solution and evaluation variables. For case 3, the standard deviation of the MSSCSO is superior to those of the GOOSE, HLOA, SCHO, SWO, and SCSO, but worse than those of the COA, LCA, and CDO; the MSSCSO still escapes local optima, fosters convergence, and advances solution efficiency with relative stability.
The MSSCSO fosters the selection probability, quickens convergence, bolsters local exploitation and global exploration, increases population diversity, broadens the searched area, avoids premature convergence, improves the initial population quality, deepens the search, and strengthens the exploitation phase. The experimental results show that MSSCSO balances exploration and exploitation to attain faster convergence, higher estimation precision, better regulating variables, lower standard deviations, and more accurate fitness values.
Table 16 summarizes the execution time of different approaches for cases 1, 2, and 3. For MSSCSO, the execution time of case 1 is 8.9495 s, the execution time of case 2 is 9.8702 s, and the execution time of case 3 is 10.6880 s. The overall execution time of the MSSCSO is better than that of the COA, GOOSE, HLOA, and CDO, but it is worse than that of the LCA, SCHO, SWO, and SCSO. The MSSCSO exhibits strong stability and robustness to balance exploration and exploitation and achieve a more accurate solution.
Table 17 summarizes the p-values of the Wilcoxon rank-sum test. The experimental results demonstrate a significant difference between MSSCSO and the other approaches, so the improvements carry an appropriate degree of statistical credibility and did not arise by chance.

6. Exploration and Exploitation Analysis, Diversity Analysis

Exploration and exploitation are two essential search behaviors that jointly inhibit premature convergence, improve detection quality, and lead to the most satisfactory solution; the balance between them in the underlying search mechanism is strongly related to the convergence accuracy. MSSCSO employs exploration to emit low-frequency noise, search for prey, and locate prey, which helps discover the most appropriate solutions across the search region, at the cost of slower convergence. MSSCSO employs exploitation to attack and retrieve prey, which accelerates convergence but raises the risk of settling in local optima; the Roulette Wheel Selection of the attack angle helps avoid premature convergence. An appropriate ratio between exploration and exploitation is therefore necessary to ensure the MSSCSO's reasonable performance, and the MSSCSO exhibits strong overall search ability, stability, and robustness.
The MSSCSO is employed to solve the CEC 2022 suite and the adaptive IIR system identification problem, using a set of improving candidate solutions to perform exploration and exploitation, recognize the most satisfactory solution, and minimize the estimation error between the unidentified system's output and the IIR filter's output. Generally, the sand cat holding the best solution guides the entire search toward global exploration and local exploitation. As the distances between the sand cats grow, the impact of exploration becomes progressively more apparent while that of exploitation weakens. A diversity measurement therefore tracks the increase and decrease in the distances between the sand cats. The population diversity is constructed as follows:
Div_j = \frac{1}{N} \sum_{i=1}^{N} \left| \mathrm{median}(x^{j}) - x_i^{j} \right|
Div = \frac{1}{m} \sum_{j=1}^{m} Div_j
where \mathrm{median}(x^{j}) signifies the median of the jth dimension, x_i^{j} the jth dimension of the ith sand cat, N the population size, and m the number of design variables. The percentages of exploration and exploitation are measured as follows:
\mathrm{Exploration}\% = \frac{Div}{Div_{\max}} \times 100
\mathrm{Exploitation}\% = \frac{\left| Div - Div_{\max} \right|}{Div_{\max}} \times 100
where Div_{\max} signifies the maximum diversity observed during the run. Exploration and exploitation are both mutually conflicting and complementary. In evaluating the balance between them, the median serves as the reference element to avoid optimization discrepancies, and the balance is referred to Div_{\max} over the entire optimization process. The experimental results indicate that MSSCSO integrates exploration and exploitation to exhibit faster convergence, higher search accuracy, better regulating variables, lower standard deviations, and more accurate fitness values.
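The diversity and balance measures above can be sketched as a stand-alone computation (illustrative only; the two synthetic swarms mimic an early spread-out population and a late clustered one):

```python
import random
import statistics

def diversity(population):
    """Div = mean over dimensions j of (1/N) * sum_i |median(x^j) - x_i^j|."""
    n, m = len(population), len(population[0])
    per_dim = []
    for j in range(m):
        column = [ind[j] for ind in population]
        med = statistics.median(column)
        per_dim.append(sum(abs(med - v) for v in column) / n)
    return sum(per_dim) / m

def balance(div, div_max):
    """Exploration% = Div/Div_max * 100; Exploitation% = |Div - Div_max|/Div_max * 100."""
    return 100.0 * div / div_max, 100.0 * abs(div - div_max) / div_max

rng = random.Random(2)
early = [[rng.uniform(-10, 10) for _ in range(5)] for _ in range(20)]  # spread swarm
late = [[rng.gauss(0.0, 0.1) for _ in range(5)] for _ in range(20)]    # clustered swarm
div_max = diversity(early)                        # maximum diversity seen so far
explore_pct, exploit_pct = balance(diversity(late), div_max)
```

As the sand cats cluster, Div falls well below Div_max, so the exploitation percentage dominates; the two percentages sum to 100 whenever Div ≤ Div_max.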

7. Conclusions and Future Research

The SCSO is motivated by the distinguishing behaviors of sand cats and mimics their surveillance and capture of prey, using low-frequency noise detection to locate the prey and the most appropriate solutions. The ranking-based mutation operator, elite opposition-based learning strategy, and simplex method are incorporated into the SCSO to enrich the population quality, convergence speed, evaluation precision, and solution efficiency. The expanded MSSCSO with symmetric cooperative swarms is applied to the benchmark functions and to adaptive IIR filter identification; the principal objective is to recognize the most appropriate regulating variables and minimize the estimation error between the unidentified system's output and the IIR filter's output. The ranking-based mutation operator filters the finest candidate solutions, fosters the selection probability, quickens convergence, and bolsters local exploitation. The elite opposition-based learning strategy increases population diversity, widens the searched area, expedites evaluation, avoids premature convergence, and bolsters global exploration. The simplex method improves the initial population quality, deepens the search, escapes local optima, strengthens the exploitation phase, and elevates estimation precision. The MSSCSO combines a simple architecture with high optimization efficiency, good parallelism, and outstanding robustness and stability, and it avoids combinatorial explosion and premature convergence. The experimental results demonstrate that MSSCSO balances exploitation and exploration to attain faster convergence, higher search accuracy, better regulating variables, lower standard deviations, and more accurate fitness values.
The limitations of the MSSCSO are summarized as follows: (1) The simulation experiments focus primarily on adaptive IIR system identification, and further investigation is needed to explore the optimization efficiency and convergence accuracy in a broader range of application fields. (2) The hyperparameter tuning process may require careful consideration to achieve the most satisfactory solution in different problem domains. (3) The proposed algorithm consumes more execution time to recognize the most appropriate regulating coefficients and minimize the MSE. (4) The computational complexity, mathematical reasoning, convergence proof, and dynamic parameter selection all affect the convergence speed and calculation accuracy. (5) The convergence efficiency and estimation accuracy of MSSCSO need to be strengthened further by introducing advanced search strategies, alternative encoding formats, and hybrid algorithmic structures to capture supplementary advantages and inhibit premature convergence. (6) When applied to super-complex, large-scale, high-dimensional, or multi-objective optimization problems, the MSSCSO may not be able to balance exploration and exploitation effectively enough to determine the finest candidate solution.
Our future research will focus on the following aspects: (1) We will utilize more different IIR filter examples or large-scale engineering problems to evaluate and validate the effectiveness and stability of the MSSCSO. The MSSCSO will be compared with state-of-the-art methods to show its competitive performance. We will use statistical analysis to illustrate the discrepancies between the two data sets. (2) We will verify the algorithm convergence and computational complexity from theoretical and practical perspectives. We will exploit dynamic control parameters to enhance the stability and robustness. (3) We will execute advanced search strategies, alternative encoding formats, and hybrid algorithmic structures to capture supplementary advantages and avoid precocious convergence. The renewed SCSO will integrate localized exploitation and broad exploration to strengthen population quality, convergence acceleration, evaluation precision, and solution efficiency. (4) The renewed SCSO will be executed to improve characteristic Dabie Mountain forestry and crops, incorporating Huoshan Dendrobium, Lu’an Guapian, Chinese herbal medicine, and Shucheng Camellia oleifera. The quality of agricultural equipment and intelligent operation will be strengthened by intelligent detection, intelligent control, and specialized agricultural equipment.

Author Contributions

Conceptualization, J.Z. and C.D.; methodology, J.F.; software, C.D.; validation, J.Z., C.D. and J.F.; formal analysis, J.F.; investigation, C.D.; resources, J.Z.; data curation, C.D.; writing—original draft preparation, C.D.; writing—review and editing, J.Z. and J.F.; visualization, J.F.; supervision, C.D.; project administration, J.Z.; funding acquisition, J.Z., C.D. and J.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Start-up Fee for Scientific Research of High-level Talents of West Anhui University under Grant No. WGKQ2022052, School-level Quality Engineering (School-enterprise Cooperation Development Curriculum Resource Construction) under Grant No. wxxy2022101, School-level Quality Engineering (Teaching and Research Project) under Grant No. wxxy2023079, PWMDIC design and application under Grant No. WXCHX0045023110, and Natural Science Key Research Project of Anhui Educational Committee under Grant No. 2022AH051675.

Data Availability Statement

All data used to support the findings of this study are included within the article.

Acknowledgments

The authors would like to thank everyone involved for their contributions to this article. They would also like to thank the editors and anonymous reviewers for their helpful comments and suggestions.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Agrawal, N.; Kumar, A.; Bajaj, V. A New Method for Designing of Stable Digital IIR Filter Using Hybrid Method. Circuits Syst. Signal Process. 2019, 38, 2187–2226. [Google Scholar] [CrossRef]
  2. Goswami, O.P.; Rawat, T.K.; Upadhyay, D.K. A Novel Approach for the Design of Optimum IIR Differentiators Using Fractional Interpolation. Circuits Syst. Signal Process. 2020, 39, 1688–1698. [Google Scholar] [CrossRef]
  3. Handkiewicz, A.; Naumowicz, M. NANO-Studio, the Design Environment of Filter Banks Implemented in Standard CMOS Technology. Analog Integr. Circuits Signal Process. 2021, 109, 323–333. [Google Scholar] [CrossRef]
  4. Chen, L.; Liu, M.; Wu, J.; Yang, J.; Dai, Z. Structure Evolution-Based Design for Low-Pass IIR Digital Filters with the Sharp Transition Band and the Linear Phase Passband. Soft Comput. 2019, 23, 1965–1984. [Google Scholar] [CrossRef]
  5. Upadhyay, P.; Kar, R.; Mandal, D.; Ghoshal, S. A New Design Method Based on Firefly Algorithm for IIR System Identification Problem. J. King Saud Univ.-Eng. Sci. 2016, 28, 174–198. [Google Scholar] [CrossRef]
  6. Janjanam, L.; Saha, S.K.; Kar, R.; Mandal, D. Adaptive Recursive System Identification Using Optimally Tuned Kalman Filter by the Metaheuristic Algorithm. Soft Comput. 2024, 28, 7013–7037. [Google Scholar] [CrossRef]
  7. Liu, J.; Shen, L.; Qian, G. Maximum Complex Correntropy Criterion Adaptive IIR Filtering Based on Gauss-Newton Approach. IEEE Trans. Circuits Syst. II Express Briefs 2023, 70, 4271–4275. [Google Scholar] [CrossRef]
  8. Lai, B.; Bernstein, D.S. Convergence of Recursive Least Squares Based Input/Output System Identification with Model Order Mismatch. arXiv 2024, arXiv:2404.10850. [Google Scholar]
  9. Zhang, J.; Zhang, T.; Wang, D.; Zhang, G.; Kong, M.; Li, Z.; Chen, R.; Xu, Y. A Complex-Valued Encoding Golden Jackal Optimization for Multilevel Thresholding Image Segmentation. Appl. Soft Comput. 2024, 165, 112108. [Google Scholar] [CrossRef]
  10. Mahata, S.; Herencsar, N.; Alagoz, B.B.; Yeroglu, C. Reduced Order Infinite Impulse Response System Identification Using Manta Ray Foraging Optimization. Alex. Eng. J. 2024, 87, 448–477. [Google Scholar] [CrossRef]
  11. Izci, D.; Ekinci, S. Application of Whale Optimization Algorithm to Infinite Impulse Response System Identification. In Handbook of Whale Optimization Algorithm; Elsevier: Amsterdam, The Netherlands, 2024; pp. 423–434. [Google Scholar]
  12. Singh, S.; Ashok, A.; Kumar, M.; Rawat, T.K. Adaptive Infinite Impulse Response System Identification Using Teacher Learner Based Optimization Algorithm. Appl. Intell. 2019, 49, 1785–1802. [Google Scholar] [CrossRef]
  13. Wang, L.; Li, S.; Wang, W.; Liu, B. Design of Digital Infinite Impulse Response Filter Based on Artificial Fish Swarm Algorithm. In Proceedings of the Journal of Physics: Conference Series; IOP Publishing: Bristol, UK, 2021; Volume 2030, p. 012082. [Google Scholar]
  14. Ekinci, S.; Budak, C.; Izci, D.; Gider, V. An Atom Search Optimization Approach for IIR System Identification. Int. J. Model. Simul. 2023, 1–17. [Google Scholar] [CrossRef]
15. Zhang, Y.-J.; Wang, Y.-F.; Yan, Y.-X.; Zhao, J.; Gao, Z.-M. Self-Adaptive Hybrid Mutation Slime Mould Algorithm: Case Studies on UAV Path Planning, Engineering Problems, Photovoltaic Models and Infinite Impulse Response. Alex. Eng. J. 2024, 98, 364–389. [Google Scholar] [CrossRef]
  16. Su, T.-J.; Zhuang, Q.-Y.; Lin, W.-H.; Hung, Y.-C.; Yang, W.-R.; Wang, S.-M. Design of Infinite Impulse Response Filters Based on Multi-Objective Particle Swarm Optimization. Signals 2024, 5, 526–541. [Google Scholar] [CrossRef]
  17. Rizk-Allah, R.M.; Ekinci, S.; Izci, D. An Improved Artificial Rabbits Optimization for Accurate and Efficient Infinite Impulse Response System Identification. Decis. Anal. J. 2023, 9, 100355. [Google Scholar] [CrossRef]
  18. Zhang, J.; Zhang, G.; Kong, M.; Zhang, T. Adaptive Infinite Impulse Response System Identification Using an Enhanced Golden Jackal Optimization. J. Supercomput. 2023, 79, 10823–10848. [Google Scholar] [CrossRef]
  19. Niu, Y.; Yan, X.; Wang, Y.; Niu, Y. Dynamic Opposite Learning Enhanced Artificial Ecosystem Optimizer for IIR System Identification. J. Supercomput. 2022, 78, 13040–13085. [Google Scholar] [CrossRef]
  20. Kumar, S.; Singh, R. Design and Analysis of Finite Impulse Response Filter Based on Particle Swarm Optimization and Grasshopper Optimization Algorithms. In Proceedings of the International Conference on Innovations in Computational Intelligence and Computer Vision; Springer: Berlin/Heidelberg, Germany, 2022; pp. 595–606. [Google Scholar]
  21. Kaur, M.; Kaur, R.; Singh, N. A Novel Hybrid of Chimp with Cuckoo Search Algorithm for the Optimal Designing of Digital Infinite Impulse Response Filter Using High-Level Synthesis. Soft Comput. 2022, 26, 13843–13867. [Google Scholar] [CrossRef]
  22. Ekinci, S.; Izci, D. Enhancing IIR System Identification: Harnessing the Synergy of Gazelle Optimization and Simulated Annealing Algorithms. E-Prime-Adv. Electr. Eng. Electron. Energy 2023, 5, 100225. [Google Scholar] [CrossRef]
  23. Ekinci, S.; Izci, D. Pattern Search Ameliorated Arithmetic Optimization Algorithm for Engineering Optimization and Infinite Impulse Response System Identification. Electrica 2024, 24, 119–130. [Google Scholar] [CrossRef]
  24. Liang, Y.; Ling, B.W.-K. Infinite Impulse Response Filter Bank Based Graphic Equalizer Design via Functional Inequality Constrained Optimization and Genetic Algorithm. IEEE Access 2021, 9, 65116–65126. [Google Scholar] [CrossRef]
  25. Seyyedabbasi, A.; Kiani, F. Sand Cat Swarm Optimization: A Nature-Inspired Algorithm to Solve Global Optimization Problems. Eng. Comput. 2023, 39, 2627–2651. [Google Scholar] [CrossRef]
  26. Xia, D.; Wu, X.; Yan, M.; Xiong, C. An Adaptive Stochastic Ranking-Based Tournament Selection Method for Differential Evolution. J. Supercomput. 2024, 80, 20–49. [Google Scholar] [CrossRef]
  27. Zhang, J.; Zhang, T.; Zhang, G.; Kong, M. Parameter Optimization of PID Controller Based on an Enhanced Whale Optimization Algorithm for AVR System. Oper. Res. 2023, 23, 44. [Google Scholar] [CrossRef]
  28. Zhang, J.; Zhang, G.; Huang, Y.; Kong, M. A Novel Enhanced Arithmetic Optimization Algorithm for Global Optimization. IEEE Access 2022, 10, 75040–75062. [Google Scholar] [CrossRef]
  29. Meng, A.; Wu, Z.; Zhang, Z.; Xu, X.; Tang, Y.; Xie, Z.; Xian, Z.; Zhang, H.; Luo, J.; Wang, Y.; et al. Optimal Scheduling of Integrated Energy System Using Decoupled Distributed CSO with Opposition-Based Learning and Neighborhood Re-Dispatch Strategy. Renew. Energy 2024, 224, 120102. [Google Scholar] [CrossRef]
  30. Huang, P.; Zhou, Y.; Deng, W.; Zhao, H.; Luo, Q.; Wei, Y. Orthogonal Opposition-Based Learning Honey Badger Algorithm with Differential Evolution for Global Optimization and Engineering Design Problems. Alex. Eng. J. 2024, 91, 348–367. [Google Scholar] [CrossRef]
  31. Zhou, G.; Zhang, T.; Zhou, Y. Elite Opposition-Based Bare Bones Mayfly Algorithm for Optimization Wireless Sensor Networks Coverage Problem. Arab. J. Sci. Eng. 2024, 1–21. [Google Scholar] [CrossRef]
  32. Yan, Z.; Zhang, J.; Tang, J. Path Planning for Autonomous Underwater Vehicle Based on an Enhanced Water Wave Optimization Algorithm. Math. Comput. Simul. 2021, 181, 192–241. [Google Scholar] [CrossRef]
  33. Yang, F.; Li, Y.; Chen, D.; Hu, S.; Xu, X. Parameter Identification of PEMFC Steady-State Model Based on p-Dimensional Extremum Seeking via Simplex Tuning Optimization Method. Energy 2024, 292, 130601. [Google Scholar] [CrossRef]
  34. Ambarsari, I.F.; Hasanah, N.; Astindari, T.; Sari, F.K.; Masruro, A.A. Application of The Simplex Method and Digital Literacy in Profit Optimization Problems Taufik Tempe. Mathline J. Mat. Pendidik. Mat. 2024, 9, 175–188. [Google Scholar] [CrossRef]
  35. Abdel-Basset, M.; El-Shahat, D.; Jameel, M.; Abouhawwash, M. Exponential Distribution Optimizer (EDO): A Novel Math-Inspired Algorithm for Global Optimization and Engineering Problems. Artif. Intell. Rev. 2023, 56, 9329–9400. [Google Scholar] [CrossRef]
  36. Srivastava, A.; Das, D.K. A Bottlenose Dolphin Optimizer: An Application to Solve Dynamic Emission Economic Dispatch Problem in the Microgrid. Knowl.-Based Syst. 2022, 243, 108455. [Google Scholar] [CrossRef]
  37. Shehadeh, H.A. Chernobyl Disaster Optimizer (CDO): A Novel Meta-Heuristic Method for Global Optimization. Neural Comput. Appl. 2023, 35, 10733–10749. [Google Scholar] [CrossRef]
  38. Chopra, N.; Ansari, M.M. Golden Jackal Optimization: A Novel Nature-Inspired Optimizer for Engineering Applications. Expert Syst. Appl. 2022, 198, 116924. [Google Scholar] [CrossRef]
  39. Abdel-Basset, M.; Mohamed, R.; Azeem, S.A.A.; Jameel, M.; Abouhawwash, M. Kepler Optimization Algorithm: A New Metaheuristic Algorithm Inspired by Kepler’s Laws of Planetary Motion. Knowl.-Based Syst. 2023, 268, 110454. [Google Scholar] [CrossRef]
  40. Houssein, E.H.; Oliva, D.; Samee, N.A.; Mahmoud, N.F.; Emam, M.M. Liver Cancer Algorithm: A Novel Bio-Inspired Optimizer. Comput. Biol. Med. 2023, 165, 107389. [Google Scholar] [CrossRef]
  41. Abdel-Basset, M.; Mohamed, R.; Jameel, M.; Abouhawwash, M. Spider Wasp Optimizer: A Novel Meta-Heuristic Optimization Algorithm. Artif. Intell. Rev. 2023, 56, 11675–11738. [Google Scholar] [CrossRef]
  42. Dhiman, G.; Garg, M.; Nagar, A.; Kumar, V.; Dehghani, M. A Novel Algorithm for Global Optimization: Rat Swarm Optimizer. J. Ambient Intell. Humaniz. Comput. 2021, 12, 8457–8482. [Google Scholar] [CrossRef]
  43. Dehghani, M.; Trojovskỳ, P. Hybrid Leader Based Optimization: A New Stochastic Optimization Algorithm for Solving Optimization Applications. Sci. Rep. 2022, 12, 5549. [Google Scholar] [CrossRef]
  44. Abdelhamid, A.A.; Towfek, S.; Khodadadi, N.; Alhussan, A.A.; Khafaga, D.S.; Eid, M.M.; Ibrahim, A. Waterwheel Plant Algorithm: A Novel Metaheuristic Optimization Method. Processes 2023, 11, 1502. [Google Scholar] [CrossRef]
  45. Bai, J.; Li, Y.; Zheng, M.; Khatir, S.; Benaissa, B.; Abualigah, L.; Wahab, M.A. A Sinh Cosh Optimizer. Knowl.-Based Syst. 2023, 282, 111081. [Google Scholar] [CrossRef]
  46. Miao, F.; Wu, Y.; Yan, G.; Si, X. A Memory Interaction Quadratic Interpolation Whale Optimization Algorithm Based on Reverse Information Correction for High-Dimensional Feature Selection. Appl. Soft Comput. 2024, 164, 111979. [Google Scholar] [CrossRef]
  47. Miao, F.; Li, H.; Yan, G.; Mei, X.; Wu, Z.; Zhao, W.; Liu, T.; Zhang, H. Optimizing UAV Path Planning in Maritime Emergency Transportation: A Novel Multi-Strategy White Shark Optimizer. J. Mar. Sci. Eng. 2024, 12, 1207. [Google Scholar] [CrossRef]
48. Miao, F.; Yao, L.; Zhao, X.; Zheng, Y. Phasor Symbiotic Organisms Search Algorithm for Global Optimization. In Intelligent Computing Theories and Application: 16th International Conference, ICIC 2020, Bari, Italy, 2–5 October 2020, Proceedings, Part I; Springer: Berlin/Heidelberg, Germany, 2020; pp. 67–78. [Google Scholar]
  49. Dehghani, M.; Montazeri, Z.; Trojovská, E.; Trojovskỳ, P. Coati Optimization Algorithm: A New Bio-Inspired Metaheuristic Algorithm for Solving Optimization Problems. Knowl.-Based Syst. 2023, 259, 110011. [Google Scholar] [CrossRef]
  50. Hamad, R.K.; Rashid, T.A. GOOSE Algorithm: A Powerful Optimization Tool for Real-World Engineering Challenges and Beyond. Evol. Syst. 2024, 15, 1249–1274. [Google Scholar] [CrossRef]
  51. Peraza-Vázquez, H.; Peña-Delgado, A.; Merino-Treviño, M.; Morales-Cepeda, A.B.; Sinha, N. A Novel Metaheuristic Inspired by Horned Lizard Defense Tactics. Artif. Intell. Rev. 2024, 57, 59. [Google Scholar] [CrossRef]
Figure 1. Sand cat in nature: (a) living; (b) searching; (c) hunting.
Figure 2. Position refreshing procedure of SCSO: (a) iteration t; (b) iteration t + 1.
Figure 3. Attacking prey (exploitation) versus searching for prey (exploration).
Figure 4. Simplex method schematic.
Figure 5. Convergence trajectories of each approach.
Figure 6. ANOVA tests of each approach.
Figure 7. The adaptive IIR system identification via MSSCSO.
Figure 8. Flowchart of MSSCSO for adaptive IIR system identification.
Figure 9. Convergence trajectories of case 1.
Figure 10. ANOVA tests of case 1.
Figure 11. Convergence trajectories of case 2.
Figure 12. ANOVA tests of case 2.
Figure 13. Convergence trajectories of case 3.
Figure 14. ANOVA tests of case 3.
Table 1. Brief literature review of existing methods for IIR system identification.
Methods with Reference | Merits | Demerits | Gaps
Conventional design [1,2,3] | Bilinear transformation technique, quick and efficient implementation | Frequency warping effect, quantization errors, nonlinear phase errors | Lack of stability and robustness analysis
Gradient-based techniques [4,5,6,7,8] | Flexible development, low execution time, high efficiency and reliability in resolving the unimodal objective functions, simple implementation, low complexity | Inefficient initial population quality, sluggish convergence speed, restricted evaluation precision, easy precocious convergence, slow detection efficiency, combinatorial explosion, sensitivity to parameters | Lack of stability and robustness analysis
Basic swarm intelligence algorithms [10,11,12,13,14] | Easy to implement and understand, simple algorithm framework, few optimization parameters, convenient expansion, good parallelism and stability | Poor detection efficiency, slow convergence rate, and low estimation precision in resolving the large-scale, high-dimension, multi-objective and extremely complex optimization issues; the exploration and exploitation need to be enhanced | Lack of stability and robustness analysis, ignoring the additive noise, lack of exploration and exploitation analysis, diversity analysis, hyperparameters tuning
Swarm intelligence algorithms with ensemble strategies [15,16,17,18,19] | Fosters selection probability, bolsters localized exploitation and broad exploration, increases population diversity, refrains from precocious convergence, disrupts regional extreme solution, efficiently balances exploration and exploitation, strong self-organization and robustness | The simulation result depends on the selection of the control parameters, the high computational complexity of the algorithm; further improvement needed in terms of detection efficiency, convergence rate, estimation precision, long execution time | Lack of stability and robustness analysis, ignoring the additive noise, lack of exploration and exploitation analysis, diversity analysis, hyperparameters tuning
Hybrid swarm intelligence algorithms [20,21,22,23,24] | Realizes the complementary advantages between algorithms, intense exploration and exploitation, strong stability and robustness | The algorithm’s computational complexity is high, convergence rate and estimation precision depend on the selection of the control parameters, long execution time | Lack of different system identification structures, lack of stability and robustness analysis, ignoring the additive noise, lack of exploration and exploitation analysis, diversity analysis, failure to review some new approaches
Table 2. Description of the CEC 2022 test suite.
Function Type | No. | Name | Dim | [lb, ub] | Fmin
Unimodal function | F1 | Shifted and full rotated Zakharov function | 10 | [−100, 100] | 300
Basic functions | F2 | Shifted and full rotated Rosenbrock’s function | 10 | [−100, 100] | 400
 | F3 | Shifted and full rotated expanded Schaffer’s F6 function | 10 | [−100, 100] | 600
 | F4 | Shifted and full rotated non-continuous Rastrigin’s function | 10 | [−100, 100] | 800
 | F5 | Shifted and rotated Levy function | 10 | [−100, 100] | 900
Hybrid functions | F6 | Hybrid function 1 (N = 3) | 10 | [−100, 100] | 1800
 | F7 | Hybrid function 2 (N = 6) | 10 | [−100, 100] | 2000
 | F8 | Hybrid function 3 (N = 5) | 10 | [−100, 100] | 2200
Composite functions | F9 | Composite function 1 (N = 5) | 10 | [−100, 100] | 2300
 | F10 | Composite function 2 (N = 4) | 10 | [−100, 100] | 2400
 | F11 | Composite function 3 (N = 5) | 10 | [−100, 100] | 2600
 | F12 | Composite function 4 (N = 6) | 10 | [−100, 100] | 2700
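F1, the unimodal baseline of the suite, is built on the Zakharov function. As a point of reference, the following is a minimal sketch of the plain Zakharov function; the shift vector, the rotation matrix, and the bias of 300 applied by the CEC 2022 version are deliberately omitted:

```python
def zakharov(x):
    # f(x) = sum(x_i^2) + (sum(0.5*i*x_i))^2 + (sum(0.5*i*x_i))^4, i = 1..D
    # Global minimum f(0) = 0; CEC 2022 F1 additionally shifts, rotates,
    # and adds the bias of 300 listed in the table above.
    s1 = sum(xi * xi for xi in x)
    s2 = sum(0.5 * (i + 1) * xi for i, xi in enumerate(x))
    return s1 + s2 ** 2 + s2 ** 4
```

Evaluated at the origin the plain function returns 0, which after the CEC 2022 bias corresponds to the listed optimum of 300.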
Table 3. Sensitivity results of MSSCSO for the different scaling factors F .
Function | Result | F = 0.1 | F = 0.3 | F = 0.5 | F = 0.7 | F = 0.9 | F = 0.11 | F = 0.13
F1 | Best | 300.3073 | 300.1744 | 300.1571 | 300.6584 | 301.1911 | 302.4874 | 301.6503
F1 | Std | 27.35221 | 20.69498 | 19.78377 | 20.28569 | 47.51898 | 44.99700 | 45.89157
F1 | Mean | 321.4812 | 323.6530 | 320.5828 | 318.9658 | 356.7509 | 352.9056 | 377.4065
F1 | Median | 309.1524 | 321.0634 | 318.1187 | 310.3300 | 344.9288 | 342.6918 | 386.6018
F1 | Worst | 394.7847 | 373.6334 | 378.0730 | 372.1049 | 498.4575 | 471.1267 | 485.6404
F2 | Best | 400.4242 | 400.1573 | 400.2118 | 400.0055 | 400.3607 | 400.4434 | 400.6102
F2 | Std | 26.43133 | 29.94478 | 37.69150 | 13.89718 | 3.372276 | 13.09777 | 14.30825
F2 | Mean | 424.7244 | 445.7873 | 439.7443 | 410.6535 | 407.3034 | 410.5346 | 411.9227
F2 | Median | 411.6288 | 447.6871 | 431.2803 | 407.4000 | 407.5555 | 407.6815 | 407.8502
F2 | Worst | 475.6888 | 514.4983 | 542.6178 | 470.8142 | 412.2323 | 472.5011 | 465.7700
F3 | Best | 600.6350 | 604.3034 | 601.6990 | 600.1112 | 600.2999 | 600.3239 | 600.3078
F3 | Std | 8.335732 | 8.254438 | 7.359391 | 2.594358 | 1.883909 | 2.479979 | 2.106617
F3 | Mean | 615.4231 | 616.4457 | 611.8143 | 602.2905 | 602.1334 | 602.6434 | 602.5331
F3 | Median | 615.9688 | 615.7775 | 610.2676 | 601.2250 | 601.8102 | 601.5358 | 601.9688
F3 | Worst | 635.8255 | 635.9863 | 632.3663 | 611.8587 | 610.7669 | 610.9289 | 609.6038
F4 | Best | 807.9609 | 813.9444 | 806.9650 | 804.9956 | 809.7937 | 805.5202 | 805.5820
F4 | Std | 6.923790 | 7.482125 | 6.157279 | 6.092363 | 4.890488 | 6.645209 | 7.098210
F4 | Mean | 823.8254 | 824.5767 | 820.9302 | 816.7228 | 817.9319 | 818.4581 | 818.4082
F4 | Median | 824.4116 | 824.9014 | 821.4310 | 815.7339 | 816.9648 | 817.2838 | 817.5602
F4 | Worst | 835.8218 | 838.8042 | 833.8812 | 831.8398 | 831.0936 | 830.9169 | 833.8749
F5 | Best | 900.0619 | 903.2530 | 901.8745 | 900.0761 | 900.0278 | 900.1004 | 900.0980
F5 | Std | 98.91538 | 119.9691 | 108.9970 | 10.84537 | 3.880727 | 9.692292 | 17.70980
F5 | Mean | 994.8915 | 999.9370 | 1009.578 | 907.1468 | 903.2252 | 906.7598 | 909.9744
F5 | Median | 969.2928 | 963.6642 | 981.3706 | 901.8173 | 902.3931 | 902.3058 | 901.7920
F5 | Worst | 1358.285 | 1372.578 | 1343.114 | 947.3566 | 920.6647 | 938.2756 | 986.2309
F6 | Best | 1862.773 | 1946.691 | 1905.818 | 1973.302 | 2169.275 | 1933.312 | 2469.662
F6 | Std | 1902.844 | 2033.640 | 1474.681 | 2141.748 | 2433.540 | 8388.875 | 2368.603
F6 | Mean | 4246.573 | 4175.993 | 3974.290 | 4515.021 | 5128.079 | 7043.900 | 6138.991
F6 | Median | 3773.378 | 3639.743 | 3740.014 | 4402.019 | 4151.588 | 5452.960 | 5734.492
F6 | Worst | 8123.173 | 8108.153 | 8175.531 | 8192.207 | 9377.553 | 49681.57 | 11236.24
F7 | Best | 2021.655 | 2016.735 | 2021.267 | 2001.167 | 2020.484 | 2020.895 | 2014.423
F7 | Std | 10.28591 | 12.45873 | 10.96718 | 11.57752 | 5.005222 | 5.379638 | 5.546232
F7 | Mean | 2038.748 | 2037.705 | 2039.983 | 2024.358 | 2025.220 | 2026.394 | 2025.945
F7 | Median | 2037.818 | 2035.994 | 2037.976 | 2024.058 | 2023.373 | 2024.669 | 2024.328
F7 | Worst | 2059.632 | 2065.161 | 2070.798 | 2059.712 | 2042.660 | 2042.665 | 2038.861
F8 | Best | 2213.617 | 2204.947 | 2209.794 | 2202.691 | 2202.819 | 2206.141 | 2203.812
F8 | Std | 3.287674 | 5.057958 | 4.132935 | 6.542852 | 5.688333 | 4.842115 | 5.877626
F8 | Mean | 2224.956 | 2224.397 | 2224.700 | 2221.238 | 2222.270 | 2223.126 | 2223.689
F8 | Median | 2224.863 | 2225.368 | 2225.915 | 2222.555 | 2223.998 | 2224.051 | 2225.141
F8 | Worst | 2232.605 | 2232.308 | 2229.803 | 2227.969 | 2227.876 | 2228.147 | 2230.420
F9 | Best | 2510.602 | 2518.721 | 2521.244 | 2508.868 | 2501.490 | 2512.881 | 2514.225
F9 | Std | 45.15194 | 36.26405 | 34.81383 | 10.02568 | 7.518856 | 4.555963 | 5.415469
F9 | Mean | 2582.450 | 2566.289 | 2567.374 | 2525.331 | 2524.904 | 2524.614 | 2523.913
F9 | Median | 2582.590 | 2566.398 | 2561.144 | 2524.991 | 2527.344 | 2524.959 | 2524.591
F9 | Worst | 2674.061 | 2650.742 | 2656.330 | 2564.274 | 2536.983 | 2531.966 | 2533.878
F10 | Best | 2500.213 | 2500.229 | 2500.186 | 2500.250 | 2500.285 | 2500.256 | 2500.339
F10 | Std | 57.37454 | 64.50411 | 62.01304 | 0.068716 | 0.097435 | 0.135486 | 0.086766
F10 | Mean | 2540.252 | 2559.726 | 2561.297 | 2500.379 | 2500.429 | 2500.471 | 2500.469
F10 | Median | 2500.527 | 2502.370 | 2554.732 | 2500.372 | 2500.418 | 2500.469 | 2500.453
F10 | Worst | 2628.760 | 2639.614 | 2636.280 | 2500.556 | 2500.683 | 2500.922 | 2500.668
F11 | Best | 2600.539 | 2600.913 | 2600.843 | 2600.851 | 2601.930 | 2601.776 | 2603.467
F11 | Std | 169.0856 | 66.55507 | 130.8655 | 63.13874 | 135.0487 | 112.4276 | 61.59830
F11 | Mean | 2678.347 | 2646.605 | 2652.849 | 2654.790 | 2710.581 | 2703.643 | 2689.650
F11 | Median | 2603.162 | 2604.155 | 2602.363 | 2606.769 | 2724.564 | 2732.358 | 2732.491
F11 | Worst | 3223.828 | 2788.254 | 3275.880 | 2752.298 | 3197.826 | 3191.905 | 2757.200
F12 | Best | 2851.128 | 2855.637 | 2850.101 | 2853.429 | 2853.414 | 2853.433 | 2856.232
F12 | Std | 14.27506 | 15.92806 | 8.744910 | 2.845476 | 2.391151 | 3.191990 | 2.704067
F12 | Mean | 2865.745 | 2867.139 | 2861.997 | 2860.225 | 2861.375 | 2861.021 | 2861.655
F12 | Median | 2861.552 | 2860.424 | 2860.280 | 2859.648 | 2861.668 | 2861.866 | 2861.316
F12 | Worst | 2901.384 | 2901.050 | 2900.071 | 2866.525 | 2866.180 | 2866.599 | 2866.945
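The scaling factor F examined in Table 3 plays the same role as in differential evolution: it weights the difference vector inside the ranking-based mutation operator. The sketch below shows one rank-biased, DE/rand/1-style donor construction; the linear ranking weights and the choice of which vectors are mutated are illustrative assumptions, not necessarily the paper's exact operator:

```python
import random

def rank_probabilities(n):
    # Linear ranking: rank 0 (the best) gets weight n, the worst gets weight 1.
    weights = [n - r for r in range(n)]
    total = sum(weights)
    return [w / total for w in weights]

def ranking_based_mutation(population, fitness, F=0.7, rng=random):
    # Sort indices so that rank 0 is the best individual (minimization).
    order = sorted(range(len(population)), key=lambda i: fitness[i])
    probs = rank_probabilities(len(population))
    # Draw three distinct individuals, biased toward better ranks.
    picked = []
    while len(picked) < 3:
        r = rng.choices(range(len(order)), weights=probs, k=1)[0]
        if r not in picked:
            picked.append(r)
    base, v1, v2 = (population[order[r]] for r in picked)
    # DE-style donor: base vector plus F times a difference vector.
    return [b + F * (x - y) for b, x, y in zip(base, v1, v2)]
```

A larger F produces longer difference steps and hence more exploration, while a smaller F keeps the donor close to the base vector and favors exploitation, which is the trade-off the sensitivity sweep in Table 3 probes.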
Table 4. Simulation results of different approaches for the CEC 2022.
Function | Result | BDO | CDO | GJO | KOA | LCA | SWO | RSO | HLBO | WWPA | SCHO | SCSO | MSSCSO
F1 | Best | 4925.789 | 8249.941 | 328.3754 | 16,253.06 | 8085.756 | 3867.746 | 1819.959 | 3932.907 | 8309.104 | 1833.181 | 322.7396 | 300.6584
F1 | Std | 1833.028 | 1.33 × 10⁶ | 2097.548 | 11,783.83 | 776.4972 | 4116.102 | 1448.814 | 3268.359 | 4852.778 | 61,039.22 | 1434.062 | 20.28569
F1 | Mean | 9656.029 | 270,488.7 | 2401.405 | 34,595.37 | 10,345.93 | 8855.867 | 2898.582 | 11,440.75 | 12,591.37 | 29,761.88 | 1309.624 | 318.9658
F1 | Median | 10,234.82 | 17,661.92 | 1772.794 | 31,105.16 | 10,767.56 | 8296.003 | 1856.988 | 11,725.69 | 10,863.60 | 7169.126 | 473.4388 | 310.3300
F1 | Worst | 13,376.90 | 7.30 × 10⁶ | 6500.795 | 56,400.90 | 11,022.00 | 17,229.55 | 6223.824 | 19,334.03 | 30,465.83 | 262,160.3 | 4667.163 | 372.1049
F1 | Rank | 5 | 12 | 6 | 10 | 2 | 8 | 4 | 7 | 9 | 11 | 3 | 1
F2 | Best | 829.4427 | 806.7506 | 407.6932 | 702.1547 | 624.9459 | 449.6727 | 569.4683 | 846.0378 | 1026.749 | 801.1575 | 400.1207 | 400.0055
F2 | Std | 240.5120 | 15.21908 | 24.26073 | 987.3036 | 1019.987 | 50.27431 | 206.8468 | 220.4127 | 1045.765 | 1000.582 | 33.18334 | 13.89718
F2 | Mean | 1165.991 | 849.1894 | 448.5079 | 2466.162 | 2250.694 | 530.7590 | 790.3062 | 1184.146 | 3036.182 | 1781.587 | 429.6792 | 410.6535
F2 | Median | 1091.252 | 852.3551 | 455.0632 | 2459.777 | 2046.559 | 519.9205 | 720.5938 | 1173.474 | 3225.143 | 1227.342 | 408.9474 | 407.4000
F2 | Worst | 1679.642 | 870.1469 | 487.0120 | 4940.152 | 5749.846 | 641.2478 | 1305.500 | 1664.289 | 5004.719 | 4342.539 | 502.0246 | 470.8142
F2 | Rank | 8 | 2 | 3 | 9 | 11 | 5 | 6 | 7 | 12 | 10 | 4 | 1
F3 | Best | 636.7920 | 628.4212 | 600.1967 | 675.9502 | 623.1902 | 616.4829 | 627.8814 | 650.0869 | 668.1951 | 627.6416 | 600.5163 | 600.1112
F3 | Std | 6.541113 | 4.020216 | 3.235669 | 8.538123 | 12.84364 | 8.493622 | 7.418231 | 6.405414 | 6.479185 | 17.92734 | 7.573123 | 2.594358
F3 | Mean | 656.2125 | 635.2412 | 604.6347 | 692.7154 | 665.5881 | 634.3778 | 643.7557 | 662.9752 | 681.1905 | 654.1414 | 610.8692 | 602.2905
F3 | Median | 656.3028 | 634.8934 | 604.0315 | 693.2727 | 668.6956 | 632.9724 | 643.3323 | 662.6562 | 679.9632 | 654.4357 | 609.4264 | 601.2250
F3 | Worst | 670.9830 | 644.6496 | 612.8190 | 708.2447 | 683.1953 | 653.7690 | 661.1006 | 675.1539 | 691.8179 | 700.069 | 628.9012 | 611.8587
F3 | Rank | 6 | 3 | 2 | 10 | 11 | 9 | 7 | 4 | 5 | 12 | 8 | 1
F4 | Best | 835.3624 | 832.5326 | 813.5885 | 876.4839 | 853.2148 | 837.0745 | 825.9955 | 837.5426 | 862.9625 | 838.5356 | 817.4304 | 804.9956
F4 | Std | 7.870513 | 6.262158 | 10.10826 | 11.46466 | 10.06061 | 9.519617 | 7.130501 | 10.09995 | 8.393425 | 11.08817 | 6.988211 | 6.092363
F4 | Mean | 853.8336 | 846.3934 | 827.8400 | 898.2084 | 875.2959 | 852.8306 | 839.3213 | 855.2635 | 881.4053 | 858.4868 | 828.9365 | 816.7228
F4 | Median | 854.6623 | 845.5254 | 825.0855 | 904.2125 | 876.8686 | 853.5633 | 839.3571 | 855.1270 | 882.8744 | 859.3410 | 829.1559 | 815.7339
F4 | Worst | 868.3677 | 860.5760 | 853.0835 | 909.6414 | 893.7319 | 875.2241 | 854.7475 | 875.2821 | 893.5234 | 877.6968 | 842.3118 | 831.8398
F4 | Rank | 5 | 2 | 10 | 12 | 8 | 7 | 4 | 9 | 6 | 11 | 3 | 1
F5 | Best | 1188.073 | 1275.414 | 900.9022 | 2983.761 | 1508.611 | 977.5962 | 1065.345 | 1247.959 | 1746.702 | 1206.901 | 900.7257 | 900.0761
F5 | Std | 170.4931 | 96.00600 | 91.90270 | 302.2772 | 279.2597 | 258.8724 | 114.0445 | 188.7156 | 153.9177 | 440.5012 | 129.9267 | 10.84537
F5 | Mean | 1562.193 | 1404.611 | 972.2127 | 3426.435 | 1940.832 | 1338.877 | 1296.001 | 1630.661 | 2208.417 | 1598.969 | 1031.926 | 907.1468
F5 | Median | 1564.401 | 1391.264 | 951.5516 | 3485.005 | 1941.104 | 1342.861 | 1299.212 | 1611.167 | 2236.375 | 1532.964 | 995.7267 | 901.8173
F5 | Worst | 1863.978 | 1671.877 | 1347.808 | 3733.018 | 2441.794 | 1915.348 | 1591.674 | 2086.508 | 2559.262 | 3771.114 | 1496.126 | 947.3566
F5 | Rank | 7 | 3 | 2 | 11 | 10 | 9 | 4 | 8 | 6 | 12 | 5 | 1
F6 | Best | 2.05 × 10⁶ | 4.64 × 10⁷ | 3448.482 | 2.16 × 10⁸ | 1.12 × 10⁷ | 9038.476 | 4897.245 | 2255.491 | 1.54 × 10⁷ | 7362.507 | 1938.987 | 1973.302
F6 | Std | 6.79 × 10⁷ | 2.92 × 10⁸ | 1972.504 | 7.68 × 10⁸ | 7.66 × 10⁷ | 2.07 × 10⁶ | 1.67 × 10⁷ | 1.56 × 10⁷ | 6.85 × 10⁷ | 8.11 × 10⁸ | 2250.499 | 2141.748
F6 | Mean | 9.98 × 10⁷ | 2.10 × 10⁸ | 7969.726 | 1.20 × 10⁹ | 1.10 × 10⁸ | 1.11 × 10⁶ | 6.97 × 10⁶ | 3.05 × 10⁶ | 1.17 × 10⁸ | 5.02 × 10⁸ | 4249.465 | 4515.021
F6 | Median | 8.59 × 10⁷ | 6.46 × 10⁷ | 8294.423 | 1.07 × 10⁹ | 1.07 × 10⁸ | 2.36 × 10⁵ | 2.69 × 10⁴ | 6.65 × 10⁴ | 1.14 × 10⁸ | 1.00 × 10⁸ | 3778.622 | 4402.019
F6 | Worst | 2.46 × 10⁸ | 9.53 × 10⁸ | 1.47 × 10⁴ | 3.06 × 10⁹ | 2.16 × 10⁸ | 9.07 × 10⁶ | 5.66 × 10⁷ | 8.55 × 10⁷ | 2.21 × 10⁸ | 3.54 × 10⁹ | 8186.026 | 8192.207
F6 | Rank | 7 | 10 | 1 | 11 | 9 | 4 | 6 | 5 | 8 | 12 | 3 | 2
F7 | Best | 2071.142 | 2115.190 | 2021.747 | 2111.493 | 2039.149 | 2054.527 | 2060.302 | 2097.086 | 2092.817 | 2115.381 | 2013.144 | 2001.167
F7 | Std | 19.66678 | 7.595220 | 14.31978 | 58.45546 | 45.75851 | 22.22771 | 28.52728 | 28.35122 | 30.46134 | 52.95661 | 27.02902 | 11.57752
F7 | Mean | 2111.562 | 2136.684 | 2045.262 | 2241.436 | 2151.434 | 2086.697 | 2086.549 | 2159.502 | 2191.014 | 2192.986 | 2045.393 | 2024.358
F7 | Median | 2110.646 | 2138.762 | 2042.527 | 2239.525 | 2148.629 | 2085.686 | 2083.649 | 2164.046 | 2198.537 | 2190.078 | 2039.688 | 2024.058
F7 | Worst | 2153.631 | 2146.214 | 2079.277 | 2346.168 | 2239.668 | 2142.546 | 2191.996 | 2215.862 | 2240.532 | 2331.348 | 2129.024 | 2059.712
F7 | Rank | 7 | 10 | 1 | 11 | 9 | 4 | 6 | 5 | 8 | 12 | 3 | 2
F8 | Best | 2244.850 | 2218.107 | 2221.919 | 2320.820 | 2236.046 | 2227.601 | 2228.669 | 2224.842 | 2250.125 | 2224.984 | 2209.145 | 2202.691
F8 | Std | 20.80617 | 4.158169 | 3.159479 | 1226.407 | 79.27812 | 12.27367 | 28.47745 | 28.38429 | 52.30765 | 141.8536 | 4.753517 | 6.542852
F8 | Mean | 2282.770 | 2229.619 | 2225.664 | 2893.113 | 2331.207 | 2243.272 | 2256.533 | 2262.251 | 2326.146 | 2371.111 | 2227.030 | 2221.238
F8 | Median | 2282.577 | 2230.040 | 2225.233 | 2569.242 | 2312.812 | 2242.107 | 2248.092 | 2262.338 | 2306.144 | 2299.889 | 2226.427 | 2222.555
F8 | Worst | 2349.706 | 2238.458 | 2232.937 | 9113.386 | 2502.489 | 2290.662 | 2361.180 | 2348.539 | 2471.151 | 2695.023 | 2234.457 | 2227.969
F8 | Rank | 6 | 2 | 1 | 12 | 10 | 5 | 8 | 7 | 9 | 11 | 3 | 4
F9 | Best | 2689.438 | 2656.222 | 2529.412 | 2763.846 | 2720.123 | 2501.430 | 2563.344 | 2684.120 | 2712.392 | 2653.578 | 2529.287 | 2508.868
F9 | Std | 27.03878 | 1.458474 | 35.86557 | 187.3811 | 85.30585 | 41.90462 | 42.79407 | 13.26892 | 40.87431 | 37.37313 | 41.92706 | 10.02568
F9 | Mean | 2737.129 | 2658.258 | 2589.261 | 3045.117 | 2851.621 | 2577.667 | 2659.585 | 2707.853 | 2818.665 | 2698.775 | 2579.727 | 2525.331
F9 | Median | 2732.001 | 2658.254 | 2583.562 | 3006.600 | 2841.078 | 2581.562 | 2667.191 | 2706.252 | 2813.997 | 2706.640 | 2582.722 | 2524.991
F9 | Worst | 2796.155 | 2662.427 | 2660.870 | 3655.114 | 3021.756 | 2682.051 | 2759.172 | 2752.530 | 2907.409 | 2772.768 | 2683.415 | 2564.274
F9 | Rank | 4 | 1 | 5 | 12 | 11 | 8 | 10 | 3 | 7 | 6 | 9 | 2
F10 | Best | 2524.034 | 2617.031 | 2500.378 | 2541.004 | 2530.898 | 2501.528 | 2508.385 | 2539.240 | 2559.044 | 2563.400 | 2500.369 | 2500.250
F10 | Std | 31.56800 | 136.8292 | 61.09714 | 240.9939 | 663.2109 | 79.14043 | 30.89074 | 204.1968 | 514.8208 | 338.6159 | 64.43356 | 0.068716
F10 | Mean | 2565.077 | 2729.747 | 2549.847 | 2698.455 | 3215.576 | 2581.953 | 2515.619 | 2871.122 | 3398.702 | 2974.808 | 2559.468 | 2500.379
F10 | Median | 2559.226 | 2698.023 | 2500.609 | 2611.429 | 2948.854 | 2542.406 | 2509.145 | 2827.971 | 3328.672 | 2915.630 | 2500.696 | 2500.372
F10 | Worst | 2652.408 | 3348.214 | 2644.563 | 3796.995 | 4994.002 | 2739.043 | 2678.422 | 3311.689 | 4131.091 | 4028.904 | 2645.483 | 2500.556
F10 | Rank | 3 | 7 | 4 | 9 | 12 | 6 | 2 | 8 | 11 | 10 | 5 | 1
F11 | Best | 2961.518 | 3333.839 | 2729.998 | 5291.300 | 3296.152 | 2811.470 | 2899.327 | 2989.809 | 3807.391 | 3337.346 | 2600.910 | 2600.851
F11 | Std | 373.4564 | 4.709603 | 209.2536 | 2.7 × 10⁻¹² | 500.8876 | 422.8975 | 310.2735 | 1717.658 | 314.3944 | 593.7176 | 155.5155 | 63.13874
F11 | Mean | 3812.036 | 3342.067 | 2948.272 | 5291.300 | 4686.232 | 3479.935 | 3315.277 | 5161.047 | 4668.855 | 3990.278 | 2803.975 | 2654.790
F11 | Median | 3740.752 | 3342.149 | 2871.473 | 5291.300 | 4909.681 | 3561.035 | 3282.278 | 4963.890 | 4763.186 | 3880.141 | 2742.402 | 2606.769
F11 | Worst | 4393.412 | 3351.079 | 3471.845 | 5291.300 | 5113.701 | 4659.780 | 4090.669 | 11,367.20 | 5066.628 | 5273.214 | 3131.513 | 2752.298
F11 | Rank | 8 | 2 | 5 | 1 | 10 | 9 | 6 | 12 | 7 | 11 | 4 | 3
F12 | Best | 2872.613 | 2987.326 | 2859.822 | 2891.037 | 2995.417 | 2874.042 | 2871.779 | 2884.172 | 2892.118 | 3008.190 | 2862.568 | 2853.429
F12 | Std | 7.917417 | 29.73760 | 9.735571 | 47.77224 | 115.5592 | 4.739680 | 36.53797 | 2.911470 | 1.439513 | 141.8999 | 15.69188 | 2.845476
F12 | Mean | 2892.342 | 3065.753 | 2868.682 | 2964.658 | 3194.247 | 2899.137 | 2901.327 | 2899.390 | 2899.740 | 3212.078 | 2870.729 | 2860.225
F12 | Median | 2892.118 | 3063.466 | 2865.081 | 2964.306 | 3190.925 | 2900.002 | 2885.013 | 2900.002 | 2900.002 | 3184.156 | 2865.424 | 2859.648
F12 | Worst | 2912.796 | 3114.900 | 2897.656 | 3059.869 | 3436.487 | 2900.002 | 3016.087 | 2900.002 | 2900.002 | 3513.019 | 2938.428 | 2866.525
F12 | Rank | 5 | 8 | 6 | 10 | 11 | 4 | 9 | 3 | 1 | 12 | 7 | 2
Table 5. The execution time of different approaches for the CEC 2022 test functions.
Function | BDO | CDO | GJO | KOA | LCA | SWO | RSO | HLBO | WWPA | SCHO | SCSO | MSSCSO
F1–F12 | 189.2856 | 51.8795 | 67.0120 | 76.1346 | 48.2005 | 43.2389 | 38.7741 | 103.4885 | 65.2331 | 96.4285 | 52.2834 | 79.8849
Table 6. Results of the p-value Wilcoxon rank-sum test on the benchmark functions.
Function | MSSCSO vs.: BDO | CDO | GJO | KOA | LCA | SWO | RSO | HLBO | WWPA | SCHO | SCSO
F1 | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 8.99 × 10⁻¹¹ | 2.93 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 1.61 × 10⁻¹⁰
F2 | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 1.85 × 10⁻⁸ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 4.08 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 5.55 × 10⁻⁵
F3 | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 2.05 × 10⁻³ | 3.00 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 7.09 × 10⁻⁸
F4 | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 8.88 × 10⁻⁶ | 2.69 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 4.98 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 6.01 × 10⁻⁸
F5 | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 2.20 × 10⁻⁷ | 2.55 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 4.57 × 10⁻⁹
F6 | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 2.83 × 10⁻⁸ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 1.55 × 10⁻⁹ | 3.52 × 10⁻⁷ | 3.02 × 10⁻¹¹ | 9.92 × 10⁻¹¹ | 3.40 × 10⁻²
F7 | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 1.73 × 10⁻⁷ | 3.02 × 10⁻¹¹ | 4.08 × 10⁻¹¹ | 3.34 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 1.17 × 10⁻⁴
F8 | 3.02 × 10⁻¹¹ | 1.43 × 10⁻⁸ | 1.00 × 10⁻³ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.69 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 9.92 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 1.21 × 10⁻¹⁰ | 4.12 × 10⁻⁶
F9 | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 1.09 × 10⁻¹⁰ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 9.83 × 10⁻⁸ | 3.34 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 6.72 × 10⁻¹⁰
F10 | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 1.10 × 10⁻⁸ | 2.98 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 2.23 × 10⁻⁹
F11 | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 8.15 × 10⁻¹¹ | 1.21 × 10⁻¹² | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 2.60 × 10⁻⁵
F12 | 2.39 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 1.70 × 10⁻⁸ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 3.02 × 10⁻¹¹ | 5.00 × 10⁻⁹
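The entries of Table 6 are two-sided Wilcoxon rank-sum p-values comparing the proposed algorithm against each competitor over the independent runs; values below 0.05 indicate a statistically significant difference. A minimal standard-library sketch of the test using the large-sample normal approximation with midranks for ties (the exact-distribution and tie-variance corrections of a full implementation are omitted):

```python
import math

def wilcoxon_ranksum_p(x, y):
    # Two-sided rank-sum test via the large-sample normal approximation
    # (midranks for ties; no tie-variance or continuity correction).
    n1, n2 = len(x), len(y)
    pooled = sorted(x + y)
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2.0   # average of ranks i+1 .. j
        i = j
    w = sum(ranks[v] for v in x)               # rank sum of the first sample
    mean = n1 * (n1 + n2 + 1) / 2.0
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mean) / sd
    return math.erfc(abs(z) / math.sqrt(2.0))  # two-sided tail probability
```

Identical samples yield p = 1 (no evidence of a difference), while completely separated samples drive p toward the small values that dominate Table 6.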
Table 7. Experimental results (MSE) of each approach for case 1.
Result | COA | GOOSE | HLOA | CDO | LCA | SCHO | SWO | SCSO | MSSCSO
Best | 0.010425 | 0.009364 | 0.009782 | 0.013410 | 0.012880 | 0.010675 | 0.013887 | 0.009837 | 0.008475
Worst | 0.018360 | 0.018838 | 0.019014 | 0.018875 | 0.021729 | 0.019553 | 0.026223 | 0.018100 | 0.016172
Mean | 0.014099 | 0.012353 | 0.012995 | 0.016588 | 0.018085 | 0.013451 | 0.018294 | 0.013034 | 0.011493
Std | 0.002573 | 0.002330 | 0.002372 | 0.001860 | 0.002278 | 0.002525 | 0.003629 | 0.002657 | 0.001287
Rank | 7 | 4 | 5 | 2 | 3 | 6 | 9 | 8 | 1
Table 8. The finest evaluation variables of each approach for case 1.
Result | COA | GOOSE | HLOA | CDO | LCA | SCHO | SWO | SCSO | MSSCSO
a | 0.919218 | 0.896197 | 0.921217 | 0.103887 | 0.136351 | 0.912120 | 0.942938 | 0.953202 | −0.10047
b | −0.23761 | −0.32082 | −0.29290 | 0.162300 | 0.135762 | −0.24297 | −0.27375 | −0.22891 | 0.100398
Table 9. The average evaluation variables of each approach for case 1.
Result | COA | GOOSE | HLOA | CDO | LCA | SCHO | SWO | SCSO | MSSCSO
a | 0.505677 | 0.633612 | 0.572077 | 0.236824 | 0.250614 | 0.505595 | 0.742000 | 0.605683 | 0.804566
b | −0.11108 | −0.13753 | −0.19323 | 0.064053 | 0.027292 | −0.14041 | −0.25005 | −0.15625 | −0.23740
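Tables 7–9 concern case 1, in which the unknown plant is approximated by a first-order model with the two coefficients a and b. Assuming the common first-order form H(z) = b/(1 − az⁻¹), the following is a minimal sketch of the MSE fitness that the optimizers minimize; the excitation x and the desired plant output d below are hypothetical:

```python
def first_order_iir(x, a, b):
    # Difference equation of H(z) = b / (1 - a z^-1):  y[n] = b*x[n] + a*y[n-1]
    y, prev = [], 0.0
    for xn in x:
        prev = b * xn + a * prev
        y.append(prev)
    return y

def mse_fitness(params, x, d):
    # Mean square error between the desired output d and the model output.
    a, b = params
    y = first_order_iir(x, a, b)
    return sum((di - yi) ** 2 for di, yi in zip(d, y)) / len(d)
```

When d is generated by the same structure, the true coefficient pair drives the MSE to zero; any mismatch in a or b raises it, and this surface is exactly the landscape the swarm searches.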
Table 10. Experimental results (MSE) of each approach for case 2.
Result | COA | GOOSE | HLOA | CDO | LCA | SCHO | SWO | SCSO | MSSCSO
Best | 6.96 × 10⁻⁵ | 3.31 × 10⁻⁹ | 3.68 × 10⁻⁵ | 0.123240 | 0.015756 | 2.81 × 10⁻⁶ | 0.004408 | 1.10 × 10⁻⁸ | 3.14 × 10⁻⁹
Worst | 0.228098 | 0.244192 | 0.236883 | 0.264669 | 0.279143 | 0.260364 | 0.377817 | 0.244137 | 1.17 × 10⁻⁶
Mean | 0.068231 | 0.048700 | 0.081601 | 0.212765 | 0.201706 | 0.128699 | 0.112613 | 0.057494 | 1.81 × 10⁻⁷
Std | 0.082517 | 0.088644 | 0.072197 | 0.033496 | 0.067422 | 0.105720 | 0.101871 | 0.089363 | 2.58 × 10⁻⁷
Rank | 5 | 6 | 4 | 2 | 3 | 9 | 8 | 7 | 1
Table 11. The finest evaluation variables of each approach for case 2.
Result | COA | GOOSE | HLOA | CDO | LCA | SCHO | SWO | SCSO | MSSCSO
a1 | −1.39981 | −1.39999 | −1.41159 | −1.06573 | −1.40094 | −1.40155 | −1.31698 | −1.39996 | −1.40000
a2 | 0.492175 | 0.489979 | 0.500264 | 0.137023 | 0.481619 | 0.491472 | 0.412350 | 0.489950 | 0.489998
b | 0.998340 | 1.000000 | 0.983396 | 0.954633 | 0.821923 | 0.996758 | 1.053119 | 0.999953 | 1.000000
Table 12. The average evaluation variables of each approach for case 2.
Result | COA | GOOSE | HLOA | CDO | LCA | SCHO | SWO | SCSO | MSSCSO
a1 | −0.99638 | −0.97431 | −1.00668 | 0.062561 | −0.26140 | −0.61345 | −1.00668 | −0.98987 | −1.40011
a2 | 0.313780 | 0.444394 | 0.179891 | 0.21005 | 0.143142 | 0.104067 | 0.201868 | 0.287284 | 0.490102
b | 0.691226 | 0.824775 | 0.705912 | 0.22144 | 0.236695 | 0.533497 | 1.052941 | 0.768647 | 0.999844
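The case 2 coefficients in Tables 11 and 12 describe a second-order model, and a useful sanity check on any identified IIR filter is that its poles lie inside the unit circle. Assuming the convention H(z) = b/(1 + a1·z⁻¹ + a2·z⁻²), the following is a minimal sketch of that stability test; with the finest MSSCSO values a1 ≈ −1.4 and a2 ≈ 0.49 the poles sit at the double root z = 0.7:

```python
import cmath

def second_order_poles(a1, a2):
    # Poles of H(z) = b / (1 + a1 z^-1 + a2 z^-2): the roots of z^2 + a1*z + a2.
    disc = cmath.sqrt(a1 * a1 - 4.0 * a2)
    return (-a1 + disc) / 2.0, (-a1 - disc) / 2.0

def is_stable(a1, a2):
    # A causal IIR filter is BIBO-stable iff every pole lies strictly
    # inside the unit circle.
    return all(abs(p) < 1.0 for p in second_order_poles(a1, a2))
```

Coefficient sets far from the identified optimum, such as those producing a pole outside the unit circle, correspond to divergent filters that the fitness evaluation would penalize heavily.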
Table 13. Experimental results (MSE) of each approach for case 3.
Result | COA | GOOSE | HLOA | CDO | LCA | SCHO | SWO | SCSO | MSSCSO
Best | 0.017946 | 0.000297 | 0.007608 | 0.012796 | 0.020442 | 0.006856 | 0.018455 | 0.000546 | 0.000221
Worst | 0.031682 | 0.024122 | 0.080126 | 0.027749 | 0.033494 | 0.075749 | 0.156516 | 0.056702 | 0.015203
Mean | 0.025111 | 0.004569 | 0.048196 | 0.018297 | 0.027239 | 0.042278 | 0.080582 | 0.008727 | 0.004805
Std | 0.003385 | 0.005982 | 0.019007 | 0.003816 | 0.003466 | 0.024086 | 0.037226 | 0.010906 | 0.004760
Rank | 1 | 5 | 7 | 3 | 2 | 8 | 9 | 6 | 4
Table 14. The finest evaluation variables of each approach for case 3.
Result | COA | GOOSE | HLOA | CDO | LCA | SCHO | SWO | SCSO | MSSCSO
a1 | 0.953586 | 0.007352 | −0.20786 | 0.878369 | 0.905309 | −0.27987 | 0.022008 | −0.00541 | 0.008618
a2 | 0.953586 | −0.04654 | −0.13216 | 0.265057 | 0.905486 | −0.16811 | −0.04303 | −0.00298 | −0.02547
a3 | 0.953586 | 0.005118 | −0.17535 | 0.539142 | 0.905668 | −0.06786 | −0.42077 | 0.027448 | −0.00666
a4 | 0.953586 | −0.80073 | −0.40395 | 0.177061 | 0.903126 | −0.36939 | −0.26702 | −0.83054 | −0.83264
b0 | 0.953586 | 0.996689 | 0.914360 | 0.972802 | 0.902231 | 0.997788 | 0.911890 | 0.946202 | 0.997731
b1 | 0.953586 | 0.001516 | −0.25320 | 0.638042 | 0.906205 | −0.19897 | 0.196905 | −0.04245 | −0.01344
b2 | 0.953586 | 0.326476 | 0.077720 | 0.332544 | 0.902735 | 0.187419 | 0.082871 | 0.372305 | 0.341360
b3 | 0.953586 | 0.005216 | −0.09017 | 0.667846 | 0.904501 | −0.27117 | −0.32481 | 0.013260 | −0.03256
b4 | 0.953586 | −0.32506 | −0.11180 | 0.274141 | 0.905339 | −0.04838 | −0.06055 | −0.27418 | −0.33272
Table 15. The average evaluation variables of each approach for case 3.
ResultCOAGOOSEHLOACDOLCASCHOSWOSCSOMSSCSO
a 1 0.6891010.026186−0.088640.3670870.889640.046238−0.01561−0.01891−0.04763
a 2 0.686175−0.125760.0292590.1407990.8894280.021665−0.07093−0.19670−0.24465
a 3 0.6571350.1280470.0499460.3473530.8909350.032133−0.073620.0197010.005844
a 4 0.589617−0.46737−0.002190.1043410.887833−0.01196−0.03842−0.32093−0.41933
b 0 0.7604310.9621520.3522170.8307970.8920010.4947110.4665690.8689080.928592
b 1 0.6543530.0132910.0001010.2227360.8895280.034416−0.02076−0.02577−0.05120
b 2 0.7292080.2406030.1443260.2262740.8913940.0767280.0591020.0941510.095193
b 3 0.6311460.1180330.0940430.2571690.891455−0.02049−0.04550−0.01461−0.00774
b 4 0.671518−0.128220.1363920.2427470.8888260.0614400.1212460.034869−0.01609
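Tables 14 and 15 list the estimated numerator (b0–b4) and denominator (a1–a4) coefficients of the fourth-order case-3 filter. A minimal sketch of how a candidate coefficient set can be scored by the MSE fitness value, assuming the common direct-form recursion y(n) = Σ bk·x(n−k) − Σ ak·y(n−k); the sign convention and the excitation signal are assumptions for illustration, not the paper's exact experimental setup:

```python
import numpy as np

def iir_filter(b, a, x):
    """Direct-form IIR filter: y[n] = sum_k b[k]*x[n-k] - sum_k a[k]*y[n-k],
    where a = [a1, ..., a4] and the leading denominator coefficient is 1."""
    y = np.zeros(len(x))
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc -= sum(a[k - 1] * y[n - k] for k in range(1, len(a) + 1) if n - k >= 0)
        y[n] = acc
    return y

def mse(y_model, y_ref):
    """Mean square error between the model output and the reference output."""
    return float(np.mean((np.asarray(y_model) - np.asarray(y_ref)) ** 2))

# MSSCSO "finest" case-3 coefficients from Table 14, plugged into the
# assumed convention above.
b = [0.997731, -0.01344, 0.341360, -0.03256, -0.33272]
a = [0.008618, -0.02547, -0.00666, -0.83264]

rng = np.random.default_rng(1)
x = rng.standard_normal(200)     # assumed white-noise excitation
y_ref = iir_filter(b, a, x)      # reference output of the identified filter

# Perturbing one coefficient yields a strictly positive MSE; this is the
# quantity the optimizer drives toward zero during identification.
model_error = mse(iir_filter([b[0] + 0.01] + b[1:], a, x), y_ref)
```

The swarm algorithms in the comparison all search the coefficient space for the set that minimizes this MSE between the unknown system's response and the adaptive filter's response.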
Table 16. The execution time of different approaches for cases 1, 2, and 3.

Case   | COA     | GOOSE   | HLOA    | CDO     | LCA    | SCHO   | SWO    | SCSO   | MSSCSO
Case 1 | 9.6983  | 11.2469 | 12.3444 | 14.5551 | 5.3123 | 4.8513 | 3.6918 | 5.2933 | 8.9495
Case 2 | 10.4756 | 12.6700 | 12.8912 | 15.1089 | 5.5421 | 5.0947 | 3.6831 | 5.9431 | 9.8702
Case 3 | 11.2005 | 13.9707 | 13.7214 | 17.3076 | 5.6333 | 5.3919 | 3.5543 | 7.2435 | 10.6880
Table 17. Results of the p-value Wilcoxon rank-sum test.

Result | COA           | GOOSE        | HLOA          | CDO           | LCA           | SCHO          | SWO           | SCSO
Case 1 | 1.11 × 10^−4  | 4.29 × 10^−2 | 9.07 × 10^−3  | 8.15 × 10^−11 | 6.07 × 10^−11 | 3.37 × 10^−4  | 8.99 × 10^−11 | 5.94 × 10^−4
Case 2 | 3.02 × 10^−11 | 6.63 × 10^−3 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 1.91 × 10^−2
Case 3 | 3.02 × 10^−11 | 3.87 × 10^−2 | 6.07 × 10^−11 | 1.61 × 10^−10 | 3.02 × 10^−11 | 1.78 × 10^−10 | 3.02 × 10^−11 | 7.48 × 10^−5
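The p-values above compare each competitor's per-run MSE samples against MSSCSO's; values below 0.05 indicate a statistically significant difference. A minimal normal-approximation sketch of the two-sided Wilcoxon rank-sum test, without the tie correction (a reasonable simplification for continuous MSE data); the two samples below are hypothetical, not the paper's measurements:

```python
import math
import numpy as np

def rank_sum_p(sample_a, sample_b):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation.
    Ties receive distinct consecutive ranks (no tie correction)."""
    a = np.asarray(sample_a, dtype=float)
    b = np.asarray(sample_b, dtype=float)
    n1, n2 = a.size, b.size
    combined = np.concatenate([a, b])
    ranks = np.empty(n1 + n2)
    ranks[np.argsort(combined)] = np.arange(1, n1 + n2 + 1)
    w = ranks[:n1].sum()                            # rank sum of the first sample
    mu = n1 * (n1 + n2 + 1) / 2.0                   # mean of W under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    # Two-sided tail probability 2*(1 - Phi(|z|)) = erfc(|z| / sqrt(2)).
    return math.erfc(abs(z) / math.sqrt(2.0))

# Hypothetical MSE samples from 30 runs of two algorithms; the clearly
# better sample should yield a very small p-value.
rng = np.random.default_rng(0)
mse_msscso = rng.normal(0.005, 0.001, size=30)
mse_other = rng.normal(0.08, 0.005, size=30)
p = rank_sum_p(mse_msscso, mse_other)
```

With 30 runs per algorithm, the statistic saturates once the two samples are completely separated, which is why the same minimal p-value recurs across several columns of Table 17.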
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Du, C.; Zhang, J.; Fang, J. An Enhanced Symmetric Sand Cat Swarm Optimization with Multiple Strategies for Adaptive Infinite Impulse Response System Identification. Symmetry 2024, 16, 1255. https://doi.org/10.3390/sym16101255
