Article

Slime Mould Algorithm Based on a Gaussian Mutation for Solving Constrained Optimization Problems

1 Department of Mathematics, Chandigarh University, Mohali 140413, Punjab, India
2 Department of Industry 4.0, Shri Vishwakarma Skill University, Palwal 121102, Haryana, India
3 Department of Physics & Electronics, School of Sciences, JAIN (Deemed to Be University), Bangalore 560069, Karnataka, India
4 Faculty of Physics and Applied Computer Science, AGH University of Krakow, 30-059 Krakow, Poland
5 MEU Research Unit, Middle East University, Amman 11813, Jordan
* Author to whom correspondence should be addressed.
Mathematics 2024, 12(10), 1470; https://doi.org/10.3390/math12101470
Submission received: 8 April 2024 / Revised: 5 May 2024 / Accepted: 7 May 2024 / Published: 9 May 2024
(This article belongs to the Section Mathematics and Computer Science)

Abstract

The slime mould algorithm (SMA) tends to become trapped in local optima, suffers from low population diversity, and exploits insufficiently as real-world optimization problems grow more complex. To overcome these limitations, a Gaussian mutation (GM) with a novel strategy is proposed to enhance SMA; the resulting algorithm is named SMA-GM. The GM increases population diversity, which helps SMA escape local optima while retaining a robust local search capability. Additionally, the oscillatory parameter is updated and combined with GM to balance exploration and exploitation. Using a greedy selection technique, this study retains the optimal slime mould position while ensuring the algorithm's rapid convergence. The performance of SMA-GM was evaluated on unconstrained, constrained, and CEC2022 benchmark functions. The results show that the proposed SMA-GM has a stronger global search capability, improved stability, a faster convergence rate, and the ability to solve constrained optimization problems. The Wilcoxon rank sum test further shows a significant difference between the optimization outcomes of SMA-GM and each compared algorithm. Finally, engineering problems such as the industrial refrigeration system (IRS), the optimal operation of the alkylation unit, and the welded beam and tension/compression spring design problems are solved; the results confirm that the proposed algorithm reaches the optimum value with better optimization efficiency.

1. Introduction

Optimization is the process of finding the best global solution to a given problem within its search space. Many real-world problems pose optimization challenges, and the need for novel optimization strategies becomes more apparent as problems grow more complicated. Several approaches have advanced significantly over the last few decades. Before the development of heuristic optimization techniques, for example, mathematical optimization was the only method available for solving such problems, and these techniques require knowledge of the optimization problem's properties, such as continuity or differentiability. Metaheuristic techniques have gained increasing momentum in recent years. Popular algorithms in this domain include particle swarm optimization (PSO) [1], genetic algorithms (GA) [2], moth flame optimization [3], the whale optimization algorithm [4], and grey wolf optimization [5]. These algorithms find application in many fields of research and industry. Notwithstanding these optimizers' advantages, the fundamental question is whether a single optimizer can solve every optimization problem. The No Free Lunch (NFL) theorem [6] states that no one algorithm performs best on all problems, which encourages researchers to create new algorithms, or improvements of existing ones, that are more effective in specific situations, thereby advancing the solution of complex optimization problems.
Constrained optimization problems (COPs) are optimization problems whose feasible solutions are limited by certain restrictions. Constrained evolutionary algorithms handle them by combining evolutionary algorithms with constraint-handling techniques. Evolutionary algorithms are commonly employed to tackle constrained optimization problems because of their strong universality, robustness, reliability, and low information requirements [7]. Over the past 20 years, numerous algorithm-based constraint-handling strategies have been suggested. Currently, the most popular and straightforward approach is the penalty function method [8], which adds a penalty term to the objective function to transform a constrained optimization problem into an unconstrained one. Since most real-world problems include constraints, it is essential to determine the optimum solution of an optimization problem subject to constraints such as resources, time, cost, or design requirements. The slime mould algorithm (SMA), a bio-inspired optimization method [9], was proposed in 2020 by Li et al. It is inspired by the oscillatory behavior of slime mould during foraging. The benefits of the SMA are its excellent scalability, few parameters, and straightforward concepts.
While the SMA has been shown to work exceptionally well, it is not without limitations. It struggles to balance the two search trends, exploration and exploitation. This limitation can lead to inadequate solution precision, slower convergence, and an inability to escape local optima. To address these limitations, this study proposes an enhanced algorithm called SMA-GM, which introduces a Gaussian mutation (GM) technique into SMA. The main contributions are as follows:
  • The proposed work designs a Gaussian mutation (GM) scheme that acts on the current positions of the slime mould. The scheme efficiently boosts the local search capability around the optimal position of the slime mould and avoids falling into local optima;
  • A greedy selection approach is used to ensure that slime moulds with better fitness enter the next generation, which maintains the algorithm's convergence speed and preserves population diversity;
  • The proposed strategy is verified on 13 unconstrained and 13 constrained optimization problems as well as the CEC2022 benchmark functions. SMA-GM is compared with the original SMA and with several well-established optimization algorithms, and constrained engineering problems are solved with SMA-GM;
  • The experimental results, together with the Wilcoxon rank sum test as a statistical test, demonstrate the advantages and superior performance of the proposed SMA-GM algorithm. The findings confirm that the GM approach effectively improves the classical SMA's search efficiency.
The remaining structure of this paper is organized as follows: Section 2 provides related work. Section 3 presents the preliminaries. In Section 4, the proposed SMA-GM algorithm is explained. Experimental results are given in Section 5. Section 6 includes the mechanical engineering design problem. Lastly, conclusions and directions for further study are presented in Section 7.

2. Related Work

The referenced literature shows that researchers have created a wide range of hybrid and improved versions of SMA to address many kinds of stochastic problems by mimicking the food-seeking behavior of slime mould. Yuan et al. [10] proposed an improved algorithm by embedding elite and chaotic stochastic approaches into the structure of SMA to maintain a better balance between the exploration and exploitation stages. Naik et al. [11] enhanced the SMA's exploration capability by adding a balanced pool based on the equilibrium optimizer, which improves the randomness of SMA individuals. Houssein et al. [12] used an adaptive guided differential evolution (DE) algorithm to improve SMA. Yu et al. [13] introduced a quantum-based enhanced SMA, which gives better results on many optimization problems. A hybrid of the arithmetic optimizer with SMA was employed by Chauhan et al. [14] to address its limited internal memory and slow convergence near local optima. Zhao et al. [15] enhanced the searching ability of SMA by replacing random weights with Lévy flights. Abualigah et al. [16] proposed an opposition-based learning algorithm to increase the SMA convergence rate, and further incorporated a Lévy flight distribution into SMA to improve the exploration and exploitation stages. Örnek et al. [17] proposed updating the SMA position by imposing the reciprocal oscillation of sine and cosine in each generation, which successfully boosts the exploration phase.
Gaussian mutation (GM) is a widely used mutation operator in optimization. Researchers have proposed GM as an improvement mechanism to boost the exploration or exploitation of an algorithm. Yu et al. [18] proposed a quantum revolving door and GM approach to improve the performance of the dragonfly algorithm (DA) efficiently. Zhang et al. [19] employed fruit fly optimization (FOA) with a GM strategy and a chaotic local search approach for solving feature selection problems. Chen et al. [20] proposed an improved bacterial foraging optimization (BFO) with GM and a chaotic design to strengthen its performance. To balance exploration and exploitation, Luo et al. [21] developed an enhanced multi-strategy grasshopper optimization algorithm (GOA) with GM, Lévy flight, and opposition-based learning strategies. An enhanced Gaussian barebone was proposed by S. Wu et al. [22] as a DE mutation extension to improve the update approach of SMA.

3. Preliminary

3.1. Slime Mould Algorithm

The SMA [9] is a novel stochastic optimizer derived from the oscillation model of slime mould in nature. It has several novel characteristics and a unique mathematical framework that uses adaptive weights to simulate the positive and negative feedback produced by bio-oscillator-based propagation waves, forming optimal paths for connecting food and thereby supporting exploration and exploitation. The SMA mimics the behavioral changes that slime moulds undergo when searching for and encircling food during the foraging phase. They have a fan-like morphology at the front, followed by a web of connecting veins. Diffusive waves created by the slime mould's bio-oscillator modify the cytoplasmic flow in the veins as they reach the food point and eventually generate the optimal path. The constant z (with value 0.03) and the transition probability p are critical parameters for maintaining balance in the SMA. SMA explores randomly between the bounds lb and ub when a randomly created value is smaller than z. Otherwise, SMA exploits and searches in neighboring areas of the current position if the random value r is greater than p, and wraps around the current best position when r is smaller than p; the fitness of the current location determines the wrapping direction and radius. The following operations describe the SMA behavior.
Approach food: To simulate the contraction mode and represent the approaching behavior of slime mould, the following rule is presented:
$$
X(t+1)=\begin{cases} X_b(t)+v_b\left(W\cdot X_A(t)-X_B(t)\right), & r<p \\ v_c\cdot X(t), & r\ge p \end{cases}
$$
where $v_c$ decreases linearly from one to zero and $v_b$ ranges over $[-a, a]$. $X$ indicates the location of the slime mould; $X_A$ and $X_B$ signify two individuals chosen arbitrarily from the population; $W$ is the slime mould weight; and $t$ indicates the current iteration. $X_b$ represents the individual position with the maximum odor concentration. The formula for $p$ is:
$$p=\tanh\left|S(i)-DF\right|$$

where $S(i)$ is the fitness of individual $i$ and $DF$ is the best fitness obtained over all iterations.
The formulas for $v_b$ and $W$ are:

$$a=\operatorname{arctanh}\left(-\frac{t}{max\_t}+1\right)$$

$$
W(SlI(i))=\begin{cases} 1+r\cdot\log\left(\dfrac{bf-S(i)}{bf-wf}+1\right), & condition \\ 1-r\cdot\log\left(\dfrac{bf-S(i)}{bf-wf}+1\right), & others \end{cases}
$$

$$SlI=\mathrm{sort}(S)$$
where $SlI$ indicates the smell index, i.e., the sorted order of fitness values; $bf$ signifies the optimum fitness found in the current iteration; $wf$ is the poorest fitness value found in the iterative course; $condition$ indicates that $S(i)$ ranks in the first half of the population; and $r$ denotes a random value in the interval [0, 1].
Wrap food: The mathematical strategy given below is applied for updating the location of slime mould
$$
X(t+1)=\begin{cases} rand\cdot(ub-lb)+lb, & rand<z \\ X_b(t)+v_b\left(W\cdot X_A(t)-X_B(t)\right), & r<p \\ v_c\cdot X(t), & r\ge p \end{cases}
$$
where $r$ and $rand$ indicate arbitrary values in [0, 1]; $lb$ and $ub$ indicate the lower and upper boundaries of the search range; and $z$ is a parameter set to 0.03.
Oscillation: As the number of iterations increases, $v_b$ oscillates arbitrarily within $[-a, a]$ and gradually approaches zero, while $v_c$ oscillates within $[-1, 1]$ and ultimately tends to zero. The pseudocode of SMA is presented in Algorithm 1.
Algorithm 1: Pseudocode of SMA
Begin
  1. Initialize the parameters popsize, Max_iteration
  2. Initialize the positions of slime mould X_i (i = 1, 2, ..., n)
  3. While (t ≤ Max_iteration)
       Calculate the fitness of all slime mould
       Update bestFitness, X_b
       Calculate W by Equation (4)
       For each search portion
         Update p, vb, vc
         Update positions by Equation (6)
       End For
       t = t + 1
  4. End While
  5. Return the best fitness and best position
End
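To make the update rules above concrete, the following is a minimal Python sketch of one SMA iteration, combining Equations (1)–(6). All function and variable names are illustrative, and details such as the amplitude schedule and the tiny denominator guard are implementation assumptions rather than part of the original specification.

```python
import numpy as np

def sma_step(X, fitness, t, max_t, lb, ub, z=0.03):
    """One SMA iteration (sketch). X: (n, dim) positions; fitness: (n,) values to minimize."""
    n, dim = X.shape
    order = np.argsort(fitness)              # smell index: ascending fitness for minimization
    bf, wf = fitness[order[0]], fitness[order[-1]]
    Xb = X[order[0]].copy()                  # position with the best fitness

    # Weight W, Eq. (4): the better half of the population uses 1 + r*log(...), the rest 1 - r*log(...)
    W = np.ones((n, dim))
    denom = (bf - wf) - 1e-300               # assumed guard against division by zero
    for rank, i in enumerate(order):
        r = np.random.rand(dim)
        term = np.log10((bf - fitness[i]) / denom + 1)
        W[i] = 1 + r * term if rank < n // 2 else 1 - r * term

    a = np.arctanh(1 - (t + 1) / max_t)      # oscillation amplitude, Eq. (3) (offset by 1 to stay finite)
    vc = 1 - t / max_t                       # decreases linearly from 1 to 0
    new_X = np.empty_like(X)
    for i in range(n):
        if np.random.rand() < z:             # random restart branch of Eq. (6)
            new_X[i] = lb + np.random.rand(dim) * (ub - lb)
            continue
        p = np.tanh(abs(fitness[i] - bf))    # Eq. (2), with bf in the role of DF
        vb = np.random.uniform(-a, a, dim)
        A, B = np.random.randint(n, size=2)  # two arbitrarily chosen individuals
        if np.random.rand() < p:
            new_X[i] = Xb + vb * (W[i] * X[A] - X[B])
        else:
            new_X[i] = vc * X[i]
    return np.clip(new_X, lb, ub)            # keep agents inside [lb, ub]
```

Calling `sma_step` repeatedly while re-evaluating fitness after each call reproduces the loop of Algorithm 1.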

3.2. Gaussian Mutation

The Gaussian mutation (GM) is a popular optimization technique that works well in the exploitation stage. The Gaussian distribution, denoted $N(\mu, \sigma^2)$, where $\mu$ and $\sigma$ are the mean and standard deviation of the variable, is widely employed in statistics and the natural sciences to model real-valued random variables. The Gaussian distribution obeys the renowned $3\sigma$ rule, which provides a chance of foraging that depends on the requirements of the problem. The Gaussian distribution has been widely utilized in the literature to modify control parameter values, but it is rarely employed to create new mutation operators that retain a robust local search capability while increasing diversity. The GM operator is used here to enhance the SMA and exploit the benefits of the Gaussian distribution fully. The Gaussian probability density is:
$$G(x)=\frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$
The GM scheme is expressed as:

$$mutant(x)=x_i\cdot\left(1+G(0,\sigma)\right)$$
where x i is the parameter’s current value and G ( 0 ,   σ ) represents the arbitrary value from Gaussian distribution with mean 0 and standard deviation σ . GM is the mutation operation that introduces population variability and explores the search space. When using GM, the original population is replaced with an arbitrary number that satisfies the variance and mean values. The GM scheme emphasizes a local area close to the original individual based on the normal distribution characteristics and it has demonstrated excellent efficiency in various optimizers.
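As a brief illustration of Equation (8), the sketch below applies the multiplicative Gaussian perturbation element-wise. The function name and the `rng` seeding parameter are illustrative additions.

```python
import numpy as np

def gaussian_mutation(x, sigma=1.0, rng=None):
    """Eq. (8): mutant(x) = x * (1 + G(0, sigma)), applied element-wise."""
    rng = np.random.default_rng(rng)
    return x * (1.0 + rng.normal(0.0, sigma, size=np.shape(x)))
```

With `sigma = 0` the individual is returned unchanged; larger `sigma` produces perturbations concentrated (by the $3\sigma$ rule) within a relative distance of about `3 * sigma` of the original value.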

4. Proposed Methodology

SMA has limitations even though it performs competitively against other algorithms. Its reduced population diversity allows it to fall into local optima quickly, and as problems grow more complex and convergence slows in later iterations, SMA finds it challenging to balance exploration and exploitation. To further boost the performance of SMA, and since the NFL theorem encourages enhancing existing algorithms, Gaussian mutation (GM) is proposed to strengthen the SMA's search capability and to achieve local and global optimality; the result is named the SMA-GM algorithm. Bäck and Schwefel [23] proposed using Gaussian mutation to increase the search efficiency of heuristic algorithms. In general, GM produces new solutions near the candidate solutions. During the search, it explores the search space in small steps, increasing the diversity of the population; random perturbations drawn from a Gaussian (normal) distribution are added to the current solution. The proposed SMA-GM keeps whichever position compares better against the target value of the current optimum individual solution, so the algorithm produces well-balanced local and global results. A random variable is created with mean zero and standard deviation one. The Gaussian distribution function can be expressed as follows:
$$G_{gaussian(0,\sigma)}(x)=\frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$
where $\sigma^2$ is the variance of the candidate solutions. To thoroughly exploit the features of the current population, two positions are arbitrarily selected and their difference is combined with the Gaussian distribution operator to construct the proposed GM mechanism. The GM strategy is:
$$G_{new}(t)=X_{best}(t)\cdot\left(1+\lambda\cdot v_b\cdot G(0,1)\cdot\left(X_C(t)-X_D(t)\right)\right)$$
where $G(0,1)$ is the Gaussian distribution mechanism of Equation (9) and $X_{best}(t)$ is the optimum location for the current iteration of the original SMA. Following the GM operation, the new position is denoted by $G_{new}(t)$. The positions of the two slime moulds selected randomly from the population are represented by $X_C(t)$ and $X_D(t)$, respectively. The value of $v_b$ oscillates randomly within $[-a, a]$ and approaches 0 as the iterations increase. In order to further balance exploitation and exploration and to smooth the oscillation process, the parameter $a$ is updated as follows:
$$a=\operatorname{arctanh}\left(-\frac{t}{max\_t}+1\right)+\cos\left(\frac{\pi}{2}\cdot\frac{t}{max\_t}\right)$$

$$\lambda=1-\frac{t^2}{max\_t^2}$$
where $\lambda$ is a dynamic parameter adaptively adjusted with the number of iterations, $t$ indicates the current iteration, and $max\_t$ indicates the maximum number of iterations. Algorithm 2 displays the proposed algorithm's pseudocode: after the normal execution of SMA with the updated parameter $a$, the optimum location is mutated through Equation (10) to create a new mutated solution.
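The two parameter schedules of Equations (11) and (12) can be sketched directly. The function names are illustrative, and the cosine term is evaluated exactly as written above; note that the arctanh term requires $t \ge 1$ to stay finite.

```python
import numpy as np

def oscillation_amplitude(t, max_t):
    """Updated parameter a, Eq. (11): arctanh decay plus a cosine term for smoother oscillation."""
    return np.arctanh(-(t / max_t) + 1) + np.cos((np.pi / 2) * (t / max_t))

def mutation_scale(t, max_t):
    """Dynamic parameter lambda, Eq. (12): shrinks quadratically from 1 to 0 over the run."""
    return 1 - t**2 / max_t**2
```

Both quantities decrease monotonically over the run, so the mutation step of Equation (10) is wide early on (exploration) and shrinks toward zero in later iterations (exploitation).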
Algorithm 2: Pseudocode of the SMA-GM algorithm
Begin
  1. Initialize the parameters popsize, Max_iteration
  2. Initialize the positions of slime mould X_i (i = 1, 2, ..., n)
  3. While (t ≤ Max_iteration)
       Calculate the fitness of all slime mould
       Update bestFitness, X_b
       Calculate W by Equation (4) and a by Equation (11)
       For each search portion
         Update p, vb, vc
         Update positions by Equation (6)
         Update the position of the slime mould and the optimal position as X_best(t)
         // Gaussian mutation mechanism
         Update λ by Equation (12)
         G_new(t) = X_best(t) · (1 + λ · vb · G(0,1) · (X_C(t) − X_D(t)))
         If F(G_new(t)) < F(X_best(t))
           Update the optimal solution to G_new(t)
           Replace the parent slime with the generated mutant slime if its fitness is better
         End If
       End For
       t = t + 1
  4. End While
  5. Return the best fitness and best position
End
The new mutated solution is then compared with the optimal position of the original SMA to determine which is better; if the mutated solution wins, it replaces the optimal position through a greedy selection mechanism. Individuals with better fitness are selected to enter the next iteration. The computing process is shown below.
$$
X(t+1)=\begin{cases} G_{new}(t), & if\ F(G_{new}(t))<F(X_{best}(t)) \\ X_{best}(t), & else \end{cases}
$$
The pseudocode of SMA-GM illustrates how the proposed method reconsiders the position details after the original algorithm's search for a feasible outcome. The technique described above not only broadens population diversity but also boosts the effectiveness of the search process, while the enhanced version keeps the original algorithm's structure relatively straightforward. For the complexity analysis, SMA-GM consists of the following parts: initialization, fitness evaluation, sorting, weight update, position update, Gaussian mutation, and greedy selection. Here, $N$ represents the number of slime moulds, $dim$ the dimension, and $Max\_t$ the maximum number of iterations. The total complexity of SMA-GM is $O(dim + Max\_t \cdot N \cdot (2 + \log N + dim))$. Figure 1 presents the flowchart of the SMA-GM algorithm.
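Combining Equations (10), (12), and (13), the mutation-plus-greedy-selection step can be sketched as below. Names and the seeding parameter `rng` are illustrative; `f` stands for the objective function being minimized.

```python
import numpy as np

def gm_greedy_update(X_best, X, f, t, max_t, a, rng=None):
    """Mutate the current best position with the GM operator (Eq. (10)) and
    keep the mutant only if its fitness improves (greedy selection, Eq. (13))."""
    rng = np.random.default_rng(rng)
    lam = 1 - t**2 / max_t**2                       # Eq. (12)
    vb = rng.uniform(-a, a, size=X_best.shape)      # oscillatory step in [-a, a]
    C, D = rng.integers(len(X), size=2)             # two randomly selected slime moulds
    G_new = X_best * (1 + lam * vb * rng.normal(0, 1) * (X[C] - X[D]))
    return G_new if f(G_new) < f(X_best) else X_best
```

By construction the returned position is never worse than `X_best`, which is the guarantee the greedy selection mechanism provides.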

5. Experimental Setup and Results

Optimization problems are generally categorized into two types: unconstrained and constrained. In unconstrained problems, the optimization method seeks the minimum or maximum possible value of an objective function. The second kind is more complex, since the optimal solution must also satisfy the constraints specified for the problem; in simple words, the solution must be feasible. To verify the performance of SMA-GM, the results of the proposed algorithm are compared against four state-of-the-art algorithms from the literature, namely SMA [9], GWO [5], MFO [3], and WOA [4], plus two improved optimization algorithms included for fair comparison, AGWO [24] and IChoA [25]. The parameter settings of these algorithms are given in Table 1. The environment is an 11th Gen Intel(R) Core(TM) i3-1125G4 @ 2.00 GHz with 8.00 GB of RAM running Windows 11, and the simulation experiments are set up on the MATLAB R2021a platform. The SMA-GM algorithm was evaluated and compared with the other algorithms using 38 functions: 13 unconstrained benchmark functions (F1–F13), 13 constrained benchmark functions (G1–G13), and 12 CEC2022 benchmark functions. The "Mean", "St.dev", "Best", and "Worst" values of the objective function are reported after running each problem 30 times; the number of iterations of each algorithm was fixed at 1000 with a population size of 30, and the maximum number of function evaluations was set to 30,000.
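The summary statistics reported in the result tables can be computed with a small helper such as the sketch below (the function name and dictionary keys are illustrative, mirroring the table headers).

```python
import numpy as np

def run_statistics(results):
    """Summarize the objective values from repeated independent runs
    as Mean, St.dev, Best (minimum), and Worst (maximum)."""
    r = np.asarray(results, dtype=float)
    return {"Mean": r.mean(), "St.dev": r.std(ddof=0),
            "Best": r.min(), "Worst": r.max()}
```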

5.1. Unconstrained Benchmark Functions

This section includes the 13 benchmark functions that were utilized for the experiments. These functions are categorized as unimodal (F1–F7) and multimodal functions (F8–F13). The unimodal functions are employed for testing an algorithm’s potential for exploitation. Conversely, the ability to explore and the stability of algorithms can be found in the multimodal test functions that have a large number of local minima. For each function, the experiment is carried out for 30, 60, and 200 dimensions and the search range of all the functions is set to [−100, 100]. The main parameter values of each algorithm are listed in Table 1 and Table 2 lists the summary of unimodal and multimodal test functions. It should be noted that SMA-GM employed the same parameter values as the original algorithm. As such, it could offer a guarantee of stability in the proposed algorithm’s performance. Additionally, for a fair comparison, the test conditions were the same as those listed in Table 1.

5.1.1. Exploration and Exploitation Analysis

Table 3 displays the results of the unimodal and multimodal test functions. According to the experimental findings, SMA-GM performs better in most of the test functions. Most notably, SMA-GM achieved satisfactory results in both low and high dimensions for the unimodal functions (F1–F7). In all three dimensions, SMA-GM is able to consistently achieve the theoretically optimal solutions for F1 and F3. In comparison, the other comparable algorithms are not very capable of achieving the theoretical optimal value in all the test functions and perform weaker than SMA-GM. Comparing the test results of each dimension, the results show that the SMA-GM’s performance has not dropped too much with increasing dimensions, which illustrates that SMA-GM has outstanding local exploitation capability. For the multimodal functions (F8–F13), the SMA-GM steadily achieves the theoretical optimal values at F9–F11 in all three dimensions (30, 60, and 200). SMA-GM has the overall best performance in the multimodal functions, signifying that the improved algorithm extensively boosts the global exploration capability of SMA.

5.1.2. Convergence and Scalability Analysis

In order to study the convergence performance of SMA-GM, convergence curves are plotted according to the results with 60 dimensions, as given in Figure 2. In unimodal functions, the curve is smooth and continuously decreasing, demonstrating the algorithm’s ability to find the optimal solution. For multimodal functions, the convergence curve descends in steps, signifying the algorithm’s capability to steadily escape local optima and reach the global optimum. At the beginning of the iteration, the decline speed for F1, F2, F4, and F10 of SMA and AGWO in the convergence curve is faster than that of SMA-GM. But in the later stage of the iteration, SMA-GM exceeds SMA and AGWO, indicating the robust development ability of the proposed algorithm. Therefore, the improvement strategy proposed in this paper can successfully enhance the convergence speed of SMA and achieve better optimization results.
The scalability test reveals the performance fluctuation of optimization algorithms. In this study, the performance of SMA-GM in various dimensions (dim = 30, 60, 200) was also tested. It is evident that the algorithm will have greater difficulty in locating the global optimal solution for the higher dimension. Notably, all other experimental parameters were constant in accordance with the aforementioned descriptions, with the exception of the dimension setting. The experiments’ mean value (mean) and standard deviation (Std) are used to measure the results and Table 3 displays the experimental results. A particular algorithm’s quality and accuracy are indicated by its mean value, while its stability is indicated by its standard deviation. The results of the unimodal and multimodal functions both showed that SMA-GM performed exceptionally well in higher dimensions. SMA-GM outperformed SMA and other well-known algorithms in all of the functions except F6. It should be mentioned that these comparison algorithms (GWO, MFO, WOA, AGWO, and IChoA) usually showed weak optimization capabilities, particularly in higher dimensions where the algorithms’ performance fluctuates with increasing dimensionality. Hence, the proposed SMA-GM performed more consistently with better optimization behavior when handling high-dimensional problems.

5.1.3. Diversity Analysis

Generally, an algorithm that prioritizes the exploitation phase has a high convergence rate but risks becoming trapped in local optima. Conversely, an algorithm that focuses more on the exploration phase covers more of the search space, increasing the chance of finding the global solution, though the convergence rate may be weak. A rationally working algorithm therefore attempts to maintain a balance between these two phases, and the diversity test is one method for analyzing this balance. The diversity curves in Figure 3 for the unconstrained functions indicate the average distance between the search agents during the iterations. The diversity is derived using the following equations, inspired by earlier studies [26,27]:
$$DV_j=\frac{1}{N}\sum_{i=1}^{N}\left|median(x^j)-x_i^j\right|$$

$$DV=\frac{1}{m}\sum_{j=1}^{m}DV_j$$
where $N$ is the number of search agents, $m$ is the dimension of the problem, $x_i^j$ denotes dimension $j$ of search agent $i$, and $median(x^j)$ is the median of dimension $j$ over the whole population.
The diversity in each dimension is defined as the distance between dimension $j$ of each search agent and the median of that dimension. The variation in diversity between the original SMA and SMA-GM is shown in Figure 3, with the average distance between search agents on the vertical axis and the number of iterations on the horizontal axis. Because of the random initialization process, both algorithms show high diversity in the initial phase of the execution; the diversity then varies as a result of the different algorithmic operations. As the iterations increase, the search agents traverse the whole search space and the average distance gradually shrinks, so diversity decreases and the algorithms gradually enter the exploitation phase, consistent with the above analysis. However, in most of the test functions the average distance per iteration is higher for SMA-GM than for SMA, showing SMA-GM's better ability to explore new regions of the search space. This demonstrates the effect of improving the search mechanism of slime moulds through the GM strategy in the SMA-GM algorithm.
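The diversity measure of Equations (14) and (15) reduces to a few array operations; the sketch below assumes a population matrix of shape $(N, m)$ and an illustrative function name.

```python
import numpy as np

def diversity(X):
    """Eqs. (14)-(15): mean absolute distance of each agent from the
    per-dimension median, averaged over all dimensions. X has shape (N, m)."""
    per_dim = np.mean(np.abs(np.median(X, axis=0) - X), axis=0)  # DV_j for each dimension j
    return float(np.mean(per_dim))                               # DV
```

A fully converged population (all agents identical) gives `diversity(X) == 0`, which matches the late-iteration behavior seen in Figure 3.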

5.2. Constrained Handling Technique

An intelligent constraint-handling strategy must use the constraint (restriction) conditions during optimization in order to optimize constrained problems [28,29]. In this article, a penalty is added to the value of the objective function, applying Equations (16)–(19). In these equations, $C(u)$ and $f(u)$ denote the constraint and objective function values, respectively; the inequality constraints are $g(u)$ and the equality constraints are $h(u)$; and $u$ is the solution vector. There are $q$ inequality constraints and $m - q$ equality constraints, giving $m$ constraints in total, indexed by $j$. In this study, the tolerance parameter $\delta$ is set to the small value 1E−04.
$$fitness=f(u)+10^{20}\sum_{j=1}^{m}C_j(u)^2$$

$$
C_j(u)=\begin{cases} \max\left(0,\ g_j(u)\right), & if\ 1\le j\le q \\ \max\left(0,\ \left|h_j(u)\right|-\delta\right), & if\ q+1\le j\le m \end{cases}
$$

$$inequality\ constraints:\quad g_j(u)\le 0,\quad j=1,\ldots,q$$

$$equality\ constraints:\quad h_j(u)=0,\quad j=q+1,\ldots,m$$
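A minimal sketch of this static penalty scheme, Equations (16)–(19), follows. The function signature is illustrative: `g_list` and `h_list` are callables for the inequality and equality constraints, and the penalty weight `rho = 1e20` matches the factor in Equation (16).

```python
def penalized_fitness(f, g_list, h_list, u, delta=1e-4, rho=1e20):
    """Eqs. (16)-(19): g_j(u) <= 0 are inequality constraints; equality
    constraints h_j(u) = 0 are relaxed to |h_j(u)| - delta <= 0."""
    C = [max(0.0, g(u)) for g in g_list]                      # inequality violations
    C += [max(0.0, abs(h(u)) - delta) for h in h_list]        # relaxed equality violations
    return f(u) + rho * sum(c * c for c in C)                 # squared-violation penalty
```

For a feasible `u` every $C_j$ is zero and the penalized fitness equals $f(u)$; any violation adds a term large enough to dominate the objective, steering the search back toward the feasible region.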

5.3. Results and Discussion on Constrained Functions

The capability of the proposed SMA-GM to handle constrained problems was evaluated on 13 commonly used constrained benchmark functions. Table 4 shows the linear, non-linear, and quadratic equality and inequality constraints of each problem in the set. Table 5 presents the results of SMA-GM and the other algorithms on each constrained benchmark function. The results clearly show that SMA-GM outperforms all other algorithms on 10 of the 13 constrained benchmark functions with excellent efficiency, demonstrating the superiority of the proposed algorithm. IChoA outperformed SMA-GM on the G2 and G5 problems; on the G8 problem, SMA-GM performed better than SMA but could not outperform the other algorithms. On problem G11, SMA-GM obtained results similar to SMA, GWO, and IChoA, and on the G12 problem, SMA-GM, MFO, and IChoA acquired identical results. The algorithm thus produced satisfactory results, either optimal or nearly optimal. According to these results, the proposed Gaussian mutation (GM)-based SMA-GM outperforms the competing metaheuristic techniques in solution accuracy except on the G2, G5, and G8 problems. The boxplots in Figure 4 strengthen this conclusion: the lowermost position and smallest height of the boxes plotted from the obtained outcomes evidence the excellent robustness of the SMA-GM algorithm in handling constrained problems.

5.4. Results and Discussions on CEC2022 Benchmark Functions

In this section, the recent CEC2022 test functions are selected to further evaluate the performance of the proposed SMA-GM and compare it with the above-mentioned algorithms. The CEC2022 test functions [30] simulate highly complex global optimization problems. All functions were tested in 20-dimensional space to guarantee the validity and fairness of the experimental results. Table 6 summarizes the 12 CEC2022 test functions, which are classified as unimodal, basic, hybrid, and composition functions. The algorithm parameter values were the same as those listed in Table 1. Table 7 presents the mean and standard deviation (Std) over 30 independent runs, illustrating the superior performance of SMA-GM on 10 of the 12 test functions in the CEC2022 suite. The results make clear that SMA-GM has satisfactory exploration and exploitation abilities for solving optimization problems. Compared with the other algorithms, its optimization accuracy, solution stability, and adaptability to different functions show noticeable advantages, and SMA-GM obtained the most competitive results.
As seen in Figure 5, convergence graphs were extracted from the CEC2022 test functions to thoroughly validate the SMA-GM algorithm's performance. The convergence curves show that, whereas the other algorithms enter local optima too early, SMA-GM converges faster on functions C1–C3 and C6–C12. To prevent stagnation in local minima, SMA-GM first explored the majority of the search space; it then gradually switched to the exploitation operator, quickly reducing diversity during the second half of the optimization process and speeding convergence toward the most promising region found so far. This variation in the search process gives the algorithm strong exploration and exploitation operators throughout optimization. The GM operator helps preserve population diversity, prevents the search from becoming stuck in local optima, and speeds up convergence towards the best-so-far solution.
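As a minimal sketch of the mechanism described above (the function name, the sphere fitness, and the fixed sigma are illustrative assumptions, not the paper's exact operator or schedule), a Gaussian mutation step combined with greedy selection looks like:

```python
import random

def gm_step(position, fitness, sigma=0.5):
    """One Gaussian-mutation step with greedy selection: perturb every
    coordinate with zero-mean Gaussian noise, then keep the mutant only
    if it improves the (minimized) fitness; otherwise keep the original."""
    mutant = [x + random.gauss(0.0, sigma) for x in position]
    return mutant if fitness(mutant) < fitness(position) else position
```

Because the greedy comparison never accepts a worse point, repeated application can only maintain or improve the best-so-far fitness while the Gaussian noise keeps injecting diversity.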

5.5. Wilcoxon Rank Sum Test

The statistical significance of the difference between two methods can be assessed with the non-parametric Wilcoxon rank sum test [31]. This test is employed in this study to evaluate whether the experimental findings of the SMA-GM algorithm differ significantly from those of the counterpart algorithms. The data used in the rank sum test are the results of each algorithm run independently 30 times on the 13 constrained functions (G1–G13) and the CEC2022 test functions listed above. The rank sum test p-value between SMA-GM and each comparison algorithm is displayed in Table 8. The null hypothesis is rejected when the difference between two algorithms is significant (p < 0.05). In Table 8, NaN denotes that the independent results obtained from SMA-GM and the corresponding optimizer are identical. The majority of the p-values in Table 8 are less than 0.05, indicating that the alternative hypothesis is accepted. Hence, the outcomes of SMA-GM differ from those of the other compared algorithms. Thus, SMA-GM is a robust optimizer, as verified by its capability to overcome SMA, GWO, MFO, WOA, AGWO, and IChoA, which are highly competent and highly cited metaheuristic optimizers.
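To make the procedure concrete, the two-sided rank-sum p-value can be computed with the normal approximation, which is reasonable for the 30-run samples used here (an illustrative pure-Python version, not the implementation used in the study):

```python
import math

def rank_sum_p(a, b):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation;
    tied observations receive their average rank."""
    pooled = sorted((v, k) for k, v in enumerate(a + b))
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):            # walk over groups of tied values
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        for k in range(i, j + 1):     # assign the average rank to each tie
            ranks[pooled[k][1]] = (i + j) / 2.0 + 1.0
        i = j + 1
    n1, n2 = len(a), len(b)
    w = sum(ranks[:n1])               # rank sum of the first sample
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    # two-sided p-value from the standard normal CDF
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
```

Clearly separated samples yield p well below 0.05, while identical samples yield p = 1, matching the NaN/identical-results cases noted for Table 8.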

6. Constrained Engineering Design Problem

The goal of developing and improving any stochastic search algorithm is to solve real-world problems. In the preceding section, the efficacy of SMA-GM was studied through numerical constrained experiments. This section evaluates its exploration and exploitation abilities in real-life situations by applying SMA-GM to mechanical engineering design problems: the industrial refrigeration system (IRS) design problem, the optimal operation of an alkylation unit, the welded beam design problem, and the tension/compression spring design problem.

6.1. Optimal Design of an Industrial Refrigeration System

Solving an engineering design problem means providing the best possible design scheme while satisfying a variety of constraints. SMA-GM is compared with the metaheuristic algorithms mentioned above, using the parameter settings listed in Table 1. Nowadays, energy saving and emission reduction have become a focus in several fields. IRSs account for a large proportion of energy consumption, so it is essential to optimize and control them [32]. The optimal design of an IRS is a highly complex mechanical engineering design problem with 14 design variables and 15 constraints. Its mathematical formulation is given below.
  • Minimize
f(u) = 63098.88 u2 u4 u12 + 5441.5 u2^2 u12 + 115055.5 u2^1.664 u6 + 6172.27 u2^2 u6 + 63098.88 u1 u3 u11 + 5441.5 u1^2 u11 + 115055.5 u1^1.664 u5 + 6172.27 u1^2 u5 + 140.53 u1 u11 + 281.29 u3 u11 + 70.26 u1^2 + 281.29 u1 u3 + 281.29 u3^2 + 14437 u8^1.8812 u12^0.3424 u10 u14^(−1) u1^2 u7 u9^(−1) + 20470.2 u7^2.893 u11^0.316 u1^2
subject to
g1(u) = 1.524 u7^(−1) − 1 ≤ 0,
g2(u) = 1.524 u8^(−1) − 1 ≤ 0,
g3(u) = 0.07789 u1 − 2 u7^(−1) u9 − 1 ≤ 0,
g4(u) = 7.05305 u9^(−1) u1^2 u10 u8^(−1) u2^(−1) u14^(−1) − 1 ≤ 0,
g5(u) = 0.0833 u13^(−1) u14 − 1 ≤ 0,
g6(u) = 47.136 u2^0.333 u10^(−1) u12 − 1.333 u8 u13^2.1195 + 62.08 u13^2.1195 u12^(−1) u8^0.2 u10^(−1) − 1 ≤ 0,
g7(u) = 0.04771 u10 u8^1.8812 u12^0.3424 − 1 ≤ 0,
g8(u) = 0.0488 u9 u7^1.893 u11^0.316 − 1 ≤ 0,
g9(u) = 0.0099 u1 u3^(−1) − 1 ≤ 0,
g10(u) = 0.0193 u2 u4^(−1) − 1 ≤ 0,
g11(u) = 0.0298 u1 u5^(−1) − 1 ≤ 0,
g12(u) = 0.056 u2 u6^(−1) − 1 ≤ 0,
g13(u) = 2 u9^(−1) − 1 ≤ 0,
g14(u) = 2 u10^(−1) − 1 ≤ 0,
g15(u) = u12 u11^(−1) − 1 ≤ 0.
With bounds
0.001 ≤ ui ≤ 5, i = 1, …, 14.
Table 9 compares the results of SMA-GM with the other optimizers on the IRS design problem, and Table 10 presents the optimal constraint values. These findings show that SMA-GM outperforms the other optimizers and, as shown in Figure 6, converges faster than they do.
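For reference, the IRS objective above can be transcribed directly (an illustrative sketch; `u[0]`…`u[13]` stand for u1…u14, and the constraints g1–g15 are not enforced here):

```python
def irs_objective(u):
    """Objective of the IRS design problem, transcribed from the
    formulation above; u13 appears only in the constraints g5 and g6."""
    (u1, u2, u3, u4, u5, u6, u7,
     u8, u9, u10, u11, u12, u13, u14) = u
    return (63098.88 * u2 * u4 * u12 + 5441.5 * u2**2 * u12
            + 115055.5 * u2**1.664 * u6 + 6172.27 * u2**2 * u6
            + 63098.88 * u1 * u3 * u11 + 5441.5 * u1**2 * u11
            + 115055.5 * u1**1.664 * u5 + 6172.27 * u1**2 * u5
            + 140.53 * u1 * u11 + 281.29 * u3 * u11 + 70.26 * u1**2
            + 281.29 * u1 * u3 + 281.29 * u3**2
            # mixed-sign exponents: u14 and u9 enter with power -1
            + 14437.0 * u8**1.8812 * u12**0.3424 * u10 * u14**-1.0
            * u1**2 * u7 * u9**-1.0
            + 20470.2 * u7**2.893 * u11**0.316 * u1**2)
```

With all variables at 1, every power term collapses to its coefficient, so the value is simply the sum of the coefficients, which gives a quick sanity check of the transcription.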

6.2. Optimal Operation of an Alkylation Unit

The optimal operation of an alkylation unit is a common problem in the petroleum industry [33]. In the alkylation process, the reactor is fed with the olefin feedstock (100% butene), pure isobutane recycle, and 100% isobutane makeup, along with the acid catalyst. The reactor product is then passed through a fractionator to separate the isobutane, which is recycled, from the alkylate product. Spent acid is also withdrawn from the reactor. The objective function is the alkylate product, and the primary goal of the problem is to raise the octane number of the olefin feedstock under acidic conditions. This is a complex optimization problem with 14 constraints and 7 variables. Its mathematical formulation is given below.
  • Maximize
f(u) = 0.035 u1 u6 + 1.715 u1 + 10.0 u2 + 4.0565 u3 − 0.063 u3 u5
subject to
g1(u) = 0.0059553571 u6^2 u1 + 0.88392857 u3 − 0.1175625 u6 u1 − u1 ≤ 0,
g2(u) = 1.1088 u1 + 0.1303533 u1 u6 − 0.0066033 u1 u6^2 − u3 ≤ 0,
g3(u) = 6.66173269 u6^2 − 56.596669 u4 + 172.39878 u5 − 10000 − 191.20592 u6 ≤ 0,
g4(u) = 1.08702 u6 − 0.03762 u6^2 + 0.32175 u4 + 56.85075 − u5 ≤ 0,
g5(u) = 0.006198 u7 u4 u3 + 2462.3121 u2 − 25.125634 u2 u4 − u3 u4 ≤ 0,
g6(u) = 161.18996 u3 u4 + 5000.0 u2 u4 − 489510.0 u2 − u3 u4 u7 ≤ 0,
g7(u) = 0.33 u7 + 44.333333 − u5 ≤ 0,
g8(u) = 0.022556 u5 − 1.0 − 0.007595 u7 ≤ 0,
g9(u) = 0.00061 u3 − 1.0 − 0.0005 u1 ≤ 0,
g10(u) = 0.819672 u1 − u3 + 0.819672 ≤ 0,
g11(u) = 24500.0 u2 − 250.0 u2 u4 − u3 u4 ≤ 0,
g12(u) = 1020.4082 u4 u2 + 1.2244898 u3 u4 − 100000 u2 ≤ 0,
g13(u) = 6.25 u1 u6 + 6.25 u1 − 7.625 u3 − 100000 ≤ 0,
g14(u) = 1.22 u3 − u6 u1 − u1 + 1.0 ≤ 0.
With bounds
1000 ≤ u1 ≤ 2000, 0 ≤ u2 ≤ 100, 2000 ≤ u3 ≤ 4000, 0 ≤ u4 ≤ 100, 0 ≤ u5 ≤ 100, 0 ≤ u6 ≤ 20, 0 ≤ u7 ≤ 200.
Table 11 shows that the SMA-GM algorithm performs better than the other algorithms, signifying that it can maximize the alkylate product value and yields a better result for the optimization of the alkylation process. The convergence curves for the optimal operation of the alkylation unit are shown in Figure 7, and Table 12 reports the constraint values. The figure shows that the convergence of the proposed algorithm is not optimal in the initial stage, but in the later iterations SMA-GM escapes the local optima and continues exploring, improving the overall optimization performance.
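The profit objective above is linear-plus-bilinear and can be transcribed in one line (an illustrative sketch; the objective is maximized and the constraints g1–g14 are not enforced here):

```python
def alkylation_objective(u):
    """Alkylation profit objective (to be maximized); u = (u_1, ..., u_7).
    Only u1, u2, u3, u5, u6 enter the objective; u4 and u7 appear in
    the constraints."""
    u1, u2, u3, u4, u5, u6, u7 = u
    return (0.035 * u1 * u6 + 1.715 * u1 + 10.0 * u2
            + 4.0565 * u3 - 0.063 * u3 * u5)
```

Each unit of u2, for example, contributes exactly 10.0 to the objective, which makes the transcription easy to spot-check at the variable bounds.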

6.3. Welded Beam Design

The objective of the problem is to construct a welded beam [33] with minimal cost under constraints on the buckling load (Pc), the end deflection of the beam (δ), the bending stress (σ), and the shear stress (τ). It takes the weld thickness h, the joint length l, the beam height t, and the beam thickness b as variables, and the mathematical formulation of the problem is as follows:
  • Consider
Variable u = [u1, u2, u3, u4] = [h, l, t, b]
Minimize
f(u) = 0.04811 u3 u4 (u2 + 14) + 1.10471 u1^2 u2
subject to
g1(u) = u1 − u4 ≤ 0,
g2(u) = δ(u) − δmax ≤ 0,
g3(u) = P − Pc(u) ≤ 0,
g4(u) = τ(u) − τmax ≤ 0,
g5(u) = σ(u) − σmax ≤ 0,
where,
τ(u) = √(τ′^2 + 2 τ′ τ″ (u2 / (2R)) + τ″^2), τ″ = M R / J, τ′ = P / (√2 u1 u2), M = P (L + u2/2), R = √(u2^2/4 + ((u1 + u3)/2)^2), J = 2 {√2 u1 u2 [u2^2/4 + ((u1 + u3)/2)^2]}, σ(u) = 6 P L / (u4 u3^2), δ(u) = 6 P L^3 / (E u4 u3^3), Pc(u) = (4.013 E u3 u4^3 / (6 L^2)) (1 − (u3 / (2L)) √(E / (4G))), with L = 14 in, P = 6000 lb, E = 30 × 10^6 psi, σmax = 30,000 psi, τmax = 13,600 psi, G = 12 × 10^6 psi, δmax = 0.25 in.
With bounds
0.1 ≤ u3, u2 ≤ 10, 0.1 ≤ u4 ≤ 2, 0.125 ≤ u1 ≤ 2.
The optimal solutions of SMA-GM and its counterparts on the welded beam problem are presented in Table 13, and Table 14 gives the corresponding constraint values of these algorithms. According to the comparative results, SMA-GM achieved the optimal cost value of 1.7249 with the parameter vector [0.20573, 3.4704, 9.037, 0.20573]. Figure 8 shows the convergence behavior of SMA-GM and the other algorithms on this problem.
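The cost objective can be checked against the reported optimum (an illustrative transcription with u = (h, l, t, b) as above; the stress, deflection, and buckling constraints are omitted):

```python
def welded_beam_cost(u):
    """Fabrication-cost objective of the welded beam design problem,
    with u = (h, l, t, b) as defined in the formulation above."""
    h, l, t, b = u
    # weld-material cost + bar-material cost
    return 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)
```

Evaluating it at the solution reported in Table 13, [0.20573, 3.4704, 9.037, 0.20573], reproduces the cost of about 1.7249.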

6.4. Tension/Compression Spring Design Problem

The tension/compression spring problem is a classic structural engineering design problem [33] whose purpose is to minimize the weight of a tension or compression spring under constraints such as shear stress, surge frequency, and deflection. The problem has three design variables: the wire diameter (d), the mean coil diameter (D), and the number of active coils (N). The mathematical model of the problem is as follows:
  • Consider
Variable u = [u1, u2, u3] = [d, D, N]
Minimize
f(u) = (u3 + 2) u2 u1^2
subject to
g1(u) = 1 − u2^3 u3 / (71785 u1^4) ≤ 0,
g2(u) = (4 u2^2 − u1 u2) / (12566 (u2 u1^3 − u1^4)) + 1 / (5108 u1^2) − 1 ≤ 0,
g3(u) = 1 − 140.45 u1 / (u2^2 u3) ≤ 0,
g4(u) = (u1 + u2) / 1.5 − 1 ≤ 0,
with bounds
0.05 ≤ u1 ≤ 2.00, 0.25 ≤ u2 ≤ 1.30, 2.00 ≤ u3 ≤ 15.0.
The results of SMA-GM and the competitor algorithms for the tension/compression spring design optimization problem are reported in Table 15, and Table 16 provides the corresponding constraint values of these algorithms. As can be seen from Table 15, the SMA-GM algorithm not only performed the best of all comparison algorithms but also obtained the solution closest to the optimal value of the problem. For qualitative analysis, the convergence plots recorded in Figure 9 confirm the efficient convergence potential of SMA-GM. We can therefore say that the SMA-GM algorithm not only has an excellent ability to solve complex engineering problems but also shows high efficiency and accuracy, obtaining the optimal solution with the best parameter and constraint values.
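A direct transcription of the objective and constraints makes the model concrete (an illustrative sketch with u = (d, D, N); feasibility means every g value is at most zero):

```python
def spring_weight(u):
    """Weight objective of the tension/compression spring: (N + 2) * D * d^2."""
    d, D, N = u
    return (N + 2.0) * D * d**2

def spring_constraints(u):
    """Inequality constraints g_1..g_4 of the spring problem,
    each feasible when its value is <= 0."""
    d, D, N = u
    return [
        1.0 - D**3 * N / (71785.0 * d**4),            # deflection
        (4.0 * D**2 - d * D) / (12566.0 * (D * d**3 - d**4))
        + 1.0 / (5108.0 * d**2) - 1.0,                # shear stress
        1.0 - 140.45 * d / (D**2 * N),                # surge frequency
        (d + D) / 1.5 - 1.0,                          # outer-diameter limit
    ]
```

At d = 0.05, D = 0.25, N = 2 (the lower bounds), the weight reduces to 4 × 0.25 × 0.05^2 = 0.0025, a quick check that the transcription matches the formula.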

7. Conclusions

A novel algorithm named SMA-GM has been proposed in this paper to address the shortcomings of the original SMA, such as its tendency to become trapped in local optima and its insufficient exploitation. In the proposed method, the exploitation phase of the original SMA is improved through the Gaussian mutation (GM) mechanism. SMA-GM performs exceptionally well in terms of convergence rate, stability, and optimization reliability compared to other metaheuristic algorithms, as shown by the experiments conducted on the unconstrained, constrained, and CEC2022 functions; the complex constrained engineering problems further confirm the superiority of the proposed algorithm over its competitors. To avoid local optima, the GM strategy introduces a slight random variation into the group of search agents, improving convergence and enhancing exploitation potential while expanding each individual's search space. Strengthening the global search and boosting the updated population is the primary goal of the proposed GM strategy. To guarantee that slime moulds with better fitness enter the next generation, a greedy selection mechanism was included. Convergence curves, diversity curves, and boxplot results showed that the proposed SMA-GM algorithm efficiently enhances convergence accuracy, fitness performance, and the algorithm's stability.
The proposed SMA-GM has some limitations in practical application. If the scale of the problem is very large, difficulty could arise in selecting the optimal solution. Moreover, like other stochastic algorithms, SMA-GM might not always reach the global optimum on every benchmark function and is susceptible to being trapped at local minima. These limitations must be addressed to improve its ability to provide global solutions; by overcoming these challenges, SMA-GM can advance toward becoming a widely accepted and widely used state-of-the-art algorithm.
In future studies, more effective constraint-processing methods should be developed and incorporated into other metaheuristic algorithms. Multi-objective optimization is also a highly challenging area: future work can combine the proposed algorithm with different techniques to handle multi-objective optimization problems, focusing on operators that maintain diversity and strike a balance between exploration and exploitation. The proposed algorithm can then be applied to complex optimization problems involving many design variables. It is also worthwhile to investigate the use of SMA-GM on a more extensive variety of real-life problems in future research, which will contribute to further validating the algorithm's ability to generate optimal solutions for a wide variety of optimization problems.

Author Contributions

G.T.: Conceptualization and Writing—original Draft, A.P.: Writing—review and editing and Supervision, N.M.: Methodology and Investigation, A.R.: Writing—review and editing, R.S.: Methodology, Supervision, Visualization, Writing—review and editing, Project Administration, and Funding Acquisition. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data used in the current study are available from the corresponding author on reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this research.

References

  1. Eberhart, R.; Kennedy, J. A new optimizer using particle swarm theory. In MHS’95, Proceedings of the Sixth International Symposium on Micro Machine and Human Science, Nagoya, Japan, 4–6 October 1995; IEEE: Piscataway, NJ, USA, 1995; pp. 39–43. [Google Scholar]
  2. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
  3. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl. Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  4. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  5. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  6. Wolpert, D.H.; Macready, W.G. No Free Lunch Theorems for Optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  7. Mirrashid, M.; Naderpour, H. Transit search: An optimization algorithm based on exoplanet exploration. Results Control Optim. 2022, 7, 100127. [Google Scholar] [CrossRef]
  8. Schlüter, M.; Gerdts, M. The oracle penalty method. J. Glob. Optim. 2010, 47, 293–325. [Google Scholar] [CrossRef]
  9. Li, S.; Chen, H.; Wang, M.; Heidari, A.A.; Mirjalili, S. Slime Mould Algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323. [Google Scholar] [CrossRef]
  10. Yuan, L.; Ji, J.; Liu, X.; Liu, T.; Chen, H.; Chen, D. An Improved Elite Slime Mould Algorithm for Engineering Design. CMES-Comput. Model. Eng. Sci. 2023, 137, 415–454. [Google Scholar] [CrossRef]
  11. Naik, M.K.; Panda, R.; Abraham, A. An entropy minimization based multilevel colour thresholding technique for analysis of breast thermograms using equilibrium slime mould algorithm. Appl. Soft Comput. 2021, 113, 107955. [Google Scholar] [CrossRef]
  12. Houssein, E.H.; Mahdy, M.A.; Blondin, M.J.; Shebl, D.; Mohamed, W.M. Hybrid slime mould algorithm with adaptive guided differential evolution algorithm for combinatorial and global optimization problems. Expert Syst. Appl. 2021, 174, 114689. [Google Scholar] [CrossRef]
  13. Yu, C.; Heidari, A.A.; Xue, X.; Zhang, L.; Chen, H.; Chen, W. Boosting quantum rotation gate embedded slime mould algorithm. Expert Syst. Appl. 2021, 181, 115082. [Google Scholar] [CrossRef]
  14. Chauhan, S.; Vashishtha, G.; Kumar, A. A symbiosis of arithmetic optimizer with slime mould algorithm for improving global optimization and conventional design problem. J. Supercomput. 2022, 78, 6234–6274. [Google Scholar] [CrossRef]
  15. Juan, Z.; Zheng-Ming, G.; Wu, S. The improved slime mould algorithm with Levy flight. J. Phys. Conf. Ser. 2020, 1617, 012033. [Google Scholar]
  16. Abualigah, L.; Diabat, A.; Elaziz, M.A. Improved slime mould algorithm by opposition-based learning and Levy flight distribution for global optimization and advances in real-world engineering problems. J. Ambient. Intell. Humaniz. Comput. 2023, 14, 1163–1202. [Google Scholar] [CrossRef]
  17. Örnek, B.N.; Aydemir, S.B.; Düzenli, T.; Özak, B. A novel version of slime mould algorithm for global optimization and realworld engineering problems: Enhanced slime mould algorithm. Math. Comput. Simul. 2022, 198, 253–288. [Google Scholar] [CrossRef]
  18. Yu, C.; Cai, Z.; Ye, X.; Wang, M.; Zhao, X.; Liang, G.; Li, C. Quantum-like mutation-induced dragonfly-inspired optimization approach. Math. Comput. Simul. 2020, 178, 259–289. [Google Scholar] [CrossRef]
  19. Zhang, X.; Xu, Y.; Yu, C.; Heidari, A.A.; Li, S.; Chen, H.; Li, C. Gaussian mutational chaotic fruit fly-built optimization and feature selection. Expert Syst. Appl. 2020, 141, 112976. [Google Scholar] [CrossRef]
  20. Chen, H.; Zhang, Q.; Luo, J.; Xu, Y.; Zhang, X. An enhanced bacterial foraging optimization and its application for training kernel extreme learning machine. Appl. Soft Comput. 2020, 86, 105884. [Google Scholar] [CrossRef]
  21. Luo, J.; Chen, H.; Xu, Y.; Huang, H.; Zhao, X. An improved grasshopper optimization algorithm with application to financial stress prediction. Appl. Math. Model. 2018, 64, 654–668. [Google Scholar] [CrossRef]
  22. Wu, S.; Heidari, A.A.; Zhang, S.; Kuang, F.; Chen, H. Gaussian bare-bone slime mould algorithm: Performance optimization and case studies on truss structures. Artif. Intell. Rev. 2023, 56, 9051–9087. [Google Scholar] [CrossRef]
  23. Bäck, T.; Schwefel, H.P. An overview of evolutionary algorithms for parameter optimization. Evol. Comput. 1993, 1, 1–23. [Google Scholar] [CrossRef]
  24. Ma, C.; Huang, H.; Fan, Q.; Wei, J.; Du, Y.; Gao, W. Grey wolf optimizer based on Aquila exploration method. Expert Syst. Appl. 2022, 205, 117629. [Google Scholar] [CrossRef]
  25. Preeti; Kaur, R.; Singh, D. Dimension learning based chimp optimizer for energy efficient wireless sensor networks. Sci. Rep. 2022, 12, 14968. [Google Scholar] [CrossRef] [PubMed]
  26. Qtaish, A.; Albashish, D.; Braik, M.; Alshammari, M.T.; Alreshidi, A.; Alreshidi, E.J. Memory-based Sand Cat Swarm Optimization for Feature Selection in Medical Diagnosis. Electronics 2023, 12, 2042. [Google Scholar] [CrossRef]
  27. Duan, Y.; Liu, C.; Li, S.; Guo, X.; Yang, C. Manta ray foraging and Gaussian mutation-based elephant herding optimization for global optimization. Eng. Comput. 2023, 39, 1085–1125. [Google Scholar] [CrossRef]
  28. Deb, K. An efficient constraint handling method for genetic algorithms. Comput. Methods Appl. Mech. Eng. 2000, 186, 311–338. [Google Scholar] [CrossRef]
  29. Runarsson, T.P.; Yao, X. Stochastic ranking for constrained evolutionary optimization. IEEE Trans. Evol. Comput. 2000, 4, 284–294. [Google Scholar] [CrossRef]
  30. Abdel-Basset, M.; Mohamed, R.; Sallam, K.M.; Chakrabortty, R.K. Light spectrum optimizer: A novel physics-inspired metaheuristic optimization algorithm. Mathematics 2022, 10, 3466. [Google Scholar] [CrossRef]
  31. Derrac, J.; García, S.; Molina, D.; Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 2011, 1, 3–18. [Google Scholar] [CrossRef]
  32. Azizi, M.; Talatahari, S.; Giaralis, A. Optimization of engineering design problems using atomic orbital search algorithm. IEEE Access 2021, 9, 102497–102519. [Google Scholar] [CrossRef]
  33. Kumar, A.; Wu, G.; Ali, M.Z.; Mallipeddi, R.; Suganthan, P.N.; Das, S. A test-suite of non-convex constrained optimization problems from the real-world and some baseline results. Swarm Evol. Comput. 2020, 56, 100693. [Google Scholar] [CrossRef]
Figure 1. Flowchart of the SMA-GM algorithm.
Figure 2. Convergence curves of unconstrained benchmark functions with 60 dim.
Figure 3. Diversity analysis of SMA and SMA-GM.
Figure 4. Boxplot of the SMA-GM algorithm and other algorithms on constrained functions.
Figure 5. Convergence curves of CEC2022 benchmark functions.
Figure 6. Convergence curve for IRS.
Figure 7. Convergence curve for the optimal operation of an alkylation unit.
Figure 8. Convergence curve for the welded beam design problem.
Figure 9. Convergence curve for the tension/compression spring design problem.
Table 1. Parameter setting of the involved algorithms.
Algorithm | Parameters
Common | Population size = 30; maximum iterations = 1000; number of independent runs = 30
SMA | z = 0.03
GWO | a = [2, 0]
MFO | b = 1, t = [−1, 1], a ∈ [−1, 2]
WOA | a1 = [2, 0]; a2 = [−2, −1]; b = 1
AGWO | B = 0.8, a = 2 (nonlinear reduction from 2 to 0)
IChoA | m = chaotic vector, C3 = 1, C4 = 2, l = 2.5 (nonlinear reduction from 2.5 to 0)
Table 2. Summary of unconstrained benchmark functions.
Function | Dim | Range | f_min
F1(u) = Σ_{i=1}^{n} u_i^2 | 30, 60, 200 | [−100, 100] | 0
F2(u) = Σ_{i=1}^{n} |u_i| + Π_{i=1}^{n} |u_i| | 30, 60, 200 | [−10, 10] | 0
F3(u) = Σ_{i=1}^{n} (Σ_{j=1}^{i} u_j)^2 | 30, 60, 200 | [−100, 100] | 0
F4(u) = max_i {|u_i|, 1 ≤ i ≤ n} | 30, 60, 200 | [−100, 100] | 0
F5(u) = Σ_{i=1}^{n−1} [100 (u_{i+1} − u_i^2)^2 + (u_i − 1)^2] | 30, 60, 200 | [−30, 30] | 0
F6(u) = Σ_{i=1}^{n} (⌊u_i + 0.5⌋)^2 | 30, 60, 200 | [−100, 100] | 0
F7(u) = Σ_{i=1}^{n} i u_i^4 + random[0, 1) | 30, 60, 200 | [−1.28, 1.28] | 0
F8(u) = Σ_{i=1}^{n} −u_i sin(√|u_i|) | 30, 60, 200 | [−500, 500] | −418.9829 × dim
F9(u) = Σ_{i=1}^{n} [u_i^2 − 10 cos(2π u_i) + 10] | 30, 60, 200 | [−5.12, 5.12] | 0
F10(u) = −20 exp(−0.2 √((1/n) Σ_{i=1}^{n} u_i^2)) − exp((1/n) Σ_{i=1}^{n} cos(2π u_i)) + 20 + e | 30, 60, 200 | [−32, 32] | 0
F11(u) = (1/4000) Σ_{i=1}^{n} u_i^2 − Π_{i=1}^{n} cos(u_i/√i) + 1 | 30, 60, 200 | [−600, 600] | 0
F12(u) = (π/n) {10 sin^2(π v_1) + Σ_{i=1}^{n−1} (v_i − 1)^2 [1 + 10 sin^2(π v_{i+1})] + (v_n − 1)^2} + Σ_{i=1}^{n} x(u_i, 10, 100, 4), where v_i = 1 + (u_i + 1)/4 and x(u_i, a, k, m) = k (u_i − a)^m if u_i > a; 0 if −a ≤ u_i ≤ a; k (−u_i − a)^m if u_i < −a | 30, 60, 200 | [−50, 50] | 0
F13(u) = 0.1 {sin^2(3π u_1) + Σ_{i=1}^{n−1} (u_i − 1)^2 [1 + sin^2(3π u_{i+1})] + (u_n − 1)^2 [1 + sin^2(2π u_n)]} + Σ_{i=1}^{n} x(u_i, 5, 100, 4) | 30, 60, 200 | [−50, 50] | 0
Table 3. Results of the unconstrained benchmark function in different dimensions.
Func. | Dim | Metric | SMA-GM | SMA | GWO | MFO | WOA | AGWO | IChoA
F1 | 30 | Mean | 0.0000E+00 | 0.0000E+00 | 1.1420E-58 | 3.6667E+03 | 1.2910E-150 | 1.3156E-287 | 1.6126E-21
F1 | 30 | Std | 0.0000E+00 | 0.0000E+00 | 4.6065E-58 | 5.5605E+03 | 6.3122E-150 | 0.0000E+00 | 4.1464E-21
F1 | 60 | Mean | 0.0000E+00 | 0.0000E+00 | 2.7410E-39 | 9.1879E+03 | 1.6816E-150 | 6.1214E-230 | 1.8770E-10
F1 | 60 | Std | 0.0000E+00 | 0.0000E+00 | 4.2922E-39 | 1.0502E+04 | 4.8997E-150 | 0.0000E+00 | 6.4058E-10
F1 | 200 | Mean | 0.0000E+00 | 1.9669E-305 | 3.9434E-20 | 1.9153E+05 | 7.1770E-147 | 5.3730E-69 | 8.7000E-02
F1 | 200 | Std | 0.0000E+00 | 0.0000E+00 | 3.6218E-20 | 1.8886E+04 | 2.8695E-146 | 2.9429E-68 | 1.1720E-01
F2 | 30 | Mean | 7.6334E-200 | 2.6217E-191 | 8.8027E-35 | 3.2667E+01 | 7.6231E-100 | 2.9271E-163 | 1.3916E-14
F2 | 30 | Std | 0.0000E+00 | 0.0000E+00 | 7.7522E-35 | 2.0160E+01 | 4.1689E-99 | 0.0000E+00 | 2.1687E-14
F2 | 60 | Mean | 1.4896E-194 | 1.4056E-174 | 1.2233E-23 | 8.6110E+01 | 4.2500E-103 | 6.7393E-128 | 2.0968E-08
F2 | 60 | Std | 0.0000E+00 | 0.0000E+00 | 6.6131E-24 | 4.0701E+01 | 1.8355E-102 | 2.5883E-127 | 2.3717E-08
F2 | 200 | Mean | 8.4895E-177 | 2.7465E-158 | 1.4618E-12 | 5.6387E+02 | 8.7032E-102 | 2.2380E-47 | 5.7000E-03
F2 | 200 | Std | 0.0000E+00 | 1.5043E-157 | 4.2765E-13 | 6.2588E+01 | 3.0302E-101 | 1.2258E-46 | 3.3000E-03
F3 | 30 | Mean | 0.0000E+00 | 0.0000E+00 | 6.0647E-15 | 1.7087E+04 | 2.2457E+04 | 1.6522E-167 | 2.7398E+00
F3 | 30 | Std | 0.0000E+00 | 0.0000E+00 | 1.8196E-14 | 1.0410E+04 | 1.1010E+04 | 0.0000E+00 | 5.1045E+00
F3 | 60 | Mean | 0.0000E+00 | 0.0000E+00 | 6.5304E-04 | 7.5837E+04 | 2.2594E+05 | 4.8340E-123 | 8.2193E+03
F3 | 60 | Std | 0.0000E+00 | 0.0000E+00 | 1.6000E-03 | 3.1531E+04 | 4.4994E+04 | 2.6440E-122 | 5.9416E+03
F3 | 200 | Mean | 0.0000E+00 | 6.3250E-320 | 3.4423E+03 | 6.9143E+05 | 4.2199E+06 | 1.3011E-32 | 3.4075E+05
F3 | 200 | Std | 0.0000E+00 | 0.0000E+00 | 2.8117E+03 | 1.5015E+05 | 1.0616E+06 | 7.1244E-32 | 1.1396E+05
F4 | 30 | Mean | 4.6331E-210 | 4.0835E-198 | 1.3767E-14 | 6.5007E+01 | 3.8088E+01 | 4.6986E-125 | 2.7000E-03
F4 | 30 | Std | 0.0000E+00 | 0.0000E+00 | 1.3959E-14 | 1.1037E+01 | 2.8905E+01 | 1.9603E-124 | 5.0000E-03
F4 | 60 | Mean | 6.1647E-202 | 2.7375E-183 | 8.5570E-08 | 8.6026E+01 | 6.3206E+01 | 3.3964E-111 | 3.9345E+00
F4 | 60 | Std | 0.0000E+00 | 0.0000E+00 | 9.8672E-08 | 3.7970E+00 | 2.7820E+01 | 1.4692E-110 | 3.6291E+00
F4 | 200 | Mean | 9.1750E-189 | 1.8162E-142 | 8.8706E+00 | 9.7009E+01 | 7.6502E+01 | 2.7230E-98 | 5.3073E+00
F4 | 200 | Std | 0.0000E+00 | 9.9477E-142 | 4.6248E+00 | 9.2640E-01 | 2.6037E+01 | 4.1843E-98 | 1.0233E+01
F5 | 30 | Mean | 2.5590E-01 | 3.7879E+00 | 2.6922E+01 | 5.3298E+06 | 2.7265E+01 | 2.7589E+01 | 2.5520E+01
F5 | 30 | Std | 2.3510E-01 | 9.3190E+00 | 6.6630E-01 | 2.0255E+07 | 6.0040E-01 | 7.3900E-01 | 9.8440E-01
F5 | 60 | Mean | 2.2271E+00 | 5.8131E+00 | 5.7298E+01 | 1.1389E+07 | 5.7782E+01 | 5.7853E+01 | 5.7956E+01
F5 | 60 | Std | 1.9501E+00 | 1.4323E+01 | 9.2360E-01 | 2.6454E+07 | 6.0090E-01 | 6.2410E-01 | 9.9190E-01
F5 | 200 | Mean | 1.3377E+01 | 4.1798E+01 | 1.9763E+02 | 5.9481E+08 | 1.9743E+02 | 1.9852E+02 | 5.1980E+02
F5 | 200 | Std | 1.5351E+01 | 5.3705E+01 | 6.9930E-01 | 1.0827E+08 | 3.2160E-01 | 2.9500E-01 | 9.1577E+02
F6 | 30 | Mean | 9.7570E-04 | 9.3449E-04 | 7.0830E-01 | 1.6700E+03 | 5.9300E-02 | 3.0459E+00 | 7.5000E-03
F6 | 30 | Std | 3.7960E-04 | 5.0173E-04 | 4.0930E-01 | 3.7984E+03 | 6.8300E-02 | 4.4440E-01 | 6.5000E-03
F6 | 60 | Mean | 2.0600E-02 | 2.7500E-02 | 3.5305E+00 | 1.0899E+04 | 6.4480E-01 | 9.1116E+00 | 3.3529E+00
F6 | 60 | Std | 1.6900E-02 | 1.5600E-02 | 6.9670E-01 | 9.7755E+03 | 3.1760E-01 | 4.0690E-01 | 6.9030E-01
F6 | 200 | Mean | 9.0740E-01 | 2.1431E+00 | 2.7871E+01 | 1.8316E+05 | 6.8375E+00 | 4.3473E+01 | 3.7212E+01
F6 | 200 | Std | 1.0581E+00 | 2.5638E+00 | 1.0950E+00 | 2.4172E+04 | 1.8806E+00 | 4.7210E-01 | 2.3697E+00
F7 | 30 | Mean | 8.7485E-05 | 9.6302E-05 | 8.7230E-04 | 2.2649E+00 | 1.6000E-03 | 1.1749E-04 | 1.6000E-03
F7 | 30 | Std | 7.1310E-05 | 8.6358E-05 | 3.6967E-04 | 4.1453E+00 | 1.4000E-03 | 9.1413E-05 | 9.8050E-04
F7 | 60 | Mean | 8.8723E-05 | 1.3046E-04 | 1.7000E-03 | 4.4999E+01 | 2.8000E-03 | 1.7075E-04 | 5.0000E-03
F7 | 60 | Std | 8.8346E-05 | 1.0644E-04 | 6.1596E-04 | 4.5085E+01 | 3.0000E-03 | 1.3568E-04 | 3.2000E-03
F7 | 200 | Mean | 1.5401E-04 | 2.3170E-04 | 4.4000E-03 | 1.9388E+03 | 2.1000E-03 | 7.4743E-04 | 7.6600E-02
F7 | 200 | Std | 1.4408E-04 | 1.6692E-04 | 1.5000E-03 | 4.7091E+02 | 2.6000E-03 | 5.3142E-04 | 7.5000E-02
F8 | 30 | Mean | −1.2569E+04 | −1.2569E+04 | −6.0789E+03 | −8.6141E+03 | −1.1297E+04 | −3.3244E+03 | −7.7426E+03
F8 | 30 | Std | 6.3400E-02 | 1.0400E-01 | 8.0426E+02 | 1.1705E+03 | 1.5114E+03 | 3.8411E+02 | 9.0124E+02
F8 | 60 | Mean | −2.5138E+04 | −2.5138E+04 | −1.0625E+04 | −1.5298E+04 | −2.2263E+04 | −4.4142E+03 | −1.0631E+04
F8 | 60 | Std | 1.1267E+00 | 1.0520E+00 | 1.0626E+03 | 1.4798E+03 | 3.2644E+03 | 7.2947E+02 | 1.7171E+03
F8 | 200 | Mean | −8.3789E+04 | −8.3786E+04 | −2.9174E+04 | −4.0031E+04 | −7.6030E+04 | −8.1330E+03 | −2.5357E+04
F8 | 200 | Std | 1.0139E+01 | 1.8694E+01 | 4.6872E+03 | 3.7644E+03 | 9.5686E+03 | 9.2865E+02 | 2.6464E+03
F9 | 30 | Mean | 0.0000E+00 | 0.0000E+00 | 2.4930E-01 | 1.6683E+02 | 0.0000E+00 | 0.0000E+00 | 1.9671E+01
F9 | 30 | Std | 0.0000E+00 | 0.0000E+00 | 7.7990E-01 | 4.2543E+01 | 0.0000E+00 | 0.0000E+00 | 1.7795E+01
F9 | 60 | Mean | 0.0000E+00 | 0.0000E+00 | 1.1348E+00 | 3.7428E+02 | 0.0000E+00 | 0.0000E+00 | 2.3614E+01
F9 | 60 | Std | 0.0000E+00 | 0.0000E+00 | 2.5719E+00 | 7.2331E+01 | 0.0000E+00 | 0.0000E+00 | 2.8944E+01
F9 | 200 | Mean | 0.0000E+00 | 0.0000E+00 | 1.5482E+00 | 1.9325E+03 | 1.5158E-14 | 0.0000E+00 | 4.1362E+01
F9 | 200 | Std | 0.0000E+00 | 0.0000E+00 | 3.4541E+00 | 1.0024E+02 | 8.3025E-14 | 0.0000E+00 | 3.1214E+01
F10 | 30 | Mean | 8.8818E-16 | 8.8818E-16 | 1.5573E-14 | 1.4131E+01 | 3.7303E-15 | 4.6777E-15 | 2.4759E-12
F10 | 30 | Std | 0.0000E+00 | 0.0000E+00 | 2.5945E-15 | 8.0236E+00 | 2.8605E-15 | 9.0135E-16 | 7.7754E-12
F10 | 60 | Mean | 8.8818E-16 | 8.8818E-16 | 4.2218E-14 | 1.9587E+01 | 3.7303E-15 | 6.3357E-15 | 2.2566E-06
F10 | 60 | Std | 0.0000E+00 | 0.0000E+00 | 3.0208E-15 | 4.8200E-01 | 2.1681E-15 | 1.8027E-15 | 3.2071E-06
F10 | 200 | Mean | 8.8818E-16 | 8.8818E-16 | 1.2428E-11 | 1.9942E+01 | 3.7303E-15 | 7.6383E-15 | 1.9300E-02
F10 | 200 | Std | 0.0000E+00 | 0.0000E+00 | 5.1053E-12 | 1.6200E-02 | 2.5380E-15 | 1.0840E-15 | 1.3000E-02
F11 | 30 | Mean | 0.0000E+00 | 0.0000E+00 | 1.8000E-03 | 1.8086E+01 | 5.7000E-03 | 0.0000E+00 | 3.2000E-03
F11 | 30 | Std | 0.0000E+00 | 0.0000E+00 | 6.4000E-03 | 4.3768E+01 | 3.1100E-02 | 0.0000E+00 | 5.6000E-03
F11 | 60 | Mean | 0.0000E+00 | 0.0000E+00 | 2.6000E-03 | 1.0863E+02 | 6.2000E-03 | 0.0000E+00 | 2.7000E-03
F11 | 60 | Std | 0.0000E+00 | 0.0000E+00 | 6.1000E-03 | 1.0392E+02 | 2.3600E-02 | 0.0000E+00 | 7.2000E-03
F11 | 200 | Mean | 0.0000E+00 | 0.0000E+00 | 2.9000E-03 | 1.6379E+03 | 0.0000E+00 | 0.0000E+00 | 4.2600E-02
F11 | 200 | Std | 0.0000E+00 | 0.0000E+00 | 9.3000E-03 | 2.3127E+02 | 0.0000E+00 | 0.0000E+00 | 4.7800E-02
F12 | 30 | Mean | 8.3411E-04 | 1.3000E-03 | 3.5500E-02 | 9.0740E-01 | 4.3200E-02 | 2.2990E-01 | 8.9249E-04
F12 | 30 | Std | 8.5665E-04 | 1.6000E-03 | 1.9600E-02 | 1.0168E+00 | 2.0430E-01 | 6.2200E-02 | 1.4000E-03
F12 | 60 | Mean | 2.1000E-03 | 2.6000E-03 | 1.2490E-01 | 2.5636E+07 | 1.1500E-02 | 5.1330E-01 | 1.9970E-01
F12 | 60 | Std | 3.0000E-03 | 5.0000E-03 | 6.8700E-02 | 7.8110E+07 | 6.2000E-03 | 6.2300E-02 | 6.4100E-02
F12 | 200 | Mean | 1.6000E-03 | 3.1000E-03 | 4.9390E-01 | 1.1459E+09 | 2.3900E-02 | 9.1770E-01 | 1.0377E+00
F12 | 200 | Std | 3.2000E-03 | 6.8000E-03 | 3.8100E-02 | 3.5311E+08 | 9.7000E-03 | 2.3400E-02 | 1.5627E+00
F13 | 30 | Mean | 6.2781E-04 | 1.4000E-03 | 6.1850E-01 | 4.8580E-01 | 2.0330E-01 | 2.1164E+00 | 1.2368E+00
F13 | 30 | Std | 4.5182E-04 | 2.8000E-03 | 2.5900E-01 | 1.1769E+00 | 1.3870E-01 | 1.5080E-01 | 4.2830E-01
F13 | 60 | Mean | 6.5000E-03 | 9.5000E-03 | 2.8045E+00 | 7.7544E+07 | 8.3050E-01 | 5.1814E+00 | 4.7267E+00
F13 | 60 | Std | 4.7000E-03 | 1.5400E-02 | 4.1280E-01 | 1.9330E+08 | 3.8300E-01 | 1.3920E-01 | 3.0040E-01
F13 | 200 | Mean | 7.9700E-02 | 1.7200E-01 | 1.5998E+01 | 2.4956E+09 | 3.9585E+00 | 1.9365E+01 | 3.0459E+01
F13 | 200 | Std | 1.1340E-01 | 2.5920E-01 | 4.6650E-01 | 5.9820E+08 | 1.2726E+00 | 1.1070E-01 | 2.2853E+01
Table 4. Summary of constrained benchmark functions.
Func. | Objective Function | Constraints | No. of Variables | Global Best

G1: $f(u) = 5\sum_{i=1}^{4} u_i - 5\sum_{i=1}^{4} u_i^2 - \sum_{i=5}^{13} u_i$
Subject to: $g_1 = 2u_1 + 2u_2 + u_{10} + u_{11} - 10 \le 0$; $g_2 = 2u_1 + 2u_3 + u_{10} + u_{12} - 10 \le 0$; $g_3 = 2u_2 + 2u_3 + u_{11} + u_{12} - 10 \le 0$; $g_4 = -8u_1 + u_{10} \le 0$; $g_5 = -8u_2 + u_{11} \le 0$; $g_6 = -8u_3 + u_{12} \le 0$; $g_7 = -2u_4 - u_5 + u_{10} \le 0$; $g_8 = -2u_6 - u_7 + u_{11} \le 0$; $g_9 = -2u_8 - u_9 + u_{12} \le 0$.
Variables: 13. Global best: -15.

G2: $f(u) = -\left|\dfrac{\sum_{i=1}^{n}\cos^4(u_i) - 2\prod_{i=1}^{n}\cos^2(u_i)}{\sqrt{\sum_{i=1}^{n} i\,u_i^2}}\right|$
Subject to: $g_1 = 0.75 - \prod_{i=1}^{n} u_i \le 0$; $g_2 = \sum_{i=1}^{n} u_i - 7.5n \le 0$.
Variables: 20. Global best: -0.803619.

G3: $f(u) = -(\sqrt{n})^{n}\prod_{i=1}^{n} u_i$
Subject to: $g_1 = \sum_{i=1}^{n} u_i^2 - 1 = 0$.
Variables: 20. Global best: -1.

G4: $f(u) = 5.3578547u_3^2 + 0.8356891u_1u_5 + 37.293239u_1 - 40792.141$
Subject to: $g_1 = p(u) - 92 \le 0$; $g_2 = -p(u) \le 0$; $g_3 = r(u) - 110 \le 0$; $g_4 = -r(u) + 90 \le 0$; $g_5 = s(u) - 25 \le 0$; $g_6 = -s(u) + 20 \le 0$, where
$p(u) = 85.334407 + 0.0056858u_2u_5 + 0.0006262u_1u_4 - 0.0022053u_3u_5$,
$r(u) = 80.51249 + 0.0071317u_2u_5 + 0.0029955u_1u_2 + 0.0021813u_3^2$,
$s(u) = 9.300961 + 0.0047026u_3u_5 + 0.0012547u_1u_3 + 0.0019085u_3u_4$.
Variables: 5. Global best: -30665.539.

G5: $f(u) = 3u_1 + 10^{-6}u_1^3 + 2u_2 + \frac{2}{3}\times 10^{-6}u_2^3$
Subject to: $g_1 = u_3 - u_4 - 0.55 \le 0$; $g_2 = u_4 - u_3 - 0.55 \le 0$; $g_3 = 1000\sin(-u_3 - 0.25) + 1000\sin(-u_4 - 0.25) + 894.8 - u_1 = 0$; $g_4 = 1000\sin(u_3 - 0.25) + 1000\sin(u_3 - u_4 - 0.25) + 894.8 - u_2 = 0$; $g_5 = 1000\sin(u_4 - 0.25) + 1000\sin(u_4 - u_3 - 0.25) + 1294.8 = 0$.
Variables: 4. Global best: 5126.4981.

G6: $f(u) = (u_1 - 10)^3 + (u_2 - 20)^3$
Subject to: $g_1 = -(u_1 - 5)^2 - (u_2 - 5)^2 + 100 \le 0$; $g_2 = (u_1 - 6)^2 + (u_2 - 5)^2 - 82.81 \le 0$.
Variables: 2. Global best: -6961.81388.

G7: $f(u) = u_1^2 + u_2^2 + u_1u_2 - 14u_1 - 16u_2 + (u_3 - 10)^2 + 4(u_4 - 5)^2 + (u_5 - 3)^2 + 2(u_6 - 1)^2 + 5u_7^2 + 7(u_8 - 11)^2 + 2(u_9 - 10)^2 + (u_{10} - 7)^2 + 45$
Subject to: $g_1 = 4u_1 + 5u_2 - 3u_7 + 9u_8 - 105 \le 0$; $g_2 = 10u_1 - 8u_2 - 17u_7 + 2u_8 \le 0$; $g_3 = -8u_1 + 2u_2 + 5u_9 - 2u_{10} - 12 \le 0$; $g_4 = 3(u_1 - 2)^2 + 4(u_2 - 3)^2 + 2u_3^2 - 7u_4 - 120 \le 0$; $g_5 = 5u_1^2 + 8u_2 + (u_3 - 6)^2 - 2u_4 - 40 \le 0$; $g_6 = u_1^2 + 2(u_2 - 2)^2 - 2u_1u_2 + 14u_5 - 6u_6 \le 0$; $g_7 = 0.5(u_1 - 8)^2 + 2(u_2 - 4)^2 + 3u_5^2 - u_6 - 30 \le 0$; $g_8 = -3u_1 + 6u_2 + 12(u_9 - 8)^2 - 7u_{10} \le 0$.
Variables: 10. Global best: 24.3062091.

G8: $f(u) = -\dfrac{\sin^3(2\pi u_1)\sin(2\pi u_2)}{u_1^3(u_1 + u_2)}$
Subject to: $g_1 = u_1^2 - u_2 + 1 \le 0$; $g_2 = 1 - u_1 + (u_2 - 4)^2 \le 0$.
Variables: 2. Global best: -0.095825.

G9: $f(u) = (u_1 - 10)^2 + 5(u_2 - 12)^2 + u_3^4 + 3(u_4 - 11)^2 + 10u_5^6 + 7u_6^2 + u_7^4 - 4u_6u_7 - 10u_6 - 8u_7$
Subject to: $g_1 = 2u_1^2 + 3u_2^4 + u_3 + 4u_4^2 + 5u_5 - 127 \le 0$; $g_2 = 7u_1 + 3u_2 + 10u_3^2 + u_4 - u_5 - 282 \le 0$; $g_3 = 23u_1 + u_2^2 + 6u_6^2 - 8u_7 - 196 \le 0$; $g_4 = 4u_1^2 + u_2^2 - 3u_1u_2 + 2u_3^2 + 5u_6 - 11u_7 \le 0$.
Variables: 7. Global best: 680.6300573.

G10: $f(u) = u_1 + u_2 + u_3$
Subject to: $g_1 = -1 + 0.0025(u_4 + u_6) \le 0$; $g_2 = -1 + 0.0025(-u_4 + u_5 + u_7) \le 0$; $g_3 = -1 + 0.01(-u_5 + u_8) \le 0$; $g_4 = 100u_1 - u_1u_6 + 833.33252u_4 - 83333.333 \le 0$; $g_5 = u_2u_4 - u_2u_7 - 1250u_4 + 1250u_5 \le 0$; $g_6 = u_3u_5 - u_3u_8 - 2500u_5 + 1250000 \le 0$.
Variables: 8. Global best: 7049.3307.

G11: $f(u) = u_1^2 + (u_2 - 1)^2$
Subject to: $g_1 = u_2 - u_1^2 = 0$.
Variables: 2. Global best: 0.75.

G12: $f(u) = -1 + 0.01\left[(u_1 - 5)^2 + (u_2 - 5)^2 + (u_3 - 5)^2\right]$
Subject to: $g_{p,r,s} = (u_1 - p)^2 + (u_2 - r)^2 + (u_3 - s)^2 - 0.0625 \le 0$ for some $p, r, s = 1, \ldots, 9$.
Variables: 3. Global best: -1.

G13: $f(u) = e^{u_1u_2u_3u_4u_5}$
Subject to: $g_1 = u_1^2 + u_2^2 + u_3^2 + u_4^2 + u_5^2 - 10 = 0$; $g_2 = u_2u_3 - 5u_4u_5 = 0$; $g_3 = u_1^3 + u_2^3 + 1 = 0$.
Variables: 5. Global best: 0.0539498.
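The paper does not reproduce its constraint-handling code, but a common way to evaluate such constrained benchmarks inside a metaheuristic is a static penalty: violated constraints add a weighted squared violation to the objective. The sketch below applies this to G6; it is illustrative only (the penalty weight `lam` is an assumed value, not taken from the paper).

```python
# Static-penalty evaluation of benchmark G6 (illustrative sketch; the
# penalty weight `lam` is an assumed value, not the authors' setting).

def g6_objective(u):
    # f(u) = (u1 - 10)^3 + (u2 - 20)^3
    return (u[0] - 10.0) ** 3 + (u[1] - 20.0) ** 3

def g6_constraints(u):
    # Feasible iff both inequality constraints satisfy g(u) <= 0.
    g1 = -(u[0] - 5.0) ** 2 - (u[1] - 5.0) ** 2 + 100.0
    g2 = (u[0] - 6.0) ** 2 + (u[1] - 5.0) ** 2 - 82.81
    return [g1, g2]

def penalized_fitness(u, lam=1e6):
    # Add lam * violation^2 for every violated constraint.
    penalty = sum(max(0.0, g) ** 2 for g in g6_constraints(u))
    return g6_objective(u) + lam * penalty

# Near the known optimum (14.095, 0.84296) both constraints are active,
# so the penalized fitness approaches the global best -6961.81388.
fit = penalized_fitness([14.095, 0.84296])
```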
Table 5. Comparison results of constrained benchmark functions.
Func. | Metric | SMA-GM | SMA | GWO | MFO | WOA | AGWO | IChoA
G1 | Mean | -1.4834E+01 | -1.3171E+01 | -1.0193E+01 | -1.1600E+01 | -5.9204E+00 | -7.8403E+00 | -1.2915E+01
G1 | Std | 5.9100E-02 | 1.7645E+00 | 2.3500E+00 | 2.0443E+00 | 3.2965E+00 | 1.7856E+00 | 1.5116E+00
G1 | Best | -1.5000E+01 | -1.5000E+01 | -1.4965E+01 | -1.5000E+01 | -1.4927E+01 | -1.1854E+01 | -1.4954E+01
G1 | Worst | -1.4692E+01 | -9.0061E+00 | -5.9998E+00 | -9.0000E+00 | -2.0000E+00 | -5.0000E+00 | 1.5116E+00
G2 | Mean | -5.3780E-01 | -4.4930E-01 | -7.0890E-01 | -4.5880E-01 | -4.2790E-01 | -5.6220E-01 | -7.7500E-01
G2 | Std | 1.1250E-01 | 2.1100E-02 | 5.7100E-02 | 1.2080E-01 | 1.1570E-01 | 5.3500E-02 | 1.4300E-02
G2 | Best | -7.7790E-01 | -5.0900E-01 | -7.9150E-01 | -6.4820E-01 | -6.1450E-01 | -7.1270E-01 | -7.9160E-01
G2 | Worst | -3.0110E-01 | -4.3220E-01 | -5.8000E-01 | -2.2600E-01 | -2.5680E-01 | 5.3500E-02 | -7.3300E-01
G3 | Mean | -1.0000E+00 | -9.8190E-01 | -8.9930E-01 | -9.8070E-01 | -9.8400E-02 | -9.6580E-01 | -9.9360E-01
G3 | Std | 2.5096E-08 | 9.9000E-03 | 4.0650E-02 | 1.0600E-02 | 1.3470E-01 | 1.2300E-02 | 1.9000E-03
G3 | Best | -1.0000E+00 | -9.9100E-01 | -9.9700E-01 | -9.9610E-01 | -5.7960E-01 | -9.8630E-01 | -9.9630E-01
G3 | Worst | -1.0000E+00 | -9.5580E-01 | 0.0000E+00 | -9.5730E-01 | 0.0000E+00 | -9.3630E-01 | -9.8970E-01
G4 | Mean | -3.0666E+04 | -3.0666E+04 | -3.0660E+04 | -3.0662E+04 | -2.9825E+04 | -3.0652E+04 | -3.0664E+04
G4 | Std | 1.7000E-03 | 4.9000E-03 | 3.1921E+00 | 2.0642E+01 | 2.5829E+02 | 7.8038E+00 | 1.1844E+00
G4 | Best | -3.0666E+04 | -3.0666E+04 | -3.0665E+04 | -3.0666E+04 | -3.0153E+04 | -3.0663E+04 | -3.0665E+04
G4 | Worst | -3.0665E+04 | 4.9000E-03 | -3.0654E+04 | -3.0552E+04 | -2.8958E+04 | -3.0628E+04 | 1.1844E+00
G5 | Mean | 5.2398E+03 | 5.3587E+03 | 5.2782E+03 | 5.4571E+03 | 5.7436E+03 | 5.2632E+03 | 5.1614E+03
G5 | Std | 9.6915E+01 | 2.3378E+02 | 1.0116E+02 | 2.8011E+02 | 4.1954E+02 | 4.8620E+01 | 1.1303E+01
G5 | Best | 5.1265E+03 | 5.1269E+03 | 5.1289E+03 | 5.1280E+03 | 5.1701E+03 | 5.1565E+03 | 5.1401E+03
G5 | Worst | 5.4664E+03 | 5.9295E+03 | 5.5960E+03 | 6.0631E+03 | 6.6413E+03 | 5.3114E+03 | 5.1869E+03
G6 | Mean | -6.9618E+03 | -6.9616E+03 | -6.9179E+03 | 1.5639E+19 | 2.5558E+19 | 1.4141E+18 | -6.9588E+03
G6 | Std | 2.3600E-02 | 1.9630E-01 | 2.3803E+01 | 4.5209E+19 | 3.9435E+19 | 7.7453E+18 | 2.2065E+00
G6 | Best | -6.9618E+03 | -6.9618E+03 | -6.9734E+03 | -6.9562E+03 | -6.9496E+03 | -6.9504E+03 | -6.9607E+03
G6 | Worst | -6.9617E+03 | -6.9610E+03 | -6.8502E+03 | 2.3980E+20 | 2.0342E+20 | 4.2423E+19 | -6.9498E+03
G7 | Mean | 2.5207E+01 | 2.6543E+01 | 3.7482E+01 | 1.5192E+02 | 1.3137E+02 | 3.2642E+01 | 2.6184E+01
G7 | Std | 4.0710E-01 | 1.8443E+00 | 2.4819E+01 | 1.8031E+02 | 1.6180E+02 | 3.9615E+02 | 4.6660E-01
G7 | Best | 2.4380E+01 | 2.4390E+01 | 2.7835E+01 | 2.5275E+01 | 3.4038E+01 | 3.2642E+01 | 2.5374E+01
G7 | Worst | 2.7376E+01 | 3.1575E+01 | 1.3469E+02 | 6.0889E+02 | 7.7964E+02 | 9.6900E+02 | 2.7474E+01
G8 | Mean | -8.4600E-02 | 1.2524E+20 | -9.5800E-02 | -9.5800E-02 | -9.5800E-02 | -9.5800E-02 | -9.5800E-02
G8 | Std | 2.5600E-02 | 3.3375E+20 | 3.7948E-07 | 1.8937E-17 | 2.6006E-07 | 2.2121E-06 | 1.6094E-17
G8 | Best | -9.5400E-02 | -2.1700E-02 | -9.5800E-02 | -9.5800E-02 | -9.5800E-02 | -9.5800E-02 | -9.5800E-02
G8 | Worst | -2.5500E-02 | 1.6954E+21 | -9.5800E-02 | -9.5800E-02 | -9.5800E-02 | -9.5800E-02 | -9.5800E-02
G9 | Mean | 6.8080E+02 | 6.8186E+02 | 6.8640E+02 | 6.8133E+02 | 7.2166E+02 | 7.1236E+02 | 6.8088E+02
G9 | Std | 1.0620E-01 | 1.2547E+00 | 5.8186E+00 | 7.1000E-01 | 2.9694E+01 | 4.8318E+01 | 7.5500E-02
G9 | Best | 6.8065E+02 | 6.8077E+02 | 6.8100E+02 | 6.8068E+02 | 6.8838E+02 | 6.8436E+02 | 6.8076E+02
G9 | Worst | 6.8112E+02 | 6.8669E+02 | 7.0999E+02 | 6.8335E+02 | 8.2469E+02 | 9.0186E+02 | 6.8113E+02
G10 | Mean | 7.8444E+03 | 8.2845E+03 | 8.0568E+03 | 9.0121E+17 | 1.0923E+19 | 8.4891E+03 | 8.1973E+03
G10 | Std | 3.4986E+02 | 4.5156E+02 | 4.0786E+02 | 2.1431E+18 | 2.6706E+19 | 3.5825E+02 | 3.9594E+02
G10 | Best | 7.0652E+03 | 7.2193E+03 | 7.2777E+03 | 7.0684E+03 | 9.6935E+03 | 7.7745E+03 | 7.6514E+03
G10 | Worst | 8.5464E+03 | 9.2693E+03 | 8.9547E+03 | 8.3416E+18 | 1.2393E+20 | 9.0926E+03 | 8.6863E+03
G11 | Mean | 7.5000E-01 | 7.5000E-01 | 7.5000E-01 | 7.5070E-01 | 7.5050E-01 | 7.5010E-01 | 7.5000E-01
G11 | Std | 1.7737E-05 | 5.9117E-05 | 1.5477E-05 | 6.7194E-04 | 1.0000E-03 | 8.0545E-05 | 1.0865E-05
G11 | Best | 7.5000E-01 | 7.5000E-01 | 7.5000E-01 | 7.5000E-01 | 7.5000E-01 | 7.5000E-01 | 7.5000E-01
G11 | Worst | 7.5010E-01 | 7.5020E-01 | 7.5010E-01 | 7.5270E-01 | 7.5460E-01 | 7.5030E-01 | 7.5010E-01
G12 | Mean | -1.0000E+00 | -1.0000E+00 | -1.0000E+00 | -1.0000E+00 | -1.0000E+00 | -1.0000E+00 | -1.0000E+00
G12 | Std | 0.0000E+00 | 1.7076E-09 | 2.0459E-08 | 0.0000E+00 | 4.9676E-08 | 1.6854E-07 | 0.0000E+00
G12 | Best | -1.0000E+00 | -1.0000E+00 | -1.0000E+00 | -1.0000E+00 | -1.0000E+00 | -1.0000E+00 | -1.0000E+00
G12 | Worst | -1.0000E+00 | -1.0000E+00 | -1.0000E+00 | -1.0000E+00 | -1.0000E+00 | -1.0000E+00 | -1.0000E+00
G13 | Mean | 9.0910E-01 | 1.1229E+00 | 1.5451E+00 | 9.7310E-01 | 7.5399E+15 | 3.4936E+15 | 2.9115E+16
G13 | Std | 3.5220E-01 | 6.0810E-01 | 2.4258E+00 | 4.2510E-01 | 1.7540E+16 | 5.2593E+16 | 4.6794E+16
G13 | Best | 7.7400E-02 | 5.9700E-01 | 2.5760E-01 | 3.0590E-01 | 9.3930E-01 | 7.2668E+13 | 2.9424E+13
G13 | Worst | 1.9110E+00 | 3.5430E+00 | 1.0792E+01 | 2.9538E+00 | 6.4159E+16 | 5.2593E+16 | 2.1279E+17
Table 6. Description of CEC2022 functions.
No. | Type | Function | Global Optimum
C1 | Unimodal function | Shifted and Full Rotated Zakharov Function | 300
C2 | Basic functions | Shifted and Full Rotated Rosenbrock's Function | 400
C3 | Basic functions | Shifted and Full Rotated Expanded Schaffer's f6 Function | 600
C4 | Basic functions | Shifted and Full Rotated Non-continuous Rastrigin's Function | 800
C5 | Basic functions | Shifted and Rotated Levy Function | 900
C6 | Hybrid functions | Hybrid Function 1 (N = 3) | 1800
C7 | Hybrid functions | Hybrid Function 2 (N = 6) | 2000
C8 | Hybrid functions | Hybrid Function 3 (N = 5) | 2200
C9 | Composition functions | Composition Function 1 (N = 5) | 2300
C10 | Composition functions | Composition Function 2 (N = 4) | 2400
C11 | Composition functions | Composition Function 3 (N = 5) | 2600
C12 | Composition functions | Composition Function 4 (N = 6) | 2700
Search range: [-100, 100]^D
Table 7. Comparison results of SMA-GM with other algorithms for CEC2022 functions.
Function | Measure | SMA-GM | SMA | GWO | MFO | WOA | AGWO | IChoA
C1 | Mean | 3.0075E+02 | 3.0606E+02 | 1.5556E+04 | 3.7841E+04 | 2.6617E+04 | 1.8679E+04 | 6.3721E+03
C1 | Std | 1.0474E+00 | 1.1659E+01 | 5.3757E+03 | 2.2427E+04 | 8.5331E+03 | 4.6890E+03 | 2.1821E+03
C2 | Mean | 4.5264E+02 | 4.6056E+02 | 5.0443E+02 | 5.5281E+02 | 5.5932E+02 | 6.7119E+02 | 4.5511E+02
C2 | Std | 2.6936E+01 | 3.5103E+01 | 4.6753E+01 | 1.3298E+02 | 6.3667E+01 | 1.0402E+02 | 1.1528E+01
C3 | Mean | 6.0185E+02 | 6.0336E+02 | 6.2079E+02 | 6.0498E+02 | 6.6742E+02 | 6.4909E+02 | 6.0198E+02
C3 | Std | 1.2369E+00 | 3.2547E+00 | 1.0238E+01 | 2.8857E+00 | 1.2927E+01 | 8.6198E+00 | 3.1905E+00
C4 | Mean | 8.6863E+02 | 8.7890E+02 | 8.5632E+02 | 8.9231E+02 | 9.2433E+02 | 8.9681E+02 | 9.1443E+02
C4 | Std | 1.7006E+01 | 2.6378E+01 | 2.0991E+01 | 2.6189E+01 | 2.9938E+01 | 1.6206E+01 | 2.5496E+01
C5 | Mean | 1.4174E+03 | 1.6059E+03 | 1.1853E+03 | 2.8859E+03 | 3.7702E+03 | 2.2381E+03 | 2.1262E+03
C5 | Std | 4.2994E+02 | 5.2857E+02 | 2.8897E+02 | 1.0928E+03 | 1.2968E+03 | 5.3361E+02 | 6.6054E+02
C6 | Mean | 1.6826E+04 | 1.6887E+04 | 1.0582E+06 | 5.3890E+06 | 1.9337E+06 | 9.3540E+06 | 2.2690E+06
C6 | Std | 8.1372E+03 | 7.7127E+03 | 2.7053E+06 | 1.1969E+07 | 3.8576E+06 | 1.1970E+07 | 5.6906E+06
C7 | Mean | 2.0754E+03 | 2.0852E+03 | 2.0937E+03 | 2.1307E+03 | 2.2072E+03 | 2.1544E+03 | 2.0962E+03
C7 | Std | 3.0956E+01 | 4.5286E+01 | 4.1485E+01 | 5.5167E+01 | 7.1031E+01 | 4.5667E+01 | 2.8800E+01
C8 | Mean | 2.2442E+03 | 2.2840E+03 | 2.2783E+03 | 2.2724E+03 | 2.3070E+03 | 2.2744E+03 | 2.2603E+03
C8 | Std | 4.2107E+01 | 7.4723E+01 | 5.9580E+01 | 6.1209E+01 | 8.4745E+01 | 6.8001E+01 | 3.8452E+01
C9 | Mean | 2.4810E+03 | 2.4810E+03 | 2.5107E+03 | 2.5126E+03 | 2.5661E+03 | 2.5804E+03 | 2.4814E+03
C9 | Std | 1.9860E-01 | 2.1650E-01 | 2.5276E+01 | 3.3465E+01 | 4.9679E+01 | 3.0063E+01 | 2.2150E-01
C10 | Mean | 2.9457E+03 | 2.9854E+03 | 3.5612E+03 | 3.9601E+03 | 4.7196E+03 | 4.8970E+03 | 3.6656E+03
C10 | Std | 3.1960E+02 | 3.4560E+02 | 8.7356E+02 | 9.2908E+02 | 1.3148E+03 | 1.2388E+03 | 1.6121E+03
C11 | Mean | 2.9271E+03 | 2.9743E+03 | 3.5838E+03 | 4.1465E+03 | 3.6249E+03 | 4.5947E+03 | 3.0641E+03
C11 | Std | 1.1749E+02 | 1.1053E+02 | 3.6136E+02 | 8.5659E+02 | 6.5494E+02 | 6.5468E+02 | 1.9326E+02
C12 | Mean | 2.9458E+03 | 2.9495E+03 | 2.9769E+03 | 2.9565E+03 | 3.0552E+03 | 3.0823E+03 | 3.0141E+03
C12 | Std | 6.5767E+00 | 8.8285E+00 | 2.4279E+01 | 1.2255E+01 | 7.2826E+01 | 6.5608E+01 | 6.6373E+01
Table 8. Wilcoxon rank sum test results (p-values) of SMA-GM with each algorithm for the constrained function and CEC2022 test function.
Func. | SMA | GWO | MFO | WOA | AGWO | IChoA
G1 | 2.7100E-02 | 5.5727E-10 | 8.0899E-06 | 5.3893E-10 | 3.0199E-11 | 3.8307E-05
G2 | 1.0300E-02 | 3.4742E-10 | 6.1000E-03 | 3.3000E-03 | 1.2541E-07 | 1.3289E-10
G3 | 3.0199E-11 | 2.9543E-11 | 3.0199E-11 | 3.0180E-11 | 3.0199E-11 | 3.0199E-11
G4 | 1.1000E-03 | 3.0199E-11 | 5.3195E-05 | 3.0199E-11 | 3.0199E-11 | 3.0199E-11
G5 | 2.8100E-02 | 8.0000E-03 | 1.6813E-04 | 2.0338E-09 | 8.7000E-03 | 8.5641E-04
G6 | 3.4971E-09 | 5.5727E-10 | 2.9822E-11 | 3.0199E-11 | 3.0199E-11 | 3.0199E-11
G7 | 8.1465E-05 | 3.0199E-11 | 2.3715E-10 | 3.0199E-11 | 3.0199E-11 | 1.8500E-08
G8 | 1.7769E-10 | 9.2340E-01 | 1.2118E-12 | 7.0100E-02 | 7.7310E-01 | 1.2118E-12
G9 | 8.1014E-10 | 5.4941E-11 | 2.2539E-04 | 3.0199E-11 | 3.0199E-11 | 3.8307E-05
G10 | 3.0939E-06 | 4.0000E-03 | 9.0595E-08 | 3.0199E-11 | 5.5329E-08 | 3.5923E-05
G11 | 2.8100E-02 | 1.9900E-02 | 2.1947E-08 | 1.8580E-01 | 4.8011E-07 | 3.7900E-01
G12 | 1.2118E-12 | 1.2118E-12 | NaN | 1.2118E-12 | 1.2118E-12 | NaN
G13 | 3.1500E-02 | 2.4200E-02 | 7.9580E-01 | 1.4643E-10 | 3.0199E-11 | 3.0199E-11
C1 | 5.5999E-07 | 3.0199E-11 | 3.0199E-11 | 3.0199E-11 | 3.0199E-11 | 3.0199E-11
C2 | 3.5100E-02 | 3.3520E-08 | 2.1327E-05 | 3.4742E-10 | 3.3384E-11 | 2.9730E-01
C3 | 2.6100E-02 | 3.0939E-06 | 3.6897E-11 | 3.0199E-11 | 3.0199E-11 | 4.5100E-02
C4 | 4.5100E-02 | 8.2400E-02 | 2.8389E-04 | 1.5465E-09 | 3.8053E-07 | 1.6980E-08
C5 | 4.2100E-02 | 5.3700E-02 | 2.3897E-08 | 8.9934E-11 | 4.8011E-07 | 2.5974E-05
C6 | 7.2400E-02 | 1.6000E-03 | 9.9000E-03 | 3.0199E-11 | 3.0199E-11 | 3.0059E-04
C7 | 4.5100E-02 | 2.3200E-02 | 7.1988E-05 | 1.3289E-10 | 6.0720E-11 | 2.6384E-06
C8 | 8.0000E-03 | 7.6973E-04 | 1.4932E-04 | 2.1959E-07 | 3.4285E-04 | 5.9000E-03
C9 | 2.9200E-02 | 3.3384E-11 | 5.5282E-08 | 3.0199E-11 | 7.5527E-11 | 5.0757E-13
C10 | 3.5100E-02 | 5.8000E-03 | 8.1465E-05 | 4.4205E-06 | 3.1770E-01 | 2.1232E-06
C11 | 4.6800E-02 | 1.7769E-10 | 4.4440E-07 | 4.9752E-11 | 1.9460E-09 | 5.6073E-05
C12 | 3.6400E-02 | 2.6695E-09 | 1.9963E-05 | 3.6897E-11 | 9.5867E-18 | 4.3116E-12
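The p-values in Table 8 come from the two-sided Wilcoxon rank sum test applied to the per-run results of SMA-GM versus each competitor. As an illustrative sketch (not the authors' code), the test can be implemented in pure Python with the usual normal approximation:

```python
import math

def ranksum_pvalue(a, b):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation.

    Illustrative sketch of the test behind Table 8; ties receive
    average ranks.
    """
    pooled = sorted((v, 0 if i < len(a) else 1)
                    for i, v in enumerate(list(a) + list(b)))
    # Assign average ranks to runs of tied values.
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[k] = avg
        i = j + 1
    n1, n2 = len(a), len(b)
    w = sum(r for r, (_, grp) in zip(ranks, pooled) if grp == 0)
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2.0))  # two-sided p-value

# Clearly separated samples give p << 0.05, i.e. a significant difference.
p = ranksum_pvalue(list(range(15)), list(range(20, 35)))
```

In practice `scipy.stats.ranksums` offers the same test; a p-value below 0.05 is read, as in the paper, as a significant difference between the two optimizers' result distributions.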
Table 9. Optimal design of IRS.
Variable | SMA-GM | SMA | GWO | MFO | WOA | AGWO | IChoA
u1 | 0.001 | 0.001 | 0.001 | 0.001 | 0.0010253 | 0.001 | 0.0010023
u2 | 0.001 | 0.001 | 0.0010429 | 0.001 | 0.001026 | 0.001 | 0.0011995
u3 | 0.0010002 | 0.0010004 | 0.0010131 | 0.001 | 0.00573 | 0.001 | 0.0016858
u4 | 0.0010007 | 0.001 | 0.0010678 | 0.001 | 0.0010258 | 0.0010188 | 0.009546
u5 | 0.001 | 0.001 | 0.001041 | 0.001 | 0.0010253 | 0.0010128 | 0.0015728
u6 | 0.001 | 0.001 | 0.0023688 | 0.001 | 0.0010252 | 0.001009 | 0.0013415
u7 | 1.524 | 1.524 | 1.5244 | 1.524 | 1.524 | 1.5388 | 1.559
u8 | 1.524 | 1.5241 | 1.5249 | 1.524 | 1.5319 | 1.5249 | 1.5677
u9 | 5 | 5 | 4.9991 | 5 | 4.9996 | 4.9699 | 4.9931
u10 | 2.0002 | 2.7834 | 2.0433 | 2.1899 | 2.3719 | 2.0121 | 2.4023
u11 | 0.0010001 | 0.0019607 | 0.0035254 | 0.026399 | 0.012986 | 0.001 | 0.0030193
u12 | 0.001 | 0.0019593 | 0.0030939 | 0.026399 | 0.0010252 | 0.001 | 0.0029526
u13 | 0.0072936 | 0.011703 | 0.012496 | 0.034757 | 0.0010256 | 0.0065311 | 0.012581
u14 | 0.087553 | 0.14049 | 0.14981 | 0.41726 | 0.011957 | 0.077074 | 0.13863
Optimal value | 0.032216 | 0.036524 | 0.036843 | 0.054407 | 0.2536 | 0.035847 | 0.046458
Table 10. Constraint values of the IRS design problem.
Constraint | SMA-GM | SMA | GWO | MFO | WOA | AGWO | IChoA
g1 | -0.0000 | -0.0000 | 0.0000 | 0 | -0.2368 | -0.0000 | -0.0287
g2 | -0.0000 | 0.0006 | -0.0003 | 0 | -0.0168 | 0.2793 | -0.0420
g3 | -7.5614 | -7.5616 | -7.5617 | -7.5616 | -6.0062 | -0.0010 | -7.3166
g4 | -0.9788 | -0.9774 | -0.9899 | -0.9935 | -0.8470 | 0.0867 | -0.9864
g5 | -0.0002 | -0.0001 | -0.0056 | 0 | -0.4332 | -0.0000 | -0.0513
g6 | -0.9802 | -0.9777 | -0.9665 | -0.8329 | -0.9796 | -0.0010 | -0.9687
g7 | -0.9389 | -0.9389 | -0.8976 | -0.8123 | -0.8982 | -0.0010 | -0.9021
g8 | -0.9901 | -0.9901 | -0.9903 | -0.9901 | -0.9975 | -0.0010 | -0.9950
g9 | -0.9807 | -0.9807 | -0.9917 | -0.9807 | -1.0000 | -0.0010 | -0.9967
g10 | -0.9702 | -0.9702 | -0.9730 | -0.9702 | -0.9702 | -0.0010 | -0.9860
g11 | -0.0004 | 0.0000 | -0.0053 | 0 | -0.9491 | 0.0974 | -0.2135
g12 | -0.9440 | -0.9440 | -0.9742 | -0.9440 | -0.9963 | -0.0009 | -0.9501
g13 | -0.6000 | -0.6000 | -0.6000 | -0.6000 | -0.5999 | 1.9990 | -0.5964
g14 | -0.0010 | -0.1145 | -0.0049 | -0.6000 | 0.0003 | 0.3814 | -0.0725
g15 | -0.0000 | 0 | -0.1081 | 0 | 0 | -0.0001 | -0.3702
Table 11. Optimal operation of an alkylation unit.
Variable | SMA-GM | SMA | GWO | MFO | WOA | AGWO | IChoA
u1 | 2000 | 2000 | 2000 | 2000 | 1999.9606 | 1706.8626 | 1991.0172
u2 | 0 | 4.8302E-18 | 0 | 0 | 0 | 0 | 1.4796E-06
u3 | 2571.4216 | 2473.1669 | 2457.5344 | 2820.406 | 3094.9607 | 2682.5542 | 2904.3814
u4 | 0 | 0 | 0 | 0 | 0 | 0 | 6.1598E-07
u5 | 58.139421 | 57.723041 | 57.663054 | 59.229379 | 61.102948 | 61.070298 | 61.106755
u6 | 1.2386014 | 0.8260777 | 0.7625373 | 12.385084 | 4.2767875 | 4.5800474 | 3.1824157
u7 | 41.381011 | 40.379902 | 40.033248 | 44.584016 | 50.817011 | 50.391441 | 26.440249
Optimal value | -4529.1132 | -4526.428 | -4524.7028 | -4513.7265 | -4370.0026 | -3761.7272 | 3.2702E+13
Table 12. Constraint values of the optimal operation of an alkylation unit.
Constraint | SMA-GM | SMA | GWO | MFO | WOA | AGWO | IChoA
g1 | -0.0000 | 0.0000 | -0.0054 | -0.0050 | -0.0056 | -0.0015 | -0.0035
g2 | -0.0053 | -0.0054 | -0.0003 | -0.0009 | -0.0001 | -0.0025 | -0.0009
g3 | -0.0205 | -0.0205 | -0.0221 | -0.0215 | -0.0234 | -0.0188 | -0.0102
g4 | -0.0000 | -0.0000 | 0.0000 | 0.0000 | 0.0000 | -0.0000 | -0.0001
g5 | 0.0000 | 0.0000 | 0 | 0 | 0 | 0 | 0.0000
g6 | -0.0000 | -0.0000 | 0 | 0 | 0 | 0 | -0.0000
g7 | -0.0000 | -0.0000 | -0.0000 | -0.0000 | -0.0000 | -0.0013 | -0.0009
g8 | -0.0000 | -0.0000 | -0.0000 | -0.0000 | -0.0000 | 0.0000 | 0.0000
g9 | -0.0000 | -0.0000 | -0.0000 | 0.0000 | -0.0000 | -0.0001 | -0.0000
g10 | -0.1030 | -0.1068 | -0.1629 | -0.1639 | -0.1631 | -0.0639 | -0.0865
g11 | 0.0000 | 0.0000 | 0 | 0 | 0 | 0 | 0.0000
g12 | -0.0000 | -0.0000 | 0 | 0 | 0 | 0 | -0.0000
g13 | -8.6920 | -8.5088 | -3.2088 | -4.1828 | -3.5747 | -9.9144 | -9.0460
g14 | -0.2092 | -0.2385 | -1.0865 | -0.9306 | -1.0279 | -0.0136 | -0.1525
Table 13. Optimal result of the welded beam design problem.
Variable | SMA-GM | SMA | GWO | MFO | WOA | AGWO | IChoA
u1 | 0.20573 | 0.20557 | 0.20555 | 0.20612 | 0.17825 | 0.2004 | 0.2057
u2 | 3.4704 | 3.474 | 3.4724 | 3.4654 | 4.5602 | 3.5899 | 3.4719
u3 | 9.037 | 9.0366 | 9.0506 | 9.028 | 8.9823 | 9.0493 | 9.0357
u4 | 0.20573 | 0.20573 | 0.20572 | 0.20612 | 0.20823 | 0.20587 | 0.20577
Optimum cost | 1.7249 | 1.7251 | 1.7271 | 1.7263 | 1.8302 | 1.7358 | 1.7252
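The welded beam cost reported in Table 13 follows the standard fabrication-cost model for this benchmark, f = 1.10471 u1² u2 + 0.04811 u3 u4 (14 + u2), with u1 the weld thickness, u2 the weld length, u3 the bar height, and u4 the bar width. A quick check (an illustrative sketch, not the authors' code) reproduces the SMA-GM cost from its tabulated design variables:

```python
# Standard welded beam cost model; evaluated at the SMA-GM design of Table 13.

def welded_beam_cost(u1, u2, u3, u4):
    # f = weld material cost + bar material cost
    return 1.10471 * u1 ** 2 * u2 + 0.04811 * u3 * u4 * (14.0 + u2)

cost = welded_beam_cost(0.20573, 3.4704, 9.037, 0.20573)  # ~1.7249 (Table 13)
```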
Table 14. Constraint values of the welded beam design problem.
Constraint | SMA-GM | SMA | GWO | MFO | WOA | AGWO | IChoA
g1 | -0.0100 | 0.3094 | -34.3397 | 0 | -890.2884 | -25.7247 | 9.9250
g2 | -2.4395 | 5.0397 | -41.6530 | 0 | -194.3126 | 176.6754 | 10.3670
g3 | -0.0000 | 0.0000 | -0.0010 | 0.0000 | -0.0300 | -0.0023 | -0.0001
g4 | -3.4329 | -0.0000 | -3.4279 | -3.4272 | -3.3184 | -3.4287 | -3.4325
g5 | -0.0807 | 0.0000 | -0.0802 | -0.0829 | -0.0536 | -0.0795 | -0.0810
g6 | -0.2355 | 0.0002 | -0.2356 | -0.2355 | -0.2356 | -0.2354 | -0.2355
g7 | -0.1228 | 0.0001 | -39.7059 | -169.2802 | -245.6852 | -70.7822 | -24.0572
Table 15. Optimal result of the tension/compression spring design problem.
Variable | SMA-GM | SMA | GWO | MFO | WOA | AGWO | IChoA
u1 | 0.0514206 | 0.0503801 | 0.0517674 | 0.0523125 | 0.0526395 | 0.05 | 0.0513004
u2 | 0.350294 | 0.326035 | 0.358598 | 0.371903 | 0.380017 | 0.317405 | 0.347299
u3 | 11.6757 | 13.3437 | 11.1885 | 10.4512 | 10.0432 | 14.0373 | 11.8796
Optimum weight | 0.012667 | 0.012697 | 0.012674 | 0.012672 | 0.012681 | 0.012726 | 0.012686
Table 16. Constraint values of the tension/compression spring design problem.
Constraint | SMA-GM | SMA | GWO | MFO | WOA | AGWO | IChoA
g1 | 0 | 0.9303 | 0.0007 | 0 | -0.0000 | 0.5120 | -0.0873
g2 | 0 | -0.1657 | -0.0012 | -0.0000 | 0.0040 | -0.1662 | 0.0022
g3 | -4.0409 | -55.1800 | -4.0544 | -4.0828 | -4.1181 | -7.0128 | -3.6005
g4 | -0.7322 | -0.8000 | -0.7265 | -0.7172 | -0.7121 | -0.8000 | -0.7486
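The tension/compression spring design uses the standard formulation of this benchmark: minimize the weight (u3 + 2) u2 u1², with u1 the wire diameter, u2 the mean coil diameter, and u3 the number of active coils, subject to four inequality constraints g ≤ 0. The sketch below (illustrative, not the authors' code) evaluates the standard model at the SMA-GM design of Table 15 and reproduces the weight and constraint values of Tables 15 and 16:

```python
# Standard tension/compression spring model, evaluated at the SMA-GM design
# (u1 = wire diameter, u2 = mean coil diameter, u3 = number of active coils).

def spring_weight(u1, u2, u3):
    return (u3 + 2.0) * u2 * u1 ** 2

def spring_constraints(u1, u2, u3):
    # Feasible design requires every g <= 0.
    g1 = 1.0 - (u2 ** 3 * u3) / (71785.0 * u1 ** 4)            # deflection
    g2 = ((4.0 * u2 ** 2 - u1 * u2)
          / (12566.0 * (u2 * u1 ** 3 - u1 ** 4))
          + 1.0 / (5108.0 * u1 ** 2) - 1.0)                    # shear stress
    g3 = 1.0 - 140.45 * u1 / (u2 ** 2 * u3)                    # surge frequency
    g4 = (u1 + u2) / 1.5 - 1.0                                 # outer diameter
    return [g1, g2, g3, g4]

u = (0.0514206, 0.350294, 11.6757)   # SMA-GM solution, Table 15
w = spring_weight(*u)                # ~0.012667, matching Table 15
gs = spring_constraints(*u)          # all <= ~0, matching Table 16
```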
Thakur, G.; Pal, A.; Mittal, N.; Rajiv, A.; Salgotra, R. Slime Mould Algorithm Based on a Gaussian Mutation for Solving Constrained Optimization Problems. Mathematics 2024, 12, 1470. https://doi.org/10.3390/math12101470
