Article

Solving Optimization Problems Using an Extended Gradient-Based Optimizer

1 College of Computing and Information Technology, University of Bisha, Bisha 61922, Saudi Arabia
2 Department of Computer, Damietta University, Damietta 34511, Egypt
Mathematics 2023, 11(2), 378; https://doi.org/10.3390/math11020378
Submission received: 10 December 2022 / Revised: 30 December 2022 / Accepted: 7 January 2023 / Published: 11 January 2023

Abstract: This paper proposes an improved method for solving diverse optimization problems, called EGBO. EGBO stands for the extended gradient-based optimizer; it improves the local search of the standard gradient-based optimizer (GBO) using expanded and narrowed exploration behaviors. This improvement aims to increase the ability of the GBO to explore a wide area of the search domain of the given problems. In this regard, the local escaping operator of the GBO is modified to apply the expanded and narrowed exploration behaviors. The effectiveness of the EGBO is evaluated using global optimization functions, namely the CEC2019 benchmark, and twelve benchmark feature selection datasets. The results are analyzed and compared to a set of well-known optimization methods using six performance measures, such as the average, minimum, maximum, and standard deviation of the fitness function, and the computation time. The EGBO shows promising results in terms of the performance measures, solving global optimization problems, achieving high accuracies when selecting significant features, and outperforming both the compared methods and the standard version of the GBO.

1. Introduction

Nowadays, data volumes are increasing daily, massively, and rapidly; data include many features and attributes, and some of these attributes may be irrelevant or redundant. Nevertheless, such data should be saved, processed, and retrieved with reasonable computational time and effort. Therefore, using data processing techniques that can handle such volumes is essential; one of these techniques is the feature selection strategy, which is used to select the essential subset of features. In this context, searching for the best attributes among many features is a challenge. Therefore, many feature selection models have been proposed to help solve this problem. For instance, the authors of [1] proposed a modified version of the salp swarm algorithm (SSA), called DSSA, to reduce the features of different benchmark datasets; the results showed the effectiveness of the DSSA and good classification accuracy compared to the other methods. Another work was performed by the authors of [2]; they introduced a feature selection stage to detect facial expressions from a real dataset using the sine–cosine algorithm (SCA); their method reduced the number of features by 87%.
Furthermore, the authors of [3] applied particle swarm optimization (PSO) with a fuzzy rough set to reduce the feature sets of benchmark datasets. The obtained feature sets were evaluated using decision tree and naive Bayes classifiers; the results showed their effectiveness in evaluating the selected features. The e-Jaya optimization algorithm was also effectively utilized by the authors of [4] to select the essential features of a group of essays in order to grade them in less time and with more accuracy. In this context, feature selection has been applied in medical applications, such as [5], where it was applied to help classify influenza A virus cases; the results showed good performance compared to the classic methods. The whale optimization algorithm (WOA) was also applied to feature selection by the authors of [6]. In that study, the WOA was enhanced by a pooling mechanism. The performance of the enhanced WOA was evaluated using two experiments: global optimization and feature selection. It was also evaluated in detecting coronavirus disease (COVID-19). The results showed the efficiency of the enhanced WOA in solving different problems. More applications of feature selection can be found in [7,8,9,10,11].
From this insight, this paper proposes a new method for feature selection. The proposed method, called EGBO, improves the gradient-based optimizer (GBO) using the expanded and narrowed exploration behaviors of the Aquila optimizer (AO) [12]. The effectiveness of the EGBO was evaluated using twelve well-known benchmark feature selection datasets and compared with several optimization algorithms.
The gradient-based optimizer is a recent optimization algorithm motivated by the gradient-based Newton method [13]. It has been successfully applied to several applications, such as identifying the parameters of photovoltaic systems [14]. The GBO was also applied by [15] to predict the infected, recovered, and death cases of the COVID-19 pandemic in the US. On the other hand, the Aquila optimizer (AO), proposed by [12], is a recent optimization algorithm that simulates the hunting behaviors of the Aquila. The AO has been used in several applications, such as forecasting oil production [16] and forecasting China’s rural community population [17]. Furthermore, the authors of [18] applied the GBO to identify the parameters of photovoltaic models. Their method’s performance was evaluated using the single-diode, double-diode, and three-diode models, as well as the photovoltaic module model. The results showed that their method obtained the best results and was highly competitive with the compared techniques. Similarly, the authors of [19] used an improved version of the GBO, called IGBO, for parameter extraction of photovoltaic models. The IGBO used two strategies, adaptive weights and chaotic behavior, to identify the adaptive parameters. The results showed competitive performance compared to the other methods. Furthermore, the GBO was also modified by the authors of [20] into a variant called GBOSMA. They applied the slime mould algorithm (SMA) as a local search to help the GBOSMA explore the search space; the experimental results demonstrated the superiority of their method. In this regard, the difference between the GBOSMA and this paper can be summarized as follows: the GBOSMA uses a probability to apply the operator of the SMA, whereas this paper extends the local search operator of the GBO using the expanded and narrowed exploration behaviors of the AO algorithm.
From the above studies, the GBO was used in different applications and showed promising results; it also has some advantages, such as ease of use and not having many predefined parameters that need to be optimized. However, it may show slow convergence in some cases and can be trapped in local optima; therefore, this paper introduces a new version of the GBO called EGBO to improve the local search phase of the standard GBO method.
The rest of this paper is arranged as follows: Section 2 describes the methods and materials used in this paper. Section 3 describes the proposed method. The experiments and results are presented in Section 4. Section 5 concludes the paper and lists future works.

2. Gradient-Based Optimizer

This section introduces a brief description of the gradient-based optimizer (GBO). The GBO is an optimization algorithm proposed by [13]. The optimization process of the GBO contains two main phases: gradient search rule (GSR) and local escaping operator (LEO). These phases are explained in the following subsections.

2.1. Gradient Search Rule (GSR)

The GSR is used to promote the exploration of the GBO. The following equations are applied to perform the GSR and to update the position $x_n^l$:
$X1_n^l = x_n^l - \mathrm{randn} \times \rho_1 \times \dfrac{2\,\Delta x \times x_n^l}{X_{worst} - X_{best} + \varepsilon} + \mathrm{rand} \times \rho_2 \times (X_{best} - x_n^l)$ (1)
$\rho_1 = 2 \times \mathrm{rand} \times \alpha - \alpha$ (2)
$\alpha = \left| \beta \times \sin\left( \frac{3\pi}{2} + \sin\left( \beta \times \frac{3\pi}{2} \right) \right) \right|$ (3)
$\beta = \beta_{min} + (\beta_{max} - \beta_{min}) \times \left( 1 - \left( \frac{l}{L} \right)^3 \right)^2$ (4)
where $\beta_{min} = 0.20$ and $\beta_{max} = 1.20$; $l$ and $L$ denote the current and total iterations, respectively; $\varepsilon$ is a random value in $[0, 0.1]$; and $\mathrm{randn}$ denotes a normally distributed random number. $\rho_2$ is computed using Equation (5).
$\rho_2 = 2 \times \mathrm{rand} \times \alpha - \alpha$ (5)
$\Delta x = \mathrm{rand}(1{:}N) \times |\mathit{step}|$ (6)
$\mathit{step} = \dfrac{(X_{best} - x_{r1}^l) + \delta}{2}$ (7)
$\delta = 2 \times \mathrm{rand} \times \left( \left| \dfrac{x_{r1}^l + x_{r2}^l + x_{r3}^l + x_{r4}^l}{4} - x_n^l \right| \right)$ (8)
where $r1$, $r2$, $r3$, and $r4$ are different integers randomly selected from $[1, N]$, and $\mathrm{rand}$ is a random number in $[0, 1]$. In this stage, $X2_n^l$ is computed as in Equation (9).
$X2_n^l = X_{best} - \mathrm{randn} \times \rho_1 \times \dfrac{2\,\Delta x \times x_n^l}{yp_n^l - yq_n^l + \varepsilon} + \mathrm{rand} \times \rho_2 \times (x_{r1}^l - x_{r2}^l)$ (9)
$yp_n = \mathrm{rand} \times \left( \dfrac{z_{n+1} + x_n}{2} + \mathrm{rand} \times \Delta x \right)$ (10)
$yq_n = \mathrm{rand} \times \left( \dfrac{z_{n+1} + x_n}{2} - \mathrm{rand} \times \Delta x \right)$ (11)
After that, the solution of the next iteration $X_n^{l+1}$ is computed, as in Equation (12), based on $X1_n^l$, $X2_n^l$, and the current solution $X_n^l$.
$X_n^{l+1} = r_a \times \left( r_b \times X1_n^l + (1 - r_b) \times X2_n^l \right) + (1 - r_a) \times X3_n^l$ (12)
$X3_n^l = X_n^l - \rho_1 \times (X2_n^l - X1_n^l)$ (13)
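To make the GSR concrete, the following Python sketch applies Equations (1)–(13) to a single solution; the vectorized per-solution form, the NumPy-based random draws, and the approximation of $z_{n+1}$ by $X1_n^l$ are assumptions of this sketch rather than the paper's reference implementation.

```python
import numpy as np

def gsr_update(x_n, X_best, X_worst, pop, l, L, beta_min=0.2, beta_max=1.2):
    """One GSR step (Eqs. (1)-(13)) for solution x_n; an illustrative sketch."""
    N, D = pop.shape
    eps = np.random.uniform(0, 0.1)                                           # epsilon in [0, 0.1]
    beta = beta_min + (beta_max - beta_min) * (1 - (l / L) ** 3) ** 2         # Eq. (4)
    alpha = abs(beta * np.sin(3 * np.pi / 2 + np.sin(beta * 3 * np.pi / 2)))  # Eq. (3)
    rho1 = 2 * np.random.rand() * alpha - alpha                               # Eq. (2)
    rho2 = 2 * np.random.rand() * alpha - alpha                               # Eq. (5)

    r1, r2, r3, r4 = np.random.choice(N, 4, replace=False)                    # random indices in [1, N]
    delta = 2 * np.random.rand() * np.abs((pop[r1] + pop[r2] + pop[r3] + pop[r4]) / 4 - x_n)  # Eq. (8)
    step = (X_best - pop[r1] + delta) / 2                                     # Eq. (7)
    dx = np.random.rand(D) * np.abs(step)                                     # Eq. (6), drawn per dimension here

    # Eq. (1): first candidate, pushed away from the worst and toward the best solution
    X1 = x_n - np.random.randn() * rho1 * (2 * dx * x_n) / (X_worst - X_best + eps) \
         + np.random.rand() * rho2 * (X_best - x_n)
    # Eqs. (10)-(11): z_{n+1} of the original GBO is approximated by X1 in this sketch
    yp = np.random.rand() * ((X1 + x_n) / 2 + np.random.rand() * dx)
    yq = np.random.rand() * ((X1 + x_n) / 2 - np.random.rand() * dx)
    # Eq. (9): second candidate, centered on the best solution
    X2 = X_best - np.random.randn() * rho1 * (2 * dx * x_n) / (yp - yq + eps) \
         + np.random.rand() * rho2 * (pop[r1] - pop[r2])
    X3 = x_n - rho1 * (X2 - X1)                                               # Eq. (13)

    ra, rb = np.random.rand(), np.random.rand()
    return ra * (rb * X1 + (1 - rb) * X2) + (1 - ra) * X3                     # Eq. (12)
```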

2.2. Local Escaping Operator (LEO)

The LEO is applied to improve the efficiency of the GBO algorithm. This phase is triggered when a random number is below the probability threshold $pr = 0.5$ (see Algorithm 1). It then uses another random number ($pr2$) to update the solution using Equation (14) if $pr2 > 0.5$; otherwise, it updates the solution using Equation (15).
$X_n^{l+1} = X_n^{l+1} + f_1 \times (u_1 \times X_{best} - u_2 \times x_k^l) + f_2 \times \rho_1 \times \dfrac{u_3 \times (X2_n^l - X1_n^l) + u_2 \times (x_{r1}^l - x_{r2}^l)}{2}$ (14)
$X_n^{l+1} = X_{best} + f_1 \times (u_1 \times X_{best} - u_2 \times x_k^l) + f_2 \times \rho_1 \times \dfrac{u_3 \times (X2_n^l - X1_n^l) + u_2 \times (x_{r1}^l - x_{r2}^l)}{2}$ (15)
where $x_{r1}^l$, $x_{r2}^l$, and $x_k^l$ are random solutions, and $f_1$ and $f_2$ are random numbers. $u_1$, $u_2$, and $u_3$ are generated as in Equations (16)–(18).
$u_1 = \begin{cases} 2 \times \mathrm{rand} & \text{if } \lambda_1 < 0.5 \\ 1 & \text{otherwise} \end{cases}$ (16)
$u_2 = \begin{cases} \mathrm{rand} & \text{if } \lambda_1 < 0.5 \\ 1 & \text{otherwise} \end{cases}$ (17)
$u_3 = \begin{cases} \mathrm{rand} & \text{if } \lambda_1 < 0.5 \\ 1 & \text{otherwise} \end{cases}$ (18)
where $\mathrm{rand}$ and $\lambda_1$ are generated randomly in $[0, 1]$. $x_k^l$ is computed as in Equation (19).
$x_k^l = L_2 \times x_p^l + (1 - L_2) \times x_{rand}$ (19)
where $L_2$ is a binary variable generated randomly. Algorithm 1 shows the structure of the GBO.
Algorithm 1 Gradient-based optimizer.
1: Define the global parameters L, N, and D, as well as the initial population, X.
2: Compute the fitness function for population X.
3: Select the best solution X_best and the worst one X_worst.
4: Start the main optimization process.
5: for l = 1 to L do
6:     for n = 1 to N do
7:         for i = 1 to D do
8:             Generate the random variables to apply the GSR.
9:             Update the current position using the GSR phase.
10:        end for
11:        Generate the random variables to apply the LEO.
12:        if rand() < 0.5 then
13:            Update the current position using the LEO phase.
14:        end if
15:        Update the best and worst positions.
16:    end for
17: end for
18: Return the best result.
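For readers who want to relate Algorithm 1 to code, a minimal Python skeleton of the main loop is sketched below; the objective function `fobj`, the scalar bounds `lb`/`ub`, the greedy replacement step, and the helper `leo_update` (standing in for Equations (14)–(19)) are assumptions of this outline, not the authors' implementation.

```python
import numpy as np

def gbo(fobj, lb, ub, N=25, D=10, L=100, pr=0.5):
    """Skeleton of the GBO main loop (Algorithm 1); illustrative only."""
    pop = np.random.rand(N, D) * (ub - lb) + lb                  # initial population
    fit = np.array([fobj(x) for x in pop])
    X_best, X_worst = pop[fit.argmin()].copy(), pop[fit.argmax()].copy()

    for l in range(1, L + 1):
        for n in range(N):
            # GSR phase (Eqs. (1)-(13)); applied dimension-wise in the paper, vectorized here
            cand = np.clip(gsr_update(pop[n], X_best, X_worst, pop, l, L), lb, ub)
            # LEO phase (Eqs. (14)-(19)), triggered with probability pr
            if np.random.rand() < pr:
                cand = leo_update(cand, X_best, pop)             # hypothetical helper for the LEO equations
            f_cand = fobj(cand)
            if f_cand < fit[n]:                                  # greedy selection (assumption of this sketch)
                pop[n], fit[n] = cand, f_cand
            X_best = pop[fit.argmin()].copy()                    # update the best and worst positions
            X_worst = pop[fit.argmax()].copy()
    return X_best, fit.min()
```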

3. Proposed Method

This section explains the structure of the proposed EGBO method. The EGBO aims to improve the exploration phase of the standard GBO by using Lévy flight as well as the expanded and narrowed exploration behaviors of the AO algorithm [12]. In this regard, the local escaping operator of the GBO is extended to include the expanded and narrowed exploration behaviors. This modification adds more flexibility and reliability to the GBO when exploring different hidden areas in the search space, which helps it reach the optimal point more effectively.
The EGBO starts by generating its population as in Equation (20), then evaluates it to determine the best solutions.
$X = r(N, D) \times (ub - lb) + lb$ (20)
where $N$ and $D$ denote the population size and the dimension of the given problem, respectively; $r$ is a uniform random function; and $lb$ and $ub$ denote the lower and upper bounds.
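As a minimal illustration of Equation (20), assuming NumPy and example values for the bounds and problem size, the population can be initialized as follows:

```python
import numpy as np

N, D = 25, 30                               # population size and problem dimension (example values)
lb, ub = -100.0, 100.0                      # lower and upper bounds (example values)
X = np.random.rand(N, D) * (ub - lb) + lb   # Eq. (20): uniform random population within [lb, ub]
```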
Then, the GSR is applied to accelerate the convergence behavior. After that, the local escaping operator phase is used to update the current solution. In this stage, a random variable (rand) is checked to decide whether to run the improvement phase, which applies either an expanded or a narrowed exploration according to a second random variable: if rand2() > 0.5, the expanded exploration is run; otherwise, the narrowed exploration is run.
The EGBO uses Equation (21) to apply the expanded exploration as follows:
$X = X_{best} \times \left( 1 - \frac{l}{L} \right) + (X_a - X_{best}) \times \mathrm{rand}()$ (21)
where $X_{best}$ denotes the best solution obtained so far, $l$ and $L$ denote the current and total iterations, and $X_a$ is the average of the current solutions, calculated as:
$X_a = \frac{1}{N} \sum_{i=1}^{N} x_i$ (22)
Furthermore, the EGBO uses Equation (23) to apply the narrowed exploration. It also applies the Lévy flight distribution to update the solutions as follows:
$X = X_{best} \times \mathrm{Levy}(D) + X_r + (y - q) \times \mathrm{rand}()$ (23)
where $\mathrm{Levy}(D)$ applies the Lévy flight distribution as in Equations (24) and (25), and $X_r$ is a random solution.
$\mathrm{Levy}(D) = s \times \dfrac{u \times \sigma}{|\delta|^{\frac{1}{\beta_3}}}$ (24)
$\sigma = \dfrac{\Gamma(1 + \beta_3) \times \sin\left( \frac{\pi \beta_3}{2} \right)}{\Gamma\left( \frac{1 + \beta_3}{2} \right) \times \beta_3 \times 2^{\left( \frac{\beta_3 - 1}{2} \right)}}$ (25)
where s = 0.010 and β 3 = 1.50 . u and δ denote random numbers ∈ [0,1]. y and q are two variables that simulate the spiral shape, and they are calculated as in Equations (26)–(28), respectively.
$y = r \times \cos(\theta), \quad q = r \times \sin(\theta)$ (26)
$r = r_0 + U \times d_1, \quad d_1 = 1, 2, \ldots, D$ (27)
$\theta = \omega \times d_1 + \theta_1, \quad \theta_1 = \frac{3\pi}{2}$ (28)
where $r_0$ is a random value in $[0, 10]$, $U = 0.00565$, and $\omega = 0.005$; these values are selected based on the recommendation of the AO study [12].
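The expanded and narrowed exploration steps (Equations (21)–(28)) can be sketched in Python as follows; the function names, the NumPy-based Lévy-flight implementation, and the way the random solution $X_r$ is drawn are assumptions consistent with the formulas above, not the original code.

```python
import numpy as np
from math import gamma

def levy(D, s=0.01, beta3=1.5):
    """Levy flight step of dimension D (Eqs. (24)-(25))."""
    sigma = (gamma(1 + beta3) * np.sin(np.pi * beta3 / 2) /
             (gamma((1 + beta3) / 2) * beta3 * 2 ** ((beta3 - 1) / 2)))    # Eq. (25)
    u, d = np.random.rand(D), np.random.rand(D)
    return s * u * sigma / np.abs(d) ** (1 / beta3)                        # Eq. (24)

def expanded_exploration(X_best, pop, l, L):
    """Eq. (21): move between the best solution and the population mean."""
    X_a = pop.mean(axis=0)                                                 # Eq. (22)
    return X_best * (1 - l / L) + (X_a - X_best) * np.random.rand()

def narrowed_exploration(X_best, pop, U=0.00565, omega=0.005):
    """Eq. (23): Levy-flight move around a random solution with a spiral term."""
    N, D = pop.shape
    X_r = pop[np.random.randint(N)]                                        # a random solution (assumption)
    d1 = np.arange(1, D + 1)
    r = np.random.uniform(0, 10) + U * d1                                  # Eq. (27), with r0 drawn in [0, 10]
    theta = omega * d1 + 3 * np.pi / 2                                     # Eq. (28)
    y, q = r * np.cos(theta), r * np.sin(theta)                            # Eq. (26)
    return X_best * levy(D) + X_r + (y - q) * np.random.rand()
```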
After that, the best solution is determined and saved using the fitness function. This sequence is repeated until the stop condition is reached. Finally, the best result is presented. Algorithm 2 shows the pseudo-code of the proposed EGBO method.
Algorithm 2 Pseudo-code of the proposed EGBO method.
1: First: Define the global parameters L, N, and D, as well as the initial population X.
2: Perform the fitness function to evaluate the population and to define the best solution X_best and the worst one X_worst.
3: Second: Start the main processing.
4: for l = 1 to L do
5:     for n = 1 to N do
6:         for i = 1 to D do
7:             Generate random variables.
8:             Update the current position using the GSR phase.
9:         end for
10:        Start the LEO phase.
11:        if rand < 0.5 then
12:            Compute the solution using the LEO phase.
13:            if rand < 0.5 then
14:                if rand < 0.5 then
15:                    Apply the expanded exploration.
16:                else
17:                    Apply the narrowed exploration.
18:                end if
19:            end if
20:        end if
21:        Update the best and worst positions.
22:    end for
23: end for
24: Return the best result.
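To make the nested conditions of Algorithm 2 (lines 11–20) concrete, the extended LEO step can be sketched as below; `leo_update` is the hypothetical helper from the GBO skeleton above, and the 0.5 thresholds follow the pseudo-code.

```python
import numpy as np

def egbo_leo_step(x_new, X_best, pop, l, L):
    """Extended LEO step of the EGBO (Algorithm 2, lines 11-20); illustrative sketch."""
    if np.random.rand() < 0.5:                          # trigger the LEO phase
        x_new = leo_update(x_new, X_best, pop)          # standard LEO, Eqs. (14)-(19) (hypothetical helper)
        if np.random.rand() < 0.5:                      # trigger the added exploration step
            if np.random.rand() < 0.5:
                x_new = expanded_exploration(X_best, pop, l, L)   # Eq. (21)
            else:
                x_new = narrowed_exploration(X_best, pop)         # Eq. (23)
    return x_new
```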

4. Experiment and Results

In this section, the proposed EGBO is evaluated in two experiments; the first is global optimization, and the second is feature selection. All results of the proposed EGBO are compared to seven algorithms: the GBO, the Aquila optimizer (AO) [12], particle swarm optimization (PSO) [21], the genetic algorithm (GA) [22], differential evolution (DE) [23], the dragonfly algorithm (DA) [24], and moth-flame optimization (MFO) [25]. These algorithms were selected because they have shown stability and good results in the literature. They have different behaviors in exploring the search space; for instance, the PSO uses the particle's position and velocity, whereas the GA uses three phases: selection, crossover, and mutation.
The parameters of all algorithms were set as mentioned in their original papers, whereas the global parameters were as follows: the number of search agents = 25 and the maximum number of iterations = 100. The number of fitness function evaluations was set to 2500 as a stopping condition, and each algorithm was applied in 30 independent runs.

4.1. First Experiment: Solving Global Optimization Problem

In this section, the proposed EGBO method is assessed using CEC2019 [26] benchmark functions to solve global optimization functions and the results are compared to some well-known algorithms. This experiment aims to evaluate the proposed EGBO in solving different types of test functions.
The comparison is performed using a set of performance measures: the mean (Equation (29)), minimum, maximum, and standard deviation (Equation (30)) of the fitness values, as well as the computation times of all algorithms. All experimental results are listed in Table 1, Table 2, Table 3, Table 4 and Table 5.
$mean = \frac{1}{N} \sum_{i=1}^{N} f_i$ (29)
$Std = \sqrt{ \frac{1}{N} \sum_{i=1}^{N} \left( f_i - \hat{f} \right)^2 }$ (30)
where $f_i$ is the produced fitness value, $\hat{f}$ denotes the mean value of $f$, and $N$ indicates the size of the sample.
Table 1 tabulates the results of the mean of the fitness function measure. In that table, the proposed EGBO outperformed the other methods in six out of ten functions (i.e., F1–F5, F9); therefore, it was ranked first. The GA was ranked second by obtaining the best values in two functions, F7 and F8. The GBO came in the third rank with the best values in three functions, F1–F3, followed by the MFO. The rest of the algorithms were ordered: the AO, DE, DA, and PSO, respectively. Figure 1 illustrates the average fitness function values for all methods over all functions.
The results of the standard deviation measure are reported in Table 2 to show the stability of each method. For this measure, the smallest value indicates good stability across the independent runs. The results show that the proposed EGBO achieved the best stability in 70% of the functions compared to the other methods, followed by the DE and GBO. The GA, AO, and MFO came in the fourth, fifth, and sixth ranks.
The results of the max measure are listed in Table 3. In this table, the worst value of each algorithm for each function is recorded. As seen in the table, the worst values of the proposed EGBO were better than the compared methods in six out of ten functions (i.e., F1–F3, F5, and F8–F9). The GA was ranked second by obtaining the best values in both F6 and F10. The MFO and GBO came in the third and fourth ranks, followed by the DE and AO.
Furthermore, the results of the min measure are presented in Table 4. For this measure, the best values obtained so far by each algorithm are recorded. As seen in the results, the proposed EGBO obtained the best values in four out of ten functions (i.e., F1, F2, F5, and F9) and obtained good results in the rest of the functions. The GA ranked second by obtaining the minimum values in the F4 and F8 functions. The GBO came in third, followed by the MFO, DA, DE, PSO, and AO.
The computation time is also considered in Table 5. For this measure, the MFO algorithm was the fastest among all methods, followed by the PSO and AO. The EGBO consumed an acceptable computation time for each function and was ranked fourth, followed by the GBO, GA, and DE. The slowest algorithm was the DA; it recorded the longest computation time for each function. Figure 2 illustrates the average computation times for all methods over all functions.
Moreover, Figure 3 illustrates the convergence curves for all methods during the optimization process. From this figure, it can be seen that the proposed EGBO can effectively maintain the populations to converge toward the optimal value as shown in F1, F4, F5, and F9. The PSO, GBO, and MFO also showed second-best convergence. In contrast, the DA algorithm showed the worst convergence in most functions.
Furthermore, Table 6 records the Wilcoxon rank-sum test as a statistical test to check if there is a statistical difference between the EGBO and the compared algorithms at a p-value < 0.05. The results, as recorded in Table 6, show significant differences between the proposed EGBO and the compared algorithms, especially with MFO, DA, DE, and AO, which indicates the effectiveness of the EGBO in solving global optimization problems.
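As an illustration of how such a test can be carried out, the SciPy rank-sum test can compare the 30-run fitness samples of the EGBO against another algorithm on one function; the data below are placeholders, not the study's actual results.

```python
import numpy as np
from scipy.stats import ranksums

egbo_runs = np.random.rand(30)       # placeholder for 30 EGBO fitness values on one function
other_runs = np.random.rand(30)      # placeholder for 30 fitness values of a compared algorithm

stat, p_value = ranksums(egbo_runs, other_runs)
print(f"p-value = {p_value:.4f}; significant at 0.05: {p_value < 0.05}")
```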
In light of the above results, the proposed EGBO method can effectively solve global optimization problems and provide promising results compared to the other methods. Therefore, in the next section, it will be evaluated in solving different feature selection problems.

4.2. Second Experiment: Solving Feature Selection Problem

In this part, the proposed EGBO method is assessed in solving different feature selection problems using twelve well-known feature selection datasets from [27]; Table 7 lists their descriptions.
The performance of the proposed method is evaluated by five measures, namely fitness value (Equation (31)), accuracy (Acc) as in Equation (32), standard deviation (Std) as in Equation (30), number of selected features, and the computation time for each algorithm. The results are recorded in Table 8, Table 9, Table 10, Table 11, Table 12 and Table 13.
$f(x_i(t)) = \xi\, E_{x_i(t)} + (1 - \xi) \left( \frac{c}{C} \right)$ (31)
where $E_{x_i(t)}$ denotes the error value of the classification process (kNN is used as the classifier in this paper). The second term accounts for the number of selected features. $\xi \in [0, 1]$ is used to balance the number of selected features and the classification error; $c$ and $C$ are the number of selected features and the total number of features, respectively.
$Accuracy\,(Acc) = \dfrac{TP_1 + TN_1}{TP_1 + TN_1 + FP_1 + FN_1}$ (32)
where $TP_1$ is the number of positive samples correctly classified and $TN_1$ is the number of negative samples correctly classified, whereas $FP_1$ and $FN_1$ are the numbers of negative and positive samples incorrectly classified, respectively.
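A sketch of this wrapper fitness evaluation (Equations (31) and (32)) is given below, assuming a binary feature mask, a scikit-learn kNN classifier with k = 5, a hold-out split, and ξ = 0.99; these settings are illustrative assumptions, since the paper does not list its exact implementation details.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

def fs_fitness(mask, X, y, xi=0.99):
    """Eq. (31): xi * classification error + (1 - xi) * (selected features / all features)."""
    if mask.sum() == 0:                        # penalize an empty feature subset
        return 1.0
    Xs = X[:, mask.astype(bool)]               # keep only the selected columns
    X_tr, X_te, y_tr, y_te = train_test_split(Xs, y, test_size=0.2, random_state=0)
    knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
    acc = knn.score(X_te, y_te)                # Eq. (32): (TP + TN) / all test samples
    return xi * (1.0 - acc) + (1 - xi) * mask.sum() / X.shape[1]
```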
Table 8 shows the results of the fitness function values for all algorithms over all datasets. These results indicate that the proposed EGBO obtained the best fitness value among the compared algorithms and performed well in all datasets. The DE obtained the second-best results in 75% of the datasets. The PSO came in third, followed by GBO, GA, DA, AO, and MFO. Figure 4 illustrates the average of this measure over all datasets, which shows that the EGBO obtained the smallest average among all methods.
The stability results of all algorithms are listed in Table 9. As shown in that table, the proposed EGBO was the most stable algorithm in 7 out of 12 datasets: glass, tic-tac-toe, waveform, clean1data, SPECT, Zoo, and heart. The GA and DE ranked second and third for stability, respectively, followed by the PSO, DA, and GBO.
As mentioned above, this experiment aims to decrease each dataset's features by deleting the redundant descriptors and keeping the best ones. Therefore, Table 10 reports the ratio of selected features for each dataset. From Table 10, we can see that the EGBO chose the smallest number of features in 9 out of 12 datasets and showed good performance in the remaining datasets. The GBO was ranked second, followed by the MFO, AO, DE, PSO, and GA. In this regard, the smallest number of features is not always the best; therefore, we used the classification accuracy measure to evaluate the features obtained by each algorithm.
Consequently, the classification accuracy measure was used to evaluate the proposed method’s ability to classify the samples of each dataset correctly. The results of this measure were recorded in Table 11. This table shows that the EGBO ranked first; it outperformed the other methods and obtained high-accuracy results in all datasets. The DE came in the second rank, whereas the PSO obtained the third, followed by the GBO, GA, DA, and AO. In contrast, the MFO recorded the worst accuracy measures in most datasets.
The computation times for all methods were also measured. Table 12 records the time consumed by each algorithm. By analyzing the results, we can note that all algorithms consumed similar times to some extent. In detail, the EGBO was the fastest in 59% of the datasets, followed by PSO, MFO, DE, GBO, and DA, respectively, whereas the GA and AO recorded the longest computation time.
Furthermore, an example of the convergence curves for all methods is illustrated in Figure 5. In this figure, a sample of the datasets is presented, which shows that the proposed EGBO method converged to the optimal values faster than the compared algorithms, which indicates the good convergence behaviors of the EGBO when solving feature selection problems.
In addition, Table 13 shows the Wilcoxon rank-sum test as a statistical test to check whether there is a statistical difference between the proposed method and the other algorithms at a p-value < 0.05. From this table, we can see that there are significant differences between the proposed method and the other algorithms in most datasets, indicating the EGBO's effectiveness in solving different feature selection problems.
From the previous results, the proposed EGBO method outperformed the compared algorithms in most cases in solving global optimization problems and selecting the most relevant features. These promising results can be attributed to a few reasons; e.g., the local escaping operator of the GBO was extended to include the expanded and narrowed exploration behaviors, which added more flexibility and reliability to the traditional GBO algorithm. This extension increased the ability of the GBO to explore more areas in the search space, which helped it reach the optimal point more effectively. On the other hand, although the EGBO obtained good results in most cases, it failed to show the best results in the computation time measure, namely in the global optimization experiment. This defect can be attributed to the traditional GBO consuming a relatively longer time than the compared algorithms when performing an optimization task. Therefore, this defect can be studied in future work. Generally, the exploration behaviors used in the EGBO increased the performance of the traditional algorithm in terms of the performance measures.

5. Conclusions

This paper proposed a new version of the gradient-based optimizer (GBO), called EGBO, to solve diverse optimization problems, namely global optimization and feature selection. The local search of the GBO was improved using expanded and narrowed exploration behaviors to increase its ability to explore broad areas in the search space. The effectiveness of the EGBO was checked and evaluated using the CEC2019 global optimization functions and twelve benchmark feature selection datasets. The results were analyzed and compared to a set of well-known optimization algorithms using six performance measures. The EGBO showed promising results in solving global optimization problems, recording the highest accuracy, selecting the most significant features, and outperforming the compared methods and the traditional version of the GBO in terms of the performance measures. However, the computation time of the EGBO needs to be improved. In future works, the EGBO will be evaluated and applied in parameter estimation, image segmentation, and classifier optimization. In addition, the complexity of the EGBO will be kept under control, and its initial population will be improved using chaotic maps to add more diversity to the search space.

Funding

This research received no external funding.

Data Availability Statement

The data presented in this study are openly available in [26,27].

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Tubishat, M.; Ja’afar, S.; Alswaitti, M.; Mirjalili, S.; Idris, N.; Ismail, M.A.; Omar, M.S. Dynamic salp swarm algorithm for feature selection. Expert Syst. Appl. 2021, 164, 113873. [Google Scholar] [CrossRef]
  2. Ewees, A.A.; ElLaban, H.A.; ElEraky, R.M. Features selection for facial expression recognition. In Proceedings of the 2019 10th International Conference on Computing, Communication and Networking Technologies (ICCCNT), Kanpur, India, 6–8 July 2019; pp. 1–6. [Google Scholar]
  3. Huda, R.K.; Banka, H. Efficient feature selection methods using PSO with fuzzy rough set as fitness function. Soft Comput. 2022, 26, 2501–2521. [Google Scholar] [CrossRef]
  4. Gaheen, M.M.; ElEraky, R.M.; Ewees, A.A. Automated students arabic essay scoring using trained neural network by e-jaya optimization to support personalized system of instruction. Educ. Inf. Technol. 2021, 26, 1165–1181. [Google Scholar] [CrossRef]
  5. Ewees, A.A.; Al-qaness, M.A.; Abualigah, L.; Oliva, D.; Algamal, Z.Y.; Anter, A.M.; Ali Ibrahim, R.; Ghoniem, R.M.; Abd Elaziz, M. Boosting Arithmetic Optimization Algorithm with Genetic Algorithm Operators for Feature Selection: Case Study on Cox Proportional Hazards Model. Mathematics 2021, 9, 2321. [Google Scholar] [CrossRef]
  6. Nadimi-Shahraki, M.H.; Zamani, H.; Mirjalili, S. Enhanced whale optimization algorithm for medical feature selection: A COVID-19 case study. Comput. Biol. Med. 2022, 148, 105858. [Google Scholar] [CrossRef]
  7. Zhang, Y.; Liu, R.; Wang, X.; Chen, H.; Li, C. Boosted binary Harris hawks optimizer and feature selection. Eng. Comput. 2021, 37, 3741–3770. [Google Scholar] [CrossRef]
  8. Banerjee, D.; Chatterjee, B.; Bhowal, P.; Bhattacharyya, T.; Malakar, S.; Sarkar, R. A new wrapper feature selection method for language-invariant offline signature verification. Expert Syst. Appl. 2021, 186, 115756. [Google Scholar] [CrossRef]
  9. Sathiyabhama, B.; Kumar, S.U.; Jayanthi, J.; Sathiya, T.; Ilavarasi, A.; Yuvarajan, V.; Gopikrishna, K. A novel feature selection framework based on grey wolf optimizer for mammogram image analysis. Neural Comput. Appl. 2021, 33, 14583–14602. [Google Scholar] [CrossRef]
  10. Ewees, A.A.; Ismail, F.H.; Ghoniem, R.M. Wild Horse Optimizer-Based Spiral Updating for Feature Selection. IEEE Access 2022, 10, 106258–106274. [Google Scholar] [CrossRef]
  11. Bandyopadhyay, R.; Basu, A.; Cuevas, E.; Sarkar, R. Harris Hawks optimisation with Simulated Annealing as a deep feature selection method for screening of COVID-19 CT-scans. Appl. Soft Comput. 2021, 111, 107698. [Google Scholar] [CrossRef]
  12. Abualigah, L.; Yousri, D.; Abd Elaziz, M.; Ewees, A.A.; Al-qaness, M.A.; Gandomi, A.H. Aquila Optimizer: A novel meta-heuristic optimization Algorithm. Comput. Ind. Eng. 2021, 157, 107250. [Google Scholar] [CrossRef]
  13. Ahmadianfar, I.; Bozorg-Haddad, O.; Chu, X. Gradient-based optimizer: A new metaheuristic optimization algorithm. Inf. Sci. 2020, 540, 131–159. [Google Scholar] [CrossRef]
  14. Ahmadianfar, I.; Gong, W.; Heidari, A.A.; Golilarz, N.A.; Samadi-Koucheksaraee, A.; Chen, H. Gradient-based optimization with ranking mechanisms for parameter identification of photovoltaic systems. Energy Rep. 2021, 7, 3979–3997. [Google Scholar] [CrossRef]
  15. Khalilpourazari, S.; Doulabi, H.H.; Çiftçioğlu, A.Ö.; Weber, G.W. Gradient-based grey wolf optimizer with Gaussian walk: Application in modelling and prediction of the COVID-19 pandemic. Expert Syst. Appl. 2021, 177, 114920. [Google Scholar] [CrossRef] [PubMed]
  16. AlRassas, A.M.; Al-qaness, M.A.; Ewees, A.A.; Ren, S.; Abd Elaziz, M.; Damaševičius, R.; Krilavičius, T. Optimized ANFIS model using Aquila Optimizer for oil production forecasting. Processes 2021, 9, 1194. [Google Scholar] [CrossRef]
  17. Ma, L.; Li, J.; Zhao, Y. Population Forecast of China’s Rural Community Based on CFANGBM and Improved Aquila Optimizer Algorithm. Fractal Fract. 2021, 5, 190. [Google Scholar] [CrossRef]
  18. Zhou, W.; Wang, P.; Heidari, A.A.; Zhao, X.; Turabieh, H.; Chen, H. Random learning gradient based optimization for efficient design of photovoltaic models. Energy Convers. Manag. 2021, 230, 113751. [Google Scholar] [CrossRef]
  19. Jiang, Y.; Luo, Q.; Zhou, Y. Improved gradient-based optimizer for parameters extraction of photovoltaic models. IET Renew. Power Gener. 2022, 16, 1602–1622. [Google Scholar] [CrossRef]
  20. Ewees, A.A.; Ismail, F.H.; Sahlol, A.T. Gradient-based optimizer improved by Slime Mould Algorithm for global optimization and feature selection for diverse computation problems. Expert Syst. Appl. 2023, 213, 118872. [Google Scholar] [CrossRef]
  21. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  22. Mitchell, M. An Introduction to Genetic Algorithms; MIT Press: Cambridge, MA, USA, 1998. [Google Scholar]
  23. Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  24. Mirjalili, S. Dragonfly algorithm: A new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems. Neural Comput. Appl. 2016, 27, 1053–1073. [Google Scholar] [CrossRef]
  25. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  26. Price, K.; Awad, N.; Ali, M.; Suganthan, P. Problem definitions and evaluation criteria for the 100-digit challenge special session and competition on single objective numerical optimization. In Technical Report; Nanyang Technological University: Singapore, 2018. [Google Scholar]
  27. Dua, D.; Graff, C. UCI Machine Learning Repository, 2019; University of California, Irvine, School of Information and Computer Sciences: Irvine, CA, USA, 2019. [Google Scholar]
Figure 1. Average fitness function values for all methods in all functions.
Figure 2. Average computation times for all methods in all functions.
Figure 3. Convergence curves for the proposed EGBO and the compared methods.
Figure 4. Average of the fitness function values for all methods.
Figure 5. Example of the convergence curves for the proposed EGBO and the compared methods.
Table 1. Results of the average measure of the fitness function values.
Function | EGBO | GBO | PSO | GA | MFO | DA | DE | AO
F1 | 4.43 × 10^4 | 5.01 × 10^4 | 5.83 × 10^12 | 6.19 × 10^10 | 5.60 × 10^10 | 7.14 × 10^10 | 1.49 × 10^11 | 7.53 × 10^4
F2 | 1.73 × 10^1 | 1.73 × 10^1 | 1.70 × 10^4 | 2.61 × 10^1 | 1.74 × 10^1 | 4.10 × 10^1 | 1.82 × 10^1 | 1.75 × 10^1
F3 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1
F4 | 7.62 × 10^1 | 9.22 × 10^1 | 3.68 × 10^3 | 1.27 × 10^2 | 2.13 × 10^2 | 9.09 × 10^2 | 8.84 × 10^1 | 3.61 × 10^3
F5 | 1.18 × 10^0 | 1.19 × 10^0 | 2.58 × 10^0 | 1.19 × 10^0 | 1.38 × 10^0 | 1.59 × 10^0 | 1.61 × 10^0 | 2.29 × 10^0
F6 | 1.17 × 10^1 | 1.20 × 10^1 | 1.09 × 10^1 | 7.65 × 10^0 | 7.21 × 10^0 | 8.80 × 10^0 | 1.06 × 10^1 | 1.19 × 10^1
F7 | 4.30 × 10^2 | 4.68 × 10^2 | 4.31 × 10^2 | 3.38 × 10^2 | 3.58 × 10^2 | 6.55 × 10^2 | 6.34 × 10^2 | 7.20 × 10^2
F8 | 6.00 × 10^0 | 5.86 × 10^0 | 5.78 × 10^0 | 5.41 × 10^0 | 5.76 × 10^0 | 5.92 × 10^0 | 6.51 × 10^0 | 6.34 × 10^0
F9 | 3.01 × 10^0 | 3.41 × 10^0 | 1.59 × 10^2 | 3.44 × 10^0 | 4.08 × 10^0 | 3.10 × 10^1 | 3.23 × 10^0 | 2.68 × 10^2
F10 | 2.06 × 10^1 | 2.06 × 10^1 | 2.05 × 10^1 | 2.02 × 10^1 | 2.01 × 10^1 | 2.02 × 10^1 | 2.04 × 10^1 | 2.06 × 10^1
Boldface indicates the best result.
Table 2. Results of the Std measure of the fitness function values.
Function | EGBO | GBO | PSO | GA | MFO | DA | DE | AO
F1 | 3.23 × 10^3 | 6.44 × 10^3 | 3.17 × 10^12 | 8.14 × 10^10 | 4.80 × 10^10 | 9.09 × 10^10 | 6.23 × 10^10 | 1.74 × 10^4
F2 | 2.79 × 10^5 | 6.89 × 10^5 | 4.73 × 10^3 | 2.51 × 10^1 | 1.77 × 10^2 | 7.29 × 10^1 | 5.83 × 10^1 | 8.72 × 10^2
F3 | 6.76 × 10^11 | 3.36 × 10^9 | 4.44 × 10^4 | 1.06 × 10^6 | 2.61 × 10^4 | 7.28 × 10^4 | 4.27 × 10^5 | 3.75 × 10^5
F4 | 3.73 × 10^1 | 6.89 × 10^1 | 2.54 × 10^3 | 8.82 × 10^1 | 3.16 × 10^2 | 7.30 × 10^2 | 1.34 × 10^1 | 1.43 × 10^3
F5 | 7.45 × 10^2 | 1.14 × 10^1 | 7.45 × 10^1 | 1.21 × 10^1 | 1.62 × 10^1 | 2.41 × 10^1 | 9.31 × 10^2 | 2.19 × 10^1
F6 | 8.67 × 10^1 | 5.07 × 10^1 | 9.32 × 10^1 | 1.05 × 10^0 | 1.60 × 10^0 | 1.70 × 10^0 | 5.43 × 10^1 | 8.56 × 10^1
F7 | 1.96 × 10^2 | 2.38 × 10^2 | 2.32 × 10^2 | 1.87 × 10^2 | 2.06 × 10^2 | 3.03 × 10^2 | 1.26 × 10^2 | 2.64 × 10^2
F8 | 3.57 × 10^1 | 6.77 × 10^1 | 8.36 × 10^1 | 7.76 × 10^1 | 6.22 × 10^1 | 6.22 × 10^1 | 3.74 × 10^1 | 5.90 × 10^1
F9 | 2.62 × 10^1 | 5.56 × 10^1 | 2.40 × 10^2 | 6.62 × 10^1 | 8.12 × 10^1 | 5.91 × 10^1 | 3.13 × 10^1 | 1.29 × 10^2
F10 | 6.56 × 10^2 | 1.44 × 10^1 | 1.54 × 10^1 | 7.52 × 10^2 | 1.20 × 10^1 | 1.33 × 10^1 | 6.65 × 10^2 | 1.08 × 10^1
Boldface indicates the best result.
Table 3. Results of the max measure of the fitness function values.
Function | EGBO | GBO | PSO | GA | MFO | DA | DE | AO
F1 | 5.23 × 10^4 | 6.21 × 10^4 | 1.70 × 10^13 | 3.26 × 10^11 | 1.97 × 10^11 | 4.12 × 10^11 | 3.05 × 10^11 | 1.23 × 10^5
F2 | 1.73 × 10^1 | 1.73 × 10^1 | 2.55 × 10^4 | 1.52 × 10^2 | 1.74 × 10^1 | 3.80 × 10^2 | 1.94 × 10^1 | 1.76 × 10^1
F3 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1
F4 | 1.81 × 10^2 | 3.27 × 10^2 | 1.03 × 10^4 | 3.49 × 10^2 | 1.76 × 10^3 | 2.96 × 10^3 | 1.15 × 10^2 | 5.80 × 10^3
F5 | 1.29 × 10^0 | 1.53 × 10^0 | 4.68 × 10^0 | 1.52 × 10^0 | 1.81 × 10^0 | 2.00 × 10^0 | 1.73 × 10^0 | 2.83 × 10^0
F6 | 1.32 × 10^1 | 1.30 × 10^1 | 1.25 × 10^1 | 9.20 × 10^0 | 9.86 × 10^0 | 1.16 × 10^1 | 1.13 × 10^1 | 1.33 × 10^1
F7 | 8.28 × 10^2 | 9.41 × 10^2 | 9.81 × 10^2 | 7.85 × 10^2 | 7.31 × 10^2 | 1.37 × 10^3 | 8.64 × 10^2 | 1.22 × 10^3
F8 | 6.45 × 10^0 | 6.84 × 10^0 | 7.00 × 10^0 | 6.49 × 10^0 | 6.80 × 10^0 | 6.93 × 10^0 | 7.13 × 10^0 | 6.84 × 10^0
F9 | 3.62 × 10^0 | 4.95 × 10^0 | 1.08 × 10^3 | 5.12 × 10^0 | 6.12 × 10^0 | 2.48 × 10^2 | 4.02 × 10^0 | 5.26 × 10^2
F10 | 2.07 × 10^1 | 2.08 × 10^1 | 2.08 × 10^1 | 2.03 × 10^1 | 2.04 × 10^1 | 2.06 × 10^1 | 2.05 × 10^1 | 2.08 × 10^1
Boldface indicates the best result.
Table 4. Results of the min measure of the fitness function values.
Function | EGBO | GBO | PSO | GA | MFO | DA | DE | AO
F1 | 3.93 × 10^4 | 3.98 × 10^4 | 1.39 × 10^12 | 2.10 × 10^9 | 1.06 × 10^10 | 1.85 × 10^9 | 5.11 × 10^10 | 4.43 × 10^4
F2 | 1.73 × 10^1 | 1.73 × 10^1 | 5.12 × 10^3 | 1.73 × 10^1 | 1.73 × 10^1 | 1.74 × 10^1 | 1.76 × 10^1 | 1.74 × 10^1
F3 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1 | 1.27 × 10^1
F4 | 2.80 × 10^1 | 3.53 × 10^1 | 6.45 × 10^2 | 1.80 × 10^1 | 5.82 × 10^1 | 9.83 × 10^1 | 5.32 × 10^1 | 6.61 × 10^2
F5 | 1.05 × 10^0 | 1.06 × 10^0 | 1.63 × 10^0 | 1.06 × 10^0 | 1.19 × 10^0 | 1.24 × 10^0 | 1.42 × 10^0 | 1.87 × 10^0
F6 | 9.57 × 10^0 | 1.11 × 10^1 | 8.87 × 10^0 | 5.53 × 10^0 | 4.40 × 10^0 | 5.86 × 10^0 | 9.14 × 10^0 | 9.81 × 10^0
F7 | 6.34 × 10^1 | 1.52 × 10^1 | 1.45 × 10^2 | 3.02 × 10^1 | 6.13 × 10^1 | 1.42 × 10^2 | 3.83 × 10^2 | 2.37 × 10^2
F8 | 5.17 × 10^0 | 4.51 × 10^0 | 3.83 × 10^0 | 3.61 × 10^0 | 4.19 × 10^0 | 4.00 × 10^0 | 5.44 × 10^0 | 4.77 × 10^0
F9 | 2.56 × 10^0 | 2.63 × 10^0 | 3.67 × 10^0 | 2.59 × 10^0 | 2.75 × 10^0 | 2.89 × 10^0 | 2.70 × 10^0 | 4.67 × 10^1
F10 | 2.05 × 10^1 | 2.01 × 10^1 | 2.03 × 10^1 | 2.01 × 10^1 | 2.00 × 10^1 | 2.00 × 10^1 | 2.03 × 10^1 | 2.04 × 10^1
Boldface indicates the best result.
Table 5. Results of the computation times of the fitness function values.
Function | EGBO | GBO | PSO | GA | MFO | DA | DE | AO
F1 | 7.60 × 10^1 | 8.40 × 10^1 | 9.32 × 10^1 | 1.00 × 10^0 | 8.18 × 10^1 | 1.43 × 10^0 | 9.76 × 10^1 | 1.34 × 10^0
F2 | 1.11 × 10^1 | 1.17 × 10^1 | 5.80 × 10^2 | 1.59 × 10^1 | 3.36 × 10^2 | 1.02 × 10^0 | 1.66 × 10^1 | 8.41 × 10^2
F3 | 1.20 × 10^1 | 1.22 × 10^1 | 6.09 × 10^2 | 1.72 × 10^1 | 3.87 × 10^2 | 7.80 × 10^1 | 1.75 × 10^1 | 1.03 × 10^1
F4 | 1.22 × 10^1 | 1.13 × 10^1 | 5.96 × 10^2 | 1.71 × 10^1 | 3.21 × 10^2 | 6.22 × 10^1 | 1.73 × 10^1 | 9.77 × 10^2
F5 | 1.18 × 10^1 | 1.23 × 10^1 | 6.24 × 10^2 | 1.72 × 10^1 | 3.80 × 10^2 | 6.33 × 10^1 | 1.73 × 10^1 | 1.07 × 10^1
F6 | 2.97 × 10^1 | 3.19 × 10^1 | 3.00 × 10^1 | 4.03 × 10^1 | 2.58 × 10^1 | 7.14 × 10^1 | 3.98 × 10^1 | 5.48 × 10^1
F7 | 1.21 × 10^1 | 1.24 × 10^1 | 6.42 × 10^2 | 1.73 × 10^1 | 3.94 × 10^2 | 5.90 × 10^1 | 1.78 × 10^1 | 1.11 × 10^1
F8 | 1.27 × 10^1 | 1.24 × 10^1 | 6.81 × 10^2 | 1.82 × 10^1 | 4.24 × 10^2 | 6.18 × 10^1 | 1.83 × 10^1 | 1.14 × 10^1
F9 | 1.10 × 10^1 | 1.15 × 10^1 | 6.03 × 10^2 | 1.69 × 10^1 | 3.06 × 10^2 | 6.55 × 10^1 | 1.75 × 10^1 | 9.70 × 10^2
F10 | 1.25 × 10^1 | 1.30 × 10^1 | 7.26 × 10^2 | 1.80 × 10^1 | 3.97 × 10^2 | 6.22 × 10^1 | 1.88 × 10^1 | 1.21 × 10^1
Boldface indicates the best result.
Table 6. Results of the Wilcoxon rank-sum test for global optimization.
Function | GBO | PSO | GA | MFO | DA | DE | AO
F1 | 0.0015 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
F2 | 0.3506 | 0.0000 | 0.0005 | 0.0000 | 0.0000 | 0.0000 | 0.0000
F3 | 0.8678 | 0.9970 | 0.7878 | 0.4199 | 0.2510 | 0.0622 | 0.0757
F4 | 0.0015 | 0.0000 | 0.0114 | 0.0003 | 0.0000 | 0.0381 | 0.0000
F5 | 0.3085 | 0.0000 | 0.6331 | 0.0003 | 0.0000 | 0.0000 | 0.0000
F6 | 0.0374 | 0.0397 | 0.0000 | 0.0000 | 0.0000 | 0.0004 | 0.4423
F7 | 0.8519 | 0.9338 | 0.7716 | 0.4553 | 0.0251 | 0.0012 | 0.0066
F8 | 0.0321 | 0.1973 | 0.0222 | 0.6776 | 0.3712 | 0.0001 | 0.0020
F9 | 0.0281 | 0.0000 | 0.3093 | 0.0000 | 0.0000 | 0.0536 | 0.0000
F10 | 0.6061 | 0.1360 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.2814
Boldface indicates the best result.
Table 7. Datasets description.
Dataset | Features | Samples | Classes
Exactly2 | 13 | 1000 | 2
breast cancer | 9 | 699 | 2
ionosphere | 34 | 351 | 2
glass | 9 | 214 | 7
Lymphography | 18 | 148 | 2
sonar | 60 | 208 | 2
waveform | 40 | 5000 | 3
clean1data | 166 | 476 | 2
tic-tac-toe | 9 | 958 | 2
heart | 12 | 270 | 2
SPECT | 22 | 267 | 2
Zoo | 16 | 101 | 6
Table 8. Results of the fitness function values.
Dataset | EGBO | GBO | PSO | GA | AO | DE | MFO | DA
ionosphere | 0.1258 | 0.1556 | 0.1327 | 0.1792 | 0.2254 | 0.1388 | 0.2540 | 0.2923
breast cancer | 0.1530 | 0.2047 | 0.1921 | 0.2131 | 0.2513 | 0.1792 | 0.3010 | 0.2497
glass | 0.1294 | 0.1379 | 0.1410 | 0.1460 | 0.1749 | 0.1372 | 0.2055 | 0.1502
sonar | 0.0533 | 0.1039 | 0.0593 | 0.1693 | 0.3222 | 0.0931 | 0.2954 | 0.2257
Lymphography | 0.2078 | 0.2723 | 0.2317 | 0.3011 | 0.4578 | 0.2377 | 0.4755 | 0.3284
tic-tac-toe | 0.0000 | 0.0043 | 0.0043 | 0.0360 | 0.0043 | 0.0043 | 0.4734 | 0.0735
waveform | 0.6203 | 0.6360 | 0.6267 | 0.6394 | 0.6571 | 0.6215 | 0.6709 | 0.6558
clean1data | 0.1421 | 0.2108 | 0.2035 | 0.2191 | 0.3043 | 0.2030 | 0.2691 | 0.2689
SPECT | 0.3220 | 0.3549 | 0.3288 | 0.3464 | 0.4036 | 0.3267 | 0.4262 | 0.3753
Zoo | 0.0000 | 0.0022 | 0.0044 | 0.0044 | 0.0373 | 0.0022 | 0.1141 | 0.0067
Exactly2 | 0.4685 | 0.4838 | 0.4855 | 0.4895 | 0.5109 | 0.4793 | 0.5735 | 0.5001
heart | 0.3321 | 0.3517 | 0.3449 | 0.3588 | 0.4201 | 0.3388 | 0.4496 | 0.3980
Boldface indicates the best result.
Table 9. Results of the standard deviation.
Dataset | EGBO | GBO | PSO | GA | AO | DE | MFO | DA
ionosphere | 0.0485 | 0.0619 | 0.0660 | 0.0707 | 0.0868 | 0.0649 | 0.0376 | 0.0408
breast cancer | 0.0349 | 0.0231 | 0.0416 | 0.0394 | 0.0593 | 0.0443 | 0.0507 | 0.0359
glass | 0.0219 | 0.0319 | 0.0282 | 0.0348 | 0.0370 | 0.0308 | 0.0430 | 0.0290
sonar | 0.0675 | 0.0918 | 0.0739 | 0.0287 | 0.1016 | 0.0894 | 0.0616 | 0.0538
Lymphography | 0.1042 | 0.0934 | 0.1058 | 0.0696 | 0.1176 | 0.0811 | 0.0902 | 0.0624
tic-tac-toe | 0.0000 | 0.0161 | 0.0161 | 0.0807 | 0.0161 | 0.0161 | 0.0940 | 0.1257
waveform | 0.0123 | 0.0180 | 0.0143 | 0.0126 | 0.0344 | 0.0135 | 0.0301 | 0.0180
clean1data | 0.0144 | 0.0455 | 0.0415 | 0.0358 | 0.0613 | 0.0369 | 0.0394 | 0.0408
SPECT | 0.0284 | 0.0372 | 0.0364 | 0.0371 | 0.0570 | 0.0414 | 0.0668 | 0.0381
Zoo | 0.0000 | 0.0083 | 0.0113 | 0.0113 | 0.0662 | 0.0083 | 0.0879 | 0.0133
Exactly2 | 0.0231 | 0.0251 | 0.0244 | 0.0206 | 0.0376 | 0.0210 | 0.0266 | 0.0175
heart | 0.0283 | 0.0442 | 0.0415 | 0.0357 | 0.0714 | 0.0388 | 0.0606 | 0.0423
Boldface indicates the best result.
Table 10. Ratios of the selected features.
Dataset | EGBO | GBO | PSO | GA | AO | DE | MFO | DA
ionosphere | 0.3309 | 0.3588 | 0.4275 | 0.4275 | 0.3412 | 0.4294 | 0.4745 | 0.4902
breast cancer | 0.3137 | 0.4275 | 0.4431 | 0.4118 | 0.3294 | 0.4314 | 0.4549 | 0.5059
glass | 0.5051 | 0.5111 | 0.5185 | 0.5333 | 0.6370 | 0.5111 | 0.5926 | 0.5852
sonar | 0.4487 | 0.4811 | 0.4811 | 0.4978 | 0.5511 | 0.4644 | 0.5167 | 0.5011
Lymphography | 0.5208 | 0.5185 | 0.4926 | 0.5370 | 0.5519 | 0.5148 | 0.5370 | 0.5593
tic-tac-toe | 0.8068 | 1.0000 | 1.0000 | 0.9852 | 1.0000 | 1.0000 | 0.8194 | 0.9704
waveform | 0.6190 | 0.6794 | 0.6794 | 0.6921 | 0.7492 | 0.6667 | 0.6476 | 0.6571
clean1data | 0.3213 | 0.4458 | 0.4940 | 0.4932 | 0.4506 | 0.4928 | 0.5390 | 0.5233
SPECT | 0.5130 | 0.4364 | 0.4727 | 0.5152 | 0.5091 | 0.4515 | 0.5152 | 0.5333
Zoo | 0.5268 | 0.5708 | 0.5833 | 0.5708 | 0.6083 | 0.5833 | 0.5395 | 0.6250
Exactly2 | 0.2212 | 0.2615 | 0.4000 | 0.4205 | 0.3641 | 0.4667 | 0.5026 | 0.4359
heart | 0.5513 | 0.6058 | 0.5865 | 0.6090 | 0.6090 | 0.6218 | 0.5224 | 0.5692
Boldface indicates the best result.
Table 11. Results of the accuracy measure.
Dataset | EGBO | GBO | PSO | GA | AO | DE | MFO | DA
ionosphere | 0.9818 | 0.9720 | 0.9780 | 0.9629 | 0.9417 | 0.9765 | 0.9341 | 0.9129
breast cancer | 0.9754 | 0.9576 | 0.9614 | 0.9530 | 0.9333 | 0.9659 | 0.9068 | 0.9364
glass | 0.7994 | 0.7560 | 0.7497 | 0.7509 | 0.6881 | 0.7484 | 0.6579 | 0.7522
sonar | 0.9926 | 0.9808 | 0.9910 | 0.9705 | 0.8859 | 0.9833 | 0.9090 | 0.9462
Lymphography | 0.9459 | 0.9171 | 0.9351 | 0.9045 | 0.7982 | 0.9369 | 0.7982 | 0.8883
tic-tac-toe | 1.0000 | 0.9997 | 0.9997 | 0.9922 | 0.9997 | 0.9997 | 0.7671 | 0.9788
waveform | 0.7989 | 0.7931 | 0.7970 | 0.7926 | 0.7839 | 0.7981 | 0.7723 | 0.7818
clean1data | 0.9796 | 0.9535 | 0.9569 | 0.9507 | 0.9036 | 0.9574 | 0.9261 | 0.9261
SPECT | 0.8955 | 0.8726 | 0.8905 | 0.8786 | 0.8338 | 0.8915 | 0.8139 | 0.8577
Zoo | 1.0000 | 0.9973 | 0.9947 | 0.9947 | 0.9680 | 0.9973 | 0.8880 | 0.9920
Exactly2 | 0.7800 | 0.7653 | 0.7637 | 0.7600 | 0.7376 | 0.7699 | 0.6704 | 0.7496
heart | 0.8889 | 0.8744 | 0.8794 | 0.8700 | 0.8184 | 0.8837 | 0.7942 | 0.8398
Boldface indicates the best result.
Table 12. Computation times for all algorithms.
Dataset | EGBO | GBO | PSO | GA | AO | DE | MFO | DA
ionosphere | 25.418 | 25.479 | 24.920 | 28.329 | 50.309 | 25.402 | 24.979 | 28.792
breast cancer | 25.525 | 25.538 | 25.012 | 28.411 | 50.412 | 25.487 | 25.078 | 26.812
glass | 26.260 | 27.561 | 26.582 | 31.291 | 52.116 | 27.344 | 27.201 | 28.430
sonar | 25.158 | 25.171 | 24.696 | 28.141 | 49.700 | 25.142 | 24.777 | 27.189
Lymphography | 21.516 | 23.066 | 21.789 | 26.201 | 39.414 | 21.717 | 22.758 | 22.909
tic-tac-toe | 27.647 | 29.758 | 29.271 | 34.574 | 58.779 | 30.435 | 29.962 | 30.080
waveform | 166.431 | 157.170 | 162.899 | 190.428 | 314.278 | 164.305 | 164.992 | 151.181
clean1data | 30.409 | 32.733 | 33.144 | 37.645 | 64.316 | 33.762 | 33.622 | 40.541
SPECT | 22.077 | 25.104 | 25.030 | 28.644 | 48.344 | 25.692 | 25.174 | 26.721
Zoo | 7.729 | 26.908 | 25.383 | 32.773 | 49.306 | 25.945 | 27.188 | 26.930
Exactly2 | 24.346 | 24.489 | 27.296 | 28.711 | 49.987 | 29.136 | 26.953 | 27.980
heart | 25.401 | 25.479 | 24.990 | 28.378 | 49.885 | 25.453 | 25.188 | 26.242
Boldface indicates the best result.
Table 13. Results of the Wilcoxon rank-sum test.
Dataset | GBO | PSO | GA | AO | DE | MFO | DA
ionosphere | 0.4843 | 0.0063 | 0.1001 | 0.0006 | 0.0231 | 0.0000 | 0.0141
breast cancer | 0.2069 | 0.8077 | 0.0000 | 0.0000 | 0.3432 | 0.0000 | 0.0000
glass | 0.0127 | 0.0034 | 0.0006 | 0.0000 | 0.0223 | 0.0000 | 0.0000
sonar | 0.1639 | 0.0036 | 0.0388 | 0.1639 | 0.0015 | 0.0000 | 0.0185
Lymphography | 0.0095 | 0.0432 | 0.0003 | 0.0007 | 0.7025 | 0.0000 | 0.0000
tic-tac-toe | 0.0012 | 0.2245 | 0.0026 | 0.0000 | 0.1609 | 0.0000 | 0.0002
waveform | 0.1639 | 0.0798 | 0.0798 | 0.0038 | 0.1639 | 0.0000 | 0.0386
clean1data | 0.0371 | 0.3575 | 0.0021 | 0.0002 | 0.1186 | 0.0000 | 0.0000
SPECT | 0.0000 | 0.0002 | 0.0000 | 0.0000 | 0.0022 | 0.0000 | 0.0000
Zoo | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
Exactly2 | 0.0017 | 0.0008 | 0.0005 | 0.0001 | 0.0049 | 0.0000 | 0.0000
heart | 0.0649 | 0.2815 | 0.0345 | 0.0001 | 0.6431 | 0.0000 | 0.0000
Boldface indicates the best result.
