Article

Improved Dujiangyan Irrigation System Optimization (IDISO): A Novel Metaheuristic Algorithm for Hydrochar Characteristics

1 School of Energy and Environment, Inner Mongolia University of Science and Technology, Baotou 014010, China
2 School of Automotive Engineering, Jiangxi College of Applied Technology, Ganzhou 341000, China
* Author to whom correspondence should be addressed.
Processes 2024, 12(7), 1321; https://doi.org/10.3390/pr12071321
Submission received: 14 May 2024 / Revised: 21 June 2024 / Accepted: 22 June 2024 / Published: 26 June 2024
(This article belongs to the Special Issue State-of-the-Art Wastewater Treatment Techniques)

Abstract

Hyperparameter tuning is crucial in the development of machine learning models. This study introduces a nonlinear shrinking factor and a Cauchy mutation mechanism to improve the Dujiangyan Irrigation System Optimization (DISO), proposing the improved Dujiangyan Irrigation System Optimization algorithm (IDISO) for hyperparameter tuning in machine learning. The optimization capabilities and convergence performance of IDISO were validated on 87 CEC2017 benchmark function instances (29 functions in three dimensions) and nine real-world engineering problems, demonstrating that it significantly outperforms DISO in terms of convergence speed and accuracy, and ranks first overall among the seventeen advanced metaheuristic algorithms compared. To construct a robust and generalizable prediction model for hydrochar elemental characteristics, this study utilized the IDISO and DISO algorithms to fine-tune the parameters of the XGBoost model. The experimental results show that the IDISO-XGBoost model achieved an average prediction performance of 0.95, a 4% improvement over the DISO-XGBoost model. These results indicate that the IDISO algorithm has significant potential and value in practical applications.

1. Introduction

The global energy shortage and environmental pollution are increasingly serious issues. To achieve China's "carbon peak" and "carbon neutrality" objectives, CO2 emissions must be reduced effectively, and exploring renewable and clean energy sources as alternatives to fossil fuels is critical. Biomass is emphasized as the sole viable carbon-based renewable alternative to fossil fuels [1], and studies indicate that, in certain regions, initial trials exploring its integration into the energy mix are underway. In 2021, the amount of electricity generated from biomass combustion reached 750 TWh, and the supply of biomass is projected to increase to 100 EJ by 2050 [2,3].
Hydrothermal carbonization (HTC) is an emerging thermochemical conversion technology that effectively transforms biomass into hydrochar with an energy density comparable to peat and lignite. The process efficiently recycles agricultural waste and is both economical and environmentally friendly [4]. The HTC process is influenced by the complex composition of the biomass feedstock as well as by the hydrothermal reaction conditions, and it involves nonlinear, strongly coupled multivariate relationships. These factors hinder a deep understanding of the hydrothermal carbonization mechanism. Current studies primarily rely on experimental and simulation methods [5,6] for quantitative analysis of the factors affecting the hydrothermal carbonization process. However, such research mainly focuses on one or a few types of biomass, so the experimental results lack universality and the findings are difficult to generalize. When modeling with traditional methods, such as computational fluid dynamics, kinetics, and thermodynamics, various assumptions are inevitably made to simplify the complex HTC process, and these simplifications may affect the accuracy of the simulation results [7].
Machine learning techniques demonstrate more potential in predicting unknown relationships than traditional methods. Machine learning algorithms are widely applied in predicting industrial problems. They can make compelling predictions without a precise mathematical relationship between the input and output features. They have been proven to be alternatives to traditional modeling techniques for studying and understanding complex processes [8], which demonstrates significant potential in predicting the physicochemical properties of hydrochar.
Rasam et al. [9] pioneered a machine learning model to predict hydrothermal carbonization properties, with the Support Vector Machine (SVM) method showing superior performance over other methods like Decision Tree (DT), Random Forest (RF), and Multi-Layer Perceptron (MLP). Kardani et al. [10] found XGBoost to be the most accurate method in predicting hydrocarbons yield in biomass. Li et al.’s studies [11,12] employed optimized RF and SVM models to assess biomass hydrochar and pyrochar properties, achieving significant accuracy. Furthermore, they demonstrated that DNN models excelled in predicting the physicochemical properties of hydrochar from urban waste, with high accuracy in parameters like HHV and carbon content. Nguyen et al. [13] have also conducted similar work, achieving analogous results.
Machine learning methods are key in predicting hydrochar properties, but model optimization is challenging. Mu et al. [14], in 2022, successfully used Particle Swarm Optimization based on an ANN model to study hydrochar, achieving high accuracy, thereby demonstrating the potential of metaheuristic optimization in such research. Li et al. [15] employed Genetic Algorithms (GA) to optimize Artificial Neural Networks (ANN), achieving superior outcomes compared to those obtained through the response surface methodology (RSM).
Metaheuristic optimization techniques, known for their minimal derivation, flexibility, and their ability to escape local optima [16], are effective for nonlinear problems, finding applications in energy, mechanical, and chemical engineering [17,18,19]. And, because the optimization process of metaheuristic algorithms does not depend on gradient information [20], they find widespread applications in optimization problems for finding the best parameters. For example, the Dujiangyan irrigation system optimization (DISO) [21] is used to construct a DISO-SVM model [21] to detect the impact of dam displacement on dam operation; Particle Swarm Optimization (PSO) [22] is used to construct PSO-NN [14] and PSO-RF [23] models to predict hydrochar properties; the Grey Wolf Optimizer (GWO) [24] is used to construct a GWO-ELM model [25] for monitoring power quality; the Dandelion Optimizer (DO) [26] is used to improve the efficiency of multilevel inverters [27]; the Jellyfish Search Algorithm (JS) [28] is used to discover unknown parameters in fuel cells [29]; Young’s Double-Slit Experiment (YDSE) optimizer [30] is used to construct a YDSE-PWM model for predicting dissolved oxygen levels [31]; the Starling Murmuration Optimizer (SMO) [32] algorithm is used to construct an ADA-SMO model [33] for predicting the strength of the mechanical properties of concrete. The “No Free Lunch” theorem states that specific optimization algorithms are suited only to certain optimization problems [34]. This principle motivates us to create and improve existing optimization algorithms so that they perform better in specific scenarios with specialized difficulties. 
Researchers have proposed various improved algorithms, such as the Hybrid Whale–Particle Swarm Optimization Algorithm (HWPSO) [35]; adaptive hybrid dandelion optimizer (DETDO) [36]; Ameliorated Young’s double-slit experiment optimizer (IYDSE) [37]; enhanced jellyfish search algorithm (EJS) [38]; and efficient hybrid starling murmuration optimizer (DTCSMO) [39].
In this study, the Dujiangyan irrigation system optimization (DISO) [21] is enhanced with nonlinear shrinking factors and Cauchy mutation methods. This improvement aims to overcome its tendencies for converging to local optima and poor performance in high-dimensional spaces. IDISO will be used to optimize the predictive model for the physicochemical properties of hydrochar. The aim is to guide experimental designs. This approach can reduce time and economic costs. Additionally, it aids in revealing the relationships between different parameters in the hydrochar reaction process. It also provides theoretical guidance for the application of hydrochar in fuel chemical engineering.
The main contributions of this paper are as follows:
  • We improved the Dujiangyan irrigation system optimization (DISO) by introducing nonlinear shrinking factors and the Cauchy mutation mechanism, addressing its tendency to become trapped in local optima and its poor performance in high-dimensional spaces.
  • We compared the IDISO algorithm with seventeen state-of-the-art optimization algorithms using twenty-nine CEC2017 benchmark functions across three dimensions (30, 50, and 100) and nine engineering problems. Non-parametric tests indicated that the IDISO algorithm showed significant improvements in terms of convergence speed and accuracy.
  • We developed an IDISO-XGBoost model to predict the physicochemical properties of hydrochar, resulting in a prediction model with high robustness and generalization ability.
Section 2 elaborates on the original Dujiangyan method and its advancements. Section 3 and Section 4 test the algorithm's performance. Section 5 analyzes the algorithm's behavior. Section 6 discusses the application of the refined algorithm in optimizing the predictive model for hydrochar's physicochemical properties. The manuscript concludes with Section 7, offering conclusions and insights for future research directions.

2. Materials and Methods

2.1. Data Source

The dataset utilized in this study was compiled through the careful collection and organization of existing published data, in line with rigorous research standards. The input features comprised the operational conditions during hydrothermal carbonization and the elemental and proximate analysis of the biomass itself; the output features were the elemental analysis of the hydrochar. A total of 420 data records were used. The data originate from hydrothermal carbonization studies on wood biomass, herbaceous biomass, food waste, sewage sludge, and other feedstocks, covering the common types of biomass materials.

2.2. DISO

The inspiration for the DISO method comes from the Dujiangyan irrigation project. In the DISO algorithm [21], potential solutions of the unknown function are considered as massless, volumeless droplets in the search space. The algorithm primarily consists of four main steps.
The first part is the initialization phase. Then, the global search phase simulates the Fish Mouth Dividing Project, and the formula for updating the velocity of water droplets is as follows:
$$V_i^1=\begin{cases}\dfrac{HRO^{2/3}\times HGO^{1/2}}{CF}\times r_1, & r_1<0.23\\[4pt]\dfrac{HRI^{2/3}\times HGI^{1/2}}{CF}\times r_2, & r_2\ge 0.23\end{cases}$$
where $r_1$ and $r_2$ are random numbers between 0 and 1; $HRO$ and $HRI$ are the hydraulic radii of the outer and inner rivers, set at fixed values of 1.2 and 7.2; $HGO$ and $HGI$ represent the hydraulic gradients of the outer and inner rivers, set at fixed values of 1.3 and 0.82; and $CF$ denotes the comprehensive riverbed roughness, given by:
$$CF(t)=CFR\times\mathrm{gampdf}\!\left(\frac{t}{T},P,Q\right)$$
where $CFR$ represents the riverbed roughness, set at a fixed value of 9.435; $t$ is the current iteration number; $T$ is the maximum number of iterations; and $\mathrm{gampdf}(t/T,P,Q)$ is the gamma probability density function, shaped by the parameters $P$ and $Q$, which are set at 0.85 and 2.5, respectively, in the DISO algorithm. The formula for updating the position of the water droplet is as follows:
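As an illustrative sketch (not the authors' code), the roughness term and the Fish Mouth velocity update can be written in Python; mapping `gampdf` to `scipy.stats.gamma.pdf` with shape P and scale Q follows the MATLAB convention and is an assumption here:

```python
import numpy as np
from scipy.stats import gamma

# Fixed DISO constants quoted in the text.
HRO, HRI = 1.2, 7.2    # hydraulic radii (outer / inner river)
HGO, HGI = 1.3, 0.82   # hydraulic gradients (outer / inner river)
CFR, P, Q = 9.435, 0.85, 2.5

def comprehensive_roughness(t, T):
    """CF(t) = CFR * gampdf(t/T, P, Q): gamma pdf with shape P, scale Q."""
    return CFR * gamma.pdf(t / T, a=P, scale=Q)

def fish_mouth_velocity(t, T, rng):
    """Global-search velocity V_i^1: droplets split between the two rivers."""
    cf = comprehensive_roughness(t, T)
    r1 = rng.random()
    if r1 < 0.23:                      # outer-river branch
        return HRO ** (2 / 3) * HGO ** 0.5 / cf * r1
    r2 = rng.random()                  # inner-river branch
    return HRI ** (2 / 3) * HGI ** 0.5 / cf * r2
```

Because the gamma shape parameter is below 1, the pdf diverges at t = 0, so a run would start from the first iteration (t ≥ 1).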
$$X_i^1(t+1)=X_{best}(t)+(X_{best}-X_i)\times\mathrm{rand}(0,1)\times V_i^1+Improve1_i$$
where $X_{best}(t)$ represents the position of the best-performing candidate solution in the $t$-th iteration and $Improve1_i$ is the self-improvement term of the water body, computed as follows:
$$Improve1_i=(-1)^k\times(X_i-X_{mean})\times\mathrm{rand}(0,1)$$
In this formula, $k$ is an integer in $\{0,1\}$, and $X_{mean}$ represents the average position of the population. After the water body enters the inner river, the water droplet exhibits a spiral motion. At this point, the position update is influenced by centrifugal and lateral pressures, with the corresponding update formula being as follows.
$$X_{i,j}=\begin{cases}X_{i,j}, & RCF_i<LP_i\\(UB_j-LB_j)\times\mathrm{rand}(0,1)+LB_j, & RCF_i\ge LP_i\end{cases}$$
In this formula, $RCF_i$ and $LP_i$ represent the centrifugal force and lateral pressure experienced by the $i$-th solution, respectively, computed as follows:
$$RCF_i=WDI\times\cos\!\left(90^\circ\times\frac{t}{T}\right)\times\sqrt{\sum_{j=1}^{D}\left(X_{best,j}-X_{i,j}\right)^2}$$
$$LP_i=\frac{WDI\times VDC\times MLV_i^2}{CRC}$$
In these formulas, $WDI$ is the density of the water body in the inner river, set at a fixed value of 1.35; $VDC$ is the fluid distribution coefficient, set at 0.46; and $MLV_i$ is the longitudinal average velocity, with the formula being as follows:
$$MLV_i=\mathrm{mean}(Fit)$$
In this formula, $\mathrm{mean}(Fit)$ denotes the average fitness of the population.
The local development phase simulates the Baopingkou Project, and the formula for updating the velocity of the water droplet is as follows:
$$V_i^2=\begin{cases}\dfrac{HRI^{2/3}\times HGI^{1/2}}{2CF}\times r_3, & r_3<p_2\\[4pt]\dfrac{HRI^{2/3}\times HGI^{1/2}}{CF}\times r_4, & r_4\ge p_2\end{cases}$$
In this formula, $r_3$ and $r_4$ are random numbers between 0 and 1, while $p_2$ is the water-dividing ratio at the bottleneck, set at a fixed value of 0.68. The formula for updating the position of the water droplet is as follows:
$$X_i^2(t+1)=X_{best}(t)+(X_{best}-X_i)\times\mathrm{rand}(0,1)\times V_i^2+Improve2_i$$
In this formula, $X_{best}(t)$ represents the position of the best-performing candidate solution in the $t$-th iteration, while $Improve2_i$ is the self-improvement term of the water body, computed as follows:
$$Improve2_i=\mathrm{sign}(Fit_{idx}-Fit_i)\times(X_{idx}-X_i)\times\mathrm{rand}(0,1)$$
In this formula, $Fit_i$ represents the fitness value of the $i$-th individual and $idx$ denotes a randomly selected individual.
The final phase simulates the individual elimination stage of the Feishayan (sand-discharge) project: fitness values are ranked, and the individuals to be eliminated are reinitialized, mimicking the discharge of sand. The elimination ratio per iteration is set at 0.23.
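A minimal NumPy sketch of this elimination step, assuming minimization and uniform reinitialization within the bounds (the function name and array layout are illustrative, not from the paper):

```python
import numpy as np

def feishayan_elimination(pop, fitness, lb, ub, ratio=0.23, rng=None):
    """Sand-discharge step: reinitialize the worst fraction of the population.

    pop: (N, D) droplet positions; fitness: (N,) values, lower is better."""
    rng = np.random.default_rng() if rng is None else rng
    n_out = max(1, int(len(pop) * ratio))
    worst = np.argsort(fitness)[-n_out:]      # indices of the worst droplets
    pop[worst] = rng.uniform(lb, ub, size=(n_out, pop.shape[1]))
    return pop
```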

2.3. IDISO

In this section, we introduce the improvements to the original algorithm using the nonlinear shrinking factor and the Cauchy mutation mechanism.

2.3.1. Nonlinear Shrinking Factor

In the DISO algorithm, the position updates in Equations (3) and (10) primarily depend on the best candidate solution and the positions of other solutions in the population. However, this transfer of information is not well suited to coordinating the relationship between global search and local development. Consequently, a nonlinear shrinking factor is introduced to maintain a dynamic equilibrium between global tracking and the local development of individuals. Simultaneously, this approach preserves the convergence of DISO while enhancing its precision. The fractal dimension [40] is introduced as a nonlinear factor in the third term of the position update Formula (5), with the procedure being as follows:
$$C(t)=\ln\!\left((C_{max}-C_{min})\times\frac{T-t}{T}+C_{min}\right)$$
In this formula, $C_{max}$ and $C_{min}$ are set at fixed values of $e^3$ and $e^2$, so $C(t)$ decreases from 3 at the start of the run to 2 at the end.
A nonlinear factor acting in the opposite direction is introduced in the second term of the position update Formula (12), with the formula being as follows:
$$Q(t)=e^{-2\left(1-\frac{t}{T}\right)}$$
The updated position update formulas are as follows:
$$X_i^1(t+1)=X_{best}(t)+(X_{best}-X_i)\times\mathrm{rand}(0,1)\times V_i^1+Improve1_i\times C(t)$$
$$X_i^2(t+1)=X_{best}(t)+(X_{best}-X_i)\times\mathrm{rand}(0,1)\times V_i^2\times Q(t)+Improve2_i$$
In these formulas, $C(t)$ nonlinearly decreases as the iteration number $t$ increases, while $Q(t)$ nonlinearly increases. During the initial iteration phase, $C(t)$ decays slowly while $Q(t)$ increases gradually. This allows a wider movement range of the water body in the first position update, enabling extensive exploration of the search space, while in the third position update the water body carries out detailed development within a localized area. As the number of iterations increases, $C(t)$ decreases rapidly while $Q(t)$ increases quickly. Consequently, in the later stages of the iteration the first update focuses mainly on local, detailed development, while in the third update the water body can move more quickly towards the global optimum through a wider range of motion, allowing a more effective search for the optimal solution.
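The two factors can be sketched directly in Python. Note that the exact closed form of Q(t) is partly obscured in the source; the expression below, an exponential rising from e^(-2) to 1 and matching the described slow-then-fast increase, is a reconstruction:

```python
import math

C_MAX, C_MIN = math.e ** 3, math.e ** 2  # fixed values from the text

def shrink_c(t, T):
    """C(t): decays nonlinearly from ln(C_MAX)=3 at t=0 to ln(C_MIN)=2 at t=T."""
    return math.log((C_MAX - C_MIN) * (T - t) / T + C_MIN)

def grow_q(t, T):
    """Q(t): rises nonlinearly from e^-2 at t=0 to 1 at t=T (reconstructed)."""
    return math.exp(-2.0 * (1.0 - t / T))
```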

2.3.2. Cauchy Mutation Mechanism

The movement mechanism of DISO depends mainly on information exchange within the population, so the algorithm tends to fall into local optima and struggles to escape them. Therefore, the Cauchy mutation mechanism is introduced before the fourth step of eliminating individuals, enhancing the particles' ability to escape from local optima. The formula is as follows:
$$X_i^{new}=\frac{\gamma}{\pi\left(\gamma^{2}+(x-a)^{2}\right)}$$
$$X_i^3(t+1)=\begin{cases}X_i, & Fit_i^{new}<Fit_i\\X_i^{new}, & Fit_i^{new}\ge Fit_i\end{cases}$$
In these formulas, $X_i^{new}$ represents the position of the water body after the Cauchy mutation, generated from the Cauchy probability density with location $a$ and scale $\gamma$, which are fixed at 0 and 1, respectively; $X_i$ is the current position of the water body. Before individual elimination (in the Feishayan stage), the final position of the water body is determined by comparing the fitness before and after the mutation. The flowchart of IDISO is shown in Figure 1.
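As a hedged sketch (the function name and the additive-perturbation form are assumptions; the text only specifies the Cauchy density with a = 0, γ = 1 and the fitness-comparison rule), the mutation step might look like:

```python
import numpy as np

def cauchy_mutation(x, fit, objective, a=0.0, gamma=1.0, rng=None):
    """Perturb droplet x with a Cauchy(a, gamma) step; greedy acceptance.

    objective: callable returning a fitness to MAXIMIZE (e.g., R^2)."""
    rng = np.random.default_rng() if rng is None else rng
    step = a + gamma * rng.standard_cauchy(size=np.shape(x))  # heavy-tailed jump
    x_new = np.asarray(x) + step
    fit_new = objective(x_new)
    # Keep the mutant only if it is at least as fit as the original.
    return (x_new, fit_new) if fit_new >= fit else (np.asarray(x), fit)
```

The heavy tails of the Cauchy distribution occasionally produce very large jumps, which is what lets trapped droplets leave a local optimum.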

2.4. IDISO-XGBoost

Predicting the elemental characteristics of hydrochar is crucial for scholars to understand its reaction process. It is equally essential for guiding experimental and industrial applications.
XGBoost [41] is a machine learning algorithm based on gradient-boosted trees. It introduces regularization terms to control the complexity of the model and employs gradient boosting algorithms for training. Relevant studies indicate that XGBoost exhibits the best performance in predicting structured data problems [42].
Unlike manual tuning methods that are prone to local optima, grid search methods that are time-consuming and labor-intensive, and Bayesian optimization methods that have limitations in hyperparameter optimization, metaheuristic methods have the advantages of not requiring gradient information, strong global search capabilities, and high speed. Therefore, this study proposes the IDISO-XGBoost model, utilizing IDISO to optimize the parameters of XGBoost.
The dataset was divided into a training set, validation set, and test set in an 8:1:1 ratio. The training set was used to train the predictive model. The validation set was used for the hyperparameter tuning of the trained model. The test set was used to test the predictive model. The coefficient of determination (R2) was used as the evaluation metric for XGBoost and as the fitness value for IDISO. When optimizing the hyperparameters of XGBoost, the IDISO algorithm aimed to maximize the value of the fitness function.
The model tunes the three hyperparameters that most significantly affect XGBoost's performance: the learning rate, the number of trees, and the maximum tree depth. The sample subsampling ratio (for model generalization) and the regularization coefficients (to prevent overfitting) are also included, giving five hyperparameters in IDISO's search space. Because the number of search dimensions is low, the population size of the IDISO algorithm is set to 10, with 20 iterations for optimization.
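A sketch of this setup is given below. The exact search-space bounds are not listed in this section, so the ranges in BOUNDS are hypothetical, and scikit-learn's GradientBoostingRegressor on synthetic data stands in for XGBoost on the hydrochar dataset so the example stays self-contained (XGBoost's regularization coefficients have no direct counterpart here and are omitted from the model call):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor  # stand-in for XGBoost
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Hypothetical bounds for the tuned hyperparameters (IDISO's search space).
BOUNDS = {
    "learning_rate": (0.01, 0.3),
    "n_estimators": (50, 500),
    "max_depth": (2, 10),
    "subsample": (0.5, 1.0),   # sample subsampling ratio
}

def fitness(params, X_tr, y_tr, X_val, y_val):
    """Validation R^2: the fitness value IDISO maximizes."""
    model = GradientBoostingRegressor(
        learning_rate=params["learning_rate"],
        n_estimators=int(params["n_estimators"]),
        max_depth=int(params["max_depth"]),
        subsample=params["subsample"],
        random_state=0,
    )
    model.fit(X_tr, y_tr)
    return r2_score(y_val, model.predict(X_val))

# 8:1:1 split: hold out 20% of the data, then halve it into validation/test.
X, y = make_regression(n_samples=200, n_features=8, noise=5.0, random_state=0)
X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.2, random_state=0)
X_val, X_te, y_val, y_te = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)
```

An optimizer such as IDISO would repeatedly sample parameter vectors inside BOUNDS, evaluate `fitness` on the validation split, and keep the best-scoring configuration for final evaluation on the test split.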
In order to implement the above algorithms, Python version 3.9.13, NumPy version 1.24.2, SciPy version 1.9.1, XGBoost version 1.7.5, scikit-learn (sklearn) version 1.3.0, and pandas version 1.4.4 were employed.

2.5. Performance Evaluation

In this section, we introduce the evaluation of the algorithm using statistical analysis and non-parametric statistics (sign test) methods.

2.5.1. Statistical Analysis

The average ranking method is used for a preliminary evaluation of IDISO's performance on the CEC2017 functions and the well-known engineering problems.

2.5.2. Non-parametric Statistics (Sign Test)

In average ranking tests, algorithms may have their overall average ranking affected by extreme rankings due to their unsuitability for certain specific problems. Therefore, non-parametric testing methods are used to ascertain IDISO’s performance further.
The sign test is a popular and straightforward method for evaluating algorithm performance. It compares the performance of two algorithms in each scenario, counting the number of times one algorithm outperforms the other; the algorithm with the greater number of victories overall is considered superior.
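With the win counts in hand, the two-sided sign test reduces to a binomial tail probability under the null hypothesis that each algorithm is equally likely to win any single comparison; a stdlib-only sketch:

```python
from math import comb

def sign_test(wins_a, wins_b):
    """Two-sided sign-test p-value for paired win/loss counts (ties dropped).

    Under H0 each algorithm wins any single comparison with probability 1/2."""
    n = wins_a + wins_b
    k = min(wins_a, wins_b)
    # Two-sided: double the probability of a result at least this lopsided.
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2.0 * tail)
```

For example, a 25-to-4 win count over 29 benchmark functions yields a p-value of roughly 1e-4, well below the usual 0.05 threshold, so such a margin would be significant.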

3. CEC2017 Results and Discussion

CEC2017 is a benchmark test suite [43] comprising four types of test functions, each focusing on evaluating a different capability of the algorithm; for detailed information, refer to Table 1 below. Unimodal functions assess the algorithm's in-depth exploitation capability in the search space. Multimodal functions measure the algorithm's efficiency in exploring complex search spaces. Hybrid functions focus on the algorithm's ability to balance exploration and exploitation. Composite functions evaluate the overall performance of the algorithm in highly complex search environments.
This section compares the IDISO algorithm with 17 powerful, state-of-the-art optimization algorithms, including DISO. For detailed information, refer to Table 2 below.
The comparison is based on performance across 29 CEC2017 test functions in 30, 50, and 100 dimensions. To ensure fairness, all algorithms use the same population size (100) and the same number of iterations ($10000\times D/100$, i.e., $100D$ for dimension $D$). The comparison algorithms originate from their original authors and use the recommended optimal parameters. To minimize the randomness of the experimental outcomes, each algorithm is executed 30 times, and the average fitness value is taken as the evaluation metric.
The comparison results of IDISO with the other metaheuristic algorithms are shown in Table 3; the Supplementary data provide a detailed display of the comparative algorithms' performance on the twenty-nine test functions in all three dimensions. The results show that, among the 17 compared optimization algorithms, DISO's ranking declines as the dimension increases, dropping from third to fourth place. In the 50- and 100-dimensional cases, DISO's average performance on the CEC2017 test functions is surpassed by the HHO algorithm, indicating underperformance in higher dimensions. The improved IDISO algorithm exhibits the best overall average performance across all three test dimensions compared with the other 17 benchmarked algorithms. However, the sign test results indicate that, although IDISO's overall performance is the strongest, it is not superior to the other algorithms on every problem; different optimization algorithms may outperform IDISO on specific issues, which aligns with the "No Free Lunch" theorem.
As shown in Figure 2, the accuracy of the F1 function is significantly improved across all three dimensions. This reflects that the improvement strategy has enhanced the algorithm’s global search ability in unimodal function search spaces. As a result, it becomes more effective in discovering global optima. There is a significant improvement in the accuracy of the F4 function at 100 dimensions. The improvement strategy enhances the algorithm’s exploitation ability in high-dimensional multimodal function search spaces. It also enhances the ability to escape from local optima. There is a significant improvement in the accuracy of the F12 function. This demonstrates the enhanced robustness of the improved algorithm in balancing global exploration and local exploitation. It can effectively find the global optimum within the complex search space of composite functions.
The radar charts of the top four ranked algorithms (IDISO, DISO, GWO, HHO) across three dimensions of the CEC2017 test functions are shown in Figure 3. The radar charts intuitively demonstrate that the IDISO algorithm has the smallest shaded area in all three dimensions, indicating its best performance in the rankings of the test functions. Particularly, in the high-dimensional space of multimodal problems, the IDISO algorithm exhibits better characteristics than in low-dimensional space, suggesting that the improvement methods effectively enhance the performance of DISO in high-dimensional space. Additionally, the radar charts show that the IDISO algorithm has the best stability among all the compared algorithms, indicating that IDISO can be widely applied to various types of problems, especially high-dimensional problems.

4. Real-World Engineering Problem Results and Discussion

Nine renowned real-world engineering problems are utilized to test the improved algorithm [59]. The purpose of this is to evaluate its applicability in practical engineering scenarios. A summary of the problems is provided in Table 4. Detailed descriptions can be found at https://github.com/Jingyuango/idiso (accessed on 25 June 2024). The algorithms ranked in the top four across the three dimensions of the CEC2017 test functions are selected for comparison. To ensure the fairness of the experiments, all algorithms use the same population size (100) and the same number of iterations (1000). Each algorithm is executed 30 times. The average fitness value is used as the evaluation metric.
The test results are presented in Table 5. Detailed test results can be found in the Supplementary Materials. The IDISO algorithm ranks first in overall performance among the nine real-world engineering problems. It is the algorithm with the highest accuracy among all compared algorithms in four problems, which is close to the theoretical optimal value and demonstrates the broad applicability of IDISO in real-world engineering applications.
The radar chart of the top four ranked algorithms (IDISO, DISO, GWO, HHO) across the nine engineering problems is shown in Figure 4. The radar chart intuitively demonstrates that the IDISO algorithm has the smallest shaded area among the four compared algorithms, indicating its superior performance. Notably, IDISO achieved the best ranking in five problems and did not rank last in any problem, demonstrating its strong robustness.

5. IDISO Algorithm Analysis

5.1. Convergence Analysis

The convergence of the IDISO optimization algorithm compared with other algorithms is illustrated in Figure 2. The findings demonstrate that this approach exhibits superior convergence capabilities relative to other optimization methods. In the CEC2017 benchmark suite, IDISO consistently demonstrated rapid convergence to the global optimum across three different dimensions for four types of problems. This performance showcases its capability to quickly and accurately locate global optima in high-dimensional and complex search spaces and the algorithm’s broad applicability.

5.2. Exploration and Exploitation Analysis

Merely observing the convergence curves and outcomes of a population does not fully reveal the actual modifications in the algorithm. Therefore, the ratio of exploration to exploitation in the IDISO and DISO populations during the iterations is computed, which facilitates a more accurate and comprehensive understanding of the population's behavior. To demonstrate the improvements in exploration and exploitation [60], the multimodal function F4 and the hybrid function F12 from the CEC2017 test suite are selected as 30-dimensional examples, with the experimental settings consistent with those described above.
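A common way to quantify these percentages, assumed here as a sketch consistent with the diversity-based measure usually cited for such plots, derives exploration% from the population's dimension-wise diversity relative to its maximum over the run, with exploitation% as the complement:

```python
import numpy as np

def diversity(pop):
    """Dimension-wise diversity: mean |x_ij - median_j| over the population."""
    return float(np.mean(np.abs(pop - np.median(pop, axis=0))))

def exploration_exploitation(div_history):
    """Convert a per-iteration diversity history into XPL%/XPT% curves."""
    div = np.asarray(div_history, dtype=float)
    xpl = 100.0 * div / div.max()   # exploration share
    return xpl, 100.0 - xpl         # exploitation is the complement
```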
Figure 5 displays the exploration and exploitation curves for IDISO and DISO. The graph illustrates that in multimodal and hybrid function problems, the proportion of exploration by IDISO individuals is higher compared to DISO. The improvements effectively prevent individuals within the population from becoming trapped in local optima, enhancing DISO’s ability to escape local optima. Throughout the convergence process, exploration and exploitation oscillate within a certain range, achieving a dynamic balance. This demonstrates IDISO’s effective balancing of global exploration and local exploitation.

5.3. Parameter Sensitivity Analysis

The IDISO algorithm adds the Cauchy mutation mechanism and two nonlinear factors to the original algorithm, introducing four main parameters: $C_{max}$, $C_{min}$, $a$, and $\gamma$. The following subsections discuss the sensitivity of the IDISO algorithm to these parameters.
$C_{max}$ and $C_{min}$ are the main determinants of the nonlinear shrinking factor $C(t)$. Since the primary convergence of the IDISO algorithm takes place in the initial stage of the iterative process, the analysis is conducted in the first part of the convergence process, specifically at $t/T=1/100$. Because the nonlinear shrinking factor primarily balances the search behavior, the F1 function in 30 dimensions is used for illustration. The selected values for $C_{max}$ are $[10, 15, e^3, e^4, 100, e^5, 200, 300]$, and the values for $C_{min}$ are $[2, e, 3, 4, 5, 6, e^2, 7]$. When $C_{max}$ is varied, $C_{min}=e^2$; when $C_{min}$ is varied, $C_{max}=e^3$. The findings, presented in Figure 6a, show that the value of $C(t)$ in the initial stages of the iteration decreases as $C_{max}$ increases, indicating a slower rate of change for $C(t)$. Increasing the step size during the initial position update allows more iterations to contribute, but it may also cause the algorithm to jump over the local optimum, degrading the final results. In the initial iteration phase, when the value of $C_{min}$ decreases, $C(t)$ follows a pattern similar to that for $C_{max}$ and is highly responsive to changes in these two variables, so the range of convergence values can be wide. This study reveals that the IDISO algorithm performs best, with respect to the influence of $C(t)$, when $C_{max}=e^3$ and $C_{min}=e^2$.
The Cauchy mutation mechanism is influenced by the parameters $a$ and $\gamma$ and operates during the final stage of the process, enabling the water to escape local optima and preventing the population from becoming trapped. $a$ represents the midpoint (location) of the Cauchy distribution and $\gamma$ represents its width (scale). Using the F9 function in 30 dimensions as an example, the Cauchy mutation mechanism is added to counter DISO's inclination to converge to a local optimum. The selected values for $a$ are $[-100, -50, -25, 0, 25, 50, 100]$, and the values for $\gamma$ are $[0.5, 0.75, 1, 1.5, 2, 3, 4]$. When $a$ is varied, $\gamma=1$; when $\gamma$ is varied, $a=0$. The findings presented in Figure 6b demonstrate that performance is more sensitive to alterations in $\gamma$ than in the central location $a$. Optimal performance is achieved when $a$ and $\gamma$ are set to 0 and 1, respectively.

5.4. IDISO Complexity Analysis

As shown in Figure 1, the improved algorithm mainly adds the new factor to the two position-update formulas and applies the Cauchy mutation mechanism before eliminating water bodies. These modifications leave the time complexity unchanged: both DISO and IDISO run in O(N × T). Therefore, IDISO significantly enhances the accuracy of the DISO algorithm on high-dimensional complex functions without increasing the algorithm's running time complexity.
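To see why the complexity is unchanged: both modifications act inside the existing per-iteration loop over the N water bodies, adding only constant work per individual. The skeleton below is a hypothetical outline (the actual DISO update rules are not reproduced here), with `update_position`, `mutate`, and `evaluate` standing in for the algorithm's real operators.

```python
def idiso_outline(N, T, update_position, evaluate, mutate):
    """Hypothetical IDISO main loop: T iterations over N individuals,
    i.e. O(N * T) fitness evaluations -- the same order as DISO, since the
    shrinking factor and Cauchy mutation add only O(1) work per step."""
    evaluations = 0
    for t in range(T):                 # T iterations
        for i in range(N):             # N water bodies
            update_position(i, t)      # position update scaled by C(t)
            mutate(i, t)               # Cauchy mutation before elimination
            evaluate(i)                # one fitness evaluation
            evaluations += 1
    return evaluations
```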

5.5. IDISO Ablation Study

In Figure 7, IDISO1 denotes the DISO algorithm with the nonlinear shrinking factor added, and IDISO2 denotes the DISO algorithm with the Cauchy mutation mechanism added. From left to right, the figure displays the CEC test functions F1, F4, F12, and F25, representing unimodal, multimodal, hybrid, and composite problems, respectively. From top to bottom, the figure shows results for dimensions 30, 50, and 100. The performance of the DISO algorithm is inferior to that of the improved variants across all four problem types. In unimodal and hybrid problems, the differences among the three methods are significant and become more pronounced as the dimensionality increases. In multimodal and composite problems, the accuracy differences among the three methods are not substantial, with distinctions existing primarily in convergence speed. This indicates that the combination of the nonlinear shrinking factor and the Cauchy mutation mechanism improves the performance of the DISO algorithm on high-dimensional unimodal and hybrid problems and accelerates convergence across all problem types.
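The ablation variants in Figure 7 amount to toggling the two mechanisms independently. A minimal way to express the four configurations (with hypothetical flag names, since the implementation's option names are not given here) is:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VariantConfig:
    """Which of the two proposed mechanisms a variant enables."""
    nonlinear_shrinking: bool
    cauchy_mutation: bool

# DISO is the unmodified baseline; IDISO combines both mechanisms.
ABLATION = {
    "DISO":   VariantConfig(False, False),
    "IDISO1": VariantConfig(True,  False),  # + nonlinear shrinking factor
    "IDISO2": VariantConfig(False, True),   # + Cauchy mutation mechanism
    "IDISO":  VariantConfig(True,  True),
}
```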

6. IDISO-XGBoost Model

The selected hyperparameters are shown in Table 6, and the tuning results of IDISO-XGBoost and DISO-XGBoost are compared in Figure 8. The results indicate that the IDISO-XGBoost model achieved an average prediction performance of 0.95, a 4% improvement over the average prediction performance of the DISO-XGBoost model. Compared to the study by Chen et al., which used grid search to optimize an RF model [23], this work improves the test-set R² for elemental analysis prediction from 0.92 to 0.96, a 4.4% increase. This advance offers a novel method for predicting the elemental analysis of hydrochar and, in turn, guiding the design of experimental protocols.
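For the tuning itself, each IDISO position vector must be decoded into XGBoost arguments within the bounds of Table 6. The mapping below is a sketch: the correspondence between the five tuned values and XGBoost's parameter names (e.g. `reg_lambda` for the regular term coefficient) is our assumption, not stated in the paper.

```python
def decode_params(position):
    """Clip a 5-dimensional IDISO position to the Table 6 bounds and map it
    to XGBoost-style keyword arguments (name mapping is an assumption)."""
    lower = [0.01, 100, 3, 0.01, 0.01]
    upper = [1.00, 1000, 10, 0.30, 0.05]
    x = [min(max(v, lo), hi) for v, lo, hi in zip(position, lower, upper)]
    return {
        "learning_rate": x[0],
        "n_estimators": int(round(x[1])),  # number of trees
        "max_depth": int(round(x[2])),     # depth of the tree
        "subsample": x[3],                 # proportion of subsamples sampled
        "reg_lambda": x[4],                # regular term coefficient
    }
```

The fitness of a candidate position is then the validation R² of an XGBoost model trained with these arguments, which IDISO maximizes.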
Figure 9 displays the correlation between the actual and predicted values in both the validation and test sets. The green auxiliary line marks where predicted values equal true values. The figure reveals that only a few points lie far from the auxiliary line in the C, N, and O prediction models, while all points remain near the auxiliary line in the H prediction model. These results demonstrate that the IDISO-XGBoost model can accurately predict the elemental properties of hydrochar with good generalization and robustness, offering a valuable new approach for predicting such properties.

7. Conclusions

In this paper, we propose the Improved Dujiangyan Irrigation System Optimization (IDISO) algorithm by introducing the nonlinear shrinking factor and the Cauchy mutation mechanism to enhance the original DISO algorithm. To validate the optimization capabilities and convergence of the IDISO algorithm, we employed 87 CEC2017 benchmark functions of varying dimensions and compared its performance with 17 state-of-the-art metaheuristic algorithms. Additionally, we assessed the search performance of the IDISO algorithm in unknown spaces across nine real-world engineering problems. The results demonstrate that the IDISO algorithm achieved the fastest convergence speed and the highest convergence accuracy compared to the 17 powerful metaheuristic algorithms, significantly outperforming the DISO algorithm. In both the CEC2017 benchmark functions and the nine real-world engineering problems, the IDISO algorithm ranked first in overall performance.
Finally, we utilized the IDISO and DISO algorithms to fine-tune the parameters of the XGBoost model, aiming to construct a robust and generalizable prediction model for the characteristics of hydrochar elements. The results indicate that the IDISO-XGBoost model achieved an average prediction performance of 0.95, representing a 4% improvement over the average prediction performance of the DISO-XGBoost model. This novel approach to predicting hydrochar element characteristics highlights the significant potential and value of the IDISO algorithm in practical applications.

Supplementary Materials

The following supporting information can be downloaded at: https://github.com/Jingyuango/idiso, (accessed on 25 June 2024).

Author Contributions

Conceptualization, D.Z.; Methodology, J.S. and Z.S.; Software, J.S.; Validation, J.W.; Investigation, Z.S., Z.Z. and W.H.; Data curation, J.W. and Z.H.; Writing—original draft, J.S.; Writing—review & editing, D.Z. and Y.W. All authors have read and agreed to the published version of the manuscript.

Funding

This project is supported by the Inner Mongolia Autonomous Region Science and Technology Plan Project (2023YFDZ0031), the Fundamental Research Fund for Inner Mongolia University of Science & Technology (2023QNJS129, 2023QNJS130, 2024RCTD008), the Inner Mongolia Natural Science Foundation (2024QN05058), and the Open Research Project of the State Key Laboratory of Baiyunobo Rare Earth Resource Researches and Comprehensive Utilization (2021H2275).

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

Our profound gratitude is extended to ChatGPT for its assistance: it was used during the refinement process to improve the fluency of the English expression. However, it must be underscored that the final content and mode of expression in this document are the authors' own choices, reflecting their perspectives and style.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Xiu, S.; Shahbazi, A. Bio-oil production and upgrading research: A review. Renew. Sustain. Energy Rev. 2012, 16, 4406–4414. [Google Scholar] [CrossRef]
  2. Lee, D.; Nam, H.; Seo, M.W.; Lee, S.H.; Tokmurzin, D.; Wang, S.; Park, Y.-K. Recent progress in the catalytic thermochemical conversion process of biomass for biofuels. Chem. Eng. J. 2022, 447, 137501. [Google Scholar] [CrossRef]
  3. Sikarwar, V.S.; Zhao, M.; Fennell, P.S.; Shah, N.; Anthony, E.J. Progress in biofuel production from gasification. Prog. Energy Combust. Sci. 2017, 61, 189–248. [Google Scholar] [CrossRef]
  4. Mäkelä, M.; Benavente, V.; Fullana, A. Hydrothermal carbonization of industrial mixed sludge from a pulp and paper mill. Bioresour. Technol. 2016, 200, 444–450. [Google Scholar] [CrossRef] [PubMed]
  5. Hansen, L.J.; Fendt, S.; Spliethoff, H. Impact of hydrothermal carbonization on combustion properties of residual biomass. Biomass Convers. Biorefinery 2022, 12, 2541–2552. [Google Scholar] [CrossRef]
  6. Samaksaman, U.; Pattaraprakorn, W.; Neramittagapong, A.; Kanchanatip, E. Solid fuel production from macadamia nut shell: Effect of hydrothermal carbonization conditions on fuel characteristics. Biomass Convers. Biorefinery 2021, 13, 2225–2232. [Google Scholar] [CrossRef]
  7. Wong, K.I.; Wong, P.K.; Cheung, C.S.; Vong, C.M. Modelling of diesel engine performance using advanced machine learning methods under scarce and exponential data set. Appl. Soft Comput. 2013, 13, 4428–4441. [Google Scholar] [CrossRef]
  8. Jeon, P.R.; Moon, J.-H.; Olanrewaju, O.N.; Lee, S.H.; Ling, J.L.J.; You, S.; Park, Y.-K. Recent advances and future prospects of thermochemical biofuel conversion processes with machine learning. Chem. Eng. J. 2023, 471, 144503. [Google Scholar] [CrossRef]
  9. Rasam, S.; Talebkeikhah, F.; Talebkeikhah, M.; Salimi, A.; Moraveji, M.K. Physico-chemical properties prediction of hydrochar in macroalgae Sargassum horneri hydrothermal carbonisation. Int. J. Environ. Anal. Chem. 2021, 101, 2297–2318. [Google Scholar] [CrossRef]
  10. Kardani, N.; Hedayati Marzbali, M.; Shah, K.; Zhou, A. Machine learning prediction of the conversion of lignocellulosic biomass during hydrothermal carbonization. Biofuels 2022, 13, 703–715. [Google Scholar] [CrossRef]
  11. Li, J.; Pan, L.; Suvarna, M.; Tong, Y.W.; Wang, X. Fuel properties of hydrochar and pyrochar: Prediction and exploration with machine learning. Appl. Energy 2020, 269, 115166. [Google Scholar] [CrossRef]
  12. Li, J.; Zhu, X.; Li, Y.; Tong, Y.W.; Ok, Y.S.; Wang, X. Multi-task prediction and optimization of hydrochar properties from high-moisture municipal solid waste: Application of machine learning on waste-to-resource. J. Clean. Prod. 2021, 278, 123928. [Google Scholar] [CrossRef]
  13. Nguyen, V.G.; Sharma, P.; Ağbulut, Ü.; Le, H.S.; Tran, V.D.; Cao, D.N. Precise prognostics of biochar yield from various biomass sources by Bayesian approach with supervised machine learning and ensemble methods. Int. J. Green Energy 2023, 21, 2180–2204. [Google Scholar] [CrossRef]
  14. Mu, L.; Wang, Z.; Wu, D.; Zhao, L.; Yin, H. Prediction and evaluation of fuel properties of hydrochar from waste solid biomass: Machine learning algorithm based on proposed PSO–NN model. Fuel 2022, 318, 123644. [Google Scholar] [CrossRef]
  15. Li, P.; Du, Z.; Chang, C.; Zhao, S.; Xu, G.; Xu, C.C. Efficient catalytic conversion of waste peanut shells into liquid biofuel: An artificial intelligence approach. Energy Fuels 2020, 34, 1791–1801. [Google Scholar] [CrossRef]
  16. Osaba, E.; Villar-Rodriguez, E.; Del Ser, J.; Nebro, A.J.; Molina, D.; LaTorre, A.; Suganthan, P.N.; Coello, C.A.C.; Herrera, F. A tutorial on the design, experimentation and application of metaheuristic algorithms to real-world optimization problems. Swarm Evol. Comput. 2021, 64, 100888. [Google Scholar] [CrossRef]
  17. Gavrilas, M. Heuristic and metaheuristic optimization techniques with application to power systems. In Proceedings of the 12th WSEAS International Conference on Mathematical Methods and Computational Techniques in Electrical Engineering, Kantaoui, Sousse, Tunisia, 3–6 May 2010; pp. 95–103. [Google Scholar]
  18. Abualigah, L.; Elaziz, M.A.; Khasawneh, A.M.; Alshinwan, M.; Ibrahim, R.A.; Al-Qaness, M.A.; Mirjalili, S.; Sumari, P.; Gandomi, A.H. Meta-heuristic optimization algorithms for solving real-world mechanical engineering design problems: A comprehensive survey, applications, comparative analysis, and results. Neural Comput. Appl. 2022, 34, 4081–4110. [Google Scholar] [CrossRef]
  19. Houssein, E.H.; Hosney, M.E.; Oliva, D.; Mohamed, W.M.; Hassaballah, M. A novel hybrid Harris hawks optimization and support vector machines for drug design and discovery. Comput. Chem. Eng. 2020, 133, 106656. [Google Scholar] [CrossRef]
  20. Turgut, O.E.; Turgut, M.S.; Kırtepe, E. A systematic review of the emerging metaheuristic algorithms on solving complex optimization problems. Neural Comput. Appl. 2023, 35, 14275–14378. [Google Scholar] [CrossRef]
  21. Niu, J.; Ren, C.; Guan, Z.; Cao, Z. Dujiangyan irrigation system optimization (DISO): A novel metaheuristic algorithm for dam safety monitoring. Structures 2023, 54, 399–419. [Google Scholar] [CrossRef]
  22. Blackwell, T.; Kennedy, J.; Poli, R. Particle swarm optimization. Swarm Intell. 2007, 1, 33–57. [Google Scholar]
  23. Chen, C.; Liang, R.; Wang, J.; Ge, Y.; Tao, J.; Yan, B.; Chen, G. Simulation and optimization of co-pyrolysis biochar using data enhanced interpretable machine learning and particle swarm algorithm. Biomass Bioenergy 2024, 182, 107111. [Google Scholar] [CrossRef]
  24. Mirjalili, S.; Mirjalili, S.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  25. Subudhi, U.; Dash, S. Detection and classification of power quality disturbances using GWO ELM. J. Ind. Inf. Integr. 2021, 22, 100204. [Google Scholar] [CrossRef]
  26. Zhao, S.; Zhang, T.; Ma, S.; Chen, M. Dandelion Optimizer: A nature-inspired metaheuristic algorithm for engineering applications. Eng. Appl. Artif. Intell. 2022, 114, 105075. [Google Scholar] [CrossRef]
  27. Saglam, M.; Bektas, Y.; Karaman, O.A. Dandelion Optimizer and Gold Rush Optimizer Algorithm-Based Optimization of Multilevel Inverters. Arab. J. Sci. Eng. 2024, 49, 7029–7052. [Google Scholar] [CrossRef]
  28. Chou, J.-S.; Molla, A. Recent advances in use of bio-inspired jellyfish search algorithm for solving optimization problems. Sci. Rep. 2022, 12, 19157. [Google Scholar] [CrossRef]
  29. Gouda, E.A.; Kotb, M.F.; El-Fergany, A.A. Jellyfish search algorithm for extracting unknown parameters of PEM fuel cell models: Steady-state performance and analysis. Energy 2021, 221, 119836. [Google Scholar] [CrossRef]
  30. Abdel-Basset, M.; El-Shahat, D.; Jameel, M.; Abouhawwash, M. Young’s double-slit experiment optimizer: A novel metaheuristic optimization algorithm for global and constraint optimization problems. Comput. Methods Appl. Mech. Eng. 2023, 403, 115652. [Google Scholar] [CrossRef]
  31. Dong, Y.; Sun, Y.; Liu, Z.; Du, Z.; Wang, J. Predicting dissolved oxygen level using Young’s double-slit experiment optimizer-based weighting model. J. Environ. Manag. 2024, 351, 119807. [Google Scholar] [CrossRef]
  32. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. Starling murmuration optimizer: A novel bio-inspired algorithm for global and engineering optimization. Comput. Methods Appl. Mech. Eng. 2022, 392, 114616. [Google Scholar] [CrossRef]
  33. Chao, X. Optimal boosting method of HPC concrete compressive and tensile strength prediction. Struct. Concr. 2024, 25, 283–302. [Google Scholar] [CrossRef]
  34. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  35. Laskar, N.M.; Guha, K.; Chatterjee, I.; Chanda, S.; Baishnab, K.L.; Paul, P.K. HWPSO: A new hybrid whale-particle swarm optimization algorithm and its application in electronic design optimization problems. Appl. Intell. 2019, 49, 265–291. [Google Scholar] [CrossRef]
  36. Hu, G.; Zheng, Y.; Abualigah, L.; Hussien, A.G. DETDO: An adaptive hybrid dandelion optimizer for engineering optimization. Adv. Eng. Inform. 2023, 57, 102004. [Google Scholar] [CrossRef]
  37. Hu, G.; Guo, Y.; Zhong, J.; Wei, G. IYDSE: Ameliorated Young’s double-slit experiment optimizer for applied mechanics and engineering. Comput. Methods Appl. Mech. Eng. 2023, 412, 116062. [Google Scholar] [CrossRef]
  38. Hu, G.; Wang, J.; Li, M.; Hussien, A.G.; Abbas, M. EJS: Multi-strategy enhanced jellyfish search algorithm for engineering applications. Mathematics 2023, 11, 851. [Google Scholar] [CrossRef]
  39. Hu, G.; Zhong, J.; Wei, G.; Chang, C.-T. DTCSMO: An efficient hybrid starling murmuration optimizer for engineering applications. Comput. Methods Appl. Mech. Eng. 2023, 405, 115878. [Google Scholar] [CrossRef]
  40. Guan, Z.; Ren, C.; Niu, J.; Wang, P.; Shang, Y. Great Wall Construction Algorithm: A novel meta-heuristic algorithm for engineer problems. Expert Syst. Appl. 2023, 233, 120905. [Google Scholar] [CrossRef]
  41. Chen, T.; He, T.; Benesty, M.; Khotilovich, V.; Tang, Y.; Cho, H.; Chen, K.; Mitchell, R.; Cano, I.; Zhou, T. Xgboost: Extreme gradient boosting. R Package Version 0.4-2 2015, 1, 1–4. [Google Scholar]
  42. Grinsztajn, L.; Oyallon, E.; Varoquaux, G. Why do tree-based models still outperform deep learning on typical tabular data? Adv. Neural Inf. Process. Syst. 2022, 35, 507–520. [Google Scholar]
  43. Mohamed, A.W.; Hadi, A.A.; Fattouh, A.M.; Jambi, K.M. LSHADE with semi-parameter adaptation hybrid with CMA-ES for solving CEC 2017 benchmark problems. In Proceedings of the 2017 IEEE Congress on Evolutionary Computation (CEC), Donostia, Spain, 5–8 June 2017; pp. 145–152. [Google Scholar]
  44. Hashim, F.A.; Hussain, K.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W. Archimedes optimization algorithm: A new metaheuristic algorithm for solving optimization problems. Appl. Intell. 2021, 51, 1531–1551. [Google Scholar] [CrossRef]
  45. Hayyolalam, V.; Kazem, A.A.P. Black widow optimization algorithm: A novel meta-heuristic approach for solving engineering optimization problems. Eng. Appl. Artif. Intell. 2020, 87, 103249. [Google Scholar] [CrossRef]
  46. Bairwa, A.K.; Joshi, S.; Singh, D. Dingo optimizer: A nature-inspired metaheuristic approach for engineering problems. Math. Probl. Eng. 2021, 2021, 2571863. [Google Scholar] [CrossRef]
  47. Ghafil, H.N.; Jármai, K. Dynamic differential annealed optimization: New metaheuristic optimization algorithm for engineering applications. Appl. Soft Comput. 2020, 93, 106392. [Google Scholar] [CrossRef]
  48. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  49. Peraza-Vázquez, H.; Peña-Delgado, A.; Ranjan, P.; Barde, C.; Choubey, A.; Morales-Cepeda, A.B. A bio-inspired method for mathematical optimization inspired by arachnida salticidade. Mathematics 2021, 10, 102. [Google Scholar] [CrossRef]
  50. Abualigah, L.; Abd Elaziz, M.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158. [Google Scholar] [CrossRef]
  51. Dhiman, G.; Garg, M.; Nagar, A.; Kumar, V.; Dehghani, M. A novel algorithm for global optimization: Rat swarm optimizer. J. Ambient Intell. Humaniz. Comput. 2021, 12, 8457–8482. [Google Scholar] [CrossRef]
  52. Mirjalili, S. SCA: A sine cosine algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  53. Shadravan, S.; Naji, H.R.; Bardsiri, V.K. The Sailfish Optimizer: A novel nature-inspired metaheuristic algorithm for solving constrained engineering optimization problems. Eng. Appl. Artif. Intell. 2019, 80, 20–34. [Google Scholar] [CrossRef]
  54. Givi, H.; Hubalovska, M. Skill Optimization Algorithm: A New Human-Based Metaheuristic Technique. Comput. Mater. Contin. 2023, 74, 179–202. [Google Scholar] [CrossRef]
  55. Zhao, W.; Wang, L.; Zhang, Z. Supply-demand-based optimization: A novel economics-inspired algorithm for global optimization. IEEE Access 2019, 7, 73182–73206. [Google Scholar] [CrossRef]
  56. Dhiman, G.; Kumar, V. Spotted hyena optimizer: A novel bio-inspired based metaheuristic technique for engineering applications. Adv. Eng. Softw. 2017, 114, 48–70. [Google Scholar] [CrossRef]
  57. Kaur, S.; Awasthi, L.K.; Sangal, A.; Dhiman, G. Tunicate Swarm Algorithm: A new bio-inspired based metaheuristic paradigm for global optimization. Eng. Appl. Artif. Intell. 2020, 90, 103541. [Google Scholar] [CrossRef]
  58. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S. Transient search optimization: A new meta-heuristic optimization algorithm. Appl. Intell. 2020, 50, 3926–3941. [Google Scholar] [CrossRef]
  59. Talatahari, S.; Bayzidi, H.; Saraee, M. Social network search for global optimization. IEEE Access 2021, 9, 92815–92863. [Google Scholar] [CrossRef]
  60. Hussain, K.; Salleh, M.N.M.; Cheng, S.; Shi, Y. On the exploration and exploitation in popular swarm-based metaheuristic algorithms. Neural Comput. Appl. 2019, 31, 7665–7683. [Google Scholar] [CrossRef]
Figure 1. Flowchart of the proposed IDISO algorithm.
Figure 2. Partial results display of CEC2017 test functions.
Figure 3. Radar chart of the top four algorithms in the overall ranking of the CEC2017 test.
Figure 4. Radar chart of rankings for engineering problems. [Speed reducer (P1), Multiple disk clutch brake design problems (P2), Piston lever (P3), Car side impact design (P4), Cantilever beam (P5), Minimize I-beam vertical deflection (P6), Tubular column design (P7), Welded beam design (P8), Reinforced concrete beam design (P9)].
Figure 5. Exploration and exploitation analysis.
Figure 6. (a) C_max and C_min sensitivity analysis. (b) a and γ sensitivity analysis.
Figure 7. IDISO ablation study.
Figure 8. IDISO-XGBoost vs. DISO-XGBoost.
Figure 9. Relationship between true and predicted values for validation and test sets.
Table 1. CEC-2017 test functions.
Function Classification | Function Name | Optimal Value
Unimodal Function | Shifted and Rotated Bent Cigar Function (F1) | 100
 | Shifted and Rotated Zakharov Function (F3) | 300
Multimodal Function | Shifted and Rotated Rosenbrock's Function (F4) | 400
 | Shifted and Rotated Rastrigin's Function (F5) | 500
 | Shifted and Rotated Expanded Scaffer's F6 Function (F6) | 600
 | Shifted and Rotated Lunacek Bi_Rastrigin Function (F7) | 700
 | Shifted and Rotated Non-Continuous Rastrigin's Function (F8) | 800
 | Shifted and Rotated Levy Function (F9) | 900
 | Shifted and Rotated Schwefel's Function (F10) | 1000
Hybrid Function | Hybrid Function 1 (N = 3) (F11) | 1100
 | Hybrid Function 2 (N = 3) (F12) | 1200
 | Hybrid Function 3 (N = 3) (F13) | 1300
 | Hybrid Function 4 (N = 4) (F14) | 1400
 | Hybrid Function 5 (N = 4) (F15) | 1500
 | Hybrid Function 6 (N = 4) (F16) | 1600
 | Hybrid Function 6 (N = 5) (F17) | 1700
 | Hybrid Function 6 (N = 5) (F18) | 1800
 | Hybrid Function 6 (N = 5) (F19) | 1900
 | Hybrid Function 6 (N = 6) (F20) | 2000
Composite Function | Composition Function 1 (N = 3) (F21) | 2100
 | Composition Function 2 (N = 3) (F22) | 2200
 | Composition Function 3 (N = 4) (F23) | 2300
 | Composition Function 4 (N = 4) (F24) | 2400
 | Composition Function 5 (N = 5) (F25) | 2500
 | Composition Function 6 (N = 5) (F26) | 2600
 | Composition Function 7 (N = 6) (F27) | 2700
 | Composition Function 8 (N = 6) (F28) | 2800
 | Composition Function 9 (N = 3) (F29) | 2900
 | Composition Function 10 (N = 3) (F30) | 3000
Search Range: [−100, 100]^D
Table 2. Comparative algorithms.
Algorithm Name | Parameters
Arithmetic Optimization Algorithm (AOA) [44] | MOP_Max = 1; MOP_Min = 0.2; Alpha = 5; Mu = 0.499
Black Widow Optimization Algorithm (BWOA) [45] | −1 < β < 1; 0.4 < m < 0.9
Dingo Optimization Algorithm (DOA) [46] | −2 < beta < 2
Dynamic Differential Annealed Optimization (DDAO) [47] | MaxSubIt = 1000; T0 = 2000; ALPHA = 0.995
Grey Wolf Optimizer (GWO) [24] | None
Harris Hawks Optimization (HHO) [48] | −1 < E0 < 1
Jumping Spider Optimization Algorithm (JSOA) [49] | gravity = 9.80665; vo = 100
Reptile Search Algorithm (RSA) [50] | Alpha = 0.1; Beta = 0.005
Rat Swarm Optimizer (RSO) [51] | x = 1; y = 5
Sine Cosine Algorithm (SCA) [52] | a = 2
Sailfish Optimization Algorithm (SFO) [53] | A = 4; e = 0.001; SF_percent = 0.3
Skill Optimization Algorithm (SOA) [54] | None
Supply–Demand-Based Optimization (SDO) [55] | None
Spotted Hyena Optimizer (SHO) [56] | None
Tunicate Swarm Algorithm (TSA) [57] | P_min = 1; P_max = 4
Transient Search Optimization (TSO) [58] | K = 1
Dujiangyan Irrigation System Optimization (DISO) [21] | P = 0.85; Q = 2.5
Table 3. (a) CEC2017 test function statistical analysis result and (b) non-parametric statistics result.
(a)
Algorithm | Average D = 30 | Final D = 30 | Average D = 50 | Final D = 50 | Average D = 100 | Final D = 100
IDISO | 2.41 | 1 | 2.17 | 1 | 2.06 | 1
DISO | 3.37 | 3 | 3.55 | 4 | 3.62 | 4
AOA | 9.51 | 9 | 10.10 | 11 | 11.44 | 11
BWOA | 8.65 | 8 | 9.62 | 9 | 10.72 | 10
DOA | 5.37 | 5 | 5.93 | 5 | 7.13 | 7
DDAO | 11 | 12 | 11.31 | 12 | 11.89 | 12
GWO | 2.44 | 2 | 2.24 | 2 | 2.34 | 2
HHO | 3.86 | 4 | 3.27 | 3 | 2.79 | 3
JSOA | 13.68 | 14 | 14.06 | 15 | 14.17 | 15
RSA | 14.2 | 15 | 13.82 | 14 | 12.44 | 13
RSO | 10.58 | 11 | 9.82 | 10 | 9 | 9
SCA | 6.82 | 7 | 6.68 | 6 | 6.65 | 5
SFO | 15.58 | 16.5 | 15.72 | 16 | 16.06 | 16
SOA | 6.44 | 6 | 7 | 7 | 6.93 | 6
SDO | 15.58 | 16.5 | 15.86 | 17 | 16.24 | 17
SHO | 17.68 | 18 | 17.41 | 18 | 17.24 | 18
TSA | 10.34 | 10 | 9.34 | 8 | 7.68 | 8
TSO | 13.37 | 13 | 13.03 | 13 | 12.51 | 14
(b) Win/tie/loss record of IDISO against each algorithm at D = 30, 50, and 100.
Algorithm | 30 | 50 | 100 | Algorithm | 30 | 50 | 100
AOA | 28/1/0 | 29/0/0 | 28/1/0 | BWOA | 29/0/0 | 29/0/0 | 29/0/0
DOA | 28/1/0 | 28/1/0 | 29/0/0 | DDAO | 29/0/0 | 29/0/0 | 29/0/0
GWO | 12/17/0 | 13/16/0 | 13/16/0 | HHO | 23/6/0 | 23/6/0 | 23/6/0
JSOA | 29/0/0 | 29/0/0 | 29/0/0 | RSA | 29/0/0 | 29/0/0 | 29/0/0
RSO | 29/0/0 | 29/0/0 | 29/0/0 | SCA | 28/1/0 | 29/0/0 | 29/0/0
SFO | 29/0/0 | 29/0/0 | 29/0/0 | SOA | 26/3/0 | 27/2/0 | 27/2/0
SDO | 29/0/0 | 29/0/0 | 29/0/0 | SHO | 29/0/0 | 29/0/0 | 29/0/0
TSA | 29/0/0 | 29/0/0 | 29/0/0 | TSO | 29/0/0 | 29/0/0 | 29/0/0
DISO | 17/12/0 | 20/9/0 | 23/6/0
Table 4. Engineering problem description.
Field | Problem | D | Constraints | Fit_best
Mechanical Engineering | Speed reducer (P1) | 7 | 11 | 2.99442 × 10³
 | Multiple disk clutch brake design problems (P2) | 5 | 8 | 3.13660 × 10⁻¹
 | Piston lever (P3) | 4 | 4 | 8.41270
 | Car side impact design (P4) | 11 | 10 | 2.28430 × 10¹
Civil Engineering | Cantilever beam (P5) | 5 | 1 | 1.33996
 | Minimize I-beam vertical deflection (P6) | 4 | 2 | 1.30700 × 10⁻²
 | Tubular column design (P7) | 2 | 6 | 2.64864 × 10¹
 | Welded beam design (P8) | 4 | 7 | 1.72485
 | Reinforced concrete beam design (P9) | 3 | 2 | 3.59208 × 10²
Table 5. Real-world engineering problem rankings. [Speed reducer (P1), Multiple disk clutch brake design problems (P2), Piston lever (P3), Car side impact design (P4), Cantilever beam (P5), Minimize I-beam vertical deflection (P6), Tubular column design (P7), Welded beam design (P8), Reinforced concrete beam design (P9)].
Algorithm | P1 | P2 | P3 | P4 | P5 | P6 | P7 | P8 | P9 | Average | Final
IDISO | 1 | 2.5 | 1 | 2 | 3 | 1.5 | 1 | 3 | 1 | 1.7778 | 1
DISO | 2 | 2.5 | 2 | 1 | 4 | 1.5 | 3 | 2 | 3 | 2.3333 | 3
GWO | 3 | 1 | 3 | 3 | 1 | 3 | 2 | 1 | 3 | 2.2222 | 2
HHO | 4 | 4 | 4 | 4 | 2 | 4 | 4 | 4 | 3 | 3.6667 | 4
Table 6. Target feature hyperparameter values.
Model | Learning Rate | Number of Trees | Depth of the Tree | Proportion of Subsamples Sampled | Regular Term Coefficient | Val_R² | Test_R²
Upper and lower bounds | [0.01–1] | [100–1000] | [3–10] | [0.01–0.3] | [0.01–0.05] | – | –
IDISO_C | 2.9846157 × 10⁻¹ | 655 | 10 | 3.8879849 × 10⁻¹ | 3.3397663 × 10⁻² | 0.97 | 0.95
DISO_C | 2.25181533 × 10⁻¹ | 549 | 9 | 1.92181424 × 10⁻¹ | 4.15709607 × 10⁻² | 0.96 | 0.92
IDISO_H | 2.95662070 × 10⁻¹ | 407 | 3 | 2.53441246 × 10⁻¹ | 4.78747122 × 10⁻² | 0.97 | 0.96
DISO_H | 4.68936589 × 10⁻¹ | 233 | 6 | 2.15128044 × 10⁻¹ | 2.02117095 × 10⁻² | 0.95 | 0.93
IDISO_O | 7.053795 × 10⁻² | 659 | 6 | 5.638555 × 10⁻¹ | 3.864307 × 10⁻² | 0.97 | 0.96
DISO_O | 4.65478169 × 10⁻¹ | 569 | 8 | 2.33724901 × 10⁻¹ | 1.24724460 × 10⁻² | 0.93 | 0.91
IDISO_N | 8.1734951 × 10⁻² | 533 | 8 | 5.5392384 × 10⁻¹ | 3.5553879 × 10⁻² | 0.95 | 0.94
DISO_N | 7.38914069 × 10⁻¹ | 997 | 8 | 2.26746118 × 10⁻¹ | 4.19970431 × 10⁻² | 0.90 | 0.89
Share and Cite

MDPI and ACS Style

Shi, J.; Zhang, D.; Sui, Z.; Wu, J.; Zhang, Z.; Hu, W.; Huo, Z.; Wu, Y. Improved Dujiangyan Irrigation System Optimization (IDISO): A Novel Metaheuristic Algorithm for Hydrochar Characteristics. Processes 2024, 12, 1321. https://doi.org/10.3390/pr12071321
