Article

Prediction Model of Coal Gas Permeability Based on Improved DBO Optimized BP Neural Network

1 College of Mechanical Engineering and Automation, Liaoning University of Technology, Jinzhou 121001, China
2 School of Coal Engineering, Shanxi Datong University, Datong 037000, China
3 China Safety Science Journal Editorial Department, China Occupational Safety and Health Association, Beijing 100011, China
* Authors to whom correspondence should be addressed.
Sensors 2024, 24(9), 2873; https://doi.org/10.3390/s24092873
Submission received: 20 March 2024 / Revised: 28 April 2024 / Accepted: 29 April 2024 / Published: 30 April 2024
(This article belongs to the Section Sensor Networks)

Abstract

Accurate measurement of coal gas permeability helps prevent coal gas safety accidents effectively. To predict permeability more accurately, we propose the IDBO-BPNN coal gas permeability prediction model, which combines an Improved Dung Beetle Optimizer (IDBO) with a BP neural network (BPNN). First, Sine chaotic mapping, the Osprey optimization algorithm, and an adaptive T-distribution dynamic selection strategy are integrated into the DBO algorithm to improve its global search capability. The resulting IDBO is then used to optimize the weights and thresholds in the BPNN, which enhances its prediction accuracy and mitigates the risk of overfitting to some extent. Second, effective stress, gas pressure, temperature, and compressive strength are selected as the coupled influencing factors of gas permeability, and the SPSS 27 software is used to analyze the correlation among these indicators through the Pearson correlation coefficient matrix. Kernel Principal Component Analysis (KPCA) is then employed to extract principal components from the original data, which serve as the model input. The prediction results of the IDBO-BPNN model are compared with those of the PSO-BPNN, PSO-LSSVM, PSO-SVM, MPA-BPNN, WOA-SVM, BES-SVM, and DBO-BPNN models. This comparison assesses both the capability of KPCA to enhance the accuracy of model predictions and the performance of the IDBO-BPNN model. Finally, the IDBO-BPNN model is tested using data from a coal mine in Shanxi. The results indicate that the predicted outcomes closely align with the actual values, confirming the reliability and stability of the model. Therefore, the IDBO-BPNN model is better suited for predicting coal gas permeability.

1. Introduction

Coal mine gas accidents are a significant concern in the global coal mining safety field, posing a serious threat to both coal production and the safety of workers’ lives [1,2]. Coal gas permeability refers to the ability of gas to transmit through a unit area of coal within a unit of time. It is one of the key parameters for evaluating the potential release of gas from coal reservoirs [3,4]. However, accurately predicting the gas permeability of coal remains a challenging problem due to the heterogeneity and complex geological structure of coal.
Currently, both domestic and international scholars are primarily focused on studying the factors that influence changes in gas permeability [5,6]. Li Bobo et al. [7] conducted research on coal samples from the Liupanshui mining area in Guizhou. They applied the theory of effective stress to conduct seepage tests on coal and rock to investigate the impact of pore pressure changes on the characteristics of coal and rock infiltration. Gong Weidong et al. [8] utilized a triaxial penetration device to conduct tests and concluded that the gas permeability of coal is closely associated with factors such as effective stress, gas pressure, and compressive strength of coal. In recent years, with the advancement of science and technology, machine learning and deep learning have become widely utilized as emerging methods for prediction [9,10]. For instance, Tang Guoshui et al. [11] employed an enhanced Artificial Bee Colony algorithm (ABC) to optimize the penalty parameter C and kernel parameter γ of the Support Vector Machine (SVM). They developed a permeability prediction model for gas-bearing coal based on ABC-SVM. The findings demonstrate strong generalization ability and provide a new perspective for studying the permeability of coal-bearing gas. Shao Liangshan et al. [12] utilized Particle Swarm Optimization (PSO) to optimize the hyperparameters of the Least Squares Support Vector Machine (LSSVM). They developed a gas permeability prediction model called PSO-LSSVM and compared its predictive performance with that of the BP Neural Network (BPNN) and SVM to enhance the accuracy of predictions. Xie Lirong et al. [13] utilized Learning Vector Quantization (LVQ) to classify and identify sample parameters. They then optimized the weights and thresholds of the BPNN using an enhanced PSO method. They developed a coal gas permeability prediction model based on LVQ-CPSO-BPNN, which showed the closest predicted values to the actual ones. Wang Pan et al. [14] utilized the Mean Impact Value method (MIV) to analyze the factors influencing coal seam gas permeability. They then developed a more precise prediction model for coal seam gas permeability using BPNN. This research provides valuable insights for the study of coal mine safety production and related fields. Ma Shengxiang et al. [15] employed factor analysis to reduce the dimensionality of the original data, thereby decreasing the number of input-layer nodes in the BPNN structure and simplifying it. This led to an improvement in the accuracy of model predictions. Song Xi et al. [16] utilized the Random Forest (RF) algorithm to construct a model for predicting coal gas permeability. The effectiveness of the model was validated through practical engineering tests, demonstrating its applicability in actual production and its significant role in guiding mine safety production. In summary, previous studies have made some progress in predicting coal gas permeability. However, several shortcomings still need to be addressed. For instance, SVM is only suitable for small sample sizes and lacks a well-established method for selecting its parameters. LSSVM compromises the robustness and sparsity of the standard SVM. BPNN is prone to getting stuck in local optima and has a slow convergence rate, as well as potential “overfitting” issues under certain conditions [17]. RF has limited capability in processing low-dimensional data and may exhibit randomness [18]. PSO is sensitive to parameter selection; although it converges quickly, it easily falls into local optima. None of these methods address the tendency of machine learning models to overfit. It is evident that current methods for predicting coal gas permeability have limitations that prevent them from meeting the requirements for accurate prediction.
In order to enhance the accuracy of predicting the gas permeability of coal bodies, the author improved the Dung Beetle Optimizer (DBO) algorithm to rectify its shortcomings and to prevent the “overfitting” issue of BPNN. The enhanced algorithm, referred to as IDBO, was employed to optimize the weights and thresholds in BPNN, leading to the development of a prediction model for coal gas permeability known as IDBO-BPNN. Subsequently, the performance of this model was compared with that of the PSO-BPNN, PSO-LSSVM, PSO-SVM, MPA-BPNN, WOA-SVM, BES-SVM, and DBO-BPNN models to validate its prediction accuracy. Finally, the model was applied to a coal mine in Shanxi Province to further investigate its practicality and stability. These efforts aim to provide theoretical references to ensure safe and efficient production in coal mines and address related issues.

2. Basic Method Principles

2.1. Influence Factors of Gas Permeability in Coal

The influencing factors of gas permeability in coal bodies are highly complex, encompassing coal rock properties, stress states, temperature, gas pressure, gas content, and geological structure. An increase in effective stress leads to a reduction in the gap between coal bodies, subsequently decreasing gas permeability. Conversely, an increase in gas pressure leads to higher molecular flow speeds and increased gas permeability. Furthermore, higher temperatures lead to faster movement rates of gas molecules and, consequently, higher permeability [19]. The compressive strength plays a crucial role in determining the compactness of particle arrangement within the coal. Greater compressive strength corresponds to smaller particle gaps and lower permeability [20]. These non-linear factors interact with each other to collectively determine changes in gas permeability within coal.

2.2. BP Neural Network

BPNN is a widely used artificial neural network algorithm, typically consisting of three layers of neurons: the input layer, hidden layer, and output layer [21]. The number of nodes in the hidden layer is usually determined by the empirical formula N = \sqrt{N_1 + N_0} + L, where N represents the number of nodes in the hidden layer, N1 represents the number of nodes in the input layer, N0 represents the number of nodes in the output layer, and L is an adjustment constant [22]. The topology is illustrated in Figure 1.
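A minimal Python sketch of this empirical sizing rule is shown below; the value of the adjustment constant L is an illustrative assumption (the configuration in Table 6 ultimately uses 12 hidden-layer nodes).

```python
import math

def hidden_nodes(n_input, n_output, l_const=3):
    """Empirical sizing rule N = sqrt(N1 + N0) + L for the hidden layer.
    l_const (L) is an adjustment constant tuned by trial; 3 is only illustrative."""
    return round(math.sqrt(n_input + n_output)) + l_const

# e.g. four influencing factors as inputs and permeability as the single output
print(hidden_nodes(4, 1))  # sqrt(5) ~ 2.24 -> rounded to 2, plus L = 3 gives 5
```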

2.3. Improved DBO

2.3.1. DBO

DBO is a novel intelligent optimization algorithm inspired by the rolling, dancing, foraging, stealing, and reproduction behaviors of dung beetles. The algorithm categorizes the dung beetle population into four groups: rolling dung beetle, brooder dung beetle, small dung beetle, and thief dung beetle [23]. Further details can be found in the literature [24].

2.3.2. Improved DBO

Overfitting is a common issue encountered by machine learning models. It is more likely to occur when the model is too complex, the training data are contaminated by noise, or the training data are limited. Therefore, the Dung Beetle Optimizer (DBO) is used to optimize the hyperparameters of the BPNN. However, DBO has shortcomings, such as an imbalance between its global exploration and local development abilities, which can lead to local optima and weak global exploration. To enhance the global search capability of DBO and avoid overfitting the BPNN, three strategies are employed to improve DBO. Furthermore, IDBO does not call the fitness function additional times, so its computational complexity remains consistent with that of the original DBO.
(1) The population is initialized using the Sine chaotic mapping strategy [25]. Intelligent optimization algorithms that rely on random generation during initialization suffer from poor ergodicity, which reduces the quality of the initial solutions [26]. Utilizing chaotic mapping to generate the initial values instead distributes the initial solutions more evenly and improves their fitness values. This broader search range helps enhance the accuracy and stability of the algorithm, thereby improving its global search capability. Sine mapping, as a typical representative of chaotic mapping, is simple in form and easy to implement [27]. Its specific formula is as follows:
x_{k+1} = \frac{a}{4} \sin(\pi x_k), \quad a \in (0, 4]
where x_k is the chaotic value at the kth iteration and a is the control parameter.
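The initialization can be sketched in a few lines of NumPy, assuming the decision variables are mapped onto per-dimension bounds lb and ub; the population size of 30 in the usage line mirrors the setting in Section 2.3.3.

```python
import numpy as np

def sine_chaos_init(pop_size, dim, lb, ub, a=4.0, seed=0):
    """Population initialization with the Sine chaotic map x_{k+1} = (a/4)*sin(pi*x_k).
    lb and ub are per-dimension search bounds; a is the control parameter in (0, 4]."""
    rng = np.random.default_rng(seed)
    x = rng.random(dim)                       # chaotic seed values in (0, 1)
    pop = np.empty((pop_size, dim))
    for i in range(pop_size):
        x = (a / 4.0) * np.sin(np.pi * x)     # one chaotic iteration per individual
        pop[i] = lb + x * (ub - lb)           # map chaos values onto the search bounds
    return pop

# illustrative use: 30 dung beetles in a 30-dimensional space bounded by [-100, 100]
pop0 = sine_chaos_init(30, 30, np.full(30, -100.0), np.full(30, 100.0))
```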
(2) The Osprey optimization algorithm is introduced in this study. Its global exploration strategy addresses the limitations of the DBO ball-rolling behavior, which relies solely on the worst value, lacks timely communication with the other dung beetles, and involves numerous parameters. Therefore, the global exploration strategy of the Osprey optimization algorithm is adopted: a dung beetle position is selected at random, and the rolling dung beetle updates its position toward it. The specific formula for this strategy is as follows:
x_i^{P1} = x_i + r \cdot (SF - I \cdot x_i)
where x_i^{P1} is the new position of the ith dung beetle in the exploration stage; r is a random number in [0, 1]; SF is the selected dung beetle; and I is a random number from the set {1, 2}.
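A sketch of this replacement rolling update is given below. How SF is chosen is not spelled out above, so drawing it from individuals with better fitness (falling back to the current best) is an assumption carried over from the original Osprey optimization algorithm.

```python
import numpy as np

def rolling_update(i, pop, fit, lb, ub, rng):
    """Osprey-style global exploration step used in place of DBO ball-rolling:
    x_new = x_i + r * (SF - I * x_i). SF is drawn from individuals with better
    fitness (falling back to the current best); r ~ U(0,1) per dimension and
    I is randomly 1 or 2."""
    better = np.flatnonzero(fit < fit[i])
    sf = pop[rng.choice(better)] if better.size > 0 else pop[np.argmin(fit)]
    r = rng.random(pop.shape[1])
    big_i = rng.integers(1, 3)                 # I drawn from {1, 2}
    x_new = pop[i] + r * (sf - big_i * pop[i])
    return np.clip(x_new, lb, ub)              # keep the candidate inside the bounds
```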
(3) Adaptive T-distribution dynamic selection strategy. During the foraging stage of the dung beetles, T-distribution perturbations are applied to their foraging behavior. The T-distribution mutation operator, with an iteration-dependent value serving as the degrees-of-freedom parameter of the T-distribution, is utilized to perturb the foraging behavior. This approach not only makes full use of the current position information but also introduces random interference, which facilitates escaping from local optima [28]. As the number of iterations increases, the T-distribution gradually approaches a Gaussian distribution, thereby enhancing the convergence speed of the algorithm. Its mathematical characterization is as follows:
x_{new}^{j} = x_{best}^{j} + t(C\_iter) \cdot x_{best}^{j}
where x_{new}^{j} is the position of the optimal solution in the jth dimension after the adaptive T-distribution perturbation; x_{best}^{j} is the position of the optimal solution in the jth dimension before the perturbation; and t(C_iter) is the t distribution with degrees-of-freedom parameter C_iter.
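A minimal sketch of the perturbation, using SciPy's Student-t sampler and the iteration counter as the degrees of freedom, is shown below.

```python
import numpy as np
from scipy.stats import t as student_t

def t_perturb(x_best, cur_iter, rng=None):
    """Adaptive t-distribution perturbation of the current best position:
    x_new = x_best + t(df) * x_best, using the iteration counter as the degrees
    of freedom so the distribution approaches a Gaussian as the run progresses."""
    df = max(int(cur_iter), 1)
    step = student_t.rvs(df, size=np.shape(x_best), random_state=rng)
    return np.asarray(x_best) + step * np.asarray(x_best)
```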
The introduction of the adaptive T-distribution mutation operator can significantly enhance the optimization performance of the algorithm. However, applying it indiscriminately to all individuals in every iteration may increase the computation time and fails to take advantage of the strengths of the original algorithm. To address this issue, a dynamic selection probability P is adopted to adjust the use of the adaptive T-distribution mutation operator. This ensures that the algorithm demonstrates strong global development ability in the early stage of iteration while maintaining good local exploration ability in the late stage. Additionally, supplementing the algorithm with T-distribution mutation at a small probability further enhances the convergence speed [29]. The calculation formula for the dynamic selection probability P is as follows:
P = w_1 - w_2 \cdot (Max_{iter} - iter) / Max_{iter}
where w1 is the upper limit of the dynamic selection probability, w1 = 0.5; w2 is the change amplitude of the dynamic selection probability, w2 = 0.1; Maxiter is the maximum number of iterations; and iter is the current number of iterations.
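The corresponding selection rule can be sketched as follows; with w1 = 0.5 and w2 = 0.1, the probability rises from 0.40 at the first iteration to 0.50 at the last, so the mutation from the previous sketch is applied sparingly.

```python
def selection_probability(cur_iter, max_iter, w1=0.5, w2=0.1):
    """Dynamic selection probability P = w1 - w2 * (max_iter - cur_iter) / max_iter.
    The t-distribution perturbation is applied only when a uniform draw falls below P."""
    return w1 - w2 * (max_iter - cur_iter) / max_iter

# e.g. selection_probability(10, 100) == 0.41
```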

2.3.3. Algorithm Validity Test

In order to evaluate the optimization performance of IDBO, the CEC2005 test set is utilized for iterative testing in the Matlab R2023a environment. The algorithm is compared with the Whale Optimization Algorithm (WOA), Subtraction Average Based Optimizer (SABO), Grey Wolf Optimizer (GWO), Northern Goshawk Optimization (NGO), Harris Hawk Optimization (HHO), and the original DBO. Each algorithm’s population size and maximum number of iterations are set to 30 and 1000, respectively, with the test being repeated 30 times. The details of the test function information can be found in Table 1.
The seven algorithms are tested for comparison and analysis, and the results are shown in Figure 2. Each run on a standard test function produces a two-dimensional convergence curve in which the x-coordinate represents the number of iterations; since the algorithm attempts to improve the solution at each iteration, the x-coordinate records the number of these optimization attempts. The goal of the CEC test functions is to find the global minimum, so the ordinate represents the function value. A downward-sloping curve indicates that the algorithm is approaching the optimal solution, whereas large fluctuations suggest that the algorithm is oscillating near a local optimum. According to Figure 2, the IDBO curve declines significantly more steeply than those of the other algorithms on the single-peak, multi-peak, and fixed-dimension multi-peak benchmark functions, which suggests that IDBO converges faster. The other algorithms decline relatively gradually, indicating that they may be trapped in local optima or converge slowly. At the same time, the optimization accuracy of IDBO on test functions F2, F3, F4, F5, F6, F7, and F8 is the best. The fitness value of IDBO on test function F1 is not the best, but it still ranks ahead of several algorithms. The results show that the local development ability of IDBO is significantly improved compared with the original DBO. In general, IDBO not only converges quickly but also balances exploration and development and is able to escape from local optima.
The seven algorithms are tested on eight different functions, with the optimal value, standard deviation, average value, median value, and worst value as evaluation indices; these reflect the convergence accuracy and stability of the algorithms, as shown in Table 2. As can be seen from Table 2, IDBO can accurately find the optimal value of 0 in various functions, which shows that it adapts well to the transition between global exploration and local exploration. Therefore, compared with the other algorithms, IDBO improves the accuracy of the solution and is more stable in average optimization performance.
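The five statistics reported in Table 2 can be reproduced for any optimizer with a small harness such as the sketch below; the sphere function stands in for F1, and random_search is only a placeholder callable, since the compared metaheuristics are not reimplemented here.

```python
import numpy as np

def sphere(x):
    """F1 benchmark (sphere function); its global minimum is 0 at the origin."""
    return float(np.sum(x ** 2))

def random_search(obj, dim, lb, ub, pop, max_iter, rng):
    """Trivial stand-in optimizer so the harness runs end to end; IDBO, DBO, WOA,
    SABO, GWO, NGO and HHO are not reproduced here."""
    best = np.inf
    for _ in range(max_iter):
        cand = rng.uniform(lb, ub, size=(pop, dim))
        best = min(best, min(obj(c) for c in cand))
    return best

def run_trials(optimizer, n_trials=30, dim=30, max_iter=1000, pop=30, seed=0):
    """Repeat an optimizer on one benchmark and report the five statistics of Table 2:
    optimal (best), standard deviation, mean, median and worst of the final fitness."""
    rng = np.random.default_rng(seed)
    finals = np.array([optimizer(sphere, dim, -100.0, 100.0, pop, max_iter, rng)
                       for _ in range(n_trials)])
    return {"optimal": finals.min(), "std": finals.std(), "mean": finals.mean(),
            "median": np.median(finals), "worst": finals.max()}

print(run_trials(random_search, n_trials=3, max_iter=50))   # small run for illustration
```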
Then, the performance of IDBO is further evaluated on the CEC2017 and CEC2021 test sets, as shown in Table 3. It is evident from Table 3 that IDBO performs well on both test sets, showing strong convergence accuracy and speed. In summary, IDBO performs excellently on different test functions. It not only has a clear advantage in convergence speed but also demonstrates good convergence accuracy. At the same time, IDBO achieves a good balance between development and exploration capabilities, which further indicates that IDBO delivers outstanding comprehensive performance among the compared metaheuristic algorithms.

2.4. Construction of IDBO-BPNN Model

Metaheuristic optimization algorithms used to optimize machine learning or deep learning models have been demonstrated to significantly improve their prediction accuracy [30]. Therefore, the author utilized the improved DBO (IDBO) to optimize the weights and thresholds of the BPNN and established a coal gas permeability prediction model based on IDBO-BPNN. The construction process is illustrated in Figure 3, and a compact code sketch of the workflow is given after the list below. The specific construction steps are as follows:
(1) Data preprocessing involves handling missing values in the collected data;
(2) Determining whether dimensionality reduction is necessary through the Pearson correlation coefficient matrix. If reduction is needed, Kernel Principal Component Analysis (KPCA) is used to extract principal components from the original data;
(3) Dividing the data into training samples and test samples in a 7:3 ratio and carrying out normalization processing;
(4) Setting the relevant parameters of IDBO and BPNN;
(5) Utilizing the Sine chaotic mapping to initialize the population and calculate the initial fitness value of dung beetles;
(6) Updating the position of each dung beetle and calculating its fitness value to obtain the optimal solution;
(7) Utilizing an adaptive T-distribution dynamic selection strategy to perturb the current optimal solution, acquire a new solution, and assess the need for a position update;
(8) Determining whether termination conditions are met. If not, repeat steps 6–7. If yes, output the optimal parameter;
(9) BPNN acquires optimal weight and threshold parameters for training and simulating predictions.
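A compact sketch of how steps (4)–(9) fit together is given below: each dung beetle position is a flat vector that is decoded into BPNN weights and thresholds, and its fitness is taken as the mean squared training error of the decoded network. Treating the training error of the decoded network as the fitness (rather than first retraining it with backpropagation) is a simplifying assumption of this sketch; the IDBO search itself (steps 5–8) is not reproduced here.

```python
import numpy as np

def unpack(theta, n_in, n_hid, n_out):
    """Map a flat IDBO position vector onto the BPNN weights and thresholds (biases)."""
    i = 0
    w1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = theta[i:i + n_hid];                              i += n_hid
    w2 = theta[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = theta[i:i + n_out]
    return w1, b1, w2, b2

def bpnn_forward(theta, X, n_in, n_hid, n_out):
    """Single-hidden-layer forward pass: sigmoid hidden layer, linear output layer."""
    w1, b1, w2, b2 = unpack(theta, n_in, n_hid, n_out)
    h = 1.0 / (1.0 + np.exp(-(X @ w1 + b1)))
    return h @ w2 + b2

def idbo_fitness(theta, X_train, y_train, n_in, n_hid, n_out):
    """Fitness of one dung beetle: mean squared training error of the decoded BPNN."""
    pred = bpnn_forward(theta, X_train, n_in, n_hid, n_out).ravel()
    return float(np.mean((pred - y_train) ** 2))

# search-space dimension for, e.g., 3 principal components, 12 hidden nodes, 1 output:
# 3*12 + 12 + 12*1 + 1 = 61 decision variables per dung beetle
```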

3. Experimental Contrastive Analysis

3.1. Data Source and Principal Component Extraction

According to relevant tests and theoretical analysis in the literature [8,31], it is evident that there are numerous factors influencing the gas permeability of coal. The main influencing factors include effective stress, gas pressure, temperature, and coal compressive strength. Therefore, 50 sets of coal gas permeability data under various conditions were selected from the literature [11] as test data for this experiment. Among these groups, data from samples 1 to 40 were used as training samples, while data from 41 to 50 were used as test samples. A portion of the test data is presented in Table 4.
The correlation analysis chart is a method used to visually represent the distribution of data and the relationships between different factors. In order to accurately capture the impact of the different factors, SPSS 27 software was used to perform correlation analysis on the initial data concerning the factors influencing coal gas permeability. This analysis generated the Pearson correlation coefficient matrix for the various indicators, as illustrated in Figure 4. The positive and negative signs of a correlation coefficient indicate the direction of the correlation between variables: a positive coefficient indicates a consistent trend of change between two variables (when one increases, the other also increases), while a negative coefficient indicates an opposite trend (when one increases, the other decreases). According to Figure 4, a negative correlation is observed between effective stress and gas pressure, between compressive strength and gas pressure, and between temperature and compressive strength. Conversely, a positive correlation exists between temperature and effective stress, as well as between temperature and gas pressure. The closer the absolute value of a correlation coefficient is to 1, the stronger the relationship between the variables: a coefficient of 1 indicates a perfect positive correlation, a coefficient of −1 indicates a perfect negative correlation, and a coefficient close to 0 suggests no linear correlation between the two variables. As shown in Figure 4, the correlation between coal gas permeability and the influencing factors is not entirely linear, and there is only a slight correlation between the index factors. For instance, the correlation coefficients between effective stress and gas pressure, temperature, and compressive strength are −0.107, −0.001, and −0.103, respectively, suggesting a limited association among these factors in influencing coal gas permeability. The correlation coefficient between gas pressure and temperature is 0.174. When the correlation between two factors is too low (e.g., less than 0.2), the raw indicators contribute little to information enrichment, and using them directly would inevitably affect the result to some extent. Therefore, it is essential to conduct kernel principal component analysis on the original data, which can not only reduce the amount of calculation but also improve the accuracy of model prediction.
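For reference, the SPSS Pearson analysis can be mirrored with NumPy; the three rows below are only illustrative values in the ranges of Table 4, not the study's 50-sample dataset.

```python
import numpy as np

# columns: effective stress, gas pressure, temperature, compressive strength
# (three illustrative rows only, for demonstration of the computation)
X = np.array([
    [2.00, 1.8, 40.0, 10.85],
    [1.51, 0.5, 55.0, 12.85],
    [4.01, 0.5, 30.0, 14.13],
])

pearson = np.corrcoef(X, rowvar=False)   # 4 x 4 Pearson correlation coefficient matrix
print(np.round(pearson, 3))
```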
Kernel Principal Component Analysis (KPCA) is a nonlinear data-processing method based on a high-dimensional feature space. It maps the data from the original space into a new space and then performs principal component analysis there, thereby achieving dimensionality reduction of linearly non-separable datasets. Because the relationships between the influencing factors of coal gas permeability are nonlinear, KPCA was utilized to reduce the dimensionality of the original data. The selection criterion for this reduction was a cumulative variance interpretation of more than 85%. Ultimately, three principal components were extracted and labeled Y1, Y2, and Y3, with variance interpretation rates of 41.74%, 26.83%, and 20.02%, respectively. The cumulative interpreted variance is 88.59%, indicating that the three extracted principal components capture the vast majority of the information in the original data. Some of the data after dimensionality reduction are shown in Table 5.
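A minimal NumPy sketch of RBF-kernel PCA with the 85% cumulative-variance criterion is shown below; the kernel width gamma is an assumed value, since the paper does not report its kernel settings.

```python
import numpy as np

def kpca(X, gamma=0.5, var_target=0.85):
    """Minimal RBF-kernel PCA: build the kernel matrix, double-center it, eigendecompose,
    and keep the leading components whose cumulative eigenvalue share reaches var_target."""
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))  # RBF kernel matrix
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one                        # centering in feature space
    vals, vecs = np.linalg.eigh(Kc)
    vals, vecs = vals[::-1], vecs[:, ::-1]                            # sort descending
    vals = np.maximum(vals, 0.0)                                      # clip tiny negative eigenvalues
    ratio = vals / vals.sum()
    k = int(np.searchsorted(np.cumsum(ratio), var_target) + 1)
    scores = vecs[:, :k] * np.sqrt(vals[:k])                          # projected samples (n x k)
    return scores, ratio[:k]

# the returned scores play the role of the principal components Y1, Y2, Y3 fed to the model
```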

3.2. Model Evaluation Index

In order to verify the accuracy and reliability of the constructed prediction model, six indicators are used as the basis to test the prediction accuracy, model advantages and disadvantages, and fitting performance of the prediction model [32]. These indicators include Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), Root Mean Square Error (RMSE), R-Square (R2), Mean Squared Error (MSE), and Forecast Bias Ratio (FBR). The calculation formulas for these indicators are shown as follows:
MAE = \frac{1}{n}\sum_{i=1}^{n}\left| f_i - y_i \right|
MAPE = \frac{1}{n}\sum_{i=1}^{n}\left| \frac{f_i - y_i}{y_i} \right| \times 100\%
RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(f_i - y_i)^2}
R^2 = 1 - \frac{\sum_{i=1}^{n}(y_i - f_i)^2}{\sum_{i=1}^{n}(y_i - \bar{y})^2}
MSE = \frac{1}{n}\sum_{i=1}^{n}(f_i - y_i)^2
FBR = \frac{y_i - f_i}{y_i} \times 100\%
where n is the number of samples; fi is the predicted value; yi is the true value; and \bar{y} is the average of the true values. The smaller the MAE, MAPE, RMSE, and MSE values, the better; the closer the R2 value is to 1, the better; and the closer the FBR value is to 0, the better.
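The six indices can be computed together as in the sketch below. Because the tables report a single FBR value per model, the sketch aggregates the bias with sums over all samples, which is an assumption about how that value is formed.

```python
import numpy as np

def evaluate(y_true, y_pred):
    """The six indices used to assess the permeability models."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    mae  = np.mean(np.abs(err))
    mape = np.mean(np.abs(err / y_true)) * 100.0
    mse  = np.mean(err ** 2)
    rmse = np.sqrt(mse)
    r2   = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    fbr  = np.sum(y_true - y_pred) / np.sum(y_true) * 100.0   # assumed aggregation
    return {"MAE": mae, "MAPE/%": mape, "RMSE": rmse, "R2": r2, "MSE": mse, "FBR/%": fbr}
```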

3.3. Experimental Comparison and Analysis

3.3.1. Multi-Optimization Model Construction

According to the literature [5], the PSO-BPNN model is constructed, with the thresholds and weights of the BPNN optimized using PSO. The PSO-LSSVM model was constructed based on the literature [12], with the two parameters γ and σ in LSSVM optimized using PSO. Based on reference [33], the PSO-SVM model was constructed, with the penalty parameter and kernel parameter in SVM optimized using PSO. Additionally, the MPA-BPNN model, in which the Marine Predators Algorithm (MPA) optimizes the BPNN, was developed based on reference [34]. Furthermore, the WOA-SVM model was developed based on the literature [35], while the Bald Eagle Search (BES) optimized SVM model (BES-SVM) was constructed according to reference [36]. These optimization models are compared with the IDBO-BPNN and DBO-BPNN models constructed by the author, with the parameter settings for each optimization model shown in Table 6.

3.3.2. Comparative Analysis

In the process of fitting and mapping multiple indicators, the significant difference in magnitude between the indicators can directly impact the final result. Therefore, the ‘mapminmax’ function in MATLAB R2023a is used to normalize the original data within a [0, 1] interval. After completing the model simulation and prediction, the mapminmax function is then used to reverse-normalize the data back to its original values. Based on the aforementioned model parameter settings, both the original data and principal component data are used as inputs to obtain permeability prediction results for test samples in each model. The prediction results for the original data are presented in Table 7, while those for the principal component data are shown in Table 8.
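For reference, the [0, 1] scaling and its inverse can be written in a few lines of NumPy; this is a stand-in for the mapminmax usage described above, not a re-implementation of that MATLAB function (whose default output range is [−1, 1]).

```python
import numpy as np

def minmax_fit(X, lo=0.0, hi=1.0):
    """Fit per-feature linear scaling to [lo, hi]."""
    xmin, xmax = X.min(axis=0), X.max(axis=0)
    scale = (hi - lo) / np.where(xmax > xmin, xmax - xmin, 1.0)
    return {"xmin": xmin, "scale": scale, "lo": lo}

def minmax_apply(X, p):
    """Map raw values into the normalized interval before training/prediction."""
    return p["lo"] + (X - p["xmin"]) * p["scale"]

def minmax_reverse(Xn, p):
    """Reverse the mapping after prediction to recover permeability in original units."""
    return (Xn - p["lo"]) / p["scale"] + p["xmin"]
```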
By summarizing the aforementioned performance evaluation indicators, the comparison of the evaluation indices for the original data is shown in Table 9, and the comparison for the principal component data is shown in Table 10. Comparing the prediction results in Table 7 and Table 8, as well as the performance evaluation indicators in Table 9 and Table 10, shows that principal component extraction effectively concentrates the information in the original data and thereby improves the prediction accuracy of the model. Additionally, according to Table 9 and Table 10, the IDBO-BPNN model outperforms the other models in the various indices. Furthermore, the MAE, MAPE, RMSE, R2, MSE, and FBR of the other models in the test samples exhibit significant fluctuations compared with the training samples, which suggests a potential overfitting phenomenon in the test stage for these models; as a result, their robustness decreases and the test-sample error increases. This further indicates that IDBO enhances the global search capability of the original DBO and improves the prediction accuracy of BPNN. In the case of the original data, the MAE of the IDBO-BPNN model in the test stage decreased by 0.0086~0.0271, MAPE decreased by 1.89~3.89%, RMSE decreased by 0.0064~0.0265, and R2 increased by 0.0188~0.0916 compared with the other models; MSE decreased by 0.0008~0.0036, and FBR increased by 1.24~4.21%. In the case of the principal component data, the MAE of the IDBO-BPNN model in the test samples decreased by 0.0399, 0.0341, 0.0286, 0.0121, 0.021, 0.0188, and 0.0134, respectively, compared with the other models. MAPE decreased by 5.61%, 5.55%, 4.19%, 2.01%, 3.14%, 2.5%, and 1.95%, and RMSE decreased by 0.0476, 0.0338, 0.0376, 0.0112, 0.023, 0.0185, and 0.012, respectively. R2 increased by 0.098, 0.0577, 0.0679, 0.0127, 0.033, 0.0244, and 0.0139, respectively, while MSE decreased by 0.0039, 0.0023, 0.0027, 0.0005, 0.0013, 0.0009, and 0.0005, respectively. FBR decreased by 2.53%, 4.1%, 2.51%, 1.2%, 1.94%, 2.77%, and 1.49%, respectively. Therefore, the IDBO-BPNN model has the smallest error and the best performance.

4. Model Case Test

In machine learning, model stability refers to the consistency of performance across various datasets, even when the data are slightly altered or affected by noise. Ensuring the stability of a model is crucial to guarantee its reliability and generalization ability in practical applications. A coal mine in Shanxi Province was selected as the research subject to demonstrate the reliability and stability of the IDBO-BPNN model. The thickness of the No. 2 coal seam in the mine is 0.75~1.93 m, with an average thickness of 1.07 m and a coal seam inclination of 3~7°; the absolute gas emission is 22.23 m³/min and the relative emission is 11.74 m³/t. It is a high-gas mine, the coal seam is not prone to spontaneous combustion, and the coal dust is explosive. Therefore, a more accurate prediction of coal gas permeability is essential for preventing gas outburst accidents and ensuring the safe and efficient production of the mine. A total of 67 groups of experimental data were selected from the coal mine. Groups 1 to 47 were used as training samples, while groups 48 to 67 were used as test samples. The model parameters remained consistent with those above. First, the Pearson correlation coefficient matrix is used to assess whether the original data need dimensionality reduction, as shown in Table 11. It is evident from Table 11 that these data require principal component extraction; therefore, KPCA is again used to process the original data. Finally, three principal components (denoted Z1, Z2, and Z3) are extracted, with variance interpretation rates of 40.45%, 26.27%, and 19.27%, respectively, and a cumulative variance interpretation rate of 85.99%. Using the principal components Z1, Z2, and Z3 as model inputs and permeability as the output variable, the prediction results for each test sample of the model are presented in Table 12, and the comparison of the performance evaluation indicators for each model is illustrated in Figure 5. As shown in Table 12 and Figure 5, the IDBO-BPNN model developed by the author demonstrates optimal performance in both the training and test samples. In the training samples, the MAE of the IDBO-BPNN model decreased by 0.011~0.139, MAPE decreased by 0.17~1.79%, RMSE decreased by 0.0025~0.0169, R2 increased by 0.0087~0.0529, MSE decreased by 0.0002~0.0017, and FBR decreased by 0.12~1% compared with the other models. In the test samples, the MAE of the IDBO-BPNN model is reduced by 0.0111, 0.0076, 0.0097, 0.0053, 0.0066, 0.0027, and 0.0035, respectively, compared with the other models. The MAPE decreased by 2.48%, 1.09%, 2.18%, 1.03%, 1.26%, 0.72%, and 0.74%, while the RMSE decreased by 0.0169, 0.0188, 0.0162, 0.0071, 0.0094, 0.0068, and 0.0056, respectively. R2 increased by 0.1166, 0.0726, 0.1126, 0.0478, 0.0594, 0.0418, and 0.0408, while MSE decreased by 0.0022, 0.0025, 0.0021, 0.0009, 0.0012, 0.0008, and 0.0007, respectively. FBR decreased by 3.15%, 0.42%, 2.21%, 0.68%, 0.76%, 1.3%, and 0.26%, respectively. Therefore, the IDBO-BPNN model demonstrates good prediction accuracy and generalization performance.
In conclusion, the IDBO-BPNN model constructed by the author not only demonstrates high prediction accuracy but also exhibits a certain level of reliability and stability. Furthermore, its prediction results are more aligned with reality and can accurately forecast the gas permeability of coal bodies.

5. Discussion

(1) In the structural design of the BPNN model, empirical methods are still used to determine the number of hidden layer nodes. However, the verification method for empirical formulas lacks theoretical guidance. Therefore, determining the number of hidden layer nodes in the neural network structure using a scientific and rational method is a future research direction.
(2) The author employs BPNN and SVM as the fundamental models for predicting coal gas permeability. While there are numerous outstanding machine learning and deep learning methods available for developing prediction models, it is essential to conduct further research on combining and comparing these methods in the future.
(3) There are issues such as limited sample data and insufficient verification times. For future studies, it is recommended to select coal samples from different mines and various geological conditions for comparison. This will help to further improve the engineering application capability and universality of the IDBO-BPNN model.

6. Conclusions

(1) The integration of Sine chaotic mapping, Osprey optimization algorithm, and adaptive T-distribution dynamic selection strategies into DBO enhances the convergence speed and global search capability of IDBO. Iterative testing was conducted on the CEC2005 test set to validate its performance, comparing it with WOA, SABO, GWO, NGO, HHO, and the original DBO. Further validation was carried out on the CEC2017 and CEC2021 test sets. The results demonstrate that IDBO outperforms other intelligent optimization algorithms in terms of iteration times and accuracy.
(2) A prediction model of gas permeability in a coal body is constructed based on IDBO-BPNN. This model considers the factors influencing gas permeability in a coal body and combines them with IDBO and BPNN. Additionally, a Pearson correlation coefficient matrix analysis was conducted on the original data using SPSS software. The analysis indicated that dimensionality reduction processing was necessary for the original data. Subsequently, principal component extraction was performed on the original data using KPCA, resulting in a cumulative variance of 88.59%.
(3) The original data and principal component data were used as model inputs. The prediction results of the IDBO-BPNN model were compared with those of the PSO-BPNN, PSO-LSSVM, PSO-SVM, MPA-BPNN, WOA-SVM, BES-SVM, and DPO-BPNN models. The results indicate that using the principal component data can effectively improve the model’s prediction accuracy compared to the original data. This suggests that KPCA can effectively help concentrate the data. Secondly, when utilizing principal component data, the MAE of the IDBO-BPNN model in the test samples decreased by 0.0399, 0.0341, 0.0286, 0.0121, 0.021, 0.0188, and 0.0134, respectively, in comparison to other models. The MAPE decreased by 5.61%, 5.55%, 4.19%, 2.01%, 3.14%, 2.5%, and 1.95%. Additionally, the RMSE decreased by 0.0476, 0.0338, 0.0376, 0.0112, 0.023, 0.0185, and 0.012, respectively. R2 was increased by 0.098, 0.0577, 0.0679, 0.0127, 0.033, 0.0244, and 0.0139, respectively, while MSE was decreased by 0.0039, 0.0023, 0.0027, 0.0005, 0.0013, 0.0009, and 0.0005, respectively. FBR decreased by 2.53%, 4.1%, 2.51%, 1.2%, 1.94%, 2.77%, and 1.49%, respectively. The results indicate that the IDBO-BPNN model demonstrates superior quality, minimal error, and strong fitting performance. Furthermore, it suggests that IDBO significantly enhances global search capability and optimization accuracy compared to the original DBO. As a result, BPNN demonstrates higher prediction accuracy.
(4) To investigate the reliability and stability of the IDBO-BPNN model further, it was applied to a coal mine in Shanxi Province and compared with other prediction models. The results indicate that the IDBO-BPNN model outperforms other models in both training and test samples, demonstrating good prediction accuracy. The result is the closest to the actual value, indicating that the IDBO-BPNN model constructed by the author is more stable and better suited for predicting coal gas permeability. This finding can offer valuable insights for similar mining engineering practices.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/s24092873/s1.

Author Contributions

Conceptualization: W.W. and X.C.; Data curation: W.W., X.C. and Y.Q.; Funding acquisition: W.W. and Y.Q.; Project administration: K.X. and R.L.; Resources: W.W., X.C. and Y.Q.; Software: W.W., X.C., Y.Q. and K.X.; Supervision: W.W. and Y.Q.; Validation: R.L. and C.B.; Visualization: X.C., K.X. and C.B.; Writing—original draft: W.W. and X.C.; Writing—review & editing: W.W. and Y.Q. All authors have read and agreed to the published version of the manuscript.

Funding

Shanxi Basic Research Program (Free Exploration) Project (Funder: W.W.; Funding number: 202203021222300); Shanxi Province Higher Education Science and Technology Innovation Plan Project (Funder: W.W. and Y.Q.; Funding number: 2022L449 and 2022L448); Basic Research Project of Shanxi Datong University (Funder: W.W.; Funding number: 2022Q38).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All data generated or analyzed during this study are included in this published article and its Supplementary Information Files.

Acknowledgments

We thank Xiangyan Li (MTI) and the School of Foreign Languages at Guizhou University of Finance and Economics for their linguistic assistance during the preparation of this manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Li, X.; Shi, Y.; Zeng, J.; Zhang, Q.; Gao, J.; Zhang, L.; Huang, T. Evolutionary Model and Experimental Validation of Gas-Bearing Coal Permeability under Negative Pressure Conditions. ACS Omega 2023, 8, 15708–15720. [Google Scholar] [CrossRef] [PubMed]
  2. Xiao, Z.; Wang, G.; Wang, C.; Jiang, Y.; Jiang, F.; Zheng, C. Permeability evolution and gas flow in wet coal under non-equilibrium state: Considering both water swelling and process-based gas swelling. Int. J. Min. Sci. Technol. 2023, 33, 585–599. [Google Scholar] [CrossRef]
  3. Kong, L.; Luo, Y.; Tang, J.; Wang, Y.; Yuan, F.; Li, S.; Hao, Y. Permeability Damage Mechanism and Evolution of Gas-Bearing Coal Seams Induced by Drilling Fluid. Nat. Resour. Res. 2023, 32, 1639–1655. [Google Scholar] [CrossRef]
  4. Xue, J.; Li, K.; Shi, Y. Study on Permeability Characteristics of Gas Bearing Coal under Cyclic Load. Sustainability 2022, 14, 11483. [Google Scholar] [CrossRef]
  5. Deng, S.; Li, X.; Xu, B. Prediction of gas permeability in coal body based on PSO-BP neural network. Miner. Eng. Res. 2022, 37, 35–41. [Google Scholar]
  6. Zhang, Y.; Lei, J.; Bi, R. Forecast model of coal gas permeability based on ANFIS. Coal Min. Technol. 2017, 22, 101–104. [Google Scholar]
  7. Li, B.; Yang, K.; Yuan, M.; Xu, J. Experimental study on effect of pore pressure on permeability of coal. China Saf. Sci. J. 2017, 27, 77–82. [Google Scholar]
  8. Gong, W.; Xie, X.; Liang, Y.; Cui, J. Comparative experimental research on permeability of two kinds of raw coal samples. J. Saf. Sci. Technol. 2017, 13, 47–52. [Google Scholar]
  9. Sikora, A.; Zielonka, A.; Ijaz, M.F.; Woźniak, M. Digital Twin Heuristic Positioning of Insulation in Multimodal Electric Systems. IEEE Trans. Consum. Electron. 2024, 1, 3436–3445. [Google Scholar] [CrossRef]
  10. Woźniak, M.; Wieczorek, M.; Siłka, J. BiLSTM deep neural network model for imbalanced medical data of IoT systems. Future Gener. Comput. Syst. 2023, 141, 489–499. [Google Scholar] [CrossRef]
  11. Tang, G.; Zhang, H.; Han, J.; Song, W. Prediction model on permeability of gas-bearing coal based on MABC-SVM. J. Saf. Sci. Technol. 2015, 11, 11–16. [Google Scholar]
  12. Shao, L.; Ma, H. Model of coal gas permeability prediction based on PSO-LSSVM. Coal Geol. Explor. 2015, 43, 23–26. [Google Scholar]
  13. Xie, L.R.; Lu, P.; Fan, W.H.; Ye, W.; Wang, J.R. LVQ-CPSO-BP based prediction technique of coal gas permeability rate. J. Min. Saf. Eng. 2017, 34, 398–404. [Google Scholar]
  14. Wang, P.; Du, W.; Feng, F. Prediction model of coalbed gas permeability based on optimization of influencing factors. Saf. Coal Mines 2017, 48, 21–25. [Google Scholar]
  15. Ma, S.; Li, X. Forecast of coal body gas permeability based on factor analysis and BP neural net. Coal Min. Technol. 2018, 23, 108–111. [Google Scholar]
  16. Song, X.; Ning, Y.; Ding, Y. Prediction of coal gas permeability based on random forest. Coal Technol. 2019, 38, 130–132. [Google Scholar]
  17. Wei, W.; Ran, L.; Yun, Q.; Baoshan, J.; Zewei, W. Prediction model of coal spontaneous combustion risk based on PSO-BPNN. China Saf. Sci. J. 2021, 33, 127–132. [Google Scholar]
  18. Man, K.; Wu, L.W.; Liu, X.L.; Song, Z.F.; Liu, Z.X.; Liu, R.L.; Cao, Z.X. Rockburst grade prediction based on grey correlation analysis and SSA-RF model. Met. Mine 2023, 5, 202–212. [Google Scholar]
  19. Wang, K. Experimental study on new sealing technology of gas extraction borehole. Shandong Coal Sci. Technol. 2023, 41, 128–130. [Google Scholar]
  20. Zhao, W. Study on Mechanical Properties of Loess Reinforced by Permeable Polymer; Zhengzhou University: Zhengzhou, China, 2022. [Google Scholar]
  21. Xu, P. Study on the Impact of Airborne Particulate Matter Dispersion in Public Transportation Spaces in High-Rise Residential Buildings; Huazhong University of Science and Technology: Wuhan, China, 2022. [Google Scholar]
  22. Hu, S. Research on Reliability of Vehicular Communication System Based on Machine Learning; Guilin University Of Electronic Technology: Guilin, China, 2022. [Google Scholar]
  23. Pan, J.; Li, S.; Zhou, P.; Yang, G.; Lyu, D. Dung Beetle Optimization Algorithm Guided by Improved Sine Algorithm. Comput. Eng. Appl. 2023, 59, 92–110. [Google Scholar]
  24. Xue, J.; Shen, B. Dung beetle optimizer: A new metaheuristic algorithm for global optimization. J. Supercomput. 2023, 79, 7305–7336. [Google Scholar] [CrossRef]
  25. Yao, H.; Li, C.; Yang, P.; Zheng, X. Prediction of Building Cooling Capacity Based on Sine-SSA-BP Model. Comput. Simul. 2023, 40, 525–529. [Google Scholar]
  26. Fan, Z.; Li, B.; Wang, K.; Zhao, Z. A two-layer IMA resource allocation method based on chaotic mapping and adaptive NSGA-Ⅱ algorithm. Electron. Opt. Control 2022, 29, 25–31. [Google Scholar]
  27. Panwar, K.; Purwar, R.K.; Srivastava, G. A Fast Encryption Scheme Suitable for Video Surveillance Applications Using SHA-256 Hash Function and 1D Sine-Sine Chaotic Map. Int. J. Image Graph. 2021, 21, 2150022. [Google Scholar] [CrossRef]
  28. Zhang, W.; Liu, S. Improved sparrow search algorithm based on adaptive t-distribution and golden sine and its application. Microelectron. Comput. 2022, 39, 17–24. [Google Scholar]
  29. Ge, Q.; Li, Y.; Qiao, B.; Zuo, X.; Wang, G. Differential biogeography optimization algorithm based on micro-perturbation and mixed variation. Comput. Eng. Des. 2021, 42, 432–441. [Google Scholar]
  30. Woźniak, M.; Sikora, A.; Zielonka, A.; Kaur, K.; Hossain, M.S.; Shorfuzzaman, M. Heuristic optimization of multi-pulse rectifier for reduced energy consumption. IEEE Trans. Ind. Inform. 2022, 18, 5515–5526. [Google Scholar] [CrossRef]
  31. Dai, J. Experimental study on influence of multiple factors on coal permeability. Coal Technol. 2020, 39, 122–125. [Google Scholar]
  32. Wang, W.; Cui, X.; Qi, Y.; Liang, R.; Jia, B.; Xue, K. Regression analysis model of coal spontaneous combustion temperature in goaf based on SSA-RF. China Saf. Sci. J. 2023, 33, 136–141. [Google Scholar]
  33. Gu, Y. Prediction of coal gas permeability based on PSOSVM. J. Math. Pract. Theory 2016, 46, 149–155. [Google Scholar]
  34. Dong, Z.; Sheng, Z.; Zhao, Y.; Zhi, P. Robust optimization design method for structural reliability based on active-learning MPA-BP neural network. Int. J. Struct. Integr. 2023, 14, 248–266. [Google Scholar] [CrossRef]
  35. Zhou, W.; Lian, J.; Zhang, J.; Mei, Z.; Gao, Y.; Hui, G. Tomato storage quality predicting method based on portable electronic nose system combined with WOA-SVM model. J. Food Meas. Charact. 2023, 17, 3654–3664. [Google Scholar] [CrossRef]
  36. Zhou, X.H.; Feng, Y.C.; Chen, L.; Luo, W.; Liu, S. Transformer fault diagnosis based on SVM optimized by bald eagle search algorithm. South. Power Syst. Technol. 2023, 17, 99–106. [Google Scholar]
Figure 1. Topology structure of BPNN.
Figure 2. Algorithm convergence curve comparison.
Figure 3. IDBO-BPNN model flow.
Figure 4. Correlation coefficient matrix.
Figure 5. Comparison of evaluation indexes of different models.
Table 1. Test function information.
Reference Function | Dimensionality | Range
F1 = \sum_{i=1}^{n} x_i^2 | 30 | [−100, 100]
F2 = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i| | 30 | [−10, 10]
F3 = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2 | 30 | [−100, 100]
F4 = \max_i \{ |x_i|, 1 \le i \le n \} | 30 | [−100, 100]
F5 = \sum_{i=1}^{n} -x_i \sin(\sqrt{|x_i|}) | 30 | [−500, 500]
F6 = -20\exp\left(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\left(\frac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e | 30 | [−32, 32]
F7 = \sum_{i=1}^{m} \left[ (x - a_i)(x - a_i)^{T} + c_i \right]^{-1} | 4 | [0, 10]
F8 = \sum_{i=1}^{n} i x_i^4 + \mathrm{random}[0, 1) | 30 | [−1.28, 1.28]
Table 2. Comparison of test results.
Function | Criterion | WOA | DBO | SABO | GWO | NGO | HHO | IDBO
F1 | Optimal value | 2.7 × 10−101 | 2.1 × 10−191 | 1.6 × 10−241 | 4.92 × 10−36 | 1.1 × 10−108 | 7.6 × 10−132 | 0
F1 | Standard deviation | 1.03 × 10−90 | 1.5 × 10−132 | 0 | 1.57 × 10−33 | 8.7 × 10−106 | 5.7 × 10−109 | 0
F1 | Mean value | 3.09 × 10−91 | 2.8 × 10−133 | 2.7 × 10−237 | 1.18 × 10−33 | 3.3 × 10−106 | 1.1 × 10−109 | 0
F1 | Median value | 6.86 × 10−94 | 1.8 × 10−162 | 1.5 × 10−238 | 5.89 × 10−34 | 1 × 10−106 | 7.6 × 10−122 | 0
F1 | Worst value | 5.47 × 10−90 | 8.3 × 10−132 | 2.5 × 10−236 | 5.73 × 10−33 | 4.8 × 10−105 | 3.1 × 10−108 | 0
F2 | Optimal value | 5.7 × 10−106 | 5.2 × 10−206 | 2.6 × 10−242 | 1.94 × 10−36 | 3.9 × 10−109 | 8.6 × 10−133 | 0
F2 | Standard deviation | 9.38 × 10−91 | 8.5 × 10−137 | 0 | 2.09 × 10−34 | 9.5 × 10−107 | 6.9 × 10−113 | 0
F2 | Mean value | 2.68 × 10−91 | 1.6 × 10−137 | 1.3 × 10−236 | 8.17 × 10−35 | 4.5 × 10−107 | 1.3 × 10−113 | 0
F2 | Median value | 5.7 × 10−95 | 1.7 × 10−160 | 9 × 10−240 | 2.76 × 10−35 | 9.4 × 10−108 | 1.8 × 10−122 | 0
F2 | Worst value | 4.81 × 10−90 | 4.7 × 10−136 | 3.4 × 10−235 | 1.09 × 10−33 | 4.2 × 10−106 | 3.8 × 10−112 | 0
F3 | Optimal value | 3.31 × 10−68 | 1.59 × 10−94 | 4.1 × 10−137 | 2.58 × 10−21 | 6.72 × 10−57 | 2.72 × 10−67 | 0
F3 | Standard deviation | 3.19 × 10−61 | 7.59 × 10−71 | 5.3 × 10−133 | 2.75 × 10−20 | 8.67 × 10−55 | 4.38 × 10−59 | 0
F3 | Mean value | 7.62 × 10−62 | 1.39 × 10−71 | 1.9 × 10−133 | 2.76 × 10−20 | 5.7 × 10−55 | 8.82 × 10−60 | 1.9 × 10−300
F3 | Median value | 2.9 × 10−65 | 5.74 × 10−83 | 2.1 × 10−134 | 1.68 × 10−20 | 2.8 × 10−55 | 9.15 × 10−64 | 0
F3 | Worst value | 1.67 × 10−60 | 4.16 × 10−70 | 2.2 × 10−132 | 1.34 × 10−19 | 4.54 × 10−54 | 2.4 × 10−58 | 5.6 × 10−299
F4 | Optimal value | 12,353.02 | 3.6 × 10−157 | 2.62 × 10−97 | 1.21 × 10−10 | 1.03 × 10−34 | 2.7 × 10−117 | 0
F4 | Standard deviation | 12,261.04 | 2.88 × 10−65 | 1.15 × 10−51 | 7.79 × 10−7 | 2.99 × 10−27 | 6.18 × 10−72 | 0
F4 | Mean value | 40,102.65 | 5.26 × 10−66 | 2.13 × 10−52 | 3.05 × 10−7 | 9.91 × 10−28 | 1.13 × 10−72 | 0
F4 | Median value | 38,697.38 | 7.6 × 10−128 | 6.67 × 10−72 | 1.14 × 10−8 | 4.27 × 10−30 | 5.5 × 10−102 | 0
F4 | Worst value | 64,604.58 | 1.58 × 10−64 | 6.31 × 10−51 | 3.44 × 10−6 | 1.23 × 10−26 | 3.38 × 10−71 | 0
F5 | Optimal value | 2.7 × 10−164 | 0 | 0 | 9.42 × 10−67 | 1.2 × 10−210 | 1.5 × 10−262 | 0
F5 | Standard deviation | 5.3 × 10−135 | 0 | 0 | 4.38 × 10−60 | 0 | 0 | 0
F5 | Mean value | 1.1 × 10−135 | 6.7 × 10−279 | 0 | 9.29 × 10−61 | 1.7 × 10−204 | 1.9 × 10−221 | 0
F5 | Median value | 1.7 × 10−145 | 0 | 0 | 4.92 × 10−63 | 1.5 × 10−207 | 5.5 × 10−245 | 0
F5 | Worst value | 2.9 × 10−134 | 2 × 10−277 | 0 | 2.4 × 10−59 | 2.6 × 10−203 | 5.8 × 10−220 | 0
F6 | Optimal value | 2.9 × 10−160 | 5.6 × 10−216 | 0 | 6.6 × 10−129 | 2.4 × 10−224 | 1.4 × 10−177 | 0
F6 | Standard deviation | 3.9 × 10−130 | 9.2 × 10−137 | 0 | 9.4 × 10−111 | 0 | 2.7 × 10−147 | 0
F6 | Mean value | 8.4 × 10−131 | 1.7 × 10−137 | 2.2 × 10−302 | 1.7 × 10−111 | 4.9 × 10−217 | 7.1 × 10−148 | 0
F6 | Median value | 1.1 × 10−138 | 8.4 × 10−170 | 3 × 10−308 | 7.7 × 10−122 | 2.8 × 10−220 | 4.9 × 10−157 | 0
F6 | Worst value | 2.1 × 10−129 | 5.1 × 10−136 | 4.4 × 10−301 | 5.2 × 10−110 | 1.2 × 10−215 | 1.1 × 10−146 | 0
F7 | Optimal value | 2.15 × 10−47 | 3.27 × 10−70 | 0.099873 | 0.099873 | 0.099873 | 2.98 × 10−66 | 0
F7 | Standard deviation | 0.059587 | 0.042932 | 1.28 × 10−07 | 0.055086 | 1.95 × 10−13 | 5.98 × 10−58 | 0
F7 | Mean value | 0.129878 | 0.075022 | 0.099873 | 0.179873 | 0.099873 | 1.86 × 10−58 | 0
F7 | Median value | 0.099873 | 0.099873 | 0.099873 | 0.199873 | 0.099873 | 7.44 × 10−62 | 0
F7 | Worst value | 0.299873 | 0.099873 | 0.099874 | 0.299873 | 0.099873 | 2.73 × 10−57 | 0
F8 | Optimal value | 0 | 0.009716 | 0.009716 | 0.009716 | 0.009716 | 0 | 0
F8 | Standard deviation | 0.018154 | 2.79 × 10−08 | 7.25 × 10−08 | 0.013327 | 5.29 × 10−14 | 0 | 0
F8 | Mean value | 0.022947 | 0.009716 | 0.009716 | 0.034005 | 0.009716 | 0 | 0
F8 | Median value | 0.009716 | 0.009716 | 0.009716 | 0.037224 | 0.009716 | 0 | 0
F8 | Worst value | 0.078189 | 0.009716 | 0.009716 | 0.078189 | 0.009716 | 0 | 0
Table 3. Optimization curves for different test sets.
Test Set | Function
CEC2017 | Shifted and Rotated Rosenbrock's Function
CEC2017 | Shifted and Rotated Rastrigin's Function
CEC2017 | Shifted and Rotated Levy Function
CEC2017 | Hybrid Function (N = 3)
CEC2021 | Shifted and Rotated Bent Cigar Function
CEC2021 | Shifted and Rotated Lunacek bi-Rastrigin Function
CEC2021 | Hybrid Function (N = 5)
CEC2021 | Composition Function (N = 3)
Search range: [−100, 100]. (The convergence-curve plots that form the body of this table are not reproduced here.)
Table 4. Coal gas permeability sample data.
No. | Effective Stress/MPa | Gas Pressure/MPa | Temperature/°C | Compressive Strength/MPa | Permeability/(10−5 m²)
1 | 2 | 1.8 | 40 | 10.85 | 0.881
2 | 1.51 | 0.5 | 55 | 12.85 | 1.062
3 | 4.01 | 0.5 | 30 | 14.13 | 0.559
24 | 1.73 | 1.8 | 45 | 14.13 | 0.805
25 | 2 | 1 | 60 | 12.62 | 0.633
26 | 2.5 | 1.5 | 30 | 12.37 | 0.677
48 | 3.78 | 1 | 30 | 12.85 | 0.491
49 | 1.73 | 0.5 | 30 | 14.13 | 1.189
50 | 2 | 1 | 70 | 11.5 | 0.632
Table 5. KPCA dimension reduction data.
No. | Y1 | Y2 | Y3 | Permeability/(10−5 m²)
1 | 0.615 | −0.972 | 1.635 | 0.881
2 | −0.404 | −0.453 | −0.373 | 1.062
3 | −0.497 | 2.050 | −2.133 | 0.559
24 | −0.906 | −1.783 | −0.342 | 0.805
25 | 0.173 | −0.330 | 0.496 | 0.633
26 | −0.190 | −0.404 | −0.053 | 0.677
48 | 0.092 | 1.489 | −0.806 | 0.491
49 | −1.578 | −0.560 | −2.232 | 1.189
50 | 0.967 | −0.088 | 1.646 | 0.632
Table 6. The parameters of each model are set.
Parameter Name | Specific Setting
Population size | 30
Maximum iterations | 100
BPNN training times | 1000
BPNN target error | 1 × 10−6
BPNN learning rate | 0.01
BPNN hidden layer nodes | 12
SVM cross-validate parameters | 5
SVM option.gap | 0.9
SVM option.cbound | [1, 100]
SVM option.gbound | [1, 100]
PSO learning factor | 1.5
PSO inertia weight | 0.8
PSO maximum speed limit | 1
PSO minimum speed limit | −1
MPA FADs | 0.2
Probability of WOA contraction enveloping mechanism | [0.1]
WOA spiral position update probability | [0.1]
Variation range of BES spiral trajectory | (0.5, 2)
BES position change parameters | (1.5, 2)
BES spiral trajectory parameters | (0, 5)
Table 7. Raw data predicted results.
No. | True Value | PSO-BPNN | PSO-LSSVM | PSO-SVM | MPA-BPNN | WOA-SVM | BES-SVM | DBO-BPNN | IDBO-BPNN
40 | 0.891 | 0.759 | 0.863 | 0.804 | 0.797 | 0.820 | 0.801 | 0.815 | 0.803
41 | 0.516 | 0.548 | 0.635 | 0.582 | 0.584 | 0.582 | 0.579 | 0.588 | 0.552
42 | 0.619 | 0.525 | 0.582 | 0.585 | 0.609 | 0.600 | 0.613 | 0.608 | 0.611
43 | 0.632 | 0.569 | 0.612 | 0.613 | 0.635 | 0.629 | 0.641 | 0.635 | 0.642
45 | 0.564 | 0.602 | 0.711 | 0.676 | 0.680 | 0.704 | 0.691 | 0.683 | 0.665
46 | 0.786 | 0.724 | 0.867 | 0.840 | 0.811 | 0.870 | 0.841 | 0.844 | 0.865
47 | 0.683 | 0.732 | 0.705 | 0.784 | 0.740 | 0.689 | 0.670 | 0.736 | 0.688
48 | 0.491 | 0.412 | 0.534 | 0.518 | 0.544 | 0.544 | 0.538 | 0.518 | 0.487
49 | 1.189 | 1.044 | 1.171 | 1.070 | 1.163 | 1.146 | 1.113 | 1.127 | 1.151
50 | 0.632 | 0.632 | 0.727 | 0.704 | 0.690 | 0.725 | 0.695 | 0.703 | 0.686
Table 8. Principal component data prediction results.
No. | True Value | PSO-BPNN | PSO-LSSVM | PSO-SVM | MPA-BPNN | WOA-SVM | BES-SVM | DBO-BPNN | IDBO-BPNN
40 | 0.891 | 0.746 | 0.856 | 0.805 | 0.850 | 0.829 | 0.834 | 0.830 | 0.850
41 | 0.516 | 0.541 | 0.472 | 0.542 | 0.500 | 0.505 | 0.520 | 0.479 | 0.518
42 | 0.619 | 0.651 | 0.589 | 0.607 | 0.634 | 0.639 | 0.558 | 0.613 | 0.621
43 | 0.632 | 0.685 | 0.628 | 0.631 | 0.637 | 0.650 | 0.596 | 0.620 | 0.626
45 | 0.564 | 0.644 | 0.498 | 0.681 | 0.521 | 0.514 | 0.542 | 0.545 | 0.558
46 | 0.786 | 0.788 | 0.723 | 0.864 | 0.759 | 0.712 | 0.750 | 0.812 | 0.789
47 | 0.683 | 0.759 | 0.627 | 0.698 | 0.676 | 0.676 | 0.703 | 0.653 | 0.659
48 | 0.491 | 0.512 | 0.553 | 0.503 | 0.522 | 0.518 | 0.493 | 0.513 | 0.511
49 | 1.189 | 1.134 | 1.174 | 1.165 | 1.186 | 1.185 | 1.153 | 1.167 | 1.190
50 | 0.632 | 0.655 | 0.554 | 0.659 | 0.587 | 0.582 | 0.607 | 0.622 | 0.625
Table 9. Comparison of raw data evaluation indicators.
Model | MAE (Train / Test) | MAPE/% (Train / Test) | RMSE (Train / Test) | R2 (Train / Test) | MSE (Train / Test) | FBR/% (Train / Test)
PSO-BPNN | 0.0564 / 0.0695 | 7.35 / 9.65 | 0.0775 / 0.0815 | 0.8568 / 0.8318 | 0.0060 / 0.0066 | 4.83 / 5.61
PSO-LSSVM | 0.0542 / 0.0608 | 8.34 / 10.00 | 0.0705 / 0.0751 | 0.8817 / 0.8573 | 0.0050 / 0.0056 | −5.44 / −7.27
PSO-SVM | 0.0526 / 0.0692 | 7.38 / 9.95 | 0.0679 / 0.0770 | 0.8903 / 0.8499 | 0.0046 / 0.0059 | −2.56 / −4.30
MPA-BPNN | 0.0457 / 0.0510 | 6.65 / 8.00 | 0.0569 / 0.0614 | 0.9230 / 0.9046 | 0.0032 / 0.0038 | −1.49 / −5.14
WOA-SVM | 0.0462 / 0.0576 | 6.75 / 8.95 | 0.0582 / 0.0703 | 0.9193 / 0.8748 | 0.0034 / 0.0049 | −2.64 / −5.93
BES-SVM | 0.0486 / 0.0548 | 7.00 / 8.19 | 0.0589 / 0.0657 | 0.9173 / 0.8907 | 0.0035 / 0.0043 | −1.31 / −4.33
DBO-BPNN | 0.0447 / 0.0551 | 6.54 / 8.27 | 0.0570 / 0.0639 | 0.9225 / 0.8966 | 0.0033 / 0.0041 | −2.60 / −5.17
IDBO-BPNN | 0.0397 / 0.0424 | 5.60 / 6.11 | 0.0534 / 0.0550 | 0.9319 / 0.9234 | 0.0029 / 0.0030 | −0.46 / −3.06
Table 10. Comparison of evaluation indexes of principal component data.
Model | MAE (Train / Test) | MAPE/% (Train / Test) | RMSE (Train / Test) | R2 (Train / Test) | MSE (Train / Test) | FBR/% (Train / Test)
PSO-BPNN | 0.0327 / 0.0511 | 4.63 / 7.27 | 0.0405 / 0.0644 | 0.9609 / 0.8949 | 0.0016 / 0.0042 | −0.62 / −3.10
PSO-LSSVM | 0.0167 / 0.0453 | 2.56 / 7.21 | 0.0422 / 0.0506 | 0.9575 / 0.9352 | 0.0018 / 0.0026 | 1.07 / 4.67
PSO-SVM | 0.0388 / 0.0398 | 5.52 / 5.85 | 0.0497 / 0.0544 | 0.9411 / 0.9250 | 0.0025 / 0.0030 | −0.49 / −3.08
MPA-BPNN | 0.0120 / 0.0233 | 1.70 / 3.67 | 0.0200 / 0.0280 | 0.9905 / 0.9802 | 0.0004 / 0.0008 | −0.35 / 1.77
WOA-SVM | 0.0114 / 0.0322 | 1.80 / 4.80 | 0.0286 / 0.0398 | 0.9805 / 0.9599 | 0.0008 / 0.0016 | −0.18 / 2.51
BES-SVM | 0.0175 / 0.0300 | 2.49 / 4.16 | 0.0311 / 0.0353 | 0.9770 / 0.9685 | 0.0010 / 0.0012 | 1.21 / 3.34
DBO-BPNN | 0.0130 / 0.0246 | 1.91 / 3.61 | 0.0237 / 0.0288 | 0.9866 / 0.9790 | 0.0006 / 0.0008 | 1.02 / 2.06
IDBO-BPNN | 0.0045 / 0.0112 | 0.70 / 1.66 | 0.0074 / 0.0168 | 0.9987 / 0.9929 | 0.0001 / 0.0003 | −0.02 / 0.57
Table 11. Pearson correlation coefficient matrix.
Indicator | Effective Stress | Gas Pressure | Temperature | Compressive Strength
Effective stress | 1 | −0.062 | 0.056 | −0.122
Gas pressure | −0.062 | 1 | 0.229 | −0.230
Temperature | 0.056 | 0.229 | 1 | −0.434
Compressive strength | −0.122 | −0.230 | −0.434 | 1
Table 12. Prediction results of each model for the test samples.
No. | True Value | PSO-BPNN | PSO-LSSVM | PSO-SVM | MPA-BPNN | WOA-SVM | BES-SVM | DBO-BPNN | IDBO-BPNN
48 | 0.516 | 0.527 | 0.556 | 0.516 | 0.564 | 0.578 | 0.567 | 0.570 | 0.561
49 | 0.810 | 0.834 | 0.762 | 0.836 | 0.840 | 0.828 | 0.845 | 0.827 | 0.839
50 | 0.516 | 0.572 | 0.568 | 0.552 | 0.581 | 0.566 | 0.576 | 0.572 | 0.567
51 | 0.514 | 0.564 | 0.527 | 0.554 | 0.557 | 0.570 | 0.551 | 0.562 | 0.538
52 | 0.511 | 0.557 | 0.522 | 0.550 | 0.516 | 0.520 | 0.517 | 0.518 | 0.533
53 | 1.056 | 1.032 | 0.832 | 1.034 | 0.945 | 0.929 | 0.931 | 0.945 | 0.935
54 | 0.489 | 0.545 | 0.522 | 0.537 | 0.516 | 0.520 | 0.519 | 0.518 | 0.516
55 | 0.680 | 0.649 | 0.718 | 0.658 | 0.742 | 0.762 | 0.718 | 0.752 | 0.718
56 | 0.845 | 0.925 | 0.844 | 0.927 | 0.872 | 0.869 | 0.871 | 0.863 | 0.853
57 | 0.645 | 0.572 | 0.575 | 0.560 | 0.608 | 0.615 | 0.616 | 0.602 | 0.602
58 | 0.431 | 0.667 | 0.616 | 0.667 | 0.598 | 0.602 | 0.616 | 0.590 | 0.560
59 | 0.580 | 0.676 | 0.649 | 0.677 | 0.598 | 0.602 | 0.616 | 0.590 | 0.595
60 | 0.768 | 0.671 | 0.718 | 0.677 | 0.758 | 0.762 | 0.777 | 0.769 | 0.775
61 | 0.478 | 0.547 | 0.532 | 0.540 | 0.549 | 0.538 | 0.547 | 0.535 | 0.557
62 | 0.745 | 0.704 | 0.691 | 0.695 | 0.643 | 0.645 | 0.669 | 0.641 | 0.673
63 | 0.850 | 0.802 | 0.823 | 0.808 | 0.781 | 0.813 | 0.793 | 0.796 | 0.763
64 | 0.834 | 0.862 | 0.801 | 0.863 | 0.817 | 0.799 | 0.826 | 0.802 | 0.792
65 | 0.654 | 0.688 | 0.629 | 0.683 | 0.595 | 0.578 | 0.606 | 0.587 | 0.608
66 | 0.567 | 0.544 | 0.567 | 0.537 | 0.515 | 0.518 | 0.518 | 0.517 | 0.533
67 | 0.582 | 0.561 | 0.628 | 0.532 | 0.589 | 0.575 | 0.582 | 0.580 | 0.585
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
