A Novel Radial Basis Function Neural Network with High Generalization Performance for Nonlinear Process Modelling
Abstract
1. Introduction
1. The convergence of the RBFNN-GP is verified theoretically, which ensures its successful application;
2. The effectiveness and feasibility of the RBFNN-GP are verified by predicting the key water quality parameters of the wastewater treatment process.
2. Materials and Methods
2.1. Radial Basis Function Neural Network (RBFNN)
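The RBFNN reviewed here is taken to be the standard Gaussian form, y(x) = Σ_j w_j exp(−‖x − c_j‖² / (2σ_j²)). The following minimal sketch of that forward pass is for orientation only; the function name, variable names, and example values are illustrative and not taken from the paper.

```python
import numpy as np

def rbf_forward(x, centers, widths, weights):
    """Forward pass of a standard Gaussian RBF network.

    x       : (d,) input vector
    centers : (J, d) centres of the J hidden units
    widths  : (J,) Gaussian width (sigma) of each hidden unit
    weights : (J,) output-layer weights
    """
    sq_dist = np.sum((centers - x) ** 2, axis=1)   # ||x - c_j||^2 for every unit
    phi = np.exp(-sq_dist / (2.0 * widths ** 2))   # hidden-layer activations
    return float(phi @ weights)                    # linear output layer

# Tiny usage example: three hidden units, two-dimensional input
centers = np.array([[0.0, 0.0], [1.0, 1.0], [-1.0, 0.5]])
widths = np.array([1.0, 0.8, 1.2])
weights = np.array([0.5, -0.3, 0.7])
print(rbf_forward(np.array([0.2, -0.1]), centers, widths, weights))
```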
2.2. Local Generalization Error Bound
3. RBFNN with High Generalization Performance (RBFNN-GP)
3.1. Sensitivity Measurement (SM) Method
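The SM method rests on quantifying how strongly each hidden unit influences the network output. As a rough, generic illustration only, not the SM formula defined in the paper, the sketch below scores every hidden unit by how much its weighted activation changes under small random input perturbations; all names, the noise model, and the default values are assumptions.

```python
import numpy as np

def unit_sensitivity(X, centers, widths, weights, n_perturb=100, noise_std=0.05, seed=0):
    """Monte-Carlo perturbation sensitivity of each hidden unit (illustrative only).

    X : (N, d) sample inputs. Returns a (J,) score: the mean squared change of
    each unit's weighted activation w_j * phi_j(x) under small input noise.
    Units with a very small score contribute little to the output.
    """
    rng = np.random.default_rng(seed)

    def weighted_activations(X):
        # phi_j(x) = exp(-||x - c_j||^2 / (2 sigma_j^2)) for every sample/unit pair
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * widths ** 2)) * weights

    base = weighted_activations(X)                      # (N, J) unperturbed values
    score = np.zeros(len(weights))
    for _ in range(n_perturb):
        Xp = X + rng.normal(0.0, noise_std, size=X.shape)
        score += np.mean((weighted_activations(Xp) - base) ** 2, axis=0)
    return score / n_perturb
```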
3.2. Structural Self-Organization Optimization (SSO) Strategy
3.2.1. Growth Stage
3.2.2. Pruning Stage
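To make the two stages concrete, the skeleton below shows one way a threshold-driven grow/prune step could be wired up. It is a simplified placeholder, not the SSO strategy itself: in the paper the growth and pruning decisions are derived from the sensitivity measure and the local generalization error bound with thresholds λ1 and λ2, whereas the names, default values, and initialisation choices below are assumptions.

```python
import numpy as np

def self_organize_step(centers, widths, weights, unit_scores, gen_error,
                       lambda1=0.05, lambda2=0.01, candidate_center=None):
    """One illustrative grow/prune step (placeholder logic, not the SSO strategy).

    unit_scores : (J,) importance score of each hidden unit (e.g. a sensitivity value)
    gen_error   : current estimate of the generalization error
    lambda1     : growth threshold  -- add a unit while the error stays too large
    lambda2     : pruning threshold -- remove units whose score is too small
    """
    # Pruning stage: drop hidden units whose importance score falls below lambda2.
    keep = unit_scores >= lambda2
    centers, widths, weights = centers[keep], widths[keep], weights[keep]

    # Growth stage: if the error estimate still exceeds lambda1, insert one unit
    # at a supplied candidate centre (e.g. the worst-fitted training sample).
    if gen_error > lambda1 and candidate_center is not None:
        centers = np.vstack([centers, np.asarray(candidate_center).reshape(1, -1)])
        widths = np.append(widths, 1.0)    # placeholder initial width
        weights = np.append(weights, 0.0)  # placeholder initial weight
    return centers, widths, weights
```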
4. Convergence Analysis
4.1. Convergence Analysis of RBFNN with Fixed Structure
4.2. Convergence Analysis of RBFNN with Changeable Structure
4.2.1. Growth Stage
4.2.2. Pruning Stage
5. Experimental Studies
5.1. Benchmark Example A
5.2. Benchmark Example B
5.3. Permeability Prediction of Membrane Bio-Reactor
6. Discussion
6.1. Computational Complexity
6.2. Future Trends
7. Conclusions
1. With the help of the sensitivity measurements and the local generalization error bound, the network performance can be assessed statistically, and structural self-organization is achieved without a strong dependence on the number of samples.
2. The convergence of the RBFNN-GP with both fixed and variable structures is guaranteed by the thresholds λ1 and λ2. Therefore, the proposed RBFNN-GP not only reduces the number of additional parameters but also decreases the computational burden.
3. Compared with existing algorithms, the proposed RBFNN-GP shows good generalization ability in predicting the key water quality parameters of the wastewater treatment process. Furthermore, the approach can be extended to other types of networks and industrial domains.
Author Contributions
Funding
Institutional Review Board Statement
Conflicts of Interest
References
- Xu, L.; Qian, F.; Li, Y.; Li, Q.; Yang, Y.-W.; Xu, J. Resource allocation based on quantum particle swarm optimization and RBF neural network for overlay cognitive OFDM System. Neurocomputing 2016, 173, 1250–1256.
- Xiong, T.; Bao, Y.; Hu, Z.; Chiong, R. Forecasting interval time series using a fully complex-valued RBF neural network with DPSO and PSO algorithms. Inf. Sci. 2015, 305, 77–92.
- Han, H.G.; Zhang, L.; Hou, Y.; Qiao, J.F. Nonlinear Model Predictive Control Based on a Self-Organizing Recurrent Neural Network. IEEE Trans. Neural Netw. Learn. Syst. 2015, 27, 402–415.
- Yang, Y.-K.; Sun, T.-Y.; Huo, C.-L.; Yu, Y.-H.; Liu, C.-C.; Tsai, C.-H. A novel self-constructing Radial Basis Function Neural-Fuzzy System. Appl. Soft Comput. 2013, 13, 2390–2404.
- Niyogi, P.; Girosi, F. On the Relationship between Generalization Error, Hypothesis Complexity, and Sample Complexity for Radial Basis Functions. Neural Comput. 1996, 8, 819–842.
- Qiao, J.-F.; Han, H.-G. Optimal Structure Design for RBFNN Structure. Acta Autom. Sin. 2010, 36, 865–872.
- Vapnik, V.N. Estimation of Dependences Based on Empirical Data; Springer: Berlin, Germany, 1982.
- Pollard, D. Convergence of Stochastic Processes; Springer: Berlin, Germany, 1984.
- Haussler, D. Decision-theoretic generalizations of the PAC model for neural net and other learning applications. Inf. Comput. 1992, 100, 78–150.
- Barron, A.R. Approximation and estimation bounds for artificial neural networks. Mach. Learn. 1994, 14, 115–133.
- Yeung, D.S.; Ng, W.W.Y.; Wang, D.; Tsang, E.C.C.; Wang, X.-Z. Localized Generalization Error Model and Its Application to Architecture Selection for Radial Basis Function Neural Network. IEEE Trans. Neural Netw. 2007, 18, 1294–1305.
- Sarraf, A. A tight upper bound on the generalization error of feedforward neural networks. Neural Netw. 2020, 127, 1–6.
- Han, H.; Wu, X.; Liu, H.; Qiao, J. An Efficient Optimization Method for Improving Generalization Performance of Fuzzy Neural Networks. IEEE Trans. Fuzzy Syst. 2019, 27, 1347–1361.
- Terada, Y.; Hirose, R. Fast generalization error bound of deep learning without scale invariance of activation functions. Neural Netw. 2020, 129, 344–358.
- Suzuki, T. Fast generalization error bound of deep learning from a kernel perspective. In Proceedings of the International Conference on Artificial Intelligence and Statistics, Playa Blanca, Lanzarote, 9–11 April 2018; pp. 1397–1406.
- Ng, W.; Yeung, D.; Wang, X.-Z.; Cloete, I. A study of the difference between partial derivative and stochastic neural network sensitivity analysis for applications in supervised pattern classification problems. In Proceedings of the 2004 International Conference on Machine Learning and Cybernetics, Shanghai, China, 26–29 August 2004.
- Zhou, H.; Zhang, Y.; Duan, W.; Zhao, H. Nonlinear systems modelling based on self-organizing fuzzy neural network with hierarchical pruning scheme. Appl. Soft Comput. 2020, 95, 106516.
- Xie, Y.; Yu, J.; Xie, S.; Huang, T.; Gui, W. On-line prediction of ferrous ion concentration in goethite process based on self-adjusting structure RBF neural network. Neural Netw. 2019, 116, 1–10.
- Huang, G.-B.; Saratchandran, P.; Sundararajan, N. An Efficient Sequential Learning Algorithm for Growing and Pruning RBF (GAP-RBF). IEEE Trans. Syst. Man Cybern. Part B-Cybern. 2004, 34, 2284–2292.
- Huang, G.-B.; Saratchandran, P.; Sundararajan, N. A Generalized Growing and Pruning RBF (GGAP-RBF) Neural Network for Function Approximation. IEEE Trans. Neural Netw. 2005, 16, 57–67.
- Han, H.G.; Wu, X.L.; Zhang, L.; Tian, Y.; Qiao, J.F. Self-Organizing RBF neural network using an adaptive gradient multi-objective particle swarm optimization. IEEE Trans. Cybern. 2019, 49, 69–82.
- Han, H.-G.; Chen, Q.-L.; Qiao, J.-F. An efficient self-organizing RBF neural network for water quality prediction. Neural Netw. 2011, 24, 717–725.
- Han, H.-G.; Ma, M.-L.; Yang, H.-Y.; Qiao, J.-F. Self-organizing radial basis function neural network using accelerated second-order learning algorithm. Neurocomputing 2021, 469, 1–12.
- Li, Z.-Q.; Zhao, Y.-P.; Cai, Z.-Y.; Xi, P.-P.; Pan, Y.-T.; Huang, G.; Zhang, T.-H. A proposed self-organizing radial basis function network for aero-engine thrust estimation. Aerosp. Sci. Technol. 2019, 87, 167–177.
- Shi, J.R.; Wang, D.; Shang, F.H.; Zhang, H.Y. Research advances on stochastic gradient descent algorithms. Acta Autom. Sin. 2021, 47, 2103–2119.
- Hoeffding, W. Probability Inequalities for Sums of Bounded Random Variables. J. Am. Stat. Assoc. 1963, 58, 13–30.
- Jiang, Q.; Gao, D.-C.; Zhong, L.; Guo, S.; Xiao, A. Quantitative sensitivity and reliability analysis of sensor networks for well kick detection based on dynamic Bayesian networks and Markov chain. J. Loss Prev. Process. Ind. 2020, 66, 104180.
- Belouz, K.; Nourani, A.; Zereg, S.; Bencheikh, A. Prediction of greenhouse tomato yield using artificial neural networks combined with sensitivity analysis. Sci. Hortic. 2021, 293, 110666.
- Han, H.-G.; Qiao, J.-F. Adaptive computation algorithm for RBF neural network. IEEE Trans. Neural Netw. Learn. Syst. 2011, 23, 342–347.
- Xie, T.; Yu, H.; Hewlett, J.; Rózycki, P.; Wilamowski, B. Fast and efficient second-order method for training radial basis function networks. IEEE Trans. Neural Netw. Learn. Syst. 2012, 23, 609–619.
- Han, H.-G.; Ma, M.-L.; Qiao, J.-F. Accelerated gradient algorithm for RBF neural network. Neurocomputing 2021, 441, 237–247.
- Zemouri, R.; Racoceanu, D.; Zerhouni, N. Recurrent radial basis function network for time-series prediction. Eng. Appl. Artif. Intell. 2003, 16, 453–463.
- Han, H.-G.; Lu, W.; Hou, Y.; Qiao, J.-F. An Adaptive-PSO-Based Self-Organizing RBF Neural Network. IEEE Trans. Neural Netw. Learn. Syst. 2016, 29, 104–117.
- Teychene, B.; Guigui, C.; Cabassud, C. Engineering of an MBR supernatant fouling layer by fine particles addition: A possible way to control cake compressibility. Water Res. 2011, 45, 2060–2072.
- Huyskens, C.; Brauns, E.; Vanhoof, E.; Dewever, H. A new method for the evaluation of the reversible and irreversible fouling propensity of MBR mixed liquor. J. Membr. Sci. 2008, 323, 185–192.
- Feng, H.-M. Self-generation RBFNs using evolutional PSO learning. Neurocomputing 2006, 70, 241–251.
| Methods | No. of NNs | CPU Time Mean (s) | CPU Time Dev. (s) | Testing RMSE Mean | Testing RMSE Dev. | Testing RMSE Max. | Training RMSE Mean |
|---|---|---|---|---|---|---|---|
| Fixed-RBFNN | 8 | 100.10 | 0.096 | 0.031 | 0.0037 | 0.040 | 0.042 |
| SASOA-FNN [12] | 8 | 108.29 | 0.031 | 0.029 | 0.0043 | 0.033 | 0.041 |
| SOFNN-HPS [17] | 12 | 131.02 | 0.076 | 0.047 | 0.0053 | 0.063 | 0.057 |
| SAS-RBFNN [18] | 9 | 119.35 | 0.024 | 0.039 | 0.0051 | 0.059 | 0.052 |
| AGMOPSO [21] | 8 | 106.29 | 0.028 | 0.024 | 0.0037 | 0.039 | 0.041 |
| ASOL-SORBFNN [23] | 8 | 135.46 | 0.031 | 0.035 | 0.0046 | 0.041 | 0.040 |
| RBFNN-GP | 7 | 100.36 | 0.012 | 0.027 | 0.0033 | 0.029 | 0.039 |
| Methods | No. of NNs | CPU Time Mean (s) | CPU Time Dev. (s) | Testing RMSE Mean | Testing RMSE Dev. | Testing RMSE Max. | Training RMSE Mean |
|---|---|---|---|---|---|---|---|
| Fixed-RBFNN | 8 | 104.10 | 0.069 | 0.036 | 0.0039 | 0.035 | 0.312 |
| SASOA-FNN [12] | 7 | 126.29 | 0.030 | 0.029 | 0.0041 | 0.028 | 0.124 |
| SOFNN-HPS [17] | 11 | 116.3 | 0.076 | 0.054 | 0.0066 | 0.068 | 0.261 |
| SAS-RBFNN [18] | 10 | 123.97 | 0.026 | 0.034 | 0.0045 | 0.031 | 0.219 |
| AGMOPSO [21] | 8 | 119.89 | 0.029 | 0.027 | 0.0039 | 0.031 | 0.217 |
| ASOL-SORBFNN [23] | 8 | 133.91 | 0.065 | 0.043 | 0.0044 | 0.040 | 0.251 |
| RBFNN-GP | 6 | 105.36 | 0.022 | 0.026 | 0.0032 | 0.019 | 0.215 |
| Methods | No. of NNs | CPU Time Mean (s) | CPU Time Dev. (s) | Testing RMSE Mean | Testing RMSE Dev. | Testing RMSE Max. | Training RMSE Mean |
|---|---|---|---|---|---|---|---|
| Fixed-RBFNN | 8 | 105.65 | 0.037 | 0.033 | 0.0034 | 0.037 | 0.031 |
| SASOA-FNN [12] | 8 | 109.21 | 0.035 | 0.028 | 0.0033 | 0.021 | 0.024 |
| SOFNN-HPS [17] | 11 | 126.34 | 0.032 | 0.042 | 0.0049 | 0.025 | 0.032 |
| SAS-RBFNN [18] | 9 | 121.62 | 0.042 | 0.025 | 0.0038 | 0.029 | 0.035 |
| AGMOPSO [21] | 8 | 114.25 | 0.031 | 0.030 | 0.0042 | 0.031 | 0.114 |
| ASOL-SORBFNN [23] | 11 | 125.31 | 0.040 | 0.028 | 0.0041 | 0.040 | 0.127 |
| RBFNN-GP | 6 | 108.24 | 0.032 | 0.025 | 0.0040 | 0.022 | 0.022 |
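The testing and training RMSE values compared in the tables above are assumed to follow the usual definition, RMSE = √((1/N) Σᵢ (yᵢ − ŷᵢ)²). A minimal helper for reproducing such numbers under that standard formula:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error, the accuracy metric reported in the tables above."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# e.g. rmse([1.0, 2.0, 3.0], [1.1, 1.9, 3.2]) ≈ 0.1414
```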
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).