Quantum Neural Networks Approach for Water Discharge Forecast
Abstract
1. Introduction
2. Methodology and Data Series
2.1. Computational Experiment
2.2. Quantum Neural Network
2.2.1. Basic Concept of Quantum Neural Network
2.2.2. Structure of Quantum Neural Networks
- (1) Quantum Feature Mapping (Data Encoding)
- (2) Measurement and Optimization
  - Pauli-Z measurement: computing the expectation value ⟨Z⟩ for classification or regression tasks [50]. A minimal numerical sketch of this step is given after this list.
  - Shot-based sampling: since quantum measurements are probabilistic, multiple measurements (shots) are required to obtain a stable output.
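To make the measurement step concrete, the following is a minimal numerical sketch (not the implementation used in the paper) of angle-encoding one scaled input with an RY rotation, computing the exact Pauli-Z expectation ⟨Z⟩ = P(0) − P(1), and estimating it from a finite number of shots. The feature value, shot count, and single-qubit circuit are illustrative assumptions.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]])

# Illustrative scaled input feature (angle encoding of one value).
x = 0.8
state = ry(x) @ np.array([1.0, 0.0])      # |psi(x)> = RY(x)|0>

# Exact Pauli-Z expectation: <Z> = P(|0>) - P(|1>)
p0, p1 = np.abs(state) ** 2
z_exact = p0 - p1

# Shot-based estimate: sample +1/-1 outcomes and average them.
shots = 1024
rng = np.random.default_rng(0)
samples = rng.choice([1.0, -1.0], size=shots, p=[p0, p1])
z_shots = samples.mean()

print(f"<Z> exact = {z_exact:.4f}, estimate from {shots} shots = {z_shots:.4f}")
```

Increasing the number of shots reduces the sampling error of the estimate at a rate of roughly 1/√shots, which is why shot-based sampling needs many repeated measurements before the output stabilizes.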
2.2.3. Computational Process of QNN
- (a) Mapping Classical Data to Quantum States. The input data x are transformed into a quantum state using a quantum feature mapping circuit, |ψ(x)⟩ = U_φ(x)|0⟩.
- (b) Quantum Circuit Processing. The Variational Quantum Circuit (VQC) applies parameterized quantum gate operations to the encoded state, |ψ(x, θ)⟩ = U(θ)|ψ(x)⟩.
- (c) Measurement. The expectation value of the resulting quantum state is measured, ⟨Z⟩ = ⟨ψ(x, θ)|Z|ψ(x, θ)⟩.
- (d) Parameter Optimization. A classical optimization algorithm updates the quantum circuit parameters, using Equation (8).
- (e) Training and Convergence. Steps (a)-(d) are iterated until the loss function (mean squared error, MSE, or mean absolute error, MAE) converges, yielding the optimized parameters θ. An end-to-end sketch of this loop is given after this list.
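Steps (a)-(e) can be illustrated end to end with a small, self-contained sketch. It is not the circuit used in the paper (whose feature map, ansatz, and Equation (8) are not reproduced here); it simply wires together a single-qubit RY feature map, an RY-based variational block, Pauli-Z expectation readout, and a classical COBYLA optimizer (the optimizer family cited from qiskit-algorithms) minimizing the MSE loss. The toy data and circuit depth are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]])

def qnn_output(x, params):
    """Steps (a)-(c): encode x, apply the variational block, measure <Z>."""
    state = ry(x) @ np.array([1.0, 0.0])   # (a) feature map |psi(x)> = U_phi(x)|0>
    for theta in params:                   # (b) variational circuit U(theta)
        state = ry(theta) @ state
    p0, p1 = np.abs(state) ** 2
    return p0 - p1                         # (c) expectation value <Z>

def mse_loss(params, xs, ys):
    """Step (e) objective: mean squared error over the training pairs."""
    preds = np.array([qnn_output(x, params) for x in xs])
    return float(np.mean((preds - ys) ** 2))

# Toy data standing in for discharge values rescaled to [-1, 1] (assumed preprocessing).
xs = np.linspace(0.0, np.pi, 20)
ys = 0.5 * np.sin(xs)

# Step (d): a classical optimizer (here COBYLA) updates the circuit parameters theta.
rng = np.random.default_rng(1)
theta0 = rng.uniform(-np.pi, np.pi, size=3)
result = minimize(mse_loss, theta0, args=(xs, ys), method="COBYLA",
                  options={"maxiter": 500})

print("optimized theta:", np.round(result.x, 4), "final MSE:", round(result.fun, 6))
```

On real hardware or a shot-based simulator, qnn_output would return the shot-estimated expectation from the previous sketch instead of the exact value, and training would stop once the MSE (or MAE) no longer improves.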
2.3. Study Area and Data Series
3. Results and Discussion
3.1. Results for the Raw Series
3.2. Results for the Series After the Anomalies Removal
3.3. Results Comparisons
- ➢ On the S series:
  - On the training set, the lowest MAE, 5.7250, was obtained using a hybrid Sparrow Search Algorithm–Backpropagation Neural Network (SSA-BP), and on the test set, 4.2351, by a Convolutional Neural Network–Long Short-Term Memory (CNN-LSTM).
  - On the training set, the lowest MSE, 80.5765, resulted from an Echo State Network (ESN) algorithm, and on the test set, 32.4993, from SSA-BP.
  - On the training set, the highest R2, 0.9976, resulted after fitting a Sparrow Search Algorithm–Echo State Network (SSA-ESN) model, and on the test set, 0.9983, after using a Long Short-Term Memory (LSTM) model.
- ➢ On the S1 series:
  - On the training set, the lowest MAE, 6.5177, was obtained using CNN-LSTM, and on the test set, 4.4784, by the same algorithm.
  - On the training set, the lowest MSE, 102.9393, resulted from an ESN algorithm, and on the test set, 39.7982, from CNN-LSTM.
  - On the training and test sets, the highest R2, 0.9899 and 0.9917, respectively, resulted from fitting an LSTM model.
- ➢ On the S2 series:
  - On the training and test sets, the lowest MAEs, 4.7433 and 3.5245, respectively, were obtained using CNN-LSTM.
  - On the training set, the lowest MSE, 57.3421, resulted from an SSA-ESN algorithm, and on the test set, 29.8323, from CNN-LSTM.
  - On both training and test sets, the highest R2, 0.9992 and 0.997, respectively, resulted from fitting an LSTM model.

The definitions of MAE, MSE, and R2 used in these comparisons are recalled in the sketch after this list.
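The three scores compared above follow their standard definitions: MAE is the mean absolute error, MSE the mean squared error, and R2 the coefficient of determination. The short helper below (illustrative, not the evaluation script used in the study) computes them for any pair of observed and predicted series.

```python
import numpy as np

def scores(y_true, y_pred):
    """Return (MAE, MSE, R2) for observed and predicted series."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_true - y_pred
    mae = np.mean(np.abs(err))                         # mean absolute error
    mse = np.mean(err ** 2)                            # mean squared error
    ss_res = np.sum(err ** 2)                          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)     # total sum of squares
    r2 = 1.0 - ss_res / ss_tot                         # coefficient of determination
    return mae, mse, r2

# Example with made-up values (not data from the study):
mae, mse, r2 = scores([10.0, 12.0, 9.0, 14.0], [9.5, 12.4, 9.2, 13.1])
print(f"MAE = {mae:.4f}, MSE = {mse:.4f}, R2 = {r2:.4f}")
```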
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Rentschler, J.; Salhab, M.; Jafino, B.A. Flood exposure and poverty in 188 countries. Nat. Commun. 2022, 13, 3527. [Google Scholar] [CrossRef]
- World Meteorological Organization. Provisional State of the Global Climate in 2022; World Meteorological Organization: Geneva, Switzerland, 2022; Available online: https://www.rmets.org/sites/default/files/2022-11/provisional_state_of_the_climate_2022_1_november_for_cop27_2.pdf (accessed on 4 April 2025).
- Serdar, M.Z.; Ajjur, S.B.; Al-Ghamdi, S.G. Flood susceptibility assessment in arid areas: A case study of Qatar. Sustainability 2022, 14, 9792. [Google Scholar] [CrossRef]
- Waleed, M.; Sajjad, M. Advancing flood susceptibility prediction: A comparative assessment and scalability analysis of machine learning algorithms via artificial intelligence in high-risk regions of Pakistan. J. Flood Risk Manag. 2025, 18, e13047. [Google Scholar] [CrossRef]
- Bărbulescu, A.; Maftei, C.E. Evaluating the Probable Maximum Precipitation. Case study from the Dobrogea region, Romania. Rom. Rep. Phys. 2023, 75, 704. [Google Scholar] [CrossRef]
- Popescu-Bodorin, N.; Bărbulescu, A. A ten times smaller version of CPC Global Daily Precipitation Dataset for parallel distributed processing in Matlab and R. Rom. Rep. Phys. 2024, 76, 703. [Google Scholar]
- Zhao, G.; Pang, B.; Xu, Z.; Peng, D.; Xu, L. Assessment of urban flood susceptibility using semi-supervised machine learning model. Sci. Total Environ. 2019, 659, 940–949. [Google Scholar] [CrossRef]
- Valipour, M.; Banihabib, M.E.; Behbahani, S.M.R. Comparison of the ARMA, ARIMA, and the autoregressive artificial neural network models in forecasting the monthly inflow of Dez dam reservoir. J. Hydrol. 2013, 476, 433–441. [Google Scholar] [CrossRef]
- Phan, T.-T.-H.; Nguyen, X.H. Combining statistical machine learning models with ARIMA for water level forecasting: The case of the Red river. Adv. Water Resour. 2020, 142, 103656. [Google Scholar] [CrossRef]
- Zhang, X.; Wu, X.; Zhu, G.; Lu, X.; Wang, K. A seasonal ARIMA model based on the gravitational search algorithm (GSA) for runoff prediction. Water Supply 2022, 22, 6959–6977. [Google Scholar] [CrossRef]
- Subha, J.; Saudia, S. Robust Flood Prediction Approaches Using Exponential Smoothing and ARIMA Models. In Artificial Intelligence and Sustainable Computing; Pandit, M., Gaur, M.K., Kumar, S., Eds.; Springer: Singapore, 2023; pp. 457–470. [Google Scholar]
- Alonso Brito, G.R.; Rivero Villaverde, A.; Lau Quan, A.; Ruíz Pérez, M.E. Comparison between SARIMA and Holt–Winters models for forecasting monthly streamflow in the western region of Cuba. SN Appl. Sci. 2021, 3, 671. [Google Scholar] [CrossRef]
- Yang, C.; Chandler, R.E.; Isham, V.S. Spatial-temporal rainfall simulation using generalized linear models. Water Resour. Res. 2005, 41, W11415. [Google Scholar] [CrossRef]
- Rima, L.; Haddad, K.; Rahman, A. Generalised Additive Model-Based Regional Flood Frequency Analysis: Parameter Regression Technique Using Generalised Extreme Value Distribution. Water 2025, 17, 206. [Google Scholar] [CrossRef]
- Iddrisu, W.A.; Nokoe, K.S.; Luguterah, A.; Antwi, E.O. Generalized Additive Mixed Modelling of River Discharge in the Black Volta River. Open J. Stat. 2017, 7, 621–632. [Google Scholar] [CrossRef]
- von Brömssen, C.; Fölster, J.; Kyllmar, K.; Bierosa, M. Modeling Complex Concentration-Discharge Relationships with Generalized Additive Models. Environ. Model. Assess. 2023, 28, 925–937. [Google Scholar] [CrossRef]
- Hirsch, R.M.; Moyer, D.L.; Archfield, S.A. Weighted Regressions on Time, Discharge, and Season (WRTDS), with an Application to Chesapeake Bay River Inputs. J. Am. Water Res. Assoc. 2010, 46, 857–880. [Google Scholar] [CrossRef] [PubMed]
- Ahmed, M.A.; Li, S.S. Machine Learning Model for River Discharge Forecast: A Case Study of the Ottawa River in Canada. Hydrology 2024, 11, 151. [Google Scholar] [CrossRef]
- Shabbir, M.; Chand, S.; Iqbal, F. A novel hybrid framework to model the relationship of daily river discharge with meteorological variables. Meteorol. Hydrol. Water Manag. 2023, 11, 70–94. [Google Scholar] [CrossRef]
- Zanial, W.N.C.W.; Malek, M.B.A.; Reba, M.N.M.; Zaini, N.; Ahmed, A.N.; Sherif, M.; Elsafie, A. River flow prediction based on improved machine learning method: Cuckoo Search-Artificial Neural Network. Appl. Water Sci. 2023, 13, 28. [Google Scholar] [CrossRef]
- Ni, L.; Wang, D.; Singh, V.P.; Wu, J.; Wang, Y.; Tao, Y.; Zhang, J. Streamflow and rainfall forecasting by two long short-term memory-based models. J. Hydrol. 2019, 583, 124296. [Google Scholar] [CrossRef]
- Dragomir, F.-L. Artificial intelligence techniques cybersecurity. In Proceedings of the International Scientific Conference STRATEGIES XXI, Bucharest, Romania, 27–28 April 2017; pp. 147–152. [Google Scholar]
- Rajkomar, A.; Dean, J.; Kohane, I. Machine Learning in Medicine. N. Engl. J. Med. 2019, 380, 1347–1358. [Google Scholar] [CrossRef]
- Dragomir, F.-L. Information system for macroprudential policies. Acta Univ. Danubius. Œconomica 2025, 21, 48–57. [Google Scholar]
- Simeone, O. Machine Learning in Engineering; Cambridge University Press: Cambridge, MA, USA, 2022. [Google Scholar]
- Shi, Y.-F.; Yang, Z.X.; Ma, S.; Kang, P.-L.; Shang, C.; Hu, P.; Liu, Z.-P. Machine Learning for Chemistry: Basics and Applications. Engineering 2023, 27, 70–83. [Google Scholar] [CrossRef]
- Bărbulescu, A.; Dumitriu, C.Ș. About the long-range dependence of cavitation effect on a copper alloy. Rom. J. Phys. 2024, 69, 904. [Google Scholar]
- Zhang, H.; Liu, Y.; Zhang, C.; Li, N. Machine Learning Methods for Weather Forecasting: A Survey. Atmosphere 2025, 16, 82. [Google Scholar] [CrossRef]
- Gogas, P.; Papadimitriou, T. Machine Learning in Economics and Finance. Comput. Econ. 2021, 57, 1–4. [Google Scholar] [CrossRef]
- Samantaray, S.; Sahoo, A.; Agnihotri, A. Prediction of Flood Discharge Using Hybrid PSO-SVM Algorithm in Barak River Basin. MethodsX 2023, 10, 102060. [Google Scholar] [CrossRef]
- Fathian, F.; Mehdizadeh, S.; Sales, A.K.; Safari, M.J.S. Hybrid models to improve the monthly river flow prediction: Integrating artificial intelligence and non-linear time series models. J. Hydrol. 2019, 575, 1200–1213. [Google Scholar] [CrossRef]
- Nielsen, M.A.; Chuang, I.L. Quantum Computation and Quantum Information: 10th Anniversary Edition; Cambridge University Press: Cambridge, MA, USA, 2010. [Google Scholar]
- Preskill, J. Quantum computing in the NISQ era and beyond. Quantum 2018, 2, 79. [Google Scholar] [CrossRef]
- Lloyd, S. Universal quantum simulators. Science 1996, 273, 1073–1078. [Google Scholar] [CrossRef]
- Farhi, E.; Goldstone, J.; Gutmann, S. A quantum approximate optimization algorithm. arXiv 2014, arXiv:1411.4028. [Google Scholar] [CrossRef]
- McClean, J.R.; Romero, J.; Babbush, R.; Aspuru-Guzik, A. The theory of variational hybrid quantum—Classical algorithms. New J. Phys. 2016, 18, 023023. [Google Scholar] [CrossRef]
- Biamonte, J.; Wittek, P.; Pancotti, N.; Rebentrost, P.; Wiebe, N.; Lloyd, S. Quantum machine learning. Nature 2017, 549, 195–202. [Google Scholar] [CrossRef]
- Cao, Y.; Romero, J.; Kieferová, M.; Shen, Z.; Babbush, R.; Aspuru-Guzik, A. Quantum-Enhanced machine learning. NPJ Quantum Inform. 2019, 5, 28. [Google Scholar]
- Kharsa, R.; Bouridane, A.; Amira, A. Advances in Quantum Machine Learning and Deep Learning for Image Classification: A Survey. Neurocomputing 2023, 560, 126843. [Google Scholar] [CrossRef]
- Schuld, M.; Sinayskiy, I.; Petruccione, F. Quantum machine learning in chemistry. J. Chem. Sci. 2018, 130, 1291–1306. [Google Scholar]
- Hirai, H. Practical application of quantum neural network to materials informatics. Sci. Rep. 2024, 14, 8583. [Google Scholar] [CrossRef]
- Beaudoin, C.; Kundu, S.; Topaloglu, R.O.; Ghosh, S. Quantum Machine Learning for Material Synthesis and Hardware Security. In Proceedings of the 41st IEEE/ACM International Conference on Computer-Aided Design ICCAD ’22, San Diego, CA, USA, 30 October–3 November 2022; p. 120. [Google Scholar]
- Azevedo, C.R.B.; Ferreira, T.A. The Application of Qubit Neural Networks for Time Series Forecasting with Automatic Phase Adjustment Mechanism. Available online: https://www.dcc.fc.up.pt/~ines/enia07_html/pdf/27825.pdf (accessed on 5 April 2025).
- Balakrishnan, D.; Mariappan, U.; Raghavendra, P.G.M.; Reddy, P.K.; Dinesh, R.L.N.; Jabiulla, S.B. Quantum Neural Network for Time Series Forecasting: Harnessing Quantum Computing’s Potential in Predictive Modeling. In Proceedings of the 2023 2nd International Conference on Futuristic Technologies (INCOFT), Belagavi, India, 24–26 November 2023; pp. 1–7. [Google Scholar]
- Safari, A.; Ghavifekr, A.A. Quantum Neural Networks (QNN) Application in Weather Prediction of Smart Grids. In Proceedings of the 2021 11th Smart Grid Conference (SGC), Tabriz, Iran, 7–9 December 2021; pp. 1–6. [Google Scholar]
- Jeswal, S.K.; Chakraverty, S. Recent Developments and Applications in Quantum Neural Network: A Review. Arch. Computat. Methods Eng. 2019, 26, 793–807. [Google Scholar] [CrossRef]
- Rath, M.; Date, H. Quantum data encoding: A comparative analysis of classical-to-quantum mapping techniques and their impact on machine learning accuracy. EPJ Quantum Technol. 2024, 11, 72. [Google Scholar] [CrossRef]
- Ranga, D.; Rana, A.; Prajapat, S.; Kumar, P.; Kumar, K.; Vasilakos, A.V. Quantum Machine Learning: Exploring the Role of Data Encoding Techniques, Challenges, and Future Directions. Mathematics 2024, 12, 331. [Google Scholar] [CrossRef]
- Djordjevic, I. Quantum Information Processing and Quantum Error Correction; Academic Press: Cambridge, MA, USA; Elsevier: Amsterdam, The Netherlands, 2012. [Google Scholar]
- Carleo, G.; Cirac, J.I.; Gull, S.; Martin-Delgado, M.A.; Troyer, M. Solving the quantum many-body problem with artificial neural networks. Science 2017, 355, 602–606. [Google Scholar] [CrossRef]
- Polyak, B. Introduction to Optimization. 1987. Available online: https://www.researchgate.net/profile/Boris-Polyak-2/publication/342978480_Introduction_to_Optimization/links/5f1033e5299bf1e548ba4636/Introduction-to-Optimization.pdf (accessed on 15 February 2025).
- Chen, H.; Wu, H.-C.; Chan, S.-C.; Lam, W.-H. A Stochastic Quasi-Newton Method for Large-Scale Nonconvex Optimization with Applications. IEEE Trans. Neural Netw. Learn. Syst. 2020, 31, 4776–4790. [Google Scholar] [CrossRef]
- Powell, M.J.D. A direct search optimization method that models the objective and constraint functions by linear interpolation. In Advances in Optimization and Numerical Analysis; Gomez, S., Hennart, J.-P., Eds.; Kluwer Academic: Dordrecht, The Netherlands, 1994; pp. 51–67. [Google Scholar]
- COBYLA. Available online: https://qiskit-community.github.io/qiskit-algorithms/stubs/qiskit_algorithms.optimizers.COBYLA.html (accessed on 15 February 2025).
- Kwak, Y.; Yun, W.Y.; Jung, J.S.; Kim, J. Quantum Neural Networks: Concepts, Applications, and Challenges. arXiv 2021, arXiv:2108.01468. [Google Scholar] [CrossRef]
- Zhou, M.-G.; Liu, Z.-P.; Yin, H.-L.; Li, C.-L.; Xu, T.-K.; Chen, Z.-B. Quantum Neural Network for Quantum Neural Computing. Research 2023, 6, 0134. [Google Scholar] [CrossRef] [PubMed]
- Jadhav, A.; Rasool, A.; Gyanchandani, M. Quantum Machine Learning: Scope for real-world problems. Procedia Comp. Sci. 2023, 218, 2612–2625. [Google Scholar] [CrossRef]
- Chendeş, V. Water Resources in Curvature Subcarpathians. Geospatial Assessments; Editura Academiei Române: Bucharest, Romania, 2011; (In Romanian with English Abstract). [Google Scholar]
- The Arrangement of the Buzău River. Available online: https://www.hidroconstructia.com/dyn/2pub/proiecte_det.php?id=110&pg=1 (accessed on 17 October 2023). (In Romanian).
- Zhen, L.; Bărbulescu, A. Echo State Network and Sparrow Search: Echo State Network for Modeling the Monthly River Discharge of the Biggest River in Buzău County, Romania. Water 2024, 16, 2916. [Google Scholar] [CrossRef]
- Bărbulescu, A.; Dumitriu, C.S.; Ilie, I.; Barbeş, S.-B. Influence of Anomalies on the Models for Nitrogen Oxides and Ozone Series. Atmosphere 2022, 13, 558. [Google Scholar] [CrossRef]
- Mocanu-Vargancsik, C.; Tudor, G. On the linear trends of a water discharge data under temporal variation. Case study: The upper sector of the Buzău river (Romania). Forum Geogr. 2020, XIX, 37–44. [Google Scholar] [CrossRef]
- Bărbulescu, A.; Zhen, L. Forecasting the River Water Discharge by Artificial Intelligence Methods. Water 2024, 16, 1248. [Google Scholar] [CrossRef]
- Naess, A. Applied Extreme Value Statistics; Springer: Cham, Switzerland, 2024. [Google Scholar]
- Zhen, L.; Bărbulescu, A. Comparative Analysis of Convolutional Neural Network-Long Short-Term Memory, Sparrow Search Algorithm-Backpropagation Neural Network, and Particle Swarm Optimization-Extreme Learning Machine Models for the Water Discharge of the Buzău River, Romania. Water 2024, 16, 289. [Google Scholar] [CrossRef]
| Series | Set | MAE | MSE | R2 | Time (s) | Epochs |
|---|---|---|---|---|---|---|
| S | Training | 1.6851 | 4.4144 | 0.9863 | 10.8472 | 9 |
| S | Test | 1.1937 | 2.3815 | 0.9858 | 10.8472 | 9 |
| S1 | Training | 3.8257 | 18.1368 | 0.9479 | 7.0415 | 10 |
| S1 | Test | 3.7392 | 16.9383 | 0.8991 | 7.0415 | 10 |
| S2 | Training | 4.5597 | 22.9951 | 0.9179 | 4.6218 | 8 |
| S2 | Test | 4.8917 | 26.2494 | 0.8436 | 4.6218 | 8 |
| So | Training | 0.9616 | 1.4660 | 0.9917 | 12.1807 | 11 |
| So | Test | 1.0406 | 1.6954 | 0.9869 | 12.1807 | 11 |
| S1o | Training | 1.0460 | 1.7586 | 0.9910 | 5.5552 | 8 |
| S1o | Test | 1.0791 | 1.6857 | 0.9870 | 5.5552 | 8 |
| S2o | Training | 3.5237 | 13.4814 | 0.9188 | 4.3140 | 8 |
| S2o | Test | 3.2577 | 11.6611 | 0.9098 | 4.3140 | 8 |