1. Introduction
Water distribution networks (WDNs) are critical infrastructure systems that ensure the supply of clean water to urban areas. Water demand is pivotal to the efficient operation and planning of WDNs, and accurate demand forecasting is vital for sustainable water management in the face of population growth, urbanization, and climate change. Demand forecasting models are divided into long-term and short-term models: long-term forecasting is usually performed on a yearly or monthly basis, while short-term models are limited to smaller horizons, such as one or several days, with daily or hourly time steps. A wide range of methods is available for water demand forecasting, including machine learning techniques such as artificial neural networks, support vector machines, and random forests; regression methods, both multilinear and nonlinear; and genetic programming [1].
Deep learning models are the next generation of artificial neural networks and are widely used in many fields, such as image processing and natural language processing, as well as water demand forecasting [2]. A Recurrent Neural Network (RNN) is a deep learning model that can memorize short-term information, which makes it a powerful tool for time series prediction [3]. The Long Short-Term Memory (LSTM) model is a variant of the RNN that can handle long time series by memorizing long-term information while mitigating the vanishing gradient problem [4].
Using the LSTM model requires setting several hyper-parameters, such as the number of units in the hidden layer, the batch size, and the window size. Using an optimization algorithm to determine these hyper-parameters is a common approach [5]. In this study, a coupled Particle Swarm Optimization (PSO) and LSTM model is implemented for short-term water demand prediction, as part of the Battle of Water Demand Forecasting (BWDF) at the third WDSA-CCWI joint conference.
2. Methodology
The proposed model in this study is based on an optimized LSTM deep learning model, which is introduced in the following subsections.
2.1. LSTM
Long Short-Term Memory networks, a type of sequential deep learning neural network, are designed to address the challenge of retaining information over time. Unlike traditional RNNs, the LSTM was specifically developed by Hochreiter and Schmidhuber [6] to combat the vanishing gradient problem encountered in RNNs and other machine learning algorithms. More details can be found in the literature [4,6,7]. In this study, the LSTM module of the TensorFlow library for Python is used.
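TensorFlow's `tf.keras.layers.LSTM` expects input of shape (samples, time steps, features), so the window-size hyper-parameter determines how a demand series is sliced into training samples. As an illustration (not the authors' code; the array sizes are arbitrary assumptions), the windowing can be sketched in NumPy as:

```python
import numpy as np

def make_windows(series, window_size):
    """Slice a (time, features) array into overlapping inputs X of shape
    (samples, window_size, features) and one-step-ahead targets y of
    shape (samples, features)."""
    X, y = [], []
    for t in range(len(series) - window_size):
        X.append(series[t:t + window_size])  # past `window_size` steps
        y.append(series[t + window_size])    # the step to forecast
    return np.asarray(X), np.asarray(y)

# Hypothetical sizes: 100 hourly time steps, 10 input parameters, 24-h window.
data = np.random.rand(100, 10)
X, y = make_windows(data, window_size=24)
print(X.shape, y.shape)  # (76, 24, 10) (76, 10)
```

Each window of 24 past observations is paired with the next observation as the target, yielding the 3-D tensor the LSTM layer consumes.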
2.2. PSO
The Particle Swarm Optimization (PSO) algorithm is inspired by the movement of bird flocks and fish schools and is built on the concept of swarm intelligence. In this algorithm, each particle represents a possible solution to the optimization problem. Initially, an arbitrary number of particles is generated and evaluated. Then, through an iterative process, the particles move towards the optimal point. The movement of each particle depends on its previous movement direction, its best position found so far, and the best position found so far among all particles. The stopping condition of the algorithm can be a fixed number of iterations, the absence of significant improvement, and/or the attainment of an acceptable solution [8].
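The update rule described above can be sketched in a few lines. The inertia and acceleration coefficients (w, c1, c2) and the sphere test function below are illustrative assumptions, not values used in this study:

```python
import numpy as np

def pso(objective, lo, hi, n_particles=20, n_iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over the box [lo, hi] with standard PSO updates."""
    rng = np.random.default_rng(seed)
    dim = len(lo)
    pos = rng.uniform(lo, hi, size=(n_particles, dim))  # initial particles
    vel = np.zeros_like(pos)
    pbest = pos.copy()                                  # personal best positions
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()            # global best position
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Inertia + pull towards personal best + pull towards global best.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Sphere function: known global minimum of 0 at the origin.
best, best_val = pso(lambda x: float(np.sum(x**2)),
                     np.array([-5.0, -5.0]), np.array([5.0, 5.0]))
```

Here the stopping condition is simply a fixed number of iterations, the first of the criteria mentioned above.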
2.3. LSTM-PSO
The proposed model uses PSO to minimize the Mean Squared Error (MSE) of the forecasting model, with the LSTM hyper-parameters as decision variables (Figure 1). In other words, PSO searches for the optimal settings of the LSTM model. Herein, the batch size, the number of training epochs, and the number of units are selected as the hyper-parameters to be optimized.
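A minimal sketch of the coupling, under assumed (illustrative) search bounds: each particle's continuous position in [0, 1]^3 is decoded into the three integer hyper-parameters, and the objective handed to PSO would be the validation MSE of an LSTM trained with them. Since training an LSTM here would be slow, a cheap stand-in marks where that call belongs:

```python
import numpy as np

# Assumed (illustrative) search bounds for the three hyper-parameters.
BOUNDS = {"batch_size": (8, 128), "epochs": (10, 200), "units": (4, 256)}

def decode(position):
    """Map a particle position in [0, 1]^3 to integer hyper-parameter values."""
    return {name: int(round(lo + p * (hi - lo)))
            for (name, (lo, hi)), p in zip(BOUNDS.items(), position)}

def objective(position):
    hp = decode(position)
    # Placeholder for the real objective: build an LSTM with hp["units"],
    # train it for hp["epochs"] epochs at hp["batch_size"], and return the
    # validation MSE. A cheap stand-in keeps the sketch runnable.
    return (hp["units"] - 64) ** 2 + (hp["batch_size"] - 32) ** 2

hp = decode(np.array([0.2, 0.5, 0.23]))
print(hp)  # {'batch_size': 32, 'epochs': 105, 'units': 62}
```

Keeping the particles continuous and rounding only inside the objective lets a standard PSO handle the integer-valued hyper-parameters.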
3. Case Study
A real WDN located in the northeast of Italy is selected as the case study. It contains ten District Metering Areas (DMAs), and forecasting the future demand of each DMA is the main task. Four series of flow data are available for each DMA, and four prediction models for different periods are required. More details can be found in the BWDF instructions [9].
In this study, 10 parameters in three categories are assumed to influence future demand:
- Weather data: rainfall depth, air temperature, air humidity, and wind speed.
- Calendar data: hour of the day, day of the week, day of the month, and month of the year.
- Binary data: a holiday indicator and a summertime indicator.
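The calendar and binary inputs can be derived directly from the timestamp; the sketch below is illustrative (the holiday set is a partial, hypothetical placeholder and the summertime rule is a rough assumption), while the four weather inputs come from measurements and are not derived here:

```python
from datetime import datetime

# Partial, hypothetical holiday list as (month, day) pairs; a real model
# would use a complete calendar for the region.
HOLIDAYS = {(1, 1), (1, 6), (4, 25), (5, 1), (6, 2), (8, 15), (12, 25), (12, 26)}

def calendar_features(ts):
    """Derive the six calendar/binary inputs from a timestamp."""
    return {
        "hour_of_day": ts.hour,
        "day_of_week": ts.weekday(),                # 0 = Monday
        "day_of_month": ts.day,
        "month_of_year": ts.month,
        "is_holiday": int((ts.month, ts.day) in HOLIDAYS),
        "is_summertime": int(4 <= ts.month <= 10),  # rough DST proxy (assumption)
    }

feats = calendar_features(datetime(2022, 8, 15, 14))  # Ferragosto, a Monday
```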
An LSTM model with 10 input nodes (one per input parameter) and 10 output nodes (one per DMA flow) is created and trained with different settings. To find the optimal settings, the PSO algorithm is applied, and the results are discussed in the next section.
4. Results and Discussion
Results of the demand forecasting for the 10 DMAs are presented in Figure 2, which contains one comparison per DMA plus two more for the training (80%) and testing (20%) data over all DMAs. For each DMA, the training and testing data are plotted as blue and black circles, respectively, and the identity line is shown as a dashed red line for easier comparison.
As can be seen, the accuracy of the predicted values is broadly acceptable for all DMAs. To quantify the comparison, the coefficient of determination (R²) for each DMA is presented in Table 1. In some cases, such as DMA 5 and DMA 8, the agreement between actual and predicted data is very high. In others, such as DMA 1 and DMA 6, the model is not able to provide a good estimate of demand; it seems that additional important factors influence the demand flow within these DMAs.
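For reference, the coefficient of determination in Table 1 follows the standard definition; the sample values below are made up for illustration only:

```python
import numpy as np

def r_squared(actual, predicted):
    """R^2 = 1 - SS_res / SS_tot (coefficient of determination)."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    ss_res = np.sum((actual - predicted) ** 2)       # residual sum of squares
    ss_tot = np.sum((actual - actual.mean()) ** 2)   # total sum of squares
    return 1.0 - ss_res / ss_tot

# Made-up demand values for illustration only.
actual = np.array([10.0, 12.0, 11.0, 13.0, 12.5])
print(r_squared(actual, actual))  # 1.0 for a perfect prediction
```

A value of 1 indicates perfect agreement with the identity line in Figure 2, while lower values reflect the scatter seen for DMAs such as 1 and 6.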
5. Conclusions
This study has demonstrated the promising potential of an optimized LSTM network as a deep learning model, integrated with a PSO algorithm, for short-term water demand forecasting in a city in northeastern Italy. The analysis of predicted values across various DMAs reveals generally acceptable levels of accuracy. This research contributes to ongoing efforts in efficient water demand forecasting by employing deep learning techniques and optimization algorithms. This methodological framework represents a significant step towards future advancements in water distribution network management, ultimately ensuring the availability of clean water for communities.
Author Contributions
Conceptualization, M.G., A.G.S. and M.K.H.; methodology, M.G., A.G.S. and M.K.H.; software, M.G. and M.K.H.; validation, M.G.; formal analysis, M.G.; investigation, M.G. and A.G.S.; resources, M.G.; data curation, M.G.; writing—original draft preparation, M.G.; writing—review and editing, M.G., A.G.S. and M.K.H.; visualization, M.G.; supervision, M.G. and M.K.H.; project administration, M.G.; funding acquisition, M.G. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
All data and code are available upon request from the authors.
Conflicts of Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
References
1. Pacchin, E.; Gagliardi, F.; Alvisi, S.; Franchini, M. A Comparison of Short-Term Water Demand Forecasting Models. Water Resour. Manag. 2019, 33, 1481–1497.
2. Guo, G.; Liu, S.; Wu, Y.; Li, J.; Zhou, R.; Zhu, X. Short-Term Water Demand Forecast Based on Deep Learning Method. J. Water Resour. Plan. Manag. 2018, 144, 04018076.
3. Hewamalage, H.; Bergmeir, C.; Bandara, K. Recurrent Neural Networks for Time Series Forecasting: Current Status and Future Directions. Int. J. Forecast. 2021, 37, 388–427.
4. Staudemeyer, R.C.; Morris, E.R. Understanding LSTM—A Tutorial into Long Short-Term Memory Recurrent Neural Networks. arXiv 2019, arXiv:1909.09586.
5. Wang, K.; Ye, Z.; Wang, Z.; Liu, B.; Feng, T. MACLA-LSTM: A Novel Approach for Forecasting Water Demand. Sustainability 2023, 15, 3628.
6. Sherstinsky, A. Fundamentals of Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) Network. Phys. D Nonlinear Phenom. 2020, 404, 132306.
7. Graves, A. Long Short-Term Memory. In Supervised Sequence Labelling with Recurrent Neural Networks; Graves, A., Ed.; Springer: Berlin/Heidelberg, Germany, 2012; pp. 37–45.
8. Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the ICNN'95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995.
9. Alvisi, S.; Franchini, M.; Marsili, V.; Mazzoni, F.; Salomons, E. Battle of Water Demand Forecasting (BWDF). In Proceedings of the 3rd International WDSA-CCWI Joint Conference, Ferrara, Italy, 1–4 July 2024.
Disclaimer/Publisher's Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).