Wind Speed Forecasting with Differentially Evolved Minimum-Bandwidth Filters and Gated Recurrent Units
Abstract
1. Introduction
1.1. Research Motivation
1.2. Literature Highlights
1.3. Innovations
- We chose wavelets because they efficiently handle complex wind data by decomposing signals into different frequency components, allowing intricate patterns to be detected. In particular, we used wavelets to identify structural breaks accurately, thereby enhancing the accuracy and reliability of the wind speed forecasting model.
- Instead of using the shift-variant DWT, we opted for the MODWT, which is time-invariant: the sub-series coefficients remain the same even if the signal is shifted [3,8,19]. Moreover, the MODWT maintains the full resolution of the original signal because it applies no decimation, thereby enhancing the modelling and forecasting of the sub-signals. We therefore used the MODWT to decompose the original wind speed signal into detailed and approximate frequency sub-signals of lower complexity than the original signal.
- The selection of the most suitable wavelet filter depends on the specific problem being addressed. Alongside the conventional DB4 and LA8 filters, we also applied MB filters, which exhibit excellent frequency localisation, narrow bandwidths that reduce spectral leakage, and effective isolation and filtering of noise outside a particular frequency band [22].
- GA results are reliable and consistent, but GAs require significant computational resources and typically converge slowly. DE differs from GA in that it can solve complex optimisation problems and optimise non-differentiable, nonlinear continuous functions with a high convergence speed, making it well suited to wind speed forecasting [23,24,25,26]. We utilised the latter stochastic algorithm, which is easy to understand and converges quickly, to find the optimal decomposition level for the wavelet filter applied.
- Wind speed (as a physical quantity) often presents itself at particular locations in both linear and nonlinear forms, which compromises the performance of linear models in the forecasting arena. GRUs can capture the time-varying patterns of variance typical of wind data. By leveraging the GRU's simplicity, accuracy, and computational efficiency, we modelled and predicted each sub-signal with a high degree of accuracy and precision.
- Moreover, most reviewed studies focused on short-term forecasts (see, e.g., [10,12,13,14,15,16,17,18,19,20]), overlooking the medium- and long-term forecasts that are critical to wind turbine maintenance and wind farm construction. We evaluated probabilistic (distributional) forecast measures in both medium- and long-term wind speed forecasting using the pinball loss (PL), Dawid–Sebastiani (DS) score, and probability integral transform (PIT).
- The practicability and efficacy of the proposed forecasting model were confirmed empirically via prediction metrics. The work was also conducted in a manner that is reliable and easy to replicate.
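The time-invariance claimed for the MODWT above can be checked directly. The sketch below is illustrative only: the study uses the R `waveslim` implementation, whereas this is a hand-rolled Python/NumPy level-1 MODWT with Haar filters (chosen for brevity, not the DB4/LA8/MB8 filters used in the paper). It verifies both the additive MRA reconstruction and the shift-invariance of the coefficients.

```python
import numpy as np

# Level-1 Haar MODWT via circular filtering: scaling filter g~ = [1/2, 1/2],
# wavelet filter h~ = [1/2, -1/2] (the DWT filters rescaled by 1/sqrt(2)).
def modwt_level1(x):
    xm1 = np.roll(x, 1)        # X_{t-1 mod N} (circular boundary)
    v = 0.5 * (x + xm1)        # approximation (scaling) coefficients
    w = 0.5 * (x - xm1)        # detail (wavelet) coefficients
    return v, w

def mra_level1(v, w):
    # MRA synthesis (cross-correlation with the same filters): smooth + detail = signal.
    s = 0.5 * (v + np.roll(v, -1))
    d = 0.5 * (w - np.roll(w, -1))
    return s, d

# A hypothetical noisy periodic series standing in for wind speed data.
x = np.sin(np.linspace(0, 4 * np.pi, 64)) + 0.1 * np.random.default_rng(0).standard_normal(64)
v, w = modwt_level1(x)
s, d = mra_level1(v, w)
assert np.allclose(s + d, x)   # additive perfect reconstruction (MRA)

# Time invariance: the coefficients of a shifted signal are the shifted coefficients.
v2, w2 = modwt_level1(np.roll(x, 5))
assert np.allclose(v2, np.roll(v, 5)) and np.allclose(w2, np.roll(w, 5))
```

Under decimation (the ordinary DWT), the second pair of assertions would fail: a 5-sample shift changes which samples survive downsampling, which is exactly why the MODWT is preferred here.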
1.4. Structure of the Study
2. Materials and Methods
2.1. Fundamentals of Wavelet Analysis
2.1.1. Wavelet Filters
2.1.2. Multiresolution Analysis
2.1.3. Maximal Overlap Discrete Wavelet Transform
2.2. Differential Evolution
2.3. Gated Recurrent Unit
2.4. Persistence Model
2.5. Proposed Prediction Approach
- The DE algorithm is used to optimise the decomposition level of highly variant wind speed data, owing to its simplicity, efficiency, and ability to handle complex, continuous, non-differentiable problems. As a result, more statistically sound sub-signals that are easier to characterise in modelling and forecasting can be extracted.
- A time-invariant MODWT resistant to boundary effects is used, employing a variety of wavelet filters, particularly the DB4, LA8, and MB8, to decompose wind speed data into detailed (high) and approximate (low) frequency components. These components have reduced noise and volatility, exposing short-term and long-term trends. They can be modelled and forecasted with ease and efficiency.
- The advantage of GRUs as nonlinear approximators lies not only in their simplicity, but also in their ability to efficiently and accurately capture complex wind behaviour (including linear and nonlinear components) in each sub-signal. For this reason, we opted for the GRU, renowned for its capability to manage vanishing gradients, a characteristic particularly suited to handling detailed signals. This choice ensures that each sub-signal is predicted with a high level of accuracy and reliability, while mitigating the risk of model overfitting or gradient explosion.
- Finally, we leveraged the MRA, known for its effective data compression, denoising, feature extraction, and efficient reconstruction, to reconstruct the original wind data and arrive at the final forecast using all sub-signal forecasts.
Algorithm 1: Wavelet-MODWT-GRU

1. Input: Wind speed data.

A. Data cleaning and preprocessing
2. data_cleaning_and_preprocessing
3. Load the original wind speed data into the R program environment.
4. Clean and format data inconsistencies and anomalies caused by environmental factors and instrument instability.
5. Retain observations within the operational limit, as wind turbines resort to feathering beyond this limit and are switched off.
6. Divide the data into 80% training and 20% testing sets.
7. output

B. DE hyperparameter search
8. de_hyperparameter_search
9. Initialise the wavelet filter.
10. Define the objective function based on the original wind data and the reconstructed series, such that performance is evaluated via the mean squared error (MSE). The function is specific to the wavelet filter applied.
11. Set the parameter bounds within which DE will search; that is, set the population size, number of iterations, crossover probability, parameter bounds, and weights. This is vital for DE to search a relevant interval and to improve search efficiency.
12. Run DE until the predetermined termination criterion (i.e., the number of runs) is reached.
13. output

C. Signal denoising and processing
14. signal_denoising_and_formatting
15. In MODWT, the optimised decomposition level is used alongside the conditions, filters, and boundary parameters to decompose the cleaned series into less noisy and more statistically sound sub-signals.
16. Divide each sub-signal into a training set (80%) and a testing set (20%).
17. Normalise the sub-signals using the min-max criterion. This ensures compatibility with the hyperbolic tangent activation function and minimises noise/variance effects on the predictions.
18. output

D. GRU hyperparameter search
19. gru_hyperparameter_search
20. Arrange the data into a 3D format (i.e., samples, time steps, features) for compatibility with the GRU network.
21. Initialise parameters: input shape, batch size, dropout rates, epochs, activation function, loss function, learning rate, and optimiser.
22. Train the GRU model and evaluate its performance on the normalised training dataset.
23. Preserve the model parameters with optimal performance.
24. output

E. Test GRU performance
25. test_gru_performance
26. Apply the GRU model with optimal parameters to the normalised testing set to generate normalised forecasts.
27. Map the normalised sub-signal predictions back to the original scale by inverting the min-max transformation.
28. Evaluate the performance of the GRU predictions for each sub-signal using RMSE, MAE, and MAPE.
29. output

F. Signal reconstruction and output evaluation
30. signal_reconstruction_and_output_evaluation
31. Use all sub-signal predictions to reconstruct the final forecast of the original series.
32. Use performance metrics and statistical tests (i.e., RMSE, MAE, MAPE, coefficient of determination (R²), PL, MD, Mincer–Zarnowitz (MZ), PIT, and DS) to compare the forecasts with the observed series.
33. output

34. Output: final forecasts plus RMSE, MAE, MAPE, R², PL, MD, MZ, PIT, and DS
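Steps 17 and 27 of Algorithm 1 (min-max normalisation and its inversion) can be sketched as follows. This is a hedged illustration in Python/NumPy rather than the study's R code; the scaling target of [-1, 1] is chosen here for compatibility with tanh, and the data are made up.

```python
import numpy as np

# Min-max scaling to [-1, 1] (the tanh range), fitted on the training split only,
# then inverted on forecasts to return them to original units.
def fit_minmax(train):
    return train.min(), train.max()

def scale(x, lo, hi):
    return 2.0 * (x - lo) / (hi - lo) - 1.0

def unscale(z, lo, hi):
    return (z + 1.0) * (hi - lo) / 2.0 + lo

rng = np.random.default_rng(1)
sub_signal = rng.uniform(0, 18, 100)              # a hypothetical decomposed sub-signal (m/s)
train_set, test_set = sub_signal[:80], sub_signal[80:]  # 80/20 split as in the algorithm
lo, hi = fit_minmax(train_set)
z = scale(test_set, lo, hi)
assert np.allclose(unscale(z, lo, hi), test_set)  # the round trip recovers original units
```

Fitting the bounds on the training split alone matters: fitting them on the full series would leak test-set information into the model inputs.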
2.6. Data Description
2.7. Performance Evaluation
2.7.1. Deterministic Forecast Evaluation Scores
2.7.2. Probabilistic Forecast Evaluation Scores
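Among the probabilistic scores used in this study is the pinball (quantile) loss, reported in the results at the 0.95 quantile. As a hedged illustration (the study itself is implemented in R with the scoringRules/scoringutils packages; this Python/NumPy sketch uses made-up values):

```python
import numpy as np

# Pinball loss at quantile level tau: tau*(y - q) when the observation y is at or
# above the quantile forecast q, and (1 - tau)*(q - y) otherwise; averaged over obs.
def pinball_loss(y, q, tau):
    y, q = np.asarray(y, float), np.asarray(q, float)
    return float(np.mean(np.where(y >= q, tau * (y - q), (1.0 - tau) * (q - y))))

y = np.array([5.0, 6.0, 7.0])   # hypothetical observed wind speeds (m/s)
q = np.array([4.0, 6.5, 7.0])   # hypothetical 0.95-quantile forecasts
loss = pinball_loss(y, q, tau=0.95)  # = (0.95*1 + 0.05*0.5 + 0) / 3 = 0.325
```

The asymmetry is the point: at tau = 0.95, under-forecasting the quantile is penalised 19 times more heavily than over-forecasting it, so lower average loss indicates a better-calibrated upper quantile.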
2.7.3. Biasedness Assessment
2.7.4. Predictive Accuracy Assessment
3. Results
3.1. Computational Tools
3.2. Exploratory Data Analysis
3.3. Deterministic and Probabilistic Performance Evaluation
3.4. Predictive Performance Analysis
4. Conclusions
- DE optimises the decomposition level efficiently and simply, resulting in MODWT-decomposed sub-signals that are easier to characterise for GRU modelling.
- The location of the station, wavelet filter applied, decomposition level, and forecasting horizon all affected model performance.
- The MB filter (followed by the LA filter) produced better wind speed forecasts across short-to-long forecast horizons, low-to-high-variance data, and small-to-large datasets.
- Although the naive model is easy to understand, it is ineffective for wind speed data. Similar results were reported in [10].
- Generally, the increase in lead times led to an increase in error metrics and a decrease in the predictive ability of all models. Nonetheless, the proposed approach still demonstrated superiority over all other models.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Zhen, H.; Niu, D.; Yu, M.; Wang, K.; Liang, Y.; Xu, X. A Hybrid Deep Learning Model and Comparison for Wind Power Forecasting Considering Temporal-Spatial Feature Extraction. Sustainability 2020, 12, 9490. [Google Scholar] [CrossRef]
- Wang, X.; Guo, P.; Huang, X. A Review of Wind Power Forecasting Models. Energy Procedia 2011, 12, 770–777. [Google Scholar] [CrossRef]
- Chandra, D.; Sailaja Kumari, M.; Sydulu, M.; Grimaccia, F.; Mussetta, M. Adaptive Wavelet Neural Network-Based Wind Speed Forecasting Studies. J. Electr. Eng. Technol. 2014, 9, 1812–1821. [Google Scholar] [CrossRef]
- Sivhugwana, K.S.; Ranganai, E. An Ensemble Approach to Short-Term Wind Speed Predictions Using Stochastic Methods, Wavelets and Gradient Boosting Decision Trees. Wind 2024, 4, 44–67. [Google Scholar] [CrossRef]
- Soman, S.S.; Zareipour, H. A Review of Wind Power and Wind Speed Forecasting Methods with Different Time Horizons. In Proceedings of the North American Power Symposium 2010, Arlington, TX, USA, 26–28 September 2010. [Google Scholar] [CrossRef]
- Gardner, W.A.; Napolitano, A.; Paura, L. Cyclostationarity: Half a Century of Research. Signal Process. 2006, 86, 639–697. [Google Scholar] [CrossRef]
- Percival, D.B.; Walden, A.T. Wavelet Methods for Time Series Analysis; Cambridge University Press: Cambridge, UK, 2000. [Google Scholar]
- Zhang, Z.; Telesford, Q.K.; Giusti, C.; Lim, K.O.; Bassett, D.S. Choosing Wavelet Methods, Filters, and Lengths for Functional Brain Network Construction. PLoS ONE 2016, 11, e0157243. [Google Scholar] [CrossRef]
- Gensler, A. Wind Power Ensemble Forecasting: Performance Measures and Ensemble Architectures for Deterministic and Probabilistic Forecasts. Ph.D. Thesis, University of Kassel, Hessen, Germany, 21 September 2018. [Google Scholar]
- Singh, S.N.; Mohapatra, A. Repeated Wavelet Transform-Based ARIMA Model for Very Short-Term Wind Speed Forecasting. Renew. Energy 2019, 136, 128. [Google Scholar] [CrossRef]
- Valdivia-Bautista, S.M.; Domínguez-Navarro, J.A.; Pérez-Cisneros, M.; Vega-Gómez, C.J.; Castillo-Téllez, B. Artificial Intelligence in Wind Speed Forecasting: A Review. Energies 2023, 16, 2457. [Google Scholar] [CrossRef]
- Catalão, J.P.S.; Pousinho, H.M.I.; Mendes, V.M.F. Short-Term Wind Power Forecasting in Portugal by Neural Networks and Wavelet Transform. Renew. Energy 2011, 36, 1245–1251. [Google Scholar] [CrossRef]
- Berrezzek, F.; Khelil, K.; Bouadjila, T. Efficient Wind Speed Forecasting Using Discrete Wavelet Transform and Artificial Neural Networks. Rev. d’Intelligence Artificielle 2019, 33, 447–452. [Google Scholar] [CrossRef]
- Niu, D.; Pu, D.; Dai, S. Ultra-Short-Term Wind-Power Forecasting Based on the Weighted Random Forest Optimized by the Niche Immune Lion Algorithm. Energies 2018, 11, 1098. [Google Scholar] [CrossRef]
- Patel, Y.; Deb, D. Machine Intelligent Hybrid Methods Based on Kalman Filter and Wavelet Transform for Short-Term Wind Speed Prediction. Wind 2022, 2, 37–50. [Google Scholar] [CrossRef]
- Catalão, J.P.S.; Pousinho, H.M.I.; Mendes, V.M.F. Hybrid Wavelet-PSO-ANFIS Approach for Short-Term Wind Power Forecasting in Portugal. IEEE Trans. Sustain. Energy 2011, 2, 50–59. [Google Scholar] [CrossRef]
- Xiang, J.; Qiu, Z.; Hao, Q.; Cao, H. Multi-Time Scale Wind Speed Prediction Based on WT-Bi-LSTM. MATEC Web Conf. 2020, 309, 05011. [Google Scholar] [CrossRef]
- Liu, Y.; Guan, L.; Hou, C.; Han, H.; Liu, Z.; Sun, Y.; Zheng, M. Wind Power Short-Term Prediction Based on LSTM and Discrete Wavelet Transform. Appl. Sci. 2019, 9, 1108. [Google Scholar] [CrossRef]
- Sivhugwana, K.S.; Ranganai, E. Short-Term Wind Speed Prediction via Sample Entropy: A Hybridisation Approach against Gradient Disappearance and Explosion. Computation 2024, 12, 163. [Google Scholar] [CrossRef]
- Domínguez-Navarro, J.A.; Lopez-Garcia, T.B.; Valdivia-Bautista, S.M. Applying Wavelet Filters in Wind Forecasting Methods. Energies 2021, 14, 3181. [Google Scholar] [CrossRef]
- Khelil, K.; Berrezzek, F.; Bouadjila, T. GA-based design of optimal discrete wavelet filters for efficient wind speed forecasting. Neural Comput. Appl. 2020, 33, 4373–4386. [Google Scholar] [CrossRef]
- Morris, J.M.; Peravali, R. Minimum-bandwidth discrete-time wavelets. Signal Process. 1999, 76, 181–193. [Google Scholar] [CrossRef]
- Storn, R.; Price, K. Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
- Wang, Y.; Cai, Z.; Zhang, Q. Differential Evolution With Composite Trial Vector Generation Strategies and Control Parameters. IEEE Trans. Evol. Comput. 2011, 15, 55–66. [Google Scholar] [CrossRef]
- Eltaeib, T.; Mahmood, A. Differential Evolution: A Survey and Analysis. Appl. Sci. 2018, 8, 1945. [Google Scholar] [CrossRef]
- Leon, M.; Xiong, N. Investigation of Mutation Strategies in Differential Evolution for Solving Global Optimization Problems. In Artificial Intelligence and Soft Computing, ICAISC 2014; Rutkowski, L., Korytkowski, M., Scherer, R., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M., Eds.; Lecture Notes in Computer Science, 8467; Springer: Cham, Switzerland, 2014. [Google Scholar] [CrossRef]
- Merry, R.J.E. Wavelet Theory and Applications: A Literature Study; DCT Rapporten; Technische Universiteit Eindhoven: Eindhoven, The Netherlands, 2005. [Google Scholar]
- Daubechies, I. Ten Lectures on Wavelets; Society for Industrial and Applied Mathematics: Philadelphia, PA, USA, 1992. [Google Scholar]
- Gröchenig, K. Foundations of Time-Frequency Analysis; Springer Science & Business Media: New York, NY, USA, 2001. [Google Scholar]
- Kovačević, J.; Goyal, V.K. Fourier and Wavelet Signal Processing; 2010. Available online: https://www.fourierandwavelets.org/FWSP_a3.2_2013.pdf (accessed on 4 October 2024).
- Mallat, S.G. A Theory for Multiresolution Signal Decomposition: The Wavelet Representation. IEEE Trans. Pattern Anal. Mach. Intell. 1989, 11, 674–693. [Google Scholar] [CrossRef]
- Misiti, M.; Misiti, Y.; Oppenheim, G.; Poggi, J.M. Wavelets Toolbox User’s Guide; The MathWorks: Natick, MA, USA, 2000. [Google Scholar]
- Dghais, A.A.; Ismail, M.T. A Comparative Study between Discrete Wavelet Transform and Maximal Overlap Discrete Wavelet Transform for Testing Stationarity. Int. J. Math. Comput. Phys. Electr. Comput. Eng. 2013, 7, 1677–1681. [Google Scholar]
- Rodrigues, D.V.Q.; Zuo, D.; Li, C. A MODWT-Based Algorithm for the Identification and Removal of Jumps/Short-Term Distortions in Displacement Measurements Used for Structural Health Monitoring. IoT 2022, 3, 60–72. [Google Scholar] [CrossRef]
- Alarcon-Aquino, V.; Barria, J.A. Change Detection in Time Series Using the Maximal Overlap Discrete Wavelet Transform. Lat. Am. Appl. Res. 2009, 39, 145–152. [Google Scholar]
- Zaharie, D. A Comparative Analysis of Crossover Variants in Differential Evolution. In Proceedings of the International Multiconference on Computer Science and Information Technology, Gosier, Guadaloupe, 4–9 March 2007; pp. 171–181. Available online: https://staff.fmi.uvt.ro/~daniela.zaharie/lucrari/imcsit07.pdf (accessed on 2 October 2024).
- Eiben, Á.E.; Hinterding, R.; Michalewicz, Z. Parameter Control in Evolutionary Algorithms. IEEE Trans. Evol. Comput. 1999, 3, 124–141. [Google Scholar] [CrossRef]
- Mullen, K.M.; Ardia, D.; Gil, D.L.; Windover, D.; Cline, J. DEoptim: An R package for global optimization by differential evolution. J. Stat. Softw. 2011, 40, 1–26. [Google Scholar] [CrossRef]
- Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
- Cho, K.; van Merrienboer, B.; Bahdanau, D.; Bengio, Y. On the Properties of Neural Machine Translation: Encoder-Decoder Approaches. arXiv 2014, arXiv:1409.1259. [Google Scholar] [CrossRef]
- Chung, J.; Gulcehre, C.; Cho, K.; Bengio, Y. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling. arXiv 2014, arXiv:1412.3555. [Google Scholar] [CrossRef]
- Hyndman, R.J.; Athanasopoulos, G. Forecasting: Principles and Practice, 2nd ed.; OTexts: Melbourne, Australia, 2021. [Google Scholar]
- Jordan, A.; Krüger, F.; Lerch, S. Evaluating probabilistic forecasts with scoringRules. J. Stat. Softw. 2019, 90, 1–37. Available online: https://www.jstatsoft.org/article/view/v090i12 (accessed on 17 December 2024). [CrossRef]
- Gneiting, T.; Raftery, A.E. Strictly Proper Scoring Rules, Prediction, and Estimation. J. Am. Stat. Assoc. 2007, 102, 359–378. [Google Scholar] [CrossRef]
- Bosse, N.I.; Gruson, H.; Cori, A.; van Leeuwen, E.; Funk, S.; Abbott, S. Evaluating Forecasts with Scoringutils in R. arXiv 2022, arXiv:2205.07090. [Google Scholar] [CrossRef]
- Mincer, J.; Zarnowitz, V. The Evaluation of Economic Forecasts. In Economic Forecasts and Expectations; National Bureau of Economic Research: Cambridge, MA, USA, 1969; pp. 3–46. Available online: https://www.nber.org/system/files/chapters/c1214/c1214.pdf (accessed on 17 June 2024).
- Ehm, W.; Gneiting, T.; Jordan, A.; Krüger, F. Of Quantiles and Expectiles: Consistent Scoring Functions, Choquet Representations, and Forecast Rankings. J. R. Stat. Soc. B 2016, 78, 505–562. [Google Scholar] [CrossRef]
- Whitcher, B. waveslim: Basic Wavelet Routines for One-, Two-, and Three-Dimensional Signal Processing; R package version 1.8.5; 2024. Available online: https://cran.r-project.org/package=waveslim (accessed on 3 October 2024).
- Allaire, J.J.; Chollet, F. Keras: R Interface to ‘Keras’. CRAN: Contributed Packages. R Package. 2024. Available online: https://cran.r-project.org/web/packages/keras/vignettes/ (accessed on 21 June 2024).
- Kaur, D.; Lie, T.T.; Nair, N.K.; Vallès, B. Wind Speed Forecasting Using Hybrid Wavelet Transform-ARMA Techniques. Aims Energy 2015, 3, 13–24. [Google Scholar] [CrossRef]
Table: Comparison of the DB4, LA8, and MB8 wavelet filters [8,22,27,28,29,30,31,32,33,34,35].
Station | Month | N | Granularity | Training | Testing |
---|---|---|---|---|---|
Alexander Bay | 1–31 August 2022 | 4462 | 10 min | 3570 | 892 |
Humansdorp | 1–25 April 2021 | 3600 | 10 min | 2880 | 720 |
Jozini | 1–17 December 2020 | 2400 | 10 min | 1920 | 480 |
Station | Mast ID | Longitude (°E) | Latitude (°S) | Elevation (m) | Anemometer Height (m) |
---|---|---|---|---|---|
Alexander Bay | WM01 | 16.664410 | 28.601882 | 152 | 61.85 |
Humansdorp | WM08 | 24.514360 | 34.109965 | 110 | 61.84 |
Jozini | WM13 | 32.16636 | 27.42605 | 80 | 61.75 |
Model | Main Hyperparameter | Search Space |
---|---|---|
DE | Number of iterations | 50–60 |
Population size | 45–60 | |
Crossover probability | 0.75–0.85 | |
Weights | 0.5–0.6 | |
Bounds | 1–10 | |
MODWT | Filters | "la8", "d4", "mb8" |
GRU | Dropout rates | 0–0.5 |
Time steps | 1–10 | |
Epochs | 1–100 | |
Learning rate | 0–0.1 | |
Activation function | tanh | |
Loss function | MSE | |
Optimiser | Adam |
Station | Min | Q1 | Median | Mean | Q3 | Max | Std.dev | Skewness | Kurtosis | JB (p-Value) |
---|---|---|---|---|---|---|---|---|---|---|
Alexander Bay | 0.210 | 2.790 | 5.030 | 5.727 | 8.140 | 18.360 | 3.5739 | 0.6861 | 2.7581 | <2.2 × 10⁻¹⁶
Humansdorp | 0.2349 | 3.3596 | 5.2340 | 5.5351 | 7.2610 | 17.3089 | 2.8532 | 0.6385 | 3.1533 | <2.2 × 10⁻¹⁶
Jozini | 0.4045 | 3.2022 | 5.0106 | 5.3241 | 7.2137 | 14.1427 | 2.6980 | 0.4722 | 2.5046 | <2.2 × 10⁻¹⁶
Data | Model | Performance Indicator | ||||
---|---|---|---|---|---|---|
RMSE | MAE | MAPE (%) | R² | MZ Bias Test | |
Alexander Bay (August, WM01) N = 4462 h = 892 | | | | | |
M1 (LA8) | 0.5256 | 0.3968 | 14.1156 | 0.9810 | Unbiased | |
M2 (DB4) | 0.5562 | 0.4050 | 14.3321 | 0.9792 | Biased | |
M3 (MB8) | 0.5102 | 0.3787 | 12.2258 | 0.9824 | Biased | |
M1 (LA8) | 0.7557 | 0.5847 | 37.2397 | 0.9607 | Unbiased | |
M2 (DB4) | 0.8676 | 0.6520 | 15.4997 | 0.9505 | Biased | |
M3 (MB8) | 0.7882 | 0.6007 | 26.4022 | 0.9585 | Biased | |
M1 (LA8) | 1.2909 | 0.9729 | 15.8073 | 0.8872 | Biased | |
M2 (DB4) | 1.3119 | 0.9923 | 16.8702 | 0.8839 | Biased | |
M3 (MB8) | 1.2150 | 0.9081 | 14.7143 | 0.8983 | Unbiased | |
Humansdorp (April, WM08) N = 3600 h = 720 | ||||||
M1 (LA8) | 0.4767 | 0.3580 | 8.8326 | 0.9718 | Unbiased | |
M2 (DB4) | 0.5482 | 0.4059 | 10.4977 | 0.9631 | Biased | |
M3 (MB8) | 0.4678 | 0.3544 | 9.1258 | 0.9729 | Unbiased | |
M1 (LA8) | 0.8640 | 0.6027 | 14.3734 | 0.9073 | Unbiased | |
M2 (DB4) | 0.7322 | 0.5198 | 12.2237 | 0.9335 | Unbiased | |
M3 (MB8) | 0.7634 | 0.5512 | 12.5808 | 0.9276 | Unbiased | |
M1 (LA8) | 1.0015 | 0.7537 | 16.5959 | 0.8754 | Unbiased | |
M2 (DB4) | 0.9003 | 0.7154 | 16.6009 | 0.8994 | Unbiased | |
M3 (MB8) | 0.8619 | 0.6750 | 15.2706 | 0.9077 | Unbiased | |
Jozini (December, WM13) N = 2400 h = 480 | ||||||
M1 (LA8) | 0.7092 | 0.4924 | 10.6034 | 0.9116 | Unbiased | |
M2 (DB4) | 0.7167 | 0.5002 | 10.9543 | 0.9098 | Unbiased | |
M3 (MB8) | 0.6703 | 0.4685 | 10.0479 | 0.9210 | Unbiased | |
M1 (LA8) | 0.7789 | 0.5652 | 12.7938 | 0.8935 | Unbiased | |
M2 (DB4) | 0.7566 | 0.5512 | 12.3017 | 0.8995 | Unbiased | |
M3 (MB8) | 0.7693 | 0.5350 | 11.5570 | 0.8962 | Unbiased | |
M1 (LA8) | 0.8927 | 0.7088 | 15.5410 | 0.8599 | Unbiased | |
M2 (DB4) | 0.8096 | 0.6266 | 13.2615 | 0.8848 | Unbiased | |
M3 (MB8) | 1.0169 | 0.8060 | 18.7082 | 0.8194 | Unbiased |
Model | Performance Indicator | Skilled Indicator * | Bias | ||||
---|---|---|---|---|---|---|---|
RMSE | MAE | MAPE (%) | RMSE | MAE | MAPE | MZ Test | |
Alexander Bay (August) | |||||||
M3 (MB8) | 0.5102 | 0.3787 | 12.2258 | 0.8705 | 0.8819 | 0.7045 | Biased |
M4 (GRU) | 0.7147 | 0.5099 | 8.2808 | 0.8186 | 0.8410 | 0.7999 | Biased |
M5 (Naive) | 3.9391 | 3.2069 | 41.3795 | Biased | |||
Humansdorp (April) | |||||||
M3 (MB8) | 0.4768 | 0.3544 | 9.1258 | 0.8321 | 0.8493 | 0.7793 | Unbiased |
M4 (GRU) | 0.7557 | 0.5376 | 12.7266 | 0.7340 | 0.7714 | 0.6923 | Unbiased |
M5 (Naive) | 2.8406 | 2.3516 | 41.3546 | Biased |
Jozini (December) | |||||||
M3 (MB8) | 0.6703 | 0.4685 | 10.0479 | 0.7482 | 0.7885 | 0.8093 | Unbiased |
M4 (GRU) | 0.8497 | 0.6377 | 13.5132 | 0.6808 | 0.7121 | 0.7435 | Biased |
M5 (Naive) | 2.6616 | 2.2151 | 52.6837 | Biased |
Model | PL Score | DS Score | PIT Score | |
---|---|---|---|---|
Quantile level = 0.95 | Mean | KS Test (D) | KS (p-Value) |
Alexander Bay (August) | ||||
M3 (MB8) | 0.7029 | 3.6753 | 0.0309 | 0.3633 |
M4 (GRU) | 0.6635 | 3.6798 | 0.0503 | 0.0223 |
M5 (Naive) | 0.7187 | 3.7441 | 0.1199 | 1.615 × 10⁻¹¹
Humansdorp (April) | ||||
M3 (MB8) | 0.6361 | 3.0853 | 0.0481 | 0.0729 |
M4 (GRU) | 0.6020 | 3.0905 | 0.0428 | 0.1451 |
M5 (Naive) | 0.5204 | 3.0881 | 0.0677 | 0.0028 |
Jozini (December) | ||||
M3 (MB8) | 0.4286 | 2.7419 | 0.0689 | 0.0217 |
M4 (GRU) | 0.4248 | 2.7618 | 0.0902 | 0.0009 |
M5 (Naive) | 0.5156 | 2.9833 | 0.2192 | 2.2 × 10⁻¹⁶
Method | Performance Indicator | Skilled Indicator | |||||
---|---|---|---|---|---|---|---|
Lead Time (Minutes) | RMSE | MAE | MAPE (%) | RMSE | MAE | MAPE | |
M3 | 10 | 0.5102 | 0.3787 | 12.2258 | 0.8705 | 0.8819 | 0.7045 |
60 | 0.5898 | 0.4330 | 7.9098 | 0.8507 | 0.8657 | 0.8099 | |
M4 | 10 | 0.7147 | 0.5099 | 8.2808 | 0.8186 | 0.8411 | 0.7999 |
60 | 2.3845 | 1.8621 | 39.4635 | 0.3964 | 0.4225 | 0.0514 | |
M5 | 10 | 3.9391 | 3.2069 | 41.3795 | |||
60 | 3.9503 | 3.2243 | 41.6032 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Sivhugwana, K.S.; Ranganai, E. Wind Speed Forecasting with Differentially Evolved Minimum-Bandwidth Filters and Gated Recurrent Units. Forecasting 2025, 7, 27. https://doi.org/10.3390/forecast7020027