A Novel Multi-Objective Hybrid Evolutionary-Based Approach for Tuning Machine Learning Models in Short-Term Power Consumption Forecasting
Abstract
1. Introduction
2. Related Work and Literature Review
3. The Proposed GA-SHADE-MO Algorithm
3.1. Multi-Objective Optimization
3.1.1. Problem Statement
3.1.2. State-of-the-Art Approaches for Multi-Objective Optimization
3.2. GA-SHADE-MO
4. The Experimental Setup and Results
4.1. Power Consumption Forecasting Problem
4.2. Measurement Data
4.3. Modelling Schemes and Input Variables
4.4. Model Optimization Using GA-SHADE-MO and Settings
4.5. Performance Evaluation
4.6. Numerical Results
4.6.1. Forecasting Daily Energy
4.6.2. Forecasting Hourly Energy
5. Discussion
5.1. Discussion of Daily Level of Forecasting
5.1.1. All-Features Scenario on Daily Level
5.1.2. Excluded Ambient Temperature Scenario on Daily Level
5.1.3. Power Lag and Time Scenario on Daily Level
5.1.4. Detailed Analysis of the Tuned ML Model on Daily Level and All-Features Scenario
5.1.5. GA-SHADE-MO vs. Random Search on Daily Level and All-Features Scenario
5.2. Discussion of Hourly Level of Forecasting
5.2.1. All-Features Scenario
5.2.2. Excluded Ambient Temperature Scenario
5.2.3. Power Lag and Time Scenario
5.2.4. Detailed Analysis of the Tuned ML Model on Hourly Level and All-Features Scenario
5.2.5. GA-SHADE-MO in Comparison with Random Search, Hourly Level
5.3. Discussion on the Role of Correlation in Feature Selection vs. GA-SHADE-MO
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
AI | Artificial intelligence |
ANN | Artificial neural network |
ARIMA | AutoRegressive Integrated Moving Average |
CNN | Convolutional Neural Network |
ConvLSTM | Convolutional Long Short-Term Memory |
CR | Crossover rate |
DB-Net | Hybrid network model incorporating a dilated convolutional neural network |
DE | Differential Evolution |
DF-CNNLSTM | Domain fusion of Convolutional Neural Networks and Long Short-Term Memory (LSTM) networks |
DNN | Deep Neural Network |
DT | Decision Tree |
EA | Evolutionary Algorithm |
ENCV | ElasticNetCV |
EPC-PM | Ensemble learning based power consumption prediction model |
F | Scale factor |
FA | Factor Analysis |
FCM–BP | Fuzzy C-Mean clustering BP Neural Network |
GA | Genetic Algorithm |
GARCH | Generalized Autoregressive Conditional Heteroskedasticity |
GA-SHADE-MO | Hybrid evolutionary-based multi-objective algorithm combining SHADE and GA |
H | Historical Memory |
HVAC | Heating, Ventilation, and Air Conditioning |
IA | Index of Agreement |
IBEA | The Indicator-Based Evolutionary Algorithm |
LR | Linear Regression |
LSTM | Long short-term memory |
MAE | Mean absolute error |
ML | Machine learning |
MLP | Multi-layer perceptron |
MO | Multi-objective |
MOEA/D | The Multi-Objective Evolutionary Algorithm based on Decomposition |
MOGAs | Multi-Objective Genetic Algorithms |
MRA-ANN | Multiple Regression Analysis-Artificial Neural Network |
MSE | Mean square error |
NPGA | The Niched Pareto Genetic Algorithm |
NSGA | Non-dominated Sorting Genetic Algorithm |
PCA | Principal component analysis |
PSF | Pattern Sequence Forecasting |
PSO | Particle Swarm Optimization |
R2 | Coefficient of determination |
RF | Random Forest |
RNN | Recurrent Neural Network |
RobustSTL | A robust seasonal-trend decomposition algorithm for long time series |
SHADE | Success-history-based parameter adaptation for differential evolution (a brief illustrative sketch follows this list) |
SVR | Support vector regression |
TCN | Temporal convolutional network |
TL-MCLSTM | Multi-channel long short-term memory deep model with time location |
VEGA | Vector Evaluated Genetic Algorithm |
XGBoost | Extreme Gradient Boosting |
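For readers unfamiliar with the SHADE control parameters listed above (the scale factor F, the crossover rate CR, and the historical memory of size H), the following minimal Python sketch illustrates the standard sampling and mutation scheme of SHADE as described by Tanabe and Fukunaga [51]. It shows the generic mechanism only and is not the authors' GA-SHADE-MO implementation; the external archive and the memory-update step are omitted.

```python
# Generic SHADE sketch (after Tanabe and Fukunaga [51]): drawing F and CR from a
# historical memory of size H, and the DE/current-to-pbest/1 mutation.
# Illustration only; not the authors' GA-SHADE-MO code. Archive and memory update omitted,
# and index-distinctness checks for r1, r2 are simplified.
import numpy as np

rng = np.random.default_rng(42)
H = 5                          # historical memory size
M_F = np.full(H, 0.5)          # memory of successful scale factors
M_CR = np.full(H, 0.5)         # memory of successful crossover rates

def sample_f_cr():
    """Draw (F, CR) for one individual from a randomly chosen memory cell."""
    r = rng.integers(H)
    cr = float(np.clip(rng.normal(M_CR[r], 0.1), 0.0, 1.0))
    f = -1.0
    while f <= 0.0:            # Cauchy(M_F[r], 0.1); regenerate non-positive draws
        f = M_F[r] + 0.1 * np.tan(np.pi * (rng.random() - 0.5))
    return min(f, 1.0), cr

def mutate_current_to_pbest(pop, i, pbest, f):
    """DE/current-to-pbest/1 mutation used by SHADE."""
    r1, r2 = rng.choice(len(pop), size=2, replace=False)
    return pop[i] + f * (pop[pbest] - pop[i]) + f * (pop[r1] - pop[r2])
```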
Appendix A
Regression Model | Hyperparameters | Value Ranges | Type |
---|---|---|---|
Linear Regression | None | None | None |
ElasticNetCV | l1_ratio (ratio of L1 regularization, controls the balance between L1 and L2 regularization) | [0.0; 1.0] | Real |
Decision Tree | max_depth (maximum depth of the tree) | [2; 20] | Integer |
 | min_samples_split (minimum number of samples required to split a node) | [2; 20] | Integer |
 | min_samples_leaf (minimum number of samples required in a leaf node) | [2; 20] | Integer |
Random Forest | n_estimators (number of trees in the forest) | [1; 300] | Integer |
 | max_depth (maximum depth of the trees) | [2; 20] | Integer |
 | min_samples_split (minimum number of samples required to split a node) | [2; 20] | Integer |
 | min_samples_leaf (minimum number of samples required in a leaf node) | [2; 20] | Integer |
Multi-layer perceptron | hidden_layers (number of hidden layers) | [1; 5] | Integer |
 | hidden_layer_sizes (size of each hidden layer) | [2; 50] | Integer |
 | batch_size (size of the mini-batch for training) | [1; 200] | Integer |
XGBoost | colsample_bytree (fraction of features to be selected for each tree) | [0.001; 1.0] | Real |
 | learning_rate (learning rate, controls the weight updates) | [0.001; 1.0] | Real |
 | max_depth (maximum depth of the decision tree) | [1; 20] | Integer |
 | alpha (L1 regularization on weights) | [1; 10] | Integer |
 | n_estimators (number of trees in the ensemble) | [1; 300] | Integer |
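The search spaces above can be expressed directly in code. The sketch below is one possible encoding using scikit-learn/XGBoost parameter names; the tuple layout ("int"/"real", lower bound, upper bound) and the uniform sampler are assumptions made for illustration and do not reproduce the authors' GA-SHADE-MO data structures.

```python
# Illustrative encoding of the Appendix A search spaces. Parameter names follow
# scikit-learn/XGBoost conventions; the ("int"/"real", low, high) tuples are an
# assumption of this sketch, not the authors' implementation.
import numpy as np

search_spaces = {
    "ElasticNetCV": {"l1_ratio": ("real", 0.0, 1.0)},
    "DecisionTree": {
        "max_depth": ("int", 2, 20),
        "min_samples_split": ("int", 2, 20),
        "min_samples_leaf": ("int", 2, 20),
    },
    "RandomForest": {
        "n_estimators": ("int", 1, 300),
        "max_depth": ("int", 2, 20),
        "min_samples_split": ("int", 2, 20),
        "min_samples_leaf": ("int", 2, 20),
    },
    "MLP": {
        "hidden_layers": ("int", 1, 5),
        "hidden_layer_sizes": ("int", 2, 50),
        "batch_size": ("int", 1, 200),
    },
    "XGBoost": {
        "colsample_bytree": ("real", 0.001, 1.0),
        "learning_rate": ("real", 0.001, 1.0),
        "max_depth": ("int", 1, 20),
        "alpha": ("int", 1, 10),
        "n_estimators": ("int", 1, 300),
    },
}

def sample_candidate(model_name, rng):
    """Draw one hyperparameter vector uniformly from the model's ranges."""
    params = {}
    for name, (kind, low, high) in search_spaces[model_name].items():
        if kind == "int":
            params[name] = int(rng.integers(low, high + 1))
        else:
            params[name] = float(rng.uniform(low, high))
    return params

# Example: one random XGBoost configuration.
print(sample_candidate("XGBoost", np.random.default_rng(0)))
```

A uniform draw over these ranges is the kind of candidate generation a random-search baseline (as compared against GA-SHADE-MO in Sections 5.1.5 and 5.2.5) would use.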
References
- Sharma, M.; Mittal, N.; Mishra, A.; Gupta, A. Survey of electricity demand forecasting and demand side management techniques in different sectors to identify scope for improvement. Smart Grids Sustain. Energy 2023, 8, 9. [Google Scholar] [CrossRef]
- Barthelmie, R.J.; Murray, F.; Pryor, S.C. The economic benefit of short-term forecasting for wind energy in the UK electricity market. Energy Policy 2008, 36, 1687–1696. [Google Scholar] [CrossRef]
- Cicceri, G.; Tricomi, G.; D’Agati, L.; Longo, F.; Merlino, G.; Puliafito, A. A Deep Learning-Driven Self-Conscious Distributed Cyber-Physical System for Renewable Energy Communities. Sensors 2023, 23, 4549. [Google Scholar] [CrossRef] [PubMed]
- Karaman, Ö.A. Prediction of Wind Power with Machine Learning Models. Appl. Sci. 2023, 13, 11455. [Google Scholar] [CrossRef]
- Wei, N.; Li, C.; Peng, X.; Zeng, F.; Lu, X. Conventional models and artificial intelligence-based models for energy consumption forecasting: A review. J. Pet. Sci. Eng. 2019, 181, 106187. [Google Scholar] [CrossRef]
- Huang, H.; Jia, R.; Shi, X.; Liang, J.; Dang, J. Feature selection and hyper parameters optimization for short-term wind power forecast. Appl. Intell. 2021, 51, 6752–6770. [Google Scholar] [CrossRef]
- Vakhnin, A.; Ryzhikov, I.; Brester, C.; Niska, H.; Kolehmainen, M. Weather-Based Prediction of Power Consumption in District Heating Network: Case Study in Finland. Energies 2024, 17, 2840. [Google Scholar] [CrossRef]
- Moletsane, P.P.; Motlhamme, T.J.; Malekian, R.; Bogatmoska, D.C. Linear regression analysis of energy consumption data for smart homes. In Proceedings of the 41st International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), Opatija, Croatia, 21–25 May 2018; pp. 395–399. [Google Scholar]
- Tso, G.K.; Yau, K.K. Predicting electricity energy consumption: A comparison of regression analysis, decision tree and neural networks. Energy 2007, 32, 1761–1768. [Google Scholar] [CrossRef]
- Vinagre, E.; Pinto, T.; Ramos, S.; Vale, Z.; Corchado, J.M. Electrical energy consumption forecast using support vector machines. In Proceedings of the 2016 27th International Workshop on Database and Expert Systems Applications (DEXA), Porto, Portugal, 5–8 September 2016; pp. 171–175. [Google Scholar]
- Azadeh, A.; Ghaderi, S.F.; Sohrabkhani, S. Annual electricity consumption forecasting by neural network in high energy consuming industrial sectors. Energy Convers. Manag. 2008, 49, 2272–2278. [Google Scholar] [CrossRef]
- Salam, A.; El Hibaoui, A. Comparison of machine learning algorithms for the power consumption prediction: Case study of Tetouan city. In Proceedings of the 2018 6th International Renewable and Sustainable Energy Conference (IRSEC), Rabat, Morocco, 5–8 December 2018; pp. 1–5. [Google Scholar]
- Reddy, S.; Akashdeep, S.; Harshvardhan, R.; Kamath, S. Stacking Deep learning and Machine learning models for short-term energy consumption forecasting. Adv. Eng. Inform. 2022, 52, 101542. [Google Scholar]
- Sultana, N.; Hossain, S.Z.; Almuhaini, S.H.; Düştegör, D. Bayesian optimization algorithm-based statistical and machine learning approaches for forecasting short-term electricity demand. Energies 2022, 15, 3425. [Google Scholar] [CrossRef]
- Li, K.; Hu, C.; Liu, G.; Xue, W. Building’s electricity consumption prediction using optimized artificial neural networks and principal component analysis. Energy Build. 2015, 108, 106–113. [Google Scholar] [CrossRef]
- Li, J.; Chen, H.; Yang, J.; Liu, S.; Nie, Y.; Li, J. Power Consumption Forecast Based on Ridge Regression Model. In Proceedings of the 5th International Conference on Information Technologies and Electrical Engineering, Changsha, China, 4–6 November 2022; pp. 297–302. [Google Scholar]
- Musleh, D.A.; Al Metrik, M.A. Machine Learning and Bagging to Predict Midterm Electricity Consumption in Saudi Arabia. Appl. Syst. Innov. 2023, 6, 65. [Google Scholar] [CrossRef]
- Zhou, J.; Wang, Q.; Khajavi, H.; Rastgoo, A. Sensitivity analysis and comparative assessment of novel hybridized boosting method for forecasting the power consumption. Expert Syst. Appl. 2024, 249, 123631. [Google Scholar] [CrossRef]
- Divina, F.; Gilson, A.; Goméz-Vela, F.; García Torres, M.; Torres, J.F. Stacking ensemble learning for short-term electricity consumption forecasting. Energies 2018, 11, 949. [Google Scholar] [CrossRef]
- Chi, D. Research on electricity consumption forecasting model based on wavelet transform and multi-layer LSTM model. Energy Rep. 2022, 8, 220–228. [Google Scholar] [CrossRef]
- Bian, H.; Zhong, Y.; Sun, J.; Shi, F. Study on power consumption load forecast based on K-means clustering and FCM–BP model. Energy Rep. 2020, 6, 693–700. [Google Scholar] [CrossRef]
- Eynard, J.; Grieu, S.; Polit, M. Wavelet-based multi-resolution analysis and artificial neural networks for forecasting temperature and thermal power consumption. Eng. Appl. Artif. Intell. 2011, 24, 501–516. [Google Scholar] [CrossRef]
- Lin, C.H.; Nuha, U.; Lin, G.Z.; Lee, T.F. Hourly power consumption forecasting using RobustSTL and TCN. Appl. Sci. 2022, 12, 4331. [Google Scholar] [CrossRef]
- Khan, N.; Haq, I.U.; Ullah, F.U.M.; Khan, S.U.; Lee, M.Y. CL-net: ConvLSTM-based hybrid architecture for batteries’ state of health and power consumption forecasting. Mathematics 2021, 9, 3326. [Google Scholar] [CrossRef]
- Khan, N.; Haq, I.U.; Khan, S.U.; Rho, S.; Lee, M.Y.; Baik, S.W. DB-Net: A novel dilated CNN based multi-step forecasting model for power consumption in integrated local energy systems. Int. J. Electr. Power Energy Syst. 2021, 133, 107023. [Google Scholar] [CrossRef]
- Peña-Guzmán, C.; Rey, J. Forecasting residential electric power consumption for Bogotá Colombia using regression models. Energy Rep. 2020, 6, 561–566. [Google Scholar] [CrossRef]
- Son, N. Comparison of the deep learning performance for short-term power load forecasting. Sustainability 2021, 13, 12493. [Google Scholar] [CrossRef]
- Kumar, J.; Gupta, R.; Saxena, D.; Singh, A.K. Power consumption forecast model using ensemble learning for smart grid. J. Supercomput. 2023, 79, 11007–11028. [Google Scholar] [CrossRef]
- Yan, K.; Wang, X.; Du, Y.; Jin, N.; Huang, H.; Zhou, H. Multi-step short-term power consumption forecasting with a hybrid deep learning strategy. Energies 2018, 11, 3089. [Google Scholar] [CrossRef]
- Moon, J.; Park, J.; Hwang, E.; Jun, S. Forecasting power consumption for higher educational institutions based on machine learning. J. Supercomput. 2018, 74, 3778–3800. [Google Scholar] [CrossRef]
- Gomez-Quiles, C.; Asencio-Cortes, G.; Gastalver-Rubio, A.; Martinez-Alvarez, F.; Troncoso, A.; Manresa, J.; Riquelme, J.C.; Riquelme-Santos, J.M. A novel ensemble method for electric vehicle power consumption forecasting: Application to the Spanish system. IEEE Access 2019, 7, 120840–120856. [Google Scholar] [CrossRef]
- Shao, X.; Pu, C.; Zhang, Y.; Kim, C.S. Domain fusion CNN-LSTM for short-term power consumption forecasting. IEEE Access 2020, 8, 188352–188362. [Google Scholar] [CrossRef]
- Shao, X.; Kim, C.S. Multi-step short-term power consumption forecasting using multi-channel LSTM with time location considering customer behavior. IEEE Access 2020, 8, 125263–125273. [Google Scholar] [CrossRef]
- Nagy, M.; Mansour, Y.; Abdelmohsen, S. Multi-objective optimization methods as a decision making strategy. Int. J. Eng. Res. Technol. 2020, 9, 516–522. [Google Scholar]
- Zadeh, L. Optimality and non-scalar-valued performance criteria. IEEE Trans. Autom. Control 1963, 8, 59–60. [Google Scholar] [CrossRef]
- Seo, T.; Asakura, Y. Multi-objective linear optimization problem for strategic planning of shared autonomous vehicle operation and infrastructure design. IEEE Trans. Intell. Transp. Syst. 2021, 23, 3816–3828. [Google Scholar] [CrossRef]
- Mohseny-Tonekabony, N.; Sadjadi, S.J.; Mohammadi, E.; Tamiz, M.; Jones, D.F. Robust, extended goal programming with uncertainty sets: An application to a multi-objective portfolio selection problem leveraging DEA. Ann. Oper. Res. 2024, 1–56. [Google Scholar] [CrossRef]
- Holland, J.H. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence; MIT Press: Cambridge, MA, USA, 1975. [Google Scholar]
- Schaffer, J.D. Some Experiments in Machine Learning Using Vector Evaluated Genetic Algorithms. Ph.D. Thesis, Vanderbilt University, Nashville, TN, USA, 1985. [Google Scholar]
- Fonseca, C.M.; Fleming, P.J. Genetic algorithms for multiobjective optimization: Formulation, discussion and generalization. In Proceedings of the Fifth International Conference on Genetic Algorithms (ICGA), 1993; pp. 416–423. [Google Scholar]
- Horn, J.; Nafpliotis, N.; Goldberg, D.E. A niched Pareto genetic algorithm for multiobjective optimization. In Proceedings of the First IEEE Conference on Evolutionary Computation, Orlando, FL, USA, 27–29 June 1994; pp. 82–87. [Google Scholar]
- Srinivas, N.; Deb, K. Muiltiobjective optimization using nondominated sorting in genetic algorithms. Evol. Comput. 1994, 2, 221–248. [Google Scholar] [CrossRef]
- Deb, K.; Pratap, A.; Agarwal, S.; Meyarivan, T.A.M.T. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 2002, 6, 182–197. [Google Scholar] [CrossRef]
- Coello, C.A.C.; Pulido, G.T.; Lechuga, M.S. Handling multiple objectives with particle swarm optimization. IEEE Trans. Evol. Comput. 2004, 8, 256–279. [Google Scholar] [CrossRef]
- Purshouse, R.C.; Deb, K.; Mansor, M.M.; Mostaghim, S.; Wang, R. A review of hybrid evolutionary multiple criteria decision making methods. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation, Beijing, China, 6–11 July 2014; pp. 1147–1154. [Google Scholar]
- Zhang, Q.; Li, H. MOEA/D: A multiobjective evolutionary algorithm based on decomposition. IEEE Trans. Evol. Comput. 2007, 11, 712–731. [Google Scholar] [CrossRef]
- Falcón-Cardona, J.G.; Coello, C.A.C. Indicator-based multi-objective evolutionary algorithms: A comprehensive survey. ACM Comput. Surv. 2020, 53, 29. [Google Scholar] [CrossRef]
- Gunantara, N. A review of multi-objective optimization: Methods and its applications. Cogent Eng. 2018, 5, 1502242. [Google Scholar] [CrossRef]
- Pereira, J.L.J.; Oliver, G.A.; Francisco, M.B.; Cunha Jr, S.S.; Gomes, G.F. A review of multi-objective optimization: Methods and algorithms in mechanical engineering problems. Arch. Comput. Methods Eng. 2022, 29, 2285–2308. [Google Scholar] [CrossRef]
- Katoch, S.; Chauhan, S.S.; Kumar, V. A review on genetic algorithm: Past, present, and future. Multimed. Tools Appl. 2021, 80, 8091–8126. [Google Scholar] [CrossRef] [PubMed]
- Tanabe, R.; Fukunaga, A. Success-history based parameter adaptation for differential evolution. In Proceedings of the 2013 IEEE Congress on Evolutionary Computation, Cancun, Mexico, 20–23 June 2013; pp. 71–78. [Google Scholar]
- Sangswang, A.; Konghirun, M. Optimal Strategies in Home Energy Management System Integrating Solar Power, Energy Storage, and Vehicle-to-Grid for Grid Support and Energy Efficiency. IEEE Trans. Ind. Appl. 2020, 56, 5716–5728. [Google Scholar] [CrossRef]
- Yuan, X.; Cai, Q.; Deng, S. Power consumption behavior analysis based on cluster analysis. In Proceedings of the International Symposium on Artificial Intelligence and Robotics 2021, Fukuoka, Japan, 21–22 August 2021; Volume 11884, pp. 476–486. [Google Scholar]
- de Lemos Martins, T.A.; Faraut, S.; Adolphe, L. Influence of context-sensitive urban and architectural design factors on the energy demand of buildings in Toulouse, France. Energy Build. 2019, 190, 262–278. [Google Scholar] [CrossRef]
- Hyndman, R.J.; Athanasopoulos, G. Forecasting: Principles and Practice; OTexts: Melbourne, Australia, 2018. [Google Scholar]
- Cerqueira, V.; Torgo, L.; Mozetič, I. Evaluating time series forecasting models: An empirical study on performance estimation methods. Mach. Learn. 2020, 109, 1997–2028. [Google Scholar] [CrossRef]
- Fumo, N.; Biswas, M.R. Regression analysis for prediction of residential energy consumption. Renew. Sustain. Energy Rev. 2015, 47, 332–343. [Google Scholar] [CrossRef]
- Liu, W.; Dou, Z.; Wang, W.; Liu, Y.; Zou, H.; Zhang, B.; Hou, S. Short-term load forecasting based on elastic net improved GMDH and difference degree weighting optimization. Appl. Sci. 2018, 8, 1603. [Google Scholar] [CrossRef]
- Cody, C.; Ford, V.; Siraj, A. Decision tree learning for fraud detection in consumer energy consumption. In Proceedings of the 2015 IEEE 14th International Conference on Machine Learning and Applications, Miami, FL, USA, 9–11 December 2015; pp. 1175–1179. [Google Scholar]
- Zogaan, W.A. Power Consumption prediction using Random Forest model. Int. J. Mech. Eng. 2022, 7, 329–341. [Google Scholar]
- Wahid, F.; Kim, D.H. Short-term energy consumption prediction in Korean residential buildings using optimized multi-layer perceptron. Kuwait J. Sci. 2017, 44, 67–77. [Google Scholar]
- Abbasi, R.A.; Javaid, N.; Ghuman, M.N.J.; Khan, Z.A.; Ur Rehman, S.; Amanullah. Short term load forecasting using XGBoost. In Web, Artificial Intelligence and Network Applications, Proceedings of the Workshops of the 33rd International Conference on Advanced Information Networking and Applications, Matsue, Japan, 27–29 March 2019; Springer: Cham, Switzerland, 2019; pp. 1120–1131. [Google Scholar]
- Tran, M.K.; Panchal, S.; Chauhan, V.; Brahmbhatt, N.; Mevawalla, A.; Fraser, R.; Fowler, M. Python-based scikit-learn machine learning models for thermal and electrical performance prediction of high-capacity lithium-ion battery. Int. J. Energy Res. 2022, 46, 786–794. [Google Scholar] [CrossRef]
- Chen, T.; Guestrin, C. Xgboost: A scalable tree boosting system. In Proceedings of the 22nd ACM Sigkdd International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 785–794. [Google Scholar]
- Surakhi, O.; Zaidan, M.A.; Fung, P.L.; Hossein Motlagh, N.; Serhan, S.; AlKhanafseh, M.; Ghoniem, R.M.; Hussein, T. Time-Lag Selection for Time-Series Forecasting Using Neural Network and Heuristic Algorithm. Electronics 2021, 10, 2518. [Google Scholar] [CrossRef]
- Mystakidis, A.; Koukaras, P.; Tsalikidis, N.; Ioannidis, D.; Tjortjis, C. Energy Forecasting: A Comprehensive Review of Techniques and Technologies. Energies 2024, 17, 1662. [Google Scholar] [CrossRef]
Model | Data Source | Authors | Feature Selection | Hyperparameters |
---|---|---|---|---|
Wavelet transform and multi-layer LSTM | The electricity consumption from U.S. Electric Power Company | D. Chi [20] | None | Fixed hyperparameters |
K-means and FCM–BP | The load data of 200 users from an area of Nanjing | H. Bian, et al. [21] | None | Fixed hyperparameters |
MRA-ANN | The multi-energy district boiler, La Rochelle, west coast of France | J. Eynard, et al. [22] | None | Grid search |
The hybrid of RobustSTL and TCN | Hourly Power Consumption of Turkey | C. H. Lin, et al. [23] | None | Fixed hyperparameters |
ConvLSTM and LSTM | NASA Battery Dataset, Individual Household Electric Power Consumption Dataset, Domestic Energy Management System Dataset | N. Khan, et al. [24] | None | Fixed hyperparameters |
DB-Net | IHEPC dataset from the UCI ML repository; the Korean AICT dataset | N. Khan, et al. [25] | None | Grid search |
A multiple regression model, a multiple econometric regression model, and a double-logarithm LR model | The six socio-economic strata in Bogotá City | C. Peña-Guzmán, et al. [26] | None | None
DNN, RNN, CNN, LSTM | Companies B and T located in Naju, Jeollanam-do | N. Son [27] | Correlation analysis | Fixed hyperparameters |
EPC-PM | The UMass Smart dataset | J. Kumar, et al. [28] | None | Fixed hyperparameters |
DNN hybrid | Five real-world household power consumption datasets | K. Yan, et al. [29] | None | Fixed hyperparameters |
SVR, ANN | Four building clusters in a university | J. Moon, et al. [30] | PCA and FA | Grid search
The learning ensemble of ARIMA, GARCH and PSF | The Spanish Control Centre for the Electric Vehicle | C. Gomez-Quiles, et al. [31] | None | Fixed hyperparameters |
DF-CNNLSTM | PJM Hourly Energy Consumption Data | X. Shao, et al. [32] | None | Fixed hyperparameters |
TL-MCLSTM | Two subsets from Pennsylvania-New Jersey-Maryland (PJM) | X. Shao, et al. [33] | None | Grid search
Abbreviation of the Feature | Description of the Feature |
---|---|
T, Tlag1, Tlag2, Tlag3 | Averaged ambient temperature on the current day, and one, two, and three days ago, respectively.
Tlag24, Tlag48, Tlag72 | Averaged ambient temperature 24 h ago, 48 h ago, and 72 h ago, respectively.
P1lag, P2lag, P3lag | Power consumption one day ago, two days ago, and three days ago, respectively. |
P24lag, P48lag, P72lag | Power consumption 24 h ago, 48 h ago, and 72 h ago, respectively. |
Pr | Atmospheric pressure |
Rel | Relative humidity |
Dew | Dew point |
Cl | Cloud cover level |
windcos | Cosine-transformed wind direction
windsin | Sine-transformed wind direction
windsp | Wind speed |
hcos | cos(hour·2π/24) transformation of hours |
hsin | sin(hour·2π/24) transformation of hours |
dcos | cos(day·2π/7) transformation of days |
dsin | sin(day·2π/7) transformation of days |
wcos | cos(week·2π/52) transformation of weeks |
wsin | sin(week·2π/52) transformation of weeks |
mcos | cos(month·2π/12) transformation of months |
msin | sin(month·2π/12) transformation of months |
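The cyclic encodings and lag features listed above can be reproduced with a few lines of pandas/NumPy code. The sketch below assumes an hourly DataFrame with a DatetimeIndex and a power column named "P"; these names, and the use of pandas, are assumptions made for illustration rather than the authors' preprocessing code.

```python
# Minimal sketch of the cyclic time encodings and power lags from the table above.
# Assumes an hourly pandas DataFrame `df` with a DatetimeIndex and a power column "P".
import numpy as np
import pandas as pd

def add_time_and_lag_features(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    idx = out.index
    out["hcos"] = np.cos(idx.hour * 2 * np.pi / 24)
    out["hsin"] = np.sin(idx.hour * 2 * np.pi / 24)
    out["dcos"] = np.cos(idx.dayofweek * 2 * np.pi / 7)
    out["dsin"] = np.sin(idx.dayofweek * 2 * np.pi / 7)
    week = idx.isocalendar().week.astype(float).to_numpy()
    out["wcos"] = np.cos(week * 2 * np.pi / 52)
    out["wsin"] = np.sin(week * 2 * np.pi / 52)
    out["mcos"] = np.cos(idx.month * 2 * np.pi / 12)
    out["msin"] = np.sin(idx.month * 2 * np.pi / 12)
    # Hourly power lags: with a regular hourly index, shifting by 24/48/72 rows
    # gives the consumption 24 h, 48 h, and 72 h ago.
    for h in (24, 48, 72):
        out[f"P{h}lag"] = out["P"].shift(h)
    return out
```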
The Best Tuned ML Model | Validation MAE | Validation MSE | Validation IA | Validation R2 | Test MAE | Test MSE | Test IA | Test R2
---|---|---|---|---|---|---|---|---
LR | 7.144 | 124.351 | 0.934 | 0.983 | 8.122 | 133.879 | 0.957 | 0.989 |
ENCV | 7.152 | 123.971 | 0.934 | 0.983 | 8.156 | 134.737 | 0.957 | 0.989 |
DT | 8.796 | 182.728 | 0.907 | 0.976 | 9.556 | 180.824 | 0.942 | 0.985 |
RF | 7.434 | 133.028 | 0.930 | 0.982 | 8.312 | 135.630 | 0.956 | 0.989
MLP | 6.685 | 115.374 | 0.939 | 0.984 | 7.527 | 115.959 | 0.963 | 0.990 |
XGBoost | 7.424 | 130.873 | 0.931 | 0.982 | 8.101 | 132.323 | 0.957 | 0.989 |
The Best Tuned ML Model | Validation MAE | Validation MSE | Validation IA | Validation R2 | Test MAE | Test MSE | Test IA | Test R2
---|---|---|---|---|---|---|---|---
LR | 7.338 | 126.076 | 0.933 | 0.983 | 8.270 | 137.162 | 0.956 | 0.989 |
ENCV | 7.232 | 123.059 | 0.935 | 0.983 | 8.511 | 141.938 | 0.954 | 0.988 |
DT | 9.355 | 187.305 | 0.901 | 0.974 | 10.516 | 209.288 | 0.933 | 0.983 |
RF | 7.926 | 146.398 | 0.923 | 0.980 | 8.682 | 152.150 | 0.951 | 0.988 |
MLP | 6.824 | 117.734 | 0.939 | 0.984 | 7.903 | 124.932 | 0.960 | 0.989 |
XGBoost | 7.646 | 135.900 | 0.927 | 0.981 | 8.577 | 149.363 | 0.952 | 0.988 |
The Best Tuned ML Model | Validation MAE | Validation MSE | Validation IA | Validation R2 | Test MAE | Test MSE | Test IA | Test R2
---|---|---|---|---|---|---|---|---
LR | 9.541 | 196.054 | 0.899 | 0.973 | 9.984 | 196.182 | 0.937 | 0.983 |
ENCV | 9.493 | 195.795 | 0.899 | 0.973 | 9.962 | 196.618 | 0.937 | 0.983 |
DT | 10.274 | 228.431 | 0.879 | 0.967 | 12.647 | 292.150 | 0.906 | 0.973 |
RF | 9.803 | 217.283 | 0.886 | 0.969 | 11.240 | 225.884 | 0.927 | 0.980 |
MLP | 9.525 | 201.665 | 0.893 | 0.971 | 11.211 | 226.656 | 0.927 | 0.980 |
XGBoost | 9.712 | 205.661 | 0.892 | 0.971 | 11.382 | 236.580 | 0.924 | 0.979 |
The Best Tuned ML Model | Validation MAE | Validation MSE | Validation IA | Validation R2 | Test MAE | Test MSE | Test IA | Test R2
---|---|---|---|---|---|---|---|---
LR | 0.922 | 1.709 | 0.811 | 0.946 | 0.910 | 1.722 | 0.835 | 0.954 |
ENCV | 0.922 | 1.709 | 0.811 | 0.946 | 0.906 | 1.716 | 0.836 | 0.954 |
DT | 0.910 | 1.660 | 0.817 | 0.949 | 0.910 | 1.783 | 0.829 | 0.952 |
RF | 0.857 | 1.434 | 0.840 | 0.955 | 0.860 | 1.533 | 0.853 | 0.959 |
MLP | 0.825 | 1.272 | 0.858 | 0.961 | 0.884 | 1.970 | 0.811 | 0.951 |
XGBoost | 0.839 | 1.335 | 0.852 | 0.959 | 0.875 | 1.677 | 0.839 | 0.956 |
The Best Tuned ML Model | Validation MAE | Validation MSE | Validation IA | Validation R2 | Test MAE | Test MSE | Test IA | Test R2
---|---|---|---|---|---|---|---|---
LR | 0.946 | 1.781 | 0.804 | 0.943 | 0.925 | 1.776 | 0.830 | 0.953 |
ENCV | 0.946 | 1.780 | 0.804 | 0.943 | 0.924 | 1.775 | 0.830 | 0.953 |
DT | 0.930 | 1.710 | 0.811 | 0.947 | 0.938 | 1.843 | 0.823 | 0.950 |
RF | 0.865 | 1.468 | 0.837 | 0.954 | 0.871 | 1.563 | 0.850 | 0.958 |
MLP | 0.835 | 1.317 | 0.853 | 0.960 | 0.895 | 1.821 | 0.826 | 0.952 |
XGBoost | 0.847 | 1.365 | 0.848 | 0.958 | 0.874 | 1.606 | 0.846 | 0.958 |
The Best Tuned ML Model | Validation MAE | Validation MSE | Validation IA | Validation R2 | Test MAE | Test MSE | Test IA | Test R2
---|---|---|---|---|---|---|---|---
LR | 0.968 | 1.878 | 0.793 | 0.941 | 0.946 | 1.879 | 0.820 | 0.948 |
ENCV | 0.967 | 1.876 | 0.794 | 0.941 | 0.946 | 1.881 | 0.820 | 0.948 |
DT | 0.986 | 1.943 | 0.787 | 0.939 | 0.990 | 2.004 | 0.808 | 0.944 |
RF | 0.949 | 1.774 | 0.805 | 0.945 | 0.965 | 1.885 | 0.819 | 0.946 |
MLP | 0.943 | 1.685 | 0.817 | 0.950 | 0.997 | 2.032 | 0.805 | 0.943 |
XGBoost | 0.932 | 1.693 | 0.815 | 0.948 | 0.953 | 1.862 | 0.822 | 0.947 |
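For reference, the four metrics reported in the validation and test tables above (MAE, MSE, IA, and R2) can be computed as in the sketch below. IA is implemented here as Willmott's index of agreement, the common definition of that abbreviation; the formulations used in the study are those given in Section 4.5 (Performance Evaluation).

```python
# Minimal sketch of the evaluation metrics reported in the tables above.
# IA is computed as Willmott's index of agreement (assumed definition).
import numpy as np

def evaluate(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mean_obs = y_true.mean()
    mae = np.mean(np.abs(err))                      # mean absolute error
    mse = np.mean(err ** 2)                         # mean square error
    # Willmott's index of agreement, bounded in [0, 1], 1 = perfect agreement.
    ia = 1.0 - np.sum(err ** 2) / np.sum(
        (np.abs(y_pred - mean_obs) + np.abs(y_true - mean_obs)) ** 2
    )
    # Coefficient of determination.
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - mean_obs) ** 2)
    return {"MAE": mae, "MSE": mse, "IA": ia, "R2": r2}
```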
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).