PSO Based Optimized Ensemble Learning and Feature Selection Approach for Efficient Energy Forecast
Abstract
1. Introduction
- (i) The proposed optimized-ensemble model is a hybrid that combines random sampling, feature selection, and ensemble learning.
- (ii) The use of PSO is two-fold: it performs feature selection, and it optimizes hyper-parameters to find the best-performing combination among different ensembles. We refer to the result as the optimized-ensemble model.
- (iii) The effectiveness of the proposed optimized-ensemble model is compared against individual models, varying ensemble combinations, and a previously proposed method, ANN-PSO. The results show that the optimized ensemble outperforms the non-optimized ensemble, the individual models, and the previously presented models.
2. Related Works
2.1. Energy Consumption Forecasting Techniques
2.2. PSO-Based Feature Selection
2.3. PSO for Hyper-Parameter Optimization
3. Proposed Methodology
3.1. Conceptual View
3.2. Architectural View
3.2.1. Data
3.2.2. PSO-Based Feature Selection
3.2.3. Random Sampling
3.2.4. PSO Optimized Hybrid Ensemble Learning Model
- (A) Hyper-parameter Tuning Using PSO
  - As an increasingly popular meta-heuristic algorithm, PSO offers a robust global-search ability.
  - In most cases, PSO substantially improves computational efficiency.
  - It is easier to implement than many other meta-heuristic algorithms.
- (B) Learners for the optimized-ensemble model
  - (1) GradientBoosting
  - (2) CatBoost
  - (3) XGBoost
  - (4) LightBoost
  - (5) RandomForest
- (C) Predictions
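The tuning step in (A) can be sketched as continuous PSO searching a single hyper-parameter against validation RMSE. This is a hedged illustration, not the paper's implementation: closed-form ridge regression stands in for the boosted learners listed in (B), and `pso_tune_alpha`, the search bounds, and the toy data are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def ridge_fit(X, y, alpha):
    """Closed-form ridge regression; a lightweight stand-in for the
    boosted learners (CatBoost, XGBoost, ...) used in the paper."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n), X.T @ y)

def rmse(pred, y):
    return np.sqrt(np.mean((pred - y) ** 2))

def pso_tune_alpha(X_tr, y_tr, X_val, y_val, n_particles=8, iters=40,
                   w=0.7, c1=1.5, c2=1.5, bounds=(1e-4, 10.0)):
    """Continuous PSO over one hyper-parameter (the ridge penalty),
    with validation RMSE as the fitness criterion."""
    lo, hi = bounds
    pos = rng.uniform(lo, hi, n_particles)
    vel = np.zeros(n_particles)
    def fit(a):
        return rmse(X_val @ ridge_fit(X_tr, y_tr, a), y_val)
    pbest, pbest_f = pos.copy(), np.array([fit(a) for a in pos])
    gbest = pbest[pbest_f.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)         # stay inside the range
        f = np.array([fit(a) for a in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()]
    return gbest

# Toy regression task split into train / validation.
X = rng.normal(size=(300, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(0, 0.2, 300)
X_tr, X_val, y_tr, y_val = X[:200], X[200:], y[:200], y[200:]

best_alpha = pso_tune_alpha(X_tr, y_tr, X_val, y_val)
pred = X_val @ ridge_fit(X_tr, y_tr, best_alpha)
```

In the paper's setting the same loop would tune each base learner's hyper-parameters, and step (C) would average (or weight) the tuned learners' predictions to form the ensemble output.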
3.2.5. Evaluation
4. Implementation Setup
5. Performance Evaluation
6. Discussion and Concluding Remarks
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Available online: https://www.sciencedirect.com/science/article/pii/B9780128184837000020 (accessed on 22 July 2021).
- Available online: https://www.irena.org/-/media/Files/IRENA/Agency/Publication/2018/Apr/IRENA_Report_GET_2018.pdf (accessed on 22 July 2021).
- Available online: https://www.energy.gov/sites/prod/files/2017/01/f34/Electricity%20End%20Uses,%20Energy%20Efficiency,%20and%20Distributed%20Energy%20Resources.pdf (accessed on 22 July 2021).
- Available online: https://www.iea.org/reports/energy-efficiency-2020/buildings (accessed on 22 July 2021).
- Jain, R.K.; Smith, K.M.; Culligan, P.J.; Taylor, J.E. Forecasting energy consumption of multi-family residential buildings using support vector regression: Investigating the impact of temporal and spatial monitoring granularity on performance accuracy. Appl. Energy 2014, 123, 168–178.
- Howard, B.; Parshall, L.; Thompson, J.; Hammer, S.; Dickinson, J.; Modi, V. Spatial distribution of urban building energy consumption by end use. Energy Build. 2012, 45, 141–151.
- Malik, S.; Shafqat, W.; Lee, K.T.; Kim, D.H. A Feature Selection-Based Predictive-Learning Framework for Optimal Actuator Control in Smart Homes. Actuators 2021, 10, 84.
- Yu, L.; Wang, Z.; Tang, L. A decomposition–ensemble model with data-characteristic-driven reconstruction for crude oil price forecasting. Appl. Energy 2015, 156, 251–267.
- Ciulla, G.; D’Amico, A. Building energy performance forecasting: A multiple linear regression approach. Appl. Energy 2019, 253, 113500.
- Lü, X.; Lu, T.; Kibert, C.J.; Viljanen, M. Modeling and forecasting energy consumption for heterogeneous buildings using a physical–statistical approach. Appl. Energy 2015, 144, 261–275.
- Arora, S.; Taylor, J.W. Short-term forecasting of anomalous load using rule-based triple seasonal methods. IEEE Trans. Power Syst. 2013, 28, 3235–3242.
- Kavaklioglu, K. Modeling and prediction of Turkey’s electricity consumption using Support Vector Regression. Appl. Energy 2011, 88, 368–375.
- Rodrigues, F.; Cardeira, C.; Calado, J.M.F. The daily and hourly energy consumption and load forecasting using artificial neural network method: A case study using a set of 93 households in Portugal. Energy Procedia 2014, 62, 220–229.
- Tso, G.K.; Yau, K.K. Predicting electricity energy consumption: A comparison of regression analysis, decision tree and neural networks. Energy 2007, 32, 1761–1768.
- Bouktif, S.; Fiaz, A.; Ouni, A.; Serhani, M.A. Multi-sequence LSTM-RNN deep learning and metaheuristics for electric load forecasting. Energies 2020, 13, 391.
- Lago, J.; De Ridder, F.; De Schutter, B. Forecasting spot electricity prices: Deep learning approaches and empirical comparison of traditional algorithms. Appl. Energy 2018, 221, 386–405.
- Salgado, R.M.; Lemes, R.R. A hybrid approach to the load forecasting based on decision trees. J. Control Autom. Electr. Syst. 2013, 24, 854–862.
- Li, Q.; Ren, P.; Meng, Q. Prediction model of annual energy consumption of residential buildings. In Proceedings of the 2010 International Conference on Advances in Energy Engineering, Beijing, China, 19–20 June 2010; pp. 223–226.
- Ahmad, A.S.; Hassan, M.Y.; Abdullah, M.P.; Rahman, H.A.; Hussin, F.; Abdullah, H.; Saidur, R. A review on applications of ANN and SVM for building electrical energy consumption forecasting. Renew. Sustain. Energy Rev. 2014, 33, 102–109.
- Gezer, G.; Tuna, G.; Kogias, D.; Gulez, K.; Gungor, V.C. PI-controlled ANN-based energy consumption forecasting for Smart Grids. In Proceedings of the 2015 12th International Conference on Informatics in Control, Automation and Robotics (ICINCO), Colmar, France, 21–23 July 2015; Volume 1, pp. 110–116.
- Bi, Y.; Xue, B.; Zhang, M. An automated ensemble learning framework using genetic programming for image classification. In Proceedings of the Genetic and Evolutionary Computation Conference, Prague, Czech Republic, 13–17 July 2019; pp. 365–373.
- Panthong, R.; Srivihok, A. Wrapper feature subset selection for dimension reduction based on ensemble learning algorithm. Procedia Comput. Sci. 2015, 72, 162–169.
- Yang, X.; Lo, D.; Xia, X.; Sun, J. TLEL: A two-layer ensemble learning approach for just-in-time defect prediction. Inf. Softw. Technol. 2017, 87, 206–220.
- Huang, Y.; Yuan, Y.; Chen, H.; Wang, J.; Guo, Y.; Ahmad, T. A novel energy demand prediction strategy for residential buildings based on ensemble learning. Energy Procedia 2019, 158, 3411–3416.
- Yang, Y.; Hong, W.; Li, S. Deep ensemble learning based probabilistic load forecasting in smart grids. Energy 2019, 189, 116324.
- Krisshna, N.A.; Deepak, V.K.; Manikantan, K.; Ramachandran, S. Face recognition using transform domain feature extraction and PSO-based feature selection. Appl. Soft Comput. 2014, 22, 141–161.
- Kumar, S.U.; Inbarani, H.H. PSO-based feature selection and neighborhood rough set-based classification for BCI multiclass motor imagery task. Neural Comput. Appl. 2017, 28, 3239–3258.
- Tama, B.A.; Rhee, K.H. A combination of PSO-based feature selection and tree-based classifiers ensemble for intrusion detection systems. In Proceedings of the Advances in Computer Science and Ubiquitous Computing, Cebu, Philippines, 15–17 December 2015; Springer: Singapore, 2015; pp. 489–495.
- Amoozegar, M.; Minaei-Bidgoli, B. Optimizing multi-objective PSO based feature selection method using a feature elitism mechanism. Expert Syst. Appl. 2018, 113, 499–514.
- Rostami, M.; Forouzandeh, S.; Berahmand, K.; Soltani, M. Integration of multi-objective PSO based feature selection and node centrality for medical datasets. Genomics 2020, 112, 4370–4384.
- Bergstra, J.; Bengio, Y. Random search for hyper-parameter optimization. J. Mach. Learn. Res. 2012, 13, 281–305.
- Available online: https://analyticsindiamag.com/why-is-random-search-better-than-grid-search-for-machine-learning/ (accessed on 16 July 2021).
- Hutter, F.; Hoos, H.H.; Leyton-Brown, K. Sequential Model-Based Optimization for General Algorithm Configuration. In Learning and Intelligent Optimization; Springer: Cham, Switzerland, 2011; pp. 507–523.
- Snoek, J.; Rippel, O.; Swersky, K.; Kiros, R.; Satish, N.; Sundaram, N.; Patwary, M.M.A.; Prabhat, M.; Adams, R.P. Scalable Bayesian Optimization Using Deep Neural Networks. In Proceedings of the 32nd International Conference on Machine Learning, Lille, France, 6–11 July 2015.
- Lorenzo, P.R.; Nalepa, J.; Kawulok, M.; Ramos, L.S.; Pastor, J.R. Particle swarm optimization for hyper-parameter selection in deep neural networks. In Proceedings of the Genetic and Evolutionary Computation Conference, Berlin, Germany, 15–19 July 2017; pp. 481–488.
- Lorenzo, P.R.; Nalepa, J.; Ramos, L.S.; Pastor, J.R. Hyper-parameter selection in deep neural networks using parallel particle swarm optimization. In Proceedings of the Genetic and Evolutionary Computation Conference Companion, Berlin, Germany, 15–19 July 2017; pp. 1864–1871.
- Wang, Y.; Zhang, H.; Zhang, G. cPSO-CNN: An efficient PSO-based algorithm for fine-tuning hyper-parameters of convolutional neural networks. Swarm Evol. Comput. 2019, 49, 114–123.
- Nalepa, J.; Lorenzo, P.R. Convergence analysis of PSO for hyper-parameter selection in deep neural networks. In Proceedings of the International Conference on P2P, Parallel, Grid, Cloud and Internet Computing, Yonago, Japan, 28–30 October 2017; Springer: Cham, Switzerland, 2017; pp. 284–295.
- Guo, Y.; Li, J.Y.; Zhan, Z.H. Efficient hyperparameter optimization for convolution neural networks in deep learning: A distributed particle swarm optimization approach. Cybern. Syst. 2020, 52, 36–57.
- Palaniswamy, S.K.; Venkatesan, R. Hyperparameters tuning of ensemble model for software effort estimation. J. Ambient Intell. Hum. Comput. 2021, 12, 6579–6589.
- Tan, T.Y.; Zhang, L.; Lim, C.P.; Fielding, B.; Yu, Y.; Anderson, E. Evolving ensemble models for image segmentation using enhanced particle swarm optimization. IEEE Access 2019, 7, 34004–34019.
- Khanesar, M.A.; Teshnehlab, M.; Shoorehdeli, M.A. A novel binary particle swarm optimization. In Proceedings of the 2007 Mediterranean Conference on Control & Automation, Athens, Greece, 27–29 June 2007; pp. 1–6.
- Malik, S.; Kim, D. Prediction-learning algorithm for efficient energy consumption in smart buildings based on particle regeneration and velocity boost in particle swarm optimization neural networks. Energies 2018, 11, 1289.
- Cui, G.; Qin, L.; Liu, S.; Wang, Y.; Zhang, X.; Cao, X. Modified PSO algorithm for solving planar graph coloring problem. Prog. Nat. Sci. 2008, 18, 353–357.
- Chau, K.W. Particle swarm optimization training algorithm for ANNs in stage prediction of Shing Mun River. J. Hydrol. 2006, 329, 363–367.
- Feurer, M.; Klein, A.; Eggensperger, K.; Springenberg, J.; Hutter, F. Efficient and robust automated machine learning. In Proceedings of the Advances in Neural Information Processing Systems 28 (NIPS 2015), Montreal, QC, Canada, 7–12 December 2015.
- Jiang, M.; Jiang, S.; Zhi, L.; Wang, Y.; Zhang, H. Study on parameter optimization for support vector regression in solving the inverse ECG problem. Comput. Math. Methods Med. 2013, 2, 158056.
- Available online: https://www.kaggle.com/taranvee/smart-home-dataset-with-weather-information (accessed on 5 July 2021).
- Band, S.S.; Janizadeh, S.; Pal, S.C.; Saha, A.; Chakrabortty, R.; Shokri, M.; Mosavi, A. Novel ensemble approach of deep learning neural network (DLNN) model and particle swarm optimization (PSO) algorithm for prediction of gully erosion susceptibility. Sensors 2020, 20, 5609.
- Qin, C.; Zhang, Y.; Bao, F.; Zhang, C.; Liu, P.; Liu, P. XGBoost Optimized by Adaptive Particle Swarm Optimization for Credit Scoring. Math. Prob. Eng. 2021, 2021, 1–18.
- Ali, L.; Wajahat, I.; Golilarz, N.A.; Keshtkar, F.; Chan Bukhari, S.A. LDA–GA–SVM: Improved hepatocellular carcinoma prediction through dimensionality reduction and genetically optimized support vector machine. Neural Comput. Appl. 2021, 33, 2783–2792.
| Techniques | Strengths | Weaknesses |
|---|---|---|
| Classical Methods | | |
| Individual Machine Learning Methods | | |
| Ensemble Machine Learning Methods | | |
| # | Hyper-Parameters | Range |
|---|---|---|
| 1 | Hidden layer size | 0–15 |
| 2 | alpha (L2 regularizer) | 0.0001–0.1 |
| 3 | Number of estimators | 0.1–0.8 |
| 4 | Random state | 5–100 |
| 5 | C (regularization parameter) | 1–100 |
| 6 | γ (gamma) | 1–300 |
| 7 | ε (tolerance) | 0.01–30 |
| # | Parameters | Values |
|---|---|---|
| 1 | c1 (local coefficient) | 1.5 |
| 2 | c2 (global coefficient) | 1.5 |
| 3 | Inertia weight (w) | 0.7 |
| 4 | No. of particles | 8 |
| 5 | Fitness criterion | RMSE |
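For reference, these settings parameterize the standard PSO velocity and position updates, where $w$ is the inertia weight, $c_1$ and $c_2$ the local and global coefficients, and $r_1, r_2$ are uniform random draws in $[0,1]$:

```latex
v_i^{t+1} = w\,v_i^{t} + c_1 r_1 \left(p_i^{\mathrm{best}} - x_i^{t}\right)
          + c_2 r_2 \left(g^{\mathrm{best}} - x_i^{t}\right),
\qquad
x_i^{t+1} = x_i^{t} + v_i^{t+1}
```

With the tabulated values, eight particles are updated per iteration with $w = 0.7$ and $c_1 = c_2 = 1.5$, and RMSE serves as the fitness to be minimized.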
| Components | Specifications |
|---|---|
| Operating System | Windows 10 Professional Edition |
| Processor | Intel i5 9th Generation |
| Memory (RAM) | 16 GB |
| Programming Language | Python 3.8.1 |
| IDE | Jupyter (Conda 4.9.1) |
| Techniques | Models | RMSE | MAE | MSE |
|---|---|---|---|---|
| Individual | CatBoost | 27.89 | 25.76 | 789.34 |
| | XGBoost | 25.62 | 23.44 | 651.88 |
| | LightBoost | 26.33 | 24.90 | 883.92 |
| | Random Forest | 24.54 | 20.76 | 671.58 |
| | Gradient Boosting | 24.88 | 22.38 | 641.46 |
| Optimized-Individuals | CatBoost | 27.01 | 25.12 | 700.67 |
| | XGBoost | 24.94 | 23.40 | 619.81 |
| | LightBoost | 25.68 | 22.39 | 804.48 |
| | Random Forest | 23.47 | 18.91 | 632.45 |
| | Gradient Boosting | 24.61 | 20.15 | 621.91 |
| Ensemble | CatBoost, XGBoost, GradientBoost | 18.42 | 17.11 | 342.76 |
| | CatBoost, LightBoost | 20.96 | 18.78 | 480.61 |
| | RandomForest, CatBoost | 21.55 | 19.62 | 520.77 |
| | RandomForest, XGBoost, CatBoost | 16.81 | 15.28 | 389.51 |
| | CatBoost, XGBoost, LightBoost | 11.73 | 10.53 | 291.27 |
| Optimized-Ensemble | CatBoost, XGBoost, GradientBoost | 17.85 | 16.22 | 300.33 |
| | CatBoost, LightBoost | 17.99 | 16.01 | 451.35 |
| | RandomForest, CatBoost | 16.81 | 18.73 | 489.36 |
| | RandomForest, XGBoost, CatBoost | 10.68 | 11.83 | 310.84 |
| Proposed Model | | 6.05 | 5.76 | 161.37 |
| Models | Dataset | RMSE | MAE | MSE |
|---|---|---|---|---|
| PSO-DLNN [49] | Smart meter | 18.73 | 20.86 | 650.74 |
| | Our Dataset | 17.52 | 20.01 | 655.33 |
| PSO-XGBoost [50] | Smart meter | 17.91 | 21.99 | 620.16 |
| | Our Dataset | 18.66 | 19.71 | 590.42 |
| LDA–GA–SVM [51] | Smart meter | 19.23 | 19.55 | 590.18 |
| | Our Dataset | 15.48 | 20.79 | 660.63 |
| ANN-PSO [7] | Smart meter | 9.23 | 11.79 | 367.49 |
| | Our Dataset | 7.82 | 12.35 | 401.77 |
| Proposed Model | Smart meter | 7.01 | 9.28 | 198.36 |
| | Our Dataset | 6.05 | 5.76 | 161.37 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Shafqat, W.; Malik, S.; Lee, K.-T.; Kim, D.-H. PSO Based Optimized Ensemble Learning and Feature Selection Approach for Efficient Energy Forecast. Electronics 2021, 10, 2188. https://doi.org/10.3390/electronics10182188