Article

Time Series Prediction Methodology and Ensemble Model Using Real-World Data

1 Department of Software, Sangmyung University, Chunan 330720, Republic of Korea
2 School of Artificial Intelligence Convergence, Hallym University, Chuncheon 24252, Republic of Korea
* Authors to whom correspondence should be addressed.
Electronics 2023, 12(13), 2811; https://doi.org/10.3390/electronics12132811
Submission received: 3 June 2023 / Revised: 22 June 2023 / Accepted: 23 June 2023 / Published: 25 June 2023
(This article belongs to the Section Power Electronics)

Abstract

Time series data analysis and forecasting have recently received considerable attention, supporting new technologies for predicting load fluctuations and uncertainty in many domains. In particular, for a small load such as a building, the effect of load fluctuation on the total load is relatively large compared to that in a bulk power system, and apart from a few identifiable factors, the fluctuation is very difficult to quantify. Accurate power consumption prediction has therefore become an important issue in the Internet of Things (IoT) environment. In this paper, traditional time series prediction methods and a new model are applied to power prediction in IoT and big data environments using a scientific approach. To obtain data used in real life, the power consumption of a commercial refrigerator was continuously collected at 15 min intervals, and prediction results were obtained by applying time series prediction methods (i.e., RNN, LSTM, and GRU), while the seasonality and periodicity of electricity use were also analyzed. We propose a method to improve model performance by classifying power consumption into three classes: weekday, Saturday, and Sunday. Finally, we propose a method for predicting power consumption using a new type of ensemble model that combines the three time series methods. Experimental results confirmed the accuracy of RNN (i.e., 96.1%), LSTM (i.e., 96.9%), and GRU (i.e., 96.4%). In addition, the ensemble model combining the three time series models achieved 98.43% accuracy in predicting power consumption. These experiments and approaches constitute a scientific contribution to time series data analysis with real data and highlight, once again, the need for continuous real-time power consumption monitoring.

1. Introduction

The functions of the smart grid are becoming more important as global energy policy shifts from supply-oriented to demand-oriented management [1,2]. The smart grid is a system that bi-directionally shares information between system operators and power consumers by integrating information technology and next-generation intelligent metering infrastructure into the existing grid [3,4,5]. Under a smart grid, a grid operator can reduce peak demand and stabilize grid operation by inducing power users to change their power usage according to given grid conditions. Power users can, in turn, cut unnecessary consumption, which can be economically beneficial through reduced electricity demand and, hence, lower electricity bills. To this end, various energy management systems have been developed for various types of power consumers [6,7]. Among them, household consumers account for about 20% of total electric power consumption; although each household's consumption is small, households form a considerable proportion of the population, so they can make a significant contribution to demand management if they are aggregated and managed through intermediate operators such as demand management companies. The important point is that such aggregated individual consumption can be a decisive factor for the entire smart grid system, with a major impact on overall peak demand.
Considerable research and academic progress have been made in electric power demand forecasting, and novel forecasting methods have been developed. Demand forecasts are used for supply and demand management in electric power systems [8,9,10]. Power demand forecasting for such large loads has been studied either by enhancing prediction methods using previous time series characteristics [11,12] or by analyzing the relationship with elements such as weather and events. In particular, much prediction research, such as work on stocks, statistical services, and practical systems, has been carried out in cloud computing and IoT environments [13,14]. However, previous studies have predominantly targeted large-scale loads in power systems where load variability or uncertainty is not significant [15,16,17].
In contrast, when a small- or medium-sized consumer, such as a building, is examined within the entire electric power system, a consumption variation caused by a specific factor has a considerably large influence on that consumer's total load, so the attributes of its power consumption must be considered [18,19,20,21,22]. Such variations are difficult to predict, and a time series-based deep learning prediction method is therefore needed that analyzes the complex relationships in power consumption using the usage and consumption patterns of existing commercial refrigerators. Thus, as the example of small- and medium-sized consumers shows, a novel prediction method is required that sufficiently reflects the demand and the variation characteristics of a specific consumer's power consumption. Among the many methods for more precise demand forecasting, however, few perform demand forecasting by time series analysis based on diverse data pattern analyses centered on actual measured data. Accurate and precise demand forecasting of power consumption is therefore required for stable and economical load operation and power supply, even for small-, medium- and large-sized consumers where an energy management system (EMS) is introduced for the efficient use of energy. Recently, devices such as the Smart Plug or 'Enertok', which monitor home appliances in real time based on the Internet of Things, have come into widespread use [21,22]. With such devices, consumers are guided toward energy conservation through real-time power consumption monitoring, and power consumption can be reduced through an alarm function based on consumer-defined settings.
In this paper, we take a scientific approach by applying traditional time series prediction methods to power prediction in IoT and big data environments. To exploit periodicity in the time series analysis, we separate the data into different categories according to the significantly different usage on weekdays, Saturdays, and holidays. In addition, an ensemble combination of different prediction models is applied to improve prediction accuracy. To this end, the power consumption of a commercial refrigerator was collected at 15 min intervals, and the time series prediction methods RNN, LSTM, and GRU (i.e., Recurrent Neural Network, Long Short-Term Memory, and Gated Recurrent Unit, respectively) were applied to predict power consumption [23,24]. With class separation applied, the RNN, LSTM, and GRU achieved accuracies of 96.1%, 96.9%, and 96.4%, respectively. Note that ensemble-based approaches are well known to improve prediction accuracy by increasing generalization [25]. To further improve accuracy, we designed an ensemble model that combines the three deep learning-based models, RNN, LSTM, and GRU. The experimental results of the ensemble model combining the three time series models showed 98.43% accuracy in power consumption prediction. Therefore, it is expected that time series prediction performance can be improved by applying the proposed strategy to various commercial refrigerators.
The composition of this paper is as follows. Section 2 describes the energy consumption forecasting and pattern analysis as well as the real-world data and measurement setup. Section 3 and Section 4 describe the three foundational time series prediction methods and the ensemble models for predicting power consumption, respectively. Finally, Section 5 presents the conclusions of the study.

2. Background

2.1. Energy Consumption Pattern

The study of power consumption patterns for efficient power generation is important [1,2,3,4,5,6,7]. Machine learning and time series data analysis methods have been studied as ways to predict electricity demand [8,9,10,11,12,13,14,15,16,17]. However, when a small- or medium-sized consumer, such as a building, is examined within the entire electric power system, a consumption variation caused by a specific factor has a considerably large influence on that consumer's load, and thus the attributes of the power consumption should be considered [18,19,20,21,22]. Therefore, analyzing the power consumption pattern in order to predict power consumption accurately is an important issue. Recently, devices such as the Smart Plug or 'Enertok', which monitor home appliances in real time based on the Internet of Things, have come into widespread use [21,22]. Using these devices, consumers are guided to conserve energy through real-time energy consumption monitoring, and power consumption is reduced through an alarm function based on the consumer's settings. In addition, as research on real-time automatic building energy performance analysis and diagnosis technology has progressed, methods for analyzing the energy consumption patterns of buildings using machine learning have been developed [23,24,25]. In these methods, the energy consumption pattern is analyzed from the temperature and the behavior patterns of the user. Power consumption pattern analysis can also be applied to cognitive applications in specific situations, such as anomaly detection and safety management systems for the elderly living alone.

2.2. Real-World Data and Measurement

In this paper, actual data from a commercial refrigerator installed in the Japan Logistics Center, as shown in Figure 1, were utilized. The refrigerator has a central compartment with left and right doors. Figure 1b shows the energy consumption monitoring equipment installed in the refrigerator. Figure 1c shows the power consumption of the outdoor unit, ranging from 3.3 kW to 4.1 kW. In addition, the indoor unit uses an open-cycle cooling method. The items are stored as shown in Figure 1d. In this paper, we take a scientific approach by applying traditional deep learning and time series data analysis methods to power prediction in IoT and big data environments. To this end, the commercial refrigerator's power consumption was collected at 15 min intervals, and a machine learning method (i.e., support vector machine) and time series prediction methods (i.e., RNN, LSTM, and GRU) were applied to predict power consumption.
In our study, we used the RNN, LSTM, and GRU algorithms to predict refrigerator usage patterns on weekdays and weekends from 80 days of actual refrigerator power consumption data. The small- and medium-sized refrigerator power consumption data were measured at 15 min intervals every day for about 12 weeks in order to distinguish the weekday power pattern of the refrigerator from the weekend (Saturday, Sunday) pattern. The data pattern is shown in Figure 2.
Figure 2 shows the power consumption data measured in 15 min increments per day, consisting of a total of 96 measurements from 00:00 to 23:45. Figure 2a shows weekday data (Mon–Fri), while Figure 2b,c show Saturday and Sunday data, respectively. Table 1 shows the configuration information for the ensemble models, and Table 2 summarizes the data for the three classes (i.e., Class1: Weekday, Class2: Saturday, Class3: Sunday). In the case of Class1, power use stayed at around 200 and then rose suddenly. This was because the door of the refrigerator was opened: when the door is opened, cold air escapes, and the refrigerator uses more power to cool down again. In this paper, the part where power consumption increases rapidly due to the opening of the refrigerator is called a peak spot. In the case of Class2, power usage stayed at around 200 and rose suddenly only occasionally, because the refrigerator door was opened less frequently on Saturdays than on weekdays; consequently, Class2 had fewer peak spots than Class1. In the case of Class3, the power consumption data remained mostly around 200, which means that the refrigerator door was rarely opened; characteristically, Class3 had no peak spots. In conclusion, we confirmed that there was interclass seasonality and an inherent periodicity within each class. Note that prediction accuracy can be improved with time series prediction approaches owing to this inherent periodicity and seasonality. Therefore, we propose a way to predict power usage by dividing it into three classes: weekday, Saturday, and Sunday.
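To make the class separation concrete, the following minimal Python sketch splits a measured series into the three classes by day of week. The file name and column names ('timestamp', 'power') are hypothetical placeholders, not the actual data format used in this study.

import pandas as pd

# Hypothetical input: one row per 15 min sample with a timestamp and a power reading.
df = pd.read_csv("refrigerator_power.csv", parse_dates=["timestamp"])

def assign_class(ts: pd.Timestamp) -> str:
    # Class1: weekday (Mon-Fri), Class2: Saturday, Class3: Sunday
    if ts.dayofweek < 5:
        return "Class1"
    if ts.dayofweek == 5:
        return "Class2"
    return "Class3"

df["class"] = df["timestamp"].apply(assign_class)

# One series per class; a separate model is trained on each.
class_series = {c: g["power"].to_numpy() for c, g in df.groupby("class")}
print({c: len(v) for c, v in class_series.items()})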

3. Foundational Models for Energy Consumption Prediction

3.1. Time Series Prediction Methods

Time series forecasting predicts future data from past sequence data and is widely used in critical applications such as demand forecasting for goods sold by retailers, traffic flow, and energy demand and supply [23,24,25]. Various models, such as multilayer perceptrons and convolutional neural networks, can also be used for time series prediction in addition to the recurrent neural network-based models used in this paper.
First, the RNN is an artificial neural network with a cyclic structure and is mainly used for time series problems. The basic equations describing the structure of a basic RNN are presented in Equation (1), where x_t denotes the input value at time t, y_t the output value, h_t the hidden state at the current step, and h_{t−1} the hidden state at the previous step. W_x and W_h are the input and recurrent weight matrices of the hidden layer, and b_t denotes the bias term.
h_t = \tanh(W_h h_{t-1} + W_x x_t + b_t), where y_t = W_h h_t + b_t    (1)
Second, LSTM is a special type of RNN that can learn long-term dependencies [23]. In an RNN, when the distance between the relevant information and the point where it is used is large, the gradient shrinks during backpropagation, resulting in the vanishing gradient problem, in which learning ability is greatly reduced. LSTM addresses the vanishing gradient problem by controlling which information is forgotten and which is remembered. LSTM consists of four steps: forgetting information, determining which information to store in the cell state, updating the state, and finally, producing the output value.
The first step of LSTM is to select which information to forget from the cell state. This decision is made by a sigmoid layer called the forget gate layer. In this step, h_{t−1} and x_t are taken as input, and the result is applied to C_{t−1}. This step is defined by Equation (2).
f_t = \sigma(W_f [h_{t-1}, x_t] + b_f)    (2)
The next step of LSTM is to determine which of the input information to store in the cell state. A sigmoid layer called the input gate layer determines which values to update and produces the gate value i_t. This step is defined by Equation (3).
i_t = \sigma(W_i [h_{t-1}, x_t] + b_i)    (3)
In the next step, the previous state C_{t−1} is updated to create a new state C_t. C_{t−1} is first multiplied by f_t to forget, and then the product of i_t and the candidate C̃_t is added to produce the new C_t. This step is defined by Equation (4).
C_t = f_t \odot C_{t-1} + i_t \odot \tilde{C}_t, where \tilde{C}_t = \tanh(W_c [h_{t-1}, x_t] + b_c)    (4)
The final step is to determine the output value. First, x_t and h_{t−1} are passed through a sigmoid layer to generate o_t, the part to be exported as output. Next, the state C_t is passed through tanh and multiplied by o_t to generate the final output, h_t. This step is defined by Equation (5).
o_t = \sigma(W_o [h_{t-1}, x_t] + b_o), where h_t = o_t \odot \tanh(C_t)    (5)
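As an illustration of Equations (2)–(5), the following NumPy sketch computes one LSTM step. The concatenated weight layout and variable names are assumptions made for readability and do not describe the trained models used in the experiments.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, C_prev, W_f, b_f, W_i, b_i, W_c, b_c, W_o, b_o):
    z = np.concatenate([h_prev, x_t])        # [h_{t-1}, x_t]
    f_t = sigmoid(W_f @ z + b_f)             # Equation (2): forget gate
    i_t = sigmoid(W_i @ z + b_i)             # Equation (3): input gate
    C_tilde = np.tanh(W_c @ z + b_c)         # candidate state from Equation (4)
    C_t = f_t * C_prev + i_t * C_tilde       # Equation (4): cell state update
    o_t = sigmoid(W_o @ z + b_o)             # Equation (5): output gate
    h_t = o_t * np.tanh(C_t)                 # Equation (5): hidden state
    return h_t, C_t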
Finally, GRU has been proposed as a simplified and improved version of LSTM. Unlike LSTM, GRU merges the two state vectors C_t and h_t into a single vector h_t. The gate z_t controls both the forget gate and the input gate: if z_t outputs 1, the forget gate opens and the input gate closes, and if z_t outputs 0, the forget gate closes and the input gate opens. The GRU cell has no output gate, so the entire state vector h_t is output at each time step, and r_t controls how much of h_{t−1} is used. The state of the GRU and the output of each layer are given in Equations (6)–(9).
r_t = \sigma(W_{xr}^T x_t + W_{hr}^T h_{t-1} + b_r)    (6)
z_t = \sigma(W_{xz}^T x_t + W_{hz}^T h_{t-1} + b_z)    (7)
g_t = \tanh(W_{xg}^T x_t + W_{hg}^T (r_t \odot h_{t-1}) + b_g)    (8)
h_t = z_t \odot h_{t-1} + (1 - z_t) \odot g_t    (9)
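Analogously, the GRU update in Equations (6)–(9) can be written as a single NumPy step; the weight names mirror the equations and are placeholders rather than learned parameters.

import numpy as np

def gru_step(x_t, h_prev, W_xr, W_hr, b_r, W_xz, W_hz, b_z, W_xg, W_hg, b_g):
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    r_t = sigmoid(W_xr.T @ x_t + W_hr.T @ h_prev + b_r)           # Equation (6): reset gate
    z_t = sigmoid(W_xz.T @ x_t + W_hz.T @ h_prev + b_z)           # Equation (7): update gate
    g_t = np.tanh(W_xg.T @ x_t + W_hg.T @ (r_t * h_prev) + b_g)   # Equation (8): candidate state
    h_t = z_t * h_prev + (1.0 - z_t) * g_t                        # Equation (9): new hidden state
    return h_t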

3.2. Time Series Data Conversion

Time series methods require the data to be converted into time series form. In this paper, the previous day's power usage is used to predict the power usage at time t: the pre-power consumption PreP_{t,w}, consisting of the previous day's values around time t, is defined in Equation (10). P_t is the power consumption at time t and w is the window size, which specifies how far backwards and forwards from time t the previous day's values are taken for prediction. l is the number of power usage observations per day; in this paper, l = 96 because measurements were taken every 15 min over 24 h. Algorithm 1 converts power usage into time series learning data, where X holds the features used for prediction and Y holds the corresponding target values.
PreP_{t,w} = [P_{t-w-l}, P_{t-w+1-l}, P_{t-w+2-l}, \ldots, P_{t-l}, \ldots, P_{t+w-1-l}, P_{t+w-l}]    (10)
Algorithm 1. Time series conversion of power consumption data
Input: Power consumption data P, window size w, daily observation count l
Output: Time series power consumption learning data Dt
X = []
Y = []
for t in range(l + w, len(P) - w):
    X.append(PreP_{t,w})
    Y.append(P_t)
Dt['X'] = X
Dt['Y'] = Y
return Dt
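A runnable Python version of Equation (10) and Algorithm 1 is sketched below, assuming the power series P is a one-dimensional NumPy array sampled every 15 min (l = 96); the window size w = 4 in the usage example is an arbitrary illustration, since the paper does not fix a specific value here.

import numpy as np

def prep(P, t, w, l=96):
    # PreP_{t,w}: previous-day window around time t, Equation (10)
    return P[t - l - w : t - l + w + 1]

def to_time_series_dataset(P, w, l=96):
    # Algorithm 1: convert the raw series into feature/target pairs
    X, Y = [], []
    for t in range(l + w, len(P) - w):
        X.append(prep(P, t, w, l))   # previous-day window as features
        Y.append(P[t])               # power consumption at time t as the target
    return {"X": np.array(X), "Y": np.array(Y)}

# Example with synthetic data (80 days x 96 samples per day):
P = np.random.rand(80 * 96) * 400.0
Dt = to_time_series_dataset(P, w=4)
print(Dt["X"].shape, Dt["Y"].shape)   # (n_samples, 2w+1), (n_samples,)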

4. Proposed Strategies

4.1. Applying the Time Series Prediction Methods

Time series prediction is a methodology for forecasting future values by analyzing patterns within sequential data. Its primary objective is to estimate forthcoming values while taking historical trends, seasonal variations, and periodic patterns into account. In recent years, significant research has focused on leveraging deep learning techniques for time series prediction tasks. In this paper, we propose an approach that utilizes three prominent time series prediction methods, LSTM, GRU, and RNN, to predict refrigerator usage patterns. Specifically, we present a methodology that classifies usage patterns into three categories, weekday, Saturday, and Sunday, while considering the seasonality associated with refrigerator usage on specific days. Figure 3 shows the structure of the time series prediction methods. PreP_{t,w} is input into the model to predict the power usage P_t at time t. The input value is vectorized through four hidden layers, and the vectorized representation predicts P_t through the final output layer. Since the input is not large, small models consisting of four layers of 100 units each were constructed. To compare performance between models, all models used the same layer size and depth.
We used the same hyperparameters for all time series prediction models: Adam with a learning rate of 0.01 as the optimizer and mean squared error (MSE) as the loss function. Training proceeded with a batch size of 128 for 300 epochs.
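The following Keras sketch reflects the stated configuration (four hidden layers of 100 units, Adam with a learning rate of 0.01, MSE loss, batch size 128, 300 epochs). The input shaping, the choice of window length, and other architectural details not stated in the paper are assumptions for illustration only.

import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(cell="LSTM", window=9):
    rnn_layer = {"RNN": layers.SimpleRNN, "LSTM": layers.LSTM, "GRU": layers.GRU}[cell]
    model = models.Sequential([
        layers.Input(shape=(window, 1)),   # PreP_{t,w} treated as a length-(2w+1) sequence
        rnn_layer(100, return_sequences=True),
        rnn_layer(100, return_sequences=True),
        rnn_layer(100, return_sequences=True),
        rnn_layer(100),                    # last recurrent layer returns a single vector
        layers.Dense(1),                   # output layer predicting P_t
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01), loss="mse")
    return model

# model = build_model("GRU")
# model.fit(X_train, y_train, batch_size=128, epochs=300)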

4.2. Applying the Ensemble Methods

In short-term prediction tasks, time series prediction methods rely heavily on empirical data to achieve good prediction performance. However, collecting a sufficient amount of training data is challenging due to the time cost involved and the limitations of the measuring device. It is therefore important to address the data shortage problem in order to obtain models that generalize well. Ensemble methods have emerged as a major approach to improving model generalization by combining multiple models instead of relying on a single one. Among these, stacking ensembles are recognized as an effective technique for improving the generalization performance of deep learning (DL) and machine learning (ML) models. This paper uses a stacking ensemble of RNN, LSTM, and GRU models. Figure 4 presents a configuration diagram of the ensemble of the three models.
PreP_{t,w} is input into the time series models to predict the power usage P_t at time t. The input value is vectorized through four hidden layers. We combine the resulting vectors from the time series prediction methods into one vector and then predict P_t through the DNN layers.
Algorithm 2 describes the ensemble of the time series methods. Dt, the power consumption data converted by Algorithm 1, is used to train the time series methods. The last layer of each trained model is removed, so that each model returns a feature-extracted vector. The vectors returned by the models are combined into a single vector Vconcat, which then passes through three hidden layers to predict the power usage P. This paper experiments with four ensemble combinations to find a suitable one.
Algorithm 2. Time series methods ensemble model
Input: Time series power consumption data Dt, time series prediction methods M
Output: Ensemble model E
for m in M:
    Train m on Dt
    Drop the last layer of m
Vconcat = Concat([m.output for m in M])
X = Dense(100)(Vconcat)
X = Dense(100)(X)
P = Dense(100)(X)
E = Model(Vconcat, P)
Train E on Dt
return E
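A hedged Keras functional-API sketch of this stacking procedure is given below: the output layer of each trained base model is dropped, the remaining feature extractors are applied to a shared input, their vectors are concatenated, and dense layers predict P_t. Freezing the base models and using a Dense(1) output head are assumptions made for illustration; Algorithm 2 does not specify these details.

import tensorflow as tf
from tensorflow.keras import layers, models

def build_ensemble(base_models, window=9):
    inp = layers.Input(shape=(window, 1))
    features = []
    for m in base_models:
        # Reuse everything except the last (output) layer as a feature extractor.
        extractor = models.Model(m.input, m.layers[-2].output)
        extractor.trainable = False        # assumption: base models are kept fixed
        features.append(extractor(inp))
    v_concat = layers.Concatenate()(features)      # Vconcat in Algorithm 2
    x = layers.Dense(100)(v_concat)
    x = layers.Dense(100)(x)
    x = layers.Dense(100)(x)
    out = layers.Dense(1)(x)                       # predicted power usage P_t (assumed head)
    ensemble = models.Model(inp, out)
    ensemble.compile(optimizer="adam", loss="mse")
    return ensemble

# ensemble_b = build_ensemble([lstm_model, gru_model])   # e.g., Ensemble B (LSTM + GRU)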
The performance of ensemble models can vary depending on the specific combination of models used. When ensembling the three time series prediction methods (LSTM, GRU, and RNN), there are four possible combinations in total. Table 1 presents the composition of each ensemble. Ensemble A comprises all three time series prediction methods, while each of the remaining models consists of two methods, with one method excluded per ensemble. To identify ensemble models with strong generalization performance, the four ensemble models (Ensemble A, Ensemble B, Ensemble C, and Ensemble D) were evaluated.
Table 1. Configuration information for the ensemble models.

Model Name    Include LSTM    Include GRU    Include RNN
Ensemble A    O               O              O
Ensemble B    O               O              X
Ensemble C    O               X              O
Ensemble D    X               O              O

5. Experimental Results

5.1. Experimental Results

The experimental conditions for demonstrating the continuity of power consumption and the time series data correlations examined in this paper were, in general, as follows. The hardware test environment consisted of an Intel Core i7-10900k CPU, a GTX-3090 GPU, and Windows 10 Enterprise. The data used for the performance measurements were the refrigerator power consumption data collected every 15 min for 80 days, which were generalized and then supplemented by applying data interpolation. By collecting 4 samples per hour, 96 samples were obtained each day, resulting in a total of 7680 samples over the 80-day period. The collected refrigerator power consumption data were organized by day of the week. Using these data, we predicted power consumption based on the three classes (weekday, Saturday, and Sunday) using the time series methods RNN, LSTM, and GRU. Table 2 shows the number of refrigerator power consumption data points per class and the corresponding percentage of the total data.
Table 2. Total number of data by class.

Class                Number of Data (Percentage of Total Data)
Class1 (Weekday)     5376 (70%)
Class2 (Saturday)    1152 (15%)
Class3 (Sunday)      1152 (15%)
Model performance was evaluated using the mean squared error (MSE), the mean absolute error (MAE), and accuracy, all calculated from the measured results. In this paper, the RNN, LSTM, and GRU models were applied to predict the refrigerator power consumption data. Table 3 shows the experimental results without dividing the data into three classes.
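For reference, the sketch below computes the reported metrics for a set of predictions. MSE and MAE follow their standard definitions; because the paper does not state its accuracy formula explicitly, accuracy is approximated here as 100 minus the mean absolute percentage error, which is only one plausible assumption.

import numpy as np

def evaluate(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = np.mean((y_true - y_pred) ** 2)
    mae = np.mean(np.abs(y_true - y_pred))
    mape = np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0   # assumes y_true is nonzero
    return {"MSE": mse, "MAE": mae, "Accuracy (%)": 100.0 - mape}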
When learning was conducted without class separation, the models performed best on weekdays, followed by Saturday and Sunday. The high weekday performance arose because the models used MSE as the loss function during training, which gives higher weight to detecting peak spots. If the weight on peak spot detection increases, the model over-detects peak spots on Saturdays and Sundays, when there are fewer peak spots than on weekdays, and this over-detection degrades model performance. We therefore proposed class separation to prevent over-detection of peak spots and improve performance. Table 4 shows the experimental results with the proposed division into three classes.
When the classes were separated, overall performance improved compared to the case without class separation. The performance of the three models was lower on weekdays than on Saturdays and Sundays, because Saturdays and Sundays have fewer peak spots than weekdays and power usage is therefore more monotonous. Comparing the three models, LSTM performed best overall, although GRU achieved lower MSE and MAE (i.e., better error performance) on Saturday. We confirm that our proposed class separation can improve performance by capturing the unique patterns of each class. Ensemble models combine the advantages of the individual models to provide better performance. Table 5 shows the experimental results of the ensemble models (Ensemble A, Ensemble B, Ensemble C, and Ensemble D) proposed in Section 4.2.
The utilization of ensemble models generally yielded higher accuracy than a single model across all three classes. In particular, while the three individual time series prediction methods predicted weekday power consumption less accurately than Saturday and Sunday consumption, the ensemble models brought the weekday predictions up to a level similar to the Saturday and Sunday predictions. Ensemble predictions for weekdays showed up to a 3.2% improvement over the 95.23% accuracy of LSTM, the most accurate single model. Figure 5 shows the performance comparison for the ensemble models.

5.2. Discussion of Results

The experiment revealed that Ensemble B, which combined LSTM [23] and GRU [24], exhibited higher accuracy than the other three ensemble models (Ensemble A, Ensemble C, and Ensemble D). The accuracy of Ensemble B on weekdays was 98.4%, matching the highest accuracy achieved by Ensemble A. Furthermore, Ensemble B demonstrated 98.8% accuracy on Saturdays, surpassing Ensemble C, the second most accurate model, by 0.6% and averaging 0.9% higher accuracy than the other three models. Similarly, on Sundays, Ensemble B achieved 98.2% accuracy, surpassing Ensemble C, the second most accurate model, by 0.3% and averaging 0.9% higher accuracy than the other three models. Adding an RNN model to Ensemble B yields Ensemble A; however, Ensemble A did not surpass the performance of Ensemble B. This confirms that combining more models does not always guarantee a performance improvement. Although the experimental data were collected and tested for 12 weeks on a single commercial refrigerator, we believe that the proposed strategy can help to predict the power consumption of various commercial refrigerators according to their usage behaviors. To utilize periodicity in the time series analysis, we separated the data into categories according to the significantly different usage on weekdays, Saturdays, and holidays, and an ensemble combination of different prediction models was applied to improve prediction accuracy. Therefore, it is expected that time series prediction performance can be improved by applying the proposed strategy to various commercial refrigerators.

6. Conclusions

The prediction of power use patterns can be improved by taking the variables of load and time into consideration, and it is necessary to develop a methodology to predict them effectively. In this paper, we compared the prediction characteristics and accuracy of several methods for analyzing time series data, and we proposed an ensemble model that combines three time series algorithms to improve prediction accuracy. When a small load, such as a building, is compared with the bulk power system, the relative influence of load fluctuations is considerably larger and is not easily quantified. Therefore, analyzing the power consumption pattern in order to predict power consumption accurately is an important issue.
In this paper, the power consumption of a refrigerator was measured in 15 min units, and time series methods were applied to predict power consumption. The experimental results confirmed the accuracy of RNN (i.e., 96.1%), LSTM (i.e., 96.9%), and GRU (i.e., 96.4%). Furthermore, the ensemble model combining the three time series models achieved 98.43% accuracy in predicting power consumption.
In order to confirm the comparative advantage of the time series data analysis, we applied the three time series methods and classified the data into three classes: weekdays, Saturdays, and Sundays. This paper predicted power use patterns with three time series prediction methods and proposed an ensemble model that combines the three for clearer prediction. These results clarify the predictive nature of power use patterns and provide an opportunity to once again identify the need for constant real-time power consumption monitoring.

Author Contributions

M.K. developed stimuli, and interpreted the results; S.L. and T.J. supervised the project, conducted the model’s data analysis, and wrote the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by a research grant from Hallym University in 2023 (202301-012).

Data Availability Statement

Data are available on request due to privacy or ethical restrictions.

Acknowledgments

The experimental data were analyzed with the help of Encored Technologies.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Bera, S.; Misra, S.; Rodrigues, J.J. Cloud Computing Applications for Smart Grid: A Survey. IEEE Trans. Parallel Distrib. Syst. 2015, 26, 1477–1494. [Google Scholar] [CrossRef]
  2. Tushar, W.; Chai, B.; Yuen, C.; Smith, D.B.; Wood, K.L.; Yang, Z.; Poor, H.V. Three-Party Energy Management with Distributed Energy Resources in Smart Grid. IEEE Trans. Ind. Electron. 2015, 62, 2487–2498. [Google Scholar] [CrossRef] [Green Version]
  3. Marzband, M.; Yousefnejad, E.; Sumper, A.; Domínguez-García, J.L. Real time experimental implementation of optimum energy management system in standalone Microgrid by using multi-layer ant colony optimization. Int. J. Electr. Power Energy Syst. 2016, 75, 265–274. [Google Scholar] [CrossRef] [Green Version]
  4. Beaudin, M.; Zareipour, H. Home energy management systems: A review of modelling and complexity. Renew. Sustain. Energy Rev. 2015, 45, 318–335. [Google Scholar] [CrossRef]
  5. El-Baz, W.; Tzscheutschler, P. Short-term smart learning electrical load prediction algorithm for home energy management systems. Appl. Energy 2015, 147, 10–19. [Google Scholar] [CrossRef]
  6. Raza, M.Q.; Khosravi, A. A review on artificial intelligence based load demand forecasting techniques for smart grid and buildings. Renew. Sustain. Energy Rev. 2015, 50, 1352–1372. [Google Scholar] [CrossRef]
  7. Ahmed, M.S.; Mohamed, A.; Homod, R.Z.; Shareef, H.; Sabry, A.H.; Bin Khalid, K. Smart plug prototype for monitoring electrical appliances in Home Energy Management System. In Proceedings of the 2015 IEEE Student Conference on Research and Development (SCOReD), Kuala Lumpur, Malaysia, 13–14 December 2015; pp. 32–36. [Google Scholar] [CrossRef]
  8. Majidpour, M.; Qiu, C.; Chu, P.; Gadh, R.; Pota, H.R. Fast Prediction for Sparse Time Series: Demand Forecast of EV Charging Stations for Cell Phone Applications. IEEE Trans. Ind. Inform. 2015, 11, 242–250. [Google Scholar] [CrossRef]
  9. Lee, W.-J.; Hong, J. A hybrid dynamic and fuzzy time series model for mid-term power load forecasting. Int. J. Electr. Power Energy Syst. 2015, 64, 1057–1062. [Google Scholar] [CrossRef]
  10. Qiu, X.; Ren, Y.; Suganthan, P.N.; Amaratunga, G.A. Empirical Mode Decomposition based ensemble deep learning for load demand time series forecasting. Appl. Soft Comput. 2017, 54, 246–255. [Google Scholar] [CrossRef]
  11. Letham, B.; Rudin, C.; McCormick, T.H.; Madigan, D. Interpretable classifiers using rules and Bayesian analysis: Building a better stroke prediction model. Ann. Appl. Stat. 2015, 9, 1350–1371. [Google Scholar] [CrossRef]
  12. Yuan, X.; Chen, C.; Yuan, Y.; Huang, Y.; Tan, Q. Short-term wind power prediction based on LSSVM–GSA model. Energy Convers. Manag. 2015, 101, 393–401. [Google Scholar] [CrossRef]
  13. Lee, S.; Jeong, T. Cloud-Based Parameter-Driven Statistical Services and Resource Allocation in a Heterogeneous Platform on Enterprise Environment. Symmetry 2016, 8, 103. [Google Scholar] [CrossRef] [Green Version]
  14. Lee, S.; Jeong, T. Forecasting Purpose Data Analysis and Methodology Comparison of Neural Model Perspective. Symmetry 2017, 9, 108. [Google Scholar] [CrossRef]
  15. Shi, X.; Chen, Z.; Wang, H.; Yeung, D.Y.; Wong, W.K.; Woo, W.C. Convolutional LSTM network: A machine learning approach for precipitation nowcasting. In Proceedings of the 28th International Conference on Neural Information Processing Systems, Cambridge, MA, USA, 7–12 December 2015. [Google Scholar]
  16. Sands, T.M.; Tayal, D.; Morris, M.E.; Monteiro, S.T. Robust stock value prediction using support vector machines with particle swarm optimization. In Proceedings of the 2015 IEEE Congress on Evolutionary Computation (CEC), IEEE, Sendai, Japan, 25–28 May 2015. [Google Scholar]
  17. Yang, Z.; Ce, L.; Lian, L. Electricity price forecasting by a hybrid model, combining wavelet transform, ARMA and kernel-based extreme learning machine methods. Appl. Energy 2017, 190, 291–305. [Google Scholar] [CrossRef]
  18. Shrouf, F.; Miragliotta, G. Energy management based on Internet of Things: Practices and framework for adoption in production management. J. Clean. Prod. 2015, 100, 235–246. [Google Scholar] [CrossRef]
  19. Pan, J.; Jain, R.; Paul, S.; Vu, T.; Saifullah, A.; Sha, M. An internet of things framework for smart energy in buildings: Designs, prototype, and experiments. IEEE Internet Things J. 2015, 2, 527–537. [Google Scholar] [CrossRef] [Green Version]
  20. Lee, I.; Lee, K. The Internet of Things (IoT): Applications, investments, and challenges for enterprises. Bus. Horiz. 2015, 58, 431–440. [Google Scholar] [CrossRef]
  21. Arasteh, H.; Hosseinnezhad, V.; Loia, V.; Tommasetti, A.; Troisi, O.; Shafie-khah, M.; Siano, P. IoT-based smart cities: A survey. In Proceedings of the 2016 IEEE 16th International Conference on Environment and Electrical Engineering (EEEIC), Florence, Italy, 7–10 June 2016. [Google Scholar]
  22. Lee, S.; Jeong, T. Large-Scale Distributed System and Design Methodology for Real-Time Cluster Services and Environments. Electronics 2022, 11, 4037. [Google Scholar] [CrossRef]
  23. Yu, Y.; Si, X.; Hu, C.; Zhang, J. A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures. Neural Comput. 2019, 31, 1235–1270. [Google Scholar] [CrossRef]
  24. Chung, J.; Gulcehre, C.; Cho, K.; Bengio, Y. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling. arXiv 2014, arXiv:1412.3555. [Google Scholar]
  25. Benidis, K.; Rangapuram, S.S.; Flunkert, V.; Wang, Y.; Maddix, D.; Turkmen, C.; Gasthaus, J.; Bohlke-Schneider, M.; Salinas, D.; Stella, L.; et al. Deep Learning for Time Series Forecasting: Tutorial and Literature Survey. ACM Comput. Surv. 2022, 55, 1–36. [Google Scholar] [CrossRef]
Figure 1. Energy data ingestion and measurement in real enterprise market. (a) Real-world data from local logistic center; (b) Local energy consumption monitoring equipment; (c) Real-world data measurement; (d) Local logistics of enterprise market.
Figure 2. Weekday and weekend (Saturday and Sunday) refrigerator power consumption data. (a) Power consumption data on a Weekday; (b) Power consumption data on a Saturday; (c) Power consumption data on a Sunday.
Figure 3. Structure of a time series prediction method.
Figure 4. Structure of an ensemble model.
Figure 5. Comparison results of the ensemble models: Ensemble A (LSTM + GRU + RNN), Ensemble B (LSTM + GRU), Ensemble C (LSTM + RNN), and Ensemble D (GRU + RNN).
Table 3. Comparison results for well-known popular time series prediction methods.

Model    Class       MSE       MAE       Accuracy (%)
RNN      Weekday     11,327    63.94     89.03
RNN      Saturday    17,700    90.83     75.94
RNN      Sunday      22,988    104.73    72.66
LSTM     Weekday     9908      53.02     89.94
LSTM     Saturday    18,542    89.4      72.5
LSTM     Sunday      38,291    123.9     66.1
GRU      Weekday     10,316    61.42     89.30
GRU      Saturday    17,135    89.3      76.2
GRU      Sunday      38,486    125.86    67.5
Table 4. Comparison results of the time series prediction methods divided into three classes.

Model    Class       MSE      MAE      Accuracy (%)
RNN      Weekday     4785     47.99    93.85
RNN      Saturday    3219     36.19    97.2
RNN      Sunday      2371     33.63    97.33
LSTM     Weekday     3542     38.78    95.23
LSTM     Saturday    2249     28.92    98.2
LSTM     Sunday      1940     27.22    97.52
GRU      Weekday     4272     44.5     94.9
GRU      Saturday    2176     27.14    98.1
GRU      Sunday      2384     32.33    96.2
Table 5. Comparison results of the ensemble models.

Model         Class       MSE      MAE      Accuracy (%)
Ensemble A    Weekday     1435     22.4     98.4
Ensemble A    Saturday    2051     27.44    97.6
Ensemble A    Sunday      1782     25.8     97.2
Ensemble B    Weekday     1677     25.11    98.4
Ensemble B    Saturday    1306     20.21    98.8
Ensemble B    Sunday      1706     25.6     98.2
Ensemble C    Weekday     3284     38.8     95.9
Ensemble C    Saturday    1249     19.6     98.2
Ensemble C    Sunday      1865     27.5     97.9
Ensemble D    Weekday     2527     32.0     97.2
Ensemble D    Saturday    1428     22.01    97.9
Ensemble D    Sunday      2314     33.05    97.1
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
