Article

Comparing the Simple to Complex Automatic Methods with the Ensemble Approach in Forecasting Electrical Time Series Data

Winita Sulandari, Yudho Yudhanto, Sri Subanti, Crisma Devika Setiawan, Riskhia Hapsari and Paulo Canas Rodrigues

1 Department of Statistics, Universitas Sebelas Maret, Surakarta 57126, Indonesia
2 Informatics Engineering, Vocational School, Universitas Sebelas Maret, Surakarta 57129, Indonesia
3 Department of Statistics, Federal University of Bahia, Salvador 40170-110, Brazil
* Author to whom correspondence should be addressed.
Energies 2023, 16(22), 7495; https://doi.org/10.3390/en16227495
Submission received: 12 September 2023 / Revised: 2 November 2023 / Accepted: 3 November 2023 / Published: 8 November 2023
(This article belongs to the Special Issue Forecasting Techniques for Power Systems with Machine Learning)

Abstract

The importance of forecasting in the energy sector as part of electrical power equipment maintenance encourages researchers to develop accurate electricity forecasting models. This study investigates simple to complex automatic methods and proposes two weighted ensemble approaches. The automatic methods are the autoregressive integrated moving average (ARIMA); the error–trend–seasonal (ETS) exponential smoothing method; the double seasonal Holt–Winters (DSHW) method; the trigonometric seasonality, Box–Cox transformation, ARMA errors, trend, and seasonal components (TBATS) model; Prophet; and neural networks. All accommodate the trend and seasonal patterns commonly found in monthly, daily, hourly, or half-hourly electricity data. The proposed ensemble approaches combine linearly (EnL) or nonlinearly (EnNL) the forecast values obtained from all the single automatic methods by considering the weight of each model component. In this work, four electrical time series with different characteristics are examined to demonstrate the effectiveness and applicability of the proposed ensemble approach, and the model performances are compared based on the root mean square error (RMSE) and mean absolute percentage error (MAPE). The experimental results show that, compared to the existing average weighted ensemble approach, the proposed nonlinear weighted ensemble approach reduces the RMSE and MAPE of the testing data by between 28% and 82%.

1. Introduction

Electricity plays an important role in human life. Therefore, research in this field is continually developing, from renewable energy sources and energy efficiency to mathematical modeling. Time series data related to the energy sector, such as electricity load, supply, and consumption, have unique patterns and challenge researchers to produce methods with accurate forecasting values. The conventional model approach for forecasting electrical time series data, including the ARIMA (autoregressive integrated moving average) and exponential smoothing, has been discussed extensively in [1,2,3,4,5,6,7,8,9,10,11]. Meanwhile, numerous authors [12,13,14,15,16,17,18,19,20,21] have explored machine learning approaches, including Prophet and neural networks (NN). The combination of the two approaches has also been widely explored (see [22,23,24,25,26]).
In electric load forecasting, regression and ARIMA hybrid approaches have succeeded in representing the behavior of linear trends and the influence of holidays [27]. Meanwhile, for more complex patterns, the study in [28] introduced the TLSNN (two-level seasonal neural network) and the TLCSNN (two-level complex seasonal neural network) models, which take advantage of SSA (singular spectrum analysis) to decompose trend and seasonal patterns so that a more appropriate trend function can be chosen to represent nonlinear trend patterns. Furthermore, in the TLCSNN, the seasonal function is approximated by a combination of amplitude-modulated sinusoidal functions to capture non-integer seasonal periods [29,30]. At the same time, NN (neural networks) can be used to handle nonlinear relationships in the data.
Holt and Winters [31,32] introduced exponential smoothing as a simple method for capturing trend and seasonal patterns in time series data. This method has been a competitive alternative to the seasonal ARIMA and is widely used in applications (see [18,33,34,35,36,37,38,39,40]) because of its robustness and accuracy [41]. An overview of the development of exponential smoothing models for time series with trend and seasonal patterns can be found in [3]. The method has attracted researchers' attention in electrical load forecasting. Taylor [42] developed exponential smoothing to accommodate the trend and multiple seasonal patterns in load time series. Another study [43] developed an exponential smoothing method with a state–space approach that models high-frequency electrical load data well. Later, another study [44] proposed a hybrid exponential smoothing–neural network model and applied it to a high-frequency Indonesian load time series. Exponential smoothing has become even more popular since the study in [45] combined this method with a neural network and took first place in the M4 competition [46,47].
The development of machine learning algorithms has encouraged researchers to propose automatic methods that make them easy to use. Users do not need to understand in-depth knowledge of statistics to be able to implement and run them. Many of these methods are old methods with a new twist. For example, the Prophet model developed by the Facebook (FB) team [48] has fundamentally the same idea as the simple model built by combining trend, seasonal, and irregular components discussed in [49]. The difference is in the more flexible trend and seasonality functions.
Several packages in R and Python offer automated models for time series forecasting, including the ARIMA, exponential smoothing, NN, and Prophet models. Even though the models are built automatically based on specific criteria, such as minimizing the AIC, RMSE, or other error measures, in some cases the resulting model may not provide the expected forecasting accuracy. To address this issue, several researchers have developed ensemble models. An ensemble model is built from a pool of models [50] or from the same model with different parameters [51,52]. An ensemble model can also be used to address problems arising from model, data, and parameter uncertainty.
As stated in [9,51,52], the resulting forecast of an ensemble model can be calculated using the mean, median, mode, or trimmed mean. These methods are simple but ignore possible differences in the weights of the base models. Subsequently, the studies in [53] and [54] proposed weighted ensemble methods based on nonlinear optimization and the artificial bee colony algorithm, respectively. Both [53,54] considered the absolute error to obtain the weights of the base models. According to [55,56], the absolute error is less sensitive to outliers than the squared error. Meanwhile, in electricity load time series, not all extreme values can be ignored or considered outliers that represent corrupted parts of the data. They may be related to community habits that occur periodically, such as the “mudik” (homecoming) tradition during every Idul Fitri in Indonesia, which causes load demand to decrease significantly compared to other days [28]. On the other hand, in defining an ensemble model, it is necessary to pay attention both to the appropriateness of the base models and to their weighting [52,54]. Inspired by [52,53,54], this work proposes alternative weighting strategies for the ensemble model, which are expected to improve forecasting accuracy by considering a pool of single methods appropriate for time series with the trend and seasonal patterns found in electrical data.
The motivation and contributions of this study are summarized as follows. First, considering the development of automatic models, i.e., the auto ARIMA, the ETS method, the DSHW method, the TBATS model, NN, and Prophet, this study evaluates the implementation of these automatic models to forecast electricity time series data, which usually show trend and seasonal patterns. Second, to handle the problem of model selection for the historical electricity time series data as a source of model uncertainty, this work proposes two weighted ensemble approaches that combine the six single constituent automatic models linearly (called EnL) and nonlinearly (called EnNL), respectively, by minimizing squared errors. To show the effectiveness and applicability of the proposed ensemble approach, four time series datasets with different behaviors and characteristics are considered, i.e., the monthly electricity net generation in the US, hourly household electricity demand in Ontario, Canada, half-hourly electricity demand in England and Wales, and half-hourly electricity demand in Victoria, Australia.
The organization of this paper is as follows. Section 2 provides a brief overview of the methods used to model the time series data and also the procedure of the proposed ensemble approaches. The quantitative analysis results and the findings are summarized and discussed in Section 3. Finally, the conclusion and future directions are presented in Section 4.

2. Materials and Methods

This section provides a brief overview of the ARIMA, exponential smoothing methods including the ETS (error–trend–seasonal) method, the DSHW (double seasonal Holt–Winters) method, and the TBATS (trigonometric seasonality, Box–Cox transformation, ARMA errors, trend, and seasonal components) model, and two machine learning methods, Prophet and NN.

2.1. The ARIMA

The ARIMA is probably the most classic and popular method for time series analysis and forecasting. For the seasonal time series, the model can be represented as in Equation (1)
$$\phi_P(B^s)\,\psi_p(B)\,(1-B)^d\,(1-B^s)^D Y_t = \theta_q(B)\,\eta_Q(B^s)\,\epsilon_t, \quad (1)$$
and denoted as ARIMA$(p,d,q)\times(P,D,Q)_s$, where $s$ is the seasonal period [49,57]. The regular autoregressive and moving average factors are denoted by $\psi_p(B)$ and $\theta_q(B)$, while the seasonal autoregressive and moving average factors are represented by $\phi_P(B^s)$ and $\eta_Q(B^s)$, where $p$, $q$, $P$, and $Q$ indicate the orders of the model. The numbers of regular and seasonal differences needed to make the time series stationary are denoted by $d$ and $D$, respectively. When $P=D=Q=s=0$, the model in Equation (1) reduces to a regular ARIMA (with no seasonal part).
Hyndman and Khandakar [57] proposed an algorithm, implemented as auto.arima, which estimates the ARIMA model by considering unit root tests, maximum likelihood estimation (MLE), and the corrected Akaike information criterion (AICc). By default, the algorithm determines d (between 0 and 2) using repeated KPSS tests, selects the orders of the model by minimizing the AICc, and then varies the selected model by adding or removing orders until no lower AICc is found. The chosen model is the one with the smallest AICc.
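As an illustration, the following is a minimal sketch of how such an automatic ARIMA can be fitted with the forecast package in R; the training series y and the 24-step horizon follow the setup used later in this paper, and the arguments shown simply make the usual search options explicit.

```r
# Minimal sketch: automatic seasonal ARIMA selection with the forecast package.
# 'y' is assumed to be a ts object holding the training series.
library(forecast)

fit_arima <- auto.arima(y,
                        seasonal      = TRUE,  # allow a seasonal (P, D, Q)_s part
                        stepwise      = TRUE,  # stepwise search over candidate orders
                        approximation = TRUE)  # faster (approximate) AICc comparisons

summary(fit_arima)                       # selected orders, AICc, training errors
fc_arima <- forecast(fit_arima, h = 24)  # 1- to 24-step-ahead forecasts
```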

2.2. Exponential Smoothing

This work considers three automatic exponential smoothing methods provided in the R package forecast version 8.20: the ETS method, the DSHW method, and the TBATS model. Exponential smoothing for trend and seasonal components was initially developed in [31,32] and is known as the Holt–Winters (HW) method. The trend and seasonal patterns can be included in the model as either additive or multiplicative components. The additive HW method assumes that the trend and seasonal components enter the model linearly, while the multiplicative HW method assumes that the past observations, represented as trend and seasonal components, have a nonlinear relationship to the output. Accordingly, the additive HW method is recommended when the seasonal variation tends to be constant, while the multiplicative HW method is appropriate when the seasonal variation changes proportionally to the level of the series.
The ETS method is exponential smoothing with a state–space approach and an automatic forecasting procedure proposed in [58]. In this framework, the level, trend, and seasonal components are states that change over time. The model is denoted ETS(Error, Trend, Seasonal), where the error can be additive (A) or multiplicative (M); the trend can be none (N), additive (A), or additive with a damping parameter; and the seasonal component can be none (N), additive (A), or multiplicative (M). These state–space models generalize the exponential smoothing family. In the analysis, the error, trend, and seasonal components are selected automatically and the best model is chosen based on the AICc. Further discussion of this method can be found in [59].
The DSHW method was developed in [42] to accommodate two seasonal cycles that cannot be handled by the standard HW method. The model includes two seasonal periods, where the longer period is a multiple of the shorter one; in other words, the short seasonal effect repeats within the longer one [42]. The best model is fitted by minimizing the mean square error using the dshw() function in the R software.
The TBATS model is more general than the DSHW method. It is also useful for modeling high-frequency time series with trend and double seasonal patterns. The seasonal periods are not limited to integer values but can also be non-integer [43].
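A minimal sketch of how the three automatic exponential smoothing variants can be called in the forecast package is given below; the series y (e.g., a monthly ts object), the half-hourly series y_halfhourly, and the double seasonal periods 48 and 336 (daily and weekly cycles of half-hourly data) are assumptions chosen only for illustration.

```r
# Minimal sketch: the three automatic exponential smoothing variants (forecast package).
# 'y' is assumed to be a ts object (e.g., monthly data); 'y2' stores an assumed
# half-hourly series as an msts object with daily (48) and weekly (336) periods.
library(forecast)

y2 <- msts(as.numeric(y_halfhourly), seasonal.periods = c(48, 336))  # illustrative

fit_ets   <- ets(y)            # ETS(Error, Trend, Seasonal) selected by AICc
fit_tbats <- tbats(y2)         # TBATS handles multiple/non-integer seasonality
fc_dshw   <- dshw(y2, h = 24)  # double seasonal Holt-Winters, fitted by minimizing MSE

fc_ets   <- forecast(fit_ets,   h = 24)
fc_tbats <- forecast(fit_tbats, h = 24)
fc_dshw$mean                   # dshw() already returns a forecast object
```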

2.3. Prophet

The Prophet model developed by the Facebook team consists of trend, seasonality, and holiday components and can be written as
$$Y_t = T_t + S_t + H_t + \epsilon_t, \quad (2)$$
where $T_t$, $S_t$, and $H_t$ represent the trend, seasonal, and holiday effects, respectively [48]. The error term, $\epsilon_t$, is usually assumed to be normally distributed and accommodates any changes not taken into account by the model. The flexibility of the Prophet model in Equation (2) makes it work well for time series with trends, strong seasonal effects, and several seasons of historical data. Prophet is robust to missing data and shifts in the trend, and it typically handles outliers well [48].
In the analysis, a piecewise linear function and a Fourier series are employed to model the trend component and the periodic component, respectively. The piecewise linear trend is represented as
$$T_t = \beta_0 + \beta_1 t + \beta_2 (t-c)_+, \quad (3)$$
where $t$ denotes the time step, $c$ is the break point, $(t-c)_+ = \max(t-c, 0)$, and $\beta_j$, $j = 0, 1, 2$, are unknown parameters. Equation (3) can also be written as Equation (4):
$$T_t = \begin{cases} \beta_0 + \beta_1 t, & t \le c, \\ (\beta_0 - \beta_2 c) + (\beta_1 + \beta_2)\, t, & t > c. \end{cases} \quad (4)$$
Meanwhile, the seasonal effect is approximated by a Fourier series as in Equation (5):
$$S_t = \sum_{i=1}^{n_F} \left[ a_i \cos\!\left(\frac{2\pi i t}{s}\right) + b_i \sin\!\left(\frac{2\pi i t}{s}\right) \right], \quad (5)$$
where $a_i$ and $b_i$, $i = 1, 2, \ldots, n_F$, are unknown parameters, $s$ is the seasonal period, and $n_F$ is the order of the Fourier series. In this work, the break points are determined automatically by the Prophet algorithm with the changepoint prior scale set to 0.05, and the holiday effect ($H_t$) is ignored, as in the other alternative models used in this study.
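A minimal sketch, assuming a training series y and a vector timestamps of its dates, of how the Prophet settings described above (changepoint prior scale 0.05, no holiday regressors) might be specified with the prophet R package; the monthly frequency passed to make_future_dataframe() is illustrative.

```r
# Minimal sketch: Prophet with automatic changepoints and no holiday effect.
# prophet() expects a data frame with columns 'ds' (date/time) and 'y' (value).
library(prophet)

train_df <- data.frame(ds = timestamps, y = as.numeric(y))  # 'timestamps' assumed given

m <- prophet(train_df,
             growth                  = "linear",
             changepoint.prior.scale = 0.05,   # flexibility of the piecewise linear trend
             yearly.seasonality      = TRUE)   # Fourier-based seasonal component

future     <- make_future_dataframe(m, periods = 24, freq = "month")  # horizon/frequency illustrative
fc_prophet <- predict(m, future)   # column 'yhat' holds the fitted and forecast values
```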

2.4. Neural Networks

The neural network algorithm discussed in this work is a feedforward neural network whose inputs are lagged values of the time series and whose output is the forecast value. Differently from the ARIMA, it can model a nonlinear relationship between the input and output variables. The output of the NN is obtained by Equation (6):
$$Y_t = v_0 + \sum_{j=1}^{n_h} v_j f(net_j), \quad (6)$$
where $v_j$, $j = 0, 1, \ldots, n_h$, are unknown parameters and $f$ is the sigmoid function defined as
$$f(net_j) = \frac{1}{1 + \exp(-net_j)}, \quad (7)$$
and
$$net_j = w_{0j} + \sum_{k=1}^{n_k} w_{kj} Y_{t-k} + \sum_{l=1}^{n_l} w_{lj} Y_{t-ls}. \quad (8)$$
Here, $w_{0j}$ ($j = 1, \ldots, n_h$) are the biases fed into the $j$th node of the hidden layer, $w_{kj}$ ($k = 1, 2, \ldots, n_k$; $j = 1, 2, \ldots, n_h$) are the weights connecting the input $Y_{t-k}$ to node $j$ of the hidden layer, and $w_{lj}$ ($l = 1, 2, \ldots, n_l$; $j = 1, 2, \ldots, n_h$) are the weights connecting the input $Y_{t-ls}$ to node $j$ of the hidden layer, where $s$ is the seasonal period. The terms $n_k$, $n_l$, and $n_h$ represent the numbers of regular lagged inputs, seasonal lagged inputs, and hidden nodes, respectively. The values of $n_k$ and $n_l$ in Equation (8) are selected automatically according to the AIC, while $n_h$ is set to the nearest integer to $(n_k + n_l + 1)/2$ [60] using the nnetar() function in the R software.
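The following is a minimal sketch of this autoregressive neural network fitted with nnetar() from the forecast package; when the lag orders are left unspecified they are chosen automatically, and the number of hidden nodes defaults to roughly half the number of inputs plus one, matching the rule described above. The series y is again an assumed training ts object.

```r
# Minimal sketch: autoregressive feedforward neural network via nnetar().
# 'y' is assumed to be a seasonal ts object holding the training series.
library(forecast)

set.seed(123)                       # weights are randomly initialised
fit_nn <- nnetar(y)                 # lags (n_k, n_l) and hidden nodes chosen automatically
print(fit_nn)                       # reports the fitted NNAR(p, P, k)[s] structure
fc_nn  <- forecast(fit_nn, h = 24)  # 1- to 24-step-ahead forecasts
```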

2.5. Proposed Ensemble Methods

The proposed ensemble methods include six steps.
Step 1:
Divide the time series data into two parts: training and testing data. Let $Y_t$, $t = 1, \ldots, N$, where $N$ is the sample size; the training dataset is $\{Y_1, Y_2, \ldots, Y_{N-24}\}$ and the testing dataset is $\{Y_{N-24+1}, Y_{N-24+2}, \ldots, Y_N\}$.
Step 2:
Model the training data using the automatic models, i.e., the ARIMA, the ETS method, the DSHW method, the TBATS model, NN, and Prophet.
Step 3:
Calculate the forecast values up to $h$ steps ahead, where, in this work, $h = 1, 2, \ldots, 24$ is the time horizon of the testing data. The forecast values $\hat{Y}^{(i)}_{N-24+h}$, for $i = 1, 2, \ldots, 6$, are obtained from the models listed in Step 2.
Step 4:
Combine the models with two weighting strategies (a code sketch of both weighting schemes is given after this procedure):
  • based on a linear relationship function, named EnL,
    $$\hat{Y}_{N-24+h} = \sum_{i=1}^{6} w_i \hat{Y}^{(i)}_{N-24+h}, \quad (9)$$
    where the weights $w_i$, $i = 1, 2, \ldots, 6$, are obtained by minimizing the squared error, which can be written as in Equation (10),
    $$\min\, g(\mathbf{w}) = \sum_{t=1}^{N-24} \left( \sum_{i=1}^{6} w_i \hat{Y}^{(i)}_t - Y_t \right)^2, \quad (10)$$
    subject to the linear constraint $\sum_{i=1}^{6} w_i = 1$;
  • based on a nonlinear relationship function, named EnNL,
    $$\hat{Y}_{N-24+h} = v_0 + \sum_{j=1}^{n_h} v_j f\!\left( w_{0j} + \sum_{i=1}^{6} w_{ij} \hat{Y}^{(i)}_{N-24+h} \right), \quad (11)$$
    where $f$ is the sigmoid function in Equation (7). The weights $v_0$, $v_j$, $w_{0j}$, and $w_{ij}$, for $i = 1, 2, \ldots, 6$ and $j = 1, 2, \ldots, n_h$, are estimated by an NN approach. The network is trained on the input–target pairs $(\hat{Y}^{(1)}_t, \hat{Y}^{(2)}_t, \ldots, \hat{Y}^{(6)}_t)$ and $Y_t$, $t = 1, 2, \ldots, N-24$, using a backpropagation algorithm with Levenberg–Marquardt optimization. The number of hidden nodes $n_h$ was set between 1 and 5 and selected based on the minimum RMSE, while the maximum number of epochs was 5000.
Step 5:
Calculate the forecast values using Equation (9) for the EnL method and Equation (11) for the EnNL method.
Step 6:
Evaluate the models based on the RMSE and MAPE using Equations (12) and (13), respectively:
$$\mathrm{RMSE} = \left[ \frac{1}{n_{obs}} \sum_{t=1}^{n_{obs}} \left( Y_t - \hat{Y}_t \right)^2 \right]^{1/2}, \quad (12)$$
$$\mathrm{MAPE} = \frac{1}{n_{obs}} \sum_{t=1}^{n_{obs}} \left| \frac{Y_t - \hat{Y}_t}{Y_t} \right| \times 100\%, \quad (13)$$
where $n_{obs}$ is the number of observations included in the calculation. In this study, the two proposed weighted ensemble approaches are compared with the existing average weighted ensemble and the six single models, i.e., the ARIMA, the ETS method, the DSHW method, the TBATS model, NN, and Prophet. These six single models are considered capable of modeling time series with trend and seasonal patterns such as electrical time series data. The research stages are presented in Figure 1.
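The following is a minimal sketch of the two weighting schemes in Step 4, assuming the in-sample forecasts of the six single models are stored column-wise in a matrix Fmat of size (N − 24) × 6, the test-period forecasts in Fh (24 × 6), and the training observations in y_train; all three names are placeholders. The paper estimates EnNL in Matlab with Levenberg–Marquardt optimization; the nnet() call below, which uses a quasi-Newton optimizer, is only a stand-in for that step.

```r
# Minimal sketch of the proposed ensemble weighting schemes (Step 4).
library(quadprog)  # quadratic programming for the constrained least squares of EnL
library(nnet)      # single-hidden-layer network as a stand-in for the EnNL step

## EnL: minimize Equation (10) subject to sum(w) = 1
Dmat  <- t(Fmat) %*% Fmat
dvec  <- t(Fmat) %*% y_train
Amat  <- matrix(1, nrow = ncol(Fmat), ncol = 1)   # equality constraint sum(w) = 1
w_enl <- solve.QP(Dmat, dvec, Amat, bvec = 1, meq = 1)$solution

## EnNL: network on the six single-model forecasts, Equation (11)
fit_ennl <- nnet(x = Fmat, y = y_train, size = 3,  # n_h searched over 1..5 in the paper
                 linout = TRUE, maxit = 5000, trace = FALSE)

## Combine the 24 test-period forecasts
enl_fc  <- as.vector(Fh %*% w_enl)
ennl_fc <- as.vector(predict(fit_ennl, Fh))

## Accuracy measures of Equations (12) and (13)
rmse <- function(actual, fc) sqrt(mean((actual - fc)^2))
mape <- function(actual, fc) mean(abs((actual - fc) / actual)) * 100
```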
The experiment consists of modeling the four datasets with the six single automated methods and the three ensemble methods and computing 1-to-24-step-ahead forecasts from all constructed models. All models were fitted on a Hewlett-Packard laptop (manufactured in China) with an Intel® Core i7 4.70 GHz processor and 32 GB of RAM, running R 4.2.2 for the single automated models and Matlab 2023a for the ensemble models.

3. Results and Discussion

In this section, four time series with different observation frequencies, i.e., monthly, hourly, and half-hourly, are presented as illustrative examples. The characteristics of each dataset are described in Table 1.
These four datasets are considered in order to show the effectiveness and applicability of the proposed ensemble approach for electricity load forecasting.

3.1. Data

The four time series datasets are described below.
Data 1:
US Monthly Electricity Total Net Generation
The first time series is the monthly total net generation of electricity from January 1973 to May 2001, presented in Figure 2. These data can be accessed freely from the library “fpp” in the R software under the name “usmelec” [61]. The data are split into two parts: the first 317 observations, from January 1973 to May 1999, are used as the training data and the last 24 observations as the testing data.
Figure 2 shows that the data tend to increase linearly over time, with changing seasonal variation.
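As a sketch of the split described above, Data 1 can be loaded and divided in R as follows; the fpp package supplies the usmelec series, and the window() boundaries reproduce the 317/24 split.

```r
# Sketch: loading Data 1 and splitting it into training and testing sets.
library(fpp)       # provides the monthly 'usmelec' series
data(usmelec)

y_train <- window(usmelec, start = c(1973, 1), end = c(1999, 5))  # 317 observations
y_test  <- window(usmelec, start = c(1999, 6), end = c(2001, 5))  # 24 observations
c(length(y_train), length(y_test))   # 317, 24
```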
Data 2:
Hourly Electricity Demand in Ontario
The second time series is the hourly electricity demand in Ontario, Canada, from 1 August 2016, 00:00 to 31 December 2016, 23:00. These data were obtained from [62]. In the experimental study, the first 3648 observations (1 August 2016, 00:00 to 30 December 2016, 23:00) are used as the training dataset and the rest (31 December 2016, from 00:00 to 23:00) as the testing set. The pattern of the data can be seen in Figure 3. Differently from Data 1, these data have a nonlinear trend, and the seasonal variation is larger at the beginning of the series and more stable afterwards.
Data 3:
Half-Hourly Electricity Demand in England and Wales.
Data 3 were also discussed in [42] and can be accessed from the library “forecast” in the R software with the name “taylor”. We consider 2354 observations as the training data, i.e., data from 5 June 2000, 00:00:00 to 24 July 2000, 00:30:00. The test data consist of the last 24 observations (24 July 2000, from hour 01:00:00 to 12:30:00) [63]. Figure 4 shows that the data have no trend, with a stable seasonal variation.
Data 4:
Half-Hourly Electricity Demand in Victoria, Australia
The half-hourly electricity demand in Victoria, Australia, time series can be accessed freely from the library “tsibbledata” in the R software under the name “vic_elec” [64]. As shown in Figure 5, these data have a more complex pattern than Data 3, presented in Figure 4. In the analysis, we estimate the models using the first 2354 observations (12 November 2014, 00:00:00 to 31 December 2014, 00:30:00), and the last 24 observations (31 December 2014, from 01:00:00 to 12:30:00) are used for testing.

3.2. Experimental Results

First, the training data for each of the four time series are modeled using the ARIMA, the ETS method, the DSHW method, the TBATS model, NN, and Prophet. The forecast values obtained from these six single models are then combined using the existing average weighted ensemble approach and the two proposed weighted ensemble approaches, based on linear and nonlinear relationship functions, respectively. The error evaluation in terms of RMSE and MAPE can be seen in Table 2.
Table 2 shows that, for the four datasets, EnNL has the smallest RMSE and MAPE compared with EnA, EnL, and all the single models, while EnL outperforms EnA. From the MAPE viewpoint, the ensemble models are worth recommending, even though the MAPE of EnA is slightly higher than 2% for Data 4, as it is for the other automatic single models apart from the ARIMA and Prophet. However, in selecting a model, further evaluation of the model's performance on the testing data is needed.
The performance of the models in terms of RMSE and MAPE for the testing data is summarized in Table 3. For Data 1 and Data 4, EnNL consistently provides the lowest RMSE and MAPE, as it does for the training data (see Table 2). EnNL successfully modeled and forecasted Data 1, which has trend and seasonal patterns with changing variation. As shown in Figure 6, the MAPEs obtained from EnNL for all lead times up to twenty-four steps ahead are less than 2%. Within the discussion of Data 1, EnA and EnL are recommended for one-step-ahead forecasting.
As for Data 2, no model produces a MAPE of less than 2% for the twenty-four-steps-ahead forecast horizon (see Table 3). For this case, further development is needed, such as involving the holiday effect and selecting more appropriate trend and seasonal functions [28,29,65]. However, based on Table 4 and Figure 7, all models except Prophet can be considered for forecasting up to seven steps ahead. Figure 7 also shows that EnNL produces smaller MAPEs than those obtained from the other two ensemble models.
For Data 3, EnNL is the most recommended model based on the RMSE and MAPE values of the training data. Interestingly, based on Table 3, Prophet produces the smallest MAPE for the testing dataset but a larger RMSE than EnNL. For further analysis, the MAPE behavior up to the next twenty-four half-hourly forecasts for all models is presented in Figure 8. The comparison of the actual values to the forecast values up to 24 steps ahead using the Prophet method and the three ensemble approaches is also presented in Figure 9. Based on the MAPE behavior (Figure 8) and the forecast values (Figure 9) for one to twenty-four steps ahead, and considering the RMSE and MAPE analysis of the training data (Table 2) and testing data (Table 3), EnL and EnNL are recommended over the others, including Prophet. In this case, the accuracy of EnL and EnNL may still be improved by considering the influence of holidays in each constituent model.
Finally, Figure 10 shows that for Data 4, the EnNL, NN, and TBATS models produce relatively stable MAPE values of no more than 2%. Meanwhile, the MAPE of EnA tends to increase with the forecasting lead time. Even though the DSHW method provides relatively small RMSE and MAPE values on the training data (Table 2), it cannot provide accurate forecast values for the testing data (Table 3 and Figure 10). The ARIMA provides the largest RMSE and MAPE values on the training data but produces the most accurate forecasts for two to eight lead times and is less accurate after that (Figure 10).
According to the experiments, most of the single automated models produced well-fitted values for the four different time series datasets. However, these models are better combined to minimize the forecast error, because the problem arising from model uncertainty is mitigated by combining the results of different models in an ensemble framework. Furthermore, the experimental results show that, compared with EnA, the proposed EnNL approach significantly reduced the RMSE and MAPE for Data 1, Data 3, and Data 4 for up to twenty-four-steps-ahead forecasts: the RMSE was reduced by approximately 55%, 82%, and 28% and the MAPE by approximately 59%, 73%, and 30% for Data 1, Data 3, and Data 4, respectively. Further, for Data 2, the EnNL model forecasted up to seven steps ahead more accurately than EnA and EnL.
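For instance, the 55% figure follows directly from the Data 1 RMSE values of EnA and EnNL in Table 3:
$$\frac{\mathrm{RMSE}_{\mathrm{EnA}} - \mathrm{RMSE}_{\mathrm{EnNL}}}{\mathrm{RMSE}_{\mathrm{EnA}}} \times 100\% = \frac{10.30 - 4.61}{10.30} \times 100\% \approx 55\%.$$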

4. Conclusions

High-frequency electricity forecasting remains a challenging area of study. This study applied six single automatic models, i.e., the ARIMA, the ETS method, the DSHW method, the TBATS model, NN, and Prophet, and their combinations to model and forecast time series data with the trend and seasonal patterns encountered in electricity data. Automatic models are considered easy to implement, especially for users who do not have in-depth knowledge of the modeling procedure. Meanwhile, the ensemble models are regarded as a way to handle the sources of uncertainty in both the data and the model. In this case, we applied the automatic models with the default settings provided by the packages in the R software, without considering the influence of holidays. The forecast values obtained by the single automatic models were then combined based on three weighting strategies, i.e., averaging (EnA) and the two proposed ensemble approaches based on linear (EnL) and nonlinear (EnNL) relationships, respectively. We compared the forecasting accuracy of each model based on the RMSE and MAPE. The experimental results for the four electricity time series, i.e., US monthly electricity net generation, hourly household electricity demand in Ontario, half-hourly electricity demand in England and Wales, and half-hourly electricity demand in Victoria, show that the proposed ensemble approach, especially EnNL, provides better multistep-ahead forecast accuracy than EnA, EnL, and the single models. The number of future periods that can be predicted accurately may differ from case to case.
In energy demand forecasting, understanding the characteristics of the data and the various factors affecting their patterns is needed to develop the model. Involving a holiday-effect variable and selecting a suitable weighted combination of the most appropriate constituent models may provide considerably better accuracy.
Although the proposed ensemble approaches effectively enhance forecast accuracy, improvements can still be made. Further investigation of this model development will be the focus of our future research.

Author Contributions

Conceptualization, W.S.; methodology, W.S.; software, W.S. and Y.Y.; validation, W.S., Y.Y. and P.C.R.; formal analysis, W.S., C.D.S. and R.H.; investigation, W.S.; data curation, W.S., C.D.S. and R.H.; writing—original draft preparation, W.S.; writing—review and editing, W.S. and P.C.R.; visualization, W.S., Y.Y. and S.S.; supervision, W.S.; project administration, C.D.S. and R.H.; funding acquisition, W.S., Y.Y. and S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Education, Culture, Research, and Technology Indonesia with the source from the DIPA Direktorat Riset, Teknologi, dan Pengabdian Kepada Masyarakat, Direktorat Jenderal Pendidikan Tinggi, Riset, dan Teknologi Kemendikbud 2023 with the research funding decision letter number 0217/E5/PG.02.00/2023 (28 February 2023) regarding Funding Recipients for the Operational Cost Assistance Program for State Universities for Advanced Research Programs in Higher Education for Fiscal Year 2023 under National Competitive Basic Research Grant Number SP DIPA-023.17.1.690523/2023 with the contract number 055/E5/PG.02.00.PL/2023 (12 April 2023) and 1380.1/UN27.22/PT.01.03/2023 (13 April 2023).

Data Availability Statement

All data discussed in this paper can be accessed freely from the source written in Section 3.

Acknowledgments

The authors thank Lembaga Penelitian dan Pengabdian kepada Masyarakat (LPPM) Universitas Sebelas Maret and Direktorat Riset, Teknologi, dan Pengabdian kepada Masyarakat (DRTPM) for their support. We also thank the editorial team and three anonymous reviewers for their valuable comments for improving the quality of this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Macaira, P.M.; Sousa, R.C.; Oliveira, F.L.C. Forecasting Brazil’s electricity consumption with pegels exponential smoothing techniques. IEEE Lat. Am. Trans. 2016, 14, 1252–1258. [Google Scholar] [CrossRef]
  2. Rendon-Sanchez, J.F.; de Menezes, L.M. Structural combination of seasonal exponential smoothing forecasts applied to load forecasting. Eur. J. Oper. Res. 2019, 275, 916–924. [Google Scholar] [CrossRef]
  3. Sulandari, W.; Suhartono; Subanar; Rodrigues, P.C. Exponential Smoothing on Modeling and Forecasting Multiple Seasonal Time Series: An Overview. Fluct. Noise Lett. 2021, 20, 2130003. [Google Scholar] [CrossRef]
  4. Mahia, F.; Dey, A.R.; Masud, M.A.; Mahmud, M.S. Forecasting Electricity Consumption using ARIMA Model. In Proceedings of the 2019 International Conference on Sustainable Technologies for Industry 4.0 (STI 2019), Dhaka, Bangladesh, 24–25 December 2019; pp. 1–6. [Google Scholar]
  5. Wei, L.; Zhen-gang, Z. Based on Time Sequence of ARIMA Model in the Application of Short-Term Electricity Load Forecasting. In Proceedings of the 2009 International Conference on Research Challenges in Computer Science, Shanghai, China, 28–29 December 2009; pp. 11–14. [Google Scholar]
  6. Nepal, B.; Yamaha, M.; Yokoe, A.; Yamaji, T. Electricity load forecasting using clustering and ARIMA model for energy management in buildings. Jpn. Archit. Rev. 2020, 3, 62–76. [Google Scholar] [CrossRef]
  7. Al-Musaylh, M.S.; Deo, R.C.; Adamowski, J.F.; Li, Y. Short-term electricity demand forecasting with MARS, SVR and ARIMA models using aggregated demand data in Queensland, Australia. Adv. Eng. Inform. 2018, 35, 1–16. [Google Scholar] [CrossRef]
  8. Elsaraiti, M.; Ali, G.; Musbah, H.; Merabet, A.; Little, T. Time series analysis of electricity consumption forecasting using ARIMA model. In Proceedings of the 2021 IEEE Green Technologies Conference (GreenTech), Denver, CO, USA, 7–9 April 2021; pp. 259–262. [Google Scholar]
  9. De Oliveira, E.M.; Oliveira, F.L.C. Forecasting mid-long term electric energy consumption through bagging ARIMA and exponential smoothing methods. Energy 2018, 144, 776–788. [Google Scholar] [CrossRef]
  10. Chodakowska, E.; Nazarko, J.; Nazarko, L. Arima models in electrical load forecasting and their robustness to noise. Energies 2021, 14, 7952. [Google Scholar] [CrossRef]
  11. Da Silva, F.L.C.; Da Costa, K.; Rodrigues, P.C.; Salas, R.; López-Gonzales, J.L. Statistical and Artificial Neural Networks Models for Electricity Consumption Forecasting in the Brazilian Industrial Sector. Energies 2022, 15, 588. [Google Scholar] [CrossRef]
  12. Dudek, G. Neural networks for pattern-based short-term load forecasting: A comparative study. Neurocomputing 2016, 205, 64–74. [Google Scholar] [CrossRef]
  13. Arslan, S. A hybrid forecasting model using LSTM and Prophet for energy consumption with decomposition of time series data. PeerJ Comput. Sci. 2022, 8, e1001. [Google Scholar] [CrossRef]
  14. Bashir, T.; Haoyong, C.; Tahir, M.F.; Liqiang, Z. Short term electricity load forecasting using hybrid prophet-LSTM model optimized by BPNN. Energy Rep. 2022, 8, 1678–1686. [Google Scholar] [CrossRef]
  15. Chadalavada, R.J.; Raghavendra, S.; Rekha, V. Electricity requirement prediction using time series and Facebook’s PROPHET. Indian J. Sci. Technol. 2020, 13, 4631–4645. [Google Scholar] [CrossRef]
  16. Long, C.; Yu, C.; Li, T. Prophet-Based Medium and Long-Term Electricity Load Forecasting Research. J. Phys. Conf. Ser. 2022, 2356, 012002. [Google Scholar] [CrossRef]
  17. Shohan, M.J.A.; Faruque, M.O.; Foo, S.Y. Forecasting of electric load using a hybrid LSTM-neural prophet model. Energies 2022, 15, 2158. [Google Scholar] [CrossRef]
  18. Almazrouee, A.I.; Almeshal, A.M.; Almutairi, A.S.; Alenezi, M.R.; Alhajeri, S.N. Long-term forecasting of electrical loads in Kuwait using prophet and Holt–Winters models. Appl. Sci. 2020, 10, 5627. [Google Scholar] [CrossRef]
  19. Almazrouee, A.I.; Almeshal, A.M.; Almutairi, A.S.; Alenezi, M.R.; Alhajeri, S.N.; Alshammari, F.M. Forecasting of electrical generation using prophet and multiple seasonality of Holt–Winters models: A case study of Kuwait. Appl. Sci. 2020, 10, 8412. [Google Scholar] [CrossRef]
  20. Stefenon, S.F.; Seman, L.O.; Mariani, V.C.; Coelho, L.d.S. Aggregating prophet and seasonal trend decomposition for time series forecasting of Italian electricity spot prices. Energies 2023, 16, 1371. [Google Scholar] [CrossRef]
  21. Zhao, Y.; Guo, N.; Chen, W.; Zhang, H.; Guo, B.; Shen, J.; Tian, Z. Multi-step ahead forecasting for electric power load using an ensemble model. Expert Syst. Appl. 2023, 211, 118649. [Google Scholar] [CrossRef]
  22. Ren, F.; Tian, C.; Zhang, G.; Li, C.; Zhai, Y. A hybrid method for power demand prediction of electric vehicles based on SARIMA and deep learning with integration of periodic features. Energy 2022, 250, 123738. [Google Scholar] [CrossRef]
  23. Dudek, G.; Pelka, P.; Smyl, S. A hybrid residual dilated LSTM and exponential smoothing model for midterm electric load forecasting. IEEE Trans. Neural Netw. Learn. Syst. 2021, 33, 2879–2891. [Google Scholar] [CrossRef]
  24. Mayrink, V.; Hippert, H.S. A hybrid method using Exponential Smoothing and Gradient Boosting for electrical short-term load forecasting. In Proceedings of the 2016 IEEE Latin American Conference on Computational Intelligence (LA-CCI), Cartagena, Colombia, 2–4 November 2016; pp. 1–6. [Google Scholar]
  25. Dat, N.Q.; Ngoc Anh, N.T.; Nhat Anh, N.; Solanki, V.K. Hybrid online model based multi seasonal decompose for short-term electricity load forecasting using ARIMA and online RNN. J. Intell. Fuzzy Syst. 2021, 41, 5639–5652. [Google Scholar] [CrossRef]
  26. Lee, J.; Cho, Y. National-scale electricity peak load forecasting: Traditional, machine learning, or hybrid model? Energy 2022, 239, 122366. [Google Scholar] [CrossRef]
  27. Soares, L.J.; Medeiros, M.C. Modeling and forecasting short-term electricity load: A comparison of methods with an application to Brazilian data. Int. J. Forecast. 2008, 24, 630–644. [Google Scholar] [CrossRef]
  28. Sulandari, W.; Subanar, S.; Suhartono, S.; Utami, H.; Lee, M.H.; Rodrigues, P.C. SSA-based hybrid forecasting models and applications. Bull. Electr. Eng. Inform. 2020, 9, 2178–2188. [Google Scholar] [CrossRef]
  29. Sulandari, W.; Subanar, S.; Suhartono, S.; Utami, H. Amplitude-Modulated Sinusoidal Model for The Periodic Components of SSA Decomposition. In Proceedings of the 2018 International Symposium on Advanced Intelligent Informatics (SAIN), Yogyakarta, Indonesia, 29–30 August 2018; pp. 66–71. [Google Scholar]
  30. Sulandari, W.; Subanar, S.; Suhartono, S.; Utami, H.; Lee, M.H. Estimating the function of oscillatory components in SSA-based forecasting model. Int. J. Adv. Intell. Inform. 2019, 5, 11–23. [Google Scholar] [CrossRef]
  31. Holt, C.C. Forecasting seasonals and trends by exponentially weighted moving averages. Int. J. Forecast. 2004, 20, 5–10. [Google Scholar] [CrossRef]
  32. Winters, P.R. Forecasting Sales by Exponentially Weighted Moving Averages. Manag. Sci. 1960, 6, 324–342. [Google Scholar] [CrossRef]
  33. Bezerra, A.K.L.; Santos, É.M.C. Prediction the daily number of confirmed cases of COVID-19 in Sudan with ARIMA and Holt Winter exponential smoothing. Int. J. Dev. Res. 2020, 10, 39408–39413. [Google Scholar]
  34. Da Veiga, C.P.; Da Veiga, C.R.P.; Catapan, A.; Tortato, U.; Da Silva, W.V. Demand forecasting in food retail: A comparison between the Holt-Winters and ARIMA models. WSEAS Trans. Bus. Econ. 2014, 11, 608–614. [Google Scholar]
  35. Tratar, L.F.; Strmčnik, E. The comparison of Holt–Winters method and Multiple regression method: A case study. Energy 2016, 109, 266–276. [Google Scholar] [CrossRef]
  36. Tikunov, D.; Nishimura, T. Traffic prediction for mobile network using Holt-Winter’s exponential smoothing. In Proceedings of the 2007 15th International Conference on Software, Telecommunications and Computer Networks, Split, Croatia, 27–29 September 2007; pp. 1–5. [Google Scholar]
  37. Dantas, T.M.; Oliveira, F.L.C.; Repolho, H.M.V. Air transportation demand forecast through Bagging Holt Winters methods. J. Air Transp. Manag. 2017, 59, 116–123. [Google Scholar] [CrossRef]
  38. Fauzi, N.F.; Ahmadi, N.S.; Shafii, N.H. A comparison study on fuzzy time series and holt-winter model in forecasting tourist arrival in Langkawi, Kedah. J. Comput. Res. Innov. 2020, 5, 34–43. [Google Scholar] [CrossRef]
  39. Elmunim, N.A.; Abdullah, M.; Hasbi, A.M.; Bahari, S.A. Comparison of statistical Holt-Winter models for forecasting the ionospheric delay using GPS observations. Indian J. Radio Space Phys. 2015, 44, 28–34. [Google Scholar]
  40. Djakaria, I.; Saleh, S.E. COVID-19 forecast using Holt-Winters exponential smoothing. J. Phys. Conf. Ser. 2021, 1882, 012033. [Google Scholar] [CrossRef]
  41. Makridakis, S.; Chatfield, C.; Hibon, M.; Lawrence, M.; Mills, T.; Ord, K.; Simmons, L.F. The M2-competition: A real-time judgmentally based forecasting study. Int. J. Forecast. 1993, 9, 5–22. [Google Scholar] [CrossRef]
  42. Taylor, J.W. Short-term electricity demand forecasting using double seasonal exponential smoothing. J. Oper. Res. Soc. 2003, 54, 799–805. [Google Scholar] [CrossRef]
  43. De Livera, A.M.; Hyndman, R.J.; Snyder, R.D. Forecasting time series with complex seasonal patterns using exponential smoothing. J. Am. Stat. Assoc. 2011, 106, 1513–1527. [Google Scholar] [CrossRef]
  44. Sulandari, W.; Subanar, S.; Suhartono, S.; Utami, H. Forecasting electricity load demand using hybrid exponential smoothing-artificial neural network model. Int. J. Adv. Intell. Inform. 2016, 2, 131–139. Available online: http://www.ijain.org/index.php/IJAIN/article/view/69 (accessed on 20 February 2017). [CrossRef]
  45. Smyl, S. A hybrid method of exponential smoothing and recurrent neural networks for time series forecasting. Int. J. Forecast. 2020, 36, 75–85. [Google Scholar] [CrossRef]
  46. Makridakis, S.; Spiliotis, E.; Assimakopoulos, V. The M4 Competition: Results, findings, conclusion and way forward. Int. J. Forecast. 2018, 34, 802–808. [Google Scholar] [CrossRef]
  47. Makridakis, S.; Spiliotis, E.; Assimakopoulos, V. The M4 Competition: 100,000 time series and 61 forecasting methods. Int. J. Forecast. 2020, 36, 54–74. [Google Scholar] [CrossRef]
  48. Taylor, S.J.; Letham, B. Forecasting at scale. Am. Stat. 2018, 72, 37–45. [Google Scholar] [CrossRef]
  49. Wei, W.W.S. Time Series Analysis: Univariate and Multivariate Methods, 2nd ed.; Pearson Addison-Wesley: Boston, MA, USA, 2006; Available online: http://tocs.ulb.tu-darmstadt.de/130292508.pdf (accessed on 6 January 2017).
  50. Montero-Manso, P.; Athanasopoulos, G.; Hyndman, R.J.; Talagala, T.S. FFORMA: Feature-based forecast model averaging. Int. J. Forecast. 2020, 36, 86–92. [Google Scholar] [CrossRef]
  51. Acar, E.; Rais-Rohani, M. Ensemble of metamodels with optimized weight factors. Struct. Multidisc. Optim. 2009, 37, 279–294. [Google Scholar] [CrossRef]
  52. Petropoulos, F.; Hyndman, R.J.; Bergmeir, C. Exploring the sources of uncertainty: Why does bagging for time series forecasting work? Eur. J. Oper. Res. 2018, 268, 545–554. [Google Scholar] [CrossRef]
  53. Hao, J.; Feng, Q.; Suo, W.; Gao, G.; Sun, X. Ensemble forecasting for electricity consumption based on nonlinear optimization. Procedia Comput. Sci. 2019, 162, 19–24. [Google Scholar] [CrossRef]
  54. Hao, J.; Sun, X.; Feng, Q. A novel ensemble approach for the forecasting of energy demand based on the artificial bee colony algorithm. Energies 2020, 13, 550. [Google Scholar] [CrossRef]
  55. Hyndman, R.J.; Koehler, A.B. Another look at measures of forecast accuracy. Int. J. Forecast. 2006, 22, 679–688. [Google Scholar] [CrossRef]
  56. Chicco, D.; Warrens, M.J.; Jurman, G. The coefficient of determination R-squared is more informative than SMAPE, MAE, MAPE, MSE, and RMSE in regression analysis evaluation. PeerJ Comput. Sci. 2021, 7, e623. [Google Scholar] [CrossRef]
  57. Hyndman, R.J.; Khandakar, Y. Automatic time series forecasting: The forecast package for R. J. Stat. Softw. 2008, 27, 1–23. [Google Scholar] [CrossRef]
  58. Hyndman, R.J.; Koehler, A.B.; Snyder, R.D.; Grose, S. A state space framework for automatic forecasting using exponential smoothing methods. Int. J. Forecast. 2002, 18, 439–454. [Google Scholar] [CrossRef]
  59. Hyndman, R.; Koehler, A.B.; Ord, J.K.; Snyder, R.D. Forecasting with Exponential Smoothing: The State Space Approach; Springer: Berlin/Heidelberg, Germany, 2008. [Google Scholar]
  60. Hyndman, R.; Athanasopoulos, G. Forecasting: Principles and Practice, 3rd ed.; OTexts: Melbourne, Australia, 2021; Available online: https://otexts.com/fpp3/ (accessed on 30 May 2023).
  61. Usmelec Function-RDocumentation. Available online: https://www.rdocumentation.org/packages/fpp/versions/0.5/topics/usmelec (accessed on 30 May 2023).
  62. Predicting Hourly Electricity Demand in Ontario|Statistical Society of Canada. Available online: https://ssc.ca/en/case-study/predicting-hourly-electricity-demand-ontario (accessed on 30 May 2023).
  63. Half-Hourly Electricity Demand—Taylor. Available online: https://pkg.robjhyndman.com/forecast/reference/taylor.html (accessed on 30 May 2023).
  64. R: Half-Hourly Electricity Demand for Victoria, Australia. Available online: https://search.r-project.org/CRAN/refmans/tsibbledata/html/vic_elec.html (accessed on 30 May 2023).
  65. Sulandari, W.; Subanar, S.; Suhartono, S.; Utami, H. Forecasting Time Series with Trend and Seasonal Patterns Based on SSA. In Proceedings of the 2017 3rd International Conference on Science in Information Technology (ICSITech) “Theory and Applicattion of IT for Education, Industry and Society in Big Data Era”, Bandung, Indonesia, 25–26 October 2017; pp. 694–699. [Google Scholar]
Figure 1. Stages of this research.
Figure 2. US monthly electricity total net generation from January 1973 to May 2001.
Figure 3. Hourly electricity demand in Ontario for 1 August 2016, 00:00 to 31 December 2016, 23:00.
Figure 4. Half-hourly electricity demand in England and Wales for 5 June 2000, 00:00:00 to 24 July 2000, 00:30:00.
Figure 5. Half-hourly electricity demand in Victoria for 12 November 2014, 00:00:00 to 31 December 2014, 12:30:00.
Figure 6. MAPEs of the testing data for Data 1 using the nine models.
Figure 7. MAPEs of the testing data for Data 2 using the nine models.
Figure 8. MAPEs of the testing data for Data 3 using the nine models.
Figure 9. Comparison of the actual and forecast values obtained by the Prophet, EnA, EnL, and EnNL models for the testing data of Data 3.
Figure 10. MAPEs of the testing data for Data 4 using the nine models.
Table 1. Characteristics of the four datasets discussed in this study.

Data | Name | Frequency | Number of Observations in the Training Data | Characteristic
1 | usmelec | Monthly | 317 | Linear trend, increasing seasonal variation
2 | ontario | Hourly | 3648 | Nonlinear trend, relatively constant seasonal variation
3 | taylor | Half-hourly | 2354 | No trend, constant seasonal variation
4 | vic_elec | Half-hourly | 2354 | Fluctuating level and seasonal variation
Table 2. Comparison of RMSEs and MAPEs for the training data obtained from the nine models.

Model | RMSE Data 1 | RMSE Data 2 | RMSE Data 3 | RMSE Data 4 | MAPE Data 1 | MAPE Data 2 | MAPE Data 3 | MAPE Data 4
ARIMA | 6.57 | 237.78 | 1010.63 | 464.98 | 2.13% | 0.96% ² | 2.39% | 7.35%
ETS | 6.38 | 273.51 | 481.95 | 89.21 | 2.02% | 1.29% ² | 1.22% ² | 1.35% ²
DSHW | 6.63 | 253.71 | 251.33 | 40.46 | 2.16% | 1.13% ² | 0.65% ² | 0.67% ²
TBATS | 6.40 | 246.49 | 229.59 | 43.99 | 2.03% | 1.05% ² | 0.60% ² | 0.75% ²
NN | 8.37 | 158.20 | 171.36 | 35.04 | 2.92% | 0.69% ² | 0.44% ² | 0.61% ²
Prophet | 8.45 | 1118.17 | 1467.91 | 333.53 | 2.93% | 5.58% | 4.16% | 5.96%
EnA | 6.17 | 261.69 | 359.83 | 119.79 | 1.98% ² | 1.27% ² | 0.95% ² | 2.04%
EnL | 6.04 | 157.78 | 177.46 | 29.98 | 1.94% ² | 0.70% ² | 0.46% ² | 0.52% ²
EnNL | 5.60 ¹ | 149.17 ¹ | 156.55 ¹ | 28.42 ¹ | 1.89% ¹,² | 0.67% ¹,² | 0.40% ¹,² | 0.49% ¹,²
¹ The smallest value in each column. ² MAPE is less than or equal to 2%.
Table 3. Comparison of RMSEs and MAPEs for the testing data obtained from the nine models.

Model | RMSE Data 1 | RMSE Data 2 | RMSE Data 3 | RMSE Data 4 | MAPE Data 1 | MAPE Data 2 | MAPE Data 3 | MAPE Data 4
ARIMA | 9.61 | 859.75 | 5767.47 | 180.96 | 2.80% | 4.85% | 14.39% | 3.32%
ETS | 11.19 | 986.61 | 11,724.40 | 186.51 | 2.66% | 5.59% | 27.76% | 3.56%
DSHW | 9.87 | 1141.28 | 2094.31 | 590.19 | 2.74% | 6.67% | 5.35% | 12.94%
TBATS | 11.66 | 918.67 | 2903.67 | 103.29 | 2.73% | 5.17% | 7.14% | 2.00% ²
NN | 12.29 | 1160.34 | 2357.52 | 95.41 | 3.05% | 6.52% | 6.96% | 1.98% ²
Prophet | 16.27 | 619.86 ¹ | 732.69 | 146.06 | 4.08% | 3.60% ¹ | 1.95% ¹,² | 2.85%
EnA | 10.30 | 904.52 | 3488.10 | 107.37 | 2.74% | 5.16% | 8.01% | 2.22%
EnL | 10.92 | 1138.94 | 644.24 | 228.03 | 2.75% | 6.41% | 2.03% | 4.99%
EnNL | 4.61 ¹ | 1166.62 | 627.41 ¹ | 76.55 ¹ | 1.11% ¹,² | 6.56% | 2.12% | 1.54% ¹,²
¹ The smallest value in each column. ² MAPE is less than or equal to 2%.
Table 4. Comparison of MAPEs on the testing data of Data 2 with lead time forecast values up to seven steps ahead obtained from the nine models.

Model | h = 1 | h = 2 | h = 3 | h = 4 | h = 5 | h = 6 | h = 7
ARIMA | 0.50% | 0.63% | 0.44% | 0.53% | 0.85% | 1.43% | 2.37%
ETS | 1.03% | 1.39% | 1.43% | 1.30% | 1.10% | 1.30% | 1.91%
DSHW | 0.13% | 0.13% | 0.42% | 0.87% | 1.48% | 2.30% | 3.39%
TBATS | 0.55% | 0.84% | 0.81% | 0.72% | 0.93% | 1.52% | 2.66%
NN | 0.76% | 0.89% | 0.74% | 0.56% | 0.65% | 1.09% | 2.07%
Prophet | 2.50% | 2.91% | 3.11% | 3.02% | 2.49% | 2.86% | 3.66%
EnA | 0.87% | 1.09% | 1.02% | 0.77% | 0.93% | 1.49% | 2.45%
EnL | 0.86% | 1.02% | 0.90% | 0.72% | 0.75% | 1.14% | 2.07%
EnNL | 0.75% | 0.87% | 0.72% | 0.55% | 0.64% | 1.08% | 2.05%
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
