Article

Analysis of Meteorological Factor Multivariate Models for Medium- and Long-Term Photovoltaic Solar Power Forecasting Using Long Short-Term Memory

1
Department of Information and Communication Engineering, College of ICT Convergence, Honam University, Gwangsan-gu, Gwangju 62399, Korea
2
Department of Computer Engineering, Mokpo National University, Jeollannam-do 58554, Korea
*
Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(1), 316; https://doi.org/10.3390/app11010316
Submission received: 23 November 2020 / Revised: 23 December 2020 / Accepted: 28 December 2020 / Published: 30 December 2020
(This article belongs to the Section Energy Science and Technology)

Abstract

Solar power generation is an increasingly popular renewable energy topic. Photovoltaic (PV) systems are installed on buildings to efficiently manage energy production and consumption. Because of its physical properties, electrical energy is produced and consumed simultaneously; therefore, solar energy must be predicted accurately to maintain a stable power supply. To develop an efficient energy management system (EMS), 22 multivariate numerical models were constructed by combining solar radiation, sunlight, humidity, temperature, cloud cover, and wind speed. The performance of the models was compared by applying a modified version of the traditional long short-term memory (LSTM) approach. The experimental results showed that the six meteorological factors influence the solar power forecast regardless of the season. These are, from most to least important: solar radiation, sunlight, wind speed, temperature, cloud cover, and humidity. The models were rated for their suitability for medium- and long-term solar power forecasts, and the modified LSTM demonstrates better performance than the traditional LSTM.

1. Introduction

Recently, considerable research has been conducted on low pollution, renewable energy sources to address carbon emissions and environmental problems, including the Republic of Korea’s “Implementation Plan for Renewable Energy 2030” [1]. This proposed strategy would increase the country’s share of renewable energy from 7.6% (15.1 GW) in 2017 to 20% (63.8 GW) in 2030 by encouraging the use of clean energy such as solar, wind, hydropower, biofuels, and waste recycling. Among these energy sources, solar power generation is one of the most widespread renewable energy industries and has been used extensively as an alternative to existing power generation methods. The solar power supply was expected to account for 30% (5.7 GW) of the country’s renewable energy in 2017, and supply 57% (36.5 GW) of renewables by 2030. Photovoltaic (PV) power generation is the most efficient among renewable energy resources for expanding small-scale, distributed power supplies, and it is expected to continue to grow because its power generation costs are decreasing the fastest [1,2]. Power supply utility companies operate large, centralized power plants to meet supply and demand for electricity. The utilities have generators to supply baseload power, and additional capacity to meet peaks in power demand. Most have the option to import additional production from other power companies if necessary, to meet high demand loads, resulting in some clear limitations. Some power plants can take a long time to come online at maximum production, while others are expensive to operate and are only used for peak-shaving. The demand on some plants can at times be greater than the total power plant output. Power supply companies try to prepare for surges and declines in electrical demand by controlling the energy production and transmission system through the use of predicted power consumption, and operating via demand response (DR) that helps to effectively reduce peaks [3,4].
The energy management system (EMS) controls energy in all areas of the power system, including power generation, transmission, distribution, and consumption. The EMS software has been adapted for buildings (BEMS), factories (FEMS), and homes (HEMS). The purpose of the EMS is to ensure efficiency in energy use and operation, although individual systems and details differ depending on the area of use. Until recently, electric power systems have consisted of centralized power generation and one-way, distributed consumption, with little more than experience and guesswork to manage and operate energy grids. The development of smart grids incorporating various types of EMS aids intelligent and efficient operation of power systems. The prediction of small-scale, distributed solar power generation that may feed into these electrical grids is very important [5,6].
For solar power forecasting approaches, different methodologies are preferred, depending on factors such as forecasting horizons, model inputs, and data characteristics. These methods can be broadly divided into three categories: physical, data-driven, and hybrid approaches.
(1)
The physical approach forecasts solar power using mathematical modeling that takes into consideration weather data (air pressure, temperature, humidity, jet streams, etc.) and environmental characteristics (orography, topography, land use, etc.) [7,8,9]. This approach requires a large volume of data, because the accuracy of the prediction increases in proportion to the amount of data. As such, the predictive model is often difficult to construct, and the resulting model structure is complicated because it must mathematically deal with many variables. The calculation process is complicated and requires a long computation time. The most widely used application for physical models is numerical weather prediction (NWP) [10,11,12], which is more suitable for long-term forecasting than for short-term and medium-term forecasting because of the large number of computational resources utilized.
(2)
The data-driven approach predicts solar power through pattern analysis trained on past data in time increments such as 15 min, one hour, or one day. This method is suitable for predicting solar power within a short period of time, and its accuracy improves as the amount of past data increases. Data-driven approaches can be either statistical or artificial intelligence methods. Statistical methods can achieve high accuracy in short-term forecasting but cannot accurately forecast long-term solar power because of the progressive accumulation of errors. An example of a statistical approach is the auto regression integrated moving average (ARIMA) model, a time series analysis model [13] that has many disadvantages; several approaches have been suggested to work around these limitations [14,15]. In contrast, artificial intelligence methods include neural networks [16], fuzzy inference [17], particle swarm optimization [18], genetic algorithms [19], support vector machines [20], and deep learning [21], which utilizes long short-term memory [22], gated recurrent units [23], autoencoders [24], and convolutional neural networks [25]. Artificial intelligence approaches have superior performance for general purposes but are limited because the relationships between model elements cannot be accurately explained.
(3)
Hybrid approaches produce predictions by applying a statistical technique after using a physical approach. This method combines the advantages of physical and statistical approaches. In other words, traditional physical approaches are more suitable for long-term forecasting because of their large scale, while the data-driven approaches are accurate for short-term forecasting but accumulate errors in long-term forecasts. Traditional hybrid approaches suffer from many errors, and studies are underway to improve their predictions [26,27,28,29]. Recently, hybrid approaches have been developed by combining machine learning and fuzzy theory, machine learning and deep learning, and multiple types of deep learning [30,31,32,33].
The objectives of this study can be summarized as follows:
(1)
A total of 22 numerical models were constructed for correlation analysis to assess the influence of meteorological factors such as solar radiation (SR), sunlight, humidity, temperature, cloud cover (CC), and wind speed (WS) on PV solar power forecasting.
(2)
Traditional long short-term memory (LSTM) is limited when predicting solar power generation in the time step between predictions, because small inaccuracies from the previous step compound and continuously increase the prediction error. The goal of the modified LSTM was to obtain the observed value from the time interval between predictions, and use the observed value instead of the predicted value to update the network for solar power generation forecasts.
The correlation analysis between solar power and meteorological factors was verified by collecting data from a building located in Ansan (Location “A” in Figure 1), Gyeonggi-do, Korea, from April 2018 to March 2019 in one-hour increments. Weather data for accurately forecasting solar power would ideally be collected by attaching sensors to this building; however, the building is located in an industrial complex with only a temporary solar power installation and no weather instrumentation. Therefore, the meteorological data were collected from the closest meteorological observation point at Suwon (Location “B” in Figure 1), Gyeonggi-do, Korea. Data from the Suwon observation point were collected on an hourly basis by accessing the open weather data portal of the Korea Meteorological Administration [34].
The organization of this study is as follows: Section 2 describes the correlations between solar power and meteorological factors based on data from April 2018. Recurrent neural networks (RNNs) and LSTM, which have superior performance among the artificial intelligence approaches, are explained and the limitations of traditional LSTM are introduced. Section 3 describes the solar PV data set and the 22 models used to verify the proposed approach. Section 4 presents the results from the analysis of the monthly and annual average solar power calculations for each of the models, and Section 5 outlines findings, conclusions and plans for future work. Abbreviations mentioned in this article are described in Table A1 of Appendix A.

2. Materials and Methods

2.1. Correlations between Solar Power and Meteorological Factors

Previous studies have indicated that the production of electricity from solar power plants is closely related to the amount of available sunlight [35,36]; in other words, the most important consideration for solar power generation is the amount of sunlight received by the PV panel. Therefore, more power is generated during the summer, when the sunshine duration is long, than during the winter. There is also a close correlation between the amount of daily sunshine and dust pollution, such as yellow sand, and automobile pollution (smog). Solar power generation is therefore slightly higher in May, June, September, and October than in July and August, when the sunshine duration is long but the amount of power generated is reduced by high dust and smog levels and temperatures exceeding 30 °C.
Scatterplots of the correlations between the meteorological factors and solar power (SP) in April 2018 are shown in Figure 2, and a comparison of the performance of the correlations between SP and various meteorological factors is listed in Table 1, where the best values are bolded. The performance of these relationships is evaluated based on different statistical measures, such as the sum of squared error (SSE), correlation coefficient (R), coefficient of determination (R2), and root mean square error (RMSE). The correlation indicates the degree of association between the two variables. Rs approaching +1 indicate strong positive correlations, those approaching −1 indicate strong negative correlations, and those approaching 0 indicate an almost negligible linear relationship. The smaller the SSE and RMSE, the better the estimated regression line.
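The regression statistics reported in Table 1 are standard quantities; the sketch below (our own illustrative helper, not the authors' code; the function name `regression_metrics` is hypothetical) computes SSE, R, R², and RMSE for a least-squares line fitted to two series:

```python
import math
from statistics import mean

def regression_metrics(x, y):
    """Fit y = a*x + b by least squares and report SSE, R, R^2, RMSE."""
    n = len(x)
    mx, my = mean(x), mean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    a = sxy / sxx                        # slope
    b = my - a * mx                      # intercept
    resid = [yi - (a * xi + b) for xi, yi in zip(x, y)]
    sse = sum(e ** 2 for e in resid)     # sum of squared errors
    r = sxy / math.sqrt(sxx * syy)       # correlation coefficient
    return sse, r, r ** 2, math.sqrt(sse / n)

# Perfectly linear toy data (y = 2x + 1): R = 1 and SSE = RMSE = 0.
sse, r, r2, rmse = regression_metrics([1.0, 2.0, 3.0, 4.0],
                                      [3.0, 5.0, 7.0, 9.0])
```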
Table 2 shows statistical data on SP and the meteorological factors: minimum, maximum, mean, median, standard deviation, and range. The standard deviation describes the dispersion of the data; the smaller the standard deviation, the closer the values are to the mean. The results show that the relationship between SP and SR is the strongest, followed by the relationships between SP and sunshine, humidity, temperature, WS, and CC.

2.2. PV System and Measured Method

A PV system is designed to supply usable SP by means of PV cells. The building structure is H-beam with sandwich panels for the exterior walls, and the PV system capacity is 150 kW. Figure 3 shows the wiring diagram of the SP system. The solar panels are connected in series and parallel and feed a string-inverter system with three inverters (50 kW × 3). PV power is measured with a 3P/4W electronic watt–hour meter. The meter records in 15-min intervals, but the data were collected in 1-h units in this study.
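Rolling the 15-min meter readings up into the 1-h units used in this study can be sketched as follows (an illustrative helper assuming energy readings in kWh; `to_hourly` and the variable names are our own, not the authors' pipeline):

```python
from collections import defaultdict
from datetime import datetime, timedelta

def to_hourly(readings):
    """Sum (timestamp, kWh) readings into hourly buckets."""
    hourly = defaultdict(float)
    for ts, kwh in readings:
        # Truncate the timestamp to the top of the hour.
        hourly[ts.replace(minute=0, second=0, microsecond=0)] += kwh
    return dict(hourly)

# Four 15-min readings within one hour collapse to a single hourly total.
start = datetime(2018, 4, 1, 10, 0)
readings = [(start + timedelta(minutes=15 * i), 2.5) for i in range(4)]
hourly = to_hourly(readings)
```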

2.3. Recurrent Neural Networks

In machine learning, RNNs are mainly used for analysis of time series data and sequence data [36]. The input of the multilayer neural network is activated only in the output direction, and the node of the hidden layer cannot utilize the past information (input); however, the RNN has a circular internal structure, as shown in Figure 4, which allows past information to be used. The RNN receives and processes the elements constituting time series data or sequence data one at a time, creates a corresponding internal node according to the input time, and stores the information processed up to that point. Information stored in this way can be output through the output node. Because of these characteristics, RNNs have been used in fields such as speech recognition, language recognition, and handwriting recognition with good results [37].
In Figure 4, the weight between the input layer and the hidden layer is represented by U, the weight between the hidden layer and the output layer is represented by V, and the weight between the hidden layers is represented by W. Since RNNs share weights, they have the same weights U, V, and W at all times. Weight sharing improves prediction ability, saves computation time, and is advantageous when the number of inputs is variable:
h_t = tanh(U × x_t + W × h_{t−1}),   (1)
o_t = σ(V × h_t).   (2)
In Equations (1) and (2), x_t is the value of the input layer at time t, h_{t−1} is the value of the hidden layer at time t − 1, and o_t is the value of the output layer at time t. The activation functions used for h_t and o_t are the hyperbolic tangent (tanh) and sigmoid (σ) functions, respectively.
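Equations (1) and (2) can be sketched as a single scalar recurrence (an illustrative toy, not the authors' code; the weight values chosen for U, W, and V are arbitrary):

```python
import math

def rnn_step(x_t, h_prev, U, W, V):
    """One vanilla-RNN step: h_t = tanh(U*x_t + W*h_{t-1}), o_t = sigmoid(V*h_t)."""
    h_t = math.tanh(U * x_t + W * h_prev)
    o_t = 1.0 / (1.0 + math.exp(-V * h_t))  # sigmoid activation
    return h_t, o_t

# The same shared weights U, W, V are reused at every time step.
h, outputs = 0.0, []
for x in [0.1, 0.5, 0.9]:
    h, o = rnn_step(x, h, U=0.6, W=0.4, V=1.2)
    outputs.append(o)
```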

2.4. Long Short-Term Memory

RNNs have the advantages described in the previous section as well as the following limitations. (1) As shown previously in Figure 4, multiple neural networks are connected, and the same weight is multiplied several times, resulting in gradient explosion and gradient vanishing problems [38]. (2) RNNs reflect information from the past, but as time progresses the earlier information arrives in a weakened state, making it difficult for the network to incorporate older data. (3) Finally, RNNs have difficulty distinguishing between information that should be continuously passed on and unnecessary data that should be deleted. Therefore, an LSTM, which uses a new “gate” concept to solve the gradient vanishing and information transmission problems, can be utilized [39]; this gate decides whether to remember or delete the current data. Representative applications of LSTMs include handwriting recognition [40], handwriting generation [41], speech recognition [42], machine translation [43], and image captioning [44,45].
Figure 5 shows the structure of the LSTM. The interior of the LSTM block consists of a cyclic memory cell and three types of gates (input, forget, and output gates). The LSTM calculates the final output through a hidden layer like the RNN but controls the flow of information by appropriately using the three gates during the calculation of the hidden layer. The input gate controls the amount of the current input state (c_t) flowing to the current cell state (s_t). The forget gate controls the amount of the cell state before time t (s_{t−1}) flowing to the cell state at time t (s_t). The states of the LSTM cells are computed as follows in Equations (3)–(5) [46,47], where g_t, f_t, and q_t are the input, forget, and output gates, respectively:
g_t = σ(U_g × x_t + W_g × h_{t−1} + b_g),   (3)
f_t = σ(U_f × x_t + W_f × h_{t−1} + b_f),   (4)
q_t = σ(U_q × x_t + W_q × h_{t−1} + b_q).   (5)
The input state c_t is calculated by applying U_c, W_c, and b_c to the input data at time t and the hidden layer (h_{t−1}) at time t − 1. The cell state s_t combines the current information using c_t, g_t, and f_t. The hidden node (h_t) is calculated using s_t and q_t. Finally, the current output state (o_t) is obtained by multiplying the hidden layer by V:
c_t = tanh(U_c × x_t + W_c × h_{t−1} + b_c),   (6)
s_t = f_t × s_{t−1} + g_t × c_t,   (7)
h_t = q_t × tanh(s_t),   (8)
o_t = V × h_t.   (9)
In Equations (6)–(9), U_g, U_f, U_q, U_c, W_g, W_f, W_q, and W_c are the weights applied to the input value x_t and the previous hidden layer (h_{t−1}) for each gate, respectively; b_g, b_f, b_q, and b_c are the biases of g_t, f_t, q_t, and c_t, respectively.
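Equations (3)–(9) combine into one cell update per time step; the scalar sketch below (our own illustration with arbitrary toy parameters, not the authors' implementation) follows the equations term by term:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x_t, h_prev, s_prev, p):
    """One scalar LSTM step; p maps weight/bias names (U_*, W_*, b_*, V) to values."""
    g_t = sigmoid(p["Ug"] * x_t + p["Wg"] * h_prev + p["bg"])    # input gate, Eq. (3)
    f_t = sigmoid(p["Uf"] * x_t + p["Wf"] * h_prev + p["bf"])    # forget gate, Eq. (4)
    q_t = sigmoid(p["Uq"] * x_t + p["Wq"] * h_prev + p["bq"])    # output gate, Eq. (5)
    c_t = math.tanh(p["Uc"] * x_t + p["Wc"] * h_prev + p["bc"])  # input state, Eq. (6)
    s_t = f_t * s_prev + g_t * c_t                               # cell state, Eq. (7)
    h_t = q_t * math.tanh(s_t)                                   # hidden state, Eq. (8)
    o_t = p["V"] * h_t                                           # output, Eq. (9)
    return h_t, s_t, o_t

# Arbitrary toy parameters, all set to 0.5.
params = {k: 0.5 for k in ("Ug", "Wg", "bg", "Uf", "Wf", "bf",
                           "Uq", "Wq", "bq", "Uc", "Wc", "bc", "V")}
h, s, o = lstm_step(0.2, 0.0, 0.0, params)
```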

2.5. Limitations of the Traditional LSTM

Figure 6 shows the solar power forecast results using a traditional LSTM for January 2019. The traditional LSTM uses the previous prediction value in the time step when forecasting solar power; if the prediction value is incorrect, the error is fed back into subsequent steps and the prediction error continuously increases. Figure 6a compares the observed and predicted values, and Figure 6b shows the prediction error between them. Therefore, since the observed value is important for the prediction, in this study the weight is applied to the observed value instead of the previous prediction, and the network state is then updated for the forecast.
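This compounding-error behaviour can be reproduced with a toy model: feeding each prediction back as the next input (closed loop) lets a small bias grow every step, while conditioning each step on the actual observation (the idea behind the proposed modification) keeps the error bounded. A sketch under those assumptions, not the paper's network:

```python
def closed_loop_forecast(model, last_obs, steps):
    """Traditional style: each prediction becomes the next input."""
    preds, x = [], last_obs
    for _ in range(steps):
        x = model(x)        # consumes its own previous prediction
        preds.append(x)
    return preds

def open_loop_forecast(model, observed):
    """Proposed style: each step is conditioned on the actual observation."""
    return [model(x) for x in observed]

biased = lambda x: 1.05 * x  # a toy model with a 5% multiplicative bias
cl = closed_loop_forecast(biased, last_obs=1.0, steps=3)   # bias compounds each step
ol = open_loop_forecast(biased, observed=[1.0, 1.0, 1.0])  # bias stays at 5%
```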

3. Proposed Approaches

This section explains the test dataset of SP, proposes 22 models for SP and weather data, and describes the proposed method in detail.

3.1. Test Dataset

In this study, data were collected from a building located in Ansan, Gyeonggi-do, Korea. The collection period was from April 2018 to March 2019, and SP generation and meteorological data were collected every hour. Table 3 shows the monthly data collected and the total data collected for one year.

3.2. Multivariate Numerical Models

Table 4 shows the 22 multivariate numerical models constructed by combining SP with the six meteorological elements that have the greatest effect on solar power: SR, sunlight, CC, WS, humidity, and temperature. In this study, performance tests were conducted on the traditional and proposed LSTMs with these 22 models.

3.3. Proposed Method

In the following sections, the pre-processing step is outlined for data standardization and division into training and test data. The proposed LSTM is then described, and a post-processing step is applied to the simulation results.

3.3.1. Data Pre-Processing

Data collected on a monthly and yearly basis are divided into training (80%) and test (20%) data. To prevent divergence in the training set, the training data were standardized as shown in Equation (10) so that the average is 0 and the variance is 1. The test data were standardized as shown in Equation (11) at the time of prediction using the same parameters as the training data:
Train_std = (Train_i − Train_mean) / Train_sig,   (10)
Test_std = (Test_i − Test_mean) / Test_sig.   (11)
In Equation (10), Train_i is the training data, and Train_mean, Train_sig, and Train_std are the mean, standard deviation, and standardized value of the training data, respectively. In Equation (11), Test_i is the test data, and Test_mean, Test_sig, and Test_std are the mean, standard deviation, and standardized value of the test data, respectively.
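Equations (10) and (11) are plain z-score standardization; a minimal sketch (the function and variable names are our own):

```python
from statistics import mean, pstdev

def standardize(values):
    """Equations (10)-(11): z-score a series and return its statistics."""
    m, s = mean(values), pstdev(values)
    return [(v - m) / s for v in values], m, s

train_data = [1.0, 2.0, 3.0, 4.0]
train_std, train_mean, train_sig = standardize(train_data)
# The standardized series has mean 0 and (population) standard deviation 1.
```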

3.3.2. Model Setting and Training

In the traditional LSTM, the previously predicted value affects the prediction of new data; this should not occur. Therefore, the neural network state is reset before each sequence is predicted, which prevents previous predictions from affecting new predictions. Since the observed value of the time step between predictions can be accessed, the neural network state is updated using the observed value instead of the predicted value, as in Equation (12):
s_t = (1 − α) × f_t × s_{t−1} + α × g_t × c_t.   (12)
In Equation (12), the previous cell state and forget gate do not reflect changes in the current input data well, which causes many errors. Therefore, the weight (α = 0.8) is applied to g_t and c_t rather than to the previous cell state and f_t, so that the current data are not lost and the performance improves.
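Equation (12) is a one-line change to the cell-state update of Equation (7); a sketch (the function name is ours):

```python
def modified_cell_state(f_t, s_prev, g_t, c_t, alpha=0.8):
    """Equation (12): weight the current input-gate contribution by alpha
    and the previous cell state by (1 - alpha)."""
    return (1.0 - alpha) * f_t * s_prev + alpha * g_t * c_t

# With alpha = 0.8, the update is dominated by the current input:
s_t = modified_cell_state(f_t=0.9, s_prev=1.0, g_t=0.5, c_t=0.4)
# 0.2*0.9*1.0 + 0.8*0.5*0.4 = 0.18 + 0.16 = 0.34
```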
For training and testing, the model was configured as follows, which yielded the best performance: the hidden LSTM layer consisted of 200 blocks, the maximum number of iterations was fixed at 250, the initial learning rate was 0.005, the activation function was the ReLU (rectified linear unit) [48], and the optimizer was Adam [49].
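For reimplementation, the reported hyperparameters can be gathered into a single configuration structure (the dictionary and its key names are our own convenience, not the authors' MATLAB settings object):

```python
# Hyperparameters reported in Section 3.3.2.
lstm_config = {
    "hidden_units": 200,             # LSTM blocks in the hidden layer
    "max_iterations": 250,           # maximum number of training iterations
    "initial_learning_rate": 0.005,
    "activation": "relu",            # rectified linear unit [48]
    "optimizer": "adam",             # Adam optimizer [49]
}
```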

3.3.3. Post-Processing

Standardization was used in the pre-processing stage to prevent divergence of the training and test data; the standardized predictions are converted back to the original scale using Equations (13) and (14), respectively:
Train_pred = Train_sig × Train_pred_std + Train_mean,   (13)
Test_pred = Test_sig × Test_pred_std + Test_mean,   (14)
where Train_pred_std and Test_pred_std are the standardized predictions, and Train_pred and Test_pred are the predictions on the original scale.
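De-standardization simply inverts the z-score applied in the pre-processing step; a sketch (the function name is ours):

```python
def destandardize(pred_std, sig, mean_):
    """Equations (13)-(14): map standardized predictions back to the original scale."""
    return [sig * p + mean_ for p in pred_std]

# Inverts z = (v - mean)/sig: z-scores -1, 0, 1 with sig=2, mean=5 map to 3, 5, 7.
restored = destandardize([-1.0, 0.0, 1.0], sig=2.0, mean_=5.0)
```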
Figure 7 shows the solar power forecast results using the proposed LSTM for January 2019. Figure 7a compares the observed and predicted values, and Figure 7b shows the prediction error between them. As shown in Figure 7, the proposed method is more accurate than the traditional LSTM used in the experiment. The proposed LSTM can adapt to fluctuations in the observed data, which improves its accuracy and performance. Additionally, the prediction error of the traditional LSTM is 6.68, while that of the proposed LSTM is 2.07. Therefore, the proposed LSTM method is more effective and adaptive for solar power forecasting.

4. Results and Discussion

4.1. Test Environment and Metrics for Evaluation

To verify the traditional and proposed methods, experiments were performed on a PC equipped with an Intel Xeon (R) E-2136 3.31 GHz CPU and 32 GB RAM. The operating system was Windows 10 (64 bit), and the experiments used the Deep Learning Toolbox and the Statistics and Machine Learning Toolbox of MATLAB R2019a [50]. Finally, the MAE (mean absolute error), RMSE, and computation time (s) were adopted to evaluate the solar power forecasting error, as shown in Equations (15) and (16):
MAE = (1/n) × Σ_{i=1}^{n} |y_i − ŷ_i|,   (15)
RMSE = √[(1/n) × Σ_{i=1}^{n} (y_i − ŷ_i)²],   (16)
where y i and y ^ i indicate the observed and predicted values, respectively, and n is the number of test data points.
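Equations (15) and (16) in code form (a minimal sketch, not the authors' MATLAB implementation):

```python
import math

def mae(y, y_hat):
    """Equation (15): mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, y_hat)) / len(y)

def rmse(y, y_hat):
    """Equation (16): root mean square error."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, y_hat)) / len(y))

obs, pred = [3.0, 4.0, 5.0], [2.0, 4.0, 7.0]
# MAE = (1 + 0 + 2)/3 = 1.0; RMSE = sqrt((1 + 0 + 4)/3) ≈ 1.291
```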

4.2. Performance Comparison between the Monthly and Annual Averages

The weather in Korea 30 years ago was less unusual than it is now, with spring from March to May, summer from June to August, autumn from September to November, and winter from December to February. However, Korean weather patterns have now compressed spring to April–May and autumn to October–November, while the summer season (June to September) and winter season (December to March) have become longer. As shown in Figure 8, the impact of the meteorological factors on solar power, from greatest to least, was SR > sunlight > WS > temperature > CC > humidity [51]. Thus, while five of the meteorological factors had a substantial impact on solar power generation, humidity does not significantly affect the month-to-month solar power generation forecast.
Table 5 shows the results of the proposed LSTM method for a multivariate model combining the six considered meteorological factors (with the RMSE < 1.5). Among the 22 multivariate models, model 16 was the best at accurately predicting solar power regardless of the season. The input parameters of model 16 are SP, SR, sunlight, and WS. Model 1 has the advantage of a short computation time because it has only a few input parameters; however, since the model learns with previous SP data values, it cannot accurately predict solar power. As shown in Table 5, humidity and WS do not significantly affect the solar power forecasts individually; however, models 19, 21, and 22 have smaller RMSE values owing to the correlation between humidity and wind speed. The results in Table 5 indicate that the models cannot accurately predict SP given the seasonal weather conditions in Korea, but model 16 is better for long-term solar power forecasting. The models with more checkmarks in the columns of Table 5 are better for medium-term solar power forecasting, depending on the regional and climatic conditions.
Table 6 and Table 7 present the performance comparisons of the traditional and proposed LSTMs, respectively, for medium-term and long-term solar power forecasting by the models. Table 6 compares the performance of medium-term solar power forecasting for April, which represents the spring season in Korea; Δ denotes the difference between the traditional and proposed LSTMs. The medium-term solar power forecasting results for August, October, and January, included in Table A2, Table A3 and Table A4 of Appendix B, represent the Korean summer, autumn, and winter, respectively.
Figure 9 and Figure 10 show the results of 22 models for the proposed and traditional LSTM by month (January, April, August, and October). The 22 models of the proposed LSTM forecast values similar to the observed values, but the 22 models of the traditional LSTM predict values that differ from the observed values.
Figure 11 shows the best and worst models for the proposed and traditional LSTMs by month (January, April, August, and October), respectively. When compared seasonally, models 2 and 6 show the best performance in the traditional and proposed methods, whereas models 14, 17, and 19 have the worst performance.
Figure 12 compares the RMSE values determined by the models for the monthly and yearly averages. Figure 12a compares the monthly and yearly averages by model, and Figure 12b shows the difference between them. As shown in Figure 12a, the RMSE values of the monthly and yearly averages are very similar; however, models 1, 5, 6, 8, and 10, which show a yearly difference of more than 0.5 compared to the monthly average, should not be used, especially for long-term solar power forecasting.
Figure 13a shows the computation times for the monthly and yearly averages for each model, and Figure 13b shows the difference between them. The computation time for the yearly average is longer than that for the monthly average because the yearly dataset contains more data than any single month. Additionally, the computation time increases as the number of input parameters increases.

5. Conclusions

Among the various renewable energy sources, solar energy is being increasingly supported by the Korean government. Solar power systems have been installed in numerous locations such as homes, buildings, and factories, and electricity has been supplied accordingly. Forecasting solar power is essential for a smooth, reliable power supply; however, these forecasts are strongly influenced by meteorological factors. This study analyzed six meteorological factors with the goal of accurately predicting solar power. A total of 22 multivariate models were proposed to assess various combinations of the six meteorological factors: solar radiation (SR), sunlight, humidity, temperature, cloud cover (CC), and wind speed (WS). Analysis of the multivariate numerical models included modifications of the traditional long short-term memory (LSTM) method and application of the proposed LSTM.
The main results are as follows. (1) The six meteorological factors affect the solar power forecast in the following descending order: SR, sunlight, WS, temperature, CC, and humidity. SR has the greatest influence on solar power forecasting, while humidity has effectively no influence on the solar power forecast. (2) Among the 22 models, model 16 is superior, regardless of the characteristics of abnormal weather in Korea. The meteorological factors incorporated in model 16 are SP, SR, sunlight, and WS. (3) The remaining models were ranked according to medium- and long-term solar power forecasting accuracy. Models 6, 7, 9, 13, and 19 showed reasonable performance for long-term forecasting, whereas models 1, 5, 8, and 10 were not suitable. All models except 1, 5, 8, and 10 are appropriate for medium-term solar power forecasting. (4) The root mean square error and the mean absolute difference of the proposed LSTM are superior to those of the traditional LSTM. (5) The calculation time increases as the number of variables in a model increases. (6) The proposed medium- to long-term photovoltaic (PV) solar power forecasting contributes to efficient power consumption, demand response, and a smooth power supply for buildings.
The limitations of this article are as follows. (1) Attaching weather sensors (temperature, humidity, WS, SR, etc.) at the solar installation site to acquire meteorological and solar power data simultaneously would improve the accuracy of the solar power forecasts; however, this research was limited to meteorological data collected at an established government weather station some distance from the site. (2) The proposed algorithm was applied to data covering less than a year, so seasonal measurements could not be repeated.
Future work will collect meteorological data simultaneously at the site where PV is installed, and measurements will extend for more than a year. The methodology will also be applied to other solar power plants to verify the proposed model.

Author Contributions

N.S. supervised and wrote the article and implemented the algorithm. M.J. surveyed related research and background. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

This article was supported by research fund from Honam University, 2020.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Acronym table.

Abbreviation | Meaning
EMS | energy management system
LSTM | long short-term memory
PV | photovoltaic
DR | demand response
BEMS | building energy management system
FEMS | factory energy management system
NWP | numerical weather prediction
ARIMA | auto regression integrated moving average
RNN | recurrent neural network
SR | solar radiation
CC | cloud cover
WS | wind speed
SSE | sum of squared error
R | correlation coefficient
R2 | coefficient of determination
RMSE | root mean square error
tanh | hyperbolic tangent function
ReLU | rectified linear unit
MAE | mean absolute error

Appendix B

Table A2. Performance comparison for medium-term solar power forecasting (August) by the models.

No. | RMSE (Traditional) | RMSE (Proposed) | RMSE (Δ) | MAE (Traditional) | MAE (Proposed) | MAE (Δ) | Computation Time (s)
1 | 26.68 | 10.8 | 15.88 | 17.99 | 6.39 | 11.61 | 42.73
2 | 43.57 | 0.52 | 43.04 | 27.4 | 0.42 | 26.98 | 72.26
3 | 62.52 | 1.69 | 60.83 | 40.17 | 1.6 | 38.57 | 67.35
4 | 16.6 | 5.22 | 11.38 | 13.94 | 3.88 | 10.06 | 60.42
5 | 27.5 | 1.22 | 26.28 | 22.53 | 0.92 | 21.62 | 55.19
6 | 101.92 | 3.3 | 98.63 | 93.83 | 3.03 | 90.79 | 50.29
7 | 35.23 | 0.91 | 34.32 | 22.58 | 0.71 | 21.87 | 47.2
8 | 45.69 | 1.13 | 44.56 | 29.91 | 1.11 | 28.8 | 64.73
9 | 48.36 | 6.08 | 42.28 | 37.29 | 4.74 | 32.55 | 60.77
10 | 27.77 | 1.68 | 26.09 | 22.54 | 1.41 | 21.12 | 58.24
11 | 42.53 | 1.89 | 40.64 | 27.15 | 1.53 | 25.62 | 55.8
12 | 110.07 | 1.38 | 108.7 | 106.4 | 1.19 | 105.21 | 54.14
13 | 18.44 | 5.86 | 12.58 | 15.38 | 4.55 | 10.83 | 68.09
14 | 29.1 | 1.42 | 27.68 | 25.82 | 1.14 | 24.68 | 66.28
15 | 25.4 | 1.87 | 23.53 | 16.97 | 1.42 | 15.55 | 65.88
16 | 45.49 | 0.89 | 44.6 | 38.63 | 0.69 | 37.94 | 64.94
17 | 51.13 | 1.83 | 49.31 | 50.8 | 1.1 | 49.7 | 77.57
18 | 48.97 | 2.16 | 46.82 | 48.72 | 1.33 | 47.39 | 77.86
19 | 49.95 | 2.45 | 47.5 | 39.42 | 1.04 | 38.38 | 76.94
20 | 85.02 | 2.08 | 82.93 | 83.15 | 1.44 | 81.72 | 90.41
21 | 56.8 | 1.34 | 55.46 | 42.54 | 0.84 | 41.71 | 92.08
22 | 41.63 | 1.05 | 40.57 | 34.29 | 0.7 | 33.59 | 104.93
Table A3. Performance comparison for medium-term solar power forecasting (October) by the models.

No. | RMSE (Traditional / Proposed / Δ) | MAE (Traditional / Proposed / Δ) | Computation Time (s)
1 | 18.58 / 9.74 / 8.84 | 11.99 / 5.98 / 6 | 65.83
2 | 89.7 / 1.91 / 87.79 | 88.32 / 1.89 / 86.43 | 112.27
3 | 19.88 / 0.81 / 19.06 | 11.89 / 0.75 / 11.13 | 103.55
4 | 22.98 / 7.76 / 15.22 | 18.42 / 5.8 / 12.62 | 95.64
5 | 28.16 / 2.17 / 25.99 | 18.89 / 1.55 / 17.33 | 90.02
6 | 46.89 / 2.73 / 44.16 | 29.06 / 2.09 / 26.96 | 85.68
7 | 32.87 / 2.76 / 30.11 | 19.8 / 2.06 / 17.74 | 82.54
8 | 32.7 / 0.65 / 32.05 | 18.85 / 0.56 / 18.29 | 114.87
9 | 37.43 / 8.48 / 28.95 | 30.15 / 6.64 / 23.51 | 111.51
10 | 7.13 / 1.66 / 5.47 | 6.15 / 1.42 / 4.74 | 108.49
11 | 26.47 / 2.42 / 24.06 | 16.56 / 1.76 / 14.79 | 104.97
12 | 15.06 / 1.68 / 13.38 | 9.83 / 1.48 / 8.35 | 102.89
13 | 27.95 / 8.05 / 19.91 | 23.53 / 6.11 / 17.43 | 132.53
14 | 25.27 / 1.99 / 23.29 | 16.42 / 1.42 / 15 | 134.46
15 | 25.9 / 2.27 / 23.63 | 16.72 / 1.64 / 15.08 | 126.96
16 | 24.85 / 1.09 / 23.76 | 13.24 / 0.89 / 12.35 | 125.36
17 | 79.25 / 3.42 / 75.83 | 78.61 / 3.19 / 75.42 | 166.82
18 | 102.23 / 2.96 / 99.27 | 101.04 / 2.2 / 98.83 | 174.02
19 | 74.51 / 1.71 / 72.8 | 73.26 / 0.96 / 72.3 | 179.51
20 | 66.33 / 2.1 / 64.24 | 57.09 / 1.42 / 55.66 | 214.63
21 | 53.2 / 1.82 / 51.38 | 48.84 / 1.6 / 47.24 | 209.94
22 | 53.05 / 1.65 / 51.4 | 46.66 / 1.16 / 45.5 | 248.35
Table A4. Performance comparison for medium-term solar power forecasting (winter) by the models.

No. | RMSE (Traditional / Proposed / Δ) | MAE (Traditional / Proposed / Δ) | Computation Time (s)
1 | 30.66 / 6.85 / 23.81 | 19.33 / 4.72 / 14.61 | 73.49
2 | 27.41 / 0.41 / 27 | 13.97 / 0.35 / 13.62 | 125.42
3 | 18.62 / 1.66 / 16.96 | 12.21 / 1.64 / 10.56 | 115.51
4 | 21.95 / 5.52 / 16.43 | 18.77 / 3.96 / 14.81 | 107.74
5 | 24.47 / 1.56 / 22.91 | 13.28 / 1.23 / 12.05 | 100.33
6 | 30.83 / 3.09 / 27.74 | 18.22 / 2.3 / 15.92 | 95.58
7 | 39.67 / 1.36 / 38.32 | 23.02 / 1.04 / 21.98 | 91.48
8 | 24.38 / 0.42 / 23.95 | 12.89 / 0.35 / 12.54 | 126.17
9 | 27.88 / 6.45 / 21.43 | 23.21 / 4.65 / 18.55 | 121.53
10 | 26.07 / 1.84 / 24.23 | 16.18 / 1.44 / 14.73 | 117.53
11 | 32.09 / 2.95 / 29.14 | 18.51 / 2.2 / 16.31 | 113.34
12 | 38.64 / 1.05 / 37.59 | 22.39 / 0.83 / 21.57 | 110.19
13 | 22.42 / 6.27 / 16.15 | 19.31 / 4.74 / 14.57 | 139.51
14 | 17.67 / 1.54 / 16.13 | 8.99 / 1.22 / 7.77 | 138.51
15 | 38.75 / 3 / 35.75 | 22.96 / 2.16 / 20.8 | 138.52
16 | 28.49 / 0.87 / 27.62 | 13.92 / 0.65 / 13.27 | 149.24
17 | 31.38 / 2.77 / 28.61 | 20.76 / 1.97 / 18.79 | 192.29
18 | 82.75 / 2.67 / 80.08 | 74.76 / 1.93 / 72.53 | 187.78
19 | 19.15 / 1.08 / 18.08 | 12.01 / 0.76 / 11.26 | 179.5
20 | 50.34 / 2.3 / 48.04 | 47.91 / 1.65 / 46.27 | 225.09
21 | 96.54 / 1.23 / 95.31 | 83.99 / 0.95 / 83.04 | 217.48
22 | 67.91 / 1.34 / 66.57 | 50.46 / 0.87 / 49.6 | 257.9

Figure 1. The data collection area (“A”) and weather observation area (“B”) in Gyeonggi-do, Korea.
Figure 2. Scatterplots of solar power vs. (a) solar radiation (W/m²), (b) sunlight, (c) humidity (%), (d) temperature (°C), (e) cloud cover, and (f) wind speed (m/s).
Figure 3. Proposed wiring diagram of the solar power system.
Figure 4. Structure of a recurrent neural network.
Figure 5. Structure of long short-term memory.
Figure 6. Solar power forecasting using a traditional long short-term memory (LSTM).
Figure 7. Solar power forecasting using the proposed method.
Figure 8. The meteorological factors affecting solar power forecasting by month.
Figure 9. Comparison among the 22 models for the traditional LSTM by month.
Figure 10. Comparison among the 22 models for the proposed LSTM by month.
Figure 11. Comparison of the best and worst models for the proposed and traditional LSTM by season.
Figure 12. Comparisons between the monthly and yearly averages according to the RMSE values and differences determined by the models.
Figure 13. Comparison between the monthly and yearly averages according to the computation time and differences determined by the models.
Table 1. Performance comparison of solar power vs. meteorological factors. Bolded values are the best in the comparison.

Month | Relationship | SSE | R² | (R) | RMSE
4 | SP/SR | 1.94 × 10^5 | 0.8196 | (+0.9053) | 16.4291
4 | SP/Sunlight | 4.51 × 10^5 | 0.5805 | (+0.7619) | 25.0547
4 | SP/Humidity | 7.27 × 10^5 | 0.3237 | (−0.5689) | 31.8131
4 | SP/Temperature | 7.81 × 10^5 | 0.2733 | (+0.5227) | 32.9771
4 | SP/CC | 1.06 × 10^6 | 0.0096 | (−0.0979) | 37.4975
4 | SP/WS | 1.07 × 10^6 | 0.0313 | (+0.1769) | 38.0740
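For reference, the quantities in Table 1 can be reproduced from an hourly series of one meteorological factor and the corresponding solar power. This is a minimal sketch, assuming an ordinary least-squares line (the exact fitting procedure is not stated here); for a single-predictor linear fit, R² equals the square of the correlation coefficient R:

```python
import numpy as np

def fit_metrics(x, y):
    """Fit solar power y on one factor x by least squares and return
    (SSE, R^2, r, RMSE), the quantities reported in Table 1."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    sse = float(np.sum(residuals ** 2))                    # sum of squared error
    r = float(np.corrcoef(x, y)[0, 1])                     # correlation coefficient
    r2 = 1.0 - sse / float(np.sum((y - y.mean()) ** 2))    # coefficient of determination
    rmse = float(np.sqrt(np.mean(residuals ** 2)))         # root mean square error
    return sse, r2, r, rmse
```

On a perfectly linear series, SSE and RMSE go to zero while R² and r go to one, which makes the sketch easy to sanity-check.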
Table 2. Statistical data (min, max, mean, median, standard deviation, and range) of solar power vs. meteorological factors. In each SP/factor pair, x is the factor and y is solar power; the y column is identical across pairs and is shown once.

Statistic | SR (x) | Sunlight (x) | Humidity (x) | Temperature (x) | CC (x) | WS (x) | SP (y)
min | 0 | 0 | 14 | 0.1 | 0 | 0 | 0
max | 3.35 | 1 | 100 | 26.7 | 10 | 7.5 | 136
mean | 0.68 | 0.30 | 64.89 | 12.64 | 3.79 | 2.05 | 26.31
median | 0.09 | 0 | 65 | 12.15 | 1 | 1.8 | 2
std. | 0.98 | 0.43 | 24.53 | 5.52 | 4.29 | 1.73 | 38.66
range | 3.35 | 1 | 86 | 26.6 | 10 | 7.5 | 136
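The six row statistics of Table 2 are standard descriptive statistics. A minimal sketch (assuming the sample standard deviation; the convention used in the table is not stated):

```python
import numpy as np

def summary_stats(values):
    """Min, max, mean, median, standard deviation, and range
    for one variable, matching the rows of Table 2."""
    v = np.asarray(values, dtype=float)
    return {
        "min": float(v.min()),
        "max": float(v.max()),
        "mean": float(v.mean()),
        "median": float(np.median(v)),
        "std": float(v.std(ddof=1)),          # sample standard deviation
        "range": float(v.max() - v.min()),
    }
```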
Table 3. The amount of data per month and year.

2018: Apr 720, May 744, Jun 696, Jul 744, Aug 744, Sep 720, Oct 672, Nov 720, Dec 743
2019: Jan 744, Feb 672, Mar 720
Year total: 8640
Table 4. Multivariate numerical models.

No. | Factors
1 | SP
2 | SP, SR
3 | SP, Sunlight
4 | SP, Humidity
5 | SP, Temperature
6 | SP, CC
7 | SP, WS
8 | SP, SR, Sunlight
9 | SP, SR, Humidity
10 | SP, SR, Temperature
11 | SP, SR, CC
12 | SP, SR, WS
13 | SP, SR, Sunlight, Humidity
14 | SP, SR, Sunlight, Temperature
15 | SP, SR, Sunlight, CC
16 | SP, SR, Sunlight, WS
17 | SP, SR, Sunlight, Humidity, Temperature
18 | SP, SR, Sunlight, Humidity, CC
19 | SP, SR, Sunlight, Humidity, WS
20 | SP, SR, Sunlight, Humidity, Temperature, CC
21 | SP, SR, Sunlight, Humidity, Temperature, WS
22 | SP, SR, Sunlight, Humidity, Temperature, CC, WS
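When preparing the LSTM inputs, Table 4 can be expressed directly as data: each model's input is solar power plus a subset of the six factors. A sketch with assumed column names (`SP`, `SR`, etc. are illustrative identifiers, not the paper's actual code):

```python
# Table 4 as data: model number -> meteorological factors added to SP.
MODELS = {
    1:  [],
    2:  ["SR"],                         3:  ["Sunlight"],
    4:  ["Humidity"],                   5:  ["Temperature"],
    6:  ["CC"],                         7:  ["WS"],
    8:  ["SR", "Sunlight"],             9:  ["SR", "Humidity"],
    10: ["SR", "Temperature"],          11: ["SR", "CC"],
    12: ["SR", "WS"],
    13: ["SR", "Sunlight", "Humidity"],
    14: ["SR", "Sunlight", "Temperature"],
    15: ["SR", "Sunlight", "CC"],
    16: ["SR", "Sunlight", "WS"],
    17: ["SR", "Sunlight", "Humidity", "Temperature"],
    18: ["SR", "Sunlight", "Humidity", "CC"],
    19: ["SR", "Sunlight", "Humidity", "WS"],
    20: ["SR", "Sunlight", "Humidity", "Temperature", "CC"],
    21: ["SR", "Sunlight", "Humidity", "Temperature", "WS"],
    22: ["SR", "Sunlight", "Humidity", "Temperature", "CC", "WS"],
}

def model_columns(no):
    """Input columns for model `no`; SP is always included."""
    return ["SP"] + MODELS[no]
```

For example, `model_columns(16)` yields the four inputs of the best-performing model: SP, SR, sunlight, and WS.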
Table 5. Monthly performance comparison among the multivariate models, April 2018 to March 2019. For each of the 22 models (factor combinations as listed in Table 4), the table marks the months in which the model achieved RMSE < 1.5. [The per-month markers were rendered graphically and are not reproduced here.]
Table 6. Performance comparison for medium-term solar power forecasting (April) by the models.

No. | RMSE (Traditional / Proposed / Δ) | MAE (Traditional / Proposed / Δ) | Computation Time (s)
1 | 13.62 / 5.42 / 8.2 | 8.93 / 3.21 / 5.73 | 48.02
2 | 24.25 / 0.64 / 23.61 | 15.25 / 0.54 / 14.71 | 81.64
3 | 28.98 / 0.55 / 28.43 | 18.06 / 0.43 / 17.63 | 74.38
4 | 23.5 / 6.85 / 16.65 | 19.71 / 4.92 / 14.78 | 68.81
5 | 8.98 / 1.63 / 7.35 | 7.37 / 1.29 / 6.08 | 64.37
6 | 9.9 / 2.18 / 7.71 | 8.26 / 1.65 / 6.61 | 61.45
7 | 21.94 / 1.54 / 20.4 | 14.51 / 1.34 / 13.16 | 57.64
8 | 31.22 / 2.21 / 29.01 | 23.27 / 2.19 / 21.07 | 78.63
9 | 26.36 / 7.44 / 18.92 | 22.27 / 5.55 / 16.72 | 75.94
10 | 191.96 / 1.59 / 190.37 | 167.45 / 1.23 / 166.22 | 71.74
11 | 10.59 / 2.17 / 8.41 | 7.79 / 1.29 / 6.49 | 70.99
12 | 5.54 / 1.03 / 4.51 | 4.12 / 0.81 / 3.31 | 67.12
13 | 54.26 / 8.74 / 45.51 | 48.23 / 6.48 / 41.75 | 84.7
14 | 24.56 / 2.85 / 21.71 | 18.01 / 2.36 / 15.65 | 84.5
15 | 22.91 / 2.51 / 20.4 | 15.27 / 1.51 / 13.75 | 90.12
16 | 45.59 / 0.98 / 44.61 | 28.19 / 0.74 / 27.45 | 93.75
17 | 68.76 / 1.44 / 67.32 | 67.24 / 1.03 / 66.21 | 115.34
18 | 72.57 / 3.55 / 69.01 | 71.79 / 2.55 / 69.24 | 115.91
19 | 66.72 / 1.43 / 65.28 | 51.57 / 0.8 / 50.77 | 117.93
20 | 21.41 / 2.74 / 18.66 | 17.2 / 1.82 / 15.37 | 142.68
21 | 44.83 / 1.43 / 43.4 | 33.67 / 1.1 / 32.58 | 146.37
22 | 44.05 / 1.22 / 42.83 | 36.08 / 0.76 / 35.32 | 170.85
Table 7. Performance comparison for long-term solar power forecasting by the models.

No. | RMSE (Traditional / Proposed / Δ) | MAE (Traditional / Proposed / Δ) | Computation Time (s)
1 | 35.7 / 9.23 / −26.47 | 26.03 / 5.83 / −20.20 | 69.06
2 | 4.83 / 1.07 / −3.76 | 4.78 / 1.04 / −3.74 | 123.42
3 | 36.69 / 1.23 / −35.46 | 23.42 / 1.08 / −22.34 | 111.23
4 | 304.33 / 7.28 / −297.05 | 302.12 / 5.34 / −296.78 | 102.21
5 | 21.71 / 3.58 / −18.13 | 15.53 / 2.32 / −13.21 | 95.67
6 | 14.74 / 2.25 / −12.49 | 10.23 / 1.65 / −8.58 | 89.2
7 | 20.47 / 0.84 / −19.63 | 12.79 / 0.66 / −12.13 | 84.44
8 | 19.13 / 1.87 / −17.26 | 12.45 / 1.85 / −10.60 | 119.59
9 | 26.32 / 7.13 / −19.19 | 22.02 / 5 / −17.02 | 113.99
10 | 25.51 / 2.5 / −23.01 | 18.8 / 1.84 / −16.96 | 109.16
11 | 18.94 / 2.49 / −16.45 | 12.97 / 1.87 / −11.10 | 106.53
12 | 34.71 / 0.95 / −33.76 | 22.63 / 0.74 / −21.89 | 102.34
13 | 30.82 / 6.91 / −23.91 | 24.95 / 4.52 / −20.43 | 133.01
14 | 66.32 / 2 / −64.32 | 47.33 / 1.45 / −45.88 | 134.54
15 | 36.7 / 2.48 / −34.22 | 24.58 / 1.83 / −22.75 | 136.79
16 | 30.55 / 0.85 / −29.70 | 18.58 / 0.66 / −17.92 | 142.04
17 | 64.39 / 2.15 / −62.24 | 63.38 / 1.62 / −61.76 | 190.83
18 | 92.07 / 3.48 / −88.59 | 85.88 / 2.79 / −83.09 | 193.12
19 | 63.24 / 1.25 / −61.99 | 63.2 / 0.99 / −62.21 | 193.44
20 | 46.78 / 2.74 / −44.04 | 33.3 / 2.04 / −31.26 | 233.92
21 | 92.01 / 1.57 / −90.44 | 89.05 / 1.22 / −87.83 | 232.85
22 | 40.67 / 1.49 / −39.18 | 28.66 / 0.99 / −27.67 | 248.95
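The RMSE and MAE columns of Tables 6 and 7 (and Tables A2–A4) follow their standard definitions, and the Δ column is the gap between the traditional and proposed errors (reported with opposite sign conventions in the medium- and long-term tables). A minimal sketch:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error between measured and forecast solar power."""
    d = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean(d ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error between measured and forecast solar power."""
    d = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs(d)))
```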
Share and Cite

MDPI and ACS Style

Son, N.; Jung, M. Analysis of Meteorological Factor Multivariate Models for Medium- and Long-Term Photovoltaic Solar Power Forecasting Using Long Short-Term Memory. Appl. Sci. 2021, 11, 316. https://doi.org/10.3390/app11010316