**5. Experimental Results and Analysis**

*5.1. Parameter Tuning and Statistical Analysis*

To illustrate the advantages of the proposed model, the performance of existing PGPMs based on Support Vector Regression (SVR) [21], Decision Tree [22], Random Forest [23], LSTM [24], and Bi-LSTM [25] was compared with that of the Attention-Bi-LSTM PGPM proposed in this paper. The main experimental parameters of the PGPMs based on SVR, Decision Tree, and Random Forest were tuned, as shown in Tables 2–4, respectively.

**Table 2.** Parameter tuning of PGPM based on SVR.


**Table 3.** Parameter tuning of PGPM based on Decision Tree.


**Table 4.** Parameter tuning of PGPM based on Random Forest.


From Tables 2–4, the best parameters of each algorithm were determined as those yielding the highest prediction accuracy.
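The tuning procedure behind Tables 2–4 can be sketched as an exhaustive search over a candidate parameter grid, keeping the setting with the best validation error. The sketch below is illustrative only: it uses a hand-written k-nearest-neighbours regressor and toy data in place of the paper's actual models, grids, and dataset.

```python
from itertools import product

def mae(y_true, y_pred):
    # mean absolute error between true and predicted values
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def knn_predict(train_x, train_y, x, k):
    # predict by averaging the k nearest training targets (1-D feature)
    nearest = sorted(range(len(train_x)), key=lambda i: abs(train_x[i] - x))[:k]
    return sum(train_y[i] for i in nearest) / k

# toy data: irradiance-like feature vs. generated power (illustrative values)
train_x = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
train_y = [12, 25, 37, 50, 63, 74, 88, 101]
val_x   = [0.25, 0.55, 0.75]
val_y   = [31, 69, 94]

# exhaustive search over the candidate grid, keeping the best-MAE setting
grid = {"k": [1, 2, 3, 4]}
best = None
for (k,) in product(grid["k"]):
    preds = [knn_predict(train_x, train_y, x, k) for x in val_x]
    score = mae(val_y, preds)
    if best is None or score < best[1]:
        best = (k, score)
print(best)  # best k and its validation MAE
```

In practice, scikit-learn's `GridSearchCV` automates this loop with cross-validation rather than a single fixed validation split.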

Moreover, the proposed Attention-Bi-LSTM PGPM is, in essence, an improved version of the PGPMs based on LSTM and Bi-LSTM. To ensure the comparability and accuracy of the subsequent experimental results, the experimental parameters of these three LSTM-based PGPMs are kept identical in this paper; the related parameters are shown in Table 5.
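The core difference between the Attention-Bi-LSTM and the plain Bi-LSTM is the attention pooling step: each timestep's hidden state is scored, the scores are normalised with softmax, and the weighted sum forms a context vector for the final prediction. A minimal pure-Python sketch of that step, with hypothetical hidden states and a hypothetical learned scoring vector (the paper's actual attention layer and dimensions may differ):

```python
import math

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]  # subtract max for numerical stability
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, score_weights):
    # score each timestep's hidden state with a learned vector (dot product),
    # normalise the scores with softmax, and return the weighted sum (context vector)
    scores = [sum(w * h for w, h in zip(score_weights, hs)) for hs in hidden_states]
    weights = softmax(scores)
    dim = len(hidden_states[0])
    context = [sum(weights[t] * hidden_states[t][d] for t in range(len(hidden_states)))
               for d in range(dim)]
    return weights, context

# illustrative Bi-LSTM outputs: 4 timesteps, hidden size 3
H = [[0.2, -0.1, 0.4],
     [0.9,  0.3, 0.1],
     [0.1,  0.8, -0.2],
     [0.5,  0.2, 0.6]]
w = [1.0, 0.5, 0.25]  # hypothetical scoring vector (learned during training)

weights, context = attention_pool(H, w)
```

The softmax weights sum to one, so the context vector is a convex combination of the timestep outputs, letting the model emphasise the most informative timesteps.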

**Table 5.** Related parameters of LSTM-based PGPMs.


Furthermore, a statistical analysis was performed for the selected parameter configurations: model training and prediction were run 50 times, with the training and testing datasets partitioned randomly on each run, to evaluate the statistical stability of the models. The results are shown in Table 6.
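The stability evaluation above amounts to repeating the train/test split, recording the prediction error of each run, and reporting the mean and standard deviation. A minimal sketch, assuming a trivial stand-in model and synthetic data (the paper's actual models and dataset would replace both):

```python
import random
import statistics

def rmse(y_true, y_pred):
    # root-mean-square prediction error
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) ** 0.5

def mean_predictor(train_y):
    # trivial stand-in model: always predicts the training-set mean
    m = sum(train_y) / len(train_y)
    return lambda x: m

# synthetic dataset of (feature, power in kWh) pairs, illustrative only
data = [(x / 10, 10 * x + random.Random(x).uniform(-5, 5)) for x in range(100)]

errors = []
rng = random.Random(42)
for run in range(50):
    shuffled = data[:]            # fresh copy so each run re-partitions the data
    rng.shuffle(shuffled)
    split = int(0.8 * len(shuffled))
    train, test = shuffled[:split], shuffled[split:]
    model = mean_predictor([y for _, y in train])
    preds = [model(x) for x, _ in test]
    errors.append(rmse([y for _, y in test], preds))

print(f"mean RMSE = {statistics.mean(errors):.2f}, std = {statistics.stdev(errors):.2f}")
```

A small standard deviation across the 50 runs indicates that the model's accuracy does not hinge on any particular random partition of the data.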

**Table 6.** Statistical analysis on the studied methods.


Table 6 shows that the standard deviation for each algorithm is only 1–2 kWh, which means the prediction results are stable once the parameters are fixed. Therefore, the subsequent comparison of parameter-dependent results can reflect the performance gaps between the different methods from a statistical viewpoint.

Moreover, to evaluate the performance of the above algorithms, the Python scikit-learn library was employed to implement the PGPMs based on the SVR, Decision Tree, and Random Forest algorithms, while the TensorFlow library was employed to implement the PGPMs based on LSTM, Bi-LSTM, and the proposed Attention-Bi-LSTM.
