*2.3. Error Estimation*

Any empirical correlation is associated with prediction error. The accuracy of the regression models in this study is assessed by the root-mean-square error (RMSE), the bias of the model, and the coefficient of determination (R<sup>2</sup>) [46]. They are defined as follows:

$$\text{RMSE} = \sqrt{\frac{\sum\_{i=1}^{n} \left(y\_i^\* - y\_i\right)^2}{n}} \tag{10}$$

$$\text{Bias} = \frac{1}{n} \sum\_{i=1}^{n} \left(y\_i^\* - y\_i\right) \tag{11}$$

$$R^2 = 1 - \frac{\sum\_{i=1}^{n} \left(y\_i^\* - y\_i\right)^2}{\sum\_{i=1}^{n} \left(y\_i - \overline{y}\right)^2} \tag{12}$$

where y\*, y, and ȳ represent the predicted, measured, and average values of the dependent variable, respectively, and n is the number of data points used to derive a particular correlation. The RMSE is an absolute measure of the model's fit to the data and is preferred over the mean absolute error because it places more weight on the larger error terms. The R<sup>2</sup> value, in contrast, is a relative measure of fit: it represents the percentage of variability in the response variable explained by the model compared with the mean alone. When the main objective of the model is prediction, the RMSE is used together with the R<sup>2</sup> value to assess accuracy; small RMSE values and high R<sup>2</sup> values (between 0 and 1) imply a good fit. The bias of a model measures the degree to which its predictions overestimate or underestimate the measurements: positive bias values suggest that the model's predictions exceed the actual values, and vice versa [47].
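Eqs. (10)–(12) translate directly into code. The sketch below is an illustrative Python implementation, not part of the original study; the function names are chosen for this example only:

```python
import math

def rmse(predicted, measured):
    """Root-mean-square error, Eq. (10)."""
    n = len(measured)
    return math.sqrt(sum((yp - ym) ** 2 for yp, ym in zip(predicted, measured)) / n)

def bias(predicted, measured):
    """Mean signed error, Eq. (11); positive values indicate overestimation."""
    n = len(measured)
    return sum(yp - ym for yp, ym in zip(predicted, measured)) / n

def r_squared(predicted, measured):
    """Coefficient of determination, Eq. (12)."""
    mean_y = sum(measured) / len(measured)
    ss_res = sum((yp - ym) ** 2 for yp, ym in zip(predicted, measured))
    ss_tot = sum((ym - mean_y) ** 2 for ym in measured)
    return 1 - ss_res / ss_tot
```

For example, with measured values [1, 2, 3, 4] and predictions [1.1, 1.9, 3.2, 4.0], the bias is +0.05, indicating a slight overall overestimation.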
