*2.5. Gaussian Support Vector Machine*

Support Vector Machines (SVMs) are grounded in statistical learning theory and include polynomial classifiers, neural networks and radial basis function (RBF) networks as special cases. The SVM is thus not only theoretically well founded but also performs strongly in practical applications [49]. It is also commonly used to build regression models. The function used to estimate the unknown parameter vector (here, the wind speed estimate vector) is given by:

$$f\_{SVM} = \sum\_{i=1}^{m} \sum\_{j=1}^{m} \left( l\_i' - l\_i \right) \left( l\_j' - l\_j \right) \mathbf{x}\_i^T \mathbf{x}\_j + b \tag{13}$$

where *m* is the number of samples and *l'*<sub>*i*</sub> and *l*<sub>*i*</sub> are the Lagrange multipliers. In this paper, **x** is the observation matrix, composed of 10 variable rows and *m* sample columns; **x**<sub>*i*</sub> and **x**<sub>*j*</sub> are its *i*-th and *j*-th columns, respectively; and *b* is the threshold. Introducing a kernel function, i.e., replacing **x**<sub>*i*</sub><sup>T</sup>**x**<sub>*j*</sub> with *K*(**x**<sub>*i*</sub>, **x**<sub>*j*</sub>), where *K*(**x**<sub>*i*</sub>, **x**<sub>*j*</sub>) is a transformation that implicitly maps **x**<sub>*i*</sub> into a high-dimensional space, can improve the performance of the model. The choice of kernel function and its parameters directly affects the performance of the SVM [50]. The following are the commonly used positive semidefinite kernel functions, namely the linear, polynomial and Gaussian functions:

$$K(\mathbf{x}\_i, \mathbf{x}\_j) = \mathbf{x}\_i^T \mathbf{x}\_j \tag{14}$$

$$K(\mathbf{x}\_i, \mathbf{x}\_j) = \left( 1 + \mathbf{x}\_i^T \mathbf{x}\_j \right)^p \tag{15}$$

$$K(\mathbf{x}\_i, \mathbf{x}\_j) = e^{-\left\| \mathbf{x}\_i - \mathbf{x}\_j \right\|^2} \tag{16}$$
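As a concrete illustration, the three kernels of Equations (14)–(16) can be sketched in NumPy; the 10-variable observation columns below are randomly generated stand-ins, and the polynomial degree p = 2 is an arbitrary choice for demonstration only:

```python
import numpy as np

def linear_kernel(xi, xj):
    # Eq. (14): inner product of two observation columns
    return xi @ xj

def polynomial_kernel(xi, xj, p=2):
    # Eq. (15): the degree p = 2 here is purely illustrative
    return (1.0 + xi @ xj) ** p

def gaussian_kernel(xi, xj):
    # Eq. (16): squared Euclidean distance in the exponent
    return np.exp(-np.sum((xi - xj) ** 2))

# Two hypothetical 10-variable observation columns (synthetic data)
rng = np.random.default_rng(0)
xi = rng.normal(size=10)
xj = rng.normal(size=10)

print(linear_kernel(xi, xj))
print(polynomial_kernel(xi, xj))
print(gaussian_kernel(xi, xj))   # always in (0, 1]
print(gaussian_kernel(xi, xi))   # equals 1.0 for identical inputs
```

Note that the Gaussian kernel depends only on the distance between the two columns, which is why it equals 1 when the inputs coincide and decays toward 0 as they move apart.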

After testing these kernel functions, the Gaussian kernel was found to perform best in this study, so the Gaussian SVM (GSVM) is adopted in this paper. The Box Constraint is set to 0.9762, the Epsilon to 0.09762 and the Kernel Scale to 3.7.
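A GSVM regression of this kind can be sketched with scikit-learn's `SVR`. Mapping the hyperparameters above onto scikit-learn's names is an assumption on our part: box constraint → `C`, epsilon → `epsilon`, and kernel scale *s* → `gamma = 1/s²`. The data below are synthetic stand-ins (samples stored as rows, the transpose of the paper's column convention), not the study's wind-speed observations:

```python
import numpy as np
from sklearn.svm import SVR

# Assumed parameter mapping from the paper's stated values (illustrative)
kernel_scale = 3.7
gsvm = SVR(kernel="rbf",
           C=0.9762,                       # box constraint
           epsilon=0.09762,                # epsilon-insensitive tube width
           gamma=1.0 / kernel_scale ** 2)  # Gaussian kernel scale

# Synthetic stand-in: m = 200 samples of a 10-variable observation vector,
# with a made-up smooth target playing the role of the wind speed estimate
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

gsvm.fit(X, y)
pred = gsvm.predict(X[:5])
print(pred.shape)  # (5,)
```

In practice the fitted model's support vectors and dual coefficients play the role of the Lagrange multipliers in Equation (13), and prediction reduces to a kernel-weighted sum over those support vectors plus the threshold *b*.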
