2.8.1. Linear Regression Model

The linear model structure includes $Y \in \mathbb{R}^{n \times 1}$, where $Y = (y_1, y_2, y_3, \ldots, y_n)^T$ is the response variable; $X \in \mathbb{R}^{n \times p}$, where $X = (x_1, x_2, x_3, \ldots, x_n)^T$ represents the design matrix, with rows $x_i = (x_{i,1}, x_{i,2}, x_{i,3}, \ldots, x_{i,p})$; and $\beta \in \mathbb{R}^{p \times 1}$, where $n$ is the number of observations and $p$ is the number of features. Then, the linear regression model is given by Equation (11) [59].

$$
Y = \mu + \epsilon \tag{11}
$$

where $\mu = X\beta$ and $\epsilon$ is the regression error.

Then, given the response $Y$ and the design matrix $X$, Equation (12) solves for the model parameters $\beta$ that minimize the squared error $\epsilon^T \epsilon$ [59].

$$\boldsymbol{\beta} = \left(\mathbf{X}^T \mathbf{X}\right)^{-1} \mathbf{X}^T \mathbf{Y} \tag{12}$$
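
As a sketch, the closed-form solution in Equation (12) can be computed directly with NumPy; the data below are synthetic, generated for illustration only (the true coefficients and noise scale are assumptions, not values from the text):

```python
import numpy as np

# Synthetic example: n = 100 observations, p = 3 features (hypothetical values).
rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))            # design matrix X in R^{n x p}
beta_true = np.array([2.0, -1.0, 0.5])  # assumed true coefficients
eps = rng.normal(scale=0.1, size=n)    # regression error
Y = X @ beta_true + eps                # linear model of Equation (11)

# Closed-form least-squares estimate of Equation (12): (X^T X)^{-1} X^T Y.
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ Y

# In practice, a least-squares solver is preferred over the explicit
# inverse for numerical stability; both give the same estimate here.
beta_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)
```

Note that the explicit inverse requires $X^T X$ to be nonsingular, which fails when the features are collinear or $p > n$; solver-based routines such as `np.linalg.lstsq` handle those cases more gracefully.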
