2.8.2. Decision Tree Regression Model

Decision trees are hierarchical structures in which nodes represent tests on specific attributes of the data and branches represent the test outcomes. Decision tree models include ID3, C4.5, CART, and regression trees. Regression decision trees predict continuous variables by selecting, at each node, the attribute split that minimizes the mean square error (MSE), obtained with Equation (13) [60].

$$MSE = \frac{1}{n} \sum_{i=1}^{n} (y_i - \overline{y}_i)^2 \tag{13}$$

where $Y = (y_1, y_2, y_3, \ldots, y_n)$ is the raw data output variable and $\overline{Y} = (\overline{y}_1, \overline{y}_2, \overline{y}_3, \ldots, \overline{y}_n)$ represents the decision tree model output [60].
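As a minimal sketch (the variable names and values are illustrative, not from the source), Equation (13) can be computed directly:

```python
def mse(y, y_pred):
    """Mean square error between the raw data outputs y and the
    decision tree model outputs y_pred, per Equation (13)."""
    n = len(y)
    return sum((yi - pi) ** 2 for yi, pi in zip(y, y_pred)) / n

# Hypothetical observed outputs vs. model predictions.
y = [2.0, 4.0, 6.0]
y_pred = [2.5, 3.5, 6.0]
print(mse(y, y_pred))  # (0.25 + 0.25 + 0.0) / 3 ≈ 0.1667
```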

For this application, the decision tree regression model used the energy consumption as the target variable and the node attributes of the energy consumption dataset as the features.
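To illustrate how a regression tree chooses a node split by reducing MSE, the following sketch evaluates every candidate threshold on a single feature and keeps the one with the lowest weighted MSE of the two resulting partitions (a CART-style split). The feature and target names are hypothetical, not from the source dataset:

```python
def best_split(x, y):
    """Return the threshold on feature x that minimizes the weighted
    MSE of the two partitions, as a regression tree node does."""
    def leaf_mse(vals):
        # MSE of a leaf that predicts the mean of its samples.
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals) / len(vals)

    best_t, best_err = None, float("inf")
    # Candidate thresholds: every distinct feature value except the largest.
    for t in sorted(set(x))[:-1]:
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        err = (len(left) * leaf_mse(left) + len(right) * leaf_mse(right)) / len(y)
        if err < best_err:
            best_t, best_err = t, err
    return best_t, best_err

# Hypothetical feature (outdoor temperature) vs. energy consumption (kWh).
temp = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
kwh = [5.0, 5.2, 5.1, 9.0, 9.3, 9.1]
t, err = best_split(temp, kwh)
print(t)  # 3.0 — the split separating the two consumption regimes
```

A full tree applies this split search recursively to each partition; libraries such as scikit-learn's `DecisionTreeRegressor` implement the same criterion with `criterion="squared_error"`.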
