*2.4. Random Forest*

Random forest is an ensemble learning technique built on the decision tree architecture and is applied to both classification and regression tasks. It trains several decision trees (CART) and combines their outputs, by majority vote for classification or by averaging for regression, to produce a more accurate result than any single tree. Each tree is grown using the Gini index, which measures the probability that a randomly selected element from the set would be incorrectly classified; it is computed as 1 minus the sum of the squared probabilities of each class. Aggregating many trees increases the predictive power of the system and reduces the bias and overfitting associated with a single decision tree model. Additionally, using the "randomForest" R package, random forest can naturally rank the relevance of variables in regression or classification tasks [25,26].
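The Gini index described above can be sketched in a few lines. This is a minimal illustration, not the implementation used in the cited study; the helper name `gini_index` is hypothetical, and the calculation follows the standard definition of 1 minus the sum of squared class probabilities.

```python
from collections import Counter

def gini_index(labels):
    # Hypothetical helper: Gini impurity of a set of class labels,
    # computed as 1 minus the sum of squared class probabilities.
    n = len(labels)
    counts = Counter(labels)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

# A pure node (one class) has Gini 0; an even two-class split has Gini 0.5.
print(gini_index(["a", "a", "a", "a"]))  # 0.0
print(gini_index(["a", "a", "b", "b"]))  # 0.5
```

A split candidate is scored by the weighted Gini of its child nodes, and the tree greedily chooses the split that lowers this impurity the most.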
