*3.6. Selection of Hyperparameters*

To maximize classification performance, and to select the most suitable set of hyperparameters for a fair comparison (i.e., comparing only classifiers at their maximum accuracy), a detailed analysis was carried out using a grid search. This procedure consisted of choosing combinations of candidate values for specific parameters of a classifier and exhaustively performing cross-validation for each combination. The accuracy score of each combination was stored for comparison, and the set of parameters yielding the best (here, maximum) accuracy score was chosen. The updated classifier was then evaluated on the testing set, and its prediction (testing) accuracy score was recorded. In order to apply this procedure to each selected classifier, a set of candidate hyperparameters was defined. Table 5 lists these parameters, which were selected on the basis of the authors' criteria and their availability within the syntax and structure of the applied implementation (i.e., the scikit-learn module in Python).


**Table 5.** Selected hyperparameters for grid search and cross-validation procedures.
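The procedure described above can be sketched with scikit-learn's `GridSearchCV`, which performs the exhaustive cross-validated search and refits the best combination on the full training set. The classifier (`SVC`), dataset, and parameter grid below are illustrative placeholders only, not the hyperparameters of Table 5:

```python
# Sketch of the grid-search + cross-validation procedure (illustrative only;
# the actual classifiers and hyperparameter grids are those listed in Table 5).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Hypothetical grid: every combination of these values is evaluated.
param_grid = {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}

search = GridSearchCV(SVC(), param_grid, cv=5, scoring="accuracy")
search.fit(X_train, y_train)          # exhaustive CV over all combinations

best_clf = search.best_estimator_     # best combination, refit on training set
test_accuracy = best_clf.score(X_test, y_test)  # prediction (testing) accuracy
print(search.best_params_, round(test_accuracy, 3))
```

The same pattern applies to any estimator: only the estimator object and the `param_grid` dictionary change per classifier.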
