*3.4. Feature Selection Methods*

Feature selection (also called variable or attribute selection) plays an essential role in classification problems. It reduces the number of attributes by excluding irrelevant and redundant ones, yielding a lower-complexity model (see Figure 4). Simpler and faster models with fewer variables are desirable in machine learning. Feature selection is also an essential guard against overfitting: overfitting happens when a model learns details and noise introduced by too many variables, and such a model does not generalize well when presented with new data.

In this research, several feature selection methods are used: principal component analysis (PCA), particle swarm optimization (PSO), evolutionary search, genetic search, best-first search, and the variance inflation factor (VIF).
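As one concrete illustration of these methods, the VIF of a feature can be computed from an ordinary-least-squares regression of that feature on the remaining ones, VIF_j = 1 / (1 − R_j²). The sketch below (plain NumPy; the function name and synthetic data are illustrative, not from the source) flags near-collinear columns, which are candidates for removal; a common rule of thumb treats VIF above 5 or 10 as problematic.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (n_samples, n_features).

    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on the remaining columns (with an intercept term).
    """
    n, p = X.shape
    vifs = np.empty(p)
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])      # design matrix with intercept
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # OLS fit of column j on the rest
        resid = y - A @ coef
        r2 = 1.0 - resid.var() / y.var()               # coefficient of determination
        vifs[j] = 1.0 / (1.0 - r2)
    return vifs

# Illustrative synthetic data: x3 is nearly collinear with x1.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
x3 = x1 + 0.05 * rng.normal(size=200)
X = np.column_stack([x1, x2, x3])
print(vif(X))  # columns 0 and 2 get large VIFs; column 1 stays near 1
```

In a feature selection loop, one would repeatedly drop the column with the highest VIF and recompute until all values fall below the chosen threshold.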
