*3.2. PSO for Optimal Parameters*

PSO is a form of evolutionary computation whose basic idea is to find the optimal solution through cooperation and information sharing among the individuals of a group. It mimics a bird in a flock by modeling a massless particle with just two properties: a velocity, which describes how fast and in which direction it moves, and a position. Each particle searches the space independently and records the best solution it has found as its current individual extremum; each particle shares this extremum with the rest of the swarm, and the best of all individual extrema becomes the current global optimum of the whole swarm. Every particle then adjusts its velocity and position according to both its own current individual extremum and the global optimum shared by the swarm [14].
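The search loop described above can be sketched as follows. This is a minimal illustration on a toy sphere function, not the paper's implementation; the inertia weight w = 0.8 and the coefficients c1 = 1.5, c2 = 1.7 are illustrative choices loosely following the settings listed in the next subsection.

```python
import numpy as np

def pso(objective, dim, n_particles=20, max_gen=200,
        w=0.8, c1=1.5, c2=1.7, k=0.6, bounds=(-5.0, 5.0), seed=0):
    """Minimal PSO (minimization): each particle tracks its individual
    extremum (pbest); the best pbest is the swarm's global optimum (gbest)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))         # positions
    v = k * rng.uniform(-1.0, 1.0, (n_particles, dim))  # velocities
    pbest = x.copy()                                    # individual extrema
    pbest_val = np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()            # global optimum
    for _ in range(max_gen):
        r1, r2 = rng.random((2, n_particles, dim))
        # velocity update: inertia + pull toward pbest + pull toward gbest
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                      # position update
        vals = np.array([objective(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())

best_x, best_f = pso(lambda p: float(np.sum(p ** 2)), dim=2)
```

On this convex toy objective the swarm converges quickly; in practice the fitness would instead be the SVM cross-validation accuracy, as described next.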

PSO-SVM Parameter Settings

*C*1: initial value 1.5; acceleration coefficient governing the local search capability of PSO.

*C*2: initial value 1.7; acceleration coefficient governing the global search capability of PSO.

Maxgen: initial value 200; the maximum number of evolutionary generations.

Sizepop: initial value 20; the maximum size of the population.

K: initial value 0.6 (*K* ∈ [0.1, 1.0]); the scaling factor relating velocity to position (*V* = *KX*).

WV: initial value 1 (*wV* ideally in [0.8, 1.2]); the elasticity coefficient preceding the velocity term in the velocity-update formula.

WP: initial value 1; the elasticity coefficient preceding the velocity term in the population (position) update formula.

V: initial value 5; the number of folds used for SVM cross-validation.

Popcmax: initial value 100; the maximum value that the SVM parameter *C* may take.

Popcmin: initial value 0.1; the minimum value that the SVM parameter *C* may take.

Popgmax: initial value 1000; the maximum value that the SVM parameter *g* may take.

Popgmin: initial value 0.01; the minimum value that the SVM parameter *g* may take.
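As a sketch of how these settings can drive the search, the following uses scikit-learn's `SVC` with an RBF kernel and *V*-fold cross-validation accuracy as the particle fitness. The dataset (iris) and the reduced generation count are illustrative assumptions, not the paper's transformer data or a full maxgen = 200 run.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Settings from the list above
C1, C2, SIZEPOP, K, WV = 1.5, 1.7, 20, 0.6, 1.0
V_FOLDS = 5
BOUNDS = np.array([[0.1, 100.0],     # popcmin..popcmax for C
                   [0.01, 1000.0]])  # popgmin..popgmax for g (gamma)

X, y = load_iris(return_X_y=True)    # stand-in for the transformer dataset

def fitness(p):
    """Mean V-fold cross-validation accuracy of an RBF SVM with (C, g) = p."""
    return cross_val_score(SVC(C=p[0], gamma=p[1]), X, y, cv=V_FOLDS).mean()

rng = np.random.default_rng(0)
lo, hi = BOUNDS[:, 0], BOUNDS[:, 1]
pos = rng.uniform(lo, hi, (SIZEPOP, 2))
vel = K * rng.uniform(-1.0, 1.0, (SIZEPOP, 2)) * (hi - lo)
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(10):  # few generations for illustration; maxgen = 200 in full runs
    r1, r2 = rng.random((2, SIZEPOP, 2))
    vel = WV * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)  # keep (C, g) inside popc/popg bounds
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

best_C, best_g = gbest
```

Note that fitness is maximized here (accuracy), so the update keeps the particle with the higher cross-validation score as its individual extremum.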

#### **4. Feature Selection Algorithms for Main Transformer Condition**

This chapter introduces the unsupervised mutual-information filter method used for feature ranking in feature selection. The relevance of each feature is computed first, the importance of each feature is then evaluated by a forward sequential search, and finally an ordered feature sequence is output.
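The text does not specify how the unsupervised relevance is computed. The sketch below assumes a plain empirical estimator of mutual information for discrete features, scores each feature by its average mutual information with the remaining features, and lets the forward sequential step reduce to sorting by that fixed score; the data and the scoring choice are illustrative assumptions, not the paper's method or transformer data.

```python
import numpy as np

def mutual_information(a, b):
    """I(A; B) in bits for discrete arrays, from empirical joint frequencies."""
    a_vals, a_idx = np.unique(a, return_inverse=True)
    b_vals, b_idx = np.unique(b, return_inverse=True)
    joint = np.zeros((len(a_vals), len(b_vals)))
    np.add.at(joint, (a_idx, b_idx), 1)
    joint /= joint.sum()
    pa, pb = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (pa @ pb)[nz])).sum())

def rank_features(X):
    """Unsupervised MI filter: score each feature by its average mutual
    information with the other features; output an ordered feature sequence."""
    n = X.shape[1]
    scores = [np.mean([mutual_information(X[:, j], X[:, k])
                       for k in range(n) if k != j]) for j in range(n)]
    return list(np.argsort(scores)[::-1]), scores

# Toy data: two related binary features plus one independent noise feature.
rng = np.random.default_rng(1)
base = rng.integers(0, 2, 400)
noisy = np.where(rng.random(400) < 0.8, base, 1 - base)  # 80% copy of base
noise = rng.integers(0, 2, 400)                          # unrelated feature
order, scores = rank_features(np.column_stack([base, noisy, noise]))
```

On this toy data the two related features share high mutual information and rank ahead of the independent noise feature, which lands last in the ordered sequence.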
