*3.1. Employed Algorithms*

The authors used a specific kind of optimization method that falls into the category of metaheuristic bio-inspired algorithms. The primary goal of an optimization algorithm is to minimize or maximize an objective function, also known as a fitness function, by modifying the so-called decision variables: in this case, the components of the *k* vector, which are bound by established constraints [42]. Although heuristic techniques cannot always guarantee the optimal result, they aim to produce satisfactory solutions, or solutions that are at least close to the optimal outcome, at a reasonable processing cost. MSAs use a variety of strategies to identify effective solutions to optimization problems, starting from a large population of acceptable candidates, while making few assumptions about the problem being optimized [43]. Furthermore, the algorithms are bio-inspired, since their underlying mechanisms are based on biological processes. In fact, natural adaptation may be viewed as a type of optimization. In this work, the authors focused on evolutionary algorithms (EAs) and swarm intelligence (SI) algorithms. Following a careful literature review, as reported in Section 2, we selected three different MSAs:


#### 3.1.1. Evolutionary Algorithms

Evolutionary algorithms draw their inspiration from natural evolutionary behavior. They are described by a few key characteristics and parameters:

• Population: The solution "pool", which is initialized at the start of the process;


Individuals serve as a representation of the different solutions, and a score, known as the fitness value, is assigned to each of them by means of Equation (4), calculated by analyzing the phenotype, or set of traits, of the individual in question.
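The initialization and scoring of a population can be sketched as follows. This is a minimal illustration, not the authors' implementation: the paper's actual fitness is given by its Equation (4), which is not reproduced here, so a placeholder sphere function and illustrative bounds are used instead.

```python
import numpy as np

rng = np.random.default_rng(0)

N, D = 20, 4              # population size, decision-vector dimension
lower, upper = -5.0, 5.0  # illustrative bound constraints (assumed values)

def fitness(x):
    # Placeholder objective (sphere function); the paper's actual
    # fitness function is its Equation (4), not reproduced here.
    return float(np.sum(x**2))

# The solution "pool": N random individuals within the bounds
population = rng.uniform(lower, upper, size=(N, D))
scores = np.array([fitness(ind) for ind in population])
best = population[np.argmin(scores)]  # minimization: lower score is better
```

Each row of `population` is one individual (one candidate *k* vector), and `scores` holds the fitness value assigned to each.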

**Differential Evolution.** One of the most famous EAs is the differential evolution (DE) strategy [43,44], which is also one of the algorithms tested in this work. The main concept of this algorithm follows genetic principles.

Figure 2 shows the logical process behind this optimization algorithm. The process starts with a step called mutation: three individuals (three vectors in this case) are selected randomly from the population, and a fourth vector is created by calculating the difference between two of them (hence the evolution is differential), multiplying it by a mutation factor, and then adding the third one. The recombination or crossover phase begins at this stage, when the mutated parameters are combined with those of the so-called target vector to produce the trial vector. The trial and target vectors are compared during the selection phase. If the trial vector achieves a better fitness value than the target, it replaces the target in the next generation; otherwise, it is discarded. In this way, the vector with the best fitness value is maintained in the next generation. The pseudo-code for this algorithm is reported in Figure 3.

Genetic algorithms (GAs) are another popular EA optimization method. In [23,45], a detailed investigation of the implementation of such algorithms on comparable challenges for PHM techniques was performed. However, in previous tests, the overall GA performance (considering the same metrics employed in this work) was inferior to that of other metaheuristic optimization methods [38].

**Figure 2.** A schematic representation of the Differential Evolution Algorithm.

```
START
\\ Parameters definition
    Set the population dimension N
    Set the vector k dimension D
    Set the mutation factor F in [0,2]
    Set the crossover factor C in [0,1]
\\ Initialise
    t=0 %generation counter
    Create N x D random individuals
    WHILE (stopping criterion not met)
        FOR (i=1:N)
        \\ Mutation
            Random choice of three distinct vectors x1, x2, x3
            v = x1 + F*(x2 - x3)
        \\ Crossover
            Random extraction of an index y between 1 and D
            %the trial vector takes at least one component of the v vector
            FOR z=1:D
                Random extraction of a number w between 0 and 1
                IF w<C or z=y
                    u(i,z) = v(z)
                ELSE
                    u(i,z) = x(i,z)
                END IF
            END FOR
        \\ Selection
            IF f(u(i)) < f(x(i))
                x(i, t+1) = u(i)
            ELSE
                x(i, t+1) = x(i, t)
            END IF
        END FOR
        t = t+1
    END WHILE
END
```

**Figure 3.** Pseudo-code for DE algorithm, as taken from [41].
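The steps above can be sketched as a runnable routine. This is a minimal DE/rand/1/bin implementation following the pseudo-code, not the authors' code: the population size, mutation factor, crossover factor, and generation budget are illustrative assumptions, and a fixed number of generations stands in for the unspecified stopping criterion.

```python
import numpy as np

def differential_evolution(f, bounds, N=30, F=0.8, C=0.9, generations=200, seed=0):
    """Minimal DE sketch (minimization) following the pseudo-code in Figure 3.

    f      : fitness function to minimize
    bounds : list of (lower, upper) pairs, one per decision variable
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    D = len(lo)
    x = rng.uniform(lo, hi, size=(N, D))       # random initial population
    fx = np.array([f(xi) for xi in x])
    for _ in range(generations):               # stopping criterion: fixed budget
        for i in range(N):
            # Mutation: three distinct vectors, none equal to the target
            idx = rng.choice([j for j in range(N) if j != i], size=3, replace=False)
            x1, x2, x3 = x[idx]
            v = np.clip(x1 + F * (x2 - x3), lo, hi)
            # Crossover: the trial takes at least one component from v
            y = rng.integers(D)
            mask = rng.random(D) < C
            mask[y] = True
            u = np.where(mask, v, x[i])
            # Selection: keep the trial only if it improves the fitness
            fu = f(u)
            if fu < fx[i]:
                x[i], fx[i] = u, fu
    best = np.argmin(fx)
    return x[best], fx[best]
```

For example, minimizing a simple sphere function `f(k) = sum(k**2)` over three bounded decision variables drives the best fitness close to zero within the default budget.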
