#### *2.1. Many-Objective Optimization*

Multi-objective optimization [20–23] falls into the field of multiple-criteria decision-making. It optimizes all objectives simultaneously to obtain the optimal solution; therefore, a multi-objective optimization problem (MOP) yields a set of solutions rather than a single one. Generally, a MOP is an optimization problem with two or three objectives, while a many-objective optimization problem (MaOP) is an optimization problem [24–27] with four or more objectives. In recent years, many researchers have used multi-objective optimization methods to solve practical problems [28–31], such as scheduling [32,33], planning [34–36], fault diagnosis [37–39], classification [40,41], test-sheet composition [42], object extraction [43], variable reduction [44], and virtual machine placement [45]. Multi-objective evolutionary algorithms (MOEAs), such as the non-dominated sorting GA [46], multi-objective particle swarm optimization (MOPSO) [47–49], NSGA-II [50], NSGA-III [51,52], decomposition-based MOEA [53], and their improved versions [54–56], are the most widely used solvers.

In many-objective optimization, a minimization problem optimizes all objectives simultaneously by minimizing them to obtain the greatest overall benefit. Mathematically, this is embodied in the minimization of the objective functions, that is, making all objective values as small as possible. In this paper, we use the minimization optimization model to carry out seed scheduling. The minimization optimization problem is defined below.

$$\begin{cases} \text{Min } \mathbf{F}(\mathbf{x}) = [f_1(\mathbf{x}), f_2(\mathbf{x}), \dots, f_m(\mathbf{x})]^T \\ \text{s.t. } m > 3 \\ \mathbf{x} \in X \subseteq \mathbb{R}^n \end{cases} \tag{1}$$

where *F*(*x*) is the objective vector, *f*<sub>*i*</sub>(*x*) is the *i*-th objective to be minimized, *x* = (*x*<sub>1</sub>, ··· , *x*<sub>*n*</sub>) is a vector of *n* decision variables, *X* is the *n*-dimensional decision space, and *m* denotes the number of objectives to be optimized.
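To make the model in Equation (1) concrete, a MaOP objective vector can be sketched as a simple Python function. The four objective definitions below are purely illustrative assumptions (they do not come from this paper); they only show the shape of an *m* = 4 > 3 minimization problem.

```python
# Hypothetical four-objective minimization problem (m = 4 > 3),
# matching the form of Equation (1): F(x) = [f1(x), ..., fm(x)]^T.
def F(x):
    """Evaluate the objective vector for a decision vector x in R^n."""
    f1 = sum(xi ** 2 for xi in x)          # squared Euclidean norm
    f2 = sum((xi - 1) ** 2 for xi in x)    # squared distance to (1, ..., 1)
    f3 = sum(abs(xi) for xi in x)          # L1 norm
    f4 = max(abs(xi) for xi in x)          # L-infinity norm
    return [f1, f2, f3, f4]

print(F([0.5, -0.5]))  # [0.5, 2.5, 1.0, 0.5]
```

A solver would then search the decision space *X* for vectors whose objective vectors *F*(*x*) are not dominated by any other candidate.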

**Definition 1** (Pareto Dominance [57])**.** *Given any two decision vectors x, y* ∈ *X with M objectives for the minimization optimization, if f*<sub>*m*</sub>(*x*) ≤ *f*<sub>*m*</sub>(*y*) *for all m* = 1, 2, ··· , *M, and f*<sub>*m*</sub>(*x*) < *f*<sub>*m*</sub>(*y*) *for at least one m, then x dominates y, which is denoted as x* ≺ *y.*

**Definition 2** (Pareto Optimal [57])**.** *Assuming that x*<sup>∗</sup> ∈ *X, if there is no solution x* ∈ *X satisfying x* ≺ *x*<sup>∗</sup>*, then x*<sup>∗</sup> *is a Pareto optimal solution.*

**Definition 3** (Pareto Optimal Set [57])**.** *All the Pareto optimal solutions constitute the Pareto optimal set (PS).*

**Definition 4** (Pareto Front [57])**.** *All the objective vectors of the solutions in the Pareto optimal set constitute the Pareto front (PF).*
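Definitions 2–4 suggest a simple way to extract the non-dominated vectors from a finite candidate set, a common building block of MOEAs. The example points below are hypothetical, chosen only to illustrate the filter.

```python
def dominates(fx, fy):
    """Standard Pareto dominance for minimization."""
    return (all(a <= b for a, b in zip(fx, fy))
            and any(a < b for a, b in zip(fx, fy)))

def pareto_front(objs):
    """Keep only the objective vectors not dominated by any other vector;
    for a finite sample this approximates the Pareto front (PF)."""
    return [f for f in objs if not any(dominates(g, f) for g in objs)]

points = [[1, 4], [2, 2], [3, 1], [3, 3], [4, 4]]
print(pareto_front(points))  # [[1, 4], [2, 2], [3, 1]]
```

Here [3, 3] and [4, 4] are discarded because [2, 2] dominates both, while the three surviving vectors are mutually incomparable.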

Figure 1 shows a solution distribution in a two-dimensional objective space, where each point represents a solution. For a minimization problem, point A is smaller than point C in both objectives, so there is a dominance relationship between them: point C is dominated by point A. For points A and B, point A is greater than point B on the *f*<sub>2</sub> axis but less than point B on the *f*<sub>1</sub> axis, so there is no dominance relationship between points A and B.

**Figure 1.** Solutions in a two-dimensional objective space.

#### *2.2. Coverage-Based Greybox Fuzzing*

CGF is an evolutionary algorithm with two stages: a static analysis stage and a fuzzing loop stage. In the static analysis stage, it performs compile-time or dynamic binary instrumentation to obtain the instrumented target program. In the fuzzing loop stage, CGF takes a set of user-provided initial seeds as inputs and maintains a seed queue stored in the seed pool. CGF first selects a saved seed input from the queue and mutates it to generate a new input using mutation strategies. Next, the target program is executed with the new input, and a lightweight instrumentation technique is used to gather coverage information. If the new input causes a crash, it is marked and added to the crash set; if the new input leads to new coverage, CGF judges it interesting and adds it to the seed pool. Algorithm 1 shows the workflow of CGF in the fuzzing loop stage.
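The fuzzing loop described above can be sketched in Python. This is a minimal illustration only: the mutation operator, the target execution, and the coverage map are stand-in stubs, not the actual implementation of AFL or any specific fuzzer.

```python
import random

def mutate(seed):
    """Flip one random byte (one of many possible mutation strategies)."""
    if not seed:
        return bytes([random.randrange(256)])
    i = random.randrange(len(seed))
    return seed[:i] + bytes([seed[i] ^ 0xFF]) + seed[i + 1:]

def run_target(data):
    """Stub for executing the instrumented target program;
    returns (coverage_set, crashed). Real CGF gathers this via instrumentation."""
    cov = {len(data) % 8, (data[0] % 8) if data else 0}
    return cov, False

def fuzz_loop(initial_seeds, iterations=100):
    queue = list(initial_seeds)        # seed queue in the seed pool
    global_cov, crashes = set(), []
    for _ in range(iterations):
        seed = random.choice(queue)    # 1. select a saved seed input
        new_input = mutate(seed)       # 2. mutate to generate a new input
        cov, crashed = run_target(new_input)  # 3. execute the target
        if crashed:
            crashes.append(new_input)  # 4a. crash -> add to crash set
        elif not cov <= global_cov:    # 4b. new coverage -> interesting
            queue.append(new_input)
            global_cov |= cov
    return queue, crashes

queue, crashes = fuzz_loop([b"seed"], iterations=50)
print(len(queue) >= 1, crashes == [])  # True True
```

The key feedback signal is in step 4b: only inputs that extend the global coverage map are retained, which is what steers the evolutionary search toward unexplored program paths.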

