After determining how the quality score is calculated, this paper proposes an intelligent algorithm to minimize that score. This article describes the algorithm from three aspects: the selection of the intelligent algorithm, its components, and its optimization.
3.1. Selection of an Intelligent Algorithm
Algorithm selection is the first step in algorithm design. The basic principle of optimization-algorithm selection is that the algorithm must fit the characteristics of the problem, and the factor with the greatest impact on the performance of an intelligent optimization algorithm is the distribution of the solutions. Therefore, the optimization algorithm can be selected by examining the distribution density function of the objective values [18].
This article randomly selects a piece of raw fish data. The raw material is divided into 10 small pieces, with a score of ≤0.05 for each small piece as the standard; the ideal range of the total score is ≤0.5. We sample the feasible region uniformly 10,000 times, divide the value range of the objective function into a series of cells, and count the frequency with which objective values fall into each cell. The statistical results are shown in Figure 8.
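The sampling procedure above can be sketched as follows. This is a minimal illustration, not the authors' code: the toy objective (a sum of per-piece deviations) and the sampler of feasible solutions are stand-ins for the actual cutting-score model.

```python
import random
from collections import Counter

def estimate_density(objective, sample, n_samples=10_000, n_bins=20):
    """Estimate the distribution of objective values by uniform sampling.

    objective: maps a feasible solution to its quality score.
    sample:    draws one feasible solution uniformly at random.
    """
    values = [objective(sample()) for _ in range(n_samples)]
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0
    # Assign each sampled value to one of n_bins cells of the value range.
    counts = Counter(min(int((v - lo) / width), n_bins - 1) for v in values)
    # Relative frequency per cell of the objective's value range.
    return [counts[b] / n_samples for b in range(n_bins)]

# Toy stand-in for the cutting score: sum of per-piece deviations
# over 10 small pieces.
random.seed(0)
freqs = estimate_density(lambda x: sum(abs(v - 0.5) for v in x),
                         lambda: [random.random() for _ in range(10)])
```

The shape of `freqs` (how quickly the frequency decays toward the ends of the value range) is what informs the choice of optimization algorithm.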
In Figure 8, the probability of the optimal solution drops rapidly at both ends of the value range. According to the statistical results of the knowledge base in [14], for problems with this type of distribution density, the simulated annealing (SA) algorithm offers higher solution accuracy and faster convergence to the optimized solution. Therefore, to solve the precise fish-body cutting problem, this paper develops a more effective simulated annealing algorithm.
The basic principle of the SA algorithm is to optimize the parameters, starting from an initial solution, until a final solution satisfies the termination condition or a target value of the objective function. Owing to the irregular shape of the fish body and other factors, the problem has a large number of local optima. This poses a major challenge to the algorithm's search ability: it must escape local optima in time to approach the global optimum. Therefore, this paper uses an information-guidance strategy to improve the simulated annealing algorithm.
3.2. Cutting Algorithm
Having selected the SA algorithm, this section explains its basic principles and the optimizations made for the problem in this article. Given an initial temperature T, the algorithm comprises four main steps: determination of the initial solution, generation of a new solution, the Metropolis criterion, and the cooling criterion. The calculation steps of the SA algorithm are as follows (Algorithm 1):
Algorithm 1: SA algorithm
Input: number of iterations iter; initial temperature T; current solution; inner loop
Output: best solution
iter = 0
initialise T
stop criterion = maximum number of iterations
initialise current solution
current cost = Evaluate(current solution)
while not stop criterion do
    while inner loop do
        Neighbour = Generate(current solution)
        Neighbour cost = Evaluate(Neighbour)
        if Accept(current cost, Neighbour cost, T)
            current solution = Neighbour
            current cost = Neighbour cost
        end
        Update(best solution, iter)
    end
    Update(T)
    Update(stop criterion)
end
return best solution
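Algorithm 1 can be sketched as a runnable loop. This is a generic SA skeleton following the four steps above, not the paper's implementation: the toy one-dimensional quadratic objective, the neighbour generator, and the parameter values are illustrative assumptions.

```python
import math
import random

def simulated_annealing(evaluate, generate, initial_solution,
                        T=100.0, q=0.95, T_end=1e-3, inner_loop=50):
    """Plain SA loop: initial solution, new-solution generation,
    Metropolis criterion, and cooling."""
    current = initial_solution
    current_cost = evaluate(current)
    best, best_cost = current, current_cost
    while T > T_end:                          # stop criterion (here: end temperature)
        for _ in range(inner_loop):           # inner loop at constant T
            neighbour = generate(current)
            neighbour_cost = evaluate(neighbour)
            df = neighbour_cost - current_cost
            # Metropolis criterion: always accept improvements; accept
            # worse moves with probability exp(-df / T).
            if df < 0 or random.random() < math.exp(-df / T):
                current, current_cost = neighbour, neighbour_cost
            if current_cost < best_cost:      # track the best solution found
                best, best_cost = current, current_cost
        T *= q                                # cooling: T = qT
    return best, best_cost

# Toy usage: minimise a 1-D quadratic with minimum at x = 3.
random.seed(0)
sol, cost = simulated_annealing(
    evaluate=lambda x: (x - 3.0) ** 2,
    generate=lambda x: x + random.uniform(-0.5, 0.5),
    initial_solution=0.0)
```

The `Generate` step is where the information-guidance strategy described below replaces the plain random neighbour used in this sketch.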
Initial solution
Since the cuts must be continuously distributed along the fish body, the starting position of the initial solution must first be determined; this is done by the algorithm's preprocessing strategy. After the starting position is fixed, the initial solution is encoded as a real-valued vector of size 2n + 1, that is, X = [x1, x2, …, x2n+1]. The first n elements are the lengths of the small pieces, and the (n + 1)-th to (2n + 1)-th elements are the cutting angles on the front and back sides of each small piece, so X can also be written as [length(1)…length(n), angle(1)…angle(n + 1)].
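The encoding can be illustrated with a small constructor. The even split and the 90° default cut angle are illustrative assumptions, not the paper's preprocessing strategy:

```python
# Hypothetical encoding of a candidate cut plan for n small pieces:
# X = [length(1)..length(n), angle(1)..angle(n+1)], size 2n + 1.
def make_initial_solution(n, total_length, default_angle=90.0):
    """Even split of the fish body into n pieces with vertical cuts."""
    lengths = [total_length / n] * n          # first n entries: piece lengths
    angles = [default_angle] * (n + 1)        # next n + 1 entries: cut angles
    return lengths + angles

x = make_initial_solution(n=10, total_length=50.0)
```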
Generation of new solutions
In a standard intelligent algorithm, the pure mutation operation is undirected and therefore inefficient; since this cutting problem has many local optima, the search easily falls into one of them. Exploiting the knowledge accumulated during the search can therefore improve the search performance of the algorithm.
This paper uses the information-guided simulated annealing algorithm [19], taking the change trend of the solution over two adjacent iterations as the next search direction for each individual. For example, for the optimization problem min f(x), let the population size be N and the k-th generation individuals be X1(k), X2(k), …, XN(k). A vector D(i) = Xi(k) − Xi(k−1) is introduced to record the next search direction of individual i. Let G(i) denote the number of generations that individual i has survived without improvement. According to Rule I, the survival count G and the next-step search direction vector D of each individual are recorded and updated at every step.
Rule I: If f(Xi(k)) < f(Xi(k−1)),
then G(i) = 1 and D(i) = Xi(k) − Xi(k−1);
otherwise, G(i) = G(i) + 1, and D(i) is left unchanged.
Rule II: If G(i) = 1,
then Xi(k+1) = Xi(k) + D(i) + ε, i.e., the individual keeps the recorded search direction with a small random perturbation ε;
otherwise, the individual moves one step randomly. The step length is related to the individual's survival count and the value of the objective function.
According to Rule I and Rule II, if the current solution performs better than that of the previous generation, a new solution is obtained by applying a random perturbation while maintaining the search direction; if the current solution performs worse than the previous generation, a new solution is obtained by a purely random disturbance; and if the search falls into a local optimum, the amplitude of the disturbance increases as the individual's survival count grows, helping the search escape the local optimum.
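One information-guided move, as described above, can be sketched as follows. This is a simplified reading of the rules, not the paper's exact update: the Gaussian perturbation and the `step` scale are illustrative assumptions.

```python
import random

def guided_step(x_prev, x_curr, f_prev, f_curr, G, step=0.1):
    """One information-guided move (a sketch of Rules I and II).

    If the last move improved the objective, keep searching along the
    recorded direction D = x_curr - x_prev with a small perturbation;
    otherwise move randomly, with an amplitude that grows with the
    survival count G to help escape local optima.
    """
    if f_curr < f_prev:                       # Rule I: improvement
        G = 1
        d = [c - p for c, p in zip(x_curr, x_prev)]
    else:
        G += 1                                # survived another generation
        d = None
    if G == 1:                                # Rule II: follow direction D
        new = [c + di + random.gauss(0, step)
               for c, di in zip(x_curr, d)]
    else:                                     # random move, amplitude grows with G
        new = [c + random.gauss(0, step * G) for c in x_curr]
    return new, G
```

In the SA loop, `guided_step` would replace the plain random neighbour generation, with the repair step below applied to its output.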
The algorithm can produce infeasible solutions during the search, so the feasibility of each newly generated solution must be checked and adjusted. The method used in this paper is: if a variable exceeds the feasible range, it is reset based on the boundary value to obtain a new feasible solution.
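A minimal repair of this kind, assuming the reset simply clamps each variable to its nearest bound (the paper only states that the adjustment is based on the boundary value):

```python
def repair(solution, lower, upper):
    """Clamp each variable back into its feasible range [lower_i, upper_i].

    Any component that leaves the feasible region is reset to the
    nearest boundary value.
    """
    return [min(max(v, lo), hi)
            for v, lo, hi in zip(solution, lower, upper)]

repaired = repair([-0.2, 0.5, 1.7], [0.0, 0.0, 0.0], [1.0, 1.0, 1.0])
```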
In the cutting stage, let f(S) be the fitness function, let the fitness of the current solution ret1 be f(S1), and let the fitness of the new solution ret2 generated from ret1 be f(S2). According to the Metropolis criterion, if df = f(S2) − f(S1) < 0, the new solution ret2 is better than the current solution ret1, and ret1 is replaced with ret2; otherwise, the new solution is accepted with probability exp(−df/T).
The Metropolis criterion is:
P = 1, if df < 0; P = exp(−df/T), if df ≥ 0.
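The acceptance decision can be written as a single predicate; this is the standard Metropolis test, shown here in isolation:

```python
import math
import random

def accept(current_cost, neighbour_cost, T):
    """Metropolis criterion: accept improvements outright, and accept
    worse solutions with probability exp(-df / T)."""
    df = neighbour_cost - current_cost
    return df < 0 or random.random() < math.exp(-df / T)
```

At high temperature, worse moves are accepted often, which drives exploration; as T falls, the test degenerates into pure hill climbing.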
Cooling uses the cooling rate q, that is, T = qT. In each cycle, if T falls below the end temperature, the iteration stops and the current state is output; otherwise the iteration continues.
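Since T = qT gives T_k = T0 · q^k after k cooling steps, the number of outer-loop cycles follows directly from the start and end temperatures; the parameter values below are illustrative, not the paper's settings:

```python
import math

def cooling_steps(T0, T_end, q):
    """Number of cooling steps k such that T0 * q**k first drops below T_end."""
    return math.ceil(math.log(T_end / T0) / math.log(q))

# Example: T0 = 100, end temperature 1e-3, cooling rate q = 0.95.
k = cooling_steps(T0=100.0, T_end=1e-3, q=0.95)
```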
This section described the calculation process of the simulated annealing algorithm and the corresponding optimization strategies. Next, the algorithm is combined with actual data to solve the multi-objective optimization problem of this article, and its effect is analyzed.