Article

Apple Image Recognition Multi-Objective Method Based on the Adaptive Harmony Search Algorithm with Simulation and Creation

1 College of Information Science and Technology, Gansu Agricultural University, Lanzhou 730070, China
2 School of Electronic and Information Engineering, Lanzhou Jiaotong University, Lanzhou 730070, China
* Author to whom correspondence should be addressed.
Information 2018, 9(7), 180; https://doi.org/10.3390/info9070180
Submission received: 29 June 2018 / Revised: 17 July 2018 / Accepted: 18 July 2018 / Published: 20 July 2018

Abstract:
To address the poor recognition of apple images captured in natural scenes, and the problems of the OTSU algorithm, namely its single threshold, lack of adaptability, susceptibility to noise interference, and over-segmentation, an apple image recognition multi-objective method based on the adaptive harmony search algorithm with simulation and creation is proposed in this paper. The new adaptive harmony search algorithm with simulation and creation expands the search space to maintain the diversity of the solutions and accelerates the convergence of the algorithm. In the search process, the harmony tone simulation operator makes each harmony tone evolve toward the optimal harmony individual to ensure the global search ability of the algorithm. When this evolution brings no improvement, the harmony tone creation operator moves each harmony tone away from the current optimal harmony individual, extending the search space and maintaining the diversity of solutions. The adaptive factor of the harmony tone restrains the random searching of the two operators to accelerate the convergence of the algorithm. The multi-objective optimization recognition method transforms the recognition of apple images collected in natural scenes into a multi-objective optimization problem and uses the new adaptive harmony search algorithm with simulation and creation as the image threshold search strategy, with the maximum class variance and maximum entropy chosen as the objective functions. Compared with the HS, IHS, GHS, and SGHS algorithms, the experimental results show that the improved algorithm has a higher convergence speed and accuracy, and maintains good performance with high-dimensional problems and a large harmony memory. The proposed multi-objective optimization recognition method obtains a set of non-dominated threshold solutions, which offers more flexibility in threshold selection than the OTSU algorithm. The selected threshold adapts better to the image and yields good segmentation results.

1. Introduction

The harmony search (HS) algorithm [1] is a swarm intelligence algorithm that simulates the process by which musicians repeatedly tune the pitch of each instrument from memory to achieve a pleasing harmony. It has been successfully applied to many engineering problems. Like other swarm intelligence algorithms, the HS algorithm suffers from premature and stagnant convergence [2,3]. In recent years, scholars have made many improvements to the optimization performance of the HS algorithm. For example, the probabilistic idea of the estimation of distribution algorithm is applied in [2] to design an adaptive adjustment strategy and improve the search ability of the algorithm. In [3], the HS algorithm and the genetic algorithm (GA) are combined: a better initial solution is obtained through multiple iterations of the genetic algorithm, and the harmony search algorithm is then used to further search for possible solutions in the adjacent region. Zhu et al. linearly combine the optimal harmony vector with two harmony vectors randomly selected from the population to generate a new harmony and expand the local search area [4]. Wu et al. use a competitive elimination mechanism to update the harmony memory and accelerate the survival of the fittest, and redesign the two key parameters of the HS algorithm to overcome the shortcomings of fixed parameters [5]. These two parameters have also been improved from other perspectives, yielding different improved harmony search algorithms [6,7,8]. The studies in [9,10] apply improved HS algorithms to multi-objective optimization problems.
Scholars have conducted a large number of basic studies on the HS algorithm and proposed improvements that have achieved good results in applications. Simulated annealing and a genetic algorithm are used to optimally adjust the weights of an aggregation mechanism in [11]. An innovative tuning approach for fuzzy control systems (CSs) with reduced parametric sensitivity using the grey wolf optimizer (GWO) algorithm was proposed in [12], where the GWO algorithm is used to solve optimization problems whose objective functions include the output sensitivity functions. Echo state networks (ESNs) are a special form of recurrent neural networks (RNNs) that allow for the black-box modeling of nonlinear dynamical systems; the HS algorithm was proposed for training an echo state network in an online manner and shows good performance [13]. The design of Takagi–Sugeno fuzzy controllers in state feedback form using swarm intelligence optimization algorithms was proposed in [14], where particle swarm optimization, simulated annealing, and gravitational search algorithms are adopted. In terms of image applications of HS, the following studies have been conducted. A novel approach to visual tracking called the harmony filter was presented, which models the target as a color histogram and searches for the best estimated target location using the Bhattacharyya coefficient as a fitness metric [15]. In [16], a new dynamic clustering algorithm based on the hybridization of harmony search (HS) and fuzzy c-means was presented to automatically segment MRI brain images in an intelligent manner. Improvements to HS for circle detection were proposed that enable the algorithm to robustly find multiple circles in larger data sets and still work on realistic images heavily corrupted by noisy edges [17]. A new approach to estimating the vanishing point using the HS algorithm was proposed in [18]. On the theoretical background of HS, the following studies have been conducted. To handle realistic design situations, a novel derivative for discrete design variables based on the harmony search algorithm was proposed in [19]. The study in [20] presents a simple mathematical analysis of the explorative search behavior of the HS algorithm. The comparison of the final designs of conventional methods and metaheuristic-based structural optimization methods was discussed in [21]. Regarding adaptive improvements of HS, the following studies have been conducted. An adaptive harmony search algorithm was proposed for structural optimization problems that adjusts the key parameters automatically during the search for the most efficient optimization process [22]. A novel technique to eliminate tedious and experience-requiring parameter assignment was proposed for the harmony search algorithm in [23]. Pan et al. present a self-adaptive global best harmony search (SGHS) algorithm for solving continuous optimization problems [24].
Recognition of apple images is one of the most basic steps in computerized apple grading [25]. Apple image recognition refers to separating the apple fruit from branches, leaves, soil, and sky, i.e., image segmentation [26]. The threshold segmentation method is an extremely important and widely used image segmentation method [27]. Among such methods, the maximum class variance (OTSU) method [28] and the maximum entropy method [29] are the two most commonly used, and both have defects to different degrees [30]. The classical OTSU algorithm cannot obtain an ideal segmentation effect for complex images of natural scenes [30]. Although the OTSU algorithm solves the problem of threshold selection, it lacks adaptability, is easily affected by noise interference and over-segmentation, and requires a great deal of computing time. Therefore, the classical OTSU algorithm still needs further improvement. In order to comprehensively utilize the advantages of the two segmentation methods, a multi-objective optimization formulation is introduced. In recent years, the HS algorithm has made some progress in solving multi-objective optimization problems, such as the single-machine scheduling problem [10], the wireless sensor network routing problem [31], the urban one-way traffic organization optimization problem [32], and the distribution network model problem [33]. In [34], a phenomenon-mimicking algorithm, harmony search, was employed to perform a bi-objective trade-off; the harmony search algorithm explored only a small fraction of the total solution space to solve this combinatorial optimization problem. A multi-objective optimization model based on the harmony search algorithm (HS) was presented to minimize the life cycle cost (LCC) and carbon dioxide equivalent (CO2-eq) emissions of buildings [35]. A novel fuzzy-based velocity reliability index using fuzzy theory and harmony search was proposed, which is maximized while the design cost is simultaneously minimized [36]. These studies are closely integrated with specific application issues and cannot easily be extended to other application areas.
This paper proposes an apple image recognition multi-objective method based on the adaptive harmony search algorithm with simulation and creation (ARMM-AHS). The method transforms the segmentation of natural scene images into a multi-objective optimization problem, taking the maximum class variance and maximum entropy as the two objective functions. A new adaptive harmony search algorithm with simulation and creation (SC-AHS) is proposed as the image threshold search strategy to achieve natural scene apple image segmentation. The results show that the SC-AHS algorithm improves the global search ability and solution diversity of the HS algorithm and has a clear advantage in convergence speed. The ARMM-AHS method can effectively segment apple images.

2. Theory Basis

2.1. Threshold Segmentation Principle of the OTSU Algorithm

The OTSU algorithm is a global threshold selection method. The basic idea is to use the grayscale histogram of the image to dynamically determine the segmentation threshold that maximizes the variance between the target and the background.
Suppose that the gray level of an apple image is $L$, the gray value of each pixel is $0, 1, \ldots, L-1$, the number of pixels with gray value $i$ is $W_i$, the total number of pixels in the image is $W = \sum_{i=0}^{L-1} W_i$, and the probability of gray value $i$ is $p_i = W_i / W$.
Let the threshold $t$ divide the image into two classes $C_0$ and $C_1$ (target and background), that is, $C_0$ and $C_1$ correspond to the pixels with gray levels $\{0, 1, \ldots, t\}$ and $\{t+1, t+2, \ldots, L-1\}$, respectively. The variance between the two classes is as shown in Equation (1):
$\sigma_B^2 = \omega_0 (\mu_0 - \mu_T)^2 + \omega_1 (\mu_1 - \mu_T)^2$
The optimal threshold $t^*$ should maximize the variance between the classes, which is an optimization problem, as shown in Equation (2):
$t^* = \arg\max_{0 \le t \le L-1} \{\sigma_B^2(t)\}$
Among them, $\omega_0$ and $\omega_1$ represent the probabilities of occurrence of class $C_0$ and class $C_1$, respectively; $\mu_0$ and $\mu_1$ represent the average gray values of class $C_0$ and class $C_1$, respectively; and $\mu_T$ represents the average gray level of the entire image. Therefore, the larger the value of $\sigma_B^2$, the better the selected threshold.
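As a minimal sketch (not the authors' C++/Matlab implementation), the following Python function exhaustively searches the candidate thresholds of a grayscale image and returns the one that maximizes the between-class variance of Equations (1) and (2); the function and variable names are assumptions made for illustration.

import numpy as np

def otsu_threshold(gray_img, L=256):
    """Exhaustively search t in [0, L-2] for the maximum between-class variance (Eqs. (1)-(2))."""
    hist, _ = np.histogram(gray_img, bins=L, range=(0, L))
    p = hist.astype(float) / hist.sum()          # gray-level probabilities p_i = W_i / W
    levels = np.arange(L)
    mu_T = np.sum(levels * p)                    # mean gray level of the whole image
    best_t, best_var = 0, -1.0
    for t in range(L - 1):
        omega0 = p[:t + 1].sum()                 # probability of class C0
        omega1 = 1.0 - omega0                    # probability of class C1
        if omega0 < 1e-12 or omega1 < 1e-12:
            continue                             # skip degenerate splits
        mu0 = np.sum(levels[:t + 1] * p[:t + 1]) / omega0
        mu1 = np.sum(levels[t + 1:] * p[t + 1:]) / omega1
        sigma_B2 = omega0 * (mu0 - mu_T) ** 2 + omega1 * (mu1 - mu_T) ** 2   # Eq. (1)
        if sigma_B2 > best_var:
            best_var, best_t = sigma_B2, t
    return best_t

For an 8-bit grayscale apple image loaded as a numpy array, otsu_threshold(img) returns an integer threshold in [0, 254].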

2.2. Maximum Entropy Segmentation Principle

For the same image, let us assume that the threshold $t$ divides the image into two classes $C_0$ and $C_1$. The probability distribution of class $C_0$ is $\{\frac{p_0}{P_t}, \frac{p_1}{P_t}, \ldots, \frac{p_t}{P_t}\}$, and the probability distribution of class $C_1$ is $\{\frac{p_{t+1}}{1-P_t}, \frac{p_{t+2}}{1-P_t}, \ldots, \frac{p_{L-1}}{1-P_t}\}$, where $P_t = \sum_{i=0}^{t} p_i$, $0 \le t \le L-1$.
The entropy formula is as shown in Equation (3):
$H(t) = -\sum_{i=0}^{t} \frac{p_i}{P_t} \ln \frac{p_i}{P_t} - \sum_{i=t+1}^{L-1} \frac{p_i}{1-P_t} \ln \frac{p_i}{1-P_t}$
The optimal threshold should maximize the entropy, which is also an optimization problem, as shown in Equation (4).
$t^* = \arg\max_{0 \le t \le L-1} \{H(t)\}$
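Analogously, a hedged Python sketch of the maximum entropy criterion of Equations (3) and (4); the names are again illustrative, and small epsilon guards are added only to avoid taking the logarithm of zero.

import numpy as np

def max_entropy_threshold(gray_img, L=256, eps=1e-12):
    """Exhaustively search t for the maximum total entropy H(t) (Eqs. (3)-(4))."""
    hist, _ = np.histogram(gray_img, bins=L, range=(0, L))
    p = hist.astype(float) / hist.sum()
    best_t, best_H = 0, -np.inf
    for t in range(L - 1):
        P_t = p[:t + 1].sum()
        if P_t < eps or P_t > 1.0 - eps:
            continue                             # skip degenerate splits
        q0 = p[:t + 1] / P_t                     # class C0 distribution
        q1 = p[t + 1:] / (1.0 - P_t)             # class C1 distribution
        H = (-np.sum(q0[q0 > eps] * np.log(q0[q0 > eps]))
             - np.sum(q1[q1 > eps] * np.log(q1[q1 > eps])))   # Eq. (3)
        if H > best_H:
            best_H, best_t = H, t
    return best_t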

3. Adaptive Harmony Search Algorithm with Simulation and Creation

Inertial particles and fuzzy reasoning were introduced into the bird swarm optimization algorithm (BSA) in [37] to help foraging birds jump out of local optima and enhance the global optimization ability. The fast shuffled frog leaping algorithm (FSFLA) proposed in [38] makes each frog individual evolve toward the best individual, which accelerates individual evolution and improves the search speed of the algorithm. Following the no-free-lunch theorem [39], this paper fuses the mechanisms of these two different types of optimization algorithms and proposes an adaptive harmony search algorithm with simulation and creation (SC-AHS) to improve the optimization performance of the HS algorithm.

3.1. Theory Basis

3.1.1. Adaptive Factor of Harmony Tone

Definition 1 (Adaptive factor of harmony tone).
Randomly generate a harmony memory of size HMS, and in each creation of the harmony memory determine the global optimal harmony $x^g = (x_1^g, x_2^g, \ldots, x_N^g)$, the current optimal harmony $x^{now} = (x_1^{now}, x_2^{now}, \ldots, x_N^{now})$, and the global worst harmony $x^w = (x_1^w, x_2^w, \ldots, x_N^w)$. The adaptive factor of the harmony tone, $\xi_i$, is defined as follows and is used to suppress randomness during the simulation and creation of harmony tones:
$\xi_i = \frac{\sum_{i=1}^{N} r_i x_i^{now}}{\sum_{i=1}^{N} r_i}, \quad i = 1, 2, \ldots, N, \qquad r_i = \frac{f(x_i^g) - f(x_i^w)}{f(x_i^{now})}$

3.1.2. Harmony Tone Simulation Operator

Definition 2 (Harmony tone simulation operator).
In one harmony evolution, each tone component $x_i^j$, $i = 1, 2, \ldots, N$ of each harmony $x^j$, $j = 1, 2, \ldots, HMS$ in the harmony memory adaptively evolves through the harmony tone simulation operator, as follows, to make the harmony individuals evolve toward the optimal harmony individual:
$x_i^{j,new} = x_i^{j,old} + \xi_i (x_i^g - x_i^{j,old})$

3.1.3. Harmony Tone Creation Operator

Definition 3 (Harmony tone creation operator).
In one harmony evolution, each tone component $x_i^j$, $i = 1, 2, \ldots, N$ of each harmony $x^j$, $j = 1, 2, \ldots, HMS$ in the harmony memory adaptively evolves through the harmony tone creation operator, as follows, to move the harmony individuals away from the current optimal harmony individual:
$x_i^{j,new} = x_i^{j,old} - \xi_i (x_i^{now} - x_i^{j,old})$

3.2. Adaptive Evolution Theorem with Simulation and Creation

Theorem 1 (Adaptive evolution theorem with simulation and creation).
In one harmony evolution, each tone component $x_i^j$, $i = 1, 2, \ldots, N$ of each harmony $x^j$, $j = 1, 2, \ldots, HMS$ in the harmony memory adaptively evolves through the harmony tone simulation operator according to Equation (6). If $f(x^{j,new}) < f(x^{j,old})$, replace $x_i^{j,old}$ with $x_i^{j,new}$. Otherwise, the harmony evolves through the harmony tone creation operator according to Equation (7). If $f(x^{j,new}) \ge f(x^{j,old})$ still holds, Equation (8) is used to randomly generate the harmony tone:
$x_i^{j,new} = LB_i + r(UB_i - LB_i)$

3.3. SC-AHS Algorithm Process

Algorithm 1. SC-AHS
Input: the maximum number of evolutions $T_{max}$, the size of the harmony memory HMS, and the number of tone components N.
Output: the global optimal harmony $x^g$.
Step 1: Randomly generate HMS initial harmony individuals, denoted $X_{HMS} = \{x^j \mid x^j = (x_1^j, x_2^j, \ldots, x_N^j), \; j = 1, 2, \ldots, HMS\}$. The harmony memory is initialized according to Equation (1). The fitness of $x^j$ is $f(x^j)$.
Step 2: Update each tone component $x_i^j$, $i = 1, 2, \ldots, N$ of each harmony $x^j$, $j = 1, 2, \ldots, HMS$:
  Begin
  Sort the harmonies by fitness $f(x^j)$ in descending order to determine $x_i^g$, $x_i^{now}$, and $x_i^w$.
  For ($x^j$, $j = 1, 2, \ldots, HMS$)
  Begin
    Calculate $\xi_i$ according to Equation (5).
    For ($x_i^j$, $i = 1, 2, \ldots, N$)
      Update $x_i^{j,old}$ according to Equation (6) to perform harmony tone simulation.
    End
    If ($f(x^{j,new}) < f(x^{j,old})$)
      For ($x_i^j$, $i = 1, 2, \ldots, N$)
        $x_i^{j,old} = x_i^{j,new}$
      End
    Else
      For ($x_i^j$, $i = 1, 2, \ldots, N$)
        Update $x_i^{j,old}$ according to Equation (7) to perform harmony tone creation.
      End
      If ($f(x^{j,new}) < f(x^{j,old})$)
        For ($x_i^j$, $i = 1, 2, \ldots, N$)
          $x_i^{j,old} = x_i^{j,new}$
        End
      Else
        For ($x_i^j$, $i = 1, 2, \ldots, N$)
          Randomly generate the harmony tone $x_i^{j,new}$ according to Equation (8).
        End
      End
    End
  End
  End
Step 3: Determine whether the number of evolutions has reached $T_{max}$. If not, go to Step 2. Otherwise, the algorithm stops and outputs $x^g$.
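For readers who prefer an executable form, the following Python sketch gives one possible reading of Algorithm 1 for a generic minimization problem. It is an illustration only, not the authors' reference code: the adaptive factor is replaced by a simplified scalar clipped to [0, 1] because the exact normalization of Equation (5) is ambiguous in the extracted text, the global and current optimal harmonies are taken to be the same memory-best individual, and the default parameter values are toy settings rather than the experimental ones.

import numpy as np

def sc_ahs(fitness, LB, UB, N=30, HMS=50, T_max=2000, seed=0):
    """Sketch of the SC-AHS loop for minimization (one reading of Algorithm 1)."""
    rng = np.random.default_rng(seed)
    X = LB + rng.random((HMS, N)) * (UB - LB)        # Step 1: random harmony memory
    f = np.array([fitness(x) for x in X])

    for _ in range(T_max):                           # Step 2: one harmony evolution per pass
        best, worst = np.argmin(f), np.argmax(f)
        x_g = X[best].copy()                         # memory-best, used here as both x^g and x^now
        # Simplified stand-in for the adaptive factor of Eq. (5): a scalar in [0, 1] (assumption)
        xi = min(1.0, abs(f[best] - f[worst]) / (abs(f[best]) + 1e-12))
        for j in range(HMS):
            x_old, f_old = X[j].copy(), f[j]
            x_sim = np.clip(x_old + xi * (x_g - x_old), LB, UB)   # harmony tone simulation, Eq. (6)
            f_sim = fitness(x_sim)
            if f_sim < f_old:
                X[j], f[j] = x_sim, f_sim
                continue
            x_cre = np.clip(x_old - xi * (x_g - x_old), LB, UB)   # harmony tone creation, Eq. (7)
            f_cre = fitness(x_cre)
            if f_cre < f_old:
                X[j], f[j] = x_cre, f_cre
            else:                                    # random regeneration, Eq. (8)
                x_rnd = LB + rng.random(N) * (UB - LB)
                X[j], f[j] = x_rnd, fitness(x_rnd)
    return X[np.argmin(f)], float(f.min())           # Step 3: output the best harmony found

# Example (toy settings, not the paper's): minimize the Sphere function on [-100, 100]^30
# x_best, f_best = sc_ahs(lambda x: np.sum(x ** 2), LB=-100.0, UB=100.0)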

3.4. Efficiency Analysis of SC-AHS

The efficiency analysis of the algorithm shows that the time complexity of the HS algorithm is $O(T_{max} \times N)$ and its space complexity is $O(HMS \times N)$, while the time complexity of the SC-AHS algorithm is $O(T_{max} \times N \times HMS)$ and its space complexity is $O(HMS \times N)$. Since $T_{max} \gg HMS$, the execution time and storage space of the SC-AHS algorithm and the HS algorithm are of the same order of magnitude. This shows that the SC-AHS algorithm does not increase the computational overhead and space.

4. Apple Image Recognition Multi-Objective Method Based on an Adaptive Harmony Search Algorithm with Simulation and Creation

4.1. Mathematical Problem Description of the Multi-Objective Optimization Recognition Method

Suppose that the gray level of an apple image is $L$ and the gray value of each pixel is $0, 1, \ldots, L-1$. There are two classes to be distinguished in the apple image, and the candidate harmony thresholds are written as $x^j$, $j = 1, 2, \ldots, HMS$, $0 \le x^j \le L-1$. The process of seeking thresholds for segmenting the apple image can be regarded as a multi-objective optimization problem, in which the maximum class variance and maximum entropy are taken as the two objective functions and the proposed SC-AHS algorithm is used as the image threshold search strategy. Taking the minimization form as an example, the multi-objective optimization problem can be defined as follows:
$\min f(x) = [f_1(x), f_2(x)]$
$\text{s.t. } x^j = (x_1^j, x_2^j, \ldots, x_N^j) \in X \subseteq R^N, \; N = 8$
$X = \{x^j, j = 1, 2, \ldots, HMS \mid 0 \le x^j \le L-1\}$
$f_1(x^*) = \min_{0 \le x^j \le L-1} \{-\sigma_B^2(x^j)\}$
$f_2(x^*) = \min_{0 \le x^j \le L-1} \{-H(x^j)\}$
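A minimal sketch, assuming the gray-level probability vector p has been computed as in Section 2, of how the two criteria can be evaluated as minimization objectives for a single candidate threshold; the negation converts the maximization criteria of Equations (2) and (4) into the minimization form of Equation (9), and the helper name objectives() is an illustrative assumption.

import numpy as np

def objectives(p, t):
    """Return [f1, f2] = [-sigma_B^2(t), -H(t)] for threshold t (minimization form of Eq. (9)).
    p is the gray-level probability vector (p_i = W_i / W)."""
    L = len(p)
    levels = np.arange(L)
    omega0 = p[:t + 1].sum()                      # P_t, probability of class C0
    omega1 = 1.0 - omega0
    if omega0 < 1e-12 or omega1 < 1e-12:
        return np.array([np.inf, np.inf])         # degenerate split: dominated by any valid one
    mu_T = np.sum(levels * p)
    mu0 = np.sum(levels[:t + 1] * p[:t + 1]) / omega0
    mu1 = np.sum(levels[t + 1:] * p[t + 1:]) / omega1
    sigma_B2 = omega0 * (mu0 - mu_T) ** 2 + omega1 * (mu1 - mu_T) ** 2     # Eq. (1)
    q0, q1 = p[:t + 1] / omega0, p[t + 1:] / omega1
    H = (-np.sum(q0[q0 > 0] * np.log(q0[q0 > 0]))
         - np.sum(q1[q1 > 0] * np.log(q1[q1 > 0])))                        # Eq. (3)
    return np.array([-sigma_B2, -H])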

4.2. Process of the Apple Image Recognition Multi-Objective Method Based on an Adaptive Harmony Search Algorithm with Simulation and Creation

Step 1: Read and preprocess the apple image. Count the gray levels $L$ of the image and the number of corresponding pixels $W_i$ to calculate the probability $p_i$ of each gray level.
Step 2: Enter the maximum number of evolutions $T_{max}$, the size of the harmony memory HMS, and the number of tone components N.
Step 2.1: Randomly generate HMS initial harmony thresholds, denoted $X = \{x^j \mid x^j = (x_1^j, x_2^j, \ldots, x_N^j), \; j = 1, 2, \ldots, HMS\}$. Each harmony threshold is binary coded, and $f_1(X)$ and $f_2(X)$ are calculated according to Equation (9) to initialize the harmony memory. The fitness of the harmony threshold $x^j$ is given by $f_1(x^j)$ and $f_2(x^j)$.
Step 2.2: Construct the Pareto optimal solution set, and denote the harmony threshold deleted from the optimal solution set as $x^{w,old}$ (a sketch of this Pareto filtering is given after the process description below).
Step 2.3: Each tone component $x_i^{w,old}$, $i = 1, 2, \ldots, N$ of the harmony threshold $x^{w,old}$ is updated according to the steps of the SC-AHS algorithm.
Step 2.4: Determine whether the number of evolutions has reached $T_{max}$. If not, go to Step 2.2. Otherwise, the algorithm stops, and the optimal harmony threshold solution set is output and denoted as $x^*$.
Step 3: Use the optimal harmony threshold solution set $x^*$ to segment the apple image and output the segmented image.
The apple image recognition multi-objective method based on an adaptive harmony search algorithm with simulation and creation process is shown in Figure 1.
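Step 2.2 amounts to filtering candidate harmony thresholds by Pareto dominance. The hedged sketch below performs that filtering for two minimization objectives; it expects precomputed objective vectors (for example from the objectives() helper sketched above), and the function names are illustrative assumptions.

import numpy as np

def dominates(a, b):
    """a dominates b (minimization): no worse in every objective and strictly better in at least one."""
    return bool(np.all(a <= b) and np.any(a < b))

def pareto_thresholds(thresholds, F):
    """Keep the non-dominated thresholds; F holds one [f1, f2] row per candidate threshold."""
    keep = [i for i, fi in enumerate(F)
            if not any(dominates(fj, fi) for j, fj in enumerate(F) if j != i)]
    return [thresholds[i] for i in keep]

# Example: candidate thresholds 0..255 scored by an objective matrix F (assumed precomputed)
# front = pareto_thresholds(list(range(256)), F)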

5. Experiment Design and Results Analysis

5.1. Experiment Design

The experimental samples were mature HuaNiu apples. A series of HuaNiu apple image segmentation experiments was conducted under six conditions in a natural scene: direct sunlight with strong, medium, and weak illumination, and backlighting with strong, medium, and weak illumination. The experiment contains two parts. First, benchmark functions are used to test the optimization performance of the SC-AHS algorithm. Second, the ARMM-AHS method is used to segment and identify apple images. The experiments were run on a Lenovo (Beijing, China) notebook computer with an Intel Core i5 CPU at 2.6 GHz and 4.0 GB of memory. The test platform was Microsoft Visual C++ 6.0 and Matlab 2012.

5.2. Benchmark Function Comparison Experiment and Result Analysis

Five benchmark functions, namely Sphere, Rastrigin, Griewank, Ackley, and Rosenbrock, were used to evaluate the fitness of the harmony individuals. The parameters of each function are shown in Table 1 [40].
$f_1(x) = \sum_{i=1}^{N} x_i^2$
$f_2(x) = \sum_{i=1}^{N} [x_i^2 - 10\cos(2\pi x_i) + 10]$
$f_3(x) = \frac{1}{4000}\sum_{i=1}^{N} x_i^2 - \prod_{i=1}^{N} \cos\left(\frac{x_i}{\sqrt{i}}\right) + 1$
$f_4(x) = -20\exp\left[-0.2\sqrt{\frac{1}{N}\sum_{i=1}^{N} x_i^2}\right] - \exp\left[\frac{1}{N}\sum_{i=1}^{N}\cos(2\pi x_i)\right] + 20 + e$
$f_5(x) = \sum_{i=1}^{N} [100(x_{i+1} - x_i^2)^2 + (1 - x_i)^2]$
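For reference, hedged Python versions of the five benchmark functions as written above (their standard forms, following [40]); they can be passed directly to an optimizer such as the sc_ahs() sketch given earlier.

import numpy as np

def sphere(x):       # f1, unimodal
    return np.sum(x ** 2)

def rastrigin(x):    # f2, multimodal
    return np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x) + 10)

def griewank(x):     # f3, multimodal
    i = np.arange(1, len(x) + 1)
    return np.sum(x ** 2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1

def ackley(x):       # f4, multimodal
    n = len(x)
    return (-20 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20 + np.e)

def rosenbrock(x):   # f5, unimodal
    return np.sum(100 * (x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2)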

5.2.1. Fixed Parameters Experiment

The fixed parameters experiment uses a minimum value optimization simulation to compare and test the optimization performance of the HS, Improved Harmony Search (IHS) [7], Global-best Harmony Search (GHS) [41], Self-adaptive Global best Harmony Search (SGHS) [24], and SC-AHS algorithms.
In the fixed parameters experiment, the parameters were set as follows: HMS = 200, HMCR = 0.9, PAR = 0.3, BW = 0.01, $T_{max}$ = 20,000, and N = 30, where HMCR is the harmony memory considering rate, PAR is the pitch adjusting rate, BW is the pitch adjusting bandwidth, and $T_{max}$ is the number of evolutions. In the IHS algorithm, the minimum pitch adjusting rate is PARmin = 0.01, the maximum pitch adjusting rate is PARmax = 0.99, the minimum pitch adjusting bandwidth is BWmin = 0.0001, and the maximum pitch adjusting bandwidth is BWmax = $1/(20 \times (UB - LB))$ [7]. In the GHS algorithm, PARmin = 0.01 and PARmax = 0.99 [41]. In the SGHS algorithm, the initial mean of the harmony memory considering rate is HMCRm = 0.98 with standard deviation HMCRs = 0.01, and the harmony memory considering rate is normally distributed within [0.9, 1.0]; the initial mean of the pitch adjusting rate is PARm = 0.9 with standard deviation PARs = 0.05, and the pitch adjusting rate is normally distributed within [0.0, 1.0]; the learning period is LP = 100 [24]; the minimum pitch adjusting bandwidth is BWmin = 0.0001 and the maximum is BWmax = $1/(20 \times (UB - LB))$ [7]. The Upper Bound (UB) is the maximum value of the algorithm's search range, and the Lower Bound (LB) is the minimum value. In the SC-AHS algorithm, $\alpha_{init} = 0.1$ and $\alpha_{final} = 1.2$. The final test results were averaged over 30 independent runs. The performance of the algorithms was evaluated in two ways: (1) by analyzing the convergence accuracy and speed under a fixed number of evolutions; and (2) by comparing the evolution curves of the average optimal value of each function.
The convergence accuracy of each algorithm is shown in Table 2. For the unimodal Sphere (f1) and Rosenbrock (f5) functions, the data in Table 2 show that the SC-AHS algorithm reaches the theoretical minimum of 0 for both functions, indicating very high convergence accuracy, while the convergence speed and accuracy of the other algorithms are lower than those of the SC-AHS algorithm. For the multimodal Rastrigin (f2) function, Table 2 shows that the SC-AHS algorithm reaches the theoretical minimum of 0, again indicating high convergence accuracy, whereas the other algorithms do not reach the theoretical minimum. For the multimodal Griewank (f3) and Ackley (f4) functions, the data in Table 2 show that the GHS algorithm reaches the theoretical minimum of 0 at about 5000 iterations in the Griewank (f3) optimization, and its convergence accuracy in the Ackley (f4) optimization is also better than that of the other algorithms; for these two functions, the SC-AHS algorithm converges faster, and its convergence accuracy is better than that of all the algorithms except GHS.
Comparing the minimum values obtained by the HS, IHS [7], GHS [41], SGHS [24], and SC-AHS algorithms with fixed parameters, the results show that for the unimodal Sphere (f1) and Rosenbrock (f5) functions and the multimodal Rastrigin (f2) function, the SC-AHS algorithm performs better than the other algorithms. For the multimodal Griewank (f3) and Ackley (f4) functions, the SC-AHS algorithm converges faster than the other algorithms, and its convergence accuracy is better than that of all the algorithms except GHS. For each function, compared with the HS algorithm, the convergence accuracy and speed of the SC-AHS algorithm are significantly improved. Compared with the other algorithms, the SC-AHS algorithm has the advantage of fast convergence.

5.2.2. Dynamic Parameters Experiment

The dynamic parameters experiment uses two test methods: varying the number of harmony tone components and varying the harmony memory size, HMS. Under dynamic parameters, the optimization performance of the HS, IHS [7], GHS [41], SGHS [24], and SC-AHS algorithms is compared. The average optimal value of each function, the relative rate of change [42], and the average change were used as measures of algorithm performance.

Test of Average Optimum with Different Harmony Tone Components of Five Functions

The number of harmony tone components corresponds to the dimension of the function. High-dimensional, multimodal, complex optimization problems can easily cause local optima and premature convergence. For this reason, it is necessary to adjust the dimension of the benchmark functions (i.e., the number of tone components) and test the optimization performance of each algorithm in high dimensions. The number of tone components ranges from 50 to 100, and the other parameters are unchanged. The effect of the number of tone components on the optimization performance of each algorithm is shown in Table 3. The variation of the average optimal value of each function with the number of tone components is shown in Figure 2, Figure 3, Figure 4, Figure 5 and Figure 6. In Table 3:
Relative rate of change = |average optimal value for 50 tone components − average optimal value for 100 tone components| / (average optimal value for 50 tone components) × 100%
Average change = (sum of the average optimal values over the six numbers of tone components) / 6 × 100%
It can be seen that for the unimodal Sphere (f1) and Rosenbrock (f5) functions, the average optimal value of each algorithm tends to increase as the number of tone components increases, whereas that of the SC-AHS algorithm shows a declining trend, and the relative rate of change of the SC-AHS algorithm in Table 3 is also the lowest. For the multimodal Rastrigin (f2), Griewank (f3), and Ackley (f4) functions, the average optimal value of the SC-AHS algorithm varies unstably, but its average change in Table 3 is the lowest among the algorithms. Therefore, the convergence accuracy of the SC-AHS algorithm does not decrease significantly as the number of tone components increases; that is, the algorithm still converges well in high-dimensional cases.

Test of the Average Optimum with Different Harmony Memory Sizes of Five Functions

The population diversity is affected by changes in the size of the harmony memory, HMS. The HMS values tested are 20, 50, 100, 150, and 200, with the other parameters unchanged. The effect of the harmony memory size on the optimization performance of each algorithm is shown in Table 4. The variation of the average optimal value of each function with the harmony memory size is shown in Figure 7, Figure 8, Figure 9, Figure 10 and Figure 11.
In Table 4, there is:
Relative rate of change = |average optimal value for a harmony memory of 20 − average optimal value for a harmony memory of 200| / (average optimal value for a harmony memory of 20) × 100%
Average change = (sum of the average optimal values over the five harmony memory sizes) / 5 × 100%
It can be seen that for the unimodal Sphere (f1) and Rosenbrock (f5) functions and the multimodal Rastrigin (f2) function, the average optimal value of each algorithm varies with the size of the harmony memory, and the SC-AHS algorithm shows a clear downward trend. The relative rate of change of the SC-AHS algorithm in Table 4 is higher than that of the other algorithms, indicating that as the harmony memory grows, the diversity of the population is maintained and the convergence accuracy improves. For the multimodal Griewank (f3) and Ackley (f4) functions, the average optimal value of the SC-AHS algorithm is not stable, but its average change in Table 4 is the lowest among the algorithms, and its relative rate of change is higher than that of the other algorithms. Therefore, the convergence accuracy of the SC-AHS algorithm increases with the size of the harmony memory, and it converges better than the other algorithms.

5.2.3. Test of SC-AHS Algorithm with Dynamic Parameters Experiment

This experiment mainly verifies the fast convergence of the SC-AHS algorithm. It again uses dynamic parameters with two test methods: varying the number of harmony tone components and varying the harmony memory size, HMS. The effect of these parameter changes on the convergence speed of the SC-AHS algorithm is studied, with the average optimal value of each function and the number of evolutions used as performance measures.

Test of Average Optimum with Different Harmony Tone Components of the SC-AHS Algorithm

The number of harmony tone components, N, ranges from 50 to 100, with the other parameters unchanged. The effect of the number of tone components on the convergence rate of the SC-AHS algorithm is shown in Table 5, and the variation of the average optimal value of each function with the number of tone components is shown in Figure 12. It can be seen that, as the number of tone components increases, there is no apparent upward trend in the average optimal value of the functions. Table 5 shows that the number of evolutions of the SC-AHS algorithm is less than 20,000 for the unimodal Sphere (f1) and Rosenbrock (f5) functions, as well as the multimodal Griewank (f3) function, regardless of the number of tone components, indicating that the SC-AHS algorithm still converges quickly in high-dimensional cases.

Test of Average Optimum with Different Harmony Memory Sizes of the SC-AHS Algorithm

The HMS values are again 20, 50, 100, 150, and 200, and the other parameter settings are unchanged. The effect of the harmony memory size on the convergence rate of the SC-AHS algorithm is shown in Table 6, and the variation of the average optimal value of each function with the harmony memory size is shown in Figure 13. It can be seen that, as the harmony memory grows, the average optimal value of each function shows a clear downward trend. This is because a larger harmony memory increases the diversity of the population and the search performance of the algorithm. Table 6 shows that the number of SC-AHS evolutions is lower than 20,000 for the unimodal Sphere (f1) and Rosenbrock (f5) functions and the multimodal Rastrigin (f2) and Griewank (f3) functions. This shows that the convergence speed of the SC-AHS algorithm increases when the harmony memory is large.

5.3. Apple Image Segmentation Experiment

The experiment uses a single apple image as input and takes the ARMM-AHS method proposed in this paper as the image threshold search strategy to perform apple image segmentation experiments. The maximum class variance function and the maximum entropy function are used as the two objective fitness functions of the multi-objective optimization method. The main interface of the apple image segmentation program developed with the Matlab 2012 GUI is shown in Figure 14.
Five segmentation indicators based on the image recognition rate are defined in this paper: the segmentation rate (RSeg), the fruit segmentation rate (RApp), the branch segmentation rate (RBra), the segmentation success rate (RSuc), and the missing rate (RMis). The five indicators are defined in Equations (19)–(23), where $N_{to}$ is the total number of segmented targets (including fruits and branches), $N_{po}$ is the total number of targets obtained by manual inspection (including fruits and branches), $N_a$ is the total number of segmented fruits, $N_{pa}$ is the total number of fruits obtained by manual inspection, and $N_l$ is the total number of segmented branches. Owing to illumination conditions and interference factors, a fruit is counted only if its visible size is greater than 1/4 of a complete fruit, a branch is counted only if its visible length is greater than 1/4 of a complete branch, and fruits that appear too small in the image are not counted.
$R_{Seg} = \frac{N_{to}}{N_{po}} \times 100\%$
$R_{App} = \frac{N_a}{N_{to}} \times 100\%$
$R_{Bra} = \frac{N_l}{N_{to}} \times 100\% = 100\% - R_{App} = \frac{N_{to} - N_a}{N_{to}} \times 100\%$
$R_{Suc} = \frac{N_a}{N_{pa}} \times 100\%$
$R_{Mis} = \frac{N_{pa} - N_a}{N_{pa}} \times 100\%$
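As a small worked example of Equations (19)–(23), the following sketch computes the five indicators from the counts defined above; the function name and argument order are illustrative assumptions.

def segmentation_rates(N_to, N_po, N_a, N_pa, N_l):
    """Return (RSeg, RApp, RBra, RSuc, RMis) in percent, per Equations (19)-(23)."""
    R_seg = N_to / N_po * 100.0            # segmentation rate
    R_app = N_a / N_to * 100.0             # fruit segmentation rate
    R_bra = N_l / N_to * 100.0             # branch segmentation rate
    R_suc = N_a / N_pa * 100.0             # segmentation success rate
    R_mis = (N_pa - N_a) / N_pa * 100.0    # missing rate
    return R_seg, R_app, R_bra, R_suc, R_mis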
Figure 15, Figure 16 and Figure 17 show comparisons of the image segmentation results under three illumination conditions: direct sunlight with strong illumination, backlighting with strong illumination, and backlighting with medium illumination. The figures show the original image, the preprocessed image, the Pareto optimal frontier fitness curve obtained with the SC-AHS optimization algorithm, the ARMM-AHS segmentation result, and the OTSU segmentation result. Figure 15, Figure 16 and Figure 17 show that the multi-objective optimization method obtains a set of non-dominated threshold solutions, which provides more opportunities for threshold selection. The thresholds of the non-dominated solution sets under the six illumination conditions obtained with the ARMM-AHS method are shown in Table 7; the numbers of non-dominated solutions differ across the six illumination conditions.
The comparison of the thresholds selected from the non-dominated solution sets under the six illumination conditions with the OTSU thresholds is shown in Table 8, and the apple image segmentation results under the six lighting conditions are shown in Table 9. The data in Table 9 show that the segmentation rate and segmentation success rate under direct sunlight with strong illumination and under backlighting with strong illumination both reach 100%. When the segmentation result includes branches, the branch segmentation rate and the missing rate are comparable. This shows that the branches are well segmented and the desired segmentation effect is achieved.

6. Conclusions

This paper proposes a new adaptive harmony search algorithm with simulation and creation, which expands the search space to maintain the diversity of the solutions and accelerates convergence compared with the HS, IHS, GHS, and SGHS algorithms. On this basis, an apple image recognition multi-objective method based on the adaptive harmony search algorithm with simulation and creation is proposed. The multi-objective optimization recognition method transforms the recognition of apple images collected in natural scenes into a multi-objective optimization problem and uses the new adaptive harmony search algorithm with simulation and creation as the image threshold search strategy. The proposed method obtains a set of non-dominated threshold solutions, which offers more flexibility in threshold selection than the OTSU algorithm. The selected threshold adapts better to the image and yields good segmentation results.

Author Contributions

L.L. conceived and designed the multi-objective method and the experiments, and wrote the paper; J.H. performed the experiments and analyzed the experiments. Both authors have read and approved the final manuscript.

Funding

This work is supported by the National Natural Science Foundation of China (grant Nos. 61741201 and 61462058).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zong, W.G.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68. [Google Scholar] [CrossRef]
  2. Ouyang, H.B.; Xia, H.G.; Wang, Q.; Ma, G. Discrete global harmony search algorithm for 0–1 knapsack problems. J. Guangzhou Univ. (Nat. Sci. Ed.) 2018, 17, 64–70. [Google Scholar]
  3. Xie, L.Z.; Chen, Y.J.; Kang, L.; Zhang, Q.; Liang, X.J. Design of MIMO radar orthogonal polyphone code based on genetic harmony algorithm. Electron. Opt. Control. 2018, 1, 1–7. [Google Scholar]
  4. Zhu, F.; Liu, J.S.; Xie, L.L. Local Search Technique Fusion of Harmony Search Algorithm. Comput. Eng. Des. 2017, 38, 1541–1546. [Google Scholar]
  5. Wu, C.L.; Huang, S.; Wang, Y.; Ji, Z.C. Improved Harmony Search Algorithm in Application of Vulcanization Workshop Scheduling. J. Syst. Simul. 2017, 29, 630–638. [Google Scholar]
  6. Peng, H.; Wang, Z.X. An Improved Adaptive Parameters Harmony Search Algorithm. Microelectron. Comput. 2016, 33, 38–41. [Google Scholar]
  7. Mahdavi, M.; Fesanghary, M.; Damangir, E. An improved harmony search algorithm for solving optimization problems. Appl. Math. Comput. 2007, 188, 1567–1579. [Google Scholar] [CrossRef]
  8. Han, H.Y.; Pan, Q.K.; Liang, J. Application of improved harmony search algorithm in function optimization. Comput. Eng. 2010, 36, 245–247. [Google Scholar]
  9. Zhang, T.; Xu, X.Q.; Ran, H.J. Multi-objective Optimal Allocation of FACTS Devices Based on Improved Differential Evolution Harmony Search Algorithm. Proc. CSEE 2018, 38, 727–734. [Google Scholar]
  10. Liu, L.; Liu, X.B. Improved Multi-objective Harmony Search Algorithm for Single Machine Scheduling in Sequence Dependent Setup Environment. Comput. Eng. Appl. 2013, 49, 25–29. [Google Scholar]
  11. Hosen, M.A.; Khosravi, A.; Nahavandi, S.; Creighton, D. Improving the quality of prediction intervals through optimal aggregation. IEEE Trans. Ind. Electron. 2015, 62, 4420–4429. [Google Scholar] [CrossRef]
  12. Precup, R.E.; David, R.C.; Petriu, E.M. Grey wolf optimizer algorithm-based tuning of fuzzy control systems with reduced parametric sensitivity. IEEE Trans. Ind. Electron. 2017, 64, 527–534. [Google Scholar] [CrossRef]
  13. Saadat, J.; Moallem, P.; Koofigar, H. Training echo state neural network using harmony search algorithm. Int. J. Artif. Intell. 2017, 15, 163–179. [Google Scholar]
  14. Vrkalovic, S.; Teban, T.A.; Borlea, L.D. Stable Takagi-Sugeno fuzzy control designed by optimization. Int. J. Artif. Intell. 2017, 15, 17–29. [Google Scholar]
  15. Fourie, J.; Mills, S.; Green, R. Harmony filter: A robust visual tracking system using the improved harmony search algorithm. Image Vis. Comput. 2010, 28, 1702–1716. [Google Scholar] [CrossRef]
  16. Alia, O.M.; Mandava, R.; Aziz, M.E. A hybrid harmony search algorithm for MRI brain segmentation. Evol. Intell. 2011, 4, 31–49. [Google Scholar] [CrossRef]
  17. Fourie, J. Robust circle detection using Harmony Search. J. Optim. 2017, 2017, 1–11. [Google Scholar] [CrossRef]
  18. Moon, Y.Y.; Zong, W.G.; Han, G.T. Vanishing point detection for self-driving car using harmony search algorithm. Swarm Evol. Comput. 2018, 41, 111–119. [Google Scholar] [CrossRef]
  19. Zong, W.G. Novel Derivative of Harmony Search Algorithm for Discrete Design Variables. Appl. Math. Comput. 2008, 199, 223–230. [Google Scholar]
  20. Das, S.; Mukhopadhyay, A.; Roy, A.; Abraham, A.; Panigrahi, B.K. Exploratory Power of the Harmony Search Algorithm: Analysis and Improvements for Global Numerical Optimization. IEEE Trans. Syst. Man Cybern. 2011, 41, 89–106. [Google Scholar] [CrossRef] [PubMed]
  21. Saka, M.P.; Hasançebi, O.; Zong, W.G. Metaheuristics in Structural Optimization and Discussions on Harmony Search Algorithm. Swarm Evol. Comput. 2016, 28, 88–97. [Google Scholar] [CrossRef]
  22. Hasançebi, O.; Erdal, F.; Saka, M.P. Adaptive Harmony Search Method for Structural Optimization. J. Struct. Eng. 2010, 136, 419–431. [Google Scholar] [CrossRef]
  23. Zong, W.G.; Sim, K.B. Parameter-Setting-Free Harmony Search Algorithm. Appl. Math. Comput. 2010, 217, 3881–3889. [Google Scholar]
  24. Pan, Q.K.; Suganthan, P.N.; Tasgetiren, M.F.; Liang, J.J. A self-adaptive global best harmony search algorithm for continuous optimization problems. Appl. Math. Comput. 2010, 216, 830–848. [Google Scholar] [CrossRef]
  25. Bao, X.M.; Wang, Y.M. Apple image segmentation based on the minimum error Bayes decision. Trans. Chin. Soc. Agric. Eng. 2006, 22, 122–124. [Google Scholar]
  26. Qian, J.P.; Yang, X.T.; Wu, X.M.; Chen, M.X.; Wu, B.G. Mature apple recognition based on hybrid color space in natural scene. Trans. Chin. Soc. Agric. Eng. 2012, 28, 137–142. [Google Scholar]
  27. Lu, B.B.; Jia, Z.H.; He, D. Remote-Sensing Image Segmentation Method based on Improved OTSU and Shuffled Frog-Leaping Algorithm. Comput. Appl. Softw. 2011, 28, 77–79. [Google Scholar]
  28. Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef]
  29. Kapur, J.N.; Sahoo, P.K.; Wong, A.K.C. A new method for gray-level picture thresholding using the entropy of the histogram. Comput. Vis. Graph. Image Process. 1985, 29, 273–285. [Google Scholar] [CrossRef]
  30. Zhao, F.; Zheng, Y.; Liu, H.Q.; Wang, J. Multi-population cooperation-based multi-objective evolutionary algorithm for adaptive thresholding image segmentation. Appl. Res. Comput. 2018, 35, 1858–1862. [Google Scholar]
  31. Li, M.; Cao, X.L.; Hu, W.J. Optimal multi-objective clustering routing protocol based on harmony search algorithm for wireless sensor networks. Chin. J. Sci. Instrum. 2014, 35, 162–168. [Google Scholar]
  32. Dong, J.S.; Wang, W.Z.; Liu, W.W. Urban One-way traffic optimization based on multi-objective harmony search approach. J. Univ. Shanghai Sci. Technol. 2014, 36, 141–146. [Google Scholar]
  33. Li, Z.; Zhang, Y.; Bao, Y.K.; Guo, C.X.; Wang, W.; Xie, Y.Z. Multi-objective distribution network reconfiguration based on system homogeneity. Power Syst. Prot. Control. 2016, 44, 69–75. [Google Scholar]
  34. Zong, W.G. Multiobjective Optimization of Time-Cost Trade-Off Using Harmony Search. J. Constr. Eng. Manag. 2010, 136, 711–716. [Google Scholar]
  35. Fesanghary, M.; Asadi, S.; Zong, W.G. Design of low-emission and energy-efficient residential buildings using a multi-objective optimization algorithm. Build. Environ. 2012, 49, 245–250. [Google Scholar] [CrossRef]
  36. Zong, W.G. Multiobjective optimization of water distribution networks using fuzzy theory and harmony search. Water 2015, 7, 3613–3625. [Google Scholar]
  37. Shi, X.D.; Gao, Y.L. Improved Bird Swarm Optimization Algorithm. J. Chongqing Univ. Technol. (Nat. Sci.) 2018, 4, 177–185. [Google Scholar]
  38. Wang, L.G.; Gong, Y.X. A Fast Shuffled Frog Leaping Algorithm. In Proceedings of the 9th International Conference on Natural Computation (ICNC 2013), Shenyang, China, 23–25 July 2013; pp. 369–373. [Google Scholar]
  39. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef] [Green Version]
  40. Wang, L. Intelligent Optimization Algorithms and Applications; Tsinghua University Press: Beijing, China, 2004; pp. 2–6. ISBN 9787302044994. [Google Scholar]
  41. Omran, M.G.H.; Mahdavi, M. Global-best harmony search. Appl. Math. Comput. 2008, 198, 634–656. [Google Scholar] [CrossRef]
  42. Han, J.Y.; Liu, C.Z.; Wang, L.G. Dynamic Double Subgroups Cooperative Fruit Fly Optimization Algorithm. Pattern Recognit. Artif. Intell. 2013, 26, 1057–1067. [Google Scholar]
Figure 1. Flowchart of the apple image recognition multi-objective method based on an adaptive harmony search algorithm with simulation and creation.
Figure 2. Comparison of the average optimum with different harmony tone components of the Sphere function.
Figure 3. Comparison of the average optimum with different harmony tone components of the Rastrigin function.
Figure 4. Comparison of the average optimum with different harmony tone components of the Griewank function.
Figure 5. Comparison of the average optimum with different harmony tone components of the Ackley function.
Figure 6. Comparison of the average optimum with different harmony tone components of the Rosenbrock function.
Figure 7. Comparison of the average optimum with different harmony memory sizes of the Sphere function.
Figure 8. Comparison of the average optimum with different harmony memory sizes of the Rastrigin function.
Figure 9. Comparison of the average optimum with different harmony memory sizes of the Griewank function.
Figure 10. Comparison of the average optimum with different harmony memory sizes of the Ackley function.
Figure 11. Comparison of the average optimum with different harmony memory sizes of the Rosenbrock function.
Figure 12. Comparison of the average optimum with different harmony tone components of five functions.
Figure 13. Comparison of the average optimum with different harmony memory sizes of five functions.
Figure 14. Main interface of the apple image segmentation program.
Figure 15. Comparison of the results of image segmentation under direct sunlight under the strong illumination condition.
Figure 16. Comparison of the results of image segmentation under backlighting under the strong illumination condition.
Figure 17. Comparison of the results of image segmentation under backlighting under the medium illumination condition.
Table 1. Parameters of five benchmark functions.
Function | Name | Modality | Dimension | Search Scope | Theoretical Optimal Value | Target Accuracy
f1 | Sphere | unimodal | 30 | [−100, 100] | 0 | 10^−15
f2 | Rastrigin | multimodal | 30 | [−5.12, 5.12] | 0 | 10^−2
f3 | Griewank | multimodal | 30 | [−600, 600] | 0 | 10^−2
f4 | Ackley | multimodal | 30 | [−32, 32] | 0 | 10^−6
f5 | Rosenbrock | unimodal | 30 | [−30, 30] | 0 | 10^−15
Table 2. Comparison of the results of fixed global evolution times.
Algorithm | Optimization Performance | f1 | f2 | f3 | f4 | f5
HS | Optimum | 5.07 × 10^2 | 1.76 × 10^1 | 4.47 × 10^0 | 5.18 × 10^0 | 1.13 × 10^4
HS | Standard deviation | 1.36 × 10^0 | 9.17 × 10^−2 | 4.89 × 10^−3 | 1.88 × 10^−2 | 3.06 × 10^2
IHS | Optimum | 3.80 × 10^2 | 2.83 × 10^1 | 7.62 × 10^0 | 6.10 × 10^0 | 4.49 × 10^4
IHS | Standard deviation | 6.12 × 10^−3 | 6.81 × 10^−1 | 8.60 × 10^−6 | 1.76 × 10^−3 | 2.26 × 10^1
GHS | Optimum | 1.20 × 10^2 | 3.62 × 10^−4 | 0.00 × 10^0 | 5.89 × 10^−16 | 3.00 × 10^1
GHS | Standard deviation | 9.19 × 10^−2 | 4.70 × 10^−2 | 6.11 × 10^−4 | 4.90 × 10^−29 | 3.33 × 10^0
SGHS | Optimum | 2.47 × 10^2 | 2.98 × 10^1 | 5.92 × 10^0 | 5.62 × 10^0 | 1.89 × 10^4
SGHS | Standard deviation | 8.50 × 10^−12 | 1.97 × 10^−1 | 2.67 × 10^−5 | 6.89 × 10^−4 | 1.27 × 10^1
SC-AHS | Optimum | 0.00 × 10^0 | 0.00 × 10^0 | 1.97 × 10^−2 | 3.27 × 10^−6 | 0.00 × 10^0
SC-AHS | Standard deviation | 0.00 × 10^0 | 4.27 × 10^−1 | 1.02 × 10^−15 | 2.72 × 10^−5 | 0.00 × 10^0
Table 3. Comparison of the optimal performance with different harmony tone components of five functions.
FunctionsNAverage Optimal Value of HSAverage Optimal Value of IHSAverage Optimal Value of GHSAverage Optimal Value of SGHSAverage Optimal Value of SC-AHS
f 1 501.29 × 1039.48 × 1022.00 × 1021.17 × 1039.62 × 10−19
601.35 × 1031.96 × 1032.40 × 1021.09 × 1032.49 × 10−21
701.63 × 1031.92 × 1031.12 × 1033.59 × 1033.34 × 10−17
802.67 × 1032.82 × 1033.20 × 1021.82 × 1041.00 × 10−18
904.04 × 1034.49 × 1033.20 × 1012.33 × 1045.78 × 10−17
1005.83 × 1036.08 × 1033.60 × 1033.59 × 1042.57 × 10−25
Relative rate of change/%3.52 × 1025.42 × 1021.70 × 1032.97 × 1031.00 × 102
Average change2.80 × 1033.04 × 1039.19 × 1021.39 × 1041.55 × 10−17
f 2 503.50 × 1015.73 × 1015.81 × 1017.38 × 1021.29 × 101
604.76 × 1017.81 × 1014.36 × 1007.09 × 1021.69 × 101
706.69 × 1011.18 × 1027.24 × 1009.06 × 1022.29 × 101
806.75 × 1011.13 × 1029.30 × 1011.05 × 1023.78 × 101
901.09 × 1021.47 × 1022.01 × 1001.19 × 1034.38 × 101
1001.13 × 1021.75 × 1022.01 × 1001.62 × 1026.07 × 101
Relative rate of change/%2.22 × 1022.06 × 1029.65 × 1017.80 × 1013.69 × 102
Average change7.31 × 1011.15 × 1022.78 × 1016.35 × 1023.25 × 101
f 3 501.03 × 1018.13 × 1002.80 × 1001.03 × 1034.93 × 10−3
601.59 × 1011.66 × 1011.92 × 10−12.00 × 1029.86 × 10−3
701.77 × 1011.99 × 1013.52 × 1001.66 × 1021.11 × 10−16
802.94 × 1013.22 × 1013.88 × 1001.52 × 1011.23 × 10−2
904.81 × 1013.76 × 1011.40 × 1012.78 × 1024.93 × 10−3
1004.67 × 1014.96 × 1013.34 × 1013.52 × 1011.72 × 10−2
Relative rate of change/%3.56 × 1025.10 × 1021.09 × 1039.66 × 1012.49 × 102
Average change2.80 × 1012.73 × 1019.63 × 1002.88 × 1028.21 × 10−3
f 4 501.26 × 1011.22 × 1011.12 × 1018.42 × 1007.10 × 10−5
601.31 × 1011.25 × 1011.44 × 1001.07 × 1017.73 × 10−6
701.11 × 1011.25 × 1013.87 × 1001.61 × 1018.51 × 100
808.45 × 1008.93 × 1002.11 × 1002.00 × 1011.88 × 101
902.79 × 1003.42 × 1007.55 × 10−12.03 × 1011.56 × 101
1006.41 × 10−11.96 × 10−14.75 × 10−22.05 × 1012.08 × 101
Relative rate of change/%9.49 × 1019.84 × 1019.96 × 1011.43 × 1022.93 × 107
Average change8.11 × 1008.30 × 1003.24 × 1001.60 × 1011.06 × 101
f 5 502.62 × 1051.53 × 1053.03 × 1024.33 × 1081.00 × 10−20
605.24 × 1057.42 × 1055.50 × 1021.16 × 1071.00 × 10−20
704.95 × 1059.88 × 1051.09 × 1031.08 × 1071.00 × 10−20
801.26 × 1061.14 × 1068.46 × 1042.29 × 1051.00 × 10−20
902.04 × 1061.15 × 1067.97 × 1053.56 × 1071.00 × 10−20
1003.37 × 1061.46 × 1061.00 × 1022.30 × 1051.00 × 10−20
Relative rate of change/%1.19 × 1038.55 × 1026.70 × 1019.99 × 1010.00 × 100
Average change1.32 × 1069.39 × 1051.47 × 1058.18 × 1071.00 × 10−20
Table 4. Comparison of the optimal performance with different harmony memory sizes of five functions.
FunctionsHMSAverage Optimal Value of HSAverage Optimal Value of IHSAverage Optimal Value of GHSAverage Optimal Value of SGHSAverage Optimal Value of SC-AHS
f 1 202.09 × 1041.38 × 1041.60 × 1014.65 × 1038.23 × 10−20
502.59 × 1035.33 × 1034.00 × 1003.07 × 1032.14 × 10−27
1001.02 × 1032.33 × 1031.20 × 1021.68 × 1037.58 × 10−17
1508.41 × 1021.04 × 1034.80 × 1026.58 × 1024.29 × 10−25
2005.05 × 1026.20 × 1021.20 × 1024.31 × 1021.00 × 10−30
Relative rate of change/%9.76 × 1019.55 × 1016.50 × 1029.07 × 1011.00 × 102
Average change5.16 × 1034.63 × 1031.48 × 1022.10 × 1031.52 × 10−17
f 2 201.60 × 1021.82 × 1023.49 × 1011.22 × 1021.59 × 101
507.11 × 1019.51 × 1011.16 × 1001.72 × 1027.96 × 100
1003.23 × 1015.28 × 1011.16 × 1005.40 × 1011.99 × 100
1502.70 × 1012.87 × 1012.04 × 1002.73 × 1029.95 × 10−1
2001.76 × 1012.68 × 1014.02 × 1002.25 × 1011.78 × 10−15
Relative rate of change/%8.91 × 1018.53 × 1018.85 × 1018.15 × 1011.00 × 102
Average change6.17 × 1017.71 × 1018.65 × 1001.29 × 1025.37 × 100
f 3 201.77 × 1021.25 × 1021.12 × 1007.10 × 1016.14 × 10−1
504.32 × 1015.19 × 1011.60 × 1003.30 × 1012.46 × 10−2
1001.70 × 1011.29 × 1017.20 × 10−16.35 × 1001.11 × 10−16
1506.50 × 1001.01 × 1011.92 × 10−15.19 × 1012.22 × 10−2
2004.46 × 1005.21 × 1002.08 × 1006.63 × 1001.97 × 10−2
Relative rate of change/%9.75 × 1019.58 × 1018.57 × 1019.07 × 1019.68 × 101
Average change4.97 × 1014.11 × 1011.14 × 1003.38 × 1011.36 × 10−1
f 4 201.54 × 1011.48 × 1015.89 × 10−161.43 × 1018.97 × 10−1
501.30 × 1011.24 × 1015.89 × 10−169.69 × 1008.96 × 10−1
1009.06 × 1009.64 × 1005.89 × 10−165.89 × 1008.96 × 10−1
1506.05 × 1006.22 × 1005.89 × 10−165.89 × 1004.16 × 10−5
2005.15 × 1006.81 × 1004.59 × 1005.54 × 1008.48 × 10−7
Relative rate of change/%6.66 × 1015.41 × 1017.80 × 10176.13 × 1011.00 × 102
Average change9.73 × 1009.98 × 1009.19 × 10−18.27 × 1005.38 × 10−1
f 5 201.57 × 1074.96 × 1073.00 × 1011.09 × 1064.36 × 10−2
504.17 × 1063.71 × 1064.66 × 1029.13 × 1047.72 × 10−16
1002.14 × 1055.21 × 1053.76 × 1027.31 × 1041.00 × 10−20
1501.85 × 1045.58 × 1053.00 × 1011.47 × 1061.00 × 10−20
2001.08 × 1041.05 × 1054.66 × 1021.67 × 1041.00 × 10−20
Relative rate of change/%9.99 × 1019.98 × 1011.45 × 1039.85 × 1011.00 × 102
Average change4.01 × 1061.09 × 1072.73 × 1025.49 × 1058.72 × 10−3
Table 5. Influences of convergence performance with different harmony tone components of the SC-AHS algorithm.
Function | Optimization Performance | N = 50 | N = 60 | N = 70 | N = 80 | N = 90 | N = 100
f1 | Average optimal value | 9.62 × 10^−19 | 2.49 × 10^−21 | 3.34 × 10^−17 | 1.00 × 10^−18 | 5.78 × 10^−17 | 2.57 × 10^−25
f1 | Evolution times | 309 | 414 | 480 | 548 | 620 | 765
f2 | Average optimal value | 1.29 × 10^1 | 1.69 × 10^1 | 2.29 × 10^1 | 3.78 × 10^1 | 4.38 × 10^1 | 6.07 × 10^1
f2 | Evolution times | 20,000 | 20,000 | 20,000 | 20,000 | 20,000 | 20,000
f3 | Average optimal value | 4.93 × 10^−3 | 9.86 × 10^−3 | 1.11 × 10^−16 | 1.23 × 10^−2 | 4.93 × 10^−3 | 1.72 × 10^−2
f3 | Evolution times | 20,000 | 20,000 | 19,763 | 20,000 | 20,000 | 20,000
f4 | Average optimal value | 7.10 × 10^−5 | 7.73 × 10^−6 | 8.51 × 10^0 | 1.88 × 10^1 | 1.56 × 10^1 | 2.08 × 10^1
f4 | Evolution times | 20,000 | 20,000 | 20,000 | 20,000 | 20,000 | 20,000
f5 | Average optimal value | 1.00 × 10^−20 | 1.00 × 10^−20 | 1.00 × 10^−20 | 1.00 × 10^−20 | 1.00 × 10^−20 | 1.00 × 10^−20
f5 | Evolution times | 5292 | 6771 | 7802 | 9855 | 10,441 | 10,890
Table 6. Comparisons of the convergence performance with different harmony memory sizes of the SC-AHS algorithm.
Function | Optimization Performance | HMS = 20 | HMS = 50 | HMS = 100 | HMS = 150 | HMS = 200
f1 | Average optimal value | 8.23 × 10^−20 | 2.14 × 10^−27 | 7.58 × 10^−17 | 4.29 × 10^−25 | 1.00 × 10^−30
f1 | Evolution times | 1606 | 280 | 203 | 219 | 209
f2 | Average optimal value | 1.59 × 10^1 | 7.96 × 10^0 | 1.99 × 10^0 | 9.95 × 10^−1 | 1.78 × 10^−15
f2 | Evolution times | 20,000 | 20,000 | 20,000 | 20,000 | 18,361
f3 | Average optimal value | 3.06 × 10^−2 | 2.46 × 10^−2 | 1.11 × 10^−16 | 2.22 × 10^−2 | 1.97 × 10^−2
f3 | Evolution times | 20,000 | 20,000 | 11,936 | 20,000 | 20,000
f4 | Average optimal value | 2.27 × 10^−5 | 1.60 × 10^−8 | 3.93 × 10^−9 | 2.87 × 10^−10 | 1.31 × 10^−11
f4 | Evolution times | 20,000 | 20,000 | 20,000 | 20,000 | 20,000
f5 | Average optimal value | 2.43 × 10^−3 | 4.33 × 10^−17 | 1.00 × 10^−20 | 1.00 × 10^−20 | 1.00 × 10^−20
f5 | Evolution times | 20,000 | 20,000 | 4296 | 3932 | 3346
Table 7. Thresholds of the non-dominated solution set under six illumination conditions using the ARMM-AHS method.
Direct Sunlight with StrongDirect Sunlight with MediumDirect Sunlight with WeakBacklighting with StrongBacklighting with MediumBacklighting with Weak
Non-Dominated Solution Number: 54Non-Dominated Solution Number: 36Non-Dominated Solution Number: 28Non-Dominated Solution Number: 18Non-Dominated Solution Number: 54Non-Dominated Solution Number: 62
77121130131107146
11612113013199152
77114130131107129
77121130131126113
77121130131107152
116121130131126152
116131130131126152
77121130131126115
116121130131107115
116121130131107115
116121130131106113
7712113013199113
7712113017299113
77131130228107122
1161141306699152
77121130221126113
11613113020699113
7712113015106113
116114130107120
77121130106129
116121130106120
771214699129
77114133106146
7712185106129
7712172126120
77121146107120
7711455126152
7714945107113
7711499152
116178126129
11624399146
1163999115
116198126122
11666126120
77166126146
11697126122
77107122
77126120
77107152
77106122
77106120
116126122
116106146
153168120
6970146
231230120
20598146
146106122
36159113
33163122
22170202
211058
16320847
20391122
214
28
129
209
181
204
248
172
Table 8. Comparison of thresholds selected from the non-dominated solution set under six illumination conditions with OTSU thresholds.
Illumination Condition | Threshold Selected by SC-AHS | OTSU Threshold
Direct sunlight with strong | 116 | 129
Direct sunlight with medium | 121 | 129
Direct sunlight with weak | 130 | 133
Backlighting with strong | 131 | 144
Backlighting with medium | 99 | 130
Backlighting with weak | 113 | 130
Table 9. Apple image segmentation effect comparison under six illumination conditions.
Illumination Condition | RSeg (%) | RApp (%) | RBra (%) | RSuc (%) | RMis (%)
Direct sunlight with strong | 100.0 | 85.7 | 14.3 | 100.0 | 0.0
Direct sunlight with medium | 80.0 | 75.0 | 25.0 | 75.0 | 25.0
Direct sunlight with weak | 80.0 | 100.0 | 0.0 | 80.0 | 20.0
Backlighting with strong | 100.0 | 83.3 | 16.7 | 100.0 | 0.0
Backlighting with medium | 85.7 | 83.3 | 16.7 | 83.3 | 16.7
Backlighting with weak | 66.7 | 100.0 | 0.0 | 66.7 | 33.3

