Article

A db-Scan Hybrid Algorithm: An Application to the Multidimensional Knapsack Problem

Escuela de Ingeniería en Construcción, Pontificia Universidad Católica de Valparaíso, 2362807 Valparaíso, Chile
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Mathematics 2020, 8(4), 507; https://doi.org/10.3390/math8040507
Submission received: 3 March 2020 / Revised: 21 March 2020 / Accepted: 23 March 2020 / Published: 2 April 2020
(This article belongs to the Special Issue Computational Intelligence)

Abstract

This article proposes a hybrid algorithm that makes use of the db-scan unsupervised learning technique to obtain binary versions of continuous swarm intelligence algorithms. These binary versions are then applied to large instances of the well-known multidimensional knapsack problem. The contribution of the db-scan operator to the binarization process is systematically studied. For this, two random operators are built that serve as a baseline for comparison. Once the contribution is established, the db-scan operator is compared with two other binarization methods that have satisfactorily solved the multidimensional knapsack problem. The first method uses the unsupervised learning technique k-means as a binarization method. The second makes use of transfer functions as a mechanism to generate binary versions. The results show that the hybrid algorithm using db-scan produces more consistent results compared to transfer function (TF) and random operators.

1. Introduction

With the incorporation of technologies such as big data and the Internet of Things, the concept of real-time decisions has become relevant at the industrial level. Each of these decisions can be modeled as an optimization problem or, in many cases, a combinatorial optimization problem (COP). Examples of COPs are found in different areas: machine learning [1], transportation [2], facility layout design [3], logistics [4], scheduling problems [2,5], resource allocation [6,7], routing problems [8], robotics applications [9], image analysis [10], engineering design problems [11], fault diagnosis of machinery [12], and manufacturing problems [13], among others. When the problem is large, metaheuristics have been a good approach for obtaining adequate solutions. However, the growing amount of data and the need, in some cases, for solutions in near real time motivate us to continue strengthening the methods that address these problems.
One way to classify metaheuristics is according to the search space in which they work. In that sense, there are metaheuristics that work in continuous spaces, discrete spaces, and mixed spaces [14]. An important source of inspiration for metaheuristic algorithms is natural phenomena, many of which develop in a continuous space. Examples of metaheuristics inspired by natural phenomena in continuous spaces include particle swarm optimization [15], black hole optimization [16], cuckoo search [17], the bat algorithm [18], the firefly algorithm [19], the fruit fly algorithm [20], the artificial fish swarm [21], and the gravitational search algorithm [22]. The design of binary versions of these algorithms entails the important challenge of preserving their intensification and diversification properties [14]. The details of binarization methods are specified in Section 3.
A strategy that has strengthened the results of metaheuristic algorithms has been their hybridization with techniques that come from the same or other areas. The main hybridization proposals found in the literature are the following: (i) matheuristics, which combine heuristics or metaheuristics with mathematical programming [23]; (ii) hybrid heuristics, a combination of different heuristic or metaheuristic methods [24]; (iii) simheuristics, where simulation and metaheuristics are combined to solve a problem [25]; and (iv) integration between metaheuristic algorithms and machine learning techniques. The last of these, hybridization between the areas of machine learning and metaheuristic algorithms, is an emerging research line in computer science and operational research. We find that this hybridization occurs mainly with two intentions. The first is that metaheuristics help machine learning algorithms improve their results (for example, [26,27]). The second is that machine learning techniques are used to strengthen metaheuristic algorithms (for example, [28,29]). The details of the hybridization forms are specified in Section 3.
This article is inspired by the research lines mentioned above. A hybrid algorithm is designed that explores the application of a machine learning algorithm in a binarization operator to allow continuous metaheuristics to address combinatorial optimization problems. The contributions of this work are detailed below:
  • A machine learning technique is used with the objective of obtaining binary versions of metaheuristics defined and used in continuous optimization to tackle COPs in a simple and effective way. To perform the binarization process, this algorithm uses db-scan, which corresponds to an unsupervised learning algorithm. The selected metaheuristics are particle swarm optimization (PSO) and cuckoo search (CS). Their selection is based on the fact that they have been frequently used in solving continuous optimization problems and their parameterization is relatively simple, which allows us to focus on the binarization process.
  • The multidimensional knapsack problem (MKP) was used to check the performance of the obtained binary versions. The MKP was chosen because it is a problem extensively studied in the literature; therefore, standard benchmark instances are available, making it easy to evaluate our hybrid algorithm. The details and applications of the MKP are expanded on in Section 2.
  • Two random operators are designed in order to define a baseline and quantify the contribution of the hybrid algorithm that uses db-scan in the binarization process. Additionally, to make the comparison more robust, the performance of our hybrid algorithm was evaluated with methods that use k-means and transfer functions (TF) as binarization mechanisms.
This article is organized in the following sections. The definition of the MKP is detailed in Section 2. Subsequently, the state-of-the-art of the main binarization techniques and of the hybridization between machine learning and metaheuristics is reviewed in Section 3. The details of the proposed hybrid binarization algorithm are described in Section 4. In Section 5, we study the contribution of the db-scan operator to the binarization process; additionally, in the same section, the proposed hybrid algorithm is compared with the binarization techniques that use TFs and k-means. Finally, the main conclusions and future lines of investigation are detailed in Section 6.

2. Multidimensional Knapsack Problem

Due to its large number of applications and NP-hard computational complexity, a major research effort has been dedicated to the MKP. This optimization problem has been addressed by different types of techniques. Examples of exact algorithms that have efficiently resolved the MKP are found in [30,31]. There are also hybrid algorithms where exact algorithms are combined with depth-first search [32] or with variable fixation [33]. However, exact algorithms are only capable of producing optimal solutions for small and medium-sized instances, usually with a number of variables $n \le 250$ and a number of constraints $m \le 10$ [32,33]. This makes the MKP an interesting problem for metaheuristics to address.
In the case of metaheuristics, there are several algorithms that have addressed the MKP. A modification of the harmony search algorithm that redesigns the memory rule and improves exploration capabilities was proposed in [34]. A binary artificial algae algorithm was designed in [35] that uses transfer functions to perform binarization in addition to incorporating a local search operator. A hybrid algorithm based on the k-means technique was proposed in [29]. Additionally, this algorithm incorporates local perturbation and search operators. In [36], a binary multiverse optimizer was designed to address medium-sized problems. This multiverse algorithm uses a transfer function mechanism to perform binarization. Finally, a two-phase tabu-evolutionary algorithm was developed by Lai et al. [37] to address large instances of the MKP.
Let $N = \{1, 2, \ldots, n\}$ be a set of n elements and $M = \{1, 2, \ldots, m\}$ be a set of m resources with a capacity limit $b_i$ for each resource $i \in M$. Then, each element j has a profit $p_j$ and consumes an amount $c_{ij}$ of resource i. The MKP consists of selecting a subset of elements such that the capacity limit of each resource is not exceeded while the profit of the selected elements is maximized. Formally, the problem is defined as follows:

Maximize $P(x_1, \ldots, x_n) = \sum_{j=1}^{n} p_j x_j$, (1)

subject to:

$\sum_{j=1}^{n} c_{ij} x_j \le b_i, \quad i \in \{1, \ldots, m\}$, (2)

$x_j \in \{0, 1\}, \quad j \in \{1, \ldots, n\}$, (3)

where $b_i$ corresponds to the capacity limitation of resource $i \in M$. Each element $j \in N$ has a requirement $c_{ij}$ with respect to resource i as well as a benefit $p_j$. Moreover, $x_j \in \{0,1\}$ indicates whether element j is in the knapsack, with $c_{ij} \ge 0$, $p_j > 0$, and $b_i > 0$; n corresponds to the number of items, and m is the number of knapsack constraints.
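For concreteness, the following sketch (our illustration; variable names are not from the paper) evaluates the objective (1) and the feasibility of constraints (2) for a candidate binary vector on a toy instance.

```python
import numpy as np

def evaluate_mkp(x, p, c, b):
    """Evaluate a binary MKP solution.

    x: (n,) 0/1 selection vector; p: (n,) profits;
    c: (m, n) resource consumption matrix; b: (m,) capacities.
    Returns (profit, feasible).
    """
    profit = float(p @ x)                # objective (1): sum_j p_j * x_j
    feasible = bool(np.all(c @ x <= b))  # constraints (2): sum_j c_ij * x_j <= b_i
    return profit, feasible

# Toy instance with n = 4 items and m = 2 resources.
p = np.array([10, 7, 4, 3])
c = np.array([[3, 2, 1, 1],
              [2, 4, 1, 2]])
b = np.array([5, 6])
print(evaluate_mkp(np.array([1, 1, 0, 0]), p, c, b))  # -> (17.0, True)
```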
As mentioned above, the MKP has numerous applications. MKP modeling has been used in project selection and capital budgeting [38] applications as well as in the delivery of groceries in vehicles with multiple compartments [39] and the daily photographic scheduling problem of an Earth observation satellite [40]. Another interesting problem related to the MKP is the shelf space allocation problem [41]. Additionally, we found applications in the allocation of databases and processors in distributed data processing [42].

3. Related Work

3.1. Related Binarization Work

Because many processes of nature that are modeled in continuous spaces have inspired metaheuristic algorithms, a large number of these algorithms are designed to work in continuous spaces. In particular, the metaheuristics cuckoo search (CS) and particle swarm optimization (PSO) have been widely used in solving continuous problems. However, there are a large number of combinatorial optimization problems, a significant number of which are NP-hard. This motivates the search for robust methods that allow algorithms that operate in continuous spaces to tackle combinatorial optimization problems.
When developing a classification of existing binarization techniques, two large groups [14] are defined. The first group designs an adaptation of a continuous algorithm to work in binary environments. This adaptation usually turns out to be specific to the metaheuristic algorithm and the problem that is being solved. We call this group the specific binarizations. The second group separates the binarization process from the metaheuristic algorithm. Therefore, the latter continues to work in a continuous search space. Once the continuous solutions are obtained, they are binarized. We call this group the generic binarizations. In Figure 1a,b, the generic and specific binarization diagrams are shown.
Examples of specific binarizations are frequently found in quantum binary approaches and in set-based approaches [14]. In the case of a quantum approach, continuous algorithms are adapted based on the uncertainty principle, where position and velocity cannot be determined simultaneously. In [43], a quantum binary gray wolf optimizer is proposed to solve the unit commitment problem. Using a quantum binary lightning search algorithm, in [44], the optimal placement of vehicle-to-grid charging stations in the distribution power system was addressed. The short-term hydropower generation scheduling problem was successfully addressed by [45] using a quantum-binary social spider optimization algorithm. In the case of the set-based approach, in [46], this method succeeded in solving the feature selection problem. Additionally, the vehicle routing problem with time windows was addressed in [47] by a particle swarm optimization set-based approach. Other examples of specific binarizations are found in [48], where a chaotic antlion algorithm was used to find a parameterization of the support vector machine technique. In this case, a chaotic map and random walk operators were used.
In the case of generic transformations, the simplest and most commonly used binarization method corresponds to transfer functions (TFs). In this method, the particle has a position given by a solution in one iteration and a velocity corresponding to the vector obtained from the difference in the particle's position between two consecutive iterations. The TF is a very simple operator that relates the velocities of the particles in PSO to a transition probability: it takes values in $\mathbb{R}^n$ and generates transition probability values in $[0,1]^n$. Depending on the form of the function, TFs are generally classified as S-shaped [49] or V-shaped [50] functions. However, in recent years, the study of transfer functions has been extended by defining new families. In [51], a recurrence-generated parametric Fibonacci hyperbolic tangent activation function was defined and applied to neural networks. A family of functions based on the half-hyperbolic tangent activation function was introduced in [52]; in this work, the authors demonstrated the existence of upper and lower estimates for the Hausdorff approximation of the sign function.
When the function produces a value between 0 and 1, the next step is to use a rule that allows 0 or 1 to be obtained. For this, well-defined rules have been used that use the concepts of complements, elites, and random functions, among others. In [53], a quadratic binary Harris hawk optimization, which uses transfer functions, successfully addressed the feature selection problem. Additionally, a feature selection problem in [54] was solved by a binary dragonfly optimization algorithm. In this case, a time-varying transfer function was used. Finally, in [55], binary butterfly optimization was used to solve the feature selection problem.
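As an illustration of this two-step mechanism, the sketch below (ours, not code from the cited works) combines a classic S-shaped transfer function with the standard random rule.

```python
import numpy as np

rng = np.random.default_rng(42)

def s_shaped(v):
    """S-shaped transfer function: maps velocities in R^n to probabilities in [0,1]^n."""
    return 1.0 / (1.0 + np.exp(-v))

def binarize(v):
    """Standard rule: each bit becomes 1 with the probability given by the TF."""
    return (rng.random(v.shape) < s_shaped(v)).astype(int)

velocity = np.array([-2.0, 0.0, 1.5, 3.0])
print(binarize(velocity))  # larger velocities yield a higher chance of a 1
```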
The main challenge a binarization framework has to tackle is associated with spatial disconnection [56]. A spatial disconnection occurs when two solutions that are close in the continuous space are no longer close in the binary space after the binarization process is applied. As a consequence, alterations are observed in the exploration and exploitation properties of the optimization algorithm, resulting in decreased precision and increased convergence times. In [57], we analyzed how TFs alter the exploration and exploitation process; an analysis of these properties for the angle modulation technique is found in [56].

3.2. Hybridizing Metaheuristics with Machine Learning

Metaheuristics form a wide family of algorithms. These algorithms are classified as incomplete optimization techniques and are usually inspired by natural or social phenomena. Their main objective is to solve problems of high computational complexity, and they have the property of not requiring deep alterations to their optimization mechanism when the problem to be solved changes. On the other hand, machine learning techniques correspond to algorithms capable of learning from a dataset [58]. If we classify these algorithms according to the learning method, there are three main categories: supervised learning, unsupervised learning, and reinforcement learning. Machine learning algorithms are usually used for time series problems, anomaly detection, computational vision, data transformation, dimensionality reduction, regression, and data classification, among others [59].
Among state-of-the-art algorithms that integrate machine learning techniques with metaheuristic algorithms, we have found two main approaches. In the first approach, machine learning techniques are used to improve the quality of the solutions and convergence times obtained by the metaheuristic algorithms. The second approach uses metaheuristic algorithms to improve the performance of machine learning techniques. Usually, the metaheuristic is responsible for solving an optimization problem related to the machine learning technique more efficiently than the machine learning technique alone.
When we analyze the integration mechanisms that take the first approach, we identify two lines of research. In the first line, machine learning techniques are used as metamodels to select different metaheuristics by choosing the most appropriate metaheuristic for each instance. The second line aims to use specific operators that make use of machine learning algorithms, and, subsequently, to integrate specific operators into a metaheuristic.
Among the articles that use a general integration mechanism between machine learning algorithms and metaheuristics, three main groups are defined: hyper-heuristics, cooperative strategies, and algorithm selection. The algorithm selection approach aims to select, from a group of algorithms, the most appropriate one for the instance being solved. This selection is made using a set of characteristics and associating the instance with the best algorithm that has solved similar instances. In the case of the hyper-heuristic strategy, the approach is to automate the design of heuristic methods in order to tackle a wide range of problems. Finally, cooperative strategies aim to combine algorithms through a parallel or sequential mechanism, assuming that this combination will produce more robust methods. Examples of these approaches are found in [28], where the algorithm selection strategy is applied to the berth scheduling problem. A hyper-heuristic algorithm was used in [60] and applied to the nurse training problem. A direct cooperation mechanism was used in [61] to solve the permutation flow shop problem.
A metaheuristic is determined by its evolution mechanism together with different operators, such as solution initialization operators, solution perturbation, population management, binarization, parameter setting, and local search operators. Specific integrations explore machine learning applications in some of these operators. In the design of binary versions of algorithms that work naturally in continuous spaces, we find binarization operators in [2]. These binary operators use unsupervised learning techniques to perform the binarization process. In [62], the concept of percentiles was explored in the process of generating binary algorithms. In addition, in [5], the Apache Spark big data framework was applied to manage the population size of solutions to improve convergence times and the quality of results. Another interesting line of research is found in the adjustment of metaheuristic parameters. In [63], the parameter setting of a chess classification system was implemented. Based on decision trees and using fuzzy logic, a semi-automatic parameter setting algorithm was designed in [64]. The initialization of the solutions of a metaheuristic is frequently carried out in a random way; however, using machine learning techniques, it has been possible to improve the performance of metaheuristic algorithms through the solution initialization process. In [65], the initialization of the solutions of a genetic algorithm through the case-based reasoning technique was applied to the weighted circle design problem. Again for the initialization of a genetic algorithm, Hopfield neural networks were used in [66]; the genetic algorithm, together with the Hopfield networks, was applied to the economic dispatch problem.
In the other direction, where metaheuristics support the development of more robust machine learning algorithms, there are many studies and applications, for example, in feature selection, parameter setting, and feature extraction. In [67], a genetic algorithm integrated with SVM was applied to the recognition of breast cancer. The hybrid algorithm improved the classification compared with the original SVM technique; in particular, the genetic algorithm was applied to the extraction of characteristics from the images involved in the analysis. A multiverse optimizer was likewise used for feature extraction in [68]; additionally, this optimizer was used to perform SVM tuning. In the case of neural networks, depending on the type of network and its topology, obtaining the weights properly can be a difficult and time-consuming task. In particular, in [69], the tuning of a feed-forward neural network was addressed through an improved monarch butterfly algorithm. The integration of a firefly algorithm with the least-squares support vector machine technique was developed in [70] with the goal of solving a geotechnical problem efficiently. The prediction of the compressive strength of high-performance concrete was modeled in [71] through a metaheuristic-optimized least-squares support vector regression algorithm. In [72], a hybrid algorithm that integrates metaheuristics with artificial neural networks was designed with the aim of improving the prediction of stock prices. Another example of price prediction was developed in [26]; in this case, the improvement was achieved using a sliding-window metaheuristic-optimized machine-learning regression and was applied to a construction company in Taiwan. We also find, in [73], a firefly algorithm applied to tuning the parameters of the least-squares support vector regression technique; the improved algorithm was applied to predictions in engineering design. Applications of metaheuristics to unsupervised machine learning techniques are also found in the literature; in particular, there are a large number of studies applied to clustering techniques. In [74], an algorithm based on the combination of a metaheuristic with a kernel intuitionistic fuzzy c-means method was designed and applied to different datasets. Another interesting problem is the search for centroids, because it requires a large computing capacity. An artificial bee colony algorithm was used in [75] to find centroids in an energy efficiency problem of a wireless sensor network. The planning of employee transportation from an oil platform by helicopter was addressed in [76] through cluster search using a metaheuristic.

4. Binary db-Scan Algorithm

To efficiently solve the MKP, the binary db-scan algorithm is composed of five operators. The first operator corresponds to the initialization of the solutions. This operator is detailed in Section 4.1. After the population of solutions is initialized, the next step is to verify whether the maximum iteration criterion is met. If the criterion is not satisfied, then the binary db-scan operator is used. In this operator, the metaheuristic is executed in the continuous space to later group the solutions considering the absolute value of the velocity and using the db-scan technique. The details of this operator are described in Section 4.2. Subsequently, with the clusters generated by the db-scan operator, the transition operator is used, which aims to binarize the solutions grouped by db-scan. The transition operator is described in Section 4.3. Then, if the solutions obtained do not satisfy all the constraints, the repair operator described in Section 4.5 is applied. Finally, a random perturbation operator is used that is associated with the criterion of the number of iterations that are performed without the best value being modified; this operator is detailed in Section 4.4. The general flow chart of the binary db-scan algorithm is shown in Figure 2.

4.1. Initialization Operator

Each solution is generated as follows. First, we select an item randomly. Then, we consult the constraints of our problem to see whether there are other elements that can be incorporated. The list of possible elements to be incorporated is obtained, the weight for each of these elements is calculated, and one of the three best elements is selected. The procedure continues until no more elements can be incorporated. The pseudocode is shown in Algorithm 1.
Algorithm 1 Initialization algorithm.
1: Function initAlgo(lElements)
2: Input: the list of elements (lElements)
3: Output: the solution (p)
4: p ← []
5: element ← RandElement(lElements)
6: p.append(element)
7: while there exist elements that satisfy the constraints do
8:   lPossibleElements ← PossibleElements(lElements)
9:   element ← BestElement(lPossibleElements)
10:  p.append(element)
11: end while
12: return p
Several techniques have been proposed in the literature to calculate the weight of each element. For example, in [77], the pseudoutility given by the surrogate duality approach was introduced. The pseudoutility of each variable is given in Equation (4), where $w_j$ corresponds to the surrogate multiplier of constraint j:

$\delta_i = \frac{p_i}{\sum_{j=1}^{m} w_j c_{ij}}$ (4)
Another, more intuitive measure was proposed in [78]. This measure focuses on the average occupancy of resources and is shown in Equation (5):

$\delta_i = \frac{\sum_{j=1}^{m} \frac{c_{ij}}{m\, b_j}}{p_i}$ (5)
In this article, we use a variation of this last measure, also focused on the average occupation, shown in Equation (6). In this equation, $c_{kj}$ represents the cost of object k in knapsack j, $b_j$ corresponds to the capacity constraint of knapsack j, $p_i$ corresponds to the profit of element i, and S denotes the set of elements already included in the knapsack. This heuristic was proposed in [29], and its objective is to select the elements that enter the knapsack.

$\delta_i = \frac{\sum_{j=1}^{m} \frac{c_{ij}}{m\,(b_j - \sum_{k \in S} c_{kj})}}{p_i}$ (6)
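A minimal sketch of this weight computation follows, under the assumption that S holds the indices of the elements already in the knapsack (helper and variable names are ours); candidates with the smallest δ are read as the most attractive.

```python
import numpy as np

def delta(i, p, c, b, S):
    """Equation (6): average remaining-occupancy weight of candidate element i.

    p: (n,) profits; c: (m, n) costs; b: (m,) capacities;
    S: indices of the elements already in the knapsack.
    """
    used = c[:, list(S)].sum(axis=1) if S else np.zeros(len(b))
    m = len(b)
    # delta_i = sum_j c_ij / (m * (b_j - sum_{k in S} c_kj)) / p_i
    return float((c[:, i] / (m * (b - used))).sum() / p[i])
```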

4.2. Binary db-Scan Operator

Continuous metaheuristic solutions are clustered through the binary db-scan operator. If we make the analogy of solutions with particles, the position of the particle represents the location of the particle in the search space. Velocity is interpreted as a transition vector between a state t and a state t + 1 .
Density-based spatial clustering of applications with noise (db-scan) is used as the technique to obtain clusters. Db-scan works with the concept of density to find clusters. The algorithm was proposed in 1996 by Ester et al. [79]. Given a set of points S within a metric space, the db-scan algorithm groups the points that meet a minimum density criterion, and the others are labeled as outliers. To achieve this task, db-scan requires two parameters: a radius ϵ and a minimum number of neighbors δ. The main steps of the algorithm are shown below:
  • Find the points in the ϵ neighborhood of every point and identify the core points with more than δ neighbors.
  • Find the connected components of core points on the neighbor graph, ignoring all noncore points.
  • Assign each noncore point to a nearby cluster if the cluster is an ϵ neighbor; otherwise, assign it to noise.
Let us define $lP(t)$ as the list of positions given by a metaheuristic MH at iteration t. The binary db-scan operator takes MH and $lP(t)$ as inputs. The operator's goal is to generate the clusters of the solutions delivered by MH. As a first step, the operator iterates $lP(t)$ using MH, obtaining another list $lP(t+1)$ with the positions of the solutions at iteration $t+1$. Finally, from $lP(t+1)$ and $lP(t)$, a list of velocities $lV(t+1)$ is obtained.
Let $v^p(t+1) \in lV(t+1)$ be the velocity vector, in the transition between t and $t+1$, corresponding to particle p. The dimension of this vector is n and is basically determined by the number of columns that the problem has. Let $v_i^p(t+1)$ be the value of dimension i of the vector $v^p(t+1)$. Then, $lV_i(t+1)$ corresponds to the list of absolute values $|v_i^p(t+1)|$ for all $v^p(t+1) \in lV(t+1)$. We then apply db-scan to the list $lV_i(t+1)$, thereby obtaining the number of clusters $nClusters(t+1)$ and the cluster to which each velocity value belongs, $lViClusters(t+1)$. The mechanism of the binary db-scan operator is shown in Algorithm 2.
Algorithm 2 Binary db-scan operator.
1: Function BinaryDbscan(MH, lP(t))
2: Input: MH, lP(t)
3: Output: nClusters(t+1), lViClusters(t+1)
4: lP(t+1), lV(t+1) ← applyMH(MH(lP(t)))
5: lV_i(t+1) ← ClusterList(lV(t+1))
6: nClusters(t+1), lViClusters(t+1) ← applyDbscan(lV_i(t+1))
7: return nClusters(t+1), lViClusters(t+1)
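A minimal sketch of the clustering step of Algorithm 2 using scikit-learn's DBSCAN; the data and the parameter values are illustrative, not the ones tuned in Section 5.1.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Toy list lV_i(t+1): absolute velocity values of one dimension across the swarm.
velocities = np.array([0.05, 0.06, 0.07, 0.90, 0.95, 1.00, 3.20])

# db-scan parameters: radius eps and minimum number of neighbors min_samples.
labels = DBSCAN(eps=0.1, min_samples=2).fit(velocities.reshape(-1, 1)).labels_

n_clusters = len(set(labels) - {-1})  # -1 is db-scan's outlier label
print(labels, n_clusters)             # -> [0 0 0 1 1 1 -1] 2
```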

4.3. Transition Operator

The number of clusters and the list with the cluster membership identifier of each element are returned by the db-scan operator. Using these objects, the transition operator returns binarized solutions. To execute this binarization, an identifier $Id(J) \in \mathbb{Z}$ is assigned to each cluster J in an ordered manner: the value 0 is assigned to the cluster whose centroid has the smallest absolute value. For example, let $v_j \in J$ and $v_i \in I$ be elements of clusters J and I, respectively, with $|v_j| > |v_i|$; then, $Id(J) > Id(I)$. Each cluster is then assigned a transition probability through Equation (7), in which α represents a minimum transition coefficient and β models the separation between the different clusters (the case in which db-scan labels an element as an outlier is handled separately below):

$P_{tr}(J) = \alpha + \beta \frac{Id(J)}{T}$, (7)

where T is the total number of clusters, not considering outliers.
Finally, to execute the binarization process, consider $p(t)$ as the position of a particle in iteration t. Let $p_i(t)$ be the value of dimension i for particle $p(t)$, and let $v_i^p(t+1)$ be the velocity of particle $p(t)$ in the ith dimension in the transition from iteration t to iteration $t+1$. Additionally, let $v_i^p(t+1) \in J$, where J is one of the clusters identified by the binary db-scan operator. Then, we use Equation (8) to generate the binary positions of the particles in iteration $t+1$:

$p_i(t+1) := \begin{cases} \hat{p}_i(t), & \text{if } rand < P_{tr}(J), \text{ where } v_i^p(t+1) \in J, \\ p_i(t), & \text{otherwise.} \end{cases}$ (8)
When $v_i^p(t+1)$ belongs to the outliers, a transition probability is assigned randomly. Finally, after the transition operator is applied, the repair operator described in Section 4.5 is used for solutions that do not satisfy all the restrictions. The details of the transition operator are shown in Algorithm 3.
Algorithm 3 Transition algorithm.
1: Function Transition(lP(t), lViClusters(t+1), nClusters(t+1))
2: Input: lP(t), lViClusters(t+1), nClusters(t+1)
3: Output: lBinaryP(t+1)
4: for p_i(t), v_i^p(t+1) in (lP(t), lV_i(t+1)) do
5:   if v_i^p(t+1) not in outliers then
6:     P_tr(p_i) ← TransitionProbability(lViClusters(t+1), nClusters(t+1)) (Equation (7))
7:   else
8:     P_tr(p_i) ← randomTransitionProbability(lViClusters(t+1), nClusters(t+1))
9:   end if
10:  lBinaryP(t+1).append(p_i(t+1)) ← BinaryPosition(P_tr(p_i(t)), lViClusters(t+1)) (Equation (8))
11: end for
12: for p(t+1) in lBinaryP(t+1) do
13:   lBinaryP(t+1)[p(t+1)] ← Repair(p(t+1))
14: end for
15: return lBinaryP(t+1)
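The sketch below illustrates Equations (7) and (8) for a single dimension. We read $\hat{p}_i(t)$ as the ith entry of the best solution found so far, which is an assumption on our part, as are the values of α and β.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 0.1, 0.5  # illustrative values for Equation (7)

def transition_probability(cluster_id, n_clusters):
    """Equation (7): clusters of larger |velocity| receive a larger P_tr."""
    return alpha + beta * cluster_id / n_clusters

def transit_bit(p_i, p_hat_i, cluster_id, n_clusters):
    """Equation (8) for one dimension; cluster_id == -1 marks a db-scan outlier."""
    if cluster_id == -1:
        p_tr = rng.random()  # outliers get a random transition probability
    else:
        p_tr = transition_probability(cluster_id, n_clusters)
    return p_hat_i if rng.random() < p_tr else p_i
```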

4.4. Random Perturbation Operator

Because the MKP is a difficult problem to solve, there is a risk that the algorithm becomes confined to local optima. To address this situation, the optimization is complemented by a perturbation operator. Once the condition that the best solution has not improved is met, the perturbation operator performs a set of random deletions whose number is given by $\eta_\nu$, where ν is a parameter to be estimated that multiplies the total length of the solution to obtain $\eta_\nu$. The procedure is outlined in Algorithm 4.
Algorithm 4 Perturbation algorithm.
1: Function Perturbation(p_in, η_ν)
2: Input: input solution p_in, strength of perturbation η_ν
3: Output: the perturbed solution p_out
4: p ← p_in
5: for i = 1 to η_ν do
6:   Randomly remove an element of p
7: end for
8: p_out ← RepairOperator(p)
9: return p_out
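A minimal sketch of Algorithm 4, assuming the solution is stored as a set of selected item indices and that `repair` is the operator of Section 4.5 (names are ours):

```python
import random

def perturbation(p_in, nu, repair):
    """Remove eta_nu = nu * len(p_in) random elements, then repair the solution."""
    p = set(p_in)
    eta_nu = int(nu * len(p))  # perturbation strength
    for _ in range(eta_nu):
        if not p:
            break
        p.discard(random.choice(tuple(p)))
    return repair(p)
```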

4.5. Repair Operator

Because the transition operator makes modifications to the solution, it may happen that the new solution does not respect some of the constraints of the optimization problem. This article solves this difficulty through a repair operator. The operator takes the solution to be repaired as input and returns a repaired solution as output. The operator first checks whether the solution needs repair. If so, the repair procedure uses Equation (6) to rank the elements to be eliminated; this elimination procedure runs until the solution obtained meets all the constraints. Subsequently, the possibility of incorporating new elements is verified, and Equation (6) is used again to rank the elements to incorporate. After completing this procedure, the repair operator returns the repaired solution. The pseudocode of this process is given in Algorithm 5.
Algorithm 5 Repair algorithm.
1: Function Repair(p)
2: Input: input solution p
3: Output: the repaired solution p
4: bRepair ← NeedsRepair(p)
5: while bRepair == True do
6:   p_max ← MaxWeight(p)
7:   p ← removeElement(p, p_max)
8:   bRepair ← NeedsRepair(p)
9: end while
10: state ← False
11: while state == False do
12:   p_min ← MinWeight(p)
13:   if p_min == ∅ then
14:     state ← True
15:   else
16:     p ← addElement(p, p_min)
17:   end if
18: end while
19: return p
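A skeleton of Algorithm 5, with the feasibility check and the Equation (6) rankings passed in as callables (all helper names are hypothetical):

```python
def repair(p, needs_repair, max_weight, min_weight):
    """Algorithm 5 skeleton: drop heavy elements until feasible, then refill.

    p: set of selected indices;
    needs_repair(p): True while p violates some constraint;
    max_weight(p): element of p with the largest Eq. (6) weight;
    min_weight(p): best element that still fits, or None when nothing fits.
    """
    while needs_repair(p):      # elimination phase
        p.discard(max_weight(p))
    while True:                 # improvement phase
        e = min_weight(p)
        if e is None:
            return p
        p.add(e)
```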

5. Results and Discussion

To determine the importance of the db-scan operator in the binarization, three groups of experiments were defined. The first group of experiments aimed to define a comparison baseline. This baseline is determined through random operators. The results and comparisons of db-scan with the random operators are developed in Section 5.2. The second group of experiments compared db-scan with k-means. K-means is another clustering technique frequently used in data analysis. The comparison and its results are described in Section 5.3. The design of the binarization framework using k-means is found in [50]. Finally, the third group of experiments developed the comparison between db-scan and TF, which is detailed in Section 5.4. In this last case, state-of-the-art algorithms were used to make the comparison. Additionally, the methodology to determine the parameters involved in the algorithms used in the binarization process is detailed in Section 5.1. To carry out the different experiments, the PSO and CS algorithms were used. They were chosen mainly because they are simple to parameterize, both have successfully solved a large number of optimization problems [2,5,80,81,82], and there are simplified convergence models for CS [83] and PSO [56].
The cb.5.500, cb.10.500, and cb.30.500 instances, which correspond to the most difficult instances of the Beasley OR-library (http://www.brunel.ac.uk/mastjjb/jeb/orlib/mknapinfo.html), were selected to carry out the experiments. The experiments were executed on a laptop running Windows 10, with an Intel Core i7-8550U processor and 16 GB of RAM, using Python 2.7 as the programming language. As a statistical test to measure significance, the Wilcoxon signed-rank non-parametric test was used.
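As a side note, a significance check of this kind can be reproduced with SciPy; the paired data below are illustrative, not results from the paper.

```python
from scipy.stats import wilcoxon

# Paired per-instance gaps (%) of two binarization schemes (toy data, 8 instances).
dbscan_gap = [0.10, 0.08, 0.12, 0.05, 0.09, 0.11, 0.07, 0.06]
random_gap = [0.22, 0.25, 0.18, 0.20, 0.21, 0.24, 0.19, 0.23]

stat, p_value = wilcoxon(dbscan_gap, random_gap)
print(p_value)  # a value below 0.05 indicates a significant difference
```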

5.1. Parameter Settings

The methodology used to determine the parameters is based on the evaluation of four measures. These measures are defined in Equations (9)–(12), and radar plots of them are used to determine the appropriate parameterization. More detail on the methodology can be found in [29,50].
Definition 1
([29]). Measure definitions:
  • The percentage deviation of the best value obtained in the ten executions compared with the best known value (see Equation (9)):
    $bSolution = 1 - \frac{KnownBestValue - BestValue}{KnownBestValue}$ (9)
  • The percentage deviation of the worst value obtained in the ten executions compared with the best known value (see Equation (10)):
    $wSolution = 1 - \frac{KnownBestValue - WorstValue}{KnownBestValue}$ (10)
  • The percentage deviation of the average value obtained in the ten executions compared with the best known value (see Equation (11)):
    $aSolution = 1 - \frac{KnownBestValue - AverageValue}{KnownBestValue}$ (11)
  • The normalized convergence time for the best value in each experiment (see Equation (12)):
    $nTime = 1 - \frac{convergenceTime - minTime}{maxTime - minTime}$ (12)
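A small helper reproducing the four measures (function and argument names are ours):

```python
def measures(best_known, best, worst, average, conv_time, min_time, max_time):
    """Equations (9)-(12): quality gaps to the best known value and normalized time."""
    b_solution = 1 - (best_known - best) / best_known            # Eq. (9)
    w_solution = 1 - (best_known - worst) / best_known           # Eq. (10)
    a_solution = 1 - (best_known - average) / best_known         # Eq. (11)
    n_time = 1 - (conv_time - min_time) / (max_time - min_time)  # Eq. (12)
    return b_solution, w_solution, a_solution, n_time
```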
For PSO, the coefficients $c_1$ and $c_2$ were set to 2. The parameter ω decreased linearly from 0.9 to 0.4. For the parameters used by db-scan, the minimum number of neighbors (minPts) was estimated as a percentage of the number of particles (N). Specifically, N = 30 and minPts = 10. To select the parameters, the cb.5.250 problems were chosen. The parameter settings are shown in Table 1 and Table 2. In both tables, the column labeled "Value" represents the selected value, and the column labeled "Range" corresponds to the set of scanned values.

5.2. The Contribution of the db-Scan Binary Operator

This section aims to determine the contribution of the db-scan operator to the MKP results. For this purpose, two random operators are designed that serve as a baseline for the comparison. The first random operator uses a fixed transition probability regardless of the velocity of the particle. We denote this operator by B-rand, and we use the two transition probability values 0.3 (B-rand3) and 0.5 (B-rand5). The second operator additionally incorporates the cluster concept: three clusters are defined, and each is assigned a transition probability from the values {0.1, 0.3, 0.5}. This operator is denoted BC-rand3. To develop comparisons between the different algorithms, CS is used as the optimization algorithm, and all implementations use the same initialization, perturbation, and repair operators.
To make the comparisons, the set of problems cb.5.500 of the OR-library was used and divided into three groups: Group 0, Problems 0–9; Group 1, Problems 10–19; and Group 2, Problems 20–29. The results obtained from the comparison of the db-scan algorithm with the B-rand and BC-rand operators are shown in Table 3 and in Figure 3.
When comparing the best values in Table 3, we observe that db-scan has the same or better performance than the random algorithms in all instances; the Wilcoxon test indicates that the difference is significant, although its magnitude is small. In the case of the avg indicator, db-scan is again superior in all cases, and here the difference is much larger; the Wilcoxon test again indicates that this difference is statistically significant. We must emphasize that, in this experiment, the only operator that was replaced was db-scan. Observing the shape of the distributions in the violin plots, we see that the median, the interquartile range, and the dispersion are much more robust for db-scan than for the rest of the algorithms.

5.3. K-means Algorithm Comparison

In this section, we develop a comparison between the db-scan and k-means clustering techniques. The k-means technique has been used to obtain binary versions of swarm intelligence algorithms and has been successfully applied to the set-covering problem [50] and to the knapsack problem [29]. Unlike db-scan, k-means needs the number of clusters to be set; therefore, this number is a parameter to estimate. In this experiment, the initialization, repair, and perturbation operators were exactly the same, and we only modified the binarization mechanism by replacing db-scan with k-means. In the case of k-means, guided by the results obtained in [29], the number of clusters k = 5 was used, and we worked with the set of problems cb.30.500 to make comparisons. As in the previous experiments, we used three groups: Problems 0–9 as Group 0, 10–19 as Group 1, and 20–29 as Group 2. The results are shown in Table 4 and Figure 4. When we analyze the best known and average indicators, we see that there is very little difference between the results obtained by the two techniques. However, when we perform a group analysis, we see that db-scan performs better than k-means in Group 0, and the Wilcoxon test indicates that this difference is statistically significant. On the other hand, k-means performs better than db-scan in Groups 1 and 2. For Group 1, the difference is not significant; in the case of Group 2, it is significant in favor of the algorithm that uses k-means. The violin plot distributions do not show a relevant difference between the algorithms. Visually, it is observed that the interquartile range of the algorithm that uses db-scan obtains better values in Group 0, whereas, in Group 2, k-means obtains better values. The dispersion is similar across the different groups; in Group 0, a greater dispersion is observed for the algorithm that uses db-scan.

5.4. Transfer Function Comparison

In this section, we compare the algorithm that uses db-scan with another general binarization mechanism, which uses transfer functions. As described in Section 3.1 and in [14], this binarization uses a transfer function $T: \mathbb{R} \to [0,1]$ to transform the particle velocity into a value in [0,1]; this value intuitively represents a probability. Subsequently, through a binarization rule, this probability becomes 0 or 1. In this comparison, we use the two best algorithms that, to the best of our knowledge, have solved the MKP using transfer functions as the binarization mechanism.
The first algorithm used in the comparison, the binary artificial algae algorithm (BAAA), was developed in [35]. The BAAA uses $\tanh(\tau|x|) = \frac{e^{\tau|x|}-1}{e^{\tau|x|}+1}$ as a transfer function, setting the parameter τ to a fixed value of 1.5. Additionally, the BAAA incorporates an elitist local search operator with the aim of improving the quality of solutions. For the execution of that algorithm, an Intel Core (TM) 2 Duo CPU with 4 GB of RAM and the 64-bit Windows 7 operating system was used. The maximum number of BAAA iterations was 35,000.
In Table 5, the results of the comparison are shown, with the best result marked in bold. When analyzing the best indicator, it is observed that db-scan-CS obtains 25 best values, db-scan-PSO 21, and the BAAA 6. The sum is greater than 30 because some best values are shared by several algorithms. In the case of the average indicator, db-scan-CS obtained 16 best values, db-scan-PSO 13, and the BAAA 1. The Wilcoxon test indicates that the difference is significant.
The second algorithm corresponds to the binary differential search (BDS) algorithm designed in [84]. This algorithm uses $\tanh(\tau|x|) = \frac{e^{\tau|x|}-1}{e^{\tau|x|}+1}$ as a transfer function; as a binarization rule, it uses a random procedure (TR-BDS) and an elitist procedure (TE-BDS). The parameter τ of the transfer function was set to a value of 2.5. Each BDS variant used a maximum of 10,000 iterations. The BDS experiments were developed with MATLAB 7.5 on a PC with an i7-4770 processor, 16 GB of RAM, and the Windows operating system. cb.10.500 was used as the dataset in the comparison. In Table 6, the results obtained by the different algorithms are shown. When we analyze the best indicator, we find that TR-BDS obtains 5 best values and TE-BDS obtains 22; db-scan-PSO and db-scan-CS obtain three and seven best values, respectively. When we compare the average of the best value indicator, we observe that db-scan-CS obtains the best result, followed by db-scan-PSO. This indicates that, although TE-BDS obtains the greatest number of best values, there are cases where its results perform worse than those of db-scan-CS and db-scan-PSO. The Wilcoxon statistical test indicates that this difference is not significant in the case of TE-BDS. When analyzing the average indicator, TE-BDS obtained the best value 5 times, db-scan-PSO and db-scan-CS 12 times each, and TR-BDS the remaining case. This result confirms that db-scan-PSO and db-scan-CS consistently obtain better values than the BDS variants. The Wilcoxon test indicates that the difference is significant.

6. Conclusions

In this work, an algorithm is proposed that uses the db-scan clustering technique to enable continuous swarm intelligence metaheuristics to solve COPs. Additionally, the algorithm uses a perturbation operator in case the solutions fall into a deep local optimum. For the experiments, the 90 largest instances commonly used in the literature were used. In comparison with random operators, we see that binarization with db-scan allows more robust binary versions to be obtained, producing consistently better results and reducing dispersion with respect to the random operators. In the experiments that used TFs, to the best of our knowledge, the best algorithms that have solved the MKP using TFs as a binarization method were chosen. In the case of the BAAA, the results of the binarizations with db-scan were better for both the best and the average indicators. In the case of TE-BDS, the difference was significant on average. In the case of k-means, the results were similar, with db-scan performing significantly better in Group 0 and k-means in Group 2.
There are several possible directions for further extensions and improvements of the present work. The first line arises from observing the configuration parameters presented in Table 1 and Table 2. The configuration procedure can be simplified and improved by incorporating adaptive mechanisms that allow the parameters to be modified in accordance with the feedback obtained from the candidate solutions. The second line is related to what was observed in the comparison between the k-means and db-scan techniques developed in Section 5.3: neither algorithm performs significantly better than the other on all problems. Therefore, by incorporating an intelligent agent that uses action-value or policy-gradient methods frequently used in reinforcement learning, a more robust algorithm could be obtained that identifies the appropriate technique or parameterization for the problem, or for the stage of the problem, that is being solved. Another possible line of research is to explore the population management of solutions dynamically: by analyzing the history of exploration and exploitation of the search space, one can identify regions where it is necessary to increase the population and others where it is appropriate to decrease it. Finally, an interesting line of research is to use new transfer functions, such as those defined in [52,85], and evaluate their performance on a problem such as the MKP. Additionally, as suggested by the research carried out in the previously cited articles, a procedure can be explored to optimally estimate the parameter τ.

Author Contributions

Conceptualization, J.G. and P.M.; methodology, J.G.; software, M.V.; validation, J.G., H.P. and P.M.; formal analysis, J.G.; investigation, J.G. and P.M.; resources, H.P.; data curation, M.V.; writing—original draft preparation, J.G. and M.V.; writing—review and editing, J.G.; visualization, J.G.; supervision, H.P.; project administration, H.P.; funding acquisition, J.G. All authors have read and agreed to the published version of the manuscript.

Funding

José García was supported by the Grant CONICYT/FONDECYT/INICIACION/11180056.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Al-Madi, N.; Faris, H.; Mirjalili, S. Binary multi-verse optimization algorithm for global optimization and discrete problems. Int. J. Mach. Learn. Cybern. 2019, 10, 3445–3465.
  2. García, J.; Moraga, P.; Valenzuela, M.; Crawford, B.; Soto, R.; Pinto, H.; Peña, A.; Altimiras, F.; Astorga, G. A Db-Scan Binarization Algorithm Applied to Matrix Covering Problems. Comput. Intell. Neurosci. 2019, 2019, 3238574.
  3. Kim, M.; Chae, J. Monarch Butterfly Optimization for Facility Layout Design Based on a Single Loop Material Handling Path. Mathematics 2019, 7, 154.
  4. Korkmaz, S.; Babalik, A.; Kiran, M.S. An artificial algae algorithm for solving binary optimization problems. Int. J. Mach. Learn. Cybern. 2018, 9, 1233–1247.
  5. García, J.; Altimiras, F.; Peña, A.; Astorga, G.; Peredo, O. A binary cuckoo search big data algorithm applied to large-scale crew scheduling problems. Complexity 2018, 2018, 8395193.
  6. Abdel-Basset, M.; Zhou, Y. An elite opposition-flower pollination algorithm for a 0-1 knapsack problem. Int. J. Bio-Inspired Comput. 2018, 11, 46–53.
  7. García, J.; Lalla-Ruiz, E.; Voß, S.; Droguett, E.L. Enhancing a machine learning binarization framework by perturbation operators: Analysis on the multidimensional knapsack problem. Int. J. Mach. Learn. Cybern. 2020, 1–20.
  8. Saeheaw, T.; Charoenchai, N. A comparative study among different parallel hybrid artificial intelligent approaches to solve the capacitated vehicle routing problem. Int. J. Bio-Inspired Comput. 2018, 11, 171–191.
  9. Valdez, F.; Castillo, O.; Jain, A.; Jana, D.K. Nature-inspired optimization algorithms for neuro-fuzzy models in real-world control and robotics applications. Comput. Intell. Neurosci. 2019, 2019, 9128451.
  10. Adeli, A.; Broumandnia, A. Image steganalysis using improved particle swarm optimization based feature selection. Appl. Intell. 2018, 48, 1609–1622.
  11. Balande, U.; Shrimankar, D. SRIFA: Stochastic Ranking with Improved-Firefly-Algorithm for Constrained Optimization Engineering Design Problems. Mathematics 2019, 7, 250.
  12. Fu, W.; Tan, J.; Zhang, X.; Chen, T.; Wang, K. Blind parameter identification of MAR model and mutation hybrid GWO-SCA optimized SVM for fault diagnosis of rotating machinery. Complexity 2019, 2019, 3264969.
  13. Soto, R.; Crawford, B.; Aste Toledo, A.; Castro, C.; Paredes, F.; Olivares, R. Solving the manufacturing cell design problem through binary cat swarm optimization with dynamic mixture ratios. Comput. Intell. Neurosci. 2019, 2019, 4787856.
  14. Crawford, B.; Soto, R.; Astorga, G.; García, J.; Castro, C.; Paredes, F. Putting continuous metaheuristics to work in binary search spaces. Complexity 2017, 2017, 8404231.
  15. Shi, Y.; Eberhart, R.C. Particle swarm optimization: Developments, applications and resources. In Proceedings of the 2001 Congress on Evolutionary Computation, Seoul, Korea, 27–30 May 2001; Volume 1, pp. 81–86.
  16. Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. 2013, 222, 175–184.
  17. Yang, X.S.; Deb, S. Cuckoo search via Lévy flights. In Proceedings of the 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC), Coimbatore, India, 9–11 December 2009; pp. 210–214.
  18. Yang, X.S. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); Springer: Berlin, Germany, 2010; pp. 65–74.
  19. Yang, X.S. Firefly algorithms for multimodal optimization. In International Symposium on Stochastic Algorithms; Springer: Berlin, Germany, 2009; pp. 169–178.
  20. Pan, W.T. A new fruit fly optimization algorithm: Taking the financial distress model as an example. Knowl.-Based Syst. 2012, 26, 69–74.
  21. Li, X.L.; Shao, Z.J.; Qian, J.X. An optimizing method based on autonomous animats: Fish-swarm algorithm. Syst. Eng. Theory Pract. 2002, 22, 32–38.
  22. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248.
  23. Caserta, M.; Voß, S. Matheuristics: Hybridizing Metaheuristics and Mathematical Programming. In Metaheuristics: Intelligent Problem Solving; Springer: Berlin, Germany, 2009; pp. 1–38.
  24. Talbi, E.G. Combining metaheuristics with mathematical programming, constraint programming and machine learning. Ann. Oper. Res. 2016, 240, 171–215.
  25. Juan, A.A.; Faulin, J.; Grasman, S.E.; Rabe, M.; Figueira, G. A review of simheuristics: Extending metaheuristics to deal with stochastic combinatorial optimization problems. Oper. Res. Perspect. 2015, 2, 62–72.
  26. Chou, J.S.; Nguyen, T.K. Forward Forecast of Stock Price Using Sliding-Window Metaheuristic-Optimized Machine-Learning Regression. IEEE Trans. Ind. Inf. 2018, 14, 3132–3142.
  27. Sayed, G.I.; Tharwat, A.; Hassanien, A.E. Chaotic dragonfly algorithm: An improved metaheuristic algorithm for feature selection. Appl. Intell. 2019, 49, 188–205.
  28. de León, A.D.; Lalla-Ruiz, E.; Melián-Batista, B.; Moreno-Vega, J.M. A Machine Learning-based system for berth scheduling at bulk terminals. Expert Syst. Appl. 2017, 87, 170–182.
  29. García, J.; Crawford, B.; Soto, R.; Castro, C.; Paredes, F. A k-means binarization framework applied to multidimensional knapsack problem. Appl. Intell. 2018, 48, 357–380.
  30. Gavish, B.; Pirkul, H. Efficient algorithms for solving multiconstraint zero-one knapsack problems to optimality. Math. Program. 1985, 31, 78–105.
  31. Vimont, Y.; Boussier, S.; Vasquez, M. Reduced costs propagation in an efficient implicit enumeration for the 0–1 multidimensional knapsack problem. J. Comb. Optim. 2008, 15, 165–178.
  32. Boussier, S.; Vasquez, M.; Vimont, Y.; Hanafi, S.; Michelon, P. A multi-level search strategy for the 0–1 multidimensional knapsack problem. Discret. Appl. Math. 2010, 158, 97–109.
  33. Mansini, R.; Speranza, M.G. Coral: An exact algorithm for the multidimensional knapsack problem. INFORMS J. Comput. 2012, 24, 399–415.
  34. Zhang, B.; Pan, Q.K.; Zhang, X.L.; Duan, P.Y. An effective hybrid harmony search-based algorithm for solving multidimensional knapsack problems. Appl. Soft Comput. 2015, 29, 288–297.
  35. Zhang, X.; Wu, C.; Li, J.; Wang, X.; Yang, Z.; Lee, J.M.; Jung, K.H. Binary artificial algae algorithm for multidimensional knapsack problems. Appl. Soft Comput. 2016, 43, 583–595.
  36. Abdel-Basset, M.; El-Shahat, D.; Faris, H.; Mirjalili, S. A binary multi-verse optimizer for 0-1 multidimensional knapsack problems with application in interactive multimedia systems. Comput. Ind. Eng. 2019, 132, 187–206.
  37. Lai, X.; Hao, J.K.; Glover, F.; Lü, Z. A two-phase tabu-evolutionary algorithm for the 0–1 multidimensional knapsack problem. Inf. Sci. 2018, 436, 282–301.
  38. Petersen, C.C. Computational experience with variants of the Balas algorithm applied to the selection of R&D projects. Manag. Sci. 1967, 13, 736–750.
  39. Chajakis, E.; Guignard, M. A model for delivery of groceries in vehicle with multiple compartments and Lagrangean approximation schemes. In Proceedings of the Congreso Latino Ibero-Americano de Investigación de Operaciones e Ingeniería de Sistemas, Mexico City, Mexico, 9 October 1992.
  40. Vasquez, M.; Hao, J.K. A logic-constrained knapsack formulation and a tabu algorithm for the daily photograph scheduling of an earth observation satellite. Comput. Optim. Appl. 2001, 20, 137–157.
  41. Yang, M.H. An efficient algorithm to allocate shelf space. Eur. J. Oper. Res. 2001, 131, 107–118.
  42. Gavish, B.; Pirkul, H. Allocation of databases and processors in a distributed data processing. Manag. Distrib. Data Process. 1982, 32, 215–231.
  43. Srikanth, K.; Panwar, L.K.; Panigrahi, B.K.; Herrera-Viedma, E.; Sangaiah, A.K.; Wang, G.G. Meta-heuristic framework: Quantum inspired binary grey wolf optimizer for unit commitment problem. Comput. Electr. Eng. 2018, 70, 243–260.
  44. Aljanad, A.; Mohamed, A.; Shareef, H.; Khatib, T. A novel method for optimal placement of vehicle-to-grid charging stations in distribution power system using a quantum binary lightning search algorithm. Sustain. Cities Soc. 2018, 38, 174–183.
  45. Hu, H.; Yang, K.; Liu, L.; Su, L.; Yang, Z. Short-Term Hydropower Generation Scheduling Using an Improved Cloud Adaptive Quantum-Inspired Binary Social Spider Optimization Algorithm. Water Resour. Manag. 2019, 33, 2357–2379.
  46. Hamedmoghadam, H.; Jalili, M.; Yu, X. An opinion formation based binary optimization approach for feature selection. Phys. A Stat. Mech. Appl. 2018, 491, 142–152.
  47. Gong, Y.J.; Zhang, J.; Liu, O.; Huang, R.Z.; Chung, H.S.H.; Shi, Y.H. Optimizing the vehicle routing problem with time windows: A discrete particle swarm optimization approach. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 2011, 42, 254–267.
  48. Tharwat, A.; Hassanien, A.E. Chaotic antlion algorithm for parameter optimization of support vector machine. Appl. Intell. 2018, 48, 670–686.
  49. Yang, Y.; Mao, Y.; Yang, P.; Jiang, Y. The unit commitment problem based on an improved firefly and particle swarm optimization hybrid algorithm. In Proceedings of the IEEE Chinese Automation Congress (CAC), Changsha, China, 7–8 November 2013; pp. 718–722.
  50. García, J.; Crawford, B.; Soto, R.; Astorga, G. A clustering algorithm applied to the binarization of Swarm intelligence continuous metaheuristics. Swarm Evol. Comput. 2019, 44, 646–664.
  51. Kyurkchiev, N.; Iliev, A. A note on the new Fibonacci hyperbolic tangent activation function. Int. J. Innov. Sci. Eng. Technol. 2017, 4, 364–368.
  52. Kyurkchiev, V.; Kyurkchiev, N. A family of recurrence generated functions based on the "half-hyperbolic tangent activation function". Biomed. Stat. Inf. 2017, 2, 87–94.
  53. Too, J.; Abdullah, A.R.; Mohd Saad, N. A New Quadratic Binary Harris Hawk Optimization for Feature Selection. Electronics 2019, 8, 1130.
  54. Mafarja, M.; Aljarah, I.; Heidari, A.A.; Faris, H.; Fournier-Viger, P.; Li, X.; Mirjalili, S. Binary dragonfly optimization for feature selection using time-varying transfer functions. Knowl.-Based Syst. 2018, 161, 185–204.
  53. Too, J.; Abdullah, A.R.; Mohd Saad, N. A New Quadratic Binary Harris Hawk Optimization for Feature Selection. Electronics 2019, 8, 1130. [Google Scholar] [CrossRef] [Green Version]
  54. Mafarja, M.; Aljarah, I.; Heidari, A.A.; Faris, H.; Fournier-Viger, P.; Li, X.; Mirjalili, S. Binary dragonfly optimization for feature selection using time-varying transfer functions. Knowl.-Based Syst. 2018, 161, 185–204. [Google Scholar] [CrossRef]
  55. Arora, S.; Anand, P. Binary butterfly optimization approaches for feature selection. Expert Syst. Appl. 2019, 116, 147–160. [Google Scholar] [CrossRef]
  56. Leonard, B.J.; Engelbrecht, A.P.; Cleghorn, C.W. Critical considerations on angle modulated particle swarm optimisers. Swarm Intell. 2015, 9, 291–314. [Google Scholar] [CrossRef]
  57. Saremi, S.; Mirjalili, S.; Lewis, A. How important is a transfer function in discrete heuristic algorithms. Neural Comput. Appl. 2015, 26, 625–640. [Google Scholar] [CrossRef] [Green Version]
  58. Bishop, C.M. Pattern Recognition and Machine Learning; Springer: Berlin, Germany, 2006. [Google Scholar]
  59. García, J.; Pope, C.; Altimiras, F. A Distributed-Means Segmentation Algorithm Applied to Lobesia botrana Recognition. Complexity 2017, 2017. [Google Scholar] [CrossRef] [Green Version]
  60. Asta, S.; Özcan, E.; Curtois, T. A tensor based hyper-heuristic for nurse rostering. Knowl.-Based Syst. 2016, 98, 185–199. [Google Scholar] [CrossRef] [Green Version]
  61. Martin, S.; Ouelhadj, D.; Beullens, P.; Ozcan, E.; Juan, A.A.; Burke, E.K. A multi-agent based cooperative approach to scheduling and routing. Eur. J. Oper. Res. 2016, 254, 169–178. [Google Scholar] [CrossRef] [Green Version]
  62. García, J.; Crawford, B.; Soto, R.; Astorga, G. percentile transition ranking algorithm applied to binarization of continuous swarm intelligence metaheuristics. In International Conference on Soft Computing and Data Mining; Springer: Berlin, Germany, 2018; pp. 3–13. [Google Scholar]
  63. Vecek, N.; Mernik, M.; Filipic, B.; Xrepinsek, M. Parameter tuning with Chess Rating System (CRS-Tuning) for meta-heuristic algorithms. Inf. Sci. 2016, 372, 446–469. [Google Scholar] [CrossRef]
  64. Ries, J.; Beullens, P. A semi-automated design of instance-based fuzzy parameter tuning for metaheuristics based on decision tree induction. J. Oper. Res. Soc. 2015, 66, 782–793. [Google Scholar] [CrossRef] [Green Version]
  65. Li, Z.Q.; Zhang, H.L.; Zheng, J.H.; Dong, M.J.; Xie, Y.F.; Tian, Z.J. Heuristic evolutionary approach for weighted circles layout. In International Symposium on Information and Automation; Springer: Berlin, Germany, 2010; pp. 324–331. [Google Scholar]
  66. Yalcinoz, T.; Altun, H. Power economic dispatch using a hybrid genetic algorithm. IEEE Power Eng. Rev. 2001, 21, 59–60. [Google Scholar] [CrossRef]
  67. Kaur, H.; Virmani, J.; Thakur, S. Chapter 10 - A genetic algorithm-based metaheuristic approach to customize a computer-aided classification system for enhanced screen film mammograms. In U-Healthcare Monitoring Systems; Dey, N., Ashour, A.S., Fong, S.J., Borra, S., Eds.; Advances in Ubiquitous Sensing Applications for Healthcare; Academic Press: Cambridge, MA, USA, 2019; pp. 217–259. [Google Scholar]
  68. Faris, H.; Hassonah, M.A.; Ala M, A.Z.; Mirjalili, S.; Aljarah, I. A multi-verse optimizer approach for feature selection and optimizing SVM parameters based on a robust system architecture. Neural Comput. Appl. 2018, 30, 2355–2369. [Google Scholar] [CrossRef]
  69. Faris, H.; Aljarah, I.; Mirjalili, S. Improved monarch butterfly optimization for unconstrained global search and neural network training. Appl. Intell. 2018, 48, 445–464. [Google Scholar] [CrossRef]
  70. Chou, J.S.; Thedja, J.P.P. Metaheuristic optimization within machine learning-based classification system for early warnings related to geotechnical problems. Autom. Constr. 2016, 68, 65–80. [Google Scholar] [CrossRef]
  71. Pham, A.D.; Hoang, N.D.; Nguyen, Q.T. Predicting compressive strength of high-performance concrete using metaheuristic-optimized least squares support vector regression. J. Comput. Civ. Eng. 2015, 30, 06015002. [Google Scholar] [CrossRef]
  72. Göçken, M.; Özçalıcı, M.; Boru, A.; Dosdoğru, A.T. Integrating metaheuristics and artificial neural networks for improved stock price prediction. Expert Syst. Appl. 2016, 44, 320–331. [Google Scholar] [CrossRef]
  73. Chou, J.S.; Pham, A.D. Nature-inspired metaheuristic optimization in least squares support vector regression for obtaining bridge scour information. Inf. Sci. 2017, 399, 64–80. [Google Scholar] [CrossRef]
  74. Kuo, R.; Lin, T.; Zulvia, F.; Tsai, C. A hybrid metaheuristic and kernel intuitionistic fuzzy c-means algorithm for cluster analysis. Appl. Soft Comput. 2018, 67, 299–308. [Google Scholar] [CrossRef]
  75. Mann, P.S.; Singh, S. Energy efficient clustering protocol based on improved metaheuristic in wireless sensor networks. J. Netw. Comput. Appl. 2017, 83, 40–52. [Google Scholar] [CrossRef]
  76. de Alvarenga Rosa, R.; Machado, A.M.; Ribeiro, G.M.; Mauri, G.R. A mathematical model and a Clustering Search metaheuristic for planning the helicopter transportation of employees to the production platforms of oil and gas. Comput. Ind. Eng. 2016, 101, 303–312. [Google Scholar] [CrossRef]
  77. Pirkul, H. A heuristic solution procedure for the multiconstraint zero? one knapsack problem. Nav. Res. Logist. 1987, 34, 161–172. [Google Scholar] [CrossRef]
  78. Kong, X.; Gao, L.; Ouyang, H.; Li, S. Solving large-scale multidimensional knapsack problems with a new binary harmony search algorithm. Comput. Oper. Res. 2015, 63, 7–22. [Google Scholar] [CrossRef]
  79. Ester, M.; Kriegel, H.P.; Sander, J.; Xu, X. A density-based algorithm for discovering clusters in large spatial databases with noise. Kdd 1996, 96, 226–231. [Google Scholar]
  80. Jiang, M.; Luo, J.; Jiang, D.; Xiong, J.; Song, H.; Shen, J. A cuckoo search-support vector machine model for predicting dynamic measurement errors of sensors. IEEE Access 2016, 4, 5030–5037. [Google Scholar] [CrossRef]
  81. Zhou, Y.; Wang, N.; Xiang, W. Clustering hierarchy protocol in wireless sensor networks using an improved PSO algorithm. IEEE Access 2017, 5, 2241–2253. [Google Scholar] [CrossRef]
  82. Mao, C.; Lin, R.; Xu, C.; He, Q. Towards a trust prediction framework for cloud services based on PSO-driven neural network. IEEE Access 2017, 5, 2187–2199. [Google Scholar] [CrossRef]
  83. He, X.S.; Wang, F.; Wang, Y.; Yang, X.S. Global Convergence Analysis of Cuckoo Search Using Markov Theory. In Nature-Inspired Algorithms and Applied Optimization; Springer: Berlin, Germany, 2018; pp. 53–67. [Google Scholar]
  84. Liu, J.; Wu, C.; Cao, J.; Wang, X.; Teo, K.L. A binary differential search algorithm for the 0–1 multidimensional knapsack problem. Appl. Math. Model. 2016, 40, 9788–9805. [Google Scholar] [CrossRef] [Green Version]
  85. Golev, A.; Iliev, A.; Kyurkchiev, N. A Note on the Soboleva’Modified Hyperbolic Tangent Activation Function. Int. J. Innov. Sci. Eng. Technol. 2017, 4, 177–182. [Google Scholar]
Figure 1. (a) Generic binarization diagram; and (b) specific binarization diagram.
Figure 2. A general flow chart of the binary db-scan algorithm.
Figure 3. Gap comparison of the db-scan, B-rand, and BC-rand algorithms for the cb.5.500 MKP dataset.
Figure 4. Gap comparison between the db-scan and k-means algorithms for the cb.30.500 dataset.
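Figures 3 and 4 plot gaps rather than raw objective values. Assuming the standard percentage deviation from the best known value used in MKP benchmarking (the function name below is ours, for illustration only), the plotted quantity can be reproduced as follows:

```python
# Minimal sketch of the gap metric, assuming the standard definition
# gap(%) = 100 * (best_known - obtained) / best_known.
def gap_percent(best_known: float, obtained: float) -> float:
    return 100.0 * (best_known - obtained) / best_known

# Illustration with instance 0 of cb.5.500 (Table 3):
print(round(gap_percent(120148, 120096), 3))  # 0.043
```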
Table 1. Parameter setting for the PSO algorithm.

| Parameter | Description | Value | Range |
| α | Initial transition coefficient | 0.1 | [0.08, 0.1, 0.12] |
| β | Transition probability coefficient | 0.6 | [0.5, 0.6, 0.7] |
| N | Number of particles | 30 | [30, 40, 50] |
| ϵ | db-scan ϵ parameter | 0.3 | [0.3, 0.4, 0.5] |
| minPts | db-scan point parameter | 10% | [10, 12, 14] |
| Iteration Number | Maximum iterations | 900 | [800, 900, 1000] |
Table 2. Parameter setting for the CS algorithm.

| Parameter | Description | Value | Range |
| α | Transition probability coefficient | 0.1 | [0.08, 0.1, 0.12] |
| β | Transition probability coefficient | 0.5 | [0.5, 0.6, 0.7] |
| N | Number of particles | 30 | [30, 40, 50] |
| ϵ | db-scan ϵ parameter | 0.3 | [0.3, 0.4, 0.5] |
| minPts | db-scan point parameter | 12% | [10, 12, 14] |
| γ | Step length | 0.01 | [0.009, 0.01, 0.011] |
| κ | Lévy distribution parameter | 1.5 | [1.4, 1.5, 1.6] |
| Iteration Number | Maximum iterations | 900 | [800, 900, 1000] |
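Tables 1 and 2 share the binarization parameters α, β, ϵ, and minPts. The following is a minimal sketch, not the authors' implementation, of how a db-scan operator can map continuous velocities to bit transitions: the velocity magnitudes are clustered with DBSCAN, and each cluster receives a transition probability that grows with its rank; the linear rank rule and the name dbscan_binarize are assumptions of ours, borrowed by analogy from the k-means framework of [29].

```python
# A minimal sketch (not the authors' code) of a db-scan binarization step.
import numpy as np
from sklearn.cluster import DBSCAN

def dbscan_binarize(velocity, x_bin, x_best, alpha=0.1, beta=0.6,
                    eps=0.3, min_pts_fraction=0.10, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    v = np.abs(velocity).reshape(-1, 1)
    min_samples = max(1, int(min_pts_fraction * len(v)))
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(v)

    # Rank clusters by mean |velocity|; faster clusters move more often.
    clusters = [c for c in np.unique(labels) if c != -1]
    order = sorted(clusters, key=lambda c: v[labels == c].mean())
    n_groups = max(len(order), 1)

    probs = np.empty(len(v))
    for rank, c in enumerate(order):
        # Hypothetical rule: probability grows linearly with cluster rank.
        probs[labels == c] = min(alpha + beta * rank / n_groups, 1.0)
    probs[labels == -1] = min(alpha + beta, 1.0)  # DBSCAN noise points

    flip = rng.random(len(v)) < probs
    new_x = x_bin.copy()
    new_x[flip] = x_best[flip]  # transition toward the best-known solution
    return new_x
```

Under this sketch, a solution's binary position is updated dimension by dimension, with ϵ and minPts taken directly from Tables 1 and 2.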
Table 3. Comparison of the db-scan, B-rand, and BC-rand operators for the cb.5.500 MKP dataset.

| Instance | Best Known | db-scan-CS Best | db-scan-CS Avg | B-rand3-CS Best | B-rand3-CS Avg | B-rand5-CS Best | B-rand5-CS Avg | BC-rand-CS Best | BC-rand-CS Avg |
| 0 | 120,148 | 120,096 | 120,029.9 | 120,066 | 119,012.1 | 120,066 | 118,983.5 | 120,066 | 119,001.3 |
| 1 | 117,879 | 117,730 | 117,617.5 | 117,702 | 116,316.1 | 117,730 | 116,375.2 | 117,702 | 116,421.8 |
| 2 | 121,131 | 121,039 | 120,937.9 | 120,951 | 118,921.4 | 120,951 | 118,902.2 | 120,951 | 118,981.7 |
| 3 | 120,804 | 120,683 | 120,522.8 | 120,583 | 119,162.6 | 120,572 | 118,901.6 | 120,572 | 118,971.4 |
| 4 | 122,319 | 122,280 | 122,165.2 | 122,231 | 120,742.6 | 122,231 | 120,931.4 | 122,231 | 121,065.9 |
| 5 | 122,024 | 121,982 | 121,868.7 | 121,957 | 120,164.3 | 121,957 | 120,189.1 | 121,957 | 120,217.5 |
| 6 | 119,127 | 119,070 | 118,950.0 | 119,068 | 116,839.4 | 119,068 | 116,821.3 | 119,068 | 116,897.1 |
| 7 | 120,568 | 120,472 | 120,336.6 | 120,463 | 118,963.9 | 120,463 | 118,967.2 | 120,463 | 119,054.5 |
| 8 | 121,586 | 121,377 | 121,161.9 | 121,077 | 120,243.1 | 121,052 | 120,241.5 | 121,052 | 120,295.2 |
| 9 | 120,717 | 120,524 | 120,362.9 | 120,499 | 118,841.4 | 120,499 | 118,889.1 | 120,499 | 118,977.2 |
| 10 | 218,428 | 218,296 | 218,163.7 | 218,185 | 217,085.2 | 218,196 | 217,023.1 | 218,185 | 217,115.4 |
| 11 | 221,202 | 220,951 | 220,813.9 | 220,852 | 219,156.1 | 220,852 | 219,198.7 | 220,852 | 219,277.2 |
| 12 | 217,542 | 217,349 | 217,254.3 | 217,258 | 215,732.1 | 217,258 | 215,738.4 | 217,258 | 215,812.8 |
| 13 | 223,560 | 223,518 | 223,455.2 | 223,510 | 221,984.7 | 223,510 | 221,962.3 | 223,510 | 222,054.2 |
| 14 | 218,966 | 218,848 | 218,771.5 | 218,811 | 216,873.9 | 218,811 | 216,822.1 | 218,811 | 216,999.7 |
| 15 | 220,530 | 220,441 | 220,342.2 | 220,429 | 219,012.8 | 220,441 | 219,084.3 | 220,429 | 219,175.2 |
| 16 | 219,989 | 219,858 | 219,717.9 | 219,785 | 217,983.1 | 219,785 | 217,961.2 | 219,785 | 218,096.8 |
| 17 | 218,215 | 218,032 | 217,890.1 | 218,010 | 216,321.5 | 218,010 | 216,338.1 | 218,010 | 216,479.2 |
| 18 | 216,976 | 216,866 | 216,798.8 | 216,940 | 214,547.2 | 216,940 | 214,579.6 | 216,940 | 214,654.2 |
| 19 | 219,719 | 219,631 | 219,520.0 | 219,602 | 217,946.2 | 219,602 | 217,955.6 | 219,602 | 218,086.2 |
| 20 | 295,828 | 295,717 | 295,628.4 | 295,652 | 294,269.6 | 295,652 | 294,281.3 | 295,652 | 294,376.1 |
| 21 | 308,086 | 307,924 | 307,860.6 | 307,783 | 305,754.9 | 307,783 | 305,701.2 | 307,783 | 305,874.6 |
| 22 | 299,796 | 299,796 | 299,717.8 | 299,727 | 298,158.9 | 299,727 | 298,126.6 | 299,727 | 298,279.5 |
| 23 | 306,480 | 306,480 | 306,445.2 | 306,469 | 304,841.3 | 306,469 | 304,891.5 | 306,469 | 305,056.2 |
| 24 | 300,342 | 300,245 | 300,202.5 | 300,240 | 298,814.7 | 300,240 | 298,799.1 | 300,240 | 298,941.6 |
| 25 | 302,571 | 302,492 | 302,442.3 | 302,481 | 300,769.8 | 302,481 | 300,732.6 | 302,481 | 300,926.9 |
| 26 | 301,339 | 301,284 | 301,238.3 | 301,272 | 300,126.1 | 301,272 | 300,121.8 | 301,272 | 300,312.2 |
| 27 | 306,454 | 306,325 | 306,264.2 | 306,325 | 304,467.8 | 306,290 | 304,503.2 | 306,290 | 304,701.8 |
| 28 | 302,828 | 302,769 | 302,721.4 | 302,749 | 300,891.8 | 302,749 | 300,825.3 | 302,749 | 300,912.8 |
| 29 | 299,910 | 299,774 | 299,722.7 | 299,774 | 298,947.1 | 299,757 | 298,964.7 | 299,757 | 299,147.3 |
| Average | 214,168.8 | 214,061.63 | 213,964.15 | 214,015.03 | 212,429.72 | 214,013.8 | 212,427.09 | 214,012.1 | 212,538.78 |
| Wilcoxon p-value | | | | 3.40 × 10−5 | 1.73 × 10−6 | 3.4031 × 10−5 | 1.73 × 10−6 | 1.64 × 10−5 | 1.73 × 10−6 |
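The p-values in the last row come from a paired, instance-by-instance Wilcoxon signed-rank test. The following sketch shows how such a test can be reproduced with SciPy; the five values below are an excerpt of the Avg columns (instances 0–4), whereas the reported p-values use all 30 instances.

```python
from scipy.stats import wilcoxon

# Average results for instances 0-4 (excerpt of Table 3).
dbscan_cs_avg = [120029.9, 117617.5, 120937.9, 120522.8, 122165.2]
brand3_cs_avg = [119012.1, 116316.1, 118921.4, 119162.6, 120742.6]

stat, p = wilcoxon(dbscan_cs_avg, brand3_cs_avg)
# With only five pairs the two-sided p cannot drop below 1/16 = 0.0625.
print(f"p-value = {p:.4f}")
```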
Table 4. Comparison between the db-scan and k-means operators for the cb.30.500 MKP dataset.

| Instance | Best Known | db-scan-CS Best | db-scan-CS Avg | db-scan-PSO Best | db-scan-PSO Avg | k-means-PSO Best | k-means-PSO Avg | k-means-CS Best | k-means-CS Avg |
| 0 | 116,056 | 115,526 | 115,371.4 | 115,526 | 115,362.9 | 115,526 | 115,239.2 | 115,526 | 115,232.3 |
| 1 | 114,810 | 114,405 | 114,325.1 | 114,684 | 114,327.2 | 114,352 | 114,174.1 | 114,405 | 114,165.1 |
| 2 | 116,741 | 116,256 | 116,137.6 | 116,583 | 116,301.4 | 116,158 | 116,061.3 | 116,256 | 116,052.9 |
| 3 | 115,354 | 114,782 | 114,699.3 | 114,782 | 114,687.3 | 114,739 | 114,581.3 | 114,782 | 114,545.7 |
| 4 | 116,525 | 116,353 | 115,921.8 | 116,353 | 115,924.6 | 115,994 | 115,880.4 | 115,995 | 115,876.8 |
| 5 | 115,741 | 115,594 | 115,172.4 | 115,244 | 115,177.8 | 115,342 | 115,146.2 | 115,342 | 115,140.1 |
| 6 | 114,181 | 113,952 | 113,549.8 | 113,712 | 113,535.4 | 113,712 | 113,429.6 | 113,712 | 113,438.2 |
| 7 | 114,348 | 113,626 | 113,527.3 | 113,626 | 113,525.1 | 113,610 | 113,531.4 | 113,626 | 113,519.5 |
| 8 | 115,419 | 114,822 | 114,666.2 | 114,822 | 114,638.5 | 114,822 | 114,697.6 | 114,822 | 114,673.1 |
| 9 | 117,116 | 116,467 | 116,351.5 | 116,467 | 116,344.3 | 116,382 | 116,379.8 | 116,467 | 116,370.6 |
| Group 0 average | 115,629.1 | 115,178.3 | 114,972.24 | 115,179.9 | 114,982.45 | 115,093.3 | 114,912.09 | 115,063.7 | 114,901.43 |
| Wilcoxon p-value | | | | | | | 0.02 | | 0.01 |
| 10 | 218,104 | 217,776 | 217,561.1 | 217,607 | 217,541.2 | 217,776 | 217,620.3 | 217,776 | 217,618.7 |
| 11 | 214,648 | 214,110 | 214,082.2 | 214,110 | 214,070.4 | 214,110 | 213,998.4 | 214,110 | 214,001.3 |
| 12 | 215,978 | 215,580 | 215,500.1 | 215,580 | 215,504.3 | 215,580 | 215,498.1 | 215,638 | 215,497.6 |
| 13 | 217,910 | 217,201 | 217,119.8 | 217,301 | 217,109.3 | 217,301 | 217,217.3 | 217,301 | 217,211.8 |
| 14 | 215,689 | 215,036 | 214,951.2 | 215,036 | 214,961.2 | 215,036 | 214,991.2 | 215,116 | 214,997.2 |
| 15 | 215,919 | 215,326 | 215,204.5 | 215,408 | 215,104.5 | 215,408 | 215,221.4 | 215,408 | 215,167.2 |
| 16 | 215,907 | 215,576 | 215,426.4 | 215,576 | 215,407.6 | 215,576 | 215,488.8 | 215,576 | 215,491.4 |
| 17 | 216,542 | 215,999 | 215,942.8 | 215,999 | 215,987.4 | 216,057 | 216,012.6 | 216,057 | 215,977.2 |
| 18 | 217,340 | 217,013 | 216,839.3 | 216,882 | 216,860.2 | 217,013 | 216,878.7 | 217,013 | 216,821.5 |
| 19 | 214,739 | 214,194 | 214,104.3 | 214,194 | 214,133.7 | 214,332 | 214,129.1 | 214,332 | 214,121.1 |
| Group 1 average | 216,277.6 | 215,781.1 | 215,673.17 | 215,769.3 | 215,667.98 | 215,832.7 | 215,705.59 | 215,818.9 | 215,690.5 |
| Wilcoxon p-value | | | | | | | 0.11 | | 0.26 |
| 20 | 301,675 | 301,343 | 301,201.3 | 301,343 | 301,187.4 | 301,343 | 301,249.2 | 301,343 | 301,240.3 |
| 21 | 300,055 | 299,636 | 299,564.2 | 299,636 | 299,551.2 | 299,636 | 299,584.5 | 299,720 | 299,577.2 |
| 22 | 305,087 | 304,850 | 304,741.9 | 304,850 | 304,769.8 | 304,995 | 304,752.4 | 304,995 | 304,749.1 |
| 23 | 302,032 | 301,658 | 301,572.4 | 301,658 | 301,531.4 | 301,658 | 301,586.1 | 301,645 | 301,585.8 |
| 24 | 304,462 | 304,186 | 304,081.4 | 304,186 | 304,080.9 | 304,186 | 304,109.9 | 304,186 | 304,100.6 |
| 25 | 297,012 | 296,450 | 296,404.9 | 296,450 | 296,411.5 | 296,450 | 296,422.1 | 296,521 | 296,411.4 |
| 26 | 303,364 | 302,917 | 302,832.6 | 302,917 | 302,839.1 | 302,917 | 302,841.3 | 302,941 | 302,841.8 |
| 27 | 307,007 | 306,616 | 306,448.9 | 306,616 | 306,446.5 | 306,616 | 306,453.8 | 306,616 | 306,449.2 |
| 28 | 303,199 | 302,636 | 302,541.2 | 302,636 | 302,549.6 | 302,791 | 302,567.3 | 302,791 | 302,561.9 |
| 29 | 300,596 | 300,170 | 300,037.3 | 300,170 | 300,042.5 | 300,170 | 300,072.4 | 300,170 | 300,062.4 |
| Group 2 average | 302,446.5 | 302,044.9 | 301,942.61 | 302,046.2 | 301,940.99 | 302,092.8 | 301,963.9 | 302,076.2 | 301,957.97 |
| Wilcoxon p-value | | | | | | | 0.007 | | 0.04 |
| Total average | 211,451.9 | 211,001.4 | 210,862.7 | 210,998.5 | 210,863.8 | 211,006.3 | 210,860.5 | 210,986.3 | 210,850.0 |
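For the k-means-PSO and k-means-CS columns, the clustering step replaces db-scan with k-means. A minimal sketch of the transition-probability assignment follows, again assuming the cluster-rank rule of [29]; the linear formula and the function name are illustrative, not taken from the paper.

```python
# Sketch of the k-means alternative binarization (k-means-PSO / k-means-CS).
import numpy as np
from sklearn.cluster import KMeans

def kmeans_transition_probs(velocity, k=5, alpha=0.1, beta=0.6):
    v = np.abs(velocity).reshape(-1, 1)
    km = KMeans(n_clusters=k, n_init=10).fit(v)
    # Rank of each cluster by centroid magnitude (0 = slowest group).
    ranks = np.argsort(np.argsort(km.cluster_centers_.ravel()))
    # Hypothetical rule: probability grows linearly with the cluster rank.
    return alpha + beta * ranks[km.labels_] / max(k - 1, 1)

# Example: transition probabilities for a 500-dimension velocity vector.
probs = kmeans_transition_probs(np.random.randn(500))
```

Unlike db-scan, this variant fixes the number of groups k in advance, which is precisely the design difference the comparison in Table 4 probes.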
Table 5. Comparison between the db-scan and BAAA algorithms for the cb.5.500 MKP dataset (the best result is marked in bold).

| Instance | Best Known | db-scan-CS Best | db-scan-CS Avg | db-scan-CS Std | db-scan-PSO Best | db-scan-PSO Avg | db-scan-PSO Std | BAAA Best | BAAA Avg | BAAA Std |
| 0 | 120,148 | 120,096 | 120,029.9 | 35.50 | 120,096 | 120,025.8 | 33.75 | 120,066 | 120,013.7 | 21.57 |
| 1 | 117,879 | 117,730 | 117,617.5 | 66.99 | 117,837 | 117,617.8 | 83.42 | 117,702 | 117,560.5 | 11.4 |
| 2 | 121,131 | 121,039 | 120,937.9 | 62.11 | 120,951 | 120,933.5 | 65.31 | 120,951 | 120,782.9 | 87.96 |
| 3 | 120,804 | 120,683 | 120,522.8 | 77.17 | 120,752 | 120,520.1 | 78.8 | 120,572 | 120,340.6 | 106.01 |
| 4 | 122,319 | 122,280 | 122,165.2 | 69.27 | 122,280 | 122,164.4 | 68.37 | 122,231 | 122,101.8 | 56.95 |
| 5 | 122,024 | 121,982 | 121,868.7 | 73.17 | 121,982 | 121,871.9 | 71.09 | 121,957 | 121,741.8 | 84.33 |
| 6 | 119,127 | 119,070 | 118,950 | 77.24 | 119,070 | 118,950.6 | 72.32 | 119,070 | 118,913.4 | 63.01 |
| 7 | 120,568 | 120,472 | 120,336.6 | 77.56 | 120,472 | 120,336 | 82.25 | 120,472 | 120,331.2 | 69.09 |
| 8 | 121,586 | 121,377 | 121,161.9 | 107.14 | 121,185 | 121,162.5 | 10.14 | 121,052 | 120,683.6 | 834.88 |
| 9 | 120,717 | 120,524 | 120,362.9 | 88.39 | 120,499 | 120,331.9 | 84.21 | 120,499 | 120,296.3 | 110.06 |
| 10 | 218,428 | 218,296 | 218,163.7 | 77.55 | 218,296 | 218,160.4 | 81.85 | 218,185 | 217,984.7 | 123.94 |
| 11 | 221,202 | 220,951 | 220,813.9 | 70.66 | 220,951 | 220,810.8 | 67.52 | 220,852 | 220,527.5 | 169.16 |
| 12 | 217,542 | 217,349 | 217,254.3 | 51.88 | 217,349 | 217,250 | 51.04 | 217,258 | 217,056.7 | 104.95 |
| 13 | 223,560 | 223,518 | 223,455.2 | 38.96 | 223,518 | 223,459.4 | 42.03 | 223,510 | 223,450.9 | 26.02 |
| 14 | 218,966 | 218,848 | 218,771.5 | 46.90 | 218,962 | 218,775 | 48.66 | 218,811 | 218,634.3 | 97.52 |
| 15 | 220,530 | 220,441 | 220,342.2 | 57.51 | 220,428 | 220,346 | 38.48 | 220,429 | 220,375.9 | 31.86 |
| 16 | 219,989 | 219,858 | 219,717.9 | 72.25 | 219,858 | 219,721 | 71.26 | 219,785 | 219,619.3 | 93.01 |
| 17 | 218,215 | 218,032 | 217,890.1 | 75.96 | 218,010 | 217,889 | 76.21 | 218,032 | 217,813.2 | 115.37 |
| 18 | 216,976 | 216,866 | 216,798.8 | 41.18 | 216,940 | 216,803.2 | 45.86 | 216,940 | 216,862 | 32.51 |
| 19 | 219,719 | 219,631 | 219,520 | 72.39 | 219,631 | 219,521.9 | 75.27 | 219,602 | 219,435.1 | 54.45 |
| 20 | 295,828 | 295,717 | 295,628.4 | 47.69 | 295,717 | 295,627.8 | 51.28 | 295,652 | 295,505 | 76.3 |
| 21 | 308,086 | 307,924 | 307,860.6 | 32.56 | 307,924 | 307,861.8 | 35.38 | 307,783 | 307,577.5 | 135.94 |
| 22 | 299,796 | 299,796 | 299,717.8 | 46.31 | 299,796 | 299,720.6 | 44.25 | 299,727 | 299,664.1 | 28.81 |
| 23 | 306,480 | 306,480 | 306,445.2 | 21.01 | 306,480 | 306,448.5 | 19.94 | 306,469 | 306,385 | 31.64 |
| 24 | 300,342 | 300,245 | 300,202.5 | 25.23 | 300,240 | 300,199.4 | 25.8 | 300,240 | 300,136.7 | 51.84 |
| 25 | 302,571 | 302,492 | 302,442.3 | 26.69 | 302,481 | 302,441.2 | 30.64 | 302,492 | 302,376 | 53.94 |
| 26 | 301,339 | 301,284 | 301,238.3 | 27.53 | 301,272 | 301,240.7 | 19.06 | 301,272 | 301,158 | 44.3 |
| 27 | 306,454 | 306,325 | 306,264.2 | 37.49 | 306,325 | 306,259.7 | 38.91 | 306,290 | 306,138.4 | 84.56 |
| 28 | 302,828 | 302,769 | 302,721.4 | 28.90 | 302,749 | 302,720.1 | 26.28 | 302,769 | 302,690.1 | 34.11 |
| 29 | 299,910 | 299,774 | 299,722.7 | 33.20 | 299,774 | 299,718.9 | 35.82 | 299,757 | 299,702.3 | 31.66 |
| Average | 214,168.8 | 214,061.6 | 213,964.1 | 55.5 | 214,060.8 | 213,964.0 | 51.8 | 214,014.2 | 213,862.0 | 95.6 |
| Wilcoxon p-value | | | 9.31 × 10−6 | | | 7.69 × 10−6 | | | | |
Table 6. Comparison between the db-scan and BDS algorithms for the cb.10.500 MKP dataset (the best result is marked in bold).

| Instance | Best Known | TR-DBS Best | TR-DBS Avg | TE-DBS Best | TE-DBS Avg | db-scan-PSO Best | db-scan-PSO Avg | db-scan-CS Best | db-scan-CS Avg |
| 0 | 117,821 | 114,716 | 114,425.4 | 117,811 | 117,801.2 | 117,558 | 117,299.9 | 117,726 | 117,502.5 |
| 1 | 119,249 | 119,232 | 119,223.0 | 119,249 | 118,024 | 119,232 | 118,987.7 | 119,139 | 118,942.7 |
| 2 | 119,215 | 119,215 | 117,625.6 | 119,215 | 117,801.4 | 119,039 | 118,836.8 | 119,039 | 118,719.0 |
| 3 | 118,829 | 118,813 | 117,625.8 | 118,813 | 117,801.2 | 118,598 | 118,517.5 | 118,586 | 118,428.9 |
| 4 | 116,530 | 114,687 | 114,312.4 | 116,509 | 114,357.2 | 116,434 | 116,093.4 | 116,312 | 116,009.9 |
| 5 | 119,504 | 119,504 | 112,503.7 | 119,504 | 117,612.8 | 119,257 | 119,112.8 | 119,402 | 119,162.7 |
| 6 | 119,827 | 116,094 | 115,629.1 | 119,827 | 119,427.4 | 119,691 | 119,556.1 | 119,663 | 119,507.9 |
| 7 | 118,344 | 116,642 | 115,531.9 | 118,301 | 117,653.3 | 118,016 | 117,909.8 | 118,058 | 117,756.6 |
| 8 | 117,815 | 114,654 | 114,204 | 117,815 | 115,236.4 | 117,550 | 117,370.1 | 117,550 | 117,238.3 |
| 9 | 119,251 | 114,016 | 113,622.8 | 119,231 | 118,295.1 | 118,896 | 118,733.0 | 118,896 | 118,517.1 |
| 10 | 217,377 | 209,191 | 208,710.2 | 217,377 | 212,570.3 | 217,010 | 216,890.5 | 217,126 | 216,889.8 |
| 11 | 219,077 | 219,077 | 217,277.2 | 219,077 | 218,570.2 | 218,872 | 218,594.2 | 218,872 | 218,588.9 |
| 12 | 217,847 | 210,282 | 210,172.3 | 217,377 | 212,570.4 | 217,573 | 217,553.9 | 217,447 | 217,342.0 |
| 13 | 216,868 | 209,242 | 206,178.6 | 216,868 | 216,468.9 | 216,570 | 216,481.5 | 216,570 | 216,465.3 |
| 14 | 213,873 | 207,017 | 206,656 | 207,017 | 206,455 | 213,474 | 213,373.2 | 213,474 | 213,361.8 |
| 15 | 215,086 | 204,643 | 203,989.5 | 215,086 | 215,086 | 215,013 | 214,945.4 | 214,829 | 214,700.2 |
| 16 | 217,940 | 205,439 | 204,828.9 | 217,940 | 217,440.5 | 217,583 | 217,479.3 | 217,629 | 217,560.6 |
| 17 | 219,990 | 208,712 | 207,881.6 | 219,984 | 209,990.2 | 219,675 | 219,520.3 | 219,675 | 219,548.9 |
| 18 | 214,382 | 210,503 | 209,787.6 | 210,735 | 211,038.2 | 214,015 | 213,865.8 | 214,045 | 213,939.4 |
| 19 | 220,899 | 205,020 | 204,435.7 | 220,899 | 219,986.8 | 220,582 | 220,395.8 | 220,582 | 220,522.2 |
| 20 | 304,387 | 304,387 | 302,658.8 | 304,387 | 304,264.5 | 304,102 | 303,954.8 | 304,102 | 304,016.2 |
| 21 | 302,379 | 302,379 | 301,658.6 | 302,379 | 302,164.4 | 302,263 | 302,081.9 | 302,263 | 302,155.5 |
| 22 | 302,417 | 290,931 | 290,859.9 | 302,416 | 302,014.6 | 302,103 | 301,966.5 | 302,118 | 302,066.8 |
| 23 | 300,784 | 290,859 | 290,021.4 | 291,295 | 291,170.6 | 300,542 | 300,481.7 | 300,566 | 300,493.7 |
| 24 | 304,374 | 289,365 | 288,950.1 | 304,374 | 304,374.0 | 304,267 | 304,168.4 | 304,267 | 304,192.6 |
| 25 | 301,836 | 292,411 | 292,061.8 | 301,836 | 301,836.0 | 301,730 | 301,465.9 | 301,730 | 301,327.3 |
| 26 | 304,952 | 291,446 | 290,516.2 | 291,446 | 291,446 | 304,833 | 304,780.4 | 304,905 | 304,811.2 |
| 27 | 296,478 | 293,662 | 293,125.5 | 295,342 | 294,125.5 | 296,263 | 296,191.5 | 296,363 | 296,285.6 |
| 28 | 301,359 | 285,907 | 285,293.4 | 288,907 | 287,923.4 | 301,085 | 301,027.6 | 301,085 | 301,032.0 |
| 29 | 307,089 | 290,300 | 289,552.4 | 295,358 | 290,525.2 | 306,881 | 306,781 | 306,881 | 306,782.5 |
| Average | 212,859.3 | 206,278.2 | 205,310.6 | 210,879.2 | 209,511.0 | 212,623.6 | 212,480.6 | 212,630.0 | 212,462.3 |
| Wilcoxon p-value TR | | | | | | 2.15 × 10−5 | 1.9 × 10−6 | 2.60 × 10−6 | 1.88 × 10−6 |
| Wilcoxon p-value TE | | | | | | 0.48 | 0.001 | 0.38 | 0.001 |
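TR-DBS and TE-DBS are transfer-function-based binarizations of the binary differential search algorithm of [84]. For context, the generic transfer-function mechanism behind such baselines can be sketched as follows; the S-shaped sigmoid used here is one common choice for illustration and not necessarily the exact function of [84].

```python
# Generic transfer-function (TF) binarization: a velocity is mapped to a
# probability, which then drives a stochastic bit assignment.
import numpy as np

def tf_binarize(velocity, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    prob = 1.0 / (1.0 + np.exp(-velocity))  # S-shaped transfer function
    return (rng.random(velocity.shape) < prob).astype(int)

bits = tf_binarize(np.array([-2.0, 0.0, 2.0]))  # e.g., array([0, 0, 1])
```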
