Article

Combinatorial GVNS (General Variable Neighborhood Search) Optimization for Dynamic Garbage Collection

by Christos Papalitsas 1,*, Panayiotis Karakostas 2, Theodore Andronikos 1, Spyros Sioutas 1 and Konstantinos Giannakis 1
1 Department of Informatics, Ionian University, 7 Tsirigoti Square, 49132 Corfu, Greece
2 Department of Applied Informatics, University of Macedonia, 54636 Thessaloniki, Greece
* Author to whom correspondence should be addressed.
Algorithms 2018, 11(4), 38; https://doi.org/10.3390/a11040038
Submission received: 20 November 2017 / Revised: 7 March 2018 / Accepted: 23 March 2018 / Published: 27 March 2018

Abstract

General variable neighborhood search (GVNS) is a well-known and widely used metaheuristic for efficiently solving many NP-hard combinatorial optimization problems. We propose a novel extension of the conventional GVNS. Our approach incorporates ideas and techniques from the field of quantum computation during the shaking phase. The travelling salesman problem (TSP) is a well-known NP-hard problem which has been widely used for modelling many real-life routing cases; consequently, the TSP can serve as a basis for modelling and finding routes via the Global Positioning System (GPS). In this paper, we examine the potential use of this method for the GPS routing of garbage trucks. Specifically, we provide a thorough presentation of our method accompanied by extensive computational results. The experimental data accumulated on a plethora of TSP instances, which are shown in a series of figures and tables, allow us to conclude that the novel GVNS algorithm can provide an efficient solution for this type of geographical problem.

1. Introduction

Many complex real world problems can be formulated as combinatorial optimization (CO) problems. Technically, CO problems require a proper solution from a discrete finite set of feasible solutions in order to simultaneously achieve the minimization (or maximization) of a cost function and the satisfaction of the problem’s given constraints. One such problem is finding the shortest path (or a path “close” to the shortest with respect to some appropriate metric) for a global positioning system (GPS) in a short period of time. The above real-world problem can be perfectly modeled by the travelling salesman problem (TSP).
The travelling salesman problem (TSP) is one of the most widely studied combinatorial optimization problems. Solving the TSP means finding the minimum-cost route so that the salesman (the person or entity who travels along a specific route containing many nodes) can start from a point of origin, pass through all given nodes exactly once, and return to the origin. The first use of the term “travelling salesman problem” appeared around 1931–1932. Remarkably, a century earlier, in 1832, a book was printed in Germany [1] which, although dedicated to other issues, deals in its last chapter with the essence of the TSP: “With a suitable choice and route planning, you can often save so much time [making] suggestions [...]. The most important aspect is to cover as many locations [as possible], without visiting a location [a] second time.” In that book, the TSP is illustrated for the first time using some example routes through Germany and Switzerland, although no in-depth study of the problem is attempted. The TSP was first expressed mathematically in the 19th century by Hamilton and Kirkman [1]. A cycle in a graph is a closed path that starts and ends at the same node without visiting any intermediate node more than once. A cycle containing all vertices of the graph is called Hamiltonian.
In short, TSP is the problem of finding the shortest Hamiltonian cycle. The Hamiltonian graph problem, i.e., deciding whether a graph has a Hamiltonian cycle, is reducible to the TSP. One can see this by assigning length zero to the existing edges of the graph and adding a new edge of length one for each missing edge. If the solution of the TSP for the resulting graph is zero, then the original graph contains a Hamiltonian cycle; if it is a positive number, then the original graph contains no Hamiltonian cycle [2]. TSP is NP-hard and is of great significance in various fields, such as operational research and theoretical computer science. In practice, TSP amounts to finding the best way one can visit all the cities, return to the starting point, and minimize the cost of the tour.
Typically, TSP is represented by a graph. Specifically, the problem is stated in terms of a complete graph $G = (V, A)$, in which $V = \{v_1, v_2, \ldots, v_n\}$ is the set of nodes and $A = \{(v_i, v_j) : v_i, v_j \in V,\ v_i \neq v_j\}$ is the set of directed edges or arcs. Each arc is associated with a weight $c_{ij}$ representing the cost (or the distance) of moving from node $i$ to node $j$. If $c_{ij}$ is equal to $c_{ji}$, the TSP is symmetric (sTSP); otherwise, it is called asymmetric (aTSP). The fact that TSP is NP-hard implies that there is no known polynomial-time algorithm for finding an optimal solution regardless of the size of the problem instance [3].
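To make the objective of this formulation concrete, the following is a minimal Python sketch of how the cost of a closed tour is evaluated from the weight matrix $c_{ij}$. It is an illustration of ours (the article reports a Fortran 90 implementation), and the function name and data layout are assumptions; the same helper is reused in the sketches later in the paper.

```python
from typing import List, Sequence


def tour_cost(tour: List[int], cost: Sequence[Sequence[float]]) -> float:
    """Total weight of a closed tour.

    `tour` is a permutation of the node indices 0..n-1 and cost[i][j] holds
    the weight c_ij of arc (v_i, v_j).  The closing arc back to the starting
    node is included.  For a symmetric TSP cost[i][j] == cost[j][i]; for an
    asymmetric TSP the two entries may differ.
    """
    return sum(cost[tour[k]][tour[(k + 1) % len(tour)]] for k in range(len(tour)))
```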
Real world problems, such as those related to GPS, can be formulated as instances of the TSP. This class of routing problems requires good solutions computed in a short amount of time. In order to improve the computational time, it is common practice to sacrifice some of the solution’s quality by adopting heuristic and metaheuristic approaches [4,5,6]. Heuristics are fast, approximate computational methods divided into construction and improvement heuristics. Construction heuristics are used to build feasible initial solutions, and improvement heuristics are applied to achieve better solutions; it is customary to apply improvement heuristics iteratively. Metaheuristics are general optimization frameworks whose individual components can be appropriately adapted in order to generate efficient methods for solving specific classes of optimization problems.
The main contribution of this paper is the application of a novel extension of the general variable neighborhood search (GVNS for short) metaheuristic to a GPS problem. We achieve efficient solutions in a short period of time for a GPS application used for garbage trucks, which is modeled as an instance of the TSP. The proposed method provides optimal or near-optimal solutions for a real-life routing application. GPS applications typically use the nearest neighbor (NN) heuristic or one of its variations in order to achieve good results for the routing component. The routing component is a critical part of a typical GPS application because, if the provided routes are optimal or near-optimal, then the end result will also be near-optimal. The routing solutions produced by the GVNS were compared to the ones obtained by the well known NN algorithm and some of its most widely used variations. The computational results reveal that both GVNS’s first and best improvement solutions provide efficient routes that are either optimal or near-optimal and outperform classic, widely used methods, such as NN and its modifications, by a wide margin. This novel GVNS is a quantum-inspired expansion of the conventional GVNS in which the shaking function is based on complex unit vectors.
This paper is organized as follows. In Section 2 we present related work; in Section 3 we explain metaheuristics and variable neighborhood search and describe our algorithm in detail. A GPS application and the specific problem we tackle with our algorithm are presented in Section 4. In Section 5, the experimental results of our implementation are presented in a series of tables and figures that clearly demonstrate that our method outperforms standard approaches like the NN algorithm. Finally, conclusions and ideas for future work are given in Section 6.

2. Related Work

The research community has shown great interest in solving tangible, real world problems via methods applicable to CO problems. Recently, many authors have been actively trying to enhance conventional optimization methods by introducing principles and techniques originating from unconventional methods of computation, in the hope that they would prove superior to traditional approaches. For example, Dey et al. [7] proposed several novel techniques, namely quantum-inspired ant colony optimization, quantum-inspired differential evolution, and quantum-inspired particle swarm optimization, for multi-level color image thresholding. These techniques find optimal threshold values at different levels of thresholding for color images.
A new quantum-inspired social evolution (QSE) algorithm was proposed by hybridizing a well-known social evolution algorithm with an emerging quantum-inspired evolutionary one. The proposed QSE algorithm was applied to the 0–1 knapsack problem and the performance of the algorithm was compared to various evolutionary, swarm, and quantum-inspired evolutionary algorithmic variants. Pavithr and Gursaran claim that the performance of the QSE algorithm is better than or at least comparable to the different evolutionary algorithmic variants it was tested against [8].
Wei Fang et al. proposed a decentralized form of quantum-inspired particle swarm optimization with a cellular structured population for maintaining population diversity and balancing global and local search [9]. Zheng et al. conducted an interesting study by applying a novel hybrid quantum-inspired evolutionary algorithm to a permutation flow-shop scheduling problem. They proposed a simple representation method for the determination of job sequence in the permutation flow-shop scheduling problem based on the probability amplitude of qubits [10].
Lu et al. designed a quantum-inspired space search algorithm in order to solve numerical optimization problems. In their algorithm, the feasible solution is decomposed into regions in terms of quantum representation. The search progresses from one generation to the next, while the quantum bits evolve gradually to increase the probability of region selection [11]. Wu et al. in [12] proposed a novel approach using a quantum-inspired algorithm based on game-theoretic principles. In particular, they reduced the problem they studied to choosing strategies in evolutionary games. Quantum games and their strategies seem very promising, offering enhanced capabilities over classic ones [13].
Solutions based on variable neighborhood search (VNS) have been applied to route planning problems. Sze et al. proposed a hybrid adaptive variable neighborhood search algorithm for solving the capacitated vehicle routing problem (VRP) [14]. A two-level VNS heuristic has been developed by Defryn and Sörensen in order to tackle the clustered VRP [15]. In [16], a VNS approach for the solution of the recently introduced swap-body VRP is proposed. Curtin et al. made an extensive comparative study of well known methods and ready-to-use software and concluded that no software or classic method can guarantee an optimal solution for TSP instances modelling GIS problems with more than 25 nodes [17]. Papalitsas et al. proposed a GVNS approach for the TSP with time windows [4] and a quantum-inspired GVNS (qGVNS) for solving the TSP with time windows [18].
Moreover, the proposed method can also be applied to other classes of real-world problems, such as localization optimization in hospitals, smart cities, and smart parking systems. Tsiropoulou et al. applied RFID technologies to tag-to-tag communication paradigms in order to achieve improved energy efficiency and operational effectiveness [19]. Liebig et al. presented a system for trip planning that takes future traffic hazards into account [20]. Specifically, this system estimates traffic flow in areas with low sensor coverage using Gaussian process regression. Many studies also deal with optimizing the localization and positioning of doctors and nurses in hospitals and health care organizations [21,22].
In this work we demonstrate that, for small problems, our method achieves the optimal value; for larger problems, it guarantees a close to optimal solution with a deviation of 1–3%.

3. Novel Variable Neighborhood Search

In this section, we describe the proposed version of the VNS method. For completeness, we state the necessary background notions and formal definitions (such as those of metaheuristics and VNS).

3.1. Metaheuristics

A metaheuristic is a high-level heuristic designed to find, create, or select a lower-level heuristic (for example, a local search algorithm) that can provide an adequate solution to an optimization problem. It is particularly useful for instances with missing or incomplete information, or when the computing capacity is limited. According to the literature on metaheuristic optimization [23], the word “metaheuristics” was coined by Glover. Metaheuristic algorithms make few assumptions about the optimization problem to be solved and thus can be used for a wide variety of problems. Obviously, compared with exact methods, metaheuristic procedures do not guarantee a globally optimal solution for every category of problems [24].
Many metaheuristic algorithms apply some form of stochastic optimization. This implies that the generated solution depends on a set of random variables. By searching in a large set of feasible solutions, the metaheuristic procedures can often find good solutions with less computational effort than exact algorithms, iterative methods, or simple heuristic procedures. Therefore, metaheuristic procedures are useful approaches for optimization problems in many practical situations.

3.2. Variable Neighborhood Search (VNS)

VNS is a metaheuristic for solving combinatorial and global optimization problems, proposed by Mladenovic and Hansen [25,26]. The main idea of this framework is the systematic neighborhood change in order to achieve an optimal (or a close-to-optimal) solution [27]. VNS and its extensions have proven their efficiency in solving many combinatorial and global optimization problems [28].
Each VNS heuristic consists of three parts. The first one is a shaking procedure (diversification phase) used to escape local optima. The next one is the neighborhood change move, in which the next neighborhood structure to be searched is determined; during this part, an acceptance or rejection criterion is also applied to the last solution found. The third part is the improvement phase (intensification), achieved through the exploration of neighborhood structures via the application of different local search moves. Variable neighborhood descent (VND) is a method in which the neighborhood change procedure is performed deterministically.
GVNS is a VNS variant where the VND method is used as the improvement procedure. GVNS has been successfully tested in many applications, as several recent works have demonstrated [29,30].

3.3. Description of Our Algorithm

As does the original GVNS, our version of GVNS consists of a VND local search, a diversification procedure, and a neighborhood change step. In our method, the pipe-VND (in which the search keeps exploiting the same neighborhood for as long as it produces improvements) is used during the improvement phase. During the improvement phase of the pipe-VND, two classic local search moves are applied: relocate and 2-opt. In relocate, new solutions are obtained by removing a node and inserting it at a different position of the current route. In 2-opt, new solutions are obtained by breaking two edges and reconnecting them in a different order.
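The two moves, together with a first-improvement scan over them, can be sketched in Python as follows. This is an illustrative reading of the description above rather than the authors' Fortran 90 code; as simplifying assumptions, the depot is kept fixed at position 0 and only index pairs with i < j are scanned.

```python
from typing import Callable, List, Sequence


def tour_cost(tour: List[int], cost: Sequence[Sequence[float]]) -> float:
    # Closed-tour cost, identical to the earlier sketch.
    return sum(cost[tour[k]][tour[(k + 1) % len(tour)]] for k in range(len(tour)))


def relocate(tour: List[int], i: int, j: int) -> List[int]:
    """Remove the node at position i and re-insert it at position j."""
    out = tour[:]
    out.insert(j, out.pop(i))
    return out


def two_opt(tour: List[int], i: int, j: int) -> List[int]:
    """Break the arc entering position i and the arc leaving position j,
    then reconnect the tour by reversing the segment tour[i..j]."""
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]


def first_improvement_pass(tour: List[int],
                           cost: Sequence[Sequence[float]],
                           move: Callable[[List[int], int, int], List[int]]) -> List[int]:
    """Return the result of the first improving `move` (relocate or two_opt),
    or the unchanged tour if it is a local optimum for that move."""
    base = tour_cost(tour, cost)
    n = len(tour)
    for i in range(1, n - 1):
        for j in range(i + 1, n):
            candidate = move(tour, i, j)
            if tour_cost(candidate, cost) < base:
                return candidate
    return tour
```

A best-improvement (BI) variant would instead scan all (i, j) pairs and keep the cheapest candidate rather than the first improving one.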
The biggest difference between our approach and the classic GVNS is in the diversification phase. The main use of a shaking function is to resolve local minima traps within a VNS procedure. In our approach, perturbation is achieved by adopting techniques from the field of quantum computation.
Quantum-inspired procedures are not actual quantum algorithms designed to run on future quantum computers, but conventional, classical algorithms that utilize principles and ideas from the field of quantum computing. Quantum computing was envisioned by Feynman [31,32], who was the first to observe that it is not possible to efficiently simulate an actual quantum system using a classical computer. More in-depth information regarding quantum computation and its principles can be found in [33,34].
In our case, during each shaking call, a simulated quantum $n$-qubit register generates a complex $n$-dimensional unit vector. A quantum register is the quantum analogue of a classical processor register. The dimension $n$ of the complex unit vector is greater than or equal to the dimension of the problem. Our algorithm takes as input the complex $n$-dimensional vector and produces a real $n$-dimensional vector. The $i$-th component, $1 \leq i \leq n$, of the real vector is equal to the modulus squared of the $i$-th component of the complex vector. Obviously, the components of the real vector are real numbers in the interval $[0, 1]$.
Each node of the current solution is associated with precisely one of the components of the real $n$-dimensional vector. In effect, the vector components act as flags for the nodes of the current solution. Under this correspondence between vector components and nodes, sorting the components of the real vector induces an identical reordering of the nodes in the solution. Thus, the reordered route produced by this shaking move drives the exploration into a different region of the search space.
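A minimal Python sketch of our reading of this shaking step follows: a random complex unit vector stands in for the state of the simulated register, its squared moduli give values in [0, 1], and sorting those values induces a new ordering of the nodes. The use of NumPy, the function name, and the handling of the depot (left to the caller) are assumptions, not part of the original description.

```python
import numpy as np


def quantum_inspired_shake(tour: list) -> list:
    """Perturb `tour` using a random complex unit vector.

    A complex vector with the same dimension as the tour is drawn and
    normalised to unit length.  The modulus squared of each component is
    a real number in [0, 1]; sorting these values induces a new ordering
    of the nodes, i.e. a jump to another region of the search space.
    """
    n = len(tour)
    amplitudes = np.random.randn(n) + 1j * np.random.randn(n)
    amplitudes /= np.linalg.norm(amplitudes)   # complex unit vector
    weights = np.abs(amplitudes) ** 2          # moduli squared, in [0, 1]
    order = np.argsort(weights)                # sorting the components ...
    return [tour[k] for k in order]            # ... reorders the nodes
```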
At this point, it should be mentioned that the NN heuristic is used in order to produce an initial feasible solution (the first node is set as a depot). From an algorithmic perspective, the procedure is summarized in the next pseudocode fragment [35]. The solution method is provided in Algorithm 1:
Algorithm 1: Pseudocode of the novel general variable neighborhood search (GVNS)
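The pseudocode of Algorithm 1 appears only as an image in the published version; the following Python outline is a hedged reconstruction of the flow just described (NN initial tour, quantum-inspired shaking, pipe-VND over relocate and 2-opt, and a time-based stopping criterion). It relies on the helpers sketched earlier in this section and on the NN constructor sketched in Section 4, and all names are illustrative assumptions, not the authors' Fortran 90 implementation.

```python
import time
from typing import List, Sequence


def pipe_vnd(tour: List[int], cost: Sequence[Sequence[float]]) -> List[int]:
    """Pipe-VND: stay with the current move for as long as it keeps improving,
    switch to the next move when it stops, and repeat until no move improves."""
    moves = [relocate, two_opt]                  # helpers from the earlier sketch
    improved_somewhere = True
    while improved_somewhere:
        improved_somewhere = False
        for move in moves:
            while True:
                candidate = first_improvement_pass(tour, cost, move)
                if tour_cost(candidate, cost) < tour_cost(tour, cost):
                    tour = candidate             # improvement: keep the same move
                    improved_somewhere = True
                else:
                    break                        # local optimum for this move
    return tour


def novel_gvns(cost: Sequence[Sequence[float]], time_limit: float = 60.0):
    """Shake with the quantum-inspired perturbation, improve with pipe-VND,
    and accept a candidate whenever it beats the incumbent."""
    tour = nearest_neighbor_tour(cost, depot=0)  # initial feasible solution (Section 4)
    best, best_cost = tour, tour_cost(tour, cost)
    start = time.time()
    while time.time() - start < time_limit:      # 60 s limit, as in Section 5
        shaken = quantum_inspired_shake(best)    # diversification
        improved = pipe_vnd(shaken, cost)        # intensification
        improved_cost = tour_cost(improved, cost)
        if improved_cost < best_cost:            # neighborhood change / acceptance
            best, best_cost = improved, improved_cost
    return best, best_cost
```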

4. GPS Application for Garbage Trucks Modeled as a Travelling Salesman Problem

An operation of substantial importance for the everyday scheduling of a city’s traffic is the routing of garbage trucks from their depots to every dustbin on their routes and back to their depots. The optimal routes correspond to the minimum required transportation time and minimum distance. Finding optimal routes typically proves to be time-consuming, especially in the case of metropolitan cities with very dense road networks. However, by exploiting recent advances from the field of metaheuristics, it is possible to attain efficient, near-optimal solutions in a short amount of time for many practical cases. We take advantage of the performance improvement brought by a novel metaheuristic procedure based on VNS. Our novel GVNS, combined with the minimal required computational time, can provide a significantly better solution for this kind of problem. The incorporation of the enhanced VNS procedure within the GIS will lower the system’s response time and provide close-to-optimal solutions.
GIS technology integrates common database operations such as query and statistical analysis with the unique visualization and geographic analysis benefits offered by maps [36,37]. Among other things, a GIS facilitates the modeling of spatial networks (e.g., road networks) offering algorithms to query and analyze them. Spatial networks are modeled with graphs. In the case of road networks, the graph’s arcs correspond to street segments, whereas the nodes correspond to street segment intersections. Each arc has a weight associated with it, representing the cost of traversing it.
A GIS usually provides a number of tools for the analysis of spatial networks. It generally offers tools to find the shortest or minimum-cost route through a network and heuristic procedures to find the most efficient route through a series of locations. Such a problem is typically modeled as an instance of the TSP. Our implementation solves the TSP efficiently, finding near-optimal solutions for a range of small, medium, and large benchmark problems. Distance matrix calculation can be used to compute distances between pairs of nodes representing origins and destinations, whereas location-allocation functions determine site locations and assign demand to sites. These capabilities of GIS for analyzing spatial networks enable them to be used as decision support systems for the districting and routing of vehicles [38,39].
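As a small illustration of the distance matrix calculation mentioned above, a matrix of pairwise distances between node coordinates can be built as in the following sketch. It is generic and assumes straight-line distances; a real GIS would instead fill the entries with shortest-path costs over the road network, but the resulting matrix feeds the TSP solver in exactly the same way.

```python
import math
from typing import List, Tuple


def euclidean_distance_matrix(points: List[Tuple[float, float]]) -> List[List[float]]:
    """Pairwise straight-line distances between (x, y) node coordinates."""
    n = len(points)
    dist = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d = math.hypot(points[i][0] - points[j][0], points[i][1] - points[j][1])
            dist[i][j] = dist[j][i] = d   # symmetric case; an aTSP matrix need not be
    return dist
```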
Routing a garbage truck from its depot to each dustbin and back is modeled as a travelling salesman problem. The GIS will be used to find the optimal routes corresponding to the minimum required transportation time. The GIS can also present the driver with directions corresponding to the generated routes; these directions will be transmitted to the garbage truck. In a real-time system, the time performance of the routing function is of vital significance, and metaheuristics, like the implementation presented here, can provide it.

Our Approach

In this paper, we describe a system offering a solution to the problem of garbage truck routing management. It is based on quantum-inspired metaheuristics applied to the TSP and integrated with GIS/GPS technologies. Our approach is an integrated waste management solution. Based on the functional requirements and some case studies [40], the components of our application are designed and decomposed into subsystems and smaller functional units. Operations and relationships between subsystems are defined for each subsystem:
  • Bin Sensors: equipment that estimates the waste bin fill level and collects, stores, and transmits bin data. This will help our main system take specific bins into account and avoid including empty bins in the computation of the final route.
  • Data Gathering from Bins: a unit that communicates with the bin sensors and delivers the collected information to the central system. It can be installed on passing vehicles and consists of three components:
    • a communicator, which implements the communication with the bin;
    • a storage procedure, which temporarily stores the data until transferred to the central system;
    • a transfer procedure, which implements the data transfer to the central system.
    All these operations will be applied via the Global System for Mobile Communications (GSM) network.
  • GPS Navigation Applications Integrated in Trucks: a classic GPS-based navigation application that provides navigation guidance to the truck driver together with instructions regarding which bins should be collected.
  • A Central System: the back-end system of our application. Its main task is data storage: bin data, vehicle data, and all data needed to compute the most efficient routes. Furthermore, the data store will keep any information retrieved from other subsystems, particularly from the data mining subsystem and the routing optimization subsystem. All generated spatial information for the current route will be stored locally in a spatial database.
  • A Map Substructure: A REST-ful API based on maps that will provide all the required functionality for creating rich-web applications based on geographic and descriptive data.
  • A Data Mining Subsystem: a system mainly used to estimate the fullness of bins when up-to-date information is not available.
  • A Routing Optimization Structure: our main contribution, based on implementing efficient routing algorithms for this application. We propose this novel metaheuristic, which provides optimal or close-to-optimal solutions within a short period of time. Our implementation of the routing functionality gives an overall comparative advantage over other implementations for two main reasons:
    • We can compute efficient solutions in a short period of time. This makes the application efficient because we can continuously re-compute the same route many times, live, and feed the results back to the application.
    • Our routing algorithm outperforms classic methods, such as the NN, that are used to find routes on GPS, and the solutions for every class of problem are near-optimal.
A real-time system like ours must be able to give prompt replies to such routing requests because, in these situations, the response time is of vital importance. By using efficient local search structures and the novel GVNS for the problem tour, the metaheuristic algorithm can provide better results.
Most GIS/GPS implementations use either the NN or one of its variations to compute routes. NN is a well known and widely used construction heuristic in network design problems. Initially, in NN, an arbitrary node is inserted in the route as the starting node; then, iteratively, the node nearest to the last added node is selected and inserted in the route. The procedure terminates when all the nodes have been added to the route. Therefore, because of the popularity of NN, when we test the performance of our algorithm, we compare its results both with the optimal solution and with the solutions achieved by NN. This comparison demonstrates that GVNS produces results that, in all cases, are near-optimal and significantly superior to those of the NN algorithm, which is widely used by most GIS/GPS software. As a result of these better routing solutions, the total amount of fuel is drastically reduced. Thus, our method is environmentally friendly, since fuel consumption has a direct impact on the environment.
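For reference, the NN construction heuristic just described can be sketched as follows; this is a generic illustration (with node 0 taken as the default depot, as in Section 3.3), not the routing code of any particular GIS/GPS product.

```python
from typing import List, Sequence


def nearest_neighbor_tour(cost: Sequence[Sequence[float]], depot: int = 0) -> List[int]:
    """Greedy NN construction: start at the depot and repeatedly visit the
    cheapest not-yet-visited node until the route contains every node."""
    n = len(cost)
    unvisited = set(range(n)) - {depot}
    tour = [depot]
    while unvisited:
        last = tour[-1]
        nearest = min(unvisited, key=lambda v: cost[last][v])
        tour.append(nearest)
        unvisited.remove(nearest)
    return tour
```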

5. Experimental Results

This section is devoted to the presentation of the experimental results that showcase the strengths of the novel GVNS. The experimental tests were implemented in Fortran 90 and ran on a laptop PC with an Intel Core i7-6700HQ processor at 2.59 GHz and 16 GB of memory. We ran each benchmark at least five times and imposed a limit of 60 s per run. We chose this threshold for two reasons. First, it is adequate to compute an optimal or near-optimal solution, and, second, it is short enough to allow for repeated experiments. We then kept the best solution found and computed the average of all runs in order to derive the final cost. The algorithms were tested on 48 benchmark instances from TSPLIB, a library that contains a collection of benchmarks for the TSP. These benchmarks are characterized by their diversity and their variation with respect to the dimension of the problem. It is precisely for these qualities that they are widely used by researchers for comparing results [41].

5.1. Novel GVNS versus Nearest Neighbor

In this work, we propose a novel GVNS version based on quantum computing principles and techniques, and we apply it to a garbage collection application, an actual GPS-based problem. Below we present the computational results of our approach. As already mentioned, NN and its variants are widely used in GPS/GIS applications in order to construct the tour of the underlying graph. In particular, we compare our method against the NN heuristic and two of its most well-known variants: the repeated NN and the improved NN. This section presents the comparative analysis among NN, GVNS, and the optimal value (OV), showing the results in a series of figures and tables. It is important to point out that we chose to examine the NN because of its widespread use in GPS applications.
Table 1 contains the aggregated experimental results. Specifically, it lists the benchmark name, the NN cost, its gap from the OV, the average cost with best improvement (BI) for each problem, the gap between GVNS using BI and the OV, the average cost with first improvement (FI) for each problem, the gap between GVNS using FI and the OV, and the OV. Given an outcome $x$, its gap from the optimal value $OV$ is computed by the formula $\frac{OV - x}{OV}$. The gap is widely used in the field of optimization to measure how close a particular solution is to the optimal one; a worked example for att48 is given after the following list. The data demonstrate that GVNS
  • is consistently very close to the optimal value both with FI and BI, and
  • outperforms NN in all cases.
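As a worked illustration of the gap formula, using the att48 row of Table 1 (where $OV = 10{,}628$):

$$\text{gap}_{\mathrm{NN}} = \frac{10{,}628 - 12{,}861}{10{,}628} \approx -0.2101, \qquad \text{gap}_{\mathrm{BI}} = \frac{10{,}628 - 10{,}645}{10{,}628} \approx -0.0016 .$$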
The shaded lines in Table 1 emphasize the superiority of GVNS for certain instances of the benchmarks. Specifically, we highlight those instances where GVNS has a gap of 2% or less from the optimal value. For example, the NN’s gap from the optimal value for att48 is −0.2101, whereas GVNS’s BI search strategy achieves −0.0016, and GVNS using FI achieves −0.0024. Moreover, GVNS achieves the optimal value (0.0000 gap from the OV) for bayg29, bays29, eil51, fri26, gr17, gr21, and gr24. Another characteristic case is the benchmark lin105, where NN has a −0.4156 gap from the optimal, while GVNS with the BI strategy achieves −0.0163, and GVNS using FI achieves −0.0159.

5.2. GVNS versus Nearest Neighbor Variants

In addition to the previous setup, we ran additional tests using the two most popular variants of the NN heuristic: the repeated NN and the improved NN. The improved NN is a variant of the classic NN in which the starting pair of the route is the shortest edge in the distance matrix [42]; the remaining nodes are added to the route in a way identical to the simple NN method. Repeated (or repetitive) NN is another modification in which the NN algorithm is run once from every possible starting node, and the route with the minimum total cost is then selected as the best one.
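A minimal sketch of the repeated NN variant follows (the improved NN differs only in how the first edge of the route is chosen, per [42], and is not sketched here). It reuses the nearest_neighbor_tour and tour_cost helpers from the earlier sketches and is an illustration of ours, not the reference implementations used in the experiments.

```python
from typing import List, Sequence, Tuple


def repeated_nearest_neighbor(cost: Sequence[Sequence[float]]) -> Tuple[List[int], float]:
    """Repeated (repetitive) NN: run the NN construction once from every
    possible starting node and keep the cheapest of the resulting tours."""
    best_tour: List[int] = []
    best_cost = float("inf")
    for start in range(len(cost)):
        tour = nearest_neighbor_tour(cost, depot=start)
        c = tour_cost(tour, cost)
        if c < best_cost:
            best_tour, best_cost = tour, c
    return best_tour, best_cost
```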
We compared GVNS with the improved NN and the repeated NN, and the experimental results are contained in Table 2 and Table 3. These tables show the Benchmark Name, the improved NN tour cost, the repeated NN tour cost, the GVNS using BI tour cost, the GVNS using FI tour cost, and the Optimal Value (OV). For consistency, we ran the same experiments as before, but instead of the simple NN, we used the improved NN and the repeated NN, and we compared the results with GVNS and the OV. The improved NN and repeated NN achieve better results compared to the simple NN for each benchmark. However, the fact remains that GVNS using FI or using BI still outperforms these improved NN-based methods.
Table 2 and Table 3 confirm that GVNS using either BI or FI outperforms the improved NN and the repeated NN. Let us consider, for example, the benchmark eil101, which consists of 101 nodes. The computational results show that the improved NN has a tour cost of 823, the repeated NN has a tour cost of 746, GVNS with first improvement reaches 649, and GVNS with best improvement reaches 647, while the optimal tour cost is 629. It is clear that GVNS comes very close to the OV, whereas the gap between both versions of GVNS and both NN variants is relatively large. Other examples of benchmarks that highlight the superiority of GVNS are bayg29, d198, and ch130.
Table 3 contains computational data that corroborate that GVNS performs better than the improved and the repeated NN. Characteristic examples are gr229, lin105, and pr136, which are benchmark instances of 229, 105, and 136 nodes, respectively. We can therefore conclude that GVNS is always close to the optimal value and clearly outperforms the NN variants.

5.3. Novel GVNS Versus Conventional GVNS

The data in Table 4 reveal that the novel GVNS is indeed an improvement over the classic GVNS. From the data, one may conclude that in all cases this new version of GVNS is at least as good as the conventional GVNS and in many cases outperforms it. This is in accordance with the work in [35], which, through a comparative analysis, also showed that the quantum-inspired GVNS achieves better results than the conventional GVNS.
Table 5 contains the experimental results of the novel GVNS on some asymmetric TSP benchmarks.
To further test the applicability of our method, we applied the novel GVNS to some national TSPs (nTSPs) in order to examine its behavior on much bigger instances. Table 6 contains the computational results of the novel GVNS on nTSPs.
Our aim in this paper was the implementation of a method based on a well-known metaheuristic that would be efficient enough to solve real world problems, such as the garbage collection challenge studied here. We wanted to develop an algorithm that would be able to find near-optimal solutions in a relatively short period of time. To achieve our goals, we chose to implement this novel version of GVNS, since we expected to outperform the conventional GVNS. This was indeed confirmed experimentally, as the results in Table 4 show. The bulk of our experiments were meant to determine how GVNS fares against well-established methods that are widely applied for GPS, like the NN and its most important variations. Table 1, Table 2 and Table 3 contain experimental evidence suggesting that GVNS outperforms the NN, the repeated NN, and the improved NN heuristics in all cases.

5.4. Graphical Representation of the Results

In this section, we present two different sets of figures with three different bar charts for each set. Each figure concerns a different subset of the total experiments, sorted according to the optimal value. In the first set (Figure 1, Figure 2 and Figure 3), each benchmark problem is represented with four different bars, one for each method. Specifically, one represents the NN algorithm, one the optimal value, and another two depict GVNS (best and first improvement). It can be easily seen that our implementation achieves results very close to optimal, unlike the NN algorithm, which does not provide efficient solutions.
A notable example in Figure 1 is ch130: GVNS (using either FI or BI) is much closer to the optimal value than NN.
It is clear from Figure 2 that, in kroC100, kroA100, kroD100, kroE100, kroB100, kroB150, kroA150, kroA200, and kroB200, the gap between the NN method and both variants of the GVNS is quite large. We can also observe some cases in Figure 2 where GVNS practically attains the optimal value, namely berlin52, hk48, and att48.
We observe that, in most benchmarks in Figure 3, GVNS using FI or BI comes close to the optimal value, while the NN bar is further away from it. Examples include lin318, pr124, gr137, pr76, and bier127.
In the next set of figures (i.e., Figure 4, Figure 5 and Figure 6), each benchmark is represented with five different bars; one for the improved NN algorithm, one for the repeated NN algorithm, one for the optimal value, and another two for our implementation (best and first improvement). A close examination reveals that GVNS is once again very close to optimal values, whereas the improved and repeated NN algorithms fail to provide equally good solutions.
Looking at Figure 4, we infer that, for medium-sized benchmark problems, GVNS closely approximates the optimal value, unlike both variants of NN, which are far from the optimal. This is particularly visible for kroB100, lin105, kroB150, and kroB200.
In general, Figure 4, Figure 5 and Figure 6 show that the improved NN appears to be a better method than repeated NN. However, in all cases, both GVNS variants outperform the two NN variants.
Figure 6 demonstrates once again the superiority of GVNS. Let us consider benchmark pr76, where GVNS practically touches the optimal point, while both the repeated and the improved NN perform much worse. In addition, in Figure 4, Figure 5 and Figure 6, we can see for a different set of benchmarks that GVNS again outperforms the NN variants, being significantly closer to the optimal value in each case.
To sum up, the graphical results allow us to conclude that GVNS with BI or FI produces results that are appreciably close to the OV in most benchmark tests, whereas the NN algorithm is far from close in every case. Previously, we provided the analytical results in tabular form. The graphical representation of these results makes it easy to see the efficiency of the proposed algorithm, since one can immediately see that the divergence from the optimal is almost negligible.

6. Conclusions and Future Work

This work studied an application of garbage collector routing using a new metaheuristic method based on a novel version of GVNS. We modeled the underlying problem as a TSP instance and went on to solve it using GVNS. Our GVNS algorithm differs from conventional approaches due to its inspiration from principles of quantum computing. Our study focused on quick and efficient transitions to different areas of the search space, which enabled us to find efficient routes in a short period of time. To assess the efficiency of our approach, we performed extensive experimental tests using well-known benchmarks from TSPLIB. The results were quite encouraging, as they confirmed that the novel GVNS outperforms methods that are widely used in practice, such as NN, repeated NN, and improved NN.
A direction for future work could be the investigation of alternative neighborhood structures and neighborhood change moves in VND (variable neighborhood descent) under the GVNS framework. In the same vein, one could study modifications of the perturbation phase in order to achieve even closer-to-optimal solutions, particularly on bigger asymmetric benchmarks. Yet another direction could be the use of a multi-improvement strategy [43]. In any event, we plan to apply, adapt if necessary, and assess GVNS on other TSP variants and real-life routing optimization problems.

Author Contributions

All of the authors have contributed extensively to this work. C.P. and P.K. conceived the initial algorithm and worked on the first prototypes. P.K., K.G., and C.P. thoroughly analyzed the current literature gathering all the necessary material. T.A. assisted C.P. in designing the methods used in the main part. T.A. and S.S. were responsible for supervising the construction of this work. S.S. was responsible for the interlinking between the theoretic model and the actual application. C.P. and K.G. contributed to the appropriate typing of the formal definitions and the maths used in the paper. All the authors contributed to the writing of the paper.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
GIS: Geographic Information System
GPS: Global Positioning System
GVNS: General Variable Neighborhood Search
NN: Nearest Neighbor
OV: Optimal Value
TSP: Travelling Salesman Problem
VND: Variable Neighborhood Descent
VNS: Variable Neighborhood Search

References

  1. Voigt, B.F. Der Handlungsreisende, Wie er Sein Soll und was er zu thun Hat, um Aufträge zu Erhalten und Eines Glücklichen Erfolgs in Seinen Geschäften Gewiss zu Sein. Commis-Voageur, Ilmenau; Verlag Bernd Schramm: Kiel, Germany, 1981.
  2. Lawler, E.L.; Lenstra, J.K.; Rinnooy Kan, A.H.G.; Shmoys, D.B. The Traveling Salesman Problem: A Guided Tour of Combinatorial Optimization; Wiley: Hoboken, NJ, USA, 1985; p. 476.
  3. Rego, C.; Gamboa, D.; Glover, F.; Osterman, C. Traveling salesman problem heuristics: Leading methods, implementations and latest advances. Eur. J. Oper. Res. 2011, 211, 427–441.
  4. Papalitsas, C.; Giannakis, K.; Andronikos, T.; Theotokis, D.; Sifaleras, A. Initialization methods for the TSP with Time Windows using Variable Neighborhood Search. In Proceedings of the IEEE 6th International Conference on Information, Intelligence, Systems and Applications (IISA 2015), Corfu, Greece, 6–8 July 2015.
  5. Silva, R.F.D.; Urrutia, S. A General VNS heuristic for the traveling salesman problem with time windows. Discrete Optim. 2010, 7, 203–211.
  6. Mladenovic, N.; Todosijevic, R.; Urosevic, D. An efficient GVNS for solving Traveling Salesman Problem with Time Windows. Electron. Notes Discrete Math. 2012, 39, 83–90.
  7. Dey, S.; Bhattacharyya, S.; Maulik, U. New quantum-inspired meta-heuristic techniques for multi-level colour image thresholding. Appl. Soft Comput. 2016, 46, 677–702.
  8. Pavithr, R.; Gursaran. Quantum Inspired Social Evolution (QSE) algorithm for 0–1 knapsack problem. Swarm Evol. Comput. 2016, 29, 33–46.
  9. Fang, W.; Sun, J.; Chen, H.; Wu, X. A decentralized quantum-inspired particle swarm optimization algorithm with cellular structured population. Inf. Sci. 2016, 330, 19–48.
  10. Zheng, T.; Yamashiro, M. A novel hybrid quantum-inspired evolutionary algorithm for permutation flow-shop scheduling. J. Stat. Manag. Syst. 2009, 12, 1165–1182.
  11. Lu, T.C.; Juang, J.C. Quantum-inspired space search algorithm (QSSA) for global numerical optimization. Appl. Math. Comput. 2011, 218, 2516–2532.
  12. Wu, Q.; Jiao, L.; Li, Y.; Deng, X. A novel quantum-inspired immune clonal algorithm with the evolutionary game approach. Prog. Nat. Sci. 2009, 19, 1341–1347.
  13. Giannakis, K.; Papalitsas, C.; Kastampolidou, K.; Singh, A.; Andronikos, T. Dominant Strategies of Quantum Games on Quantum Periodic Automata. Computation 2015, 3, 586–599.
  14. Sze, J.F.; Salhi, S.; Wassan, N. A hybridisation of adaptive variable neighbourhood search and large neighbourhood search: Application to the vehicle routing problem. Expert Syst. Appl. 2016, 65, 383–397.
  15. Defryn, C.; Sörensen, K. A fast two-level variable neighborhood search for the clustered vehicle routing problem. Comput. Oper. Res. 2017, 83, 78–94.
  16. Huber, S.; Geiger, M.J. Order matters–A Variable Neighborhood Search for the Swap-Body Vehicle Routing Problem. Eur. J. Oper. Res. 2017, 263, 419–445.
  17. Curtin, K.M.; Voicu, G.; Rice, M.T.; Stefanidis, A. A comparative analysis of traveling salesman solutions from geographic information systems. Trans. GIS 2014, 18, 286–301.
  18. Papalitsas, C.; Karakostas, P.; Giannakis, K.; Sifaleras, A.; Andronikos, T. Initialization methods for the TSP with Time Windows using qGVNS. In Proceedings of the 6th International Symposium on Operational Research, OR in the Digital Era—ICT Challenges, Thessaloniki, Greece, 8–10 June 2017.
  19. Tsiropoulou, E.E.; Baras, J.S.; Papavassiliou, S.; Sinha, S. RFID-based smart parking management system. Cyber Phys. Syst. 2017, 3, 22–41.
  20. Liebig, T.; Piatkowski, N.; Bockermann, C.; Morik, K. Predictive Trip Planning - Smart Routing in Smart Cities. In Proceedings of the Workshops of the EDBT/ICDT 2014 Joint Conference (EDBT/ICDT 2014), Athens, Greece, 28 March 2014; Volume 1133, pp. 331–338.
  21. Van Haute, T.; De Poorter, E.; Crombez, P.; Lemic, F.; Handziski, V.; Wirström, N.; Wolisz, A.; Voigt, T.; Moerman, I. Performance analysis of multiple Indoor Positioning Systems in a healthcare environment. Int. J. Health Geogr. 2016, 15, 7.
  22. Woo, H.; Lee, H.J.; Kim, H.C.; Kang, K.J.; Seo, S.S. Hospital wireless local area network-based tracking system. Healthc. Inform. Res. 2011, 17, 18–23.
  23. Glover, F.; Kochenberger, G.A. Handbook of Metaheuristics; Kluwer Academic Publishers: Norwell, MA, USA, 2003; p. 557.
  24. Goldberg, D. Genetic Algorithms in Search, Optimization, and Machine Learning; Addison-Wesley: Boston, MA, USA, 1989.
  25. Mladenovic, N.; Hansen, P. Variable neighborhood search. Comput. Oper. Res. 1997, 24, 1097–1100.
  26. Hansen, P.; Mladenovic, N.; Todosijevic, R.; Hanafi, S. Variable neighborhood search: Basics and variants. EURO J. Comput. Optim. 2017, 5, 423–454.
  27. Mladenovic, N.; Todosijevic, R.; Uroševic, D. Less is more: Basic variable neighborhood search for minimum differential dispersion problem. Inf. Sci. 2016, 326, 160–171.
  28. Jarboui, B.; Derbel, H.; Hanafi, S.; Mladenovic, N. Variable neighborhood search for location routing. Comput. Oper. Res. 2013, 40, 47–57.
  29. Sifaleras, A.; Konstantaras, I.; Mladenović, N. Variable neighborhood search for the economic lot sizing problem with product returns and recovery. Int. J. Prod. Econ. 2015, 160, 133–143.
  30. Sifaleras, A.; Konstantaras, I. General variable neighborhood search for the multi-product dynamic lot sizing problem in closed-loop supply chain. Electron. Notes Discrete Math. 2015, 47, 69–76.
  31. Feynman, R.P. Simulating physics with computers. Int. J. Theor. Phys. 1982, 21, 467–488.
  32. Feynman, R.P.; Hey, J.; Allen, R.W. Feynman Lectures on Computation; CRC Press: Boca Raton, FL, USA, 1998.
  33. Nielsen, M.A.; Chuang, I.L. Quantum Computation and Quantum Information; Cambridge Series on Information and the Natural Sciences; Cambridge University Press: Cambridge, UK, 2004.
  34. Yanofsky, N.S.; Mannucci, M.A. Quantum Computing for Computer Scientists; Cambridge University Press: Cambridge, UK, 2008; Volume 20.
  35. Papalitsas, C.; Karakostas, P.; Kastampolidou, K. A Quantum Inspired GVNS: Some Preliminary Results. In GeNeDis 2016; Vlamos, P., Ed.; Springer International Publishing: Cham, Switzerland, 2017; pp. 281–289.
  36. Franklin, C. An Introduction to Geographic Information Systems: Linking Maps to Databases. Database 1992, 15, 12–21.
  37. Muller, J.C. Guest editorial latest developments in GIS/LIS. Int. J. Geogr. Inf. Sci. 1993, 7, 293–303.
  38. Crossland, M.D.; Wynne, B.E.; Perkins, W.C. Spatial decision support systems: An overview of technology and a test of efficacy. Decis. Support Syst. 1995, 14, 219–235.
  39. Keenan, P.B. Spatial decision support systems for vehicle routing. Decis. Support Syst. 1998, 22, 65–71.
  40. Asimakopoulos, G.; Christodoulou, S.; Gizas, A.; Triantafillou, V.; Tzimas, G.; Gialelis, J.; Voyiatzis, A.; Karadimas, D.; Papalambrou, A. Architecture and Implementation Issues, Towards a Dynamic Waste Collection Management System. In Proceedings of the 24th International Conference on World Wide Web, Florence, Italy, 18–22 May 2015; ACM: New York, NY, USA, 2015; pp. 1383–1388.
  41. Reinelt, G. TSPLIB. 2013. Available online: http://comopt.ifi.uni-heidelberg.de/software/TSPLIB95/ (accessed on 25 September 2017).
  42. Pimentel, F.G.S.L. Double-ended nearest and loneliest neighbour—A nearest neighbour heuristic variation for the travelling salesman problem. Revista de Ciências da Computação 2016, 6, 17–30.
  43. Rios, E.; Ochi, L.S.; Boeres, C.; Coelho, I.M.; Coelho, V.N.; Mladenović, N. A performance study on multi improvement neighborhood search strategy. Electron. Notes Discrete Math. 2017, 58, 199–206.
Figure 1. NN vs. GVNS vs. OV (1/3).
Figure 2. NN vs. GVNS vs. OV (2/3).
Figure 3. NN vs. GVNS vs. OV (3/3).
Figure 4. NN variants vs. GVNS vs. OV (1/3).
Figure 5. NN variants vs. GVNS vs. OV (2/3).
Figure 6. NN variants vs. GVNS vs. OV (3/3).
Table 1. The computational results demonstrate that novel GVNS outperforms NN on all TSP instances.
Benchmark Name | NN | Gap NN vs. OV | GVNS BI | Gap GVNS BI vs. OV | GVNS FI | Gap GVNS FI vs. OV | OV
a280 | 3157 | −0.2241 | 2779 | −0.0775 | 2766 | −0.0725 | 2579
att48 | 12,861 | −0.2101 | 10,645 | −0.0016 | 10,654 | −0.0024 | 10,628
bayg29 | 2005 | −0.2453 | 1610 | 0.0000 | 1610 | 0.0000 | 1610
bays29 | 2258 | −0.1178 | 2020 | 0.0000 | 2020 | 0.0000 | 2020
bier127 | 135,737 | −0.1475 | 121,393 | −0.0263 | 121,551 | −0.0276 | 118,282
kroA100 | 27,807 | −0.3065 | 21,664 | −0.0179 | 21,774 | −0.0231 | 21,282
burma14 | 4501 | −0.3544 | 3454 | −0.0394 | 3454 | −0.0394 | 3323
ch130 | 7579 | −0.2404 | 6342 | −0.0380 | 6373 | −0.0430 | 6110
ch150 | 8191 | −0.2547 | 6849 | −0.0492 | 6871 | −0.0525 | 6528
d493 | 41,666 | −0.1903 | 37,715 | −0.0775 | 37,882 | −0.0823 | 35,002
kroB100 | 29,158 | −0.3169 | 22,514 | −0.0168 | 22,786 | −0.0291 | 22,141
kroC100 | 26,227 | −0.2640 | 21,148 | −0.0192 | 21,245 | −0.0239 | 20,749
kroD100 | 26,947 | −0.2654 | 21,768 | −0.0223 | 21,916 | −0.0292 | 21,294
kroE100 | 27,460 | −0.2443 | 22,512 | −0.0201 | 22,709 | −0.0290 | 22,068
kroA150 | 33,633 | −0.2680 | 27,641 | −0.0421 | 27,794 | −0.0479 | 26,524
kroB150 | 34,499 | −0.3202 | 27,032 | −0.0345 | 27,274 | −0.0438 | 26,130
kroA200 | 35,859 | −0.2210 | 30,900 | −0.0522 | 31,179 | −0.0617 | 29,368
kroB200 | 36,980 | −0.2562 | 31,119 | −0.0571 | 31,387 | −0.0662 | 29,437
d198 | 18,240 | −0.1558 | 16,196 | −0.0264 | 16,260 | −0.0304 | 15,780
brg180 | 69,550 | −34.6666 | 2026 | −0.0390 | 2038 | −0.0451 | 1950
berlin52 | 8980 | −0.1906 | 7547 | −0.0007 | 7590 | −0.0064 | 7542
dantzig42 | 956 | −0.3676 | 701 | −0.0029 | 701 | −0.0029 | 699
eil101 | 803 | −0.2766 | 647 | −0.0286 | 649 | −0.0318 | 629
eil51 | 511 | −0.1939 | 428 | 0.0000 | 429 | −0.0023 | 428
eil76 | 642 | −0.1933 | 548 | −0.0186 | 549 | −0.0204 | 538
fri26 | 1112 | −0.1867 | 937 | 0.0000 | 937 | 0.0000 | 937
gil262 | 3208 | −0.3490 | 2571 | −0.0812 | 2558 | −0.0757 | 2378
gr17 | 2187 | −0.0489 | 2085 | 0.0000 | 2085 | 0.0000 | 2085
gr21 | 3333 | −0.2312 | 2707 | 0.0000 | 2707 | 0.0000 | 2707
gr24 | 1553 | −0.2209 | 1272 | 0.0000 | 1272 | 0.0000 | 1272
gr48 | 6098 | −0.2084 | 5048 | −0.0004 | 5054 | −0.0016 | 5046
gr96 | 75,065 | −0.3596 | 56,084 | −0.0158 | 56,133 | −0.0167 | 55,209
gr120 | 9351 | −0.3470 | 7197 | −0.0367 | 7199 | −0.0370 | 6942
gr137 | 98,720 | −0.4132 | 72,381 | −0.0362 | 72,536 | −0.0384 | 69,853
gr202 | 47,080 | −0.1723 | 42,419 | −0.0563 | 42,287 | −0.0530 | 40,160
gr229 | 169,715 | −0.2608 | 141,387 | −0.0504 | 142,175 | −0.0563 | 134,602
gr431 | 221,402 | −0.2916 | 184,140 | −0.0742 | 184,993 | −0.0792 | 171,414
hk48 | 13,181 | −0.1500 | 11,498 | −0.0032 | 11,531 | −0.0061 | 11,461
lin105 | 20,356 | −0.4156 | 14,613 | −0.0163 | 14,607 | −0.0159 | 14,379
lin318 | 54,019 | −0.2852 | 45,018 | −0.0711 | 45,179 | −0.0749 | 42,029
pcb442 | 61,979 | −0.2205 | 55,416 | −0.0913 | 55,804 | −0.0990 | 50,778
pr76 | 153,462 | −0.4188 | 109,103 | −0.0087 | 109,421 | −0.0117 | 108,159
pr107 | 46,680 | −0.0536 | 45,183 | −0.0199 | 45,464 | −0.0262 | 44,303
pr124 | 69,297 | −0.1739 | 59,705 | −0.0114 | 59,742 | −0.0121 | 59,030
pr136 | 120,769 | −0.2479 | 99,622 | −0.0295 | 100,718 | −0.0408 | 96,772
pr144 | 61,652 | −0.0532 | 58,776 | −0.0041 | 58,991 | −0.0078 | 58,537
pr152 | 85,699 | −0.1630 | 74,703 | −0.0139 | 74,943 | −0.0171 | 73,682
pr226 | 94,683 | −0.1781 | 81,379 | −0.0126 | 81,781 | −0.0176 | 80,369
Table 2. The experimental results show that GVNS outperforms the improved and the repeated NN on TSP instances (1/2).
Benchmark Name | Improved NN | Repeated NN | GVNS BI | GVNS FI | OV
a280 | 3171 | 3008 | 2779 | 2766 | 2579
att48 | 13,447 | 12,012 | 10,645 | 10,654 | 10,628
bayg29 | 1938 | 1935 | 1610 | 1610 | 1610
bays29 | 2307 | 2134 | 2020 | 2020 | 2020
bier127 | 148,330 | 133,953 | 121,393 | 121,551 | 118,282
kroA100 | 28,244 | 24,698 | 21,664 | 21,774 | 21,282
burma14 | 4470 | 3822 | 3454 | 3454 | 3323
ch130 | 7342 | 7129 | 6342 | 6373 | 6110
ch150 | 7699 | 7113 | 6849 | 6871 | 6528
d493 | 41,858 | 40,189 | 37,715 | 37,882 | 35,002
kroB100 | 28,525 | 25,884 | 22,514 | 22,786 | 22,141
kroC100 | 25,511 | 23,660 | 21,148 | 21,245 | 20,749
kroD100 | 29,202 | 24,852 | 21,768 | 21,916 | 21,294
kroE100 | 28,125 | 24,782 | 22,512 | 22,709 | 22,068
kroA150 | 32,019 | 31,479 | 27,641 | 27,794 | 26,524
kroB150 | 37,113 | 31,611 | 27,032 | 27,274 | 26,130
kroA200 | 36,825 | 34,543 | 30,900 | 31,179 | 29,368
kroB200 | 38,844 | 35,389 | 31,119 | 31,387 | 29,437
d198 | 18,485 | 17,620 | 16,196 | 16,260 | 15,780
brg180 | 98,460 | 59,550 | 2026 | 2038 | 1950
berlin52 | 9156 | 8181 | 7547 | 7590 | 7542
dantzig42 | 957 | 864 | 701 | 701 | 699
eil101 | 823 | 746 | 647 | 649 | 629
eil51 | 555 | 482 | 428 | 429 | 428
Table 3. The experimental results show that GVNS outperforms the improved and the repeated NN on TSP instances (2/2).
Benchmark Name | Improved NN | Repeated NN | GVNS BI | GVNS FI | OV
eil76 | 614 | 608 | 548 | 549 | 538
fri26 | 982 | 965 | 937 | 937 | 937
gil262 | 3027 | 2823 | 2571 | 2558 | 2378
gr17 | 2187 | 2178 | 2085 | 2085 | 2085
gr21 | 3265 | 3003 | 2707 | 2707 | 2707
gr24 | 1769 | 1553 | 1272 | 1272 | 1272
gr48 | 6287 | 5840 | 5048 | 5054 | 5046
gr96 | 70,060 | 65,416 | 56,084 | 56,133 | 55,209
gr120 | 8791 | 8438 | 7197 | 7199 | 6942
gr137 | 93,749 | 84,376 | 72,381 | 72,536 | 69,853
gr202 | 47,460 | 45,030 | 42,419 | 42,287 | 40,160
gr229 | 167,062 | 157,745 | 141,387 | 142,175 | 134,602
gr431 | 218,383 | 197,405 | 184,140 | 184,993 | 171,414
hk48 | 13,052 | 12,137 | 11,498 | 11,531 | 11,461
lin105 | 20,359 | 16,935 | 14,613 | 14,607 | 14,379
lin318 | 54,031 | 49,201 | 45,018 | 45,179 | 42,029
pcb442 | 64,623 | 58,950 | 55,416 | 55,804 | 50,778
pr76 | 154,708 | 130,921 | 109,103 | 109,421 | 108,159
pr107 | 46,765 | 46,680 | 45,183 | 45,464 | 44,303
pr124 | 69,154 | 67,055 | 59,705 | 59,742 | 59,030
pr136 | 127,551 | 114,553 | 99,622 | 100,718 | 96,772
pr144 | 61,904 | 60,964 | 58,776 | 58,991 | 58,537
pr152 | 85,349 | 79,564 | 74,703 | 74,943 | 73,682
pr226 | 95,573 | 92,552 | 81,379 | 81,781 | 80,369
Table 4. Data based on the novel GVNS and the conventional GVNS [35].
Problem | Av. Value (nGVNS) | Best (GVNS)
bayg29 | 1610 | 1653
bays29 | 2020 | 2069
fri26 | 937 | 969
gr17 | 2085 | 2085
gr24 | 1272 | 1278
ulysses16 | 6859 | 6859
ulysses22 | 7013 | 7013
gr48 | 5049.4 | 5325
hk48 | 11,508.6 | 11,884
br17 | 39 | 39
ftv33 | 1357.8 | 1489
ftv35 | 1544.6 | 1791
ftv38 | 1616.6 | 1778
ftv44 | 1763.8 | 2014
p43 | 5554 | 5629
ry48p | 14,698.2 | 15,134
Table 5. GVNS on asymmetric TSPs.
Benchmark | OV | GVNS FI | GVNS BI
br17 | 39 | 39 | 39
ft53 | 6905 | 7328 | 7135
ft70 | 38,673 | 40,691 | 40,206
ftv33 | 1286 | 1339 | 1286
ftv35 | 1473 | 1499 | 1473
ftv38 | 1530 | 1585 | 1541
ftv44 | 1613 | 1760 | 1644
ftv47 | 1778 | 1992 | 1816
ftv55 | 1608 | 1985 | 1665
ftv64 | 1839 | 2382 | 1986
ftv70 | 1950 | 2557 | 2157
ftv170 | 2755 | 3923 | 3852
kro124p | 36,230 | 43,187 | 37,076
p43 | 5620 | 5623 | 5620
rbg323 | 1326 | 2755 | 2755
rbg358 | 1163 | 2755 | 2755
rbg403 | 2465 | 2755 | 2755
rbg443 | 2720 | 2821 | 2806
ry48p | 14,422 | 14,699 | 14,439
Table 6. GVNS on National TSPs.
Benchmark | OV | GVNS FI | GVNS BI
ar9152 | 837,479 | 1,648,596 | 1,648,596
gr9882 | 300,899 | 388,944 | 388,944
eg7146 | 172,387 | 220,315 | 220,315
fi10639 | 520,527 | 649,604 | 649,604
ho14473 | 177,105 | 484,812 | 484,812
ei8246 | 206,171 | 258,889 | 258,889
ja9847 | 491,924 | 612,304 | 612,304
kz9976 | 1,061,882 | 1,358,247 | 1,358,247
lu980 | 11,340 | 23,688 | 12,388
mo14185 | 427,377 | 529,729 | 529,729
nu3496 | 96,132 | 221,920 | 108,639
mu1979 | 86,891 | 120,908 | 95,413
qa194 | 9352 | 9727 | 9717
rw1621 | 26,051 | 58,148 | 28,866
tz6117 | 394,718 | 501,184 | 483,082
uy734 | 79,114 | 86,022 | 86,201
wi29 | 27,603 | 27,603 | 27,603
ym7663 | 238,314 | 308,747 | 308,747
zi929 | 95,345 | 112,775 | 103,557
ca4663 | 1,290,319 | 1,646,889 | 1,463,565
it16862 | 557,315 | 706,069 | 706,069
