Article

A Novel Pareto-Optimal Algorithm for Flow Shop Scheduling Problem

1 Department of Industrial Engineering, Vali-e-Asr University of Rafsanjan, Rafsanjan 7718897111, Iran
2 Department of Astronautics, Electrical and Energetic Engineering (DIAEE), Sapienza University, 00184 Rome, Italy
3 Department of Energy Management and Optimization, Institute of Science and High Technology and Environmental Sciences, Graduate University of Advanced Technology, Kerman 7631885356, Iran
4 Department of Electrical and Computer Engineering, University of Louisiana at Lafayette, Lafayette, LA 70504, USA
5 Department of Industrial Engineering, Science and Research Branch, Islamic Azad University, Tehran 1477893855, Iran
* Author to whom correspondence should be addressed.
Mathematics 2024, 12(18), 2951; https://doi.org/10.3390/math12182951
Submission received: 21 July 2024 / Revised: 15 September 2024 / Accepted: 18 September 2024 / Published: 23 September 2024

Abstract:
Minimizing job waiting time for completing related operations is a critical objective in industries such as chemical and food production, where efficient planning and production scheduling are paramount. Addressing the complex nature of flow shop scheduling problems, which pose significant challenges in the manufacturing process due to the vast solution space, this research employs a novel multiobjective genetic algorithm called distance from ideal point in genetic algorithm (DIPGA) to identify Pareto-optimal solutions. The effectiveness of the proposed algorithm is benchmarked against other powerful methods, namely, NSGA, MOGA, NSGA-II, WBGA, PAES, GWO, PSO, and ACO, using analysis of variance (ANOVA). The results demonstrate that the new approach significantly improves decision-making by evaluating a broader range of solutions, offering faster convergence and higher efficiency for large-scale scheduling problems with numerous jobs. This innovative method provides a comprehensive listing of Pareto-optimal solutions for minimizing makespan and total waiting time, showcasing its superiority in addressing highly complex problems.

1. Introduction

The flow shop, a well-recognized scheduling challenge, accounts for a significant portion of production systems and has garnered considerable interest due to its complexity and practical importance. It has been extensively researched across various industries such as electronics, automotive, paper, textiles, and tile manufacturing [1,2]. In practical manufacturing settings, transitioning between jobs often involves setup time, a crucial element distinct from processing time. Setup time, or configuration time, encompasses activities like cleaning, adjusting machine parameters, or changing tools. If not effectively managed, setup time can consume more than 20% of available machine capacity [3]. The duration of setup time for each job on a machine depends on the preceding job processed on that machine. To enhance production efficiency, jobs can be categorized into groups (or families) that share identical machine setup requirements. This strategy is referred to as flow shop sequence-dependent group scheduling (FSDGS) [4].
Early studies on flow shop scheduling (FSS) problems were mostly based on Johnson's theorem, which provides an approach for finding the optimal solution for two or three machines with given specifications [5,6]. The earliest meta-heuristics for the permutation flow shop scheduling problem (PFSP) were simulated annealing algorithms [7,8]. Ref. [9] addressed a two-machine no-wait FSS problem, minimizing the makespan under limited machine availability. Ref. [10] proposed a heuristic algorithm for the no-wait FSS problem with total flow time as the key criterion. Ref. [11] proposed diverse meta-heuristics such as simulated annealing, neighborhood search, and tabu search for continuous FSS problems; they also studied the trade-offs between solution quality and running time, along with the effort and knowledge required to calibrate and implement the proposed algorithms. Ref. [12] developed a new heuristic for minimizing the makespan in an FSS problem with multiple processors and no intermediate storage. Ref. [13] introduced a complexity classification of several types of two-machine permutation FSS problems minimizing the makespan, where jobs were processed with no wait in process. A no-wait flow shop paradox was introduced in [14]: the optimal makespan can degrade when the speed of some machines is increased. Using the characteristics of parallel FSS, Ref. [15] examined the effect of the scheduling scheme on the operators' workload imbalance, measured by the standard deviation of work as an indicator of workload balance among employees; they also established a multiobjective scheduling optimization model for parallel flow shops, solved with an improved NSGA-II algorithm. Ref. [16] investigated standard no-wait and no-idle FSS problems with deteriorating jobs. Ref.
[17] studied a task-scheduling problem in a no-wait flow shop with two batching machines to minimize the makespan. Ref. [18] offered a memetic algorithm (MA) utilizing differential evolution (DE) to tackle multiobjective no-wait flow shop scheduling problems (MNFSSPs). Ref. [19] studied an FSS problem with machine skipping and transportation times and proposed new robust multiobjective electromagnetism (MOEM) algorithms. Ref. [20] proposed a multiobjective genetic algorithm (GA) for solving an FSS problem that used the weighted sum of several objectives, including mean flow time, makespan, and machine idle time. Ref. [21] investigated a multiobjective particle swarm algorithm for bi-criteria FSS, aiming to minimize weighted mean tardiness and weighted mean completion time. Ref. [22] suggested three meta-heuristic algorithms: the iterated greedy algorithm (IG), the artificial immune system (AIS), and a hybrid method (AIS-IG); the objective was to minimize the maximum completion time (makespan) for FSS problems with small buffers between consecutive machines. Ref. [23] suggested a solution for the flexible FSS problem with uncertain processing times in the aeronautical composite lay-up workshop. Ref. [21] also suggested a multiobjective immune algorithm for scheduling a flow shop problem, which was compared to a standard multiobjective genetic algorithm. This study differs from previous work in several respects: it models the problem as a multiobjective combinatorial optimization and introduces a novel method to solve it. Over the past few years, there has been a surge of interest in genetic algorithms (GAs), which are widely used to tackle single- and multiobjective problems in operations and production management, such as NP-hard and combinatorial problems [24,25,26,27,28]. A hybrid multiobjective evolutionary algorithm was examined by [29].
The algorithm was based on differential evolution (HMOEA/DE) and was designed to tackle flow shop scheduling problems (FSPs), minimizing the makespan and tardiness simultaneously. Ref. [30] examined a sub-population-based hybrid monkey search algorithm for FSS and showed that the problem is a non-deterministic polynomial-time hard (NP-hard) combinatorial optimization problem. Ref. [31] proposed an improved NSGA-II in which multiobjective optimization is guided not only by crowding-distance solutions but also by the preferences of material designers.
The contributions and novelties of this paper are as follows:
This paper introduces a novel multiobjective genetic algorithm, named distance from ideal point in genetic algorithm (DIPGA), specifically designed to tackle the complexities of flow shop scheduling (FSS) problems.
The DIPGA algorithm is tailored to address two critical objectives in scheduling: minimizing makespan and total waiting time, which are particularly important in industries such as chemical and food production where efficiency is paramount.
The effectiveness of DIPGA is rigorously benchmarked against several established algorithms, namely, NSGA, MOGA, NSGA-II, WBGA, PAES, GWO, PSO, and ACO. This comparative analysis is conducted using analysis of variance (ANOVA) to ensure a robust performance evaluation.
The remainder of this paper is organized as follows. The FSS problem is introduced and formulated in Section 2. The proposed procedure and distance from the ideal point in the genetic algorithm (DIPGA) are illustrated in Section 3. A few examples are discussed in Section 4 to show the introduced method. Further, the results are compared in Section 5. Finally, the conclusions are given in Section 6.

2. The FSS Problem

The FSS problem entails scheduling n jobs on m machines with given processing times. The key assumption of the problem is that all n jobs visit the m machines in the same order. Each job goes through each machine exactly once. A job is processed on only one machine at a time, and each machine processes only one job at a time, without interruption. The objective is to devise a schedule that minimizes both the makespan and the total waiting time. One of our aims in this paper is minimizing job waiting time before each operation, since some industries, such as the chemical and food industries, must schedule production given the risk of deterioration or decay of products awaiting processing. Therefore, a multiobjective mathematical model is proposed for the FSS problem. The scheduling problem is modeled based on the assumptions below:
(i)
The processing times of each machine are given and fixed, which can be zero if the machine performs no processing.
(ii)
The setup times are included in the processing time and are not dependent on the position of the job in the sequence.
(iii)
A job can be processed only on one machine at any given time, and each machine processes only one job at a time.
(iv)
It is not possible to preempt the job operation on the machines.

2.1. Mathematical Modeling

We formulate the multiobjective FSS problem using the following notation:
n is the total number of jobs to be scheduled,
m is the total number of machines in the process,
$t_{i,j}$ is the processing time of job i on machine j (i = 1, 2, …, n; j = 1, 2, …, m),
$P_i$ is the job sequenced in the ith position of a schedule,
$C_{P_i,j}$ is the completion time of job $P_i$ on machine j,
$C_{P_i,m}$ is the completion time of job $P_i$ on the mth machine,
$D_{P_i,j}$ is the waiting time of job $P_i$ before machine j (i = 1, 2, …, n; j = 2, …, m),
$I_{P_i,j}$ is the idle time of machine j before processing job $P_i$.
The completion times of an n-job, m-machine flow shop problem are calculated as follows (with the convention $C_{P_0,j} = C_{P_i,0} = 0$):

$C_{P_1,1} = t_{P_1,1}$

$C_{P_1,j} = C_{P_1,j-1} + t_{P_1,j}, \quad j = 2, \dots, m$

$C_{P_i,j} = \max\{C_{P_{i-1},j},\ C_{P_i,j-1}\} + t_{P_i,j}, \quad i = 1, \dots, n;\ j = 1, \dots, m$

$\mathrm{makespan} = C_{P_n,m}$

$D_{P_i,j} = \max\{0,\ C_{P_{i-1},j} - C_{P_i,j-1}\}$

$\mathrm{Total\ waiting\ time\ (TWT)} = \sum_{i=1}^{n} \sum_{j=2}^{m} D_{P_i,j}$

$I_{P_i,j} = \max\{0,\ C_{P_i,j-1} - C_{P_{i-1},j}\}$

$\mathrm{Total\ idle\ time\ (TIT)} = \sum_{i=1}^{n} \sum_{j=2}^{m} I_{P_i,j}$

2.2. Objective Function

The two objective functions are defined as follows.

The first objective function is to minimize the makespan ($C_{P_n,m}$):

$\min F_1 = C_{P_n,m}$

The second objective function is to minimize the total waiting time:

$\min F_2 = \sum_{i=1}^{n} \sum_{j=2}^{m} D_{P_i,j}$

3. Multiobjective Optimization

Many business, engineering, and management applications depend heavily on optimization, where several, often conflicting, objectives must be met. A classical way of handling such problems is to convert all the goals into a single-objective (SO) function. The principal aim is then to find a solution that minimizes or maximizes the SO function while adhering to the physical limitations of the process or system. This solution produces a single value representing a compromise among the objectives. The process of formulating this function is essential to reach the optimal compromise. Various objectives are transformed into an SO function by aggregating them into a weighted function or by converting all but one into constraints.
This method of solving multiobjective (MO) optimization problems has several constraints:
It necessitates prior knowledge of the relative significance of goals, as well as the limitations of objectives translated into constraints.
The aggregated function provides only one solution.
It is challenging to evaluate the trade-offs between objectives.
When the search space is not convex, the solution may not be attainable.
Simple optimization is not acceptable for systems with several conflicting goals. System engineers prefer to have all probable solutions that optimize all objectives simultaneously. Multiobjective (MO) problems are more challenging than single-objective (SO) ones because there is no single solution; instead, there may be a group of acceptable trade-off optimal solutions. This group is referred to as the Pareto front. MO optimization is a crucial step in the multi-criteria decision-making (MCDM) process, which involves identifying all Pareto optimal solutions. The decision-maker (DM) selects the preferred solution from the Pareto set. Creating the Pareto set has several advantages. It allows the DM to make an informed decision by observing multiple options, as it includes solutions that are the best choices from a comprehensive perspective. This approach contrasts with SO optimization, which might neglect the trade-off viewpoint. Assume a standard multiobjective minimization problem with p decision variables and q objectives (q > 1):

$\min\ y = f(x) = (f_1(x),\ f_2(x),\ f_3(x), \dots, f_q(x)), \quad x \in \mathbb{R}^p,\ y \in \mathbb{R}^q$
Definition 1.
Solution a dominates solution b if

$f_i(a) \le f_i(b) \quad \forall\, i \in \{1, 2, \dots, q\}$

and

$f_i(a) < f_i(b) \quad \text{for at least one } i \in \{1, 2, \dots, q\}.$
In multiobjective optimization, solutions that are not dominated by any other solution are known as non-dominated solutions. Non-dominated sorting involves categorizing solutions based on their dominance relationships. Specifically, a member A is considered to dominate member B if
None of the objectives of A are worse than the corresponding objectives of B;
At least one objective of A is strictly better than the corresponding objective of B.
This definition ensures that a non-dominated solution represents a feasible trade-off where it is not possible to improve one objective without degrading at least one other objective. This concept is fundamental in Pareto-based methods for multiobjective optimization, where the aim is to identify solutions that lie on the Pareto front, representing the optimal trade-offs between conflicting objectives.
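The dominance test of Definition 1, and a sort-and-scan extraction of the non-dominated set for two minimized objectives, can be sketched as follows. This is illustrative Python, not the authors' code; the function names are our own:

```python
def dominates(a, b):
    """True if objective vector a dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Extract the non-dominated points of a set of (F1, F2) pairs,
    both minimized, by sorting on F1 and scanning for improving F2."""
    best_f2 = float("inf")
    front = []
    for p in sorted(points):   # lexicographic sort by (F1, F2)
        if p[1] < best_f2:     # strictly better F2 than every point sorted before it
            front.append(p)
            best_f2 = p[1]
    return front
```

For instance, `pareto_front([(3, 5), (2, 7), (4, 4), (2, 6), (5, 3)])` keeps `(2, 6)`, `(3, 5)`, `(4, 4)`, and `(5, 3)` and discards `(2, 7)`, which is dominated by `(2, 6)`.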
Definition 2.
Vector a is regarded as a globally Pareto-optimal solution when there exists no vector b that dominates a. The whole group of Pareto-optimal solutions is known as the Pareto-optimal set. The set's pertinent images in the objective space are recognized as the Pareto-optimal frontier (see Figure 1).
Several methods are available to solve multiobjective optimization problems, each with its strengths and considerations. Popular methods include ε-constraint methods, sequential optimization, goal programming, weighting methods, distance-based methods, goal attainment, and direction-based methods [32,33,34,35]. These methods aim to handle the challenges posed by multiobjective optimization, such as producing the Pareto front: a set of non-dominated solutions representing optimal trade-offs between conflicting objectives. Meta-heuristic methods are particularly noteworthy for their ability to address the limitations of classical approaches. They excel at exploring multiple points along the Pareto front simultaneously, thereby providing several solutions in a single run. They do not require prior knowledge of the relative significance of objectives, making them suitable for ill-posed problems with mixed-type and incommensurable objectives. Moreover, they are versatile in handling various shapes of the Pareto front. However, meta-heuristic methods may experience performance degradation as the number of objectives increases, due to challenges in effectively ranking solutions on the Pareto front. They also typically require tuning parameters such as the number of Pareto samples and sharing factors. A recent innovation in this field is the distance from ideal point in genetic algorithm (DIPGA), which aims to achieve Pareto-optimal solutions based on the distance (mean deviation) from the ideal point in multiobjective problems. This approach highlights ongoing advancements in meta-heuristic techniques tailored for multiobjective optimization challenges.

3.1. Multiobjective GA

Genetic algorithms (GAs) are an effective population-based approach for solving multiobjective optimization problems. Modifying a generic GA designed for single-objective optimization allows it to generate a set of non-dominated solutions in a single run. GAs excel in exploring multiple areas of solution spaces simultaneously, making them capable of finding diverse solutions for complex problems with discontinuous, non-convex, and multi-modal solution spaces. The crossover operator in GAs leverages solution structures across different objectives, facilitating the discovery of new non-dominated solutions in unexplored regions of the Pareto front. Furthermore, most multiobjective GAs do not require user prioritization, scaling, or weighting of objectives, contributing to their popularity as heuristic methods for designing and optimizing multiobjective problems. According to Ref. [36], approximately 90% of approaches for solving multiobjective optimization aim to approximate the true Pareto front, with a significant portion relying on meta-heuristic approaches, 70% of which are evolutionary. The vector evaluation GA (VEGA) was introduced by Ref. [37], and subsequently, numerous multiobjective evolutionary algorithms have been developed, including the niched Pareto genetic algorithm (NPGA) [38], multiobjective genetic algorithm (MOGA) [39], improved strength Pareto evolutionary algorithm (SPEA2) [40], random weighted genetic algorithm (RWGA) [41], weight-based genetic algorithm (WBGA) [42], strength Pareto evolutionary algorithm (SPEA) [43], Pareto-archived evolution strategy (PAES) [44], region-based selection in evolutionary multiobjective optimization (PESA-II) [45], Pareto envelope-based selection algorithm (PESA) [44], multiobjective evolutionary algorithm (MEA) [46], fast non-dominated sorting genetic algorithm (NSGA-II) [47], dynamic multiobjective evolutionary algorithm (DMOEA) [48], micro-GA [49], and rank-density-based genetic algorithm (RDGA) [50]. 
While there are numerous types of multiobjective GAs, the aforementioned algorithms are widely recognized and have been extensively studied across different applications through comparative studies. Several survey studies [39,51,52,53] on evolutionary multiobjective optimization have been published. Ref. [54] proposed a novel ε-dominance MOEA (EDMOEA) that employs steady-state replacement and pair-comparison selection instead of Pareto ranking. It is an elitist algorithm designed to maintain population diversity based on the ε-dominance relation. The results demonstrate that EDMOEA achieves superior performance compared to other algorithms such as ε-MOEA, IBEA, SPEA2, NSGA-II, PESA, and PESA-II on test problems. Ref. [55] investigated a discrete optimization problem aiming to generate an optimal set of solutions from a larger set of Pareto optimal solutions—an NP-hard problem. To tackle this, they provided five heuristics and two explicit algorithms, comparing their efficiency using five test problems. Generally, multiobjective GAs differ in terms of elitism, fitness assignment processes, and diversification approaches.

3.2. Proposed Multiobjective Optimization Approach (DIPGA)

In this section, we define the DIPGA proposed in this study to address the flow shop scheduling (FSS) problem. The overall structure of the proposed algorithm is illustrated in Figure 2. Each job must pass through every machine exactly once. Given n jobs, there are n! possible solutions, leading to an extensive search space. Consequently, it is essential to develop an efficient and novel algorithm. A chromosome, by definition, is a group of integer values (genes) that represent a series of jobs. Each gene indicates a job, and a set of n jobs is represented by a chromosome. The main steps of DIPGA are as follows:
  • Step 1. Randomly generate the initial population i times. The population size of each generation is N.
  • Step 2. Choose the non-dominated Pareto solutions for each initial population by following steps 2.1 to 2.4 (each Pareto solution consists of k individual non-dominated solutions). See Figure 2.
    • Step 2.1. Sort the solutions based on one of the objective functions (F1 or F2).
    • Step 2.2. Rank the sorted solutions.
    • Step 2.3. The first-ranked solution is the first individual solution in the non-dominated Pareto set.
    • Step 2.4. A solution is selected as an individual solution in the non-dominated Pareto set when at least one of its objective function values is better than that of the previously ranked solution. The genetic algorithm begins by generating N non-dominated solutions. The starting population is called generation 1.
  • Step 3. Determine the ideal point $(A_x, A_y)$ in each generation, where $A_x$ and $A_y$ are the best values of the two objectives obtained by the algorithm in that generation (extreme point).
  • Step 4. Calculate the fitness as the mean distance between the ideal point and the k individual solutions $(U_{ik}, V_{ik})$ in non-dominated front i (red lines in Figure 3):
    $d_{ik} = \sqrt{(A_x - U_{ik})^2 + (A_y - V_{ik})^2}$

    $fit_i = \frac{\sum_{k=1}^{K} d_{ik}}{K}$
    Figure 3. Distance between the ideal point and individual solutions in the non-dominated front (red lines).
  • Step 5. To select the parents for generating the next generation g in step 1, normalized fitness is calculated based on Equations (16) and (17), in which $m_g$ signifies the mean fitness of the Pareto solutions in generation g, $z_i^g$ is the normalized fitness of Pareto solution i, and $\sigma_g$ represents the standard deviation of fitness in generation g. Since the objective functions are minimized, the solution with the best fitness is selected as the elite solution and added to a list titled the blacklist. To prevent the algorithm from getting stuck in a local optimum, lower-ranking solutions must also have a chance to be selected; therefore, crossover and mutation operators are applied. The rates of these operators are adjusted to account for the presence of blacklisted chromosomes. For blacklisted chromosomes, the crossover operator is applied, and the resulting offspring are added to the elite list. Similarly, the mutation operator is used on blacklisted chromosomes, and the newly generated Pareto solutions are also added to the elite list. This approach ensures diversity and helps avoid premature convergence to local optima.
    $\sigma_g = \left( \frac{\sum_{i=1}^{N} \left( fit(i) - m_g \right)^2}{N - 1} \right)^{1/2}$

    $z_i^g = \frac{fit(i) - m_g}{\sigma_g}$
  • Step 6. In this step, offspring of the parents (elite list) are produced to enter the next generation. When there are fewer than 50 jobs, a one-point crossover is employed in this problem. In addition, given the number of jobs, a two-point mutation is utilized. For example, with nine jobs, two randomly selected chromosomes containing possible genes are as follows:
Parent 1: 2 1 5 3 6 4 7 9 8
Parent 2: 4 3 2 5 4 1 3 2 5
In this case, due to the small number of jobs, one-point crossover and two-point mutation are employed. Using these operators, the offspring of the parents are generated as follows:
Random point and one-cut-point crossover (e1 = 4):
Offspring 1: 4 3 2 5 6 1 7 9 8
Offspring 2: 2 1 5 3 6 4 7 9 8
Random points and two-point mutation (e1 = 6, e2 = 9):
Offspring 1: 2 1 5 3 6 8 7 9 4
It is notable that the mutation rate $F_m(g)$ declines over generations, as defined by Equation (18), so that it reaches zero in the final generation:

$F_m(g) = 1 - \frac{g}{G}$
  • Step 7. Repeat steps 2 to 4 until $\sigma_g = 0$ (see Figure 4).
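The variation operators of Step 6 and the decaying mutation rate of Equation (18) can be sketched as follows. This is illustrative Python, not the authors' implementation; in particular, the crossover shown is plain one-cut-point recombination, so in a real GA a repair step would still be needed to restore a valid job permutation:

```python
def one_point_crossover(p1, p2, cut):
    """One-cut-point crossover: head of p2 followed by the tail of p1.
    (A repair step to remove duplicate jobs is omitted in this sketch.)"""
    return p2[:cut] + p1[cut:]

def two_point_mutation(chrom, e1, e2):
    """Two-point mutation: swap the genes at 1-indexed positions e1 and e2."""
    c = list(chrom)
    c[e1 - 1], c[e2 - 1] = c[e2 - 1], c[e1 - 1]
    return c

def mutation_rate(g, G):
    """Linearly decaying mutation rate F_m(g) = 1 - g/G,
    reaching zero in the final generation g = G."""
    return 1 - g / G

parent1 = [2, 1, 5, 3, 6, 4, 7, 9, 8]
offspring = two_point_mutation(parent1, 6, 9)  # -> [2, 1, 5, 3, 6, 8, 7, 9, 4]
```

Applied to Parent 1 with e1 = 6 and e2 = 9, the mutation reproduces the offspring 2 1 5 3 6 8 7 9 4 shown in the worked example above.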

4. Illustrative Examples

As an illustration, this section presents a two-machine FSS problem with ten jobs; Table 1 lists the data for this example. With ten jobs, there are 10! = 3,628,800 possible solutions. The model was solved to find the best solution.
Because there are many jobs in this example, a combination of one-point crossover, blacklist crossover rates, and two-point mutation was used, along with pre-specified weights and a blacklist mutation rate. The DIPGA parameters were adopted according to De Jong’s parameter settings [56]. G = 100 (number of generations), N = 70 (population size), blacklist crossover rate = 0.85, one-point crossover rate = 0.6, blacklist mutation rate = 0.8, and mutation rate = 0.2. Table 2 presents the job sequence. The best solution in Figure 5 reveals that point [1,57] is the ideal point.
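For reference, the parameter settings quoted above can be collected into a single configuration object (a sketch; the key names are our own, not from the paper):

```python
# DIPGA parameter settings used in the illustrative example (after De Jong [56])
DIPGA_PARAMS = {
    "generations": 100,            # G
    "population_size": 70,         # N
    "blacklist_crossover_rate": 0.85,
    "one_point_crossover_rate": 0.6,
    "blacklist_mutation_rate": 0.8,
    "mutation_rate": 0.2,          # initial rate; decays to zero over generations
}
```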

5. Experimental Evaluations

A comprehensive experimental examination was conducted to compare the performance of DIPGA with several established methods: weight-based genetic algorithm (WBGA) [42], non-dominated sorted genetic algorithm (NSGA) [58], multiobjective genetic algorithm (MOGA) [39], Pareto-archived evolution strategy (PAES) [44], non-dominated sorting genetic algorithm (NSGA-II) [47], grey wolf optimizer (GWO) [59], particle swarm optimization (PSO) [60], and ant colony optimization (ACO) [61]. The study utilized 21 test problems sourced from the OR library [52], varying the number of jobs from 20 to 75 and the number of machines from 5 to 20. The experimental implementation of the proposed methods was conducted using Python (https://www.python.org/, accessed on 1 July 2024), a widely adopted platform for scientific research and data analysis. Each problem instance was solved three times to ensure the robustness of the results. In this evaluation, the mean deviation from the ideal point (MDI) was employed as a standard performance measure, calculated using
$MDI = \frac{\sum_{i=1}^{k} \sqrt{\left( F1_i - F1^*_i \right)^2 + \left( F2_i - F2^*_i \right)^2}}{k}$
where F1* and F2* are the optimum solutions achieved by each algorithm for the instance; F1 and F2 denote the makespan and total waiting time, respectively; k represents the point count for each Pareto solution. Statistical analysis using ANOVA was conducted to analyze the findings, and the least significant difference (LSD) interval and mean plots are depicted in Figure 6 and Table 3. Monitoring the reduction in Hamming convergence during the GA’s operation provided insights into the approximate time to convergence. The decrease in variance of fitness functions across generations indicated increasing similarity among chromosomes, thereby demonstrating progress towards optimal conditions. Figure 7 illustrates the convergence curves comparing DIPGA with other algorithms.
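Under these definitions, the MDI measure can be computed as follows (a minimal Python sketch; the ideal point $(F_1^*, F_2^*)$ is passed explicitly, and the function name is our own):

```python
from math import hypot

def mdi(front, ideal):
    """Mean deviation from the ideal point (MDI): the average Euclidean
    distance between each point (F1, F2) of a Pareto front and the
    ideal point (F1*, F2*)."""
    k = len(front)
    return sum(hypot(f1 - ideal[0], f2 - ideal[1]) for f1, f2 in front) / k
```

For example, a front containing the points (3, 4) and (0, 0) measured against the ideal point (0, 0) has distances 5 and 0, giving an MDI of 2.5.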
The experimental results demonstrate that DIPGA consistently outperforms NSGA, MOGA, NSGA-II, WBGA, PAES, GWO, PSO, and ACO statistically in this problem domain. This superiority is attributed to DIPGA’s ability to effectively explore and exploit the solution space, producing solutions closer to the ideal Pareto front. Moreover, the comprehensive nature of the experimental setup, including multiple runs and rigorous statistical analysis, ensures the reliability and validity of the comparisons drawn among these algorithms.
This section presents the results in a question-and-answer format for discussion.
  • What were the main algorithms compared against DIPGA in the experimental evaluations?
    The main algorithms compared against DIPGA were the weight-based genetic algorithm (WBGA), non-dominated sorted genetic algorithm (NSGA), multiobjective genetic algorithm (MOGA), Pareto-archived evolution strategy (PAES), non-dominated sorting genetic algorithm (NSGA-II), grey wolf optimizer (GWO), particle swarm optimization (PSO), and ant colony optimization (ACO).
  • How was the performance of the algorithms measured in this study?
    The performance of the algorithms was measured using the mean deviation from the ideal point (MDI), which calculates the distance of the achieved solutions from the optimum solutions for each instance.
  • What was the significance of the statistical analysis performed in this study?
    The statistical analysis using ANOVA was significant as it provided a rigorous method for comparing the performance of the algorithms, ensuring the reliability and validity of the results obtained from the comparisons.
  • What was the main finding regarding the performance of DIPGA compared to the other algorithms?
    The main finding was that DIPGA consistently outperforms the other algorithms statistically, demonstrating a better ability to explore and exploit the solution space, resulting in solutions closer to the ideal Pareto front.

6. Conclusions

This paper proposed a novel multiobjective genetic algorithm called distance from ideal point in genetic algorithm (DIPGA) to find optimal solutions for flow shop scheduling, considering the makespan and total waiting time. The algorithm's running speed, elitism, and rapid convergence make it a good choice for large scheduling problems with numerous jobs. Furthermore, we utilized the mean deviation from the ideal point (MDI) to compare the performance of DIPGA, NSGA, WBGA, MOGA, NSGA-II, PAES, GWO, PSO, and ACO through ANOVA. Given the uncertainty in processing times, the model could be extended to more realistic settings.
Future work on DIPGA should focus on increasing the number of machines, as this implementation emphasizes instances with many jobs. This may be achieved by tuning the parameters of DIPGA through experimentation, which may also reduce its runtime.

Author Contributions

Conceptualization, N.S.-P., A.H. and H.A.; methodology, N.S.-P. and H.A.; investigation, N.S.-P. and H.A.; visualization, N.S.-P. and H.A.; writing—original draft preparation, N.S.-P. and H.A.; writing—review and editing, N.S.-P., A.H., H.A. and A.F.; supervision, N.S.-P. and A.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The original contributions presented in this study are included in the article; further inquiries can be directed to the corresponding authors.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Andrés, C.; Albarracín, J.M.; Tormo, G.; Vicens, E.; García-Sabater, J.P. Group technology in a hybrid flowshop environment: A case study. Eur. J. Oper. Res. 2005, 167, 272–281.
  2. Salmasi, N.; Logendran, R.; Skandari, M.R. Total flow time minimization in a flowshop sequence-dependent group scheduling problem. Comput. Oper. Res. 2010, 37, 199–212.
  3. Pinedo, M.L. Scheduling; Springer International Publishing: Cham, Switzerland, 2016.
  4. Sekkal, D.N.; Belkaid, F. A multi-objective optimization algorithm for flow shop group scheduling problem with sequence dependent setup time and worker learning. Expert Syst. Appl. 2023, 233, 120878.
  5. Kamburowski, J. The nature of simplicity of Johnson’s algorithm. Omega 1997, 25, 581–584.
  6. Johnson, S.M. Optimal two- and three-stage production schedules with setup times included. Nav. Res. Logist. Q. 1954, 1, 61–68.
  7. Osman, I.; Potts, C. Simulated annealing for permutation flow-shop scheduling. Omega 1989, 17, 551–557.
  8. Ogbu, F.A.; Smith, D.K. The application of the simulated annealing algorithm to the solution of the n/m/Cmax flowshop problem. Comput. Oper. Res. 1990, 17, 243–253.
  9. Espinouse, M.L.; Formanowicz, P.; Penz, B. Minimizing the makespan in the two-machine no-wait flow-shop with limited machine availability. Comput. Ind. Eng. 1999, 37, 497–500.
  10. Bertolissi, E. Heuristic algorithm for scheduling in the no-wait flow-shop. J. Mater. Process. Technol. 2000, 107, 459–465.
  11. Fink, A.; Voß, S. Solving the continuous flow-shop scheduling problem by metaheuristics. Eur. J. Oper. Res. 2003, 151, 400–414.
  12. Thornton, H.W.; Hunsucker, J.L. A new heuristic for minimal makespan in flow shops with multiple processors and no intermediate storage. Eur. J. Oper. Res. 2004, 152, 96–114.
  13. Bouquard, J.L.; Billaut, J.C.; Kubzin, M.A.; Strusevich, V.A. Two-machine flow shop scheduling problems with no-wait jobs. Oper. Res. Lett. 2005, 33, 255–262.
  14. Spieksma, F.C.R.; Woeginger, G.J. The no-wait flow-shop paradox. Oper. Res. Lett. 2005, 33, 603–608.
  15. Hu, Z.; Liu, W.; Ling, S.; Fan, K. Research on multi-objective optimal scheduling considering the balance of labor workload distribution. PLoS ONE 2021, 16, e0255737.
  16. Wang, J.-B. Flow shop scheduling problems with decreasing linear deterioration under dominant machines. Comput. Oper. Res. 2007, 34, 2043–2058.
  17. Oulamara, A. Makespan minimization in a no-wait flow shop problem with two batching machines. Comput. Oper. Res. 2007, 34, 1033–1050.
  18. Qian, B.; Wang, L.; Huang, D.X.; Wang, X. Multi-objective no-wait flow-shop scheduling with a memetic algorithm based on differential evolution. Soft Comput. 2009, 13, 847–869.
  19. Khalili, M.; Tavakkoli-Moghaddam, R. A multi-objective electromagnetism algorithm for a bi-objective flowshop scheduling problem. J. Manuf. Syst. 2012, 31, 232–239.
  20. Ponnambalam, S.G.; Jagannathan, H.; Kataria, M.; Gadicherla, A. A TSP-GA multi-objective algorithm for flow-shop scheduling. Int. J. Adv. Manuf. Technol. 2004, 23, 909–915.
  21. Tavakkoli-Moghaddam, R.; Rahimi-Vahed, A.-R.; Mirzaei, A.H. Solving a Bi-Criteria Permutation Flow Shop Problem Using Immune Algorithm. In Proceedings of the 2007 IEEE Symposium on Computational Intelligence in Scheduling, Honolulu, HI, USA, 1–5 April 2007; pp. 49–56.
  22. Abdollahpour, S.; Rezaeian, J. Minimizing makespan for flow shop scheduling problem with intermediate buffers by using hybrid approach of artificial immune system. Appl. Soft Comput. 2015, 28, 44–56.
  23. Wang, Y.; Xie, N. Flexible flow shop scheduling with interval grey processing time. Grey Syst. Theory Appl. 2021, 11, 779–795.
  24. Gen, M.; Cheng, R. Genetic Algorithms and Engineering Optimization; John Wiley & Sons: Hoboken, NJ, USA, 1999.
  25. Dimopoulos, C.; Zalzala, A.M.S. Recent developments in evolutionary computation for manufacturing optimization: Problems, solutions, and comparisons. IEEE Trans. Evol. Comput. 2000, 4, 93–113.
  26. Ahn, G.; Hur, S. Multiobjective Real-Time Scheduling of Tasks in Cloud Manufacturing with Genetic Algorithm. Math. Probl. Eng. 2021, 2021, 1–10.
  27. Lv, L.; Shen, W. An improved NSGA-II with local search for multi-objective integrated production and inventory scheduling problem. J. Manuf. Syst. 2023, 68, 99–116.
  28. Tian, G.; Zhang, L.; Fathollahi-Fard, A.M.; Kang, Q.; Li, Z.; Wong, K.Y. Addressing a Collaborative Maintenance Planning Using Multiple Operators by a Multi-Objective Metaheuristic Algorithm. IEEE Trans. Autom. Sci. Eng. 2023, 9, e22242.
  29. Zhang, W.; Wang, Y.; Yang, Y.; Gen, M. Hybrid multiobjective evolutionary algorithm based on differential evolution for flow shop scheduling problems. Comput. Ind. Eng. 2019, 130, 661–670.
  30. Marichelvam, M.K.; Tosun, Ö.; Geetha, M. Hybrid monkey search algorithm for flow shop scheduling problem under makespan and total flow time. Appl. Soft Comput. 2017, 55, 82–92.
  31. Zhang, P.; Qian, Y.; Qian, Q. Multi-objective optimization for materials design with improved NSGA-II. Mater. Today Commun. 2021, 28, 102709.
  32. Tamiz, M.; Jones, D.; Romero, C. Goal programming for decision making: An overview of the current state-of-the-art. Eur. J. Oper. Res. 1998, 111, 569–581.
  33. Xin, B.; Chen, L.; Chen, J.; Ishibuchi, H.; Hirota, K.; Liu, B. Interactive Multiobjective Optimization: A Review of the State-of-the-Art. IEEE Access 2018, 6, 41256–41279.
  34. Sadjadi, S.J.; Heidari, M.; Alinezhad Esboei, A. Augmented ε-constraint method in multiobjective staff scheduling problem: A case study. Int. J. Adv. Manuf. Technol. 2014, 70, 1505–1514.
  35. Deb, K. Multi-objective Optimisation Using Evolutionary Algorithms: An Introduction. In Multi-Objective Evolutionary Optimisation for Product Design and Manufacturing; Springer: London, UK, 2011; pp. 3–34.
  36. Jones, D.F.; Mirrazavi, S.K.; Tamiz, M. Multi-objective meta-heuristics: An overview of the current state-of-the-art. Eur. J. Oper. Res. 2002, 137, 1–9.
  37. Schaffer, J.D. Multiple objective optimization with vector evaluated genetic algorithms. In Proceedings of the First International Conference on Genetic Algorithms and Their Applications; Psychology Press: London, UK, 2014; pp. 93–100.
  38. Horn, J.; Nafpliotis, N.; Goldberg, D.E. A niched Pareto genetic algorithm for multiobjective optimization. In Proceedings of the First IEEE Conference on Evolutionary Computation. IEEE World Congress on Computational Intelligence, Orlando, FL, USA, 27–29 June 1994; pp. 82–87.
  39. Fonseca, C.M.; Fleming, P.J. Multiobjective optimization and multiple constraint handling with evolutionary algorithms. I. A unified formulation. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 1998, 28, 26–37.
  40. Kim, M.; Hiroyasu, T.; Miki, M.; Watanabe, S. SPEA2+: Improving the performance of the strength Pareto evolutionary algorithm 2. In Parallel Problem Solving from Nature-PPSN VIII: 8th International Conference, Birmingham, UK, 18–22 September 2004; Proceedings 8; Springer: Berlin/Heidelberg, Germany, 2004; pp. 742–751.
  41. Murata, T.; Ishibuchi, H. MOGA: Multi-objective genetic algorithms. In Proceedings of the IEEE International Conference on Evolutionary Computation, Perth, WA, Australia, 29 November–1 December 1995; pp. 289–294.
  42. Hajela, P.; Lin, C.-Y. Genetic search strategies in multicriterion optimal design. Struct. Optim. 1992, 4, 99–107.
  43. Zitzler, E.; Thiele, L. Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach. IEEE Trans. Evol. Comput. 1999, 3, 257–271.
  44. Knowles, J.D.; Corne, D.W. Approximating the Nondominated Front Using the Pareto Archived Evolution Strategy. Evol. Comput. 2000, 8, 149–172.
  45. Corne, D.W.; Jerram, N.R.; Knowles, J.D.; Oates, M.J. PESA-II: Region-based selection in evolutionary multiobjective optimization. In Proceedings of the 3rd Annual Conference on Genetic and Evolutionary Computation, San Francisco, CA, USA, 7 July 2001; pp. 283–290.
  46. Sarker, R.; Liang, K.-H.; Newton, C. A new multiobjective evolutionary algorithm. Eur. J. Oper. Res. 2002, 140, 12–23.
  47. Deb, K.; Pratap, A.; Agarwal, S.; Meyarivan, T.A.M.T. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 2002, 6, 182–197.
  48. Yen, G.G.; Lu, H. Dynamic multiobjective evolutionary algorithm: Adaptive cell-based rank and density estimation. IEEE Trans. Evol. Comput. 2003, 7, 253–274.
  49. Coello Coello, C.A.; Toscano Pulido, G. A Micro-Genetic Algorithm for Multiobjective Optimization; Springer: Berlin/Heidelberg, Germany, 2001; pp. 126–140.
  50. Lu, H.; Yen, G.G. Rank-density-based multiobjective genetic algorithm and benchmark test function study. IEEE Trans. Evol. Comput. 2003, 7, 325–343.
  51. Coello Coello, C.A. A Comprehensive Survey of Evolutionary-Based Multiobjective Optimization Techniques. Knowl. Inf. Syst. 1999, 1, 269–308.
  52. Xiujuan, L.; Zhongke, S. Overview of multi-objective optimization methods. J. Syst. Eng. Electron. 2004, 15, 142–146.
  53. Jensen, M.T. Reducing the Run-Time Complexity of Multiobjective EAs: The NSGA-II and Other Algorithms. IEEE Trans. Evol. Comput. 2003, 7, 503–515.
  54. Li, M.; Liu, L.; Lin, D. A fast steady-state ε-dominance multi-objective evolutionary algorithm. Comput. Optim. Appl. 2011, 48, 109–138.
  55. Kao, G.K.; Jacobson, S.H. Finding preferred subsets of Pareto optimal solutions. Comput. Optim. Appl. 2008, 40, 73–95.
  56. De Jong, K.A.; Spears, W.M. An Analysis of the Interacting Roles of Population Size and Crossover in Genetic Algorithms; Springer: Berlin/Heidelberg, Germany, 1991; pp. 38–47.
  57. Rezaei, H.; Bozorg-Haddad, O.; Chu, X. Grey Wolf Optimization (GWO) Algorithm. In Advanced Optimization by Nature-Inspired Algorithms; Springer: Berlin/Heidelberg, Germany, 2018; pp. 81–91.
  58. Srinivas, N.; Deb, K. Multiobjective Optimization Using Nondominated Sorting in Genetic Algorithms. Evol. Comput. 1994, 2, 221–248.
  59. Wang, D.; Tan, D.; Liu, L. Particle swarm optimization algorithm: An overview. Soft Comput. 2018, 22, 387–408.
  60. Dorigo, M.; Di Caro, G. Ant colony optimization: A new meta-heuristic. In Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406), Washington, DC, USA, 6–9 July 1999; pp. 1470–1477.
  61. Available online: http://people.brunel.ac.uk/~mastjjb/jeb/info.html (accessed on 15 February 2024).
Figure 1. Illustration of Pareto front for a bi-objective optimization problem.
Figure 2. Non-dominated Pareto front in the population.
Figure 4. Flow chart of the proposed DIPGA algorithm.
Figure 5. Best Pareto solution of proposed DIPGA.
Figure 6. The LSD intervals and mean plot for DIPGA, NSGA, MOGA, NSGA-II, WBGA, PAES, GWO, PSO, and ACO.
Figure 7. Comparison of convergence times.
Table 1. The processing time for job i on machine j.

Job    t(i,1)    t(i,2)
1      5         2
2      2         6
3      1         2
4      7         5
5      6         6
6      3         7
7      7         2
8      5         1
9      11        6
10     20        3
Table 2. Job sequence.

Sequence 1: 9, 3, 10, 6, 4, 5, 1, 2, 7, 8
Sequence 2: 3, 7, 2, 5, 4, 8, 6, 9, 10, 1
Sequence 3: 9, 10, 4, 2, 5, 8, 1, 6, 7, 3
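The sequences in Table 2 are competing schedules; which of them are retained as Pareto-optimal is decided by a dominance filter over their objective vectors (makespan, total waiting time). A minimal sketch, with hypothetical objective values chosen only for illustration:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep the non-dominated objective vectors (assumes distinct points)."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (makespan, total_wait) pairs for four candidate schedules
print(pareto_front([(100, 30), (95, 35), (105, 25), (110, 40)]))
# (110, 40) is dominated by (100, 30); the other three are mutually non-dominated
```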
Table 3. Comparison of different multiobjective algorithms by average of the MDI.

Instance   DIPGA   NSGA    MOGA    NSGA-II   WBGA    PAES    GWO     PSO     ACO
20 × 5     0.035   0.055   0.072   0.042     0.087   0.045   0.037   0.051   0.04
20 × 5     0.023   0.044   0.041   0.039     0.078   0.048   0.033   0.048   0.035
20 × 5     0.039   0.04    0.082   0.053     0.093   0.056   0.038   0.052   0.041
20 × 10    0.035   0.065   0.055   0.049     0.091   0.051   0.034   0.05    0.037
20 × 10    0.038   0.058   0.077   0.058     0.077   0.056   0.036   0.049   0.038
20 × 10    0.021   0.041   0.056   0.037     0.084   0.034   0.032   0.048   0.036
20 × 15    0.023   0.048   0.08    0.053     0.081   0.041   0.033   0.051   0.037
20 × 15    0.03    0.035   0.094   0.032     0.067   0.032   0.036   0.047   0.039
20 × 15    0.035   0.033   0.047   0.038     0.067   0.039   0.035   0.045   0.038
30 × 10    0.027   0.053   0.074   0.041     0.084   0.058   0.034   0.051   0.037
30 × 10    0.021   0.044   0.082   0.039     0.099   0.061   0.035   0.048   0.039
30 × 10    0.028   0.042   0.093   0.04      0.094   0.047   0.033   0.049   0.038
30 × 15    0.031   0.044   0.072   0.043     0.083   0.053   0.037   0.05    0.039
30 × 15    0.027   0.043   0.065   0.045     0.081   0.057   0.034   0.048   0.038
30 × 15    0.025   0.041   0.057   0.045     0.047   0.057   0.032   0.046   0.036
50 × 10    0.033   0.055   0.059   0.038     0.067   0.049   0.034   0.048   0.038
50 × 10    0.033   0.051   0.084   0.036     0.078   0.049   0.035   0.049   0.039
50 × 10    0.029   0.032   0.078   0.036     0.087   0.069   0.034   0.047   0.037
75 × 20    0.041   0.063   0.091   0.052     0.088   0.065   0.036   0.05    0.038
75 × 20    0.035   0.065   0.051   0.054     0.094   0.049   0.037   0.049   0.039
75 × 20    0.034   0.052   0.087   0.036     0.092   0.064   0.035   0.048   0.038
Average    0.03    0.047   0.071   0.043     0.081   0.051   0.039   0.047   0.041
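Table 3 compares the algorithms by the mean deviation from the ideal point (MDI). The paper's exact MDI formula is given in its methodology section; as a hedged sketch of the idea, one common variant averages the normalized Euclidean distance of each front member from the front's ideal point:

```python
import math

def mean_deviation_from_ideal(front):
    """Mean normalized Euclidean distance of a Pareto front from its ideal point.

    front: list of objective vectors (e.g. (makespan, total_wait)), all minimized.
    Normalization by the ideal-nadir range is an assumption here; the paper's
    exact MDI definition may differ.
    """
    m = len(front[0])
    ideal = [min(f[k] for f in front) for k in range(m)]   # best value per objective
    nadir = [max(f[k] for f in front) for k in range(m)]   # worst value per objective
    span = [max(n - i, 1e-12) for i, n in zip(ideal, nadir)]  # guard against zero range
    dists = [
        math.sqrt(sum(((f[k] - ideal[k]) / span[k]) ** 2 for k in range(m)))
        for f in front
    ]
    return sum(dists) / len(dists)

# Hypothetical three-solution front: lower MDI means the front hugs its ideal point
print(round(mean_deviation_from_ideal([(90, 40), (95, 30), (110, 20)]), 3))  # 0.853
```

Under this reading, the smaller averages in the DIPGA column of Table 3 indicate fronts that lie closer to the ideal point than those of the other algorithms.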
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
