Simulation-Based EDAs for Stochastic Programming Problems
Abstract
1. Introduction
- the high computational cost and long run time of evaluating the objective function,
- the difficulty of computing the exact gradient of the objective function, and the expense and time required to compute numerical approximations of it, and
- the noise contained in the objective function values (illustrated in the sketch below).
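To make the last point concrete, the following minimal sketch (illustrative only, not taken from the paper) shows a simulation-based objective whose exact value and gradient are unavailable: the optimizer observes only noisy replications of a Griewank-style core, and both the core and the Gaussian noise level are assumptions chosen for illustration.

```python
import numpy as np

def noisy_objective(x, rng, n_reps=1):
    """Hypothetical simulation-based objective: a Griewank-style core observed
    only through noisy replications (the noise model is an assumption)."""
    x = np.asarray(x, dtype=float)
    core = (np.sum(x**2) / 4000.0
            - np.prod(np.cos(x / np.sqrt(np.arange(1, x.size + 1)))) + 1.0)
    # Only averages of noisy replications are available to the optimizer;
    # the exact value of `core` and its gradient are not.
    return float(np.mean(core + rng.normal(0.0, 0.1, size=n_reps)))
```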
2. Estimation of Distribution Algorithms
Algorithm 1. Pseudocode for the standard EDAs.
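The pseudocode box is not reproduced above. As a rough orientation only, a standard continuous EDA iterates selection, density estimation, and resampling; the sketch below is a generic UMDA-style loop in Python, where the Gaussian model, the population size, and the selection size are generic choices rather than the paper's exact settings.

```python
import numpy as np

def standard_eda(f, bounds, pop_size=60, n_selected=30, max_gens=100, seed=0):
    """Generic continuous EDA loop: select the best individuals, fit a density,
    then sample the next population from that density."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T         # bounds: [(lo, hi), ...]
    pop = rng.uniform(lo, hi, size=(pop_size, lo.size))
    best_x, best_f = None, np.inf
    for _ in range(max_gens):
        vals = np.apply_along_axis(f, 1, pop)
        order = np.argsort(vals)
        if vals[order[0]] < best_f:
            best_f, best_x = vals[order[0]], pop[order[0]].copy()
        elite = pop[order[:n_selected]]                         # selection step
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-12
        pop = rng.normal(mu, sigma, size=(pop_size, lo.size))   # sampling step
        pop = np.clip(pop, lo, hi)
    return best_x, best_f
```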
Algorithm 2. Learning the joint density function.
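Again, the listing itself is omitted. The sketch below only illustrates the general idea of learning a joint density from the selected individuals by fitting a multivariate Gaussian; the density model actually used in the paper may differ, and the regularization term is an assumption.

```python
import numpy as np

def learn_joint_density(selected):
    """Fit a multivariate Gaussian to the selected individuals; sampling the
    next generation then amounts to drawing from N(mu, sigma)."""
    selected = np.asarray(selected, dtype=float)
    mu = selected.mean(axis=0)
    # Small diagonal term keeps the covariance positive definite when the
    # selected set is small or nearly collinear.
    sigma = np.cov(selected, rowvar=False) + 1e-8 * np.eye(selected.shape[1])
    return mu, sigma

def sample_from_density(mu, sigma, n, rng):
    """Draw n new individuals from the learned Gaussian."""
    return rng.multivariate_normal(mu, sigma, size=n)
```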
3. Variable Sampling Path
Algorithm 3. Sampling Pure Random Search (SPRS) Algorithm.
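The SPRS listing is not shown here. In the spirit of variable-sample methods, each candidate point is drawn at random and its objective value is estimated by averaging a growing number of simulation replications; the sketch below is a hedged illustration in which the acceptance rule, the growth factor, and the initial and maximum sample sizes are assumptions (the latter two echo the parameter table in Section 5.2).

```python
import numpy as np

def sprs(sim, bounds, n0=50, n_max=5000, growth=1.1, max_iters=1000, seed=0):
    """Variable-sample pure random search sketch: every estimate averages n_k
    simulation replications, and n_k grows as the search proceeds."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    estimate = lambda x, n: float(np.mean([sim(x, rng) for _ in range(n)]))
    n_k = n0
    best_x = rng.uniform(lo, hi)
    best_val = estimate(best_x, n_k)
    for _ in range(max_iters):
        cand = rng.uniform(lo, hi)                   # pure random candidate
        if estimate(cand, n_k) < best_val:
            best_x = cand
        n_k = min(int(growth * n_k) + 1, n_max)      # enlarge the sample size
        best_val = estimate(best_x, n_k)             # re-estimate the incumbent
    return best_x, best_val
```

Here `sim(x, rng)` is assumed to return a single noisy replication of the objective at `x`, for example the `noisy_objective` sketched in the Introduction.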
4. Estimation of Distribution Algorithms for Simulation-Based Optimization
4.1. Function Transformation
Algorithm 4. Min-Max Sampling Search (MMSS) Algorithm.
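The MMSS listing is likewise omitted, and its exact min-max transformation is not reproduced here. Purely as a guess at its flavor, the sketch below blends the best and worst of the replicated evaluations through a transformation parameter `lam`; both the form of the blend and the parameter name are assumptions.

```python
import numpy as np

def mmss_estimate(sim, x, n_reps, lam, rng):
    """Assumed min-max style estimator: combine the minimum and maximum of
    n_reps noisy replications via the transformation parameter lam."""
    reps = np.array([sim(x, rng) for _ in range(n_reps)])
    return lam * reps.min() + (1.0 - lam) * reps.max()
```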
4.2. The Proposed EDA-Based Method
Algorithm 5. The proposed EDA-based Algorithm.
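As a hedged illustration of how the pieces could fit together (not the paper's exact procedure), the driver below runs the EDA loop of Algorithm 1 while obtaining fitness values from a plug-in estimator and enlarging the sample size across generations; the doubling schedule, the Gaussian model, and the default parameter values are assumptions.

```python
import numpy as np

def simulation_eda(estimator, bounds, pop_size=60, n_selected=30,
                   n0=50, n_max=5000, max_gens=200, seed=0):
    """EDA driver whose fitness values come from a plug-in estimator
    (direct evaluation, SPRS-style averaging, or an MMSS-style estimate)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, lo.size))
    n_k, best_x, best_f = n0, None, np.inf
    for _ in range(max_gens):
        vals = np.array([estimator(x, n_k, rng) for x in pop])
        order = np.argsort(vals)
        if vals[order[0]] < best_f:
            best_f, best_x = vals[order[0]], pop[order[0]].copy()
        elite = pop[order[:n_selected]]
        mu = elite.mean(axis=0)
        sigma = np.cov(elite, rowvar=False) + 1e-8 * np.eye(lo.size)
        pop = np.clip(rng.multivariate_normal(mu, sigma, size=pop_size), lo, hi)
        n_k = min(2 * n_k, n_max)          # assumed sample-size growth schedule
    return best_x, best_f
```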
- EDA-D: used when the objective function is noise-free and its values are computed directly from the function form.
- EDA-SPRS: used when the objective function contains random variables and its values are estimated with the SPRS technique.
- EDA-MMSS: used when the objective function contains random variables and its values are estimated with the MMSS technique (a selection sketch follows this list).
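A small dispatcher with purely illustrative signatures shows how these three variants could reduce to the choice of fitness estimator plugged into the driver sketched above:

```python
import numpy as np

def make_estimator(variant, f=None, sim=None, lam=0.5):
    """Map the three variants to fitness estimators (signatures are assumptions)."""
    if variant == "EDA-D":       # noise-free: evaluate the function form directly
        return lambda x, n, rng: f(x)
    if variant == "EDA-SPRS":    # average n simulation replications
        return lambda x, n, rng: float(np.mean([sim(x, rng) for _ in range(n)]))
    if variant == "EDA-MMSS":    # assumed min-max transformed estimate
        def mmss(x, n, rng):
            reps = np.array([sim(x, rng) for _ in range(n)])
            return lam * reps.min() + (1.0 - lam) * reps.max()
        return mmss
    raise ValueError(f"unknown variant: {variant}")
```

For instance, `simulation_eda(make_estimator("EDA-SPRS", sim=noisy_objective), bounds=[(-10, 10)] * 5)` would run the SPRS variant on the hypothetical noisy objective sketched in the Introduction.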
5. Numerical Experiments
5.1. Test Functions
5.2. Parameter Settings
5.3. Performance Analysis
6. Results and Discussion
6.1. Numerical Results on Global Optimization
6.2. Simulation-Based Optimization Results
7. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Appendix A. Hard Test Functions
h | Function Name | Bounds | Global Min
---|---|---|---
 | Shifted sphere function | |
 | Shifted Schwefel’s function 1.2 | |
 | Shifted rotated high conditioned elliptic function | |
 | Shifted Schwefel’s function 1.2 with noise in fitness | |
 | Schwefel’s function 2.6 with global optimum on bounds | |
 | Shifted Rosenbrock’s function | | 390
 | Shifted rotated Griewank’s function without bounds | |
 | Shifted rotated Ackley’s function with global optimum on bounds | |
 | Shifted Rastrigin’s function | |
 | Shifted rotated Rastrigin’s function | |
 | Shifted rotated Weierstrass function | | 90
 | Schwefel’s function 2.13 | |
 | Expanded extended Griewank’s + Rosenbrock’s function | |
 | Expanded rotated extended Scaffer’s function | |
 | Hybrid composition function | | 120
 | Rotated hybrid composition function | | 120
 | Rotated hybrid composition function with noise in fitness | | 120
 | Rotated hybrid composition function | | 10
 | Rotated hybrid composition function with narrow basin global optimum | | 10
 | Rotated hybrid composition function with global optimum on the bounds | | 10
 | Rotated hybrid composition function | | 360
 | Rotated hybrid composition function with high condition number matrix | | 360
 | Non-continuous rotated hybrid composition function | | 360
 | Rotated hybrid composition function | | 260
 | Rotated hybrid composition function without bounds | | 260
Appendix B. Classical Test Functions—Set A
Appendix B.1. Goldstein and Price Function
Appendix B.2. Rosenbrock Function
Appendix B.3. Griewank Function
Appendix B.4. Pinter Function
Appendix B.5. Modified Griewank Function
Appendix B.6. Griewank Function with Non-Gaussian Noise
Appendix B.7. Griewank Function (50D)
Appendix C. Classical Test Functions—Set B
Appendix C.1. Ackley Function
Appendix C.2. Alpine Function
Appendix C.3. Axis Parallel Function
Appendix C.4. DeJong Function
Appendix C.5. Drop Wave Function
Appendix C.6. Griewank Function
Appendix C.7. Michalewicz Function
Appendix C.8. Moved Axis Function
Appendix C.9. Pathological Function
Appendix C.10. Rastrigin Function
Appendix C.11. Rosenbrock Function
Appendix C.12. Schwefel Function
Appendix C.13. Tirronen Function
References
No. | f | Function Name | n | No. | f | Function Name | n
---|---|---|---|---|---|---|---
1 | | Branin RCOS | 2 | 6 | | Perm | 4
2 | | Goldstein & Price | 2 | 7 | | Trid | 6
3 | | Schwefel | 2 | 8 | | Griewank | 20
4 | | Zakharov | 2 | 9 | | Dixon-Price | 25
5 | | Shekel | 4 | 10 | | Ackley | 30
Function | Name | n | Stochastic Variable Distribution
---|---|---|---
 | Goldstein & Price | 2 |
 | Rosenbrock | 5 |
 | Griewank | 2 |
 | Pinter | 5 |
 | Modified Griewank | 2 |
 | Griewank | 2 |
 | Griewank | 50 |
Parameter | Definition | Best Value
---|---|---
R | Population size | 60
S | No. of selected individuals |
 | Initial value for the sample size | 50
 | Maximum value for the sample size | 5000
 | Threshold of small sample sizes | 300
 | Parameter of the function transformation |
 | Max no. of function evaluations for GO | 10,000n
 | Max no. of function evaluations for SBO | 1,000,000
Runs | No. of independent runs in each experiment | 25
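If the settings above were gathered into code, a configuration along the following lines could be used; the key names are assumptions, the values come from the table, and the two entries whose values are not listed in the table are left unset.

```python
# Illustrative parameter bundle (key names are assumptions, values from the table).
CONFIG = {
    "R": 60,                 # population size
    "S": None,               # no. of selected individuals (value not listed above)
    "n0": 50,                # initial sample size
    "n_max": 5000,           # maximum sample size
    "n_threshold": 300,      # threshold of small sample sizes
    "lam": None,             # function-transformation parameter (value not listed above)
    "max_fevals_go": lambda n: 10_000 * n,  # budget for global optimization (n = dimension)
    "max_fevals_sbo": 1_000_000,            # budget for simulation-based optimization
    "runs": 25,              # independent runs per experiment
}
```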
f | EDA-SPRS (Best) | EDA-SPRS (Average) | EDA-MMSS (Best) | EDA-MMSS (Average)
---|---|---|---|---
 | 5.06 | 1.64 | 6.61 | 6.15
 | 2.52 | 3.11 | 4.50 | 7.69
 | 3.75 | 4.24 | 2.90 | 3.22
 | 1.39 | 1.62 | 9.34 | 2.23
 | 3.59 | 3.02 | 9.11 | 2.11
 | 9.40 | 4.77 | 2.91 | 2.87
 | 3.14 | 4.56 | 2.25 | 2.99
Comparison Criteria | Compared Methods | R+ | R− | p-Value | Best Method
---|---|---|---|---|---
Best Solutions | EDA-SPRS, EDA-MMSS | 14 | 14 | 0.7104 | –
Average Errors | EDA-SPRS, EDA-MMSS | 15 | 13 | 0.7104 | –
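The R+/R− sums and p-values in this and the later comparison tables are consistent with a paired Wilcoxon signed-rank test over the per-function results. Assuming that test, a comparison of this kind could be reproduced along the following lines; the error vectors are made up for illustration, and `scipy` is assumed to be available.

```python
from scipy.stats import wilcoxon

# Hypothetical per-function average errors for two compared methods.
errors_method_a = [0.12, 0.03, 1.40, 0.75, 0.08, 0.51, 0.02]
errors_method_b = [0.10, 0.05, 1.10, 0.90, 0.07, 0.60, 0.03]

stat, p_value = wilcoxon(errors_method_a, errors_method_b)  # paired, two-sided
print(p_value)  # a "best method" is declared only when p < 0.05
```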
f | DSS | EDA-D
---|---|---
 | 3.58 | 3.59
 | 6.00 | 5.02
 | 4.69 | 1.77
 | 2.39 | 1.00
 | 2.55 | 2.50
 | 5.21 | 2.49
 | 2.88 | 3.48
 | 9.43 | 1.21
 | 5.67 | 1.00
 | 1.04 | 9.52
Comparison Criteria | Compared Methods | R+ | R− | p-Value | Best Method
---|---|---|---|---|---
Average Errors | DSS, EDA-D | 22 | 33 | 0.7337 | –
h | EDA-D | jDE | SaDE | JADE | CoDE | SSCPDE | CoBiDE | PKDE
---|---|---|---|---|---|---|---|---
 | 4.63 | 0 | 0 | 0 | 6.73 | 0 | 0 | 0
 | 4.72 | 4.78 | 1.12 | 8.15 | 1.23 | 7.79 | 1.19 | 3.54
 | 4.51 | 2.03 | 5.35 | 8.18 | 1.20 | 7.14 | 7.80 | 4.89
 | 5.05 | 3.44 | 1.45 | 8.39E-16 | 5.17 | 1.59 | 8.88 | 7.83
 | 6.93 | 4.46 | 3.13 | 4.86 | 3.64 | 3.92 | 3.71 | 6.29
 | 7.97 | 2.25 | 4.01 | 2.58 | 1.33 | 7.80 | 1.66 | 2.65
 | 1.53 | 1.16 | 1.84 | 9.36 | 8.86 | 4.51 | 3.68 | 6.07
 | 2.00 | 2.09 | 2.09 | 2.09 | 2.02 | 2.09 | 2.07 | 2.03
 | 1.52 | 0 | 9.95 | 0 | 0 | 0 | 0 | 0
 | 3.18 | 5.79 | 4.48 | 2.43 | 4.16 | 2.84 | 4.34 | 4.57
 | 3.42 | 2.83 | 1.65 | 2.53 | 1.26 | 1.95 | 5.67 | 1.36
 | 1.80 | 1.20 | 2.17 | 6.68 | 3.34 | 1.64 | 2.96 | 3.72
 | 5.76 | 1.66 | 3.90 | 1.48 | 1.58 | 2.50 | 2.64 | 2.35
 | 1.40 | 1.30 | 1.26 | 1.23 | 1.24 | 1.22 | 1.22 | 1.23
 | 8.98 | 3.18 | 3.74 | 3.76 | 4.03 | 3.30 | 4.10 | 3.43
 | 4.67 | 8.49 | 7.71 | 9.63 | 6.39 | 4.97 | 8.42 | 6.48
 | 4.52 | 1.39 | 8.70 | 1.02 | 8.50 | 5.56 | 6.82 | 6.83
 | 9.69 | 9.04 | 8.78 | 9.04 | 9.05 | 9.00 | 9.04 | 9.00
 | 1.02 | 9.04 | 8.60 | 9.04 | 9.04 | 9.00 | 9.04 | 9.00
 | 9.80 | 9.04 | 8.73 | 9.04 | 9.04 | 9.00 | 9.04 | 9.00
 | 1.09 | 5.00 | 5.43 | 5.10 | 5.00 | 5.00 | 5.00 | 5.00
 | 1.24 | 8.67 | 9.36 | 8.64 | 8.63 | 8.83 | 8.54 | 8.86
 | 1.26 | 5.34 | 5.69 | 5.34 | 5.34 | 5.34 | 5.34 | 5.34
 | 1.36 | 2.00 | 2.00 | 2.00 | 2.00 | 2.00 | 2.00 | 2.00
 | 1.37 | 2.11 | 2.13 | 2.11 | 2.11 | 2.11 | 2.10 | 2.11
Comparison Criteria | Compared Methods | R+ | R− | p-Value | Best Method
---|---|---|---|---|---
Average Errors | EDA-D, jDE | 263 | 62 | 0.2327 | –
 | EDA-D, SaDE | 261 | 64 | 0.4151 | –
 | EDA-D, JADE | 269 | 56 | 0.1116 | –
 | EDA-D, CoDE | 274 | 51 | 0.1683 | –
 | EDA-D, SSCPDE | 273 | 52 | 0.1510 | –
 | EDA-D, CoBiDE | 273 | 52 | 0.1456 | –
 | EDA-D, PKDE | 275 | 50 | 0.1456 | –
f | EDA-SPRS (Best) | EDA-SPRS (Average) | EDA-MMSS (Best) | EDA-MMSS (Average) | DESSP (Best) | DESSP (Average) | DSSSP (Best) | DSSSP (Average)
---|---|---|---|---|---|---|---|---
 | 5.06 | 1.64 | 6.61 | 6.15 | 5.00 | 2.33 | 1.46 | 2.94
 | 2.52 | 3.11 | 4.50 | 7.69 | 8.05 | 3.55 | 4.08 | 6.56
 | 3.75 | 4.24 | 2.90 | 3.22 | 1.00 | 3.31 | 5.90 | 1.04
 | 1.39 | 1.62 | 9.34 | 2.23 | 1.44 | 3.75 | 2.75 | 6.71
 | 3.59 | 3.02 | 9.11 | 2.11 | 4.00 | 3.87 | 2.82 | 1.91
 | 9.40 | 4.77 | 2.91 | 2.87 | 1.00 | 2.13 | 3.00 | 9.21
 | 3.14 | 4.56 | 2.25 | 2.99 | 2.79 | 3.96 | 8.41 | 1.24
 | f | | | | | |
---|---|---|---|---|---|---|---
EDA-SPRS | 79 | 10 | 39 | 75 | 10 | 5 | 241
EDA-MMSS | 85 | 11 | 40 | 76 | 11 | 5 | 251
DESSP | 162 | 20 | 119 | 177 | 25 | 10 | 476
DSSSP | 414 | 103 | 317 | 307 | 114 | 35 | 683
Comparison Criteria | Compared Methods | R+ | R− | p-Value | Best Method
---|---|---|---|---|---
Best Solutions | EDA-SPRS, EDA-MMSS | 14 | 14 | 0.7104 | –
 | EDA-SPRS, DESSP | 21 | 7 | 0.2593 | –
 | EDA-SPRS, DSSSP | 9 | 19 | 0.9015 | –
 | EDA-MMSS, DESSP | 16 | 12 | 0.3829 | –
 | EDA-MMSS, DSSSP | 11 | 17 | 0.8048 | –
Average Errors | EDA-SPRS, EDA-MMSS | 15 | 13 | 0.7104 | –
 | EDA-SPRS, DESSP | 14 | 14 | 0.8048 | –
 | EDA-SPRS, DSSSP | 9 | 19 | 0.8048 | –
 | EDA-MMSS, DESSP | 22 | 6 | 0.2086 | –
 | EDA-MMSS, DSSSP | 18 | 10 | 0.8048 | –
Processing Time | EDA-SPRS, EDA-MMSS | 0.5 | 27.5 | 0.6474 | –
 | EDA-SPRS, DESSP | 0 | 28 | 0.3141 | –
 | EDA-SPRS, DSSSP | 0 | 28 | 0.0169 | EDA-SPRS
 | EDA-MMSS, DESSP | 0 | 28 | 0.3642 | –
 | EDA-MMSS, DSSSP | 0 | 28 | 0.0169 | EDA-MMSS
g | EDA-SPRS | EDA-MMSS | DE/rand/1 | jDE | GADS
---|---|---|---|---|---
 | 2.73 | 5.60 | 3.67 | 4.59 | 1.95
 | 1.79 | 8.54 | 5.89 | 4.25 | 3.24
 | 3.67 | 9.25 | 6.56 | 4.27 | 7.44
 | 1.63 | 4.14 | 1.02 | 5.82 | 6.49
 | 3.88 | 2.82 | 9.98 | 7.67 | 2.76
 | 2.00 | 2.42 | 4.18 | 1.99 | 4.75
 | 2.40 | 2.28 | 6.34 | 2.08 | 1.29
 | 5.67 | 3.73 | 1.65 | 9.19 | 1.05
 | 1.07 | 1.12 | 5.74 | 4.65 | 3.81
 | 2.50 | 4.26 | 3.91 | 2.90 | 2.31
 | 3.41 | 4.37 | 6.36 | 4.12 | 7.50
 | 8.23 | 4.51 | 7.89 | 8.22 | 6.09
 | 9.64 | 6.19 | 1.18 | 9.91 | 1.46

g | DERSFTS | OBDE | NADE | MUDE | MDE-DS
---|---|---|---|---|---
 | 1.10 | 3.35 | 2.99 | 2.59 | 0
 | 5.67 | 5.69 | 3.29 | 2.69 | 8.53
 | 6.84 | 8.23 | 3.32 | 2.36 | 7.08
 | 1.06 | 1.45 | 4.53 | 3.68 | 7.92
 | 7.84 | 4.16 | 8.30 | 7.69 | 5.41
 | 3.95 | 5.21 | 1.65 | 1.46 | 1.41
 | 8.52 | 9.27 | 2.46 | 2.32 | 2.02
 | 1.66 | 2.15 | 6.84 | 5.28 | 7.81
 | 3.92 | 4.25 | 5.05 | 5.41 | 1.39
 | 3.83 | 3.78 | 1.96 | 2.00 | 1.89
 | 6.12 | 7.26 | 3.76 | 2.49 | 2.60
 | 1.01 | 8.03 | 5.60 | 6.00 | 0
 | 5.89 | 1.08 | 1.82 | 1.57 | 1.03
Comparison Criteria | Compared Methods | R+ | R− | p-Value | Best Method
---|---|---|---|---|---
Average Errors | EDA-SPRS, EDA-MMSS | 23 | 68 | 0.3560 | –
 | EDA-SPRS, DE/rand/1 | 19 | 72 | 0.0483 | EDA-SPRS
 | EDA-SPRS, jDE | 15 | 76 | 0.0513 | –
 | EDA-SPRS, GADS | 20 | 71 | 0.0649 | –
 | EDA-SPRS, DERSFTS | 10 | 81 | 0.0513 | –
 | EDA-SPRS, OBDE | 19 | 72 | 0.0544 | –
 | EDA-SPRS, NADE | 15 | 76 | 0.0578 | –
 | EDA-SPRS, MUDE | 20 | 71 | 0.0812 | –
 | EDA-SPRS, MDE-DS | 81 | 10 | 0.0077 | MDE-DS
 | EDA-MMSS, DE/rand/1 | 10 | 81 | 0.1008 | –
 | EDA-MMSS, jDE | 10 | 81 | 0.1119 | –
 | EDA-MMSS, GADS | 10 | 81 | 0.1239 | –
 | EDA-MMSS, DERSFTS | 10 | 81 | 0.1008 | –
 | EDA-MMSS, OBDE | 10 | 81 | 0.0812 | –
 | EDA-MMSS, NADE | 6 | 85 | 0.1119 | –
 | EDA-MMSS, MUDE | 6 | 85 | 0.1662 | –
 | EDA-MMSS, MDE-DS | 84 | 7 | 0.0025 | MDE-DS