Comparative Analysis of Accelerated Models for Solving Unconstrained Optimization Problems with Application of Khan’s Hybrid Rule
Abstract
1. Class of Accelerated Gradient Descent Methods and Its Benefits
- Weak Wolfe’s line search: $f(x_k + t_k d_k) \le f(x_k) + c_1 t_k g_k^{\top} d_k$ and $g(x_k + t_k d_k)^{\top} d_k \ge c_2\, g_k^{\top} d_k$, where $g_k = \nabla f(x_k)$ and $0 < c_1 < c_2 < 1$;
- Strong Wolfe’s line search: the first of the above conditions together with $\lvert g(x_k + t_k d_k)^{\top} d_k \rvert \le c_2\, \lvert g_k^{\top} d_k \rvert$;
- Backtracking algorithm:
- The objective function $f$, the search direction $d_k$ at the point $x_k$ and numbers $0 < \sigma < 0.5$ and $\beta \in (\sigma, 1)$ are required;
- $t = 1$;
- While $f(x_k + t d_k) > f(x_k) + \sigma t g_k^{\top} d_k$, take $t := \beta t$;
- Return $t_k = t$.
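For concreteness, the following is a minimal C++ sketch of this Backtracking procedure (our illustration, not the authors’ code: the caller supplies the objective f, the current point x, the gradient g at x, the direction d, and the parameters sigma and beta):

```cpp
#include <functional>
#include <numeric>
#include <vector>

// Minimal sketch of the Backtracking line search described above.
// Assumes 0 < sigma < 0.5, beta in (sigma, 1) and a descent direction d.
double backtracking(const std::function<double(const std::vector<double>&)>& f,
                    const std::vector<double>& x, const std::vector<double>& g,
                    const std::vector<double>& d, double sigma, double beta) {
    const double fx = f(x);
    // Directional derivative g_k^T d_k (negative for a descent direction).
    const double gTd = std::inner_product(g.begin(), g.end(), d.begin(), 0.0);
    auto trial = [&](double step) {               // evaluates f(x_k + step * d_k)
        std::vector<double> y(x.size());
        for (std::size_t i = 0; i < x.size(); ++i) y[i] = x[i] + step * d[i];
        return f(y);
    };
    double t = 1.0;                               // step 1: t = 1
    while (trial(t) > fx + sigma * t * gTd)       // step 2: Armijo test fails,
        t *= beta;                                //         shrink the step
    return t;                                     // step 3: return t_k = t
}
```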
- The matrix $B_k$ is defined as a scalar matrix, i.e., $B_k = \gamma_k I$ with $\gamma_k > 0$;
- The matrix $B_k$ is defined as a diagonal matrix, i.e., $B_k = \operatorname{diag}(b_1, \ldots, b_n)$;
- The matrix $B_k$ is defined as a full matrix.
- The objective function $f$, the search direction $d_k$ at the point $x_k$ and numbers $0 < \sigma < 0.5$ and $\beta \in (\sigma, 1)$ are required;
- Apply the Backtracking algorithm to calculate the step size $t_k \in (0, 1]$;
- Compute $x_{k+1} = x_k - t_k \gamma_k^{-1} g_k$;
- Compute $\gamma_{k+1} = \dfrac{2\gamma_k \left[ \gamma_k \left( f(x_{k+1}) - f(x_k) \right) + t_k \lVert g_k \rVert^2 \right]}{t_k^2 \lVert g_k \rVert^2}$;
- Return $x_{k+1}$ and $\gamma_{k+1}$.
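Combining the Backtracking sketch above with the two Compute steps, one full iteration of this accelerated scheme can be sketched in C++ as follows (again an illustration under our reconstruction of the $\gamma$-update, not the authors’ implementation):

```cpp
#include <functional>
#include <vector>

// One iteration x_{k+1} = x_k - t_k * gamma_k^{-1} * g_k of the accelerated
// gradient scheme, followed by the Taylor-based update of gamma_{k+1}.
// Assumes g_k != 0 (otherwise the stopping criterion applies) and reuses
// the backtracking() sketch given earlier.
struct Iterate { std::vector<double> x; double gamma; };

Iterate accelerated_step(const std::function<double(const std::vector<double>&)>& f,
                         const std::vector<double>& g,   // gradient at cur.x
                         const Iterate& cur, double sigma, double beta) {
    std::vector<double> d(g.size());                      // d_k = -gamma_k^{-1} g_k
    for (std::size_t i = 0; i < g.size(); ++i) d[i] = -g[i] / cur.gamma;
    const double t = backtracking(f, cur.x, g, d, sigma, beta);  // t_k in (0, 1]
    Iterate next;
    next.x.resize(cur.x.size());
    for (std::size_t i = 0; i < cur.x.size(); ++i) next.x[i] = cur.x[i] + t * d[i];
    double gg = 0.0;                                      // ||g_k||^2
    for (double gi : g) gg += gi * gi;
    // gamma_{k+1} from the second-order Taylor model, as in the Compute step.
    next.gamma = 2.0 * cur.gamma * (cur.gamma * (f(next.x) - f(cur.x)) + t * gg)
                 / (t * t * gg);
    if (next.gamma <= 0.0) next.gamma = 1.0;              // positivity safeguard
    return next;
}
```

The last line also encodes the positivity safeguard described in the next item (setting $\gamma_k = 1$ when positiveness fails).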
- According to the general iteration form (1), the search direction in the SM method, defined by relation (8), is $d_k = -\gamma_k^{-1} g_k$. One of the essential properties of the acceleration parameter $\gamma_k$ is its positiveness. If in some iterative step $k$ of the accelerated gradient algorithms with leading iterative rules (8), (9) and (14) this necessary condition is not fulfilled, then the $k$-th accelerated scalar value is set to $\gamma_k = 1$. Bearing this fact in mind, we easily conclude that $g_k^{\top} d_k = -\gamma_k^{-1} \lVert g_k \rVert^2 < 0$, i.e., the descent condition (2) holds.
- The accelerated double direction scheme (9) contains two search vectors. The first one, denoted $d_k$, is defined by (10). The second one is of the same form as in the SM iteration, i.e., $-\gamma_k^{-1} g_k$. In the procedure (10), the crucial element in deriving the vector $d_k$ is defined as a solution of the minimization problem (11), which depends on the gradient $g_k$. Thus, the defined vector direction is a relaxed differentiable variant of the procedure for determining the search vector (rule 2) in [23] and, accordingly, $d_k$ is the global optimum of problem (11). Consequently, in what follows we consider only the second direction, which was already analyzed in the previous item.
- In the scheme (14), the vector direction can be seen as $d_k = -(\alpha_k \gamma_k^{-1} + \beta_k) g_k$. Checking the descent condition (2), we get $g_k^{\top} d_k = -(\alpha_k \gamma_k^{-1} + \beta_k) \lVert g_k \rVert^2 < 0$, since both step sizes $\alpha_k$ and $\beta_k$ are positive.
2. Three-Term Khan’s Hybridization Principle over the Accelerated Gradient Descent Models
- $x_1 = x_0 \in C$,
- $x_{k+1} = T(y_k)$,
- $y_k = (1 - \alpha_k) x_k + \alpha_k T(x_k)$, where $T$ denotes the underlying iteration operator and $\{\alpha_k\} \subset (0, 1)$ [27].
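The following minimal C++ sketch illustrates one step of this three-term rule (our illustration: T stands for the underlying iteration operator, e.g., one accelerated gradient step, and alpha is a fixed correction parameter in (0, 1)):

```cpp
#include <functional>
#include <vector>

using Vec = std::vector<double>;

// One step of Khan's three-term hybrid rule:
//   y_k     = (1 - alpha) * x_k + alpha * T(x_k)   (Mann averaging),
//   x_{k+1} = T(y_k)                               (Picard step).
Vec khan_hybrid_step(const std::function<Vec(const Vec&)>& T,
                     const Vec& x, double alpha) {
    const Vec Tx = T(x);
    Vec y(x.size());
    for (std::size_t i = 0; i < x.size(); ++i)
        y[i] = (1.0 - alpha) * x[i] + alpha * Tx[i];
    return T(y);
}
```

Taking T to be one step of an accelerated gradient scheme from Section 1 yields the general shape of the hybrid models listed below.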
- Hybrid gradient descent method () [32]
- Hybrid accelerated gradient descent method () [32]
- Hybrid modified accelerated gradient descent method () [32]
- Hybrid modified improved gradient descent method () [32]
3. Dolan–Moré Performance Profiles and Comparisons
- The codes are written in the Visual C++ programming language and run on a workstation with an Intel(R) Core(TM) 2.3 GHz processor.
- The Backtracking parameter values taken are $\sigma = 0.0001$ and $\beta = 0.8$. These are the standard values for the Backtracking parameters applied in various optimization models with the Backtracking algorithm [2,14,15,20,21,22,28,29,30,31]. This choice means that only a small portion of the decrease predicted by the linear approximation at the current point is accepted.
- The stopping criteria are $\lVert g_k \rVert \le 10^{-6}$ and $\lvert f(x_{k+1}) - f(x_k) \rvert \le 10^{-16}\,(1 + \lvert f(x_k) \rvert)$.
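In code form, this test could be sketched as follows (thresholds as stated above; a hypothetical helper, not the authors’ implementation):

```cpp
#include <cmath>

// Stopping test: ||g_k|| <= 1e-6  and
// |f(x_{k+1}) - f(x_k)| <= 1e-16 * (1 + |f(x_k)|).
bool should_stop(double grad_norm, double f_next, double f_cur) {
    return grad_norm <= 1e-6 &&
           std::fabs(f_next - f_cur) <= 1e-16 * (1.0 + std::fabs(f_cur));
}
```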
The test functions, taken from the collection in [35], are:
1. Extended Penalty
2. Perturbed Quadratic
3. Raydan 1
4. Diagonal 1
5. Diagonal 3
6. Generalized Tridiagonal 1
7. Diagonal 4
8. Extended Himmelblau
9. Quadratic Diagonal Perturbed
10. Quadratic QF1
11. Extended Quadratic Penalty QP1
12. Extended Quadratic Penalty QP2
13. Quadratic QF2
14. Extended EP1
15. Arwhead
16. Almost Perturbed Quadratic
17. Engval1
18. Quartc
19. Generalized Quartic
20. LIARWHD
21. Diagonal 6
22. Tridia
23. Indef
24. Diagonal 9
25. DIXON3DQ
26. NONSCOMP
27. BIGGSB1
28. Power (Cute)
29. Hager
30. Raydan 2
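For reference, the Dolan–Moré construction [34] underlying the reported profiles can be sketched as follows (illustrative names and data layout: metric[p][s] holds, e.g., the number of iterations or the CPU time of solver s on problem p, with a large sentinel value for failures):

```cpp
#include <algorithm>
#include <vector>

// Dolan-More performance profile: rho_s(tau) is the fraction of problems on
// which solver s is within a factor tau of the best solver for that problem.
// Assumes all metric values are positive.
double profile(const std::vector<std::vector<double>>& metric,
               std::size_t s, double tau) {
    std::size_t count = 0;
    for (const auto& row : metric) {
        const double best = *std::min_element(row.begin(), row.end());
        if (row[s] <= tau * best) ++count;    // performance ratio r_{p,s} <= tau
    }
    return static_cast<double>(count) / metric.size();
}
```

Plotting rho_s(tau) for tau >= 1 for each solver gives the performance profile curves compared in this section.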
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Powell, M.J.D. A survey of numerical methods for unconstrained optimization. SIAM Rev. 1970, 12, 79–97.
- Andrei, N. Nonlinear Conjugate Gradient Methods for Unconstrained Optimization; Springer: Berlin/Heidelberg, Germany, 2020.
- Nocedal, J.; Wright, S.J. Numerical Optimization; Springer: New York, NY, USA, 1999.
- Jacoby, S.L.S.; Kowalik, J.S.; Pizzo, J.T. Iterative Methods for Nonlinear Optimization Problems; Prentice-Hall: Englewood Cliffs, NJ, USA, 1977.
- Dennis, J.E.; Schnabel, R.B. Numerical Methods for Unconstrained Optimization and Nonlinear Equations; Prentice-Hall: Englewood Cliffs, NJ, USA, 1983.
- Fletcher, R. Practical Methods of Optimization; Wiley: New York, NY, USA, 2000.
- Luenberger, D.G.; Ye, Y. Linear and Nonlinear Programming; Springer Science + Business Media, LLC: Berlin/Heidelberg, Germany, 2008.
- Sun, W.; Yuan, Y.X. Optimization Theory and Methods: Nonlinear Programming; Springer: New York, NY, USA, 2006.
- Shi, Z.J. Convergence of line search methods for unconstrained optimization. Appl. Math. Comput. 2004, 151, 393–405.
- Wolfe, P. Convergence conditions for ascent methods. SIAM Rev. 1969, 11, 226–235.
- Ortega, J.M.; Rheinboldt, W.C. Iterative Solution of Nonlinear Equations in Several Variables; Academic Press: Cambridge, MA, USA, 1970.
- Armijo, L. Minimization of functions having Lipschitz continuous first partial derivatives. Pac. J. Math. 1966, 16, 1–3.
- Brezinski, C. A classification of quasi-Newton methods. Numer. Algor. 2003, 33, 123–135.
- Stanimirović, P.S.; Miladinović, M.B. Accelerated gradient descent methods with line search. Numer. Algor. 2010, 54, 503–520.
- Andrei, N. An acceleration of gradient descent algorithm with backtracking for unconstrained optimization. Numer. Algor. 2006, 42, 63–73.
- Petrović, M.J.; Stanimirović, P.S. Accelerated double direction method for solving unconstrained optimization problems. Math. Probl. Eng. 2014, 2014, 965104.
- Petrović, M.J. An accelerated double step size method in unconstrained optimization. Appl. Math. Comput. 2015, 250, 309–319.
- Stanimirović, P.S.; Petrović, M.J.; Milovanović, G.V. A transformation of accelerated double step size method for unconstrained optimization. Math. Probl. Eng. 2015, 2015, 283679.
- Petrović, M.J.; Valjarević, D.; Ilić, D.; Valjarević, A.; Mladenović, J. An improved modification of accelerated double direction and double step-size optimization schemes. Mathematics 2022, 10, 259.
- Barzilai, J.; Borwein, J.M. Two-point step size gradient methods. IMA J. Numer. Anal. 1988, 8, 141–148.
- Miladinović, M.; Stanimirović, P.S.; Miljković, S. Scalar correction method for solving large scale unconstrained minimization problems. J. Optim. Theory Appl. 2011, 151, 304–320.
- Andrei, N. A new three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algor. 2014, 68, 305–321.
- Djuranović-Miličić, N.I.; Gardašević-Filipović, M. A multi-step curve search algorithm in nonlinear optimization: Nondifferentiable convex case. Facta Univ. Ser. Math. Inform. 2010, 25, 11–24.
- Picard, E. Mémoire sur la théorie des équations aux dérivées partielles et la méthode des approximations successives. J. Math. Pures Appl. 1890, 6, 145–210.
- Mann, W.R. Mean value methods in iteration. Proc. Am. Math. Soc. 1953, 4, 506–510.
- Ishikawa, S. Fixed points by a new iteration method. Proc. Am. Math. Soc. 1974, 44, 147–150.
- Khan, S.H. A Picard–Mann hybrid iterative process. Fixed Point Theory Appl. 2013, 2013, 69.
- Petrović, M.J.; Rakočević, V.; Kontrec, N.; Panić, S.; Ilić, D. Hybridization of accelerated gradient descent method. Numer. Algor. 2018, 79, 769–786.
- Petrović, M.J.; Stanimirović, P.S.; Kontrec, N.; Mladenović, J. Hybrid modification of accelerated double direction method. Math. Probl. Eng. 2018, 2018, 1523267.
- Petrović, M.J. Hybridization rule applied on accelerated double step size optimization scheme. Filomat 2019, 33, 655–665.
- Petrović, M.J.; Rakočević, V.; Valjarević, D.; Ilić, D. A note on hybridization process applied on transformed double step size model. Numer. Algor. 2020, 85, 449–465.
- Ivanov, B.; Stanimirović, P.S.; Milovanović, G.V.; Djordjević, S.; Brajević, I. Accelerated multiple step-size methods for solving unconstrained optimization problems. Optim. Methods Softw. 2020, 85, 449–465.
- Petrović, M.J.; Panić, S.; Carević, M.M. Initial improvement of the hybrid accelerated gradient descent process. Bull. Aust. Math. Soc. 2018, 98, 331–338.
- Dolan, E.; Moré, J. Benchmarking optimization software with performance profiles. Math. Program. 2002, 91, 201–213.
- Andrei, N. An unconstrained optimization test functions collection. Adv. Model. Optim. 2008, 10, 147–161. Available online: https://camo.ici.ro/journal/vol10/v10a10.pdf (accessed on 16 November 2022).
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).