Another Updated Parameter for the Hestenes-Stiefel Conjugate Gradient Method

Osman Omer Osman Yousif, Raouf Ziadi, Mohammed A. Saleh, Abdulgader Z. Almaymuni

Abstract

The conjugate gradient (CG) methods are among the most popular methods for solving linear and nonlinear unconstrained optimization problems, especially large-scale ones, because they require little memory and possess strong local and global convergence properties. The Hestenes-Stiefel (HS) method usually gives good numerical results in practical computation; theoretically, however, its convergence properties are uncertain. To address the convergence failure of the HS method, many choices for its update parameter have been proposed, such as those of Gilbert and Nocedal in 1992, of Hager and Zhang in 2005, and of Yousif et al. in 2022. In this paper, motivated by these updated parameters, we propose another updated parameter for HS, and hence another CG method, which inherits all the convergence properties of the methods of Gilbert and Nocedal, Hager and Zhang, and Yousif et al. while achieving better numerical results. To demonstrate the efficiency and robustness of the new modified method in practice, numerical experiments were conducted.
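
For background, the classical HS update parameter is β_k^HS = g_{k+1}^T y_k / (d_k^T y_k) with y_k = g_{k+1} − g_k, and the search direction is d_{k+1} = −g_{k+1} + β_k d_k; Gilbert and Nocedal's remedy restricts the parameter to be nonnegative, β_k^HS+ = max{0, β_k^HS}. Since the abstract does not state the paper's new parameter, the following minimal Python sketch illustrates only the classical HS formula with this HS+ truncation. The function name hs_plus_cg, the backtracking Armijo line search (a simple stand-in for the strong Wolfe line search assumed by the convergence theory), and all tolerances are illustrative assumptions, not the authors' method.

```python
import numpy as np

def hs_plus_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    """Nonlinear CG with the classical Hestenes-Stiefel parameter,
    truncated to be nonnegative (the HS+ choice of Gilbert-Nocedal).
    Illustrative sketch only; not the paper's new parameter."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search: a simple stand-in for the
        # (strong) Wolfe line search used in the convergence theory.
        alpha, c1 = 1.0, 1e-4
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                      # y_k = g_{k+1} - g_k
        denom = d.dot(y)
        beta_hs = g_new.dot(y) / denom if denom != 0.0 else 0.0
        beta = max(0.0, beta_hs)           # HS+: beta_k = max{0, beta_k^HS}
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:            # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the 2-D Rosenbrock function from a standard start point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(hs_plus_cg(f, grad, [-1.2, 1.0]))   # approaches the minimizer (1, 1)
```

The nonnegativity restriction is what distinguishes HS+ from plain HS: it prevents the jamming behavior that can defeat the global convergence analysis, which is the class of issues the updated parameters cited in the abstract are designed to resolve.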

References

  1. B.T. Polyak, The Conjugate Gradient Method in Extremal Problems, USSR Comput. Math. Math. Phys. 9 (1969), 94–112. https://doi.org/10.1016/0041-5553(69)90035-4.
  2. E.D. Dolan, J.J. Moré, Benchmarking Optimization Software with Performance Profiles, Math. Program. 91 (2002), 201–213. https://doi.org/10.1007/s101070100263.
  3. E. Polak, G. Ribière, Note sur la convergence de méthodes de directions conjuguées, Rev. Fr. Inform. Rech. Oper. Ser. Rouge 3 (1969), 35–43. https://doi.org/10.1051/m2an/196903R100351.
  4. G. Yuan, X. Lu, A Modified PRP Conjugate Gradient Method, Ann. Oper. Res. 166 (2009), 73–90. https://doi.org/10.1007/s10479-008-0420-4.
  5. G. Zoutendijk, Nonlinear Programming, Computational Methods, in: J. Abadie (Ed.), Integer and Nonlinear Programming, North-Holland, Amsterdam, 1970, pp. 37–86.
  6. J.C. Gilbert, J. Nocedal, Global Convergence Properties of Conjugate Gradient Methods for Optimization, SIAM J. Optim. 2 (1992), 21–42. https://doi.org/10.1137/0802003.
  7. L. Zhang, An Improved Wei–Yao–Liu Nonlinear Conjugate Gradient Method for Optimization Computation, Appl. Math. Comput. 215 (2009), 2269–2274. https://doi.org/10.1016/j.amc.2009.08.016.
  8. M. Al-Baali, Descent Property and Global Convergence of the Fletcher–Reeves Method with Inexact Line Search, IMA J. Numer. Anal. 5 (1985), 121–124. https://doi.org/10.1093/imanum/5.1.121.
  9. M.J.D. Powell, Convergence Properties of Algorithms for Nonlinear Optimization, SIAM Rev. 28 (1986), 487–500. https://doi.org/10.1137/1028154.
  10. M.R. Hestenes, E. Stiefel, Methods of Conjugate Gradients for Solving Linear Systems, J. Res. Natl. Bur. Stand. 49 (1952), 409–436.
  11. N. Andrei, An Unconstrained Optimization Test Functions Collection, Adv. Model. Optim. 10 (2008), 147–161.
  12. O. Omer, M. Mamat, A. Abashar, M. Rivaie, The Global Convergence Properties of a Conjugate Gradient Method, AIP Conf. Proc., Kuala Lumpur, Malaysia, 2014, pp. 286–295. https://doi.org/10.1063/1.4882501.
  13. O. Omer, M. Rivaie, M. Mamat, Z. Amani, A New Conjugate Gradient Method with Sufficient Descent without Any Line Search for Unconstrained Optimization, AIP Conf. Proc. 1643 (2015), 602–608. https://doi.org/10.1063/1.4907500.
  14. O. Omer, M. Rivaie, M. Mamat, A. Abdalla, A New Conjugate Gradient Method and Its Global Convergence under the Exact Line Search, AIP Conf. Proc. 1635 (2014), 639–646. https://doi.org/10.1063/1.4903649.
  15. O.O.O. Yousif, A. Abdelrahman, M. Mohammed, M.A. Saleh, A Sufficient Condition for the Global Convergence of Conjugate Gradient Methods for Solving Unconstrained Optimisation Problems, Sci. J. King Faisal Univ. 23 (2022), 106–112. https://doi.org/10.37575/b/sci/220013.
  16. O.O.O. Yousif, The Convergence Properties of RMIL+ Conjugate Gradient Method under the Strong Wolfe Line Search, Appl. Math. Comput. 367 (2020), 124777. https://doi.org/10.1016/j.amc.2019.124777.
  17. O.O.O. Yousif, M.A.Y. Mohammed, M.A. Saleh, M.K. Elbashir, A Criterion for the Global Convergence of Conjugate Gradient Methods under Strong Wolfe Line Search, J. King Saud Univ. Sci. 34 (2022), 102281. https://doi.org/10.1016/j.jksus.2022.102281.
  18. P. Wolfe, Convergence Conditions for Ascent Methods, SIAM Rev. 11 (1969), 226–235. https://doi.org/10.1137/1011036.
  19. P. Wolfe, Convergence Conditions for Ascent Methods. II: Some Corrections, SIAM Rev. 13 (1971), 185–188. https://doi.org/10.1137/1013035.
  20. R. Fletcher, Practical Methods of Optimization, Wiley, New York, 1997.
  21. R. Fletcher, C.M. Reeves, Function Minimization by Conjugate Gradients, Comput. J. 7 (1964), 149–154. https://doi.org/10.1093/comjnl/7.2.149.
  22. W.W. Hager, H. Zhang, A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search, SIAM J. Optim. 16 (2005), 170–192. https://doi.org/10.1137/030601880.
  23. Y. Liu, C. Storey, Efficient Generalized Conjugate Gradient Algorithms, Part 1: Theory, J. Optim. Theory Appl. 69 (1991), 129–137. https://doi.org/10.1007/BF00940464.
  24. Y.H. Dai, Y. Yuan, A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property, SIAM J. Optim. 10 (1999), 177–182. https://doi.org/10.1137/S1052623497318992.
  25. Y. Yuan, W. Sun, Theory and Methods of Optimization, Science Press, Beijing, 1999.
  26. Z. Dai, F. Wen, A Modified CG-DESCENT Method for Unconstrained Optimization, J. Comput. Appl. Math. 235 (2011), 3332–3341. https://doi.org/10.1016/j.cam.2011.01.046.
  27. Z. Wei, G. Li, L. Qi, New Nonlinear Conjugate Gradient Formulas for Large-Scale Unconstrained Optimization Problems, Appl. Math. Comput. 179 (2006), 407–430. https://doi.org/10.1016/j.amc.2005.11.150.
  28. Z. Wei, S. Yao, L. Liu, The Convergence Properties of Some New Conjugate Gradient Methods, Appl. Math. Comput. 183 (2006), 1341–1350. https://doi.org/10.1016/j.amc.2006.05.150.