Advanced Hybrid Conjugate Gradient Algorithms with Proven Convergence and Superior Efficiency


Al Arabo M. D., YILMAZ F. N.

Mathematical Modelling of Engineering Problems, vol. 12, no. 9, pp. 3083-3093, 2025 (Scopus)

  • Publication Type: Article / Full Article
  • Volume: 12 Issue: 9
  • Publication Date: 2025
  • DOI: 10.18280/mmep.120912
  • Journal Name: Mathematical Modelling of Engineering Problems
  • Journal Indexes: Scopus
  • Page Numbers: pp. 3083-3093
  • Keywords: adaptive algorithms, benchmark functions, nonlinear conjugate gradient, sufficient descent, unconstrained optimization
  • Gazi University Affiliated: Yes

Abstract

Non-convex optimization remains a fundamental challenge in applied mathematics, engineering, and data science, with applications spanning machine learning and image processing. Building on the recent Dilara, Ebru, and Ibrahim (DEI) method, a conjugate gradient (CG) algorithm designed for non-convex problems, this paper proposes three novel hybrid CG algorithms (NEW1, NEW2, and NEW3) that aim to accelerate convergence by adaptively updating the rule that forms each new search direction (the conjugacy parameter β), while preserving theoretical guarantees of global convergence and sufficient descent. We provide detailed convergence proofs for each algorithm under standard assumptions. Comprehensive evaluations on 32 benchmark functions of varying dimensionality and conditioning show average reductions of up to 49% in the number of iterations (NOI) and up to 60% in the number of function evaluations (NOF) relative to DEI. In practical terms, these gains translate into lower computational cost on challenging real-world problems, such as engineering design and machine learning optimization, without sacrificing robustness, positioning NEW1–NEW3 as efficient, theoretically grounded alternatives to conventional CG methods.
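To illustrate the general structure the abstract describes, the sketch below shows a nonlinear CG loop in which the conjugacy parameter β is chosen by a hybrid rule. This is a minimal illustration only: it uses the classical hybrid choice β = max(0, min(β_PR, β_FR)), combining the Polak–Ribière and Fletcher–Reeves formulas, since the paper's own NEW1–NEW3 update rules are not given in the abstract. The Armijo backtracking line search and the restart test are likewise generic assumptions.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient with a hybrid beta rule.

    Illustrative sketch: beta = max(0, min(beta_PR, beta_FR)) is a
    well-known hybrid choice, NOT the paper's NEW1-NEW3 formulas.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search enforcing the Armijo condition
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Hybrid beta: Polak-Ribiere clipped by Fletcher-Reeves, floored at 0
        beta_fr = g_new.dot(g_new) / g.dot(g)
        beta_pr = g_new.dot(g_new - g) / g.dot(g)
        beta = max(0.0, min(beta_pr, beta_fr))
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:  # restart if the direction is not a descent one
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function, a standard non-convex benchmark
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = hybrid_cg(rosen, rosen_grad, [-1.2, 1.0])
```

Performance profiles of such methods are typically reported, as in the paper, via the number of iterations and function evaluations needed to reach a gradient-norm tolerance.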