Other Conjugate Gradient Methods

2020 
As already seen, the conjugate gradient algorithms presented so far rest on a few principles: hybridization or modification of the standard schemes, memoryless or scaled memoryless BFGS preconditioning, or the three-term concept. The corresponding conjugate gradient algorithms are defined by the descent condition, by the “pure” conjugacy or the Dai–Liao conjugacy conditions, or by the minimization of a one- or two-parameter quadratic approximation of the objective function. A number of convergence results are available, mainly based on the Zoutendijk and Nocedal conditions under the Wolfe line search (Dai, 2011). These algorithms perform well numerically and are able to solve large-scale unconstrained optimization problems and applications. However, within the framework of conjugate gradient methods, which remains a very active area of research, a number of other computational schemes have been introduced to further improve numerical performance. They are too numerous to present in full in this study; a short description of some of them follows.
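For reference, a minimal summary of the conditions mentioned above, written in the standard notation (an assumption, since the notation is not fixed in this abstract): $g_k = \nabla f(x_k)$, $d_k$ the search direction, $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and $t > 0$ the usual Dai–Liao scaling parameter.

\[
g_k^T d_k < 0 \quad \text{(descent condition)},
\]
\[
d_{k+1}^T y_k = 0 \quad \text{(pure conjugacy)}, \qquad
d_{k+1}^T y_k = -t\, s_k^T g_{k+1} \quad \text{(Dai--Liao conjugacy)},
\]
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \rho\, \alpha_k\, g_k^T d_k, \qquad
\nabla f(x_k + \alpha_k d_k)^T d_k \ge \sigma\, g_k^T d_k, \quad 0 < \rho < \sigma < 1 \quad \text{(Wolfe line search)},
\]
\[
\sum_{k \ge 0} \frac{(g_k^T d_k)^2}{\|d_k\|^2} < \infty \quad \text{(Zoutendijk condition)}.
\]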