Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions

2020 
We study the trade-offs between convergence rate and robustness to gradient errors in designing a first-order algorithm. We focus on gradient descent and accelerated gradient (AG) methods for minimizing smooth strongly convex functions.
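As a hypothetical illustration of the two methods the abstract compares, the sketch below runs plain gradient descent and Nesterov's constant-momentum accelerated gradient scheme on a strongly convex quadratic, with an optional additive gradient-noise term standing in for the gradient errors the paper studies. All parameter values (dimension, conditioning, step sizes) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
mu, L = 1.0, 100.0  # strong convexity and smoothness constants (assumed)
A = np.diag(np.linspace(mu, L, n))  # f(x) = 0.5 * x^T A x

def grad(x, sigma):
    # True gradient A x plus isotropic noise of scale sigma (the "gradient error").
    return A @ x + sigma * rng.standard_normal(n)

def gd(x0, sigma, iters):
    # Gradient descent with the standard step size 1/L.
    x = x0.copy()
    for _ in range(iters):
        x = x - (1.0 / L) * grad(x, sigma)
    return x

def ag(x0, sigma, iters):
    # Nesterov's accelerated gradient with constant momentum for
    # strongly convex objectives: beta = (sqrt(kappa)-1)/(sqrt(kappa)+1).
    x, y = x0.copy(), x0.copy()
    beta = (np.sqrt(L / mu) - 1) / (np.sqrt(L / mu) + 1)
    for _ in range(iters):
        x_next = y - (1.0 / L) * grad(y, sigma)
        y = x_next + beta * (x_next - x)
        x = x_next
    return x

f = lambda x: 0.5 * x @ A @ x
x0 = np.ones(n)
# With exact gradients (sigma = 0), AG reaches a much smaller objective
# value than GD in the same number of iterations.
print(f(gd(x0, 0.0, 100)), f(ag(x0, 0.0, 100)))
```

With noisy gradients (sigma > 0) the same script exhibits the trade-off the abstract alludes to: the momentum that speeds up AG also lets gradient errors accumulate, so its advantage over GD shrinks as sigma grows.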