Accelerating Frank-Wolfe with Weighted Average Gradients

2021 
Relying on a conditional-gradient-based iteration, the Frank-Wolfe (FW) algorithm has been a popular solver for constrained convex optimization problems in signal processing and machine learning, thanks to its low per-iteration complexity. The present contribution broadens its scope by replacing the gradient in each FW subproblem with a weighted average of gradients. This generalization speeds up the convergence of FW by alleviating its zigzag behavior. A geometric interpretation of the averaged gradients is provided, and convergence guarantees are established for three different weight combinations. Numerical comparisons show the effectiveness of the proposed methods.
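The idea in the abstract can be sketched as follows: run standard Frank-Wolfe, but feed the linear minimization oracle a weighted average of past gradients instead of the current gradient alone. The exponential weighting `delta`, the least-squares objective, and the simplex constraint below are illustrative assumptions, not the paper's specific weight combinations or test problems.

```python
import numpy as np

def fw_averaged(grad, lmo, x0, iters=200, delta=0.5):
    """Frank-Wolfe where each subproblem uses a weighted average of
    gradients. The exponential weights here are a generic choice for
    illustration; the paper analyzes three specific weight combinations."""
    x = x0.copy()
    g_avg = np.zeros_like(x0)
    for t in range(iters):
        # weighted average of the current and past gradients
        g_avg = (1 - delta) * g_avg + delta * grad(x)
        s = lmo(g_avg)          # linear minimization over the constraint set
        gamma = 2.0 / (t + 2)   # standard FW step size
        x = x + gamma * (s - x) # convex combination keeps x feasible
    return x

# Illustrative test problem: least squares over the probability simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = rng.random(10)
x_true /= x_true.sum()
b = A @ x_true

f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)

def simplex_lmo(g):
    # Vertex of the simplex minimizing <g, s>: a standard basis vector.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x0 = np.ones(10) / 10
x = fw_averaged(grad, simplex_lmo, x0)
```

Because the averaged direction changes more slowly than the instantaneous gradient, consecutive LMO vertices alternate less, which is the zigzag-damping effect the abstract describes.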