Analysis of the Variance Reduction in SVRG and a New Acceleration Method

2018 
Stochastic gradient descent is a popular method in large-scale optimization for machine learning but suffers from slow convergence. In recent years, the stochastic variance reduced gradient (SVRG) method has been proposed to remedy this problem. Although many variants of SVRG have been studied, the analysis of their variance has not been thoroughly discussed. In this paper, we propose a general framework, denoted epoch-update-identification (EUI), which is an abstraction of the existing variants of SVRG. Under this framework, i.e., EUI, we then provide a general analysis of the variance reduction technique from a new perspective. Additionally, previous variants of SVRG must keep a snapshot of the full gradient for each epoch, which is computationally expensive. In this paper, we also propose a new variant of SVRG named sampleVR, which estimates the snapshot of the full gradient using a sampling strategy, thus significantly decreasing the gradient complexity. Both the theoretical analysis and extensive empirical studies show that sampleVR achieves a good tradeoff between convergence performance and gradient complexity, and thus makes the training loss converge faster than its counterparts.
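The abstract's core ideas are the SVRG variance-reduced update and the sampleVR idea of estimating the per-epoch snapshot gradient from a subsample rather than the full dataset. The following is a minimal sketch of that combination under illustrative assumptions (a least-squares objective, the snapshot sample size B, the step size, and the epoch length are all assumed here, not taken from the paper).

```python
# Sketch: SVRG-style updates with a *sampled* snapshot gradient (the sampleVR idea).
# The objective, B, eta, and epoch length are illustrative assumptions, not the paper's settings.
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 20
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.01 * rng.normal(size=n)

def grad_i(w, i):
    # Gradient of the i-th component f_i(w) = 0.5 * (a_i^T w - b_i)^2.
    return (A[i] @ w - b[i]) * A[i]

def svrg_sampled_snapshot(epochs=20, m=None, eta=1e-3, B=100):
    """SVRG-style optimizer whose snapshot (anchor) gradient is estimated
    from a random subsample of size B instead of the full dataset."""
    m = m or n                      # inner-loop length per epoch
    w = np.zeros(d)
    for _ in range(epochs):
        w_snap = w.copy()
        # Estimate the snapshot gradient from B sampled components
        # (plain SVRG would average over all n components here).
        S = rng.choice(n, size=B, replace=False)
        mu = np.mean([grad_i(w_snap, i) for i in S], axis=0)
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient: its variance shrinks
            # as the iterate w approaches the snapshot point w_snap.
            g = grad_i(w, i) - grad_i(w_snap, i) + mu
            w -= eta * g
    return w

w_hat = svrg_sampled_snapshot()
print("distance to x_true:", np.linalg.norm(w_hat - x_true))
```

Because only B component gradients are evaluated per snapshot, the per-epoch gradient complexity drops from n + m to B + m evaluations, which is the tradeoff between convergence performance and gradient complexity the abstract refers to.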