Variance Reduced Stochastic Proximal Algorithm for AUC Maximization

2021 
Stochastic Gradient Descent has been widely studied with classification accuracy as a performance measure. However, these stochastic algorithms are not applicable when non-decomposable pairwise performance measures are used, such as the Area under the ROC curve (AUC), a standard performance metric when the classes are imbalanced. Several algorithms have been proposed for optimizing AUC, one of the most recent being the Stochastic Proximal AUC Maximization algorithm (SPAM). However, stochastic gradient descent suffers from high variance, leading to very slow convergence. To combat this issue, several variance-reduced methods have been proposed with faster convergence guarantees than vanilla stochastic gradient descent. Again, these variance-reduced methods are not applicable when non-decomposable performance measures are used. In this paper, we develop a Variance Reduced Stochastic Proximal algorithm for AUC Maximization (VRSPAM) that combines these two lines of work, optimizing a non-decomposable performance metric with guaranteed faster convergence. We perform an in-depth theoretical and empirical analysis to demonstrate that our algorithm converges faster than existing state-of-the-art algorithms for the AUC maximization problem.
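The abstract does not spell out the VRSPAM update, but the core idea it combines, SVRG-style variance reduction with a proximal step, can be illustrated generically. The sketch below is a minimal SVRG proximal-gradient loop for a decomposable objective with an L1 regularizer; the component-gradient function, the soft-thresholding prox, and all parameter names are illustrative assumptions, not the paper's actual algorithm for the pairwise AUC objective.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding); a stand-in
    # for whatever proximal step the regularizer requires.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def svrg_prox(grad_i, prox, w0, n, step, epochs, inner):
    """Generic SVRG-style variance-reduced proximal gradient (illustrative).

    grad_i(w, i): gradient of the i-th component loss at w.
    prox(w, t):   proximal step for the regularizer with step size t.
    """
    w = w0.copy()
    for _ in range(epochs):
        # Snapshot point: compute the full gradient once per epoch.
        w_ref = w.copy()
        full_grad = np.mean([grad_i(w_ref, i) for i in range(n)], axis=0)
        for _ in range(inner):
            i = np.random.randint(n)
            # Variance-reduced gradient estimate: unbiased, with variance
            # shrinking as w approaches w_ref.
            g = grad_i(w, i) - grad_i(w_ref, i) + full_grad
            w = prox(w - step * g, step)
    return w
```

For example, on a least-squares problem with `grad_i = lambda w, i: A[i] * (A[i] @ w - b[i])` and `prox = lambda w, t: soft_threshold(w, t * lam)`, the iterates converge much faster than plain proximal SGD because the correction term `- grad_i(w_ref, i) + full_grad` cancels most of the stochastic noise near the snapshot.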