Scalable Robust Matrix Factorization With Nonconvex Loss

Authors:
Quanming Yao 4Paradigm
James Kwok Hong Kong University of Science and Technology

Abstract:

Robust matrix factorization (RMF), which uses the $\ell_1$-loss, often outperforms standard matrix factorization using the $\ell_2$-loss, particularly when outliers are present. The state-of-the-art RMF solver is the RMF-MM algorithm, which, however, cannot utilize data sparsity. Moreover, sometimes even the (convex) $\ell_1$-loss is not robust enough. In this paper, we propose the use of nonconvex loss to enhance robustness. To address the resultant difficult optimization problem, we use majorization-minimization (MM) optimization and propose a new MM surrogate. To improve scalability, we exploit data sparsity and optimize the surrogate via its dual with the accelerated proximal gradient algorithm. The resultant algorithm has low time and space complexities and is guaranteed to converge to a critical point. Extensive experiments demonstrate its superiority over the state-of-the-art in terms of both accuracy and scalability.
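To make the idea concrete, here is a toy sketch of majorization-minimization for matrix factorization under a nonconvex robust loss. It is not the paper's algorithm (which uses a different surrogate, exploits sparsity, and solves the surrogate in the dual with accelerated proximal gradient); it instead uses the classical half-quadratic/IRLS surrogate for the Welsch loss, where each MM step majorizes the loss by a weighted $\ell_2$ problem solved by alternating least squares. All names, the loss choice, and parameter values are illustrative assumptions.

```python
import numpy as np

def robust_mf_mm(X, rank=2, n_outer=30, n_inner=5, sigma=2.0):
    """Toy MM scheme for robust MF with the nonconvex Welsch loss
    rho(r) = 1 - exp(-r^2 / (2 sigma^2)).

    Each outer step majorizes the loss at the current residuals by a
    weighted l2 surrogate (classical IRLS, NOT the paper's surrogate),
    then approximately minimizes the surrogate by alternating
    row-/column-wise weighted least squares on the factors U, V.
    """
    m, n = X.shape
    lam = 1e-3  # small ridge term for numerical stability
    # Warm-start from a truncated SVD so residuals start modest.
    Uf, s, Vt = np.linalg.svd(X, full_matrices=False)
    U = Uf[:, :rank] * np.sqrt(s[:rank])
    V = Vt[:rank].T * np.sqrt(s[:rank])
    for _ in range(n_outer):
        R = X - U @ V.T
        # MM weights from the current residuals: outliers get weight ~0.
        W = np.exp(-R**2 / (2 * sigma**2))
        for _ in range(n_inner):  # minimize the weighted-l2 surrogate
            for i in range(m):   # weighted least squares for each row of U
                A = (V * W[i][:, None]).T @ V + lam * np.eye(rank)
                U[i] = np.linalg.solve(A, (V * W[i][:, None]).T @ X[i])
            for j in range(n):   # weighted least squares for each row of V
                A = (U * W[:, j][:, None]).T @ U + lam * np.eye(rank)
                V[j] = np.linalg.solve(A, (U * W[:, j][:, None]).T @ X[:, j])
    return U, V

# Toy data: a rank-2 matrix with a handful of gross outliers.
rng = np.random.default_rng(1)
U0 = rng.standard_normal((30, 2))
V0 = rng.standard_normal((20, 2))
X = U0 @ V0.T
X[rng.integers(0, 30, 10), rng.integers(0, 20, 10)] += 20.0
U, V = robust_mf_mm(X, rank=2)
err = np.median(np.abs(X - U @ V.T))  # small: the fit ignores outliers
```

Because the surrogate down-weights entries with large residuals, the recovered factors essentially ignore the corrupted cells, which is the robustness behavior the abstract refers to; the paper's contribution is making this kind of MM scheme scale by exploiting sparsity and solving the surrogate in the dual.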