Adaptive Alternating Stochastic Gradient Descent Algorithms for Large-Scale Latent Factor Analysis

2021 
Latent factor analysis (LFA) is highly efficient for knowledge discovery from the high-dimensional and sparse (HiDS) matrices frequently encountered in big data and web-service applications. A stochastic gradient descent (SGD) algorithm is commonly adopted as the learning algorithm for LFA owing to its high efficiency. However, its sequential nature makes it less scalable when processing large-scale data. Although an alternating SGD algorithm decouples an LFA process to achieve parallelization, its performance depends on hyper-parameters that are highly expensive to tune. To address this issue, this paper presents three adaptive alternating SGD algorithms, leading to three Parallel Adaptive LFA (PAL) models for LFA on large-scale HiDS matrices. Experimental studies on HiDS matrices from industrial service applications show that the proposed PAL models significantly outperform existing models in both convergence rate and computational efficiency, while achieving competitive prediction accuracy for missing data.
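To illustrate the alternating SGD scheme the paper builds on, below is a minimal Python sketch, assuming the HiDS matrix is given as (row, column, value) triples. The function name, fixed learning rate `eta`, regularization weight `lam`, and rank `k` are illustrative assumptions, not the paper's PAL models; the paper's contribution is making the learning rate adaptive, which would replace the fixed `eta` here.

```python
import numpy as np

def alternating_sgd_lfa(triples, n_rows, n_cols, k=20, eta=0.01, lam=0.05, epochs=50):
    """Sketch of alternating SGD for LFA on a sparse matrix of (u, i, r) triples.

    Hyper-parameter values are illustrative, not tuned; the paper's adaptive
    variants adjust eta automatically instead of fixing it.
    """
    rng = np.random.default_rng(0)
    P = rng.normal(scale=0.1, size=(n_rows, k))   # row latent factors
    Q = rng.normal(scale=0.1, size=(n_cols, k))   # column latent factors

    # Group observed entries so each row/column can be processed independently.
    by_row, by_col = {}, {}
    for u, i, r in triples:
        by_row.setdefault(u, []).append((i, r))
        by_col.setdefault(i, []).append((u, r))

    for _ in range(epochs):
        # Phase 1: hold Q fixed; each row of P then updates independently,
        # so this loop is trivially parallelizable across rows.
        for u, entries in by_row.items():
            for i, r in entries:
                err = r - P[u] @ Q[i]             # error on one observed entry
                P[u] += eta * (err * Q[i] - lam * P[u])
        # Phase 2: hold P fixed; each column of Q updates independently.
        for i, entries in by_col.items():
            for u, r in entries:
                err = r - P[u] @ Q[i]
                Q[i] += eta * (err * P[u] - lam * Q[i])
    return P, Q
```

The key design point is the decoupling: once one factor matrix is frozen, the instance losses for different rows (or columns) of the other share no parameters, so the inner loops can be distributed across workers without gradient conflicts, unlike plain sequential SGD over all entries.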