Variational Self-attention Network for Sequential Recommendation

2021 
Sequential recommendation has become an attractive topic in recommender systems. Existing sequential recommendation methods, including those based on the state-of-the-art self-attention mechanism, usually employ deterministic neural networks that represent user preferences as fixed points in the latent feature space. However, a fixed-point vector cannot capture the uncertainty and dynamics of user preferences that are prevalent in recommender systems. In this paper, we propose a new Variational Self-Attention Network (VSAN), which introduces a variational autoencoder (VAE) into the self-attention network to capture latent user preferences. Specifically, we represent the obtained self-attention vector as a density via variational inference, whose variance well characterizes the uncertainty of user preferences. Furthermore, we employ self-attention networks to learn the inference and generative processes of the VAE, which captures both long-range and local dependencies. Finally, we evaluate our proposed method VSAN on two public real-world datasets. Our experimental results show the effectiveness of our model compared to state-of-the-art approaches.
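The core idea described in the abstract, projecting the self-attention output to a Gaussian mean and variance and sampling a latent preference via the reparameterization trick, can be sketched as follows. This is a minimal illustrative sketch in NumPy; all dimensions, weight matrices, and the single-head attention layout are placeholder assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Scaled dot-product self-attention over a sequence of item embeddings.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

# Hypothetical dimensions: a sequence of 5 interacted items, embedding size 8.
seq_len, d = 5, 8
X = rng.normal(size=(seq_len, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
H = self_attention(X, Wq, Wk, Wv)

# Variational step: instead of using H as a deterministic (fixed-point)
# preference, project it to a mean and log-variance, then sample the
# latent preference with the reparameterization trick. The learned
# variance is what characterizes the uncertainty of user preferences.
W_mu = rng.normal(size=(d, d)) * 0.1
W_logvar = rng.normal(size=(d, d)) * 0.1
mu, logvar = H @ W_mu, H @ W_logvar
eps = rng.normal(size=mu.shape)
z = mu + np.exp(0.5 * logvar) * eps  # sampled latent user preference

print(z.shape)
```

In a full model the sampled `z` would feed the VAE's generative network (also built from self-attention layers, per the abstract) to score candidate next items, with a KL-divergence term regularizing `mu` and `logvar` toward the prior during training.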