Multiplex Behavioral Relation Learning for Recommendation via Memory Augmented Transformer Network
2021
Capturing users' precise preferences is of great importance in various
recommender systems (e.g., e-commerce platforms), as it is the basis for
presenting personalized, interesting product lists to individual users.
Although significant progress has been made in modeling relations between
users and items, most existing recommendation techniques focus solely on a
single type of user-item interaction. However, user-item interactive behavior
often involves multiple types (e.g., page view, add-to-favorite, and purchase)
that are inter-dependent in nature. Overlooking these multiplex behavior
relations makes it difficult to recognize the multi-modal contextual signals
across different interaction types, which limits the applicability of current
recommendation methods. To tackle this challenge, this work proposes a
Memory-Augmented Transformer Network (MATN) that enables recommendation with
multiplex behavioral relational information and jointly models type-specific
behavioral context and type-wise behavior inter-dependencies in a fully
automatic manner. In our MATN framework, we first develop a transformer-based
multi-behavior relation encoder to make the learned interaction
representations reflective of cross-type behavior relations. Furthermore, a
memory attention network is proposed to help MATN capture the contextual
signals of different behavior types in a category-specific latent embedding
space. Finally, a cross-behavior aggregation component is introduced to
promote comprehensive collaboration across type-aware interaction behavior
representations and to discriminate their inherent contributions to the
recommendation. Extensive experiments on two benchmark datasets and a
real-world e-commerce user behavior dataset demonstrate significant
improvements of MATN over baselines. Code is available at:
https://github.com/akaxlh/MATN.
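To make the pipeline described above concrete, here is a minimal numpy sketch of its two attention stages: memory attention that distills type-specific context from shared memory slots, followed by cross-behavior aggregation that weights each behavior type's contribution. All dimensions, the random parameters, and variable names (`memory`, `gate`, etc.) are illustrative assumptions, not the paper's actual architecture or trained weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): 3 behavior types
# (page view, add-to-favorite, purchase), embedding dim 8, 4 memory slots.
n_types, d, n_slots = 3, 8, 4

# Type-specific interaction embeddings for one user-item pair,
# e.g. produced by a transformer-based multi-behavior relation encoder.
behavior_emb = rng.standard_normal((n_types, d))

# Shared memory slots standing in for learned category-specific context.
memory = rng.standard_normal((n_slots, d))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Memory attention: each behavior type attends over the memory slots
# to pick up type-specific contextual signals.
slot_scores = softmax(behavior_emb @ memory.T / np.sqrt(d))  # (n_types, n_slots)
type_context = slot_scores @ memory                          # (n_types, d)

# Cross-behavior aggregation: a learned gate vector scores each type's
# context, and the softmax weights discriminate their contributions
# before summing into one user-item representation.
gate = rng.standard_normal(d)
type_weights = softmax(type_context @ gate)                  # (n_types,)
final_repr = type_weights @ type_context                     # (d,)
```

In a trained model, `memory` and `gate` would be learned parameters, and `type_weights` would expose which behavior types drive a given recommendation.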