Leveraging Meta-path Based Context For Top N Recommendation With Co-attention Mechanism

Authors:
Binbin Hu Beijing University of Posts and Telecommunications
Chuan Shi Beijing University of Posts and Telecommunications
Xin Zhao School of Information, Renmin University of China
Philip S. Yu University of Illinois at Chicago

Introduction:

This paper studies recommendation over heterogeneous information networks (HINs). To construct the meta-path based context, the authors propose a priority based sampling technique that selects high-quality path instances.
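As a rough illustration of priority based sampling of path instances, the sketch below walks a toy HIN along a meta-path, choosing each next node with probability proportional to a precomputed priority score. The graph, node names, and scores are invented for illustration; the paper's actual priority definition is not reproduced here.

```python
import random

def sample_path(graph, priority, start, meta_path, rng=random.Random(0)):
    """Sample one path instance along a meta-path, picking each next node
    with probability proportional to its edge priority score
    (higher score = presumed higher-quality path instance)."""
    path = [start]
    node = start
    for node_type in meta_path[1:]:
        # Keep only neighbors whose type matches the next meta-path step
        # (here the type is encoded as the first character of the name).
        candidates = [n for n in graph.get(node, []) if n[0] == node_type]
        if not candidates:
            return None  # dead end: no neighbor of the required type
        weights = [priority[(node, n)] for n in candidates]
        node = rng.choices(candidates, weights=weights, k=1)[0]
        path.append(node)
    return path

# Hypothetical toy HIN for the meta-path User-Movie-User-Movie.
graph = {
    "u1": ["m1", "m2"],
    "m1": ["u2"], "m2": ["u3"],
    "u2": ["m3"], "u3": ["m3"],
}
priority = {("u1", "m1"): 0.9, ("u1", "m2"): 0.1,
            ("m1", "u2"): 1.0, ("m2", "u3"): 1.0,
            ("u2", "m3"): 1.0, ("u3", "m3"): 1.0}

path = sample_path(graph, priority, "u1", ["u", "m", "u", "m"])
```

In this toy setup the walk from `u1` strongly prefers the high-priority edge to `m1`, mimicking how low-quality path instances are filtered out before they reach the neural model.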

Abstract:

Heterogeneous information network (HIN) has been widely adopted in recommender systems due to its excellence in modeling complex context information. Although existing HIN based recommendation methods have achieved performance improvement to some extent, they have two major shortcomings. First, these models seldom learn an explicit representation for path or meta-path in the recommendation task. Second, they do not consider the mutual effect between the meta-path and the involved user-item pair in an interaction. To address these issues, we develop a novel deep neural network with the co-attention mechanism for leveraging rich meta-path based context for top-N recommendation. We elaborately design a three-way neural interaction model by explicitly incorporating meta-path based context. To construct the meta-path based context, we propose to use a priority based sampling technique to select high-quality path instances. Our model is able to learn effective representations for users, items and meta-path based context for implementing a powerful interaction function. The co-attention mechanism improves the representations for meta-path based context, users and items in a mutually enhancing way. Extensive experiments on three real-world datasets have demonstrated the effectiveness of the proposed model. In particular, the proposed model performs well in the cold-start scenario and has potentially good interpretability for the recommendation results.
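The three-way interaction with co-attention can be sketched in miniature: attend over the meta-path context vectors conditioned on the user-item pair, then use the aggregated context to re-weight the user and item representations. This is a simplified NumPy sketch with random toy embeddings; the dimensions, the dot-product scoring, and the sigmoid gating are assumptions standing in for the paper's learned attention networks.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8          # embedding size (assumption)
n_paths = 3    # number of meta-paths (assumption)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy stand-ins for learned user, item, and per-meta-path context vectors.
user = rng.normal(size=d)
item = rng.normal(size=d)
paths = rng.normal(size=(n_paths, d))

# Step 1: attention over meta-paths, conditioned on the user-item pair.
# Each meta-path is scored by dot products with the user and the item
# (the paper uses a small learned network; this is a simplification).
alpha = softmax(paths @ user + paths @ item)  # meta-path attention weights
context = alpha @ paths                       # aggregated meta-path context

# Step 2: the aggregated context gates the user and item embeddings,
# so context and user/item representations enhance each other.
user_att = sigmoid(user + context) * user
item_att = sigmoid(item + context) * item

# Final step: the three refined representations would be concatenated
# and fed to an MLP that predicts the interaction score (not shown).
rep = np.concatenate([user_att, context, item_att])
```

The key point the sketch captures is the mutual dependence: which meta-paths matter depends on the user-item pair, and the refined user and item representations in turn depend on the attended context.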
