Group event recommendation based on graph multi-head attention network combining explicit and implicit information

2022 
Abstract In event-based social networks (EBSN), group event recommendation has become an important task for helping groups quickly find events that interest them. Existing methods for group event recommendation either consider only one type of information, explicit or implicit, or model the explicit and implicit information separately. However, these methods often suffer from data sparsity or model vector redundancy. In this paper, we present a Graph Multi-head Attention Network (GMAN) model for group event recommendation that integrates the explicit and implicit information in EBSN. Specifically, we first construct a user-explicit graph based on the user's explicit information, such as gender, age, occupation, and the interactions between users and events. Then we build a user-implicit graph based on the user's implicit information, such as friend relationships. Incorporating both explicit and implicit information effectively describes the user's interests and alleviates the data sparsity problem. Considering that a user's explicit and implicit information in EBSN may be correlated, we take the user's explicit vector representation as the input of the implicit information aggregation when modeling with graph neural networks. This unified user modeling solves the aforementioned problem of user model vector redundancy and is also suitable for event modeling. Furthermore, we utilize a multi-head attention network to learn richer implicit information vectors of users and events from multiple perspectives. Finally, to obtain a higher-level group vector representation, we use a vanilla attention mechanism to fuse the different user vectors in a group. Through experiments on two real-world Meetup datasets, we demonstrate that the GMAN model consistently outperforms state-of-the-art methods on group event recommendation.
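To make the pipeline described above concrete, the following is a minimal PyTorch-style sketch, not the authors' implementation. It assumes standard multi-head attention for neighbor aggregation on each graph; the class names (`MultiHeadGraphAttention`, `UnifiedUserEncoder`, `GroupFusion`), dimensions, and neighbor-sampling shapes are all hypothetical. The key idea it illustrates is the unified modeling: the output of the explicit-graph aggregation seeds the implicit-graph aggregation, so each user ends up with a single vector, and a vanilla (single-head additive) attention fuses member vectors into a group vector.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadGraphAttention(nn.Module):
    """Aggregate neighbor vectors with multi-head attention (hypothetical sketch).

    The query is a node's current representation; the keys/values are its
    sampled neighbors' representations on one graph (explicit or implicit).
    """

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, node_vec: torch.Tensor, neighbor_vecs: torch.Tensor) -> torch.Tensor:
        # node_vec: (batch, dim); neighbor_vecs: (batch, n_neighbors, dim)
        q = node_vec.unsqueeze(1)                         # (batch, 1, dim)
        out, _ = self.attn(q, neighbor_vecs, neighbor_vecs)
        return out.squeeze(1)                             # (batch, dim)


class UnifiedUserEncoder(nn.Module):
    """Explicit aggregation first; its output is the input (query) of the
    implicit aggregation, yielding one unified user vector instead of two
    redundant ones. The same pattern would apply to event modeling."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.explicit_agg = MultiHeadGraphAttention(dim, num_heads)
        self.implicit_agg = MultiHeadGraphAttention(dim, num_heads)

    def forward(self, user_vec, explicit_nbrs, implicit_nbrs):
        h = self.explicit_agg(user_vec, explicit_nbrs)    # user-explicit graph
        return self.implicit_agg(h, implicit_nbrs)        # user-implicit graph, seeded by h


class GroupFusion(nn.Module):
    """Vanilla additive attention that fuses member vectors into a group vector."""

    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, 1))

    def forward(self, member_vecs: torch.Tensor) -> torch.Tensor:
        # member_vecs: (batch, n_members, dim)
        weights = F.softmax(self.score(member_vecs), dim=1)  # (batch, n_members, 1)
        return (weights * member_vecs).sum(dim=1)            # (batch, dim)


# Usage sketch with made-up shapes: 8 users, 5 explicit and 7 implicit
# neighbors each, grouped into 2 groups of 4 members.
enc, fuse = UnifiedUserEncoder(dim=64), GroupFusion(dim=64)
users = torch.randn(8, 64)
explicit_nbrs = torch.randn(8, 5, 64)
implicit_nbrs = torch.randn(8, 7, 64)
members = enc(users, explicit_nbrs, implicit_nbrs).view(2, 4, 64)
group_vec = fuse(members)                                 # (2, 64)
```

Under these assumptions, a group-event score could then be computed, e.g., as a dot product between `group_vec` and an event vector produced by the analogous event encoder; the abstract does not specify the scoring function, so that choice is illustrative.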