Deep multiplex graph infomax: Attentive multiplex network embedding using global information

2020 
Abstract Network embedding has recently garnered attention due to the ubiquity of networked data in the real world. A network is useful for representing the relationships among objects; examples include social networks, publication networks, and protein–protein interaction networks. Most existing network embedding methods assume that only a single type of relation exists between nodes. However, we focus on the fact that two nodes in a network can be connected by multiple types of relations; such a network is called a multi-view network or a multiplex network. Although several existing works consider the multiplexity of a network, they overlook node attributes, resort to node labels for training, and fail to model the global properties of a graph. In this work, we present an unsupervised network embedding method for attributed multiplex networks called DMGI, inspired by Deep Graph Infomax (DGI), which maximizes the mutual information between local patches of a graph and the global representation of the entire graph. Building on top of DGI, we devise a systematic way to jointly integrate the node embeddings from multiple graphs by introducing (1) a consensus regularization framework that minimizes the disagreements among the relation-type-specific node embeddings, and (2) a universal discriminator that discriminates true samples regardless of the relation type. We also show that the attention mechanism infers the importance of each relation type, and thus can be useful for filtering unnecessary relation types as a preprocessing step. We perform comprehensive experiments not only on unsupervised downstream tasks, such as clustering and similarity search, but also on a supervised downstream task, i.e., node classification, and demonstrate that DMGI outperforms the state-of-the-art methods, even though DMGI is fully unsupervised. The source code can be found at https://github.com/pcy1302/DMGI.
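
To make the abstract's two ingredients concrete, below is a minimal PyTorch sketch of a shared (universal) DGI-style discriminator combined with a consensus regularizer. This is not the authors' implementation (see the repository above): linear encoders stand in for graph encoders, the consensus term uses a plain unweighted average rather than the attention-weighted scheme described in the abstract, and all names and dimensions are hypothetical.

```python
import torch
import torch.nn as nn

class DMGISketch(nn.Module):
    """Illustrative sketch of the DMGI objective; names are hypothetical."""

    def __init__(self, n_nodes, in_dim, out_dim, n_relations):
        super().__init__()
        # One relation-type-specific encoder per graph layer
        # (a linear layer stands in for a graph encoder such as a GCN).
        self.encoders = nn.ModuleList(
            [nn.Linear(in_dim, out_dim) for _ in range(n_relations)]
        )
        # Universal discriminator: one bilinear scorer shared by all
        # relation types, scoring (patch, summary) pairs as in DGI.
        self.discriminator = nn.Bilinear(out_dim, out_dim, 1)
        # Consensus embedding shared across relation types.
        self.consensus = nn.Parameter(torch.empty(n_nodes, out_dim))
        nn.init.xavier_uniform_(self.consensus)

    def forward(self, feats, shuf_feats):
        bce = nn.BCEWithLogitsLoss()
        disc_loss, pos_embeds = 0.0, []
        for enc, x, x_shuf in zip(self.encoders, feats, shuf_feats):
            h_pos = torch.relu(enc(x))        # local patch representations
            h_neg = torch.relu(enc(x_shuf))   # corrupted (shuffled) inputs
            s = torch.sigmoid(h_pos.mean(0))  # global summary of this graph
            s = s.expand_as(h_pos)
            # Score true patches against the summary higher than corrupted
            # ones, using the same discriminator for every relation type.
            logits = torch.cat(
                [self.discriminator(h_pos, s), self.discriminator(h_neg, s)]
            ).squeeze(-1)
            labels = torch.cat(
                [torch.ones(h_pos.size(0)), torch.zeros(h_neg.size(0))]
            )
            disc_loss = disc_loss + bce(logits, labels)
            pos_embeds.append(h_pos)
        # Consensus regularization: pull the shared embedding toward the
        # (here, unweighted) average of the relation-specific embeddings.
        avg = torch.stack(pos_embeds).mean(0)
        cons_loss = ((self.consensus - avg) ** 2).mean()
        return disc_loss + cons_loss
```

Sharing a single discriminator across relation types is what lets the corruption-based objective tie the per-relation embeddings together, while the consensus parameter is what is ultimately used as the final node representation.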