LoG: a locally-global model for entity disambiguation

2020 
Entity disambiguation (ED) aims to link textual mentions in a document to the correct named entities in a knowledge base (KB). Although global ED models usually outperform local models by collectively linking mentions under the topical coherence assumption, they may still produce incorrect entity assignments when a document contains multiple topics. We therefore propose a Locally-Global model (LoG) for ED that extracts global features locally, i.e., among a limited number of neighboring mentions, to combine the strengths of both model types. In particular, we derive mention neighbors according to syntactic distance on a dependency parse tree, and propose a tree connection method, CoSimTC, to measure the cross-tree distance between mentions. We also recognize the importance of keywords for collective entity disambiguation, since they reveal the central topic of a document; hence, we propose a keyword extraction method, Sent2Word, to detect the keywords of each document. Furthermore, we extend the Graph Attention Network (GAT) to integrate both local and global features into a discriminative representation for each candidate entity. Experimental results on six widely adopted public datasets demonstrate better performance than state-of-the-art ED approaches, and the high efficiency of the LoG model further verifies its feasibility in practice.
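The core "locally-global" idea, restricting collective features to the syntactically nearest mentions, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the parent-pointer encoding of the dependency tree, the function names, and the toy parse are all assumptions. Note that mentions in different sentences live in disconnected trees, which is exactly the gap the paper's CoSimTC cross-tree distance is meant to bridge; the sketch simply raises an error in that case.

```python
# Hypothetical sketch: pick a mention's neighbors as the k mentions with the
# smallest dependency-tree (syntactic) distance. heads[t] is the parent token
# index of token t, with the root pointing to itself (heads[r] == r).

def tree_distance(heads, i, j):
    """Number of dependency edges on the path between tokens i and j."""
    def path_to_root(t):
        path = [t]
        while heads[t] != t:
            t = heads[t]
            path.append(t)
        return path

    pi, pj = path_to_root(i), path_to_root(j)
    depth_in_pi = {t: d for d, t in enumerate(pi)}
    for d, t in enumerate(pj):
        if t in depth_in_pi:           # lowest common ancestor found
            return depth_in_pi[t] + d
    # Disconnected trees (e.g. mentions in different sentences); the paper's
    # CoSimTC method would supply a cross-tree distance here instead.
    raise ValueError("tokens are in disconnected trees")

def nearest_mentions(heads, mentions, target, k):
    """Return the k mention token-indices syntactically closest to `target`."""
    others = [m for m in mentions if m != target]
    others.sort(key=lambda m: tree_distance(heads, target, m))
    return others[:k]

# Toy parse: token 2 is the root; tokens 0 and 1 attach to it directly,
# while token 4 hangs off token 3, which attaches to the root.
heads = [2, 2, 2, 2, 3]
mentions = [0, 1, 4]
print(nearest_mentions(heads, mentions, target=0, k=2))  # → [1, 4]
```

Mention 1 is two edges from mention 0 (via the root) while mention 4 is three edges away, so with k = 2 both are returned in order of syntactic proximity; in the LoG model only such nearby mentions would contribute global coherence features.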