Enhancing both Local and Global Entity Linking Models with Attention

2021 
Entity linking aims to map the mentions in a document to their corresponding entities in a given knowledge base. It involves two consecutive steps: a local step, which models the semantic meaning of the context around each mention, and a global step, which optimizes the coherence of the referred entities across the document. Building on the substantial existing work on both steps, this paper enhances both local and global entity linking models with several attention mechanisms. In particular, we propose to leverage a self-attention mechanism and an LSTM-based attention mechanism to better capture the inter-dependencies between tokens in the mention context for the local entity linking models. We also adopt a hierarchical attention network with a multi-head attention layer to better represent documents containing one or multiple topics for the global entity linking models, which helps alleviate the side effect of error accumulation. An extensive empirical study on standard benchmarks demonstrates the effectiveness of the proposed models.
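
To illustrate the local-step idea described above, the following is a minimal sketch of scaled dot-product self-attention over mention-context tokens, pooled into a single context vector and scored against candidate entity embeddings. This is not the authors' implementation; the class name, dimensions, mean pooling, and dot-product scoring are all illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the paper's model) of
# self-attention over mention-context tokens for local entity linking.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ContextSelfAttention(nn.Module):
    """Encode the tokens around a mention with single-head scaled
    dot-product self-attention and pool them into one context vector."""

    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)
        self.scale = dim ** 0.5

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len, dim) embeddings of the mention context window
        q, k, v = self.query(tokens), self.key(tokens), self.value(tokens)
        attn = F.softmax(q @ k.transpose(-2, -1) / self.scale, dim=-1)
        contextualized = attn @ v             # (batch, seq_len, dim)
        return contextualized.mean(dim=1)     # pooled context vector (batch, dim)


if __name__ == "__main__":
    # Toy usage: score two candidate entity embeddings against one mention context.
    torch.manual_seed(0)
    encoder = ContextSelfAttention(dim=64)
    context_tokens = torch.randn(1, 10, 64)       # 10 context tokens around the mention
    candidates = torch.randn(2, 64)               # 2 hypothetical candidate entity embeddings
    context_vec = encoder(context_tokens)         # (1, 64)
    scores = candidates @ context_vec.squeeze(0)  # dot-product compatibility scores
    print(scores)
```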