Entity Linking via Symmetrical Attention-Based Neural Network and Entity Structural Features

2019 
In the process of knowledge graph construction, entity linking is a pivotal step that maps mentions in text to entities in a knowledge base. Existing models utilize only individual information to represent latent features and ignore the correlation between entities and their mentions. Moreover, during entity feature extraction, only partial latent features, i.e., context features, are leveraged, while the pivotal entity structural features are ignored. In this paper, we propose SA-ESF, which leverages a symmetrical Bi-LSTM neural network with a double attention mechanism to calculate the correlation between mentions and entities in two aspects: (1) entity embeddings and mention context features; (2) mention embeddings and entity description features. Furthermore, the context features, structural features, and the entity ID feature are integrated to represent entity embeddings jointly. Finally, we leverage (1) the similarity score between each mention and its candidate entities and (2) the prior probability to calculate the final ranking results. Experimental results on nine benchmark datasets validate the performance of SA-ESF, with an average F1 score of up to 0.866.
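The final ranking step described above combines a mention-entity similarity score with a prior probability. A minimal sketch of such a candidate ranker, assuming a convex combination with weight `alpha` (the abstract does not specify the exact combination rule):

```python
def rank_candidates(candidates, alpha=0.5):
    """Rank candidate entities for a mention.

    candidates: list of (entity_id, similarity, prior) tuples, where
    `similarity` is the mention-entity similarity score and `prior` is
    the candidate's prior probability.

    The final score is assumed here to be a linear combination
    alpha * similarity + (1 - alpha) * prior; this weighting is an
    illustrative assumption, not the paper's stated formula.
    """
    scored = [(eid, alpha * sim + (1 - alpha) * prior)
              for eid, sim, prior in candidates]
    # Highest combined score first.
    return sorted(scored, key=lambda item: item[1], reverse=True)
```

For example, a candidate with a lower similarity but a much higher prior can still outrank a more similar candidate, which is the point of mixing the two signals.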