Sentence Matching with Deep Self-attention and Co-attention Features.

2021 
Sentence matching refers to extracting the semantic relation between two sentences and is widely applied in many natural language processing tasks such as natural language inference, paraphrase identification, and question answering. Many previous methods apply a Siamese network to capture semantic features and compute cosine similarity to represent the relation between sentences. However, while such methods can capture coarse sentence-level semantics, they are not sufficient for word-level matching information. In this paper, we propose a novel attention-based neural network that focuses on learning richer interactive features of two sentences. Our model has two complementary components: a semantic encoder and an interactive encoder. The interactive encoder compares the semantic features produced by the semantic encoder, and the semantic encoder in turn takes the output of the interactive encoder as supplementary matching features. Experiments on three benchmark datasets show that the self-attention and cross-attention networks can efficiently learn the semantic and interactive features of sentences, achieving state-of-the-art results.
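The core of the interactive encoder described above is word-level cross-attention (co-attention): each word of one sentence is softly aligned with the words of the other sentence. The sketch below illustrates the generic mechanism with numpy; the function name `cross_attention`, the unscaled dot-product scoring, and the toy dimensions are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(a, b):
    """Align each word of sentence `a` with sentence `b`, and vice versa.

    a: (len_a, d) word-level features of sentence A (e.g. from a semantic encoder)
    b: (len_b, d) word-level features of sentence B
    Returns a B-aware representation of A and an A-aware representation of B,
    which can serve as supplementary word-level matching features.
    """
    scores = a @ b.T                           # (len_a, len_b) word-pair similarities
    a_aligned = softmax(scores, axis=1) @ b    # each A-word as a mixture of B-words
    b_aligned = softmax(scores.T, axis=1) @ a  # each B-word as a mixture of A-words
    return a_aligned, b_aligned

# Toy example: two "sentences" of 3 and 4 words, feature dimension 8.
rng = np.random.default_rng(0)
a, b = rng.normal(size=(3, 8)), rng.normal(size=(4, 8))
a2b, b2a = cross_attention(a, b)
print(a2b.shape, b2a.shape)  # (3, 8) (4, 8)
```

In a full model these aligned features would typically be concatenated or compared with the original encodings before pooling and classification; that downstream stage is omitted here.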