LadRa-Net: Locally-Aware Dynamic Re-read Attention Net for Sentence Semantic Matching

2021 
Sentence semantic matching requires an agent to determine the semantic relation between two sentences, a capability used widely in natural language tasks such as natural language inference (NLI) and paraphrase identification (PI). Much recent progress has been made in this area, especially by attention-based methods and pretrained language model-based methods. However, most of these methods attend to all the important parts of sentences in a static way and only emphasize how important the words are to the query, limiting the power of the attention mechanism. To overcome this problem and boost the performance of attention, we propose a novel dynamic re-read (DRr) attention, which pays close attention to one small region of the sentences at each step and re-reads the important parts to build better sentence representations. Based on this attention variant, we develop a novel DRr network (DRr-Net) for sentence semantic matching. However, selecting only one small region at each step in DRr attention may be insufficient to capture sentence semantics, and employing pretrained language models as input encoders introduces incomplete and fragile representation problems. To this end, we extend DRr-Net to the Locally-Aware Dynamic Re-read Attention Net (LadRa-Net), in which the local structure of sentences is employed to alleviate the shortcoming of byte-pair encoding (BPE) in pretrained language models and to boost the performance of DRr attention. Extensive experiments on two popular sentence semantic matching tasks demonstrate that DRr-Net significantly improves the performance of sentence semantic matching, while LadRa-Net achieves even better performance by considering the local structure of sentences. Interestingly, some discoveries in our experiments are consistent with findings from psychological research.
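The core idea of DRr attention, as described above, is to attend to one small region of a sentence at each step, conditioned on the state so far, so that important words can be re-read multiple times. The following is a minimal illustrative sketch of that idea in NumPy; the scoring weights, mean-pooled initial state, and interpolation-based state update are simplifying assumptions (the paper's actual model uses learned parameters and a gated recurrent update), not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def drr_attention(H, steps=3, seed=0):
    """Sketch of dynamic re-read (DRr) attention.

    H: (n_words, d) array of word representations.
    At each step, score every word against the current state, focus on
    the single most-attended word (one "small region"), and fold it into
    the state. Because scoring is re-done each step, an important word
    can be selected -- i.e., re-read -- more than once.
    """
    rng = np.random.default_rng(seed)
    n, d = H.shape
    W = rng.standard_normal((d, d)) * 0.1   # hypothetical scoring weights
    s = H.mean(axis=0)                      # initial state: mean pooling (assumption)
    for _ in range(steps):
        scores = H @ (W @ s)                # relevance of each word to the state
        alpha = softmax(scores)             # attention distribution over words
        chosen = H[int(np.argmax(alpha))]   # dynamically pick one small region
        s = 0.5 * s + 0.5 * chosen          # simple interpolation update (GRU-style in the paper)
    return s
```

The step-wise, state-conditioned selection is what distinguishes this from standard static attention, which computes one fixed weighting over all words in a single pass.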