Learning to Select Relevant Knowledge for Neural Machine Translation

2021 
Most memory-based methods use encoded retrieved pairs as the translation memory (TM) to provide external guidance, but the retrieved pairs still contain noisy words. In this paper, we propose a simple and effective end-to-end model that selects useful words from the encoded memory and incorporates them into the NMT model. Our model uses a novel memory selection mechanism that avoids the noise from similar sentences while providing external guidance. To verify the positive influence of the selected retrieved words, we evaluate our model on a single-domain dataset (JRC-Acquis) and a multi-domain dataset comprising existing benchmarks (WMT, IWSLT, JRC-Acquis, and OpenSubtitles). Experimental results demonstrate that our method improves translation quality under different scenarios.
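The memory selection idea in the abstract can be sketched, very roughly, as a gating step applied to the encoded TM words before attention. The function names, the sigmoid-gate form, and the fixed threshold below are illustrative assumptions for exposition; they are not the paper's actual architecture or hyperparameters.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def select_and_attend(query, memory, gate_threshold=0.5):
    """Illustrative memory selection (assumed form, not the paper's model):
    score each retrieved-memory word against the decoder query, keep only
    words whose selection gate exceeds a threshold, then attend over the
    selected words.

    query:  (d,)   current decoder hidden state
    memory: (m, d) encoded words of the retrieved TM sentence
    """
    # Per-word selection gate in (0, 1): high = likely useful, low = noise.
    gates = 1.0 / (1.0 + np.exp(-(memory @ query)))     # shape (m,)
    keep = gates > gate_threshold                       # boolean mask
    if not keep.any():
        # No memory word selected: contribute nothing to the decoder.
        return np.zeros_like(query), keep
    # Standard dot-product attention, restricted to the selected words.
    scores = memory[keep] @ query
    weights = softmax(scores)
    context = weights @ memory[keep]                    # shape (d,)
    return context, keep

rng = np.random.default_rng(0)
d, m = 8, 5
q = rng.normal(size=d)
M = rng.normal(size=(m, d))
ctx, mask = select_and_attend(q, M)
print(mask.sum(), ctx.shape)
```

Filtering before attention, rather than attending over the whole retrieved sentence, is what lets a model of this kind ignore the noisy words the abstract mentions.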