Deep Residual and Deep Dense Attentions in English Chinese Translation

2021 
Neural Machine Translation (NMT) with attention mechanisms has achieved impressive improvements in automated translation. However, such models may lose information across multiple layers of attention representations. This paper addresses this over-attention problem. In our English-Chinese translation experiments, the proposed model reduces the information error rate in output sentences by about 0.5%.
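The abstract does not spell out the architecture, but the title suggests residual (add the layer input back to its output) and dense (reuse all earlier layer outputs) shortcuts around stacked attention layers, both intended to preserve information that repeated attention might otherwise discard. A minimal NumPy sketch of these two shortcut patterns, under the simplifying assumption of identity Q/K/V projections (all function names here are illustrative, not the paper's):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # Scaled dot-product self-attention with identity Q/K/V
    # projections (a simplification; real layers learn weights).
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ x

def residual_attention_stack(x, n_layers=4):
    # Residual shortcut: each layer adds its attention output to
    # its input, so earlier representations pass through directly.
    for _ in range(n_layers):
        x = x + self_attention(x)
    return x

def dense_attention_stack(x, n_layers=4):
    # Dense shortcut: each layer attends over the sum of ALL
    # previous layer outputs (a DenseNet-style connection pattern).
    outputs = [x]
    for _ in range(n_layers):
        outputs.append(self_attention(sum(outputs)))
    return sum(outputs)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))  # 5 source tokens, model dimension 8
print(residual_attention_stack(x).shape)  # (5, 8)
print(dense_attention_stack(x).shape)     # (5, 8)
```

Both stacks keep the sequence shape unchanged; the difference is only in which earlier representations each layer can still see.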