Analysis of Word Dependency Relations and Subword Models in Abstractive Text Summarization

2021 
Abstractive text summarization is an important task in natural language processing. As texts become available in the digital world at ever-increasing speed, people need automated systems that condense such bulk data into a form holding only the necessary information. With recent advances in deep learning, abstractive text summarization has gained even more attention. Attention-based sequence-to-sequence models have been adapted for this task and have achieved state-of-the-art results. Additional mechanisms such as pointer/generator and coverage have become standard for abstractive summarization. On top of these common approaches, we study the effects of integrating word dependency relations and of using subword models. We show that word dependency relations improve performance, and that subword units are another viable option for these models.
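One common way to build the subword vocabularies the abstract refers to is byte-pair encoding (BPE), which iteratively merges the most frequent adjacent symbol pair in a corpus. The sketch below illustrates the general idea on a toy vocabulary; it is not the paper's exact subword model, and the corpus and merge count are illustrative assumptions.

```python
from collections import Counter

def get_pair_counts(vocab):
    """Count adjacent symbol pairs across all words, weighted by word frequency."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    """Replace every occurrence of the pair with a single merged symbol."""
    merged = " ".join(pair)
    new_symbol = "".join(pair)
    return {word.replace(merged, new_symbol): freq for word, freq in vocab.items()}

# Toy corpus: each word is split into characters plus an end-of-word marker.
vocab = {"l o w </w>": 5, "l o w e r </w>": 2,
         "n e w e s t </w>": 6, "w i d e s t </w>": 3}
for _ in range(3):  # learn 3 merge operations
    pairs = get_pair_counts(vocab)
    best = max(pairs, key=pairs.get)
    vocab = merge_pair(best, vocab)
```

After three merges the frequent suffix "e s t" collapses into a single subword unit "est", so rare words like "newest" and "widest" share segments with common ones instead of becoming out-of-vocabulary tokens.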