An Attention-Based Approach to Text Summarization

2019 
Text summarization has become increasingly important in today’s world of information overload. Recently, simpler networks using only attention mechanisms have been explored for neural machine translation. We propose a similar model for the task of text summarization. The proposed model not only trains faster than the commonly used recurrent neural network-based architectures but also gives encouraging results. We trained our model on a dump of Wikipedia articles and obtained a ROUGE-1 F-measure of 0.54 and a BLEU score of 15.74.
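The abstract does not include code, but the attention-only building block it alludes to (Transformer-style scaled dot-product attention) can be sketched minimally. The function below is an illustrative assumption, not the paper's implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (n_q, n_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # weighted sum of value vectors

# Toy example: 3 query positions, 4 key/value positions, dimension 8
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8)
```

In an attention-only summarizer, stacks of such layers replace the recurrent encoder and decoder, which is why training parallelizes better than with RNN-based architectures.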