Joint knowledge-powered topic level attention for a convolutional text summarization model

2021 
Abstract

Abstractive text summarization (ATS) often fails to capture salient information and to preserve the original meaning of the content in generated summaries due to a lack of background knowledge. We present a method that supplies topic information, derived from the background knowledge of documents, to a deep learning-based summarization model. The method comprises a topic knowledge base (TKB) and a convolutional sequence network-based summarization model with knowledge-powered topic-level attention (KTOPAS). TKB employs conceptualization to retrieve the semantically salient knowledge of documents and a knowledge-powered topic model (KPTopicM) to generate coherent, meaningful topic information by exploiting the knowledge that best represents the documents. KTOPAS obtains knowledge-powered topic information (also called topic knowledge) from TKB and incorporates it into the convolutional sequence network through high-level topic attention to resolve the existing issues in ATS. KTOPAS introduces a tri-attention channel that jointly learns three alignments: source elements over summary elements, source elements over topic knowledge, and topic knowledge over summary elements. These three views of contextual alignment are combined with a softmax function to produce the final probability distribution, enabling the model to generate coherent, concise, human-like summaries with word diversity. Experiments on the CNN/Daily Mail and Gigaword datasets show that the proposed method consistently outperforms competing baselines. Moreover, TKB improves the quality of the resulting summaries by providing topic knowledge to KTOPAS, demonstrating the effectiveness of the proposed method.
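The tri-attention combination described above can be sketched numerically. The snippet below is an illustrative stand-in only: the abstract does not specify the scoring functions or the exact combination rule, so simple dot-product attention and an additive merge are assumed, along with toy dimensions (`S` source positions, `T` summary positions, `K` topic-knowledge vectors).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy dimensions (hypothetical): hidden size d, source length S,
# summary length T, number of topic-knowledge vectors K.
d, S, T, K = 8, 5, 4, 3
rng = np.random.default_rng(0)
src = rng.normal(size=(S, d))    # encoder (source) states
dec = rng.normal(size=(T, d))    # decoder (summary) states
topic = rng.normal(size=(K, d))  # topic-knowledge embeddings from TKB

# Three attention channels (dot-product scoring is an assumption):
a_src_dec = softmax(dec @ src.T)      # summary over source, shape (T, S)
a_src_topic = softmax(src @ topic.T)  # source over topic knowledge, (S, K)
a_topic_dec = softmax(dec @ topic.T)  # summary over topic knowledge, (T, K)

# Merge the three alignments into one distribution over source
# positions per summary step, then renormalize with softmax.
combined = a_src_dec + a_topic_dec @ a_src_topic.T  # (T, S)
p = softmax(combined)
```

Each row of `p` is a probability distribution over source positions for one summary step, now informed by topic knowledge through the two auxiliary channels.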