Transformer-Based Coattention: Neural Architecture for Reading Comprehension

2020 
Machine reading comprehension (MRC) is one of the primary challenges in natural language understanding (NLU); its objective is to answer questions correctly based on a given context. Attention mechanisms are now widely used in reading comprehension tasks. In this chapter, based on an analysis of two state-of-the-art attention mechanisms, Coattention and Multi-head Attention, the Transformer-based Coattention (TBC) is proposed. Furthermore, a general hybrid scheme is proposed to incorporate the TBC into pretrained MRC models with little extra training cost. Our experiments on the Stanford Question Answering Dataset (SQuAD) and Discrete Reasoning Over Paragraphs (DROP) show that the hybrid scheme enables models to achieve better performance.
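As a rough illustration of the idea summarized above, the sketch below builds a coattention-style interaction layer out of standard multi-head attention in PyTorch. The abstract does not specify the exact layer structure or fusion step, so the module name, the two cross-attention passes, and the feed-forward fusion here are assumptions made for illustration only, not the paper's implementation.

```python
# Minimal sketch of a Transformer-based coattention layer (assumed structure,
# not the paper's actual TBC implementation): the context attends to the
# question, the question attends to the context, and both summaries are fused
# back into the context representation.
import torch
import torch.nn as nn


class TransformerCoattention(nn.Module):
    def __init__(self, hidden_dim: int, num_heads: int = 8, dropout: float = 0.1):
        super().__init__()
        # Context-to-question (C2Q) and question-to-context (Q2C) cross-attention.
        self.c2q_attn = nn.MultiheadAttention(hidden_dim, num_heads,
                                              dropout=dropout, batch_first=True)
        self.q2c_attn = nn.MultiheadAttention(hidden_dim, num_heads,
                                              dropout=dropout, batch_first=True)
        self.norm = nn.LayerNorm(hidden_dim)
        # Fuse the original context with both attention summaries.
        self.fuse = nn.Sequential(
            nn.Linear(3 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(dropout),
        )

    def forward(self, context, question, question_mask=None):
        # context:  (batch, ctx_len, hidden_dim), e.g. from a pretrained MRC encoder
        # question: (batch, q_len,  hidden_dim)
        c2q, _ = self.c2q_attn(context, question, question,
                               key_padding_mask=question_mask)
        q2c, _ = self.q2c_attn(question, context, context)
        # Pool the question-side summary and broadcast it over the context,
        # a simple stand-in for the second-level coattention interaction.
        q2c_summary = q2c.mean(dim=1, keepdim=True).expand_as(context)
        fused = self.fuse(torch.cat([context, c2q, q2c_summary], dim=-1))
        return self.norm(context + fused)
```

In a hybrid scheme like the one described above, a layer of this kind could in principle be placed on top of the context and question representations produced by a pretrained MRC encoder, adding only a small number of new parameters to train.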