Multi-Head Bidirectional Attention for MRC

2019 
Machine reading comprehension is a challenging natural language processing task that usually requires both an in-depth understanding of the complex interactions between the given context and the query and knowledge of the world. Reading comprehension therefore calls for modeling the interconnected representations of context and query. SQuAD offers a large number of questions and their answers, providing a testbed for evaluating machine comprehension algorithms. Because the answers lie within the given context, we believe it is limiting and unnatural to represent the question or context with a single vector, or to model the context-query interaction with a single attention function. We propose an end-to-end neural multi-head bidirectional attention encoder architecture to capture the complex dependencies between query and context. Our experiments show a significant performance increase and faster convergence over the baseline model.
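
The abstract does not spell out the layer itself, so the following is only a minimal sketch of what a multi-head bidirectional attention encoder between a context and a query could look like: BiDAF-style context-to-query and query-to-context attention computed independently per head and then fused. All layer sizes, the projection layers, and the fusion scheme are assumptions, not the authors' published code.

```python
# A minimal sketch (assumed architecture, not the authors' implementation) of
# multi-head bidirectional attention between a context and a query.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadBiAttention(nn.Module):
    def __init__(self, hidden: int, num_heads: int = 4):
        super().__init__()
        assert hidden % num_heads == 0
        self.num_heads = num_heads
        self.d_head = hidden // num_heads
        # Per-head linear projections for context (C) and query (Q); assumed.
        self.proj_c = nn.Linear(hidden, hidden)
        self.proj_q = nn.Linear(hidden, hidden)
        # Fuse the concatenated attention features back to the model width.
        self.out = nn.Linear(4 * hidden, hidden)

    def forward(self, c, q):
        # c: (batch, Tc, hidden) context; q: (batch, Tq, hidden) query.
        B, Tc, _ = c.shape
        Tq = q.size(1)
        # Split into heads: (batch, heads, T, d_head).
        ch = self.proj_c(c).view(B, Tc, self.num_heads, self.d_head).transpose(1, 2)
        qh = self.proj_q(q).view(B, Tq, self.num_heads, self.d_head).transpose(1, 2)
        # Per-head scaled similarity matrix S: (batch, heads, Tc, Tq).
        S = torch.matmul(ch, qh.transpose(-2, -1)) / self.d_head ** 0.5
        # Context-to-query: each context token attends over all query tokens.
        c2q = torch.matmul(F.softmax(S, dim=-1), qh)          # (B, H, Tc, d)
        # Query-to-context: weight context tokens by their best query match
        # and broadcast over the context (BiDAF-style max over the query axis).
        b = F.softmax(S.max(dim=-1).values, dim=-1)           # (B, H, Tc)
        q2c = torch.matmul(b.unsqueeze(2), ch)                # (B, H, 1, d)
        q2c = q2c.expand(-1, -1, Tc, -1)

        def merge(x):  # (B, H, Tc, d) -> (B, Tc, hidden)
            return x.transpose(1, 2).reshape(B, Tc, -1)

        cm, c2qm, q2cm = merge(ch), merge(c2q), merge(q2c)
        # BiDAF-style fusion of the two attention directions; assumed here.
        g = torch.cat([cm, c2qm, cm * c2qm, cm * q2cm], dim=-1)
        return self.out(g)                                    # (B, Tc, hidden)


# Usage with dummy shapes (e.g. a SQuAD paragraph and its question):
layer = MultiHeadBiAttention(hidden=128, num_heads=4)
context = torch.randn(2, 50, 128)
query = torch.randn(2, 12, 128)
fused = layer(context, query)  # (2, 50, 128), query-aware context encoding
```

The multi-head split lets each head model a different facet of the context-query interaction, which is the abstract's stated motivation for moving beyond a single attention function.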