Bidirectional Gated Recurrent Unit Networks for Relation Classification with Multiple Attentions and Semantic Information.

2019 
Relation classification is an important task in the field of natural language processing (NLP). Its main goal is to extract the relations between target entities. In recent years, many methods for relation classification have been proposed and some of them have achieved quite good results, but these methods pay insufficient attention to the target words and under-utilize the semantic information of words. To make full use of the contextual information in sentences, we adopt bidirectional gated recurrent unit networks (BGRU). On this basis, we add a multiple attention mechanism so that computation focuses on the target entities and target sentences. In addition, other semantic information, such as the named entity and part-of-speech tags of each word, is also added as input so as to make full use of the word-level information in the corpus. We conducted experiments on widely used datasets and obtained up to a 3% improvement in F1 score over the previous best method.
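To make the described architecture concrete, the following is a minimal sketch (not the authors' released code) of the components the abstract names: word, part-of-speech, and named-entity embeddings concatenated as input, a bidirectional GRU over the sentence, and an attention layer that pools the hidden states before relation classification. All layer names, dimensions, and the single word-level attention (a simplification of the paper's multiple attentions) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BGRUAttentionClassifier(nn.Module):
    """Hypothetical sketch of a BGRU + attention relation classifier."""

    def __init__(self, vocab_size, pos_size, ner_size, n_relations,
                 word_dim=100, pos_dim=25, ner_dim=25, hidden_dim=128):
        super().__init__()
        # Separate embedding tables for words and the extra semantic features.
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.pos_emb = nn.Embedding(pos_size, pos_dim)
        self.ner_emb = nn.Embedding(ner_size, ner_dim)
        # Bidirectional GRU over the concatenated embeddings.
        self.bgru = nn.GRU(word_dim + pos_dim + ner_dim, hidden_dim,
                           batch_first=True, bidirectional=True)
        # Word-level attention: a learned query scores each time step.
        self.att_query = nn.Linear(2 * hidden_dim, 1, bias=False)
        self.classifier = nn.Linear(2 * hidden_dim, n_relations)

    def forward(self, words, pos_tags, ner_tags):
        # words, pos_tags, ner_tags: (batch, seq_len) index tensors.
        x = torch.cat([self.word_emb(words),
                       self.pos_emb(pos_tags),
                       self.ner_emb(ner_tags)], dim=-1)
        h, _ = self.bgru(x)                        # (batch, seq_len, 2*hidden)
        scores = self.att_query(h).squeeze(-1)     # (batch, seq_len)
        alpha = F.softmax(scores, dim=-1)          # attention weights
        sentence = torch.bmm(alpha.unsqueeze(1), h).squeeze(1)
        return self.classifier(sentence)           # relation logits


if __name__ == "__main__":
    # Tiny usage example with random token, POS, and NER indices.
    model = BGRUAttentionClassifier(vocab_size=5000, pos_size=50,
                                    ner_size=10, n_relations=19)
    words = torch.randint(0, 5000, (2, 12))
    pos = torch.randint(0, 50, (2, 12))
    ner = torch.randint(0, 10, (2, 12))
    print(model(words, pos, ner).shape)  # torch.Size([2, 19])
```

The design choice reflected here is that the extra semantic features are injected at the input layer by concatenation, so the recurrent layer can condition on them directly; how the paper combines its entity-level and sentence-level attentions is not reproduced in this simplified sketch.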