Emotion Classification with Explicit and Implicit Syntactic Information.

2021 
Emotion classification has become a hot research topic in natural language processing due to its wide range of applications. Existing studies that use syntactic information for emotion classification suffer from error propagation, since the parser cannot produce perfect syntax trees. To address this problem, we propose a new approach that compares and combines different levels of syntactic information, making full use of syntax while alleviating error propagation. First, we use graph convolutional networks (GCNs) to encode dependency trees, treating the probability matrix of all dependency arcs (an edge-weighted graph) as the GCN adjacency matrix. Next, we extract the hidden representations of the dependency parser's encoder as implicit syntactic representations, which directly avoids the error propagation problem. Finally, we fuse the two kinds of syntax-aware information and inject them into our baseline model as extra inputs. Experimental results further show that the explicit and implicit syntactic information improves the performance of a BERT-based system that is much stronger than the baseline. In addition, we find that the syntactic knowledge BERT can express is limited, and the syntactic information in our model contributes more, allowing our model to consistently outperform BERT across different sentence lengths.
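The edge-weighted GCN step described in the abstract can be sketched minimally as follows. This is a hypothetical illustration in NumPy, not the paper's actual implementation: the function name `gcn_layer`, the toy arc-probability matrix, and the row-normalization scheme are assumptions for exposition, assuming the parser outputs a soft probability for every candidate dependency arc.

```python
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution layer over an edge-weighted dependency graph.

    H: (n, d) token hidden states
    A: (n, n) probability matrix of all dependency arcs (soft edge weights)
    W: (d, d_out) learnable projection matrix
    """
    # Add self-loops so each token keeps its own representation
    A_hat = A + np.eye(A.shape[0])
    # Row-normalize so each token aggregates a weighted average of its neighbors
    A_norm = A_hat / A_hat.sum(axis=1, keepdims=True)
    # Aggregate neighbor features, project, and apply a ReLU nonlinearity
    return np.maximum(A_norm @ H @ W, 0.0)

# Toy example: a 3-token sentence with 4-dimensional hidden states
rng = np.random.default_rng(0)
H = rng.standard_normal((3, 4))
A = np.array([[0.0, 0.9, 0.1],
              [0.9, 0.0, 0.8],
              [0.1, 0.8, 0.0]])  # hypothetical arc probabilities from a parser
W = rng.standard_normal((4, 4))
H_out = gcn_layer(H, A, W)
print(H_out.shape)  # (3, 4)
```

Using the full probability matrix rather than a hard 1-best tree lets every candidate arc contribute in proportion to the parser's confidence, which is how the soft adjacency can dampen the effect of parsing errors.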