BERT Based Hierarchical Sequence Classification for Context-Aware Microblog Sentiment Analysis

2019 
In the microblog sentiment analysis task, most existing algorithms treat each microblog in isolation. However, in many cases the sentiment of a microblog is ambiguous and context-dependent, for example microblogs written in an ironic tone or non-sentimental content that conveys a certain emotional tendency. In this paper, we cast context-aware sentiment analysis as a sequence classification task and propose a hierarchical sequence classification model based on Bidirectional Encoder Representations from Transformers (BERT). Our model extends the pre-trained BERT model, which excels at dependency learning and semantic information extraction, with Bidirectional Long Short-Term Memory (BiLSTM) and Conditional Random Field (CRF) layers. Fine-tuning this model on the sequence classification task allows it to jointly consider each microblog's contextual representation and the label transitions between adjacent microblogs. Experimental evaluations on a public context-aware dataset show that the proposed model outperforms other reported methods by a large margin.
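To make the described architecture concrete, the sketch below shows one plausible way a BERT-to-BiLSTM-to-CRF hierarchical classifier could be assembled in PyTorch: each microblog is encoded into a single vector by BERT, a BiLSTM runs over the sequence of microblog vectors, and a CRF scores the label sequence. The pretrained checkpoint name, hidden sizes, and the `pytorch-crf` dependency are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a BERT-based hierarchical sequence classifier.
# Assumptions: PyTorch, Hugging Face `transformers`, and the `pytorch-crf`
# package; "bert-base-chinese" and all layer sizes are illustrative choices.
import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF


class HierarchicalBertBiLSTMCRF(nn.Module):
    def __init__(self, num_labels, bert_name="bert-base-chinese",
                 lstm_hidden=256):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        # BiLSTM runs over the sequence of microblog representations,
        # injecting contextual information from neighbouring posts.
        self.bilstm = nn.LSTM(hidden, lstm_hidden, batch_first=True,
                              bidirectional=True)
        self.emission = nn.Linear(2 * lstm_hidden, num_labels)
        # CRF models transition patterns between adjacent sentiment labels.
        self.crf = CRF(num_labels, batch_first=True)

    def _encode_posts(self, input_ids, attention_mask):
        # input_ids: (batch, num_posts, seq_len) -> one vector per microblog
        b, n, l = input_ids.shape
        out = self.bert(input_ids.view(b * n, l),
                        attention_mask=attention_mask.view(b * n, l))
        return out.pooler_output.view(b, n, -1)  # (batch, num_posts, hidden)

    def forward(self, input_ids, attention_mask, post_mask, labels=None):
        feats, _ = self.bilstm(self._encode_posts(input_ids, attention_mask))
        emissions = self.emission(feats)         # (batch, num_posts, labels)
        if labels is not None:
            # Negative log-likelihood of the label sequence under the CRF.
            return -self.crf(emissions, labels, mask=post_mask.bool())
        return self.crf.decode(emissions, mask=post_mask.bool())
```

At training time the module returns the CRF negative log-likelihood given gold labels; at inference time it returns the Viterbi-decoded label sequence for each microblog thread, which matches the joint "representation plus transition" formulation described in the abstract.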