Comparative Analyses of BERT, RoBERTa, DistilBERT, and XLNet for Text-Based Emotion Recognition

2020 
The success of transformer models is attributed to their superior language understanding abilities, which have produced state-of-the-art results in medicine, education, and other major NLP tasks. This paper analyzes the efficacy of the BERT, RoBERTa, DistilBERT, and XLNet pre-trained transformer models in recognizing emotions from text, comparing each candidate model's output against those of the remaining candidates. The models are fine-tuned on the ISEAR dataset to classify emotions into anger, disgust, sadness, fear, joy, shame, and guilt. Using the same hyperparameters, the recorded model accuracies in decreasing order are 0.7431, 0.7299, 0.7009, and 0.6693 for RoBERTa, XLNet, BERT, and DistilBERT, respectively.
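The comparison setup described in the abstract (fine-tuning four pre-trained checkpoints on the same 7-class task with identical hyperparameters) can be sketched as follows. This is a minimal illustrative sketch using the HuggingFace Transformers library: the checkpoint names, learning rate, epoch count, and toy examples below are assumptions, since the abstract does not specify the paper's actual hyperparameters or training procedure.

```python
# Minimal sketch, assuming HuggingFace Transformers and PyTorch.
# Checkpoints, hyperparameters, and sample texts are illustrative
# assumptions, not the paper's actual configuration.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# The seven ISEAR emotion classes named in the abstract.
LABELS = ["anger", "disgust", "sadness", "fear", "joy", "shame", "guilt"]

# Base-size variants of the four candidate models (assumed).
CHECKPOINTS = {
    "BERT": "bert-base-uncased",
    "RoBERTa": "roberta-base",
    "DistilBERT": "distilbert-base-uncased",
    "XLNet": "xlnet-base-cased",
}

def fine_tune(checkpoint, texts, labels, epochs=3, lr=2e-5):
    """Fine-tune one pre-trained model for 7-way emotion classification."""
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=len(LABELS)
    )
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    y = torch.tensor(labels)
    model.train()
    for _ in range(epochs):
        optimizer.zero_grad()
        out = model(**enc, labels=y)  # cross-entropy loss over 7 classes
        out.loss.backward()
        optimizer.step()
    return model, tokenizer

# Toy usage: each candidate is trained the same way, so that the
# resulting held-out accuracies are directly comparable.
texts = [
    "I was furious when my car was scratched.",
    "Passing the exam filled me with joy.",
]
labels = [LABELS.index("anger"), LABELS.index("joy")]
for name, ckpt in CHECKPOINTS.items():
    model, tok = fine_tune(ckpt, texts, labels)
```

In this setup, holding the hyperparameters fixed across all four models isolates the effect of the pre-trained architecture itself, which is the basis for the accuracy ranking the paper reports.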