Adversarial Multi-task Label Embedding for Text Classification

2019 
Multi-task learning exploits the potential correlation among related tasks to improve performance in text classification. However, in most multi-task work, labels are converted to meaningless one-hot vectors, which discards label semantics closely related to text semantics. In addition, the shared and private features captured by previous shared-private multi-task learning frameworks are often entangled, because the shared unit simply shares parameters across tasks. In this paper, we propose the Adversarial Multi-task Label Embedding model, which integrates label semantics to improve the performance of multi-task learning. We introduce adversarial training and orthogonality constraints into the multi-task learning framework to prevent shared and private features from interfering with each other. Extensive experimental results on six benchmark datasets demonstrate that our proposed approach is superior to state-of-the-art multi-task text classification methods.
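The orthogonality constraint mentioned above is commonly implemented in shared-private frameworks as the squared Frobenius norm of the product between the shared and private feature matrices. A minimal sketch, assuming that standard formulation (the abstract does not give the paper's exact loss); the function name and toy matrices below are illustrative, not from the paper:

```python
import numpy as np

def orthogonality_penalty(shared, private):
    """Squared Frobenius norm ||S^T P||_F^2.

    The penalty is zero when the shared and private feature
    matrices occupy orthogonal subspaces, discouraging the
    two feature extractors from capturing the same information.
    """
    return float(np.sum((shared.T @ private) ** 2))

# Toy example: shared features live on the first axis,
# private features on the second, so the penalty is zero.
shared = np.array([[1.0, 0.0],
                   [0.0, 0.0]])
private = np.array([[0.0, 0.0],
                    [0.0, 1.0]])
print(orthogonality_penalty(shared, private))   # 0.0
print(orthogonality_penalty(shared, shared))    # 1.0 (overlap penalized)
```

In training, this penalty would be added to the classification and adversarial losses so that gradient descent pushes the shared and private representations apart.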