Self-inhibition Residual Convolutional Networks for Chinese Sentence Classification.

2018 
Convolutional networks have become a dominant approach in many Natural Language Processing (NLP) tasks. However, these networks are typically shallow and simple, so they cannot capture the hierarchical features of text. In addition, the text preprocessing used by these models for Chinese is quite coarse, which leads to the loss of rich semantic information. In this paper, we explore deep convolutional networks for Chinese sentence classification and present a new model named the Self-Inhibition Residual Convolutional Network (SIRCNN). This model employs additional Chinese character information and replaces the convolutional block with a self-inhibition residual convolutional block to improve the performance of the deep network. It is one of the few explorations that apply deep convolutional networks to a range of text classification tasks. Experiments show that our model achieves state-of-the-art accuracy on three different datasets with a better convergence rate.
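The abstract does not specify how the self-inhibition residual block is formulated, so the following is only a minimal sketch of one plausible reading: a residual 1-D convolutional block over character embeddings whose main convolution is scaled by a sigmoid "inhibition" gate before the skip connection. The class name, the gating formulation, and all hyperparameters (channel count, kernel size) are illustrative assumptions, not the authors' design.

```python
import torch
import torch.nn as nn


class SelfInhibitionResidualBlock(nn.Module):
    """Hypothetical residual 1-D conv block with a gating ("inhibition") branch.

    The sigmoid gate scales the main convolution's output before the residual
    addition; the paper's actual block may differ.
    """

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        padding = kernel_size // 2
        self.conv = nn.Conv1d(channels, channels, kernel_size, padding=padding)
        self.gate = nn.Conv1d(channels, channels, kernel_size, padding=padding)
        self.norm = nn.BatchNorm1d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, sequence_length) over character embeddings
        h = self.norm(self.conv(x))
        inhibition = torch.sigmoid(self.gate(x))  # values in (0, 1) suppress features
        return torch.relu(x + h * inhibition)     # residual (skip) connection


if __name__ == "__main__":
    # Toy input: batch of 2 character sequences, 128 channels, length 50 (assumed sizes).
    block = SelfInhibitionResidualBlock(channels=128)
    out = block(torch.randn(2, 128, 50))
    print(out.shape)  # torch.Size([2, 128, 50])
```

Several such blocks could be stacked to form the deep network described in the abstract; since residual connections keep gradients flowing through many layers, that stacking is what makes the "deep" part tractable.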