Chinese Text Classification Model Based on BERT and Capsule Network Structure

2021 
Natural language exhibits strong contextual dependence at the sentence level, while existing Chinese short text classification algorithms often suffer from sparse features, irregular wording, and massive data volumes. To address these problems, a new Chinese news classification model based on BERT and a capsule network structure is proposed. First, BERT's multi-layer bidirectional Transformer feature extractor, built on the attention mechanism, is used to obtain a more global representation of the feature relationships between words and sentences. The resulting representations are then fed into a capsule network layer, which enhances the features and reduces time cost, improving the accuracy of feature selection. Finally, the weighted text feature information is passed to a Softmax layer for news classification. Experimental results on the Sohu news dataset show that the proposed feature fusion model achieves better classification performance than the compared models.
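To make the described pipeline concrete, below is a minimal sketch (not the authors' released code) of a BERT-plus-capsule classifier in PyTorch with Hugging Face Transformers. The pretrained model name "bert-base-chinese", the capsule dimensions, the number of routing iterations, and the use of capsule lengths as logits fed to a softmax are illustrative assumptions, not values reported in the paper.

```python
# Sketch of the BERT -> capsule layer -> Softmax pipeline described in the abstract.
# All hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel, BertTokenizer


def squash(x, dim=-1, eps=1e-8):
    """Capsule squashing non-linearity: keeps direction, bounds length in [0, 1)."""
    norm_sq = (x ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * x / torch.sqrt(norm_sq + eps)


class CapsuleLayer(nn.Module):
    """Routes token-level capsules (BERT hidden states) to class capsules."""

    def __init__(self, in_caps, in_dim, out_caps, out_dim, routing_iters=3):
        super().__init__()
        self.routing_iters = routing_iters
        # One transformation matrix per (input capsule, output capsule) pair.
        self.W = nn.Parameter(0.01 * torch.randn(1, in_caps, out_caps, out_dim, in_dim))

    def forward(self, u):                      # u: (batch, in_caps, in_dim)
        u = u.unsqueeze(2).unsqueeze(-1)       # (batch, in_caps, 1, in_dim, 1)
        u_hat = (self.W @ u).squeeze(-1)       # (batch, in_caps, out_caps, out_dim)
        b = torch.zeros(*u_hat.shape[:3], 1, device=u.device)
        for _ in range(self.routing_iters):    # dynamic routing by agreement
            c = F.softmax(b, dim=2)
            v = squash((c * u_hat).sum(dim=1, keepdim=True), dim=-1)
            b = b + (u_hat * v).sum(dim=-1, keepdim=True)
        return v.squeeze(1)                    # (batch, out_caps, out_dim)


class BertCapsuleClassifier(nn.Module):
    def __init__(self, num_classes, max_len=64, caps_dim=16):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        self.capsules = CapsuleLayer(
            in_caps=max_len, in_dim=self.bert.config.hidden_size,
            out_caps=num_classes, out_dim=caps_dim)

    def forward(self, input_ids, attention_mask):
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        class_caps = self.capsules(hidden)      # (batch, num_classes, caps_dim)
        logits = class_caps.norm(dim=-1)        # capsule length as class score
        return F.log_softmax(logits, dim=-1)    # Softmax layer for news categories


if __name__ == "__main__":
    tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
    model = BertCapsuleClassifier(num_classes=10)
    batch = tokenizer(["新浪体育报道了一场精彩的足球比赛。"],
                      padding="max_length", truncation=True,
                      max_length=64, return_tensors="pt")
    print(model(batch["input_ids"], batch["attention_mask"]).shape)  # (1, 10)
```

Using capsule lengths as softmax logits is one common simplification; capsule-network papers often train with a margin loss on capsule lengths instead, and the abstract does not specify which loss the authors use.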