LI Chaofan, MA Kai. Text classification model based on multi-channel attention mechanism[J]. Microelectronics & Computer, 2022, 39(4): 33-40. DOI: 10.19304/J.ISSN1000-7180.2021.1184

Text classification model based on multi-channel attention mechanism

  • To address the loss of key feature information, degraded model performance, and poor classification caused by sparse text features when convolutional neural networks (CNN) and recurrent neural networks (RNN) are applied to text classification, this paper proposes a text classification model based on a multi-channel attention mechanism. First, text is represented as vectors using fused character- and word-level embeddings. Then, a CNN and a BiLSTM are used to extract local features and contextual information of the text, respectively, and an attention mechanism weights the output of each channel to highlight the importance of feature words in the context. Finally, the channel outputs are fused and text category probabilities are computed with softmax. Comparative experiments on the dataset show that the proposed model achieves a better classification effect: compared with each single-channel model, the F1 score improves by 1.44% and 1.16%, respectively, verifying the model's effectiveness for text classification. The proposed model compensates for the respective shortcomings of CNN and BiLSTM in feature extraction, effectively alleviating the CNN's loss of word-order information and the gradient problems of BiLSTM when processing text sequences. By integrating local and global features of the text and highlighting key information, the model obtains a more comprehensive text representation and is therefore well suited to text classification tasks.
