Bidirectional LSTM with self-attention mechanism and multi-channel features for sentiment classification

Publication date: Available online 10 January 2020
Source: Neurocomputing
Author(s): Weijiang Li, Fang Qi, Ming Tang, Zhengtao Yu

Abstract
A wealth of linguistic knowledge and sentiment resources is available today, but current deep-learning approaches to sentiment analysis do not fully exploit this sentiment-specific information. Moreover, sentiment analysis can be cast as a sequence-modeling task, and sequence models have a known limitation: the input sequence is encoded into a fixed-length vector. If that vector is too short, information from the input text is lost and the text may be misclassified. To address these problems, we propose a bidirectional LSTM model with a self-attention mechanism and multi-channel features (SAMF-BiLSTM). The method models the existing linguistic knowledge and sentiment resources as separate feature channels and uses a self-attention mechanism to enhance the sentiment information. The SAMF-BiLSTM model can fully exploit the relationship between target words and sentiment-polarity words in a sentence, and it does not rely on a manually curated sentiment lexicon. In addition, we propose the SAMF-BiLSTM-D model, which builds on SAMF-BiLSTM for document-level text classification: it obtains the representation of each sentence in the document through SAMF-BiLSTM training, then applies a BiLSTM to learn the representation of all sentences, and further...
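The abstract describes combining BiLSTM hidden states with a self-attention mechanism to weight sentiment-bearing words. Below is a minimal, dependency-free sketch of that attention-pooling step: each token's BiLSTM hidden state is scored against a learned vector `w` (a stand-in here for the paper's attention parameters, whose exact form is not given in the abstract), the scores are softmax-normalized, and the sentence representation is the resulting weighted sum.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_pool(hidden_states, w):
    """Self-attention pooling over per-token hidden states.

    hidden_states: list of d-dimensional vectors (one per token,
                   e.g. concatenated forward/backward LSTM states).
    w:             d-dimensional scoring vector (hypothetical stand-in
                   for the learned attention parameters).
    Returns (sentence_vector, attention_weights).
    """
    # Dot-product score for each token's hidden state.
    scores = [sum(hi * wi for hi, wi in zip(h, w)) for h in hidden_states]
    alphas = softmax(scores)
    # Sentence vector = attention-weighted sum of hidden states.
    d = len(hidden_states[0])
    sent = [sum(a * h[k] for a, h in zip(alphas, hidden_states))
            for k in range(d)]
    return sent, alphas
```

In the full model, the pooled sentence vector would feed a classifier (sentence level) or a second BiLSTM over sentence vectors (the document-level SAMF-BiLSTM-D variant); those layers are omitted here.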