
A Multi-Layer Attention Network based on Self-Attention Mechanism

Abstract


Aspect-level sentiment analysis aims to determine the sentiment polarity of each target object in a sentence. Previous models for this task used an RNN or an attention network to model the context associated with the target. Although RNNs can capture long-range dependencies in a sentence, they are difficult to parallelize; a plain self-attention mechanism, on the other hand, cannot capture the positional information of the sequence. This paper therefore proposes a multi-layer attention network based on self-attention, which lets the model obtain context information more closely related to the target. Experiments and analysis show that the model achieves good results on common aspect-level sentiment analysis datasets.
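The abstract notes that plain self-attention is order-invariant and must be supplemented with positional information. As a minimal sketch of this idea (not the paper's actual architecture, whose layer details are not given in the abstract), the snippet below adds sinusoidal position encodings to the token embeddings before applying a single scaled dot-product self-attention layer; all dimensions and the random input are illustrative assumptions.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Sinusoidal position encodings: inject token order, which
    # plain self-attention alone cannot recover.
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

def self_attention(x):
    # Scaled dot-product self-attention over x of shape (seq_len, d):
    # each output position is a softmax-weighted mix of all positions.
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

seq_len, d_model = 5, 16
x = np.random.randn(seq_len, d_model)      # stand-in word embeddings
out = self_attention(x + positional_encoding(seq_len, d_model))
print(out.shape)  # (5, 16)
```

Without the added encodings, permuting the input rows would merely permute the output rows, so the model could not distinguish a target's left context from its right context.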

