Aspect-level sentiment analysis aims to determine the sentiment polarity expressed toward different target objects in a sentence. Previous models for this task used RNNs or attention networks to model the context and relate it to the target. Although RNNs can capture long-range dependencies in a sentence, they are difficult to parallelize; conversely, a single self-attention mechanism cannot capture the positional information of the sequence. For this reason, this paper proposes a multi-layer attention network based on self-attention, enabling the model to obtain context information that is more closely related to the target. Experiments and analysis show that our model achieves good results on common aspect-level sentiment analysis datasets.
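The claim that a single self-attention mechanism cannot capture positional information can be illustrated with a minimal sketch (not the paper's model): plain scaled dot-product self-attention is permutation-equivariant, so shuffling the input tokens simply shuffles the outputs, and position must be injected separately, e.g. via the fixed sinusoidal encodings of Vaswani et al. (2017), which are an assumption here.

```python
import numpy as np

def sinusoidal_positions(seq_len, d_model):
    """Fixed sinusoidal position encodings, shape (seq_len, d_model)."""
    pos = np.arange(seq_len)[:, None]                      # (seq_len, 1)
    i = np.arange(d_model)[None, :]                        # (1, d_model)
    angles = pos / np.power(10000, (2 * (i // 2)) / d_model)
    enc = np.zeros((seq_len, d_model))
    enc[:, 0::2] = np.sin(angles[:, 0::2])                 # even dims: sin
    enc[:, 1::2] = np.cos(angles[:, 1::2])                 # odd dims: cos
    return enc

def self_attention(x):
    """Scaled dot-product self-attention (identity Q/K/V projections)."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                          # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)         # row-wise softmax
    return weights @ x

rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 8))                           # 5 tokens, dim 8
perm = np.array([2, 0, 4, 1, 3])                           # a token reordering

# Without position encodings, attention commutes with the permutation:
# the reordered input yields exactly the reordered output.
out = self_attention(tokens)
out_perm = self_attention(tokens[perm])
print(np.allclose(out[perm], out_perm))                    # True

# Adding position encodings (tied to slot index, not token) breaks this,
# so the model can now distinguish token order.
pe = sinusoidal_positions(5, 8)
out_pe = self_attention(tokens + pe)
out_pe_perm = self_attention(tokens[perm] + pe)
print(np.allclose(out_pe[perm], out_pe_perm))              # False
```

This is only a didactic sketch with identity projections; the paper's multi-layer attention network would use learned projections and its own way of encoding position.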