In recent years, deep neural networks have achieved breakthroughs in many fields, and many studies use recurrent neural networks (RNNs) to encode sequence data and capture contextual dependencies. This study focuses on behavior recognition. By combining long short-term memory (LSTM) with an attention mechanism, the model can focus on the most informative parts of the input, which mitigates the performance degradation caused by overly long sequences and yields a more representative context vector. Compared with the same model without the attention mechanism, the proposed method improves the recognition rate by 3.8% on average.
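The abstract does not specify the attention variant, so the following is a minimal NumPy sketch of one common choice, additive (Bahdanau-style) attention pooling over LSTM hidden states; the parameter names `w`, `b`, `u` and the random hidden states standing in for LSTM outputs are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_context(hidden_states, w, b, u):
    """Additive attention pooling over a sequence of LSTM outputs.

    hidden_states: (T, H) array of hidden states for T time steps.
    w, b, u: parameters of the scoring function (learned in practice).
    Returns the context vector (H,) and the attention weights (T,).
    """
    # score each time step: u^T tanh(W h_t + b)
    scores = np.tanh(hidden_states @ w + b) @ u   # (T,)
    alpha = softmax(scores)                       # weights over time steps, sum to 1
    context = alpha @ hidden_states               # weighted sum -> context vector
    return context, alpha

rng = np.random.default_rng(0)
T, H = 5, 8                        # sequence length, hidden size (assumed values)
h = rng.standard_normal((T, H))    # stand-in for LSTM hidden states
w = rng.standard_normal((H, H))
b = rng.standard_normal(H)
u = rng.standard_normal(H)

context, alpha = attention_context(h, w, b, u)
print(alpha.sum())     # weights form a probability distribution
print(context.shape)   # (8,)
```

Because the context vector is a weighted sum rather than only the final hidden state, informative time steps can dominate regardless of their position in a long sequence, which is the behavior the abstract attributes to the attention mechanism.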