
Application of Knowledge Distillation in Representation Learning Person Re-identification Model

Abstract

Locking onto targets in a monitoring system is important for fully exerting the surveillance capability of mobile devices and saving working time. To reduce the required time and the large amount of computing resources, a fast person re-identification (Re-ID) method is proposed. In this paper, we use knowledge distillation to have a large teacher model (ResNet50) guide a small but effective student model (MobileNet v2) in representation learning. Experimental results demonstrate that the proposed method is feasible. Compared with the teacher model and the student model, the system applying knowledge distillation saves 55.4% of the time and improves mAP by 12.73% and Rank-1 by 8.63%, respectively.
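The abstract does not specify the distillation loss, so the sketch below is only an illustration of the general setup it describes: a frozen ResNet50 teacher supervising a MobileNet v2 student at the representation level. The MSE loss on embeddings, the 2048-dimensional projection, the optimizer, and the input size are all assumptions, not the paper's reported configuration.

```python
# A minimal sketch of representation-level knowledge distillation,
# assuming an MSE loss between teacher and student embeddings
# (the paper's actual distillation objective is not given here).
import torch
import torch.nn as nn
from torchvision import models

# Teacher: ResNet50 with the classifier head removed, frozen.
teacher = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
teacher.fc = nn.Identity()          # outputs 2048-d embeddings
teacher.eval()
for p in teacher.parameters():
    p.requires_grad = False

# Student: MobileNet v2, projected to the teacher's embedding size.
student = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
student.classifier = nn.Linear(student.last_channel, 2048)

distill_loss = nn.MSELoss()
optimizer = torch.optim.Adam(student.parameters(), lr=3e-4)

def train_step(images: torch.Tensor) -> float:
    """One distillation step: pull student embeddings toward the teacher's."""
    with torch.no_grad():
        t_emb = teacher(images)     # (B, 2048) teacher representation
    s_emb = student(images)         # (B, 2048) student representation
    loss = distill_loss(s_emb, t_emb)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    # Dummy batch of person crops at a common Re-ID input size (256x128).
    batch = torch.randn(8, 3, 256, 128)
    print(train_step(batch))
```

At inference, only the lightweight student is deployed, which is what yields the reported time savings over running the teacher directly.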