Article Details

Journal of Computers (EI, MEDLINE, Scopus)

Title: Application of Knowledge Distillation in Representation Learning Person Re-identification Model
Volume/Issue: 31:2
Authors: Chang Liu, Hang Ma, Jun-Jie Jin, Xin-Lun Zhou, Wen-Bai Chen
Pages: 277-286
Keywords: deep neural network, knowledge distillation, person Re-ID, representation learning
Publication Date: April 2020
DOI: 10.3966/199115992020043102023


Abstract

Locking onto targets in a monitoring system is important for fully exerting the surveillance capability of mobile devices and saving working time. To reduce the required time and the large amount of computing resources, a fast person re-identification (Re-ID) method is proposed. In this paper, we use knowledge distillation to let a large teacher model (ResNet50) guide a small but effective student model (MobileNet v2) in representation learning. Experimental results demonstrate that the proposed method is feasible. Compared with the teacher model and the student model, the system applying the knowledge distillation method saves 55.4% of the time and increases mAP by 12.73% and Rank-1 by 8.63%, respectively.
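The abstract describes the classic teacher-student distillation setup: a frozen ResNet50 teacher supervises a MobileNet v2 student through softened logits. Below is a minimal PyTorch sketch of such a training step, assuming Hinton-style logit distillation; the temperature, loss weight, identity count (751, the Market-1501 training split), and input size are illustrative assumptions, not values taken from the paper.

```python
import torch
import torch.nn.functional as F
from torchvision import models

# Hypothetical hyperparameters -- the abstract does not state them.
TEMPERATURE = 4.0   # softens logits before matching distributions
ALPHA = 0.7         # weight of the distillation term vs. the hard-label term
NUM_IDS = 751       # assumed identity count (Market-1501 training split)

# Teacher (ResNet50) and student (MobileNet v2), both with classifier
# heads re-sized to the number of Re-ID identities.
teacher = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
teacher.fc = torch.nn.Linear(teacher.fc.in_features, NUM_IDS)
teacher.eval()  # teacher is frozen during distillation

student = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
student.classifier[1] = torch.nn.Linear(student.last_channel, NUM_IDS)

def distillation_loss(student_logits, teacher_logits, labels):
    """KL divergence between softened teacher/student distributions,
    plus cross-entropy on the ground-truth identity labels."""
    soft_targets = F.softmax(teacher_logits / TEMPERATURE, dim=1)
    soft_student = F.log_softmax(student_logits / TEMPERATURE, dim=1)
    kd = F.kl_div(soft_student, soft_targets,
                  reduction="batchmean") * TEMPERATURE ** 2
    ce = F.cross_entropy(student_logits, labels)
    return ALPHA * kd + (1 - ALPHA) * ce

# One training step on a batch of person images and identity labels.
images = torch.randn(8, 3, 256, 128)   # typical Re-ID crop size (assumed)
labels = torch.randint(0, NUM_IDS, (8,))
with torch.no_grad():
    t_logits = teacher(images)         # no gradients flow to the teacher
s_logits = student(images)
loss = distillation_loss(s_logits, t_logits, labels)
loss.backward()
```

At inference only the MobileNet v2 student is deployed, which is where the reported time saving over the ResNet50 teacher would come from.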
