Journal of System Simulation ›› 2024, Vol. 36 ›› Issue (11): 2517-2527.doi: 10.16182/j.issn1004731x.joss.24-0754


Object Detection of Lightweight Transformer Based on Knowledge Distillation

Wang Gaihua1,2, Li Kehong1, Long Qian1,3, Yao Jingxuan1, Zhu Bolun1, Zhou Zhengshu1, Pan Xuran1   

  1.College of Artificial Intelligence, Tianjin University of Science & Technology, Tianjin 300457, China
    2.Hubei Key Laboratory of Optical Information and Pattern Recognition, Wuhan Institute of Technology, Wuhan 430205, China
    3.Beijing Smarter Technology Co., Ltd., Beijing 100020, China
  • Received:2024-07-15 Revised:2024-09-19 Online:2024-11-13 Published:2024-11-19
  • Contact: Li Kehong

Abstract:

In autonomous driving, the efficiency and accuracy of object detection are critical. Object detection based on the Transformer architecture has gradually become the mainstream approach, as it eliminates complex anchor generation and non-maximum suppression (NMS); however, it suffers from high computational cost and slow convergence. An object detection model based on a lightweight pooling transformer (LPT) is designed, which contains a pooling backbone network and a dual pooling attention mechanism. A general knowledge distillation method is proposed for the DETR (detection transformer) model, which transfers the prediction results, query vectors, and features extracted by the teacher as knowledge to the LPT model to improve its accuracy. To verify the application potential of the distilled LPT model in autonomous driving, extensive experiments are conducted on the MS COCO 2017 dataset. The results show that the method achieves high efficiency and accuracy and is competitive with some state-of-the-art methods.
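The abstract describes a three-part distillation objective: the student (LPT) is supervised by the teacher's (DETR's) prediction results, query vectors, and extracted features. A minimal sketch of such a combined loss is shown below; the function names, tensor shapes, MSE choice, and equal loss weights are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch of a three-part distillation loss (predictions,
# query vectors, features) combining per-term MSE with scalar weights.
# All names, shapes, and weights are assumptions for illustration only.
import numpy as np

def mse(a, b):
    """Mean squared error between two same-shaped arrays."""
    return float(np.mean((a - b) ** 2))

def distillation_loss(student, teacher, w_pred=1.0, w_query=1.0, w_feat=1.0):
    """Weighted sum of MSE terms over predictions, queries, and features."""
    return (w_pred * mse(student["pred"], teacher["pred"])
            + w_query * mse(student["query"], teacher["query"])
            + w_feat * mse(student["feat"], teacher["feat"]))

rng = np.random.default_rng(0)
teacher = {k: rng.standard_normal((4, 8)) for k in ("pred", "query", "feat")}
student = {k: v + 0.1 for k, v in teacher.items()}  # student offset from teacher
loss = distillation_loss(student, teacher)  # each term is 0.1**2 = 0.01
```

In practice the prediction term for DETR-style detectors would also need the Hungarian matching between student and teacher queries; the sketch omits that step for brevity.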

Key words: object detection, knowledge distillation, lightweight, DETR (detection transformer), Transformer, autonomous driving
