Open Access

ARTICLE

An Improved Knowledge Distillation Algorithm and Its Application to Object Detection

Min Yao1,*, Guofeng Liu2, Yaozu Zhang3, Guangjie Hu1

1 School of Information Engineering, Shanghai Maritime University, Shanghai, 201306, China
2 Baidu, Beijing, 100000, China
3 Shanghai Freesense Technology Co., Ltd., Shanghai, 200000, China

* Corresponding Author: Min Yao.

Computers, Materials & Continua 2025, 83(2), 2189-2205. https://doi.org/10.32604/cmc.2025.060609

Abstract

Knowledge distillation (KD) is an emerging model compression technique for learning compact object detectors. Previous KD methods often focused solely on distilling from the logits layer or the intermediate feature layers, which may limit the student network's comprehensive learning. In addition, the imbalance between foreground and background also degrades model performance. To address these issues, this paper employs feature-based distillation to enhance the detection performance of the bounding-box localization branch, and logit-based distillation to improve the detection performance of the category prediction branch. Specifically, for intermediate-layer feature distillation, we introduce feature resampling to reduce the risk of the student model merely imitating the teacher model. At the same time, we incorporate a Spatial Attention Mechanism (SAM) to highlight the foreground features learned by the student model. For output-layer distillation, we divide the traditional distillation targets into target-class objects and non-target-class objects, aiming to improve overall distillation performance. Furthermore, we introduce a one-to-many matching distillation strategy based on a Feature Alignment Module (FAM), which further enhances the student model's feature representation ability and brings its feature distribution closer to that of the teacher model, yielding superior localization and classification capabilities in object detection tasks. Experimental results demonstrate that the proposed method outperforms conventional distillation techniques in object detection performance.
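Two ideas from the abstract can be made concrete in a short sketch: splitting the output-layer distillation loss into a target-class term and a non-target-class term, and weighting the feature-imitation loss with a spatial attention map that emphasizes foreground regions. The sketch below is illustrative only and is not the paper's implementation; the function names, the temperature `T`, the weights `alpha`/`beta`, and the particular SAM variant (channel-mean absolute activation) are all assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax along the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def decoupled_kd_loss(student_logits, teacher_logits, target, T=2.0,
                      alpha=1.0, beta=1.0):
    """Logit distillation split into target-class and non-target-class terms.

    alpha/beta are hypothetical balancing weights; the paper's exact loss
    formulation may differ.
    """
    ps = softmax(student_logits, T)
    pt = softmax(teacher_logits, T)
    n, c = ps.shape
    idx = np.arange(n)
    # Target-class term: binary KL over (target vs. everything else).
    ps_t, pt_t = ps[idx, target], pt[idx, target]
    tckd = (pt_t * np.log(pt_t / ps_t)
            + (1 - pt_t) * np.log((1 - pt_t) / (1 - ps_t)))
    # Non-target term: KL over the remaining classes, renormalized.
    mask = np.ones_like(ps, dtype=bool)
    mask[idx, target] = False
    ps_nt = ps[mask].reshape(n, c - 1)
    ps_nt = ps_nt / ps_nt.sum(-1, keepdims=True)
    pt_nt = pt[mask].reshape(n, c - 1)
    pt_nt = pt_nt / pt_nt.sum(-1, keepdims=True)
    nckd = (pt_nt * np.log(pt_nt / ps_nt)).sum(-1)
    return float((alpha * tckd + beta * nckd).mean())

def spatial_attention(feat):
    """Spatial attention from the channel-mean absolute activation.

    feat: (N, C, H, W). Returns per-location weights that sum to H*W per
    sample, so the mean weight is 1.
    """
    a = np.abs(feat).mean(axis=1, keepdims=True)  # (N, 1, H, W)
    n = a.shape[0]
    w = softmax(a.reshape(n, -1)).reshape(a.shape)
    return w * a[0].size

def attended_feature_loss(student_feat, teacher_feat):
    """Feature-imitation MSE weighted by the teacher's spatial attention."""
    w = spatial_attention(teacher_feat)
    return float((w * (student_feat - teacher_feat) ** 2).mean())
```

Both losses vanish when the student exactly matches the teacher, and the attention weighting makes foreground-heavy locations contribute more to the feature term, which is the stated motivation for using SAM against foreground/background imbalance.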

Keywords

Deep learning; model compression; knowledge distillation; object detection

Cite This Article

APA Style
Yao, M., Liu, G., Zhang, Y., & Hu, G. (2025). An Improved Knowledge Distillation Algorithm and Its Application to Object Detection. Computers, Materials & Continua, 83(2), 2189–2205. https://doi.org/10.32604/cmc.2025.060609
Vancouver Style
Yao M, Liu G, Zhang Y, Hu G. An Improved Knowledge Distillation Algorithm and Its Application to Object Detection. Comput Mater Contin. 2025;83(2):2189–2205. https://doi.org/10.32604/cmc.2025.060609
IEEE Style
M. Yao, G. Liu, Y. Zhang, and G. Hu, “An Improved Knowledge Distillation Algorithm and Its Application to Object Detection,” Comput. Mater. Contin., vol. 83, no. 2, pp. 2189–2205, 2025. https://doi.org/10.32604/cmc.2025.060609



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.