
Open Access

ARTICLE

APPLE_YOLO: Apple Detection Method Based on Channel Pruning and Knowledge Distillation in Complicated Environments

Xin Ma1,2, Jin Lei3,4,*, Chenying Pei4 and Chunming Wu4
1 Department of Aircraft Control and Information Engineering, Jilin University of Chemical Technology, Jilin, 132022, China
2 Micro Engineering and Micro Systems Laboratory, School of Mechanical and Aerospace Engineering, Jilin University, Changchun, 130025, China
3 School of Marine Science and Technology, Northwestern Polytechnical University, Xi’an, 710129, China
4 Key Laboratory of Modern Power System Simulation and Control & Renewable Energy Technology, Ministry of Education, Northeast Electric Power University, Jilin, 132012, China
* Corresponding Author: Jin Lei. Email: 2202200376@neepu.edu.cn
(This article belongs to the Special Issue: Advances in Object Detection: Methods and Applications)

Computers, Materials & Continua https://doi.org/10.32604/cmc.2025.069353

Received 20 June 2025; Accepted 25 September 2025; Published online 07 November 2025

Abstract

This study proposes a lightweight apple detection method employing cascaded knowledge distillation (KD) to address the critical challenges of excessive parameter counts and high deployment costs in existing models. We introduce a Lightweight Feature Pyramid Network (LFPN) integrated with Lightweight Downsampling Convolutions (LDConv) to substantially reduce model complexity without compromising accuracy. A Lightweight Multi-channel Attention (LMCA) mechanism is incorporated between the backbone and neck networks to effectively suppress complex background interference in orchard environments. Furthermore, model size is compressed via Group_Slim channel pruning combined with a cascaded distillation strategy. Experimental results demonstrate that the proposed model achieves 1% higher Average Precision (AP) than the baseline while remaining extremely lightweight (only 800 k parameters). Notably, the two-stage KD version achieves over 20 Frames Per Second (FPS) on Central Processing Unit (CPU) devices, confirming its practical deployability in real-world applications.
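The knowledge-distillation step summarized above typically trains the compact student to match the teacher's temperature-softened output distribution. The paper's exact cascaded loss is not given here, so the following is only a minimal sketch of the classic soft-target KD loss (temperature-scaled softmax plus KL divergence, scaled by T², in Hinton et al.'s formulation); the function names, the temperature value, and the plain-list logits interface are illustrative assumptions, not the authors' implementation.

```python
import math

def softened_softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields a softer distribution."""
    m = max(z / T for z in logits)                      # shift for numerical stability
    exps = [math.exp(z / T - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Soft-target distillation loss: T^2 * KL(teacher || student)
    computed on temperature-softened class distributions."""
    p = softened_softmax(teacher_logits, T)             # teacher's soft targets
    q = softened_softmax(student_logits, T)             # student's softened predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0)
    return T * T * kl
```

In practice this term is weighted against the ordinary detection loss on ground-truth labels, so the student learns from both the hard annotations and the teacher's softened outputs.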

Keywords

LMCA; LFPN; LDConv; Group_Slim; knowledge distillation