Open Access

ARTICLE

Hierarchical Shape Pruning for 3D Sparse Convolution Networks

Haiyan Long1, Chonghao Zhang2, Xudong Qiu3, Hai Chen2,*, Gang Chen4,*

1 School of Information Engineering, Liaodong University, Dandong, 118003, China
2 School of Computer Science and Technology, Anhui University, Hefei, 230601, China
3 Institute of Artificial Intelligence, Beihang University, Beijing, 100191, China
4 School of Aerospace Engineering, Xiamen University, Xiamen, 361005, China

* Corresponding Authors: Hai Chen. Email: email; Gang Chen. Email: email

Computers, Materials & Continua 2025, 84(2), 2975-2988. https://doi.org/10.32604/cmc.2025.065047

Abstract

3D sparse convolution has emerged as a pivotal technique for efficient voxel-based perception in autonomous systems, enabling selective feature extraction from non-empty voxels while suppressing computational waste. Despite its theoretical efficiency advantages, practical implementations face an under-explored limitation: the fixed geometric patterns of conventional sparse convolutional kernels inevitably process non-contributory positions during sliding-window operations, particularly in regions with uneven point cloud density. To address this, we propose Hierarchical Shape Pruning for 3D Sparse Convolution (HSP-S), which dynamically eliminates redundant kernel stripes through layer-adaptive thresholding. Unlike static soft pruning methods, HSP-S maintains trainable sparsity patterns by progressively adjusting pruning thresholds during optimization, enlarging the original parameter search space while removing redundant operations. Extensive experiments validate the effectiveness of HSP-S across major autonomous driving benchmarks. On KITTI's 3D object detection task, our method eliminates 93.47% of redundant kernel computations while maintaining comparable accuracy (a 1.56% mAP drop). Remarkably, on the more complex NuScenes benchmark, HSP-S achieves simultaneous computation reduction (21.94% sparsity) and accuracy gains (improvements of 1.02% mAP (mean Average Precision) and 0.47% NDS (nuScenes detection score)), demonstrating its scalability to diverse perception scenarios. This work establishes the first learnable shape pruning framework that simultaneously enhances computational efficiency and preserves detection accuracy in 3D perception systems.
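The core idea described above, removing low-importance kernel stripes under a layer-adaptive threshold, can be illustrated with a minimal sketch. The function name, the L1-norm importance measure, and the quantile-based threshold are illustrative assumptions, not the authors' actual implementation:

```python
# Hypothetical sketch of stripe-wise shape pruning with a layer-adaptive
# threshold. Names and the L1/quantile choices are illustrative assumptions.
import numpy as np

def prune_kernel_stripes(weight, sparsity):
    """Zero out kernel stripes whose L1 norm falls below a threshold.

    weight:   (out_ch, in_ch, kD, kH, kW) dense kernel tensor.
    sparsity: fraction of stripes to remove for this layer; in HSP-S this
              would be adjusted progressively per layer during training.
    Each "stripe" here is the kW-long vector weight[o, i, d, h, :].
    """
    kW = weight.shape[-1]
    stripes = weight.reshape(-1, kW)          # one row per stripe (a view)
    norms = np.abs(stripes).sum(axis=1)       # L1 importance of each stripe
    threshold = np.quantile(norms, sparsity)  # layer-adaptive cutoff
    mask = norms >= threshold                 # keep only important stripes
    stripes[~mask] = 0.0                      # remove redundant stripes
    return stripes.reshape(weight.shape), mask.mean()

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 2, 3, 3, 3))          # toy 3x3x3 kernel bank
pruned, kept_frac = prune_kernel_stripes(w.copy(), sparsity=0.25)
print(f"kept {kept_frac:.0%} of stripes")
```

In the full method the threshold is trainable rather than a fixed quantile, so the pruned shape pattern can change during optimization instead of being frozen after a one-shot ranking.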

Keywords

Shape pruning; model compression; 3D sparse convolution

Cite This Article

APA Style
Long, H., Zhang, C., Qiu, X., Chen, H., & Chen, G. (2025). Hierarchical Shape Pruning for 3D Sparse Convolution Networks. Computers, Materials & Continua, 84(2), 2975–2988. https://doi.org/10.32604/cmc.2025.065047
Vancouver Style
Long H, Zhang C, Qiu X, Chen H, Chen G. Hierarchical Shape Pruning for 3D Sparse Convolution Networks. Comput Mater Contin. 2025;84(2):2975–2988. https://doi.org/10.32604/cmc.2025.065047
IEEE Style
H. Long, C. Zhang, X. Qiu, H. Chen, and G. Chen, “Hierarchical Shape Pruning for 3D Sparse Convolution Networks,” Comput. Mater. Contin., vol. 84, no. 2, pp. 2975–2988, 2025. https://doi.org/10.32604/cmc.2025.065047



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.