Open Access
ARTICLE
Towards Real-Time Multi-Person Pose Estimation via Feature Selection and Sharpening Mechanisms
1 School of Mathematics and Statistics, Huangshan University, Huangshan, China
2 College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, Nanjing, China
3 Huangshan Technology Innovation Center for Digital Economy and Big Data Analysis, Huangshan, China
* Corresponding Author: Jianwei Hu. Email:
Computer Modeling in Engineering & Sciences 2026, 146(3), 32 https://doi.org/10.32604/cmes.2026.079062
Received 14 January 2026; Accepted 03 March 2026; Issue published 30 March 2026
Abstract
Real-time multi-person pose estimation (MPE) built upon neural network architectures aims to simultaneously detect multiple human instances and regress joint coordinates in dynamic scenes. However, due to factors such as high model complexity and limited expression of keypoint information, both the efficiency and accuracy of real-time MPE remain to be improved. To mitigate the adverse impacts caused by the aforementioned issues, this work develops FSEM-Pose, a real-time MPE model rooted in the YOLOv10 framework. In detail, first, FSEM-Pose upgrades the backbone module of the baseline network by introducing the Feature Shuffling-Convolution (FS-Conv), which effectively reduces the backbone size while maximizing the retention of spatial information from the input image. Second, FSEM-Pose incorporates a Feature Saliency Enhancement Module (FSEM) to strengthen the feature encoding of human keypoints, thereby improving the accuracy of pose estimation. Finally, FSEM-Pose further enhances inference efficiency via a lightweight optimization of the head using shared convolutional layers. Our method achieves competitive results across multiple accuracy and efficiency metrics on the MS COCO 2017 and CrowdPose datasets. While being lightweight in design, it improves average precision (AP) by 2.1% and 2.5%, respectively.
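The abstract does not detail how FS-Conv shuffles features, so as a hedged illustration only: channel shuffling in lightweight backbones is commonly implemented ShuffleNet-style, by splitting the channel dimension into groups and interleaving them so information mixes across groups without extra parameters. The sketch below shows that generic interleaving operation on a plain list of channel indices; it is not the authors' FS-Conv, and the function name `channel_shuffle` is hypothetical.

```python
def channel_shuffle(channels, groups):
    """Interleave channel entries across `groups` groups (ShuffleNet-style).

    Equivalent to reshaping the channel axis to (groups, n // groups),
    transposing, and flattening -- a parameter-free way to mix
    information between channel groups.
    """
    n = len(channels)
    assert n % groups == 0, "channel count must be divisible by groups"
    per_group = n // groups
    # Take the i-th channel of each group in turn.
    return [channels[g * per_group + i]
            for i in range(per_group)
            for g in range(groups)]


# Example: 6 channels in 3 groups [0,1 | 2,3 | 4,5] interleave to [0,2,4,1,3,5].
print(channel_shuffle(list(range(6)), groups=3))
```

In a real network this index permutation would be applied to the channel axis of a feature tensor (e.g. via reshape and transpose), typically between grouped convolutions.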
Copyright © 2026 The Author(s). Published by Tech Science Press. This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

