Open Access
ARTICLE
HUANNet: A High-Resolution Unified Attention Network for Accurate Counting
Robotics Research Center, College of Electrical Engineering and Automation, Shandong University of Science and Technology, Qingdao, 266590, China
* Corresponding Author: Zhiguo Zhang. Email:
Computers, Materials & Continua 2026, 86(1), 1-20. https://doi.org/10.32604/cmc.2025.069340
Received 20 June 2025; Accepted 10 September 2025; Issue published 10 November 2025
Abstract
Accurately counting dense objects in complex and diverse backgrounds is a significant challenge in computer vision, with applications ranging from crowd counting to various other object counting tasks. To address this, we propose HUANNet (High-Resolution Unified Attention Network), a convolutional neural network designed to capture both local features and rich semantic information through a high-resolution representation learning framework, while optimizing computational distribution across parallel branches. HUANNet introduces three core modules: the High-Resolution Attention Module (HRAM), which enhances feature extraction by optimizing multi-resolution feature fusion; the Unified Multi-Scale Attention Module (UMAM), which integrates spatial, channel, and convolutional kernel information through an attention mechanism applied across multiple levels of the network; and the Grid-Assisted Point Matching Module (GPMM), which stabilizes and improves point-to-point matching by leveraging grid-based mechanisms. Extensive experiments show that HUANNet achieves competitive results on the ShanghaiTech Part A/B crowd counting datasets and sets new state-of-the-art performance on dense object counting datasets such as CARPK and XRAY-IECCD, demonstrating the effectiveness and versatility of HUANNet.
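The UMAM described above integrates spatial and channel information through attention. As a rough illustration of that general idea, the following NumPy sketch reweights a feature map by channel-wise and spatial attention weights; the function names and the simplified weighting scheme are our own assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def unified_attention(feat):
    """Toy unified attention over a (C, H, W) feature map:
    combine a channel weighting (from global average pooling) with a
    spatial weighting (from a channel-wise mean). Illustrative only."""
    C, H, W = feat.shape
    # Channel attention: global average pool -> softmax over channels
    channel_w = softmax(feat.mean(axis=(1, 2)))             # shape (C,)
    # Spatial attention: channel-wise mean -> softmax over positions
    spatial = feat.mean(axis=0)                             # shape (H, W)
    spatial_w = softmax(spatial.reshape(-1)).reshape(H, W)  # shape (H, W)
    # Broadcast both weightings over the feature map
    return feat * channel_w[:, None, None] * spatial_w[None, :, :]

x = np.random.rand(8, 4, 4).astype(np.float32)
y = unified_attention(x)
```

In a full network such a reweighted map would feed subsequent convolutional stages; the paper additionally incorporates convolutional-kernel information and applies the mechanism at multiple network levels.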
Copyright © 2026 The Author(s). Published by Tech Science Press. This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

