
Open Access

ARTICLE

ResghostNet: Boosting GhostNet with Residual Connections and Adaptive-SE Blocks

Yuang Chen1,2, Yong Li1,*, Fang Lin1,2, Shuhan Lv1,2, Jiaze Jiang1,2
1 Key Laboratory of CTC & IE (Engineering University of PAP), Ministry of Education, Xi’an, 710086, China
2 Graduate Student Brigade, Engineering University of PAP, Xi’an, 710086, China
* Corresponding Author: Yong Li

Computers, Materials & Continua https://doi.org/10.32604/cmc.2025.070990

Received 29 July 2025; Accepted 26 September 2025; Published online 27 October 2025

Abstract

The cheap operations that GhostNet uses to generate ghost feature maps can introduce information noise. To address this problem, this paper proposes ResghostNet, a lightweight neural network built around a Resghost Module that combines residual connections with Adaptive-SE blocks. The residual connection, added on top of the Ghost Module, propagates the original input directly to the output and thereby optimizes information flow, while the Adaptive-SE block, a weight self-attention mechanism combined with SE blocks, selects the important channels before the cheap operations and improves the quality of the generated feature maps. Experiments on the ImageNet dataset show that ResghostNet achieves higher accuracy than GhostNet while reducing the number of parameters by 52%. Although its computational complexity is higher, an optimized GPU cache-memory usage strategy makes its inference faster. By jointly optimizing classification accuracy and parameter count, ResghostNet shows strong potential for edge computing devices.
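The abstract describes the Resghost Module as a Ghost-style module augmented with a residual path and an SE-based channel selection applied before the cheap operation. The following PyTorch sketch illustrates one plausible reading of that design; the class names, the reduction ratio, and the use of a depthwise convolution as the "cheap operation" are assumptions for illustration, not the authors' exact implementation.

```python
import torch
import torch.nn as nn

class AdaptiveSEBlock(nn.Module):
    """Squeeze-and-excitation channel gating (hypothetical Adaptive-SE stand-in)."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.fc(x.mean(dim=(2, 3)))        # squeeze: global average pool per channel
        return x * w[:, :, None, None]         # excite: reweight channels

class ResghostModule(nn.Module):
    """Ghost-style module with SE gating before the cheap op and a residual path."""
    def __init__(self, channels: int, ratio: int = 2):
        super().__init__()
        primary = channels // ratio
        # Primary pointwise convolution produces the "intrinsic" feature maps.
        self.primary_conv = nn.Sequential(
            nn.Conv2d(channels, primary, kernel_size=1, bias=False),
            nn.BatchNorm2d(primary),
            nn.ReLU(inplace=True),
        )
        # Channel selection applied before the cheap operation.
        self.se = AdaptiveSEBlock(primary)
        # Cheap operation: a depthwise convolution generating ghost feature maps.
        self.cheap_op = nn.Sequential(
            nn.Conv2d(primary, channels - primary, kernel_size=3, padding=1,
                      groups=primary, bias=False),
            nn.BatchNorm2d(channels - primary),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.primary_conv(x)
        ghost = self.cheap_op(self.se(y))      # select channels, then cheap op
        out = torch.cat([y, ghost], dim=1)
        return out + x                         # residual: propagate the original input
```

With `ratio=2` and equal input/output channel counts, the concatenated output matches the input shape, so the residual addition needs no projection; a real implementation would likely add a shortcut convolution when shapes differ.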

Keywords

Residual connections; adaptive-SE blocks; lightweight neural network; GPU memory usage