Open Access

ARTICLE

A Hybrid Split-Attention and Transformer Architecture for High-Performance Network Intrusion Detection

Gan Zhu1, Yongtao Yu2,*, Xiaofan Deng1, Yuanchen Dai3, Zhenyuan Li3

1 School of Software, Yunnan University, Kunming, 650504, China
2 Yunnan Key Laboratory of Smart City in Cyberspace Security, Yuxi Normal University, Yuxi, 653100, China
3 School of Information Science and Technology, Yunnan Normal University, Kunming, 650500, China

* Corresponding Author: Yongtao Yu.

Computer Modeling in Engineering & Sciences 2025, 145(3), 4317-4348. https://doi.org/10.32604/cmes.2025.074349

Abstract

Existing deep learning-based Network Intrusion Detection Systems (NIDS) struggle to simultaneously capture fine-grained, multi-scale features and long-range temporal dependencies. To address this gap, this paper introduces TransNeSt, a hybrid architecture integrating a ResNeSt block (using split-attention for multi-scale feature representation) with a Transformer encoder (using self-attention for global temporal modeling). This integration of multi-scale and temporal attention was validated on four benchmarks: NSL-KDD, UNSW-NB15, CIC-IDS2017, and CICIoT2023. TransNeSt consistently outperformed its individual components and several state-of-the-art models, demonstrating significant quantitative gains. The model achieved high efficacy across all datasets, with F1-scores of 99.04% (NSL-KDD), 91.92% (UNSW-NB15), 99.18% (CIC-IDS2017), and 97.85% (CICIoT2023), confirming its robustness.
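To make the two attention mechanisms named in the abstract concrete, the following is a minimal NumPy sketch, not the authors' implementation: `split_attention` fuses parallel branches by softmax-weighting them per channel (the ResNeSt-style split-attention idea, with the learned fully connected layers omitted for brevity), and `self_attention` is single-head scaled dot-product attention with identity projections (the core of a Transformer encoder's temporal modeling). All shapes and names here are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def split_attention(x):
    # x: (radix, channels, length) -- feature maps from parallel branches.
    # Global average pooling over length, then softmax across the radix axis,
    # gives per-channel weights for fusing the branches (split-attention idea;
    # the FC layers of the real ResNeSt block are omitted here).
    gap = x.mean(axis=-1)                        # (radix, channels)
    weights = softmax(gap, axis=0)               # attention over splits
    return (weights[..., None] * x).sum(axis=0)  # (channels, length)

def self_attention(x):
    # x: (seq_len, d_model) -- single-head scaled dot-product self-attention
    # with identity Q/K/V projections, for illustration only.
    d = x.shape[-1]
    scores = softmax(x @ x.T / np.sqrt(d), axis=-1)
    return scores @ x

# Toy forward pass: multi-scale fusion first, then global temporal mixing.
rng = np.random.default_rng(0)
branches = rng.normal(size=(2, 4, 8))  # radix=2, 4 channels, length 8
fused = split_attention(branches)      # (4, 8)
out = self_attention(fused.T)          # treat length as the sequence: (8, 4)
print(out.shape)
```

The ordering mirrors the abstract's design: local multi-scale feature fusion feeds a global self-attention stage over the resulting sequence.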

Keywords

Intrusion detection; Transformer; ResNeSt; split attention; deep learning

Cite This Article

APA Style
Zhu, G., Yu, Y., Deng, X., Dai, Y., & Li, Z. (2025). A Hybrid Split-Attention and Transformer Architecture for High-Performance Network Intrusion Detection. Computer Modeling in Engineering & Sciences, 145(3), 4317–4348. https://doi.org/10.32604/cmes.2025.074349
Vancouver Style
Zhu G, Yu Y, Deng X, Dai Y, Li Z. A Hybrid Split-Attention and Transformer Architecture for High-Performance Network Intrusion Detection. Comput Model Eng Sci. 2025;145(3):4317–4348. https://doi.org/10.32604/cmes.2025.074349
IEEE Style
G. Zhu, Y. Yu, X. Deng, Y. Dai, and Z. Li, “A Hybrid Split-Attention and Transformer Architecture for High-Performance Network Intrusion Detection,” Comput. Model. Eng. Sci., vol. 145, no. 3, pp. 4317–4348, 2025. https://doi.org/10.32604/cmes.2025.074349



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.