Open Access
ARTICLE
Fuzzy Attention Convolutional Neural Networks: A Novel Approach Combining Intuitionistic Fuzzy Sets and Deep Learning
Zheng Zhao1, Doo Heon Song2, Kwang Baek Kim1,*
1 Department of Artificial Intelligence, Silla University, Busan, 46958, Republic of Korea
2 Department of Computer Games, Yong-In Art & Science University, Yongin, 17145, Republic of Korea
* Corresponding Author: Kwang Baek Kim. Email:
(This article belongs to the Special Issue: Recent Fuzzy Techniques in Image Processing and its Applications)
Computers, Materials & Continua 2026, 87(2), 32 https://doi.org/10.32604/cmc.2026.073969
Received 29 September 2025; Accepted 15 December 2025; Issue published 12 March 2026
Abstract
Deep learning attention mechanisms have achieved remarkable progress in computer vision, but still face limitations when handling images with ambiguous boundaries and uncertain feature representations. Conventional attention modules such as SE-Net, CBAM, ECA-Net, and CA adopt a deterministic paradigm, assigning fixed scalar weights to features without modeling ambiguity or confidence. To overcome these limitations, this paper proposes the Fuzzy Attention Network Layer (FANL), which integrates intuitionistic fuzzy set theory with convolutional neural networks to explicitly represent feature uncertainty through membership (μ), non-membership (ν), and hesitation (π) degrees. FANL consists of four core modules: (1) feature dimensionality reduction via global pooling, (2) fuzzy modeling using learnable clustering centers, (3) adaptive attention generation through weighted fusion of fuzzy components, and (4) feature refinement through residual connections. A cross-layer guidance mechanism is further introduced to enhance hierarchical feature propagation, allowing high-level semantic features to incorporate fine-grained texture information from shallow layers. Comprehensive experiments on three benchmark datasets—PathMNIST-30000, full PathMNIST, and BloodMNIST—demonstrate the effectiveness and generalizability of FANL. The model achieves 84.41 ± 0.56% accuracy, a 1.69% improvement over the baseline CNN, while maintaining lightweight computational complexity. Ablation studies show that removing any component causes a 1.7%–2.0% performance drop, validating the synergistic contribution of each module. Furthermore, FANL provides superior uncertainty calibration (ECE = 0.0452) and interpretable selective prediction under uncertainty. Overall, FANL presents an efficient and uncertainty-aware attention framework that improves both accuracy and reliability, offering a promising direction for robust visual recognition under ambiguous or noisy conditions.
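The four-module pipeline described above can be sketched in NumPy. This is a minimal illustration under stated assumptions, not the authors' implementation: the Gaussian membership function, the Sugeno-type non-membership complement, scalar per-cluster centers, and the fixed fusion weights are all choices made for this sketch.

```python
import numpy as np

def fanl_attention(x, centers, lam=0.5, fusion=(1.0, -1.0, 0.5)):
    """Illustrative fuzzy attention over a (C, H, W) feature map.

    centers : assumed learnable clustering centers, shape (K,)
    lam     : Sugeno parameter for the non-membership complement (assumption)
    fusion  : assumed weights for fusing (mu, nu, pi) into attention logits
    """
    # (1) Feature dimensionality reduction: global average pooling,
    #     one descriptor per channel.
    z = x.mean(axis=(1, 2))                      # shape (C,)

    # (2) Fuzzy modeling: membership from distance to the nearest
    #     cluster center (Gaussian kernel, an illustrative choice).
    d = np.abs(z[:, None] - centers[None, :])    # (C, K) distances
    mu = np.exp(-d.min(axis=1) ** 2)             # membership degree in (0, 1]
    nu = (1.0 - mu) / (1.0 + lam * mu)           # Sugeno-type non-membership
    pi = 1.0 - mu - nu                           # hesitation degree (>= 0)

    # (3) Adaptive attention generation: weighted fusion of the three
    #     fuzzy components, squashed to (0, 1) per channel.
    w_mu, w_nu, w_pi = fusion
    a = 1.0 / (1.0 + np.exp(-(w_mu * mu + w_nu * nu + w_pi * pi)))

    # (4) Feature refinement through a residual connection.
    return x + a[:, None, None] * x
```

Note that with this Sugeno-type complement, μ + ν ≤ 1 holds automatically, so the hesitation degree π stays non-negative, matching the intuitionistic-fuzzy-set constraint; in the actual FANL the centers and fusion weights would be learned end-to-end rather than fixed.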
Keywords
Attention mechanism; deep learning; intuitionistic fuzzy set; PathMNIST
Cite This Article
APA Style
Zhao, Z., Song, D. H., & Kim, K. B. (2026). Fuzzy Attention Convolutional Neural Networks: A Novel Approach Combining Intuitionistic Fuzzy Sets and Deep Learning. Computers, Materials & Continua, 87(2), 32. https://doi.org/10.32604/cmc.2026.073969
Vancouver Style
Zhao Z, Song DH, Kim KB. Fuzzy Attention Convolutional Neural Networks: A Novel Approach Combining Intuitionistic Fuzzy Sets and Deep Learning. Comput Mater Contin. 2026;87(2):32. https://doi.org/10.32604/cmc.2026.073969
IEEE Style
Z. Zhao, D. H. Song, and K. B. Kim, “Fuzzy Attention Convolutional Neural Networks: A Novel Approach Combining Intuitionistic Fuzzy Sets and Deep Learning,” Comput. Mater. Contin., vol. 87, no. 2, Art. no. 32, 2026. https://doi.org/10.32604/cmc.2026.073969

Copyright © 2026 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.