
Open Access

ARTICLE

Fuzzy Attention Convolutional Neural Networks: A Novel Approach Combining Intuitionistic Fuzzy Sets and Deep Learning

Zheng Zhao1, Doo Heon Song2, Kwang Baek Kim1,*
1 Department of Artificial Intelligence, Silla University, Busan, 46958, Republic of Korea
2 Department of Computer Games, Yong-In Art & Science University, Yongin, 17145, Republic of Korea
* Corresponding Author: Kwang Baek Kim
(This article belongs to the Special Issue: Recent Fuzzy Techniques in Image Processing and its Applications)

Computers, Materials & Continua https://doi.org/10.32604/cmc.2026.073969

Received 29 September 2025; Accepted 15 December 2025; Published online 09 January 2026

Abstract

Deep learning attention mechanisms have achieved remarkable progress in computer vision, but still face limitations when handling images with ambiguous boundaries and uncertain feature representations. Conventional attention modules such as SE-Net, CBAM, ECA-Net, and CA adopt a deterministic paradigm, assigning fixed scalar weights to features without modeling ambiguity or confidence. To overcome these limitations, this paper proposes the Fuzzy Attention Network Layer (FANL), which integrates intuitionistic fuzzy set theory with convolutional neural networks to explicitly represent feature uncertainty through membership (μ), non-membership (ν), and hesitation (π) degrees. FANL consists of four core modules: (1) feature dimensionality reduction via global pooling, (2) fuzzy modeling using learnable clustering centers, (3) adaptive attention generation through weighted fusion of fuzzy components, and (4) feature refinement through residual connections. A cross-layer guidance mechanism is further introduced to enhance hierarchical feature propagation, allowing high-level semantic features to incorporate fine-grained texture information from shallow layers. Comprehensive experiments on three benchmark datasets—PathMNIST-30000, full PathMNIST, and BloodMNIST—demonstrate the effectiveness and generalizability of FANL. The model achieves 84.41 ± 0.56% accuracy and a 1.69% improvement over the baseline CNN while maintaining lightweight computational complexity. Ablation studies show that removing any component causes a 1.7%–2.0% performance drop, validating the synergistic contribution of each module. Furthermore, FANL provides superior uncertainty calibration (ECE = 0.0452) and interpretable selective prediction under uncertainty. Overall, FANL presents an efficient and uncertainty-aware attention framework that improves both accuracy and reliability, offering a promising direction for robust visual recognition under ambiguous or noisy conditions.
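The four FANL modules named in the abstract (global pooling, fuzzy modeling against learnable cluster centers, weighted fusion of the μ/ν/π components into attention, and residual refinement) can be sketched as a forward pass. The sketch below is a minimal NumPy illustration under stated assumptions, not the paper's implementation: Gaussian similarity for membership, a Sugeno-type complement for non-membership, and a sigmoid fusion with illustrative weights are all our choices where the abstract does not specify the exact formulas.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fanl_forward(x, centers, sigma=1.0, lam=2.0, w=(1.0, 1.0, 0.5)):
    """Hypothetical FANL-style pass over a feature map x of shape (C, H, W).

    Mirrors the four modules listed in the abstract:
    1) global average pooling reduces (C, H, W) to a C-vector;
    2) Gaussian similarity to learnable cluster centers gives membership mu;
    3) a Sugeno-type complement gives non-membership nu, with hesitation
       pi = 1 - mu - nu; the three are fused into channel attention;
    4) a residual connection refines the input features.
    sigma, lam, the fusion weights w, and the center count are
    illustrative assumptions, not values from the paper.
    """
    pooled = x.mean(axis=(1, 2))                       # (C,) global pooling
    d2 = (pooled[:, None] - centers[None, :]) ** 2     # (C, K) sq. distances
    mu = np.exp(-d2 / (2.0 * sigma**2)).max(axis=1)    # membership in (0, 1]
    nu = (1.0 - mu) / (1.0 + lam * mu)                 # ensures mu + nu <= 1
    pi = 1.0 - mu - nu                                 # hesitation degree
    attn = sigmoid(w[0] * mu - w[1] * nu + w[2] * pi)  # (C,) channel weights
    return x + x * attn[:, None, None]                 # residual refinement

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))   # toy feature map: 8 channels, 4x4
centers = rng.standard_normal(3)     # 3 learnable cluster centers (toy)
y = fanl_forward(x, centers)
```

Because the attention weights lie in (0, 1) and enter through a residual connection, the layer rescales each channel by a factor between 1 and 2 rather than gating features to zero, which matches the "feature refinement" role the abstract describes.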

Keywords

Attention mechanism; deep learning; intuitionistic fuzzy set; PathMNIST