Open Access

ARTICLE


Enhancing Deep Learning Semantics: The Diffusion Sampling and Label-Driven Co-Attention Approach

Chunhua Wang1,2, Wenqian Shang1,2,*, Tong Yi3,*, Haibin Zhu4

1 State Key Laboratory of Media Convergence and Communication, Communication University of China, Beijing, 100024, China
2 School of Computer and Cyber Sciences, Communication University of China, Beijing, 100024, China
3 School of Computer Science and Engineering, Guangxi Normal University, Guilin, 541004, China
4 Department of Computer Science, Nipissing University, North Bay, ON P1B 8L7, Canada

* Corresponding Authors: Wenqian Shang and Tong Yi

(This article belongs to the Special Issue: The Next-generation Deep Learning Approaches to Emerging Real-world Applications)

Computers, Materials & Continua 2024, 79(2), 1939-1956. https://doi.org/10.32604/cmc.2024.048135

Abstract

The advent of self-attention mechanisms within Transformer models has significantly propelled the advancement of deep learning algorithms, yielding outstanding achievements across diverse domains. Nonetheless, self-attention mechanisms falter when applied to datasets with intricate semantic content and extensive dependency structures. In response, this paper introduces a Diffusion Sampling and Label-Driven Co-attention Neural Network (DSLD), which adopts a diffusion sampling method to capture more comprehensive semantic information from the data. In addition, the model incorporates the joint correlation between labels and data into the computation of the text representation, correcting semantic biases in the data and increasing the accuracy of the semantic representation. Finally, the model synthesizes these rich semantic representations to compute the corresponding classification results. Experiments on seven benchmark datasets show that the proposed model achieves competitive results compared with state-of-the-art methods.
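The abstract gives only a high-level view of the label-driven co-attention step, and the paper's implementation is not reproduced here. The following is an illustrative sketch, not the authors' DSLD code: it assumes hypothetical inputs text_repr (token representations from some encoder) and label_emb (learned label embeddings), computes a token-label affinity matrix, pools a label-aware text vector, and feeds the fused representation to a linear classifier.

```python
import torch
import torch.nn.functional as F

def label_coattention_classify(text_repr, label_emb, weight, bias):
    """Illustrative label-text co-attention followed by a linear classifier.

    text_repr: (seq_len, d) token representations from any encoder
    label_emb: (num_labels, d) label embeddings
    weight:    (num_labels, 2 * d) classifier weight
    bias:      (num_labels,) classifier bias
    """
    # Token-label affinity scores: (seq_len, num_labels)
    affinity = text_repr @ label_emb.T
    # Softmax over tokens gives, for each label, a distribution over tokens
    token_weights = F.softmax(affinity, dim=0)                     # (seq_len, num_labels)
    # Label-aware summaries of the text, then averaged into one vector
    label_aware = token_weights.T @ text_repr                      # (num_labels, d)
    pooled_label_view = label_aware.mean(dim=0)                    # (d,)
    # Fuse the plain mean-pooled text vector with the label-aware view
    fused = torch.cat([text_repr.mean(dim=0), pooled_label_view])  # (2d,)
    logits = fused @ weight.T + bias                               # (num_labels,)
    return F.softmax(logits, dim=-1)

# Hypothetical usage with random tensors standing in for real encoder outputs
text_repr = torch.randn(128, 768)   # 128 tokens, 768-dim encoder states
label_emb = torch.randn(5, 768)     # 5 candidate labels
weight, bias = torch.randn(5, 2 * 768), torch.zeros(5)
probs = label_coattention_classify(text_repr, label_emb, weight, bias)
```

The design choice shown here, letting each label attend over the tokens and then combining that label-aware view with a plain pooled text vector, is one common way to inject label-data correlation into the text representation; the paper's diffusion sampling stage and exact fusion scheme are not modeled in this sketch.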

Keywords


Cite This Article

APA Style
Wang, C., Shang, W., Yi, T., & Zhu, H. (2024). Enhancing deep learning semantics: The diffusion sampling and label-driven co-attention approach. Computers, Materials & Continua, 79(2), 1939-1956. https://doi.org/10.32604/cmc.2024.048135
Vancouver Style
Wang C, Shang W, Yi T, Zhu H. Enhancing deep learning semantics: the diffusion sampling and label-driven co-attention approach. Comput Mater Contin. 2024;79(2):1939-1956. https://doi.org/10.32604/cmc.2024.048135
IEEE Style
C. Wang, W. Shang, T. Yi, and H. Zhu, "Enhancing Deep Learning Semantics: The Diffusion Sampling and Label-Driven Co-Attention Approach," Comput. Mater. Contin., vol. 79, no. 2, pp. 1939-1956, 2024. https://doi.org/10.32604/cmc.2024.048135



This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.