TY  - EJOU
AU  - Wang, Chunhua
AU  - Shang, Wenqian
AU  - Yi, Tong
AU  - Zhu, Haibin
TI  - Enhancing Deep Learning Semantics: The Diffusion Sampling and Label-Driven Co-Attention Approach
T2  - Computers, Materials & Continua
PY  - 2024
VL  - 79
IS  - 2
SN  - 1546-2226
AB  - The advent of self-attention mechanisms within Transformer models has significantly propelled the advancement of deep learning algorithms, yielding outstanding achievements across diverse domains. Nonetheless, self-attention mechanisms falter when applied to datasets with intricate semantic content and extensive dependency structures. In response, this paper introduces a Diffusion Sampling and Label-Driven Co-attention Neural Network (DSLD), which adopts a diffusion sampling method to capture more comprehensive semantic information from the data. Additionally, the model leverages the joint correlation between labels and data to guide the computation of text representations, correcting semantic representation biases in the data and increasing the accuracy of semantic representation. Ultimately, the model computes classification results by synthesizing these rich semantic representations. Experiments on seven benchmark datasets show that the proposed model achieves competitive results compared to state-of-the-art methods.
KW  - Semantic representation
KW  - sampling attention
KW  - label-driven co-attention
KW  - attention mechanisms
DO  - 10.32604/cmc.2024.048135
ER  - 
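
Note (illustrative aside, not part of the record): the abstract's "label-driven co-attention" can be read as attention computed between learned label embeddings and token representations, whose result informs the final text representation. The sketch below is a minimal, hypothetical PyTorch illustration of that general idea under stated assumptions; the class name LabelCoAttention, the projection, the mean-pooling step, and all dimensions are assumptions for illustration, not the paper's actual DSLD architecture.

# Hypothetical sketch of label-driven co-attention (not the authors' code).
# Assumed names: LabelCoAttention, num_labels, d_model, tokens.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelCoAttention(nn.Module):
    """Attends between learned label embeddings and token representations,
    producing a label-aware text representation for classification."""
    def __init__(self, num_labels: int, d_model: int):
        super().__init__()
        self.label_emb = nn.Embedding(num_labels, d_model)  # one vector per class
        self.proj = nn.Linear(d_model, d_model)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len, d_model)
        labels = self.label_emb.weight                      # (num_labels, d_model)
        # Co-attention scores between every label and every token.
        scores = torch.einsum("btd,cd->btc", self.proj(tokens), labels)
        scores = scores / tokens.size(-1) ** 0.5            # scaled dot-product
        # For each label, a softmax-weighted summary over the tokens.
        attn = F.softmax(scores, dim=1)                     # normalize over tokens
        label_views = torch.einsum("btc,btd->bcd", attn, tokens)  # (batch, num_labels, d_model)
        # Pool the per-label views into one label-aware sentence vector.
        return label_views.mean(dim=1)                      # (batch, d_model)

# Usage: feed the pooled vector to a linear classifier over the label set.
x = torch.randn(2, 16, 64)                  # (batch=2, seq_len=16, d_model=64)
layer = LabelCoAttention(num_labels=4, d_model=64)
logits = nn.Linear(64, 4)(layer(x))         # (2, 4)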