Open Access

ARTICLE


A Transformer-Based Deep Learning Framework with Semantic Encoding and Syntax-Aware LSTM for Fake Electronic News Detection

Hamza Murad Khan1, Shakila Basheer2, Mohammad Tabrez Quasim3, Raja`a Al-Naimi4, Vijaykumar Varadarajan5, Anwar Khan1,*

1 Department of Electronics, University of Peshawar, Peshawar, 25120, Pakistan
2 Department of Information Systems, College of Computer and Information Science, Princess Nourah bint Abdulrahman University, P.O. Box 84428, Riyadh, 11671, Saudi Arabia
3 Department of Computer Science and Artificial Intelligence, College of Computing and Information Technology, University of Bisha, P.O. Box 551, Bisha, 61922, Saudi Arabia
4 Department of Mathematics, University of Petra, Amman, 1199, Jordan
5 School of Computer Science, University of Technology, Sydney, 2007, Australia

* Corresponding Author: Anwar Khan. Email: email

Computers, Materials & Continua 2026, 86(1), 1-25. https://doi.org/10.32604/cmc.2025.069327

Abstract

With the rapid growth of online news, fake electronic news detection has become one of the most important paradigms of modern research. Traditional detection techniques generally struggle with contextual understanding, sequential dependencies, and/or data imbalance, which makes distinguishing between genuine and fabricated news a challenging task. To address this problem, we propose a novel hybrid architecture, T5-SA-LSTM, which synergistically integrates the T5 Transformer, for semantically rich contextual embeddings, with a Self-Attention-enhanced (SA) Long Short-Term Memory (LSTM) network. The LSTM is trained using the Adam optimizer, which provides faster and more stable convergence than Stochastic Gradient Descent (SGD) and Root Mean Square Propagation (RMSProp). The WELFake and FakeNewsPrediction datasets are used, which consist of labeled news articles containing both fake and real samples. Tokenization and the Synthetic Minority Over-sampling Technique (SMOTE) are applied during data preprocessing to ensure linguistic normalization and to mitigate class imbalance. The incorporation of the Self-Attention (SA) mechanism enables the model to highlight critical words and phrases, thereby enhancing predictive accuracy. The proposed model is evaluated using accuracy, precision, recall (sensitivity), and F1-score as performance metrics. It achieved 99% accuracy on the WELFake dataset and 96.5% accuracy on the FakeNewsPrediction dataset, outperforming competitive schemes such as T5-SA-LSTM (RMSProp), T5-SA-LSTM (SGD), and other baseline models.
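A minimal sketch, not the authors' released code, of the pipeline outlined in the abstract: a pretrained T5 encoder supplies semantically rich token embeddings, a bidirectional LSTM models sequential dependencies, a self-attention layer re-weights the sequence to highlight critical words, and the classifier is trained with Adam. The checkpoint name (t5-small), hidden sizes, pooling strategy, and label encoding are illustrative assumptions, not details taken from the paper; the SMOTE balancing step described in the abstract is omitted here for brevity.

import torch
import torch.nn as nn
from transformers import T5Tokenizer, T5EncoderModel

class T5SALSTM(nn.Module):
    """Hedged sketch of a T5 + self-attention-enhanced LSTM classifier."""
    def __init__(self, t5_name="t5-small", lstm_hidden=128, num_heads=4):
        super().__init__()
        self.encoder = T5EncoderModel.from_pretrained(t5_name)      # semantic contextual embeddings
        d_model = self.encoder.config.d_model
        self.lstm = nn.LSTM(d_model, lstm_hidden,
                            batch_first=True, bidirectional=True)    # sequential dependencies
        self.attn = nn.MultiheadAttention(2 * lstm_hidden, num_heads,
                                          batch_first=True)          # self-attention over LSTM states
        self.classifier = nn.Linear(2 * lstm_hidden, 2)              # fake vs. real

    def forward(self, input_ids, attention_mask):
        emb = self.encoder(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        seq, _ = self.lstm(emb)
        ctx, _ = self.attn(seq, seq, seq,
                           key_padding_mask=~attention_mask.bool())  # ignore padding tokens
        pooled = ctx.mean(dim=1)                                     # simple mean pooling (assumed)
        return self.classifier(pooled)

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5SALSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)            # Adam, per the abstract
batch = tokenizer(["Example headline to classify."],
                  return_tensors="pt", padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])
loss = nn.CrossEntropyLoss()(logits, torch.tensor([1]))              # 1 = real, 0 = fake (assumed labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()

In the described pipeline, class balancing (e.g., SMOTE from the imbalanced-learn package) would be applied to the training data before fitting, and the same evaluation metrics reported in the abstract (accuracy, precision, recall, F1-score) would be computed on a held-out split.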

Keywords

Fake news detection; tokenization; SMOTE; text-to-text transfer transformer (T5); long short-term memory (LSTM); self-attention mechanism (SA); T5-SA-LSTM; WELFake dataset; FakeNewsPrediction dataset

Cite This Article

APA Style
Khan, H.M., Basheer, S., Tabrez Quasim, M., Al-Naimi, R., Varadarajan, V. et al. (2026). A Transformer-Based Deep Learning Framework with Semantic Encoding and Syntax-Aware LSTM for Fake Electronic News Detection. Computers, Materials & Continua, 86(1), 1–25. https://doi.org/10.32604/cmc.2025.069327
Vancouver Style
Khan HM, Basheer S, Tabrez Quasim M, Al-Naimi R, Varadarajan V, Khan A. A Transformer-Based Deep Learning Framework with Semantic Encoding and Syntax-Aware LSTM for Fake Electronic News Detection. Comput Mater Contin. 2026;86(1):1–25. https://doi.org/10.32604/cmc.2025.069327
IEEE Style
H. M. Khan, S. Basheer, M. Tabrez Quasim, R. Al-Naimi, V. Varadarajan, and A. Khan, “A Transformer-Based Deep Learning Framework with Semantic Encoding and Syntax-Aware LSTM for Fake Electronic News Detection,” Comput. Mater. Contin., vol. 86, no. 1, pp. 1–25, 2026. https://doi.org/10.32604/cmc.2025.069327



Copyright © 2026 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.