Open Access

ARTICLE

Online News Sentiment Classification Using DistilBERT

Samuel Kofi Akpatsa1,*, Hang Lei1, Xiaoyu Li1, Victor-Hillary Kofi Setornyo Obeng1, Ezekiel Mensah Martey1, Prince Clement Addo2, Duncan Dodzi Fiawoo3

1 School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu, China
2 Faculty of Applied Sciences and Mathematical Education, Akenten Appiah-Menka University of Skills Training and Entrepreneurial Development, Kumasi, Ghana
3 Jasikan College of Education, University of Cape Coast, Jasikan, Ghana

* Corresponding Author: Samuel Kofi Akpatsa. Email: email

Journal of Quantum Computing 2022, 4(1), 1-11. https://doi.org/10.32604/jqc.2022.026658

Abstract

The ability of the pre-trained BERT model to achieve outstanding performance on many Natural Language Processing (NLP) tasks has attracted the attention of researchers in recent times. However, its huge computational and memory requirements have hampered its widespread deployment on devices with limited resources. Knowledge distillation has been shown to produce smaller and faster distilled models with fewer trainable parameters, intended for resource-constrained environments. These distilled models can be fine-tuned with great performance on a wide range of tasks, such as sentiment classification. This paper evaluates the performance of the DistilBERT model and other pre-canned text classifiers on a Covid-19 online news binary classification dataset. The analysis shows that, despite having fewer trainable parameters than the BERT-base model, the DistilBERT model achieved an accuracy of 0.94 on the validation set after only two training epochs. The paper also highlights the usefulness of the ktrain library in facilitating the building, training, and application of state-of-the-art Machine Learning and Deep Learning models.
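The knowledge-distillation objective the abstract refers to is not spelled out on this page. A minimal sketch of the standard formulation, in which the student is trained against the teacher's temperature-softened output distribution blended with the usual hard-label cross-entropy, might look as follows in plain Python; the function names, the temperature, and the blending weight `alpha` are illustrative choices, not values taken from the paper:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher T yields a softer distribution."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=2.0, alpha=0.5):
    """Blend of soft (teacher-matching) and hard (label) cross-entropy losses."""
    # Soft loss: cross-entropy between the teacher's and the student's
    # temperature-softened distributions.
    teacher_probs = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    soft_loss = -sum(t * math.log(s)
                     for t, s in zip(teacher_probs, student_probs))
    # Hard loss: standard cross-entropy against the ground-truth label.
    hard_probs = softmax(student_logits)
    hard_loss = -math.log(hard_probs[true_label])
    # The T^2 factor rescales the soft-loss gradients so the two terms
    # stay comparable as the temperature changes.
    return alpha * (temperature ** 2) * soft_loss + (1 - alpha) * hard_loss
```

A student whose logits already match the teacher's incurs only the teacher distribution's entropy as soft loss, so the total loss is lower than for a mismatched student, which is what drives the student toward the teacher's behavior during training.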

Cite This Article

S. K. Akpatsa, H. Lei, X. Li, V. K. Setornyo Obeng, E. M. Martey et al., "Online news sentiment classification using DistilBERT," Journal of Quantum Computing, vol. 4, no. 1, pp. 1–11, 2022. https://doi.org/10.32604/jqc.2022.026658



This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.