Search Results (2)
  • Open Access

    ARTICLE

    Terrorism Attack Classification Using Machine Learning: The Effectiveness of Using Textual Features Extracted from GTD Dataset

    Mohammed Abdalsalam1,*, Chunlin Li1, Abdelghani Dahou2, Natalia Kryvinska3

    CMES-Computer Modeling in Engineering & Sciences, Vol.138, No.2, pp. 1427-1467, 2024, DOI:10.32604/cmes.2023.029911

    Abstract One of the biggest dangers to society today is terrorism, where attacks have become one of the most significant risks to international peace and national security. Big data, information analysis, and artificial intelligence (AI) have become the basis for making strategic decisions in many sensitive areas, such as fraud detection, risk management, medical diagnosis, and counter-terrorism. However, there is still a need to assess how terrorist attacks are related, initiated, and detected. For this purpose, we propose a novel framework for classifying and predicting terrorist attacks. The proposed framework posits that neglected text attributes included in the Global Terrorism Database…
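The abstract above describes classifying attacks from textual features. As a rough illustration of that general idea (not the authors' framework, feature set, or GTD preprocessing), a minimal text-classification pipeline might look like the following; the tiny corpus and labels are invented for the sketch:

```python
# Illustrative only: a generic textual-feature classifier in the spirit of
# attack-type classification. Data below is made up for demonstration and
# does not come from the GTD dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "explosive device detonated near a government building",
    "armed assault on a military checkpoint",
    "car bomb exploded in a crowded market",
    "gunmen opened fire on a police convoy",
]
labels = ["bombing", "armed_assault", "bombing", "armed_assault"]

# TF-IDF features feeding a linear classifier; the paper's actual method
# uses richer text attributes and models than this sketch.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["a bomb went off near the embassy"])[0])
```

The pipeline object keeps vectorization and classification coupled, so the same vocabulary learned at fit time is reused at prediction time.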

  • Open Access

    ARTICLE

    Online News Sentiment Classification Using DistilBERT

    Samuel Kofi Akpatsa1,*, Hang Lei1, Xiaoyu Li1, Victor-Hillary Kofi Setornyo Obeng1, Ezekiel Mensah Martey1, Prince Clement Addo2, Duncan Dodzi Fiawoo3

    Journal of Quantum Computing, Vol.4, No.1, pp. 1-11, 2022, DOI:10.32604/jqc.2022.026658

    Abstract The ability of the pre-trained BERT model to achieve outstanding performances on many Natural Language Processing (NLP) tasks has attracted the attention of researchers in recent times. However, the huge computational and memory requirements have hampered its widespread deployment on devices with limited resources. The concept of knowledge distillation has been shown to produce smaller and faster distilled models with fewer trainable parameters, intended for resource-constrained environments. The distilled models can be fine-tuned with great performance on a wider range of tasks, such as sentiment classification. This paper evaluates the performance of the DistilBERT model and other pre-canned text classifiers on a…
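The abstract rests on knowledge distillation, where a student network is trained to match a teacher's temperature-softened output distribution. A minimal sketch of that loss term (the logits below are made-up numbers, not outputs of any real BERT or DistilBERT model):

```python
# Illustrative sketch of the distillation loss idea behind DistilBERT:
# KL divergence between temperature-softened teacher and student outputs,
# scaled by T^2 as in Hinton et al.'s formulation.
import numpy as np

def softmax(z, T=1.0):
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()            # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=2.0):
    p = softmax(teacher_logits, T)   # soft targets from the teacher
    q = softmax(student_logits, T)   # student predictions
    return (T ** 2) * float(np.sum(p * (np.log(p) - np.log(q))))

teacher = [4.0, 1.0, 0.5]   # hypothetical teacher logits
student = [3.0, 1.5, 0.5]   # hypothetical student logits
print(distillation_loss(teacher, student))
```

A higher temperature T spreads probability mass across classes, exposing the teacher's "dark knowledge" about inter-class similarity; the loss is zero only when the student exactly matches the teacher's softened distribution.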
