Search Results (3)
  • Open Access

    ARTICLE

    Classification of Conversational Sentences Using an Ensemble Pre-Trained Language Model with the Fine-Tuned Parameter

    R. Sujatha, K. Nimala*

    CMC-Computers, Materials & Continua, Vol.78, No.2, pp. 1669-1686, 2024, DOI:10.32604/cmc.2023.046963

    Abstract: Sentence classification is the process of categorizing a sentence based on its context. Sentence categorization relies more on semantic cues than tasks such as dependency parsing, which depend more on syntactic elements. Most existing strategies focus on the general semantics of a conversation without considering the context of each sentence, tracking the conversation's progress, or comparing impacts. An ensemble pre-trained language model is employed here to classify sentences from the conversation corpus. The conversational sentences are classified into four categories: information, question, directive, and commission. These classification label sequences are used for analyzing the conversation progress and…

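The abstract above outlines an ensemble of fine-tuned pre-trained language models assigning each conversational sentence one of four dialogue labels. The full method is behind the truncation, so the sketch below is only a plausible reading, not the authors' code: a soft-voting ensemble over two Hugging Face checkpoints, where the checkpoint names, the probability-averaging scheme, and the use of off-the-shelf (rather than already fine-tuned) weights are all assumptions.

```python
# Minimal sketch (not the authors' code): soft-voting ensemble of two
# pre-trained language models for 4-way conversational-sentence
# classification (information / question / directive / commission).
# Checkpoints and the averaging scheme are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = ["information", "question", "directive", "commission"]
CHECKPOINTS = ["bert-base-uncased", "roberta-base"]  # assumed base models


def ensemble_predict(sentence: str) -> str:
    probs = []
    for name in CHECKPOINTS:
        tok = AutoTokenizer.from_pretrained(name)
        model = AutoModelForSequenceClassification.from_pretrained(
            name, num_labels=len(LABELS))  # would be fine-tuned in practice
        enc = tok(sentence, return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = model(**enc).logits
        probs.append(torch.softmax(logits, dim=-1))
    avg = torch.stack(probs).mean(dim=0)  # soft voting across the ensemble
    return LABELS[int(avg.argmax(dim=-1))]


print(ensemble_predict("Could you forward the meeting notes?"))
```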
  • Open Access

    ARTICLE

    Personality Trait Detection via Transfer Learning

    Bashar Alshouha1, Jesus Serrano-Guerrero1,*, Francisco Chiclana2, Francisco P. Romero1, Jose A. Olivas1

    CMC-Computers, Materials & Continua, Vol.78, No.2, pp. 1933-1956, 2024, DOI:10.32604/cmc.2023.046711

    Abstract: Personality recognition plays a pivotal role when developing user-centric solutions such as recommender systems or decision support systems across various domains, including education, e-commerce, and human resources. Traditional machine learning techniques have been broadly employed for personality trait identification; nevertheless, the development of new technologies based on deep learning has led to new opportunities to improve their performance. This study focuses on the capabilities of pre-trained language models such as BERT, RoBERTa, ALBERT, ELECTRA, ERNIE, and XLNet to deal with the task of personality recognition. These models are able to capture structural features from textual content and comprehend a multitude…

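The second abstract describes transfer learning with pre-trained encoders (BERT, RoBERTa, ALBERT, ELECTRA, ERNIE, XLNet) for personality recognition. As a rough illustration of that setup, and not the paper's pipeline, the sketch below frames the task as multi-label classification over the Big Five traits; the trait list, the checkpoint, and the multi-label framing are assumptions.

```python
# Minimal sketch (assumptions, not the paper's pipeline): transfer learning
# for personality trait detection by fine-tuning a pre-trained encoder as a
# multi-label classifier over the Big Five traits.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]          # assumed label set

name = "roberta-base"                              # any of the cited encoders
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(
    name, num_labels=len(TRAITS),
    problem_type="multi_label_classification")     # independent sigmoid per trait

text = "I love meeting new people and trying new things."
enc = tok(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    scores = torch.sigmoid(model(**enc).logits)[0]  # head is untrained here;
                                                    # fine-tune before real use
print({t: round(float(s), 3) for t, s in zip(TRAITS, scores)})
```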
  • Open Access

    ARTICLE

    Vulnerability Detection of Ethereum Smart Contract Based on SolBERT-BiGRU-Attention Hybrid Neural Model

    Guangxia Xu1,*, Lei Liu2, Jingnan Dong3

    CMES-Computer Modeling in Engineering & Sciences, Vol.137, No.1, pp. 903-922, 2023, DOI:10.32604/cmes.2023.026627

    Abstract: In recent years, with the great success of pre-trained language models, the pre-trained BERT model has gradually been applied to the field of source code understanding. However, the time cost of training a language model from scratch is very high, and how to transfer a pre-trained language model to the field of smart contract vulnerability detection is currently an active research direction. In this paper, we propose a hybrid model that detects common vulnerabilities in smart contracts by connecting a lightweight pre-trained BERT language model to a bidirectional gated recurrent unit (BiGRU) model. The downstream neural network adopts…

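The third abstract combines a pre-trained BERT-style encoder with a bidirectional GRU and an attention layer for smart contract vulnerability detection. The sketch below is a generic stand-in for that SolBERT-BiGRU-Attention pipeline: the hidden sizes, the additive attention pooling, the binary output head, and the use of a generic "bert-base-uncased" checkpoint in place of a Solidity-adapted model are all assumptions.

```python
# Minimal sketch (not the paper's SolBERT): a BERT-style encoder feeding a
# bidirectional GRU with attention pooling and a vulnerable / non-vulnerable head.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class BertBiGRUAttention(nn.Module):
    def __init__(self, encoder_name="bert-base-uncased",
                 gru_hidden=128, num_classes=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        self.bigru = nn.GRU(self.encoder.config.hidden_size, gru_hidden,
                            batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * gru_hidden, 1)      # additive attention score
        self.classifier = nn.Linear(2 * gru_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        seq, _ = self.bigru(hidden)                   # (batch, len, 2*hidden)
        weights = torch.softmax(
            self.attn(seq).squeeze(-1).masked_fill(attention_mask == 0, -1e9),
            dim=-1)                                   # mask out padding tokens
        pooled = torch.bmm(weights.unsqueeze(1), seq).squeeze(1)
        return self.classifier(pooled)                # vulnerability logits


tok = AutoTokenizer.from_pretrained("bert-base-uncased")
enc = tok('function withdraw() public { msg.sender.call{value: balance}(""); }',
          return_tensors="pt", truncation=True)
model = BertBiGRUAttention()
print(model(enc["input_ids"], enc["attention_mask"]).shape)  # torch.Size([1, 2])
```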