Search Results (6)
  • Open Access

    ARTICLE

    Discharge Summaries Based Sentiment Detection Using Multi-Head Attention and CNN-BiGRU

    Samer Abdulateef Waheeb*

    Computer Systems Science and Engineering, Vol.46, No.1, pp. 981-998, 2023, DOI:10.32604/csse.2023.035753

    Abstract Automatic extraction of the patient’s health information from unstructured data in the discharge summary remains challenging. Discharge-summary-related documents contain various aspects of the patient's health condition, which can be used to examine the quality of treatment and thereby improve decision-making in the medical field. Using a sentiment dictionary and feature engineering, researchers primarily mine semantic text features; however, choosing and designing features requires considerable manual effort. The proposed approach is an unsupervised deep learning model that learns a set of clusters embedded in the latent space. A composite model including Active Learning (AL), Convolutional Neural Network (CNN), BiGRU,…

  • Open Access

    ARTICLE

    Employing Lexicalized Dependency Paths for Active Learning of Relation Extraction

    Huiyu Sun*, Ralph Grishman

    Intelligent Automation & Soft Computing, Vol.34, No.3, pp. 1415-1423, 2022, DOI:10.32604/iasc.2022.030794

    Abstract Active learning methods which present selected examples from the corpus for annotation provide more efficient learning of supervised relation extraction models, but they leave the developer in the unenviable role of a passive informant. To restore the developer’s proper role as a partner with the system, we must give the developer an ability to inspect the extraction model during development. We propose to make this possible through a representation based on lexicalized dependency paths (LDPs) coupled with an active learner for LDPs. We apply LDPs to both simulated and real active learning with ACE as evaluation and a year’s newswire…

  • Open Access

    ARTICLE

    Design Principles-Based Interactive Learning Tool for Solving Nonlinear Equations

    Ahad Alloqmani1, Omimah Alsaedi1, Nadia Bahatheg1, Reem Alnanih1,*, Lamiaa Elrefaei1,2

    Computer Systems Science and Engineering, Vol.40, No.3, pp. 1023-1042, 2022, DOI:10.32604/csse.2022.019704

    Abstract Interactive learning tools can facilitate the learning process and increase student engagement, especially tools such as computer programs that are designed for human-computer interaction. Thus, this paper aims to help students learn five different methods for solving nonlinear equations using an interactive learning tool designed with common principles such as feedback, visibility, affordance, consistency, and constraints. It also compares these methods by the number of iterations and time required to display the result. This study helps students learn these methods using interactive learning tools instead of relying on traditional teaching methods. The tool is implemented using the MATLAB app and…

  • Open Access

    ARTICLE

    Adversarial Active Learning for Named Entity Recognition in Cybersecurity

    Tao Li1, Yongjin Hu1,*, Ankang Ju1, Zhuoran Hu2

    CMC-Computers, Materials & Continua, Vol.66, No.1, pp. 407-420, 2021, DOI:10.32604/cmc.2020.012023

    Abstract Owing to the continuous barrage of cyber threats, there is a massive amount of cyber threat intelligence. However, a great deal of cyber threat intelligence comes from textual sources, and for its analysis many security analysts rely on cumbersome and time-consuming manual efforts. A cybersecurity knowledge graph plays a significant role in the automatic analysis of cyber threat intelligence. As the foundation for constructing a cybersecurity knowledge graph, named entity recognition (NER) is required for identifying critical threat-related elements from textual cyber threat intelligence. Recently, deep neural network-based models have attained very good results in NER. However, the performance of these…

  • Open Access

    ARTICLE

    MII: A Novel Text Classification Model Combining Deep Active Learning with BERT

    Anman Zhang1, Bohan Li1, 2, 3, *, Wenhuan Wang1, Shuo Wan1, Weitong Chen4

    CMC-Computers, Materials & Continua, Vol.63, No.3, pp. 1499-1514, 2020, DOI:10.32604/cmc.2020.09962

    Abstract Active learning has been widely utilized to reduce the labeling cost of supervised learning. By selecting specific instances to train the model, its performance can be improved within a limited number of steps. However, little work has paid attention to the effectiveness of active learning for text classification. In this paper, we propose a deep active learning model with bidirectional encoder representations from transformers (BERT) for text classification. BERT takes advantage of the self-attention mechanism to integrate contextual information, which is beneficial for accelerating the convergence of training. As for the process of active learning, we design an instance selection strategy based on…

  • Open Access

    ARTICLE

    Analyzing Cross-domain Transportation Big Data of New York City with Semi-supervised and Active Learning

    Huiyu Sun1,*, Suzanne McIntosh1

    CMC-Computers, Materials & Continua, Vol.57, No.1, pp. 1-9, 2018, DOI:10.32604/cmc.2018.03684

    Abstract The majority of big data analytics applied to transportation datasets suffer from being too domain-specific, that is, they draw conclusions for a dataset based on analytics on the same dataset. As a result, models trained on one domain (e.g., taxi data) perform poorly on a different domain (e.g., Uber data). To achieve accurate analyses on a new domain, substantial amounts of data must be available, which limits practical applications. To remedy this, we propose to use semi-supervised and active learning of big data to accomplish the domain adaptation task: selectively choosing a small number of data points from a new domain while…
