Search Results (2)
  • Open Access

    ARTICLE

    Impact of Data Quality on Question Answering System Performances

    Rachid Karra*, Abdelali Lasfar

    Intelligent Automation & Soft Computing, Vol.35, No.1, pp. 335-349, 2023, DOI:10.32604/iasc.2023.026695

    Abstract In contrast with the research on new models, little attention has been paid to the impact of low- or high-quality data feeding a dialogue system. The present paper makes a first attempt to fill this gap by extending our previous work on question-answering (QA) systems, investigating the effect of misspellings on QA agents and how context changes can enhance the responses. Instead of using large language models trained on huge datasets, we propose a method that enhances the model's score by modifying only the quality and structure of the data fed to the model. It is important to identify… (see the first sketch after the results list)

  • Open Access

    ARTICLE

    An Improved End-to-End Memory Network for QA Tasks

    Aziguli Wulamu, Zhenqi Sun, Yonghong Xie*, Cong Xu, Alan Yang

    CMC-Computers, Materials & Continua, Vol.60, No.3, pp. 1283-1295, 2019, DOI:10.32604/cmc.2019.07722

    Abstract At present, the End-to-End trainable Memory Network (MemN2N) has proven promising in many deep learning fields, especially on simple natural-language reasoning question-and-answer (QA) tasks. However, solving subtasks such as basic induction, path finding, or time reasoning remains challenging because of the limited ability to learn useful information between memory and query. In this paper, we propose a novel gated linear unit (GLU) and local-attention based end-to-end memory network (MemN2N-GL), motivated by the success of the attention mechanism in neural machine translation; it shows an improved possibility to develop the ability… (see the second sketch after the results list)
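The first result's core idea, improving or probing a QA system by changing only the data rather than the model, can be illustrated with a small perturbation experiment. The sketch below is purely illustrative and is not the paper's actual pipeline: `qa_model(question, context)` is a hypothetical callable, and the dataset schema (dicts with "question", "context", "answer" keys) is an assumption.

```python
# A toy sketch (not the paper's actual pipeline) of probing a QA system's
# robustness by injecting misspellings into questions, in the spirit of
# changing only the quality of the data fed to the model.
import random

def misspell(text: str, rate: float = 0.1, seed: int = 0) -> str:
    """Corrupt text by swapping a fraction of adjacent letter pairs."""
    rng = random.Random(seed)
    chars = list(text)
    for i in range(len(chars) - 1):
        if chars[i].isalpha() and chars[i + 1].isalpha() and rng.random() < rate:
            chars[i], chars[i + 1] = chars[i + 1], chars[i]
    return "".join(chars)

def accuracy_drop(qa_model, dataset, rate: float = 0.1) -> float:
    """Exact-match accuracy lost when questions are misspelled.

    qa_model(question, context) -> answer string; dataset is a list of
    dicts with "question", "context", and "answer" keys (assumed schema).
    """
    clean = sum(qa_model(ex["question"], ex["context"]) == ex["answer"]
                for ex in dataset)
    noisy = sum(qa_model(misspell(ex["question"], rate), ex["context"]) == ex["answer"]
                for ex in dataset)
    return (clean - noisy) / len(dataset)
```

Sweeping `rate` from 0 to, say, 0.3 would trace how quickly a given QA agent degrades as spelling quality falls, which is the kind of data-side comparison the abstract describes.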
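The second result combines MemN2N with gated linear units (GLU) and local attention. As a rough illustration only, the PyTorch sketch below shows a single memory hop in which a GLU gates how much of the attended memory read updates the query state. It uses plain global attention rather than the paper's local attention, and the layer shapes are illustrative assumptions, not the authors' architecture.

```python
# Rough illustration of one GLU-gated memory hop (not the authors' exact
# MemN2N-GL; global attention stands in for their local attention).
import torch
import torch.nn as nn
import torch.nn.functional as F

class GLUMemoryHop(nn.Module):
    def __init__(self, embed_dim: int):
        super().__init__()
        # Project the memory read to twice the width so F.glu can split it
        # into a value half and a sigmoid gate half.
        self.glu_proj = nn.Linear(embed_dim, 2 * embed_dim)

    def forward(self, memory: torch.Tensor, query: torch.Tensor) -> torch.Tensor:
        # memory: (batch, num_slots, embed_dim); query: (batch, embed_dim)
        scores = torch.bmm(memory, query.unsqueeze(2)).squeeze(2)  # (batch, num_slots)
        attn = F.softmax(scores, dim=1)
        read = torch.bmm(attn.unsqueeze(1), memory).squeeze(1)     # (batch, embed_dim)
        # The GLU gates how much of the memory read flows into the new state.
        return query + F.glu(self.glu_proj(read), dim=-1)

# Smoke test with random tensors:
hop = GLUMemoryHop(64)
out = hop(torch.randn(2, 10, 64), torch.randn(2, 64))  # shape: (2, 64)
```

Stacking several such hops, each refining the query against memory, mirrors the multi-hop structure of MemN2N that the abstract builds on.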
