Search Results (1)
    Open Access Article

    An Improved End-to-End Memory Network for QA Tasks

    Aziguli Wulamu, Zhenqi Sun, Yonghong Xie*, Cong Xu, Alan Yang

    CMC-Computers, Materials & Continua, Vol.60, No.3, pp. 1283-1295, 2019, DOI:10.32604/cmc.2019.07722

    Abstract At present, End-to-End trainable Memory Networks (MemN2N) have proven promising in many deep learning fields, especially on simple natural-language reasoning question-and-answer (QA) tasks. However, subtasks such as basic induction, path finding, and time reasoning remain challenging because of the model's limited ability to learn useful interactions between memory and query. In this paper, we propose a novel gated linear unit (GLU) and local-attention based end-to-end memory network (MemN2N-GL), motivated by the success of attention mechanisms in neural machine translation; it shows an improved possibility to develop the ability…
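    The abstract names two ingredients, gated linear units and local attention over memory, but the paper's own MemN2N-GL details are not shown here. Below is a minimal sketch of how the two could be combined in a single memory hop; the class names, window size, and gated residual update are assumptions for illustration, not the authors' implementation.

    ```python
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GLU(nn.Module):
        """Gated linear unit: project to twice the width, split into value and gate."""
        def __init__(self, dim):
            super().__init__()
            self.proj = nn.Linear(dim, 2 * dim)

        def forward(self, x):
            value, gate = self.proj(x).chunk(2, dim=-1)
            return value * torch.sigmoid(gate)

    class LocalAttentionHop(nn.Module):
        """One memory hop: attention restricted to a window around the best-matching
        memory slot (local attention), followed by a GLU over the gated update."""
        def __init__(self, embed_dim, window=2):
            super().__init__()
            self.window = window
            self.glu = GLU(embed_dim)

        def forward(self, query, memory):
            # query: (batch, dim), memory: (batch, slots, dim)
            scores = torch.bmm(memory, query.unsqueeze(-1)).squeeze(-1)   # (batch, slots)
            center = scores.argmax(dim=-1, keepdim=True)                  # predicted alignment slot
            positions = torch.arange(memory.size(1), device=memory.device).unsqueeze(0)
            mask = (positions - center).abs() <= self.window              # keep only the local window
            scores = scores.masked_fill(~mask, float("-inf"))
            weights = F.softmax(scores, dim=-1)                           # attention within the window
            read = torch.bmm(weights.unsqueeze(1), memory).squeeze(1)     # weighted memory read
            return self.glu(query + read)                                 # gated update of the query

    if __name__ == "__main__":
        hop = LocalAttentionHop(embed_dim=32, window=2)
        q = torch.randn(4, 32)       # batch of 4 query embeddings
        m = torch.randn(4, 10, 32)   # 10 memory slots per example
        print(hop(q, m).shape)       # torch.Size([4, 32])
    ```

    In a multi-hop setup, the output of one hop would serve as the query for the next, as in standard MemN2N; the window size and the choice of a residual connection before the GLU are design choices assumed here, not taken from the paper.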
