TY - EJOUR
AU - Wulamu, Aziguli
AU - Sun, Zhenqi
AU - Xie, Yonghong
AU - Xu, Cong
AU - Yang, Alan
TI - An Improved End-to-End Memory Network for QA Tasks
T2 - Computers, Materials & Continua
PY - 2019
VL - 60
IS - 3
SN - 1546-2226
AB - End-to-End trainable Memory Networks (MemN2N) have proven promising in many deep learning fields, especially on simple natural-language reasoning question-and-answer (QA) tasks. However, they remain challenged by subtasks such as basic induction, path finding, and time reasoning, owing to a limited ability to learn useful relations between memory and query. Motivated by the success of attention mechanisms in neural machine translation, we propose MemN2N-GL, a novel end-to-end memory network based on gated linear units (GLU) and local attention. The model shows an improved ability to capture complex memory-query relations and performs better on these subtasks, yielding an improved end-to-end memory network for QA tasks. We demonstrate the effectiveness of these approaches on the bAbI dataset, which comprises 20 challenging tasks, without the use of any domain knowledge. Our project is open source on GitHub.
KW - QA system
KW - memory network
KW - local attention
KW - gated linear unit
DO - 10.32604/cmc.2019.07722
ER - 