TY - EJOU
AU - Sun, Jun
AU - Li, Yan
AU - Shen, Yatian
AU - Ding, Wenke
AU - Shi, Xianjin
AU - Zhang, Lei
AU - Shen, Xiajiong
AU - He, Jing
TI - Joint Self-Attention Based Neural Networks for Semantic Relation Extraction
T2 - Journal of Information Hiding and Privacy Protection
PY - 2019
VL - 1
IS - 2
SN - 2637-4226
AB - Relation extraction is an important task in the NLP community. However, many models fail to capture long-distance semantic dependencies and ignore the interaction between the semantics of the two entities. In this paper, we propose a novel neural network model for semantic relation classification, called joint self-attention Bi-LSTM (SA-Bi-LSTM), which models the internal structure of the sentence to obtain the importance of each word without relying on additional information, and captures long-distance semantic dependencies. We conduct experiments on the SemEval-2010 Task 8 dataset. Extensive experimental results demonstrate that the proposed method is effective for relation classification and achieves state-of-the-art classification accuracy with minimal feature engineering.
KW - Self-attention
KW - relation extraction
KW - neural networks
DO - 10.32604/jihpp.2019.06357
ER - 