Open Access


Keyphrase Generation Based on Self-Attention Mechanism

Kehua Yang1,*, Yaodong Wang1, Wei Zhang1, Jiqing Yao2, Yuquan Le1

1 College of Computer Science and Electronic Engineering and Key Laboratory for Embedded and Network Computing of Hunan Province, Hunan University, Changsha, 410082, China.
2 Oath Verizon Company, Manhattan, New York, 10007, USA.

*Corresponding Author: Kehua Yang.

Computers, Materials & Continua 2019, 61(2), 569-581.


Keyphrases provide summarized and valuable information. This information helps us not only understand text semantics but also organize and retrieve text content effectively. The task of automatically generating keyphrases has received considerable attention in recent decades. Previous studies offer several workable approaches. One method divides the content to be summarized into multiple blocks of text, then ranks them and selects the most important content. The disadvantage of this method is that it cannot identify keyphrases that do not appear in the text, let alone capture the real semantic meaning hidden in the text. Another approach uses recurrent neural networks to generate keyphrases from the semantic aspects of the text, but their inherently sequential nature precludes parallelization within training examples, and long distances limit the context dependencies they can capture. Previous works have demonstrated the benefits of the self-attention mechanism, which can learn global text dependency features and can be parallelized. Inspired by these observations, we propose a keyphrase generation model based entirely on the self-attention mechanism. It is an encoder-decoder model that effectively overcomes the above disadvantages. In addition, we also consider the semantic similarity between keyphrases and add a semantic similarity processing module to the model. Empirical analysis on five datasets demonstrates that the proposed model achieves competitive performance compared to baseline methods.
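The self-attention mechanism the abstract refers to can be sketched as scaled dot-product attention, in which each token attends to every other token in the sequence, giving global dependency features that can be computed in parallel. The following is a minimal NumPy illustration under that assumption; it is not the authors' implementation, and all names and dimensions are illustrative:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X.

    X:  (seq_len, d_model) token representations
    Wq, Wk, Wv: (d_model, d_k) learned projection matrices
    Returns the attended outputs and the attention weight matrix.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Pairwise scores between every position and every other position,
    # scaled to keep softmax gradients stable.
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax: each row is a distribution over all positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 5 tokens with 8-dimensional representations.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape, weights.shape)  # (5, 8) (5, 5)
```

Because every position's attention weights depend only on the fixed projections of the whole sequence, all rows can be computed at once, which is the parallelization advantage over recurrent models that the abstract notes.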


Cite This Article

APA Style
Yang, K., Wang, Y., Zhang, W., Yao, J., Le, Y. (2019). Keyphrase generation based on self-attention mechanism. Computers, Materials & Continua, 61(2), 569-581.
Vancouver Style
Yang K, Wang Y, Zhang W, Yao J, Le Y. Keyphrase generation based on self-attention mechanism. Comput Mater Contin. 2019;61(2):569-581
IEEE Style
K. Yang, Y. Wang, W. Zhang, J. Yao, and Y. Le "Keyphrase Generation Based on Self-Attention Mechanism," Comput. Mater. Contin., vol. 61, no. 2, pp. 569-581. 2019.

This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.