Search Results (1)
  • Open Access

    ARTICLE

    Attention Weight is Indispensable in Joint Entity and Relation Extraction

Jianquan Ouyang, Jing Zhang, Tianming Liu

    Intelligent Automation & Soft Computing, Vol.34, No.3, pp. 1707-1723, 2022, DOI:10.32604/iasc.2022.028352

    Abstract Joint entity and relation extraction (JERE) is an important foundation for unstructured knowledge extraction in natural language processing (NLP), so designing efficient algorithms for it has become a vital task. Although existing methods can efficiently extract entities and relations, their accuracy still leaves room for improvement. In this paper, we propose a novel model called Attention and Span-based Entity and Relation Transformer (ASpERT) for JERE. First, differing from the traditional approach that only considers the last hidden layer as the feature embedding, ASpERT concatenates the attention head information of each layer with the information of the last hidden layer by using an…
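    The abstract's key idea is augmenting the usual last-hidden-layer token embedding with attention-head information gathered from every layer. A minimal NumPy sketch of one such concatenation follows; the shapes, the mean-pooling of attention columns, and the feature layout are illustrative assumptions, not the paper's exact design:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    num_layers, num_heads, seq_len, hidden = 4, 2, 6, 8

    # Last hidden layer of a transformer: one vector per token.
    last_hidden = rng.normal(size=(seq_len, hidden))

    # Per-layer attention weights: (layers, heads, seq_len, seq_len),
    # rows normalized like a softmax over the "attended-to" axis.
    attn = rng.uniform(size=(num_layers, num_heads, seq_len, seq_len))
    attn /= attn.sum(axis=-1, keepdims=True)

    # For each token, pool the attention it receives in every layer and head
    # (an assumed pooling choice), then flatten that profile per token.
    received = attn.mean(axis=-2)                                  # (layers, heads, seq_len)
    attn_feat = received.transpose(2, 0, 1).reshape(seq_len, -1)   # (seq_len, layers*heads)

    # Concatenate attention features with the last hidden layer, as the
    # abstract describes, instead of using the hidden states alone.
    token_repr = np.concatenate([last_hidden, attn_feat], axis=-1)
    print(token_repr.shape)  # (6, 16)
    ```

    Each token's representation now carries both its contextual embedding and a compact summary of how every layer attended to it, which is the kind of richer span feature the model feeds downstream.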
