Search Results (2)
  • Open Access

    ARTICLE

    Attention Weight is Indispensable in Joint Entity and Relation Extraction

    Jianquan Ouyang, Jing Zhang, Tianming Liu

    Intelligent Automation & Soft Computing, Vol.34, No.3, pp. 1707-1723, 2022, DOI:10.32604/iasc.2022.028352

    Abstract: Joint entity and relation extraction (JERE) is an important foundation for unstructured knowledge extraction in natural language processing (NLP). Thus, designing efficient algorithms for it has become a vital task. Although existing methods can efficiently extract entities and relations, their performance should be improved. In this paper, we propose a novel model called Attention and Span-based Entity and Relation Transformer (ASpERT) for JERE. First, differing from the traditional approach that only considers the last hidden layer as the feature embedding, ASpERT concatenates the attention head information of each layer with the information of the last hidden layer by using an…
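
    A minimal sketch of the embedding idea this abstract describes: concatenating per-layer attention-head information with the last hidden layer before pooling a candidate span. It assumes a Hugging Face BERT encoder; the model name, the averaging of attention weights, and the max-pooling step are illustrative choices, not the paper's actual implementation.

```python
# Sketch only (not the authors' code): build token features that combine
# per-layer attention information with the final hidden layer, then pool a span.
import torch
from transformers import BertModel, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)

text = "Joint entity and relation extraction with attention features."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

last_hidden = outputs.last_hidden_state        # (1, seq_len, hidden)
attentions = torch.stack(outputs.attentions)   # (layers, 1, heads, seq, seq)

# One simple way to turn attention into per-token features: average the
# attention each token receives, per layer and head, then flatten.
attn_feat = attentions.mean(dim=-2)            # (layers, 1, heads, seq)
attn_feat = attn_feat.permute(1, 3, 0, 2)      # (1, seq, layers, heads)
attn_feat = attn_feat.flatten(start_dim=2)     # (1, seq, layers * heads)

# Concatenate attention-derived features with the last hidden layer,
# in the spirit of the "attention + last hidden layer" embedding above.
token_repr = torch.cat([last_hidden, attn_feat], dim=-1)

# A candidate span (token indices here are arbitrary) is then represented
# by pooling its token representations, e.g. max pooling as in span-based models.
span_repr = token_repr[:, 1:4, :].max(dim=1).values
print(span_repr.shape)                         # (1, hidden + layers * heads)
```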

  • Open Access

    ARTICLE

    A Knowledge-Enriched and Span-Based Network for Joint Entity and Relation Extraction

    Kun Ding, Shanshan Liu, Yuhao Zhang, Hui Zhang, Xiaoxiong Zhang, Tongtong Wu, Xiaolei Zhou

    CMC-Computers, Materials & Continua, Vol.68, No.1, pp. 377-389, 2021, DOI:10.32604/cmc.2021.016301

    Abstract: The joint extraction of entities and their relations from certain texts plays a significant role in most natural language processes. For entity and relation extraction in a specific domain, we propose a hybrid neural framework consisting of two parts: a span-based model and a graph-based model. The span-based model can tackle overlapping problems compared with BILOU methods, whereas the graph-based model treats relation prediction as graph classification. Our main contribution is to incorporate external lexical and syntactic knowledge of a specific domain, such as domain dictionaries and dependency structures from texts, into end-to-end neural models. We conducted extensive experiments on…
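
    A small illustrative sketch of two points from this abstract: span enumeration can represent overlapping entities that a single BILOU tag sequence cannot, and an external domain dictionary can be attached as a span-level feature. The sentence, dictionary entries, and span-length limit below are made-up assumptions, not taken from the paper.

```python
# Sketch only (not the paper's implementation): enumerate candidate spans and
# attach a domain-dictionary feature; overlapping matches coexist, which a
# single BILOU tag sequence could not represent.
tokens = ["acute", "myeloid", "leukemia", "cells"]
domain_dictionary = {
    ("acute", "myeloid", "leukemia"): "Disease",
    ("leukemia", "cells"): "CellType",
}

MAX_SPAN_LEN = 4

def enumerate_spans(tokens, max_len):
    """Yield (start, end) token spans up to max_len tokens (end exclusive)."""
    for start in range(len(tokens)):
        for end in range(start + 1, min(start + max_len, len(tokens)) + 1):
            yield (start, end)

spans = list(enumerate_spans(tokens, MAX_SPAN_LEN))

# Dictionary-based span feature: the entity type if the span matches a
# domain dictionary entry; non-matching spans are simply left out here.
matches = {
    (s, e): domain_dictionary[tuple(tokens[s:e])]
    for (s, e) in spans
    if tuple(tokens[s:e]) in domain_dictionary
}
print(matches)  # {(0, 3): 'Disease', (2, 4): 'CellType'} -- overlapping spans
```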
