Open Access

ARTICLE

Efficient Parameterization for Knowledge Graph Embedding Using Hierarchical Attention Network

Zhen-Yu Chen1, Feng-Chi Liu2, Xin Wang3, Cheng-Hsiung Lee1, Ching-Sheng Lin1,*

1 Master Program of Digital Innovation, Tunghai University, Taichung, 40704, Taiwan
2 Department of Statistics, Feng Chia University, Taichung, 40724, Taiwan
3 College of Integrated Health Sciences and the AI Plus Institute, The University at Albany, State University of New York (SUNY), Albany, NY 12222, USA

* Corresponding Author: Ching-Sheng Lin

Computers, Materials & Continua 2025, 82(3), 4287-4300. https://doi.org/10.32604/cmc.2025.061661

Abstract

In the domain of knowledge graph embedding, conventional approaches typically transform entities and relations into continuous vector spaces. However, parameter efficiency becomes increasingly crucial when dealing with large-scale knowledge graphs that contain vast numbers of entities and relations. In particular, resource-intensive embeddings often lead to increased computational costs and may limit scalability and adaptability in practical environments, such as low-resource settings or real-world applications. This paper explores an approach to knowledge graph representation learning that leverages small reserved sets of entities and relations for parameter-efficient embedding. We introduce a hierarchical attention network designed to maximize the representational quality of embeddings by selectively attending over these reserved sets, thereby reducing model complexity. Empirical assessments validate that our model achieves high performance on the benchmark dataset with fewer parameters and smaller embedding dimensions. The ablation studies further highlight the impact and contribution of each component in the proposed hierarchical attention structure.
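To make the parameter-saving idea concrete, the following is a minimal NumPy sketch of one way a small reserved entity set can parameterize a much larger vocabulary: each entity stores only a few anchor indices into the reserved set plus attention logits over them, rather than a full embedding vector. All sizes and names here are illustrative assumptions; the paper's actual model uses a hierarchical attention network, which this single-level sketch does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(0)

n_entities = 10_000   # full entity vocabulary (hypothetical size)
n_reserved = 64       # small reserved ("anchor") entity set
dim = 32              # embedding dimension
k = 4                 # anchors per entity, with k << dim

# Only the reserved set carries full trainable embeddings.
reserved = rng.normal(size=(n_reserved, dim))

# Each entity keeps k anchor ids and k attention logits instead of
# a dim-sized vector: 2k numbers per entity rather than dim.
anchor_ids = rng.integers(0, n_reserved, size=(n_entities, k))
anchor_logits = rng.normal(size=(n_entities, k))

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def embed(entity_ids):
    """Compose entity embeddings by attending over each entity's anchors."""
    anchors = reserved[anchor_ids[entity_ids]]        # (batch, k, dim)
    weights = softmax(anchor_logits[entity_ids])      # (batch, k)
    return (weights[..., None] * anchors).sum(axis=1) # (batch, dim)

vecs = embed(np.array([0, 1, 2]))
print(vecs.shape)  # (3, 32)
```

Under these assumptions the per-entity cost drops from `dim` floats to `2k` numbers, which is the kind of trade-off that makes attention over a reserved set attractive for large graphs.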

Keywords

Knowledge graph embedding; parameter efficiency; representation learning; reserved entity and relation sets; hierarchical attention network

Cite This Article

APA Style
Chen, Z., Liu, F., Wang, X., Lee, C., Lin, C. (2025). Efficient parameterization for knowledge graph embedding using hierarchical attention network. Computers, Materials & Continua, 82(3), 4287–4300. https://doi.org/10.32604/cmc.2025.061661
Vancouver Style
Chen Z, Liu F, Wang X, Lee C, Lin C. Efficient parameterization for knowledge graph embedding using hierarchical attention network. Comput Mater Contin. 2025;82(3):4287–4300. https://doi.org/10.32604/cmc.2025.061661
IEEE Style
Z. Chen, F. Liu, X. Wang, C. Lee, and C. Lin, “Efficient Parameterization for Knowledge Graph Embedding Using Hierarchical Attention Network,” Comput. Mater. Contin., vol. 82, no. 3, pp. 4287–4300, 2025. https://doi.org/10.32604/cmc.2025.061661



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.