Open Access
ARTICLE
Efficient Parameterization for Knowledge Graph Embedding Using Hierarchical Attention Network
1 Master Program of Digital Innovation, Tunghai University, Taichung, 40704, Taiwan
2 Department of Statistics, Feng Chia University, Taichung, 40724, Taiwan
3 College of Integrated Health Sciences and the AI Plus Institute, The University at Albany, State University of New York (SUNY), Albany, NY 12222, USA
* Corresponding Author: Ching-Sheng Lin. Email:
Computers, Materials & Continua 2025, 82(3), 4287-4300. https://doi.org/10.32604/cmc.2025.061661
Received 29 November 2024; Accepted 01 February 2025; Issue published 06 March 2025
Abstract
In the domain of knowledge graph embedding, conventional approaches typically transform entities and relations into continuous vector spaces. However, parameter efficiency becomes increasingly crucial when dealing with large-scale knowledge graphs that contain vast numbers of entities and relations. In particular, resource-intensive embeddings often lead to increased computational costs and may limit scalability and adaptability in practical environments, such as low-resource settings and real-world applications. This paper explores an approach to knowledge graph representation learning that leverages small reserved sets of entities and relations for parameter-efficient embedding. We introduce a hierarchical attention network designed to refine and maximize the representational quality of embeddings by selectively focusing on these reserved sets, thereby reducing model complexity. Empirical assessments validate that our model achieves high performance on the benchmark dataset with fewer parameters and smaller embedding dimensions. The ablation studies further highlight the impact and contribution of each component in the proposed hierarchical attention structure.
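To illustrate the core idea of parameter-efficient embedding over a reserved set, the sketch below (in PyTorch) shows one possible way to compose an entity embedding as a two-level attention mixture over a small shared reserved table. This is a minimal sketch under assumed design choices, not the authors' implementation: the class name `ReservedAttentionEmbedding`, the reserved-set size, the grouping into attention groups, and the small per-entity query vector are all illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's code): entities are represented
# by attending over a small reserved embedding set instead of storing a full
# embedding row per entity. All sizes and names below are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ReservedAttentionEmbedding(nn.Module):
    """Compose entity vectors from a shared reserved (codebook) table via
    a two-level (group-then-member) attention, reducing per-entity parameters."""

    def __init__(self, num_entities, num_reserved=64, dim=32, num_groups=4, query_dim=8):
        super().__init__()
        assert num_reserved % num_groups == 0
        self.num_groups = num_groups
        self.group_size = num_reserved // num_groups
        # Shared reserved set: only num_reserved x dim parameters for all entities.
        self.reserved = nn.Parameter(torch.randn(num_reserved, dim) * 0.02)
        # Small per-entity query (query_dim << dim) used only to compute attention.
        self.query = nn.Embedding(num_entities, query_dim)
        self.to_dim = nn.Linear(query_dim, dim, bias=False)

    def forward(self, entity_ids):
        q = self.to_dim(self.query(entity_ids))                     # (B, dim)
        keys = self.reserved.view(self.num_groups, self.group_size, -1)
        # Lower level: attention over members within each reserved group.
        low_scores = torch.einsum("bd,gkd->bgk", q, keys)           # (B, G, K)
        low_attn = F.softmax(low_scores, dim=-1)
        group_vecs = torch.einsum("bgk,gkd->bgd", low_attn, keys)   # (B, G, dim)
        # Upper level: attention over the group summaries.
        high_scores = torch.einsum("bd,bgd->bg", q, group_vecs)     # (B, G)
        high_attn = F.softmax(high_scores, dim=-1)
        return torch.einsum("bg,bgd->bd", high_attn, group_vecs)    # (B, dim)


if __name__ == "__main__":
    emb = ReservedAttentionEmbedding(num_entities=10_000)
    vecs = emb(torch.tensor([0, 42, 9_999]))
    print(vecs.shape)  # torch.Size([3, 32])
```

Under these assumptions, per-entity storage shrinks from a full `dim`-sized row to a small query vector, while the shared reserved table and attention layers carry most of the representational capacity.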
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.