Open Access

REVIEW

GNN: Core Branches, Integration Strategies and Applications

Wenfeng Zheng1, Guangyu Xu2, Siyu Lu3, Junmin Lyu4, Feng Bao5,*, Lirong Yin6,*

1 School of Automation, University of Electronic Science and Technology of China, Chengdu, 611731, China
2 School of the Environment, The University of Queensland, Brisbane St Lucia, QLD 4072, Australia
3 Department of Geography, Texas A&M University, College Station, TX 77843, USA
4 School of Artificial Intelligence, Guangzhou Huashang University, Guangzhou, 511300, China
5 School of Biological and Environmental Engineering, Xi’an University, Xi’an, 710065, China
6 Department of Hydrology and Atmospheric Sciences, University of Arizona, Tucson, AZ 85721, USA

* Corresponding Authors: Feng Bao; Lirong Yin

(This article belongs to the Special Issue: The Collection of the Latest Reviews on Advances and Challenges in AI)

Computer Modeling in Engineering & Sciences 2026, 146(1), 5 https://doi.org/10.32604/cmes.2025.075741

Abstract

Graph Neural Networks (GNNs), a deep learning framework designed specifically for graph-structured data, achieve deep representation learning on graphs through message passing mechanisms and have become a core technology in the field of graph analysis. However, existing reviews of GNN models focus mainly on narrower subdomains, and a systematic review of the classification and integration of GNN models is lacking. This review systematically synthesizes the three canonical branches of GNNs: the Graph Convolutional Network (GCN), the Graph Attention Network (GAT), and the Graph Sampling Aggregation Network (GraphSAGE), and analyzes their integration pathways from both structural and feature perspectives. Drawing on representative studies, we identify three major integration patterns: cascaded fusion, where heterogeneous modules such as Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), and GraphSAGE are combined sequentially for hierarchical feature learning; parallel fusion, where multi-branch architectures jointly encode complementary graph features; and feature-level fusion, which employs concatenation, weighted summation, or attention-based gating to adaptively merge multi-source embeddings. Through these patterns, integrated GNNs achieve enhanced expressiveness, robustness, and scalability across domains including transportation, biomedicine, and cybersecurity.
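The attention-based gating mentioned for feature-level fusion can be sketched as follows. This is a minimal illustrative example, not the method of any specific paper surveyed here: the function names, the use of NumPy, and the single scoring vector `w` are all assumptions for the sketch. Each node's embeddings from K branches (e.g., a GCN branch and a GAT branch) are scored, passed through a softmax to obtain per-node gate weights, and combined as a convex sum.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_fusion(embeddings, w):
    """Fuse K same-dimensional node embeddings with attention gates.

    embeddings: list of K arrays, each of shape (num_nodes, dim),
                e.g., outputs of parallel GNN branches.
    w:          (dim,) scoring vector (learnable in practice; fixed here).
    Returns an array of shape (num_nodes, dim).
    """
    H = np.stack(embeddings, axis=1)           # (num_nodes, K, dim)
    scores = H @ w                             # (num_nodes, K)
    alpha = softmax(scores, axis=1)            # per-node gate over sources
    return (alpha[..., None] * H).sum(axis=1)  # weighted sum of branches

# Example: fuse embeddings from two hypothetical branches.
rng = np.random.default_rng(0)
h_gcn = rng.normal(size=(5, 4))   # placeholder GCN-branch output
h_gat = rng.normal(size=(5, 4))   # placeholder GAT-branch output
fused = gated_fusion([h_gcn, h_gat], w=rng.normal(size=4))
```

Concatenation and weighted summation are special cases of this scheme: concatenation keeps all branch outputs side by side instead of summing, and a uniform gate (all scores equal) reduces the weighted sum to a plain average.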

Keywords

Graph neural network (GNN); Graph convolutional network (GCN); Graph attention network (GAT); Graph sampling aggregation network (GraphSAGE); integration

Cite This Article

APA Style
Zheng, W., Xu, G., Lu, S., Lyu, J., Bao, F. et al. (2026). GNN: Core Branches, Integration Strategies and Applications. Computer Modeling in Engineering & Sciences, 146(1), 5. https://doi.org/10.32604/cmes.2025.075741
Vancouver Style
Zheng W, Xu G, Lu S, Lyu J, Bao F, Yin L. GNN: Core Branches, Integration Strategies and Applications. Comput Model Eng Sci. 2026;146(1):5. https://doi.org/10.32604/cmes.2025.075741
IEEE Style
W. Zheng, G. Xu, S. Lu, J. Lyu, F. Bao, and L. Yin, “GNN: Core Branches, Integration Strategies and Applications,” Comput. Model. Eng. Sci., vol. 146, no. 1, pp. 5, 2026. https://doi.org/10.32604/cmes.2025.075741



Copyright © 2026 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.