
Open Access

REVIEW

GNN: Core Branches, Integration Strategies and Applications

Wenfeng Zheng1, Guangyu Xu2, Siyu Lu3, Junmin Lyu4, Feng Bao5,*, Lirong Yin6,*
1 School of Automation, University of Electronic Science and Technology of China, Chengdu, 611731, China
2 School of the Environment, The University of Queensland, Brisbane St Lucia, QLD 4072, Australia
3 Department of Geography, Texas A&M University, College Station, TX 77843, USA
4 School of Artificial Intelligence, Guangzhou Huashang University, Guangzhou, 511300, China
5 School of Biological and Environmental Engineering, Xi’an University, Xi’an, 710065, China
6 Department of Hydrology and Atmospheric Sciences, University of Arizona, Tucson, AZ 85721, USA
* Corresponding Authors: Feng Bao. Email: Baofeng@xawl.edu.cn; Lirong Yin. Email: lirongy@arizona.edu
(This article belongs to the Special Issue: The Collection of the Latest Reviews on Advances and Challenges in AI)

Computer Modeling in Engineering & Sciences https://doi.org/10.32604/cmes.2025.075741

Received 07 November 2025; Accepted 15 December 2025; Published online 30 December 2025

Abstract

Graph Neural Networks (GNNs), a deep learning framework designed specifically for graph-structured data, achieve deep representation learning on graphs through message passing mechanisms and have become a core technology in graph analysis. However, existing reviews of GNN models tend to focus on narrow subdomains, and a systematic survey of the classification and applications of GNN models is lacking. This review systematically synthesizes the three canonical branches of GNNs: the Graph Convolutional Network (GCN), the Graph Attention Network (GAT), and the Graph Sampling Aggregation Network (GraphSAGE). It then analyzes their integration pathways from both structural and feature perspectives. Drawing on representative studies, we identify three major integration patterns: cascaded fusion, in which heterogeneous modules such as Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM) networks, and GraphSAGE are combined sequentially for hierarchical feature learning; parallel fusion, in which multi-branch architectures jointly encode complementary graph features; and feature-level fusion, which merges multi-source embeddings adaptively via concatenation, weighted summation, or attention-based gating. Through these patterns, integrated GNNs achieve greater expressiveness, robustness, and scalability across domains including transportation, biomedicine, and cybersecurity.
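The three feature-level fusion options named in the abstract can be sketched as follows. This is a minimal NumPy illustration, not code from the reviewed works: the function name, the sigmoid gating projection, and all parameters are our own assumptions, and in practice the gate weights would be learned rather than fixed.

```python
import numpy as np

def fuse_embeddings(h_a, h_b, mode="concat", alpha=0.5):
    """Fuse two per-node embedding matrices of shape (n_nodes, dim).

    Illustrative sketch of the three feature-level fusion patterns:
    concatenation, fixed weighted summation, and attention-style gating.
    """
    if mode == "concat":
        # Concatenation: [h_a || h_b] along the feature axis.
        return np.concatenate([h_a, h_b], axis=1)
    if mode == "weighted":
        # Weighted summation with a fixed mixing coefficient alpha.
        return alpha * h_a + (1 - alpha) * h_b
    if mode == "gate":
        # Attention-based gating: a per-node scalar gate computed from
        # both embeddings (the projection w is random here for brevity;
        # a real model would learn it).
        rng = np.random.default_rng(0)
        w = rng.standard_normal(h_a.shape[1] + h_b.shape[1])
        logits = np.concatenate([h_a, h_b], axis=1) @ w
        g = 1.0 / (1.0 + np.exp(-logits))          # sigmoid gate in (0, 1)
        return g[:, None] * h_a + (1 - g)[:, None] * h_b
    raise ValueError(f"unknown fusion mode: {mode}")
```

Concatenation preserves both embeddings at the cost of doubling the dimension, while weighted summation and gating keep the dimension fixed; the gated variant lets the mixing ratio vary per node instead of being a single global constant.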

Keywords

Graph neural network (GNN); graph convolutional network (GCN); graph attention network (GAT); graph sampling aggregation network (GraphSAGE); integration