Open Access
REVIEW
GNN: Core Branches, Integration Strategies and Applications
1 School of Automation, University of Electronic Science and Technology of China, Chengdu, 611731, China
2 School of the Environment, The University of Queensland, Brisbane St Lucia, QLD 4072, Australia
3 Department of Geography, Texas A&M University, College Station, TX 77843, USA
4 School of Artificial Intelligence, Guangzhou Huashang University, Guangzhou, 511300, China
5 School of Biological and Environmental Engineering, Xi’an University, Xi’an, 710065, China
6 Department of Hydrology and Atmospheric Sciences, University of Arizona, Tucson, AZ 85721, USA
* Corresponding Authors: Feng Bao. Email: ; Lirong Yin. Email:
(This article belongs to the Special Issue: The Collection of the Latest Reviews on Advances and Challenges in AI)
Computer Modeling in Engineering & Sciences 2026, 146(1), 5 https://doi.org/10.32604/cmes.2025.075741
Received 07 November 2025; Accepted 15 December 2025; Issue published 29 January 2026
Abstract
Graph Neural Networks (GNNs), a deep learning framework designed for graph-structured data, achieve deep representation learning through message-passing mechanisms and have become a core technology in graph analysis. However, existing reviews of GNN models focus mainly on narrow domains, and a systematic review of the classification and applications of GNN models is lacking. This review systematically synthesizes the three canonical branches of GNNs: the Graph Convolutional Network (GCN), the Graph Attention Network (GAT), and the Graph Sampling and Aggregation Network (GraphSAGE), and analyzes their integration pathways from both structural and feature perspectives. Drawing on representative studies, we identify three major integration patterns: cascaded fusion, in which heterogeneous modules such as Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and GraphSAGE are combined sequentially for hierarchical feature learning; parallel fusion, in which multi-branch architectures jointly encode complementary graph features; and feature-level fusion, which employs concatenation, weighted summation, or attention-based gating to adaptively merge multi-source embeddings. Through these patterns, integrated GNNs achieve enhanced expressiveness, robustness, and scalability across domains including transportation, biomedicine, and cybersecurity.
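The three feature-level fusion strategies named in the abstract (concatenation, weighted summation, and attention-based gating) can be sketched as follows. This is a minimal illustrative implementation, not code from the reviewed paper; the function name, the fixed weight in the summation branch, and the learned gate matrix `W_gate` are all assumptions made for the example.

```python
import numpy as np

def fuse_embeddings(h_a, h_b, W_gate, strategy="gate"):
    """Fuse two node-embedding matrices of shape (n_nodes, d), e.g. the
    outputs of a GCN branch and a GraphSAGE branch, using one of three
    feature-level fusion patterns. Illustrative sketch only."""
    if strategy == "concat":
        # Concatenation: stack features side by side -> (n_nodes, 2d)
        return np.concatenate([h_a, h_b], axis=1)
    if strategy == "sum":
        # Weighted summation with a fixed example weight (in practice
        # the weight is often learned or tuned)
        alpha = 0.5
        return alpha * h_a + (1.0 - alpha) * h_b
    if strategy == "gate":
        # Attention-based gating: a sigmoid gate computed from both
        # branches selects, element-wise, how much of each to keep
        z = np.concatenate([h_a, h_b], axis=1) @ W_gate  # (n_nodes, d)
        g = 1.0 / (1.0 + np.exp(-z))                     # gate in (0, 1)
        return g * h_a + (1.0 - g) * h_b
    raise ValueError(f"unknown strategy: {strategy}")

rng = np.random.default_rng(0)
n, d = 4, 8
h_gcn = rng.normal(size=(n, d))    # hypothetical GCN-branch embeddings
h_sage = rng.normal(size=(n, d))   # hypothetical GraphSAGE-branch embeddings
W = rng.normal(size=(2 * d, d))    # hypothetical gate parameters

print(fuse_embeddings(h_gcn, h_sage, W, "concat").shape)  # (4, 16)
print(fuse_embeddings(h_gcn, h_sage, W, "gate").shape)    # (4, 8)
```

Note the design trade-off the review implies: concatenation preserves all information but doubles the downstream dimensionality, while gating keeps the dimension fixed and lets the model weight each source adaptively per feature.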
Copyright © 2026 The Author(s). Published by Tech Science Press. This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.