TY - EJOUR
AU - Wang, Hefei
AU - Gu, Ruichun
AU - Wang, Jingyu
AU - Zhang, Xiaolin
AU - Wei, Hui
TI - GFL-SAR: Graph Federated Collaborative Learning Framework Based on Structural Amplification and Attention Refinement
T2 - Computers, Materials & Continua
PY - 2026
VL - 86
IS - 1
SN - 1546-2226
AB - Graph Federated Learning (GFL) has shown great potential for privacy protection and distributed intelligence through collaborative training on graph-structured data without sharing raw information. However, existing GFL approaches often lack the capability for comprehensive feature extraction and adaptive optimization, particularly in non-independent and identically distributed (non-IID) scenarios, where balancing global structural understanding with local node-level detail remains a challenge. To address this, this paper proposes a novel framework called GFL-SAR (Graph Federated Collaborative Learning Framework Based on Structural Amplification and Attention Refinement), which enhances the representation learning capability for graph data through a dual-branch collaborative design. Specifically, we propose the Structural Insight Amplifier (SIA), which utilizes an improved Graph Convolutional Network (GCN) to strengthen structural awareness and improve the modeling of topological patterns. In parallel, we propose the Attentive Relational Refiner (ARR), which employs an enhanced Graph Attention Network (GAT) to perform fine-grained modeling of node relationships and neighborhood features, thereby improving the expressiveness of local interactions and preserving critical contextual information. GFL-SAR integrates multi-scale features from both branches via feature fusion and federated optimization, thereby addressing the limitations of existing GFL methods in structural modeling and feature representation. Experiments on standard benchmark datasets, including Cora, Citeseer, Polblogs, and Cora_ML, demonstrate that GFL-SAR achieves superior classification accuracy, convergence speed, and robustness compared to existing methods, confirming its effectiveness and generalizability in GFL tasks.
KW - Graph federated learning
KW - GCN
KW - GNNs
KW - attention mechanism
DO - 10.32604/cmc.2025.069251
ER -