
Open Access

ARTICLE

FedCCM: Communication-Efficient Federated Learning via Clustered Client Momentum in Non-IID Settings

Hang Wen1,2, Kai Zeng1,2,*
1 Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming, 650500, China
2 Yunnan Key Laboratory of Computer Technologies Application, Kunming, 650500, China
* Corresponding Author: Kai Zeng
(This article belongs to the Special Issue: Omnipresent AI in the Cloud Era Reshaping Distributed Computation and Adaptive Systems for Modern Applications)

Computers, Materials & Continua. DOI: https://doi.org/10.32604/cmc.2025.072909

Received 06 September 2025; Accepted 06 November 2025; Published online 01 December 2025

Abstract

Federated learning often suffers from slow and unstable convergence due to edge-side data heterogeneity. The problem becomes more severe when the edge participation rate is low, as the information collected from different edge devices varies significantly; communication overhead then increases, which further slows convergence. To address this challenge, we propose a simple yet effective federated learning framework that improves consistency among edge devices. The core idea is to cluster the lookahead gradients collected from edge devices on the cloud server to obtain personalized momentum for steering local updates. In parallel, a global momentum is applied during model aggregation, enabling faster convergence while preserving personalization. This strategy efficiently propagates the estimated global update direction to all participating edge devices and keeps local training aligned, without introducing extra memory or communication overhead. We conduct extensive experiments on benchmark datasets such as CIFAR-100 and Tiny-ImageNet, and the results confirm the effectiveness of our framework. On CIFAR-100, our method reaches 55% accuracy in 37 fewer rounds and achieves a competitive final accuracy of 65.46%. Even under extreme non-IID scenarios, it delivers significant improvements in both accuracy and communication efficiency. The implementation is publicly available at (accessed on 20 October 2025).
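
To make the server-side idea in the abstract concrete, below is a minimal, hypothetical sketch of one round of clustered client momentum: the server clusters the clients' lookahead gradients (their local update directions), maintains one momentum buffer per cluster to send back as personalized momentum, and applies a separate global momentum at aggregation. All class and variable names, the use of k-means, and the hyperparameters are illustrative assumptions, not the paper's exact algorithm; see the published implementation for the authoritative version.

```python
# Illustrative sketch only; names, k-means clustering, and hyperparameters
# are assumptions made for exposition, not FedCCM's exact specification.
import numpy as np
from sklearn.cluster import KMeans

class ClusteredMomentumServer:
    def __init__(self, dim, n_clusters=3, beta_cluster=0.9, beta_global=0.9, lr=1.0):
        self.global_model = np.zeros(dim)
        self.global_momentum = np.zeros(dim)          # momentum used at aggregation
        self.cluster_momentum = np.zeros((n_clusters, dim))  # personalized momentum
        self.n_clusters = n_clusters
        self.beta_cluster = beta_cluster
        self.beta_global = beta_global
        self.lr = lr

    def round(self, client_deltas):
        """client_deltas: (n_clients, dim) lookahead gradients, i.e. each
        client's local model change this round. Returns, per client, the
        momentum of its cluster, to steer the next local update."""
        deltas = np.asarray(client_deltas)

        # Group clients whose update directions agree.
        labels = KMeans(n_clusters=self.n_clusters, n_init=10).fit_predict(deltas)

        # Update each cluster's momentum from its members' mean direction.
        for c in range(self.n_clusters):
            members = deltas[labels == c]
            if len(members) > 0:
                self.cluster_momentum[c] = (
                    self.beta_cluster * self.cluster_momentum[c]
                    + (1.0 - self.beta_cluster) * members.mean(axis=0)
                )

        # Global momentum over the average update, applied during aggregation.
        self.global_momentum = self.beta_global * self.global_momentum + deltas.mean(axis=0)
        self.global_model += self.lr * self.global_momentum

        # Each client receives its own cluster's momentum (no extra state per client).
        return self.cluster_momentum[labels]
```

In this reading, the only extra server state is one momentum vector per cluster plus one global vector, and each client downloads a single momentum vector alongside the model, which is consistent with the abstract's claim of no additional memory or communication overhead per client.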

Graphical Abstract

[Figure: graphical abstract of FedCCM, communication-efficient federated learning via clustered client momentum in non-IID settings]

Keywords

Federated learning; distributed computation; communication efficiency; momentum clustering; non-independent and identically distributed (non-IID)