
Open Access

ARTICLE

Federated Semi-Supervised Learning Based on Feature Space Fusion

Zhe Ding1,2, Hao Yi3,4,*, Wenrui Xie3,4, Ming Zhang1, Yuxuan Xiao1, Qixu Wang1,2, Qing Chen5, Zhiguang Qin1, Dajiang Chen1
1 School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu, China
2 School of Cybersecurity, Chengdu University of Information Technology, Chengdu, China
3 China Electronic Products Reliability and Environmental Testing Research Institute, Guangzhou, China
4 Key Laboratory of the Ministry of Industry and Information Technology for Performance and Reliability Evaluation of Software and Hardware for Information Technology Application Innovation Foundation, Guangzhou, China
5 Accelink Technologies Co., Ltd., Wuhan, China
* Corresponding Author: Hao Yi. Email: email
(This article belongs to the Special Issue: Advances in Deep Learning and Neural Networks: Architectures, Applications, and Challenges)

Computers, Materials & Continua https://doi.org/10.32604/cmc.2026.074244

Received 06 October 2025; Accepted 28 January 2026; Published online 18 February 2026

Abstract

Federated semi-supervised learning (FSSL) has garnered substantial attention for enabling collaborative global model training across multiple clients, addressing both the scarcity of labeled data and the need to preserve data privacy. However, FSSL faces formidable challenges stemming from cross-client data heterogeneity, as existing methods fail to effectively fuse the feature subspaces of distinct clients. To address this issue, we propose a novel FSSL framework, named FedSPQR, which is explicitly tailored for the label-at-server scenario. On the server side, FedSPQR adopts a subspace clustering and fusion method based on the Grassmann manifold to construct a unified global feature space, which is further leveraged to refine the global model. On the client side, the pre-established global feature space acts as a benchmark for aligning the local feature subspaces. On top of the aligned local feature subspaces, self-supervised learning is integrated with knowledge distillation to enable effective local learning and alleviate the local bias caused by data heterogeneity. Extensive experiments on two standard public benchmarks confirm that FedSPQR outperforms state-of-the-art (SOTA) baselines by a significant margin.
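The abstract does not specify which Grassmann-manifold computations FedSPQR uses, but comparing feature subspaces on the Grassmann manifold is conventionally done via principal angles between subspaces. The sketch below is a generic illustration of that standard construction (the function name and the use of the geodesic distance are assumptions, not details from the paper): each subspace is represented by an orthonormal basis, and the principal angles are obtained from the singular values of the product of the two bases.

```python
import numpy as np

def grassmann_distance(U1, U2):
    """Geodesic distance between span(U1) and span(U2) on the
    Grassmann manifold, computed from principal angles.

    U1, U2: (d, k) matrices whose columns span k-dimensional
    subspaces of R^d.
    """
    # Orthonormalize the bases so singular values are valid cosines.
    Q1, _ = np.linalg.qr(U1)
    Q2, _ = np.linalg.qr(U2)
    # Singular values of Q1^T Q2 are the cosines of the principal angles.
    s = np.linalg.svd(Q1.T @ Q2, compute_uv=False)
    # Clip for numerical safety before arccos.
    angles = np.arccos(np.clip(s, 0.0, 1.0))
    # Geodesic distance is the 2-norm of the principal-angle vector.
    return np.linalg.norm(angles)

# Example: the xy-plane and the xz-plane in R^3 share the x-axis
# (one principal angle 0) and are otherwise orthogonal (one angle pi/2).
U1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # xy-plane
U2 = np.array([[1.0, 0.0], [0.0, 0.0], [0.0, 1.0]])  # xz-plane
print(grassmann_distance(U1, U2))  # -> pi/2 ~ 1.5708
```

Distances of this kind can serve as the affinity for clustering client subspaces, since they depend only on the subspaces themselves and not on the particular bases chosen.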

Keywords

Federated semi-supervised learning; feature space fusion; knowledge distillation