Search Results (7)
  • Open Access

    ARTICLE

    FedCW: Client Selection with Adaptive Weight in Heterogeneous Federated Learning

    Haotian Wu1, Jiaming Pei2, Jinhai Li3,*

    CMC-Computers, Materials & Continua, Vol.86, No.1, pp. 1-20, 2026, DOI:10.32604/cmc.2025.069873 - 10 November 2025

    Abstract With the increasing complexity of vehicular networks and the proliferation of connected vehicles, Federated Learning (FL) has emerged as a critical framework for decentralized model training while preserving data privacy. However, efficient client selection and adaptive weight allocation in heterogeneous and non-IID environments remain challenging. To address these issues, we propose Federated Learning with Client Selection and Adaptive Weighting (FedCW), a novel algorithm that leverages adaptive client selection and dynamic weight allocation for optimizing model convergence in real-time vehicular networks. FedCW selects clients based on their Euclidean distance from the global model and dynamically adjusts…
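    The distance-based selection step mentioned in this abstract can be sketched roughly as follows. This is an illustrative sketch only: the function and variable names are hypothetical, and FedCW's actual ranking criterion and adaptive-weighting rules are those defined in the paper.

    ```python
    import numpy as np

    def select_clients(global_params, client_params, k):
        """Rank clients by the Euclidean (L2) distance between their local
        parameters and the global model, keeping the k closest.
        Illustrative sketch; FedCW's actual rule may differ."""
        dists = {cid: np.linalg.norm(params - global_params)
                 for cid, params in client_params.items()}
        return sorted(dists, key=dists.get)[:k]

    # Toy example: three clients, a 4-parameter "model"
    g = np.zeros(4)
    clients = {"a": np.ones(4) * 0.1, "b": np.ones(4) * 2.0, "c": np.ones(4) * 0.5}
    print(select_clients(g, clients, 2))  # ['a', 'c']
    ```

    Selecting clients whose updates lie close to the global model is one common way to damp the client drift that non-IID data induces; the trade-off is that it can under-sample clients holding rare data.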

  • Open Access

    ARTICLE

    FedEPC: An Efficient and Privacy-Enhancing Clustering Federated Learning Method for Sensing-Computing Fusion Scenarios

    Ning Tang1,2, Wang Luo1,2,*, Yiwei Wang1,2, Bao Feng1,2, Shuang Yang1,2, Jiangtao Xu3, Daohua Zhu3, Zhechen Huang3, Wei Liang3

    CMC-Computers, Materials & Continua, Vol.85, No.2, pp. 4091-4113, 2025, DOI:10.32604/cmc.2025.066241 - 23 September 2025

    Abstract With the deep integration of edge computing, 5G and Artificial Intelligence of Things (AIoT) technologies, the large-scale deployment of intelligent terminal devices has given rise to data silos and privacy security challenges in sensing-computing fusion scenarios. Traditional federated learning (FL) algorithms face significant limitations in practical applications due to client drift, model bias, and resource constraints under non-independent and identically distributed (Non-IID) data, as well as the computational overhead and utility loss caused by privacy-preserving techniques. To address these issues, this paper proposes an Efficient and Privacy-enhancing Clustering Federated Learning method (FedEPC). This method introduces…

  • Open Access

    ARTICLE

    Cluster Federated Learning with Intra-Cluster Correction

    Yunong Yang1, Long Ma1, Liang Fan2, Tao Xie3,*

    CMC-Computers, Materials & Continua, Vol.84, No.2, pp. 3459-3476, 2025, DOI:10.32604/cmc.2025.064103 - 03 July 2025

    Abstract Federated learning has emerged as an essential technique for protecting privacy, since it allows clients to train models locally without explicitly exchanging sensitive data. Extensive research has been conducted on the issue of data heterogeneity in federated learning, but effective model training with severely imbalanced label distributions remains an unexplored area. This paper presents a novel Cluster Federated Learning Algorithm with Intra-cluster Correction (CFIC). First, CFIC selects samples from each cluster during each round of sampling, ensuring that no single category of data dominates the model training. Second, in addition to updating local models, CFIC…
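    The per-cluster sampling step this abstract describes can be sketched as below. The names are hypothetical and this is a minimal sketch of balanced cluster sampling, not CFIC's full algorithm (which also includes the intra-cluster correction step).

    ```python
    import random

    def sample_per_cluster(clusters, n_per_cluster, rng=random):
        """Draw an equal number of clients from each cluster so that no
        single cluster (and hence no single data category) dominates a
        training round. Illustrative sketch of CFIC-style sampling."""
        selected = []
        for members in clusters.values():
            selected.extend(rng.sample(members, min(n_per_cluster, len(members))))
        return selected

    clusters = {"c1": ["a", "b", "c"], "c2": ["d", "e"]}
    round_clients = sample_per_cluster(clusters, 2)  # 2 from each cluster
    ```

    Balanced per-cluster draws trade a little statistical efficiency for robustness: even a small cluster holding a rare label class contributes to every round.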

  • Open Access

    ARTICLE

    A Federated Learning Incentive Mechanism for Dynamic Client Participation: Unbiased Deep Learning Models

    Jianfeng Lu1, Tao Huang1, Yuanai Xie2,*, Shuqin Cao1, Bing Li3

    CMC-Computers, Materials & Continua, Vol.83, No.1, pp. 619-634, 2025, DOI:10.32604/cmc.2025.060094 - 26 March 2025

    Abstract The proliferation of deep learning (DL) has amplified the demand for processing large and complex datasets for tasks such as modeling, classification, and identification. However, traditional DL methods compromise client privacy by collecting sensitive data, underscoring the necessity for privacy-preserving solutions like Federated Learning (FL). FL effectively addresses escalating privacy concerns by facilitating collaborative model training without necessitating the sharing of raw data. Given that FL clients autonomously manage training data, encouraging client engagement is pivotal for successful model training. To overcome challenges like unreliable communication and budget constraints, we present ENTIRE, a contract-based dynamic…

  • Open Access

    ARTICLE

    FedAdaSS: Federated Learning with Adaptive Parameter Server Selection Based on Elastic Cloud Resources

    Yuwei Xu, Baokang Zhao*, Huan Zhou, Jinshu Su

    CMES-Computer Modeling in Engineering & Sciences, Vol.141, No.1, pp. 609-629, 2024, DOI:10.32604/cmes.2024.053462 - 20 August 2024

    Abstract The rapid expansion of artificial intelligence (AI) applications has raised significant concerns about user privacy, prompting the development of privacy-preserving machine learning (ML) paradigms such as federated learning (FL). FL enables the distributed training of ML models, keeping data on local devices and thus addressing the privacy concerns of users. However, challenges arise from the heterogeneous nature of mobile client devices, partial engagement of training, and non-independent identically distributed (non-IID) data distribution, leading to performance degradation and optimization objective bias in FL training. With the development of 5G/6G networks and the integration of cloud computing…

  • Open Access

    ARTICLE

    Blockchain-Enabled Federated Learning for Privacy-Preserving Non-IID Data Sharing in Industrial Internet

    Qiuyan Wang, Haibing Dong*, Yongfei Huang, Zenglei Liu, Yundong Gou

    CMC-Computers, Materials & Continua, Vol.80, No.2, pp. 1967-1983, 2024, DOI:10.32604/cmc.2024.052775 - 15 August 2024

    Abstract Sharing data while protecting privacy in the industrial Internet is a significant challenge. Traditional machine learning methods require a combination of all data for training; however, this approach can be limited by data availability and privacy concerns. Federated learning (FL) has gained considerable attention because it allows for decentralized training on multiple local datasets. However, the training data collected by data providers are often non-independent and identically distributed (non-IID), resulting in poor FL performance. This paper proposes a privacy-preserving approach for sharing non-IID data in the industrial Internet using an FL approach based on blockchain…

  • Open Access

    ARTICLE

    A Client Selection Method Based on Loss Function Optimization for Federated Learning

    Yan Zeng1,2,3, Siyuan Teng1, Tian Xiang4,*, Jilin Zhang1,2,3, Yuankai Mu5, Yongjian Ren1,2,3,*, Jian Wan1,2,3

    CMES-Computer Modeling in Engineering & Sciences, Vol.137, No.1, pp. 1047-1064, 2023, DOI:10.32604/cmes.2023.027226 - 23 April 2023

    Abstract Federated learning is a distributed machine learning method that can solve the increasingly serious problems of data islands and user data privacy, as it allows training data to be kept locally and not shared with other users. It trains a global model by aggregating locally-computed models of clients rather than their raw data. However, the divergence of local models caused by the data heterogeneity of different clients may lead to slow convergence of the global model. For this problem, we focus on client selection in federated learning, which can affect the convergence performance of the…
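    A common loss-driven selection heuristic, which this abstract's title suggests, is to prefer clients whose local loss is currently highest, since they stand to move the global model the most. The sketch below is illustrative only; the paper's actual optimization-based criterion may differ, and all names are hypothetical.

    ```python
    def select_by_loss(client_losses, k):
        """Pick the k clients with the largest reported local loss.
        Illustrative loss-based selection heuristic, not the paper's
        exact method."""
        return sorted(client_losses, key=client_losses.get, reverse=True)[:k]

    losses = {"a": 0.5, "b": 2.0, "c": 1.2}
    print(select_by_loss(losses, 2))  # ['b', 'c']
    ```

    In practice, purely greedy loss-based selection is often softened (e.g., by sampling proportionally to loss) to avoid repeatedly picking the same stragglers.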

Displaying results 1-7 of 7.