Search Results (70)
  • Open Access

    ARTICLE

    Byzantine Robust Federated Learning Scheme Based on Backdoor Triggers

    Zheng Yang, Ke Gu*, Yiming Zuo

    CMC-Computers, Materials & Continua, Vol.79, No.2, pp. 2813-2831, 2024, DOI:10.32604/cmc.2024.050025 - 15 May 2024

Abstract Federated learning is widely used to solve the problem of data decentralization and can provide privacy protection for data owners. However, because federated learning requires multiple participants, it leaves room for attackers to compromise the system. Byzantine attacks pose a great threat to federated learning: Byzantine attackers upload maliciously crafted local models to the server to degrade the prediction performance and training speed of the global model. To defend against Byzantine attacks, we propose a Byzantine robust federated learning scheme based on backdoor triggers. In our scheme, backdoor triggers are embedded into benign data samples, and then malicious…

  • Open Access

    REVIEW

    Federated Learning on Internet of Things: Extensive and Systematic Review

    Meenakshi Aggarwal1, Vikas Khullar1, Sunita Rani2, Thomas André Prola3,4,5, Shyama Barna Bhattacharjee6, Sarowar Morshed Shawon7, Nitin Goyal8,*

    CMC-Computers, Materials & Continua, Vol.79, No.2, pp. 1795-1834, 2024, DOI:10.32604/cmc.2024.049846 - 15 May 2024

Abstract The proliferation of IoT devices requires innovative approaches to gaining insights while preserving privacy and resources amid unprecedented data generation. However, federated learning (FL) development for IoT is still in its infancy, and various areas need to be explored to understand the key challenges for deployment in real-world scenarios. This paper systematically reviews the available literature following the PRISMA guiding principle. The study aims to provide a detailed overview of the increasing use of FL in IoT networks, including the architecture and challenges. A systematic review approach is used to collect, categorize and analyze FL-IoT-based articles…

  • Open Access

    ARTICLE

    FL-EASGD: Federated Learning Privacy Security Method Based on Homomorphic Encryption

    Hao Sun*, Xiubo Chen, Kaiguo Yuan

    CMC-Computers, Materials & Continua, Vol.79, No.2, pp. 2361-2373, 2024, DOI:10.32604/cmc.2024.049159 - 15 May 2024

Abstract Federated learning ensures data privacy and security by sharing models among multiple computing nodes instead of plaintext data. However, a potential risk of privacy leakage remains; for example, attackers can recover the original data through model inference attacks. Safeguarding the privacy of model parameters therefore becomes crucial. One proposed solution incorporates homomorphic encryption algorithms into the federated learning process. However, existing federated learning privacy protection schemes based on homomorphic encryption greatly reduce efficiency and robustness when there are performance differences between parties or abnormal nodes. To solve the above…

  • Open Access

    ARTICLE

    Design Pattern and Challenges of Federated Learning with Applications in Industrial Control System

    Hina Batool1, Jiuyun Xu1,*, Ateeq Ur Rehman2, Habib Hamam3,4,5,6

    Journal on Artificial Intelligence, Vol.6, pp. 105-128, 2024, DOI:10.32604/jai.2024.049912 - 06 May 2024

Abstract Federated Learning (FL) has emerged as a promising approach for handling decentralized data. Building an FL system requires both machine learning (ML) expertise and careful software architecture design. Research has focused largely on the ML side of FL, while the design of the software architecture has received less attention. This survey therefore describes a set of design patterns to tackle these design issues. Design patterns are reusable solutions to problems that commonly arise when designing software architecture. This paper focuses on (1) design patterns such as architectures, frameworks,…

  • Open Access

    ARTICLE

    WebFLex: A Framework for Web Browsers-Based Peer-to-Peer Federated Learning Systems Using WebRTC

    Mai Alzamel1,*, Hamza Ali Rizvi2, Najwa Altwaijry1, Isra Al-Turaiki1

    CMC-Computers, Materials & Continua, Vol.78, No.3, pp. 4177-4204, 2024, DOI:10.32604/cmc.2024.048370 - 26 March 2024

Abstract Scalability and personal information privacy are vital for training and deploying large-scale deep learning models. Federated learning trains models on private data by aggregating weights from various devices, taking advantage of the device-agnostic environment of web browsers. Nevertheless, relying on a central server in browser-based federated systems can limit scalability and interfere with the training process as the number of clients grows. Additionally, information about the training dataset can potentially be extracted from the distributed weights, reducing the privacy of the local data used for training. In this research…

  • Open Access

    REVIEW

    A Survey on Blockchain-Based Federated Learning: Categorization, Application and Analysis

    Yuming Tang1,#, Yitian Zhang2,#, Tao Niu1, Zhen Li2,3,*, Zijian Zhang1,3, Huaping Chen4, Long Zhang4

    CMES-Computer Modeling in Engineering & Sciences, Vol.139, No.3, pp. 2451-2477, 2024, DOI:10.32604/cmes.2024.030084 - 11 March 2024

Abstract Federated Learning (FL), as an emergent paradigm in privacy-preserving machine learning, has garnered significant interest from scholars and engineers across both academic and industrial spheres. Despite its innovative approach to model training across distributed networks, FL has its vulnerabilities; the centralized server-client architecture introduces risks of single-point failure. Moreover, the integrity of the global model—a cornerstone of FL—is susceptible to compromise through poisoning attacks by malicious actors. Such attacks and the potential for privacy leakage via inference starkly undermine FL’s foundational privacy and security goals. For these reasons, some participants are unwilling to use their private data…

  • Open Access

    REVIEW

    AI Fairness–From Machine Learning to Federated Learning

    Lalit Mohan Patnaik1,5, Wenfeng Wang2,3,4,5,6,*

    CMES-Computer Modeling in Engineering & Sciences, Vol.139, No.2, pp. 1203-1215, 2024, DOI:10.32604/cmes.2023.029451 - 29 January 2024

Abstract This article reviews the theory of fairness in AI, from machine learning to federated learning, and discusses the constraints on precise AI fairness along with prospective solutions. For a reliable and quantitative evaluation of AI fairness, many associated concepts have been proposed, formulated and classified. However, the inexplicability of machine learning systems makes it almost impossible to include all necessary details at the modelling stage to ensure fairness. Privacy worries induce data unfairness, and hence biases in the datasets used for evaluating AI fairness are unavoidable. The imbalance between algorithms’ utility and humanization…

  • Open Access

    ARTICLE

    Efficient DP-FL: Efficient Differential Privacy Federated Learning Based on Early Stopping Mechanism

    Sanxiu Jiao1, Lecai Cai2,*, Jintao Meng3, Yue Zhao3, Kui Cheng2

    Computer Systems Science and Engineering, Vol.48, No.1, pp. 247-265, 2024, DOI:10.32604/csse.2023.040194 - 26 January 2024

Abstract Federated learning is a distributed machine learning framework that addresses the data security and data island problems faced by artificial intelligence. However, federated learning frameworks are not always secure, and attackers can compromise clients’ private information by analyzing parameters during the training of federated learning models. To address data security and availability during federated learning training, this paper proposes an Efficient Differential Privacy Federated Learning algorithm based on an early stopping mechanism (Efficient DP-FL). This method inherits the advantages of differential privacy and federated learning and improves the performance of model training while…

  • Open Access

    ARTICLE

    Improving Federated Learning through Abnormal Client Detection and Incentive

    Hongle Guo1,2, Yingchi Mao1,2,*, Xiaoming He1,2, Benteng Zhang1,2, Tianfu Pang1,2, Ping Ping1,2

    CMES-Computer Modeling in Engineering & Sciences, Vol.139, No.1, pp. 383-403, 2024, DOI:10.32604/cmes.2023.031466 - 30 December 2023

Abstract Data sharing and privacy protection are made possible by federated learning, which allows for continuous model parameter sharing between several clients and a central server. Multiple reliable and high-quality clients must participate in practical applications for the federated learning global model to be accurate, but because the clients are independent, the central server cannot fully control their behavior. The central server has no way of knowing the correctness of the model parameters provided by each client in a given round, so clients may purposefully or unwittingly submit anomalous data, leading to abnormal behavior, such as becoming…

  • Open Access

    ARTICLE

    Privacy Enhanced Mobile User Authentication Method Using Motion Sensors

    Chunlin Xiong1,2, Zhengqiu Weng3,4,*, Jia Liu1, Liang Gu2, Fayez Alqahtani5, Amr Gafar6, Pradip Kumar Sharma7

    CMES-Computer Modeling in Engineering & Sciences, Vol.138, No.3, pp. 3013-3032, 2024, DOI:10.32604/cmes.2023.031088 - 15 December 2023

Abstract With the development of hardware devices and the upgrading of smartphones, a large number of users store privacy-related information on mobile devices, mainly smartphones, which places higher demands on the protection of mobile users’ private information. At present, mobile user authentication methods based on human-computer interaction have been extensively studied due to their advantages of high precision and non-perception, but shortcomings remain, such as low data collection efficiency, untrustworthy participating nodes, and lack of practicality. To this end, this paper proposes a privacy-enhanced mobile user authentication method using motion sensors, which mainly…

Displaying results 21-30 of 70 (page 3).