Open Access

ARTICLE

Differential Privacy Federated Learning Based on Adaptive Adjustment

Yanjin Cheng1,2, Wenmin Li1,2,*, Sujuan Qin1,2, Tengfei Tu1,2

1 The State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing, 100876, China
2 The School of Cyberspace Security, Beijing University of Posts and Telecommunications, Beijing, 100876, China

* Corresponding Author: Wenmin Li.

Computers, Materials & Continua 2025, 82(3), 4777-4795. https://doi.org/10.32604/cmc.2025.060380

Abstract

Federated learning alleviates the privacy and security concerns raised by the development of artificial intelligence through a distributed training architecture. However, existing research has shown that attackers can still compromise user privacy and security by stealing model parameters, so differential privacy is applied in federated learning to further mitigate such attacks. Unfortunately, the noise addition and update-clipping mechanisms of differential privacy jointly limit further progress in both privacy protection and model performance. We therefore propose an adaptively adjusted differentially private federated learning method. First, a dynamic adaptive privacy budget allocation strategy flexibly adjusts each client's privacy budget within a given range according to its data volume and training requirements, thereby reducing both privacy budget consumption and the magnitude of the noise added to the model. Second, a longitudinal clipping differential privacy strategy exploits differences among the factors that affect parameter updates and uses a sparsification method to trim local updates, thereby reducing the impact of the clipping step on model accuracy. Together, the two strategies preserve user privacy while lessening the effect of differential privacy on model accuracy. To evaluate the effectiveness of our method, we conducted extensive experiments on benchmark datasets; the results show that the proposed method performs well in terms of both performance and privacy protection.
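The two strategies described above can be illustrated with a minimal sketch. The function names, the clamping rule for the budget, and the top-k sparsification choice are assumptions for illustration only; the paper's exact formulas are not given on this page. The noise calibration shown is the standard Gaussian mechanism for (epsilon, delta)-differential privacy.

```python
import numpy as np

def allocate_budget(n_samples, total_budget,
                    min_frac=0.5, max_frac=1.5, ref_samples=1000):
    """Hypothetical per-client budget rule: scale a baseline share by the
    client's relative data volume, clamped to a given range."""
    scale = np.clip(n_samples / ref_samples, min_frac, max_frac)
    return total_budget * scale

def sparse_clip_and_noise(update, clip_norm, epsilon, delta, keep_frac=0.1):
    """Illustrative local-update step: keep only the largest-magnitude
    coordinates (sparsification), clip to an L2 bound, then add Gaussian
    noise calibrated to (epsilon, delta)."""
    flat = update.ravel().copy()
    k = max(1, int(keep_frac * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]   # indices of top-k entries
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    norm = np.linalg.norm(sparse)
    if norm > clip_norm:                           # L2 clipping
        sparse *= clip_norm / norm
    # Gaussian mechanism noise scale for (epsilon, delta)-DP
    sigma = clip_norm * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    noisy = sparse + np.random.normal(0.0, sigma, size=sparse.shape)
    return noisy.reshape(update.shape)
```

For example, a client holding twice the reference data volume would receive the maximum budget fraction, while a very small client would be clamped to the minimum, limiting how much noise any single client must absorb.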

Keywords

Federated learning; privacy protection; differential privacy; deep learning

Cite This Article

APA Style
Cheng, Y., Li, W., Qin, S., & Tu, T. (2025). Differential privacy federated learning based on adaptive adjustment. Computers, Materials & Continua, 82(3), 4777–4795. https://doi.org/10.32604/cmc.2025.060380
Vancouver Style
Cheng Y, Li W, Qin S, Tu T. Differential privacy federated learning based on adaptive adjustment. Comput Mater Contin. 2025;82(3):4777–4795. https://doi.org/10.32604/cmc.2025.060380
IEEE Style
Y. Cheng, W. Li, S. Qin, and T. Tu, “Differential Privacy Federated Learning Based on Adaptive Adjustment,” Comput. Mater. Contin., vol. 82, no. 3, pp. 4777–4795, 2025. https://doi.org/10.32604/cmc.2025.060380



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.