TY - EJOU
AU - Cheng, Yanjin
AU - Li, Wenmin
AU - Qin, Sujuan
AU - Tu, Tengfei
TI - Differential Privacy Federated Learning Based on Adaptive Adjustment
T2 - Computers, Materials & Continua
PY - 2025
VL - 82
IS - 3
SN - 1546-2226
AB - Federated learning alleviates the privacy and security concerns raised by the development of artificial intelligence through a distributed training architecture. However, existing research has shown that attackers can still compromise user privacy by stealing model parameters. Differential privacy is therefore applied in federated learning to defend against such attacks. Yet the noise addition and update-clipping mechanisms of differential privacy jointly limit further progress in federated learning on both privacy protection and performance optimization. We therefore propose an adaptively adjusted differential privacy federated learning method. First, a dynamic adaptive privacy budget allocation strategy flexibly adjusts the privacy budget within a given range according to each client's data volume and training requirements, thereby reducing privacy budget consumption and the magnitude of model noise. Second, a longitudinal clipping differential privacy strategy exploits differences in the factors that affect parameter updates and uses sparsification to trim local updates, thereby reducing the impact of the clipping step on model accuracy. Together, the two strategies protect user privacy while limiting the effect of differential privacy on model accuracy. Extensive experiments on benchmark datasets show that the proposed method performs well in both accuracy and privacy protection.
KW - Federated learning
KW - privacy protection
KW - differential privacy
KW - deep learning
DO - 10.32604/cmc.2025.060380
ER - 