Search Results (28)
  • Open Access

    ARTICLE

    A Mix Location Privacy Preservation Method Based on Differential Privacy with Clustering

    Fang Liu*, Xianghui Meng, Jiachen Li, Sibo Guo

    CMC-Computers, Materials & Continua, Vol.86, No.2, pp. 1-21, 2026, DOI:10.32604/cmc.2025.069243 - 09 December 2025

    Abstract With the popularization of smart devices, Location-Based Services (LBS) greatly facilitate users’ lives, but at the same time bring the risk of users’ location privacy leakage. Existing location privacy protection methods are deficient: they fail to reasonably allocate the privacy budget for non-outlier location points and ignore the critical location information that may be contained in outlier points, leading to decreased data availability and privacy exposure. To address these problems, this paper proposes a Mix Location Privacy Preservation Method Based on Differential Privacy with Clustering (MLDP). The method first utilizes the DBSCAN clustering algorithm… More >
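
    The abstract is truncated, but its two named ingredients, DBSCAN clustering over location points followed by differential-privacy noise whose budget depends on whether a point is an outlier, can be illustrated with a minimal sketch. This is not the paper's MLDP algorithm; the eps/min_samples values, the budget split, and the per-point Laplace noise are all assumptions for illustration.

```python
# Minimal sketch: cluster location points with DBSCAN, then perturb each
# point with Laplace noise scaled by a per-group privacy budget.
# NOT the paper's MLDP algorithm; all parameter values are illustrative.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
points = rng.uniform(0, 1, size=(200, 2))           # toy (x, y) locations

labels = DBSCAN(eps=0.1, min_samples=5).fit_predict(points)

eps_total = 1.0                                      # overall privacy budget (assumed)
eps_cluster, eps_outlier = 0.7 * eps_total, 0.3 * eps_total  # assumed split
sensitivity = 0.05                                   # assumed per-coordinate sensitivity

def laplace_perturb(xy, eps):
    """Add Laplace noise with scale sensitivity/eps to each coordinate."""
    return xy + rng.laplace(0.0, sensitivity / eps, size=xy.shape)

protected = np.empty_like(points)
for i, (xy, lab) in enumerate(zip(points, labels)):
    budget = eps_outlier if lab == -1 else eps_cluster
    protected[i] = laplace_perturb(xy, budget)

print("outliers:", int(np.sum(labels == -1)), "clusters:", labels.max() + 1)
```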

  • Open Access

    ARTICLE

    DPIL-Traj: Differential Privacy Trajectory Generation Framework with Imitation Learning

    Huaxiong Liao1,2, Xiangxuan Zhong2, Xueqi Chen2, Yirui Huang3, Yuwei Lin2, Jing Zhang2,*, Bruce Gu4

    CMC-Computers, Materials & Continua, Vol.86, No.1, pp. 1-21, 2026, DOI:10.32604/cmc.2025.069208 - 10 November 2025

    Abstract The generation of synthetic trajectories has become essential in various fields for analyzing complex movement patterns. However, the use of real-world trajectory data poses significant privacy risks, such as location re-identification and correlation attacks. To address these challenges, privacy-preserving trajectory generation methods are critical for applications relying on sensitive location data. This paper introduces DPIL-Traj, an advanced framework designed to generate synthetic trajectories while achieving a superior balance between data utility and privacy preservation. Firstly, the framework incorporates Differential Privacy Clustering, which anonymizes trajectory data by applying differential privacy techniques that add noise, ensuring the… More >
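
    As a rough illustration of the noise-addition step the abstract mentions (differential-privacy noise applied to trajectory data), the sketch below perturbs a toy trajectory by splitting an assumed budget across timestamps. It is not the DPIL-Traj framework; its clustering and imitation-learning components are not shown.

```python
# Minimal sketch: perturb a trajectory (sequence of coordinate points) under
# differential privacy by splitting the budget across timestamps and adding
# Laplace noise at each step.  Illustrative only; not the DPIL-Traj pipeline.
import numpy as np

rng = np.random.default_rng(1)
trajectory = np.cumsum(rng.normal(0, 0.001, size=(50, 2)), axis=0)  # toy walk

eps_total = 1.0                           # assumed total budget for the trajectory
eps_step = eps_total / len(trajectory)    # naive sequential composition
sensitivity = 0.002                       # assumed per-point sensitivity

noisy_trajectory = trajectory + rng.laplace(
    0.0, sensitivity / eps_step, size=trajectory.shape
)
print(noisy_trajectory[:3])
```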

  • Open Access

    ARTICLE

    Differential Privacy Integrated Federated Learning for Power Systems: An Explainability-Driven Approach

    Zekun Liu1, Junwei Ma1,2,*, Xin Gong1, Xiu Liu1, Bingbing Liu1, Long An1

    CMC-Computers, Materials & Continua, Vol.85, No.1, pp. 983-999, 2025, DOI:10.32604/cmc.2025.065978 - 29 August 2025

    Abstract With the ongoing digitalization and intelligence of power systems, there is an increasing reliance on large-scale data-driven intelligent technologies for tasks such as scheduling optimization and load forecasting. Nevertheless, power data often contains sensitive information, making it a critical industry challenge to efficiently utilize this data while ensuring privacy. Traditional Federated Learning (FL) methods can mitigate data leakage by training models locally instead of transmitting raw data. Despite this, FL still has privacy concerns, especially gradient leakage, which might expose users’ sensitive information. Therefore, integrating Differential Privacy (DP) techniques is essential for stronger privacy protection.… More >
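
    The gradient-leakage concern the abstract raises is usually addressed by clipping each client's update and adding noise before upload. The sketch below shows that generic DP-FL step under assumed clip and noise values; it is not the paper's explainability-driven scheme.

```python
# Minimal sketch of the usual way DP is combined with FL to blunt gradient
# leakage: each client clips its model update to a norm bound and adds
# Gaussian noise before upload.  Assumed values; not the paper's scheme.
import numpy as np

rng = np.random.default_rng(2)

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.1):
    """Clip the flattened update to clip_norm, then add Gaussian noise."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise

client_update = rng.normal(size=1000)        # toy gradient / weight delta
server_sees = privatize_update(client_update)
print(np.linalg.norm(client_update), np.linalg.norm(server_sees))
```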

  • Open Access

    ARTICLE

    Defending against Backdoor Attacks in Federated Learning by Using Differential Privacy and OOD Data Attributes

    Qingyu Tan, Yan Li, Byeong-Seok Shin*

    CMES-Computer Modeling in Engineering & Sciences, Vol.143, No.2, pp. 2417-2428, 2025, DOI:10.32604/cmes.2025.063811 - 30 May 2025

    Abstract Federated Learning (FL) is a practical solution that leverages distributed data across devices without the need for centralized data storage, enabling multiple participants to jointly train models while preserving data privacy and avoiding direct data sharing. Despite its privacy-preserving advantages, FL remains vulnerable to backdoor attacks, where malicious participants introduce backdoors into local models that are then propagated to the global model through the aggregation process. While existing differential privacy defenses have demonstrated effectiveness against backdoor attacks in FL, they often incur a significant degradation in the performance of the aggregated models on benign tasks.… More >
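
    The differential-privacy half of such a defense typically amounts to clipping every client update to a shared norm bound and noising the aggregate, which blunts the oversized updates a backdoor needs. A minimal sketch of that generic step follows; the paper's use of OOD data attributes is not represented, and all parameter values are assumed.

```python
# Minimal sketch of the differential-privacy part of a backdoor defense in FL:
# the server clips each client update to a common norm bound and adds noise
# to the aggregate, damping the oversized updates a backdoor typically needs.
# The OOD-attribute component of the paper is not represented here.
import numpy as np

rng = np.random.default_rng(3)

def clip_to(u, bound):
    n = np.linalg.norm(u)
    return u * min(1.0, bound / (n + 1e-12))

clip_bound, noise_std = 1.0, 0.01             # assumed defense parameters
updates = [rng.normal(size=100) for _ in range(9)]
updates.append(rng.normal(size=100) * 50)     # one malicious, oversized update

aggregate = np.mean([clip_to(u, clip_bound) for u in updates], axis=0)
aggregate += rng.normal(0.0, noise_std, size=aggregate.shape)
print(np.linalg.norm(aggregate))
```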

  • Open Access

    ARTICLE

    A Privacy-Preserving Graph Neural Network Framework with Attention Mechanism for Computational Offloading in the Internet of Vehicles

    Aishwarya Rajasekar*, Vetriselvi Vetrian

    CMES-Computer Modeling in Engineering & Sciences, Vol.143, No.1, pp. 225-254, 2025, DOI:10.32604/cmes.2025.062642 - 11 April 2025

    Abstract The integration of technologies like artificial intelligence, 6G, and vehicular ad-hoc networks holds great potential to meet the communication demands of the Internet of Vehicles and drive the advancement of vehicle applications. However, these advancements also generate a surge in data processing requirements, necessitating the offloading of vehicular tasks to edge servers due to the limited computational capacity of vehicles. Despite recent advancements, the robustness and scalability of the existing approaches with respect to the number of vehicles and edge servers and their resources, as well as privacy, remain a concern. In this paper, a lightweight… More >
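
    For orientation, the sketch below shows one generic attention-based message-passing step over a small vehicle/edge-server graph, the kind of operation a graph neural network with an attention mechanism builds on. It is a toy GAT-style layer with made-up weights, not the paper's offloading framework.

```python
# Minimal sketch of one attention-based message-passing step over a small
# vehicle/edge-server graph: each node re-weights its neighbours' features
# with softmax attention scores.  Toy GAT-style layer; not the paper's
# offloading framework, and all weights are random placeholders.
import numpy as np

rng = np.random.default_rng(4)
features = rng.normal(size=(5, 8))                 # 5 nodes, 8 features each
adjacency = np.array([[0, 1, 1, 0, 0],
                      [1, 0, 1, 1, 0],
                      [1, 1, 0, 1, 1],
                      [0, 1, 1, 0, 1],
                      [0, 0, 1, 1, 0]], dtype=float)

W = rng.normal(size=(8, 8))                        # shared projection (toy)
a = rng.normal(size=16)                            # attention vector (toy)
h = features @ W

def attn_score(i, j):
    return np.tanh(np.concatenate([h[i], h[j]]) @ a)

out = np.zeros_like(h)
for i in range(len(h)):
    nbrs = np.nonzero(adjacency[i])[0]
    scores = np.array([attn_score(i, j) for j in nbrs])
    alpha = np.exp(scores) / np.exp(scores).sum()  # softmax over neighbours
    out[i] = (alpha[:, None] * h[nbrs]).sum(axis=0)
print(out.shape)
```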

  • Open Access

    ARTICLE

    Differential Privacy Federated Learning Based on Adaptive Adjustment

    Yanjin Cheng1,2, Wenmin Li1,2,*, Sujuan Qin1,2, Tengfei Tu1,2

    CMC-Computers, Materials & Continua, Vol.82, No.3, pp. 4777-4795, 2025, DOI:10.32604/cmc.2025.060380 - 06 March 2025

    Abstract Federated learning effectively alleviates privacy and security issues raised by the development of artificial intelligence through a distributed training architecture. Existing research has shown that attackers can compromise user privacy and security by stealing model parameters. Therefore, differential privacy is applied in federated learning to further mitigate such malicious behavior. However, the addition of noise and the update clipping mechanism in differential privacy jointly limit the further development of federated learning in privacy protection and performance optimization. Therefore, we propose an adaptively adjusted differential privacy federated learning method. First, a dynamic adaptive privacy budget allocation strategy… More >
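
    One simple way to realize a dynamic privacy budget allocation is to give later rounds a larger share of the total budget so their noise shrinks as training converges. The sketch below shows such a schedule under an assumed linear weighting; it is not the paper's adaptive strategy.

```python
# Minimal sketch of an adaptive per-round privacy budget schedule for DP
# federated learning: later rounds receive a larger share of the budget,
# so their noise is smaller.  The linear weighting is an assumption made
# for illustration, not the paper's allocation strategy.
import numpy as np

eps_total, rounds = 8.0, 20
weights = np.arange(1, rounds + 1, dtype=float)      # assumed: linear growth
eps_per_round = eps_total * weights / weights.sum()

sensitivity = 1.0
noise_scale = sensitivity / eps_per_round            # Laplace scale per round
for t in (0, rounds // 2, rounds - 1):
    print(f"round {t:2d}: eps={eps_per_round[t]:.3f}  noise scale={noise_scale[t]:.2f}")
```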

  • Open Access

    ARTICLE

    Federated Learning and Optimization for Few-Shot Image Classification

    Yi Zuo, Zhenping Chen*, Jing Feng, Yunhao Fan

    CMC-Computers, Materials & Continua, Vol.82, No.3, pp. 4649-4667, 2025, DOI:10.32604/cmc.2025.059472 - 06 March 2025

    Abstract Image classification is crucial for various applications, including digital construction, smart manufacturing, and medical imaging. To address inadequate model generalization and data privacy concerns in few-shot image classification, in this paper we propose a federated learning approach that incorporates privacy-preserving techniques. First, we utilize contrastive learning to train on local few-shot image data and apply various data augmentation methods to expand the sample size, thereby enhancing the model’s generalization capabilities in few-shot contexts. Second, we introduce local differential privacy techniques and weight pruning methods to safeguard model parameters, perturbing the transmitted parameters to ensure… More >
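
    The "local differential privacy plus weight pruning" step the abstract describes can be pictured as: drop low-magnitude parameters, then perturb the survivors before transmission. The sketch below does exactly that with an assumed pruning ratio and budget; it is illustrative, not the paper's method.

```python
# Minimal sketch of "prune, then perturb": zero out low-magnitude parameters,
# then add local-DP Laplace noise to the rest before transmitting them.
# Threshold and budget are assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(5)
params = rng.normal(scale=0.1, size=1000)            # toy model parameters

keep_ratio = 0.3                                     # assumed pruning ratio
cutoff = np.quantile(np.abs(params), 1.0 - keep_ratio)
mask = np.abs(params) >= cutoff

eps, sensitivity = 2.0, 0.05                         # assumed LDP parameters
noisy = np.where(mask, params + rng.laplace(0.0, sensitivity / eps, params.shape), 0.0)
print("transmitted non-zeros:", int(mask.sum()))
```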

  • Open Access

    ARTICLE

    Blockchain-Enabled Federated Learning with Differential Privacy for Internet of Vehicles

    Chi Cui1,2, Haiping Du2, Zhijuan Jia1,*, Yuchu He1, Lipeng Wang1

    CMC-Computers, Materials & Continua, Vol.81, No.1, pp. 1581-1593, 2024, DOI:10.32604/cmc.2024.055557 - 15 October 2024

    Abstract The rapid evolution of artificial intelligence (AI) technologies has significantly propelled the advancement of the Internet of Vehicles (IoV). With AI support, represented by machine learning technology, vehicles gain the capability to make intelligent decisions. As a distributed learning paradigm, federated learning (FL) has emerged as a preferred solution in IoV. Compared to traditional centralized machine learning, FL reduces communication overhead and improves privacy protection. Despite these benefits, FL still faces some security and privacy concerns, such as poisoning attacks and inference attacks, prompting exploration into blockchain integration to enhance its security posture. This paper… More >
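
    Blockchain integration in FL typically means recording each round's (already privatized) aggregate, or a hash of it, on an append-only chain so the training history can be audited. The toy in-memory chain below illustrates that idea only; the paper's consensus mechanism and system design are not shown.

```python
# Minimal sketch of coupling a blockchain ledger to FL: each round's
# differentially-private aggregate is summarized, hashed, and chained to the
# previous block, making the training history auditable.  Toy in-memory
# chain; not the paper's system design.
import hashlib, json, time

chain = [{"index": 0, "prev_hash": "0" * 64, "payload": "genesis", "timestamp": 0.0}]

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_round(update_summary):
    prev = chain[-1]
    chain.append({"index": prev["index"] + 1, "prev_hash": block_hash(prev),
                  "payload": update_summary, "timestamp": time.time()})

append_round({"round": 1, "update_sha256": "placeholder", "epsilon": 0.5})
append_round({"round": 2, "update_sha256": "placeholder", "epsilon": 0.5})
print(len(chain), chain[-1]["prev_hash"][:12])
```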

  • Open Access

    ARTICLE

    Blockchain-Enabled Federated Learning for Privacy-Preserving Non-IID Data Sharing in Industrial Internet

    Qiuyan Wang, Haibing Dong*, Yongfei Huang, Zenglei Liu, Yundong Gou

    CMC-Computers, Materials & Continua, Vol.80, No.2, pp. 1967-1983, 2024, DOI:10.32604/cmc.2024.052775 - 15 August 2024

    Abstract Sharing data while protecting privacy in the industrial Internet is a significant challenge. Traditional machine learning methods require a combination of all data for training; however, this approach can be limited by data availability and privacy concerns. Federated learning (FL) has gained considerable attention because it allows for decentralized training on multiple local datasets. However, the training data collected by data providers are often non-independent and identically distributed (non-IID), resulting in poor FL performance. This paper proposes a privacy-preserving approach for sharing non-IID data in the industrial Internet using an FL approach based on blockchain… More >
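
    Schemes like this build on FedAvg, where each client's model is weighted by its local sample count, a weighting that matters precisely when data are non-IID. The sketch below shows that baseline aggregation step with toy numbers; the paper's blockchain layer and non-IID-specific corrections are not represented.

```python
# Minimal sketch of the aggregation step such non-IID FL schemes build on:
# FedAvg weights each client's model by its local sample count.  The paper's
# blockchain consensus and non-IID corrections are not shown.
import numpy as np

rng = np.random.default_rng(6)
client_models = [rng.normal(size=50) for _ in range(4)]    # toy parameter vectors
sample_counts = np.array([120, 40, 300, 80], dtype=float)  # skewed (non-IID) sizes

weights = sample_counts / sample_counts.sum()
global_model = sum(w * m for w, m in zip(weights, client_models))
print(global_model.shape, weights)
```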

  • Open Access

    ARTICLE

    Privacy-Preserving Information Fusion Technique for Device to Server-Enabled Communication in the Internet of Things: A Hybrid Approach

    Amal Al-Rasheed1, Rahim Khan2,3,*, Tahani Alsaed4, Mahwish Kundi2,5, Mohamad Hanif Md. Saad6, Mahidur R. Sarker7,8

    CMC-Computers, Materials & Continua, Vol.80, No.1, pp. 1305-1323, 2024, DOI:10.32604/cmc.2024.049215 - 18 July 2024

    Abstract Due to the overwhelming characteristics of the Internet of Things (IoT) and its adoption in nearly every aspect of our lives, the privacy of individual devices has gained prominent attention from both customers, i.e., people, and industries, as wearable devices collect sensitive information about patients (both inpatients and outpatients) in smart healthcare infrastructures. In addition to privacy, outliers or noise are among the crucial issues directly correlated with IoT infrastructures, as most member devices are resource-limited and could generate or transmit false data that must be refined before processing, i.e.,… More >
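
    The two concerns the abstract pairs, refining outliers in device readings and keeping the fused value private, can be sketched with a simple z-score filter followed by Laplace perturbation of the fused result. The cutoff and privacy budget below are assumptions, not the paper's hybrid approach.

```python
# Minimal sketch: refine outliers in device readings with a z-score filter,
# then perturb the fused value before it leaves the device.  Threshold and
# privacy budget are assumed; this is not the paper's hybrid method.
import numpy as np

rng = np.random.default_rng(7)
readings = np.concatenate([rng.normal(36.8, 0.2, 50), [45.0, 12.0]])  # toy vitals + outliers

z = (readings - readings.mean()) / readings.std()
clean = readings[np.abs(z) < 3.0]                 # assumed z-score cutoff

eps, sensitivity = 1.0, 0.5                       # assumed privacy parameters
fused = clean.mean() + rng.laplace(0.0, sensitivity / eps)
print(round(fused, 2), "from", clean.size, "readings")
```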

Displaying results 1-10 of 28 (page 1).