Search Results (18)
  • Open Access

    ARTICLE

    VKFQ: A Verifiable Keyword Frequency Query Framework with Local Differential Privacy in Blockchain

    Youlin Ji, Bo Yin*, Ke Gu

    CMC-Computers, Materials & Continua, Vol.78, No.3, pp. 4205-4223, 2024, DOI:10.32604/cmc.2024.049086

    Abstract With its tamper-proof and traceable properties, blockchain technology has been widely used in the field of data sharing. Preserving individual privacy while enabling efficient data queries is one of the primary issues in secure data sharing. In this paper, we study verifiable keyword frequency (KF) queries with local differential privacy in blockchain. Both numerical and keyword attributes are present in data objects; the latter are sensitive and require privacy protection. However, prior blockchain studies face a trilemma in privacy protection and are unable to handle KF queries. We propose an efficient framework that… More >
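
The VKFQ framework itself is not reproduced here, but the keyword-frequency side of such a query can be illustrated with a standard LDP frequency oracle. Below is a minimal sketch using generalized randomized response (k-RR): each data owner perturbs its keyword locally before it is ever recorded, and the querier debiases the observed counts to recover frequency estimates. The function names, toy keyword domain, and budget epsilon=1.0 are illustrative only and are not taken from the paper.

```python
import math
import random
from collections import Counter

def krr_perturb(value, domain, epsilon):
    """Generalized randomized response (k-RR): report the true keyword with
    probability e^eps / (e^eps + k - 1), otherwise a uniform other keyword."""
    k = len(domain)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p:
        return value
    return random.choice([v for v in domain if v != value])

def estimate_frequencies(reports, domain, epsilon):
    """Debias the observed counts to get unbiased keyword-frequency estimates."""
    k, n = len(domain), len(reports)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    q = 1.0 / (math.exp(epsilon) + k - 1)
    counts = Counter(reports)
    return {v: (counts[v] / n - q) / (p - q) for v in domain}

# Toy usage: each user perturbs one keyword locally before it is recorded.
domain = ["privacy", "blockchain", "query", "ledger"]
true_data = ["privacy"] * 600 + ["blockchain"] * 300 + ["query"] * 100
reports = [krr_perturb(v, domain, epsilon=1.0) for v in true_data]
print(estimate_frequencies(reports, domain, epsilon=1.0))
```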

  • Open Access

    ARTICLE

    Differentially Private Support Vector Machines with Knowledge Aggregation

    Teng Wang, Yao Zhang, Jiangguo Liang, Shuai Wang, Shuanggen Liu*

    CMC-Computers, Materials & Continua, Vol.78, No.3, pp. 3891-3907, 2024, DOI:10.32604/cmc.2024.048115

    Abstract With widespread data collection and processing, privacy-preserving machine learning has become increasingly important in addressing privacy risks related to individuals. The support vector machine (SVM) is one of the most elementary learning models in machine learning, and privacy issues surrounding SVM classifier training have attracted increasing attention. In this paper, we investigate Differential Privacy-compliant Federated Machine Learning with Dimensionality Reduction, called FedDPDR-DPML, which greatly improves data utility while providing strong privacy guarantees. Considering that in distributed learning scenarios multiple participants usually hold unbalanced or small amounts of data, FedDPDR-DPML enables multiple participants to collaboratively learn a global model based on weighted… More >
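
FedDPDR-DPML's exact algorithm is not spelled out in the truncated abstract, so the sketch below only shows a generic baseline for the ingredients it names: each participant trains a linear SVM locally, releases a clipped, Gaussian-perturbed weight vector (output perturbation), and the server forms a weighted average, here weighted by data size as one plausible reading of the weighted aggregation. It assumes dimensionality reduction has already been applied with a projection shared by all participants; the function names and parameters are illustrative.

```python
import numpy as np
from sklearn.svm import LinearSVC

def local_private_svm(X, y, epsilon, delta, clip=1.0, rng=None):
    """One participant: fit a linear SVM on its (already dimension-reduced) data,
    clip the weight vector so any two possible outputs differ by at most 2*clip,
    then release it with Gaussian noise (output perturbation, (eps, delta)-DP)."""
    rng = rng or np.random.default_rng()
    w = LinearSVC(C=1.0, max_iter=5000).fit(X, y).coef_.ravel()
    w = w * min(1.0, clip / (np.linalg.norm(w) + 1e-12))
    sigma = 2 * clip * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon  # Gaussian mechanism
    return w + rng.normal(0.0, sigma, size=w.shape)

def aggregate(local_models, data_sizes):
    """Server: average the perturbed local models, weighting each participant by
    its data size (one plausible reading of 'weighted' aggregation)."""
    sizes = np.asarray(data_sizes, dtype=float)
    return np.average(np.vstack(local_models), axis=0, weights=sizes / sizes.sum())
```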

  • Open Access

    ARTICLE

    KSKV: Key-Strategy for Key-Value Data Collection with Local Differential Privacy

    Dan Zhao1, Yang You2, Chuanwen Luo3,*, Ting Chen4,*, Yang Liu5

    CMES-Computer Modeling in Engineering & Sciences, Vol.139, No.3, pp. 3063-3083, 2024, DOI:10.32604/cmes.2023.045400

    Abstract In recent years, the research field of data collection under local differential privacy (LDP) has expanded its focus from elementary data types to more complex structural data, such as set-value and graph data. However, our comprehensive review of the existing literature reveals a shortage of studies that engage with key-value data collection, in which the frequencies of keys and the means of the values associated with each key are collected simultaneously. Additionally, the allocation of the privacy budget between the frequencies of keys and the means of values for each key does not yield an optimal utility tradeoff.… More >
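
The paper's key-strategy (KSKV) is not reproduced here; the sketch below is only a generic baseline for key-value collection under LDP, in the spirit of PrivKV-style mechanisms: the privacy budget is split between a k-RR report of the key and a binary randomized response on the discretized value, and the collector debiases both the key frequencies and the per-key means. The function names and the fixed budget split are assumptions for illustration.

```python
import math
import random
from collections import defaultdict

def perturb_kv(key, value, domain, eps_key, eps_val, rng=random):
    """Local perturbation of one <key, value> pair (value assumed in [-1, 1]):
    k-RR on the key (eps_key), then the value is discretized to +/-1 and passed
    through binary randomized response (eps_val). Total budget: eps_key + eps_val."""
    d = len(domain)
    p1 = math.exp(eps_key) / (math.exp(eps_key) + d - 1)
    p2 = math.exp(eps_val) / (math.exp(eps_val) + 1)
    rep_key = key if rng.random() < p1 else rng.choice([k for k in domain if k != key])
    bit = 1 if rng.random() < (1 + value) / 2 else -1      # unbiased discretization
    rep_bit = bit if rng.random() < p2 else -bit           # binary randomized response
    return rep_key, rep_bit

def estimate(reports, domain, eps_key, eps_val):
    """Collector side: debias key frequencies, then plug-in-correct the per-key
    means for randomized response and for reports whose key was flipped here."""
    d, n = len(domain), len(reports)
    p1 = math.exp(eps_key) / (math.exp(eps_key) + d - 1)
    q1 = 1.0 / (math.exp(eps_key) + d - 1)
    p2 = math.exp(eps_val) / (math.exp(eps_val) + 1)
    counts, sums = defaultdict(int), defaultdict(float)
    for k, b in reports:
        counts[k] += 1
        sums[k] += b
    total = sum(sums.values()) / (2 * p2 - 1)              # estimate of sum_k n_k * mean_k
    freqs, means = {}, {}
    for k in domain:
        freqs[k] = (counts[k] / n - q1) / (p1 - q1)
        nk = max(freqs[k] * n, 1e-9)
        means[k] = (sums[k] / (2 * p2 - 1) - q1 * total) / (nk * (p1 - q1))
    return freqs, means
```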

  • Open Access

    ARTICLE

    Efficient DP-FL: Efficient Differential Privacy Federated Learning Based on Early Stopping Mechanism

    Sanxiu Jiao1, Lecai Cai2,*, Jintao Meng3, Yue Zhao3, Kui Cheng2

    Computer Systems Science and Engineering, Vol.48, No.1, pp. 247-265, 2024, DOI:10.32604/csse.2023.040194

    Abstract Federated learning is a distributed machine learning framework that addresses the data security and data island problems faced by artificial intelligence. However, federated learning frameworks are not always secure, and attackers can compromise clients' private information by analyzing the parameters exchanged during training. To address data security and availability during federated learning training, this paper proposes an Efficient Differential Privacy Federated Learning algorithm based on an early stopping mechanism (Efficient DP-FL). This method inherits the advantages of differential privacy and federated learning and improves the performance of model training while protecting the parameter information uploaded… More >
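
As a rough illustration of the idea (not the paper's algorithm), the loop below runs federated rounds in which clients send clipped, Gaussian-noised updates, and training halts once a validation metric stops improving; stopping earlier means fewer noisy rounds are composed, so less privacy budget is consumed. The toy least-squares local update, patience rule, and parameter names are all illustrative.

```python
import numpy as np

def dp_fl_with_early_stopping(clients, w0, rounds, lr, clip, sigma, patience, val_fn, rng=None):
    """Minimal federated loop: each round, every client sends a clipped,
    Gaussian-noised update; training stops early once val_fn stops improving."""
    rng = rng or np.random.default_rng(0)
    w, best, wait = w0.copy(), np.inf, 0
    for _ in range(rounds):
        updates = []
        for X, y in clients:
            grad = X.T @ (X @ w - y) / len(y)                       # toy local gradient
            upd = -lr * grad
            upd *= min(1.0, clip / (np.linalg.norm(upd) + 1e-12))   # clip the update
            updates.append(upd + rng.normal(0.0, sigma * clip, size=upd.shape))
        w = w + np.mean(updates, axis=0)                            # server averages noisy updates
        loss = val_fn(w)
        if loss < best - 1e-4:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:                                    # early stopping
                break
    return w
```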

  • Open Access

    ARTICLE

    A Differential Privacy Federated Learning Scheme Based on Adaptive Gaussian Noise

    Sanxiu Jiao1, Lecai Cai2,*, Xinjie Wang1, Kui Cheng2, Xiang Gao3

    CMES-Computer Modeling in Engineering & Sciences, Vol.138, No.2, pp. 1679-1694, 2024, DOI:10.32604/cmes.2023.030512

    Abstract As a distributed machine learning method, federated learning (FL) has the advantage of naturally protecting data privacy: it keeps data local and trains models on that local data, so the raw data never leave the device. Federated learning thus effectively addresses the data island and privacy protection problems in artificial intelligence. However, existing research shows that attackers may still steal user information by analyzing the parameters in the federated learning training process and the aggregation parameters on the server side. To solve this problem, differential privacy (DP) techniques are widely used for privacy protection in federated learning. However, adding… More >
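
The paper's adaptive rule for the Gaussian noise is not given in the truncated abstract, so the snippet below only shows the generic pattern it builds on: clip the gradient to bound sensitivity, then add Gaussian noise whose scale changes across training rounds. The exponential decay schedule here is a placeholder assumption, not the scheme proposed in the paper.

```python
import numpy as np

def adaptive_gaussian_noise_update(grad, clip, sigma0, decay, round_idx, rng=None):
    """Clip the gradient to bound sensitivity, then add Gaussian noise whose scale
    shrinks as training progresses (illustrative schedule, not the paper's rule)."""
    rng = rng or np.random.default_rng()
    g = grad * min(1.0, clip / (np.linalg.norm(grad) + 1e-12))
    sigma = sigma0 * (decay ** round_idx)          # placeholder exponential decay
    return g + rng.normal(0.0, sigma * clip, size=g.shape)
```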


  • Open Access

    ARTICLE

    A Blockchain-Assisted Distributed Edge Intelligence for Privacy-Preserving Vehicular Networks

    Muhammad Firdaus1, Harashta Tatimma Larasati2, Kyung-Hyune Rhee3,*

    CMC-Computers, Materials & Continua, Vol.76, No.3, pp. 2959-2978, 2023, DOI:10.32604/cmc.2023.039487

    Abstract The enormous volume of heterogeneous data from various smart device-based applications has given rise to an increasingly interlaced cyber-physical system. To deliver smart cloud services that require low latency and strong computational processing capabilities, the Edge Intelligence System (EIS) concept is now being employed, taking advantage of Artificial Intelligence (AI) and Edge Computing Technology (ECT). EIS thus presents a promising approach to realizing future Intelligent Transportation Systems (ITS), particularly in the context of Vehicular Networks (VNets). However, the current EIS framework faces several issues and is potentially vulnerable to multiple adversarial attacks because the central aggregator server… More >

  • Open Access

    REVIEW

    A Survey of Privacy Preservation for Deep Learning Applications

    Ling Zhang1,*, Lina Nie1, Leyan Yu2

    Journal of Information Hiding and Privacy Protection, Vol.4, No.2, pp. 69-78, 2022, DOI:10.32604/jihpp.2022.039284

    Abstract Deep learning is widely used in artificial intelligence fields such as computer vision, natural language recognition, and intelligent robots. With the development of deep learning, people's expectations for this technology are increasing daily. Enterprises and individuals usually need substantial computing power to support practical deep learning workloads, and many cloud service providers deploy cloud computing environments for this purpose. However, there are severe risks of privacy leakage when transferring data to cloud service providers and using that data for model training, which prevents users from confidently applying deep learning in cloud computing environments. This paper mainly… More >

  • Open Access

    ARTICLE

    Federated Learning Based on Data Divergence and Differential Privacy in Financial Risk Control Research

    Mao Yuxin, Wang Honglin*

    CMC-Computers, Materials & Continua, Vol.75, No.1, pp. 863-878, 2023, DOI:10.32604/cmc.2023.034879

    Abstract In the financial sector, data are highly confidential and sensitive, and ensuring data privacy is critical. Sample fusion is the basis of horizontal federated learning, but it is suitable only for scenarios where customers have the same format but different targets, namely scenarios with strong feature overlap and weak user overlap. To overcome this limitation, this paper proposes a federated learning-based model with local data sharing and differential privacy. The indexing mechanism of differential privacy is used to obtain different degrees of privacy budgets, which are applied to the gradient according to the contribution degree to ensure privacy without… More >
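
A minimal sketch of the budget-allocation idea described above, under the assumption that a contribution score is already available for each gradient component: the total privacy budget is split in proportion to the scores, and Laplace noise with scale sensitivity/eps_i is added to each component, so higher-contribution components receive less noise. The function name, the proportional split, and the per-component granularity are illustrative, not the paper's indexing mechanism.

```python
import numpy as np

def allocate_and_perturb(grad, contributions, total_eps, sensitivity=1.0, rng=None):
    """Split a total privacy budget across gradient components in proportion to a
    supplied contribution score, then add Laplace noise to each component with
    scale sensitivity / eps_i (sequential composition keeps the total at total_eps)."""
    rng = rng or np.random.default_rng()
    c = np.asarray(contributions, dtype=float)
    eps = total_eps * c / c.sum()                       # budget proportional to contribution
    noise = rng.laplace(0.0, sensitivity / np.maximum(eps, 1e-9), size=grad.shape)
    return grad + noise
```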

  • Open Access

    ARTICLE

    Research on Federated Learning Data Sharing Scheme Based on Differential Privacy

    Lihong Guo*

    CMC-Computers, Materials & Continua, Vol.74, No.3, pp. 5069-5085, 2023, DOI:10.32604/cmc.2023.034571

    Abstract To realize data sharing and make full use of the value of data, breaking down the data islands between institutions to enable data collaboration has become a new sharing mode. This paper proposes a distributed data security sharing scheme based on the client/server (C/S) communication mode and constructs a federated learning architecture that uses differential privacy technology to protect training parameters. Clients do not need to share local data; they only need to upload the trained model parameters to achieve data sharing. In the process of training, a distributed parameter update mechanism is introduced. The server is mainly responsible for issuing training commands and… More >
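
A bare-bones sketch of the client/server roles described above: the server issues a training command each round and aggregates the returned parameters, while clients keep their data local and upload only noised model parameters. The toy linear model, the plain Gaussian noise, and the class names are stand-ins, not the paper's scheme.

```python
import numpy as np

class Client:
    """Holds local data and never uploads it; on command, trains locally and
    returns noised parameters (Gaussian noise as a stand-in for the DP mechanism)."""
    def __init__(self, X, y, sigma=0.1):
        self.X, self.y, self.sigma = X, y, sigma

    def train(self, w, lr=0.1):
        grad = self.X.T @ (self.X @ w - self.y) / len(self.y)
        w_local = w - lr * grad
        return w_local + np.random.normal(0.0, self.sigma, size=w_local.shape)

class Server:
    """Issues training commands and aggregates the returned parameters."""
    def __init__(self, dim):
        self.w = np.zeros(dim)

    def round(self, clients):
        self.w = np.mean([c.train(self.w) for c in clients], axis=0)
        return self.w
```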

  • Open Access

    Fed-DFE: A Decentralized Function Encryption-Based Privacy-Preserving Scheme for Federated Learning

    Zhe Sun1, Jiyuan Feng1, Lihua Yin1,*, Zixu Zhang2, Ran Li1, Yu Hu1, Chongning Na3

    CMC-Computers, Materials & Continua, Vol.71, No.1, pp. 1867-1886, 2022, DOI:10.32604/cmc.2022.022290

    Abstract Federated learning is a distributed learning framework which trains global models by passing model parameters instead of raw data. However, the training mechanism of passing model parameters is still threatened by gradient inversion, inference attacks, etc. With a lightweight encryption overhead, function encryption is a viable secure aggregation technique in federated learning, and it is often used in combination with differential privacy. Function encryption in federated learning still has the following problems: a) Traditional function encryption usually requires a trusted third party (TTP) to assign the keys. If a TTP colludes with a server, the security aggregation mechanism can be… More >
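
Implementing decentralized function encryption is beyond a short sketch, so the snippet below uses pairwise additive masking as a simplified stand-in for what encrypted aggregation achieves: the server recovers only the sum of the clients' updates, never an individual one, and no single trusted third party holds the keys. In practice each pair of clients would derive its shared mask via key agreement; the central seed here is purely for illustration.

```python
import numpy as np

def make_pairwise_masks(n_clients, dim, seed=0):
    """Each pair (i, j), i < j, shares a random mask; client i adds it and client j
    subtracts it, so all masks cancel in the aggregate. In a real protocol the
    masks come from pairwise key agreement, not a central RNG."""
    rng = np.random.default_rng(seed)
    masks = [np.zeros(dim) for _ in range(n_clients)]
    for i in range(n_clients):
        for j in range(i + 1, n_clients):
            m = rng.normal(size=dim)
            masks[i] += m
            masks[j] -= m
    return masks

def masked_upload(update, mask):
    return update + mask          # what each client actually sends to the server

# Toy check: the server's sum of masked uploads equals the sum of raw updates.
updates = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
masks = make_pairwise_masks(3, 2)
server_sum = sum(masked_upload(u, m) for u, m in zip(updates, masks))
print(np.allclose(server_sum, sum(updates)))   # True
```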

Displaying 1-10 of 18 results (page 1 of 2).