Open Access


DISTINÏCT: Data poISoning atTacks detectIon usiNg optÏmized jaCcard disTance

Maria Sameen1, Seong Oun Hwang2,*

1 Department of IT Convergence Engineering, Gachon University, Seongnam-si, 13120, Korea
2 Department of Computer Engineering, Gachon University, Seongnam-si, 13120, Korea

* Corresponding Author: Seong Oun Hwang. Email: email

Computers, Materials & Continua 2022, 73(3), 4559-4576.


Machine Learning (ML) systems often involve a re-training process to improve their predictions and classifications. This re-training process creates a loophole that poses a security threat to ML systems. Adversaries exploit this loophole to design data poisoning attacks, a type of attack in which an adversary manipulates the training dataset to degrade the ML system’s performance. Data poisoning attacks are challenging to detect, and even more difficult to respond to, particularly in the Internet of Things (IoT) environment. To address this problem, we propose DISTINÏCT, the first proactive data poisoning attack detection framework based on distance measures. We found that Jaccard Distance (JD), among other distance measures, can be used in DISTINÏCT, and we further improved JD to obtain an Optimized JD (OJD) with lower time and space complexity. Our security analysis shows that DISTINÏCT is secure against data poisoning attacks when key features of adversarial attacks are considered. We conclude that the proposed OJD-based DISTINÏCT is effective and efficient against data poisoning attacks in settings where in-time detection is critical, such as IoT applications with large volumes of streaming data.
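The abstract's core idea is to compare incoming training data against a trusted baseline using Jaccard Distance before re-training. The sketch below is only an illustration of plain Jaccard-Distance-based filtering, not the paper's DISTINÏCT framework or its OJD optimization (neither is specified in this abstract); the function names, the feature-set representation, and the threshold value are all illustrative assumptions.

```python
def jaccard_distance(a: set, b: set) -> float:
    """Jaccard distance: 1 - |A ∩ B| / |A ∪ B| (0 = identical, 1 = disjoint)."""
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

def flag_suspect_samples(trusted: set, incoming: list, threshold: float = 0.8) -> list:
    """Flag incoming samples whose feature set lies too far from the trusted baseline.

    `threshold` is an illustrative choice; a real system would tune it.
    """
    return [s for s in incoming if jaccard_distance(trusted, set(s)) > threshold]

# Hypothetical example: one clean sample and one dissimilar (possibly poisoned) sample
trusted = {"f1", "f2", "f3", "f4"}
clean = ["f1", "f2", "f3"]    # JD = 1 - 3/4 = 0.25, below threshold
poison = ["f7", "f8", "f9"]   # JD = 1 - 0/7 = 1.0, above threshold
print(flag_suspect_samples(trusted, [clean, poison]))
```

Computing each distance touches only set intersections and unions, which is why the authors' complexity improvements to JD (yielding OJD) matter for high-volume IoT streams.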


Cite This Article

M. Sameen and S. O. Hwang, "DistinÏct: data poisoning attacks detection using optÏmized jaccard distance," Computers, Materials & Continua, vol. 73, no. 3, pp. 4559–4576, 2022.

This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.