Open Access
ARTICLE
VRCL: A Discrimination Detection Method for Multilingual and Multimodal Information
1 Department of Cyberspace Security, Beijing Electronic Science and Technology Institute, Beijing, 100070, China
2 Department of Information and Cybersecurity, The State Information Center, Beijing, 100045, China
* Corresponding Author: Meijiao Li. Email:
Computers, Materials & Continua 2025, 85(1), 1019-1035. https://doi.org/10.32604/cmc.2025.066532
Received 10 April 2025; Accepted 23 June 2025; Issue published 29 August 2025
Abstract
With the rapid growth of the Internet and social media, information is widely disseminated in multimodal forms, such as text and images, where discriminatory content can manifest in various ways. Discrimination detection techniques for multilingual and multimodal data can identify potential discriminatory behavior and help foster a more equitable and inclusive cyberspace. However, existing methods often struggle in complex contexts and multilingual environments. To address these challenges, this paper proposes VRCL, a detection method that uses image and multilingual text encoders to extract features from each modality separately. It continuously updates a historical feature memory bank, aggregates the Top-K most similar samples, and uses a Gated Recurrent Unit (GRU) to integrate current and historical features, producing enhanced representations with stronger semantic expressiveness that improve the model's ability to capture discriminatory signals. Experimental results demonstrate that the proposed method exhibits superior discriminative power and detection accuracy in multilingual and multimodal contexts, offering a reliable and effective solution for identifying discriminatory content.
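The memory-bank retrieval and GRU fusion described in the abstract can be sketched in PyTorch as follows. This is a minimal illustration, not the authors' implementation: the class name MemoryBankGRUFusion, the feature dimension, bank size, Top-K value, FIFO update rule, and cosine-similarity retrieval are all assumptions made for the sake of a runnable example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryBankGRUFusion(nn.Module):
    """Sketch of the memory bank + Top-K retrieval + GRU fusion described
    in the abstract. All names and hyperparameters are illustrative."""

    def __init__(self, feat_dim: int = 512, bank_size: int = 4096, top_k: int = 8):
        super().__init__()
        self.top_k = top_k
        # Historical feature memory bank, updated as a FIFO queue.
        # It starts at zero; in practice it would be warmed up with real features.
        self.register_buffer("bank", torch.zeros(bank_size, feat_dim))
        self.register_buffer("ptr", torch.zeros(1, dtype=torch.long))
        # GRU integrates the Top-K historical features with the current one.
        self.gru = nn.GRU(feat_dim, feat_dim, batch_first=True)

    @torch.no_grad()
    def update_bank(self, feats: torch.Tensor) -> None:
        """Enqueue the current batch of features into the memory bank."""
        n = feats.size(0)
        p = int(self.ptr)
        idx = torch.arange(p, p + n) % self.bank.size(0)
        self.bank[idx] = feats.detach()
        self.ptr[0] = (p + n) % self.bank.size(0)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between current features and the memory bank.
        sim = F.normalize(feats, dim=-1) @ F.normalize(self.bank, dim=-1).t()
        _, idx = sim.topk(self.top_k, dim=-1)   # (B, K) nearest-neighbor indices
        neighbors = self.bank[idx]              # (B, K, D) historical features
        # Feed the Top-K historical features through the GRU, seeding the
        # initial hidden state with the current feature.
        h0 = feats.unsqueeze(0)                 # (1, B, D)
        _, h_n = self.gru(neighbors, h0)
        enhanced = h_n.squeeze(0)               # (B, D) enhanced representation
        self.update_bank(feats)
        return enhanced
```

In this sketch the current sample's fused feature seeds the GRU hidden state while its Top-K most similar historical features form the input sequence; the paper may order, weight, or gate these components differently.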
Copyright © 2025 The Author(s). Published by Tech Science Press. This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

