Open Access
ARTICLE
Multi-View Deep Fuzzy Clustering for Data Representation Learning
1 School of Software, Dalian University of Technology, Dalian, China
2 School of Computer Science and Technology, Dalian University of Technology, Dalian, China
* Corresponding Author: Zhikui Chen. Email:
(This article belongs to the Special Issue: Multimodal Learning for Big Data)
Computers, Materials & Continua 2026, 88(1), 32 https://doi.org/10.32604/cmc.2026.076717
Received 25 November 2025; Accepted 03 March 2026; Issue published 08 May 2026
Abstract
With the rapid development of ocean information technology, multi-view fuzzy clustering is attracting increasing attention for pattern mining in massive multi-view ocean data with heterogeneous distributions, owing to its superior performance. However, previous multi-view fuzzy clustering methods do not fully exploit the informative topologies hidden in data distributions, which are crucial for recognizing partitions of data. Moreover, they fail to capture the invariant structures of multi-view ocean data when learning clustering-specific fusion representations. In addition, they do not take into account the consistencies contained in the manifolds of data generation when mining soft patterns. To address these challenges, deep multi-view generative fuzzy contrastive clustering (DMGFCC) is proposed within a Siamese architecture, which captures soft patterns of data via clustering-specific fusion representations of invariant structures in informative topologies. Specifically, a multi-view Siamese generative adversarial architecture is designed to capture the joint distribution of data as well as invariant structures; it is composed of a view-specific generator network providing pairwise implicit constraints, a view-specific discriminator network distilling knowledge of real data, and a view-specific cluster network capturing fuzzy patterns of fusion information. Furthermore, a generative adversarial dual contrastive clustering loss is devised, which consists of a generative adversarial loss fitting data distributions and a dual contrastive clustering loss learning soft patterns with consistencies of data manifolds. Finally, extensive experiments are conducted on four benchmark datasets, and the results demonstrate competitive performance compared with 11 representative methods.
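To make the dual contrastive clustering idea concrete, the following is a minimal sketch (not the authors' implementation, whose details are in the full text) of an InfoNCE-style contrastive loss over cluster-assignment matrices from two views: the same sample's soft assignments across views form a positive pair, and all other cross-view pairs act as negatives. The function name, the temperature parameter `tau`, and the use of cosine similarity are illustrative assumptions.

```python
import numpy as np

def contrastive_cluster_loss(p1, p2, tau=0.5):
    """Illustrative InfoNCE-style contrastive loss between two views'
    cluster-assignment matrices (rows = samples, cols = clusters).
    Positive pairs: the same sample's assignments across the two views;
    negatives: all other cross-view sample pairs."""
    # Cosine-normalize each sample's assignment vector.
    z1 = p1 / np.linalg.norm(p1, axis=1, keepdims=True)
    z2 = p2 / np.linalg.norm(p2, axis=1, keepdims=True)
    sim = (z1 @ z2.T) / tau                        # cross-view similarities
    sim = sim - sim.max(axis=1, keepdims=True)     # numerical stability
    log_softmax = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # Maximize the log-probability of the matched (diagonal) pairs.
    return -np.mean(np.diag(log_softmax))
```

With perfectly aligned one-hot assignments across views the loss is small, while permuting one view's assignments (breaking cross-view consistency) raises it, which is the behavior a consistency-enforcing term needs.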
Copyright © 2026 The Author(s). Published by Tech Science Press. This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.