Open Access

ARTICLE


TGICP: A Text-Gated Interaction Network with Inter-Sample Commonality Perception for Multimodal Sentiment Analysis

Erlin Tian1, Shuai Zhao2,*, Min Huang2, Yushan Pan3,4, Yihong Wang3,4, Zuhe Li1

1 School of Computer Science and Technology, Zhengzhou University of Light Industry, Zhengzhou, 450002, China
2 School of Software, Zhengzhou University of Light Industry, Zhengzhou, 450002, China
3 Department of Computing, Xi’an Jiaotong-Liverpool University, Suzhou, 215123, China
4 Department of Computer Science, University of Liverpool, Liverpool, L69 7ZX, UK

* Corresponding Author: Shuai Zhao. Email: email

Computers, Materials & Continua 2025, 85(1), 1427-1456. https://doi.org/10.32604/cmc.2025.066476

Abstract

With the increasing importance of multimodal data in emotional expression on social media, mainstream methods for sentiment analysis have shifted from unimodal to multimodal approaches. However, extracting high-quality emotional features and achieving effective interaction between modalities remain two major obstacles in multimodal sentiment analysis. To address these challenges, this paper proposes a Text-Gated Interaction Network with Inter-Sample Commonality Perception (TGICP). Specifically, we use an Inter-Sample Commonality Perception (ICP) module to extract common features from similar samples within the same modality, and use these common features to enhance the original features of each modality, thereby obtaining a richer and more complete multimodal sentiment representation. Subsequently, in the cross-modal interaction stage, we design a text-driven Text-Gated Interaction (TGI) module. By calculating the mutual information difference between the text modality and the nonverbal modalities, the TGI module dynamically adjusts the influence of emotional information from the text modality on the nonverbal modalities. This reduces modality information asymmetry while enabling full cross-modal interaction. Experimental results show that the proposed model achieves outstanding performance on both the CMU-MOSI and CMU-MOSEI benchmark multimodal sentiment analysis datasets, validating its effectiveness in emotion recognition tasks.
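The gating idea the abstract describes can be sketched as follows. This is a minimal NumPy illustration of a text-gated fusion step, not the authors' implementation: the function name, the sigmoid gate computed from the concatenated modalities, and the gated residual update are all illustrative assumptions (the paper's actual gate is driven by a mutual information difference, which is omitted here).

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def text_gated_fusion(text_feat, nonverbal_feat, W_gate, b_gate):
    """Hypothetical text-gated interaction: a sigmoid gate computed from
    both modalities controls how much textual emotion information flows
    into the nonverbal stream."""
    joint = np.concatenate([text_feat, nonverbal_feat], axis=-1)
    gate = sigmoid(joint @ W_gate + b_gate)   # element-wise gate in (0, 1)
    return nonverbal_feat + gate * text_feat  # gated residual update

# Toy features for one sample: a text vector and an audio (nonverbal) vector.
d = 8
text = rng.standard_normal(d)
audio = rng.standard_normal(d)
W = rng.standard_normal((2 * d, d)) * 0.1
b = np.zeros(d)

fused = text_gated_fusion(text, audio, W, b)
print(fused.shape)  # (8,)
```

Because the gate lies strictly between 0 and 1, the nonverbal stream is never overwritten outright; the text modality's contribution is scaled per dimension, which is one plausible way to mitigate the information asymmetry the abstract mentions.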

Keywords

Multi-modal sentiment analysis; multi-modal fusion; graph convolutional networks; inter-sample commonality perception; gated interaction

Cite This Article

APA Style
Tian, E., Zhao, S., Huang, M., Pan, Y., Wang, Y. et al. (2025). TGICP: A Text-Gated Interaction Network with Inter-Sample Commonality Perception for Multimodal Sentiment Analysis. Computers, Materials & Continua, 85(1), 1427–1456. https://doi.org/10.32604/cmc.2025.066476
Vancouver Style
Tian E, Zhao S, Huang M, Pan Y, Wang Y, Li Z. TGICP: A Text-Gated Interaction Network with Inter-Sample Commonality Perception for Multimodal Sentiment Analysis. Comput Mater Contin. 2025;85(1):1427–1456. https://doi.org/10.32604/cmc.2025.066476
IEEE Style
E. Tian, S. Zhao, M. Huang, Y. Pan, Y. Wang, and Z. Li, “TGICP: A Text-Gated Interaction Network with Inter-Sample Commonality Perception for Multimodal Sentiment Analysis,” Comput. Mater. Contin., vol. 85, no. 1, pp. 1427–1456, 2025. https://doi.org/10.32604/cmc.2025.066476



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.