Open Access

ARTICLE

MDGET-MER: Multi-Level Dynamic Gating and Emotion Transfer for Multi-Modal Emotion Recognition

Musheng Chen1,2, Qiang Wen1, Xiaohong Qiu1,2, Junhua Wu1,*, Wenqing Fu1

1 School of Software Engineering, Jiangxi University of Science and Technology, Nanchang, 330013, China
2 Nanchang Key Laboratory of Virtual Digital Engineering and Cultural Communication, Nanchang, 330013, China

* Corresponding Author: Junhua Wu. Email: email

Computers, Materials & Continua 2026, 86(3), 34 https://doi.org/10.32604/cmc.2025.071207

Abstract

In multi-modal emotion recognition, excessive reliance on historical context often impedes the detection of emotional shifts, while modality heterogeneity and unimodal noise limit recognition performance. Existing methods struggle to dynamically adjust cross-modal complementary strength to optimize fusion quality, and they lack effective mechanisms to model the dynamic evolution of emotions. To address these issues, we propose a multi-level dynamic gating and emotion transfer framework for multi-modal emotion recognition. A dynamic gating mechanism is applied across unimodal encoding, cross-modal alignment, and emotion transfer modeling, substantially improving noise robustness and feature alignment. First, we construct a unimodal encoder based on gated recurrent units and feature-selection gating to suppress intra-modal noise and enhance contextual representation. Second, we design a gated-attention cross-modal encoder that dynamically calibrates the complementary contributions of the visual and audio modalities to the dominant textual features and suppresses redundant information. Finally, we introduce a gate-enhanced emotion transfer module that explicitly models the temporal dependencies of emotional evolution in dialogues via transfer gating and optimizes continuity modeling with a contrastive learning loss. Experimental results demonstrate that the proposed method outperforms state-of-the-art models on the public MELD and IEMOCAP datasets.
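The abstract describes feature-selection gating applied to the outputs of a GRU-based unimodal encoder. The paper's implementation is not reproduced here; as a rough illustrative sketch only (the weight shapes, the elementwise sigmoid gate, and the function names are assumptions), the core idea of a learned gate that scales hidden features to suppress noisy dimensions can be written as:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def feature_selection_gate(h, W, b):
    """Gate a hidden state: g = sigmoid(W @ h + b), output = g * h.

    Dimensions of h whose gate values are near 0 are suppressed
    (noise filtering); dimensions near 1 pass through almost unchanged.
    Returns the gated features and the gate itself.
    """
    g = sigmoid(W @ h + b)
    return g * h, g

rng = np.random.default_rng(0)
d = 8                            # hidden size of the unimodal encoder (assumed)
h = rng.standard_normal(d)       # one hidden state, e.g., a GRU output step
W = rng.standard_normal((d, d)) * 0.1
b = np.zeros(d)

filtered, gate = feature_selection_gate(h, W, b)
```

The same gating pattern generalizes to the cross-modal stage, where the gate would instead weight the contribution of visual or audio features before fusion with the textual stream.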

Keywords

Multi-modal emotion recognition; dynamic gating; emotion transfer module; cross-modal dynamic alignment; noise robustness

Cite This Article

APA Style
Chen, M., Wen, Q., Qiu, X., Wu, J., & Fu, W. (2026). MDGET-MER: Multi-Level Dynamic Gating and Emotion Transfer for Multi-Modal Emotion Recognition. Computers, Materials & Continua, 86(3), 34. https://doi.org/10.32604/cmc.2025.071207
Vancouver Style
Chen M, Wen Q, Qiu X, Wu J, Fu W. MDGET-MER: Multi-Level Dynamic Gating and Emotion Transfer for Multi-Modal Emotion Recognition. Comput Mater Contin. 2026;86(3):34. https://doi.org/10.32604/cmc.2025.071207
IEEE Style
M. Chen, Q. Wen, X. Qiu, J. Wu, and W. Fu, “MDGET-MER: Multi-Level Dynamic Gating and Emotion Transfer for Multi-Modal Emotion Recognition,” Comput. Mater. Contin., vol. 86, no. 3, pp. 34, 2026. https://doi.org/10.32604/cmc.2025.071207



Copyright © 2026 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.