Jiakang Sun1,2, Ke Chen1,2, Xinyang He1,2, Xu Liu1,2, Ke Li1,2, Cheng Peng1,2,*
CMC-Computers, Materials & Continua, Vol.83, No.1, pp. 219-238, 2025, DOI:10.32604/cmc.2025.059745
Published: 26 March 2025
Abstract: With the advancements in parameter-efficient transfer learning techniques, it has become feasible to leverage large pre-trained language models for downstream tasks under low-cost and low-resource conditions. However, applying this technique to multimodal knowledge transfer introduces a significant challenge: ensuring alignment across modalities while minimizing the number of additional parameters required for downstream task adaptation. This paper introduces UniTrans, a framework aimed at facilitating efficient knowledge transfer across multiple modalities. UniTrans leverages Vector-based Cross-modal Random Matrix Adaptation to enable fine-tuning with minimal parameter overhead. To further enhance modality alignment, we introduce two key components: the Multimodal […]