Open Access

ARTICLE


PNMT: Zero-Resource Machine Translation with Pivot-Based Feature Converter

Lingfang Li1,2, Weijian Hu2, Mingxing Luo1,*

1 School of Information Science and Technology, Southwest Jiaotong University, Chengdu, 611730, China
2 School of Information Engineering, Inner Mongolia University of Science & Technology, Baotou, 014000, China

* Corresponding Author: Mingxing Luo.

Computers, Materials & Continua 2025, 84(3), 5915-5935. https://doi.org/10.32604/cmc.2025.064349

Abstract

Neural machine translation (NMT) has been widely applied to high-resource language pairs, but its dependence on large-scale data results in poor performance in low-resource scenarios. In this paper, we propose a transfer-learning-based approach called shared space transfer for zero-resource NMT. Our method leverages a pivot pre-trained language model (PLM) to create a shared representation space, which is used in both the auxiliary source→pivot (Ms2p) and pivot→target (Mp2t) translation models. Specifically, we use the pivot PLM to initialize the Ms2p decoder and the Mp2t encoder, adopting a freezing strategy during training. We further propose a feature converter that mitigates representation-space deviations by mapping features from the source encoder into the shared representation space; the converter is trained on a synthetic source→target parallel corpus. The final Ms2t model combines the Ms2p encoder, the feature converter, and the Mp2t decoder. We conduct simulation experiments using English as the pivot language for German→French, German→Czech, and Turkish→Hindi translation, and we then test our method on a real zero-resource language pair, Mongolian→Vietnamese, with Chinese as the pivot language. Experimental results show that our method achieves high translation quality, with better Translation Error Rate (TER) and BLEU scores than other pivot-based methods, and step-wise pre-training with our feature converter outperforms baseline models in terms of COMET scores.
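The model assembly described in the abstract can be sketched as follows. This is a minimal, framework-agnostic illustration of the composition and freezing strategy only; all class and component names (Component, FeatureConverter, etc.) are hypothetical and do not reflect the authors' actual implementation.

```python
# Sketch of assembling the final Ms2t model from the two auxiliary models.
# The pivot-PLM-initialized parts (Ms2p decoder, Mp2t encoder) are frozen
# during auxiliary training; the final model reuses the Ms2p encoder and
# Mp2t decoder, bridged by the feature converter.

from dataclasses import dataclass


@dataclass
class Component:
    """A stand-in for a model component (encoder, decoder, or converter)."""
    name: str
    trainable: bool = True  # freezing strategy: set False for PLM-initialized parts

    def freeze(self):
        self.trainable = False
        return self


def build_ms2t(ms2p_encoder, converter, mp2t_decoder):
    """Final source→target model: Ms2p encoder + feature converter + Mp2t decoder.
    The converter maps source-encoder features into the shared (pivot PLM)
    representation space expected by the Mp2t decoder."""
    return [ms2p_encoder, converter, mp2t_decoder]


# Ms2p (source→pivot): decoder initialized from the pivot PLM and frozen.
ms2p_encoder = Component("Ms2p.encoder")
ms2p_decoder = Component("Ms2p.decoder(pivot-PLM)").freeze()

# Mp2t (pivot→target): encoder initialized from the pivot PLM and frozen.
mp2t_encoder = Component("Mp2t.encoder(pivot-PLM)").freeze()
mp2t_decoder = Component("Mp2t.decoder")

# Feature converter, trained on a synthetic source→target parallel corpus.
converter = Component("FeatureConverter")

ms2t = build_ms2t(ms2p_encoder, converter, mp2t_decoder)
```

The key design point carried over from the abstract: because both frozen components were initialized from the same pivot PLM, the Ms2p encoder's output and the Mp2t decoder's expected input live near a shared representation space, and the converter only needs to correct the residual deviation.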

Keywords

Zero-resource machine translation; pivot pre-trained language model; transfer learning; neural machine translation

Cite This Article

APA Style
Li, L., Hu, W., & Luo, M. (2025). PNMT: Zero-Resource Machine Translation with Pivot-Based Feature Converter. Computers, Materials & Continua, 84(3), 5915–5935. https://doi.org/10.32604/cmc.2025.064349
Vancouver Style
Li L, Hu W, Luo M. PNMT: Zero-Resource Machine Translation with Pivot-Based Feature Converter. Comput Mater Contin. 2025;84(3):5915–5935. https://doi.org/10.32604/cmc.2025.064349
IEEE Style
L. Li, W. Hu, and M. Luo, “PNMT: Zero-Resource Machine Translation with Pivot-Based Feature Converter,” Comput. Mater. Contin., vol. 84, no. 3, pp. 5915–5935, 2025. https://doi.org/10.32604/cmc.2025.064349



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.