Search Results (12)
  • Open Access

    ARTICLE

    Improving Low-Resource Machine Translation Using Reinforcement Learning from Human Feedback

    Liqing Wang*, Yiheng Xiao

    Intelligent Automation & Soft Computing, Vol.39, No.4, pp. 619-631, 2024, DOI:10.32604/iasc.2024.052971 - 06 September 2024

    Abstract Neural Machine Translation is one of the key research directions in Natural Language Processing. However, limited by the scale and quality of parallel corpora, the translation quality of low-resource Neural Machine Translation has always been unsatisfactory. When Reinforcement Learning from Human Feedback (RLHF) is applied to low-resource machine translation, commonly encountered issues include substandard preference-data quality and the high cost of collecting manual feedback data. Therefore, a more cost-effective method for obtaining feedback data is proposed. First, the quality of the preference data is optimized through prompt engineering of a Large Language Model (LLM)… More >
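
    A minimal, hedged sketch of the idea described in this abstract: generate candidate translations, ask an LLM via a prompt to pick the better one, and keep the resulting preference pairs for RLHF-style reward training. The prompt wording and query_llm() are illustrative placeholders, not the authors' implementation.

        # Sketch: build preference pairs by letting an LLM rank candidate translations.
        # query_llm() is a hypothetical stand-in for any chat-completion client.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class PreferencePair:
            source: str
            chosen: str    # translation the LLM judged better
            rejected: str  # translation the LLM judged worse

        RANK_PROMPT = (
            "You are a translation quality judge.\n"
            "Source: {src}\nTranslation A: {a}\nTranslation B: {b}\n"
            "Answer with the single letter of the better translation."
        )

        def query_llm(prompt: str) -> str:
            # Placeholder: replace with a real LLM call.
            return "A"

        def build_preference_pairs(sources: List[str],
                                   candidates: List[List[str]]) -> List[PreferencePair]:
            pairs = []
            for src, (cand_a, cand_b) in zip(sources, candidates):
                verdict = query_llm(RANK_PROMPT.format(src=src, a=cand_a, b=cand_b)).strip().upper()
                chosen, rejected = (cand_a, cand_b) if verdict.startswith("A") else (cand_b, cand_a)
                pairs.append(PreferencePair(src, chosen, rejected))
            return pairs

        print(build_preference_pairs(["ni hao"], [["hello", "hi there you"]]))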

  • Open Access

    ARTICLE

    Neural Machine Translation Models with Attention-Based Dropout Layer

    Huma Israr1,*, Safdar Abbas Khan1, Muhammad Ali Tahir1, Muhammad Khuram Shahzad1, Muneer Ahmad1, Jasni Mohamad Zain2,*

    CMC-Computers, Materials & Continua, Vol.75, No.2, pp. 2981-3009, 2023, DOI:10.32604/cmc.2023.035814 - 31 March 2023

    Abstract In bilingual translation, attention-based Neural Machine Translation (NMT) models are used to achieve synchrony between input and output sequences and the notion of alignment. NMT models have obtained state-of-the-art performance for several language pairs. However, there has been little work exploring useful architectures for Urdu-to-English machine translation. We conducted extensive Urdu-to-English translation experiments using Long Short-Term Memory (LSTM), Bidirectional Recurrent Neural Network (Bi-RNN), Statistical Recurrent Unit (SRU), Gated Recurrent Unit (GRU), Convolutional Neural Network (CNN), and Transformer architectures. Experimental results show that Bi-RNN and LSTM with an attention mechanism, trained iteratively with a scalable data set, make precise predictions on unseen… More >
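
    The excerpt does not define the paper's attention-based dropout layer, so the sketch below is only one plausible reading: additive (Bahdanau-style) attention with dropout applied directly to the attention weights, in PyTorch.

        # Hypothetical reading of an "attention-based dropout layer": dropout on the
        # attention distribution of additive attention (not the authors' exact code).
        import torch
        import torch.nn as nn

        class AttentionWithDropout(nn.Module):
            def __init__(self, hidden_dim: int, p_drop: float = 0.2):
                super().__init__()
                self.score = nn.Sequential(
                    nn.Linear(2 * hidden_dim, hidden_dim),
                    nn.Tanh(),
                    nn.Linear(hidden_dim, 1),
                )
                self.dropout = nn.Dropout(p_drop)  # applied to attention weights

            def forward(self, decoder_state, encoder_outputs):
                # decoder_state: (batch, hidden); encoder_outputs: (batch, src_len, hidden)
                src_len = encoder_outputs.size(1)
                query = decoder_state.unsqueeze(1).expand(-1, src_len, -1)
                energies = self.score(torch.cat([query, encoder_outputs], dim=-1)).squeeze(-1)
                weights = self.dropout(torch.softmax(energies, dim=-1))
                context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)
                return context, weights

        attn = AttentionWithDropout(hidden_dim=8)
        ctx, w = attn(torch.randn(2, 8), torch.randn(2, 5, 8))
        print(ctx.shape, w.shape)  # torch.Size([2, 8]) torch.Size([2, 5])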

  • Open Access

    ARTICLE

    Text Simplification Using Transformer and BERT

    Sarah Alissa1,*, Mike Wald2

    CMC-Computers, Materials & Continua, Vol.75, No.2, pp. 3479-3495, 2023, DOI:10.32604/cmc.2023.033647 - 31 March 2023

    Abstract Reading and writing are the main methods of interacting with web content. Text simplification tools are helpful for people with cognitive impairments, new language learners, and children, as they may find it difficult to understand complex web content. Text simplification is the process of changing complex text into more readable and understandable text. Recent approaches to text simplification adopt the machine translation concept to learn simplification rules from a parallel corpus of complex and simple sentences. In this paper, we propose two models based on the Transformer, an encoder-decoder structure that achieves state-of-the-art… More >
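
    As a concrete illustration of "simplification as translation", the sketch below feeds a toy parallel corpus of (complex, simple) pairs through PyTorch's stock nn.Transformer; the vocabulary and data are invented, and this is not the authors' model.

        # Treat simplification like translation: complex sentences as the source side,
        # simple sentences as the target side of a Transformer encoder-decoder.
        import torch
        import torch.nn as nn

        pairs = [
            ("the committee deliberated at considerable length", "the committee talked for a long time"),
            ("he was apprehended by the authorities", "the police caught him"),
        ]

        tokens = sorted({w for c, s in pairs for w in (c + " " + s).split()})
        stoi = {w: i + 2 for i, w in enumerate(tokens)}  # 0 = <pad>, 1 = <bos>

        def encode(sent, max_len=12):
            ids = [1] + [stoi[w] for w in sent.split()]
            return ids + [0] * (max_len - len(ids))

        src = torch.tensor([encode(c) for c, _ in pairs])
        tgt = torch.tensor([encode(s) for _, s in pairs])

        d_model = 32
        emb = nn.Embedding(len(stoi) + 2, d_model)
        model = nn.Transformer(d_model=d_model, nhead=4, num_encoder_layers=2,
                               num_decoder_layers=2, dim_feedforward=64, batch_first=True)
        out = model(emb(src), emb(tgt),
                    tgt_mask=nn.Transformer.generate_square_subsequent_mask(tgt.size(1)))
        print(out.shape)  # (batch, tgt_len, d_model); project to the vocabulary to train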

  • Open Access

    ARTICLE

    Neural Machine Translation by Fusing Key Information of Text

    Shijie Hu1, Xiaoyu Li1,*, Jiayu Bai1, Hang Lei1, Weizhong Qian1, Sunqiang Hu1, Cong Zhang2, Akpatsa Samuel Kofi1, Qian Qiu2,3, Yong Zhou4, Shan Yang5

    CMC-Computers, Materials & Continua, Vol.74, No.2, pp. 2803-2815, 2023, DOI:10.32604/cmc.2023.032732 - 31 October 2022

    Abstract When the Transformer was proposed by Google in 2017, it was first used for machine translation tasks and achieved the state of the art at that time. Although current neural machine translation models can generate high-quality translations, mistranslations and omissions still occur when translating the key information of long sentences. On the other hand, the most important part of traditional translation tasks is the translation of key information. In the translation results, as long as the key information is translated accurately and completely, even if other parts of the results are… More >
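
    The excerpt does not state how the key information is fused; one simple approach, sketched below purely as an assumption, is to encode the extracted key words separately and gate their pooled representation into each source position.

        # Hypothetical fusion of "key information" into encoder outputs:
        # mean-pool the key-token encodings and merge them with a learned gate.
        import torch
        import torch.nn as nn

        class KeyInfoFusion(nn.Module):
            def __init__(self, d_model: int):
                super().__init__()
                self.gate = nn.Linear(2 * d_model, d_model)

            def forward(self, src_enc, key_enc):
                # src_enc: (batch, src_len, d); key_enc: (batch, key_len, d)
                key_summary = key_enc.mean(dim=1, keepdim=True).expand(-1, src_enc.size(1), -1)
                g = torch.sigmoid(self.gate(torch.cat([src_enc, key_summary], dim=-1)))
                return g * src_enc + (1 - g) * key_summary  # gated fusion

        fusion = KeyInfoFusion(d_model=16)
        print(fusion(torch.randn(2, 7, 16), torch.randn(2, 3, 16)).shape)  # torch.Size([2, 7, 16])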

  • Open Access

    ARTICLE

    Translation of English Language into Urdu Language Using LSTM Model

    Sajadul Hassan Kumhar1, Syed Immamul Ansarullah2, Akber Abid Gardezi3, Shafiq Ahmad4, Abdelaty Edrees Sayed4, Muhammad Shafiq5,*

    CMC-Computers, Materials & Continua, Vol.74, No.2, pp. 3899-3912, 2023, DOI:10.32604/cmc.2023.032290 - 31 October 2022

    Abstract English-to-Urdu machine translation is still in its infancy and lacks simple translation methods that provide adequate English-to-Urdu translation. In order to make knowledge available to the masses, there should be mechanisms and tools in place to make content understandable by translating it from the source language to the target language in an automated fashion. Machine translation has achieved this goal with encouraging results. When decoding the source text into the target language, the translator checks all the characteristics of the text. To achieve machine translation, rule-based, computational, hybrid and neural machine translation… More >

  • Open Access

    ARTICLE

    DLBT: Deep Learning-Based Transformer to Generate Pseudo-Code from Source Code

    Walaa Gad1,*, Anas Alokla1, Waleed Nazih2, Mustafa Aref1, Abdel-badeeh Salem1

    CMC-Computers, Materials & Continua, Vol.70, No.2, pp. 3117-3132, 2022, DOI:10.32604/cmc.2022.019884 - 27 September 2021

    Abstract Understanding the content of source code and its regular expressions is very difficult when they are written in an unfamiliar language. Pseudo-code explains and describes the content of the code without using syntax or programming-language technicalities. However, writing pseudo-code for each code instruction is laborious. Recently, neural machine translation has been used to generate textual descriptions for source code. In this paper, a novel deep learning-based transformer (DLBT) model is proposed for automatic pseudo-code generation from source code. The proposed model uses deep learning which is based on Neural Machine Translation (NMT)… More >

  • Open Access

    ARTICLE

    A Real-Time Automatic Translation of Text to Sign Language

    Muhammad Sanaullah1,*, Babar Ahmad2, Muhammad Kashif2, Tauqeer Safdar2, Mehdi Hassan3, Mohd Hilmi Hasan4, Norshakirah Aziz4

    CMC-Computers, Materials & Continua, Vol.70, No.2, pp. 2471-2488, 2022, DOI:10.32604/cmc.2022.019420 - 27 September 2021

    Abstract Communication is a basic need of every human being; through it, people learn, express their feelings, and exchange ideas, but deaf people cannot hear or speak. For communication, they use various hand gestures, also known as Sign Language (SL), which they learn at special schools. As hearing people have generally not taken SL classes, they are unable to perform the signs for daily routine sentences (e.g., what are the specifications of this mobile phone?). A technological solution can help overcome this communication gap so that hearing people can communicate with deaf people. This… More >

  • Open Access

    ARTICLE

    Integrating Deep Learning and Machine Translation for Understanding Unrefined Languages

    HongGeun Ji1,2, Soyoung Oh1, Jina Kim3, Seong Choi1,2, Eunil Park1,2,*

    CMC-Computers, Materials & Continua, Vol.70, No.1, pp. 669-678, 2022, DOI:10.32604/cmc.2022.019521 - 07 September 2021

    Abstract In the field of natural language processing (NLP), the advancement of neural machine translation has paved the way for cross-lingual research. Yet, most studies in NLP have evaluated the proposed language models on well-refined datasets. We investigate whether a machine translation approach is suitable for multilingual analysis of unrefined datasets, particularly chat messages on Twitch. To address this, we collected a dataset of 7,066,854 and 3,365,569 chat messages from English and Korean streams, respectively. We employed several machine learning classifiers and neural networks with two different types of embedding: word-sequence embedding and the… More >
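
    The pipeline implied here can be summarized as "translate first, then classify with a single English model"; the sketch below shows only that shape, with translate() and classify() as hypothetical stubs rather than the authors' components.

        # Pipeline sketch: normalize multilingual chat via machine translation into
        # English, then run one English-language classifier over all messages.
        from typing import List, Tuple

        def translate(text: str, src_lang: str) -> str:
            # Placeholder: plug in any NMT model or translation API here.
            return text if src_lang == "en" else f"<translated:{text}>"

        def classify(text: str) -> str:
            # Placeholder: plug in a trained English-only classifier here.
            return "positive" if "good" in text.lower() else "other"

        def analyze_chat(messages: List[Tuple[str, str]]) -> List[str]:
            # messages: (language_code, raw chat message) pairs
            return [classify(translate(msg, lang)) for lang, msg in messages]

        print(analyze_chat([("en", "good play!"), ("ko", "대박")]))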

  • Open Access

    ARTICLE

    A Novel Beam Search to Improve Neural Machine Translation for English-Chinese

    Xinyue Lin1, Jin Liu1,*, Jianming Zhang2, Se-Jung Lim3

    CMC-Computers, Materials & Continua, Vol.65, No.1, pp. 387-404, 2020, DOI:10.32604/cmc.2020.010984 - 23 July 2020

    Abstract Neural Machine Translation (NMT) is an end-to-end learning approach for automated translation, overcoming the weaknesses of conventional phrase-based translation systems. Although NMT-based systems have gained popularity in commercial translation applications, there is still plenty of room for improvement. As the most popular search algorithm in NMT, beam search is vital to the translation result. However, traditional beam search can produce duplicate or missing translations due to its target-sequence selection strategy. Aiming to alleviate this problem, this paper proposes neural machine translation improvements based on a novel beam search evaluation function. And we… More >
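
    For context, the sketch below is a plain length-normalized beam search over a toy next-token scorer; the ranking line marked "evaluation function" is the part a modified evaluation function such as the one proposed here would replace. The toy model and alpha value are arbitrary.

        # Plain beam search with a length-normalized evaluation function
        # (score = log-prob / length**alpha) over a toy step function.
        import math
        from typing import Callable, List, Tuple

        def beam_search(step: Callable[[List[int]], List[Tuple[int, float]]],
                        eos: int, beam_size: int = 4, max_len: int = 20,
                        alpha: float = 0.6) -> List[int]:
            rank = lambda b: b[1] / (len(b[0]) ** alpha)  # evaluation function
            beams, finished = [([], 0.0)], []
            for _ in range(max_len):
                candidates = []
                for tokens, logp in beams:
                    for tok, tok_logp in step(tokens):
                        hyp = (tokens + [tok], logp + tok_logp)
                        (finished if tok == eos else candidates).append(hyp)
                if not candidates:
                    break
                beams = sorted(candidates, key=rank, reverse=True)[:beam_size]
            return max(finished or beams, key=rank)[0]

        def toy_step(prefix: List[int]) -> List[Tuple[int, float]]:
            # Always prefers token 1; offers end-of-sequence (0) after three tokens.
            if len(prefix) >= 3:
                return [(0, math.log(0.9)), (1, math.log(0.1))]
            return [(1, math.log(0.6)), (2, math.log(0.4))]

        print(beam_search(toy_step, eos=0))  # [1, 1, 1, 0]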

  • Open Access

    ARTICLE

    Improve Neural Machine Translation by Building Word Vector with Part of Speech

    Jinyingming Zhang1, Jin Liu1,*, Xinyue Lin1

    Journal on Artificial Intelligence, Vol.2, No.2, pp. 79-88, 2020, DOI:10.32604/jai.2020.010476 - 15 July 2020

    Abstract Neural Machine Translation (NMT) based systems are an important technology for translation applications. However, there is plenty of room for improvement in NMT. In the process of NMT, traditional word vectors cannot distinguish the same word under different parts of speech (POS). Aiming to alleviate this problem, this paper proposes a new word-vector training method based on a POS feature. It can efficiently improve the quality of translation by adding the POS feature to the training process of word vectors. We conducted extensive experiments to evaluate our method. The experimental results show… More >
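
    One simple reading of "adding a POS feature to word-vector training", sketched below as an assumption rather than the paper's exact method, is to concatenate a word embedding with a POS-tag embedding so the same surface word gets different vectors under different parts of speech.

        # Illustrative POS-aware embedding: word vector concatenated with a POS vector.
        import torch
        import torch.nn as nn

        class PosAwareEmbedding(nn.Module):
            def __init__(self, vocab_size: int, n_pos_tags: int,
                         word_dim: int = 256, pos_dim: int = 32):
                super().__init__()
                self.word = nn.Embedding(vocab_size, word_dim)
                self.pos = nn.Embedding(n_pos_tags, pos_dim)

            def forward(self, word_ids, pos_ids):
                # (batch, seq_len) -> (batch, seq_len, word_dim + pos_dim)
                return torch.cat([self.word(word_ids), self.pos(pos_ids)], dim=-1)

        emb = PosAwareEmbedding(vocab_size=10000, n_pos_tags=18)
        word_ids = torch.tensor([[12, 12]])  # the same word, e.g. "book", twice
        pos_ids = torch.tensor([[3, 7]])     # tagged as NOUN vs. VERB
        vectors = emb(word_ids, pos_ids)
        print(vectors.shape)                                  # torch.Size([1, 2, 288])
        print(torch.equal(vectors[0, 0], vectors[0, 1]))      # False: POS disambiguates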

Displaying results 1-10 of 12.