English to Urdu Neural Machine Translation using Transformer with Averaged Word Embeddings

Authors

DOI:

https://doi.org/10.21015/vtcs.v14i1.2213

Abstract

In the digital era, content on the World Wide Web (WWW) is vital for everyone, and most of the useful content on the web is in English. Machine translation of web content into the national language of Pakistan, i.e., Urdu, has several applications, such as Urdu text generation, language-resource creation, language research, and making knowledge available to individuals who cannot comprehend English but know Urdu. In this work, we propose a novel English-to-Urdu machine translation model based on the transformer architecture that exploits the average of three word embeddings: Urdu word2vec (skipgram-based) embeddings, part-of-speech-ngram (POS-Ngram) embeddings, and POS-POS embeddings; the latter two encode the rich morphological and morphosyntactic features of the Urdu language within the word embeddings. Experiments are performed on a manually compiled English-Urdu parallel corpus drawn from the OPUS corpora and GitHub. The proposed transformer-based approach is compared with fine-tuned Llama-3-8B, T5-small, Long Short-Term Memory (LSTM), and Bi-directional LSTM (Bi-LSTM) models, using BLEU and ROUGE-L as evaluation metrics. The results suggest that the proposed model outperforms T5-small, LSTM, and Bi-LSTM by ≈2.15, 7.44, and 5.53 BLEU points, respectively, and by ≈1.7, 2.5, and 4.12 ROUGE-L points, respectively, while showing comparable performance to the fine-tuned Llama-3-8B.
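The averaging step described in the abstract can be illustrated with a minimal sketch. The snippet below is a hypothetical toy version, not the authors' implementation: three separate embedding tables (word2vec, POS-Ngram, POS-POS) with toy random contents are combined by element-wise averaging per token, producing the single embedding matrix that would feed the transformer encoder. Table sizes, dimensions, and the function name `averaged_embedding` are placeholders for illustration.

```python
import numpy as np

# Toy embedding tables standing in for the three embeddings named in the
# abstract; vocabulary size and dimension are arbitrary placeholders.
rng = np.random.default_rng(0)
vocab_size, dim = 10, 8

word2vec_emb = rng.normal(size=(vocab_size, dim))   # Urdu word2vec (skipgram-based)
pos_ngram_emb = rng.normal(size=(vocab_size, dim))  # POS-Ngram embeddings
pos_pos_emb = rng.normal(size=(vocab_size, dim))    # POS-POS embeddings

def averaged_embedding(token_ids):
    """Look up each of the three tables and average the vectors per token."""
    stacked = np.stack([word2vec_emb[token_ids],
                        pos_ngram_emb[token_ids],
                        pos_pos_emb[token_ids]])
    return stacked.mean(axis=0)  # shape: (len(token_ids), dim)

tokens = np.array([1, 4, 7])
avg = averaged_embedding(tokens)
print(avg.shape)  # (3, 8)
```

Averaging keeps the embedding dimension fixed, so the transformer's input size is unchanged compared with using any single embedding alone, unlike concatenation, which would triple it.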

Author Biography

Syed Jamal Ud Din, Institute of Health Sciences, University of Peshawar, Pakistan

Lecturer, Institute of Health Sciences, University of Peshawar, Pakistan.

References

S. H. Kumhar et al., "Translation of English language into Urdu language using LSTM model," Computers, Materials and Continua, vol. 74, no. 2, pp. 3899–3912, 2023.

S. Nazir, M. Asif, S. A. Sahi, S. Ahmad, Y. Y. Ghadi, and M. H. Aziz, "Toward the development of large-scale word embedding for low-resourced language," IEEE Access, vol. 10, pp. 54091–54097, 2022.

H. Israr, M. K. Shahzad, and S. Anwar, "Improved Urdu–English neural machine translation with a fully convolutional neural network encoder," International Journal of Mathematical, Engineering and Management Sciences, vol. 9, no. 5, pp. 1067–1088, 2024.

M. N. U. Hassan et al., "LKMT: Linguistics knowledge-driven multi-task neural machine translation for Urdu and English," Computers, Materials and Continua, vol. 81, no. 1, pp. 951–969, 2024.

A. Ali, M. S. Khan, and M. A. Khan, "Author profiling from short Romanized Urdu messages: A preliminary investigation using transfer learning models," VFAST Transactions on Software Engineering, 2023.

D. Jurafsky and J. H. Martin, Speech and Language Processing, 3rd ed. (draft), 2021.

T. Dozat and C. D. Manning, "Deep biaffine attention for neural dependency parsing," in Proc. Int. Conf. Learn. Represent. (ICLR), 2017.

A. Vaswani, N. Shazeer, N. Parmar, et al., "Attention is all you need," in Adv. Neural Inf. Process. Syst., 2017.

J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of deep bidirectional transformers for language understanding," in Proc. NAACL-HLT, Minneapolis, MN, USA, 2019, pp. 4171–4186.

T. Mikolov, K. Chen, G. Corrado, and J. Dean, "Efficient estimation of word representations in vector space," arXiv preprint arXiv:1301.3781, 2013.

D. Chen and C. D. Manning, "A fast and accurate dependency parser using neural networks," in Proc. Conf. Empir. Methods Nat. Lang. Process. (EMNLP), 2014, pp. 740–750.

A. Basit, N. U. Azeemi, and W. Raza, "Challenges in Urdu machine translation," in Proc. 7th Workshop Technol. Mach. Transl. Low-Resour. Lang. (LoResMT), 2024.

H. Israr et al., "Neural machine translation models with attention-based dropout layer," Computers, Materials and Continua, vol. 75, no. 2, pp. 2981–3009, 2023.

M. N. U. Hassan, M. A. Khan, S. Khan, et al., "LKMT: Linguistics knowledge-driven multi-task neural machine translation for Urdu and English," Applied Sciences, 2024.

M. Ahmed et al., "Urdu-to-English based unsupervised machine translation," Journal of Computing and Social Sciences, 2024.

T. Z. Shah, M. Imran, and S. M. Ismail, "A diachronic study determining syntactic and semantic features of Urdu–English neural machine translation," Journal of King Saud University – Computer and Information Sciences, 2023.

S. A. Rauf and N. Hira, "Development of an Urdu–English religious domain parallel corpus," in Proc. Mach. Transl. Summit, 2023.

M. Andrabi and A. Wahid, "Machine translation system using deep learning for English to Urdu," Journal of King Saud University – Computer and Information Sciences, 2022.

F. T. Zuhra and K. Saleem, "Hybrid embeddings for transition-based dependency parsing of free word order languages," Inf. Process. Manage., vol. 60, no. 3, 2023.

E. J. Hu, Y. Shen, P. Wallis, Z. Allen-Zhu, Y. Li, S. Wang, L. Wang, and W. Chen, "LoRA: Low-rank adaptation of large language models," arXiv preprint arXiv:2106.09685, 2021.

Published

2026-02-22

How to Cite

Zuhra, F. T., Jamal Ud Din, S., Ali, H., Naz, S., Rasool, S., & Idrees, F. (2026). English to Urdu Neural Machine Translation using Transformer with Averaged Word Embeddings. VAWKUM Transactions on Computer Sciences, 14(1), 01–14. https://doi.org/10.21015/vtcs.v14i1.2213