
Neural Machine Translation for Low-Resource Languages

Primary supervisor

Reza Haffari

The proposed project aims to develop new methodologies for building NMT systems between extremely low-resource languages and English. Recent advances in neural machine translation (NMT) represent a significant step forward in machine translation capability. However, "NMT systems have a steeper learning curve with respect to the amount of training data, resulting in worse quality in low-resource settings". A number of emerging approaches, such as zero-resource and unsupervised NMT, have investigated alternative ways of building NMT models when sufficient parallel corpora are not available (e.g. [1, 2]). This project investigates methods to enable high-performing NMT in low-resource scenarios.
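As a rough illustration of the kind of approach explored in work like [1], the sketch below shows a toy multi-task NMT setup in which a low-resource language pair shares encoder and decoder parameters with a higher-resource auxiliary pair. This is a minimal, hypothetical example: the model sizes, the TinyNMT and fake_batch names, and the random stand-in data are assumptions made for illustration, not the cited method or the proposed project's system.

```python
# Minimal multi-task NMT sketch (illustrative only, not the project's actual system).
# A high-resource auxiliary pair and a low-resource pair share one encoder/decoder,
# so the shared parameters benefit from the larger corpus.
import torch
import torch.nn as nn

VOCAB, HID, PAD = 1000, 128, 0   # toy vocabulary size, hidden size, padding id

class TinyNMT(nn.Module):
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(VOCAB, HID, padding_idx=PAD)
        self.tgt_emb = nn.Embedding(VOCAB, HID, padding_idx=PAD)
        self.encoder = nn.GRU(HID, HID, batch_first=True)   # shared across tasks
        self.decoder = nn.GRU(HID, HID, batch_first=True)   # shared across tasks
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, src, tgt_in):
        _, h = self.encoder(self.src_emb(src))           # encode the source sentence
        dec, _ = self.decoder(self.tgt_emb(tgt_in), h)   # teacher-forced decoding
        return self.out(dec)                             # logits over the target vocabulary

def fake_batch(n=32, length=10):
    """Stand-in for real parallel data: random token ids."""
    src = torch.randint(1, VOCAB, (n, length))
    tgt = torch.randint(1, VOCAB, (n, length))
    return src, tgt

model = TinyNMT()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)

# Alternate batches from the auxiliary high-resource pair and the low-resource pair.
tasks = {"aux": fake_batch, "low": fake_batch}   # stand-ins for per-task data loaders
for step in range(200):
    task = "aux" if step % 2 == 0 else "low"
    src, tgt = tasks[task]()                     # in practice: sample from that task's corpus
    logits = model(src, tgt[:, :-1])             # predict each next target token
    loss = loss_fn(logits.reshape(-1, VOCAB), tgt[:, 1:].reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()
```

In a real system the random batches would be replaced by parallel corpora for each task, and design choices such as which layers to share and how to sample tasks are themselves central research questions in this setting.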

[1] Poorya Zaremoodi and Gholamreza Haffari. Neural Machine Translation for Bilingually Scarce Scenarios: A Deep Multi-task Learning Approach. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT), 2018.

[2] Ming Liu, Wray Buntine, and Gholamreza Haffari. Learning How to Actively Learn: A Deep Imitation Learning Approach. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL), 2018.

Required knowledge

Machine Learning

Deep Learning


Learn more about minimum entry requirements.