Context-Dependent Neural Machine Translation

Primary supervisor

Reza Haffari

The meaning of an utterance depends on the broader context in which it appears. The context may be the surrounding paragraph, the document, the conversational history, or the author who produced the utterance. In this project, we develop effective methods for translating text using such context, e.g., the other sentences in the document or the conversational history.

The need to leverage context in machine translation to produce more accurate and coherent translations has long been recognized. However, this crucial aspect of the translation process has been largely neglected by the research community, due to the difficulty of designing models that can condition on the context, coupled with the difficulty of creating workable abstractions of it. In our work, we have made progress on using context for monologue and multi-party bilingual dialogue translation [1, 2, 3], capitalizing on the flexibility and expressive power of deep learning and neural networks. In this project, we will push beyond the current state of the art.
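To give a concrete flavour of what "conditioning on the context" can mean, the toy sketch below (not the actual models of the papers cited here) attends over embeddings of preceding sentences and blends the resulting context vector into the current sentence representation. All names, dimensions, and the gating scheme are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def contextualize(current, context, gate=0.5):
    """Blend a sentence embedding with an attention-weighted context vector.

    current: (d,) embedding of the sentence being translated
    context: (n, d) embeddings of the n preceding sentences
    gate:    interpolation weight given to the context vector (hypothetical)
    """
    scores = context @ current        # dot-product attention scores, shape (n,)
    weights = softmax(scores)         # attention distribution over context sentences
    ctx_vec = weights @ context       # weighted sum of context embeddings, shape (d,)
    return (1 - gate) * current + gate * ctx_vec

# Toy usage: 3 context sentences, embedding dimension 4
rng = np.random.default_rng(0)
ctx = rng.normal(size=(3, 4))
cur = rng.normal(size=(4,))
rep = contextualize(cur, ctx)         # context-aware representation, shape (4,)
```

In a real context-aware NMT system, `rep` would replace (or augment) the plain sentence encoding fed to the decoder; selective or sparse variants of the attention step are one way to scale this to long documents.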

[1] Selective Attention for Context-aware Neural Machine Translation
Sameen Maruf, Andre Martins, Gholamreza Haffari
Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics - Human Language Technologies (NAACL-HLT), 2019.

[2] Contextual Neural Model for Translating Bilingual Multi-Speaker Conversations
Sameen Maruf, Andre Martins, Gholamreza Haffari
Proceedings of the Third Conference on Machine Translation (WMT), 2018.

[3] Document Context Neural Machine Translation with Memory Networks
Sameen Maruf, Gholamreza Haffari
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL), 2018.

Required knowledge

Machine Learning

Deep Learning
