Explorations on Multi-lingual Neural Machine Translation
Deep (recurrent) neural networks have been shown to successfully learn complex mappings between arbitrary-length input and output sequences within the effective framework of encoder-decoder networks. We investigate extensions of these sequence-to-sequence models to handle multiple sequences at the same time, within the same model. This reduces to the problem of multi-lingual neural machine translation (MLNMT), as we explore the applicability and benefits of MLNMT on (1) large-scale machine translation tasks between all six languages of the WMT'15 shared task, (2) low-resource language transfer problems for Finnish, Uzbek, and Turkish into English translation, (3) multi-source translation tasks where multi-way parallel text is available, and (4) zero-resource translation tasks where no bi-text is available between the two languages. We will further discuss natural extensions of the MLNMT model for system combination (of SMT and NMT models) and larger-context NMT (where the entire document is given during translation).
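To make the multi-way idea concrete, here is a minimal sketch of an encoder-decoder that pairs one encoder per source language with one decoder per target language through a shared bottleneck, so any source can in principle be routed to any target (the property that enables the multi-source and zero-resource settings above). The GRU components, the shared linear layer, and all names and sizes are illustrative assumptions, not the speaker's actual model.

```python
import torch
import torch.nn as nn

class MultiWayNMT(nn.Module):
    """Toy multi-way NMT: per-language encoders/decoders, shared bottleneck."""

    def __init__(self, vocab_sizes, emb_dim=64, hid_dim=128):
        super().__init__()
        # One embedding + GRU encoder per source language.
        self.encoders = nn.ModuleDict({
            lang: nn.ModuleDict({
                "emb": nn.Embedding(v, emb_dim),
                "rnn": nn.GRU(emb_dim, hid_dim, batch_first=True),
            }) for lang, v in vocab_sizes.items()
        })
        # Shared projection acting as a language-independent layer.
        self.shared = nn.Linear(hid_dim, hid_dim)
        # One embedding + GRU decoder + output layer per target language.
        self.decoders = nn.ModuleDict({
            lang: nn.ModuleDict({
                "emb": nn.Embedding(v, emb_dim),
                "rnn": nn.GRU(emb_dim, hid_dim, batch_first=True),
                "out": nn.Linear(hid_dim, v),
            }) for lang, v in vocab_sizes.items()
        })

    def forward(self, src_lang, src, tgt_lang, tgt):
        enc = self.encoders[src_lang]
        _, h = enc["rnn"](enc["emb"](src))       # h: (1, batch, hid_dim)
        h = torch.tanh(self.shared(h))           # shared bottleneck
        dec = self.decoders[tgt_lang]
        out, _ = dec["rnn"](dec["emb"](tgt), h)  # teacher-forced decoding
        return dec["out"](out)                   # (batch, tgt_len, vocab)

# Usage: any of the three encoders can feed any of the three decoders,
# including pairs never seen together during training (zero-resource).
vocabs = {"en": 1000, "fr": 1000, "fi": 1000}
model = MultiWayNMT(vocabs)
src = torch.randint(0, 1000, (2, 7))    # batch of 2 source sentences
tgt = torch.randint(0, 1000, (2, 5))    # shifted target tokens
logits = model("fi", src, "en", tgt)    # Finnish -> English
print(logits.shape)                     # torch.Size([2, 5, 1000])
```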
- Date:
- Speakers:
- Orhan Firat
- Affiliation:
- Middle East Technical University

Arul Menezes
Partner Research Manager