Zero-Shot Cross-Lingual Transfer with Meta Learning

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Learning what to share between tasks has become a topic of great importance, as strategic sharing of knowledge has been shown to improve downstream task performance. This is particularly important for multilingual applications, since most of the world's languages are under-resourced. Here, we consider the setting of training models on multiple languages at the same time, when little or no data is available for languages other than English. We show that this challenging setup can be approached using meta-learning: in addition to training a source language model, a second model learns to select the training instances that are most beneficial to the first. We experiment with standard supervised, zero-shot cross-lingual, and few-shot cross-lingual settings on different natural language understanding tasks (natural language inference and question answering). Our extensive experiments demonstrate the consistent effectiveness of meta-learning across a total of 15 languages. We improve upon the state of the art for zero-shot and few-shot NLI (on MultiNLI and XNLI) and QA (on the MLQA dataset). A comprehensive error analysis indicates that the correlation of typological features between languages can partly explain when parameter sharing learned via meta-learning is beneficial.
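The abstract describes a MAML-style setup: a model is adapted on one task or language in an inner loop, and the shared parameters are then updated in an outer loop so that such adaptation transfers well. The sketch below is a minimal, generic first-order MAML step on a toy linear-regression problem, assuming squared-error loss; the function names (`loss_grad`, `maml_step`) and the linear model are illustrative and not from the paper, which additionally learns which training instances to share and applies the idea cross-lingually.

```python
import numpy as np

def loss_grad(w, X, y):
    """Gradient of mean squared error for a linear model predicting X @ w."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

def maml_step(w, tasks, inner_lr=0.05, meta_lr=0.1, inner_steps=3):
    """One first-order MAML meta-update over a batch of tasks.

    Each task is (X_support, y_support, X_query, y_query): the inner loop
    adapts a copy of the parameters on the support set, and the outer loop
    moves the shared parameters using the query-set gradient at the adapted
    point (the first-order approximation drops second derivatives).
    """
    meta_grad = np.zeros_like(w)
    for Xs, ys, Xq, yq in tasks:
        w_task = w.copy()
        for _ in range(inner_steps):            # inner-loop adaptation
            w_task -= inner_lr * loss_grad(w_task, Xs, ys)
        meta_grad += loss_grad(w_task, Xq, yq)  # outer-loop signal
    return w - meta_lr * meta_grad / len(tasks)
```

Repeatedly calling `maml_step` on a batch of related tasks drives the shared parameters toward a point from which a few inner-loop steps suffice to fit each task, which is the intuition behind using meta-learning for low-resource cross-lingual transfer.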
Original language: English
Title of host publication: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Publisher: Association for Computational Linguistics
Publication date: 2020
Pages: 4547-4562
DOIs
Publication status: Published - 2020
Event: The 2020 Conference on Empirical Methods in Natural Language Processing - online
Duration: 16 Nov 2020 - 20 Nov 2020
http://2020.emnlp.org

