Zero-Shot Dependency Parsing with Worst-Case Aware Automated Curriculum Learning
Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed
Documents
- Full text
Publisher's published version, 329 KB, PDF document
Large multilingual pretrained language models such as mBERT and XLM-RoBERTa have been found to be surprisingly effective for cross-lingual transfer of syntactic parsing models (Wu and Dredze, 2019), but only between related languages. However, source and training languages are rarely related when parsing truly low-resource languages. To close this gap, we adopt a method from multi-task learning, which relies on automated curriculum learning, to dynamically optimize for parsing performance on outlier languages. We show that this approach is significantly better than uniform and size-proportional sampling in the zero-shot setting.
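The core idea of worst-case aware sampling, as the abstract describes it, is to bias the training distribution toward languages on which the parser currently does worst, rather than sampling uniformly or in proportion to treebank size. A minimal sketch of one common way to realize this, a multiplicative-weights update over the per-language sampling distribution, is shown below; the function name, the learning rate, and the update rule are illustrative assumptions, not the paper's actual implementation.

```python
import math
import random

def worst_case_sampler(num_langs, lr=0.5):
    """Hypothetical sketch: maintain one weight per training language
    and push sampling mass toward languages with high observed loss."""
    weights = [1.0] * num_langs  # uniform start, like size-agnostic sampling

    def probs():
        # Current normalized sampling distribution over languages.
        total = sum(weights)
        return [w / total for w in weights]

    def sample(rng=random):
        # Draw the next training language from the current distribution.
        return rng.choices(range(num_langs), weights=weights, k=1)[0]

    def update(lang, loss):
        # Exponentiate the observed loss so harder (higher-loss)
        # languages are sampled more often in later steps.
        weights[lang] *= math.exp(lr * loss)

    return probs, sample, update
```

Under this sketch, a language whose loss stays high accumulates weight and dominates the curriculum, which is the worst-case aware behavior contrasted with uniform and size-proportional sampling in the abstract.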
Original language | English |
---|---|
Title | ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Short Papers) |
Editors | Smaranda Muresan, Preslav Nakov, Aline Villavicencio |
Publisher | Association for Computational Linguistics (ACL) |
Publication date | 2022 |
Pages | 578-587 |
ISBN (Electronic) | 9781955917223 |
DOI | |
Status | Published - 2022 |
Event | 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022 - Dublin, Ireland. Duration: 22 May 2022 → 27 May 2022 |
Conference
Conference | 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022 |
---|---|
Country | Ireland |
City | Dublin |
Period | 22/05/2022 → 27/05/2022 |
Sponsor | Amazon Science, Bloomberg Engineering, et al., Google Research, Liveperson, Meta |
Bibliographic note
Publisher Copyright:
© 2022 Association for Computational Linguistics.
ID: 341490429