Zero-Shot Dependency Parsing with Worst-Case Aware Automated Curriculum Learning

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Documents

  • Full text

    Publisher's published version, 329 KB, PDF document

Large multilingual pretrained language models such as mBERT and XLM-RoBERTa have been found to be surprisingly effective for cross-lingual transfer of syntactic parsing models (Wu and Dredze, 2019), but only between related languages. However, when parsing truly low-resource languages, the source and training languages are rarely related. To close this gap, we adopt a method from multi-task learning, which relies on automated curriculum learning, to dynamically optimize for parsing performance on outlier languages. We show that this approach is significantly better than uniform and size-proportional sampling in the zero-shot setting.
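The abstract names the approach only at a high level. Purely as an illustrative sketch, not the authors' implementation, the snippet below shows one way a worst-case aware curriculum over training treebanks could be realized: an Exp3-style bandit sampler (in the spirit of automated curriculum learning) that upweights languages with high recent loss. The treebank names, hyperparameters, and the reward definition are assumptions.

```python
# Illustrative sketch only: a worst-case aware curriculum sampler over training
# treebanks. Names, hyperparameters, and the reward are assumptions, not the
# paper's actual code.
import math
import random


class WorstCaseAwareSampler:
    """Exp3-style bandit over treebanks: languages with high recent loss
    (the current 'worst cases') are sampled more often."""

    def __init__(self, treebanks, lr=0.1, explore=0.05):
        self.treebanks = list(treebanks)
        self.lr = lr              # step size for weight updates
        self.explore = explore    # uniform mixing for exploration
        self.log_weights = {t: 0.0 for t in self.treebanks}

    def probabilities(self):
        m = max(self.log_weights.values())
        exp = {t: math.exp(w - m) for t, w in self.log_weights.items()}
        z = sum(exp.values())
        k = len(self.treebanks)
        return {t: (1 - self.explore) * e / z + self.explore / k
                for t, e in exp.items()}

    def sample(self):
        probs = self.probabilities()
        r, acc = random.random(), 0.0
        for t, p in probs.items():
            acc += p
            if r <= acc:
                return t
        return self.treebanks[-1]

    def update(self, treebank, loss):
        # Reward = observed loss, so hard (worst-case) languages gain weight;
        # importance-weight by the sampling probability, as in Exp3.
        p = self.probabilities()[treebank]
        self.log_weights[treebank] += self.lr * loss / p


# Usage sketch: pick a treebank per batch instead of uniform or
# size-proportional sampling, then feed the parser's loss back in.
sampler = WorstCaseAwareSampler(["en_ewt", "ar_padt", "wo_wtb"])
for step in range(3):
    tb = sampler.sample()
    batch_loss = 1.0  # placeholder: parser loss on a batch drawn from `tb`
    sampler.update(tb, batch_loss)
```

The contrast with the baselines mentioned in the abstract is that uniform sampling gives every treebank equal probability and size-proportional sampling weights treebanks by their number of sentences, whereas a worst-case aware curriculum keeps shifting probability mass toward whichever languages the parser currently handles worst.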

Original language: English
Title: ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Short Papers)
Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Publisher: Association for Computational Linguistics (ACL)
Publication date: 2022
Pages: 578-587
ISBN (electronic): 9781955917223
DOI
Status: Published - 2022
Event: 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022 - Dublin, Ireland
Duration: 22 May 2022 - 27 May 2022

Conference

Conference: 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022
Country: Ireland
City: Dublin
Period: 22/05/2022 - 27/05/2022
Sponsors: Amazon Science, Bloomberg Engineering, et al., Google Research, Liveperson, Meta

Bibliographical note

Publisher Copyright:
© 2022 Association for Computational Linguistics.

ID: 341490429