Parameter sharing between dependency parsers for related languages
Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed
Documents
- OA.Parameter sharing between dependency parsers for related languages
Publisher's published version, 221 KB, PDF document
Previous work has suggested that parameter sharing between transition-based neural dependency parsers for related languages can lead to better performance, but there is no consensus on which parameters to share. We present an evaluation of 27 different parameter sharing strategies across 10 languages, representing five pairs of related languages, each pair from a different language family. We find that sharing transition classifier parameters always helps, whereas the usefulness of sharing word and/or character LSTM parameters varies. Based on this result, we propose an architecture where the transition classifier is shared, and the sharing of word and character parameters is controlled by a parameter that can be tuned on validation data. This model is linguistically motivated and obtains significant improvements over a monolingually trained baseline. We also find that sharing transition classifier parameters helps when training a parser on unrelated language pairs, but that, in the case of unrelated languages, sharing too many parameters does not help.
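The core idea in the abstract can be illustrated with hard parameter sharing: components that are shared between the two parsers of a language pair are literally the same parameter object, so updates from either language's training data affect both. The sketch below is a minimal, framework-free illustration of that idea; the component names (`word_lstm`, `char_lstm`, `classifier`), sizes, and the `build_pair` helper are hypothetical stand-ins, not the paper's implementation.

```python
import random

def fresh_weights(n, rng):
    """Toy stand-in for a real parameter tensor."""
    return [rng.random() for _ in range(n)]

def build_pair(share_classifier=True, share_lstm=False, seed=0):
    """Build parameter sets for the two parsers of a related language pair.

    Hard sharing means the shared component is the *same object* in both
    parsers, so a gradient update seen through one parser is seen by the
    other. Following the abstract's finding, the transition classifier is
    shared by default, while LSTM sharing is a tunable choice.
    """
    rng = random.Random(seed)

    def component():
        return {
            "word_lstm": fresh_weights(8, rng),   # word-level encoder params
            "char_lstm": fresh_weights(8, rng),   # character-level encoder params
            "classifier": fresh_weights(4, rng),  # transition classifier params
        }

    lang_a, lang_b = component(), component()
    if share_lstm:
        # Tie the encoders: both languages read and update the same lists.
        lang_b["word_lstm"] = lang_a["word_lstm"]
        lang_b["char_lstm"] = lang_a["char_lstm"]
    if share_classifier:
        lang_b["classifier"] = lang_a["classifier"]
    return lang_a, lang_b
```

With `share_classifier=True, share_lstm=False` this mirrors the proposed architecture for related languages: one shared transition classifier, separate word/character encoders; flipping `share_lstm` is the validation-tunable choice the abstract describes.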
| Original language | English |
|---|---|
| Title | Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing |
| Publisher | Association for Computational Linguistics |
| Publication date | 2018 |
| Pages | 4992-4997 |
| Status | Published - 2018 |
| Event | 2018 Conference on Empirical Methods in Natural Language Processing - Brussels, Belgium. Duration: 31 Oct 2018 → 4 Nov 2018 |
Conference

| Conference | 2018 Conference on Empirical Methods in Natural Language Processing |
|---|---|
| Country | Belgium |
| City | Brussels |
| Period | 31/10/2018 → 04/11/2018 |
ID: 214507219