Word Order Does Matter (And Shuffled Language Models Know It)

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › Peer-reviewed


Recent studies have shown that language models pretrained and/or fine-tuned on randomly permuted sentences exhibit competitive performance on GLUE, putting into question the importance of word order information. Somewhat counter-intuitively, some of these studies also report that position embeddings appear to be crucial for models' good performance with shuffled text. We probe these language models for word order information and investigate what position embeddings learned from shuffled text encode, showing that these models retain information pertaining to the original, naturalistic word order. We show this is in part due to a subtlety in how shuffling is implemented in previous work: before rather than after subword segmentation. Surprisingly, we find that even language models trained on text shuffled after subword segmentation retain some semblance of word order information because of statistical dependencies between sentence length and unigram probabilities. Finally, we show that beyond GLUE, a variety of language understanding tasks do require word order information, often to an extent that cannot be learned through fine-tuning.
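The distinction between shuffling before and after subword segmentation can be sketched as follows. This is a minimal illustration, not the paper's implementation: the fixed 4-character segmenter is a hypothetical stand-in for a real BPE tokenizer, and the function names are invented for this example.

```python
import random

def segment(word):
    """Toy subword segmenter: split a word into 4-character pieces.
    A hypothetical stand-in for BPE-style tokenization."""
    return [word[i:i + 4] for i in range(0, len(word), 4)]

def shuffle_before_segmentation(sentence, seed=0):
    """Shuffle whole words first, then segment each word.
    Subword pieces of a word stay adjacent, so local word-order
    information leaks through the shuffle."""
    words = sentence.split()
    random.Random(seed).shuffle(words)
    return [piece for w in words for piece in segment(w)]

def shuffle_after_segmentation(sentence, seed=0):
    """Segment first, then shuffle the subword tokens themselves,
    destroying within-word adjacency as well."""
    tokens = [piece for w in sentence.split() for piece in segment(w)]
    random.Random(seed).shuffle(tokens)
    return tokens

s = "transformers process permuted sentences"
print(shuffle_before_segmentation(s))
print(shuffle_after_segmentation(s))
```

In the "before" variant the pieces of "transformers" ("tran", "sfor", "mers") always appear consecutively regardless of the permutation, whereas the "after" variant scatters them; this is the subtlety the abstract refers to.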

Original language: English
Title: ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers)
Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Publisher: Association for Computational Linguistics (ACL)
Publication date: 2022
Pages: 6907-6919
ISBN (electronic): 9781955917216
DOI:
Status: Published - 2022
Event: 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022 - Dublin, Ireland
Duration: 22 May 2022 - 27 May 2022

Conference

Conference: 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022
Country: Ireland
City: Dublin
Period: 22/05/2022 - 27/05/2022
Sponsors: Amazon Science, Bloomberg Engineering, et al., Google Research, Liveperson, Meta

Bibliographic note

Publisher Copyright:
© 2022 Association for Computational Linguistics.

ID: 341489512