Analogy Training Multilingual Encoders

Publication: Contribution to book/anthology/report · Conference contribution in proceedings · Research · Peer-reviewed


  • Nicolas Garneau
  • Mareike Hartmann
  • Anders Sandholm
  • Sebastian Ruder
  • Ivan Vulić
  • Anders Søgaard
Language encoders represent words and phrases in ways that capture their local semantic relatedness, but are known to be globally inconsistent. Global inconsistency can seemingly be corrected for, in part, by leveraging signals from knowledge bases, but previous results are partial and limited to monolingual English encoders. We extract a large-scale multilingual, multi-word analogy dataset from Wikidata for diagnosing and correcting global inconsistencies, and then implement a four-way Siamese BERT architecture for grounding multilingual BERT (mBERT) in Wikidata through analogy training. We show that analogy training not only improves the global consistency of mBERT and the isomorphism of its language-specific subspaces, but also leads to consistent gains on downstream tasks such as bilingual dictionary induction and sentence retrieval.
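To make the four-way Siamese setup concrete, the following is a minimal, hypothetical sketch, not the authors' implementation. It assumes mean pooling over subword states, a cosine-based offset loss that pushes enc(b) - enc(a) toward enc(d) - enc(c) for an analogy a : b :: c : d, and a toy Wikidata-style quadruple; the names embed and analogy_loss and the example pair are illustrative, and the paper's exact pooling, loss, and data pipeline are not given in the abstract.

    # Minimal sketch (assumptions, not the paper's code): four-way Siamese
    # analogy training on mBERT with a cosine offset loss.
    import torch.nn.functional as F
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    encoder = AutoModel.from_pretrained("bert-base-multilingual-cased")

    def embed(phrases):
        # One shared ("Siamese") encoder serves all four analogy slots;
        # mean-pool subword states into a single phrase vector.
        batch = tokenizer(phrases, padding=True, return_tensors="pt")
        hidden = encoder(**batch).last_hidden_state
        mask = batch["attention_mask"].unsqueeze(-1)
        return (hidden * mask).sum(1) / mask.sum(1)

    def analogy_loss(a, b, c, d):
        # For a : b :: c : d, align the pair offsets:
        # enc(b) - enc(a) should point the same way as enc(d) - enc(c).
        ea, eb, ec, ed = embed(a), embed(b), embed(c), embed(d)
        return 1 - F.cosine_similarity(eb - ea, ed - ec).mean()

    # Toy multilingual quadruple (hypothetical, Wikidata-style).
    loss = analogy_loss(["France"], ["Paris"], ["Danmark"], ["København"])
    loss.backward()  # an optimizer step on the shared encoder would follow

The Siamese aspect is that a single encoder processes all four slots, so gradients from the offset loss update one shared set of mBERT weights rather than four separate towers.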
Original language: English
Title: Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence
Number of pages: 10
Publisher: AAAI Press
Publication date: 2021
Pages: 12884-12892
ISBN (electronic): 978-1-57735-866-4
Status: Published - 2021
Event: 35th AAAI Conference on Artificial Intelligence - Virtual
Duration: 2 Feb 2021 - 9 Feb 2021

Conference

Conference: 35th AAAI Conference on Artificial Intelligence
City: Virtual
Period: 02/02/2021 - 09/02/2021
Series: Proceedings of the AAAI Conference on Artificial Intelligence
Number: 14
Volume: 35
ISSN: 2374-3468


ID: 300671526