Comparing Unsupervised Word Translation Methods Step by Step

Publication: Contribution to book/anthology/report › Conference article in proceedings › Research › peer-reviewed

Standard

Comparing Unsupervised Word Translation Methods Step by Step. / Hartmann, Mareike; Kementchedjhieva, Yova Radoslavova; Søgaard, Anders.

Advances in Neural Information Processing Systems 32 (NIPS 2019). 2019.


Harvard

Hartmann, M, Kementchedjhieva, YR & Søgaard, A 2019, Comparing Unsupervised Word Translation Methods Step by Step. in Advances in Neural Information Processing Systems 32 (NIPS 2019). 33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada, 08/12/2019.

APA

Hartmann, M., Kementchedjhieva, Y. R., & Søgaard, A. (2019). Comparing Unsupervised Word Translation Methods Step by Step. In Advances in Neural Information Processing Systems 32 (NIPS 2019).

Vancouver

Hartmann M, Kementchedjhieva YR, Søgaard A. Comparing Unsupervised Word Translation Methods Step by Step. In Advances in Neural Information Processing Systems 32 (NIPS 2019). 2019.

Author

Hartmann, Mareike; Kementchedjhieva, Yova Radoslavova; Søgaard, Anders. / Comparing Unsupervised Word Translation Methods Step by Step. Advances in Neural Information Processing Systems 32 (NIPS 2019). 2019.

Bibtex

@inproceedings{845fd260bc7a4715845db5ae59d75ab8,
title = "Comparing Unsupervised Word Translation Methods Step by Step",
abstract = "Cross-lingual word vector space alignment is the task of mapping the vocabularies of two languages into a shared semantic space, which can be used for dictionary induction, unsupervised machine translation, and transfer learning. In the unsupervised regime, an initial seed dictionary is learned in the absence of any known correspondences between words, through {\bf distribution matching}, and the seed dictionary is then used to supervise the induction of the final alignment in what is typically referred to as a (possibly iterative) {\bf refinement} step. We focus on the first step and compare distribution matching techniques in the context of language pairs for which mixed training stability and evaluation scores have been reported. We show that, surprisingly, when looking at this initial step in isolation, vanilla GANs are superior to more recent methods, both in terms of precision and robustness. The improvements reported by more recent methods thus stem from the refinement techniques, and we show that we can obtain state-of-the-art performance combining vanilla GANs with such refinement techniques.",
author = "Mareike Hartmann and Kementchedjhieva, {Yova Radoslavova} and Anders S{\o}gaard",
year = "2019",
language = "English",
booktitle = "Advances in Neural Information Processing Systems 32 (NIPS 2019)",
note = "33rd Conference on Neural Information Processing Systems (NeurIPS 2019) ; Conference date: 08-12-2019 Through 14-12-2019",

}

RIS

TY - GEN

T1 - Comparing Unsupervised Word Translation Methods Step by Step

AU - Hartmann, Mareike

AU - Kementchedjhieva, Yova Radoslavova

AU - Søgaard, Anders

PY - 2019

Y1 - 2019

N2 - Cross-lingual word vector space alignment is the task of mapping the vocabularies of two languages into a shared semantic space, which can be used for dictionary induction, unsupervised machine translation, and transfer learning. In the unsupervised regime, an initial seed dictionary is learned in the absence of any known correspondences between words, through distribution matching, and the seed dictionary is then used to supervise the induction of the final alignment in what is typically referred to as a (possibly iterative) refinement step. We focus on the first step and compare distribution matching techniques in the context of language pairs for which mixed training stability and evaluation scores have been reported. We show that, surprisingly, when looking at this initial step in isolation, vanilla GANs are superior to more recent methods, both in terms of precision and robustness. The improvements reported by more recent methods thus stem from the refinement techniques, and we show that we can obtain state-of-the-art performance combining vanilla GANs with such refinement techniques.

AB - Cross-lingual word vector space alignment is the task of mapping the vocabularies of two languages into a shared semantic space, which can be used for dictionary induction, unsupervised machine translation, and transfer learning. In the unsupervised regime, an initial seed dictionary is learned in the absence of any known correspondences between words, through distribution matching, and the seed dictionary is then used to supervise the induction of the final alignment in what is typically referred to as a (possibly iterative) refinement step. We focus on the first step and compare distribution matching techniques in the context of language pairs for which mixed training stability and evaluation scores have been reported. We show that, surprisingly, when looking at this initial step in isolation, vanilla GANs are superior to more recent methods, both in terms of precision and robustness. The improvements reported by more recent methods thus stem from the refinement techniques, and we show that we can obtain state-of-the-art performance combining vanilla GANs with such refinement techniques.

M3 - Article in proceedings

BT - Advances in Neural Information Processing Systems 32 (NIPS 2019)

T2 - 33rd Conference on Neural Information Processing Systems (NeurIPS 2019)

Y2 - 8 December 2019 through 14 December 2019

ER -
