Miðeind's WMT 2021 submission

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Miðeind's WMT 2021 submission. / Símonarson, Haukur Barri; Snæbjarnarson, Vésteinn; Ragnarsson, Pétur Orri; Jónsson, Haukur Páll; Þorsteinsson, Vilhjálmur.

WMT 2021 - 6th Conference on Machine Translation, Proceedings. Association for Computational Linguistics (ACL), 2021. p. 136-139.


Harvard

Símonarson, HB, Snæbjarnarson, V, Ragnarsson, PO, Jónsson, HP & Þorsteinsson, V 2021, Miðeind's WMT 2021 submission. in WMT 2021 - 6th Conference on Machine Translation, Proceedings. Association for Computational Linguistics (ACL), pp. 136-139, 6th Conference on Machine Translation, WMT 2021, Virtual, Online, Dominican Republic, 10/11/2021.

APA

Símonarson, H. B., Snæbjarnarson, V., Ragnarsson, P. O., Jónsson, H. P., & Þorsteinsson, V. (2021). Miðeind's WMT 2021 submission. In WMT 2021 - 6th Conference on Machine Translation, Proceedings (pp. 136-139). Association for Computational Linguistics (ACL).

Vancouver

Símonarson HB, Snæbjarnarson V, Ragnarsson PO, Jónsson HP, Þorsteinsson V. Miðeind's WMT 2021 submission. In WMT 2021 - 6th Conference on Machine Translation, Proceedings. Association for Computational Linguistics (ACL). 2021. p. 136-139

Author

Símonarson, Haukur Barri ; Snæbjarnarson, Vésteinn ; Ragnarsson, Pétur Orri ; Jónsson, Haukur Páll ; Þorsteinsson, Vilhjálmur. / Miðeind's WMT 2021 submission. WMT 2021 - 6th Conference on Machine Translation, Proceedings. Association for Computational Linguistics (ACL), 2021. pp. 136-139

Bibtex

@inproceedings{51002452367140b19869db21f2d9f806,
title = "Mi{\dh}eind's WMT 2021 submission",
abstract = "We present Mi{\dh}eind's submission for the English→Icelandic and Icelandic→English subsets of the 2021 WMT news translation task. Transformer-base models are trained for translation on parallel data to generate backtranslations iteratively. A pretrained mBART-25 model is then adapted for translation using parallel data as well as the last backtranslation iteration. This adapted pretrained model is then used to re-generate backtranslations, and the training of the adapted model is continued.",
author = "S{\'i}monarson, {Haukur Barri} and Sn{\ae}bjarnarson, {V{\'e}steinn} and Ragnarsson, {P{\'e}tur Orri} and J{\'o}nsson, {Haukur P{\'a}ll} and {\TH}orsteinsson, {Vilhj{\'a}lmur}",
note = "Funding Information: This project was supported by the Language Technology Programme for Icelandic 2019–2023. The programme, which is managed and coordinated by Almannar{\'o}mur, is funded by the Icelandic Ministry of Education, Science and Culture. Publisher Copyright: {\textcopyright} 2021 Association for Computational Linguistics; 6th Conference on Machine Translation, WMT 2021 ; Conference date: 10-11-2021 Through 11-11-2021",
year = "2021",
language = "English",
pages = "136--139",
booktitle = "WMT 2021 - 6th Conference on Machine Translation, Proceedings",
publisher = "Association for Computational Linguistics (ACL)",
address = "United States",

}

RIS

TY - GEN

T1 - Miðeind's WMT 2021 submission

AU - Símonarson, Haukur Barri

AU - Snæbjarnarson, Vésteinn

AU - Ragnarsson, Pétur Orri

AU - Jónsson, Haukur Páll

AU - Þorsteinsson, Vilhjálmur

N1 - Funding Information: This project was supported by the Language Technology Programme for Icelandic 2019–2023. The programme, which is managed and coordinated by Almannarómur, is funded by the Icelandic Ministry of Education, Science and Culture. Publisher Copyright: © 2021 Association for Computational Linguistics

PY - 2021

Y1 - 2021

N2 - We present Miðeind's submission for the English→Icelandic and Icelandic→English subsets of the 2021 WMT news translation task. Transformer-base models are trained for translation on parallel data to generate backtranslations iteratively. A pretrained mBART-25 model is then adapted for translation using parallel data as well as the last backtranslation iteration. This adapted pretrained model is then used to re-generate backtranslations, and the training of the adapted model is continued.

AB - We present Miðeind's submission for the English→Icelandic and Icelandic→English subsets of the 2021 WMT news translation task. Transformer-base models are trained for translation on parallel data to generate backtranslations iteratively. A pretrained mBART-25 model is then adapted for translation using parallel data as well as the last backtranslation iteration. This adapted pretrained model is then used to re-generate backtranslations, and the training of the adapted model is continued.

UR - http://www.scopus.com/inward/record.url?scp=85127143543&partnerID=8YFLogxK

M3 - Article in proceedings

AN - SCOPUS:85127143543

SP - 136

EP - 139

BT - WMT 2021 - 6th Conference on Machine Translation, Proceedings

PB - Association for Computational Linguistics (ACL)

T2 - 6th Conference on Machine Translation, WMT 2021

Y2 - 10 November 2021 through 11 November 2021

ER -

ID: 371184978