Sequence classification with human attention

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Sequence classification with human attention. / Barrett, Maria Jung; Bingel, Joachim; Hollenstein, Nora; Rei, Marek; Søgaard, Anders.

Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018). ed. / Anna Korhonen; Ivan Titov. Association for Computational Linguistics, 2018. p. 302–312.

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Harvard

Barrett, MJ, Bingel, J, Hollenstein, N, Rei, M & Søgaard, A 2018, Sequence classification with human attention. in A Korhonen & I Titov (eds), Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018). Association for Computational Linguistics, pp. 302–312, 22nd Conference on Computational Natural Language Learning (CoNLL 2018), Brussels, Belgium, 31/10/2018.

APA

Barrett, M. J., Bingel, J., Hollenstein, N., Rei, M., & Søgaard, A. (2018). Sequence classification with human attention. In A. Korhonen, & I. Titov (Eds.), Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018) (pp. 302–312). Association for Computational Linguistics.

Vancouver

Barrett MJ, Bingel J, Hollenstein N, Rei M, Søgaard A. Sequence classification with human attention. In Korhonen A, Titov I, editors, Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018). Association for Computational Linguistics. 2018. p. 302–312.

Author

Barrett, Maria Jung ; Bingel, Joachim ; Hollenstein, Nora ; Rei, Marek ; Søgaard, Anders. / Sequence classification with human attention. Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018). editor / Anna Korhonen; Ivan Titov. Association for Computational Linguistics, 2018. pp. 302–312

Bibtex

@inproceedings{f78fb48ee4a04cec9bf7355e0e958cfb,
title = "Sequence classification with human attention",
abstract = "Learning attention functions requires largevolumes of data, but many NLP tasks simulatehuman behavior, and in this paper, weshow that human attention really does providea good inductive bias on many attentionfunctions in NLP. Specifically, we useestimated human attention derived from eyetrackingcorpora to regularize attention functionsin recurrent neural networks. We showsubstantial improvements across a range oftasks, including sentiment analysis, grammaticalerror detection, and detection of abusivelanguage.",
author = "Barrett, {Maria Jung} and Joachim Bingel and Nora Hollenstein and Marek Rei and Anders S{\o}gaard",
year = "2018",
language = "English",
isbn = "978-1-948087-72-8",
pages = "302–312",
editor = "{Korhonen }, Anna and {Titov }, {Ivan }",
booktitle = "Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018)",
publisher = "Association for Computational Linguistics",
note = "22nd Conference on Computational Natural Language Learning (CoNLL 2018) ; Conference date: 31-10-2018 Through 01-11-2018",

}

RIS

TY - GEN

T1 - Sequence classification with human attention

AU - Barrett, Maria Jung

AU - Bingel, Joachim

AU - Hollenstein, Nora

AU - Rei, Marek

AU - Søgaard, Anders

PY - 2018

Y1 - 2018

N2 - Learning attention functions requires large volumes of data, but many NLP tasks simulate human behavior, and in this paper, we show that human attention really does provide a good inductive bias on many attention functions in NLP. Specifically, we use estimated human attention derived from eye-tracking corpora to regularize attention functions in recurrent neural networks. We show substantial improvements across a range of tasks, including sentiment analysis, grammatical error detection, and detection of abusive language.

AB - Learning attention functions requires large volumes of data, but many NLP tasks simulate human behavior, and in this paper, we show that human attention really does provide a good inductive bias on many attention functions in NLP. Specifically, we use estimated human attention derived from eye-tracking corpora to regularize attention functions in recurrent neural networks. We show substantial improvements across a range of tasks, including sentiment analysis, grammatical error detection, and detection of abusive language.

M3 - Article in proceedings

SN - 978-1-948087-72-8

SP - 302

EP - 312

BT - Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018)

A2 - Korhonen, Anna

A2 - Titov, Ivan

PB - Association for Computational Linguistics

T2 - 22nd Conference on Computational Natural Language Learning (CoNLL 2018)

Y2 - 31 October 2018 through 1 November 2018

ER -

ID: 208746941
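
Illustration of the abstract's approach

The abstract describes regularizing attention functions in recurrent neural networks using human attention estimated from eye-tracking corpora. The minimal sketch below is an illustration only, not the authors' implementation: it assumes a bidirectional LSTM sequence classifier, and the names AttentionClassifier, loss_fn, human_attn, and the weighting term lam are hypothetical. The task loss is combined with a penalty that pulls the model's attention weights toward per-token human attention targets.

# Hypothetical sketch (not the paper's code): an RNN classifier whose attention
# weights are regularized toward token-level human attention estimates
# (e.g., normalized fixation durations from an eye-tracking corpus).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attn_score = nn.Linear(2 * hidden_dim, 1)
        self.out = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        h, _ = self.rnn(self.embed(token_ids))                     # (batch, seq, 2*hidden)
        attn = F.softmax(self.attn_score(h).squeeze(-1), dim=-1)   # (batch, seq)
        context = torch.bmm(attn.unsqueeze(1), h).squeeze(1)       # (batch, 2*hidden)
        return self.out(context), attn

def loss_fn(logits, labels, attn, human_attn, lam=0.1):
    # Task loss plus a penalty pulling model attention toward estimated
    # human attention; lam trades off the two terms.
    task = F.cross_entropy(logits, labels)
    attn_reg = F.mse_loss(attn, human_attn)
    return task + lam * attn_reg

Here human_attn would be a (batch, seq) tensor of per-token human attention scores; how those scores are estimated and how the two loss terms are weighted in the paper may differ from this sketch.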