Learning to predict readability using eye-movement data from natives and learners

Publication: Chapter in book/anthology/report › Article in proceedings › Research › peer-reviewed

Standard

Learning to predict readability using eye-movement data from natives and learners. / González-Garduño, Ana V.; Søgaard, Anders.

32nd AAAI Conference on Artificial Intelligence, AAAI 2018, Proceedings. AAAI Press, 2018. pp. 5118-5124.

Harvard

González-Garduño, AV & Søgaard, A 2018, Learning to predict readability using eye-movement data from natives and learners. in 32nd AAAI Conference on Artificial Intelligence, AAAI 2018, Proceedings. AAAI Press, pp. 5118-5124, 32nd AAAI Conference on Artificial Intelligence, AAAI 2018, New Orleans, USA, 02/02/2018.

APA

González-Garduño, A. V., & Søgaard, A. (2018). Learning to predict readability using eye-movement data from natives and learners. In 32nd AAAI Conference on Artificial Intelligence, AAAI 2018, Proceedings (pp. 5118-5124). AAAI Press.

Vancouver

González-Garduño AV, Søgaard A. Learning to predict readability using eye-movement data from natives and learners. In 32nd AAAI Conference on Artificial Intelligence, AAAI 2018, Proceedings. AAAI Press. 2018. p. 5118-5124

Author

González-Garduño, Ana V. ; Søgaard, Anders. / Learning to predict readability using eye-movement data from natives and learners. 32nd AAAI Conference on Artificial Intelligence, AAAI 2018, Proceedings. AAAI Press, 2018. pp. 5118-5124

BibTeX

@inproceedings{ca5db38dfa464eb9b495ed00f7f99039,
title = "Learning to predict readability using eye-movement data from natives and learners",
abstract = "Readability assessment can improve the quality of assisting technologies aimed at language learners. Eye-tracking data has been used for both inducing and evaluating general-purpose NLP/AI models, and below we show that unsurprisingly, gaze data from language learners can also improve multi-task readability assessment models. This is unsurprising, since the gaze data records the reading difficulties of the learners. Unfortunately, eye-tracking data from language learners is often much harder to obtain than eye-tracking data from native speakers. We therefore compare the performance of deep learning readability models that use native speaker eye movement data to models using data from language learners. Somewhat surprisingly, we observe no significant drop in performance when replacing learners with natives, making approaches that rely on native speaker gaze information, more scalable. In other words, our finding is that language learner difficulties can be efficiently estimated from native speakers, which suggests that, more generally, readily available gaze data can be used to improve educational NLP/AI models targeted towards language learners.",
author = "Gonz{\'a}lez-Gardu{\~n}o, {Ana V.} and Anders S{\o}gaard",
year = "2018",
language = "English",
pages = "5118--5124",
booktitle = "32nd AAAI Conference on Artificial Intelligence, AAAI 2018, Proceedings",
publisher = "AAAI Press",
note = "32nd AAAI Conference on Artificial Intelligence, AAAI 2018 ; Conference date: 02-02-2018 Through 07-02-2018",

}
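
For orientation, the abstract describes deep multi-task readability models in which gaze prediction serves as an auxiliary task. Below is a minimal, hypothetical PyTorch sketch of that general idea, not the authors' published implementation; the architecture, layer sizes, mean pooling, toy data, and the 0.5 auxiliary-loss weight are all assumptions made for illustration.

import torch
import torch.nn as nn

class MultiTaskReadabilityModel(nn.Module):
    # Hard parameter sharing: one shared encoder feeds two heads, a
    # sentence-level readability classifier (main task) and a per-token
    # gaze regressor (auxiliary task, e.g. fixation duration).
    def __init__(self, vocab_size, emb_dim=64, hidden_dim=128,
                 n_levels=5, n_gaze_features=1):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        self.readability_head = nn.Linear(2 * hidden_dim, n_levels)
        self.gaze_head = nn.Linear(2 * hidden_dim, n_gaze_features)

    def forward(self, token_ids):
        states, _ = self.encoder(self.embed(token_ids))  # (B, T, 2H)
        sentence = states.mean(dim=1)                    # simple mean pooling
        return self.readability_head(sentence), self.gaze_head(states)

# Toy batch: 8 sentences of 20 tokens, with readability labels and
# normalized gaze durations (random stand-ins for real data).
model = MultiTaskReadabilityModel(vocab_size=10_000)
tokens = torch.randint(0, 10_000, (8, 20))
levels = torch.randint(0, 5, (8,))
gaze = torch.rand(8, 20, 1)
read_logits, gaze_pred = model(tokens)
# Joint objective: cross-entropy on readability plus a weighted auxiliary
# MSE loss on gaze; the 0.5 weight is an arbitrary illustrative choice.
loss = nn.functional.cross_entropy(read_logits, levels) \
       + 0.5 * nn.functional.mse_loss(gaze_pred, gaze)
loss.backward()

In this framing, the comparison the paper reports corresponds to swapping the gaze targets between native-speaker and learner recordings while keeping the rest of the model fixed.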

RIS

TY  - GEN
T1  - Learning to predict readability using eye-movement data from natives and learners
AU  - González-Garduño, Ana V.
AU  - Søgaard, Anders
PY  - 2018
Y1  - 2018
N2  - Readability assessment can improve the quality of assisting technologies aimed at language learners. Eye-tracking data has been used for both inducing and evaluating general-purpose NLP/AI models, and below we show that unsurprisingly, gaze data from language learners can also improve multi-task readability assessment models. This is unsurprising, since the gaze data records the reading difficulties of the learners. Unfortunately, eye-tracking data from language learners is often much harder to obtain than eye-tracking data from native speakers. We therefore compare the performance of deep learning readability models that use native speaker eye movement data to models using data from language learners. Somewhat surprisingly, we observe no significant drop in performance when replacing learners with natives, making approaches that rely on native speaker gaze information, more scalable. In other words, our finding is that language learner difficulties can be efficiently estimated from native speakers, which suggests that, more generally, readily available gaze data can be used to improve educational NLP/AI models targeted towards language learners.
AB  - Readability assessment can improve the quality of assisting technologies aimed at language learners. Eye-tracking data has been used for both inducing and evaluating general-purpose NLP/AI models, and below we show that unsurprisingly, gaze data from language learners can also improve multi-task readability assessment models. This is unsurprising, since the gaze data records the reading difficulties of the learners. Unfortunately, eye-tracking data from language learners is often much harder to obtain than eye-tracking data from native speakers. We therefore compare the performance of deep learning readability models that use native speaker eye movement data to models using data from language learners. Somewhat surprisingly, we observe no significant drop in performance when replacing learners with natives, making approaches that rely on native speaker gaze information, more scalable. In other words, our finding is that language learner difficulties can be efficiently estimated from native speakers, which suggests that, more generally, readily available gaze data can be used to improve educational NLP/AI models targeted towards language learners.
UR  - http://www.scopus.com/inward/record.url?scp=85060464573&partnerID=8YFLogxK
M3  - Article in proceedings
AN  - SCOPUS:85060464573
SP  - 5118
EP  - 5124
BT  - 32nd AAAI Conference on Artificial Intelligence, AAAI 2018, Proceedings
PB  - AAAI Press
T2  - 32nd AAAI Conference on Artificial Intelligence, AAAI 2018
Y2  - 2 February 2018 through 7 February 2018
ER  -