Learning connective-based word representations for implicit discourse relation identification
Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review
Standard
Learning connective-based word representations for implicit discourse relation identification. / Braud, Chloé Elodie; Denis, Pascal.
Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (EMNLP-16). Association for Computational Linguistics, 2016. p. 203-213.
RIS
TY - GEN
T1 - Learning connective-based word representations for implicit discourse relation identification
AU - Braud, Chloé Elodie
AU - Denis, Pascal
PY - 2016
Y1 - 2016
N2 - We introduce a simple semi-supervised approach to improve implicit discourse relation identification. This approach harnesses large amounts of automatically extracted discourse connectives along with their arguments to construct new distributional word representations. Specifically, we represent words in the space of discourse connectives as a way to directly encode their rhetorical function. Experiments on the Penn Discourse Treebank demonstrate the effectiveness of these task-tailored representations in predicting implicit discourse relations. Our results indeed show that, despite their simplicity, these connective-based representations outperform various off-the-shelf word embeddings, and achieve state-of-the-art performance on this problem.
AB - We introduce a simple semi-supervised approach to improve implicit discourse relation identification. This approach harnesses large amounts of automatically extracted discourse connectives along with their arguments to construct new distributional word representations. Specifically, we represent words in the space of discourse connectives as a way to directly encode their rhetorical function. Experiments on the Penn Discourse Treebank demonstrate the effectiveness of these task-tailored representations in predicting implicit discourse relations. Our results indeed show that, despite their simplicity, these connective-based representations outperform various off-the-shelf word embeddings, and achieve state-of-the-art performance on this problem.
M3 - Article in proceedings
SP - 203
EP - 213
BT - Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (EMNLP-16)
PB - Association for Computational Linguistics
T2 - 2016 Conference on Empirical Methods in Natural Language Processing
Y2 - 1 November 2016 through 5 November 2016
ER -