Learning connective-based word representations for implicit discourse relation identification

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

  • Chloé Elodie Braud
  • Pascal Denis
We introduce a simple semi-supervised approach to improve implicit discourse relation identification. This approach harnesses large amounts of automatically extracted discourse connectives along with their arguments to construct new distributional word representations. Specifically, we represent words in the space of discourse connectives as a way to directly encode their rhetorical function. Experiments on the Penn Discourse Treebank demonstrate the effectiveness of these task-tailored representations in predicting implicit discourse relations. Our results indeed show that, despite their simplicity, these connective-based representations outperform various off-the-shelf word embeddings, and achieve state-of-the-art performance on this problem.
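As a rough illustration of the idea described in the abstract (not the authors' exact construction), the following minimal Python sketch builds word vectors whose dimensions are discourse connectives, using a small hypothetical set of automatically extracted (connective, argument words) pairs.

```python
from collections import defaultdict

# Hypothetical toy data: (connective, argument words) pairs that could be
# automatically extracted from explicit discourse relations in raw text.
extracted = [
    ("because", ["rain", "flooded", "streets"]),
    ("but", ["rain", "stopped", "quickly"]),
    ("because", ["prices", "rose", "sharply"]),
    ("although", ["prices", "fell", "later"]),
]

# The dimensions of the representation space are the connectives themselves.
connectives = sorted({conn for conn, _ in extracted})
conn_index = {c: i for i, c in enumerate(connectives)}

# Each word is represented by its co-occurrence counts with connectives,
# i.e. a vector in the space of discourse connectives.
word_vectors = defaultdict(lambda: [0] * len(connectives))
for conn, words in extracted:
    for w in words:
        word_vectors[w][conn_index[conn]] += 1

print(connectives)            # ['although', 'because', 'but']
print(word_vectors["rain"])   # [0, 1, 1]
print(word_vectors["prices"]) # [1, 1, 0]
```

In practice such counts would typically be gathered over a large corpus and possibly reweighted or normalized before being used as features for implicit relation classification; the sketch only shows the basic co-occurrence construction.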
Original language: English
Title of host publication: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (EMNLP-16)
Number of pages: 11
Publisher: Association for Computational Linguistics
Publication date: 2016
Pages: 203-213
ISBN (Electronic): 978-1-945626-25-8
Publication status: Published - 2016
Event: 2016 Conference on Empirical Methods in Natural Language Processing - Austin, United States
Duration: 1 Nov 2016 to 5 Nov 2016

Conference

Conference: 2016 Conference on Empirical Methods in Natural Language Processing
Country: United States
City: Austin
Period: 01/11/2016 to 05/11/2016
