Structural block driven enhanced convolutional neural representation for relation extraction

Research output: Contribution to journal › Journal article › peer-review

Documents

  • Fulltext

    Submitted manuscript, 969 KB, PDF document

Authors

  • Dongsheng Wang
  • Prayag Tiwari
  • Sahil Garg
  • Hongyin Zhu
  • Peter Bruza

In this paper, we propose a novel lightweight relation extraction approach based on structural block driven convolutional neural learning. Specifically, we detect the essential sequential tokens associated with entities through dependency analysis, termed a structural block, and encode only these blocks, producing both block-wise and inter-block-wise representations with multi-scale Convolutional Neural Networks (CNNs). This serves to (1) eliminate the noise from the irrelevant parts of a sentence, and (2) enhance the relevant block representation with both block-wise and inter-block-wise semantically enriched representations. Our method has the advantage of being independent of long sentence context, since we only encode the sequential tokens within a block boundary. Experiments on two datasets, i.e., SemEval2010 and KBP37, demonstrate the significant advantages of our method. In particular, we achieve new state-of-the-art performance on the KBP37 dataset and comparable performance with the state of the art on the SemEval2010 dataset.
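
To make the block-wise / inter-block-wise encoding concrete, the sketch below shows one plausible way to realise it in PyTorch. It assumes the structural blocks (entity-anchored token spans obtained from a dependency parse) have already been extracted and converted to token ids; the class names (MultiScaleCNNBlockEncoder, BlockRelationClassifier), filter counts, and kernel sizes are illustrative assumptions, not the authors' released implementation.

    # A minimal, hypothetical sketch of the block-driven encoder described above,
    # written in PyTorch. The dependency-based block extraction is assumed to have
    # been done already; class names, filter counts, and kernel sizes are
    # illustrative choices, not the authors' actual implementation.
    import torch
    import torch.nn as nn

    class MultiScaleCNNBlockEncoder(nn.Module):
        """Encode one token block with CNNs of several kernel sizes (multi-scale)."""

        def __init__(self, emb_dim=100, num_filters=64, kernel_sizes=(2, 3, 4)):
            super().__init__()
            self.convs = nn.ModuleList(
                nn.Conv1d(emb_dim, num_filters, k, padding=k // 2) for k in kernel_sizes
            )

        def forward(self, block_emb):           # block_emb: (batch, block_len, emb_dim)
            x = block_emb.transpose(1, 2)       # -> (batch, emb_dim, block_len)
            # Max-pool each scale over the block, then concatenate all scales.
            pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
            return torch.cat(pooled, dim=1)     # (batch, num_filters * len(kernel_sizes))

    class BlockRelationClassifier(nn.Module):
        """Combine block-wise and inter-block-wise representations to predict a relation."""

        def __init__(self, vocab_size, num_relations, emb_dim=100):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            self.block_encoder = MultiScaleCNNBlockEncoder(emb_dim)
            enc_dim = 64 * 3                    # num_filters * number of kernel sizes
            # Block-wise: each entity block encoded on its own;
            # inter-block-wise: the span covering both blocks encoded jointly.
            self.classifier = nn.Linear(3 * enc_dim, num_relations)

        def forward(self, block1_ids, block2_ids, inter_block_ids):
            b1 = self.block_encoder(self.embed(block1_ids))
            b2 = self.block_encoder(self.embed(block2_ids))
            inter = self.block_encoder(self.embed(inter_block_ids))
            return self.classifier(torch.cat([b1, b2, inter], dim=1))

    if __name__ == "__main__":
        # 19 relation classes as in SemEval-2010 Task 8.
        model = BlockRelationClassifier(vocab_size=5000, num_relations=19)
        b1 = torch.randint(1, 5000, (2, 5))     # token ids of the first entity block
        b2 = torch.randint(1, 5000, (2, 4))     # token ids of the second entity block
        inter = torch.randint(1, 5000, (2, 12)) # token ids spanning both blocks
        print(model(b1, b2, inter).shape)       # -> torch.Size([2, 19])

Sharing one encoder across the two entity blocks and the inter-block span keeps the sketch lightweight, in the spirit of the paper; the actual architecture and hyperparameters reported in the article may differ.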

Original language: English
Article number: 105913
Journal: Applied Soft Computing Journal
Volume: 86
Number of pages: 9
ISSN: 1568-4946
DOIs
Publication status: Published - 2020

    Research areas

  • CNNs, Deep learning, Dependency parsing, Relation extraction
