Structural block driven enhanced convolutional neural representation for relation extraction

Publication: Contribution to journal › Journal article › peer-reviewed

Standard

Structural block driven enhanced convolutional neural representation for relation extraction. / Wang, Dongsheng; Tiwari, Prayag; Garg, Sahil; Zhu, Hongyin; Bruza, Peter.

In: Applied Soft Computing Journal, Vol. 86, 105913, 2020.

Harvard

Wang, D, Tiwari, P, Garg, S, Zhu, H & Bruza, P 2020, 'Structural block driven enhanced convolutional neural representation for relation extraction', Applied Soft Computing Journal, vol. 86, 105913. https://doi.org/10.1016/j.asoc.2019.105913

APA

Wang, D., Tiwari, P., Garg, S., Zhu, H., & Bruza, P. (2020). Structural block driven enhanced convolutional neural representation for relation extraction. Applied Soft Computing Journal, 86, [105913]. https://doi.org/10.1016/j.asoc.2019.105913

Vancouver

Wang D, Tiwari P, Garg S, Zhu H, Bruza P. Structural block driven enhanced convolutional neural representation for relation extraction. Applied Soft Computing Journal. 2020;86:105913. https://doi.org/10.1016/j.asoc.2019.105913

Author

Wang, Dongsheng ; Tiwari, Prayag ; Garg, Sahil ; Zhu, Hongyin ; Bruza, Peter. / Structural block driven enhanced convolutional neural representation for relation extraction. In: Applied Soft Computing Journal. 2020 ; Vol. 86.

BibTeX

@article{279ccb1e76f44e78a01d7343f41664a3,
title = "Structural block driven enhanced convolutional neural representation for relation extraction",
abstract = "In this paper, we propose a novel lightweight relation extraction approach based on structural-block-driven convolutional neural learning. Specifically, we detect the essential sequential tokens associated with entities through dependency analysis, termed a structural block, and encode only the blocks, at both the block-wise and inter-block-wise representation levels, using multi-scale Convolutional Neural Networks (CNNs). This is to (1) eliminate the noise from the irrelevant parts of a sentence and (2) enhance the relevant block representation with both block-wise and inter-block-wise semantically enriched representations. Our method has the advantage of being independent of long sentence context, since we only encode the sequential tokens within a block boundary. Experiments on two datasets, i.e., SemEval2010 and KBP37, demonstrate the significant advantages of our method. In particular, we achieve new state-of-the-art performance on the KBP37 dataset and comparable performance with the state of the art on the SemEval2010 dataset.",
keywords = "CNNs, Deep learning, Dependency parsing, Relation extraction",
author = "Dongsheng Wang and Prayag Tiwari and Sahil Garg and Hongyin Zhu and Peter Bruza",
year = "2020",
doi = "10.1016/j.asoc.2019.105913",
language = "English",
volume = "86",
journal = "Applied Soft Computing Journal",
issn = "1568-4946",
publisher = "Elsevier",

}

RIS

TY - JOUR

T1 - Structural block driven enhanced convolutional neural representation for relation extraction

AU - Wang, Dongsheng

AU - Tiwari, Prayag

AU - Garg, Sahil

AU - Zhu, Hongyin

AU - Bruza, Peter

PY - 2020

Y1 - 2020

N2 - In this paper, we propose a novel lightweight relation extraction approach based on structural-block-driven convolutional neural learning. Specifically, we detect the essential sequential tokens associated with entities through dependency analysis, termed a structural block, and encode only the blocks, at both the block-wise and inter-block-wise representation levels, using multi-scale Convolutional Neural Networks (CNNs). This is to (1) eliminate the noise from the irrelevant parts of a sentence and (2) enhance the relevant block representation with both block-wise and inter-block-wise semantically enriched representations. Our method has the advantage of being independent of long sentence context, since we only encode the sequential tokens within a block boundary. Experiments on two datasets, i.e., SemEval2010 and KBP37, demonstrate the significant advantages of our method. In particular, we achieve new state-of-the-art performance on the KBP37 dataset and comparable performance with the state of the art on the SemEval2010 dataset.

AB - In this paper, we propose a novel lightweight relation extraction approach based on structural-block-driven convolutional neural learning. Specifically, we detect the essential sequential tokens associated with entities through dependency analysis, termed a structural block, and encode only the blocks, at both the block-wise and inter-block-wise representation levels, using multi-scale Convolutional Neural Networks (CNNs). This is to (1) eliminate the noise from the irrelevant parts of a sentence and (2) enhance the relevant block representation with both block-wise and inter-block-wise semantically enriched representations. Our method has the advantage of being independent of long sentence context, since we only encode the sequential tokens within a block boundary. Experiments on two datasets, i.e., SemEval2010 and KBP37, demonstrate the significant advantages of our method. In particular, we achieve new state-of-the-art performance on the KBP37 dataset and comparable performance with the state of the art on the SemEval2010 dataset.

KW - CNNs

KW - Deep learning

KW - Dependency parsing

KW - Relation extraction

U2 - 10.1016/j.asoc.2019.105913

DO - 10.1016/j.asoc.2019.105913

M3 - Journal article

AN - SCOPUS:85075373947

VL - 86

JO - Applied Soft Computing Journal

JF - Applied Soft Computing Journal

SN - 1568-4946

M1 - 105913

ER -

ID: 234447462
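
Illustration

The abstract above outlines the approach: select an entity-centred "structural block" via dependency parsing and encode only those tokens with multi-scale CNNs, combining block-wise and inter-block-wise representations. The following is a minimal, hypothetical PyTorch sketch of that general idea, not the authors' implementation; the class names, dimensions, kernel sizes, and the simple concatenation used to combine the two blocks are all assumptions made for illustration.

import torch
import torch.nn as nn


class MultiScaleBlockEncoder(nn.Module):
    """Encodes one structural block with parallel 1-D convolutions of several widths."""

    def __init__(self, vocab_size, emb_dim=100, num_filters=64, kernel_sizes=(2, 3, 4)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # One convolution branch per kernel width ("multi-scale").
        self.convs = nn.ModuleList(
            [nn.Conv1d(emb_dim, num_filters, k, padding=k // 2) for k in kernel_sizes]
        )

    def forward(self, token_ids):
        # token_ids: (batch, block_len) -- padded ids of the block's tokens only,
        # i.e. the entity-centred sub-sequence selected by dependency analysis.
        x = self.embed(token_ids).transpose(1, 2)   # (batch, emb_dim, block_len)
        # Max-pool each branch over the block positions, then concatenate the branches.
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return torch.cat(pooled, dim=1)             # (batch, num_filters * len(kernel_sizes))


class BlockRelationClassifier(nn.Module):
    """Combines the two entity blocks' representations and predicts a relation label."""

    def __init__(self, vocab_size, num_relations, num_filters=64, kernel_sizes=(2, 3, 4)):
        super().__init__()
        self.encoder = MultiScaleBlockEncoder(
            vocab_size, num_filters=num_filters, kernel_sizes=kernel_sizes
        )
        self.classifier = nn.Linear(2 * num_filters * len(kernel_sizes), num_relations)

    def forward(self, block1_ids, block2_ids):
        h1 = self.encoder(block1_ids)
        h2 = self.encoder(block2_ids)
        # Plain concatenation stands in here for the paper's inter-block-wise representation.
        return self.classifier(torch.cat([h1, h2], dim=1))


if __name__ == "__main__":
    # 19 labels as in SemEval-2010 Task 8 (18 directed relations plus "Other").
    model = BlockRelationClassifier(vocab_size=5000, num_relations=19)
    b1 = torch.randint(1, 5000, (8, 6))   # batch of 8, 6-token block around entity 1
    b2 = torch.randint(1, 5000, (8, 5))   # batch of 8, 5-token block around entity 2
    print(model(b1, b2).shape)            # torch.Size([8, 19])

In this sketch the multi-scale aspect comes from running parallel convolutions with different kernel widths over the same block and max-pooling each branch, so that n-gram features of several lengths contribute to the block representation.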