Structural block driven enhanced convolutional neural representation for relation extraction
Publication: Contribution to journal › Journal article › peer-reviewed
Standard
Structural block driven enhanced convolutional neural representation for relation extraction. / Wang, Dongsheng; Tiwari, Prayag; Garg, Sahil; Zhu, Hongyin; Bruza, Peter.
In: Applied Soft Computing Journal, Vol. 86, 105913, 2020.
RIS
TY - JOUR
T1 - Structural block driven enhanced convolutional neural representation for relation extraction
AU - Wang, Dongsheng
AU - Tiwari, Prayag
AU - Garg, Sahil
AU - Zhu, Hongyin
AU - Bruza, Peter
PY - 2020
Y1 - 2020
N2 - In this paper, we propose a novel lightweight relation extraction approach based on structural block driven convolutional neural learning. Specifically, we detect the essential sequential tokens associated with entities through dependency analysis, termed a structural block, and encode only the block via block-wise and inter-block-wise representations, utilizing multi-scale Convolutional Neural Networks (CNNs). This serves to (1) eliminate the noise from irrelevant parts of a sentence, and (2) enhance the relevant block representation with both block-wise and inter-block-wise semantically enriched representations. Our method has the advantage of being independent of long sentence context, since we encode only the sequential tokens within a block boundary. Experiments on two datasets, i.e., SemEval2010 and KBP37, demonstrate the significant advantages of our method. In particular, we achieve new state-of-the-art performance on the KBP37 dataset, and performance comparable with the state of the art on the SemEval2010 dataset.
AB - In this paper, we propose a novel lightweight relation extraction approach based on structural block driven convolutional neural learning. Specifically, we detect the essential sequential tokens associated with entities through dependency analysis, termed a structural block, and encode only the block via block-wise and inter-block-wise representations, utilizing multi-scale Convolutional Neural Networks (CNNs). This serves to (1) eliminate the noise from irrelevant parts of a sentence, and (2) enhance the relevant block representation with both block-wise and inter-block-wise semantically enriched representations. Our method has the advantage of being independent of long sentence context, since we encode only the sequential tokens within a block boundary. Experiments on two datasets, i.e., SemEval2010 and KBP37, demonstrate the significant advantages of our method. In particular, we achieve new state-of-the-art performance on the KBP37 dataset, and performance comparable with the state of the art on the SemEval2010 dataset.
KW - CNNs
KW - Deep learning
KW - Dependency parsing
KW - Relation extraction
U2 - 10.1016/j.asoc.2019.105913
DO - 10.1016/j.asoc.2019.105913
M3 - Journal article
AN - SCOPUS:85075373947
VL - 86
JO - Applied Soft Computing Journal
JF - Applied Soft Computing Journal
SN - 1568-4946
M1 - 105913
ER -
ID: 234447462
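The abstract describes encoding a structural block of tokens with multi-scale CNNs (several kernel widths, max-pooled and concatenated). The following is a minimal NumPy sketch of that general block-encoding step only; the block length, embedding dimension, kernel widths, filter counts, and random weights are illustrative assumptions, not the paper's actual configuration or code.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(block, filters):
    """Valid 1D convolution over a (seq_len, dim) token block with
    filters of shape (k, dim, n_filters), followed by ReLU.
    Returns (seq_len - k + 1, n_filters)."""
    k, dim, n = filters.shape
    seq_len = block.shape[0]
    out = np.empty((seq_len - k + 1, n))
    for i in range(seq_len - k + 1):
        window = block[i:i + k]  # (k, dim) slice of the block
        out[i] = np.tensordot(window, filters, axes=([0, 1], [0, 1]))
    return np.maximum(out, 0.0)

def encode_block(block, kernel_sizes=(2, 3, 4), n_filters=8):
    """Multi-scale CNN block encoding: convolve the block at several
    kernel widths, max-pool each feature map over time, concatenate."""
    dim = block.shape[1]
    features = []
    for k in kernel_sizes:
        # randomly initialized filters, one bank per kernel width (illustrative)
        filters = rng.standard_normal((k, dim, n_filters)) * 0.1
        features.append(conv1d_relu(block, filters).max(axis=0))
    return np.concatenate(features)  # (len(kernel_sizes) * n_filters,)

# hypothetical structural block: 6 tokens with 16-dim embeddings
block = rng.standard_normal((6, 16))
vec = encode_block(block)
print(vec.shape)  # (24,)
```

Because only the tokens inside the block boundary enter the convolution, the resulting fixed-size vector is independent of the rest of the sentence, which is the property the abstract highlights.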