An Exploration of Encoder-Decoder Approaches to Multi-Label Classification for Legal and Biomedical Text

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Standard

An Exploration of Encoder-Decoder Approaches to Multi-Label Classification for Legal and Biomedical Text. / Kementchedjhieva, Yova; Chalkidis, Ilias.

Findings of the Association for Computational Linguistics, ACL 2023. Association for Computational Linguistics (ACL), 2023. pp. 5828-5843 (Proceedings of the Annual Meeting of the Association for Computational Linguistics).


Harvard

Kementchedjhieva, Y & Chalkidis, I 2023, An Exploration of Encoder-Decoder Approaches to Multi-Label Classification for Legal and Biomedical Text. in Findings of the Association for Computational Linguistics, ACL 2023. Association for Computational Linguistics (ACL), Proceedings of the Annual Meeting of the Association for Computational Linguistics, pp. 5828-5843, 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023, Toronto, Canada, 09/07/2023. https://doi.org/10.18653/v1/2023.findings-acl.360

APA

Kementchedjhieva, Y., & Chalkidis, I. (2023). An Exploration of Encoder-Decoder Approaches to Multi-Label Classification for Legal and Biomedical Text. In Findings of the Association for Computational Linguistics, ACL 2023 (pp. 5828-5843). Association for Computational Linguistics (ACL). Proceedings of the Annual Meeting of the Association for Computational Linguistics https://doi.org/10.18653/v1/2023.findings-acl.360

Vancouver

Kementchedjhieva Y, Chalkidis I. An Exploration of Encoder-Decoder Approaches to Multi-Label Classification for Legal and Biomedical Text. In Findings of the Association for Computational Linguistics, ACL 2023. Association for Computational Linguistics (ACL). 2023. p. 5828-5843. (Proceedings of the Annual Meeting of the Association for Computational Linguistics). https://doi.org/10.18653/v1/2023.findings-acl.360

Author

Kementchedjhieva, Yova ; Chalkidis, Ilias. / An Exploration of Encoder-Decoder Approaches to Multi-Label Classification for Legal and Biomedical Text. Findings of the Association for Computational Linguistics, ACL 2023. Association for Computational Linguistics (ACL), 2023. pp. 5828-5843 (Proceedings of the Annual Meeting of the Association for Computational Linguistics).

Bibtex

@inproceedings{edafbb3b7e0a477093037a9484d5b907,
title = "An Exploration of Encoder-Decoder Approaches to Multi-Label Classification for Legal and Biomedical Text",
abstract = "Standard methods for multi-label text classification largely rely on encoder-only pre-trained language models, whereas encoder-decoder models have proven more effective in other classification tasks. In this study, we compare four methods for multi-label classification, two based on an encoder only, and two based on an encoder-decoder. We carry out experiments on four datasets, two in the legal domain and two in the biomedical domain, each with two levels of label granularity, and always depart from the same pre-trained model, T5. Our results show that encoder-decoder methods outperform encoder-only methods, with a growing advantage on more complex datasets and labeling schemes of finer granularity. Using encoder-decoder models in a non-autoregressive fashion, in particular, yields the best performance overall, so we further study this approach through ablations to better understand its strengths.",
author = "Yova Kementchedjhieva and Ilias Chalkidis",
note = "Publisher Copyright: {\textcopyright} 2023 Association for Computational Linguistics.; 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 ; Conference date: 09-07-2023 Through 14-07-2023",
year = "2023",
doi = "10.18653/v1/2023.findings-acl.360",
language = "English",
series = "Proceedings of the Annual Meeting of the Association for Computational Linguistics",
pages = "5828--5843",
booktitle = "Findings of the Association for Computational Linguistics, ACL 2023",
publisher = "Association for Computational Linguistics (ACL)",
address = "United States",
}

RIS

TY - GEN

T1 - An Exploration of Encoder-Decoder Approaches to Multi-Label Classification for Legal and Biomedical Text

AU - Kementchedjhieva, Yova

AU - Chalkidis, Ilias

N1 - Publisher Copyright: © 2023 Association for Computational Linguistics.

PY - 2023

Y1 - 2023

N2 - Standard methods for multi-label text classification largely rely on encoder-only pre-trained language models, whereas encoder-decoder models have proven more effective in other classification tasks. In this study, we compare four methods for multi-label classification, two based on an encoder only, and two based on an encoder-decoder. We carry out experiments on four datasets, two in the legal domain and two in the biomedical domain, each with two levels of label granularity, and always depart from the same pre-trained model, T5. Our results show that encoder-decoder methods outperform encoder-only methods, with a growing advantage on more complex datasets and labeling schemes of finer granularity. Using encoder-decoder models in a non-autoregressive fashion, in particular, yields the best performance overall, so we further study this approach through ablations to better understand its strengths.

AB - Standard methods for multi-label text classification largely rely on encoder-only pre-trained language models, whereas encoder-decoder models have proven more effective in other classification tasks. In this study, we compare four methods for multi-label classification, two based on an encoder only, and two based on an encoder-decoder. We carry out experiments on four datasets, two in the legal domain and two in the biomedical domain, each with two levels of label granularity, and always depart from the same pre-trained model, T5. Our results show that encoder-decoder methods outperform encoder-only methods, with a growing advantage on more complex datasets and labeling schemes of finer granularity. Using encoder-decoder models in a non-autoregressive fashion, in particular, yields the best performance overall, so we further study this approach through ablations to better understand its strengths.

UR - http://www.scopus.com/inward/record.url?scp=85174996579&partnerID=8YFLogxK

U2 - 10.18653/v1/2023.findings-acl.360

DO - 10.18653/v1/2023.findings-acl.360

M3 - Article in proceedings

AN - SCOPUS:85174996579

T3 - Proceedings of the Annual Meeting of the Association for Computational Linguistics

SP - 5828

EP - 5843

BT - Findings of the Association for Computational Linguistics, ACL 2023

PB - Association for Computational Linguistics (ACL)

T2 - 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023

Y2 - 9 July 2023 through 14 July 2023

ER -