Generating Scientific Claims for Zero-Shot Scientific Fact Checking
Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review
Standard
Generating Scientific Claims for Zero-Shot Scientific Fact Checking. / Wright, Dustin; Wadden, David; Lo, Kyle; Kuehl, Bailey; Cohan, Arman; Augenstein, Isabelle; Wang, Lucy Lu.
Generating Scientific Claims for Zero-Shot Scientific Fact Checking. Association for Computational Linguistics, 2022.
RIS
TY - GEN
T1 - Generating Scientific Claims for Zero-Shot Scientific Fact Checking
AU - Wright, Dustin
AU - Wadden, David
AU - Lo, Kyle
AU - Kuehl, Bailey
AU - Cohan, Arman
AU - Augenstein, Isabelle
AU - Wang, Lucy Lu
PY - 2022
Y1 - 2022
N2 - Automated scientific fact checking is difficult due to the complexity of scientific language and a lack of significant amounts of training data, as annotation requires domain expertise. To address this challenge, we propose scientific claim generation, the task of generating one or more atomic and verifiable claims from scientific sentences, and demonstrate its usefulness in zero-shot fact checking for biomedical claims. We propose CLAIMGEN-BART, a new supervised method for generating claims supported by the literature, as well as KBIN, a novel method for generating claim negations. Additionally, we adapt an existing unsupervised entity-centric method of claim generation to biomedical claims, which we call CLAIMGEN-ENTITY. Experiments on zero-shot fact checking demonstrate that both CLAIMGEN-ENTITY and CLAIMGEN-BART, coupled with KBIN, achieve up to 90% performance of fully supervised models trained on manually annotated claims and evidence. A rigorous evaluation study demonstrates significant improvement in generated claim and negation quality over existing baselines.
AB - Automated scientific fact checking is difficult due to the complexity of scientific language and a lack of significant amounts of training data, as annotation requires domain expertise. To address this challenge, we propose scientific claim generation, the task of generating one or more atomic and verifiable claims from scientific sentences, and demonstrate its usefulness in zero-shot fact checking for biomedical claims. We propose CLAIMGEN-BART, a new supervised method for generating claims supported by the literature, as well as KBIN, a novel method for generating claim negations. Additionally, we adapt an existing unsupervised entity-centric method of claim generation to biomedical claims, which we call CLAIMGEN-ENTITY. Experiments on zero-shot fact checking demonstrate that both CLAIMGEN-ENTITY and CLAIMGEN-BART, coupled with KBIN, achieve up to 90% performance of fully supervised models trained on manually annotated claims and evidence. A rigorous evaluation study demonstrates significant improvement in generated claim and negation quality over existing baselines.
U2 - 10.18653/v1/2022.acl-long.175
DO - 10.18653/v1/2022.acl-long.175
M3 - Article in proceedings
BT - Generating Scientific Claims for Zero-Shot Scientific Fact Checking
PB - Association for Computational Linguistics
T2 - 60th Annual Meeting of the Association for Computational Linguistics
Y2 - 23 May 2022 through 25 May 2022
ER -