Adapting Neural Link Predictors for Data-Efficient Complex Query Answering

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Documents

  • Fulltext

    Final published version, 643 KB, PDF document

Answering complex queries on incomplete knowledge graphs is a challenging task in which a model must answer complex logical queries in the presence of missing knowledge. Prior work has proposed to address this problem by designing architectures trained end-to-end for the complex query answering task; their reasoning process is hard to interpret, and they require data- and resource-intensive training. Other lines of research have proposed re-using simple neural link predictors to answer complex queries, reducing the amount of training data needed by orders of magnitude while providing interpretable answers. The neural link predictor used in such approaches is not explicitly optimised for the complex query answering task, which means that its scores are not calibrated to interact with each other. We propose to address these problems via CQD, a parameter-efficient score adaptation model optimised to re-calibrate neural link prediction scores for the complex query answering task. While the neural link predictor is frozen, the adaptation component, which adds only a negligible number of model parameters, is trained on the downstream complex query answering task. Furthermore, the calibration component enables us to support reasoning over queries that include atomic negations, which was previously impossible with link predictors. In our experiments, CQD produces significantly more accurate results than current state-of-the-art methods, improving Mean Reciprocal Rank values averaged across all datasets and query types while using only a fraction of the available training query types. We further show that CQD is data-efficient, achieving competitive results with only a small fraction of the complex training queries, and robust in out-of-domain evaluations. Source code and datasets are available at https://github.com/EdinburghNLP/adaptive-cqd.
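To make the idea concrete, below is a minimal, hypothetical PyTorch sketch of the adaptation described in the abstract: a pretrained link predictor is kept frozen, and a tiny trainable component re-calibrates its scores into [0, 1] so that conjunctions and atomic negations can be scored with fuzzy-logic operators. The names `ScoreAdapter` and `link_predictor`, the single affine-plus-sigmoid form of the adapter, and the choice of product t-norm are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn


class ScoreAdapter(nn.Module):
    """Hypothetical score-adaptation layer (illustrative, not the paper's exact form).

    Re-calibrates scores produced by a frozen neural link predictor so that
    they can be combined across query atoms. Only two parameters are trained.
    """

    def __init__(self) -> None:
        super().__init__()
        self.alpha = nn.Parameter(torch.ones(1))   # learned scale
        self.beta = nn.Parameter(torch.zeros(1))   # learned shift

    def forward(self, scores: torch.Tensor) -> torch.Tensor:
        # Squash arbitrary link-prediction scores into [0, 1] so that the
        # fuzzy conjunction and negation below are well defined.
        return torch.sigmoid(self.alpha * scores + self.beta)


def conjunction(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Product t-norm: scores the conjunction of two calibrated atoms."""
    return a * b


def negation(a: torch.Tensor) -> torch.Tensor:
    """Fuzzy negation; only meaningful once scores live in [0, 1]."""
    return 1.0 - a


# Usage sketch: `link_predictor` stands for any pretrained scoring model
# (e.g. ComplEx); it stays frozen, and only the adapter is optimised on
# complex training queries.
#
#   for p in link_predictor.parameters():
#       p.requires_grad_(False)
#   adapter = ScoreAdapter()
#   optimizer = torch.optim.Adam(adapter.parameters(), lr=1e-3)
```

Because the adapter maps all atom scores into a shared, calibrated range, a negated atom can be scored as one minus its calibrated score, which is what makes queries with atomic negation tractable in this setting.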
Original language: English
Title of host publication: 37th Conference on Neural Information Processing Systems (NeurIPS 2023)
Number of pages: 13
Publisher: OpenReview.net
Publication date: 2024
Publication status: Published - 2024
Event: 37th Conference on Neural Information Processing Systems (NeurIPS 2023), New Orleans, United States
Duration: 10 Dec 2023 – 16 Dec 2023

Conference

Conference: 37th Conference on Neural Information Processing Systems (NeurIPS 2023)
Country: United States
City: New Orleans
Period: 10/12/2023 – 16/12/2023

ID: 381236392