AWARE: Exploiting evaluation measures to combine multiple assessors

Research output: Contribution to journal › Journal article › peer-review

We propose the Assessor-driven Weighted Averages for Retrieval Evaluation (AWARE) probabilistic framework, a novel methodology for dealing with multiple crowd assessors that may be contradictory and/or noisy. By modeling relevance judgements and crowd assessors as sources of uncertainty, AWARE takes the expectation of a generic performance measure, such as Average Precision, composed with these random variables. In this way, it approaches the problem of aggregating different crowd assessors from a new perspective, that is, by directly combining the performance measures computed on the ground truths generated by the crowd assessors instead of adopting some classification technique to merge the labels they produce. We propose several unsupervised estimators that instantiate the AWARE framework and compare them with state-of-the-art approaches, namely Majority Vote and Expectation Maximization, on TREC collections. We found that the AWARE approaches improve over these baselines in their capability of correctly ranking systems and predicting their actual performance scores.
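To illustrate the core idea of combining performance measures rather than merging labels, the following is a minimal sketch (not the authors' estimators): a system run is scored against each crowd assessor's judgments separately, and the per-assessor scores are then combined with a weighted average. The function names, the example data, and the uniform weighting are illustrative assumptions only.

```python
# Minimal sketch of the weighted-average idea behind AWARE (illustrative only):
# score a run against each assessor's ground truth, then weight-average the scores.

def average_precision(ranked_docs, relevant):
    """Average Precision of a ranked list against one assessor's relevant set."""
    hits, precision_sum = 0, 0.0
    for rank, doc in enumerate(ranked_docs, start=1):
        if doc in relevant:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / len(relevant) if relevant else 0.0

def aware_style_score(ranked_docs, assessor_judgments, weights=None):
    """Weighted average of a measure computed on each assessor's ground truth."""
    if weights is None:  # uniform weights as a naive, assumed baseline
        weights = [1.0 / len(assessor_judgments)] * len(assessor_judgments)
    scores = [average_precision(ranked_docs, rel) for rel in assessor_judgments]
    return sum(w * s for w, s in zip(weights, scores))

# Example: one run judged by three (possibly noisy, contradictory) assessors.
run = ["d3", "d1", "d7", "d2", "d5"]
assessors = [{"d1", "d2"}, {"d1", "d7"}, {"d2", "d5", "d9"}]
print(aware_style_score(run, assessors))
```

The paper's unsupervised estimators replace the uniform weights above with weights derived from the assessors' observed behavior; the sketch only shows where such weights would enter.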

Original language: English
Article number: 20
Journal: ACM Transactions on Information Systems
Volume: 36
Issue number: 2
ISSN: 1046-8188
DOIs
Publication status: Published - 1 Aug 2017
Externally published: Yes

    Research areas

  • AWARE, Crowdsourcing, Performance measure, Unsupervised estimators, Weighted average

ID: 216517365