How to Robustly Combine Judgements from Crowd Assessors with AWARE

Research output: Contribution to journal › Conference article › Research › peer-review

We propose the Assessor-driven Weighted Averages for Retrieval Evaluation (AWARE) probabilistic framework, a novel methodology for dealing with multiple crowd assessors, who may be contradictory and/or noisy. By modeling relevance judgements and crowd assessors as sources of uncertainty, AWARE directly combines the performance measures computed on the ground truths generated by the individual crowd assessors, instead of adopting a classification technique to merge the labels they produce. We propose several unsupervised estimators that instantiate the AWARE framework and compare them with Majority Vote (MV) and Expectation Maximization (EM), showing that the AWARE approaches improve both at correctly ranking systems and at predicting their actual performance scores.
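The core idea can be illustrated with a small sketch (the function names and the choice of Average Precision as the measure are illustrative assumptions, not the authors' code): a label-merging baseline such as Majority Vote first collapses the assessors' labels into a single ground truth and then evaluates, whereas an AWARE-style combination evaluates the system against each assessor's judgements separately and then takes a weighted average of the resulting scores.

```python
# Illustrative sketch of the AWARE idea vs. a Majority Vote baseline.
# All names and the choice of Average Precision (AP) are assumptions
# for demonstration, not the paper's actual implementation.

def majority_vote(labels_per_assessor):
    """Baseline: merge binary labels document-by-document by majority
    (ties counted as relevant)."""
    merged = []
    for doc_labels in zip(*labels_per_assessor):
        merged.append(1 if 2 * sum(doc_labels) >= len(doc_labels) else 0)
    return merged

def average_precision(ranking, relevance):
    """AP of a ranked list of doc ids, given a dict of binary relevance."""
    hits, precisions = 0, []
    for rank, doc in enumerate(ranking, start=1):
        if relevance.get(doc, 0):
            hits += 1
            precisions.append(hits / rank)
    total_relevant = sum(relevance.values())
    return sum(precisions) / total_relevant if total_relevant else 0.0

def aware_score(ranking, judgements_per_assessor, weights):
    """AWARE-style combination: compute the measure on each assessor's
    own ground truth, then take the weighted average of the scores."""
    scores = [average_precision(ranking, j) for j in judgements_per_assessor]
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

# Example: one system ranking, two (disagreeing) crowd assessors.
ranking = ["d1", "d2", "d3"]
assessor_1 = {"d1": 1, "d2": 0, "d3": 1}
assessor_2 = {"d1": 0, "d2": 1, "d3": 1}

score = aware_score(ranking, [assessor_1, assessor_2], weights=[1.0, 1.0])
```

With uniform weights this reduces to the plain mean of the per-assessor scores; the paper's unsupervised estimators correspond to different (non-uniform) choices of the weights.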

Original language: English
Journal: CEUR Workshop Proceedings
Pages (from-to): 1DUMMY
Publication status: Published - 1 Jan 2018
Externally published: Yes
Event: 26th Italian Symposium on Advanced Database Systems, SEBD 2018 - Castellaneta Marina (Taranto), Italy
Duration: 24 Jun 2018 - 27 Jun 2018


Conference: 26th Italian Symposium on Advanced Database Systems, SEBD 2018
City: Castellaneta Marina (Taranto)

Research areas

  • AWARE, Crowdsourcing, Unsupervised estimators
