Stanford InfoLab Publication Server

Evaluating the Crowd with Confidence

Joglekar, Manas and Garcia-Molina, Hector and Parameswaran, Aditya (2013) Evaluating the Crowd with Confidence. In: ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2013), Chicago, Illinois.


PDF - Accepted Version


Worker quality control is a crucial aspect of crowdsourcing systems, typically occupying a large fraction of the time and money invested in crowdsourcing. In this work, we devise techniques to generate confidence intervals for worker error rate estimates, thereby enabling a better evaluation of worker quality. We show that our techniques generate correct confidence intervals on a range of real-world datasets, and we demonstrate their wide applicability by using them to evict poorly performing workers and to provide confidence intervals on the accuracy of the final answers.

Item Type: Conference or Workshop Item (Paper)
ID Code: 1093
Deposited By: Aditya Parameswaran
Deposited On: 04 Apr 2014 17:26
Last Modified: 04 Apr 2014 17:26
