Stanford InfoLab Publication Server

Comprehensive and Reliable Crowd Assessment Algorithms

Joglekar, Manas and Garcia-Molina, Hector and Parameswaran, Aditya (2014) Comprehensive and Reliable Crowd Assessment Algorithms. Technical Report. Stanford InfoLab.




Evaluating workers is a critical aspect of any crowdsourcing system. In this paper, we devise techniques for evaluating workers by finding confidence intervals on their error rates. Unlike prior work, we focus on "conciseness", that is, giving as tight a confidence interval as possible. Conciseness is of utmost importance because it allows us to be sure that we have the best guarantee possible on worker error rate. Also unlike prior work, we provide techniques that work under very general scenarios, such as when not all workers have attempted every task (a fairly common scenario in practice), when tasks have non-boolean responses, and when workers have different biases for positive and negative tasks. We demonstrate conciseness as well as accuracy of our confidence intervals by testing them on a variety of conditions and multiple real-world datasets.
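To make the central notion concrete: the simplest setting the abstract alludes to is a worker whose answers to a set of gold-standard tasks are compared against the known truth, yielding a binomial error count. The sketch below computes a standard Wilson score interval for that error rate; this is a textbook baseline for illustration only, not the paper's algorithm, and the function name and parameters are our own.

```python
import math

def wilson_interval(errors, n, z=1.96):
    """Wilson score confidence interval for a binomial error rate.

    errors: number of gold-standard tasks the worker got wrong
    n: number of gold-standard tasks the worker attempted
    z: normal quantile (1.96 corresponds to ~95% confidence)

    Note: an illustrative baseline, not the algorithm from the report.
    """
    if n == 0:
        return (0.0, 1.0)  # no data: the interval is vacuous
    p = errors / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (max(0.0, center - half), min(1.0, center + half))

# A worker who erred on 5 of 50 gold tasks: the interval brackets
# the observed rate of 0.1, and it tightens as n grows.
lo, hi = wilson_interval(5, 50)
```

A "concise" method in the paper's sense is one that, given the same observations, certifies as narrow an interval `(lo, hi)` as possible while remaining valid; the paper's contribution is achieving this even when workers attempt different task subsets, tasks are non-boolean, or biases differ by task polarity.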

Item Type: Technical Report
ID Code: 1107
Deposited By: Aditya Parameswaran
Deposited On: 12 Nov 2014 13:50
Last Modified: 12 Nov 2014 13:50

