Parameswaran, Aditya and Boyd, Stephen and Garcia-Molina, Hector and Gupta, Ashish and Polyzotis, Neoklis and Widom, Jennifer. Optimal Crowd-Powered Rating and Filtering Algorithms. Technical Report, Stanford InfoLab.
Abstract
We focus on crowd-powered filtering, i.e., filtering a large set of items using humans. Filtering is one of the most commonly used building blocks in crowdsourcing applications and systems. While solutions for crowd-powered filtering exist, they make a range of implicit assumptions and restrictions, ultimately rendering them not powerful enough for real-world applications. We describe two approaches that discard these implicit assumptions and restrictions: one that carefully generalizes prior work, leading to an optimal but oftentimes intractable solution, and another that provides a novel way of reasoning about filtering strategies, leading to a sometimes sub-optimal but efficiently computable solution that is provably close to optimal. We demonstrate that our techniques lead to significant reductions in error of up to 30-40% at fixed cost over prior work in a novel crowdsourcing application: peer evaluation in online courses.
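To make the setting concrete, below is a minimal Python sketch of a naive fixed-budget, majority-vote filtering strategy, the kind of simple baseline that the report's optimal strategies improve upon. The function and parameter names (`ask_worker`, `budget_per_item`, `threshold`) are illustrative assumptions and do not reproduce the algorithms developed in the report.

```python
import random

def crowd_filter(items, ask_worker, budget_per_item=5, threshold=3):
    """Naive fixed-budget majority-vote filter: ask `budget_per_item`
    workers a yes/no question about each item and keep items that
    receive at least `threshold` yes votes."""
    kept = []
    for item in items:
        yes_votes = sum(ask_worker(item) for _ in range(budget_per_item))
        if yes_votes >= threshold:
            kept.append(item)
    return kept

# Usage example: simulate workers who answer correctly 80% of the time.
def simulated_worker(item, accuracy=0.8):
    truth = item["passes"]
    return truth if random.random() < accuracy else not truth

items = [{"id": i, "passes": i % 2 == 0} for i in range(10)]
print([it["id"] for it in crowd_filter(items, simulated_worker)])
```

A fixed per-item budget like this ignores how confident the answers so far already make us about an item; adaptive strategies of the kind studied in the report can stop asking early or ask more questions only where the outcome is still uncertain.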
| Item Type | Techreport (Technical Report) |
|---|---|
| Uncontrolled Keywords | crowd algorithms, optimization, filtering, crowdsourcing, rating |
| ID Code | 1078 |
| Deposited By | Aditya Parameswaran |
| Deposited On | 30 Sep 2013 20:29 |
| Last Modified | 30 Jan 2014 06:40 |