Budak, C., Garrett, R. K., & Sude, D. (in press). Better crowdcoding: Strategies for promoting accuracy in crowdsourced content analysis. Communication Methods and Measures.
In this work, we evaluate different instructional strategies to improve the quality of
crowdcoding for the concept of civility. We test the effectiveness of training,
codebooks, and their combination through 2×2 experiments conducted on two
different populations—students and Amazon Mechanical Turk workers. In
addition, we perform simulations to evaluate the trade-off between cost and
performance associated with different instructional strategies and the number of
human coders. We find that training improves crowdcoding quality, while
codebooks do not. We further show that relying on several human coders and
applying majority rule to their assessments significantly improves performance.
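The majority-rule finding reflects a general statistical point: aggregating independent coders who are each right more often than chance yields a more accurate combined judgment. A minimal simulation sketch of this effect (the coder accuracy, item count, and function name here are illustrative assumptions, not parameters from the study):

```python
import random

def majority_vote_accuracy(coder_accuracy, n_coders, n_items=10000, seed=0):
    """Estimate accuracy of majority-rule aggregation over independent coders.

    Each of n_coders independently labels each item correctly with
    probability coder_accuracy; the aggregate label is the majority vote.
    Returns the fraction of items the majority labels correctly.
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_items):
        votes = sum(1 for _ in range(n_coders) if rng.random() < coder_accuracy)
        if votes > n_coders / 2:  # strict majority of coders correct
            correct += 1
    return correct / n_items
```

For example, with individually 70%-accurate coders, a panel of five aggregated by majority rule is correct on noticeably more items than any single coder, consistent with the performance gains reported above.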