Wednesday, January 18, 2012

Shepherding the Crowd Yields Better Work

Title: Shepherding the Crowd Yields Better Work
Speaker: Steven Dow, HCII, CMU
Date: Wednesday, Jan 25th
Time: 12:00-1:00 pm
Room: NSH 3305

Abstract:

Micro-task platforms provide massively parallel, on-demand labor. However, it can be difficult to reliably achieve high-quality work because online workers may behave irresponsibly, misunderstand the task, or lack the necessary skills. This paper investigates whether timely, task-specific feedback helps crowd workers learn, persevere, and produce better results. We explore this question through Shepherd, a feedback system for crowdsourced work. In a between-subjects study with three conditions, crowd workers wrote consumer reviews for six products they own. Participants in the None condition received no immediate feedback, consistent with most current crowdsourcing practices; participants in the Self-assessment condition judged their own work; and participants in the External assessment condition received expert feedback. Self-assessment alone yielded better overall work than the None condition and helped workers improve over time. External assessment yielded these benefits as well, and participants who received it also revised their work more. We conclude by discussing interaction and infrastructure approaches for integrating real-time assessment into online work.
