By Steven Dow and Scott Klemmer (Stanford HCI Group)
Why should we approach crowdsourcing differently from any other collaborative computing system? Sure, crowdsourcing platforms make on-demand access to people easier than ever before. And this access provides new opportunities for distributed systems and social experiments. However, workers are not simply “artificial artificial intelligence,” but real people with different skills, motivations, and aspirations. At what point did we stop treating people like human beings?
Our work focuses on people. Can we help workers improve their abilities? Can we keep them motivated? Can workers effectively carry out more creative and complex projects? Our experiments show that simple changes in work processes can significantly affect the quality of results. Our goal is to understand the cognitive, social, and motivational factors that govern creative work.
Along with our Berkeley colleagues Björn Hartmann and Anand Kulkarni, we introduce Shepherd, a system for managing and providing feedback to workers on content-creation tasks. We propose two key features to help modern micro-task platforms accomplish more complex and creative work. First, formal feedback can improve worker motivation and task performance. Second, real-time visualizations of completed tasks give requesters a means to monitor and shepherd workers. We hypothesize that infrastructural support for timely, task-specific feedback and worker interaction will lead to better-educated, more motivated workers and higher-quality results. Our next experiment will compare externally provided feedback with self-assessment: is the added cost of assessing workers' output worth it, or do simpler mechanisms, such as asking workers to evaluate their own work, perform just as well?
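To make this feedback loop concrete, here is a minimal sketch of how a requester might monitor incoming work and attach task-specific feedback that a returning worker sees. The article does not describe Shepherd's implementation, so every name below (Submission, FeedbackQueue, and their methods) is a hypothetical illustration under our own assumptions, not the actual Shepherd API.

```python
# Hypothetical sketch of a Shepherd-style feedback loop; names and
# structure are illustrative assumptions, not Shepherd's real design.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional


@dataclass
class Submission:
    """One completed content-creation micro-task."""
    worker_id: str
    text: str
    submitted_at: datetime = field(default_factory=datetime.now)
    feedback: Optional[str] = None  # filled in by the requester


class FeedbackQueue:
    """Collects submissions so a requester can monitor and respond in near real time."""

    def __init__(self) -> None:
        self._submissions: list[Submission] = []

    def submit(self, worker_id: str, text: str) -> Submission:
        """A worker turns in a completed task."""
        sub = Submission(worker_id, text)
        self._submissions.append(sub)
        return sub

    def pending(self) -> list[Submission]:
        """Submissions not yet reviewed: the requester's 'dashboard' view."""
        return [s for s in self._submissions if s.feedback is None]

    def give_feedback(self, sub: Submission, note: str) -> None:
        """Attach task-specific feedback to one submission."""
        sub.feedback = note

    def feedback_for(self, worker_id: str) -> list[str]:
        """What a returning worker is shown before starting their next task."""
        return [s.feedback for s in self._submissions
                if s.worker_id == worker_id and s.feedback is not None]


if __name__ == "__main__":
    queue = FeedbackQueue()
    queue.submit("worker-42", "A short product description...")
    # The requester monitors incoming work and shepherds the worker:
    for sub in queue.pending():
        queue.give_feedback(sub, "Good start; name a concrete use case.")
    print(queue.feedback_for("worker-42"))
```

The one design choice worth noting mirrors the hypothesis above: feedback is attached to individual submissions rather than aggregated into a score, so a returning worker receives timely, task-specific comments they can act on in their very next task.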