CrowdScale 2013: Crowdsourcing at Scale
A workshop at HCOMP 2013: The 1st Conference on Human Computation & Crowdsourcing
November 9, 2013
Crowdsourcing and human computation at scale raise a variety of open challenges compared to crowdsourcing with smaller workloads and labor pools. We believe that focusing on such issues of scale will be key to taking crowdsourcing to the next level – from its uptake by early adopters today to its future as how the world’s work gets done. To advance crowdsourcing at scale, CrowdScale will pursue two thrusts:
Track 1: Position Papers. We invite submission of 2-page position papers which identify and motivate focused, key problems or approaches for crowdsourcing at scale.
Track 2: Shared Task Challenge. We invite submissions to a shared task challenge on computing consensus from crowds: how to generate the best possible answer for each question, based on the judgments of five or more raters per question. Participants will submit 4-page papers describing their systems and preliminary results, with $1500 in prize money awarded to top performers.
Participants may take part in either or both tracks. Submitted papers will not be peer-reviewed or archived, so work shared in these papers can later be submitted to peer-reviewed venues. All papers will be posted on the workshop website to promote discussion among workshop participants and beyond. Workshop organizers will review all submissions to ensure quality, and we expect a high acceptance rate.
Position Papers
We invite short (2-page) position papers that identify and motivate key problems or potential approaches for crowdsourcing at scale. We encourage submissions that identify and clearly articulate problems, even if no satisfactory solutions are proposed. Submissions focusing on problems should clearly describe a problem of scale, why it matters, why it is hard, existing approaches, and the desired properties of effective solutions. We welcome early work, and we particularly encourage visionary, forward-looking position papers.
Each submitted paper should focus on one problem. We encourage multiple submissions per author, each articulating a distinct problem statement or method.
Authors are welcome to argue the merits of an approach or problem already published in earlier work (by the author or anyone else). In that case, the approach should be clearly attributed to the prior work, and the contribution of the position paper is its argument for why the approach is promising for crowdsourcing at scale.
During the workshop, authors will self-organize into break-out groups, with each group further elaborating upon a particular critical area meriting further work and study. Each group will summarize and report its findings at the workshop’s close. In continuing discussion beyond the workshop, organizers and participants will co-author a summary paper articulating a road map of important challenges and approaches for our community to pursue.
A non-exhaustive list of position paper ideas is available at: http://www.crowdscale.org/position-paper-ideas
Shared Task Challenge
To help advance research on crowdsourcing at scale, CrowdFlower and Google are sharing two new, large challenge datasets for multi-class classification. Both datasets are available for immediate download. To make participation easy, we provide the data in multiple formats, along with pointers to open-source software for getting started.
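As a point of reference, the simplest consensus baseline for this task is a per-question majority vote over rater judgments. The sketch below is illustrative only (the data layout and names are assumptions, not the shared task's actual format), and the challenge metrics will likely reward more sophisticated methods:

```python
from collections import Counter

def majority_vote(judgments):
    """Consensus label for one question: the most frequent rater judgment.
    Ties break in favor of the label seen first (Counter insertion order)."""
    return Counter(judgments).most_common(1)[0][0]

def consensus(answers_by_question):
    """Map each question ID to its majority-vote consensus label."""
    return {qid: majority_vote(js) for qid, js in answers_by_question.items()}

# Hypothetical data: two questions, five raters each.
ratings = {
    "q1": ["B", "A", "B", "C", "B"],
    "q2": ["A", "A", "D", "A", "D"],
}
print(consensus(ratings))  # {'q1': 'B', 'q2': 'A'}
```

Majority voting ignores differences in rater reliability, which is exactly where stronger consensus methods can improve on this baseline.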
All participants are expected to submit a paper (up to 4 pages) describing their method and preliminary results on the shared task metrics, and to present a poster at the workshop. Final results will be announced at the workshop, with prize money awarded to the best performer(s), along with recognition during the workshop and in our workshop report.
Shared task participants are also invited to participate in workshop discussion throughout the day.
Important Dates
October 14: Position papers due
October 20: Shared task runs due
October 27: Shared task papers due
November 9: Workshop
Please see the workshop website for additional information on the schedule.
Questions: Email the organizers at: firstname.lastname@example.org
Organizers
Omar Alonso, Microsoft
Ed Chi, Google
Lydia Chilton, University of Washington
Matt Cooper, oDesk
Peng Dai, Google
Benjamin Goldenberg, Yelp
David Huynh, Google
Panos Ipeirotis, Google/NYU
Chris Lintott, Zooniverse/GalaxyZoo
Greg Little, oDesk
Stuart Lynn, Zooniverse/GalaxyZoo
Stefano Mazzocchi, Google
Rajesh Patel, Microsoft
Mike Shwe, Google
Rion Snow, Twitter
Maria Stone, Microsoft
Alexander Sorokin, CrowdFlower
Jamie Taylor, Google
Tamsyn Waterhouse, Google
Patrick Philips, LinkedIn
Sanga Reddy Peerreddy, SetuServ