A Crowdsourcing Platform for Personalized Human Intelligence Task Assignment Based on Social Networks
The most popular micro-task crowdsourcing platform (Amazon MTurk) uses a pull methodology: Workers select the tasks they want to perform from those available on the market, based on their interests, the task reward, the clarity of the instructions, the requester's reputation, etc.
Allowing workers to pick their preferred tasks on a first-come-first-served basis has many advantages, such as short completion times. On the other hand, this mechanism does not guarantee that the worker who performs a task is the best fit: More suitable workers may be available in the crowd, but they may miss the HIT simply because they were not quick enough to pick it.
Instead, we propose a push methodology for crowdsourcing micro-tasks: Our system first creates worker profiles to model workers' interests and skills, and then uses these profiles to assign available tasks to suitable workers.
The proposed approach yields a significant improvement in the quality of the work compared to the same tasks run with anonymous workers on MTurk, both for multiple-choice tasks and for open questions. This is because workers complete tasks on topics of interest to them rather than competing with other workers to grab the best tasks first.
We have developed a Facebook App named OpenTurk that lets crowd workers log in with their Facebook account. Our system then collects information about the Facebook pages each worker has liked and, based on this, selects which tasks to assign to which workers.
In the paper, we experimentally compare the effectiveness of different task assignment approaches, including simple Facebook category matches, expert finding techniques, and semantics-based methods.
The figure above explains our best performing approach which is based on expert finding:
- First, an inverted index over the content of Facebook pages liked by the crowd is built.
- Then, the HITs metadata is used to construct a query over the index and to find Facebook pages relevant to the task.
- Finally, pages are seen as votes for the expertise of workers for the specific task: Top-k ranked workers are assigned the task.
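The expert-finding pipeline above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's actual implementation: it uses plain term matching instead of a full IR ranking function, and the page/worker data structures are hypothetical placeholders.

```python
from collections import defaultdict

def build_index(pages):
    """Build an inverted index: term -> set of page ids.

    `pages` maps a page id to (owner_worker, page_text), standing in
    for the content of Facebook pages liked by crowd workers.
    """
    index = defaultdict(set)
    for pid, (_, text) in pages.items():
        for term in text.lower().split():
            index[term].add(pid)
    return index

def assign_workers(hit_metadata, pages, index, k):
    """Return the top-k workers for a task.

    The HIT metadata is turned into a bag-of-words query over the index;
    each matching page then casts votes (one per matched term) for the
    expertise of the worker who liked it.
    """
    page_scores = defaultdict(int)
    for term in hit_metadata.lower().split():
        for pid in index.get(term, ()):
            page_scores[pid] += 1
    votes = defaultdict(int)
    for pid, score in page_scores.items():
        owner = pages[pid][0]          # the worker who liked this page
        votes[owner] += score          # pages vote for their worker
    return sorted(votes, key=votes.get, reverse=True)[:k]

# Toy example with made-up workers and liked pages:
pages = {
    "p1": ("alice", "basketball nba sports"),
    "p2": ("bob", "cooking recipes food"),
    "p3": ("alice", "nba playoffs"),
    "p4": ("carol", "sports news"),
}
index = build_index(pages)
print(assign_workers("nba sports trivia", pages, index, k=2))
# -> ['alice', 'carol']
```

In this toy run, Alice's two relevant liked pages outvote Carol's single match, so Alice is ranked first for the sports task.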
The above figure presents worker accuracy based on the number of “likes” matching the current task, over different types of tasks. We can see that accuracy varies a lot when not enough relevant “likes” are present in the worker profile (left part). On the other hand, when workers like many Facebook pages relevant to the task, their performance is consistently high (right part).
For more, see our full paper, “Pick-A-Crowd: Tell Me What You Like, and I’ll Tell You What to Do”.
Djellel E. Difallah, eXascale Infolab, U. of Fribourg, Switzerland.
Gianluca Demartini, eXascale Infolab, U. of Fribourg, Switzerland.
Victor Felder, U. of Fribourg, Switzerland.
Philippe Cudré-Mauroux, eXascale Infolab, U. of Fribourg, Switzerland.