Personalized and diverse task composition in crowdsourcing

Maha Alsayasneh, Sihem Amer-Yahia, Eric Gaussier, Vincent Leroy, Julien Pilourdault, Ria Mae Borromeo, Motomichi Toyama, Jean Michel Renders

Research output: Contribution to journal › Article

9 Citations (Scopus)

Abstract

We study task composition in crowdsourcing and the effect of personalization and diversity on performance. A central process in crowdsourcing is task assignment, the mechanism through which workers find tasks. On popular platforms such as Amazon Mechanical Turk, task assignment is facilitated by the ability to sort tasks by dimensions such as creation date or reward amount. Task composition improves task assignment by producing, for each worker, a personalized summary of tasks, referred to as a Composite Task (CT). We propose different ways of producing CTs and formulate an optimization problem that finds, for a worker, the most relevant and diverse CTs. We show empirically that workers' experience is greatly improved by personalization that aligns CTs with workers' skills and preferences. We also study and formalize various ways of diversifying tasks in each CT. Task diversity is grounded in organization studies that have shown its impact on worker motivation [33]. Our experiments show that diverse CTs contribute to improving outcome quality. More specifically, we show that while task throughput and worker retention are best with ranked lists, crowdwork quality reaches its best with CTs diversified by requesters, thereby confirming that workers seek to expose their "good" work to many requesters.
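The abstract's core idea, selecting for each worker a set of tasks that is both relevant and diverse (e.g., by requester), can be illustrated with a minimal greedy sketch. This is not the paper's actual formulation; the `Task` type, the relevance scores, and the trade-off parameter `lam` are all hypothetical illustration choices.

```python
# A minimal sketch (not the paper's exact optimization) of composing a
# Composite Task (CT): greedily pick tasks that balance relevance to a
# worker against diversity over requesters.

from typing import List, NamedTuple

class Task(NamedTuple):
    task_id: int
    requester: str
    relevance: float  # hypothetical worker-task relevance score in [0, 1]

def compose_ct(tasks: List[Task], size: int, lam: float = 0.5) -> List[Task]:
    """Greedily select up to `size` tasks.

    Each candidate is scored as lam * relevance + (1 - lam) * novelty,
    where novelty is 1 if the task's requester is not yet represented
    in the CT and 0 otherwise (a requester-diversity bonus).
    """
    selected: List[Task] = []
    seen_requesters: set = set()
    candidates = list(tasks)
    while candidates and len(selected) < size:
        def score(t: Task) -> float:
            novelty = 0.0 if t.requester in seen_requesters else 1.0
            return lam * t.relevance + (1 - lam) * novelty
        best = max(candidates, key=score)
        candidates.remove(best)
        selected.append(best)
        seen_requesters.add(best.requester)
    return selected

tasks = [
    Task(1, "A", 0.90),
    Task(2, "A", 0.85),
    Task(3, "B", 0.50),
    Task(4, "C", 0.40),
]
ct = compose_ct(tasks, size=3)
print([t.task_id for t in ct])  # → [1, 3, 4]: three distinct requesters
```

With `lam = 0.5`, the second-best task by pure relevance (task 2) is skipped because its requester "A" is already represented, matching the abstract's finding that requester-diversified CTs spread a worker's output across many requesters.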

Original language: English
Pages (from-to): 128-141
Number of pages: 14
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 30
Issue number: 1
DOI: https://doi.org/10.1109/TKDE.2017.2755660
Publication status: Published - Jan 2018

Keywords

  • Crowdsourcing
  • Task assignment
  • Task composition
  • Task diversity

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Computational Theory and Mathematics


Cite this

    Alsayasneh, M., Amer-Yahia, S., Gaussier, E., Leroy, V., Pilourdault, J., Borromeo, R. M., Toyama, M., & Renders, J. M. (2018). Personalized and diverse task composition in crowdsourcing. IEEE Transactions on Knowledge and Data Engineering, 30(1), 128-141. https://doi.org/10.1109/TKDE.2017.2755660