The influence of crowd type and task complexity on crowdsourced work quality

Ria Mae Borromeo, Thomas Laurent, Motomichi Toyama

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

As the use of crowdsourcing spreads, the need to ensure the quality of crowdsourced work is magnified. While quality control in crowdsourcing has been widely studied, established mechanisms may still be improved to take into account other factors that affect quality. However, since crowdsourcing relies on humans, it is difficult to identify and consider all factors affecting quality. In this study, we conduct an initial investigation on the effect of crowd type and task complexity on work quality by crowdsourcing a simple and a more complex version of a data extraction task to paid and unpaid crowds. We then measure the quality of the results in terms of their similarity to a gold standard data set. Our experiments show that the unpaid crowd produces results of high quality regardless of the type of task, while the paid crowd yields better results in simple tasks. We intend to extend our work to integrate existing quality control mechanisms and perform more experiments with more varied crowd members.

Original language: English
Title of host publication: Proceedings of the 20th International Database Engineering and Applications Symposium, IDEAS 2016
Publisher: Association for Computing Machinery
Pages: 70-76
Number of pages: 7
Volume: 11-13-July-2016
ISBN (Electronic): 9781450341189
DOIs: 10.1145/2938503.2938511
Publication status: Published - 2016 Jul 11
Event: 20th International Database Engineering and Applications Symposium, IDEAS 2016 - Montreal, Canada
Duration: 2016 Jul 11 - 2016 Jul 13

Other

Other: 20th International Database Engineering and Applications Symposium, IDEAS 2016
Country: Canada
City: Montreal
Period: 16/7/11 - 16/7/13

Fingerprint

  • Quality control
  • Experiments

Keywords

  • Crowdsourcing
  • Task complexity
  • Text extraction

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Software

Cite this

Borromeo, R. M., Laurent, T., & Toyama, M. (2016). The influence of crowd type and task complexity on crowdsourced work quality. In Proceedings of the 20th International Database Engineering and Applications Symposium, IDEAS 2016 (Vol. 11-13-July-2016, pp. 70-76). Association for Computing Machinery. https://doi.org/10.1145/2938503.2938511

@inproceedings{07bfc24ec7d84245b1b6675c94857e29,
title = "The influence of crowd type and task complexity on crowdsourced work quality",
abstract = "As the use of crowdsourcing spreads, the need to ensure the quality of crowdsourced work is magnified. While quality control in crowdsourcing has been widely studied, established mechanisms may still be improved to take into account other factors that affect quality. However, since crowdsourcing relies on humans, it is difficult to identify and consider all factors affecting quality. In this study, we conduct an initial investigation on the effect of crowd type and task complexity on work quality by crowdsourcing a simple and a more complex version of a data extraction task to paid and unpaid crowds. We then measure the quality of the results in terms of their similarity to a gold standard data set. Our experiments show that the unpaid crowd produces results of high quality regardless of the type of task, while the paid crowd yields better results in simple tasks. We intend to extend our work to integrate existing quality control mechanisms and perform more experiments with more varied crowd members.",
keywords = "Crowdsourcing, Task complexity, Text extraction",
author = "Borromeo, {Ria Mae} and Thomas Laurent and Motomichi Toyama",
year = "2016",
month = "7",
day = "11",
doi = "10.1145/2938503.2938511",
language = "English",
volume = "11-13-July-2016",
pages = "70--76",
booktitle = "Proceedings of the 20th International Database Engineering and Applications Symposium, IDEAS 2016",
publisher = "Association for Computing Machinery",

}

TY - GEN

T1 - The influence of crowd type and task complexity on crowdsourced work quality

AU - Borromeo, Ria Mae

AU - Laurent, Thomas

AU - Toyama, Motomichi

PY - 2016/7/11

Y1 - 2016/7/11

N2 - As the use of crowdsourcing spreads, the need to ensure the quality of crowdsourced work is magnified. While quality control in crowdsourcing has been widely studied, established mechanisms may still be improved to take into account other factors that affect quality. However, since crowdsourcing relies on humans, it is difficult to identify and consider all factors affecting quality. In this study, we conduct an initial investigation on the effect of crowd type and task complexity on work quality by crowdsourcing a simple and a more complex version of a data extraction task to paid and unpaid crowds. We then measure the quality of the results in terms of their similarity to a gold standard data set. Our experiments show that the unpaid crowd produces results of high quality regardless of the type of task, while the paid crowd yields better results in simple tasks. We intend to extend our work to integrate existing quality control mechanisms and perform more experiments with more varied crowd members.

AB - As the use of crowdsourcing spreads, the need to ensure the quality of crowdsourced work is magnified. While quality control in crowdsourcing has been widely studied, established mechanisms may still be improved to take into account other factors that affect quality. However, since crowdsourcing relies on humans, it is difficult to identify and consider all factors affecting quality. In this study, we conduct an initial investigation on the effect of crowd type and task complexity on work quality by crowdsourcing a simple and a more complex version of a data extraction task to paid and unpaid crowds. We then measure the quality of the results in terms of their similarity to a gold standard data set. Our experiments show that the unpaid crowd produces results of high quality regardless of the type of task, while the paid crowd yields better results in simple tasks. We intend to extend our work to integrate existing quality control mechanisms and perform more experiments with more varied crowd members.

KW - Crowdsourcing

KW - Task complexity

KW - Text extraction

UR - http://www.scopus.com/inward/record.url?scp=84989227693&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84989227693&partnerID=8YFLogxK

U2 - 10.1145/2938503.2938511

DO - 10.1145/2938503.2938511

M3 - Conference contribution

AN - SCOPUS:84989227693

VL - 11-13-July-2016

SP - 70

EP - 76

BT - Proceedings of the 20th International Database Engineering and Applications Symposium, IDEAS 2016

PB - Association for Computing Machinery

ER -