Simple item record

dc.contributor.author: Carvalho, Arthur
dc.contributor.author: Dimitrov, Stanko
dc.contributor.author: Larson, Kate
dc.date.accessioned: 2020-08-28 14:05:32 (GMT)
dc.date.available: 2020-08-28 14:05:32 (GMT)
dc.date.issued: 2016-01
dc.identifier.uri: https://doi.org/10.1007/s10472-015-9492-4
dc.identifier.uri: http://hdl.handle.net/10012/16180
dc.description.abstract: Recent years have seen an increased interest in crowdsourcing as a way of obtaining information from a potentially large group of workers at a reduced cost. The crowdsourcing process, as we consider in this paper, is as follows: a requester hires a number of workers to work on a set of similar tasks. After completing the tasks, each worker reports back outputs. The requester then aggregates the reported outputs to obtain aggregate outputs. A crucial question that arises during this process is: how many crowd workers should a requester hire? In this paper, we investigate from an empirical perspective the optimal number of workers a requester should hire when crowdsourcing tasks, with a particular focus on the crowdsourcing platform Amazon Mechanical Turk. Specifically, we report the results of three studies involving different tasks and payment schemes. We find that both the expected error in the aggregate outputs and the risk of a poor combination of workers decrease as the number of workers increases. Surprisingly, we find that the optimal number of workers a requester should hire for each task is around 10 to 11, regardless of the underlying task and payment scheme. To derive this result, we employ a principled analysis based on bootstrapping and segmented linear regression. Beyond this, we also find that top-performing workers are, overall, more consistent across multiple tasks than other workers. Our results thus contribute to a better understanding of, and provide new insights into, how to design more effective crowdsourcing processes.
dc.description.sponsorship: Natural Sciences and Engineering Research Council of Canada
dc.language.iso: en
dc.publisher: Springer
dc.rights: Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: crowdsourcing
dc.subject: human computation
dc.subject: Amazon Mechanical Turk
dc.title: How many crowdsourced workers should a requester hire?
dc.type: Article
dcterms.bibliographicCitation: Ann Math Artif Intell (2016) 78:45–72, doi:10.1007/s10472-015-9492-4
uws.contributor.affiliation1: Faculty of Mathematics
uws.contributor.affiliation1: Faculty of Engineering
uws.contributor.affiliation2: David R. Cheriton School of Computer Science
uws.contributor.affiliation2: Management Sciences
uws.typeOfResource: Text
uws.peerReviewStatus: Reviewed
uws.scholarLevel: Faculty
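
The abstract above describes a two-step analysis: bootstrapping to estimate how the expected error of the aggregate output shrinks as more workers are hired, and segmented linear regression to locate the point of diminishing returns. The sketch below is a minimal illustration of that idea, not the authors' implementation; the synthetic worker reports, the pool size, the noise level, the mean-absolute-error metric, the averaging aggregator, and the grid-search breakpoint fit are all assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: true answers for 20 tasks and noisy
# reports from a pool of 50 workers (the paper uses real MTurk data).
n_tasks, pool_size = 20, 50
truth = rng.uniform(0, 100, size=n_tasks)
reports = truth + rng.normal(0, 15, size=(pool_size, n_tasks))

def bootstrap_error(n, n_resamples=2000):
    """Average error of the aggregate output (here: the mean report)
    over bootstrap samples of n workers drawn from the pool."""
    errors = np.empty(n_resamples)
    for b in range(n_resamples):
        idx = rng.choice(pool_size, size=n, replace=True)
        aggregate = reports[idx].mean(axis=0)
        errors[b] = np.abs(aggregate - truth).mean()
    return errors.mean()

ns = np.arange(1, 31, dtype=float)
errs = np.array([bootstrap_error(int(n)) for n in ns])

def segmented_breakpoint(x, y):
    """Fit two connected line segments to (x, y) and return the
    breakpoint with the lowest squared error (simple grid search)."""
    best_sse, best_bp = np.inf, None
    for k in range(2, len(x) - 2):          # candidate breakpoints
        # design matrix: intercept, slope, extra slope beyond x[k]
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - x[k], 0.0)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = ((X @ coef - y) ** 2).sum()
        if sse < best_sse:
            best_sse, best_bp = sse, x[k]
    return best_bp

print("estimated point of diminishing returns:", segmented_breakpoint(ns, errs))

On the paper's real Mechanical Turk data, this kind of fit placed the breakpoint at roughly 10 to 11 workers; on synthetic data such as the above, the estimate will vary with the assumed noise level.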



