An exploratory study of the impact of task selection strategies on worker performance in crowdsourcing microtasks

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In microtask crowdsourcing systems such as Amazon Mechanical Turk (AMT) and Appen Figure-Eight, workers often employ task selection strategies, completing sequences of tasks to maximize their earnings. While previous work has explored the effects of sequences of tasks of the same type with varying complexity, little is known about the consequences of performing multiple types of tasks with similar levels of difficulty. This study examines the impact of sequences of three frequently used task types, namely image classification, text classification, and surveys, on workers' engagement, accuracy, and perceived workload. In addition, we analyze the influence of workers' personality traits on their task selection strategies. Our study, which involved 558 participants on AMT, found that completing sequences of distinct task types reduced engagement and accuracy on classification tasks and increased perceived task load and worker frustration. The precise order of tasks, however, did not significantly affect these outcomes. Moreover, we found a weak association between personality traits and workers' task selection strategies. These results offer valuable insights for the design of efficient and inclusive crowdsourcing platforms.
Original language: English
Title of host publication: Proceedings of the 12th AAAI Conference on Human Computation and Crowdsourcing
Editors: Gianluca Demartini, Ujwal Gadiraju
Place of publication: Washington, DC
Pages: 2-11
Number of pages: 10
DOIs
Publication status: Published - 14 Oct 2024
Event: The Twelfth AAAI Conference on Human Computation and Crowdsourcing - Pittsburgh, United States
Duration: 16 Oct 2024 – 20 Oct 2024
Conference number: 2024
https://www.humancomputation.com/

Publication series

Name: Proceedings of the AAAI Conference on Human Computation and Crowdsourcing
Number: 1
Volume: 12

Conference

Conference: The Twelfth AAAI Conference on Human Computation and Crowdsourcing
Abbreviated title: HCOMP
Country/Territory: United States
City: Pittsburgh
Period: 16/10/24 – 20/10/24
Internet address: https://www.humancomputation.com/

Keywords

  • crowdsourcing
  • human computation
  • task selection strategies
