Crowdsourcing quality concerns: an examination of Amazon's mechanical turk

Marc Dupuis, Karen Renaud, Rosalind Searle

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

The use of crowdsourcing platforms, such as Amazon's Mechanical Turk (MTurk), has become an effective and frequent means for researchers to gather data from study participants. Such platforms provide a fast, efficient, and cost-effective way to acquire large amounts of data for a variety of research projects, such as surveys conducted to assess the use of information technology or to better understand cybersecurity perceptions and behaviors. While crowdsourcing platforms have gained both popularity and acceptance over the past several years, quality concerns remain a significant issue for researchers. This paper examines these issues.
Original language: English
Title of host publication: SIGITE 2022 - Proceedings of the 23rd Annual Conference on Information Technology Education
Subtitle of host publication: Proceedings of the 23rd Annual Conference on Information Technology Education
Place of Publication: New York
Pages: 127-129
Number of pages: 3
ISBN (Electronic): 9781450393911
DOIs
Publication status: Published - 21 Sept 2022
Event: 23rd Annual Conference on Information Technology Education - Illinois Institute of Technology, Chicago, United States
Duration: 21 Sept 2022 - 24 Sept 2022
Conference number: 23rd

Publication series

Name: SIGITE 2022 - Proceedings of the 23rd Annual Conference on Information Technology Education

Conference

Conference: 23rd Annual Conference on Information Technology Education
Abbreviated title: SIGITE'22
Country/Territory: United States
City: Chicago
Period: 21/09/22 - 24/09/22

Keywords

  • Amazon's Mechanical Turk (MTurk)
  • crowdsourcing
  • human subjects research
  • information technology research
  • open-ended questions
  • qualitative data
  • quality control
  • quantitative data
  • surveys
