Abstract
Finding participants for experiments has always been a challenge. As technology advanced, running experiments online became a viable way to conduct research requiring nothing more than a personal computer. The natural next step in this progression emerged as crowdsourcing became an option. We report on our experience of joining this new wave of practice, and on the difficulties and challenges we encountered when crowdsourcing a study. These led us to re-evaluate the validity of crowdsourced research. We report our findings and conclude with guidelines for crowdsourced experiments.
| Original language | English |
|---|---|
| Pages | 643-652 |
| Number of pages | 9 |
| Publication status | Published - 26 Apr 2014 |
| Event | CHI 2014, Convention Centre, Toronto, Canada. Duration: 26 Apr 2014 → 1 May 2014. http://chi2014.acm.org/ |
Conference
| Conference | CHI 2014 |
|---|---|
| Abbreviated title | CHI 2014 |
| Country/Territory | Canada |
| City | Toronto |
| Period | 26/04/14 → 1/05/14 |
| Other | CHI 2014 is a celebration of the conference's one-of-a-kind diversity: from the broad range of backgrounds of its attendees to the diverse spectrum of communities and fields on which the conference and its research have an impact. |
| Internet address | http://chi2014.acm.org/ |
Keywords
- crowdsourced study
- crowdsourcing
- human factors
- research
- human-centered computing
- human computer interaction (HCI)