Crowdsourcing interactions: using crowdsourcing for evaluating interactive information retrieval systems

Guido Zuccon, Teerapong Leelanupab, Stewart Whiting, Emine Yilmaz, Joemon M. Jose, Leif Azzopardi

Research output: Contribution to journal › Article › peer-review

27 Citations (Scopus)


In the field of information retrieval (IR), researchers and practitioners often require valid approaches for evaluating the performance of retrieval systems. The Cranfield experiment paradigm has long dominated the in-vitro evaluation of IR systems. As an alternative to this paradigm, laboratory-based user studies have been widely used to evaluate interactive information retrieval (IIR) systems and, at the same time, to investigate users’ information searching behaviours. Major drawbacks of laboratory-based user studies for evaluating IIR systems include the high monetary and temporal costs of setting up and running such experiments, the lack of heterogeneity in the user population, and the limited scale of the experiments, which usually involve a relatively small set of users. In this article, we propose an alternative experimental methodology to laboratory-based user studies. Our novel methodology uses a crowdsourcing platform as a means of engaging study participants. Through crowdsourcing, our methodology can capture user interactions and searching behaviours at a lower cost, with more data, and within a shorter period than traditional laboratory-based user studies, and can therefore be used to assess the performance of IIR systems. We show the characteristic differences of our approach with respect to traditional IIR experimental and evaluation procedures. We also present a case study comparing crowdsourcing-based evaluation with laboratory-based evaluation of IIR systems, which can serve as a tutorial for setting up crowdsourcing-based IIR evaluations.
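
The core of such a methodology is instrumenting the search interface so that each query, result click, and relevance judgement issued by a crowd worker is captured for later analysis. Below is a minimal TypeScript sketch of what that instrumentation might look like in a browser-based study interface; the event schema, the /log endpoint, and the logInteraction helper are illustrative assumptions, not the implementation described in the paper.

// Hypothetical sketch: client-side logging of search interactions for a
// crowdsourced IIR study. The endpoint URL, event names, and worker/task
// identifiers are illustrative assumptions, not the authors' implementation.

type InteractionEvent = {
  workerId: string;  // identifier assigned by the crowdsourcing platform
  taskId: string;    // search task assigned to the worker
  action: "query" | "click" | "dwell" | "judge";
  payload: Record<string, unknown>;
  timestamp: number; // client-side epoch milliseconds
};

// Buffer events and flush them in batches to reduce request overhead.
const buffer: InteractionEvent[] = [];

function logInteraction(
  workerId: string,
  taskId: string,
  action: InteractionEvent["action"],
  payload: Record<string, unknown>,
): void {
  buffer.push({ workerId, taskId, action, payload, timestamp: Date.now() });
  if (buffer.length >= 10) {
    void flush();
  }
}

async function flush(): Promise<void> {
  if (buffer.length === 0) return;
  const batch = buffer.splice(0, buffer.length);
  // "/log" is a placeholder endpoint on the experimenter's own server.
  await fetch("/log", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(batch),
  });
}

// Example: record a query submission followed by a result click.
logInteraction("w123", "t7", "query", { query: "crowdsourcing evaluation" });
logInteraction("w123", "t7", "click", { rank: 2, docId: "doc-42" });

Batching events before sending them, as above, keeps logging overhead low for workers on slow connections, which matters when participants are recruited from a heterogeneous crowd rather than a controlled laboratory.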
Original language: English
Pages (from-to): 267-305
Number of pages: 39
Journal: Information Retrieval
Issue number: 2
Publication status: Published - 1 Apr 2013
Externally published: Yes


Keywords:
  • crowdsourcing evaluation
  • interactive IR evaluation

