cwl_eval: An evaluation tool for information retrieval

Leif Azzopardi, Paul Thomas, Alistair Moffat

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)
5 Downloads (Pure)

Abstract

We present a tool (“cwl_eval”) which unifies many metrics typically used to evaluate information retrieval systems using test collections. In the C/W/L framework metrics are specified via a single function which can be used to derive a number of related measurements: Expected Utility per item, Expected Total Utility, Expected Cost per item, Expected Total Cost, and Expected Depth. The C/W/L framework brings together several independent approaches for measuring the quality of a ranked list, and provides a coherent user model-based framework for developing measures based on utility (gain) and cost.

Here we outline the C/W/L measurement framework; describe the cwl_eval architecture; and provide examples of how to use it. We provide implementations of a number of recent metrics, including Time Biased Gain, U-Measure, Bejewelled Measure, and the Information Foraging Based Measure, as well as previous metrics such as Precision, Average Precision, Discounted Cumulative Gain, Rank-Biased Precision, and INST. By providing state-of-the-art and traditional metrics within the same framework, we promote a standardised approach to evaluating search effectiveness.
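The core C/W/L idea described above can be sketched in a few lines. This is an illustrative toy implementation, not the cwl_eval API: the function names and the truncated-list normalisation are assumptions made here. In the framework, a metric is specified by a single continuation probability function C(i); the weight W(i) placed on rank i is proportional to the probability of reaching that rank, and the expected utility per item follows directly.

```python
def cwl_expected_utility_per_item(gains, continuation):
    """Expected utility per item: sum_i W(i) * g(i), where W(i) is
    proportional to prod_{j<i} C(j), the probability of reaching rank i.

    `gains` is a list of per-rank gains; `continuation` maps a 0-based
    rank index to the probability of continuing past that rank."""
    reach = 1.0           # probability of examining the current rank
    weights = []
    for i in range(len(gains)):
        weights.append(reach)
        reach *= continuation(i)
    total = sum(weights)  # normalise over the (truncated) ranking
    return sum(w * g for w, g in zip(weights, gains)) / total

def rbp_continuation(p):
    """Rank-Biased Precision arises from a constant continuation
    probability p, independent of rank."""
    return lambda i: p
```

For example, a ranking with a single relevant item at rank 1 and a persistence of p = 0.5 yields an expected utility per item of (approximately) 0.5, matching RBP on a sufficiently deep ranking; other metrics in the framework differ only in the continuation function supplied.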

Original language: English
Title of host publication: SIGIR 2019 - Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval
Place of Publication: New York
Pages: 1321-1324
Number of pages: 4
ISBN (Electronic): 9781450361729
DOI: 10.1145/3331184.3331398
Publication status: Published - 18 Jul 2019
Event: 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2019 - Paris, France
Duration: 21 Jul 2019 - 25 Jul 2019

Publication series

Name: SIGIR 2019 - Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval

Conference

Conference: 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2019
Country: France
City: Paris
Period: 21/07/19 - 25/07/19

Keywords

  • information retrieval systems
  • C/W/L framework metrics
  • relevance
  • retrieval precision


Cite this

Azzopardi, L., Thomas, P., & Moffat, A. (2019). cwl_eval: An evaluation tool for information retrieval. In SIGIR 2019 - Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 1321-1324). https://doi.org/10.1145/3331184.3331398