Is relevance hard work? Evaluating the effort of making relevant assessments

Robert Villa, Martin Halvey

Research output: Chapter in Book/Report/Conference proceeding › Chapter (peer-reviewed)

19 Citations (Scopus)

Abstract

The judging of relevance has long been a subject of study in information retrieval, especially in the creation of relevance judgments for test collections. While the criteria by which assessors judge relevance have been intensively studied, little work has investigated the process individual assessors go through to judge the relevance of a document. In this paper, we focus on the process by which relevance is judged, and in particular, the degree of effort a user must expend to judge relevance. By better understanding this effort in isolation, we may provide data which can be used to create better models of search. We present the results of an empirical evaluation of the effort users must exert to judge the relevance of a document, investigating the effect of relevance level and document size. Results suggest that 'relevant' documents require more effort to judge when compared to highly relevant and not relevant documents, and that effort increases as document size increases.
Language: English
Title of host publication: Proceedings of the 36th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '13)
Place of publication: New York
Pages: 765-768
Number of pages: 4
DOI: 10.1145/2484028.2484150
Publication status: Published - 2013
Event: 36th International ACM SIGIR Conference on Research and Development in Information Retrieval - Dublin, Ireland
Duration: 28 Jul 2013 - 1 Aug 2013

Conference

Conference: 36th International ACM SIGIR Conference on Research and Development in Information Retrieval
Country: Ireland
City: Dublin
Period: 28/07/13 - 1/08/13


Keywords

  • information retrieval
  • relevance judgements
  • document relevance
  • information seeking behaviour

Cite this

Villa, R., & Halvey, M. (2013). Is relevance hard work? Evaluating the effort of making relevant assessments. In Proceedings of the 36th international ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '13) (pp. 765-768). New York. https://doi.org/10.1145/2484028.2484150
@inbook{e77f12c645de47478e61ea47f4663210,
title = "Is relevance hard work? Evaluating the effort of making relevant assessments",
abstract = "The judging of relevance has long been a subject of study in information retrieval, especially in the creation of relevance judgments for test collections. While the criteria by which assessors judge relevance have been intensively studied, little work has investigated the process individual assessors go through to judge the relevance of a document. In this paper, we focus on the process by which relevance is judged, and in particular, the degree of effort a user must expend to judge relevance. By better understanding this effort in isolation, we may provide data which can be used to create better models of search. We present the results of an empirical evaluation of the effort users must exert to judge the relevance of a document, investigating the effect of relevance level and document size. Results suggest that 'relevant' documents require more effort to judge when compared to highly relevant and not relevant documents, and that effort increases as document size increases.",
keywords = "information retrieval, relevance judgements, document relevance, information seeking behaviour",
author = "Robert Villa and Martin Halvey",
year = "2013",
doi = "10.1145/2484028.2484150",
language = "English",
isbn = "9781450320344",
pages = "765--768",
booktitle = "Proceedings of the 36th international ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '13)",

}



