Towards better measures: evaluation of estimated resource description quality for distributed IR

M. Baillie, L. Azzopardi, F. Crestani

Research output: Contribution to conference - Paper (peer-review)


Abstract

An open problem for Distributed Information Retrieval (DIR) systems is how to represent large document repositories, also known as resources, both accurately and efficiently. Obtaining resource description estimates is an important phase in DIR, especially in non-cooperative environments. Measuring the quality of an estimated resource description is a contentious issue, as current measures do not provide an adequate indication of quality. In this paper, we provide an overview of these currently applied measures of resource description quality, before proposing the Kullback-Leibler (KL) divergence as an alternative. Through experimentation we illustrate the shortcomings of these past measures, whilst providing evidence that KL is a more appropriate measure of quality. When applying KL to compare different Query-Based Sampling (QBS) algorithms, our experiments provide strong evidence in favour of a previously unsupported hypothesis posited in the original Query-Based Sampling work.
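The abstract's proposed measure can be sketched concretely: treating the actual resource and its sampled estimate as term distributions, the KL divergence D(actual || estimate) quantifies how poorly the estimate models the resource (lower is better). The sketch below is illustrative only, with hypothetical term counts and a simple epsilon-smoothing choice that is our assumption, not a detail taken from the paper.

```python
import math
from collections import Counter

def kl_divergence(actual_counts, estimated_counts, epsilon=1e-9):
    """KL divergence D(actual || estimated) over the combined vocabulary.

    Both inputs map term -> count. Each distribution is smoothed with a
    small epsilon so terms unseen in the estimate do not yield an
    infinite divergence (the smoothing scheme here is an assumption).
    """
    vocab = set(actual_counts) | set(estimated_counts)
    a_total = sum(actual_counts.values()) + epsilon * len(vocab)
    e_total = sum(estimated_counts.values()) + epsilon * len(vocab)
    kl = 0.0
    for term in vocab:
        p = (actual_counts.get(term, 0) + epsilon) / a_total
        q = (estimated_counts.get(term, 0) + epsilon) / e_total
        kl += p * math.log(p / q)
    return kl

# Hypothetical term counts: the full resource vs. a small sampled estimate.
actual = Counter({"retrieval": 120, "index": 80, "query": 60, "rank": 40})
estimate = Counter({"retrieval": 12, "index": 9, "query": 5})
print(kl_divergence(actual, estimate))
```

A perfect estimate yields a divergence of (approximately) zero, while an estimate that misses or misweights terms, as the sample above misses "rank", yields a positive score, which is what makes KL usable for comparing competing QBS algorithms against the same resource.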
Original language: English
Publication status: Published - 2006
Event: First International Conference on Scalable Information Systems - Hong Kong
Duration: 30 May 2006 - 1 Jun 2006

Conference

Conference: First International Conference on Scalable Information Systems
Abbreviated title: INFOSCALE 2006
City: Hong Kong
Period: 30/05/06 - 1/06/06

Keywords

  • cataloguing
  • resource description
  • metadata
  • information retrieval
  • searching
  • search algorithm

