Evaluating user studies in information access

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Systems enabling information access, whether a file browser, a retrieval engine, a mobile device providing content, or a personalized agent, need to be evaluated appropriately for the discipline to be considered a science. The question of how to evaluate such systems appropriately becomes even more difficult when the evaluation is conducted with human subjects.
Original language: English
Title of host publication: Context
Subtitle of host publication: Nature, Impact and Role - 5th International Conference on Conceptions of Library and Information Sciences, CoLIS 2005, Glasgow, UK, June 4-8, 2005. Proceedings
Place of publication: Berlin, Heidelberg
Publisher: Springer-Verlag
Pages: 251-251
Number of pages: 1
Volume: 3507
ISBN (Print): 978-3-540-26178-0
DOIs
Publication status: Published - 13 Jun 2005

Publication series

Name: CoLIS'05
Publisher: Springer-Verlag

Keywords

  • retrieval engines
  • mobile devices
  • information access
