Evaluating user studies in information access

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution book

Abstract

Systems enabling Information Access, whether a file browser, a retrieval engine, a mobile device providing content, or a personalized agent, need to be evaluated appropriately for the discipline to be considered a science. The question of how to evaluate such systems appropriately becomes even more difficult when the evaluation is conducted with human subjects.
Original language: English
Title of host publication: Context
Subtitle of host publication: Nature, Impact and Role - 5th International Conference on Conceptions of Library and Information Sciences, CoLIS 2005, Glasgow, UK, June 4-8, 2005. Proceedings
Place of Publication: Berlin, Heidelberg
Publisher: Springer-Verlag
Pages: 251-251
Number of pages: 1
Volume: 3507
ISBN (Print): 978-3-540-26178-0
DOIs: https://doi.org/10.1007/11495222_21
Publication status: Published - 13 Jun 2005

Publication series

Name: CoLIS'05
Publisher: Springer-Verlag

Keywords

  • retrieval engines
  • mobile devices
  • information access

Cite this

Bailey, A., Ruthven, I., & Azzopardi, L. (2005). Evaluating user studies in information access. In Context: Nature, Impact and Role - 5th International Conference on Conceptions of Library and Information Sciences, CoLIS 2005, Glasgow, UK, June 4-8, 2005. Proceedings (Vol. 3507, pp. 251-251). (CoLIS'05). Berlin, Heidelberg: Springer-Verlag. https://doi.org/10.1007/11495222_21
Bailey, Alex ; Ruthven, Ian ; Azzopardi, Leif. / Evaluating user studies in information access. Context: Nature, Impact and Role - 5th International Conference on Conceptions of Library and Information Sciences, CoLIS 2005, Glasgow, UK, June 4-8, 2005. Proceedings. Vol. 3507 Berlin, Heidelberg : Springer-Verlag, 2005. pp. 251-251 (CoLIS'05).
@inproceedings{eaf7952570ce4687a164ecfaaafdf1cc,
title = "Evaluating user studies in information access",
abstract = "Systems enabling Information Access, whether a file browser, a retrieval engine, a mobile device providing content, or a personalized agent, need to be evaluated appropriately for the discipline to be considered a science. The question of how to evaluate such systems appropriately becomes even more difficult when the evaluation is conducted with human subjects.",
keywords = "retrieval engines, mobile devices, information access",
author = "Alex Bailey and Ian Ruthven and Leif Azzopardi",
year = "2005",
month = "6",
day = "13",
doi = "10.1007/11495222_21",
language = "English",
isbn = "978-3-540-26178-0",
volume = "3507",
series = "CoLIS'05",
publisher = "Springer-Verlag",
pages = "251--251",
booktitle = "Context",
}

Bailey, A, Ruthven, I & Azzopardi, L 2005, Evaluating user studies in information access. in Context: Nature, Impact and Role - 5th International Conference on Conceptions of Library and Information Sciences, CoLIS 2005, Glasgow, UK, June 4-8, 2005. Proceedings. vol. 3507, CoLIS'05, Springer-Verlag, Berlin, Heidelberg, pp. 251-251. https://doi.org/10.1007/11495222_21

Evaluating user studies in information access. / Bailey, Alex; Ruthven, Ian; Azzopardi, Leif.

Context: Nature, Impact and Role - 5th International Conference on Conceptions of Library and Information Sciences, CoLIS 2005, Glasgow, UK, June 4-8, 2005. Proceedings. Vol. 3507 Berlin, Heidelberg : Springer-Verlag, 2005. p. 251-251 (CoLIS'05).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution book

TY - GEN

T1 - Evaluating user studies in information access

AU - Bailey, Alex

AU - Ruthven, Ian

AU - Azzopardi, Leif

PY - 2005/6/13

Y1 - 2005/6/13

N2 - Systems enabling Information Access, whether a file browser, a retrieval engine, a mobile device providing content, or a personalized agent, need to be evaluated appropriately for the discipline to be considered a science. The question of how to evaluate such systems appropriately becomes even more difficult when the evaluation is conducted with human subjects.

AB - Systems enabling Information Access, whether a file browser, a retrieval engine, a mobile device providing content, or a personalized agent, need to be evaluated appropriately for the discipline to be considered a science. The question of how to evaluate such systems appropriately becomes even more difficult when the evaluation is conducted with human subjects.

KW - retrieval engines

KW - mobile devices

KW - information access

U2 - 10.1007/11495222_21

DO - 10.1007/11495222_21

M3 - Conference contribution book

SN - 978-3-540-26178-0

VL - 3507

T3 - CoLIS'05

SP - 251

EP - 251

BT - Context

PB - Springer-Verlag

CY - Berlin, Heidelberg

ER -

Bailey A, Ruthven I, Azzopardi L. Evaluating user studies in information access. In Context: Nature, Impact and Role - 5th International Conference on Conceptions of Library and Information Sciences, CoLIS 2005, Glasgow, UK, June 4-8, 2005. Proceedings. Vol. 3507. Berlin, Heidelberg: Springer-Verlag. 2005. p. 251-251. (CoLIS'05). https://doi.org/10.1007/11495222_21