TY - JOUR
T1 - Report on the First International Workshop on the Evaluation on Collaborative Information Seeking and Retrieval (ECol'2015)
AU - Soulier, Laure
AU - Tamine, Lynda
AU - Sakai, Tetsuya
AU - Azzopardi, Leif
AU - Pickens, Jeremy
PY - 2016/6/1
Y1 - 2016/6/1
N2 - The workshop on the evaluation of collaborative information retrieval and seeking (ECol) was held in conjunction with the 24th Conference on Information and Knowledge Management (CIKM) in Melbourne, Australia. The workshop featured three main elements. First, a keynote by Chirag Shah on the main dimensions, challenges, and opportunities in collaborative information retrieval and seeking. Second, an oral presentation session in which four papers were presented. Third, a discussion based on three seed research questions: (1) In what ways is collaborative search evaluation more challenging than individual interactive information retrieval (IIIR) evaluation? (2) Would it be possible and/or useful to standardise experimental designs and data for collaborative search evaluation? and (3) For evaluating collaborative search, can we leverage ideas from other tasks such as diversified search, subtopic mining and/or e-discovery? The discussion was intense and raised many points and issues, leading to the proposition that a new evaluation track focused on collaborative information retrieval/seeking tasks would be worthwhile.
AB - The workshop on the evaluation of collaborative information retrieval and seeking (ECol) was held in conjunction with the 24th Conference on Information and Knowledge Management (CIKM) in Melbourne, Australia. The workshop featured three main elements. First, a keynote by Chirag Shah on the main dimensions, challenges, and opportunities in collaborative information retrieval and seeking. Second, an oral presentation session in which four papers were presented. Third, a discussion based on three seed research questions: (1) In what ways is collaborative search evaluation more challenging than individual interactive information retrieval (IIIR) evaluation? (2) Would it be possible and/or useful to standardise experimental designs and data for collaborative search evaluation? and (3) For evaluating collaborative search, can we leverage ideas from other tasks such as diversified search, subtopic mining and/or e-discovery? The discussion was intense and raised many points and issues, leading to the proposition that a new evaluation track focused on collaborative information retrieval/seeking tasks would be worthwhile.
KW - information seeking
KW - information retrieval
UR - https://www.irit.fr/ECol2015/
U2 - 10.1145/2964797.2964805
DO - 10.1145/2964797.2964805
M3 - Article
SN - 0163-5840
VL - 50
SP - 42
EP - 48
JO - ACM SIGIR Forum
JF - ACM SIGIR Forum
IS - 1
ER -