iCLEF 2006

Guidelines

iCLEF is the interactive track of CLEF (Cross-Language Evaluation Forum), an annual evaluation exercise for Multilingual Information Access systems. In iCLEF, cross-language search capabilities are studied from a user-inclusive perspective; the question is how best to assist users when searching for information written in unknown languages, rather than how well an algorithm can find information written in languages different from the query language.

In the past, iCLEF has addressed applications such as information retrieval and question answering. However, for 2006 the focus has turned to text-based image retrieval from Flickr.

Introducing iCLEF 2006

This year we want to explore user behaviour in a collection where the need for cross-language search arises more naturally for average users. We have chosen Flickr, a large-scale, web-based image database built around a large social network of WWW users, with the potential to offer both challenging and realistic multilingual search tasks for interactive experiments.

We also want to use the iCLEF track to explore alternative evaluation methodologies for interactive IR; for this reason, we have decided to fix the search tasks but keep the evaluation methodology open, so that each participant can contribute their own ideas about how to study interactive issues in CLIR.

In this experimental setting we thus provide a realistic cross-language search scenario and example tasks, and explore evaluation methodologies for a better understanding of the search processes involved.

We welcome research teams with interests in Cross-Language IR, Human-Computer Interaction, Interactive Information Access, Image Retrieval, and Machine Translation. The organizers will foster synergies between interested parties; expertise in all of these fields is not required to participate.

OK, I'm interested. How do I join?

Go to the main CLEF page and follow the general registration procedure. Please e-mail the organizers at iclef2006@sics.se with any enquiries regarding the task.

Target Data

The target data is the live Flickr system. Search interfaces for Flickr can be built using the Flickr API (the iCLEF organization provides an example interface; see below).
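As a concrete illustration (not part of the official template), the sketch below shows how an interface might query the Flickr API's flickr.photos.search method from Python; the API key is a placeholder you must obtain from Flickr, and only the photo id and title are extracted here.

    # Illustrative sketch: query the public Flickr REST API for photos
    # matching a free-text query. Requires a registered Flickr API key.
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    API_KEY = "YOUR_API_KEY"  # placeholder: obtain a key from Flickr
    ENDPOINT = "https://api.flickr.com/services/rest/"

    def search_photos(text, per_page=10):
        """Return (id, title) pairs for photos matching `text`."""
        params = urllib.parse.urlencode({
            "method": "flickr.photos.search",
            "api_key": API_KEY,
            "text": text,
            "per_page": per_page,
        })
        with urllib.request.urlopen(ENDPOINT + "?" + params) as resp:
            root = ET.fromstring(resp.read())
        return [(p.get("id"), p.get("title")) for p in root.iter("photo")]

    if __name__ == "__main__":
        for photo_id, title in search_photos("European parliament building"):
            print(photo_id, title)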

Tasks

We propose three types of task:
  1. Classical ad-hoc task: "Find as many European parliament buildings as possible, with pictures of the assembly hall as well as of the outside."
  2. Creative task: Find five illustrations for the text "The story of saffron".
  3. Visually oriented task: What is the name of the beach where this crab is resting?

Users will be allowed 20 minutes for each task, keeping overall searching time (including training) to around 100 minutes in order to avoid fatigue effects.

Besides these three core tasks, which all participants must undertake, we allow for any additional tasks and variants that participants may devise. For instance, participants may offer a web search facility using the Flickr API and study the interaction logs of real users posing real search needs.

Data provided by the organizers

All Flickr data must be accessed via the Flickr API. The iCLEF organization provides a multilingual search interface template built on the Flickr API, both as an example of use and as a default search interface which participants may use to build their own interfaces, changing the interaction facilities or inserting their own query translation machinery. Note that the template as it stands does not include any sophisticated translation machinery (just simple dictionary lookup). Participants are encouraged to share other interface templates and/or translation resources which might be of general interest.
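For illustration only, the sketch below shows the kind of dictionary-lookup query translation described above; the bilingual dictionary, its format, and the function name are assumptions made for this example and are not taken from the actual template code.

    # Assumed example of naive dictionary-lookup query "translation":
    # each source-language term is replaced by its dictionary translations,
    # and out-of-vocabulary terms are kept untranslated.
    SPANISH_ENGLISH = {
        "playa": ["beach"],
        "cangrejo": ["crab"],
        "parlamento": ["parliament"],
        "europeo": ["european"],
    }

    def translate_query(query, dictionary=SPANISH_ENGLISH):
        target_terms = []
        for term in query.lower().split():
            target_terms.extend(dictionary.get(term, [term]))
        return " ".join(target_terms)

    # e.g. translate_query("cangrejo playa") -> "crab beach"

Participants could replace such a lookup step with their own query translation machinery (machine translation, larger dictionaries, disambiguation, etc.) while keeping the rest of the interface unchanged.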

Please contact the iCLEF organization (iclef2006@sics.se) to get the code and assistance on how to use it.

Data to be Submitted to the Organizers

As the evaluation methodology is not fixed, there will be no centralized assessments for iCLEF this year. We will provide a questionnaire for participants, intended to obtain a homogeneous description of all experiments and experimental designs.

Participants are encouraged to log as many details as possible about every search session. A skeleton questionnaire will be provided to collect some of the evaluation metrics.
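As an example of what such session logging could look like (the field names and CSV format below are only a suggestion, not a format prescribed by the organizers):

    # Suggested (not prescribed) log-record layout for search session events.
    import csv
    import time

    LOG_FIELDS = ["timestamp", "user_id", "task_id", "event",
                  "source_query", "translated_query", "photo_id"]

    def log_event(path, **event):
        """Append one interaction event (query issued, result clicked, ...) to a CSV log."""
        event.setdefault("timestamp", time.time())
        with open(path, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
            if f.tell() == 0:
                writer.writeheader()
            writer.writerow({k: event.get(k, "") for k in LOG_FIELDS})

    # e.g. log_event("session.csv", user_id="u01", task_id="adhoc",
    #                event="query", source_query="cangrejo playa",
    #                translated_query="crab beach")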

Evaluation measures

One of the challenges that iCLEF poses to participants this year is to contribute their own experimental design for evaluating user-centred multilingual information access. Given the nature of the proposed tasks, evaluation cannot focus on result accuracy (except partly in the first task). We encourage participants to centre on the notion of user satisfaction:
  1. Happy (all tasks): Are you satisfied with how you performed the task?
  2. Complete (creative task, ad-hoc task): Did you find enough, or would you have continued if there had been no time limit? How much longer do you think it would have taken to satisfy you?
  3. Quality (creative task): Compare your illustrations with a given set. Are any of the given images better than the ones you retrieved?

Schedule

Registration is now open. The only hard deadline is August 15, when the paper for the CLEF working notes is due. In the meantime, the organization will contact participants to assist them, promote discussion of experimental designs, and gather information on the experiments for the iCLEF overview paper.

  Registration: open now
  Task finalization and paper for the working notes: August 15
  Workshop (Alicante): 20-22 September 2006