IRF @ CLEF Conference 20-23 September, Padua

Allan Hanbury, senior scientist at the IRF, will give a presentation and be part of the panel "PROMISE for Experimental Evaluation". Florina Piroi, postdoc research assistant, and John Tait, chief scientific officer, have co-organized this year's CLEF-IP evaluation track.

The CLEF 2010 conference is the continuation of the popular CLEF campaigns and workshops that have run for the past ten years (2000-2009). CLEF 2010 will cover a broad range of issues from the fields of multilingual and multimodal information access evaluation.

Presentation and discussion
On Monday, 20 September 2010, 14:30 - 15:30, Allan Hanbury will take part in the panel "PROMISE for Experimental Evaluation", chaired by Jussi Karlgren, Swedish Institute of Computer Science, Sweden. The other panelists are Khalid Choukri, Evaluations and Language resources Distribution Agency (ELDA), France; Nicola Ferro, University of Padua, Italy; Maarten de Rijke, University of Amsterdam, The Netherlands; and Giuseppe Santucci, Sapienza University of Rome, Italy. PROMISE (Participative Research labOratory for Multimedia and Multilingual Information Systems Evaluation) is a Network of Excellence launched in conjunction with this first independent CLEF 2010 conference. It is designed to support and develop the evaluation of multilingual and multimedia information access systems, largely by building on the activities of the Cross-Language Evaluation Forum (CLEF) and taking them forward in important new ways.

Furthermore, Allan Hanbury and Henning Müller, University of Applied Sciences Western Switzerland, will present their paper "Automated Component-Level Evaluation: Present and Future" on Tuesday, 21 September 2010, 14:30-17:00, in the session Evaluation Methodologies and Metrics (2).

Florina Piroi and John Tait coordinate the CLEF-IP track and the associated workshop taking place on Wednesday, 22 September. The CLEF-IP track was launched in 2009 by the IRF to investigate IR techniques for patent retrieval. After a successful first year, the track continues as a benchmarking activity. This year's track uses a data collection of patent documents derived from EPO sources, covering English, French, and German patents, and slightly larger than the CLEF-IP'09 collection. The workshop revolves around the following tasks:

Prior Art Candidate Search Task: find patent documents that are likely to constitute prior art to a given patent application.
Classification Task: classify a given patent document according to the IPC.
The programme for the lab session can be found on the CLEF-IP website.

There are 25 runs in the Prior Art Candidate Search task and 27 runs in the Classification task, submitted by 12 participating teams. Two evaluation reports have been made available on the IRF CLEF-IP website; final versions of these reports will be published as IRF technical reports.
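To give a sense of how such runs are evaluated, the sketch below scores a ranked retrieval run against relevance judgments using precision at rank k and average precision, two standard IR measures. This is a minimal illustration, not the official CLEF-IP evaluation protocol; the document identifiers and judgments are invented for the example.

```python
# Illustrative scoring of a retrieval "run" (a ranked list of retrieved
# documents for one topic) against relevance judgments ("qrels").
# Metrics and data are illustrative, not the official CLEF-IP setup.

def precision_at_k(ranked, relevant, k):
    """Fraction of the top-k retrieved documents that are relevant."""
    top = ranked[:k]
    return sum(1 for doc in top if doc in relevant) / k

def average_precision(ranked, relevant):
    """Mean of the precision values at each rank where a relevant
    document is retrieved (averaged over all relevant documents)."""
    hits, total = 0, 0.0
    for rank, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            total += hits / rank
    return total / len(relevant) if relevant else 0.0

# Toy example: one topic, a ranked run, and its judged prior-art documents.
run = ["EP-1", "EP-7", "EP-3", "EP-9"]   # hypothetical ranked output
qrels = {"EP-1", "EP-3"}                  # hypothetical relevant documents

print(precision_at_k(run, qrels, 2))   # 0.5
print(average_precision(run, qrels))   # (1/1 + 2/3) / 2 ≈ 0.833
```

Averaging the per-topic average precision over all topics yields mean average precision (MAP), one of the measures commonly reported in such benchmarking campaigns.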

For an overview of the CLEF conference and lab sessions, please refer to the CLEF 2010 website.