Entrance Exams


Entrance Exams 2015

CLEF Question Answering Track (CLEF QA Track)

a CLEF 2015 Lab


2015 Guidelines (27/02/2015)

Summary - Task Description - Languages - Data - Evaluation - Important Dates - Contact


Summary

Japanese University Entrance Exams include questions formulated at various levels of complexity and test a wide range of capabilities. The "Entrance Exams" challenge aims at evaluating systems under the same conditions under which humans are evaluated to enter university. Following the first campaign, the challenge is restricted to the Reading Comprehension exercises contained in the English exams. More types of exercises will be included in subsequent campaigns (2014-2016) in coordination with the "Entrance Exams" task at NTCIR. The exams are created by the Japanese National Center for University Admissions Tests. The "Entrance Exams" corpus is provided by NII's Todai Robot Project and NTCIR RITE.

Task Description

Participant systems are asked to read a given document and answer questions. Questions are given in multiple-choice format, with four options from which a single answer must be selected. Questions are taken from reading tests of real university entrance exams, and may include a variety of question types that require a wide range of inference. Systems have to answer questions by referring to "common sense knowledge" that high school students who aim to enter the university are expected to have. An important difference from previous QA exercises is that we do not intend to restrict question types. Any types of reading comprehension questions in real entrance exams will be included in the test data.

Languages

Test documents, questions, and options will be made available in English, Russian, French, Spanish, Italian, and German. Please contact the organizers to express interest in any particular language.

Data

Following the 2013 and 2014 editions, the Entrance Exams 2015 test set will be composed of reading comprehension tests taken from the Japanese Center Test, a nation-wide achievement test for Japanese university admissions. The 2013 and 2014 datasets will serve as development data for 2015:
- 24 documents
- 120 questions (5 questions for each document) with
- 480 choices/options (4 for each question)
For testing, new documents will be delivered:
- 12 test documents
- 60 questions (5 questions for each document) with
- 240 choices/options (4 for each question)
Please refer to the Guidelines for the Input (INPUT DTD) and Output (OUTPUT DTD) formats. To obtain the data:
1) Register for the CLEF QA Track - Entrance Exams.
2) Prepare two signed agreement forms and send them to the NTCIR Office by postal mail. We need actual signatures, so please send them by postal mail rather than as an email attachment or fax.
The data will be provided in the same XML format as the QA4MRE main task. The Data section contains an excerpt from the actual data (taken from the 2007 Center Test provided by the National Center for University Entrance Examinations, Japan).
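To give a feel for how such a test might be processed, the sketch below parses a small QA4MRE-style reading test with Python's standard library and extracts the gold option for each question. The XML sample, tag names, and attribute names here are purely illustrative assumptions; consult the official INPUT DTD for the authoritative format.

```python
# Hypothetical sketch: tag/attribute names are illustrative, not the official DTD.
import xml.etree.ElementTree as ET

SAMPLE = """\
<reading-test r_id="1">
  <doc d_id="1">Taro walked to school because his bicycle had a flat tire.</doc>
  <q q_id="1">
    <q_str>Why did Taro walk to school?</q_str>
    <answer a_id="1">He missed the bus.</answer>
    <answer a_id="2" correct="Yes">His bicycle had a flat tire.</answer>
    <answer a_id="3">It was a holiday.</answer>
    <answer a_id="4">He wanted exercise.</answer>
  </q>
</reading-test>
"""

root = ET.fromstring(SAMPLE)
for q in root.iter("q"):
    options = q.findall("answer")          # four options per question
    gold = [a.get("a_id") for a in options if a.get("correct") == "Yes"]
    print(q.get("q_id"), len(options), gold)
```

A system would emit its chosen option id per question in the OUTPUT DTD format; the gold annotations are of course absent from the test-time input.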

Evaluation

Test questions will be posted on a web site that will be announced to the teams that have submitted the signed agreement form to the NTCIR office. The test set will be available on May 6, and submissions will be due within 5 days of the first test set download, and no later than May 15 at 11:59 p.m. (CEST). Late submissions will not be considered.

Scoring of the output produced by participant systems will be performed automatically by comparing the systems' answers against the gold standard collection, which contains annotations made by humans. No manual assessment will be performed.

Each test will receive an evaluation score between 0 and 1 using c@1. This measure, used in previous CLEF QA Tracks, encourages systems to reduce the number of incorrect answers while maintaining the number of correct ones by leaving some questions unanswered.
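The trade-off c@1 encodes can be stated compactly. As a sketch, assuming the definition used in earlier CLEF QA tracks (n_R correct answers, n_U unanswered questions, n questions in total):

```python
def c_at_1(n_correct: int, n_unanswered: int, n_total: int) -> float:
    """c@1 = (n_R + n_U * n_R / n) / n.

    Unanswered questions earn partial credit proportional to the
    accuracy achieved on the questions that were answered.
    """
    return (n_correct + n_unanswered * n_correct / n_total) / n_total

# With no questions left blank, c@1 reduces to plain accuracy:
print(c_at_1(3, 0, 5))            # 0.6
# Leaving 2 questions blank instead of answering them wrongly scores higher:
print(round(c_at_1(3, 2, 5), 4))  # 0.84
```

In other words, abstaining is rewarded only to the extent that the system is accurate when it does answer, so random guessing on hard questions is discouraged.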

Systems will receive evaluation scores from two different perspectives:

1. at the question-answering level: correct answers are counted individually without grouping them;
2. at the reading-test level: scores are reported both for each reading test as a whole and for each separate topic.

Important Dates

Registration at CLEF: open
Send the agreement form (see Data section): April 30, 2015
Test set release: May 6, 2015
Runs submission: May 15, 2015
Individual results to participants: May 20, 2015
Submission of Working Notes Papers: June 7, 2015
CLEF Workshop in Toulouse, FR: September 15-18, 2015

Contact

Anselmo Peñas (anselmo@lsi.uned.es)
Yusuke Miyao (yusuke@nii.ac.jp)

Organizing Committee

Anselmo Peñas (UNED, Spain)
Yusuke Miyao (National Institute of Informatics, Japan)
Alvaro Rodrigo (UNED, Spain)
Eduard Hovy (Carnegie Mellon University, USA)
Noriko Kando (National Institute of Informatics, Japan)
Teruko Mitamura (Carnegie Mellon University, USA)