Entrance Exams 2015
CLEF Question Answering Track (CLEF QA Track), a CLEF 2015 Lab
Summary

Japanese University Entrance Exams include questions formulated at various levels of complexity and test a wide range of capabilities. The "Entrance Exams" challenge aims at evaluating systems under the same conditions under which humans are evaluated to enter university. Following the first campaign, the challenge is restricted to the Reading Comprehension exercises contained in the English exams. More types of exercises will be included in subsequent campaigns (2014-2016) in coordination with the "Entrance Exams" task at NTCIR. The exams are created by the Japanese National Center for University Admissions Tests. The "Entrance Exams" corpus is provided by NII's Todai Robot Project and NTCIR RITE.

Task Description

Participant systems are asked to read a given document and answer questions about it. Questions are given in multiple-choice format, with four options from which a single answer must be selected. Questions are taken from reading tests of real university entrance exams and may include a variety of question types that require a wide range of inference. Systems have to answer questions by referring to the "common sense knowledge" that high school students who aim to enter university are expected to have. An important difference from previous QA exercises is that we do not intend to restrict the question types: any type of reading comprehension question found in real entrance exams may appear in the test data.

Language

Test documents, questions, and options will be made available in English, Russian, French, Spanish, Italian and German. Please contact the organization to express interest in any particular language.

Data

Following the 2013 and 2014 editions, the Entrance Exams 2015 test set will be composed of reading comprehension tests taken from the Japanese Center Test, a nation-wide achievement test for Japanese university admissions.
The 2013 and 2014 datasets will serve as development data for 2015:
- 24 documents

For testing, new documents will be delivered:
- 12 test documents

Please refer to the Guidelines for the Input (INPUT DTD) and Output (OUTPUT DTD) formats. To obtain the data, please register for the CLEF QA Track - Entrance Exams. The data will be provided in the same XML format as the QA4MRE main task. The Data section includes an excerpt from the actual data (taken from the 2007 Center Test provided by the National Center for University Entrance Examinations, Japan).

Evaluation

Test questions will be posted on a web site that will be announced to the teams who have submitted the signed agreement form to the NTCIR office. The test set will be available on May 6, and submissions will be due within 5 days of the first test set download, and no later than May 15 at 11:59 p.m. (CEST). Late submissions will not be considered.

Scoring of the output produced by participant systems will be performed automatically by comparing the systems' answers against the gold standard collection with annotations made by humans. No manual assessment will be performed. Each test will receive an evaluation score between 0 and 1 using c@1. This measure, used in previous CLEF QA Tracks, encourages systems to reduce the number of incorrect answers while maintaining the number of correct ones by leaving some questions unanswered. Systems will receive evaluation scores from two different perspectives:

1. at the question-answering level: correct answers are counted individually, without grouping them;
2. at the reading-test level: answers are grouped by test, and each test is scored separately.

Important Dates
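As a reference for the Evaluation section above, the c@1 measure used in previous CLEF QA Tracks can be sketched as follows. This is an illustrative helper, not part of the task's official evaluation tooling; it assumes the standard c@1 definition, where unanswered questions earn partial credit proportional to the system's accuracy over all questions:

```python
def c_at_1(n_correct: int, n_unanswered: int, n_total: int) -> float:
    """c@1: accuracy plus partial credit for unanswered questions,
    weighted by overall accuracy.

    c@1 = (n_correct + n_unanswered * n_correct / n_total) / n_total
    """
    return (n_correct + n_unanswered * n_correct / n_total) / n_total

# A system that answers 5 of 10 questions correctly and leaves 2
# unanswered scores higher than one that answers the remaining 5
# incorrectly, illustrating the incentive to abstain when unsure:
print(c_at_1(5, 2, 10))  # 0.6
print(c_at_1(5, 0, 10))  # 0.5
```

Note that when every question is answered (n_unanswered = 0), c@1 reduces to plain accuracy, so abstaining never hurts a system relative to guessing wrong.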
Contact

Anselmo Peñas (anselmo@lsi.uned.es)
Yusuke Miyao (yusuke@nii.ac.jp)

Organizing Committee

Anselmo Peñas (UNED, Spain)
Yusuke Miyao (National Institute of Informatics, Japan)
Alvaro Rodrigo (UNED, Spain)
Eduard Hovy (Carnegie Mellon University, USA)
Noriko Kando (National Institute of Informatics, Japan)
Teruko Mitamura (Carnegie Mellon University, USA)