DBpedia-Entity v2: A Test Collection for Entity Search

DBpedia-Entity is a standard test collection for entity search. It was first released as DBpedia-Entity v1 [1] and later updated to DBpedia-Entity v2 [2]. This repository contains the collection, baseline runs, and further details about the DBpedia dump and index.

For detailed information, please check the DBpedia-Entity v2 paper and poster.

Queries

The collection consists of a set of heterogeneous entity-bearing queries, categorized into four groups:

  • SemSearch_ES: Named entity queries; e.g., "brooklyn bridge" or "08 toyota tundra".
  • INEX-LD: IR-style keyword queries; e.g., "electronic music genres".
  • QALD2: Natural language questions; e.g., "Who is the mayor of Berlin?"
  • ListSearch: Queries that seek a particular list of entities; e.g., "Professional sports teams in Philadelphia".

All queries are prefixed with the name of the originating benchmark. SemSearch_ES, INEX-LD, and QALD2 each correspond to a separate category; the rest of the queries belong to the ListSearch category.
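
As an illustration, the Python sketch below maps a prefixed query ID to its category following the rule above. The exact prefix strings (e.g., whether INEX-LD appears as INEX_LD in the IDs) are assumptions and should be checked against queries-v2.txt.

```python
# Minimal sketch: derive the query category from the benchmark prefix of a query ID.
# The prefix strings below are assumptions; verify them against queries-v2.txt.

def query_category(query_id: str) -> str:
    if query_id.startswith("SemSearch_ES"):
        return "SemSearch_ES"
    if query_id.startswith(("INEX_LD", "INEX-LD")):
        return "INEX-LD"
    if query_id.startswith("QALD2"):
        return "QALD2"
    # All remaining benchmarks fall into the ListSearch category.
    return "ListSearch"
```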

Collection

DBpedia-Entity v2 is built on DBpedia version 2015-10. The collection can be found under collection/v2 and is organized as follows:

  • queries-v2.txt: 467 queries, where each line contains a queryID and query text.
  • queries-v2_stopped.txt: The same queries, with stop patterns and punctuation marks removed.
  • qrels-v2.txt: Relevance judgments in standard TREC format.
  • folds/: Five folds of train/test query splits for each query subset, to be used for cross-validation in supervised approaches. When cross-validating over all queries, use folds/all_queries.json. (A loading sketch for these files follows this list.)
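
For convenience, here is a minimal Python sketch for loading the queries and qrels. It assumes queries-v2.txt is tab-separated and that entity IDs in the qrels contain no whitespace; adjust the parsing if the files differ.

```python
# Minimal sketch, assuming queries-v2.txt is tab-separated (queryID <TAB> text)
# and qrels-v2.txt follows the standard TREC qrels layout:
#   queryID  iteration  entityID  relevance
from collections import defaultdict

def load_queries(path="collection/v2/queries-v2.txt"):
    queries = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            qid, text = line.rstrip("\n").split("\t", 1)
            queries[qid] = text
    return queries

def load_qrels(path="collection/v2/qrels-v2.txt"):
    qrels = defaultdict(dict)
    with open(path, encoding="utf-8") as f:
        for line in f:
            qid, _iteration, entity_id, rel = line.split()
            qrels[qid][entity_id] = int(rel)
    return qrels
```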

This repository also contains the DBpedia-Entity v1 collection, which was built on DBpedia version 3.7. It can be found under collection/v1 and is organized similarly to v2. There are, however, three qrels files for DBpedia-Entity v1:

  • qrels-v1_37.txt: The original qrels, based on DBpedia 3.7.
  • qrels-v1_39.txt: Qrels with updated entity IDs according to DBpedia 3.9.
  • qrels-v1_2015_10.txt: Qrels with updated entity IDs according to DBpedia 2015-10.

Baseline runs

The runs folder contains all the baseline runs related to this collection in TREC format. The following runs are made available:

  • /v1: The runs related to DBpedia-Entity v1, reported in Table 2 of [2].
  • /v2: The runs related to DBpedia-Entity v2, reported in the table below. Runs are compared in terms of NDCG at ranks 10 and 100; any new run on DBpedia-Entity v2 should be compared against these results (an evaluation sketch follows the table).
Model       | SemSearch ES    | INEX-LD         | ListSearch      | QALD-2          | Total
            | @10     @100    | @10     @100    | @10     @100    | @10     @100    | @10     @100
BM25        | 0.2497  0.4110  | 0.1828  0.3612  | 0.0627  0.3302  | 0.2751  0.3366  | 0.2558  0.3582
PRMS        | 0.5340  0.6108  | 0.3590  0.4295  | 0.3684  0.4436  | 0.3151  0.4026  | 0.3905  0.4688
MLM-all     | 0.5528  0.6247  | 0.3752  0.4493  | 0.3712  0.4577  | 0.3249  0.4208  | 0.4021  0.4852
LM          | 0.5555  0.6475  | 0.3999  0.4745  | 0.3925  0.4723  | 0.3412  0.4338  | 0.4182  0.5036
SDM         | 0.5535  0.6672  | 0.4030  0.4911  | 0.3961  0.4900  | 0.3390  0.4274  | 0.4185  0.5143
LM+ELR      | 0.5554  0.6469  | 0.4040  0.4816  | 0.3992  0.4845  | 0.3491  0.4383  | 0.4230  0.5093
SDM+ELR     | 0.5548  0.6680  | 0.4104  0.4988  | 0.4123  0.4992  | 0.3446  0.4363  | 0.4261  0.5211
MLM-CA      | 0.6247  0.6854  | 0.4029  0.4796  | 0.4021  0.4786  | 0.3365  0.4301  | 0.4365  0.5143
BM25-CA     | 0.5858  0.6883  | 0.4120  0.5050  | 0.4220  0.5142  | 0.3566  0.4426  | 0.4399  0.5329
FSDM        | 0.6521  0.7220  | 0.4214  0.5043  | 0.4196  0.4952  | 0.3401  0.4358  | 0.4524  0.5342
BM25F-CA    | 0.6281  0.7200  | 0.4394  0.5296  | 0.4252  0.5106  | 0.3689  0.4614  | 0.4605  0.5505
FSDM+ELR    | 0.6563  0.7257  | 0.4354  0.5134  | 0.4220  0.4985  | 0.3468  0.4456  | 0.4590  0.5408
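
To reproduce this kind of comparison, a minimal sketch using the pytrec_eval package is shown below; the run file name is hypothetical, and load_qrels refers to the loading sketch above. The same cutoffs can also be computed with the trec_eval tool's ndcg_cut measure.

```python
# Minimal sketch: macro-averaged NDCG@10 / NDCG@100 for a TREC-format run,
# assuming the pytrec_eval package is installed (pip install pytrec-eval).
import pytrec_eval
from collections import defaultdict

def load_run(path):
    # TREC run layout: queryID  Q0  entityID  rank  score  runTag
    run = defaultdict(dict)
    with open(path, encoding="utf-8") as f:
        for line in f:
            qid, _q0, entity_id, _rank, score, _tag = line.split()
            run[qid][entity_id] = float(score)
    return run

qrels = load_qrels("collection/v2/qrels-v2.txt")   # see the loading sketch above
run = load_run("runs/v2/bm25.run")                 # hypothetical run file name
evaluator = pytrec_eval.RelevanceEvaluator(qrels, {"ndcg_cut"})
per_query = evaluator.evaluate(run)
for measure in ("ndcg_cut_10", "ndcg_cut_100"):
    print(measure, sum(q[measure] for q in per_query.values()) / len(per_query))
```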

Citation

If using this collection in a publication, please cite the following paper:

@inproceedings{Hasibi:2017:DVT,
 author =    {Hasibi, Faegheh and Nikolaev, Fedor and Xiong, Chenyan and Balog, Krisztian and Bratsberg, Svein Erik and Kotov, Alexander and Callan, Jamie},
 title =     {DBpedia-Entity V2: A Test Collection for Entity Search},
 booktitle = {Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval},
 series =    {SIGIR '17},
 year =      {2017},
 pages =     {1265--1268},
 doi =       {10.1145/3077136.3080751},
 publisher = {ACM}
}

If possible, please also include the URL http://tiny.cc/dbpedia-entity in your paper; the data is available for download there.

Acknowledgment

This research was partially supported by the Norwegian Research Council, the National Science Foundation (NSF) grant IIS-1422676, a Google Faculty Research Award, and an Allen Institute for Artificial Intelligence Student Fellowship. We thank Saeid Balaneshin, Jan R. Benetka, Heng Ding, Dario Garigliotti, Mehedi Hasan, Indira Kurmantayeva, and Shuo Zhang for their help with creating the relevance judgments.

Contact

In case of questions, feel free to contact f.hasibi@cs.ru.nl or krisztian.balog@uis.no.


[1] K. Balog and R. Neumayer. “A Test Collection for Entity Search in DBpedia”, In Proceedings of the 36th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR ’13), pages 737-740, 2013.

[2] F. Hasibi, F. Nikolaev, C. Xiong, K. Balog, S. E. Bratsberg, A. Kotov, and J. Callan. “DBpedia-Entity v2: A Test Collection for Entity Search”, In Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR ’17), pages 1265-1268, 2017.