This page documents BM25 regression experiments for FIRE 2012 ad hoc retrieval (Monolingual English). The document collection can be found on the FIRE data page.
The exact configurations for these regressions are stored in this YAML file. Note that this page is automatically generated from this template as part of Anserini's regression pipeline, so do not modify this page directly; modify the template instead.
From one of our Waterloo servers (e.g., `orca`), the following command will perform the complete regression, end to end:
```bash
python src/main/python/run_regression.py --index --verify --search --regression fire12-en
```
Typical indexing command:
```bash
target/appassembler/bin/IndexCollection \
  -collection CleanTrecCollection \
  -input /path/to/fire12-en \
  -index indexes/lucene-index.fire12-en/ \
  -generator DefaultLuceneDocumentGenerator \
  -threads 16 -storePositions -storeDocvectors -storeRaw -language en \
  >& logs/log.fire12-en &
```
The directory `/path/to/fire12-en/` should point to the collection, which contains the `en_BDNews24` and `en_TheTelegraph_2001-2010` subdirectories. There should be 392,577 documents in total.
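Before launching the (fairly long) indexing job, it can be worth confirming that the collection directory has the layout described above. A minimal sketch, assuming only the two subdirectory names given in this page (the helper name is illustrative, not part of Anserini):

```python
import os

def check_collection_layout(root):
    """Return the list of expected FIRE 2012 English subdirectories
    missing under `root`; an empty list means the layout looks right."""
    expected = ["en_BDNews24", "en_TheTelegraph_2001-2010"]
    return [d for d in expected
            if not os.path.isdir(os.path.join(root, d))]
```

Run it against your `/path/to/fire12-en` before invoking `IndexCollection`.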
For additional details, see explanation of common indexing options.
Topics and qrels are stored here, which is linked to the Anserini repo as a submodule. They are downloaded from the FIRE data page:
+ `topics.fire12en.176-225.txt`: topics for FIRE 2012 Monolingual English (176 to 225)
+ `qrels.fire12en.176-225.txt`: qrels (version II) for FIRE 2012 Monolingual English (176 to 225)
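The qrels file above follows the standard TREC format: one whitespace-separated line per judgment, with topic id, an unused iteration field, document id, and relevance grade. A small sketch of loading it into a dictionary (the helper name is illustrative):

```python
from collections import defaultdict

def load_qrels(path):
    """Parse a TREC-format qrels file (topic, iteration, docid, rel)
    into a nested dict: {topic: {docid: relevance}}."""
    qrels = defaultdict(dict)
    with open(path) as f:
        for line in f:
            topic, _iteration, docid, rel = line.split()
            qrels[topic][docid] = int(rel)
    return dict(qrels)
```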
After indexing has completed, you should be able to perform retrieval as follows:
```bash
target/appassembler/bin/SearchCollection \
  -index indexes/lucene-index.fire12-en/ \
  -topics tools/topics-and-qrels/topics.fire12en.176-225.txt \
  -topicreader Trec \
  -output runs/run.fire12-en.bm25.topics.fire12en.176-225.txt \
  -bm25 -language en &
```
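The output run file is in standard TREC format: six whitespace-separated columns (query id, `Q0`, document id, rank, score, run tag). If you want to inspect the rankings directly, a minimal reader might look like this (the function name is illustrative):

```python
def load_run(path):
    """Read a TREC run file (qid, Q0, docid, rank, score, tag) into
    {qid: [(docid, score), ...]}, sorted by descending score."""
    run = {}
    with open(path) as f:
        for line in f:
            qid, _q0, docid, _rank, score, _tag = line.split()
            run.setdefault(qid, []).append((docid, float(score)))
    for qid in run:
        run[qid].sort(key=lambda pair: pair[1], reverse=True)
    return run
```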
Evaluation can be performed using `trec_eval`:
```bash
tools/eval/trec_eval.9.0.4/trec_eval -m map -m P.20 -m ndcg_cut.20 \
  tools/topics-and-qrels/qrels.fire12en.176-225.txt \
  runs/run.fire12-en.bm25.topics.fire12en.176-225.txt
```
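`trec_eval` prints three whitespace-separated columns per line: metric name, query id (`all` for the aggregate), and value. A small sketch for collecting the aggregate scores programmatically, e.g. when scripting the comparison against the table below (the function name is illustrative):

```python
def parse_trec_eval(output):
    """Parse trec_eval output (metric, query-id, value) into
    {metric: value}, keeping only the aggregate 'all' rows."""
    scores = {}
    for line in output.strip().splitlines():
        metric, qid, value = line.split()
        if qid == "all":
            scores[metric] = float(value)
    return scores
```

Note that `trec_eval` reports the `-m P.20` and `-m ndcg_cut.20` measures under the names `P_20` and `ndcg_cut_20`.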
With the above commands, you should be able to reproduce the following results:
| MAP | BM25 |
|:----|-----:|
| FIRE 2012 (Monolingual English) | 0.3713 |

| P20 | BM25 |
|:----|-----:|
| FIRE 2012 (Monolingual English) | 0.4970 |

| nDCG@20 | BM25 |
|:--------|-----:|
| FIRE 2012 (Monolingual English) | 0.5420 |