University of Waterloo Docker Images for OSIRRC at SIGIR 2019
                          Ryan Clancy, Zeynep Akkalyoncu Yilmaz, Ze Zhong Wu, and Jimmy Lin
                                                       David R. Cheriton School of Computer Science
                                                                  University of Waterloo

1 OVERVIEW

The University of Waterloo team submitted a total of four Docker images to the Open-Source IR Replicability Challenge (OSIRRC) at SIGIR 2019. This short overview outlines the functionality of each image. As the READMEs in all our source repositories provide details on the technical design of our images and the retrieval models used in our runs, we intentionally do not duplicate this information here.

Our primary submission is a packaging of Anserini [11, 12], an open-source information retrieval toolkit built around Lucene to facilitate replicable research. This anserini-docker image resides at the following URL:

        https://github.com/osirrc/anserini-docker

The Anserini project grew out of the Open-Source IR Reproducibility Challenge from 2015 [5] and reflects growing community interest in using Lucene for academic IR research [1, 2]. As Lucene was not originally designed as a research toolkit, Anserini aims to fill in the "missing parts" that allow researchers to run standard ad hoc retrieval experiments "right out of the box", including competitive baselines and integration hooks for neural ranking models. Given Lucene's tremendous production deployment base (typically via Solr or Elasticsearch), better alignment between research in information retrieval and the practice of building real-world search engines promises a smoother transition path from the lab to the "real world" for research innovations.

In addition to our main Anserini image, we built two ancillary images for the OSIRRC exercise:

        https://github.com/osirrc/solrini-docker
        https://github.com/osirrc/elastirini-docker

In production environments, Lucene is most often used as a core search library that powers two widely deployed "full stack" search applications: Solr and Elasticsearch. With "Solrini" and "Elastirini", we have integrated Anserini with Solr and Elasticsearch, respectively. The integration is such that we can use Anserini as a common frontend to index into a backend Solr or Elasticsearch instance. This allows unification of the document processing pipeline (tokenization, stemming, etc.) to support standard TREC ad hoc experiments, while allowing users to take advantage of the wealth of capabilities provided by Solr and Elasticsearch. In the case of Solr, users can interact with sophisticated searching and faceted browsing interfaces such as Project Blacklight (https://projectblacklight.org/) [10], as described in Clancy et al. [3]. In the case of Elasticsearch, we gain access to the so-called ELK stack (Elasticsearch, Logstash, Kibana), which provides a complete data analytics environment, including slick visualization interfaces.

Solrini and Elastirini capabilities are exposed via the interact hook in the OSIRRC jig. Since both Solr and Elasticsearch are designed as web apps, the user can trigger the hook and then directly navigate to a URL to access system capabilities. The batch runs provided by the solrini and elastirini images are exactly the same as those of the anserini image.

The final image submitted by our group packages Birch, our newest open-source search engine (https://github.com/castorini/birch), which takes advantage of BERT [4] for ad hoc document retrieval:

        https://github.com/osirrc/birch-docker

BERT can be characterized as one instance of a family of deep neural models that make heavy use of pretraining [8, 9]. Application to many natural language processing tasks, ranging from sentence classification to sequence labeling, has led to impressive gains on standard benchmark datasets. The model has been adapted to passage ranking [7] and question answering [13], and Birch can be viewed as a continuation of this thread of research, alongside other recent models such as CEDR [6]. The central insight that Birch explores, as detailed in Yang et al. [14], is to aggregate sentence-level scores to rank documents. This image allows other researchers to replicate the results of our paper with the search hook.

REFERENCES
 [1] L. Azzopardi, M. Crane, H. Fang, G. Ingersoll, J. Lin, Y. Moshfeghi, H. Scells, P. Yang, and G. Zuccon. 2017. The Lucene for Information Access and Retrieval Research (LIARR) Workshop at SIGIR 2017. In SIGIR. 1429–1430.
 [2] L. Azzopardi, Y. Moshfeghi, M. Halvey, R. Alkhawaldeh, K. Balog, E. Di Buccio, D. Ceccarelli, J. Fernández-Luna, C. Hull, J. Mannix, and S. Palchowdhury. 2017. Lucene4IR: Developing Information Retrieval Evaluation Resources Using Lucene. SIGIR Forum 50, 2 (2017), 58–75.
 [3] R. Clancy, T. Eskildsen, N. Ruest, and J. Lin. 2019. Solr Integration in the Anserini Information Retrieval Toolkit. In SIGIR. Paris, France.
 [4] J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova. 2018. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv:1810.04805.
 [5] J. Lin, M. Crane, A. Trotman, J. Callan, I. Chattopadhyaya, J. Foley, G. Ingersoll, C. Macdonald, and S. Vigna. 2016. Toward Reproducible Baselines: The Open-Source IR Reproducibility Challenge. In ECIR. 408–420.
 [6] S. MacAvaney, A. Yates, A. Cohan, and N. Goharian. 2019. CEDR: Contextualized Embeddings for Document Ranking. arXiv:1904.07094.
 [7] R. Nogueira and K. Cho. 2019. Passage Re-ranking with BERT. arXiv:1901.04085.
 [8] M. Peters, M. Neumann, M. Iyyer, M. Gardner, C. Clark, K. Lee, and L. Zettlemoyer. 2018. Deep Contextualized Word Representations. In NAACL. 2227–2237.
 [9] A. Radford, K. Narasimhan, T. Salimans, and I. Sutskever. 2018. Improving Language Understanding by Generative Pre-Training. Technical Report.
[10] E. Sadler. 2009. Project Blacklight: A Next Generation Library Catalog at a First Generation University. Library Hi Tech 27, 1 (2009), 57–67.
[11] P. Yang, H. Fang, and J. Lin. 2017. Anserini: Enabling the Use of Lucene for Information Retrieval Research. In SIGIR. 1253–1256.
[12] P. Yang, H. Fang, and J. Lin. 2018. Anserini: Reproducible Ranking Baselines Using Lucene. JDIQ 10, 4 (2018), Article 16.
[13] W. Yang, Y. Xie, A. Lin, X. Li, L. Tan, K. Xiong, M. Li, and J. Lin. 2019. End-to-End Open-Domain Question Answering with BERTserini. In NAACL Demos. 72–77.
[14] W. Yang, H. Zhang, and J. Lin. 2019. Simple Applications of BERT for Ad Hoc Document Retrieval. arXiv:1903.10972.

Copyright © 2019 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). OSIRRC 2019 co-located with SIGIR 2019, 25 July 2019, Paris, France.
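As an illustration of the hook mechanism discussed above: the jig drives each submitted image by invoking named hooks (such as index, search, or interact) inside the container via `docker run`. The following Python fragment is a minimal sketch of that pattern; the image tag `osirrc2019/solrini` and the exact command shape are assumptions for illustration, not the jig's actual interface.

```python
# Sketch of how a jig-like harness might invoke a hook on a submitted
# image. Hook names and the image tag are hypothetical examples; the
# real OSIRRC jig manages volumes, ports, and arguments beyond this.

def hook_command(image, hook, args=()):
    """Build the docker command line for triggering one hook on an image."""
    return ["docker", "run", image, hook, *list(args)]

# For Solrini or Elastirini, triggering the interact hook starts the web
# app, after which the user navigates to its URL in a browser.
print(" ".join(hook_command("osirrc2019/solrini", "interact")))
# → docker run osirrc2019/solrini interact
```

The same helper covers batch experiments: swapping `"interact"` for `"search"` with run parameters yields the command shape used for the TREC-style runs.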



