Data for: Multi-Armed Bandits for Adjudicating Documents in Pooling-Based Evaluation of Information Retrieval Systems

Published: 22 Jun 2017 | Version 1 | DOI: 10.17632/mpckty8hyr.1

This data is associated with the following peer reviewed publication:

Multi-armed bandits for adjudicating documents in pooling-based evaluation of information retrieval systems

Published in: Information Processing & Management

Latest version

  • Version 1

    Published: 2017-06-22

    DOI: 10.17632/mpckty8hyr.1

    Cite this dataset

    Losada, David; Barreiro, Álvaro; Parapar, Javier (2017), “Data for: Multi-Armed Bandits for Adjudicating Documents in Pooling-Based Evaluation of Information Retrieval Systems”, Mendeley Data, v1. http://dx.doi.org/10.17632/mpckty8hyr.1

Categories

Information Retrieval, Reinforcement Learning, Search Engine

Licence

CC BY NC 3.0

The files associated with this dataset are licensed under an Attribution-NonCommercial 3.0 Unported licence.

What does this mean?

You are free to adapt, copy, or redistribute the material, provided that you give appropriate attribution and do not use the material for commercial purposes.