🆚 The Impact of Search Quality on Decision Making for Comparative Questions

Code and data for the paper The Impact of Web Search Result Quality on Decision Making.

Data

In this repository, you'll find all annotations and study responses that were collected as part of this research:

| File | Description |
|------|-------------|
| 01-quality-agreement.xlsx | Fleiss' kappa values per quality aspect. |
| 02-quality.xlsx | Quality annotations for each criterion's aspects. |
| 03-quality-recoded.xlsx | Recoded quality annotations to separate combined fields. |
| 04-quality-recoded-majority-vote.xlsx | Final quality annotations after majority voting. |
| 05-survey.xlsx | Raw user study survey responses and topics. |
| 06-study.xlsx | Bundled topics, archived search results, quality assessments, and user study survey responses. |
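
For programmatic access, the spreadsheets can be read with pandas. The following is a minimal sketch, assuming pandas and openpyxl are installed and that the files sit in a data/ directory; adjust the paths to the actual repository layout.

    # Minimal loading sketch: read two of the spreadsheets with pandas.
    # The data/ directory and the flat sheet structure are assumptions, not
    # guarantees; check the repository for the real file locations.
    from pathlib import Path

    import pandas as pd

    DATA_DIR = Path("data")  # assumed location of the .xlsx files

    quality = pd.read_excel(DATA_DIR / "04-quality-recoded-majority-vote.xlsx")
    study = pd.read_excel(DATA_DIR / "06-study.xlsx")

    print(quality.shape, study.shape)
    print(quality.columns.tolist())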

Code

We use Python notebooks to evaluate the quality assessments and user study responses. Follow the steps below to install and run the notebooks.

Installation

  1. Install Python 3.11

  2. Create and activate a virtual environment:

    python3.11 -m venv venv/
    source venv/bin/activate

  3. Install dependencies:

    pip install -e .
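
As an optional sanity check (not part of the repository's instructions), you can confirm that the activated environment runs the expected interpreter before launching the notebooks:

    # Optional, illustrative check: make sure the active interpreter is Python 3.11.
    import sys

    major, minor = sys.version_info[:2]
    if (major, minor) != (3, 11):
        raise RuntimeError(f"Expected Python 3.11, found {major}.{minor}")
    print(f"OK: Python {major}.{minor} at {sys.executable}")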

Usage

Once the dependencies are installed, you can launch a notebook by running the following command, where <FILE> is one of the notebook files listed in the table below:

    jupyter-notebook notebooks/<FILE>

| File | Description |
|------|-------------|
| agreement.ipynb | Measuring the inter-rater agreement for the quality assessments. |
| evaluation_quality.ipynb | Evaluation of the quality assessments. |
| evaluation_user_study.ipynb | Evaluation of the user study responses and hypothesis tests. |
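
For orientation, the agreement analysis in agreement.ipynb amounts to computing Fleiss' kappa over the annotators' quality labels. The sketch below shows one way to do this with statsmodels; it is an illustration rather than the notebook's actual code, and the toy data and column names (item, annotator, label) are assumptions.

    # Illustrative Fleiss' kappa computation with statsmodels (not the notebook's code).
    # Assumed input: a long-format table with one label per (item, annotator) pair.
    import pandas as pd
    from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

    # Toy data; in practice this would come from the quality annotation spreadsheets.
    annotations = pd.DataFrame({
        "item": [1, 1, 1, 2, 2, 2, 3, 3, 3],
        "annotator": ["a", "b", "c"] * 3,
        "label": ["good", "good", "bad", "bad", "bad", "bad", "good", "bad", "good"],
    })

    # Pivot to an items x annotators matrix of labels, then count how many
    # annotators chose each category per item, as Fleiss' kappa expects.
    ratings = annotations.pivot(index="item", columns="annotator", values="label")
    counts, categories = aggregate_raters(ratings.to_numpy())

    print("Fleiss' kappa:", fleiss_kappa(counts, method="fleiss"))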

License

The source code in this repository is licensed under the MIT License.