Code and data for the paper *The Impact of Web Search Result Quality on Decision Making*.
In this repository, you'll find all annotations and study responses that were collected as part of this research:
| File | Description |
|---|---|
| `01-quality-agreement.xlsx` | Fleiss' kappa values per quality aspect. |
| `02-quality.xlsx` | Quality annotations for each criterion's aspects. |
| `03-quality-recoded.xlsx` | Recoded quality annotations with combined fields separated. |
| `04-quality-recoded-majority-vote.xlsx` | Final quality annotations after majority voting. |
| `05-survey.xlsx` | Raw user study survey responses and topics. |
| `06-study.xlsx` | Bundled topics, archived search results, quality assessments, and user study survey responses. |
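If you want to inspect the data outside the notebooks, the spreadsheets can be read with pandas. Below is a minimal sketch that loads the recoded annotations and derives a per-item majority vote (as in `04-quality-recoded-majority-vote.xlsx`); the file path and the `item`/`label` column names are illustrative assumptions, not the actual schema.

```python
import pandas as pd

# Load the recoded quality annotations (path is an assumption;
# adjust to where the file lives in your checkout).
annotations = pd.read_excel("03-quality-recoded.xlsx")

# Hypothetical schema: one row per (item, annotator) pair with a "label" column.
# Take the most frequent label per item as the majority vote.
majority = (
    annotations
    .groupby("item")["label"]
    .agg(lambda labels: labels.mode().iloc[0])
    .rename("majority_label")
)
print(majority.head())
```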
We use Python notebooks to evaluate the quality assessments and user study responses. Follow the steps below to install and run the notebooks.
- Install Python 3.11.
- Create and activate a virtual environment:

  ```shell
  python3.11 -m venv venv/
  source venv/bin/activate
  ```

- Install dependencies:

  ```shell
  pip install -e .
  ```
Once you have installed the required dependencies, you can launch the notebooks by running the following command, where `<FILE>` is a path from the list below:

```shell
jupyter-notebook notebooks/<FILE>
```
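For example, to open the agreement notebook:

```shell
jupyter-notebook notebooks/agreement.ipynb
```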
| File | Description |
|---|---|
| `agreement.ipynb` | Measuring the inter-rater agreement for the quality assessments. |
| `evaluation_quality.ipynb` | Evaluation of the quality assessments. |
| `evaluation_user_study.ipynb` | Evaluation of the user study responses and hypothesis tests. |
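For reference, the agreement reported in `01-quality-agreement.xlsx` is Fleiss' kappa. Below is a minimal sketch of how such a value can be computed with `statsmodels`, assuming a subjects × raters matrix of categorical labels; the ratings matrix is made-up toy data, not taken from this repository.

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Toy data: rows are annotated items, columns are raters,
# values are the categorical labels each rater assigned.
ratings = np.array([
    [1, 1, 1],
    [1, 2, 1],
    [2, 2, 2],
    [1, 1, 2],
    [2, 2, 1],
])

# aggregate_raters converts raw labels into per-item category counts,
# the input format that fleiss_kappa expects.
counts, _categories = aggregate_raters(ratings)
print(fleiss_kappa(counts))
```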
The source code in this repository is licensed under the MIT License.