🚀 RocketQA, dense retrieval for information retrieval and question answering, including both Chinese and English state-of-the-art models.
A curated list of awesome papers related to pre-trained models for information retrieval (a.k.a., pretraining for IR).
Train Models Contrastively in PyTorch
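As a point of reference for what such contrastive-training toolkits typically do, here is a minimal sketch of an in-batch-negatives contrastive step (InfoNCE-style) in PyTorch. The encoder outputs are faked with random tensors and the temperature value is an illustrative assumption; this is not the API of any repository listed here.

```python
# Minimal in-batch-negatives contrastive step, a common dense-retrieval recipe.
# Batch contents and temperature are illustrative assumptions.
import torch
import torch.nn.functional as F

def contrastive_step(query_emb: torch.Tensor, passage_emb: torch.Tensor,
                     temperature: float = 0.05) -> torch.Tensor:
    """query_emb, passage_emb: (batch, dim); row i of each is a positive pair.
    Every other passage in the batch serves as an in-batch negative."""
    q = F.normalize(query_emb, dim=-1)
    p = F.normalize(passage_emb, dim=-1)
    logits = q @ p.T / temperature                      # (batch, batch) similarity matrix
    labels = torch.arange(q.size(0), device=q.device)   # positives lie on the diagonal
    return F.cross_entropy(logits, labels)

# Random tensors stand in for encoder outputs.
loss = contrastive_step(torch.randn(8, 768, requires_grad=True),
                        torch.randn(8, 768, requires_grad=True))
loss.backward()
```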
Tevatron - A flexible toolkit for neural retrieval research and development.
Train and Infer Powerful Sentence Embeddings with AnglE | 🔥 SOTA on STS and MTEB Leaderboard
HyDE: Precise Zero-Shot Dense Retrieval without Relevance Labels
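HyDE retrieves by embedding a hypothetical document generated for the query rather than the query itself. The sketch below shows that flow; the generator and encoder are toy stand-ins (assumptions), whereas the paper uses an instruction-following LLM and an unsupervised dense encoder.

```python
# Sketch of the HyDE flow: generate a hypothetical answer document for the query,
# embed that document instead of the query, and retrieve real passages by similarity.
# The generator and encoder are toy stand-ins, not the HyDE repository's actual API.
import numpy as np

def generate_hypothetical_doc(query: str) -> str:
    # Stand-in for the LLM call; HyDE tolerates factual errors in this text.
    return f"A passage that plausibly answers the question: {query}"

def encode(text: str, dim: int = 256) -> np.ndarray:
    # Toy hashing encoder standing in for a learned dense encoder.
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    return vec / (np.linalg.norm(vec) + 1e-9)

def hyde_search(query: str, passages: list[str], k: int = 3) -> list[str]:
    doc_vec = encode(generate_hypothetical_doc(query))   # embed the hypothetical document
    matrix = np.stack([encode(p) for p in passages])     # embed the real corpus
    top = np.argsort(matrix @ doc_vec)[::-1][:k]         # cosine-similarity ranking
    return [passages[i] for i in top]
```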
A curated list of awesome papers for Semantic Retrieval (TOIS Accepted: Semantic Models for the First-stage Retrieval: A Comprehensive Review).
EMNLP 2021 - Pre-training architectures for dense retrieval
A Python Search Engine for Humans 🥸
[SIGIR 2022] Multi-CPR: A Multi Domain Chinese Dataset for Passage Retrieval
Train Dense Passage Retriever (DPR) with a single GPU
WSDM'22 Best Paper: Learning Discrete Representations via Constrained Clustering for Effective and Efficient Dense Retrieval
SimXNS is a research project on information retrieval. This repo contains official implementations by the MSRA NLC team.
Nature Biotechnology: Ultra-fast, sensitive detection of protein remote homologs using deep dense retrieval
Code and models for the paper "Questions Are All You Need to Train a Dense Passage Retriever (TACL 2023)"
An easy-to-use Python toolkit for flexibly adapting various neural ranking models to any target domain.
SIGIR 2021: Efficiently Teaching an Effective Dense Retriever with Balanced Topic Aware Sampling
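Balanced topic-aware sampling composes each training batch from queries that share a topic cluster, so in-batch negatives are harder and more informative. The sketch below captures only that sampling idea with k-means over stand-in query embeddings; cluster count, batch size, and embeddings are illustrative assumptions, and the method's margin-balanced pair sampling is not shown.

```python
# Rough sketch of topic-aware batch sampling: cluster query embeddings once,
# then draw each training batch from a few clusters so in-batch negatives are
# topically related. All sizes are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
query_embeddings = rng.normal(size=(10_000, 128))   # stand-in for encoded training queries

topics = KMeans(n_clusters=100, n_init=4, random_state=0).fit_predict(query_embeddings)

def topic_aware_batch(batch_size: int = 32, queries_per_topic: int = 4) -> np.ndarray:
    """Sample a batch by picking a few topics, then a few queries from each."""
    chosen_topics = rng.choice(np.unique(topics),
                               size=batch_size // queries_per_topic, replace=False)
    batch = [rng.choice(np.where(topics == t)[0], size=queries_per_topic, replace=False)
             for t in chosen_topics]
    return np.concatenate(batch)

print(topic_aware_batch()[:8])
```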
Code and data for reproducing baselines for TopiOCQA, an open-domain conversational question-answering dataset
CIKM'21: JPQ substantially improves the efficiency of Dense Retrieval with a 30x compression ratio, 10x CPU speedup, and 2x GPU speedup.
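JPQ trains the query encoder jointly with product-quantization codebooks. As a point of reference only, the sketch below builds a plain (not jointly trained) faiss PQ index to show the kind of compression involved; corpus size, dimensionality, and code size are illustrative assumptions.

```python
# Plain product quantization with faiss, shown only to illustrate the compression
# JPQ targets; JPQ itself learns the PQ codebooks jointly with the query encoder,
# which this sketch does not do. Sizes below are illustrative assumptions.
import numpy as np
import faiss

dim, n_passages = 768, 20_000
passage_embeddings = np.random.rand(n_passages, dim).astype("float32")

# 48 sub-vectors x 8 bits = 48 bytes per passage vs. 3072 bytes for float32
# (a ~64x reduction; JPQ reports ~30x while preserving retrieval quality).
index = faiss.IndexPQ(dim, 48, 8)
index.train(passage_embeddings)
index.add(passage_embeddings)

query = np.random.rand(1, dim).astype("float32")
distances, ids = index.search(query, 10)
print(ids[0])
```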
[EMNLP 2022] This is the code repo for our EMNLP'22 paper "COCO-DR: Combating Distribution Shifts in Zero-Shot Dense Retrieval with Contrastive and Distributionally Robust Learning".