Guzpenha / transformer_rankers
A library to conduct ranking experiments with transformers.
☆160 · Updated 2 years ago
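At its core, a ranking experiment of the kind this library supports scores each query–document pair with a model and sorts candidates by that score. A minimal sketch of that loop follows, using a toy lexical-overlap scorer as a stand-in for a transformer cross-encoder; all function names here are hypothetical and are not the transformer_rankers API.

```python
# Hypothetical re-ranking sketch (not the transformer_rankers API).
# A real setup would replace score_pair with a transformer
# cross-encoder forward pass over the concatenated query and document.

def score_pair(query: str, doc: str) -> float:
    # Toy relevance score: fraction of query terms present in the document.
    q_terms = set(query.lower().split())
    d_terms = set(doc.lower().split())
    return len(q_terms & d_terms) / max(len(q_terms), 1)

def rerank(query: str, docs: list[str]) -> list[str]:
    # Score every candidate and sort by descending relevance.
    return sorted(docs, key=lambda d: score_pair(query, d), reverse=True)

docs = [
    "a recipe for chocolate cake",
    "transformers for passage ranking",
    "ranking passages with BERT transformers",
]
print(rerank("ranking with transformers", docs)[0])
# → ranking passages with BERT transformers
```

Swapping the scorer for a fine-tuned cross-encoder (and batching the pairs) turns this into the standard re-ranking stage used by many of the repositories listed below.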
Alternatives and similar repositories for transformer_rankers
Users interested in transformer_rankers are comparing it to the libraries listed below.
- Training & evaluation library for text-based neural re-ranking and dense retrieval models built with PyTorch ☆264 · Updated 2 years ago
- State-of-the-art Semantic Sentence Embeddings ☆99 · Updated 3 years ago
- [NAACL 2021] Code for the paper `Fine-Tuning Pre-trained Language Model with Weak Supervision: A Contrastive-Regularized Self… ☆206 · Updated 3 years ago
- Code and resources for the paper "BERT-QE: Contextualized Query Expansion for Document Re-ranking". ☆51 · Updated 4 years ago
- An end-to-end neural ad-hoc ranking pipeline. ☆151 · Updated 4 months ago
- ☆162 · Updated 5 years ago
- On the Stability of Fine-tuning BERT: Misconceptions, Explanations, and Strong Baselines ☆137 · Updated 2 years ago
- Document ranking via sentence modeling using BERT ☆143 · Updated 2 years ago
- Source code of bison ☆26 · Updated 5 years ago
- Binary Passage Retriever (BPR) - an efficient passage retriever for open-domain question answering ☆174 · Updated 4 years ago
- ☆68 · Updated 7 months ago
- [EMNLP 2021] Improving and Simplifying Pattern Exploiting Training ☆153 · Updated 3 years ago
- Multi-stage passage ranking: monoBERT + duoBERT ☆110 · Updated 5 years ago
- Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations ☆134 · Updated 3 months ago
- SIGIR 2021: Efficiently Teaching an Effective Dense Retriever with Balanced Topic Aware Sampling ☆59 · Updated 4 years ago
- Interpretable Evaluation for (Almost) All NLP Tasks ☆195 · Updated 2 months ago
- Anserini notebooks ☆69 · Updated 2 years ago
- A toolkit for end-to-end neural ad hoc retrieval ☆96 · Updated last year
- IEEE/ACM TASLP 2020: SBERT-WK: A Sentence Embedding Method By Dissecting BERT-based Word Models ☆181 · Updated 4 years ago
- A multilingual version of the MS MARCO passage ranking dataset ☆144 · Updated 2 years ago
- RepBERT is a competitive first-stage retrieval technique. It represents documents and queries with fixed-length contextualized embeddings… ☆66 · Updated 4 years ago
- Overview of venues, research themes, and datasets relevant for conversational search. ☆146 · Updated 3 years ago
- EMNLP 2021 - Pre-training architectures for dense retrieval ☆254 · Updated 3 years ago
- Code and data to facilitate BERT/ELECTRA for document ranking. Details in the paper - PARADE: Passage Representation Aggregation for… ☆96 · Updated 2 years ago
- ☆88 · Updated 3 years ago
- X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification ☆141 · Updated 4 years ago
- Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation ☆113 · Updated 4 years ago
- Research framework for low-resource text classification that allows the user to experiment with classification models and active learning… ☆101 · Updated 3 years ago
- Code for CEDR: Contextualized Embeddings for Document Ranking, accepted at SIGIR 2019. ☆156 · Updated 5 years ago
- BERTserini ☆26 · Updated 3 years ago