Leukas / CUTE
☆15 · Updated 2 months ago
Alternatives and similar repositories for CUTE
Users interested in CUTE are comparing it to the libraries listed below.
- ☆23 · Updated 10 months ago
- ☆13 · Updated 11 months ago
- Embedding Recycling for Language models ☆38 · Updated 2 years ago
- Starbucks: Improved Training for 2D Matryoshka Embeddings ☆22 · Updated 5 months ago
- Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders ☆18 · Updated 6 months ago
- Repo for ICML23 "Why do Nearest Neighbor Language Models Work?" ☆59 · Updated 2 years ago
- Label shift estimation for transfer difficulty with Familiarity. ☆10 · Updated 10 months ago
- Code for paper "Do Language Models Have Beliefs? Methods for Detecting, Updating, and Visualizing Model Beliefs" ☆28 · Updated 3 years ago
- SWIM-IR is a Synthetic Wikipedia-based Multilingual Information Retrieval training set with 28 million query-passage pairs spanning 33 languages. ☆49 · Updated 2 years ago
- Adding new tasks to T0 without catastrophic forgetting ☆33 · Updated 3 years ago
- Official repository for our EACL 2023 paper "LongEval: Guidelines for Human Evaluation of Faithfulness in Long-form Summarization" (https… ☆44 · Updated last year
- InstructIR, a novel benchmark specifically designed to evaluate the instruction-following ability of information retrieval models. Our foc… ☆31 · Updated last year
- ☆14 · Updated last year
- State-of-the-art paired encoder and decoder models (17M-1B params) ☆53 · Updated 3 months ago
- 🌾 Universal, customizable and deployable fine-grained evaluation for text generation. ☆24 · Updated 2 years ago
- Code for SaGe subword tokenizer (EACL 2023) ☆27 · Updated last year
- Research code for the paper "How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models" ☆27 · Updated 4 years ago
- Plug-and-play Search Interfaces with Pyserini and Hugging Face ☆32 · Updated 2 years ago
- Emotion-Aware Dialogue Response Generation by Multi-Task Learning ☆13 · Updated 3 years ago
- [ICLR 2023] PyTorch code of Summarization Programs: Interpretable Abstractive Summarization with Neural Modular Trees ☆24 · Updated 2 years ago
- ☆46 · Updated 3 years ago
- BLOOM+1: Adapting BLOOM model to support a new unseen language ☆74 · Updated last year
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset. ☆96 · Updated 2 years ago
- ☆10 · Updated last year
- ☆65 · Updated 2 years ago
- Training and evaluation code for the paper "Headless Language Models: Learning without Predicting with Contrastive Weight Tying" (https:/… ☆28 · Updated last year
- Few-shot Learning with Auxiliary Data ☆31 · Updated last year
- Code for NAACL 2022 paper "Reframing Human-AI Collaboration for Generating Free-Text Explanations" ☆31 · Updated 2 years ago
- ☆13 · Updated 3 years ago
- ☆23 · Updated 2 weeks ago