UKPLab / acl2024-triple-encoders
triple-encoders is a library for contextualizing distributed Sentence Transformers representations.
☆14 · Updated 8 months ago
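The one-line description above is terse, so here is a rough, hedged illustration of the general idea of contextualizing independently encoded sentence embeddings. This sketch uses the plain sentence-transformers API with a placeholder model name; it is a conceptual stand-in, not the actual triple-encoders API or checkpoints, which are not shown on this page.

```python
# Conceptual sketch only: plain sentence-transformers as a stand-in for
# the triple-encoders library (its real API may differ - assumption).
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder model name

dialogue = [
    "Hi, how are you?",
    "I'm good, just got back from a hike.",
    "Nice! Where did you go?",
]

# Encode each utterance independently (distributed representations).
embeddings = model.encode(dialogue, normalize_embeddings=True)

# Contextualize by composing the last two utterance embeddings
# (here an unweighted mean) instead of re-encoding the whole history.
context = embeddings[-2:].mean(axis=0)

# Score candidate responses against the composed context via dot product.
candidates = ["I went up to the ridge trail.", "I don't like pizza."]
cand_emb = model.encode(candidates, normalize_embeddings=True)
scores = cand_emb @ context
print(dict(zip(candidates, scores.round(3))))
```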
Alternatives and similar repositories for acl2024-triple-encoders
Users interested in acl2024-triple-encoders are comparing it to the libraries listed below.
- Code associated with the paper "Entropy-based Attention Regularization Frees Unintended Bias Mitigation from Lists" ☆48 · Updated 2 years ago
- ☆16 · Updated 2 years ago
- A framework for few-shot evaluation of autoregressive language models. ☆103 · Updated 2 years ago
- ☆45 · Updated 3 years ago
- Dense hybrid representations for text retrieval ☆62 · Updated 2 years ago
- ☆54 · Updated 2 years ago
- The data and the PyTorch implementation for the models and experiments in the paper "Exploiting Asymmetry for Synthetic Training Data Gen…" ☆61 · Updated last year
- Mr. TyDi is a multi-lingual benchmark dataset built on TyDi, covering eleven typologically diverse languages. ☆75 · Updated 3 years ago
- Pytorch Implementation of EncT5: Fine-tuning T5 Encoder for Non-autoregressive Tasks ☆63 · Updated 3 years ago
- Repo for Aspire - A scientific document similarity model based on matching fine-grained aspects of scientific papers. ☆52 · Updated last year
- This repository accompanies our paper “Do Prompt-Based Models Really Understand the Meaning of Their Prompts?” ☆85 · Updated 3 years ago
- Interpreting Language Models with Contrastive Explanations (EMNLP 2022 Best Paper Honorable Mention)