yixuantt / PoolingAndAttn
"Pooling And Attention: What Are Effective Designs For LLM-Based Embedding Models?"
☆37 · Updated last year
Alternatives and similar repositories for PoolingAndAttn
Users interested in PoolingAndAttn are comparing it to the repositories listed below.
- Code for the EMNLP 2024 paper "Detecting and Mitigating Contextual Hallucinations in Large Language Models Using Only Attention Maps" ☆141 · Updated last month
- LongEmbed: Extending Embedding Models for Long Context Retrieval (EMNLP 2024) ☆145 · Updated last year
- ☆156 · Updated last year
- The first dense retrieval model that can be prompted like an LM ☆89 · Updated 6 months ago
- Large language models for document ranking. ☆69 · Updated 2 weeks ago
- FollowIR: Evaluating and Teaching Information Retrieval Models to Follow Instructions ☆49 · Updated last year
- Co-LLM: Learning to Decode Collaboratively with Multiple Language Models ☆122 · Updated last year
- [ICLR 2025 Oral] "Your Mixture-of-Experts LLM Is Secretly an Embedding Model For Free" ☆82 · Updated last year
- Simple replication of [ColBERT-v1](https://arxiv.org/abs/2004.12832). ☆79 · Updated last year
- Improving Text Embedding of Language Models Using Contrastive Fine-tuning ☆65 · Updated last year
- Code for the paper "HyperRouter: Towards Efficient Training and Inference of Sparse Mixture of Experts via HyperNetwork" ☆33 · Updated last year
- Codebase accompanying the Summary of a Haystack paper. ☆79 · Updated last year