jxmorris12 / cde
Code for training & evaluating Contextual Document Embedding models
☆202 · Updated 8 months ago
Alternatives and similar repositories for cde
Users interested in cde are comparing it to the libraries listed below.
- Manage scalable open LLM inference endpoints in Slurm clusters ☆278 · Updated last year
- ☆140 · Updated 5 months ago
- OpenCoconut implements a latent reasoning paradigm where we generate thoughts before decoding. ☆175 · Updated last year
- Official repository for "Scaling Retrieval-Based Language Models with a Trillion-Token Datastore". ☆222 · Updated last month
- EvolKit is an innovative framework designed to automatically enhance the complexity of instructions used for fine-tuning Large Language Models. ☆245 · Updated last year
- Code for Zero-Shot Tokenizer Transfer ☆142 · Updated last year
- ☆151 · Updated 4 months ago
- BABILong is a benchmark for LLM evaluation using the needle-in-a-haystack approach. ☆236 · Updated 4 months ago
- Simple & Scalable Pretraining for Neural Architecture Research ☆306 · Updated last month
- An introduction to LLM Sampling ☆79 · Updated last year
- A toolkit implementing advanced methods to transfer models and model knowledge across tokenizers. ☆61 · Updated 6 months ago
- Prune transformer layers ☆74 · Updated last year
- Functional Benchmarks and the Reasoning Gap ☆89 · Updated last year
- ☆92 · Updated last month
- Train your own SOTA deductive reasoning model ☆107 · Updated 10 months ago
- ☆120 · Updated last year
- ☆161 · Updated last year
- Storing long contexts in tiny caches with self-study ☆231 · Updated last month
- Q-GaLore: Quantized GaLore with INT4 Projection and Layer-Adaptive Low-Rank Gradients. ☆201 · Updated last year
- An open source reproduction of NVIDIA's nGPT (Normalized Transformer with Representation Learning on the Hypersphere) ☆109 · Updated 10 months ago
- A compact LLM pretrained in 9 days by using high quality data ☆340 · Updated 9 months ago
- Code for NeurIPS'24 paper 'Grokked Transformers are Implicit Reasoners: A Mechanistic Journey to the Edge of Generalization' ☆234 · Updated 6 months ago
- Scaling is a distributed training library and installable dependency designed to scale up neural networks, with a dedicated module for tr… ☆66 · Updated 2 months ago
- Awesome synthetic (text) datasets ☆321 · Updated last week
- Set of scripts to finetune LLMs ☆38 · Updated last year
- Simple replication of [ColBERT-v1](https://arxiv.org/abs/2004.12832). ☆82 · Updated last year
- An extension of the nanoGPT repository for training small MoE models. ☆226 · Updated 10 months ago
- The first dense retrieval model that can be prompted like an LM ☆90 · Updated 8 months ago
- ☆137 · Updated last year
- EvaByte: Efficient Byte-level Language Models at Scale ☆114 · Updated 9 months ago