arcee-ai / DistillKit
An Open Source Toolkit For LLM Distillation
☆833 · Updated last month
Alternatives and similar repositories for DistillKit
Users interested in DistillKit are comparing it to the libraries listed below.
- [ICLR 2025] Alignment Data Synthesis from Scratch by Prompting Aligned LLMs with Nothing. Your efficient and high-quality synthetic data … · ☆820 · Updated 10 months ago
- ☆559 · Updated last year
- Official repository for ORPO · ☆469 · Updated last year
- A library for easily merging multiple LLM experts and efficiently training the merged LLM. · ☆500 · Updated last year
- Recipes to scale inference-time compute of open models · ☆1,124 · Updated 8 months ago
- Lighteval is your all-in-one toolkit for evaluating LLMs across multiple backends · ☆2,274 · Updated last week
- OLMoE: Open Mixture-of-Experts Language Models · ☆950 · Updated 4 months ago
- Automatic evals for LLMs · ☆575 · Updated last month
- Memory optimization and training recipes to extrapolate language models' context length to 1 million tokens, with minimal hardware. · ☆750 · Updated last year