jdeschena / sdtt
SDTT: a simple and effective distillation method for discrete diffusion models
☆16 · Updated last week
Related projects
Alternatives and complementary repositories for sdtt
- Code for Discovering Preference Optimization Algorithms with and for Large Language Models ☆51 · Updated 5 months ago
- Japanese LLaMa experiment ☆52 · Updated 8 months ago
- Checkpointable dataset utilities for foundation model training ☆32 · Updated 9 months ago
- A Slack Bot for summarizing arXiv papers, powered by OpenAI LLMs. ☆68 · Updated last year
- ☆22 · Updated 11 months ago
- ☆21 · Updated last year
- ☆50 · Updated last year
- RealPersonaChat: A Realistic Persona Chat Corpus with Interlocutors' Own Personalities ☆48 · Updated 8 months ago
- ☆24 · Updated 2 weeks ago
- Unofficial entropix implementation for Gemma2, Llama, Qwen2, and Mistral ☆15 · Updated last month
- This project uses llama.cpp as an LLM server to perform inference and generate speech with a synthetic voice library ☆22 · Updated 8 months ago
- Horse racing prediction program ☆13 · Updated last year
- Helper library for LangSmith that provides an interface to run evaluations by simply writing config files. ☆23 · Updated this week
- Ongoing research project on continual pre-training of LLMs (dense model) ☆28 · Updated last week
- ☆15 · Updated 8 months ago
- ☆24 · Updated 2 years ago
- Software that lets you easily chat with a local LLM from a web browser. ☆27 · Updated 10 months ago
- ☆12 · Updated last year
- A script that posts Japanese translations of new arXiv paper abstracts (plus extras) to Notion every day ☆12 · Updated last year
- Mamba training library developed by Kotoba Technologies ☆68 · Updated 9 months ago
- ☆14 · Updated 7 months ago
- Code for the paper "A mathematical perspective on Transformers". ☆32 · Updated 4 months ago
- Text classification with BERT (2024 edition) ☆24 · Updated 4 months ago
- ☆12 · Updated 4 months ago
- LEIA: Facilitating Cross-Lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation ☆21 · Updated 6 months ago
- ☆13 · Updated 2 months ago
- ☆12 · Updated 5 months ago
- ☆37 · Updated 3 months ago
- Implementation of "BitNet: Scaling 1-bit Transformers for Large Language Models" in PyTorch ☆20 · Updated 8 months ago
- ☆33 · Updated 3 months ago