Expert Specialized Fine-Tuning
☆734 · May 22, 2025 · Updated 11 months ago
Alternatives and similar repositories for ESFT
Users interested in ESFT are comparing it to the libraries listed below.
- DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models ☆3,245 · Apr 15, 2024 · Updated 2 years ago
- DeepSeek-VL: Towards Real-World Vision-Language Understanding ☆4,091 · Apr 24, 2024 · Updated 2 years ago
- [ICLR 2024] Official implementation of DreamCraft3D: Hierarchical 3D Generation with Bootstrapped Diffusion Prior ☆3,007 · Apr 22, 2025 · Updated last year
- ☆566 · Aug 16, 2024 · Updated last year
- DeepSeek-V2: A Strong, Economical, and Efficient Mixture-of-Experts Language Model ☆5,009 · Sep 25, 2024 · Updated last year
- DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models ☆1,921 · Jan 16, 2024 · Updated 2 years ago
- DeepSeek-Coder-V2: Breaking the Barrier of Closed-Source Models in Code Intelligence ☆6,685 · Nov 11, 2025 · Updated 5 months ago
- A curated list of open-source projects related to DeepSeek Coder ☆775 · Nov 11, 2025 · Updated 5 months ago
- DeepSeek LLM: Let there be answers ☆6,849 · Feb 4, 2024 · Updated 2 years ago
- Expert Parallelism Load Balancer ☆1,365 · Mar 24, 2025 · Updated last year
- An experimental desktop client for using Claude Desktop's MCP with Novelcrafter codices. ☆11 · Dec 3, 2024 · Updated last year
- Janus-Series: Unified Multimodal Understanding and Generation Models ☆17,700 · Feb 1, 2025 · Updated last year
- Analyze computation-communication overlap in V3/R1. ☆1,152 · Mar 21, 2025 · Updated last year
- DeepSeek-VL2: Mixture-of-Experts Vision-Language Models for Advanced Multimodal Understanding ☆5,276 · Feb 26, 2025 · Updated last year
- Reaching LLaMA2 Performance with 0.1M Dollars ☆988 · Jul 23, 2024 · Updated last year
- Exploration: using technology to aid people who lack both the ability to speak and fine motor control. ☆21 · Oct 24, 2024 · Updated last year
- A bidirectional pipeline parallelism algorithm for computation-communication overlap in DeepSeek V3/R1 training. ☆2,951 · Jan 14, 2026 · Updated 3 months ago
- ☆330 · Jul 25, 2024 · Updated last year
- DeepSeek Coder: Let the Code Write Itself ☆23,157 · Nov 11, 2025 · Updated 5 months ago
- ☆14 · Apr 26, 2025 · Updated last year
- Skywork-MoE: A Deep Dive into Training Techniques for Mixture-of-Experts Language Models ☆139 · Jun 12, 2024 · Updated last year
- Parameter-Efficient Sparsity Crafting From Dense to Mixture-of-Experts for Instruction Tuning on General Tasks (EMNLP'24) ☆146 · Sep 20, 2024 · Updated last year
- Chain of Experts (CoE) enables communication between experts within Mixture-of-Experts (MoE) models ☆231 · Nov 4, 2025 · Updated 5 months ago
- DeepEP: an efficient expert-parallel communication library ☆9,589 · Updated this week
- This project organizes the open-source MOSS SFT data and converts it to the mnbvc multi-turn dialogue format. MOSS-003 covers three dimensions (helpfulness, faithfulness, and harmlessness) with 3.53M samples in total; MOSS-003 adds finer-grained helpfulness category labels, broader harmlessness data, and longer dialogues, totaling 6.30M samples. ☆13 · Dec 3, 2023 · Updated 2 years ago
- Q-GaLore: Quantized GaLore with INT4 Projection and Layer-Adaptive Low-Rank Gradients. ☆205 · Jul 17, 2024 · Updated last year
- Muon is Scalable for LLM Training ☆1,469 · Aug 3, 2025 · Updated 8 months ago
- DeepGEMM: clean and efficient FP8 GEMM kernels with fine-grained scaling ☆7,144 · Updated this week
- A scalable automated alignment method for large language models. Resources for "Aligning Large Language Models via Self-Steering Optimiza…" ☆20 · Nov 21, 2024 · Updated last year
- Official Repo for Open-Reasoner-Zero ☆2,093 · Jun 2, 2025 · Updated 10 months ago
- OLMoE: Open Mixture-of-Experts Language Models ☆1,012 · Sep 23, 2025 · Updated 7 months ago
- Hypercorn is an ASGI and WSGI server based on Hyper libraries and inspired by Gunicorn. ☆15 · Jan 12, 2026 · Updated 3 months ago
- The free energy principle ☆19 · Feb 16, 2025 · Updated last year
- [ACL 2024] Progressive LLaMA with Block Expansion. ☆514 · May 20, 2024 · Updated last year
- The official repo of MiniMax-Text-01 and MiniMax-VL-01, a large language model and vision-language model based on linear attention ☆3,417 · Jul 7, 2025 · Updated 9 months ago
- FuseAI Project ☆595 · Jan 25, 2025 · Updated last year
- Course Materials for Bayesian Psychometric Modeling ☆15 · May 14, 2019 · Updated 6 years ago
- Official implementation for DenseMixer: Improving MoE Post-Training with Precise Router Gradient ☆67 · Aug 3, 2025 · Updated 8 months ago
- Production-tested AI infrastructure tools for efficient AGI development and community-driven innovation ☆7,985 · May 15, 2025 · Updated 11 months ago