thunlp / Delta-CoMe
Delta-CoMe can achieve near-lossless 1-bit compression of delta weights; the paper has been accepted at NeurIPS 2024.
☆58 · Updated last year
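For context on the description above, here is a minimal sketch of what compressing a delta to roughly 1 bit per weight means: the difference between a fine-tuned checkpoint and its backbone is reduced to signs plus a per-row scale, and the fine-tuned weight is rebuilt from the base weight and the compressed delta. This is a generic sign-plus-scale illustration under assumed tensor shapes, not Delta-CoMe's actual mixed-precision algorithm; all function names below are hypothetical.

```python
import torch

def compress_delta_1bit(w_finetuned: torch.Tensor, w_base: torch.Tensor):
    """Hypothetical helper: keep only the sign of each delta entry (1 bit
    per weight) plus one floating-point scale per row."""
    delta = w_finetuned - w_base
    sign = torch.sign(delta)                        # 1 bit per weight
    scale = delta.abs().mean(dim=1, keepdim=True)   # per-row FP scale
    return sign, scale

def reconstruct(w_base: torch.Tensor, sign: torch.Tensor, scale: torch.Tensor):
    """Rebuild an approximation of the fine-tuned weight from the base
    weight and the compressed delta."""
    return w_base + sign * scale

# Toy example on random weights: the reconstruction error is what a
# "near-lossless" delta-compression method would drive close to zero.
w_base = torch.randn(8, 16)
w_finetuned = w_base + 0.01 * torch.randn(8, 16)
sign, scale = compress_delta_1bit(w_finetuned, w_base)
w_hat = reconstruct(w_base, sign, scale)
print((w_hat - w_finetuned).abs().mean())
```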
Alternatives and similar repositories for Delta-CoMe
Users interested in Delta-CoMe are comparing it to the libraries listed below.
- Mixture-of-Experts (MoE) Language Model ☆195 · Updated last year
- GLM Series Edge Models ☆158 · Updated 7 months ago
- Skywork-MoE: A Deep Dive into Training Techniques for Mixture-of-Experts Language Models ☆139 · Updated last year
- ☆96 · Updated last year
- [ICML 2025] TokenSwift: Lossless Acceleration of Ultra Long Sequence Generation ☆121 · Updated 8 months ago
- Ling-V2 is a MoE LLM provided and open-sourced by InclusionAI. ☆253 · Updated 4 months ago
- ☆74 · Updated 8 months ago
- A toolkit for knowledge distillation for large language models ☆266 · Updated this week
- [ACL 2025] An official PyTorch implementation of the paper: Condor: Enhance LLM Alignment with Knowledge-Driven Data Synthesis and Refinement ☆40 · Updated 8 months ago
- ☆82 · Updated 10 months ago
- XVERSE-MoE-A4.2B: A multilingual large language model developed by XVERSE Technology Inc. ☆39 · Updated last year
- The simplest reproduction of R1-style results on a small model, illustrating the key essence of o1-like models and DeepSeek R1: "Think is all you need." Uses experiments to show that, for strong reasoning ability, the thinking-process content is the core of AGI/ASI. ☆45 · Updated last year
- SUS-Chat: Instruction tuning done right ☆49 · Updated 2 years ago
- Fast LLM training codebase with dynamic strategy selection [DeepSpeed + Megatron + FlashAttention + CUDA fusion kernels + compiler] ☆40 · Updated 2 years ago
- SELF-GUIDE: Better Task-Specific Instruction Following via Self-Synthetic Finetuning (accepted at COLM 2024) ☆32 · Updated last year
- Ring is a reasoning MoE LLM provided and open-sourced by InclusionAI, derived from Ling. ☆106 · Updated 6 months ago
- FuseAI Project ☆87 · Updated last year
- Ling is a MoE LLM provided and open-sourced by InclusionAI. ☆238 · Updated 8 months ago
- MiroMind-M1 is a fully open-source series of reasoning language models built on Qwen-2.5, focused on advancing mathematical reasoning. ☆253 · Updated 5 months ago
- Repo for "MaskSearch: A Universal Pre-Training Framework to Enhance Agentic Search Capability" ☆148 · Updated 8 months ago
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆141 · Updated last year
- ☆51 · Updated last year
- Model compression toolkit engineered for enhanced usability, comprehensiveness, and efficiency. ☆314 · Updated this week
- Towards Economical Inference: Enabling DeepSeek's Multi-Head Latent Attention in Any Transformer-based LLMs ☆204 · Updated 2 months ago
- XVERSE-MoE-A36B: A multilingual large language model developed by XVERSE Technology Inc. ☆38 · Updated last year
- ☆182 · Updated 9 months ago
- DeepSolution: Boosting Complex Engineering Solution Design via Tree-based Exploration and Bi-point Thinking ☆49 · Updated last month
- The RedStone repository includes code for preparing extensive datasets used in training large language models. ☆146 · Updated 2 weeks ago
- An open-source LLM based on an MoE structure. ☆58 · Updated last year
- ☆187 · Updated last year