crazycth / WizardLearner
Pretrain, decay, and SFT a CodeLLM from scratch 🧙‍♂️
☆36 · Updated 10 months ago
Alternatives and similar repositories for WizardLearner:
Users interested in WizardLearner are comparing it to the repositories listed below.
- This is a repo for showcasing using MCTS with LLMs to solve gsm8k problems ☆63 · Updated 2 months ago
- ☆59 · Updated 3 months ago
- This is a personal reimplementation of Google's Infini-transformer, utilizing a small 2b model. The project includes both model and train… ☆56 · Updated 10 months ago
- ☆105 · Updated 4 months ago
- ☆96 · Updated 11 months ago
- Skywork-MoE: A Deep Dive into Training Techniques for Mixture-of-Experts Language Models ☆129 · Updated 9 months ago
- ☆105 · Updated last month
- Feeling confused about super alignment? Here is a reading list ☆42 · Updated last year
- ☆81 · Updated 10 months ago
- Hammer: Robust Function-Calling for On-Device Language Models via Function Masking ☆63 · Updated 3 weeks ago
- SELF-GUIDE: Better Task-Specific Instruction Following via Self-Synthetic Finetuning. COLM 2024 Accepted Paper ☆29 · Updated 9 months ago
- Fantastic Data Engineering for Large Language Models ☆81 · Updated 2 months ago
- A highly capable 2.4B lightweight LLM using only 1T pre-training data with all details. ☆161 · Updated this week
- A visualization tool for deeper understanding and easier debugging of RLHF training. ☆170 · Updated 3 weeks ago
- ☆44 · Updated 9 months ago
- Data processing for code LLM pretraining, fine-tuning, and DPO, following industry SOTA pipelines ☆33 · Updated 7 months ago
- ☆93 · Updated this week
- ☆101 · Updated 3 months ago
- Train an LLM from scratch on a single 24 GB GPU ☆50 · Updated 4 months ago
- ☆122 · Updated last year
- Code for Scaling Laws of RoPE-based Extrapolation ☆70 · Updated last year
- Inference code for the paper "Harder Tasks Need More Experts: Dynamic Routing in MoE Models" ☆42 · Updated 7 months ago
- CLongEval: A Chinese Benchmark for Evaluating Long-Context Large Language Models ☆40 · Updated last year