Ongoing research training transformer models at scale
★395 · Aug 20, 2024 · Updated last year
Alternatives and similar repositories for Megatron-LM
Users interested in Megatron-LM are comparing it to the libraries listed below.
- Home of StarCoder: fine-tuning & inference! ★7,529 · Feb 27, 2024 · Updated 2 years ago
- 🐙 OctoPack: Instruction Tuning Code Large Language Models ★479 · Feb 5, 2025 · Updated last year
- ★492 · Aug 15, 2024 · Updated last year
- A framework for the evaluation of autoregressive code generation language models. ★1,021 · Jul 22, 2025 · Updated 8 months ago
- CodeGen is a family of open-source models for program synthesis. Trained on TPU-v4. Competitive with OpenAI Codex. ★5,171 · Oct 27, 2025 · Updated 4 months ago
- LLMs built upon Evol Instruct: WizardLM, WizardCoder, WizardMath ★9,478 · Jun 7, 2025 · Updated 9 months ago
- CodeGen2 models for program synthesis ★271 · Jun 12, 2023 · Updated 2 years ago
- Fine-tune SantaCoder for Code/Text Generation. ★196 · Apr 11, 2023 · Updated 2 years ago
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 ★1,438 · Mar 20, 2024 · Updated 2 years ago
- ★15 · Oct 24, 2023 · Updated 2 years ago
- Repository for analysis and experiments in the BigCode project. ★128 · Mar 20, 2024 · Updated 2 years ago
- This is the official code for the paper CodeRL: Mastering Code Generation through Pretrained Models and Deep Reinforcement Learning (Neur… ★564 · Jan 21, 2025 · Updated last year
- Distributed trainer for LLMs ★590 · May 20, 2024 · Updated last year
- Central place for the engineering/scaling WG: documentation, SLURM scripts and logs, compute environment and data. ★1,010 · Jul 29, 2024 · Updated last year
- ★26 · Mar 6, 2024 · Updated 2 years ago
- CodeTF: One-stop Transformer Library for State-of-the-art Code LLM ★1,479 · May 1, 2025 · Updated 10 months ago
- APPS: Automated Programming Progress Standard (NeurIPS 2021) ★520 · Jun 19, 2024 · Updated last year
- Dromedary: towards helpful, ethical and reliable LLMs. ★1,144 · Sep 18, 2025 · Updated 6 months ago
- C++ implementation for 💫StarCoder ★459 · Sep 9, 2023 · Updated 2 years ago
- ★1,506 · May 12, 2023 · Updated 2 years ago
- An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena. ★39,428 · Jun 2, 2025 · Updated 9 months ago
- ★39 · Oct 3, 2022 · Updated 3 years ago
- Large Language Model Text Generation Inference ★10,812 · Jan 8, 2026 · Updated 2 months ago
- Ongoing research training transformer models at scale ★15,744 · Updated this week
- Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Ad… ★6,082 · Jul 1, 2025 · Updated 8 months ago
- A multi-programming language benchmark for LLMs ★299 · Jan 28, 2026 · Updated last month
- A repo for distributed training of language models with Reinforcement Learning via Human Feedback (RLHF) ★4,739 · Jan 8, 2024 · Updated 2 years ago
- ★283 · Apr 25, 2023 · Updated 2 years ago
- Astraios: Parameter-Efficient Instruction Tuning Code Language Models ★63 · Apr 10, 2024 · Updated last year
- Code used for sourcing and cleaning the BigScience ROOTS corpus ★318 · Mar 20, 2023 · Updated 3 years ago
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ★7,400 · Feb 3, 2026 · Updated last month
- Code for the TMLR 2023 paper "PPOCoder: Execution-based Code Generation using Deep Reinforcement Learning" ★117 · Jan 9, 2024 · Updated 2 years ago
- Scaling Data-Constrained Language Models ★342 · Jun 28, 2025 · Updated 8 months ago
- LLM powered development for VSCode ★1,316 · Jul 17, 2024 · Updated last year
- Home of CodeT5: Open Code LLMs for Code Understanding and Generation ★3,099 · Jan 20, 2024 · Updated 2 years ago
- Accessible large language models via k-bit quantization for PyTorch. ★8,052 · Updated this week
- Code for the paper "Evaluating Large Language Models Trained on Code" ★3,172 · Jan 17, 2025 · Updated last year
- QLoRA: Efficient Finetuning of Quantized LLMs ★10,858 · Jun 10, 2024 · Updated last year
- OpenLLaMA, a permissively licensed open source reproduction of Meta AI's LLaMA 7B trained on the RedPajama dataset ★7,537 · Jul 16, 2023 · Updated 2 years ago