ProjectD-AI / LLaMA-Megatron-DeepSpeed
Ongoing research training transformer language models at scale, including: BERT & GPT-2
☆69 · Updated last year
Alternatives and similar repositories for LLaMA-Megatron-DeepSpeed
Users interested in LLaMA-Megatron-DeepSpeed are comparing it to the libraries listed below
- ☆83 · Updated last year
- NTK-scaled version of the ALiBi position encoding in Transformers. ☆68 · Updated last year
- Train LLMs (BLOOM, LLaMA, Baichuan2-7B, ChatGLM3-6B) with DeepSpeed pipeline parallelism; faster than ZeRO/ZeRO++/FSDP. ☆97 · Updated last year
- Train LLaMA on a single A100 80G node using 🤗 Transformers and 🚀 DeepSpeed pipeline parallelism ☆223 · Updated last year
- SuperCLUE-Math6: a new-generation Chinese-native multi-turn, multi-step mathematical reasoning dataset ☆58 · Updated last year
- Code implementation of Dynamic NTK-ALiBi for Baichuan: inference on longer texts without fine-tuning ☆47 · Updated last year
- Fine-tuning LLaMA with RLHF (Reinforcement Learning from Human Feedback) based on DeepSpeed Chat ☆114 · Updated 2 years ago
- Code for Scaling Laws of RoPE-based Extrapolation ☆73 · Updated last year
- Code used for sourcing and cleaning the BigScience ROOTS corpus ☆313 · Updated 2 years ago
- CLongEval: A Chinese Benchmark for Evaluating Long-Context Large Language Models ☆40 · Updated last year