huggingface / transformers-bloom-inference
Fast Inference Solutions for BLOOM
☆564 Updated 10 months ago
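The upstream repository focuses on DeepSpeed- and Accelerate-based serving for BLOOM. For context only, here is a minimal sketch of the plain 🤗 Transformers + Accelerate loading path that such solutions build on; the `bigscience/bloom-560m` checkpoint, dtype, and generation settings are illustrative assumptions, not taken from the repository's own scripts.

```python
# Minimal sketch (illustration, not the repository's serving code):
# load a BLOOM checkpoint with 🤗 Transformers + Accelerate and generate text.
# Requires: pip install torch transformers accelerate
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Small checkpoint for illustration; the full "bigscience/bloom" needs a multi-GPU node.
model_name = "bigscience/bloom-560m"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # half-precision weights to reduce memory
    device_map="auto",           # Accelerate places layers on available GPUs/CPU
)

prompt = "Fast inference for BLOOM means"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The repository's DeepSpeed-Inference scripts replace this straightforward loading path with fused kernels and tensor parallelism, which is where most of the speedup comes from.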
Alternatives and similar repositories for transformers-bloom-inference
Users interested in transformers-bloom-inference are comparing it to the libraries listed below:
- Crosslingual Generalization through Multitask Finetuning ☆538 Updated 11 months ago
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 ☆1,411 Updated last year
- Central place for the engineering/scaling WG: documentation, SLURM scripts and logs, compute environment and data. ☆1,006 Updated last year
- Official repository for LongChat and LongEval ☆527 Updated last year
- Code used for sourcing and cleaning the BigScience ROOTS corpus ☆314 Updated 2 years ago
- ☆412 Updated last year
- Large-scale model inference. ☆632 Updated last year
- distributed trainer for LLMs ☆578 Updated last year
- LOMO: LOw-Memory Optimization ☆989 Updated last year
- Scalable PaLM implementation in PyTorch ☆190 Updated 2 years ago
- Automatically split your PyTorch models on multiple GPUs for training & inference ☆658 Updated last year
- [ICML 2024] Break the Sequential Dependency of LLM Inference Using Lookahead Decoding ☆1,272 Updated 5 months ago
- [NIPS2023] RRHF & Wombat ☆811 Updated last year
- Open-source pre-training implementation of Google's LaMDA in PyTorch. Adding RLHF similar to ChatGPT. ☆473 Updated last year
- Train LLaMA on a single A100 80G node using 🤗 Transformers and 🚀 DeepSpeed pipeline parallelism ☆224 Updated last year