huggingface / accelerate
🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, with automatic mixed precision (including fp8) and easy-to-configure FSDP and DeepSpeed support
⭐ 9,486 · Updated this week
Alternatives and similar repositories for accelerate
Users interested in accelerate are comparing it to the libraries listed below.
- Hackable and optimized Transformers building blocks, supporting a composable construction. ⭐ 10,326 · Updated this week
- Accessible large language models via k-bit quantization for PyTorch. ⭐ 7,939 · Updated 2 weeks ago