abacaj / train-with-fsdp
☆92 · Updated last year
Alternatives and similar repositories for train-with-fsdp:
Users interested in train-with-fsdp are comparing it to the libraries listed below
- Comprehensive analysis of the performance differences between QLoRA, LoRA, and full fine-tunes ☆82 · Updated last year
- Multipack distributed sampler for fast padding-free training of LLMs ☆188 · Updated 8 months ago
- Fully fine-tune large models like Mistral, Llama-2-13B, or Qwen-14B completely for free ☆231 · Updated 6 months ago
- ☆94 · Updated last year
- Full fine-tuning of large language models without large memory requirements ☆94 · Updated last year
- Exploring fine-tuning public checkpoints on filtered 8K sequences from the Pile