facebookresearch / xformers
Hackable and optimized Transformers building blocks, supporting a composable construction.
⭐ 10,201 · Updated last week
Alternatives and similar repositories for xformers
Users interested in xformers are comparing it to the libraries listed below.
- Accessible large language models via k-bit quantization for PyTorch. ⭐ 7,815 · Updated last week
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ⭐ 9,398 · Updated this week
- Fast and memory-efficient exact attention ⭐ 21,196 · Updated this week
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ⭐ 13,062 · Updated last year
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ⭐ 20,307 · Updated this week
- Transformer related optimization, including BERT, GPT ⭐ 6,370 · Updated last year
- Using low-rank adaptation to quickly fine-tune diffusion models. ⭐ 7,493 · Updated last year
- Foundation Architecture for (M)LLMs
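The first entry above describes a k-bit quantization library. As a minimal sketch of the underlying idea (absmax 8-bit quantization, written from scratch here, not that library's actual API): each tensor is scaled by its maximum absolute value so it fits into `int8`, and dequantization multiplies the scale back. All names in this snippet are illustrative.

```python
import numpy as np

def quantize_int8(x):
    # Absmax quantization: map the float range [-max|x|, max|x|] onto [-127, 127].
    scale = np.abs(x).max() / 127.0
    q = np.round(x / scale).astype(np.int8)
    return q, scale

def dequantize_int8(q, scale):
    # Recover an approximation of the original floats.
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(1024).astype(np.float32)

q, s = quantize_int8(w)
w_hat = dequantize_int8(q, s)

# Rounding to the nearest step bounds the error by half a quantization step.
assert q.dtype == np.int8
assert np.abs(w - w_hat).max() <= s / 2 + 1e-6
```

The memory win is that `q` stores one byte per weight instead of four, at the cost of the bounded rounding error checked above.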
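Two of the entries above are implementations of low-rank adaptation (LoRA). A minimal sketch of the idea, assuming a plain linear layer and written independently of the loralib/PEFT APIs: the frozen pretrained weight `W` (`d_out × d_in`) is augmented with two small trainable factors `B` (`d_out × r`) and `A` (`r × d_in`) with `r` much smaller than either dimension, and the adapted layer computes `W + (alpha / r) * B @ A`. All variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 16, 2, 4.0

W = rng.standard_normal((d_out, d_in))  # frozen pretrained weight
A = rng.standard_normal((r, d_in))      # trainable down-projection
B = np.zeros((d_out, r))                # trainable up-projection, zero-initialized

def lora_forward(x):
    # x: (batch, d_in). The adapter adds a rank-r correction to the base weight.
    return x @ (W + (alpha / r) * B @ A).T

x = rng.standard_normal((3, d_in))
y = lora_forward(x)

assert y.shape == (3, d_out)
# With B initialized to zero, the adapted layer starts out exactly
# equal to the frozen base layer, so fine-tuning begins from the
# pretrained model's behavior.
assert np.allclose(y, x @ W.T)
```

Only `A` and `B` are trained, so the number of trainable parameters drops from `d_out * d_in` to `r * (d_out + d_in)`.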