lucidrains / x-transformers
A concise but complete full-attention transformer with a set of promising experimental features from various papers
★ 5,080 · Updated this week
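For orientation, here is a minimal usage sketch. It assumes the `TransformerWrapper` and `Decoder` entry points described in the project's README; the hyperparameter values are illustrative only.

```python
# Hedged sketch: assumes the TransformerWrapper / Decoder entry points from the
# x-transformers README; all hyperparameter values below are illustrative.
import torch
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,      # vocabulary size
    max_seq_len = 1024,      # maximum sequence length
    attn_layers = Decoder(
        dim = 512,           # model width
        depth = 6,           # number of transformer blocks
        heads = 8            # attention heads per block
    )
)

tokens = torch.randint(0, 20000, (1, 1024))  # dummy token ids
logits = model(tokens)                       # shape: (1, 1024, 20000)
```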
Alternatives and similar repositories for x-transformers:
Users interested in x-transformers are comparing it to the libraries listed below.
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ★ 8,343 · Updated this week
- Vector (and Scalar) Quantization, in Pytorch ★ 2,922 · Updated last week
- Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others) ★ 8,741 · Updated last week
- Fast and memory-efficient exact attention ★ 15,541 · Updated this week
- Foundation Architecture for (M)LLMs ★ 3,046 · Updated 10 months ago
- Accessible large language models via k-bit quantization for PyTorch. ★ 6,697 · Updated this week
- PyTorch extensions for high performance and large scale training. ★ 3,260 · Updated last month
- 🦁 Lion, a new optimizer discovered by Google Brain using genetic algorithms that is purportedly better than Adam(w), in Pytorch ★ 2,094 · Updated 2 months ago
- A Unified Library for Parameter-Efficient and Modular Transfer Learning ★ 2,649 · Updated last week
- An annotated implementation of the Transformer paper. ★ 6,000 · Updated 10 months ago
- Implementation of Denoising Diffusion Probabilistic Model in Pytorch ★ 8,848 · Updated 4 months ago
- Scenic: A Jax Library for Computer Vision Research and Beyond ★ 3,430 · Updated 2 weeks ago
- Structured state space sequence models ★ 2,553 · Updated 7 months ago
- ★ 10,913 · Updated 2 months ago
- TorchMultimodal is a PyTorch library for training state-of-the-art multimodal multi-task models at scale. ★ 1,542 · Updated this week
- Train transformer language models with reinforcement learning. ★ 11,782 · Updated this week
- Official codebase used to develop Vision Transformer, SigLIP, MLP-Mixer, LiT and more. ★ 2,583 · Updated last week
- Transformer: PyTorch Implementation of "Attention Is All You Need" ★ 3,347 · Updated 6 months ago
- Ongoing research training transformer models at scale ★ 11,448 · Updated this week
- Hackable and optimized Transformers building blocks, supporting a composable construction. ★ 9,063 · Updated this week
- Denoising Diffusion Probabilistic Models ★ 4,141 · Updated last year
- Pytorch library for fast transformer implementations ★ 1,677 · Updated last year
- Transformer-related optimization, including BERT, GPT ★ 6,025 · Updated 10 months ago
- View model summaries in PyTorch! ★ 2,698 · Updated last week
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ★ 17,363 · Updated this week
- Reading list for research topics in multimodal machine learning ★ 6,263 · Updated 6 months ago
- Reformer, the efficient Transformer, in Pytorch ★ 2,152 · Updated last year
- A Collection of Variational Autoencoders (VAE) in PyTorch. ★ 6,914 · Updated 8 months ago
- A collection of resources and papers on Diffusion Models ★ 11,418 · Updated 6 months ago
- An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites ★ 4,762 · Updated 6 months ago