lucidrains / x-transformers
A concise but complete full-attention transformer with a set of promising experimental features from various papers
☆5,461 · Updated last week
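As a quick orientation before the comparison list, the snippet below is a minimal sketch of how a model is typically constructed with x-transformers, following the README-style API (`TransformerWrapper` wrapping a `Decoder` stack). The numeric values are illustrative only, and argument names or defaults may differ between library versions.

```python
# Minimal sketch of x-transformers usage; values are illustrative,
# and the exact keyword arguments may vary across versions.
import torch
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,        # vocabulary size (hypothetical)
    max_seq_len = 1024,        # maximum sequence length
    attn_layers = Decoder(
        dim = 512,             # model width
        depth = 6,             # number of transformer blocks
        heads = 8              # attention heads per block
    )
)

tokens = torch.randint(0, 20000, (1, 1024))   # dummy token ids
logits = model(tokens)                        # (1, 1024, 20000)
```

The experimental features mentioned in the description (e.g. alternative positional embeddings or attention variants) are generally exposed as extra keyword arguments on these same classes.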
Alternatives and similar repositories for x-transformers
Users interested in x-transformers are comparing it to the libraries listed below.
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ☆8,951 · Updated this week
- Vector (and Scalar) Quantization, in Pytorch ☆3,420 · Updated last month
- 🦁 Lion, a new optimizer discovered by Google Brain using genetic algorithms that is purportedly better than Adam(w), in Pytorch ☆2,144 · Updated 7 months ago
- Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others); a short usage sketch follows this list ☆9,042 · Updated 3 weeks ago
- Fast and memory-efficient exact attention ☆18,448 · Updated last week
- Structured state space sequence models ☆2,684 · Updated last year
- Pytorch library for fast transformer implementations ☆1,724 · Updated 2 years ago
- Official codebase used to develop Vision Transformer, SigLIP, MLP-Mixer, LiT and more. ☆3,016 · Updated 2 months ago
- Foundation Architecture for (M)LLMs ☆3,094 · Updated last year
- Accessible large language models via k-bit quantization for PyTorch. ☆7,230 · Updated last week
- Implementation of Denoising Diffusion Probabilistic Model in Pytorch ☆9,673 · Updated 9 months ago
- TorchMultimodal is a PyTorch library for training state-of-the-art multimodal multi-task models at scale. ☆1,630 · Updated last week
- Transformer: PyTorch Implementation of "Attention Is All You Need" ☆3,878 · Updated last week
- PyTorch extensions for high performance and large scale training. ☆3,339 · Updated 2 months ago
- View model summaries in PyTorch! ☆2,824 · Updated last week
- Scenic: A Jax Library for Computer Vision Research and Beyond ☆3,599 · Updated last week
- A high-performance Python-based I/O system for large (and small) deep learning problems, with strong support for PyTorch. ☆2,707 · Updated last month
- A Unified Library for Parameter-Efficient and Modular Transfer Learning ☆2,735 · Updated last month
- Hackable and optimized Transformers building blocks, supporting a composable construction. ☆9,751 · Updated this week
- An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites ☆4,905 · Updated 11 months ago
- An annotated implementation of the Transformer paper. ☆6,354 · Updated last year
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ☆21,552 · Updated 2 weeks ago
- Reformer, the efficient Transformer, in Pytorch ☆2,174 · Updated 2 years ago
- Transformer-related optimization, including BERT, GPT ☆6,248 · Updated last year
- Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Py… ☆23,400 · Updated 4 months ago
- Official PyTorch Implementation of "Scalable Diffusion Models with Transformers" ☆7,581 · Updated last year
- Easily turn large sets of image URLs into an image dataset. Can download, resize and package 100M URLs in 20h on one machine. ☆4,096 · Updated 11 months ago
- An open source implementation of CLIP. ☆12,176 · Updated last month
- LAVIS - A One-stop Library for Language-Vision Intelligence ☆10,757 · Updated 8 months ago
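The tensor-operations entry above (its description matches the einops package) is the kind of utility that is easiest to grasp from a one-liner. The snippet below is a minimal illustrative sketch assuming the standard `rearrange` function; the tensor shapes and patch size are made up for the example.

```python
# Illustrative einops-style rearrangement; shapes are hypothetical.
import torch
from einops import rearrange

images = torch.randn(8, 3, 32, 32)   # (batch, channels, height, width)

# Split each image into 8x8 patches and flatten them, giving
# (batch, num_patches, patch_dim) = (8, 16, 192).
patches = rearrange(images, 'b c (h p1) (w p2) -> b (h w) (p1 p2 c)', p1=8, p2=8)
```

The same pattern string works unchanged across the backends named in the description (PyTorch, JAX, TensorFlow and others), which is the main reason the library shows up alongside transformer codebases.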