conceptofmind / vit-flax
Implementation of numerous Vision Transformers in Google's JAX and Flax.
☆22 · Updated 2 years ago
Alternatives and similar repositories for vit-flax:
Users interested in vit-flax are comparing it to the libraries listed below.
- FID computation in JAX/Flax. ☆27 · Updated 8 months ago
- A port of the Mistral-7B model to JAX. ☆32 · Updated 9 months ago
- ☆30 · Updated 4 months ago
- HomebrewNLP in JAX flavour for maintainable TPU training. ☆49 · Updated last year
- Automatically take good care of your preemptible TPUs. ☆36 · Updated last year
- JAX implementation of "Learning to Learn by Gradient Descent by Gradient Descent". ☆27 · Updated 5 months ago
- The 2D discrete wavelet transform for JAX. ☆41 · Updated 2 years ago
- Contains my experiments with the `big_vision` repo to train ViTs on ImageNet-1k. ☆22 · Updated 2 years ago
- Lightning-like training API for JAX with Flax. ☆38 · Updated 3 months ago
- Train vision models using JAX and 🤗 Transformers. ☆97 · Updated 2 months ago
- JAX implementation of Black Forest Labs' Flux.1 family of models. ☆30 · Updated 5 months ago
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆32 · Updated last year
- Code repo for the ICLR 2024 blog post "Building Diffusion Model's Theory from Ground Up". ☆18 · Updated last year
- ☆87 · Updated 2 weeks ago
- PyTorch interface for TrueGrad optimizers. ☆42 · Updated last year
- Flow-matching algorithms in JAX. ☆86 · Updated 7 months ago
- Utilities for PyTorch distributed. ☆23 · Updated last month
- ☆73 · Updated 2 years ago
- Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax). ☆108 · Updated 2 years ago
- ☆37 · Updated 2 years ago
- Image augmentation library for JAX. ☆39 · Updated 11 months ago
- Implementation of some personal helper functions for Einops, my favorite tensor manipulation library ❤️. ☆54 · Updated 2 years ago
- LoRA for arbitrary JAX models and functions. ☆135 · Updated last year
- A metrics library for the JAX ecosystem. ☆40 · Updated 2 years ago
- Local Attention: a Flax module for JAX. ☆20 · Updated 3 years ago
- Flax (JAX) implementation of "Progressive Growing of GANs for Improved Quality, Stability, and Variation". ☆12 · Updated 3 years ago
- Unofficial but efficient implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX. ☆83 · Updated last year
- The Gaussian Histogram Loss (HL-Gauss) proposed by Imani et al., with a few convenient wrappers for regression, in PyTorch. ☆57 · Updated last month
- CUDA implementation of autoregressive linear attention, with all the latest research findings. ☆44 · Updated last year
- Running JAX in PyTorch Lightning. ☆90 · Updated 3 months ago