conceptofmind / vit-flax
Implementation of numerous Vision Transformers in Google's JAX and Flax.
☆22 · Updated 3 years ago
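The repository implements Vision Transformer variants with Flax's linen API. As a rough illustration of what such an implementation looks like, below is a minimal, self-contained ViT forward pass; the hyperparameters, module structure, and names (`ViT`, `patch_size`, `dim`, and so on) are illustrative assumptions, not the repository's actual API.

```python
# Minimal Vision Transformer sketch in Flax linen (assumed structure, not vit-flax's API).
import jax
import jax.numpy as jnp
import flax.linen as nn


class ViT(nn.Module):
    patch_size: int = 16
    dim: int = 256
    depth: int = 4
    heads: int = 8
    num_classes: int = 1000

    @nn.compact
    def __call__(self, images):
        # Patchify via a strided convolution, then flatten patches into a token sequence.
        x = nn.Conv(self.dim,
                    kernel_size=(self.patch_size, self.patch_size),
                    strides=(self.patch_size, self.patch_size))(images)
        b, h, w, c = x.shape
        x = x.reshape(b, h * w, c)
        # Learned positional embedding added to the patch tokens.
        pos = self.param('pos_emb', nn.initializers.normal(0.02), (1, h * w, c))
        x = x + pos
        # Stack of pre-norm transformer blocks: self-attention followed by an MLP.
        for _ in range(self.depth):
            y = nn.LayerNorm()(x)
            x = x + nn.SelfAttention(num_heads=self.heads)(y)
            y = nn.LayerNorm()(x)
            x = x + nn.Dense(self.dim)(nn.gelu(nn.Dense(self.dim * 4)(y)))
        # Mean-pool the tokens and classify.
        x = x.mean(axis=1)
        return nn.Dense(self.num_classes)(nn.LayerNorm()(x))


# Usage: initialize parameters and run a forward pass on a dummy batch.
params = ViT().init(jax.random.PRNGKey(0), jnp.ones((1, 224, 224, 3)))
logits = ViT().apply(params, jnp.ones((1, 224, 224, 3)))  # shape (1, 1000)
```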
Alternatives and similar repositories for vit-flax
Users interested in vit-flax are comparing it to the libraries listed below:
- FID computation in Jax/Flax. ☆29 · Updated last year
- Implementing the Denoising Diffusion Probabilistic Model in Flax ☆156 · Updated 3 years ago
- JAX Implementation of Black Forest Labs' Flux.1 family of models ☆39 · Updated last month
- LoRA for arbitrary JAX models and functions ☆143 · Updated last year
- Train vision models using JAX and 🤗 transformers ☆100 · Updated last week
- Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax). ☆118 · Updated 3 years ago
- Unofficial JAX implementations of deep learning research papers ☆160 · Updated 3 years ago
- A simple library for scaling up JAX programs ☆144 · Updated last month
- A port of the Mistral-7B model to JAX ☆32 · Updated last year
- minGPT in JAX ☆48 · Updated 3 years ago
- Unofficial but efficient implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX ☆92 · Updated last year
- Layerwise Batch Entropy Regularization ☆24 · Updated 3 years ago
- Implementation of a Transformer that Ponders, using the scheme from the PonderNet paper ☆81 · Updated 4 years ago
- Automatically take good care of your preemptible TPUs ☆37 · Updated 2 years ago
- ☆75 · Updated 3 years ago
- JAX implementation of "Learning to learn by gradient descent by gradient descent" ☆28 · Updated 4 months ago
- HomebrewNLP in JAX flavour for maintainable TPU training ☆51 · Updated last year
- Contrastive Language-Image Pretraining ☆144 · Updated 3 years ago
- ☆91 · Updated 3 years ago
- Easy Hypernetworks in Pytorch and Jax ☆106 · Updated 2 years ago
- The 2D discrete wavelet transform for JAX ☆44 · Updated 2 years ago
- EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc. in JAX w/ Flax Linen and Objax ☆129 · Updated last year
- ☆118 · Updated 2 weeks ago
- Implementation of Gradient Agreement Filtering, from Chaubard et al. of Stanford, but for single-machine microbatches, in Pytorch ☆25 · Updated 11 months ago
- Maximal Update Parametrization (μP) with Flax & Optax. ☆16 · Updated last year
- Multi-framework implementation of Deep Kernel Shaping and Tailored Activation Transformations, which are methods that modify neural networks… ☆74 · Updated 5 months ago
- Local Attention - Flax module for Jax ☆22 · Updated 4 years ago
- DiT (training + flow matching) in Jax ☆10 · Updated 11 months ago
- Flax (JAX) implementation of "Progressive Growing of GANs for Improved Quality, Stability, and Variation" ☆12 · Updated 4 years ago
- Just some miscellaneous utility functions / decorators / modules related to Pytorch and Accelerate to help speed up implementation of new… ☆125 · Updated last year