conceptofmind / vit-flax
Implementation of numerous Vision Transformers in Google's JAX and Flax.
☆22 · Updated 3 years ago
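For orientation, here is a minimal sketch of what a ViT-style model looks like when written with Flax's linen API, the style of implementation this repository and several of the listed alternatives provide. The module name, hyperparameters, and layer layout below are illustrative assumptions, not the vit-flax repository's actual API.

```python
# Illustrative ViT-style classifier in Flax (linen): patch embedding + transformer encoder.
# All names and hyperparameters here are assumptions chosen for brevity.
import jax
import jax.numpy as jnp
import flax.linen as nn


class MiniViT(nn.Module):
    patch_size: int = 4
    dim: int = 64
    depth: int = 2
    heads: int = 4
    num_classes: int = 10

    @nn.compact
    def __call__(self, x):
        # Patch embedding via a strided convolution: (B, H, W, C) -> (B, N, dim)
        x = nn.Conv(self.dim, (self.patch_size, self.patch_size),
                    strides=(self.patch_size, self.patch_size))(x)
        b, h, w, d = x.shape
        x = x.reshape(b, h * w, d)
        # Pre-norm transformer encoder blocks: self-attention + MLP, both with residuals
        for _ in range(self.depth):
            y = nn.LayerNorm()(x)
            x = x + nn.SelfAttention(num_heads=self.heads)(y)
            y = nn.LayerNorm()(x)
            y = nn.Dense(self.dim * 4)(y)
            y = nn.gelu(y)
            x = x + nn.Dense(self.dim)(y)
        # Mean-pool the patch tokens and project to class logits
        x = x.mean(axis=1)
        return nn.Dense(self.num_classes)(x)


# Initialise and run on a dummy 32x32 RGB batch
model = MiniViT()
params = model.init(jax.random.PRNGKey(0), jnp.ones((1, 32, 32, 3)))
logits = model.apply(params, jnp.ones((1, 32, 32, 3)))  # shape (1, 10)
```

Using a strided convolution for the patch embedding is a common shortcut that is equivalent to splitting the image into non-overlapping patches and applying a shared linear projection; actual implementations typically add a class token and positional embeddings as well.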
Alternatives and similar repositories for vit-flax
Users interested in vit-flax are comparing it to the libraries listed below.
- Implementing the Denoising Diffusion Probabilistic Model in Flax ☆156 · Updated 3 years ago
- FID computation in Jax/Flax ☆29 · Updated last year
- JAX Implementation of Black Forest Labs' Flux.1 family of models ☆39 · Updated last month
- Train vision models using JAX and 🤗 transformers ☆100 · Updated last month
- A simple library for scaling up JAX programs ☆144 · Updated 2 months ago
- Unofficial JAX implementations of deep learning research papers ☆160 · Updated 3 years ago
- Just some miscellaneous utility functions / decorators / modules related to PyTorch and Accelerate to help speed up implementation of new… ☆126 · Updated last year
- Unofficial but Efficient Implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX ☆92 · Updated last year
- Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax) ☆118 · Updated 3 years ago
- HomebrewNLP in JAX flavour for maintainable TPU training ☆51 · Updated last year
- Maximal Update Parametrization (μP) with Flax & Optax ☆16 · Updated 2 years ago
- ☆35 · Updated last year
- Contrastive Language-Image Pretraining ☆144 · Updated 3 years ago
- LoRA for arbitrary JAX models and functions ☆143 · Updated last year
- Implementation of Gradient Agreement Filtering, from Chaubard et al. of Stanford, but for single-machine microbatches, in PyTorch ☆25 · Updated 11 months ago
- Implementation of a Transformer that Ponders, using the scheme from the PonderNet paper ☆81 · Updated 4 years ago
- ☆118 · Updated last month
- A port of the Mistral-7B model in JAX ☆32 · Updated last year
- Easy Hypernetworks in PyTorch and Jax ☆106 · Updated 2 years ago
- Implementation of Flash Attention in Jax ☆223 · Updated last year
- Lightning-like training API for JAX with Flax ☆45 · Updated last year
- Implementation of the Kalman Filtering Attention proposed in "Kalman Filtering Attention for User Behavior Modeling in CTR Prediction" ☆59 · Updated 2 years ago
- Automatically take good care of your preemptible TPUs ☆37 · Updated 2 years ago
- Machine Learning eXperiment Utilities ☆47 · Updated 5 months ago
- The 2D discrete wavelet transform for JAX ☆44 · Updated 2 years ago
- Pretrained deep learning models for Jax/Flax: StyleGAN2, GPT2, VGG, ResNet, etc. ☆265 · Updated 9 months ago
- Implementation of GateLoop Transformer in Pytorch and Jax ☆91 · Updated last year
- ☆62 · Updated last year
- Local Attention - Flax module for Jax ☆22 · Updated 4 years ago
- ☆63 · Updated 3 years ago