matthias-wright / jax-fid
FID (Fréchet Inception Distance) computation in JAX/Flax.
☆26 · Updated 6 months ago
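For context, the FID this repository computes is the Fréchet distance between two Gaussians fitted to Inception activation statistics of real and generated images. A minimal NumPy/SciPy sketch of that formula follows; this is an illustration of the standard FID equation, not the jax-fid API, and the function name and arguments are hypothetical:

```python
import numpy as np
from scipy import linalg

def frechet_distance(mu1, sigma1, mu2, sigma2):
    """Fréchet distance between N(mu1, sigma1) and N(mu2, sigma2).

    FID = ||mu1 - mu2||^2 + Tr(sigma1 + sigma2 - 2 * (sigma1 @ sigma2)^(1/2))
    """
    diff = mu1 - mu2
    # Matrix square root of the covariance product; sqrtm can return
    # a complex array with tiny imaginary parts due to numerical error.
    covmean = linalg.sqrtm(sigma1 @ sigma2)
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))
```

Identical statistics give a distance of zero, and the measure grows as the two Gaussians drift apart; in practice the means and covariances are estimated from Inception-v3 features of the two image sets.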
Alternatives and similar repositories for jax-fid:
Users interested in jax-fid are comparing it to the repositories listed below.
- Utilities for PyTorch distributed ☆23 · Updated last year
- ☆33 · Updated 4 months ago
- An implementation of the PSGD Kron second-order optimizer for PyTorch ☆21 · Updated 2 weeks ago
- PyTorch interface for TrueGrad Optimizers ☆41 · Updated last year
- Contains my experiments with the `big_vision` repo to train ViTs on ImageNet-1k. ☆22 · Updated 2 years ago
- JAX implementation of ViT-VQGAN ☆80 · Updated 2 years ago
- ☆33 · Updated last year
- Automatically take good care of your preemptible TPUs ☆34 · Updated last year
- The 2D discrete wavelet transform for JAX ☆40 · Updated last year
- CUDA implementation of autoregressive linear attention, with all the latest research findings ☆44 · Updated last year
- Implementation of LogAvgExp for PyTorch ☆32 · Updated 2 years ago
- ☆21 · Updated 6 months ago
- DiCE: The Infinitely Differentiable Monte-Carlo Estimator ☆31 · Updated last year
- A JAX implementation of the continuous-time formulation of Consistency Models ☆84 · Updated last year
- Minimal JAX/Flax port of `lpips` supporting `vgg16`, with pre-trained weights stored in the 🤗 Hugging Face hub. ☆14 · Updated 2 years ago
- Exploration into the proposed "Self Reasoning Tokens" by Felipe Bonetto ☆53 · Updated 8 months ago
- ☆31 · Updated 2 months ago
- Train vision models using JAX and 🤗 transformers ☆95 · Updated this week
- A scalable implementation of diffusion and flow matching with XGBoost models, applied to calorimeter data. ☆17 · Updated 2 months ago
- ☆51 · Updated 7 months ago
- Implementation of some personal helper functions for Einops, my favorite tensor-manipulation library ❤️ ☆53 · Updated 2 years ago
- ☆37 · Updated 8 months ago
- A short article showing how to load PyTorch models with linear memory consumption ☆34 · Updated 2 years ago
- CLOOB training (JAX) and inference (JAX and PyTorch) ☆70 · Updated 2 years ago
- Explorations into the recently proposed Taylor Series Linear Attention ☆91 · Updated 4 months ago
- ☆51 · Updated last year