matthias-wright / flaxmodels
Pretrained deep learning models for Jax/Flax: StyleGAN2, GPT2, VGG, ResNet, etc.
☆ 246 · Updated last year
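For context, flaxmodels packages each pretrained network as a regular Flax module that is initialized and applied like any other. Below is a minimal sketch of loading an ImageNet-pretrained ResNet18; the `fm.ResNet18` entry point and the `pretrained='imagenet'` / `output='softmax'` arguments are assumptions based on the project's documented pattern, so check the repository README for the exact signatures.

```python
# Minimal sketch: run a dummy image through a pretrained ResNet18 from flaxmodels.
# fm.ResNet18 and its pretrained/output arguments are assumptions -- consult the
# flaxmodels README for the exact API.
import jax
import jax.numpy as jnp
import flaxmodels as fm

key = jax.random.PRNGKey(0)
x = jnp.zeros((1, 224, 224, 3))              # dummy NHWC image batch

model = fm.ResNet18(output='softmax', pretrained='imagenet')
params = model.init(key, x)                  # loads the pretrained weights
probs = model.apply(params, x, train=False)  # class probabilities, shape (1, 1000)
print(probs.shape)
```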
Alternatives and similar repositories for flaxmodels:
Users interested in flaxmodels are comparing it to the libraries listed below.
- Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax). ☆ 107 · Updated 2 years ago
- Contrastive Language-Image Pretraining ☆ 142 · Updated 2 years ago
- CLU lets you write beautiful training loops in JAX. ☆ 332 · Updated 3 weeks ago
- Unofficial JAX implementations of deep learning research papers ☆ 153 · Updated 2 years ago
- JMP is a Mixed Precision library for JAX. ☆ 191 · Updated last month
- LoRA for arbitrary JAX models and functions ☆ 135 · Updated last year
- PIX is an image processing library in JAX, for JAX. ☆ 404 · Updated 2 weeks ago
- Implementing the Denoising Diffusion Probabilistic Model in Flax ☆ 147 · Updated 2 years ago
- Easy Hypernetworks in Pytorch and Jax ☆ 97 · Updated 2 years ago
- Official Pytorch and JAX implementation of "Efficient-VDVAE: Less is more" ☆ 193 · Updated 2 years ago
- JAX Synergistic Memory Inspector ☆ 168 · Updated 7 months ago
- Named tensors with first-class dimensions for PyTorch ☆ 321 · Updated last year
- A functional training loops library for JAX ☆ 86 · Updated last year
- A Pytree Module system for Deep Learning in JAX ☆ 213 · Updated 2 years ago
- All about the fundamentals and working of Diffusion Models ☆ 153 · Updated 2 years ago
- Pytorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioner, low-rank approximation precondition…) ☆ 168 · Updated 2 months ago
- Use Jax functions in Pytorch ☆ 237 · Updated last year
- Run PyTorch in JAX. 🤝 ☆ 222 · Updated 2 weeks ago
- EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc. in JAX w/ Flax Linen and Objax ☆ 127 · Updated last year
- Simple and efficient RevNet library for PyTorch with XLA and DeepSpeed support and parameter offload ☆ 126 · Updated 2 years ago
- This repository contains the results for the paper: "Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers" ☆ 180 · Updated 3 years ago
- v objective diffusion inference code for JAX. ☆ 213 · Updated 2 years ago
- This library would form a permanent home for reusable components for deep probabilistic programming. The library would form and harness a… ☆ 306 · Updated 2 months ago
- The 2D discrete wavelet transform for JAX ☆ 40 · Updated 2 years ago
- Library for reading and processing ML training data. ☆ 392 · Updated this week
- Swarm training framework using Haiku + JAX + Ray for layer parallel transformer language models on unreliable, heterogeneous nodes ☆ 237 · Updated last year