phlippe / jax_trainer
Lightning-like training API for JAX with Flax
☆44 · Updated 11 months ago
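For context, a minimal sketch of what a "Lightning-like training API" means in JAX with Flax: a small class that owns the model/optimizer state and a jitted training step, so user code only calls `fit`. The `TrainerModule`, `MLP`, and `fit` names below are illustrative assumptions for this sketch, not jax_trainer's actual API.

```python
# Illustrative sketch only: a Lightning-style trainer in JAX/Flax.
# TrainerModule, MLP, and fit() are assumed names for this example and
# are NOT taken from the real jax_trainer API.
import jax
import jax.numpy as jnp
import optax
from flax import linen as nn
from flax.training import train_state


class MLP(nn.Module):
    """A tiny model used only to exercise the trainer."""
    features: int = 32

    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(self.features)(x))
        return nn.Dense(1)(x)


class TrainerModule:
    """Bundles model, optimizer state, and a jitted step, Lightning-style."""

    def __init__(self, model, input_dim=8, learning_rate=1e-3, seed=0):
        rng = jax.random.PRNGKey(seed)
        params = model.init(rng, jnp.ones((1, input_dim)))["params"]
        self.state = train_state.TrainState.create(
            apply_fn=model.apply, params=params, tx=optax.adam(learning_rate)
        )

    @staticmethod
    @jax.jit
    def train_step(state, batch):
        x, y = batch

        def loss_fn(params):
            preds = state.apply_fn({"params": params}, x)
            return jnp.mean((preds - y) ** 2)  # MSE loss

        loss, grads = jax.value_and_grad(loss_fn)(state.params)
        return state.apply_gradients(grads=grads), loss

    def fit(self, batches):
        """One pass over an iterable of (x, y) batches; returns the last loss."""
        loss = None
        for batch in batches:
            self.state, loss = self.train_step(self.state, batch)
        return loss


# Usage: a few steps on dummy data.
trainer = TrainerModule(MLP())
data = [(jnp.ones((4, 8)), jnp.zeros((4, 1))) for _ in range(3)]
print(trainer.fit(data))
```

The usage lines at the bottom show the Lightning-flavoured flow: construct the trainer once, then hand `fit` an iterable of batches while the jitted step handles gradients and parameter updates.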
Alternatives and similar repositories for jax_trainer
Users interested in jax_trainer are comparing it to the libraries listed below.
- PyTorch-like dataloaders for JAX. ☆97 · Updated 6 months ago
- Running JAX in PyTorch Lightning ☆114 · Updated 11 months ago
- Run PyTorch in JAX. 🤝 ☆309 · Updated last month
- A simple library for scaling up JAX programs ☆144 · Updated last month
- Unofficial but efficient implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX ☆92 · Updated last year
- Turn jitted JAX functions back into Python source code ☆22 · Updated 11 months ago
- ☆118 · Updated last month
- Implementation of Denoising Diffusion Probabilistic Models (DDPM) in JAX and Flax. ☆22 · Updated 2 years ago
- ☆119 · Updated 5 months ago
- Einsum-like high-level array sharding API for JAX ☆34 · Updated last year
- Use JAX functions in PyTorch ☆258 · Updated 2 years ago
- A functional training loops library for JAX ☆88 · Updated last year
- ☆35 · Updated last year
- LoRA for arbitrary JAX models and functions ☆143 · Updated last year
- A Python package of computer vision models for the Equinox ecosystem. ☆110 · Updated last year
- Flow-matching algorithms in JAX ☆112 · Updated last year
- Graph neural networks in JAX. ☆68 · Updated last year
- Minimal yet performant LLM examples in pure JAX ☆204 · Updated 2 months ago
- Neural networks for JAX ☆84 · Updated last year
- Diffusion models in PyTorch ☆116 · Updated last week
- Bare-bones implementations of some generative models in JAX: diffusion, normalizing flows, consistency models, flow matching, (beta)-VAEs… ☆137 · Updated last year
- JAX arrays for human consumption ☆110 · Updated last month
- A port of the Mistral-7B model in JAX ☆32 · Updated last year
- diffusionjax is a simple and accessible diffusion models package in JAX ☆48 · Updated 10 months ago
- Differentiable Principal Component Analysis (PCA) implementation in JAX ☆30 · Updated 7 months ago
- JMP is a mixed-precision library for JAX. ☆211 · Updated 10 months ago
- Maximal Update Parametrization (μP) with Flax & Optax. ☆16 · Updated last year
- ☆225 · Updated last year
- Unofficial implementation of the Linear Recurrent Unit (LRU; Orvieto et al., 2023) ☆60 · Updated 3 months ago
- ☆62 · Updated last year