lucidrains / jax2torch
Use Jax functions in Pytorch
⭐259 · Updated 2 years ago
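For context, jax2torch wraps a JAX function so it can be called with PyTorch tensors (converting between the two frameworks under the hood). A minimal sketch adapted from the repository's documented usage, assuming the package is installed as `jax2torch`:

```python
import jax
import torch
from jax2torch import jax2torch

# an ordinary JAX function, jit-compiled as usual
@jax.jit
def jax_pow(x, y=2):
    return x ** y

# wrap it so it accepts and returns PyTorch tensors
torch_pow = jax2torch(jax_pow)

x = torch.tensor([1., 2., 3.])
print(torch_pow(x, y=3))  # tensor([ 1.,  8., 27.])
```

Gradients can also flow back through the wrapped function, which is what makes it usable inside an ordinary PyTorch training loop.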
Alternatives and similar repositories for jax2torch
Users interested in jax2torch are comparing it to the libraries listed below.
- Run PyTorch in JAX. ⭐311 · Updated 3 months ago
- Running Jax in PyTorch Lightning ⭐118 · Updated last year
- JMP is a Mixed Precision library for JAX. ⭐210 · Updated 11 months ago
- Pytorch-like dataloaders for JAX. ⭐98 · Updated last month
- A library for programmatically generating equivariant layers through constraint solving ⭐280 · Updated 2 years ago
- Flow-matching algorithms in JAX ⭐114 · Updated last year
- Lightning-like training API for JAX with Flax ⭐45 · Updated last year
- A simple library for scaling up JAX programs ⭐144 · Updated 2 months ago
- ⭐118 · Updated last month
- A functional training loops library for JAX ⭐88 · Updated last year
- LoRA for arbitrary JAX models and functions ⭐143 · Updated last year
- A parallel ODE solver for PyTorch ⭐274 · Updated last year
- JAX Arrays for human consumption ⭐110 · Updated 2 months ago
- ⭐122 · Updated 7 months ago
- Implementing the Denoising Diffusion Probabilistic Model in Flax ⭐156 · Updated 3 years ago
- Second Order Optimization and Curvature Estimation with K-FAC in JAX. ⭐304 · Updated last week
- Graph neural networks in JAX. ⭐68 · Updated last year
- ⭐162 · Updated 2 years ago
- OpTree: Optimized PyTree Utilities ⭐206 · Updated 2 weeks ago
- Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax). ⭐118 · Updated 3 years ago
- Bare-bones implementations of some generative models in Jax: diffusion, normalizing flows, consistency models, flow matching, (beta)-VAEs… ⭐141 · Updated 2 years ago
- A convenient way to trigger synchronizations to wandb / Weights & Biases if your compute nodes don't have internet! ⭐88 · Updated last week
- Flow Annealed Importance Sampling Bootstrap (FAB). ICLR 2023. ⭐66 · Updated last year
- Universal Notation for Tensor Operations in Python. ⭐459 · Updated 9 months ago
- Wraps PyTorch code in a JIT-compatible way for JAX. Supports automatically defining gradients for reverse-mode AutoDiff. ⭐58 · Updated 5 months ago
- Simple tools to mix and match PyTorch and Jax - Get the best of both worlds! ⭐35 · Updated last week
- Modern Fixed Point Systems using Pytorch ⭐125 · Updated 2 years ago
- CLU lets you write beautiful training loops in JAX. ⭐365 · Updated last week
- A comprehensive JAX/NNX library for diffusion and flow matching generative algorithms, featuring DiT (Diffusion Transformer) and its vari… ⭐127 · Updated 3 months ago
- NF-Layers for constructing neural functionals. ⭐93 · Updated 2 years ago