itsdaniele / jeometric
Graph neural networks in JAX.
☆67 · Updated last year
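For context, a graph-neural-network layer in JAX usually reduces to a gather/segment-sum message-passing step over an edge list. The sketch below shows such a step in plain `jax.numpy`; it is an illustrative assumption for orientation only, not jeometric's actual API, and all names (`gcn_layer`, `node_feats`, `senders`, `receivers`, `weights`) are hypothetical.

```python
# Minimal, library-agnostic sketch of one message-passing step in plain JAX.
# NOT jeometric's API; all names here are illustrative assumptions.
import jax
import jax.numpy as jnp

def gcn_layer(node_feats, senders, receivers, weights):
    """One graph-convolution step: project, aggregate incoming messages, activate.

    node_feats: (num_nodes, in_dim) node feature matrix
    senders, receivers: (num_edges,) integer arrays defining directed edges
    weights: (in_dim, out_dim) projection matrix
    """
    num_nodes = node_feats.shape[0]
    # Linear projection of node features.
    h = node_feats @ weights
    # Gather messages from sender nodes and sum them at each receiver node.
    messages = h[senders]
    aggregated = jax.ops.segment_sum(messages, receivers, num_segments=num_nodes)
    return jax.nn.relu(aggregated)

# Example: a 3-node graph with edges 0->1, 1->2, 2->0.
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (3, 4))
w = jax.random.normal(key, (4, 8))
senders = jnp.array([0, 1, 2])
receivers = jnp.array([1, 2, 0])
out = jax.jit(gcn_layer)(x, senders, receivers, w)  # shape (3, 8)
```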
Alternatives and similar repositories for jeometric
Users interested in jeometric are comparing it to the libraries listed below.
- ☆115 · Updated this week
- A functional training loops library for JAX ☆88 · Updated last year
- Neural Networks for JAX ☆84 · Updated 10 months ago
- Running JAX in PyTorch Lightning ☆109 · Updated 7 months ago
- JAX Arrays for human consumption ☆106 · Updated last month
- PyTorch-like dataloaders for JAX. ☆94 · Updated 2 months ago
- Run PyTorch in JAX. 🤝 ☆266 · Updated 3 weeks ago
- A Python package of computer vision models for the Equinox ecosystem. ☆107 · Updated last year
- Lightning-like training API for JAX with Flax ☆42 · Updated 7 months ago
- Bare-bones implementations of some generative models in JAX: diffusion, normalizing flows, consistency models, flow matching, (beta)-VAEs… ☆131 · Updated last year
- Because we don't have enough time to read everything ☆89 · Updated 10 months ago
- Equivariant Steerable CNNs Library for PyTorch https://quva-lab.github.io/escnn/ ☆30 · Updated 2 years ago
- Use JAX functions in PyTorch ☆248 · Updated 2 years ago
- ☆43 · Updated 2 months ago
- A simple library for scaling up JAX programs ☆140 · Updated 9 months ago
- Wraps PyTorch code in a JIT-compatible way for JAX. Supports automatically defining gradients for reverse-mode AutoDiff. ☆56 · Updated last week
- Multiple dispatch over abstract array types in JAX. ☆127 · Updated last month
- Einsum-like high-level array sharding API for JAX ☆35 · Updated last year
- Meta-learning inductive biases in the form of useful conserved quantities. ☆37 · Updated 2 years ago
- Flow-matching algorithms in JAX ☆100 · Updated 11 months ago
- Turn jitted JAX functions back into Python source code ☆22 · Updated 7 months ago
- diffusionjax is a simple and accessible diffusion models package in JAX ☆46 · Updated 6 months ago
- Scalable and Stable Parallelization of Nonlinear RNNs ☆17 · Updated 6 months ago
- Diffusion models in PyTorch ☆107 · Updated last month
- LoRA for arbitrary JAX models and functions ☆140 · Updated last year
- A port of the Mistral-7B model in JAX ☆32 · Updated last year
- Named Tensors for Legible Deep Learning in JAX ☆197 · Updated this week
- Minimal, lightweight JAX implementations of popular models. ☆64 · Updated this week
- Brain-Inspired Modular Training (BIMT), a method for making neural networks more modular and interpretable. ☆172 · Updated 2 years ago
- ☆141 · Updated this week