itsdaniele / jeometric
Graph neural networks in JAX.
☆68 · Updated last year
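For orientation, below is a minimal sketch of the kind of GCN-style message-passing layer a graph-neural-network library in JAX provides. It is written in plain JAX against a dense adjacency matrix; it is not jeometric's actual API, and all names in it are illustrative.

```python
# Minimal GCN-style message-passing layer in plain JAX (illustrative,
# not jeometric's API). Assumes a small dense adjacency matrix.
import jax
import jax.numpy as jnp

def gcn_layer(params, adj, x):
    # Add self-loops so each node retains its own features.
    adj = adj + jnp.eye(adj.shape[0])
    # Symmetric degree normalization: D^{-1/2} (A + I) D^{-1/2}.
    deg_inv_sqrt = jnp.diag(adj.sum(axis=1) ** -0.5)
    adj_norm = deg_inv_sqrt @ adj @ deg_inv_sqrt
    # Aggregate neighbor features, then apply a learned linear map.
    return jax.nn.relu(adj_norm @ x @ params["w"] + params["b"])

key = jax.random.PRNGKey(0)
num_nodes, in_dim, out_dim = 4, 8, 16
params = {
    "w": jax.random.normal(key, (in_dim, out_dim)) * 0.1,
    "b": jnp.zeros(out_dim),
}
# Undirected 4-node path graph as a dense adjacency matrix.
adj = jnp.array([[0., 1., 0., 0.],
                 [1., 0., 1., 0.],
                 [0., 1., 0., 1.],
                 [0., 0., 1., 0.]])
x = jax.random.normal(key, (num_nodes, in_dim))
out = jax.jit(gcn_layer)(params, adj, x)  # shape (4, 16)
```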
Alternatives and similar repositories for jeometric
Users interested in jeometric are comparing it to the libraries listed below.
- Running JAX in PyTorch Lightning ☆112 · Updated 10 months ago
- Neural Networks for JAX ☆84 · Updated last year
- A functional training loops library for JAX ☆88 · Updated last year
- Bare-bones implementations of some generative models in JAX: diffusion, normalizing flows, consistency models, flow matching, (beta)-VAEs… ☆136 · Updated last year
- Because we don't have enough time to read everything ☆89 · Updated last year
- ☆115 · Updated last month
- JAX Arrays for human consumption ☆109 · Updated last week
- Equivariant Steerable CNNs Library for PyTorch https://quva-lab.github.io/escnn/ ☆31 · Updated 2 years ago
- Lightning-like training API for JAX with Flax ☆44 · Updated 10 months ago
- A Python package of computer vision models for the Equinox ecosystem. ☆109 · Updated last year
- Run PyTorch in JAX. 🤝 ☆305 · Updated last week
- PyTorch-like dataloaders for JAX. ☆93 · Updated 4 months ago
- Use JAX functions in PyTorch ☆253 · Updated 2 years ago
- Minimal, lightweight JAX implementations of popular models. ☆114 · Updated this week
- ☆44 · Updated 2 months ago
- Your favourite classical machine learning algos on the GPU/TPU ☆20 · Updated 9 months ago
- Flow-matching algorithms in JAX ☆105 · Updated last year
- Multiple dispatch over abstract array types in JAX. ☆131 · Updated 2 weeks ago
- A simple library for scaling up JAX programs ☆144 · Updated 11 months ago
- A collection of graph neural network implementations in JAX ☆35 · Updated last year
- Exact OU processes with JAX ☆55 · Updated 7 months ago
- ☆60 · Updated 3 years ago
- Brain-Inspired Modular Training (BIMT), a method for making neural networks more modular and interpretable. ☆173 · Updated 2 years ago
- Fine-grained, dynamic control of neural network topology in JAX. ☆21 · Updated 2 years ago
- Meta-learning inductive biases in the form of useful conserved quantities. ☆37 · Updated 2 years ago
- Maximal Update Parametrization (μP) with Flax & Optax. ☆16 · Updated last year
- Automatic gradient descent ☆215 · Updated 2 years ago
- Wraps PyTorch code in a JIT-compatible way for JAX. Supports automatically defining gradients for reverse-mode AutoDiff. ☆58 · Updated 3 months ago
- Riemannian Optimization Using JAX ☆53 · Updated last year
- LoRA for arbitrary JAX models and functions ☆141 · Updated last year