fferflo / einx
Universal Notation for Tensor Operations in Python.
☆438 · Updated 6 months ago
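For orientation, here is a minimal sketch of what einx notation looks like, based on the operations documented in the einx README (`einx.mean`, `einx.rearrange`, and `einx.dot` are real entry points; the shapes and values here are illustrative):

```python
# Minimal einx sketch: one notation for reductions, rearranges, and contractions.
# Assumes einx as documented at https://github.com/fferflo/einx; shapes are illustrative.
import numpy as np
import einx

x = np.ones((4, 16, 3))  # (batch, sequence, channels)

pooled = einx.mean("b [s] c", x)              # reduce over the bracketed axis -> (4, 3)
flat = einx.rearrange("b s c -> b (s c)", x)  # merge sequence and channels -> (4, 48)
proj = einx.dot("b s c, c d -> b s d", x, np.ones((3, 8)))  # contract over c -> (4, 16, 8)
```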
Alternatives and similar repositories for einx
Users interested in einx are comparing it to the libraries listed below.
- Named tensors with first-class dimensions for PyTorch · ☆331 · Updated 2 years ago
- Run PyTorch in JAX. 🤝 · ☆304 · Updated last week
- OpTree: Optimized PyTree Utilities · ☆196 · Updated last week
- 🧱 Modula software package · ☆291 · Updated 2 months ago
- For optimization algorithm research and development. · ☆542 · Updated last week
- Library for reading and processing ML training data. · ☆570 · Updated this week
- TensorDict is a dedicated tensor container for PyTorch. · ☆972 · Updated this week
- Efficient optimizers · ☆274 · Updated last week
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvement… · ☆400 · Updated last week
- ☆283 · Updated last year
- Running Jax in PyTorch Lightning · ☆112 · Updated 10 months ago
- jax-triton contains integrations between JAX and OpenAI Triton · ☆428 · Updated last week
- CIFAR-10 speedruns: 94% in 2.6 seconds and 96% in 27 seconds · ☆313 · Updated 3 months ago
- Use Jax functions in Pytorch · ☆253 · Updated 2 years ago
- Named Tensors for Legible Deep Learning in JAX · ☆210 · Updated last week
- Compositional Linear Algebra · ☆489 · Updated 2 months ago
- A simple library for scaling up JAX programs · ☆144 · Updated 11 months ago
- CLU lets you write beautiful training loops in JAX. · ☆356 · Updated 4 months ago
- The AdEMAMix Optimizer: Better, Faster, Older. · ☆186 · Updated last year
- Pytorch-like dataloaders for JAX. · ☆93 · Updated 4 months ago
- Type annotations and runtime checking for shape and dtype of JAX/NumPy/PyTorch/etc. arrays (see the sketch after this list). https://docs.kidger.site/jaxtyping/ · ☆1,579 · Updated 3 weeks ago
- Memory mapped numpy arrays of varying shapes · ☆303 · Updated last year
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax · ☆671 · Updated this week
- Scalable and Performant Data Loading · ☆311 · Updated this week
- ☆218 · Updated 10 months ago
- ☆120 · Updated 4 months ago
- Implementation of Diffusion Transformer (DiT) in JAX · ☆295 · Updated last year
- Pytorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioner, low-rank approximation precondition… · ☆188 · Updated last week
- JMP is a Mixed Precision library for JAX.
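As a companion to the jaxtyping entry above, a minimal sketch of shape-checked annotations, following the usage documented at https://docs.kidger.site/jaxtyping/ (`Float`, `Array`, and `jaxtyped` are the documented names; the `matmul` function and the beartype pairing are illustrative choices, not part of jaxtyping itself):

```python
# Minimal jaxtyping sketch: annotate shapes/dtypes and check them at call time.
# Follows the documented jaxtyping API; the matmul example itself is hypothetical.
import jax.numpy as jnp
from beartype import beartype
from jaxtyping import Array, Float, jaxtyped

@jaxtyped(typechecker=beartype)
def matmul(x: Float[Array, "m k"], y: Float[Array, "k n"]) -> Float[Array, "m n"]:
    return x @ y

matmul(jnp.ones((2, 3)), jnp.ones((3, 4)))    # OK: shapes line up as (m k) @ (k n)
# matmul(jnp.ones((2, 3)), jnp.ones((5, 4)))  # would raise: k is bound to 3, not 5
```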