google / jaxonnxruntime
A user-friendly toolchain that enables the seamless execution of ONNX models using JAX as the backend.
☆123 · Updated last month
Alternatives and similar repositories for jaxonnxruntime
Users interested in jaxonnxruntime are comparing it to the libraries listed below.
- A stand-alone implementation of several NumPy dtype extensions used in machine learning. ☆296 · Updated this week
- JMP is a mixed-precision library for JAX. ☆208 · Updated 7 months ago
- JAX-Toolbox ☆335 · Updated this week
- jax-triton contains integrations between JAX and OpenAI Triton. ☆416 · Updated last week
- Implementation of Flash Attention in JAX. ☆216 · Updated last year
- OpTree: optimized PyTree utilities. ☆192 · Updated this week
- Run PyTorch in JAX. 🤝 ☆284 · Updated last week
- A functional training-loop library for JAX. ☆88 · Updated last year
- Minimal yet performant LLM examples in pure JAX. ☆152 · Updated this week
- If it quacks like a tensor... ☆59 · Updated 10 months ago
- JAX Synergistic Memory Inspector. ☆179 · Updated last year
- Serialize JAX, Flax, Haiku, or Objax model params with 🤗 `safetensors`. ☆45 · Updated last year
- A simple library for scaling up JAX programs. ☆143 · Updated 10 months ago
- Orbax provides common checkpointing and persistence utilities for JAX users. ☆424 · Updated this week
- Named tensors for legible deep learning in JAX. ☆201 · Updated last week
- LoRA for arbitrary JAX models and functions. ☆142 · Updated last year
- A Python package of computer vision models for the Equinox ecosystem. ☆109 · Updated last year
- Neural networks for JAX. ☆84 · Updated 11 months ago
- Demo of the unit_scaling library, showing how a model can easily be adapted to train in FP8. ☆46 · Updated last year
- TorchFix: a linter for PyTorch code with autofix support. ☆147 · Updated 3 weeks ago
- Experiment using Tangent to autodiff Triton. ☆81 · Updated last year
- Einsum-like high-level array sharding API for JAX. ☆35 · Updated last year
- A JAX-based library for building transformers, with implementations of GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT, and more. ☆291 · Updated last year
- A FlashAttention implementation for JAX with support for efficient document-mask computation and context parallelism. ☆140 · Updated 5 months ago
- A port of the Mistral-7B model to JAX. ☆32 · Updated last year