google / flax
Flax is a neural network library for JAX that is designed for flexibility.
☆6,527, updated this week
Alternatives and similar repositories for flax:
Users interested in flax are comparing it to the libraries listed below.
- JAX-based neural network library (☆3,021, updated this week)
- Optax is a gradient processing and optimization library for JAX. (☆1,875, updated this week)
- Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more (☆32,098, updated this week)
- Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/ (☆2,336, updated last week)
- Flexible and powerful tensor operations for readable and reliable code (for PyTorch, JAX, TF and others) (☆8,882, updated last week)
- A curated list of JAX resources: https://github.com/google/jax (☆1,796, updated 2 months ago)
- Enabling PyTorch on XLA devices (e.g. Google TPU) (☆2,596, updated this week)
- PyTorch extensions for high-performance and large-scale training. (☆3,308, updated last week)
- A Graph Neural Network Library in JAX (☆1,427, updated last year)
- Efficiently computes derivatives of NumPy code. (☆7,254, updated this week)
- functorch is JAX-like composable function transforms for PyTorch. (☆1,424, updated this week)
- A machine learning compiler for GPUs, CPUs, and ML accelerators (☆3,127, updated this week)
- Fast and Easy Infinite Neural Networks in Python (☆2,335, updated last year)
- FFCV: Fast Forward Computer Vision (and other ML workloads!) (☆2,924, updated 10 months ago)
- Development repository for the Triton language and compiler (☆15,447, updated this week)
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, with automatic mixed precision (☆8,673, updated this week)
- A scikit-learn compatible neural network library that wraps PyTorch (☆6,014, updated last week)
- Probabilistic programming with NumPy powered by JAX for autograd and JIT compilation to GPU/TPU/CPU. (☆2,432, updated this week)
- A Python toolbox for performing gradient-free optimization (☆4,051, updated last week)
- Type annotations and runtime checking for shape and dtype of JAX/NumPy/PyTorch/etc. arrays. https://docs.kidger.site/jaxtyping/ (☆1,390, updated this week)
- Toolbox of models, callbacks, and datasets for AI/ML researchers. (☆1,720, updated 3 weeks ago)
- Hardware accelerated, batchable and differentiable optimizers in JAX. (☆962, updated 3 weeks ago)
- NumPy & SciPy for GPU (☆10,164, updated this week)
- High-level library to help with training and evaluating neural networks in PyTorch flexibly and transparently. (☆4,651, updated last week)
- Type annotations and dynamic checking for a tensor's shape, dtype, names, etc. (☆1,429, updated this week)
- torch-optimizer: a collection of optimizers for PyTorch (☆3,107, updated last year)
- A PyTorch Extension: Tools for easy mixed precision and distributed training in PyTorch (☆8,645, updated 3 weeks ago)
- ML Collections is a library of Python Collections designed for ML use cases. (☆949, updated this week)
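Many of the libraries above are built on the composable transformations that the JAX entry describes (differentiate, vectorize, JIT to GPU/TPU). A minimal sketch of that idea, assuming `jax` is installed; the loss function and input values are illustrative:

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    """A toy quadratic loss: sum((w * x)^2)."""
    return jnp.sum((w * x) ** 2)

# Compose two transformations: differentiate w.r.t. w, then JIT-compile.
grad_loss = jax.jit(jax.grad(loss))

# d/dw sum((w*x)^2) = 2*w*x^2, so with w=[1,2], x=[3,4] we expect [18, 64].
g = grad_loss(jnp.array([1.0, 2.0]), jnp.array([3.0, 4.0]))
print(g)  # [18. 64.]
```

Libraries like Optax (gradient processing) and jaxopt (differentiable optimizers) operate on exactly these gradient pytrees, which is why they compose cleanly with Flax models.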