mila-iqia / milabench
Repository of machine learning benchmarks
☆39 · Updated last week
Alternatives and similar repositories for milabench
Users interested in milabench are comparing it to the libraries listed below.
- ☆114 · Updated last week
- A simple library for scaling up JAX programs ☆139 · Updated 8 months ago
- Scaling scaling laws with board games. ☆49 · Updated 2 years ago
- A parallel framework for training deep neural networks ☆62 · Updated 4 months ago
- Running Jax in PyTorch Lightning ☆106 · Updated 7 months ago
- ☆135 · Updated this week
- Experiment of using Tangent to autodiff triton ☆79 · Updated last year
- Proof-of-concept of global switching between numpy/jax/pytorch in a library. ☆18 · Updated last year
- Multi-framework implementation of Deep Kernel Shaping and Tailored Activation Transformations, which are methods that modify neural networks… ☆71 · Updated 2 weeks ago
- ☆39 · Updated 3 years ago
- Serialize JAX, Flax, Haiku, or Objax model params with 🤗 `safetensors` ☆45 · Updated last year
- If it quacks like a tensor... ☆58 · Updated 8 months ago
- JMP is a Mixed Precision library for JAX. ☆206 · Updated 5 months ago
- Machine Learning eXperiment Utilities ☆46 · Updated 3 weeks ago
- ☆58 · Updated last year
- A stand-alone implementation of several NumPy dtype extensions used in machine learning. ☆281 · Updated last week
- Named Tensors for Legible Deep Learning in JAX ☆188 · Updated this week
- Clean RL implementation using MLX ☆32 · Updated last year
- ☆76 · Updated last week
- A small library for creating and manipulating custom JAX Pytree classes ☆56 · Updated 2 years ago
- OpTree: Optimized PyTree Utilities ☆187 · Updated last week
- A functional training loops library for JAX ☆88 · Updated last year
- Minimal but scalable implementation of large language models in JAX ☆35 · Updated 2 weeks ago
- ☆230 · Updated 5 months ago
- Fine-grained, dynamic control of neural network topology in JAX. ☆21 · Updated last year
- Learn online intrinsic rewards from LLM feedback ☆41 · Updated 7 months ago
- Neural Networks for JAX ☆84 · Updated 9 months ago
- ☆60 · Updated 3 years ago
- Wraps PyTorch code in a JIT-compatible way for JAX. Supports automatically defining gradients for reverse-mode AutoDiff. ☆55 · Updated 2 months ago
- Jax/Flax rewrite of Karpathy's nanoGPT ☆59 · Updated 2 years ago
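Several entries above (OpTree, and the library for custom JAX Pytree classes) center on PyTrees: nested containers of dicts, lists, and tuples whose leaves are arrays or numbers. As a rough illustration of the core operation these utilities provide, here is a minimal pure-Python sketch of a `tree_map` that applies a function to every leaf; this is a simplified analogue, not the implementation of any library listed here, and real PyTree libraries also handle registered custom node types.

```python
from typing import Any, Callable

def tree_map(fn: Callable[[Any], Any], tree: Any) -> Any:
    """Apply fn to every leaf of a nested dict/list/tuple structure."""
    if isinstance(tree, dict):
        # Map over values, preserving keys.
        return {k: tree_map(fn, v) for k, v in tree.items()}
    if isinstance(tree, (list, tuple)):
        # Rebuild the same container type from mapped children.
        return type(tree)(tree_map(fn, v) for v in tree)
    # Anything else is treated as a leaf.
    return fn(tree)

params = {"w": [1.0, 2.0], "b": (3.0,)}
doubled = tree_map(lambda x: x * 2, params)
print(doubled)  # {'w': [2.0, 4.0], 'b': (6.0,)}
```

The structure (keys, nesting, container types) is preserved while only the leaves change, which is what makes PyTree-style APIs convenient for transforming whole parameter sets at once.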