google-deepmind / dks
Multi-framework implementation of Deep Kernel Shaping and Tailored Activation Transformations, which are methods that modify neural network models (and their initializations) to make them easier to train.
☆70 · Updated 2 weeks ago
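The gist of the method described above is to replace each nonlinearity with an affine-adjusted version and pair it with a carefully scaled (typically orthogonal) weight initialization. Below is a minimal, self-contained JAX sketch of that idea; it does not use the dks API itself, and the transformation constants (`alpha`, `beta`, `gamma`, `delta`) are placeholders that the library would normally compute for you.

```python
# Conceptual sketch only (plain JAX, not the dks API). DKS/TAT roughly replace a
# base activation phi with gamma * (phi(alpha * x + beta) + delta) and combine it
# with an orthogonal-style weight initialization; the constants below are
# illustrative placeholders, not values computed by the library.
import jax
import jax.numpy as jnp

def transformed_softplus(x, alpha=1.0, beta=0.0, gamma=1.0, delta=0.0):
    # Affine input/output adjustment around the base activation.
    return gamma * (jax.nn.softplus(alpha * x + beta) + delta)

def init_layer(key, fan_in, fan_out):
    # Orthogonal initialization, in the spirit of the DKS recipe.
    w = jax.nn.initializers.orthogonal()(key, (fan_in, fan_out))
    return w, jnp.zeros(fan_out)

def mlp(params, x):
    for w, b in params[:-1]:
        x = transformed_softplus(x @ w + b)
    w, b = params[-1]
    return x @ w + b

key = jax.random.PRNGKey(0)
sizes = [16, 64, 64, 1]
keys = jax.random.split(key, len(sizes) - 1)
params = [init_layer(k, m, n) for k, m, n in zip(keys, sizes[:-1], sizes[1:])]
out = mlp(params, jnp.ones((8, 16)))  # shape (8, 1)
```

In practice you would obtain the transformed activation functions and matching parameter samplers from the library for your chosen framework and drop them into your model definition.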
Alternatives and similar repositories for dks
Users interested in dks are comparing it to the libraries listed below.
- ☆114 · Updated this week
- PyTorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioner, low-rank approximation precondition… ☆174 · Updated last week
- Meta-learning inductive biases in the form of useful conserved quantities. ☆37 · Updated 2 years ago
- ☆60 · Updated 3 years ago
- Image augmentation library for JAX ☆39 · Updated last year
- Experiment of using Tangent to autodiff Triton ☆78 · Updated last year
- Unofficial but efficient implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX ☆83 · Updated last year
- ☆32 · Updated 7 months ago
- ☆53 · Updated 7 months ago
- Official repository for the paper "Can You Learn an Algorithm? Generalizing from Easy to Hard Problems with Recurrent Networks" ☆59 · Updated 3 years ago
- Neural Networks for JAX ☆84 · Updated 7 months ago
- A selection of neural network models ported from torchvision for JAX & Flax. ☆44 · Updated 4 years ago
- Open source code for EigenGame. ☆30 · Updated 2 years ago
- CUDA implementation of autoregressive linear attention, with all the latest research findings ☆44 · Updated last year
- JMP is a Mixed Precision library for JAX. ☆198 · Updated 3 months ago
- A collection of optimizers, some arcane and others well known, for Flax. ☆29 · Updated 3 years ago
- A functional training-loop library for JAX ☆88 · Updated last year
- ☆27 · Updated last year
- A simple library for scaling up JAX programs ☆134 · Updated 6 months ago
- A simple hypernetwork implementation in JAX using Haiku. ☆23 · Updated 2 years ago
- [ICML 2024] SINGD: KFAC-like Structured Inverse-Free Natural Gradient Descent (http://arxiv.org/abs/2312.05705) ☆21 · Updated 6 months ago
- Latent Diffusion Language Models ☆68 · Updated last year
- ☆31 · Updated last month
- FID computation in JAX/Flax. ☆27 · Updated 10 months ago
- JAX implementation of "Learning to learn by gradient descent by gradient descent" ☆27 · Updated 7 months ago
- Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax). ☆109 · Updated 2 years ago
- Turn jitted JAX functions back into Python source code ☆22 · Updated 5 months ago
- ☆103 · Updated 10 months ago
- The Energy Transformer block, in JAX ☆57 · Updated last year
- ☆33 · Updated 2 years ago