leopard-ai / betty
Betty: an automatic differentiation library for generalized meta-learning and multilevel optimization
☆344 · Updated last year
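To make the library's scope concrete, here is a minimal sketch of a two-level program written against Betty's Engine/Problem interface, loosely following the structure of the examples in its documentation. The toy modules, data loaders, reweighting scheme, and the specific `Config`/`EngineConfig` arguments are illustrative assumptions, not a verbatim reproduction of the current API.

```python
# Illustrative bilevel sketch (assumed API, modeled on Betty's documented examples):
# the inner problem fits a regressor on reweighted training data, the outer problem
# tunes the per-sample weighting module against a held-out validation set.
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset

from betty.configs import Config, EngineConfig
from betty.engine import Engine
from betty.problems import ImplicitProblem

train_loader = DataLoader(TensorDataset(torch.randn(64, 10), torch.randn(64, 1)), batch_size=8)
val_loader = DataLoader(TensorDataset(torch.randn(64, 10), torch.randn(64, 1)), batch_size=8)

class Inner(ImplicitProblem):
    def training_step(self, batch):
        x, y = batch
        per_sample_loss = F.mse_loss(self.module(x), y, reduction="none").mean(dim=1)
        weights = torch.sigmoid(self.outer(x)).squeeze(-1)  # query the upper problem by its name
        return (weights * per_sample_loss).mean()

class Outer(ImplicitProblem):
    def training_step(self, batch):
        x, y = batch
        return F.mse_loss(self.inner(x), y)  # validation loss of the lower-level model

inner_net, outer_net = torch.nn.Linear(10, 1), torch.nn.Linear(10, 1)
inner = Inner(
    name="inner",
    module=inner_net,
    optimizer=torch.optim.SGD(inner_net.parameters(), lr=0.1),
    train_data_loader=train_loader,
    config=Config(type="darts", unroll_steps=5),  # hypergradient approximation (assumed option names)
)
outer = Outer(
    name="outer",
    module=outer_net,
    optimizer=torch.optim.Adam(outer_net.parameters(), lr=1e-3),
    train_data_loader=val_loader,
    config=Config(),
)

# Declare the dependency graph: upper-to-lower and lower-to-upper edges.
dependencies = {"u2l": {outer: [inner]}, "l2u": {inner: [outer]}}
engine = Engine(config=EngineConfig(train_iters=200), problems=[outer, inner], dependencies=dependencies)
engine.run()
```

The point of the abstraction is that the hypergradient of the outer loss through the inner optimization is handled by the engine rather than derived and written by hand.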
Alternatives and similar repositories for betty
Users interested in betty are comparing it to the libraries listed below.
- ☆311 · Updated 7 months ago
- Framework code with wandb, checkpointing, logging, configs, experimental protocols. Useful for fine-tuning models or training from scratc… ☆151 · Updated 2 years ago
- Training and evaluating NBM and SPAM for interpretable machine learning. ☆78 · Updated 2 years ago
- {KFAC,EKFAC,Diagonal,Implicit} Fisher Matrices and finite width NTKs in PyTorch ☆215 · Updated last month
- Code for Parameter Prediction for Unseen Deep Architectures (NeurIPS 2021) ☆492 · Updated 2 years ago
- Implicit MLE: Backpropagating Through Discrete Exponential Family Distributions ☆258 · Updated last year
- This repository contains a Jax implementation of conformal training corresponding to the ICLR'22 paper "learning optimal conformal classi… ☆129 · Updated 3 years ago
- ☆208 · Updated 3 years ago
- Parameter-Free Optimizers for Pytorch ☆130 · Updated last year
- This repository contains the code of the distribution shift framework presented in A Fine-Grained Analysis on Distribution Shift (Wiles e… ☆83 · Updated 3 months ago
- Optimal Transport Dataset Distance ☆170 · Updated 3 years ago
- Differentiable Sorting Networks ☆119 · Updated 2 years ago
- Code for our NeurIPS 2022 paper ☆369 · Updated 2 years ago
- ☆133 · Updated 4 years ago
- Reduce end to end training time from days to hours (or hours to minutes), and energy requirements/costs by an order of magnitude using co… ☆342 · Updated 2 years ago
- Cockpit: A Practical Debugging Tool for Training Deep Neural Networks ☆482 · Updated 3 years ago
- Official repository for the paper "Zero-Shot AutoML with Pretrained Models" ☆47 · Updated last year
- Package for working with hypernetworks in PyTorch. ☆131 · Updated 2 years ago
- Unofficial JAX implementations of deep learning research papers ☆156 · Updated 3 years ago
- ☆185 · Updated last year
- ☆65 · Updated 3 years ago
- BackPACK - a backpropagation package built on top of PyTorch which efficiently computes quantities other than the gradient (see the per-sample gradient sketch after this list). ☆593 · Updated 9 months ago
- Official codebase for Pretrained Transformers as Universal Computation Engines. ☆247 · Updated 3 years ago
- ☆139 · Updated last year
- Code for "Self-Attention Between Datapoints: Going Beyond Individual Input-Output Pairs in Deep Learning" ☆415 · Updated last year
- Code for the paper: "Tensor Programs II: Neural Tangent Kernel for Any Architecture" ☆104 · Updated 5 years ago
- Official Implementation of "Transformers Can Do Bayesian Inference", the PFN paper ☆233 · Updated 11 months ago
- Code release for "Git Re-Basin: Merging Models modulo Permutation Symmetries" ☆490 · Updated 2 years ago
- Named tensors with first-class dimensions for PyTorch ☆331 · Updated 2 years ago
- ☆234 · Updated 7 months ago
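As a concrete illustration of the BackPACK entry above ("quantities other than the gradient"), here is a minimal per-sample-gradient sketch using its `extend`/`backpack` context and the `BatchGrad` extension; the toy model and data are made up for the example.

```python
# Toy example: individual (per-sample) gradients with BackPACK's BatchGrad extension.
import torch
from backpack import backpack, extend
from backpack.extensions import BatchGrad

X, y = torch.randn(32, 10), torch.randn(32, 1)

model = extend(torch.nn.Linear(10, 1))   # register the model with BackPACK
loss_func = extend(torch.nn.MSELoss())   # register the loss function as well

loss = loss_func(model(X), y)
with backpack(BatchGrad()):              # request per-sample gradients during backward
    loss.backward()

for name, param in model.named_parameters():
    # param.grad is the usual averaged gradient; param.grad_batch has one gradient per sample
    print(name, tuple(param.grad.shape), tuple(param.grad_batch.shape))
```

Other extensions follow the same pattern, swapping `BatchGrad` for the quantity of interest inside the `backpack(...)` context.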