silverbottlep / meta_curvature
The code for the paper "Meta-Curvature", Eunbyung Park and Junier B. Oliva, NeurIPS 2019.
☆11 · Updated 5 years ago
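For context on what the repository implements, here is a minimal sketch of the meta-curvature idea in PyTorch: in a MAML-style inner loop, the task gradient is transformed by a learned curvature matrix before the adaptation step. This is an illustrative sketch, not the repository's actual API; the names `inner_update`, `curvatures`, and `inner_lr` are assumptions, and the paper uses Kronecker-factored per-layer transforms rather than the dense matrices shown here.

```python
import torch

def inner_update(loss, params, curvatures, inner_lr=0.01):
    """One MAML-style adaptation step with a learned gradient transform:
    theta' = theta - lr * (M @ grad), where each M is meta-learned.
    Illustrative sketch only; names and shapes are assumptions."""
    # create_graph=True keeps the graph so the outer loop can
    # backpropagate through the adaptation step, as in MAML.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    fast_weights = []
    for p, g, M in zip(params, grads, curvatures):
        # Transform the flattened task gradient by the learned curvature
        # (a dense square matrix here; Kronecker-factored in the paper).
        g_t = (M @ g.reshape(-1)).reshape(g.shape)
        fast_weights.append(p - inner_lr * g_t)
    return fast_weights
```

In this sketch, `params` would be the model's parameter tensors and `curvatures` a list of meta-learned square matrices (one per parameter tensor), optimized in the outer loop alongside the initialization.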
Alternatives and similar repositories for meta_curvature
Users interested in meta_curvature are comparing it to the repositories listed below.
- Code for "Online Learned Continual Compression with Adaptive Quantization Modules"☆27Updated 4 years ago
- A collection of Gradient-Based Meta-Learning Algorithms with pytorch☆62Updated 5 years ago
- Official Code Repository for La-MAML: Look-Ahead Meta-Learning for Continual Learning"☆76Updated 4 years ago
- ☆26 · Updated 6 years ago
- Meta-Learning with Warped Gradient Descent ☆93 · Updated 4 years ago
- Implementation of Bayesian Gradient Descent ☆37 · Updated last year
- ☆21 · Updated 5 years ago
- ☆58 · Updated 2 years ago
- Low-variance, efficient and unbiased gradient estimation for optimizing models with binary latent variables (ICLR 2019) ☆28 · Updated 6 years ago
- ☆51 · Updated 4 years ago
- Code for "Training Deep Energy-Based Models with f-Divergence Minimization" (ICML 2020) ☆36 · Updated 2 years ago
- Official release of "Learning the Stein Discrepancy for Training and Evaluating Energy-Based Models without Sampling" ☆49 · Updated 4 years ago
- Towards increasing stability of neural networks for continual learning: https://arxiv.org/abs/2006.06958.pdf (NeurIPS'20) ☆75 · Updated 2 years ago
- The official code for "Efficient Learning of Generative Models via Finite-Difference Score Matching" ☆12 · Updated 2 years ago
- Memory-efficient MAML using gradient checkpointing ☆84 · Updated 5 years ago
- Lookahead: A Far-sighted Alternative of Magnitude-based Pruning (ICLR 2020) ☆33 · Updated 4 years ago
- Implicit Generation and Generalization in Energy Based Models in PyTorch ☆65 · Updated 6 years ago
- ☆91 · Updated 3 years ago
- ☆72 · Updated 2 years ago
- ☆22 · Updated last year
- Functional Regularisation for Continual Learning with Gaussian Processes ☆14 · Updated 4 years ago
- Implementations of the paper "Bayesian Model-Agnostic Meta-Learning"