idiap / importance-sampling
Code for experiments regarding importance sampling for training neural networks
☆328 · Updated 4 years ago
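As background for the repositories below, here is a minimal sketch of the core idea: sample training examples in proportion to a difficulty score (here, the current per-example loss) and re-weight the sampled losses so the gradient estimate stays unbiased. The toy data, model, and hyperparameters are illustrative assumptions, not the idiap implementation.

```python
# Minimal sketch of loss-proportional importance sampling for mini-batch SGD.
# Toy data/model and all hyperparameters are illustrative, not the idiap code.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy dataset and model stand in for a real dataset/network.
X = torch.randn(1024, 20)
y = (X.sum(dim=1) > 0).long()
model = nn.Linear(20, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

batch_size = 64
for step in range(100):
    # 1. Score every example by its current loss (gradient-norm scores are
    #    another common choice); higher loss -> higher sampling probability.
    with torch.no_grad():
        scores = F.cross_entropy(model(X), y, reduction="none") + 1e-6
        probs = scores / scores.sum()

    # 2. Sample a mini-batch according to these probabilities.
    idx = torch.multinomial(probs, batch_size, replacement=True)

    # 3. Re-weight each sampled loss by 1 / (N * p_i) so the weighted
    #    average remains an unbiased estimate of the uniform-average loss.
    weights = 1.0 / (len(X) * probs[idx])
    losses = F.cross_entropy(model(X[idx]), y[idx], reduction="none")
    loss = (weights * losses).mean()

    opt.zero_grad()
    loss.backward()
    opt.step()
```

Rescoring the whole dataset at every step, as in this toy loop, is far too expensive at scale; practical implementations typically reuse stale per-example losses or a cheaper proxy score.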
Alternatives and similar repositories for importance-sampling
Users interested in importance-sampling are comparing it to the libraries listed below.
- Implements PyTorch code for the Accelerated SGD algorithm. ☆215 · Updated 7 years ago
- Neural Architecture Search with Bayesian Optimisation and Optimal Transport ☆135 · Updated 6 years ago
- Sparse Variational Dropout, ICML 2017 ☆312 · Updated 5 years ago
- A tutorial on "Bayesian Compression for Deep Learning" published at NIPS (2017). ☆206 · Updated 7 years ago
- Code to reproduce some of the figures in the paper "On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima" ☆145 · Updated 8 years ago
- Decoupled Weight Decay Regularization (ICLR 2019) ☆285 · Updated 7 years ago
- Example code for the paper "Understanding deep learning requires rethinking generalization" ☆178 · Updated 5 years ago
- ☆252 · Updated 9 years ago
- Totally Versatile Miscellanea for PyTorch ☆476 · Updated 3 years ago
- Hessian in PyTorch ☆187 · Updated 5 years ago
- ☆135 · Updated 8 years ago
- Gradient based hyperparameter optimization & meta-learning package for TensorFlow ☆190 · Updated 5 years ago
- Code for Concrete Dropout as presented in https://arxiv.org/abs/1705.07832 ☆253 · Updated 7 years ago
- PyTorch implementation of "Distilling a Neural Network Into a Soft Decision Tree" ☆302 · Updated 7 years ago
- Implementation for the Lookahead Optimizer. ☆243 · Updated 3 years ago
- Hypergradient descent ☆147 · Updated last year
- ☆145 · Updated 2 years ago
- An implementation of KFAC for TensorFlow ☆199 · Updated 3 years ago
- Python package facilitating the use of Bayesian Deep Learning methods with Variational Inference for PyTorch ☆360 · Updated 6 years ago
- Complementary code for the Targeted Dropout paper ☆255 · Updated 6 years ago
- Learning Sparse Neural Networks through L0 regularization ☆245 · Updated 5 years ago
- Explore DNNs via information ☆266 · Updated 5 years ago
- A PyTorch library for two-sample tests ☆241 · Updated last month
- Tools for loading standard data sets in machine learning ☆206 · Updated 3 years ago
- Code repository for the paper "Hyperparameter Optimization: A Spectral Approach" by Elad Hazan, Adam Klivans, Yang Yuan. ☆175 · Updated 7 years ago
- Adaptive Neural Trees ☆155 · Updated 6 years ago
- Release of CIFAR-10.1, a new test set for CIFAR-10. ☆225 · Updated 5 years ago
- A general, modular, and programmable architecture search framework ☆123 · Updated 2 years ago
- A drop-in replacement for CIFAR-10. ☆247 · Updated 4 years ago
- Code for "EigenDamage: Structured Pruning in the Kronecker-Factored Eigenbasis" https://arxiv.org/abs/1905.05934 ☆113 · Updated 5 years ago