liamcli / hyperband_benchmarks
☆14 · Updated 7 years ago
Alternatives and similar repositories for hyperband_benchmarks:
Users interested in hyperband_benchmarks are comparing it to the libraries listed below.
- RNNprop ☆36 · Updated 7 years ago
- NeurIPS 2017 best paper. An interpretable linear-time kernel goodness-of-fit test. ☆67 · Updated 5 years ago
- The Statistical Recurrent Unit in PyTorch ☆33 · Updated 7 years ago
- Code for "Deep Convolutional Networks as shallow Gaussian Processes" ☆39 · Updated 5 years ago
- ☆33 · Updated 3 years ago
- Implementation of Adversarial Variational Optimization in PyTorch ☆43 · Updated 6 years ago
- Predicting learning curves in Python ☆43 · Updated 7 years ago
- Keras implementation of the Information Dropout (arXiv:1611.01353) paper ☆15 · Updated 8 years ago
- Implements an infinite sum of Poisson-weighted convolutions ☆26 · Updated 6 years ago
- Tensor Switching Networks ☆12 · Updated 7 years ago
- Python package to sample from determinantal point processes ☆18 · Updated 9 years ago
- ☆16 · Updated 8 years ago
- GitHub repo for my experiments with the orthogonal convolution idea ☆22 · Updated 7 years ago
- ☆35 · Updated 9 years ago
- This project collects the accepted papers and their links to arXiv or GitXiv ☆17 · Updated 8 years ago
- Variational Dropout Sparsifies Deep Neural Networks (Molchanov et al., 2017) in Chainer ☆19 · Updated 7 years ago
- Stochastic Gradient Riemannian Langevin Dynamics ☆33 · Updated 9 years ago
- ☆29 · Updated 7 years ago
- ☆25 · Updated 7 years ago
- Generalized Compressed Network Search with PyTorch ☆26 · Updated 7 years ago
- Functional ANOVA ☆27 · Updated 10 years ago
- Survey of hyperparameter optimization use in NIPS 2014 ☆12 · Updated 9 years ago
- Echo Noise Channel for Exact Mutual Information Calculation ☆17 · Updated 4 years ago
- Convolutional computer vision architectures that can be tuned by hyperopt. ☆70 · Updated 10 years ago
- Neural Networks Figures ☆52 · Updated 7 years ago
- ☆34 · Updated 5 years ago
- This repository contains all the material for the MLTrain NIPS workshop ☆10 · Updated 7 years ago
- TensorFlow implementation of the method from Variational Dropout Sparsifies Deep Neural Networks, Molchanov et al. (2017) ☆16 · Updated 7 years ago
- More composable than other neural network libraries ☆42 · Updated 8 years ago