davmre / elbow
Flexible Bayesian inference using TensorFlow
☆143 · Updated 8 years ago
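The name "elbow" presumably nods to the evidence lower bound (ELBO) that variational inference maximizes. The sketch below is not elbow's actual API; it is a minimal, self-contained illustration of the reparameterization-trick ELBO objective that such a library automates, written against the TensorFlow 2 API. The toy model is a conjugate pair, z ~ N(0, 1) and x_i ~ N(z, 1), so the fitted variational mean can be checked against the closed-form posterior; all names in the snippet are illustrative assumptions.

```python
# Minimal sketch (NOT elbow's API): single-sample reparameterized ELBO
# for a toy conjugate model, optimized with plain TensorFlow 2.
import numpy as np
import tensorflow as tf

x = tf.constant(np.random.normal(2.0, 1.0, size=50), dtype=tf.float32)

# Variational parameters of q(z) = N(mu, softplus(rho)^2).
mu = tf.Variable(0.0)
rho = tf.Variable(-1.0)  # unconstrained; softplus keeps sigma positive

log2pi = tf.math.log(2.0 * np.pi)

def log_normal(v, mean, sigma):
    # log N(v; mean, sigma^2), elementwise
    return -0.5 * (log2pi + 2.0 * tf.math.log(sigma) + ((v - mean) / sigma) ** 2)

opt = tf.keras.optimizers.Adam(0.05)
for step in range(500):
    with tf.GradientTape() as tape:
        sigma = tf.nn.softplus(rho)
        eps = tf.random.normal([])
        z = mu + sigma * eps                              # reparameterized sample
        log_joint = (log_normal(z, 0.0, 1.0)              # prior
                     + tf.reduce_sum(log_normal(x, z, 1.0)))  # likelihood
        elbo = log_joint - log_normal(z, mu, sigma)       # one-sample ELBO estimate
    grads = tape.gradient(-elbo, [mu, rho])
    opt.apply_gradients(zip(grads, [mu, rho]))

# Exact posterior mean is sum(x) / (n + 1); compare with the fitted mu.
print(float(mu), float(tf.nn.softplus(rho)))
```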
Alternatives and similar repositories for elbow:
Users interested in elbow are comparing it to the libraries listed below.
- Efficient implementation of Generative Stochastic Networks ☆317 · Updated 8 years ago
- Variational and semi-supervised neural network toppings for Lasagne ☆208 · Updated 8 years ago
- Stochastic gradient routines for Theano ☆102 · Updated 6 years ago
- Bayesian dessert for Lasagne ☆84 · Updated 7 years ago
- ☆129 · Updated 8 years ago
- HPOlib is a hyperparameter optimization library. It provides a common interface to three state-of-the-art hyperparameter optimization packages (see the loop sketch after this list) ☆166 · Updated 6 years ago
- Spearmint, without the gum ☆42 · Updated 8 years ago
- ☆35 · Updated 10 years ago
- Breze with all the stuff. ☆96 · Updated 8 years ago
- Deep exponential families (DEFs) ☆55 · Updated 7 years ago
- Recurrent Neural Network for modeling sequential data, implemented using Python and Theano ☆91 · Updated 9 years ago
- Fork of https://github.com/Lasagne/Lasagne ☆63 · Updated 9 years ago
- Python package for modular Bayesian optimization ☆134 · Updated 4 years ago
- ☆34 · Updated 8 years ago
- Scikit-learn compatible tools using Theano ☆363 · Updated 8 years ago
- Benchmark testbed for assessing the performance of optimisation algorithms ☆80 · Updated 10 years ago
- Exploring differentiation with respect to hyperparameters ☆294 · Updated 9 years ago
- Bernoulli Embeddings for Text ☆83 · Updated 7 years ago
- Adversarial networks in TensorFlow ☆170 · Updated 8 years ago
- ☆36 · Updated 9 years ago
- Empirical CLI ☆42 · Updated 4 years ago
- Differentiable Neural Computers ☆129 · Updated 8 years ago
- Common interface for Theano, CGT, and TensorFlow ☆237 · Updated 8 years ago
- Creates models to classify documents into categories ☆66 · Updated 7 years ago
- General purpose Hessian-free optimization in Theano ☆182 · Updated 10 years ago
- Reference implementations of neural networks with differentiable memory mechanisms (NTM, Stack RNN, etc.) ☆219 · Updated 8 years ago
- Torch implementation of the Deep Network for Global Optimization (DNGO)
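Several entries above (HPOlib, the de-gummed Spearmint, DNGO, the modular Bayesian-optimization package) attack the same hyperparameter-search problem and share a propose/evaluate/record loop behind their differing model-based strategies. The sketch below illustrates that common loop; the `RandomSearchOptimizer` class and its `suggest`/`observe` methods are hypothetical names, not any listed library's API, and random search stands in for the model-based proposal step.

```python
# Hypothetical sketch of the suggest/evaluate/observe loop shared by the
# hyperparameter-optimization tools listed above; not any library's real API.
import random

class RandomSearchOptimizer:
    """Proposes points uniformly at random from box bounds."""
    def __init__(self, bounds):
        self.bounds = bounds          # {name: (low, high)}
        self.history = []             # [(params, loss), ...]

    def suggest(self):
        # A model-based optimizer (GP, DNGO, ...) would propose here instead.
        return {k: random.uniform(lo, hi) for k, (lo, hi) in self.bounds.items()}

    def observe(self, params, loss):
        self.history.append((params, loss))

    def best(self):
        return min(self.history, key=lambda t: t[1])

def objective(params):
    # Stand-in for a real training run returning a validation loss.
    return (params["lr"] - 0.01) ** 2 + (params["l2"] - 1e-4) ** 2

opt = RandomSearchOptimizer({"lr": (1e-4, 1e-1), "l2": (0.0, 1e-2)})
for _ in range(100):
    p = opt.suggest()
    opt.observe(p, objective(p))
print(opt.best())
```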