AIworx-Labs / chocolate
A fully decentralized hyperparameter optimization framework
☆122 · Updated 7 months ago
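As a rough illustration of what "fully decentralized" means here, below is a minimal sketch of a single worker's tuning step. Each worker runs this loop independently and coordinates only through shared storage; the names `choco.SQLiteConnection`, `choco.uniform`, `choco.Bayes`, `sampler.next()` and `sampler.update()` follow Chocolate's documentation as best recalled and should be treated as assumptions, not a verified example.

```python
# Hypothetical sketch of one worker's evaluation step with Chocolate.
# API names are assumptions based on the project's documentation.
import chocolate as choco

def objective(x, y):
    # Toy loss to minimize; replace with a real training/validation run.
    return (x - 2.0) ** 2 + (y + 3.0) ** 2

# All workers point at the same database; there is no central master process.
conn = choco.SQLiteConnection("sqlite:///chocolate.db")

# Search space: two continuous hyperparameters.
space = {"x": choco.uniform(-5, 5), "y": choco.uniform(-5, 5)}

# Bayesian-optimization sampler backed by the shared connection.
sampler = choco.Bayes(conn, space)

# One evaluation: draw a candidate, score it, report the loss back.
token, params = sampler.next()
loss = objective(**params)
sampler.update(token, loss)
```

Running this same script on several machines against the same database is, under those assumptions, all that is needed to parallelize the search.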
Alternatives and similar repositories for chocolate:
Users interested in chocolate are comparing it to the libraries listed below.
- Efficient Hyperparameter Optimization of Deep Learning Algorithms Using Deterministic RBF Surrogates ☆115 · Updated 7 years ago
- A simple, extensible library for developing AutoML systems ☆173 · Updated last year
- Functional ANOVA ☆122 · Updated last month
- Domain specific language for configuration spaces in Python. Useful for hyperparameter optimization and algorithm configuration. ☆208 · Updated 3 months ago
- Experimental Gradient Boosting Machines in Python with numba. ☆183 · Updated 6 years ago
- Better, faster hyper-parameter optimization ☆113 · Updated last year
- InfiniteBoost: building infinite ensembles with gradient descent ☆184 · Updated 6 years ago
- Experimental Global Optimization Algorithm ☆481 · Updated 7 years ago
- A simple Python interface to SMAC. ☆21 · Updated 6 years ago
- RoBO: a Robust Bayesian Optimization framework ☆485 · Updated 5 years ago
- Bayesian Optimization using GPflow ☆270 · Updated 4 years ago
- A feature engineering wrapper for sklearn ☆51 · Updated 4 years ago
- Tuning hyperparams fast with Hyperband ☆593 · Updated 6 years ago
- Milano is a tool for automating hyper-parameter search for your models on a backend of your choice. ☆155 · Updated 6 years ago
- Python library for extracting mini-batches of data from a data source for the purpose of training neural networks ☆87 · Updated 5 years ago
- Predicting learning curves in Python ☆43 · Updated 7 years ago
- HPOlib is a hyperparameter optimization library. It provides a common interface to three state-of-the-art hyperparameter optimization packages. ☆166 · Updated 6 years ago
- A library for factorization machines and polynomial networks for classification and regression in Python. ☆245 · Updated 4 years ago
- A high-level framework for advanced deep learning with TensorFlow ☆54 · Updated 7 years ago
- Convolutional computer vision architectures that can be tuned by hyperopt. ☆70 · Updated 10 years ago
- Batch Renormalization algorithm implementation in Keras ☆98 · Updated 6 years ago
- Streamlined machine learning experiment management. ☆107 · Updated 4 years ago
- Development Repository for GPU-accelerated GBDT training ☆60 · Updated 7 years ago
- Optimization routines for hyperparameter tuning ☆418 · Updated last year
- Benchmark suite of test functions suitable for evaluating black-box optimization strategies ☆50 · Updated 9 months ago
- Python library for Bayesian hyper-parameter optimization ☆89 · Updated 6 years ago
- Multi-GPU data-parallel training in Keras ☆77 · Updated 7 years ago
- An extension to Sacred for automated hyperparameter optimization. ☆59 · Updated 7 years ago
- Population Based Training (in PyTorch with sqlite3). Status: Unsupported