AIworx-Labs / chocolate
A fully decentralized hyperparameter optimization framework
☆124 · Updated last year
Alternatives and similar repositories for chocolate
Users interested in chocolate are comparing it to the libraries listed below.
- Efficient Hyperparameter Optimization of Deep Learning Algorithms Using Deterministic RBF Surrogates ☆115 · Updated 8 years ago
- ☆69 · Updated 5 years ago
- Functional ANOVA ☆125 · Updated 5 months ago
- InfiniteBoost: building infinite ensembles with gradient descent ☆183 · Updated 6 years ago
- Parameter Importance Analysis Tool ☆77 · Updated 4 years ago
- Simple Python interface to SMAC. ☆21 · Updated 7 years ago
- RoBO: a Robust Bayesian Optimization framework ☆487 · Updated 6 years ago
- A simple, extensible library for developing AutoML systems ☆175 · Updated 2 years ago
- Experimental Gradient Boosting Machines in Python with numba. ☆185 · Updated 6 years ago
- HPOlib is a hyperparameter optimization library. It provides a common interface to three state-of-the-art hyperparameter optimization pac… ☆166 · Updated 6 years ago
- Better, faster hyper-parameter optimization ☆113 · Updated last year
- An implementation of Caruana et al.'s Ensemble Selection algorithm in Python, based on scikit-learn ☆150 · Updated 4 years ago
- Predicting learning curves in Python ☆43 · Updated 8 years ago
- A feature engineering wrapper for sklearn ☆52 · Updated 5 years ago
- Experimental Global Optimization Algorithm ☆483 · Updated 7 years ago
- An AutoML pipeline selection system to quickly select a promising pipeline for a new dataset. ☆84 · Updated 3 years ago
- Development repository for GPU-accelerated GBDT training ☆60 · Updated 8 years ago
- Hyperparameter optimization with approximate gradient ☆66 · Updated 4 years ago
- Supporting code for the paper "Finding Influential Training Samples for Gradient Boosted Decision Trees" ☆68 · Updated last year
- Optimization routines for hyperparameter tuning ☆422 · Updated last year
- Streamlined machine learning experiment management. ☆107 · Updated 5 years ago
- Python library for extracting mini-batches of data from a data source for the purpose of training neural networks ☆87 · Updated 6 years ago
- Playing with various deep learning tools and network architectures ☆70 · Updated 7 years ago
- A suite of boosting algorithms for the online learning setting. ☆65 · Updated 8 years ago
- Tuning hyperparams fast with Hyperband ☆595 · Updated 7 years ago
- DeepArchitect: Automatically Designing and Training Deep Architectures ☆147 · Updated 5 years ago
- Some experiments into explaining complex black-box ensemble predictions. ☆74 · Updated 5 years ago
- Hodor AutoML: Brute-Bandit fast good-enough solutions to a wide range of machine learning problems. ☆82 · Updated 10 years ago
- Auto-optimizing a neural net (and its architecture) on the CIFAR-100 dataset. Could be easily transferred to another dataset or another c… ☆144 · Updated 2 years ago
- Hyperparameter optimization for neural networks ☆48 · Updated 11 years ago
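Most of the libraries above implement some variant of the same loop: sample a configuration from a search space, evaluate an objective, and keep the best result. A minimal stdlib-only sketch of that loop using plain random search (the function name, search-space format, and toy objective below are illustrative assumptions, not the API of chocolate or any listed library):

```python
import random

def random_search(objective, space, n_trials=50, seed=0):
    """Minimal random-search tuner: sample uniform configs, keep the best.

    `space` maps each hyperparameter name to a (low, high) range.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    best_params, best_loss = None, float("inf")
    for _ in range(n_trials):
        # Propose a candidate configuration by sampling each dimension.
        params = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        loss = objective(params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

# Toy quadratic objective with its minimum at x=2, y=-1.
space = {"x": (-5.0, 5.0), "y": (-5.0, 5.0)}
best, loss = random_search(
    lambda p: (p["x"] - 2) ** 2 + (p["y"] + 1) ** 2, space
)
```

The Bayesian-optimization and bandit-style tools in the list (RoBO, SMAC wrappers, Hyperband) replace the uniform sampler with a model- or budget-driven proposal step, but the evaluate-and-keep-best skeleton is the same.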