jrbourbeau / dask-optuna
Scale Optuna with Dask
☆35 · Updated 4 years ago
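For context, dask-optuna distributes Optuna trials across a Dask cluster by sharing one study among the workers. The sketch below is a minimal usage example, assuming the `dask_optuna.DaskStorage` wrapper described in the project's README (this functionality was later upstreamed into Optuna's own Dask integration); treat it as an illustration rather than the definitive API.

```python
import optuna
import joblib
import dask.distributed
import dask_optuna


def objective(trial):
    # Toy objective: minimize (x - 2)^2
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2


if __name__ == "__main__":
    # Connect to a local Dask cluster (or pass the address of an existing one)
    with dask.distributed.Client() as client:
        # DaskStorage wraps an Optuna storage so all workers share one study
        storage = dask_optuna.DaskStorage()
        study = optuna.create_study(storage=storage, direction="minimize")
        # Run trials in parallel across the cluster via joblib's Dask backend
        with joblib.parallel_backend("dask"):
            study.optimize(objective, n_trials=100, n_jobs=-1)
        print("Best params:", study.best_params)
```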
Alternatives and similar repositories for dask-optuna:
Users interested in dask-optuna are comparing it to the libraries listed below.
- Reproducibility for Humans: A lightweight tool to perform reproducible machine learning experiments. ☆24 · Updated 5 years ago
- %conda magic for IPython ☆28 · Updated 8 years ago
- dask-pytorch-ddp is a Python package that makes it easy to train PyTorch models on Dask clusters using distributed data parallel. ☆59 · Updated 4 years ago
- Dask and Spark interactions ☆21 · Updated 8 years ago
- Extension to hypothesis for testing numpy general universal functions ☆39 · Updated 4 years ago
- Introduction to Dask for PyTorch Workflows ☆13 · Updated 4 years ago
- Gradient boosting on steroids ☆28 · Updated 9 months ago
- A `select` accessor for easier subsetting of pandas DataFrames and Series ☆34 · Updated last year
- A command line utility to create kernels in Jupyter from virtual environments. ☆16 · Updated 7 years ago
- Scalable pattern search optimization with dask ☆22 · Updated 8 years ago
- IPython magic for parallel profiling (like `%time`, but parallel) ☆71 · Updated 7 years ago
- A scikit-learn wrapper for HpBandSter hyperparameter search. ☆22 · Updated 2 years ago
- Sparrow is a boosting algorithm implementation that is optimized for training on very large datasets and/or in the limited memory setting… ☆21 · Updated 4 years ago
- Lightweight framework for structured and repeatable model validation ☆11 · Updated last year
- mimic calibration ☆21 · Updated 5 years ago
- A thorough, straightforward, un-intimidating introduction to Gaussian processes in NumPy.