IDSIA / sacred
Sacred is a tool, developed at IDSIA, to help you configure, organize, log and reproduce experiments.
☆4,336 · Updated last month
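For a sense of what Sacred looks like in practice, here is a minimal sketch of an experiment; the experiment name, config values, and observer output directory are illustrative, not taken from this page:

```python
from sacred import Experiment
from sacred.observers import FileStorageObserver

ex = Experiment("mnist_baseline")                  # hypothetical experiment name
ex.observers.append(FileStorageObserver("runs"))   # log each run to ./runs/

@ex.config
def config():
    # Local variables here become the experiment's configuration and can be
    # overridden from the command line: python train.py with lr=0.1
    lr = 0.01
    epochs = 10

@ex.automain
def run(lr, epochs):
    # Config entries are injected by parameter name; the return value is
    # stored as the run's result by any attached observer.
    for epoch in range(epochs):
        pass  # training step would go here
    return lr * epochs  # placeholder "result"
```

Each invocation then produces a numbered run directory containing the resolved config, captured stdout, and metrics, which is what makes reruns reproducible.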
Alternatives and similar repositories for sacred
Users interested in sacred are comparing it to the libraries listed below.
- Web-based dashboard for Sacred ☆547 · Updated 2 years ago
- A scikit-learn compatible neural network library that wraps PyTorch ☆6,115 · Updated last month
- A Python toolbox for performing gradient-free optimization ☆4,120 · Updated 4 months ago
- Gin provides a lightweight configuration framework for Python ☆2,126 · Updated last week
- High-level library to help with training and evaluating neural networks in PyTorch flexibly and transparently. ☆4,701 · Updated last week
- Distributed Asynchronous Hyperparameter Optimization in Python ☆7,476 · Updated last week
- Sequential model-based optimization with a `scipy.optimize` interface ☆2,794 · Updated last year
- Bayesian optimization in PyTorch ☆3,353 · Updated last week
- Adaptive Experimentation Platform ☆2,577 · Updated last week
- A library for debugging/inspecting machine learning classifiers and explaining their predictions ☆2,775 · Updated 5 months ago
- Efficiently computes derivatives of NumPy code. ☆7,366 · Updated this week
- Model interpretability and understanding for PyTorch ☆5,419 · Updated this week
- Debugging, monitoring and visualization for Python Machine Learning and Data Science ☆3,456 · Updated last week
- Accelerated deep learning R&D ☆3,362 · Updated 3 months ago
- Train AI models efficiently on medical images using any framework ☆1,874 · Updated last year
- Python library to easily log experiments and parallelize hyperparameter search for neural networks ☆736 · Updated 3 years ago
- Automatic architecture search and hyperparameter optimization for PyTorch ☆2,488 · Updated last year
- torch-optimizer -- collection of optimizers for PyTorch ☆3,141 · Updated last year
- A highly efficient implementation of Gaussian Processes in PyTorch ☆3,773 · Updated last month
- A system for quickly generating training data with weak supervision ☆5,918 · Updated last year
- Tips for releasing research code in Machine Learning (with official NeurIPS 2020 recommendations) ☆2,828 · Updated 2 years ago
- Make huge neural nets fit in memory ☆2,814 · Updated 5 years ago
- A collection of infrastructure and tools for research in neural network interpretability. ☆4,700 · Updated 2 years ago
- A Python Package to Tackle the Curse of Imbalanced Datasets in Machine Learning ☆7,039 · Updated last month
- Toolbox of models, callbacks, and datasets for AI/ML researchers. ☆1,741 · Updated last month
- tensorboard for pytorch (and chainer, mxnet, numpy, ...) ☆7,964 · Updated last month
- Experiment tracking, ML developer tools ☆889 · Updated 5 months ago
- 🚪✊ Knock Knock: Get notified when your training ends with only two additional lines of code ☆2,818 · Updated 2 years ago
- Version control for machine learning ☆1,673 · Updated 7 months ago
- Live training loss plot in Jupyter Notebook for Keras, PyTorch and others ☆1,319 · Updated 6 months ago