nzw0301 / optuna-wandb
Example code for the Medium post titled "Optuna meets Weights and Biases."
☆24 · Updated 3 years ago
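The repository shows how to log Optuna trials to Weights & Biases. Below is a minimal sketch of that kind of integration, assuming Optuna's `WeightsAndBiasesCallback` from `optuna.integration`; the project name and toy objective are illustrative, not taken from the repository.

```python
# Minimal sketch: log Optuna trials to Weights & Biases via the built-in callback.
# Assumptions: `optuna` and `wandb` are installed; "optuna-wandb-demo" and the
# quadratic objective below are placeholders, not the repository's actual code.
import optuna
from optuna.integration import WeightsAndBiasesCallback


def objective(trial: optuna.trial.Trial) -> float:
    # Toy objective; a real use case would run a training/evaluation loop here.
    x = trial.suggest_float("x", -10.0, 10.0)
    return (x - 2.0) ** 2


# Logs each trial's hyperparameters and objective value to a W&B run.
wandb_callback = WeightsAndBiasesCallback(
    metric_name="objective",
    wandb_kwargs={"project": "optuna-wandb-demo"},
)

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20, callbacks=[wandb_callback])
```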
Alternatives and similar repositories for optuna-wandb
Users interested in optuna-wandb are comparing it to the repositories listed below
- Cyclemoid implementation for PyTorch ☆90 · Updated 3 years ago
- GPT, but made only out of MLPs ☆89 · Updated 4 years ago
- Implementation of a Transformer that Ponders, using the scheme from the PonderNet paper ☆81 · Updated 4 years ago
- ☆75 · Updated 3 years ago
- Implements MLP-Mixer (https://arxiv.org/abs/2105.01601) on the CIFAR-10 dataset. ☆59 · Updated 3 years ago
- Domain Adaptation ☆23 · Updated 4 years ago
- ☆16 · Updated 3 years ago
- The official repository for our paper "The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers". We s… ☆67 · Updated 3 years ago
- The official repository for the "Intermediate Layers Matter in Momentum Contrastive Self Supervised Learning" paper. ☆40 · Updated 3 years ago
- Experiment management with Hydra and MLflow ☆13 · Updated 5 years ago
- Interactive Weak Supervision: Learning Useful Heuristics for Data Labeling ☆30 · Updated 4 years ago
- Code for the CVPR 2019 paper: Spectral Metric for Dataset Complexity Assessment ☆46 · Updated last year
- Explores the ideas presented in Deep Ensembles: A Loss Landscape Perspective (https://arxiv.org/abs/1912.02757) by Stanislav Fort, Huiyi … ☆66 · Updated 5 years ago
- Layerwise Batch Entropy Regularization ☆24 · Updated 3 years ago
- ☆211 · Updated 3 years ago
- Functional deep learning ☆108 · Updated 3 years ago
- ☆140 · Updated 2 years ago
- ☆47 · Updated 3 years ago
- ☆37 · Updated 3 years ago
- ☆37 · Updated 4 years ago
- ☆19 · Updated 3 years ago
- A collection of Models, Datasets, DataModules, Callbacks, Metrics, Losses and Loggers to better integrate pytorch-lightning with transfor… ☆47 · Updated 2 years ago
- Axial Positional Embedding for Pytorch ☆84 · Updated 10 months ago
- Trains Transformer model variants. Data isn't shuffled between batches. ☆143 · Updated 3 years ago
- Training and evaluating NBM and SPAM for interpretable machine learning. ☆78 · Updated 2 years ago
- A case study of efficient training of large language models using commodity hardware. ☆68 · Updated 3 years ago
- ☆19 · Updated 2 years ago
- Implementation of Nyström Self-attention, from the paper Nyströmformer ☆145 · Updated 9 months ago
- Pytorch implementation of Compressive Transformers, from Deepmind ☆163 · Updated 4 years ago
- An active learning library for Pytorch based on Lightning-Fabric. ☆79 · Updated last year