automl / neps
Neural Pipeline Search (NePS): Helps deep learning experts find the best neural pipeline.
☆77 · Updated this week
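
NePS is typically driven through a single `neps.run` call over a user-defined search space and evaluation function. The snippet below is a minimal sketch only, assuming the `neps.run` / `pipeline_space` style of API shown in the project's documentation; the exact class and keyword names (`neps.Float`, `neps.Integer`, `evaluate_pipeline`, `max_evaluations_total`) are assumptions that vary across NePS versions and should be checked against the current docs.

```python
# Minimal NePS usage sketch (API names assumed; older releases use
# run_pipeline / neps.FloatParameter instead of the names below).
import neps


def evaluate_pipeline(learning_rate: float, num_layers: int) -> float:
    # In a real run, train and validate a model here and return the
    # objective to minimize. A dummy objective keeps the sketch runnable.
    return (learning_rate - 1e-3) ** 2 + 0.01 * num_layers


# Search space: one log-scaled float and one integer hyperparameter.
pipeline_space = {
    "learning_rate": neps.Float(lower=1e-5, upper=1e-1, log=True),
    "num_layers": neps.Integer(lower=1, upper=8),
}

neps.run(
    evaluate_pipeline=evaluate_pipeline,  # function NePS will call per config
    pipeline_space=pipeline_space,
    root_directory="neps_results",        # where NePS stores state and results
    max_evaluations_total=20,             # evaluation budget
)
```
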
Alternatives and similar repositories for neps
Users interested in neps are comparing it to the libraries listed below.
- Launching and monitoring Slurm experiments in Python ☆18 · Updated last week
- ☆79 · Updated last month
- An interactive framework to visualize and analyze your AutoML process in real-time. ☆87 · Updated this week
- In-context Bayesian Optimization ☆16 · Updated last month
- The PyExperimenter is a tool for the automatic execution of experiments, e.g. for machine learning (ML), capturing corresponding results … ☆35 · Updated 3 weeks ago
- Our maintained PFN repository. Come here to train SOTA PFNs. ☆87 · Updated this week
- Surrogate benchmarks for HPO problems ☆27 · Updated last week
- Collection of hyperparameter optimization benchmark problems ☆147 · Updated 2 weeks ago
- The official implementation of PFNs4BO: In-Context Learning for Bayesian Optimization ☆28 · Updated last year
- A Framework for Comparing N Hyperparameter Optimizers on M Benchmarks. ☆14 · Updated this week
- a minimal website to get the diff of llm rewrites ☆11 · Updated 5 months ago
- TuneTables is a tabular classifier that implements prompt tuning for frozen prior-fitted networks. ☆19 · Updated 2 months ago
- [NeurIPS DBT 2021] HPO-B ☆32 · Updated last month
- [ICLR 2021] Few Shot Bayesian Optimization ☆20 · Updated 2 years ago
- A build-it-yourself AutoML Framework ☆74 · Updated 7 months ago
- ☆17 · Updated 7 months ago
- Repository for TabICL: A Tabular Foundation Model for In-Context Learning on Large Data ☆72 · Updated 2 weeks ago
- [KDD 2023] Deep Pipeline Embeddings for AutoML ☆16 · Updated 6 months ago
- [ICLR 2023] Deep Ranking Ensembles for Hyperparameter Optimization ☆14 · Updated last year
- Tabular In-Context Learning ☆67 · Updated 2 months ago
- [NeurIPS 2022] Supervising the Multi-Fidelity Race of Hyperparameter Configurations ☆13 · Updated 2 years ago
- TabDPT: Scaling Tabular Foundation Models ☆27 · Updated last month
- [NeurIPS 2023] Multi-fidelity hyperparameter optimization with deep power laws that achieves state-of-the-art results across diverse benc… ☆16 · Updated last year
- A learning curve benchmark on OpenML data ☆30 · Updated 6 months ago
- [ICLR 2024] Quick-Tune: Quickly Learning Which Pretrained Model to Finetune and How ☆32 · Updated 6 months ago
- Code for "TabZilla: When Do Neural Nets Outperform Boosted Trees on Tabular Data?" ☆159 · Updated last year
- scikit-activeml: Python library for active learning on top of scikit-learn ☆167 · Updated last week
- [NeurIPS 2021] Well-tuned Simple Nets Excel on Tabular Datasets ☆85 · Updated 2 years ago
- Pre-trained Gaussian processes for Bayesian optimization ☆91 · Updated last month
- Training and evaluating NBM and SPAM for interpretable machine learning. ☆78 · Updated 2 years ago