sjunhongshen / DASH
☆21 · Updated last year

Related projects

Alternatives and complementary repositories for DASH
- ☆10 · Updated 2 years ago
- ☆13 · Updated 2 years ago
- ☆17 · Updated 2 years ago
- ☆15 · Updated last year
- DiWA: Diverse Weight Averaging for Out-of-Distribution Generalization · ☆28 · Updated last year
- Encodings for neural architecture search · ☆29 · Updated 3 years ago
- Latest Weight Averaging (NeurIPS HITY 2022) · ☆20 · Updated last year
- Deep Learning & Information Bottleneck · ☆50 · Updated last year
- Code for "Can We Scale Transformers to Predict Parameters of Diverse ImageNet Models?" [ICML 2023] · ☆31 · Updated 2 months ago
- Code base for SRSGD · ☆28 · Updated 4 years ago
- Official code for the paper "Metadata Archaeology" · ☆18 · Updated last year
- Supporting code for the paper "Bayesian Model Selection, the Marginal Likelihood, and Generalization" · ☆35 · Updated 2 years ago
- Recycling diverse models · ☆44 · Updated last year
- Implementation of the models and datasets used in "An Information-theoretic Approach to Distribution Shifts" · ☆25 · Updated 3 years ago
- Code for "Accelerated Linearized Laplace Approximation for Bayesian Deep Learning" (ELLA, NeurIPS 2022) · ☆16 · Updated 2 years ago
- ☆46 · Updated last month
- Repository for the PopulAtion Parameter Averaging (PAPA) paper · ☆26 · Updated 7 months ago
- ☆12 · Updated 2 years ago
- CIFAR-5m dataset · ☆39 · Updated 3 years ago
- Notebooks for managing NeurIPS 2014 and analysing the NeurIPS experiment · ☆11 · Updated 6 months ago
- An adaptive training algorithm for residual network · ☆14 · Updated 4 years ago
- Gradient-based Hyperparameter Optimization Over Long Horizons · ☆12 · Updated 3 years ago
- ☆35 · Updated last year
- Quantification of Uncertainty with Adversarial Models · ☆27 · Updated last year
- ☆51 · Updated 5 months ago
- ☆36 · Updated 2 years ago
- Introducing diverse tasks for NAS · ☆47 · Updated last year
- Towards Understanding Sharpness-Aware Minimization [ICML 2022] · ☆35 · Updated 2 years ago
- Official PyTorch implementation of "Rapid Neural Architecture Search by Learning to Generate Graphs from Datasets" (ICLR 2021) · ☆63 · Updated 3 months ago
- Code to reproduce experiments from "Does Knowledge Distillation Really Work?", a paper which appeared in the NeurIPS 2021 proceedings · ☆33 · Updated last year