sjunhongshen / DASH
☆23 · Updated 2 years ago
Alternatives and similar repositories for DASH
Users interested in DASH are comparing it to the libraries listed below
- ☆12 · Updated 3 years ago
- ☆15 · Updated 3 years ago
- Code release for REPAIR: REnormalizing Permuted Activations for Interpolation Repair ☆52 · Updated last year
- Official implementation for Equivariant Architectures for Learning in Deep Weight Spaces [ICML 2023] ☆90 · Updated 2 years ago
- Latest Weight Averaging (NeurIPS HITY 2022) ☆32 · Updated 2 years ago
- Introducing diverse tasks for NAS ☆50 · Updated 3 years ago
- ☆52 · Updated last year
- Official PyTorch implementation of "Rapid Neural Architecture Search by Learning to Generate Graphs from Datasets" (ICLR 2021) ☆64 · Updated last year
- Model Zoos published at the NeurIPS 2022 Dataset & Benchmark track: "Model Zoos: A Dataset of Diverse Populations of Neural Network Model… ☆57 · Updated 2 months ago
- ☆19 · Updated 3 years ago
- Recycling diverse models ☆46 · Updated 2 years ago
- Efficient Riemannian Optimization on Stiefel Manifold via Cayley Transform ☆43 · Updated 6 years ago
- ☆37 · Updated 2 years ago
- Implementation of the models and datasets used in "An Information-theoretic Approach to Distribution Shifts" ☆25 · Updated 4 years ago
- Code for "Can We Scale Transformers to Predict Parameters of Diverse ImageNet Models?" [ICML 2023] ☆37 · Updated last year
- Deep Learning & Information Bottleneck ☆62 · Updated 2 years ago
- Code for testing DCT plus Sparse (DCTpS) networks ☆14 · Updated 4 years ago
- Revisiting Efficient Training Algorithms For Transformer-based Language Models (NeurIPS 2023) ☆81 · Updated 2 years ago
- Data for "Datamodels: Predicting Predictions with Training Data" ☆97 · Updated 2 years ago
- ☆58 · Updated 2 years ago
- ☆23 · Updated 3 years ago
- ☆13 · Updated 2 years ago
- ☆13 · Updated 3 years ago
- ModelDiff: A Framework for Comparing Learning Algorithms ☆58 · Updated 2 years ago
- Sequence Modeling with Multiresolution Convolutional Memory (ICML 2023) ☆127 · Updated 2 years ago
- Code for the paper "Efficient Dataset Distillation using Random Feature Approximation" ☆37 · Updated 2 years ago
- Official repo for Firefly Neural Architecture Descent: a General Approach for Growing Neural Networks. Accepted by NeurIPS 2020. ☆34 · Updated 5 years ago
- PyTorch code for "Improving Self-Supervised Learning by Characterizing Idealized Representations" ☆41 · Updated 3 years ago
- Reproducible code for Augmentation paper ☆17 · Updated 6 years ago
- Good Subnetworks Provably Exist: Pruning via Greedy Forward Selection ☆21 · Updated 4 years ago