Code for ViTAS: Vision Transformer Architecture Search
☆51 · Updated Jul 22, 2021
Alternatives and similar repositories for ViTAS
Users interested in ViTAS are comparing it to the repositories listed below.
- NAS Benchmark in "Prioritized Architecture Sampling with Monte-Carlo Tree Search", CVPR 2021 ☆37 · Updated Aug 24, 2021
- Pi-NAS: Improving Neural Architecture Search by Reducing Supernet Training Consistency Shift (ICCV 2021) ☆20 · Updated Nov 28, 2021
- Code for NASViT ☆67 · Updated Apr 25, 2022
- Code for "Searching for Efficient Multi-Stage Vision Transformers" ☆63 · Updated Sep 1, 2021
- ☆265 · Updated Oct 30, 2019
- ☆12 · Updated Nov 18, 2022
- ☆13 · Updated Jun 28, 2021
- (ICCV 2021) BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search ☆142 · Updated Dec 6, 2021
- [ICLR 2021] CompOFA: Compound Once-For-All Networks For Faster Multi-Platform Deployment ☆25 · Updated Jan 5, 2023
- Auto-Prox-AAAI24 ☆14 · Updated Apr 30, 2024
- ☆16 · Updated Mar 9, 2021
- (CVPR 2020) Block-wisely Supervised Neural Architecture Search with Knowledge Distillation ☆237 · Updated Sep 23, 2021
- ☆20 · Updated Aug 16, 2021
- [NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang… ☆89 · Updated Dec 1, 2023
- This repository contains the code for the paper in Findings of EMNLP 2021: "EfficientBERT: Progressively Searching Multilayer Perceptron … ☆33 · Updated Jun 14, 2023
- ☆19 · Updated May 11, 2021
- Code for "Are labels necessary for neural architecture search" ☆92 · Updated Mar 20, 2024
- [ICLR 2022] "As-ViT: Auto-scaling Vision Transformers without Training" by Wuyang Chen, Wei Huang, Xianzhi Du, Xiaodan Song, Zhangyang Wa… ☆76 · Updated Feb 21, 2022
- Released code for the paper "ISTA-NAS: Efficient and Consistent Neural Architecture Search by Sparse Coding" ☆31 · Updated Nov 24, 2020
- ☆20 · Updated Mar 9, 2023
- [NeurIPS 2023] ShiftAddViT: Mixture of Multiplication Primitives Towards Efficient Vision Transformer ☆30 · Updated Dec 6, 2023
- Implementation of a Quantized Transformer Model ☆19 · Updated Mar 20, 2019
- [NeurIPS 2020] "Does Unsupervised Architecture Representation Learning Help Neural Architecture Search?" by Shen Yan, Yu Zheng, Wei Ao, X… ☆50 · Updated Jan 19, 2021
- [ICLR 2020] AtomNAS: Fine-Grained End-to-End Neural Architecture Search ☆220 · Updated Jun 8, 2020
- ☆24 · Updated Dec 21, 2021
- ☆40 · Updated Oct 12, 2023
- [CVPR 2021] Searching by Generating: Flexible and Efficient One-Shot NAS with Architecture Generator ☆39 · Updated May 19, 2022
- This is the PyTorch implementation for the paper "Generalizable Mixed-Precision Quantization via Attribution Rank Preservation", which is… ☆24 · Updated Aug 17, 2021
- Code for "AttentiveNAS: Improving Neural Architecture Search via Attentive Sampling" ☆105 · Updated Sep 29, 2021
- AlphaNet: Improved Training of Supernets with Alpha-Divergence ☆99 · Updated Aug 12, 2021
- Can GPT-4 Perform Neural Architecture Search? ☆87 · Updated Jul 18, 2023
- [ACL'20] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing ☆336 · Updated Jul 14, 2024
- ☆98 · Updated Apr 27, 2022
- ☆11 · Updated Jan 10, 2025
- This is a collection of our NAS and Vision Transformer work. ☆1,823 · Updated Jul 25, 2024
- [ICLR'22 Oral] Implementation of "CycleMLP: A MLP-like Architecture for Dense Prediction" ☆291 · Updated Apr 25, 2022
- [ICLR 2021] HW-NAS-Bench: Hardware-Aware Neural Architecture Search Benchmark ☆114 · Updated Apr 18, 2023
- ☆15 · Updated Jan 8, 2020
- Official PyTorch implementation of "Evolving Search Space for Neural Architecture Search" ☆12 · Updated Aug 18, 2021