Code for "Inducer-tuning: Connecting Prefix-tuning and Adapter-tuning" (EMNLP 2022) and "Empowering Parameter-Efficient Transfer Learning by Recognizing the Kernel Structure in Attention" (NAACL 2022 Findings)
☆11 · Updated Feb 6, 2023
Alternatives and similar repositories for kernel-adapters
Users interested in kernel-adapters are comparing it to the libraries listed below.
- code for the ICLR'22 paper: On Robust Prefix-Tuning for Text Classification · ☆27 · Updated Mar 21, 2022
- Task Compass: Scaling Multi-task Pre-training with Task Prefix (EMNLP 2022 Findings; stay tuned, more will be updated) · ☆22 · Updated Oct 17, 2022
- On Transferability of Prompt Tuning for Natural Language Processing · ☆100 · Updated May 3, 2024
- ☆90 · Updated Mar 29, 2024
- A Pomodoro Technique practitioner, hoping to experience the magic of the Pomodoro Technique together with everyone · ☆12 · Updated Nov 16, 2013
- The intermediate goal of the project is to train a GPT-like architecture to learn to summarise Reddit posts from human preferences, as th… · ☆12 · Updated Jul 14, 2021
- [ICML 2023] Instant Soup: Cheap Pruning Ensembles in a Single Pass Can Draw Lottery Tickets from Large Models. Ajay Jaiswal, Shiwei Liu, Ti… · ☆11 · Updated Nov 28, 2023
- ☆14 · Updated Sep 28, 2020
- ☆11 · Updated Nov 13, 2024
- Code for our SIGIR 2022 accepted paper: P3 Ranker: Mitigating the Gaps between Pre-training and Ranking Fine-tuning with Prompt-based L… · ☆18 · Updated Sep 24, 2023
- Code for the paper "No Train, all Gain: Self-Supervised Gradients Improve Deep Frozen Representations" · ☆12 · Updated Oct 31, 2024
- ☆17 · Updated Aug 29, 2025
- [CVPR 2024] Continual-MAE: Adaptive Distribution Masked Autoencoders for Continual Test-Time Adaptation · ☆19 · Updated Sep 3, 2024
- Starbucks: Improved Training for 2D Matryoshka Embeddings · ☆22 · Updated Jun 30, 2025
- Code for the COLING 2022 paper "DPTDR: Deep Prompt Tuning for Dense Passage Retrieval" · ☆26 · Updated Aug 7, 2023
- The official repository for "Parameter-Efficient Multi-task Tuning via Attentional Mixtures of Soft Prompts" (EMNLP 2022) · ☆104 · Updated Dec 1, 2022
- ☆42 · Updated Nov 21, 2023
- Official PyTorch implementation of "Facing the Elephant in the Room: Visual Prompt Tuning or Full Finetuning?" (ICLR 2024) · ☆13 · Updated Mar 8, 2024
- Zero-shot Learning by Generating Task-specific Adapters · ☆14 · Updated Apr 2, 2021
- UNISUMM: Unified Few-shot Summarization with Multi-Task Pre-Training and Prefix-Tuning · ☆61 · Updated Jun 12, 2023
- Cross Visual Prompt Tuning [ICCV 2025] · ☆13 · Updated Aug 3, 2025
- ☆54 · Updated May 8, 2023
- An on-device speech translator built with SwiftUI · ☆24 · Updated Oct 10, 2019
- ☆15 · Updated Nov 7, 2024
- Code for "Multitask Vision-Language Prompt Tuning" (https://arxiv.org/abs/2211.11720) · ☆55 · Updated Jun 5, 2024
- The AutoPath pipeline for similarity modeling on heterogeneous networks with automatic path discovery · ☆11 · Updated Sep 12, 2019
- [EMNLP'24 (Main)] DRPO (Dynamic Rewarding with Prompt Optimization) is a tuning-free approach for self-alignment. DRPO leverages a search-… · ☆24 · Updated Nov 17, 2024
- Prompt tuning toolkit for GPT-2 and GPT-Neo · ☆90 · Updated Sep 27, 2021
- Effective Attention Sheds Light on Interpretability (Findings of ACL 2021) · ☆11 · Updated May 16, 2021
- Common Sense or World Knowledge? Investigating Adapter-Based Knowledge Injection into Pretrained Transformers · ☆20 · Updated Feb 22, 2021
- Source code for the ACL 2023 paper "Decoder Tuning: Efficient Language Understanding as Decoding" · ☆52 · Updated Jun 25, 2023
- ☆10 · Updated Jul 25, 2024
- ☆19 · Updated Dec 23, 2024
- [ACL-IJCNLP 2021] "EarlyBERT: Efficient BERT Training via Early-bird Lottery Tickets" by Xiaohan Chen, Yu Cheng, Shuohang Wang, Zhe Gan, … · ☆18 · Updated Dec 30, 2021
- ☆12 · Updated Apr 3, 2024
- Color calibration files · ☆11 · Updated Aug 27, 2020
- i-MAE PyTorch repo · ☆20 · Updated Apr 6, 2024
- ☆22 · Updated Mar 14, 2024
- Achieve your ambitious goals one small task at a time · ☆29 · Updated Jan 4, 2023