adapter-hub / adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
⭐ 2,718 · Updated 3 weeks ago
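For orientation, here is a minimal sketch of the workflow the library enables, assuming the current `adapters` package and a standard Hugging Face checkpoint (the adapter name and config string are illustrative):

```python
# Minimal sketch (assumptions: the adapters package, a standard HF checkpoint).
import adapters
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

adapters.init(model)  # retrofit the model with adapter support

# Add a bottleneck adapter and activate it for training:
# base weights stay frozen, only the small adapter module receives gradients.
model.add_adapter("my_task", config="seq_bn")
model.train_adapter("my_task")
```

Only the adapter parameters (typically a small fraction of the model) are trained, which is the parameter-efficient part of the title.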
Alternatives and similar repositories for adapters
Users interested in adapters are comparing it to the libraries listed below.
- 🤗 Evaluate: A library for easily evaluating machine learning models and datasets (usage sketch after this list). ⭐ 2,241 · Updated this week
- [EMNLP 2021] SimCSE: Simple Contrastive Learning of Sentence Embeddings https://arxiv.org/abs/2104.08821 ⭐ 3,558 · Updated 8 months ago
- Prefix-Tuning: Optimizing Continuous Prompts for Generation ⭐ 933 · Updated last year
- A modular RL library to fine-tune language models to human preferences ⭐ 2,317 · Updated last year
- A plug-and-play library for parameter-efficient-tuning (Delta Tuning) ⭐ 1,028 · Updated 9 months ago
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ⭐ 8,839 · Updated this week
- Longformer: The Long-Document Transformer ⭐ 2,134 · Updated 2 years ago
- PyTorch extensions for high performance and large scale training. ⭐ 3,331 · Updated last month
- BERT score for text generation (usage sketch after this list). ⭐ 1,756 · Updated 10 months ago
- TorchMultimodal is a PyTorch library for training state-of-the-art multimodal multi-task models at scale. ⭐ 1,613 · Updated this week
- Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models" ⭐ 1,178 · Updated last year
- Dense Passage Retriever: a set of tools and models for open-domain Q&A tasks. ⭐ 1,809 · Updated 2 years ago
- Reading list for instruction tuning, a trend that starts from Natural-Instructions (ACL 2022), FLAN (ICLR 2022), and T0 (ICLR 2022). ⭐ 768 · Updated last year
- The implementation of DeBERTa ⭐ 2,101 · Updated last year
- Must-read papers on prompt-based tuning for pre-trained language models. ⭐ 4,222 · Updated last year
- Reference implementation for DPO (Direct Preference Optimization) ⭐ 2,609 · Updated 10 months ago
- Accessible large language models via k-bit quantization for PyTorch (usage sketch after this list). ⭐ 7,142 · Updated this week
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 ⭐ 1,395 · Updated last year
- 🚀 Accelerate inference and training of 🤗 Transformers, Diffusers, TIMM and Sentence Transformers with easy to use hardware optimization… ⭐ 2,942 · Updated this week
- A family of open-sourced Mixture-of-Experts (MoE) Large Language Models ⭐ 1,544 · Updated last year
- A Heterogeneous Benchmark for Information Retrieval. Easy to use, evaluate your models across 15+ diverse IR datasets. ⭐ 1,847 · Updated 2 weeks ago
- A fast MoE impl for PyTorch ⭐ 1,744 · Updated 4 months ago
- Toolkit for creating, sharing and using natural language prompts. ⭐ 2,885 · Updated last year
- Original Implementation of Prompt Tuning from Lester et al., 2021 ⭐ 685 · Updated 3 months ago
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ⭐ 7,473 · Updated 2 weeks ago
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 ⭐ 2,088 · Updated 2 months ago
- Pyserini is a Python toolkit for reproducible information retrieval research with sparse and dense representations. ⭐ 1,876 · Updated this week
- [ACL 2021] LM-BFF: Better Few-shot Fine-tuning of Language Models https://arxiv.org/abs/2012.15723 ⭐ 728 · Updated 2 years ago
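The 🤗 Evaluate sketch referenced above: load a metric by name and compute it over predictions and references (the metric choice and toy values are illustrative):

```python
# Minimal sketch of the evaluate metric API.
import evaluate

accuracy = evaluate.load("accuracy")  # fetch the metric implementation by name
result = accuracy.compute(predictions=[0, 1, 1],
                          references=[0, 1, 0])
print(result)  # {'accuracy': 0.666...}
```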
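The bert-score sketch referenced above: score candidate sentences against references and read off per-sentence precision, recall, and F1 (the sentences are illustrative):

```python
# Minimal sketch of bert-score's functional API.
from bert_score import score

cands = ["The cat sat on the mat."]
refs = ["A cat was sitting on the mat."]

# Returns per-sentence precision, recall, and F1 as torch tensors.
P, R, F1 = score(cands, refs, lang="en")
print(F1.mean().item())
```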
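The bitsandbytes sketch referenced above: its k-bit quantization is commonly driven through transformers' BitsAndBytesConfig integration rather than called directly (the checkpoint name is just an example):

```python
# Minimal sketch: 8-bit weight loading via the transformers integration.
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

quant_config = BitsAndBytesConfig(load_in_8bit=True)
model = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-350m",              # example checkpoint
    quantization_config=quant_config, # quantize linear layers to 8-bit
    device_map="auto",                # place layers across available devices
)
```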