adapter-hub / adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
⭐ 2,594 · Updated last week
Alternatives and similar repositories for adapters:
Users interested in adapters are comparing it to the libraries listed below.
- A modular RL library to fine-tune language models to human preferences ⭐ 2,221 · Updated 9 months ago
- 🤗 Evaluate: A library for easily evaluating machine learning models and datasets. ⭐ 2,040 · Updated 2 months ago
- Foundation Architecture for (M)LLMs ⭐ 3,033 · Updated 7 months ago
- PyTorch extensions for high performance and large scale training. ⭐ 3,204 · Updated this week
- Longformer: The Long-Document Transformer ⭐ 2,054 · Updated last year
- A fast MoE impl for PyTorch ⭐ 1,570 · Updated 4 months ago
- [EMNLP 2021] SimCSE: Simple Contrastive Learning of Sentence Embeddings https://arxiv.org/abs/2104.08821 ⭐ 3,442 · Updated last month
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ⭐ 8,010 · Updated last week
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ⭐ 6,982 · Updated last year
- A plug-and-play library for parameter-efficient-tuning (Delta Tuning) ⭐ 1,006 · Updated 2 months ago
- Dense Passage Retriever: a set of tools and models for open-domain Q&A tasks. ⭐ 1,729 · Updated last year
- A repo for distributed training of language models with Reinforcement Learning via Human Feedback (RLHF) ⭐ 4,514 · Updated 10 months ago
- The implementation of DeBERTa ⭐ 1,997 · Updated last year
- Prefix-Tuning: Optimizing Continuous Prompts for Generation ⭐ 898 · Updated 7 months ago
- 🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools ⭐ 2,595 · Updated this week
- Reference implementation for DPO (Direct Preference Optimization) ⭐ 2,209 · Updated 3 months ago
- Beyond the Imitation Game collaborative benchmark for measuring and extrapolating the capabilities of language models ⭐ 2,883 · Updated 4 months ago
- Accessible large language models via k-bit quantization for PyTorch. ⭐ 6,354 · Updated this week
- Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models" ⭐ 1,082 · Updated 8 months ago
- Must-read papers on prompt-based tuning for pre-trained language models. ⭐ 4,095 · Updated last year
- Model explainability that works seamlessly with 🤗 Transformers. Explain your transformers model in just 2 lines of code. ⭐ 1,298 · Updated last year
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 ⭐ 1,338 · Updated 8 months ago
- TorchMultimodal is a PyTorch library for training state-of-the-art multimodal multi-task models at scale. ⭐ 1,484 · Updated last week
- Toolkit for creating, sharing and using natural language prompts. ⭐ 2,707 · Updated last year
- The hub for EleutherAI's work on interpretability and learning dynamics ⭐ 2,299 · Updated last month
- Reformer, the efficient Transformer, in PyTorch ⭐ 2,129 · Updated last year
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 ⭐ 1,909 · Updated last month
- Original implementation of Prompt Tuning from Lester et al., 2021 ⭐ 657 · Updated 6 months ago