adapter-hub / adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
☆2,717 · Updated 3 weeks ago
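As a conceptual aside, the bottleneck adapters that parameter-efficient transfer learning libraries like this one are built around can be sketched in a few lines of NumPy. This is an illustrative sketch of the idea (down-project, nonlinearity, up-project, residual connection), not the adapters library's actual API; all names here are made up for the example:

```python
import numpy as np

def bottleneck_adapter(h, w_down, w_up):
    """Houlsby-style bottleneck adapter: down-project, ReLU, up-project, residual."""
    z = np.maximum(h @ w_down, 0.0)   # down-projection + ReLU nonlinearity
    return h + z @ w_up               # up-projection added back via residual

rng = np.random.default_rng(0)
d_model, d_bottleneck = 8, 2          # bottleneck is much smaller than the model dim
h = rng.standard_normal((4, d_model))             # a batch of hidden states
w_down = rng.standard_normal((d_model, d_bottleneck)) * 0.1
w_up = np.zeros((d_bottleneck, d_model))          # zero-init: adapter starts as identity

out = bottleneck_adapter(h, w_down, w_up)
print(np.allclose(out, h))  # True: before training, the adapter is a no-op
```

Only `w_down` and `w_up` are trained, so the number of new parameters per layer is `2 * d_model * d_bottleneck` rather than the full layer size; the zero-initialized up-projection is a common trick so the adapted model initially behaves exactly like the frozen base model.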
Alternatives and similar repositories for adapters
Users interested in adapters often compare it to the libraries listed below.
- A modular RL library to fine-tune language models to human preferences ☆2,313 · Updated last year
- [EMNLP 2021] SimCSE: Simple Contrastive Learning of Sentence Embeddings https://arxiv.org/abs/2104.08821 ☆3,558 · Updated 8 months ago
- Longformer: The Long-Document Transformer ☆2,134 · Updated 2 years ago
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ☆8,839 · Updated this week
- Dense Passage Retriever: a set of tools and models for open-domain Q&A tasks ☆1,806 · Updated 2 years ago
- 🤗 Evaluate: A library for easily evaluating machine learning models and datasets ☆2,232 · Updated 5 months ago
- PyTorch extensions for high-performance and large-scale training ☆3,330 · Updated last month
- Prefix-Tuning: Optimizing Continuous Prompts for Generation ☆933 · Updated last year
- Toolkit for creating, sharing and using natural language prompts ☆2,882 · Updated last year
- A heterogeneous benchmark for information retrieval; easy to use, and lets you evaluate your models across 15+ diverse IR datasets ☆1,836 · Updated last week
- BERTScore for text generation ☆1,753 · Updated 10 months ago
- Accessible large language models via k-bit quantization for PyTorch ☆7,142 · Updated this week
- A repo for distributed training of language models with Reinforcement Learning from Human Feedback (RLHF) ☆4,667 · Updated last year
- A plug-and-play library for parameter-efficient tuning (Delta Tuning) ☆1,028 · Updated 8 months ago
- Code for the EMNLP 2023 paper "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models" ☆1,178 · Updated last year
- Ongoing research training transformer language models at scale, including BERT & GPT-2 ☆1,395 · Updated last year
- Reference implementation for DPO (Direct Preference Optimization) ☆2,601 · Updated 10 months ago
- A fast MoE implementation for PyTorch ☆1,741 · Updated 4 months ago
- TorchMultimodal is a PyTorch library for training state-of-the-art multimodal multi-task models at scale ☆1,609 · Updated last week
- A concise but complete full-attention transformer with a set of promising experimental features from various papers ☆5,385 · Updated 2 weeks ago
- Foundation Architecture for (M)LLMs ☆3,083 · Updated last year
- 🚀 Accelerate inference and training of 🤗 Transformers, Diffusers, TIMM and Sentence Transformers with easy-to-use hardware optimization… ☆2,938 · Updated this week
- Train transformer language models with reinforcement learning ☆14,193 · Updated this week
- [MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration ☆3,073 · Updated this week
- General technology for enabling AI capabilities with LLMs and MLLMs ☆4,031 · Updated this week
- Diffusion-LM ☆1,155 · Updated 10 months ago
- Pyserini is a Python toolkit for reproducible information retrieval research with sparse and dense representations ☆1,872 · Updated this week
- Expanding natural instructions ☆1,001 · Updated last year